Hosted by Clark Atlanta University and chaired by SMPTE President Renard T. Jenkins, SMPTE’s Power of Color Symposium was a mixture of sessions on colourism, representation and diversity, with in-depth looks at colour workflows and the meritorious and odious potentials of AI. George Jarrett reports.
One of the standout sessions from this year’s symposium was a grading suite masterclass on colour management and its associated problems by the colourists John Petersen and Grant Reynolds from Atlanta’s Moonshine Post.
Petersen is Co-Founder and Co-Owner of the business, and Reynolds is both digital technician and experienced finisher. Catherine Meininger, the Director of Colour Science with Portrait Displays, was the session chair. Framing the session, Meininger said: “We are talking about a more subtractive medium, where light is going through something to create a colour, versus a display, which is producing a colour on its own. That has different physical properties, and you need to think about that process when it comes to understanding the value of colour.”
Petersen added: “I am just an operator that knows the tools and what my output needs to be. Film was a physical project, but now I sit with terabytes at my fingertips. My scopes are my guide, my gallery is my reference.”
Reynolds turned to the camera codecs. He said: “The camera bodies themselves will always outstrip the media needed to do the processing. Take the evolution of the Sony Venice for example: we talked about the Venice and 6K capture in 2018, and now it will capture 8K. Those codecs have evolved to the point where they are capturing a wider gamut of colour.
“In the digital realm, the goal and the trend are always to get as close to a film print as we can. That is the biggest thing,” he added. “And as far as the transforms and things like that are concerned, we make sure that the pipeline is intact so we can present that in a display-referred situation. From scene-referred to display-referred we try to maintain that pipeline as cleanly as we possibly can.”
Taking the six years from 2018 again as a reference, Reynolds said: “Now, everyone is shooting open gate, whether they are shooting on an Arri, shooting wide open on a Venice or wide open on a Red Monstro 8K. All those bodies capture colour differently, and all of them represent colour differently, so it is our job to bring those into a common space and make sure they are represented accurately.”
Confusion with terminology
Providing some background on scene- and display-referred, Meininger said: “When you heard scene-referred, I wanted you (the audience) to think about those linear light principles and what colour was coming from an object. When we are talking about display-referred, think about how that colour shows up at the final stage. That is where a lot of confusion comes into colour management. People will hear phrases and see charts and think it feels really complicated.”
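To make the distinction concrete, here is a minimal sketch of the hand-off Meininger describes: scene-referred values are unbounded linear light, while display-referred values are a bounded signal encoded for a monitor. The Reinhard tone curve and 2.4 gamma below are illustrative stand-ins, not any particular pipeline’s transform.

```python
import numpy as np

def scene_to_display(scene_linear, exposure=1.0, gamma=2.4):
    """Toy scene-referred -> display-referred transform.

    Real pipelines (ACES, vendor LUTs) use far more sophisticated tone
    mapping; the Reinhard curve and gamma here are simple stand-ins.
    """
    x = scene_linear * exposure          # scene-referred: unbounded linear light
    tone_mapped = x / (1.0 + x)          # Reinhard curve squeezes [0, inf) into [0, 1)
    return tone_mapped ** (1.0 / gamma)  # encode for the display's response

# A specular highlight at 8x scene white still lands below display white.
print(scene_to_display(np.array([0.18, 1.0, 8.0])))
```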
Focussing on the efficiency of processing codecs, Reynolds said: “Any 8K capture might be tough in real-time playback, so that does affect us, but not necessarily in the colour pipeline. It is with the online pipeline. We create proxies and work off those until the final online, and it is the artist on duty who deals with the headache.
“But as far as display technology goes, it is now a very exciting area because new products are affordable to post houses, and are capable of 4,000 nits. For us to finish P3, finish and deliver Rec. 2020, or finish Dolby Vision consistently, to me this is the most exciting part,” he added.
Petersen agreed: “This technology has made my job much easier. You turn your monitor on, give it 10-20 minutes, and it is going to stay consistent throughout the day. In the days of CRT monitors, we had drift.”
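For context, the 4,000-nit figure Reynolds mentions sits comfortably inside the PQ transfer function (SMPTE ST 2084) that underpins Dolby Vision and other HDR deliverables. A minimal sketch of the PQ encode, using the constants from the standard:

```python
import numpy as np

# PQ (SMPTE ST 2084) encode: absolute luminance in nits -> a 0..1 signal.
# Constants come from the standard; luminance is normalised to the
# 10,000-nit reference ceiling.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = np.asarray(nits, dtype=float) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1.0 + C3 * ym)) ** M2

# A 4,000-nit highlight already uses roughly 90% of the PQ signal range,
# so a display of that brightness covers most of an HDR grade.
print(pq_encode([100, 1000, 4000, 10000]))
```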
It looks milky
Reynolds talked about his role since moving into full-time finishing in 2022. He said: “The delivery aspect is always driven by the creative intent of the project – maybe the director’s vision or the director of photography’s (DOP) vision. We adhere to whatever that may be, but the correct presentation of the human being is our number one priority, however that is displayed and within the creative intent.”
Petersen added: “We have a full crew so we begin at pre-production with my in-house producers, online artists, and ourselves. We are heavily involved with every project that comes through. ShotDeck is something that we always get references from. We create our own internal references and we keep them in a gallery so we can always call them back.”
Joining projects at the camera package stage means knowing what the client is editing on, and the final point of distribution. Asked about terminologies that clients use, Petersen said: “We talk about saturation of an image. Some terms have come through, and if we are looking at an image and somebody says it looks milky, that just means we have a loss of contrast, and that means lowering my black point a little bit to create depth in the image.”
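A rough illustration of the fix Petersen describes; the function and values below are hypothetical, but the idea is that remapping a lifted black level back toward zero restores the contrast a “milky” image has lost.

```python
import numpy as np

def lower_black_point(img, new_black=0.03):
    """Restore depth in a 'milky' image by remapping lifted blacks.

    new_black is an illustrative target level, not a standard value.
    """
    lifted = img.min()                        # where the washed-out blacks currently sit
    scale = (1.0 - new_black) / (1.0 - lifted)
    return np.clip((img - lifted) * scale + new_black, 0.0, 1.0)

milky = np.array([0.12, 0.40, 0.85])          # blacks lifted to 0.12: low contrast
print(lower_black_point(milky))               # blacks pulled down near 0.03
```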
Reynolds had two experiences to share. He said: “From my side of the house, the technical and the creative aspects are two completely different things. One of the things we have built in is a technical pass to bring everything into common space, especially when you have projects captured with primary, secondary, and tertiary cameras.
“If I dovetail into the client side and the creative side, the thing I hear most often is, ‘This needs to pop’: it means something different to everybody – this red needs to pop, this screen needs to pop,” he added.
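The technical pass Reynolds describes can be pictured as a per-camera matrix transform through a device-independent hub into one working space. The sketch below uses the published sRGB/XYZ (D65) matrices purely as stand-ins for a camera transform; a real pass would use each manufacturer’s input transform for the specific body.

```python
import numpy as np

# Each camera's linear RGB reaches a shared working space via CIE XYZ.
# The sRGB <-> XYZ pair here is a stand-in; a real pipeline would use
# the vendor's input transform for each body (Venice, Alexa, Monstro, ...).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_SRGB = np.linalg.inv(SRGB_TO_XYZ)

def to_working_space(camera_rgb, camera_to_xyz, xyz_to_working):
    """Chain camera->XYZ->working into one 3x3 and apply it per pixel."""
    m = xyz_to_working @ camera_to_xyz
    return camera_rgb @ m.T

pixels = np.array([[0.18, 0.18, 0.18], [0.9, 0.1, 0.1]])
# Round-tripping through XYZ with matched matrices returns the input,
# a handy sanity check on a technical pass.
print(to_working_space(pixels, SRGB_TO_XYZ, XYZ_TO_SRGB))
```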
AI in post-production
Clark Atlanta University’s Department of Cyber-Physical Systems has around 200 students at Bachelor’s, Master’s and PhD levels, with PhD students able to study AI and robotics. Department leader Dr Roy George presented a session on AI as a creative and cost-effective tool. He said: “One of the questions I get a lot is what exactly are cyber-physical systems. It is the synergy between hardware platforms and software, and it is quickly becoming indistinguishable where one ends and the other starts.
“This is the landscape in which I see how AI can transform the entertainment industry through the entire life cycle, from development to post. We have some fancy tools being used in the development and automation of ideation.
“Looking at the business side, I believe a lot of it has been exploited, but between the science and AI, there are a lot of things that can be done at the latter parts of the life cycle,” he said. “There is a whole host of data about audience preferences and previous media that can be exploited, understood, and enable the drawing of insights. Content creators will be given new ideas for creating compelling storylines that are marketable.”
AI in music and arts will further influence audiences, but for now, the department’s R&D could extend to AI tools for colour. “What we are trying to do is look at costs, increase the speeds of all the post-production processes, and somehow create enhanced (content) products with the least amount of cost. A different style of storytelling is probably uniquely possible now, and perhaps even the viewer will be a participant in driving the storyline through as it proceeds,” George added. “This introduces unique endings.”
He pointed to the marketing hullabaloo around Apple’s Vision Pro launch. He said: “It has been available for only a week, but imagine the combination of AI and the VR headset, and being in a self-driving car.”
Biases in AI
In another key session, data scientist Dr Robert Joseph, President and Co-Founder of Team MindShift, and Instructor at three eminent educational establishments including Georgia State, looked at maximising profit and efficiency using data and AI. He said: “It is extremely important to me that people do not just jump on the bandwagon of AI. It is just a tool.
“If you are looking to boost your productivity in a studio, then figure out what needs to happen before you figure out what tools you need,” he added. “Way back when I studied AI it was almost cool, but it is now totally cool, and generative AI is the hot topic. But it has clear strengths and weaknesses.”
Joseph continued: “We use AI in MindShift for several things, but not alone. There is programming that goes around AI. There are also focus groups and knowledge that go around what I am trying to achieve, and multiple conversations with my co-founder to figure out what we are trying to accomplish. AI is just a piece of that.
“One of the things you are seeing happening is that the ability to create amazing media content is growing by leaps and bounds. You have ChatGPT 4.0, Gemini and all these other companies building amazing models, and others building models on top of models,” he added.
Right out of the box, generative AI gives you content creation, a thought partner and document analysis. But will the evil-intentioned use of AI dominate?
“We must be very cautious about using a system that has the potential to affect so many people so profoundly. It offers the ability to influence large numbers of people in a very serious way. They are using AI now to set bail, and in some legal cases it is better than what the judge would do,” said Joseph.
“It is interesting to me how people are suddenly saying that AI is so biased, and that is because AI is mimicking society. I would like to say just solve the root problem,” he added. “The reason the AI bias problem is more talked about is because of how you can quantify it.”