SMPTE President Renard T. Jenkins calls on artists, lawmakers and the tech community to educate themselves about AI and to do so in sync with each other or fear and misuse will fill the gap.

The discourse around AI in media and entertainment has centred on the hot-button topics of preventing bias and ensuring artists are paid fairly for their work.


Renard Jenkins, SMPTE

The Society of Motion Picture and Television Engineers (SMPTE) is taking a stand on both. Its President Renard T. Jenkins is concerned that, as large language models proliferate and grow more sophisticated, bias, unconscious or otherwise, is being embedded within them.

“All of us in media and entertainment should be cognisant of what we are developing with AI tools. We should not be building models and then thinking about how we’re going to responsibly use them as an afterthought.”


That said, bias is not inherently a bad thing, he suggests, because certain forms of bias are there for our protection.

“I want to be very clear that I’m speaking about bias that is exclusive, that there are certain things that you would like to exclude out of a large language model,” Jenkins says.

“You want to exclude misogynism and racism and homophobia. You want to exclude anything that is harmful to a particular group or a particular person. That is what you should be working towards.”

He uses the example of a child putting their hand on a hot stove and learning very quickly through that experience that a hot stove can burn. “So having a bias against something that is bad for you is not a bad thing. We need to think about the issue that way instead of just trying to blanket our thought process in that regard.”

“Start with the developers and make sure that the product is being designed and tested with ethics in mind” Renard T. Jenkins, President, SMPTE

Begin with developers

However, the commercial pressure to monetise the technology means some AI tool developers are rushing to release without rigorous testing or without giving ethical concerns due weight. Not even new AI regulation in Europe or the US may be enough to put the genie back in the bottle once the technology is out in the wild.

“You can write as many laws and policies as you want but regulation is not going to change the hearts or minds of the individuals who are developing AI tools,” Jenkins says. “But you have to start with big tech. Start with the developers and make sure that the product is being designed and tested with ethics in mind. It is necessary because the AI tools we have available to us today are exponentially more powerful than anything we’ve ever seen in our lifetime. AI is not going to stop.”

Putting pressure on big tech to ensure it is training AI tools on ethically sourced data may not be sufficient. Jenkins also sits on the board of the Hollywood Professional Association, having served as Warner Bros. SVP, Production Integration & Creative Technology Services until December. He calls on standards bodies and policymakers to educate themselves “at the same rate” as AI tools are being developed.

“The education of those making policies about AI and the understanding of the tech community needs to be happening in parallel,” he urges. “We have to make sure that we include content creators as part of that conversation. All of that has to happen in tandem. It can’t be out of sync with one another because that’s where you leave gaps for things that could be harmful.”


Jenkins’ worry is that if different parts of industry and society operate on different tracks, then when the inevitable moment arrives and big tech births Artificial General Intelligence (AGI), capable of performing at least as well as, if not better than, humans, the result may be truly harmful.

One way that developers can help ensure they are not building bias into their products is by employing a diverse workforce.

At SMPTE, Jenkins helped start the Global Inclusion Working Group with plans to connect with technical organisations, the Hispanic Association of Colleges and Universities, and historically Black colleges and universities (HBCUs).

SMPTE’s work espouses the concept of inclusive innovation: developing products and services intended to serve a multicultural audience.

Jenkins explains: “The tech world and media and entertainment have not always been inclusive in the way that they operate, especially at the higher levels of decision making.

“I do see change but not enough,” he says. “I think that we need to move beyond performative actions and get into truly transformative processes and practices. We have to think about the fact that we represent a global audience and a global user base. Whatever you are designing, whatever you are creating should not be exclusive to a single group unless it is something that is specific to that group and it is only necessary for that group. Things that would fit into that category are finite.”

“If someone is going to use your IP, they need to compensate you” Renard T. Jenkins, President, SMPTE

The other area where development is outpacing legislation is copyright. Although OpenAI, the developer of ChatGPT, declines to divulge what data its products are trained on, the New York Times revealed that this includes millions of YouTube videos, none of whose creators were apparently notified, let alone recompensed.

A new bill introduced in the US Congress would force AI companies to reveal the copyrighted material used to train their generative AI models.

Jenkins believes transparency, whether legally enforced or not, is the only way for the industry to move forward.

“Transparency is the only way that we can truly protect content creators and to be honest protect developers and the innovators who are building the tools as well. No one wants to be accused of utilising someone’s intellectual property without either compensation or acknowledgement. Transparency is the foundational thing that we can do.

“I also know that in a competitive landscape, transparency can often lead to a negative outcome for developers. We have to come to some sort of balance where developers retain the ‘secret sauce’ of their algorithms, if you like, but which inculcates a level of trust between them and artists.

“That’s not going to come unless intellectual property is valued. Today, your IP is your data. So if someone is going to use your IP, they need to compensate you. They need to acknowledge that it’s yours. That is the only way that we can do this in a fair, responsible and ethical manner.”

Embracing AI

Jenkins is a sound engineer by trade and one wonders what he would tell creators about how AI might impact their jobs today. He says that people shouldn’t fear the tool itself.

“I believe that the fear really stems from those individuals who have control over the tool and how they plan to implement it. AI is not an entity. AI is simply a tool that can assist in the work that we do and, if utilised properly, there will be new jobs and new opportunities created.

“This technology is going to continue to iterate and we all have to do the same thing. It is necessary that you educate yourself about that. Do your homework.”

SMPTE recently published an AI document which emphasises standards development, interoperability and authentication as key components of adoption, arguing that these need to be integrated into the technology’s development and deployment process.

“The industry needs in-depth analysis and unbiased research with regard to AI’s advancement, while keeping time-to-market expectations, performance constraints and security concerns in mind,” Jenkins says. “Secondly, artists need to protect their art. Historically, a lot of artists have had their IP stolen and have had to fight to regain control of what they created. That is part of the fear.

“Always read the fine print before using any of these tools. Read the licensing agreement. Don’t just click through the terms of agreement. Question whether or not that is something that you want to actually engage with.

“That is not anti-business,” he concludes. “That is simply making sure that any deal that you make protects your art. That is the same way that we need to approach artificial intelligence. If you’re going to use these tools, make sure that you are protecting your IP.”

SMPTE recently released an engineering document arguing for Standards for Artificial Intelligence and has also created a series of webcasts discussing the topic.
