This Al Jazeera-led project, which is a continuation of last year's AI content moderation Accelerator project, focuses on a topic of significant reputational value to broadcasters: how to detect, measure and flag bias in the representation and portrayal of diverse genders, cultures and ethnicities, to ensure fairness and transparency in news reporting.
As AI's role in the media space moves beyond its initial, largely automation-focused remit, the rapidly evolving technology is spawning a plethora of new use cases. One of the most intriguing areas of current research is the subject of one of the IBC2021 Accelerators, in which a powerful range of broadcast and news organisation Champions have been investigating its potential for detecting bias in news reporting.
The Accelerator builds on the AI-enabled Content Moderation Accelerator project of 2020. Led by Al Jazeera, this year the team has been examining how AI can be used to detect, measure and flag bias in the representation and portrayal of diverse genders, cultures and ethnicities to ensure fairness and transparency in news reporting.
Champions: Al Jazeera, AP, BBC, Reuters, RTÉ, ETC (University of Southern California), MultiChoice
The Accelerator has looked at how the technology can be used to preserve and protect the fundamental notions of neutrality and balance, which are key to the reputation — and, in some cases, perhaps even survival in choppy political climates — of public broadcasters and news organisations around the world.
A task that scales
“As the world’s biggest and oldest news organisation, getting it first and getting it right — speed and accuracy — are two of the pillars on which this temple is built,” says Sandy Macintyre, Vice President News at The Associated Press. “But the third is being fair, balanced and impartial, and therefore avoiding both intended and unintended bias and being extremely careful in our tone.”
Macintyre says that he can’t see bias detection being wholly outsourced to AI, but he can see it acting as a second set of eyes. This is particularly relevant when it comes to understanding the sheer scale of the task and the amount of data output by even a small news organisation across all of its platforms on a daily basis, never mind a global organisation the size of AP. Uniquely, AP has been working collaboratively with Reuters, as well as with other world-leading news organisations and broadcasters, on the project.
The team has taken the pragmatic decision to narrow the focus of the POC to something that is both achievable and indicative of the technology’s promise.
“There are all kinds of biases: coverage bias, selection bias, gatekeeping bias, and obviously for a POC, there’s just far too much to do in a meaningful way, so we’ve decided to zone in on tonality or tone,” explains Dr Niamh McCole, Broadcast Compliance Specialist at RTÉ. “The starting point is the recognition that the language of news broadcasting is a powerful way of conveying very subtle meaning and is a significant means to persuade, to endorse, to contradict, or to cast out.”
While acknowledging that language choices are reinforced by visual elements, whether that be human expression and gesture or choices made in an edit suite, the Accelerator has concentrated on analysing text. This is still a fearsomely complex task. Yves Bergquist is the Director of the Data and Analytics Project at the University of Southern California’s Entertainment Technology Center and is heading up the programming of the AI.
“The words we use are very indicative of our ideology and our opinions about the events that we’re describing,” he says. “Whether we say the word ‘regime’, for example, or ‘government’, those are two different words with two different connotations.”
AIs, of course, do not fundamentally understand the nuance between the two words, so they have to be trained. Two different methods are being used in the Accelerator. The first is supervised learning, where the application is trained on massive amounts of data that has been hand-labelled by humans. It also uses sentiment analysis to detect emotional tonalities across a wide number of fields. Is the person being aggressive or happy in what they say? Is the person in a position of power or not? (Those who are tend to use the pronoun ‘we’, while people who feel disempowered tend to use ‘I’, and so on.)
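By way of illustration, the sketch below shows what a supervised tonality classifier of this kind might look like in Python. The training sentences, labels and the pronoun-based power cue are hypothetical stand-ins for the hand-labelled data described above, not the Accelerator’s actual data or code.

```python
# A minimal sketch of a supervised tone classifier, assuming a small
# hand-labelled dataset. All examples and labels here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hand-labelled training sentences (hypothetical stand-ins for the
# human-annotated data referred to above).
train_texts = [
    "The regime cracked down on protesters overnight.",
    "The government announced a new aid programme today.",
    "Officials insisted the evacuation was proceeding smoothly.",
    "Residents said they felt abandoned by the authorities.",
]
train_labels = ["negative", "neutral", "neutral", "negative"]

# TF-IDF features feeding a linear classifier stand in for the supervised
# learning step; a production system would use far more data and very
# likely a transformer-based model.
tone_model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(),
)
tone_model.fit(train_texts, train_labels)

def pronoun_signal(text: str) -> float:
    """Crude proxy for the power/position cue mentioned above: the share of
    'we' among first-person pronouns (higher suggests a 'we' framing)."""
    tokens = text.lower().split()
    we, i = tokens.count("we"), tokens.count("i")
    return we / (we + i) if (we + i) else 0.5

print(tone_model.predict(["The regime tightened its grip on the capital."]))
print(pronoun_signal("We believe we acted correctly, and I regret nothing."))
```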
The second technique is unsupervised machine learning. This is basically clustering: text is input with no annotations, but the application will recognise clusters of words per topic and per news organisation. So it can say that News Organisation A is using ‘regime’ to describe the Afghan government more often than News Organisation B, which is using the word ‘government’.
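A hedged sketch of the unsupervised, clustering-based side might look like the following. The outlet names and headlines are invented for illustration, and a real system would work over far larger corpora of broadcast transcripts.

```python
# A minimal sketch of the clustering approach, assuming a handful of
# unlabelled headlines from two invented outlets.
from collections import Counter
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    ("Org A", "Afghan regime collapses as fighters enter Kabul"),
    ("Org A", "Regime officials flee the capital amid chaos"),
    ("Org B", "Afghan government falls as Taliban reach Kabul"),
    ("Org B", "Government ministers evacuate the capital"),
]

# Cluster the text with no annotations; documents are grouped purely by
# the words they share.
texts = [text for _, text in docs]
X = TfidfVectorizer(stop_words="english").fit_transform(texts)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster assignments:", clusters)

# Compare word choice per outlet, e.g. 'regime' vs 'government'.
usage = {}
for org, text in docs:
    usage.setdefault(org, Counter()).update(text.lower().split())
for org, counts in usage.items():
    print(org, "regime:", counts["regime"], "government:", counts["government"])
```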
In practice, both methods are being used for the Accelerator. “I think if the field of AI has learned anything over the past 10 or 15 years it is that what we call ensemble models tend to work a lot better,” says Bergquist. “Using a combination of hybrid approaches and algorithms to solve a problem tends to outperform simply using one model. And that’s basically what we’re trying to do.”
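In that spirit, an ensemble could be as simple as blending the scores produced by the two approaches. The weighting below is an assumption made purely for illustration, not anything the team has published.

```python
# A minimal sketch of the ensemble idea: blend independent signals rather
# than rely on a single model. Weights and inputs are illustrative
# assumptions, not the Accelerator's actual configuration.
def ensemble_bias_score(supervised_score: float,
                        clustering_score: float,
                        weights: tuple = (0.6, 0.4)) -> float:
    """Weighted combination of a supervised tonality score and an
    unsupervised word-usage score, both assumed to lie in [0, 1]."""
    w_sup, w_clu = weights
    return w_sup * supervised_score + w_clu * clustering_score

print(ensemble_bias_score(0.8, 0.5))  # 0.68
```

A weighted average is only one way to combine models; stacking or simple voting would serve the same purpose of letting one signal compensate for the blind spots of another.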
Examining the Fall of Kabul
The POC is based on using AI to examine the coverage of a single event by multiple news organisations, with the Fall of Kabul from 15 August onwards chosen as one example.
“We have been looking at the way in which news packages dealt with that event and its aftermath in terms of the corpus of words used, the quotes that broadcasters choose to use, and the language that’s included in the selection of the editing of the interviews,” says McCole.
There are two things worth pointing out here. One is that it is vital that such a tool, and certainly any future productised version, is open and transparent. Macintyre says that what goes into the box, namely the data and the algorithms that power it, needs to be open, and that we also need to be honest about what it can’t do. That way, when news organisations are accused of bias, or if they want to check their own output against reference markers, the whole process takes place in the open and can be examined for any faults or discrepancies.
The second is that this is a tricky subject with many sensitivities. “The truth is disruptive,” says Bergquist. “To confirm your own cultural biases is uncomfortable and kudos to the organisations in this Accelerator for putting themselves in such an uncomfortable place because they risk being confronted with their own limitations and biases. I think it’s really great to have big news organisations jumping into this deep end of the pool.”
An ongoing process
The team caution that the culmination of the 2021 phase of the project and its outputs at year end will not be a finished product in any way, but they will provide important insights into what the technology can do and point toward some key possibilities to come. McCole says that it is best considered an ongoing project, one that may well start to address other inputs such as video in the future, and that it is the conversations it will engender with fellow broadcasters via IBC that will be invaluable and help steer its course across future iterations.
Macintyre concludes that he has often said that AI is the best assistant producer you ever had. And as AI gets smarter, your assistant producer becomes a senior producer and eventually maybe an executive producer. “Even an executive producer still needs a human editor on top, though,” he says.