Threatened by a public referendum and squeezed for talent, Léonard Bouchet of RTS delivered revelatory tools while establishing a better work culture, all by embracing data-driven principles.
The morning of March 4, 2018 was a tense one in the corridors of the Swiss Broadcasting Corporation in Bern. The alpine nation was going to the polls in a national referendum on whether to do away with the licence fees that make up roughly three quarters of the organisation’s funding. Hundreds of jobs, and the whole idea of publicly funded broadcasting in the country, were on the line.
As the results came in, it became clear that the proposal to abolish the fees had been rejected by a considerable margin: less than 29% voted in favour.
It was a relief to Léonard Bouchet, head of data and archives at RTS Radio Télévision Suisse, the French wing of the public broadcaster. “Our existence could have been completely removed from the landscape,” he recalls.
Bouchet regards the vote as one of the background factors that continues to spur his efforts to deliver better public service, achieve savings and expand what the archives can be used for.
At the time of the referendum, his archive department at RTS was already hard at work on implementing better uses of machine learning, aiming to get more from the decades of archived content they held.
The department has since undergone a revolution in both what it can deliver and how it’s run.
“The political side of things can be tricky, but on the machine-learning side of things, what we’ve achieved, the way we’re organised (which is completely new) within the group - we’re quite happy with what the public sees from our offer,” he says.
Machine learning
With a background in media production at RTS, Bouchet was appointed head of data and archives at the broadcaster in 2016, and has been incrementally developing new tools that make use of machine learning since then. Tackling detailed requests regarding archived content has been an increasingly do-or-die priority.
This includes crunching data to discover exactly how much airtime political parties and figures have been getting across RTS.
“We were contacted and asked, ‘Would it be possible to collect the number of minutes that politicians appear on-air, and to collect that precisely? To count (appearances) by political colour, how many minutes each would get on-air?’ The answer was yes, and we have done that,” says Bouchet.
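Bouchet doesn’t describe the implementation, but counting airtime by political colour is, at its core, an aggregation over speaker-detection metadata. A minimal sketch, assuming each detected on-air segment has already been tagged with a speaker, a party and a duration (the field names and records here are illustrative, not RTS’s actual schema):

```python
from collections import defaultdict

# Hypothetical speaker-detection output: one record per on-air segment.
segments = [
    {"speaker": "A. Dupont", "party": "PS",  "seconds": 95},
    {"speaker": "B. Keller", "party": "UDC", "seconds": 140},
    {"speaker": "A. Dupont", "party": "PS",  "seconds": 60},
    {"speaker": "C. Rossi",  "party": "PLR", "seconds": 30},
]

def airtime_by_party(segments):
    """Sum on-air minutes per political party."""
    totals = defaultdict(float)
    for seg in segments:
        totals[seg["party"]] += seg["seconds"] / 60.0
    return dict(totals)

print(airtime_by_party(segments))
```

The hard part in practice is not this aggregation but producing reliable per-segment metadata in the first place, which is where the machine-learning work sits.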
“We talk about AI, but in the end, it’s about data. That’s where the real value is”
This is thanks to the department’s focus on crafting a UI tool and search function that provides direct access to the archives. Any non-specialist can check whether the archives hold the images or footage they need.
“We’ve built something that we call a classifier tool, where normal people are able to do the test by themselves,” explains Bouchet. “There is a user interface now, where they can actually input sample images of situations, and then create a kind of classifier. For example, if you just want to find images of squirrels in red trees, it will provide some images, and you can see if we have it.”
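RTS hasn’t published the internals of this classifier tool, but the pattern Bouchet describes, turning a handful of sample images into a searchable classifier, is commonly built on image embeddings from a pretrained vision model. A minimal sketch, assuming the sample images and the archive have already been converted to embedding vectors (the function names and the 0.8 threshold are illustrative assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def build_classifier(sample_vectors):
    """Average the user's sample embeddings into one prototype vector."""
    n = len(sample_vectors)
    return [sum(v[i] for v in sample_vectors) / n
            for i in range(len(sample_vectors[0]))]

def search(prototype, archive, threshold=0.8):
    """Return IDs of archive items whose embedding is close to the prototype."""
    return [item_id for item_id, vec in archive.items()
            if cosine(prototype, vec) >= threshold]

# Toy usage: two sample images define the classifier, the archive is searched.
proto = build_classifier([[1.0, 0.0], [0.9, 0.1]])
matches = search(proto, {"clip_0001": [1.0, 0.05], "clip_0002": [0.0, 1.0]})
```

Real embeddings have hundreds of dimensions rather than two, but the shape of the workflow, a few examples in, a ranked set of archive hits out, is the same.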
Other tool uses include digging back through years of footage to find individuals who would become more newsworthy in the present day, parsing footage to get a better idea of historic gender breakdowns, or isolating footage that the station doesn’t want to re-broadcast for legal reasons. The nature of the department has had to shift to nimbly handle these requests.
“Now, we are more of a proposal-force in the organisation. People come with quite complex questions to us,” Bouchet says.
In a time-sensitive example, the appointment earlier this year of Mattia Binotto as the new boss of the Ferrari F1 racing team provided a test case. The team knew they had footage of him and turned to their tools to find it quickly.
“We suspected quite strongly that we had (archived) images of him working in the field, before he was appointed head of Ferrari. We had no metadata about him in the system. With all the systems combined, we were able to find images of him, and we actually had a lot of shots of him,” recalls Bouchet.
AI for the masses
These are the fruits of in-house efforts at the Swiss broadcaster, but they come as an increasing number of commercial AI-based tools are being promoted in the general market, with video analysis prominent among the offerings.
The likes of Amazon’s Rekognition and Google’s Video AI are among the big-tech services that promise transformative insights when applied to a content producer’s video library.
When considering these kinds of services, Bouchet was swayed by an overwhelming figure. By developing the tools at RTS, and then building them into their existing systems, he estimated they would expend less than 10% of the funds that a commercial service would charge.
“I see a lot of providers and suppliers beginning to offer the kind of solutions we have into their systems,” he notes. “The broadcaster could be able to theoretically submit their content to these systems, and get back the same kind of features or information or metadata that we have.
“However, we’ve taken a different approach,” he explains. “We’ve taken all the algorithms and open source information and put that into our systems. We did that because we calculated that it was over ten times cheaper to do that, rather than uploading content (to a commercial service) and getting back the result.”
“That was even without the transfer fees, this was simply calculating the service prices at the volume we’re talking about.”
Bouchet reports that his counterparts at other European Broadcasting Union members often lament that funding constraints, a perceived lack of technical expertise, and an inability to hire top talent put developing tools like this beyond them.
“It’s clear that it’s hard, and we were really lucky to be in a position where I have some expertise in this field, so I’m heard. I know (other broadcaster officials) don’t, so they’re not heard by IT departments when they want to do things differently. I had a chance where the organisation let me do that, but it’s also a chance you have to provoke somehow.”
In a small country like Switzerland, the competition for workers capable of applying this technology is considerable, but Bouchet started small, securing a foothold with a very limited number of workers and then applying a data-driven approach to keeping them as satisfied as possible.
“We are not able, at all, to pay the kind of money that developers will find at our so-called competitors, like Google. They have a big development centre in Zürich for example, and they hire a lot of very interesting people in that field. But we were able to hire some, and with different agile approaches, able to create an environment where these people were really happy working with us.”
Data-driven hierarchy
Holacracy is the system the team hit upon as a way to improve worker satisfaction. An alternative to the traditional management hierarchy, it is advertised as empowering workers to act on their own knowledge and initiative. Two years after it was rolled out, it has been embraced by the 60-odd workers in Bouchet’s department.
“We started it as a test, as a one-year experiment,” he says. “Now we are in production with it. You can kind of think about it as a kind of operating system in that regard.”
By adopting the aspects of the system that worked best for them, workers were able to learn and keep the process constructive.
“That’s one of the things that holacracy can bring. You let people find their own talents. It’s really far from anarchy - you still have your own domains.”
“You still need a manager of the team, but you still need a coach to make sure you’re playing your role or not. That’s really key - the capability of the system to ask itself powerful questions now and then.”
The need for collaboration
Having previously attended IBC, Bouchet returns this year as one of the speakers at ‘The AI/ML effect’ on 17 September. Sharing his experiences in building and employing machine learning tools, as well as managing the human side that surrounds them, will be core to his contribution. He will also be seeking to tackle the trickier aspects of this emerging domain, with an open ear on how to confront the issue of data dependability in particular.
“One of the questions I want to solve is finding out where the other players are that we can play with. We have big challenges regarding data quality. We talk about AI, but in the end, it’s about data. That’s where the real value is.”
“We have a level of confidence that we ascribe to data. The really hard thing is when you begin to have confidence, but then how do you detect the false negatives or the false positives? I’d be very interested in collaborating with people on that.”
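One common way to put numbers on the trade-off Bouchet describes is to manually verify a sample of the system’s output and measure how a confidence threshold splits it into false positives (confident but wrong) and false negatives (correct results the threshold discards). A sketch, with illustrative data; this is a standard precision/recall calculation, not RTS’s method:

```python
def precision_recall(predictions, threshold):
    """predictions: list of (confidence, is_actually_correct) pairs
    from a manually verified sample of a model's output."""
    tp = fp = fn = 0
    for conf, correct in predictions:
        accepted = conf >= threshold
        if accepted and correct:
            tp += 1
        elif accepted and not correct:
            fp += 1  # false positive: accepted, but wrong
        elif correct:
            fn += 1  # false negative: right answer discarded
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Sweeping the threshold over a verified sample shows the trade-off:
sample = [(0.9, True), (0.8, False), (0.4, True), (0.3, False)]
for t in (0.3, 0.5, 0.85):
    print(t, precision_recall(sample, t))
```

Raising the threshold tends to improve precision at the cost of recall; choosing where to sit on that curve is exactly the confidence question Bouchet raises.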