Explainability of artificial intelligence (AI) and the ethical aspects of recommendation algorithms have been widely discussed in many forums. Before delving into the topic of distribution platforms and the challenges they pose, it is worth noting that the radio broadcasting industry celebrated World Radio Day (WRD) on 13 February 2026. This year’s theme, announced by UNESCO, highlighted threats to trust, particularly arising from AI.
WRD is a global celebration observed annually on 13 February, recognising and thanking broadcasters for the news they deliver, the voices they amplify, and the stories they share.
The theme for World Radio Day 2026, announced by UNESCO, was “Radio and Artificial Intelligence: AI is a tool, not a voice.”
Today, AI presents new opportunities and challenges not just for innovation but also for deepening radio broadcasters’ connection with their listeners. As the WRD theme reminds us, AI is a tool, not a voice.
When used ethically and responsibly to support professional judgement, creativity and public service values, AI can be a powerful tool in strengthening audience trust. However, technology alone does not build trust. Radio broadcasters must use AI appropriately as a tool to uphold their responsibilities and fulfil their role as radio broadcasters.
Certain AI technologies, such as speech-to-text and language translation, offer significant utility for broadcasters. At the same time, potential negative consequences of AI — such as deep-fake threats — can be addressed through technical standards like C2PA, the Coalition for Content Provenance and Authenticity.
C2PA is an open standard designed to certify the origin and history of digital content, functioning like a tamper-evident ‘birth certificate’ embedded within media files. By providing cryptographically signed provenance information, C2PA helps publishers, creators, and consumers verify the authenticity of digital content and combat misinformation.
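For readers curious about the mechanics, the sketch below illustrates the tamper-evidence idea behind content provenance. It is a simplified conceptual model, not the actual C2PA format: real C2PA manifests are embedded in media files and signed with X.509/COSE certificates, whereas this toy example uses an HMAC with an invented key purely to show how binding a hash of the content into a signed claim makes any later edit detectable.

```python
# Conceptual sketch of content-provenance checking in the spirit of C2PA.
# NOTE: real C2PA uses X.509/COSE signatures embedded in media files; the
# HMAC and key below are illustrative stand-ins, not the actual standard.
import hashlib
import hmac
import json

SIGNING_KEY = b"publisher-secret-key"  # hypothetical; real C2PA relies on PKI certificates


def make_manifest(media_bytes: bytes, producer: str) -> dict:
    """Bind a provenance claim to the media via a hash, then sign the claim."""
    claim = {
        "producer": producer,
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def verify(media_bytes: bytes, manifest: dict) -> bool:
    """Re-hash the media and re-check the signature; any edit breaks one or both."""
    claim = manifest["claim"]
    if hashlib.sha256(media_bytes).hexdigest() != claim["content_hash"]:
        return False  # media was altered after signing
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])


audio = b"radio bulletin audio bytes"
manifest = make_manifest(audio, "Example Broadcaster")
print(verify(audio, manifest))              # True: untouched content verifies
print(verify(audio + b"tamper", manifest))  # False: any edit breaks verification
```

The same principle, scaled up with public-key cryptography and a chain of edit records, is what allows C2PA to act as the tamper-evident ‘birth certificate’ described above.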
Radio broadcasters must amplify voices while safeguarding the trust that audiences place in them. The careful selection of technologies, including AI, is essential to harness their benefits responsibly. While embracing technologies such as AI, radio broadcasters must remain guided by the fundamental principles of inclusiveness, universal access to information, and the provision of entertainment and education to all audience segments without creating divides.
AI can assist broadcasters with repetitive and routine tasks, enabling them to focus their human-centred creativity on producing content that respects creative rights and serves the public interest.
On 13 February 2026, broadcasters around the world marked World Radio Day, reflecting on the trust they have built with their audiences and reaffirming their commitment to maintaining it.
Focusing on platforms and social media, it is worth noting that Meta owns a diverse portfolio of platforms, messaging apps, and technology ventures, including Facebook, Instagram, WhatsApp, Messenger, Threads, and Reality Labs.
Recently, Meta’s Head of Global Safety (HGS) visited New Zealand and pushed back against calls to ban children from social media. As the New Zealand government weighs whether to follow Australia’s lead, the company cautioned that bans alone will not stop children from going online.
The HGS noted that the challenge with a ban is that it is impossible to block the entire Internet. Speaking to New Zealand public TV broadcaster One News, Meta explained that an outright ban can create a false sense of security, as machines and workarounds can circumvent safeguards. The company also highlighted that platforms can inadvertently drive users towards less safe online experiences, citing Australia’s under-16 ban as an example.
The HGS recommended building “teen accounts” with parental oversight, designed around three main concerns: who children are connecting with, what kind of content they are seeing, and how much time they are spending online. Children cannot access these apps without parental permission.
The HGS also emphasised that responsibility for children’s online safety is shared. Meta does not place all the responsibility on a single party; instead, the company maintains that a safe experience involves a partnership between parents, the platform, and other stakeholders.
There is no single fix. Addressing the issue requires more than bans and access restrictions; greater emphasis on education and stronger regulatory frameworks is needed. The New Zealand Online Safety Group remained unconvinced: although it acknowledged that Meta had responded to growing calls for safer products, the trust it once placed in these platforms has been fundamentally eroded, with meaningful action coming only after years of exposure and harm.
It was reported recently that YouTube and the BBC reached agreements for the distribution of selected BBC content. At the same time, questions have been raised about the algorithms that drive content recommendations to audiences.
YouTube has also announced new bundling options for YouTube TV, which will roll out over the next several weeks at prices slightly lower than the main YouTube TV plan. The company plans to offer more than 10 bundles across sports, news, entertainment, and family content, although only four have been detailed so far.
According to research from Omdia, YouTube reached 29 billion videos as of December 2025, with growth driven by Shorts, AI-generated content, and expansion in markets such as India. Omdia also notes that YouTube is the most popular video service globally and is on track to surpass 30 billion videos in early 2026.
Omdia research also shows that YouTube hosts the equivalent of 280,000 years of video content, most of which is rarely watched: the least-watched 99% of videos account for just 9% of total viewing time. This long tail is a notable aspect of the platform’s strategy.
One reason for maintaining such a vast library is that the content also serves as training data for Google’s Gemini video models. While user-generated content often shapes perceptions of YouTube’s success, Omdia’s findings suggest a more complex reality. By 2026, YouTube has evolved into a highly diverse platform, with professional content, music, news, and podcasts all influencing viewing patterns.
Omdia further notes that YouTube’s growth has accelerated in recent years: 25% of all videos available in 2025 were uploaded within the first 10 months of the year, a surge driven largely by short-form videos, with Shorts accounting for more than 90% of new uploads. The top 1% of videos generate 91% of total viewing time. Despite this imbalance, the long tail of content continues to play a critical role in sustaining the platform’s ecosystem.
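For readers curious how such concentration figures are derived, the short sketch below computes the viewing share of the top 1% of a synthetic video library. The 91%/9% split is Omdia’s reported finding; the power-law data here is invented purely for illustration and is not Omdia’s dataset.

```python
# Illustrative long-tail arithmetic. The synthetic power-law view counts below
# are invented to show how a "top 1% of videos capture X% of viewing" figure
# is computed; they are not real YouTube data.
def concentration(view_counts, top_fraction=0.01):
    """Return the share of total viewing captured by the top `top_fraction` of videos."""
    ranked = sorted(view_counts, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:k]) / sum(ranked)


# Synthetic library of 100,000 videos: a few hits, a vast rarely watched tail.
views = [1e7 * rank ** -1.3 for rank in range(1, 100_001)]
share = concentration(views)
print(f"Top 1% of videos capture {share:.0%} of total viewing")
```

With a steeper or shallower exponent the concentration shifts accordingly, which is why the shape of the tail, not just library size, determines how much of YouTube’s catalogue is effectively dormant.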
Three key findings by Omdia on viewer preferences are noteworthy. First, music and professional content dominate viewing: music videos account for 33% of total YouTube viewing time, while professionally produced content represents 46%. Second, podcasts are gaining traction, with video podcasts now making up 5% of total viewing, reflecting rapid growth in the format. Third, news remains significant, capturing 10% of viewing time and ranking as the third most popular category on the platform.
Against this backdrop, the newly announced programming agreement between the BBC and YouTube aims to target UK children and young adult audiences through the launch of dedicated channels. These include a channel with the working title Deepwatch, featuring new and existing BBC documentaries, as well as seven new children’s channels, including The Epic Facts channel, which will feature content from CBBC’s Operation Ouch, Horrible Histories, Horrible Science, and Deadly 60.
Led by the National Film & Television School (NFTS), the initiative will bring together 150 media professionals to develop their YouTube skills through a series of workshops and events. A specially curated training programme will be delivered both online and at BBC hubs in Salford, Birmingham, Glasgow, Newcastle, Belfast, and Cardiff, recognising the role of creators in strengthening the UK economy and showcasing home-grown talent.
The BBC Director-General said it is essential that everyone derives value from the BBC, and that this partnership will enable the BBC to connect with audiences in new ways. Building on a strong foundation, the collaboration is intended to take the BBC to the next level by delivering bold, home-grown content in formats that audiences seek on YouTube, alongside an unprecedented training programme to upskill the next generation of UK-based YouTube creators.
The partnership is also expected to provide new audiences with alternative pathways into BBC services such as BBC iPlayer and BBC Sounds.
While these developments are underway, Google moved to shut down YouTube UK ratings. In late January 2026, Google required UK television measurement providers Barb and Kantar Media to suspend a service that compared viewership on its video platform YouTube with other TV channels and streaming services.
According to reports by the Financial Times, Barb and Kantar Media had sought to expand their initiative to measure YouTube content viewed on TV sets. The aim was to enable comparisons between YouTube channels and linear TV and streaming platforms using the same independent methodology.
As reported by the Financial Times, individuals close to the dispute said that while Google did not agree that the service accurately represented viewership on YouTube, its legal request was based on alleged breaches of the platform’s terms of service regarding the use of creator content.
YouTube’s UK viewership is also measured through Ipsos/Iris, while advertising metrics are assessed by Nielsen, AudienceProject and ISBA’s Origin, a cross-media measurement tool for advertisers.
In conclusion, two questions merit consideration.
Meta’s visit to New Zealand was timely in raising awareness among parents, children, and platform operators. However, simply telling parents that their concerns have been heard is unlikely to be sufficient. What concrete assurances can platform operators such as Meta provide to demonstrate that they are genuinely safeguarding children on their platforms?
More broadly, in light of concerns about insufficient accountability and limited transparency, how can the broadcast and wider media ecosystem, including audiences themselves, place trust in these platforms and other social media services?
I wrote this article in late February/early March 2026 and published it as a monthly article in the APB+ publication.
Since publication, I have the following update.
Meta and Google have been found liable in court:
As USA Today reported, Meta and Google have been found liable in court for a woman’s social media addiction. In a landmark decision in Los Angeles, the jury found that the companies had built addictive social media platforms that harmed the woman’s mental health. Jurors apportioned 70% of the responsibility for the harm to Meta, which owns Instagram, and the remaining 30% to YouTube, owned by Google. In a statement, Meta said it respectfully disagrees with the verdict and is evaluating its legal options. Hundreds of similar cases are currently before the US courts, and the latest decision is likely to influence those outcomes.
This article was published on 12 March by APB+ under Distribution News & Events.
Written by a Member of the Executive Committee of MTN, FIET and CEng of IET, Distinguished Lecturer of IEEE-BTS and FEngNZ