Artificial Intelligence and Broadcasting: Why ethics must come before efficiency

Each year, on World Radio Day, observed on February 13, the world pauses to celebrate one of the most enduring institutions of public communication.

UNESCO’s message has remained largely consistent: radio matters because it informs, educates, and amplifies voices that are too often excluded from the public conversation.

This year’s theme, “Radio and Artificial Intelligence: AI is a tool, not a voice,” is therefore not merely ceremonial. It is a reminder that the strength of broadcasting has never been rooted in technology alone, but in the credibility that broadcasters build through consistency, judgement, and accountability.

UNESCO’s focus on technology this year is not accidental; it is a direct response to the growing presence of artificial intelligence in news production and broadcast routines.

What follows is a simple argument: if broadcasters and regulators embrace AI without clear ethical guardrails, they may gain efficiency in the short term, but in the long term they risk losing credibility, which remains the true currency of broadcasting.

That reminder could not be more timely, because trust is precisely what new technologies now place under strain. Artificial intelligence is no longer a distant prospect for media executives to debate at conferences.

It has already entered the daily routines of broadcasting, assisting with editing, scheduling, transcription, translation, audience analytics, and content discovery. At a time when traditional media organisations face shrinking advertising revenue and aggressive competition from digital platforms, AI offers an attractive promise of speed, scale, and efficiency.

Yet it also introduces new ethical dilemmas, because the same systems that improve productivity can, if poorly governed, blur the boundaries between editorial judgement and automated output.

The very qualities that make AI appealing also make it risky. Recent controversies involving synthetic voices, manipulated audio clips, and AI-assisted misinformation, particularly in politically sensitive contexts, have shown how easily technology can blur the line between authentic speech and manufactured reality.

For regulators and station owners, the challenge is therefore not whether AI should be adopted, because it already has been; it is whether it can be governed in a way that strengthens broadcasting without eroding the trust on which its authority ultimately depends.

Broadcasting has never been a neutral industry. Unlike digital platforms that are largely designed to maximise clicks, engagement, and rapid circulation, radio and television operate under a duty, explicit or implied, to serve the public interest. In some jurisdictions, this obligation is written into law; in others, it rests on regulatory tradition and public expectation.

Either way, it remains central to how society understands broadcasting and why it continues to matter. Radio in particular holds a distinctive place in civic life. It reaches across literacy levels, income groups, and geographic divides, and in many communities it remains the most trusted source of news and public information.

When technology reshapes radio, therefore, it is not merely changing a business model; it is touching the infrastructure of public life itself.

When used responsibly, AI can indeed become a powerful ally. It can preserve institutional memory by organising archives that would otherwise be lost to time and poor storage. It can promote inclusion through translation and accessibility tools that allow more people to participate in public discourse.

It can also support audience research and programme planning in ways that help broadcasters remain relevant and responsive. In these roles, AI strengthens broadcasting rather than diminishing it. The difficulty arises when the pursuit of efficiency begins to eclipse the duty to serve ethically, carefully, and with professional judgement.

The first ethical line that must not be crossed is editorial accountability. Decisions about what to air, how to frame a story, which voices to foreground, and what tone to adopt are not merely technical tasks. They are editorial responsibilities that require experience, context, and moral awareness. AI may assist, analyse, and recommend, but it must never be allowed to replace human judgement.

Regulators should therefore require broadcasters to clearly define which functions are automated and which remain under human control, with these boundaries properly documented as part of licensing and compliance processes.

In the United Kingdom, for instance, Ofcom has repeatedly emphasised that broadcasters remain responsible for the standards and protections required under the Broadcasting Code, regardless of whether new technologies are used in the production process. The principle is clear: automation does not remove accountability.

Closely linked to accountability is the question of transparency. Listeners have a right to know when content has been generated, manipulated, or significantly shaped by artificial intelligence. If a voice is synthetic, a story has been automatically summarised, or a programme segment has been digitally assembled, audiences should not be left to guess.

Trust does not collapse because technology exists; it collapses when people feel misled. It is therefore reasonable that disclosure rules for AI-generated or AI-assisted content should sit alongside existing regulations on sponsorship, advertising, and political messaging.

A third concern, increasingly urgent, relates to voice and identity. The familiar voices that define radio stations are not merely sounds. They are anchors of credibility, cultivated over years through consistency, presence, and trust-building. The use of AI to clone or replicate those voices without explicit consent raises serious moral and legal questions, because it effectively turns a person’s identity into a reusable asset.

Around the world, there have already been cases of voice cloning used for fraudulent advertising, misinformation campaigns, and impersonation. Regulators should treat this not as a novelty, but as an emerging rights issue, while station owners must ensure that presenters and journalists retain control over their own voices and reputations, both contractually and ethically.

There is also the critical issue of data responsibility. AI systems rely on audience data to function effectively, and the temptation to collect and monetise listener information will only grow. Broadcasting is no longer confined to the traditional receiver of old.

Stations now engage audiences through apps, streaming platforms, websites, newsletters, and digital listener communities, all of which generate valuable personal data. This data must be handled with care, particularly in environments where privacy laws are weak, outdated, or inconsistently enforced. Recent high-profile breaches across the digital economy have shown how quickly public trust can evaporate when personal information is mishandled or quietly traded.

If radio is to remain a trusted institution, it cannot afford to behave like the least accountable corners of the online world. Regulators should therefore insist on clear data governance frameworks that prioritise listener privacy and ethical data use over short-term commercial gain.

Finally, there is the question of institutional purpose, which is perhaps the most overlooked dimension of the AI conversation. Artificial intelligence must not become a convenient excuse to hollow out newsrooms, weaken professional capacity, or replace human development with automation.

When technology substitutes for mentorship, editorial debate, newsroom collaboration, and professional training, the result is not simply efficiency; it is a slow decline in depth, ethical awareness, and institutional competence. Regulators cannot dictate how broadcasters run their businesses, but they can establish standards that ensure technological innovation does not erode the editorial capacity and public service mission that justify broadcasting’s privileged role in society.

History shows that once trust in broadcasting is lost, it is extremely difficult to rebuild. Audiences do not tune out because a station lacks innovation; they tune out when it begins to sound careless, detached, or unaccountable. Artificial intelligence will undoubtedly change how broadcasting operates, but it must not be allowed to redefine what broadcasting represents.

In the end, the future of broadcasting will not be measured by how efficiently it runs, but by how faithfully it serves. Technology may amplify voices, but only ethics can make them credible. The time has therefore come for regulators to establish clear AI standards for the broadcast sector, and for station owners to adopt enforceable internal policies that protect editorial accountability, mandate transparency, safeguard identity, and secure audience data.

Broadcasters should not wait for scandal or public backlash before acting. If these steps are taken decisively, AI can strengthen broadcasting’s public value and restore confidence in an increasingly noisy media environment. If they are postponed, the industry may discover too late that credibility, once damaged, cannot be repaired by innovation.

Artificial intelligence may sit in the studio, but it is human responsibility that must remain firmly behind the microphone.

(The writer is an award-winning senior media executive, historian, educator, and leadership consultant. He holds master’s degrees in Business Administration, Communication Studies, Education, and African Studies, and specialises in history education, media innovation, organisational strategy, and fostering leadership excellence across diverse sectors)

DISCLAIMER: The Views, Comments, Opinions, Contributions and Statements made by Readers and Contributors on this platform do not necessarily represent the views or policy of Multimedia Group Limited.


Source: www.myjoyonline.com