Navigating risks and rewards - How South African journalists use AI in the newsroom

New Study Finds South African Newsrooms Rapidly Adopting AI – But Gaps in Training, Policy and Local Tools Remain

This study provides one of the first comprehensive assessments of how South African newsrooms are adopting and using AI. The research finds that AI tools are already widely used for tasks such as research, summarisation, transcription, and drafting content, yet adoption remains informal and uneven across organisations. Most journalists reported receiving little to no formal training on AI, raising concerns about ethical use, accuracy, and the risk of declining public trust. The study highlights the urgent need for structured training, clear editorial guidelines, and the development of tools tailored to African languages and contexts.

A new study finds that artificial intelligence (AI) is already widely embedded in South African newsrooms, delivering efficiency gains while raising urgent concerns around training, ethics and public trust. Authored by Karen Allen, Prof. Herman Wasserman and Nande Mbekela, the report provides one of the first comprehensive snapshots of how journalists across print, broadcast and digital media are using AI in their daily work.


AI is widespread but unevenly understood

The study shows that journalists are using AI tools for research, summarisation, transcription, translation, and drafting headlines and social media content. These technologies are valued for saving time and improving workflow efficiency. However, adoption remains largely informal and uneven, driven more by individual initiative than by institutional strategy.


Most journalists report feeling ill‑equipped to use AI responsibly, with little to no formal training provided by their organisations. As a result, many rely on self‑teaching or peer learning, leading to inconsistent practices across newsrooms.


Lack of policies and training creates risk

One of the report’s most significant findings is the absence of formal AI policies in many South African news organisations, leaving journalists to navigate complex ethical and professional challenges without clear guidance. The study warns that this gap increases the risk of inaccurate or misleading content caused by AI “hallucinations”, of plagiarism and copyright violations, of the erosion of journalistic skills, and of declining public trust. While journalists are aware of these risks, many compensate by manually double‑checking AI outputs, which often reduces the efficiency gains AI is meant to provide.


Trust, ethics and local relevance

Concerns around trust emerge strongly in the research. Journalists expressed unease about the reliability of AI‑generated content and the risk of producing generic or context‑poor reporting. The report also highlights challenges in South Africa’s multilingual context, noting that many AI tools perform poorly in African languages such as isiZulu, isiXhosa and Sepedi, raising concerns about linguistic accuracy and cultural relevance.


Balancing opportunity and threat

Despite these concerns, journalists recognise AI’s potential to strengthen journalism, particularly in resource‑constrained environments. AI can support data analysis, verification and content production, including complex investigations. However, this optimism is balanced by fears of job displacement and the weakening of core journalistic skills.


Key recommendations

The report concludes that the key question is no longer whether AI should be used in newsrooms, but how it can be used responsibly. It calls for:

  • Structured and practical AI training for journalists
  • Clear editorial guidelines and newsroom AI policies
  • Development of AI tools tailored to African languages and contexts
  • Strong ethical frameworks to protect accuracy, accountability and public trust


The full study is available for download above.


You can watch the recording of the study launch here: Study Launch - Livestream recording

Contact

Rebecca Sibanda
Project Manager
rebecca.sibanda@kas.de | +27 (11) 214 2900 | +27 11 214 2913/4