
Contested truths in the age of AI: Nigeria’s road to 2027

Nigeria enters 2026 as a decisive pre-election year, with political mobilisation, reform delivery, and public trust increasingly shaped by a fast-evolving information environment. With general elections set for 20 February 2027, this year effectively marks the start of the formal campaign cycle – when narratives crystallise, alliances consolidate, and institutional credibility is tested well before ballots are cast. 

While disinformation is a longstanding challenge in Nigerian politics, the 2027 cycle marks a qualitative shift. The democratisation of generative AI (GenAI) means that synthetic audio, deepfake videos, and hyper-targeted micro-narratives are no longer the preserve of specialist or fringe actors; they are readily available manipulation tactics that can be deployed to undermine the democratic process. Yet this same shift creates space for more deliberate voter education on how to detect AI-generated content. It also requires exploring how AI can be leveraged for institutional monitoring, and developing policies and practical guardrails that protect citizens and reduce the risk of arbitrary or abusive use.

The trust deficit: high stakes, low threshold

The 2027 elections unfold against a backdrop of already fragile public confidence in Nigeria’s electoral process. Post-election assessments and public opinion surveys following the 2023 polls documented declining trust in electoral outcomes and in the institutions charged with managing them. In one post-election survey of 542 respondents, 63% expressed no confidence in the Independent National Electoral Commission’s (INEC) vote tally and declaration of a winner, 67% disagreed that the elections were free of fraud, and 52% expressed dissatisfaction with how democracy works in Nigeria.

This existing trust deficit will shape how citizens engage with the electoral process in the lead-up to the next general elections. As AI-enabled manipulation becomes more commonplace, political actors are likely to incorporate these tools into their campaign strategies, accelerating scepticism and further lowering the threshold at which doubt takes hold. In such an environment, contested narratives can gain traction more quickly, making trust harder to rebuild once it has been eroded.

What “institutional preparedness” really means

Without adequate institutional preparedness, AI risks further eroding public trust, amplifying polarisation, and undermining the legitimacy of the 2027 elections at a time when Nigeria is also grappling with economic reform fatigue and heightened social pressures. In this context, preparedness extends beyond logistical readiness to the ability of institutions, including electoral bodies such as INEC, as well as security and communications agencies, to verify information rapidly and respond before false narratives take hold. It also requires the proactive use of AI as a defensive and monitoring tool to detect coordinated disinformation campaigns, flag synthetic media, and identify unusual patterns in digital political engagement. More broadly, it requires stronger data-driven analysis to distinguish genuine threats from manufactured narratives, along with timely, clear, and credible public communication when misinformation begins to circulate.

Institutional responses, however, carry their own risks. In election cycles across the region, including recently in Uganda and Tanzania, governments have resorted to internet shutdowns or restrictions on social media access to restore order. Nigeria has also experimented with similar measures, most notably the seven-month suspension of Twitter in 2021 after the platform deleted a presidential post, against a backdrop of tensions dating back to the #EndSARS protests, illustrating how quickly digital controls can become part of the state’s crisis-response toolkit. While often framed as a regulatory necessity, such measures can undermine economic activity, fuel suspicion, and exacerbate perceptions of opacity precisely when transparency is most needed.

The actors and the risks

The most sophisticated use of AI is likely to come from well-resourced political actors. Major parties such as the All Progressives Congress, the Peoples Democratic Party, and the African Democratic Congress, alongside high-profile aspirants and their consultants, are increasingly supported by digital teams capable of deploying AI-assisted messaging at scale. Fringe players are also expected to leverage the technology at the subnational level. Much of this activity is likely to operate through proxy networks, unofficial pages, influencers, and pseudo-media platforms, allowing plausible distance from candidates themselves. These campaigns are most likely to tap into longstanding sensitivities around ethnicity, religion, economic grievance, and insecurity, as well as gendered narratives aimed at discrediting women candidates.

Beyond domestic actors, there is also growing precedent for external influence in African information environments, from foreign-linked troll networks shaping public sentiment in Sahelian states to international political consulting firms such as Cambridge Analytica, which previously worked to influence online opinion during elections in Kenya. In Nigeria’s case, political actors may increasingly draw on or compete with transnational digital consultancies, data firms, and covert influence networks that can amplify narratives at scale and across borders.

How citizens encounter AI

Crucially, the implications of AI-driven information disorder extend beyond institutions to everyday social interactions. Some of the most vulnerable constituencies are digitally connected populations with limited formal literacy, who rely heavily on audio messages, videos, and forwarded content rather than on text-based verification. Youth-heavy online communities, already shaped by economic frustration and algorithmic amplification, are also particularly exposed, as are rural and peri-urban communities where trusted intermediaries play an outsized role in information transmission. In such contexts, the question is not whether citizens understand AI as a concept, but how they assess credibility in practice.

This raises a critical policy challenge for 2026: how do we approach digital and civic education in a way that reflects lived realities? Public awareness efforts cannot assume high literacy levels, nor can they assume that citizens readily recognise or trust official sources of information. Instead, effective interventions are likely to work through local radio, trusted community and social influencers, and local associations. These interventions should focus on practical cues – questioning emotionally charged content, recognising common manipulation tactics, and pausing before sharing – rather than technical explanations of AI itself.

Complementing this is growing momentum around accessible, AI-enabled fact-checking tools that reflect Nigeria’s linguistic diversity and social context. Civil society organisations such as Dubawa, Africa Check, the Premium Times Centre for Investigative Journalism, and election-monitoring coalitions such as Situation Room already play a central role in verification and public education, though their reach remains limited. Dubawa and FactCheckAfrica are now deploying AI tools to strengthen this work, including rapid verification platforms, real-time claim responses via WhatsApp, and audio analysis of live radio in Nigerian English and Pidgin. Progress, however, is constrained by the high cost of AI infrastructure, the risk of generative AI “hallucinations” that require human oversight, and the complexity of accurately processing Nigeria’s diverse and often mixed languages.

This also places responsibility on social media companies whose content moderation and detection systems often struggle to interpret local nuance, including language, context, and culturally specific references, limiting their effectiveness in fast-moving information environments. Strengthening this capacity will require greater investment in region-specific language models, deeper partnerships with local actors, and faster escalation pathways to identify and respond to harmful narratives before they gain traction.

Why 2026 matters

The stakes extend beyond the election itself. A fragile information environment complicates government communication, increases the risk of misinformation-driven unrest, disrupts the delivery of social and economic interventions, and raises political risk for investors and development partners. In a country with a large youth population facing limited economic opportunities, contested narratives can spread quickly and have real-world consequences. The choices made in 2026 – whether to strengthen institutional capacity and public preparedness or to rely on reactive controls – will shape not only the legitimacy of the 2027 elections but Nigeria’s ability to manage stability and sustain reform momentum in the years that follow.

About the Author

Nabila Okino is a Consultant in our Insights team, based in Abuja, Nigeria. Nabila joined Africa Practice from the Central Bank of Nigeria, where she developed expertise in policy development, research, advocacy, and stakeholder engagement across diverse areas, including financial inclusion and women’s rights. She can be reached at [email protected]

