UNFPA Warns AI Accountability Gap Threatens Trust & Digital Economy

Andrea Wojnar of UNFPA India has highlighted a critical "accountability gap" in artificial intelligence systems, warning they risk deepening existing inequalities. She emphasized that when users, particularly women and girls, feel unsafe online, participation drops and the digital economy's potential narrows. Wojnar framed trust in AI not just as an ethical issue but as a fundamental economic one, where mistrust can slow adoption and increase reputational risks. Her remarks underscore that closing this accountability gap is essential for sustainable and inclusive digital growth.

Key Points: AI Accountability Gap Risks Deepening Inequality: UNFPA

  • AI reshapes risk landscape
  • Accountability gap reflects structural inequalities
  • Trust is a core economic issue
  • Mistrust slows AI adoption
  • Digital safety impacts economic growth

UNFPA's Andrea Wojnar warns AI's accountability gap erodes trust, slows digital economy growth, and disproportionately impacts women and girls.

"When people, especially women and girls, feel unsafe, online participation drops and the promise of the digital economy narrows. - Andrea Wojnar"

New Delhi, February 16

Andrea Wojnar, Resident Representative of the United Nations Population Fund (UNFPA) in India, has raised concerns at the India Impact AI Summit 2026 about what she described as a widening "accountability gap" in the age of artificial intelligence, warning that unequal and biased systems risk deepening existing inequalities, particularly for women and girls.

Speaking on the evolving role of artificial intelligence in society, Wojnar emphasized that while AI presents enormous opportunities, it also reshapes the landscape of risk. "AI is reshaping risks, but possibilities also. AI will influence safety," she said, underlining the dual nature of rapidly advancing technologies.

"When people, especially women and girls, feel unsafe, online participation drops and the promise of the digital economy narrows. When users don't trust AI enabled services, adoption slows and reputational risks Grow digital economy, do not reach its potential. It happens with observation is navigating it under threat," she added

According to Wojnar, the accountability gap in AI systems is not neutral. It reflects structural inequalities that can disproportionately affect those already marginalized. She stressed that questions of responsibility -- who designs, regulates, deploys and benefits from AI -- remain unevenly addressed across sectors and geographies.

A central theme of her remarks focused on trust. Beyond ethics and governance, she framed trust as a core economic issue. "But trust is also an economic issue, and for those of you who attended our session in December with our private sector tech partners, you'll know that when people, especially women and girls, feel unsafe, online participation drops and the promise of the digital economy narrows," she said.

Her comments suggest that digital safety is not merely a human rights concern but also a determinant of economic growth. When online spaces feel hostile or unsafe, participation declines. This withdrawal has ripple effects: fewer users, reduced engagement, and ultimately a contraction in the potential of digital markets.

Wojnar further cautioned that mistrust in AI-enabled services can slow technological adoption. "When users don't trust AI-enabled services, adoption slows and reputational risks grow, and the digital economy does not reach its potential," she said.

The implications, she indicated, extend to both public institutions and private sector actors. Companies investing heavily in AI innovation may find that technical sophistication alone does not guarantee uptake. Without safeguards, transparency and accountability, reputational risks can escalate, limiting the very growth the digital economy promises.

Her remarks align with broader global discussions about ethical AI governance, data protection and inclusive digital transformation. As AI systems become embedded in health care, education, finance and public services, ensuring they operate fairly and safely is increasingly seen as foundational to sustainable development.

For UNFPA, whose mandate centers on reproductive health, gender equality and population dynamics, the intersection of AI, safety and gender equity is particularly significant. Wojnar's intervention underscores a growing recognition that digital transformation must be accompanied by deliberate efforts to close accountability gaps -- or risk reinforcing the very inequalities it has the potential to solve.

- ANI

Reader Comments

Arjun K
Trust is everything. Look at UPI's success in India - it worked because people trusted the system. If AI services are biased or unsafe, people will simply avoid them. The accountability gap is real and needs fixing before it's too late.
Sarah B
As someone working in tech, I see this daily. The teams building these AI systems often lack diversity. The "accountability gap" starts there. We need more women and people from varied backgrounds in the room where these tools are designed.
Rohit P
Absolutely correct. But I respectfully disagree on one point - the issue isn't just about feeling unsafe. It's about actual safety. We need concrete technical safeguards and transparent grievance mechanisms, not just awareness campaigns.
Meera T
My daughter uses educational apps with AI. I'm always worried about what data is collected and how it's used. If parents don't trust these platforms, they won't let their children use them. Trust is the foundation of the digital economy, bhaiya.
David E
The link between safety and economic growth is spot on. In India's push for a $1 trillion digital economy, ignoring these social aspects would be a strategic mistake. Companies need to build trust by design.
