I Built This ‘AI Aunt’ for Women After a Family Tragedy in South Africa

Updated Date: November 10, 2025
Written by Kapil Kumar

Leonora Tima, an NPO specialist and gender equality activist in South Africa, is now being discussed around the world for her new digital platform, nicknamed the ‘AI aunt’, where people can talk to each other and track abuse. Tima got the idea for the platform after one of her own relatives, 19 years old and nine months pregnant, was killed; her body was dumped on a highway near Cape Town in 2020.

“I work in the development sector, so I’ve seen violence. But what stood out for me was that my family member’s violent death was seen as so normal in South African society. Her death wasn’t published by any news outlet because the sheer volume of these cases in our country is such that it doesn’t qualify as news,” Leonora shared.

According to reports, the killer was never identified or caught. All Leonora could see was society's silent acceptance of a woman's death. The incident shook her so deeply that she eventually arrived at an idea that could give women a voice in the pursuit of justice. The result was an app called Grit (Gender Rights in Tech), with an integrated AI chatbot called Zuzi.

It is one of the first AI tools made by African creators to focus on gender-based violence. According to Leonora, “This is an African solution co-designed with African communities.”

The app's core focus is to help gather evidence of such incidents and offer support to victims. The collected evidence can later be used in legal proceedings against abusers.

The initiative has been warmly welcomed by international women's rights activists. However, care has been taken to ensure that the chatbot does not replace human support, while still offering empathy to survivors and building an emotional connection with the user.

Through a survey of local residents, Leonora found that people wanted to talk about their abuse but were not comfortable with the traditional means of doing so. “Some women would post about it on Facebook and even tag their abuser, only to be served with defamation papers,” she says.

The app was built with support from Mozilla, the Gates Foundation, and the Patrick McGovern Foundation. The platform is free to use and currently has a user base of more than 13,000, with more than 10,000 requests for help in September.

Grit has three main features that people use to report abuse and get support. The first is a large, circular help button on the home screen. When pressed, it records 20 seconds of audio to capture what is happening in the surroundings and pushes an alert to a private rapid-response call center, where a trained operator is available to call the user.

When the user needs immediate help, the response team sends its members to the scene or calls the organization's local representative who is closer and better placed to go to the user's aid. “We need to earn people's trust. These are communities that are often ignored. We are asking a lot from people when it comes to sharing data,” Leonora shared.

Another feature is “the vault,” which offers a secure space for users to store evidence of abuse. Entries are encrypted and time-stamped so they can later be used in legal proceedings, and different media formats, such as screenshots, photos, and videos, can be uploaded.

“Sometimes women take photos of injuries or save threatening messages, but those can get lost or deleted,” Leonora said. “The vault means that evidence isn't just sitting on a phone that could be taken away or destroyed.”

The third feature, Zuzi, is planned to launch this month. It is an AI-powered chatbot designed to listen, advise, and guide users to local community support. “We asked people: ‘Should it be a woman? Should it be a man? Should it be a robot? Should it sound like a lawyer, a social worker, a journalist, or another authority figure?’” Leonora explains.

In response, she learned that people wanted Zuzi to be an “aunt figure”: someone warm and trustworthy whom they could confide in without fear of judgment.

Zuzi was originally designed to help women who are victims of abuse. However, the testing phase revealed that men were also seeking help. “Some conversations are from perpetrators, men asking Zuzi to teach them how to get help with their anger issues, which they often direct at their partners,” Leonora explains. “There are also men who are victims of violence and have used Zuzi to talk more openly about their experience.”

“People like talking to AI because they don’t feel judged by it,” she adds. “It’s not a human.”