Friday, 20 March 2026

Even Good Intentions Can Cause Harm

Photo by Zachary Keimig on Unsplash

By Ramatu Ada Ochekliye

I made a post on the Shades of Us page about Female Genital Mutilation (FGM), and Facebook flagged it as “Child Sexual Exploitation,” placing a restriction on my profile.


Even after an appeal, the decision stood.


So here is what I will stand on:


According to UNICEF and other FGM-related research:


  • Over 144 million girls and women in Africa have undergone FGM.

  • The practice is concentrated across Western, Eastern, and North-Eastern Africa.

  • Highest prevalence countries (2022 data): Somalia (98%), Guinea (97%), Djibouti (93%), Egypt (91%), and Mali (91%). It is also widely practiced in Nigeria, The Gambia, and beyond.

  • The highest numbers of affected individuals are in Ethiopia (33 million), Egypt (31 million), and Nigeria (20 million). While these totals partly reflect population size rather than prevalence alone, the scale remains staggering.

  • Several countries, including Cameroon, Ghana, and Uganda, are on track to eliminate FGM by 2030. This aligns with the United Nations Sustainable Development Goal target of zero cases by 2030. Progress, however, is not fast enough.

  • In The Gambia, legislation banning FGM is currently under threat of reversal by some religious and traditional institutions.


This is why my post mattered.


And yet, the AI system flagged it, specifically because of the image I used.


So, what image did I use? Well, the image showed a child being held down by adult women – often the enablers and perpetrators of this violence against other women and girls – while a razor was poised to cut her. Her genitalia were not visible, but the terror on her face was unmistakable.


The obvious question, then, is: why did I use it?


Honestly, because I wanted people to feel something.


I had already created a four-part educational series using illustrations to explain the impact of FGM, but I feared the message wasn’t landing – that the illustrations didn’t convey the gruesomeness of this form of gender-based violence. So, I admit that I fell for sensationalism and chose shock value.


But here is the truth I now hold myself accountable to: even when advocating for children, we must never expose them to further harm.


Since learning more about safeguarding—especially in storytelling, advocacy, and media—I understand that intention is not enough. Protection must always be deliberate.


Using graphic or distressing images of children, even to condemn abuse, can:


  • Violate their dignity

  • Re-traumatize survivors

  • Normalize harmful imagery

  • And, in some cases, unintentionally serve the very systems we are trying to dismantle


So while I wholeheartedly disagree with Facebook’s decision and categorization, I also acknowledge this: I could have protected that child better.


Shades of Us has had a child safeguarding policy for the last four years. Every image we take and share must preserve a child’s dignity, humanity, and safety, including images sourced from the internet, as this one was.


We have taken down the image everywhere and will never use it again.


But this brings me to a critical question: if we must protect children, even in conversations about their abuse, how do we still hold the world accountable?


Here is what I am learning: we DO NOT need to show harm to prove harm exists.


Instead, we can safeguard children while strengthening advocacy by:


  • Using Survivor-Centered Storytelling: We can share stories with consent, anonymization, or composite narratives that protect identities while conveying truth.

  • Choosing Dignity-First Visuals: This can happen when we use illustrations, symbols, or contextual imagery that communicate the issue without exposing a child in distress.

  • Centering Data and Verified Research: Statistics from credible bodies like UNICEF and the World Health Organization can be powerful without being exploitative.

  • Amplifying Local Voices and Solutions: We can highlight activists, educators, and communities actively working to end FGM.

  • Providing Pathways to Action: Advocacy should not end in shock. It should lead to reporting mechanisms, community education, and support systems for at-risk children.

  • Promoting Safeguarding Practices in Media: Starting with ourselves, we are encouraging organizations, creators, and advocates to adopt ethical guidelines when representing children.

  • Supporting Children in Abusive Environments: This can include strengthening community awareness so harmful practices are challenged locally; promoting access to safe reporting channels and child protection services; supporting education for girls, which is one of the strongest protective factors; partnering with trusted organizations that provide shelter, counseling, and legal aid; advocating for the enforcement, not just the existence, of protective laws; and training educators, healthcare workers, and community leaders to identify and respond to risk.


These are steps we are already taking. Yet things can slip through the cracks if we are not consistently holding ourselves accountable and opening our minds to our blind spots and biases.


Ultimately, safeguarding is not silence.

Safeguarding is responsibility.

Because even good intentions can cause harm.


We can speak loudly about injustice without putting children in harm’s way.


And that is the standard I am committing to going forward.


In a way, I am glad Facebook issued the penalty, even if I strongly disagree with how the post was categorized. It pushed me to pause, reflect more critically, and engage with the issue in a deeper way. It even inspired me to write about it, something I have not done in a while.


And while I can acknowledge that this action by Facebook was, in some ways, commendable, it is impossible to ignore the many posts on the platform that are clearly exploitative (including content involving child sexual abuse and exploitation) that continue to gain massive traction. But that is a conversation for another day.


In the end, we must all hold ourselves, our organizations, and the technology platforms we use accountable. We all have a responsibility to ensure that we are not normalizing, enabling, or perpetuating violence against women, girls, children, or anyone in society.
