- AI applications can pose risks to women’s safety and require careful consideration
- Content experts and practitioners in cultural studies are crucial for designing AI systems that are culturally sensitive and inclusive
- LLMs built for other audiences can be biased and unfair, leading to a lack of trust in their output
Above: Women in AI participants (L-R): Stephan Barrow, Senior Data Scientist, CIBC Caribbean; Greta Gunpat, Technology Project Manager at Ramps Logistics; Amanda Zilla, Principal Investigator at the UWI AI Innovation Centre; Dr Phaedra Mohammed, Computer Science Lecturer at The UWI; Tamika Ramkissoon, Senior Product Manager and AI Engineer at I’Deffect; and Sarah Rudder-Chulhan, Associate Director – Digital Client Experience, CIBC Caribbean. Photo courtesy CIBC.
BitDepth 1562 for May 11, 2026
Right from the start, the all-female panel assembled to discuss the topic Women in AI: Ghosts in the Machine was clear that women are neither content to be considered adjacent to AI development nor willing to allow AI to remain a reflection of a man’s-world perception of global reality.
“Grok [an AI component of Elon Musk’s X] has this lovely feature where you could change, in public pictures, what the person’s wearing. When I saw that, I was sure a dude did that, one-hundred percent there was no female involved there”, said Sarah Rudder-Chulan, Associate Director, Digital and Client Experience, CIBC.
“Because as a woman, we know what the results of that will be and how dangerous that is for us and for our children. So I think there is definitely a role [for women] in reviewing what’s out there and saying, oh, sorry that doesn’t work for me and my safety.”
The discussion was hosted by CIBC Caribbean and Ramps Logistics and took place at the St Augustine Campus of the University of the West Indies on April 23.
“The type of harm [that software can wreak] is really what you have to look at, and then you have to look at whether the builder didn’t take responsibility for putting in those safety checks,” said Dr Phaedra Mohammed, UWI Computer Science lecturer.
“There are obvious things that you have to commit to building as a software developer. As a company, as an educator, if you have not done that, well then sure, you’re responsible for the harm. But the other thing is that as the owner of whatever system you’re building, you have to take responsibility for what your users may do with that thing. So it’s a tricky situation.”
“The Artificial Intelligence Innovation Centre is a dream lab, a digital humanities-based cluster,” said Amanda Zilla, Principal Investigator, UWI Artificial Intelligence Innovation Centre.
“This is where we see the disciplines in the humanities space coming to the forefront of AI development. Everyone in this room at some point in their life was studying literature, you were forced to pick up a book and you needed to be able to tell your teacher something about the people and places. You take that subject and move it higher up to different levels of education, what is required of you to be able to comment on the text becomes more complex.”
“You are now expected to do research into sociological phenomena, psychoanalytic theories, all things that allow you to connect these fictional representations of things to real-world phenomena in designing AI systems, and you’re thinking about who the user is at the end of the product.”
“To make connections between people’s needs and personality traits and all the things that [we] already do, we build these personalities that we’re targeting. It’s the same thing too with AI. So we see all these humanities skills coming out in different ways across different types of research.”
“In terms of cultural studies as part of the Caribbean, there’s a wide diversity of cultural backgrounds that we’re attempting to address and develop systems for. So persons who are content experts and practitioners in these fields, they become key figures in ensuring that we’re designing systems that are culturally sensitive, unbiased and inclusive across the region.”
Creating Large Language Models (LLMs) that are relevant and culturally appropriate to the Caribbean is part of the solution, but systems that deliver output that is useful to users are also critical.
Tamika Ramkissoon worries about issues of bias and unfairness in LLMs that are built for other audiences.
“Unfairness, because when we’re using tools that aren’t applicable for us, you tend to lose trust in the system because the output that it’s giving you isn’t relevant. That harms a lot of what we do in the industry because you can’t just put a prompt in, get a response, and then submit it. You have to audit it to make sure it’s relevant.”
“The top companies build systems for engagement and we see that with social media platforms, their job is to keep you on it all the time,” said Rudder-Chulan.
“The result of that is kids who are addicted. There are a whole bunch of psychological problems resulting from that, and that’s why we need to build from experience for the best outcome for our customers, from our employees who are supporting it. That is also a fundamental part of what we need to bring, our culture and how we try to look after each other, into the systems that we build, as opposed to copying a system that’s built purely for engagement and numbers.”
“When I first came out of university a million years ago, everybody was like, why build something here?” Rudder-Chulan said.
“Just take what’s in Europe, lift and shift. That has been the approach of most of the organizations in the Caribbean. And how do we feel about most of the organizations in the Caribbean?”
“Most of us are frustrated and that frustration comes from these systems not working for the environment that we have. My nail tech downloaded Fresha [spa and salon booking software], which is awesome abroad, to be able to book and schedule appointments. But our payment systems don’t work with Fresha.
“We don’t have a digital ID so that the workflows are broken at every stage. She’s not able to optimize the use of a tool like that. So imagine that in the context of banking. You trust us to take care of your money? But when you interact with something that’s been built for other people, you’re going to meet breaks at every stage of your journey.”
“This is why it was important for us at CIBC to build our own applications, to build our own AI systems, because we need to be able to deliver a pleasant, happy client experience.”
“Which means if our customers fall off the best path, we get them right back on so they can achieve what they need to achieve. If we don’t build systems that support the fact that, for example in Trinidad, your proof of address is Trace Whatever or lamppost 45, you can see how you’re going to run into problems.”
Amanda Zilla believes that, hand in hand with experimentation and innovation, internal testing should be part of any development before a customer-facing system is deployed.
“[Testing before] crossing that boundary between this creation phase and users is a way of minimizing possible harms,” Zilla said.
“I believe strongly in the diversity of backgrounds. It is only when your team is as diverse as possible that you’re able to account for a large number of factors. If everyone on your team is a computer scientist or a developer, your focus will be on the development. You need the sociologist, the economist, the people in humanities and cultural spaces to be able to tell you how this will impact people.”
“It’s impossible to totally remove bias and opinion. The only way that we can really get closer to minimizing bias in development is to have teams that are as diverse as possible, that are willing to challenge each other’s views along the way.”