Equally
Sualeha Irshad, Moniola Odunsi, Sora Shirai

Growing up in America, Sualeha, Moniola, and Sora, three high school students and young women of color, witnessed firsthand the racial biases that exacerbate inequalities for minority groups. Brought together from different parts of the US by a shared interest in technology and software development, the team decided to tackle the Sustainable Development Goal of Peace and Equity through the Moody Foundation Peace and Justice Challenge.
As they brainstormed issues to address under this larger global goal, Sualeha, Moniola, and Sora settled on generating solutions for racial discrimination. Because of their own personal experiences, preventing and alleviating discrimination and racial inequality was a matter they were all deeply passionate about. Through further research, however, they found that most discrimination isn't intentional; it is a byproduct of society's subtle messages. Many people don't realize the biases they hold, yet those biases can still cause harm, and so the team set out to create a solution to change this.
Drawing on their background in emerging technologies, Sualeha, Moniola, and Sora realized that existing AI algorithms for detecting keywords in written online communication could help identify and address implicit bias in text. That's how they came up with the idea for Equally, an AI-powered software program that helps companies, organizations, and public service providers identify and address implicit racial bias in written communications.
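Equally's actual design has not been published; purely as an illustration of the kind of keyword-based flagging described above, a minimal sketch in Python might look like the following. The phrase list, explanations, and function name here are hypothetical placeholders, not the team's real data or code.

```python
# Minimal, hypothetical sketch of keyword-based bias flagging.
# Not Equally's actual implementation; the phrase list is a placeholder.
import re

# Hypothetical examples of loaded or coded phrases an organization might review.
FLAGGED_PHRASES = {
    "cultural fit": "Vague criteria like 'cultural fit' can mask biased judgments.",
    "articulate for": "Qualified praise ('articulate for ...') often implies a stereotype.",
    "those people": "Othering language that groups individuals by identity.",
}

def flag_implicit_bias(text: str) -> list[dict]:
    """Return flagged phrases found in the text, with positions and explanations."""
    findings = []
    for phrase, explanation in FLAGGED_PHRASES.items():
        for match in re.finditer(re.escape(phrase), text, flags=re.IGNORECASE):
            findings.append({
                "phrase": match.group(0),
                "position": match.start(),
                "note": explanation,
            })
    return findings

if __name__ == "__main__":
    sample = "She's surprisingly articulate for someone from that neighborhood."
    for finding in flag_implicit_bias(sample):
        print(f"Flagged '{finding['phrase']}' at {finding['position']}: {finding['note']}")
```

A production tool of the kind the team describes would presumably go well beyond literal phrase matching, for example by using trained language models to weigh context, but the core loop of scanning text and returning flagged spans with explanations would be similar.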
Equally was selected as a first-place winner of the Moody Foundation Peace and Justice Challenge. With support from Peace First and the Moody Foundation, the team is working alongside experts from Dartmouth and Grammarly to make the software program a reality. But that has not come without challenges. Understanding and adding nuance to the technical side of Equally has proved difficult, says Sualeha. Much of its complexity lies in the technology itself, so the team has consistently worked to close that gap by reaching out to skilled professors and professionals, like those at Grammarly, to better understand what it takes to move Equally from ideation to implementation. To make Equally as effective as possible, the team has also had to evaluate how it would prove most beneficial to different communities, since implicit bias indicators and language nuances can vary greatly from one community to another. Understanding the different ways Equally could be used has required direct outreach to the communities the team hopes to impact.
The team writes, "Although we've yet to make a tangible impact through Equally, we've had the amazing opportunity to meet with experts in AI and implicit bias all over the world. Professionals at institutions such as Dartmouth and Grammarly have offered their support and validation for our project, and we've been so grateful to gain a deeper understanding of both implicit bias and artificial intelligence through these meetings. We're super excited to apply all of the knowledge we have gained in the future development of Equally."
Equally has the potential to make a massive impact across several domains, including the education, legal, and corporate sectors. Implicit bias is not limited to one area of life; it affects most, if not all, aspects of it. Unfortunately, many people are in denial about their biases, and because society has made stereotyping such a norm, they do not recognize what Equally can surface for them. The team understands that Equally will not solve implicit bias on its own, as it is a systemic issue. However, they hope it will help people recognize their implicit biases, understand them, and begin to take steps toward combating the larger issue of racism.