Center of Excellence for Women & Technology

Mind the Gap – Why AI Needs Women Now

By Anshu Roja Selvamani

Wednesday, November 12, 2025

 Image Source: Women in Technology – “Why Are There So Few Women in Artificial Intelligence (AI)?”

Every time you talk to ChatGPT, Alexa, or any AI assistant, you’re experiencing the imprint of its creators and their choices, biases, and perspectives. Less than 30% of the AI workforce is female, as reported in the Global Gender Gap Report of 2023, and that number drops to 15% as you move up the ranks. Artificial intelligence is revolutionizing how we live our everyday lives and shaping decisions that have a tremendous impact on how society functions. Yet in 2025, women remain drastically underrepresented not just as consumers of AI but also as its architects, leaders, and creators.

Global data on the AI gender gap reflects more than statistics; it reflects systems of exclusion. Fewer women are able to leverage AI tools or stay in tech roles over the long term, and fewer still reach leadership positions. When women are not involved in designing AI systems, those systems will continue to perpetuate bias. And when bias seeps into medical imaging, hiring algorithms, or predictive policing, it shapes lives, opportunities, and justice, with negative consequences for vulnerable populations.

When you realize that AI systems draw on data shaped by human behaviour, you begin to understand why AI mirrors some of the biases we see in the human world. These systems are developed by people with specific assumptions, experiences, and blind spots. If those teams are mostly male and/or from similar racial and ethnic backgrounds, the algorithms are bound to reflect those perspectives exclusively. For example, if a storyteller uses generative AI to create visuals for a CEO character and a secretary character, GenAI routinely makes the CEO a man and the secretary a woman, regardless of how the prompts are iterated. UNESCO’s study “Bias Against Women and Girls in Large Language Models” found that women were often associated with terms like ‘family’, ‘home’, and ‘children’, while men were associated with words like ‘executive’, ‘business’, ‘career’, and ‘salary’. The Llama 2 model linked women to domestic roles up to four times as often as men.
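To make this concrete, here is a minimal sketch of the kind of association probe such studies rely on. It assumes model responses have already been collected into a plain-text file (one response per line); the file name and the word lists are illustrative stand-ins, not UNESCO’s actual methodology.

import re
from collections import Counter
from itertools import product

# Word lists are illustrative; a real probe would use a curated lexicon.
FEMALE_TERMS = {"she", "her", "woman", "women", "girl", "mother", "wife"}
MALE_TERMS = {"he", "him", "his", "man", "men", "boy", "father", "husband"}
DOMESTIC_TERMS = {"family", "home", "children", "household"}
CAREER_TERMS = {"executive", "business", "career", "salary", "engineer"}

def sentence_tokens(text):
    # Split a response into sentences, then into lowercase word tokens.
    for sentence in re.split(r"[.!?]+", text):
        yield set(re.findall(r"[a-z']+", sentence.lower()))

def cooccurrence_counts(lines):
    # Count sentences in which a gendered term appears alongside a domain term.
    counts = Counter()
    for line in lines:
        for tokens in sentence_tokens(line):
            for gender, terms in (("female", FEMALE_TERMS), ("male", MALE_TERMS)):
                if not tokens & terms:
                    continue
                if tokens & DOMESTIC_TERMS:
                    counts[(gender, "domestic")] += 1
                if tokens & CAREER_TERMS:
                    counts[(gender, "career")] += 1
    return counts

if __name__ == "__main__":
    # "model_outputs.txt" is a hypothetical file of collected model responses.
    with open("model_outputs.txt", encoding="utf-8") as f:
        counts = cooccurrence_counts(f)
    for gender, domain in product(("female", "male"), ("domestic", "career")):
        print(f"{gender:>6} x {domain:<8}: {counts[(gender, domain)]}")

Counts that skew heavily, say women co-occurring with domestic terms far more often than men, are a signal worth investigating rather than proof of bias on their own.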

The problem is more widespread than one might think. A study conducted by the Berkeley Haas Center analysed 133 AI-based systems across multiple industries and found that almost 44% of them exhibited gender bias.

There’s more to the AI gender gap crisis than bias inside the systems themselves. Women are also disproportionately judged as less competent for using GenAI in their jobs. In a study reported in Harvard Business Review, researchers found that female-identifying engineers who used AI for code generation were rated almost 9% less competent than their male-identifying counterparts, even though the evaluators were looking at identical outputs.

In addition, many consumer AI tools are anthropomorphised as women. In the USA, voice assistants such as Alexa, Siri, Google Assistant, and Cortana, which together account for approximately 92% of the smartphone assistant market, have historically been designed with female-sounding voices. It is also important to understand how AI responses to sexual harassment have changed. In 2017, journalist Leah Fessler investigated how voice assistants, specifically Siri, Google Home, Alexa, and Cortana, responded to sexual or demeaning remarks. She found that the AI assistants’ responses were largely subservient, evasive, and sometimes even grateful. For example, when the voice assistants were called the “B” word, Siri responded, “I’d blush if I could,” and Alexa responded, “Thanks for the feedback.” This is only one example; the voice assistants were prompted with several other derogatory and sexual comments, which were met with the same kind of responses. These types of AI responses, especially when personified with a female-sounding voice and human characteristics, reinforce stereotypes of female subservience in digital form. This is also highlighted in the UNESCO publication, “I’d blush if I could.” While AI bots have come a long way, it is still critical to consider how training data, design choices, and voice personas embedded in AI systems can continue to shape perceptions of power, gender, and respect in human-computer interaction.

What’s clear is that any meaningful analysis of AI systems should account for the diversity and the inherent biases of the teams that build them. In a 2019 report by the AI Now Institute, Kate Crawford et al. highlighted how the demographic composition of technology companies can shape algorithmic behaviour, warning that AI frequently acts as a ‘feedback loop’ for the assumptions, experiences, perspectives, and demographics of its developers. This is why we need more women and underrepresented voices involved at every stage of AI development and, most importantly, in senior leadership roles, to ensure that the technologies shaping our future serve everyone more equitably. Diverse perspectives are critical for challenging existing biases and for bringing ideas from across the spectrum of identity and human experience.

The need for more female voices in AI is clear. To make real progress, we must take decisive action to close the gender gap. What are some strategies for doing so?

Organizations should audit their hiring algorithms and other internal AI tools used for evaluation to identify and eliminate any gender biases within the systems. They should also focus on fostering an organizational culture that supports women navigating workplaces where few people look like them. This can be achieved by offering programs that help women overcome imposter syndrome, encouraging risk-taking and experimentation, and building strong networks of mentorship and support.
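As a starting point, an audit of a hiring tool can be as simple as comparing selection rates across gender groups in its past decisions. The sketch below assumes those decisions can be exported to a CSV with "gender" and "selected" columns; the file and column names are illustrative, and a real audit would also examine intersectional groups, error rates, and calibration.

import csv
from collections import defaultdict

def selection_rates(path):
    # Compute the selection rate per gender group from exported decisions.
    totals = defaultdict(int)
    selected = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            group = row["gender"].strip().lower()
            totals[group] += 1
            selected[group] += row["selected"].strip().lower() in ("1", "true", "yes")
    return {group: selected[group] / totals[group] for group in totals}

if __name__ == "__main__":
    # "hiring_decisions.csv" and its column names are hypothetical.
    rates = selection_rates("hiring_decisions.csv")
    for group, rate in sorted(rates.items()):
        print(f"{group:<12} selection rate: {rate:.2%}")
    if rates and max(rates.values()) > 0:
        # Adverse-impact ratio: lowest group rate divided by the highest.
        # A common rule of thumb flags ratios below 0.8 for closer review.
        print(f"adverse-impact ratio: {min(rates.values()) / max(rates.values()):.2f}")

A low ratio is a prompt for closer investigation of the tool and its training data, not a verdict by itself.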

It is also critical to make AI systems themselves less biased. One clear way to do this is for organizations to hire teams with diverse gender, ethnic, and socio-economic backgrounds. In addition, organizations should evaluate any tool they plan to launch specifically for potential gender biases. This must be an ongoing process at every stage: from evaluating the training data for bias, to monitoring outputs during deployment and beyond.

We can also leverage responsible AI tools to take a data-driven approach to career development. There has been a rise in AI-driven workforce intelligence tools that analyse employee performance, potential, and skills, and that can be used to identify skill gaps and ensure women and other underrepresented groups receive the support they need. These tools can infer employee skills from “digital artifacts created during their regular work activities,” says Julia Samoylenko, Founder and CEO of Asteri AI. Such artifacts include project tickets, performance reviews, and video conference transcripts, among others. This data can help AI detect patterns and build a detailed picture of employee strengths, potential fit for senior roles, and opportunities for growth. While it is critical to audit these internal evaluation tools for bias, when used responsibly they can serve as personalized career guides and provide adaptable roadmaps for women seeking to climb the corporate ladder.
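As a toy illustration of the idea (not Asteri AI’s or any vendor’s actual method), skill inference can start with something as simple as tallying skill-related keywords across an employee’s artifacts. The keyword map below is an illustrative stand-in for a real skills taxonomy; production tools use far richer models.

import re
from collections import defaultdict

# Hypothetical keyword map; real tools use a much richer skills taxonomy.
SKILL_KEYWORDS = {
    "machine learning": {"model", "training", "dataset", "inference"},
    "cloud": {"aws", "kubernetes", "terraform", "deployment"},
    "leadership": {"mentored", "coordinated", "prioritized", "roadmap"},
}

def skill_profile(artifacts):
    # Tally skill-keyword hits per employee across their work artifacts.
    profile = defaultdict(lambda: defaultdict(int))
    for employee, text in artifacts:
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        for skill, keywords in SKILL_KEYWORDS.items():
            profile[employee][skill] += len(tokens & keywords)
    return profile

if __name__ == "__main__":
    # Illustrative artifacts standing in for tickets, reviews, and transcripts.
    artifacts = [
        ("ana", "Mentored two interns and coordinated the Q3 roadmap."),
        ("ana", "Retrained the ranking model on the refreshed dataset."),
        ("ben", "Fixed the Terraform config for the AWS deployment."),
    ]
    for employee, skills in skill_profile(artifacts).items():
        ranked = sorted(skills.items(), key=lambda item: -item[1])
        print(employee, ranked)

Even a crude tally like this makes visible work that might otherwise go unrecognized, which is exactly why such tools must themselves be audited for bias before they inform promotion decisions.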

Beyond individual organizations, there needs to be stronger AI governance at a higher level. Experts agree that AI governance can accelerate progress towards gender equity. International cooperation on technology often focuses on infrastructure, technical problems, and the digital economy, and neglects how new developments in tech affect members of society, particularly historically excluded groups and populations who are comparatively more vulnerable.

There are reports such as Towards Substantive Equality in Artificial Intelligence: Transformative AI Policy for Gender Equality and Diversity, whose policy recommendations are “informed by consultations with diverse groups and regions and are grounded in a human rights-based framework.” We need stronger AI governance systems that actively monitor new AI developments and their potential to cause harm, and that make informed policy choices to mitigate risks and foster inclusive, just, and equitable AI systems.

On the education side, it is crucial to build the pipeline early. Investments must be made in STEM and AI education for young girls to expose them to opportunities in tech as early as middle or high school. Partnerships with community and non-profit organizations that specifically support women in tech can provide a foundation for women rising in STEM. It is also important to boost the visibility of women in AI: when young girls see more women in the positions they aspire to reach, they are more likely to believe those positions are within reach for them, too. Women in tech leadership roles can help by building their digital presence, increasing their visibility and inspiring more young women to join the tech workforce.

The world cannot afford to build the next era of intelligence without women. Act now: demand diverse representation in AI, champion inclusive hiring, and mentor or sponsor women to leadership. Together, let’s close the gap and create an equitable future.




References

Data-Pop Alliance. (2024, November 27). Towards Substantive Equality in Artificial Intelligence: Transformative AI Policy for Gender Equality and Diversity. https://datapopalliance.org/publications/towards-real-diversity-and-gender-equality-in-ai/

Chin-Rothmann, C., & Robison, M. (2020, November 23). How AI bots and voice assistants reinforce gender bias. Brookings. https://www.brookings.edu/articles/how-ai-bots-and-voice-assistants-reinforce-gender-bias/

Crawford, K., Dobbe, R., Dryer, T., Fried, G., Green, B., Kaziunas, E., Kak, A., Mathur, V., McElroy, E., Nill Sánchez, A., Raji, D., Rankin, J. L., Richardson, R., Schultz, J., Myers West, S., & Whittaker, M. (2019). AI Now 2019 report. AI Now Institute. https://ainowinstitute.org/AI_Now_2019_Report.html

Fessler, L. (2017, February 22). We tested bots like Siri and Alexa to see who would stand up to sexual harassment. Quartz. https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment

Pal, K. K., Piaget, K., Zahidi, S., & Baller, S. (2024, June 11). Global gender gap report 2024. World Economic Forum. https://www.weforum.org/publications/global-gender-gap-report-2024/digest/

Smith, G., & Rustagi, I. (2021, March 31). When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity. Stanford Social Innovation Review. https://ssir.org/articles/entry/when_good_algorithms_go_sexist_why_and_how_to_advance_ai_gender_equity

Samoylenko, J. (2024, December 2). How to Mitigate Automation Risks and Close the Gender Gap. Built In. https://builtin.com/artificial-intelligence/automation-close-gender-gap

Travis, M. (2025, August 26). Women Who Use AI At Work Face A Predictable “Competence Penalty.” Forbes. https://www.forbes.com/sites/michelletravis/2025/08/26/women-who-use-ai-at-work-face-a-predictable-competence-penalty/

UN Women. (2024, June 28). Artificial Intelligence and gender equality. UN Women – Headquarters. https://www.unwomen.org/en/articles/explainer/artificial-intelligence-and-gender-equality

UNESCO. (2024). Generative AI: UNESCO study reveals alarming evidence of regressive gender stereotypes. https://www.unesco.org/en/articles/generative-ai-unesco-study-reveals-alarming-evidence-regressive-gender-stereotypes

West, M., Kraut, R., & Chew, H. E. (2019). I’d blush if I could: Closing gender divides in digital skills through education. UNESCO. https://doi.org/10.54675/rapc9356

Willige, A. (2025, March 26). Can AI fix the gender gap in STEM? World Economic Forum. https://www.weforum.org/stories/2025/03/ai-stem-women-gender-gap/
