With the rapid development of Artificial Intelligence (AI) raising questions about its influence on society, Google’s Bard, together with UNESCO and experts in India, believes that women’s involvement can help develop AI that is “fair, unbiased and beneficial to all”.
It has been highlighted before that AI uses data produced by humans, but it also inherits human imperfections such as bias based on age, gender or ethnicity.
Google’s AI chatbot Bard shared an example of gender bias. It said that if a dataset is made up of mostly male engineers, an AI system trained on that dataset may be more likely to make decisions that favour men.
The chatbot then said: “Women’s involvement in the AI lifecycle can help reduce bias in AI systems in several ways. First, women can help identify and address biases in the data that is used to train AI systems. Second, women can help develop AI systems that are more inclusive and that take into account the needs of all users. Third, women can help ensure that AI systems are used fairly and ethically.”
In March, UNESCO’s Social and Human Sciences Sector launched the ‘Women 4 Ethical AI Platform’ during a high-level event, stating: “This new platform will provide a unique space for global gender equality leaders in frontier technologies to combine their strengths and influence to achieve the clear goal of implementing UNESCO’s recommendation on the ethics of AI with a gender lens.”
Through the recommendation, UNESCO aims to involve women in advancing the ethical development and deployment of AI for fair and inclusive outcomes, with a special emphasis on gender diversity and empowerment.
The platform will focus on:
- Dedicating funds to gender-related schemes
- Ensuring that national digital policies include a gender action plan
- Encouraging feminine entrepreneurship, participation and management in AI
- Investing in programmes to improve girls’ and women’s participation in STEM and ICT disciplines
- Eradicating gender stereotyping and ensuring that AI systems are not biased.
TIME FOR INDIA
On the importance of involving more women in the field of AI, Neha Swetambari, SVP, Products and Delivery at Think360.ai, told News18: “AI allows for a micro-segmented approach to a huge base of consumers, like our country. Hence, India, while slow in adopting AI, will move at a tremendous pace. In this process, inherent biases will become a part of the system.”
She believes that a diverse workforce will help identify biases faster and make them easier to rectify. “Biases introduced by low participation of women in the labour force or low ownership of property in risk frameworks need to be managed with equity-based weights to the right metrics and parameters.”
Sachin Arora, Partner and Head at Lighthouse (Data, AI and Analytics), KPMG in India, said AI literacy should be one of the core focus areas, as it can boost GDP, given the productivity gains and jobs AI offers.
“We’re increasingly moving towards organisations becoming a collection of AI models. As we witness a significant shift, embracing diversity of perspective becomes even more paramount. AI teams building these models must represent the population they’re going to serve, otherwise, there is a huge risk of substantial bias affecting the outcomes of these models,” he added.
Meanwhile, Dr Mamta Arora, Associate Professor, Department of Computer Science and Technology, Manav Rachna University, also highlighted the importance of women’s inclusion, saying that gender biases in AI systems can perpetuate societal inequalities and reinforce stereotypes, leading to biased decision-making and unfair outcomes.
“India, like many other countries, is experiencing a growing interest in AI and its potential applications. We can promote women in AI by encouraging STEM education, women’s participation in AI research, supporting women-led AI startups, creating funding programmes, taking initiatives that inspire and support girls to engage in AI-related fields, and providing awareness of career prospects in AI,” Prof Arora noted.
Echoing a similar thought, Smita Khanna, COO at Newton Consulting Group, said that given the rapid pace of digitisation, this is the ideal time to involve women and encourage their greater participation in AI, which can be done through educational initiatives, scholarships, mentorship programmes, outreach programmes and dedicated support networks.
She believes that AI can play a role in addressing gender-based violence in India. Citing the National Family Health Survey report of 2022, Khanna said that nearly one-third of women have experienced physical or sexual violence.
“Indian non-profit organization Safecity uses AI to collect and analyze data on incidents of sexual harassment and violence against women. This data is then used to identify patterns, inform policy advocacy and implement preventive measures in communities,” she noted.