UNESCO issues its first recommendations to combat gender bias in artificial intelligence applications



(Original English version below)


© iStock


I'd blush if I could

https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1


‘I’d blush if I could’, published by UNESCO in collaboration with the German government and the EQUALS Skills Coalition, offers concrete recommendations for closing the global gender gap in digital skills, with particular attention to the gender biases built into the most popular AI applications, such as digital voice assistants, during their development.


The publication attributes this bias to the gender imbalance that often characterises the technical teams steering cutting-edge technology, and examines policy options for building women's digital skills.


With the use of voice assistants such as Amazon's Alexa surging, proposals for addressing the gendering of AI are urgently needed. At present, nearly all digital assistants have female names and voices, and their ‘personalities’ are designed to be uniformly submissive.


The publication takes its title from Apple's voice assistant Siri, used by hundreds of millions of people. For years, when Siri received insulting remarks from users, ‘she’ would answer with this very phrase.


The submissiveness and servility displayed by so many ‘female’ digital assistants reflect the gender biases deeply embedded in AI products. As the publication argues, these biases are rooted in the severe gender imbalances in skills education and in the technology sector.


The new publication recommends that companies and governments:


  1. end the practice of making digital assistants female by default;

  2. explore the feasibility of developing a neutral ‘machine gender’ for voice assistants;

  3. programme digital assistants to discourage gender-based insults and abusive language;

  4. encourage interoperability so that users can change digital assistants as desired;

  5. require operators of AI-powered voice assistants to announce the technology as non-human at the outset of interactions; and

  6. train women in the skills needed for advanced technology research so that they can steer the development of new technologies alongside men (a point of particular importance).

 

© UNESCO


Saniye Gülser Corat, UNESCO's Director for Gender Equality, said: “Machines projected as obedient, servile women are entering our homes, cars and offices. This imposed submissiveness affects how people speak to female voices and shapes how women respond to requests and express themselves. To change course, we need to pay much closer attention to whether, when, where and how AI technologies are gendered and, most importantly, who is gendering them.”


Although many of the industry's leading digital assistants, such as Microsoft's Cortana and Google Assistant, have been on the market for less than five years, they already rank among the world's best-known ‘women’. Giving technology applications a female interface is shaping how people understand gender in both digital and analogue environments.


According to the publication, eliminating gender bias in AI requires the teams that build it to become far more gender-balanced. Today, only 12 per cent of AI researchers are women; women account for just 6 per cent of software developers; and men outnumber women 13 to 1 among ICT (information and communication technology) patent applicants.


Closing these gaps requires gender-responsive digital skills education. ‘I’d blush if I could’ offers numerous proposals for bringing more women into technology research and describes good practices from around the world.


Finally, the publication reports a paradoxical finding: in countries with higher overall levels of gender equality, such as those in Europe, women make up the smallest share of students in the advanced-skills fields the technology sector demands, whereas in countries with lower levels of gender equality, such as those in the Arab region, the proportion of women pursuing advanced technology degrees is the highest in the world. In Belgium, for example, only 6% of ICT graduates are women, while in the United Arab Emirates the figure is 58%. This contrast shows that every country needs measures to encourage women to take up digital skills education.


‘I’d blush if I could’ is the latest publication produced by UNESCO in partnership with EQUALS, a coalition that advocates gender equality in opportunities, skills and leadership in order to promote gender balance in the technology sector. Germany's Federal Ministry for Economic Cooperation and Development funded the publication and contributed substantially to its content.


First UNESCO recommendations to combat gender bias in applications using artificial intelligence


Beginning as early as next year, many people are expected to have more conversations with digital voice assistants than with their spouse.


Presently, the vast majority of these assistants—from Amazon’s Alexa to Microsoft’s Cortana—are projected as female, in name, sound of voice and ‘personality’.


I’d blush if I could

https://unesdoc.unesco.org/ark:/48223/pf0000367416.page=1


‘I’d blush if I could’, a new UNESCO publication produced in collaboration with Germany and the EQUALS Skills Coalition, holds a critical lens to this growing and global practice, explaining how it:


  1. reflects, reinforces and spreads gender bias;

  2. models acceptance of sexual harassment and verbal abuse;

  3. sends messages about how women and girls should respond to requests and express themselves;

  4. makes women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominantly by men; and

  5. forces a synthetic ‘female’ voice and personality to defer questions and commands to higher (and often male) authorities.

The title of the publication borrows its name from the response Siri, Apple's female-gendered voice assistant used by nearly half a billion people, would give when a human user told ‘her’, “Hey Siri, you’re a bi***.”


Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.


According to Saniye Gülser Corat, UNESCO’s Director for Gender Equality, “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”


The publication shares the first United Nations recommendations regarding the gendering of AI technologies, imploring companies and governments to:


  1. end the practice of making digital assistants female by default;

  2. explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female;

  3. programme digital assistants to discourage gender-based insults and abusive language (an illustrative sketch follows this list);

  4. encourage interoperability so that users can change digital assistants, as desired; and

  5. require that operators of AI-powered voice assistants announce the technology as non-human at the outset of interactions with human users.
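
Recommendation 3 is the one most directly about product behaviour. As a purely illustrative aid, the short Python sketch below shows one way such a feature could work in principle: flag a gendered insult and answer with a firm, neutral refusal instead of a deferential reply such as ‘I’d blush if I could’. It is a minimal sketch, not taken from the UNESCO publication; the term list, responses and function names are hypothetical, and a real assistant would rely on trained abuse classifiers rather than keyword matching.

# A minimal, hypothetical sketch (not from the UNESCO report) of recommendation 3:
# detect gender-based insults and reply with a firm, neutral refusal rather than
# a deferential response such as "I'd blush if I could". ABUSIVE_TERMS,
# NEUTRAL_REFUSAL and handle_normally are illustrative placeholders; production
# systems would use trained abuse classifiers instead of keyword matching.

ABUSIVE_TERMS = {"insult_a", "insult_b"}  # placeholder tokens standing in for real insults

NEUTRAL_REFUSAL = "That language is not acceptable. Is there something I can help you with?"


def respond(utterance: str) -> str:
    """Return a neutral refusal for abusive input; otherwise handle the request."""
    tokens = {word.strip(".,!?").lower() for word in utterance.split()}
    if tokens & ABUSIVE_TERMS:
        return NEUTRAL_REFUSAL
    return handle_normally(utterance)


def handle_normally(utterance: str) -> str:
    # Stand-in for the assistant's normal dialogue pipeline.
    return f"Processing request: {utterance}"


if __name__ == "__main__":
    print(respond("Hey assistant, you're an insult_a."))  # -> neutral refusal
    print(respond("What's the weather tomorrow?"))        # -> normal handling

The point of the sketch is the design choice the report calls for: the assistant neither ignores the abuse nor responds flirtatiously or apologetically, but names the language as unacceptable and moves on.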

 

UNESCO uses the example of digital voice assistants to demonstrate that in a world awash in AI technology, the teams building this AI technology must be more gender-balanced. Today women make up only 12 per cent of AI researchers, represent only 6 per cent of software developers, and are 13 times less likely than men to file an ICT (information and communication technology) patent. Addressing gender inequalities in AI must begin with more gender-equal digital skills education and training. A dedicated section of the publication explains how to make this a reality, providing 15 actionable recommendations.


Finally, the report shares a new and paradoxical finding: Countries that score higher on gender equality indices, such as those in Europe, have the fewest women pursuing the advanced skills needed for careers in the technology sector. Conversely, countries with lower levels of gender equality, such as those in the Arab region, have the largest percentage of women pursuing advanced technology degrees. As an illustration, in Belgium only 6% of ICT graduates are women, while in the United Arab Emirates this figure is 58%. This paradox is explored in detail and underscores the need for measures to encourage women’s inclusion in digital skills education in all countries.


This article is reposted with permission from the UNESCO WeChat official account.


