Thousands of Sexist AI Bots Are About to Be Born. Here's How to Stop Them.

Robert LoCascio, May 15, 2018
All four of the major in-home AI assistants default to a female voice, and the docile, obedient personas they were originally given were steeped in sexism.

I recently overheard my 2-year-old daughter talking to Amazon’s voice assistant Alexa, and two things struck me. First, she doesn’t distinguish the disembodied voice from that of a regular human. Second, she barks orders at Alexa in a way that would be considered rude by any social convention.

I was suddenly aware and troubled that Alexa is setting a terrible example for my daughter—that women are subservient, should accept rudeness, and belong in the home.

All four of the major in-home artificial intelligence, or AI, assistants—Alexa, Apple’s Siri, Google Assistant, and Microsoft’s Cortana—speak by default with a female voice. Until a recent backlash, they also had docile, obedient personalities that would tolerate an exorbitant amount of sexism. The experience with my daughter opened my eyes to this subtle AI sexism, and I’m afraid it will soon get even worse.

The world’s largest tech platforms have this year launched new services that will quickly make texting between consumers and brands the norm. To handle the millions of messages efficiently, these companies will have to launch their own AI assistants. As a result, the number of assistant bots will quickly expand into the thousands, communicating with billions of consumers across websites, apps, and social networks.

As this “conversational AI” dramatically grows in usage, its sexism could get baked into the world around us, including that of our kids. Subtle reinforcement through repetition can add up, over time, to a form of problematic psychological conditioning. Today, this is quietly creeping up on us because the use of bots is still relatively low—a few minutes per day, perhaps. But soon AI will be much more ubiquitous, as bots start to replace websites and apps completely.

If we don’t change course, this next generation of conversational AI will be created by the same people who built the current sexist algorithms and scripts—but on an exponentially bigger scale. The engineers whose AI systems categorized women into kitchen and secretarial roles while offering men jobs with executive titles will have their biases massively amplified, as conversational AI goes global.

The common thread is men. The AI of today was developed by predominantly white male engineers in too much of a hurry to challenge their own chauvinism or consider the harm their work could do. I've been a tech company CEO since 1995, and it's a pattern I've seen before, during the web, search, and social revolutions of the past 20-plus years. The AI revolution started only recently, but it's already marginalized half of the world's population. Shame on us.

Or, I should say, shame on us again. The technology industry is a serial offender. Of the 20 largest U.S. technology firms by revenue, 18 have male CEOs. Only one in five engineers at Facebook, Google, and Microsoft are women. In AI specifically, 83% of attendees at the 2017 main annual gathering of AI experts, the Neural Information Processing Systems (NIPS) conference, were men, as were 90% of NIPS paper authors that year. (Wide-scale statistics on gender diversity in AI, which is relatively new as a specific sector, are not yet available.)

How can we build lasting and far-reaching AI technology if women are missing from the equation? We’re just getting started, but the signs are already worrying. Left unchecked, the results could be catastrophic.

To avert a disaster in conversational AI, one important antidote to techie male bias that we are pursuing aggressively in our company is to engage contact center staff alongside coders in building the bots. Customer service representatives—who are 65% female in the U.S., per the Labor Department—are a far more gender-diverse group than the programmers who write code, with a share of women well above the average for engineers at the big tech companies.

Companies working in AI should work to recruit more balanced workforces, partner with female leaders to reduce male bias, and host women-led tech initiatives. We need to develop a set of best practices in bot building and spread them across the industry.

AI has huge potential, but until the field begins to hold itself accountable, we’ll continue to miss the perspective and inclusivity we need for true progress. Diversity is our best defense against replicating and amplifying hidden biases. Without it, AI will soon birth the next crisis in the technology industry.

Robert LoCascio is the founder and CEO of LivePerson.
