Microsoft's Nadella says 'A.I. must guard against bias'

Satya Nadella is a believer in the vast promise of artificial intelligence. But the Microsoft CEO says humans and machines need to work together to solve the world’s great societal challenges, including issues of diversity and inequality.

And that responsibility, he argues, rests in the hands of the designers.

“Ultimately, it’s not going to be about human vs. machine,” Nadella wrote in a piece on Slate later reposted to LinkedIn, the professional networking service Microsoft is buying. “We humans have creativity, empathy, emotion, physicality, and insight that can then be mixed with powerful A.I. computation—the ability to reason over large amounts of data and do pattern recognition more quickly—to help move society forward.”

Nadella’s remarks come as the booming but still relatively nascent AI field confronts instances of gender and racial bias in computer programs, from chatbots to search engines to photo apps. Ugly examples (Microsoft's own chatbot engaging in racial slurs, for instance) have served as a reminder that these algorithms reflect the views, biases and experiences of their creators. And unless the industry shakes up its programming ranks, those creators may continue to consist largely of highly paid white men.

Consider the recent headlines: “Artificial Intelligence Has a ‘Sea of Dudes’ Problem,” declared Bloomberg, which questioned what happens when most of the computer scientists working in the AI field are male.

The New York Times chimed in with “A.I.’s Grasp of Diversity May Begin With Who Builds It,” which followed a Sunday Times opinion piece, “Artificial Intelligence’s White Guy Problem.” It was written by Microsoft principal researcher Kate Crawford, who is also co-chairwoman of a White House symposium on society and A.I.

For years, critics have been flagging racial bias in search results. Most recently, an 18-year-old Virginia high school senior compared a Google image search for “three black teenagers” to a search for “three white teenagers.” The results were troubling: the black teen query turned up police mugshots; the white teen search yielded groups of smiling teens.

As that student, Kabir Ali, told USA TODAY: "I had actually heard about this search from one of my friends and just wanted to see everything for myself. I didn't think it would actually be true. When I saw the results I was nothing short of shocked."

Google says the results only magnified biases that were already prevalent. In a statement released at the time to USA TODAY, Google said that “sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures."

Microsoft had to apologize back in March for “unintended offensive and hurtful tweets” spewed by its nascent A.I. chatbot Tay.

Some who study the topic say tech companies, as the creators of these programs, can't hide behind the defense that the code is objective and that any fault lies with the users.

"We have to ask ourselves: If Google is not responsible for its algorithm, who is?" said UCLA information studies and African American studies professor Safiya Umoja Noble.

In today’s Slate post, Nadella drove home the idea that the humans behind the machines are ultimately responsible for what happens next.

“All of the technology we build must be inclusive and respectful to everyone…. A.I. must maximize efficiencies without destroying the dignity of people: It should preserve cultural commitments, empowering diversity.

“We need broader, deeper, and more diverse engagement of populations in the design of these systems… A.I. must have algorithmic accountability so that humans can undo unintended harm. We must design these technologies for the expected and the unexpected. A.I. must guard against bias, ensuring proper, and representative research so that the wrong heuristics cannot be used to discriminate.”

Email: ebaig@usatoday.com; follow USA TODAY Personal Tech Columnist @edbaig on Twitter. Contributing: Jessica Guynn