
    Empathy in technology: Baby steps in making robots less biased than we are

    On a summer night in Dallas in 2016, a bomb-handling robot made technological history. Police officers had attached roughly a pound of C-4 explosive to it, steered the device up to a wall near an active shooter and detonated the charge. In the explosion, the assailant, Micah Xavier Johnson, became the first person in the United States to be killed by a police robot.

    Afterward, then-Dallas Police Chief David Brown called the decision sound. Before the robot attacked, Johnson had shot five officers dead, wounded nine others and hit two civilians, and negotiations had stalled. Sending the machine was safer than sending in human officers, Brown said.

But some robotics researchers were troubled. “Bomb squad” robots are marketed as tools for safely disposing of bombs, not for delivering them to targets. (In 2018, police officers in Dixmont, Maine, ended a shootout in a similar manner.) Their profession had supplied the police with a new form of lethal weapon, and in its first use as such, it had killed a Black man.

    “A key facet of the case is the man happened to be African-American,” Ayanna Howard, a robotics researcher at Georgia Tech, and Jason Borenstein, a colleague in the university’s school of public policy, wrote in a 2017 paper titled “The Ugly Truth About Ourselves and Our Robot Creations” in the journal Science and Engineering Ethics. Like almost all police robots in use today, the Dallas device was a straightforward remote-control platform. But more sophisticated robots are being developed in labs around the world, and they will use artificial intelligence to do much more.

    A robot with algorithms for, say, facial recognition, or predicting people’s actions, or deciding on its own to fire “non-lethal” projectiles is a robot that many researchers find problematic. The reason: Many of today’s algorithms are biased against people of colour and others who are unlike the white, male, affluent and able-bodied designers of most computer and robot systems. While Johnson’s death resulted from a human decision, in the future such a decision might be made by a robot — one created by humans, with their flaws in judgment baked in.

    “Given the current tensions arising from police shootings of African-American men from Ferguson to Baton Rouge,” Dr. Howard, a leader of the organisation Black in Robotics, and Dr. Borenstein wrote, “it is disconcerting that robot peacekeepers, including police and military robots, will, at some point, be given increased freedom to decide whether to take a human life, especially if problems related to bias have not been resolved.”

    Last summer, hundreds of AI and robotics researchers signed statements committing themselves to changing the way their fields work. One statement, from the organisation Black in Computing, sounded an alarm that “the technologies we help create to benefit society are also disrupting Black communities through the proliferation of racial profiling.” Another manifesto, “No Justice, No Robots,” commits its signers to refusing to work with or for law enforcement agencies.

    The long-term solution for such lapses is “having more folks that look like the United States population at the table when technology is designed,” said Chris S. Crawford, a professor at the University of Alabama who works on direct brain-to-robot controls. Algorithms trained mostly on white male faces (by mostly white male developers who don’t notice the absence of other kinds of people in the process) are better at recognising white males than other people.

    “I was in Silicon Valley when some of these technologies were being developed,” he said. More than once, he added, “I would sit down and they would test it on me, and it wouldn’t work. And I was like, You know why it’s not working, right?”

Berreby is a journalist. ©2020 The New York Times
