Northeastern U. Scientists Plan to Use Surveillance Tech to Catch ‘Implicit Bias’

Elaine Thompson/AP

Researchers at Northeastern University in Boston are working to develop an artificially intelligent listening device that would detect and report “implicit bias” in the workplace, according to a blog post on the university’s website.

The concept is chillingly Orwellian. Northeastern University, in the post, says that the device would be used to report when workers don’t value the perspective of their female colleagues.

“But what if a smart device, similar to the Amazon Alexa, could tell when your boss inadvertently left a female colleague out of an important decision, or made her feel that her perspective wasn’t valued?” the post reads.

Northeastern Professor Christopher Riedl, who is working on the project, says that the device is intended to ensure “inclusion” in the workplace.

“The vision that we have [for this project] is that you would have a device, maybe something like Amazon Alexa, that sits on the table and observes the human team members while they are working on a problem, and supports them in various ways,” Riedl said. “One of the ways in which we think we can support that team is by ensuring equal inclusion of all team members.”

“You could imagine [a scenario] where maybe a manager at the end of a group deliberation gets a report that says person A was really dominating the conversation,” another professor, Brooke Foucault Welles, said.

Breitbart News has reported extensively on the popularity of “implicit bias” testing in academia. Breitbart News reported in January that Harvard University had embraced the debunked “implicit bias” test, an online quiz hosted on the Harvard website that purports to determine the level of a participant’s racism.

Stay tuned to Breitbart News for more updates on this story.

