Report: Snapchat Employees Look to Redesign ‘Racist’ Camera


Snapchat, a social media platform, is looking to “redesign” its camera technology in order to make it more “inclusive” after employees complained, “the camera is, in fact, racist,” according to an Axios report.

The social media platform “has launched an initiative to redesign its core camera technology to make it better able to capture a wide range of skin tones,” according to a report by Axios.

According to the report, Snapchat claims its application is used to take “around 5 billion pictures” a day. The report added that pictures taken on Snapchat are a “starting point for how many people see themselves, their friends and their world.”

Bertrand Saint-Preux, a Snapchat engineer, told Axios, “Historically, the chemical processes behind film development used light skin as its chemical baseline — basically optimizing for whiteness, a legacy that continues today.” Because of this, he added, “the camera is, in fact, racist.”

Axios reports that film camera technology has gotten better over time at “exposing for darker tones, but not as part of a concerted effort to make things more equitable for people.” Instead, it was complaints from “chocolate makers and photographers shooting other dark subjects that pushed the industry to do better,” according to the report.

Additionally, the “inclusive camera” Snapchat is working to create is reportedly meant to be “broader than just capturing dark skin as well as light skin.” The report elaborated that this means “identifying and removing biased assumptions (e.g. that smaller, thinner noses are better) when automatically adjusting people’s appearance.”

Ideally, the company still wants “people to have flexibility, but wants to make a high-quality true image the starting point and then put the controls in the hands of the individual.” Axios found that the initiative started with a presentation Saint-Preux gave to some of the company’s top executives during the George Floyd riots last year.

Moreover, the report claims Snapchat is working with “several noted directors of photography from the film industry to learn techniques they use to best capture actors with darker skin tones.”

The report listed the technology the company hopes to develop:

  • Developing techniques to adjust images after they have been captured, such as correcting brightness and exposure to create a more balanced image.
  • Improving the selfie camera’s ability to capture low light by making adjustments to the front flash. So, for example, if someone was taking a selfie in a dark room, the display would use the right type of light waves to properly illuminate their skin tone.
  • Another key area involves machine learning systems and how those systems are optimized. If you tell a computer to optimize for the best average result in photos — which is what many algorithms do — it will make most people appear better and not worry if some people at the margins have a poor result.
  • On the flip side, if you focus on getting the quality of everyone’s image above a certain threshold, you will produce a more equitable result.
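The average-versus-threshold distinction in the last two bullets can be sketched in a few lines of Python. This is a hypothetical illustration of the general idea, not Snapchat's actual system; the function names, scores, and threshold are invented for the example:

```python
# Hypothetical illustration: two ways to score a photo-processing model
# across user groups with different skin tones. Each number is an
# image-quality score (0 to 1) for one group.

def average_objective(scores):
    """Score by the mean: a high average can hide a poor result
    for a group at the margins."""
    return sum(scores) / len(scores)

def threshold_objective(scores, threshold=0.8):
    """Score for equity: the model passes only if every group's
    quality clears the threshold."""
    return min(scores) >= threshold

# Model A: excellent for most groups, poor for one.
model_a = [0.98, 0.97, 0.96, 0.50]
# Model B: lower peak quality, but every group clears the bar.
model_b = [0.85, 0.84, 0.83, 0.82]

print(average_objective(model_a))   # higher mean than model B...
print(threshold_objective(model_a)) # ...yet fails the equity check
print(threshold_objective(model_b)) # model B passes for everyone
```

Under the average objective, model A looks better despite serving one group badly; under the threshold objective, only model B is acceptable, which is the trade-off the report describes.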

The initiative would also let outside developers and partners take advantage of the technology the company is looking to create, according to the report. This could make Snapchat’s efforts take longer, since the company wants to bring the technology to a large-scale market, the report said.

Furthermore, Axios reported the company has a “far from perfect” track record. Axios claims the company released a digital blackface Bob Marley feature for 4/20 several years ago, in addition to having to apologize for a “filter that asked subjects to ‘smile’ while they break free from chains” to honor Juneteenth.

The social media platform told Axios, “We are very mindful of our past mistakes and are applying what we’ve learned to all of our efforts to build more inclusive design processes, systems and products.”
