Wired Magazine recently published an article giving insight into how Facebook and other social media websites manipulate users’ privacy choices to influence them into giving away more of their data.
In the article, titled “How Facebook and Other Sites Manipulate Your Privacy Choices,” Wired reports that social media sites such as Facebook nudge users into giving away more of their private data. The Electronic Frontier Foundation (EFF) coined the term “Privacy Zuckering” for when “you are tricked into publicly sharing more information about yourself than you really intended to.”
Wired provided an example of this tactic, writing:
Actually, it’s an old trick. Facebook used it back in 2010 when it let users opt out of Facebook partner websites collecting and logging their publicly available Facebook information. Anyone who declined that “personalization” saw a pop-up that asked, “Are you sure? Allowing instant personalization will give you a richer experience as you browse the web.” Until recently, Facebook also cautioned people against opting out of its facial-recognition features: “If you keep face recognition turned off, we won’t be able to use this technology if a stranger uses your photo to impersonate you.” The button to turn the setting on is bright and blue; the button to keep it off is a less eye-catching grey.
Privacy researchers refer to these design and wording decisions as “dark patterns,” a term for website designs that purposefully attempt to manipulate users’ choices. One example is Instagram repeatedly nagging users to “please turn on notification” without offering an option to decline.
Wired notes that these dark patterns are usually aimed at pushing users to spend more time on a website, but they become particularly nefarious when sites use them to convince users to hand over more personal information. Wired writes:
Dark patterns show up all over the web, nudging people to subscribe to newsletters, add items to their carts, or sign up for services. But, says Colin Gray, a human-computer interaction researcher at Purdue University, they’re particularly insidious “when you’re deciding what privacy rights to give away, what data you’re willing to part with.” Gray has been studying dark patterns since 2015. He and his research team have identified five basic types: nagging, obstruction, sneaking, interface interference, and forced action. All of those show up in privacy controls. He and other researchers in the field have noticed the cognitive dissonance between Silicon Valley’s grand overtures toward privacy and the tools to modulate these choices, which remain filled with confusing language, manipulative design, and other features designed to leech more data.
Those privacy shell games aren’t limited to social media. They’ve become endemic to the web at large, especially in the wake of Europe’s General Data Protection Regulation. Since GDPR went into effect in 2018, websites have been required to ask people for consent to collect certain types of data. But some consent banners simply ask you to accept the privacy policies—with no option to say no. “Some research has suggested that upwards of 70 percent of consent banners in the EU have some kind of dark pattern embedded in them,” says Gray. “That’s problematic when you’re giving away substantial rights.”
Read the full article at Wired Magazine.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan or contact via secure email at the address firstname.lastname@example.org