Facebook and Google accused of manipulating us with “dark patterns”
By now, most of us have seen privacy notifications from popular web sites and services. These pop-ups appeared around the time that the General Data Protection Regulation (GDPR) went into effect, and they are intended to keep the service providers compliant with the rules of GDPR. The regulation requires that companies using your data are transparent about what they do with it and get your consent for each of these uses.
Facebook, Google and Microsoft are three tech companies that have been showing their users these pop-ups to ensure that they’re on the right side of European law. Now, privacy advocates have analysed these pop-ups and have reason to believe that the tech trio are playing subtle psychological tricks on users. They worry that these tech giants are guilty of using ‘dark patterns’ – design and language techniques that make it more likely that users will give up their privacy.
In a report called Deceived By Design, the Norwegian Consumer Council (Forbrukerrådet) calls out Facebook and Google for presenting their GDPR privacy options in manipulative ways that encourage users to give up their privacy. Microsoft is also guilty to a degree, although it performs better than the other two, the report said. Forbrukerrådet also made an accompanying video:
Tech companies use so-called dark patterns to do everything from making it difficult to close your account through to tricking you into clicking online ads (for examples, check out darkpatterns.org’s Hall of Shame).
In the case of GDPR privacy notifications, Facebook and Google used a combination of aggressive language and inappropriate default selections to keep users feeding them personal data, the report alleges.
A collection of privacy advocacy groups joined Forbrukerrådet in writing to the Chair of the European Data Protection Board, the EU body in charge of the application of GDPR, to bring the report to its attention. Privacy International, BEUC (an umbrella group of 43 European consumer organizations), ANEC, a group promoting European consumer rights in standardization, and Consumers International are all worried that tech companies are making intentional design choices to make users feel in control of their privacy while using psychological tricks to do the opposite. From the report:
When dark patterns are employed, agency is taken away from users by nudging them toward making certain choices. In our opinion, this means that the idea of giving consumers better control of their personal data is circumvented.
The report focuses on one of the key principles of GDPR, known as data protection by design and default. This means that a service is configured to protect privacy and transparency from the outset, making this protection the default option rather than something that the user must work to enable. Users’ privacy must be protected even if they never get around to opting out of data collection. As an example, the report states that when a user is choosing their privacy settings, the boxes ticked by default should be the most privacy-friendly options.