DETOUR Act seeks to provide more transparency for social media users
The Deceptive Experiences to Online Users Reduction Act would prohibit large online platforms from using deceptive user interfaces, known as “dark patterns,” to trick consumers into handing over their personal data.
The DETOUR Act would also prohibit these platforms from using features that result in compulsive usage by children.
Sen. Mark Warner (D-VA) first introduced the DETOUR Act in 2019 and has been raising concerns for years about the implications of social media companies’ reliance on dark patterns.
“For years dark patterns have allowed social media companies to use deceptive tactics to convince users to hand over personal data without understanding what they are consenting to. The DETOUR Act will end this practice while working to instill some level of transparency and oversight that the tech world currently lacks,” said Sen. Warner, chairman of the Senate Select Committee on Intelligence and a former technology executive. “Consumers should be able to make their own informed choices on when to share personal information without having to navigate intentionally misleading interfaces and design features deployed by social media companies.”
The term “dark patterns” describes online interfaces in websites and apps designed to manipulate users into taking actions they would not otherwise take.
These design tactics, drawn from extensive behavioral psychology research, are frequently used by social media platforms to mislead consumers into agreeing to settings and practices advantageous to the company.
Dark patterns take various forms, often exploiting the power of defaults to push users into agreeing to terms stacked in favor of the service provider. Examples include deliberately obscuring alternative choices or settings through design or other means, and privacy settings that make “agree” the default option while forcing users who want more privacy-friendly choices to click through a much longer process, detouring through multiple screens. Other times, users cannot find the alternative option, if it exists at all, and simply give up looking.
The result is that large online platforms gain an unfair advantage over users and potential competitors by pressuring consumers to give up personal data such as their contacts, messages, web activity, or location for the company’s benefit.
“Tech companies have clearly demonstrated that they cannot be trusted to self-regulate. So many companies choose to utilize manipulative design features that trick kids into giving up more personal information and compulsive usage of their platforms for the sake of increasing their profits and engagement without regard for the harm it inflicts on kids,” said Jim Steyer, CEO of Common Sense. “Common Sense supports Senators Warner and Fischer and Representatives Blunt Rochester and Gonzalez on this bill, which would rightfully hold companies accountable for these practices so kids can have a healthier and safer online experience.”
“‘Dark patterns’ and manipulative design techniques on the internet deceive consumers. We need solutions that protect people online and empower consumers to shape their own experience. We appreciate Senator Warner and Senator Fischer’s work to address these misleading practices,” said Jenn Taylor Hodges, head of U.S. Public Policy at Mozilla.
“Manipulative design, efforts to undermine users’ independent decision making, and secret psychological experiments conducted by corporations are everywhere online. The exploitative commercial surveillance model thrives on taking advantage of unsuspecting users. The DETOUR Act would put a stop to this: prohibiting online companies from designing their services to impair autonomy and to cultivate compulsive usage by children under 13. It would also prohibit companies from conducting online user experiments without consent. If enacted, the DETOUR Act will make an important contribution to living in a fairer and more civilized digital world,” said Katharina Kopp, Director of Policy at Center for Digital Democracy.