“Dark Pattern” design is a new frontier in the digital privacy world. The term, coined in 2010 by UX designer Harry Brignull (check his name-and-shame site at darkpatterns.org), covers any user interface designed to undermine a user’s ability to make autonomous decisions and extract compliance, getting people to do things they wouldn’t otherwise do, most often buying things they don’t need or giving away sensitive personal information. The Electronic Frontier Foundation and Consumer Reports teamed up last year to create a tip line, and the FTC has held public workshops to gather ideas while weighing rules to regulate the practice.
But first, let’s define our terms and distinguish some fields of expertise from others. “Emotional design” is a recognized term, which the Interaction Design Foundation describes as “the concept of how to create designs that evoke emotions which result in positive user experiences.” It’s also the subject of a fascinating book, “Emotional Design: Why We Love (or Hate) Everyday Things,” by Donald Norman, emeritus professor of computer science at Northwestern. Similarly, “social design” denotes design thinking that specifically attempts to further positive social change. Squish the two together and you have “social emotional design,” a term used in behavioral economics. But let’s begin with emotional design as it emerged in industrial design, a specific strain in object design theory and practice.
“My field is physical product design, not digital interfaces, so I can only speak to that part of the discipline,” says Tim Parsons, associate professor in the Designed Objects programs at SAIC, and co-founder of Chicago studio Parsons & Charlesworth. “Conventional design-historical wisdom is that emotional response as a conscious driver of product function and aesthetics only came to prominence in the postmodern era. Italian radical designers, among others, turned to pop imagery and historical references to generate products where form-followed-fun, lampooning the dryness of Bauhaus-inspired modernism. In the nineties, independent designers making one-off or small-batch products were able to tap into emotional connections between end-users and objects far more effectively than industrial designers for mass production could.
“Droog Design,” Parsons continues, “which gave designers like Hella Jongerius and Marcel Wanders their first significant career exposure, were showing products which strived for poetic connection. Jongerius’ Soft Urn and Wanders’ Knotted Chair did this through surprising materiality (the vase looks like an ancient artifact but is modern and flexible; the chair is crocheted yet solid due to being dipped in resin).
“This kind of material experimentation is still very much alive in design today but is often done with an environmental impetus, like Christien Meindertsma’s Flax Chair shown at the Art Institute in 2019. Flax has been processed into a bio-composite that can replace oil-based plastics. I see these design approaches as genuine attempts to create a lasting bond between the purchaser and the product rather than marketing tactics.” As author Jonathan Chapman relates in his book “Emotionally Durable Design,” “If these strategies create genuine attachment and we hold onto—and even repair—objects instead of throwing them away, this attachment serves a sustainable purpose.” All of this, clearly, is design that enhances individual enjoyment and, more broadly, the social good.
But this is also our jumping-off point, where emotional object design ideas resonate with the designers of our new digital world and that second term, “social design,” comes back into play. You’ve heard that in social media “we are the product”? In recent years, in the somewhat nebulous digital design field of UX (user experience), fostering that warm, fuzzy feeling can translate into torqued-up user engagement, which in turn translates into an advertising audience and, ultimately, financial valuation figures.
Witness, for instance, the hot water Facebook got itself into with its infamous 2012 decision to use its UX framework to experiment with actively manipulating user emotions. As reported by Wired in 2014, the social media giant “conducted two experiments, with a total of four groups of users (about 155,000 each)… In the first experiment, Facebook reduced the positive content of News Feeds. Each positive post ‘had between a ten-percent and ninety-percent chance (based on their User ID) of being omitted from their News Feed for that specific viewing.’ In the second experiment, Facebook reduced the negative content of News Feeds in the same manner. In both experiments, these treatment conditions were compared with control conditions in which a similar portion of posts were randomly filtered out (i.e., without regard to emotional content).”
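To make the mechanics in that quote concrete, here is a minimal sketch, in Python, of how deterministic, per-user probabilistic filtering of that kind could work. The hashing step, the function names and the ready-made sentiment labels are my own assumptions for illustration; Facebook’s actual code was never published, and the study classified posts with word-counting software rather than pre-labeled data.

```python
import hashlib
import random

def omission_probability(user_id: str) -> float:
    """Map a user ID to a fixed omission chance between 10% and 90%.

    Hashing is an assumption: the quote only says the chance was
    "based on their User ID," so any deterministic mapping would do.
    """
    first_byte = hashlib.sha256(user_id.encode()).digest()[0]
    return 0.1 + 0.8 * (first_byte / 255)  # rescale [0, 255] -> [0.1, 0.9]

def filter_feed(posts: list[dict], user_id: str) -> list[dict]:
    """Drop each positive post with the user's fixed omission probability.

    A fresh random draw happens per post, per call, matching the quote's
    "for that specific viewing."
    """
    p = omission_probability(user_id)
    return [
        post for post in posts
        if not (post["sentiment"] == "positive" and random.random() < p)
    ]

# The same user always gets the same omission probability, but which
# positive posts survive varies from one viewing to the next.
feed = [
    {"id": 1, "sentiment": "positive"},
    {"id": 2, "sentiment": "negative"},
    {"id": 3, "sentiment": "positive"},
]
print(filter_feed(feed, "user-42"))
```

Keying the probability to the user ID, as the quote describes, would give each person a consistent dose of filtering even as the specific posts omitted changed from viewing to viewing.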
There’s also the experimentation conducted by Uber, widely reported by The New York Times. To summarize that reporting: to deliver passengers a car more or less immediately when they ask for one, Uber needs more drivers available, sitting in their cars and ready to respond at the sound of a ping. To that end, the company’s social scientists employed psychologically manipulative reward-incentive tricks to keep drivers waiting endlessly for their next score.
And that’s not to mention dating apps like Tinder, Hinge and OkCupid, which have also faced increased scrutiny, with users reporting success rates as low as four percent over a decade of use. How’s that for weaponized incentive salience? Ultimately, what Facebook and the other corporations experimenting with these systems are doing is tinkering with our dopaminergic systems, that is, with the tiny squirts of dopamine our brains emit when we sense a reward, like Pavlov’s dogs hearing the bell they know means it’s feeding time. But rather than surgically rerouting salivary ducts through the cheek to observe salivation responses, as Pavlov did, Facebook and Uber observed and collected data about our responses in likes and click-throughs. A bioethicist informed on the subject, who agreed to speak on background, framed Facebook’s initial argument as one in which the company was attempting to act responsibly, faced with a choice between mood contagion theory and the social comparison hypothesis.
Back before the experiment, there was already a lot of bruit that social media might be driving people to suicide, a worry later borne out by The Wall Street Journal report revealing that Facebook researchers had “repeatedly found that the photo-sharing platform [Instagram] is toxic for teen girls.” So, if a teenager is exposed to impossibly idealized body imagery and mood contagion theory is correct, they absorb the upbeat mood of those posts and feel good about themselves. But if the social comparison hypothesis (the theory that we look to others to sharpen the accuracy of our self-evaluations) is correct, they encounter those images and feel bad. Social media companies therefore have an obligation to figure out which is actually happening, and that’s what Facebook was reportedly doing.
Except that, without much informed consent from the test subjects beyond a broadly worded user service agreement, the experiment very arguably violated regulations already in place governing human subjects research. And most social media companies have not budged from justifying this kind of tweaking, arguing that users agree to changes in the algorithm when they accept the terms of service, regardless of how deliberately altering the algorithm to run these experiments might undercut that very argument. The debate has raged to this day, even as the still-nascent field of “social emotional design” continues to evolve; the underlying context remains consistent, and deeply disturbing, especially given the current zeal to bring the UX space further into our brick-and-mortar existence. “As Web3 and the metaverse develop, we will likely see the tactics of UX design enter the AR/VR space,” says Parsons.
Knowing this, I couldn’t help but wonder: if these psychological tricks continue to go unregulated and become part of the background, the standard operating procedure for designing digital environments, could people develop dependencies on them? For those without regular sources of self-esteem and positive feedback, there is absolutely “a potential for addiction,” the bioethicist told me flatly. And that possibility, which goes beyond mere deceptive design aimed at extracting goods and information, is the whole new, still largely unregulated world of user interface into which it seems we are about to tread.
Michael Workman is an artist, writer, dance, performance art and sociocultural critic, theorist, dramaturge, choreographer, reporter, poet, novelist, curator, manager and promoter of numerous art, literary and theatrical productions. In addition to his work at The Guardian and Newcity, Workman has also served as a reporter for WBEZ Chicago Public Radio, and as Chicago correspondent for Italian art magazine Flash Art. He is currently producing exhibitions, films and recordings, dance and performance art events under his curatorial umbrella, Antidote Projects. Michael has lectured widely at universities including Northwestern University, The School of the Art Institute of Chicago and The University of Illinois at Chicago, and served as advisor to curators of the Whitney Biennial. His reporting, criticism and other writing have appeared in New Art Examiner, the Chicago Reader, zingmagazine, and Contemporary magazine, among others, and his projects have been written about in Artforum, The New York Times, Artnet, The Financial Times, The Huffington Post, The Times of London, The Art Newspaper, The Wall Street Journal, New York Magazine, Art In America, Time Out NY, Chicago and London, Gawker, ARTINFO, Flavorpill, The Chicago Tribune, NYFA Current, the Frankfurter Allgemeine, The Chicago Sun-Times, The Village Voice, Monopol, and numerous other news media, art publications and countless blog, podcast and small press publishing outlets throughout the years.
Contact: michaelworkman1@gmail.com. Website: michael-workman.com