Dark Patterns: How design seeks to control us | Sally Woellner | TEDxSydney
Summary
TL;DR: The speaker delves into the evolution of design, particularly in the digital realm, highlighting the rise of UX Design and its potential dangers. They introduce 'dark patterns,' manipulative design tactics used to control user behavior, such as 'confirmshaming' and 'misdirection.' Examples include Duolingo's guilt-inducing owl and StubHub's hidden fees. The talk also touches on the use of color psychology and notification algorithms to increase engagement, leading to 'privacy Zuckering,' where personal data is exploited. The speaker calls for awareness, ethical design, and accountability from tech platforms.
Takeaways
- The evolution of design has shifted from aesthetics to user experience (UX), focusing on how users interact with technology.
- UX designers aim to make digital spaces more user-friendly, but can also manipulate user behavior through 'dark patterns'.
- 'Dark patterns' are manipulative design tactics used to influence user behavior without their full awareness.
- 'Confirmshaming' is a dark pattern where users feel guilty for canceling services, using emotional design to retain them.
- 'Misdirection' hides important information and draws attention to less relevant details to guide user decisions.
- Platforms can increase sales by hiding fees until the last moment, exploiting user urgency and lack of attention to terms.
- The color red and other visual cues are used to grab attention and increase engagement, mimicking the effects of gambling mechanics.
- Social media platforms like Instagram use algorithms to withhold notifications, creating a sense of anticipation and reward.
- 'Trick questions' in terms and conditions can mislead users into agreeing to unfavorable terms without realizing it.
- 'Privacy Zuckering' combines various dark patterns to extract more personal information from users than intended.
- Recent legislation in the EU and UK has started to outlaw particularly exploitative dark patterns due to public demand.
Q & A
What is the primary goal of a UX Designer according to the transcript?
-A UX Designer's primary goal is to design the way users interact with technology, products, and apps, making digital spaces easier, more delightful, and sometimes more addictive.
What are 'dark patterns' in the context of design?
-Dark patterns are manipulative tactics used by designers to influence user behavior in ways that are often not in the user's best interest.
Can you explain the term 'confirmshaming' as mentioned in the transcript?
-Confirmshaming is a dark pattern where designers use emotionally charged language and design to make users feel guilty about canceling a service or leaving a platform.
What is 'misdirection' in the context of dark patterns?
-Misdirection is a dark pattern where designers hide important information and draw attention to less relevant details to manipulate users into behaving in a certain way.
How does the color red play a role in dark patterns as described in the transcript?
-The color red is used to grab attention and increase heart rate, and it's been shown to increase click-through rates, as demonstrated by HubSpot's experiment with button colors.
What is the psychological effect that Instagram's notification algorithm is designed to create?
-Instagram's notification algorithm is designed to create a sense of unease, disappointment, and anticipation, followed by a rush of dopamine when the withheld likes are suddenly revealed.
What is a 'trick question' in the context of dark patterns?
-A trick question in dark patterns refers to ambiguous or misleading choices presented to users, such as unclear options to cancel an account, which can lead to users unintentionally providing more information or continuing a service.
What is 'privacy Zuckering' and how does it relate to dark patterns?
-Privacy Zuckering is a dark pattern where designers combine various manipulative techniques to get users to reveal more personal information than they intended, often without their full awareness.
How do data brokers use personal information according to the transcript?
-Data brokers collect and sell personal information, which can be used to classify people into groups and target them with specific marketing, sometimes leading to discriminatory practices.
What is the role of attention in the business model of platforms like Facebook as described in the transcript?
-Attention is the most valuable commodity for platforms like Facebook. By keeping users engaged, they can gather more data, sell it to data brokers, and serve personalized advertising to make more money.
What action does the speaker suggest users take to avoid falling victim to dark patterns?
-The speaker suggests users hold platforms accountable, choose products that use data ethically, visit darkpatterns.org for awareness, and raise their voice to influence legislation against predatory dark patterns.
Outlines
The Evolution and Dark Side of Design
The speaker, a designer, expresses initial optimism about design's role in beautifying the world but then warns of its darker applications. The evolution of design, particularly in the digital realm of User Experience (UX), is discussed, highlighting how UX designers focus on user interaction rather than aesthetics. The concept of 'dark patterns' is introduced as manipulative design tactics that can control user behavior without their awareness. Examples of such patterns include 'confirmshaming', which uses guilt to retain users, and 'misdirection', which hides fees until the last moment in a purchasing process. The speaker also touches on the psychological impact of color and design in user engagement, and how these tactics can be combined to extract more personal information, leading to privacy concerns.
The Impact of Dark Patterns on Privacy and Health
This paragraph delves into the consequences of dark patterns, emphasizing how they can be used to exploit users' privacy. The term 'privacy Zuckering' is introduced, describing how personal data is collected even after account deactivation. The role of Data Brokers in compiling and selling personal information is highlighted, along with the potential for this data to be used in discriminatory ways. The speaker points out the irony that while health and therapy apps are meant to protect users' well-being, they can also be complicit in sharing sensitive data for profit. The importance of user attention as a commodity in the digital economy is discussed, explaining the financial incentives behind companies' use of dark patterns. The speaker concludes by advocating for ethical design, encouraging the audience to hold platforms accountable and to support legislation against predatory practices.
Keywords
Design
User Experience (UX) Design
Dark Patterns
Confirmshaming
Misdirection
Data Brokers
Privacy Zuckering
Personalized Advertising
Ethical Design
Accountability
Dark Patterns Legislation
Highlights
Design has evolved alongside technology, with UX designers now shaping how people interact with digital products and spaces.
Dark patterns are manipulative design tactics used to influence user behavior without their awareness.
Confirmshaming is a dark pattern where designers use emotionally manipulative language to guilt users into staying or not canceling services.
Misdirection occurs when designers hide essential information and highlight less important things to control user behavior, often leading to hidden fees.
Changing a button's color, like from green to red, can increase user engagement, as shown by a 20% rise in clicks on HubSpot's platform.
Instagram's notification algorithm withholds likes to create anticipation, triggering dopamine responses similar to gambling mechanisms.
Trick questions in interfaces, such as unclear buttons, confuse users and lead to unintended actions, like not canceling services.
Privacy Zuckering combines multiple dark patterns to extract more personal information from users than they realize.
Facebook continues collecting user data even after accounts are deactivated, highlighting the danger of information exploitation.
Data brokers collect thousands of personal data points on nearly every American, which can be sold and used for manipulative marketing.
Many health apps risk patient privacy, with some failing to meet clinical standards, putting vulnerable users at risk.
Therapy apps have been caught sharing sensitive mental health information with advertising platforms like Facebook and Snapchat.
Your attention is the most valuable asset in the digital ecosystem, driving companies to use manipulative designs to keep users engaged.
Design can be a powerful tool for good, and users should hold companies accountable for ethical practices in their digital experiences.
Recent EU and UK legislation has started to outlaw certain dark patterns, showing that public outcry can lead to protective measures.
Transcripts
Transcriber: Andrea Carrer — Reviewer: Anna Sobota
I became a designer to make the world a more beautiful place.
But the longer Iâve continued in my career,
the more Iâve started to notice
some aspects of design changing into something a bit different,
something that can be quite dangerous for all of us.
Many of you may remember design, as most people think of it,
designing a beautiful poster for your wall
or designing the interior of a gorgeous building.
But design has evolved alongside technology
and many designers these days are crafting
what we call User Experiences, or UX Design.
A UX Designer isnât really concerned with how a website looks.
A UX Designer is designing the way that you interact
with the technology, products and apps that you use every day.
A UX Designer is working to make your digital spaces easier,
more delightful and sometimes more addictive.
And thatâs how design might be controlling your behavior, without you even noticing,
when designers turn to what we call dark patterns.
Dark patterns is a name for the multitude of worryingly effective ways
that designers use manipulative tactics to get you to behave
the way that they would like.
Iâm about to tell you about four of them today.
So think back to last time you signed up for something.
It was probably super simple.
It might have felt a little bit exciting.
If it was something like this, you might have done it even with a single click.
And then, you might want to cancel that service or unsubscribe from that email.
In that case, you may have run into one of our first dark patterns,
confirmshaming.
Confirmshaming is when designers use manipulative language
and emotionally charged design
to make you feel really, really guilty about canceling a service or leaving.
Duolingo redesigned their owl to make you feel even more guilty about quitting
than you did before.
Itâs been proven to be really effective.
I mean, do you want to make the owl cry?
Platforms that use this technique also often employ our next dark pattern,
misdirection.
Thatâs when designers will hide the things that they donât really want you to see
and draw lots of attention to the things that you do,
in order to help you to behave the way that they want.
I know all of you will have had some experience with buying tickets
and you know, this is a great price.
But four screens later, when you've raced to enter all your personal information,
you might see something more like this in the fine print.
On platform StubHub, they used to put the entire price up front,
even with that 30% extra you can see in the tiny little text there.
But they discovered when they moved that fee just to the end
and hid it until it was almost too late,
people spent, on average, about 21% extra.
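The drip-pricing misdirection described above can be illustrated with a minimal sketch. The function names and the exact fee rate are illustrative assumptions; only the roughly 30% hidden fee comes from the talk.

```python
# Hypothetical sketch of "drip pricing" misdirection: the advertised
# price omits a service fee that is only revealed at the final
# checkout step. The ~30% fee rate mirrors the figure from the talk;
# the function names are made up for illustration.

def advertised_price(base: float) -> float:
    """Price shown up front on the listing page: fees hidden."""
    return base

def final_price(base: float, fee_rate: float = 0.30) -> float:
    """Price revealed only on the last checkout screen."""
    return round(base * (1 + fee_rate), 2)

base = 100.00
print(advertised_price(base))  # 100.0 shown up front
print(final_price(base))       # 130.0 revealed at the end
```

The pattern works because the user has already invested effort (four screens of personal information) before the real price appears.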
But, that oneâs kind of obvious.
Youâve probably noticed that one.
Designers are using even more manipulative techniques
to influence the way that you behave.
The color red has been shown to raise heart rate
and it gets your attention.
Online platform HubSpot discovered
that when they changed one button from green to red,
about 20% more people actually clicked on it just for that.
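A button-color experiment like the one described is typically evaluated by comparing click-through rates and computing the relative lift. The counts below are illustrative assumptions, not HubSpot's actual data; only the roughly 20% lift comes from the talk.

```python
# Hedged sketch of evaluating a button-color A/B test: compare the
# click-through rate (CTR) of a green control against a red variant.
# The visitor and click counts here are invented for illustration.

def ctr(clicks: int, views: int) -> float:
    """Click-through rate: fraction of views that led to a click."""
    return clicks / views

def relative_lift(control: float, variant: float) -> float:
    """Relative improvement of the variant over the control."""
    return (variant - control) / control

green = ctr(clicks=100, views=2000)  # 5.0% control CTR
red = ctr(clicks=120, views=2000)    # 6.0% variant CTR
print(f"lift: {relative_lift(green, red):.0%}")  # prints "lift: 20%"
```

In practice such a result would also be checked for statistical significance before rolling out the change.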
Even that is pretty simple
when you compare it to how some platforms are using this little red dot
to enable mechanics that are kind of like gambling or slot machines.
What will happen is
Instagramâs notification algorithm will actually withhold likes from you.
Itâll create this growing sense of unease, disappointment and anticipation,
and then, suddenly, BAM!
They hit you with it all at once.
And that's what's being designed,
that rush of dopamine that someone noticed your cat picture
that keeps you coming back.
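The withholding mechanic described above can be sketched as a simple batching queue. This is a speculative illustration of the behavior the speaker describes, not Instagram's actual algorithm; the class name, batch threshold, and queueing policy are all assumptions.

```python
# Speculative sketch of like-withholding: instead of delivering each
# like immediately, notifications are queued and released all at once,
# producing a variable-reward burst. Threshold and policy are invented
# for illustration, not taken from any real platform.

class WithholdingNotifier:
    def __init__(self, batch_size: int = 5):
        self.batch_size = batch_size
        self.pending: list[str] = []

    def on_like(self, user: str) -> list[str]:
        """Queue a like; deliver nothing until a batch accumulates."""
        self.pending.append(f"{user} liked your photo")
        if len(self.pending) >= self.batch_size:
            burst, self.pending = self.pending, []
            return burst  # everything hits at once
        return []         # ...unease and anticipation in the meantime

notifier = WithholdingNotifier(batch_size=3)
print(notifier.on_like("ana"))  # [] — withheld
print(notifier.on_like("ben"))  # [] — withheld
print(notifier.on_like("cho"))  # burst of all three at once
```

Delaying and then bursting the reward is what makes the schedule variable, the property the talk compares to slot machines.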
And when you combine that compulsion to keep clicking
with the next dark pattern number three, the trick question,
things can get bad pretty quickly.
In 2010, the e-commerce platform GameStation
added this little clause to their shopping checkout,
which gave them ownership of your immortal soul
when you purchased with them.
They did also put a big opt-out option
but they were quite surprised to end up the owners of several thousand souls.
They concluded that around 88% of people didn’t read
the Terms and Conditions or text when they checked out.
The problem is, even if you are someone
who happens to read all the Terms and Conditions,
it might not help you.
For instance, this is just a trick.
[Are you sure you want to cancel your account? Cancel. Continue.]
I mean, I put this in the presentation,
and Iâm still not sure which button I meant to be clicking.
When you bring these all together,
often you enable dark pattern number four, privacy Zuckering, named, of course,
for Facebook CEO, Mark Zuckerberg.
This pattern is when designers combine all of these different techniques
to get people to reveal much more information about themselves
than they intended to.
Many people donât know
that in a disquieting form of digital grave robbing,
Facebook can actually continue to collect information on you
even if your account is deactivated.
And, why does this matter?
Well, it's all to do with how these platforms work.
The problem is that they use your information to make money.
They collect it and sell it to other people.
The people who collect and sell this information are called Data Brokers,
and research by the U.S. Federal Trade Commission uncovered the fact
that one Data Broker had as many as 3,000 pieces of personal information
on nearly every American in the United States.
And Professor Frank Pasquale highlighted the fact
that this can be used to classify people into groups
that can be dangerous or discriminatory.
They may group people together into a group called âElderly and Gullibleâ,
and then, sell their information onto gambling marketers.
This is what the dark side of design can enable.
And this is why it matters.
Recent research, backed by the British National Health Service,
uncovered the fact that 4 out of 5 health apps are so bad
they may be putting patient health at risk.
7 out of 10 apps designed to prevent suicide
fail to even meet basic measures of clinical quality.
But, more than that, when youâre at your most vulnerable,
that's when you need your privacy the most.
Thatâs the reason that there are such strict laws on therapists and counselors
that control what information they can share about you.
But unfortunately,
design and technology is moving faster than the law can catch up.
Four of the major therapy apps have been shown to be sharing some of your data
with platforms like Facebook, Pinterest, Google and Snapchat.
One of the reasons these platforms gave was
that it enables a personalized experience.
That is, they serve personalized advertising to you
while youâre on those platforms.
That means that these companies are using information about your mental health
to sell you advertising.
Why is this happening?
Well, what you need to know is how platforms like Facebook make their money.
They make it from you and your attention.
Your attention is the most valuable commodity in our modern digital ecosystem.
If they have your attention, if they can keep you on the app or their website,
they can gather more data about you.
They can sell more data on to Data Brokers.
They can give you more advertising and they can make more money off you.
Thatâs why all of these companies are so desperate
to get your time and attention.
And thatâs why their designers have been pushed to such extremes
to try and keep your attention.
But it doesnât have to be like this.
Design is just a tool.
It can be used far more effectively to do good than to harm.
And it shouldn't be like this.
Now that I've told you about all of this,
I want you to hold these platforms and companies to account.
Now that you're aware, you can avoid platforms that use techniques like this.
Choose products that use your data ethically.
Check out darkpatterns.org
where ethical designers raise awareness of these patterns
so that you can avoid them and not get trapped by them.
On top of that, raise your voice.
In the EU and UK, recent legislation has outlawed several dark patterns
that were particularly predatory,
thanks to widespread public outcry.
Together, I believe that we can use design the way that it was intended.
We can design a better world together.
Thank you.