
By Alice Trend and Jieshan Chen · 22 November 2023 · 6 min read

Key points

  • Only half of us are aware of deceptive or dark patterns online or on our apps.
  • 11 per cent of shopping websites and 95 per cent of apps contain deceptive patterns, which trick users into making bad choices.
  • Our researchers have developed UIGuard, an automatic deceptive pattern detection tool to protect shoppers and consumers.

Buzz buzz. You have a new email, telling you about another sale 'Starting NOW – 40% OFF, hurry while stock lasts!' It's the time of year when advertisers can't wait to tell us all about their amazing sales. But underneath the fanfare lurks a hidden danger.

Worryingly, only half of Aussies are aware of deceptive or dark patterns online. Meanwhile, 95 per cent of popular mobile apps contain malicious designs in their user interfaces – at least seven types per app, on average – and 11 per cent of shopping websites contain deceptive patterns. To top it off, 83 per cent of Australians have lost money, lost control of their data, or been manipulated into making a choice that worked against them. And you were just trying to get some shopping done!

From disguised ads to scams, dark or deceptive patterns can be annoying at best and dangerous at worst. Let’s delve into some common deceptive patterns to look out for on mobile user interfaces (UIs) and how we're combating them with the power of technology research.

Baited bytes: falling victim to phishing and scams

Jieshan Chen is the lead author of 'Unveiling the tricks: automated detection of dark patterns in mobile applications'. She explains that a deceptive pattern is a web, app or UI design that works against you.

“Deceptive patterns might fool you into providing unnecessary data, or make it hard to exit a page, ad, or subscription you don’t want. While some deceptive patterns on your phone are merely irritating, like pop-up ads or notifications, others can be severe if linked to phishing websites,” Jieshan said.

“By deceiving you into providing your personal details, deceptive agents can send you targeted scams and ads, or even pretend to be a friend or relative for scamming purposes." 

Vulnerable users like children and young adults are at particular risk of being deceived. 

UIGuard detects a deceptive pattern called False hierarchy.

How scammers leverage deceptive patterns online 

Clicks on hidden ads, forced marketing subscriptions, and the ability to import contact lists or user data to sell can create perceived benefits for companies. But the loss of trust from frustrated users and the damage to a company’s brand seem to cancel out any real or perceived benefit. On the malicious side, scammers want to steal your data or trick you into buying from fake sites.

Novel app safeguards shoppers online

“Unfortunately, not a lot is available on the market for consumers right now," Jieshan said.  

Worldwide, researchers are working on identifying and characterising deceptive patterns so people can better understand them. Some even provide platforms to name and shame companies using deceptive patterns. But these approaches rely on consumers accumulating knowledge and using time-consuming methods to identify and defeat the patterns themselves.

To tackle this, our researchers have partnered with collaborators from Monash University to create a world-first AI-driven app called UIGuard that automatically detects deceptive patterns. We wanted to develop a detection tool to protect end users from harm.

"To create the tool, first we looked for new deceptive patterns to create a whole new database of deceit, so we could create a data-driven solution. We applied AI techniques, including computer vision and natural language pattern matching, onto our dataset to train it to automatically find a wide range of deceptive patterns on mobile apps and UIs," Jieshan said. 

The researchers found UIGuard identified deceptive patterns with over 83 per cent accuracy. Mobile phone users in the study who used UIGuard were able to detect nearly three times as many deceptive patterns, helping protect themselves from harm.

“We are confident that the public will really benefit from this research and expect it may be useful to regulators and app providers," she said.  

The team hope to release UIGuard to the public early next year. In the meantime, users should be aware of deceptive patterns and decide how vigilant they want to be.  

Red flags to watch out for

Beware of these actions when you're visiting an app or website.

Nagging

Nagging is a repeated app action that interrupts a user’s current task and nags them to do something else. This includes ad, rating, or upgrade pop-ups. No one likes a nag, but like most nags, these are relatively harmless.

Obstruction

Designers can make it harder to do things companies don’t want you to do.

  • For example, you’re a victim of the Roach Motel if it was easy to check in, but hard to check out (e.g. hard to log out, unsubscribe, or delete your account).

Victims of the Roach Motel. Image created in DALL-E.

  • If you can’t select text to compare prices or products on other sites, you’re experiencing price comparison prevention.
  • Intermediate currency is an effort to disconnect users from real money by using virtual currency for games or reading.

Sneaking

Scammers using this approach try to hide, disguise or delay information that is relevant to users. This could include a free trial that you end up paying for. It can also take forms like:

  • Forced continuity automatically continues (and charges for) a service when your purchase expires – remember old gym memberships?
  • Hidden costs delay information like a high tax rate or delivery fee until a late stage of checkout.
  • Sneak into basket quietly adds additional items to your cart without your consent.
  • Bait and switch tricks you by hiding ads behind normal-looking buttons.

These kinds of deception can result in unexpected bills.

Forced action

This strategy forces users to perform tasks like logging in to get rewards, view content, or unlock features.

Interface interference

Interface interference promotes some options over others in ways that confuse users or hide information from them.

  • Hidden information reduces the visibility of the details most relevant and useful to you: think small print in a less visible colour, while the option they want you to click appears in a more obvious font, colour or size.
  • Preselection means unfavourable options have already been selected for you. Careful reading is the best way to get around this (a sketch of how such defaults might be spotted automatically follows this list).
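As an illustration of how preselected opt-ins could be spotted automatically, here is a minimal Python sketch that scans an Android layout file for checkboxes or switches that are ticked by default next to privacy- or marketing-sounding text. The word list, element handling and find_preselected_optins function are assumptions made for this example, not a description of how UIGuard works.

```python
import xml.etree.ElementTree as ET

# Android XML attributes carry this namespace once parsed.
ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Words that often appear next to privacy- or marketing-related opt-ins.
# This list is an assumption for the sketch, not a definitive taxonomy.
OPT_IN_HINTS = ("newsletter", "marketing", "usage statistics", "analytics", "share")


def find_preselected_optins(layout_xml: str):
    """Flag CheckBox/Switch elements that are checked by default and whose
    visible text looks like a privacy or marketing opt-in."""
    root = ET.fromstring(layout_xml)
    flagged = []
    for elem in root.iter():
        if elem.tag not in ("CheckBox", "Switch"):
            continue
        checked = elem.get(ANDROID_NS + "checked", "false")
        text = elem.get(ANDROID_NS + "text", "")
        if checked == "true" and any(hint in text.lower() for hint in OPT_IN_HINTS):
            flagged.append(text)
    return flagged


if __name__ == "__main__":
    sample_layout = """
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android">
        <CheckBox android:text="Send my usage statistics for analytics purposes"
                  android:checked="true"/>
        <CheckBox android:text="Remember my login" android:checked="false"/>
    </LinearLayout>
    """
    for text in find_preselected_optins(sample_layout):
        print("Preselected opt-in found:", text)
```

In practice, many app screens are built at runtime rather than from static layout files, which is one reason approaches that analyse the rendered UI, such as computer vision over screenshots, are part of the mix.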

Aesthetic manipulation

Visual distraction, or drawing users’ attention to specific actions, is at the heart of aesthetic manipulation.

  • Toying with emotion can include strategies like a countdown that creates urgency on a fake retail site, or emotionally manipulative messages designed to keep you on social media sites. This is a significant issue for the mental health of children who have become addicted to social media. Deceptive patterns can create regret and anxiety: why did I just waste so much time on that game or social media platform when I had other, more important things to do?
 

False hierarchy

This makes options that favour the company more prominent than those that serve your interests.

  • Disguised ads look the same as genuine content. The severity of the deception depends on where the ad leads: if it just links to a retail site, no harm done. However, if it links to a phishing website, you could be in trouble.
  • Trick questions might use the old double negative to deliberately mislead you into an action.

Social pyramid

This style of deception asks users to leverage their contacts to get rewards or even just to sign up.

  • You can get privacy suckered when you don’t read lengthy Terms of Service documents – and then your data is sold to third-party companies. Or when privacy-related options are selected by default, like “send my usage statistics for analytics purposes”.

“Once you’ve been suckered in – sometimes by a combination of strategies like false hierarchy, nagging and obstruction – the severity depends on how they use your details,” Jieshan said.

“If it’s just to improve services, it can be useful. But if it’s to send targeted ads to vulnerable people, the impact can be damaging.” 

Jieshan recommends being especially careful when giving out personal and financial information. Remember these patterns as you browse and follow @darkpatterns on Twitter/X to stay up-to-date until the tool becomes available in the new year. 

Examples of deceptive patterns on common apps and websites.
