1. UX design & its impact on our mental health
As internet addiction and instant gratification continue to rise, it’s crucial that we not only understand, but also try to combat, the negative impacts that online experiences and UX design can have on our users’ mental health.
Companies are using the same techniques found in gambling to trigger releases of dopamine, promoting unhealthy addictions through a false happiness, be it through apps, websites, Alexa Skills or any other platform that provides a digital experience for users.
For me, one of the most ethically interesting areas in design is social media, where it is common to see features that users not only want but expect, such as near-endless feeds, even though these features have huge negative impacts on our mental health and well-being.
These unethical tactics are known as dark patterns.
From false notifications to hidden items in your basket and even purposefully misplaced unsubscribe buttons: there are a lot of dark patterns out there, and every single one of them has a negative impact on mental health.
These patterns are perhaps most prevalent in the likes of social media and e-commerce platforms, but they can be found in most of today’s digital experiences, where screen-time and monetisation models are all set up to keep us scrolling, tapping and swiping our lives away, with no interest in the impact this has on us long-term.
Algorithmic Dark Patterns & Echo-chambers
Dark patterns aren’t just present in component-based features within an interface; they also occur algorithmically. The algorithms that social networks, e-commerce sites and most other digital platforms use to serve content to users are extremely clever, but equally ethically challenging.
As you like, favourite, subscribe or even purchase on these platforms, you are informing these algorithms of your interests. The algorithms can then serve you native adverts that spookily show up across all your websites and apps. Sound familiar?
These algorithms in turn form echo chambers. The idea of an echo chamber is that as you subscribe to things you are interested in, the algorithms slowly serve you more and more content on those topics.
The problem is that you end up being served the most extreme views on a topic, as the media and products served to you come from the powerhouses in charge of that sphere of media. This dictates a circle of influence that we end up following.
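This feedback loop is easy to see in miniature. The sketch below is a toy illustration, not any platform’s real algorithm: every engagement signal boosts a topic’s weight, so the simulated feed steadily narrows toward whatever the user already likes. The topic names, the 1.1 boost factor and the round count are all invented for the example.

```python
import random

random.seed(42)  # deterministic for the illustration

# Each topic starts with an equal chance of appearing in the feed.
topics = {"cooking": 1.0, "politics": 1.0, "sport": 1.0, "travel": 1.0}

def recommend(weights):
    """Sample a topic proportionally to its current weight."""
    total = sum(weights.values())
    return random.choices(list(weights), [w / total for w in weights.values()])[0]

# Simulate a user who only ever engages with "politics".
for _ in range(200):
    topic = recommend(topics)
    if topic == "politics":   # the like/favourite acts as an engagement signal
        topics[topic] *= 1.1  # ...and the algorithm boosts that topic

share = topics["politics"] / sum(topics.values())
print(f"politics share of the feed: {share:.0%}")
```

Run it and the one topic the user engages with ends up dominating the feed weights, which is the echo chamber in a nutshell: the algorithm isn’t malicious, it’s just optimising for engagement with no counterweight.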
This isn’t helped by the fact that the entire model of the like is designed to induce dopamine hits, which massively promotes media toxicity as people’s need for approval, from those they aren’t close to or have never even met, increases. And that’s without mentioning that it opens a playground for cyberbullying and internet trolls.
The worst part? Jakob’s Law states that users spend most of their time on other platforms, so when a new platform arises we want it to be consistent with what we are used to, meaning we welcome these addictive dark patterns into new platforms and experiences.
That being said, there are ways we can use such patterns for good, for example promoting healthy eating in what I call positive echo-chambers.
What can designers do to avoid using dark patterns?
Our first responsibility as designers, if we want to avoid dark patterns and unethical designs, is to research them and make sure we are aware of them, looking at resources such as Dark Patterns.
There are a lot of design resources out there. One of the best I’ve come across is IDEO’s Human-Centred Design Toolkit, but I’ve always come back to the more bitesize Laws of UX: a set of guiding principles that outline ways design can be used to influence our users.
We can follow the guidelines of ethics communities such as AIGA, and study cognitive behavioural science to better understand the psychological implications of our designs, such as the Hook Model and Manipulation Matrix from Nir Eyal’s book Hooked, which give designers the power to build habit-forming products.
We can perform ethical passes by asking ourselves questions about our designs, such as the ones outlined by the UX Collective on Dark Patterns:
- Is it hard for the user to get out of a certain situation (e.g. a subscription plan)?
- Does the design use confusing colours to influence the user’s decision?
- Does the design use trick questions to manipulate the user?
- Does the design mislead the user’s focus?
- Does the design use guilt-tripping copy to influence the user?
User testing and user interviews are often the most important stages in digging out usability and ethical issues in a design. The problem is that dark patterns often go unnoticed by users in these situations, because users don’t realise they are there. These dark patterns have become accepted and, worse, expected by users, because they are what users are used to.
Where have I had to offer an alternative to dark patterns?
I recently got to work on an inspiring project around mental health and mobile apps, where balancing business goals with the user’s best interests was an extremely difficult and ethically challenging thing to get my head around.
One question I spent hours debating with myself was the use of gamification: creating an experience that I knew would have a hugely positive impact on users, even if that experience itself used tactics often associated with dark patterns.
Another was how we could scale the platform to reach as many users as possible and provide the best experience we could, while finding the resource and budget that would make that very thing possible in the first place.
Most of these debates were internal, and in actual fact the choices we made with the client always had the user’s best interests at heart. But it did get me thinking about how other brands would handle the situation, and I’m sure some of their outcomes would have been the polar opposite.