In this series we explore the most ethically challenging areas of design and the dilemmas we face in balancing user interests with commercial goals.
From dark patterns promoting addictive behaviour to the duality of ethics and morality, UX designer Noah Abbott shares his experiences.
Testing with users who have varying experience with technology
Designing, especially in the Tech for Good sector, presents us with many hugely interesting challenges. Getting usability and user experience right is paramount if we are to create products that help change the world.
But how do you design an app and test it with someone who has never used a phone before, who doesn’t speak the same language as you, lives on the other side of the world from you and comes from a completely different culture?
This is a challenge we face daily at 3 SIDED CUBE, and it is something we are always learning more about and refining.
Every project begins with ensuring that you have done your research. You need to understand not only the problem you are solving but why you are solving it and who you are solving it for.
As for who you are solving it for, you need to extend your research to their culture and their day-to-day activities outside of your product’s exposure to them. This is particularly important when you are user testing, to ensure polite and agreeable communication.
Dealing with low tech proficiency and training
For users of our apps who have very little tech proficiency and low literacy levels, we have to ensure that introducing a new feature doesn’t mean massively changing the interface each time.
This matters because, for those with lower tech proficiency, every change to the app can feel like learning something brand new all over again. One thing we do to keep this disruption to a minimum is follow a pattern we refer to as ‘UX Vision’.
How to handle design debt and build back better
A problem we are often faced with is taking on and building upon an existing app or website that is either years out of date or whose design and build quality wasn’t well thought out from the beginning.
This creates a strong sense of design debt, where features have been bolted on here and there and older versions of the designs get carried through. Design debt not only ages a user interface; the inconsistencies it carries make interfaces much less comprehensible and more difficult to navigate, and break all sorts of accessibility guidelines.
Using ethnographic observation
Ethnographic observation is a method we can use to put ourselves in the shoes of our users, by conducting user research in environments where people would actually be using our product or service.
We can experience first-hand the problems our users experience and try our solution in the same context, by looking at our users’ behaviour and the role our online or digital experiences play in their wider lives.
Combining this with contextual observation to create our user journeys, we gain a far more comprehensive understanding of our users’ behaviour, interests, pain points and reasons for interacting with different digital platforms.
Running an ethical user testing workshop
We have devised a user testing planning workshop that allows us to work as a team to deliver a strong itinerary for testing that helps avoid planning bias.
As always, we start with the why and build up to the what, making sure we plan the testing sessions and user interviews to capture information that is useful to as many team members as possible, to avoid questions being asked further down the line.
We start by setting objectives for the testing sessions. We then work out the wireframes, prototypes and resources we will need, followed by the tasks and questions necessary to achieve those objectives. Finally, we map this out into a comprehensive plan that we can share with our clients and among the team.
We have two main categories of testing: qualitative and quantitative. Qualitative sessions are run as guided discussions working through concepts, wireframes, prototypes or live products, and we report on user feedback, engagement and interactions.
Quantitative sessions focus on metrics and hard data: we report on the effectiveness, efficiency and satisfaction of completing tasks using trackable figures that can later be compared against, alongside the qualitative user feedback, engagement and interactions.
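To make the idea of trackable figures concrete, here is a minimal sketch of how task-level results from a quantitative session might be summarised. This is purely illustrative: the data shape, participant figures and function name are invented, not part of 3 SIDED CUBE’s actual tooling or process.

```python
# Hypothetical sketch: summarising quantitative usability-test results.
# All task data below is invented for illustration.
from statistics import mean

def summarise_task(results):
    """results: list of dicts with 'completed' (bool) and 'seconds' (float).

    Returns effectiveness (task success rate) and efficiency
    (mean completion time among successful attempts).
    """
    success_rate = sum(r["completed"] for r in results) / len(results)
    times = [r["seconds"] for r in results if r["completed"]]
    mean_time = mean(times) if times else None
    return {"success_rate": success_rate, "mean_time_s": mean_time}

# Four invented participant attempts at one task.
sessions = [
    {"completed": True, "seconds": 42.0},
    {"completed": True, "seconds": 58.5},
    {"completed": False, "seconds": 120.0},
    {"completed": True, "seconds": 47.2},
]
print(summarise_task(sessions))
```

Figures like these can be stored per testing round and compared against later rounds, which is what makes quantitative sessions useful for tracking whether a redesign actually improved task completion.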