We’re lucky to have some funding from Impetus to have a go at making a citizen science game that’s more accessible for people with little or no sight, and this post is to document the first steps in that process.
Nergal is the game we’re trying this out on - it’s an open game world to see how people’s social decision-making affects how diseases spread. We know that as humans, our senses differ from each other, and our social cues and social networks also differ. All these things will affect the decisions we make, and can alter our risk of catching or spreading diseases. So, there are very real public health implications to making this game work for a broader group of people.
We’re currently building the screen-based version of Nergal with epidemiologists Dr. Matthew Silk and Nitara Wijayatilake at the University of Edinburgh, and we’re hoping to have that ready around Autumn 2024. Development of a more accessible version will happen in parallel, aiming to be ready around the end of 2024.
We need to do some thinking on where to prioritise our efforts, as this is a scarily short project with very limited funding. The lowest-hanging fruit is probably to tweak the game design so it has bigger text, higher contrast, that kind of thing, but that’s only going to help a small number of people. The W3C guidance on web design for low vision is a good starting point for this, and we have a straightforward list of things to do to make a somewhat more accessible screen-based version of Nergal, including:
- Replacing our twiddly font with a standard one, possibly in bold
- Increasing spacing between text lines
- Left justification of text
- Shorter lines of text
- Leaving a longer time for reading text
- Making everything zoomable in and out
- Using black and white only for high contrast
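If the screen-based version ends up running in a browser (an assumption on our part, not a confirmed detail), most of that list boils down to a handful of style properties that could be set from code. A minimal sketch, with all names and numbers our own:

```typescript
// Minimal sketch: applying the low-vision tweaks above to a browser build.
// Assumes the game's text lives in ordinary DOM elements; a canvas renderer
// would need equivalent settings in its own text-drawing code.
interface LowVisionSettings {
  fontFamily: string;    // standard font instead of the twiddly one
  bold: boolean;
  lineHeight: number;    // extra spacing between lines of text
  maxLineChars: number;  // shorter lines of text
  highContrast: boolean; // black and white only
  textScale: number;     // zoom factor, adjustable by the player
}

function applyLowVisionSettings(root: HTMLElement, s: LowVisionSettings): void {
  root.style.fontFamily = s.fontFamily;
  root.style.fontWeight = s.bold ? "bold" : "normal";
  root.style.lineHeight = String(s.lineHeight);
  root.style.textAlign = "left";                 // left justification
  root.style.maxWidth = `${s.maxLineChars}ch`;   // cap line length in characters
  root.style.fontSize = `${s.textScale * 100}%`; // player-controlled zoom
  if (s.highContrast) {
    root.style.color = "black";
    root.style.backgroundColor = "white";
  }
}

// Example defaults a player could then adjust in a settings menu.
applyLowVisionSettings(document.body, {
  fontFamily: "Arial, sans-serif",
  bold: true,
  lineHeight: 1.5,
  maxLineChars: 60,
  highContrast: true,
  textScale: 1.25,
});
```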
But - the more interesting and fun thing to do would be to see if we can make the game work entirely through sound. This would also have the benefit of being more accessible for more people. It’s also quite difficult.
As always, talking to the people you’re making something for is crucial before even starting - otherwise you’ll inevitably end up making something pretty useless. We teamed up with iSight Cornwall (a brilliant charity that helps people with sight loss) to write the grant application to Impetus, with a view to working together on this. We’ve just held our first workshop with staff from iSight and a fab group of their clients with varying levels of sight - including no sight, severe sight impairments, light perception only, or some vision. (A little note here for funders, researchers and others wanting to do similar work: we had to ask for special permission to be allowed to pay participants for their time. It’s unbelievable that this still isn’t standard, so always argue for it - the more we all do, the more likely the culture is to change.)
This is the lovely bunch of people who came (plus one behind the camera! photo by iSight Cornwall):
The workshop conversation was so lively and fruitful - we took lots of notes, which are listed anonymously below in the hope that they might also be useful for others attempting similar things:
Controlling the game:
- Some things use ‘virtual joysticks’, which usually sit in the corner of touchscreens, sometimes with haptic feedback. These are generally rubbish - as is anything touchscreen-based if you don’t have much sight.
- Things like Xbox controllers are good and can work on a PC; keyboards and mice too, of course.
- Voice control is great, and opens things up to people with other disabilities too.
- Some games play a bump sound/vibration when the player moves to a new grid square, which is helpful for orientation.
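As a rough sketch of that last idea, again assuming a browser build and a controller that supports rumble (the grid size, dead zone and speeds are all invented):

```typescript
// Sketch: a bump sound plus rumble when the player crosses into a new
// grid square.
const CELL = 1; // one grid square in world units
const pos = { x: 0, y: 0 };
let lastCell = { x: 0, y: 0 };

function pollGamepad(playBump: () => void): void {
  const pad = navigator.getGamepads()[0];
  if (pad) {
    const [dx, dy] = pad.axes; // left stick
    if (Math.abs(dx) > 0.2) pos.x += dx * 0.05; // dead zone avoids stick drift
    if (Math.abs(dy) > 0.2) pos.y += dy * 0.05;

    const cell = { x: Math.floor(pos.x / CELL), y: Math.floor(pos.y / CELL) };
    if (cell.x !== lastCell.x || cell.y !== lastCell.y) {
      lastCell = cell;
      playBump(); // short audio cue for orientation
      // Haptic feedback where the browser and controller support it.
      pad.vibrationActuator?.playEffect("dual-rumble", {
        duration: 80,
        strongMagnitude: 0.6,
        weakMagnitude: 0.3,
      });
    }
  }
  requestAnimationFrame(() => pollGamepad(playBump));
}
```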
Audio for the game:
- We might need a tutorial for the sound effects, e.g. ‘this sound means this thing’.
- Audio descriptions of the environment are great for lots of people.
- Atmospheric sounds instead of voice descriptions help make it less overloading and a more natural experience: you can hear multiple things at once and decide for yourself which to pay attention to. The balance is important.
- Lots of sound effects make it more interesting.
- The world could describe itself to you (this was popular!)
- The game can tell the player’s usual screen reader to turn off while it’s running. It’s often better for the game to work essentially as its own bespoke screen reader, since that will actually work - most screen readers won’t, as they vary so much in how they behave that building something compatible with all of them is difficult (see the sketch after this list).
- Use different voices, not robotic ones (one person liked to speed up a voice reader to make it sound happier and less robotic).
- Subtitles are important for some people.
- Players could pick their voice (instead of their hat, which we use in the screen-based game so players can tell who they are amongst a sea of identical-looking characters).
- For navigation, it could say ‘you are by…’, and/or every area could be described as a grid - you only hear about things close to you on the grid.
- Include sounds of birds, trees rustling, rain, wind.
- Can have a key to press to describe the surroundings - use a clock face to describe where things are rather than left/right.
- Could have a menu option to describe other things about the current situation, like energy level/health.
- Could have music when you get close to someone.
- Character could narrate their own health condition.
- 3D audio can be used (see the cough sketch further down).
- A 2D world should be easier to understand than a 3D first-person one.
- Character says things like “oops” when walking into things, or “I went the wrong way”.
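A few of these ideas - the game voicing itself, players picking a voice, clock-face directions, only hearing about things nearby - fit together nicely. Here’s a minimal self-voicing sketch using the browser’s built-in speech synthesis; the wording and distances are our own invention:

```typescript
// Sketch of the game acting as its own bespoke screen reader, using the
// browser's Web Speech API. The chosen voice doubles as the player's identity.
function speak(text: string, voice?: SpeechSynthesisVoice): void {
  const u = new SpeechSynthesisUtterance(text);
  if (voice) u.voice = voice; // pick from speechSynthesis.getVoices()
  u.rate = 1.1; // a touch faster can sound happier, as one participant noted
  speechSynthesis.speak(u);
}

// Clock-face bearing from the player to something in the world,
// assuming the player faces "up" the map (negative y).
function clockFace(dx: number, dy: number): string {
  const angle = Math.atan2(dx, -dy); // radians, clockwise from straight ahead
  let hour = Math.round((angle / (2 * Math.PI)) * 12);
  hour = ((hour % 12) + 12) % 12;
  return `${hour === 0 ? 12 : hour} o'clock`;
}

// A "describe surroundings" key: only mention things on nearby grid squares.
function describeNearby(
  player: { x: number; y: number },
  things: { name: string; x: number; y: number }[],
  voice?: SpeechSynthesisVoice,
): void {
  for (const t of things) {
    const dx = t.x - player.x;
    const dy = t.y - player.y;
    if (Math.hypot(dx, dy) < 5) { // invented radius
      speak(`You are by ${t.name}, at ${clockFace(dx, dy)}`, voice);
    }
  }
}
```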
What noises could a character make when they are ill?
- Dragging feet
- Stuffy nose voice
- Sniffing
- Cough, getting louder as they approach (see the sketch after this list)
- Scratching skin
- Everything visible should have a sound (e.g. sneezing - the sneeze particles could have their own sound)
- There should be some subtlety so you don’t know immediately that someone is ill (in the screen-based version there is a brief sad animation before a character starts sneezing)
- Periodic warnings, like diseases in Pokémon, where a character walks a few steps then gives a warning sound (which also slows them down)
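As a sketch of the cough-that-gets-louder idea (and of 3D audio generally), the Web Audio API’s panner can do the distance attenuation for us. The file name and distances here are placeholders:

```typescript
// Sketch: a looping cough placed in 3D space with the Web Audio API, so it
// gets louder (and pans) as the sick character approaches the player.
// "cough.ogg" is a placeholder; the listener sits at the origin by default,
// so here the player stays at (0, 0) and the world moves around them.
const ctx = new AudioContext(); // may need ctx.resume() after a user gesture

async function attachCough(character: { x: number; y: number }): Promise<void> {
  const resp = await fetch("cough.ogg");
  const buffer = await ctx.decodeAudioData(await resp.arrayBuffer());

  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",     // convincing 3D over headphones
    distanceModel: "inverse", // louder as the source gets closer
    refDistance: 1,
    maxDistance: 50,
  });
  panner.connect(ctx.destination);

  const src = new AudioBufferSourceNode(ctx, { buffer, loop: true });
  src.connect(panner);
  src.start();

  // Keep the sound glued to the character's position each frame.
  const update = () => {
    panner.positionX.value = character.x;
    panner.positionZ.value = character.y; // 2D world mapped onto the audio XZ plane
    requestAnimationFrame(update);
  };
  update();
}
```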
Science questions - things the researchers need to think about:
- Do we need multiple diseases, perhaps with some worse than others?
- Do we need differences in immune systems, so some characters are more/less likely to get unwell?
Other ideas for the game:
- The player could get sad if they don’t chat to others.
- The cost of catching a disease maybe shouldn’t be too obvious.
- There should be a consequence of infecting someone else.
- Players could move slower when sick.
- Could players wear a mask/wash their hands (or feet, since Nergals don’t have hands!)?
- Could start with a set number of points and choose how many to put into sociability versus immunity (sketched after this list).
- Could aim to find a cure - having to talk to a certain number of people to find out where the pharmacy is, for example.
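None of this is settled design, but as a sketch of how the points-splitting idea might meet the researchers’ questions about disease severity and immune differences (every number and formula below is invented for illustration):

```typescript
// Hypothetical sketch only - none of this is settled game design.
interface Disease {
  name: string;
  severity: number;         // 0..1, how bad it is for you
  transmissibility: number; // base chance of passing it on per chat
}

interface Nergal {
  sociability: number; // points here make chatting more rewarding
  immunity: number;    // points here reduce the chance of getting unwell
  infectedWith?: Disease;
}

// The player splits a starting budget of points between the two traits.
function makeNergal(points: number, intoSociability: number): Nergal {
  return {
    sociability: intoSociability,
    immunity: points - intoSociability,
  };
}

// Chance of catching a disease from one chat with an infected character.
function infectionChance(target: Nergal, disease: Disease): number {
  // Invented formula: more immunity points, lower chance.
  return disease.transmissibility / (1 + target.immunity);
}
```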
Example games that the group mentioned:
- The Vale: Shadow of the Crown - nothing on the screen at all, everything narrated or done through sound. It’s Xbox-controlled and tells you what to press (e.g. right to fight, left to shield); you can hear things like a river getting louder as you get closer, or a person limping, and there’s a helper character.
- The Last of Us Part II (PlayStation - its vision accessibility mode, where the screen reader actually works)
- What Remains of Edith Finch (the atmospheric noises are good)
- A Blind Legend (free; the blind main character moves with his daughter, who tells him which way to go)
- Among Us
- Subnautica
- The Stanley Parable
- Tabletop Warhammer adapted with a grid, so it can be played a bit like Battleships
Other bits and bobs:
- Netflix audio descriptions are great, not intrusive, don’t go over the speech, just give the right amount of info.
- One person described their kids making a clicking noise when they smile; we tend to automatically start doing these things to communicate more fully.
- At their shooting club, the gun beeps as its aim approaches a target 11ft away - like a proximity sensor, getting louder or changing pitch depending on position.
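That last one maps straight onto game audio: a tone whose pitch and volume track your distance to a target. A minimal sketch, with invented numbers:

```typescript
// Sketch: a proximity tone like the shooting-club gun sight, rising in
// pitch and volume as the player closes on a target.
function proximityTone(ctx: AudioContext, maxDistance: number) {
  const osc = new OscillatorNode(ctx, { type: "sine", frequency: 220 });
  const gain = new GainNode(ctx, { gain: 0 }); // silent until updated
  osc.connect(gain).connect(ctx.destination);
  osc.start();

  // Call whenever the player or the target moves.
  return (distance: number) => {
    const closeness = Math.max(0, 1 - distance / maxDistance); // 0 far, 1 on target
    osc.frequency.value = 220 + closeness * 660; // up to 880 Hz when on target
    gain.gain.value = closeness * 0.5;
  };
}
```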
The group also talked a bit about their experiences at the height of the Covid pandemic, which gives insight into how experiences of disease outbreaks can differ for people without full sight:
- People getting angry at them for not social distancing, when they couldn’t see those people.
- Relying on people to help, e.g. if a bus driver helps you off the bus, they’re touching you and close to you, so there’s a higher risk of transmission.
- Needing to touch everything to feel your way around, especially early in the pandemic when we all thought we had to avoid touching things.
- Many just shielded because it was so difficult.
At the end of all the chatting we switched modes and had a play with three analogue synthesisers we’d been lent by Aphex Twin (no screens! all knobs and sliders - an ARP Odyssey, a Yamaha CS01 and a Roland SH-101). We didn’t quite have enough time for it on the day, but there’s scope here to make sounds together for the game.
It’s not really relevant to the game, but we were also shown a new system called VoxiVision - a really impressive bit of kit that uses machine vision to describe what its camera sees in remarkable detail and with remarkable accuracy. It can also do speech to text and then convert that text into another language to read out, scan barcodes to say what an item is, and magnify things. Very much worth seeking out if you’re interested in these things or could benefit from it, though it is pricey at the moment (around £1,700).
A huge thank you to everyone who came and for all these thoughts and ideas that will have our heads spinning for months. We’ll try to do it justice and make something interesting. Around November we’re hoping to have another workshop to try out what we’ve made and figure out what needs tweaking/improving. We absolutely loved that people were swapping numbers with each other, and hopefully there will be some new friendships as a side effect! We’ll leave you for now with our favourite, simple quote from the day: “People should open their eyes to blind people”.
[Last two photos from iSight Cornwall]
IMPETUS is supporting our project. IMPETUS is funded by the European Union’s Horizon Europe research and innovation programme under grant agreement number 101058677. Views and opinions expressed are, however, those of the author(s) only and do not necessarily reflect those of the European Union or the European Research Executive Agency (REA). Neither the European Union nor the granting authority can be held responsible for them.