Until fairly recently, the deaf were the forgotten users of the web. By contrast, great hullabaloos were made over the last decade and a half about the needs of the visually impaired. Browsers began to introduce zooming and on-the-fly text resizing, and screen readers such as JAWS have been undergoing steady development for years.
If you’re looking for information about ‘accessibility’ on the internet, chances are you’ll find that most of it concerns the needs of those with visual problems – closely followed by those with motor difficulties. As a result, there is a raft of conventions and technologies out there for anyone wishing to make their content accessible to the blind, and almost nothing for the deaf.
At first pass, you could be forgiven for thinking that the internet is actually the perfect medium for deaf people. Because the web was largely built on textual communication, there seemed to be little need to take the deaf into account at all.
But that was the old web. The web of usenet and forums and blogs. The web where the word was king. Today, streaming media has finally become a reality. From YouTube to Spotify, Joost to Hulu, content is increasingly served as audio and video. The attractions are obvious. We do our best communication face to face, and the nuances of extemporised speech make points in a way that only the best of writers can emulate in text. When Google decided to start answering questions from its users, the format it chose was spoken-to-camera video on its popular Webmasters YouTube channel. And when President Obama called Kanye a ‘jackass’, the story only really went into hyperdrive with the release of the audio, which turned dry reportage into something real and immediate.
Suddenly, a new frontier in web accessibility is opening up – and the deaf are on the wrong side of that gulf.
Does the answer lie with the BBC?
Currently, there is very little provision for the deaf to interact with video content in particular. As video blogging takes off, the problem is only going to get worse. Whilst lip-reading on television or in person might be second nature to many deaf people, the relatively poor resolution of much online video makes it unclear how well that skill will transfer to the web. And where voiceover narration is used to contextualise the imagery, lip-reading will be no use at all.
Now the BBC has announced that its own in-house video player – the iPlayer – will be almost fully subtitled by the end of this year, with further improvements mooted for 2010. This includes live streaming content. The aim is ambitious, as the Beeb concede:
“Improved live subtitle synchronisation – live subtitles on iPlayer, at present, are based on those from broadcast TV and we are still working on ensuring that the time-lag between speech and subtitles, which is a limitation of the current live subtitling broadcast process and the current online repurposing process, is reduced as much as reasonably possible to improve the experience of watching live subtitles online”
This technology is proprietary and currently only available to UK television licence payers who watch BBC TV, but an important milestone has been reached: it is technically feasible to bring subtitled video content to deaf people online.
Will this cross over into the mainstream and become an automated service on sites like YouTube? The answer is probably “yes”, but not for a long time. Transcribing audio into text automatically has proven a very difficult hurdle for technologists to clear, although there are plenty of companies developing solutions as we speak.
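Even if automatic transcription does arrive, the recogniser’s output still has to be packaged into a subtitle format that video players understand. As a rough illustration – assuming hypothetical recogniser output in the form of (start, end, text) segments, which is not any particular vendor’s API – here is a minimal sketch that writes cues in the widely used SubRip (.srt) format:

```python
def srt_timestamp(seconds):
    """Format a time in seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    total_ms = int(round(seconds * 1000))
    hours, rem = divmod(total_ms, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{ms:03d}"

def segments_to_srt(segments):
    """Turn (start, end, text) segments -- e.g. from a hypothetical
    speech recogniser -- into SubRip (.srt) subtitle cues."""
    cues = []
    for i, (start, end, text) in enumerate(segments, start=1):
        cues.append(
            f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(cues)

# Invented example segments: start/end times in seconds plus spoken text.
segments = [
    (0.0, 2.5, "Welcome to the programme."),
    (2.5, 5.0, "Tonight we look at web accessibility."),
]
print(segments_to_srt(segments))
```

The hard part, of course, is producing accurate segments in the first place; formatting them, as the sketch shows, is the easy bit.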
So the gulf has finally been acknowledged, and technology is coming up to speed in some ways, but we are still a long way from utopia. With the weight of organisations like the BBC behind it, though, a truly accessible web for all is tantalisingly close.