In this three-part series, hosted by blind technology journalist Fern Lulham, listeners were introduced to a range of accessibility topics, including coverage of our own exciting project!
In the first episode, Mariana joined Fern to talk about the project itself and its real-life applications in broadcasting. Fern explained to her listeners that traditional audio description (AD) gives visually impaired users a basic spoken description of events happening on screen. Mariana described how Enhanced Audio Description (EAD) methods, which combine sound effects, spatial audio rendering and first-person narration, can provide an alternative to traditional accessibility practices. They discussed how spatial audio can enhance the experience by changing where sound effects and voices appear to come from, allowing listeners to paint a better picture in their mind of a setting and of where the people and objects within it are located. They also discussed the use of additional sound effects and of first-person narration, which can help convey important actions or gestures.
Fern was keen to understand how users can access this technology, so Mariana explained that end users only need a set of headphones plugged into their device, making EAD accessible to all. Fern also asked about feedback and uptake from the broadcasting industry, and happily Mariana was able to report that this has been extremely positive so far, with the EAD II team actively partnering with broadcasters to integrate EAD techniques into current productions.
Listen to the full report here. It starts 21 minutes in.
In episode two, Fern explored some of the technological developments underway around the globe designed to assist visually impaired people. She spoke to Pat Slade, a research assistant at Stanford University, about how only 50% of visually impaired people are able to leave the house independently. Pat and his colleagues are tackling this problem with some remarkable technological developments to the traditional white cane. Fern talked with Pat about an ‘augmented’ white cane which uses a number of sensors, a camera and GPS to detect obstacles, pick up environmental information and even find the way to your favourite shop or cafe! The research team hope the device will improve independence and mobility for visually impaired people, and aspire to produce it within the next 12 months at a cost affordable to all.
Listen to the full report here. It starts 20 minutes in.
In the final episode, Fern explored advances in eye care. She spoke to Dr Tommy Corn, an ophthalmologist from San Diego who uses the camera technology of the iPhone 13 Pro Max to support his patients with their eye care. The improved camera on this particular model lets him quickly take and store photos of his patients’ eyes, enabling him to easily track changes over time. He can even AirDrop the photos to his patients to keep for their own records. Before the iPhone 13 Pro Max, he used a large Nikon camera with a long lens and an incredibly bright flash, which often caused discomfort for patients. The iPhone is a cheaper and more time-efficient alternative, allowing doctors to track the eyesight of more people in less time. He hopes that future advances in technology and artificial intelligence (AI) will allow automatic diagnosis and treatment for some conditions, meaning that more and more people around the globe can access eye care.
Listen to the full report here. Fern’s section starts about 12 minutes 30 seconds in.
By Becky Shaw