Seeing themselves has been a dream of blind people for centuries, and now it is becoming a reality. In a major tech breakthrough, a company led by an Indian-origin CEO has created AI "mirrors" that help blind people see themselves in detail. Already popular among people with vision problems, the app, called Envision, uses image recognition and intelligent processing to give them access to information that was previously out of reach. It not only describes an image but also offers feedback, comparisons and advice.
Envisioning a new future
Envision, an AI company, is the mind behind this life-changing piece of tech and one of the first companies to use artificial intelligence in this way. Karthik Mahadevan, its chief executive officer, told the BBC that the effort began in 2017 and has developed over the years. “When we started in 2017, we were able to offer basic descriptions, just a short sentence of two or three words,” he said. Envision began as a mobile app that used character recognition to let blind people access information in printed text. Over the years it has built advanced AI models into smart glasses and created an assistant that helps blind people interact with the visual world around them. “Some use it for obvious things, like reading letters or shopping, but we were surprised by the number of customers who use it to do their makeup or coordinate their outfits,” the CEO added.
How are people using the mirrors?
Lucy Edwards, a blind content creator, told the BBC how life-altering the new information is, given that she had sight for 17 of her 30 years. “It feels like AI is pretending to be my mirror,” she told the outlet. “The truth is that I haven’t had an opinion about my face for 12 years. Suddenly I’m taking a photo and I can ask AI to give me all the details, to give me a score out of 10, and although it’s not the same as seeing, it’s the closest I’m going to get for now.” Envision is not the only such tool. Milagros Costabel told the outlet that she uses an app called Be My Eyes to upload a photo of herself after her skincare routine and check whether her skin looks the way she wants it to. “Your skin is hydrated, but it definitely doesn’t look like the almost perfect, glass-like skin with non-existent pores seen in beauty ads,” the AI once told her.
AI and body image
How the use of such AI tools might affect blind people is yet to be thoroughly researched. “We have seen that people who seek more feedback about their bodies, in all areas, have lower body image satisfaction,” said Helena Lewis-Smith, an applied health psychology researcher focused on body image at the University of Bristol. “AI is opening up this possibility for blind people.” According to experts in body image psychology, the results of AI tools may not always be positive, partly because many of them are trained to idealise Western standards of body shape. “The AI’s processing can return a photo with a lot of changes that make the person look totally different, implying that all of this is what they should change, and therefore that the way they look now is not good enough,” added Lewis-Smith. “In psychological literature, rather than how a person looks, we understand that body image is not one-dimensional and is made up of several factors, such as context, the type of people we want to compare ourselves to, and the things we are capable of doing with our bodies,” explained Meryl Alper, a researcher on media, body image and disability at Northeastern University in Boston, US. “All of this is something that AI does not understand and will not take into account when making its descriptions.”
Trusting the new AI
According to Mahadevan, his AI product learns the user’s preferences and desires and gives them the information they want to hear. For Edwards, that is a double-edged sword: one could ask the app to describe oneself romantically and get the desired result, but one could also ask whether one’s hair is messy and be handed tips to fix it. Hallucinations, where AI models mislead users by passing off inaccurate information as true, are also a problem. “At first, the descriptions were very good, but we noticed that many of them were inaccurate and changed important details, or invented information when what was in the image didn’t seem to be enough,” explains Mahadevan. “But the technology is improving by leaps and bounds, and these errors are becoming less and less common.” Even so, the gains outweigh the losses. “We’re going to take it as a positive thing because even though we don’t see visual beauty in the same way that sighted people do, the more robots that describe photos to us, guide us, and help us with shopping, the happier we’ll be. These are things we thought we’d lost and now technology allows us to have,” added Edwards. The mirrors are there; whether or not to use them is each person’s decision.
