
AI-enabled Smart Glasses Help the Blind and Visually Impaired

How AI technology can be optimized to help the visually challenged

Assistive technology company Envision has unveiled its latest version of Smart Glasses to help those with no or low vision see better with the power of AI.

Showcased at a recent California State University Northridge (CSUN) conference, the Smart Glasses use AI to organize different types of information from visual cues and verbally relay that information to the user. The wearable tech reads documents aloud, identifies acquaintances, finds missing items in the house, and helps the wearer use public transportation.

The latest version is an enhanced model of the eyeglasses that debuted at the 2020 CSUN conference. Since then, Smart Glasses have rolled out globally and been trialed in over 20 countries.

“By analyzing real-time user data and direct feedback from across our communities, we can constantly enrich the Envision experience and innovate our products,” said Karthik Kannan, co-founder of Envision.

The improved version incorporates several features with enhanced functionalities.

  • Accurate text reading: Smart Glasses can read and translate digital and handwritten text from various sources, including computer screens, posters, barcodes, timetables, and food packaging.
  • Optimized Optical Character Recognition (OCR): The Envision Glasses and apps interpret tens of millions of data points for accurate image capture.
  • Third-Party App Integration: Envision created an app ecosystem, making it easier for its software to integrate with external services, such as outdoor and indoor navigation. It can also recognize over 100 currencies with the Cash Reader app.
  • Ally function: A secure video calling capability allows users to ask for help from contacts, using both Wi-Fi and mobile networks.
  • Language Capabilities: Four new Asian languages were added, bringing the total number of supported languages to 60 with an internet connection; 26 languages are supported offline.
  • Layout Detection: Smart Glasses can contextualize a document, making it less confusing for the user to read a food menu, newspaper, road sign, or poster.

Envision’s tech is compatible with Android and iOS systems, and can be integrated with Google Glass.

“Our mission is to improve the lives of the world’s two billion people who are blind or visually impaired by providing them with life-changing assistive technologies, products and services,” said Kannan.

Written by AI Business and republished with permission.



MEDIA CONTACT

Reach out to us at [email protected]

