Wearable technology has quietly entered the university classroom, and researchers at the University of Notre Dame believe it is worth paying attention to now, before the technology matures.
A team there has been running hands-on experiments with smart glasses to explore how consumer devices might support teaching, research, and accessibility. The group includes Steve Varela, Director of Teaching and Learning Technologies; Elena Mangione-Lora, Senior Teaching Professor of Spanish; and Madeline Link, a third-year Ph.D. candidate in Medieval Studies.
Their focus is primarily on Meta’s Ray-Ban smart glasses, which combine hands-free video recording with emerging AI capabilities. The technology is not yet fully developed, but the team sees value in experimenting early to understand its potential in academic settings.
The project began with a practical question: could wearable video help address academic integrity concerns in laboratory environments? Hands-free recording could offer new ways to document student work or review procedures without interrupting the process.
This initial focus quickly expanded. As testing progressed, the researchers began asking broader questions about what the glasses could do and who might benefit from them.
Two features became central to their work: hands-free video capture and AI-powered scene analysis, which allows the glasses to describe what the wearer is looking at.
The discussion also raises broader questions about how wearable AI may be used beyond the university setting. To place these developments in a wider assistive technology context, HumanWare shared its perspective on the current role of smart glasses and where the technology may be heading.
“With smart glasses and AI tools becoming increasingly ubiquitous, we anticipate the development of new applications and products that will benefit a growing number of people with visual impairments. Both technologies will also contribute to improving the user experience through voice, natural conversations, and haptic feedback.
In academia and school systems, one can imagine professors conducting accessibility research involving wearable AI, and teachers training visually impaired students based on their experiences recorded with smart glasses, whether accessing documents or navigating complex outdoor environments.”
– François Boutrouille, Emerging Technology Leader at HumanWare
The most significant findings so far relate to accessibility. Madeline Link, who has been testing the glasses extensively, described how they can support everyday tasks for people with visual impairments.
She has used the glasses to identify food ingredients, check mail, and manage medications. In one example, she held up a piece of mail and asked the glasses to read it aloud. The AI analyzed the image and read the content.
In academic contexts, the potential extends further. Link has tested translation features using a Korean skincare product and a Latin text, and is exploring how the glasses might help navigate large library collections by capturing text from reference volumes that are not available digitally.
Another goal under development is using the glasses to capture printed text and convert it into a format compatible with braille displays. The researchers are also investigating how the technology might support engagement with visual elements such as footnotes, marginal notes, and manuscript handwriting.
That combination of portability, hands-free use, and AI interpretation is what makes the technology distinctive, according to HumanWare, a developer and distributor of assistive technology for blind and low-vision users that has begun selling Meta’s smart glasses as part of its product line.
“In accessibility technology, the real value of hands-free video and AI is context. These tools allow users to capture what’s happening around them and get meaningful information without stopping what they’re doing.
For many blind and low-vision users, that shift, from needing assistance to accessing information independently, can make a huge difference.”
What interests HumanWare’s team is not only what smart glasses can do independently, but how they might complement existing assistive technologies used by people who are blind or have low vision.
Traditional screen readers depend on structured digital content, which limits access to graphical documents or scanned materials. AI-powered glasses could help bridge that gap by capturing and interpreting content that current tools cannot access.
“Smart glasses with AI could be used to capture text, say, the page of a book, and translate it to a braille format that could then be represented on a braille display. That would be one step closer to sighted reading for a blind person.”
– Dr. Louis-Philippe Massé, Vice President, Product Innovation and Marketing at HumanWare
Dr. Louis-Philippe Massé notes that combining AI with wearable hardware could enable navigation of unstructured documents and real-time delivery of content to braille displays, creating new possibilities for access.
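The capture-to-braille pipeline Massé describes has a final step that can be sketched in code: once the AI has extracted text from an image, each character must be mapped to a braille cell before it can be sent to a display. The minimal sketch below converts letters and spaces to uncontracted (Grade 1) braille using Unicode braille patterns; it is an illustration only, not HumanWare's or Meta's actual implementation. Production systems typically rely on full translation libraries such as liblouis, which handle contractions, numbers, and punctuation across many languages.

```python
# Minimal sketch: map captured text to uncontracted (Grade 1) braille
# Unicode cells, the kind of output a braille display could render.
# Illustrative only -- real pipelines use translation tables (e.g.
# liblouis) and handle contractions, digits, and punctuation.

# Dot positions (1-6) for the first decade of letters (a-j).
_DECADE = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5),
    "i": (2, 4), "j": (2, 4, 5),
}

def _letter_dots(ch: str) -> tuple:
    """Derive dot patterns for a-z from the first-decade patterns."""
    if ch in _DECADE:
        return _DECADE[ch]
    if "k" <= ch <= "t":                      # second decade: add dot 3
        return _DECADE[chr(ord(ch) - 10)] + (3,)
    if ch == "w":                             # w is irregular
        return (2, 4, 5, 6)
    if ch in "uvxyz":                         # third decade: add dots 3 and 6
        base = {"u": "a", "v": "b", "x": "c", "y": "d", "z": "e"}[ch]
        return _DECADE[base] + (3, 6)
    raise ValueError(f"unsupported character: {ch!r}")

def to_braille(text: str) -> str:
    """Convert letters and spaces to Unicode braille cells (U+2800 block)."""
    cells = []
    for ch in text.lower():
        if ch == " ":
            cells.append("\u2800")            # blank braille cell
        else:
            dots = _letter_dots(ch)
            # Each dot d sets bit (d - 1) above the U+2800 base codepoint.
            cells.append(chr(0x2800 + sum(1 << (d - 1) for d in dots)))
    return "".join(cells)

print(to_braille("read me"))
```

A refreshable braille display would receive these cell values (or their equivalent dot patterns) rather than rendered glyphs, which is why the encoding step, not the font, is what matters in the pipeline.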
The Notre Dame team is also exploring how smart glasses could support language learning. Elena Mangione-Lora has been testing the glasses for real-time translation during Spanish conversations. The goal is to support language practice and cultural exchange by helping learners follow spoken dialogue without interrupting interaction. This use case remains at an early stage. The technology can feel cumbersome, and its accuracy is not yet sufficient for high-stakes instruction, but it highlights potential areas for development.
The use of wearable cameras in classrooms raises ethical concerns. The researchers highlighted questions around awareness, consent, and data control: who knows when recording is taking place, who owns the footage, and how sensitive information is protected. Because smart glasses can record continuously and discreetly, institutions must establish clear policies around consent, data storage, and appropriate use before adopting them in educational settings.
Despite current limitations, the team’s position is clear. Universities that wait for fully developed solutions risk missing the opportunity to influence how the technology evolves. By experimenting now, researchers can identify practical applications, document limitations, and contribute to shaping tools that better serve students and educators. HumanWare’s involvement reflects a similar approach: placing emerging technology in real-world contexts and learning from how it is used.
“Our collaboration with Meta reflects HumanWare’s broader mission, taking emerging technologies like the Ray-Ban Meta Smart Glasses and ensuring they work within the assistive technology ecosystem that blind and low-vision users rely on every day.”
– Mathieu Paquette, Product Manager at HumanWare
Smart glasses may not transform higher education overnight, but ongoing research is already revealing new possibilities for accessibility, teaching, and academic work.