I don’t usually post personal things on my blog, but this one I couldn’t keep to myself.
My son Lucas is 10 years old. He loves Python, he loves tinkering, and he has an enormous heart. So when he told me he wanted to build something to help visually impaired children “see the world better,” I set him up with a Raspberry Pi and a camera module and got out of the way.
What he built left me speechless.
Lucas designed AI-powered glasses that use a camera mounted on the frame, connected to a Raspberry Pi carried in a backpack, to help visually impaired children perceive their environment in real time. He wrote every line of code himself. In Python. At age 10.
The results?
1st Place at his school science fair.
1st Place at GARSEF (Greater Austin Regional Science & Engineering Fair).
Featured on KVUE News, where he was interviewed and got to explain, in his own words, why he built it and who it’s for.
(You can watch the interview here: https://www.kvue.com/video/news/local/texas-10-year-old-develops-ai-glasses-to-assist-visually-impaired-children/269-0b8c6cc5-b882-49d8-98f6-49276133b93b)
This project didn’t happen in a vacuum. Lucas stood on the shoulders of incredible technology and the people behind it, and I want to take a moment to recognize every single one of them.
Arducam: The camera module attached to Lucas’s glasses is from Arducam. Their tiny, powerful embedded vision hardware made it possible for a 10-year-old to build a wearable AI camera system.
Thank you to the whole team, including founder and CTO Lee Jackson, for making embedded vision accessible to makers of all ages.
Raspberry Pi Foundation: The brain of the operation literally lives in Lucas’s backpack. A Raspberry Pi processes the camera feed and runs his AI model in real time. The Raspberry Pi’s mission — putting computing in the hands of young people everywhere — is exactly what happened here.
Special shoutout to James Adams, CTO (Hardware) at Raspberry Pi, the man who has designed six of the seven flagship Raspberry Pi boards and whose work made this possible.
EMOTIV: Lucas was deeply inspired by the work EMOTIV is doing at the intersection of neurotechnology and human empowerment. The idea that technology can directly bridge cognitive and physical gaps between people is what motivated him to focus his project on accessibility.
Thank you to Tan Le, Founder & CEO of EMOTIV, a true pioneer whose vision of technology serving human potential speaks directly to what Lucas was trying to accomplish.
OpenCV: Lucas used the OpenCV computer vision library to process the camera feed and power the visual intelligence of the glasses. That a 5th grader could harness one of the most powerful computer vision frameworks in the world is a testament to how this library has democratized AI.
A huge thank you to Gary Bradski, founder of OpenCV, the library that powers everything from self-driving cars to a 10-year-old’s accessibility project in Austin, Texas.
Ultralytics (YOLO): If OpenCV is the eyes, Ultralytics YOLO is the brain that understands what those eyes are seeing. Lucas used Ultralytics to run real-time object detection, the AI model that looks at each camera frame and identifies what’s there: a person, a door, an obstacle, a sign. YOLO (You Only Look Once) is the reason the glasses can tell a visually impaired child what’s in front of them almost instantly, frame after frame.
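To give a flavor of what that per-frame detection step produces, here is a minimal sketch of the filtering logic that sits between the detector and everything downstream. This is not Lucas’s actual code: the (label, confidence) tuples and the 0.5 threshold are my assumptions about what a detector like YOLO typically returns for each frame.

```python
# Hypothetical glue logic: filter raw per-frame detections before acting
# on them. Each detection is a (label, confidence) tuple, standing in for
# what an object detector like YOLO emits; the threshold is an assumption.

def labels_to_announce(detections, min_confidence=0.5):
    """Keep confident detections, deduplicated, strongest first."""
    seen = set()
    kept = []
    for label, conf in sorted(detections, key=lambda d: d[1], reverse=True):
        if conf >= min_confidence and label not in seen:
            seen.add(label)
            kept.append(label)
    return kept
```

In a real pipeline, a filter like this keeps low-confidence noise (a 0.2-confidence “sign” flickering in one frame) from ever reaching the wearer.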
A huge thank you to Glenn Jocher, Founder & CEO of Ultralytics, who built the world’s most widely used object detection AI (100,000+ GitHub stars, 10,000+ research citations, 100 million+ downloads) and made it accessible enough that a 10-year-old in Austin, Texas could deploy it on a Raspberry Pi in a backpack.
pyttsx3: This is the library that literally gave Lucas’s glasses a voice. When the camera detects something, the glasses speak it aloud to the visually impaired user, instantly and reliably, without needing a Wi-Fi connection. That’s exactly what pyttsx3 does: offline, cross-platform text-to-speech in pure Python. A 10-year-old was able to make a wearable device talk because of this library.
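The last hop, from detected labels to a spoken sentence, can be sketched like this. It is a hypothetical helper, not Lucas’s code: the phrase format and the repeat-suppression are my assumptions about how such a device avoids announcing the same object every frame. In the real glasses the returned string would be handed to pyttsx3 via `engine.say(...)` and `engine.runAndWait()`.

```python
# Hypothetical glue between the detector and the speech engine.
# Builds a short spoken phrase from newly detected labels and skips
# anything already announced, so "person" isn't repeated every frame.
# On the device, the result would go to pyttsx3, roughly:
#   engine = pyttsx3.init(); engine.say(phrase); engine.runAndWait()

def build_announcement(labels, already_announced):
    """Return a phrase for labels not yet announced, or None."""
    new = [label for label in labels if label not in already_announced]
    if not new:
        return None
    already_announced.update(new)
    if len(new) == 1:
        return f"{new[0]} ahead"
    return ", ".join(new[:-1]) + f" and {new[-1]} ahead"
```

Suppressing repeats matters as much as speaking quickly: a text-to-speech voice that re-reads “person, person, person” thirty times a second would be unusable.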
A huge thank you to Natesh M Bhat, the creator and maintainer of pyttsx3, a developer who built something downloaded millions of times and now powers accessibility tools built by children in Austin, Texas.
Python Software Foundation: Lucas coded the entire project in Python. His first language. His favorite language. There is something very special about a programming language that is both powerful enough to run AI at the edge and approachable enough for a 10-year-old to master. That’s Python.
Thank you to Deb Nicholson, Executive Director of the Python Software Foundation, and the entire PSF community for stewarding a language that is genuinely changing who gets to build the future.
Lucas doesn’t see any of this as extraordinary. He just saw a problem (kids who couldn’t see well enough to navigate the world around them) and decided to fix it. With a camera on a pair of glasses, a Raspberry Pi in a backpack, and a lot of Python.
If you’re reading this and you work in accessibility tech, education, or AI, I’d love to connect. And if you’re a company whose tools are being used by children to change the world, you deserve to know about it.
The future is being built right now. Sometimes by 10-year-olds.
Proud doesn’t even begin to cover it!