Rutgers University · Jan – May 2019

PM & Designer · Rutgers Capstone · Team of 4

VISN.

A wearable navigation system for visually impaired people — hardware and software, end to end. Built as a Rutgers senior capstone. Won first place among 60 engineering teams. The project that confirmed my instinct: when I back an idea that matters, I can make it real.

Hero — photo of Nikki presenting VISN at the Rutgers capstone fair

The Problem

Navigation without sight is a design problem

Visually impaired people navigate the world with a combination of memory, muscle memory, and whatever technology they can afford — canes, guide dogs, and a handful of smart devices that each solve part of the problem, but none solves it completely.

Existing solutions like SUNU (a sonar wristband) or Google Lookout could detect nearby objects or read aloud what a camera saw. But no single system combined real-time object proximity, directional awareness, and turn-by-turn navigation in one wearable, accessible package.

We wanted to build that. A system that could tell you: where you're going, what's in your way, and which direction you're facing — all through your ears, hands-free.

Problem — visually impaired navigation: person with cane, phone, and sensor device; gap in existing solutions

01

One system, two layers

VISN was a hardware-software system designed to work as one. The hardware lived on the body; the software ran on the user's phone; they communicated over Bluetooth in near real-time.

Hardware. An Arduino Nano, four Maxbotix ultrasonic sensors, a magnetometer (compass), and an HC-06 Bluetooth module — all wired into a breadboard circuit and enclosed in a fanny pack worn on the chest. The sensors measured the distance and angle to objects in the user's path. The compass tracked the direction they were facing. All of it streamed to the app.
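The sensing math itself is simple. As a rough illustration — the real firmware was Arduino C++, and the 147 µs-per-inch scale factor below is an assumption typical of Maxbotix LV-series sensors, not a value taken from our code — here is the core of what streamed to the phone:

```python
import math

# Assumed scale factor: Maxbotix LV-series sensors report roughly
# 147 microseconds of echo pulse width per inch of range.
US_PER_INCH = 147.0

def pulse_to_inches(pulse_us: float) -> float:
    """Convert an ultrasonic echo pulse width to a distance in inches."""
    return pulse_us / US_PER_INCH

def heading_degrees(mag_x: float, mag_y: float) -> float:
    """Compass heading from the magnetometer's X/Y axes, 0-360 degrees
    (assumes the device is held level)."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

def frame(sensor_id: int, pulse_us: float, mag_x: float, mag_y: float) -> str:
    """One hypothetical line of the Bluetooth stream:
    sensor index, distance in inches, current heading."""
    return (f"{sensor_id},"
            f"{pulse_to_inches(pulse_us):.1f},"
            f"{heading_degrees(mag_x, mag_y):.0f}")
```

Each of the four sensors covered a different angle on the chest, so the sensor index doubled as a crude direction indicator — which is why getting the magnetometer fully integrated mattered so much.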

Software. An Android app built in Android Studio. It pulled Google Maps data for turn-by-turn directions and layered in the live hardware stream — so as the user walked, they heard both their route and real-time obstacle alerts: "Object 3 feet ahead. Please move."
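On the app side, the obstacle alert was essentially a threshold on the live distance stream. A minimal sketch of that logic — in Python for illustration (the real app was Java in Android Studio), with the 5-foot cutoff assumed rather than taken from our tuning:

```python
ALERT_THRESHOLD_FT = 5.0  # assumed cutoff; the real value was tuned by testing

def obstacle_alert(distance_inches: float):
    """Turn a live distance reading into a spoken alert string,
    or None if the path is clear."""
    feet = distance_inches / 12.0
    if feet > ALERT_THRESHOLD_FT:
        return None
    return f"Object {round(feet)} feet ahead. Please move."
```

The app fed strings like this to Android's text-to-speech, interleaved with the turn-by-turn directions from Google Maps.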

Circuit diagram — Arduino Nano connected to four ultrasonic sensors, magnetometer, and HC-06 Bluetooth module

Components — Arduino Nano and HC-06 Bluetooth module (hardware); Android Studio (software)

The hardware stack — inexpensive, modular, and small enough to fit in a fanny pack

02

The design decision that mattered most

We tried a harness first. It held the hardware well — good sensor angles, stable on the body. But it was heavy, conspicuous, and made people feel more disabled, not less. That wasn't acceptable.

The fanny pack was the answer. Worn on the chest, it gave the sensors the right field of view without restricting movement. It was familiar, lightweight, and — critically — something a person might choose to wear anyway. It didn't announce that you needed help.

That decision shaped how I think about assistive technology. The best tools disappear into the life of the person using them. Dignity is a design requirement.

Wearable — fanny pack worn on chest with ultrasonic sensor placement labeled, alongside VISN Android app UI

The wearable setup — sensors, Arduino, and compass inside a fanny pack worn on the chest; the Android app handled navigation and obstacle alerts

03

What we shipped — and what we didn't

We shipped a working system. The app delivered real walking directions. The hardware detected stationary objects in front of the user and updated proximity as they moved. Bluetooth connected reliably. The fanny pack held everything together.

But we were honest about the gaps. The magnetometer wasn't fully integrated in time — so directional object alerts (left/right, not just "ahead") didn't make the demo. Indoor GPS was unreliable. Moving obstacles were outside scope. We documented all of it in the poster, because we believed the foundation was strong enough that the gaps were roadmap, not failure.

For a four-person team with four months and a hardware budget, we built something that worked outdoors, in real conditions, for a real user need.

Outcome

First place. 60 teams.

VISN won the Rutgers senior capstone competition among 60 engineering teams.

The judges weren't just evaluating whether the technology worked. They were evaluating whether the idea was worth building in the first place — whether the team understood the problem, made thoughtful decisions, and built something with a real future. We did.

Winning validated something I'd been carrying for a while: that my instinct for which problems to chase, and my ability to pull a team toward a working answer, were real. That combination — technical background, product thinking, design instinct — isn't common. I've been building on it ever since.