FeelBeam 3D render. Front view.
FeelBeam 3D render. X-Ray view.
FeelBeam 3D render. Back view.

Rethinking Navigation:
A Haptic Device That Lets You Feel Distance

Traditional aids like the white cane are essential, but they give little warning of obstacles at head height. Electronic aids often rely on audio or vibration cues, which can be distracting.

FeelBeam is a prototype exploring a new way for visually impaired individuals to sense their surroundings. Instead of simple vibration, it uses a moving lever to provide intuitive, proportional feedback on the distance to objects. 

How It Works

Simple, Intuitive, Kinesthetic Feedback

FeelBeam uses a safe, invisible laser sensor (LiDAR) to measure distance in real time. That measurement is instantly translated into the physical position of a lever under your thumb. The principle is simple:

FeelBeam haptic navigator. The lever has moved back towards the user, indicating a close obstacle.

Near Distance (<1m)

The lever slides all the way back, creating a distinct and unmistakable signal of a close obstacle.

FeelBeam prototype on the table. The lever is in the middle position, indicating a mid-range distance.

Mid-Range (~2m)

The lever is in the middle of its travel, providing a clear reference point.

FeelBeam prototype on the table. The lever is in the forward position, indicating a far distance.

Far Distance (>4m)

The lever rests in its forward-most position.

This continuous, analog feedback allows you to scan your environment and build a mental map through the sense of touch.
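
To make the mapping concrete, here is a minimal Python sketch of how a distance reading could be turned into a lever position. The 1 m and 4 m end points come from the examples above; the logarithmic curve (which places 2 m at mid-travel) and every name in the sketch are illustrative assumptions, not the actual FeelBeam firmware.

    import math

    NEAR_M = 1.0  # at or below 1 m the lever is fully back (position 0.0)
    FAR_M = 4.0   # at or beyond 4 m the lever is fully forward (position 1.0)

    def distance_to_lever(distance_m: float) -> float:
        """Map a distance in metres to a lever position in the range 0.0-1.0."""
        d = min(max(distance_m, NEAR_M), FAR_M)  # clamp to the useful range
        # A logarithmic curve puts 2 m exactly at mid-travel (0.5),
        # matching the three reference points described above.
        return math.log(d / NEAR_M) / math.log(FAR_M / NEAR_M)

    if __name__ == "__main__":
        for d in (0.5, 1.0, 2.0, 3.0, 4.0, 6.0):
            print(f"{d:4.1f} m -> lever position {distance_to_lever(d):.2f}")

On the prototype, each new LiDAR sample would be passed through a mapping like this and the result rescaled to the actuator's travel, so the lever tracks distance continuously rather than jumping between discrete steps.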

The Mission & Vision

An Open Question: Is This Truly Useful?

This project was born from a simple idea: what if distance could be felt as a position, not just a vibration?

As a sighted developer, I can design and build the technology, but I cannot determine its real-world value alone. The most important step is to listen to the experts: the potential users.

FeelBeam is therefore presented not as a final product, but as an open invitation for feedback. The goal is to collaborate with the visually impaired community to understand if this haptic modality is practical, intuitive, and genuinely helpful.

Project Status

  • Phase 1: Concept & Prototyping (✅ Done)

  • Phase 2: User Feedback & Validation (📍 We are here. Your input is needed!)

  • Phase 3: Miniaturization & Ergonomics (Next Step)

  • Phase 4: Open-Source Release / Pilot Program (Future Goal)

The Future Vision

The current prototype is powered by an external Raspberry Pi for rapid development. The final vision is a completely self-contained, wireless device the size and weight of a small flashlight, with an integrated battery and a target price that makes it widely accessible.
