360° rotation to explore and communicate with simple gestures.
Distance (ToF), IMU, temperature/humidity/pressure, microphone.
Eyes on LED ring or circular display, ambient lights, and sounds.
Companion · Sentry · Showcase demos.
H.A.R.O. is an interactive desktop companion that blends interaction, monitoring, and micro-expressivity. It rotates to observe the environment, senses distance and room conditions, and responds with eyes/LEDs, sounds, and small gestures.
The architecture is modular: a C/C++ microcontroller handles motors, sensors, and lights; an optional “smart” layer adds voice, vision, and a web dashboard.
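To give a feel for that split, here is a minimal sketch of the firmware side of such a boundary, assuming an Arduino-style MCU, a continuous-rotation servo for the base, and a plain-text serial command set. The ROT/LED/DIST commands and the readDistanceMm() helper are illustrative placeholders, not a defined protocol:

```cpp
// Minimal sketch of the MCU side of the firmware / smart-layer split.
// The smart layer (voice, vision, dashboard) sends short text commands over
// serial; the firmware owns motors, sensors, and lights.
#include <Arduino.h>
#include <Servo.h>

Servo baseServo;                 // continuous-rotation servo for the 360° base
const int LED_PIN = 6;           // single LED as a stand-in for the LED ring

long readDistanceMm() {
  // Placeholder: replace with the actual ToF driver call.
  return 0;
}

void setup() {
  Serial.begin(115200);
  baseServo.attach(9);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  if (!Serial.available()) return;
  String cmd = Serial.readStringUntil('\n');
  cmd.trim();

  if (cmd.startsWith("ROT ")) {            // e.g. "ROT 30" -> rotate the base
    int speed = cmd.substring(4).toInt();  // -90..90 mapped onto the servo
    baseServo.write(constrain(90 + speed, 0, 180));
  } else if (cmd.startsWith("LED ")) {     // e.g. "LED 1" -> light on/off
    digitalWrite(LED_PIN, cmd.substring(4).toInt() ? HIGH : LOW);
  } else if (cmd == "DIST") {              // report the ToF reading back
    Serial.println(readDistanceMm());
  }
}
```

Keeping the firmware this simple is what makes the smart layer optional: it can be swapped or omitted entirely without touching the motor, sensor, or light code.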
Idea shaping, architecture analysis, component selection, and milestones.
Base rotation + sensor readings + 3 LED emotions (see the sketch after this list).
Presence/simple gestures + sounds/feedback.
Voice recognition, optional camera, web dashboard, behavior trees.
Choreographed scenes, photos/videos, and documentation.
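As a rough feel for the MVP milestone above, a minimal Arduino-style loop could sweep the base, read the distance sensor, and map the reading onto three LED "emotions". Pin numbers, thresholds, and the readDistanceMm() stub are placeholders for the real hardware, and a standard 180° servo stands in for the 360° base:

```cpp
// MVP loop in miniature: sweep the base, read distance, show one of three
// LED "emotions" as blink patterns.
#include <Arduino.h>
#include <Servo.h>

Servo base;
const int LED_PIN = 6;

long readDistanceMm() { return 800; }   // stand-in for the ToF driver

enum Emotion { CALM, CURIOUS, ALERT };

Emotion emotionFor(long mm) {
  if (mm < 200) return ALERT;           // something very close
  if (mm < 600) return CURIOUS;         // someone nearby
  return CALM;                          // nothing going on
}

void showEmotion(Emotion e) {
  int blinks = (e == CALM) ? 1 : (e == CURIOUS) ? 2 : 4;
  for (int i = 0; i < blinks; i++) {
    digitalWrite(LED_PIN, HIGH); delay(80);
    digitalWrite(LED_PIN, LOW);  delay(80);
  }
}

void setup() {
  base.attach(9);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  for (int angle = 0; angle <= 180; angle += 15) {   // slow scan of the room
    base.write(angle);
    delay(200);
    showEmotion(emotionFor(readDistanceMm()));
  }
}
```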
Short GIFs/images will be added for Companion, Sentry, and Choreographed Demos.
Reactions and emotions via eyes/LEDs. (Preview coming soon)
Periodic scan and change detection (sketched below). (Preview coming soon)
Choreographies of lights/sounds/motion. (Preview coming soon)
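The idea behind the Sentry demo is simple enough to sketch: record a baseline distance at each scan angle, then re-scan periodically and flag angles that deviate beyond a threshold. The angle step, threshold, and readDistanceAtMm() helper below are assumptions, not final values:

```cpp
// Sentry mode in miniature: baseline sweep, then periodic re-scans that
// flag angles where the distance changed noticeably.
#include <Arduino.h>

const int  NUM_ANGLES   = 12;     // one reading every 30° of the sweep
const long THRESHOLD_MM = 150;    // how much change counts as "something moved"
long baseline[NUM_ANGLES];

long readDistanceAtMm(int index) {
  // Placeholder: rotate the base to index * 30° and return the ToF reading.
  return 1000;
}

void setup() {
  Serial.begin(115200);
  for (int i = 0; i < NUM_ANGLES; i++) {
    baseline[i] = readDistanceAtMm(i);   // first sweep becomes the baseline
  }
}

void loop() {
  for (int i = 0; i < NUM_ANGLES; i++) {
    long now = readDistanceAtMm(i);
    if (labs(now - baseline[i]) > THRESHOLD_MM) {
      Serial.print("Change detected at angle ");
      Serial.println(i * 30);
      // Here the robot could turn its eyes toward the change, light up, etc.
    }
  }
  delay(10000);                          // re-scan every ~10 s
}
```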
Status: defining architecture, components, and milestones. Next: MVP with rotation, core sensors, and LED emotions; repo on GitHub.
~1% complete (starting phase)
Scalability: