VisionLink Module — Vision-Based Bidirectional Interaction

Human → Machine. A chest-mounted camera captures the user's hand gestures. VisionLink recognizes each gesture; Orchestra classifies intent and sends commands to a connected exoskeleton operating at three speed levels plus rest. The same pipeline can command robotic arms, mobility devices, or any actuator on the Orchestra Connect Protocol.
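A pipeline like this can be sketched in a few lines. The gesture labels, speed levels, and command format below are illustrative assumptions for the sketch, not the actual Orchestra Connect Protocol, which is not described here.

```python
# Hypothetical sketch: translating a classified gesture into a
# device-agnostic speed command. All names are placeholder assumptions.
from enum import Enum

class SpeedLevel(Enum):
    REST = 0
    SLOW = 1
    MEDIUM = 2
    FAST = 3

# Assumed mapping from recognized gesture to one of the three speed
# levels plus rest described in the text.
GESTURE_TO_SPEED = {
    "fist": SpeedLevel.REST,
    "one_finger": SpeedLevel.SLOW,
    "two_fingers": SpeedLevel.MEDIUM,
    "open_palm": SpeedLevel.FAST,
}

def command_for_gesture(gesture: str) -> dict:
    """Map a gesture label to a command dict; unknown gestures mean rest."""
    level = GESTURE_TO_SPEED.get(gesture, SpeedLevel.REST)
    return {"target": "exoskeleton", "speed_level": level.value}
```

Because the command is an abstract target-plus-level pair rather than device-specific motor values, the same mapping could address a robotic arm or mobility device, which is the point of a shared protocol layer.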

World → Human. VisionLink detects a person approaching, measures distance and direction in real time, and Orchestra translates this into directional haptic feedback. Closer means stronger; left-side approach triggers left-side response. No user input required.
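The distance-to-intensity and direction-to-side mapping can be illustrated with a toy function. The linear falloff, the 5-meter range, and the bearing convention are assumptions for this sketch; the actual mapping Orchestra uses is not specified in the text.

```python
# Illustrative sketch: converting a detected person's distance and bearing
# into directional haptic feedback. Thresholds and the intensity curve
# are assumed, not taken from the product.

def haptic_feedback(distance_m: float, bearing_deg: float,
                    max_range_m: float = 5.0) -> dict:
    """Closer approach -> stronger vibration; bearing selects the side."""
    # Linear falloff: full strength at contact, zero beyond max range.
    intensity = max(0.0, 1.0 - distance_m / max_range_m)
    # Convention assumed here: negative bearing = approach from the left.
    side = "left" if bearing_deg < 0 else "right"
    return {"side": side, "intensity": round(intensity, 2)}
```

For example, a person approaching from the left at one meter would produce a strong left-side pulse, while anyone beyond the range produces none, matching the "closer means stronger, left triggers left" behavior described above.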

Conductor Module — EMG-Based Neural Gesture Recognition

Real-Time Hand Tracking. A developer wears an sEMG wristband. Conductor processes the muscle signals in real time, and a 3D virtual hand on a connected display mirrors the wearer's physical movement — vertical, lateral, and rotational — without the hand needing to be in any camera's field of view.
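Conceptually, the loop streams muscle-signal amplitudes, smooths them, and decodes them into joint angles for the virtual hand. The sketch below uses a fixed three-channel linear decoder purely for illustration; real sEMG decoding relies on learned models, and the channel count, smoothing constant, and gains here are all assumptions.

```python
# Toy sketch of an sEMG -> virtual-hand pipeline: smooth raw channel
# amplitudes, then map them to the three degrees of freedom mentioned
# in the text (vertical, lateral, rotational). Not the Conductor model.
from typing import List

ALPHA = 0.3  # exponential smoothing factor (assumed)

def smooth(prev: List[float], sample: List[float]) -> List[float]:
    """Exponential moving average over raw sEMG channel amplitudes."""
    return [ALPHA * s + (1 - ALPHA) * p for p, s in zip(prev, sample)]

def decode_pose(emg: List[float]) -> dict:
    """Placeholder linear decoder: one channel per degree of freedom."""
    vertical, lateral, rotation = emg[0], emg[1], emg[2]
    return {
        "vertical_deg": 90.0 * vertical,
        "lateral_deg": 45.0 * lateral,
        "rotation_deg": 180.0 * rotation,
    }
```

Because the input is muscle activity rather than camera pixels, a loop of this shape can keep tracking the hand when it leaves every camera's field of view, which is the property the paragraph highlights.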

Spatial Localization — In Active Development

Spatial Localization is a foundational layer for Spatial Intent Fusion. Wearable sensor data is processed in real time to produce precise positional information within a 3D environment, displayed as the wearer moves through a room. Once integrated, this capability will combine with visual context (VisionLink) and gestural intent (Conductor) to enable unified commands for connected physical devices.
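One simple form of wearable-based positioning is dead reckoning: integrating stride and heading estimates into a room position. The sketch below shows only that input-to-position idea; production localization fuses many sensors with filtering, and nothing here reflects Wetour's actual implementation.

```python
# Minimal dead-reckoning sketch: advance a 2D room position from step
# events (stride length + heading). An assumed illustration only.
import math

class DeadReckoner:
    def __init__(self, x: float = 0.0, y: float = 0.0):
        self.x, self.y = x, y

    def step(self, stride_m: float, heading_deg: float) -> tuple:
        """Move one stride along the given heading; return the new position."""
        rad = math.radians(heading_deg)
        self.x += stride_m * math.cos(rad)
        self.y += stride_m * math.sin(rad)
        return (self.x, self.y)
```

A position stream of this kind is what a fusion layer would combine with visual context and gestural intent to resolve a command like "move that device toward me."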

Demonstration videos for all four milestones are available at www.wetourrobotics.com and on the Company's LinkedIn (Wetour Robotics) and X (@WETO_IR_TEAM) channels.

Why Spatial Intent Fusion Matters

Today's wearable devices and physical machines are fragmented. A smartwatch cannot command a robotic arm. A camera cannot coordinate with a wheelchair. A wristband cannot direct a drone. Spatial Intent Fusion is Orchestra's answer: an application layer that lets a wearable on the wrist, a camera in the room, and a connected device across the space act as one coordinated system.

Both VisionLink and Conductor are software pipelines running on the Orchestra edge hub — hardware-agnostic and compatible with any conforming wearable or camera device. Orchestra is designed to be device-agnostic, sensor-flexible, and open to builders. The Company does not manufacture wearable devices or physical end-devices. It develops the platform layer that makes them work together.