`arduino-cli monitor -p /dev/ttyUSB0 -b 115200 --config-file "$ARDUINO_DATA_DIR/arduino-cli.yaml"`
A 6-DOF autonomous sorting robot with interrupt-safe human-in-the-loop control — local gesture, remote MQTT, and full AI vision pipeline.
- **Supported ✓**: Open the remote console in your phone browser and use Add to Home Screen / Install App. It launches like a native application in standalone mode.
- **Low Latency**: Dual virtual joysticks plus dedicated wrist/gripper hold buttons reduce accidental jumps compared to camera-only control on mobile.
- **Operator Friendly**: Runtime sliders for speed, deadzone, smoothing, and publish rate allow quick tuning based on network conditions and operator preference.
- **Safety Focus**: A dedicated reset command immediately returns target joints to neutral values for safe recovery when remote motion becomes unstable.

| Component | Status | Notes |
|---|---|---|
| Metal servo arm kit | On Hand ✓ | 6× MG996R servos, aluminium brackets, claw gripper |
| ESP8266 | On Hand ✓ | Wi-Fi microcontroller acting as ROS2 ↔ PCA9685 bridge; replaces Arduino Mega for wireless capability |
| PCA9685 | On Hand ✓ | I²C servo driver; takes joint angle commands from the ESP8266, outputs PWM to all 6 servo channels |
| Servo power supply | On Hand ✓ | Dedicated 5 V supply; prevents brownout under peak MG996R stall current (~1.8 A × 6 servos) |
| Laptop webcam | On Hand ✓ | Laptop-mounted; used exclusively for LOCAL gesture control via MediaPipe |
| Phone camera | On Hand ✓ | Fixed overhead via IP Webcam app; MJPEG stream → YOLOv8 object detection; free, 1080p+ |
| MPU6050 | To Order ₹199 | 6-axis accelerometer + gyroscope on wrist link; detects cumulative servo drift; I²C address 0x68 |
| Compute laptop | On Hand ✓ | Main compute: ROS2, Gazebo, YOLOv8 (CUDA), MoveIt2, MediaPipe; Ubuntu 24.04 LTS |

**AUTO:** Default state. The arm runs the full AI vision pipeline independently — no human input required. YOLOv8 detects objects overhead, MoveIt2 plans collision-free paths, and the arm picks and sorts continuously.
`/camera/image_raw → /detected_objects → /target_pose → /arm_controller/joint_trajectory`
**LOCAL:** Override triggered when MediaPipe detects a hand in the laptop webcam. Autonomous mode pauses immediately. The operator's 21 hand landmarks are mapped to 6 joint angles in real time.
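The landmark-to-joint mapping can be sketched as one geometric rule per joint. What follows is only one plausible mapping, with hypothetical joint limits standing in for the URDF values; the landmark indices themselves (0 = wrist, 4 = thumb tip, 8 = index fingertip) match MediaPipe's hand model.

```python
import math

# Hypothetical joint limits in radians; the real values live in the URDF.
JOINT_LIMITS = {
    "base": (-1.57, 1.57),
    "shoulder": (-1.2, 1.2),
    "gripper": (0.0, 1.0),
}

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def landmarks_to_joints(landmarks):
    """Map MediaPipe's 21 normalized (x, y, z) hand landmarks to joint targets.

    Sketch only: wrist x drives base yaw, wrist y drives shoulder pitch,
    and the thumb-index pinch distance drives gripper opening.
    """
    wrist = landmarks[0]        # landmark 0 is the wrist
    thumb_tip = landmarks[4]    # landmark 4 is the thumb tip
    index_tip = landmarks[8]    # landmark 8 is the index fingertip

    base = clamp((wrist[0] - 0.5) * math.pi, *JOINT_LIMITS["base"])
    shoulder = clamp((0.5 - wrist[1]) * 2.4, *JOINT_LIMITS["shoulder"])
    pinch = math.dist(thumb_tip[:2], index_tip[:2])  # 0.0 means fully pinched
    gripper = clamp(pinch * 5.0, *JOINT_LIMITS["gripper"])
    return {"base": base, "shoulder": shoulder, "gripper": gripper}
```

A hand centered in the frame maps to the neutral pose: `landmarks_to_joints([(0.5, 0.5, 0.0)] * 21)` returns zero for base and shoulder. The clamp on every joint mirrors the URDF hard limits enforced elsewhere in the stack.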
**REMOTE:** Global override. Anyone, anywhere in the world, opens the ngrok URL in a phone browser. MediaPipe.js runs in-browser — no app install. Joint angles publish via MQTT to HiveMQ Cloud, bridged into ROS2.
- **Overhead phone camera:** Fixed overhead, mounted on a stand above the workspace. Streams MJPEG via the IP Webcam app over local Wi-Fi. Always on during autonomous mode.
- **Laptop webcam:** Built-in, always active. MediaPipe monitors continuously for a hand appearing in frame; the presence of a hand triggers LOCAL mode.
- **Remote operator's phone:** Operators use their own phone. The browser opens the ngrok URL; no app install. MediaPipe.js runs in the browser tab itself.
| From | To | Signal | Wire | Note |
|---|---|---|---|---|
| PSU V+ (barrel) | PCA9685 V+ | 5V servo power | 20–22 AWG red | Through DC jack screw terminal |
| PSU GND | PCA9685 GND | Power return | 20–22 AWG black | Through DC jack screw terminal |
| ESP8266 3.3V | PCA9685 VCC | Logic power only | Any thin wire | NOT servo power — IC logic only |
| ESP8266 GND | PCA9685 GND | Common ground | Any thin wire | MANDATORY — shared reference |
| ESP8266 D1 (GPIO5) | PCA9685 SCL | I²C clock | 24–26 AWG | Pull-ups already on PCA9685 board |
| ESP8266 D2 (GPIO4) | PCA9685 SDA | I²C data | 24–26 AWG | Pull-ups already on PCA9685 board |
| MPU6050 SCL/SDA | Same I²C bus | I²C (addr 0x68) | 24–26 AWG | No address conflict with PCA9685 |
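As a cross-check on the PCA9685 side of this table: the chip has a 12-bit (4096-tick) counter, so at the 50 Hz servo frame rate each tick is 20 ms / 4096 ≈ 4.88 µs. A sketch (in Python for readability) of the angle-to-tick conversion the ESP8266 bridge would perform; the 500–2500 µs pulse range is an assumption, so verify your servos' actual endpoints before commanding the extremes.

```python
FRAME_US = 20_000    # one 50 Hz servo frame, in microseconds
RESOLUTION = 4096    # PCA9685 12-bit counter

# Assumed MG996R pulse endpoints; check your servos before using the extremes.
MIN_PULSE_US = 500   # pulse width at 0 degrees
MAX_PULSE_US = 2500  # pulse width at 180 degrees

def angle_to_ticks(angle_deg):
    """Convert a servo angle (0-180 deg) to a PCA9685 OFF-count."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    pulse_us = MIN_PULSE_US + (MAX_PULSE_US - MIN_PULSE_US) * angle_deg / 180.0
    return round(pulse_us * RESOLUTION / FRAME_US)
```

`angle_to_ticks(90)` gives 307, i.e. roughly the standard 1.5 ms center pulse (307 ticks × 4.88 µs ≈ 1500 µs).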
40-link CAD → clean 12-link URDF. 9 Gazebo Harmonic bugs fixed. RViz2 TF tree — fully working. joint_state_relay.py written.
MediaPipe installed and verified. 21 landmarks extracted from webcam at 30+ FPS. Landmark→joint mapping not yet written.
Install MoveIt2. Run Setup Assistant on robot_arm_description package. Configure planning group 'arm', end-effector 'gripper', SRDF, kinematics solver.
Complete landmark-to-joint-angle mapping. Publish JointTrajectory → /arm_controller. Live arm mirroring in Gazebo simulation.
IP Webcam stream → YOLOv8 inference → pixel-to-world coordinate → sorting planner → pick-sort-return cycle in Gazebo.
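The pixel-to-world step can stay very simple if the overhead camera looks straight down at a planar workspace: a one-off calibration gives a reference point and a scale, and each detection converts with two subtractions. This is a sketch under that flat, perpendicular-camera assumption (a tilted camera would need a full homography instead), and every number in the usage line is made up.

```python
def make_pixel_to_world(px_ref, world_ref, px_per_meter):
    """Build a converter for a straight-down overhead camera.

    px_ref: (u, v) pixel coordinates of a known reference point
    world_ref: (x, y) of that same point in the robot base frame, meters
    px_per_meter: scale measured during the same calibration
    """
    def convert(u, v):
        x = world_ref[0] + (u - px_ref[0]) / px_per_meter
        y = world_ref[1] - (v - px_ref[1]) / px_per_meter  # image v grows downward
        return (x, y)
    return convert
```

Usage with invented calibration values: `to_world = make_pixel_to_world((960, 540), (0.0, 0.30), 2000)` maps the frame center to a point 30 cm in front of the base, and 200 px of image motion to 10 cm of table motion.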
Flask web app served via ngrok. MediaPipe.js in browser. MQTT pub/sub to HiveMQ. mqtt_bridge_node.py decodes JSON → JointTrajectory.
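The JSON wire format that mqtt_bridge_node.py decodes is not shown in this section, so here is a sketch under assumed field names (one key per joint, values in radians). Out-of-range values are clamped rather than rejected, and malformed messages raise so the caller drops them instead of moving the arm on garbage input.

```python
import json

# Assumed joint names and limits (radians); the real limits live in the URDF.
JOINT_LIMITS = {
    "base": (-1.57, 1.57), "shoulder": (-1.2, 1.2), "elbow": (-1.4, 1.4),
    "wrist_pitch": (-1.57, 1.57), "wrist_roll": (-3.14, 3.14), "gripper": (0.0, 1.0),
}

def decode_command(payload):
    """Decode one MQTT message (bytes) into clamped joint targets."""
    try:
        msg = json.loads(payload)
    except json.JSONDecodeError as exc:
        raise ValueError(f"bad JSON: {exc}") from exc
    joints = {}
    for name, (lo, hi) in JOINT_LIMITS.items():
        if name not in msg:
            raise ValueError(f"missing joint '{name}'")
        joints[name] = max(lo, min(hi, float(msg[name])))
    return joints
```

Clamping at the bridge is the first line of the "a phone cannot damage the arm" guarantee; the URDF and MoveIt2 limits then enforce the same bounds downstream.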
Flash ESP8266 firmware. Wire PCA9685. Implement ros2_control hardware interface. MPU6050 drift correction. Real servo validation.
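The MPU6050 drift-correction step could be sketched as a complementary filter: integrate the gyro for responsiveness, then blend in the accelerometer's gravity-derived angle to cancel the gyro's slow drift. The 0.98 coefficient is a conventional starting point, not a value tuned for this arm.

```python
class ComplementaryFilter:
    """Fuse gyro rate and accelerometer angle for one axis (e.g. wrist pitch)."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha   # trust placed in the integrated gyro angle
        self.angle = 0.0     # current estimate, radians

    def update(self, gyro_rate, accel_angle, dt):
        """gyro_rate in rad/s; accel_angle in rad (from atan2 of accel axes)."""
        gyro_angle = self.angle + gyro_rate * dt        # fast but drifts
        self.angle = self.alpha * gyro_angle + (1.0 - self.alpha) * accel_angle
        return self.angle
```

With the sensor stationary, a constant gyro bias no longer accumulates without bound: the accelerometer term pulls the estimate back, which is exactly the cumulative-drift detection the MPU6050 is on the wrist link to provide.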
- **MQTT watchdog:** No MQTT command for 3 seconds → arm returns to home pose → AUTO mode resumes automatically.
- **Hard joint clamps:** All joint angles are hard-clamped to URDF limits. A phone cannot send a value that would physically damage the arm.
- **Input smoothing:** An exponential moving average on all gesture inputs keeps hand tremor and network jitter from becoming servo oscillation.
- **Rate limiting:** A 150 ms minimum time per joint command prevents instantaneous angular jumps that would strip gears or rip brackets.
- **Stale-command rejection:** Old MQTT commands are dropped immediately, so no stale backlog can cause sudden unexpected movements.
- **Atomic mode switching:** The mode manager waits for the active motion to complete before switching modes. No mid-trajectory context changes.
- **Fail-safe homing:** Any error, disconnect, or timeout triggers a return to a predefined safe home pose before resuming.
- **Planner-level limits:** Physical joint limits are enforced at the URDF level; the MoveIt2 OMPL planner respects these limits in all trajectory planning.
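Several of these layers (stale-command rejection, hard clamping, smoothing, jump prevention) compose naturally into one per-joint filter. A sketch with illustrative constants rather than the project's tuned values; the step limit here caps magnitude per accepted command, a stand-in for the time-based 150 ms rule.

```python
import time

class CommandFilter:
    """Per-joint safety chain: stale-drop, hard clamp, EMA smoothing, step limit."""

    def __init__(self, lo, hi, alpha=0.3, max_step=0.1, max_age_s=0.5):
        self.lo, self.hi = lo, hi   # URDF joint limits, radians
        self.alpha = alpha          # EMA smoothing factor
        self.max_step = max_step    # max change per accepted command, radians
        self.max_age_s = max_age_s  # commands older than this are dropped
        self.value = 0.0            # last commanded position

    def apply(self, target, stamp_s, now_s=None):
        now_s = time.time() if now_s is None else now_s
        if now_s - stamp_s > self.max_age_s:
            return self.value                            # stale: hold position
        target = max(self.lo, min(self.hi, target))      # hard clamp to limits
        smoothed = self.alpha * target + (1 - self.alpha) * self.value  # EMA
        step = max(-self.max_step, min(self.max_step, smoothed - self.value))
        self.value += step                               # bounded move per tick
        return self.value
```

Even a wildly out-of-range request (say 9 rad on a ±1.57 rad joint) produces at most a 0.1 rad move per accepted command, and an old timestamp produces no move at all.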