
Shulabh Gupta

Applied Electromagnetics & Antennas

Email: shulabh.gupta@carleton.ca

Project Title: Wireless Semi-Autonomous 4-DOF Robotic Arm with 2.4 GHz Control, Custom Antenna, and Computer Vision Integration


Project Overview

This Capstone project developed a compact, 3D-printed 4-degree-of-freedom (4-DOF) robotic arm that combines real-time wireless teleoperation, camera-based computer vision, and semi-autonomous functionality. The system enables seamless manual control via a remote joystick or master device, while the mounted camera and OpenCV pipeline detect colored objects, compute target coordinates, and trigger autonomous pick-and-place sequences. An nRF24L01+ 2.4 GHz transceiver pair with custom antenna design ensures reliable, low-latency wireless synchronization between controller and arm, supporting master-slave mirroring or override modes. The hybrid architecture allows operators to intervene wirelessly at any time, blending human oversight with automated tasks for flexible, intelligent manipulation in a lab environment.

Requirements

  • Functional: Support wireless real-time control of all four joints, camera-based object detection and coordinate extraction, semi-autonomous pick-and-place with manual override, and stable payload handling (up to 200–300 g).
  • Performance: Achieve low wireless latency (<20 ms target for smooth synchronization), positioning accuracy/repeatability within ±5–10 mm, reliable operation indoors (10–50 m range), and robust performance under varying lighting conditions.
  • Technical Constraints: Use affordable, lab-accessible components (Arduino-based control, 3D-printed structure, hobby servos, USB webcam or equivalent), external stable power supply, and open-source tools (RF24 library, OpenCV).
  • Safety & Usability: Include emergency stop, clear mode switching (manual/wireless/autonomous), and modular design for easy testing and iteration.
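The latency and reliability requirements above imply a concrete measurement procedure: timestamp each outgoing command, match it against its echoed acknowledgment, and summarize round-trip time and packet success rate. A minimal host-side helper for that analysis might look like the following (the function name and the echo-based measurement scheme are assumptions for illustration):

```python
def link_stats(sent_ts, echoed_ts):
    """Summarize a wireless link test.

    sent_ts   -- list of send timestamps (seconds)
    echoed_ts -- parallel list of echo-receive timestamps, None where lost
    Returns (packet_success_rate, mean_rtt_ms, max_rtt_ms).
    """
    rtts = [(rx - tx) * 1000.0          # round-trip time in milliseconds
            for tx, rx in zip(sent_ts, echoed_ts) if rx is not None]
    success = len(rtts) / len(sent_ts)
    return success, sum(rtts) / len(rtts), max(rtts)
```

Comparing `mean_rtt_ms` against the 20 ms budget, under different antennas and distances, gives the kind of quantitative evidence the performance requirement calls for.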

Required Skills

The project required and showcased a multidisciplinary skill set, including:

  • Mechanical Design: 3D CAD modeling (Fusion 360), torque analysis, 3D printing, and mechanical assembly of linkages and gripper.
  • Electronics & Embedded Systems: Arduino programming, servo control, sensor integration (ultrasonic/camera), and stable power management.
  • RF Communications: nRF24L01+ transceiver setup, custom antenna design/modification, packet structuring, and wireless protocol implementation.
  • Computer Vision & Autonomy: OpenCV pipeline for color/object detection, coordinate transformation, and integration with inverse kinematics.
  • Control & Analysis: Forward/inverse kinematics, trajectory smoothing, latency measurement, packet loss testing, and accuracy/repeatability evaluation.
  • System Integration & Testing: Hybrid manual-autonomous architecture, documentation (BOM, wiring diagrams, code), and performance benchmarking.
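For the kinematics item above, a common approach for hobby 4-DOF arms is to decompose the problem into a base yaw angle plus planar two-link IK in the arm's vertical plane, with the fourth joint reserved for wrist/gripper orientation. A sketch under those assumptions (the link lengths and elbow-branch choice are illustrative, not the project's actual geometry):

```python
import math

def ik_4dof(x, y, z, l1=120.0, l2=120.0):
    """Base-yaw + planar two-link inverse kinematics.

    (x, y, z) -- target position in mm, base frame
    l1, l2    -- assumed upper-arm and forearm lengths in mm
    Returns (base, shoulder, elbow) in radians (one elbow branch),
    or None if the target is out of reach.
    """
    base = math.atan2(y, x)
    r = math.hypot(x, y)                      # horizontal reach in the arm plane
    d2 = r * r + z * z                        # squared distance to target
    cos_e = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_e) > 1.0:
        return None                           # unreachable target
    elbow = math.acos(cos_e)
    shoulder = math.atan2(z, r) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return base, shoulder, elbow
```

Forward kinematics on the returned angles should reproduce the target, which makes the solver easy to unit-test before it ever drives real servos.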

Objectives

  • Design and fabricate a functional 4-DOF robotic arm with wireless 2.4 GHz communication and enhanced antenna for improved range and signal quality.
  • Integrate a camera system with computer vision to enable real-time object detection and semi-autonomous pick-and-place operations.
  • Achieve reliable low-latency wireless synchronization between a remote controller and the arm, supporting both teleoperation and master-slave mirroring.
  • Investigate and quantify system performance through metrics such as wireless latency, packet reliability, positioning accuracy, and object manipulation success rate under different conditions.
  • Develop a flexible hybrid control system that combines autonomous behavior with manual wireless intervention for practical usability.
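The teleoperation and mirroring objectives hinge on a compact command packet, since the nRF24L01+ carries at most 32 bytes of payload per transmission. One plausible layout (the field set and sizes below are assumptions, chosen to mirror a fixed Arduino-side C struct) packs a sequence number, a mode flag, and four hobby-servo joint angles into 8 bytes:

```python
import struct

# Hypothetical little-endian command packet, mirroring an Arduino-side struct:
#   uint16 seq      -- sequence number for loss/latency tracking
#   uint8  mode     -- 0 = manual, 1 = autonomous, 2 = override
#   uint8  gripper  -- gripper angle, 0-180 degrees
#   uint8  x4       -- four joint angles, 0-180 degrees (hobby-servo range)
PACKET_FMT = "<HBB4B"   # 8 bytes, well under the nRF24L01+ 32-byte limit

def pack_command(seq, mode, gripper, joints):
    """Serialize one command frame for transmission."""
    return struct.pack(PACKET_FMT, seq & 0xFFFF, mode, gripper, *joints)

def unpack_command(payload):
    """Deserialize a received frame back into its fields."""
    seq, mode, gripper, *joints = struct.unpack(PACKET_FMT, payload)
    return seq, mode, gripper, list(joints)
```

Keeping the frame this small leaves headroom in the payload for telemetry in the echo, and the sequence number doubles as the hook for the latency and packet-loss measurements listed in the objectives.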

Goals

The primary goal was to create a robust, demo-ready platform that demonstrates the synergy of wireless communication, sensor fusion, and computer vision in robotic manipulation. Specific measurable goals included:

  • Minimizing wireless latency while maintaining high packet success rate, with detailed investigation and optimization (including antenna effects).
  • Achieving high accuracy and repeatability in object manipulation tasks (e.g., successful color-based sorting or pick-and-place with <10 mm error).
  • Building a scalable semi-autonomous system suitable for lab demonstrations and potential extensions (e.g., multi-arm coordination or advanced vision).
  • Producing comprehensive documentation and test data to highlight engineering trade-offs, serving as a strong showcase of integrated mechatronics, embedded systems, and intelligent automation skills.