Shakila Praveen Rathnayake

shakilar.com
Robotics & Embedded Systems

3-Tier Architecture Mobile Robot

A modular, scalable mobile robot platform utilizing a 3-tier architecture with ESP32 for real-time control and Pi Zero 2W for high-level processing.

ESP32 · FreeRTOS · C++ · Python · UART · PlatformIO

This project features a custom 3-tier architecture designed for modularity and real-time performance. It integrates high-level decision-making with low-level hardware control through a reliable communication bridge.

System Architecture

The robot employs a hierarchical 3-tier architecture:

| Tier | Hardware | Primary Role | Description |
|---|---|---|---|
| Tier 1: Real-Time Control | ESP32 | Microcontroller | Low-level motor control, sensor acquisition, hardware I/O |
| Tier 2: Supervision & Middleware | Raspberry Pi Zero 2W | Middleware | Inter-process communication, bridge nodes, sensor preprocessing |
| Tier 3: Computational Processing | Host PC (Linux/ROS 2) | High-Level Processing | SLAM, computer vision, decision-making, trajectory prediction |

Communication Bridge

Communication between the Pi Zero 2W and the ESP32 is established via a reliable UART serial bridge operating at 921,600 baud (921.6 kbps).

// Sample Telemetry Packet
{
  "time": 12345,
  "imu": {
    "quat": [0.99, 0.01, 0.02, 0.03],
    "gyro": [0.0, 0.0, 0.1],
    "accel": [0.0, 0.0, 9.81],
    "cal": {"sys": 3, "gyro": 3, "accel": 3, "mag": 0}
  },
  "encoders": {
    "left": {"pos": 1024, "vel": 0.5},
    "right": {"pos": 1024, "vel": 0.5}
  },
  "state": {"motors_enabled": true, "emergency_stop": false}
}
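On the Tier 2 side, a packet like this can be decoded in a few lines of Python. The sketch below is illustrative only: the field names follow the sample packet above, but the validation logic and the idea of reading newline-delimited JSON from the UART are assumptions, not code from the repository.

```python
import json

# Sample packet, mirroring the telemetry format shown above.
SAMPLE = json.dumps({
    "time": 12345,
    "imu": {"quat": [0.99, 0.01, 0.02, 0.03],
            "gyro": [0.0, 0.0, 0.1],
            "accel": [0.0, 0.0, 9.81],
            "cal": {"sys": 3, "gyro": 3, "accel": 3, "mag": 0}},
    "encoders": {"left": {"pos": 1024, "vel": 0.5},
                 "right": {"pos": 1024, "vel": 0.5}},
    "state": {"motors_enabled": True, "emergency_stop": False},
})

def parse_telemetry(line: str) -> dict:
    """Decode one newline-delimited JSON telemetry packet from the ESP32."""
    pkt = json.loads(line)
    # Reject malformed packets before they reach higher tiers.
    if len(pkt["imu"]["quat"]) != 4:
        raise ValueError("quaternion must have 4 components")
    if not {"left", "right"} <= pkt["encoders"].keys():
        raise ValueError("both encoder channels are required")
    return pkt

pkt = parse_telemetry(SAMPLE)
```

In practice each line would arrive over the 921.6 kbps serial link (e.g. via a blocking `readline()` on the port) rather than from a hard-coded string.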

Hardware Specifications

| Component | Detail |
|---|---|
| Processor | ESP32 (Dual-core 240 MHz) |
| Motors | JGA25-371 DC + BTS7960B Drivers (12V) |
| IMU | Adafruit BNO055 (9-DOF Fusion) |
| Encoders | Dual Hall Effect (PCNT Peripheral) |
| Visuals | 12×4 LED Matrix (WS2812B) |
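The PCNT peripheral only accumulates pulse counts; turning a count delta into a wheel velocity is done in software. A minimal sketch of that conversion in Python (the counts-per-revolution and wheel radius below are placeholder values, not taken from this project):

```python
import math

COUNTS_PER_REV = 1320.0   # placeholder: gear ratio x encoder PPR x counted edges
WHEEL_RADIUS_M = 0.03     # placeholder wheel radius in metres

def wheel_velocity(delta_counts: int, dt_s: float) -> float:
    """Linear wheel velocity (m/s) from a PCNT count delta over dt_s seconds."""
    revs_per_s = (delta_counts / COUNTS_PER_REV) / dt_s
    return revs_per_s * 2.0 * math.pi * WHEEL_RADIUS_M

# One full revolution per second -> one wheel circumference per second.
v = wheel_velocity(1320, 1.0)
```

The firmware performs the equivalent calculation in C++ inside `sen_encoder.cpp`.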

Sensor Integration

The robot integrates a suite of complementary sensors providing proprioceptive (self-state) and exteroceptive (environmental) information:

Proprioceptive Sensors (Tier 1 Real-Time)

Exteroceptive Sensors (Tier 1 → Tier 3)

These sensors provide the foundation for simultaneous localization and mapping (SLAM), human pose estimation, and monocular depth inference, the three primary perception pipelines executing on Tier 3.
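For odometry and SLAM, the BNO055's fused orientation quaternion is commonly reduced to a heading (yaw) angle. This is the standard quaternion-to-yaw formula, shown here as a Python sketch rather than code from the firmware:

```python
import math

def quat_to_yaw(w: float, x: float, y: float, z: float) -> float:
    """Heading (radians, about Z) extracted from a unit quaternion (w, x, y, z)."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Identity quaternion corresponds to zero heading.
yaw = quat_to_yaw(1.0, 0.0, 0.0, 0.0)
```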

Software Design (ESP32)

The firmware is built on FreeRTOS to ensure deterministic timing for critical tasks.

| Task | Core | Priority | Frequency | Stack | Function |
|---|---|---|---|---|---|
| UART RX | 0 | 3 | 1000 Hz | 4 KB | Command RX + JSON parse |
| Control | 0 | 2 | 100 Hz | 4 KB | Motor control + watchdog |
| UART TX | 1 | 2 | 50 Hz | 8 KB | Sensor telemetry TX |
| LED/Buzzer | 1 | 1 | 100 Hz | 2 KB | Animations + feedback |
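The watchdog in the Control task can be illustrated as a small state machine: if no valid command arrives within a timeout, the motors are disabled as a fail-safe. This sketch is in Python for brevity (the firmware implements it in C++), and the 200 ms timeout is an assumed value, not taken from the source:

```python
class MotorWatchdog:
    """Disable motors if no command arrives within `timeout_s` seconds."""

    def __init__(self, timeout_s: float = 0.2):  # assumed timeout value
        self.timeout_s = timeout_s
        self.last_cmd_time = 0.0
        self.motors_enabled = False

    def on_command(self, now_s: float) -> None:
        """Called from the UART RX path whenever a valid command is parsed."""
        self.last_cmd_time = now_s
        self.motors_enabled = True

    def tick(self, now_s: float) -> bool:
        """Called at 100 Hz by the Control task; returns motor-enable state."""
        if now_s - self.last_cmd_time > self.timeout_s:
            self.motors_enabled = False  # fail safe: stop on command silence
        return self.motors_enabled

wd = MotorWatchdog()
wd.on_command(1.00)
running = wd.tick(1.05)   # within timeout: motors stay enabled
stopped = wd.tick(1.50)   # 0.5 s of silence: motors disabled
```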

Visual Feedback System

A dedicated LEDMatrix class manages patterns for status indication.

Project Structure

fif-robot/
├── platformio.ini                 # Build configuration
├── include/                       # Header files
│   ├── config.h                  # Global configuration (pins, parameters, RTOS settings)
│   ├── act_motor.h               # Motor driver interface (BTS7960B)
│   ├── act_buzzer.h              # Buzzer controller
│   ├── act_matrix.h              # LED matrix patterns
│   ├── sen_encoder.h             # Encoder reader
│   ├── sen_imu.h                 # IMU sensor interface
│   └── comm_uart.h               # UART communication (header only)

└── src/                          # Implementation files
    ├── main.cpp                  # Entry point + RTOS task definitions
    ├── act_motor.cpp             # Motor driver implementation
    ├── act_buzzer.cpp            # Buzzer implementation (melodies)
    ├── act_matrix.cpp            # LED pattern implementations
    ├── sen_encoder.cpp           # Encoder velocity calculations
    ├── sen_imu.cpp               # IMU data acquisition & calibration
    ├── comm_protocol.cpp         # JSON command/sensor protocol + UART ISR
    └── debug_commands.cpp        # USB debug CLI interface
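The `platformio.ini` referenced above would declare the board, framework, and library dependencies for this build. A minimal sketch of what such a configuration might look like (board identifier, monitor speed, and library names are assumptions, not copied from the repository):

```ini
[env:esp32dev]
platform = espressif32
board = esp32dev
framework = arduino
monitor_speed = 921600          ; matches the UART bridge baud rate
lib_deps =
    adafruit/Adafruit BNO055    ; IMU driver (assumed dependency)
    fastled/FastLED             ; WS2812B matrix driver (assumed dependency)
```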