This project is built around a custom 3-tier architecture designed for modularity and real-time performance. It couples high-level decision-making with low-level hardware control through a reliable communication bridge.
## System Architecture
The robot employs a hierarchical 3-tier architecture:
| Tier | Hardware | Primary Role | Description |
|---|---|---|---|
| Tier 1: Real-Time Control | ESP32 | Microcontroller | Low-level motor control, sensor acquisition, hardware I/O |
| Tier 2: Supervision & Middleware | Raspberry Pi Zero 2W | Middleware | Inter-process communication, bridge nodes, sensor preprocessing |
| Tier 3: Computational Processing | Host PC (Linux/ROS 2) | High-Level Processing | SLAM, computer vision, decision-making, trajectory prediction |
## Communication Bridge
Communication between the Pi Zero 2W and ESP32 is established via a reliable UART serial bridge operating at 921.6 kbps.
- Telemetry (ESP32 → Host @ 50 Hz): IMU data (quaternion, accelerometer, gyroscope), encoder ticks/velocity, and system state.
- Commands (Host → ESP32 @ 100 Hz): velocity targets, LED patterns, and buzzer controls (see the sample command packet below).
Sample telemetry packet:

```json
{
  "time": 12345,
  "imu": {
    "quat": [0.99, 0.01, 0.02, 0.03],
    "gyro": [0.0, 0.0, 0.1],
    "accel": [0.0, 0.0, 9.81],
    "cal": {"sys": 3, "gyro": 3, "accel": 3, "mag": 0}
  },
  "encoders": {
    "left": {"pos": 1024, "vel": 0.5},
    "right": {"pos": 1024, "vel": 0.5}
  },
  "state": {"motors_enabled": true, "emergency_stop": false}
}
```
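For the opposite direction, a plausible command packet is sketched below. The field names are illustrative assumptions, not the firmware's confirmed schema:

```json
{
  "cmd": {"linear": 0.3, "angular": 0.5},
  "led": {"pattern": "turn_left"},
  "buzzer": {"tone": 440, "duration_ms": 100}
}
```

On the ESP32 side, a packet like this could be decoded with ArduinoJson (a common choice for JSON parsing on this platform, though the source does not name the library used):

```cpp
#include <ArduinoJson.h>

// Parse one command line received over UART. Field names match the
// assumed packet sketch above; defaults apply when a key is absent.
bool parseCommand(const char *line, float &linear, float &angular) {
  StaticJsonDocument<256> doc;
  DeserializationError err = deserializeJson(doc, line);
  if (err) return false;                  // malformed JSON: drop the packet
  linear  = doc["cmd"]["linear"]  | 0.0f;
  angular = doc["cmd"]["angular"] | 0.0f;
  return true;
}
```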
## Hardware Specifications
| Component | Detail |
|---|---|
| Processor | ESP32 (dual-core, 240 MHz) |
| Motors | JGA25-371 DC gearmotors + BTS7960B drivers (12 V) |
| IMU | Adafruit BNO055 (9-DOF fusion) |
| Encoders | Dual Hall effect (ESP32 PCNT peripheral) |
| Visuals | 12×4 LED matrix (WS2812B) |
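As an illustration of the motor interface, here is a minimal sketch of driving one BTS7960B channel from the ESP32's LEDC PWM peripheral, assuming the Arduino-ESP32 2.x API. The pin numbers, channels, and PWM settings are assumptions, not values from config.h:

```cpp
#include <Arduino.h>

constexpr int RPWM_PIN = 25;      // assumed forward PWM input of the BTS7960B
constexpr int LPWM_PIN = 26;      // assumed reverse PWM input
constexpr int CH_R = 0, CH_L = 1; // LEDC channels (assumed free)
constexpr int PWM_FREQ = 20000;   // 20 kHz, above the audible range
constexpr int PWM_RES = 10;       // 10-bit duty resolution (0..1023)

void motorInit() {
  ledcSetup(CH_R, PWM_FREQ, PWM_RES);
  ledcSetup(CH_L, PWM_FREQ, PWM_RES);
  ledcAttachPin(RPWM_PIN, CH_R);
  ledcAttachPin(LPWM_PIN, CH_L);
}

// duty in [-1.0, 1.0]: sign selects direction, magnitude selects speed.
void motorSet(float duty) {
  uint32_t d = (uint32_t)(fabsf(duty) * 1023.0f);
  ledcWrite(CH_R, duty > 0 ? d : 0);
  ledcWrite(CH_L, duty < 0 ? d : 0);
}
```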
## Sensor Integration
The robot integrates a suite of complementary sensors providing proprioceptive (self-state) and exteroceptive (environmental) information:
### Proprioceptive Sensors (Tier 1 Real-Time)
- Wheel encoders: 100 Hz odometry acquisition via the ESP32's PCNT hardware peripheral (see the sketch after this list).
- IMU (6-axis): 50 Hz orientation and acceleration data for dead-reckoning and slip correction.
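A minimal sketch of what the Tier 1 encoder path could look like, using the legacy ESP-IDF PCNT driver; the GPIO numbers and ticks-per-revolution constant are assumptions:

```cpp
#include <Arduino.h>
#include "driver/pcnt.h"

void encoderInit() {
  pcnt_config_t cfg = {};
  cfg.unit = PCNT_UNIT_0;
  cfg.channel = PCNT_CHANNEL_0;
  cfg.pulse_gpio_num = 34;            // assumed encoder phase A
  cfg.ctrl_gpio_num = 35;             // assumed encoder phase B
  cfg.pos_mode = PCNT_COUNT_INC;      // count up on rising edge of A...
  cfg.neg_mode = PCNT_COUNT_DEC;      // ...down on falling edge of A
  cfg.lctrl_mode = PCNT_MODE_REVERSE; // B low: reverse counting direction
  cfg.hctrl_mode = PCNT_MODE_KEEP;    // B high: keep direction
  cfg.counter_h_lim = 32767;
  cfg.counter_l_lim = -32768;
  pcnt_unit_config(&cfg);
  pcnt_counter_clear(PCNT_UNIT_0);
  pcnt_counter_resume(PCNT_UNIT_0);
}

// Called from the 100 Hz control loop: tick delta -> wheel speed in rad/s.
float encoderVelocity() {
  constexpr float TICKS_PER_REV = 1320.0f;  // assumed encoder resolution
  static int16_t last = 0;
  int16_t now = 0;
  pcnt_get_counter_value(PCNT_UNIT_0, &now);
  int16_t delta = now - last;
  last = now;
  return (delta / TICKS_PER_REV) * 2.0f * PI * 100.0f;  // x100 for the 100 Hz rate
}
```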
### Exteroceptive Sensors (Tier 1 → Tier 3)
- 2D LiDAR: 360° environmental mapping at 25 Hz (360-point scans).
- RGB Camera: Forward-facing perception for human detection.
These sensors feed the three primary perception pipelines executing on Tier 3: simultaneous localization and mapping (SLAM), human pose estimation, and monocular depth inference.
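On the Tier 3 side, LiDAR data would typically arrive as standard sensor_msgs/LaserScan messages. A minimal ROS 2 subscriber sketch is shown below; the node and topic names are assumptions:

```cpp
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/laser_scan.hpp>

class ScanListener : public rclcpp::Node {
public:
  ScanListener() : Node("scan_listener") {
    // Best-effort QoS suits high-rate sensor streams like the 25 Hz scans.
    sub_ = create_subscription<sensor_msgs::msg::LaserScan>(
        "scan", rclcpp::SensorDataQoS(),
        [this](sensor_msgs::msg::LaserScan::SharedPtr msg) {
          // 360-point scan per the spec above; hand off to SLAM here.
          RCLCPP_DEBUG(get_logger(), "scan with %zu points", msg->ranges.size());
        });
  }

private:
  rclcpp::Subscription<sensor_msgs::msg::LaserScan>::SharedPtr sub_;
};

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<ScanListener>());
  rclcpp::shutdown();
  return 0;
}
```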
## Software Design (ESP32)
The firmware is built on FreeRTOS to ensure deterministic timing for critical tasks.
| Task | Core | Priority | Frequency | Stack | Function |
|---|---|---|---|---|---|
| UART RX | 0 | 3 | 1000 Hz | 4 KB | Command RX + JSON parse |
| Control | 0 | 2 | 100 Hz | 4 KB | Motor control + watchdog |
| UART TX | 1 | 2 | 50 Hz | 8 KB | Sensor telemetry TX |
| LED/Buzzer | 1 | 1 | 100 Hz | 2 KB | Animations + feedback |
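A sketch of how this task layout could be brought up at startup with the FreeRTOS that ships in Arduino-ESP32; the task and helper names are illustrative, and the real main.cpp may differ:

```cpp
#include <Arduino.h>

// Fixed-rate task skeleton: runs `body` every `period_ms` milliseconds.
static void runAt(TickType_t period_ms, void (*body)()) {
  TickType_t wake = xTaskGetTickCount();
  for (;;) {
    body();
    vTaskDelayUntil(&wake, pdMS_TO_TICKS(period_ms));
  }
}

void uartRxTask(void *)  { runAt(1,  [] { /* command RX + JSON parse */ }); }
void controlTask(void *) { runAt(10, [] { /* PID update + watchdog   */ }); }
void uartTxTask(void *)  { runAt(20, [] { /* telemetry TX            */ }); }
void ledTask(void *)     { runAt(10, [] { /* animations + feedback   */ }); }

void setup() {
  //                      function      name       stack   arg    prio handle  core
  xTaskCreatePinnedToCore(uartRxTask,  "uart_rx",  4096, nullptr, 3, nullptr, 0);
  xTaskCreatePinnedToCore(controlTask, "control",  4096, nullptr, 2, nullptr, 0);
  xTaskCreatePinnedToCore(uartTxTask,  "uart_tx",  8192, nullptr, 2, nullptr, 1);
  xTaskCreatePinnedToCore(ledTask,     "led",      2048, nullptr, 1, nullptr, 1);
}

void loop() {}  // all work happens in the RTOS tasks
```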
## Visual Feedback System
A dedicated LEDMatrix class manages patterns for status indication:
- Pulse/Breathing: Idle states.
- Rapid Flash: Emergency stop or system fault.
- Directional Indicators: Blinking yellow for turns.
- Rainbow Sweep: Mapping mode active.
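As an example of one such pattern, here is a minimal sketch of the idle breathing effect, assuming the FastLED library (the data pin, color, and timing are assumptions; the actual LEDMatrix implementation may differ):

```cpp
#include <FastLED.h>

constexpr int DATA_PIN = 27;       // assumed matrix data pin
constexpr int NUM_LEDS = 12 * 4;   // 12x4 WS2812B matrix per the spec
CRGB leds[NUM_LEDS];

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  FastLED.setBrightness(64);
}

void loop() {
  // beatsin8 yields a smooth 0..255 sine at 12 BPM for the breathing effect.
  uint8_t level = beatsin8(12, 0, 255);
  fill_solid(leds, NUM_LEDS, CHSV(160, 200, level));  // soft blue pulse
  FastLED.show();
  delay(10);
}
```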
## Project Structure
```
fif-robot/
├── platformio.ini         # Build configuration
├── include/               # Header files
│   ├── config.h           # Global configuration (pins, parameters, RTOS settings)
│   ├── act_motor.h        # Motor driver interface (BTS7960B)
│   ├── act_buzzer.h       # Buzzer controller
│   ├── act_matrix.h       # LED matrix patterns
│   ├── sen_encoder.h      # Encoder reader
│   ├── sen_imu.h          # IMU sensor interface
│   └── comm_uart.h        # UART communication (header only)
│
└── src/                   # Implementation files
    ├── main.cpp           # Entry point + RTOS task definitions
    ├── act_motor.cpp      # Motor driver implementation
    ├── act_buzzer.cpp     # Buzzer implementation (melodies)
    ├── act_matrix.cpp     # LED pattern implementations
    ├── sen_encoder.cpp    # Encoder velocity calculations
    ├── sen_imu.cpp        # IMU data acquisition & calibration
    ├── comm_protocol.cpp  # JSON command/sensor protocol + UART ISR
    └── debug_commands.cpp # USB debug CLI interface
```