This project involved programming the iRobot Create 3 to perform a variety of interactive and autonomous tasks using Python. The goal was to create a multi-functional system in which the robot could react to its environment, perform tasks based on physical inputs, and autonomously navigate obstacles and complex paths. This involved implementing collision-avoidance routines, a sweeping mode, and a maze-navigation program.
The iRobot Create 3 features several onboard sensors and actuators, including bumpers, buttons, infrared (IR) sensors, and a ring light, which were utilized in various sections of the project. The main components of this work can be categorized into the following parts: bumpers and buttons interaction, IR sensor-based behavior, CodeBreaker password system, and advanced applications for autonomous delivery and maze-solving.
I utilized the iRobot Create 3's IR sensors to gather proximity data and control the robot's behavior accordingly. In the script ir_sensors.py, I programmed the robot to use its IR sensors to determine the location of nearby objects. Depending on the proximity of an object to the left, right, or center of the robot, the ring light would change colors to provide a visual indication: red for an object on the left, green for an object on the right, and white when no object was detected or the closest object was directly in front.
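The color-selection logic can be sketched as a pure function. This is a minimal illustration, not the code from ir_sensors.py: it assumes the front-facing IR readings arrive as a left-to-right list (the Create 3 reports several front proximity values) in which a higher reading means a closer object; the function name is hypothetical.

```python
def pick_ring_color(ir_readings):
    """Map left-to-right IR proximity readings to a ring-light RGB color.

    Higher readings indicate a closer object. Red = closest object on the
    left, green = on the right, white = nothing detected or dead ahead.
    """
    RED, GREEN, WHITE = (255, 0, 0), (0, 255, 0), (255, 255, 255)
    if not ir_readings or max(ir_readings) == 0:
        return WHITE  # no object detected
    closest = ir_readings.index(max(ir_readings))  # index of strongest return
    mid = len(ir_readings) // 2
    if closest < mid:
        return RED    # closest object is to the left
    if closest > mid:
        return GREEN  # closest object is to the right
    return WHITE      # closest object is directly ahead
```

In the actual script, the returned color would be passed to the SDK's ring-light call inside the sensor-polling loop.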
To convert the raw IR sensor readings into approximate distances, the formula used was proximity (in cm) = 4095 / (ir_reading + 1). This allowed for better interpretation of the sensor data and ensured more meaningful responses to environmental conditions. The IR data was gathered asynchronously, allowing for real-time updates to the robot's behavior.
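The conversion above is a one-liner in code. The function name is illustrative; the formula is the one stated in the report, with raw readings on a 0-4095 scale where a stronger reflection (higher reading) means a closer object.

```python
def ir_to_proximity_cm(ir_reading: int) -> float:
    """Convert a raw IR reading (0-4095) to an approximate distance in cm."""
    return 4095 / (ir_reading + 1)

print(ir_to_proximity_cm(0))     # -> 4095.0 (no reflection: effectively nothing nearby)
print(ir_to_proximity_cm(4094))  # -> 1.0 (saturated reading: object ~1 cm away)
```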
I developed a system that enabled the iRobot Create 3 to autonomously navigate to a designated destination while avoiding obstacles. The goal was to simulate a basic version of an autonomous delivery robot, similar to those used in last-mile delivery scenarios. The IR sensors played a crucial role in detecting obstacles, allowing the robot to navigate around them while maintaining its course.
The autonomous delivery system included various event-driven fail-safe mechanisms. For instance, if either bumper was pressed or a button was touched, the robot would immediately stop and turn on a solid red light, ensuring safety and preventing damage. The main navigation function, makeDelivery(), utilized helper functions to handle movement, detect obstacles, and reorient the robot as necessary to continue towards its target. This approach combined real-time sensor input with autonomous path planning to create a responsive and adaptive delivery system.
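The fail-safe logic can be illustrated without the SDK. The sketch below replaces the event decorators with plain methods a dispatcher would call; the class and handler names are hypothetical stand-ins for the handlers registered around makeDelivery(), not the report's actual code.

```python
class DeliveryBot:
    """Minimal stand-in for the delivery robot's fail-safe state."""

    def __init__(self):
        self.moving = False
        self.light = None  # current ring-light RGB, if set

    def start_delivery(self):
        self.moving = True

    def on_bump(self, side):
        # Either bumper press halts the robot immediately.
        self.emergency_stop()

    def on_button(self, name):
        # Any button touch triggers the same fail-safe.
        self.emergency_stop()

    def emergency_stop(self):
        self.moving = False
        self.light = (255, 0, 0)  # solid red to signal the stop

bot = DeliveryBot()
bot.start_delivery()
bot.on_bump("left")
print(bot.moving, bot.light)  # -> False (255, 0, 0)
```

In the real system, these handlers would be registered with the SDK's event mechanism so they preempt the navigation loop rather than being called inline.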
I also worked on a game called "Robot Pong." In this game, the iRobot Create 3 acted as one player, while a human controlled the other side through physical interactions. The robot used its IR sensors to track the movement of a ball (represented by an object) and, based on the ball's position, adjusted its movements to "hit" the ball back. This required precise interpretation of the sensor data and quick, responsive movement commands so the robot could effectively follow and intercept the moving object.
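The tracking behavior amounts to a simple proportional controller: steer toward the ball's lateral offset, clamped to a maximum speed. This is a hedged sketch of the idea, not the game's actual code; the function name, gain, and speed limit are all illustrative.

```python
def pong_tracking_speed(ball_offset: float, gain: float = 0.5,
                        max_speed: float = 10.0) -> float:
    """Proportional steering: positive offset -> move right, negative -> left.

    ball_offset is the ball's lateral position relative to the robot's center,
    derived from which IR sensor sees the strongest return.
    """
    return max(-max_speed, min(max_speed, gain * ball_offset))

print(pong_tracking_speed(4))    # -> 2.0 (small offset: gentle correction)
print(pong_tracking_speed(100))  # -> 10.0 (large offset: clamped to max speed)
```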
The implementation of Robot Pong provided valuable experience in real-time control and multi-sensor fusion, as the robot had to constantly adjust its position based on changing sensor input. This game also helped in understanding how to create engaging interactive behaviors in robots.
The Sweeper mode was developed to simulate the behavior of an autonomous floor cleaning robot, similar to commercial robotic vacuum cleaners. In this mode, the iRobot Create 3 navigated around a designated area to "sweep" the floor, using its IR sensors and bumpers to detect obstacles and ensure efficient coverage of the space.
The Sweeper mode was implemented using a combination of random exploration and systematic coverage algorithms. The robot initially moved in a straight line until it detected an obstacle using its IR sensors or bumpers. When an obstacle was detected, the robot would stop, turn to avoid the obstacle, and continue in a new direction. To ensure comprehensive coverage, the robot periodically adjusted its path to create a zigzag or spiral pattern, allowing it to cover a larger area systematically.
The IR sensors provided continuous feedback on the distance to obstacles, while the bumpers acted as a fail-safe to prevent collisions. A key aspect of the implementation was managing the robot's movement speed and turning angles to balance coverage efficiency with collision avoidance. Additionally, a finite state machine (FSM) was used to manage different behaviors, such as moving forward, avoiding obstacles, and changing directions, ensuring a cohesive and effective cleaning pattern.
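The FSM at the heart of Sweeper mode can be sketched as follows. This is a simplified model, not the project's code: states and the zigzag interval are illustrative, and motor commands are omitted so only the transition logic is shown.

```python
class Sweeper:
    """Toy FSM for the sweeping behavior: drive straight, avoid obstacles,
    and periodically zigzag to spread coverage across the area."""

    def __init__(self, zigzag_every=20):
        self.state = "FORWARD"
        self.steps = 0
        self.zigzag_every = zigzag_every  # how often to inject a course change
        self.turn_dir = 1                 # alternates to form the zigzag

    def step(self, obstacle: bool) -> str:
        """Advance one control tick given whether an obstacle is detected."""
        if obstacle:
            self.state = "AVOID"          # stop and turn away
        elif self.state == "AVOID":
            self.state = "FORWARD"        # obstacle cleared: resume
        elif self.steps and self.steps % self.zigzag_every == 0:
            self.state = "ZIGZAG"         # periodic course change
            self.turn_dir = -self.turn_dir  # alternate turn direction
        else:
            self.state = "FORWARD"
        self.steps += 1
        return self.state
```

In the real robot, each state would map to motor commands (drive, rotate by the avoidance or zigzag angle), with the IR readings and bumper events feeding the `obstacle` flag.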
This mode required precise coordination between sensor inputs and motor commands to maintain a smooth, uninterrupted sweeping action. The Sweeper mode reinforced the importance of autonomous navigation, sensor integration, and systematic path planning, showcasing the potential of robotic solutions for practical household tasks.
The maze-solving component (MazeSolver.py) involved programming the robot to autonomously navigate through a maze. The robot used its IR sensors to detect walls and obstacles, dynamically making decisions on how to proceed based on sensor feedback. The goal was to move from a start point to an end point, finding the most efficient route through the maze.
The maze-solving functionality was implemented using several key algorithms and helper functions. The robot used its IR sensors to continuously monitor its surroundings and detect any obstacles or walls that might impede its progress. The primary function, navigateMaze(), operated as a state machine, controlling the robot's behavior through the stages of navigation, reorientation, and movement towards the goal.
The navigateMaze() function used global variables to track the robot's current position (CURR_CELL) and orientation (HEADING). Helper functions such as getMinProxApproachAngle() and getAngleToDestination() were used to determine the optimal angle for the robot to move in order to face the destination while avoiding obstacles. The pathfinding algorithm employed was a variant of wall-following, combined with real-time distance measurements from IR sensors to dynamically adjust the route.
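A plausible reconstruction of the getAngleToDestination() helper is shown below. The grid-cell coordinates and degree-based heading convention (0° = +x axis, counterclockwise positive) are assumptions for illustration, not the report's exact conventions.

```python
import math

def get_angle_to_destination(curr_cell, heading_deg, dest_cell):
    """Return the signed turn (degrees, in [-180, 180)) needed for a robot
    at curr_cell with the given heading to face dest_cell."""
    dx = dest_cell[0] - curr_cell[0]
    dy = dest_cell[1] - curr_cell[1]
    target = math.degrees(math.atan2(dy, dx))        # world-frame bearing to goal
    return (target - heading_deg + 180) % 360 - 180  # normalize to [-180, 180)
```

Combined with the minimum-proximity angle from the obstacle readings, a helper like this lets the state machine pick a heading that balances "face the goal" against "avoid the nearest wall."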
When the robot encountered an obstacle, it would execute a reorientation sequence using the realignRobot() function to adjust its heading to face a more navigable direction. The robot's movement through the maze was characterized by a sequence of small, precise adjustments to ensure efficient navigation, minimizing collisions while maximizing progress towards the destination.
To track whether the robot had reached the target, a helper function checkPositionArrived() compared the robot's current position to the destination coordinates within a specified threshold. If the destination was reached, the robot indicated success by stopping and turning on a spinning green light.
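One way checkPositionArrived() could look is a Euclidean-distance check against a threshold; the coordinate units and default threshold below are assumptions for illustration.

```python
import math

def check_position_arrived(position, destination, threshold_cm=5.0):
    """Return True once the robot is within threshold_cm of the destination."""
    dist = math.hypot(destination[0] - position[0],
                      destination[1] - position[1])
    return dist <= threshold_cm
```

On a True result, the navigation loop would halt the wheels and switch the ring light to the spinning green success animation.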
The maze-solving component provided valuable experience in implementing autonomous navigation, state management, and sensor-driven pathfinding. It showcased how an intelligent system can adapt to a dynamic environment and find its way through complex, unfamiliar terrain using limited sensor input and well-defined navigation strategies.