ROS 2 Success with ReentrantCallbackGroups

Robot "Kilted-Dave" could not declare docking success from the dock callback because the battery_state callback was blocked from executing to update the new charging state.

I had already set up callback groups, but failed to make the battery_state callback group reentrant.

Single line fix to a problem plaguing my #GoPiGo3 #RaspberryPi4 #ROS2_KiltedKaiju #robot since he was "Humble-Dave" on #ROS2Humble

In ROS there is a URDF (Unified Robot Description Format) file, which allows the robot visualization application to show a symbolic robot, and allows the robot and joint state publishers to update where the robot's parts are located as the robot moves.

These files can be amazingly complex: getting the values correct so that every part of the robot appears in the correct place, and the sensor data is accurate, takes real effort.
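To illustrate why, here is a minimal made-up URDF fragment (not from Dave's actual file): a box body with one wheel, where every joint needs `<origin>` offsets and rotations that must match the physical robot exactly.

```xml
<?xml version="1.0"?>
<!-- Illustrative example only: a box body with a single wheel.
     Every offset in <origin> must match the physical robot for
     rviz2 and sensor transforms to be correct. -->
<robot name="demo_bot">
  <link name="base_link">
    <visual>
      <geometry><box size="0.2 0.15 0.05"/></geometry>
    </visual>
  </link>
  <link name="left_wheel">
    <visual>
      <geometry><cylinder radius="0.033" length="0.02"/></geometry>
    </visual>
  </link>
  <joint name="left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="left_wheel"/>
    <!-- wheel center: 8 cm left, 1 cm below body center, rotated to roll -->
    <origin xyz="0 0.08 -0.01" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
```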

Over the years I have created #GoPiGo3 #URDF files. Today the GoPi5Go-Dave file:

#ros2humble

Finally got around to diagramming ROSbot GoPi5Go-Dave's Software Architecture

#GoPiGo3 #Docker #RaspberryPi5 #RaspberryPiOS #Robot #ros2humble

Progress - GoPi5Go-Dave Robot Is Docking and Undocking 24/7 managed by ROS 2 nodes.

2024-08-26 19:16|dave_node.py| ---- GoPi5Go-Dave ROS Docking 928 : success at battery 10v after 2.5 h playtime

His stack:
- dave_node
- docking_node
- [slam_toolbox and nav2]
- [RTABmap]
- battery_node
- gopigo3_node
- odometer node
- ydLIDAR node
- oak_d_w_ros2 node
- teleop_joy node
- say_server node (Piper-TTS)
- safety_shutdown node
- I2C mutex

#GoPiGo3 #ROS2Humble #Robot #RaspberryPi5 #Docker #autonomous

@drfootleg
> ROS with real robot

Not a trained ROSer, but I figured it out step by step: a first robot node with keyboard driving, then joystick driving, adding LIDAR, a URDF for visualizing my bot in rviz2 on the desktop, and finally running SLAM on the bot.

I created a custom "ROS 2 GoPiGo3 Robot Node", and then a set of guided "tests" with shell script commands to help remember how to do things.

Perhaps some useful stuff:

https://github.com/slowrunner/ROS2-GoPiGo3/blob/main/Docs/Test_ROS2_GoPiGo3.md

and the node is at https://github.com/slowrunner/ROS2-GoPiGo3/blob/main/ros2ws/src/ros2_gopigo3_node/ros2_gopigo3_node/gopigo3_node.py

#GoPiGo3 #ros2humble


@themagpi

> Raspberry Pi projects?

Solved a month-long mystery!

How do you manage Raspberry Pi I2C device access inside a ROS2/Ubuntu/Docker container **AND** outside the container in Raspberry Pi OS Bookworm?

The Docker invocation must map both the I2C bus AND the mutex lock folder /var/lock/

#RaspberryPi5 #RaspberryPiOS
#I2C #Mutex #Docker #ROS2humble #Robot #GoPiGo3
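The invocation looks roughly like this (image name and bus number are placeholders for illustration, not my actual launch command):

```shell
# Map the I2C bus device AND /var/lock so the file-based mutex
# is the same file inside and outside the container.
docker run -it \
  --device /dev/i2c-1 \
  -v /var/lock:/var/lock \
  my_ros2_robot_image
```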

@drfootleg Another tip, from a problem that plagued me for a month!

If you have processes inside a Docker container and other processes outside the Docker container that need mutex protection such as I2C bus or SPI bus accesses, you need to map the bus (obviously) **and** the mutex folder!

I had the sensor access working great but could not figure out why my inside and outside "mutex protected" accesses were colliding...

#GoPiGo3 #Docker #RaspberryPiOS #mutex #ros2humble

Think I solved why my inside-Docker (ROS 2) access to the I2C INA219 current sensor was colliding with my outside-Docker access to the device - I had forgotten to map the /var/lock directory so the mutex is visible from both environments!

#Docker #mutex #ina219 #robot #GoPiGo3 #ros2humble
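The underlying idea, sketched with Python's advisory file locks (the lock file name here is hypothetical, and the actual GoPiGo3 software may use a different mechanism): the mutex is just a file, so both sides only honor it if they see the same file - hence the need to map /var/lock into the container.

```python
import fcntl
import os
import tempfile

# On the robot this would live in /var/lock so host and container share it;
# fall back to the temp dir here so the sketch runs anywhere.
LOCK_DIR = '/var/lock' if os.access('/var/lock', os.W_OK) else tempfile.gettempdir()
LOCK_PATH = os.path.join(LOCK_DIR, 'demo_i2c_1.lock')


def i2c_transaction(action):
    """Run action() while holding an exclusive advisory lock on LOCK_PATH."""
    fd = os.open(LOCK_PATH, os.O_CREAT | os.O_RDWR, 0o666)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)   # blocks until any other holder releases
        return action()
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)


# Usage: wrap every raw bus access, in every process, on both sides
reading = i2c_transaction(lambda: "read INA219 bus voltage")
```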

@drfootleg

> ROS2 in Docker on PiOS Bookworm for Pi5

Tip: Break the build into two Dockerfiles - 1) the long base build, and 2) the experimental build steps that will fail a few times (quickly).

Check out my robot's Docker build steps and Dockerfiles at
https://github.com/slowrunner/GoPi5Go/tree/main/config/docker

#GoPiGo3 #Docker #ros2humble #RaspberryPi5 #RaspberryPiOS
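A rough illustration of the split (package and image names are made up, not the actual GoPi5Go files): the base Dockerfile holds the slow, stable layers, and the second one holds the cheap, frequently-failing experiments.

```dockerfile
# Dockerfile.base -- the long, stable build; rebuild rarely
FROM ros:humble
RUN apt-get update && apt-get install -y \
    ros-humble-slam-toolbox \
    ros-humble-navigation2 \
    && rm -rf /var/lib/apt/lists/*
```

```dockerfile
# Dockerfile -- the experimental steps; fails fast and cheap
# (base tag is whatever you named the Dockerfile.base build)
FROM gopigo3_base:latest
COPY ros2ws /ros2ws
RUN cd /ros2ws && . /opt/ros/humble/setup.sh && colcon build
```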

Can We Learn From Insect Navigation to Make Better Autonomous Robots? - Lab Horizons

TU Delft researchers have drawn inspiration from ants' navigation techniques to develop an autonomous navigation strategy for tiny, lightweight robots that could revolutionise industries and research environments.
