# TB5-WaLI Home Tour Success (x3)

==== 3/31/26 wali_tours test ====
Successfully performed three 10-stop wali_tours of about 7 minutes each (including successful recoveries)

Dock, Set_Pose_Docked, Undock,
Drive/Turn to "Ready Position"

Nav to front_door, couch_view, laundry, table, dining, kitchen, patio_view, office, hall_view, ready

Dock
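The 10-stop tour above can be sketched as a named-waypoint table handed to Nav2 one stop at a time. A minimal sketch — the coordinates are hypothetical stand-ins, and the real wali_tour node would load its saved poses and send each one as a Nav2 goal:

```python
import math

# Hypothetical map coordinates for the 10 tour stops: (x, y, yaw-in-radians).
# The real wali_tour node would load these from a saved-poses file.
TOUR_STOPS = {
    "front_door": (3.2, 0.5, 0.0),
    "couch_view": (2.8, 1.9, math.pi / 2),
    "laundry":    (4.1, 3.0, math.pi),
    "table":      (2.0, 3.4, 0.0),
    "dining":     (1.2, 2.8, -math.pi / 2),
    "kitchen":    (0.4, 2.2, math.pi),
    "patio_view": (-0.8, 2.5, math.pi / 2),
    "office":     (-1.5, 1.0, 0.0),
    "hall_view":  (0.6, 0.9, -math.pi / 2),
    "ready":      (0.3, 0.0, 0.0),
}

def tour_goals(order):
    """Yield (name, x, y, yaw) tuples for each stop, in tour order."""
    for name in order:
        x, y, yaw = TOUR_STOPS[name]
        yield name, x, y, yaw

# In the real node, each goal would be sent to Nav2 (for example via
# nav2_simple_commander's BasicNavigator.goToPose) and awaited before
# moving on to the next stop, with Dock/Undock bracketing the tour.
```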

#ROS2Nav2 #TurtleBot4 #RaspberryPi5

The navigation, localization, TurtleBot4, wali, and wali_tour nodes consume about 35% CPU when not navigating, and 75% CPU while navigating
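One way to reproduce that CPU figure is to sum the %CPU column of `ps aux` for the node processes. A sketch — the process names and the sample numbers below are illustrative, not actual measurements:

```python
def node_cpu_percent(ps_output, node_names):
    """Sum the %CPU column (third field of `ps aux` lines) for
    processes whose command line matches any of the given names."""
    total = 0.0
    for line in ps_output.splitlines():
        fields = line.split()
        if len(fields) >= 3 and any(name in line for name in node_names):
            total += float(fields[2])
    return total

# Canned `ps aux`-style lines (values are illustrative only).
# Live, you would capture the output with:
#   subprocess.check_output(["ps", "aux"], text=True)
SAMPLE = """\
pi  1201 22.5  3.1 ... /opt/ros/jazzy/lib/nav2_controller/controller_server
pi  1202 18.0  2.7 ... /opt/ros/jazzy/lib/nav2_planner/planner_server
pi  1203 12.5  2.0 ... /opt/ros/jazzy/lib/nav2_amcl/amcl
pi  1210  9.0  1.1 ... /home/pi/wali_ws/install/wali/lib/wali/wali_tour
"""
```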

Ugh - ROS 2 Nav2 Testing (with default planners and critics - just parameter tweaks)

Managing to navigate successfully along open paths, but choke points fail on the first attempt, then succeed on the second ask.

Ah, but the laundry room - robot sometimes needs human assistance. Perhaps "intentional failures to prevent being assigned laundry duty".
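That "fails, then succeeds on the second ask" pattern can be wrapped in a simple retry helper, so a choke-point failure triggers one automatic re-send before the tour gives up. A sketch — `send_goal` is a hypothetical stand-in for the actual Nav2 goal call (e.g. an action-client request):

```python
def navigate_with_retry(send_goal, stop_name, max_attempts=2):
    """Call send_goal(stop_name) up to max_attempts times.
    Returns (succeeded, attempts_used)."""
    for attempt in range(1, max_attempts + 1):
        if send_goal(stop_name):
            return True, attempt
    return False, max_attempts

# Example: a fake goal sender that fails the first ask at the
# choke point, then succeeds on the second (like the real robot).
_calls = {"laundry": 0}
def fake_send_goal(stop):
    _calls[stop] += 1
    return _calls[stop] >= 2
```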

#Ros2Nav2 #TurtleBot4 #RaspberryPi5 #AutonomousRobots

Extensive ALT text on photo

# Week Later Google Gemini Has Fried My Brain

Never ask an AI for help - they will help you to go crazy.

The story of asking Gemini to help optimize ROS 2 Jazzy Nav2 for my TurtleBot4 / Raspberry Pi 5 robot TB5-WaLI

https://forum.dexterindustries.com/t/years-asking-humans-and-google-finally-google-gemini-answered/10759/2?u=cyclicalobsessive

#GoogleGemini #ROS2Nav2 #TurtleBot4 #RaspberryPi5

Years Asking Humans and Google, Finally Google Gemini Answered

I sic’d Google Gemini on my ROS Nav2 issues:
- planning going too close to walls and bar stools, causing goal failures
- “Out of Map Bounds” errors after tile-rug transitions

Gemini would recommend something, and when I would tell it “that didn’t work”, it would respond “right, you are using ROS Jazzy, which works differently, try this next idea”. Or I would point out that it first told me one thing and then told me the opposite, and it would tell me "that is ver...


Google Gemini Thinks It Knows ROS Secrets

In July 2022, Clearpath published a video of a #TurtleBot4 delivering donuts.

Try as I might, I could not get my TB5-WaLI to reliably deliver donuts, until I asked Google Gemini for help.

It solved my navigation reliability problems, but when I asked about the donut delivery video it confidently gave me some bad advice.

No Donut Delivery To Google!

https://github.com/turtlebot/turtlebot4/discussions/517#discussioncomment-16193022

#GoogleGemini #ROS2Jazzy #ROS2Nav2 #VibeTuning

Oh no, my #ROS2Jazzy #robot discovered #MoonDream #VisionAssistant (v0.0.6 with local model #Python API) and found an image of "a bunch of models enjoying each others company sitting on a ledge" in its filesystem.

I hope it doesn't go all obsessive and start googling them.

Certainly not letting it near social media.

#TurtleBot4

My dream has been to have my robot wander (using vSLAM), taking pictures of “unknown interesting objects” (<- the hard part), then segmenting and learning the objects for #OpenWorldObjectDetection

(All local processing for privacy)

#MastersThesisIdea use #GoPiGo3 or #TurtleBot4 robot with #Oak-D #StereoDepthCamera, #ROS 2 distributed processing for #RTABmap #vSLAM and rejected RTABmap frames (unknowns) for #UnknownObject Detection and incremental model #MachineLearning

Reference Paper in comment ->

Done It - No More FOMO

Added a Raspberry Pi M.2 HAT+ to my Pi 5 running Pi OS Trixie and rpi-cloned/raspi-config'd it to boot from the NVMe SSD.

I can still back up the SSD to the SD Card for a while, but I guess eventually I'm going to have to back up to my 256GB USB stick.

This is a dry run for an eventual NVMe SSD boot on the TB5-WaLI robot.

#RaspberryPi5 #nvme_ssd_m2 #PiOS_Trixie #turtlebot4

Playing around with "knowledge transfer" to my #Robots

I told each "I need to go take a shower. Talk to you later."

RPi4 #GoPiGo3 robot Dave with tinyllama responded: "I understand how important your shower is... Enjoy the shower!"

RPi5 #TurtleBot4 robot WaLI with Gemma responded: "Acknowledged. Initiating shower sequence. Requesting allocation of cleaning supplies..."

I hope I don't get a "We shipped your soap" email from Amazon.

#ollama #tinyllama #Gemma3

Looking for opinions about OpenMind OM1 as a source for reusable #Robot_Intelligence. I run 3 robots - 2 #GoPiGo3 robots and a #TurtleBot4 (WaLI - Wallfollower Looking for Intelligence).

Robots need a way to share and “inherit” knowledge and abilities. OM1 is an open-source, robot-domain-transferable “brain” based on a trained #LLM. I don’t know how to evaluate the usefulness of the model’s knowledge, or how much NLU-to-TurtleBot4 interface code I will have to write to use #OpenMind_OM1.

#Create3 #IRsensor SLAM - First Attempt

My ir2scan node is working well, but setting the 52 #slam_toolbox params is a mystery.
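Of the 52, a hedged guess at the handful worth tweaking first for a short-range IR "scan". The parameter names are real slam_toolbox parameters, but the values below are assumptions for the Create3's few-ray, short-range sensors, not a verified config (in a launch file they would go in the `parameters=[...]` list, or in the slam params YAML under `slam_toolbox/ros__parameters`):

```python
# Hypothetical starting values for a Create3 IR "scan" - tune from here.
IR_SLAM_PARAMS = {
    "max_laser_range": 0.4,          # IR sees ~0.4 m, far less than a LIDAR
    "resolution": 0.025,             # finer grid to make use of close returns
    "minimum_travel_distance": 0.1,  # update often; each scan has few points
    "minimum_travel_heading": 0.17,  # ~10 degrees, in radians
    "scan_buffer_size": 20,          # buffer more scans; each one is sparse
    "do_loop_closing": False,        # loop closure is unlikely with 7-ray scans
}
```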

My create3_navigation slam.launch.py (using #Turtlebot4 LIDAR params modified for the Create3 IR "distance" sensor scan), with Wali facing a wall, maps a wall!

(though it bounces around in angle in front of the bot...)

As I proceed to drive Wali around the room, more walls are mapped, enclosing a "known to be open" space, but not where the sensors and #odometry put them.
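For reference, the intensity-to-range conversion at the heart of an ir2scan node can be sketched as a simple power-law falloff. The calibration constants below are hypothetical; real ones would come from fitting measured distances against the Create3's /ir_intensity readings:

```python
# Hypothetical calibration: estimated range falls off with IR intensity.
# A and B would come from fitting measured distances vs. /ir_intensity values.
A, B = 1.2, 0.45
MAX_RANGE = 0.5      # meters; clamp estimates to the sensor's useful reach
MIN_INTENSITY = 10   # below this, treat the reading as "no return"

def ir_to_range(intensity):
    """Convert one IR intensity reading to an approximate range (meters)."""
    if intensity < MIN_INTENSITY:
        return float("inf")            # too weak: nothing in range
    return min(A * intensity ** -B, MAX_RANGE)

def intensities_to_ranges(intensities):
    """Convert the 7 Create3 IR readings into a sparse 'scan' of ranges."""
    return [ir_to_range(i) for i in intensities]
```

With a fit like this, the jitter in raw intensities maps directly into the "bouncing around in angle" seen in the mapped wall, so smoothing or averaging readings before conversion may help.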