Years Asking - #ROS Issue Finally Solved

For years I have asked every human who would listen why my ROS nodes complain about a shutdown exception. (And I mean **yeeeeeears**.)

Finally, asking #GoogleGemini revealed the pattern that should have been in the ROS tutorials all along.

(I have a feeling the "if rclpy.ok():" guard is actually all that is needed. I have not tested it sufficiently yet.)

https://github.com/turtlebot/turtlebot4/discussions/517#discussioncomment-16121153

#Ros2Jazzy #TurtleBot4lite #Vibe_Coding

What Does A Robot See In The Mirror?

I couldn't resist comparing the #MoonDream 0.5b and 2b models on images of Dave and Wali.

The 2b model recognized Dave as a Minion character and as a robot.

The 2b model recognized that Wali is a robot but did not recognize the WALL-E character.

The 0.5b model simply hallucinates stuff, just like local Ollama models did for text queries.

https://forum.dexterindustries.com/t/moondream-vision-language-assistant-on-gopigo3-robot-kilted-dave/10750/2?u=cyclicalobsessive

#MoonDream #GoPiGo3 #TurtleBot4lite #Robot #Vision #LanguageModels #RaspberryPi5_8gb

Just had to try the #MoonDream Vision Language Assistant on my 4GB #RaspberryPi4 #GoPiGo3 #robot and my 8GB #RaspberryPi5 #TurtleBot4lite robot:

https://forum.dexterindustries.com/t/moondream-vision-language-assistant-on-gopigo3-robot-kilted-dave/10750?u=cyclicalobsessive

TL;DR: The Pi5 with the 2B model might be useful, but the 0.5B model is not useful on either the Pi4 or the Pi5.


@pythonhub

My #RaspberryPi5 #TurtleBot4lite robot WaLI (Wallfollower Looking for Intelligence) could really use an “off the shelf” mind!

First test of my new #ROS2 #Galactic #TurtleBot4lite power system.

This was the first discharge-to-20%-then-recharge cycle in the "as delivered" config.

Summary:

- Max operating time off the dock (to <20% charge): about 1.75 hours
  (no motion; just publishing Create3 topics, LiDAR scan, Oak-D-Lite color image, stereo 3D cloud, localization, and mapping)

- Recharge time: ~2 hours

- Battery "full": percentage = 1.0

- If the bot stays on the dock after full:
  - it drains ~9 min to 95%,
  - then recharges ~11 min to 100%

Choosing a name for my new #TurtleBot4lite

Trying out:
WaLI - Wall follower Looking for Intelligence

Wouldn't you think they would include a TurtleBot4 logo sticker (or two)?

Yo #ClearPathRobotics: It's #ROS2 and a #TurtleBot4 - it needs a logo on it.

Am I the only person who did not know SSIDs are case sensitive?

Had one chance to set up my new #Turtlebot4lite Raspberry Pi4's WiFi without using wired Ethernet, and I blew it.

Luckily, I had a laptop I could set up with a static IP, and luckily the drawer with the RPi4 can be opened just enough to plug in an Ethernet cable.

Step 1, connecting the TB4lite RPi4 to 5GHz WiFi: done.

Now to connect the TB4lite's Create3 base to my 2.4GHz WiFi.

>"What's on your mind?"

My #TurtleBot4lite arrives today! I participated in the #iRobot #Create3 beta (via #simulation; they didn't gift me hardware), so I am familiar with the #robot's lower-level #topics, #services, and #actions.

My #ROS2 #Galactic #Ignition #Gazebo crashes loading the default #TB4 world.

And the TB4 comes up half embedded in an obstacle in the maze world, but I managed to #dock, #undock, #wallfollow, and drive the TB4lite around with a #teleop_twist_keyboard node.

Any #ROS2 #TurtleBot4lite owners haunting this embodiment of the ether?