Always nice to spend pi day at a robotics competition #omgrobots. Happy pi day!
Digital Dragons are going to Worlds! #omgrobots

The team has come up with an idea to manage power consumption in a centralized manner. Each mechanism, before doing anything, would estimate how much power it is going to need in this iteration and reserve that amount at a central location. The kids are now working out an allocation algorithm that would process such reservation requests and distribute the power fairly according to the mechanisms' importance.
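One way the reservation idea could be sketched is weighted water-filling: each mechanism requests power, the allocator splits the budget proportionally to importance weights, caps each mechanism at what it asked for, and redistributes leftovers. This is a hypothetical sketch (mechanism names, weights, and units are made up), not the team's actual algorithm:

```python
def allocate(requests, weights, budget):
    """Split a power budget across mechanisms proportionally to their
    importance weights, never giving a mechanism more than it requested.
    Leftover budget from satisfied mechanisms is redistributed among
    the rest (weighted water-filling)."""
    alloc = {m: 0.0 for m in requests}
    remaining = dict(requests)
    active = set(requests)
    while active and budget > 1e-9:
        total_w = sum(weights[m] for m in active)
        spent = 0.0
        satisfied = set()
        for m in active:
            # Proportional share of what's left, capped at the request.
            give = min(budget * weights[m] / total_w, remaining[m])
            alloc[m] += give
            remaining[m] -= give
            spent += give
            if remaining[m] <= 1e-9:
                satisfied.add(m)
        active -= satisfied
        budget -= spent
        if spent <= 1e-9:  # nothing moved; avoid spinning
            break
    return alloc

# Example: 160 W requested, only 120 W available.
result = allocate(
    {"drive": 60.0, "shooter": 80.0, "intake": 20.0},
    {"drive": 3.0, "shooter": 2.0, "intake": 1.0},
    budget=120.0,
)
```

With these made-up numbers the drivetrain and intake get their full requests (their proportional shares already cover them) and the shooter absorbs the shortfall; when the budget exceeds total demand, everyone is fully satisfied.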

#OMGRobots #firstroboticscompetition #frc #rebuilt

#Brownout is going to be a major issue in #frc robots this 2026 season, as teams will try to overpower their shooter flywheels with hefty Kraken/Falcon motors on top of already power-hungry swerve drivetrains.
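A back-of-envelope model of why this bites: bus voltage sags roughly as V_open - I_total * R_total, where R_total lumps battery internal resistance plus wiring, and the roboRIO's brownout protection kicks in around 6.8 V. The resistance and current figures below are assumptions for illustration, not measurements:

```python
BROWNOUT_VOLTS = 6.8  # approximate roboRIO brownout protection threshold

def bus_voltage(v_open, r_total_ohms, i_total_amps):
    """Predicted bus voltage under a total current draw, using a simple
    V = V_open - I*R sag model (R lumps battery + wiring resistance)."""
    return v_open - r_total_ohms * i_total_amps

def browns_out(v_open, r_total_ohms, i_total_amps):
    return bus_voltage(v_open, r_total_ohms, i_total_amps) < BROWNOUT_VOLTS

# Hypothetical burst: eight swerve motors near 40 A plus two flywheel
# motors near 60 A is ~440 A. A 12.5 V battery with 0.015 ohm total
# resistance sags to 12.5 - 0.015*440 = 5.9 V -- below the threshold.
sagged = bus_voltage(12.5, 0.015, 440.0)
```

Even generous assumptions land below 6.8 V on a full-send burst, which is exactly why per-motor current limits (or a central power allocator) matter.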

/cmv

#OMGRobots #firstroboticscompetition

Y'ALL THIS GAME IS SO GOOD I'M SO HYPE #omgrobots #frc2026

2026 FIRST Robotics Competition REBUILT Presented by Haas Game Animation (YouTube)

#OMGRobots!

This amazing #firstRobotics team of exceptionally bright kids is confident they can succeed in the upcoming 2026 competition season. But they need some help to participate in two tournaments in #Minnesota.

I vouch for this incredible #STEM program called #FIRST that teaches kids not only how to build robots but also how to innovate, solve problems, manage a budget, communicate, and apply many other skills useful in real life.

This team is run by high school students, their parents, and volunteer mentors, with no financial support other than donations. Please consider #giving some help to these awesome kids this #thanksgiving season. Or just BOOST this post; that will help too. THANK YOU!

https://www.givemn.org/story/hjbcig
The recipient is a 501(c)(3) non-profit. Donations are tax-deductible. The tax ID is in the profile on this #giveMN website.

#givingtuesday #twincities #mnastodon #givetothemaxday

Support Regional Competition Entry Fees on GiveMN

Your generous donation will allow us to compete in two regional, 3-day competitions. Thank you!

blogged about top skills/attributes for a new programmer after a student on the robotics team asked my opinion https://www.selikoff.net/2025/11/08/more-important-skills-traits-in-becoming-a-computer-programmer/ #omgrobots
More important skills/traits in becoming a computer programmer | Down Home Country Coding With Scott Selikoff and Jeanne Boyarsky

One of the students on the FIRST robotics team I mentor asked, "What do you think is the most important skill or trait in becoming a successful computer programmer?" I wrote a paragraph about persistence, problem solving, and attention to detail, commenting that I'd choose problem solving if I could only pick one. I also immediately thought…


Context: #FRCRobot, #AprilTags, Camera mounted on the #robot

Problem: Can we find the translation and rotation of the camera relative to the robot chassis programmatically, in a fully or at least partially automated way?

Idea: the #OpenCV library has a powerful calibration subroutine called "hand-eye calibration" which can calculate the relationship between the "hand" and the "eye" in a setup where the hand holds the eye in a fixed manner and moves relative to a "base". The camera (the eye) can see a target that is also fixed relative to the base. Can we use it on our robot, with the eye rolling around a field that has a fixed AprilTag configuration?

Let's set the terminology. The "eye" is the camera. The camera is mounted on the robot, so the robot is the "hand" holding the "eye". The robot can move across the field freely, and we can track this movement using the robot's odometry, so the field is the "base". And the "target", the AprilTag, is mounted on the field/base.

This hand-eye calibration function requires two sets of measurements as inputs: eye-to-target transformations and base-to-hand transformations. Each transformation is a translation vector and a rotation matrix. In our case "eye-to-target" is camera-to-target, which we can obtain from #PhotonVision (in OpenCV's naming this is R_target2cam/t_target2cam, since it maps target-frame points into the camera frame). And "base-to-hand" is the robot pose on the field, which we can get from the drivetrain odometry (OpenCV's R_gripper2base/t_gripper2base). The calibration result will then be hand-to-eye, the robot-to-camera transformation (OpenCV's R_cam2gripper/t_cam2gripper).

Should work, right?

#omgRobots

I was looking at a volunteer report and it says I've volunteered at 99 official FIRST events over 17 years. That's a cool stat! #omgrobots