This thing is one sexy, sexy beast of a robot. At least when you're coming from a Rovio. For one, it is accurate; for another, it is responsive. (Did I mention the sensors are accurate too?)

But, onto the meat of this encounter…

For this little tidbit, we had to take this thing with ROS and get it to “fixate” on three colored dots that we printed out. The greatest challenge was convincing this bot that blue was blue under the shifting light in the room (it kept classifying blue as green). But all in all, once we got the camera calibrated, it worked like magic.

It worked very, very well, and I was very happy with this assignment. As an experiment, I took hold of the aft of the robot's cowling and pulled it back a hair, and it had… amazing stickiness to that colored dot. By that, I mean it sprang right back to where it should be. We could also lead it about by moving the dots (i.e., carrying them). All in all, this was a fun bot to work with.


3Pi Robot, A Reflection:

These past few weeks, I've been playing with the 3pi robot for CS3630. It was a fun experience, but when it came down to testing, it didn't quite work out. And of course, there are always a few things to say about the experience.

First off, the build design was done without really testing the robot in person, mostly because I didn't have the maze-following code finished in time to be used during the competition.

The design was to create a robot that used DFS to find the shortest path through the maze. It did this by building a representation of the maze in memory (all 2 kB of it), holding only visited, left, right, straight, and parent information. It also relied ~very~ heavily upon sensors, and in doing so avoided depending on timing loops.
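The representation above can be sketched as a packed per-node record. This is a hypothetical reconstruction, not the actual code from the project: the struct name and field names are my own, and the layout just assumes the visited/left/right/straight flags plus a one-byte parent index mentioned above.

```cpp
#include <cstdint>

// Hypothetical sketch of one maze node: a parent index plus four
// one-bit flags, packed into two bytes. Names are assumptions.
struct MazeNode {
    uint8_t parent;        // index of the node we backtrack to (0-255)
    uint8_t visited  : 1;  // has DFS expanded this node yet?
    uint8_t left     : 1;  // exit to the left?
    uint8_t right    : 1;  // exit to the right?
    uint8_t straight : 1;  // exit straight ahead?
};

// Two bytes per node keeps the memory math honest.
static_assert(sizeof(MazeNode) == 2, "node should pack into 2 bytes");

// 256 nodes x 2 bytes = 512 bytes, leaving the rest of the
// AVR's 2 kB of RAM for the stack and everything else.
static MazeNode maze[256];
```

Packing the flags as bit fields is one way to keep 256 nodes well under the 2 kB budget the post worries about; the compiler still rounds the struct up to whole bytes, so two bytes per node is about as tight as it gets without manual bit twiddling.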

Downfalls, of course, are that we don't record x-y coordinates of nodes, and memory concerns (256 total nodes in a perfect heap, and there's no way I'm getting all 2 kB for these elements in the heap). It also required writing a C++ library for the bot, which attempted to mirror the Java library used for the simulator (the rest, of course, just somehow magically working because of Processing for the most part). And unfortunately, despite how close the two languages are, they do not translate 1 to 1, even if I avoid using their libraries.

The DFS itself ran on the preference L/R/S, which of course is a bit strange, but made no difference in practice.
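The L/R/S preference rule boils down to a few lines: at each intersection, take the first available exit in the order left, right, straight, and backtrack when none exist. This is a minimal sketch of that rule under my own names (`Move`, `chooseMove`), not the project's actual code.

```cpp
// Possible moves at an intersection, plus backtracking at a dead end.
enum Move { LEFT, RIGHT, STRAIGHT, BACK };

// Apply the L/R/S preference described above: left beats right
// beats straight; with no exits at all, go back toward the parent.
Move chooseMove(bool left, bool right, bool straight) {
    if (left)     return LEFT;      // preferred first
    if (right)    return RIGHT;     // then right
    if (straight) return STRAIGHT;  // straight last
    return BACK;                    // dead end: backtrack
}
```

Since the rule is a fixed total order over exits, any permutation of L/R/S explores the whole maze; the preference only changes the order nodes are visited, which matches the post's note that it made no difference in practice.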

The downfall of the bot was our speed. Because I spent all my time coding, I didn't get time to test things (seeing as we had very limited access to the bot itself), and didn't get to verify that settings that worked in the simulator also worked in Real Life Conditions. And because we turned all our speeds way down to give the sensors more time to acquire and process data as we moved, it turned out the bot outright didn't move when the code was brought into the real world. Even after a quick fix, our turns were off, and for some reason, instead of turning on a dime, it made arcs (and arcs are subpar for what we were doing).

So, short story short: what works in the simulator doesn't work in the lab.

Link to the Codebase: GITHUB