With two days to go until competition day (only one day for schools and clubs!), I thought I’d capture the status of Mini Mouse, for posterity and for something to compare against in a Post-Competition Post-Mortem.
I’m quite happy with where I’ve got to. The autonomous challenges are all working mostly reliably in practice runs, and the performance seems to be competitive. Of course there’s no guarantee that it’ll perform “on the day”, but I’ve done all I can at this point!
As I’ve mentioned previously, I ran into some difficulties with my camera and software, causing a significant Blast Off “regression”. I went back to the drawing board and implemented a new algorithm, which is a little smarter.
Now, I find the line points as before, but then use the “closest” and “furthest” visible points to calculate the equation of a straight line: y = mx + c. In this case, y is actually the horizontal axis of the image. m gives the gradient of the line, which I want to be as close to zero as possible. c gives where the line crosses (or would cross) the top of the image, which I want to be as close to the middle as possible. So, I use c to decide whether to turn left or right, and m to determine how much to slow down (the larger the m value, the less “straightly” I’m following the line, and so I need to slow down to reorient).
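As a rough illustration of the idea (the function names, thresholds and speed formula here are my own invention, not the actual Mini Mouse code), the fit and the resulting steering decision might look something like this:

```python
def fit_line(near, far):
    """Fit y = m*x + c through the closest and furthest line points.

    Points are (row, col) image coordinates: x is the vertical (row)
    axis and y is the horizontal (column) axis, as described above.
    """
    (x1, y1), (x2, y2) = near, far
    m = (y2 - y1) / (x2 - x1)  # gradient: 0 means the line runs straight "up" the image
    c = y1 - m * x1            # column where the line crosses the top of the image (row 0)
    return m, c

def steer(m, c, image_width, max_speed):
    """Turn based on c, slow down based on m (illustrative formulas)."""
    error = c - image_width / 2       # negative: line exits to the left, so turn left
    speed = max_speed / (1 + abs(m))  # steeper line -> slow down to reorient
    return error, speed
```

The nice property of this approach is that the two outputs are decoupled: c handles "where do I point?" and m handles "how fast dare I go?".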
I also added some buttons on the controller to let me pick a maximum speed at runtime, so I can pick how brave I’m feeling on the day.
It’s working pretty well, and seems quite a bit less fragile than the previous approach (sorry for the shaky camera):
Nothing significant has changed in my maze implementation for quite some time. I had to recalibrate it to account for the great camera disaster, but otherwise everything’s the same: drive until the camera says we’re close to a wall, turn 90 degrees, repeat.
I ran into an issue during practice where a bump in the floor (a join between two wooden boards) would lift the front of the robot, moving the horizon down and making the robot think it was close to a wall. I’ve no idea how well joined the real maze will be, so I’ve implemented a workaround where the code makes sure that the horizon is “low” for several frames in a row before turning. This slows down the response to real walls, so whilst the code is deciding whether a wall is real or not, the robot slows to a crawl so as not to drive into one.
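The workaround is essentially a debounce: only treat the wall as real once the “horizon low” condition has held for several consecutive frames. A minimal sketch of that idea (the frame threshold and class shape are invented for illustration):

```python
CONFIRM_FRAMES = 5  # consecutive frames required; an invented value

class WallDebounce:
    """Only confirm a wall after the horizon stays low for several frames."""

    def __init__(self):
        self.low_count = 0

    def update(self, horizon_is_low):
        """Call once per camera frame; returns (slow_down, turn_now)."""
        if horizon_is_low:
            self.low_count += 1
        else:
            self.low_count = 0  # a bump only lifts the horizon briefly
        slow_down = self.low_count > 0              # crawl while deciding
        turn_now = self.low_count >= CONFIRM_FRAMES  # wall confirmed
        return slow_down, turn_now
```

A momentary bump resets the counter, so the robot slows briefly but never turns; a real wall keeps the condition true until the turn fires.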
As with Blast Off, I’ve added speed selection, so that at the start of the challenge I can choose how bold to be.
The video below shows a representative run of the code. The practice course isn’t the same size as the real thing, but the sequence of turns is the same.
I’ve made a few minor refinements to the Hubble code. Namely:
- On the turn to the first corner, it wasn’t always turning the smallest amount possible; I’ve fixed that.
- I made the distances driven more accurate by taking the size of the robot into account, which should hopefully make it enter the zones more reliably.
- Speed is selectable, the same as for the two challenges above.
It’s working pretty well. I did see one run where it mixed up blue and green, but that’s the first and only time I’ve seen it happen. I didn’t have the illumination lights turned on, so I’m hoping that with good lighting it won’t happen on the day!
The video below shows a full run, including identifying the corners.
This is the only one of the remote-controlled challenges I can effectively practice. Even then, I don’t know what the real targets are like; I can only hope they fall down easily!
The dart guns are a little variable, but I was able to knock down at least 3 targets in my practices.
The laser was previously on a 10-second timeout, which proved to be too short. I’ve increased it to 20 seconds, because re-activating it means wiggling a trigger, and it’s very easy to fire accidentally.
Video below, getting lucky with a 2-in-1! A full salvo of 5 darts, with reloads, is taking somewhere in the neighbourhood of 30 seconds, which should be fine.
One more thing…
Well, a few more things. To help with my terribly low ground clearance, wimpy motors and small wheels, I’ve added an articulated bulldozer scoop. This serves two purposes: it can be used to move obstacles out of the way, and it can also help the robot climb over obstacles which it otherwise wouldn’t be able to, as shown in the video below. I’m hoping this will help with the obstacle course.
Sadly, it never occurred to me until too late that it could also be helpful on The Spirit of Curiosity, and unfortunately I can’t figure out a way to have both the sample basket and the scoop attached at the same time. I’ll just have to hope for the best on the rough terrain.
I also chopped up some foam for a new carrying case. With the different attachments being quite fragile, I wanted a safe way of transporting everything around:
Lastly, as I mentioned on my application form, I finally got around to implementing a MIDI file player, using the stepper motors. Mini Mouse can sing (and dance)! There are no speakers on board; all of the sound is coming from the motors themselves.
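The core trick behind motor music is simple: each MIDI note number maps to a frequency via the standard equal-temperament formula, and you step the motor at that many steps per second so the motor itself whines at the right pitch. A sketch of the mapping (the `set_step_rate` callback is a stand-in for whatever the real motor driver exposes, not an actual API):

```python
def midi_note_to_hz(note):
    # Standard equal-temperament mapping: MIDI note 69 is A4 = 440 Hz
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def play_note(set_step_rate, note):
    """Drive a stepper at the note's frequency.

    set_step_rate is a hypothetical driver callback taking a rate in
    steps per second; stepping at the note's frequency makes the motor
    emit that pitch.
    """
    set_step_rate(midi_note_to_hz(note))
```

With two motors, each can take its own MIDI track, which is where the “dancing” comes from for free: playing notes moves the wheels.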
Final, final preparations
Before Sunday I have a couple of minor things left to do. One is to make sure everything is charged up, and to make sure I can access the robot over my portable WiFi hotspot. I shouldn’t need a network connection if everything works as expected, but just in case I need to make some last-minute software tweaks, I want to have the option.
Another is to build a new set of wheels. They’re already printed; I just need to fit the metal inserts and tyres. The current set are just plain worn out!
I’ve also got to cut a couple of new holes in the foam in the carry case to hold some extra bits.
Lastly, good luck to everyone competing at the weekend — I look forward to meeting you all in person!