Self-Guided RC Car AI

This is the wiki page for the Self-Guided RC car that TP is working on.

SGRC2 Early Forward.jpg

Overview

Background

This project is a continuation of my Self-Guided RC Car project, but enough has changed that I felt it deserved a new page.


The Vehicle

"Standard issue cheap piece of junk from Radio Shack."


The car has two simple DC motors. The rear drive motor has a High/Low gearbox; the steering assembly is spring-loaded to return to center when the steering motor is not powered.

Control

Controller

In all honesty I've been procrastinating on this project, waiting to find a spare couple of dollars to buy an Arduino to control it with, but the girlfriend somehow led me to remember the quote "You go to war with the army you have," so I'm not waiting anymore (we were actually talking about this very project, if you can believe it). I just happen to have a Z-World JackRabbit lying around, and I even know where the programming cable is.


The biggest disadvantages I can see for the JackRabbit compared to the Arduino are:

  • No pulseIn() function. This means I'll need to do the timing for my accelerometer readings the hard way (a rough sketch of what that might look like is below).
  • The headers are difficult to work with and I only have one (1) ribbon cable for I/O. Not really a big deal, except of course that the digital I/O and the ADC pins are on separate headers (did I say "of course"?).
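
Since there's no pulseIn(), a pulse-width read has to be pieced together from a digital input and a free-running timer. Here's a rough sketch of that in plain C - read_pin() and read_timer_us() are hypothetical stand-ins for whatever the controller actually provides, not real JackRabbit calls:

    /* Hand-rolled pulseIn(): measure how long a pin stays high.
       read_pin() and read_timer_us() are hypothetical placeholders. */
    extern int read_pin(int pin);                 /* 0 or 1             */
    extern unsigned long read_timer_us(void);     /* free-running usec  */

    unsigned long pulse_in_high(int pin, unsigned long timeout_us)
    {
        unsigned long start;
        unsigned long t0 = read_timer_us();

        /* let any pulse already in progress finish */
        while (read_pin(pin))
            if (read_timer_us() - t0 > timeout_us) return 0;

        /* wait for the rising edge */
        while (!read_pin(pin))
            if (read_timer_us() - t0 > timeout_us) return 0;

        /* time the high portion */
        start = read_timer_us();
        while (read_pin(pin))
            if (read_timer_us() - start > timeout_us) return 0;

        return read_timer_us() - start;           /* width in microseconds */
    }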

H-Bridge

Each motor is controlled by a transistor H-bridge, which allows a logic signal from a controller or something to turn the motor in either direction. I had meant to use power MOSFETs, but the nearest surplus electronics store was out of the parts I wanted so I ended up with BJT Darlingtons.

Chuck's Robotics Notebook has a nice tutorial on this kind of H-bridge, and my bridges are exactly as shown on his page except for the transistor parts (I'm using TIP30s for the PNPs and 2N6387s for the NPNs). Since I'm now going to be using a controller I actually care about, my complete lack of motivation to use opto-coupling has waned considerably.
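
From the software side, each bridge just needs two logic lines - one per diagonal pair of transistors. A minimal sketch of the control logic (set_pin() and the pin numbers are placeholders, not actual controller calls):

    /* One H-bridge, two logic lines.  Pin names are made up. */
    extern void set_pin(int pin, int level);

    #define BRIDGE_A 1   /* turns on one diagonal pair of transistors */
    #define BRIDGE_B 2   /* turns on the opposite diagonal pair       */

    void motor_forward(void) { set_pin(BRIDGE_A, 1); set_pin(BRIDGE_B, 0); }
    void motor_reverse(void) { set_pin(BRIDGE_A, 0); set_pin(BRIDGE_B, 1); }
    void motor_coast(void)   { set_pin(BRIDGE_A, 0); set_pin(BRIDGE_B, 0); }
    /* Never drive both lines high at once - that shorts the supply
       straight through the transistors. */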


Sensors

Obstacle Avoidance

I am switching from two fixed IR sensors to a single sensor mounted on a hobby aircraft servo. I got a box of these servos from Hobby King super cheap (under $4 apiece?!), but the shipping costs from HONG KONG kinda necessitate a large order to really make it worth it. Since I won't be able to use the ADC pin of my controller without a second ribbon cable, I'll still be using the op-amp comparators from the old project, but in a slightly different way. Before, I had one (1) trigger level on two (2) sensors - now I'll have two (2) trigger levels on one (1) sensor, so I can distinguish between Close(11), Medium(01), and Far(00).
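
Reading the two comparator outputs back as a 2-bit distance class could look something like this (read_pin() and the pin numbers are placeholders; the bit values follow the Close(11)/Medium(01)/Far(00) scheme above):

    /* Decode the two comparator bits into a distance class. */
    extern int read_pin(int pin);

    #define TRIG_NEAR 3   /* comparator set to the higher (close-range) threshold */
    #define TRIG_FAR  4   /* comparator set to the lower (far-range) threshold    */

    enum distance { FAR = 0, MEDIUM = 1, CLOSE = 3 };

    enum distance read_ir_distance(void)
    {
        int bits = (read_pin(TRIG_NEAR) << 1) | read_pin(TRIG_FAR);
        if (bits == 3) return CLOSE;    /* both comparators tripped (11) */
        if (bits == 1) return MEDIUM;   /* only the far trigger     (01) */
        return FAR;                     /* neither tripped          (00) */
    }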

Learning Feedback

With the goal of creating an AI decision-making routine, the car is going to need a source of pain. A 2-axis accelerometer (pulsed output... argh) I bought years ago will provide this crucial learning element. No pain, no gain.


Decision Making

"Probably the only thing making this project unique in any sort of way is that at this point I'm not using a microcontroller." - Quote from old Self-Guided RC Car project.


Oh well. Trying to make it AI is my attempt to keep some uniqueness. At this point I only know that I'll be using a Back-Propagation Neural Network with an accelerometer providing the error signal. The car's general happiness will be determined by:

  • Moving - Very Happy
  • Not Moving - Uncomfortable
  • Suddenly Changing from Moving to Not Moving - Severe Pain
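
Boiled down for the back-prop routine, that scale just has to become a single error number. A minimal sketch, assuming the states above get reduced to two flags (the numeric values here are made up - the real signal comes from the accelerometer):

    /* Map the happiness scale onto a pain/error value in [0, 1]. */
    double pain_signal(int is_moving, int just_crashed)
    {
        if (just_crashed) return 1.0;   /* severe pain: sudden stop     */
        if (!is_moving)   return 0.3;   /* uncomfortable: sitting still */
        return 0.0;                     /* very happy: rolling along    */
    }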


I can't help but think of the super-happy little flying robot following Ford around in the Hitchhiker's Guide. Don't panic if you're not familiar with this classic series of novels, but you really should leave your house this very moment and go buy a copy right now - and NO, just seeing the movie doesn't count. While the movie was a great tribute that I proudly own a (legal) copy of, you really do have to read the books to fully appreciate the genius of Douglas Adams (may he rest in peace).


Current Status - Active

  • Project breadboard wiring complete.
  • Basic software control tested and functional.
  • Learning software control testing underway.

Log

Sept 10, 2009

Created new project page.

Mounted IR sensor to servo and then servo to vehicle.

Made reference to HitchHiker's Guide in wiki page.

Began re-wiring H-Bridges, but got wrapped up in updating the wiki page and only got one bridge done. I felt it was important to take a shot showing the difference between the two:

SGRC2 Bridge Diff.jpg
(New bridge on left)

The black lead on the new bridge enables it to be used with a PWM signal (active low), although I'll most probably tie the enable on the steering bridge to ground since PWM is completely worthless with that motor.
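
For the drive bridge, the active-low enable makes crude software PWM possible without touching the direction lines. A rough sketch (the pin number and delay_ms() are placeholders):

    /* Bit-banged PWM on the active-low enable lead of the drive bridge.
       The steering bridge enable just gets tied to ground instead. */
    extern void set_pin(int pin, int level);
    extern void delay_ms(int ms);

    #define DRIVE_ENABLE 5   /* active low: 0 = bridge on, 1 = bridge off */

    void drive_burst(int on_ms, int off_ms, int cycles)
    {
        int i;
        for (i = 0; i < cycles; i++) {
            set_pin(DRIVE_ENABLE, 0);   /* enable the bridge  */
            delay_ms(on_ms);
            set_pin(DRIVE_ENABLE, 1);   /* disable the bridge */
            delay_ms(off_ms);
        }
    }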

Sept 13, 2009

Found terminal strip down in the basement and mounted it to car.

Determined mounting method for controller board and two (2) breadboards (one for H-bridges, second for sensors).

Completed 2nd H-bridge. Tested bridges for operation with controller board.

Set triggers for IR sensor and tested input to controller. Have to admit I'm suddenly extra disappointed in the range and response of the IR sensor. The fact that 0 cm and 30 cm look the same to the sensor could be a problem for the neural network - hopefully the short-term memory design can compensate. I'm thinking that the 2nd trigger level will be of little value since the overall range is so short, but I guess we'll see.

SGRC2 Progess 1.jpg

Sept 14, 2009

Car assembly completed and basic functions tested (accelerometer, servo, motors, IR sensor). Ready for programming...

The intended AI model hinges on the ability of the accelerometer to alert the system that the car has run into something, independently of anything else, so at this point I'm playing with trying to make that happen. The software the car is loaded with now just tries to go forward full speed until it hits something, then back up full speed until it hits something, then go forward again, etc. Right now this works about 72% of the time (a very exact eyeball statistic).
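
The bounce-until-crash behaviour itself is simple; the hard part is the crash test. A sketch of the loop, reusing the motor_forward()/motor_reverse() idea from the H-bridge sketch earlier (crash_detected() stands in for whatever accelerometer test ends up working):

    extern void motor_forward(void);
    extern void motor_reverse(void);
    extern int  crash_detected(void);   /* 1 when the accel says we hit something */

    void bounce_forever(void)
    {
        int going_forward = 1;
        motor_forward();
        for (;;) {
            if (crash_detected()) {
                going_forward = !going_forward;   /* flip direction on impact */
                if (going_forward) motor_forward();
                else               motor_reverse();
            }
        }
    }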

I think the issue is that the accel is really designed to be used as a tilt sensor - a job it does surprisingly well. I DO see slight up or down spikes in the output when the car suddenly changes speed, but these are very fleeting, and if you don't happen to be looking at THAT moment they're easy to miss. The JackRabbit has some limited multitasking ability, and that may be what's required: a separate thread constantly watching the accel and setting flags when an up or down spike event occurs. This may even require a stack for flag events.
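
The watcher thread doesn't have to be fancy - it just has to be looking at the right moment so the main loop doesn't have to. A rough sketch of the idea (read_accel() and the threshold are made up, and this ignores how the JackRabbit actually schedules threads):

    #define SPIKE_SIZE 40                 /* made-up spike threshold        */
    extern int read_accel(void);          /* latest reading from one axis   */

    volatile int spike_flag = 0;          /* set here, cleared by main loop */

    void accel_watcher(void)
    {
        int last = read_accel();
        int now;
        for (;;) {
            now = read_accel();
            if (now - last > SPIKE_SIZE || last - now > SPIKE_SIZE)
                spike_flag = 1;           /* hold the event until it's seen */
            last = now;
        }
    }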

I had originally planned three (3) servo positions for the IR sensor: (+/-) 90 degrees and dead center. After playing with the working thing, I think I'll be adding the 45-degree marks. This will increase the size of the neural net slightly, but I think it's well worth it.

I can hardly wait to see if this thing will be able to learn obstacle avoidance on its own!

SGCR2 Assembled.jpg


Sept 22, 2009

Over the past few days I've been working on getting a usable error signal from the accelerometer. I've tried several approaches which all seemed to have good theory behind them, but uh, well... what I've ended up with is a jerk detector (rate of change in acceleration). If the jerk is above a certain threshold we'll call it a crash. If the jerk stays at zero too long, we'll call that idle (the signal bounces around within a certain limit when the car is moving).
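
In code the jerk detector is little more than differencing successive accelerometer samples and watching two conditions on the result. A minimal sketch (all three constants are made up, and read_accel() is a placeholder):

    #define CRASH_JERK  50    /* jerk magnitude treated as a crash        */
    #define IDLE_BAND    5    /* jerk this small counts as "flat"         */
    #define IDLE_TICKS 100    /* how many flat samples in a row mean idle */

    extern int read_accel(void);

    int crash = 0, idle = 0;
    int prev_accel = 0, flat_count = 0;

    void update_jerk(void)
    {
        int accel = read_accel();
        int jerk  = accel - prev_accel;   /* rate of change of acceleration */

        crash = (jerk > CRASH_JERK || jerk < -CRASH_JERK);

        if (jerk > -IDLE_BAND && jerk < IDLE_BAND)
            flat_count++;                 /* signal has gone quiet          */
        else
            flat_count = 0;               /* still bouncing around: moving  */
        idle = (flat_count > IDLE_TICKS);

        prev_accel = accel;
    }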

This afternoon I implemented the planned behaviour described above (forward, crash, backward, crash, repeat) and took the car along to the Hive meeting. Having some real space to run around in showed that the jerk detector was working, more or less. Biggest issues:

  • Bumps in the floor can signal false crashes
  • Running under something that's at H-bridge height sucks (almost blew the bridge when breadboarded components got shorted together)

On the plus side - the car really seems to be able to take a beating (H-bridge issue aside). I'm going to shorten the idle timeout, but for now the crash jerk-level will remain the same, although 'bumps' will certainly confuse the AI routine.


Sept 24, 2009

First stab at coding the control network. I'd like to say it works. It might, even...

I started with a short-term memory array of the last 10 moments. I'm going to increase that to 30 once I can get back to testing. There is a three-year-old running around right now who thinks this is the greatest toy ever (it might be), which is cool, unless you're still in the R&D phase and trying to debug a deceptively simple AI routine (I am).
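
The short-term memory is basically a ring buffer of recent "moments". A sketch of what one might hold (the fields and MEM_DEPTH are placeholders - the real contents depend on what the net ends up needing):

    #define MEM_DEPTH 10        /* bumping this to 30 is the plan       */

    struct moment {
        int ir_reading;         /* Close / Medium / Far                 */
        int servo_stop;         /* where the sensor was pointed         */
        int drive_output;       /* what the net told the motors         */
        int pain;               /* error signal from the jerk detector  */
    };

    struct moment memory[MEM_DEPTH];
    int mem_head = 0;

    void remember(struct moment m)
    {
        memory[mem_head] = m;                   /* overwrite the oldest */
        mem_head = (mem_head + 1) % MEM_DEPTH;
    }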

Servo change: the servo now has 8 position stops (technically) - this way I can use the binary output of three nodes as the position address without a special control routine. The positions aren't what you'd expect - see pic. I'm trying to have some method to the madness (note the symmetry). Basically, Node 1 says turn left, Node 2 says turn a lot, and Node 3 says turn a little (a little + a lot = midway).

SGRC2 Servo Postions.png
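
Packing the three node outputs into a servo address is then just bit arithmetic. A sketch, assuming each node output has already been squashed to 0 or 1 (set_servo_stop() and the bit order are placeholders - the actual stop layout is the one in the picture):

    extern void set_servo_stop(int stop);   /* 0..7, one of the 8 stops */

    void point_sensor(int left, int a_lot, int a_little)
    {
        /* pack the three node outputs into a 3-bit position address */
        int stop = (left << 2) | (a_lot << 1) | a_little;
        set_servo_stop(stop);
    }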

All in all the initial tests were hopeful. The car just kinda sits there for a while and then suddenly starts doing random things (expected). Watching the little servo head looking around is very entertaining. It eventually starts going in a direction. On the longest run I've managed, it would instantly start moving forward if it crashed while going backward, and it was starting to make the association (forward + sensor input = crash), but only in a minor way. Unfortunately I don't have much space to run in - the crash detection requires the car to have a little speed. I haven't figured out a workspace at the Hive building yet, so test runs down there are strictly observational.

My plan is to do a little more testing tonight and then go down to the hackerspace with a fresh battery charge tomorrow afternoon and see what happens when there's room to move around in.

Current Issues

  • Independent crash and idle detection less than ideal, but improving
    • 2nd thread seems to have helped, but the JackRabbit's multitask support is weird.
  • IR sensor mounted on servo slips a bit if it encounters much resistance - the servo will keep turning but the sensor will stay in place if anything is in the way. Some hot glue might help, but the servo has quite a bit of torque.
  • Car is just too damned fast. May need some kind of pulse-y-ness to the drive motors to keep from overdriving the limited range of the IR sensor. Looking to avoid true PWM - wonder if the AI will be able to figure out that it just can't drive full throttle for very long?
    • Under AI control this hasn't seemed to be a big issue in the limited confines of my apartment. I'm starting to think that given enough time the car may actually solve the full throttle problem.
  • No storage of data - car is 'reborn' every time power is cycled.

Next Steps

  • Charge battery and observe car down at hackerspace (maybe also time battery life?).
  • Tweak the ANN and error feedback routines.
  • Repeat.


  • Figure out way to save net data so the car doesn't have to start re-learning from scratch every time power is cycled.
  • Layout, etch, and solder driver board.