Inside a Donkeycar

So we bought a radio-controlled car and replaced the human driver with AI. No big deal. Or was it? If someone had told me before the project that I would be part of something like that, I wouldn't have believed them. At best my scepticism would have produced guesses like: "Yeah sure, you can do that with some lame line-following PID controller" or "You probably just used randomized movement with maybe some collision detection". I would never have even dreamed that we would build an actual neural network connected directly to the speed controller and steering servo, and that it could actually follow a track using just a camera.

Now it’s time to share.

Physical world

In the beginning there was Antti, who casually stated that "the physical world makes me sick". As much as it made us laugh back then, it would later make us realize that making software is pretty easy and fast compared to hardware. But that was more about our ambition level than about the minimum needed to get things working. For just getting the car going, the hardware is not complex at all.

Radio controlled car

A standard radio-controlled car has one motor for driving the wheels and a single servo motor for turning the front wheels. In addition there is an electronic speed controller (ESC), a battery and a radio receiver. The radio receiver is the "driver" inside the car, so that's what we got rid of.

Hardware parts of a stock radio controlled car

Insert a computer

This all started with an open source project called donkeycar, so we're using the hardware suggested for it. At its simplest it consists of a tiny computer, a battery bank, a wide-angle camera and a servo driver board.

The battery bank powers the computer, because it's simpler to use a separate battery than to regulate the battery meant for driving the wheels.

The servo driver board is probably the most uncommon of these. Its purpose is to speak the same language to the ESC and the servo motor as the radio receiver did. In practice that means generating accurate PWM signals using cheap dedicated hardware.
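To give an idea of what that means in code, here's a rough sketch assuming the commonly used PCA9685 board and the legacy Adafruit_PCA9685 Python library; the channel number and pulse values are just illustrative, not the ones our car uses.

```python
# A rough sketch of what the servo driver board does, assuming the commonly
# used PCA9685 board and the legacy Adafruit_PCA9685 library. The channel
# number and pulse values below are illustrative, not our car's actual config.
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()  # talks to the board over I2C
pwm.set_pwm_freq(60)              # RC servos and ESCs expect roughly 50-60 Hz

STEERING_CHANNEL = 1
LEFT_PULSE, RIGHT_PULSE = 290, 490  # safe range depends on the servo

def set_steering(angle):
    """Map a steering angle in [-1, 1] to a PWM pulse and send it out."""
    pulse = int(LEFT_PULSE + (angle + 1) / 2 * (RIGHT_PULSE - LEFT_PULSE))
    pwm.set_pwm(STEERING_CHANNEL, 0, pulse)
```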

As stated above, our ambition level was quite high, so we designed and 3D-printed mounts for everything. We could have just used duct tape and zip ties. Or, naturally, the official Donkeycar mounts, if we had managed to find one of the supported cars.

Donkeycar hardware essentials

Software

Donkeycar is a high-level library that includes all the software required to get a car running. It is written in Python and covers both the software on the car and the utilities for training the neural network on a desktop. We pretty soon ended up modifying it here and there, but we found the core quite usable.

Parts

The driving software is built from modules called Parts. Each part gets inputs and returns outputs each time it's called. For example, the PiCamera part returns a frame from the camera, and the PWMThrottle part takes a throttle value as input and uses it to control the motor.
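A part can be as small as a class with a run() method. Here's a hypothetical example, not one of the stock Donkeycar parts, just to show the shape:

```python
# A minimal sketch of the Part interface: a part is just a class with a run()
# method that maps its inputs to its outputs. ThrottleCap is a hypothetical
# example, not one of the stock Donkeycar parts.
class ThrottleCap:
    """Limits the throttle to a configured maximum."""

    def __init__(self, max_throttle=0.5):
        self.max_throttle = max_throttle

    def run(self, throttle):
        # Called once per drive-loop iteration with the current throttle value.
        return min(throttle, self.max_throttle)
```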

We already have a few custom parts, like Sonar, which measures the distance to the object in front of the car and also returns a "time to impact" in seconds. That is then used by another custom part, EBrake, which takes both throttle and time to impact as inputs and returns a tuned throttle value. In case of an imminent crash it should reduce the speed enough to avoid it. It's also connected to our Subwoofer part, so the car screams when it's close to crashing.
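The real EBrake has more going on, but a simplified sketch of the idea looks roughly like this; the threshold and scaling are made up for illustration.

```python
# A simplified sketch of the EBrake idea; the real part is more involved and
# the threshold and scaling here are made up for illustration.
class EBrake:
    def __init__(self, danger_seconds=0.5):
        self.danger_seconds = danger_seconds

    def run(self, throttle, time_to_impact):
        # Nothing ahead (or no sonar reading yet): pass the throttle through.
        if time_to_impact is None or time_to_impact > self.danger_seconds:
            return throttle
        # Impact is close: scale the throttle down the closer it gets.
        return throttle * max(0.0, time_to_impact / self.danger_seconds)
```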

Core loop

Parts are connected to each other in a drive loop. It's a built-in function that loops forever over the configured parts in the given order. That makes it easy to experiment with new parts for new behaviour. Our current drive loop is visualized below, followed by a small sketch of how the wiring looks in code.

Drive loop
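Here's a hedged sketch of wiring parts into the drive loop with the donkeycar Vehicle API; the stub parts and channel names are placeholders so the example runs without the real hardware.

```python
# A hedged sketch of wiring parts into the drive loop with the donkeycar
# Vehicle API. The stub parts and channel names are placeholders so the
# example runs without the real hardware.
import donkeycar as dk

class FakeCamera:
    def run(self):
        return None  # a real camera part would return a frame here

class FixedPilot:
    def run(self, image):
        return 0.3, 0.0  # pretend "AI": constant throttle and angle

class PrintThrottle:
    def run(self, throttle):
        print('throttle:', throttle)

V = dk.vehicle.Vehicle()
V.add(FakeCamera(), outputs=['cam/image_array'])
V.add(FixedPilot(), inputs=['cam/image_array'],
      outputs=['pilot/throttle', 'pilot/angle'])
V.add(PrintThrottle(), inputs=['pilot/throttle'])

# Run the loop at 20 Hz for a limited number of iterations.
V.start(rate_hz=20, max_loop_count=100)
```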

Artificial Intelligence

What is Artificial Intelligence exactly? I have personally seen the term used for everything from plain old imperative conditional code to seemingly magical deep learning solutions, so one could argue it doesn't automatically mean much. In the default Donkeycar context, Artificial Intelligence means a trained neural network that takes a single image as input and predicts two numbers as output: throttle and angle. So it sits closer to the magical end of the spectrum. In some situations that line-following PID controller could maybe beat the neural network, but what makes the network really awesome is its ability to learn from training data. The flip side is that debugging and reasoning about it is really hard. We'll get back to the AI and how we trained it in a future post.
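To make that concrete, here is a rough Keras sketch of that kind of network, not the exact architecture Donkeycar ships with: one camera frame in, steering angle and throttle out.

```python
# A rough Keras sketch of this kind of network, not Donkeycar's exact
# architecture: one 120x160 camera frame in, steering angle and throttle out.
from tensorflow.keras.layers import Input, Conv2D, Flatten, Dense
from tensorflow.keras.models import Model

img_in = Input(shape=(120, 160, 3), name='img_in')
x = Conv2D(24, (5, 5), strides=(2, 2), activation='relu')(img_in)
x = Conv2D(32, (5, 5), strides=(2, 2), activation='relu')(x)
x = Conv2D(64, (3, 3), strides=(2, 2), activation='relu')(x)
x = Flatten()(x)
x = Dense(100, activation='relu')(x)

angle_out = Dense(1, name='angle_out')(x)        # steering, roughly [-1, 1]
throttle_out = Dense(1, name='throttle_out')(x)  # throttle, roughly [0, 1]

model = Model(inputs=[img_in], outputs=[angle_out, throttle_out])
model.compile(optimizer='adam', loss='mse')
```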