Learning code with robots
Richard Hopkins built his career delivering major new information technology systems, mostly for the UK government. If you’ve been in the UK for the last 25 years, then there’s a good chance you’ve used (or been processed by) one of his systems. The projects he leads are large and complex; back in 2011 he was appointed as an IBM Distinguished Engineer, an executive but technical role.
Five years ago, Richard started work on a full-size replica of Doctor Who’s robot dog, K9. His K9 is just like the one on the television; the only difference is that his is a real robot, not a BBC radio-controlled prop. K9 can wag his tail, hold a conversation and follow an ultrasonic transmitter. He even knows when he’s been patted.
“K9’s my long-term project to inspire kids, but he weighs as much as a big Labrador dog and I have to remove his tail to fit in my car! I pretty quickly realised I needed an easily portable version,” said Richard. “My new one is a third of the size and only weighs the same as a Chihuahua. Just like the big one, you can have a conversation, but his special additional skill is that he can play grandmaster-level chess. He’s called Kasperwoof.”
Kasperwoof’s name is a joke – he’s a canine homage to Garry Kasparov, the world chess champion who was famously beaten in 1997 by IBM’s Deep Blue, the first computer to beat a reigning world chess champion. It is a sign of how far computers have come that Richard claims Kasperwoof would probably give Deep Blue a good match.
Richard has a self-imposed rule that all his robots must be made from inexpensive, commodity technology. His blog provides designs, code and explanations, and everything is published under an open source licence. He encourages students to inexpensively copy his inventions.
How does K9 work?
K9 streams the sound from his microphone over the internet to the IBM Watson Speech to Text cloud service; a written version of what he has heard is then returned to him via the same stream. He uses the free tier of IBM Watson Assistant to respond to what he has heard. Watson Assistant works out the likely ‘intent’ of the command that K9 has been given and then chooses from a variety of responses. For example, K9 identifies phrases such as “Good boy K9” or “Would you like a jelly baby?” as the ‘praise’ intent. His response will be to wag his tail and reply: “Thanks are not necessary”. His side-screen displays a dashboard that describes the state of his batteries, his speed and his power consumption.
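The intent-matching pattern described above can be sketched in a few lines of Python. This is not the real Watson Assistant API (which learns intents from example phrases via a cloud service); the intent names, example phrases and responses here are illustrative, chosen to echo the article’s ‘praise’ example.

```python
import random

# Hypothetical intent table, in the spirit of K9's Watson Assistant setup.
INTENT_EXAMPLES = {
    "praise": ["good boy k9", "would you like a jelly baby"],
    "move": ["come here", "follow me"],
}

RESPONSES = {
    "praise": ["Thanks are not necessary", "Affirmative, master"],
    "move": ["Coming, master"],
}

def classify(utterance: str) -> str:
    """Toy intent classifier: pick the intent whose example phrases
    share the most words with the utterance."""
    words = set(utterance.lower().split())
    best, best_score = "unknown", 0
    for intent, examples in INTENT_EXAMPLES.items():
        score = max(len(words & set(e.split())) for e in examples)
        if score > best_score:
            best, best_score = intent, score
    return best

def respond(utterance: str) -> str:
    # On the real robot, a 'praise' intent would also trigger a tail wag.
    intent = classify(utterance)
    return random.choice(RESPONSES.get(intent, ["Query not understood"]))
```

A real assistant service does far more (confidence scores, entities, dialogue state), but the core loop — classify the utterance, then pick one of several canned responses for that intent — is the same shape.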
Not bumping into things is important for any robot. To avoid collisions, K9 has 12 sensors and four servo motors (two to move the tail, two to turn the ears). Controlling all of this directly from a Raspberry Pi would quickly overwhelm it, so three microcontrollers (called Espruinos) do this work for the Pi. These microcontrollers convert the signals coming from the sensors into information that the Pi can easily process. For example, one microcontroller moves the ears forwards and backwards. While it does this, it measures the voltage signals from the LIDAR (which tell it how far away the nearest obstacle is) and from the potentiometer in the servo (which tells it which way the ear is facing). It then translates those signals into vectors to the nearest obstacle and sends them to the Pi over USB as simple strings. This ‘pre-processing’ offloads work from the Pi and gives it enough time to make decisions (such as whether it is safe to move forward).
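The pre-processing step — combining the ear’s bearing with the LIDAR range into a vector, then flattening it into a string for USB — might look something like this sketch. The article only says ‘simple strings’ are sent; the `OBST` field name and comma format are assumptions for illustration.

```python
import math

def obstacle_vector(bearing_deg: float, distance_m: float) -> str:
    """Combine the ear's pointing direction (read from the servo's
    potentiometer) with the LIDAR range into a short string the Pi
    can parse, e.g. 'OBST,1.00,0.00'. Converts the polar reading
    (bearing, distance) into x/y components relative to the robot."""
    x = distance_m * math.cos(math.radians(bearing_deg))
    y = distance_m * math.sin(math.radians(bearing_deg))
    return f"OBST,{x:.2f},{y:.2f}"
```

The point of doing this on the microcontroller is that the Pi never sees raw voltages — only a steady trickle of pre-digested obstacle positions it can act on.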
K9 is made to look alive with movement and lots of flashing lights. He has 12 flashing buttons on his control panel, a concealed blue LED strip around his base (which makes it look like he’s hovering) and two ‘eye’ lights. The brightness of these lights is controlled by a very fast pulsed 3.3 V signal from a microcontroller, which is ‘amplified’ to 12 V. If the high pulses are short, the light looks dim; if they are long, the light is very bright. The faster K9 is moving, the brighter the blue lights underneath him glow.
K9 is quite complex. To enable information to flow around the robot so that he can make sensible decisions, Richard uses two key programs. The first is Node-RED, a free tool that enables the user to wire together hardware devices via visual flows without doing any programming (like a sophisticated version of Scratch). Nearly all the sensor data that flows around K9 is fed into Node-RED so it can be routed to the right program. The second program, called Redis, is the main destination for much of the data – it is also free. Redis is usually used to help scale large websites, but Richard uses it as K9’s short-term memory. Node-RED feeds the data into Redis so that K9 can build up a picture or ‘context’ of his surroundings. This enables him to make decisions, such as how to navigate past an obstacle to get to where he’s been asked to go. A Raspberry Pi is a relatively small computer, so remembering all this data would quickly fill it up; Redis automatically forgets what it saw as new data becomes available.
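The ‘automatically forgets’ behaviour is Redis key expiry: each value is stored with a time-to-live, after which it simply disappears. As a minimal, Redis-free sketch of the same idea (real Redis does this with `SET key value EX seconds`):

```python
import time

class ShortTermMemory:
    """Stand-in for the Redis pattern described above: each sensor
    reading is stored with a time-to-live, so stale context vanishes
    instead of filling up the Pi's limited memory."""
    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value, ttl_s=5.0):
        self._store[key] = (value, time.monotonic() + ttl_s)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires = entry
        if time.monotonic() >= expires:
            del self._store[key]  # auto-forget, like Redis expiry
            return default
        return value
```

A short TTL suits sensor context well: an obstacle reading from ten seconds ago is worse than no reading at all once the robot has moved.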
Moving precisely is very important for any robot. Unfortunately, powerful, precisely controllable motors are very expensive. Richard solved this problem by using two inexpensive scooter motors and gluing a 3D-printed disc to each. These discs have many holes around the edge. As each wheel turns, four sensors detect these holes. By counting the holes that pass the sensors, it is easy to calculate how far each wheel has travelled. Depending upon which sensor sees the hole first, it is also possible to work out which way the wheel is turning. Combining the signals from both wheels allows the robot to work out in which direction he is moving (and make adjustments to the power going to his motors if he is not moving as expected). Richard’s K9 is fast and can move in a straight line, or spin precisely on the spot without a human operator controlling him.
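This is a home-made rotary encoder: hole counts give distance, and the order in which two offset sensors fire gives direction. A sketch of the arithmetic, with made-up example values for the hole count and wheel size (the article doesn’t give Richard’s actual figures):

```python
# Assumed example values: 20 holes per disc, 0.25 m wheel circumference.
HOLES_PER_REV = 20
WHEEL_CIRCUMFERENCE_M = 0.25

def distance_travelled(hole_count: int) -> float:
    """Each detected hole is one twentieth of a full wheel turn,
    so distance is (holes / holes-per-rev) * circumference."""
    return hole_count / HOLES_PER_REV * WHEEL_CIRCUMFERENCE_M

def direction(first_sensor: str) -> int:
    """The two sensors are offset around the disc, so which one sees a
    hole first reveals the turn direction: +1 forward if sensor 'A'
    fires first, -1 reverse if 'B' does."""
    return 1 if first_sensor == "A" else -1
```

Comparing the two wheels’ distances tells the robot whether it is tracking straight: if one wheel has travelled further than the other, the controller trims the motor power to correct the drift.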
Why is this important? Older readers will probably remember IBM, but from the 1990s onwards the company slowly disappeared from public view as it shut down its consumer-facing divisions – typewriters, then printers, PCs and laptops – and ceased to be as visible to everyday consumers. Today, the company encourages its employees to work with schools and universities to inspire pupils to study STEM subjects and learn a little about the company.
Richard explains: “When I was in primary school, Star Wars was the thing and the robots R2-D2 and C-3PO were stars. Back then, even Doctor Who had a robot dog. When I talk to kids today, nothing much has changed … artificial intelligence is a bit scary, but robots like BB-8 are still cool. So that’s what I decided to do. I would build robots to inspire kids.
“As part of the STEM Ambassadors scheme, I visited a school where I met children in years four to six. I’d been warned by my wife, an ex-primary school teacher, that year six might be challenging and that nothing I’d be able to do would impress them. Not so. I knew I’d won them over when one of them asked ‘And they pay you to do this? Are you a genius?’ I grinned: ‘Nope – just an engineer, still learning’. I’m now roughly the same age as their grandfathers, but they were almost in awe that I’m still being paid to have fun and learn new things. For a brief moment they grasped why engineering is so great. I hope that inspiration lasts.”
Richard is convinced that his work has brought unexpected benefits, helping to transform some of his major system designs. IBM Distinguished Engineers deliver major new innovations or design and build challenging systems. Richard explains: “This continual focus on, and valuing of, questioning and renewal has helped IBM to exist well beyond 100 years. But it’s that questioning that keeps us innovating and inventing. Our challenge, of course, is staying up to date with fast-moving technologies.”
Things he has learned from his robots and commodity technologies, especially cloud and AI, have helped IBM deliver new systems for clients. Richard recalls one episode: “The ‘hands-on’ knowledge that I gained from building my robots meant that I had an in-depth understanding of how to automate the creation and testing of new systems on the cloud. That convinced the client that we were the right company to work with. On other occasions, simply being able to talk about the practical pros and cons of the latest technologies has made all the difference when co-creating solutions with the client’s team.”
Richard has also found that he pushes both himself and technical boundaries. For example, he says: “Imagine you want your robot to find you and you’re holding an ultrasonic transmitter. Your robot dog has five ultrasonic sensors arranged around his body, which are creating a flood of raw sensor data. Unfortunately, the data is noisy, so you must run a sequence of algorithms on each stream to separate the signal from the noise. Those signals then have to go into a neural net so the robot can determine the bearing and distance of the transmitter, which will allow him to home in on you. It sounds simple, but imagine having to do that every few milliseconds on a £15 microcontroller. When you have powerful computers with GHz processors to play with, making every clock cycle count is a forgotten skill. Not for me, not anymore.”
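The article doesn’t name the filtering algorithms Richard uses, but one of the cheapest ways to tame noisy ultrasonic readings on a small microcontroller is a moving average over the last few samples. A minimal sketch, assuming a simple fixed-size window per sensor:

```python
from collections import deque

class MovingAverage:
    """Smooth one ultrasonic sensor's stream by averaging the last n
    samples before the result goes on to the next stage (in K9's case,
    a neural net estimating the transmitter's bearing and distance)."""
    def __init__(self, n=4):
        self.window = deque(maxlen=n)  # old samples drop off automatically

    def update(self, sample: float) -> float:
        self.window.append(sample)
        return sum(self.window) / len(self.window)
```

On a £15 microcontroller running every few milliseconds, even this costs real clock cycles per sensor, which is exactly the budgeting exercise Richard describes.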
“Not many of my family or friends understand what I do as an IBM Distinguished Engineer, but my robots capture the essence of it and allow the kids to imagine what I do. Because of my ‘commodity technology only’ rule, budding engineers can follow my lead.”
One of Richard’s ‘robot proteges’ built his own K9 from cardboard that could answer basic questions. Now in college, he is working with IBM researchers in Australia and New York on a project to engineer a companion robot for those with Alzheimer’s.
Richard is currently working on a Dalek that acts as a doorman for his home. The Dalek works out who is in, or out, of the house using their mobile phones and then recognises faces so it can greet you or say goodbye. The lights and the iris of his eyestalk are all under computer control, so “when he ‘wakes up’ and talks he’s pretty convincing. Of course, if he doesn’t recognise you he threatens you with extermination. He’s grown quite famous with our local delivery people.”
BBC, Doctor Who, Dalek and K9 (word marks, logos and devices) are trademarks of the British Broadcasting Corporation © BBC 2020. Daleks are copyright BBC/Terry Nation. K9 is copyright Bob Baker/David Martin/BBC. Richard’s robot designs are not used for commercial purposes.
This article has been adapted from "Learning code with robots", which originally appeared in the print edition of Ingenia 84 (September 2020).
Richard Hopkins is an IBM Distinguished Engineer and a Fellow of the Royal Academy of Engineering specialising in Hybrid Cloud Solutions and Quantum Computing. He was the 19th President of IBM’s global Academy of Technology. He lives in the North East. To find out more about his robot designs, visit his blog and for more information on code, follow his GitHub.