BB-8 on the red carpet for the Star Wars: The Force Awakens movie premiere.
Star Wars: The Force Awakens droid BB-8 poses in front of press and fans at the Los Angeles premiere of the film. Originally thought to be computer generated, the droid prompted surprise and awe when it was revealed to be a real robot © Lucasfilm Ltd. & TM. All Rights Reserved

Engineering personality into robots

Robots that have personalities and interact with humans have long been the preserve of sci-fi films, although usually portrayed by actors in costumes or CGI. However, as the field of robotics develops, these robots are becoming real. Technology journalist Richard Gray talks to Matt Denton, electrical software and robotics engineer, and Lee Towersey, amateur robot builder, about the scene-stealing, real-life Star Wars droids.

With stars such as Harrison Ford and Carrie Fisher walking the red carpet at the world premiere of the most recent Star Wars film, The Force Awakens, it would have taken something very special to steal the spotlight. Yet this is exactly what happened when a two-foot-tall robot rolled out onto the red carpet, in front of the world’s media.

Named BB-8, this spherical droid, dreamed up by the film’s director JJ Abrams, was a huge hit with fans. However, most had believed it to be the work of clever computer-generated graphics. The droid’s surprise appearance at the premiere in Los Angeles stunned many by proving it was a real robot; it was science fiction turned into reality.

This little ball-shaped robot highlights how Hollywood is now at the cutting edge of robotics development. The film industry’s demands are almost unique in the robotics world – few other applications require machines to move in lifelike ways, react to the humans around them and, perhaps most importantly, have a personality. BB-8 is a perfect example of how the skills and tenacity of engineers can transform lifeless hunks of electronics and motors into something magical to capture viewers’ hearts. “Basically, we are just given drawings by the concept artists and some guidance from the director, then it is up to us to work out how to make it all work,” explains Lee Towersey, a member of the creature effects team on the new Star Wars films who built many of the droids.

Robots on screen 

Robots have been appearing in films for nearly 100 years. The first known appearance of one on the big screen was in the 1927 German silent movie, Metropolis. However, in this, like so many films that came after, the robot was merely an actor in a suit.

Even in the original Star Wars movies, hailed for their innovative special effects when they were made in the 1970s, the droids C-3PO and R2-D2 were brought to life by humans inside suits. Famously, the late actor Kenny Baker spent hours inside R2-D2 in the original films, wearing the robotic shell almost like a rucksack and controlling the movements from inside.

In The Force Awakens, the R2-D2 that appears on the screen is a real robot. Towersey and his colleague Oliver Steeples were amateur robot builders who were recruited by the production team to use their skills to create a real working R2-D2 droid.


It highlights the drive within the film industry to take advantage of the huge changes in technological capability that have taken place since the original Star Wars movies were made.

“The technology wasn’t really there then and the control mechanisms were not as precise as they are now,” recalls Towersey. “They did try using robots but, towards the end, crew members were having to pull them around on string to get them to move in a straight line.”

This lack of control generally led to robotics being used to bring onscreen monsters to life. Think about the shark bursting from the water in Jaws and the fearsome Tyrannosaurus rex in Jurassic Park. The animatronics used to create the dinosaur in Steven Spielberg’s 1993 movie were cutting edge at the time, but it could essentially do just three things: blink, move its head and open its mouth.

Concept sketches showing different angles of BB-8's head on its body.

BB-8’s personality is engineered through the ability to tilt and turn its head. Throughout the film, these simple movements show the viewer the droid’s emotions and give it an almost childlike personality © Lucasfilm Ltd. & TM. All Rights Reserved

New technology

As servomotors (which allow precise control of angular or linear position, velocity and acceleration) and, perhaps more importantly, the microcontrollers needed to coordinate their actions have improved and grown smaller, the possibilities in robotics have opened up. Suddenly, directors have found that they can get their robotic stars to convey emotion and even personality.
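As a rough illustration of the kind of control a hobby servomotor offers, the short Python sketch below converts a target angle into the pulse width a microcontroller would send. The 1 ms to 2 ms pulse range and 180-degree travel are typical hobby-servo conventions assumed for illustration, not figures from the film's rigs.

```python
# Minimal sketch: mapping a target servo angle to a PWM pulse width.
# The 1.0-2.0 ms pulse range and 180-degree travel are typical
# hobby-servo conventions, used here only for illustration.

def servo_pulse_width_us(angle_deg: float,
                         min_us: float = 1000.0,
                         max_us: float = 2000.0,
                         max_angle: float = 180.0) -> float:
    """Return the pulse width (microseconds) for a target angle."""
    angle = max(0.0, min(max_angle, angle_deg))  # clamp to the servo's range
    return min_us + (max_us - min_us) * angle / max_angle

if __name__ == "__main__":
    for angle in (0, 45, 90, 135, 180):
        print(f"{angle:3d} deg -> {servo_pulse_width_us(angle):.0f} us pulse")
```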

“There was only so much we could do with R2-D2 to give him character,” says Towersey. “He can drive around, we can rotate his head and he can make sounds.

“However, BB-8 is much more interactive. I think a lot of kids who grew up with the prequel films, which had extensive CGI in them, expected BB-8 to be computer generated, so when they saw that it was a real robot moving about, they were blown away.” Computer-generated graphics became a major component of the Star Wars films when director George Lucas decided to make the prequels.

The decision to revert to so-called ‘practical effects’ (what physical props are called in the special FX world) left the engineers in the creature workshop with a major challenge. Director JJ Abrams had sketched out his idea for a ball-shaped robot on a napkin and asked the team to see what they could come up with. Matt Denton, an electrical software and robotics engineer, and his long-time colleague Josh Lee were the people who ultimately built BB-8.


“We had to figure out how to make it work,” says Denton. “There has never been a round robot with a head on it, mainly because why would you do that? But having a head gave new ways of showing emotion and personality. You can move the body around slowly to point the other way, you can dip the head a little, cock it to one side. There is a huge amount of emotion that effectively comes from three axes of movement.”
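Denton's point about three axes can be made concrete with a toy sketch: ‘emotion’ reduced to setpoints for the pan, tilt and dip of the head. The pose values and emotion names below are invented for illustration and are not taken from the production's control software.

```python
# Toy sketch: expressing "emotion" as setpoints on three head axes
# (pan, tilt, dip). All pose values and emotion names are illustrative.

from dataclasses import dataclass

@dataclass
class HeadPose:
    pan_deg: float   # turn the head left/right
    tilt_deg: float  # cock the head to one side
    dip_deg: float   # nod the head down/up

# Hypothetical emotion-to-pose table
EMOTIONS = {
    "curious":  HeadPose(pan_deg=20.0, tilt_deg=15.0, dip_deg=-5.0),
    "dejected": HeadPose(pan_deg=0.0,  tilt_deg=0.0,  dip_deg=30.0),
    "alert":    HeadPose(pan_deg=0.0,  tilt_deg=0.0,  dip_deg=-10.0),
}

def pose_for(emotion: str) -> HeadPose:
    """Look up a head pose, falling back to a neutral pose."""
    return EMOTIONS.get(emotion, HeadPose(0.0, 0.0, 0.0))

if __name__ == "__main__":
    print(pose_for("dejected"))
```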

BB-8 moving in a desert landscape on the Star Wars set.

As BB-8 is a round droid whose body moves in all directions while its head stays still, creating the robot proved challenging for the engineers. The ball bearings on the underside of its head attach it magnetically to the top of the body, allowing the head to move freely as the body rotates beneath it © Lucasfilm Ltd. & TM. All Rights Reserved

How BB-8 was brought to life

🤖 Creating a spherical robot with a floating head

Matt Denton and Josh Lee toyed with several approaches to cracking the problem presented to them by director JJ Abrams for The Force Awakens: a spherical robot with a head that ‘floats’ on top of the main body.

They opted for a pendulum that sits inside the ball-shaped shell and is kept upright by gyroscopes. The spherical body itself is driven by rotating hubs that spin the ball, while the movement of the pendulum inside shifts the weight to alter the direction. The robot has an axle running horizontally through the middle of the ball, driven by motors that rotate hubs on either side of the ball. A shaft suspended vertically from the axle carries the motor, the battery and actuators at its bottom end, acting like the mass at the end of a pendulum inside the ball. When the motor turns the axle, this swings the centre of mass forward and causes the ball to roll in that direction. Gyroscopes also help to keep the pendulum in position when the ball is moving.
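For a rough sense of why swinging the internal mass drives the ball, the sketch below treats the shell as a thin sphere and the pendulum as a point mass: the further the mass swings forward, the larger the torque about the contact point and the harder the ball accelerates. All masses and dimensions are invented, and the pendulum's own inertia is ignored; this is a simplified model, not the geometry of the actual prop.

```python
# Rough physics sketch of a pendulum-driven ball: the internal mass,
# swung forward by an angle theta, shifts the centre of mass and
# produces a torque about the ground contact point, rolling the ball.
# Masses and dimensions are invented for illustration.

import math

G = 9.81             # gravitational acceleration, m/s^2
R = 0.25             # ball radius, m (illustrative)
SHELL_MASS = 4.0     # mass of the spherical shell, kg (illustrative)
PENDULUM_MASS = 6.0  # mass hung from the internal axle, kg (illustrative)
PENDULUM_ARM = 0.15  # distance from axle to pendulum mass, m (illustrative)

def drive_torque(theta_rad: float) -> float:
    """Torque about the contact point from the displaced pendulum mass."""
    offset = PENDULUM_ARM * math.sin(theta_rad)  # horizontal offset of the mass
    return PENDULUM_MASS * G * offset

def rolling_acceleration(theta_rad: float) -> float:
    """Approximate forward acceleration of the ball.

    For a thin shell rolling without slipping, the moment of inertia
    about the contact point is (2/3 + 1) * m * R^2, and a = alpha * R.
    The pendulum's own inertia is ignored in this simplified model.
    """
    i_contact = (5.0 / 3.0) * SHELL_MASS * R ** 2
    alpha = drive_torque(theta_rad) / i_contact
    return alpha * R

if __name__ == "__main__":
    for deg in (5, 15, 30):
        a = rolling_acceleration(math.radians(deg))
        print(f"pendulum swung {deg:2d} deg -> ~{a:.2f} m/s^2 forward")
```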

Although the mechanisms of how the head was attached to the robot have not been officially revealed, images that have been released indicate that a disk embedded with magnets was attached to the top of the pendulum. With ball-bearings on the underside of the head, the head could be attached magnetically to the top of the body while allowing it to move freely when the body rotates under it. Tethering the head to the top of the pendulum in this way also means it shifts forward in the direction of travel when BB-8 is moving and turning. Additional motors at the bottom of the pendulum rotate this shaft so the head can turn, tilt and dip when it is moving.

Even the pattern on BB-8’s body was designed to help viewers connect with the little droid: differently shaped panels were used on each side to make it easier to follow its movement.

*To watch Matt Denton and Josh Lee discuss how they built BB-8 at the event, please visit http://bit.ly/2ejM6bq

There were existing examples of spherical robots on the market. One, built by researchers at Uppsala University in Sweden, features cameras on either side and has been tested by the US military as a security droid. It works using a pendulum inside the ball that is suspended from an axle. A motor moves the pendulum forward, displacing the centre of mass and causing the ball to roll forward. Moving it from side to side allows it to steer left and right.

The problem Denton and Lee faced was that their BB-8 was going to encounter some unpredictable and difficult conditions, yet had to stand up to the rigours of a frantic filming schedule. “We had to operate on sand and rough terrain,” explains Denton. “But none of the methods out there would work in all those conditions while also hitting marks over and over again.

“There is a temptation to over-engineer it, but sometimes you just have to look for the simplest solution. You are wasting a huge amount of money if the device is broken down and the film crew is waiting for you to fix it. It can be embarrassing if it doesn’t work. In that sense, we were doing fast-turnaround prototypes that had to work. Usually, they only have to work for a few weeks but BB-8 had to survive a whole year’s shooting.”

In the end, the pair created eight different versions of the robot: some that could be used in rough outdoor terrain, one that would stay stationary and some that would work inside. The simplest of these was something they nicknamed the ‘wiggler’, which could be bolted to the floor; its ball could twist and tilt while its head turned on a shaft through the middle. It was this that allowed the team to get many of the iconic close-up shots that show BB-8 peeking around the door of the Millennium Falcon or sagging in defeat. A separate radio-control unit was used in each version to allow the head to twist and tilt on the body on cue, giving BB-8 its precocious and often comic characterisation.


Different versions of BB-8 being built.

During filming, eight different versions of the droid were created to be used in a variety of locations and situations © David James (top), John Wilson (bottom) Lucasfilm Ltd. & TM. All Rights Reserved

The off-road versions had stabilisers fixed to the back, turning the robot into a three-wheel-drive trike with the head fixed on a bracket, while another version was essentially a wheelbarrow pushed by a puppeteer. They also had a lightweight version that could be carried and a ‘bowling ball’ version that could be thrown into shot but remain upright.

But it was only during an enforced gap in filming, caused by actor Harrison Ford breaking his leg on set, that the pair returned to the idea of building a fully working sphere robot. “We started thinking about what would happen when we got to the red carpet,” recalls Denton. “Everyone would want to see BB-8 but they wouldn’t want to see a static version. That is what drove us to figure out how it was going to work. When we first rolled it out on stage in 2015, no-one was expecting it. There were people in the audience whose jaws just hit the floor.”

Their final robot uses a motor-driven pendulum to propel the beach-ball-sized sphere from the inside to a maximum speed of 4.3 mph. The head ‘floats’ on top of the body so that it can be tilted and turned, giving BB-8 its distinctive, almost childlike personality. It seemed so lifelike that photographers on the red carpet at the premiere called its name to get it to look their way, while actor Warwick Davis posed for a selfie with it. Yet BB-8 still requires human help to come alive, with Denton driving its movements by radio control.

However, Denton believes that new technology is starting to take over some of the job of delivering emotion in robotic characters. “We are starting to have control systems that sit on a robotic head and pick up the movement of the actor,” he explains. “So if it is wearing a mask, for example, the control system will take control of the eyes or the face so that it looks in the correct direction and blinks when the actor moves.


The Hexapod robot.

Matt Denton’s Hexapod robot responds to people in front of it by using facial tracking and gesture recognition software. Its ‘eye’ (or camera) can follow movement and it swings back on its legs – or shies away – if people get too close © Matt Denton

“With new microcontrollers I can only see it improving. They allow you to mix the action of all the servomotors in a head into different expressions so you can get some really interesting facial movements. We have also been working on automated blinks or breath cycles, so you can have the robot working away by itself.”
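An automated idle cycle of the kind Denton describes might, in outline, look like the sketch below: a slow sinusoidal ‘breath’ with randomly timed blinks layered on top. The timings, channel values and structure are illustrative assumptions, not details of his software.

```python
# Illustrative idle-animation loop: a sinusoidal "breath" plus randomly
# timed blinks, of the kind used to keep an animatronic head alive
# without a puppeteer. All timings and values are invented.

import math
import random

BREATH_PERIOD_S = 4.0           # one slow breath roughly every 4 seconds
BREATH_AMPLITUDE = 0.15         # fraction of chest-servo travel
BLINK_GAP_RANGE_S = (2.0, 6.0)  # random gap between blinks

def breath_position(t: float) -> float:
    """Chest servo offset (normalised -1..1) at time t."""
    return BREATH_AMPLITUDE * math.sin(2.0 * math.pi * t / BREATH_PERIOD_S)

def run_idle_cycle(duration_s: float = 10.0, dt: float = 0.1) -> None:
    """Step through the idle cycle, printing the commands a rig would get."""
    next_blink = random.uniform(*BLINK_GAP_RANGE_S)
    for i in range(int(duration_s / dt)):
        t = i * dt
        chest = breath_position(t)
        # In a real rig these values would go to servo controllers;
        # here we just report the chest offset once per second.
        if i % int(1.0 / dt) == 0:
            print(f"t={t:4.1f}s  chest offset {chest:+.2f}")
        if t >= next_blink:
            print(f"t={t:4.1f}s  blink!")
            next_blink = t + random.uniform(*BLINK_GAP_RANGE_S)

if __name__ == "__main__":
    run_idle_cycle()
```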

Advances in movement tracking are also allowing engineers to produce robots with eyes that automatically follow other actors in a scene, making them look like they are part of the action.

Denton, who runs his own animatronics company, Micromagic Systems, developed a six-legged robot that works without a puppeteer altogether. An earlier version of his Hexapod robot appeared as one of the weird magical creatures in the Harry Potter movies when it was dressed up like a six-legged turtle.

His updated model uses facial tracking and gesture recognition technology so that it responds to what someone standing in front of it is doing. Looking a little like a giant spider, it shifts its weight like a predator preparing to pounce as it tracks the movement of the person in front of it. “It is much closer to what I consider to be real robotics,” says Denton.
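A minimal sense of how facial tracking can feed a ‘shy away’ behaviour like the Hexapod’s can be sketched with an off-the-shelf face detector. The version below uses OpenCV’s bundled Haar cascade and treats apparent face width as a crude stand-in for distance; the threshold and responses are invented, and this is not the software Denton uses.

```python
# Minimal sketch of a face-tracking "shy away" behaviour, assuming
# OpenCV is installed. Apparent face width is used as a crude proxy
# for distance; the threshold and behaviour are illustrative only.

import cv2

TOO_CLOSE_WIDTH_PX = 200  # if a face appears wider than this, back off

def main() -> None:
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    camera = cv2.VideoCapture(0)
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            centre_x = x + w // 2
            if w > TOO_CLOSE_WIDTH_PX:
                print("Face very close: lean back / shy away")
            else:
                print(f"Tracking face, turn 'eye' towards x={centre_x}")
        cv2.imshow("view", frame)
        if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
            break
    camera.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```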

Software

🕹️ Programming to make a robot’s movements natural

There is no universal programming language for robots so manufacturers of robotics hardware tend to develop their own unique software to achieve the results they want.

In films, robots and animatronic characters are usually controlled using joysticks, which often makes their movement look unnatural: the joysticks have little weight and can be flicked around quickly, while the robot has to act against gravity and other forces in the real world.

Matt Denton has spent years studying how real organisms move in nature to find ways of replicating this within the control software. By building feedback loops and data filtering into the software, he has found that it is possible to remove the twitchiness that can make a robot move in a less realistic way. Another trick is to program the controls so that ‘forward’ is wherever the character is looking; in a robot such as BB-8, this can make it much easier to drive.
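Both ideas can be illustrated in a few lines: an exponential smoothing filter takes the twitch out of a joystick axis, and drive commands are rotated into the direction the character is looking so that pushing ‘forward’ always moves the robot the way its head points. The smoothing constant and frame conventions below are assumptions for the sketch, not details of Denton’s software.

```python
# Sketch of two control-software ideas from the box above: low-pass
# filtering of twitchy joystick input, and mapping "forward" to
# wherever the character is currently looking. The smoothing constant
# and conventions are illustrative assumptions.

import math

class SmoothedAxis:
    """Exponential smoothing of a single joystick axis."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha  # lower alpha = heavier smoothing
        self.value = 0.0

    def update(self, raw: float) -> float:
        self.value += self.alpha * (raw - self.value)
        return self.value

def heading_relative_command(stick_fwd: float, stick_side: float,
                             heading_rad: float) -> tuple[float, float]:
    """Rotate the operator's forward/sideways command into world
    coordinates, so 'forward' is always the way the head is looking."""
    cos_h, sin_h = math.cos(heading_rad), math.sin(heading_rad)
    world_x = stick_fwd * cos_h - stick_side * sin_h
    world_y = stick_fwd * sin_h + stick_side * cos_h
    return world_x, world_y

if __name__ == "__main__":
    axis = SmoothedAxis(alpha=0.2)
    for raw in (0.0, 1.0, 1.0, 0.0, 0.0):  # a twitchy stick flick
        print(f"raw {raw:+.1f} -> smoothed {axis.update(raw):+.2f}")
    print(heading_relative_command(1.0, 0.0, math.radians(90)))
```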

Robots in the real world

Outside the movie industry, robots have been largely consigned to performing repetitive, mechanical tasks, but as they start to move into people’s homes and everyday life, researchers are taking some lessons from Hollywood.

Dr Cynthia Breazeal from the Personal Robots Group at Massachusetts Institute of Technology has been developing what she describes as sociable robots. Her work began with an anthropomorphic robotic head called Kismet that could express itself by adjusting its gaze or facial movement, and has since progressed to a fluffy 2.5-foot-tall robot called Leonardo, which features a visual tracking system so that it can interact with children. More recently, her team has developed a humanoid robot called Nexi, which has arms that move along with the face to make its body language realistic; with a flick of its eyebrows and clenching of its fists, Nexi can skip from joy into anger and then sadness.

Then there is Pepper, a Japanese creation built by SoftBank Robotics, which uses artificial intelligence to give the $1,600 humanoid ‘emotions’ so that it can be used in shops or for customer service. Voice recognition allows it to recognise when it is being chastised, and it responds by adopting a crestfallen pose or, if someone is feeling down, trying to cheer them up with unrelenting cheerfulness.

Toyota has also unveiled a diminutive robot called Kirobo mini, which has been designed to be a ‘communication partner’ and has since been touted by others as a companion for childless couples. With large eyes that blink, a high-pitched voice and a wobble as if it hasn’t quite cracked how to balance, it has a vulnerability that its designers say makes it cute. The robot can hold a basic conversation, make hand gestures and respond to human emotions.


There are even some who are hoping robots will become actors in their own right. Researchers at Osaka University in Japan have created an android they call Geminoid F that has been designed to look and behave just like a human. Actuators powered by air pressure beneath its rubber skin allow the robot to copy human facial expressions and mouth voice recordings.

It has already appeared in one film, Sayonara, about the aftermath of a nuclear power plant meltdown. While the performance is a little mechanical, it offers a glimpse of what may come.

“If you give robots a script and get them to hit their mark, they should be able to do it better than humans as they have ultimate memories,” says Denton. “The problem is that they would never give you nuances in each scene. With artificial intelligence that might be possible but it is a long way off.”

How to engineer a robot

Denton and Lee are considering publishing the schematics to allow schools and young engineers to build their own BB-8. The robot demonstrates some good engineering principles, such as gyroscopic forces, electrical engineering, software and magnetism, and much of the technology needed to build it, such as the motors and gyroscope sensors, is available off the shelf.

One area on which they remain quiet is how they got BB-8’s head to stay on top of its spherical body while still letting it move around freely, although Denton does say that they had obvious help to crack that problem – the Force.

***

This article has been adapted from "Engineering personality into robots", which originally appeared in the print edition of Ingenia 69 (December 2016).

Contributors

Richard Gray

Author

Matt Denton left school in 1989 and completed a four-year apprenticeship in electronic engineering with Marconi Defence Systems. In 1993, he started a degree in computer-based electronic engineering at the University of Portsmouth. During his first year, he had a summer job with a small special FX company at Ealing Studios and didn’t return to complete his degree. Matt formed Micromagic Systems in 1998 and has been specialising in control systems for creature effects ever since. He is now also Chief Technology Officer for Maverick Aviation Ltd.

Lee Towersey built his own R2-D2 in 2009, which has been used in a Currys TV advertising campaign along with many other promotional events on behalf of Lucasfilm. He joined the team of Star Wars: The Force Awakens in 2013 to build and operate droids for the film, and has also worked on droids for Rogue One, Star Wars Episode VIII and the as yet untitled Han Solo film.
