Artist’s impression of a generic remote handling robot © Illustration by Benjamin Leon for Ingenia

The robots hard at work in the UK’s most radioactive places

From highly radioactive environments to the ocean floor and out in space, some places are just too hazardous for humans. Beverley D’Silva explores the ‘hot’ robotics under development to take our places in nuclear environments, from safely storing waste to maintaining and decommissioning fusion facilities.

Did you know?

☢️ Hazardous environments

  • Radioactive dust, mud, soil, debris, and even gases can contaminate robots sent into nuclear facilities
  • When Boston Dynamics’ quadruped robot, Spot, was sent into Chornobyl’s New Safe Confinement, engineers protected it from contamination by putting rubber socks on its feet
  • Radiation can harm electronics. Engineers must sometimes wire out sensitive components so robots can still operate in high-radiation environments

In December 2022, US diver Josh Everett became the first person to enter Sellafield’s Pile Fuel Storage Pond since 1958. This 100-metre-long pond was built in the 1940s to store highly radioactive spent nuclear fuel at the Cumbrian decommissioning, waste treatment and storage site. Josh and his fellow specialists worked in shifts of up to three and a half hours at a time to clear sludge comprising decaying nuclear fuel, algae and other debris. For now, some such tasks are destined for humans encased in radiation-proof protective gear and working with extreme caution. But robots are getting closer and closer to being able to go in our stead.


“Robots are great for three things: the dirty, the dull and the dangerous,” says Nick Sykes, Director of RACE, a division of the UK Atomic Energy Authority (UKAEA). Based in Culham, Oxfordshire, RACE (which stands for Remote Applications for Challenging Environments) is a research centre dedicated to ‘hot’ robotics designed for extreme industrial environments, including in nuclear decommissioning and fusion research. RACE’s work grew out of the need to maintain the Joint European Torus (JET), the world’s largest and record-holding facility for fusion research. As JET is now being decommissioned, one of RACE’s priorities is repurposing its remote handling robotics for this next chapter. 


“This is a really exciting time to be in hot robotics because we’re on the verge of a new technology shift,” says Sykes. “[Robots are] becoming more and more capable of working in hazardous environments.”

What's so 'hot' about hot robotics?

‘Hot robotics’ are robots designed to operate in a ‘hot’ environment: that is, any place that’s too harmful for humans to be. Aside from radioactive nuclear environments, other hazardous places include far out at sea (for example, on offshore wind farms), deep under the sea, or in space.

Two places where hot robotics have already seen a lot of action are Chornobyl (the Ukrainian spelling of Chernobyl), and Fukushima Daiichi, sites of major nuclear accidents in 1986 and 2011. “[They] are very dangerous, so it’s vital to keep humans out of harm’s way,” says Sykes. “Remediating these sites will also take a lot of repetitive activity. The best way we can do both is to use robots.” But aside from the somewhat extraordinary environments of Chornobyl and Fukushima, hot robotics are also prime candidates for more typical decommissioning activities, such as at Sellafield in the UK. 

Squaring up to the UK's nuclear legacy

“You could say we are among the major players [in robotics R&D] because we’ve got some of the biggest – if not the biggest – challenges in the western world,” says Dave Megson-Smith, academic lead at the University of Bristol’s Hot Robotics facility, which develops mobile robots for hot environments (see box ‘The many abbreviations of the UK’s hot robotics scene’). He explains that decommissioning Sellafield will be “vastly costly … and will be for the next 100 years. And most of that work will be done by robots.”

 


The Joint European Torus fusion facility in Oxfordshire has recently upgraded its remote handling robots to help it with decommissioning. MASCOT, pictured, has haptic feedback that allows the operator to feel every action from carrying a new component to tightening a bolt © UK Atomic Energy Authority

The to-do list for decommissioning is long and varied, with different specialisms needed for different tasks. Some robots inspect sites, some collect the dangerous radioactive material, some transport it, some sort it, and others safely package it for storage at a licensed site. To help us address this huge challenge, we need every robotics tool at our disposal – from navigation systems to approaches for robotic mobility and manipulation.

Often, the first step is a survey by ground or air (by drone) to map radiation hotspots and assess the scale of the challenge at a nuclear site. Dull routine tasks such as inspection are ideal for robots, but both Sykes and Megson-Smith agree that contamination control is one of the biggest problems for robots sent to nuclear facilities. 
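To picture how such a survey becomes a map of hotspots, here is a minimal sketch: it assumes geotagged dose-rate readings from a drone or ground robot are binned into a coarse grid, and the cells with the highest average readings are flagged. The coordinates, dose rates, cell size and threshold below are illustrative values, not data from any real survey.

```python
from collections import defaultdict

# Illustrative geotagged dose-rate readings from a survey pass:
# (x metres, y metres, dose rate in microsieverts per hour) - values are made up.
readings = [
    (2.0, 3.5, 0.4), (2.5, 3.8, 0.6), (7.2, 1.1, 12.0),
    (7.8, 1.4, 15.5), (12.3, 9.0, 0.3), (7.5, 1.0, 11.2),
]

CELL = 5.0      # grid cell size in metres (assumed)
HOTSPOT = 5.0   # flag cells averaging above this dose rate (assumed)

# Bin each reading into a grid cell, then average the dose rate per cell.
cells = defaultdict(list)
for x, y, dose in readings:
    cells[(int(x // CELL), int(y // CELL))].append(dose)

for cell, doses in sorted(cells.items()):
    average = sum(doses) / len(doses)
    label = "HOTSPOT" if average > HOTSPOT else "ok"
    print(f"cell {cell}: {average:.1f} µSv/h ({label})")
```

Real surveys are far more sophisticated, but the underlying step – turning many point readings into a spatial picture of where the radiation is – is the same.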

Tracked robots can pick up contaminated dust, mud, soil, or bits of debris that stick to their treads, risking contaminating clean areas. And while wheeled robots have less surface area to become contaminated, they cannot climb stairs. Legged robots, on the other hand, can reach most of the places humans can. Megson-Smith was technical lead on the decommissioning team that sent Spot – the dog-like quadruped robot developed by US firm Boston Dynamics – into the Fukushima Daiichi nuclear plant in 2022 to survey the site and plan for its future. He’s also been on teams that have deployed the robot at Chornobyl. “The reason we took Spot into Chornobyl is because he’s a walking robot and the walking robot only has four points of contact, so the only places that can get contaminated are the bottoms of his feet,” says Megson-Smith. The solution? Rubber socks. “Take those off, throw them away and you have a decontaminated robot.”

Spot can even be fitted with a manipulator so it can open doors. “We’ve had teams in Sellafield who have taken to using Spot really flexibly, making their work faster, safer and cheaper for them to do,” Sykes says. He gives the example of an alarm sounding at a facility. These alarm systems are designed to alert humans to low levels of radiation – say, before a leak becomes a serious threat to personnel. A mobile robot equipped with a manipulator can enter a room and check the systems in case the alarm has been set off by a procedural mistake or sensor failure. In the event of a leak, no humans are put at risk. 
 

The many abbreviations of the UK's hot robotics scene

Know your RAICo from your RACE

Appreciating the UK’s standing in hot robotics involves getting to know some tongue-twisting acronyms and organisations.

  • RAICo is the Robotics and Artificial Intelligence Collaboration between UKAEA, the Nuclear Decommissioning Authority, Sellafield Ltd, and the University of Manchester. (It's building on two previous programmes, RAIN, aka Robotics and Artificial Intelligence for Nuclear, and LongOps.)
  • NNUF-HR (the National Nuclear User Facility for Hot Robotics): UK academics, startups and industry can hire facilities and an array of robots at four NNUF-HR sites for their nuclear research:
    • RACE (Remote Applications for Challenging Environments): NNUF-HR’s primary hub, which houses many robots and mock-ups of the environments where they would operate.
    • NNL (the National Nuclear Laboratory) in Cumbria develops, tests, and demonstrates robotic solutions for the nuclear industry.
    • A dedicated test space at the University of Bristol comprises a 245-acre site for environmental field surveying, with a focus on uncrewed aerial vehicles (also known as UAVs or drones) and mobile ground vehicles.
    • The University of Manchester hosts another site in Cumbria, including a pond with an underwater positioning system.

After inspection, more complex tasks follow. At legacy nuclear fission plants, reactors must be ‘defuelled’ – that is, emptied of spent radioactive fuel. Here, decommissioning teams use purpose-built machines incorporating radiation shielding and systems that automatically locate fuel rods.

Waste is normally transferred to a deep storage pond, such as those at Sellafield, where it cools and the water shields its radioactivity. One of the most complex decommissioning tasks at Sellafield is clearing these ponds of debris and sludge. Remotely operated submersibles have made some headway here, although some tasks are still best suited to more dexterous humans.

Other robots have other specialties. For example, robotic arms mounted on stationary platforms – like factory robots on a production line – are programmed to carry out very tailored tasks, such as safely packaging waste. 

“This is a really exciting time to be in hot robotics because we’re on the verge of a new technology shift.”

Nick Sykes, Director of RACE

In the early days after Fukushima, robots sent to the reactors where the meltdown took place failed, as they were not designed to cope with such extreme levels of radiation. With this in mind, you would think that radiation hardness – how resilient components in a robot are to radiation – is a central challenge for nuclear decommissioning.

“In radioactive environments where there are medium to low levels of radiation, radiation hardness isn’t that pressing. It may be radioactive enough you wouldn’t want to send a person in there … but robots can work quite happily,” says Megson-Smith. “I’ve been in radioactive places you wouldn’t want to spend more than 10 minutes in, and robots have been fine, no sign of deterioration in their signal qualities.” In fact, Boston Dynamics has reportedly exposed Spot to almost 250 years’ worth of the allowable human dose of radiation, with no ill effects.

What can be affected, however, are navigation algorithms. If a robot’s sensors aren’t properly radiation-hardened, ionising radiation can degrade their outputs, causing problems for control software, he explains. “If it has to navigate from a fuzzy picture – think of an old TV with a bad signal – can it still do it?”

If a robot fails and can’t get out by itself, you can’t just send someone in to fix it. “How are you going to rescue or recover that robot?” Sykes asks. One option could be having twice as many motors as would normally be called for, so that if one stops working, the others can take over. “Or you can think about crude rescue mechanisms, such as tying a rope to the robot so you can simply drag it out.”

Tackling the root cause might be a case of wrapping more shielding around electronic components. Another option is designing electronics that may not be quite as fast, but are less susceptible to radiation-induced corruption such as bit flips, where a unit of memory changes from a 0 to a 1 or vice versa, causing errors.
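One long-standing way to tolerate bit flips is redundancy with majority voting: store each value several times and take the bit-by-bit majority, so a single corrupted copy is simply outvoted. The sketch below is a simplified software illustration of that idea, not a description of any particular robot’s electronics.

```python
def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise majority vote across three redundant copies of a value.

    For each bit position, the result is whatever at least two of the
    three copies agree on, so a single flipped bit in one copy is corrected.
    """
    return (a & b) | (a & c) | (b & c)

# Store the same sensor word three times (triple modular redundancy).
word = 0b10110010
copy_a, copy_b, copy_c = word, word, word

# Simulate a radiation-induced bit flip in one copy: bit 3 toggles.
copy_b ^= 0b00001000

recovered = majority_vote(copy_a, copy_b, copy_c)
print(f"corrupted copy: {copy_b:08b}")
print(f"recovered word: {recovered:08b}")  # matches the original 10110010
assert recovered == word
```

Radiation-tolerant hardware applies the same voting idea in logic circuits (triple modular redundancy) or in error-correcting memory, rather than in software.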

Charlotte Wilkes, an apprentice mechanical design engineer at the UKAEA, controlling the remote manipulation system at JET © This is Engineering

The robots preparing for future fusion facilities

Much of the effort at RACE is focused on JET, which ran its final experiments in December 2023. After 40 years of operation, it is now being prepared for decommissioning, a process that will take about 15 years.

For over 20 years, engineers at JET have maintained the machine using a remote handling system comprising two 12-metre-long snake-like booms. Controlled via camera and virtual reality, the booms pass through long, narrow maintenance ports, extending up to five metres into the vessel. Such equipment is essential for conditions in fusion plants, where components can reach temperatures of over 220°C and contain hazardous materials such as lithium and beryllium. Then, there’s the level of radiation, which can reach up to 3 kilograys per hour at the start of maintenance. To put that in perspective, this is over 4,200 times the whole-body exposure required to cause radiation poisoning (0.7 grays or more). Because of safety precautions, personnel have not been able to enter the inner vessel of JET for over 30 years.
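As a rough back-of-the-envelope check on those figures (a simplification that treats the 3-kilogray-per-hour field as a whole-body dose rate):

```python
# Figures quoted above; this is a rough illustration only.
dose_rate_gy_per_hour = 3_000   # 3 kilograys per hour at the start of maintenance
poisoning_dose_gy = 0.7         # whole-body dose that can cause radiation poisoning

# How many times the poisoning dose is delivered every hour.
ratio = dose_rate_gy_per_hour / poisoning_dose_gy
print(f"about {ratio:,.0f} times the poisoning dose per hour")   # roughly 4,300

# How quickly an unshielded person would accumulate that dose.
seconds = poisoning_dose_gy / dose_rate_gy_per_hour * 3600
print(f"poisoning dose reached in about {seconds:.1f} seconds")  # under a second
```

In other words, the threshold would be crossed in under a second, which is one reason all work inside the vessel is done remotely.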

Furthermore, with such high levels of radiation, JET’s remote handling robots must be ‘wired out’ to an area that’s safer for electronics. “We have to take every motor and signal and have a cable leading out of that radioactive area so we can have all the electronics in a safe area,” says Sykes. 

Decommissioning JET will be an incredibly complex challenge. Newly upgraded remote manipulators will be used to remove about 4,000 individual tiles and components, ranging from the relatively small up to tens of kilogrammes, from the machine’s tokamak. These tiles and components are contaminated with tritium, a radioactive isotope of hydrogen essential for high-powered fusion plasmas. Tritium will need to be recovered both for re-use as fuel, and to reduce the volume of Intermediate Level Waste removed during the decommissioning process.

The other extremes

How do the challenges of hot robotics compare to other sectors, such as space exploration and wind turbine maintenance?  🪐

According to Sykes, preparation across these operations can be very similar. “You create a sequence of events that you will go through and methodically work through those processes.” Working undersea, for example, it’s vital to think through the tasks in hand, just as you would in a radioactive environment: “How are you going to control the robot? What cameras are you going to use? How are you going to see [what it sees]?”

Radiation and control systems are two aspects the field shares with space exploration, he says: “Space is quite a radioactive environment, so making our robots resilient in terms of electronics is also needed in space. Space is difficult for humans to access, and conditions such as no air and freezing cold temperatures aren’t particularly helpful either. But robots can deal with all those conditions.”

The same applies to wind turbines, some of the most powerful of which are far out to sea and difficult to access, especially in bad weather. “If a human has to, say, inspect the edges of a turbine blade, which get eroded on the trailing edge, they will have to land a boat, then climb pretty high up a ladder, which can be difficult and slow, and you’ve lost time when you could be producing electricity.” Instead, drones can be used. “Drones are now very good at noticing changes and can assess when a blade needs a repair or replacing. And you’re not putting a person at risk.”

Tritium is very scarce, yet a critical fuel for fusion. “Every milligram is very important,” explains Sykes. JET’s decommissioning programme is therefore focused on recovering tritium for use in a future fusion machine or power plant, a process that could also involve remote handling capabilities. According to Sykes, this will inform an essential part of the fusion engineering and scientific planning and delivery process. 

At the UKAEA, the future of hot robotics is closely connected to the global effort to produce fusion power. “Many countries are trying different solutions. In the UK we’ve got our own programme,” says Sykes. This will be yet another acronym: STEP (Spherical Tokamak for Energy Production), a prototype fusion power plant that will be built in West Burton in Nottinghamshire, with a goal of being operational in the 2040s.

Fusion power plants aren’t cheap to build, so the UKAEA aims to maximise the amount of time power plants can be generating. “Robotics have a really key part in making sure we do that maintenance of those machines and exchange critical parts in a very short period of time,” says Sykes. 

“As I see it, where we are with robots now is where we were in the 1990s with personal computers. We’re at the cusp of that revolution now.”

Dave Megson-Smith, academic lead at the University of Bristol’s Hot Robotics facility

Robotic collaboration

Looking to the future, how much should robots be allowed to do independently, and how much should be controlled by humans? “If it’s a very difficult or unknown task, a lot of human intervention may be needed,” says Sykes.

Robots working with other robots is almost the next step, says Megson-Smith. “You don’t want to throw robots in at the deep end and have them learn on the job, like you can do with a lot of AI training systems,” he cautions. “Even in a collaborative system, it has to be about very gently leading them by the hand.” He envisages AI systems supporting individual human operators over the next decade, to train and gain confidence in the AI. Once the trust is gained, semi-autonomous systems making more of their own decisions will become possible. “Then for the next five to 10 years, have a human watching it who can hit the stop button … until you have the evidence and confidence to make that fully autonomous. In nuclear, we have to move forward more slowly.”

Robots are continually being perfected, he says. “As I see it, where we are with robots now is where we were in the 1990s with personal computers. We’re at the cusp of that revolution now. They’re going to slowly creep into our lives; in a decade, they will be everywhere; within two decades, you won’t know how you ever did without them.”

Contributors

Dr Dave Megson-Smith is a Hot Robotics Research Fellow at the University of Bristol’s Interface Analysis Centre and has been on research visits to Fukushima Daiichi and Chornobyl’s New Safe Confinement.

Nick Sykes first joined the UK Atomic Energy Authority (UKAEA) as a senior mechanical engineer at JET, working on remote handling tooling. He was then appointed as a Unit Leader at RACE, before becoming Head of Operations and subsequently Director. At RACE, he has led the successful implementation of many projects for customers such as Sellafield Ltd, while also producing cutting-edge designs for JET, ITER and DEMO.

Beverley D’Silva

Author
