Meet SecondHands, the robot giving technicians a helping hand at Ocado

Ocado have sold their Smart Platform to supermarkets in Sweden and the US. Find out more about the technology behind their automated warehouse in our two-part series.


Online-only supermarket Ocado is leading the way for automated warehouses with its Smart Platform, which has already been adopted by companies in the US and Sweden.

Behind the scenes at Ocado, the investment in automation technology is impressive – ranging from swarms of droids zooming around grids to robotic hands able to grasp and manipulate fragile, dynamically changing objects.

Of course, all these systems need to be overseen and maintained – and naturally, the company’s Ocado Technology division is exploring how aspects of this maintenance activity can be automated too.

A second pair of hands in the warehouse

Duncan Russell is a Research Coordinator at Ocado Technology

After receiving a BEng in Electronics from Kingston University, UK, and spending a decade in industry, Duncan completed a PhD in Secure Collaborative Workflow for Grid Computing at the University of Leeds. As a post-doctoral Senior Research Fellow, he led the Systems Architecture stream of the NECTISE project, funded by EPSRC and BAE Systems.

Duncan has since worked as Head of Research and Development at Image Analysis Ltd and CTO at DRTS Ltd, where he led two EU-funded research and innovation projects (RAISME, AAPD).

Duncan joined Ocado Technology in 2016. As well as organising collaborative activities across various research and innovation teams within Ocado Technology and linking to external academic and industrial research organisations, he’s currently project managing the EU Horizon 2020 robotics research projects SoMa and SecondHands.

“SecondHands is a proactive maintenance assistant that will help the technicians in the warehouse when they have to perform the regular preventative maintenance of the conveyor systems: they’re working at height, operating heavy equipment, trying to hold things in place while also needing tools – maybe they’re up a ladder and need another tool. So the SecondHands robot is offering a second pair of hands to the technician,” says Duncan.

“That was the overriding use case, and the project then developed into a humanoid robot able to have its own intelligence, [and] make its own decisions on where it could offer help proactively to the technician within a normal workflow.

“So we created a project with a few different academic partners that would solve some of the fundamental research problems going into the integrated robot.”

What is the role of each academic partner?

“We decided to work with KIT [Karlsruhe Institute of Technology], who are building the robot – its design and its real-time control – and working on the grasping pipeline.

“KIT are also working on the speech dialogue system, where the technician should be able to hold a conversation with the robot rather than just issuing single commands and getting a plain response.

“The vision system from UCL [University College London] will be able to recognise the objects in the scene and work out what the technician is doing by looking at the human pose.

“Rome [Sapienza Università di Roma] are providing the actual intelligence for the system. They’re looking at the activities that the human is doing, working out where they are in the workflow and deciding what the robot should be doing in terms of its proactive help, moving it into the right location to help the technician. Included in that is workplace analysis from KIT of where it’s safe to operate the robot alongside the human.

“And finally EPFL [Ecole Polytechnique Fédérale de Lausanne] are providing the bimanual manipulation – the dynamics of handing over tools to humans and sharing loads between robot and human. The robot has to mimic human behaviour so that passing tools – actually helping, and operating in the same space as the human – feels natural.”
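To make that division of labour concrete, here is a minimal Python sketch of how such a perceive-reason-act loop might be wired together. Every name in it is hypothetical – the project’s real interfaces aren’t public in this detail – but it shows the shape of the pipeline Duncan describes: vision feeds activity recognition, which feeds a planner that chooses a proactive action.

```python
# Hypothetical sketch of a perceive-reason-act loop for proactive assistance.
# None of these names are SecondHands' real interfaces.

from dataclasses import dataclass

@dataclass
class SceneEstimate:
    objects: list[str]   # objects recognised in the scene (the UCL vision role)
    human_pose: str      # coarse pose label for the technician

def recognise_activity(scene: SceneEstimate) -> str:
    """Map what the technician is doing to a step in a known workflow
    (the Sapienza intelligence role)."""
    if scene.human_pose == "reaching_up" and "ladder" in scene.objects:
        return "servicing_conveyor_at_height"
    return "unknown"

def plan_assistance(activity: str) -> str:
    """Choose the robot's proactive action for the current workflow step,
    to be executed by the manipulation stack (the KIT/EPFL roles)."""
    actions = {
        "servicing_conveyor_at_height": "fetch_and_hand_over:torque_wrench",
        "unknown": "hold_position",
    }
    return actions[activity]

scene = SceneEstimate(objects=["ladder", "conveyor_panel"],
                      human_pose="reaching_up")
activity = recognise_activity(scene)
print(activity, "->", plan_assistance(activity))
```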

What is Ocado’s role in the project?

“Ocado’s part in that is the project coordination and the integration of all of these systems on the robot.

“So we worked on ways of integrating these components that allowed interoperability, but also gave the partners the flexibility to operate in their own environments – resources were limited, with only the one robot to work with – and let us demonstrate the integration between partners very rapidly.”
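One plausible way to get that kind of loose coupling – sketched below under assumptions, since the project’s actual middleware and message formats aren’t described here – is for every partner to code against a shared interface that can be stubbed out whenever the real robot isn’t available:

```python
# Hypothetical sketch of loosely coupled component integration.
# The project's actual middleware and interfaces may differ entirely.

from abc import ABC, abstractmethod

class VisionComponent(ABC):
    """Contract each partner codes against, whatever their local setup."""

    @abstractmethod
    def detect_objects(self, image_bytes: bytes) -> list[str]:
        ...

class StubVision(VisionComponent):
    """Canned responses let partners integrate without access to the robot."""

    def detect_objects(self, image_bytes: bytes) -> list[str]:
        return ["ladder", "torque_wrench"]  # fixed output for integration tests

def integration_smoke_test(vision: VisionComponent) -> None:
    # The same test runs against the stub in a partner's own lab
    # and against the real component on the shared robot.
    assert "ladder" in vision.detect_objects(b"")

integration_smoke_test(StubVision())
print("integration contract satisfied")
```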

How is the SecondHands team at Ocado made up and what challenges have you faced on the project so far?

“There are three main research scientists on the SecondHands project at Ocado, and they’re performing evaluation tasks, defining the environment the robot is working in and running the tests. There are also at least another three of us proactively working on the project, managing the experiments and managing the environment – really enabling the work to go ahead.

“The lab space is shared with other projects, but predominantly designed for SecondHands. We ended up with an arena in which to operate the robot, with a motion capture system providing ground truth for the vision system. And we’ve built the diverter systems, replicating the operational systems of the warehouse within our lab.

“We are lucky enough that the robot lab itself is within the warehouse, and so is subject to exactly the same environmental conditions – the same lighting and the same noisy environment – that the technicians would be working in.

“And those are some of the other challenges that we’ve had to cope with: many of the components of the system – the vision system, the understanding of what’s going on – have to take the lighting and the background noise into account, and the robot is truly working and being evaluated in that real environment.”
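Duncan mentions a motion capture system providing ground truth for the vision system. As an illustration of what that evaluation might look like – a toy sketch with invented numbers, not the project’s actual metrics – a common approach is to compare the vision system’s 3D position estimates against synchronised motion-capture measurements and report an error statistic such as RMSE:

```python
# Toy evaluation sketch: vision-system position estimates compared against
# motion-capture ground truth. All numbers are invented.

import math

def rmse(estimates, ground_truth):
    """Root-mean-square error between paired 3D points, in metres."""
    assert len(estimates) == len(ground_truth)
    squared = [
        sum((e - g) ** 2 for e, g in zip(est, gt))
        for est, gt in zip(estimates, ground_truth)
    ]
    return math.sqrt(sum(squared) / len(squared))

# Vision estimates vs. synchronised mocap measurements of the same marker.
vision = [(0.52, 1.10, 0.80), (0.55, 1.12, 0.82)]
mocap  = [(0.50, 1.10, 0.80), (0.54, 1.15, 0.80)]

print(f"position RMSE: {rmse(vision, mocap):.3f} m")
```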

Where are you up to with the project?

“The project proposal was written a few years ago to obtain the funding. This is a large project and it’s actually running for five years.

“We’re over two and a half years into the project [now], and the evolution from the original design has meant that neural networks are a lot more prominent in robotic systems. We’ve got a lot more data for the learning and, in fact, while many of the components are still reaching for the same objectives as the original proposal, the underlying technical systems have changed somewhat, [so] we ended up with a lot more requirements for heavyweight processing to run neural networks and obtain the intelligence of the system.

“Recently we took delivery of the new robot, and with our integration approach it didn’t actually take long to get all the components running on it. That was a really good achievement: the robot was working in the environment. The approach we then needed to take was to adapt those components to the limitations of the environment – the lighting conditions, recognising things in the physical space. So there was a bit more learning for those systems to do, but the structures were already there. That was the main outcome of that initial integration.

“Providing the proactive help – that’s still very much in development. We’re still working through the research on things like the vision system: the fundamental recognition is there, but the performance needs to be a lot faster to actually offer help proactively, make decisions in real time and work with the technician.”
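To give a sense of what “fast enough for real time” demands – a toy illustration only; the 100 ms budget below is an assumption, not a project figure – a perception step that supports proactive decisions can be checked against a simple per-frame latency budget:

```python
# Toy latency-budget check for a perception step that must keep up with
# human interaction. The 100 ms budget is an assumption, not a project figure.

import time

FRAME_BUDGET_S = 0.100  # assumed per-frame budget for proactive decisions

def fake_recognition(frame):
    """Stand-in for a neural-network inference over one camera frame."""
    time.sleep(0.03)
    return "technician_reaching_up"

start = time.perf_counter()
label = fake_recognition(object())
elapsed = time.perf_counter() - start

verdict = "within" if elapsed <= FRAME_BUDGET_S else "over"
print(f"{label}: {elapsed * 1000:.0f} ms ({verdict} budget)")
```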

How close are you to deploying this robot in the warehouse?

“I think there are lots of aspects of the project which can be taken forward. I think the robot as a whole, and the intelligent system as a whole, will need to go through another phase of development at the end of the project. But there are some initial learnings, some initial components, which can be exploited now.

“The arm design is really fantastic and could be exploited immediately, and some of the intelligence components could be taken forward. But getting the whole robot to work in the environment is still very much a challenge, and that’s part of our next steps on the project.

“Coming out of it, we hope to exploit many parts of it immediately, and we hope to see the robot as a whole continue to develop into something that will offer proactive assistance.”

What are the unique challenges you enjoy about working in robotics?

“Robotics is probably the most concentrated systems engineering problem, especially with something like SecondHands, where we’re running a robot with human-robot interaction.

“The robots are working directly with humans. So there’s a systems engineering challenge in terms of the functional parts of the robot – the physics of the robot – but also the social interaction and how people react to a robot in an environment.”

The SecondHands project represents a significant stride forward in cobot technology and gives us a glimpse of what the future holds for automated warehouses.

Ultimately, it brings us ever closer to a future where we work alongside humanoid robots that can interact with us, anticipate our needs and even exceed some of our capabilities.
