Ocado has sold its Smart Platform to supermarkets in Sweden and the US. Find out more about the technology behind its automated warehouse in our two-part series.
Soon robots could be packing all your groceries.
Advances in robotic and AI technology are making it possible to automate tasks in ways that even a few years ago would not have been thought possible. This technology is already transforming our world and changing how we live and work.
14% of jobs in the 32 OECD countries are highly automatable, according to the OECD paper Automation, skills use and training (Nedelkoska, L. and G. Quintini, 2018). And 32% of jobs are expected to undergo significant changes in the way they’re performed, with an increasing number of tasks being automated.
With more and more businesses looking to boost efficiency and productivity by investing in automation, it’s right that we ask questions about the future: how will we regulate robots? How will we spend our time if our jobs are replaced by machines? Is universal basic income the solution? And ultimately how can we develop and use robots so they work for all of society?
Luckily these questions don’t require an immediate answer. We’re still only able to make robots that operate within narrow contexts. In the absence of more general intelligence and capability, in most cases it is select functions of jobs that are being automated – as the OECD paper highlights.
We see this through collaborative robots (‘cobots’) doing the legwork in smart factories and through businesses using machine learning to model data on a scale beyond human ability, while humans oversee this activity in roles that require less replicable skills.
These cobots are complementing and augmenting human capabilities, and in many cases improving our work-life balance in the process. And different human jobs are emerging – ones requiring strategic thinking, creativity and expertise in engineering, robotics and AI.
The robots packing your shopping
Ocado is the world’s largest online-only supermarket. Based in the UK, it’s one of the companies leading the way in automation.
As part of Ocado’s mission to revolutionise the way people shop forever, it’s invested heavily in researching and developing innovative technology to increase efficiency and improve its services.
Ocado Technology, based in the UK in Hatfield, 20 minutes outside London, accounts for around 1,200 of Ocado’s 12,600 workforce. It also has offices in Spain, Poland and Bulgaria.
The division is responsible for developing the software and systems behind the online grocery platform, “covering real-time control systems and robotics, computer vision systems, machine learning and AI, simulation, data science, forecasting and routing systems, inference engines, cloud, IoT, big data and more”.
As well as working with Ocado Engineering (the division developing innovative engineering, construction and logistics solutions for Ocado’s Customer Fulfilment Centres) on the Ocado Smart Platform, Ocado Technology is involved in two EU-funded research and innovation projects, SoMa and SecondHands, “which combine state of the art robotics, artificial intelligence, machine learning and advanced sensors to understand and assist human warehouse workers in real time”.
“At Ocado we’re basically trying to get robots to help pack shopping,” explains Graham Deacon, Robotics Research Team Leader at Ocado.
Graham has a background in engineering, philosophy and psychology. A former lecturer in Robotics and Mechatronics at the University of Surrey, he has also worked on bespoke applications of sensor-guided robots in the automotive industry and spent some time at the BBC Research Labs working on the synthesis of novel views from multiple concurrent images. He was part of a team that developed a robot to assist neurosurgeons, managed a team maintaining a number-plate recognition system, and was technical lead for a start-up company looking at using robots to fold sheet metal.
Graham holds a B.Sc. in Engineering Science (Electrical) from Warwick University, an M.Eng. in Control Systems from Sheffield University, an M.A. in Philosophy of Mind from Hull University and a PhD in Artificial Intelligence from Edinburgh University.
Since joining Ocado in 2010, Graham has assembled a robotics research team and led a number of projects aimed at developing robots that can pack shopping.
One of the robots Ocado has developed is a robotic arm with a suction cup end effector capable of picking up a range of objects, which the company is now putting into production. It’s effective, but limited in the sorts of objects it can pick up.
“We’re putting a solution into production now which uses a suction cup. And so we can pick a whole class of objects which that kind of end effector can cope with,” says Graham. “[But] we needed some way of being able to handle objects which are deformable, easily damaged, never the same shape twice.”
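The limitation Graham describes, a suction cup handling one class of items while deformable or easily damaged goods need something else, can be pictured as a simple routing step. This is a hypothetical sketch: the item attributes and the decision rule are invented for illustration, not Ocado’s logic.

```python
# Hypothetical routing of items between a suction-cup picker and a
# different strategy, based on the limitation described above.

def suction_suitable(item):
    """A suction cup needs a reasonably rigid, robust surface to seal against."""
    return not (item["deformable"] or item["fragile"])

items = [
    {"name": "tin of beans", "deformable": False, "fragile": False},
    {"name": "bag of lentils", "deformable": True, "fragile": False},
]
for item in items:
    strategy = "suction cup" if suction_suitable(item) else "soft hand"
    print(item["name"], "->", strategy)
# tin of beans -> suction cup
# bag of lentils -> soft hand
```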
Graham is involved with the research project SoMa as the work package leader. It’s a project funded by the European Union’s Horizon 2020 Research and Innovation programme and aimed at exploring how best to use soft manipulators to develop manipulation strategies that exploit environmental constraints.
The SoMa project is a collaboration between Technische Universität Berlin (TUB), the University of Pisa (UNIPI), the Italian Institute of Technology (IIT), DLR (the German Aerospace Center, which also houses Germany’s space agency), the Institute of Science and Technology Austria (IST Austria), Ocado and Disney Research Zurich.
Ocado presents a challenging use case for the project.
“We want to be picking up all kinds of things, including fruit and vegetables,” says Graham.
“The items that are probably the most challenging are things that come pre-wrapped. We typically don’t sell individual instances of fruit and vegetables – they come in plastic bags. These things are like semi-articulated objects. And there’s other things which shift their load as you pick them up.
“So if you were to pick up a bag of lentils or a bag of rice, as you pick it up, the load redistributes, so what you’re handling effectively changes dynamically.”
Robots developed under SoMa interact with their environment, using physical constraints to guide and improve how they handle objects.
We do this all the time. For instance, if we want to pick a credit card up from a table, we’ll typically use the table surface by placing our palm flat on top of the card, drawing it to the edge and then placing our thumb underneath it.
Achieving this requires soft manipulation (which is where the name SoMa comes from): a soft, underactuated hand able to adapt its shape to objects. This is how humans usually use their hands, and robots capable of doing the same could function in dynamic, variable environments.
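The credit-card strategy above can be written down as an ordered sequence of motion primitives that lean on the table surface rather than attempt a grasp in free space. A minimal sketch, with invented primitive names and poses:

```python
# Hypothetical environmental-constraint grasp, like the credit-card
# example: the table surface guides the card into a graspable pose.
# Primitive names and coordinates are invented for illustration.

def slide_to_edge_grasp(card_pos, table_edge):
    """Return the ordered motion primitives for the slide-to-edge grasp."""
    return [
        ("press", card_pos),    # place the palm flat on top of the card
        ("slide", table_edge),  # draw the card along the surface to the edge
        ("hook", table_edge),   # curl the thumb under the overhanging card
        ("lift", table_edge),   # close the grip and lift away from the table
    ]

for primitive, target in slide_to_edge_grasp((0.30, 0.20), (0.50, 0.20)):
    print(primitive, target)
```

The point of the decomposition is that every step except the last is guided by contact with the table, so the plan tolerates uncertainty in the card’s exact position.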
The cutting-edge technology has gone through a number of changes as the project has progressed:
“One of the characteristics of the underactuated hands is the fact that, although they’ve got multiple degrees of freedom, they’ve only typically got one or two degrees of control,” says Graham.
“For instance, the Pisa IIT hand has tendons running through the thumb and all of the fingers. And what they do is they pull this one tendon and all the fingers close, and that’s how it shapes itself to an object. As one finger makes contact, it can’t close any further. So the other fingers take up the contraction of the tendon.
“So there’s a limited range of operation of these things, and it’s about being able to configure them beforehand to be able to pick up the things that you want. We’ve actually done experiments with the hands on the things that we’re interested in – instances of fruit and vegetables – and worked out under what conditions they work well and where they don’t work quite so well. And then we’ve fed this back to the guys that design the hands, and they’ve come back with iterations more suited to our use case.”
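The single-tendon behaviour Graham describes, where one actuator closes all the fingers and each finger stops at contact while the remaining tendon travel goes to the fingers still free, can be sketched in a few lines. This is a hypothetical toy model, not Ocado’s or UNIPI’s software:

```python
# Toy model of a single-tendon underactuated hand: pulling the tendon
# closes every finger until it either meets the object or fully closes.

def close_hand(contact_angles, max_angle=90.0, step=1.0):
    """Simulate pulling the tendon in small increments.

    contact_angles: per-finger angle (degrees) at which that finger
    touches the object, or None if it never makes contact.
    Returns the final angle of each finger.
    """
    angles = [0.0] * len(contact_angles)
    moving = True
    while moving:
        moving = False
        for i, limit in enumerate(contact_angles):
            stop = max_angle if limit is None else min(limit, max_angle)
            if angles[i] < stop:            # finger not yet blocked
                angles[i] = min(angles[i] + step, stop)
                moving = True               # tendon still has travel to give
    return angles

# An irregular object: the thumb touches at 30°, one finger at 55°,
# and one finger never touches, so it closes fully.
print(close_hand([30.0, 55.0, None]))  # [30.0, 55.0, 90.0]
```

This is why the hand “shapes itself to an object” with only one degree of control: the object, not the controller, decides where each finger stops.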
Because of the complexity of getting robots to grasp these kinds of objects, Graham’s team at Ocado have had to conduct lots of real-world experiments:
“One of the challenges we’ve faced on the SoMa project is characterising the operation of the hands. This has led us to be doing loads of actual physical experiments with the hands in our use case. A number of groups out there are using simulators to try and predict the behaviour of systems. But the problem with simulators is it’s very difficult to model the dynamics of contact. And for the underactuated hands, we don’t even know what contacts they’re going to be making. So it doesn’t make a lot of sense for us to try and simulate this. Which is a shame, because it would save us a lot of time.
“So we’ve had to resort to doing the real experiments in the real world in order to get some results that we can rely on.”
With the project about three-quarters of the way through and one year left to run, the team is working on integrating the different technologies it has developed:
“Now we’ve managed to characterise the hands, we can actually use this measure of their performance and present it to the planner. And then the planner can choose the parameters for the hand in order to get the best operation out of them for a given circumstance.
“This planner is being developed primarily by the Technical University of Berlin, with input from the Italian Institute of Technology. The camera system looks at the scene, the planner works out what we should be doing, the planner tells the robot what to do, and we go off and execute it and pick stuff up.
“So it’s getting to the point now where we’re integrating all the results and trying to do something useful with them.”
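The sense-plan-act loop Graham outlines, where measured hand performance feeds a planner that chooses hand parameters for each situation, might look like this in outline. The characterisation table, scores and item classes are all invented for illustration, not real project data:

```python
# Hypothetical sketch of the camera -> planner -> robot loop. The
# "characterisation" table stands in for the results of the physical
# experiments described above.

# Measured success rates for each hand configuration on each item class.
CHARACTERISATION = {
    "bagged_produce": {"wide_wrap": 0.92, "pinch": 0.40},
    "rigid_box":      {"wide_wrap": 0.55, "pinch": 0.88},
}

def plan_grasp(item_class):
    """Planner: pick the hand configuration with the best measured performance."""
    scores = CHARACTERISATION[item_class]
    return max(scores, key=scores.get)

def pick(scene):
    """One pass of the loop: look at the scene, plan, execute."""
    for item in scene:                   # camera system identifies the items
        config = plan_grasp(item)        # planner chooses hand parameters
        print(f"grasping {item} with {config}")  # robot executes the grasp

pick(["bagged_produce", "rigid_box"])
# grasping bagged_produce with wide_wrap
# grasping rigid_box with pinch
```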
The technology being developed as part of the SoMa project has wider-ranging possible applications, and robots capable of interacting with and manipulating objects in dynamic, changing environments have exciting potential.
“You know, we take for granted the fact that we can pick stuff up and manipulate it. All this stuff that’s relatively straightforward to us is actually really difficult to get a robot to do. And I’m still trying to work out how I can get a robot to exhibit the same kind of skills and performance that people do every day.”