Woodside bots tested worlds away
Woodside Energy has shared details of how it writes and deploys software for its robots.
Woodside Energy robotics engineer Robert Reid spoke last week at a summit held by cloud giant Amazon Web Services (AWS).
He said Woodside has built a continuous integration and continuous deployment (CI/CD) pipeline for developing and shipping its robot code.
The company is testing a range of robots, including a four-wheeled unit with remotely controlled arms and sensors, to patrol its Pluto liquefied natural gas (LNG) facility in Karratha, Western Australia.
The main job of the robots is to look for leaks and other safety issues. Mr Reid said code is first written at Woodside’s Perth lab.
“If we’re doing code development, then we’ll be doing that on a development robot out here in the lab,” he said.
“Once we’re happy with some of the code changes we’re making, we’ll be pushing them up to GitHub, where the CI/CD processes will kick off and build those changes into fresh Debian packages.
“We have a staging robot also out in the carpark, so once those packages have been built up into a new Docker image, we’ll pull it down onto the staging robot, and we’ll spend multiple days testing.
“Once we’re happy the robot is performing as expected, then we’ll actually push that image to the production robot, which is sitting up in Karratha right now.”
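Woodside did not publish its pipeline configuration, but the promotion flow Mr Reid describes (build, deploy to staging, test for days, then push to production) might look roughly like the following sketch. It assumes the docker command-line tool is available, and the registry, image and tag names are hypothetical, not Woodside’s:

```python
"""Illustrative sketch only: promote a CI-built robot image from staging to
production once testing is complete. Registry, image and tag names are made up."""
import subprocess

REGISTRY = "registry.example.com/robotics"   # hypothetical registry
IMAGE = f"{REGISTRY}/patrol-robot"
BUILD_TAG = "build-1234"                     # tag produced by the CI job


def run(cmd):
    """Run a shell command and fail loudly if it errors."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def deploy_to_staging():
    # Pull the freshly built image onto the staging robot for multi-day testing.
    run(["docker", "pull", f"{IMAGE}:{BUILD_TAG}"])
    run(["docker", "tag", f"{IMAGE}:{BUILD_TAG}", f"{IMAGE}:staging"])


def promote_to_production():
    # Once the staging robot behaves as expected, retag and push the same
    # image so the production robot can pull it.
    run(["docker", "tag", f"{IMAGE}:{BUILD_TAG}", f"{IMAGE}:production"])
    run(["docker", "push", f"{IMAGE}:production"])


if __name__ == "__main__":
    deploy_to_staging()
    # ... multi-day testing on the staging robot happens here ...
    promote_to_production()
```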
The equipment relies on the widely used open source robotics software framework, the Robot Operating System (ROS).
Mr Reid describes it as the “glue that really brings the various parts of the robot together”.
“We have a range of sensors and their device drivers, and ROS allows us to take the data from each of those sensors, bring them together with a range of algorithms such as localisation, obstacle detection, navigation, and also allows us to encode the images as video, for example, so that we can push that data up to the cloud,” he said.
“We bring all of those various components together through a CI/CD pipeline that is running in AWS services.”
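Mr Reid did not show code, but the “glue” role he describes is typically a set of ROS nodes exchanging messages between sensor drivers and processing algorithms. As a rough illustration only, written against ROS 2’s rclpy API rather than whatever version Woodside runs, and with hypothetical topic names, a minimal node that takes raw camera frames and republishes them for a downstream encoding and upload stage might look like this:

```python
"""Illustrative ROS 2 node only: topic names and relay behaviour are assumptions."""
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image


class CameraRelay(Node):
    """Takes raw frames from a camera driver and republishes them on a topic
    that a downstream video-encoding or cloud-upload node could consume."""

    def __init__(self):
        super().__init__("camera_relay")
        self.pub = self.create_publisher(Image, "/patrol/frames_for_upload", 10)
        self.sub = self.create_subscription(
            Image, "/camera/image_raw", self.on_frame, 10)

    def on_frame(self, msg: Image) -> None:
        # In a fuller pipeline, per-frame processing (obstacle detection,
        # leak detection, etc.) would hook in here before republishing.
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = CameraRelay()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```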
The robots at the Pluto-Karratha Gas Plant (KGP) follow a predetermined path to capture data, including images, before re-docking at a ‘bot box’.
The robots guide themselves using 3D point clouds of the facilities, which Woodside can access in digital form in its test lab.
“For the thousands of hours that we do out in the field, you can put hundreds of thousands of hours in simulation,” head of robotics Mark Micire said.
“Our plant doesn’t change a lot, so unlike… other robots that are in very dynamic environments, we can cheat a little bit and go through and generate a point cloud ahead of time that is good to centimetre and millimetre accuracy.”
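Woodside has not said which localisation algorithm its robots use, but matching a live scan against a centimetre-accurate prior map is commonly done with ICP registration. The sketch below uses the open source Open3D library, with hypothetical file paths, purely to illustrate the idea:

```python
"""Illustrative sketch only: align a live lidar scan against a pre-built point
cloud of the plant using ICP. File paths and thresholds are assumptions."""
import numpy as np
import open3d as o3d

# Prior map of the facility, captured ahead of time, and a fresh scan
# from the robot's lidar (both paths are hypothetical).
plant_map = o3d.io.read_point_cloud("plant_map.pcd")
live_scan = o3d.io.read_point_cloud("live_scan.pcd")

# Rough initial guess of the robot's pose, e.g. from wheel odometry.
initial_pose = np.eye(4)

# Point-to-point ICP: find the rigid transform that best aligns the live scan
# with the prior map within a 0.5 m correspondence threshold.
result = o3d.pipelines.registration.registration_icp(
    live_scan, plant_map,
    max_correspondence_distance=0.5,
    init=initial_pose,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

print("Estimated robot pose in the map frame:")
print(result.transformation)
```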
The engineers say their lab environment also allows them to run regression tests and simulations, surfacing issues that would otherwise only become clear in the field.
“Frankly, for a lot of this equipment, it’s lab equipment that we’re adapting to the ‘real world’, so we’re figuring out how it breaks,” Mr Micire said.
“We’re actually searching for those data points that you’re only going to find after the thousandth hour of testing, and it’s those data points that we want to find out now.
“We want to really find them in a testing environment. That way when we’re working in a real operational environment, we’ve already scrubbed out all of those problems.”
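In practice, a simulation-driven regression test can be as simple as replaying a patrol route against the mapped plant and asserting the behaviour has not drifted. The pytest-style sketch below is a generic illustration, with a toy patrol simulator standing in for the real navigation stack:

```python
"""Generic illustration of a simulation-based regression test; the route and
patrol logic are toy stand-ins, not Woodside's navigation stack."""
import math

WAYPOINTS = [(0.0, 0.0), (10.0, 0.0), (10.0, 5.0), (0.0, 5.0)]  # hypothetical route


def simulate_patrol(waypoints, step=0.5):
    """Toy patrol simulator: walk the route in fixed-size steps and yield poses."""
    x, y = waypoints[0]
    for tx, ty in waypoints[1:]:
        while math.hypot(tx - x, ty - y) > step:
            heading = math.atan2(ty - y, tx - x)
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            yield (x, y)
        x, y = tx, ty
        yield (x, y)


def test_patrol_stays_inside_plant_boundary():
    # Regression check: no pose along the simulated route leaves the mapped area.
    for x, y in simulate_patrol(WAYPOINTS):
        assert -1.0 <= x <= 11.0 and -1.0 <= y <= 6.0
```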
Currently, Woodside’s robots patrol with a human safety operator alongside, but the company says its future plans include fully autonomous patrols and even using the robots to manipulate machinery and other objects in the environment.