Getting robots to do industrial work has been decades in the making. It’s part science fiction and many parts technology development. But after 30 years of research, Sarcos Robotics is finally getting its robots into the market this year.
These Guardian robots enable human workers to extend their strength with exoskeletons or venture into places that are unsafe for humans, like the tops of telephone poles.
Kiva Allgood has been part of the Sarcos family for six years, and in the last three months she took over as CEO of the Salt Lake City, Utah-based company. Sarcos is mixing technology innovation, business model innovation, and industrial innovation.
There is a great fear that robots will replace industrial workers. But Allgood said the reality is that there aren’t enough skilled workers to hire in many industries, including manufacturing. On top of that, robots can work alongside humans and even enhance their skills, like the mech suits that helped Delta Air Lines baggage handlers carry a lot more luggage in a day.
Allgood said that robots can extend the working life of an employee, especially one who has to pick up 50-pound boxes many times a day. She said the Guardian XO exoskeleton in particular can make humans three times more productive than normal.
The company’s Guardian XT robots can be mounted on lifts many feet off the ground to handle tasks like fixing power lines or cutting trees, with a human controlling them from the ground.
Here’s an edited transcript of our interview.
VentureBeat: Tell us some background.
Kiva Allgood: Let me start with the history of the company. For me personally, I was a part of the investment team at GE that invested in Sarcos. I’ve been part of the family for a little over six years. I’ve been in the CEO role only for the last three months, but it’s been, to me, a bit like coming home. It’s the perfect mix of technology innovation, business model innovation, and industrial innovation. Having started my career as an IT consultant, spending a lot of time in factories, I’ve been able to live a bit of those different hype cycles.
Sarcos is focused on providing the workforce of the future. What I mean by that is, there aren’t enough workers, as you know. Trained, skilled workers who want to be machinists or do heavy industrial work are few and far between. That was one of our biggest pain points at GE, being able to hire and train enough mechanics. It takes four years to become a certified mechanic on a jet engine. You’re apprenticed for quite a long time, and you have someone beside you for even longer. The work is physically demanding, which means you have to be able to lift very heavy pieces of equipment. We couldn’t hire enough people. There weren’t enough people coming from trade schools.
At the heart of that problem statement is what we’re trying to solve. We’re trying to provide an augmented workforce. It’s not an industrial robot that goes into a factory and does the same task over and over again. The beauty here is that our robots work alongside a human, typically in places that are different every time. Again, there aren’t enough workers to go around. That’s a problem statement I’m sure you’ve heard everybody make. It’s true in almost every industry. We’re laser-focused on how we bring extra companions to the workforce so that you can get more out of a team member than you do today.
VentureBeat: How so?
Allgood: The way we think about it, this is a fleet of robots. It’s robots as a service. If you think about an average worker during the day, you have a Guardian XT, which is the avatar robot on the left. The pull for that product has been amazing, in part because of the fidelity and dexterity of the robot and what you can do with it at that height. There are a lot of companies and use cases where they’ve made OSHA statements about no longer putting humans at height, at 25 feet, to do a task. With the robot we have here, the Guardian S, you can be on the shop floor and use a drill at 15 feet high, or scan for inspection through a pipe. Again, you’re on the ground and moving this robot using an avatar that is connected to your body.
VentureBeat: Did you say you’re not allowed to work at that height, or they suggest that you don’t?
Allgood: A lot of large companies have, as part of their ESG programs, stated that they’re no longer going to allow humans to be at height. They’re trying to come up with augmented solutions. How do I get the task done now that I can’t put a human in a basket? In part because of the cost of insurance. People getting hurt is very expensive.
With that use case, that creates a lot of opportunities for us to look at construction tasks. Anything that’s non-repetitive, in an environment that changes. Every construction site is different. We’re doing a lot for vegetation removal. If you think about working around power lines, in the end you have a human at the top trying to cut vegetation away from the lines. Unfortunately, sometimes they miss. Again, if you look at those high-risk tasks, you can use an avatar and have the human assist from the ground while the robot does the task. The human is no longer in harm’s way.
We’re also advancing toward making those tasks autonomous. For a given space, for example, the avatar will do its task from point A to point B autonomously. We’re going to integrate that as well. It gets set in the field by the avatar robot operator and it’s condition-based, which is very different from an industrial factory floor where you’re just pushing a can across, something that’s the same every time. We have to train for in-field conditions, which requires more adaptability.
We’ve also learned that, when we get in an operator who knows how to do the task, it’s harder to learn the task than it is to learn how to work the robot. We have operators who know the robots well, but they aren’t used to using a saw and cutting vegetation. When we bring in our customers, they very quickly jump in and say, “Oh, I can do this, this is great.” With each of the use cases and customers we bring in, if they know the task, learning to do that task with an avatar is a much shorter cycle than if you have someone who knows the robot and is learning a new task.
That’s great for all these different places that are also trying to extend the life of the employee. Some of these jobs are harsh. When you’re using a drill or you’re sandblasting, that puts a lot of strain on your joints. But these are highly skilled individuals. The product extends the life of the worker as well, which is another key point for all of the partners we’re looking at.
That’s the Guardian XT. The other product we have is the Guardian XO. Just as it looks, the exoskeleton enables the operator to do repeated tasks with heavy equipment. Think about anything where the employee is repeatedly picking up 50 pounds, 60 pounds, 70 pounds, or something awkward. One of the other key requirements has been handling something like a big barrel, which is hard for team members to pick up. This allows them to do exactly that. It also allows them to lock the arms in place, so if they need to do something else using their hands, they can.
The beautiful part here is when someone gets inside. It’s surprising, actually. You think you have to work the exoskeleton, but it actually becomes a companion. From a training perspective, it’s the integration of advanced technology, as well as solving for those repeated tasks in the field, across the board. Lots of different FTEs there.
Those are our two flagship products. It’s been years of development. We’re able to lean in now on commercial products because of the benefit of 30 years of investment from the government. A lot of that technology we’re now leveraging. I’m sure you’ve seen this cycle before, where the government is trying to advance certain technologies, and the price curve has come down. We’re finally at the point of commercial viability because the investment over the last 30 years into figuring out exoskeletons, dexterity, and autonomy is meeting with AI. Because of that convergence we’re able to get to the point of commercialization and solve these tricky problems that enterprises bring to us.
The exciting part is, if you go to Disney World, you’ll actually see a Sarcos product there as well. The founders are premier robotics leaders in the industry. They have products out there, from dinosaurs to humanoids. Their focus was on how to get a product that provides dexterity and mobility like a human. We’ve been able to capitalize on that: we now have two flagship products that take that innovation and bring it into commercial work as well.
This is the workforce multiplier. Again, it extends the working life of a worker and reduces the risk of injury. It allows the team member to keep doing these tasks for longer. As for features, it can lift up to 200 pounds. We have hot-swap batteries for almost continuous operation. It’s capable of up to 100 percent load relief, but what we’ve found is that operators need to know there’s a bit of weight there. It’s better for them to feel five pounds than for the load to be completely weightless. Pricing right now, we’re doing robot as a service. We’re augmenting a fully loaded employee, where you pay for the robot on a per-month basis.
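As a rough illustration of the load-relief tradeoff Allgood describes, here is a minimal Python sketch. The 50-pound box, the five-pound preference, and the 100 percent relief figure come from the interview; the functions themselves are hypothetical and not Sarcos code.

```python
# Illustrative sketch (not Sarcos code): how much load relief leaves an
# operator feeling roughly five pounds of a given payload.

def felt_weight(payload_lbs: float, relief_fraction: float) -> float:
    """Weight the operator perceives after the exoskeleton offloads a fraction of it."""
    return payload_lbs * (1.0 - relief_fraction)

def relief_for_target(payload_lbs: float, target_felt_lbs: float) -> float:
    """Relief fraction that leaves the operator feeling the target weight."""
    return 1.0 - (target_felt_lbs / payload_lbs)

if __name__ == "__main__":
    payload = 50.0  # a typical box weight mentioned in the interview
    print(felt_weight(payload, 1.0))                   # 0.0 lbs at 100% relief
    print(round(relief_for_target(payload, 5.0), 2))   # 0.9 -> 90% relief feels like 5 lbs
```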
VentureBeat: Robots as a service?
Allgood: If you think about the transformation in the automotive industry, from a chassis and what a car used to do to a software-defined vehicle like Tesla builds, we’ve seen that same transformation in what we’ve been doing with robots. They used to be hydraulic. Now they’re electric. We’re using software and AI to understand force, function, and gait. We’re doing a lot of simulation to improve the overall performance of the robots.
The XT is a highly dexterous robot. It is an avatar. It can reduce the crew needed for a task. Typically a lot of the tasks at height require three people: a person in the basket, a person observing, and a person operating the lift. Now you can have someone do all three of those things, depending on what the task is. We’re taking a lot of the investment we’ve made into our Guardian XO, the top half, and putting that, stationary, onto a lift product, depending on the customer, because everyone uses different lifts. It’s the same robot as a service, for indoor and outdoor use, at $5,000 a month, augmenting the work and the team there.
From my perspective, this one is so intuitive right out of the gate, especially because of the form factor itself. We have lots of good momentum there. And that’s Sarcos in a nutshell.
VentureBeat: I saw that you have some military contracts. Is that something you’re doing, or are you staying out of that space?
Allgood: Historically, the Department of Defense has put in about $300 million of non-dilutive investment. Our focus right now is taking that investment and driving it into commercial applications. That doesn’t mean we’re not continuing to partner with the Department of Defense on specific use cases. There’s a lot of opportunity in nuclear waste removal or working with the Navy on yard ships, but our primary focus, as we’ve communicated, is on the commercial side.
VentureBeat: It sounds like you don’t want to go into a controversial area, things like combat robots?
Allgood: We do not.
VentureBeat: This is such a fixture of science fiction in the past. Is that where some of the origins are, the ideas for starting this up a long time ago?
Allgood: I think the Department of Defense, if you rewind 30 years, did have a vision for an autonomous robot. I’m not the best person to answer on that whole history. As for our founders, Mark and Frazier: when you’re working with the government, the government has a clear view of where it would like the product to go. Their goal, though, was always to lean in on designing and developing a dexterous robot that performed like a human. Being that focused on it resulted in other products. We have products at Disney, as I said. One of the first products they sold was a prosthetic, which is still used today. Their goal was getting to that human dexterity.
VentureBeat: The kind of design it has now, is that what works well, relative to something that’s all-encompassing and enclosing? That’s still kind of science fiction, like the Iron Man suit.
Allgood: I’ve had a lot of those questions. Tesla is now saying they’re going to build a robot that looks kind of like the Iron Man suit. Right now the goal is to enable someone to do three times the work leveraging the exoskeleton, in part by reducing fatigue, giving them more lift ratio. The enclosure piece–it all has to be ergonomic and comfortable to wear. The focus is on those design elements for each different use case. And keeping workers out of harm’s way.
Our goal is to prevent injury and save lives using autonomous robots. That’s our mission. That means bringing together very complex systems and AI. It sounds a little science fiction, but again, I’ve been in the telecom and mobile industry for a very long time. The ability to travel the world with a single handset and be able to call everybody and Facetime everybody was a little science fiction too.
VentureBeat: How much does each one tend to cost? I imagine it varies, but if you’re doing one of those power line robots, what kind of investment does that represent?
Allgood: As I said, it’s robot as a service, so that’s $5,000 a month, depending on the task. If it’s 24/7, that pricing is based on the fact that they have a fleet of 10 that’s continuously in use over a six-year period, coming back for maintenance after three years. We’ve tried to create something like a fleet vehicle, in a way, and tried to benchmark what other fleet-type services have provided.
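For a sense of what that service model adds up to, here is a back-of-the-envelope Python sketch. Only the $5,000-per-month rate, the fleet of 10, and the six-year term come from Allgood’s answer; everything else is an illustrative assumption, not a Sarcos figure.

```python
# Back-of-the-envelope sketch of the robot-as-a-service pricing described in the interview.
# The $5,000/month rate, fleet of 10, and six-year term come from the interview;
# the rest is an illustrative assumption.

MONTHLY_RATE = 5_000   # dollars per robot per month (from the interview)
FLEET_SIZE = 10        # robots in continuous use (from the interview)
TERM_YEARS = 6         # service term, with maintenance return after year three

annual_cost_per_robot = MONTHLY_RATE * 12
fleet_cost_over_term = annual_cost_per_robot * FLEET_SIZE * TERM_YEARS

print(f"Per robot: ${annual_cost_per_robot:,} per year")                              # $60,000
print(f"Fleet of {FLEET_SIZE} over {TERM_YEARS} years: ${fleet_cost_over_term:,}")    # $3,600,000
```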
VentureBeat: Have you talked about how much you expect to make in coming years, or how much you’ve made to date?
Allgood: Our goal is to get commercial products out in 2022. We hit a pretty big milestone in December and got the new XT arm up and running at full fidelity. We now have a little art gallery here. With each iteration we’re refining what the new hand can do. We’re super excited from a progress standpoint. Our goal in 2022 is to start to ship products. Right now we’re working with the different commercial engagements we have and making sure we can get products in their hands for testing as quickly as possible. That’s the goal for 2022. Going into 2023, we’ll start to focus on commercial shipments to customers.
VentureBeat: It looks like you have four different models on the website. Are those different generations? How many generations of prototypes have you been through?
Allgood: We’re on our fourth generation of the XO and our second generation of the XT. We’re about to stand up the combo, which is basically the upper for the XO going into the XT. We’ll be using the torso and the top part, which is a new strategy for us.
We do have two other products. One is more of a military product, a heavy lift system: if a vehicle is flipped over, you can stick this underneath and right it very quickly. And then we have our Guardian S, an inspection snake used in industrial environments, and also in holes to find things that are hidden and such. Those are two additional products we have as well.
VentureBeat: I’ve been talking to Nvidia a lot lately about their Omniverse simulation environment. They mention that a lot of companies are designing digital twins of factories in a metaverse-like setting to simulate everything they need to review. Once the design is perfected, they build the factory in reality. BMW is doing one of those digital twins. I believe Ericsson is doing it for a different reason. Do you factor into those kinds of ideas? It seems like your robots could be a big part of those factories.
Allgood: 100 percent. Every factory has to go through a factory inspection every year. Typically larger petrochemical companies will have a staff of anywhere from 100 to 400 people where all they do is set up scaffolding and do that inspection. Now they can do it from a control room with an avatar going through. That’s huge.
The metaverse component of it, on the avatar side: you’re wearing glasses, you’re visually seeing through them, and you’re able to get higher fidelity, because we’re collecting that video. That’s also a huge component of our products. We’re able to give you both the test and the action at the same time. You can do an inspection, collect that video, and create a digital twin. We’re an enabler for sure. Digital twins are part of the future. You’ll get that ubiquity across the board. If you think about where we’re going with our robots, we’re going to enable that environment, because we’ll be able to collect that information a lot faster than anyone else, because we’re right there. The human eye is only collecting a moment, a snapshot. Using the avatar to do the inspection, we’re actively collecting and automatically creating a digital twin.
Clearly we have a lot of work going on with the compute model inside. Our CTO, Dennis, that’s his background. We’re investing heavily in that regard. I do think there will be a time where, for training, we’ll be using something like a metaverse where operators won’t necessarily train out in the field. They’ll train in an office, and that will be a metaverse. That will be a key part of the training rollout, especially for some of these use cases where they do simulator training today, but they do it in a less engaged environment. Hazardous waste removal, things where they use a virtual simulation. We’ll be able to provide a lot more fidelity around that.
VentureBeat: Integration of the metaverse into your robots, that physical-digital combination.
Allgood: For sure. And then it’s really robot in robot world.
VentureBeat: What kind of user interface works well with these? Are you combining it with things like augmented reality to pass information on to the human controller?
Allgood: Not at the moment. We are in the sense that, as we work specifically on autonomy use cases, we’re leveraging it. If you get up there and you want the avatar, the robot, to drill these five feet in these segments, you set that space, and then the operator can just watch, or go do something else. Then they’ll set the next space. From that perspective, it’s a combination of just about every AI and ML technique out there. We’re doing a ton of simulation in different use cases to come up with those models.
If you think about it, we’re selling robots as a service. We’ll also be selling AI as a service. If you want to do it autonomously, we will build that autonomy in. But that’s also software as a service on top of the robot.
VentureBeat: It seems like controlling the robot might be the biggest difficulty. Are there cases where it’s still easier to do it with your hands?
Allgood: If you come visit, I think you’d be surprised. We get this “Aha!” look. We do demos all the time. Somebody who says, “I can’t play video games,” he gets in it and says, “Wow. It’s almost like I’m standing up there. All I have to do is move one hand right and the other hand left.” It’s a lot more intuitive than you think.
In the XO it’s your own hand motions. As long as you lean in, kind of like you would do in a car–when you get in a car you know the car is going to take you someplace. You’re not trying to push it with your feet. When you get in the exoskeleton, as long as you understand this is going to help you and lean into that, it goes where you go and does what you ask it to do. It helps you perform a task at a much easier pace.
VentureBeat: Where are you based? How many people do you have, and how much money have you raised?
Allgood: We’re based in Salt Lake City. We have almost 200 people, and 80 percent of those are engineers. We have $300 million in non-dilutive funding that we’ve gotten from the government, and we raised $260 million back in September.
VentureBeat: Will the factory be in Salt Lake City as well?
Allgood: Correct. We’re on track for our first production units to come from the factory here. We just opened up a brand new facility.
VentureBeat: I remember the Delta Airlines demo. Is that rolling out?
Allgood: Yes, it is. The airlines are having a hard time finding employees as well. We’re seeing two-day delays in some places because they can’t get together enough people. That’s a big opportunity for us across the board. They’ve been a great partner, a lead partner. They’re at the forefront of trying to provide better service to their customers, which includes getting their baggage to them within 15 minutes of landing. That’s a big challenge.
VentureBeat: Are you expecting to be able to produce hundreds of these in 2022 and 2023? Do you envision a point where you could make thousands of them?
Allgood: Our goal in 2022 is to get the production units out in the field and working with customers, so we can collect that feedback. 2023 is the production year, in the hundreds. The goal is to scale from there. We’re looking at what’s the fastest way to get to production, so we’re looking at that both internally and externally.
VentureBeat: The Delta project, did they give you any kind of numbers or stats on the efficiency there?
Allgood: Right now we’re looking at three to one for the exoskeleton. We might even be able to perform better. Each use case is different. There are different restrictions for team members. If you’re in a hot environment, more than 100 degrees, you can only be on work for 20 minutes, and then you have to go off work for 20 minutes. Depending on the market – and we’ve been very focused on the U.S. – each task has a different requirement. If you’re fully autonomous in the avatar and the operator can be in an air-conditioned building, that work can be continuous, so it’s a lot more than three to one. But I’d say on average it’s three to one.
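To make the duty-cycle point concrete, here is a small Python sketch comparing effective working minutes per shift under that heat rule with continuous avatar operation. The 20-minutes-on, 20-minutes-off rule and the three-to-one figure come from the interview; the eight-hour shift and the comparison itself are illustrative assumptions.

```python
# Illustrative duty-cycle comparison based on the heat rule Allgood mentions:
# 20 minutes on, 20 minutes off above 100 degrees, versus continuous avatar
# operation from an air-conditioned control room. Numbers are for illustration only.

SHIFT_MINUTES = 8 * 60  # assumed 8-hour shift

def effective_minutes(on: float, off: float, shift: float = SHIFT_MINUTES) -> float:
    """Working minutes in a shift for a given on/off duty cycle."""
    return shift * on / (on + off)

in_heat = effective_minutes(on=20, off=20)   # 240 minutes of actual work
avatar = effective_minutes(on=1, off=0)      # 480 minutes, continuous operation

print(f"Heat-restricted worker: {in_heat:.0f} min of work per shift")
print(f"Avatar operator:        {avatar:.0f} min of work per shift")
print(f"Uptime multiplier alone: {avatar / in_heat:.1f}x (before the 3:1 productivity figure)")
```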
VentureBeat: How close are you to having something more like a custom robot versus the same robot for everyone?
Allgood: As we start to look at rolling in autonomy, that’s going to be task-specific and use case-specific. The software will be customized. The end effector matters as well. We’ve done a lot of work in oil and gas inspection. All the pipes are wrapped, so you use an X-ray machine. Right now a human walks with that, carrying a big lead shield and walking on scaffolding. That’s obviously a different use case requiring different end effectors.
Our goal is that each of the products would be able to adapt to any type of equipment and be able to run that, but there are definitely specific use cases that will require additional software development, especially if they want it to run autonomously. Those we will customize for each different task and use case. We’re trying to make sure that the base product can serve all the different use cases with its current hand.