Is It Time For The Jetsons Yet?

Posted by John Werner, Contributor


To some of us, the rollout of artificial intelligence is reminiscent of the last technology shift, which brought personal computers, small devices, and deterministic programming.

I’ll explain. If you recall, in the 1980s and 1990s, as computer science developed with languages like Fortran and BASIC, the dreamers among us saw how these things could easily be tethered to robotic systems. You could make some kind of physical avatar, like a snail or a little car, move in different directions. You could get robotic arms to pick things up…

So it was surprising to some of us that those sorts of applications never really took off. Computing stayed in the digital realm, where it seemed to belong as far as consumer applications were concerned. In business, to be sure, robotics took off in a big way, and that’s still happening. But on the consumer side, we never really got used to the idea that robots could do human labor.

With AI, we stand on the brink of the next piece of what you might call the fourth industrial revolution, where we start to contemplate how smart machines could move around and do things for us, like washing the dishes, or helping a loved one to and from the toilet.

In some ways, it comes at the perfect time, as people are worried about underpopulation and a lack of caregiving labor, not to mention all sorts of other economic and labor problems related to things like housework.

Could AI solve all of this?

I think we all agree that the technology is here. The question is how it will get done.

The Fourth Industrial Revolution

Some experts characterize this technological transformation in ways that suggest robotics is coming sooner rather than later.

“The Fourth Industrial Revolution is … not a prediction of the future, but a call to action,” writes Klaus Schwab in an essay on the subject. “It is a vision for developing, diffusing, and governing technologies in ways that foster a more empowering, collaborative, and sustainable foundation for social and economic development, built around shared values of the common good, human dignity, and intergenerational stewardship. Realizing this vision will be the core challenge and great responsibility of the next 50 years.”

It does seem like referring to the prior industrial revolution, and how AI builds on that, is a good way to frame it.

Researching Robotics

Some scientists are taking a technical approach to measuring the development of robotics.

Here’s a paper in which scientists discuss one such method – they combine manufacturing data with other sources to produce a kind of synthesis.

“The spread of robots and artificial intelligence has raised concerns about automation technology-driven innovation,” the authors write. “This paper investigates the role of robots as a source of unconventional innovation and empirically analyzes the relationship between robots and firm innovation from unconventional and sustainable perspectives. We build a unique dataset containing detailed information on firm characteristics with firms’ patent data and merge it with data on robot adoption in Chinese manufacturing.”

Presumably, we’ll need more of this to really understand what robotics is doing in our markets.

Body and Brain

I want to turn to something that my colleague Daniela Rus said in a recent IIA panel on just this subject – physical AI and robotics.

“In order to have a functional robot, you really need to have a good body, and you need to have a good brain,” she explained. “The brain controls the body to deliver its capabilities … right now, from the point of view of the hardware, we still don’t have all the sensors that are needed in order to get the robots to do more than navigate the world. So if we want the robots to do interaction in the world, we need better sensors (for) navigation.”

I think that’s very on point, and a good way to think about all of this.

More Thoughts on Physical AI

Rus was part of a panel discussing all of the ways that we can facilitate the advent of robotics endowed with AI capabilities.

“It turns out that it’s frustratingly difficult to develop a robot with what I would call ‘AI spatial understanding,’” said panelist John Leonard. “A lot of our AI approaches are based on human-annotated data sets … Facebook/Meta has a data set technique trained on a billion images. That’s not how children learn. I think that navigation and exploring the world lends itself to robots that can learn from their own experience, from much smaller numbers of samples of data, exploiting the kind of spatial, temporal context of the data that they acquire.”

Talking about the Moravec paradox, considered by Minsky and others, Leonard suggested we need a sort of “language of physics” to facilitate the robot boom.

Panelist Thomas Baker had this to say about robot operations:

“If I send a robot to a planet, can I say, hey, build a house? Does it understand what it needs? Does it understand the materials that are around it? Does it understand how to construct everything necessary, to then build what’s necessary, and then handle dust storms and radiation and whatnot? So the problem expands quite a bit.”

Caleb Sirak talked about the impact of such physical systems, asking:

“How do we take the more efficient architectures that we know, the computations in AI, matrix multiplication … how do we take that, and apply that onto a chip that we can produce at scale from anywhere around the world, and then provide that to people that need to use it, and typically in AI, how do we do that at a fast enough speed that we can get it in real time?”
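To see why matrix multiplication is the computation Sirak singles out, consider that a single neural-network layer is, at its core, one matrix multiply. A minimal sketch in Python with NumPy (the layer sizes and names here are illustrative, not from any particular system):

```python
import numpy as np

def layer_forward(x, weights, bias):
    """One dense neural-network layer: matrix multiply, add bias, apply ReLU."""
    return np.maximum(x @ weights + bias, 0.0)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))    # a batch of 4 inputs with 8 features each
w = rng.standard_normal((8, 16))   # weight matrix mapping 8 features to 16
b = np.zeros(16)                   # bias vector

out = layer_forward(x, w, b)
print(out.shape)  # (4, 16)
```

A modern model repeats this pattern across many layers and many batches, which is why AI accelerator chips are designed, above all, to perform this one operation quickly and at scale.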

The result, he noted, has a big effect.

“That’s been a tremendous impact in all around the world, in rural countries,” Sirak said, “and seeing that drones are being able to deliver medicine and be able to fly autonomously is incredible. And seeing that we’re able to distribute this around the world is really powerful.”

Panelist Annika Thomas talked about her experience using AI in a rapidly changing era.

“I went to undergrad at the time where we didn’t have ChatGPT,” she said. “Learned a lot during that time, but I also learned how to interpret information faster, and that’s something that I want to be able to teach our robots to do as well. I want our robots to be able to parse through the spatial environment and figure out what information is most important to keep, especially when we’re looking at these things from a multi-agent setting.”

The panel also took questions and discussed other aspects of this phenomenon – check out the video for more.

Robots in Our World

Once again, we’re invited to think about what this will actually look like. Going back to the 1960s (and a 1980s revival), we had The Jetsons – a cartoon with flying cars, robot maids, and all kinds of high-tech gee-whiz gizmos that have never actually been manufactured for our homes.

Will that change? Stay tuned.



Forbes
