Can AI Fold Your Laundry? Scientists Explore the Future of Robots
Artificial intelligence has made remarkable strides in recent years, revolutionizing the digital world. Now, scientists are working to bring that same intelligence into the physical world through robotics, with the ambitious goal of creating robots capable of performing everyday tasks, like folding laundry.
Chelsea Finn, an engineer and researcher at Stanford University, is at the forefront of this effort. “In the long term we want to develop software that would allow the robots to operate intelligently in any situation,” she says. Finn believes that AI could soon power a new era in robotics.
Finn and her team have already demonstrated a general-purpose AI robot that can fold laundry. Other researchers have used AI to improve robots at tasks like package sorting and drone racing. Even Google has unveiled AI-powered robots designed to pack lunches. Still, the consensus in the robotics community is that a wide gap remains between expectations and reality.
The Gap Between Expectation and Reality
One of the main challenges is that robots need to function reliably in our complex, unpredictable world. They need massive amounts of real-world data, and they have to overcome problems that go far beyond language modeling and image generation.
Ken Goldberg, a professor at UC Berkeley, points out, “Robots are not going to suddenly become this science fiction dream overnight. It’s really important that people understand that, because we’re not there yet.”
At Stanford University, graduate student Moo Jin Kim is working on a program called OpenVLA, short for Open Vision-Language-Action. It illustrates the potential of AI in robotics. Kim explains that the robot is powered by a teachable AI neural network and operates a pair of mechanical arms with pincers.
“It’s one step in the direction of ChatGPT for robotics, but there’s still a lot of work to do,” he says. He demonstrated how the OpenVLA model is trained to perform different tasks by simply showing it how to do them.
To demonstrate the system, Kim used a tray filled with trail mix. When he typed “Scoop some green ones with the nuts into the bowl,” the robot’s arms carefully went into action, and the robot placed a star over the correct bin. Kim says the model sometimes fails, leaving the team holding its breath. This time, the robot completed the task.
Training Robots
Finn co-founded a company called Physical Intelligence, which takes this training approach a step further. She envisions a future where robots can take on simple jobs, like making a sandwich or stocking shelves. Finn suggests that the best approach might be to train a single model to perform many different tasks, a departure from robotics’ traditional focus on specialized machines.
“We actually think that trying to develop generalist systems will be more successful than trying to develop a system that does one thing very, very well,” she says.
Physical Intelligence has developed an AI neural network that folds laundry, scoops coffee beans, and assembles a box. However, the neural network that enables all those activities is too computationally demanding to run on the robot itself.
“In that case we actually had a workstation that was in the apartment that was computing the actions and then sending them over the network to the robot,” she says.
The Data Dilemma
Building a large dataset of real-world information for robots, akin to the internet data used to train chatbots, is difficult. Finn notes that there is no large open-source trove of robot data, so the team has to collect the data themselves.
Ken Goldberg is skeptical that this gap can be closed soon. Chatbots have improved dramatically over the past few years, aided by the massive troves of text and images used to train them to write sentences and generate pictures. Goldberg believes that building an internet’s worth of real-world data for robots will be a much slower process; at the current rate of collection, he projects it would take 100,000 years.
The complexities of simulating real-world interactions present a significant challenge. As Goldberg puts it, “Basically there is no simulator that can accurately model manipulation.”
Even if the data challenges are overcome, some researchers think deeper issues may hold back AI robots. Matthew Johnson-Roberson, a researcher at Carnegie Mellon University, argues that what chatbots are asked to do is comparatively simple, while robots will be tasked with performing far more complex actions.
Despite the hurdles, researchers like Goldberg still believe AI will transform robotics. Goldberg has co-founded a company called Ambi Robotics, which uses AI to organize packages so a robotic arm can pick them up. He says the system has dramatically reduced the number of dropped packages.
Finn says, “I think there’s still a long way for the technology to go.” Yet she anticipates that AI-powered robots will help offset labor shortages projected as populations age.
“I’m envisioning that this is really going to be something that’s augmenting people and helping people,” she added.