9/11/2025
WALL-E like you've never seen before
Our research in robotics, autonomy and controls
seeks to change the world.
Written by Taylor Parks
Videos by Rachel Berry
MechSE’s faculty are working toward a future in which human productivity is augmented by robotic support and autonomous control systems work tirelessly in the background to further human endeavors. They are contributing a plethora of research toward this revolution, with investigations ranging from penny-sized automated wearables to dynamic educational prosthetics.
Some are also involved with The Grainger College of Engineering’s Center for Autonomy, a shared research facility housed at the Coordinated Science Laboratory that seeks to enable high-impact research to design innovative autonomous systems. The center also offers professional development opportunities to graduate students through its M.Eng. degree that focuses on autonomy and robotics.
With a shared dedication to collaborate and a passion for innovation, MechSE is seizing opportunities in the areas of autonomy, robotics and controls to change the world in unprecedented ways.
Justin Yim
“Our lab works on a lot of robots that move around in unconventional ways,” said Assistant Professor Justin Yim. “We’re really excited about exploring the different ways that robots can move around in the world—walking, rolling, jumping, maybe even slithering.”
As a PhD student at the University of California, Berkeley, Yim developed Salto, a one-legged jumping robot inspired by the biomechanics of the squirrel-sized lesser bushbaby. With Salto, he demonstrated the ability to direct accurate leaps onto narrow targets and stick the landing gymnast-style. More recently, he has turned his focus to outer space, investigating the ability to jump on Saturn’s icy moon Enceladus with Phase 1 funding from the NASA Innovative Advanced Concepts (NIAC) program.
“The gravity on Enceladus is roughly 1/80th the strength of that on Earth, so a robot that can jump four feet high [on Earth] might be able to cover the length of a football field in a single bound, allowing it to cover crevices and canyons and other rough terrain,” Yim explained. “We’re also looking at combining jumping with rolling locomotion so that we can take advantage of wheels on flat terrain and jumping on rough surfaces. If we can do both, the robot will be more flexible and more adaptable to situations that we might not have predicted at launch time.”
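The arithmetic behind that claim can be sanity-checked in a few lines of Python (a back-of-the-envelope sketch using only the figures in the quote; the numbers are illustrative, not mission specifications):

```python
# Back-of-the-envelope check of the jump-scaling claim above.
# For a fixed takeoff speed v, a ballistic jump's peak height h = v^2 / (2g)
# and its horizontal range R = v^2 * sin(2*theta) / g both scale as 1/g,
# so weaker gravity multiplies both height and range.

G_RATIO = 80.0  # Enceladus gravity is roughly 1/80th of Earth's (per the quote)

jump_height_earth_ft = 4.0  # a robot that jumps ~4 ft high on Earth
jump_height_enceladus_ft = jump_height_earth_ft * G_RATIO

# Range scales the same way, so a comparable takeoff could span roughly
# the length of a football field (~300 ft between the goal lines).
print(f"Peak jump height on Enceladus: ~{jump_height_enceladus_ft:.0f} ft")
```

With the same takeoff energy, the 4-foot Earth jump becomes roughly a 320-foot arc on Enceladus, which is how a single bound can clear a football field.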
Yim’s lab is also developing robots that can emulate squirrels’ ability to jump into a climbing motion as well as both walking and wheeled robots that use minimal motors for locomotion—not to mention a rock-shaped robot that uses an internal pendulum to literally rock and roll.
Yim credits nature with the inspiration for his work, which has garnered accolades such as a Faculty Early Career Development Program (CAREER) Award from the National Science Foundation.
“Animals serve as a really great example for how small or big things can get around on our planet,” he said. “When we try to build engineered systems, we don’t always have to start from scratch, trying to figure out every possible way we might solve the problems that we face.”
Over the next few years, Yim is excited to contribute to the evolution of robots’ animalistic capabilities. “I’d love to see robots that explore new ways to move that can be better than how robots move now,” he said. “And maybe even better in some ways than how animals and humans move. That would open up new abilities for robots to do things that humans and other animals can’t.”
Nazanin Farjam
Assistant Professor Nazanin Farjam’s research focuses on developing innovative modeling frameworks and intelligent control strategies to enhance the flexibility, robustness, and efficiency of complex dynamical systems.
“My passion lies in pushing the boundaries of manufacturing and venturing beyond traditional domains into the realm of cutting-edge, complex systems,” Farjam said of her work, which includes enabling the creation of flexible, lightweight, and cost-effective components that seamlessly integrate into advanced technologies—even in harsh environments like space or remote areas (for example, imagine a swarm of robots manufacturing semiconductor chips on Mars). This approach supports scalable and sustainable manufacturing processes, minimizing material waste and production costs.
“What truly excites me is the potential to make manufacturing smarter, more adaptive, and future-ready,” she said. “By leveraging advanced modeling and AI-driven control strategies, we can create systems that dynamically adjust to changes and operate at peak efficiency.”
Farjam is particularly excited about her work’s potential to impact industries such as advanced electronics, biomedical devices, space manufacturing, and robotics.
“Printed electronics offer lightweight, customizable circuits for applications like wearable and biomedical devices,” she said. “Combined with development of intelligent control systems, these technologies will enhance fabrication process efficiency and adaptability. Ultimately this will lead to agile, automated manufacturing systems capable of on-demand, customized production, aligning with Industry 4.0 and smart manufacturing trends.”
Mattia Gazzola
Professor Prashant Mehta, postdoctoral researcher Heng-Sheng Chang, Associate Professor Mattia Gazzola, and other collaborators have also developed an unprecedented computational octopus arm model (see page 28 for more on this work).
“The foundational work began with understanding what happens in an octopus’s biological system,” Mehta recalled of the years-long ongoing study. “Then it quickly transitioned to applying [the octopus’s] inspiration to real engineering systems.”
“Our theoretical understanding is still an intuitive approach,” Gazzola said of ongoing work. “We want to develop an automated framework so that our octopus model can learn to perform tasks on its own.”
Following his efforts to model the arm, Chang also worked to develop a test bed for physical prototypes. Findings from the octopus arm have implications for applying soft robotics to industries like agriculture that rely heavily on physically laborious processes.
“We started with some mathematical backgrounds to understand the math behind it,” Chang said of the process of modeling complex systems for soft robotics applications. “Then we got to learn from a physical arm, and then we developed simulations. It has been very rewarding for me, being able to understand real things that are happening.”
Elizabeth Hsiao-Wecksler
Professor Elizabeth Hsiao-Wecksler currently leads multiple robotics efforts geared toward improving mobility and quality of life for people with physical challenges. One effort is PURE, a Personalized Unique Rolling Experience: a ball-based mobility device that can be used in place of a traditional wheelchair.
The previous generation of their device could carry a person up to 130 pounds at four miles per hour. “We’re hoping to be able to support people up to 200 pounds and at six miles per hour,” Hsiao-Wecksler said. “Our Gen3 prototype is also being made modular so that we can separate the drive train from the upper module.”
Hsiao-Wecksler hopes that Gen3’s modularity will increase the breadth of its working abilities—such as swapping out a chair for a robotic arm or adapting for a non-human payload.
“The cool thing about ball bots is that they ride on a single spherical wheel, which makes them self-balancing,” she said. “You can move forward and backward, slide left and right, and spin. And we’ve added a vision system to achieve fully autonomous driving.”
Her group is also investigating robotic limbs that can provide haptic feedback to medical students who are training to perform neurological exams. The project, which represents more than ten years of iterative design, is funded by the Jump ARCHES program, a collaboration among OSF HealthCare, the U. of I., and University of Illinois College of Medicine Peoria.
Clinicians will often attempt to diagnose neurological injuries or disorders by manipulating the patient’s limbs, which can reveal two muscle stiffness characteristics indicative of these disorders—spasticity, which is velocity-dependent during passive movement, and rigidity, which is constant.
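The distinction between the two resistance profiles can be sketched as simple torque models (a hypothetical illustration with made-up gains, not the team's actual control law):

```python
# Minimal sketch of the two muscle-resistance profiles a haptic training
# limb would need to reproduce (illustrative values, not the team's design):
#   - rigidity: roughly constant resistance, independent of movement speed
#   - spasticity: resistance that grows with the speed of passive movement,
#     often felt as a velocity-threshold "catch"

def rigidity_torque(base_torque: float) -> float:
    """Constant resistive torque, as felt in Parkinsonian rigidity."""
    return base_torque

def spasticity_torque(velocity: float, gain: float, threshold: float) -> float:
    """Velocity-dependent resistive torque: negligible below a catch
    threshold, then increasing with joint speed."""
    if abs(velocity) < threshold:
        return 0.0
    return gain * (abs(velocity) - threshold)

# A slow passive stretch meets little spastic resistance...
print(spasticity_torque(velocity=0.2, gain=5.0, threshold=0.5))
# ...while a fast stretch triggers a strong "catch".
print(spasticity_torque(velocity=2.0, gain=5.0, threshold=0.5))
```

A robotic training limb that renders these two torque profiles lets students feel the difference between the conditions before they ever examine a patient.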
“[With our prosthetics], we want to mimic spasticity, which is often seen in people with a stroke, cerebral palsy, or spinal cord injury, and rigidity, which is often seen in people with Parkinson’s,” Hsiao-Wecksler explained. The team has developed a working prosthetic arm prototype that is currently being used for student training, and is developing a prosthetic leg for the same purpose.
Most recently, she embarked on the development of a soft robotic cushion, also funded by Jump ARCHES, that helps prevent pressure ulcers in wheelchair users.
“We’ve been working to understand air bladders—how to design and fabricate them,” she said. “We want to be able to actuate the amount of support provided to the individuals sitting on them.”
“The hope is that this will become an automated system where the chair can detect pressure points and modulate pressure to relieve them,” she said. “It should modulate the behavior of the cushions so that it can change where loading is experienced over time. And it should be fully contained so that the unit can be transferred from chair to chair depending on where the individual is.”
In all of her projects, Hsiao-Wecksler is excited by the prospect of helping more and more people. “I have a deep interest in helping people with disabilities and being able to improve their quality of life,” she said.
Naira Hovakimyan
Professor Naira Hovakimyan’s high-flying efforts regularly make the news.
She serves as Director of the Center for Autonomous Vehicles in Air Transportation Engineering (AVIATE) at Illinois—a NASA-funded university leadership initiative that integrates efforts from researchers at the Georgia Institute of Technology, Massachusetts Institute of Technology, University of Nevada, Reno, and North Carolina Agricultural and Technical State University as well as industry partners Lockheed Martin and Sierra Nevada Corporation. The center is focused on enabling safe and efficient advanced air mobility through the development of a robust and resilient autonomy framework.
Hovakimyan noted that the team has already developed scaled, operational prototypes of aerial taxis. “For example, these are the so-called flying cars that can facilitate easy transportation between small towns.”
The multidisciplinary team develops, tests and validates algorithms for the safe integration of learning-enabled components, or AI modules, in unmanned aerial vehicles (UAVs). Earlier this summer, they hosted their first Autonomy Fest, showcasing their progress with UAV component development and giving talks on relevant topics such as flight safety, fault diagnosis, and outreach. Safety, in particular, is high among Hovakimyan’s priorities for the future of UAVs.
“When you remove the human operator, the autonomous system has to be able to make decisions and avoid all kinds of crashes,” she said, noting the ongoing need for developments to these systems. “The challenging cases are the ones in which obstacles, collision paths and failures are introduced. We are working on these cases to develop additional safety features through autonomous solutions.”
As she looks to the future, Hovakimyan is most excited by what the team collectively can accomplish.
“The best is yet to come,” she said. “With the help of NASA funding, a team of exceptional scholars, and through exemplary leadership, global outreach, and industrial partnerships, we are very excited to contribute to the science of autonomy.”
Mickey Clemon
With a background in manufacturing, Teaching Assistant Professor Mickey Clemon has developed meaningful collaborations with roboticists in two broad areas—field work, which is pertinent for agricultural applications, and collaborative 3D printing, in which robotic arms work together to overcome traditional printing constraints.
Typically, 3D printers print one layer at a time, with each subsequent layer relying on the previous for structure. The volume of the printed part is constrained to fit within the print bed and gantries.
“We’ve made the 3D-printing process faster by deconstructing these constraints and figuring out what can be printed in what order,” Clemon said. “We’ve expanded into multiple print arms to print multiple layers simultaneously. Our framework is a sequencing plan that is platform-independent.”
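The core idea of such a sequencing plan can be sketched in a few lines (a toy illustration under assumed names, not the published framework): once a part is decomposed into chunks with "must print first" dependencies, any chunk whose prerequisites are finished can go to any free arm, so independent chunks print simultaneously.

```python
# Toy sketch of platform-independent print sequencing: the part is split
# into chunks with "must print first" dependencies, and each step assigns
# ready chunks (all prerequisites done) to the available print arms.

def schedule(deps, num_arms):
    """Greedy step-by-step schedule. deps maps chunk -> set of chunks that
    must be printed before it. Returns a list of simultaneous batches."""
    remaining = dict(deps)
    done, steps = set(), []
    while remaining:
        # Chunks whose prerequisites have all been printed are ready now.
        ready = [c for c, pre in remaining.items() if pre <= done]
        if not ready:
            raise ValueError("cyclic dependency between chunks")
        batch = ready[:num_arms]  # at most one chunk per arm per step
        steps.append(batch)
        done.update(batch)
        for chunk in batch:
            del remaining[chunk]
    return steps

# Four independent base chunks plus one chunk that rests on A and B.
deps = {"A": set(), "B": set(), "C": set(), "D": set(), "top": {"A", "B"}}
print(schedule(deps, num_arms=2))
```

With two arms, the four base chunks print in two parallel steps and the dependent chunk follows; the plan says nothing about any particular printer, which is what makes it platform-independent.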
Clemon and collaborators are now exploring other constraints, such as material properties and the energy consumption associated with printing each part. While the overwhelming majority of 3D-printing efforts have historically focused on prototyping, the field is experiencing a shift toward more finalized 3D-printed products.
“The long-term performance of these products is starting to become very relevant,” he said. “Consideration of the material properties is going to be very valuable as the additive manufacturing community moves into printing final products.”
At the same time, Clemon and others have been collaborating with researchers from the University of Technology Sydney to improve efficiency, and safety, in a very different application—wool harvesting.
“Shearers can injure themselves or age out of the job quite quickly—the job lifespan of a professional sheep shearer is often very short,” Clemon said of the ongoing challenge posed by the wool industry’s traditional harvesting methods.
One current state-of-the-art method still under investigation, an injection that weakens wool at the follicles, shows promise. Weakening the wool follicles prior to harvest allows harvesters to pluck the wool from the sheep without the need for shearing devices, which reduces both the risk of injury and the required skill level. Clemon’s interest in supporting these efforts led him to sponsor a Senior Capstone Design (ME 470) team to develop a prototype automated device for collecting the loosened wool from the sheep.
“Now that robotics and microcontrollers are much cheaper than they were 20 years ago, I think there’s a real opportunity to bring automation and clever mechanical design into more farming practices,” Clemon said. “There are lots of jobs that people do because they’re impractical for robots—the job requires the adaptability and insight of people, and it’s done in a dirty, sometimes dangerous environment. There’s an opportunity for students to explore how to bring automation to these sorts of jobs and make them faster and easier for people to accomplish.”
Siyi Xu
Imagine if muscle performance could be tracked during real-time training—this is what Assistant Professor Siyi Xu hopes to achieve.
Xu works to develop wearable actuators that can give tactile feedback in real time as the wearer performs certain tasks or exercises. This is especially pertinent for athletes and patients in physical therapy, where haptic feedback could impact their training or recovery on the spot.
“We’re hoping to use the actuators to sense the biomechanical performance of muscles,” Xu explained. “For example, how well the muscles are being trained or activated during certain motions, and whether they are meeting targeted requirements.”
Current technologies, such as ultrasound or specialized handheld devices, are capable of tracking muscle mechanical properties. However, fully untethering these technologies so that they become wearable, mobile devices is challenging. Xu seeks to address this gap by developing lightweight, waterproof wearables that safely adhere to the skin. Her current prototypes are on the scale of a penny.
“I’m hoping to improve the performance of these actuators and use them as wearables to untether them from electronics. That’s a very exciting contribution we could provide to the field,” Xu said.
The real-time feedback is also pertinent to robot design. With detailed measurements of human performance during a task, collaborative robots could be tuned to those metrics, working alongside the person to complete the task more efficiently.
“I’m excited to investigate the possibility of improving collaboration between humans and robots.”
Sameh Tawfick
Professor Sameh Tawfick has been working to develop jumping capability in 3D-printed insect-scale robots.
“To my knowledge, this is the first time anyone has demonstrated long jumping in insect-scale robots,” Tawfick said of his lab’s accomplishment. “This is significant because it gives the robot planned mobility, where it can now jump from A to B, traversing terrain [with obstacles larger] than its own size.”
Jump performance in insect-scale robots was previously hindered by small-scale manufacturing processes and limited availability of materials and miniature actuators. Tawfick’s team used coiled artificial muscle actuators and projection 3D printing to produce a monolithic elastomeric robot design inspired by a locust’s jumping mechanism.
“We used a four-bar linkage design for jumping, inspired by the locust, which is an outstanding jumper,” said Tawfick, explaining that while a locust’s legs are not mechanically linked, allowing it to both walk and jump, the robot in their study relies on a single muscle driving a linkage system.
Their insect-scale prototype has a lightweight elastomer body and an artificial muscle made from coiled, heat-treated nylon fishing line. Tawfick’s lab previously developed machines to produce these miniature coils. The researchers designed and tested 108 robot iterations produced through additive manufacturing, with the smallest having a mass of 0.216 grams and the ability to jump 60 times its body size in horizontal distance.
“Our dream for the future is to have a small mission in which the robot executes multiple jumps until they reach a target,” he said.
Prashant Mehta
Professor Prashant Mehta looks eagerly to the future of AI as it pertains to applications like robotics and control. “It’s a great time to be alive,” he quipped, pointing to the many opportunities to explore AI technology while it’s still young.
In previous work, one of Mehta’s former PhD students developed the mathematics for a particular type of control theory that proved to be very relevant to understanding how AI technologies, such as the generative chatbot ChatGPT, function.
“We hope to use that background to understand how ChatGPT and other models work,” said Heng-Sheng Chang, a postdoctoral researcher and former student in Mehta’s lab.
“You can’t help but be amazed by how well it works—which is a surprise,” Mehta said of technologies like ChatGPT. “As a community, we did not really expect that something like this would be possible based on the mathematics that exist. Yet, it is there—it demonstrably works. Our goal is really to try to understand the mathematical foundations for this technology.”
Chang’s perspective on understanding AI for future implementation in soft robotics echoes Mehta’s sentiment. “At first, we try to understand and model AI, and then we work to control it,” he said of the team’s process going forward. “How do we orient AI behavior toward our desired goals?”