LaViers, Park aim to make bipedal robot movement more expressive

12/20/2017 Julia Stackler

Assistant Professors Amy LaViers and Hae-Won Park were recently awarded an EArly-concept Grant for Exploratory Research (EAGER) from the National Science Foundation. Their collaborative project, “Center-of-Mass Control for Expressive and Effective Movement in Bipedal Robots,” will apply insights from the study of human athletes and dancers to enable similarly effective and expressive movement in humanoid robots.

Their objective is to create the capability for such movement without duplicating the full complexity and articulation of the human torso. Specifically, the project will implement a novel mechanism that shifts a heavy mass, driven by muscle-like actuators, to move the robot’s center of mass independently of its limb movements. The team will evaluate the results using a formal system of movement analysis developed for dance, athletics, and physical therapy.

In addition to improving the robot’s walking gait, LaViers and Park said these capabilities will provide new channels for communication between robots and humans. Humans make unconscious inferences about attitude and intent (e.g., trustworthiness, competence, and leadership) from observing movement. The project incorporates the interplay of art and technology into outreach activities, such as using dance movements to understand the fundamentals of robot locomotion.

Their approach to generating dynamic walking in a humanoid robot is to exploit the dynamics of a novel, core-located rolling ball-and-tray actuation mechanism, a design informed by embodied movement theory and Bartenieff Fundamentals, a formal system of movement analysis. The researchers aim to explore the role of center-of-mass control in human walking and to demonstrate the feasibility of the approach for bipedal robotic walking through modeling, simulation, and an initial prototype.

“This project is tackling robotic system development from many angles: control strategy, platform design, actuation mechanisms, and human perception. It started after an embodied movement workshop where my group noticed the difference in initiation in walking in humans versus robots. Put simply, humans use core muscles (including the muscles surrounding the pelvis) to initiate each step with a small falling motion; on the other hand, typical robots initiate each step from control signals sent to motors in distal joints: hips, knees, and ankles. This inspired us to design a walker that better aligned with this human strategy, hopefully in order to better replicate the myriad of ways humans walk in a synthetic system,” said LaViers.
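The step-initiation idea LaViers describes, starting a step with a small controlled fall of the center of mass rather than a joint command, can be sketched with a toy linear inverted pendulum model. This is purely illustrative and is not the project’s controller; the leg length, offsets, and trigger distance below are hypothetical values chosen for the example.

```python
# Illustrative sketch (not the project's code): a linear inverted pendulum
# model of step initiation. Shifting the center of mass (COM) slightly
# forward starts an exponential fall, which a step then "catches".
# All parameters here are hypothetical.

G = 9.81    # gravitational acceleration, m/s^2
LEG = 1.0   # effective leg (pendulum) length, m
DT = 0.001  # integration time step, s


def time_to_step(com_offset, step_trigger=0.05, timeout=10.0):
    """Integrate x'' = (g / l) * x forward from a small COM offset and
    return the time at which the COM drifts past the step-trigger
    distance (i.e., when a step would be taken), or None on timeout."""
    x, v, t = com_offset, 0.0, 0.0
    while x < step_trigger:
        a = (G / LEG) * x  # displacement itself drives the fall
        v += a * DT
        x += v * DT
        t += DT
        if t > timeout:
            return None  # offset too small to produce a fall in time
    return t


# A small core-driven COM shift initiates the fall; a larger shift
# reaches the step trigger sooner.
t_small = time_to_step(0.02)
t_large = time_to_step(0.04)
assert t_small is not None and t_large is not None
assert t_large < t_small
```

The point of the sketch is the contrast the quote draws: here the "control input" is only the initial center-of-mass offset, and the dynamics of falling do the rest, whereas a distal-joint strategy would command hip, knee, and ankle motors directly.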

The research team brings together experts in movement science, dynamic walking, and muscle-like actuation. In addition to improving robot performance, this work has the potential to increase the bandwidth of robot-human communication through expressive movement. The ability to engineer these cues into robot movement has many potential applications in human-facing scenarios.

The team also includes PI Joshua Schultz, an assistant professor of mechanical engineering at the University of Tulsa, and his graduate student, Caleb Fuller.

The researchers have published several papers on the topic, including one since the grant was awarded:

  • M. Heimerdinger and A. LaViers. “Influence of Environmental Context on Recognition Rates of Stylized Walking Sequences.” Ninth International Conference on Social Robotics (ICSR). Tsukuba, Japan. 2017 (pending).
  • U. Huzaifa and A. LaViers. “Control Design for Planar Model of a Core-located Actuation Walker.” 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob). 2016.
  • U. Huzaifa, C. Bernier, Z. Calhoun, C. Kohout, J. Heddy, B. Libowitz, A. Moenning, J. Ye, C. Maguire, and A. LaViers. “Embodied Movement Strategies for Development of a Core-located Actuation Walker.” 6th IEEE International Conference on Biomedical Robotics and Biomechatronics (BioRob). 2016. Best Poster Finalist.

This story was published December 20, 2017.