Daily Technology
10/03/2026
The field of humanoid robotics is taking a dynamic leap forward. While stable walking has been a long-standing benchmark, new research is pushing robots to master complex, agile movements like running, vaulting, and climbing. A recent framework known as Perceptive Humanoid Parkour (PHP) demonstrates this evolution, enabling a robot to perceive its environment and execute parkour-like maneuvers on the fly.
A primary trend is the shift from single-motion capabilities to the composition of multiple dynamic skills. Instead of just walking, robots are being trained to chain different actions together to navigate complex terrains. This requires whole-body coordination, precise timing, and the ability to make quick decisions. The goal is to move beyond simple locomotion and capture the agility and adaptability of human motion.
A key example is the Unitree G1 humanoid robot, which, using the PHP framework, can perform a series of contact-rich maneuvers. It has demonstrated the ability to vault at speeds of approximately 3 m/s, climb onto a 1.25-meter wall (96% of its height), and continuously traverse a complex obstacle course. This showcases a system that can string together different skills in a fluid sequence.
To achieve this new level of agility, researchers are turning to human data. The PHP framework was developed by first collecting videos of people performing parkour. These movements were broken down into smaller, reusable segments and then recombined to create long, continuous motion sequences. This approach, which leverages motion matching, allows the robot to imitate not just isolated actions but also the smooth transitions between them.
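The segment-and-recombine idea can be illustrated with a minimal motion-matching sketch. The clips, poses, and function names below are hypothetical stand-ins, not the PHP implementation: each clip is a short sequence of pose vectors, and a long trajectory is built by repeatedly choosing the clip whose starting pose best matches the current ending pose, keeping transitions smooth.

```python
import numpy as np

# Hypothetical motion-matching sketch: each "clip" is a short sequence of
# pose vectors (here 2-D arrays standing in for joint-angle vectors).

def match_next_clip(current_end_pose, clips):
    """Return the index of the clip whose first pose is nearest to the
    current trajectory's last pose (Euclidean distance)."""
    dists = [np.linalg.norm(clip[0] - current_end_pose) for clip in clips]
    return int(np.argmin(dists))

def chain_clips(start_clip, clips, n_segments):
    """Greedily chain n_segments clips into one continuous trajectory."""
    trajectory = list(start_clip)
    for _ in range(n_segments):
        idx = match_next_clip(trajectory[-1], clips)
        trajectory.extend(clips[idx])
    return np.array(trajectory)

# Toy example: three 2-frame clips of 2-D poses.
clips = [
    np.array([[0.0, 0.0], [1.0, 1.0]]),
    np.array([[1.0, 1.0], [2.0, 0.0]]),
    np.array([[5.0, 5.0], [6.0, 6.0]]),
]
traj = chain_clips(clips[0], clips, n_segments=2)
```

A real system would match on richer features (velocities, contact states) and blend across the transition rather than concatenating frames, but the selection loop has this shape.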
These kinematic trajectories were then used to train a robot controller through reinforcement learning. Initially, separate controllers learned individual skills. These were then distilled into a single, unified controller that can coordinate different actions based on visual input. This method preserves the natural elegance of human movement while making it adaptable for a robotic system.
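The distillation step can be sketched in miniature. Everything below is an illustrative assumption, not the paper's method: the skill-specific teachers are linear policies, the skill indicator stands in for the visual context, and the unified student is fit by least squares rather than by reinforcement learning.

```python
import numpy as np

# Hedged sketch of policy distillation: several skill-specific "teacher"
# controllers label observations with actions, and one "student" is fit
# to imitate all of them. Teachers are linear maps (action = W @ obs);
# the real framework uses neural-network policies trained with RL.

rng = np.random.default_rng(0)

# Two hypothetical teachers, one per skill (e.g. "vault", "climb").
teachers = [rng.normal(size=(4, 8)) for _ in range(2)]

# Collect observations and label each with its skill's teacher action.
obs = rng.normal(size=(200, 8))
skill_id = rng.integers(0, 2, size=200)
actions = np.stack([teachers[s] @ o for s, o in zip(skill_id, obs)])

# Gate the observation features by skill indicator -- a stand-in for the
# visual context that tells the unified controller which maneuver applies.
onehot = np.eye(2)[skill_id]
X = np.hstack([obs * onehot[:, [0]], obs * onehot[:, [1]]])

# Fit the unified student controller by least squares.
W_student, *_ = np.linalg.lstsq(X, actions, rcond=None)

pred = X @ W_student
mse = np.mean((pred - actions) ** 2)  # near zero: student matches teachers
```

Because the gated features can represent both teachers exactly, the single student reproduces them; in the full system the same idea is carried out with supervised imitation of RL-trained policies.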
Perhaps the most critical trend is the integration of real-time perception for autonomous decision-making. The robot isn't just following a pre-programmed routine; it is actively interpreting its surroundings. Using only onboard depth sensing and a simple velocity command, the system decides whether to step over, climb onto, vault across, or roll off obstacles of varying shapes and sizes.
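The skill-selection logic described above can be caricatured as a decision rule over coarse obstacle geometry. The thresholds, function name, and maneuver labels below are illustrative assumptions only; the actual system maps onboard depth input to actions through a learned policy, not hand-written rules.

```python
# Hedged, rule-based sketch of mapping obstacle geometry (as might be
# estimated from onboard depth sensing) to a parkour maneuver. All
# thresholds are made up for illustration.

def choose_maneuver(height_m, depth_m, robot_height_m=1.3):
    """Pick a maneuver from obstacle height and depth along the path (meters)."""
    if height_m < -0.3:
        # A drop-off ahead: roll to absorb the landing.
        return "roll_off"
    if height_m < 0.2:
        # Low obstacle: just step over it.
        return "step_over"
    if height_m < 0.6 and depth_m < 0.5:
        # Waist-high and thin: vault across.
        return "vault"
    if height_m <= robot_height_m:
        # Tall but still reachable: climb onto it.
        return "climb_onto"
    return "walk_around"
```

A learned policy replaces these brittle thresholds with behavior conditioned directly on the depth image and velocity command, which is what lets it handle obstacles of varying shapes and sizes.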
This capability marks a significant advance toward creating robots that can operate in unstructured and unpredictable environments. The ability to adapt to real-time obstacle changes, as demonstrated in the Unitree G1 experiments, is crucial for real-world applications. The researchers have also announced plans to fully open-source the PHP framework, which will likely accelerate further development in this area.