Lie Group Formulation of Recursive Dynamics Algorithms of Higher Order for Floating-Base Robots
Ahmed Ali, Chiara Gabellieri, Antonio Franchi
TLDR
This paper presents Lie-group formulations for higher-order dynamics of floating-base robots, showing quadratic scaling in computational cost.
Key contributions
- Develops Lie-group formulations for higher-order time derivatives of Newton-Euler, ABI, and hybrid dynamics.
- Identifies an admissible Coriolis matrix satisfying the passivity property and shows that the articulated-body inertia tensor is invariant across all time derivatives.
- Applies methods to a 12-DoF aerial manipulator, deriving dynamics up to fifth order.
- Demonstrates computational cost that scales quadratically with derivative order, outperforming an automatic-differentiation (AD) baseline whose cost grows exponentially.
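The passivity property mentioned above can be illustrated on a much simpler system than the paper's floating-base trees. The sketch below (not the paper's algorithm, and with illustrative link parameters) numerically checks the classical identity that, with the Christoffel-symbol choice of Coriolis matrix C, the matrix Ṁ − 2C is skew-symmetric for a planar 2-link arm, so q̇ᵀ(Ṁ − 2C)q̇ = 0:

```python
import math

# Hedged sketch, NOT the paper's method: check the passivity property
# qd^T (Mdot - 2C) qd = 0 for a planar 2-link arm, using the
# Christoffel-symbol Coriolis matrix. Parameters a, b, d are illustrative
# lumped inertia terms (b couples the two links through cos/sin of q2).
b = 0.35  # illustrative coupling term (m2 * l1 * lc2)

def M(q2):
    """Mass matrix of a planar 2-link arm; a, d are illustrative constants."""
    a, d = 2.0, 0.5
    c2 = math.cos(q2)
    return [[a + 2*b*c2, d + b*c2],
            [d + b*c2,   d]]

def C(q2, qd):
    """Coriolis matrix from Christoffel symbols of M."""
    s2 = math.sin(q2)
    return [[-b*s2*qd[1], -b*s2*(qd[0] + qd[1])],
            [ b*s2*qd[0],  0.0]]

def Mdot(q2, qd):
    """Analytical time derivative of M along (q2, qd)."""
    s2 = math.sin(q2)
    return [[-2*b*s2*qd[1], -b*s2*qd[1]],
            [  -b*s2*qd[1],  0.0]]

# Arbitrary state: the identity must hold pointwise for any (q, qd).
q2, qd = 0.7, [1.3, -0.4]
Md, Cm = Mdot(q2, qd), C(q2, qd)
N = [[Md[i][j] - 2*Cm[i][j] for j in range(2)] for i in range(2)]
power = sum(qd[i]*N[i][j]*qd[j] for i in range(2) for j in range(2))
print(abs(power) < 1e-12)  # True: Mdot - 2C is skew-symmetric
```

The paper's contribution is identifying an admissible C with this property for the full Lie-group recursions and their higher-order derivatives, not just for fixed-base arms as above.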
Why it matters
This work offers a computationally efficient approach for analyzing complex robot dynamics at higher orders. Its quadratic scaling in derivative order is a significant advantage over automatic-differentiation baselines, whose cost grows exponentially in the reported benchmarks, enabling more precise control and simulation for advanced robotic systems such as aerial manipulators.
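One way to see where quadratic scaling in the derivative order can come from (a generic illustration, not the paper's derivation): if each quantity is propagated as the array of its time derivatives up to order k, then every product of two such quantities is expanded with the general Leibniz rule, which costs O(k²) multiplications in total rather than re-differentiating an ever-growing expression graph:

```python
# Hedged sketch: propagating time derivatives as truncated "jets"
# [f, f', f'', ...]. One product via the general Leibniz rule costs
# O(k^2) total, illustrating how per-recursion cost can stay polynomial
# in the derivative order k.
from math import comb

def jet_product(a, b):
    """General Leibniz rule: c[n] = sum_j C(n,j) * a[j] * b[n-j],
    where a, b hold derivatives [f, f', f'', ...] at one time instant."""
    k = len(a)
    return [sum(comb(n, j) * a[j] * b[n - j] for j in range(n + 1))
            for n in range(k)]

# Example: derivatives of f(t) = t^2 and g(t) = t^3 at t = 1
f = [1.0, 2.0, 2.0, 0.0, 0.0, 0.0]
g = [1.0, 3.0, 6.0, 6.0, 0.0, 0.0]
h = jet_product(f, g)  # derivatives of t^5 at t = 1
print(h)  # [1.0, 5.0, 20.0, 60.0, 120.0, 120.0]
```

The paper's recursions operate on spatial twists and wrenches on SE(3) rather than scalar jets, but the benchmark result (cost quadratic in derivative order versus exponential for the AD baseline) matches this style of derivative bookkeeping.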
Original Abstract
In this paper, we describe procedures for computing higher-order time derivatives of the Lie-group Newton-Euler, Articulated-Body Inertia, and hybrid dynamics algorithms for floating-base trees, where the base configuration evolves on SE(3) and the attached mechanism is an open kinematic tree with configuration on the (n₁+n₂)-dimensional manifold T^{n₁} × ℝ^{n₂}, using spatial representation of twists. After presenting the algorithms, we collect the resulting recursions into closed-form equations of motion, identifying an admissible Coriolis matrix satisfying the passivity property, and showing that the articulated inertia tensor remains unchanged across all time derivatives. We then apply the developed methods to a 12-DoF aerial manipulator to derive analytical expressions for its geometric forward and inverse dynamics along with their first time derivatives, whereas the numerical simulations successfully evaluate these dynamics up to fifth order. Finally, to demonstrate their practical utility, we benchmark the proposed extensions and show that, in the considered tests, their computational cost scales quadratically with the derivative order, whereas the automatic-differentiation baseline exhibits exponential scaling.