How motion capture drives innovation in robotics
We recently had the opportunity to connect with Dr. Hung (Jim) La, an Associate Professor at the University of Nevada, Reno, the Principal Investigator (PI) and Director of the Advanced Robotics and Automation (ARA) Laboratory, and one of our long-term system users. We were also joined by Pratik Walunj, a student working towards a master’s degree in computer science and a research assistant who works closely with Dr. La in the ARA Lab.
In this blog, we’ll dive into what they shared with us about the ways our system is being used in cutting-edge robotics and inspection technologies across a range of applications.
Where our engagement with the University of Nevada, Reno started
Our collaboration with Dr. La and the University of Nevada, Reno began over 10 years ago, when we supplied a 16-camera system. The lab then developed a robot operating system (ROS) interface for Cortex. The result was a bridge between ROS and Cortex, enabling seamless data streaming to the robot. Now, other students who are experienced with ROS are taking things further by adapting the package for use in ROS 2 environments.
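To give a flavor of the kind of conversion such a bridge performs (this is a hypothetical sketch for illustration, not the ARA Lab's actual package — the sample layout, units, and function name are all assumptions), a Cortex-style 6DOF sample might be translated into a ROS-style pose like this:

```python
import math

def cortex_6dof_to_ros_pose(sample):
    """Convert a hypothetical Cortex 6DOF sample (millimetres + Euler angles
    in degrees) into a ROS-style pose (metres + unit quaternion).

    The `sample` layout is an assumption for illustration:
    (x_mm, y_mm, z_mm, roll_deg, pitch_deg, yaw_deg)
    """
    x_mm, y_mm, z_mm, roll, pitch, yaw = sample
    # Positions: optical mocap systems commonly report millimetres; ROS uses metres.
    position = (x_mm / 1000.0, y_mm / 1000.0, z_mm / 1000.0)
    # Orientation: intrinsic ZYX Euler angles -> quaternion (x, y, z, w).
    r, p, y_ = (math.radians(a) / 2.0 for a in (roll, pitch, yaw))
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y_), math.sin(y_)
    quaternion = (
        sr * cp * cy - cr * sp * sy,  # x
        cr * sp * cy + sr * cp * sy,  # y
        cr * cp * sy - sr * sp * cy,  # z
        cr * cp * cy + sr * sp * sy,  # w
    )
    return {"position": position, "orientation": quaternion}
```

In a real bridge, a pose like this would be stamped with a timestamp and frame ID and published on a ROS topic for the robot's navigation stack to consume.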
Follow the ARA Lab on GitHub to stay up to date on their public repositories.
Broadening applications of our motion capture system
Dr. La shared that, over the years, our motion capture system has become an integral part of the ARA Lab’s research and teaching activities. It has been used extensively in projects funded by the National Science Foundation and NASA, supporting a wide range of robotics applications.
The system has also proven useful in undergraduate and postgraduate teaching: more than 40 students who graduated under Dr. La used it in their work. One even went on to become a robotics lead at a major defense and aerospace contractor, continuing to apply the skills developed using our system.
Today, the system continues to play a pivotal role in developing and testing algorithms for multi-robot systems, particularly aerial drones and magnetic climbing robots for steel structure inspection. The system supports the ARA Lab’s work in precise motion tracking, multi-agent coordination, and data validation for robotic control and navigation.
Here’s a more detailed look at the current applications and expanding areas of research, all led by Dr. La as the ARA Lab’s PI:
1. Designing an all-terrain vehicle
Pratik designed, from scratch, an innovative unmanned all-terrain vehicle capable of inspecting bridges and buildings in detail. This versatile robot can traverse water, ground, walls, and ceilings. An ultrasonic thickness sensor mounted on the robot measures the thickness of the steel at specific points, enabling condition assessments of wear and tear. Pratik is now enhancing the robot's functionality with path planning, obstacle avoidance, motion planning, and autonomous inspection features.
2. Testing for multi-UAV wildfire monitoring
Under a NASA Space Grant, the ARA Lab is exploring multi-unmanned-aerial-vehicle (UAV) systems for wildfire monitoring. This research is being conducted by a PhD student, Gaurav Srikar. The aim is to develop deep-learning models and a collaborative control algorithm that enable coordinated, reliable multi-drone operations. Our motion capture system allows the team to test and validate the drones' coordination framework precisely, in 3D.
The ability to track multiple UAVs is enhanced by Cortex's 2D-to-6D tracking, which allows reliable object tracking even in a confined space with many drones present.
3. Developing a multi-robot system
The ARA Lab is beginning work on an autonomous multi-robot system for large-scale projects, such as inspecting sizable steel bridges, where multiple robots must work together and support one another in localization and navigation. To achieve this, the team will use our motion capture system as ground truth to evaluate its LiDAR-odometry algorithms for determining the robots' positions in 3D space.
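A common way to score a localization algorithm against mocap ground truth is the root-mean-square position error between the two trajectories. As a minimal sketch (assuming the two trajectories are already time-synchronised and expressed in the same frame — the function name and data layout are illustrative, not the lab's actual tooling):

```python
import math

def trajectory_rmse(ground_truth, estimate):
    """Root-mean-square position error between a mocap ground-truth
    trajectory and a LiDAR-odometry estimate.

    Both arguments are equal-length lists of (x, y, z) points in metres,
    assumed to be time-synchronised and expressed in the same frame.
    """
    if len(ground_truth) != len(estimate) or not ground_truth:
        raise ValueError("trajectories must be non-empty and equal length")
    squared = [
        (gx - ex) ** 2 + (gy - ey) ** 2 + (gz - ez) ** 2
        for (gx, gy, gz), (ex, ey, ez) in zip(ground_truth, estimate)
    ]
    return math.sqrt(sum(squared) / len(squared))
```

In practice the estimated trajectory is usually aligned to the ground truth (e.g. by a rigid-body fit) before the error is computed, since odometry starts in its own arbitrary frame.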
4. Validating bridge vibration monitoring
Another PhD student, An, working with the Department of Transportation, will also use our motion capture system. The lab is building an AI model of bridge vibration; to validate it, the Motion Analysis system will provide ground-truth measurements of the vibration.
In conclusion, Dr. La shared: “I am very thankful for the support we first received from Motion Analysis over a decade ago, and for their continued innovation. All these years later, we still rely on the system to evaluate the performance of our robotic devices, and plan to continue doing so on upcoming projects.
“For localization, we frequently use the system’s rigid body tracking features. Having worked with other motion capture systems in the past, I appreciate how Motion Analysis’ system stands out for its fast and convenient calibration, as well as its high frame rate, which is invaluable for measuring accurate 6DOF data at high speeds.
“The system’s precision and reliability have also attracted interest from other departments, including two international faculty representatives in mechanical engineering.”
If you enjoyed reading this blog, check out how EPFL is paving the way for safe, accurate gas leak detection, and how the team has made use of our mocap system in its localization research. Read the case study here.
To learn more about how Motion Analysis can help you, your faculty or your organization drive advancement, book a demo today.