Motion Analysis Corporation Unveils Cortex 9.5 Software Upgrade

November 8, 2023, California – Motion Analysis Corporation is excited to announce the highly anticipated release of Cortex 9.5, the latest edition of its cutting-edge motion capture software. This update is now available for download and is accessible to all customers with active warranties or current software maintenance contracts.

Cortex 9.5 introduces a range of exceptional features and improvements that elevate the motion capture experience to new heights, providing users with greater flexibility, efficiency, and accuracy. Here are the key highlights of this remarkable update:

Quick Files Capture Status: Cortex 9.5 introduces Quick Files Capture Status indicators, simplifying the assessment of dataset status. Users can easily classify captures as “Unedited,” “In Progress,” or “Complete.” Customization options are also available, allowing users to create their own status names and icons, providing a user-friendly experience.

Kestrel Plus Cameras: With Cortex 9.5, Motion Analysis Corporation introduces the Kestrel Plus camera line, featuring the Kestrel Plus 3, Kestrel Plus 22, and Kestrel Plus 42. These new cameras seamlessly integrate with Cortex 9, expanding your capture capabilities and delivering high-quality results.

Trim Capture Modifications: Cortex 9.5 enhances the Trim Capture feature, enabling users to modify capture names and generate captures on a per-markerset basis, with added timecode support. This streamlined process facilitates the extraction of relevant data from capture files and offers improved post-processing options.

Workflow Improvements: Cortex 9.5 enhances the Workflow feature, making task execution even more efficient. Users can now utilize a search tool and a workflow repository, enabling easy access to and management of workflows and optimizing productivity.

Live Detailed Hand Identification: Advanced hand tracking techniques have been integrated into Cortex 9.5, reducing marker swapping during live collection and post-processing of intricate finger movements. Users can contact the support team for a sample markerset to enable this feature.

Automatic Wand Identification for Reference Video Overlay Calibration: In a significant time-saving move, Cortex 9.5 automates the marker selection process for reference video overlay calibration, eliminating manual marker selection and potential user errors. This feature can be applied in both Live Mode and Post Process.

Bertec Digital Integration: Cortex 9.5 now offers support for Bertec AM6800 digital amplifiers, simplifying setup and reducing the number of required components, thus enhancing the overall user experience.

National Instruments New Device Compatibility: Cortex 9.5 continues its support for National Instruments A/D board data collection and expands compatibility to their next generation of DAQs, maintaining flexibility and ensuring compatibility with previously supported devices.

Additional Updates and Features: Several additional updates and features, such as the renaming of the Post Process X panel to Tracks, improved contrast in Dark Mode, and an increased marker slot limit, are included in this feature-rich update.

Cortex 9.5 marks a significant milestone in the field of motion capture, empowering users with advanced tools, enhanced workflows, and improved performance.

To learn more about Cortex 9.5 and take advantage of these exciting new features, download the full release notes here, or contact our sales and support teams for further information and assistance.

Motion Analysis Corporation continues to lead the way in motion capture technology, and Cortex 9.5 is a testament to our commitment to delivering innovative solutions that meet the evolving needs of our customers.

About Motion Analysis Corporation

Motion Analysis Corporation is a leading provider of motion capture technology solutions for various industries, including entertainment, sports, healthcare, and research. With a focus on innovation and customer satisfaction, Motion Analysis Corporation strives to make motion capture more accessible and versatile.

Mocap in action: In conversation with Adam Cyr, Biomechanist at Mary Bridge Children’s Hospital

A long-standing client, Mary Bridge Children’s Research and Movement Laboratory (RML) is a multidisciplinary facility that houses a team of engineers and clinicians who conduct research and use the latest technologies to identify, diagnose, and treat individuals with movement challenges.

We caught up with Adam Cyr, a biomechanist at the facility, who has a keen interest in applying engineering principles and techniques to understand how the human body performs. His goal is to improve injury prevention and treatment.

Here, we share what he had to say about his work and how he is using mocap as part of the biomechanics research he does on a daily basis.

Could you give us a quick overview of your background as it relates to the world of biomechanics and biomechanics research?

After completing my studies, I briefly worked at a company doing forensic biomechanics before I found myself at the Research and Movement Lab at Mary Bridge Children’s Hospital. At the RML, we see patients with a wide variety of concerns, including neurological, muscular, and orthopedic disorders. We also see people who are looking to enhance their performance or who suffer from sports-related injuries.

How do you use motion capture technology in the work you do every day?

The more data we can collect, the better. We want to look at kids doing functional tasks. If we see a patient today and collect data on how they move in their preferred way, and then they have some sort of intervention, we have baseline data we can use to assess whether the intervention has helped them move better than before. Our goal is to inform the clinical providers, whether they’re surgeons or physical therapists, and provide them with objective data so they can make better decisions.

On a typical day, we’ll spend a few hours with a patient either in the morning or the afternoon. We’ll prep the room to make sure that the motion capture system is ready and that the markers are ready to go. We’ll do a subjective history and a physical exam. And then we’ll put the markers on and get the patient to do basic movements. If there’s any particular activity that is causing a problem, we will have them do that activity specifically. After they leave, I compile the data, process it and turn it into graphs and meaningful insights for our therapists to review. It’s great to work this closely with clinicians to see the data and graphs transform into information that means something.  

Can you walk us through your experience using Motion Analysis and share some of the features you find most useful?

The motion capture system I inherited in my current position was an older one. We were very fortunate to be able to upgrade to some newer Motion Analysis cameras recently. The new tech is very impressive. From a size perspective, everything is getting smaller, the optics are better, the speed is better and these cameras can track much smaller markers. 

The cameras are also more advanced, which makes it easier to do things right the first time and not waste time cleaning up the data. This speeds up patient processing times. We want to get a report back to our patients within a couple weeks and if I’m spending a day cleaning up data, that isn’t possible. 

When I do have to clean up data, there are some great features on the backend that make it easier to do so. For example, if a marker dropped off and you didn’t notice, you can use virtual markers to fill in the data gap. I’ve also started to go down the road of playing with what they call the Sky Interface. This allows me to build my own scripts using a batch process. I’ve been working closely with the Motion Analysis team on this and they’ve been hugely helpful. When we collect EMG data, there’s a time delay, so we need to shift the data over for it to line up with the motion capture data. With the Sky Interface, I can code something so that I just have to hit one button and it goes through all of my captures and automatically shifts the data over.
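As an illustration of the kind of batch processing Adam describes, here is a minimal sketch that shifts every EMG export in a session folder by a fixed acquisition delay so the signals line up with the capture timeline. It is a generic Python example: the folder layout, file naming, delay value, and sample rate are assumptions for the sketch, and it does not use the actual Sky Interface API.

```python
# Illustrative only: a generic batch time-shift of exported EMG data, not the
# actual Sky Interface API. The folder layout, file naming, delay, and sample
# rate below are assumptions for the sketch.
from pathlib import Path

import pandas as pd

EMG_DELAY_S = 0.048          # assumed fixed acquisition delay, in seconds
EMG_SAMPLE_RATE_HZ = 2000    # assumed EMG sampling rate

def shift_emg(emg: pd.DataFrame, delay_s: float, rate_hz: int) -> pd.DataFrame:
    """Shift every EMG channel earlier by a fixed number of samples so it
    lines up with the motion capture timeline."""
    n = int(round(delay_s * rate_hz))
    if n == 0:
        return emg
    shifted = emg.shift(-n)          # move samples earlier in time
    return shifted.iloc[:-n]         # drop the trailing rows left empty

def batch_shift(session_dir: str) -> None:
    """Apply the same shift to every EMG export in a session folder."""
    for csv_path in sorted(Path(session_dir).glob("*_emg.csv")):
        aligned = shift_emg(pd.read_csv(csv_path), EMG_DELAY_S, EMG_SAMPLE_RATE_HZ)
        aligned.to_csv(csv_path.with_name(csv_path.stem + "_aligned.csv"), index=False)

if __name__ == "__main__":
    batch_shift("session_2023_10_14")   # hypothetical session folder
```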

We’re also starting to get into real-time feedback using Cortex software. In a clinical setting, we’d use this to better understand upper body motion. For example, we’d put markers on the elbow, the arm and the torso and ask children to reach around so we can see how far they can reach. With real-time feedback, it’s possible to have them reach for virtual markers on a screen, a bit like they are playing a video game. It would all be done in real time using the Motion Analysis workflows I’ve learned. In the work I do, it’s been enormously helpful for me to be able to pick up a phone and connect with the Motion Analysis customer support team or their engineering and technical teams because they are so willing to help out when I have a problem that I need to figure out right away.

If you, like Adam, want to leverage motion capture innovation to better understand movement-related conditions or improve how you monitor the tendencies and patterns of biomechanical movements, we can help. Learn more about how our team can support your mocap needs by scheduling a demo today.

Why Bournemouth University uses Motion Analysis to nurture animation’s next generation

The Customer

Bournemouth University is recognized as one of the foremost animation institutions in the United Kingdom. Under the leadership of Zhidong Xiao, Deputy Head of Department at the National Centre for Computer Animation, the department’s motion capture work focuses on three main areas: teaching the full pipeline of motion capture technology to inspire student animation projects; exploring new mocap uses for research councils; and supporting creative filmmakers and artists with studio space and advanced equipment.

The Problem

Having experimented with motion capture systems since 2003, the university originally worked in a fixed capture space: an ample-sized classroom primarily used for teaching character animation with Motion Analysis’ Raptor 2 optical motion capture system. By 2010, however, the team was developing a new, larger studio facility. With room for a greater number of cameras, there was an increasing need for advanced data processing and motion retargeting to suit larger-scale projects.

The Solution

There is a growing expectation for advanced detail in 3D character animation, which has spurred mocap technology to scale similarly in precision and sophistication. Whether it’s creating robots, mythical creatures or cartoon figures in games, films or television shows, facial and bodily movements are becoming increasingly lifelike, creating visual experiences like never before.

To track the movements of actors fitted in bodysuits complete with markers, passive optical systems are one option for accurately capturing motion data. Able to track the simultaneous motions of objects and humans alongside video footage, the system synchronizes marker movements with Motion Analysis’ Cortex software, which helps to map the skeleton that will later become a 3D animated character brought to vivid life. The tracked real-time data makes completing re-dos or small edits in post-production far simpler, keeping a record of the actor’s motions before the computer graphics have been superimposed.

For animation educators like Zhidong’s team, that cross-collaboration between software and equipment provides a greater advantage in teaching the full range of animation methods to students. The studio now utilizes 16 of our fixed 4K Kestrel cameras for accurate data capture. Student classes remain a focus, but the newer space was also designed to craft virtual reality sets for television shows, to film music videos, and to develop special effects for the silver screen.

The interoperable mocap system opens the doors to brand new experiences in the studio, benefiting community projects, student work and artistic expression. Bournemouth University’s media department is set to carry on its work in these areas, using its animation studio space and Motion Analysis system to develop machine learning and training techniques for industries choosing to adopt mocap technology’s many advantages.
Want to discover more about Bournemouth University’s collaboration with Motion Analysis? Catch up on the full story in our case study.

From fruit flies to elephants, and everything in between. We’re celebrating 40 years of mocap!

This October, we’re celebrating our 40th birthday. Over the course of our four-decade history, we’ve made a concerted effort to keep innovation at the heart of everything we do, which may explain why we’ve managed to achieve so much during this time. Using motion capture in settings that you wouldn’t expect, our software has traveled from a ballet studio to an ice rink and has even scaled the hills of Mount Doom.

This means that we’ve had the incredible opportunity to collaborate with clients who are using our mocap software in cutting-edge research and creative projects across a wide range of industries.

From intern to VP, Phil Hagerman shares interesting insights over a 20+ year tenure

Just ask Phil Hagerman, who started out as an intern at Motion Analysis in the late ‘90s and has spent most of his career learning, growing and excelling as part of our team. 

Today, Phil is our VP of Operations. He has worked across all aspects of the business – in roles ranging from electronics technician and support engineer to sales and director of customer service. He has helped us to build prototypes, trained our resellers, and improved and refined our processes. Phil has played an integral role in expediting issue resolution for our customers and making sure that everyone has the information they need at their fingertips.

Over more than 20 years, Phil has also served as a trusted advisor to the business, particularly around how we plan and develop our strategies for the future. 

Thinking ahead to stay ahead

“Recently, with the supply chain shortages, I started to monitor the individual components that go into our products,” says Phil. “I actually spent an absurd amount of time tracking the lifecycles and availability of these components to make sure that we buy the parts we need before they are unavailable.”

We’ve also seen the industry change dramatically over the years. When you think about the fact that things like the iPhone or Google didn’t exist 40 years ago – technologies that have become staples in our everyday lives – you realize just how much progress has been made in recent years. 

A six-camera Motion Analysis VP320 system photographed in the late ’80s

At Motion Analysis, we’re proud to say that we’ve been able to translate this progress into success, not only for our business but also for our customers. From analyzing the movement of dancers and developing an improved basketball shoe to rehabilitating wounded soldiers, we’ve done a lot.

Pre-real-time labeling: the six-camera Motion Analysis VP320 system using Motion Analysis’ ExpertVision (EV) software to record and track a gymnast in the late ’80s

Navigating the peaks and valleys 

“It’s been great to see the business adapt and thrive through various peaks and valleys,” adds Phil. “I was there after 9/11 when some people were moved to part-time roles because we just didn’t have enough orders coming in.”

“Conversely, we had one December where we had to revamp our manufacturing system just to get out all the systems that had been ordered,” he continues. “Watching the business go through periods where we’ve struggled and then excelled, I can see how we’ve used periods of downtime to look at how we can make things better.”

Celebrating the weird and wonderful

For Phil, there isn’t only one standout experience or highlight because, “Motion Analysis has great relationships with all of our customers and we love all the motion capture projects we get to work on.”

“Some of the projects we’ve worked on over the years are just mind-blowing. We’ve done motion capture projects tracking everything from something as small as a fruit fly to something as large as an elephant. It’s really interesting to see how things move. Yes, this is enabled by innovation in motion capture and the flexibility of our systems, but it’s also about our clients’ creativity.”

Speeding up processes with the introduction of custom-designed VPAT cards to record camera data to memory: the MIDAS-based system running ExpertVision Advanced (EVa) in the early-to-mid ’90s

Acknowledging that customer needs have changed a lot over the years, Phil notes that Motion Analysis has consistently updated its mocap hardware and software to cater to these needs. For example, while we have always been known for developing high-end passive marker systems, we recently launched the BaSix camera family, which consists of three “light” camera models. BaSix was launched in an effort to make mocap more accessible and affordable for smaller studios. 

Looking to the future

Lucy Keighley, president of Motion Analysis, believes that our success comes down to all the people who make the company what it is today. “Most of our team have been working here for many years and that’s because, despite being smaller and spread across the world, our values align and that keeps us connected,” she says. “I would say that our greatest value is the relationship we have with our customers. Whether it’s our developers or sales staff, we all make an effort to get to know and to prioritize the needs of our customers above everything.” 

Looking ahead, we’re excited about the next 40 years of innovation in motion capture. 

“We want to evolve with and stay on top of new technology as it comes out. Our software is a core component that makes us stand out. And so we will continue to ensure that our software evolves with our clients’ needs, so that it can continue to be used in things like industrial design and ergonomics, animation, drone tracking, animal/human biomechanics, and so much more,” Phil says. “When you think about future applications, the possibilities are endless.”

Why Intel partners with Motion Analysis to bring technology to Olympic athletes

THE CUSTOMER

The Intel Olympic Technology Group (OTG) is a division of Intel focused on bringing cutting-edge technology to Olympic athletes and helping them to better prepare for the Olympic Games.  

THE PROBLEM

The Intel OTG wanted to create a smart coaching application using computer vision pose estimation models. These models use key point locations on the body, like joints, to calculate biomechanical attributes relevant to athletes, such as velocity, acceleration, and posture.
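To make the idea concrete, here is a minimal sketch of how key point trajectories can be turned into velocity and acceleration estimates using finite differences. The array layout, joint count, and frame rate are assumptions for the example; it is not a description of Intel OTG’s actual models.

```python
# Illustrative sketch: velocity and acceleration from key point trajectories by
# finite differences. The (frames x joints x 3) layout and the frame rate are
# assumptions, not details of Intel OTG's models.
import numpy as np

FRAME_RATE_HZ = 60.0  # assumed capture rate

def derivatives(keypoints: np.ndarray, rate_hz: float = FRAME_RATE_HZ):
    """keypoints: (n_frames, n_joints, 3) positions in metres.
    Returns per-frame velocity (m/s) and acceleration (m/s^2)."""
    dt = 1.0 / rate_hz
    velocity = np.gradient(keypoints, dt, axis=0)      # central differences
    acceleration = np.gradient(velocity, dt, axis=0)
    return velocity, acceleration

if __name__ == "__main__":
    # Fake 5-second, 17-joint trial just to show the shapes involved.
    rng = np.random.default_rng(0)
    trial = rng.normal(size=(300, 17, 3))
    vel, acc = derivatives(trial)
    ankle_speed = np.linalg.norm(vel[:, 16], axis=1)   # speed of one joint per frame
    print(vel.shape, acc.shape, round(float(ankle_speed.mean()), 2))
```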

THE SOLUTION

Motion capture was first used in biomechanics in the late 1970s to analyze a subject’s gait. But a lot has changed since then. Today, this technology is being used across the increasingly data-driven sports industry.

The information generated using motion capture software empowers coaches to identify issues that may be preventing a player from improving their performance. This technology can also be used to prevent injuries. When physiotherapists use motion capture software to analyze the kinematics of a particular movement, it is feasible to identify any range of motion (ROM) issues and determine whether these are linked to pain or injury in the athlete. When it comes to movement analysis, the accuracy of data is key. Precise data collection and instant translation of that data make it viable for coaches and physiotherapists to identify areas where re-injury might occur, determine an appropriate recovery time and provide evidence-based recommendations for rehabilitation.

3D motion capture software can also be used to track the movements of an entire team. Coaches can use this data to strategize better because it is possible to track a range of player performance factors – like accelerations or decelerations.
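As a rough illustration of the kind of kinematic measure described above, the sketch below estimates a joint angle from three tracked points and summarises range of motion (ROM) over a trial as the difference between the peak and minimum angle. The marker names and the simple three-point angle are assumptions for the example, not a clinical protocol.

```python
# Illustrative sketch: a joint angle from three tracked points and a simple
# range-of-motion (ROM) summary. Marker names in the usage comment are
# hypothetical; this is not a clinical joint-angle model.
import numpy as np

def joint_angle(proximal: np.ndarray, joint: np.ndarray, distal: np.ndarray) -> np.ndarray:
    """Angle (degrees) at `joint` between the proximal and distal segments.
    Each input is an (n_frames, 3) array of positions."""
    u = proximal - joint
    v = distal - joint
    cosang = np.einsum("ij,ij->i", u, v) / (
        np.linalg.norm(u, axis=1) * np.linalg.norm(v, axis=1)
    )
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

def range_of_motion(angles: np.ndarray) -> float:
    """ROM over a trial: peak angle minus minimum angle, in degrees."""
    return float(angles.max() - angles.min())

# Usage with hypothetical hip/knee/ankle trajectories, each shaped (n_frames, 3):
# knee_flexion = 180.0 - joint_angle(hip_xyz, knee_xyz, ankle_xyz)
# print(range_of_motion(knee_flexion))
```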

Benjamin Hansen, Product Engineering Lead in AI & Sports Technology for Intel OTG, has been using motion capture to do just this. Describing himself as “a lifetime customer of Motion Analysis”, he has used Motion Analysis systems to provide athlete testing services to elite athletes and professional baseball teams. And now, he’s using Cortex to validate and benchmark the smart coaching application described above.

Cortex is Motion Analysis’ most powerful motion capture software, completely managing motion capture and measurement for applications ranging from biomechanics, broadcasting, and engineering to sports performance, game production, and film.

One of the projects the OTG worked on using Cortex was 3D Athlete Tracking (3DAT). Intel’s motion tracking platform, 3DAT, creates scalable technology that advances the understanding of human health and performance. Crucially, it relies on a Motion Analysis system, which includes Kestrel cameras and Cortex software, to benchmark data accuracy in order to inform the necessary algorithms. Developed over four years for athletes competing at the Tokyo & Beijing Olympics, 3DAT is now being commercialized as a camera-agnostic motion capture software development kit (SDK) that developers can use to create biomechanics solutions for the sports, health, and fitness industries.
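As one illustration of what benchmarking against a marker-based ground truth can look like, the sketch below computes mean per-joint position error (MPJPE) between estimated and reference key points. The metric choice, array layout, and synthetic data are assumptions for the example and are not drawn from Intel’s published pipeline.

```python
# Illustrative sketch: benchmarking estimated key points against marker-based
# ground truth with mean per-joint position error (MPJPE). The metric, array
# layout, and synthetic data are assumptions, not Intel's published pipeline.
import numpy as np

def mpjpe(predicted: np.ndarray, ground_truth: np.ndarray) -> float:
    """Mean Euclidean distance per joint, averaged over frames and joints.
    Both arrays are (n_frames, n_joints, 3) and already time-synchronised."""
    errors = np.linalg.norm(predicted - ground_truth, axis=-1)  # (frames, joints)
    return float(errors.mean())

if __name__ == "__main__":
    # Synthetic check: estimates with ~2 cm per-axis noise around the reference.
    rng = np.random.default_rng(1)
    reference = rng.normal(size=(100, 17, 3))
    estimate = reference + rng.normal(scale=0.02, size=reference.shape)
    print(f"MPJPE: {mpjpe(estimate, reference) * 1000:.1f} mm")
```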
 
Want to find out more about Intel’s journey with Motion Analysis? Download the full case study here to learn more.