3motionAI, a pioneer in AI-powered analysis of human movement, today launched its 3DNeuroNet engine, the world’s first and most advanced AI technology for analyzing activity-specific human movement and biomechanics from video. The engine extracts critical data insights to help people improve their performance, safety, health, and wellness across a variety of work, home, and play environments.
The 3DNeuroNet engine builds on more than 15 years of clinical research, advanced biomechanics, computer vision, machine learning, and AI development. 3motionAI’s breakthrough AI technology has been trained on millions of data points from live-captured and pre-recorded videos of various human activities, analyzing an array of functional movements to extract crucial data for evaluating sports performance, injury risk, workplace safety, physical therapy progress, mobility, aging, and other use cases.
Unlike traditional wearable systems that require cumbersome suits, sensors, or expensive labs operated by trained technicians, the 3DNeuroNet engine can analyze “real world” videos without costly hardware, sensors, or wearables. The engine can be embedded into client applications in weeks, saving millions on engineering and product development. It also accelerates time to market, powering a broad range of applications that support human performance at scale and at low cost.
“We’ve cracked the code on analyzing activity-specific human movement via AI and video, and we’re delivering advanced analytics that is easily embedded across a range of applications,” said Reed Hanoun, founder and CEO of 3motionAI. “In many ways, this is like ChatGPT for analyzing human movement via video. I’ve spent my career building products for health, wellness, safety, and rehabilitation – and AI, for the first time, enables powerful analysis and interpretation of data with high accuracy and low cost. We can’t wait to see what our customers and partners do with our 3DNeuroNet engine.”
The 3DNeuroNet engine has been proven in a variety of onsite, remote, and virtual environments – including pitch and swing analysis used by many major league professional baseball teams; evaluation of functional movements in physical therapy; analysis of functional abilities of workers; and monitoring of daily activities in active aging environments.
The 3DNeuroNet engine can be embedded in third-party mobile or cloud applications, using APIs and/or SDKs to accelerate the delivery of innovative products and services. The engine includes a robust pre-built library of activity-specific movement analysis for evaluating and benchmarking performance insights across industries and use cases. It delivers its analysis in user-friendly formats for end users, including patients, coaches, physical therapists, healthcare professionals, fitness instructors, and safety professionals.
3motionAI’s 3DNeuroNet engine is currently embedded in a range of products and services designed to advance human performance while reducing injury risk across industries, including fitness testing, professional sports, healthcare, and the workplace. The 3DNeuroNet engine most recently won the National Safety Council Innovation Award. The technology’s potential ROI is significant – for example, 3DNeuroNet’s workplace risk application can help identify and mitigate the top five most expensive employee safety and musculoskeletal risks, which together total $165 billion.
For more information about 3DNeuroNet use cases and potential integration into custom products or services, please visit the 3motionAI website or email info@3motionai.com.
About 3motionAI
3motionAI is pioneering the use of AI and video to analyze activity-specific human movement, fueling innovation across a broad range of applications for work, home, or play. The company’s 3DNeuroNet engine is the world’s most advanced embeddable AI technology for analyzing activity-specific human movement via video at scale and low cost. 3motionAI is a privately held, Toronto-based company founded by a team of experienced healthcare and technology innovators.
View source version on businesswire.com: https://www.businesswire.com/news/home/20231107193240/en/