CPS:Medium:Quantitative Visual Sensing of Dynamic Behaviors for Home-based Progressive Rehabilitation
The objective of this research is to develop a comprehensive theoretical and experimental cyber-physical framework that enables intelligent human-environment interaction through a synergistic combination of computer vision and robotics. Specifically, the approach is applied to individualized remote rehabilitation with an intelligent, articulated, and adjustable lower-limb orthotic brace for managing knee osteoarthritis, where a visual-sensing/dynamical-systems perspective is adopted to: (1) track and record patient/device interactions with internet-enabled, commercial off-the-shelf computer-vision devices; (2) abstract those interactions into parametric, composable, low-dimensional manifold representations; (3) link the representations to quantitative biomechanical assessments of individual patients; (4) facilitate the development of individualized user models and exercise regimens; and (5) aid the progressive parametric refinement of exercises and adjustment of bracing devices. This research and its results will advance understanding of underlying human neuro-musculo-skeletal and locomotion principles by merging quantitative data acquisition with lower-order modeling and individualized feedback. Beyond efficient representation, the quantitative visual models offer the potential to capture fundamental physical, physiological, and behavioral mechanisms grounded in biomechanical assessments, and thereby afford insight into the generative hypotheses of human action.
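As one concrete illustration of step (2), the sketch below projects recorded joint-angle trajectories onto a low-dimensional linear subspace via principal component analysis. This is only a minimal, hypothetical realization of the "parametric, composable low-dimensional representation" idea; the function name, the PCA choice, and the synthetic data are assumptions for illustration, not the project's actual method, and the real framework would presumably use richer nonlinear manifold models.

```python
import numpy as np

def low_dim_representation(trajectories, n_components=3):
    """Project joint-angle trajectories onto their top principal components.

    trajectories: (n_samples, n_features) array, e.g. flattened per-frame
    knee/hip/ankle angles recorded during one exercise repetition.
    Returns (scores, components, mean) so that
    scores @ components + mean approximately reconstructs the input.
    """
    X = np.asarray(trajectories, dtype=float)
    mean = X.mean(axis=0)
    Xc = X - mean
    # SVD of the centered data yields the principal directions.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]        # low-dimensional basis
    scores = Xc @ components.T            # per-repetition coordinates
    return scores, components, mean

# Synthetic example: 50 repetitions, 20 sampled joint angles each,
# generated from 2 underlying movement modes plus small noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(50, 2))
basis = rng.normal(size=(2, 20))
data = latent @ basis + 0.01 * rng.normal(size=(50, 20))
scores, components, mean = low_dim_representation(data, n_components=2)
print(scores.shape)  # (50, 2): each repetition summarized by 2 numbers
```

Under this kind of abstraction, each exercise repetition is compressed to a handful of coordinates that can be compared across sessions, which is what makes progressive, quantitative refinement of a regimen tractable.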