Title: Automated preclinical detection of mechanical pain hypersensitivity and analgesia

Authors: Zhang, Zihe; González Cano, Rafael

Keywords: Preclinical pain models; Machine learning; Machine vision; Automated pain detection

Abstract: The lack of sensitive and robust behavioral assessments of pain in preclinical models has been a major limitation for both pain research and the development of novel analgesics. Here, we demonstrate a novel data acquisition and analysis platform that provides automated, quantitative, and objective measures of naturalistic rodent behavior in an observer-independent and unbiased fashion. The technology records freely behaving mice, in the dark, over extended periods for continuous acquisition of 2 parallel video data streams: (1) near-infrared frustrated total internal reflection for detecting the degree, force, and timing of surface contact and (2) simultaneous ongoing videography of whole-body pose. Using machine vision and machine learning, we automatically extract and quantify behavioral features from these data to reveal moment-by-moment changes that capture the internal pain state of rodents in multiple pain models. We show that these voluntary pain-related behaviors are reversible by analgesics and that analgesia can be automatically and objectively differentiated from sedation. Finally, we used this approach to generate a paw luminance ratio measure that is sensitive in capturing dynamic mechanical hypersensitivity over time and scalable for high-throughput preclinical analgesic efficacy assessment.

Date: 2022-12-12T12:32:48Z
Date issued: 2022-12

Type: journal article

Citation: Zhang, Zihe, et al. Automated preclinical detection of mechanical pain hypersensitivity and analgesia. PAIN 163(12): 2326-2336, December 2022. doi: 10.1097/j.pain.0000000000002680

URI: https://hdl.handle.net/10481/78398

DOI: 10.1097/j.pain.0000000000002680

Language: English

License: http://creativecommons.org/licenses/by-nc-nd/4.0/

Rights: open access; Attribution-NonCommercial-NoDerivatives 4.0 International

Publisher: Lippincott Williams & Wilkins
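Note: the abstract names a paw luminance ratio derived from the FTIR video stream but does not define it. The sketch below is a hypothetical, minimal illustration, assuming the ratio compares mean FTIR pixel intensity in the injured (ipsilateral) paw contact region to the contralateral paw region of the same frame; all function names and parameters here are illustrative and are not from the paper.

```python
# Hypothetical sketch only: the exact definition of the paw luminance ratio is
# not given in this record, and this code is not the authors' implementation.
# Assumption: ratio = mean FTIR intensity under the injured paw / mean FTIR
# intensity under the contralateral paw, per frame.
import numpy as np


def mean_luminance(frame: np.ndarray, mask: np.ndarray) -> float:
    """Mean FTIR pixel intensity within a boolean paw-contact mask."""
    pixels = frame[mask]
    return float(pixels.mean()) if pixels.size else 0.0


def paw_luminance_ratio(frame: np.ndarray,
                        ipsi_mask: np.ndarray,
                        contra_mask: np.ndarray) -> float:
    """Ipsilateral-to-contralateral paw luminance ratio for one frame.

    Under the stated assumption, values below 1 would indicate reduced weight
    bearing (guarding) on the injured paw.
    """
    ipsi = mean_luminance(frame, ipsi_mask)
    contra = mean_luminance(frame, contra_mask)
    return ipsi / contra if contra > 0 else float("nan")


if __name__ == "__main__":
    # Toy example: a synthetic FTIR frame with two illustrative paw regions.
    rng = np.random.default_rng(0)
    frame = rng.integers(0, 50, size=(240, 320)).astype(float)
    ipsi_mask = np.zeros(frame.shape, dtype=bool)
    contra_mask = np.zeros(frame.shape, dtype=bool)
    ipsi_mask[100:120, 80:100] = True     # hypothetical injured-paw contact
    contra_mask[100:120, 200:220] = True  # hypothetical contralateral contact
    frame[ipsi_mask] += 60                # weak contact -> lower luminance
    frame[contra_mask] += 150             # firm contact -> higher luminance
    print(f"paw luminance ratio: "
          f"{paw_luminance_ratio(frame, ipsi_mask, contra_mask):.2f}")
```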