Emotions Analytics changes the way we interact with our machines and ourselves


Research Shows Correlation Between Coronary Artery Disease and Voice

A poster presentation at the American Heart Association Scientific Sessions shows that voice analysis can be used to identify the presence of coronary artery disease (CAD). The study indicates a strong correlation between certain voice characteristics and CAD, which represents a significant breakthrough in this field. This finding can significantly improve continuous monitoring and reduce remote healthcare costs related to CAD.

CAD is one of the leading causes of cardiovascular mortality worldwide. When screening for the disease, patients would benefit from simple, noninvasive tests that could also improve the accuracy of risk estimation models. Beyond Verbal has previously identified voice signal characteristics associated with conditions such as autism and Parkinson's disease, and partnered with Mayo Clinic to investigate the same concept in CAD patients.

After conducting the double-blind study, which included 120 patients referred for elective coronary angiography and corresponding control subjects, one voice feature was shown to be associated with a 19-fold increased likelihood of CAD. After adjustment for age, gender, and cardiovascular risk factors, this feature remained independently associated with a significant 2.6-fold increased likelihood of CAD. This is the world's first study to suggest a link between voice characteristics and CAD, and it holds the potential to assist physicians in estimating the pre-test probability of CAD among patients with chest pain.
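Covariate-adjusted odds ratios like the 2.6-fold figure are typically derived from multivariable logistic regression. The sketch below illustrates the general technique on synthetic data; every variable, coefficient, and number here is an illustrative assumption, not the study's actual data or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical covariates (stand-ins, not the study's measurements):
age = rng.normal(60, 10, n)    # years
voice = rng.normal(0, 1, n)    # standardized voice feature

# Simulate CAD status so the voice feature carries an effect beyond age.
# The true log-odds coefficients below are arbitrary assumptions.
logit = -8 + 0.1 * age + 1.0 * voice
p = 1 / (1 + np.exp(-logit))
cad = rng.binomial(1, p)

# Fit logistic regression by Newton's method (IRLS).
X = np.column_stack([np.ones(n), age, voice])
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))       # predicted probabilities
    grad = X.T @ (cad - mu)                # score vector
    hess = X.T @ (X * (mu * (1 - mu))[:, None])  # Fisher information
    beta += np.linalg.solve(hess, grad)

# The age-adjusted odds ratio for the voice feature is exp(coefficient):
or_voice = np.exp(beta[2])
print(f"adjusted odds ratio for voice feature: {or_voice:.2f}")
```

A one-unit increase in the voice feature multiplies the odds of CAD by this ratio, holding age fixed; the study's model additionally adjusted for gender and other cardiovascular risk factors.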

Amir Lerman, MD, of Mayo Clinic: “We are excited by the potential in finding a correlation between voice features and CAD. This may open the door to other studies assessing the association between voice features and other health conditions.”

Yuval Mor, CEO of Beyond Verbal: “A patient’s voice is one of the most readily available, easily captured, and information-rich outputs the body offers. We are very excited to work with Mayo Clinic on such breakthrough research, studying the potential of using the human voice in healthcare monitoring, and specifically in CAD.”

About Beyond Verbal:
Since its launch in 2012, Beyond Verbal has been using voice-driven Emotions Analytics to dramatically change the way we can detect emotions and monitor health simply by analyzing the human voice. Beyond Verbal’s technology builds on ongoing research into the science of emotions that started in 1995. By combining its patented technology with proprietary machine learning algorithms and AI, Beyond Verbal focuses on understanding emotions and discovering vocal biomarkers. Over the past 21 years, the company has honed its technology through multiple internal tests and independent external validations. Over time, Beyond Verbal has collected more than 2.5 million emotion-tagged voice samples in more than 40 languages and secured its technology with multiple granted patents.

For more info, visit http://www.beyondverbal.com/