This excites me.
AI can diagnose Type 2 Diabetes in 10 seconds from your voice.
From a recent study published by the Mayo Clinic: "Voice analysis shows potential as a pre-screening or monitoring tool for T2DM, particularly when combined with other risk factors associated with the condition."
I’m not an expert on this study, but it highlights an example of how AI could make our lives healthier and happier. There are similar use cases where AI paired with the sensors in your phone detects changes in your gait that could serve as an early indicator of conditions such as Parkinson's, musculoskeletal disorders, neurological ailments, cardiovascular diseases, and increased fall risk.
The implications of voice analysis are immense:
► Non-invasive diabetes screening
► Remote and automatic diagnosis (could integrate into the things we do every day)
► Reduced need for in-person testing
More detail on what it's looking for:
Results of the study, published in the journal Mayo Clinic Proceedings: Digital Health, found that ‘pitch’ and ‘standard deviation of pitch’ were useful features for diagnosing the condition in all participants; however, ‘relative average perturbation jitter’ proved more useful in women, while ‘intensity’ and ‘11-point amplitude perturbation quotient shimmer’ were useful in diagnosing men.

The study stated: “In women, the predictive features were mean pitch, pitch SD, and RAP jitter, and in men, mean intensity and apq11 shimmer were used. In simple terms, the variation in these features found that women with T2DM reported a slightly lower pitch with less variation, and men with T2DM reported slightly weaker voices with more variation. These differences likely stem from differences in disease symptom manifestations between the sexes.”
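To make those feature names concrete, here is a minimal sketch of how the acoustic measures the study names (mean pitch, pitch SD, RAP jitter, mean intensity, apq11 shimmer) could be pulled from a short voice clip using the open-source Praat toolkit via its Python wrapper, parselmouth. This is purely illustrative and not the study authors' actual pipeline; the file name, pitch range, and analysis settings are my assumptions.

```python
# Illustrative sketch only: extract the acoustic features named in the study
# from a short voice clip using Praat via the parselmouth library.
# NOT the study's actual pipeline; file name and settings are assumptions.
import parselmouth
from parselmouth.praat import call

def extract_voice_features(wav_path, f0_min=75.0, f0_max=500.0):
    snd = parselmouth.Sound(wav_path)

    # Pitch (fundamental frequency) statistics in Hz
    pitch = call(snd, "To Pitch", 0.0, f0_min, f0_max)
    mean_pitch = call(pitch, "Get mean", 0, 0, "Hertz")
    pitch_sd = call(pitch, "Get standard deviation", 0, 0, "Hertz")

    # Glottal pulse point process, needed for jitter/shimmer measures
    point_process = call(snd, "To PointProcess (periodic, cc)", f0_min, f0_max)

    # Relative average perturbation (RAP) jitter: cycle-to-cycle pitch perturbation
    rap_jitter = call(point_process, "Get jitter (rap)", 0, 0, 0.0001, 0.02, 1.3)

    # 11-point amplitude perturbation quotient (apq11) shimmer: amplitude perturbation
    apq11_shimmer = call([snd, point_process], "Get shimmer (apq11)",
                         0, 0, 0.0001, 0.02, 1.3, 1.6)

    # Mean intensity (loudness) in dB
    intensity = call(snd, "To Intensity", f0_min, 0.0, "yes")
    mean_intensity = call(intensity, "Get mean", 0, 0, "energy")

    return {
        "mean_pitch_hz": mean_pitch,
        "pitch_sd_hz": pitch_sd,
        "rap_jitter": rap_jitter,
        "apq11_shimmer": apq11_shimmer,
        "mean_intensity_db": mean_intensity,
    }

# Hypothetical usage: features from a ~10-second recording could then be fed
# to a classifier alongside other risk factors (age, BMI, etc.).
print(extract_voice_features("voice_sample.wav"))
```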
As always, I suggest a quick review of the source study for more context.