Can a computer hear when you're drunk?


Glancing through the latest edition of the Journal of the Acoustical Society of America, I was struck by this paper from Heinrich and Schiel, ‘The influence of alcoholic intoxication on the short-time energy function of speech’. We’ve all witnessed people’s speech becoming slurred when they’re drunk. But can a computer pick up those tell-tale signs, and detect whether someone is intoxicated from their speech?
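For the curious, the short-time energy function that paper studies is easy to compute yourself: it is simply the energy of the signal measured over a short sliding window. Here is a minimal sketch in Python (the frame and hop lengths are illustrative choices of mine, not the values from the paper):

```python
import numpy as np

def short_time_energy(signal, frame_len=400, hop=160):
    """Short-time energy: the sum of squared samples in each sliding frame."""
    energies = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len].astype(float)
        energies.append(np.sum(frame ** 2))
    return np.array(energies)

# Toy example: at a 16 kHz sampling rate, a 400-sample frame is 25 ms long
# and a 160-sample hop gives one energy value every 10 ms.
sr = 16000
t = np.arange(sr) / sr
toy_signal = np.sin(2 * np.pi * 150 * t) * np.hanning(sr)  # not real speech!
print(short_time_energy(toy_signal)[:5])
```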
A number of scientific studies have explored how drinking alcohol changes speech. The average fundamental frequency of the voice (its pitch) tends to go higher, but the pitch also varies more. This rise in frequency might be caused by people talking louder when they are drunk. Another effect is the slurring of speech, or what is called disfluency. As people get drunk, they are more likely to stumble over their words, repeat them, elongate them, or even miss out some of the words [1]. However, there are people who do not show a significant change in their speech when intoxicated. Some people appear to be able to hide how inebriated they are.
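If you want to see the pitch effect for yourself, free libraries such as librosa will track the fundamental frequency of a recording for you. A rough sketch, where the file name is just a placeholder and the mean/standard-deviation summary is my own simplification of what the studies measure:

```python
import numpy as np
import librosa

# Load a speech recording (replace 'speech.wav' with your own file)
y, sr = librosa.load('speech.wav', sr=None)

# Estimate the fundamental frequency (F0) frame by frame with the pYIN tracker
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz('C2'), fmax=librosa.note_to_hz('C6'), sr=sr
)

voiced_f0 = f0[voiced_flag]  # keep only the frames where the voice is active
print(f"Average pitch: {np.nanmean(voiced_f0):.1f} Hz, "
      f"pitch variation (std): {np.nanstd(voiced_f0):.1f} Hz")
```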
Can a computer detect drunkenness? The most successful system I could find in the literature could detect intoxication correctly about 77% of the time. It did this by looking at a number of features of the speech that a computer could extract from a recording. Some of these related to frequency and others to rhythm [3]. While a 77% success rate might seem high, it also means 23% of cases are misidentified by the computer. Coincidentally, this is about as good as a human can do. When scientists have tested people’s ability to spot drunks by listening to speech alone, the success rate isn’t any better, and is sometimes worse [4].
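To give a flavour of how such a detector works (this is a toy sketch of my own, not the system from [3]): each recording is boiled down to a handful of numbers describing frequency and rhythm, labelled sober or intoxicated, and fed to an ordinary classifier.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical feature vectors: [mean pitch (Hz), pitch variation (Hz), pause ratio]
# In a real system these would be extracted from recordings automatically.
X = np.array([
    [120, 10, 0.10], [125, 12, 0.12], [118,  9, 0.08], [122, 11, 0.11],  # sober
    [135, 25, 0.20], [140, 30, 0.25], [132, 22, 0.18], [138, 28, 0.22],  # intoxicated
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 0 = sober, 1 = intoxicated

clf = SVC(kernel='rbf', gamma='scale')
scores = cross_val_score(clf, X, y, cv=4)  # 4-fold cross-validation
print(f"Cross-validated accuracy: {scores.mean():.0%}")
```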

Maybe the solution is to use a multi-modal approach, as is being tried in speech recognition, and combine the speech features with other non-audio signs of drunkenness. Maybe one of these features could be physical clumsiness, as ably demonstrated by Les Patterson in the video. One of the reasons for research into detecting intoxication is to put systems into cars that detect drunk drivers and stop them using their vehicle. The car would be voice activated and would only work if the driver was judged to be sober. Maybe alongside listening to the speech, the car needs accelerometers in the driver’s seat, spotting those who have enough alcohol in them to dull the motor skills used even when sitting.
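The simplest way to try that kind of fusion (purely illustrative, and assuming you already had speech features from the microphone and movement features from a seat accelerometer) is to glue the two feature vectors together before classification:

```python
import numpy as np

def fuse_features(speech_features, motion_features):
    """Feature-level fusion: concatenate speech and motion features so one
    classifier can weigh up evidence from both modalities at once."""
    return np.concatenate([speech_features, motion_features])

# Hypothetical numbers: pitch stats and pause ratio from the microphone,
# plus sway amplitude and fidget rate from the driver's seat accelerometer.
speech = np.array([138.0, 28.0, 0.22])
motion = np.array([0.9, 4.2])
print(fuse_features(speech, motion))  # one combined 5-dimensional feature vector
```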
Wwwwhat do youuuu think? Please,    pleeeeeaaasee comment below.

Notes

[1] H. Hollien, G. DeJong, C. Martin, R. Schwartz, K. Liljegren, “Effects of ethanol intoxication on speech suprasegmentals”, J. Acoust. Soc. Am. 110 (2001) 3198–3206.
[2] http://www.phon.ucl.ac.uk/courses/spsci/iss/week10.php
[3] Florian Schiel, Christian Heinrich, Veronika Neumeyer, “Rhythm and Formant Features for Automatic Alcohol Detection”, Proc. INTERSPEECH 2010, ISCA, Makuhari, Japan, pp. 458–461, 2010.
[4] Björn Schuller et al., “Medium-term speaker states—A review on intoxication, sleepiness and the first challenge”, Computer Speech and Language 28 (2014) 346–374.


4 responses to “Can a computer hear when you're drunk?”

  1. Assuming you don’t want to use a breathalyzer (for the car), then you could try to analyze the performance of a control set of words/phrases (using basic forms of AI for matching algorithms), and maybe a short set of hand/eye coordination tests (which would be good for people too tired to drive, as well). So, it should be possible to tell if the person is impaired… though, maybe not necessarily drunk.

  2. I’m not sure how a computer could differentiate between a drunk and someone with a disability that impairs speech but not cognitive reasoning. It’s worth comparing chemically impaired speech to speech impaired by injury or disability to see if there are any differences.

  3. Just a great idea Trevor for adapting speech recognition into the drunk driving strategic pantheon. If you were able to spearhead this or find someone in the auto industry to take this on, it could be a huge service to humankind. When my sound partner and I read The Soundbook we knew
    we had found another sound brother. We will run across each other’s paths at some point, as there are some things we are working on that I think would send you into extreme excitement . . . that you might even want a piece of!
    in resonance, Alan Tower

  4. Really interesting post Trevor. This is definitely something that I think has potential and could be used in the future for motoring, although there are obviously still many areas that need careful consideration first. Like Damian said, you need to differentiate between disability and drunkenness as well as seeing how best to implement it – I know I wouldn’t want to take a test every time I wanted to drive my car!
    One idea I think may become reality one day is a Google Glass concept that could be used for a kind of augmented reality sat nav or something projected onto the windscreen. If so there could be some sort of camera to maybe read the driver’s eyes to see if they’re drunk/tired? All a way off yet but some interesting food for thought if nothing else.
