Is it like a rough inference of what’s being said based on mouth movements, or is it more precise somehow? Would it be a mistake to think you knew exactly what was said by reading lips (even if you were good at it)?
As someone who was born with unilateral moderate-to-severe hearing loss, I can read lips. The experience is likely unique across the hearing loss spectrum and time of onset, and some people may be able to learn the skill themselves, idk. I'm sure 100% deaf people experience it in their own interesting way.

To me it's not anything I consciously do, and it's not something that's really that visible to me. The fact that I can still hear, just not as well as people with normal hearing, affects how it works. The way I'd explain it for me is kinda like this:

Sometimes I can't hear enough to tell what is being said. One way my brain naturally deals with this is by reading the speaker's lips and using that to help filter and understand what it's hearing. I can kinda apply it as a skill, like with muted videos or people I can't hear because of distance, but it doesn't work that well and isn't worthy of trust.

So for me it's more of a sense, not something I do or think about. However, it's basically the least-effort way to understand speech that isn't clear enough. This is in contrast to another way I/my brain goes about it, which is trying really hard to figure out what it just heard.
To answer your last question, yes, it is likely a mistake. There's a YouTube channel about that whole concept called Bad Lip Reading. They dub over video with audio that matches the lips well enough.
To put my experience into perspective, which might work for at least a few people: closed captions. I mean… I've never asked anyone else, but y'all aren't just reading them, right? To me they just clarify the speech subconsciously (for the most part), rather than me reading them off the screen when I need them. Captions are weird… Who knows if this is accurate to my experience or similar to others.

Your experience seems very much like my own. I don't have hearing loss, but what I assume is an auditory processing issue with speech.
It's much easier for me to understand what someone is saying when I can see their mouth and microexpressions. If my back is turned, I don't always catch everything. Sometimes I keep hearing the "wrong thing" no matter how much I ask them to repeat it, so I just learned to repeat back whatever nonsensical thing I "heard"; that either helps me process what they were trying to say, or they repeat it back slower and more clearly. It's frustrating sometimes, especially in noisier environments with a lot of other stimuli… that's when seeing someone's lips helps me the most.

And of course, I love subtitles. Otherwise I have to blast the TV and still miss things. The subtitles just clarify what I'm pretty sure I heard, or what I missed. I'm not just reading my way through everything, unless it's in another language… which then does feel like a switch in the way I "see" subtitles.
Similar for me: when my hearing started to go in my 30s, the doctor said, "You already know how to lip read." I didn't believe him until he demonstrated with "Am I saying 'top' or 'cup'?": when he covered his mouth, I couldn't tell which one he was saying.
This, completely. I didn’t even know how much I depended on reading lips until everyone worth listening to was wearing a mask.
Yeah, it really sucked: speech slightly muffled by the mask, and no lips to read.
Same. My mind makes up random words for sounds all the time, so if I don't have context or lips to read, the sentences I hear people say are just wild.
I repeat things back to people wearing masks so that I can be sure that I understand them. It annoys some of them, but whatever.
Thanks! I appreciate the perspective on this, as lip-reading is kinda like “eye-reading” to me in that I’ve struggled to understand what’s involved.
To put my experience into perspective, which might work for at least a few people: subtitles. I mean… I've never asked anyone else, but y'all aren't just reading them, right? To me they just clarify the speech subconsciously (for the most part), rather than me reading them off the screen when I need them. Subtitles are weird… Who knows if this is accurate to my experience or similar to others.
This also helps me understand, as I often do watch stuff with subtitles to help better follow dialogue, and I’m usually not closely reading them all throughout.
As far as being similar to others’ experiences: I don’t have any significant hearing loss, but you basically just described my subjective experience reading lips (and subtitles).
When you say subtitles do you mean closed captions? Because I agree those are a boost for me to follow what I’m also seeing and hearing the person say. But with subtitles they’re speaking a different language so lip-reading isn’t helpful and hearing just adds tone of voice.
Yeah I mean closed captions, oops
I mix them up all the time myself
It's inference based on mouth movements, but it isn't as rough as it seems: context plays a huge role in disambiguation, just like it does for you with homonyms that you hear. It's just that the number of words that look similar when you read lips is larger than the number of words that sound the same, since some sounds are distinguished by articulations that you can't immediately see (such as [f] vs. [v]: you won't see the vocal folds vibrating for the latter, so "fine" and "vine" look almost* the same).

Also, the McGurk effect hints that everyone uses a bit of lip reading on an unconscious level; for most of us [users of spoken languages] it just disambiguates/reinforces the acoustic signal.
*still not identical - for [v] the teeth will touch the bottom lip a bit longer.
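Not something anyone here spelled out, but if it helps make the "more words look alike than they sound alike" point concrete, here's a rough Python sketch. The phoneme-to-viseme groupings below are simplified, illustrative assumptions, not a complete or authoritative inventory:

```python
# Several distinct phonemes share one visually similar mouth shape (a "viseme"),
# so words that sound different can look identical on the lips.
# Simplified, illustrative groupings only.
VISEME_GROUPS = {
    "p": "bilabial", "b": "bilabial", "m": "bilabial",    # lips pressed together
    "f": "labiodental", "v": "labiodental",                # teeth on the lower lip
    "t": "alveolar", "d": "alveolar", "n": "alveolar",     # tongue tip, barely visible
    "k": "velar", "g": "velar",                             # articulated in the back, hard to see
}

def looks_alike(phonemes_a, phonemes_b):
    """True if two phoneme sequences map to the same viseme sequence."""
    to_visemes = lambda ps: [VISEME_GROUPS.get(p, p) for p in ps]
    return to_visemes(phonemes_a) == to_visemes(phonemes_b)

print(looks_alike(["f", "ai", "n"], ["v", "ai", "n"]))  # True: "fine" vs "vine"
print(looks_alike(["p", "ae", "t"], ["m", "ae", "t"]))  # True: "pat" vs "mat"
print(looks_alike(["f", "ai", "n"], ["p", "ai", "n"]))  # False: "fine" vs "pine"
```

Because the mapping is many-to-one, whole groups of words collapse together visually, which is why context has to do so much of the disambiguation work.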
I think exact is impossible, but with enough practice you can get pretty good at identifying words. A good example of lip reading producing false words is Bad Lip Reading.
I’ve heard that it’s easier if you’re familiar with the person, past that I’m curious too!
As a linguist, I suspect that everyone lipreads to some extent as a conversation repair mechanism. Accuracy probably depends on skill and context. Family members with hearing loss are pretty good at understanding a speaker that they can see clearly, even when there’s no sound information available at all.
I can’t do it at all, scouring this thread for tips. I suspect it is pattern recognition my brain has not yet been trained to do.
All I know is that when people started wearing masks, suddenly I had trouble understanding them. I guess I’d picked it up subconsciously alongside my hearing loss.
I still wear masks in public, but holy shit does it make people harder to understand when I can't see their lips. I wanna say someone made a "clear" mask for that exact reason. Dunno where I remember that from, but it's a good idea.
There are others, but this looks like a good one.
The muffling of the mask doesn’t help at all
Hearing loss my dude. Army. Necessity is the mother of invention. It makes date night fun with my partner though. I can read and sign remote convos to her and sometimes that’s spicy/fun. It’s not perfect but I can usually follow the thread. More so if the target is animated/angry/excited. Unfortunately the best ones are the hardest. We love the first date awkward convos, the public breakups, and admissions of guilt but those tend to be subdued and difficult. When you get them though it is choice.
Oh man that was awful. I watched it with sound off and did not get a single word. It is indecipherable flapping.
For anyone not wanting to click, it’s a short video from the National Geographic channel titled: “What It’s Like to Read Lips” and it’s good! It definitely reinforces how I’m not great at reading lips. 😅
Seinfeld had an episode that revolved around lip reading.