How the Ear Works: Nature's solutions for listening (1997) (nih.gov)
97 points by jacquesm on Oct 10, 2023 | hide | past | favorite | 18 comments



The journey from tiny atmospheric vibrations to perceiving something like speech or music is as improbable and absurd as any biological adaptation of the animal kingdom.

The basic, undifferentiated cell functions (including DNA encoding, etc.), while extraordinary, still feel robust and fairly close to physical, inorganic processes.

Once cells start differentiating and creating these long chains of bizarre mechano-electro-biological contraptions, one wonders what sort of weird and wonderful phase space has been tapped.

I wish there were more detail about the final neuronal stages: perception of pitch (octaves), timbre, rhythm, direction, volume, etc.


The books Tuning, Timbre, Spectrum, Scale [1] and Rhythm and Transforms [2] by Sethares have a lot of detail about the perception of higher level phenomena like pitch and rhythm. Plus he's an electrical engineer, so you get some code out of the deal. He doesn't dwell too much on the neuroscience aspect though--in both cases he's after a more functional explanation.

[1] https://sethares.engr.wisc.edu/ttss.html [2] https://sethares.engr.wisc.edu/RT.html


Iain McGilchrist touches on music perception in The Master and His Emissary, as far as it concerns the main thesis of the book (which is primarily about our two sub-minds—brain hemispheres—and their different modes of attending to the world).


The paper doesn't even cover the intermediate neuronal stages in the brainstem, which are extremely important for localization and for detecting different features of sound; in fact, several of those features are well characterized at the brainstem level.

Your brainstem can detect level and timing differences between the versions of a sound arriving at each ear (interaural level and interaural time differences), depending on the frequency of the sound, and uses them for localization.
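As a rough illustration of the timing cue, the interaural time difference (ITD) is often approximated with the classic Woodworth spherical-head model. This is a textbook approximation, not anything from the paper, and the head radius below is just a commonly used average:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at ~20 °C
HEAD_RADIUS = 0.0875    # m, a commonly assumed average head radius

def interaural_time_difference(azimuth_deg):
    """Approximate ITD (in seconds) for a source at the given azimuth,
    using the Woodworth spherical-head model:
    ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly ahead gives zero ITD; a source 90° to one side
# gives roughly 0.65 ms, close to the maximum typically cited for humans.
print(f"{interaural_time_difference(90) * 1e3:.2f} ms")
```

The sub-millisecond scale of that number is part of why the brainstem circuitry is so remarkable: it resolves timing differences far finer than the firing rate of individual neurons would suggest.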

As for the improbability - it makes perfect sense when you think of each layer as some sort of improvement on the previous layer(s), rather than as a well-engineered, preplanned system.


Hijacking this for a PSA:

This hardly-upvoted post [0] made me aware that Apple AirPods Pro can greatly assist your hearing if you dig into the settings on your phone a bit. According to one review [1], they come really close to the best hearing aids out there. Worked great for my wife.

[0] https://news.ycombinator.com/item?id=36857671

[1] https://arstechnica.com/science/2022/11/study-airpods-pro-ar...


One-sided hearing loss here. I can only second this. I tested them in a professional setting against multiple four-digit-priced "real" hearing aids, and for the price point it's ridiculous.

One of the leading Hearing Aid Content Creators "Doctor Cliff" on them:

    I'm quite surprised that Apple is able to get away with this. [0]
[0] Detailed Review from Hearing Care Provider: https://www.youtube.com/watch?v=x0gsTz1sSDE

[1] Turn Your AirPods Pro 2nd Gen into Hearing Aids in Under 3 Minutes: https://www.youtube.com/watch?v=CiWD9Rt2fJ0


Earbuds + equalizer can also help with some forms of tinnitus: just filter out the band that triggers the nasty effects.
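As a sketch of that idea, a notch (band-reject) filter centered on a trigger frequency can be built with SciPy. The 8 kHz center frequency and Q factor below are placeholder values; actual tinnitus frequencies vary from person to person:

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 44100.0          # sample rate, Hz
f_tinnitus = 8000.0   # hypothetical trigger frequency (varies per person)
q = 5.0               # quality factor: lower values give a wider notch

b, a = iirnotch(f_tinnitus, q, fs)

# Demo: a tone at the notch frequency is strongly attenuated, while a
# tone an octave below passes almost untouched.
t = np.arange(int(fs)) / fs
mid = slice(int(fs) // 4, int(fs) // 2)  # steady-state region of the signal

def rms(x):
    return float(np.sqrt(np.mean(x ** 2)))

notched = rms(filtfilt(b, a, np.sin(2 * np.pi * f_tinnitus * t))[mid])
passed = rms(filtfilt(b, a, np.sin(2 * np.pi * (f_tinnitus / 2) * t))[mid])
print(notched, passed)  # notched tone near zero, other tone near 0.7 (sine RMS)
```

In practice you'd apply the same filter to the audio stream rather than to test tones; the point is that a narrow notch can remove the offending band without audibly touching the rest of the spectrum.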


Unfortunately, tinnitus comes in many shapes and forms.

I acquired tinnitus during my time working in an open office spanning three floors with a waterfall in the center of it, so I was quite motivated to look into the details. Oddly enough, this was while working for a leading hearing aid provider. For many people it's a clicking, buzzing, hissing, or roaring, not only a ringing at a specific set of frequencies.

To add to the solution provided above, listening to brown noise or a drone helps me with a 'hissing' tinnitus.


Note the 'some'.


No, we have to be pedantic like reddit. Otherwise, how else will people know how smart we are?

/s

I hope HN doesn't devolve into reddit, where "akshully" is more common than it needs to be.


My dad was an audiologist and worked for many years with children with cochlear implants. I remember in the 90s him bringing a laptop home with software that tested otoacoustic emissions via a microphone/earbud combo. Later I met one of his colleagues from that time who, whilst working full time and with a large young family, obtained a phd in using this tech to test newborn hearing. That work has been the basis for the UK’s national newborn hearing screening service for decades now, helping to catch issues early where the most impact can be made. I’m really in awe of this profession - it’s pretty out of sight for many but makes a huge difference.


My grad advisor spent the majority of his career developing and improving cochlear implants, and one of my labmates continues that work to this day (dude is brilliant too, MS in EE, PhD in cogneuro).


That's a worthy career.


Thanks!

> The concept that a source of mechanical energy exists in the cochlea appeared validated when in the late 70s it was discovered that sound is produced by the inner ear. [---] Within five years it was discovered that the outer hair cell could be made to elongate and shorten by electrical stimulation. The function of the outer hair cell in hearing is now perceived as that of a cochlear amplifier that refines the sensitivity and frequency selectivity of the mechanical vibrations of the cochlea.

Like high-resolution variable valve timing in ICEs? Amazing things going on in the body.

> Aspirin has long been known to cause a reversible hearing loss and more recently it has been shown to block otoacoustic emissions. The experiments show that aspirin causes both effects by a direct action on the outer hair cell.

Oh, wow. That sounds like useful information.


Aspirin's ototoxicity has been known since at least the 60s; luckily, it's reversible. When my advisor did animal studies, they used a similar NSAID (if I remember correctly, it's been a long time) to chemically deafen their animals before installing and testing cochlear implants.


This was actually the focus of my graduate studies, and I'm happy to talk about my (perhaps now slightly outdated, but not as outdated as this paper) research here. It centered on centrally mediated attenuation of incoming signals from the periphery (specifically the cochlea), studied using otoacoustic emissions (echoes from the inner ear) and auditory brainstem responses. This review is a good treatment of the peripheral auditory system, but excludes two important things:

1) the central auditory system. This would've expanded the review to near textbook levels, so it's understandable why it wasn't included, but a lot of really interesting things happen in the brainstem, like sound localization. Auditory brainstem responses are really cool because the different waves you can detect are directly correlated with different structures (like the olivary complex) in the brainstem. Above the thalamus, sound perception is less well-characterized.

2) the descending auditory nerve fibers. This is likely because most of the research here took place after it was published, so it's understandable, but it was my area of research so I have to talk about it. We have neurons that come from the brain and send signals back down to the cochlea, specifically the outer hair cells which moderate the cochlear amplifier, and while it's still not entirely clear what messages they convey, my research was around the interaction of directed attention and these descending nerve fibers.

One caveat is that I've been out of the field for nearly a decade and have barely kept up with the research, but I don't think any major breakthroughs or changes in the debate about the descending fibers have happened since then.


I just finished a wonderful book that readers of this comment thread may find interesting: "An Immense World: How Animal Senses Reveal the Hidden Realms Around Us" by Ed Yong. The book is a relatively non-technical deep dive into the many ways different organisms sense their environments. The chapters are organized by the senses; in the chapter on hearing, he covers everything from mammals to insects. His discussion of echolocation was amazing: across the many species of bats, there are a number of quite different ways in which they use echolocation. Some bats use distinct chirps for "scan mode" versus "final approach on target" mode, and he gives plots of FFTs of the various sounds researchers have captured. I could go on and on about his descriptions of insect vibration sensing, electric eels, taste, sight in its various forms across species, etc. Great read, highly recommended.
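The kind of FFT plot researchers make of those calls is easy to reproduce on a synthetic sweep. This sketch uses made-up, scaled-down frequencies purely for illustration, not real bat-call data (actual echolocation calls are ultrasonic, often well above 20 kHz):

```python
import numpy as np
from scipy.signal import chirp, spectrogram

fs = 48000   # sample rate, Hz
dur = 0.01   # a 10 ms "call"

# Synthesize a downward frequency sweep loosely inspired by a bat
# echolocation chirp (illustrative frequencies, not measured values).
t = np.linspace(0.0, dur, int(fs * dur), endpoint=False)
call = chirp(t, f0=12000.0, f1=4000.0, t1=t[-1], method="linear")

# Short-time FFT: the sweep shows up as a descending ridge over time.
freqs, times, sxx = spectrogram(call, fs=fs, nperseg=64, noverlap=48)
peak_per_slice = freqs[sxx.argmax(axis=0)]  # dominant frequency per time slice
print(peak_per_slice[0], peak_per_slice[-1])  # starts high, ends low
```

Plotting `sxx` with `matplotlib.pyplot.pcolormesh` gives exactly the descending-ridge spectrograms the book reproduces; the "scan" vs. "final approach" calls differ mainly in sweep shape, duration, and repetition rate.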


So amazing.

This somehow makes it obvious that we are just machines, and also how complex a machine, and so how far we would have to advance to make machines like us.



