Among the accessibility features in the newly released iOS 18 is something Apple calls Music Haptics. The feature, supported on iPhone 12 and later, is described by the company as a way for Deaf and hard-of-hearing people to “experience music on iPhone with taps, textures, and refined vibrations that are synchronized with a song’s audio.” Music Haptics, which has a corresponding API that App Store developers can hook into their own apps, works over Wi-Fi or cellular connections and is supported by Apple Music, Apple Music Classical, and the Apple-owned song identification service Shazam.
To promote the feature, Apple recently released playlists featuring songs enhanced by Music Haptics’ aforementioned taps and textures. I perused a few of the playlists on my new iPhone and happily discovered some absolute bangers if you, like me, are a fan of hip-hop. The Haptics Beats playlist contains classics like Jay-Z’s “99 Problems,” Tupac’s “California Love,” and Dr. Dre’s “Forgot About Dre.” What’s more, the hard rock fan in me was extremely tickled to find “Numb” by my all-time favorite band, Linkin Park, in the same playlist.
Like podcasts, music is a hearing-oriented medium, and conventional wisdom says neither is accessible to people who have limited hearing—or can’t hear at all. That’s generally correct, which helps explain why Music Haptics is such a noteworthy interplay of hardware and software.
As with Apple Podcasts gaining transcripts earlier this year in iOS 17.4, which made podcasts accessible to Deaf and hard-of-hearing people, Music Haptics aims to do the same for music. The reality is, many in the Deaf and hard-of-hearing community experience sound through its vibrations; sound waves are, after all, nothing more than air moving to varying degrees. To wit, there’s a notable scene in CODA, the 2022 Best Picture Oscar winner on Apple TV+, in which Deaf family patriarch Frank (played by Best Supporting Actor winner Troy Kotsur) sits with his hearing daughter Ruby in his pickup truck listening to music. Ruby tells him the volume is really loud, to which Frank replies that he likes listening to music because he can “feel the vibrations in my a**.” It’s a crudely funny line that nonetheless speaks to the poignancy of something like Music Haptics.

Combine Music Haptics with Apple Music’s impeccably synced lyrics view, and Deaf and hard-of-hearing people can more or less enjoy a functionally equivalent version of “listening” to music. This is a significant development; it isn’t trivial that Apple has innovated—yes, accessibility counts as innovation—to make two ostensibly inaccessible mediums, podcasts and music, suddenly accessible using the company’s vaunted airtight integration of hardware and software.
Growing up, my Deaf father would get up every morning for work by setting an alarm clock that literally shook the bed at the desired hour. He felt the vibrations, prompting him to get up for the day. It isn’t music or podcasts, but the benefit is conceptually similar: my dad used vibrations to his advantage where an audible alarm was, of course, inaccessible.