Sony beat Apple to the idea of personalized spatial audio


Personalized spatial audio is one of the new iOS 16 features, and it was highlighted once more at Apple’s event on Wednesday. Once you’ve updated to the newest version of iOS, you’ll be able to create a personal sound profile for your iPhone, which should enhance the sense of immersion and the overall spatial audio experience you get from AirPods.

To create this custom tuning, Apple scans your ears using the iPhone’s front-facing TrueDepth camera. The procedure takes under a minute and entails holding your iPhone 10 to 20 centimeters away from the side of your head. The information obtained is then used to tune spatial audio for your particular ear shape.

Mary-Ann Rau of Apple stated during the keynote that “the way we all perceive sound is unique, based on the size and shape of our heads and ears.” Personalized spatial audio, she said, delivers the most immersive listening experience by precisely placing sounds in space in a way that is tailored specifically to you.


However, Apple isn’t the first company to take this route. Sony has offered “personalized 360 Reality Audio” since 2019 for compatible music services including Amazon Music, Tidal, and Deezer. Conceptually, the two approaches are very similar: both Sony and Apple attempt to determine your ear shape and adjust their spatial audio processing to account for the particular folds and contours of your ears. The objective is to preserve the 3D audio experience and eliminate any audio flaws that would diminish it.

Kaz Makiyama, vice president of video and sound at Sony Electronics, explained the benefits to me back in June:

The small differences in the intensity and timing of sound entering the left and right ears from the sound source enable humans to identify spatial sound sources. Additionally, the shape of our ears and skull may affect the sound. So, by taking photos of the ears, this technology can assess and recreate the characteristics of both ears, permitting reconstruction of the sound field while wearing headphones.
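The timing cue Makiyama describes can be sketched numerically. As an illustration only (this is not Sony’s or Apple’s actual processing), Woodworth’s classic spherical-head approximation estimates the interaural time difference for a source at a given azimuth, using an assumed average head radius:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature
HEAD_RADIUS = 0.0875    # m; an assumed average adult head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate ITD in seconds via Woodworth's spherical-head model.

    azimuth_deg: source angle from straight ahead, 0 to 90 degrees.
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))
```

A source directly ahead (0°) produces no time difference, while a source at 90° (directly to one side) yields an ITD of roughly 0.65 milliseconds, which is the kind of sub-millisecond cue these personalization systems are trying to preserve for each listener’s head geometry.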


However, Sony’s approach is a bit more cumbersome than Apple’s. The AirPods method is built right into the iOS settings, whereas with Sony’s products you must take an actual photo of each ear using the Headphones Connect app and your phone’s camera in order to create a custom sound field.

The photos are sent to Sony’s servers for analysis and then retained for a few more days so that Sony can use them for internal research and feature improvements. The company says that, at this time, the ear photos are not individually linked to you.

That’s not to say that Apple has mastered the art of ear-scanning, either. Some users on social media and Reddit have complained that the iOS 16 beta process can be laborious and occasionally fails to detect an ear. The reality, I suspect, is that there isn’t a dead-simple way to do this while still getting a precise reading of your ear shape.

Also Read: iPhone 14 Pro’s 48-Megapixel ProRAW photos require up to three times more storage than its 12-MegShots counterpart
