Apple is adding five new features to its $250 AirPods Pro 2 this fall


  • Apple is set to unleash five new features on its second-generation AirPods Pro 
  • The update will include machine learning that analyzes your surroundings

Apple is just months away from unleashing five new features to its $250 AirPods Pro 2 that will use machine learning to optimize listening.

The new updates were announced during Apple's Worldwide Developers Conference (WWDC) this month but were likely overshadowed by the AR Vision Pro headset.

The software will listen for changes in your environment, letting the outside world in or blocking out noise without you having to adjust anything manually, and will lower your audio when speech is identified.

Apple also said the upgrade reduces the time it takes to connect between different devices and comes with a new Mute or Unmute feature.


‘This fall, software updates across AirPods will unlock powerful new capabilities to transform the personal audio experience,’ the tech giant shared in an announcement.

‘AirPods Pro (2nd generation) become easier to use across environments and interactions with three powerful new features: Adaptive Audio, Personalized Volume, and Conversation Awareness. The entire lineup also gains new and improved features that make calls and Automatic Switching even more seamless.’

Adaptive Audio: Blends two features into one

This is Apple’s middle ground between listening to your favorite music and still chatting with people in the real world.

Machine learning analyzes the earbuds’ surrounding environment and adjusts the sound accordingly as background noise changes.

This way, loud or distracting noises surrounding you will be automatically reduced, while other noises will still be audible. 

9to5Mac shared an example of a user doing the dishes with Transparency activated; the system switches to ANC the moment they turn on the vacuum cleaner. 

Transparency mode lets outside sound in, so you can hear what is going on around you. 


Personalized Volume: AirPods Pro 2 will know how you like it

Another feature powered by machine learning is Personalized Volume, which learns the user’s volume preferences under certain conditions to fine-tune the media experience automatically.

Conversation Awareness: Hear conversations without removing the earbuds

You will no longer have to remove your AirPods to talk to someone.

Conversation Awareness automatically lowers the volume of your song or podcast and enhances the voice or voices in front of you while reducing background noise, like traffic. 

Automatic Switching: No more waiting for AirPods Pro 2 to pair with other Apple devices

Apple said at WWDC that your AirPods Pro 2 will pair with other devices almost instantly; the process currently takes a few seconds to complete. 

‘Additionally, moving between Apple devices with AirPods gets even easier with updates to Automatic Switching,’ Apple shared in a press release. 

‘Now, the connection time between a user’s Apple devices is significantly faster and more reliable, making it more seamless to move from a favorite podcast on iPhone to a work call on Mac.’

Mute or Unmute: No more grabbing your iPhone to mute yourself on a call

For added convenience, taking calls with AirPods is enhanced by the new feature across AirPods Pro (1st and 2nd generations), AirPods (3rd generation), and AirPods Max. 

Users can press the stem — or the Digital Crown on AirPods Max — to quickly mute or unmute themselves, so multitasking is effortless.

While the AirPods Pro 2 software update might sound exciting to some, Apple’s Vision Pro headset was the star of the annual tech conference.

The $3,499 headset lets users merge the real world with a digital one navigated by their eyes, voice and hands – no controllers needed. 

The headset runs on VisionOS, which Apple touts as ‘the world’s first spatial operating system.’

Apple calls it ‘spatial computing’ because it blends content into the space around you.


Mike Rockwell, Apple’s vice president of the Technology Development Group, said: ‘Creating our first spatial computer required invention across nearly every facet of the system.

‘Through a tight integration of hardware and software, we designed a standalone spatial computer in a compact wearable form factor that is the most advanced personal electronics device ever.’

Users move their eyes and hands and say specific commands to power their journey through the augmented experience.

Apple’s human interface chief Alan Dye said that users will select content inside the goggles with their eyes, tap their fingers together to click, and gently flick to scroll.
