Accessibility Nutrition Labels will show users which accessibility features an app supports before they download it. These include VoiceOver, Voice Control, Larger Text, Sufficient Contrast, Reduced Motion, captions, and more.
Eliel Johnson, Vice President of User Experience and Design at CVS Health, praised the addition of Accessibility Nutrition Labels in a statement to 9to5Mac:
“At CVS Health we are passionate about making health care simpler and more accessible. We want to make it as easy as possible for consumers to find the health care information they need. By supporting Apple’s new accessibility nutrition labels, we’re providing more transparency and elevating the work our CVS Health teams do to create great experiences for all consumers.”
Apple says it will share more details on Accessibility Nutrition Labels for developers at WWDC25.
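Apple will detail the developer side of Accessibility Nutrition Labels at WWDC25, so there is no public API for declaring them yet. Purely as a sketch of the territory, the features the labels cover correspond to system settings that apps can already query at runtime through long-standing UIKit APIs; the helper type below is a hypothetical name for this illustration, not part of the labels themselves.

```swift
import UIKit

/// Hypothetical helper that gathers the state of a few system accessibility
/// settings matching features named on Accessibility Nutrition Labels.
/// Illustrative sketch only; not Apple's label-declaration mechanism.
struct AccessibilitySupportSnapshot {
    let voiceOverRunning: Bool
    let reduceMotionEnabled: Bool
    let increaseContrastEnabled: Bool
    let closedCaptioningEnabled: Bool
    let preferredTextSize: UIContentSizeCategory

    static func current() -> AccessibilitySupportSnapshot {
        AccessibilitySupportSnapshot(
            voiceOverRunning: UIAccessibility.isVoiceOverRunning,
            reduceMotionEnabled: UIAccessibility.isReduceMotionEnabled,
            increaseContrastEnabled: UIAccessibility.isDarkerSystemColorsEnabled,
            closedCaptioningEnabled: UIAccessibility.isClosedCaptioningEnabled,
            preferredTextSize: UIApplication.shared.preferredContentSizeCategory
        )
    }
}
```

The labels themselves will presumably be declared in App Store Connect, much like privacy nutrition labels, rather than in code.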
Magnifier for Mac
In conjunction with iOS 19, Apple is bringing its Magnifier app to the Mac for the first time this year with macOS 16. The Magnifier app has been available on iPhone and iPad since 2016 as a way to give users who are blind or have low vision the ability to zoom in, read text, and detect objects around them.
The new Magnifier app for Mac works with an iPhone connected via Continuity Camera or an attached USB camera. Once connected, users can zoom in on their surroundings with a video feed in the Magnifier app on their Mac. They can also manipulate the video feed to adjust things like perspective, brightness, contrast, colors, and more to make it easier to read. The Magnifier app can even recognize text.
For example, the Magnifier app on Mac can be used to zoom in on a whiteboard in a meeting or lecture. The app can then intelligently recognize the handwritten text on that whiteboard and render it more legibly right on the user's Mac.
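Apple hasn't said how Magnifier's text recognition is built, but the capability it describes maps closely onto the public Vision framework, which any Mac app can run on frames from Continuity Camera or a USB camera. A minimal sketch, assuming a CGImage frame is already in hand:

```swift
import Vision
import CoreGraphics

/// Recognize text in a single camera frame.
/// Sketch only; Magnifier's actual pipeline is not public.
func recognizeText(in frame: CGImage) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate        // favor accuracy over speed
    request.usesLanguageCorrection = true       // helps with messy, handwriting-like input

    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    // Each observation carries ranked candidate strings; keep the best one.
    return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
}
```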
Accessibility Reader
Accessibility Reader is a new feature that will be available system-wide to make text easier to read. Apple explains:
Accessibility Reader is a new systemwide reading mode designed to make text easier to read for users with a wide range of disabilities, such as dyslexia or low vision. Available on iPhone, iPad, Mac, and Apple Vision Pro, Accessibility Reader gives users new ways to customize text and focus on content they want to read, with extensive options for font, color, and spacing, as well as support for Spoken Content. Accessibility Reader can be launched from any app, and is built into the Magnifier app for iOS, iPadOS, and macOS, so users can interact with text in the real world, like in books or on dining menus.
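Accessibility Reader is a system feature rather than an API, but the kind of customization it describes (adjustable type, spacing, and Spoken Content-style read-aloud) can be approximated in a few lines of SwiftUI and AVFoundation. The view below is a simplified, hypothetical sketch, not Apple's implementation.

```swift
import SwiftUI
import AVFoundation

/// A toy reading view with adjustable type size and line spacing, plus a
/// read-aloud button. Illustrative only; not Apple's Accessibility Reader.
struct ReaderView: View {
    let text: String
    @State private var fontSize: CGFloat = 22
    @State private var lineSpacing: CGFloat = 8
    private let synthesizer = AVSpeechSynthesizer()

    var body: some View {
        VStack(alignment: .leading, spacing: 16) {
            ScrollView {
                Text(text)
                    .font(.system(size: fontSize))
                    .lineSpacing(lineSpacing)
            }
            Text("Text size")
            Slider(value: $fontSize, in: 14...48)
            Text("Line spacing")
            Slider(value: $lineSpacing, in: 0...24)
            Button("Read aloud") {
                // Spoken Content-style output via the system speech synthesizer.
                synthesizer.speak(AVSpeechUtterance(string: text))
            }
        }
        .padding()
    }
}
```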
Braille Access
Apple touts that its new Braille Access experience can turn a user’s Apple device “into a full-featured braille note taker that’s deeply integrated into the Apple ecosystem.”
“With a built-in app launcher, users can easily open any app by typing with Braille Screen Input or a connected braille device,” Apple explains. “With Braille Access, users can quickly take notes in braille format and perform calculations using Nemeth Braille, a braille code often used in classrooms for math and science.”
Additionally, Apple says that users can open Braille Ready Format (BRF) files directly from Braille Access. This will unlock a “wide range of books and files previously created on a braille note taking device.” The feature also ties in with Apple’s powerful Live Captions feature to allow users to transcribe conversations in real time on braille displays.
Live Captions on Apple Watch
Speaking of Live Captions, watchOS 12, coming later this year, will bring Live Listen controls to Apple Watch for the first time. As a refresher, Live Listen first came to the iPhone with iOS 12 in 2018. The feature uses an iPhone’s microphone to stream content directly to AirPods and Made for iPhone hearing aids, making it easier for the user to hear.
With watchOS 12 this year, Live Listen controls will now be available on Apple Watch, including the ability to remotely start or stop Live Listen sessions, jump back in a session to catch something that might have been missed, and more. There’s also support for real-time Live Captions, allowing users to follow along with the conversation via live transcripts directly on their Apple Watch.
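Apple doesn't document how Live Captions produces its transcripts, but the general shape of on-device live transcription is visible in the public Speech framework. The class below is a rough, hypothetical sketch (authorization prompts and error handling are omitted); it is not the Live Captions pipeline.

```swift
import Speech
import AVFoundation

/// Stream microphone audio into an on-device speech recognizer and print
/// interim transcripts. Sketch only; not Apple's Live Captions implementation.
final class LiveTranscriber {
    private let audioEngine = AVAudioEngine()
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start() throws {
        let request = self.request
        request.requiresOnDeviceRecognition = true   // keep audio on device where supported

        let inputNode = audioEngine.inputNode
        inputNode.installTap(onBus: 0, bufferSize: 1024,
                             format: inputNode.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)                   // feed live audio to the recognizer
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                print(result.bestTranscription.formattedString)   // rolling transcript
            }
        }
    }
}
```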
Vision Pro upgrades
Apple also touts new accessibility features coming to Apple Vision Pro this year:
For users who are blind or have low vision, visionOS will expand vision accessibility features using the advanced camera system on Apple Vision Pro. With powerful updates to Zoom, users can magnify everything in view — including their surroundings — using the main camera. For VoiceOver users, Live Recognition in visionOS uses on-device machine learning to describe surroundings, find objects, read documents, and more. For accessibility developers, a new API will enable approved apps to access the main camera to provide live, person-to-person assistance for visual interpretation in apps like Be My Eyes, giving users more ways to understand their surroundings hands-free.
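Neither Live Recognition's pipeline nor the new camera-access API for approved apps has been documented yet. As a loose illustration of the kind of on-device scene understanding described above, the existing Vision framework can classify a single camera frame; the function below is hypothetical and says nothing about the entitlements or review process the real API will require.

```swift
import Vision
import CoreGraphics

/// Return coarse labels for what's in a camera frame (e.g. "door", "document").
/// Illustrative only; not the Live Recognition pipeline or the new camera API.
func describeSurroundings(in frame: CGImage, minimumConfidence: Float = 0.3) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try handler.perform([request])

    return (request.results ?? [])
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }
}
```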
Personal Voice upgrades
Apple debuted its revolutionary Personal Voice feature as part of iOS 17 in 2023. The feature is a way for people to create and save a voice that sounds like them. Apple says it’s designed for people at risk of losing their ability to speak, such as those with a recent diagnosis of ALS. The feature then ties into Live Speech, which allows users to type what they want to say and have it spoken aloud in their voice.
The setup process for the initial version of Personal Voice required that users say 150 different phrases to train Apple’s machine learning model. The voice was then processed overnight.
With iOS 19 this year, however, Apple has completely revamped the setup process. Now, users will only need to record 10 different phrases, and they will process in under a minute rather than in multiple hours overnight. The end result is a voice that is “smoother” and “more natural-sounding,” according to Apple.
The Personal Voice feature will also add support for Spanish (Mexico) this year, Apple says.
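Personal Voice already has a small public surface in AVFoundation on iOS 17 and later: with the user's permission, an app can speak text with their Personal Voice. The sketch below shows that existing surface only; the faster 10-phrase training described above happens in the system's settings, not in code.

```swift
import AVFoundation

/// Speak text using the user's Personal Voice, if one exists and access
/// has been granted. Sketch of the existing iOS 17 API surface only.
func speakWithPersonalVoice(_ text: String, using synthesizer: AVSpeechSynthesizer) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        guard status == .authorized else { return }

        // Personal voices appear alongside system voices, flagged by a trait.
        let personalVoice = AVSpeechSynthesisVoice.speechVoices()
            .first { $0.voiceTraits.contains(.isPersonalVoice) }

        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = personalVoice   // falls back to the default voice if nil
        synthesizer.speak(utterance)
    }
}
```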
More accessibility features coming this year
Those features are just the tip of the iceberg. Apple has a long list of other new capabilities coming to its platforms later this year, including upgrades to Eye Tracking, Background Sounds, Sound Recognition, and much more.
Background Sounds becomes easier to personalize with new EQ settings, the option to stop automatically after a period of time, and new actions for automations in Shortcuts. Background Sounds can help minimize distractions to increase a sense of focus and relaxation, which some users find can help with symptoms of tinnitus.
For users at risk of losing their ability to speak, Personal Voice becomes faster, easier, and more powerful than ever, leveraging advances in on-device machine learning and artificial intelligence to create a smoother, more natural-sounding voice in less than a minute, using only 10 recorded phrases. Personal Voice will also add support for Spanish (Mexico).
Vehicle Motion Cues, which can help reduce motion sickness when riding in a moving vehicle, comes to Mac, along with new ways to customize the animated onscreen dots on iPhone, iPad, and Mac.
Eye Tracking users on iPhone and iPad will now have the option to use a switch or dwell to make selections. Keyboard typing when using Eye Tracking or Switch Control is now easier on iPhone, iPad, and Apple Vision Pro with improvements including a new keyboard dwell timer, reduced steps when typing with switches, and enabling QuickPath for iPhone and Vision Pro.
With Head Tracking, users will be able to more easily control iPhone and iPad with head movements, similar to Eye Tracking.
For users with severe mobility disabilities, iOS, iPadOS, and visionOS will add a new protocol to support Switch Control for Brain Computer Interfaces (BCIs), an emerging technology that allows users to control their device without physical movement.
Assistive Access adds a new custom Apple TV app with a simplified media player. Developers will also get support in creating tailored experiences for users with intellectual and developmental disabilities using the Assistive Access API.
Music Haptics on iPhone becomes more customizable with the option to experience haptics for a whole song or for vocals only, as well as the option to adjust the overall intensity of taps, textures, and vibrations.
Sound Recognition adds Name Recognition, a new way for users who are deaf or hard of hearing to know when their name is being called; a rough sketch of this kind of sound classification appears after this list of updates.
Voice Control introduces a new programming mode in Xcode for software developers with limited mobility.
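Apple hasn't said how Name Recognition is implemented; presumably it extends the same on-device sound classification that powers Sound Recognition today. The sketch below uses the public SoundAnalysis framework with Apple's built-in classifier to label live audio; recognizing a specific person's name would require an additional, custom-trained model that this example deliberately leaves out.

```swift
import SoundAnalysis
import AVFoundation

/// Print sound labels (doorbell, siren, speech, ...) detected in live audio.
/// Sketch of generic sound classification only; not Apple's Name Recognition.
final class SoundWatcher: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let inputNode = audioEngine.inputNode
        let format = inputNode.outputFormat(forBus: 0)

        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        audioEngine.prepare()
        try audioEngine.start()
    }

    // Called whenever the classifier produces a result for the audio stream.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first, top.confidence > 0.7 else { return }
        print("Heard: \(top.identifier)")
    }
}
```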
Reporter 蘇晟彥 / Compiled report
Since its launch, the iPhone 16 Pro's ability to record 4K 120 fps Dolby Vision video has won strong favor among creators. Building on that, as part of the global “Shot on iPhone” program, Apple has released Last Scene, a narrative short film directed by Japanese filmmaker Hirokazu Kore-eda and shot entirely on iPhone 16 Pro, using delicate cinematic language and the iPhone's distinctive natural color to tell a sincere and moving story.
▼ Hirokazu Kore-eda shot the short film Last Scene on iPhone 16 Pro. (Image: Apple)
Last Scene is set in the seaside town of Kamakura and stars accomplished actors Taiga Nakano (《喜劇開場》) and Momoko Fukuchi (《舞伎家的料理人》, The Makanai: Cooking for the Maiko House), joined by Kore-eda regulars Lily Franky and Daisuke Kuroda. Using the powerful iPhone 16 Pro camera and his distinctive naturalistic style, Kore-eda captures the colors, movement, and detail of the coastal town's ethereal skies, along with the complex emotions on the actors' faces, in a sincere and moving 27-minute short film.
Last Scene draws on several of the iPhone 16 Pro's latest camera features to capture the genuine beauty of everyday life, including:
•