Starting in iOS 14 and macOS Big Sur, developers will be able to detect human body and hand poses in photos and videos within their apps using Apple's updated Vision framework, as explained in this WWDC 2020 session.
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.
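At the API level, this works through Vision's request/handler pattern: you run a pose request against an image and read back observations whose joints carry normalized coordinates and confidence scores. The sketch below shows body pose detection on a still image; the file path and the 0.3 confidence cutoff are illustrative assumptions, not values from the session.

```swift
import Vision

// Sketch: detect body poses in a still image and read one joint.
// "photo.jpg" is a placeholder path.
func detectBodyPoses(in url: URL) {
    let handler = VNImageRequestHandler(url: url)
    let request = VNDetectHumanBodyPoseRequest()
    do {
        try handler.perform([request])
        let observations = request.results as? [VNHumanBodyPoseObservation] ?? []
        for body in observations {
            // Joints are returned by name, with normalized (0–1)
            // coordinates and a per-point confidence score.
            let joints = try body.recognizedPoints(.all)
            if let wrist = joints[.leftWrist], wrist.confidence > 0.3 {
                print("Left wrist at \(wrist.location)")
            }
        }
    } catch {
        print("Pose request failed: \(error)")
    }
}
```

Hand pose detection follows the same pattern with `VNDetectHumanHandPoseRequest` in place of the body request.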
Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
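One plausible way to implement that pinch gesture is to measure the distance between the thumb-tip and index-tip points of a hand observation and treat the hand as "pinched" when they are close enough. This is a sketch of the idea, not Apple's demo code; the 0.05 distance threshold and 0.3 confidence cutoff are assumptions.

```swift
import Vision

// Sketch: report a pinch when thumb tip and index tip are close
// together in normalized image coordinates.
func isPinching(_ hand: VNHumanHandPoseObservation) -> Bool {
    guard let thumbTip = try? hand.recognizedPoint(.thumbTip),
          let indexTip = try? hand.recognizedPoint(.indexTip),
          thumbTip.confidence > 0.3, indexTip.confidence > 0.3
    else { return false }

    let dx = thumbTip.location.x - indexTip.location.x
    let dy = thumbTip.location.y - indexTip.location.y
    return (dx * dx + dy * dy).squareRoot() < 0.05
}
```

A drawing app could call this per frame on camera output and add a stroke point whenever the pinch is held.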
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands that mirror the specific gesture, such as a peace sign.
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing flowing or robe-like clothing. The algorithms can also experience difficulties if a person is close to the edge of the frame or partially obstructed.
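For the multi-hand case, the hand pose request exposes a `maximumHandCount` property to cap how many hands it returns, and one way to cope with the difficult conditions above is to discard observations whose joints are mostly low-confidence. The cutoffs below are illustrative assumptions.

```swift
import Vision

// Sketch: detect up to four hands, then keep only observations
// where at least half the joints are reasonably confident.
let request = VNDetectHumanHandPoseRequest()
request.maximumHandCount = 4

func reliableHands(from results: [VNHumanHandPoseObservation]) -> [VNHumanHandPoseObservation] {
    results.filter { hand in
        let points = (try? hand.recognizedPoints(.all)) ?? [:]
        guard !points.isEmpty else { return false }
        let confident = points.values.filter { $0.confidence > 0.3 }
        return confident.count >= points.count / 2
    }
}
```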
Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.
Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.
Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, facing upside down, or wearing overflowing or robe-like clothing.
I don't think the wizarding community is going to be too happy about this...