Hand and Body Pose Detection in iOS 14 Will Provide New Ways to Interact With Your iPhone Without Touching the Display - MacRumors

Starting in iOS 14 and macOS Big Sur, developers will be able to add the capability to detect human body and hand poses in photos and videos to their apps using Apple's updated Vision framework, as explained in this WWDC 2020 session.
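As the session describes, detection runs through a new Vision request. A minimal sketch of running body-pose detection on a still image might look like the following; `VNDetectHumanBodyPoseRequest`, `VNImageRequestHandler`, and the joint-name API are the actual iOS 14 names, while the `detectBodyPoses` wrapper and its minimal error handling are illustrative assumptions:

```swift
import CoreGraphics
import Vision

// Run body-pose detection on a still image (iOS 14+/macOS 11+).
// `cgImage` is a hypothetical input; error handling is kept minimal for brevity.
func detectBodyPoses(in cgImage: CGImage) throws -> [VNHumanBodyPoseObservation] {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return request.results ?? []
}

// Each observation exposes named joints as normalized points with confidences:
// let points = try observation.recognizedPoints(.all)
// let wrist = points[.leftWrist]   // VNRecognizedPoint? with .location and .confidence
```

Hand detection works the same way via `VNDetectHumanHandPoseRequest`, which also offers a `maximumHandCount` property to bound how many hands are returned.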

[Image: Vision framework detecting a body pose during a jumping jack]
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.

Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
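A pinch like the one in Apple's demo can be recognized by measuring the distance between the thumb and index fingertips. The joint names `.thumbTip` and `.indexTip` are real `VNHumanHandPoseObservation.JointName` values returned in Vision's normalized (0–1) image coordinates; the helper below and its 0.05 threshold are illustrative assumptions, not Apple-documented values:

```swift
import Foundation

// Decide whether a hand is "pinching": thumb tip and index tip held close together.
// Inputs are normalized image coordinates (0...1), as produced by
// VNHumanHandPoseObservation.recognizedPoint(.thumbTip) / .recognizedPoint(.indexTip).
func isPinching(thumbTip: CGPoint, indexTip: CGPoint, threshold: Double = 0.05) -> Bool {
    let dx = Double(thumbTip.x - indexTip.x)
    let dy = Double(thumbTip.y - indexTip.y)
    // Euclidean distance between the two fingertips in normalized space.
    return (dx * dx + dy * dy).squareRoot() < threshold
}
```

An app like the drawing demo would call this each frame and treat a pinch as "pen down," recording the fingertip location while the gesture is held.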

[Image: Vision framework detecting a hand pose]
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands that mirror the specific gesture, such as a peace sign.

[Image: Emoji overlaid on a detected hand gesture]
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.

The framework can detect multiple hands or bodies in a single scene, but the algorithms may not work as well with people who are wearing gloves, bent over, upside down, or wearing flowing or robe-like clothing. Detection can also struggle when a person is close to the edge of the frame or partially obstructed.

Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.



Top Rated Comments

75 months ago
When these phones can snap a photo right at the optimum height of a group jump, we will truly be in the future.
Score: 13 Votes
75 months ago
That’s great. I have a hand gesture I’ve been giving to Siri for years. Now maybe she’ll get the message.
Score: 12 Votes
luvbug
75 months ago

Quote: "Seems a lot like what Xbox was able to do with the Kinect 10 years ago"
That's funny, I didn't think the Xbox was a mobile phone??
Score: 6 Votes
Appleman3546
75 months ago
Seems a lot like what Xbox was able to do with the Kinect 10 years ago
Score: 5 Votes
AngerDanger
75 months ago


Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.

Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
Score: 4 Votes
75 months ago

Quote: "The Kinect required expensive 3D scanning hardware, which Microsoft ultimately couldn't afford and discontinued. (Kinect games even attracted an additional royalty, which I recall was rumored at $10.) This is all done with computer vision."

Quote: "That's funny, I didn't think the Xbox was a mobile phone??"

How is that product and technology doing now? Are none of you aware the TrueDepth camera is made with PrimeSense technology, the same tech in the Kinect? It works the same way, just miniaturized.

https://www.theverge.com/circuitbreaker/2017/9/17/16315510/iphone-x-notch-kinect-apple-primesense-microsoft

And MS continues to use that technology in HoloLens.

Quote: "Which is a shame, I still have mine around."

Same here.
Score: 3 Votes