Hand and Body Pose Detection in iOS 14 Will Provide New Ways to Interact With Your iPhone Without Touching the Display - MacRumors

Starting in iOS 14 and macOS Big Sur, developers will be able to add the capability to detect human body and hand poses in photos and videos to their apps using Apple's updated Vision framework, as explained in this WWDC 2020 session.
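The new capability follows the Vision framework's standard request/handler pattern. A minimal sketch of body pose detection on a still image, based on the API shown in the session (the 0.3 confidence cutoff here is an arbitrary choice, not an Apple recommendation):

```swift
import Vision

// Sketch: detect a human body pose in a still image (iOS 14 / macOS Big Sur).
func detectBodyPose(in image: CGImage) throws {
    let request = VNDetectHumanBodyPoseRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let observation = request.results?.first as? VNHumanBodyPoseObservation else {
        return // no person detected
    }

    // Each recognized joint has a normalized location (origin bottom-left)
    // and a confidence value; skip low-confidence joints.
    let joints = try observation.recognizedPoints(.all)
    for (name, point) in joints where point.confidence > 0.3 {
        print("\(name): \(point.location)")
    }
}
```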

[Image: The Vision framework detecting a person's body pose during a jumping jack]
This functionality will allow apps to analyze the poses, movements, and gestures of people, enabling a wide variety of potential features. Apple provides some examples, including a fitness app that could automatically track the exercise a user performs, a safety-training app that could help employees use correct ergonomics, and a media-editing app that could find photos or videos based on pose similarity.

Hand pose detection in particular promises to deliver a new form of interaction with apps. Apple's demonstration showed a person holding their thumb and index finger together and then being able to draw in an iPhone app without touching the display.
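The pinch gesture from Apple's demo can be recognized by comparing the thumb-tip and index-tip joints reported by a hand pose observation. A rough sketch; the distance threshold and confidence cutoff are assumptions for illustration:

```swift
import Vision
import CoreGraphics

// Sketch: treat thumb and index fingertips as "pinched" when they are
// close together in normalized image coordinates.
func isPinching(_ observation: VNHumanHandPoseObservation,
                threshold: CGFloat = 0.05) throws -> Bool {
    let thumb = try observation.recognizedPoint(.thumbTip)
    let index = try observation.recognizedPoint(.indexTip)

    // Ignore low-confidence detections (e.g. partially occluded fingers).
    guard thumb.confidence > 0.3, index.confidence > 0.3 else { return false }

    // Locations are normalized (0–1) with the origin at the bottom-left.
    return hypot(thumb.location.x - index.location.x,
                 thumb.location.y - index.location.y) < threshold
}
```

An app could toggle drawing on while `isPinching` returns true, which is essentially the interaction Apple demonstrated.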

[Image: Hand pose detection with the Vision framework]
Additionally, apps could use the framework to overlay emoji or graphics on a user's hands, mirroring the specific gesture being made, such as a peace sign.

[Image: An emoji overlaid on a matching hand gesture]
Another example is a camera app that automatically triggers photo capture when it detects the user making a specific hand gesture in the air.

The framework is capable of detecting multiple hands or bodies in one scene, but the algorithms might not work as well with people who are wearing gloves, bent over, upside down, or wearing flowing, robe-like clothing. The algorithms can also have difficulty when a person is close to the edge of the frame or partially obstructed.
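For multi-hand scenes, the hand pose request exposes a `maximumHandCount` property, and filtering joints by confidence is one way to cope with the edge-of-frame and occlusion cases described above. A brief sketch:

```swift
import Vision

// Sketch: configure multi-hand detection and keep only confident joints.
let handRequest = VNDetectHumanHandPoseRequest()
handRequest.maximumHandCount = 4 // detect up to four hands per frame

func confidentJoints(in observation: VNHumanHandPoseObservation)
        throws -> [VNHumanHandPoseObservation.JointName: VNRecognizedPoint] {
    // Drop joints the framework is unsure about (occluded, near the frame edge).
    try observation.recognizedPoints(.all)
        .filter { $0.value.confidence > 0.3 }
}
```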

Similar functionality is already available through ARKit, but it is limited to augmented reality sessions and only works with the rear-facing camera on compatible iPhone and iPad models. With the updated Vision framework, developers have many more possibilities.


Top Rated Comments

77 months ago
When these phones can snap a photo right at the optimum height of a group jump, we will truly be in the future.
Score: 13 Votes
77 months ago
That’s great. I have a hand gesture I’ve been giving to Siri for years. Now maybe she’ll get the message.
Score: 12 Votes
luvbug
77 months ago

> Seems a lot like what Xbox was able to do with the Kinect 10 years ago

That's funny, I didn't think the Xbox was a mobile phone??
Score: 6 Votes
Appleman3546
77 months ago
Seems a lot like what Xbox was able to do with the Kinect 10 years ago
Score: 5 Votes
AngerDanger
77 months ago

Honestly, this seems like the kinda stuff that'd make Apple AR compelling—being able to draw in midair means you’d also be able to navigate an interface in midair with just your hands.

Using AR/VR without bringing a controller everywhere seems analogous to what set the iPhone apart from other touchscreen phones in 2007; you didn’t need a stylus.
Score: 4 Votes
77 months ago

The Kinect required expensive 3D scanning hardware, which ultimately Microsoft couldn't afford and discontinued. (Kinect games even attracted an additional royalty, which I recall was rumored at $10.) This is all done with computer vision.

> That's funny, I didn't think the Xbox was a mobile phone??

How is that product and technology doing now? Are none of you aware the TrueDepth camera is made with PrimeSense technology, the same tech in the Kinect? It works the same way, just miniaturized.

https://www.theverge.com/circuitbreaker/2017/9/17/16315510/iphone-x-notch-kinect-apple-primesense-microsoft

And Microsoft continues to use that technology in HoloLens.

> Which is a shame, I still have mine around.

Same here.
Score: 3 Votes