Thumbs Up Gesture Recognition with QuickPose iOS SDK

Are you looking to integrate thumbs up gesture recognition into your app? The QuickPose iOS SDK offers a powerful way to capture and interpret the ‘thumbs up’ hand gesture, making it ideal for a variety of applications. This feature can be used to confirm user readiness, acknowledge actions, or even as a hands-free control mechanism in interactive experiences.

Whether you’re developing a fitness app, an educational tool, or an interactive game, integrating thumbs up gesture recognition can enhance user engagement and streamline interactions. In this guide, you’ll learn how to implement this feature using QuickPose, ensuring accurate detection, real-time feedback, and a seamless user experience.

GIF shows Thumbs Up gesture recognition using pose estimation iOS SDK by QuickPose

Steps to Integrate Thumbs Up Gesture Recognition into Your App:

Register an SDK Key with QuickPose

Get your free SDK key at https://dev.quickpose.ai (usage limits may apply). SDK keys are linked to your bundle ID, so check your key before distributing to the App Store.

This is a quick look at integrating Thumbs Up / Down gesture recognition using the QuickPose iOS SDK. You can see the full documentation here: QuickPose iOS SDK Thumbs Up Gesture installation.

Activate Thumbs Up Feature

We suggest auto-detecting which hand is raised, as this reduces confusion for users. The slight caveat with this approach is that if both hands are present, QuickPose defaults to the right hand.

				
case thumbsUp() // auto detects raised hand, defaulting to right if both are raised
case thumbsUp(side: .left) // left hand only
case thumbsUp(side: .right) // right hand only
case thumbsUp(style: customStyled) // with custom style
				
			

Thumbs Up or Down

If you wish to explicitly capture a thumbs down state as well, .thumbsUpOrDown() can be substituted in the code below.

GIF shows thumbs down gesture recognition using QuickPose iOS SDK

As above, we suggest using auto hand detection.

				
case thumbsUpOrDown() // auto detects raised hand, defaulting to right if both are raised
case thumbsUpOrDown(side: .left) // left hand only
case thumbsUpOrDown(side: .right) // right hand only
case thumbsUpOrDown(style: customStyled) // with custom style
				
			

Conditional Styling

To give the user feedback, consider using conditional styling, so that when the user's measurement goes above a threshold (here 0.8) a green highlight is shown.

				
let greenHighlightStyle = QuickPose.Style(conditionalColors: [QuickPose.Style.ConditionalColor(min: 0.8, max: nil, color: UIColor.green)])
quickPose.start(features: [.thumbsUp(style: greenHighlightStyle)], 
                onFrame: { status, image, features, feedback, landmarks in  ...
})
				
			

Improving the Captured Results

The basic implementation above would likely capture an incorrect value: in the real world users need time to understand what they are doing, they change their mind, or QuickPose can simply read an incorrect value due to poor lighting or the user's stance. These issues are partially mitigated by on-screen feedback, but it's best to use a QuickPoseDoubleUnchangedDetector to keep reading values until the user has settled on a final answer.
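Conceptually, the detector debounces a stream of readings: it only fires once the value has stayed within a leniency band for a set duration. The following is a hypothetical pure-Swift sketch of that pattern, for illustration only; the SDK's QuickPoseDoubleUnchangedDetector encapsulates similar logic for you.

```swift
import Foundation

// Hypothetical sketch of a "value unchanged" debouncer, for illustration only.
final class UnchangedDetector {
    private let similarDuration: TimeInterval
    private let leniency: Double
    private var lastValue: Double?
    private var stableSince: Date?

    init(similarDuration: TimeInterval, leniency: Double = 0.1) {
        self.similarDuration = similarDuration
        self.leniency = leniency
    }

    /// Call with each new reading; `onStable` fires once the value has
    /// stayed within `leniency` of the last reading for `similarDuration`.
    func count(result: Double, now: Date = Date(), onStable: () -> Void) {
        if let last = lastValue, abs(result - last) <= leniency {
            if let since = stableSince, now.timeIntervalSince(since) >= similarDuration {
                onStable()
                stableSince = nil   // avoid re-firing until the value changes
                lastValue = nil
            }
        } else {
            lastValue = result      // value changed: restart the stability window
            stableSince = now
        }
    }
}
```

Feeding the same reading for less than the duration does nothing; once the readings have held steady long enough, the callback fires exactly once.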

To steady the .thumbsUp() results, declare a configurable unchanged detector, which can be used to make many of our input features read more reliably.

				
@State private var unchanged = QuickPoseDoubleUnchangedDetector(similarDuration: 2)
				
			

This will only trigger the callback block when the result has stayed the same for 2 seconds. The above uses the default leniency, but this can be modified in the constructor.

				
@State private var unchanged = QuickPoseDoubleUnchangedDetector(similarDuration: 2, leniency: 0.2) // changed to 20% leniency
				
			

The unchanged detector is added to your onFrame callback, and is updated every time a result is found, triggering its onChange callback only when the result has not changed for the specified duration.

				
quickPose.start(features: [.thumbsUp()], onFrame: { status, image, features, feedback, landmarks in                
    switch status {
        case .success:
            overlayImage = image
            if let result = features.values.first  {
                feedbackText = result.stringValue
                unchanged.count(result: result.value) {
                    print("Final Result \(result.value)") 
                    let isThumbsUp = result.value > 0.5 
                    if isThumbsUp {
                        // your code to save result                    
                    } else {
                        // repeat the task again
                    }
                }
            } else {
                feedbackText = nil // blank if no hand detected
            }
        case .noPersonFound:
            feedbackText = "Stand in view"
        case .sdkValidationError:
            feedbackText = "Be back soon"
    }
})
				
			

Improving Guidance

Despite the improvements above, the user doesn't have clear instructions to know what to do. This can be fixed by adding user guidance.

Our recommended pattern is to use an enum to capture all the states in your application.

				
enum ViewState: Equatable {
    case intro
    case measuring(score: Bool)
    case completed(score: Bool)
    case error(_ prompt: String)
    
    var prompt: String? {
        switch self {
        case .intro:
            return "Are you ready to continue?"
        case .measuring(let score):
            return score ? "Yes?" : "No?"
        case .completed(let score):
            return score ? "Yes\nLet's Continue" : "No\nLet's Wait"
        case .error(let prompt):
            return prompt
        }
    }
    var features: [QuickPose.Feature] {
        switch self {
        case .intro, .measuring:
            return [.thumbsUp()]
        case .completed, .error:
            return []
        }
    }
}
				
			

Alongside the states we also provide a prompt text, which instructs the user at each step. Similarly, the features property specifies which features to pass to QuickPose; note that for the completed state QuickPose doesn't process any features.

Declare this so your SwiftUI views can access it, starting in the .intro state. Our example is simplified to just demonstrate the pattern; you would typically start with more positioning guidance.

				
@State private var state: ViewState = .intro
				
			

Next, make some modifications so that your feedbackText is pulled from the state prompt by default.

				
.overlay(alignment: .center) {
    if let feedbackText = state.prompt {
        Text(feedbackText)
            .font(.system(size: 26, weight: .semibold)).foregroundColor(.white).multilineTextAlignment(.center)
            .padding(16)
            .background(RoundedRectangle(cornerRadius: 8).foregroundColor(Color("AccentColor").opacity(0.8)))
            .padding(.bottom, 40)
    }
}
				
			

This now means you can remove the feedbackText declaration:

				
//@State private var feedbackText: String? = nil // remove the feedbackText
				
			

There are two changes we need to make. First, we need to update QuickPose with the features for each state:

				
.onChange(of: state) { _ in
    quickPose.update(features: state.features)
}
				
			

Then we should start QuickPose with the state's features as well.

				
.onAppear {
    quickPose.start(features: state.features, onFrame: { status, image, features, feedback, landmarks in
    ...
				
			

And in the onFrame callback, update the state instead of the feedbackText. This allows the camera input to change the view state in a controlled manner, so that, for example, the .intro state can only be re-entered when the user's hand goes missing during the .measuring state, or from the .error state.

				
quickPose.start(features: state.features, onFrame: { status, image, features, feedback, landmarks in
    switch status {
        case .success:
            overlayImage = image
            if let result = features.values.first {
                state = .measuring(score: result.value > 0.5)
                unchanged.count(result: result.value) {
                    state = .completed(score: result.value > 0.5)
                    let isThumbsUp = result.value > 0.5 
                    if isThumbsUp {
                        // your code to save result                    
                    } else {
                        // repeat the task again
                    }
                }
            } else if case .measuring = state {
                state = .intro
            } else if case .error = state {
                state = .intro
            }
        case .noPersonFound:
            state = .error("Stand in view")
        case .sdkValidationError:
            state = .error("Be back soon")
    }
})
				
			

By following this guide, you can integrate a thumbs up or down feature into your app, allowing your users to interact with your app from a distance. 

Need help building an AI project?

At QuickPose, our mission is to build smart Pose Estimation Solutions that elevate your product. Schedule a free consultation with us to discuss your project.