
Can palm_detection distinguish between right and left hand? #127

Closed
metalwhale opened this issue Sep 24, 2019 · 10 comments
Assignees
Labels
legacy:hands Hand tracking/gestures/etc

Comments

@metalwhale

metalwhale commented Sep 24, 2019

I am wondering if it is possible to use palm_detection (or hand_landmark...) to detect which hand is raised, right or left?
I don't need this feature right now, just curious.

@lisbravo

Quick and dirty:
bool rightHand = (landmarks[5].x() > landmarks[17].x());

@metalwhale
Author

metalwhale commented Sep 24, 2019

Thanks for your help, but I don't think that's correct. What if the user shows the back of their hand to the camera?

@lisbravo

1. You're welcome
2. Still working on it

@fanzhanggoogle fanzhanggoogle self-assigned this Sep 27, 2019
@fanzhanggoogle fanzhanggoogle added the legacy:hands Hand tracking/gestures/etc label Sep 27, 2019
@fanzhanggoogle

fanzhanggoogle commented Sep 27, 2019

I am wondering if it is possible to use palm_detection (or hand_landmark...) to detect which hand is raised, right or left?
I don't need this feature right now, just curious.

This is a very good question! You are absolutely correct: there is basically no notion of a right/left hand without knowing which camera you are using. For example, a right hand in the front-facing camera looks exactly the same as a left hand in the back-facing camera (because the image is flipped). We are adding a feature to the model that predicts whether it sees the back or the front of a hand; you can then tell left from right based on which camera you are using.
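This point is easy to demonstrate numerically. Below is a minimal sketch (the coordinates are made up, and `looks_like_right_hand` is a hypothetical stand-in for the landmark-comparison one-liner suggested earlier in the thread) showing that a horizontal flip reverses the answer of any purely x-coordinate-based heuristic:

```python
def looks_like_right_hand(x_index_base, x_pinky_base):
    # Compare landmark 5 (index-finger base) against landmark 17
    # (pinky base) along the x-axis, as in the one-liner above
    return x_index_base > x_pinky_base

# Hypothetical normalized x-coordinates for a hand in an unmirrored frame
x5, x17 = 0.6, 0.4
original = looks_like_right_hand(x5, x17)

# The same hand after a horizontal flip (x -> 1 - x), e.g. as seen by
# the opposite camera or in a mirrored selfie view
flipped = looks_like_right_hand(1 - x5, 1 - x17)

print(original, flipped)  # prints: True False
```

So without knowing whether the frame is mirrored, the same physical hand is classified both ways.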

@metalwhale
Author

@fanzhanggoogle thank you for your reply, so glad to hear that!!
We will keep waiting for this amazing feature!

@mgyong mgyong closed this as completed Sep 30, 2019
@jackz314

@fanzhanggoogle any updates on the feature? Would love to test it out.

@JuliaPoo

You can determine handedness using hand_landmark in 3D. Here's the approach I used in my Python implementation of the Multi-hand Tracking pipeline:

import numpy as np

def is_right_hand(kp):
    '''
    Returns True if kp is a right hand and False if a left hand.
    kp is the list of 21 3D hand_landmark keypoints, as NumPy arrays.
    '''
    # Landmark indices of each digit, base joint first
    digitgroups = [
        (17, 18, 19, 20),  # Pinky
        (13, 14, 15, 16),  # Ring
        (9, 10, 11, 12),   # Middle
        (5, 6, 7, 8),      # Index
        (2, 3, 4),         # Thumb
    ]

    # Sum of vectors from each digit's base joint to its remaining
    # joints: the overall direction the fingers point in
    palm_dir_vec = np.zeros(3, dtype=np.float64)
    for digit in digitgroups:
        for idx in digit[1:]:
            palm_dir_vec += kp[idx] - kp[digit[0]]

    # Centroid of the base joints, used as the palm centre
    palm_pos_vec = np.zeros(3, dtype=np.float64)
    for digit in digitgroups:
        palm_pos_vec += kp[digit[0]]
    palm_pos_vec /= len(digitgroups)

    top_palm_pos_vec = kp[9]  # Base of the middle finger

    # The sign of this scalar triple product encodes the palm's
    # orientation, and hence the handedness
    val = np.dot(np.cross(kp[2] - palm_pos_vec, palm_dir_vec),
                 top_palm_pos_vec - palm_pos_vec)

    return val < 0

Here kp is a list of 3D coordinates (as NumPy arrays) representing the keypoints.
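As a rough sanity check of the geometry, the sketch below runs the function on an entirely synthetic hand (a flat palm with fingers along +y and the thumb lifted out of the palm plane; all coordinates are invented, and the function is restated compactly so the snippet is self-contained) and confirms that mirroring the keypoints flips the reported handedness:

```python
import numpy as np

def is_right_hand(kp):
    '''Returns True for a right hand, False for a left hand (see above).'''
    digitgroups = [(17, 18, 19, 20), (13, 14, 15, 16),
                   (9, 10, 11, 12), (5, 6, 7, 8), (2, 3, 4)]
    palm_dir_vec = np.zeros(3)
    for digit in digitgroups:
        for idx in digit[1:]:
            palm_dir_vec += kp[idx] - kp[digit[0]]
    palm_pos_vec = sum(kp[digit[0]] for digit in digitgroups) / len(digitgroups)
    val = np.dot(np.cross(kp[2] - palm_pos_vec, palm_dir_vec),
                 kp[9] - palm_pos_vec)
    return val < 0

# Synthetic 21-point hand: four fingers flat in the z=0 plane pointing
# along +y, middle-finger base nudged forward, thumb lifted out of the
# palm plane (all values invented for illustration)
kp = [np.zeros(3) for _ in range(21)]
kp[2] = np.array([1.0, 0.0, 0.3])   # Thumb, out of plane
kp[3] = np.array([1.2, 0.5, 0.4])
kp[4] = np.array([1.3, 1.0, 0.5])
for i, x in zip((5, 9, 13, 17), (0.5, 0.0, -0.5, -1.0)):
    kp[i] = np.array([x, 0.2 if i == 9 else 0.0, 0.0])  # Base joint
    for j in range(1, 4):
        kp[i + j] = np.array([x, float(j), 0.0])        # Rest of the digit

# Reflect across the x-axis, as a mirrored camera would
mirrored = [np.array([-p[0], p[1], p[2]]) for p in kp]

# A mirror image must come out as the opposite hand
print(is_right_hand(kp), is_right_hand(mirrored))  # prints: True False
```

Reflecting the keypoints negates the scalar triple product, which is why the two calls always disagree.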

@alan2work

Have you found a feasible way to distinguish between the left and right hand? I can't find a solution.

@aa12356jm

aa12356jm commented Jul 22, 2021

You can determine handedness using hand_landmark in 3D. Here's my approach in python which I used in my python implementation of the Multi-hand Tracking pipeline:


Does this approach work for 2D coordinates? Thanks.

@JuliaPoo

The above is more of a hack from back when MediaPipe did not support handedness. It works by determining whether the fingers are behind or in front of the palm, so it will not work with 2D coordinates. MediaPipe now supports handedness natively, so I don't recommend using my code.

arttupii pushed a commit to arttupii/mediapipe that referenced this issue Nov 18, 2023

8 participants