Posted by Valentin Bazarevsky and Fan Zhang, Research Engineers, Google Research

The ability to perceive the shape and motion of hands can be a vital component in improving the user experience across a variety of technological domains and platforms. For example, it can form the basis for sign language understanding and hand gesture control, and can also enable the overlay of digital content and information on top of the physical world in augmented reality. While coming naturally to people, robust real-time hand perception is a decidedly challenging computer vision task, as hands often occlude themselves or each other (e.g. finger/palm occlusions and hand shakes) and lack high contrast patterns.

Today we are announcing the release of a new approach to hand perception, which we previewed at CVPR 2019 in June, implemented in MediaPipe, an open source cross platform framework for building pipelines to process perceptual data of different modalities, such as video and audio. This approach provides high-fidelity hand and finger tracking by employing machine learning (ML) to infer 21 3D keypoints of a hand from just a single frame. Whereas current state-of-the-art approaches rely primarily on powerful desktop environments for inference, our method achieves real-time performance on a mobile phone, and even scales to multiple hands. We hope that providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues.

[Figure: 3D hand perception in real-time on a mobile phone via MediaPipe]
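To make "21 3D keypoints of a hand" concrete, the sketch below (an illustration added here, not code from the announcement) enumerates the keypoint names in the order used by MediaPipe's published hand landmark topology: one wrist point plus four joints per finger. The helper function name is ours; only the landmark naming and indexing follow MediaPipe's documented model.

```python
# Illustrative sketch: the 21 keypoints are the wrist plus four joints per
# finger (CMC/MCP/IP/tip for the thumb; MCP/PIP/DIP/tip for the others).
FINGERS = ["THUMB", "INDEX_FINGER", "MIDDLE_FINGER", "RING_FINGER", "PINKY"]
THUMB_JOINTS = ["CMC", "MCP", "IP", "TIP"]
OTHER_JOINTS = ["MCP", "PIP", "DIP", "TIP"]

def hand_landmark_names():
    """Return the 21 keypoint names in index order (index 0 = wrist)."""
    names = ["WRIST"]
    for finger in FINGERS:
        joints = THUMB_JOINTS if finger == "THUMB" else OTHER_JOINTS
        names.extend(f"{finger}_{joint}" for joint in joints)
    return names

landmarks = hand_landmark_names()
print(len(landmarks))               # 21 keypoints per hand
print(landmarks[0], landmarks[8])   # WRIST INDEX_FINGER_TIP
```

At inference time, each of these keypoints carries an (x, y, z) coordinate, which is what lets downstream applications such as gesture recognition reason about finger positions from a single frame.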