perception
Here are 270 public repositories matching this topic...
To reproduce, run pylot with --flagfile=configs/tracking.conf and tracker_type=deep_sort set in the config file.
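A minimal sketch of those reproduction steps, assuming pylot's usual absl-flags entry point (`pylot.py`); the `--flagfile` and `tracker_type` flags come from the report above, while the entry-point name and file layout are assumptions:

```shell
# configs/tracking.conf -- make sure this line is present so DeepSORT is used:
#   --tracker_type=deep_sort

# Then launch pylot with that flag file (entry-point name assumed):
python3 pylot.py --flagfile=configs/tracking.conf
```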
There are a few key issues with the RosJointStateClient:
https://github.com/personalrobotics/aikido/blob/master/src/control/ros/RosJointStateClient.cpp
(1) There's no reason to lock the mSkeleton mutex in spin, since we never read from or write to the skeleton there.
(2) There's no reason to store the mSkeleton pointer at all, since it is never used.
(3) We should decide on a better interface. Eith
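Points (1) and (2) can be illustrated with a hypothetical stripped-down client that owns only a buffer mutex and holds no skeleton pointer; the class and member names below are illustrative, not aikido's actual API:

```cpp
#include <map>
#include <mutex>
#include <string>

// Sketch of the suggested shape: spin() touches only the client's own
// message buffer, so only the buffer mutex is locked, and no skeleton
// pointer is stored because the skeleton is never used here.
class JointStateClient {  // illustrative stand-in for RosJointStateClient
public:
  // Called from the ROS callback thread with a new joint-state sample.
  void onJointState(const std::string& joint, double position) {
    std::lock_guard<std::mutex> lock(mBufferMutex);  // buffer mutex only
    mBuffer[joint] = position;
  }

  // spin(): drain buffered samples into the latest-position map.
  // No skeleton mutex is taken -- the skeleton is untouched.
  void spin() {
    std::lock_guard<std::mutex> lock(mBufferMutex);
    for (const auto& kv : mBuffer)
      mLatest[kv.first] = kv.second;
    mBuffer.clear();
  }

  double latestPosition(const std::string& joint) {
    std::lock_guard<std::mutex> lock(mBufferMutex);
    return mLatest.at(joint);
  }

private:
  std::mutex mBufferMutex;                // protects mBuffer and mLatest only
  std::map<std::string, double> mBuffer;  // samples since the last spin()
  std::map<std::string, double> mLatest;  // last position seen per joint
};
```

The design choice being argued for: the client's synchronization scope matches the data it actually owns, which keeps lock ordering between the client and the skeleton out of the picture entirely.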

I previously figured out a way to get the (x, y, z) data points for each frame from one hand, but I'm not sure how to do that for the new holistic model they released. I am trying to get all the landmark data points for both hands as well as parts of the chest and face. Does anyone know how to extract the holistic landmark data and print it to a text file, or at least give me some directions as to h