perception
Here are 282 public repositories matching this topic...
To reproduce, run pylot with --flagfile=configs/tracking.conf and tracker_type=deep_sort set in the config file.
There are a few key issues with the RosJointStateClient:
https://github.com/personalrobotics/aikido/blob/master/src/control/ros/RosJointStateClient.cpp
(1) There's no reason to lock the mSkeleton mutex in spin, since we never read from or write to the skeleton there.
(2) There's no reason to store the mSkeleton pointer at all, since it is never used.
(3) We should decide on a better interface. Eith

I previously figured out a way to get the (x, y, z) data points for each frame from one hand, but I'm not sure how to do that for the new Holistic model they released. I am trying to get all the landmark data points for both hands as well as parts of the chest and face. Does anyone know how to extract the holistic landmark data and print it to a text file, or at least give me some directions as to how to approach this?