java.lang.Object
  com.primesense.nite.UserTracker
public class UserTracker
This is the main object of the User Tracker algorithm. It provides access to one half of the algorithms provided by NiTE: scene segmentation, skeleton tracking, floor plane detection, and pose detection are all provided by this class.
The first purpose of the User Tracker algorithm is to find all of the active users in a specific scene. It individually tracks each human it finds, and provides the means to separate their outlines from each other and from the background. Once the scene has been segmented, the User Tracker is also used to initiate the Skeleton Tracking and Pose Detection algorithms.
Each user is assigned an ID as they are detected. The user ID remains constant as long as the user remains in the frame. If a user leaves the field of view of the camera, or tracking of that user is otherwise lost, the user may have a different ID when they are detected again. There is currently no mechanism that provides persistent recognition of individuals when they are not being actively tracked. If this functionality is desired, it will need to be implemented at the application level.
A listener class is provided to allow event-based interaction with this algorithm.
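A minimal polling sketch of this flow (create a tracker, read frames, inspect the users in each snapshot). Only UserTracker.create(), readFrame(), and destroy() are documented on this page; NiTE.initialize()/shutdown() and the UserTrackerFrameRef/UserData accessors (getUsers(), isNew(), isLost(), getId(), release()) are assumed from the NiTE 2 Java samples and may differ in your release.

```java
import java.util.List;

import com.primesense.nite.NiTE;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

public class UserTrackerPollingSketch {
    public static void main(String[] args) {
        NiTE.initialize();                               // NiTE must be initialized first
        UserTracker tracker = UserTracker.create();      // attaches to an available depth device

        for (int i = 0; i < 300; i++) {                  // poll a few hundred frames, then exit
            UserTrackerFrameRef frame = tracker.readFrame();
            List<UserData> users = frame.getUsers();     // users segmented in this snapshot
            for (UserData user : users) {
                if (user.isNew()) {
                    System.out.println("New user, id=" + user.getId());
                } else if (user.isLost()) {
                    System.out.println("Lost user, id=" + user.getId());
                }
            }
            frame.release();                             // return the frame to NiTE
        }

        tracker.destroy();
        NiTE.shutdown();
    }
}
```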
See Also:
UserMap, UserData, Skeleton, NiTE, HandTracker
Nested Class Summary

| Modifier and Type | Interface | Description |
|---|---|---|
| static interface | UserTracker.NewFrameListener | This is a listener interface that is used to react to events generated by the UserTracker class. |
Method Summary

| Modifier and Type | Method | Description |
|---|---|---|
| void | addNewFrameListener(UserTracker.NewFrameListener listener) | Adds a NewFrameListener object to this UserTracker so that it will respond when a new frame is generated. |
| Point2D<java.lang.Float> | convertDepthCoordinatesToJoint(Point3D<java.lang.Integer> point) | Converts a point from the depth map's native "projective" coordinates to the "Real World" coordinates used by skeleton joints. |
| Point2D<java.lang.Float> | convertJointCoordinatesToDepth(Point3D<java.lang.Float> point) | Converts a point from the "Real World" coordinates used by skeleton joints to the depth map's native "projective" coordinates. |
| static UserTracker | create() | Creates and initializes an empty User Tracker. |
| static UserTracker | create(org.openni.Device device) | Creates and initializes an empty User Tracker on a specific OpenNI device. |
| void | destroy() | Shuts down the user tracker and releases all resources used by it. This is the opposite of create(). |
| long | getHandle() | |
| float | getSkeletonSmoothingFactor() | Queries the current skeleton smoothing factor. |
| UserTrackerFrameRef | readFrame() | Gets the next snapshot of the algorithm. |
| void | removeNewFrameListener(UserTracker.NewFrameListener listener) | Removes a NewFrameListener object from this UserTracker's list of listeners. |
| void | setSkeletonSmoothingFactor(float factor) | Controls the smoothing factor of the skeleton joints. |
| void | startPoseDetection(short userId, PoseType type) | Commands the UserTracker to start detecting specific poses for a specific user. |
| void | startSkeletonTracking(short userId) | Requests that the Skeleton algorithm starts tracking a specific user. |
| void | stopPoseDetection(short userId, PoseType type) | Commands the pose detection algorithm to stop detecting a specific pose for a specific user. |
| void | stopSkeletonTracking(short userId) | Stops skeleton tracking for a specific user. |
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Method Detail
public static UserTracker create(org.openni.Device device)
Creates and initializes an empty User Tracker. This function should be the first one called when a new UserTracker object is constructed.
An OpenNI device with depth capabilities is required for this algorithm to work. See the OpenNI 2.0 documentation for more information about using an OpenNI 2.0 compliant hardware device and creating a Device object.
Parameters:
device - Initialized OpenNI 2.0 Device object that provides depth streams.
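A hedged sketch of creating a UserTracker bound to a specific device. The org.openni calls (OpenNI.initialize(), OpenNI.enumerateDevices(), Device.open(...)) come from the OpenNI 2 Java bindings rather than this page, so verify them against your OpenNI release.

```java
import org.openni.Device;
import org.openni.OpenNI;

import com.primesense.nite.NiTE;
import com.primesense.nite.UserTracker;

public class CreateWithDeviceSketch {
    public static void main(String[] args) {
        OpenNI.initialize();                     // OpenNI must be up before opening devices
        NiTE.initialize();                       // NiTE must be up before creating trackers

        // Open the first device OpenNI reports; it must provide a depth stream.
        Device device = Device.open(OpenNI.enumerateDevices().get(0).getUri());

        // Bind the user tracker to that specific device.
        UserTracker tracker = UserTracker.create(device);

        // ... read frames, track skeletons, etc. ...

        tracker.destroy();                       // release NiTE resources
        device.close();
        NiTE.shutdown();
        OpenNI.shutdown();
    }
}
```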
public static UserTracker create()
Creates and initializes an empty User Tracker. This function should be the first one called when a new UserTracker object is constructed.
An OpenNI device with depth capabilities is required for this algorithm to work. See the OpenNI 2.0 documentation for more information about using an OpenNI 2.0 compliant hardware device and creating a Device object.
public void destroy()
Shuts down the user tracker and releases all resources used by it. This is the opposite of create().

public UserTrackerFrameRef readFrame()
Gets the next snapshot of the algorithm.

public void setSkeletonSmoothingFactor(float factor)
Controls the smoothing factor of the skeleton joints.
Parameters:
factor - The smoothing factor.

public float getSkeletonSmoothingFactor()
Queries the current skeleton smoothing factor.
See Also:
setSkeletonSmoothingFactor(float factor)
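A small illustrative sketch of the smoothing-factor accessors. The value 0.5f is arbitrary; the valid range and default are not specified on this page.

```java
import com.primesense.nite.NiTE;
import com.primesense.nite.UserTracker;

public class SmoothingFactorSketch {
    public static void main(String[] args) {
        NiTE.initialize();
        UserTracker tracker = UserTracker.create();

        // Raise the smoothing factor to reduce joint jitter; 0.5f is only an illustration,
        // since this page does not document the scale or the default value.
        tracker.setSkeletonSmoothingFactor(0.5f);
        System.out.println("Smoothing factor: " + tracker.getSkeletonSmoothingFactor());

        tracker.destroy();
        NiTE.shutdown();
    }
}
```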
public void startSkeletonTracking(short userId)
Requests that the Skeleton algorithm starts tracking a specific user. Once started, the skeleton will provide information on the joint position and orientation for that user during each new frame of the UserTracker.
Note that the computational requirements of calculating a skeleton increase linearly with the number of users tracked. Tracking too many users may result in poor performance and high CPU utilization, and the algorithm tends to perform poorly once it can no longer calculate skeletons at the full frame rate of the depth data used to generate them.
Parameters:
userId - User for which to calculate a skeleton.

public void stopSkeletonTracking(short userId)
Stops skeleton tracking for a specific user.
Parameters:
userId - User to stop tracking.
See Also:
Skeleton
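An illustrative sketch that starts skeleton tracking for each newly detected user and prints a joint position once tracking succeeds. Only the UserTracker calls are documented on this page; the Skeleton, SkeletonJoint, SkeletonState, JointType, and UserData accessors are assumed from the NiTE 2 Java samples and may differ in your release.

```java
import com.primesense.nite.JointType;
import com.primesense.nite.NiTE;
import com.primesense.nite.Point3D;
import com.primesense.nite.Skeleton;
import com.primesense.nite.SkeletonJoint;
import com.primesense.nite.SkeletonState;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

public class SkeletonTrackingSketch {
    public static void main(String[] args) {
        NiTE.initialize();
        UserTracker tracker = UserTracker.create();

        for (int i = 0; i < 300; i++) {
            UserTrackerFrameRef frame = tracker.readFrame();
            for (UserData user : frame.getUsers()) {
                if (user.isNew()) {
                    // Request a skeleton for every newly detected user.
                    tracker.startSkeletonTracking(user.getId());
                    continue;
                }
                Skeleton skeleton = user.getSkeleton();
                if (skeleton.getState() == SkeletonState.TRACKED) {
                    // Once tracked, joint data is refreshed on every frame.
                    SkeletonJoint head = skeleton.getJoint(JointType.HEAD);
                    Point3D<Float> pos = head.getPosition();
                    System.out.printf("User %d head at (%.0f, %.0f, %.0f)%n",
                            user.getId(), pos.getX(), pos.getY(), pos.getZ());
                }
            }
            frame.release();
        }

        tracker.destroy();
        NiTE.shutdown();
    }
}
```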
public void startPoseDetection(short userId, PoseType type)
This function commands the UserTracker to start detecting specific poses for a specific user.
Parameters:
userId - User that you would like to detect a pose for.
type - The type of pose you would like to detect.
See Also:
PoseData, PoseType
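A hedged sketch of per-user pose detection. PoseType.PSI is assumed to be one of the available pose constants; consult PoseType for the constants in your NiTE build, and PoseData for reading detection results.

```java
import com.primesense.nite.NiTE;
import com.primesense.nite.PoseType;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

public class PoseDetectionSketch {
    public static void main(String[] args) {
        NiTE.initialize();
        UserTracker tracker = UserTracker.create();

        for (int i = 0; i < 300; i++) {
            UserTrackerFrameRef frame = tracker.readFrame();
            for (UserData user : frame.getUsers()) {
                if (user.isNew()) {
                    // Start watching this user for a pose. PoseType.PSI is assumed here;
                    // substitute whichever constant your PoseType enum provides.
                    tracker.startPoseDetection(user.getId(), PoseType.PSI);
                } else if (user.isLost()) {
                    // Stop the per-user pose detector once tracking of the user is lost.
                    tracker.stopPoseDetection(user.getId(), PoseType.PSI);
                }
            }
            // Detection results are reported per user through PoseData; see the PoseData
            // documentation for the query methods available in your release.
            frame.release();
        }

        tracker.destroy();
        NiTE.shutdown();
    }
}
```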
public void stopPoseDetection(short userId, PoseType type)
This function commands the pose detection algorithm to stop detecting a specific pose for a specific user.
Parameters:
userId - User to stop detecting a specific pose for.
type - The PoseType of the pose to stop detecting.

public void addNewFrameListener(UserTracker.NewFrameListener listener)
Adds a NewFrameListener object to this UserTracker so that it will respond when a new frame is generated.
Parameters:
listener - A listener to add.

public void removeNewFrameListener(UserTracker.NewFrameListener listener)
Removes a NewFrameListener object from this UserTracker's list of listeners.
Parameters:
listener - A listener to remove.
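A hedged event-driven sketch using the listener interface instead of a blocking read loop. The callback name onNewFrame(UserTracker) is assumed from the NiTE 2 Java samples rather than this page.

```java
import com.primesense.nite.NiTE;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

public class FrameListenerSketch implements UserTracker.NewFrameListener {
    public static void main(String[] args) throws InterruptedException {
        NiTE.initialize();
        UserTracker tracker = UserTracker.create();

        FrameListenerSketch listener = new FrameListenerSketch();
        tracker.addNewFrameListener(listener);      // called back whenever a frame is ready

        Thread.sleep(10000);                        // let callbacks run for ten seconds

        tracker.removeNewFrameListener(listener);   // stop receiving callbacks
        tracker.destroy();
        NiTE.shutdown();
    }

    @Override
    public void onNewFrame(UserTracker tracker) {
        // readFrame() returns the snapshot that triggered this callback.
        UserTrackerFrameRef frame = tracker.readFrame();
        System.out.println("Frame with " + frame.getUsers().size() + " visible user(s)");
        frame.release();
    }
}
```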
public Point2D<java.lang.Float> convertJointCoordinatesToDepth(Point3D<java.lang.Float> point)
Converts a point from the "Real World" coordinate system used by skeleton joints to the native "projective" coordinate system of the depth map.
Parameters:
point - The input point in the "Real World" coordinate system.

public Point2D<java.lang.Float> convertDepthCoordinatesToJoint(Point3D<java.lang.Integer> point)
In general, two coordinate systems are used in OpenNI 2.0. These conventions are also followed in NiTE 2.0.
Skeleton joint positions are provided in "Real World" coordinates, while the native coordinate system of depth maps is the "projective" system. In short, "Real World" coordinates locate objects using a Cartesian coordinate system with the origin at the sensor. "Projective" coordinates measure straight-line distance from the sensor, and indicate x/y coordinates using pixels in the image (which is mathematically equivalent to specifying angles). See the OpenNI 2.0 documentation online for more information.
This function allows you to convert the native depth map coordinates to the system used by the joints. This might be useful for performing certain types of measurements (e.g., the distance between a joint and an object identified only in the depth map).
Note that no output is given for the Z coordinate; Z coordinates remain the same when performing the conversion. An input value is still required for Z, since it can affect the x/y output.
Parameters:
point - The input point in the "projective" coordinate system.
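An illustrative sketch of converting a tracked joint from "Real World" coordinates to depth-map pixels, assuming skeleton tracking has already been started for the user (see startSkeletonTracking). The Skeleton/SkeletonJoint accessors are assumed from the NiTE 2 Java samples; only the UserTracker calls are documented on this page.

```java
import com.primesense.nite.JointType;
import com.primesense.nite.NiTE;
import com.primesense.nite.Point2D;
import com.primesense.nite.Point3D;
import com.primesense.nite.SkeletonJoint;
import com.primesense.nite.SkeletonState;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

public class JointToDepthSketch {
    public static void main(String[] args) {
        NiTE.initialize();
        UserTracker tracker = UserTracker.create();

        UserTrackerFrameRef frame = tracker.readFrame();
        for (UserData user : frame.getUsers()) {
            if (user.getSkeleton().getState() != SkeletonState.TRACKED) {
                continue;                                 // joints are only valid once tracked
            }
            SkeletonJoint head = user.getSkeleton().getJoint(JointType.HEAD);
            Point3D<Float> world = head.getPosition();    // "Real World" joint coordinates

            // Project the joint back onto the depth map to find its pixel location;
            // the Z value is carried over unchanged from the input point.
            Point2D<Float> pixel = tracker.convertJointCoordinatesToDepth(world);
            System.out.printf("Head pixel: (%.0f, %.0f), depth %.0f%n",
                    pixel.getX(), pixel.getY(), world.getZ());
        }
        frame.release();

        tracker.destroy();
        NiTE.shutdown();
    }
}
```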
public long getHandle()