com.primesense.nite
Class UserTracker

java.lang.Object
  extended by com.primesense.nite.UserTracker

public class UserTracker
extends java.lang.Object

This is the main object of the User Tracker algorithm. It provides access to one half of the algorithms offered by NiTE (the other half is available through HandTracker). Scene segmentation, skeleton tracking, floor plane detection, and pose detection are all provided by this class.

The first purpose of the User Tracker algorithm is to find all of the active users in a specific scene. It individually tracks each human it finds, and provides the means to separate their outlines from each other and from the background. Once the scene has been segmented, the User Tracker is also used to initiate the Skeleton Tracking and Pose Detection algorithms.

Each user is assigned an ID as they are detected. The user ID remains constant as long as the user remains in the frame. If a user leaves the field of view of the camera, or tracking of that user is otherwise lost, the user may have a different ID when they are detected again. There is currently no mechanism that provides persistent recognition of individuals when they are not being actively tracked. If this functionality is desired, it will need to be implemented at the application level.

A listener class is provided to allow event based interaction with this algorithm.

See Also:
UserMap, UserData, Skeleton, NiTE, HandTracker

Nested Class Summary
static interface UserTracker.NewFrameListener
          This is a listener interface that is used to react to events generated by the UserTracker class.
 
Method Summary
 void addNewFrameListener(UserTracker.NewFrameListener listener)
          Adds a NewFrameListener object to this UserTracker so that it will respond when a new frame is generated.
 Point2D<java.lang.Float> convertDepthCoordinatesToJoint(Point3D<java.lang.Integer> point)
           In general, two coordinate systems are used in OpenNI 2.0.
 Point2D<java.lang.Float> convertJointCoordinatesToDepth(Point3D<java.lang.Float> point)
           
static UserTracker create()
           Creates and initializes an empty User Tracker.
static UserTracker create(org.openni.Device device)
           Creates and initializes an empty User Tracker.
 void destroy()
          Shuts down the user tracker and releases all resources used by it.
This is the opposite of create().
 long getHandle()
           
 float getSkeletonSmoothingFactor()
          Queries the current skeleton smoothing factor.
 UserTrackerFrameRef readFrame()
          Gets the next snapshot of the algorithm.
 void removeNewFrameListener(UserTracker.NewFrameListener listener)
          Removes a NewFrameListener object from this UserTracker's list of listeners.
 void setSkeletonSmoothingFactor(float factor)
          Control the smoothing factor of the skeleton joints.
 void startPoseDetection(short userId, PoseType type)
          This function commands the UserTracker to start detecting specific poses for a specific user.
 void startSkeletonTracking(short userId)
           Requests that the Skeleton algorithm starts tracking a specific user.
 void stopPoseDetection(short userId, PoseType type)
          This function commands the pose detection algorithm to stop detecting a specific pose for a specific user.
 void stopSkeletonTracking(short userId)
          Stops skeleton tracking for a specific user.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Method Detail

create

public static UserTracker create(org.openni.Device device)

Creates and initializes an empty User Tracker. This function should be the first one called when a new UserTracker object is constructed.

An OpenNI device with depth capabilities is required for this algorithm to work. See the OpenNI 2.0 documentation for more information about using an OpenNI 2.0 compliant hardware device and creating a Device object.

Parameters:
device - Initialized OpenNI 2.0 Device object that provides depth streams.
Returns:
An initialized, empty User Tracker.

create

public static UserTracker create()

Creates and initializes an empty User Tracker. This function should be the first one called when a new UserTracker object is constructed.

An OpenNI device with depth capabilities is required for this algorithm to work. See the OpenNI 2.0 documentation for more information about using an OpenNI 2.0 compliant hardware device and creating a Device object.

Returns:
An initialized, empty User Tracker.
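A minimal lifecycle sketch. This assumes the NiTE.initialize()/shutdown() and OpenNI.initialize()/shutdown() entry points of the standard NiTE 2 / OpenNI 2 Java bindings; depth-capable hardware must be attached for create() to succeed:

```java
import org.openni.OpenNI;
import com.primesense.nite.NiTE;
import com.primesense.nite.UserTracker;

public class TrackerLifecycle {
    public static void main(String[] args) {
        OpenNI.initialize(); // the OpenNI 2.0 runtime must come up first
        NiTE.initialize();   // then the NiTE middleware

        // Use the no-argument overload for the default depth device,
        // or pass an org.openni.Device to select specific hardware.
        UserTracker tracker = UserTracker.create();

        // ... read frames, track skeletons, detect poses ...

        tracker.destroy();   // release the tracker's native resources
        NiTE.shutdown();
        OpenNI.shutdown();
    }
}
```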

destroy

public void destroy()
Shuts down the user tracker and releases all resources used by it.
This is the opposite of create(). This function is called automatically by the destructor in the current implementation, but it is good practice to run it manually when the algorithm is no longer required. Running this function more than once is safe -- it simply exits if called on an invalid UserTracker.


readFrame

public UserTrackerFrameRef readFrame()
Gets the next snapshot of the algorithm. This causes all data to be generated for the next frame of the algorithm -- algorithm frames correspond to the input depth frames used to generate them.

Returns:
The next frame of data.
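A polling sketch, assuming the UserTrackerFrameRef and UserData accessors of the standard NiTE 2 Java bindings and a tracker obtained via create():

```java
import java.util.List;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

// Assumes NiTE/OpenNI are already initialized.
UserTracker tracker = UserTracker.create();
for (int i = 0; i < 30; i++) {
    UserTrackerFrameRef frame = tracker.readFrame(); // blocks until a new frame
    List<UserData> users = frame.getUsers();
    System.out.println("Frame " + i + ": " + users.size() + " user(s) visible");
    frame.release(); // free the frame when its data is no longer needed
}
tracker.destroy();
```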

setSkeletonSmoothingFactor

public void setSkeletonSmoothingFactor(float factor)
Control the smoothing factor of the skeleton joints. Factor should be between 0 (no smoothing at all) and 1 (no movement at all).

Experimenting with this factor should allow you to fine tune the skeleton performance. Higher values will produce smoother operation of the skeleton, but may make the skeleton feel less responsive to the user.

Parameters:
factor - The smoothing factor.
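A small sketch, assuming `tracker` is an already-initialized UserTracker; 0.5f is a hypothetical mid-range starting value, not a recommended default:

```java
// Higher values smooth out jitter but add perceptible lag; tune empirically.
tracker.setSkeletonSmoothingFactor(0.5f); // hypothetical starting value
float current = tracker.getSkeletonSmoothingFactor();
System.out.println("Smoothing factor now: " + current);
```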

getSkeletonSmoothingFactor

public float getSkeletonSmoothingFactor()
Queries the current skeleton smoothing factor.

Returns:
Current skeleton smoothing factor.
See Also:
setSkeletonSmoothingFactor(float factor)

startSkeletonTracking

public void startSkeletonTracking(short userId)

Requests that the Skeleton algorithm starts tracking a specific user. Once started, the skeleton will provide information on the joint position and orientation for that user during each new frame of the UserTracker.

Note that the computational requirements of calculating a skeleton increase linearly with the number of users tracked. Tracking too many users may result in poor performance and high CPU utilization. If performance slows to the point where the skeleton is not calculated at the full frame rate of the depth data used to generate it, the quality of the tracking results tends to degrade.

Parameters:
userId - User for which to calculate a skeleton.
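A typical pattern, assuming the UserData accessors (isNew(), getSkeleton()) of the standard NiTE 2 Java bindings and an initialized `tracker`, is to start tracking as soon as a user first appears:

```java
import com.primesense.nite.JointType;
import com.primesense.nite.SkeletonJoint;
import com.primesense.nite.SkeletonState;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTrackerFrameRef;

// Assumes `tracker` is an initialized UserTracker.
UserTrackerFrameRef frame = tracker.readFrame();
for (UserData user : frame.getUsers()) {
    if (user.isNew()) {
        // A new user entered the scene: request a skeleton for them.
        tracker.startSkeletonTracking(user.getId());
    } else if (user.getSkeleton().getState() == SkeletonState.TRACKED) {
        SkeletonJoint head = user.getSkeleton().getJoint(JointType.HEAD);
        System.out.println("User " + user.getId() + " head: " + head.getPosition());
    }
}
frame.release();
```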

stopSkeletonTracking

public void stopSkeletonTracking(short userId)
Stops skeleton tracking for a specific user. If multiple users are being tracked, this will only stop tracking for the user specified -- skeleton calculation will continue for remaining users.

Parameters:
userId - User to stop tracking.
See Also:
Skeleton

startPoseDetection

public void startPoseDetection(short userId,
                               PoseType type)
This function commands the UserTracker to start detecting specific poses for a specific user.

Parameters:
userId - User that you would like to detect a pose for.
type - The type of pose you would like to detect.
See Also:
PoseData, PoseType

stopPoseDetection

public void stopPoseDetection(short userId,
                              PoseType type)
This function commands the pose detection algorithm to stop detecting a specific pose for a specific user. Since it is possible to detect multiple poses from multiple users, it is possible that detection of a different pose on the same user (or the same pose on a different user) may continue after this function is called.

Parameters:
userId - User to stop detecting a specific pose for.
type - The PoseType of the pose to stop detecting.
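A sketch of the start/stop pairing, assuming UserData.getPose(PoseType) and PoseData.isHeld() from the standard NiTE 2 Java bindings; CROSSED_HANDS is used here only as an example pose:

```java
import com.primesense.nite.PoseData;
import com.primesense.nite.PoseType;
import com.primesense.nite.UserData;
import com.primesense.nite.UserTrackerFrameRef;

// Assumes `tracker` is an initialized UserTracker and `userId` a visible user.
tracker.startPoseDetection(userId, PoseType.CROSSED_HANDS);

UserTrackerFrameRef frame = tracker.readFrame();
for (UserData user : frame.getUsers()) {
    PoseData pose = user.getPose(PoseType.CROSSED_HANDS);
    if (pose != null && pose.isHeld()) {
        // The pose was seen; stop watching for it on this user only.
        tracker.stopPoseDetection(user.getId(), PoseType.CROSSED_HANDS);
    }
}
frame.release();
```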

addNewFrameListener

public void addNewFrameListener(UserTracker.NewFrameListener listener)
Adds a NewFrameListener object to this UserTracker so that it will respond when a new frame is generated.

Parameters:
listener - A listener to add.

removeNewFrameListener

public void removeNewFrameListener(UserTracker.NewFrameListener listener)
Removes a NewFrameListener object from this UserTracker's list of listeners. The listener will no longer respond when a new frame is generated.

Parameters:
listener - A listener to remove.
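An event-driven sketch; the onNewFrame callback name follows the UserTracker.NewFrameListener interface of the standard NiTE 2 Java bindings:

```java
import com.primesense.nite.UserTracker;
import com.primesense.nite.UserTrackerFrameRef;

// Assumes `tracker` is an initialized UserTracker.
UserTracker.NewFrameListener listener = new UserTracker.NewFrameListener() {
    @Override
    public void onNewFrame(UserTracker t) {
        UserTrackerFrameRef frame = t.readFrame();
        System.out.println("New frame: " + frame.getUsers().size() + " user(s)");
        frame.release();
    }
};
tracker.addNewFrameListener(listener);
// ... later, when frame callbacks are no longer wanted:
tracker.removeNewFrameListener(listener);
```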

convertJointCoordinatesToDepth

public Point2D<java.lang.Float> convertJointCoordinatesToDepth(Point3D<java.lang.Float> point)

Converts a point from the "real world" coordinate system used by skeleton joints to the "projective" coordinate system native to the depth map. This is the inverse of convertDepthCoordinatesToJoint; see that method for a discussion of the two coordinate systems.

convertDepthCoordinatesToJoint

public Point2D<java.lang.Float> convertDepthCoordinatesToJoint(Point3D<java.lang.Integer> point)

In general, two coordinate systems are used in OpenNI 2.0. These conventions are also followed in NiTE 2.0.

Skeleton joint positions are provided in "Real World" coordinates, while the native coordinate system of depth maps is the "projective" system. In short, "Real World" coordinates locate objects using a Cartesian coordinate system with the origin at the sensor. "Projective" coordinates measure straight line distance from the sensor, and indicate x/y coordinates using pixels in the image (which is mathematically equivalent to specifying angles). See the OpenNI 2.0 documentation online for more information.

This function allows you to convert the native depth map coordinates to the system used by the joints. This might be useful for performing certain types of measurements (e.g., the distance between a joint and an object identified only in the depth map).

Note that the return value omits the Z coordinate: Z values are unchanged by the conversion. An input value is still required for Z, however, since it affects the x/y output.

Parameters:
point - The input point in the "projective" coordinate system.
Returns:
Output point in the "real world" system.
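The underlying projective-to-world mapping can be illustrated with a self-contained pinhole-camera sketch. The field-of-view constants below are assumptions chosen for illustration only; the real conversion uses the device's calibration, obtained through OpenNI:

```java
public class ProjectiveToWorld {
    // Assumed field-of-view values (roughly Kinect-like); real devices
    // report calibrated values through the OpenNI Device API.
    static final double H_FOV = Math.toRadians(58.5);
    static final double V_FOV = Math.toRadians(45.6);

    // Map a depth pixel (px, py) with depth in mm to world-space mm.
    static double[] toWorld(int px, int py, int depthMm, int resX, int resY) {
        double xzFactor = 2.0 * Math.tan(H_FOV / 2.0);
        double yzFactor = 2.0 * Math.tan(V_FOV / 2.0);
        double wx = ((double) px / resX - 0.5) * depthMm * xzFactor;
        double wy = (0.5 - (double) py / resY) * depthMm * yzFactor;
        return new double[] { wx, wy, depthMm }; // Z passes through unchanged
    }

    public static void main(String[] args) {
        // The image-center pixel maps to world X = Y = 0 at any depth.
        double[] w = toWorld(160, 120, 1500, 320, 240);
        System.out.printf("%.1f %.1f %.1f%n", w[0], w[1], w[2]);
    }
}
```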

getHandle

public long getHandle()
Returns the native handle of the underlying NiTE object. This is intended primarily for internal use by the wrapper layer and is not normally needed by applications.