[Figure: The types of realities, from real to virtual.]

However, these realities need sensors to work as intended, that is, to actually place the user in a given reality. To do this, the most common method is for the user to wear a Head-Mounted Display (HMD). Most HMDs have inertial sensors, such as a gyroscope and an accelerometer, to track the motion of the user's head. Additionally, the human, also known as the user, is an important element in the whole system, as can be seen in Figure 2. To fully realize the potential of these realities, however, tracking only the head is not enough: interaction with the environment requires other sensors working together with the HMD. Between the human and the machine, interaction, usability and comfort are key factors. Human-Computer Interaction (HCI) is a multidisciplinary field of research which deals with these types of questions, not only on the hardware side but on the software side as well.

This paper focuses on the engineering and computer science sides of HCI, namely on sensors, in order to determine whether they are viable as human motion tracking devices. The problem with precise human motion tracking is that it is expensive, and most people, especially those who live in developing countries, cannot afford it. Even if they could, most motion tracking sensors are not commercially available, possibly due to their use and price. A few of those sensors will be mentioned in this paper. Specifically, this paper investigates the use of low-cost sensors, such as the Kinect sensor (Microsoft, Redmond, WA, USA) for full-body motion tracking via its depth cameras and skeleton stream, and the Leap Motion Controller (LMC; Leap Motion, San Francisco, CA, USA) for hand motion tracking, to see if they can substitute for the expensive ones.

The question of the Kinects and the LMC is an interesting one: since they are applied in many fields as part of HCI studies, it should be determined whether they are adequate as sensors by studying their precision and accuracy. Thus, the research question (RQ) of the authors is the following: Can these three sensors substitute for expensive sensors, taking into account their accuracy, precision and price, while also assessing existing gesture recognition algorithms?

You can find the change log for the Azure Kinect Sensor SDK here. If you need an older version of the Azure Kinect Sensor SDK, find it here.
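To make the sensor side concrete, below is a minimal sketch of grabbing a single depth frame with the Azure Kinect Sensor SDK's C API. This is an illustrative example, not code from the paper: it assumes the Sensor SDK is installed (compile with `-lk4a`) and an Azure Kinect device is attached.

```c
/* Minimal sketch: open an Azure Kinect, start the depth camera, and read one
 * depth frame. Illustrative only; requires the Sensor SDK and real hardware. */
#include <stdio.h>
#include <k4a/k4a.h>

int main(void)
{
    k4a_device_t device = NULL;
    if (k4a_device_open(K4A_DEVICE_DEFAULT, &device) != K4A_RESULT_SUCCEEDED) {
        fprintf(stderr, "No Azure Kinect device found\n");
        return 1;
    }

    /* Start only the depth camera; leave all other streams disabled. */
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    if (k4a_device_start_cameras(device, &config) != K4A_RESULT_SUCCEEDED) {
        fprintf(stderr, "Failed to start cameras\n");
        k4a_device_close(device);
        return 1;
    }

    /* Block until one capture arrives, then inspect its depth image. */
    k4a_capture_t capture = NULL;
    if (k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE) ==
        K4A_WAIT_RESULT_SUCCEEDED) {
        k4a_image_t depth = k4a_capture_get_depth_image(capture);
        if (depth != NULL) {
            printf("Depth frame: %d x %d pixels\n",
                   k4a_image_get_width_pixels(depth),
                   k4a_image_get_height_pixels(depth));
            k4a_image_release(depth);
        }
        k4a_capture_release(capture);
    }

    k4a_device_stop_cameras(device);
    k4a_device_close(device);
    return 0;
}
```

Each depth pixel is a 16-bit distance value in millimeters; skeleton/body tracking is layered on top of such depth frames by a separate body tracking SDK.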
For the latest version of libk4a, see the links in the next section.
When installing the SDK, remember the path you install to, for example, "C:\Program Files\Azure Kinect SDK 1.2". You will find the tools referenced in articles in this path.

Linux installation instructions

Currently, the only supported distribution is Ubuntu 18.04. To request support for other distributions, see this page. First, you'll need to configure Microsoft's Package Repository, following the instructions here. Now you can install the necessary packages. The k4a-tools package includes the Azure Kinect Viewer, the Azure Kinect Recorder, and the Azure Kinect Firmware Tool. Installing it also installs the dependency packages that are required for the tools to work correctly, including the latest version of libk4a. You will need to add udev rules to access the Azure Kinect DK without being the root user; for instructions, see Linux Device Setup. As an alternative, you can launch applications that use the device as root.

The libk4a<major>.<minor>-dev package contains the headers and CMake files to build your applications/executables against libk4a, and the corresponding libk4a<major>.<minor> package contains the shared objects needed to run applications/executables that depend on libk4a. The basic tutorials require the libk4a<major>.<minor>-dev package. Be sure to install the matching version of libk4a: for example, if you install the libk4a1.4-dev package, also install the corresponding libk4a1.4 package that contains the matching version of the shared object files. If the installation succeeds, the SDK is ready for use.
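The Ubuntu package steps described above boil down to a few apt commands, sketched here under the assumption that Microsoft's package repository has already been configured; the version suffix on the libk4a packages is illustrative.

```shell
# Sketch of the Ubuntu install steps; assumes Microsoft's package
# repository has already been configured.
sudo apt update

# Tools: Azure Kinect Viewer, Recorder, and Firmware Tool.
# This also pulls in the required dependencies, including the latest libk4a.
sudo apt install k4a-tools

# Headers and CMake files for building against libk4a; the version suffix
# is illustrative -- install the libk4a package of the same version.
sudo apt install libk4a1.4-dev libk4a1.4
```

Remember that, without udev rules, these tools only see the device when run as root.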