Software

I support open and reproducible research, so the code, programs, and datasets I have developed are available for academic use. If you believe access to the source materials of any of the projects below would substantially benefit your research, please contact me and explain why.

Human Gait as a Biometric

The Gait as a Biometric Dataset (UCLA-GBD) is intended to serve as a benchmark for biometric gait recognition systems and to make research based on it repeatable and extensible by researchers in both the medical and engineering communities.

Data Description

We conducted two sets of experiments on a total of twenty test subjects. Each subject completed at least two runs: one wearing the UCLA Smart Insoles in their normal footwear and one with no footwear (i.e., socks). In each run, the subject walked for 4 meters, stopped, turned around, and repeated this process for 60 seconds. Some subjects completed additional runs at varying walking speeds. In addition to the accelerometers embedded in the insoles, two accelerometers were placed near the subject's left and right pockets. Each recorded accelerometer sample contained a timestamp, the x-, y-, and z-axis readings, and other data.
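For illustration, the minimal Python sketch below computes the per-sample acceleration magnitude from the x, y, and z readings, a common preprocessing step in gait analysis; the (timestamp, x, y, z) column layout is an assumption for the example, not the dataset's documented schema.

    # Illustrative only: per-sample acceleration magnitude from x/y/z columns.
    # The (timestamp, x, y, z) column layout is an assumption, not the
    # dataset's documented schema.
    import numpy as np

    def acceleration_magnitude(samples: np.ndarray) -> np.ndarray:
        """samples: N x 4 array of (timestamp, x, y, z) readings."""
        xyz = samples[:, 1:4]
        return np.linalg.norm(xyz, axis=1)

    # Example with two synthetic readings (timestamp in ms, acceleration in g).
    demo = np.array([[0, 0.02, -0.98, 0.05],
                     [10, 0.03, -1.01, 0.04]])
    print(acceleration_magnitude(demo))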

Accessing the Dataset

The data are distributed as a .zip archive (file size: 25.5 MB) and organized in MATLAB format. Please read the readme.txt in the archive for details on accessing the data, and email me for the download link.
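For those working outside MATLAB, the sketch below shows one way to inspect a .mat file from the archive in Python with SciPy; the file name is hypothetical, and readme.txt documents the actual variable names and layout.

    # Minimal sketch for inspecting one UCLA-GBD recording in Python.
    # The file name is hypothetical; readme.txt in the archive documents
    # the actual variable names and layout.
    from scipy.io import loadmat

    data = loadmat("subject01_insole_run1.mat")  # hypothetical file name

    # List the variables stored in the .mat file, skipping MATLAB metadata keys.
    for name, value in data.items():
        if not name.startswith("__"):
            print(name, getattr(value, "shape", type(value)))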

Citation

N. Amini, M. Sarrafzadeh, A. Vahdatpour, W. Xu, "Accelerometer-based on-body sensor localization for health and medical monitoring applications," Pervasive and Mobile Computing (Elsevier), vol. 7, no. 6, December 2011, pp. 746-760.

Important Note

The UCLA-GBD dataset is the property of the UCLA Wireless Health Institute. This dataset can only be used for research and non-commercial purposes. Please cite our paper if you publish results based on this dataset. If you have any questions about the dataset, please email me.


Classifying Human Activities with Inertial Sensors

The Human Activities Dataset (UCLA-HAD) is intended to enable comparative studies of different techniques for classifying the daily activities of human subjects.

Data Description

We collected data using Apple iPhone 4 smartphones placed at the left hip of the subjects. Twenty-five subjects performed a total of nine activities. The sample rate for all sensors (accelerometer, gyroscope, and magnetometer) was 60 Hz.
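As an illustration of how 60 Hz sensor streams are typically prepared for activity classification, the Python sketch below segments a signal into fixed-length, 50%-overlapping windows; the 2-second window length and the overlap are illustrative choices, not part of the dataset or our published method.

    # Minimal sketch: fixed-length windowing at the 60 Hz sample rate, a common
    # first step before extracting activity-classification features.
    # Window length and overlap are illustrative choices only.
    import numpy as np

    SAMPLE_RATE_HZ = 60
    WINDOW_SECONDS = 2                        # illustrative 2-second windows
    WINDOW_SAMPLES = SAMPLE_RATE_HZ * WINDOW_SECONDS

    def sliding_windows(signal, step=WINDOW_SAMPLES // 2):
        """Yield 50%-overlapping windows over an N x C sensor array."""
        for start in range(0, len(signal) - WINDOW_SAMPLES + 1, step):
            yield signal[start:start + WINDOW_SAMPLES]

    # Example: 10 seconds of synthetic 3-axis accelerometer data.
    demo = np.random.randn(SAMPLE_RATE_HZ * 10, 3)
    print(sum(1 for _ in sliding_windows(demo)), "windows")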

Accessing the Dataset

The data are distributed as a .zip archive (file size: 15.2 MB) and organized in CSV format. Please read format.txt and note.docx in the archive for details on how to interpret the data, and email me for the download link.
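The sketch below shows one way to load a file from the archive with pandas; the file name and column names are assumptions made for the example, and format.txt documents the actual layout.

    # Minimal sketch for loading one UCLA-HAD CSV file with pandas.
    # File name and column names are hypothetical; format.txt in the archive
    # documents the actual layout.
    import pandas as pd

    columns = ["timestamp",
               "acc_x", "acc_y", "acc_z",
               "gyro_x", "gyro_y", "gyro_z",
               "mag_x", "mag_y", "mag_z"]    # hypothetical column names

    df = pd.read_csv("subject01_walking.csv", names=columns, header=None)
    print(df.head())
    print(df.describe())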

Citation

R. Saeedi, N. Amini, H. Ghasemzadeh, "Patient-centric on-body sensor localization in smart health systems," The Asilomar Conference on Signals, Systems, and Computers, November 2-5, 2014, Pacific Grove, CA, USA.

Important Note

The UCLA-HAD dataset is the property of the UCLA Wireless Health Institute. This dataset can only be used for research and non-commercial purposes. Please cite our paper if you publish results based on this dataset. If you have any questions about the dataset, please email me.


Google Glass and Hand Gesture Recognition

Hand gesture recognition is an important capability for modern wearable technologies such as Google Glass. However, the current Google Glass software does not support gesture controls.

Data Description

We have developed a preliminary application in which the Glass responds to basic gestures such as swiping. We are currently working on skin-color calibration and on reducing the number of false positives. Please email me for the demo application.
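To give a sense of the kind of skin-color calibration step involved, here is an illustrative Python/OpenCV sketch of HSV-based skin segmentation; the threshold values are placeholder assumptions and do not reflect the application's actual parameters.

    # Illustrative HSV-based skin segmentation, the kind of calibration step
    # mentioned above. Thresholds are placeholder assumptions, not the
    # application's actual parameters.
    import cv2
    import numpy as np

    def skin_mask(frame_bgr, lower=(0, 40, 60), upper=(25, 180, 255)):
        """Return a binary mask of pixels whose HSV values fall in a skin range."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(lower, np.uint8), np.array(upper, np.uint8))
        # Morphological opening removes small false-positive blobs.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)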