Sensor Signal Data Set for Exploring Context Recognition of Mobile Devices
Keywords: context data, context-awareness, mobile-awareness
Terms of usage
The data set and the associated images can be freely used for research under the following conditions:
- When reporting any results, please refer to this WWW page and to the article:
Jani Mäntyjärvi, Johan Himberg, Petri Kangas, Urpo Tuomela, and Pertti Huuskonen. Sensor Signal Data Set for Exploring Context Recognition of Mobile Devices. In Workshop "Benchmarks and a database for
context recognition" in conjunction with the 2nd Int. Conf. on Pervasive Computing (PERVASIVE 2004), April 18-23, Linz/Vienna, Austria, 2004.
- Please send us a reference to the article where you have used the data.
- You may not redistribute copies of the data or the associated
images; illustrating parts of the data and images in tables or diagrams
when reporting results is, of course, allowed.
Contact address
E-mail address
firstname.familyname@hut.fi
(where firstname=johan and familyname=himberg).
Description of the data
Description of the data and studies where the data has been used
See the article cited above. Some of the further references listed
in the paper are also available on this page.
There are five scenarios, numbered 1-5, each containing 40-50 repetitions.
The scenarios are scripts that describe what the testees had to do (mainly
walking inside/outside, using an elevator or the stairs, sitting at a desk, or
sitting in a coffee room). Each scenario was repeated 40-50 times by
two testees. Unfortunately, there is no labeling indicating which of the
testees performed each repetition.
Summary of user activities in each scenario
Data format
The data is in the ASCII file contextdata.txt. The columns
are separated by tabs and are as follows:
- Scenario number
- Repetition number
- Time (s)
- Device:Position:DisplayDown
- Device:Position:DisplayUp
- Device:Position:AntennaDown
- Device:Position:AntennaUp
- Device:Position:SidewaysRight
- Device:Position:SidewaysLeft
- Device:Stability:Stable
- Device:Stability:Unstable
- Device:Placement:AtHand
- Environment:Light:EU
- Environment:Light:USA
- Environment:Light:Bright
- Environment:Light:Normal
- Environment:Light:Dark
- Environment:Light:Natural
- Environment:Light:TotalDarkness
- Environment:Temperature:Hot
- Environment:Temperature:Warm
- Environment:Temperature:Cool
- Environment:Temperature:Cold
- Environment:Humidity:Humid
- Environment:Humidity:Normal
- Environment:Humidity:Dry
- Environment:SoundPressure:Silent
- Environment:SoundPressure:Modest
- Environment:SoundPressure:Loud
- UserAction:Movement:Walking
- UserAction:Movement:WalkingFast
- UserAction:Movement:Running
The first three columns contain 1) the ID number of the scenario, 2)
the ID number of the repetition, and 3) the time from the beginning of the
repetition in seconds. Note that some repetitions are missing (for example,
scenario 1 has no repetitions 1 or 2). The rest of the
columns contain context attributes derived from the physical signals.
There is also a header file header.txt that contains the
context attribute names, separated by tabs in the same manner as in
the actual data file.
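
As a minimal loading sketch (assuming Python with pandas, both files in the
working directory, and the hypothetical names Scenario, Repetition and Time
for the first three columns in case header.txt lists only the context
attributes), one could read the files like this:

import pandas as pd

# Column names as listed in header.txt (tab-separated).
with open("header.txt") as f:
    names = f.read().strip().split("\t")

# The data file is tab-separated ASCII with no header row.
data = pd.read_csv("contextdata.txt", sep="\t", header=None)

# If header.txt covers only the context attributes, prepend hypothetical
# names for the three bookkeeping columns described above.
if len(names) == data.shape[1] - 3:
    names = ["Scenario", "Repetition", "Time"] + names
data.columns = names

# Example: repetition numbers present in scenario 1 (some are missing).
print(sorted(data[data[names[0]] == 1][names[1]].unique()))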
Images (video captures)
Here you will also find sequences of images that are snapshots of a
video recording of one repetition from each scenario. These should
help in seeing what the testees did during each scenario. The filename
scenXX-YY_ZZZ.jpg means scenario XX, repetition YY, ZZZ seconds
from the beginning of that repetition. Note that the time
stamp ZZZ is not very well in sync with the actual recording of the
context data.
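
To match the snapshots with the data, a small sketch such as the following
(Python standard library only; the images/ directory name is hypothetical)
parses the scenXX-YY_ZZZ.jpg pattern:

import re
from pathlib import Path

PATTERN = re.compile(r"scen(\d+)-(\d+)_(\d+)\.jpg$")

def parse_snapshot_name(path):
    """Return (scenario, repetition, seconds) from a scenXX-YY_ZZZ.jpg name."""
    m = PATTERN.search(Path(path).name)
    if m is None:
        raise ValueError("not a snapshot filename: %s" % path)
    return tuple(int(g) for g in m.groups())

# Example: list the snapshots of scenario 3, in time order
# (remember that ZZZ is only roughly in sync with the context data).
snapshots = sorted(parse_snapshot_name(p) for p in Path("images").glob("scen*.jpg"))
for scenario, repetition, seconds in snapshots:
    if scenario == 3:
        print(scenario, repetition, seconds)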
Download
Data (zipped)
Images from scenarios 1-2 (zipped)
Images from scenarios 3-5 (zipped)