Laboratory of Computer and Information Science

Sensor Signal Data Set for Exploring Context Recognition of Mobile Devices

Keywords: context data, context-awareness, mobile-awareness

Terms of usage

The data set and the associated images can be freely used for research under the following conditions:
  1. When reporting any results, please refer to this WWW page and the article
    Jani Mäntyjärvi, Johan Himberg, Petri Kangas, Urpo Tuomela, and Pertti Huuskonen. Sensor Signal Data Set for Exploring Context Recognition of Mobile Devices. In Workshop "Benchmarks and a database for context recognition" in conjunction with the 2nd Int. Conf. on Pervasive Computing (PERVASIVE 2004), April 18-23, Linz/Vienna, Austria, 2004.
  2. Please send us a reference to the article where you have used the data.
  3. You may not redistribute copies of the data or the associated images; illustrating parts of the data and images in tables or diagrams when reporting results is, of course, allowed.

Contact address

E-mail address
(where firstname=johan and familyname=himberg).

Description of the data

Description of data and studies where the data has been used

See the article above. Some of the further references listed in the paper are also available from this page.

There are five scenarios, numbered 1-5, each containing 40-50 repetitions. The scenarios are scripts that describe what the testees had to do (mainly walking inside/outside, using an elevator or the stairs, sitting at a desk, or sitting in a coffee room). Each scenario was repeated 40-50 times by two testees. Unfortunately, there is no label indicating which of the two testees performed each repetition.

Summary of user activities in each scenario

Data format

The data is in the ASCII file contextdata.txt. The columns are separated by tab characters and are as follows:
  1. Scenario number
  2. Repetition number
  3. Time (s)
  4. Device:Position:DisplayDown
  5. Device:Position:DisplayUp
  6. Device:Position:AntennaDown
  7. Device:Position:AntennaUp
  8. Device:Position:SidewaysRight
  9. Device:Position:SidewaysLeft
  10. Device:Stability:Stable
  11. Device:Stability:Unstable
  12. Device:Placement:AtHand
  13. Environment:Light:EU
  14. Environment:Light:USA
  15. Environment:Light:Bright
  16. Environment:Light:Normal
  17. Environment:Light:Dark
  18. Environment:Light:Natural
  19. Environment:Light:TotalDarkness
  20. Environment:Temperature:Hot
  21. Environment:Temperature:Warm
  22. Environment:Temperature:Cool
  23. Environment:Temperature:Cold
  24. Environment:Humidity:Humid
  25. Environment:Humidity:Normal
  26. Environment:Humidity:Dry
  27. Environment:SoundPressure:Silent
  28. Environment:SoundPressure:Modest
  29. Environment:SoundPressure:Loud
  30. UserAction:Movement:Walking
  31. UserAction:Movement:WalkingFast
  32. UserAction:Movement:Running
The first three columns contain 1) the ID number of the scenario, 2) the ID number of the repetition, and 3) the time from the beginning of the repetition in seconds. Note that some repetitions are missing (for example, scenario 1 has no repetitions number 1 or 2). The remaining columns contain the context attributes derived from the physical signals. There is also a header file, header.txt, that contains the context attribute names separated by tabs in the same order as in the data file.
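As a rough illustration of the format above, the tab-separated rows can be read with a few lines of standard Python. This is only a sketch: the parsing function and record layout are assumptions for illustration, not part of the data set's own tooling, and the sample rows below are made up rather than taken from contextdata.txt.

```python
import csv
import io

def load_context_data(text):
    """Parse tab-separated context data rows.

    Each record becomes (scenario, repetition, time_s, attributes),
    where attributes holds the 29 context-attribute values
    (columns 4-32 in the description above).
    """
    records = []
    for row in csv.reader(io.StringIO(text), delimiter="\t"):
        if not row:
            continue  # skip blank lines
        scenario = int(row[0])      # column 1: scenario number
        repetition = int(row[1])    # column 2: repetition number
        time_s = float(row[2])      # column 3: time in seconds
        attrs = [float(v) for v in row[3:]]  # columns 4-32
        records.append((scenario, repetition, time_s, attrs))
    return records

# Two invented sample rows (NOT real data), just to show the shape:
sample = ("1\t3\t0.0\t" + "\t".join(["0"] * 29) + "\n"
          "1\t3\t1.0\t" + "\t".join(["1"] * 29) + "\n")
records = load_context_data(sample)
```

In real use one would open contextdata.txt and pass its contents to the function; the header.txt names could likewise be read with the same tab-splitting.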

Images (video captures)

Here you will also find sequences of images that are snapshots of a video recording of one repetition from each scenario. These should help in seeing what the testees did during each scenario. The filename scenXX-YY_ZZZ.jpg means scenario XX, repetition YY, ZZZ seconds from the beginning of that repetition. Note that the time stamp ZZZ is not precisely in sync with the actual recording of the context data.
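The scenXX-YY_ZZZ.jpg naming convention can be decoded mechanically, which is handy when matching snapshots to data rows. The helper below is a hypothetical sketch (the function name and regular expression are mine, not part of the data set), assuming XX, YY, and ZZZ are plain decimal digit groups as the description suggests.

```python
import re

# Assumed pattern for the image filenames: scenXX-YY_ZZZ.jpg
FNAME_RE = re.compile(r"scen(\d+)-(\d+)_(\d+)\.jpg$")

def parse_image_name(name):
    """Return (scenario, repetition, seconds) from an image filename,
    or None if the name does not match the expected pattern."""
    m = FNAME_RE.search(name)
    if m is None:
        return None
    scenario, repetition, seconds = (int(g) for g in m.groups())
    return scenario, repetition, seconds
```

For example, parse_image_name("scen01-05_030.jpg") would yield scenario 1, repetition 5, 30 seconds; bear in mind the noted lag between ZZZ and the context-data clock.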


Data (zipped)
Images from scenarios 1-2 (zipped)
Images from scenarios 3-5 (zipped)
Wednesday, 12-May-2004 10:50:05 EEST