Physiological Data Visualization

This prototype resulted in the publication of a poster at GRAND 2014, as well as a presentation at the Games User Research Workshop at CHI’14.

The prototype was written in Processing, which made it possible to develop a stable prototype quickly.

Research on video game players draws on many methods, both quantitative and qualitative. When a Games User Researcher employs video recordings and psychophysiological measures to collect data on the player experience, there is often a need to correlate events across both data sources. Correlating interesting psychophysiological events with video is regularly done by hand, which is time-consuming. We propose a prototype application that combines these data sources from player sessions, allowing a researcher to visualize regions of interest in the video according to a specified set of psychophysiological activity parameters.

The main window of the prototype (see Figure 1) exhibits two different areas. The first is the video footage area, in which two videos are displayed: one for the gameplay action and another for the player interaction. The second area, on the bottom half of the screen, displays three different psychophysiological signals, namely GSR and two channels of fEMG.

On the main screen, the displayed video footages can be played or paused, either simultaneously or individually. It is also possible to filter the video footage according to psychophysiological data parameters. To enable filtering, the researcher right-clicks on the graphs of the physiological measures to be considered (e.g., the GSR activity graph, and the fEMG1 activity graph for negative emotional valence). Once the desired physiological data sources are enabled, the user can set two parameters.

The first parameter is the physiological activity level. After processing the raw physiological data, percentile activity levels are obtained as the final result. By moving the slider bar on top of a graph, the user selects the minimum activity level of that signal to look at; doing so updates the regions of interest overlaid on the graph.

The second filtering parameter is the minimum region-of-interest duration. By moving the slider attached to a slider bar, the researcher filters the displayed regions of interest: the slider defines the minimum duration a region must have once its activity level meets the threshold set by the slider bar. This way, unwanted short segments can be removed for a clearer analysis of the player activity.
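The two-parameter filtering described above can be sketched as a simple algorithm: scan the percentile activity signal, collect contiguous runs at or above the activity threshold, and keep only runs at least as long as the minimum duration. This is an illustrative sketch in plain Java (the original was a Processing sketch, and the class and method names here are assumptions, not the original code):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch, not the original prototype code: find regions of
// interest in a signal of percentile activity levels (0-100), given the
// two slider values (minimum activity level and minimum duration).
public class RoiFilter {

    // A region of interest, stored as inclusive start/end sample indices.
    public static class Region {
        public final int start, end;
        public Region(int start, int end) { this.start = start; this.end = end; }
        public int length() { return end - start + 1; }
    }

    public static List<Region> findRegions(double[] activity,
                                           double minActivity,
                                           int minDurationSamples) {
        List<Region> regions = new ArrayList<>();
        int start = -1; // start index of the current above-threshold run, or -1
        // Iterate one step past the end so a run reaching the last sample closes.
        for (int i = 0; i <= activity.length; i++) {
            boolean above = i < activity.length && activity[i] >= minActivity;
            if (above && start < 0) {
                start = i;                      // a run begins
            } else if (!above && start >= 0) {  // the run ended at i - 1
                Region r = new Region(start, i - 1);
                if (r.length() >= minDurationSamples) regions.add(r);
                start = -1;
            }
        }
        return regions;
    }

    public static void main(String[] args) {
        double[] gsr = {10, 80, 85, 90, 20, 95, 15, 70, 75, 80, 85};
        // Threshold 60, minimum duration 3 samples: keeps runs 1-3 and 7-10,
        // and drops the single-sample spike at index 5.
        for (Region r : findRegions(gsr, 60, 3)) {
            System.out.println(r.start + "-" + r.end);
        }
    }
}
```

Re-running this scan whenever either slider moves is cheap (linear in the signal length), which is what lets the overlaid regions of interest update interactively.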
Finally, after adjusting the filtering parameters, the researcher can click on a desired region of interest to play the corresponding segment of both video footages at once. The clicked region becomes highlighted so that the user can see which video segment is being played.