Wednesday, October 15, 2014

Chapter 6: Synchronizing and combining


Timestamping every frame and game event seemed like a good idea at first. Unfortunately, I had not realized that these timestamps are based on the local clock of the machine running the experiment. I had expected the local clocks of the 5 machines used for the experiment to be roughly in sync, but that turned out not to be the case: some machines were up to 4 seconds off! Most of the experiments have already been run, and it would be a shame to let all that data go to waste, so I needed to find a way to fix time.



When I was setting up the database to store all game events, I added a field to the events table that automatically sets itself to the current time whenever a new event is inserted. This time is consistent, since it is always the local time of the database server. Perfect, let's use that!
No, it's not supposed to be this easy: this timestamp is rounded to the nearest second, so according to the database, two people could perform an action at 'exactly' the same time. I wanted the timings in my program to be more precise than that. After some brainstorming I figured I could compare the 'imprecise' time of the database entry with the 'exact but not synchronized' local time of the machine creating the event. Averaging this difference over all events for a particular player would give me an estimate of the offset between the database clock and that player's clock. I could determine this offset for both players and subtract it from the original player times to get estimated exact, synchronized database times.
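The offset trick above can be sketched in a few lines. This is a hypothetical Python illustration, not the actual implementation: the event pairs and numbers are made up, and it assumes each event record carries both the database's rounded insertion time and the machine's precise local time, expressed in seconds on a shared scale.

```python
from statistics import mean

def estimate_offset(events):
    """Estimate a machine's clock offset relative to the database clock.

    `events` is a list of (db_time, local_time) pairs, where db_time is
    the database's insertion timestamp (rounded to whole seconds) and
    local_time is the precise but unsynchronized machine clock.
    The offset is the average of (local_time - db_time) over all events;
    the per-event rounding error averages out over many events.
    """
    return mean(local - db for db, local in events)

def synchronize(local_time, offset):
    """Map a precise local timestamp onto the shared database clock."""
    return local_time - offset

# Hypothetical events from a machine whose clock runs ~2.4 s fast
events = [(10.0, 12.35), (12.0, 14.42), (15.0, 17.39), (18.0, 20.38)]
offset = estimate_offset(events)
corrected = synchronize(12.35, offset)
```

Because the database timestamp is rounded to whole seconds, each individual difference can be off by up to half a second, but averaging over many events pulls the estimate close to the machine's true offset.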

This is what I ended up doing. It took me quite some time to implement just right, but the results were definitely worth it. Some pairs of videos captured me walking around the experiment room from both cameras at once, which was the perfect test of whether the offsets were calculated correctly. Luckily, they were: I could see myself in the same awkward pose in both videos at the same time, yay me!

In the meantime, my supervisor noticed the tool I was building and had a bunch of ideas for extra information he would like to see in it, such as a 'play' button instead of only the slider at the bottom, an option to hide unimportant game events, the questionnaire results for each participant, etc.



Now that most of my planned experiments were done, I could start running my videos through a number of facial expression extraction tools that ICT has available. These tools track the position and rotation of the face and detect different kinds of emotions the participant might display. They produce a huge amount of data for each video, and since we have over 200 videos in total, that is a lot of data to process. I started writing MATLAB scripts to iterate over these huge files and average meaningful signals over different time spans during the game to produce a bunch of graphs, such as a graph of each participant's smile levels during the game. These simple graphs don't provide too much information though, and it didn't take long before I was writing code to create far more complicated graphs...
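To make the per-round averaging concrete, here is a small sketch in Python rather than MATLAB (the actual scripts aren't shown here). The frame values and round boundaries are made up, and it assumes each frame arrives as a (timestamp, value) pair on the same clock as the round spans.

```python
from statistics import mean

def average_per_round(frames, rounds):
    """Average a per-frame signal (e.g. smile level) within each game round.

    frames: list of (timestamp, value) pairs from the expression tracker.
    rounds: list of (start, end) time spans, one per round, on the same clock.
    Returns one average per round, or None for rounds with no frames.
    """
    averages = []
    for start, end in rounds:
        values = [v for t, v in frames if start <= t < end]
        averages.append(mean(values) if values else None)
    return averages

# Hypothetical smile levels sampled once per second across two rounds
frames = [(0, 0.1), (1, 0.3), (2, 0.5), (3, 0.7)]
per_round = average_per_round(frames, [(0, 2), (2, 4)])
```

The same loop works for any signal the extraction tools emit; swapping the time spans (rounds, turns, whole game) gives the different graphs mentioned above.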


Although these graphs can be useful for spotting possible trends, they're not exactly proof of anything. To show that people really do smile more in the earlier rounds than in the later ones, I'll need to run some fancy statistical analyses. I'm already looking forward to it...