Hello,
I am programming an experiment that presents several .gif images, but the display durations of the gifs vary unexpectedly, even when they should be the same length.
When I changed the script to present the same gif for 4 trials in a row, the first trial was often about twice as long as the rest, though occasionally other trials were also much longer. (Static images do not have this issue.)
Has anyone encountered this issue, or know how to solve it? Any help would be much appreciated.
Here's a bit more information:
I tried recording the time each image was displayed for, as suggested in this post:
http://www.millisecond.com/forums/Topic16654.aspx?Keywords=time-spent-on-page. According to that post, because the timer starts at ontrialbegin rather than at stimulus onset, the recorded times include some additional variability (up to ~100 ms) that is not actual variability in the presentation times. I do see roughly that much variability when I time static images, but the variability for my gifs is definitely larger (several hundred ms) and is noticeable to the naked eye.
Here's how I'm recording the times; below that are some of the times I recorded (in ms).
<trial fastfractaltrial>
/ ontrialbegin = [values.t_start=script.elapsedtime;]
/ stimulusframes = [1 = fastfractals]
/ response = timeout(1)
/ recorddata = false
/ ontrialend = [values.t_end=script.elapsedtime;]
</trial>
<values>
/ t_start = 0
/ t_end = 0
</values>
<expressions>
/ imagedisplaytime = (values.t_end-values.t_start)
</expressions>
Here is an example where I took 2 different gifs (one shorter, one longer) and presented each in blocks of 4 trials.
The largest delays seem to occur the first time each gif is used, though some other trials (e.g. the 3rd trial here) are also much longer.
trialtype | imagedisplaytime (ms) |
short | 734 |
short | 295 |
short | 796 |
short | 312 |
long | 793 |
long | 547 |
long | 463 |
long | 461 |
short | 276 |
short | 308 |
short | 295 |
short | 295 |
long | 477 |
long | 445 |
long | 467 |
long | 494 |
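In case it helps anyone suggest a fix: since the extra delay mostly hits the first presentation of each gif, I'm wondering whether the file is being loaded/decoded on first use. One workaround I'm considering is a warm-up trial that flashes each gif once before the timed trials begin. This is only a sketch: it reuses the fastfractals element from above, and I don't know whether a 1 ms timeout is actually enough to force the gif to load fully.

<trial preload_fastfractals>
/ stimulusframes = [1 = fastfractals]
/ response = timeout(1)
/ recorddata = false
</trial>

The idea would be to run this once at the start of the block, before the timed fastfractaltrial trials.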