Millisecond Forums

Inquisit timing accuracy.

https://forums.millisecond.com/Topic4796.aspx

By matt2004 - 7/27/2010

Dear Inquisit-forum-denizens,


I have been struggling with a timing issue with Inquisit for some time now, in fact I posted a question about it some time ago, related to Inquisit 2: http://www.millisecond.com/forums/Topic2348.aspx. I'm now testing out Inquisit 3 with a view to upgrading and am finding the same issue.


The issue is a consistent timing error, such that the total length of each trial is slightly longer than it should be. Sean's response to my original post was to suggest inserting pre- and post-trial pauses to give Inquisit time to wrap the trial up; however, this doesn't seem to help.


I've been coping with the issue by deliberately setting the trial duration lower than it 'should' be, so if I want a trial to last for 1000 ms, I actually set the trial duration to, say, 900 ms. However, fixing the issue is not as simple as applying a blanket rule of "knock off 100 ms from the duration of every trial" - the error seems to be somewhat capricious depending on what kind of stimuli (and how many) are being presented, although it's generally consistent across trials of the same type. Fixing the error and getting the timing I want therefore requires a fair bit of tedious data-logging with external hardware, or sitting there with a stopwatch timing the total length of the program and iteratively knocking off or adding a few milliseconds to the trial durations in order to get the total length as accurate as possible.


To illustrate this I've written a very simple example program. This presents shape stimuli (a green circle, and a black circle to 'erase' the green one) on every trial, and the trial duration has been set at 1000ms. There are 60 trials, so the total length of the program should be 1 minute. In fact, the total length of the program is approximately 1 minute and 3 seconds. Inquisit is therefore adding on approximately 50ms (or three frames) to each trial. Changing the pre and post trial pauses from 100 ms to 0 ms has no effect. Inquisit logs the trial duration in the data file as whatever is specified in the script, which is unhelpful.
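The drift arithmetic described above can be sketched in a few lines (all figures are taken from the post, not measured; the 60 Hz frame time is an assumption):

```python
# Illustrative arithmetic only: how a fixed per-trial overhead of a few
# frames compounds over a block. Assumes a 60 Hz display (~16.67 ms/frame).
FRAME_MS = 1000 / 60

intended_trial_ms = 1000   # duration specified in the script
extra_frames = 3           # overhead observed per trial
trials = 60

actual_trial_ms = intended_trial_ms + extra_frames * FRAME_MS
total_drift_ms = trials * extra_frames * FRAME_MS

print(round(actual_trial_ms, 1))        # 1050.0 ms per trial
print(round(total_drift_ms / 1000, 1))  # 3.0 s over the whole block
```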


Apologies for the length of this post, but I now have several queries:


1. Why does this happen? As I understand it, Inquisit uses the system timer, which should have sub-microsecond resolution. Given the timer's resolution it seems very odd that such a large error occurs.


2. How can I ensure that the timings I enter into the script are what I actually get when the script executes? Is there some little trick I'm missing here? I've tried controlling the length of trials using the timeout feature as well, but that doesn't seem to help. As noted before, Sean's original suggestion of inserting a post-trial pause doesn't seem to help.


3. Are the values for trialduration which are logged in the data file actually measured as the program executes, or are they just copied from the relevant parts of the script? I'm guessing the latter, since they seem to be inaccurate.


4. Has anyone else seen behaviour like this?


5. Can some kind person help me out by trying to reproduce the error using the attached script? The magnitude of the error does seem to be somewhat hardware dependent, but there's always some kind of lag. I generally use up-to-date hardware (dual core processor, lots of RAM, Windows XP, things like anti-virus disabled) for running my programs.


Many thanks for any help you can give me with this. I've been using Inquisit for years and genuinely adore it, but if I can't solve this problem I'll have to start using something else. Plus, I've just embarrassed myself in front of my supervisor by running a load of fMRI scans with crappy timing. :o(


Thanks,


M.

By Dave - 7/27/2010

Hi Matt,


Inquisit's timing is inevitably tied to the computer's display refresh rate. A decent CRT monitor usually refreshes at 100Hz (approx. every 10.00 ms); most LCD or TFT monitors refresh at 60Hz (approx. every 16.67 ms). Inquisit's stimulus display depends on that figure. When using '/stimulustimes', Inquisit will try to display your stimulus in the display frame closest to the specified time in ms, i.e. your stimulus might show up a tad earlier or later than specified when the timing entered is not an exact multiple of the refresh interval. If you need your timing to be exact, use the '/stimulusframes' command, which lets you specify timing in terms of ordinal frame position in a given trial.
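The frame quantization Dave describes can be illustrated with a short sketch (a 60 Hz display is assumed, and the round-to-nearest-frame rule is an assumption for illustration, not documented Inquisit behavior):

```python
# Requested stimulus times can only be realized on display refresh
# boundaries. Assumes 60 Hz (~16.67 ms/frame) and nearest-frame rounding.
FRAME_MS = 1000 / 60

def realized_onset(requested_ms):
    # Snap the requested time to the nearest whole frame.
    return round(requested_ms / FRAME_MS) * FRAME_MS

print(round(realized_onset(100), 2))  # 100 ms -> 6 frames -> 100.0 ms (exact)
print(round(realized_onset(50), 2))   # 50 ms  -> 3 frames -> 50.0 ms (exact)
print(round(realized_onset(40), 2))   # 40 ms  -> 2 frames -> 33.33 ms (off by ~7 ms)
```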


As far as I know, the timings entered in '/pretrialpause' and '/posttrialpause' do not directly depend on the refresh rate, however Inquisit will still (have to) wait for the next refresh cycle to start before displaying any stimulus. I believe the same is true for the '/trialduration' settings. That may be another source of the systematic timing error you're seeing.


Another often overlooked issue is Inquisit's stimulus erasing behavior. All stimuli will be erased *by default* at the end of the trial unless specified otherwise (by setting '/erase = false'). This may add (at least one) additional, rogue frame between trials (see here for a discussion: http://www.millisecond.com/forums/Topic4488.aspx).


I suggest you adjust your trial timings and pretrial- and posttrialpauses to be multiples of the refresh interval and review the '/erase' settings. Let me know what you find after doing so.
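Taken together, these suggestions might look something like the following Inquisit-script sketch (the stimulus and trial names follow those used elsewhere in this thread; at 60 Hz, both 100 ms and 1000 ms are whole multiples of the ~16.67 ms frame):

```
<shape green_circle>
/ shape = circle
/ color = (0, 255, 0)
/ erase = false
</shape>

<trial show_circle>
/ pretrialpause = 100
/ posttrialpause = 100
/ stimulusframes = [1 = green_circle]
/ trialduration = 1000
</trial>
```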


Best wishes from a fellow Inquisit user,


~Dave


By matt2004 - 7/27/2010

Hi Dave,


Many thanks for the very swift, and very useful reply!


I've already compared the use of /stimulusframes and /stimulustimes and have found little difference between the two in terms of the timing lag. However, as a matter of course I've been using /stimulusframes because it's (in theory) more accurate.


>As far as I know, the timings entered in '/pretrialpause' and '/posttrialpause' do not directly depend on the refresh rate, however Inquisit will still (have to) wait for the next refresh cycle to start before displaying any stimulus. I believe the same is true for the '/trialduration' settings. That may be another source of the systematic timing error you're seeing.


This is very interesting and it's something I hadn't considered, although it seems totally obvious now that you've pointed it out! However, in my test program (running at 60Hz) the pre and post-trial pauses are set at 100 ms (i.e. 6 frames, near as dammit) and /trialduration is set at 1000 ms (i.e. 60 frames) so I don't see how that could be optimised any further, and also don't think it could be responsible for the relatively large errors I'm seeing. Good tip though - thanks.


>Another often overlooked issue is Inquisit's stimulus erasing behavior. All stimuli will be erased *by default* at the end of the trial unless specified otherwise (by setting '/erase = false'). This may add (at least one) additional, rogue frame between trials (see here for a discussion: http://www.millisecond.com/forums/Topic4488.aspx).


Also very interesting, and in fact, by adding /erase = false to the stimulus elements in my test program I've reduced the error from approximately 3 seconds to 2 seconds - looks like the erasure process was indeed adding one frame onto the end of each trial. Unfortunately it still seems like there are two more extra frames per trial.


Any other bright ideas about where the error might be creeping in?


Many thanks for your help.


M.


PS. Updated version of the test program (with /erase = false added) attached.

By Dave - 7/28/2010

Matt,


I've been giving this some thought and have been conducting some tests as well. While I'm seeing some minor timing drifts on my system (which btw is not tuned for running experiments), I can't seem to replicate the huge second-range deviations you've been reporting.


I've attached a modified version of your timing/testing script which might be helpful in narrowing this down. Keep me posted on your observations.


~Dave

By seandr - 7/28/2010

Dave -  thanks for the test script, this really simplifies timing the script.


I've run it a couple of times, and each of the "show_circle" trials starts exactly 1000ms after the previous one started. Note that there are trials at the beginning and end that add some more time, but the 60 show_circle trials last exactly 60 seconds as expected. I've confirmed this by adding the


<data>
/ audit = true
</data>


command, and the audit times show exactly the same thing. Matt - can you try Dave's script and see if you get similar results?


-Sean



By Dave - 7/28/2010

<data>
/ audit = true
</data>


Wonderful feature, too bad this isn't mentioned anywhere in the docs...;-)


~Dave

By matt2004 - 7/28/2010

Dear Dave/Sean,


Many, many thanks for all your help with this - I'm immensely grateful for the time you've put into helping me out.


I've run Dave's modified script (which is fantastic, by the way!) and attached the data, as well as the log file which results when I include Sean's data auditing element as well. As you can see, I'm still having issues, and it appears that exactly two frames (33ms) are being added to the length of each trial - this very neatly adds up to the roughly two-second lag I'm seeing, over the 60 trials.


Since you guys can't reproduce the problem, I guess this is something hardware-specific - seems too regular to be an interrupt from a background process or something like that. I just searched the forum for any threads containing advice about how to optimise hardware and Windows for running experiments (I already turn off things like virus scanners etc.) but couldn't find much.


Curiouser and curiouser...


Many thanks again,


Matt.

By matt2004 - 7/28/2010

Oh - it appears I can't attach two files at once. Here is the log file (renamed to .txt for uploading purposes).



M.

By Dave - 7/28/2010

Matt,


definitely sounds like an issue with your specific hardware to me. Some monitors introduce consistent lags of one frame (see Plant & Turner (2009) for an example: http://brm.psychonomic-journals.org/content/41/3/598.abstract). Or it may be a graphics card / driver issue. Anyhoo, here's what you can do to troubleshoot further:


(1) Check and update your graphics card drivers.


(2) Connect a different monitor to your system.


(3) Run the timing tests on a completely different system for control purposes.


~Dave

By Dave - 7/28/2010

Hmm, looking at the data file you posted there's something fishy going on with respect to the posttrialpause column. Only the first trial reports a value of 100 (~6 frames), the remaining trials report 0. According to my tests, the posttrialpause matters as Inquisit apparently needs some refractory period to do some memory cleanup at the end of a given trial. I believe that's where your additional frames creep in. Which exact build / version of Inquisit are you running?


~Dave


EDIT: I see you're running 3.0.2.0. You should definitely update to 3.0.4.0 or 3.0.5.0 (beta). You're probably hitting a bug that was fixed ages ago.

By seandr - 7/29/2010

Wonderful feature, too bad this isn't mentioned anywhere in the docs...;-)


That's because it's still a little rough around the edges and somewhat incomplete. Besides, secret features add a sense of mystery to a product. [H]


-Sean


By seandr - 7/29/2010

Yes, since that old build, I added some optimizations in the way that Inquisit synchronizes with the refresh cycle to erase stimuli, as I recall. The previous behavior could insert some time in between trials.


-Sean

By Dave - 7/29/2010

Yes, since that old build, I added some optimizations in the way that Inquisit synchronizes with the refresh cycle to erase stimuli, as I recall. The previous behavior could insert some time in between trials.


Interesting info, thanks. However, things go seriously kerfuffle on my system if I set the posttrialpause to be zero. Seems like Inquisit needs at least two frames for cleanup at the end of any given trial. So there might be room for further improvement. What happens on your systems when you run the test script with posttrialpause = 0?


That's because it's still a little rough around the edges and somewhat incomplete. Besides, secret features add a sense of mystery to a product.


LOL.


~Dave

By Dave - 7/29/2010

Oh, btw, while coming up with the time auditing bits for the test script I also briefly experimented with the stimulusonset property (e.g. '/ ontrialend = values.myonset = shape.green_circle.stimulusonset.1'). However, the returned value always appeared to be empty. Any hint?


Thanks,


~Dave

By seandr - 7/29/2010

A little more info on the timing optimization - there were circumstances in which Inquisit would resynchronize with the vertical retrace more than once while erasing stimuli - i.e., it would wait for the paint cycle to start, erase a stimulus, then wait for the next paint cycle to start, erase the next stimulus, etc. The fix ensures that it only waits for 1 paint cycle before erasing all stimuli (which means there may still be a delay between trials, but it will usually be smaller than before).


A posttrialpause gives Inquisit a fixed time buffer after the trial to erase stimuli and do some other minor cleanup. With a posttrialpause defined, you shouldn't see any delay from erasing stimuli. Without the posttrialpause, I would expect to see some small delays.


As for pretrialpause, that should not affect the overall length of a trial whose trialduration is set to a fixed value. However, it could push stimulus presentation back if preparing the stimuli requires some time. In most cases, we're talking a few milliseconds to prepare a stimulus, but it could become significant on a slow machine with lots of stimuli.


Hope this clarifies things.


-Sean

By Dave - 7/29/2010

A posttrialpause gives Inquisit a fixed time buffer after the trial to erase stimuli and do some other minor cleanup. With a posttrialpause defined, you shouldn't see any delay from erasing stimuli. Without the posttrialpause, I would expect to see some small delays.


Sorry to keep dragging this on, but I'd like to point out that all stims in the test script are set to '/erase=false', i.e. Inquisit shouldn't have to erase anything, ever. The only thing remaining is the "other minor cleanup" (resetting trial properties and such, I assume). And yet -- with posttrialpause=0 -- approx. 2 additional frames are added to each trial. Granted, I'd also expect some (very) small delays for the minor cleanup, but I don't see why this should take 2 frames (~34 ms) when there is no screen / display erasing involved whatsoever. Considering the sheer number of processor and memory operations modern PCs can perform within microseconds, I'd expect the minor cleanup to be virtually undetectable. I definitely don't understand why it adds 2 frames (even one frame would be too much, IMO). Any input is greatly appreciated!


~Dave


P.S.: As mentioned previously, I think there may be a bug involving the 'stimulusonset' property: 'shape.myshape.stimulusonset.1' always returns an empty value on my system (Inquisit 3.0.5.0), despite the stimulus having been displayed and thus having an onset to report.

By seandr - 8/3/2010

I've tested the script with /posttrialpause=0 and am seeing intermittent delays of 30-34 ms.


Any resemblance these numbers have to the refresh rate is purely coincidental given that there is no stimulus erasure happening with your test script. At the end of the trial, Inquisit does a few things, including:


1) Captures the screen and saves it to disk (if applicable)
2) Erases stimuli and shows feedback stimuli (if applicable)
3) Updates stats for the expt, block, and trial
4) Evaluates any /ontrialend expressions
5) Writes the trial data to the file
6) Erases any feedback stimuli (if applicable)
7) Releases any video memory used by shapes or pictures (if applicable)
8) Clears any pending input messages from the relevant response buffer


The only item in this list that could possibly require 30ms is 5 - writing the data to disk. Sure enough, when I removed the /audit command from the <data> element and replaced it with /separatefiles=true (which causes Inquisit to save the data in memory and then write it all at once to disk at the end of the experiment), the 30ms delay goes away. I'd love to know if you see the same thing.
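For reference, the change Sean describes amounts to replacing the audit setting with deferred writing in the script's data element - a minimal sketch:

```
<data>
/ separatefiles = true
</data>
```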


Anyway, I think we have our culprit - disk access.


-Sean

By Dave - 8/3/2010

Hi Sean,


many thanks for giving this another look and your detailed response. Here are my observations as requested:


(1) / audit = true had nothing to do with my initial observation. In fact I never even ran the script with auditing turned on.


(2) / separatefiles = true doesn't resolve the issue for me. In fact it makes the delay even worse (~45ms mean).


Thanks again,


~Dave


P.S.: Can you replicate the empty stimulusonset property issue?

By Dave - 8/3/2010

A couple of more observations from tests conducted today:


- Even setting recorddata=false to prevent disk access does not resolve the issue for me. While the delay is somewhat reduced, there are still approx. 33ms added to each trial.


- Unrelated to the timing issue: recorddata doesn't seem to work when set at the <expt> level.


~Dave

By seandr - 8/4/2010

Weird. I've run the test several more times with both data recording and auditing turned on, and I'm seeing trial durations all in the range of 1000-1001 ms. I've run the test on a fast and slow machine with the same results. I can't even reproduce the delays I initially saw, and I'm actually thinking those resulted from me running the tests using a debug build with the debugger attached.


Just to make sure we are looking at the same numbers, I'm looking at the "Difference" value as a measure of the trial duration - is that what you are looking at?


-Sean



By seandr - 8/4/2010

P.S. If you are still seeing the delays, it would be interesting to know whether any of the following steps cumulatively have an impact:


1) Removing all data and audit recording (you've already done this)
2) Remove the ontrialend logic and the text stimuli showing the debug info from the script
3) Remove the shape stimuli from the presentation

By Dave - 8/4/2010

Thanks for the reply, Sean. Really weird. I can reproduce these delays reliably, though.


Just to make sure we are looking at the same numbers, I'm looking at the "Difference" value as a measure of the trial duration - is that what you are looking at?


Yes, that's what I'm talking about. The mean difference to be precise. In the test script, "Difference" reflects the duration of the last trial (measured as difference between the previous and current trial's start time), "Mean" reflects the mean duration across all trials.
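A rough sketch of the calculation being described (the start times below are hypothetical, not taken from the actual data file):

```python
# Hedged sketch of the "Difference"/"Mean" measures discussed above:
# each trial's duration is the difference between consecutive trial
# start times; "Mean" is the average of those differences.
start_times_ms = [0, 1000, 2001, 3001, 4034]  # hypothetical logged onsets

differences = [b - a for a, b in zip(start_times_ms, start_times_ms[1:])]
mean_ms = sum(differences) / len(differences)

print(differences)        # [1000, 1001, 1000, 1033]
print(round(mean_ms, 1))  # 1008.5
```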


Thanks,


~Dave

By seandr - 8/4/2010

Ah, that might explain the discrepancy. The mean difference includes "start_trial", which does not have a fixed length (its length depends on how quickly you respond). The performance in question is the duration of the fixed-length trials, so the mean difference isn't relevant.


I've been opening the data file and looking at the reported difference for each fixed length trial, and all the durations have fallen within the range of 1000-1001. Are you seeing any single trial durations outside this range?


-Sean

By Dave - 8/4/2010

Ah, that might explain the discrepancy. The mean difference includes "start_trial", which does not have a fixed length (its length depends on how quickly you respond). The performance in question is the duration of the fixed-length trials, so the mean difference isn't relevant.


No. My script corrects for that.

By seandr - 8/4/2010

More weird - did you update the test script? I just downloaded it again, and the calculations were different from the script I previously had. The version I have now does indeed correct for the start_trial.


I'm still not seeing any delays larger than 1ms, however. I'll try on some more machines...


-Sean

By Dave - 8/4/2010

More weird - did you update the test script? I just downloaded it again, and the calculations were different from the script I previously had. The version I have now does indeed correct for the start_trial.


I might have initially uploaded the wrong version -- however said version can't have been online for more than a couple of minutes... Sorry for the confusion.


~Dave

By matt2004 - 8/8/2010

Dear Dave/Sean,


Many thanks once again for the extensive testing and many helpful comments you guys have provided on this topic - really, really helpful. I'm very happy to report that including a post-trial pause, setting erase=false for all my stimuli, and updating to the latest build of Inquisit have pretty much solved the problem I was having - Dave's debugging timing code reports trial durations which are pretty much bang-on, with an occasional deviation of a millisecond or two. Testing with external hardware (getting Inquisit to output a TTL pulse on the parallel port with each trial and recording it with an analogue-to-digital converter with 1 ms resolution) confirms that the timing is accurate as well. This is a massive improvement over the situation I was in before and I'm re-commencing my experiments with renewed confidence in what I'm doing.


Interestingly, the above is only true on the computer I use for testing - when I run the programs on the computer in my office (similar specifications) I still see timing lags. Nevertheless, at least with Dave's debugging code I can be certain about what's going on whatever hardware I use in the future.


I have a suggestion actually - it would have been immensely helpful to me if the timing code had been easily available somewhere. I've been using Inquisit for a while but am in no way a 'proper' programmer and had no idea Inquisit could be used in this way, and even less idea how to implement those functions. Would it perhaps be possible to include an example script similar to Dave's in the Inquisit Task Library on the main website?


Many, many thanks again,


Matt.

By Dave - 8/9/2010

Interestingly, the above is only true on the computer I use for testing - when I run the programs on the computer in my office (similar specifications) I still see timing lags.


Request for clarification, if I may: This is to say that on your office PC


(a) the debugging script does not report timing lags although there are such lags (as measured via external equipment / TTL)?


or


(b) the debugging script does accurately report timing lags despite the various optimizations (posttrialpause, erase=false, etc)?


Thanks,


~Dave

By matt2004 - 8/9/2010

Dave,


Sorry for being unclear - what I meant was b), i.e. the debugging script is showing lags of about 30ms on each trial, despite all the optimisations.


Cheers,


M.

By seandr - 8/16/2010

Ok, I've dug more deeply into this and identified the cause of the extra lags. Basically, the lags are equal to two retrace intervals of the monitor (as Dave had earlier surmised).


The first interval arises because when Inquisit prepares to write the stimuli to the screen, it stalls until the beginning of a retrace interval. This can take up to 1 interval in duration and is currently added to the total duration of the trial, assuming no posttrialpause is set.


The 2nd interval lag was simply a bug as Inquisit mistakenly waited until the next interval began before advancing.


Both of these lags were absorbed by the posttrialpause, so they wouldn't affect timing if that command is set. I will hopefully have a fix ready for both shortly.


-Sean



By Dave - 8/16/2010

Ha! Huge thanks for digging into and solving the timing lag mystery, Sean! Any idea why it only showed up on some systems but not on others, though? Speaking of bugs, I had discovered a couple of other minor potential bugs while coding the timing debug / testing script. Since these are scattered all over this thread, I'll summarize them here:


- The 'stimulusonset' property always returns empty.


- The 'recorddata' switch appears to be non-functional at the <expt> level. This may only be the case when the <data> element is set to 'separatefiles=true'.


Cheers and thanks again,


~Dave


P.S.: I second Matt's sentiment about adding a timing test script to the Task Library. Feel free to use the script I have posted to this thread as a starting point. @Matt: Maybe you could brush up on the script (add a few more features, e.g. a count of deviating trial durations, etc., and comments) and submit it for publication in the Task Library? Apart from 'giving back to the community', that would make a nice, little exercise in using Inquisit's advanced features for you...

By JonRamsay - 8/27/2012

Hi Dave, Sean


I stumbled across this thread while trying to solve a delay/lag issue I'm having with a script I've developed.


I've attached the script (archived as a .rar file with associated files) to this post. Basically the task is to turn off a bunch of lights as quickly as possible. After each light has been turned off by clicking on it, the participant must press a central button to commence the next trial (i.e. turn on the next light to be extinguished).


The problem is that there is a delay of a fraction of a second between each trial (e.g. the next light doesn't turn on immediately after clicking the central button). Since I'm trying to get the participants to do this as quickly as possible this is highly problematic.


I have a feeling the problem is to do with the stimuli. Each of the lights, and also the central button, are represented by two .jpg files - one corresponding to the on state and one corresponding to the off state. There is also a background .jpg file that makes the whole task look like it is being carried out on an actual machine (machine.jpg or machinewin.jpg). All the other stimuli are superimposed on top of this.

After reading through this thread, I've tried messing with the erase and post trial pause commands to try and eliminate the delay but to no avail. Please note that the default for all stimuli currently is erase=false, so erasing shouldn't be the problem.


Strange Observation No. 1: I HAVE managed to eliminate the delay by removing the background stimulus image (i.e. I removed it from /stimulusframes in all the trials). Once I do this, there is no delay. This seems very odd though, as the .jpg files are so small I can't imagine it's overworking my computer's memory.


Strange Observation No. 2: If I DO set all the stimuli to erase=true, the time taken to erase all the stimuli at the end of the trial does seem to match closely with the delay I'm experiencing. This may or may not be a coincidence.


Any ideas? I could always just remove the background image and make the task less pretty/realistic-looking, but I figure there must be a way to solve this.


NB: I have tried on multiple computers and the problem persists.


Hope you can help,


Jon
By Dave - 8/27/2012

Jon,

I'm seeing no such delays with your script, doesn't matter whether I remove the machine background image from the trials or not. Is your Inquisit installation up to date or are you by chance running an outdated build?
By JonRamsay - 8/28/2012

Hi Dave,

That's really strange. I reinstalled Inquisit yesterday with intention of eliminating that possibility. I'm currently running version 3.0.6.0 with a build date of Jan 18 2012. As far as I can see, everything is up to date.

Could there be any other reasons?

Thanks

Jon
By Dave - 8/28/2012

I've tried with the same version. Still, I'm not seeing any issue. Which leads me to ask: How do you detect that delay in the first place? A single-frame lag should be virtually impossible to detect perceptually. So I assume you've devised some other method and have taken some measurements. Would you kindly share that method and data? That might help to track the problem down.

Thanks,
~Dave
By JonRamsay - 8/28/2012

Hi Dave,

The "delay" I'm speaking of hasn't been recorded by any other method - I'm just going by my own perception while running the task. When I remove the background image, the task proceeds as it should - i.e. I can't detect any delay between pressing the central button and one of the outer lights lighting up. However, when I run the version I attached, there is a very noticeable delay - I would estimate it to be almost one second - between pressing the button and illumination. This is especially bizarre because it didn't happen in earlier versions of the task, it only came about after I finished the coding on my home computer (most of it had been done in the office).

If there's anything I can do to help document this problem, i.e. run some sort of Inquisit diagnostic, please let me know.

Thanks

Jon
By Dave - 9/6/2012

I'm definitely not seeing any such pronounced delays (~1000ms) with the script you attached -- off/on transitions are virtually instantaneous (as one would expect). As for diagnostics, you can set /audit=true in your script's <data> element. That'll give you an additional file containing detailed stimulus timing info. Give that a try, perhaps that'll give some leads.


Regards,


~Dave


P.S.: Sorry for the delayed reply, the forum was partially botched and didn't accept any replies to this thread.

By Admin - 9/6/2012

Hi Jonathan,


I've tested out your script and looked it over, and I'm not seeing any perceptible delays either.


A couple of questions/suggestions to help us narrow this down:


1) What kind of computer are you running, and with which version of Windows?
2) Are the graphics drivers on your computer up to date?
3) Are you able to reproduce this problem on any other computers that aren't configured exactly like yours?


-Sean