We are using Inquisit 6 for an EEG task requiring millisecond precision. To test stimulus display timing, I hooked a photodiode up to a virtual oscilloscope and measured stimulus onset relative to trigger times, and stimulus onset relative to stimulus offset.

My primary concern is that Inquisit 6 appears to end stimulus display before the duration requested in the code. I compared our oscilloscope measurements with the data audit times, and they are inconsistent. I repeated the test on two different graphics cards and observed the same problem, though the size of the error differed: on the Dell integrated graphics the stimulus ended 30 ms before the scheduled offset, and on the NVIDIA graphics card it ended 20 ms early.

Below are two screenshots from the virtual oscilloscope. On the left, I requested a white square on screen for 100 ms, but observed it switching to a black square after 79 ms. On the right, I requested a white square on screen for 1000 ms, and observed it switching to a black square after 980 ms. This is with the NVIDIA graphics card on Windows 10, with Inquisit running as a realtime process.
The code to do this is very straightforward and I don't think this is a coding error on my part, but feel free to correct any errors. The relevant code is just:
/ stimulustimes = [0 = white_square, trigger, other stimuli;
                   100 = black_square, other stimuli]
// flash white square for 100 ms; send trigger to port COM4 when the white square appears
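For completeness, here is a minimal self-contained version of the kind of test script I am running. The element and item definitions are placeholders I have filled in for illustration (our actual stimuli and names differ); the port assignment assumes the COM4 trigger mentioned in the comment above:

<shape white_square>
/ shape = rectangle
/ color = white
/ size = (200, 200)
</shape>

<shape black_square>
/ shape = rectangle
/ color = black
/ size = (200, 200)
</shape>

<port trigger>
/ port = COM4
/ items = (1)
</port>

<trial timingtest>
/ stimulustimes = [0 = white_square, trigger; 100 = black_square]
/ trialduration = 1000
</trial>

Even in this stripped-down form, the photodiode shows the white-to-black transition arriving earlier than the scheduled offset.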
Any help in understanding and/or fixing this behavior would be appreciated.
On the plus side, I will say the precision between the trigger and stimulus onset is on the order of +/- 500 microseconds, which is quite impressive.
University of Pittsburgh