Dear all,
I have a question regarding latencies that exceed 10000 ms. As far as I understand, my script omits latencies greater than 10000 ms from the computation of the D measures:
/ ontrialend = [if(block.iat2compatibletest1.latency <= 10000 && block.iat2compatibletest1.currenttrialnumber != 1) values.sum1a = values.sum1a + block.iat2compatibletest1.latency]
/ ontrialend = [if(block.iat2compatibletest1.latency <= 10000 && block.iat2compatibletest1.currenttrialnumber != 1) values.n1a = values.n1a + 1]
/ ontrialend = [if(block.iat2compatibletest1.latency <= 10000 && block.iat2compatibletest1.currenttrialnumber != 1) values.ss1a = values.ss1a + (block.iat2compatibletest1.latency * block.iat2compatibletest1.latency)]
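(For context, this is how I understand those accumulators to be used: each block's sum, trial count, and sum of squared latencies give a mean and standard deviation, and the D measure is the difference of the block means divided by the SD of all latencies from both blocks combined. Below is a minimal sketch of that arithmetic in Python, purely for illustration; the numbers and variable names are hypothetical and not taken from my script.)

import math

def mean_sd(total, n, ss):
    # mean and sample SD recovered from a running sum, count, and sum of squares
    mean = total / n
    sd = math.sqrt((ss - n * mean * mean) / (n - 1))
    return mean, sd

# hypothetical running totals for one compatible and one incompatible block
sum_c, n_c, ss_c = 24000.0, 24, 26000000.0
sum_i, n_i, ss_i = 30000.0, 24, 42000000.0

mean_c, _ = mean_sd(sum_c, n_c, ss_c)
mean_i, _ = mean_sd(sum_i, n_i, ss_i)

# "inclusive" SD over all trials of both blocks, from the pooled sum, count, and sum of squares
_, sd_all = mean_sd(sum_c + sum_i, n_c + n_i, ss_c + ss_i)

D = (mean_i - mean_c) / sd_all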
Now I have a case where the totalmaxlatency value in the first row (the first recorded trial in the first recorded block) says 11298 ms, while the first latency is 1729 ms. My first guess was that Inquisit omits the 11298 ms from the output data, BUT there is no trial missing: there are 24 recorded trials in this block (incompatibletest1), yet none of them shows 11298 ms as its recorded latency. How can this be?
Furthermore, I understood totalmaxlatency as "the longest response latency for the specified element over the entire experiment" (manual), but the latency shown in this column seems to be reset every time a new block is recorded. Did I misunderstand something?
Can anyone help with this issue? Thank you so much!
Regards, Sonja