We just purchased another EyeLink 1000 system. It comes with the added possibility of tracking the subject's head movements.
Am I correct in inferring that you mean the "remote mode" (aka "EyeLink Remote" in the user manual) of the EyeLink 1000 Plus?
Do you know if there is an easy way of integrating the head-tracking data, together with the non-normalized eye movements, into MWorks?
With EyeLink Remote, it looks like MWorks can get head position data (target distance and x/y position) with each eye sample. The positions appear to be given in "raw" camera coordinates, which I think are the same units as pupil coordinates (though I am not certain).
It's not clear from the docs whether the pupil coordinates reported by the tracker have been normalized for head movements. Presumably someone at SR Research could tell us.
Exposing the head position data to MWorks experiments will require modifying the EyeLink plugin, but the changes should be straightforward.
We are considering the use of permanently implanted gyroscopes (e.g. this one, in the ear) to estimate head movements. Do you have a ready-made interface for wireless input from a device like that?
MWorks doesn't currently have any interface that will work with that hardware. However, the device you linked to does have a C++ API, so I should be able to write a plugin that connects with it. Our ability to support other devices will depend on the specific interface/API that they provide.
Alternatively, are there any other interfaces available that other groups have been using for similar purposes (e.g. camera-based tracking of head movements, a scleral coil to track head movements, etc.)?