System requirements for using iTrace Core are the same as the requirements for the eye tracker you intend to use. Please refer to your tracker vendor's documentation for exact specifications.
If Windows warns that the app is unrecognized, click More info and then Run anyway to begin installation.
To uninstall, open Add or remove programs, select iTrace-Core in the list of installed programs, and click Uninstall.
Before you begin using iTrace Core, you will need to ensure that the display the participant will be using is set as the Primary Display in Windows.
If you are using a supported Tobii tracker, iTrace Core will automatically detect the tracker if it is connected to your computer when the application is started. For Gazepoint trackers, the tracker must be connected to the PC and the Gazepoint Control Application must be running. If you have not yet connected a tracker before starting iTrace Core, you can set up your tracker and then click the refresh button to rescan for trackers. When your tracker appears, simply select it from the dropdown list. If you are using a Gazepoint tracker, DO NOT close the Gazepoint Control Application until iTrace Core is closed.
To set up a tracking session, click Session Setup. This will allow you to name the task the participant will be working on (or simply record the name of the study being performed), provide the name of the researcher administering the study, record the anonymous ID of the participant, and select the output path for your data (XML files from both the Core and plugins). When finished, simply click Save. The Clear button will empty all the data fields and restore the output directory to the default path (the Desktop).
By default, the Core will start a socket server and a web socket server for plugin communication on ports 8008 and 7007, respectively. If these ports are not already in use on your workstation, connecting a plugin simply requires using the connect buttons provided by the plugins for Eclipse or Visual Studio 2017 (only plugin versions 0.1.0 are supported with this release). If you need to change the ports, you can use the settings tab on the Core to change either of the socket server ports. Valid port values are between 1025 and 65535. Keep in mind that if you change the ports for iTrace Core, you will also need to change the settings on the plugin to match. The Core will need to be restarted for the new port values to take effect.
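A custom plugin can talk to the Core over the same default port. The sketch below is a hypothetical client: port 8008 matches the default described above, but the newline-delimited text framing is an assumption, not a documented protocol detail; consult the official plugin source for the actual wire format.

```python
import socket

# Default iTrace Core data socket, per the documentation above.
HOST, PORT = "localhost", 8008

def read_messages(host=HOST, port=PORT):
    """Connect to the Core socket server and yield one message per line.

    NOTE: newline-delimited framing is an assumption for illustration.
    """
    with socket.create_connection((host, port)) as conn:
        buffer = b""
        while True:
            chunk = conn.recv(4096)
            if not chunk:          # server closed the connection
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                yield line.decode("utf-8", errors="replace")

# Usage (requires iTrace Core running with a tracking session started):
# for message in read_messages():
#     print(message)
```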
Before starting a recording session, you should calibrate your selected tracker (the mouse
tracker does not require any calibration). Simply click Calibrate
to start
calibration for your
chosen tracker.
Gazepoint trackers provide their own calibration via the Gazepoint Control Application, which needs to be running to communicate with iTrace Core. While calibration is taking place with a Gazepoint tracker, DO NOT click the mouse. Doing so will cause calibration to terminate early and will cause communication issues between the Gazepoint tracker and iTrace Core. Calibration is finished when the results are displayed. At that point it is safe to close the window by clicking the mouse.
By default, the Gazepoint uses a 5 point non-randomized calibration. This can be changed using the calibration button found on the Gazepoint Control Application (not on iTrace Core). The options are displayed on the results screen and are activated using the keyboard. One option allows the number of calibration points to be changed to 9 (also non-randomized). Additional options are described in the Gazepoint Control.pdf installed alongside the Gazepoint Control Application on your workstation. Changes to these settings persist for future calibrations using the Gazepoint Control Application and iTrace Core. Note that these options will also be visible after running a calibration from iTrace Core; however, the Gazepoint Control Application does NOT allow them to be changed at that time. All settings changes must be made using the Gazepoint Control Application.
For Tobii trackers a calibration screen provided by iTrace Core will appear providing a randomized nine point calibration. When calibration is complete, the results are shown and pressing any key or clicking the mouse will close the window. iTrace Core does not provide any additional calibration settings.
When you are ready to collect eye tracking data from a participant, click Start Tracking to start reading data from the eye tracker and transmitting it to the plugin. When using a Gazepoint tracker, do NOT close the Gazepoint Control Application until tracking is stopped.
When a tracking session is complete, simply click Stop Tracking on the Core to stop the tracker and end the tracking session. Data transmission to the plugin will also cease, but the plugin will remain connected to allow for starting a new tracking session.
When all recording sessions are complete, ensure that iTrace Core is stopped and then disconnect the plugin. This step is VERY important as closing a tracking application without first disconnecting the plugin will produce incomplete data output from the plugin.
To record the screen during a session, check the Enable Screen Recording box before clicking Start Tracking. Just like with gaze tracking, the screen recorder only captures the display set as the Primary Display in Windows.
To view the tracker's eye status, check Show Eye Status before clicking Start Tracking.
To display a gaze reticle on screen, check Show Reticle before clicking Start Tracking.
<itrace_core session_id="" session_date_time="[timestamp_milli]" task_name="" researcher=""
participant_id="">
<environment screen_width="" screen_height="" tracker_type="" tracker_serial_number=""
screen_recording_start="[timestamp_milli]" />
<calibration timestamp="[timestamp_milli]">
<calibration_point x="" y="">
<sample left_x="" left_y="" left_validity="" right_x="" right_y="" right_validity="" />
...
</calibration_point>
...
</calibration>
<gazes>
<response event_id="" core_time="[timestamp_milli]" tracker_time="" x="" y="" left_x=""
left_y="" left_pupil_diameter="" left_validation="" right_x="" right_y=""
right_pupil_diameter="" right_validation="" user_left_x="" user_left_y="" user_left_z=""
user_right_x="" user_right_y="" user_right_z="" />
...
</gazes>
</itrace_core>
NOTE: All screen based X and Y coordinates for gazes and calibrations are multiplied by the screen width and height, respectively (i.e., they are in pixels). Only the real world user coordinates for X, Y, and Z are unmodified.
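The output above is plain XML, so it can be read with any XML library. The sketch below uses Python's standard-library ElementTree to pull the gaze coordinates out of a Core output file; the file path is a placeholder, and empty attribute values (as shown in the schema) are skipped.

```python
import xml.etree.ElementTree as ET

def load_gazes(path):
    """Return a list of (x, y) gaze coordinates from an iTrace Core XML file.

    Per the note above, x and y are already scaled to screen pixels.
    """
    root = ET.parse(path).getroot()          # the <itrace_core> element
    points = []
    for response in root.iter("response"):   # one <response> per gaze sample
        x, y = response.get("x"), response.get("y")
        if x and y:                          # skip empty attribute values
            points.append((float(x), float(y)))
    return points

# Usage: points = load_gazes("core_output.xml")
```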
itrace_core tag attributes:
session_id → unique identifier for a recording session
session_date_time → start of the session, recorded as a UTC Unix style timestamp in milliseconds
task_name → name of the task the participant is working on (or the name of the study)
researcher → name of the researcher administering the study
participant_id → anonymous ID of the participant

environment tag attributes:
screen_width → width in pixels of the display used for the study
screen_height → height in pixels of the display used for the study
tracker_type → tracker used to record gaze data [API Used]
  Gazepoint: VALUE attribute on the ACK response tag from a <GET ID="PRODUCT_ID" /> request
  Tobii: Tobii.Research.IEyeTracker.DeviceName
calibration tag attributes:
timestamp → time when the calibration was taken, as a UTC Unix style timestamp in milliseconds

calibration_point tag attributes:
x → x position of the point used for calibration [API Used]
  Gazepoint: CALX# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].PositionOnDisplayArea.X (# is 0 for the first calibration point and increases until the total number of calibration points - 1 is reached)
y → y position of the point used for calibration [API Used]
  Gazepoint: CALY# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].PositionOnDisplayArea.Y (# is 0 for the first calibration point and increases until the total number of calibration points - 1 is reached)
sample tag attributes:
left_x → x coordinate for the left eye calibration sample [API Used]
  Gazepoint: LX# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].LeftEye.PositionOnDisplayArea.X (# and #2 are indices into the points and the samples taken at each point, respectively)
left_y → y coordinate for the left eye calibration sample [API Used]
  Gazepoint: LY# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].LeftEye.PositionOnDisplayArea.Y (# and #2 are indices into the points and the samples taken at each point, respectively)
left_validity → left eye validity for the calibration sample [API Used]
  Gazepoint: LV# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].LeftEye.Validity (# and #2 are indices into the points and the samples taken at each point, respectively)
right_x → x coordinate for the right eye calibration sample [API Used]
  Gazepoint: RX# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].RightEye.PositionOnDisplayArea.X (# and #2 are indices into the points and the samples taken at each point, respectively)
right_y → y coordinate for the right eye calibration sample [API Used]
  Gazepoint: RY# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].RightEye.PositionOnDisplayArea.Y (# and #2 are indices into the points and the samples taken at each point, respectively)
right_validity → right eye validity for the calibration sample [API Used]
  Gazepoint: RV# attribute on the CAL tag (# is 1 for the first calibration point and increases until the total number of calibration points is reached)
  Tobii: Tobii.Research.CalibrationResult.CalibrationPoints[#].CalibrationSamples[#2].RightEye.Validity (# and #2 are indices into the points and the samples taken at each point, respectively)
response tag attributes:
event_id → unique id for the recorded gaze
core_time → timestamp when the Core recorded the data, in system time
tracker_time → time information from the tracker API [API Used]
  Gazepoint: TIME_TICK attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.DeviceTimeStamp
x → screen based x coordinate [API Used]
  Gazepoint: BPOGX attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionOnDisplayArea.X, Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionOnDisplayArea.X
y → screen based y coordinate [API Used]
  Gazepoint: BPOGY attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionOnDisplayArea.Y, Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionOnDisplayArea.Y
left_x → screen based left eye x coordinate [API Used]
  Gazepoint: LPOGX attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionOnDisplayArea.X
left_y → screen based left eye y coordinate [API Used]
  Gazepoint: LPOGY attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionOnDisplayArea.Y
left_pupil_diameter → left pupil diameter from the tracker [API Used]
  Gazepoint: LPD attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.Pupil.PupilDiameter
left_validation → left eye validity from the tracker [API Used]
  Gazepoint: LPOGV attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.Validity
right_x → screen based right eye x coordinate [API Used]
  Gazepoint: RPOGX attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionOnDisplayArea.X
right_y → screen based right eye y coordinate [API Used]
  Gazepoint: RPOGY attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionOnDisplayArea.Y
right_pupil_diameter → right pupil diameter from the tracker [API Used]
  Gazepoint: RPD attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.Pupil.PupilDiameter
right_validation → right eye validity from the tracker [API Used]
  Gazepoint: RPOGV attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.Validity
user_left_x → real world x position of the left eye [API Used]
  Gazepoint: LEYEX attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionInUserCoordinates.X
user_left_y → real world y position of the left eye [API Used]
  Gazepoint: LEYEY attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionInUserCoordinates.Y
user_left_z → real world z position of the left eye [API Used]
  Gazepoint: LEYEZ attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.LeftEye.GazePoint.PositionInUserCoordinates.Z
user_right_x → real world x position of the right eye [API Used]
  Gazepoint: REYEX attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionInUserCoordinates.X
user_right_y → real world y position of the right eye [API Used]
  Gazepoint: REYEY attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionInUserCoordinates.Y
user_right_z → real world z position of the right eye [API Used]
  Gazepoint: REYEZ attribute on the REC tag
  Tobii: Tobii.Research.GazeDataEventArgs.RightEye.GazePoint.PositionInUserCoordinates.Z
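The calibration data documented above can be used to sanity check tracking quality: for each calibration_point, compare the target (x, y) against its valid samples. The sketch below computes a mean left-eye error per point in pixels (per the coordinate note above); treating left_validity == "1" as valid is an assumption here, so check the validity codes your tracker actually emits.

```python
import math
import xml.etree.ElementTree as ET

def left_eye_error(path):
    """Return {(x, y): mean pixel distance} for each calibration point.

    Only samples whose left_validity equals "1" are counted; that
    validity convention is an assumption for this sketch.
    """
    root = ET.parse(path).getroot()
    errors = {}
    for point in root.iter("calibration_point"):
        px, py = float(point.get("x")), float(point.get("y"))
        dists = [
            math.hypot(float(s.get("left_x")) - px, float(s.get("left_y")) - py)
            for s in point.iter("sample")
            if s.get("left_validity") == "1"
        ]
        if dists:                       # skip points with no valid samples
            errors[(px, py)] = sum(dists) / len(dists)
    return errors

# Usage: print(left_eye_error("core_output.xml"))
```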
Copyright © 2022 SERESL. All rights reserved.