In our PLATO Lab, we record and analyze the response processes and task-solving strategies of test participants. For example, we capture students’ critical online reasoning in a natural, realistic eye-tracking test environment. To track eye movements, we use a state-of-the-art screen-based Tobii Pro X3-120 eye tracker as well as the corresponding analysis software Tobii Pro Lab. The analyses enable us to better understand the processes involved in searching for information on the Internet, critically evaluating online information, and online-based learning.

Example: Video excerpt of a student processing a task from the Critical Online Reasoning Assessment (CORA) on the topic of “government bonds”

In this video, a student evaluates online information in terms of its credibility. While the student worked on the task and examined the websites, we recorded their eye movements.

The bubbles indicate which parts of the websites the student fixated on, while the lines indicate the saccades, i.e., the rapid movements between these fixations that show the direction of the student’s gaze. Areas that are fixated on longer or more frequently than others indicate a higher level of attention and, in eye-tracking research, are associated with deeper levels of processing.
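The distinction between fixations and saccades above can be made concrete with a small sketch. This is a minimal dispersion-based (I-DT) fixation detector, a standard algorithm in the eye-tracking literature; it is not necessarily the filter Tobii Pro Lab applies, and the dispersion and duration thresholds here are illustrative assumptions.

```python
def detect_fixations(samples, max_dispersion=35.0, min_duration=0.1):
    """Dispersion-based (I-DT) fixation detection sketch.

    samples: chronologically ordered (t_seconds, x_px, y_px) gaze samples.
    Returns a list of fixations as (start_s, end_s, centroid_x, centroid_y);
    the gaps between consecutive fixations correspond to saccades.
    Thresholds are illustrative, not Tobii Pro Lab defaults.
    """
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while gaze points stay within the dispersion limit.
        while j + 1 < len(samples):
            window = samples[i:j + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        start, end = samples[i][0], samples[j][0]
        if j > i and end - start >= min_duration:
            xs = [s[1] for s in samples[i:j + 1]]
            ys = [s[2] for s in samples[i:j + 1]]
            fixations.append((start, end, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1
    return fixations
```

Summed fixation durations and counts per area of interest can then be derived directly from this list, which is the basis for the attention measures described above.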

In this example, the student fixates on different areas of a tweet linked in one of the CORA tasks. There is a relatively high number of fixations on the graph included in the tweet, especially on the reference(s) at the bottom right of the graph. In a subsequent Google search, the student takes a closer look at the organizations listed in the references. The student also fixates on other parts of the tweet that could offer clues about the image source, including the author of the tweet as well as the organization that posted it.

By aggregating, analyzing, or comparing fixations on different areas of web pages or between groups of students (e.g. beginning and advanced students), we aim to gain deeper insights into how learners process online information and how they evaluate and select information from the broad range of information available on the Internet.

The heat maps below were generated from absolute fixation durations, scaled to a maximum of 0.5 seconds (radius: 50 px), while a student analyzed online information more or less critically. Fixation durations of at least 0.5 seconds are shown in dark red, whereas shorter fixations are shown in light shades of green. Sections of the graph that the student did not fixate on at all, or only for an extremely short time compared to the maximum fixation duration, are not highlighted.

Low performer (top); high performer (bottom)

Process mining - Open web search

Since our experiments usually take place in a natural online environment, with an open web search and no restrictions on the accessibility of websites and programs, tremendous amounts of time-sequential data are gathered from the online assessments and need to be structured for subsequent analyses. The structuring of response process data (e.g. log files) is based on the method of process mining. Process mining enables us to aggregate and visualize the steps of students’ response processes using event logs, and thus to identify commonalities and differences in how students solve specific online tasks.

Example of the sequential structure of a response process

Most recently, we used the process mining approach to explore differences in the number, duration, and type of response process steps while students evaluated the credibility of websites. The exemplary process mining visualization above indicates that some students spend most of their time in the task editor, reading the task and writing their answers. They also spend comparatively more time on the website given in the task (“Zentrum der Gesundheit”) than on consulting other webpages on the Internet.
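The duration comparison behind this observation can be sketched as a simple aggregation over a timestamped trace. The step names and the final end marker below are illustrative assumptions about the log format, not our actual schema.

```python
def dwell_time_per_step(trace):
    """Total time spent in each process step of one student's trace.

    trace: chronologically ordered (timestamp_s, activity) pairs, closed by a
    final (timestamp_s, "end") marker (an assumption about the log format).
    Each step is assumed to last until the next event begins.
    Returns a dict mapping activity to total seconds.
    """
    totals = {}
    for (t0, act), (t1, _) in zip(trace, trace[1:]):
        totals[act] = totals.get(act, 0.0) + (t1 - t0)
    return totals
```

Comparing such per-step totals across students is what reveals, for instance, that some students spend most of their time in the task editor rather than on external webpages.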

On the basis of this analysis,

  • We identify features of students’ successful and less successful online reasoning
  • We explain differences and commonalities in terms of students’ online reasoning processes across student groups and disciplines (e.g. economics, medicine, teacher education)
  • We identify the potential of digital information for web-based learning