Description:
Our visual attention, i.e. where we look in an interface, can be driven consciously by the task we are currently doing, or influenced by the visual saliency of UI elements. Conversely, we can manipulate that saliency to attract or guide a user’s attention.
The goal of this project is to simulate the gaze behavior of a user. Given a specific user interface and task, we want to generate where a user is looking at any point in time, taking into account their current task, what they have previously looked at, and the dynamically changing data in the interface that might affect their attention. Specifically, we will consider the case of drone monitoring, where the user has to keep track of multiple delivery drones to detect any critical situation they might face.
This project is part of a larger research project. The simulated data will be used as artificial training data in an adaptive interface that aims to maximize a user’s situation awareness during drone monitoring.
This project can easily be scaled to your abilities and interests. You will review the literature on gaze simulation and either implement existing models or devise your own simulation approach, developing a modular simulation framework that can be integrated with other projects. Prior approaches range from traditional symbolic methods and Markov chains to deep-learning-based methods, transformer-based methods, and approaches using reinforcement learning.
We will provide you with existing gaze data, for example to train models.
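To give a flavor of the simpler end of the modeling spectrum, here is a minimal sketch of a first-order Markov chain over areas of interest (AOIs): transition probabilities are estimated from recorded gaze sequences, then new scanpaths are sampled from them. The AOI names and the example sequences are hypothetical and purely illustrative, not part of the provided data.

```python
import random
from collections import defaultdict

def fit_transitions(sequences):
    """Estimate P(next AOI | current AOI) from observed gaze sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    return {
        cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for cur, nxts in counts.items()
    }

def simulate(transitions, start, length, seed=0):
    """Sample a scanpath of `length` fixations starting at `start`."""
    rng = random.Random(seed)
    path = [start]
    while len(path) < length:
        probs = transitions.get(path[-1])
        if not probs:  # AOI never left in the training data: stay put
            path.append(path[-1])
            continue
        aois, weights = zip(*probs.items())
        path.append(rng.choices(aois, weights=weights)[0])
    return path

# Hypothetical recorded scanpaths over a drone-monitoring layout
data = [
    ["map", "drone1", "map", "drone2", "alerts"],
    ["map", "drone2", "alerts", "map", "drone1"],
]
T = fit_transitions(data)
print(simulate(T, "map", 6))
```

A model like this ignores task context and dynamic interface data, which is exactly what the deep-learning and reinforcement-learning approaches in the papers below try to capture.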
Requirements:
- Good background in Data science / Machine learning methods
- Familiarity with some of the approaches mentioned below and the ability to implement them
- If you understand the papers below you’re good to go 🙂
Links
https://yuejiang-nj.github.io/Publications/2024UIST_EyeFormer/project_page/main.html
https://perceptualui.org/publications/sood23_cogsci/
https://www.sciencedirect.com/science/article/abs/pii/S1389041700000152
https://ieeexplore.ieee.org/document/8315047
https://predimportance.mit.edu/
https://www.nature.com/articles/s41598-022-17433-3
Contact:
Joao Belo, Zekun Wu
jbelo@cs.uni-saarland.de, wuzekun@cs.uni-saarland.de