Farhad Pakdaman

Academic rank: Assistant Professor
ORCID:
Education: PhD.
Scopus ID:
H-Index:
Faculty: Faculty of Technology and Engineering
Address:
Phone: 011-35302903

Research

Title: A Skill-Based Visual Attention Model for Cloud Gaming
Type: Journal Paper
Keywords: Cloud gaming, visual attention, eye tracking, video coding, perceptual video coding
Year: 2021
Journal: IEEE Access
DOI:
Researchers: Hamed Ahmadi, Saman Zadtootaghaj, Farhad Pakdaman, Mahmoud Reza Hashemi, Shervin Shirmohammadi

Abstract

Despite its recent advances and increasing industrial interest, cloud gaming's high bandwidth usage remains one of its major challenges. In this paper, we demonstrate how incorporating visual attention into cloud gaming helps reduce bitrate without negatively affecting the player's quality of experience. We show that current visual attention models, which work well for regular videos, underperform on cloud gaming videos. Hence, we propose a novel skill-based visual attention model, built on a cloud gaming dataset. First, we demonstrate that players' attention maps are correlated with their skill levels and that this correlation can be exploited to improve the accuracy of visual attention modeling. We then use this fact to cluster attention maps according to the player's skill level, and introduce a simple yet effective method to predict a player's skill level from their in-game performance. Finally, the models are incorporated into the video encoder to perceptually optimize the bitrate allocation. Incorporating the player's skill level into our model improves the accuracy of the predicted saliency maps by 14% over the baseline and by 24% over competing methods, in terms of Normalized Scanpath Saliency (NSS). Furthermore, we show that the maximum achievable bitrate reduction depends on the player's skill level: experimental results show 13%, 5%, and 15% bitrate reductions for beginner, intermediate, and expert players, respectively.
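
For context, NSS is a standard saliency-evaluation metric: the predicted saliency map is z-scored and then averaged at the recorded human fixation locations. The following is a minimal Python sketch of that standard computation, not the authors' code; saliency_map and fixation_map are assumed to be same-sized NumPy arrays, the latter a binary mask of fixated pixels.

    import numpy as np

    def nss(saliency_map, fixation_map):
        # Z-score the predicted saliency map, then average it at the
        # human fixation locations; higher values indicate a better match.
        s = (saliency_map - saliency_map.mean()) / saliency_map.std()
        return s[fixation_map.astype(bool)].mean()

A score near 0 corresponds to chance-level prediction, which is why the reported 14% and 24% NSS gains translate directly into more accurately placed fixation predictions.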
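
Likewise, a common way to realize saliency-driven bitrate allocation in an encoder is to bias the quantization parameter (QP) per block: salient blocks get a lower QP (finer quantization) and the background a higher one. The sketch below illustrates this general technique only, not the paper's exact scheme; block_saliency, base_qp, and max_offset are assumed, illustrative parameters.

    import numpy as np

    def qp_offsets(block_saliency, base_qp=32, max_offset=6):
        # Map per-block saliency in [0, 1] to QP offsets: saliency above
        # 0.5 lowers the QP (more bits), below 0.5 raises it (fewer bits).
        offsets = np.round(max_offset * (1.0 - 2.0 * block_saliency))
        qp = np.clip(base_qp + offsets, 0, 51)  # valid H.264/HEVC QP range
        return qp.astype(int)

Under such a scheme, the overall bitrate saving depends on how concentrated the predicted attention is, which is consistent with the paper's finding that the achievable reduction varies with player skill.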