Efficient Contrast Effect Compensation with Personalized Perception Models

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 34, No. 3, pp. 211-220
Main Authors: Mittelstädt, Sebastian; Keim, Daniel A.
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01.06.2015
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.12633

More Information
Summary: Color is one of the most effective visual variables and is frequently used to encode metric quantities. Contrast effects are considered harmful in data visualizations because they significantly bias our perception of colors. For instance, a gray patch appears brighter on a black background than on a white background. Accordingly, the perception of color-encoded data items depends on their surround in the rendered visualization. A previously presented method compensates for contrast effects and significantly improves users' accuracy in reading and comparing color-encoded data. That method utilizes established perception models, assuming an average human observer. In this paper, we provide experiments that show significant differences in perception between users. We introduce methods to personalize contrast effect compensation and show in a user study that they outperform the original method. We further overcome the major limitation of the original method, its runtime of several minutes: with efficient optimization and surrogate models, we reduce the runtime to milliseconds, making the method applicable in interactive visualizations.
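
As a rough illustration of the compensation idea summarized above, the following Python sketch shows the inversion principle, not the authors' method: it assumes a hypothetical linear induction model in which the perceived lightness of a patch is shifted away from the lightness of its surround by a factor k, and inverts that model so the patch is perceived at its intended value. The model, the constant K, and the function names are illustrative assumptions.

# Toy simultaneous-contrast model (an illustrative assumption, not the
# perception model used in the paper): perceived lightness is pushed
# away from the surround's lightness by an induction factor k.
K = 0.15  # hypothetical induction strength


def perceived_lightness(L_patch, L_surround, k=K):
    # Perceived lightness of a patch (CIELAB-like L*, 0..100 scale)
    # shown against a uniform surround of lightness L_surround.
    return L_patch + k * (L_patch - L_surround)


def compensate(L_target, L_surround, k=K):
    # Displayed lightness such that the patch is *perceived* at L_target:
    # solve L + k * (L - L_surround) = L_target for L.
    return (L_target + k * L_surround) / (1.0 + k)


# Example: render the same mid-gray (L* = 50) on a black and a white surround.
for L_surround in (0.0, 100.0):
    L_display = compensate(50.0, L_surround)
    print(f"surround L*={L_surround:5.1f}  display L*={L_display:5.2f}  "
          f"perceived L*={perceived_lightness(L_display, L_surround):5.2f}")

Under this toy model, the patch is rendered darker than its target on a black surround (counteracting the brightening induced by the dark surround) and lighter on a white surround, matching the gray-patch example in the summary.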