IEEE VIS 2025 Content: Towards Difficulty-Aware Analysis of Deep Neural Networks

Towards Difficulty-Aware Analysis of Deep Neural Networks

Linhao Meng · Stef van den Elzen · Anna Vilanova

This paper will interest data scientists, ML practitioners, and visual analytics researchers involved in model development and evaluation. Practitioners can apply the proposed methods to better understand model behavior, identify difficult cases, and improve decision-making through human-in-the-loop analysis.
Keywords

Visualization, deep neural network, difficulty

Abstract

Traditional instance-based model analysis focuses mainly on misclassified instances. However, this approach overlooks the varying difficulty associated with different instances. Ideally, a robust model should recognize and reflect the challenges presented by intrinsically difficult instances. It is also valuable to investigate whether the difficulty perceived by the model aligns with that perceived by humans. To address this, we propose incorporating instance difficulty into the deep neural network evaluation process, focusing on supervised classification tasks on image data. Specifically, we consider difficulty measures from three perspectives -- data, model, and human -- to facilitate comprehensive evaluation and comparison. Additionally, we develop an interactive visual tool, DifficultyEyes, to support the identification of instances of interest based on various difficulty patterns and to aid in analyzing potential data or model issues. Case studies demonstrate the effectiveness of our approach.
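The abstract does not spell out the concrete difficulty measures, but a common model-perspective proxy for per-instance difficulty is the normalized entropy of the network's softmax output: confident predictions score near 0, near-uniform predictions near 1. A minimal sketch under that assumption (the function name and example values are illustrative, not taken from the paper):

```python
import numpy as np

def prediction_entropy(probs):
    """Normalized Shannon entropy of a softmax output vector.
    Returns a value in [0, 1]: 0 = fully confident prediction,
    1 = uniform distribution (maximally 'difficult' for the model).
    Illustrative model-side difficulty proxy, not the paper's measure."""
    probs = np.asarray(probs, dtype=float)
    eps = 1e-12  # avoid log(0)
    h = -np.sum(probs * np.log(probs + eps))
    return h / np.log(len(probs))  # normalize by max entropy

# A confident prediction scores low; a near-uniform one scores near 1.
confident = prediction_entropy([0.97, 0.01, 0.01, 0.01])
uncertain = prediction_entropy([0.25, 0.25, 0.25, 0.25])
```

Data- and human-perspective measures (e.g., annotator disagreement rates) could then be computed per instance and compared against such a model-side score to surface the alignment patterns the tool visualizes.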