AUTHORS: Bozorgtabar B, Vray G, Mahapatra D, Thiran JP

IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, pp. 3324-3333, Virtual, October 2021


ABSTRACT

The goal of out-of-distribution (OoD) detection is to identify unseen categories of inputs different from those seen during training, which is an important requirement for the safe deployment of deep neural networks in computational pathology. Additionally, for OoD detection to be usable in clinical applications, it must also cope with image data distribution shifts. This paper argues that practical OoD detection should handle both semantic shift and data distribution shift simultaneously. We propose a new self-supervised OoD detector for colorectal cancer tissue types based on a clustering scheme. Our work's central tenet benefits from multi-view consistency learning with a supplementary view based on style augmentation to mitigate domain shift. The learned representation is then adapted to minimize the images' predictive entropy, segregating in-distribution examples from OoDs on the target data domain. We evaluated our method on two public colorectal tissue datasets, where it achieved state-of-the-art OoD detection performance over various self-supervised baselines.
The code, data, and models are available at https://github.com/BehzadBozorgtabar/SOoD.
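The entropy-based separation mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' released SOoD code (see the repository above); it assumes a hypothetical trained classifier `model` and an illustrative, validation-chosen threshold, and simply scores each tissue patch by the entropy of its softmax prediction, treating high-entropy patches as OoD candidates.

```python
# Minimal sketch (not the authors' SOoD implementation): score samples by
# predictive entropy, where higher entropy suggests an OoD tissue patch.
import torch
import torch.nn.functional as F


@torch.no_grad()
def predictive_entropy_scores(model: torch.nn.Module, images: torch.Tensor) -> torch.Tensor:
    """Return per-sample predictive entropy H(p) = -sum_c p_c * log p_c."""
    logits = model(images)                               # (N, C) class logits
    probs = F.softmax(logits, dim=1)                      # (N, C) class probabilities
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
    return entropy                                        # (N,) higher -> more OoD-like


def flag_ood(scores: torch.Tensor, threshold: float) -> torch.Tensor:
    """Flag samples whose entropy exceeds a threshold chosen on held-out in-distribution data."""
    return scores > threshold
```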