Abstract

The authors propose Contrastive Adapting, an efficient adapter training strategy that improves the group robustness of large pretrained foundation models (FMs) without finetuning them, yielding accuracy gains of up to 56.0 percentage points over zero-shot inference.
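
The abstract does not spell out the training procedure, so the following is only a rough illustrative sketch of the general idea: a lightweight adapter over frozen FM embeddings, trained with a contrastive-style objective that pulls each adapted sample embedding toward its ground-truth class embedding and away from the other classes. All names here (`ContrastiveAdapter`, the loss, the temperature `tau`) are hypothetical stand-ins, not the paper's actual method or hyperparameters.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContrastiveAdapter(nn.Module):
    """Hypothetical lightweight adapter: the FM stays frozen; only
    this small residual MLP over its embeddings is trained."""
    def __init__(self, dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim)
        )

    def forward(self, x):
        # Residual connection keeps adapted embeddings close to the originals.
        return F.normalize(x + self.net(x), dim=-1)

def contrastive_loss(sample_emb, class_emb, labels, tau: float = 0.07):
    """InfoNCE-style loss (an assumption): attract each adapted sample
    embedding to its class embedding, repel it from the other classes."""
    logits = sample_emb @ class_emb.t() / tau  # (batch, num_classes)
    return F.cross_entropy(logits, labels)

# Toy usage with random tensors standing in for frozen FM outputs.
dim, num_classes, batch = 512, 10, 32
class_emb = F.normalize(torch.randn(num_classes, dim), dim=-1)  # e.g. text embeddings
feats = F.normalize(torch.randn(batch, dim), dim=-1)            # frozen sample embeddings
labels = torch.randint(0, num_classes, (batch,))

adapter = ContrastiveAdapter(dim)
opt = torch.optim.Adam(adapter.parameters(), lr=1e-3)
loss = contrastive_loss(adapter(feats), class_emb, labels)
loss.backward()
opt.step()
```

Because only the adapter's parameters receive gradients, this kind of setup is cheap relative to finetuning the FM itself, which is consistent with the efficiency claim in the abstract.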