Conditional Contrastive Learning with Kernel
ICLR 2022 - 10th International Conference on Learning Representations
  • Yao-Hung Hubert Tsai, Carnegie Mellon University
  • Tianqin Li, Carnegie Mellon University
  • Martin Q. Ma, Carnegie Mellon University
  • Han Zhao, University of Illinois Urbana-Champaign
  • Kun Zhang, Carnegie Mellon University & Mohamed bin Zayed University of Artificial Intelligence
  • Louis-Philippe Morency, Carnegie Mellon University
  • Ruslan Salakhutdinov, Carnegie Mellon University
Document Type
Conference Proceeding
Abstract

Conditional contrastive learning frameworks consider the conditional sampling procedure that constructs positive or negative data pairs conditioned on specific variables. For example, fair contrastive learning constructs negative pairs from the same gender (conditioning on sensitive information), which in turn removes undesirable information from the learned representations; weakly supervised contrastive learning constructs positive pairs with similar annotative attributes (conditioning on auxiliary information), which in turn incorporates that information into the representations. Although conditional contrastive learning enables many applications, the conditional sampling procedure can be challenging if we cannot obtain sufficient data pairs for some values of the conditioning variable. This paper presents Conditional Contrastive Learning with Kernel (CCL-K), which converts existing conditional contrastive objectives into alternative forms that mitigate the insufficient-data problem. Instead of sampling data according to the value of the conditioning variable, CCL-K uses the Kernel Conditional Embedding Operator, which samples from all available data and assigns a weight to each sample based on the kernel similarity between the values of the conditioning variable. We conduct experiments on weakly supervised, fair, and hard-negative contrastive learning, showing that CCL-K outperforms state-of-the-art baselines.
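
As a rough illustration of the weighting idea described in the abstract, the sketch below replaces exact conditional sampling of negatives with an RBF kernel over the conditioning-variable values, so every candidate negative contributes in proportion to how similar its conditioning value is to the anchor's. This is a reader's sketch, not the authors' released implementation; the function names, the RBF kernel choice, and the bandwidth and temperature parameters are illustrative assumptions.

```python
# Minimal sketch (NumPy) of kernel-weighted conditional contrastive learning.
# Not the authors' code: rbf_kernel, ccl_k_loss, bandwidth are illustrative names.
import numpy as np

def rbf_kernel(z, bandwidth=1.0):
    """Pairwise RBF similarities between conditioning-variable values z of shape (n, k)."""
    sq_dists = np.sum((z[:, None, :] - z[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def ccl_k_loss(h_anchor, h_pos, z, temperature=0.1, bandwidth=1.0):
    """
    InfoNCE-style loss where each negative is weighted by the kernel similarity
    between its conditioning value and the anchor's, approximating sampling
    negatives that share the conditioning value (e.g. the fair-learning case).
    h_anchor, h_pos: (n, d) L2-normalized representations of two views.
    z: (n, k) conditioning-variable values (sensitive or auxiliary attributes).
    """
    sims = h_anchor @ h_pos.T / temperature        # (n, n) similarity logits
    weights = rbf_kernel(z, bandwidth)             # (n, n) kernel weights
    np.fill_diagonal(weights, 0.0)                 # diagonal holds the positive pair,
                                                   # left unweighted in this sketch
    pos = np.exp(np.diag(sims))                    # positive-pair terms
    neg = np.sum(weights * np.exp(sims), axis=1)   # kernel-weighted negative terms
    return float(np.mean(-np.log(pos / (pos + neg))))
```

With a degenerate kernel that is 1 only when two conditioning values match exactly, the weighted sum reduces to the usual conditional sampling; a smooth kernel lets pairs with nearby conditioning values contribute when exact matches are scarce.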

Publication Date
1-29-2022
Keywords
  • Contrastive Learning
  • Conditional Sampling
  • Kernel Methods
Comments

Archived thanks to OpenReview.net

Open Access

Uploaded 30 January 2024

Citation Information
Y.-H. H. Tsai et al., "Conditional Contrastive Learning with Kernel," in 10th Intl. Conf. on Learning Representations (ICLR 2022), [Online], April 2022.