Self-supervised learning is a type of supervised learning in which the training labels are derived automatically from the input data itself, rather than provided by separate human annotation.
word2vec and similar word embeddings are a good example of self-supervised learning. word2vec models predict a word from its surrounding words (the CBOW architecture) or the surrounding words from a given word (skip-gram). Unlike in “traditional” supervised learning, the class labels are not separate from the input data: they are other parts of the text itself.
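A minimal sketch of the idea, showing how skip-gram-style training pairs are generated: every (center, context) label comes directly from the raw text, with no external annotation. The function name and window size here are illustrative, not part of any word2vec implementation.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) training pairs from a token list.

    The 'labels' (context words) are taken from the input text
    itself -- the self-supervised part.
    """
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs("the quick brown fox".split(), window=1))
# → [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#    ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

A real word2vec model would then learn embeddings by predicting the context word from the center word (or vice versa) over millions of such pairs.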
Autoencoders are another example of self-supervised learning: they are trained to compress their inputs into a smaller representation and then reconstruct the original input from it, so the input doubles as the training target.
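As a toy illustration of that training setup (a sketch, not a practical model — real autoencoders use multi-dimensional nonlinear encoders and decoders), here is a one-dimensional linear autoencoder with a tied weight, trained by gradient descent. The defining self-supervised property is visible in the loss: the reconstruction target is the input itself.

```python
def train_autoencoder(xs, steps=200, lr=0.01):
    """Fit a 1-D tied-weight linear autoencoder: x -> w*x -> w*(w*x).

    Minimizes the reconstruction error (w*w*x - x)^2, where the
    'label' for each example is the example itself.
    """
    w = 0.5  # single shared encoder/decoder weight
    for _ in range(steps):
        grad = 0.0
        for x in xs:
            z = w * x            # encode
            x_hat = w * z        # decode
            # d/dw of (x_hat - x)^2 with x_hat = w^2 * x
            grad += 2 * (x_hat - x) * 2 * w * x
        w -= lr * grad / len(xs)
    return w

w = train_autoencoder([1.0, 2.0, 3.0])
# Perfect reconstruction requires w*w = 1, so w converges toward 1.
```

The optimum is reached when the encode–decode round trip is the identity, which is exactly the "reconstruct your own input" objective described above.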