Vector-Quantized Variational Autoencoders (VQ-VAE)

The Vector-Quantized Variational Autoencoder (VQ-VAE) is a type of variational autoencoder whose latent representation is discrete rather than continuous: the encoder's continuous output vectors are mapped (quantized) to the nearest entries in a fixed-size, learned codebook of embedding vectors.
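The quantization step described above can be sketched as a nearest-neighbor lookup against the codebook. This is a minimal illustration with NumPy, not the paper's full training procedure (which also involves a straight-through gradient estimator and codebook/commitment losses); the function and variable names here are illustrative.

```python
import numpy as np

def quantize(z_e, codebook):
    """Map each continuous encoder output to its nearest codebook entry.

    z_e:      (N, D) array of continuous encoder output vectors.
    codebook: (K, D) array of learned embedding vectors.
    Returns (indices, z_q): the discrete codes and the quantized vectors
    that would be passed to the decoder.
    """
    # Squared Euclidean distance from every encoder vector to every code.
    dists = ((z_e[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = dists.argmin(axis=1)   # discrete latent codes, shape (N,)
    z_q = codebook[indices]          # quantized vectors, shape (N, D)
    return indices, z_q

# Toy example: 4 encoder outputs, a codebook of 3 entries, dimension 2.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(3, 2))
z_e = rng.normal(size=(4, 2))
idx, z_q = quantize(z_e, codebook)
```

Each row of `z_q` is an exact copy of some codebook row, so the latent space the decoder sees contains only K distinct vectors.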

The VQ-VAE was originally introduced in the paper Neural Discrete Representation Learning by van den Oord et al. at DeepMind.