Weight sharing

In neural networks, weight sharing is a technique in which the same set of weights is reused across multiple connections or input positions, reducing the number of parameters while allowing for more robust feature detection. The canonical example is the convolutional layer, where a single kernel is applied at every position of the input, so a feature can be detected regardless of where it appears. Because it reduces the number of parameters, weight sharing can be considered a form of model compression.
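
A minimal sketch of the idea, using NumPy and an illustrative 1D convolution (the function name and toy values are assumptions, not from the source): one small kernel is reused at every input position, so the layer needs far fewer parameters than a dense layer mapping the same inputs to the same outputs.

```python
import numpy as np

def conv1d_shared(x, kernel):
    """Slide one shared kernel across the input: every output position
    reuses the same len(kernel) weights (weight sharing)."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

# Toy input and a single shared 3-tap kernel (a simple edge detector).
x = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0])
kernel = np.array([-1.0, 0.0, 1.0])

y = conv1d_shared(x, kernel)
print(y)  # every output value is produced by the same 3 weights

# Parameter count compared with a dense layer mapping the same
# 7 inputs to the same 5 outputs:
shared_params = kernel.size      # 3
dense_params = x.size * y.size   # 7 * 5 = 35
print(shared_params, dense_params)
```

In this sketch the shared kernel uses 3 parameters where an equivalent fully connected mapping would need 35, which is the sense in which weight sharing acts as model compression.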