## What is learning vector quantization?

The disadvantage of the k-Nearest Neighbors algorithm is that you need to keep the entire training data set. The Learning Vector Quantization algorithm (LVQ for short) is an artificial neural network algorithm that lets you choose how many training instances to hold onto and learns exactly what those instances should look like.

The representation for LVQ is a collection of codebook vectors. These are selected randomly at the start and are adapted to best summarize the training data set over multiple iterations of the learning algorithm. After learning, the codebook vectors can be used to make predictions just like k-Nearest Neighbors. The most similar neighbor (best matching codebook vector) is found by calculating the distance between each codebook vector and the new data instance; the class value (or real value in the case of regression) of the best matching unit is then returned as the prediction. Best results are achieved if the data is rescaled to the same range, for example between 0 and 1.
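The training and prediction procedure described above can be sketched in plain Python. Everything here (the function names, the linearly decaying learning rate, seeding the codebooks from the first training rows for determinism) is an illustrative assumption rather than code from the article; the random initialization the article mentions works as well:

```python
import random

def euclidean(a, b):
    # straight-line distance between two feature vectors
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def train_lvq1(data, n_codebooks=4, lrate=0.3, epochs=30, seed=0):
    # data: list of (features, label) pairs.
    # Seed codebooks from training rows; the article picks them randomly,
    # here we copy the first rows for a deterministic illustration.
    codebooks = [[list(feats), label] for feats, label in data[:n_codebooks]]
    rng = random.Random(seed)
    order = list(range(len(data)))
    for epoch in range(epochs):
        rate = lrate * (1.0 - epoch / epochs)  # decaying learning rate
        rng.shuffle(order)  # visit samples in a random order each pass
        for i in order:
            feats, label = data[i]
            # winner-take-all: only the best matching unit is updated
            best = min(codebooks, key=lambda c: euclidean(c[0], feats))
            for j, x in enumerate(feats):
                step = rate * (x - best[0][j])
                # pull the winner toward the sample if the labels match,
                # push it away otherwise
                best[0][j] += step if best[1] == label else -step
    return codebooks

def predict(codebooks, feats):
    # prediction is a 1-NN query over the learned codebook vectors
    return min(codebooks, key=lambda c: euclidean(c[0], feats))[1]
```

With two well-separated 2-D classes, `predict` returns the class of the best matching unit, exactly as a k-NN query would against the full training set, but storing only `n_codebooks` vectors.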

If you find that KNN gives good results on your dataset, try LVQ to reduce the memory required to store the entire training dataset.
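Since the distances above are sensitive to feature scale, the rescaling to [0, 1] recommended earlier can be done with a small min-max helper; the function name is a hypothetical example, not part of the article:

```python
def minmax_rescale(rows):
    # Rescale each feature column of a list of rows into the [0, 1] range,
    # so that no single feature dominates the distance calculation.
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [
        [(v - l) / (h - l) if h > l else 0.0
         for v, l, h in zip(row, lo, hi)]
        for row in rows
    ]
```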

## Baidu Encyclopedia version

Learning Vector Quantization (LVQ for short) belongs to the family of prototype clustering methods: it tries to find a set of prototype vectors, each representing one cluster, that divide the space into regions so that any sample can be assigned to the cluster of its closest prototype. The difference is that LVQ assumes the data samples carry category labels, so these labels can be used to assist the clustering.

## Wikipedia version

LVQ can be understood as a special case of an artificial neural network; more precisely, it applies a winner-take-all Hebbian learning approach. It is a precursor to self-organizing maps (SOM) and is related to the neural gas algorithm as well as to the k-nearest neighbor algorithm (k-NN). LVQ was invented by Teuvo Kohonen.
