Learning Vector Quantization (LVQ) is a supervised learning algorithm invented by Teuvo Kohonen, used in machine learning to classify data into a predefined set of classes. LVQ is a type of neural network that divides the feature space into regions, each corresponding to a different class.
The LVQ process involves assigning a weight vector (a prototype) to each node in the network. These weights are adjusted during the training phase so that the model classifies the data more effectively. The adjustment is based on a distance measure, as in the k-NN classifier. Through competitive learning, only the prototype closest to each training sample is updated: it moves toward the sample if they share the same class, and away from it otherwise, which favours correct classification.
During the prediction phase, the model uses the trained prototypes to assign a class to each new case presented to it: the case receives the label of its nearest prototype.
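The training and prediction steps described above can be sketched as follows. This is a minimal, illustrative LVQ1 implementation (the simplest LVQ variant), not Kohonen's exact code; the function names, the learning rate, and the initial prototypes are assumptions chosen for the example.

```python
import numpy as np

def lvq1_fit(X, y, prototypes, proto_labels, lr=0.1, epochs=20, seed=0):
    """LVQ1 training: for each sample, find the nearest prototype
    (competitive step) and move it toward the sample if the classes
    match, away from it otherwise."""
    rng = np.random.default_rng(seed)
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            d = np.linalg.norm(P - X[i], axis=1)  # distance to every prototype
            j = int(np.argmin(d))                 # index of the winning prototype
            sign = 1.0 if proto_labels[j] == y[i] else -1.0
            P[j] += sign * lr * (X[i] - P[j])     # attract (+) or repel (-)
    return P

def lvq1_predict(X, prototypes, proto_labels):
    """Assign each sample the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return proto_labels[np.argmin(d, axis=1)]

# Toy usage: two well-separated Gaussian clusters, one prototype per class.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 0.5, (20, 2)), rng.normal(3.0, 0.5, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
init_protos = np.array([[0.5, 0.5], [2.5, 2.5]])
proto_labels = np.array([0, 1])

P = lvq1_fit(X, y, init_protos, proto_labels)
pred = lvq1_predict(X, P, proto_labels)
accuracy = (pred == y).mean()
```

Note the contrast with k-NN mentioned above: k-NN keeps every training sample and defers all work to prediction time, whereas LVQ compresses the training set into a few prototypes, making prediction much cheaper.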
Reference: T. Kohonen. Learning vector quantization for pattern recognition.