Learning Vector Quantization (LVQ) is a supervised learning algorithm invented by Teuvo Kohonen and used in machine learning to classify data into a predefined set of classes. LVQ is a type of neural network that divides the feature space into regions, each corresponding to a different class.
The LVQ process assigns a weight vector (a prototype) to each node in the network. These weights are adjusted during the training phase so that the model classifies the data more effectively. The adjustment is based on a distance measure, just as in the k-NN classifier: through competitive learning, only the prototype closest to the training sample is updated, moving closer to the sample if their classes match and further away if they do not, which favours the classification results.
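Below is a minimal sketch of one such training step (the LVQ1 rule). Euclidean distance, the single learning rate `lr`, and the function names are assumptions made for illustration; they are not prescribed by the text.

```python
import numpy as np

def lvq1_update(prototypes, proto_labels, x, y, lr=0.05):
    """One LVQ1 step (sketch): find the prototype closest to the sample x
    and move it toward x if the classes match, away from x otherwise."""
    # Winner = prototype with the smallest Euclidean distance to x
    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
    # Attract the winner when its class matches y, repel it otherwise
    sign = 1.0 if proto_labels[winner] == y else -1.0
    prototypes[winner] += sign * lr * (x - prototypes[winner])
    return prototypes
```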
During the prediction phase, the model uses the trained prototypes to assign each new case the class of its nearest prototype.
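A matching prediction sketch, again assuming Euclidean distance and using hypothetical names, simply returns the label of the closest prototype for each new case:

```python
import numpy as np

def lvq_predict(prototypes, proto_labels, X):
    """Return, for every row of X, the label of its nearest prototype."""
    # Pairwise distances between the new cases and the prototypes
    distances = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return np.asarray(proto_labels)[np.argmin(distances, axis=1)]
```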
Reference: T. Kohonen, Learning Vector Quantization for Pattern Recognition.