Limited memory machines are machine learning models that use an external memory to store relevant information during the learning process. These models are applied to sequence learning problems that require maintaining a long-term memory of previous inputs. The external memory in a limited memory machine can be a vector, an array, or a random access memory (RAM). The memory is read and written through selective attention, which allows the model to focus on the relevant parts of the memory during prediction. Limited memory machines are used in natural language processing applications, such as machine translation and text generation, as well as in time series prediction tasks, such as stock market forecasting. A popular example of a limited memory machine is the Transformer neural network, which has been used successfully in a wide variety of natural language processing applications.
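The selective-attention read described above can be sketched in a few lines: the model compares a query vector against every memory slot and returns a similarity-weighted blend of the slots. This is a minimal illustration, not any particular library's API; the memory contents, the `attention_read` helper, and the example vectors are all hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_read(memory, query):
    """Read from an external memory matrix via selective attention.

    memory: (slots, dim) array of stored vectors
    query:  (dim,) vector produced by the model

    Returns a weighted sum of memory rows, where rows more similar
    to the query receive higher attention weights.
    """
    # Scaled dot-product similarity between the query and each slot.
    scores = memory @ query / np.sqrt(memory.shape[1])
    weights = softmax(scores)  # attention distribution over slots
    return weights @ memory    # blended read vector

# Hypothetical memory with 4 slots of dimension 3.
memory = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.0, 1.0],
                   [1.0, 1.0, 0.0]])
query = np.array([1.0, 0.0, 0.0])
read = attention_read(memory, query)
```

Because the first and last slots align with the query, the read vector is dominated by those slots; this "soft" lookup is what lets the model focus on relevant parts of the memory without a hard index.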