Limited memory machines are machine learning models that use an external memory to store relevant information during the learning process. They are applied to sequence learning problems in which the model must maintain a long-term memory of previous inputs. The external memory can be a vector, an array, or a random-access memory (RAM). The memory is read and written through selective attention, which lets the model focus on the relevant parts of the memory when making a prediction. Limited memory machines are used in natural language processing applications, such as machine translation and text generation, as well as in time series prediction tasks, such as stock market forecasting. A popular example is the Transformer neural network, which has been applied successfully to a wide variety of natural language processing tasks.
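To make the idea of reading an external memory with selective attention concrete, here is a minimal sketch in plain Python/NumPy (an illustrative example, not the code of any specific model or library): a query vector is compared against every memory slot, the scores are normalized with a softmax, and the read result is the attention-weighted sum of the slots.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_read(memory, query):
    """Selective-attention read from an external memory.

    memory: array of shape (slots, dim), one row per memory slot.
    query:  array of shape (dim,), what the model is currently looking for.
    """
    # Similarity of the query to every memory slot (scaled dot product).
    scores = memory @ query / np.sqrt(memory.shape[1])
    # Attention weights: how much each slot contributes to the read.
    weights = softmax(scores)
    # The read vector is the weighted combination of all memory slots.
    return weights @ memory

# Hypothetical usage: 8 memory slots of dimension 4.
rng = np.random.default_rng(0)
memory = rng.normal(size=(8, 4))
query = rng.normal(size=4)
read_vector = attention_read(memory, query)
print(read_vector.shape)  # (4,)
```

Writing to the memory works analogously: the same attention weights (or a separate set) decide how strongly each slot is updated with new information.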