Hadoop is an open-source software framework for the distributed processing of large data sets on clusters of servers. It is developed by the Apache Software Foundation and written in the Java programming language. Hadoop stores and processes data in a distributed manner: data sets are split into small blocks and processed in parallel on multiple servers, which allows very large volumes of data to be handled quickly and efficiently. The framework comprises several components, notably the Hadoop Distributed File System (HDFS) for storage and the MapReduce model for distributed data processing. It is widely used in big data and artificial intelligence applications.
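To make the MapReduce model concrete, below is a minimal word-count sketch in Java using the standard Hadoop MapReduce API (org.apache.hadoop.mapreduce). It is an illustrative example, not something prescribed by this article: the class names and the command-line input/output paths are assumptions. The map phase runs in parallel over splits of the input, and the reduce phase aggregates the partial results per key.

```java
// Minimal Hadoop MapReduce word-count sketch (illustrative; class and
// path names are assumptions, not from the article).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: each mapper receives one split of the input and emits
  // (word, 1) pairs; splits are processed in parallel across the cluster.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: all counts for the same word arrive at one reducer,
  // which sums them into the final total for that word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory, e.g. on HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory; must not exist yet
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, a job like this would typically be launched with something along the lines of `hadoop jar wordcount.jar WordCount /data/input /data/output`, where both paths (hypothetical here) refer to HDFS locations.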