Hadoop

Concept and definition

What is Hadoop?

Hadoop is an open-source software framework for the distributed storage and processing of large data sets on clusters of servers. It is maintained by the Apache Software Foundation and written primarily in Java. Hadoop stores and processes data in a distributed manner: data is split into blocks and processed in parallel across multiple machines, which allows very large volumes of data to be handled quickly and efficiently. Hadoop comprises several tools and components, most notably the Hadoop Distributed File System (HDFS) for storage and the MapReduce programming model for distributed data processing. It is widely used in big data and artificial intelligence applications.
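
To make the MapReduce model concrete, below is a minimal sketch of the classic word-count job written against Hadoop's Java MapReduce API. The map phase emits a (word, 1) pair for every word in the input, and the reduce phase sums the counts for each word; the input and output paths taken from the command line are placeholders for HDFS directories.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: each mapper receives one line of text and emits (word, 1)
  // for every token, so the work is spread across the cluster.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: the framework groups all values for the same word
  // together, so the reducer only has to sum them.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory (placeholder)
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory (placeholder)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Assuming the class is packaged into a JAR, a job like this is typically submitted with something like `hadoop jar wordcount.jar WordCount /user/input /user/output`, where both paths are HDFS directories; the exact invocation depends on the cluster configuration.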
