In focus

    About HDFS

    HDFS stands for Hadoop Distributed File System, the primary storage system of Apache Hadoop. It is designed to run on standard or low-end commodity hardware, where server failures are common, so it is built to be highly fault tolerant. HDFS provides high data throughput, efficient access for processing frameworks such as MapReduce, and native support for very large data sets. Its main purpose is to facilitate the rapid transfer of data between compute nodes and to keep a Hadoop system running when a node fails. An HDFS deployment consists of one or more clusters, each accessed through a single NameNode: a software component, typically installed on a separate machine, that monitors and manages that cluster's file system and regulates client access to it.
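
    To make the client/NameNode interaction concrete, below is a minimal sketch using Hadoop's Java FileSystem API. The NameNode address hdfs://namenode.example.com:8020 and the path /tmp/hello.txt are placeholder assumptions, not values from the text.

        import java.nio.charset.StandardCharsets;

        import org.apache.hadoop.conf.Configuration;
        import org.apache.hadoop.fs.FSDataInputStream;
        import org.apache.hadoop.fs.FSDataOutputStream;
        import org.apache.hadoop.fs.FileSystem;
        import org.apache.hadoop.fs.Path;

        public class HdfsExample {
            public static void main(String[] args) throws Exception {
                // Point the client at the cluster's NameNode
                // (hypothetical hostname and port).
                Configuration conf = new Configuration();
                conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

                try (FileSystem fs = FileSystem.get(conf)) {
                    Path file = new Path("/tmp/hello.txt");

                    // Write a small file; the NameNode records the metadata
                    // while the data blocks land on DataNodes.
                    try (FSDataOutputStream out = fs.create(file, true)) {
                        out.write("Hello, HDFS!".getBytes(StandardCharsets.UTF_8));
                    }

                    // Read the file back through the same FileSystem handle.
                    try (FSDataInputStream in = fs.open(file)) {
                        byte[] buf = new byte[64];
                        int n = in.read(buf);
                        System.out.println(new String(buf, 0, n, StandardCharsets.UTF_8));
                    }
                }
            }
        }

    Note that the client contacts the NameNode only for metadata; the block reads and writes themselves go to DataNodes, whose replicated copies are what allow the cluster to keep serving data when an individual node fails.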