Hortonworks is the only 100 percent open source software provider to develop, distribute, and support an Apache Hadoop platform explicitly architected, built, and tested for enterprise-grade deployments.
Students will master core concepts of the Hadoop Distributed File System (HDFS), learn Apache Hive and advanced Apache Hive programming concepts, and learn how to use HCatalog, join datasets in Apache Hive, and run common HDFS commands.
They will gain practical experience importing and exporting RDBMS data to and from HDFS, analyzing clickstream data, and analyzing stock market data using quantiles. With our cloud labs, students get hands-on experience running a YARN application, using Apache Hive, joining datasets with Apache Pig, and starting an HDP cluster.
Understand Hadoop and the Hadoop Distributed File System (HDFS)
List Common HDFS Commands
List the Six Key Hadoop Data Types
Distinguish between Relational Databases and Hadoop
Understand the Purpose of NameNodes, DataNodes, and the Map and Reduce Phases of MapReduce
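As a preview of the HDFS commands covered, a few common file operations look like the sketch below (the directory and file names are illustrative only; a running HDP cluster with the `hdfs` client installed is assumed):

```shell
# List the contents of a directory in HDFS
hdfs dfs -ls /user/student

# Create a directory and copy a local file into it
hdfs dfs -mkdir -p /user/student/data
hdfs dfs -put clickstream.log /user/student/data/

# Print a file stored in HDFS, showing only the first few lines
hdfs dfs -cat /user/student/data/clickstream.log | head

# Copy a file back out of HDFS to the local filesystem
hdfs dfs -get /user/student/data/clickstream.log ./local-copy.log
```

The same subcommands (`-ls`, `-mkdir`, `-put`, `-cat`, `-get`) cover most day-to-day interaction with HDFS and are exercised in the cloud labs.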