Course Outline

Section 1: Introduction to Hadoop

  • History and core concepts of Hadoop
  • Ecosystem overview
  • Available distributions
  • High-level architecture
  • Common myths about Hadoop
  • Challenges in Hadoop implementation
  • Hardware and software requirements
  • Lab: First look at Hadoop

Section 2: HDFS

  • Design and architecture
  • Core concepts (horizontal scaling, replication, data locality, rack awareness)
  • Daemons: NameNode, Secondary NameNode, DataNode
  • Communication and heartbeats
  • Data integrity mechanisms
  • Read and write paths
  • NameNode High Availability (HA) and Federation
  • Lab: Interacting with HDFS
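The core HDFS concepts above (block splitting, replication, rack awareness) can be sketched in plain Java. This is an illustrative simplification, not Hadoop source code: the class, method, and node names are invented for the example, and the placement rule shown is a simplified form of the default policy (first replica on the writer's node, second and third on a different rack).

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch (not Hadoop code): splitting a file into fixed-size
// blocks and placing three replicas with a simplified rack-aware policy.
public class HdfsConcepts {
    static final long BLOCK_SIZE = 128L * 1024 * 1024; // default HDFS block size

    // Number of blocks needed to store a file of the given length.
    static long blockCount(long fileLength) {
        return (fileLength + BLOCK_SIZE - 1) / BLOCK_SIZE;
    }

    // Simplified default placement: replica 1 on the writer's node,
    // replicas 2 and 3 on two different nodes in a remote rack.
    // Node names here are invented for the example.
    static List<String> placeReplicas(String writerNode, String remoteRack) {
        List<String> replicas = new ArrayList<>();
        replicas.add(writerNode);             // replica 1: local node
        replicas.add(remoteRack + "/node-a"); // replica 2: remote rack
        replicas.add(remoteRack + "/node-b"); // replica 3: same remote rack
        return replicas;
    }

    public static void main(String[] args) {
        long fileLength = 300L * 1024 * 1024; // a 300 MB file
        System.out.println("blocks = " + blockCount(fileLength));
        System.out.println(placeReplicas("rack1/node-1", "rack2"));
    }
}
```

A 300 MB file at the default 128 MB block size occupies three blocks, each replicated three times.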

Section 3: MapReduce

  • Concepts and architecture
  • Daemons (MRv1): JobTracker and TaskTracker
  • Execution phases: driver, mapper, shuffle/sort, reducer
  • MapReduce Version 1 vs. Version 2 (YARN)
  • MapReduce internals
  • Introduction to writing Java MapReduce programs
  • Lab: Running a sample MapReduce program
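The execution phases listed above can be seen end to end in the classic word-count example. The sketch below simulates the mapper, shuffle/sort, and reducer phases in plain Java (no cluster or Hadoop libraries needed) purely to show the data flow; a real job would implement Hadoop's `Mapper` and `Reducer` classes instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Word count with the three MapReduce phases simulated in plain Java.
public class WordCountPhases {

    // Map phase: emit a (word, 1) pair for every word in every input line.
    static List<Map.Entry<String, Integer>> map(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines)
            for (String word : line.toLowerCase().split("\\s+"))
                if (!word.isEmpty())
                    pairs.add(Map.entry(word, 1));
        return pairs;
    }

    // Shuffle/sort phase: group values by key, keys in sorted order
    // (a TreeMap stands in for the framework's sort-and-group step).
    static Map<String, List<Integer>> shuffle(List<Map.Entry<String, Integer>> pairs) {
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs)
            grouped.computeIfAbsent(p.getKey(), k -> new ArrayList<>()).add(p.getValue());
        return grouped;
    }

    // Reduce phase: sum the grouped counts for each word.
    static Map<String, Integer> reduce(Map<String, List<Integer>> grouped) {
        Map<String, Integer> counts = new TreeMap<>();
        grouped.forEach((word, ones) ->
            counts.put(word, ones.stream().mapToInt(Integer::intValue).sum()));
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = List.of("to be or not to be");
        System.out.println(reduce(shuffle(map(input))));
        // prints {be=2, not=1, or=1, to=2}
    }
}
```

On a real cluster the map tasks run in parallel near the data (data locality), and the shuffle moves intermediate pairs across the network to the reducers.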

Section 4: Pig

  • Pig vs. Java MapReduce
  • Pig job flow
  • Pig Latin language
  • ETL processes with Pig
  • Transformations and Joins
  • User-defined functions (UDFs)
  • Lab: Writing Pig scripts to analyze data
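As a taste of the Pig Latin language covered above, here is the word-count example again as a Pig script. The input path is a placeholder; each line is a relation, and the GROUP/COUNT pair replaces the hand-written shuffle and reduce logic of a Java job.

```pig
-- Illustrative word count in Pig Latin; the input path is a placeholder.
lines  = LOAD 'input/sample.txt' AS (line:chararray);
words  = FOREACH lines GENERATE FLATTEN(TOKENIZE(line)) AS word;
grpd   = GROUP words BY word;
counts = FOREACH grpd GENERATE group AS word, COUNT(words) AS n;
DUMP counts;
```

Five lines of Pig Latin here do the work of the full Java MapReduce program, which is the trade-off the "Pig vs. Java MapReduce" topic explores.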

Section 5: Hive

  • Architecture and design
  • Data types
  • SQL support in Hive
  • Creating Hive tables and querying
  • Partitions
  • Joins
  • Text processing
  • Lab: Processing data with Hive
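The table creation, partitioning, and join topics above look like the following in HiveQL. The table and column names are invented for illustration; the point is that partitioning by a column lets Hive prune the scan to matching HDFS directories.

```sql
-- Illustrative HiveQL; table and column names are invented.
CREATE TABLE page_views (
  user_id BIGINT,
  url     STRING
)
PARTITIONED BY (view_date STRING)
STORED AS ORC;

-- Filtering on the partition column prunes the scan to one partition.
SELECT u.name, COUNT(*) AS views
FROM   page_views pv
JOIN   users u ON pv.user_id = u.id
WHERE  pv.view_date = '2024-01-01'
GROUP  BY u.name;
```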

Section 6: HBase

  • Concepts and architecture
  • HBase vs. RDBMS vs. Cassandra
  • HBase Java API
  • Time series data in HBase
  • Schema design
  • Lab: Interacting with HBase using the shell; programming with the HBase Java API; schema design exercise
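A short HBase shell session illustrates the time-series schema-design idea above: the table, column family, and row keys are invented for the example, but the pattern of embedding the device id and timestamp in the row key is a common time-series design, since rows are stored sorted by key.

```
# Illustrative HBase shell session; table and row-key names are invented.
create 'readings', 'd'
put 'readings', 'device1#2024-01-01T12:00', 'd:temp', '21.5'
put 'readings', 'device1#2024-01-01T12:05', 'd:temp', '21.7'
# A prefix scan retrieves one device's readings in time order.
scan 'readings', {ROWPREFIXFILTER => 'device1#'}
```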

Requirements

  • Proficiency in the Java programming language (as most programming exercises are conducted in Java)
  • Familiarity with the Linux environment (ability to navigate the Linux command line and edit files using vi or nano)

Lab environment

Zero Install: No need to install Hadoop software on your personal machine! A fully operational Hadoop cluster will be provided for use.

Participants will need the following:

  • An SSH client (Linux and Mac systems include built-in SSH clients; PuTTY is recommended for Windows)
  • A web browser to access the cluster (Firefox is recommended)

Duration

28 hours
