Overview

AWS Hadoop Fundamentals introduces you to the basics of big data and how Hadoop as a framework handles it. This course discusses Hadoop architectures and how large sets of data are stored and processed.

The course explains several tools used in the process: MapReduce, Hive, and Pig. The course also examines Hadoop as part of the AWS big data ecosystem.

Invest in your team’s cloud future: master AWS with 600+ courses, labs, and interactive experiences, and elevate your team’s cloud expertise with an AWS Skill Builder Team Subscription.

Skills Covered

In this course, you will learn how to:

  • Describe the Hadoop framework and the tools used with it
  • Explain what MapReduce is and how it processes data
  • Explain how the Hive data warehouse system is leveraged with Hadoop
  • Identify the components of Hive and Pig
  • Describe the Pig Latin query language
  • Recognize how Hadoop fits into the AWS big data ecosystem
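To give a feel for the MapReduce model mentioned above, here is a minimal sketch in plain Python of the classic word-count example. The function names and data are illustrative only; they are not the Hadoop API, which the course itself covers in depth.

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in the input
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/reduce: group pairs by key and sum the counts per word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Toy input standing in for a distributed dataset
lines = ["big data big ideas", "hadoop handles big data"]
result = reduce_phase(map_phase(lines))
print(result["big"])  # 3
```

In a real Hadoop cluster, the map and reduce phases run in parallel across many nodes, with the framework handling the shuffle, sorting, and fault tolerance between them.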

Who Should Attend

This course is intended for:

  • Anyone interested in learning the fundamental concepts of Hadoop, MapReduce, Hive, and Pig
  • Individuals responsible for designing and implementing big data solutions

Course Curriculum

Prerequisites

We recommend that attendees of this course have the following prerequisites:

  • Basic familiarity with big data workloads

Training Options

Intake: On-demand Learning
Duration: 90 Mins
Guaranteed: GTR (Guaranteed to Run)
Modality: SPVC (Self-Paced Virtual Class)

Exam & Certification

No associated certification.