Introduction to Hadoop Development
Duration: 5 Days
You will learn how to use Apache Hadoop and write MapReduce programs. You will begin with a quick overview of installing Hadoop and setting it up in a cluster, and then proceed to writing data analytic programs. The course presents the basic concepts of MapReduce applications developed using Hadoop, including a close look at framework components, the use of Hadoop for a variety of data analysis tasks, and numerous examples of Hadoop in action. The course then examines related technologies such as Hive, Pig, and Apache Accumulo. Apache Accumulo is a highly scalable structured store based on Google's BigTable; it is written in Java and runs on the Hadoop Distributed File System (HDFS). Hive is data warehouse software for querying and managing large datasets. Pig is a platform for running data analysis jobs in parallel. Finally, you will see how Hadoop supports cloud computing, with examples using Amazon Web Services and case studies.
This class is focused on the Hadoop 2.0 (pre-)release.
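To give a flavor of the MapReduce model covered in the course, the sketch below counts words in plain Java, with no Hadoop cluster required. The class and method names are illustrative only and are not Hadoop's API; a real Hadoop job would instead extend the framework's Mapper and Reducer classes.

```java
import java.util.*;
import java.util.stream.*;

public class WordCountSketch {
    // Map phase: emit a (word, 1) pair for each word in an input line.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // Shuffle + reduce phase: group pairs by word and sum their counts.
    static Map<String, Integer> reduce(Stream<Map.Entry<String, Integer>> pairs) {
        return pairs.collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<String> input = List.of("the quick brown fox", "the lazy dog");
        Map<String, Integer> counts =
                reduce(input.stream().flatMap(WordCountSketch::map));
        System.out.println(counts.get("the")); // prints 2
    }
}
```

In a Hadoop deployment, the framework distributes the map and reduce phases across the cluster and handles the intermediate shuffle; the hands-on labs develop this pattern into full MapReduce jobs.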
Topics Covered In This Course
What is Hadoop?
Components of Hadoop
Writing basic MapReduce programs
Running Hadoop in the cloud
Programming with Pig
Overview of Hadoop-related technologies
Attendees should have solid Java development experience, including use of Eclipse or a similar IDE, as well as experience with JPA and data access. Exposure to the UNIX/Linux bash or tcsh shells is also helpful.
This course is approximately 40% lecture and 60% hands-on labs.
Every student attending a Verhoef Training class will receive a certificate good for $100 toward their next public class taken within a year.
You can also buy "Verhoef Vouchers" to get a discounted rate for a single student in any of our public or web-based classes. Contact your account manager or our sales office for details.
Can't find the course you want?
Call us at 800.533.3893, or
email us at firstname.lastname@example.org