
Introduction to SAS® and Hadoop

$1,600 USD (GSA: $1,446.35)
Course Code DIAHSM
Available Formats Classroom

This course teaches you how to use SAS programming methods to read, write, and manipulate Hadoop data. Base SAS methods that are covered include reading and writing raw data with the DATA step, managing the Hadoop file system, and executing Pig code from SAS via the HADOOP procedure. In addition, the course discusses SAS/ACCESS Interface to Hadoop methods that provide LIBNAME access and SQL pass-through techniques for reading and writing Hadoop Hive table structures. Although not covered in detail, a brief overview of additional SAS and Hadoop technologies is included, such as executing DS2 in Hadoop with the SAS Code Accelerator for Hadoop and using Hadoop data sources with the distributed in-memory analytics platform of SAS Viya. This course is included in the Expert Exchange on Hadoop: Using SAS/ACCESS service offering, which configures SAS/ACCESS Interface to Hadoop or SAS/ACCESS Interface to Impala to work with your Hadoop environment.

Skills Gained

  • Read and write Hadoop files with the FILENAME statement.
  • Execute and use Hadoop commands with the HADOOP procedure.
  • Invoke the execution of Pig programs in Hadoop within a SAS program.
  • Access Hadoop distributions using the LIBNAME statement and the SQL pass-through facility.
  • Create and use SQL procedure pass-through queries.
  • Use options and efficiency techniques for optimizing data access performance.
  • Join data using the SQL procedure and the DATA step.
  • Use Base SAS procedures with Hadoop.
  • Modify DS2 programs to execute in-database in Hadoop.
  • Use data in Hadoop as disk storage for SAS Viya in-memory tables.
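As a taste of the first skill above, reading an HDFS file with the FILENAME statement might look like the sketch below. The configuration path, HDFS path, user name, and input layout are all hypothetical placeholders, not part of the course materials:

```sas
/* Illustrative sketch: cfg path, HDFS path, user, and columns are hypothetical. */
filename hdp hadoop '/user/student/orders.txt'
         cfg='/opt/sas/conf/hadoop-config.xml'
         user='student';

/* Read the delimited HDFS file with an ordinary DATA step */
data work.orders;
   infile hdp dlm=',' dsd;
   input order_id customer_id $ amount;
run;
```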

Who Can Benefit

  • SAS programmers who need to access data in Hadoop from within SAS

Prerequisites

  • Before attending this course, you should be comfortable programming in SAS and Structured Query Language (SQL). You can gain the required SAS programming knowledge from the SAS® Programming 1: Essentials course. You can gain the required knowledge of SQL from the SAS® SQL 1: Essentials course. A working knowledge of Hadoop is helpful.

Course Details

Introduction

  • What is Hadoop?
  • How SAS interfaces with Hadoop.

Accessing HDFS and Invoking Hadoop Applications from SAS

  • Overview of methods available in Base SAS for interacting with Hadoop.
  • Reading and writing Hadoop files using Base SAS methods.
  • Executing MapReduce code.
  • Executing Pig code using PROC HADOOP.
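The HADOOP procedure topics above can be sketched roughly as follows. The configuration file, HDFS paths, and Pig script name are hypothetical examples, assuming a configured Hadoop cluster:

```sas
/* Illustrative sketch: paths, user, and script names are hypothetical. */
proc hadoop cfg='/opt/sas/conf/hadoop-config.xml' username='student' verbose;
   /* HDFS file system commands */
   hdfs mkdir='/user/student/data';
   hdfs copyfromlocal='/tmp/orders.txt'
        out='/user/student/data/orders.txt';
   /* Submit a Pig script to the cluster */
   pig code='/tmp/summarize.pig';
run;
```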

Using the SQL Pass-Through Facility

  • Understanding the SQL procedure pass-through facility.
  • Connecting to a Hadoop Hive database.
  • Learning methods to query Hive tables.
  • Investigating Hadoop Hive metadata.
  • Creating SQL procedure pass-through queries.
  • Creating and loading Hive tables with SQL pass-through EXECUTE statements.
  • Handling Hive STRING data types.
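The pass-through pattern covered in this section follows the general shape sketched below. The server name, port, user, and table are hypothetical placeholders; the inner query and the EXECUTE statement run in Hive, not in SAS:

```sas
/* Illustrative sketch: connection options and table names are hypothetical. */
proc sql;
   connect to hadoop (server='hiveserver.example.com' port=10000 user='student');

   /* Explicit pass-through query: the inner SELECT executes in Hive */
   select * from connection to hadoop
      (select customer_id, sum(amount) as total_amount
         from orders
        group by customer_id);

   /* EXECUTE sends non-query statements (DDL/DML) to Hive */
   execute (create table order_copy stored as orc
            as select * from orders) by hadoop;

   disconnect from hadoop;
quit;
```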

Using the SAS/ACCESS LIBNAME Engine

  • Using the LIBNAME statement for Hadoop.
  • Using data set options.
  • Creating views.
  • Combining tables.
  • Benefits of the LIBNAME method.
  • Using the HDMD procedure to access delimited data, XML data, and other non-Hive formats.
  • Performance considerations for the SAS/ACCESS LIBNAME statement.
  • Copying data from a SAS library to a Hive library.
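The LIBNAME method covered here treats Hive tables like ordinary SAS librefs, as in the hedged sketch below. The server, schema, and table names are hypothetical; where possible, SAS generates HiveQL so that WHERE clauses and joins are processed in Hadoop:

```sas
/* Illustrative sketch: server, schema, and credentials are hypothetical. */
libname myhive hadoop server='hiveserver.example.com' port=10000
        user=student schema=sales;

/* Copy a SAS table into a Hive library */
data myhive.class;
   set sashelp.class;
run;

/* Implicit pass-through: SAS pushes the WHERE clause to Hive where it can */
proc sql;
   select name, age
     from myhive.class
    where age > 13;
quit;

libname myhive clear;
```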

Partitioning and Clustering Hive Tables

  • Identifying partitioning, clustering, and indexing methods in Hive.
  • Understanding how partitioning and clustering can increase query performance.
  • Creating and loading partitioned and clustered Hive tables.
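Creating a partitioned, clustered Hive table from SAS typically combines the pass-through EXECUTE technique from the earlier section with Hive DDL, roughly as sketched below (server, table, and column names are hypothetical):

```sas
/* Illustrative sketch: connection options and schema are hypothetical. */
proc sql;
   connect to hadoop (server='hiveserver.example.com' port=10000);

   /* Hive DDL: partition by year, cluster rows into buckets */
   execute (
      create table orders_part
         (order_id int, customer_id string, amount double)
      partitioned by (order_year int)
      clustered by (customer_id) into 8 buckets
      stored as orc
   ) by hadoop;

   /* Load one partition from an existing Hive table */
   execute (
      insert into table orders_part partition (order_year=2023)
      select order_id, customer_id, amount
        from orders
       where year(order_date) = 2023
   ) by hadoop;

   disconnect from hadoop;
quit;
```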

SAS Viya, Hadoop, and the SAS Code Accelerator for Hadoop

  • Loading and saving CAS tables to HDFS.
  • Executing DS2 threads in Hadoop to summarize data.
  • Using the HDMD procedure to access HDFS files.
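The DS2 thread pattern mentioned above can be sketched very roughly as follows. This is an illustrative outline only: the libref, table, and column names are hypothetical, and DS2ACCEL=YES merely requests in-database execution where the SAS In-Database Code Accelerator is licensed and configured:

```sas
/* Rough illustrative sketch: libref, table, and columns are hypothetical. */
proc ds2 ds2accel=yes;   /* request in-database execution where available */
   thread work.sum_th / overwrite=yes;
      dcl double total;
      method init();
         total = 0;
      end;
      method run();
         set myhive.orders;          /* Hive table via a LIBNAME engine */
         total = total + amount;
      end;
      method term();
         output;                     /* emit one summary row per thread */
      end;
   endthread;

   data work.totals / overwrite=yes;
      dcl thread work.sum_th t;
      method run();
         set from t;                 /* combine the per-thread results */
      end;
   enddata;
run;
quit;
```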

Overview of SAS In-Memory Analytics and the Code Accelerator for Hadoop (Self-Study)

  • Using high-performance procedures and the SASHDAT library engine.
  • Creating a SAS LASR Analytic Server session.
  • Using the SASIOLA engine.
  • Executing DS2 threads in the Hadoop cluster to summarize data.
  • Using PROC HDMD to access HDFS files.