
Master Big Data - Apache Spark/Hadoop/Sqoop/Hive/Flume


 

Video: .mp4 (1280x720, 30 fps) | Audio: AAC, 44100 Hz, 2ch | Size: 3.9 GB
Genre: eLearning Video | Duration: 86 lectures (8 hours 22 mins) | Language: English


In-depth course on Big Data: Apache Spark, Hadoop, Sqoop, Flume & Apache Hive, plus Big Data cluster setup


What you'll learn

Hadoop Distributed File System (HDFS) and its commands.

Lifecycle of a Sqoop command.

Sqoop import command to migrate data from MySQL to HDFS.

Sqoop import command to migrate data from MySQL to Hive.

Working with various file formats, compressions, file delimiters, where clauses and queries while importing the data.

Understanding split-by and boundary queries.

Using incremental mode to migrate data from MySQL to HDFS.

Using sqoop export, migrate data from HDFS to MySQL.

Using sqoop export, migrate data from Hive to MySQL.

Understanding Flume architecture.

Using Flume, ingest data from Twitter and save it to HDFS.

Using Flume, ingest data from netcat and save it to HDFS.

Using Flume, ingest data from an exec source and show it on the console.

Flume interceptors.

Requirements

None.

Description

In this course, you will start by learning what the Hadoop Distributed File System (HDFS) is and the most common Hadoop commands required to work with it.
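For instance, a few of the basic HDFS shell commands look like this (the paths here are illustrative and require a running Hadoop installation):

```sh
# List files in an HDFS directory
hadoop fs -ls /user/hadoop

# Create a directory and copy a local file into HDFS
hadoop fs -mkdir -p /user/hadoop/input
hadoop fs -put localfile.txt /user/hadoop/input/

# View a file's contents and check space usage
hadoop fs -cat /user/hadoop/input/localfile.txt
hadoop fs -du -h /user/hadoop/input
```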


Then you will be introduced to Sqoop Import

Understand the lifecycle of a Sqoop command.

Use the sqoop import command to migrate data from MySQL to HDFS.

Use the sqoop import command to migrate data from MySQL to Hive.

Use various file formats, compressions, file delimiters, where clauses and queries while importing the data.

Understand split-by and boundary queries.

Use incremental mode to migrate data from MySQL to HDFS.
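As a sketch of what these imports look like on the command line (the database, table, column names and paths are placeholders, and a reachable MySQL server and Hadoop cluster are assumed):

```sh
# Sqoop import from MySQL to HDFS: compressed Parquet output,
# tab-delimited splits on order_id, filtered with a where clause
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username sqoop_user \
  --password-file /user/hadoop/.sqoop.pwd \
  --table orders \
  --target-dir /user/hadoop/orders \
  --split-by order_id \
  --where "order_status = 'COMPLETE'" \
  --compress \
  --as-parquetfile

# Incremental append: pick up only rows whose order_id is beyond
# the last value imported previously
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username sqoop_user \
  --password-file /user/hadoop/.sqoop.pwd \
  --table orders \
  --target-dir /user/hadoop/orders \
  --incremental append \
  --check-column order_id \
  --last-value 68883
```

Adding `--hive-import` to the first command would load the data into a Hive table instead of a plain HDFS directory.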


Further, you will learn Sqoop Export to migrate data.

What is Sqoop export?

Using sqoop export, migrate data from HDFS to MySQL.

Using sqoop export, migrate data from Hive to MySQL.
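A minimal export sketch, mirroring the import example above (connection details and paths are placeholders; the target table must already exist in MySQL):

```sh
# Sqoop export: push tab-delimited HDFS data back into a MySQL table
sqoop export \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username sqoop_user \
  --password-file /user/hadoop/.sqoop.pwd \
  --table order_summary \
  --export-dir /user/hadoop/order_summary \
  --input-fields-terminated-by '\t'
```

Exporting a Hive table works the same way: point `--export-dir` at the table's warehouse directory in HDFS.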



Further, you will learn about Apache Flume

Understand Flume architecture.

Using Flume, ingest data from Twitter and save it to HDFS.

Using Flume, ingest data from netcat and save it to HDFS.

Using Flume, ingest data from an exec source and show it on the console.

Describe Flume interceptors and see examples of using them.

Flume with multiple agents.

Flume consolidation.
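As a sketch, a minimal Flume agent for the netcat-to-HDFS case wires a source, a channel and a sink together in a properties file (agent name, port and HDFS path are made up for illustration):

```properties
# netcat source -> memory channel -> HDFS sink
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/hadoop/flume/events
a1.sinks.k1.hdfs.fileType = DataStream

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

The agent would then be started with something like `flume-ng agent --name a1 --conf-file netcat-hdfs.conf`, after which lines typed into `nc localhost 44444` land in HDFS as events.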


In the next section, we will learn about Apache Hive

Hive Intro

External & Managed Tables

Working with Different File Formats - Parquet, Avro

Compressions

Hive Analysis

Hive String Functions

Hive Date Functions

Partitioning

Bucketing
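To give a flavour of these topics in HiveQL (all table, column and path names below are hypothetical):

```sql
-- External table over existing HDFS data: dropping the table
-- removes only the metadata, not the files
CREATE EXTERNAL TABLE orders (
  order_id INT,
  order_date STRING,
  order_status STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/user/hadoop/orders';

-- Managed table that is partitioned and bucketed, stored as Parquet
CREATE TABLE orders_part (
  order_id INT,
  order_date STRING
)
PARTITIONED BY (order_status STRING)
CLUSTERED BY (order_id) INTO 8 BUCKETS
STORED AS PARQUET;

-- Analysis query using a string function and a date function
SELECT upper(order_status), year(order_date), count(*)
FROM orders
GROUP BY upper(order_status), year(order_date);
```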


Finally, you will learn about Apache Spark

Spark Intro

Cluster Overview

RDD

DAG/Stages/Tasks

Actions & Transformations

Transformation & Action Examples

Spark DataFrames

Spark DataFrames - working with different file formats & compression

DataFrame APIs

Spark SQL

DataFrame Examples

Spark with Cassandra Integration
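A short PySpark sketch tying several of these ideas together (it assumes a Spark installation; the HDFS paths and column names are placeholders):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("intro").getOrCreate()
sc = spark.sparkContext

# RDD: map and filter are lazy transformations; the action collect()
# triggers the DAG to actually execute
rdd = sc.parallelize([1, 2, 3, 4, 5])
squares = rdd.map(lambda x: x * x).filter(lambda x: x > 5)
print(squares.collect())  # [9, 16, 25]

# DataFrames: read and write different file formats with compression
df = spark.read.parquet("/user/hadoop/orders")  # placeholder path
completed = df.filter(df.order_status == "COMPLETE")
completed.write.option("compression", "snappy").parquet("/user/hadoop/completed")

# Spark SQL over the same data via a temporary view
df.createOrReplaceTempView("orders")
spark.sql("SELECT order_status, count(*) FROM orders GROUP BY order_status").show()
```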


Who this course is for:

Anyone who wants to learn big data in detail.


 


Homepage: https://www.udemy.com/course/big-data-ingestion-using-sqoop-and-flume-cca-and-hdpcd/



