MP4 | Video: AVC 1280x720 | Audio: AAC 44KHz 2ch | Duration: 2.5 Hours | Lec: 16 | 393 MB
Genre: eLearning | Language: English
Data Ingestion in Hadoop
This is a detailed course on Data Ingestion in Hadoop, the very first step towards analyzing Big Data. You will learn how to use Hadoop's file system commands to move data into and out of HDFS. Then we will move ahead and see how to transfer structured data between an RDBMS and HDFS using Apache Sqoop; we will cover many use cases, and I will give a live demo of each and every step. After that we will take one more step and see how to ingest log data using Apache Flume, and we will configure a data ingestion pipeline that fetches data from a social network such as Twitter and ingests it into HDFS.
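To give a flavor of the kind of commands the course works through, here is a minimal sketch of each step. The paths, database name, table names, credentials, and config file name below are made-up placeholders for illustration, not the exact ones used in the lectures.

# Copy a local file into HDFS and read it back (paths are hypothetical)
hdfs dfs -mkdir -p /user/demo/input
hdfs dfs -put access.log /user/demo/input/
hdfs dfs -ls /user/demo/input
hdfs dfs -get /user/demo/input/access.log ./copy_of_access.log

# Import a table from an RDBMS into HDFS with Sqoop (connection details are placeholders)
sqoop import \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username demo --password demo \
  --table customers \
  --target-dir /user/demo/customers \
  -m 1

# Export the imported data from HDFS back into an RDBMS table
sqoop export \
  --connect jdbc:mysql://localhost:3306/retail_db \
  --username demo --password demo \
  --table customers_copy \
  --export-dir /user/demo/customers

# Start a Flume agent from a properties file that defines the Twitter-to-HDFS pipeline
flume-ng agent --name TwitterAgent --conf ./conf --conf-file twitter.conf

Flume itself is driven by a small properties file describing sources, channels and sinks; setting that up for the Twitter source is what the Flume module of the course covers.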
During the course I will be coding in front of you and explaining everything as I go, so the concepts will become much clearer.
You are not required to know anything about Hadoop before joining, as this course is made on the assumption that students have no prior knowledge of Hadoop.
A little familiarity with SQL commands is helpful, but again it is not a requirement, as I will give you the commands as and when they are needed in the module dedicated to Sqoop. If SQL is your concern, nothing more than CREATE TABLE IF NOT EXISTS and SELECT * FROM table is required for this course.
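If you want to try those two statements ahead of time, something like the following is enough; the database, table, and column names here are just examples run through the MySQL command-line client.

# Create a throwaway table and read it back (names are placeholders)
mysql -u demo -p -e "CREATE TABLE IF NOT EXISTS employees (id INT, name VARCHAR(50));" demo_db
mysql -u demo -p -e "SELECT * FROM employees;" demo_db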
Java is not required, as there will not be any Java (or any other programming language) coding in this course.
Sqoop_and_Flume.part2.rar - 100.0 MB
Sqoop_and_Flume.part3.rar - 100.0 MB
Sqoop_and_Flume.part4.rar - 73.0 MB