O'Reilly - Building Spark Applications - 9780134393490
by Jonathan Dinu | Released November 2015 | ISBN: 013439349X


13+ Hours of Video Instruction

Overview
Building Spark Applications LiveLessons provides data scientists and developers with a practical introduction to the Apache Spark framework using Python, R, and SQL. It also covers best practices for developing scalable Spark applications for predictive analytics in the context of a data scientist's standard workflow.

Description
In this video training, Jonathan starts with a brief history of Spark itself and shows you how to get started programming in a Spark environment on a laptop. Taking an application- and code-first approach, he then covers the various APIs in Python, R, and SQL to show how Spark makes large-scale data analysis much more accessible through languages familiar to data scientists and analysts alike. With the basics covered, the videos move into a real-world case study showing you how to explore data, process text, and build models with Spark. Throughout the process, Jonathan exposes the internals of the Spark framework itself to show you how to write better application code, optimize performance, and set up a cluster that fully leverages the distributed nature of Spark. After watching these videos, data scientists and developers will feel confident building an end-to-end application with Spark to perform machine learning and do data analysis at scale.

About the Instructor
Jonathan Dinu is the founder of Zipfian Academy, an advanced immersive training program for data scientists and data engineers in San Francisco, and served as its CAO/CTO before it was acquired by Galvanize, where he is now the VP of Academic Excellence. He first discovered his love of all things data while studying Computer Science and Physics at UC Berkeley, and in a former life he worked for Alpine Data Labs developing distributed machine learning algorithms for predictive analytics on Hadoop. Jonathan is a dedicated educator, author, and speaker with a passion for sharing the things he has learned in the most creative ways he can. He has run data science workshops at Strata and PyData (among others), built a Data Visualization course with Udacity, and served on the UC Berkeley Extension Data Science Advisory Board. He is currently writing a book on practical data science applications using Python. When he is not working with students, you can find him blogging about data, visualization, and education at http://hopelessoptimism.com/.

Skill Level
Beginning/Intermediate

What You Will Learn
    • How to install and set up a Spark environment locally and on a cluster
    • The differences between and the strengths of the Python, R, and SQL programming interfaces
    • How to build a machine learning model for text
    • Common data science use cases that Spark is especially well suited to solve
    • How to tune a Spark application for performance
    • The internals of the Spark framework and its execution model
    • How to use Spark in a data science application workflow
    • The basics of the larger Spark ecosystem

Who Should Take This Course
    • Practicing data scientists who already use Python or R and want to learn how to scale up their analyses with Spark.
    • Data engineers who already use Java/Scala for Spark but want to learn about the Python, R, and SQL APIs and understand how Spark can be used to solve data science problems.

Course Requirements
A basic understanding of programming. Familiarity with the data science process and machine learning is a plus.
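Before Lesson 1 walks through the full installation, here is a minimal local-mode sketch of a first PySpark program. It is illustrative only and not taken from the course materials; it assumes PySpark is already installed on your machine and simply runs one transformation and one action against a local SparkContext.

    # Illustrative sketch (not from the course): assumes PySpark is installed locally.
    from pyspark import SparkContext

    sc = SparkContext("local[*]", "first-spark-job")  # run on all local cores
    numbers = sc.parallelize(range(1000000))          # distribute a Python range as an RDD
    evens = numbers.filter(lambda n: n % 2 == 0)      # transformation: defined lazily
    print(evens.count())                              # action: triggers execution -> 500000
    sc.stop()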
Lesson Descriptions

Lesson 1: Introduction to the Spark Environment
Lesson 1, "Introduction to the Spark Environment," introduces Spark and provides context for the history of and motivation for the framework. This lesson covers how to install and set up Spark locally, work with the Spark REPL and Jupyter notebook, and the basics of programming with Spark.

Lesson 2: Spark Programming APIs
Lesson 2, "Spark Programming APIs," covers each of the various Spark programming interfaces. This lesson highlights the differences between and the tradeoffs of the Python (PySpark), R (SparkR), and SQL (Spark SQL and DataFrames) APIs, as well as the typical workflows for which each is best suited.

Lesson 3: Your First Spark Application
Lesson 3, "Your First Spark Application," walks you through a case study with DonorsChoose.org data, showing how Spark fits into the typical data science workflow. This lesson covers how to perform exploratory data analysis at scale, apply natural language processing techniques, and write an implementation of the k-means algorithm for unsupervised learning on text data.

Lesson 4: Spark Internals
Lesson 4, "Spark Internals," peels back the layers of the framework and walks you through how Spark executes code in a distributed fashion. This lesson starts with a primer on distributed systems theory before diving into the Spark execution context, the details of RDDs, and how to run Spark in cluster mode on Amazon EC2. The lesson finishes with best practices for monitoring and tuning the performance of a Spark application.

Lesson 5: Advanced Applications
Lesson 5, "Advanced Applications," takes you through a KDD Cup competition, showing you how to leverage Spark's higher-level machine learning libraries (MLlib and spark.ml). The lesson covers the basics of machine learning theory, shows you how to evaluate the performance of models through cross validation, and demonstrates how to build a machine learning pipeline with Spark. The lesson finishes by showing you how to serialize and deploy models for use in a production setting.

About LiveLessons Video Training
The LiveLessons Video Training series publishes hundreds of hands-on, expert-led video tutorials covering a wide selection of technology topics designed to teach you the skills you need to succeed. This professional and personal technology video series features world-leading author instructors published by your trusted technology brands: Addison-Wesley, Cisco Press, IBM Press, Pearson IT Certification, Prentice Hall, Sams, and Que. Topics include IT certification, programming, web development, mobile development, home and office technologies, business and management, and more. View all LiveLessons on InformIT at: http://www.informit.com/livelessons.
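To make the lesson descriptions above more concrete, here is a small, self-contained sketch of the kind of spark.ml text pipeline that Lessons 3 and 5 build with real DonorsChoose.org and KDD Cup data. It is illustrative only, not from the course: the toy texts and labels are made up, and it assumes a Spark release with SparkSession and the DataFrame-based spark.ml API.

    # Illustrative sketch (not from the course): a tiny text-classification pipeline in spark.ml.
    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import Tokenizer, HashingTF
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.master("local[*]").appName("pipeline-sketch").getOrCreate()

    # Toy training data: made-up "essay" texts with binary labels
    train = spark.createDataFrame(
        [("we need microscopes for our science class", 1.0),
         ("a field trip to the local history museum", 0.0)],
        ["text", "label"])

    pipeline = Pipeline(stages=[
        Tokenizer(inputCol="text", outputCol="words"),      # split text into tokens
        HashingTF(inputCol="words", outputCol="features"),  # hashed term-frequency vectors
        LogisticRegression(maxIter=10)])                    # supervised model

    model = pipeline.fit(train)                             # fit all stages at once
    model.transform(train).select("text", "prediction").show(truncate=False)
    spark.stop()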
  1. Introduction
    • Building Spark Applications LiveLessons: Introduction 00:05:03
  2. Lesson 1: Introduction to the Spark Environment
    • Topics 00:00:49
    • 1.1 Getting the Materials 00:02:40
    • 1.2 A Brief Historical Diversion 00:07:17
    • 1.3 Origins of the Framework 00:07:23
    • 1.4 Why Spark? 00:19:12
    • 1.5 Getting Set Up: Spark and Java 00:09:48
    • 1.6 Getting Set Up: Scientific Python 00:05:08
    • 1.7 Getting Set Up: R Kernel for Jupyter 00:09:11
    • 1.8 Your First PySpark Job 00:18:04
    • 1.9 Introduction to RDDs: Functions, Transformations, and Actions 00:23:06
    • 1.10 MapReduce with Spark: Programming with Key-Value Pairs 00:17:16
  3. Lesson 2: Spark Programming APIs
    • Topics 00:01:02
    • 2.1 Introduction to the Spark Programming APIs 00:10:53
    • 2.2 PySpark: Loading and Importing Data 00:19:31
    • 2.3 PySpark: Parsing and Transforming Data 00:09:41
    • 2.4 PySpark: Analyzing Flight Delays 00:20:52
    • 2.5 SparkR: Introduction to DataFrames 00:20:33
    • 2.6 SparkR: Aggregations and Analysis 00:08:33
    • 2.7 SparkR: Visualizing Data with ggplot2 00:09:41
    • 2.8 Why (Spark) SQL? 00:03:42
    • 2.9 Spark SQL: Adding Structure to Your Data 00:31:47
    • 2.10 Spark SQL: Integration into Existing Workflows 00:04:42
  4. Lesson 3: Your First Spark Application
    • Topics 00:01:10
    • 3.1 How Spark Fits into the Data Science Process 00:14:29
    • 3.2 Introduction to Exploratory Data Analysis 00:10:09
    • 3.3 Case Study: DonorsChoose.org 00:17:40
    • 3.4 Data Quality Checks with Accumulators 00:18:49
    • 3.5 Making Sense of Data: Summary Statistics and Distributions 00:14:51
    • 3.6 Working with Text: Introduction to NLP 00:07:43
    • 3.7 Tokenization and Vectorization with Spark 00:17:53
    • 3.8 Summarization with tf-idf 00:20:17
    • 3.9 Introduction to Machine Learning 00:20:47
    • 3.10 Unsupervised Learning with Spark: Implementing k-means 00:24:04
    • 3.11 Testing k-means with DonorsChoose.org Essays 00:09:15
    • 3.12 Challenges of k-means: Latent Features, Interpretation, and Validation 00:21:38
  5. Lesson 4: Spark Internals
    • Topics 00:00:55
    • 4.1 Introduction to Distributed Systems 00:15:56
    • 4.2 Building Systems That Scale 00:11:37
    • 4.3 The Spark Execution Context 00:10:08
    • 4.4 RDD Deep Dive: Dependencies and Lineage 00:11:49
    • 4.5 A Day in the Life of a Spark Application 00:14:01
    • 4.6 How Code Runs: Stages, Tasks, and the Shuffle 00:13:21
    • 4.7 Spark Deployment: Local and Cluster Modes 00:20:50
    • 4.8 Setting Up Your Own Cluster 00:22:36
    • 4.9 Spark Performance: Monitoring and Optimization 00:09:25
    • 4.10 Tuning Your Spark Application 00:20:08
    • 4.11 Making Spark Fly: Parallelism 00:07:34
    • 4.12 Making Spark Fly: Caching 00:13:05
  6. Lesson 5: Advanced Applications
    • Topics 00:00:53
    • 5.1 Machine Learning on Spark: MLlib and spark.ml 00:13:40
    • 5.2 The KDD Cup Competition: Preparing Data and Imputing Values 00:22:43
    • 5.3 Introduction to Supervised Learning: Logistic Regression 00:17:36
    • 5.4 Building a Model with MLlib 00:13:09
    • 5.5 Model Evaluation and Metrics 00:14:41
    • 5.6 Leveraging scikit-learn to Evaluate MLlib Models 00:21:37
    • 5.7 Training Models with spark.ml 00:16:07
    • 5.8 Machine Learning Pipelines with spark.ml 00:11:03
    • 5.9 Tuning Models: Features, Cross Validation, and Grid Search 00:13:43
    • 5.10 Serializing and Deploying Models 00:08:22
  7. Summary
    • Building Spark Applications LiveLessons: Summary 00:08:06

    O'Reilly - Building Spark Applications

    9780134393490.building.spark.applications.part1.OR.rar

    9780134393490.building.spark.applications.part2.OR.rar

    9780134393490.building.spark.applications.part3.OR.rar


TO MAC USERS: If the RAR password doesn't work, use the archive program RAR Expander 0.8.5 Beta 4 to extract the password-protected files without errors.

TO WIN USERS: If the RAR password doesn't work, use the latest version of WinRAR to extract the password-protected files without errors.

