O'Reilly - Deep Learning with Python Video Edition
by François Chollet | Released November 2017 | ISBN: 9781617294433VE
"The clearest explanation of deep learning I have come across...it was a joy to read." Richard Tobias, Cephasonics Deep Learning with Python introduces the field of deep learning using the Python language and the powerful Keras library. Written by Keras creator and Google AI researcher François Chollet, this book builds your understanding through intuitive explanations and practical examples. Machine learning has made remarkable progress in recent years. We went from near-unusable speech and image recognition, to near-human accuracy. We went from machines that couldn't beat a serious Go player, to defeating a world champion. Behind this progress is deep learning—a combination of engineering advances, best practices, and theory that enables a wealth of previously impossible smart applications. Inside: Deep learning from first principles Setting up your own deep-learning environment Image-classification models Deep learning for text and sequences Neural style transfer, text generation, and image generation This Video Editions book requires intermediate Python skills. No previous experience with Keras, TensorFlow, or machine learning is required. François Chollet works on deep learning at Google in Mountain View, CA. He is the creator of the Keras deep-learning library, as well as a contributor to the TensorFlow machine-learning framework. He also does deep-learning research, with a focus on computer vision and the application of machine learning to formal reasoning. His papers have been published at major conferences in the field, including the Conference on Computer Vision and Pattern Recognition (CVPR), the Conference and Workshop on Neural Information Processing Systems (NIPS), the International Conference on Learning Representations (ICLR), and others. An excellent hands-on introductory title, with great depth and breadth. David Blumenthal-Barby, Babbel Bridges the gap between the hype and a functioning deep-learning system. Peter Rabinovitch, Akamai The best resource for becoming a master of Keras and deep learning. Claudio Rodriguez, Cox Media Group NARRATED BY MARK THOMAS Show and hide more
- PART 1: THE FUNDAMENTALS OF DEEP LEARNING
- Chapter 1. What is deep learning? 00:08:40
- Chapter 1. Learning representations from data 00:09:40
- Chapter 1. Understanding how deep learning works, in three figures 00:05:14
- Chapter 1. Don’t believe the short-term hype 00:07:04
- Chapter 1. Before deep learning: a brief history of machine learning 00:08:49
- Chapter 1. Decision trees, random forests, and gradient boosting machines 00:10:56
- Chapter 1. Why deep learning? Why now? 00:08:58
- Chapter 1. A new wave of investment 00:06:45
- Chapter 2. Before we begin: the mathematical building blocks of neural networks 00:08:52
- Chapter 2. Data representations for neural networks 00:08:59
- Chapter 2. Real-world examples of data tensors 00:07:25
- Chapter 2. The gears of neural networks: tensor operations 00:05:56
- Chapter 2. Tensor dot 00:07:20
- Chapter 2. The engine of neural networks: gradient-based optimization 00:09:33
- Chapter 2. Stochastic gradient descent 00:08:35
- Chapter 2. Looking back at our first example 00:04:01
- Chapter 3. Getting started with neural networks 00:10:04
- Chapter 3. Introduction to Keras 00:07:31
- Chapter 3. Setting up a deep-learning workstation 00:07:26
- Chapter 3. Classifying movie reviews: a binary classification example 00:10:12
- Chapter 3. Validating your approach 00:05:49
- Chapter 3. Classifying newswires: a multiclass classification example 00:10:34
- Chapter 3. Predicting house prices: a regression example 00:10:21
- Chapter 4. Fundamentals of machine learning 00:10:21
- Chapter 4. Evaluating machine-learning models 00:08:44
- Chapter 4. Data preprocessing, feature engineering, and feature learning 00:08:28
- Chapter 4. Overfitting and underfitting 00:06:58
- Chapter 4. Adding weight regularization 00:06:33
- Chapter 4. The universal workflow of machine learning 00:06:49
- Chapter 4. Developing a model that does better than a baseline 00:07:32
- PART 2: DEEP LEARNING IN PRACTICE
- Chapter 5. Deep learning for computer vision 00:04:06
- Chapter 5. The convolution operation 00:08:36
- Chapter 5. The max-pooling operation 00:04:31
- Chapter 5. Training a convnet from scratch on a small dataset 00:08:06
- Chapter 5. Data preprocessing 00:08:54
- Chapter 5. Using a pretrained convnet 00:12:57
- Chapter 5. Fine-tuning 00:06:34
- Chapter 5. Visualizing what convnets learn 00:07:47
- Chapter 5. Visualizing convnet filters 00:09:47
- Chapter 6. Deep learning for text and sequences 00:09:08
- Chapter 6. Using word embeddings 00:12:03
- Chapter 6. Putting it all together: from raw text to word embeddings 00:06:05
- Chapter 6. Understanding recurrent neural networks 00:07:49
- Chapter 6. Understanding the LSTM and GRU layers 00:09:23
- Chapter 6. Advanced use of recurrent neural networks 00:07:41
- Chapter 6. A common-sense, non-machine-learning baseline 00:06:50
- Chapter 6. Using recurrent dropout to fight overfitting 00:10:42
- Chapter 6. Going even further 00:03:59
- Chapter 6. Sequence processing with convnets 00:05:21
- Chapter 6. Combining CNNs and RNNs to process long sequences 00:06:39
- Chapter 7. Advanced deep-learning best practices 00:07:46
- Chapter 7. Multi-input models 00:04:13
- Chapter 7. Directed acyclic graphs of layers 00:09:48
- Chapter 7. Layer weight sharing 00:04:31
- Chapter 7. Inspecting and monitoring deep-learning models using Keras callbacks and TensorBoard 00:05:58
- Chapter 7. Introduction to TensorBoard: the TensorFlow visualization framework 00:06:29
- Chapter 7. Getting the most out of your models 00:07:39
- Chapter 7. Hyperparameter optimization 00:06:02
- Chapter 7. Model ensembling 00:08:35
- Chapter 8. Generative deep learning 00:06:53
- Chapter 8. A brief history of generative recurrent networks 00:08:33
- Chapter 8. Implementing character-level LSTM text generation 00:05:55
- Chapter 8. DeepDream 00:07:37
- Chapter 8. Neural style transfer 00:06:41
- Chapter 8. Neural style transfer in Keras 00:07:04
- Chapter 8. Generating images with variational autoencoders 00:03:58
- Chapter 8. Variational autoencoders 00:09:45
- Chapter 8. Introduction to generative adversarial networks 00:05:59
- Chapter 8. A bag of tricks 00:08:18
- Chapter 9. Conclusions 00:06:08
- Chapter 9. How to think about deep learning 00:09:38
- Chapter 9. Key network architectures 00:08:42
- Chapter 9. The space of possibilities 00:04:21
- Chapter 9. The limitations of deep learning 00:05:44
- Chapter 9. Local generalization vs. extreme generalization 00:04:57
- Chapter 9. The future of deep learning 00:09:35
- Chapter 9. Automated machine learning 00:09:12
- Chapter 9. Staying up to date in a fast-moving field 00:05:34