English | 2022 | ISBN: 9781633439610 | 323 pages | PDF, EPUB | 19.56 MB
Make your deep learning models more generalized and adaptable! These practical regularization techniques improve training efficiency and help you avoid overfitting errors.

Regularization in Deep Learning includes:
- Insights into model generalizability
- A holistic overview of regularization techniques and strategies
- Classical and modern views of generalization, including the bias-variance tradeoff
- When and where to use different regularization techniques
- The background knowledge you need to understand cutting-edge research

Regularization in Deep Learning delivers practical techniques to help you build more general and adaptable deep learning models. It goes beyond basic techniques like data augmentation and explores strategies for architecture, objective function, and optimization. You'll turn regularization theory into practice using PyTorch, following guided implementations that you can easily adapt and customize for your own model's needs. Along the way, you'll get just enough of the theory and mathematics behind regularization to understand the new research emerging in this important area.

about the technology
Deep learning models that generate highly accurate results on their training data can struggle with messy real-world test datasets. Regularization strategies help overcome these errors with techniques that help your models handle noisy data and changing requirements. By learning to tweak training data and loss functions, and by employing other regularization approaches, you can ensure a model delivers excellent generalized performance and avoids overfitting errors.

about the book
Regularization in Deep Learning teaches you how to improve your model performance with a toolbox of regularization techniques. It covers both well-established regularization methods and groundbreaking modern approaches. Each technique is introduced using graphics, illustrations, and step-by-step coding walkthroughs that make complex math easy to follow. You'll learn how to augment your dataset with random noise, improve your model's architecture, and apply regularization in your optimization procedures. You'll soon be building focused deep learning models that avoid sprawling complexity and deliver more accurate results, even on new or messy datasets.

about the reader
For data scientists, machine learning engineers, and researchers with basic model development experience.

about the author
Peng Liu is an experienced data scientist focusing on applied research and development of high-performance machine learning models in production. He holds a Ph.D. in statistics from the National University of Singapore and teaches advanced analytics courses as an adjunct lecturer at universities. He specializes in the statistical aspects of deep learning.
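To give a flavor of the kind of regularization the book covers, here is a minimal PyTorch sketch (not taken from the book) combining two ideas mentioned above: perturbing training inputs with random noise and adding an L2 weight-decay penalty through the optimizer. All names, values, and the toy model (SimpleNet, noise_std, the dummy data) are illustrative assumptions, not the author's code.

```python
# Hypothetical sketch: input-noise augmentation + L2 weight decay + dropout.
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    def __init__(self, in_dim=20, hidden=64, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Dropout(p=0.2),   # dropout as an architectural regularizer
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.net(x)

model = SimpleNet()
loss_fn = nn.MSELoss()
# weight_decay adds an L2 penalty on the weights to the training objective
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

# Dummy data standing in for a real dataset
X = torch.randn(256, 20)
y = torch.randn(256, 1)

noise_std = 0.1  # strength of the input-noise augmentation (assumed value)
for epoch in range(10):
    model.train()
    # Data augmentation: perturb the inputs with fresh Gaussian noise each epoch
    X_noisy = X + noise_std * torch.randn_like(X)
    optimizer.zero_grad()
    loss = loss_fn(model(X_noisy), y)
    loss.backward()
    optimizer.step()
```

The noise level and weight-decay coefficient are tuning knobs; the book's own walkthroughs discuss when and how strongly to apply such techniques.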