DOCUMENT INFORMATION
Basic information
Format
Pages: 18
Size: 5.52 MB
Contents
EXPERT INSIGHT

Python Machine Learning, Third Edition
Machine Learning and Deep Learning with Python, scikit-learn, and TensorFlow
Includes TensorFlow 2, GANs, and Reinforcement Learning

Sebastian Raschka and Vahid Mirjalili

Packt
BIRMINGHAM - MUMBAI

Python Machine Learning, Third Edition

Copyright 2019 Packt Publishing. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

Acquisition Editor: Jonathan Malysiak
Acquisition Editor - Peer Reviews: Suresh Jain
Content Development Editors: Joanne Lovell, Chris Nelson
Technical Editor: Saby Dsilva
Project Editor: Radhika Atitkar
Proofreader: Safis Editing
Indexer: Tejal Daruwale Soni
Presentation Designer: Sandip Tadge

First published: September 2015
Second edition: September 2017
Third edition: December 2019

Production reference: 1091219

Published by Packt Publishing Ltd.
Livery Place, 35 Livery Street
Birmingham B3 2PB, UK

ISBN 978-1-78995-575-0

www.packt.com

Subscribe to our online digital library for full access to over 7,000 books
and videos, as well as industry-leading tools to help you plan your personal development and advance your career. For more information, please visit our website.

Why subscribe?

Spend less time learning and more time coding with practical eBooks and videos from over 4,000 industry professionals
Learn better with Skill Plans built especially for you
Get a free eBook or video every month
Fully searchable for easy access to vital information
Copy and paste, print, and bookmark content

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.packt.com and, as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at customercare@packtpub.com for more details.

At www.packt.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.

Contributors

About the authors

Sebastian Raschka received his doctorate from Michigan State University, where he focused on developing methods at the intersection of computational biology and machine learning. In the summer of 2018, he joined the University of Wisconsin-Madison as Assistant Professor of Statistics. His research activities include the development of new deep learning architectures to solve problems in the field of biometrics.

Sebastian has many years of experience with coding in Python and has given several seminars on the practical applications of data science, machine learning, and deep learning over the years, including a machine learning tutorial at SciPy, the leading conference for scientific computing in Python.

Among Sebastian's achievements is his book Python Machine Learning, which is a bestselling title at Packt and on Amazon.com. The book received the ACM Best of Computing award in 2016 and was translated into many different languages, including German, Korean, Chinese, Japanese, Russian,
Polish, and Italian.

In his free time, Sebastian loves to contribute to open source projects, and methods that he implemented are now successfully used in machine learning competitions such as Kaggle.

I would like to take this opportunity to thank the great Python community and the developers of open source packages who helped me create the perfect environment for scientific research and data science. Also, I want to thank my parents, who always encouraged and supported me in pursuing the path and career that I was so passionate about. Special thanks to the core developers of scikit-learn and TensorFlow. As a contributor and user, I had the pleasure of working with great people who are not only very knowledgeable when it comes to machine learning and deep learning but are also excellent programmers.

Vahid Mirjalili obtained his PhD in mechanical engineering working on novel methods for large-scale, computational simulations of molecular structures at Michigan State University. Being passionate about the field of machine learning, he joined the iPRoBe lab at Michigan State University, where he worked on applying machine learning in the computer vision and biometrics domains. After several productive years at the iPRoBe lab and many years in academia, Vahid recently joined 3M Company as a research scientist, where he can use his expertise and apply state-of-the-art machine learning and deep learning techniques to solve real-world problems in various applications to make life better.

I would like to thank my wife, Taban Eslami, who has been very supportive and encouraged me on my career path. Also, special thanks to my advisors, Nikolai Priezjev, Michael Feig, and Arun Ross, for supporting me during my PhD studies, as well as my professors, Vishnu Boddeti, Leslie Kuhn, and Xiaoming Liu, who have taught me so much and encouraged me to pursue my passion.

About the reviewers

Raghav Bali is a senior data scientist at one of the world's largest healthcare organizations. His work involves
the research and development of enterprise-level solutions based on machine learning, deep learning, and natural language processing for healthcare- and insurance-related use cases. In his previous role at Intel, he was involved in enabling proactive data-driven IT initiatives using natural language processing, deep learning, and traditional statistical methods. He has also worked in the finance domain with American Express, solving digital engagement and customer retention use cases.

Raghav has also authored multiple books with leading publishers, the most recent being on the latest advancements in transfer learning research. Raghav has a master's degree (gold medalist) in information technology from the International Institute of Information Technology, Bangalore. Raghav loves reading and is a shutterbug, capturing moments when he isn't busy solving problems.

Motaz Saad holds a PhD in computer science from the University of Lorraine. He loves data and he likes to play with it. He has over 10 years of professional experience in natural language processing, computational linguistics, data science, and machine learning. He currently works as an assistant professor at the faculty of Information Technology, IUG.

Table of Contents

Preface

Chapter 1: Giving Computers the Ability to Learn from Data
    Building intelligent machines to transform data into knowledge
    The three different types of machine learning
    Making predictions about the future with supervised learning
    Classification for predicting class labels
    Regression for predicting continuous outcomes
    Solving interactive problems with reinforcement learning
    Discovering hidden structures with unsupervised learning
    Finding subgroups with clustering
    Dimensionality reduction for data compression
    Introduction to the basic terminology and notations
    Notation and conventions used in this book
    Machine learning terminology
    A roadmap for building machine learning systems
    Using Python for machine learning
    Preprocessing – getting data into shape
    Training and selecting a predictive model
    Evaluating models and predicting unseen data instances
    Installing Python and packages from the Python Package Index
    Using the Anaconda Python distribution and package manager
    Packages for scientific computing, data science, and machine learning
    Summary

Chapter 2: Training Simple Machine Learning Algorithms for Classification
    Artificial neurons – a brief glimpse into the early history of machine learning
    The formal definition of an artificial neuron
    The perceptron learning rule
    Implementing a perceptron learning algorithm in Python
    An object-oriented perceptron API
    Training a perceptron model on the Iris dataset
    Adaptive linear neurons and the convergence of learning
    Minimizing cost functions with gradient descent
    Implementing Adaline in Python
    Improving gradient descent through feature scaling
    Large-scale machine learning and stochastic gradient descent
    Summary

Chapter 3: A Tour of Machine Learning Classifiers Using scikit-learn
    Choosing a classification algorithm
    First steps with scikit-learn – training a perceptron
    Modeling class probabilities via logistic regression
    Logistic regression and conditional probabilities
    Learning the weights of the logistic cost function
    Converting an Adaline implementation into an algorithm for logistic regression
    Training a logistic regression model with scikit-learn
    Tackling overfitting via regularization
    Maximum margin classification with support vector machines
    Maximum margin intuition
    Dealing with a nonlinearly separable case using slack variables
    Alternative implementations in scikit-learn
    Solving nonlinear problems using a kernel SVM
    Kernel methods for linearly inseparable data
    Using the kernel trick to find separating hyperplanes in a high-dimensional space
    Decision tree learning
    Maximizing IG – getting the most bang for your buck
    Building a decision tree
    Combining multiple decision trees via random forests
    K-nearest neighbors – a lazy learning algorithm
    Summary

Chapter 4: Building Good Training Datasets – Data Preprocessing
    Dealing with missing data
    Identifying missing values in tabular data
    Eliminating training examples or features with missing values
    Imputing missing values
    Understanding the scikit-learn estimator API
    Handling categorical data
    Categorical data encoding with pandas
    Mapping ordinal features
    Encoding class labels
    Performing one-hot encoding on nominal features
    Partitioning a dataset into separate training and test datasets
    Bringing features onto the same scale
    Selecting meaningful features
    L1 and L2 regularization as penalties against model complexity
    A geometric interpretation of L2 regularization
    Sparse solutions with L1 regularization
    Sequential feature selection algorithms
    Assessing feature importance with random forests
    Summary

Chapter 5: Compressing Data via Dimensionality Reduction
    Unsupervised dimensionality reduction via principal component analysis
    The main steps behind principal component analysis
    Extracting the principal components step by step
    Total and explained variance
    Feature transformation
    Principal component analysis in scikit-learn
    Supervised data compression via linear discriminant analysis
    Principal component analysis versus linear discriminant analysis
    The inner workings of linear discriminant analysis
    Computing the scatter matrices
    Selecting linear discriminants for the new feature subspace
    Projecting examples onto the new feature space
    LDA via scikit-learn
    Using kernel principal component analysis for nonlinear mappings
    Kernel functions and the kernel trick
    Implementing a kernel principal component analysis in Python
    Example – separating half-moon shapes
    Example – separating concentric circles
    Projecting new data points
    Kernel principal component analysis in scikit-learn
    Summary

Chapter 6: Learning Best Practices for Model Evaluation and Hyperparameter Tuning
    Streamlining workflows with pipelines
    Loading the Breast Cancer Wisconsin dataset
    Combining transformers and estimators in a pipeline
    Using k-fold cross-validation to assess model performance
    The holdout method
    K-fold cross-validation
    Debugging algorithms with learning and validation curves
    Diagnosing bias and variance problems with learning curves
    Addressing over- and underfitting with validation curves
    Fine-tuning machine learning models via grid search
    Tuning hyperparameters via grid search
    Algorithm selection with nested cross-validation
    Looking at different performance evaluation metrics
    Reading a confusion matrix
    Optimizing the precision and recall of a classification model
    Plotting a receiver operating characteristic
    Scoring metrics for multiclass classification
    Dealing with class imbalance
    Summary

Chapter 7: Combining Different Models for Ensemble Learning
    Learning with ensembles
    Combining classifiers via majority vote
    Implementing a simple majority vote classifier
    Using the majority voting principle to make predictions
    Evaluating and tuning the ensemble classifier
    Bagging – building an ensemble of classifiers from bootstrap samples
    Bagging in a nutshell
    Applying bagging to classify examples in the Wine dataset
    Leveraging weak learners via adaptive boosting
    How boosting works
    Applying AdaBoost using scikit-learn
    Summary

Chapter 8: Applying Machine Learning to Sentiment Analysis
    Preparing the IMDb movie review data for text processing
    Obtaining the movie review dataset
    Preprocessing the movie dataset into a more convenient format
    Introducing the bag-of-words model
    Transforming words into feature vectors
    Assessing word relevancy via term frequency-inverse document frequency
    Cleaning text data
    Processing documents into tokens
    Training a logistic regression model for document classification
    Working with bigger data – online algorithms and out-of-core learning
    Topic modeling with Latent Dirichlet Allocation
    Decomposing text documents with LDA
    LDA with scikit-learn
    Summary

Chapter 9: Embedding a Machine Learning Model into a Web Application
    Serializing fitted scikit-learn estimators
    Setting up an SQLite database for data storage
    Developing a web application with Flask
    Our first Flask web application
    Form validation and rendering
    Setting up the directory structure
    Implementing a macro using the Jinja2 templating engine
    Adding style via CSS
    Creating the result page
    Turning the movie review classifier into a web application
    Files and folders – looking at the directory tree
    Implementing the main application as app.py
    Setting up the review form
    Creating a results page template
    Deploying the web application to a public server
    Creating a PythonAnywhere account
    Uploading the movie classifier application
    Updating the movie classifier
    Summary

Chapter 10: Predicting Continuous Target Variables with Regression Analysis
    Introducing linear regression
    Simple linear regression
    Multiple linear regression
    Exploring the Housing dataset
    Loading the Housing dataset into a data frame
    Visualizing the important characteristics of a dataset
    Looking at relationships using a correlation matrix
    Implementing an ordinary least squares linear regression model
    Solving regression for regression parameters with gradient descent
    Estimating the coefficient of a regression model via scikit-learn
    Fitting a robust regression model using RANSAC
    Evaluating the performance of linear regression models
    Using regularized methods for regression
    Turning a linear regression model into a curve – polynomial regression
    Adding polynomial terms using scikit-learn
    Modeling nonlinear relationships in the Housing dataset
    Dealing with nonlinear relationships using random forests
    Decision tree regression
    Random forest regression
    Summary

Chapter 11: Working with Unlabeled Data – Clustering Analysis
    Grouping objects by similarity using k-means
    K-means clustering using scikit-learn
    A smarter way of placing the initial cluster centroids using k-means++
    Hard versus soft clustering
    Using the elbow method to find the optimal number of clusters
    Quantifying the quality of clustering via silhouette plots
    Organizing clusters as a hierarchical tree
    Grouping clusters in bottom-up fashion
    Performing hierarchical clustering on a distance matrix
    Attaching dendrograms to a heat map
    Applying agglomerative clustering via scikit-learn
    Locating regions of high density via DBSCAN
    Summary

Chapter 12: Implementing a Multilayer Artificial Neural Network from Scratch
    Modeling complex functions with artificial neural networks
    Single-layer neural network recap
    Introducing the multilayer neural network architecture
    Activating a neural network via forward propagation
    Classifying handwritten digits
    Obtaining and preparing the MNIST dataset
    Implementing a multilayer perceptron
    Computing the logistic cost function
    Developing your understanding of backpropagation
    Training neural networks via backpropagation
    Training an artificial neural network
    About the convergence in neural networks
    A few last words about the neural network implementation
    Summary

Chapter 13: Parallelizing Neural Network Training with TensorFlow
    TensorFlow and training performance
    Performance challenges
    What is TensorFlow?
    How we will learn TensorFlow
    First steps with TensorFlow
    Installing TensorFlow
    Creating tensors in TensorFlow
    Manipulating the data type and shape of a tensor
    Applying mathematical operations to tensors
    Split, stack, and concatenate tensors
    Building input pipelines using tf.data – the TensorFlow Dataset API
    Creating a TensorFlow Dataset from existing tensors
    Combining two tensors into a joint dataset
    Shuffle, batch, and repeat
    Creating a dataset from files on your local storage disk
    Fetching available datasets from the tensorflow_datasets library
    Building an NN model in TensorFlow
    The TensorFlow Keras API (tf.keras)
    Building a linear regression model
    Model training via the compile() and fit() methods
    Building a multilayer perceptron for classifying flowers in the Iris dataset
    Evaluating the trained model on the test dataset
    Saving and reloading the trained model
    Choosing activation functions for multilayer neural networks
    Logistic function recap
    Estimating class probabilities in multiclass classification via the softmax function
    Broadening the output spectrum using a hyperbolic tangent
    Rectified linear unit activation
    Summary

Chapter 14: Going Deeper – The Mechanics of TensorFlow
    The key features of TensorFlow
    TensorFlow's computation graphs: migrating to TensorFlow v2
    Understanding computation graphs
    Creating a graph in TensorFlow v1.x
    Migrating a graph to TensorFlow v2
    Loading input data into a model: TensorFlow v1.x style
    Loading input data into a model: TensorFlow v2 style
    Improving computational performance with function decorators
    TensorFlow Variable objects for storing and updating model parameters
    Computing gradients via automatic differentiation and GradientTape
    Computing the gradients of the loss with respect to trainable variables
    Computing gradients with respect to non-trainable tensors
    Keeping resources for multiple gradient computations
    Simplifying implementations of common architectures via the Keras API
    Solving an XOR classification problem
    Making model building more flexible with Keras' functional API
    Implementing models based on Keras' Model class
    Writing custom Keras layers
    TensorFlow Estimators
    Working with feature columns
    Machine learning with pre-made Estimators
    Using Estimators for MNIST handwritten digit classification
    Creating a custom Estimator from an existing Keras model
    Summary

Chapter 15: Classifying Images with Deep Convolutional Neural Networks
    The building blocks of CNNs
    Understanding CNNs and feature hierarchies
    Performing discrete convolutions
    Discrete convolutions in one dimension
    Padding inputs to control the size of the output feature maps
    Determining the size of the convolution output
    Performing a discrete convolution in 2D
    Subsampling layers
    Putting everything together – implementing a CNN
    Working with multiple input or color channels
    Regularizing an NN with dropout
    Loss functions for classification
    Implementing a deep CNN using TensorFlow
    The multilayer CNN architecture
    Loading and preprocessing the data
    Implementing a CNN using the TensorFlow Keras API
    Configuring CNN layers in Keras
    Constructing a CNN in Keras
    Gender classification from face images using a CNN
    Loading the CelebA dataset
    Image transformation and data augmentation
    Training a CNN gender classifier
    Summary

Chapter 16: Modeling Sequential Data Using Recurrent Neural Networks
    Introducing sequential data
    Modeling sequential data – order matters
    Representing sequences
    The different categories of sequence modeling
    RNNs for modeling sequences
    Understanding the RNN looping mechanism
    Computing activations in an RNN
    Hidden-recurrence versus output-recurrence
    The challenges of learning long-range interactions
    Long short-term memory cells
    Implementing RNNs for sequence modeling in TensorFlow
    Project one – predicting the sentiment of IMDb movie reviews
    Preparing the movie review data
    Embedding layers for sentence encoding
    Building an RNN model
    Building an RNN model for the sentiment analysis task
    Project two – character-level language modeling in TensorFlow
    Preprocessing the dataset
    Building a character-level RNN model
    Evaluation phase – generating new text passages
    Understanding language with the Transformer model
    Understanding the self-attention mechanism
    A basic version of self-attention
    Parameterizing the self-attention mechanism with query, key, and value weights
    Multi-head attention and the Transformer block
    Summary

Chapter 17: Generative Adversarial Networks for Synthesizing New Data
    Introducing generative adversarial networks
    Starting with autoencoders
    Generative models for synthesizing new data
    Generating new samples with GANs
    Understanding the loss functions of the generator and discriminator networks in a GAN model
    Implementing a GAN from scratch
    Training GAN models on Google Colab
    Implementing the generator and the discriminator networks
    Defining the training dataset
    Training the GAN model
    Improving the quality of synthesized images using a convolutional and Wasserstein GAN
    Transposed convolution
    Batch normalization
    Implementing the generator and discriminator
    Dissimilarity measures between two distributions
    Using EM distance in practice for GANs
    Gradient penalty
    Implementing WGAN-GP to train the DCGAN model
    Mode collapse
    Other GAN applications
    Summary

Chapter 18: Reinforcement Learning for Decision Making in Complex Environments
    Introduction – learning from experience
    Understanding reinforcement learning
    Defining the agent-environment interface of a reinforcement learning system
    The theoretical foundations of RL
    Markov decision processes
    The mathematical formulation of Markov decision processes
    Visualization of a Markov process
    Episodic versus continuing tasks
    RL terminology: return, policy, and value function
    The return
    Policy
    Value function
    Dynamic programming using the Bellman equation
    Reinforcement learning algorithms
    Dynamic programming
    Policy evaluation – predicting the value function with dynamic programming
    Improving the policy using the estimated value function
    Policy iteration
    Value iteration
    Reinforcement learning with Monte Carlo
    State-value function estimation using MC
    Action-value function estimation using MC
    Finding an optimal policy using MC control
    Policy improvement – computing the greedy policy from the action-value function
    Temporal difference learning
    TD prediction
    On-policy TD control (SARSA)
    Off-policy TD control (Q-learning)
    Implementing our first RL algorithm
    Introducing the OpenAI Gym toolkit
    Working with the existing environments in OpenAI Gym
    A grid world example
    Implementing the grid world environment in OpenAI Gym
    Solving the grid world problem with Q-learning
    Implementing the Q-learning algorithm
    A glance at deep Q-learning
    Training a DQN model according to the Q-learning algorithm
    Implementing a deep Q-learning algorithm
    Chapter and book summary

Other Books You May Enjoy

Index