
Ai Cho Khoa Học Dữ Liệu.pdf (AI for Data Science)


DOCUMENT INFORMATION

Basic information

Format: PDF
Pages: 231
Size: 1.92 MB

Content


Contents

Introduction
    About AI
    AI facilitates data science
    About the book

Chapter 1: Deep Learning Frameworks
    About deep learning systems
    How deep learning systems work
    Main deep learning frameworks
    Main deep learning programming languages
    How to leverage deep learning frameworks
    Deep learning methodologies and applications
    Assessing a deep learning framework
    Summary

Chapter 2: AI Methodologies Beyond Deep Learning
    Optimization
    Fuzzy inference systems
    Artificial creativity
    Additional AI methodologies
    Glimpse into the future
    About the methods
    Summary

Chapter 3: Building a DL Network Using MXNet
    Core components
    MXNet in action
    MXNet tips
    Summary

Chapter 4: Building a DL Network Using TensorFlow
    TensorFlow architecture
    Core components
    TensorFlow in action
    Visualization in TensorFlow: TensorBoard
    High-level APIs in TensorFlow: Estimators
    Summary

Chapter 5: Building a DL Network Using Keras
    Core components
    Keras in action
    Model Summary and Visualization
    Converting Keras models to TensorFlow Estimators
    Summary

Chapter 6: Building an Optimizer Based on the Particle Swarm Optimization Algorithm
    PSO algorithm
    Main PSO variants
    PSO versus other optimization methods
    PSO implementation in Julia
    PSO in action
    PSO tips
    Summary

Chapter 7: Building an Optimizer Based on Genetic Algorithms
    Standard Genetic Algorithm
    Implementation of GAs in Julia
    GAs in action
    Main variants of GAs
    GA framework tips
    Summary

Chapter 8: Building an Optimizer Based on Simulated Annealing
    Pseudo-code of the Standard Simulated Annealing Algorithm
    Implementation of Simulated Annealing in Julia
    Simulated Annealing in action
    Main Variants of Simulated Annealing
    Simulated Annealing Optimizer tips
    Summary

Chapter 9: Building an Advanced Deep Learning System
    Convolutional Neural Networks (CNNs)
    Recurrent Neural Networks
    Summary

Chapter 10: Building an Optimization Ensemble
    The role of parallelization in optimization ensembles
    Framework of a basic optimization ensemble
    Case study with PSO systems in an ensemble
    Case study with PSO and Firefly ensemble
    How optimization ensembles fit into the data science pipeline
    Ensemble tips
    Summary

Chapter 11: Alternative AI Frameworks in Data Science
    Extreme Learning Machines (ELMs)
    Capsule Networks (CapsNets)
    Fuzzy logic and fuzzy inference systems
    Summary

Chapter 12: Next Steps
    Big data
    Specializations in data science
    Publicly available datasets
    Summary

Closing Thoughts

Glossary

Transfer Learning
    When is transfer learning useful?
    When to use transfer learning
    How to apply transfer learning
    Applications of transfer learning

Reinforcement Learning
    Key terms
    Reward hypothesis
    Types of tasks
    Reinforcement learning frameworks

Autoencoder Systems
    Components
    Extensions of conventional autoencoder models
    Use cases and applications

Generative Adversarial Networks
    Components
    Training process
    Pain points of a GAN model

The Business Aspect of AI in Data Science Projects
    Description of relevant technologies
    AI resources
    Industries and applications benefiting the most from AI
    Data science education for AI-related projects

Using the Docker Image of the Book's Code and Data
    Downloading the Docker software
    Using Docker with an image file
    Docker tips

Index

Index (excerpt)

Data flow: and CNNs, 161; and Recurrent Neural Networks (RNNs), 172
Data generation, 263
Data models: building, 19; deployment of, 20
Data science: AI in, 271–75; and AI, 225; specializations, 216–19
Data science education, 275
Data science methodologies: predictive analytics as, 11
Dataset representation, 49
Datasets: and NDArray, 49–51; synthetic, 48–49
Deep Gimble I, 34
Deep learning (DL): programming languages and, 17–18
Deep learning (DL) systems: alternatives to, 27–28; and AI, 9, 11–12; methodologies and applications, 20–22; understanding, 12–16; vs. conventional data science systems, 24
Deep learning frameworks, 9: assessing, 22–24; how to leverage, 18–20
Deep Learning Machines (DLM), 194–98
Deep learning networks, 13
Defuzzification, 31
Deng, Jack, 43
Denoising autoencoder, 261
Deterministic optimizers, 29
Dimensionality reduction, 12, 20, 262
Docker system, 277–78
Docker tips, 278
Elitism, 123
Emscripten, 44
Ensembles: building of, 181–92
Epochs, training, 14
Estimators API, and TensorFlow, 84–87
ETL processes, 19, 22
Exponential expression, and Particle Swarm Optimization (PSO), 115–16
Extract, Transform, and Load (ETL) processes, 11
Extreme Learning Machine (ELM), 35, 193, 209: architecture of, 196
Feature fusion, 39
Firefly, 119
Firefly optimizer, 107–8, 187–89
FIS, 32, 39, 194
Fitness, 123
Frameworks, 9–10: main deep learning, 16–17; meta-feature of, 15
Functional programming languages, 225
Fuzzification, 31
Fuzzy Inference System (FIS), 32, 194, 203–7: downside of, 32
Fuzzy logic (FL), 30–33, 41, 194
Fuzzy logic systems, 27–28: relevance of, 32
Fuzzy logic tips, 208
Generative Adversarial Networks (GANs), 36, 262, 265–70
Genetic Algorithm (GA), 121–46, 144: activities in, 132–41, 141–43; in Julia, 127–31; tips, 143
Genetic Programming (GP), 142–43
Gluon interface, 44–45, 51, 55, 57, 61, 62
GRUs, 173–74, 179
Hadoop, 213–14, 214
Healthcare, 219
Hinton, Geoffrey, 194, 198, 199
Holland, John, Adaptation in Natural and Artificial Systems, 121
Image caption generation, 178
IMF, 223
Internet of Things, 218
Interpretability, 23–24
Julia, 208: Particle Swarm Optimization (PSO) implementation, 109–12; programming language, 18; Simulated Annealing in, 151–53
Julia programming language, 30
Kaggle, 220, 224
Keras, 16, 17, 174: activities in, 91–101; and TensorFlow, 89–101; converting to TensorFlow estimators, 100; core components of, 91; model graph, 99
Kirkpatrick, S., 147
Knet framework, 16
Kohonen (Professor), 36
Kohonen Maps, 36
LeCun, Yann, 160
Library, 9
LibriSpeech, 223
LSTMs, 173–74, 179
Machine learning methodologies, 11
Machine learning systems, vs. deep learning systems, 12
Main deep learning programming languages, 17–18
Main deep learning frameworks, 16–17
Matlab, 18
Meta-features, 20, 35: of the framework, 15
Methodologies, 10: data science, 11; machine learning, 11
MNIST dataset, 164, 165
Model, 10
Model maintenance, 24
Multi-Layer Perceptrons, 55
Mutation, 122, 145
MXNet, 16, 17, 62: and checkpoints, 60–61; and deep learning frameworks, 43–44; and NDArray, 45–47, 49–51; and Python, 48–49; checkpoints, 63; Gluon interface and, 44–45; package in Python, 47–48; tips, 61
Natural language, 221
Natural Language Processing (NLP), 21, 217
NDArrays, 45–47, 63
Neuro-fuzzy systems, 31
Neurons, 11, 12
NLP, 178
Non-linearity, and CNNs, 163
Novel AI methodology, 41
Novel AI systems, 40
OpenAI Gym, 257
Optimization, 27, 35, 39, 41: definition, 28; importance of, 28; programming languages and, 30
Optimization ensemble, 181–82, 183, 191: and data science, 189–90; framework of, 183; tips, 190–91
Optimization methods, and Particle Swarm Optimization, 109
Optimization systems, 28, 29, 41
Parallelization, 182
Particle Swarm Optimization (PSO), 103–19, 118: activities in, 112–16; algorithms, 104–6; implementation in Julia, 109–12; tips, 116–18; variants of, 106–8; vs. alternatives, 109
Perceptron system, 15, 194
Personally Identifiable Information (PII), 34
Pipeline, 10
Polynomial expression, and Particle Swarm Optimization, 114–15
Pooling, and CNNs, 163
Predictive analytics system, 11, 20, 39
Programming languages, 9–10, 16, 43: main deep learning, 17–18
Programming resources, 207
PSO optimizer, 181
PSO systems: and Firefly ensemble, 187–89; in an ensemble, 184–86
Publicly available datasets, 220–23
Python, 18, 43, 63, 174, 207: and MXNet, 48–49; MXNet package in, 47–48
PyTorch, 16
Quantum Annealing, 156
Quora Question Pairs, 222
Rechenberg, Ingo, 121
Recurrent Neural Networks (RNNs), 159, 170–78: activities of, 174–78; components of, 171–72; variants of, 173–74
Regression, 63: and Keras, 95–98; and TensorFlow, 77–81
Regression MLP system, 57–59
Reinforcement learning (RL), 20, 253–57
ReLU function, 52
Resilient Distributed Dataset (RDD), 215
RL-Glue, 257
Rosenblatt, Frank, 194
Self-Organizing Genetic Algorithm (SOGA), 141
Self-organizing Maps (SOMs), 36
Sentiment analysis, 21
Sigmoid function, 14
Simulated Annealing (SA), 147–58, 149: activities and, 154–56; variants of, 156–57
Social sciences, 219: and time series, 223
Softmax function, 52, 172
Speech in English, 222
Speech recognition, 178
SQuAD, 222
Standard Genetic Algorithm, 125–27
Stanford Sentiment Treebank, 222
Stochastic optimizers, 29
Supervised learning, 20
Synthetic datasets, 48–49
System, 10
Target variable, 14
TensorBoard dashboard, 83
TensorFlow, 16, 17: and Keras, 89–101; architecture of, 67
TensorFlow estimators, and Keras, 100
Testing, 10
Text synthesis, 178
Theano, 16
TIMIT, 223
Training, 10
Training algorithms, 12, 15
Training epochs, 14
Training process: and CNNs, 163; and RNNs, 173
Transfer function, 14
Transfer learning, 247–51
Unsupervised learning methodologies, 21
Variable Selective Pressure Model (VSPM), 141
Variational autoencoder, 262
Visualization: and CNNs model, 164–70; and Keras, 98–100; and TensorFlow, 81–87
WikiText, 221
World Bank, 223
Preview excerpts

From the book's MXNet data-loading code:

    n = ...              # number of training data points
    train = data[:n, ]   # training set partition
    test = data[n:, ]    # testing set partition (the preview reads data[(n + 1):, ], an off-by-one that would skip row n)
    data_train = gluon.data.DataLoader(
        gluon.data.ArrayDataset(train[:, :3], train[:, 3]),
        batch_size=BatchSize, shuffle=True)
    data_test = ...

From the Introduction:

... he is speaking with a human or a computer, then the computer is said to have passed the test. This simple test has remained a standard for AI, still adding value to related research in the field in various ways.[2]

AI facilitates data science

So, how does AI ...
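The MXNet fragment above breaks off at data_test =. Below is a minimal, self-contained sketch of the same Gluon data-loading pattern; the synthetic 1,000-row dataset, the 80/20 split, and the batch size of 64 are illustrative assumptions, not the book's values.

    import numpy as np
    from mxnet import gluon

    BatchSize = 64
    data = np.random.rand(1000, 4).astype('float32')  # 3 feature columns + 1 label column
    n = int(0.8 * data.shape[0])                      # number of training data points

    train = data[:n, ]  # training set partition
    test = data[n:, ]   # testing set partition (complement of the training rows)

    # Wrap each partition as (features, labels) in an ArrayDataset, then batch it.
    # Only the training loader shuffles, so evaluation does not depend on batch order.
    data_train = gluon.data.DataLoader(
        gluon.data.ArrayDataset(train[:, :3], train[:, 3]),
        batch_size=BatchSize, shuffle=True)
    data_test = gluon.data.DataLoader(
        gluon.data.ArrayDataset(test[:, :3], test[:, 3]),
        batch_size=BatchSize, shuffle=False)

    for X_batch, y_batch in data_train:      # each iteration yields one mini-batch
        print(X_batch.shape, y_batch.shape)  # (64, 3) (64,)
        break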
The next step is to split our data into train and test sets. For this purpose, we use scikit-learn's train_test_split function, which we imported earlier:

    X_train, X_test, y_train, y_test = train_test_split(features,
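The preview cuts this call off after its first argument. A hedged completion follows, assuming the conventional pattern of passing the labels as the second argument, a 20% test split, and a fixed seed; none of these details are confirmed by the excerpt.

    import numpy as np
    from sklearn.model_selection import train_test_split

    features = np.random.rand(100, 3)      # stand-in feature matrix
    labels = np.random.randint(0, 2, 100)  # stand-in label vector

    # Hold out 20% of the rows for testing; random_state makes the shuffle reproducible.
    X_train, X_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=42)

    print(X_train.shape, X_test.shape)     # (80, 3) (20, 3)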

Date posted: 27/02/2023, 10:37
