Advanced deep learning with Keras : apply deep learning techniques, autoencoders, GANs, variational autoencoders, deep reinforcement learning, policy gradients, and more / Rowel Atienza
Author Atienza Rowel
Edition [1st edition]
Publication/distribution/printing London, England : Packt Publishing, Limited, [2018]
Physical description 1 online resource (368 pages)
Discipline 006.32
Topical subject Machine learning
Neural networks (Computer science)
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Cover -- Copyright -- Packt upsell -- Contributors -- Table of Contents -- Preface -- Chapter 1: Introducing Advanced Deep Learning with Keras -- Why is Keras the perfect deep learning library? -- Installing Keras and TensorFlow -- Implementing the core deep learning models - MLPs, CNNs and RNNs -- The difference between MLPs, CNNs, and RNNs -- Multilayer perceptrons (MLPs) -- MNIST dataset -- MNIST digits classifier model -- Building a model using MLPs and Keras -- Regularization -- Output activation and loss function -- Optimization -- Performance evaluation -- Model summary -- Convolutional neural networks (CNNs) -- Convolution -- Pooling operations -- Performance evaluation and model summary -- Recurrent neural networks (RNNs) -- Conclusion -- Chapter 2: Deep Neural Networks -- Functional API -- Creating a two-input and one-output model -- Deep residual networks (ResNet) -- ResNet v2 -- Densely connected convolutional networks (DenseNet) -- Building a 100-layer DenseNet-BC for CIFAR10 -- Conclusion -- References -- Chapter 3: Autoencoders -- Principles of autoencoders -- Building autoencoders using Keras -- Denoising autoencoder (DAE) -- Automatic colorization autoencoder -- Conclusion -- References -- Chapter 4: Generative Adversarial Networks (GANs) -- An overview of GANs -- Principles of GANs -- GAN implementation in Keras -- Conditional GAN -- Conclusion -- References -- Chapter 5: Improved GANs -- Wasserstein GAN -- Distance functions -- Distance function in GANs -- Use of Wasserstein loss -- WGAN implementation using Keras -- Least-squares GAN (LSGAN) -- Auxiliary classifier GAN (ACGAN) -- Conclusion -- References -- Chapter 6: Disentangled Representation GANs -- Disentangled representations -- InfoGAN -- Implementation of InfoGAN in Keras -- Generator outputs of InfoGAN -- StackedGAN -- Implementation of StackedGAN in Keras.
Generator outputs of StackedGAN -- Conclusion -- Reference -- Chapter 7: Cross-Domain GANs -- Principles of CycleGAN -- The CycleGAN Model -- Implementing CycleGAN using Keras -- Generator outputs of CycleGAN -- CycleGAN on MNIST and SVHN datasets -- Conclusion -- References -- Chapter 8: Variational Autoencoders (VAEs) -- Principles of VAEs -- Variational inference -- Core equation -- Optimization -- Reparameterization trick -- Decoder testing -- VAEs in Keras -- Using CNNs for VAEs -- Conditional VAE (CVAE) -- β-VAE: VAE with disentangled latent representations -- Conclusion -- References -- Chapter 9: Deep Reinforcement Learning -- Principles of reinforcement learning (RL) -- The Q value -- Q-Learning example -- Q-Learning in Python -- Nondeterministic environment -- Temporal-difference learning -- Q-Learning on OpenAI gym -- Deep Q-Network (DQN) -- DQN on Keras -- Double Q-Learning (DDQN) -- Conclusion -- References -- Chapter 10: Policy Gradient Methods -- Policy gradient theorem -- Monte Carlo policy gradient (REINFORCE) method -- REINFORCE with baseline method -- Actor-Critic method -- Advantage Actor-Critic (A2C) method -- Policy Gradient methods with Keras -- Performance evaluation of policy gradient methods -- Conclusion -- References -- Other Books You May Enjoy -- Index.
Record no. UNINA-9910795323903321
Atienza Rowel
London, England : Packt Publishing, Limited, [2018]
Printed material
Available at: Univ. Federico II
Opac: Check availability here
Advanced deep learning with Keras : apply deep learning techniques, autoencoders, GANs, variational autoencoders, deep reinforcement learning, policy gradients, and more / Rowel Atienza
Author Atienza Rowel
Edition [1st edition]
Publication/distribution/printing London, England : Packt Publishing, Limited, [2018]
Physical description 1 online resource (368 pages)
Discipline 006.32
Topical subject Machine learning
Neural networks (Computer science)
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Cover -- Copyright -- Packt upsell -- Contributors -- Table of Contents -- Preface -- Chapter 1: Introducing Advanced Deep Learning with Keras -- Why is Keras the perfect deep learning library? -- Installing Keras and TensorFlow -- Implementing the core deep learning models - MLPs, CNNs and RNNs -- The difference between MLPs, CNNs, and RNNs -- Multilayer perceptrons (MLPs) -- MNIST dataset -- MNIST digits classifier model -- Building a model using MLPs and Keras -- Regularization -- Output activation and loss function -- Optimization -- Performance evaluation -- Model summary -- Convolutional neural networks (CNNs) -- Convolution -- Pooling operations -- Performance evaluation and model summary -- Recurrent neural networks (RNNs) -- Conclusion -- Chapter 2: Deep Neural Networks -- Functional API -- Creating a two-input and one-output model -- Deep residual networks (ResNet) -- ResNet v2 -- Densely connected convolutional networks (DenseNet) -- Building a 100-layer DenseNet-BC for CIFAR10 -- Conclusion -- References -- Chapter 3: Autoencoders -- Principles of autoencoders -- Building autoencoders using Keras -- Denoising autoencoder (DAE) -- Automatic colorization autoencoder -- Conclusion -- References -- Chapter 4: Generative Adversarial Networks (GANs) -- An overview of GANs -- Principles of GANs -- GAN implementation in Keras -- Conditional GAN -- Conclusion -- References -- Chapter 5: Improved GANs -- Wasserstein GAN -- Distance functions -- Distance function in GANs -- Use of Wasserstein loss -- WGAN implementation using Keras -- Least-squares GAN (LSGAN) -- Auxiliary classifier GAN (ACGAN) -- Conclusion -- References -- Chapter 6: Disentangled Representation GANs -- Disentangled representations -- InfoGAN -- Implementation of InfoGAN in Keras -- Generator outputs of InfoGAN -- StackedGAN -- Implementation of StackedGAN in Keras.
Generator outputs of StackedGAN -- Conclusion -- Reference -- Chapter 7: Cross-Domain GANs -- Principles of CycleGAN -- The CycleGAN Model -- Implementing CycleGAN using Keras -- Generator outputs of CycleGAN -- CycleGAN on MNIST and SVHN datasets -- Conclusion -- References -- Chapter 8: Variational Autoencoders (VAEs) -- Principles of VAEs -- Variational inference -- Core equation -- Optimization -- Reparameterization trick -- Decoder testing -- VAEs in Keras -- Using CNNs for VAEs -- Conditional VAE (CVAE) -- β-VAE: VAE with disentangled latent representations -- Conclusion -- References -- Chapter 9: Deep Reinforcement Learning -- Principles of reinforcement learning (RL) -- The Q value -- Q-Learning example -- Q-Learning in Python -- Nondeterministic environment -- Temporal-difference learning -- Q-Learning on OpenAI gym -- Deep Q-Network (DQN) -- DQN on Keras -- Double Q-Learning (DDQN) -- Conclusion -- References -- Chapter 10: Policy Gradient Methods -- Policy gradient theorem -- Monte Carlo policy gradient (REINFORCE) method -- REINFORCE with baseline method -- Actor-Critic method -- Advantage Actor-Critic (A2C) method -- Policy Gradient methods with Keras -- Performance evaluation of policy gradient methods -- Conclusion -- References -- Other Books You May Enjoy -- Index.
Record no. UNINA-9910819310903321
Atienza Rowel
London, England : Packt Publishing, Limited, [2018]
Printed material
Available at: Univ. Federico II
Opac: Check availability here
Apache Hadoop 3 quick start guide : learn about big data processing and analytics / Hrishikesh Vijay Karambelkar
Author Karambelkar Hrishikesh Vijay
Edition [First edition]
Publication/distribution/printing London, England : Packt Publishing, Limited, [2018]
Physical description 1 online resource (220 pages)
Discipline 004.36
Topical subject Cloud computing
Electronic data processing - Distributed processing - Management
Format Printed material
Bibliographic level Monograph
Publication language eng
Record no. UNINA-9910795325303321
Karambelkar Hrishikesh Vijay
London, England : Packt Publishing, Limited, [2018]
Printed material
Available at: Univ. Federico II
Opac: Check availability here
Apache Hadoop 3 quick start guide : learn about big data processing and analytics / Hrishikesh Vijay Karambelkar
Author Karambelkar Hrishikesh Vijay
Edition [First edition]
Publication/distribution/printing London, England : Packt Publishing, Limited, [2018]
Physical description 1 online resource (220 pages)
Discipline 004.36
Topical subject Cloud computing
Electronic data processing - Distributed processing - Management
Format Printed material
Bibliographic level Monograph
Publication language eng
Record no. UNINA-9910814241203321
Karambelkar Hrishikesh Vijay
London, England : Packt Publishing, Limited, [2018]
Printed material
Available at: Univ. Federico II
Opac: Check availability here
ArcGIS Pro 2.x Cookbook : Create, Manage, and Share Geographic Maps, Data, and Analytical Models Using ArcGIS Pro
Author Corbin Tripp
Edition [1st ed.]
Publication/distribution/printing Birmingham : Packt Publishing, Limited, 2018
Physical description 1 online resource (704 p.)
Uncontrolled subject Geographic information systems
Technology & engineering
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Cover -- Copyright and Credits -- Dedication -- Packt Upsell -- Contributors -- Table of Contents -- Preface -- Chapter 1: ArcGIS Pro Capabilities and Terminology -- Introduction -- Determining whether your computer can run ArcGIS Pro -- Getting ready -- How to do it... -- How it works... -- Determining your ArcGIS Pro license level -- Getting ready -- How to do it... -- How it works... -- There's more... -- Opening an existing ArcGIS Pro project -- Getting ready -- How to do it... -- How it works... -- There's more... -- Opening and navigating a map -- Getting ready -- How to do it... -- How it works... -- Adding and configuring layers -- Getting ready -- How to do it... -- How it works... -- Creating a project -- Getting ready -- How to do it... -- How it works... -- There's more... -- ArcGIS Pro stock project templates -- Chapter 2: Creating and Storing Data -- Introduction -- Adding Raster and Vector data to a map -- Getting ready -- How to do it... -- How it works... -- Creating a new Geodatabase -- Getting ready -- How to do it... -- How it works... -- Creating a new Shapefile -- Getting ready -- How to do it... -- How it works... -- Adding CAD data to a map -- Getting ready -- How to do it... -- How it works... -- Plotting X,Y points from a table -- Getting ready -- How to do it... -- How it works... -- Geocoding addresses -- Getting ready -- How to do it... -- How it works... -- Chapter 3: Linking Data together -- Introduction -- Joining two tables -- Getting ready -- How to do it... -- How it works... -- Labeling features using a joined table -- Getting ready -- How to do it... -- Querying data in a joined table -- Getting ready -- How to do it... -- Creating and using a Relate -- Getting ready -- How to do it... -- There's more… -- Joining features spatially -- Getting ready -- How to do it... -- Creating feature linked annotation.
Getting ready -- How to do it... -- How it works... -- Creating and using a relationship class using existing data -- Getting ready -- How to do it... -- Chapter 4: Editing Spatial and Tabular Data -- Introduction -- Configuring editing options -- Getting ready -- How to do it... -- Reshaping an existing feature -- Getting ready -- How to do it... -- How it works... -- Splitting a line feature -- Getting ready -- How to do it... -- Merging features -- Getting ready -- How to do it... -- How it works... -- Aligning features -- Getting ready -- How to do it... -- How it works... -- Creating new point features -- Getting ready -- How to do it... -- Creating new line features -- Getting ready -- How to do it... -- Creating new polygon features -- Getting ready -- How to do it... -- Creating a new polygon feature using autocomplete -- Getting ready -- How to do it... -- How it works… -- Editing attributes using the Attribute pane -- Getting ready -- How to do it... -- Editing attributes in the Table view -- Getting ready -- How to do it... -- Chapter 5: Validating and Editing Data with Topologies -- Introduction -- Creating a new geodatabase topology -- Getting ready -- How to do it... -- Validating spatial data using a geodatabase topology -- Getting ready -- How to do it... -- Correcting spatial features with topology tools -- Getting ready -- How to do it... -- Editing data with a map topology -- Getting ready -- How to do it... -- Chapter 6: Projections and Coordinate System Basics -- Introduction -- Determining the coordinate system for an existing map -- Getting ready -- How to do it... -- Setting the coordinate system for a new map -- Getting ready -- How to do it... -- Changing the coordinate system of a map -- Getting ready -- How to do it... -- Defining a coordinate system for data -- Getting ready -- How to do it.
Projecting data to different coordinate systems -- Getting ready -- How to do it... -- Chapter 7: Converting Data -- Introduction -- Converting shapefiles to a geodatabase feature class -- Getting ready -- How to do it... -- There's more… -- Merging multiple shapefiles into a single geodatabase feature class -- Getting ready -- How to do it... -- There's more… -- Exporting tabular data to an Excel spreadsheet -- Getting ready -- How to do it... -- Importing an Excel spreadsheet into ArcGIS Pro -- Getting ready -- How to do it... -- There's more… -- Importing selected features into an existing layer -- Getting ready -- How to do it... -- Chapter 8: Proximity Analysis -- Introduction -- Selecting features within a specific distance -- Getting ready -- How to do it... -- Creating buffers -- Getting ready -- How to do it... -- There's more… -- Determining the nearest feature using the Near tool -- Getting ready -- How to do it... -- There's more… -- Calculating how far features are using the Generate Near Table tool -- Getting ready -- How to do it... -- There's more… -- Chapter 9: Spatial Statistics and Hot Spots -- Introduction -- Identifying hot spots -- Getting ready -- How to do it... -- Finding the mean center of geographic distribution -- Getting ready -- How to do it... -- There's more… -- Identifying the central feature of geographic distribution -- Getting ready -- How to do it... -- Calculating the geographic dispersion of data -- Getting ready -- How to do it... -- Chapter 10: 3D Maps and 3D Analyst -- Introduction -- Creating a 3D scene -- Getting ready -- How to do it... -- Enabling your data to store Z coordinates (elevation) -- Getting ready -- How to do it... -- Creating multipatch features from 2D -- Getting ready -- How to do it... -- Creating 3D features -- Getting ready -- How to do it... -- Calculating lines of sight.
Getting ready -- How to do it... -- Calculating the volume of a polygon -- Getting ready -- How to do it... -- Chapter 11: Introducing Arcade -- Introduction -- Applying prebuilt Arcade expressions -- Getting ready -- How to do it... -- Creating an Arcade labeling expression -- Getting ready -- How to do it... -- Creating an Arcade symbology expressions -- Getting ready -- How to do it... -- Chapter 12: Introducing ArcGIS Online -- Introduction -- Logging into your ArcGIS Online account -- Getting ready -- How to do it... -- Creating a simple web map in ArcGIS Online -- Getting ready -- How to do it... -- Accessing ArcGIS Online content in ArcGIS Pro -- Getting ready -- How to do it... -- Accessing simple demographic data in ArcGIS Pro -- Getting ready -- How to do it... -- Using the ArcGIS Online geoprocessing services -- Getting ready -- How to do it... -- Chapter 13: Publishing Your Own Content to ArcGIS Online -- Introduction -- Publishing shapefiles using your browser -- Getting ready -- How to do it... -- Creating a layer using a CSV file -- Getting ready -- How to do it... -- Publishing layer packages using ArcGIS Pro -- Getting ready -- How to do it... -- Publishing web layers using ArcGIS Pro -- Getting ready -- How to do it... -- Publishing 2D maps -- Getting ready -- How to do it... -- Sharing published content -- Getting ready -- How to do it... -- Chapter 14: Creating Web Apps Using ArcGIS Online -- Introduction -- Creating a simple web app using an Esri template -- Getting ready -- How to do it... -- Creating a custom application with Web AppBuilder -- Getting ready -- How to do it... -- Sharing your applications -- Getting ready -- How to do it... -- Embedding an ArcGIS Online web map in a web page -- Getting ready -- How to do it... -- Other Books You May Enjoy -- Index.
Other variant titles ArcGIS Pro 2.x Cookbook
Record no. UNINA-9910817064203321
Corbin Tripp
Birmingham : Packt Publishing, Limited, 2018
Printed material
Available at: Univ. Federico II
Opac: Check availability here
Artificial Intelligence by Example : Acquire Advanced AI, Machine Learning, and Deep Learning Design Skills
Author Rothman Denis
Edition [2nd ed.]
Publication/distribution/printing Birmingham : Packt Publishing, Limited, 2020
Physical description 1 online resource (579 pages)
Discipline 006.3
Topical subject Artificial intelligence
ISBN 1-83921-281-0
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Cover -- Copyright -- Packt Page -- Contributors -- Table of Contents -- Preface -- Chapter 1: Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning -- Reinforcement learning concepts -- How to adapt to machine thinking and become an adaptive thinker -- Overcoming real-life issues using the three-step approach -- Step 1 - describing a problem to solve: MDP in natural language -- Watching the MDP agent at work -- Step 2 - building a mathematical model: the mathematical representation of the Bellman equation and MDP -- From MDP to the Bellman equation -- Step 3 - writing source code: implementing the solution in Python -- The lessons of reinforcement learning -- How to use the outputs -- Possible use cases -- Machine learning versus traditional applications -- Summary -- Questions -- Further reading -- Chapter 2: Building a Reward Matrix - Designing Your Datasets -- Designing datasets - where the dream stops and the hard work begins -- Designing datasets -- Using the McCulloch-Pitts neuron -- The McCulloch-Pitts neuron -- The Python-TensorFlow architecture -- Logistic activation functions and classifiers -- Overall architecture -- Logistic classifier -- Logistic function -- Softmax -- Summary -- Questions -- Further reading -- Chapter 3: Machine Intelligence - Evaluation Functions and Numerical Convergence -- Tracking down what to measure and deciding how to measure it -- Convergence -- Implicit convergence -- Numerically controlled gradient descent convergence -- Evaluating beyond human analytic capacity -- Using supervised learning to evaluate a result that surpasses human analytic capacity -- Summary -- Questions -- Further reading -- Chapter 4: Optimizing Your Solutions with K-Means Clustering -- Dataset optimization and control -- Designing a dataset and choosing an ML/DL model.
Approval of the design matrix -- Implementing a k-means clustering solution -- The vision -- The data -- The strategy -- The k-means clustering program -- The mathematical definition of k-means clustering -- The Python program -- Saving and loading the model -- Analyzing the results -- Bot virtual clusters as a solution -- The limits of the implementation of the k-means clustering algorithm -- Summary -- Questions -- Further reading -- Chapter 5: How to Use Decision Trees to Enhance K-Means Clustering -- Unsupervised learning with KMC with large datasets -- Identifying the difficulty of the problem -- NP-hard - the meaning of P -- NP-hard - the meaning of non-deterministic -- Implementing random sampling with mini-batches -- Using the LLN -- The CLT -- Using a Monte Carlo estimator -- Trying to train the full training dataset -- Training a random sample of the training dataset -- Shuffling as another way to perform random sampling -- Chaining supervised learning to verify unsupervised learning -- Preprocessing raw data -- A pipeline of scripts and ML algorithms -- Step 1 - training and exporting data from an unsupervised ML algorithm -- Step 2 - training a decision tree -- Step 3 - a continuous cycle of KMC chained to a decision tree -- Random forests as an alternative to decision trees -- Summary -- Questions -- Further reading -- Chapter 6: Innovating AI with Google Translate -- Understanding innovation and disruption in AI -- Is AI disruptive? -- AI is based on mathematical theories that are not new -- Neural networks are not new -- Looking at disruption - the factors that are making AI disruptive -- Cloud server power, data volumes, and web sharing of the early 21st century -- Public awareness -- Inventions versus innovations -- Revolutionary versus disruptive solutions -- Where to start? -- Discover a world of opportunities with Google Translate.
Getting started -- The program -- The header -- Implementing Google's translation service -- Google Translate from a linguist's perspective -- Playing with the tool -- Linguistic assessment of Google Translate -- AI as a new frontier -- Lexical field and polysemy -- Exploring the frontier - customizing Google Translate with a Python program -- k-nearest neighbor algorithm -- Implementing the KNN algorithm -- The knn_polysemy.py program -- Implementing the KNN function in Google_Translate_Customized.py -- Conclusions on the Google Translate customized experiment -- The disruptive revolutionary loop -- Summary -- Questions -- Further reading -- Chapter 7: Optimizing Blockchains with Naive Bayes -- Part I - the background to blockchain technology -- Mining bitcoins -- Using cryptocurrency -- PART II - using blockchains to share information in a supply chain -- Using blockchains in the supply chain network -- Creating a block -- Exploring the blocks -- Part III - optimizing a supply chain with naive Bayes in a blockchain process -- A naive Bayes example -- The blockchain anticipation novelty -- The goal - optimizing storage levels using blockchain data -- Implementation of naive Bayes in Python -- Gaussian naive Bayes -- Summary -- Questions -- Further reading -- Chapter 8: Solving the XOR Problem with a Feedforward Neural Network -- The original perceptron could not solve the XOR function -- XOR and linearly separable models -- Linearly separable models -- The XOR limit of a linear model, such as the original perceptron -- Building an FNN from scratch -- Step 1 - defining an FNN -- Step 2 - an example of how two children can solve the XOR problem every day -- Implementing a vintage XOR solution in Python with an FNN and backpropagation -- A simplified version of a cost function and gradient descent -- Linear separability was achieved.
Applying the FNN XOR function to optimizing subsets of data -- Summary -- Questions -- Further reading -- Chapter 9: Abstract Image Classification with Convolutional Neural Networks (CNNs) -- Introducing CNNs -- Defining a CNN -- Initializing the CNN -- Adding a 2D convolution layer -- Kernel -- Shape -- ReLU -- Pooling -- Next convolution and pooling layer -- Flattening -- Dense layers -- Dense activation functions -- Training a CNN model -- The goal -- Compiling the model -- The loss function -- The Adam optimizer -- Metrics -- The training dataset -- Data augmentation -- Loading the data -- The testing dataset -- Data augmentation on the testing dataset -- Loading the data -- Training with the classifier -- Saving the model -- Next steps -- Summary -- Questions -- Further reading and references -- Chapter 10: Conceptual Representation Learning -- Generating profit with transfer learning -- The motivation behind transfer learning -- Inductive thinking -- Inductive abstraction -- The problem AI needs to solve -- The gap concept -- Loading the trained TensorFlow 2.x model -- Loading and displaying the model -- Loading the model to use it -- Defining a strategy -- Making the model profitable by using it for another problem -- Domain learning -- How to use the programs -- The trained models used in this section -- The trained model program -- Gap - loaded or underloaded -- Gap - jammed or open lanes -- Gap datasets and subsets -- Generalizing the Γ (the gap conceptual dataset) -- The motivation of conceptual representation learning metamodels applied to dimensionality -- The curse of dimensionality -- The blessing of dimensionality -- Summary -- Questions -- Further reading -- Chapter 11: Combining Reinforcement Learning and Deep Learning -- Planning and scheduling today and tomorrow -- A real-time manufacturing process.
Amazon must expand its services to face competition -- A real-time manufacturing revolution -- CRLMM applied to an automated apparel manufacturing process -- An apparel manufacturing process -- Training the CRLMM -- Generalizing the unit training dataset -- Food conveyor belt processing - positive p and negative n gaps -- Running a prediction program -- Building the RL-DL-CRLMM -- A circular process -- Implementing a CNN-CRLMM to detect gaps and optimize -- Q-learning - MDP -- MDP inputs and outputs -- The optimizer -- The optimizer as a regulator -- Finding the main target for the MDP function -- A circular model - a stream-like system that never starts nor ends -- Summary -- Questions -- Further reading -- Chapter 12: AI and the Internet of Things (IoT) -- The public service project -- Setting up the RL-DL-CRLMM model -- Applying the model of the CRLMM -- The dataset -- Using the trained model -- Adding an SVM function -- Motivation - using an SVM to increase safety levels -- Definition of a support vector machine -- Python function -- Running the CRLMM -- Finding a parking space -- Deciding how to get to the parking lot -- Support vector machine -- The itinerary graph -- The weight vector -- Summary -- Questions -- Further reading -- Chapter 13: Visualizing Networks with TensorFlow 2.x and TensorBoard -- Exploring the output of the layers of a CNN in two steps with TensorFlow -- Building the layers of a CNN -- Processing the visual output of the layers of a CNN -- Analyzing the visual output of the layers of a CNN -- Analyzing the accuracy of a CNN using TensorBoard -- Getting started with Google Colaboratory -- Defining and training the model -- Introducing some of the measurements -- Summary -- Questions -- Further reading.
Chapter 14: Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA).
Record no. UNINA-9910780786103321
Rothman Denis
Birmingham : Packt Publishing, Limited, 2020
Printed material
Available at: Univ. Federico II
Opac: Check availability here
Artificial Intelligence by Example : Acquire Advanced AI, Machine Learning, and Deep Learning Design Skills
Author Rothman Denis
Edition [2nd ed.]
Publication/distribution/printing Birmingham : Packt Publishing, Limited, 2020
Physical description 1 online resource (579 pages)
Discipline 006.3
Topical subject Artificial intelligence
ISBN 1-83921-281-0
Format Printed material
Bibliographic level Monograph
Publication language eng
Contents note Cover -- Copyright -- Packt Page -- Contributors -- Table of Contents -- Preface -- Chapter 1: Getting Started with Next-Generation Artificial Intelligence through Reinforcement Learning -- Reinforcement learning concepts -- How to adapt to machine thinking and become an adaptive thinker -- Overcoming real-life issues using the three-step approach -- Step 1 - describing a problem to solve: MDP in natural language -- Watching the MDP agent at work -- Step 2 - building a mathematical model: the mathematical representation of the Bellman equation and MDP -- From MDP to the Bellman equation -- Step 3 - writing source code: implementing the solution in Python -- The lessons of reinforcement learning -- How to use the outputs -- Possible use cases -- Machine learning versus traditional applications -- Summary -- Questions -- Further reading -- Chapter 2: Building a Reward Matrix - Designing Your Datasets -- Designing datasets - where the dream stops and the hard work begins -- Designing datasets -- Using the McCulloch-Pitts neuron -- The McCulloch-Pitts neuron -- The Python-TensorFlow architecture -- Logistic activation functions and classifiers -- Overall architecture -- Logistic classifier -- Logistic function -- Softmax -- Summary -- Questions -- Further reading -- Chapter 3: Machine Intelligence - Evaluation Functions and Numerical Convergence -- Tracking down what to measure and deciding how to measure it -- Convergence -- Implicit convergence -- Numerically controlled gradient descent convergence -- Evaluating beyond human analytic capacity -- Using supervised learning to evaluate a result that surpasses human analytic capacity -- Summary -- Questions -- Further reading -- Chapter 4: Optimizing Your Solutions with K-Means Clustering -- Dataset optimization and control -- Designing a dataset and choosing an ML/DL model.
Approval of the design matrix -- Implementing a k-means clustering solution -- The vision -- The data -- The strategy -- The k-means clustering program -- The mathematical definition of k-means clustering -- The Python program -- Saving and loading the model -- Analyzing the results -- Bot virtual clusters as a solution -- The limits of the implementation of the k-means clustering algorithm -- Summary -- Questions -- Further reading -- Chapter 5: How to Use Decision Trees to Enhance K-Means Clustering -- Unsupervised learning with KMC with large datasets -- Identifying the difficulty of the problem -- NP-hard - the meaning of P -- NP-hard - the meaning of non-deterministic -- Implementing random sampling with mini-batches -- Using the LLN -- The CLT -- Using a Monte Carlo estimator -- Trying to train the full training dataset -- Training a random sample of the training dataset -- Shuffling as another way to perform random sampling -- Chaining supervised learning to verify unsupervised learning -- Preprocessing raw data -- A pipeline of scripts and ML algorithms -- Step 1 - training and exporting data from an unsupervised ML algorithm -- Step 2 - training a decision tree -- Step 3 - a continuous cycle of KMC chained to a decision tree -- Random forests as an alternative to decision trees -- Summary -- Questions -- Further reading -- Chapter 6: Innovating AI with Google Translate -- Understanding innovation and disruption in AI -- Is AI disruptive? -- AI is based on mathematical theories that are not new -- Neural networks are not new -- Looking at disruption - the factors that are making AI disruptive -- Cloud server power, data volumes, and web sharing of the early 21st century -- Public awareness -- Inventions versus innovations -- Revolutionary versus disruptive solutions -- Where to start? -- Discover a world of opportunities with Google Translate.
Getting started -- The program -- The header -- Implementing Google's translation service -- Google Translate from a linguist's perspective -- Playing with the tool -- Linguistic assessment of Google Translate -- AI as a new frontier -- Lexical field and polysemy -- Exploring the frontier - customizing Google Translate with a Python program -- k-nearest neighbor algorithm -- Implementing the KNN algorithm -- The knn_polysemy.py program -- Implementing the KNN function in Google_Translate_Customized.py -- Conclusions on the Google Translate customized experiment -- The disruptive revolutionary loop -- Summary -- Questions -- Further reading -- Chapter 7: Optimizing Blockchains with Naive Bayes -- Part I - the background to blockchain technology -- Mining bitcoins -- Using cryptocurrency -- Part II - using blockchains to share information in a supply chain -- Using blockchains in the supply chain network -- Creating a block -- Exploring the blocks -- Part III - optimizing a supply chain with naive Bayes in a blockchain process -- A naive Bayes example -- The blockchain anticipation novelty -- The goal - optimizing storage levels using blockchain data -- Implementation of naive Bayes in Python -- Gaussian naive Bayes -- Summary -- Questions -- Further reading -- Chapter 8: Solving the XOR Problem with a Feedforward Neural Network -- The original perceptron could not solve the XOR function -- XOR and linearly separable models -- Linearly separable models -- The XOR limit of a linear model, such as the original perceptron -- Building an FNN from scratch -- Step 1 - defining an FNN -- Step 2 - an example of how two children can solve the XOR problem every day -- Implementing a vintage XOR solution in Python with an FNN and backpropagation -- A simplified version of a cost function and gradient descent -- Linear separability was achieved.
Applying the FNN XOR function to optimizing subsets of data -- Summary -- Questions -- Further reading -- Chapter 9: Abstract Image Classification with Convolutional Neural Networks (CNNs) -- Introducing CNNs -- Defining a CNN -- Initializing the CNN -- Adding a 2D convolution layer -- Kernel -- Shape -- ReLU -- Pooling -- Next convolution and pooling layer -- Flattening -- Dense layers -- Dense activation functions -- Training a CNN model -- The goal -- Compiling the model -- The loss function -- The Adam optimizer -- Metrics -- The training dataset -- Data augmentation -- Loading the data -- The testing dataset -- Data augmentation on the testing dataset -- Loading the data -- Training with the classifier -- Saving the model -- Next steps -- Summary -- Questions -- Further reading and references -- Chapter 10: Conceptual Representation Learning -- Generating profit with transfer learning -- The motivation behind transfer learning -- Inductive thinking -- Inductive abstraction -- The problem AI needs to solve -- The gap concept -- Loading the trained TensorFlow 2.x model -- Loading and displaying the model -- Loading the model to use it -- Defining a strategy -- Making the model profitable by using it for another problem -- Domain learning -- How to use the programs -- The trained models used in this section -- The trained model program -- Gap - loaded or underloaded -- Gap - jammed or open lanes -- Gap datasets and subsets -- Generalizing the gap (the gap conceptual dataset) -- The motivation of conceptual representation learning metamodels applied to dimensionality -- The curse of dimensionality -- The blessing of dimensionality -- Summary -- Questions -- Further reading -- Chapter 11: Combining Reinforcement Learning and Deep Learning -- Planning and scheduling today and tomorrow -- A real-time manufacturing process.
Amazon must expand its services to face competition -- A real-time manufacturing revolution -- CRLMM applied to an automated apparel manufacturing process -- An apparel manufacturing process -- Training the CRLMM -- Generalizing the unit training dataset -- Food conveyor belt processing - positive p and negative n gaps -- Running a prediction program -- Building the RL-DL-CRLMM -- A circular process -- Implementing a CNN-CRLMM to detect gaps and optimize -- Q-learning - MDP -- MDP inputs and outputs -- The optimizer -- The optimizer as a regulator -- Finding the main target for the MDP function -- A circular model - a stream-like system that never starts nor ends -- Summary -- Questions -- Further reading -- Chapter 12: AI and the Internet of Things (IoT) -- The public service project -- Setting up the RL-DL-CRLMM model -- Applying the model of the CRLMM -- The dataset -- Using the trained model -- Adding an SVM function -- Motivation - using an SVM to increase safety levels -- Definition of a support vector machine -- Python function -- Running the CRLMM -- Finding a parking space -- Deciding how to get to the parking lot -- Support vector machine -- The itinerary graph -- The weight vector -- Summary -- Questions -- Further reading -- Chapter 13: Visualizing Networks with TensorFlow 2.x and TensorBoard -- Exploring the output of the layers of a CNN in two steps with TensorFlow -- Building the layers of a CNN -- Processing the visual output of the layers of a CNN -- Analyzing the visual output of the layers of a CNN -- Analyzing the accuracy of a CNN using TensorBoard -- Getting started with Google Colaboratory -- Defining and training the model -- Introducing some of the measurements -- Summary -- Questions -- Further reading.
Chapter 14: Preparing the Input of Chatbots with Restricted Boltzmann Machines (RBMs) and Principal Component Analysis (PCA).
Record Nr. UNINA-9910818902203321
Rothman Denis  
Birmingham : , : Packt Publishing, Limited, , 2020
Materiale a stampa
Lo trovi qui: Univ. Federico II
Opac: Controlla la disponibilità qui
AWS certified solutions architect associate guide : the ultimate exam guide to AWS Solutions Architect certification / / Gabriel Ramirez and Stuart Scott
AWS certified solutions architect associate guide : the ultimate exam guide to AWS Solutions Architect certification / / Gabriel Ramirez and Stuart Scott
Autore Ramirez Gabriel
Edizione [1st edition]
Pubbl/distr/stampa London, England : , : Packt Publishing, Limited, , [2018]
Descrizione fisica 1 online resource (626 pages)
Disciplina 006.76
Soggetto topico Cloud computing - Examinations
Web services - Examinations
Formato Materiale a stampa
Livello bibliografico Monografia
Lingua di pubblicazione eng
Record Nr. UNINA-9910795323503321
Ramirez Gabriel  
London, England : , : Packt Publishing, Limited, , [2018]
Materiale a stampa
Lo trovi qui: Univ. Federico II
Opac: Controlla la disponibilità qui
AWS certified solutions architect associate guide : the ultimate exam guide to AWS Solutions Architect certification / / Gabriel Ramirez and Stuart Scott
AWS certified solutions architect associate guide : the ultimate exam guide to AWS Solutions Architect certification / / Gabriel Ramirez and Stuart Scott
Autore Ramirez Gabriel
Edizione [1st edition]
Pubbl/distr/stampa London, England : , : Packt Publishing, Limited, , [2018]
Descrizione fisica 1 online resource (626 pages)
Disciplina 006.76
Soggetto topico Cloud computing - Examinations
Web services - Examinations
Formato Materiale a stampa
Livello bibliografico Monografia
Lingua di pubblicazione eng
Record Nr. UNINA-9910812812003321
Ramirez Gabriel  
London, England : , : Packt Publishing, Limited, , [2018]
Materiale a stampa
Lo trovi qui: Univ. Federico II
Opac: Controlla la disponibilità qui
CCNA Routing and Switching 200-125 certification guide : the ultimate solution for passing the CCNA certification and boosting your networking career / / Lazaro Diaz
CCNA Routing and Switching 200-125 certification guide : the ultimate solution for passing the CCNA certification and boosting your networking career / / Lazaro Diaz
Autore Diaz Lazaro (Laz)
Edizione [First edition]
Pubbl/distr/stampa London, England : , : Packt Publishing, Limited, , [2018]
Descrizione fisica 1 online resource (504 pages)
Disciplina 005.71
Soggetto topico Internetworking (Telecommunication) - Examinations
Formato Materiale a stampa
Livello bibliografico Monografia
Lingua di pubblicazione eng
Record Nr. UNINA-9910795333803321
Diaz Lazaro (Laz)  
London, England : , : Packt Publishing, Limited, , [2018]
Materiale a stampa
Lo trovi qui: Univ. Federico II
Opac: Controlla la disponibilità qui