LEADER 08985nam 2200529 450
001 9910523765803321
005 20230124202130.0
010 $a1-5231-5092-0
010 $a1-4842-7368-0
035 $a(CKB)4950000000281835
035 $a(MiAaPQ)EBC6785174
035 $a(Au-PeEL)EBL6785174
035 $a(OCoLC)1281957925
035 $a(OCoLC-P)1281957925
035 $a(CaSebORM)9781484273685
035 $a(EXLCZ)994950000000281835
100 $a20220712d2022 uy 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aArtificial neural networks with Java $etools for building neural network applications /$fIgor Livshin
205 $a2nd ed.
210 1$a[Place of publication not identified] :$cApress,$d[2022]
210 4$d©2022
215 $a1 online resource (635 pages)
300 $aIncludes index.
311 $a1-4842-7367-2
327 $aIntro -- Table of Contents -- About the Author -- About the Technical Reviewers -- Acknowledgments -- Introduction -- Part I: Getting Started with Neural Networks -- Chapter 1: Learning About Neural Networks -- Biological and Artificial Neurons -- Activation Functions -- Summary -- Chapter 2: Internal Mechanics of Neural Network Processing -- Function to Be Approximated -- Network Architecture -- Forward Pass Calculation -- Input Record 1 -- Input Record 2 -- Input Record 3 -- Input Record 4 -- Back-Propagation Pass -- Function Derivative and Function Divergent -- Most Commonly Used Function Derivatives -- Summary -- Chapter 3: Manual Neural Network Processing -- Example: Manual Approximation of a Function at a Single Point -- Building the Neural Network -- Forward Pass Calculation -- Hidden Layers -- Output Layer -- Backward Pass Calculation -- Calculating Weight Adjustments for the Output-Layer Neurons -- Calculating Adjustment for W211 -- Calculating Adjustment for W212 -- Calculating Adjustment for W213 -- Calculating Weight Adjustments for Hidden-Layer Neurons -- Calculating Adjustment for W111 -- Calculating Adjustment for W112 -- Calculating Adjustment for W121 -- Calculating Adjustment for W122 -- Calculating Adjustment for W131 -- Calculating Adjustment for W132 -- Updating Network Biases -- Back to the Forward Pass -- Hidden Layers -- Output Layer -- Matrix Form of Network Calculation -- Digging Deeper -- Mini-Batches and Stochastic Gradient -- Summary -- Part II: Neural Network Java Development Environment -- Chapter 4: Configuring Your Development Environment -- Installing the Java Environment and NetBeans on Your Windows Machine -- Installing the Encog Java Framework -- Installing the XChart Package -- Summary -- Chapter 5: Neural Networks Development Using the Java Encog Framework.
327 $aExample: Function Approximation Using Java Environment -- Network Architecture -- Normalizing the Input Datasets -- Building the Java Program That Normalizes Both Datasets -- Building the Neural Network Processing Program -- Program Code -- Debugging and Executing the Program -- Processing Results for the Training Method -- Testing the Network -- Testing Results -- Digging Deeper -- Summary -- Chapter 6: Neural Network Prediction Outside of the Training Range -- Example: Approximating Periodic Functions Outside of the Training Range -- Network Architecture for the Example -- Program Code for the Example -- Testing the Network -- Example: Correct Way of Approximating Periodic Functions Outside of the Training Range -- Preparing the Training Data -- Network Architecture for the Example -- Program Code for Example -- Training Results for Example -- Log of Testing Results for Example 3 -- Summary -- Chapter 7: Processing Complex Periodic Functions -- Example: Approximation of a Complex Periodic Function -- Data Preparation -- Reflecting Function Topology in the Data -- Network Architecture -- Program Code -- Training the Network -- Testing the Network -- Digging Deeper -- Summary -- Chapter 8: Approximating Noncontinuous Functions -- Example: Approximating Noncontinuous Functions -- Network Architecture -- Program Code -- Code Fragments for the Training Process -- Unsatisfactory Training Results -- Approximating the Noncontinuous Function Using the Micro-Batch Method -- Program Code for Micro-Batch Processing -- Program Code for the getChart() Method -- Code Fragment 1 of the Training Method -- Code Fragment 2 of the Training Method -- Training Results for the Micro-Batch Method -- Testing the Processing Logic -- Testing the Results for the Micro-Batch Method -- Digging Deeper -- Summary. 
327 $aChapter 9: Approximation of Continuous Functions with Complex Topology -- Example: Approximation of Continuous Functions with Complex Topology Using a Conventional Neural Network Process -- Network Architecture for the Example -- Program Code for the Example -- Training Processing Results for the Example -- Approximation of Continuous Functions with Complex Topology Using the Micro-Batch Method -- Program Code for the Example Using the Micro-Batch Method -- Example: Approximation of Spiral-like Functions -- Network Architecture for the Example -- Program Code for Example -- Approximation of the Same Functions Using Micro-Batch Method -- Summary -- Chapter 10: Using Neural Networks for the Classification of Objects -- Example: Classification of Records -- Training Dataset -- Network Architecture -- Testing Dataset -- Program Code for Data Normalization -- Program Code for Classification -- Training Results -- Testing Results -- Summary -- Chapter 11: The Importance of Selecting the Correct Model -- Example: Predicting Next Month's Stock Market Price -- Including the Function Topology in the Dataset -- Building Micro-Batch Files -- Network Architecture -- Program Code -- Training Process -- Training Results -- Testing Dataset -- Testing Logic -- Testing Results -- Analyzing Testing Results -- Summary -- Chapter 12: Approximation Functions in 3D Space -- Example: Approximation Functions in 3D Space -- Data Preparation -- Network Architecture -- Program Code -- Processing Results -- Summary -- Part III: Introduction to Computer Vision -- Chapter 13: Image Recognition -- Classification of Handwritten Digits -- Preparing the Input Data -- Input Data Conversion -- Building the Conversion Program -- Summary -- Chapter 14: Classification of Handwritten Digits -- Network Architecture -- Program Code -- Programming Logic -- Execution.
327 $aConvolution Neural Network -- Summary -- Index.
330 $aDevelop neural network applications using the Java environment. After learning the rules involved in neural network processing, this second edition shows you how to manually process your first neural network example. The book covers the internals of forward and back propagation and helps you understand the main principles of neural network processing. You will also learn how to prepare the data to be used in neural network development, and you will be able to apply various data preparation techniques to many unconventional tasks. This book discusses the practical aspects of using Java for neural network processing. You will learn how to use the Encog Java framework for processing large-scale neural network applications. Also covered is the use of neural networks for approximation of non-continuous functions. In addition to using neural networks for regression, this second edition shows you how to use neural networks for computer vision. It focuses on image recognition, such as the classification of handwritten digits, input data preparation and conversion, and building the conversion program. You will also learn about topics related to the classification of handwritten digits, such as network architecture, program code, programming logic, and execution. The step-by-step approach taken in the book includes plenty of examples, diagrams, and screenshots to help you grasp the concepts quickly and easily. What You Will Learn: Use Java for the development of neural network applications; Prepare data for many different tasks; Carry out some unusual neural network processing; Use a neural network to process non-continuous functions; Develop a program that recognizes handwritten digits. Who This Book Is For: Intermediate machine learning and deep learning developers who are interested in switching to Java.
606 $aNeural networks (Computer science)
606 $aJava (Computer program language)
615 0$aNeural networks (Computer science)
615 0$aJava (Computer program language)
676 $a006.32
700 $aLivshin$b Igor$0905428
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910523765803321
996 $aArtificial Neural Networks with Java$92536855
997 $aUNINA