LEADER 02278oam 2200517 450
001 9910703408803321
005 20151216144148.0
035 $a(CKB)5470000002431607
035 $a(OCoLC)894251264
035 $a(EXLCZ)995470000002431607
100 $a20141103d2014 ua 0
101 0 $aeng
135 $aurmn|||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aGroundwater-quality data in the north San Francisco Bay shallow aquifer study unit, 2012 $eresults from the California GAMA Program /$fby George L. Bennett V and Miranda S. Fram ; prepared in cooperation with the California State Water Resources Control Board
210 1$aReston, Virginia :$cU.S. Department of the Interior, U.S. Geological Survey,$d2014.
215 $a1 online resource (x, 94 pages) $ccolor illustrations, color maps
225 1 $aData series ;$v865
300 $aTitle from title screen (viewed on Nov. 3, 2014).
300 $aChiefly tables.
300 $a"A product of the California Groundwater Ambient Monitoring and Assessment (GAMA) Program."
320 $aIncludes bibliographical references (pages 15-20).
517 $aGroundwater-quality data in the north San Francisco Bay shallow aquifer study unit, 2012
606 $aGroundwater$xQuality$zCalifornia$zSan Francisco Bay
606 $aWater quality management$zCalifornia$zSan Francisco Bay Area$xData processing
606 $aWater quality management$zCalifornia$zSan Francisco Bay
606 $aWater-supply$zCalifornia$zSan Francisco Bay
615 0$aGroundwater$xQuality
615 0$aWater quality management$xData processing.
615 0$aWater quality management
615 0$aWater-supply
700 $aBennett$b George L.$01390134
702 $aFram$b Miranda S$g(Miranda Susan),
712 02$aGeological Survey (U.S.),
712 02$aCalifornia.$bState Water Resources Control Board,
712 02$aGround Water Ambient Monitoring and Assessment Program (Calif.)
801 0$bAZS
801 1$bAZS
801 2$bOCLCQ
801 2$bGPO
906 $aBOOK
912 $a9910703408803321
996 $aGroundwater-quality data in the north San Francisco Bay shallow aquifer study unit, 2012$93442542
997 $aUNINA
LEADER 04516nam 22006735 450
001 9910300755003321
005 20200702161832.0
010 $a9781484237908
010 $a1484237900
024 7 $a10.1007/978-1-4842-3790-8
035 $a(CKB)4100000006519785
035 $a(MiAaPQ)EBC5510213
035 $a(DE-He213)978-1-4842-3790-8
035 $a(CaSebORM)9781484237908
035 $a(PPN)23054262X
035 $a(OCoLC)1056157446
035 $a(OCoLC)on1056157446
035 $a(EXLCZ)994100000006519785
100 $a20180907d2018 u| 0
101 0 $aeng
135 $aurcnu||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aApplied Deep Learning $eA Case-Based Approach to Understanding Deep Neural Networks /$fby Umberto Michelucci
205 $a1st ed. 2018.
210 1$aBerkeley, CA :$cApress :$cImprint: Apress,$d2018.
215 $a1 online resource (425 pages)
311 08$a9781484237892
311 08$a1484237897
327 $aChapter 1: Introduction -- Chapter 2: Single Neurons -- Chapter 3: Fully connected Neural Network with more neurons -- Chapter 4: Neural networks error analysis -- Chapter 5: Dropout technique -- Chapter 6: Hyper parameters tuning -- Chapter 7: Tensorflow and optimizers (Gradient descent, Adam, momentum, etc.) -- Chapter 8: Convolutional Networks and image recognition -- Chapter 9: Recurrent Neural Networks -- Chapter 10: A practical COMPLETE example from scratch (put everything together) -- Chapter 11: Logistic regression implemented from scratch in Python without libraries.
330 $aWork with advanced topics in deep learning, such as optimization algorithms, hyper-parameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks.
You'll begin by studying activation functions, mostly with a single neuron (ReLU, sigmoid, and Swish), seeing how to perform linear and logistic regression using TensorFlow, and choosing the right cost function. The next section covers more complex neural network architectures with several layers and neurons and explores the problem of random weight initialization. An entire chapter is dedicated to a complete overview of neural network error analysis, with examples of solving problems that originate from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also discusses how to implement logistic regression completely from scratch without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow allow quick and efficient experiments. Case studies are included for each method to put the theoretical material into practice. You'll discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy). You will: implement advanced techniques correctly in Python and TensorFlow; debug and optimize advanced methods (such as dropout and regularization); carry out error analysis (to recognize whether you have a bias problem, a variance problem, a data offset problem, and so on); and set up a machine learning project focused on deep learning on a complex dataset.
517 3 $aCase-based approach to understanding neural networks
606 $aArtificial intelligence
606 $aPython (Computer program language)
606 $aOpen source software
606 $aComputer programming
606 $aBig data
606 $aArtificial Intelligence$3https://scigraph.springernature.com/ontologies/product-market-codes/I21000
606 $aPython$3https://scigraph.springernature.com/ontologies/product-market-codes/I29080
606 $aOpen Source$3https://scigraph.springernature.com/ontologies/product-market-codes/I29090
606 $aBig Data$3https://scigraph.springernature.com/ontologies/product-market-codes/I29120
615 0$aArtificial intelligence.
615 0$aPython (Computer program language)
615 0$aOpen source software.
615 0$aComputer programming.
615 0$aBig data.
615 14$aArtificial Intelligence.
615 24$aPython.
615 24$aOpen Source.
615 24$aBig Data.
676 $a006.31
700 $aMichelucci$b Umberto$4aut$4http://id.loc.gov/vocabulary/relators/aut$01059671
801 0$bUMI
801 1$bUMI
906 $aBOOK
912 $a9910300755003321
996 $aApplied Deep Learning$92528706
997 $aUNINA