LEADER 00786nam0-22002771i-450-
001 990005954280403321
005 19980601
035 $a000595428
035 $aFED01000595428
035 $a(Aleph)000595428FED01
035 $a000595428
100 $a19980601g18401843km-y0itay50------ba
105 $a--------00-yy
200 1 $aAbhandlung auf dem Strafrechte und dem Strafprocesse$fANTON BAUER
210 $aGöttingen$cDieterichschen Buchhandlung$d1840-43
215 $a3 v.$d22 cm
676 $a345
700 1$aBauer,$bAnton$0226365
801 0$aIT$bUNINA$gRICA$2UNIMARC
901 $aBK
912 $a990005954280403321
952 $aXII B 21$bS.I$fFGBC
959 $aFGBC
996 $aAbhandlung auf dem Strafrechte und dem Strafprocesse$9585031
997 $aUNINA
DB $aGIU01

LEADER 01570nam 2200445 450
001 9910693974003321
005 20221024221919.0
035 $a(CKB)3450000000002508
035 $a(NjHacI)993450000000002508
035 $a(OCoLC)57397392
035 9 $aocm57397392
035 $a(OCoLC)993450000000002508
035 $a(EXLCZ)993450000000002508
100 $a20221024d2004 uy 0
101 0 $aeng
135 $aur|||||||||||
181 $ctxt$2rdacontent
182 $cc$2rdamedia
183 $acr$2rdacarrier
200 10$aSexuality and reproductive health following spinal cord injury /$fDan DeForge [and nine others]
210 1$aRockville (MD) :$cAgency for Healthcare Research and Quality (US),$d[2004]
210 4$d©2004
215 $a1 online resource (254 pages) :$cillustrations
225 1 $aEvidence report/technology assessment. Summary ;$vno. 109
225 1 $aAHRQ pub. ;$vno. 05-E003-1
300 $aTitle from title screen (viewed on Jan. 10, 2005).
311 $a1-58763-171-7
320 $aIncludes bibliographical references.
606 $aSex instruction for people with disabilities
615 0$aSex instruction for people with disabilities.
676 $a613.96087
700 $aDeForge$b D.
701 $aDeForge$b D$01352503
712 02$aUnited States.$bAgency for Healthcare Research and Quality.
801 0$bNjHacI
801 1$bNjHacI
906 $aBOOK
912 $a9910693974003321
996 $aSexuality and reproductive health following spinal cord injury$93181751
997 $aUNINA

LEADER 05459nam 2200685Ia 450
001 9911019799403321
005 20200520144314.0
010 $a9786610935178
010 $a9781280935176
010 $a1280935170
010 $a9780470148150
010 $a0470148152
010 $a9780470148143
010 $a0470148144
035 $a(CKB)1000000000355009
035 $a(EBL)309748
035 $a(OCoLC)476091973
035 $a(SSID)ssj0000251017
035 $a(PQKBManifestationID)11188681
035 $a(PQKBTitleCode)TC0000251017
035 $a(PQKBWorkID)10247187
035 $a(PQKB)10527494
035 $a(MiAaPQ)EBC309748
035 $a(Perlego)2749660
035 $a(EXLCZ)991000000000355009
100 $a20070830d2007 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 12$aA statistical approach to neural networks for pattern recognition /$fRobert A. Dunne
210 $aHoboken, N.J. ;$aChichester :$cWiley,$dc2007
215 $a1 online resource (289 p.)
225 1 $aWiley series in computational statistics
300 $aDescription based upon print version of record.
311 08$a9780471741084
311 08$a0471741086
320 $aIncludes bibliographical references and index.
327 $aA Statistical Approach to Neural Networks for Pattern Recognition; Contents; Notation and Code Examples; Preface; Acknowledgments; 1 Introduction; 1.1 The perceptron; 2 The Multi-Layer Perceptron Model; 2.1 The multi-layer perceptron (MLP); 2.2 The first and second derivatives; 2.3 Additional hidden layers; 2.4 Classifiers; 2.5 Complements and exercises; 3 Linear Discriminant Analysis; 3.1 An alternative method; 3.2 Example; 3.3 Flexible and penalized LDA; 3.4 Relationship of MLP models to LDA; 3.5 Linear classifiers; 3.6 Complements and exercises; 4 Activation and Penalty Functions
327 $a4.1 Introduction; 4.2 Interpreting outputs as probabilities; 4.3 The "universal approximator" and consistency; 4.4 Variance and bias; 4.5 Binary variables and logistic regression; 4.6 MLP models and cross-entropy; 4.7 A derivation of the softmax activation function; 4.8 The "natural" pairing and A,; 4.9 A comparison of least squares and cross-entropy; 4.10 Conclusion; 4.11 Complements and exercises; 5 Model Fitting and Evaluation; 5.1 Introduction; 5.2 Error rate estimation; 5.3 Model selection for MLP models; 5.4 Penalized training; 5.5 Complements and exercises; 6 The Task-based MLP
327 $a6.1 Introduction; 6.2 The task-based MLP; 6.3 Pruning algorithms; 6.4 Interpreting and evaluating task-based MLP models; 6.5 Evaluating the models; 6.6 Conclusion; 6.7 Complements and exercises; 7 Incorporating Spatial Information into an MLP Classifier; 7.1 Allocation and neighbor information; 7.2 Markov random fields; 7.3 Hopfield networks; 7.4 MLP neighbor models; 7.5 Sequential updating; 7.6 Example - Martin's farm; 7.7 Conclusion; 7.8 Complements and exercises; 8 Influence Curves for the Multi-layer Perceptron Classifier; 8.1 Introduction; 8.2 Estimators; 8.3 Influence curves
327 $a8.4 M-estimators; 8.5 The MLP; 8.6 Influence curves for pc; 8.7 Summary and Conclusion; 9 The Sensitivity Curves of the MLP Classifier; 9.1 Introduction; 9.2 The sensitivity curve; 9.3 Some experiments; 9.4 Discussion; 9.5 Conclusion; 10 A Robust Fitting Procedure for MLP Models; 10.1 Introduction; 10.2 The effect of a hidden layer; 10.3 Comparison of MLP with robust logistic regression; 10.4 A robust MLP model; 10.5 Diagnostics; 10.6 Conclusion; 10.7 Complements and exercises; 11 Smoothed Weights; 11.1 Introduction; 11.2 MLP models; 11.3 Examples; 11.4 Conclusion
327 $a11.5 Complements and exercises; 12 Translation Invariance; 12.1 Introduction; 12.2 Example 1; 12.3 Example 2; 12.4 Example 3; 12.5 Conclusion; 13 Fixed-slope Training; 13.1 Introduction; 13.2 Strategies; 13.3 Fixing ? or O; 13.4 Example 1; 13.5 Example 2; 13.6 Discussion; Bibliography; Appendix A: Function Minimization; A.1 Introduction; A.2 Back-propagation; A.3 Newton-Raphson; A.4 The method of scoring; A.5 Quasi-Newton; A.6 Conjugate gradients; A.7 Scaled conjugate gradients; A.8 Variants on vanilla "back-propagation"; A.9 Line search; A.10 The simplex algorithm; A.11 Implementation
327 $aA.12 Examples
330 $aAn accessible and up-to-date treatment featuring the connection between neural networks and statistics. A Statistical Approach to Neural Networks for Pattern Recognition presents a statistical treatment of the Multilayer Perceptron (MLP), which is the most widely used of the neural network models. This book aims to answer questions that arise when statisticians are first confronted with this type of model, such as: How robust is the model to outliers? Could the model be made more robust? Which points will have a high leverage? What are good starting values for the fitting algorithm?