50 years after the perceptron, 25 years after PDP : neural computation in language sciences / topic editors: Julien Mayor, Pablo Gomez, Franklin Chang and Gary Lupyan
Author Pablo Gomez
Publication/distribution Frontiers Media SA, 2014
Physical description 1 online resource (180 pages) : illustrations; digital, PDF file(s)
Series Frontiers Research Topics
Frontiers in Psychology
Topical subject Perceptrons
Computational linguistics
Computational linguistics - Research
Language acquisition
Cognition
Uncontrolled subject computational linguistics
language acquisition
probabilistic cognition
Recurrent networks
connectionism
computational modeling
word learning
interactive processing
Speech Perception
language processing
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record no. UNINA-9910131531303321
Available at: Univ. Federico II
ICIA 2004 : proceedings of 2004 International Conference on Information Acquisition : June 21-25, 2004, Hefei, China
Publication/distribution [Place of publication not identified]: IEEE, 2004
Discipline 681/.2
Topical subject Detectors
Perceptrons
Information theory
Mechanical Engineering
Engineering & Applied Sciences
Industrial & Management Engineering
Format Printed material
Bibliographic level Monograph
Language of publication eng
Record no. UNISA-996202158903316
Available at: Univ. di Salerno
Sequential methods in pattern recognition and machine learning [electronic resource] / K.S. Fu
Author Fu, K. S. (King Sun), 1930-1985
Publication/distribution New York: Academic Press, 1968
Physical description 1 online resource (245 p.)
Discipline 001.5/3
Series Mathematics in science and engineering
Topical subject Perceptrons
Statistical decision
Machine learning
ISBN 1-282-29019-3
9786612290190
0-08-095559-2
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Front Cover; Sequential Methods in Pattern Recognition and Machine Learning; Copyright Page; Contents; Preface; Chapter 1. Introduction; 1.1 Pattern Recognition; 1.2 Deterministic Classification Techniques; 1.3 Training in Linear Classifiers; 1.4 Statistical Classification Techniques; 1.5 Sequential Decision Model for Pattern Classification; 1.6 Learning in Sequential Pattern Recognition Systems; 1.7 Summary and Further Remarks; References; Chapter 2. Feature Selection and Feature Ordering; 2.1 Feature Selection and Ordering-Information Theoretic Approach
2.2 Feature Selection and Ordering-Karhunen-Loève Expansion; 2.3 Illustrative Examples; 2.4 Summary and Further Remarks; References; Chapter 3. Forward Procedure for Finite Sequential Classification Using Modified Sequential Probability Ratio Test; 3.1 Introduction; 3.2 Modified Sequential Probability Ratio Test-Discrete Case; 3.3 Modified Sequential Probability Ratio Test-Continuous Case; 3.4 Procedure of Modified Generalized Sequential Probability Ratio Test; 3.5 Experiments in Pattern Classification; 3.6 Summary and Further Remarks; References
Chapter 4. Backward Procedure for Finite Sequential Recognition Using Dynamic Programming; 4.1 Introduction; 4.2 Mathematical Formulation and Basic Functional Equation; 4.3 Reduction of Dimensionality; 4.4 Experiments in Pattern Classification; 4.5 Backward Procedure for Both Feature Ordering and Pattern Classification; 4.6 Experiments in Feature Ordering and Pattern Classification; 4.7 Use of Dynamic Programming for Feature-Subset Selection; 4.8 Suboptimal Sequential Pattern Recognition; 4.9 Summary and Further Remarks; References
Chapter 5. Nonparametric Procedure in Sequential Pattern Classification; 5.1 Introduction; 5.2 Sequential Ranks and Sequential Ranking Procedure; 5.3 A Sequential Two-Sample Test Problem; 5.4 Nonparametric Design of Sequential Pattern Classifiers; 5.5 Analysis of Optimal Performance and a Multiclass Generalization; 5.6 Experimental Results and Discussions; 5.7 Summary and Further Remarks; References; Chapter 6. Bayesian Learning in Sequential Pattern Recognition Systems; 6.1 Supervised Learning Using Bayesian Estimation Techniques; 6.2 Nonsupervised Learning Using Bayesian Estimation Techniques
6.3 Bayesian Learning of Slowly Varying Patterns; 6.4 Learning of Parameters Using an Empirical Bayes Approach; 6.5 A General Model for Bayesian Learning Systems; 6.6 Summary and Further Remarks; References; Chapter 7. Learning in Sequential Recognition Systems Using Stochastic Approximation; 7.1 Supervised Learning Using Stochastic Approximation; 7.2 Nonsupervised Learning Using Stochastic Approximation; 7.3 A General Formulation of Nonsupervised Learning Systems Using Stochastic Approximation; 7.4 Learning of Slowly Time-Varying Parameters Using Dynamic Stochastic Approximation
7.5 Summary and Further Remarks
Record no. UNINA-9910778600003321
Available at: Univ. Federico II
Sequential methods in pattern recognition and machine learning [electronic resource] / K.S. Fu
Author Fu, K. S. (King Sun), 1930-1985
Publication/distribution New York: Academic Press, 1968
Physical description 1 online resource (245 p.)
Discipline 001.5/3
Series Mathematics in science and engineering
Topical subject Perceptrons
Statistical decision
Machine learning
ISBN 1-282-29019-3
9786612290190
0-08-095559-2
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note Front Cover; Sequential Methods in Pattern Recognition and Machine Learning; Copyright Page; Contents; Preface; Chapter 1. Introduction; 1.1 Pattern Recognition; 1.2 Deterministic Classification Techniques; 1.3 Training in Linear Classifiers; 1.4 Statistical Classification Techniques; 1.5 Sequential Decision Model for Pattern Classification; 1.6 Learning in Sequential Pattern Recognition Systems; 1.7 Summary and Further Remarks; References; Chapter 2. Feature Selection and Feature Ordering; 2.1 Feature Selection and Ordering-Information Theoretic Approach
2.2 Feature Selection and Ordering-Karhunen-Loève Expansion; 2.3 Illustrative Examples; 2.4 Summary and Further Remarks; References; Chapter 3. Forward Procedure for Finite Sequential Classification Using Modified Sequential Probability Ratio Test; 3.1 Introduction; 3.2 Modified Sequential Probability Ratio Test-Discrete Case; 3.3 Modified Sequential Probability Ratio Test-Continuous Case; 3.4 Procedure of Modified Generalized Sequential Probability Ratio Test; 3.5 Experiments in Pattern Classification; 3.6 Summary and Further Remarks; References
Chapter 4. Backward Procedure for Finite Sequential Recognition Using Dynamic Programming; 4.1 Introduction; 4.2 Mathematical Formulation and Basic Functional Equation; 4.3 Reduction of Dimensionality; 4.4 Experiments in Pattern Classification; 4.5 Backward Procedure for Both Feature Ordering and Pattern Classification; 4.6 Experiments in Feature Ordering and Pattern Classification; 4.7 Use of Dynamic Programming for Feature-Subset Selection; 4.8 Suboptimal Sequential Pattern Recognition; 4.9 Summary and Further Remarks; References
Chapter 5. Nonparametric Procedure in Sequential Pattern Classification; 5.1 Introduction; 5.2 Sequential Ranks and Sequential Ranking Procedure; 5.3 A Sequential Two-Sample Test Problem; 5.4 Nonparametric Design of Sequential Pattern Classifiers; 5.5 Analysis of Optimal Performance and a Multiclass Generalization; 5.6 Experimental Results and Discussions; 5.7 Summary and Further Remarks; References; Chapter 6. Bayesian Learning in Sequential Pattern Recognition Systems; 6.1 Supervised Learning Using Bayesian Estimation Techniques; 6.2 Nonsupervised Learning Using Bayesian Estimation Techniques
6.3 Bayesian Learning of Slowly Varying Patterns; 6.4 Learning of Parameters Using an Empirical Bayes Approach; 6.5 A General Model for Bayesian Learning Systems; 6.6 Summary and Further Remarks; References; Chapter 7. Learning in Sequential Recognition Systems Using Stochastic Approximation; 7.1 Supervised Learning Using Stochastic Approximation; 7.2 Nonsupervised Learning Using Stochastic Approximation; 7.3 A General Formulation of Nonsupervised Learning Systems Using Stochastic Approximation; 7.4 Learning of Slowly Time-Varying Parameters Using Dynamic Stochastic Approximation
7.5 Summary and Further Remarks
Record no. UNINA-9910811018203321
Available at: Univ. Federico II
A statistical approach to neural networks for pattern recognition [electronic resource] / Robert A. Dunne
Author Dunne, Robert A.
Publication/distribution Hoboken, N.J.; Chichester: Wiley, c2007
Physical description 1 online resource (289 p.)
Discipline 006.32
Series Wiley series in computational statistics
Topical subject Perceptrons
Neural networks (Computer science)
Genre/form subject Electronic books.
ISBN 1-280-93517-0
9786610935178
0-470-14815-2
0-470-14814-4
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note A Statistical Approach to Neural Networks for Pattern Recognition; Contents; Notation and Code Examples; Preface; Acknowledgments; 1 Introduction; 1.1 The perceptron; 2 The Multi-Layer Perceptron Model; 2.1 The multi-layer perceptron (MLP); 2.2 The first and second derivatives; 2.3 Additional hidden layers; 2.4 Classifiers; 2.5 Complements and exercises; 3 Linear Discriminant Analysis; 3.1 An alternative method; 3.2 Example; 3.3 Flexible and penalized LDA; 3.4 Relationship of MLP models to LDA; 3.5 Linear classifiers; 3.6 Complements and exercises; 4 Activation and Penalty Functions
4.1 Introduction; 4.2 Interpreting outputs as probabilities; 4.3 The "universal approximator" and consistency; 4.4 Variance and bias; 4.5 Binary variables and logistic regression; 4.6 MLP models and cross-entropy; 4.7 A derivation of the softmax activation function; 4.8 The "natural" pairing and A; 4.9 A comparison of least squares and cross-entropy; 4.10 Conclusion; 4.11 Complements and exercises; 5 Model Fitting and Evaluation; 5.1 Introduction; 5.2 Error rate estimation; 5.3 Model selection for MLP models; 5.4 Penalized training; 5.5 Complements and exercises; 6 The Task-based MLP
6.1 Introduction; 6.2 The task-based MLP; 6.3 Pruning algorithms; 6.4 Interpreting and evaluating task-based MLP models; 6.5 Evaluating the models; 6.6 Conclusion; 6.7 Complements and exercises; 7 Incorporating Spatial Information into an MLP Classifier; 7.1 Allocation and neighbor information; 7.2 Markov random fields; 7.3 Hopfield networks; 7.4 MLP neighbor models; 7.5 Sequential updating; 7.6 Example - Martin's farm; 7.7 Conclusion; 7.8 Complements and exercises; 8 Influence Curves for the Multi-layer Perceptron Classifier; 8.1 Introduction; 8.2 Estimators; 8.3 Influence curves
8.4 M-estimators; 8.5 The MLP; 8.6 Influence curves for pc; 8.7 Summary and Conclusion; 9 The Sensitivity Curves of the MLP Classifier; 9.1 Introduction; 9.2 The sensitivity curve; 9.3 Some experiments; 9.4 Discussion; 9.5 Conclusion; 10 A Robust Fitting Procedure for MLP Models; 10.1 Introduction; 10.2 The effect of a hidden layer; 10.3 Comparison of MLP with robust logistic regression; 10.4 A robust MLP model; 10.5 Diagnostics; 10.6 Conclusion; 10.7 Complements and exercises; 11 Smoothed Weights; 11.1 Introduction; 11.2 MLP models; 11.3 Examples; 11.4 Conclusion
11.5 Complements and exercises; 12 Translation Invariance; 12.1 Introduction; 12.2 Example 1; 12.3 Example 2; 12.4 Example 3; 12.5 Conclusion; 13 Fixed-slope Training; 13.1 Introduction; 13.2 Strategies; 13.3 Fixing γ or O; 13.4 Example 1; 13.5 Example 2; 13.6 Discussion; Bibliography; Appendix A: Function Minimization; A.1 Introduction; A.2 Back-propagation; A.3 Newton-Raphson; A.4 The method of scoring; A.5 Quasi-Newton; A.6 Conjugate gradients; A.7 Scaled conjugate gradients; A.8 Variants on vanilla "back-propagation"; A.9 Line search; A.10 The simplex algorithm; A.11 Implementation
A.12 Examples
Record no. UNINA-9910143416103321
Available at: Univ. Federico II
A statistical approach to neural networks for pattern recognition [electronic resource] / Robert A. Dunne
Author Dunne, Robert A.
Publication/distribution Hoboken, N.J.; Chichester: Wiley, c2007
Physical description 1 online resource (289 p.)
Discipline 006.32
Series Wiley series in computational statistics
Topical subject Perceptrons
Neural networks (Computer science)
ISBN 1-280-93517-0
9786610935178
0-470-14815-2
0-470-14814-4
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note A Statistical Approach to Neural Networks for Pattern Recognition; Contents; Notation and Code Examples; Preface; Acknowledgments; 1 Introduction; 1.1 The perceptron; 2 The Multi-Layer Perceptron Model; 2.1 The multi-layer perceptron (MLP); 2.2 The first and second derivatives; 2.3 Additional hidden layers; 2.4 Classifiers; 2.5 Complements and exercises; 3 Linear Discriminant Analysis; 3.1 An alternative method; 3.2 Example; 3.3 Flexible and penalized LDA; 3.4 Relationship of MLP models to LDA; 3.5 Linear classifiers; 3.6 Complements and exercises; 4 Activation and Penalty Functions
4.1 Introduction; 4.2 Interpreting outputs as probabilities; 4.3 The "universal approximator" and consistency; 4.4 Variance and bias; 4.5 Binary variables and logistic regression; 4.6 MLP models and cross-entropy; 4.7 A derivation of the softmax activation function; 4.8 The "natural" pairing and A; 4.9 A comparison of least squares and cross-entropy; 4.10 Conclusion; 4.11 Complements and exercises; 5 Model Fitting and Evaluation; 5.1 Introduction; 5.2 Error rate estimation; 5.3 Model selection for MLP models; 5.4 Penalized training; 5.5 Complements and exercises; 6 The Task-based MLP
6.1 Introduction; 6.2 The task-based MLP; 6.3 Pruning algorithms; 6.4 Interpreting and evaluating task-based MLP models; 6.5 Evaluating the models; 6.6 Conclusion; 6.7 Complements and exercises; 7 Incorporating Spatial Information into an MLP Classifier; 7.1 Allocation and neighbor information; 7.2 Markov random fields; 7.3 Hopfield networks; 7.4 MLP neighbor models; 7.5 Sequential updating; 7.6 Example - Martin's farm; 7.7 Conclusion; 7.8 Complements and exercises; 8 Influence Curves for the Multi-layer Perceptron Classifier; 8.1 Introduction; 8.2 Estimators; 8.3 Influence curves
8.4 M-estimators; 8.5 The MLP; 8.6 Influence curves for pc; 8.7 Summary and Conclusion; 9 The Sensitivity Curves of the MLP Classifier; 9.1 Introduction; 9.2 The sensitivity curve; 9.3 Some experiments; 9.4 Discussion; 9.5 Conclusion; 10 A Robust Fitting Procedure for MLP Models; 10.1 Introduction; 10.2 The effect of a hidden layer; 10.3 Comparison of MLP with robust logistic regression; 10.4 A robust MLP model; 10.5 Diagnostics; 10.6 Conclusion; 10.7 Complements and exercises; 11 Smoothed Weights; 11.1 Introduction; 11.2 MLP models; 11.3 Examples; 11.4 Conclusion
11.5 Complements and exercises; 12 Translation Invariance; 12.1 Introduction; 12.2 Example 1; 12.3 Example 2; 12.4 Example 3; 12.5 Conclusion; 13 Fixed-slope Training; 13.1 Introduction; 13.2 Strategies; 13.3 Fixing γ or O; 13.4 Example 1; 13.5 Example 2; 13.6 Discussion; Bibliography; Appendix A: Function Minimization; A.1 Introduction; A.2 Back-propagation; A.3 Newton-Raphson; A.4 The method of scoring; A.5 Quasi-Newton; A.6 Conjugate gradients; A.7 Scaled conjugate gradients; A.8 Variants on vanilla "back-propagation"; A.9 Line search; A.10 The simplex algorithm; A.11 Implementation
A.12 Examples
Record no. UNINA-9910830653803321
Available at: Univ. Federico II
A statistical approach to neural networks for pattern recognition [electronic resource] / Robert A. Dunne
Author Dunne, Robert A.
Publication/distribution Hoboken, N.J.; Chichester: Wiley, c2007
Physical description 1 online resource (289 p.)
Discipline 006.32
Series Wiley series in computational statistics
Topical subject Perceptrons
Neural networks (Computer science)
ISBN 1-280-93517-0
9786610935178
0-470-14815-2
0-470-14814-4
Format Printed material
Bibliographic level Monograph
Language of publication eng
Contents note A Statistical Approach to Neural Networks for Pattern Recognition; Contents; Notation and Code Examples; Preface; Acknowledgments; 1 Introduction; 1.1 The perceptron; 2 The Multi-Layer Perceptron Model; 2.1 The multi-layer perceptron (MLP); 2.2 The first and second derivatives; 2.3 Additional hidden layers; 2.4 Classifiers; 2.5 Complements and exercises; 3 Linear Discriminant Analysis; 3.1 An alternative method; 3.2 Example; 3.3 Flexible and penalized LDA; 3.4 Relationship of MLP models to LDA; 3.5 Linear classifiers; 3.6 Complements and exercises; 4 Activation and Penalty Functions
4.1 Introduction; 4.2 Interpreting outputs as probabilities; 4.3 The "universal approximator" and consistency; 4.4 Variance and bias; 4.5 Binary variables and logistic regression; 4.6 MLP models and cross-entropy; 4.7 A derivation of the softmax activation function; 4.8 The "natural" pairing and A; 4.9 A comparison of least squares and cross-entropy; 4.10 Conclusion; 4.11 Complements and exercises; 5 Model Fitting and Evaluation; 5.1 Introduction; 5.2 Error rate estimation; 5.3 Model selection for MLP models; 5.4 Penalized training; 5.5 Complements and exercises; 6 The Task-based MLP
6.1 Introduction; 6.2 The task-based MLP; 6.3 Pruning algorithms; 6.4 Interpreting and evaluating task-based MLP models; 6.5 Evaluating the models; 6.6 Conclusion; 6.7 Complements and exercises; 7 Incorporating Spatial Information into an MLP Classifier; 7.1 Allocation and neighbor information; 7.2 Markov random fields; 7.3 Hopfield networks; 7.4 MLP neighbor models; 7.5 Sequential updating; 7.6 Example - Martin's farm; 7.7 Conclusion; 7.8 Complements and exercises; 8 Influence Curves for the Multi-layer Perceptron Classifier; 8.1 Introduction; 8.2 Estimators; 8.3 Influence curves
8.4 M-estimators; 8.5 The MLP; 8.6 Influence curves for pc; 8.7 Summary and Conclusion; 9 The Sensitivity Curves of the MLP Classifier; 9.1 Introduction; 9.2 The sensitivity curve; 9.3 Some experiments; 9.4 Discussion; 9.5 Conclusion; 10 A Robust Fitting Procedure for MLP Models; 10.1 Introduction; 10.2 The effect of a hidden layer; 10.3 Comparison of MLP with robust logistic regression; 10.4 A robust MLP model; 10.5 Diagnostics; 10.6 Conclusion; 10.7 Complements and exercises; 11 Smoothed Weights; 11.1 Introduction; 11.2 MLP models; 11.3 Examples; 11.4 Conclusion
11.5 Complements and exercises; 12 Translation Invariance; 12.1 Introduction; 12.2 Example 1; 12.3 Example 2; 12.4 Example 3; 12.5 Conclusion; 13 Fixed-slope Training; 13.1 Introduction; 13.2 Strategies; 13.3 Fixing γ or O; 13.4 Example 1; 13.5 Example 2; 13.6 Discussion; Bibliography; Appendix A: Function Minimization; A.1 Introduction; A.2 Back-propagation; A.3 Newton-Raphson; A.4 The method of scoring; A.5 Quasi-Newton; A.6 Conjugate gradients; A.7 Scaled conjugate gradients; A.8 Variants on vanilla "back-propagation"; A.9 Line search; A.10 The simplex algorithm; A.11 Implementation
A.12 Examples
Record no. UNINA-9910841024903321
Available at: Univ. Federico II
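Every record above is cataloged under the subject heading "Perceptrons". For orientation, here is a minimal sketch of the classic Rosenblatt perceptron learning rule that the heading refers to; it is an illustration only, not drawn from any of the cataloged texts, and the function name, dataset, and hyperparameters are assumptions of this sketch.

```python
# Minimal Rosenblatt perceptron sketch (illustrative; names, dataset, and
# hyperparameters are this sketch's own assumptions).

def train_perceptron(samples, epochs=20, lr=1.0):
    """Train a single threshold unit on (inputs, label) pairs, labels in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n   # weights, one per input
    b = 0.0         # bias term
    for _ in range(epochs):
        for x, y in samples:
            # Fire (output 1) when the weighted sum exceeds zero.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred
            # Rosenblatt update: shift the separating hyperplane toward a
            # misclassified point; no change when the prediction is correct.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron convergence theorem
# guarantees this rule finds a separating hyperplane.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0 for x, _ in data]
# preds is now [0, 0, 0, 1]
```

The same rule cannot learn a non-separable function such as XOR, the limitation made famous by Minsky and Papert's Perceptrons and addressed by the multi-layer models covered in the Dunne records above.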