LEADER 05267nam 22006374a 450
001 9910457973203321
005 20210805121035.0
010 $a1-281-31146-4
010 $a9786611311469
010 $a0-08-051361-1
035 $a(CKB)1000000000384708
035 $a(EBL)344670
035 $a(OCoLC)239612960
035 $a(SSID)ssj0000218839
035 $a(PQKBManifestationID)11189796
035 $a(PQKBTitleCode)TC0000218839
035 $a(PQKBWorkID)10220212
035 $a(PQKB)11084255
035 $a(MiAaPQ)EBC344670
035 $a(Au-PeEL)EBL344670
035 $a(CaPaEBR)ebr10229407
035 $a(CaONFJC)MIL131146
035 $a(EXLCZ)991000000000384708
100 $a20070828d2006 uy 0
101 0 $aeng
135 $aur|n|---|||||
181 $ctxt
182 $cc
183 $acr
200 10$aPattern recognition$b[electronic resource] /$fSergios Theodoridis, Konstantinos Koutroumbas
205 $a3rd ed.
210 $aSan Diego, CA $cAcademic Press$dc2006
215 $a1 online resource (854 p.)
300 $aDescription based upon print version of record.
311 $a0-12-369531-7
327 $aFront cover; Title page; Copyright page; Table of contents; PREFACE; 1 INTRODUCTION; 1.1 IS PATTERN RECOGNITION IMPORTANT?; 1.2 FEATURES, FEATURE VECTORS, AND CLASSIFIERS; 1.3 SUPERVISED VERSUS UNSUPERVISED PATTERN RECOGNITION; 1.4 OUTLINE OF THE BOOK; 2 CLASSIFIERS BASED ON BAYES DECISION THEORY; 2.1 INTRODUCTION; 2.2 BAYES DECISION THEORY; 2.3 DISCRIMINANT FUNCTIONS AND DECISION SURFACES; 2.4 BAYESIAN CLASSIFICATION FOR NORMAL DISTRIBUTIONS; 2.5 ESTIMATION OF UNKNOWN PROBABILITY DENSITY FUNCTIONS; 2.6 THE NEAREST NEIGHBOR RULE; 2.7 BAYESIAN NETWORKS; 3 LINEAR CLASSIFIERS; 3.1 INTRODUCTION
327 $a3.2 LINEAR DISCRIMINANT FUNCTIONS AND DECISION HYPERPLANES; 3.3 THE PERCEPTRON ALGORITHM; 3.4 LEAST SQUARES METHODS; 3.5 MEAN SQUARE ESTIMATION REVISITED; 3.6 LOGISTIC DISCRIMINATION; 3.7 SUPPORT VECTOR MACHINES; 4 NONLINEAR CLASSIFIERS; 4.1 INTRODUCTION; 4.2 THE XOR PROBLEM; 4.3 THE TWO-LAYER PERCEPTRON; 4.4 THREE-LAYER PERCEPTRONS; 4.5 ALGORITHMS BASED ON EXACT CLASSIFICATION OF THE TRAINING SET; 4.6 THE BACKPROPAGATION ALGORITHM; 4.7 VARIATIONS ON THE BACKPROPAGATION THEME; 4.8 THE COST FUNCTION CHOICE; 4.9 CHOICE OF THE NETWORK SIZE; 4.10 A SIMULATION EXAMPLE
327 $a4.11 NETWORKS WITH WEIGHT SHARING; 4.12 GENERALIZED LINEAR CLASSIFIERS; 4.13 CAPACITY OF THE l-DIMENSIONAL SPACE IN LINEAR DICHOTOMIES; 4.14 POLYNOMIAL CLASSIFIERS; 4.15 RADIAL BASIS FUNCTION NETWORKS; 4.16 UNIVERSAL APPROXIMATORS; 4.17 SUPPORT VECTOR MACHINES: THE NONLINEAR CASE; 4.18 DECISION TREES; 4.19 COMBINING CLASSIFIERS; 4.20 THE BOOSTING APPROACH TO COMBINE CLASSIFIERS; 4.21 DISCUSSION; 5 FEATURE SELECTION; 5.1 INTRODUCTION; 5.2 PREPROCESSING; 5.3 FEATURE SELECTION BASED ON STATISTICAL HYPOTHESIS TESTING; 5.4 THE RECEIVER OPERATING CHARACTERISTICS (ROC) CURVE
327 $a5.5 CLASS SEPARABILITY MEASURES; 5.6 FEATURE SUBSET SELECTION; 5.7 OPTIMAL FEATURE GENERATION; 5.8 NEURAL NETWORKS AND FEATURE GENERATION/SELECTION; 5.9 A HINT ON GENERALIZATION THEORY; 5.10 THE BAYESIAN INFORMATION CRITERION; 6 FEATURE GENERATION I: LINEAR TRANSFORMS; 6.1 INTRODUCTION; 6.2 BASIS VECTORS AND IMAGES; 6.3 THE KARHUNEN-LOÈVE TRANSFORM; 6.4 THE SINGULAR VALUE DECOMPOSITION; 6.5 INDEPENDENT COMPONENT ANALYSIS; 6.6 THE DISCRETE FOURIER TRANSFORM (DFT); 6.7 THE DISCRETE COSINE AND SINE TRANSFORMS; 6.8 THE HADAMARD TRANSFORM; 6.9 THE HAAR TRANSFORM; 6.10 THE HAAR EXPANSION REVISITED
327 $a6.11 DISCRETE TIME WAVELET TRANSFORM (DTWT); 6.12 THE MULTIRESOLUTION INTERPRETATION; 6.13 WAVELET PACKETS; 6.14 A LOOK AT TWO-DIMENSIONAL GENERALIZATIONS; 6.15 APPLICATIONS; 7 FEATURE GENERATION II; 7.1 INTRODUCTION; 7.2 REGIONAL FEATURES; 7.3 FEATURES FOR SHAPE AND SIZE CHARACTERIZATION; 7.4 A GLIMPSE AT FRACTALS; 7.5 TYPICAL FEATURES FOR SPEECH AND AUDIO CLASSIFICATION; 8 TEMPLATE MATCHING; 8.1 INTRODUCTION; 8.2 MEASURES BASED ON OPTIMAL PATH SEARCHING TECHNIQUES; 8.3 MEASURES BASED ON CORRELATIONS; 8.4 DEFORMABLE TEMPLATE MODELS; 9 CONTEXT-DEPENDENT CLASSIFICATION; 9.1 INTRODUCTION
327 $a9.2 THE BAYES CLASSIFIER
330 $aPattern recognition is a fast-growing area with applications in a wide variety of fields, such as communications engineering, bioinformatics, data mining, and content-based database retrieval, to name but a few. This new edition addresses and keeps pace with the most recent advancements in these and related areas. This new edition: a) covers Data Mining, which was not treated in the previous edition and is integrated with existing material in the book, b) includes new results on Learning Theory and Support Vector Machines, which are at the forefront of today's research, with a lot of inter
606 $aPattern recognition systems
608 $aElectronic books.
615 0$aPattern recognition systems.
676 $a006.3
676 $a006.4
700 $aTheodoridis$b Sergios$f1951-$0299259
701 $aKoutroumbas$b Konstantinos$f1967-$0299260
801 0$bMiAaPQ
801 1$bMiAaPQ
801 2$bMiAaPQ
906 $aBOOK
912 $a9910457973203321
996 $aPattern recognition$9730085
997 $aUNINA