000 05691nam a22004935i 4500
001 978-1-4614-5323-9
003 DE-He213
005 20140220082819.0
007 cr nn 008mamaa
008 121026s2013 xxu| s |||| 0|eng d
020 _a9781461453239
_9978-1-4614-5323-9
024 7 _a10.1007/978-1-4614-5323-9
_2doi
050 4 _aQ337.5
050 4 _aTK7882.P3
072 7 _aUYQP
_2bicssc
072 7 _aCOM016000
_2bisacsh
082 0 4 _a006.4
_223
100 1 _aDougherty, Geoff.
_eauthor.
245 1 0 _aPattern Recognition and Classification
_h[electronic resource] :
_bAn Introduction /
_cby Geoff Dougherty.
264 1 _aNew York, NY :
_bSpringer New York :
_bImprint: Springer,
_c2013.
300 _aXI, 196 p. 158 illus., 104 illus. in color.
_bonline resource.
336 _atext
_btxt
_2rdacontent
337 _acomputer
_bc
_2rdamedia
338 _aonline resource
_bcr
_2rdacarrier
347 _atext file
_bPDF
_2rda
505 0 _aPreface -- Acknowledgments -- Chapter 1 Introduction -- 1.1 Overview -- 1.2 Classification -- 1.3 Organization of the Book -- Bibliography -- Exercises -- Chapter 2 Classification -- 2.1 The Classification Process -- 2.2 Features -- 2.3 Training and Learning -- 2.4 Supervised Learning and Algorithm Selection -- 2.5 Approaches to Classification -- 2.6 Examples -- 2.6.1 Classification by Shape -- 2.6.2 Classification by Size -- 2.6.3 More Examples -- 2.6.4 Classification of Letters -- Bibliography -- Exercises -- Chapter 3 Non-Metric Methods -- 3.1 Introduction -- 3.2 Decision Tree Classifier -- 3.2.1 Information, Entropy and Impurity -- 3.2.2 Information Gain -- 3.2.3 Decision Tree Issues -- 3.2.4 Strengths and Weaknesses -- 3.3 Rule-Based Classifier -- 3.4 Other Methods -- Bibliography -- Exercises -- Chapter 4 Statistical Pattern Recognition -- 4.1 Measured Data and Measurement Errors -- 4.2 Probability Theory -- 4.2.1 Simple Probability Theory -- 4.2.2 Conditional Probability and Bayes’ Rule -- 4.2.3 Naïve Bayes Classifier -- 4.3 Continuous Random Variables -- 4.3.1 The Multivariate Gaussian -- 4.3.2 The Covariance Matrix -- 4.3.3 The Mahalanobis Distance -- Bibliography -- Exercises -- Chapter 5 Supervised Learning -- 5.1 Parametric and Non-Parametric Learning -- 5.2 Parametric Learning -- 5.2.1 Bayesian Decision Theory -- 5.2.2 Discriminant Functions and Decision Boundaries -- 5.2.3 MAP (Maximum A Posteriori) Estimator -- Bibliography -- Exercises -- Chapter 6 Non-Parametric Learning -- 6.1 Histogram Estimator and Parzen Windows -- 6.2 k-Nearest Neighbor (k-NN) Classification -- 6.3 Artificial Neural Networks (ANNs) -- 6.4 Kernel Machines -- Bibliography -- Exercises -- Chapter 7 Feature Extraction and Selection -- 7.1 Reducing Dimensionality -- 7.1.1 Pre-Processing -- 7.2 Feature Selection -- 7.2.1 Inter/Intra-Class Distance -- 7.2.2 Subset Selection -- 7.3 Feature Extraction -- 7.3.1 Principal Component Analysis (PCA) -- 7.3.2 Linear Discriminant Analysis (LDA) -- Bibliography -- Exercises -- Chapter 8 Unsupervised Learning -- 8.1 Clustering -- 8.2 k-Means Clustering -- 8.2.1 Fuzzy c-Means Clustering -- 8.3 (Agglomerative) Hierarchical Clustering -- Bibliography -- Exercises -- Chapter 9 Estimating and Comparing Classifiers -- 9.1 Comparing Classifiers and the No Free Lunch Theorem -- 9.1.2 Bias and Variance -- 9.2 Cross-Validation and Resampling Methods -- 9.2.1 The Holdout Method -- 9.2.2 k-Fold Cross-Validation -- 9.2.3 Bootstrap -- 9.3 Measuring Classifier Performance -- 9.4 Comparing Classifiers -- 9.4.1 ROC Curves -- 9.4.2 McNemar’s Test -- 9.4.3 Other Statistical Tests -- 9.4.4 The Classification Toolbox -- 9.5 Combining Classifiers -- Bibliography -- Chapter 10 Projects -- 10.1 Retinal Tortuosity as an Indicator of Disease -- 10.2 Segmentation by Texture -- 10.3 Biometric Systems -- 10.3.1 Fingerprint Recognition -- 10.3.2 Face Recognition -- Bibliography -- Index.
520 _aThe use of pattern recognition and classification is fundamental to many of the automated electronic systems in use today. However, despite the existence of a number of notable books in the field, the subject remains very challenging, especially for the beginner. Pattern Recognition and Classification presents a comprehensive introduction to the core concepts involved in automated pattern recognition. It is designed to be accessible to newcomers from varied backgrounds, but it will also be useful to researchers and professionals in image and signal processing and analysis, and in computer vision. Fundamental concepts of supervised and unsupervised classification are presented in an informal rather than axiomatic treatment, so that the reader can quickly acquire the background needed to apply the concepts to real problems. More advanced topics, such as estimating classifier performance and combining classifiers, as well as details of particular project applications, are addressed in the later chapters. This book is suitable for undergraduates and graduates studying pattern recognition and machine learning.
650 0 _aComputer science.
650 0 _aOptical pattern recognition.
650 0 _aBiology
_xData processing.
650 0 _aAlgorithms.
650 1 4 _aComputer Science.
650 2 4 _aPattern Recognition.
650 2 4 _aNonlinear Dynamics.
650 2 4 _aSignal, Image and Speech Processing.
650 2 4 _aComputer Appl. in Life Sciences.
650 2 4 _aAlgorithms.
710 2 _aSpringerLink (Online service)
773 0 _tSpringer eBooks
776 0 8 _iPrinted edition:
_z9781461453222
856 4 0 _uhttp://dx.doi.org/10.1007/978-1-4614-5323-9
912 _aZDB-2-SCS
999 _c95379
_d95379