DETAILED COURSE CONTENTS

PATTERN RECOGNITION

1. Introduction to Pattern Recognition via Character Recognition
    1.1  Transducers
    1.2  Preprocessing
    1.3  Feature extraction (feature-space representation)
    1.4  Classification (decision regions)
    1.5  Template matching (affine transformations)
    1.6  Grids (square, triangular, hexagonal)
    1.7  Connectivity
    1.8  Contour tracing (square & Moore neighborhood tracing)
    1.9  M.I.T. reading machine for the blind
    1.10 Hysteresis smoothing (digital filtering)
    1.11 Types of input to pattern recognition programs

2. Spatial Smoothing
    2.1  Regularization
    2.2  Logical smoothing (salt-and-pepper noise)
    2.3  Local averaging
    2.4  Median filtering
    2.5  Polygonal approximation

3. Spatial Differentiation
    3.1  Sobel operator
    3.2  Roberts cross operator
    3.3  Laplacian
    3.4  Unsharp masking

4. Spatial Moments
    4.1  Moments of distributions
    4.2  Moments of area & perimeter
    4.3  Moments for feature extraction
    4.4  Moments for pre-processing
    4.5  Moments as predictors of discrimination performance

5. Medial Axis Transformations
    5.1  Distance between sets
    5.2  Medial axis
    5.3  Skeletonization
    5.4  Hilditch's algorithm
    5.5  Rosenfeld's algorithm
    5.6  Minkowski metrics
    5.7  Distance transforms
    5.8  Skeleton clean-up via distance transforms
    5.9  Medial axes via distance transforms

6. Topological Feature Extraction
    6.1  Convex hulls, concavities and enclosures

7. Processing Line Drawings
    7.1  Square, circular, and grid-intersect quantization
    7.2  Probability of obtaining diagonal elements
    7.3  Geometric probability (Bertrand's paradox)
    7.4  Difference encoding & chain correlation functions
    7.5  Minkowski metric quantization

8. Detection of Structure in Noisy Pictures and Dot Patterns
    8.1  Point-to-curve transformations (Hough transform)
    8.2  Line and circle detection
    8.3  Hypothesis testing approach
    8.4  Maximum-entropy quantization
    8.5  Proximity graphs and perception
    8.6  Triangulations and Voronoi diagrams
    8.7  The shape of a set of points
    8.8  Alpha hulls & beta skeletons

9. Neural Networks and Bayesian Decision Theory
    9.1  Formal neurons, linear machines & perceptrons
    9.2  Continuous and discrete measurements
    9.3  Minimum risk classification
    9.4  Minimum error classification
    9.5  Discriminant functions
    9.6  The multivariate Gaussian probability density function
    9.7  Mahalanobis distance classifiers
    9.8  Parametric decision rules
    9.9  Independence and the discrete case

10. Independence of Measurements, Redundancy, and Synergism
    10.1 Conditional and unconditional independence
    10.2 Dependence and correlation
    10.3 The best k measurements are not the k best
    10.4 Feature evaluation criteria
    10.5 Feature selection methods

11. Neural Networks and Non-parametric Learning
    11.1 Non-parametric training of linear machines
    11.2 Error-correction procedures
    11.3 The fundamental learning theorem
    11.4 Multi-layer networks

12. Estimation of Parameters and Classifier Performance
    12.1 Properties of estimators
    12.2 Dimensionality and sample size
    12.3 Estimation of the probability of misclassification

13. Nearest Neighbor Decision Rules
    13.1 The k-nearest neighbor rule
    13.2 Efficient search methods for nearest neighbors
    13.3 Decreasing space requirements
    13.4 Error bounds

14. Using Contextual Information in Pattern Recognition
    14.1 Markov methods
    14.2 The Viterbi algorithm
    14.3 Combined bottom-up and top-down algorithms

15. Cluster Analysis and Unsupervised Learning
    15.1 Decision-directed learning
    15.2 Graph-theoretic methods
    15.3 Agglomerative and divisive methods
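Several of the topics above reduce to short, self-contained algorithms. The sketches that follow illustrate a few of them in plain Python (standard library only); they are minimal illustrations under stated assumptions, not course-supplied code, and all numeric data are invented toy values. Images are assumed to be grayscale pictures stored as lists of lists of integers. The first sketch is median filtering (topic 2.4), which removes salt-and-pepper noise (topic 2.2) while preserving edges better than local averaging:

# Minimal 3x3 median filter for a grayscale image stored as a list of lists.
# Border pixels are left unchanged; each interior pixel is replaced by the
# median of its 3x3 neighborhood.

def median_filter_3x3(image):
    rows, cols = len(image), len(image[0])
    out = [row[:] for row in image]          # copy; borders stay as-is
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            window = [image[r + dr][c + dc]
                      for dr in (-1, 0, 1) for dc in (-1, 0, 1)]
            window.sort()
            out[r][c] = window[4]            # median of 9 values
    return out

if __name__ == "__main__":
    # Toy image with two salt-and-pepper outliers (255 and 0).
    noisy = [
        [10, 10, 10, 10, 10],
        [10, 255, 10, 0, 10],
        [10, 10, 10, 10, 10],
        [10, 0, 10, 255, 10],
        [10, 10, 10, 10, 10],
    ]
    for row in median_filter_3x3(noisy):
        print(row)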
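Spatial differentiation (section 3) can be sketched the same way. The Sobel operator (topic 3.1) convolves the image with a pair of 3x3 masks and combines the two responses into a gradient magnitude that is large along edges:

import math

# Sobel gradient magnitude for a grayscale image (list of lists).
# GX responds to vertical edges, GY to horizontal edges; the magnitude
# sqrt(gx^2 + gy^2) is large wherever intensity changes sharply.

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(image):
    rows, cols = len(image), len(image[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = gy = 0
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    pixel = image[r + dr][c + dc]
                    gx += GX[dr + 1][dc + 1] * pixel
                    gy += GY[dr + 1][dc + 1] * pixel
            out[r][c] = math.sqrt(gx * gx + gy * gy)
    return out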
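On the decision-theoretic side, the k-nearest neighbor rule (topic 13.1) classifies an unknown feature vector by a majority vote among its k closest training samples. The brute-force Euclidean version below is the simplest case, before the efficient search methods of topic 13.2; the training pairs are toy data:

import math
from collections import Counter

# k-nearest neighbor decision rule with a brute-force Euclidean search.
# train is a list of (feature_vector, label) pairs; query is a feature vector.

def knn_classify(train, query, k=3):
    dists = sorted((math.dist(x, query), label) for x, label in train)
    top_labels = [label for _, label in dists[:k]]
    return Counter(top_labels).most_common(1)[0][0]

if __name__ == "__main__":
    training = [((0.0, 0.0), "A"), ((0.2, 0.1), "A"),
                ((1.0, 1.0), "B"), ((0.9, 1.2), "B")]
    print(knn_classify(training, (0.1, 0.2), k=3))   # majority of 3 nearest -> "A"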
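Finally, the Viterbi algorithm (topic 14.2) shows how contextual Markov information (topic 14.1) is combined with per-observation evidence to recover the most probable sequence of hidden states. The states, transition, and emission probabilities in the example are toy values:

# Viterbi algorithm for a discrete hidden Markov model.
# best[t][s] is the probability of the most likely state sequence that ends
# in state s at time t; back[t][s] records the predecessor used to reach it.

def viterbi(observations, states, start_p, trans_p, emit_p):
    best = [{s: start_p[s] * emit_p[s][observations[0]] for s in states}]
    back = [{}]
    for t in range(1, len(observations)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][p] * trans_p[p][s] * emit_p[s][observations[t]], p)
                for p in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # Backtrack from the most probable final state.
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(observations) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

if __name__ == "__main__":
    states = ("Rainy", "Sunny")
    obs = ("walk", "shop", "clean")
    start = {"Rainy": 0.6, "Sunny": 0.4}
    trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
             "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
    emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
            "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
    print(viterbi(obs, states, start, trans, emit))  # ['Sunny', 'Rainy', 'Rainy']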