Nearest-Neighbor Methods in Learning and Vision: Theory and Practice
Edited by Gregory Shakhnarovich, Trevor Darrell, and Piotr Indyk
1st ed. Cambridge, Mass.: MIT Press, c2005. 1 online resource (280 p.)
Series: Neural Information Processing series
ISBN: 0-262-19547-X (print); 0-262-25695-9; 1-4237-7253-9; 1-282-09675-3 (9786612096754)
OCLC: 68907209
Note: "... held in Whistler, British Columbia ... annual conference on Neural Information Processing Systems (NIPS) in December 2003"--Pref.
Includes bibliographical references and index.

Contents:
Series Foreword; Preface
1. Introduction
I. THEORY
2. Nearest-Neighbor Searching and Metric Space Dimensions
3. Locality-Sensitive Hashing Using Stable Distributions
II. APPLICATIONS: LEARNING
4. New Algorithms for Efficient High-Dimensional Nonparametric Classification
5. Approximate Nearest Neighbor Regression in Very High Dimensions
6. Learning Embeddings for Fast Approximate Nearest Neighbor Retrieval
III. APPLICATIONS: VISION
7. Parameter-Sensitive Hashing for Fast Pose Estimation
8. Contour Matching Using Approximate Earth Mover's Distance
9. Adaptive Mean Shift Based Clustering in High Dimensions
10. Object Recognition using Locality Sensitive Hashing of Shape Contexts
Contributors; Index

Summary: Regression and classification methods based on similarity of the input to stored examples have not been widely used in applications involving very large sets of high-dimensional data. Recent advances in computational geometry and machine learning, however, may alleviate the problems of applying these methods to large data sets. This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic. It brings together contributions from researchers in the theory of computation, machine learning, and computer vision, with the goals of bridging the gaps between disciplines and presenting state-of-the-art methods for emerging applications. The contributors focus on the importance of designing algorithms for NN search, and for the related classification, regression, and retrieval tasks, that remain efficient even as the number of points or the dimensionality of the data grows very large. The book begins with two theoretical chapters on computational geometry and then explores ways to make the NN approach practicable in machine learning applications where the dimensionality of the data and the size of the data sets make naive methods for NN search prohibitively expensive. The final chapters describe successful applications of an NN algorithm, locality-sensitive hashing (LSH), to vision tasks.

Subjects: Algorithms--Congresses; Geometry--Data processing--Congresses; Machine learning--Congresses; Nearest neighbor analysis (Statistics)--Congresses
Dewey: 006.3/1
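The central technique named in the summary, locality-sensitive hashing for approximate nearest-neighbor search, can be illustrated with a minimal sketch. The variant below uses random-hyperplane (cosine) hashes rather than the stable-distribution hashes discussed in the book; the class name `CosineLSH` and all parameter choices are illustrative assumptions, not taken from any chapter.

```python
# Minimal sketch of LSH for approximate nearest-neighbor search using
# random-hyperplane (cosine) hashes. Illustrative only; not the book's
# stable-distribution construction.
import random
from collections import defaultdict

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

class CosineLSH:
    def __init__(self, dim, n_bits=8, n_tables=4, seed=0):
        rng = random.Random(seed)
        # Each table draws n_bits random hyperplanes; a point's bucket key
        # is the sign pattern of its projections onto those hyperplanes.
        self.planes = []
        self.tables = []
        for _ in range(n_tables):
            planes = [[rng.gauss(0, 1) for _ in range(dim)]
                      for _ in range(n_bits)]
            self.planes.append(planes)
            self.tables.append(defaultdict(list))

    def _key(self, planes, x):
        return tuple(dot(p, x) >= 0 for p in planes)

    def add(self, x, label):
        for planes, table in zip(self.planes, self.tables):
            table[self._key(planes, x)].append((x, label))

    def query(self, x):
        # Gather candidates from matching buckets only, then rank them
        # exactly by cosine similarity -- the point of LSH is that only
        # bucket members are scored, not the whole data set.
        candidates = {}
        for planes, table in zip(self.planes, self.tables):
            for y, label in table[self._key(planes, x)]:
                candidates[label] = y
        def cos(u, v):
            norm = (dot(u, u) ** 0.5) * (dot(v, v) ** 0.5) + 1e-12
            return dot(u, v) / norm
        return max(candidates.items(),
                   key=lambda kv: cos(kv[1], x), default=None)

index = CosineLSH(dim=2)
index.add([1.0, 0.0], "a")
index.add([0.0, 1.0], "b")
result = index.query([1.0, 0.0])
```

Nearby points agree on most hyperplane signs, so they tend to share buckets; multiple tables trade memory for a higher chance that a true near neighbor collides with the query in at least one of them.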