Nearest-Neighbor Methods in Learning and Vision: Theory and Practice / edited by Gregory Shakhnarovich, Trevor Darrell, and Piotr Indyk. Cambridge, Mass.: MIT Press, ©2005. 1 online resource (280 p.). (Neural Information Processing series)

ISBN: 0-262-19547-X (print); 0-262-25695-9 (electronic); 1-282-09675-3; 9786612096754; 1-4237-7253-9.

Note: " ... held in Whistler, British Columbia ... annual conference on Neural Information Processing Systems (NIPS) in December 2003"--Preface.

Includes bibliographical references and index.

Contents: Series Foreword; Preface; 1. Introduction; I. Theory: 2. Nearest-Neighbor Searching and Metric Space Dimensions; 3. Locality-Sensitive Hashing Using Stable Distributions; II. Applications: Learning: 4. New Algorithms for Efficient High-Dimensional Nonparametric Classification; 5. Approximate Nearest Neighbor Regression in Very High Dimensions; 6. Learning Embeddings for Fast Approximate Nearest Neighbor Retrieval; III. Applications: Vision: 7. Parameter-Sensitive Hashing for Fast Pose Estimation; 8. Contour Matching Using Approximate Earth Mover's Distance; 9. Adaptive Mean Shift Based Clustering in High Dimensions; 10. Object Recognition Using Locality Sensitive Hashing of Shape Contexts; Contributors; Index.

Summary: Regression and classification methods based on similarity of the input to stored examples have not been widely used in applications involving very large sets of high-dimensional data. Recent advances in computational geometry and machine learning, however, may alleviate the problems in using these methods on large data sets. This volume presents theoretical and practical discussions of nearest-neighbor (NN) methods in machine learning and examines computer vision as an application domain in which the benefit of these advanced methods is often dramatic. It brings together contributions from researchers in the theory of computation, machine learning, and computer vision, with the goals of bridging the gaps between disciplines and presenting state-of-the-art methods for emerging applications. The contributors focus on the importance of designing algorithms for NN search, and for the related classification, regression, and retrieval tasks, that remain efficient even as the number of points or the dimensionality of the data grows very large. The book begins with two theoretical chapters on computational geometry and then explores ways to make the NN approach practicable in machine learning applications where the dimensionality of the data and the size of the data sets make naive methods for NN search prohibitively expensive. The final chapters describe successful applications of an NN algorithm, locality-sensitive hashing (LSH), to vision tasks.

Subjects: Nearest neighbor analysis (Statistics) -- Congresses; Machine learning -- Congresses; Algorithms -- Congresses; Geometry -- Data processing -- Congresses; COMPUTER SCIENCE / Machine Learning & Neural Networks.

Dewey class no.: 006.3/1
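The summary names locality-sensitive hashing (LSH) as the NN algorithm applied throughout the book's vision chapters. As a rough illustration of the idea only (a minimal random-hyperplane sketch for cosine similarity, not the stable-distribution scheme of chapter 3; all names and parameters here are illustrative, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hash(dim, n_bits):
    """Random-hyperplane LSH: vectors with high cosine similarity
    tend to fall on the same side of each random hyperplane, so
    they tend to receive the same bit string."""
    planes = rng.normal(size=(n_bits, dim))
    def h(x):
        return tuple(int(b) for b in (planes @ x > 0))
    return h

dim, n_bits = 16, 8
h = make_hash(dim, n_bits)
data = rng.normal(size=(100, dim))

# Index: bucket every point by its hash code.
buckets = {}
for i, x in enumerate(data):
    buckets.setdefault(h(x), []).append(i)

# Query: scan only the candidates sharing the query's bucket,
# instead of all 100 points. The search is approximate: a true
# neighbor near a hyperplane can land in a different bucket.
query = data[42] + 0.001 * rng.normal(size=dim)
candidates = buckets.get(h(query), [])
```

In practice, several independent hash tables are queried and their candidate lists merged, trading memory for a lower miss probability.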