Peculiar Properties of Locally Linear Embedding -- Toward Theoretical Understanding of Unsupervised Learning

Monday, March 4, 2019 - 1:25pm - 2:25pm
Lind 305
Hau-tieng Wu (Duke University)
Since its introduction in 2000, locally linear embedding (LLE) has been widely applied as an unsupervised learning tool. However, before 2018 only a few hand-waving arguments were available to explain what it is actually doing. For the sake of scientific soundness, we provide a systematic analysis of LLE, particularly under the manifold setup. In this talk, several theoretical results will be discussed. (1) We derive the corresponding kernel function, which in general is asymmetric and does not define a Markov process. (2) The regularization is critical: different regularizations lead to dramatically different results, and if chosen correctly we asymptotically obtain the Laplace-Beltrami operator even under nonuniform sampling. (3) LLE has an intimate relationship with local covariance analysis and the tangent bundle structure. (4) When the boundary is not empty, we run into an interesting mixed-type differential equation. (5) An ingredient of the kernel provides a novel way to detect the boundary, and hence a new approach to derive the Laplace-Beltrami operator with the Dirichlet boundary condition. If time permits, the relationship with several statistical topics, such as locally linear regression and errors-in-variables models, will be discussed. This is joint work with Nan Wu.
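
The following is a minimal sketch, not part of the talk, illustrating point (2): the regularization constant used when solving the local reconstruction weights can change the resulting embedding dramatically. It uses scikit-learn's LocallyLinearEmbedding and its reg parameter; the dataset, neighborhood size, and reg values are illustrative assumptions, not the speaker's choices.

```python
# Sketch (assumed setup): vary the LLE regularization constant and observe
# how strongly the embedding quality depends on it.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Sample points from a 2-D manifold embedded in R^3.
X, _ = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)

# reg is the constant added to the local covariance (Gram) matrix before
# solving for the barycentric reconstruction weights.
for reg in (1e-9, 1e-3, 1e-1):
    lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                                 reg=reg, random_state=0)
    Y = lle.fit_transform(X)
    print(f"reg={reg:g}  reconstruction error={lle.reconstruction_error_:.3e}")
```

Comparing the three embeddings (e.g., by plotting Y for each reg) shows qualitatively different results, consistent with the claim that the choice of regularization is critical.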