Learning and its Foundation

Logic will get you from A to B. Imagination will take you everywhere.

Albert Einstein

Welcome to the foundations of learning, a vast and challenging field that includes the following two areas I work on:

  • Foundations for modeling and analysis of non-Euclidean data;
  • Foundations of machine learning.

Research in these areas frequently interacts with different branches of mathematics, statistics, computer science, and the domain sciences to which they are applied, and thus it employs a vast array of tools from mathematics, statistics, and probability, including real/complex/functional/harmonic analysis, measure/probability theory, (stochastic) differential/integral equations, differential geometry, and algebraic/differential topology.

Most of my research is interdisciplinary. While sustaining interdisciplinary research can be more challenging, the rewards can be substantial, in the form of expert knowledge on topics from different fields. I welcome you to join these interdisciplinary research endeavors if you are interested in working with me.

Here is a roadmap of my website:

  • For courses I teach, please see my Teaching;
  • For research projects I have completed but that may still have extensions, please see my Remaining Research;
  • For current research projects, please see my Current Research;
  • To learn who we are and with whom I collaborate, please see Collaborators;
  • For news about my research or collaborators, please see News.

I have some suggestions on the background you should have if you would like to work with me; please see the items below, where “Default” is the minimum requirement. Research in one of the two areas mentioned at the beginning, or in a topic listed on my Current Research, often requires carefully consulting some of the “Core” books and chapters of some “Supplementary” books given below. Techniques in the “Core” books are research specific; if there are courses in the Department that teach them, you are recommended to take those courses. Otherwise, these techniques are learned through a combination of self-study by you and guided study by me: I recommend to you what to study and work on, you try your best, and I then help you if you still have difficulties. Techniques in the “Supplementary” books are sometimes needed to better understand the “Core” books and, when needed, are learned in the same way.

  • Default: Skills required by the Statistics GQE or Math GQE. In particular, good skills in calculus, linear algebra, and probability are required, at least at the levels of the courses Math 420 “Linear Algebra”, Math 402 “Introduction to Analysis II”, and Stat 443 “Applied Probability”. You should be proficient in R or Python programming, so that you can numerically implement or test your ideas, methods, or even conjectures. Familiarity with the basics of ordinary and partial differential equations, such as how to solve typical such equations, is also expected.
  • Core for research on analysis of non-Euclidean data, deep learning, and theoretical statistics:
    • For research on analysis of non-Euclidean data: “Introduction to Riemannian Manifolds” by John M. Lee, and “Nonparametric Statistics on Manifolds and Their Applications to Object Data Analysis” by Victor Patrangenaru and Leif Ellingson.
    • For research in theoretical statistics: “Asymptotic Statistics” by Aad van der Vaart, and “Real Analysis and Probability” by Robert B. Ash (or “Real Analysis and Probability” by R. M. Dudley).
    • For research on deep learning: “Mathematical Aspects of Deep Learning” by Philipp Grohs and Gitta Kutyniok, “Algebraic Topology” by Allen Hatcher, and almost all of the above, since this research will investigate geometric, topological, and statistical properties of deep learning.
  • Supplementary for theoretical statistics (up to semi-parametric statistics): “Mathematical Statistics: Basic Ideas and Selected Topics, Volume I” by Peter J. Bickel and Kjell A. Doksum, and “An Introduction to Multivariate Statistical Analysis” by T. W. Anderson.
  • Supplementary for probability and non-parametric statistics: “Convergence of Probability Measures” by Patrick Billingsley, “Concentration Inequalities: A Nonasymptotic Theory of Independence” by Stéphane Boucheron, Gábor Lugosi, and Pascal Massart, “Probability on Compact Lie Groups” by David Applebaum, and “Mathematical Foundations of Infinite-Dimensional Statistical Models” by Evarist Giné and Richard Nickl.
  • Supplementary for analysis, topology, and geometry: “A Course in Functional Analysis” by John B. Conway, “Complex Analysis” by Lars Ahlfors, “Abstract Harmonic Analysis: Volumes I and II” by Edwin Hewitt and Kenneth A. Ross, “Introduction to Smooth Manifolds” by John M. Lee, and “General Topology” by John L. Kelley.

I am building an interdisciplinary research team. If you are interested in working with me on a project of my expertise, please contact me at xiongzhi.chen@wsu.edu to schedule a meeting.

Acknowledgements
  • I gratefully acknowledge funding provided by the following organizations: the Simons Foundation, the NIH, WSU, and the State of Washington.
  • I am very grateful to students and colleagues in my department for their help. Regarding research, I am particularly grateful to Gérard Letac (Université de Toulouse), Sanat K. Sarkar (Temple U), Armin Schwartzman (UCSD), Donald St. P. Richards (Penn State U), Hong-ming Yin (WSU), Yimin Xiao (Michigan State U), Persi Diaconis (Stanford U), and Wayne Smith (U of Hawaii) for their advice and help.