My name is Rohan Hitchcock and I am currently a PhD candidate at the University of Melbourne and CSIRO. I am interested in understanding how deep learning works, with a view towards improving the interpretability and safety of large, complex and highly capable models. I see singular learning theory and developmental interpretability as promising directions. My PhD focuses on the mathematical theory behind singular learning theory and developmental interpretability (for example, improving and better understanding local learning coefficient estimation) and, in collaboration with CSIRO, on applying these interpretability methods to neural networks trained to simulate physical systems.
Prior to my PhD I completed a Master's thesis in mathematics at the University of Melbourne, in algebraic geometry and homological algebra. My thesis investigated aspects of the bicategory of Landau-Ginzburg models, a mathematical structure which describes relationships between different algebraic singularities. I focused on methods for explicitly computing these relationships (morphisms), rather than merely proving that such relationships exist, and contributed algorithms for doing so. These algorithms led to a characterisation of conditions under which an important construction can be carried out more simply.
I currently work as a tutor (teaching assistant) in the School of Mathematics and Statistics and the School of Computing and Information Systems at the University of Melbourne, where I have taught undergraduate classes on artificial intelligence, theoretical computer science, linear algebra and vector calculus. You can see more about my teaching experience here.
Publications
- (Coming soon) R. Hitchcock et al., “Emergence of computational structure in a neural network physics simulator”
- (2022) R. Hitchcock, “Differentiation, Division and the Bicategory of Landau-Ginzburg Models”, MSc Thesis (thesis, code, talk). Supervised by Daniel Murfet.
- (2020) Kato et al., “Display of Native Antigen on cDC1 That Have Spatial Access to Both T and B Cells Underlies Efficient Humoral Vaccination” (paper)
Talks and notes
- On the convergence of SGLD (notes) Singular Learning Theory seminar 28/11/2024
- The Łojasiewicz exponent (handwritten notes) Singular Learning Theory seminar 1/8/2024
- On Shavit (2023) “What does it take to catch a Chinchilla?” (notes) AI safety reading group
- Induction heads and phase transitions (slides, video) SLT & Alignment Summit 2023
- Induction heads (notes, video) Singular Learning Theory seminar, 6/4/2023
Statistical mechanics seminar (page)
- What is statistical mechanics? Hamiltonian systems and Liouville’s theorem. (notes, video).
- Entropy and the Boltzmann distribution. (notes, video).
- Stochastic processes (notes, video).
- Stochastic differential equations (notes)
- Discussing Seung, Sompolinsky, Tishby (1992) “Statistical Mechanics of Learning from Examples” (notes).
Landau-Ginzburg seminar (page)
- An introduction to bicategories (notes, video).
- Matrix factorisations and geometry (notes, video).
- The Perturbation Lemma (notes, video1, video2).
- Composition in the bicategory of Landau-Ginzburg models (notes, video).
- The cut operation (notes, video1, video2).
- Differentiation and division (notes, video1, video2).
- The cut operation revisited (notes, video).
- Introduction to idempotent completion in preadditive categories (notes).
- Introduction to matrix factorisations (notes).
Miscellaneous
- Blow-ups and singular learning theory (notes).
- A ‘Stone-Weierstrass’ theorem for neural networks (notes).
- R. Hitchcock (2020) “Alternative construction of the groups of type \(E_6\)” (report) AMSI Vacation Research Scholarship, supervised by John Bamberg and Michael Giudici.
- Code for tracking T-cells and mapping blood vessels in the liver (code). This was produced between 2018 and 2020 during my time as a research assistant at the Peter Doherty Institute, where I was supervised by Lynette Beattie, Jonathan Manton and William Heath.