Tutoring Experience

I am currently a tutor in the School of Mathematics and Statistics and the School of Computing and Information Systems at the University of Melbourne.

  • COMP30026 Models of Computation (2023 Sem. 2, Tutor) A third-year computer science subject. Topics include logic, automata theory, computability, and discrete mathematics. Uses the Haskell programming language.
  • MAST20029 Engineering Mathematics (2023 Sem. 2, Tutor) A second-year subject covering a variety of topics in vector calculus and differential equations.
  • MAST10005 Calculus 1 (2023 Sem. 2, Tutor) An introductory course in calculus.
  • MAST10007 Linear Algebra (2023 Sem. 2, Tutor) An introductory course in linear algebra.
  • COMP30024 Artificial Intelligence (2023 Sem. 1, Tutor) A third-year computer science subject. Topics include search, game playing, auction design, and constraint satisfaction problems. Uses the Python programming language.
  • MAST10005 Calculus 1 (2023 Sem. 1, Tutor)
  • COMP30026 Models of Computation (2022 Sem. 2, Tutor)
  • MAST10007 Linear Algebra (2022 Sem. 2, Tutor)
  • COMP30024 Artificial Intelligence (2022 Sem. 1, Tutor)
  • MAST10007 Linear Algebra (2022 Sem. 1, Tutor)
  • COMP30026 Models of Computation (2021 Sem. 2, Tutor)
  • MAST10007 Linear Algebra (2021 Sem. 2, Tutor)
  • COMP30024 Artificial Intelligence (2021 Sem. 1, Tutor)
  • MAST10005 Calculus 1 (2021 Sem. 1, Tutor)
  • COMP30026 Models of Computation (2020 Sem. 1, Tutor)
  • COMP10001 Foundations of Computing (2018 Sem. 1, Demonstrator) An introductory course in programming, taught in Python.

Seminars

This is a list of seminars I have organised. These seminars were designed for first- and second-year mathematics students. My responsibilities included designing the seminar curricula, mentoring students preparing for their talks, and general seminar administration.

Neural Networks (2021 Sem. 1)

Co-organised with Nora Ganter.

Topics:

  • Introduction: What is machine learning? Logistic regression.
  • Defining neural networks: What is a neural network? Explain the need for activation functions and state approximation results.
  • Training neural networks: Discuss gradient descent and backpropagation (a small gradient-descent example is sketched after this list).
  • GANs part 1: Discuss the standard GAN formulation.
  • GANs part 2: Discuss the Wasserstein GAN, as introduced in Arjovsky, Chintala, and Bottou, “Wasserstein GAN”, arXiv:1701.07875.
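
To give a concrete taste of the first three sessions, here is a minimal sketch of logistic regression trained by gradient descent, in plain Python with NumPy. It is an illustration written for this page rather than actual seminar material, and the toy data and hyperparameters are made up.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: two Gaussian blobs with labels 0 and 1 (made up for illustration).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (50, 2)), rng.normal(1.0, 1.0, (50, 2))])
y = np.concatenate([np.zeros(50), np.ones(50)])

w, b = np.zeros(2), 0.0
lr = 0.1  # learning rate (arbitrary choice)

for _ in range(500):
    p = sigmoid(X @ w + b)            # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)   # gradient of the mean cross-entropy loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                  # gradient-descent step
    b -= lr * grad_b

print(f"training accuracy: {np.mean((sigmoid(X @ w + b) > 0.5) == y):.2f}")
```

The same loss-and-update pattern carries over to neural networks, where backpropagation computes the gradient layer by layer.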

Game Theory (2020 Sem. 2)

Co-organised with Nora Ganter, Jonah Nelson, Kshitija Vaidya, and Chengjing Zhang.

Topics:

  • Introduction to games
  • Nash equilibrium
  • Compactness in \(\mathbb{R}^n\)
  • Kakutani’s Theorem
  • Zero-sum games
  • Minimax Theorem (illustrated numerically in the sketch after this list)
  • Solution concepts
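
As a small illustration of the zero-sum sessions, the sketch below approximates the value of Matching Pennies by maximising the row player's worst-case expected payoff over a grid of mixed strategies. Checking only the column player's pure responses suffices, since a bilinear objective is minimised at a pure strategy. This example was written for this page and was not part of the seminar.

```python
import numpy as np

# Matching Pennies: the row player's payoff matrix (a standard textbook game).
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

best_value, best_p = -np.inf, 0.0
for p in np.linspace(0.0, 1.0, 1001):     # row strategy (p, 1 - p)
    payoffs = np.array([p, 1.0 - p]) @ A  # expected payoff vs each column
    worst = payoffs.min()                 # column player's best response
    if worst > best_value:
        best_value, best_p = worst, p

print(f"maximin strategy p = {best_p:.2f}, game value = {best_value:.3f}")
# Prints p = 0.50 and value = 0.000, as the Minimax Theorem guarantees here.
```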

Texts:

  • Fudenberg, Tirole (1991) Game theory. Cambridge, Massachusetts: MIT Press.
  • Leyton-Brown, Shoham (2008) Essentials of game theory: a concise, multidisciplinary introduction. San Rafael, California: Morgan & Claypool Publishers.

Foundations of Mathematical Cryptography (2020 Sem. 1)

Co-organised with Majid Alamudi, Nora Ganter, Adam Walsh, Chengjing Zhang, and Gufang Zhao.

Topics:

  • Introduction to proofs
  • Binomial coefficients
  • Modular arithmetic, Euclidean algorithm, Bézout’s Lemma
  • Prime factorisation
  • RSA (a toy worked example follows this list)
  • Groups and cosets
  • Discrete logarithm part 1
  • Discrete logarithm part 2
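
To show how these topics fit together, here is a toy RSA example built on the extended Euclidean algorithm, whose Bézout coefficients give the modular inverse used for the private key. The primes and message are tiny and made up; this is an illustrative sketch, not the seminar notes, and real RSA uses far larger primes.

```python
def extended_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) and a*x + b*y = g (Bezout's Lemma)."""
    if b == 0:
        return a, 1, 0
    g, x, y = extended_gcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Inverse of a modulo m; exists exactly when gcd(a, m) = 1."""
    g, x, _ = extended_gcd(a, m)
    if g != 1:
        raise ValueError("a and m are not coprime, so no inverse exists")
    return x % m

# Toy RSA with tiny made-up primes (insecure; for illustration only).
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime to phi
d = modinv(e, phi)         # private exponent: d * e = 1 (mod phi)

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n
assert recovered == message
print(f"n={n}, e={e}, d={d}, ciphertext={ciphertext}, recovered={recovered}")
```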