Get more information regarding graduate Math courses offered each semester

Graduate Course Descriptions

Fall 2025

M 392C - Riemann Surfaces (Perutz)

Riemann surfaces, complex manifolds of dimension one, occupy a central place in differential and algebraic geometry and in complex function theory. It is a setting where one can prove deep theorems, using methods from the analysis of linear PDE, from sheaf theory, and from algebraic geometry, that extend to higher dimensions, but only at the cost of far more foundational effort and less transparency; hence it is an excellent setting in which to build intuition. It is also an area distinguished by beautiful examples. One such is Klein's quartic curve, a compact Riemann surface described by a simple algebraic equation, whose symmetry group, the simple group of order 168, is the largest of any genus 3 Riemann surface; this Riemann surface can alternatively be understood as a modular curve, the home for modular forms of level 7, and can be pictured as a certain tiling of the hyperbolic disc.
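For reference, the "simple algebraic equation" mentioned above can be made explicit: Klein's quartic is the smooth plane curve

```latex
x^3 y + y^3 z + z^3 x = 0 \qquad \text{in } \mathbb{P}^2(\mathbb{C}),
```

whose automorphism group is the simple group $\mathrm{PSL}(2,\mathbb{F}_7)$ of order 168, attaining Hurwitz's bound $|\mathrm{Aut}| \le 84(g-1)$ for genus $g = 3$.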


This course will start with the basic theory and with many concrete examples and constructions (algebraic curves, branched covers, elliptic curves as tori and as cubic curves, modular forms, etc.). It will also cover some of the main theorems: existence of non-trivial meromorphic functions on compact Riemann surfaces, the Riemann-Roch theorem, projective embeddings of compact Riemann surfaces, the Hodge decomposition and Serre duality, and uniformization. These results can all be proven by the same potential-theoretic approach, involving the Laplace and Poisson equations, giving a taste of the methods of geometric analysis. The sheaf-theoretic approach, and more advanced topics, will be covered as time allows. The main text will be Riemann Surfaces by S.K. Donaldson, though I will also draw on more algebraic sources.
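Among the theorems listed, the Riemann-Roch theorem admits a compact statement: for a divisor $D$ on a compact Riemann surface $X$ of genus $g$, with canonical divisor $K$,

```latex
\ell(D) - \ell(K - D) = \deg D + 1 - g,
```

where $\ell(D)$ denotes the dimension of the space of meromorphic functions $f$ on $X$ with $\operatorname{div}(f) + D \ge 0$.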


Regular student presentations and problem classes will be built in.


Prerequisites: complex analysis (as in the prelim) and basic topology up to the fundamental group are essential. Familiarity with the Algebraic/Differential Topology prelims (e.g., homology, differential forms) will be very useful. Some exposure to functional analysis, as in the Methods of Applied Math prelim sequence, is also helpful.

M 393C - Methods of Mathematical Physics (Chen)

The purpose of this graduate course is to provide an introduction to spectral methods and renormalization group (RG) methods in Quantum Mechanics and Quantum Field Theory, with connections and applications to neighboring research areas. Specific topics tentatively include stability of matter, RG in fluid equations, and deep learning. No background in physics is required, but some preparation in Analysis/PDE is useful. Familiarity with the material of Part I of this course (taught in F24) is useful but not necessary.
M 393C - Partial Differential Equations I (Patrizi)

Info coming soon.

M 393C - Predictive Machine Learning (Bajaj)

The Fall course this year covers the design and performance analysis of optimally controlled (i.e., reinforcement-learned) statistical machine learning algorithms, trained, verified, and validated on filtered, noisy observation-data distributions collected from various multi-scale dynamical systems. The principal performance metrics concern online and energy-efficient training, verification, and validation protocols that achieve principled and stable learning for maximal generalizability. The emphasis will be on possibly corrupted data and/or the lack of full information for the learned stochastic decision-making dynamic algorithmic process. Special emphasis will also be given to the underlying mathematical and statistical-physics principles of free energy and stochastic Hamiltonian flow dynamics. Students will thus be exposed to the latest stochastic machine learning modeling approaches for optimized decision-making, multi-player games involving stochastic dynamical systems, and optimal stochastic control. These latter topics are foundational to training multiple neural networks (agents), both cooperatively and in adversarial scenarios, to optimize the learning process of all the agents.
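The stochastic decision-making theme above can be illustrated, very loosely, by a minimal tabular Q-learning sketch on a toy problem with noisy dynamics. Everything here (the chain environment, the constants, the helper `step`) is an illustrative assumption, not material from the course:

```python
import random

random.seed(0)

# Hypothetical toy problem (not from the course): a chain of 5 states.
# The agent moves left (0) or right (1); reaching state 4 yields reward 1.
# Dynamics are noisy: with probability SLIP the chosen action is flipped.
N_STATES, ACTIONS = 5, (0, 1)
SLIP = 0.1                    # action-flip probability (the "noise")
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(s, a):
    """One noisy transition of the toy chain."""
    if random.random() < SLIP:
        a = 1 - a
    s2 = min(s + 1, N_STATES - 1) if a == 1 else max(s - 1, 0)
    r = 1.0 if s2 == N_STATES - 1 else 0.0
    return s2, r

for _ in range(2000):         # training episodes
    s = 0
    for _ in range(20):       # bounded episode length
        # epsilon-greedy action selection
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[s][x])
        s2, r = step(s, a)
        # standard Q-learning update
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2
        if r > 0:
            break

# The learned greedy policy should prefer "right" in states 0..3.
policy = [max(ACTIONS, key=lambda x: Q[s][x]) for s in range(N_STATES)]
print(policy)
```

Despite the slip noise, the value of moving right dominates in every non-terminal state, so the greedy policy recovers the optimal behavior; the course treats far richer versions of this setting (partial information, corrupted data, multiple agents).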

An initial listing of lecture topics and reference material is given in the syllabus below; it is subject to some modification, given the background and the speed at which we cover ground. Homework exercises will be given roughly bi-weekly. Assignment solutions turned in late will suffer a 10% per day reduction in credit, and a 100% reduction once solutions are posted. There will be an in-class mid-term exam whose content will be similar to the homework exercises. A list of topics will also be assigned as take-home final projects on training scientific machine-learned decision-making agents. The projects will involve modern ML programming, an oral presentation, and a written report submitted at the end of the semester.

These projects will be graded and will serve in lieu of a final exam.

The course is open to graduate students in all disciplines. Students in the 5-year master's program, as well as those in CS, CSEM, ECE, MATH, STAT, PHYS, CHEM, and BIO, are welcome. You'll need an undergraduate-level background in the intertwined topics of algorithms, data structures, numerical methods, numerical optimization, functional analysis, algebra, geometry, topology, statistics, and stochastic processes. You will also need programming experience (e.g., Python) at a CS undergraduate senior level.

M 393C - Tensor Methods (Kileel)

Info coming soon.

M 394C - Stochastic Processes (Sirbu)

Info coming soon.