Interactive Theorem Proving in Lean


18th CICM, Brasilia. 6-10 October 2025.
Florent Schaffhauser, Heidelberg University.

Session 2: Dependent type theory and applications to mathematics

Intermediate Tactics

If you would rather use this time to learn about tactics than about dependently-typed functions, here is an additional practice file for you, on intermediate tactics 😊 . There will be one more on the last slide!



Many kinds of functions

  • So far, we have worked mostly with basic functions, such as neg : Bool → Bool or fact : Nat → Nat. These are simply-typed functions, from a type to a type.

    def fact : Nat → Nat
    | 0     => 1
    | n + 1 => (n + 1) * fact n
    
  • But in fact, we have also seen type formers, such as Prod : Type → Type → Type, which takes two types X and Y and returns the type X × Y. Another common example would be List : Type → Type.
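
    These type formers can be inspected directly with #check (the displayed signatures are shown here modulo universe polymorphism, which Lean will print in full):

    ```lean
    #check Prod       -- Prod : Type → Type → Type (modulo universes)
    #check List       -- List : Type → Type (modulo universes)
    #check Nat × Int  -- Nat × Int : Type
    #check List Bool  -- List Bool : Type
    ```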


Terms that depend on types, types that depend on terms

Thus, so far we have seen:

  • Terms that depend on terms (simply-typed functions).
  • Types that depend on types (type formers).

There should also be:

  • Terms that depend on types (polymorphic functions).
  • Types that depend on terms (type families).

Polymorphic functions

  • A polymorphic function is a function that takes a type as an argument. The simplest example of a polymorphic function is the identity function:

    def id {X : Type} : X → X := fun x ↦ x
    
    #check id  -- id {X : Type} :  X → X
    
  • The type parameter X is passed as implicit here. So to consider the identity function from Nat to Nat, we should write id (X := Nat) or @id Nat.

  • Since X is implicit, id is also an example of an overloaded term: whatever the type X, we may denote its identity function simply by id. The expressions id (3 : Nat) and id (3 : Int) are both well-typed.
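
    A sketch of these different ways of applying the identity function (renamed id' here, since a top-level id already exists in Lean's core library):

    ```lean
    def id' {X : Type} : X → X := fun x ↦ x

    #check id' (3 : Nat)   -- id' 3 : Nat
    #check id' (3 : Int)   -- id' 3 : Int
    #check @id' Nat        -- explicit type argument
    #check id' (X := Nat)  -- named type argument
    ```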


Type families

  • A type family parameterised by X is a function F : X → Type, so indeed types that depend on terms: for all x : X, the expression F x is a type.

  • As an example, you can think of the type family Vec X : Nat → Type defined inductively as follows:

    inductive Vec (X : Type) : Nat → Type where
    | null                                 : Vec X 0
    | cons {n : Nat} (x : X) (v : Vec X n) : Vec X (n + 1)
    
  • Here Vec X n represents the type of lists of length n with entries in X:

    • Vec.null is a list of length 0.
    • For all n : Nat, if x : X and v : Vec X n, then Vec.cons x v is a list of length n + 1.
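
    Concretely, here is a vector of length 2 built with these constructors (the Vec declaration is repeated so that the snippet is self-contained):

    ```lean
    inductive Vec (X : Type) : Nat → Type where
    | null                                 : Vec X 0
    | cons {n : Nat} (x : X) (v : Vec X n) : Vec X (n + 1)

    def v : Vec Int 2 := Vec.cons 1 (Vec.cons (-1) Vec.null)

    #check v  -- v : Vec Int 2
    ```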

Dependent pairs

  • Given a type family F : X → Type, we can form the associated type of dependent pairs, which in Lean can be denoted by (x : X) × F x (also called a Σ-type).
  • Dependent pairs are terms of the form ⟨x, t⟩ where x : X and t : F x.
  • For instance ⟨2, [1, -1]⟩ is a term of type (n : Nat) × Vec Int n (note that [1, -1] is a list of integers of length 2, so a term of type Vec Int 2).
  • Dependent pairs generalize usual pairs: if F x = Y for all x : X (a constant family), then (x : X) × F x is just the product type X × Y.
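
    The ⟨2, [1, -1]⟩ example can be spelled out with the Vec constructors (list-literal notation is not set up for Vec here, so cons is used explicitly; Vec as on the previous slide):

    ```lean
    inductive Vec (X : Type) : Nat → Type where
    | null                                 : Vec X 0
    | cons {n : Nat} (x : X) (v : Vec X n) : Vec X (n + 1)

    def pair : (n : Nat) × Vec Int n :=
      ⟨2, Vec.cons 1 (Vec.cons (-1) Vec.null)⟩

    #check pair.fst  -- pair.fst : Nat
    #check pair.snd  -- pair.snd : Vec Int pair.fst
    ```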


Subtypes

  • The formalism of dependent pairs can be used to define subtypes: we just replace the type family F : X → Type by a predicate P : X → Prop.

  • In Lean, the subtype associated to a predicate can be denoted by (x : X) ×' P x or by {x : X // P x}. Its terms are dependent pairs ⟨x, px⟩ where x is an element of X and px is a proof of the proposition P x.

  • For instance, we could also define Vec X n as {L : List X // L.length = n}, using the following predicate:

    def List.has_length {X : Type} (n : Nat) : List X → Prop :=
      fun L ↦ L.length = n
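
    Under this encoding, a length-2 vector is a dependent pair of a list and a proof (Vec' is a hypothetical name, chosen to avoid clashing with the inductive Vec):

    ```lean
    def Vec' (X : Type) (n : Nat) : Type := {L : List X // L.length = n}

    def twoVec : Vec' Int 2 := ⟨[1, -1], rfl⟩  -- the length check holds by computation

    #check twoVec.val       -- the underlying list, of type List Int
    #check twoVec.property  -- the proof that its length is 2
    ```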
    

Dependently-typed functions

  • Given a type family F : X → Type, the associated type of dependent functions (also called a Π-type) is denoted by (x : X) → F x.

  • If f : (x : X) → F x, then, given x : X, we get f x : F x (the return type depends on the input parameter x).

    def zero_vector : (n : Nat) → Vec Int n
    | 0     => Vec.null
    | n + 1 => Vec.cons 0 (zero_vector n)
    
    #check zero_vector 42  -- zero_vector 42 : Vec Int 42
    
  • Dependent functions generalize functions: if F x = Y for all x : X (a constant family), then (x : X) → F x is just the function type X → Y.


Applications to mathematics

  1. Existential statements
  2. Universal statements
  3. Algebraic structures

Existential statements

  • Given a predicate P : X → Prop on a type X, the proposition ∃ x : X, P x is the proposition defined inductively as follows.

    inductive Exists {X : Type} (P : X → Prop) : Prop
    | intro (x : X) (p : P x) : Exists P
    
  • This means that, in order to prove that ∃ x : X, P x, you need to construct a term x : X (a witness) and a proof of the proposition P x (the evidence).

  • Note the analogy with subtypes:

    inductive Subtype {X : Type} (P : X → Prop) : Type
    | intro (x : X) (p : P x) : Subtype P
    

    Also note that ∃ x : X, P x is stronger than saying ¬(∀ x : X, ¬(P x)).
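
    In practice, the constructor is usually applied via the anonymous-constructor brackets, exactly as for dependent pairs:

    ```lean
    example : ∃ n : Nat, n + 2 = 5 := ⟨3, rfl⟩  -- witness 3, evidence by computation
    ```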


Universal statements

  • Given a predicate P : X → Prop on a type X, the proposition ∀ x : X, P x is the proposition defined by the type of dependent functions (x : X) → P x.

  • This means that, to prove such a statement, you need to construct a function. So you start your proof with fun x ↦ _ (if you are in term mode) or intro x (if you are in tactic mode).

    example : ∀ w : ℂ, ∃ z : ℂ, z ^ 2 = w :=
      by {                        --       ⊢ ∀ w : ℂ, ∃ z : ℂ, z ^ 2 = w
        intro w                   -- w : ℂ ⊢ ∃ z : ℂ, z ^ 2 = w
        sorry
      }
    
  • Note that there might be more than one witness z : ℂ for the property z ^ 2 = w, but that piece of data cannot be recovered from the existential statement itself.
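
    For a statement simple enough to finish, both proof styles look as follows (a toy example; Nat.zero_add is from the standard library):

    ```lean
    example : ∀ n : Nat, n + 0 = n :=
      fun n ↦ rfl          -- term mode: a (dependent) function

    example : ∀ n : Nat, 0 + n = n := by
      intro n              -- tactic mode: introduce the variable first
      exact Nat.zero_add n
    ```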


Algebraic structures

  • So far, we have focused on the calculus of predicates. Dependent type theory provides a syntax in which we can express mathematical statements.

    theorem FLT {n : Nat} (x y z : Int) : 
        (n > 2) → x ^ n + y ^ n = z ^ n → x * y * z = 0 :=
      sorry
    
  • But we can also use it to represent mathematical structures such as groups, rings, or topological spaces.

  • To do this in a programming language such as Lean, it is useful to first have a sense of what a record type is.


Record types

  • As a first approximation, you can think of a record type as an inductive type with only one constructor. In Lean, record types are introduced via the keyword structure.

  • The product of two types, for instance, can be defined as a structure.

    structure Prod (X : Type) (Y : Type) : Type where
      mk :: (x : X) (y : Y)
    
  • The definition as an inductive type uses quite similar syntax:

    inductive Prod (X : Type) (Y : Type) : Type where
    | mk (x : X) (y : Y) : Prod X Y
    

Fields of a structure

  • While valid, the previous syntax for declaring Prod X Y as a record is not very enlightening.

    structure Prod (X : Type) (Y : Type) : Type where
      mk :: (x : X) (y : Y)
    
  • Try instead:

    structure Prod (X : Type) (Y : Type) : Type where
      mk ::      -- indicating the constructor's name is optional (try it!)
        fst : X 
        snd : Y
    
  • Prod is a structure with two fields, named fst and snd.


Projections

  • Record types come equipped with projections to their fields:

    #check Prod.fst  -- Prod.fst : {X Y : Type} → Prod X Y → X
    
    #check Prod.snd  -- Prod.snd : {X Y : Type} → Prod X Y → Y
    
  • The name of the field should reflect that: Prod.fst is much more expressive than Prod.x as a name for the first projection.

  • A convenient feature of these projections is that you can use dot notation.

    #check (2, -1)      -- (2, -1) : Nat × Int
    #check (2, -1).fst  -- (2, -1).fst : Nat
    #eval  (2, -1).fst   -- 2
    

Monoids

  • In mathematics, a monoid is a triple (M, ⋆, e) where:

    • M is a set (called the carrier or the underlying set of the monoid).
    • ⋆ is an associative operation on M, meaning that ∀ x, y, z ∈ M, (x ⋆ y) ⋆ z = x ⋆ (y ⋆ z).
    • e is a neutral element for the operation ⋆, meaning that ∀ x ∈ M, e ⋆ x = x and x ⋆ e = x.

  • Since a monoid is some kind of tuple, it is natural to translate this directly into a record type in Lean. We just have to unpack the information about ⋆ and e.


The type of monoids

  • The type of monoids can be introduced as follows in Lean.

    structure Monoid : Type 1 where
      carrier : Type
      op      : carrier → carrier → carrier 
      assoc   : ∀ x y z : carrier, op (op x y) z = op  x (op y z)
      elt     : carrier
      neutral : ∀ x : carrier, (op elt x = x) ∧ (op x elt = x)
    
  • Note how the field op depends on the field carrier and how the fields assoc and neutral depend on the fields carrier, op and elt.

  • Also, expressions such as ∀ x y z : carrier, op (op x y) z = op x (op y z) (which expresses the associativity property of the operation op) are types.

  • You can forget about Type 1 😅. If you remove : Type 1, Lean will infer it!


Construction of monoids

  • Concretely, how do we construct a monoid? We must supply an element for each field of the Monoid structure.

    def NatAddZero : Monoid where
      carrier := Nat
      op      := Nat.add
      assoc   := Nat.add_assoc
      elt     := Nat.zero
      neutral := fun (n : Nat) ↦ ⟨Nat.zero_add n, Nat.add_zero n⟩
    
  • Note the use of the where keyword.

  • This works because Nat.add, Nat.add_assoc, etc. are already contained in Lean's standard library. As shown for the neutral field, the term you need can also be defined on the spot.


Using tactic mode to construct monoids

  • You can also use tactic mode to write terms that go into the various fields.

    def NatAddZero : Monoid where
      carrier := by {exact Nat}
        ... (omitted)
      neutral := by {
        intro n
        constructor
        exact Nat.zero_add n
        exact Nat.add_zero n
      }
    
  • Or, without the where keyword (try it!):

    def NatAddZero : Monoid := by { constructor; exact Nat; ... (omitted) }
    

Beware of existential quantifiers in the fields of a structure

  • One could also think of declaring the type of monoids as follows.

    structure Monoid : Type 1 where
      carrier : Type
      op      : carrier → carrier → carrier 
      assoc   : ∀ x y z : carrier, op (op x y) z = op  x (op y z)
      neutral : ∃ elt : carrier, ∀ x : carrier, (op elt x = x) ∧ (op x elt = x)
    
  • The issue with this is that it is then unclear how to refer to the neutral element of a monoid, or if it is even possible to do that. In the previous construction, we had for instance NatAddZero.elt = Nat.zero (you can prove that if you want).

  • This is problematic if we want to write the definition of a group: to add something like ∀ x : carrier, ∃ y : carrier, (op y x = elt) ∧ (op x y = elt), we need a term elt to refer to.
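
    For reference, a neutral element could still be extracted from the existential field with a choice principle, at the cost of computability (a sketch; Monoid' and Monoid'.elt are hypothetical names, and the assoc field is omitted for brevity):

    ```lean
    structure Monoid' : Type 1 where
      carrier : Type
      op      : carrier → carrier → carrier
      neutral : ∃ e : carrier, ∀ x : carrier, (op e x = x) ∧ (op x e = x)

    -- Classical.choose picks a witness out of an existential, noncomputably
    noncomputable def Monoid'.elt (M : Monoid') : M.carrier :=
      Classical.choose M.neutral
    ```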


Groups

  • There is a theoretical way to get out of the issues above (use a definite description operator). But, for practical purposes, we may as well define the type of monoids as we did and the type of groups as follows (the Type 1 ascription is again optional here).

    structure Group : Type 1 extends Monoid where
      inv_map  : carrier → carrier
      inv_ppty : ∀ x : carrier, (op (inv_map x) x = elt) ∧ (op x (inv_map x) = elt)
    
  • Thanks to the extends keyword, there is no need to repeat the fields carrier, op, etc. They are part of the new structure and can be used when adding further fields. This also creates a projection map from Group to Monoid, which "forgets" the new fields.

    #check @Group.toMonoid  -- Group.toMonoid : Group → Monoid
    

Exercises

  1. A basic exercise is to define the projection Group.toMonoid by hand.
  2. In the same vein, you can retake the definition of Prod X Y as an inductive type (as opposed to a record type) and define the projection to the first factor by hand.
  3. A somewhat harder exercise is to construct a group whose carrier is the set of integers ℤ. For this, you will need to use the standard library and search for basic results such as the associativity property of addition on Int, the fact that 0 : ℤ is left and right neutral, or the existence of an additive inverse (the exact? or simp? tactics may be of assistance).

Stationary sequences are convergent

  • It is possible to use usual mathematical notation in Lean. The quantifier symbols unfold to the definitions that we have given before and it is a good exercise to convert the types represented below into Σ-types and Π-types (by definition, Sequence ℝ := ℕ → ℝ).

    def Sequence.isConvergent (s : Sequence ℝ) : Prop :=
      ∃ l : ℝ, ∀ ε > 0, ∃ n : ℕ, ∀ m : Nat, m ≥ n → |s m - l| < ε
    
    def Sequence.isStationary (s : Sequence ℝ) : Prop :=
      ∃ a : ℝ, ∃ n : ℕ, ∀ m : Nat, m ≥ n → s m = a
    
  • Another good exercise is to prove the following implication.

    theorem stationary_implies_convergent : 
        ∀ (s : ℕ → ℝ), s.isStationary → s.isConvergent := sorry
    

Practice time

Here are three practice files for you to work on during the rest of the session. I am happy to answer any questions you may have 😊 . Thank you for your attention!

  1. On intermediate tactics.
  2. On advanced tactics.
  3. On prime versus irreducible elements in commutative rings.


