optlib

We aim to formalize the broad area of mathematical optimization in Lean4, including convex analysis, convex optimization, nonlinear programming, integer programming, and more. Related topics include, but are not limited to, the definitions and properties of convex and nonconvex functions, optimality conditions, and the convergence of various algorithms.

More topics related to computational mathematics such as numerical linear algebra and numerical analysis will be included in the future.

The GitHub repository for this work can be found at https://github.com/optsuite/optlib.

Lean4 Toolchain Installation

How to use the code in this repository

If anything goes wrong, please feel free to contact Chenyi Li by email (lichenyi@stu.pku.edu.cn).

The version of Lean4 used by this repository can be found in the lean-toolchain file at the root of the repository.

Use the Optlib library as a Lean4 project dependency

In a Lean4 project, add these lines to your lakefile.lean:

require optlib from git
  "https://github.com/optsuite/optlib"

or, using the newer Lake DSL syntax:

require "optsuite" / "optlib" @ "git#master"

The optlib library depends on mathlib4; the command lake exe cache get can be used to fetch the mathlib4 build cache.
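
Once the dependency has been resolved (for example by running lake update), the library's modules can be imported in your own Lean files. The module path below is an assumption based on the file list in this README; check the repository layout for the actual names:

-- Hypothetical module path; consult the repository for the actual module names.
import Optlib.Algorithm.GradientDescent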

Contribute to the Optlib library

The command

git clone https://github.com/optsuite/optlib.git && cd optlib && code .

will download the source of the optlib library and open it in VS Code. After making your changes, you can fork this project on GitHub and file a pull request.
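
Before filing a pull request, it is a good idea to check that the library still builds with your changes. A minimal check from the repository root, reusing the mathlib4 cache mentioned above:

lake exe cache get && lake build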

Main Contributions (What we have done)

Differential

Convex

Function

Optimality

  • OptimalityConditionOfUnconstrainedProblem.lean: first-order optimality conditions for unconstrained optimization problems (see the sketch after this list).
  • Constrained_Problem.lean: the basic definitions of constrained optimization problems and a proof of the KKT conditions under the LICQ and the linear constraint qualification.
  • Weak_Duality.lean: the formalization of the dual problem and a proof of the weak duality property.
  • KKT conditions for constrained convex problems under the Slater condition (Ongoing)
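
For orientation, the classical first-order condition behind the first item is, stated here informally (the Lean statement in the file may differ in form): if $f$ is convex and differentiable, then

$$\nabla f(x^*) = 0 \iff f(x^*) \le f(x) \ \text{ for all } x,$$

i.e. a stationary point of a convex function is a global minimizer.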

Algorithm

  • GradientDescent.lean: convergence rate of the gradient descent algorithm for smooth convex functions (see the rate sketched after this list).
  • GradientDescentStronglyConvex.lean: convergence rate of the gradient descent algorithm for smooth strongly convex functions.
  • NesterovSmooth.lean: convergence rate of Nesterov's accelerated gradient method for smooth convex functions.
  • SubgradientMethod.lean: convergence rate of the subgradient method with different choices of step size for nonsmooth convex functions.
  • ProximalGradient.lean: convergence rate of the proximal gradient method for composite optimization problems.
  • NesterovAccelerationFirst.lean: convergence rate of the first version of the Nesterov acceleration method for composite optimization problems.
  • NesterovAccelerationSecond.lean: convergence rate of the second version of the Nesterov acceleration method for composite optimization problems.
  • LASSO.lean: convergence rate of an algorithm for the L1-regularized least-squares (LASSO) problem.
  • the BCD method: convergence analysis of the block coordinate descent (BCD) method.
  • the ADMM method: convergence analysis of the alternating direction method of multipliers (ADMM).
  • Primal-Dual Algorithms (todo)
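
As a sample of the kind of statements collected above, the standard rate for gradient descent on a convex $L$-smooth function $f$ with fixed step size $1/L$ (the setting of GradientDescent.lean, stated here informally) is

$$f(x_k) - f(x^*) \le \frac{L \, \|x_0 - x^*\|^2}{2k},$$

where $x^*$ is a minimizer of $f$; the strongly convex and accelerated variants above sharpen this to a linear rate and an $O(1/k^2)$ rate, respectively.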

What we plan to do

Convex Analysis

  • First Order Conditions for Convex Functions (Done)
  • Second Order Conditions for Convex Functions
  • Definition and Properties of Proper Functions and Conjugate Functions (Ongoing)
  • Definition and Properties of Strongly Convex Functions (Done)
  • Definition and Properties of L-smooth Functions (Done; see the definitions sketched after this list)
  • Definition and Properties of the Subgradient and the Proximal Operator (Done)
  • Definition and Properties of the Fréchet Subdifferential (Done)
  • Definition of the Kurdyka–Łojasiewicz (KL) Property (Done)
  • Definition of Semi-algebraic Functions
  • ...
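
For reference, the standard definitions behind the strong convexity and L-smoothness items above are, informally: a differentiable function $f$ is $L$-smooth when its gradient is Lipschitz with constant $L$, and $m$-strongly convex when it is bounded below by a quadratic at every point:

$$\|\nabla f(x) - \nabla f(y)\| \le L \,\|x - y\|, \qquad f(y) \ge f(x) + \langle \nabla f(x), y - x \rangle + \tfrac{m}{2}\|y - x\|^2.$$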

Optimality Conditions

  • First Order Conditions for Constrained and Unconstrained Problems
  • Second Order Conditions for Constrained and Unconstrained Problems
  • KKT Conditions for Constrained Problems under constraint qualifications
  • Slater Condition and KKT Conditions for convex optimization problems (Ongoing)
  • ...

Convergence of Optimization Algorithms

  • Gradient Descent for Convex and Strongly Convex Functions
  • Line Search Methods
  • Newton Method, Quasi-Newton Method, LBFGS Update
  • Primal-Dual Algorithms
  • Stochastic Gradient Descent and Stochastic Algorithms
  • ...

Integer Programming

Many other things to be added ...

References

Training Sessions

There will be a weekly online session on formalization every Wednesday from 6:15 p.m. to 7:15 p.m. throughout the Spring semester of 2025.

The Team

We are a group of scholars and students with a keen interest in mathematical formalization.

Members

Other Contributors

  • Undergraduate students from Peking University:

    Hongjia Chen, Wanyi He, Yuxuan Wu, Shengyang Xu, Junda Ying, Penghao Yu, ...

  • Undergraduate students from Summer Seminar on Mathematical Formalization and Theorem Proving, BICMR, Peking University, 2023:

    Zhipeng Cao, Yiyuan Chen, Heying Wang, Zuokai Wen, Mingquan Zhang, Ruichong Zhang, ...

  • Undergraduate and graduate students from Summer Seminar on Mathematical Formalization and Theorem Proving, BICMR, Peking University, 2024:

    Yifan Bai, Yunxi Duan, Anqing Shen, Yuqing Gao, Pengfei Hao

  • Other collaborators:

    Anjie Dong, ...

Copyright

Copyright (c) 2025 Chenyi Li, Zichen Wang, Ziyu Wang, Zaiwen Wen. All rights reserved.

Released under Apache 2.0 license as described in the file LICENSE.