optlib
We aim to formalize the broad area of mathematical optimization, including convex analysis, convex optimization, nonlinear programming, and integer programming, in Lean 4. Related topics include, but are not limited to, the definitions and properties of convex and nonconvex functions, optimality conditions, and the convergence of various algorithms.
More topics related to computational mathematics such as numerical linear algebra and numerical analysis will be included in the future.
The GitHub repository for this work can be found at https://github.com/optsuite/optlib.
Lean4 Toolchain Installation
- A comprehensive installation guide in Chinese: http://faculty.bicmr.pku.edu.cn/~wenzw/formal/index.html
- The download guide provided by the official Lean team: https://leanprover-community.github.io/get_started.html
- For Lean 4 users in China, the glean tool is recommended for using the Shanghai Jiao Tong University mirror service.
How to use the code in this repository
If anything goes wrong, please feel free to contact Chenyi Li via email (lichenyi@stu.pku.edu.cn).
The version of Lean 4 used by this repository is recorded in the repository's lean-toolchain file.
Use the Optlib library as a Lean 4 project dependency
In a Lean 4 project, add these lines to your lakefile.lean:
```lean
require optlib from git
  "https://github.com/optsuite/optlib"
```
or, with the new Lake DSL in lakefile.lean:

```lean
require "optsuite" / "optlib" @ "git#master"
```
The optlib library uses mathlib4 as a dependency; the command `lake exe cache get` can be used to fetch the mathlib4 cache.
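For concreteness, here is a minimal sketch of a complete lakefile.lean for a hypothetical downstream project; the package and library names are placeholders, and you should pin the optlib revision to one compatible with your toolchain:

```lean
import Lake
open Lake DSL

-- Hypothetical downstream package; «my_project» and «MyProject»
-- are placeholder names, not part of optlib itself.
package «my_project»

-- Depend on optlib; "master" can be replaced by a specific commit or tag.
require optlib from git
  "https://github.com/optsuite/optlib" @ "master"

@[default_target]
lean_lib «MyProject»
```

After adding the dependency, `lake update optlib` fetches it, and `lake exe cache get` downloads the mathlib4 cache as described above.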
Contribute to the Optlib library
The command

```
git clone https://github.com/optsuite/optlib.git && cd optlib && code .
```

will download the source of the optlib library and open it in VS Code. After editing the files, you can fork this project on GitHub and file a pull request.
Code Explanations
Differential
- Basic.lean (now Mathlib/Analysis/Calculus/Gradient/Basic.lean): the definition and basic properties of the gradient of a function; the precise notion is recalled after this list. (This file has been merged into mathlib4; see https://github.com/leanprover-community/mathlib4/blob/master/Mathlib/Analysis/Calculus/Gradient/Basic.lean.)
- Calculation.lean: the properties of the gradient of a function, including the chain rule and the product rule.
- GradientDiv.lean: the quotient rule for the gradient.
- Lemmas.lean: useful lemmas such as the mean value theorem and Taylor's expansion.
- Subdifferential.lean: the basic definitions and properties of subdifferentials for general nonsmooth functions.
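For reference, the gradient formalized in Basic.lean is the standard one on a Hilbert space: ∇f(x) is the vector representing the Fréchet derivative of f at x through the inner product, so that

$$f(y) = f(x) + \langle \nabla f(x),\, y - x \rangle + o(\|y - x\|) \quad \text{as } y \to x.$$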
Convex
- ConvexFunction.lean: the properties of convex functions (see the Lean example after this list).
- QuasiConvexFirstOrder.lean: first-order conditions for quasi-convex functions.
- StronglyConvex.lean: the properties of strongly convex functions. (Part of this has been merged into mathlib4; see https://github.com/leanprover-community/mathlib4/blob/master/Mathlib/Analysis/Convex/Strong.lean.)
- Subgradient.lean: the basic definitions and properties of the subgradient.
- BanachSubgradient.lean: the basic definitions of the subgradient on a Banach space.
- FiniteDimensionalConvexFunctionsLocallyLipschitz.lean: the proof of the continuity of convex functions on finite-dimensional spaces.
- ConicCaratheodory.lean: the proof of the conic Carathéodory lemma.
- Farkas.lean: the proof of the Farkas lemma.
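To give a flavor of what such formalized statements look like, here is a small self-contained Lean 4 example. It uses only mathlib4 lemmas (convexOn_exp and Even.convexOn_pow); optlib's own definitions and lemma names may differ, so treat this as a sketch rather than a sample of the library's API:

```lean
import Mathlib.Analysis.Convex.SpecificFunctions.Basic

-- The exponential function is convex on all of ℝ (a mathlib4 lemma).
example : ConvexOn ℝ Set.univ Real.exp := convexOn_exp

-- x ↦ x ^ 2 is convex on ℝ, as an even power of x (a mathlib4 lemma).
example : ConvexOn ℝ Set.univ fun x : ℝ => x ^ 2 :=
  Even.convexOn_pow even_two
```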
Function
- ClosedFunction.lean (now Mathlib/Topology/Semicontinuous.lean): the basic definitions and properties of closed functions. (This file has been merged into mathlib4; see https://github.com/leanprover-community/mathlib4/blob/master/Mathlib/Topology/Semicontinuous.lean.)
- Lsmooth.lean: the properties of L-smooth functions.
- MinimaClosedFunction.lean: the Weierstrass theorem for closed functions.
- Proximal.lean: the basic definitions and properties of the proximal operator (the definition is recalled after this list).
- KL.lean: the KL property and the uniform KL property.
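For reference, the proximal operator formalized in Proximal.lean is the standard one: for a function f and a point x,

$$\operatorname{prox}_{f}(x) = \operatorname*{arg\,min}_{u} \Big( f(u) + \tfrac{1}{2}\|u - x\|^{2} \Big).$$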
Optimality
- OptimalityConditionOfUnconstrainedProblem.lean: first-order optimality conditions for unconstrained optimization problems.
- Constrained_Problem.lean: the basic definitions of constrained optimization problems and the proof of the KKT conditions under LICQ and the linear constraint qualification (the KKT system is recalled after this list).
- Weak_Duality.lean: the formalization of the dual problem and the proof of the weak duality property.
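As a reminder of the statement being formalized: for a problem

$$\min_{x} f(x) \quad \text{s.t.} \quad c_i(x) \le 0 \ (i \in \mathcal{I}), \qquad c_j(x) = 0 \ (j \in \mathcal{E}),$$

the KKT conditions at a feasible point $x^*$ assert the existence of multipliers $\lambda_i \ge 0$ and $\mu_j$ such that

$$\nabla f(x^*) + \sum_{i \in \mathcal{I}} \lambda_i \nabla c_i(x^*) + \sum_{j \in \mathcal{E}} \mu_j \nabla c_j(x^*) = 0, \qquad \lambda_i\, c_i(x^*) = 0 \ (i \in \mathcal{I}).$$

Constraint qualifications such as LICQ guarantee that these conditions are necessary at a local minimizer.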
Algorithm
- GradientDescent.lean: convergence rate of the gradient descent algorithm for smooth convex functions (the classical rate is stated after this list).
- GradientDescentStronglyConvex.lean: convergence rate of the gradient descent algorithm for smooth strongly convex functions.
- NesterovSmooth.lean: convergence rate of the Nesterov accelerated gradient algorithm for smooth convex functions.
- SubgradientMethod.lean: convergence rate of the subgradient method with different choices of step size for nonsmooth convex functions.
- ProximalGradient.lean: convergence rate of the proximal gradient method for composite optimization problems.
- NesterovAccelerationFirst.lean: convergence rate of the first version of the Nesterov acceleration method for composite optimization problems.
- NesterovAccelerationSecond.lean: convergence rate of the second version of the Nesterov acceleration method for composite optimization problems.
- LASSO.lean: convergence rate of the LASSO algorithm for the L1-regularized least squares problem.
- The BCD method: convergence analysis of the block coordinate descent method.
- The ADMM method: convergence analysis of the alternating direction method of multipliers.
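As an illustration of the kind of theorem these files prove: for gradient descent $x_{k+1} = x_k - \tfrac{1}{L}\nabla f(x_k)$ applied to an L-smooth convex function f with minimizer $x^*$, the classical rate is

$$f(x_k) - f(x^*) \le \frac{L\,\|x_0 - x^*\|^{2}}{2k},$$

i.e. an O(1/k) rate (the precise constants in the formalized statements may differ slightly).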
What we have done
Convex Analysis
- First and Second Order Conditions for Convex Functions
- Definition and Properties of Strongly Convex Functions
- Definition and Properties of L-smooth Functions
- Definition and Properties of Proper Functions and Conjugate Functions
- ...
Optimality Conditions
- First and Second Order Conditions for Constrained and Unconstrained Problems
- KKT Conditions for Constrained Problems under Constraint Qualifications
- Slater Condition and KKT Conditions for Convex Optimization Problems (ongoing)
- ...
Convergence of Optimization Algorithms
- Gradient Descent for Convex and Strongly Convex Functions
- Proximal Gradient Method and Nesterov's Acceleration
- Block Coordinate Descent Method
- Alternating Direction Method of Multipliers
- ...
Mathematics Textbook Formalization Project
Objectives
The project aims to systematically formalize core areas of mathematics, including convex analysis, numerical linear algebra, and high-dimensional probability, using Lean. By formalizing classical textbooks and building a machine-readable, verifiable knowledge network, the initiative will bridge traditional human-written mathematics with AI-assisted reasoning and scientific computing. The ultimate goal is to create a replicable paradigm for textbook formalization, foster integration between mathematics and artificial intelligence, and establish a foundation for next-generation mathematical infrastructure.
Recruitment
The project will build a collaborative research team centered on Lean-based formalization. We welcome outstanding undergraduate seniors, master’s students, and PhD students with strong mathematical backgrounds and programming skills. Team members will be organized into groups under the guidance of group leaders, focusing respectively on convex optimization, numerical algebra, and high-dimensional probability.
Compensation and Benefits
- Group Members: Base stipend of RMB 1,000 per month, rising up to RMB 3,000 depending on the number and quality of formalized theorems.
- Group Leaders: Base stipend of RMB 1,500 per month, rising up to RMB 4,000 depending on individual and team contributions.
Topics
Convex Analysis
We mainly follow the book Convex Analysis (R. T. Rockafellar).
- Topological Properties
- Duality Correspondences
- Representation and Inequalities
- Differential Theory
- ...
Nonlinear Programming
We mainly follow the books Numerical Optimization (Jorge Nocedal, Stephen J. Wright) and Optimization: Modeling, Algorithm and Theory (H. Liu, J. Hu, Y. Li, Z. Wen, in Chinese).
- Line Search Methods
- Quasi-Newton Methods
- Theory of Constrained Optimization
- ...
Integer Programming
We mainly follow the book Integer Programming (Michele Conforti, Gérard Cornuéjols, and Giacomo Zambelli).
- Linear Inequalities and Polyhedra
- Perfect Formulations
- Intersection Cuts and Corner Polyhedra
- ...
Numerical Linear Algebra
We mainly follow the book Matrix Computations (Golub & Van Loan).
- Matrix Factorizations (LU/Cholesky, QR, Schur, SVD; Jordan canonical form)
- Matrix Functions
- Matrix Differential Calculus
- ...
High Dimensional Probability
We mainly follow the book High-Dimensional Probability (Roman Vershynin).
- Concentration Inequalities
- Random Vectors
- Random Matrices
- Random Processes
- ...
Sponsors
- Beijing International Center for Mathematical Research, Peking University
- Sino-Russian Mathematics Center
- Great Bay University
- National Natural Science Foundation of China
Publications
Formalization of Optimization
- Chenyi Li, Ziyu Wang, Wanyi He, Yuxuan Wu, Shengyang Xu, Zaiwen Wen, Formalization of Complexity Analysis of the First-order Optimization Algorithms
- Chenyi Li, Zichen Wang, Yifan Bai, Yunxi Duan, Yuqing Gao, Pengfei Hao, Zaiwen Wen, Formalization of Algorithms for Optimization with Block Structures
- Chenyi Li, Shengyang Xu, Chumin Sun, Li Zhou, Zaiwen Wen, Formalization of Optimality Conditions for Smooth Constrained Optimization Problems
- Chenyi Li, Zaiwen Wen, An Introduction to Mathematics Formalization Based on Lean (in Chinese)
Autoformalization and Automatic Theorem Proving
- Ziyu Wang, Bowen Yang, Shihao Zhou, Chenyi Li, Yuan Zhang, Bin Dong, Zaiwen Wen, Translating Informal Proofs into Formal Proofs Using a Chain of States
- Chenyi Li, Wanli Ma, Zichen Wang, Zaiwen Wen, SITA: A Framework for Structure-to-Instance Theorem Autoformalization
Premise Selection
- Zichen Wang, Anjie Dong, Zaiwen Wen, Tree-Based Premise Selection for Lean4
References
- H. Liu, J. Hu, Y. Li, Z. Wen. Optimization: Modeling, Algorithm and Theory (in Chinese).
- R. T. Rockafellar and R. J-B. Wets. Variational Analysis. Vol. 317. Springer Science & Business Media, 2009.
- J. Nocedal and S. J. Wright. Numerical Optimization. Springer, New York, NY, 1999.
- Y. Nesterov. Lectures on Convex Optimization. Vol. 137. Springer, Berlin, 2018.
- R. T. Rockafellar. Convex Analysis. Vol. 28 of Princeton Mathematical Series. Princeton University Press, 1970.
- J. Bolte, S. Sabach, and M. Teboulle. Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math. Program. 146, 459–494 (2014).
- M. Fazel, T. K. Pong, D. Sun, and P. Tseng. Hankel matrix rank minimization with applications to system identification and realization. SIAM Journal on Matrix Analysis and Applications, 34(3):946–977, 2013.
- ...
Training Sessions
There will be a weekly online session on formalization every Wednesday from 6:15 p.m. to 7:15 p.m. throughout the Spring semester of 2025. Join via Tencent Meeting (ID: 659-2856-6723).
The Team
We are a group of scholars and students with a keen interest in mathematical formalization.
Members
- Zaiwen Wen, Beijing International Center for Mathematical Research, Peking University, CHINA (wenzw@pku.edu.cn)
- Chenyi Li, School of Mathematical Sciences, Peking University, CHINA (lichenyi@stu.pku.edu.cn)
- Zichen Wang, School of Mathematics and Statistics, Xi’an Jiaotong University, CHINA (princhernwang@gmail.com)
- Ziyu Wang, School of Mathematical Sciences, Peking University, CHINA (wangziyu-edu@stu.pku.edu.cn)
Other Contributors
- Undergraduate students from Peking University: Hongjia Chen, Wanyi He, Yuxuan Wu, Shengyang Xu, Junda Ying, Penghao Yu, ...
- Undergraduate students from the Summer Seminar on Mathematical Formalization and Theorem Proving, BICMR, Peking University, 2023: Zhipeng Cao, Yiyuan Chen, Heying Wang, Zuokai Wen, Mingquan Zhang, Ruichong Zhang, ...
- Undergraduate and graduate students from the Summer Seminar on Mathematical Formalization and Theorem Proving, BICMR, Peking University, 2024: Yifan Bai, Yunxi Duan, Anqing Shen, Yuqing Gao, Pengfei Hao
- Other collaborators: Anjie Dong, ...
Copyright
Copyright (c) 2025 Chenyi Li, Zichen Wang, Ziyu Wang, Zaiwen Wen. All rights reserved.
Released under Apache 2.0 license as described in the file LICENSE.