TSSOS is a polynomial optimization tool based on the sparsity-adapted moment-SOS hierarchies. To use TSSOS in Julia, run
```
pkg> add https://github.com/wangjie212/TSSOS
```
TSSOS has been tested on Ubuntu and Windows.
The unconstrained polynomial optimization problem is formulated as

$$\inf_{\mathbf{x}\in\mathbb{R}^n}\, f(\mathbf{x}),$$

where $f$ is a polynomial in the variables $\mathbf{x} = (x_1,\ldots,x_n)$. Taking $f = 1 + x_1^4 + x_2^4 + x_3^4 + x_1x_2x_3 + x_2$ as an example, to compute the first TS step of the TSSOS hierarchy, run
```julia
using TSSOS
using DynamicPolynomials
@polyvar x[1:3]
f = 1 + x[1]^4 + x[2]^4 + x[3]^4 + x[1]*x[2]*x[3] + x[2]
opt,sol,data = tssos_first(f, x, TS="MD")
```
By default, the monomial basis computed by the Newton polytope method is used. If one sets newton=false,

```julia
opt,sol,data = tssos_first(f, x, newton=false, TS="MD")
```

then the standard monomial basis is used instead.
Two vectors are output: the first contains the sizes of the PSD blocks, and the second contains the number of PSD blocks of each size.
To compute higher TS steps of the TSSOS hierarchy, repeatedly run

```julia
opt,sol,data = tssos_higher!(data, TS="MD")
```
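For instance, a minimal sketch of chaining the first few steps (assuming higher TS steps exist for the instance; each step can only tighten the relaxation):

```julia
opt,sol,data = tssos_first(f, x, TS="MD")   # 1st TS step
opt,sol,data = tssos_higher!(data, TS="MD") # 2nd TS step
opt,sol,data = tssos_higher!(data, TS="MD") # 3rd TS step
```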
Options
- nb: specify the first nb variables to be binary variables (satisfying $x_i^2=1$)
- newton: true (use the monomial basis computed by the Newton polytope method), false (use the standard monomial basis)
- TS (term sparsity): "block" (use the maximal chordal extension), "MD" (use approximately smallest chordal extensions), false (without term sparsity)
- solution: true (extract an approximately optimal solution), false (otherwise)
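A hedged sketch combining these keywords (the choice nb=2 is purely illustrative):

```julia
# Treat the first two variables as binary (x_i^2 = 1), use the standard
# monomial basis, and try to extract an approximately optimal solution.
opt,sol,data = tssos_first(f, x, nb=2, newton=false, TS="MD", solution=true)
```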
Output
- basis: monomial basis
- cl: numbers of blocks
- blocksize: sizes of blocks
- blocks: block structure
- GramMat: Gram matrices (set Gram=true to retrieve them)
- flag: 0 if global optimality is certified; 1 otherwise
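Assuming these entries are exposed as fields of the returned data object, a minimal inspection sketch:

```julia
opt,sol,data = tssos_first(f, x, TS="MD", Gram=true)
println(data.blocksize) # sizes of the PSD blocks
println(data.cl)        # numbers of blocks of each size
println(data.flag)      # 0 means global optimality is certified
```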
The constrained polynomial optimization problem is formulated as

$$\inf_{\mathbf{x}\in\mathbf{K}}\, f(\mathbf{x}),$$

where $\mathbf{K}$ is the basic semialgebraic set defined by polynomial inequality and equality constraints. In TSSOS, the problem is passed as pop = [f, g_1, ..., g_m], where the last numeq constraints are treated as equalities. Taking the example $f = 1+x_1^4+x_2^4+x_3^4+x_1x_2x_3+x_2$ with the inequality constraint $1-x_1^2-2x_2^2\ge0$ and the equality constraint $x_2^2+x_3^2-1=0$, run
```julia
using TSSOS
using DynamicPolynomials
@polyvar x[1:3]
f = 1 + x[1]^4 + x[2]^4 + x[3]^4 + x[1]*x[2]*x[3] + x[2]
g_1 = 1 - x[1]^2 - 2*x[2]^2
g_2 = x[2]^2 + x[3]^2 - 1
pop = [f, g_1, g_2]
d = 2 # set the relaxation order
opt,sol,data = tssos_first(pop, x, d, numeq=1, TS="MD") # numeq=1: the last constraint is an equality
```
To compute higher TS steps of the TSSOS hierarchy, repeatedly run

```julia
opt,sol,data = tssos_higher!(data, TS="MD")
```
Options
- nb: specify the first nb variables to be binary variables (satisfying $x_i^2=1$)
- TS: "block" by default (use the maximal chordal extension), "MD" (use approximately smallest chordal extensions), false (without term sparsity)
- quotient: true (work in the quotient ring by computing a Gröbner basis), false (otherwise)
- solution: true (extract an approximately optimal solution), false (otherwise)
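For instance, a sketch exploiting the equality constraint via the quotient ring (keyword values as listed above):

```julia
# Work modulo a Groebner basis of the equality constraints and attempt
# to extract an approximately optimal solution.
opt,sol,data = tssos_first(pop, x, d, numeq=1, TS="MD", quotient=true, solution=true)
```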
One can also exploit correlative sparsity and term sparsity simultaneously, which is called the CS-TSSOS hierarchy.
```julia
using TSSOS
using DynamicPolynomials
n = 6
@polyvar x[1:n]
f = 1 + sum(x.^4) + x[1]*x[2]*x[3] + x[3]*x[4]*x[5] + x[3]*x[4]*x[6] + x[3]*x[5]*x[6] + x[4]*x[5]*x[6]
pop = [f, 1-sum(x[1:3].^2), 1-sum(x[1:4].^2)]
order = 2 # set the relaxation order
opt,sol,data = cs_tssos_first(pop, x, order, numeq=0, TS="MD")
```

To compute higher TS steps of the CS-TSSOS hierarchy, repeatedly run

```julia
opt,sol,data = cs_tssos_higher!(data, TS="MD")
```
Options
- nb: specify the first nb variables to be binary variables (satisfying $x_i^2=1$)
- CS (correlative sparsity): "MF" by default (generate an approximately smallest chordal extension), "NC" (without chordal extension), false (without correlative sparsity)
- TS: "block" (use the maximal chordal extension), "MD" (use approximately smallest chordal extensions), false (without term sparsity)
- order: d (relaxation order), "min" (use the lowest relaxation order for each variable clique)
- MomentOne: true (add a first-order moment matrix for each variable clique), false (otherwise)
- solution: true (extract an approximately optimal solution), false (otherwise)
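A hedged sketch combining several of these keywords on the example above:

```julia
# Use the lowest relaxation order per clique, approximately smallest
# chordal extensions for both sparsity patterns, and add first-order
# moment matrices to strengthen the relaxation.
opt,sol,data = cs_tssos_first(pop, x, "min", CS="MF", TS="MD", MomentOne=true, solution=true)
```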
You may set solver="Mosek" or solver="COSMO" to specify the SDP solver invoked by TSSOS. By default, the solver is Mosek.
You can tune the parameters of COSMO via

```julia
settings = cosmo_para()
settings.eps_abs = 1e-5 # absolute residual tolerance
settings.eps_rel = 1e-5 # relative residual tolerance
settings.max_iter = 1e4 # maximum number of iterations
```

and pass them to the solver via, for instance, tssos_first(..., cosmo_setting=settings).
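As a concrete sketch (reusing the f and x from the unconstrained example above):

```julia
# Solve with COSMO using the customized settings.
opt,sol,data = tssos_first(f, x, TS="MD", solver="COSMO", cosmo_setting=settings)
```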
Output
- basis: monomial basis
- cl: numbers of blocks
- blocksize: sizes of blocks
- blocks: block structure
- GramMat: Gram matrices (set Gram=true to retrieve them)
- moment: moment matrices (set Mommat=true to retrieve them)
- flag: 0 if global optimality is certified; 1 otherwise
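Assuming these entries are exposed as fields of the returned data object, Gram and moment matrices can be retrieved as follows:

```julia
opt,sol,data = cs_tssos_first(pop, x, order, TS="MD", Gram=true, Mommat=true)
data.GramMat # Gram matrices of the SOS decomposition
data.moment  # moment matrices of the relaxation
```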
For an application to optimal power flow problems, see the files runopf.jl and modelopf.jl in the folder example.
TSSOS supports more general sum-of-squares optimization (including polynomial optimization as a special case):

```julia
model,info = add_psatz!(model, nonneg, vars, ineq_cons, eq_cons, order, TS="block", SO=1, Groebnerbasis=false)
```

where nonneg is a polynomial constrained to admit a Putinar-style SOS representation on the semialgebraic set defined by ineq_cons and eq_cons, and SO is the sparse order.
The following is a simple example.
```julia
using JuMP
using MosekTools
using DynamicPolynomials
using MultivariatePolynomials
using TSSOS

@polyvar x[1:3]
f = x[1]^2 + x[1]*x[2] + x[2]^2 + x[2]*x[3] + x[3]^2
d = 2 # set the relaxation order
@polyvar y[1:2]
h = [x[1]^2+x[2]^2+y[1]^2-1, x[2]^2+x[3]^2+y[2]^2-1]
model = Model(optimizer_with_attributes(Mosek.Optimizer))
@variable(model, lower)
nonneg = f - lower*sum(x.^2) # constrain f - lower*|x|^2 to be nonnegative on the set defined by h
model,info = add_psatz!(model, nonneg, [x; y], [], h, d, TS="block", Groebnerbasis=true)
@objective(model, Max, lower) # maximize the lower bound
optimize!(model)
```
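After solving, the certified bound can be read off with the standard JuMP accessor:

```julia
opt = objective_value(model) # the largest lower bound found
```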
See the file example/sosprogram.jl for a more complicated example.
It is possible to compute a local solution of the polynomial optimization problem in TSSOS via Ipopt:

```julia
obj,sol,status = local_solution(data.n, data.m, data.supp, data.coe, numeq=data.numeq, startpoint=rand(data.n))
```
TSSOS also supports solving complex polynomial optimization via the sparsity-adapted complex moment-HSOS hierarchies. See Exploiting Sparsity in Complex Polynomial Optimization for more details.
The complex polynomial optimization problem is formulated as

$$\inf_{\mathbf{z}\in\mathbb{C}^n}\, f(\mathbf{z},\bar{\mathbf{z}}) \quad \text{s.t.} \quad g_1(\mathbf{z},\bar{\mathbf{z}})\ge0,\ \ldots,\ g_m(\mathbf{z},\bar{\mathbf{z}})\ge0,$$

where $f$ and the $g_j$ are real-valued complex polynomials and, as before, the last numeq constraints are treated as equalities. In TSSOS, x[i] represents the complex variable $z_i$ and x[n+i] represents its conjugate $\bar{z}_i$. Consider the following example:
```julia
using TSSOS
using DynamicPolynomials
n = 2 # set the number of complex variables
@polyvar x[1:2n] # x[1:n] are the variables, x[n+1:2n] their conjugates
f = 3 - x[1]*x[3] - 0.5im*x[1]*x[4]^2 + 0.5im*x[2]^2*x[3]
g1 = x[2] + x[4]
g2 = x[1]*x[3] - 0.25*x[1]^2 - 0.25*x[3]^2 - 1
g3 = x[1]*x[3] + x[2]*x[4] - 3
g4 = im*x[2] - im*x[4]
pop = [f, g1, g2, g3, g4]
order = 2 # set the relaxation order
opt,sol,data = cs_tssos_first(pop, x, n, order, numeq=3, TS="block") # the last 3 constraints are equalities
```
Options
- nb: specify the first nb complex variables to be of unit norm (satisfying $|z_i|=1$)
- CS (correlative sparsity): "MF" by default (generate an approximately smallest chordal extension), "NC" (without chordal extension), false (without correlative sparsity)
- TS: "block" (use the maximal chordal extension), "MD" (use approximately smallest chordal extensions), false (without term sparsity)
- order: d (relaxation order), "min" (use the lowest relaxation order for each variable clique)
- MomentOne: true (add a first-order moment matrix for each variable clique), false (otherwise)
- ipart: true (use complex moment matrices), false (use real moment matrices)
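A hedged sketch of these keywords on the example above (note that nb=1 changes the feasible set by forcing $|z_1|=1$; it is shown only to illustrate the syntax):

```julia
# First complex variable of unit norm; use real moment matrices.
opt,sol,data = cs_tssos_first(pop, x, n, order, numeq=3, nb=1, TS="block", ipart=false)
```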
Tips for modeling polynomial optimization problems:
- When possible, explicitly include a sphere/ball constraint (or multi-sphere/multi-ball constraints); see the sketch after this list.
- When the feasible set is unbounded, try the homogenization technique introduced in Homogenization for polynomial optimization with unbounded sets.
- Scale the coefficients of the polynomial optimization problem to $[-1, 1]$.
- Scale the variables so that they take values in $[-1, 1]$ or $[0, 1]$.
- Try to include more (redundant) inequality constraints.
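For the first tip, a minimal sketch (the radius R = 2 is an assumption made for illustration; any ball known to contain a global minimizer works):

```julia
using TSSOS
using DynamicPolynomials
@polyvar x[1:3]
f = 1 + x[1]^4 + x[2]^4 + x[3]^4 + x[1]*x[2]*x[3] + x[2]
R = 2 # assumed radius of a ball containing a global minimizer
pop = [f, R^2 - sum(x.^2)] # append the (redundant) ball constraint
opt,sol,data = tssos_first(pop, x, 2, TS="MD")
```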
Related tools:
- Visit NCTSSOS for noncommutative polynomial optimization.
- Visit SparseDynamicSystem for sparse moment-SOS methods for dynamical systems.
- Visit SparseJSR for computing joint spectral radii.
References:
[1] TSSOS: A Moment-SOS hierarchy that exploits term sparsity
[2] Chordal-TSSOS: A moment-SOS hierarchy that exploits term sparsity with chordal extension
[3] CS-TSSOS: Correlative and term sparsity for large-scale polynomial optimization
[4] TSSOS: A Julia library to exploit sparsity for large-scale polynomial optimization
[5] Sparse polynomial optimization: theory and practice
Contact:
Jie Wang: wangjie212@amss.ac.cn
Victor Magron: vmagron@laas.fr