威廉希尔 Academic Activity Notice (I)
At the invitation of 威廉希尔, Associate Professor Hongchao Zhang (张洪超) of Louisiana State University, USA, will make a short visit from May 18 to May 31. During the visit he will give a series of academic talks in the School's academic lecture hall, entitled "Stochastic Gradient Methods" and "A Nonmonotone Smoothing Newton Algorithm for Weighted Complementarity Problems", and will meet with young faculty members and graduate students of the School's optimization and control team to discuss recent advances and frontier problems in optimization. Professor Zhang's visit will further broaden the research horizons of young faculty and graduate students, and will also play a positive role in raising the research level of the optimization and control team.
About the speaker: Hongchao Zhang (张洪超) is an associate professor and doctoral advisor in the Department of Mathematics and the Center for Computation and Technology at Louisiana State University, USA. His research interests include nonlinear optimization theory and applications, sparse matrices, inverse problems in medical imaging, and derivative-free optimization. In recent years he has led several projects funded by the U.S. National Science Foundation. He has published more than 30 papers in leading journals in optimization and computational mathematics, including Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, SIAM Journal on Scientific Computing, SIAM Journal on Imaging Sciences, and SIAM Journal on Control and Optimization. He currently serves on the editorial boards of international journals including Computational Optimization and Applications (SCI Zone 2), Optimization Letters (SCI Zone 3), and Numerical Algebra, Control and Optimization.
The schedule of the academic talks is as follows:
Talk 1: Stochastic Gradient Methods (I)
Speaker: Dr. Hongchao Zhang (张洪超)
Time: Monday, May 20, 2019, 15:00–17:00
Venue: 威廉希尔 academic lecture hall
Abstract: We consider the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable, but possibly nonconvex, function and a simple convex, but possibly nonsmooth, function. Since the objective function cannot be computed with high accuracy, it is in general difficult to apply standard deterministic optimization algorithms to this problem. In this talk, we introduce gradient and optimal gradient methods for solving this problem, and discuss the convergence of these methods as well as their convergence complexities.
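For reference, the composite problem described in the abstract can be written in the standard form below; the proximal-gradient step shown after it is a minimal sketch of one common "gradient method" for this setting, not necessarily the exact variant presented in the talk (the step size \alpha_k and the prox notation are illustrative assumptions):
\[
\min_{x \in \mathbb{R}^n} \; \Psi(x) := f(x) + h(x),
\]
where f is Lipschitz continuously differentiable (possibly nonconvex) and h is simple, convex, and possibly nonsmooth. A typical proximal-gradient iteration reads
\[
x_{k+1} = \operatorname{prox}_{\alpha_k h}\bigl(x_k - \alpha_k \nabla f(x_k)\bigr)
        = \arg\min_{x} \Bigl\{ \langle \nabla f(x_k),\, x - x_k \rangle + \tfrac{1}{2\alpha_k}\|x - x_k\|^2 + h(x) \Bigr\}.
\]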
Talk 2: Stochastic Gradient Methods (II)
Speaker: Dr. Hongchao Zhang (张洪超)
Time: Wednesday, May 22, 2019, 15:00–17:00
Venue: 威廉希尔 academic lecture hall
Abstract: In this talk, we introduce the stochastic approximation (SA) method for solving the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable, but possibly nonconvex, function and a simple convex, but possibly nonsmooth, function. The SA method assumes access only to a stochastic gradient, rather than the exact gradient, of the function. We study the SA method for this problem in both the convex and nonconvex cases, and discuss the convergence of these methods as well as their convergence complexities.
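As a hedged illustration of the setting described above, suppose the smooth part is an expectation, f(x) = \mathbb{E}_{\xi}[F(x,\xi)], and only an unbiased stochastic gradient G(x_k,\xi_k) is available; a prototypical SA (stochastic proximal-gradient) step then takes the form below. The symbols G, \xi_k, and \alpha_k are assumptions made for illustration, not necessarily the notation used in the talk:
\[
\mathbb{E}_{\xi}\bigl[G(x,\xi)\bigr] = \nabla f(x), \qquad
x_{k+1} = \operatorname{prox}_{\alpha_k h}\bigl(x_k - \alpha_k\, G(x_k,\xi_k)\bigr).
\]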
Talk 3: Stochastic Gradient Methods (III)
Speaker: Dr. Hongchao Zhang (张洪超)
Time: Thursday, May 23, 2019, 15:00–17:00
Venue: 威廉希尔 academic lecture hall
Abstract: In this talk, we introduce the sample average approximation (SAA) method for solving the convex composite optimization problem whose objective function is the sum of a Lipschitz continuously differentiable, but possibly nonconvex, function and a simple convex, but possibly nonsmooth, function. The SAA method generates a random sample and approximates the original function by its sample average. We study the SAA method for this problem in both the convex and nonconvex cases, and discuss the convergence of these methods as well as their convergence complexities.
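A minimal sketch of the SAA idea, under the assumption that the smooth part is again an expectation f(x) = \mathbb{E}_{\xi}[F(x,\xi)]: draw an i.i.d. sample \xi_1,\dots,\xi_N and replace f by its sample average before applying a deterministic method. The sample size N and the notation \hat f_N are illustrative choices, not taken from the abstract:
\[
\hat f_N(x) := \frac{1}{N}\sum_{i=1}^{N} F(x,\xi_i), \qquad
\min_{x}\; \hat f_N(x) + h(x) \;\approx\; \min_{x}\; \mathbb{E}_{\xi}\bigl[F(x,\xi)\bigr] + h(x).
\]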
Talk 4: A Nonmonotone Smoothing Newton Algorithm for Weighted Complementarity Problems
Speaker: Dr. Hongchao Zhang (张洪超)
Time: Monday, May 27, 2019, 15:00–17:00
Venue: 威廉希尔 academic lecture hall
Abstract: The weighted complementarity problem (WCP) significantly extends the general complementarity problem and can be used to model a larger class of problems from science and engineering. By introducing a one-parametric class of smoothing functions that incorporates the weight vector, we propose a smoothing Newton algorithm with a nonmonotone line search to solve the WCP. We prove that any accumulation point of the iteration sequence is a solution of the WCP. Moreover, when the solution set of the WCP is nonempty, we prove that our algorithm has local superlinear/quadratic convergence under assumptions weaker than the Jacobian nonsingularity assumption. Under these weak assumptions, we further show that the iteration sequence is bounded and converges to a solution of the WCP superlinearly or quadratically. Promising numerical results are also reported.
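For readers unfamiliar with the WCP, one common formulation (a sketch only, not necessarily the exact setting of the talk) replaces the zero right-hand side of the complementarity condition by a given nonnegative weight vector w:
\[
\text{find } (x, s, y) \text{ such that } \quad F(x,s,y) = 0, \quad x \ge 0, \quad s \ge 0, \quad x \circ s = w,
\]
where \circ denotes the componentwise (Hadamard) product and F collects the remaining equations of the model; taking w = 0 recovers the ordinary complementarity problem. A smoothing Newton method of the kind mentioned in the abstract typically applies Newton steps to a smoothed reformulation of the condition x \circ s = w while driving the smoothing parameter to zero.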
All faculty members and students are welcome to attend.
May 20, 2019