School of Mathematics and Statistics, Guizhou University

Liu Zexian

Faculty Profile


Liu Zexian, born in 1984 in Zhaoping, Guangxi, holds a Ph.D. and is an Associate Professor and Master's supervisor.

Research interests: optimization methods and applications

Email: liuzexian2008@163.com

Professional Experience


2010.7   Joined the Department of Mathematics, Hezhou University

2018.9   Received a Ph.D. in Applied Mathematics from Xidian University, supervised by Prof. Hongwei Liu

2020.12  Completed postdoctoral research at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences; postdoctoral advisor: Prof. Yu-Hong Dai

2021.1   Joined the School of Mathematics and Statistics, Guizhou University

Honors and Awards


1. First Prize, 4th Shaanxi Province Graduate Innovation Achievement Award (2018)

2. First Prize for Academic Innovation (2016-2017), Xidian University

3. Excellent Doctoral Dissertation Award, Xidian University, 2021

4. Named an "Outstanding Communist Party Member" of the School of Mathematics and Statistics, Guizhou University, 2021

5. Rated "Excellent" in the annual performance assessment of the School of Mathematics and Statistics, Guizhou University, 2022

Main Contributions


He teaches several undergraduate courses, including Mathematical Analysis, Numerical Analysis, Real Analysis, and Mathematical Experiments, as well as the graduate course Matrix Analysis. He has completed one autonomous-region-level teaching reform project as principal investigator.

His research focuses on gradient methods, conjugate gradient methods, quasi-Newton methods, and optimization methods in machine learning and artificial intelligence. His results have been published in high-level SCI journals such as Comput. Optim. Appl., IMA J. Numer. Anal., J. Global Optim., and J. Optim. Theory Appl. He has led two National Natural Science Foundation of China (NSFC) projects, participated in one NSFC major project, and led two provincial natural science foundation projects and one China Postdoctoral Science Foundation general project.
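As background for the research directions listed above, the following is a minimal sketch of a textbook nonlinear conjugate gradient iteration with the Fletcher-Reeves update. It is a generic illustration of this class of methods, not any of the published algorithms; the gradient callback, the fixed stepsize standing in for a Wolfe line search, and the tolerances are assumptions.

```python
import numpy as np

def nonlinear_cg_fr(grad, x0, alpha=1e-3, max_iter=500, tol=1e-8):
    """Textbook nonlinear conjugate gradient with the Fletcher-Reeves beta:
        d_k = -g_k + beta_k * d_{k-1},   beta_k = ||g_k||^2 / ||g_{k-1}||^2.
    A fixed stepsize `alpha` replaces the Wolfe line search used in practice."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x = x + alpha * d                    # step along the current CG direction
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves formula
        d = -g_new + beta * d                # new search direction
        g = g_new
    return x

# Usage on a small strictly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 5.0, 25.0])
b = np.ones(3)
x_approx = nonlinear_cg_fr(lambda x: A @ x - b, np.zeros(3))
```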

 

Academic and Professional Service

1. Council member of the Algorithms, Software and Applications Branch of the Operations Research Society of China

2. Reviewer for Mathematical Reviews (American Mathematical Society)

3. Referee for international SCI journals including J. Global Optim., Optim. Methods Softw., Numer. Algorithms, Appl. Numer. Math., J. Comput. Appl. Math., and Optim. Lett.

4. Referee for domestic core journals including Scientia Sinica Mathematica, Mathematica Numerica Sinica, and Operations Research Transactions

 

Research Projects (as Principal Investigator or Participant)

1. New First-Order Algorithms, Their Convergence Rates and Applications (12261019), National Natural Science Foundation of China, 2023.1-2026.12, Principal Investigator, ongoing

2. Artificial Intelligence Methods for Mixed-Integer Programming (11991021), NSFC Major Project, 2020.1-2024.12, key participant, ongoing

3. Approximately Optimal Gradient Methods and Limited-Memory Conjugate Gradient Methods for Large-Scale Optimization (11901561), NSFC Young Scientists Fund, 2020.1-2021.12, Principal Investigator, completed

4. Gradient-Type Algorithms for Nonconvex Optimization Problems (Qiankehe Foundation-ZK[2022] General 084), Guizhou Provincial Natural Science Foundation, 2022.3-2024.3, Principal Investigator, ongoing

5. Several Algorithms for Unconstrained Optimization Problems (2019M660833), China Postdoctoral Science Foundation General Program, 2019.11-2021.1, Principal Investigator, completed

6. Gradient and Conjugate Gradient Methods Based on the BB Algorithm and Their Applications (2018GXNSFBA281180), Guangxi Natural Science Foundation, 2018.11-2021.12, Principal Investigator, completed

 

Teaching Reform Projects (as Principal Investigator):

1. Research and Practice on Experimental Teaching for Mathematics and Applied Mathematics Majors at Local Undergraduate Institutions (2014JGB234), 2014 Guangxi Higher Education Teaching Reform Project, 2014.6-2016.4, Principal Investigator

 

Software:

1. Subspace conjugate gradient method software SMCG_BB (Hongwei Liu, Zexian Liu. An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization). Code and numerical results: http://web.xidian.edu.cn/xdliuhongwei/files/20181023_141925.rar

2. Gradient method software GM_AOS(cone) (Zexian Liu, Hongwei Liu. Gradient method with approximately optimal stepsize based on conic model). Code and numerical results: http://web.xidian.edu.cn/xdliuhongwei/files/20181001_213519.zip

3. Limited-memory conjugate gradient method software CGOPT(2.0) (Zexian Liu, Hongwei Liu, Yu-Hong Dai. An improved Dai–Kou conjugate gradient algorithm for unconstrained optimization). Code and numerical results: http://coa.amss.ac.cn/wordpress/?page_id=21
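The packages above build on gradient and conjugate gradient iterations that use Barzilai-Borwein (BB) type stepsizes. For background only, here is a minimal sketch of the classical BB1 stepsize inside a plain gradient iteration; it is a generic textbook formulation, not the code distributed at the links above, and the initial stepsize, safeguard, test problem, and tolerances are assumptions.

```python
import numpy as np

def bb_gradient_method(grad, x0, max_iter=500, tol=1e-8):
    """Gradient descent with the BB1 stepsize
        alpha_k = s^T s / s^T y,  s = x_k - x_{k-1},  y = g_k - g_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                                   # initial stepsize (assumed)
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g                      # gradient step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s @ y
        alpha = (s @ s) / sty if sty > 0 else 1e-3  # BB1 stepsize, safeguarded
        x, g = x_new, g_new
    return x

# Usage on a simple convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = bb_gradient_method(lambda x: A @ x - b, np.zeros(3))
```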

 

Selected Publications

25. Song Taiyong, Liu Zexian*. An efficient inertial subspace minimization CG algorithm with convergence rate analysis for constrained nonlinear monotone equations. Journal of Computational and Applied Mathematics, 446, 115873, 2024. (SCI)

24. Liu Hongwei, Wang Ting, Liu Zexian. A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems. Journal of Global Optimization, 2024, https://doi.org/10.1007/s10898-024-01366-4. (SCI)

23. Ni Yan, Liu Zexian*. A new Dai-Liao conjugate gradient method based on approximately optimal stepsize for unconstrained optimization. Numerical Functional Analysis and Optimization, DOI: 10.1080/01630563.2024.2333255, 2024. (SCI)

22. Liu Zexian, Ni Yan, Liu Hongwei, Sun Wumei. A new subspace minimization conjugate gradient method for unconstrained minimization. Journal of Optimization Theory and Applications, 200, 820–851, 2024. (SCI)

21. Liu Hongwei, Sun Wumei, Liu Zexian. A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization. Numerical Algorithms, 94, 1919–1948, 2023. (SCI)

20. Liu Zexian, Liu Hongwei, Wang Ting. New gradient methods with adaptive stepsizes by approximate models. Optimization, https://doi.org/10.1080/02331934.2023.2234925, 2023. (SCI)

19. Liu Hongwei, Wang Ting, Liu Zexian. Convergence rate of inertial forward–backward algorithms based on the local error bound condition. IMA Journal of Numerical Analysis, https://doi.org/10.1093/imanum/drad031, 2023. (SCI)

18. Liu Hongwei, Wang Ting, Liu Zexian. Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems. Computational Optimization and Applications, 83, 651–691, 2022. (SCI)

17. Liu Zexian, Chu Wangli, Liu Hongwei. An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization. RAIRO Operations Research, 56, 2403–2424, 2022. (SCI)

16. Sun Wumei, Liu Hongwei, Liu Zexian. Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems. Numerical Algorithms, 91, 1677–1719, 2022. (SCI)

15. Sun Wumei, Liu Hongwei, Liu Zexian. A class of accelerated subspace minimization conjugate gradient methods. Journal of Optimization Theory and Applications, 190, 811–840, 2021. (SCI)

14. Zhao Ting, Liu Hongwei, Liu Zexian. New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numerical Algorithms, 87(4), 1501–1534, 2021. (SCI)

13. Liu Zexian, Liu Hongwei, Dai Yu-Hong*. An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization. Computational Optimization and Applications, 2020, 75(1): 145–167. (SCI)

12. Liu Zexian, Liu Hongwei. An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization. Journal of Optimization Theory and Applications, 2019, 181(2): 608–633. (SCI)

11. Liu Hongwei, Liu Zexian*. An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. Journal of Optimization Theory and Applications, 2019, 180(3): 879–906. (SCI)

10. Liu Zexian, Liu Hongwei. Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. Journal of Computational and Applied Mathematics, 2018, 328: 400–413. (SCI)

9. Liu Zexian, Liu Hongwei. An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numerical Algorithms, 2018, 78(1): 21–39. (SCI)

8. Li Ming, Liu Hongwei, Liu Zexian*. A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numerical Algorithms, 2018, 79(1): 195–219. (SCI)

7. Liu Zexian, Liu Hongwei, Dong Xiaoliang. An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization, 2018, 67(3): 427–440. (SCI)

6. Liu Zexian*, Liu Hongwei, Wang Xiping. Accelerated augmented Lagrangian method for total variation minimization. Computational and Applied Mathematics, 2019, 38(2). https://doi.org/10.1007/s40314-019-0787-7. (SCI)

5. Liu Hongwei, Liu Zexian*, Dong Xiaoliang. A new adaptive Barzilai and Borwein method for unconstrained optimization. Optimization Letters, 2018, 12(4): 845–873. (SCI)

4. Li Yufei, Liu Zexian*, Liu Hongwei. A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Computational and Applied Mathematics, 2019, 38(1). https://link.springer.com/article/10.1007/s40314-019-0779-7. (SCI)

3. Wang Ting, Liu Zexian*, Liu Hongwei. A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. International Journal of Computer Mathematics, 2019, 96(10): 1924–1942. (SCI)

2. Dong Xiaoliang, Liu Zexian, Liu Hongwei, Li Xiangli. An efficient adaptive three-term extension of the Hestenes–Stiefel conjugate gradient method. Optimization Methods and Software, 2018, 34(2): 1–14.

1. Zhang Keke, Liu Hongwei, Liu Zexian*. A new adaptive subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Journal of Computational Mathematics, 2021, 39(2): 159–177. (SCI)

 

Graduate Student Supervision

Currently supervising seven Master's students (three admitted in 2021, two in 2022, and two in 2023). There are 3-4 admission slots each year; undergraduates interested in optimization methods and applications are welcome to apply. Students who merely want to "coast to a degree" will not be accepted.


 