Acta Electronica Sinica ›› 2021, Vol. 49 ›› Issue (4): 625-630. DOI: 10.12263/DZXB.20200310


A Maximally Split Generalized ADMM for Regularized Extreme Learning Machines

HOU Xiu-cong, LAI Xiao-ping, CAO Jiu-wen

  1. Artificial Intelligence Institute, Hangzhou Dianzi University, Hangzhou, Zhejiang 310018, China
  • Received: 2020-03-30  Revised: 2020-07-10  Online: 2021-04-25  Published: 2021-04-25
  • Corresponding author: LAI Xiao-ping, Email: laixp@hdu.edu.cn
  • About the author: HOU Xiu-cong was born in Jinan, Shandong, in 1995. She is a graduate student at the School of Automation, Hangzhou Dianzi University; her research interest is machine learning. E-mail: cindy_hxc@163.com
  • Funding:
    National Natural Science Foundation of China (No. U1909209)



Abstract: By virtue of the alternating direction method of multipliers (ADMM), the multivariate regularized least-squares model-fitting problem is decomposed into multiple univariate subproblems that can be solved in parallel. A tunable step-size factor is introduced to accelerate the algorithm, yielding a highly parallel maximally split generalized ADMM (MS-GADMM), which is applied to the regularized extreme learning machine (RELM). The convergence condition of MS-GADMM is established, and the computational complexity of the MS-GADMM-based RELM is analyzed. Through experiments on real-world benchmark datasets, the convergence rate of MS-GADMM is compared with that of a maximally split relaxed ADMM recently reported in the literature. In GPU implementation experiments, the MS-GADMM-based RELM achieves very large GPU speedup ratios, demonstrating the high parallelism of the proposed algorithm.
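The decomposition idea described in the abstract can be illustrated with a generic variable-splitting ADMM for the RELM output-weight (ridge regression) problem. This is only a minimal sketch under assumptions: the function name `relm_admm` and its parameters are hypothetical, the over-relaxation factor `alpha` stands in for a tunable step-size factor, and the sketch is a standard relaxed ADMM, not the paper's maximally split MS-GADMM (whose z- and dual updates, however, are likewise elementwise and hence parallelizable).

```python
import numpy as np

def relm_admm(H, T, lam=1e-6, rho=10.0, alpha=1.5, iters=200):
    """Sketch of relaxed ADMM for the RELM output-weight problem
        min (1/2) * ||H @ beta - T||^2 + (lam/2) * ||beta||^2
    via the splitting beta = z. The z- and u-updates are elementwise
    (scalar) and thus trivially parallel; alpha in (0, 2) is an
    over-relaxation (step-size) factor. Generic sketch, not MS-GADMM."""
    n = H.shape[1]
    G = H.T @ H + rho * np.eye(n)   # fixed across iterations
    HtT = H.T @ T
    z = np.zeros(n)
    u = np.zeros(n)                 # scaled dual variable
    for _ in range(iters):
        # Coupled least-squares update of the primal variable.
        beta = np.linalg.solve(G, HtT + rho * (z - u))
        # Over-relaxation: the "tunable step size" of the iteration.
        beta_hat = alpha * beta + (1 - alpha) * z
        # Elementwise (scalar) shrinkage update — parallel across entries.
        z = rho * (beta_hat + u) / (lam + rho)
        # Elementwise dual ascent.
        u = u + beta_hat - z
    return z

# Usage: with noiseless data and negligible lam, the iterate recovers
# the true output weights.
rng = np.random.default_rng(0)
H = rng.standard_normal((100, 8))   # hidden-layer output matrix
beta_true = rng.standard_normal(8)
T = H @ beta_true                   # target vector
beta = relm_admm(H, T)
print(np.allclose(beta, beta_true, atol=1e-4))
```

The matrix `G` is factored once outside the loop; in the maximally split setting described by the paper, even this coupled solve is replaced by scalar subproblems, which is what makes the method well suited to GPU parallelization.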

Key words: machine learning, extreme learning machine, big data, parallel learning, alternating direction method of multipliers

CLC Number: