Date: 2025-11-14
This paper, "A Fractional Gradient Descent Algorithm Robust to the Initial Weights of Multilayer Perceptron," was originally published in Neural Networks and was written by Professor Yi-Fei Pu of Sichuan University and collaborators. It is one of a series of academic outputs from Sichuan University's strategically pre-deployed discipline of Smart Rule of Law (智慧法治). Further results from this series will be shared in future posts, and readers are welcome to follow along.
For a multilayer perceptron (MLP), the initial weights significantly influence its performance. Based on an enhanced fractional derivative extended from convex optimization, this paper proposes a robust fractional gradient descent (RFGD) algorithm that is robust to the initial weights of the MLP. We analyze the effectiveness and the convergence of the RFGD algorithm. Its computational complexity is generally larger than that of the gradient descent (GD) algorithm but smaller than those of the Adam, Padam, AdaBelief, and AdaDiff algorithms. Numerical experiments show that the RFGD algorithm is strongly robust to the order of fractional calculus, which is the only parameter added relative to the GD algorithm. More importantly, the experimental results show that, compared to the GD, Adam, Padam, AdaBelief, and AdaDiff algorithms, the RFGD algorithm is the most robust to the initial weights of the MLP. The experiments also verify the correctness of the theoretical analysis.
Xue-Tao Xie, Yi-Fei Pu*, Jian Wang, "A Fractional Gradient Descent Algorithm Robust to the Initial Weights of Multilayer Perceptron," Neural Networks, vol. 158, pp. 154-170, 2023.
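As a rough illustration of the general idea behind fractional gradient descent, the sketch below applies a generic Caputo-type fractional gradient step to a toy quadratic loss. It is not the RFGD update defined in the paper: the first-order Caputo approximation, the reference point `c`, the learning rate, and the toy loss are all illustrative assumptions.

```python
import numpy as np
from math import gamma

def fractional_gd(grad_fn, w0, alpha=0.9, lr=0.1, c=None, steps=100, eps=1e-8):
    """Minimize a loss via a generic Caputo-style fractional gradient step.

    alpha : fractional order in (0, 1]; alpha = 1 recovers ordinary GD.
    c     : lower terminal (reference point) of the fractional derivative
            (an illustrative choice here, not the paper's formulation).
    """
    w = np.asarray(w0, dtype=float)
    c = np.zeros_like(w) if c is None else np.asarray(c, dtype=float)
    for _ in range(steps):
        g = grad_fn(w)
        # First-order Caputo approximation:
        # D^alpha f(w) ~ f'(w) * |w - c|^(1 - alpha) / Gamma(2 - alpha)
        frac_g = g * (np.abs(w - c) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
        w = w - lr * frac_g
    return w

if __name__ == "__main__":
    # Toy quadratic loss f(w) = 0.5 * ||w - 3||^2, whose gradient is w - 3.
    grad = lambda w: w - 3.0
    print(fractional_gd(grad, w0=[10.0, -5.0], alpha=0.9, lr=0.2, steps=200))
```

Setting alpha = 1 in this sketch reduces the update to ordinary gradient descent, which mirrors the abstract's point that the fractional order is the only parameter added compared to the GD algorithm.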