Title: Accelerating Deep Neural Network Training with Decentralized Optimization
Time: Monday, April 22, 2024, 10:00–11:00
Venue: Room 105, School of Mathematics and Statistics, Renmin Street Campus
Speaker: Kun Yuan (袁坤)
Host: School of Mathematics and Statistics
Abstract:
Decentralized optimization algorithms substantially reduce communication overhead in distributed deep learning, since each node averages only with its neighbors. The network topology connecting the nodes determines both communication efficiency and the effectiveness of local averaging. The key to making decentralized algorithms efficient is therefore to find sparse topologies that achieve effective local averaging with little communication. However, existing common topologies either incur expensive per-iteration communication or suffer slow consensus rates. In this talk, we will propose several sparse and effective topologies that give decentralized algorithms a state-of-the-art balance between communication efficiency and convergence. We will also discuss BlueFog, an open-source Python library for straightforward, high-performance implementations of diverse topologies and decentralized algorithms.
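To make the idea of "each node averages locally with neighbors" concrete, here is a minimal, self-contained sketch of decentralized gradient descent (DSGD) on a ring topology. This is an illustrative toy in plain NumPy, not BlueFog's API; the quadratic objectives, the function names, and the fixed stepsize are all assumptions made for the example.

```python
import numpy as np

def ring_weight_matrix(n):
    """Doubly stochastic mixing matrix for a ring topology: each node
    averages equally with itself and its two neighbors."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 1 / 3
        W[i, (i - 1) % n] = 1 / 3
        W[i, (i + 1) % n] = 1 / 3
    return W

def dsgd(n_nodes=8, dim=4, steps=200, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Each node i holds a local quadratic f_i(x) = 0.5 * ||x - b_i||^2,
    # so the global optimum is the mean of the b_i.
    b = rng.normal(size=(n_nodes, dim))
    x = rng.normal(size=(n_nodes, dim))  # one local model per node
    W = ring_weight_matrix(n_nodes)
    for _ in range(steps):
        grads = x - b                # local gradient at each node
        x = W @ (x - lr * grads)     # local step, then average with neighbors
    return x, b.mean(axis=0)

x, x_star = dsgd()
# With a fixed stepsize, plain DSGD keeps an O(lr) residual disagreement
# across nodes; the node average still converges to the global optimum.
print(np.max(np.abs(x - x_star)))
```

The sparsity of W is what the talk is about: each row has only three nonzeros, so each node communicates with just two neighbors per iteration, while the spectral gap of W governs how fast local averages reach consensus.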
About the Speaker:
Dr. Kun Yuan is a researcher at the Center for Machine Learning Research (CMLR) at Peking University. He received his Ph.D. from UCLA in 2019 and was a staff algorithm engineer at Alibaba Group (US) from 2019 to 2022. His research focuses on developing fast, scalable, and reliable distributed algorithms, with applications in large-scale optimization, deep neural network training, federated learning, and the Internet of Things. He received the 2017 IEEE Signal Processing Society Young Author Best Paper Award and the 2017 ICCM Distinguished Paper Award.