Lecture Series: Frontiers of Computational Mathematics and Its Interdisciplinary Fields
Title: Neural Operators: Abstract Framework and Multigrid Structure
Speaker: Juncai He (何俊材), Assistant Professor, Tsinghua University
Host: Dr. Hao Dong (董灝)
Time: 9:00 AM–12:00 PM, Wednesday, December 3, 2025
Venue: Tencent Meeting, ID: 633-188-816
Speaker Bio: Juncai He received his B.S. from Sichuan University in 2014 and his Ph.D. from Peking University in 2019. From 2019 to 2020, he was a postdoctoral researcher at Pennsylvania State University; from 2020 to 2022, he served as an R. H. Bing Instructor at the University of Texas at Austin; and from 2022 to 2025, he was a Research Scientist at King Abdullah University of Science and Technology. In February 2025, he joined the Yau Mathematical Sciences Center at Tsinghua University as an assistant professor. His research focuses on scientific computing and machine learning, including the theoretical analysis, algorithm design, and practical applications of deep neural networks.
Abstract: In this talk, we will present recent results on applying multigrid structures to neural operators for problems in numerical PDEs. First, we will review some basic background on operator learning, including the problem setup, a unified abstract framework, and a general universal approximation result. Motivated by the general definition of neural operators, we propose MgNO, which uses multigrid structures to parameterize the linear operators between neurons, offering a new and concise architecture for operator learning. Regarding the implementation of MgNO, we will present MgNet as a unified framework connecting convolutional neural networks and multigrid methods. This approach provides both mathematical rigor and practical expressivity, together with many interesting numerical properties and observations.
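To give a flavor of the idea sketched in the abstract, the toy example below illustrates, in NumPy, how a multigrid-style structure can parameterize a linear operator: a fine-grid convolution combined with a coarse-grid correction built from restriction and prolongation. This is only a schematic 1-D, two-level sketch for illustration; the function names (`restrict`, `prolong`, `mg_operator`) and the specific stencils are illustrative choices, not the actual MgNO or MgNet implementation.

```python
import numpy as np

def restrict(u):
    # Factor-2 restriction: coarsen a 1-D grid function of even length
    # by averaging adjacent pairs of values.
    return 0.5 * (u[0::2] + u[1::2])

def prolong(uc):
    # Simple prolongation: map each coarse value back to two fine-grid points.
    return np.repeat(uc, 2)

def mg_operator(u, w_fine, w_coarse):
    # Two-level multigrid-style linear operator:
    # a fine-grid convolution plus a restricted/prolongated coarse-grid term.
    # Both branches are linear in u, so the whole map is a linear operator.
    fine = np.convolve(u, w_fine, mode="same")
    coarse = prolong(np.convolve(restrict(u), w_coarse, mode="same"))
    return fine + coarse

# Example: apply the operator to a small grid function.
u = np.arange(8.0)
w_fine = np.array([0.25, 0.5, 0.25])   # illustrative fine-level stencil
w_coarse = np.array([1.0])             # illustrative coarse-level stencil
out = mg_operator(u, w_fine, w_coarse)
```

In an actual neural operator along these lines, the stencils `w_fine` and `w_coarse` would be learnable parameters and the construction would be repeated over multiple grid levels and composed with nonlinear activations.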
Organizer: School of Mathematics and Statistics