29 October 2025 to 2 November 2025
Xinxiang, Henan Province, China
Asia/Shanghai timezone

EveNet: Towards a Generalist Event Transformer for Unified Understanding and Generation of Collider Data

1 Nov 2025, 15:40
10m
Pingyuan Hall AB

LHC Computing Parallel 4

Speaker

Dr Yulei Zhang (University of Washington)

Description

With the growing size of machine learning (ML) models and the availability of vast datasets, foundation models have transformed how ML is applied to real-world problems. Multimodal language models such as ChatGPT and Llama extend a shared pre-trained backbone to a wide range of specialized tasks. Similarly, in high-energy physics (HEP), recurring analysis tasks face challenges that demand scalable, data-driven solutions. In this talk, we present a foundation model for high-energy physics. The model is pre-trained on extensive simulated datasets to address tasks common across analyses, offering a unified starting point for specialized applications. We demonstrate the benefits of such a pre-trained model in improving search sensitivity, anomaly detection, event reconstruction, feature generation, and beyond. By harnessing the power of pre-trained models, we can push the boundaries of discovery with greater efficiency and insight.
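To illustrate the pre-train-then-specialize pattern the abstract describes, the sketch below shows a generic transformer encoder over per-particle features, pre-trained with a masked-reconstruction objective on simulated events and then reused for a downstream classification head. This is a minimal, hypothetical example in PyTorch: the class and variable names (EventEncoder, recon_head, clf_head), the feature dimensions, and the masked-reconstruction objective are assumptions for illustration only and are not taken from EveNet, whose architecture and training objectives are not specified in the abstract.

```python
# Hypothetical sketch of pre-training a transformer on simulated collider events
# and fine-tuning it for a downstream task. Not the EveNet implementation.
import torch
import torch.nn as nn

class EventEncoder(nn.Module):
    """Transformer encoder over per-particle features of a collider event."""
    def __init__(self, n_features=7, d_model=128, n_heads=8, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, x, pad_mask=None):
        # x: (batch, n_particles, n_features) -> (batch, n_particles, d_model)
        return self.encoder(self.embed(x), src_key_padding_mask=pad_mask)

# --- Pre-training on simulated events (masked-feature reconstruction, assumed) ---
encoder = EventEncoder()
recon_head = nn.Linear(128, 7)               # predicts original particle features
opt = torch.optim.AdamW([*encoder.parameters(), *recon_head.parameters()], lr=1e-4)

sim_events = torch.randn(32, 50, 7)          # stand-in for a simulated event batch
mask = torch.rand(32, 50) < 0.15             # hide 15% of particles
corrupted = sim_events.masked_fill(mask.unsqueeze(-1), 0.0)

pred = recon_head(encoder(corrupted))        # (batch, 50, 7)
loss = nn.functional.mse_loss(pred[mask], sim_events[mask])
loss.backward()
opt.step()

# --- Fine-tuning: reuse the pre-trained encoder for a downstream task, ---
# --- e.g. signal-vs-background classification in a specific analysis.  ---
clf_head = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 1))
labels = torch.randint(0, 2, (32,)).float()
pooled = encoder(sim_events).mean(dim=1)     # pooled event representation
clf_loss = nn.functional.binary_cross_entropy_with_logits(
    clf_head(pooled).squeeze(-1), labels
)
```

The point of the sketch is the workflow, not the specifics: a single encoder pre-trained once on large simulated samples can be attached to lightweight task heads (classification, reconstruction, anomaly scoring), which is the reuse pattern the abstract argues improves efficiency across analyses.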

Primary authors

Mr Bai-hong Zhou (SJTU/TDLI), Prof. Li Shu (SJTU/TDLI), Dr Yulei Zhang (University of Washington), Dr Qibin Liu (SLAC), Prof. Shih-Chieh Hsu (University of Washington), Mr Ting-Hsiang Hsu, Dr Yuan-Tang Chou (University of Washington), Dr Yue Xu (University of Washington)

Presentation materials