Peiyuan Zhang (张培源)
<aside> Github
</aside>
<aside> Google Scholar
</aside>
<aside> Twitter
</aside>
<aside> 📧 Email
</aside>
I am a first-year PhD student at UCSD CSE, advised by Prof. Hao Zhang. Previously, I was a Research Assistant at NTU, supervised by Prof. Ziwei Liu. I earned a first-class honors degree in Computer Science from SUTD, where my undergraduate study was fully funded by a Singapore Ministry of Education scholarship. During that time, I had the privilege of collaborating with Prof. Wei Lu.
My current research interests lie at the intersection of systems and machine learning.
I believe great ML scientists are, fundamentally, exceptional software engineers.
(*: equal contribution)
Long Context Transfer from Language to Vision
Peiyuan Zhang*, Kaichen Zhang*, Bo Li*, Guangtao Zeng, Jingkang Yang, Yuanhan Zhang, Ziyue Wang, Haoran Tan, Chunyuan Li, Ziwei Liu
arXiv preprint. [paper]
One Network, Many Masks: Towards More Parameter-Efficient Transfer Learning
Guangtao Zeng*, Peiyuan Zhang*, Wei Lu
ACL 2023 Long Paper. [paper]
Better Few-Shot Relation Extraction with Label Prompt Dropout
Peiyuan Zhang, Wei Lu
EMNLP 2022 Long Paper. [paper]
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
One-for-all LMMs evaluation package.
LM long context training made simple.
UC San Diego, 09/2024–Present
PhD Student, with Prof. Hao Zhang
Nanyang Technological University, 10/2023–08/2024
Research Assistant, with Prof. Ziwei Liu
Singapore University of Technology and Design, 10/2022–10/2023
Research Assistant, with Prof. Wei Lu
Agency for Science, Technology and Research (A*STAR), Singapore, 05/2020–09/2020
Research Intern
Acknowledgments: The template of this personal website is shamelessly borrowed from here.