Chi Zhang

Incoming Ph.D. student in Computer Science, UT Austin

I work on generative modeling, with a growing interest in protein engineering and biomolecular modeling.


I am finishing my M.S. in Computer Science at UT Austin in May 2026. At UT, I have worked with Prof. Qiang Liu on generative modeling, flow-based methods, and multimodal learning. I will continue at UT Austin as a Ph.D. student in Fall 2026, and I am also starting to explore protein engineering and biomolecular modeling with Prof. Adam Klivans and Dr. Danny Diaz.

I am primarily interested in generative modeling and in using machine learning to study and engineer proteins. My recent work has also touched on multimodal robustness, benchmark design, and efficient generation. Below is a selected list of research directions and papers; red denotes first-author or co-first-author contributions.

Before UT Austin, I received my B.S. in Computer Science from Nanjing University, where I worked with Prof. Yu-Feng Li. I have also worked on VLM reliability with Prof. Raymond Mooney.

news

Mar 2026 Folding scFv–Antigen Complexes at Scale was accepted to the ICLR 2026 GEM workshop.
Mar 2026 My first-author paper Gumbel Distillation for Parallel Text Generation was accepted to ICLR 2026.
Jan 2026 My first-author paper Do Images Speak Louder than Words? was accepted to the EACL 2026 main conference.
Aug 2024 Started my M.S. study in Computer Science at UT Austin.

selected publications

  1. Gumbel Distillation for Parallel Text Generation
    Chi Zhang, X. Hu, B. Liu, and Q. Liu
    ICLR, 2026
  2. Do Images Speak Louder than Words? Investigating the Effect of Textual Misinformation in Vision-Language Models
    Chi Zhang, W. Ding, J. Liu, M. Wu, Q. Wu, and R. Mooney
    EACL Main, 2026
  3. Folding scFv–Antigen Complexes at Scale
    R. Shah, J. Ouyang-Zhang, Z. Cohen, M. R. Briglia, Chi Zhang, A. Klivans, and D. J. Diaz
    ICLR 2026 Workshop on Generative and Experimental Perspectives for Biomolecular Design, 2026
  4. Divide, Optimize, Merge: Fine-Grained LLM Agent Optimization at Scale
    J. Liu, Y. Zeng, S. Zhang, Chi Zhang, M. Højmark-Bertelsen, M. N. Gadeberg, H. Wang, and Q. Wu
    EMNLP Findings, 2025
  5. Efficient and Long-tailed Generalization for Pre-trained Vision-Language Model
    J. X. Shi*, Chi Zhang*, T. Wei, and Y. F. Li
    ACM KDD, 2024