Chengchang Liu (刘程畅)

Chengchang Liu


Ph.D. candidate
Department of Computer Science and Engineering
The Chinese University of Hong Kong


Email:
7liuchengchang@gmail.com (recommended)
ccliu22@cse.cuhk.edu.hk

About me

I am a Ph.D. candidate in the Department of Computer Science and Engineering at The Chinese University of Hong Kong (CUHK), under the supervision of Prof. John C.S. Lui.

Previously, I visited CUHK (SZ) in the summer of 2024, hosted by Prof. Zhi-Quan (Tom) Luo, and HKUST in the summer of 2021, hosted by Prof. Tong Zhang. I also work closely with Prof. Luo Luo.

I am always open to collaborations and visiting opportunities in both academia and industry; please feel free to contact me.

Research

My research focuses on designing efficient methods for large-scale optimization problems. I list my research interests and highlights below; a small illustrative sketch of Newton versus quasi-Newton updates follows the list.

  • The theory of second-order optimization.

    • Optimal second-order methods are not “optimal”. [arXiv24]

    • The gap between the local convergence rates of quasi-Newton and Newton methods can be bridged by block updates. [NeurIPS23, arXiv23]

    • Quasi-Newton methods for minimax optimization enjoy convergence rates similar to those for minimization. [NeurIPS22, KDD22]

  • Distributed/Federated optimization.

    • Communicating the Hessian can be done in parallel with updating the model. [KDD23]

  • The intersection of optimization and quantum computing.
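
To give a concrete flavor of the second-order theme above, here is a minimal toy sketch, not taken from any of the papers listed below, contrasting a Newton step with a classical Broyden (rank-1) quasi-Newton update for solving a nonlinear equation F(x) = 0; the test problem and all names in it are purely illustrative.

    import numpy as np

    def newton_step(F, J, x):
        # Newton: solve J(x) dx = -F(x) using the exact Jacobian.
        return x + np.linalg.solve(J(x), -F(x))

    def broyden_step(F, x, B):
        # Quasi-Newton (Broyden's "good" method): use an approximate Jacobian B
        # and refresh it with a cheap rank-1 correction instead of recomputing it.
        dx = np.linalg.solve(B, -F(x))
        x_new = x + dx
        y = F(x_new) - F(x)
        B_new = B + np.outer(y - B @ dx, dx) / (dx @ dx)
        return x_new, B_new

    # Toy problem: F(x) = x**3 - c, whose root is the componentwise cube root of c.
    c = np.array([8.0, 27.0])
    F = lambda x: x**3 - c
    J = lambda x: np.diag(3.0 * x**2)

    x = np.array([1.8, 2.8])   # start near the root [2, 3]
    B = J(x)                   # warm-start the Jacobian approximation
    for _ in range(20):
        x, B = broyden_step(F, x, B)
    print(x)                   # close to [2, 3]

Roughly speaking, the block updates studied in [NeurIPS23, arXiv23] replace the rank-1 correction above with higher-rank (block) corrections, which is how the gap to Newton's local convergence rate can be narrowed.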

Publications & Preprints (Google Scholar)

  1. Quantum Algorithms for Non-smooth Non-convex Optimization.
    Chengchang Liu, Chaowen Guan, Jianhao He, John C.S. Lui.
    NeurIPS, 2024

  2. Quantum Algorithm for Online Exp-concave Optimization.
    Jianhao He, Chengchang Liu, Xutong Liu, Lvzhou Li, John C.S. Lui.
    ICML, 2024

  3. Communication Efficient Distributed Newton Method over Unreliable Networks.
    Ming Wen, Chengchang Liu, Yuedong Xu.
    AAAI, 2024

  4. Communication Efficient Distributed Newton Method with Fast Convergence Rates.
    Chengchang Liu, Lesi Chen, Luo Luo, John C.S. Lui.
    KDD, 2023

  5. Block Broyden's Methods for Solving Nonlinear Equations.
    Chengchang Liu, Cheng Chen, Luo Luo, John C.S. Lui.
    NeurIPS, 2023

  6. Partial-Quasi-Newton Methods: Efficient Algorithms for Minimax Optimization Problems with Unbalanced Dimensionality.
    Chengchang Liu, Shuxian Bi, Luo Luo, John C.S. Lui.
    KDD, 2022 (Best Paper Runner-Up)

  7. Quasi-Newton Methods for Saddle Point Problems.
    Chengchang Liu, Luo Luo.
    NeurIPS, 2022 (Spotlight)

  8. Second-order Min-Max Optimization with Lazy Hessians.
    Lesi Chen, Chengchang Liu, Jingzhao Zhang.
    arXiv preprint, 2024

  9. Symmetric Rank-k Method.
    Chengchang Liu, Cheng Chen, Luo Luo.
    arXiv preprint, 2023

  10. Regularized Newton Methods for Variational Inequalities with Hölder Continuous Jacobians.
    Chengchang Liu, Luo Luo.
    arXiv preprint, 2022

Services

  • Conference Reviewer: NeurIPS 2023-24, ICLR 2024-25, AISTATS 2024, ICML 2024, AAAI 2025.