My last name, Cong (simplified 丛 / traditional 叢), is pronounced ts-oh-ng.
Research Interests
Throughout my doctoral studies, my primary focus has been on fundamental graph ML problems, from both the theory and the application perspective. Within this domain, I have led the following research efforts:
Optimization: scalable GNN training via sampling (KDD20, NeurIPS20) and distributed training (ICLR22);
Generalization and expressive power: understanding why GNNs suffer performance degradation as the number of layers increases (NeurIPS21);
Privacy: efficient unlearning via orthogonal projection (AISTATS23);
Model design: designing neural architectures for temporal graph learning (SDM23, ICLR23).
Overview: Fraud detection on large-scale Amazon product-seller-buyer data using the GraphStorm ML framework. Enhancing GraphStorm by introducing new features, such as training with multiple target node types and temporal graph learning.
Overview: We propose a conceptually and technically simple architecture with three components: (1) a link-encoder, based only on multi-layer perceptrons (MLPs), that summarizes information from temporal links; (2) a node-encoder, based only on neighbor mean-pooling, that summarizes node information; and (3) an MLP-based link classifier that performs link prediction from the encoders' outputs. Despite its simplicity, our proposal attains outstanding performance on temporal link prediction benchmarks, with faster convergence and better generalization. This paper was accepted to ICLR23 for oral presentation.
Overview: We propose a Transformer-based dynamic graph learning method named Dynamic Graph Transformer (DyFormer). To improve generalization, we introduce two complementary self-supervised pre-training tasks and show, via an information-theoretic analysis, that jointly optimizing them yields a smaller Bayes error rate. We also propose a temporal-union graph structure and a target-context node sampling strategy for efficient and scalable training. This paper was accepted to SDM23.
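One plausible reading of a "temporal-union graph" can be sketched in a few lines: merge the edges of all graph snapshots into a single graph whose edges record every timestamp at which they were observed, so one structure covers the whole history. The function name and the exact construction here are assumptions for illustration, not taken from the paper.

```python
from collections import defaultdict

def temporal_union(snapshots):
    """Merge snapshot edge lists (one list per time step t = 0, 1, ...)
    into a single graph: (u, v) -> set of timestamps the edge was seen."""
    union = defaultdict(set)
    for t, edges in enumerate(snapshots):
        for u, v in edges:
            union[(u, v)].add(t)
    return dict(union)

g = temporal_union([[(0, 1), (1, 2)], [(0, 1)], [(2, 3)]])
# edge (0, 1) appears at t = 0 and t = 1; (2, 3) only at t = 2
```

The appeal of such a structure is that training can sample subgraphs from one graph instead of iterating over every snapshot separately.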
I live with three cats: Juju (橘橘🍊, he) on the left, Heihei (黑娃, he) on the right, and Juno (🐈‍⬛, she) in the middle.
I am a great fan of motorsports, and I was incredibly fortunate to have the opportunity to drive a formula car with the Allen Berg Racing School at WeatherTech Raceway Laguna Seca.
I race online on iRacing and enjoy attending endurance racing events. Here is a photo of my DIY sim rig at home 👇.