Weilin Cong (丛炜霖)


PhD, The Pennsylvania State University
E-mail: congweilin95@gmail.com

About me

I am a research scientist at Meta AI. I received my Ph.D. in Computer Science and Engineering from The Pennsylvania State University, under the supervision of Prof. Mehrdad Mahdavi. I received my B.S. degree from Beijing Institute of Technology. For more detailed information, please refer to my CV.

My last name, Cong (simplified 丛 / traditional 叢), is pronounced ts-oh-ng.

Research Interests

Throughout my doctoral studies, my primary focus has been on investigating fundamental graph ML problems from both theoretical and application perspectives. Within this domain, I have led the following research endeavors:

  • Optimization: scalable GNN training via sampling (KDD20, NeurIPS20) and distributed training (ICLR22);

  • Generalization and expressive power: understanding why GNNs suffer from performance degradation as the number of layers increases (NeurIPS21);

  • Privacy: efficient unlearning via orthogonal projection (AISTATS23);

  • Model design: designing neural architectures for temporal graph learning (SDM23, ICLR23).

Please refer to the publications and most recent updates for more details.

Most recent update (10/16/2023)

Working Experience

  • Amazon AWS AI, Santa Clara, CA

    Applied scientist intern, May 2023 – Sep 2023

    Supervisors: Zheng Da, Xiang Song

    Overview: Fraud detection on large-scale Amazon product-seller-buyer data using the GraphStorm ML framework. Enhanced GraphStorm with new features, such as training with multiple target node types and temporal graph learning.

  • Meta AI Research, Menlo Park, CA

    Research intern, May 2022 – Aug 2022

    Supervisor: Si Zhang

    Overview: Proposed a conceptually and technically simple architecture consisting of three components: (1) a link-encoder based only on multi-layer perceptrons (MLPs) that summarizes information from temporal links, (2) a node-encoder based only on neighbor mean-pooling that summarizes node information, and (3) an MLP-based link classifier that performs link prediction from the outputs of the two encoders. Despite its simplicity, our method attains outstanding performance on temporal link prediction benchmarks, with faster convergence and better generalization. This paper was accepted to ICLR23 for oral presentation.

  • Meta AI Research, Remote, US

    Research intern, Jun 2021 – Aug 2021

    Supervisor: Yanhong Wu

    Overview: Proposed a Transformer-based dynamic graph learning method named Dynamic Graph Transformer (DyFormer). To improve generalization, we introduced two complementary self-supervised pre-training tasks and showed, via an information-theoretic analysis, that jointly optimizing the two tasks results in a smaller Bayes error rate. We also proposed a temporal-union graph structure and a target-context node sampling strategy for efficient and scalable training. This paper was accepted to SDM23.

Honors, Awards and Grants

  • Top Reviewer of NeurIPS 2022 [Link]


I live with three cats: the cat on the left is Juju (橘橘🍊, he), the cat on the right is Heihei (黑娃, he), and Juno (🐈‍⬛, she) is in the middle.


I am a great fan of motorsports, and I was incredibly fortunate to have the opportunity to drive a formula car with Allen Berg Racing School at WeatherTech Raceway Laguna Seca.

I race online on iRacing and enjoy attending endurance racing events. This is a photo of my DIY sim rig at home 👇.

alt text