Awesome GNN2MLP

Learning MLPs to replace GNNs

[2023-04-17] This is our first release. Feel free to contact us if you find any important papers we have missed. You are also welcome to open a pull request to contribute anything you find useful!

[2023-06-03] Added one paper.


Papers

Knowledge Distillation

  • (ICLR'22) Graph-Less Neural Networks: Teaching Old MLPs New Tricks Via Distillation. | Paper | Code

  • (ICLR'23) NOSMOG: Learning Noise-robust and Structure-aware MLPs on Graphs. | Paper | Code

  • SA-MLP: Distilling Graph Knowledge from GNNs into Structure-Aware MLP. | Paper

  • Linkless Link Prediction via Relational Distillation | Paper

  • Extracting Low-/High- Frequency Knowledge from Graph Neural Networks and Injecting it into MLPs: An Effective GNN-to-MLP Distillation Framework | Paper | Code
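The papers above share a common training objective: a student MLP, which sees only node features, is fit to both the ground-truth labels and the soft predictions of a pretrained teacher GNN. Below is a minimal NumPy sketch of that GLNN-style loss; the function name, the weighting parameter `lam`, and the temperature `tau` are illustrative choices, not an API from any of the listed codebases.

```python
import numpy as np

def softmax(z, tau=1.0):
    # Temperature-scaled softmax over class logits (rows = nodes).
    z = z / tau
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def gnn2mlp_distillation_loss(student_logits, teacher_logits, labels,
                              lam=0.5, tau=1.0):
    """Weighted sum of (a) cross-entropy against true labels and
    (b) KL divergence from the teacher GNN's soft labels to the
    student MLP's predictions. `lam` trades off the two terms."""
    p_t = softmax(teacher_logits, tau)  # teacher GNN soft labels
    p_s = softmax(student_logits, tau)  # student MLP predictions
    n = len(labels)
    ce = -np.log(p_s[np.arange(n), labels] + 1e-12).mean()
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=1).mean()
    return lam * ce + (1.0 - lam) * kl
```

At inference time the teacher GNN is discarded entirely; the MLP predicts from node features alone, which is what makes these methods attractive for latency-constrained deployment.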

Regularization

  • Graph-MLP: Node Classification without Message Passing in Graph. | Paper

  • OrthoReg: Improving Graph-regularized MLPs via Orthogonality Regularization. | Paper

Knowledge Distillation & Regularization

  • Teaching Yourself: Graph Self-Distillation on Neighborhood for Node Classification. | Paper

Knowledge Distillation without Graph Edges

  • Edge-free but Structure-aware: Prototype-Guided Knowledge Distillation from GNNs to MLPs. | Paper

Other Resources
