Abstract: Graph Transformers, emerging as a new architecture for graph representation learning, suffer from quadratic complexity and can only handle graphs with at most thousands of nodes. To this ...
State Key Laboratory of Medical Proteomics, Dalian Institute of Chemical Physics, Chinese Academy of Sciences, Dalian 116023, P. R. China University of Chinese Academy of Sciences, Beijing 100049, P.
Abstract: In the field of graph self-supervised learning (GSSL), graph autoencoders and graph contrastive learning are two mainstream methods. Graph autoencoders aim to learn representations by ...