Discovering latent node Information by graph attention network

Gu, W., Gao, F., Lou, X. et al. Sci Rep 11, 6967 (2021).

In this paper, we propose graph attention based network representation (GANR), which utilizes the graph attention architecture and takes the graph structure as the supervised learning information. Compared with node-classification-based representations, GANR can be used to learn representations for any given graph. GANR is not only capable of learning high-quality node representations that achieve competitive performance on link prediction, network visualization, and node classification, but it can also extract meaningful attention weights that can be applied to node centrality measurement. GANR can identify leading venture capital investors, discover highly cited papers, and find the most influential nodes in the Susceptible–Infected–Recovered (SIR) model. We conclude that link structures in graphs are not limited to predicting linkage itself; they can also reveal latent node information in an unsupervised way once an appropriate learning algorithm, such as GANR, is provided.

Learning Universal Network Representation via Link Prediction by Graph Convolutional Neural Network

W. Gu, F. Gao, R. Li and J. Zhang; Journal of Social Computing, vol. 2, no. 1, pp. 43-51, March 2021.

We propose a novel network representation method, named Link Prediction based Network Representation (LPNR), which generalizes the latest graph neural networks and optimizes a carefully designed objective function that preserves linkage structures. LPNR can not only learn meaningful node representations that achieve competitive accuracy in node centrality measurement and community detection but also achieve high accuracy in the link prediction task. Experiments demonstrate the effectiveness of LPNR on three real-world networks. With the mini-batch and fixed sampling strategy, LPNR can learn the embedding of large graphs in a few hours.

Gumbel-softmax-based optimization: a simple general framework for optimization problems on graphs

Yaoxin Li, Jing Liu, Guozheng Lin, Yueyuan Hou, Muyun Mou, Jiang Zhang; Comput Soc Netw 8, 5 (2021).

In this work, we propose a simple, fast, and general algorithm framework based on advanced automatic differentiation techniques empowered by deep learning frameworks. By introducing the Gumbel-softmax technique, we can optimize the objective function directly by gradient descent regardless of the discrete nature of the variables. We also introduce an evolution strategy to parallelize our algorithm. We test our algorithm on four representative optimization problems on graphs, including modularity optimization from network science, the Sherrington–Kirkpatrick (SK) model from statistical physics, the maximum independent set (MIS) and minimum vertex cover (MVC) problems from combinatorial optimization, and the influence maximization problem from computational social science. High-quality solutions can be obtained in much less time than with traditional approaches.
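The core trick the abstract describes is the Gumbel-softmax relaxation: adding Gumbel noise to logits and applying a temperature-scaled softmax yields a differentiable surrogate for discrete sampling, so gradient descent can be applied to discrete variables. A minimal NumPy sketch of the relaxation itself (not the paper's code; the logits, temperatures, and function name are illustrative assumptions):

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    """Differentiable relaxation of a categorical sample.

    Adds Gumbel(0, 1) noise to the logits, then applies a softmax with
    temperature tau. As tau -> 0 the output approaches a one-hot sample;
    larger tau gives a smoother, more uniform vector."""
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    z = (logits + g) / tau
    z = np.exp(z - z.max())          # numerically stable softmax
    return z / z.sum()

rng = np.random.default_rng(0)
logits = np.array([2.0, 0.5, -1.0])  # unnormalized preferences over 3 choices

soft = gumbel_softmax(logits, tau=5.0, rng=rng)   # smooth, far from one-hot
hard = gumbel_softmax(logits, tau=0.1, rng=rng)   # close to a one-hot sample
```

In the paper's setting the logits are trainable parameters (one categorical per graph node), and the relaxed samples are plugged into the objective (modularity, MIS/MVC penalty, etc.) so its gradient can flow back to the logits.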

scCapsNet: a deep learning classifier with the capability of interpretable feature extraction, applicable for single cell RNA data analysis

Lifei Wang, Rui Nie, Zeyang Yu, Ruyue Xin, Caihong Zheng, Zhang Zhang, Jiang Zhang, Jun Cai; Nature Machine Intelligence, 2: 693-703 (2020)

The scCapsNet model retains the capsule layers of CapsNet but replaces the convolutional neural network component with several parallel fully connected neural networks. We apply scCapsNet to scRNA-seq data. The results show that scCapsNet performs well as a classifier and that the parallel fully connected neural networks function as feature extractors, as we hypothesized.
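The architectural change described above, parallel fully connected extractors feeding the primary capsules in place of convolutions, can be sketched in a few lines. A minimal NumPy forward pass; all sizes, the weight initialization, and the tanh nonlinearity are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_genes, n_capsules, cap_dim = 2000, 16, 8   # illustrative sizes

# One independent fully connected extractor per primary capsule, each
# reading the full expression profile (this replaces CapsNet's
# convolutional front end).
weights = [rng.normal(scale=0.01, size=(n_genes, cap_dim))
           for _ in range(n_capsules)]

def primary_capsules(x):
    """Run the parallel FC extractors; stack into (n_capsules, cap_dim)."""
    return np.stack([np.tanh(x @ W) for W in weights])

x = rng.normal(size=n_genes)      # one cell's normalized expression vector
caps = primary_capsules(x)        # one cap_dim vector per capsule
```

Because each extractor is a separate weight matrix over the genes, inspecting a single matrix reveals which genes drive a given capsule, which is what makes the feature extraction interpretable.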

Complex Network Classification with Convolutional Neural Network

Ruyue Xin, Jiang Zhang, Yitong Shao; Tsinghua Science and Technology, Volume 25, Number 4, 2020

We propose a novel framework of Complex Network Classifier (CNC) that integrates network embedding and convolutional neural networks to tackle the problem of network classification. By training the classifier on synthetic complex network data, we show that CNC can not only classify networks with high accuracy and robustness but can also extract the features of the networks automatically. We also compare CNC with baseline methods on benchmark datasets, showing that our method performs well on large-scale networks.

The Hidden Flow Structure and Metric Space of Network Embedding Algorithms Based on Random Walks

Weiwei Gu, Li Gong, Xiaodan Lou, Jiang Zhang; Scientific Reports, 7: 13114, 2017

We propose an approach based on the open-flow network model to reveal the underlying flow structure, and its hidden metric space, of different random walk strategies on networks. We show that the essence of embedding based on random walks is the latent metric structure defined on the open-flow network. This not only deepens our understanding of random-walk-based embedding algorithms but also helps in finding new potential applications in network embedding.

Email

jakezj#163.com

Research Group

bnusss.github.io

Address

School of Systems Science
Beijing Normal University, 100875
Beijing, China