Complex Cognitive Graph Neural Networks
Di Jin, June 25, 2022
Graph Machine Learning Summit 2022: Complex Graph Forum

Outline
1. Graph neural networks for complex graphs
2. Cognitive graph neural networks

Part 1 covers: GNN on universal networks, GNN on text-rich networks, GNN on attribute-missing HINs, GNN on higher-order dependency networks, and GNN on complex networks.

1. GNN on Universal Networks: beyond topological limitations

Real networks differ in how linked nodes relate (e.g., Zachary's Karate Club):
Homophily (social networks, citation networks): the majority of linked nodes are similar.
Heterophily (auction networks, protein structures, transaction networks, railway networks): the majority of linked nodes are different.
Randomness (ER random networks): linked nodes are largely random.
Should networks with different structural properties adopt different propagation mechanisms?
Di Jin, et al. Universal Graph Convolutional Networks. NeurIPS 2021.

Motivation
D. Brockmann and D. Helbing. The hidden geometry of complex, network-driven contagion phenomena. Science, 342(6164):1337-1342, 2013.

Motivating observations

Method
A universal GCN model should consider not only the 1-hop network neighbors but also the 2-hop neighbors and kNN neighbors for direct information propagation. More importantly, since different network properties may correlate with one of these neighborhoods or with a combination of them, the model itself should adaptively learn their importance, so as to fuse features more effectively.

Multi-type convolution mechanism. For the 2-hop adjacency matrix, the number of neighbors exactly 2 hops away may grow exponentially with network scale, so we introduce a constraint: for each node, only node pairs connected by at least two different paths form edges. For the kNN adjacency matrix, we measure similarity by the cosine of the angle between two feature vectors.

Discriminative aggregation. We aggregate with an attention mechanism, so that the contributions of 1-hop network neighbors, 2-hop neighbors, and kNN neighbors are learned automatically from the given learning objective.

Dongxiao He, et al. Block Modeling-Guided Graph Convolutional Neural Networks. AAAI 2022.
Di Jin, et al. RAW-GNN: RAndom Walk Aggregation based Graph Neural Network. IJCAI 2022.
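The multi-type convolution above can be sketched in a few lines: build the 1-hop, constrained 2-hop, and attribute-kNN adjacency channels. The function name and thresholds are illustrative, not the UGCN release.

```python
import numpy as np

def build_channels(A, X, k=5):
    """Build the three propagation channels described above: 1-hop
    neighbors, constrained 2-hop neighbors (pairs joined by at least
    two different length-2 paths), and attribute kNN by cosine
    similarity. Illustrative sketch, not the authors' code."""
    A = (A > 0).astype(int)
    np.fill_diagonal(A, 0)
    # (A @ A)[i, j] counts length-2 paths between i and j; keep pairs
    # with at least two such paths that are not already 1-hop edges.
    two_hop = ((A @ A >= 2) & (A == 0)).astype(int)
    np.fill_diagonal(two_hop, 0)
    # kNN channel: cosine similarity between attribute vectors.
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    S = Xn @ Xn.T
    np.fill_diagonal(S, -np.inf)
    knn = np.zeros_like(A)
    top = np.argsort(-S, axis=1)[:, :k]
    knn[np.repeat(np.arange(A.shape[0]), k), top.ravel()] = 1
    return A, two_hop, knn
```

A UGCN-style layer would then run one convolution per channel and fuse the three outputs with learned attention weights, as in the discriminative aggregation step.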
2. GNN on Text-rich Networks

Graphs in the real world are usually text-rich, implying that valuable semantic information needs to be carefully considered. A text-rich network combines the topology structure (the network structure), the local word-sequence semantic structure of the raw text (word-word and word-doc relations), and the global topic semantic structure (doc-topic relations).
Di Jin, et al. BiTe-GCN: A New GCN Architecture via Bidirectional Convolution of Topology and Features on Text-Rich Networks. WSDM 2021.
Zhizhi Yu, Di Jin, et al. AS-GCN: Adaptive Semantic Architecture of Graph Convolutional Networks for Text-Rich Networks. ICDM 2021.

The BiTe-GCN method, overview: bi-typed network construction, joint convolution, data refinement, model refinement.

AS-GCN

E-commerce network construction
We build a natural bi-typed network from JD.com in which 6.5M queries and 50M items covering all categories form the real nodes, and 60K product phrases and 12K attribute phrases form the entity nodes.

3. GNN on Attribute-Missing HINs

In a homogeneous network, some attribute dimensions are randomly missing; a heterogeneous network is different: the attributes of certain types of nodes can be completely missing (e.g., DBLP vs. IMDB).
Di Jin, et al. Heterogeneous Graph Neural Network via Attribute Completion. WWW 2021.

Heterogeneous information networks (HINs)
A HIN is composed of multiple types of nodes and edges, and contains comprehensive information and rich semantics. In the DBLP network (venue, paper, author, term):
meta-path APA: co-author relationship.
meta-path APVPA: co-venue relationship.

What is necessary? Exploring meta-path-oriented high-order topological information as prior knowledge.
Making attribute completion learnable, guided by this prior. Making attribute completion task-guided, so that completed attributes reflect the real topic distribution (e.g., Machine Learning vs. Molecular Biology).

Framework: the overall HGNN-AC framework.

4. GNN on Higher-Order Dependency Networks

Di Jin, et al. Graph Neural Network for Higher-Order Dependency Networks. WWW-22, Oral.

Outline
1. Graph neural networks for complex graphs
2. Cognitive graph neural networks

Di Jin, et al. Graph Convolutional Networks Meet Markov Random Fields. AAAI-19.
Meng Qu, Yoshua Bengio, Jian Tang. GMNN: Graph Markov Neural Networks. ICML 2019.
Hongchang Gao, Jian Pei, et al. Conditional Random Field Enhanced Graph Convolutional Neural Networks. KDD 2019.

Graph Convolutional Networks

Markov Random Fields

System I & System II
Since GCN and MRF have complementary strengths, it is natural to combine the two and take advantage of both for community detection. We propose an end-to-end deep learning method
to combine the GCN and MRF methods for semi-supervised community detection on attributed networks, training GCN and MRF end to end.

Design of the eMRF model
Basic definition: energy function = unary potentials + pairwise potentials.
For the unary potential, which measures the cost of node i having label u:
\varphi(y_i^u) = -p_i(y^u)
For the pairwise potential, which measures the cost of nodes i, j having labels u, v:
\psi(y_i^u, y_j^v) = \mu(u, v) \, k(i, j)
Here \mu(u, v) captures the semantic similarity between communities (e.g., politics is more similar to economy than to sports), and k(i, j) measures node similarity on topology and attribute information:
k(i, j) = R_{ij} \cdot \frac{x_i^{\top} x_j}{|x_i|\,|x_j|} \cdot e^{-d_{ij}/(2a)}
with R_{ij} and d_{ij} drawn from topology and the cosine term from attributes.

eMRF's inference into GCN
Using mean-field approximate inference (substituting P with a factorized Q) and minimizing the KL divergence D(Q || P), the derived message-updating formula is
Q_i(C_i) = \frac{1}{Z_i} \exp\Big( -\varphi(C_i) - \sum_{C_j \in L} \mu(C_i, C_j) \sum_{j \ne i} k(i, j) \, Q_j(C_j) \Big)
It can be factorized into four steps:
1. Initialization: Q_i(u) = \frac{1}{Z_i} \exp(-y_i^u), with y_i^u = -p_i(y^u).
2. Message passing, similar to a graph convolution: Q_i^{(1)}(u) = \sum_{j \ne i} k(i, j) \, Q_j(u), followed by the compatibility transform Q_i^{(2)}(u) = \sum_{v \in L} \mu(u, v) \, Q_i^{(1)}(v).
3. Addition of unary potentials: Q_i^{(3)}(u) = -y_i^u - Q_i^{(2)}(u).
4. Normalization: Q_i(u) = \frac{1}{Z_i} \exp(Q_i^{(3)}(u)).

Forward pass and loss function
The forward pass of MRFasGCN:
The first convolution: X^{(1)} = f_1(X, A) = \mathrm{ReLU}(\hat{A} X W^{(0)})
The second convolution: X^{(2)} = f_2(X^{(1)}, A) = \mathrm{softmax}(\hat{A} X^{(1)} W^{(1)})
The third convolution: Z = f_3(X^{(2)}, A, X) = \mathrm{softmax}\big( (X^{(2)} - K X^{(2)}) W^{(2)} \big), where K = [k(i, j)] is the pairwise kernel matrix.
MRFasGCN can be trained by minimizing the cross entropy between the predicted and the (partial) ground truth:
W^{*} = \arg\min L(Z, Y) = \arg\min \Big( -\sum_{i \in \mathcal{L}} \sum_{j=1}^{k} Y_{ij} \ln Z_{ij} \Big)

The MRFasGCN architecture
From the MRF side, the GCN serves as an initial solution that defines good unary potentials for the MRF. From the GCN side, the community-knowledge-based MRF becomes an additional convolutional layer of the GCN, relieving its topological limitations. GCN and MRF are trained together in a single system.
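The four mean-field steps above can be sketched as one update over all nodes. The kernel matrix K and compatibility matrix mu are illustrative inputs; this is a sketch, not the authors' released implementation.

```python
import numpy as np

def mean_field_step(P, K, mu):
    """One mean-field update following the four steps above.
    P: (n, c) label probabilities from the GCN; K: (n, n) pairwise
    kernel; mu: (c, c) label-compatibility matrix. Illustrative."""
    y = -P                                   # unary: y_i^u = -p_i(y^u)
    # 1. Initialization: Q_i(u) proportional to exp(-y_i^u).
    Q = np.exp(-y)
    Q /= Q.sum(axis=1, keepdims=True)
    # 2. Message passing (graph-convolution-like), then compatibility.
    K0 = K - np.diag(np.diag(K))             # exclude j = i
    Q1 = K0 @ Q                              # Q1_i(u) = sum_j k(i,j) Q_j(u)
    Q2 = Q1 @ mu.T                           # Q2_i(u) = sum_v mu(u,v) Q1_i(v)
    # 3. Addition of unary potentials.
    Q3 = -y - Q2
    # 4. Normalization (shifted for numerical stability).
    Q4 = np.exp(Q3 - Q3.max(axis=1, keepdims=True))
    return Q4 / Q4.sum(axis=1, keepdims=True)
```

With an attractive compatibility matrix (negative cost for same-label pairs), the update smooths each node's distribution toward its neighbors', which is exactly the role the third MRFasGCN convolution plays.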
Publications 2022:
1. Di Jin, et al. RAW-GNN: RAndom Walk Aggregation based Graph Neural Network. IJCAI-22, Long (3.75%).
2. Di Jin, et al. CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning. IJCAI-22, Long (3.75%).
3. Di Jin, Yingli Gong, Zhiqiang Wang, Zhizhi Yu, Dongxiao He, Yuxiao Huang, and Wenjun Wang. Graph Neural Network for Higher-Order Dependency Networks. The Web Conference (WWW-22), Oral.
4. Di Jin, Rui Wang, Tao Wang, Dongxiao He, Weiping Ding, Yuxiao Huang, Longbiao Wang, Witold Pedrycz. Amer: A New Attribute-Missing Network Embedding Approach. IEEE Transactions on Cybernetics, 2022.
5. Tao Wang, Di Jin, Rui Wang, Dongxiao He, Yuxiao Huang. Powerful Graph Convolutional Networks with Adaptive Propagation Mechanism for Homophily and Heterophily. AAAI-22, Oral.
6. Xin Sun, Xin Huang, and Di Jin*. Fast Algorithms for Core Maximization on Large Graphs. PVLDB 2022, Oral.
7. Dongxiao He, Rui Guo, Xiaobao Wang, Di Jin, Yuxiao Huang, and Wenjun Wang. Inflation Improves Graph Learning. The Web Conference (WWW-22), Oral.
8. Dongxiao He, Chundong Liang, Cuiying Huo, Zhiyong Feng, Di Jin, Liang Yang, Weixiong Zhang. Analyzing Heterogeneous Networks with Missing Attributes by Unsupervised Contrastive Learning. IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2022.
9. Nan Jiang, Wen Jie, Jin Li, Ximeng Liu, Di Jin, et al. GATrust: A Multi-Aspect Graph Attention Network Model for Trust Assessment in OSNs. TKDE, 2022.

Publications 2021:
1. Di Jin, Cuiying Huo, Chundong Liang, and Liang Yang. Heterogeneous Graph Neural Network via Attribute Completion. WWW-21, Oral, Best Paper Award Runner-Up.
2. Zhizhi Yu, Di Jin, Ziyang Liu, Dongxiao He, Xiao Wang, Hanghang Tong, and Jiawei Han. AS-GCN: Adaptive Semantic Architecture of Graph Convolutional Networks for Text-Rich Networks. ICDM-21, Best Student Paper Award Runner-Up.
3. Di Jin, Zhizhi Yu, Pengfei Jiao, Shirui Pan, Dongxiao He, Jia Wu, Philip S. Yu, and Weixiong Zhang. A Survey of Community Detection Approaches: From Statistical Modeling to Deep Learning. TKDE 2021.
4. Di Jin, Zhizhi Yu, Dongxiao He, Carl Yang, Philip S. Yu, and Jiawei Han. GCN for HIN via Implicit Utilization of Attention and Meta-paths. TKDE 2021.
5. Di Jin, Xiaobao Wang, Dongxiao He, Jianwu Dang, and Weixiong Zhang. Robust Detection of Link Communities with Summary Description in Social Networks. TKDE 2021.
6. Di Jin, Zhizhi Yu, Cuiying Huo, Dongxiao He, Xiao Wang, and Jiawei Han. Universal Graph Convolutional Networks. NeurIPS-21.
7. Di Jin, Xiangchen Song, Zhizhi Yu, Ziyang Liu, Heling Zhang, Zhaomeng Cheng, and Jiawei Han. BiTe-GCN: A New GCN Architecture via Bidirectional Convolution of Topology and Features on Text-rich Networks. WSDM-21, Oral.
8. Xin Sun, Xin Huang, Zitan Sun, and Di Jin*. Budget-constrained Truss Maximization over Large Graphs: A Component-based Approach. CIKM-21, Oral.
9. Dongxiao He, Shuai Li, Di Jin, Pengfei Jiao, Yuxiao Huang. Self-Guided Community Detection on Networks with Missing Edges. IJCAI-21, Oral.
10. Dongxiao He, Tao Wang, Lu Zhai, Di Jin, Liang Yang, Yuxiao Huang, Zhiyong Feng, Philip S. Yu. Adversarial Representation Mechanism Learning for Network Embedding. TKDE, 2021.
11. Pengfei Jiao, Qiang Tian, Wang Zhang, Xuan Guo, Di Jin, and Huaming Wu. Role Discovery Guided Network Embedding based on Autoencoder and Attention Mechanism. IEEE Transactions on Cybernetics, 2021.
12. Liang Yang, Yuanfang Guo, Junhua Gu, Di Jin, Bo Yang, Xiaochun Cao. Probabilistic Graph Convolutional Network via Topology-Constrained Latent Space Model. IEEE Transactions on Cybernetics, 2021.
13. Dongxiao He, Huixin Liu, Zhiyong Feng, Xiaobao Wang, Di Jin, Wenze Song, Yuxiao Huang. A Joint Community Detection Model: Integrating Directed and Undirected Probabilistic Graphical Models via Factor Graph with Attention Mechanism. IEEE Transactions on Big Data (TBD), 2021.
14. Ziyang Liu, Junqing Chen, Yunjiang Jiang, Yue Shang, Wei Xiong, Sulong Xu, Zhaomeng Cheng, Bo Long, Lingfei Wu, Yun Xiao, and Di Jin. Semi-Explicit MMoE via Heterogeneous Multi-Task Learning for Ranking Relevance. The IRS 2021 workshop at KDD, 2021.

Thank you!
Papers + Data + Code: https:/