From Propagation to Ego-Network Modeling in GNNs
Liang Yang

Outline
- Existing Graph Neural Networks
- Orthogonal Propagation with Ego-Network Modeling
- Graph Neural Networks without Propagations
- Self-supervised GNNs via Low-Rank Decomposition
- Conclusions

OPEN: Orthogonal Propagation with Ego-Network Modeling
Motivation: irrelevant propagations
- In existing GNNs, a propagation layer updates each node from its ego-network (e.g. in the toy 6-node graph, the ego-network of node 5 contains nodes {3, 5, 6}): x_i = Σ_{j ∈ N(i) ∪ {i}} c_{ij} x_j, where c_{ij} is the propagation weight and x_j the node embedding.
- Intra-channel irrelevant: the weights are learned pairwise, so within one channel the propagations to each node are irrelevant to each other.
- Inter-channel irrelevant: with learnable multi-channel weights (e.g. channel 1, j = 1, green line; channel 2, j = 2, orange line), the propagations in different channels are also irrelevant to each other.

Existing GNNs with irrelevant propagations: two equivalent views
- Propagation perspective (pairwise learning): x_i = Σ_j c_{ij} x_j, with node embeddings x_j ∈ R^F (the F-dimensional representation of one node), N nodes, and propagation weights c_{ij}.
- Dimension-reduction perspective: ego-network extraction gives the attribute matrix X_i of the ego-network (one column per node, e.g. nodes 3, 5, 6; each of its F rows is a data point), and a mapping function h produces a 1-dimensional representation of each of the F data points, x_i = X_i h. With C channels, x_i = X_i H, where H = [h_1, ..., h_C] is learnable.
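As a concrete illustration of the two views above, the following minimal NumPy sketch (the toy 6-node graph and all numeric values are assumptions for illustration) extracts the ego-network of one node and checks that the weighted-sum propagation and the ego-network mapping x = X h give the same result.

```python
import numpy as np

# Toy graph with 6 nodes and F = 4 attribute dimensions (illustrative values).
F, n_nodes = 4, 6
rng = np.random.default_rng(0)
X_all = rng.normal(size=(n_nodes, F))          # node attributes, one row per node
neighbors = {4: [2, 5]}                        # 0-based: ego-network of node 5 is {3, 5, 6}

def ego_network(i):
    """Indices of the ego-network of node i (the node plus its neighbors)."""
    return [i] + neighbors.get(i, [])

i = 4
ego = ego_network(i)

# Propagation perspective: weighted sum of the node embeddings in the ego-network.
c = np.array([0.5, 0.3, 0.2])                  # pairwise propagation weights c_ij
x_prop = sum(c_ij * X_all[j] for c_ij, j in zip(c, ego))

# Dimension-reduction perspective: x_i = X_i h, where X_i has one column per
# ego-network node and each of its F rows is a "data point".
X_i = X_all[ego].T                             # shape (F, |ego|)
x_map = X_i @ c                                # the weight vector acts as the mapping h

print(np.allclose(x_prop, x_map))              # True: the two views coincide
```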
OPEN: mapping via the whole ego-network
- Instead of learning pairwise weights, OPEN maps the whole ego-network at once, which makes the propagations intra-channel relevant and inter-channel relevant.
- In the dimension-reduction perspective: first perform PCA on the whole ego-network attribute matrix X_i, obtaining orthogonal principal directions u_1, u_2, ... (u_k^T u_l = 0 for k ≠ l); then let the mapping vectors of the C channels be these principal components, i.e. x_i = X_i U with U = [u_1, ..., u_C].

Algorithm: the ego-network PCA is performed per node and per layer (a minimal sketch follows; full details are in the paper).
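A minimal sketch of this mapping step, under assumed toy data: PCA is run on the ego-network attribute matrix and its orthogonal principal directions are used as the per-channel mapping vectors; the exact normalization and learnable components of OPEN are omitted.

```python
import numpy as np

# Minimal sketch of the OPEN-style mapping (assumed toy data): PCA on the whole
# ego-network attribute matrix, then use the orthogonal principal directions as
# the per-channel mapping (propagation) vectors.
rng = np.random.default_rng(0)
F, C = 4, 2                                     # attribute dimension, number of channels
X_i = rng.normal(size=(F, 3))                   # ego-network {3, 5, 6}: one column per node

# PCA over the F data points (rows): center, then SVD; rows of Vt are orthonormal.
X_centered = X_i - X_i.mean(axis=0, keepdims=True)
_, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
U = Vt[:C].T                                    # (3, C): columns u_1, ..., u_C are orthogonal

print(np.round(U.T @ U, 6))                     # ≈ identity matrix: channels are orthogonal
Z_i = X_i @ U                                   # (F, C): multi-channel representation of node i
print(Z_i.shape)
```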
Prevent over-smoothing
- Theorem 1. The representation of each node produced by OPEN is relevant to the principal components of the attributes of its corresponding ego-network.
- After many layers, the performance of OPEN does not drop remarkably, which matches the theoretical result of Theorem 1.

Evaluations
- The proposed OPEN achieves a new SOTA on 6 of the 7 networks.
- Relevant propagation modeling improves the robustness of GNNs.
- Figure 2 shows that OPEN consistently outperforms GAT for every number of channels.
- Visualizations of the embeddings obtained from GCN and OPEN.

Conclusions (OPEN)
- This work identifies the irrelevant-propagation issue in GNNs, which makes models prone to overfitting the labelled data.
- It presents Orthogonal Propagation with Ego-Network modeling (OPEN) to model two kinds of relevance between propagations.
- Theoretical analysis and experimental evaluations reveal four attractive characteristics of OPEN: modeling high-order relationships beyond pairwise ones, preventing overfitting, robustness, and high efficiency.

Graph Neural Networks without Propagations
- Figure: (a) ego-network in the original graph; (b) existing propagation-based GNNs produce a node representation by a weighted average of node attributes; (c) the proposed Low-Rank GNNs decompose the node attributes into a low-rank node representation plus noise.
Pipeline of message passing
- Smoothing effect → over-smoothing and a performance drop on heterophily networks (illustrated by the sketch below).
- Propagations are irrelevant → fragile to attribute noise.
- Heavy reliance on the topology → fragile to topology noise.

Motivation 1: replace the propagation with a local operation.
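The smoothing effect noted above can be seen directly: repeatedly applying a normalized averaging propagation drives all node representations toward the same value. A minimal sketch on an assumed toy graph:

```python
import numpy as np

# Toy illustration (assumed 6-node path graph) of the smoothing effect: repeated
# weighted-average propagation makes all node representations nearly identical.
n = 6
A = np.zeros((n, n))
for u in range(n - 1):
    A[u, u + 1] = A[u + 1, u] = 1.0              # path-graph edges
A_hat = A + np.eye(n)                            # add self-loops
P = A_hat / A_hat.sum(axis=1, keepdims=True)     # row-normalized propagation matrix

rng = np.random.default_rng(0)
H = rng.normal(size=(n, 4))                      # random node attributes

for k in range(1, 33):
    H = P @ H                                    # one weighted-average propagation layer
    if k in (1, 4, 16, 32):
        spread = np.linalg.norm(H - H.mean(axis=0), axis=1).mean()
        print(f"layers={k:2d}  mean distance to the average node: {spread:.4f}")
# The spread shrinks toward 0: node representations become indistinguishable.
```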
Motivation 2: ego-networks exhibit a low-rank characteristic
- Figure (a): the low-rank structure of the node attributes can be destroyed, for both homophilic and heterophilic nodes.
- Figure (b): percentage of nodes whose nuclear norm increases, for datasets with homophily rate above 0.4 (Citeseer, Cora, Pubmed, Photo, Computer) and below 0.4 (Squirrel, Chameleon, Wisconsin).
- This indicates that, for most nodes, the collection of attributes from the ego-network exhibits a low-rank characteristic (the nuclear norm serves as a convex surrogate for the rank of the matrix).

Motivation 3: reducing the rank can boost performance
- Figure: accuracy versus the ratio of rank reduction on Cora, Citeseer and Pubmed (see the sketch below).
- Improved performance and reduced rank are positively correlated.

Low-Rank GNNs
- Weighted averaging loses critical information by mixing nodes from different categories.
- Low-Rank GNNs keep the critical information by exploiting the high-order relationship within the ego-network.
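To make Motivations 2 and 3 concrete, the sketch below (assumed toy data; not the paper's exact measurement protocol) computes the nuclear norm of an ego-network attribute matrix and builds a rank-reduced version of it with a truncated SVD.

```python
import numpy as np

# Illustrative sketch (assumed toy data) of the two quantities used in the
# motivations: the nuclear norm of an ego-network attribute matrix, and a
# rank-reduced version of that matrix obtained with a truncated SVD.
rng = np.random.default_rng(0)
n_ego, F = 8, 16                                 # ego-network size, attribute dimension
X_ego = rng.normal(size=(n_ego, F))              # attributes of the nodes in one ego-network

def nuclear_norm(M):
    """Sum of singular values: a convex surrogate for the rank of M."""
    return np.linalg.svd(M, compute_uv=False).sum()

def reduce_rank(M, keep_ratio=0.5):
    """Keep only the largest singular values (truncated SVD reconstruction)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    k = max(1, int(len(s) * keep_ratio))
    return (U[:, :k] * s[:k]) @ Vt[:k]

X_low = reduce_rank(X_ego, keep_ratio=0.5)
print("rank before:", np.linalg.matrix_rank(X_ego), " after:", np.linalg.matrix_rank(X_low))
print("nuclear norm before: %.2f  after: %.2f" % (nuclear_norm(X_ego), nuclear_norm(X_low)))
```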
Low-Rank GNNs: formulation
- Build the objective function via Robust PCA (RPCA): decompose the node attributes of the ego-network into a low-rank node representation plus noise (in its standard form, min ||L||_* + λ||S||_1 s.t. X = L + S).
- Solve the objective function with ADMM / the Augmented Lagrangian Method (ALM); a sketch follows the comparison below.

Insights and discussions: propagation-based GNNs vs. Low-Rank GNNs (both built on the ego-network of the original graph)
- Effect on attributes: smoothing effect of pairwise propagation, with loss of information, vs. correcting and refining the attributes.
- Robustness: fragile vs. robust to topology noises and to attribute noises.
- Cost: increased complexity and overfitting vs. parameter-free and parallelizable.
- Use of the ego-network: only building the ego-network vs. considering all nodes in the ego-network.
- High-order relationship: most models do NOT model it; Low-Rank GNNs do (YES).
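A minimal sketch of the decomposition step referenced above, assuming the standard RPCA formulation solved by a basic inexact ALM / ADMM loop (singular-value thresholding for the low-rank part, soft-thresholding for the noise); the paper's actual objective and solver may differ in details.

```python
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def soft(M, tau):
    """Entrywise soft-thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca_alm(X, lam=None, mu=None, n_iter=100):
    """Decompose X into a low-rank part L and a sparse noise part S (X ≈ L + S)."""
    m, n = X.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / (np.abs(X).sum() + 1e-12)
    L = np.zeros_like(X); S = np.zeros_like(X); Y = np.zeros_like(X)
    for _ in range(n_iter):
        L = svt(X - S + Y / mu, 1.0 / mu)        # update the low-rank representation
        S = soft(X - L + Y / mu, lam / mu)       # update the sparse noise
        Y = Y + mu * (X - L - S)                 # dual (Lagrange multiplier) update
    return L, S

# Assumed toy ego-network attribute matrix: low-rank signal plus sparse noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 30))      # rank-2 signal
X[rng.random(X.shape) < 0.05] += 5.0                         # sparse corruptions
L, S = rpca_alm(X)
print("recovered rank:", np.linalg.matrix_rank(L, tol=1e-6))
```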
Evaluations
- The proposed Low-Rank GNNs achieve a new SOTA on 6 networks, covering both homophilic and heterophilic datasets.
- Low-Rank GNNs are robust to both topology noise and attribute noise.
- Figure: accuracy of Low-Rank GNNs (K=1, K=2) vs. AVG (K=1, K=2) on (a) homophilic datasets (Pubmed, Computer, Photo, CS, Physics) and (b) heterophilic datasets (Chameleon, Squirrel, Actor, Cornell, Texas, Wisconsin). The low-rank structure exists in local regions; Low-Rank GNNs are more effective and universal than propagation-based GNNs; AVG loses critical information.
- Visualizations of the embeddings obtained from GCN, GAT and LRGNN on Photo, Computer, Citeseer and Chameleon.

Conclusions (Low-Rank GNNs)
- This work investigates essential issues in most existing propagation-based graph neural networks: over-smoothing and performance drops on networks with heterophily, the inability to model high-order relationships, and fragility to topology and attribute noises.
- Quantitative experimental analysis reveals 1) the low-rank characteristic of the node attributes collected from ego-networks and 2) a performance improvement obtained by reducing their rank.
- The proposed Low-Rank GNNs, which perform a low-rank approximation of the attribute matrix of each ego-network, possess several attractive characteristics: robustness to topology and attribute noises, handling networks with both homophily and heterophily, and being parameter-free and parallelizable, supported by theoretical analysis and experimental evaluations.
Self-supervised Graph Neural Networks via Low-Rank Decomposition
- Motivation: make the propagation happen between nodes from the same class (figure: ego-network, panels (a) and (b)).
- Problem 1: global parameters make the model unable to capture local properties.
- Problem 2: the aggregation operation in propagation-based GNNs makes node representations lose discriminative information.
Analysis: how to obtain the propagation matrix?
- Target: obtain the low-rank representation matrix via RPCA, solved with Augmented Lagrangian Methods.

Methodology: Low-Rank Tensor Decomposition-based GNN
- Construct the node attribute tensor by selecting similar ego-networks and splicing their attribute matrices into a 3-way tensor (a sketch follows), then recover its low-rank part with Tensor RPCA.
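A minimal sketch of the tensor-construction step, under assumed toy data and a simple cosine-similarity criterion for "similar ego-networks"; the paper's selection rule and the Tensor RPCA solver are not reproduced here.

```python
import numpy as np

# Assumed toy setting: for a target node, pick the k most similar ego-networks
# (by cosine similarity of mean ego-network attributes) and stack their attribute
# matrices into a 3-way tensor, the input of Tensor RPCA.
rng = np.random.default_rng(0)
n_nodes, F, ego_size, k = 30, 8, 5, 4
X = rng.normal(size=(n_nodes, F))                     # node attributes
egos = [rng.choice(n_nodes, size=ego_size, replace=False) for _ in range(n_nodes)]

def ego_matrix(i):
    """Attribute matrix of the ego-network of node i: (ego_size, F)."""
    return X[egos[i]]

def ego_summary(i):
    """Mean attribute vector of the ego-network, used to compare ego-networks."""
    return ego_matrix(i).mean(axis=0)

target = 0
summaries = np.stack([ego_summary(i) for i in range(n_nodes)])
summaries /= np.linalg.norm(summaries, axis=1, keepdims=True) + 1e-12
sims = summaries @ summaries[target]                  # cosine similarity to the target
most_similar = np.argsort(-sims)[:k]                  # the k most similar ego-networks

# Splice the selected attribute matrices into a 3-way tensor (k, ego_size, F).
T = np.stack([ego_matrix(i) for i in most_similar])
print(T.shape)                                        # (4, 5, 8): input to Tensor RPCA
```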
Algorithms: Low-Rank Matrix / Tensor Decomposition-based GNN (LRD-GNN).

Evaluations
- Experimental results demonstrate the strong robustness of LRD-GNN against random attacks on the graph topology and on the node attributes.
- LRD-GNN also prevents over-smoothing.

Conclusions
- Propagation is just one way to model ego-networks.