
2-1 Machine Learning on Complex Graphs


DGL and Machine Learning on Complex Graphs
Quan Gan, Amazon Web Services

Contents:
00 Graphs and graph neural networks
01 DGL and heterogeneous graphs
02 DGL and dynamic graphs
03 DGL and hypergraphs

00 | Graphs and Graph Neural Networks

Applications of graphs. (Shown as figures in the slides.)

The mainstream of graph neural networks is message passing, or a variant of it:
- trained end to end (the vast majority), or used as part of preprocessing (SGC, SIGN, GAMLP [2]);
- some models use Transformer architectures instead (Graphormer [4]).
[1] Geometric Deep Learning: Grids, Groups, Graphs, Geodesics, and Gauges, Bronstein et al., 2021
[2] Graph Attention MLP with Reliable Label Utilization, Zhang et al., 2021
[3] Graph Neural Networks Inspired by Classical Iterative Algorithms, Yang et al., 2021
[4] Do Transformers Really Perform Bad for Graph Representation?, Ying et al., 2021

Mathematical underpinnings of the message-passing structure:
- it can be derived from the expressions of spectral analysis (GCN [1]);
- the optimization of certain classes of loss functions on a graph can be viewed as message passing (TWIRLS [2]);
- heat diffusion over the graph can be viewed as message passing (PageRank, APPNP [3], GRAND [4]).
[1] Semi-Supervised Classification with Graph Convolutional Networks, Kipf & Welling, 2016
[2] Graph Neural Networks Inspired by Classical Iterative Algorithms, Yang et al., 2021
[3] Predict then Propagate: Graph Neural Networks meet Personalized PageRank, Gasteiger et al., 2018
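The message-passing pattern described above can be sketched without any framework. A minimal, dependency-free example with mean aggregation follows; the toy graph, features, and the 0.5/0.5 mixing weights are made up for illustration and are not taken from any of the cited models:

```python
# A minimal message-passing layer on a toy homogeneous graph.
# Each node averages its in-neighbors' feature vectors (mean aggregation),
# then mixes the result with its own features — the data flow shared by
# GCN-style models. No learned weights here; this only shows the mechanics.

def message_passing(edges, feats):
    """One round of mean-aggregation message passing.

    edges: list of (src, dst) pairs; feats: dict node -> feature list.
    Returns updated features: 0.5 * own + 0.5 * mean of in-neighbor messages.
    """
    inbox = {v: [] for v in feats}
    for src, dst in edges:
        inbox[dst].append(feats[src])            # "message" step
    out = {}
    for v, msgs in inbox.items():
        if msgs:                                  # "reduce" step: mean
            agg = [sum(x) / len(msgs) for x in zip(*msgs)]
        else:
            agg = feats[v]                        # isolated node keeps its features
        out[v] = [0.5 * a + 0.5 * b for a, b in zip(feats[v], agg)]
    return out

feats = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
edges = [(0, 1), (2, 1), (1, 2)]
h1 = message_passing(edges, feats)
```

Stacking several such rounds, with learned transformations in place of the fixed 0.5 weights, gives the end-to-end trained models; applying the aggregation repeatedly before any learning happens gives the preprocessing-style models such as SGC and SIGN.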

[4] GRAND: Graph Neural Diffusion, Chamberlain et al., 2021

01 | DGL and Heterogeneous Graphs

Why heterogeneous graphs?
[1] A Survey of Heterogeneous Information Network Analysis, Shi et al., 2015
[2] https:/

Introduction [1]: includes 25 models and more than 20 datasets (among them all of the node classification and link prediction datasets of HGB [2]).
[1] https:/
[2] Are we really making much progress? Revisiting, benchmarking, and refining heterogeneous graph neural networks, Lv et al., 2021

Evolution from homogeneous to heterogeneous graph models. The core question is how to exploit node and edge types:
- the vast majority: one function, or one set of parameters, per edge type (GraphSAGE → RGCN [1], GAT → RGAT, HGT [2]);
- extract several subgraphs, each covering a subset of edge types, then combine them (SIGN → NARS [3]);
- build new graphs from metapaths, homogenizing the heterogeneous graph (GAT → HAN [4]).
[1] Modeling Relational Data with Graph Convolutional Networks, Schlichtkrull et al., 2017
[2] Heterogeneous Graph Transformer, Hu et al., 2020
[3] Scalable Graph Neural Networks for Heterogeneous Graphs, Yu et al., 2020
[4] Heterogeneous Graph Attention Network, Wang et al., 2019

Evolution from the design space of homogeneous models [1] to a design space of heterogeneous models [2].
[1] Design Space for Graph Neural Networks, You et al., 2021
[2] Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Networks, Zhao et al., 2022
Special thanks to Tianyu Zhao, Cheng Yang, Yibo Li, Zhenyi Wang, Fengqi Liang, Yingxia Shao, Xiao Wang, Chuan Shi (BUPT), and Huan Zhao (4Paradigm)!

Theoretical difficulties in the homogeneous-to-heterogeneous evolution.
Graph neural networks generally assume homophily, i.e. nearby nodes have similar features and labels, so that:
- spectral analysis applies;
- heat-diffusion arguments apply;
- message passing can be viewed as part of an optimization process.
Heterogeneous graphs generally do not exhibit homophily:
- neighboring nodes usually have different types, so "similarity" is not even well defined;
- a heterogeneous graph has several adjacency matrices, so spectral analysis is hard to apply directly;
- edge types all differ, so heat diffusion needs a different rule per type;
- edge types all differ, so the loss function in the optimization view becomes complicated.
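A minimal sketch of the most common recipe above — one set of parameters per edge type, in the spirit of RGCN — with a made-up typed edge list and scalar "weights" standing in for the learned per-relation matrices:

```python
# RGCN-style sketch: one transformation per edge TYPE, summed over types.
# Toy graph: users click items, users follow users. The per-type scalar
# weight plays the role of the learned per-relation weight matrix.

from collections import defaultdict

def hetero_message_passing(typed_edges, feats, rel_weight):
    """typed_edges: list of (src, rel, dst); rel_weight: rel -> scalar.

    Each destination adds, for every edge type, the mean of its neighbors'
    features along that type, transformed by that type's weight."""
    by_rel = defaultdict(lambda: defaultdict(list))
    for src, rel, dst in typed_edges:
        by_rel[rel][dst].append(feats[src])
    out = {v: list(f) for v, f in feats.items()}   # self/residual term
    for rel, inbox in by_rel.items():
        w = rel_weight[rel]
        for dst, msgs in inbox.items():
            mean = [sum(x) / len(msgs) for x in zip(*msgs)]
            out[dst] = [o + w * m for o, m in zip(out[dst], mean)]
    return out

feats = {"u1": [1.0], "u2": [2.0], "i1": [0.0]}
edges = [("u1", "clicks", "i1"), ("u2", "clicks", "i1"), ("u1", "follows", "u2")]
h = hetero_message_passing(edges, feats, {"clicks": 0.5, "follows": 1.0})
```

Keeping the aggregation per edge type and only then combining is exactly what lets the model treat "clicks" and "follows" differently; the theoretical difficulties above concern what objective, if any, such typed updates descend on.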

The homogeneous-to-heterogeneous evolution from an optimization perspective [1]:
[1] Descent Steps of a Relation-Aware Energy Produce Heterogeneous Graph Neural Networks, Ahn et al., 2022
Special thanks to Hongjoon Ahn, Taesup Moon (Seoul National University), Yongyi Yang (Fudan University), and David Wipf (AWS)!
(The energy functions and the derived update equations appear as formulas in the slides.)
- In the homophilous formulation, the weight M must be restricted to a positive-definite matrix, the derived message-passing process is symmetric with respect to the direction of propagation, and homophily is still assumed.
- Replacing it with H, which can be read as a compatibility matrix describing the homophily pattern of the features, removes all of the problems above.
- The resulting update rule decomposes into: (a) a residual term from the previous layer; (b) a residual term from the input layer; (c) one message-passing term per edge type; (d) a transformation of the node's own features.
- When the penalty is an arbitrary convex function, proximal gradient descent introduces a nonlinear layer into the update rule; when the penalty of y is 0 for nonnegative vectors and +∞ otherwise, the nonlinear transform is exactly ReLU.
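The last point can be made concrete: the proximal operator of the penalty that is 0 on nonnegative vectors and +∞ otherwise is projection onto the nonnegative orthant, i.e. elementwise ReLU, so each proximal gradient step is a linear update followed by ReLU. A toy sketch with a made-up quadratic energy (not the relation-aware energy of the paper):

```python
# Proximal gradient descent: x <- prox(x - step * grad(x)).
# With the penalty f(y) = 0 if y >= 0 elementwise, +inf otherwise,
# prox_f is projection onto the nonnegative orthant — elementwise ReLU.
# Each descent step is therefore "linear update, then ReLU": a NN layer.

def relu(v):
    return [max(0.0, x) for x in v]            # prox of the nonneg. indicator

def prox_gradient_step(x, grad, step):
    return relu([xi - step * gi for xi, gi in zip(x, grad(x))])

# Toy energy 0.5 * ||x - target||^2, whose gradient is x - target.
target = [1.0, -2.0, 3.0]
grad = lambda x: [xi - ti for xi, ti in zip(x, target)]

x = [0.0, 0.0, 0.0]
for _ in range(50):
    x = prox_gradient_step(x, grad, step=0.5)
# x converges to the projection of target onto the nonnegative orthant:
# the negative coordinate is clipped to 0, the others reach their targets.
```

The same mechanism, applied to the relation-aware energy, is what turns its descent steps into the nonlinear layers of a heterogeneous GNN.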

02 | DGL and Dynamic Graphs

Why dynamic graphs?
[1] Deep Learning on Dynamic Graphs, Rossi & Bronstein, https:/

Dynamic graph models:
- given timestamps, cut the graph into snapshots over time, and aggregate information separately over space and over time (DySAT [1]);
- follow the message-passing framework, but strictly guarantee that propagation respects temporal order (TGAT [2]);
- keep one memory vector per node, and update the endpoints' memory vectors with an RNN structure whenever a new edge arrives (JODIE [3], TGN [4], APAN [5]).
[1] DySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention Networks, Sankar et al., 2020
[2] Inductive Representation Learning on Temporal Graphs, Xu et al., 2020
[3] JODIE: Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks, Kumar et al., 2019
[4] Deep Learning on Dynamic Graphs, Rossi & Bronstein, https:/
[5] APAN: Asynchronous propagation attention network for real-time temporal graph embedding, Wang et al., 2020

TGL: a framework for dynamic graphs.
- Every component can be switched on or off.
- The form of the aggregation function can be chosen freely.
- Dynamic graphs at the scale of a billion edges can be trained on a single machine.
[1] TGL: A General Framework for Temporal GNN Training on Billion-Scale Graphs, Zhou et al., 2022
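The temporal-order constraint in the TGAT-style recipe above boils down to masking: when embedding a node at query time t, only interactions that happened before t may deliver messages. A dependency-free sketch with made-up edges, timestamps, and features:

```python
# Time-respecting neighborhood: when computing node v's representation at
# time t, only edges with timestamp < t are visible — the constraint that
# temporal message-passing models such as TGAT enforce.

def temporal_neighbors(edges, dst, t):
    """edges: list of (src, dst, timestamp). Return sources of edges into
    `dst` with timestamp strictly before t — the messages visible at t."""
    return [s for s, d, ts in edges if d == dst and ts < t]

edges = [
    ("a", "c", 1.0),
    ("b", "c", 2.0),
    ("d", "c", 5.0),   # in the future relative to t=3.0 — must be masked
]
visible = temporal_neighbors(edges, "c", t=3.0)

feats = {"a": 1.0, "b": 3.0, "d": 100.0}
msgs = [feats[s] for s in visible]
h_c = sum(msgs) / len(msgs)   # aggregate only the visible messages
```

Without the `ts < t` filter the future edge from "d" would leak information backward in time, which is precisely the evaluation bug temporal GNN frameworks guard against.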

Some issues with evaluating dynamic graph models:
- public dynamic graph datasets are generally too small;
- model performance on them is already close to perfect.
[1] Inductive Representation Learning in Temporal Networks via Causal Anonymous Walks, Wang et al., 2021

WSDM 2022 dynamic-graph link prediction challenge. Two datasets:
- GDELT [1]: 140 million events between nearly 70,000 entities, collected from February 2015 to April 2019; downsampled for the competition to roughly 27 million events between 19,442 entities.
- GitHub [2]: roughly 8 million interactions, in 14 types, between about 600,000 users and GitHub repositories.
Evaluation metric: given a source node s, a destination node d, an interaction type r, and a start and end time, predict whether an edge of type r from s to d appears within the given time span. AUC is used as the measure.
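The AUC above can be computed straight from its rank definition: the probability that a randomly drawn positive pair scores higher than a randomly drawn negative pair, with ties counting one half. A small sketch with made-up scores:

```python
# AUC for link prediction, from its probabilistic definition:
# P(score(positive) > score(negative)), ties counted as 1/2.

def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Scores for queries whose edge did appear in the window (positives)
# vs. queries whose edge did not (negatives) — illustrative numbers only.
pos = [0.9, 0.8, 0.4]
neg = [0.7, 0.3, 0.2]
score = auc(pos, neg)
```

The quadratic loop is fine for a sketch; production evaluation would use a rank-based O(n log n) formulation instead.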

[1] https:/www.gdeltproject.org/
[2] https:/www.gharchive.org
Special thanks to Xuhong Wang (SJTU)!

WSDM 2022 dynamic-graph link prediction challenge: leaderboard.
[1] https:/www.dgl.ai/WSDM2022-Challenge/

Team          | Method      | Dataset A | Dataset B | T-score A | T-score B
AntGraph      | LINE w/ DT  | 0.666     | 0.902     | 0.677     | 0.585
nothing here  | TGN         | 0.662     | 0.907     | 0.670     | 0.588
NodeInGraph   | LINE        | 0.628     | 0.866     | 0.608     | 0.564
We can mask!  | KG          | 0.604     | 0.898     | 0.562     | 0.582
IDEAS Lab UT  | CHIP+LR     | 0.605     | 0.874     | 0.565     | 0.568
Sli-Rec       | Seq. RecSys | 0.584     | 0.892     | 0.526     | 0.579

03 | DGL and Hypergraphs

Why hypergraphs?
- In a hypergraph, one edge can connect more than two nodes.
- A row of tabular data can have multiple columns, expressing a relation among multiple entities.
[1] Image borrowed from https:/www.knowledge.unibocconi.eu/notizia.php?idArt=21978
[2] Learning Enhanced Representations for Tabular Data via Neighborhood Propagation, Du et al., 2022
Special thanks to Kounianhua Du, Weinan Zhang, Ruiwen Zhou, Yangkun Wang, Xilong Zhao, Jiarui Jin (SJTU), Zheng Zhang and David Wipf (AWS)!

Message passing on hypergraphs: convert the hypergraph into an ordinary graph.
- Expand each hyperedge into a clique, i.e. a complete subgraph (clique expansion).
- Treat each hyperedge as a node of a second type, forming a bipartite graph (star expansion).
- Treat each hyperedge as a node, and connect hyperedges that share a node (line graph).
- Message passing can also run on all three graphs at once, with the results aggregated.
[1] Learning over Families of Sets: Hypergraph Representation Learning for Higher Order Tasks, Srinivasan et al., 2021

PET [1]: handling tabular data with hypergraphs.
[1] Learning Enhanced Representations for Tabular Data via Neighborhood Propagation, Du et al., 2022

PET: limitations and experimental results.
- It currently only handles tables without continuous variables.
- It relies on the labels already present in the training set, and does not address the case where labels are scarce.
- It depends on the quality of the results returned by the retrieval algorithm.

Thank you for watching!
