Exploring Sufficient and Necessary Causality
(Probability of Sufficient and Necessary Causes)
Mengyue Yang, University College London (mengyue.yang.20@ucl.ac.uk)

Slide 2: The OOD generalization task (Invariant Learning)
- Invariant causal assumption across the source and test distributions: P_source(Y | C = c) = P_test(Y | C = c).
- Goal: extract causal features for OOD generalization.
  1. Infer the causal feature C from observational data.
  2. Predict the label Y from the causal feature.

Slide 3: Invariant Learning
- Is causal representation enough in invariant learning? What kind of causal information is essential?
- Motivating example: "Is that a cat? No!"

Slide 4: Causal representation: defining sufficient and necessary causes
- Following Chapter 9 of Pearl's Causality: counterfactual probabilities over binary variables X and Y.

Slides 5-6: Causal representation: understanding PNS.

Slides 7-8: How to identify PNS from observational data
- Exogeneity: X is the cause of Y (no unobserved confounding).
- Monotonicity: changes in X lead to monotonic changes in Y.

Slide 9: Defining the PNS risk on the test domain
- Define a monotonicity measurement and use it to model the PNS risk.
- Reference: Yang, Mengyue, et al. "Invariant Learning via Probability of Sufficient and Necessary Causes." NeurIPS 2023 (Spotlight).

Slide 10: Connecting the monotonicity measurement with the PNS risk (satisfaction of monotonicity).

Slide 11: Exogeneity under different causal assumptions (satisfaction of exogeneity)
1. C contains all the information about Y that is in X.
2. There is no spurious correlation between the causal information and the domain knowledge.
3. C does not contain all the information about Y that is in X.
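As background for slides 4-8, the counterfactual quantities from Chapter 9 of Pearl's Causality (with y denoting Y = 1, y' denoting Y = 0, and likewise for x, x') are:

```latex
% Probabilities of sufficiency, necessity, and both (Pearl, Causality, Ch. 9)
\begin{align*}
\mathrm{PS}  &= P(y_{x} \mid x', y')   \\ % setting X=x would produce y
\mathrm{PN}  &= P(y'_{x'} \mid x, y)   \\ % removing x would remove y
\mathrm{PNS} &= P(y_{x},\, y'_{x'})       % x is sufficient and necessary for y
\end{align*}
% Under exogeneity and monotonicity, PNS is identifiable from
% observational data:  PNS = P(y \mid x) - P(y \mid x').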
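Slides 7-8 state Pearl's identification conditions: under exogeneity and monotonicity, PNS reduces to the observable difference P(y | x) - P(y | x'), so a plug-in estimate is just a difference of conditional frequencies. A minimal sketch (the toy data and function name are illustrative, not from the talk):

```python
import numpy as np

def estimate_pns(x, y):
    """Plug-in estimate of PNS = P(Y=1 | X=1) - P(Y=1 | X=0),
    valid under exogeneity and monotonicity (Pearl, Causality, Ch. 9)."""
    x, y = np.asarray(x), np.asarray(y)
    p_y_given_x1 = y[x == 1].mean()
    p_y_given_x0 = y[x == 0].mean()
    return p_y_given_x1 - p_y_given_x0

# Toy data: Y = 1 whenever X = 1, and in 1 of 4 cases when X = 0.
x = np.array([1, 1, 1, 1, 0, 0, 0, 0])
y = np.array([1, 1, 1, 1, 1, 0, 0, 0])
print(estimate_pns(x, y))  # 1.0 - 0.25 = 0.75
```

Without the two conditions this quantity is only a bound-related surrogate, not PNS itself, which is why the talk spends slides 10-12 on when monotonicity and exogeneity are satisfied.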
Slide 12: Satisfying exogeneity under each assumption
1. Under assumption 1, the PNS risk directly satisfies exogeneity.
2. Under assumption 2, an additional independence constraint between V and C is required, e.g. an MMD penalty.
3. Under assumption 3, an additional conditional-independence constraint is required, e.g. the IRM constraint.

Slide 13: Generalization analysis
- Define the PNS risk on the unknown test domain and connect the source and test domains.

Slide 14: Generalization analysis
- Use a dataset from the source domain to evaluate the risk.

Slides 15-16: Failure cases of the learned PNS (optimization process)
- A small perturbation can change the prediction.
- Two regimes are distinguished: a failure case and a semantically separable case (the defining equations did not survive extraction).
- Under semantic separability it is worthwhile to evaluate the PNS risk; when semantic separability holds in the data, an additional constraint on the representation is needed to avoid learning a trivial PNS.
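The "MMD-like" independence constraint for assumption 2 can be sketched as a kernel two-sample penalty between representations drawn from different domains. The following is a generic biased RBF-kernel MMD estimator, not the paper's implementation; names and the choice of estimator are mine:

```python
import numpy as np

def rbf_kernel(a, b, sigma=1.0):
    # Pairwise RBF kernel values between the rows of a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(p, q, sigma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples p and q."""
    return (rbf_kernel(p, p, sigma).mean()
            + rbf_kernel(q, q, sigma).mean()
            - 2.0 * rbf_kernel(p, q, sigma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0.0, 1.0, (200, 2)), rng.normal(0.0, 1.0, (200, 2)))
diff = mmd2(rng.normal(0.0, 1.0, (200, 2)), rng.normal(3.0, 1.0, (200, 2)))
print(same < diff)  # the mismatched pair yields the larger penalty
```

Driving this penalty toward zero encourages the representation to match across domains, which is one way to enforce the independence the slide asks for.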
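One common instantiation of the "IRM constraint" mentioned for assumption 3 is the IRMv1 penalty of Arjovsky et al. ("Invariant Risk Minimization"): the squared gradient of each domain's risk with respect to a fixed dummy classifier scale w = 1. The squared-loss, scalar-feature special case below is my own simplification for illustration, not the talk's objective:

```python
import numpy as np

def irm_penalty(phi, y):
    """IRMv1-style penalty with squared loss: (d/dw R(w * phi) at w=1)^2.

    With R(w) = mean((w * phi - y)^2), the gradient at w = 1 has the
    closed form mean(2 * phi * (phi - y)).
    """
    phi, y = np.asarray(phi, dtype=float), np.asarray(y, dtype=float)
    grad = np.mean(2.0 * phi * (phi - y))
    return grad**2

# The penalty vanishes when the shared scale w = 1 is already optimal
# for this domain, and is positive otherwise.
print(irm_penalty([0.0, 1.0], [0.0, 1.0]))      # 0.0
print(irm_penalty([0.5, 1.0], [0.0, 1.0]) > 0)  # True
```

Summing this penalty over training domains penalizes features whose optimal classifier differs across domains, i.e. it pushes toward the conditional independence the slide requires.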
Slide 17: Final objective (optimization process)
- For each causal assumption, the corresponding additional constraint is added to the objective.

Slide 18: Experiment
- Can we learn the sufficient and necessary causes?

Slide 19: Experiment
- The OOD generalization ability.

Slide 20: Probable applications / future work
- Scenarios that need stability more than raw accuracy: autonomous driving, OOD generalization, domain adaptation, dynamic systems.
- Future work: more causal assumptions; more general cases.