Indexed In
  • Academic Journals Database
  • Open J Gate
  • Genamics JournalSeek
  • JournalTOCs
  • ResearchBible
  • Ulrich's Periodicals Directory
  • Electronic Journals Library
  • RefSeek
  • Hamdard University
  • EBSCO A-Z
  • OCLC-WorldCat
  • Scholarsteer
  • SWB online catalog
  • Virtual Library of Biology (vifabio)
  • Publons
  • MIAR
  • Geneva Foundation for Medical Education and Research
  • Euro Pub
  • Google Scholar
Abstract

Gradient Boosting as a SNP Filter: an Evaluation Using Simulated and Hair Morphology Data

Lubke GH, Laurin C, Walters R, Eriksson N, Hysi P, Spector TD, Montgomery GW, Martin NG, Medland SE and Boomsma DI

Typically, genome-wide association studies consist of regressing the phenotype on each SNP separately using an additive genetic model. Although statistical models for recessive, dominant, SNP-SNP, or SNP-environment interactions exist, the testing burden makes an evaluation of all possible effects impractical for genome-wide data. We advocate a two-step approach in which the first step consists of a filter that is sensitive to different types of SNP main and interaction effects. The aim is to substantially reduce the number of SNPs such that more specific modeling becomes feasible in a second step. We provide an evaluation of a statistical learning method called the “gradient boosting machine” (GBM) that can be used as a filter. GBM does not require an a priori specification of a genetic model, and permits inclusion of large numbers of covariates. GBM can therefore be used to explore multiple GxE interactions, which would not be feasible within the parametric framework used in GWAS. We show in a simulation that GBM performs well even under conditions favorable to the standard additive regression model commonly used in GWAS, and is sensitive to the detection of interaction effects even if one of the interacting variables has a zero main effect. The latter would not be detected in GWAS. Our evaluation is accompanied by an analysis of empirical data concerning hair morphology. We estimate the phenotypic variance explained by increasing numbers of highest-ranked SNPs, and show that it is sufficient to select 10K-20K SNPs in the first step of a two-step approach.
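The filtering idea described above can be illustrated with a minimal sketch: fit a gradient boosting model to genotype data without specifying a genetic model, then rank SNPs by variable importance and retain the top-ranked set for follow-up modeling. This is not the authors' implementation; it is a toy example using scikit-learn's `GradientBoostingRegressor` on simulated data, with all sample sizes, effect sizes, and tuning parameters chosen for illustration only (real SNP data would also follow Hardy-Weinberg genotype frequencies rather than the uniform coding used here). The simulated phenotype includes an interaction in which one SNP has no main effect, the case the abstract notes a standard GWAS scan would miss.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
n, p = 500, 200  # illustrative sample size and SNP count

# SNPs coded 0/1/2 (minor-allele counts); uniform coding for simplicity
X = rng.integers(0, 3, size=(n, p)).astype(float)

# Phenotype: main effect of SNP 0 plus a SNP 1 x SNP 2 interaction,
# where SNP 2 has no main effect of its own
y = 0.5 * X[:, 0] + 0.5 * X[:, 1] * X[:, 2] + rng.normal(size=n)

# Trees of depth 2 allow two-way interactions to enter a single tree,
# so no genetic model (additive, dominant, epistatic) is prespecified
gbm = GradientBoostingRegressor(
    n_estimators=300, max_depth=2, learning_rate=0.05,
    subsample=0.5, random_state=0,
)
gbm.fit(X, y)

# Step 1 of the two-step approach: rank SNPs by importance and keep the
# top k as the filtered set passed on to more specific modeling
k = 20
top = np.argsort(gbm.feature_importances_)[::-1][:k]
print(sorted(top.tolist()))
```

In this toy run the three causal SNPs (including SNP 2, whose effect is purely interactive) should appear among the top-ranked variants, mirroring the filter's intended behavior at genome-wide scale, where k would be on the order of 10K-20K rather than 20.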