Significant difference in p-values with dream() after updating variancePartition

Hi gabriel.hoffman,

Thanks for developing the variancePartition package. I have been relying on it for a while for any analysis requiring mixed models.

Recently I realized that results I generated a year ago are dramatically different with the updated version, using the exact same input and code. The data come from a bulk ATAC-seq experiment with 6 paired samples. Out of around 150k peaks, about 8,000 had FDR < 0.05 with version 1.28.5, while versions >= 1.33.11 gave 0. I'm wondering whether the changes in the scaled weights led to this, and if so, what kind of situation is most affected by this update.

What would be your recommended version, considering that the stable release on Bioconductor is still at 1.32.5?

Thanks! Le

    library(edgeR)
    library(variancePartition)
    library(BiocParallel)

    # Mixed model: Condition and covariates as fixed effects, random intercept per clone
    form <- ~ Condition + (1 | clone) + cov1 + cov2
    param <- SnowParam(6, "SOCK", progressbar = TRUE)

    # Build the DGEList and attach the CQN offsets
    edgeR_obj <- DGEList(count_matrix, group = covariates$Condition)
    edgeR_obj$samples$lib.size <- sizeFactors
    edgeR_obj <- scaleOffset(edgeR_obj, offset = cqn_edgeR)

    # Estimate precision weights under the mixed model
    vobjDream <- voomWithDreamWeights(edgeR_obj, form, covariates, BPPARAM = param)
    gc()

    # Fit the mixed model with Kenward-Roger degrees of freedom
    fitmmKR <- dream(vobjDream, form, covariates, ddf = "Kenward-Roger", BPPARAM = param)
    # fitmmKR <- dream(vobjDream, form, covariates, BPPARAM = param)

    fitmmKR <- eBayes(fitmmKR)
    results_DREAM_KR <- topTable(fitmmKR, coef = 2, number = Inf)

    sessionInfo()
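
The FDR < 0.05 counts mentioned above can be reproduced from the topTable output along these lines (a minimal check; it assumes the standard BH-adjusted adj.P.Val column that topTable returns):

    # Count peaks passing FDR < 0.05 (adj.P.Val is BH-adjusted by default)
    sum(results_DREAM_KR$adj.P.Val < 0.05)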
Answer (gabriel.hoffman):

See the variancePartition updates for v1.33.11 and v1.31.16.

I fixed some issues that affected only the "Kenward-Roger" method, and only for small sample sizes like you have here. The new version is correct, with improved power and better control of the false positive rate for small sample sizes.

It looks like your v1.28.5 is from Feb 24, 2023.
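
If you want to see directly how much the ddf choice matters for your data, here is a rough sketch (not from the package documentation, just reusing your vobjDream, form, covariates and param objects) that fits both methods and counts FDR-significant peaks under each:

    # Fit the same weighted object with Satterthwaite and Kenward-Roger ddf
    fit_sat <- dream(vobjDream, form, covariates, ddf = "Satterthwaite", BPPARAM = param)
    fit_kr  <- dream(vobjDream, form, covariates, ddf = "Kenward-Roger", BPPARAM = param)

    tab_sat <- topTable(eBayes(fit_sat), coef = 2, number = Inf)
    tab_kr  <- topTable(eBayes(fit_kr),  coef = 2, number = Inf)

    # Peaks passing FDR < 0.05 under each degrees-of-freedom method
    c(Satterthwaite = sum(tab_sat$adj.P.Val < 0.05),
      KenwardRoger  = sum(tab_kr$adj.P.Val < 0.05))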

Thanks for your reply, gabriel.hoffman!

I've been trying to isolate the change: v1.32.5 gives the same results as v1.33.11, while v1.30.0 matches v1.28.5. So it's probably the changes made in v1.31.16 that matter when Kenward-Roger is used. dream() without Kenward-Roger in the older versions gave results similar to the new package.

Is there a way to tell whether dream() discarded KR? And is there a way to adjust my model or other parameters so that KR is kept?
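
In case it helps, this is the kind of check I have in mind: refit with each ddf method and compare the estimated residual degrees of freedom. This is only a sketch and assumes the dream() fit stores per-feature residual degrees of freedom in df.residual, as limma-style fit objects do; please correct me if that is not the right slot to look at.

    # Refit the same weighted object with each degrees-of-freedom method
    fit_kr  <- dream(vobjDream, form, covariates, ddf = "Kenward-Roger", BPPARAM = param)
    fit_sat <- dream(vobjDream, form, covariates, ddf = "Satterthwaite", BPPARAM = param)

    # If KR was silently dropped, these two summaries should look essentially
    # identical; clear differences would suggest KR was actually applied.
    summary(as.vector(fit_kr$df.residual))
    summary(as.vector(fit_sat$df.residual))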

Thanks! Le
