limma-trend with random effects and/or array weights
maltethodberg
@maltethodberg-9690
Denmark

I am doing differential expression on a fairly large RNA-Seq dataset, where multiple organs are obtained from the same mice. I would like to model the effects of individual mice using limma's random effects via duplicateCorrelation. As library sizes are very consistent across all samples (and I might want to play around with genas), I was thinking of using limma-trend.

I've seen some posts on using voom and voomWithQualityWeights together with duplicateCorrelation, but nothing on limma-trend.

Given that a standard limma-trend pipeline goes as follows (starting from a count matrix EM and a design matrix mod):

library(edgeR)
library(limma)

dge <- DGEList(EM)                       # counts
dge <- calcNormFactors(dge)              # TMM normalization factors
v <- cpm(dge, log=TRUE, prior.count=3)   # logCPM values for limma-trend
fit <- lmFit(v, design=mod)
eb <- eBayes(fit, trend=TRUE, robust=TRUE)
decideTests(eb)
topTable(eb)
...

How would this pipeline be modified to accommodate duplicateCorrelation() and/or arrayWeights()?

Tags: limma-trend, duplicatecorrelation, arrayweights, limma
@gordon-smyth
WEHI, Melbourne, Australia

The reason there aren't any posts is that there's no special issue; everything just works as usual:

aw <- arrayWeights(v, design=mod)
corfit <- duplicateCorrelation(v, design=mod, block=mouse, weights=aw)
fit <- lmFit(v, design=mod, weights=aw, block=mouse, correlation=corfit$consensus)
eb <- eBayes(fit, trend=TRUE, robust=TRUE)

Note also that you should filter low expression genes (for example with edgeR's filterByExpr) before running cpm() and before any of the above.
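A minimal sketch of that filtering step with edgeR's filterByExpr(), using the EM and mod names from the question (default filtering settings shown; the exact thresholds are a judgment call, not a prescription):

```r
library(edgeR)

dge <- DGEList(EM)
keep <- filterByExpr(dge, design=mod)       # genes with enough counts to be worth testing
dge <- dge[keep, , keep.lib.sizes=FALSE]    # drop low-expression genes, recompute library sizes
dge <- calcNormFactors(dge)                 # TMM normalization after filtering
v <- cpm(dge, log=TRUE, prior.count=3)      # logCPM values for limma-trend
```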

@ryan-c-thompson-5618
Icahn School of Medicine at Mount Sinai…

The issues that occur when mixing voom with arrayWeights and/or duplicateCorrelation happen because the voom step comes before the arrayWeights or duplicateCorrelation steps, but in order to be fully accurate, the voom step needs information from those other steps. The solution implemented in voomWithQualityWeights is essentially to alternate running voom and arrayWeights until convergence. (In practice, convergence is pretty much guaranteed after 2 runs of each, so that's what voomWithQualityWeights actually does instead of checking any convergence criterion.) I've written a helper function to do something similar with voom and duplicateCorrelation here: https://github.com/DarwinAwardWinner/CD4-csaw/blob/e6de96bcefcb3fcd928ea972dc4bc2d511e6ddbd/scripts/utilities.R#L392-L445. Note that I have not thoroughly investigated the convergence behavior of this function, but I'm pretty sure it should converge. If you use it, you should audit the code to make sure it's doing what you expect.
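The alternating scheme described above can be sketched as a fixed two-pass loop, analogous to what voomWithQualityWeights does internally (the names dge, mod and mouse are assumptions carried over from the question, and this is a sketch of the general idea rather than the linked helper function):

```r
library(limma)

# pass 1: run voom ignoring the within-mouse correlation, then estimate it
v <- voom(dge, design=mod)
corfit <- duplicateCorrelation(v, design=mod, block=mouse)

# pass 2: re-run voom using that correlation, then re-estimate the correlation
v <- voom(dge, design=mod, block=mouse, correlation=corfit$consensus)
corfit <- duplicateCorrelation(v, design=mod, block=mouse)

# final fit uses the weights from the second voom and the second correlation estimate
fit <- lmFit(v, design=mod, block=mouse, correlation=corfit$consensus)
eb <- eBayes(fit)
```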

However, if you are using limma-trend instead of voom, I don't think there is any issue. The eBayes step comes after both arrayWeights and duplicateCorrelation, so there is no chicken-and-egg problem as there is with voom. After computing logCPM values, you just proceed as you would for microarray data, as detailed in the manual.

I suppose you might still need to use the trick of iterating back and forth until convergence if you want to use both arrayWeights and duplicateCorrelation in the same pipeline. (Edit: never mind, this would only be needed for voomWithQualityWeights instead of arrayWeights.)


+1, but regarding the last sentence: there's no need to iterate arrayWeights() and duplicateCorrelation(). arrayWeights() doesn't even take correlation as an argument.

Also, I don't recommend iterating duplicateCorrelation() and voom(). I don't do that myself. Sure you can, but I don't see any real advantage in doing so.


In line with the original question, which sequence of function calls would you use in that case for combining duplicateCorrelation with voom/voomWithQualityWeights?
