Aaron Mackey
Is VST normalization (à la DESeq) considered the right way to deal with large-scale differences in mean "baseline" expression across experimental blocks? Is there a normalization method that can take the design matrix (or at least the batch/block columns) into account? I don't want to remove the batch/block effects, but TMM and friends all assume near-constant expression across the design, which is violated by our (nuisance) block-level differences in composition. We see this when we compare edgeR TMM-normalized log(cpm) to qRT-PCR data: the TMM normalization has smoothed out the block differences that the Ct values still exhibit (though cpm and Ct remain strongly correlated, there is a Ct "shift" for each block that is not seen in the cpm).
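To make the concern concrete, here is a minimal sketch (in Python/numpy, with a median-of-ratios scaling used as a simplified stand-in for TMM/DESeq size factors; the gene counts and the 2x shift are invented for illustration) of how such normalization absorbs a genuine block-level shift:

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes = 2000
base = rng.lognormal(mean=5.0, sigma=1.0, size=n_genes)

# Block A at baseline; block B with most genes genuinely ~2x higher
# (a real block-level difference, not a library-size artifact).
block_a = rng.poisson(base).astype(float)
block_b = rng.poisson(base * 2.0).astype(float)

# Median-of-ratios scaling, a simplified stand-in for TMM/DESeq size
# factors. It assumes most genes do NOT change between samples, so a
# genome-wide shift gets folded into the scale factor.
ratios = (block_b + 0.5) / (block_a + 0.5)
size_factor = np.median(ratios)

normalized_b = block_b / size_factor
# After scaling, the ~2x block shift is gone: normalized block B sits
# on top of block A, i.e. the block difference has been "smoothed out"
# even though it was biologically real.
```

Here `size_factor` comes out near 2, so the entire block effect is removed by normalization, which is exactly the behavior at issue: a qRT-PCR Ct shift between blocks would survive, but the normalized cpm would not show it.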
Thanks in advance for any insights/thoughts on the issue,
-Aaron