I recently acquired a metatranscriptomics dataset. It contains ~1 million ORFs, which strict filtering can reduce to ~100 thousand ORFs. The dataset has a treatment group (N=49) and a control group (N=34), for a total of 83 samples.
I have access to compute resources which I can use for WGCNA.
I only need WGCNA for the following three steps (sketched below):
- picking a soft-thresholding power;
- calculating the TOM (topological overlap matrix);
- cutting the resulting dendrogram into modules.
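For reference, the single-block workflow I have in mind looks roughly like this. This is only a minimal sketch: `datExpr` is a placeholder for my samples x ORFs expression matrix (83 x ~100k), and the `"signed"` network type and cut parameters are just assumptions, not settled choices.

```r
## Minimal single-block sketch (datExpr: samples x ORFs numeric matrix).
## Note: a dense 100k x 100k double-precision TOM alone needs roughly
## 100000^2 * 8 bytes ~ 80 GB of RAM, before intermediate copies.
library(WGCNA)
enableWGCNAThreads()  # use multiple cores where WGCNA supports it

## 1. Pick a soft-thresholding power
powers <- c(1:10, seq(12, 20, by = 2))
sft <- pickSoftThreshold(datExpr,
                         powerVector = powers,
                         networkType = "signed",
                         verbose     = 2)
softPower <- sft$powerEstimate

## 2. Adjacency and TOM
adj     <- adjacency(datExpr, power = softPower, type = "signed")
TOM     <- TOMsimilarity(adj, TOMType = "signed")
dissTOM <- 1 - TOM

## 3. Cluster and cut the dendrogram into modules
geneTree <- hclust(as.dist(dissTOM), method = "average")
modules  <- cutreeDynamic(dendro            = geneTree,
                          distM             = dissTOM,
                          deepSplit         = 2,
                          pamRespectsDendro = FALSE,
                          minClusterSize    = 30)
```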
I've looked at the blockwiseModules approach, but it seems strange that it produces multiple dendrograms (one per block) that aren't directly comparable. Is blockwiseModules the recommended approach for a dataset of this size? What other approaches are there for running WGCNA on a dataset this large?
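For comparison, my understanding is that the blockwise route would look something like the sketch below. The parameter values (power, maxBlockSize, module-size and merge settings) are placeholders I'd still need to tune, not recommendations.

```r
## Sketch of the blockwise alternative (parameter values are placeholders).
## blockwiseModules() pre-clusters genes into blocks of at most maxBlockSize,
## then builds a TOM and dendrogram separately within each block,
## which is where the multiple block-specific dendrograms come from.
library(WGCNA)
enableWGCNAThreads()

net <- blockwiseModules(datExpr,
                        power          = 6,      # would use sft$powerEstimate
                        networkType    = "signed",
                        TOMType        = "signed",
                        maxBlockSize   = 20000,  # limited by available RAM
                        minModuleSize  = 30,
                        mergeCutHeight = 0.25,
                        numericLabels  = TRUE,
                        saveTOMs       = FALSE,
                        verbose        = 3)

table(net$colors)  # module sizes per numeric label
```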