Hi,
I am having trouble with memory usage and the SWATH2stats R package. Specifically, when I attempt to split the transition groups into single transitions (in order to go on to use MSstats or MapDIA), using:
data.transition <- disaggregate(data.filtered)
I receive:
Error: cannot allocate vector of size 7.0 Gb
The data were processed through the OpenSwathWorkflow and report ~4,000 protein ID hits, ~16,000 unique peptides, and ~70,000 transitions. I have filtered and reduced the data using SWATH2stats, but since this is a global profiling experiment I do expect relatively high numbers.
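For context, my pre-processing before disaggregate() roughly follows the SWATH2stats vignette; the sketch below shows the kind of calls I ran (the 0.01 m-score threshold and the study_design annotation table are just placeholders here, not my exact settings):

library(SWATH2stats)

# Rough outline of my filtering/reduction (placeholder thresholds):
data.annotated <- sample_annotation(data, study_design)    # study_design = my sample annotation table
data.filtered  <- filter_mscore(data.annotated, 0.01)      # m-score cutoff
data.filtered  <- filter_proteotypic_peptides(data.filtered)
data.filtered  <- reduce_OpenSWATH_output(data.filtered)   # drop columns not needed downstream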
Aside from increasing RAM (I am currently running 64-bit R on Windows 7 with 32 GB RAM), are there any workarounds within SWATH2stats for data of this size, or would an alternative tool be better suited?
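One workaround I have been wondering about (untested, and assuming disaggregate() behaves the same on a per-protein subset) is to split the filtered table by ProteinName, disaggregate each chunk separately, and bind the results:

# Untested sketch: chunk-wise disaggregate to avoid one huge allocation
chunks <- split(data.filtered, data.filtered$ProteinName)
pieces <- lapply(chunks, disaggregate)
data.transition <- do.call(rbind, pieces)
rm(chunks, pieces); gc()   # free the intermediate objects

Would something like that be valid, or does disaggregate() need to see the whole table at once?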
Any advice or tips appreciated!
Jess
Thanks so much for the quick reply, I will give the Python script a go.
Best,
Jess