I'm using 64-bit R on Windows 10, and my current memory.limit() is 16287. I'm working with mass spectra files (mzXML). I've been reading in individual files one at a time using the code below, which increases my memory.size() to 7738.28. Then I filter out noise using basic R functions and plot the EICs.
library(xcms)
## Read one mzXML file, building the profile matrix with 0.01 m/z bins
mzxcms <- xcmsRaw(datafile1, profstep = 0.01, profmethod = "bin", profparam = list(),
                  includeMSn = FALSE, mslevel = NULL, scanrange = NULL)
## Extract the EIC for the target mass and plot intensity vs. retention time
EIC <- rawEIC(mzxcms, mass)
RT <- mzxcms@scantime
plot(RT, EIC[[2]], type = "l")
Then I remove all the variables I created from my global environment using rm(list = ls()).
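For reference, the cleanup I run between files looks roughly like this (a minimal sketch; the explicit gc() call just asks R to collect garbage and report the freed memory):

## Clear the workspace and trigger garbage collection between files
rm(list = ls())
gc()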
But when I try to read in a new mzXML file, R tells me it cannot allocate a vector of size **Gb. When this error showed up, I checked my memory.size(), which was 419.32, and I also ran gc() to confirm that the used memory (on the Vcells row) is on the same order as the number I see when I first open a new R session and type gc().
I couldn't find any information on why R still thinks that something is taking up a lot of memory when the environment is completely empty. But if I terminate the session and reopen the program, I can import the data file. So it seems like some memory is still being used even when the environment is empty, and it keeps taking up space until I terminate the session. Does anyone have a similar experience or a suggestion as to why this is happening?
Two things:

1) Please change to the new functions/objects of xcms (see the xcms vignette).

2) profstep = 0.01 will create a huge profile matrix (which in your case will not fit into memory). The question is whether you really need that fine-grained binning. Also, have a look at the vignette from the point above: it is now much easier (and less memory demanding) to extract EICs, as in the sketch below.

Hope this helps.
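A minimal sketch of that newer, on-disk workflow (readMSData() is from MSnbase, chromatogram() from xcms; the m/z window below is just a made-up example, adjust it to your target mass):

library(xcms)
library(MSnbase)

## Read the mzXML file in "onDisk" mode: spectra stay on disk and no
## profile matrix is built, which keeps the memory footprint small
raw <- readMSData(datafile1, mode = "onDisk")

## Extract an EIC for an example m/z window (hypothetical values) and plot it
eic <- chromatogram(raw, mz = c(300.05, 300.15))
plot(eic)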