Dear fellow Bioconductors,
Good afternoon.
I am running into an error when R tries to allocate vectors larger
than a certain size. The puzzling part is that this size is well
below the amount of RAM installed on the system.
Following some previous threads on the topic, I first thought my RAM
might have been temporarily unavailable, so I monitored memory usage:
the error message appeared when only 2.5 GB of the roughly 7 GB
available were in use.
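(In case it helps: besides watching top, a minimal sketch of how one
can check from within R itself, with Expression as defined in the
code below; gc() reports R's own allocations.)
>gc()                      # R's used / trigger sizes, in Mb
>object.size(Expression)   # size in bytes of one large object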
The system runs 64-bit Ubuntu Hardy Heron on a Xeon CPU with 8 GB of
RAM.
Below I paste my code and my sessionInfo() output.
# Loading my microarray data (44 Illumina human whole-genome arrays)
>library(lumi)
>fileName <- "MyExperiment_no_32_no_tech_duplos.csv"
>library(lumiHumanAll.db)
>Data.Immature <- lumiR(fileName, lib="lumiHumanAll.db")
>Data <- lumiExpresso(Data.Immature, QC.evaluation=TRUE)
# Extract the expression matrix from the lumi object:
>Expression <- exprs(Data)
>names(Expression) <- Data@phenoData@data$sampleID
# Calculate the distance matrix for cluster analysis
>distance <- dist(Expression, method="euclidean")
Error: cannot allocate vector of size 1.8 Gb
> sessionInfo()
R version 2.10.1 (2009-12-14)
i486-pc-linux-gnu
locale:
[1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C
[3] LC_TIME=en_US.UTF-8 LC_COLLATE=en_US.UTF-8
[5] LC_MONETARY=C LC_MESSAGES=en_US.UTF-8
[7] LC_PAPER=en_US.UTF-8 LC_NAME=C
[9] LC_ADDRESS=C LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] lumiHumanAll.db_1.8.1 org.Hs.eg.db_2.3.6 lumi_1.12.4
[4] MASS_7.3-5 RSQLite_0.8-3 DBI_0.2-5
[7] preprocessCore_1.8.0 mgcv_1.6-1 affy_1.24.2
[10] annotate_1.24.1 AnnotationDbi_1.8.1 Biobase_2.6.1
loaded via a namespace (and not attached):
[1] affyio_1.14.0      grid_2.10.1        lattice_0.18-3     Matrix_0.999375-37
[5] nlme_3.1-96        tools_2.10.1       xtable_1.5-6
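For what it is worth, the 1.8 Gb in the error message is about what
one would expect if dist() allocates the whole lower triangle of the
distance matrix as a single vector of doubles, i.e. n*(n-1)/2 values
for n input rows. A quick back-of-the-envelope check (the row count
of 22000 is my assumption, back-calculated from the error):
>n <- 22000                     # assumed number of rows of Expression
>n * (n - 1) / 2 * 8 / 1024^3   # lower triangle, 8 bytes per double, in Gb
This comes out just above 1.8, matching the error.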
Reading this page (
http://stat.ethz.ch/R-manual/R-patched/library/base/html/Memory.html )
I have come to hypothesize that there may be a vector size limit
independent of RAM availability, but I wonder whether that holds on a
64-bit machine, and how I could work around the problem:
1) Any suggestion on how to raise the vector size limit (if one
exists)?
2) Any suggestion on how to force the dist computation to work on
smaller vectors, block by block? (Something along the lines of the
sketch below is what I have in mind.)
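To illustrate question 2, here is the kind of blockwise computation I
mean; blockDist() is my own sketch, not an existing lumi/stats
function, and I have not verified that it sidesteps the allocation
limit:
blockDist <- function(x, rows) {
    ## Euclidean distances between x[rows, ] and every row of x,
    ## returned as a length(rows) x nrow(x) matrix, so no single
    ## vector ever holds the full lower triangle
    xr <- x[rows, , drop = FALSE]
    d2 <- outer(rowSums(xr^2), rowSums(x^2), "+") - 2 * tcrossprod(xr, x)
    sqrt(pmax(d2, 0))   # guard against small negative rounding errors
}
## e.g. 1000 rows at a time, saving each block to disk:
## for (i in seq(1, nrow(Expression), by = 1000)) {
##     rows <- i:min(i + 999, nrow(Expression))
##     blk <- blockDist(Expression, rows)
##     save(blk, file = sprintf("dist_block_%05d.RData", i))
## }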
Thank you in advance for your attention and for any feedback.
All the best, Marco
--
Marco Manca, MD
University of Maastricht
Faculty of Health, Medicine and Life Sciences (FHML)
Cardiovascular Research Institute (CARIM)
Mailing address: PO Box 616, 6200 MD Maastricht (The Netherlands)
Visiting address: Experimental Vascular Pathology group, Dept of
Pathology - Room 5.08, Maastricht University Medical Center, P.
Debeijelaan 25, 6229 HX Maastricht
E-mail: m.manca@path.unimaas.nl
Office telephone: +31(0)433874633
Personal mobile: +31(0)626441205
Twitter: @markomanka