Roel Verhaak
Hi,
I had the same opportunity and ran into the same problem. The error
occurred when trying to read more than 25 CEL files with ReadAffy (from
the affy package) and happened in R 1.9 as well as R 2.0. The problem
seemed to be due to an error in the Tcl/Tk package and might relate to
the Tcl/Tk installation on our Irix machine. We did not completely
track it down, as we then discovered that we were using Bioconductor
1.4 (which actually worked on a second Irix machine with R 1.9).
Upgrading to version 1.5 solved all our problems.
Bottom line: make sure you use R 2.0 and Bioconductor 1.5, and install
the most recent packages available.
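
A quick way to verify this from the R prompt (just a sketch; ReadAffy
is the function provided by the affy package):

    R.version.string                          # should report R 2.0.x or later
    installed.packages()["affy", "Version"]   # affy release matching Bioconductor 1.5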
Regards,
Roel Verhaak
> Tae-Hoon Chung <thchung@tgen.org> writes:
>
>> Hi, all;
>>
>> I know there have been many discussions on memory usage in R.
>> However, I have an odd situation here. Basically, I have a rare
>> opportunity to run R on a system with 64 GB of memory and no limit
>> on memory usage for any person or process. However, I encountered a
>> memory error message like this:
>>
>> Error: cannot allocate vector of size 594075 Kb
> ....
>> Although I have no idea how memory allocation in R works, apparently
>> something is wrong here. The problem must have nothing to do with
>> physical memory. My question is this: is the memory problem due to a
>> non-optimal configuration of memory usage? If so, what would the
>> optimal configuration be? If not, there must be a problem in the
>> actual implementation of the functions I used, right? The reason I
>> am asking is that, according to the reference manual, the error
>> message I got can be raised for roughly three reasons: first, when
>> the system is unable to provide the memory R requested; second, when
>> the requested memory size exceeds the address-space limit for a
>> process; and finally, when the length of a vector is larger than
>> 2^31 - 1.
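
A quick way to see what R has actually allocated when this happens
(just a sketch):

    gc()                        # Ncells/Vcells in use and the current gc triggers
    object.size(numeric(1e6))   # ~8 MB: a double vector costs 8 bytes per element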
>
> Hmm, the length issue should not kick in before the length exceeds 2
> billion or so, and you are not beyond 75 or 150 million (counting 8
> or 4 bytes per element).
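
For reference, the arithmetic behind those figures (assuming Kb in the
error message means 1024 bytes):

    594075 * 1024 / 8   # ~76 million doubles at 8 bytes each
    594075 * 1024 / 4   # ~152 million integers at 4 bytes each
    2^31 - 1            # the vector length limit: ~2.1 billion elements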
>
>> I wonder whether the problem has anything to do with the third case.
>> (If so, then I think I am hopeless unless the internal
>> implementations change...)
>
> Well, revolutionaries often find themselves just below the cutting
> edge...
>
> Just a sanity check: this is using a 64-bit compiled R on a 64-bit
> operating system, right?
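
One way to check that from within R itself (a sketch):

    .Machine$sizeof.pointer   # 8 on a 64-bit build of R, 4 on a 32-bit build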
>
> --
> O__ ---- Peter Dalgaard Blegdamsvej 3
> c/ /'_ --- Dept. of Biostatistics 2200 Cph. N
> (*) \(*) -- University of Copenhagen Denmark Ph: (+45) 35327918
> ~~~~~~~~~~ - (p.dalgaard@biostat.ku.dk) FAX: (+45) 35327907