How to deal with a FASTQ file larger than 30 GB
wang peter ★ 2.0k
---------- Forwarded message ----------
From: wang peter <wng.peter@gmail.com>
Date: Wed, Oct 5, 2011 at 3:44 PM
Subject: how to deal with Fastq file more than 30G
To: bioc-sig-sequencing@r-project.org

It is too slow to read these files into memory. Can anyone tell me whether I need to split them with another program, or whether I can call some R function to split them? Thanks.
Steve Lianoglou
Hi,

On Wed, Oct 5, 2011 at 3:45 PM, wang peter <wng.peter@gmail.com> wrote:
> It is too slow to read these files into memory. Can anyone tell me whether I need to split them with another program, or whether I can call some R function to split them?

In the *nix universe, you'll find a command-line program coincidentally named "split" that can do this for you.

Another option is to reconsider whether or not you really need to load all of your reads into memory to do whatever it is you are doing; if I were a gambling man, I'd put money on "no".

-steve

--
Steve Lianoglou
Graduate Student: Computational Systems Biology
 | Memorial Sloan-Kettering Cancer Center
 | Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact
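
For what it's worth, a minimal sketch of the second suggestion, assuming a reasonably recent ShortRead package is installed: its FastqStreamer()/yield() interface reads a FASTQ file in fixed-size chunks, so only one chunk is ever in memory. The file name, chunk size, and per-chunk "work" below are placeholders, not anything from the original thread.

    library(ShortRead)

    ## Alternatively, split the file outside R first; FASTQ records are 4
    ## lines each, so use a line count that is a multiple of 4, e.g.:
    ##   split -l 4000000 reads.fastq reads_part_

    ## Stream the file in chunks of 1e6 reads (hypothetical file name).
    strm <- FastqStreamer("reads.fastq", n = 1e6)
    nreads <- 0
    repeat {
        fq <- yield(strm)               # next chunk as a ShortReadQ object
        if (length(fq) == 0) break      # no reads left
        nreads <- nreads + length(fq)   # replace with your real per-chunk work
    }
    close(strm)
    nreads

The same loop works for filtering or trimming: process each chunk and append the result to an output file with writeFastq(..., mode = "a"), so the full 30 GB file never has to fit in memory at once.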