JohnMansell
Joined: 10 May 2006  Posts: 18  Location: Darlington
Posted: Thu Oct 04, 2007 8:51 am  Post subject: Large sequential binary files
Is there a size limit for these? Are there any issues (apart from program design)? I'm currently running a program which has generated a 2.45 GByte scratch file. It's now in a part of the program in which the scratch file is subject to a REWIND fairly often, followed by reading through the file. Program progress seems very slow. I set this running yesterday, so I reckon it's been running for 16 hours.
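A minimal sketch of the access pattern described above; the unit number, record layout and pass count are all illustrative, not the actual program:

```fortran
! Sketch of the rewind-and-scan pattern described above.
! Unit number, buffer size and pass count are hypothetical.
program rewind_scan
  implicit none
  integer, parameter :: scratch = 10
  real :: buffer(1024)
  integer :: ios, pass

  open(unit=scratch, status='scratch', form='unformatted', &
       access='sequential')

  ! ... writes that build the large scratch file go here ...

  do pass = 1, 5                 ! each pass rereads the whole file
    rewind(scratch)
    do
      read(scratch, iostat=ios) buffer
      if (ios /= 0) exit         ! end of file ends this pass
      ! ... process buffer ...
    end do
  end do

  close(scratch)
end program rewind_scan
```

With a multi-gigabyte file, each such pass is a full re-read from disk, which is why many passes make progress look slow.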
|
|
|
JohnCampbell
Joined: 16 Feb 2006  Posts: 2554  Location: Sydney
Posted: Fri Oct 05, 2007 1:12 am
|
|
John,
I'm not aware of a limit on file size; I've generated files of up to 6 GB in the past.
I have addressed this problem before by writing a set of routines to control the I/O to the file. I counted the number of reads, writes, and bytes read and written (using INTEGER*8 or REAL*10 counters). Whenever an OPEN, CLOSE, REWIND or end of file is encountered, I report the statistics to the screen with a date/time stamp. Given the size of the file, 2.6 GB at about 30 MB/second means it will take about 100 seconds to read or write each pass, so the frequency of reporting should not be too great.
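An instrumented read wrapper of the kind described above might look like this; the module and routine names are illustrative, not the poster's actual code, and only the read side is shown:

```fortran
! Sketch of an I/O wrapper that counts calls and bytes and reports
! running totals with a date/time stamp, as described above.
! All names are hypothetical; the write side would mirror the read side.
module io_stats
  implicit none
  integer(8) :: n_reads = 0, bytes_read = 0
contains
  subroutine counted_read(unit, buffer, ios)
    integer, intent(in)  :: unit
    real,    intent(out) :: buffer(:)
    integer, intent(out) :: ios
    read(unit, iostat=ios) buffer
    if (ios == 0) then
      n_reads    = n_reads + 1
      bytes_read = bytes_read + 4_8 * size(buffer)   ! 4 bytes per default real
    end if
  end subroutine counted_read

  ! Call this from the OPEN, CLOSE, REWIND and end-of-file points.
  subroutine report_stats
    character(8)  :: d
    character(10) :: t
    call date_and_time(date=d, time=t)
    print '(a,1x,a,a,i0,a,i0)', d, t, ': reads=', n_reads, &
          ', bytes read=', bytes_read
  end subroutine report_stats
end module io_stats
```

The 64-bit counters matter here: a 2.6 GB file overflows a default INTEGER*4 byte count in a single pass.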
You did not say whether the file is sequential or direct access. BACKSPACE on a sequential file may be a big problem; I've never tried it.
The alternative is to use the /3GB option and then read all the data into memory!
I hope this helps
John
|
|
|
JohnMansell
Joined: 10 May 2006  Posts: 18  Location: Darlington
Posted: Wed Oct 10, 2007 9:25 am
|
|
Apologies for the delay in replying; I had a long weekend in the Dales.
The job eventually ran to completion, in days rather than hours. So I reckon that, since memory is cheap, reducing file I/O by holding more results in memory is now the way to go. The current strategy was drawn up to suit the DEC10 and VAX, so it's time to look at this again.
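The hold-results-in-memory approach could be sketched as follows, assuming the scratch data fits in a single allocatable array (record length and count are hypothetical):

```fortran
! Sketch of replacing repeated file passes with an in-memory copy.
! reclen and nrec are illustrative; real values come from the program.
program in_memory
  implicit none
  integer, parameter :: reclen = 1024
  real, allocatable :: store(:,:)
  integer :: nrec, i, ios

  nrec = 10000                       ! illustrative record count
  allocate(store(reclen, nrec))

  ! One sequential pass would load everything, e.g.:
  !   do i = 1, nrec
  !     read(scratch, iostat=ios) store(:, i)
  !   end do

  ! ...after which every "rewind and reread" becomes an array scan:
  do i = 1, nrec
    ! ... process store(:, i) ...
  end do

  deallocate(store)
end program in_memory
```

At 2.45 GB the data would need the large-address-space option mentioned above (or a 64-bit build), but each former file pass then runs at memory speed.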
|