forums.silverfrost.com Welcome to the Silverfrost forums
Ralf
Joined: 30 Aug 2007 Posts: 50 Location: Munich
Posted: Mon Apr 27, 2009 9:29 am Post subject: integer*2 limitation for files@
I am using FILES@ to create a list of the files in the current directory with a certain extension. Because the parameter NMAX is declared INTEGER*2, this feature is limited to 32767 files. Is there a similar routine without this limit?
Regards,
Ralf
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Mon Apr 27, 2009 12:33 pm Post subject:
I cannot think of an alternative, though it would be quite easy to add a new routine to the library.
If you know C/C++, you could write your own function using the Windows API routines FindFirstFile and FindNextFile.
JohnCampbell
Joined: 16 Feb 2006 Posts: 2554 Location: Sydney
Posted: Tue Apr 28, 2009 3:57 am Post subject:
NMAX is the limit on the number of entries in the selected local directory only.
I successfully use FILES@ on network disks with hundreds of thousands of files, though limited to 12,000 files per directory. I should collect statistics on the maximum number of entries recovered in a single call. This is a very useful routine for tracking files on a disk.
My routine is:
Code: |
! nsize.inc
! Last change: JDC 13 Mar 2008 3:29 pm
integer*4, parameter :: max_rec = 12000 ! max active files in tree
integer*4, parameter :: max_chr = 320 ! max characters in a tree name
integer*4, parameter :: max_cht = 192 ! max characters in a tree list name
!
! directory search information
common /filinf/ max_use, & ! max number of records used
max_name_len, & ! max string length
max_tree_lev, & ! max tree level searched
n_h, s_h, & ! hidden file summary info
n_d, s_d, & ! directory summary info
n_f, s_f, & ! file summary info
n_bf, & ! count of big files
slack_space, & ! slack space estimate
echo_mask(8), & ! print flagged files
dump_dir, dump_file,& ! listing options
min_file_size, & ! listing limit
big_file_size, & ! listing limit
max_file_size, & ! listing limit
local, & ! percolate date
min_date, max_date, & ! date option
block_size, & ! disk allocation block size
lu, lu_map(0:3), & ! screen unit
last_option, &
last_option_used, &
!
attr(max_rec), & ! file attributes
date(max_rec), & ! DOS date stamp
time(max_rec), & ! DOS time stamp
file_size(max_rec) ! file size (bytes)
common /filinc/ file_name(max_rec) ! file name buffer
!
integer*2 max_use, max_name_len, max_tree_lev, min_date, max_date
character last_option*80
integer*4 n_h,n_d,n_f,n_bf, lu, lu_map
integer*8 block_size
real*8 s_h,s_d,s_f, slack_space
logical echo_mask, dump_dir, dump_file, local, last_option_used
!
integer*2 attr, date, time
!file8@ real*8 file_size
integer*4 file_size
integer*8 min_file_size, big_file_size, max_file_size
character file_name*320
!
! end of nsize.inc
|
Last edited by JohnCampbell on Tue Apr 28, 2009 8:51 am; edited 1 time in total
JohnCampbell
Joined: 16 Feb 2006 Posts: 2554 Location: Sydney
Posted: Tue Apr 28, 2009 4:03 am Post subject:
Shame about the message size limit!
Preview actually worked.
Attached is the key bit of code:
Code: |
!
nmax = max_rec-next ! entries available in scan arrays
call files8@ (local_dir, & ! C320 directory to scan
n, & ! I2 number of entries found
nmax, & ! I2 capacity of buffers
file_name(next), & ! C320 tree name of files found
attr(next), & ! I2 DOS attributes
date(next), & ! I2 DOS date format
time(next), & ! I2 DOS time format
file_size(next) ) ! I4 File size in bytes
max_use = max (max_use, next+n)
max_tree_lev = max (max_tree_lev, scan_level)
!
do 100 i = next, next+n-1
nc = len_trim (file_name(i))
max_name_len = max (max_name_len,nc)
!
this_file_size = file_size(i)
used_block = (this_file_size+block_size-one) / block_size
if (used_block < one) used_block = one
used_size = used_block * block_size
waste = used_size - this_file_size
!
! report unusual files - other than archive attribute
!
call get_file_type (file_name(i), attr(i), type_flag, type_label, ext, this_file_size)
!
call echo_file_type (file_name(i), type_flag, type_label, ext, &
i, this_file_size, min_file_size, &
max_file_size, big_file_size, &
date(i), time(i), min_date, max_date, &
echo_mask, dump_dir, dump_file)
!
if (type_flag(4).gt.0) then
!
! 8 volume name or directory pointers [..] or [.] - ignore
cycle
!
else if (type_flag(5).gt.0) then
!
! 16 sub directory
!
call dir_size (file_name(i), sub_local_size, sub_tree_size, sub_tree_num, &
next+n, scan_level+1, sub_local_date)
!
sl = sub_local_size/1.d06
st = sub_tree_size/1.d06
date_string = dos_date (sub_local_date)
!
|
I checked a network drive with 4,487 GB of data in 3,403,091 local files:
13,566 maximum active tree entries (max_rec set to 15,000)
8,515 maximum local directory entries
so 32,000 is more than enough for the type of directories I see.
John
Ralf
Joined: 30 Aug 2007 Posts: 50 Location: Munich
Posted: Tue Apr 28, 2009 12:47 pm Post subject:
Thanks for your replies!
This is not an urgent topic for me, because we can probably live with this limit for a long time.
I am just trying to remove some restrictions from my program, and it is possible that a user may reach this limit when evaluating calculations over a wide frequency range (in which case these files will all be located in the same directory).
Ralf