DanRRight
Joined: 10 Mar 2008 Posts: 2816 Location: South Pole, Antarctica
Posted: Thu Sep 03, 2009 9:55 pm Post subject: Need a function resembling READ_URL |
You probably know this very cool ClearWin+ enhancement which lets you get data from the internet, communicate with your server by sending it requests, grab data from trading software, or even build your own automated trading system or launch DoS (denial of service) attacks on servers (that last one in *one single line of Fortran text*! ))
!----------------
SUBROUTINE READ_URL@(URL,FILE,MODE,ERROR)
CHARACTER*(*) URL
CHARACTER*(*) FILE
INTEGER MODE
INTEGER ERROR
If this routine succeeds (an internet connection is available and the URL can be accessed), ERROR is set to zero and the data from the URL is transferred to the specified file (which can be a full path name). If MODE=0 the data is assumed to be text, so newlines are converted to DOS style; if MODE=1 binary data is assumed.
The URL can be quite general. In particular, you can use an FTP address assuming anonymous access is possible, or an HTTP address with extra information attached - for example you could query a search engine directly using this routine.
If necessary, the system will dial up the service provider to process this call. Since the call will remain open after the transfer has completed, the following call will close a modem connection if one exists.
------------------------
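For reference, a typical call as documented above might look like this (a minimal sketch; the URL and output file name are only examples, not anything required by the routine):

Code:
character*256 url, file
integer mode, error
url  = 'http://www.silverfrost.com/'   ! example URL only
file = 'c:\temp\page.htm'              ! example output path only
mode = 0                               ! 0 = text, newlines converted to DOS style
call read_url@(url, file, mode, error)
if(error /= 0) print*, 'read_url@ failed, error = ', error
end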
I need exactly the same function, but instead of writing to a file on the hard drive (which is inconvenient and slow) it should read everything received from the internet into an internal named buffer from which Fortran can then read the data.
Any idea how to implement that in any way, including Salford C++?
If you know of any other vendor with a similar function, let me know.
Last edited by DanRRight on Fri Sep 04, 2009 7:15 pm; edited 1 time in total
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Fri Sep 04, 2009 8:39 am Post subject:
READ_URL@ calls upon the Microsoft API InternetReadFile and uses a buffer to transfer the data into the given file.
One way forward would be for me to provide an alternative routine that also requires the file size to be provided. I could then call MAP_FILE_FOR_READING@ internally and return the result of this call after reading the internet data. The user would then call UNMAP_FILE@ to release the memory.
If this could be useful, let me know soon because we are planning a new release within the next few weeks.
DanRRight
Joined: 10 Mar 2008 Posts: 2816 Location: South Pole, Antarctica
Posted: Fri Sep 04, 2009 6:25 pm Post subject:
I would very much appreciate any alternative to dumping data to the hard drive. One useful property the older method has, though, is the ability to download multiple streams simultaneously and independently in multiple threads simply by giving FILE different names.
If that could be done (i.e. mapping to separate buffers identified by a handle, which are then read and released), that would be just super!
Please also check whether the current version of READ_URL adds extra CR/LF at the end of each line, since in some viewers I see all lines in FILE separated by an empty line. Not a big deal though.
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Sat Sep 05, 2009 6:54 am Post subject:
I will have a go at this.
read_url@ does not process the raw data.
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Mon Sep 14, 2009 1:40 pm Post subject:
I have added a new routine for the next release.
It is similar to read_url@ but uses get_storage@ internally to write the file to memory. If the supplied file size is zero then it returns the required size of the file.
Here is some sample code to illustrate how it will work...
Code:
program main
include <windows.ins>
integer err, fileSize, handle, i
character(80) url
url = "http://msdn.microsoft.com/en-us/library/aa384159(VS.85).aspx"
! First call with fileSize = 0: returns the required size in fileSize
err = 0
fileSize = 0
handle = download@(url, fileSize, err)
if(err > 0) stop 'Download failed'
! Second call transfers the data to memory; handle gives its address
fileSize = fileSize + 4
handle = download@(url, fileSize, err)
! Print the first few bytes of the downloaded data
do i=0,20
print*, ccore1(handle+i)
enddo
call return_storage@(handle)
end
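To use the downloaded data from Fortran rather than inspecting it byte by byte, the bytes can be copied into an ordinary CHARACTER buffer. A minimal sketch, assuming handle and fileSize come from the second download@ call above, and assuming a fixed-size buffer that is large enough:

Code:
integer i
character(1000000) buffer   ! fixed maximum size (assumed sufficient)
do i = 1, min(fileSize, len(buffer))
   buffer(i:i) = ccore1(handle + i - 1)
enddo
! buffer(1:fileSize) now holds the downloaded data as text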
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Tue Sep 15, 2009 3:17 pm Post subject:
I have corrected an error in my sample program above.
DanRRight
Joined: 10 Mar 2008 Posts: 2816 Location: South Pole, Antarctica
Posted: Tue Sep 22, 2009 7:54 pm Post subject: Re:
I suspect I have drilled many holes in and destroyed several hard drives with read_url@, which ran for many days. Thinking of using a RAM drive or something...
When is this new function expected to be available?
PaulLaidler Site Admin
Joined: 21 Feb 2005 Posts: 7925 Location: Salford, UK
Posted: Tue Sep 22, 2009 10:18 pm Post subject:
The new routine will be in the next release, which is now being prepared and is planned to be available within the next few weeks.