July 9th 03, 10:05 AM
Wayne
Needed: online database of airports with designators & Lat/Long for GPS upload

FTP is File Transfer Protocol, after all. However, 1 megabyte is hardly a
large file by today's standards. Not all FTP transfers are resumable, and
there are now programs that will let you resume an HTTP download as well. I
always use the DOS version of ftp at the command prompt to get files from my
own FTP server. It doesn't allow resume, but working on as many PCs as I do,
I know I can count on always having the same program and commands rather than
having to download and install an FTP client package before I go get the file
I want.
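
For what it's worth, a resumable HTTP download is just a request with a
Range header that picks up where the partial file left off. A rough sketch
in Python; the URL and filename are made up, and it assumes the server
actually honors Range (answers 206 Partial Content):

    import os
    import urllib.request

    # Placeholder URL and local filename, purely for illustration.
    url = "http://www.example.com/pub/airports.zip"
    local = "airports.zip"

    # Pick up from wherever the partial file left off.
    start = os.path.getsize(local) if os.path.exists(local) else 0
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-" % start})

    with urllib.request.urlopen(req) as resp, open(local, "ab") as out:
        # Assumes the server answered 206 Partial Content; a server that
        # ignores Range would resend the whole file from byte zero.
        while True:
            chunk = resp.read(64 * 1024)
            if not chunk:
                break
            out.write(chunk)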

"There are two reasons why FTP is more efficient for retrieving large

files
than is HTTP. Firstly, FTP involves a lower overhead of headers and
handshaking packets. Secondly, FTP can continue from the end of a

partially
downloaded file - so an interrupted download session can be restarted,
perhaps days later. For this reason, it is common to place large files -

for
instance larger than a megabyte - on an FTP server and link to them from a
Web page. This is typically done by running the FTP server program on the
same computer as the Web server, but larger sites may use separate
computers."



I'd be interested in your thoughts as to why this may not be the case and
why HTTP is just as efficient as FTP.
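
The resume that quote is talking about is the FTP REST command. A rough
sketch of picking up a broken transfer with Python's ftplib; the host and
paths here are made up:

    import os
    from ftplib import FTP

    # Placeholder host and paths, purely for illustration.
    host, remote, local = "ftp.example.com", "pub/airports.zip", "airports.zip"

    # Resume from the size of whatever made it to disk so far.
    start = os.path.getsize(local) if os.path.exists(local) else 0

    ftp = FTP(host)
    ftp.login()  # anonymous login
    with open(local, "ab") as out:
        # 'rest' is sent to the server as a REST command before the RETR.
        ftp.retrbinary("RETR " + remote, out.write, rest=start)
    ftp.quit()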


In the case of speed and efficiency, I'd still rather get a large file
from a fast HTTP server than a slow FTP server. Many of the download
accelerators available will search out the filename you want and get it any
way they can, even combining HTTP and FTP to fetch the file quickly.
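
Those accelerators aren't doing anything magic: they split the file into
byte ranges and fetch the pieces over several connections at once. A rough
sketch, again with a made-up URL and assuming the server reports
Content-Length and honors Range:

    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    # Placeholder URL, purely for illustration.
    url = "http://www.example.com/pub/airports.zip"
    parts = 4

    # Ask for the total size without downloading the body.
    head = urllib.request.Request(url, method="HEAD")
    size = int(urllib.request.urlopen(head).headers["Content-Length"])
    step = size // parts

    def fetch(i):
        lo = i * step
        hi = size - 1 if i == parts - 1 else lo + step - 1
        req = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (lo, hi)})
        return urllib.request.urlopen(req).read()

    # Fetch the pieces concurrently and stitch them back together in order.
    with ThreadPoolExecutor(parts) as pool:
        data = b"".join(pool.map(fetch, range(parts)))

    with open("airports.zip", "wb") as out:
        out.write(data)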

What's really annoying is when people lock up their POP3 mail accounts by
sending large files as attachments. Email has got to be the worst way to send
files, and yet it's probably the most common among the average internet junkie.