AviationBanter: an aviation & planes forum


AviationBanter forum » rec.aviation newsgroups » Piloting

Needed: online database of airports with designators & Lat/Long for GPS upload



 
 
  #1  
Old July 9th 03, 10:05 AM
Wayne
external usenet poster
 
Posts: n/a

FTP is File Transfer Protocol, after all. However, 1 megabyte is hardly a
large file by today's standards. Not all FTP servers support resuming; then
again, there are now programs that will let you resume an HTTP download as
well. I always use the DOS version of ftp at the command prompt to get files
from my own FTP server. It doesn't allow resuming, but working on as many
PCs as I do, I know I can count on always using the same program and
commands rather than having to download and install an FTP client package
before I go get the file I want.
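For what it's worth, the resume the quoted passage below refers to is FTP's
REST command. Here's a minimal sketch of a scriptable client that does
resume, using Python's standard ftplib; the host and path arguments are
hypothetical placeholders, and anonymous login is assumed:

```python
import os
from ftplib import FTP

def ftp_fetch(host, remote_path, dest):
    """Fetch a file over FTP, resuming from any partial copy on disk.

    The REST command (retrbinary's `rest` argument) asks the server
    to restart the transfer at the byte offset we already have.
    """
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    with FTP(host) as ftp, open(dest, "ab") as out:
        ftp.login()  # anonymous login
        ftp.retrbinary(f"RETR {remote_path}", out.write,
                       rest=start or None)
```

If the server doesn't support REST, retrbinary will raise an error and you
fall back to downloading from scratch.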

"There are two reasons why FTP is more efficient for retrieving large

files
than is HTTP. Firstly, FTP involves a lower overhead of headers and
handshaking packets. Secondly, FTP can continue from the end of a

partially
downloaded file - so an interrupted download session can be restarted,
perhaps days later. For this reason, it is common to place large files -

for
instance larger than a megabyte - on an FTP server and link to them from a
Web page. This is typically done by running the FTP server program on the
same computer as the Web server, but larger sites may use separate
computers."



I'd be interested in your thoughts on why this may not be the case, and
whether HTTP is just as efficient as FTP.


In terms of speed and efficiency, I'd still rather get a large file
from a fast HTTP server than from a slow FTP server. Many of the available
download accelerators will search out the filename you want and grab it
any way they can, even combining HTTP and FTP to get the file quickly.

What's really annoying is when people lock their mail accounts by
sending large files via POP3. Email has got to be the worst way to send
files, and yet it's probably the most common among the average internet junkie.


  #2  
Old July 9th 03, 03:39 PM
John T

"Wayne" wrote in message


What's really annoying is when people lock their mail accounts by
sending large files via POP3. Email has got to be the worst way to send
files, and yet it's probably the most common among the average internet
junkie.


"Ignorance is bliss." All they know is it works and works easily. They
have no idea how much their file gets bloated when it's converted to text
for transmission thereby adding *huge* demands on their bandwidth.
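The bloat comes from MIME's base64 encoding, which turns every 3 bytes of
attachment into 4 bytes of text, so roughly a third bigger before line
wrapping adds a bit more. Easy to check in Python:

```python
import base64

payload = b"\x00" * 1_000_000          # a 1 MB binary attachment
encoded = base64.b64encode(payload)    # what actually goes over the wire

print(len(encoded))                    # 1333336 -> about 33% bigger
# MIME also wraps the text at 76 characters per line, so the real
# message body grows by a further couple of percent on top of that.
```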

--
John T
http://tknowlogy.com/tknoFlyer
__________



  #3  
Old July 9th 03, 09:28 PM
Kyler Laird

"Derek Fage" writes:

I'd also always thought FTP was more efficient than HTTP for larger files,
and a search on the internet came up with this:


"There are two reasons why FTP is more efficient for retrieving large files
than is HTTP. Firstly, FTP involves a lower overhead of headers and
handshaking packets.


That's not necessarily true. It's not mandated by the protocol.
(Someone *please* correct me if you've got a reference.)

'course for unauthenticated access, there is *less* overhead for
HTTP, but for large files this difference is in the noise. (Not
only do you not have to "log in" for HTTP, but you don't have to
mess with setting up a data channel. These are just setup costs
though.)

Secondly, FTP can continue from the end of a partially
downloaded file - so an interrupted download session can be restarted,
perhaps days later.


Same for HTTP. You can even do crazy stuff like download chunks
of the file out of order (with a single request). (PDF files
take advantage of this.)
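HTTP resume is just the Range request header (byte ranges, per RFC 2616); a
server that honours it answers 206 Partial Content, and you can even ask for
several non-contiguous ranges in one request, which is how PDF viewers jump
around inside a file. A rough sketch using Python's standard urllib; the
function names and the fall-back handling for servers that ignore the range
are my own, not from any particular tool:

```python
import os
import urllib.request

def range_header(bytes_on_disk):
    """Ask the server for everything from this byte offset onward."""
    return {"Range": f"bytes={bytes_on_disk}-"}

def resume_download(url, dest):
    """Restart an interrupted download where it left off."""
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers=range_header(start))
    with urllib.request.urlopen(req) as resp, open(dest, "ab") as out:
        if resp.status == 200 and start:
            out.truncate(0)  # server ignored the range: take the whole file
        while chunk := resp.read(64 * 1024):
            out.write(chunk)
```

A status of 206 means the server honoured the range and sent only the
missing tail; a plain 200 means it sent the whole file, so the sketch
truncates and starts over rather than appending a duplicate copy.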

--kyler
  #4  
Old July 9th 03, 09:43 PM
Ken Hornstein

In article ,
Kyler Laird wrote:
'course for unauthenticated access, there is *less* overhead for
HTTP, but for large files this difference is in the noise. (Not
only do you not have to "log in" for HTTP, but you don't have to
mess with setting up a data channel. These are just setup costs
though.)


There's no argument that in terms of protocol efficiency, FTP and HTTP
are about the same (since they're both essentially using a single TCP
stream for the bulk data transfer). You can make arguments about
protocol overhead in the HTTP request versus the FTP command channel,
but I think that's down in the noise (especially for larger files).

One thing to consider is that some government agencies have different
policies for what data can be made available via FTP versus HTTP (I
don't know if that is true for the FAA, but it's certainly true for
other agencies). Now, I admit that those policies are dumb, but _you_
try telling the people in charge that.

And I guess I have to ask ... what's the big deal? I mean, doesn't
your web browser handle an ftp:// URL? At least they're thinking about
making it available via the Internet, which you have to admit is a step
up.

--Ken
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 AviationBanter.
The comments are property of their posters.