Old March 24th 07, 03:47 PM posted to rec.aviation.homebuilt
RST Engineering
theoretical radio range....

Assuming a 1 microvolt receiver sensitivity at the other end (pretty numb these
days) and quarter wave vertical whips at both ends, a 5 watt transmitter has a
THEORETICAL free-space range of about 3000 miles. Doubling the power increases
the range by a factor of sqrt(2), or about 1.4, for a THEORETICAL range of
about 4300 miles for the 10 watter.
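That free-space figure can be sketched with a quick link-budget calculation based on the Friis path-loss formula. The frequency (122 MHz airband), 50-ohm receiver input, and ~2.15 dBi whip gain are my assumptions, not from Jim's post, so treat the exact mileage as illustrative; the sqrt(2) scaling with power is the real point.

```python
import math

def free_space_range_miles(p_tx_watts, rx_sens_uv=1.0, f_mhz=122.0,
                           ant_gain_dbi=2.15):
    """Free-space range at which the received signal drops to the
    receiver's sensitivity. Assumes a 50-ohm input and equal-gain
    whips at both ends (assumed values, not from the original post)."""
    # Sensitivity in watts: P = V^2 / R for 1 uV across 50 ohms
    p_rx_watts = (rx_sens_uv * 1e-6) ** 2 / 50.0
    p_tx_dbm = 10 * math.log10(p_tx_watts * 1000)
    p_rx_dbm = 10 * math.log10(p_rx_watts * 1000)
    # Path loss we can afford, including both antenna gains
    allowed_loss_db = p_tx_dbm + 2 * ant_gain_dbi - p_rx_dbm
    # Free-space path loss: FSPL = 20log10(d) + 20log10(f) + 20log10(4*pi/c)
    c = 299_792_458.0
    f_hz = f_mhz * 1e6
    d_m = 10 ** ((allowed_loss_db
                  - 20 * math.log10(f_hz)
                  - 20 * math.log10(4 * math.pi / c)) / 20)
    return d_m / 1609.344  # meters -> statute miles

r5 = free_space_range_miles(5)
r10 = free_space_range_miles(10)
print(f"5 W: ~{r5:.0f} mi, 10 W: ~{r10:.0f} mi, ratio {r10 / r5:.3f}")
```

With these assumed numbers the 5 W case comes out near the 3000-mile ballpark, and the 10 W range is exactly sqrt(2) times larger, as the post says.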

Now since most of us will operate somewhere below the oxygen-limited 12,000
feet MSL, and presuming you are over the ocean, your range will be horizon
("line of sight") limited by the old familiar rule of thumb that the radio
horizon (in statute miles) is about 1.4 times sqrt(altitude in feet), or
something on the order of 150 miles at that altitude. You may get a BIT of
refraction, but not enough to make a difference in the basic equation.
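The rule of thumb above is a one-liner; the 1.4 coefficient already folds in the typical 4/3-earth refraction correction Jim alludes to.

```python
import math

def radio_horizon_miles(alt_ft):
    """Radio horizon rule of thumb from the post:
    distance (statute miles) ~= 1.4 * sqrt(altitude in feet).
    The 1.4 factor includes standard atmospheric refraction."""
    return 1.4 * math.sqrt(alt_ft)

print(f"Horizon at 12,000 ft MSL: ~{radio_horizon_miles(12000):.0f} miles")
```

At 12,000 feet that gives roughly 153 miles, matching the "order of 150 miles" in the post.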

The real answer is that 5 or 10 watts really doesn't make a difference in
quiet spectrum range. It only helps "punch through" when there is a lot of
interfering garbage on the frequency.

Jim



"Andy" wrote in message
oups.com...
i have a radio rated at 10W and a radio rated for 5W output. my 5
watter isn't a handheld but this is typical output for that genre.

assuming they are using the same antenna what is the theoretical range
difference between the two and what is the practical range
difference? it seems the price difference is 2X to 3X. is the price
difference justified?

i guess i'm asking "should i ebay the 10W unit and find a better use
for the remainder?"