[comp.dcom.lans] Microwave blockage

macklin@garnet.berkeley.edu (Macklin Burnham) (06/11/91)

Anyone have an idea how much foliage, i.e. trees, it takes to
block a microwave ethernet link? Will it tolerate any at all?
Mack Burnham

benjamin@ee.tut.fi (Grönlund Pentti) (06/12/91)

Someone asked here about microwave ethernet blockage by trees etc.

Radio link performance in free space is determined by frequency, distance,
transmitter and receiver antenna gains, feed line loss at both ends, and
receiver sensitivity. You can find the equations in any radio link, radar,
space communications, or amateur microwave handbook.
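The handbook equations reduce to a simple dB sum. A minimal sketch of that link budget, using the standard Friis free-space path loss formula (all of the specific numbers below — powers, gains, distances — are invented for illustration, not from the post):

```python
import math

def free_space_path_loss_db(distance_km, freq_ghz):
    # Friis free-space path loss 20*log10(4*pi*d/lambda); with d in km
    # and f in GHz the constant works out to 92.45 dB.
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   feed_loss_db, distance_km, freq_ghz, rx_sensitivity_dbm):
    # Received level = power + antenna gains - feed losses - path loss;
    # margin is how far that sits above the receiver sensitivity.
    rx_level_dbm = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                    - feed_loss_db
                    - free_space_path_loss_db(distance_km, freq_ghz))
    return rx_level_dbm - rx_sensitivity_dbm

# Hypothetical 23 GHz link over 2 km: 10 dBm out, two 34 dBi dishes,
# 2 dB total feed loss, -75 dBm receiver sensitivity.
print(free_space_path_loss_db(2.0, 23.0))                    # ~125.7 dB
print(link_margin_db(10.0, 34.0, 34.0, 2.0, 2.0, 23.0, -75.0))  # ~25.3 dB
```

Everything left over after the free-space terms is the fade margin that real-world losses (rain, foliage, misalignment) have to fit inside.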

In real life there are many additional loss factors caused by terrain
features and all kinds of obstructions. The attenuation caused by wet trees
(especially those with BIIIG leaves) can be tens of decibels, and this can
contribute to loss of signal in marginal installations.
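To see why "tens of decibels" matters, subtract the foliage loss from the clear-path fade margin; once it goes negative, the link drops. A toy check (the 25 dB margin and the loss values are hypothetical):

```python
def remaining_margin_db(clear_margin_db, foliage_loss_db):
    # Fade margin left over after the tree attenuation is subtracted.
    return clear_margin_db - foliage_loss_db

# A link with 25 dB of clear-path margin shrugs off 10 dB of wet
# foliage, but 30 dB (well within "tens of decibels") takes it down.
for loss in (10.0, 20.0, 30.0):
    left = remaining_margin_db(25.0, loss)
    status = "up" if left > 0 else "DOWN"
    print(f"foliage loss {loss:4.1f} dB -> margin {left:+5.1f} dB ({status})")
```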

kwe@bbn.com (Kent W. England) (06/19/91)

In article <1991Jun11.151908.18648@agate.berkeley.edu> 
macklin@garnet.berkeley.edu (Macklin Burnham) writes:
> Anyone have an idea how much foliage, i.e. trees, it takes to
> block a microwave ethernet link? Will it tolerate any at all?

You don't say what frequency you are using.  The higher the frequency, the 
less blockage you can take, both because of absorption/reflection and 
because of beam size.  I will assume you are talking about private 23 GHz 
microwave.  23 GHz doesn't tolerate foliage at all.

Speaking from experience, never install a link in the fall when you are 
shooting over the tops of bare trees.  You may find in the spring that you 
will have to do a little forestry due to leafing out and growth.   True 
story.

--Kent