
A 50 MHz sky wave takes 4.04 ms to reach a receiver via retransmission from a satellite 600 km above the earth's surface. Assuming the retransmission time at the satellite is negligible, find the distance between the source and the receiver. If communication between the two were to be done by the Line of Sight (LOS) method, what should the size and placement of the receiving and transmitting antennas be?

Text Solution


Let the receiver be at point A and the source at point B.
Velocity of waves `=3xx10^(8)m//s`
Time to reach the receiver = 4.04 ms
`=4.04xx10^(-3)s`
Let the height of satellite be `h_(S)=600km`
Radius of the earth = 6400 km
Size of transmitting antenna = `h_(T)`

We know that,
`("distance travelled by wave")/("time")` = velocity of waves
Let `x` be the distance from the source to the satellite; by symmetry, this equals the distance from the satellite to the receiver, so the wave travels a total path `2x`. Therefore,
`(2x)/(4.04xx10^(-3))=3xx10^(8)`
or `x=(3xx10^(8)xx4.04xx10^(-3))/(2)`
`=6.06xx10^(5)m=606km`
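
As a quick numerical check, here is a minimal Python sketch of this step (the variable names are my own, not part of the original solution):

```python
c = 3e8      # speed of the waves in m/s, as used above
t = 4.04e-3  # total travel time in s

# The wave travels source -> satellite -> receiver,
# so one leg x is half of the total path length c*t.
x = c * t / 2
print(x / 1000)  # 606.0 km
```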
Using the Pythagoras theorem, where `d` is the ground distance from the point directly below the satellite to each of A and B,
`d^(2)=x^(2)-h_(S)^(2)=(606)^(2)-(600)^(2)=7236km^(2)`
or `d=85.06km`
So, the distance between the source and the receiver (the satellite lying midway between them) `=2d`
`=2xx85.06~=170km`
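
Continuing the sketch in Python (again with illustrative variable names), the ground distance follows from the same right triangle:

```python
import math

x = 606    # source-to-satellite leg in km, from the previous step
h_s = 600  # satellite height in km

# x is the hypotenuse, h_s the vertical side, d the ground side
d = math.sqrt(x**2 - h_s**2)
print(d, 2 * d)  # ~85.06 km and ~170.1 km
```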
For LOS communication, the maximum ground distance `d` covered by waves emitted from a transmitting antenna of height `h_(T)` is
`d=sqrt(2Rh_(T))`, or `h_(T)=(d^(2))/(2R)`
or size of antenna,
`h_(T)=(7236)/(2xx6400)=0.565km=565m`
Hence identical antennas about 565 m tall, placed at the source and at the receiver, would each cover the 85 km to the midpoint, making LOS communication possible over the full 170 km.
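
The antenna-height relation can be checked the same way (a sketch assuming the value R = 6400 km used above):

```python
import math

R = 6400             # radius of the earth in km, as above
d = math.sqrt(7236)  # one-side LOS range in km, ~85.06 km

h_t = d**2 / (2 * R)  # required antenna height in km
print(h_t * 1000)     # ~565.3 m
```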