214k views
5 votes
Suppose the maximum safe average intensity of microwaves for human exposure is taken to be 1.50 W/m2. If a radar unit leaks 10.0 W of microwaves (other than those sent by its antenna) uniformly in all directions, how far away must you be to be exposed to an average intensity considered to be safe? Assume that the power spreads uniformly over the area of a sphere with no complications from absorption or reflection.

asked by Kriegaex (7.6k points)

1 Answer

5 votes

Answer:

r = 0.728 m

Step-by-step explanation:

Intensity is defined as power per unit area:

I = P / A

where A is the surface area of the sphere over which the wave has spread:

A = 4π r²

In this exercise we are given the microwave power P = 10.0 W and the safe intensity I₁ = 1.5 W/m², so the required area is

A₁ = P / I₁

Calculate

A₁ = 10.0 / 1.5

A₁ = 6.6667 m²

Solving for the radius of the sphere:

r = √(A / 4π)

r = √ (6.6667 / 4π)

r = 0.728 m
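The two steps above (area from A = P/I, then radius from A = 4πr²) can be checked with a short script. The variable names are just illustrative:

```python
import math

P = 10.0       # leaked microwave power, in W
I_safe = 1.5   # maximum safe average intensity, in W/m^2

# Area over which the power must spread: A = P / I
A = P / I_safe              # ≈ 6.6667 m²

# Sphere of area A = 4*pi*r^2, so r = sqrt(A / (4*pi))
r = math.sqrt(A / (4 * math.pi))
print(round(r, 3))  # 0.728
```

Equivalently, combining both steps gives r = √(P / 4πI) directly.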

answered by Richard Green (8.7k points)