When you read research about the dangers of radio signals from cellphones and cell towers, you will notice that most studies investigate whether the signals from cellphones are dangerous; much less is said about whether the signals from cell towers are dangerous.
Yet it is not uncommon for new cell tower construction to be delayed because neighbors fear harmful radiation.
Why does so much controversy still accompany the positioning of cell towers when almost everybody today owns a cellphone or smartphone? You rarely see anybody complaining about somebody using his or her cellphone in the vicinity of others, at least not because of radiation fears.
In order to understand why radiation from cell towers rarely raises any alarms in scientific studies, you need a basic understanding of how cellphone networks work and how radio signals travel through the air.
Cellphone networks are made up of a web of cells, that is, intertwined coverage areas around each tower. These coverage areas used to be roughly circular (with the tower in the middle) back in the day, hence the name "cell". Today, networks rely on more advanced antennas that focus the signal into different beams to achieve better coverage and capacity, but the name cellphone has stuck.
A cell tower usually consists of a large pole or mast, or the equipment is placed on top of a building. At the top you find large and very sensitive antennas. These antennas both broadcast the signal to your cellphone and listen for the signal coming back from it.
Many people have figured out that cell towers do indeed broadcast a signal many times stronger than the one your cellphone sends back to the tower (or at least have the capability to do so), either because they have read about it or simply deduced it from the size of the antennas. This is where the counterintuitive perception of the potential dangers of cell towers comes into play.
To explain how radio signals travel from the cell tower to your phone, you can compare them to waves on water. If you throw a rock into a body of water, you will see a splash and then rings of waves spreading out from where the rock hit the surface. If you try different rocks, you will notice that a larger rock usually produces a larger wave than a smaller one. But even if you throw the largest rock you can find, you will notice that while the splash is huge and the wave is quite big in the immediate vicinity of the splashdown, the wave shrinks very quickly as it spreads outward: the force with which it hits anything in its way weakens considerably the further it travels, because the circumference of the circle keeps growing.
The wave on water spreads in what is essentially a two-dimensional environment; for the radiation emanating from a cell tower, you have to think in 3D. The 3D equivalent of the expanding circle is the radiation spreading outward from the tower in the shape of a sphere or cylinder. The amount of energy that would hit a person depends, of course, on the strength of the signal coming from the antennas (equivalent to the size of the rock you throw in the water), but also, to a very large extent, on the distance between the person and the tower antenna.
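This spreading can be sketched in a few lines of Python. The function name, the numbers, and the assumption of an antenna radiating equally in all directions are mine, for illustration only:

```python
import math

def power_density(tx_power_watts: float, distance_m: float) -> float:
    """Power density (W/m^2) at a given distance from an idealized
    antenna that radiates equally in all directions. The emitted power
    spreads over the surface of a sphere of radius distance_m, whose
    area is 4 * pi * r^2 -- the 3D analogue of the expanding rings."""
    return tx_power_watts / (4 * math.pi * distance_m ** 2)

# Doubling the distance quarters the power density (inverse-square law).
near = power_density(20.0, 10.0)   # a 20 W source, 10 m away
far = power_density(20.0, 20.0)    # same source, 20 m away
print(round(near / far, 6))  # → 4.0
```

Twice the distance means four times the sphere surface, hence a quarter of the energy per unit of area.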
Let's assume that the tower antenna broadcasts the signal with equal strength in all directions. This is actually more in line with how a cellphone works than an actual cell tower, but bear with me. If you were able to position your body immediately next to the tower antenna, so close that your body touched it, your body would probably absorb about 50% of whatever signal strength the antenna was emitting, because the rest of the signal would travel in the other direction, away from your body.
If you take one step back from the antenna (I hope you are not afraid of heights) and now stand about one yard away from it, your share of the total energy coming from the antenna would probably drop to about 30% or even less, depending on your size and whether you are facing the antenna or have it to your side. The reason is that when you moved away from the antenna, your body now covers a smaller percentage of the signal area, the surface of the sphere or cylinder I mentioned above. As you move further and further away from the antenna, the surface area of that sphere or cylinder grows a lot, while the size of your body stays the same, so you absorb a much smaller portion of the total energy emitted. You can say that the signal weakens as it travels away from the antenna because it disperses, or has to cover a larger area.
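The shrinking share of the sphere's surface can be made concrete with a rough calculation. This is a much cruder model than the thought experiment above: it treats your body as a small flat patch on the sphere (which breaks down at touching distance, where the 50% figure applies) and assumes a frontal body area of about 0.7 square meters, a number I picked for illustration. The absolute percentages are therefore only indicative, but the trend with distance is the point:

```python
import math

def absorbed_fraction(body_area_m2: float, distance_m: float) -> float:
    """Rough fraction of the total emitted energy intercepted by a body,
    modeled as a patch of body_area_m2 on a sphere of radius distance_m
    centered on the antenna. Not valid at very close range, where the
    body no longer looks like a small patch on a large sphere."""
    sphere_area = 4 * math.pi * distance_m ** 2
    return min(body_area_m2 / sphere_area, 1.0)

# Assumed ~0.7 m^2 frontal body area, at 1 m and at 10 m from the antenna:
print(absorbed_fraction(0.7, 1.0))   # a few percent of the total energy
print(absorbed_fraction(0.7, 10.0))  # 100x less at 10x the distance
```

Ten times the distance means one hundred times less of the total energy, simply because the sphere's surface grows with the square of its radius.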
This analogy does not match real-world conditions exactly, but I think you get the basic idea.
Because of this relationship between signal strength and distance from the antenna, most research focuses on the possible dangers of the radiation coming from your cellphone instead of the cell tower. Even though the phone broadcasts at a fraction of the signal strength the tower uses, it is often located immediately next to your head. So even though cell towers are big, their antennas are large, and they broadcast a signal many times stronger than your phone's, you end up getting more radiation from the tiny phone you use every day than from any tower you may or may not worry about.
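The same inverse-square reasoning lets you compare the two exposures directly. The numbers below are assumptions I picked for illustration (a 40 W tower signal reaching you at 50 m versus a 0.25 W phone held 2 cm from your head), not measured values, and real antennas are not isotropic, but the conclusion is robust to the exact figures:

```python
import math

def power_density(tx_power_watts: float, distance_m: float) -> float:
    """Free-space power density (W/m^2) from an idealized antenna
    that radiates equally in all directions."""
    return tx_power_watts / (4 * math.pi * distance_m ** 2)

tower_at_you = power_density(40.0, 50.0)   # assumed 40 W tower, 50 m away
phone_at_head = power_density(0.25, 0.02)  # assumed 0.25 W phone, 2 cm away

# The weak phone right next to your head delivers far more energy to you
# than the strong tower down the street.
print(phone_at_head > 1000 * tower_at_you)  # → True
```

With these particular numbers the phone wins by four orders of magnitude: being 2,500 times closer outweighs the tower's 160-fold power advantage by a wide margin.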
The real kicker is that if there are many towers in an area, like a city, you actually absorb less radiation from your phone than in the countryside, where towers are few and far between. This is because when your cellphone has great coverage (is close to a tower), it broadcasts a weaker signal (to save battery and reduce interference in the network) than the stronger signal it has to emit when it is far from the tower.
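This power-control effect can be sketched the same way. The target received power below is an arbitrary number I chose for illustration, and real networks use closed-loop power control against path loss that is usually worse than free space, but the trend holds: the transmit power the phone needs grows with the square of its distance to the tower.

```python
import math

def required_tx_power(target_rx_w_per_m2: float, distance_m: float) -> float:
    """Transmit power a phone would need so the power density arriving
    at the tower reaches the target, assuming free-space spreading from
    an idealized isotropic source."""
    return target_rx_w_per_m2 * 4 * math.pi * distance_m ** 2

target = 1e-12  # assumed receive target (W/m^2), illustrative only

city = required_tx_power(target, 100.0)            # tower 100 m away
countryside = required_tx_power(target, 10_000.0)  # tower 10 km away
print(round(countryside / city))  # → 10000
```

A tower 100 times further away demands 10,000 times the transmit power from your phone, which is exactly why dense coverage means a quieter phone next to your head.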