Why is the range of the Wi-Fi so poor? Why can my device connect to my old cheap consumer WAP and not to your fancy and expensive enterprise access point?
It is hard to answer this question in a way that is both simple to understand and not overly technical, while also being accurate and comprehensive. But here goes...
It is very common for consumer-grade access points to advertise and tout their range (i.e. area of coverage). They do this by cranking up their transmit power to the FCC maximum (around 30 dBm = 1 Watt).
To first order, the speeds achievable over Wi-Fi are a function of the distance between the transmitter and receiver. Intervening objects, like walls, furniture, etc., will also attenuate the signal unevenly in different directions, so real-world coverage is never a neat symmetric circle; treat this as a vastly simplified picture only.
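As a rough illustration of how signal strength falls with distance, the standard free-space path loss formula (FSPL in dB = 20·log10(d) + 20·log10(f) + 32.44, with d in km and f in MHz) can be sketched in a few lines. Real buildings attenuate far more than free space, and the channel frequencies used here (2437 MHz and 5180 MHz) are just example Wi-Fi channels, so treat the numbers as a lower bound on loss:

```python
import math

def fspl_db(distance_m, freq_mhz):
    """Free-space path loss in dB (distance in metres, frequency in MHz)."""
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

# Compare loss at a few distances on a 2.4 GHz vs a 5 GHz channel.
for d in (5, 10, 20, 40):
    print(f"{d:2d} m: {fspl_db(d, 2437):5.1f} dB @ 2437 MHz, "
          f"{fspl_db(d, 5180):5.1f} dB @ 5180 MHz")
```

Two patterns fall out of the formula: each doubling of distance adds about 6 dB of loss, and 5 GHz loses roughly 6-7 dB more than 2.4 GHz at the same distance.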
The problem is that Wi-Fi communication is asymmetric, and the transmit power of your smartphone or tablet is about 100 times less than that of the AP (around 10 dBm = 0.01 Watt). The device manufacturers do this purposely to maximize battery life. Hence, comparatively, the consumer-grade access point is screaming whereas the client device is whispering. Accordingly, a client device can hear the access point fairly well at reasonable connection speeds, but it is very hard for the access point to resolve upstream messages from the client device, even at very low connection speeds.
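The dBm figures above translate to watts as follows; this quick sketch simply applies the definition of dBm (0 dBm = 1 mW) to the two power levels mentioned:

```python
def dbm_to_watts(dbm):
    """Convert a power level in dBm to watts (0 dBm = 1 mW)."""
    return 10 ** (dbm / 10) / 1000

ap_dbm = 30      # consumer AP cranked to the FCC maximum
client_dbm = 10  # typical smartphone/tablet transmitter

ap_w = dbm_to_watts(ap_dbm)          # 1.0 W
client_w = dbm_to_watts(client_dbm)  # 0.01 W
print(f"AP: {ap_w:.2f} W, client: {client_w:.2f} W, "
      f"ratio: {ap_w / client_w:.0f}x")  # prints a 100x ratio
```

Because decibels are logarithmic, the 20 dB gap between 30 dBm and 10 dBm is exactly the factor of 100 described above.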
This asymmetric mode of operation was still “ok” in the days when Internet traffic was primarily downstream to a device. For example, you’d type in a URL in your browser, or click on a link, send a small amount of information upstream to the Internet, then download a whole lot of data and content to your device.
Alas, modern Wi-Fi usage no longer works this way. People expect to take pictures and videos and either email them to friends and/or post them to social media. Wi-Fi appliances, sensors, and cameras are constantly streaming data to the network. Accordingly, there is quite a lot more upstream traffic from the wireless devices to the access point and the Internet. Very slow upstream speeds resulting from a weak transmitter only serve to make the Wi-Fi seem “slow”.
Proper modern Wi-Fi design, therefore, attempts to balance the power levels between the client device and the access point by lowering the output transmit power of the access point. Hence, the access point isn’t screaming quite as loud, and the client will therefore need to be closer to the access point to maintain connectivity, which improves signal quality and ultimately signal speed. This approach does ultimately require more access points to be deployed in an environment. However, there are other industry trends that simultaneously drive the need for more access points:
- 802.11n and 802.11ac take advantage of the 5 GHz band, which has fewer sources of external interference and uses wider channels. Thus, 5 GHz tends to have more data capacity, and thus higher throughput, than the 2.4 GHz band used by older generations of Wi-Fi like 802.11b and 802.11g. 802.11n works on both the 2.4 GHz and 5 GHz bands, whereas 802.11ac only works on the 5 GHz band. (Any dual-band 802.11ac access point from any vendor is actually using 802.11n on the 2.4 GHz band.) While 5 GHz has more capacity, it has less range, and attenuates more quickly when passing through walls, furniture, appliances, mirrors, people, etc. To achieve a roughly equivalent area of coverage, one usually needs the 2.4 GHz radio to be set 6 dB lower (i.e. ¼ of the power) than the 5 GHz radio. I usually recommend 14 dBm on the 2.4 GHz band and 20 dBm on the 5 GHz band, but this is very much dependent upon the specifics of your deployment environment and the types of devices being used on the network.
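The "6 dB lower = ¼ of the power" arithmetic in the bullet above follows directly from the definition of the decibel; a minimal sketch, using the 14 dBm / 20 dBm recommendation from the text:

```python
def db_to_power_ratio(delta_db):
    """Power ratio corresponding to a difference in dB."""
    return 10 ** (delta_db / 10)

# Recommended settings from the text: 14 dBm (2.4 GHz) vs 20 dBm (5 GHz).
delta = 20 - 14  # 6 dB difference
ratio = db_to_power_ratio(delta)
print(f"{delta} dB lower = 1/{ratio:.1f} of the power "
      f"(~{1 / ratio:.2f}x)")  # ~1/4 of the power
```

Every 3 dB halves (or doubles) the power, so a 6 dB reduction is two halvings: one quarter.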
- The faster speeds advertised by 802.11n/ac (as compared to their 802.11a/b/g forebears) come at a price: effective range. The higher speeds of modern Wi-Fi are achieved by increasing complexity: modern Wi-Fi uses numerous mathematical techniques to stuff more data into the same unit of time. The more sophisticated the math, the more data can be stuffed, and the faster the effective throughput. However, with complexity comes fragility. Remember Scotty's old line from Star Trek III: "The more they overthink the plumbing, the easier it is to stop up the drain." This adage holds very true in Wi-Fi: the more complex the modulation and coding used at the transmitter, the stronger the signal needs to be at the receiver in order to properly resolve it and reconstruct the message.
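One way to make the complexity-versus-fragility tradeoff concrete is Shannon's channel capacity, C = B·log2(1 + SNR): cramming more bits into the same bandwidth demands an exponentially stronger signal at the receiver. The 20 MHz channel width and the SNR values below are illustrative assumptions, not figures from any Wi-Fi specification:

```python
import math

def capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon channel capacity C = B * log2(1 + SNR), in Mbit/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * math.log2(1 + snr_linear)

# Illustrative only: a 20 MHz channel at increasing signal-to-noise ratios.
for snr_db in (5, 15, 25, 35):
    print(f"SNR {snr_db:2d} dB -> up to {capacity_mbps(20, snr_db):6.1f} Mbit/s")
```

The pattern is the point: each extra bit per symbol costs roughly another doubling of signal power, which is why the fastest modulation and coding schemes only work close to the access point.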
- People are continuously using more wireless devices simultaneously on their networks and consuming more data, both upstream and downstream. This trend will continue to grow rapidly over the next several years as Wi-Fi-enabled appliances and sensors arrive with the Internet of Things (IoT). Hence, Wi-Fi design is not only about "coverage" but also about "capacity". More devices streaming more data back and forth to the wired network and the Internet require more APs in the environment to balance the load.
We’ve been conditioned by the manufacturers of consumer Wi-Fi equipment to think of Wi-Fi like a can of paint: "one access point covers 2500 sq. feet". The reality is much more complicated and nuanced: it depends on what is in your environment (e.g. drywall vs. stucco vs. metal walls, furniture, appliances, etc.), the number and types of devices you have on your network, what kind of traffic is being passed, both upstream and downstream, and how fast you want your wireless devices to access the Internet.