There’s a lot of hype these days around the term “5G.” But what is it exactly? How is it different from 4G and 3G? What the heck are 4G and 3G anyhow? I get these questions every day, and being the engineer nerd I am, I often find it difficult to explain them in terms that my friends care to understand. (Whoever made the pretty 5G graphic I stole from a quick Google search… well done! Give me your info so you can get credit, or tell me to take it down.)
So, to start off, the terms “3G” and “4G” come from the generation of wireless standards they use. “3G” is “Third Generation,” “4G” is “Fourth Generation,” and of course, “5G” is “Fifth Generation.” Generally, each generation of wireless standards improves on the previous one, adding features and capabilities beyond what came before. So, to start, here is a chart that shows the basics and the improvements from generation to generation in terms of data speeds and capabilities. I’m intentionally leaving out 2G and starting with 3G, because 3G is really where data on mobile devices ‘effectively’ gained its foothold:
Now, “LTE” is technically considered a different standard than 4G, but the ‘elders of the internet and cellular technologies’ decided it was close enough to count as 4G rather than argue the minutiae and give LTE its own standard (3.95G is what it would have been, if I remember correctly). What does “LTE” stand for? It literally means “Long-Term Evolution.” By itself, LTE is a wireless broadband communication standard for mobile devices that uses a different radio than 3G or straight-up 4G. LTE can best be described as an upgrade path for carriers like AT&T (on GSM networks) and Verizon (on CDMA networks).
“That’s a lot of techno-jargon, Taco. What exactly does all this really mean?”
Yes, what does this mean? It means that those ‘elders of the internet and cellular technologies’ got together and brainstormed ways to get you your data faster. Ideally, with each generation you want to see the ability to send more data at faster speeds with lower latency.
“Dammit Taco! Quit using words I don’t know! “LATENCY?!?!””
Ok fine! If you’ve read my bit about bITs versus ByteS, you know that sending data is measured in how many “bits” you can send each second. I left LATENCY out of that post in order to keep it shorter. The dictionary defines latency as “the delay before a transfer of data begins following an instruction for its transfer.” What this really means is that when you want to send data, you first have to send a little ‘request’ to the place you want to send it, to let it know you want to send data. Then you have to wait for the ‘affirmative’ response that (more or less) says, “Sure! Go ahead and send your data!” The amount of time you wait for this request-response is the ‘handshake’ latency. Once the data transfer has started, latency then refers to the amount of time it takes for your data to reach its destination. I can send data at 1000 Mbps, but if it takes 2000 milliseconds to get there, that poses a problem.
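To put some numbers on that, here’s a quick back-of-the-napkin calculation. All the numbers are made up for illustration, but they show why a fat pipe with terrible latency can still feel slow: total time is the handshake wait plus the time to push the bits through.

```python
# Rough sketch: total one-way transfer time is latency plus
# payload size divided by bandwidth. Numbers are hypothetical.

def transfer_time_ms(payload_bits, bandwidth_mbps, latency_ms):
    """Approximate total transfer time in milliseconds."""
    send_ms = payload_bits / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + send_ms

# Same 8-megabit payload, two very different networks:
fast_pipe_slow_handshake = transfer_time_ms(8_000_000, 1000, 2000)  # ~2008 ms
slow_pipe_fast_handshake = transfer_time_ms(8_000_000, 100, 10)     # ~90 ms
```

Notice the “slower” 100 Mbps link wins by a mile, because the 10 ms handshake barely registers next to a 2-second one. That’s why 5G chases latency, not just raw speed.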
This chart shows the relative latency of the three standards mentioned here.
Why is this important? To put it in perspective, it takes about 10 ms for the human brain to process visual information received from the eye. With lower-latency wireless, you’ll be able to game in near real-time from anywhere (your friends’ parents’ basement instead of your own parents’ basement, for example). Or, for those who ventured out of their parents’ basement and got their PhD, they can perform near real-time instruction for surgeries and procedures that they would otherwise need to get on a plane to do in person. This also opens up the framework for things like self-driving cars, which need an ultra-low-latency network to respond to the vehicles and traffic situations around them. Even Virtual Reality and Augmented Reality would become more commonplace, since data can be received faster than our brains can process it.
The last real difference I’ll touch on is the spectrum. The spectrum is the term used to describe the frequency (or frequencies) a particular network operates on. When we talk spectrum, remember that the lower the frequency, the less data it can typically carry (though lower frequencies travel farther). The chart below is a crude breakdown of frequencies by each standard:
The takeaway from this is that 5G is designed to operate across a large spectrum and optimize types of data for each range in that spectrum. Here I could easily ramble on about wavelength, distances, or other things that would make your eyes glaze over and drool ooze from the corner of your mouth, but I’ll spare you that. Instead, just think of it as a more versatile standard that handles data more efficiently and dramatically faster.
Now that we’ve finished discussing the differences, I’m just going to quickly list a few things that 5G will make possible:
- Elimination of costly fiber drops to new offices and/or satellite offices – 5G could essentially eliminate the need for wired connections for private multi-site networks.
- High-speed, low-latency gaming from anywhere.
- Virtual reality from anywhere.
- Better networking and availability of data for AI (Can you say “Skynet” in machine code?)
- The ability to network just about anything, anywhere.
- Near real-time monitoring of devices (i.e. – healthcare implants or environment sensors)
This list goes on… but I think you get the gist. The point of this article was not to go into detail on all the technologies involved in making 5G, but to create a fundamental understanding of what 5G is and what it has to offer.