The first thing you should know is that the G in 5G, 4G, 3G, and 2G stands for Generation. What about the 5? Before we explore that, let’s take a trip back to 1979, in Tokyo, Japan, when the count was just 1.
Nippon Telegraph and Telephone, NTT, of Japan launched the world’s first commercial cellular mobile phone network on December 1, 1979. For the first time, a caller simply dialed a number and no human switchboard operator was needed to connect the call.
It wasn’t called any G at this point. It was just an analog mobile cellular phone system, mostly built into cars and too heavy for users to carry around. It could transmit voice – low-quality voice – from a mobile location, and that was all that mattered.
In 1991, 12 years after the NTT technology had spread across most of the world, Finland launched a mobile network that established the 2nd generation of mobile networks.
The Finnish network was based on an emerging standard, the Global System for Mobile Communications, originally called Groupe Spécial Mobile or GSM.
To differentiate this from the NTT technology, this standard was dubbed 2G thereby implicitly christening NTT’s 1G.
2G introduced digital signaling within the radio network. It came with circuit-switched mobile data services such as text messaging, and data delivery at 9.6 Kbits/sec.
With this network, it would take you about 15 minutes to send a 1 MB picture to a friend in the same city, assuming it was just the two of you on the network.
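That figure is easy to sanity-check. Here is a minimal sketch of the arithmetic (the 9.6 Kbit/sec rate is the one quoted above; protocol overhead and contention are ignored, so this is a best-case estimate):

```python
def transfer_time_seconds(size_megabytes, rate_kbits_per_sec):
    """Best-case time to move a payload over a link of the given rate.

    Ignores protocol overhead and other users sharing the channel.
    """
    bits = size_megabytes * 8 * 1024 * 1024   # 1 MB = 8,388,608 bits
    return bits / (rate_kbits_per_sec * 1000)

# A 1 MB picture over early 2G's 9.6 Kbit/sec channel:
seconds = transfer_time_seconds(1, 9.6)
print(f"{seconds / 60:.1f} minutes")  # → 14.6 minutes
```

Swapping in GPRS’s 172 Kbit/sec rate shrinks the same transfer to under a minute.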
In the late 1990s, the General Packet Radio Service, GPRS, was added to the GSM standard and the first wireless internet access became possible. The packet delivery rate grew to 172 Kbits/sec.
The addition of GPRS to 2G was named 2.5G.
A few years later, the standard was enhanced by EDGE – Enhanced Data rates for GSM Evolution. EDGE networks became 2.75G and were first launched in the U.S. in 2003.
In its lifetime, the 2G standards described technologies that could achieve data rates of 100–400 Kbits/sec and latencies of 300–1000 ms.
The Birth of 3G – The 3GPP and 3GPP2 Partnerships
As the demand for mobile internet service exploded following the success of 2G, the European Telecommunications Standards Institute, ETSI, expanded the GSM standard to enable interoperability between the various technologies that were deployed to make those advancements possible. Without these standards, users would have been restricted to their home network and the convenience of mobile data access would have been greatly limited.
While GSM evolved rapidly and captured a majority of the market, the IS-95 standard, developed by Qualcomm, ran in parallel in the US and Canada. IS-95 pioneered the development of Code-Division Multiple Access, CDMA. Devices built for IS-95 could not work on GSM networks and vice versa. International travellers found this particularly troublesome.
In 1998, GSM and IS-95 standards organizations formed two partnerships to develop global standards that would make interoperability possible and define standards for future generations of mobile networks.
The next generation was called 3G.
The 3rd Generation Partnership Project, 3GPP, was one of the resulting projects of the GSM-IS-95 partnership. It was responsible for developing the Universal Mobile Telecommunications System, UMTS. This became the first global standard upgrade to GSM networks.
Towards the end of 3G’s life, this project took on the development of the Long Term Evolution, LTE, standards – first 4G and, to this day, 5G.
The other project initiated by the GSM-IS-95 partnership was the 3rd Generation Partnership Project 2, 3GPP2. It was responsible for the development of 3G specifications for CDMA technologies. CDMA2000 was the first standard issued by 3GPP2 and replaced the IS-95 standards.
The Rise of 3G
What was later marketed as 3G is the large body of infrastructure and technologies that followed the various specification releases by both 3GPP and 3GPP2.
Between 1999 and 2010, 3GPP introduced UMTS (3G, 1999), High-Speed Downlink Packet Access, HSDPA (3.5G, 2002), and Evolved High-Speed Packet Access, HSPA+ (3.75G, 2007). It also released the first version of LTE (3.95G).
The 3GPP2 project developed the CDMA specifications in parallel. Its first release was Evolution-Data Optimized, EV-DO, in 1999. The last revision to EV-DO was released in 2007.
3G Is Not One Technology
There is no single 3G technology. Rather, 3G refers to a set of requirements developed and published by the International Telecommunication Union, ITU. Since the formation of the 3GPP and 3GPP2 partnerships, the ITU has prescribed international standards and performance expectations – such as data rates and latency – for each generation. The partnerships – 3GPP and 3GPP2 – then define the technical standards to meet, and in most cases exceed, these requirements.
ITU defined the 3G requirements in its International Mobile Telecommunications (IMT)-2000 standards.
Generally, 3G, also known as Mobile Broadband, targeted technologies that pushed data rates across the Mbits/sec threshold. The 3G standards described technologies that could achieve data rates of 0.5–5 Mbits/sec and latencies of 100–500 ms.
4G – pushing the data rate limits
Like 3G, there’s no single 4G technology. Rather, 4G is a set of requirements – specifically, IMT-Advanced – published by the ITU in 2008, after network operators and vendors agreed to retire 3GPP2 and converge on LTE as the common standard for all future networks.
The list of requirements for a 4G network is long. But here are a few examples:
- Download speed of 50 Mbits/sec for mobile clients. Yes, you read that right: if your network provider claims to offer you 4G service, you must be able to achieve a download speed of 50 Mbits/sec under peak conditions.
- Gbits/sec speed for stationary clients – e.g. fiber optics clients
- Latency below 100 ms.
- Interoperability with previous standards (3G and 2G) – your phone should not stop working just because there’s no 4G or 3G coverage.
Any composition of technologies that can meet these requirements is regarded as 4G.
In 2010, the 3GPP project released the 4G LTE-Advanced technical standards to meet these requirements.
Even though LTE was introduced as a successor to 3G’s HSPA+, the evolution of HSPA+ continued, and many providers have been able to use these same tools to meet the 4G requirements.
Enter 5G – crushing data rate limits
Like its predecessors, 5G is not a technology. Rather, it is a set of requirements mobile network providers must meet as they advance their capabilities. The ITU released its first draft of the requirements for 5G, called IMT-2020, in 2015. These requirements are expected to be completed this year and will shape the global standard for mobile communications this decade.
Like its predecessors, the list of requirements for a 5G network is long. Here are a few relevant examples:
- Minimum connection density of 1 million devices per square km – the service provider must be able to connect at least 1 million devices within each square kilometer of coverage
- Download peak data rate of 20 Gbits/sec – the ideal situation for stationary users
- Upload peak data rate of 10 Gbits/sec – the ideal situation for stationary users
- What you should get on your phone (download): 100 Mbits/sec
- What you should get on your phone (upload): 50 Mbits/sec
- Latency of 1 ms
Despite the increased power, your phone’s battery should last as long as it does now.
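Requirements like these are ultimately just thresholds, so they are easy to express as a simple check. Here is a minimal sketch (the threshold numbers come from the list above; the function name and the sample measurements are hypothetical, made up for illustration):

```python
# User-experienced performance targets quoted in this article for IMT-2020.
IMT_2020_USER_TARGETS = {
    "download_mbits": 100,  # what you should get on your phone (download)
    "upload_mbits": 50,     # what you should get on your phone (upload)
    "latency_ms": 1,        # round-trip responsiveness target
}

def meets_user_targets(download_mbits, upload_mbits, latency_ms):
    """Hypothetical helper: does a measured connection hit the 5G user targets?"""
    t = IMT_2020_USER_TARGETS
    return (download_mbits >= t["download_mbits"]
            and upload_mbits >= t["upload_mbits"]
            and latency_ms <= t["latency_ms"])

print(meets_user_targets(120, 60, 0.8))  # → True
print(meets_user_targets(120, 60, 12))   # → False: latency far above 1 ms
```

The striking part is the latency line: a connection can beat the speed targets comfortably and still fail on its round-trip time.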
These requirements are ambitious. Besides the increased download speeds the networks are required to deliver to users, they are also required to deliver that level of service to more users – 1 million per square kilometer.
The user density requirement is in response to increased demand for connectivity by gadgets such as wristwatches, TVs, refrigerators, cars, street lights, etc.
The 3GPP project is actively developing standards and technologies expected to meet these requirements.
Can 5G make you sick?
If you have ever asked this question, your concerns are valid. But the short answer is NO. 5G will not make you sick.
But before you run off with that answer, let’s consider why.
Just like the other Gs before it, 5G is not a technology. Rather, it is a set of requirements describing standards mobile service providers must meet to be considered contemporary.
Most concerns about the health effect of 5G center around the technologies that carriers build to meet the 5G requirements – specifically, the use of the millimeter waves.
On the electromagnetic spectrum, millimeter waves lie roughly between radio waves and visible light. Unlike lower-frequency radio waves, millimeter waves have smaller wavelengths and are generally poor at penetrating vegetation and thick walls. But they can carry much larger amounts of data at better rates.
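The name comes straight from the physics: wavelength λ = c/f. A quick check, assuming the commonly cited 30–300 GHz millimeter-wave band (the band edges are my assumption, not stated in the text):

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def wavelength_mm(frequency_ghz):
    """Wavelength in millimetres for a signal of the given frequency in GHz."""
    return SPEED_OF_LIGHT / (frequency_ghz * 1e9) * 1000

print(f"{wavelength_mm(30):.1f} mm")   # → 10.0 mm at 30 GHz
print(f"{wavelength_mm(300):.1f} mm")  # → 1.0 mm at 300 GHz
```

Across that whole band, the wavelength stays between one and ten millimetres – hence the name.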
With most of the radio-wave spectrum already allocated, many carriers are turning to millimeter waves to transmit the amount of data 5G requires. This means carriers would have to deploy many more access points, spaced closer together than they currently are.
The two questions many concerned communities have asked are: first, is the millimeter-wave signal itself dangerous? And second, will a large number of access points – deployed close to people’s homes – expose humans to dangerous radiation?
The scientific consensus on both questions is still NO.
Radio waves, millimeter waves, and visible light all belong to the part of the electromagnetic spectrum called non-ionizing radiation – they carry too little energy to break chemical bonds in cells.
X-rays and gamma rays, at the higher end of the spectrum, belong to the ionizing-radiation class.
In fact, the millimeter-wave spectrum is not required to achieve 5G. T-Mobile, the largest 5G provider in the U.S., uses low frequencies originally allocated to broadcast television.