Making 4G networks IoT and M2M-ready
02 October 2017
Most IoT devices will communicate using wireless machine-to-machine (M2M) communications technologies. Currently, 2G and 3G technologies dominate the cellular IoT market, but the future belongs to 4G, with NB-IoT (narrowband IoT) and LTE-M, also known as eMTC (enhanced machine type communications). They will enable mobile operators to address a wider share of the wireless IoT market.
This article also appears in the digital issue of EPDT's magazine.
Features such as power saving mode (PSM), extended discontinuous reception (eDRX) cycles and coverage enhancement (CE) can tune the wireless interface to the needs of IoT applications. An optimised core architecture that also allows non-IP communications will further help to adapt 4G networks for the Internet of Things. To meet the specific performance and availability requirements, the right features and parameters need to be selected and all communications layers must work together perfectly. This creates a need for end-to-end application testing, in order to optimise performance parameters (for example, power consumption and reaction times).
Approximately 60% of today’s cellular IoT devices use 2G or 3G technologies (see Figure 1). Typical applications include fleet management, ATM banking services and personal health monitoring, which tend to generate little data traffic.
Because 4G LTE is optimised for mobile broadband, IoT applications have generated little demand for LTE over the last few years. Some aspects of LTE, however, make it increasingly attractive. LTE offers advantages in spectral efficiency, latency and data throughput. Another advantage is global accessibility: according to GSMA, 4G LTE networks covered more than 60% of the global population in 2016. By the end of the decade, developing countries are expected to reach 60% coverage. The long-term availability of LTE is another consideration. 2G has been around for over 25 years, and operators are considering whether to discontinue the service. Therefore, the industry is looking for LTE solutions that – in terms of cost, power consumption and performance – compete with 2G networks and low power wireless technologies, such as Sigfox or LoRa.
3GPP standardisation for IoT
Optimisations for machine-type communications (MTC) have been developed within the 3GPP framework. One example is protecting the network when several thousand devices try to connect simultaneously, as can happen when the power grid comes back after an outage. Overload mechanisms for reducing signalling traffic have been introduced to handle such extreme signalling loads. Many IoT applications (sensor networks, for example) send data infrequently and do not need second-by-second responsiveness. These devices can be configured to accept longer delays during connection setup (delay tolerant access). Release 10 includes a process that permits the network to reject connection requests initially and delay resubmission (extended wait time). With 3GPP Release 11, access can be controlled by setting up access classes: the network transmits an extended access barring (EAB) bitmap that identifies which classes are permitted access. These processes ensure reliable and stable operation of IoT applications without endangering the mobile broadband service.
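At the device end, the EAB check itself is straightforward: the modem reads the broadcast bitmap and tests the bit corresponding to its own access class. The sketch below is a deliberate simplification – the full 3GPP procedure (TS 22.011 / TS 36.331) also involves EAB categories and the special access classes 11–15, which are ignored here.

```python
# Simplified sketch of the extended access barring (EAB) check.
# Assumption: EAB categories and special access classes 11-15 are ignored;
# the real 3GPP procedure also takes those into account.

def eab_barred(access_class: int, eab_bitmap: int) -> bool:
    """Normal devices hold an access class 0-9 (stored on the SIM);
    bit i of the broadcast bitmap bars class i."""
    if not 0 <= access_class <= 9:
        raise ValueError("normal access classes are 0-9")
    return bool((eab_bitmap >> access_class) & 1)

bitmap = 0b0000011111             # classes 0-4 barred, 5-9 permitted
print(eab_barred(3, bitmap))      # -> True: a class-3 device must back off
print(eab_barred(7, bitmap))      # -> False: a class-7 device may attempt access
```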
Low cost, low power devices
In Release 12, the 3GPP committee started working on optimised solutions that addressed requirements such as low data traffic, low power consumption and low costs. It quickly became clear that there is no single, simple solution. The requirements for applications such as container tracking, waste bin management, smart meters, agricultural sensors, and sports and personal health trackers are too varied. Release 12 concentrated on reduced power consumption and cost-effective modems.
Figure 1. Number of cellular M2M connections in billions (source: Cisco VNI Mobile 2017).
The results were a power saving mode (PSM), which is important for battery-operated devices, and a new LTE device Category 0, which targets 50% less complexity than an LTE Category 1 modem.
PSM is a kind of deep sleep mode. The receiver is switched off for a period, so the device cannot be reached via paging, but the modem remains registered on the network. Consequently, PSM is not suitable for applications that expect a time-critical reaction. Applications that use PSM must tolerate this behaviour, and the design process must include specifying optimal timer values for idle mode and power-saving mode.
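Those timer values are requested from the network during attach or tracking area update; on modems that expose an AT interface, they are typically set with the AT+CPSMS command (3GPP TS 27.007) as 8-bit strings following the GPRS Timer 2/3 encodings of TS 24.008. A small sketch of that encoding – the 1-hour reporting period and 2-minute reachable window chosen here are arbitrary example values:

```python
# Hedged sketch: encode requested PSM timers as the 8-bit strings used by
# AT+CPSMS (3GPP TS 27.007). The (step, unit-bits) tables follow the
# GPRS Timer 2 (T3324, active time) and Timer 3 (T3412 ext, periodic TAU)
# encodings of 3GPP TS 24.008; smallest step first.

T3324_UNITS = [(2, 0b000), (60, 0b001), (360, 0b010)]          # 2 s, 1 min, 6 min
T3412_UNITS = [(2, 0b011), (30, 0b100), (60, 0b101),
               (600, 0b000), (3600, 0b001), (36000, 0b010),
               (1152000, 0b110)]                                # up to 320 h steps

def encode_timer(seconds, units):
    """Pick the smallest unit that represents `seconds` exactly (value <= 31)."""
    for step, unit_bits in units:
        value = seconds // step
        if seconds % step == 0 and value <= 31:
            return f"{unit_bits:03b}{value:05b}"
    raise ValueError("duration not representable")

# Example: hourly periodic TAU, 2-minute active (reachable) window.
tau = encode_timer(3600, T3412_UNITS)   # -> "00000110" (10 min unit, value 6)
act = encode_timer(120, T3324_UNITS)    # -> "00100010" (1 min unit, value 2)
print(f'AT+CPSMS=1,,,"{tau}","{act}"')
```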
LTE Category 0 was the first attempt at permitting significantly less expensive LTE modems, but since it simply did not meet market requirements, the industry quickly looked ahead to the next LTE-M generation.
Release 13 introduced LTE Category M1, as part of the work on eMTC, which added cost-reduction measures – particularly lower bandwidths in the uplink and downlink, lower data rates and reduced transmit power.
NB-IoT was developed in parallel with LTE Category M1. Its profile includes extremely low power and costs, as well as improved reception in buildings and support for devices with very little data traffic. The new LTE Category NB1 has a 180 kHz bandwidth and can be deployed in unused LTE resource blocks, free spectrum between neighbouring LTE carriers (guard band), or standalone (such as in unused GSM carriers).
eDRX is another power-reduction feature introduced in Release 13. In idle mode, the modem periodically goes into receive mode to pick up paging messages and system information; the DRX timer determines how often this occurs.
Currently, the shortest interval for the idle DRX timer is 2.56 seconds – a fairly short interval for a device that, for instance, expects data only every 15 minutes and has relaxed delay requirements. eDRX allows a much longer interval of up to roughly 2.9 hours for NB-IoT, or roughly 44 minutes for LTE-M (depending on the application requirements and network support).
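The effect on paging activity is easy to quantify. The sketch below compares the number of paging wake-ups per day for the default idle DRX cycle and the maximum eDRX cycles (cycle lengths taken from the values standardised in 3GPP TS 24.008; the comparison itself is purely illustrative):

```python
# Illustrative comparison of paging wake-ups per day for different cycles.
# Cycle lengths are standardised values; the wake-up count is simple arithmetic.
DAY = 24 * 3600
cycles = {
    "default idle DRX (2.56 s)":                    2.56,
    "eDRX, LTE-M maximum (2621.44 s ~ 43.7 min)":   2621.44,
    "eDRX, NB-IoT maximum (10485.76 s ~ 2.91 h)":   10485.76,
}
for name, cycle in cycles.items():
    print(f"{name}: ~{DAY / cycle:.0f} paging wake-ups per day")
# Roughly 33750 wake-ups/day shrink to about 33 (LTE-M) or 8 (NB-IoT).
```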
Figure 2. Parameters influencing the battery lifetime of a device.
PSM and eDRX differ in the time allowed to remain in sleep mode and in the procedure to switch into receive mode. To be reachable again, a device using PSM must first go into active mode – whereas a device using eDRX can stay in idle mode and go directly to receive mode without additional signalling.
LTE-M and NB-IoT also offer coverage enhancement features for use cases such as smart meters. One main principle is redundant transmission: sending the same data repeatedly over time. This, of course, has some impact on power consumption and latency.
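The trade-off behind repetition can be sketched with a first-order model: under the idealised assumption of coherent combining at the receiver, repeating a transmission R times buys roughly 10·log10(R) dB of coupling-loss headroom, at the cost of R times the airtime and transmit energy. Real gains depend on channel conditions and receiver implementation, so treat the numbers as an upper bound.

```python
import math

# First-order sketch (assumption: ideal coherent combining of repetitions).
# R repetitions give ~10*log10(R) dB of extra link budget, but cost R times
# the airtime and transmit energy - the power/latency impact noted above.
for reps in (1, 2, 8, 32, 128):
    gain_db = 10 * math.log10(reps)
    print(f"{reps:4d} repetitions: ~{gain_db:4.1f} dB gain, {reps}x airtime/energy")
```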
For completeness, 3GPP introduced in Release 13 an optimised network architecture that allows, for example, the use of the control plane to transmit user data, which can greatly reduce the signalling overhead for transmitting very small amounts of data. In this context, a new core functionality named SCEF (service capability exposure function) was introduced. This function allows optimised non-IP communications with the device via the control plane, while exposing all required functionality to the application layer via a RESTful interface. To improve the signalling for traffic using the user plane, it is now possible to suspend and resume connections.
Based on more concrete market requirements, 3GPP drove further improvements in LTE-M and NB-IoT, now called further enhancements for MTC (feMTC) and enhancements for NB-IoT (eNB-IoT). As a result, two new LTE Categories were defined in Release 14: Cat M2 and Cat NB2. Important topics addressed include higher data rates for LTE-M; a lower power class for NB-IoT; lower latency; better positioning; multicast capabilities to allow software upgrades; and improvements for VoLTE on LTE-M devices. Table 1 gives an overview of the different LTE Categories for diverse IoT application requirements. 3GPP is already working on additional improvements in Release 15, which includes LTE-TDD support for NB-IoT and further improvements for latency, power consumption, load control and mobility.
End-to-end application testing
Theoretical calculations about battery lifetime are useful, but systems can behave differently in reality, and behaviours can also change over time. The overall communications performance of the end-to-end application – including communications triggers (client-initiated, server-initiated and periodic), delay requirements, network configuration, data throughput and mobility – needs to be considered (see Figure 2).
The challenge for developers is to use PSM and eDRX in the most efficient way. This requires analysis of anything that influences power consumption, beginning with both device and server-side applications, but also including the behaviours of the mobile network and IP network.
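A simple average-current model is a reasonable starting point for such an analysis. In the sketch below, every current and timing figure is an assumed placeholder (a 2000 mAh cell, 100 mA active current, a 3 µA PSM floor), not a measured value for any real modem – end-to-end testing exists precisely because real devices deviate from such paper models.

```python
# Hedged back-of-envelope battery model: all figures are illustrative
# assumptions, not measured values for any particular modem or network.

def battery_life_years(capacity_mah, i_active_ma, t_active_s,
                       i_sleep_ua, period_s):
    """Average one report cycle (active burst + PSM sleep) over the period."""
    charge_per_cycle_mah = (i_active_ma * t_active_s +
                            (i_sleep_ua / 1000) * (period_s - t_active_s)) / 3600
    cycles = capacity_mah / charge_per_cycle_mah
    return cycles * period_s / (365 * 24 * 3600)

# Assumed example: 2000 mAh cell, 100 mA for 10 s per daily report,
# 3 uA PSM floor between reports.
print(f"{battery_life_years(2000, 100, 10, 3, 86400):.1f} years")  # -> 15.7 years
```

Changing the reporting period from daily to hourly in this model cuts the estimate to under a year – which is why tuning PSM/eDRX timers against the application's real traffic pattern matters so much.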
RF performance, battery consumption, protocol behaviour and application performance should also be considered. Initial work can be done on paper, but it is very useful to verify the results under controlled and simulated – yet realistic – network conditions. This verifies model assumptions and reveals the impact of real-world conditions. Scenarios in which the network does not support a feature or uses different timers can be verified.
Table 1. Different LTE categories for meeting diverse IoT application requirements.
A unique test solution
Manufacturers of test and measurement equipment are addressing the growing demand for test, verification and optimisation of end-to-end applications, which goes beyond RF and protocol testing. Rohde & Schwarz, for example, offers a solution based on the R&S CMW500/290 multi-radio communication test platform, the R&S CMWrun sequencer tool, and the R&S RT-ZVC02/04, which is a new multichannel power probe. It gives users a detailed view of different parameters, such as mobile signalling traffic, IP data traffic and power consumption – on one platform. The platform simultaneously emulates, parameterises and analyses wireless communications systems and their IP data throughput – something that cannot be done in real networks.
R&S CMWrun allows straightforward configuration of test sequences, for controlling the instrument remotely without requiring programming knowledge. It also provides full flexibility when configuring parameters and limits for the test items. One of the key differentiators of this solution is the intuitive way the user can combine and run applications in parallel with common event markers from signalling or IP activities. The solution is able to show the power consumption based on very accurate power measurements, from up to four independent measurement channels.
For example, in end-to-end application tests, synchronised traces show current drains and IP data throughput. During analysis, synchronised event markers, which indicate signalling events or IP status updates, are displayed in both graphs. This ensures a deeper testing level, where the user can see the impact of a signalling or IP event on the current drain and IP throughput, making it easier to understand the dependencies and to optimise the application parameters.
The starting point may be to consider overall communications behaviour – for example, the number of IP connections, transmitted messages, or communications and signalling events. Moreover, users may benefit from seeing the power consumption in different activity states, or in eDRX or PSM status – as well as the power consumption of the different power domains of the device. Later, it would be useful to tune the related parameters for eDRX or PSM, and probably application behaviour. Finally, it may help users to analyse different scenarios that reflect possible real-world situations. End-to-end application testing is therefore becoming more and more important for meeting challenging application requirements, such as 10-year battery lifetimes.
GSMA, The Mobile Economy 2017, report published at MWC 2017.
Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2016–2021, White Paper, 2017.