Making the IoT a reality, now that the hype is past

Where are all the Things that we were promised would be linked to make an Internet of Things (IoT)? A decade ago, very senior telecoms executives were insisting that, by now, tens of billions of Things would be connected to the Internet to create widely distributed networks of sensors. These networks would create data that would be processed to create deep insights that could optimize everything from traffic flow to agricultural yields.

It didn’t happen – but why? A moment’s thought back then would have told us why. IoT deployments usually involve widely distributing many sensing nodes into arbitrary locations from which they are supposed to operate autonomously and communicate wirelessly for years. It’s a big technical challenge, it’s a big logistical challenge, and so it’s a big cost-control challenge. And if the costs aren’t right, then any IoT project won’t get the go-ahead.


Costs

The first challenge, therefore, is the per-device cost. IoT nodes are often quite sophisticated, carrying sensors and signal-conditioning circuitry, analog to digital converters, microcontrollers, RF subsystems, security hardware, and more. This leads to complex and bulky boards, which pushes up the per-device bill of materials (BoM) and hence the capital expenditure (CapEx) of a deployment. The node's bulk also limits where it can be deployed.

The node’s size is also driven by the size of the battery, which in turn is driven by the type of sensing it is doing, the energy needed to do that sensing, how often it reports readings, and for how long the node is supposed to operate autonomously. And so, a CapEx and deployment issue rapidly becomes an operating-expense (OpEx) issue as well: you can add a bigger battery so that the node runs for longer, but this pushes up the CapEx and further limits deployment choices. And there are environmental issues, too: large IoT deployments usually demand the manufacture, recovery, and recycling of large numbers of batteries.
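The battery-versus-runtime trade-off described above can be sketched with a simple duty-cycle calculation. All the figures in this example – battery capacity, sleep and active currents, report duration and frequency – are illustrative assumptions, not data from the article:

```python
# Hedged sketch: rough battery-autonomy estimate for a sleep/report IoT node.
# Every numeric input below is an illustrative assumption.

def runtime_days(battery_mah, voltage_v, sleep_ua, active_ma,
                 active_s_per_report, reports_per_day):
    """Estimate how many days a battery lasts given a simple
    sleep/wake duty cycle."""
    battery_j = battery_mah / 1000 * 3600 * voltage_v        # capacity in joules
    sleep_j_per_day = sleep_ua / 1e6 * voltage_v * 86400     # always-on sleep draw
    active_j_per_day = (active_ma / 1000 * voltage_v
                        * active_s_per_report * reports_per_day)
    return battery_j / (sleep_j_per_day + active_j_per_day)

# Example: 2400 mAh cell at 3 V, 2 uA sleep, 100 mA active for 5 s, 24 reports/day
print(round(runtime_days(2400, 3.0, 2, 100, 5, 24)), "days")
```

Doubling the report rate roughly halves the runtime once active energy dominates, which is why report frequency is such a strong driver of battery size.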

It will therefore take substantial development on multiple fronts to make IoT network deployments more practical.

Node autonomy

One of the most important steps will be to make the nodes more autonomous by enabling them to run on energy scavenged from their environments – from solar panels, vibration-driven generators, or Peltier devices that exploit thermal differences. Achieving this will reduce the CapEx and OpEx associated with buying and replacing batteries, and cut e-waste.

As with any apparently ‘free lunch’, there’s a cost: in this case, developing circuitry that can capture, convert, store, and deploy such energy efficiently. Current power-management ICs (PMICs) tend to need multiple external components, which increases each node’s cost and size.

Wireless communication presents another challenge. Wireless modules are often too big, use too much energy, and cost too much to be practical for IoT nodes. IoT nodes that use cellular wireless communications will also need subscriber identity modules (SIMs), which demand relatively large board-mounted connectors and use precious energy.

An IoT platform ready to be widely deployed

To address these issues, Murata, Deutsche Telekom, and Nexperia have jointly developed an IoT platform that can run on harvested energy, offers reasonable performance, doesn’t cost too much, and is small enough to be widely deployed.

The Autonomous Cellular LPWA Development Solution (ACDS) platform has three key elements. The first is a compact Murata dual-mode cellular IoT module that supports data rates of 26.15 kbps over the NB-IoT protocol and 1 Mbps over Cat.M1. It supports extended discontinuous reception (eDRX) and power-saving modes to minimize energy use.
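Those link rates put a floor under how long the radio must stay active per report. A minimal sketch of the time-on-air for a small payload at the quoted rates – ignoring protocol headers, retransmissions, and RF wake-up overhead, and using a hypothetical 200-byte payload:

```python
# Illustrative sketch: time-on-air for a payload at the link rates quoted
# in the text. Overheads (headers, retransmissions, wake-up) are ignored.

def airtime_s(payload_bytes, rate_bps):
    """Seconds the radio must transmit to move the payload at the given rate."""
    return payload_bytes * 8 / rate_bps

payload = 200  # bytes, hypothetical sensor report
print(f"NB-IoT (26.15 kbps): {airtime_s(payload, 26_150):.3f} s")
print(f"Cat.M1 (1 Mbps)    : {airtime_s(payload, 1_000_000):.4f} s")
```

The slower NB-IoT link keeps the radio on tens of times longer per report, which is why eDRX and power-saving modes matter so much for the energy budget.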

Deutsche Telekom supplies its nuSIM technology, which integrates subscriber information directly into the wireless module, reducing connector and component count and doing away with the plastic waste of a conventional SIM. nuSIMs are around 35% faster at connecting to a network than conventional SIMs, which saves energy, as does the fact that the module no longer needs to run continuous SIM-presence detection checks.
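The energy value of a faster attach can be sketched from the 35% figure in the text. The baseline attach time and current draw below are purely hypothetical placeholders, chosen only to show the shape of the calculation:

```python
# Hedged sketch: energy saved per network attach when connection is 35%
# faster (figure from the text). Baseline attach time and current draw
# are hypothetical assumptions.

def attach_energy_mj(attach_s, current_ma, voltage_v=3.0):
    """Energy consumed during a network attach, in millijoules."""
    return current_ma / 1000 * voltage_v * attach_s * 1000

baseline_mj = attach_energy_mj(10, 100)         # hypothetical 10 s attach at 100 mA
nusim_mj = attach_energy_mj(10 * 0.65, 100)     # 35% faster attach
print(f"saving per attach: {baseline_mj - nusim_mj:.0f} mJ")
```

On a harvested-energy budget measured in millijoules per day, a saving of this order on every attach is significant.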

Nexperia supplies its NEH2000BY PMIC, which manages energy transfer from the ACDS’ photovoltaic panel to a supercapacitor-based energy store. The PMIC can manage energy flows down to 10µW, meaning that energy can be harvested from the panel even on a cloudy day. The PMIC uses capacitive power conversion, which needs fewer passive components than competing approaches, as well as achieving an 80% average conversion efficiency.
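The daily energy delivered through such a PMIC is easy to estimate. This sketch uses the 80% conversion-efficiency figure from the text; the panel output and hours of light are illustrative assumptions:

```python
# Hedged sketch: daily energy banked in the store through a harvesting PMIC.
# The 80% efficiency comes from the text; panel power and light hours are
# assumptions for illustration.

def harvested_j_per_day(panel_uw, light_hours, efficiency=0.80):
    """Energy delivered to the energy store per day, in joules."""
    return panel_uw / 1e6 * efficiency * light_hours * 3600

# Example: a small panel yielding 50 uW in dim light, for 6 h per day
print(f"{harvested_j_per_day(50, 6):.3f} J/day")
```

Even sub-joule daily budgets like this can sustain a node, provided the sensing and reporting duty cycle is trimmed to fit.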

The early hype about the rollout of IoT networks was based on the idea that the details would take care of themselves. It turns out that the devil is in the details. A robust business case for an IoT deployment takes calculations that start from the required functionality and work up through the energy cost of sensing and communicating data, the BoM needed to build a node to do it, the CapEx of creating enough nodes, and the OpEx of sustaining their operation.

For example, if the ACDS solar panel is under 3,000 lux illumination – as on a cloudy day – for six hours a day, its PMIC will be able to store enough energy to support 60 NB-IoT transmissions per day – and do so indefinitely. It’s this kind of holistic thinking about lifetime costs, combined with the capabilities of platforms such as ACDS from Murata, Deutsche Telekom, and Nexperia, that is needed to turn the hype of the IoT’s early days into a reality for today.
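A back-of-envelope check of the kind of budget described above can be sketched as follows. The 80% efficiency and six-hour cloudy-day window come from the text; the panel output and the per-transmission energy cost are hypothetical figures chosen only to illustrate the arithmetic:

```python
# Hedged sketch: how many reports per day a harvested-energy budget supports.
# Efficiency (80%) and light hours (6) are from the text; panel power and
# per-transmission cost are hypothetical assumptions.

def tx_per_day(panel_uw, light_hours, tx_cost_mj, efficiency=0.80):
    """Whole transmissions per day sustainable on harvested energy alone."""
    harvested_mj = panel_uw / 1e6 * efficiency * light_hours * 3600 * 1000
    return int(harvested_mj // tx_cost_mj)

# Assume a ~100 uW panel under dim light and ~28 mJ per NB-IoT report
print(tx_per_day(100, 6, 28), "transmissions/day")
```

With these assumed inputs the result lands in the region of the 60 transmissions per day quoted for the ACDS, which is the point of such a calculation: the business case falls straight out of the energy budget.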