How IoT Has Changed, and Why It’s Still Considered ‘Emerging Tech’
Author: ISACA Now
Date Published: 20 July 2021

Even though the IoT concept originated more than two decades ago, organizations in 2021 continue to describe IoT as an emerging technology. But why would it still be considered emerging if it is over 20 years old and has already woven itself so deeply into our lexicon?

Before we examine how IoT has changed and what it means for the future of technology, let’s discuss its history. After all, sometimes it takes going back to the past to understand the future.

First, IoT stands for the “Internet of Things,” a term coined in 1999 by Kevin Ashton, a British technology pioneer who was working at Procter & Gamble at the time and went on to help establish the Auto-ID Center at the Massachusetts Institute of Technology (talk about a glow-up, right?). Ashton worked in supply chain optimization, and he was trying to get senior management to embrace a new piece of technology: RFID, or radio-frequency identification. RFID uses electromagnetic fields to automatically identify and track tags attached to objects. Many personal items, such as purses and wallets, can be fitted with RFID tags so they can be located if they go missing.

However, IoT has evolved greatly since its inception. It is now generally understood to cover any device that can connect to the internet and collect and share data. Cellphones, smartwatches and even your couch (if it’s Wi-Fi-enabled, of course) could be considered IoT devices.
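In practice, “collect and share data” can be as simple as a device taking a sensor reading and sending it over the network. The sketch below is a minimal, hypothetical illustration of that pattern; the endpoint URL, device ID and temperature reading are all invented for the example, and a real device would typically use its vendor’s cloud API or a lightweight protocol such as MQTT rather than a bare HTTP POST.

```python
import json
import time
import urllib.request

# Hypothetical ingestion endpoint; a real device would point at its
# vendor's cloud service instead.
INGEST_URL = "https://example.com/iot/ingest"


def read_temperature_c() -> float:
    """Placeholder for a real sensor driver; returns a fixed reading here."""
    return 21.5


def send_reading(device_id: str) -> None:
    """Package one reading with a device ID and timestamp, then POST it."""
    payload = json.dumps({
        "device_id": device_id,
        "temperature_c": read_temperature_c(),
        "timestamp": int(time.time()),
    }).encode("utf-8")
    request = urllib.request.Request(
        INGEST_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        print("Cloud responded with HTTP status", response.status)


if __name__ == "__main__":
    send_reading("living-room-sensor-01")
```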

Based on its origins, we can already see just how much the IoT has changed and how consequential it has become in the technology landscape. ISACA now offers a certificate program to help educate IT staff on IoT.

That’s great, but what is emerging tech?
The concept of emerging tech does not necessarily mean the technology is brand new and has never been used before. Plenty of buzz and excitement continues to swirl around cloud technology, which has been around for many years. Bitcoin is reaching new prominence even though it has been in circulation since 2009. Even AI is often considered emerging, and it has been tinkered with since the 1970s. That’s because emerging tech can also refer to the continuing development of existing technology, which is exactly what applies to IoT.

Consumer electronics are becoming more affordable, more energy-efficient and smaller. For example, IBM recently unveiled a two-nanometer chip; two nanometers is roughly the diameter of a strand of DNA. Moreover, the smartwatch on your wrist has more computational power than home computers did in the 1980s and 1990s. Smaller parts make a device cheaper to manufacture, which in turn means it costs less for the consumer. And the lower the cost, the higher the adoption rate and the further the technology emerges.

So, what are some of the ways that IoT will continue to emerge or evolve?
Our digital lives are intertwining, especially when it comes to emerging tech. IoT devices send data to their respective cloud infrastructures, where datasets are built and machine learning and AI algorithms are trained on our digital choices. End users may not be able to spot every risk or potential breach on their own, but an AI system with the requisite data at its disposal may be able to alert them before an issue ever occurs. The convergence of IoT, cloud, AI and even blockchain technology is fast becoming its own technological ecosystem.
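To make that loop a little more concrete, here is a minimal, hypothetical sketch of the cloud-side half: incoming readings from a device are compared against a simple rolling baseline, and an alert is raised when a value drifts well outside it. The readings, window size and threshold are invented for illustration, and a production platform would rely on trained machine learning models rather than this basic statistical check.

```python
import statistics
from collections import deque


class DriftAlerter:
    """Flags readings that drift far from a rolling baseline.

    A stand-in for the ML/AI models a real IoT cloud platform would run;
    the window size and threshold are arbitrary illustration values.
    """

    def __init__(self, window: int = 20, threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def check(self, value: float) -> bool:
        """Return True if the value looks anomalous against recent history."""
        anomalous = False
        if len(self.history) >= 5:
            mean = statistics.mean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-9
            anomalous = abs(value - mean) / stdev > self.threshold
        self.history.append(value)
        return anomalous


if __name__ == "__main__":
    alerter = DriftAlerter()
    # Simulated telemetry: steady readings followed by a sudden spike.
    for reading in [21.0, 21.2, 20.9, 21.1, 21.0, 21.2, 35.0]:
        if alerter.check(reading):
            print(f"Alert: reading {reading} deviates from the recent baseline")
```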

Smart homes, smart buildings and smart cities are emerging as well. One day soon, it won’t be unusual to buy a home that already has some form of built-in IoT capability. Interconnected smart cities could offer better traffic management, enhanced air quality and more efficient waste management. Connected cities could quite literally change the lives of all their citizens.

We already know that 5G technology is being used in the world of cellphones and smartwatches. Less discussed, however, is that some companies are already beginning work on 6G to make communication even faster. As speeds increase, communication will become more accessible and devices will be able to send and receive more data.

Earlier, we mentioned how small technology is becoming in the case of IBM’s two-nanometer chip. That’s not likely where it will end. As components approach the atomic scale, we will see technology that takes advantage of quantum computing, although that may be several years in the future. We may even see nanotechnology or nanobots become a reality, too.

During the pandemic, we have seen a lot of biofeedback technology emerge. IoT devices that measure oxygen levels in a person’s body helped identify concerns or disorders related to oxygen depletion, which could have signaled a possible COVID-19 infection. Biofeedback features are increasingly available on cellphones and smartwatches, some of which can measure heart rate and other vital signs and transmit the data directly to your doctor. If your doctor can receive your vitals electronically beforehand, you could know ahead of time whether a trip to the office is needed.
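As a simple, hypothetical sketch of that workflow, a wearable’s companion app might screen pulse-oximeter readings and forward only the concerning ones to a clinician’s system. The 92 percent cutoff, the sample readings and the send_to_clinician function are invented for illustration and are not medical guidance.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative cutoff only; not medical guidance.
SPO2_ALERT_THRESHOLD = 92.0


@dataclass
class VitalsReading:
    spo2_percent: float   # blood oxygen saturation
    heart_rate_bpm: int
    taken_at: datetime


def send_to_clinician(reading: VitalsReading) -> None:
    """Placeholder for transmitting a reading to a doctor's system."""
    print(f"Forwarding to clinician: SpO2 {reading.spo2_percent}%, "
          f"HR {reading.heart_rate_bpm} bpm at {reading.taken_at.isoformat()}")


def screen_reading(reading: VitalsReading) -> None:
    """Forward only readings that fall below the illustrative threshold."""
    if reading.spo2_percent < SPO2_ALERT_THRESHOLD:
        send_to_clinician(reading)
    else:
        print(f"SpO2 {reading.spo2_percent}% looks fine; logging locally.")


if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    screen_reading(VitalsReading(spo2_percent=98.0, heart_rate_bpm=72, taken_at=now))
    screen_reading(VitalsReading(spo2_percent=90.5, heart_rate_bpm=88, taken_at=now))
```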

All in all, IoT offers many advantages for a technology-driven society like the one we live in now. Of course, there are cybersecurity risks inherent in devices that collect and share data. Cybercriminals can break into IoT devices, and the more devices that are created, the more opportunities there are for breaches. That’s why, as IoT technology continues to develop and emerge, IT professionals will need to develop along with it by training and staying up to date on the latest in IoT technology.

For more information on the evolution of IoT, watch our recent LinkedIn Live session featuring the expertise of Dustin Brewer, ISACA’s senior director of emerging technology and innovation.