A brief history of the Internet of Things
by Majeed Ahmad on 12-08-2014 at 2:00 pm

The Internet of Things (IoT) is apparently the next big thing, but it means different things to different people. To some it's all about connecting a web of devices; to others it's synonymous with sensors and wearable devices. And the scope of IoT is expanding by the day, taking in smart lighting, smart thermostats, smart homes, and smart buildings. Even cameras and cars are increasingly being brought under the IoT fold.

To understand the larger concept, and to get some clarity amid the IoT bandwagon, it's worth taking a brief historical detour. Where did the technology originate? What was it initially aimed at? How has the concept evolved over the years?

The vision thing

The Internet of Things is not a new idea. In 1988, Mark Weiser, a technologist at the Computer Science Lab of Xerox Palo Alto Research Center (PARC), put forward the notion of ubiquitous computing as information technology's next wave after the mainframe and the personal computer. In this new world, what he called "calm technology" would reside all around us, interacting with users in natural ways and anticipating their needs.

Weiser coined the term "ubiquitous computing" to describe a future in which personal computers would be replaced by invisible computers embedded in everyday objects. He believed this would lead to an era of computing in which technology, rather than overwhelming people, would help them focus on what was really important.

Weiser's work, grounded in research on human-computer interaction and PARC's earlier work on computing, initially sparked efforts in areas such as mobile tablets and software agents. These efforts later morphed into the pursuit of intelligent buildings packed with wireless sensor networks and displays, where information follows people wherever they go. Weiser's vision was shared by many in the PC industry.


Mark Weiser

The first practical manifestation of ubiquitous computing emerged in the early 1990s, when John Doerr, the legendary venture capitalist at Kleiner Perkins Caufield & Byers, started the pen-computing frenzy by funding Go Corp. By 1991, pen-based computing had become the "next big thing" in the technology world. Yet despite the rush, only a single product became commercially available, from GRiD Systems, a small computer outfit on the east side of San Francisco Bay.

But then Apple Computer's chief executive officer, John Sculley, fanned the flames of pen-based computing in a speech about a handheld computer he called the personal digital assistant, or PDA. "Palmtop computing devices will be as ubiquitous as calculators by the end of this decade," he told his audience. Sculley echoed Weiser's vision, touting that computing would eventually take another step on the journey that had run from mainframe to minicomputer to personal computer. In May 1992, he announced the Newton, an amazingly ambitious handheld computer, and set the computer world on fire with his prediction that PDAs such as the Newton would soon create a trillion-dollar market. He professed that the gadget would launch the "mother of all markets."

The Newton failed to connect with the rest of the computing world, and five years after its launch, newly arrived chief executive Steve Jobs abandoned the product to focus on Apple's core Macintosh lineup. But the Newton debacle proved to be a reset that led to a new generation of PDAs focused on more practical features. A plethora of such products sprang up, each offering some sort of interactive capability, and among them was the Palm Pilot, a simple, no-frills compact device that hit the market in February 1996. The Palm Pilot became one of the fastest-selling high-tech toys of the decade. The elegant little computer became an American icon; one million Palm Pilots were sold in the first eighteen months.

The rise of the machines

Next up, the cellular-centric machine-to-machine (M2M) communications industry emerged in 1995, when Siemens set up a dedicated department inside its mobile phone business unit to develop and launch a GSM data module called M1. Based on the Siemens S6 mobile phone and aimed at industrial M2M applications, the module enabled machines to communicate over wireless networks.


Siemens’ M1 module was used by Adshel to transmit data wirelessly via a GSM network

Among these network-centric innovations, the most prominent early contributions came from mobile phone companies, which had been using their 2G and, later, 3G networks to connect everything from jukeboxes to ice machines since the late 1990s. The use of mobile technology as a payment gateway began in Helsinki in 1997, when a company owned by Coca-Cola installed two mobile-optimized vending machines that accepted payment via text messages.

Companies like General Motors and Hughes Electronics were also among the early implementers of M2M technology.

What's in a name?

PDAs and M2M continued their slow and modest march toward interactivity and assimilation into the network. Meanwhile, British technologist Kevin Ashton became interested in incorporating radio frequency identification (RFID) chips into products with smaller form factors while working as an assistant brand manager at Procter & Gamble in 1997.

After a couple of years of work, he proposed using RFID chips to help manage P&G's supply chain problems. He argued in an article that having humans input data was incredibly clumsy and inefficient. Semiconductor chips and sensors, on the other hand, were becoming smaller, cheaper, and less power hungry, so they could be incorporated into just about anything.

Ashton suggested that getting information from the objects themselves could revolutionize the supply chain, and the "Internet of Things" was born. An RFID chip inside a miniature, wirelessly connected device worked much like a simple SIM card, and it expanded the reach of the Internet of Things into healthcare, automobiles, energy, and more.


Kevin Ashton

Vision becomes reality

In 2000, LG Electronics announced plans to launch the first Internet refrigerator. Later, in 2005, the notion of the Internet of Things got official recognition from the communications world when the International Telecommunication Union (ITU) published its first report on the emerging discipline.

The report acknowledged, “A new dimension has been added to the world of information and communication technologies (ICTs): from anytime, anyplace connectivity for anyone, we will now have connectivity for anything. Connections will multiply and create an entirely new dynamic network of networks—an Internet of Things.”

In the 1990s, the concept of remotely monitoring and controlling distributed assets and devices, a.k.a. the Internet of Things, was mostly reserved for large and expensive installations like power plants and dams. Fast forward to 2013, and connected products have expanded into e-books, cars, home appliances, smart grids, manufacturing, fast food, security, healthcare, and more. By 2020, billions of things, from clothes to cars and from body sensors to tracking tags, are forecast to be part of the Internet of Things bandwagon.

Image credit: Xerox PARC and Slideshare.net

The content of this article is based on excerpts from The Next Web of 50 Billion Devices: Mobile Internet's Past, Present and Future. The article was originally written for The Smartphone World.
