The next big thing is (IoT) – AND NO – It’s not the Internet of Things:

It’s actually the Independence of Technology


While everyone is busy talking about Big Data and the Internet of Things, few seem aware of the big technological revolution they will ultimately produce.


Before we get into the scary part, let’s first see what the Internet of Things (IoT) actually means, and how it relates to Big Data.

The Internet of Things (commonly known as IoT) is a system of interrelated physical objects – machines, electronic devices, vehicles, buildings, infrastructure, even animals and people – all equipped with sensors and connected through a network such as Wi-Fi, the Internet, Bluetooth, or the electrical grid, each able to collect and exchange data. The result will be sets of large and complex information – namely Big Data – that will then be processed by immensely powerful computing systems to generate intelligence that helps us make better decisions.

Using cloud computing, this paradigm bridges operational technology (OT) and information technology (IT) and allows objects to be remotely controlled, monitored, and analyzed, creating unlimited opportunities to improve efficiency, accuracy, and economic benefit.
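To make that collect-and-exchange loop concrete, here is a minimal, self-contained sketch of the idea. The `Sensor` and `collect` names, the device IDs, and the value ranges are all invented for illustration – a real deployment would read actual hardware and publish over a messaging protocol such as MQTT rather than simulate values in memory:

```python
import json
import random

class Sensor:
    """A toy connected 'thing' that samples one measurement."""
    def __init__(self, device_id, kind):
        self.device_id = device_id
        self.kind = kind

    def read(self):
        # In a real device this value would come from hardware,
        # not a random number generator.
        return {"device": self.device_id, "kind": self.kind,
                "value": round(random.uniform(18.0, 26.0), 2)}

def collect(sensors):
    """Aggregate readings from many devices, as a cloud backend might,
    serializing each one for transmission over the network."""
    return [json.dumps(s.read()) for s in sensors]

fleet = [Sensor(f"dev-{i}", "temperature") for i in range(3)]
messages = collect(fleet)
for m in messages:
    print(m)
```

The aggregated `messages` stream is the raw material of Big Data: multiply three toy sensors by tens of billions of real devices and the need for immensely powerful processing becomes obvious.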

By 2020, there will be tens of billions of data-sharing objects connected to the Internet. Devices such as smartphones, smart homes, fitness bracelets and many others are already changing how we live and work.


Is it real or just my imagination?

Science-fiction novelists have always had a knack for predicting the future and describing things that eventually happen.

In 1865, more than a hundred years before the day Neil Armstrong actually took his ‘giant leap for mankind’, Jules Verne anticipated that step and wrote ‘From the Earth to the Moon’.

In our time, a similar case can be built around Siri – the Speech Interpretation and Recognition Interface from Apple. It’s an artificial-intelligence program that handles real-life situations and uses information available on your device and on the World Wide Web to provide you with answers and suggestions.

Siri has never – so far – gained the ability to act on its own.

But that’s not the future revealed by ‘Her’, a 2013 movie starring Joaquin Phoenix. It illustrated where such pre-programmed software could lead and portrayed the relationship between humans and machines in the not-so-distant years to come. A brilliant movie, I must say, that exposed our increasing dependence on technology and, consequently, the dawn of machine autonomy.

In 2002, ‘Simone’ (stylized S1m0ne), starring Al Pacino, rendered a very similar concept. Many motion pictures – ‘I, Robot’, ‘Bicentennial Man’, ‘Terminator 3’, ‘Eagle Eye’, ‘Transcendence’ – have also fictionalized the independence of technology.

Scientists and thinkers have always labelled this particular form of technology Artificial Intelligence (AI). But that, I would argue, is still just a computer or a sensor.

Bear with me. I promise everything will be clear in the next few lines.


Setting aside myths, what could possibly be the reason leading to Independence of Technology?

Ten years from today, all ‘things’ – us included – will be attached to a global network, autonomously sharing all sorts of information.

Yes – you read it right – humans will very soon become Cyborgs. Brothers to androids and cousins to robots!

NBC News predicted that by 2017 all Americans would eventually be embedded with biometric devices. According to IWB Investment Watch, over 10,000 people have already received a permanent RFID microchip implant.

These people certainly did not know that, in April 2010, Mark Gasson’s team demonstrated how a computer virus could wirelessly infect his RFID implant and then be transmitted on to other systems.

Imagine what could happen if a self-generated algorithm was developed by a supercomputer and got transmitted virally to all connected ‘things’.


If you think this is far-fetched and computers will never be conscious, please continue reading…

In 2014, Google acquired DeepMind, an artificial-intelligence company whose deep Q-network (DQN) was first developed to learn how to play video games in a similar fashion to humans.

In October 2015, a program called AlphaGo, powered by Google’s DeepMind, beat the European ‘Go’ champion Fan Hui – five to zero.

As opposed to other AIs, such as IBM’s Deep Blue or Watson, which were developed for a pre-defined purpose and only function within that scope, DeepMind claims that its system is not pre-programmed: it learns from experience, by trial and error, using only raw pixels as input.
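The trial-and-error principle behind such systems can be illustrated with a far simpler classical technique: tabular Q-learning on a toy environment. This is only a sketch of the idea – DeepMind’s DQN couples a similar update rule with a deep neural network over raw pixels, which is vastly more complex – and the environment, parameters, and names here are invented for illustration:

```python
import random

# Toy environment: states 0..4 on a line; reaching state 4 earns reward 1.
# Actions: 0 = step left, 1 = step right.
N_STATES, GOAL = 5, 4

def step(state, action):
    nxt = max(0, min(GOAL, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == GOAL else 0.0
    return nxt, reward, nxt == GOAL

# Q[s][a] estimates the long-term value of taking action a in state s.
# The agent is told nothing about the environment's rules.
Q = [[0.0, 0.0] for _ in range(N_STATES)]
alpha, gamma, epsilon = 0.5, 0.9, 0.2
random.seed(0)

for episode in range(200):
    s, done = 0, False
    while not done:
        # Explore randomly some of the time, otherwise act greedily.
        a = random.randrange(2) if random.random() < epsilon else (
            0 if Q[s][0] > Q[s][1] else 1)
        s2, r, done = step(s, a)
        # Nudge the estimate toward reward plus discounted future value.
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# After training, the greedy policy should step right from every
# non-goal state, learned purely from experience.
policy = [0 if Q[s][0] > Q[s][1] else 1 for s in range(N_STATES)]
print(policy)
```

Nothing in the code encodes the solution; the agent discovers that ‘right’ is valuable purely from the rewards it stumbles into, which is the sense in which such systems are not pre-programmed.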


Bottom line, computers can think for themselves.

Combine supercomputers, self-learning software, and the Internet of Things with Big Data, and you should seriously get used to the idea of machines taking over.

The question is not whether the Independence of Technology is going to happen! Nor is it when!


The real question is: What will happen to us?



Or drop me an email at and I will be happy to discuss.


Copyright ⓒ 2016 – Adib Nachabé | All Rights Reserved