The 3rd Platform is Skynet

Warning - this is a technology blog, so some may find it uninteresting :) To break it up, I have allowed Louise (my wife) to add some random comments after she proofread it :)

The use of computers started off with a few use cases in society - mostly confined to military and university research departments, with the odd progressive bank embracing compute to help catalogue transactions. Computing was very much a siloed and manual activity: a single person, using a single machine, for a single application (one to one for one). Computers were only employed for incredibly complex calculations that dramatically saved the time (and therefore expense) of highly qualified scientists and mathematicians, who were typically few in number and hard to come by.


Louise Edit - Basically computing was for complex calculations, not for the average joe



Over time, the use of computers has proliferated into most homes and pretty much every private enterprise or public service we care to think of. This is simply because computers (specifically, the applications that run on them) save society and businesses time, effort and money, which allows that resource to be spent further up the value chain (usually on making new products or services).



Along with computers becoming more affordable and usable, it was the advent of networks (specifically the internet) that opened up the use of computers and their applications dramatically. First of all, computers and their applications could be accessed remotely (you didn't have to be standing next to them, or even in the same building), by more than one person at a time, and for a multitude of tasks (many to one for many). Applications started to talk to other applications and share data. These quantum leaps forward not only made society and business more productive, they enabled a true global market for consumers whilst also (and importantly) enriching and inspiring a new type of value add: knowledge work (the creation of intellectual property).

We are now entering the third age of computing. We still have computers from the 1st age kicking around in dusty old comms rooms and long-forgotten corners of data centres: mainframes running applications coded by people who have also been long forgotten, where the old adage 'if it's not broken - don't fix it' prevails. The networked application age (the 2nd age) is still going strong, and society and businesses alike will continue to invest in this area, but the next big thing in computing is here and it is growing fast.

In the 3rd age, there is a fundamental shift in how computing is used. Computers are now smaller than ever before and so cheap to buy that most people (in the UK, for example) have several on their person at any one time. These tiny computers have more power than some of the computers from the 1st age, and are capable of sensing, gathering and storing huge amounts of data about us, our surroundings and our circumstances without instruction or manual input. Furthermore, networks are genuinely ubiquitous; everything is becoming connected to the internet - your computer, mobile phone, tablet, laptop... even your fridge, your TV, your car and your wrist watch!

So, in the 3rd age, we have more convenient and mobile computers that make manual data entry easier, or that simply gather information in real time for themselves using sensory devices at endpoints (like cameras, heart rate monitors, microphones, temperature gauges, etc.). These computers send the information they gather over networks, handing it off to more powerful computers that are in turn interconnected with many other powerful computers, all receiving and storing vast amounts of important and relevant data. All this information is being aggregated, and clever mathematicians (known as data scientists) are deploying (and constantly fine-tuning) very sophisticated algorithms to make sense of it all. All this, so that human beings can make better-informed decisions about every aspect of their lives - or, in some instances, so that the decisions are taken out of our hands and instructions are fed back to our interconnected endpoints (perhaps your fridge needs to be cooler because the vindaloo you left in last night is melting the cheese?!). As such, computers are becoming a collective consciousness for the human race. Humans, more so than ever, are being advised by computers, not instructing them. Data collection, interconnection and analysis are all automated to the point where we are becoming genuine consumers of computing as opposed to feeders of it.
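If you like, that whole sense-aggregate-instruct loop can be sketched in a few lines of toy code. Everything here (the fridge, the temperature threshold, the instruction names) is a made-up illustration of the pattern, not a real device API:

```python
# Toy sketch of the 3rd-age feedback loop: an endpoint gathers sensor
# readings, hands them to a central (more powerful) computer for analysis,
# and receives an instruction back. All names and thresholds are
# hypothetical illustrations.

def analyse(readings):
    """Central service: aggregate the readings and decide on an instruction."""
    avg_temp = sum(readings) / len(readings)
    if avg_temp > 5.0:  # fridge running warm (threshold chosen for illustration)
        return "cool_down"
    return "hold_steady"

# The endpoint senses its environment over time (faked fridge temperatures, degrees C)...
sensor_readings = [4.8, 5.6, 6.1, 5.9]

# ...and the instruction comes back over the network, no human in the loop.
instruction = analyse(sensor_readings)
print(instruction)  # -> cool_down
```

The point of the sketch is the direction of travel: the human neither enters the data nor issues the command; the endpoint and the central computer talk to each other.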

Louise Edit - The programmers become programmed?

Some people are afraid of where this is leading, but I struggle to understand why anyone would be afraid of their own reflection. Computers, despite their growing intelligence and automation, still only act upon the information that is gathered from us - they are not thinking for themselves. Yet...

Thanks for Reading,

Chancey
