
Once upon a time the world was simpler.
You knew where you were with computers: they were big, static, whirring machines with keyboards, screens (monitors) and storage media; programs and data were ‘fed in’, and results were displayed or printed out.
This is not ancient history. If you were born in the 50s, 60s or 70s, a late baby-boomer or a member of Generation X, you were most likely brought up without the web or mobile phones; computers were hard-wired (if available at all), and computer time in offices and universities was rationed via shared terminals. It was only in the last years of the 20th century that all the exciting technology we now take for granted came together, and the cost, size and performance made personal computing on a huge scale possible. In this period dedicated standalone business machines gave way to general-purpose PCs (Personal Computers), networks of computers, printers and servers, and the now ubiquitous wireless devices. In parallel, integrated circuits or silicon ‘chips’ appeared in everything from your TV to your washing machine, from cars to cashpoints (ATMs), from watches to games consoles. Every commercial and industrial process is now digital, automated, computer-aided or web-enabled.
And what does all this technology have in common? The humble ever-present computer – not lost, superseded or redundant – is still there just under the surface. All the current technology relies, more or less, on the same basic components as the early mainframes, and on the work of computer science pioneers such as Alan Turing and John von Neumann in the 1940s and 50s.

So, if you look inside your laptop or smartphone, what will you find?
(Ed. this is not recommended, take our word for it!)
A Central Processor
The computer’s ‘brain’ is the Central Processing Unit (CPU): the electronics that carry out the programmed instructions, apply algorithms and coded rules, and allow communication between the various internal components and external devices and services.
The word computer derives from the verb ‘to compute’ or calculate, although the majority of activity is now information processing – accessing, storing, manipulating, transforming, and presenting data in different forms.
Some Memory
As the name suggests, this is information (more correctly, digitised binary data) that the computer needs to remember and use to carry out its prescribed tasks. There is still a distinction between stored programs, internal memory (either read-only ‘ROM’ or rewritable ‘RAM’) and external memory (physical external storage). However, it is less obvious, and less important, to today’s computer or device user where the data are, especially with web applications and cloud-based services. Did you ever stop to think about where your online profile, blog posts or pictures actually reside? These so-called thin-client applications benefit from a physical and logical separation of a computer into a local ‘client’ and a remote ‘server’. That is what a server does: it serves one or more users or clients with the processing power, storage or other services they request, without you having to worry about the details of how or where it does it.
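The client/server split described above can be sketched in a few lines of Python. This is a minimal illustrative example (the ‘service’ of upper-casing text is invented for the demo): the server does the work, and the thin client simply sends a request and receives the result, never knowing how or where the processing happened.

```python
# Minimal client/server sketch: the server provides a 'service'
# (here, upper-casing text); the client sends a request and gets
# the result back without knowing how the work was done.
import socket
import threading

def serve_once(server_sock):
    """Accept one client connection, read its request, reply with the result."""
    conn, _addr = server_sock.accept()
    with conn:
        request = conn.recv(1024).decode("utf-8")
        conn.sendall(request.upper().encode("utf-8"))  # the 'service'

# The server listens on a local port chosen by the operating system.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# The thin client: send data, receive the processed result.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello from the client")
reply = client.recv(1024).decode("utf-8")
client.close()
server.close()
print(reply)  # HELLO FROM THE CLIENT
```

Real web and cloud applications work the same way in principle, just with many clients, many servers, and far richer services.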
Input and Output Devices
This is probably the area of greatest obvious change to the von Neumann model. We are no longer limited to a keyboard and mouse, punched cards or magnetic disks as input devices, although printers and screens still provide the vast majority of our output needs. A computer has to recognise and control these I/O devices via software ‘drivers’. In the broader sense, however, any means by which the computer communicates with the outside world qualifies as input-output: temperature and positional/spatial measurements as inputs, audio and physical movement as outputs, and any computer-to-computer dialogue or exchange of data. Once there is a network of computer-like devices, either in a fixed (client-server) or virtual (the internet) configuration, the concept of input and output becomes a bit blurred. Massively parallel processors (basically lots of CPUs working together) and open networks like the internet begin to exhibit unexpected ‘emergent’ behaviour and look like a different intelligent life form…I digress!
Software
As well as the physical hardware components, something else is needed to turn these assemblies of metal, glass and plastic into useful working machines, and that something is software. Like it or loathe it, Microsoft was one of the earliest and most successful companies to exploit the need for off-the-shelf commercial software, and for programming languages and tools, in the emerging PC market of early adopters. Bill Gates and Paul Allen knew they could not compete with the hardware producers and technologists, so they saw a niche for the enthusiastic, tech-savvy, low-cost software start-up…this is a familiar model even today!
A bit more about software: it provides the specific instructions, logic and rules that turn a utilitarian, general-purpose computer (in most cases) into a specific application or functional tool. Types of software include:
- All web pages and smart phone apps (applications)
- Operating systems that sit between applications and the hardware
- Spreadsheets and word processors
- Web browsers
- Internet and worldwide web protocols
- The embedded ‘firmware’ or system software (somewhere between ‘soft’ and ‘hard’!) that controls technology at a lower level, i.e. not a business or end-user application
The instructions are written in a programming language, a formal language with strict rules and syntax, which is in turn converted into machine-readable binary code (ones and zeros).
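You can see a glimpse of this translation using Python as the example language. The sketch below compiles a tiny function and inspects the result: Python produces bytecode rather than raw CPU machine code, but the principle of ‘strict syntax in, binary out’ is the same.

```python
# Human-readable source code...
import dis

def add(a, b):
    return a + b

# ...is compiled to a sequence of raw bytes (the 'ones and zeros')...
raw = add.__code__.co_code
print(type(raw).__name__, len(raw), "bytes")

# ...which a disassembler can decode back into simple named
# instructions that the machine steps through one at a time.
opnames = [instruction.opname for instruction in dis.get_instructions(add)]
print(opnames)
```

Compiled languages such as C go one step further, translating source all the way down to the native instructions of the CPU itself.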
Last but not least, I would add a power source to the mix. This was less important before everything became wireless, but batteries are now a significant part of the size and weight of mobile and smart devices.
The future is nearly here
This is a broad, simplified view of the world of computers, but I think it helps to ground the technology in recognisable functional parts. Even though it may look and feel like magic sometimes, if you get involved in designing and developing software and computer systems, you are within touching distance of the technology and the founding fathers of computing; in the words of Isaac Newton, you are truly ‘standing on the shoulders of giants’.
So, what does the future look like? All I can guarantee is that in the next 15 minutes there will definitely be more change: more wearable technology, augmented-reality devices and the increasing penetration of the Internet of Things into our daily lives. There will also be different, undreamt-of technology, maybe even hoverboards! Beyond digital is where things could get interesting. Is an organic computer, a self-organising neural network or a thinking machine still a computer? (Back to Alan Turing again and his Turing Test of non-human intelligence.) Will they still have the same basic components, architecture and functions outlined here? Only time, and the computer scientists and designers of the future, can tell; maybe that will include you!
If you want to know more about computers or related IT elements please comment here or Ask the IT Chemist.
@ITelementary
(c) 2015 Antony Lawrence CBA Ltd.