The rapid movement of change in IT
- Carl Brettle
- Feb 3, 2020
- 3 min read
A friend posted this on Facebook and it brought back memories of when I used to use a similar machine.
When I was 16 (33 years ago - did I just type that!!) I really liked computers. We started programming on a BBC Master series computer (when the BBC made computers for schools), using a language called 'Cecil' which forced users to solve problems with its limited eight-command vocabulary.
It wasn't long before we moved on to BASIC - Beginner's All-purpose Symbolic Instruction Code. I spent thousands of hours of my life programming in it, before upgrading to Pascal, then C and eventually C++, before deciding to be a user of software rather than a programmer in my own right. It wasn't that programming was too difficult; I just wanted to focus on other things. I must say, the way programming shaped my thinking has helped me tremendously over the years.
The machines we used in the computer lab were BBC Masters, Electrons, Archimedes machines, Commodore PETs and then IBM XTs and ATs, shortly before Amstrad brought out their very plastic CPC 464 in 1984. I actually owned a 6128 and a variety of other smaller home computers in the flurry after that, although I never owned a ZX-81 when they first came out.
So to the point. Those first machines were pioneers in the field. A school could easily spend £4,000-£5,000 on such a machine, with 128 KB of memory, floppy disks, a green-screen monitor and a 10 MB hard drive (which we were told would be impossible to fill!). How life has changed - I can now take a photograph on my phone with a bigger file size than the entire hard disc in the computer I used back in the late 1980s.
Even before that, in 1965, Gordon Moore projected that the number of transistors that could be fitted onto a single chip would double every year. He went on to co-found Intel, and his projection - later refined to a doubling roughly every two years - became a reality.

This prediction catapulted the industry forward and set many technology firms on a course to beat Moore's Law with each new processor launch. Things really hotted up in the early days when ARM produced their first RISC processor, a stripped-down architecture that promised more performance from simpler instructions - and it delivered, in style. Technology companies are still building on the ideas they developed today.
The graph above shows heaps of progress over the years, to the point where the roughly 2,000 transistors of the early 1970s have become 50 billion - hard to believe, but true. Etching transistors at ever-smaller scales onto silicon, and packing multiple cores into a single CPU, has done the trick.
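To see how that arithmetic plays out, here's a rough Python sketch (not from the original article) that applies the "doubling roughly every two years" form of Moore's Law. The starting year, starting count and doubling period are my own assumptions, chosen to illustrate the scale rather than to be precise history.

```python
# Back-of-the-envelope Moore's Law projection.
# Assumptions (for illustration only): ~2,000 transistors around 1971,
# doubling every two years.

def transistors(year, start_year=1971, start_count=2_000, doubling_years=2):
    """Project a transistor count for a given year by repeated doubling."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

for year in (1971, 1985, 2000, 2020):
    print(f"{year}: ~{transistors(year):,.0f} transistors")

# Rough output:
# 1971: ~2 thousand
# 1985: ~256 thousand
# 2000: ~46 million
# 2020: ~47 billion  (in the same ballpark as the 50 billion quoted above)
```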
Yet AI (Artificial Intelligence) and low-latency 5G mobile networks are now emerging, and they will change everything we know about how society works. Instantly connected, superfast, learning processors will shoot us forward, fulfilling our world's insatiable desire for everything to be smaller, faster, smarter.
It's crazy that we can be so good in one area of humanity and so bad in so many others. Wouldn't the world be better if we could mass-produce biodegradable alternatives to single-use plastics? Or what about applying machine learning to ending war? I'm sure that if the computer industry were financially motivated to find solutions to cancer, or to disease in general, it would make huge progress.
Let's not forget that there is a million times more computational power in the smartphone we hold in our hands today than in the entire computer system used to put man on the Moon during the Apollo 11 landing in 1969. Even my washing machine is now more sophisticated.
I'll end with this. We humans don't know how to fully apply the technology we are inventing for the wider benefit of mankind. Yes, we can build CERN; yes, we can create orbital telescopes to look back toward the origins of space and time. But surely the technology we create should first be used to free people from poverty and disease. If everyone with a smartphone could download an app which, in the background, used some of the device's spare computational power for a wider good, we could unleash the biggest problem-solving, multi-processing, AI-driven device on the planet... Yet no one seems to want to do that.
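That "spare cycles for a wider good" idea is essentially volunteer computing. A minimal, hypothetical sketch of how such a background worker might behave could look like the Python below - the server URL, endpoints and compute step are invented for illustration, not any real project's API.

```python
# Hypothetical volunteer-computing worker: fetch a small unit of work,
# crunch it with spare cycles, send the result back. All names are made up.
import time
import requests  # third-party 'requests' library, assumed installed

SERVER = "https://example.org/api"  # placeholder project server

def fetch_work_unit():
    """Ask the server for the next small chunk of work."""
    response = requests.get(f"{SERVER}/work")
    response.raise_for_status()
    return response.json()  # e.g. {"id": 42, "numbers": [...]}

def crunch(work_unit):
    """Stand-in for the real computation (protein folding, model training, ...)."""
    return sum(x * x for x in work_unit["numbers"])

def submit_result(work_id, result):
    """Return the finished result to the server."""
    requests.post(f"{SERVER}/result", json={"id": work_id, "value": result})

def run_worker(pause_seconds=60):
    """Loop forever, pausing between units as a crude stand-in for 'only when idle'."""
    while True:
        unit = fetch_work_unit()
        submit_result(unit["id"], crunch(unit))
        time.sleep(pause_seconds)

if __name__ == "__main__":
    run_worker()
```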
I'm at an age where I've seen most of the commercially viable races form in computing, the internet, mobile and entertainment - quite something to watch. I just hope and pray that something will emerge which doesn't just keep us more connected but is focussed intently on solving the biggest challenges humanity has today.
