Forecasting the Future: The Technologies That Will Shape Our World
- Glenn
- Mar 14
- 12 min read
- Updated: Sep 28

I’m not clairvoyant - nobody is. While fortune tellers claim to see the future, tomorrow's script is still unwritten. That said, plausible predictions aren’t just possible - they’re necessary. Digital technologies are now evolving at such speed that they are reshaping our lives in ways we can and must anticipate. Because understanding where we’re headed empowers us to prepare, adapt, and even write the story of what comes next . . .
Imagined Realities
When it comes to gauging how things might pan out, imagining the future has always been a double-edged sword. While the role of science fiction and the creatives who explore the future is important, their output can lead to confusion. On the one hand, thinkers, writers and filmmakers have enabled innovators to see many possibilities, some of which have resulted in real-world advancements - motion-activated doors, smart watches and space exploration, among others. On the other, visionaries have also suggested intergalactic space travel, gleaming utopian cities and flying cars appearing in near-future worlds. According to stories and projections from the 1950s, 60s and 70s, we should apparently have experienced all of these things by now . . .
Of course, the whole point of imagining the future is to take us to places we can’t go in normal, everyday life - building pictures in our minds of what could be. But because sci-fi storytelling and plausible prediction are often conflated in the media, confusion invariably arises. And as a result, people find it hard to visualise the pathway we are actually on.
Forecasting the Future
Way back in 2010, I began teaching on an undergraduate module that focuses on emerging technologies. The purpose of this module is to enable Graphic Design students to explore new and evolving tech as tools in visual communication projects - mainly advertising campaigns. Central to this is innovative idea-generation and ‘future thinking’ - instead of focusing on what’s possible in the here and now, students are encouraged to look ahead in an experimental, no-holds-barred kind of way. They are tasked with examining the trajectory of technology and using it to develop campaigns for the future. I still teach on this module today - it’s always been fun, as it really gets students pushing boundaries, and the results are innovative and imaginative.
When I examine areas I presented in these lectures from the past, it’s interesting to reflect on how technology and life have progressed. It’s also insightful to see how many of the technological developments I projected have come to pass. Some of the things I foresaw include . . .
Better battery performance: Increased capacity and efficiency with faster charging times (bear in mind that at the time I was the proud owner of an iPhone 3GS with a three-hour charge time)
Significant smartphone capabilities: Processing power exceeding that of the powerful desktop computers students were using for their design work at the time
Smartphones becoming a ‘device for everything’: Phones replacing laptops and other devices for music, video, gaming and work
Advanced mobile Internet: Cheap, super-fast mobile internet for streaming and downloading (3G was the standard of the day, providing much slower, patchy connectivity)
An increased focus on VR/AR: The development of powerful headsets with far better capabilities (at the time, proper VR/AR was expensive and in an experimental stage)
Usable voice recognition as an interface to interact with computers: E.g. Alexa style systems (back then saying ‘yes’ 10 times into an automated tele-system was the height of voice-tech)
TV & film streaming replacing broadcast TV and physical formats: (Netflix, iPlayer etc. had been around for a bit, but scheduled TV and DVDs were still king at this stage)
Computer graphics quality hitting ‘near-realism’: (In 2010, the PS3 represented the cutting edge of consumer graphics)
Virtual communities and worlds becoming mainstream: Second Life had been available for a while, but this was seen as a niche pastime (the concept of the metaverse had been around for ages, but this was long before the notion gained mainstream traction)
OK, so I know - none of this was far-out speculation. Nothing I said made people’s jaws drop to the floor, even at the time, because my predictions were future evolutions of what we were already seeing. But the fact remains that nothing on that list had happened in 2010. And yet the path towards them was clear to me. Why?
Predictive Frameworks
In the 1960s, Intel co-founder Gordon Moore made an interesting and important observation. He spotted that as computer technology progressed, the number of transistors on microchips (CPUs) was doubling roughly every two years. At the same time, he noted that while computing power was increasing as a direct result, the cost of computing was declining. And so Moore’s Law was born - a guiding principle used in the semiconductor industry that has charted and fuelled technological advancement.
Such laws are useful, because they can act as a framework for predictions. Using Moore’s Law made it possible for me to assess computer technology in 2010 and work out - with a degree of accuracy - where it might get to ten or fifteen years later. But importantly, it also allowed for predictions in other areas as well.
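To make that concrete, here’s a minimal back-of-the-envelope sketch - my own illustration for this post, not something from my lectures - of how a Moore’s Law projection works. It assumes a clean two-year doubling and a rough, purely illustrative 2010 baseline of one billion transistors:

```python
# Back-of-the-envelope Moore's Law projection.
# Assumptions: a clean doubling every two years, and a ballpark 2010
# baseline of ~1 billion transistors on a high-end CPU (illustrative
# figures, not precise industry data).

def project_transistors(base_count, base_year, target_year, doubling_years=2.0):
    """Project the transistor count at target_year from a known baseline."""
    doublings = (target_year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (2015, 2020, 2025):
    projected = project_transistors(1e9, 2010, year)
    print(f"{year}: ~{projected / 1e9:.0f} billion transistors")
```

Run it and the simple model points to chips carrying tens to hundreds of billions of transistors by the mid-2020s - roughly the ballpark today’s largest processors occupy. The same compounding logic is what let me extrapolate the knock-on effects I describe next.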
You see, as computer chips improve, their performance has knock-on effects in related tech. For example, better CPUs led to better battery management, and as transistors became smaller, less power was needed to produce the same computational output. More advanced computers also enable better research, which indirectly drives advances in science. And so, knowing that CPUs would evolve according to Moore’s Law enabled me to predict the impact this might have on other technologies, such as batteries.
Similarly, the rise of streaming was directly supported by the evolution of CPUs. The exponential increase in computing power and data storage facilitated high-quality video compression and faster decoding. It is essentially what made cloud computing possible.
In fact, all of the predictions I made were driven by the development of computer chips - that, and my tendency to look at everything as a connected system rather than as isolated technologies. This, I have found, helps not only when examining tech but in all aspects of life. Finally, it’s an absolute truth that human innovation always pushes technology forward - nothing remains static when there’s a chance it can be enhanced.
These combined factors made it straightforward to make reasoned assumptions on how technologies might develop. That’s how I felt confident enough to present these areas as future milestones to students, even though they weren't being directly developed at the time.
Wild Card Technologies
Back to my lectures of old, and it’s clear that while I laid out technologies that would come to exist, I completely failed to predict other technological advancements we have seen since. Remember – I, like everybody else, cannot see into the future.
On review of my teaching materials back then, the PowerPoint presentations I created contain absolutely no mention of . . .
The rise of Web3: Blockchain networks forming a decentralised Internet behind the scenes (I think I’d heard of Bitcoin, but that was about it)
AI chatbots (LLMs like ChatGPT): Advanced AI that we can have a deep and meaningful conversation with (I discussed early AI chatbots such as the amusing but useless finance assistant Clarna, but certainly didn’t foresee the AI Revolution)
Generative AI: Tools that enable us to create images, video, sound and written content with a few text prompts (we explored automatic logo generation and apps that could auto-generate effects over real-world video in the early 2010s, but these were primitive by today’s standards - I certainly didn’t think I would ever see the likes of Midjourney or Sora in my lifetime)
Advances in robotics and automation: Self-driving cars may have been discussed, but if they were, I didn't present them as something we would see in the near future
The things listed above were omitted from my teaching because they were not really on the horizon in the early 2010s. While some individuals familiar with the nascent days of Bitcoin may have foreseen its growth and the myriad blockchains it would lead to, very few did. Similarly, some of those working in artificial intelligence a decade or more ago will have projected that we would reach the point we are at now, but given the unpredictable rise of AI, they too were in the minority.
Wild card technologies will always present themselves – lesser-known or not-yet-realised tech that creates tangents and opens up new routes to follow. While clever people see the potential in these areas, even they cannot say with certainty how they will progress or what impact they will have. And so it’s important to note that while predictive frameworks can be useful, they are only effective up to a point.
The Current Landscape
Clearly, the world's tech landscape is very different to the one I experienced during the early 2010s. Since then . . .
AI has crashed into the public consciousness and is now growing at blistering speeds, spreading its roots into almost every area of human thought-based work and production
Web3 has become an established network with thousands of cryptocurrencies and blockchains forming a ‘second Internet’ beneath the public gaze
Robotics has become incredibly capable, extending out of manufacturing into many areas of life and work
VR/AR has progressed greatly with lightweight, powerful headsets and augmented experiences now possible for entertainment and assistance in industry
5G is now the mobile standard, facilitating super-fast streaming and downloads and slowly but surely pushing fixed-line broadband into the history books
To be clear – these are all significant technological advancements. They are having - and will continue to have - an increasing impact on our way of life. The tech listed above will become ‘lead actors’ in the story that unfolds – fresh out of drama school in their current form, but soon to be playing central roles in the movie of tomorrow.
Making Predictions Today
Moore’s Law is no longer the benchmark framework for making tech predictions. CPU power is still increasing, though at a slower rate than it once did - we are getting close to the limit of what can be achieved with transistor-based technology. ‘Classical’ computer architecture has served us well for over 80 years, but transistor sizes are now measured at atomic scales – they are mind-meltingly tiny. Fortunately (or not, depending on your perspective), there’s a new kid on the block that is likely to one day completely change the computing paradigm: Quantum Computing. Until AI rocked up, that ‘one day’ was seen as being in the distant future, but AI has enabled huge breakthroughs that have pushed the Quantum timeline much closer. And this – coupled with the rapid evolution of AI running parallel to advances in computer hardware – makes predicting the future much more of a challenge.
As I write this in 2025, I am keen to make predictions for the years to come. I want to lay out, with as much certainty as possible, what the landscape might look like 10 or 15 years from now. But how will I go about that? The answer is this – in much the same way as before, but with some caveats and differences, because in 2025 we need to take other factors into account.
AI’s Progression
Whereas Moore’s Law predicts improvements in CPUs, models of AI progression have sprung up to help us gauge how this technology might develop. Predictive frameworks such as Huang’s Law, Neven’s Law, Koomey’s Law and Metcalfe’s Law suggest more potential for exponential acceleration in the evolution of AI. They consider not only improvements in hardware but also the impact of growing AI networks and optimisations in software and training data.
Laying out a definitive formula that predicts AI’s trajectory is challenging. But most experts will tell you that one thing is inevitable – the rate of AI progression will be rapid, outpacing the evolution of CPUs.
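To illustrate why that difference matters for forecasting, here’s a small, hedged sketch contrasting the compounding behind these kinds of laws. The doubling periods are my own rough characterisations for illustration – treating Huang-style AI/GPU growth as an annual doubling – not official definitions:

```python
# Illustrative comparison of the compounding behind different
# predictive 'laws'. The doubling periods are rough assumptions
# for demonstration, not official definitions.

def relative_growth(years, doubling_period):
    """Relative capability after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

HORIZON = 10  # years ahead

print(f"Moore-style growth (2-year doubling):  {relative_growth(HORIZON, 2.0):,.0f}x")
print(f"Huang-style growth (assumed 1-year):   {relative_growth(HORIZON, 1.0):,.0f}x")

# Metcalfe's Law: a network's value scales with the square of its
# connected users, so a 10x larger network is ~100x more valuable.
users_then, users_now = 1_000_000, 10_000_000
print(f"Metcalfe value ratio (10x more users): {(users_now / users_then) ** 2:,.0f}x")
```

Over the same ten years, an annual doubling delivers a 1,024x improvement against Moore’s 32x – which is why forecasts built on AI-era frameworks diverge so quickly from CPU-era ones.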
The AI Singularity
Because of the rapid advancement of AI, many believe we are approaching an AI Singularity. The term is borrowed from the world of physics where a gravitational singularity (e.g. the centre of a black hole) is the point at which the laws of physics break down and predictability ceases. In the case of an AI singularity, the term refers to a moment when artificial intelligence surpasses human intelligence, leading to uncontrollable and unpredictable results.
In essence, the AI Singularity suggests we will reach a stage where both how AI develops and the impact it has on technology and life become completely unpredictable. At this point, AI will formulate its own drivers and goals, and its intentions as to how and why it might improve itself may elude even the most brilliant human minds. Once something surpasses human intelligence, it is considered super-intelligent, and an Artificial Super Intelligence (ASI) may struggle to communicate to humans what it knows and intends to do. We can communicate simple instructions to a dog, but we know it will never understand the plotline of a novel - and that may be roughly how our relationship with a super-intelligent AI looks. While alarming to some, this underlines the significant shifts we could be looking at in the coming years.
Clearly, such a possibility also throws a sizable spanner into the works of any ‘predictive machine’. If (or when) AI reaches this point, all bets are off – nobody knows where this technology could lead once it hits super-intelligent levels. But that doesn’t mean we shouldn’t try to make predictions now – even in this uncertain scenario. If anything, it makes reflecting on where we may be in the future even more essential.
Predictions for 2040
Based on what we know now, and despite the various challenges posed by unpredictable forces driving progression, I still think it’s possible to make some plausible predictions - things I firmly believe are more likely than not to occur in the next 10-15 years. The driving factor behind these predictions is simple - AI and the impact it will have on other technologies and on life. Artificial intelligence is rapidly accelerating innovation through its ability to analyse, process and draw conclusions from massive quantities of data. In addition, AI is developing new AI, and this fact alone is creating a technological supercycle that will reshape the landscape.
While it’s impossible to account for tech wildcards (e.g. the progression of Quantum Computing, or the new forks of tech that AI will undoubtedly give birth to), I’m confident enough to put my name against the predictions below, as I believe the trajectory AI is enabling will make them reality by 2040 . . .
AGI (Artificial General Intelligence): Machines achieving human-level intelligence, including emotional reasoning
50%+ of computer-based work handled by AI agents: From writing reports to coding, AI will replace humans in many desk jobs
50%+ of manual labour jobs replaced by robots: Industrial automation will expand beyond factories into all sectors of work
UBI (Universal Basic Income) in some regions: Governments will introduce financial safety nets for displaced workers
Web3's rise as a dominant force: Decentralised platforms will offer mainstream alternatives to big tech monopolies
A developing Metaverse: Immersive virtual worlds where we engage in many of the things we currently do through a screen - socialise, game and work
AR glasses and Agentic AI as an interface for life: A blurring of the digital and the physical through content overlaid on reality and Internet interaction through AI assistance
Extended human lifespan: AI-driven healthcare and gene editing will significantly increase longevity and improve quality of life
The collapse of traditional education: AI-driven personalised learning will replace school education designed for an industrial-era economy
What if I’m Right?
If these predictions come true, where would that leave us? Quite simply, in a very different world to the one we know now. The ramifications would be far more impactful than any technological advancements we have seen – not only during the digital revolution, but throughout the history of humanity. As AI and robotics cut through industry, the world of work will become almost unrecognisable. Improvements in healthcare will change what it means to be human - longer lives and a reduction in suffering could alter how we live, how we perceive ourselves and what we are capable of doing. As education transforms both its methods and its purpose, the way people learn, grow and develop will lead to significant societal change. A decentralised Internet might alter the digital space so much that it becomes something completely new, creating virtual economies and societies that drive changes in the physical world. And as AI provides more control to centralised systems (as surely it will), Web3 may be the only thing that keeps digital economies open and under our control.
Make no mistake - these are big, hefty changes I am talking about – not minor tweaks to the order of life. And I haven’t even touched upon the potential impact of Quantum Computing, the growth of an immersive Metaverse or Artificial Super Intelligence – all of which have the potential to morph our reality in profound, even shattering, ways.
So, what should you do with this information? Well, firstly, bear in mind that these are predictions made by someone who has stated twice already that he cannot see into the future. They are educated guesses, not assurances. But I personally believe that many, if not all, of the projections I have made are likely to happen in the next 10-15 years. So much so that I am happy to commit them to the Internet. And I am not alone – a host of people far cleverer than me are discussing how the tech world will soon reshape society – Geoffrey Hinton, Yuval Noah Harari and Mo Gawdat to name but a few . . .
Obviously, what you do with this information is up to you. But I strongly advise that you plan for significant change, prepare for the impact of AI on your life and work, and keep up to date with what’s going on. Don’t wait until the real impact of these changes is being discussed on popular news channels and by governments as it happens – start preparing now by assessing what you do, considering how these developments might affect your life, and making plans accordingly. While my discussion is based on predictions (not prophecies), one thing is certain: we have entered a technological supercycle.
Don’t let the changes this brings blindside you. Act now.
Watch the accompanying YouTube video, and join the conversation now...