Technology is great. Until it isn’t. Even, and maybe especially, with AI.
Over the years, technology has become more ubiquitous in our lives. For many people, it has become their lives.
But what happens when that technology changes, for better or worse?
For example, many people build a revenue stream through social media, often on a single platform. When that platform cuts off their access, dials back its revenue model, or makes other changes they didn’t see coming, they’re suddenly left scrambling.
As a result of stories like these, many people who earn income through social media now diversify, with the ultimate goal of going off-platform and reaching their audience directly through their own business.
We may be starting to see the same process play out with AI.
This may unfold in a few different ways.
Let’s start with the business case, similar to how social media played out.
As many of you know, OpenAI recently released its latest and greatest model, GPT-5.
Riding the hype train that precedes just about every new release of just about anything these days, we were told that our lives would change, that AGI was right around the corner.
We would soon need no other tools. Everyone would lose their jobs. The AI takeover was here.
Except…it’s not.
Not only did GPT-5 fail to live up to the hype, it was noticeably worse in key areas that matter most for daily use, like accuracy and reliability.
If you rely on AI for any kind of business task, as many people, myself included, do daily, this is a concerning development.
Mid-conversation, GPT would suddenly start thinking for extraordinarily long stretches on even the most basic questions, only to respond with an answer that had nothing to do with the question. The confusion and hallucination went well beyond the predecessor model, to the point of making it virtually unusable in many situations.
Not only that, but it forgot how to read. Days earlier, it could read PDFs, extract information, and provide summaries that were for the most part accurate. Not perfect, but certainly usable. Once the latest model rolled out, it suddenly couldn’t get anything right.
Overnight, a tool your business relied on went kaput.
The same could be said for just about any tool. A business isn’t going to run multiple CRMs just in case one of them starts acting strangely one day. But AI is becoming ubiquitous, and people are growing overreliant on it, much as social media earners became too reliant on their platforms, except potentially on a far larger scale.
As a side note, people are now also becoming emotionally reliant on AI chatbots.
Imagine you had been dating someone for a while, and then one day you realized they were a completely different person. Well, as strange as that sounds, that’s what happened to a lot of people who, yes, had some kind of romantic feelings for ChatGPT. The chatbot, which was criticized for pandering too much to users and telling them what they wanted to hear instead of the truth, now has a completely different personality or, as some people see it, none at all.
So, OpenAI restored access to the old 4 model and made some changes so the practical problems aren’t as severe as they were at 5’s release, but 5 still isn’t quite as good in many areas as 4 was.
I’m sure that will change, but it should be a stark warning sign for anyone who uses AI: do not become overly reliant on the tool.
Your business could suffer, your job could suffer, your personal life could suffer (if you’re into that sort of thing), but what if it’s your life as a whole that suffers most?
In today’s fast-paced world, we’ve been conditioned to believe that we need to keep up with the latest technology or get left behind.
Who knows if or how that will play out, but one thing’s for sure: people are taking it seriously.
No one wants to miss out on the next Internet, social media, cryptocurrency, blockchain, metaverse, etc. etc. etc. boom (though when you put it like that, it seems like there’s always a new boom, but I digress).
As a result, they’re becoming consumed with AI.
Seems good, right? Being the person who uses AI better than anyone else.
What if it’s not?
It’s already been shown to quickly erode our writing skills and even our critical thinking when we stop using our own judgment and outsource everything to AI.
Ironically, that’s when the whole concept of staying ahead of the technology curve backfires on you.
When technology can do everything and you now do nothing, what value are you bringing to the world? You’ve made yourself obsolete in the same way that people who didn’t learn the technology did—or so the “gurus” will tell you.
You were led to believe that in a world full of nails, AI is the only hammer you should use.
What if the most important skill becomes knowing when not to use it?
When that time comes, will you be so overreliant on AI that you forget how to write, how to think, how to be human?
I’m certainly not going to tell you not to use AI, but I think it’s a grave mistake to let it do everything for you.
AI can make things better and more efficient, but every one of us is unique and brings our own skills, ideas, work ethic, and way of looking at and thinking about the world to society.
Not only do we lose a lot of what we built when our overreliance on AI backfires and our workflows and tools don’t work the same as they did before, but we also become homogenized.
We become the same. We become the bland, boring machines. We may even bring the AI takeover upon ourselves, unwittingly.
So, keep using AI. Keep learning about it. Try different tools. Test different prompts, GPTs, workflows. Make areas of your life more efficient.
But don’t stop reading for yourself. Don’t stop writing on your own. Don’t stop thinking about how to solve problems before asking an AI chatbot what to do for you. Most importantly, don’t stop second-guessing what the AI tells you, what other people tell you (especially me because I have more to learn than I already know and that will never change), and even what you tell yourself.
AI may be the sharpest tool we’ve ever built, but it’s still just a tool. Knowing when not to use it might be the most human skill we have left.