Opinion

Is AI making me obsolete?

There’s been a lot of talk about AI making developers obsolete. Is it all hype, or should I be concerned?

Tom Haskell
AI

Many years ago, I did a Master's degree in Artificial Intelligence. If you'd told me then that we'd have LLMs that could write a strategy document or code an app, I'd have been amazed. The problem back then was the availability of datasets: to train an AI, you need vast amounts of human-validated content.

Take face recognition, for example - that was one of the classic problems back then. There were several algorithms that were very good at finding a face in an image, but answering the question "who does this face belong to?" was exceptionally difficult - where do you get the data for what someone looks like, to associate face = name? One of the first leaders in this was Facebook. Here was a site where you uploaded a photo, clicked a person's face and "tagged" it to a person, essentially making us the trainers for an AI that could then do it for you - suggesting the name immediately. It seemed like magic!

That's essentially the same premise as today's LLMs - they've been trained on masses of human-generated content - words, code, art, music, and your personal data - to be able to suggest things to you. And that's the problem: what you get is essentially a suggestion. Those Facebook tagging suggestions were not always right - sometimes it thought you were your brother, and if you were wearing glasses or a hat, or looking off to the side, it couldn't identify you at all. But that was OK: we knew it was only a suggestion and were happy to correct it.

The Hype

AI models are like a good actor: they can say the words convincingly, but don’t actually know what they mean.

Today, AI is being touted as “the answer” rather than “the suggestion”. We’re told we don’t need copy-writers, junior developers, artists or designers because AI can do it all for us.

But it's important to remember that these AI models don't actually "know" anything - they're just really good predictive text engines. I heard them described somewhere as being like a good actor - take a TV doctor, for example: they might know how to say the medical jargon convincingly, but they don't actually know what the words mean. There are countless examples of people asking an AI model basic reasoning questions and it failing ("how many R's are in strawberry?") because there's no "brain" to speak of, just a probability that a certain word will follow on from a previous set of words.
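That "which word is likely to follow?" idea can be illustrated with a toy sketch - a minimal bigram model. A real LLM uses a neural network over enormous contexts rather than simple word counts, so this is a vast simplification, but the premise is the same: no understanding, just counting what tends to come next.

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count how often each word follows another in the training text."""
    words = text.lower().split()
    following = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        following[prev][nxt] += 1
    return following

def suggest_next(model, word):
    """Return the most probable next word - a suggestion, not knowledge."""
    counts = model.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]

model = train_bigrams(
    "the cat sat on the mat and the cat slept on the sofa"
)
print(suggest_next(model, "the"))  # "cat" follows "the" most often here
```

The model has no idea what a cat is; it only knows that, in its training data, "cat" followed "the" more often than anything else.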

This gap between probability and real understanding is the chasm we now have to cross to reach the "True AI" I dreamed of back in my uni days - what we now call AGI (Artificial General Intelligence), and what AI researchers have repeatedly promised over the last 60 years as being just around the corner. I think we're still some way off, if indeed it's possible at all.

Have I been replaced?

So bringing it back to my situation: do I still have a job in an environment where anyone can code their own app in a weekend?

Having used AI to vibe-code my website, I'm not currently too concerned. Can it produce code? Absolutely. It'll even produce code that runs and performs its own tests. But, as I mentioned in that previous post, it's only as good as your average junior developer. You need to know exactly what you want, how you want that thing to work and look, and be incredibly precise about communicating it. To be honest, if I weren't already a developer, I'm not sure how I'd know how to explain what I wanted. Maybe I care too much about idiomatic code and structure, but even putting that aside, there's a level of intuition and experience that it just can't replace.

The support problem

I’ve read several articles recently that have highlighted the “hidden tax” of vibe-coding: long-term support. What happens when a critical library is updated, or a big browser change breaks the layout? If you’re relying on AI to fix it for you, you might need to wait until there are enough human-generated tutorials for it to pick up the fix.

Positive use of AI

So far I've been very negative about AI, but it's definitely not all bad. Going back to my original point about it providing suggestions - it's exceptionally good at that. I use it all the time for writing documents from bullet points, fine-tuning blog posts, or as a more natural interface for my "why doesn't this work?" type questions. But I always read it through and edit it afterwards.

Using it to speed up my coding is also very helpful - it can perform some of the grunt work in lieu of my having a junior dev of my own. Scaffolding out a Vue app, or writing the boilerplate CRUD methods for data access, is exactly the kind of job it handles well.

The Verdict

AI is undoubtedly useful, but let's keep using it as a suggestion tool rather than a source of expertise. With that view, it will make me more productive, rather than redundant.

So, if you have a project idea and you have the technical depth to communicate it clearly and support it yourself - crack on! I'd genuinely love to hear how it goes. But if you're looking for someone to bring that human intuition and experience to the table (while using AI to make it faster and cheaper), get in touch.
