I was talking to a friend of mine the other evening; they have a teenager who’s trying to figure out what to do when they grow up.
The first thing they said was that the kid is confused: they really want to be a graphic designer, but they’re starting to recognize that AI is going to pretty much eliminate that role…
It got me thinking: how hard is it right now to figure out what you want to do when you grow up, if you understand the change that’s coming?
As a 14-year-old with, say, a 10-year timeline to a first potential career job, if you’re aware of what’s happening in AI startups at the moment, it’s plain to see the choices are shrinking.
Let’s be blunt: in 10 to 15 years:
Graphic design is gone — Getty is already accepting AI-generated art, and Midjourney’s output in this field is already staggering
Coding is gone — ChatGPT can already write basic code, and low-code and no-code tools are on the rise
UX design is gone — Combine multivariate testing with AI-based no-code tools and you have a perfect feedback loop
QA is gone — Basic tests written by AI, with guidance from a Product Manager
SRE is gone — Infrastructure as a service with self-healing, no-config, AI-based tools
Each of these areas may still need an administrator, a single person or so, but as a career path, the competition will be immense.
Over the next ten years, AI will create efficiencies across thousands of jobs, reducing the number of employees required at almost every company.
I’m not trying to be a scaremonger, but we need to start talking about what we should be advising our children to do… if you had genuinely understood the impact the motor car was going to have back in the early 1900s, you wouldn’t have been telling your children “get into horseshoeing, people will always need horseshoes!”.
There used to be 20 million horses on the roads of North America; less than 50 years later, there were virtually none.
So what’s the advice? I’m assuming governments won’t regulate against AI. It’s possible there will be controls stating that at least 75% of a company’s “output” has to be human (at least during the transition), but I’m not convinced, and anyway, that’s a topic for another post.
The more I think about it, the more I think the safest role right now is Product Management.
Imagine the future of development in 10 years’ time.
- Someone needs to ideate the problem to be solved
- Someone needs to define the business requirements
- Someone needs to define the design style and the high-level strategy
- Someone needs to understand prompting and relay this to the suite of AI tools
At this point, the AI instantiates the environment, builds and deploys the application, and tests, tests, tests. Then it submits the result back to someone for validation.
- Someone still needs to validate that the application meets those requirements
- Someone needs to validate that the application is thematically coherent and makes sense within the context of the company’s vision, portfolio, user base, etc.
That someone… feels a lot like a Product Manager!