This content originally appeared on DEV Community and was authored by Melvyn Sopacua
Whenever I see a post about what a chatbot did wrong when asked to do something, there is at least one “prompt engineer” blaming the user. Because that’s what it is: user blaming for the tool’s inadequacy.
While companies heavily invested in AI want you to believe that AI is coming for your job, the very existence of the Prompt Engineer job title proves it isn’t. It pains me to say the obvious, but if AI were a success, you wouldn’t need someone to “talk to it the right way”. You could communicate with it like a normal human being and it would do as you ask, or you could replace it with a better fit. Framed like that, you can instantly see how far away from human equivalence we really are.
It’s also very unsettling to me that PMs and UX designers everywhere have simply accepted that prompt engineers are needed. Let me say it again: it is user blaming. The very thing you punish us developers for when we say something is “obvious, so why would the user do that?”. The prompt engineer’s sole reason for existence is that the product (what they call “AI”) is not performing according to expectations, so their job is to turn that around and rewrite the poor, ignorant user’s question to the almighty chatbot.
As if that’s not harmful enough, the “prompt engineer” encroaches on the job of the Data Scientist, who should be the one fixing the problem at the right end: the “AI”. Of course, that makes it harder to sell AI as a service: per-user trained models are incredibly expensive and slow. It is the technically better fix, but the harder sell and certainly the harder to make profitable (profit comes from scaling the same thing to millions of users). This is the part less talked about: is the SaaS model stifling AI innovation? I think GPT-5’s launch made that abundantly clear: it was devoid of any real innovation, just a rewiring (orchestration, if you will) of already tried, tested and failing methods.
Yes, there are things that can be automated, and in the grand scheme of things that means fewer jobs. CEOs of not-so-well-known companies, start-up founders and the like are unfortunately listening to the industry gurus, so real jobs are disappearing (at least temporarily).
It’s already clear that the promised gains aren’t being met, and a new market is emerging for seasoned developers to fix “AI-enhanced” codebases.
So as long as there are job postings for prompt engineers, know that AI still isn’t human enough to take your job.