This content originally appeared on DEV Community and was authored by Peter Harrison
When I became a paid programmer I knew one language: DBase. Well, two if you count BASIC, but the less said about that the better. You could write a fully functional application in DBase and distribute it as an exe file. You had to know something about data structures as well. The barrier to entry wasn’t that steep.
Today, what do you need to write a web application? You will need HTML, CSS, and JavaScript. You will need to adopt a front end framework such as React. You will need to learn a back end language, which these days can also be JavaScript, aka Node. Or you can use Java or Python. You will probably use a SQL database, so you will need to know SQL. You will need to expose services via REST, so you will need a REST API framework. You will need to store your code in version control, which these days usually means Git. To deploy your application you will usually need to know about cloud services, Docker, and Kubernetes.
Your application will need to implement security and authentication, for which there is OAuth2. You will also need a CI/CD system, which was previously Jenkins but now varies depending on the platform. You will need to learn CloudFormation on AWS or Bicep on Azure to do infrastructure as code.
This isn’t an advanced stack. This is the kind of skill set expected of a “full stack developer”. As a software developer I have always known that you can’t stay still. The old COBOL programmers had a good wicket for a while, but eventually you get stranded on a declining island. So constant learning has always been an absolute necessity to stay relevant.
I ended up specializing in integration and business automation, learning jBPM, Bonita, and Activiti, but even there times are moving, with new approaches that leave BPM-style solutions in the dust.
Even though I have raced to adopt new skill sets, it has become increasingly difficult to stay ahead of the skills demanded. And if I am having trouble, the situation for junior developers must look like Mt Everest. Not to mention that AI is now encroaching, making it even tougher to land entry level positions.
There is a danger in thinking the present is just like the past. I would have thought the future would be easier for developers, with better tooling making life easier, but the evidence seems not to point there. AI in some respects makes things easier, but it also makes them more opaque, giving people power without tempering them with experience in software.
Should we expect people to be jacks of all trades, able to handle everything from the front end to designing cloud infrastructure? When I began, everything ran on one PC and applications compiled to a single file. Today we have highly available distributed clusters, complex deployment pipelines, and quality gates. It is good for developers to at least understand the whole stack and have some exposure to it, but a deep understanding of every layer has become unrealistic.
Are we cutting off the pipeline of young developers? Are we placing too many expectations on them? Is there a way to ease these expectations in the hiring process? Or do we think AI will be the silver bullet, removing the need for us to program at all?
