Since the people who started working with computers in the 70s have now spent 50 years in this field, which is a lifetime of learning and practice, it follows that no future programmer will produce a usable program without at least 20 years of learning, because every simple program has already been written, everywhere, for free.
If there's one thing I learned back when I was working as a programmer, it's that you have to constantly learn new things every six months or so, and any skills you picked up as little as two years ago are practically useless.
The other thing I learned was that with programming, the main thing to learn is computer logic and how it all works. After you manage to wrap your head around that, the rest is just ever-changing syntax (language), and new tools to make your life easier.
So if you learned programming just two or three years ago, you're likely not too far off skill-wise from someone who's been doing it for 50 years (sorry, old-timer egos). And if you learned programming 50 years ago and dropped it for a few decades, chances are you'll be able to pick it up again easily, since you already know the hard part (computer logic).
I work daily with many different programmers; some of them have been working with computers for 30 years, and they still do not have a clue how the system works as a whole. If they do not understand the system from the binary level, e.g. how a single 0-to-1 change will affect the registers, they will never get a clear picture. Unfortunately, most programmers belong to this category: they just program at the user level, like a game player who only plays other people's games.
Back to the topic: programming is still a good job, but I see that for each programmer hired, 3-4 traditional workers are fired, and since the programmer's salary (and consumption) cannot be 4x that of those fired workers, the total consumption of society is on the way down. I think that over the last 100 years, every such wave of efficiency gains has generated a huge recession.