I have no issue with that. I know that IT salaries are overblown. Obviously, they are still way below the extra value we create for the owners of the code, so we still have a case of productivity gains pocketed by shareholders rather than workers.
I've been in IT for 35 years, solving problems that involved code I did not fully understand, writing code in languages I did not fully master, and all sorts of other things. I know "real programmers" who are not able to do what I can do, etc. What we do is what defines what we are. If the label people stick on us doesn't match, either we try a bit harder to match it or we change the label. There are tons of jobs in IT that are "programming jobs" and do not involve programming from morning to evening.
With life expectancy at 75+ for a lot of us, starting anything now (50+ for me) means I could be proficient with tools I don't know yet in a year or so, and I have 25 years of experience in various IT fields that can help me find things that people in their 20s don't know, understand, or care about. That's fine with me.
The comparison to natural-language proficiency is tricky because it's more complex than that (I'm French, I've lived in Japan for the last ~25 years, and I use written English on a daily basis; my working language is Japanese and I barely use French on complex issues anymore). Plus, I did not really have a choice in the languages I speak/learned/use.
Java is a suggestion that fits my current activity (free software collaboration on a Java-based package), but I'm not sure I'll even be able to sell that. Plus, I'm not sure what I want to do with this language-learning project right now. So, Java? I'm not sure, but I get the arguments (pay, teaching quality, availability of jobs, etc.).
Also, I see how getting a job can be a priority, and then learning a language that hires comes up as the #1 tip. But what if I want to create applications/services on my own? Like macOS apps? How does that influence the way I think about my priorities?
The native-language comparison was interesting, but here the breaking down of concepts makes it look like learning a computer language is a totally unnatural way to learn a symbolic way of thinking. Would I focus on learning "pronouns", then "adjectives", then other aspects of a natural language in order to learn it? Is that really the best way to learn a computer language? This tip feels weird.
Interesting list of items. I would have thought participating in free software development would have made it onto the list.
Interesting. Communities are nice. I guess my thought about free software development above is covered here.