Published earlier this year, but still relevant.
0% of the fault lies on the students who got the degrees they were told were in demand by every single adult in their life.
This was a coordinated push by our government and tech sector to drive down the cost of skilled labor by oversaturating the field.
I say this as a CS major who was forced to work fast food for 6 years until I could find a shitty tech support job and work my way up from there. There was never a single opportunity for me to be a programmer like I intended.
The one and only compsci course I took at a junior college just taught the basics of Office
An unfortunate but completely predictable result of the debt manufacturing industry. Widespread and getting worse.
In case anyone is not aware:
Are you currently employed?
Have you actively sought a job in the last 4 weeks?
If the answer to both of those questions is ‘no’, then congrats, according to the BLS, you are not unemployed!
You just aren’t in the labor force, therefore you do not count as an unemployed worker.
So yeah, if you finally get fed up with applying to 100+ jobs a week or month, getting strung along and then ghosted by all of them…
(because many of them are fake job openings, largely posted by companies so that they look like they are expanding and doing well as a business)
… and you just give up?
You are not ‘unemployed’.
https://www.bls.gov/cps/definitions.htm#unemployed
You are likely a ‘discouraged worker’, who is also ‘not in the labor force’.
https://www.bls.gov/cps/definitions.htm#discouraged
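If it helps, the gist of those buckets fits in a few lines of Python. This is my simplification of the linked definitions; the real CPS rules also test availability for work and a 12-month search window, so treat it as a sketch, not the official logic:

```python
# Toy version of the BLS buckets linked above -- NOT the official logic.
# The real definitions also check availability for work and a 12-month
# search window for "discouraged worker"; this is just the shape of it.
def labor_force_status(employed: bool, searched_past_4_weeks: bool,
                       wants_a_job: bool) -> str:
    if employed:
        return "employed"
    if searched_past_4_weeks:
        return "unemployed"  # the only bucket the headline rate counts
    if wants_a_job:
        # gave up searching: a "discouraged worker", outside the labor force
        return "not in labor force (discouraged worker)"
    return "not in labor force"

# The grad who stopped applying after months of being ghosted:
print(labor_force_status(employed=False, searched_past_4_weeks=False,
                         wants_a_job=True))
# -> not in labor force (discouraged worker): invisible to the headline rate
```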
…
Also, if you are 5 or 6 or 7 figures in student loan debt, and… you can only find a job as a cashier? Waiter/waitress? DoorDash driver?
Congrats, you too are not unemployed, you are merely ‘underemployed’.
But also, if you have too many simultaneous low paying jobs… you may also be ‘overemployed’.
…
But anyway, none of that really matters if you do not make enough money to actually live.
In 2024, 44% of employed, full-time US workers… did not make a living wage.
https://www.dayforce.com/Ceridian/media/documents/2024-Living-Wage-Index-FINAL-1.pdf
(These guys work with MIT to calculate/report this because the BLS doesn’t.)
You’ve also got measures like LISEP…
Which concludes that 24.3% of Americans are ‘functionally unemployed’, a metric that attempts to account for the shortcomings of the BLS measures of the employment situation.
Using data compiled by the federal government’s Bureau of Labor Statistics, the True Rate of Unemployment tracks the percentage of the U.S. labor force that does not have a full-time job (35+ hours a week) but wants one, has no job, or does not earn a living wage, conservatively pegged at $25,000 annually before taxes.
So basically this is a way to try to measure ‘doesn’t have a job + has a poverty-wage job’.
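Read literally, their definition boils down to a three-way check, something like this (the 35-hour and $25,000 thresholds are LISEP’s; the code itself is just my paraphrase):

```python
# LISEP's "functionally unemployed" test as I read their definition:
# no job at all, OR under 35 hrs/week while wanting full-time work,
# OR earning under $25,000/year before taxes. My paraphrase, not their code.
def functionally_unemployed(has_job: bool, hours_per_week: float,
                            wants_full_time: bool, annual_wage: float) -> bool:
    if not has_job:
        return True
    if hours_per_week < 35 and wants_full_time:
        return True
    return annual_wage < 25_000

# A full-time cashier at a poverty wage counts here, not in the headline rate:
print(functionally_unemployed(True, 40, False, 22_000))  # True
```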
…
A more useful measure of the actual situation for college grads, in terms of ‘did it make any economic/financial sense to get my degree?’ would be ‘are you currently employed in a job that substantially utilizes your specific college education, such that you likely could not perform that job without your specific college education?’
Something like that.
It sure would be neat if higher education in the US did not come with the shackles of student loan debt, then maybe people could get educated simply for the sake of getting educated, but, because it does, this has to be a cost benefit style question.
- sincerely, a not unemployed but technically ‘out of the labor force’ econometrician.
If businesses continue believing they can vibe code some intern into success while drop kicking talent to the curb to save a buck, those CS unemployment numbers will fall off like a lemming!
Yep. Been saying it for years because I was laid off over and over. Do not enter computer science.
Become a welder, electrician, etc. ANYTHING but a computer scientist
Me reading this as a computer scientist
Chat, are we cooked?
To the quote in the summary - might be because debugging dozens of layers of bullshit is hard. Anyway, debugging is about sitting for hours and reading logs and looking for weirdness, and looking at dumps, and what not. It’s a very different skill from “being the next Zuckerberg”. Also Zuckerberg is a psychologist most of all, his computing knowledge is not that unique. Network effect is more important than skill and knowledge here.
The fairness meter at the bottom of the article is absurd. “Unfair left leaning” like yes, how dare the libtards use statistics to show how broken our economy is
This article is rated center/fair
If you are speaking of the needle position on the dial thingy, I believe it’s just the default until you vote, not meant to indicate anything (though it’s misleading). You have to vote to see actual results.
Ah, ok. What a strange default. Almost makes me think they chose that as the default to be rage bait
Brave Goggles has a similar concept. Search “vaccines” with “from the right”, get a bunch of disinformation antivaxxer crap.
Just call it what it is: “Unfair truth leaning”, “Unfair fake leaning”.
Give it a minute. Pretty soon, they’re going to need a lot of people to fix all the vibe-code that’s currently being spewed out by AI. That’ll be a monumental task.
Just found out someone in my team has been vibe-coding VBA in Excel that our team is now using. I asked who was going to maintain it and she didn’t know what I meant by maintenance.
Reminds me of web development in the Dotcom days, cleaning up Dreamweaver HTML garbage.
Hearing advice on how to get into software dev made me realise I really don’t have enough passion for it. And given that it’s hyper-competitive, historically speaking, I decided to move to an adjacent job (data analyst). Enjoying it so far. Now I just use my programming skills to make cute little projects on my laptop, and of course a little bit for the data analyst stuff.
Shades of dotcom days. Everyone hopped on the bandwagon, most lured by the high salaries and gold-rush mentality. Nowadays, just having a CS degree isn’t enough. You want portfolio pieces to set you apart. Start by having a damn portfolio. You can set one up for free on GH Pages or Cloudflare. Or pay a few bucks and set one up on WordPress. If you can’t figure out how, that CS degree was wasted.
You want stories that show you bring value. Show that you can build things beyond school projects. Even if you do school projects, document them and push them out. Show why they’re cool and what you can do. Throw up screenshots, diagrams, or animations. No walls of text.
Also, learn to sell yourself. Not in the oily LinkedIn way. Just be out there. Contribute back. Educate others and have a voice. Blog, newsletter, social media, book, or video channel. They’re dead-easy to set up and free so there’s no gatekeepers to go through, other than your ideas.
If in a big city, go to Meetups or demo days. Meet people and ASK WHAT THEY DO. Help connect them to others. Anyone just sitting there cranking out resumes is going to get filtered by the LLM screener. Might as well pin up your resume above the urinal at the pub.
Finally: everyone can low-code or vibecode. Those are table stakes now. You want to do better.
I take it there are not going to be many autistic new devs in the coming decades over there, with such requirements.
Yeah, no. Once I saw this kind of bullshit was needed for programming jobs I just pivoted to IT and cybersec.
These days the pay is just as good, and chances to find a job are even better, the environment is much lower pressure and this gross techbro exploited/exploiting attitude that somehow programming is special and not just a modern day 9-5 factory job is non-existent. With dev jobs, the goal posts are ever shifting. No I’m not doing a portfolio, no I’m not doing your “take home assessment”, no I’m not doing a live coding exercise for your £20k ass minimum wage job where “we measure work by effort, not time” and I’ll somehow end up on call. I love programming, but not enough to let myself get fucked by corpos every which way.
You do have to deal with corpo boomers though, but if you’re lucky they mostly realize they’re just cogs that got lost and they better not make too much noise or they’ll be let go.
Great advice. Also pick an open issue in an open source project, make a PR, have some public discussion of trade offs you considered, and get it merged. That’s an awesome differentiator. I’ve seen thousands of developer resumes without this. It shows you can work effectively and productively on good code and with a team.
I’d love to hear your experience around this and what sector or jobs this assisted, because more data is great.
But in my experience across 25+ jobs ranging from startups to Fortune 500/250/100… I have never encountered a hiring process that would care about this.
I would love to be proven wrong though.
In the 90s everyone was getting “web certified”
In the 90s “web” was about knowing FTP, HTTP and HTML. Should have stayed this way. Scripts in browsers were a mistake.
I blame social media and this perverse need to display notifications instantly. Technically very interesting problem to work on, but basically useless to a customer.
We had a button for that, on demand - it was called F5
I remember that those were used for games like Travian (displaying time and resources), dynamic content (like blasting music on a webpage) and web chat (that’s what I blame the most, because it was in demand).
Well, they didn’t do that, but I can imagine another “standard and convenient” way that could have been taken to add realtime notifications to a webpage: a set of tags for displaying messages from an IRC channel, sending a message to an IRC channel, and so on, maybe with actions associated with events (going to a URL? or updating part of the DOM, but without the full agility of JS, just add/remove/replace a tag by id). Like refreshing a page on a message in the channel, but no more frequently than every N seconds.
Combined with iframes (I’ll admit I consider iframes a good thing, burn me at the stake), this could give you a pretty dynamic experience.
IRC is, of course, not secure, but maybe if such functionality were present and if it became popular, IRC over SSL would become normal earlier too.
Or maybe something like WS could have been standardized far earlier. For pushing events to client.
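For what it’s worth, the push model is tiny to express once you have WS. A minimal sketch in Python, using the third-party `websockets` package; the host, port, and the 5-second tick are arbitrary stand-ins for illustration:

```python
# Minimal server-push sketch: the server sends events as they happen,
# instead of the client re-fetching the page (the F5 model).
# Needs the third-party `websockets` package (pip install websockets).
import asyncio
import websockets

async def notify(ws):
    n = 0
    while True:
        await ws.send(f"event #{n}")  # pushed without the client asking
        n += 1
        await asyncio.sleep(5)        # stand-in for real event arrival

async def main():
    async with websockets.serve(notify, "localhost", 8765):
        await asyncio.Future()  # serve forever

asyncio.run(main())
```

Something shaped like this in the 90s, standardized alongside HTML, might have covered most of the chat/notification use cases without shipping a whole scripting runtime.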
I agree about F5, but the effect of realtime changes was psychologically very strong.
The industry went to shit after non-nerdy people found out there could be a lot of money in tech. Used to be full of other people like me and I really liked it. Now it’s full of people who are equally as enthused about it as they would be to become lawyers or doctors.
The industry went to shit after non-nerdy people found out there could be a lot of money in tech.
I started my undergrad in the early 90’s, and ran into multiple students who had never even used a computer, but had heard from someone that there was a lot of money to be made in computers so they decided to make that their major.
Mind you, those students tended not to do terribly well and often changed major after the first two years — but this phenomenon certainly isn’t anything particularly new. Having been both a student and a University instructor (teaching primarily 3rd and 4th year Comp.Sci subjects) I’ve seen this over and over and over again.
By way of advice to any new or upcoming graduates who may be reading this from an old guy who has been around for a long time, used to be a University instructor, and is currently a development manager for a big software company — if you’re looking to get a leg-up on your competition while you look for work, start or contribute to an Open Source project that you are passionate about. Create software you love purely for the love of creating software.
It’s got my foot in the door for several jobs I’ve had — both directly (i.e.: “we want to use your software and are hiring you to help us integrate it as our expert”; IBM even once offered a re-badged version to their customers) and indirectly (one Director I worked under once told me the reason they hired me was because of my knowledge and passion talking about my OSS project). And now as a manager who has to do hiring myself it’s also something that I look for in candidates (mind you, I also look for people who use Linux at home — we use a LOT of Linux in our cloud environments, and one of my easiest filters is to take out candidates who show no curiosity or interest in software outside whatever came installed on their PC or what they had to work with at school).
My own experience (being probably around your age) is that software development being fashionable, and hence a subsequent oversupply of devs, comes in cycles, with the peaks roughly coinciding with Tech bubbles.
I remember that period in the mid and late 90s when being a software developer was actually seen as “a good career choice”, as the industry was growing fast (with personal computers, then computing spreading into companies of all sizes and business activities, then the Net bubble).
Then the bubble crashed and suddenly it wasn’t fashionable anymore. The outsourcing wave made it fashionable again, but in places like India, because they were serving not just their own IT needs but also a big slice of the rest of the Anglo-Saxon world’s, so the demand-supply over there was so imbalanced that being a software developer was enough for a good house with servants in places like Mumbai. (I actually managed a small team based in India back then and I remember how most were clearly people who had no natural skill at all for programming.) At the same time, in the countries which were outsourcing to places like India, programming wasn’t a good career choice (mainly because it was the entry-level stuff that got outsourced), but if you were senior back then, demand had never been as high.
Then came a period of retrenchment of outsourcing, because it wasn’t that good at delivering robust software that does what the business needs it to do (the mix of mediocre business requirements and development teams which are in fact not even in the same company means that deliverables invariably don’t do what the business needs them to do, and the back-and-forth cycles needed to get them there take more time than they would if everything was in-house), and a new Tech bubble, so software development became fashionable again and once again people who would otherwise not consider it were choosing it as a career.
I think that what we’re seeing now is the initial effects of the crash of the latest Tech bubble: the Stock Market might still be riding its own momentum, but the actual people “at the coalface” are already reducing costs. Plus the AI fad is hitting entry-level positions like the outsourcing fad did, and probably it too will fade, because AI “coding” has its own set of problems which will emerge as companies get more of that code and try to take it through a full production life-cycle.
As for how you choose devs, I would say it’s really just anchored on the eternal rule that “toolmakers make much better devs than tool users”: in my experience gifted devs tend to be the ones who “solve their own problems”, and for a dev that often means coming up with their own tool for it in code, either as a whole or as part of an existing open source project.
I guess that anyone who managed to make the effort to join Lemmy is already on the right track.
I set my mind on comp sci like 6 years ago because it was said to be one of the most in demand fields (maybe still is) and pays well (I was looking at SWE). Nowadays I have set my mind on a job that involves me working away in a server room. Hopefully that pans out.
3-5 years ago my answer would’ve been different. I could trip and find a job offer. I was getting job offers by email essentially without interviewing.
About a year ago that completely dried up. I can’t even remember the last email I got that was more than recruiter spam. My friend who used to also trip into jobs (7 at peak) has been hunting for 3 months now with no luck.
But…servers and data centers and stuff, you’re probably onto something. Wishing you the best.
I’ve been looking for full-time work for a long time now, and from what I’ve seen there are a tonne of jobs out there; it’s just that there are that many more qualified devs than there were just a few years ago.
The way I see it, the hiring bubble that exploded during the pandemic let a lot of people gain proficiency; follow that with the waves of layoffs and you’ve got a lot of talented folks looking.
I wanted to become a dev 12 years ago, when it was still cool.
Needless to say, I haven’t. Even if the doctors I talked to refused to diagnose me with ADHD, my ASD and BAD and anxiety from many things kinda make it not a very good direction.
So - now I could probably become a dev, with the experience gained. But it’s really not the time when this is a good choice LOL.
Have you seen a psychiatrist or were you talking to a GP? I got diagnosed with ADHD 3 different times in a 15 year period without a problem. Only stuck with the meds the last time about 4 years ago, and it was a game changer.
I graduated with a degree in Computer Science and Software Engineering from the University of Washington in 2020, during the height of Covid.
After over 3000 handcrafted applications (and many more AI-written ones), I have never been offered a job in the field.
I know of multiple CS graduates who have killed themselves, and so many who are living with their parents and working service/retail.
I think the software engineering rush of the early 2000s will be looked back upon like the San Francisco gold rush in 1949.
A degree in CS is valueless for actual working jobs. You need to write software and show that you know what you’re doing. And if you can do that, you may not even need a job from anyone else. The time when companies would just overstaff and have paid interns is long over.
The 2020s were probably the worst time to graduate from, or even attend, a 4-year university. Everything was starting to lock down, and there were layoffs and hiring freezes everywhere that didn’t stop till maybe mid-2022. The effect was pretty devastating; I was still working at a chain store, and many people, from IT to electrical engineering, had just been freshly laid off. And then the ’23 massive tech layoffs began too. I don’t see this reversing for CS majors anytime soon, since CS grads have had trouble getting hired since like the early 2010s.
As for students who were attending university for the first time, or were halfway through their degree, in the 2020s: I looked at reviews of my universities, and most said they didn’t learn anything at all, which puts them at a disadvantage already, especially if it was all ONLINE courses. If you’ve been in a regular course where the professor only uses PowerPoint, you aren’t learning anything. A professor did this with BIOchem (the one for life-science students, which is allegedly easier than the other biochem for scientists), and then when exam time came, the exams were almost as tough as my CC chem classes.
There was even a class action suit against UW for their negligence during covid. I guess the case is already settled, so I’m looking forward to my meager restitution check.
And I actually feel lucky that most of my serious classes were complete before the Covid lockdown, because the quality of education during Covid was absolutely pathetic.
…the San Francisco gold rush in 1949.
Classic CS major, making an off-by-one(hundred years) error ;)
I’d be happy to review your resume and code samples and provide feedback if you want.
What CS subfield? I think it really depends if you were able to specialize somewhat. At least systems programming and lower level coding seems to be somewhat in demand once you get into the field. Even given the current economy we aren’t really getting much interest from students.
Over the years I have tried a handful of subfields.
I always felt particularly adept at assembly language programming, so I had a couple projects doing that, and applied to every relevant job I could find.
As a math nerd I enjoyed data science and machine learning; I had quite a few projects, like a neural network from scratch in Matlab, and many data analysis and computer vision projects in R. I was always aware this field is very competitive and my chances were low here.
I had a friend get a job in the biomedical field, so I tried to follow that, I have Python projects doing basic gene sequencing and analysis, even a really cool project that replicated evolution.
Another friend landed a government job, so I followed his advice and got some security certs.
I also had smaller projects and attempts at databases, finance programming, and video games.
3000? That’s hyperbole right?
No, I have a spreadsheet with 3200 lines of submitted applications, which includes both entry-level positions and internships. Many with customized cover letters.
When you do the math it’s not even a strong pace, only about 3 a day over 3 years (3200 ÷ (3 × 365) ≈ 2.9). On a good day I was submitting 12-15.
I even applied to some famous ones, like the time Microsoft opened up 30 entry level positions and received 100,000 applications in 24 hours. It is rumored that they realized they could not process 100k apps, so they threw them all away and hired internally.
Whether they actually threw them out or not, that one always sticks with me. Submitting 100k apps is literally a lifetime of human work. All of that wasted effort is a form of social murder in my opinion.
LMAO I THOUGHT YOU MEANT CODE APPLICATIONS. Like you developed 3000 apps. I was like no way…
I have twenty years experience and it took me 300+ applications to get my current job.
Times are changing.
It sounds like the same amount of effort that it would take to make a really good open source project, or contribute to an existing one.
I find it hard to believe you wouldn’t get a job with something like that under your belt. Also 3000 applications is probably a bit shotgun rather than targeted and HR would be able to pick up on it
You’re right that my time was wasted, and knowing the outcome, I wish I could go back and do more project work before trying to enter the job market.
But I don’t think that is a financial possibility for most Americans. Going to school drained my savings, when I graduated I had almost nothing except for school debt, medical debt, and high rent. Saying “I’m gonna take off and work for free for a year” never really seemed like a possibility.
And as for my apps, the 3000 were not shotgun, they were all personalized, custom cover letters, keywords, etc. It only averaged out to 3/day. I did not track the apps where I used AI to submit them- the AI ones were definitely shotgun.
It’s not your fault, but it sounds like you and probably a lot of other people were misled about what having a degree actually does.
The most important thing someone looks at when you apply for a job is that you are interested in the thing and capable of doing it. The degree doesn’t really do that but the personal projects do. The degree might be a nice to have on top and helps to convince some people, but you always end up working with people without one anyway.
I’m not sure I was misled, what you said was explicitly taught to us at University. I think my degree is the #1 thing on my resume, but of course I also had projects, a few certificates, and multiple attempts at more specific fields.
Back when I was applying, my GitHub activity was pretty solid green.
It’s weird because everywhere I’ve ever worked routinely hires people who don’t even know how to make a commit, or anything at all really.
For some reason even those people are somehow jumping ahead of competent people like you in the queue. It’s also annoying for us because we have to deal with the bad ones that HR delivers.
Well believe it, gramps. Most open source project contributors now either just do content creation as a side hustle or are permanently looking for work, at least in my experience.
“most” open source project contributors are looking for work? Lol ok bud
Yeah. Broken economy, broken world, etc etc. it’s like a bad dream that won’t end. IRL is the doomscroll now.
I don’t blame you, just be thankful you’re so out of touch you find it hard to believe.
Well to see it from the perspective from the inside: we always have hundreds of openings, and I’ve seen openings for months and years without suitable candidates. Sometimes lots of bad applicants and sometimes no applicants at all.
That’s for the niche openings. For regular graduate stuff new people start every single day.
It’s hard to match up that with the fact that some people apparently aren’t getting a single application progressed.
I agree, but until that’s clear I remain quite skeptical.
IDK about most. But, I’ve seen many OS contributors say they’re looking for work. Seen one recently saying he won’t be contributing much to the project anymore because he’s housing-insecure. Seen maintainers for popular projects get laid off and are now looking for work. Seen people with 10+ and 20+ years of experience not being able to find a job after many months.
Yeah there are obviously unfortunate cases. But to put another unsourced number out there I would say 90% of open source maintainers are employed in some way or even directly to work on that thing.
The point of bringing it up is that those people would gladly give a pass on an interview to someone they already know contributes than some random graduate they don’t know.
2020–23 wasn’t exactly a time they were hiring at all; they froze for like 2 years. And students were barely learning at all since the classes were all online, and there was no way to find volunteer work. If you go back and look at your university’s reviews on Yelp (yeah, they have it for universities), it’s pretty dismal out there.
He said he handcrafted a lot of them, so it was pretty targeted.
I was in a similar boat. Graduated right around the housing crash. If my wife hadn’t been working at the time, we would have been in a terrible spot. It took a good 6 months to get my first job. After that, I haven’t had any issues popping into jobs.
Sounds like you got a raw deal. Our industry has many highs and lows when it comes to jobs and work available.
My buddy graduated and took a gap year. That year happened to be the dot com crash. So he kept backpacking for another year then started looking for work. 😁
For me, even graduating in 2022 with an MSc, 6 months is a short time to find a job
2008 was a very difficult job market for sure. Even around 2017 when I graduated it was quite difficult, though nothing like now. Entry-level positions have evaporated in the last 6-7 years.
be willing to move
you’re offering salt in the middle of the Pacific
I fled from the Midwest because there were no good jobs outside of the oil and gas industry, and ended up in the Seattle area. Saving up and moving cost 2 years of my life; I’m not sure I could do it again.
…and I did apply to some jobs on the west coast, although most of my apps were around Seattle.
But please tell me, where should I have gone instead of Seattle?
yea that’s understandable
Honestly Seattle is a pretty good place for tech jobs, it’s just that the cost of living isn’t much better than California or other big tech hubs.
The major saw an unemployment rate of 6.1 percent, just under those top majors like physics and anthropology, which had rates of 7.8 and 9.4 percent respectively.
The numbers aren’t too high although it shows the market is no longer starved for grads.
It’s important to understand that this is a standard feature of the capitalist economy, where the market is used to determine how many people are needed in a certain field at a point in time. It is not unusual that there’s no overarching plan for how many software engineers will be needed over the long term.

The market has to go through a shortage phase, creating the corresponding effects on wages, unemployment, educational institutions and so on, in order to increase the production of software engineers. Then the market has to go through an oversupply phase, creating the opposite effects on wages, unemployment and educational institutions, in order to decrease the production of software engineers.

The people who are affected by these swings are a necessary part of the market’s ability to compute the next state of this part of the economy. This is how it works. It uses real people and resources to do it. The less planning we do, the more people and resources have to go through the meat grinder in order to decide where the economy goes next. We don’t have to do it this way, but that’s how it’s been decided for a while now.
I was doing my CS degree immediately after the 2008 meltdown. At the time there was a massive oversupply of finance people who graduated and couldn’t find work. This continued for years. I was always shocked at the time that the university or the government does not project these things and adjust the available program sizes so that kids and their parents don’t end up spending boatloads of money and years of their lives on degrees under false promises of prosperity. I didn’t have an answer then and people around me couldn’t explain it either, but many were asking the same question. I wish someone had understood it then the way I do now.
It’s also just a general pattern that when a skill is in high demand, the jobs pay great. Everyone wants great pay, so they flood the schools to acquire that skill. Eventually things reach a saturation point.
And also there are always charlatan programs that take your money to hand out worthless certifications. As time goes by, these “educations” mean less and less, a lot of people just nab them online because they want to make better money fast, and there are fewer and fewer real jobs unfilled. Until we arrive at a point like this.
It’s a supply and demand issue.
Yes, my point is that this is a feature of using the market to decide these variables in the economy, and that includes the supply-demand dynamics. If we used some form of planning at the macro level that takes data from industry and educational institutions, projects long-term direction, and propagates targets, or at least expectations, down to industry and educational institutions, we could save a ton of real resources and parts of people’s lives, and reduce the negative social effects of this process. Effects that destabilize the whole system if they grow to any significant proportion.
I find it hard to believe the true numbers are this low. Every job posting gets many hundreds or even thousands of applicants. It’s a shame so much talent is wasted by so many people being unemployed and doing “unproductive” things like spending months applying to jobs.
This should be common knowledge. I recall in the 1990s there was a huge push for truck drivers. Everywhere you went “Be a truck driver! Own your own business! Make six figures!” And only a decade later, employed drivers struggle to make ends meet.
If you see a huge push for a particular job - you better plan your exit.
Nursing in the early 2000s, CS in 2010s. I’m guessing whatever University of Phoenix is pushing, stay the fuck away from.
One eight hundred, five five one, eight nine hundred. Diesel Driving Academy!
I was always shocked at the time that the university or the government does not project these things and adjust the available program sizes so that kids and their parents don’t end up spending boatloads of money and years of their lives on degrees under false promises of prosperity. I didn’t have an answer then and people around me couldn’t explain it either, but many were asking the same question.
You are looking at Universities[0] all wrong. Predicting the markets is not their job or role in society.
The primary purpose of a University is research. That research output comes from three primary sources: the faculty, graduate students, and undergraduate students. Naturally undergrads don’t tend to come into the University knowing how to do proper research, so there is a teaching component involved to bring them up to the necessary standards so they can contribute to research — but ultimately, that’s what they exist for.
What a University is not is a job training centre. That’s not its purpose, nor should it be. A University education is the gold standard in our society, so many corporations and individuals will either prefer or require University training in exchange for employment — but it’s not the Universities that are enforcing that requirement. That’s all on private enterprise to decide what they want. All the University ultimately cares about is research output.
Hence, if there is valuable research output to be made (and inputs in the form of grants) in the field of “Philosophy of Digital Thanatology” (yes, I’m making that up!), and they have access to faculty to lead suitable research AND they have students that want to study it, they’ll run it as a programme. It makes no difference whether or not there is any industry demand for “Philosophy of Digital Thanatology” — if it results in grants and attracts researchers and students, a University could decide to offer it as a degree programme.
We have a LOT of degree programmes with more graduates than jobs available. Personally, I’m glad for that. If I have some great interest in a subject, why shouldn’t I be allowed to study it? Why should I be forced to take it if and only if there is industry demand for that field? If that were the case, we’d have nearly no English language or Philosophy students — and likely a lot fewer Math and Theoretical Physics students as well. But that’s not the point of a University. It never has been, and it never should be.
I’ve been an undergraduate, a graduate, and a University instructor in Computer Science. I’ve seen some argue in the past that faculty should teach XYZ because it’s what industry needs at a given moment — but that’s not the University’s purpose or its role. If industry needs a specific skill, it either needs to teach it itself, or rely on more practical community colleges and apprenticeship programmes which are designed around training for work.
[0] — I’m going to use the Canadian terminology here, which differentiates between “Universities” and “Colleges”, with the former being centres of research education that grant degrees and the latter referring to schools that are often primarily trade and skill focussed that offer more diploma programmes. American common parlance tends to throw all of the above into the bucket of “College” in one way or another which makes differentiating between them more complicated.
So coding trade schools need to be created.
Honestly, it’s not a job more complex than many trades. Treating it as different is a relic from the time when most programmers came from backgrounds in cutting-edge defense research or fundamental science. And honestly, not all of them did: some learned it as a trade when it was a new thing, advanced in it like a trade, treated it like a trade themselves, and wrote books about it as a trade. Unfortunately, later came the hype over tech and Silicon Valley and crap.
Today’s programmers sometimes have problems with a deep enough understanding of the algorithms and data structures they use, while this knowledge is about similar in complexity to what an electrician possesses.
In the USSR there was a program of “programming being the second literacy”, with Pascal and C studied in schools and schools getting computers (probably the most expensive things in them): PDP-11 clones that looked like PCs, and a few other kinds of machines. Unfortunately, the USSR itself was on the path to collapse. Honestly, if only it had existed a bit longer, and reformed and liberalized more gently, maybe that program would have borne fruit (I mean, it did, just for the other countries people would emigrate to).
BTW, Soviet trade schools (they were called “primary technical schools”) prepared programmers, among other things. University degrees related to cybernetics were more about the architecture of mass service systems, of program systems, of production lines, industrial optimization: all the things that the people deciding on those curricula could imagine being useful. Writing code wasn’t considered that important. And honestly that was right, except the Internet blew up, and with it the completely unregulated, scam- and bubble-driven tech industry.
Honestly the longer I live, the more nostalgic I become for that country which failed 5 years before I was born. Yeah, people remembering it also remember that feeling of “we can live like this no longer”, and that nothing was real or functional, but perhaps they misjudged and didn’t see the parts which were real and functional, treating them as given. It was indeed a catastrophe, not a liberation.
What you describe might be true for Canada, but it doesn’t apply to all universities. Many universities have two primary tasks: research and education. These are two separate tasks with overlap.
I do find it understandable if publicly funded universities place restrictions on how many students they accept per program, as it’s their duty to give back to society.
Speaking for the US, major universities may be there for research, but they are a small portion of the mass of schools across the country.
People have mostly been getting degrees to get a good job since at least shortly after WW2. It’s silly to pretend people are going massively in debt without the expectation of a return on that investment.
Nothing against people learning for the joy of learning, but I absolutely hold schools accountable for not making job prospects clear when most of the students are both young and ignorant of the world.
They don’t want to scare people away from impacted majors; they probably lose money if there aren’t butts in the seats. If people aren’t willing to pay for a major with no jobs, the uni loses money and probably has to shut that program down. It seems the state unis around here only care about putting as many undergrad butts in seats as possible so they can have their cash cow; they don’t care what happens to those students 3-4 years in, they just push them through like they’re in high school.
Biotech is another one I bring up on other forums. It’s one of those fields that looks in demand, but they really aren’t keen on hiring people; it’s gatekept at the scientist level. Unless a student is aware that labs exist at their university, they’re out of luck, and the state unis here do a good job of not telling anyone, or of hiding the labs under an obscure category. Professors are very reluctant to even talk about their labs at all; some have an ego issue (they don’t want students to ruin their reputation, even though we aren’t even a threat to their field, as we aren’t in grad school; I had a professor like this), and labs are usually filled up, so there’s very little chance to get into a lab even if you’re lucky. CCs don’t have labs. That’s the part universities don’t warn students about: if there were labs at your uni all this time, isn’t it prudent to look for them? Although I suspect they don’t want the PIs to get inundated with students requesting to join their labs; that’s why they’re very hush-hush about it.
I also think bio unemployment is skewed by health care, because a significant share of bio degrees are held by women, who are more likely to be employed in the field than men. First, it’s likely they’re going into NURSING, dietetics, or PHYSICAL therapy, where all the jobs are, plus CLS, which is a niche grad job. On the research side it’s the same: I’ve only seen a majority of women volunteering in the labs (apparently at my uni some labs only wanted women because the lab manager/PI was being a creep). Otherwise the biotech side has pretty high unemployment, but it’s lumped in with all bio majors.
That’s not what I meant in that paragraph. I am not saying that universities are merely job training facilities. That was simply an example from my own life of where these types of professionals come from. I’m not making a judgement on universities as a whole. They just so happen to produce the vast majority of software engineers and finance professionals in Canada; that’s why I mentioned the university. If I were talking about electricians, I’d have said trade school, or college, etc. I am absolutely aware of the larger role of universities, and you won’t catch me claiming they’re professional training factories.
the university or the government does not project these things and adjust the available program sizes
They kinda do, but only the part where they increase program sizes after demand exists and only wind down when the market is saturated. They can’t really work too far ahead if they don’t know whether something will be in demand, and they don’t want to tell students not to do something they offer just because there are too many graduates. Add the four or five years to graduation and you get a system that lags behind reality even if the planning were better.
But a well designed post secondary education means graduates can go into similar or related fields, they aren’t limited to what is on their diploma except in their own minds.
This explains why people gave me a hard time for getting an anthropology degree…
It’s like a psych degree. I’ve heard people complaining in person about their psych degrees; yeah, you aren’t going anywhere without a GRADuate degree for these majors. PsyD/PhD are the only options for that field; I assume that’s what they’re saying to you? Anthropology might be more difficult; I assume you’re only going to be teaching at a university with a grad degree, but faculty positions are super-competitive asf, especially if it’s not a really in-demand degree.
There’s a lot of jobs in the private and public sector for people with anthropology degrees. In the US, anthropology is taught as a four field approach encompassing Biological Anthropology, Cultural Anthropology, Linguistic Anthropology, and Archaeology.
Each of the subfields have different levels of hireability based on a bachelor’s degree.
I personally only have a bachelor’s and live well. I have a home and live comfortably. But, to your point, I have essentially capped out my earnings. I can’t make more without obtaining a graduate degree.
Undergrad psych degree is pretty popular with social workers.
I was always shocked at the time why the university or the government does not project these things and adjust the available program sizes so that kids and their parents don’t end up spending boatloads of money and lives in degrees under false promises of prosperity.
https://www.bls.gov/ooh/ does track this a bit, but I don’t know if universities use the info or if the site is intended for individuals instead.