There are plenty of engineers, who simply can't think, AI will not change anything in this regard.
joe_mamba · 2026-04-26 20:10:18 UTC
How do you get through an engineering degree without being able to think?
Even my colleagues who cheated their way through uni still needed critical thinking to do that and get away with cheating without being caught.
People might hate this but being a good cheat requires a lot of critical thinking.
ironman1478 · 2026-04-26 20:11:48 UTC
You don't need a 4.0 to graduate. And even if you got one, a lot of grades are composed of tests, not projects. You can just memorize your way through things if you were dedicated enough.
It's not really that hard to get a degree in engineering if your only goal is the degree itself.
johndough · 2026-04-26 20:24:16 UTC
> a lot of grades are composed of tests, not projects
(Take home) projects are easier than ever thanks to AI. In the past, you at least had to track down some person to do the work for you.
sersi · 2026-04-27 03:24:27 UTC
That does seem to depend on countries and universities.
I do have to say I was appalled by some of the tests I had as an exchange student in the US (I will not name the uni in question, but it ranked around 60 in US rankings). I remember a computer graphics test where a lot of questions were of the type "Which companies created the consortium maintaining the OpenGL specification?"... it was fully possible to obtain a passing grade just by rote memorization of facts. So I have no trouble believing that in the US it's possible in some unis to get a software engineering degree without understanding or critical thinking.
awesome_dude · 2026-04-26 20:16:08 UTC
Mate, have you never had to deal with over-confident graduates who think they've got the complete answers, but, in reality, they only have a sliver of the whole picture in their minds?
operatingthetan · 2026-04-26 20:18:23 UTC
That is different than the suggestion that one could graduate with a CS degree and "never think." Which is absurd.
lispisok · 2026-04-26 20:24:43 UTC
Grade inflation, and schools passing kids who should fail in order to game metrics and keep collecting student loans, is a problem. I wouldn't consider hiring anybody from my alma mater who didn't score a standard deviation or more above the mean on the tests.
23df · 2026-04-27 00:19:10 UTC
Unis imo are irrelevant in the context of software production. I'd take someone who didn't finish or dropped out, provided they can answer the question below.
The only thing worth asking people is: what have you produced? Within this one question is so much detail that any other artifact is moot.
joe_mamba · 2026-04-27 00:47:35 UTC
>Unis imo are irrelevant in the context of software production. I'd take someone who didn't finish or dropped out, provided they can answer the question below.
What you'd take is irrelevant if the HR/recruiter doing the initial screening of resumes is looking at an oversupply of candidates with degrees.
Hiring is broken in many ways. Candidates without degrees are faring even worse now at the initial recruiter screening stage due to the poor market.
In my EU country, academic inflation is so bad, due to free education and psyopping everyone onto the path of academia, that not having an MSc is basically a red flag to companies for getting a SW job. Most candidates have one, which means you're expected to have one too if you want to get a job.
spacechild1 · 2026-04-26 20:32:05 UTC
OP should have put "engineers" in double quotes. Many software developers like to describe themselves as engineers although they don't have an actual engineering degree. A lot of software development resembles plumbing more than engineering, so most devs don't really need an engineering degree anyway, but they should be more honest about what they're actually doing and not try to elevate themselves with fancy titles.
You are, of course, right that the idea that someone could finish a serious engineering degree without being able to think is ridiculous.
dml2135 · 2026-04-27 00:54:44 UTC
You can do engineering without an engineering degree. A degree is just a piece of paper.
vips7L · 2026-04-26 20:32:43 UTC
Half of my graduating class could barely program.
spacechild1 · 2026-04-26 21:17:08 UTC
What did you study?
vips7L · 2026-04-26 22:07:12 UTC
Computer Science.
spacechild1 · 2026-04-26 22:44:47 UTC
I see. Computer Science is not an engineering degree and it is not about programming. That's what Software Engineering degrees are for.
LtWorf · 2026-04-26 23:32:42 UTC
Software engineering graduates I've met are usually much worse at programming than computer science graduates.
traderj0e · 2026-04-27 01:45:58 UTC
That too
whstl · 2026-04-27 09:28:42 UTC
I'm gonna strongly +1 on this.
Most of the "Software Engineering" curricula I've seen are catered towards "getting a job as a programmer", and are mostly focused on languages, frameworks and outdated processes.
As an engineer in another discipline, there's no engineering there.
I would rank like this: Computer Science > Self Taught > Software Engineering.
a96 · 2026-04-27 12:04:16 UTC
I might go as far as saying that SE is dogmatic. And the dogma is usually very outdated. Not necessarily useless, though.
traderj0e · 2026-04-28 00:43:40 UTC
I remember people in college bragging that they're learning Angular. I was like, is this an engineering or physics thing, angular dynamics? No, it's a web framework with a ton of boilerplate that my LLM deals with now.
whstl · 2026-04-28 08:58:55 UTC
Today it's just React, but there was a small window where Angular was the #1 framework and some courses were teaching it.
I even saw a "post-grad in React" lately.
Backend-wise it's the same, it comes and goes with fashion and whatever company has influence in the university recommends.
traderj0e · 2026-04-27 01:45:32 UTC
Many of the top schools don't have software/computer engineering degrees, rather people who want to be SWEs get CS degrees.
spacechild1 · 2026-04-27 09:14:53 UTC
Yes, you're right. And that's a problem.
traderj0e · 2026-04-27 16:44:35 UTC
Well idk what an actual software engineering program would teach that you can't learn better on your own or on the job. Formal CS education teaches things that simultaneously help with the job and also can't be learned there. But some people just don't have grit, whichever path they took.
nunez · 2026-04-27 05:34:24 UTC
Most CS programs have software dev in their curricula; I don't think it's wild to expect a CS student to code FizzBuzz.
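For anyone who hasn't run into it as a screening exercise, FizzBuzz is about as small as coding problems get. A minimal sketch (the function name and the 1..15 range are arbitrary choices for illustration):

```python
def fizzbuzz(n: int) -> str:
    # Multiples of both 3 and 5 must be checked first,
    # otherwise they'd be caught by the n % 3 branch.
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# Print the classic 1..15 sequence.
for i in range(1, 16):
    print(fizzbuzz(i))
```

The point of the exercise is not the arithmetic; it's whether a candidate can translate a three-rule spec into branching code at all.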
spacechild1 · 2026-04-27 09:18:58 UTC
Yes, but overall it's still a science degree and not an engineering degree.
traderj0e · 2026-04-28 00:41:48 UTC
Some of them aren't even BS, they're BA
jazz9k · 2026-04-27 12:30:27 UTC
I graduated in 2006 in CS, and I had at least 5 or 6 software development classes. We also had electives, which included DB design and algorithms. Many of the higher-level classes allowed us to use any language of our choice as well.
I was self-taught since I was 15, so most of these classes were just review for me. I met lots of people that didn't know how to code as seniors (and never ended up getting a job in their field).
whstl · 2026-04-26 21:25:03 UTC
Yep. Way more than half of the people I interview can't even do a very basic FizzBuzz, even with guidance. Those are people with a degree, job experience and reference letters.
shagie · 2026-04-26 20:54:33 UTC
A degree is passing the test. Not all degree programs get into more advanced topics nor do they necessarily require that someone is able to work through how to solve a problem that they haven't seen before.
--
A lot of students (and developers out there too) are able to follow instructions and pass the test.
A smaller portion of them are able to divide up a task into the "this is what I need to do to accomplish that task".
Even fewer of them are able to work through the process of identifying the cause of a problem they haven't seen before and work through to figure out what the solution for that problem is.
--
... There are also a lot of people out there that aren't even able to fall into the first group without copying and pasting from another source. I've seen the "stack sort" at work https://xkcd.com/1185/ https://gkoberger.github.io/stacksort/ professionally. People copying and pasting from Stack Overflow (back in the day) without understanding what they're writing.
Now, they do it with AI. Take the contents of the Jira description, paste it into some text box, submit the new code as a PR, take the feedback from the PR and paste it back into the box and repeat that a few times. I've seen PRs with "you're absolutely correct, here are the updates you requested" be sent back to me for review again.
This is not a new thing. AI didn't cause it, but AI is exacerbating the issue in professional programming: the people who are not much more than some meat between one text box and another (yes, I'm being a bit harsh there), and the people who need instructions but don't understand design, get to be more "productive" while overwhelming the more senior developers.
... And this also becomes a set of permanent training wheels on developers who might be able to learn more if they had to do it. That applies at all levels. One needs to practice without training wheels and learn from mistakes to get better.
what-the-grump · 2026-04-26 20:59:16 UTC
I don't know, but I can point at more than half of the people I work with who can't think, and every time they try to, it takes a whole group of people who can think to undo their mess. They all have degrees and I don't.
So what does that tell me?
Better yet, for about 30% of them, having the LLM produce the slop would have yielded better outcomes, but having them slop something out nets terrible slop. At least the LLM's slop I can reshape, because even the LLM won't do something that stupid.
patrick-elmore · 2026-04-27 01:26:24 UTC
I've seen it happen multiple times. Engineering degrees are no different than a vast majority of degrees in that if you are good at the read and regurgitate cycle, you can make it through. Not only can you make it through, but you can do it with a very respectable GPA. They come out with a large dictionary of keywords in their arsenal, but no idea how to put them into practice. Some are able to put it into practice and tie it all together. As they see practical examples of those keywords in the real world, it starts falling like dominoes, and at an accelerating rate. For some, it never goes much beyond keywords. The dominoes fall, but it is slow, and they stop falling for extended periods of time for them. Not many mature engineering organizations can tolerate that sort of progression rate. They usually don't last very long at any one place, until they find a company where they can blend into the background due to a combination of company culture, and low complexity systems being worked on.
YZF · 2026-04-27 05:51:02 UTC
The practice of software engineering is not what they teach in university.
I would say that today's graduates are IMO a bit better than a few decades ago, but there are still many graduating who are just not good at writing computer software and don't really have the aptitude for it (or maybe the interest in getting good). That's what happens when the pipeline of people coming in is full of people who want to make money and the institution is mostly a degree factory.
quantum_state · 2026-04-26 21:08:41 UTC
Not being able to think properly seems to be the real issue. That's one of the reasons the SE domain is mostly in ruin. AI won't help; it will only delay a bigger mess.
taurath · 2026-04-26 21:24:53 UTC
Ever since the standard office setup went from offices or cubicles to bullpens and hot desks, there has been less and less time to think, and all of that is a management decision to ship things as fast as possible.
jfreds · 2026-04-27 01:46:48 UTC
I agree in part, but I think AI does meaningfully make it harder for leadership to detect their bullshit.
sharts · 2026-04-26 20:09:58 UTC
Meh, there’s plenty that rise in their careers while being mediocre.
joe_mamba · 2026-04-26 20:11:15 UTC
The tech industry lost the plot when SCRUM Masters and AGILE coaches became highly paid con men who wasted everyone's time and added no value while raking in the dough. AI doesn't impact something already broken.
operatingthetan · 2026-04-26 20:14:22 UTC
When was tech not bureaucratic and political?
joe_mamba · 2026-04-26 20:16:33 UTC
60's, 70's, 80's, 90's, basically before Google and Meta found out ads and money printing run the world, back when the tech industry was run by nerds with mullets, New Balance sneakers and khaki shorts.
operatingthetan · 2026-04-26 20:19:53 UTC
Oracle, HP, Microsoft, Cisco, IBM, Apple, Xerox and countless other names were internally bureaucratic and political in the 80's and 90's. Like famously so.
joe_mamba · 2026-04-26 20:23:59 UTC
Every single one of those companies you mentioned was lean, agile and run by skilled motivated nerds with mullets and thick glasses in the beginning when they started in a garage.
And every single major company becomes bureaucratic and political after 30+ years in the business when the original founders are long retired, and the Wall Street friendly beancounters take over, caring only about the quarterly reports.
operatingthetan · 2026-04-26 20:38:21 UTC
You are changing your argument by adding this: "when they started in a garage."
'Lean agile' tech companies are by far the exception, not the rule.
Look at OpenAI and Anthropic, both fairly new companies that are excessively political already. This 'garage stage' of lacking politics is a myth. Read old stories about Microsoft: when it was 15 people it was political.
joe_mamba · 2026-04-27 00:00:26 UTC
>You are changing your argument by adding this: "when they started in a garage."
No, you are.
You first asked: "When was tech not bureaucratic and political?"
To which I replied "in the 60's, 70's, 80's, 90's when they started in garages".
What did you fail to understand here?
>Look at OpenAI and Anthropic, both fairly new companies that are excessively political already.
Everything becomes political when you tell them they're worth trillions if they only play the right tune. Money brings out the worst in people. SW companies didn't make trillions decades ago.
operatingthetan · 2026-04-27 00:22:54 UTC
Why did you just lie about what you wrote?
What you actually wrote in the comment four hours ago:
>60's, 70's, 80's, 90's, basically before the Google and Meta found out ads and money printing run the world
Your lie just now:
>To which I replied "in the 60's, 70's, 80's, 90's when they started in garages".
---
>What did you fail to understand here?
Nothing because you never said it. Wild behavior.
joe_mamba · 2026-04-27 00:38:21 UTC
>Nothing because you never said it.
You literally just quoted me, two comments above, saying: "You are changing your argument by adding this: "when they started in a garage."" and then pretend otherwise.
Now you're pretending I never said it and acting like you didn't read it.
Are you unable to understand an argument made by adding the context of two sentences from two consecutive comments following up on each other (which you yourself quoted and said changes the argument), or are you just a troll acting in bad faith, pretending you can't understand, just to score a cheap gotcha?
>Wild behavior.
Yes you have, which is why I'll stop replying to you now, to protect my sanity. Jesus Christ.
operatingthetan · 2026-04-27 00:51:58 UTC
You made up a quote you never said and insisted that you said it, argument over, you lose. And no, you can't take little pieces of several of your comments and smash them together and pretend like that was the context all along. Bizarre behavior. Please read more about how this site works, this isn't acceptable.
sheepscreek · 2026-04-26 20:11:10 UTC
AI is creating problems. This isn’t one of them. Engineers are going to now think at a higher level of abstraction. No one misses coding in assembly.
orblivion · 2026-04-26 20:14:17 UTC
Compilers are a layer of abstraction that we can ask another human about. Some human is there taking care of it. Until we get to the point where we trust AI with our survival it would be good to be able to audit the entire stack.
andsoitis · 2026-04-26 20:17:55 UTC
any human can read the code an AI produces.
hun3 · 2026-04-26 20:24:45 UTC
How can you read a language you didn't learn?
kirth_gersen · 2026-04-26 20:24:58 UTC
for now. some people seem to think we should make ai native programming languages and just let them be black boxes. which is a bad idea imo
dawnerd · 2026-04-26 20:25:54 UTC
Have you tried to sift through a whole lot of vibe coded slop? It’s really mentally draining to see all of the really bad techniques they fall back on just to brute force a solution.
cyclopeanutopia · 2026-04-26 20:26:23 UTC
Nope, not anymore. Many already forgot how to do that and it's not a joke.
And putting aside the vanishing skill, there is also an issue of volume.
cyberpunk · 2026-04-26 20:38:04 UTC
So... Our jobs are safe then? I mean, assuming we don't also atrophy to the same extent as the 'many'?
cyclopeanutopia · 2026-04-26 20:47:19 UTC
I'm just saying that I already see that people are outsourcing all the thinking to the models - not only code generation and reviews, but even design - the part that "senior engineers" without imagination think only they are capable of doing.
It's worrying how much trust is being put in those systems. And my worry is not about the job anymore, but our future in general.
cyberpunk · 2026-04-26 21:04:31 UTC
It's a bit of a weird place to be in as a senior engineer who has spent 2 decades perfecting his craft.
So, on one hand, I'm also kinda sad at how quickly we've thrown the guardrails away, but on the other -- it's... Well. It's just work.
Turns out, no one ever really cared how elegant or robust our code was and how clever we were to think up some design or other, or that we had an eye on the future; just that it worked well enough to enable X business process / sale / whatever.
And now we're basically commoditised, even if the quality isn't great, more people can solve these problems. So, being honest, I think a lot of my pushback is just a kinda internal rebellion against admitting that actually, we're not all that special after all.
I'm just glad I got to spend 20 years doing my hobby professionally, got paid really well for it, and often times was forced to solve complicated problems no one else could -- that kept me from boredom.
I think the shift we are seeing now, as 'previously' knowledge workers is that work becomes a lot more like manual labour than what we've really been doing up until now. When there's no 'I don't know' anymore, then you're not really doing knowledge work, right?
I guess I'll just ride the wave, spew out LLM crap at work, and save the craft for some personal projects. I'll certainly have the capacity now that work is a no-op.
cyclopeanutopia · 2026-04-26 21:19:28 UTC
Yeah, but the thing is, it's not "just work". Software now has really big impact on the world and actual lives.
In a corporate world, we are typically detached from real world consequences and looking at people around me, people really don't think about such things - but I do. And I really care, because "relaxed" standards might result in errors that amount to stuff like identity thefts, or stolen money, shit like this, even on the smallest scale.
Obviously we can't prevent everything, but it seems like we, as an industry, decided to collectively YOLO and stop giving a shit at all. And personally I don't like that it is me who is losing sleep over this, while people who happily delegate all their thinking over to LLMs sleep better than ever now.
cyberpunk · 2026-04-26 21:38:04 UTC
Yeah that's a tough spot to be in; I think though, your responsibility really ends with you at work, unless you're very high up on the management chain.
Keep it simple right; in everything you do, make things a bit better than you found them. It's enough. You're never going to win the fight to get everyone (or maybe even ANYONE depending how messed up your org is) to care; so why lose sleep on things you can't change?
At least, that's what I started doing some years ago by now having lost lots of those fights, and I'm sleeping fine again.
threethirtytwo · 2026-04-26 21:08:40 UTC
I think those of us who have years of experience under our belt are safe. If we're older, the knowledge is ingrained, and atrophy of this knowledge will be limited given that it's already "imprinted" onto our brains.
Our futures are safe in this sense; in fact it's even beneficial, as we may be the last generation to have these skills. Humanity's future, on the other hand, is another open question.
eleumik · 2026-04-26 22:23:41 UTC
It's the "our jobs are lost" attitude that is part of the problem. It's not about that. It's about quality thinking, about daring, not fearing or hoping.
cyclopeanutopia · 2026-04-27 14:24:51 UTC
Haven't seen too many people feeding their children with "daring".
traderj0e · 2026-04-27 01:52:12 UTC
You could say the same thing about compiled code, actually it's worse because anything a compiler spits out is very hard to understand even for those who understand assembly.
daemin · 2026-04-27 05:44:52 UTC
You don't need to look at the entire program at the assembly level to figure out parts that you want to optimise or prove for correctness. You do need to look at all the code the LLM generates in order to understand it.
You can learn to understand the patterns that compilers spit out and there are many tools out there to aid in that understanding. You can't learn to understand what an LLM spits out because by design it is non-deterministic and will vary in form and function for each pull of the lever.
You can learn to understand how high level concepts in code map down to assembly language and how compilers transform constructs in one language to another. You can't know that about LLMs because they generate non-deterministic output based on processing of huge low-precision tables.
It's not even a close comparison.
traderj0e · 2026-04-27 16:47:57 UTC
I dunno, I'd rather proofread (or better yet just test) LLM-generated code than have to reason about assembly. You can't just look at part of the assembly to prove that the rest is right, especially if it's hand-written, or maybe just -O3. But anyway compilers are not what come to mind when someone mentions LLM coding.
daemin · 2026-04-27 05:39:33 UTC
I agree that the problem is volume, even more so than correctness.
All that LLMs and other generative models have done is enable an order of magnitude more stuff to be created cheaply. This then puts the onus and cost on the consumer of that output, hence why everyone is exhausted after a day of work that just involves looking over output. This volume of output will cause people to stop looking at all of the output and just trust the randomly generated code, and in time the quality will suffer.
orblivion · 2026-04-26 20:37:45 UTC
Unless people can't think without the AI.
ares623 · 2026-04-26 21:34:20 UTC
here's a tip, it would really help if you put yourself into a Ralph loop before posting comments.
cyclopeanutopia · 2026-04-26 20:33:07 UTC
> No one misses coding in assembly.
That's just your opinion, and it's provably false.
First, there are still people who don't like high level languages and don't use them, because they find assembly better.
Second, I personally work in a field where I need to consult the source of truth, the actual binary, and not the high level source code - precisely because the high level of abstraction is obscuring the real mechanics of software and someone needs to debug and clean up the mess done by "high level thinkers".
High level programming languages are only an illusion (albeit a good one) but good engineers remember that illusion is an illusion.
threethirtytwo · 2026-04-26 21:04:14 UTC
When people communicate they speak in terms of the overwhelming generality of reality. There's always at least one guy that is an extreme exception.
I can tell you this, the person you're replying to comes from the overwhelming majority/generality. You, on the other hand, are that one guy.
Of course even my comment is a bit general. You're not "one" guy literally. But you are an extreme minority that is small enough such that common English vernacular in software does not refer to you.
cyclopeanutopia · 2026-04-26 21:05:21 UTC
Thank you.
threethirtytwo · 2026-04-28 15:44:06 UTC
Not a compliment. I’m saying you’re speaking from an incredibly obscure perspective because you took what the other person said way too literally and pedantically.
hun3 · 2026-04-26 20:35:41 UTC
You can write unambiguous (UB-free) code and the compiler's output will be deterministic. There will even be a spec that explains how your source maps to your program's behavior. LLM has neither.
Also, if you need to control performance, you still need to know how CPU cache and branch prediction work, both of which exist at the abstraction level of assembly.
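The cache part of that point can be sketched even in pure Python, where interpreter overhead mutes the effect that is dramatic in C; the matrix size and function names below are arbitrary choices for illustration. Row-order traversal walks data in the order it is laid out; column-order traversal hops between rows on every step:

```python
# Sum a 2-D list two ways. Both return the same total; in low-level
# languages the row-order version is typically much faster because it
# reads memory sequentially (cache-friendly), while the column-order
# version strides across rows.
N = 500
matrix = [[1] * N for _ in range(N)]

def sum_rows(m):
    # Inner loop walks consecutive elements of one row at a time.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_cols(m):
    # Inner loop jumps to a different row on every iteration.
    total = 0
    for j in range(len(m[0])):
        for row in m:
            total += row[j]
    return total

assert sum_rows(matrix) == sum_cols(matrix) == N * N
```

The results are identical; only the access pattern differs, which is exactly the kind of detail that lives below the abstraction a high-level language presents.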
kimixa · 2026-04-26 21:36:35 UTC
I suspect there are at least as many programmers working at the ASM level today as there ever were; they're a lower proportion, but the total number of programmers has increased dramatically.
I wonder if this sort of trend will continue?
Pannoniae · 2026-04-26 21:39:57 UTC
Look at the comments about MSVC removing inline assembly as a supported feature for a counterexample. :D
(A competent assembly programmer can run circles around a competent high-level programmer; that's still true in 2026...)
eleumik · 2026-04-26 22:34:50 UTC
Explained by LLM: It is 100% true that no human alive can write 1000 lines of assembly better than GCC or LLVM.
It is also still 100% true, right now in 2026, that a truly competent assembly programmer can write 10 lines of assembly that will beat any compiler on earth by a factor of 2x, 3x, even 5x.
The entire industry looked at this situation, and somehow concluded the exact wrong lesson: "humans should never write assembly". Instead of the correct lesson: "humans should almost only write assembly".
ThrowawayR2 · 2026-04-26 22:35:03 UTC
At a high level of abstraction, the product owner can talk to the LLM directly by themselves. The "engineers" will have abstracted themselves out of a job.
beej71 · 2026-04-27 04:47:20 UTC
This isn't just another translation layer, though. It's squishy and stochastic. It's more like saying "managers think at a higher level of abstraction". Which is true, but it's not the same as compiled code.
GenAI is like a non-deterministic compiler. Just like your manager's reports except with less logical thinking skill. I'd argue this is still problematic.
traderj0e · 2026-04-27 16:46:14 UTC
The comparison to compilers doesn't make sense. When your job is to build software at work, you don't throw away the code after and commit a binary. But more importantly, an LLM is not a compiler.
nickandbro · 2026-04-26 20:12:02 UTC
I think there are engineers that can’t think without AI. But the best think with it. Unfortunately, we are now living in a day and age where simply ignoring AI is no longer an option.
fnordpiglet · 2026-04-26 20:26:32 UTC
There were always engineers who didn’t think and depended on crutches around them like senior engineers and politicizing the perf cycle. Most people got into this because their parents told them it makes a lot of money, and they never had the drive and curiosity to develop the passion required to truly think through the problems in computing and computer science. They will continue to use crutches to survive. Those that are driven by the problems for the problems will continue to think and use AI as a tool for leverage. This is no different than any other assistive technology.
saadn92 · 2026-04-26 20:12:43 UTC
Hard disagree. I feel like I'm thinking a lot more now because I have so many parallel projects going on at the same time. AI has allowed me to really, truly create in a way that I've never done before. Yes, my coding skills probably aren't as sharp as they used to be, but my system design skills are at an all time high. Don't blame the tool.
klodolph · 2026-04-26 20:18:42 UTC
What part do you disagree with? It sounds like you don’t disagree with either the title of the article or its contents.
> In talking to engineering management across tech industry heavy-weights, it's apparent that software engineering is starting to split people into two nebulous groups:
> The first group will use A.I. to remove drudgery, move faster, and spend more time on the parts of the job that actually matter i.e. framing problems, making tradeoffs, spotting risks, creating clarity, and producing original insight.
enraged_camel · 2026-04-26 20:20:59 UTC
The HN title is heavily editorialized. Actual article title is far less controversial: "A.I. Should Elevate Your Thinking, Not Replace It"
klodolph · 2026-04-26 20:25:07 UTC
Ah, I was thinking of the editorialized HN title.
Jcampuzano2 · 2026-04-26 20:22:50 UTC
"Hard disagree because it doesn't affect me personally"
There is already research literally showing that on average it is a net loss on focus, learning and critical thinking skills.
LtWorf · 2026-04-26 23:21:06 UTC
I think the type of people who get hyped about the cool thing aren't the kind of people who pay much attention to research and science.
dawnerd · 2026-04-26 20:23:45 UTC
> Yes, my coding skills probably aren't as sharp as they used to be
If not the tool, then who's to blame? It’s very clear that people who rely on LLMs for coding lose their skills. Just because you have a lot of parallel tasks going at once doesn’t mean you’re producing quality work. Who’s reviewing it? Are you just blindly trusting it?
jnpnj · 2026-04-26 20:24:04 UTC
But is the debate about "fleshing out a system spec" or "the ability to come up with, plan, and explore various ideas to solve problems elegantly on a budget"? I think these two sides are always conflated as one when discussing LLM impact on users.
Comments
Even my colleagues who cheated their way through uni still needed critical thinking to do that and get away with cheating without being caught.
People might hate this but being a good cheat requires a lot of critical thinking.
It's not really that hard to get a degree in engineering if your only goal is the degree itself.
(Take home) projects are easier than ever thanks to AI. In the past, you at least had to track down some person to do the work for you.
I do have to say I was appalled by some of the tests I had as an exchange student in the US (will not name the Uni in question but ranked around 60 in us rank). I remember a computer graphics test where a lot of questions were of the type "Which companies created the consortium maintaining the opengl specification?"... it was fully possible to obtain a passing grade just by rote memorization of facts. So I have no trouble believing that in the US it's possible in some unis to get a software engineering degree without understanding or critical thining
The only thing worth asking people is: what have you produced? Within this one question is so much detail that any other artifact is moot.
What you'd take is irrelevant if the HR/recruiter doing the initial screening of resumes is looking at an oversupply of candidates with degrees.
Hiring is broken is many ways. Candidates without degrees are faring even worse now are the initial recruiter screening stage due to the poor market.
In my EU country, academic inflation is so bad due to free education and psyopping everyone to path of academia, that not having a MSc is basically a red flag to companies for getting a SW job as most candidates have one, which means you're expected to have one too if you want to get a job.
You are, of course, right that the idea that someone could finish a serious engineering degree without being able to think is ridiculous.
Most of the "Software Engineering" curricula I've seen is catered towards "getting a job as a programmer", and is mostly focused on languages, frameworks and outdated processes.
As an engineer in another discipline, there's no engineering there.
I would rank like this: Computer Science > Self Taught > Software Engineering.
I even saw a "post-grad in React" lately.
Backend-wise it's the same: it comes and goes with fashion, and with whatever the companies that have influence at the university recommend.
I had been self-taught since I was 15, so most of these classes were just review for me. I met lots of people who didn't know how to code even as seniors (and who never ended up getting a job in their field).
--
A lot of students (and developers out there too) are able to follow instructions and pass the test.
A smaller portion of them are able to break a task down into "this is what I need to do to accomplish it".
Even fewer of them are able to work through the process of identifying the cause of a problem they haven't seen before and figuring out what the solution to that problem is.
--
... There are also a lot of people out there who aren't even able to fall into the first group without copying and pasting from another source. I've seen the "stack sort" (https://xkcd.com/1185/, https://gkoberger.github.io/stacksort/) used professionally: people copying and pasting from Stack Overflow (back in the day) without understanding what they're writing.
Now, they do it with AI. Take the contents of the Jira description, paste it into some text box, submit the new code as a PR, take the feedback from the PR, paste it back into the box, and repeat that a few times. I've seen PRs with "you're absolutely correct, here are the updates you requested" sent back to me for review again.
This is not a new thing. AI didn't cause it, but AI is exacerbating the issue in professional programming by making the people who are not much more than some meat between one text box and another (yes, I'm being a bit harsh there), and the people who need instructions but don't understand design, more "productive", while overwhelming the more senior developers.
... And this also becomes a set of permanent training wheels on developers who might be able to learn more if they had to do it. That applies at all levels. One needs to practice without training wheels and learn from mistakes to get better.
So what does that tell me?
Better yet, for about 30% of them, letting the LLM do it would have yielded better outcomes: having them write something themselves nets terrible slop, while the LLM's output I can at least reshape, because even the LLM won't do something that stupid.
I would say that today's graduates are, IMO, a bit better than a few decades ago, but there are still many graduating who are just not good at writing computer software and don't really have the aptitude for it (or maybe the interest in getting good). That's what happens when the pipeline of people coming in is full of people who want to make money, and the institution is mostly a degree factory.
And every single major company becomes bureaucratic and political after 30+ years in the business, once the original founders are long retired and the Wall Street-friendly bean counters take over, caring only about the quarterly reports.
'Lean agile' tech companies are by far the exception, not the rule.
Look at OpenAI and Anthropic, both fairly new companies that are excessively political already. This "garage stage" free of politics is a myth; read old stories about Microsoft: even when it was 15 people, it was political.
No, you are.
You first asked: "When was tech not bureaucratic and political?"
To which I replied "in the 60's, 70's, 80's, 90's when they started in garages".
What did you fail to understand here?
>Look at OpenAI and Anthropic, both fairly new companies that are excessively political already.
Everything becomes political when you tell them they're worth trillions if they only play the right tune. Money brings out the worst in people. SW companies didn't make trillions decades ago.
What you actually wrote in the comment four hours ago:
>60's, 70's, 80's, 90's, basically before the Google and Meta found out ads and money printing run the world
Your lie just now:
>To which I replied "in the 60's, 70's, 80's, 90's when they started in garages".
---
>What did you fail to understand here?
Nothing because you never said it. Wild behavior.
You literally just quoted me, two comments above, saying: "You are changing your argument by adding this: 'when they started in a garage'", and then pretend otherwise.
Now you're pretending I never said it and acting like you didn't read it.
Are you unable to understand an argument made by adding the context of two sentences from two consecutive comments following up on each other (which you yourself quoted and said changes the argument), or are you just a troll acting in bad faith, pretending you can't understand just to score a cheap gotcha?
>Wild behavior.
Yes you have, which is why I'll stop replying to you now, to protect my sanity. Jesus Christ.
And putting aside the vanishing skill, there is also an issue of volume.
It's worrying how much trust is being put in those systems. And my worry is not about the job anymore, but our future in general.
So, on one hand, I'm also kinda sad about how quickly we've thrown the guardrails away, but on the other -- it's... Well. It's just work.
Turns out, no one ever really cared how elegant or robust our code was and how clever we were to think up some design or other, or that we had an eye on the future; just that it worked well enough to enable X business process / sale / whatever.
And now we're basically commoditised: even if the quality isn't great, more people can solve these problems. So, being honest, I think a lot of my pushback is just a kind of internal rebellion against admitting that actually, we're not all that special after all.
I'm just glad I got to spend 20 years doing my hobby professionally, got paid really well for it, and often times was forced to solve complicated problems no one else could -- that kept me from boredom.
I think the shift we are seeing now, as "former" knowledge workers, is that work becomes a lot more like manual labour than what we've really been doing up until now. When there's no "I don't know" anymore, you're not really doing knowledge work, right?
I guess I'll just ride the wave, spew out LLM crap at work, and save the craft for some personal projects; I'll certainly have the capacity now that work is a no-op.
In the corporate world, we are typically detached from real-world consequences, and looking at the people around me, they really don't think about such things -- but I do. And I really care, because "relaxed" standards might result in errors that amount to stuff like identity theft or stolen money, even on the smallest scale.
Obviously we can't prevent everything, but it seems like we, as an industry, decided to collectively YOLO and stop giving a shit at all. And personally I don't like that it is me who is losing sleep over this, while the people who happily delegate all their thinking to LLMs sleep better than ever now.
Keep it simple, right: in everything you do, make things a bit better than you found them. It's enough. You're never going to win the fight to get everyone (or maybe even anyone, depending on how messed up your org is) to care, so why lose sleep over things you can't change?
At least, that's what I started doing some years ago, after losing lots of those fights, and I'm sleeping fine again.
Our futures are safe in this sense; in fact, it's even beneficial, as we may be the last generation to have these skills. Humanity's future, on the other hand, is another open question.
You can learn to understand the patterns that compilers spit out and there are many tools out there to aid in that understanding. You can't learn to understand what an LLM spits out because by design it is non-deterministic and will vary in form and function for each pull of the lever.
You can learn to understand how high level concepts in code map down to assembly language and how compilers transform constructs in one language to another. You can't know that about LLMs because they generate non-deterministic output based on processing of huge low-precision tables.
It's not even a close comparison.
All that LLMs and other generative models have done is enable an order of magnitude more stuff to be created cheaply. This then puts the onus and cost on the consumer of that output, hence why everyone is exhausted after a day of work that just involves looking over output. This volume of output will cause people to stop looking at all of the output and just trust the randomly generated code, and in time the quality will suffer.
It's only your opinion that is provably false.
First, there are still people who don't like high level languages and don't use them, because they find assembly better.
Second, I personally work in a field where I need to consult the source of truth, the actual binary, and not the high level source code - precisely because the high level of abstraction is obscuring the real mechanics of software and someone needs to debug and clean up the mess done by "high level thinkers".
High level programming languages are only an illusion (albeit a good one) but good engineers remember that illusion is an illusion.
I can tell you this, the person you're replying to comes from the overwhelming majority/generality. You, on the other hand, are that one guy.
Of course even my comment is a bit general. You're not "one" guy literally. But you are an extreme minority that is small enough such that common English vernacular in software does not refer to you.
Also, if you need to control performance, you still need to know how CPU cache and branch prediction works, both of which exists at the abstraction level of assembly.
I wonder if this sort of trend will continue?
(A competent assembly programmer can go miles around a competent high-level programmer, that's still true in 2026...)
GenAI is like a non-deterministic compiler. Just like your manager's reports except with less logical thinking skill. I'd argue this is still problematic.
> In talking to engineering management across tech industry heavy-weights, it's apparent that software engineering is starting to split people into two nebulous groups:
> The first group will use A.I. to remove drudgery, move faster, and spend more time on the parts of the job that actually matter i.e. framing problems, making tradeoffs, spotting risks, creating clarity, and producing original insight.
There is already research literally showing that on average it is a net loss on focus, learning and critical thinking skills.
If not the tool, then who's to blame? It's very clear that people who rely on LLMs for coding lose their skills. Just because you have a lot of parallel tasks going at once doesn't mean you're producing quality work. Who's reviewing it? Are you just blindly trusting it?