Ask HN: Are “theoretical” concepts being lost by new engineers?
60 points by santiagobasulto on Aug 28, 2023 | 110 comments
I've dealt with a lot of developers in the past ~4 years and I've noticed a lot of them didn't have the "basic" theoretical concepts that I considered widespread and universal about Software Engineering and Computer Science. For example, engineers putting a lot of logic in their unit tests, or choosing the wrong data structures, etc.

Maybe it's because of the advent of self taught engineers? Or maybe these concepts are just "too boring"? I'd like to know if I'm biased or not.

EDIT: I'm not implying this is a "bad thing". Just trying to assess if it's a reality or it's just me. And I'm very supportive of self taught engineers! I think it's great that people can build their own careers by themselves. We might just need to adjust our field a bit based on how it evolves.



Oh, no! Don't blame "self-taught" engineers; I'm self-taught. In my spare time, I learn assembly and work with a breadboard and an Arduino. Fun stuff!

The problem is "bootcamps" and other such businesses promising so-called six-figure salaries. I think most people who find these bootcamps appealing aren't really interested in technology. They just want money. I'm sure they wouldn't learn any of this stuff in their spare time, as a hobby, having genuine fun.


What I have experienced is that engineers without a degree learn in their free time the “fun” stuff: Golang, K8s, assembly, hacking, frameworks, SQL, etc. They usually don’t learn things like context-free grammars, logic, mutexes, the kind of stuff you learn at university.

People with degrees and who also learn in their free time (like me) get to learn both: the fun stuff (now) and the foundations of computer science and software engineering (back then).


I’m self taught and I’ve seen some of this baffling entitlement up close.

In my previous role, I had a direct report who was a talented designer and also good and fast with React.

I put him on a project where we needed to use both Express and jQuery. He asked if there was some way to trigger when the browser has loaded everything. I got to mentor him a bit, and he’s a better engineer now, but boy howdy, we covered a lot of basics. He spent his free time living life away from screens.
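
(For anyone wondering, the answer he needed was essentially one listener; a minimal TypeScript/DOM sketch of the usual options:)

    // Fires once the HTML has been parsed and the DOM built.
    document.addEventListener('DOMContentLoaded', () => {
      console.log('DOM ready');
    });

    // Fires only after everything (images, stylesheets, iframes) has finished loading.
    window.addEventListener('load', () => {
      console.log('page fully loaded');
    });

    // jQuery's shorthand for the first case: $(function () { ... });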

He was pushing for six-figures in every other one-on-one during our last six months together.


> He was pushing for six-figures in every other one-on-one during our last six months together.

Good on him for bag chasing. Cost of living is absurd these days, shrugs


"Just want money" may be a little too harsh. I'd say many are interested in technology but they don't know what they are getting into.

Even the best bootcamps are too short to go much into concepts - they turn out at best software technicians.


> best software technicians.

This is the fundamental issue right here. There is a difference between technicians and engineers. There's a heavy emphasis on technicians, not as much on engineers. The big way to spot the difference is that a technician defines themselves by their tools.


I knew what I was getting into; practical programming skills and concepts to make a career change, both for money and for more interesting work.

Fortunately, most of us work on plumbing data through web applications, so you really don't miss much by not having taken classes on compilers and assembly language.

The best engineers will always be self-directed learners. Bootcamp is just a springboard.


In my experience, at least, self taught are typically better than title owners. Not 100% of the time, but at least a solid 80%. My explanation is that self taught really LOVE the topics.


This has been my experience as well, and I think the reason for it is what you state.

But I'll break it down even more: what makes someone highly skilled in an area is a combination of learning everything they can about the field, and spending countless thousands of hours practicing.

If you're doing a thing because you love it, you'll naturally do both without even realizing that's what you're doing -- because to you, what you're doing is play. You're having fun.

If you're learning a thing because you want to earn a living doing it, but have no special love for it, you're less likely to spend nearly as much time learning about it expansively or actually practicing, because what you're doing isn't play, it's work.


Yeah, the self-taught concern is really missing the target in my experience too

Given how many tertiary courses are becoming increasingly vocationally focused - and feed into jobs which are extremely narrow in focus - the self-taught aspect is probably the most consistent distinguishing factor I see in exceptional engineers.

Doesn't matter if they're self-taught from zero, or self-taught on top of some technical degree - it seems to result in more breadth and depth of experience than a degree and a lack of passion does.

That said, perhaps "self-taught" is an overloaded term?

There's self-taught in traditional "autodidact" sense of: experimentation, reverse-engineering, and hundreds of small personal projects - potentially then exposed to informal expertise via usenet & BBS discussions

Then there's self-taught in the sense of: did a vocationally oriented bootcamp in 6 months and all experimentation was related to coursework

Perhaps we need to distinguish between "self-taught" and "self-guided"?


This is what I tell people who ask whether it's possible to break into the field without a degree, too; it basically amounts to "what do you mean by self-taught".

There are all different kinds of self-taught. The highest performers in any discipline are naturally going to tend to be self-taught (for a variety of reasons, and not exclusively; many have formal education or training as well, and without exception they will have had some form of mentorship). But so are the lowest. There is a huge range of skill level among the self-taught, including a lot of people who are so far ahead academically that they would get nothing out of doing the typical university education thing.


I have 2 degrees, and I do consider myself - for the most part - self-taught. The degrees offered mostly guidance and support, but ultimately it was what you made of it.

Everything presents an opportunity, you just have to decide to unfold the fractal [1].

[1] http://paulgraham.com/greatwork.html


Are you the exception or the rule?

I'm not "blaming" self-taught engineers, I actually support self-taught (instead of blowing $100K on a traditional school).

But I'm wondering a few things:

a) Is this universally known among self-taught engineers? Like, are bootcamps explicitly telling people: "Look, we're just giving you the operational basics; there's a lot you have to cover on your own"?

b) Is it easy for self-taught engineers to find the right resources to bridge these gaps? In a traditional university program, you have a career path and courses and books "pushed" onto you, so there's no way around it. I HAD to learn Prolog and the logic paradigm because otherwise I wouldn't pass. How do self-taught engineers manage to learn these things today?

---

Also, I'm not implying this is only an issue with self-taught people. Maybe our field is becoming more complex and even universities are "ditching" the theoretical concepts in favor of more "practical skills".


Perhaps also an exception, but most of the best devs I met in my teens and early 20's were self-taught.

Most were working for large companies before college age

Some eventually went on to complete CS degrees after many years in industry, but seemingly only as a formality, out of curiosity, or for a break from the grind


I'm self-taught as well. Dunno if I'm the one to say, but I'd consider myself in the upper-quartile of programming breadth as a result. There are lots of talented programmers though from a host of different backgrounds.

I've mostly learned from books, man pages and a string of interesting problems[1]. I think in general you tend to learn in a different way if you learn through self-directed problem solving. Like you get much deeper understanding of the theory, because that's the only way to get anything done. It's a much harsher standard than having to pass a test.

Though I think there's a lot of survivorship bias involved in making self-taught programmers competent. The ones who didn't have a knack for it are doing something else now.

[1] e.g. 15 years ago I built a 6502 emulator with a weird fantasy base 3 architecture, which eventually included a C compiler for this weird architecture. (I guess I was Terry Davis before it was cool.) More recently I'm working on an internet search engine I've built from scratch in vanilla Java: bespoke index, bespoke crawling, bespoke everything.


> Oh, no! Don't blame "self-taught" engineers; I'm self-taught. In my spare time, I learn assembly and work with a breadboard and an Arduino. Fun stuff!

But did you learn proper software engineering? Don't get me wrong, they also don't teach that in school.


I'm self-taught, and I absolutely did learn proper software engineering. And relearned it a few times as what was considered "proper software engineering" has changed (and continues to change) over time.


Boot camps are really great for self taught people that want to level up their knowledge.

About half the people in my boot camp were really into computers and got a lot from the program.


It's a tricky problem: a lot of people who find bootcamps appealing are self-taught like you and me, but don't have the elusive "experience" to put on a resume.

There is no reasonable way to differentiate between the two, despite that being the entire point of accreditation.

Maybe instead of acting like there are 100 million open six-figure-salaried positions and nothing else, we can actually give inexperienced devs a mediocre place to start.


I don’t think what you’re describing is even a bad thing, do you?

If people who aren’t really interested in technology can be good enough at their jobs as programmers/developers/engineers, who cares?


> who cares?

If you are hiring or working alongside programmers, you probably want those that "will" be "good" rather than those that "can" be "good enough".


I don't think that's such a sure thing.

Good enough is good enough.

Hire people who you can coach up a bit at a good value and get the job done. Hardly any job needs the best people available. They really just need to get enough out of people who are good enough.

If that's not the kind of place somebody who considers themselves a good programmer wants to work, that's fine, too.


New engineers are always green in some aspect. It sounds like you’re combining a number of different engineers from different programs and drawing conclusions.

Courses teach a lot, but they also miss out on tons of topics as well. There’s limited reinforcement between classes, so even if your professor covers your topic of choice, the next course may not.

Unit tests, for instance, are generally used by students to check their homework as they go, but those tests are usually just the (incomplete) grading rubric. I didn’t actually write many unit tests in college until my software design class, which was in Smalltalk.

I didn’t use version control until I worked on a few projects with a fellow student who self hosted SVN.

I used 6-8 different programming languages in school, each for a semester at a time. There’s simply no place for repetition and mastery until you get a job, either as an intern or permanent position—and then you’re the engineer you talk about.

Every new engineer is different, depending on their program and interests, even within the same school’s degree program. They’ll all need mentoring to grow. No one is ever going to graduate and be a mid to senior level purely from school.


I’m teaching my boss git these days. He is an electrical engineer with lots of years of programming experience, but somehow avoided modern version control.

At the same time, our fresh-out-of-school hire just started and with 6+ years of FOSS Rust, he is teaching everyone how to do async properly.


Since when are unit tests theoretical? I don't think many universities teach them. Maybe an introduction to software testing and QA, but nothing too rigorous.

So if you're talking about fresh graduates, it's always been like this AFAIK.

Bootcamps are different but they're focused on developing attractive applications.

Choosing the right data structure is also hard to reason about when you're building small applications; on the scale of a student project you won't hit bottlenecks unless you do something outrageous.

You too probably learned about it from conference talks and experience working in the industry, but in hindsight it looks like "basics".




20 years ago my take was that CS students did not learn about version control, unit testing and other industrial practices, but around the time GitHub became popular most of them got into the habit of using version control at least. (Based on my experience working with interns around an academic library.)

I skipped "CS 101" when I was an undergrad at the beginning of the 1990s (took two 1-credit-but-4-credits-worth-of-work John Shipman courses on C and TeX and a comparative programming languages course instead) but I saw the process of how it was being taught and something that sticks out to me now was that the TAs used automated scripts to grade assignments, something that a lot of students might not have really thought about.

Personally I think the real gold in the CS curriculum is the compilers class; that is the place where you learn to build highly complex programs and break down "magical" functionality into a bunch of parts that work together.

[1] https://www.nmt.edu/news/2022/shipman-honorary-doctorate.php


I’m trying to come up with the right way to convey this thought and I’m not sure I’ve nailed it, but here’s my attempt:

How you feel about these engineers is how lead engineers feel about how you feel about these engineers.

It’s hierarchical; you are probably a senior dev, and as such you notice the mistakes junior devs are making, which is fine and good. However, your attitude towards those junior engineers, that they must have something wrong with them, is a terrible mistake that only a senior dev would make.

The inability to perspective take or recall your earlier career is the same kind of mistake, to a lead dev, as putting too much logic into a unit test or selecting the wrong algorithm is to you.

Unlike you however, a lead dev will understand this is just how people are at this part of their career, and know that you’re probably coachable and can learn to be better than this.


> … the advent of self taught engineers?

This makes it sound like a recent phenomenon. Self-taught software engineers have been around quite some time. When I attempted to attend post-secondary school (early to mid 1990s), the idea of “computer science” was still too new to academia in general - the course offerings taught little to nothing outside of what one could pick up on their own.


Self-taught programmers come in different flavors, but people tend to think of one type when talking about them, so it is hard to have a good discussion. There is the classic self-taught programmer, like George Hotz, who had an interest that naturally developed into a career. There is also the newer kind, like the 40-year-old fry cook who realized one day they needed a career and saw a bootcamp ad, or a story of someone in their position brute-forcing Leetcode for 3 months and landing a 200k job at Google. In some ways even people with CS degrees are self-taught programmers. You learn programming by spending hours in front of a computer, just like you learn to drive a car by driving, not just by sitting in a classroom.


I am forever reminded of a colleague providing his assessment of an interview candidate: "Nice guy, but shouldn't be doing software development." Scarily more applicable today than when he said it.

It's tempting to suggest things are getting worse though (and I am bad for this) but I have encountered so many very senior people, in some cases famous ones, who don't actually understand what they're doing. It is scary how far confidence and an effort to rewrite history to hide all trace of your previous mistakes will get you.

What has changed is the salary and expected workload are known in the wider community and so a lot of people that treat programming as a production line grind as opposed to a creative process have showed up, and totally skewed management expectations for how things could and should operate in the process, completely oblivious to the fact much of their work, if appropriately structured, should be done by the machines they spend their lives swearing at.


Does anyone "understand what they are doing"? The field is so vast that I am sure you can take anyone and find an area of computer science where they don't understand what they are doing.


Confidently making bold assertions which are provably incorrect counts as not knowing what you are doing.


> [...] lot of people that treat programming as a production line grind as opposed to a creative process have showed up, and totally skewed management expectations for how things could and should operate in the process, completely oblivious to the fact much of their work, if appropriately structured, should be done by the machines they spend their lives swearing at.

This is so beautifully put.


I don’t see the point of understanding what I am doing. If management knows even less than I do or doesn’t care enough to check and my code runs, my solution is sufficient.


The management is there to manage people and their work, not to understand what developers are doing (otherwise they would be developers too).

Though granted, they should at least be capable of recognizing subpar work like yours, either by themselves or by proxy of other developers.


How can they manage my work without understanding it? And if they don't understand it, subpar work is paid the same as reliable work.

If there is no premium paid for software that is more reliable as they cannot identify it, why should it be more reliable?


> How can they manage my work without understanding it?

By assuming you're a responsible adult, perhaps. Sounds like you may be proving someone wrong.


Half of data scientists or ML engineers I work with can implement and train models but do not necessarily understand the underlying principles or math.

Most people just care that it works and not why.

Really dangerous depending on the problem and field tbh


Yes, I've noticed that in the Data Science space as well.

I guess our field is becoming more of an "operational" field; we need more people to operate tools, and you don't need the underlying concepts to do that.

I don't think it's a bad thing, we just need to be conscious about it.


That’s fine if they’re applied data scientists or MLEs but should not be the case for research scientists


I think part of the problem is:

1) the IT field keeps diversifying at a rapid rate, so schools try to teach too broad a topic set, making generalists in what appears to be a specific field. Oh, so you're in robotics? Is that robotics, controls, building/designing, human interfaces, power management, AI, blah blah? To be a well-rounded robotics engineer would probably take a solid 12 classes. To be good at a specific topic would take another 12 courses per topic.

2) tight job market makes it hard to acquire talent. Your boss does not want to match YOUR salary on a new engineer as that would anger you. Thus they aim lower ...

3) the really talented new engineers are possibly not interested in your field. As such the ones who have a passion are going somewhere else.

4) the explosion of money means a LOT of people are more passionate about the paycheck than the tech. Which is not wrong, but does not provide the level of knowledge you seek.

I can only assume you are passionate about tech, and thus spend spare time studying. You also delve into topics. I have found MOST techies are not so passionate. Most learn a specific topic and that is it. Tech is their job, not also their hobby.


Basically, schools don't teach a lot of topics that are required on the job; if you ask why, they just say that they teach how to learn.

What other fields do, like civil engineering and medicine, is have a licensing board. The board certifies that you know how to do the job. Often studying for the exam requires covering topics that weren't part of the curriculum.

It's easy to say that there are too many variations/topics in Software Engineering to have licensing, but medicine has just as many sub-specialties and they figure out how to have good exams.

One more thing to point out: In medicine, job interviews are mainly based on mutual interest; because the licensing board determines competency. We (software engineers) could learn a lot from that process.


For the love of God, please no mandatory licensing. Voluntary certificates approved by trusted institutions are sufficient. Businesses should choose which certificates they trust; politicians should have no power to define technical competence.


And then we would move the issue to the licenses. You would _need_ to learn, say, UML when no one actually uses it because the standards body said so.


As long as the board is composed of experienced engineers who are writing tests based on what you need to know... why oh why would they test on UML?


I completed my Comp Sci undergrad in 2021 from a top 10 UK university and I agree. We had modules on networking, databases, operating systems, software design, AI, distributed systems and many more, but almost everything was surface level.

I felt as if they went for massive breadth instead of focusing on key concepts in depth. We brushed over data structures and algorithms (this is pretty poor since Leetcode-esque questions are now standard for entry level interviews), never really learning the theory behind them, when to use each one, their efficiency etc. No maths besides probability and basic calculus... Writing efficient code and unit testing was not given more than half a lecture.

Luckily, all of my uni friendship group did comp sci and we love it, so we often went deeper into relevant topics together in our own time; otherwise I feel like I would've found getting into the industry pretty tough. I'm a machine learning engineer now and I'm basically self-taught; my degree didn't help much at all.


It's definitely shocking if your data structures and algorithms course did not provide you with a solid foundation for leetcode or discuss efficiency.

The noise in UK unis has been, for years, that they have real trouble normalising the intake of students with respect to maths ability, hence they need to revisit way too much basic calculus etc. when they would prefer to be doing other things.

That said, students and new grads tend to prioritise immediate software development practicalities, which change with surprising frequency, over the fundamentals, which we have been forgetting and rediscovering since the 70s at a far greater rate than we have advanced the field as a whole.

I hope you have managed to find a good team with the right spirited mentorship.


> I felt as if they went for massive breadth instead of focusing on key concepts in depth.

To be fair, a bachelor's degree in CS is almost unavoidably going to do that to some degree. There's enough of a breadth of subject matter that it's going to be an overview of lots of things.

Of course it should still go into some depth, or it wouldn't be worth a university degree. If, for example, no basic understanding of time complexity (or complexity analysis in general) was taught for common data structures and some basic algorithms, I agree that sounds shallow for a university course on the topic.


I'm in my 40s and have been, to varying levels of professionalism, a "professional" web developer since the early 2000s (having started self-teaching soon after I started using the web in the mid 90s). I've always been self-taught. It's only in the past year or so that I've started doing LeetCode, and learning about things like sorting algorithms, uses of hashes, binary trees, linked lists, and other things that are basic CS concepts. I've also been learning about how microprocessors work, a little assembly code, electronics/hardware design, embedded programming.

I'm not about to say it's been transformative or profoundly enlightening, but I do consider it to be useful and valuable to be able to know these things when working on projects where large data sets and performance issues are involved. I'm no longer able to nod along when people go on those rants about how LeetCode and whiteboard tests are just testing for conformity and obedience; I can now see how these concepts are important for people to know when working in the kinds of roles that the major tech companies are hiring for.

I think the bigger realisation I've had is that academic Computer Science and contemporary software development are just quite different fields, and it's non-obvious, until you make the effort to bridge the gap, just how little they have in common, at least until you become more advanced as a software developer.

I agree that it's mostly due to the fact that many/most software developers these days are self-taught or taught in bootcamps or "IT" courses that just don't teach much or any fundamental CS. And I agree it's probably not a bad thing on the whole, but I also think it would be better if self-taught software developers were encouraged to believe that learning these concepts at some point is achievable, worthwhile and possibly quite satisfying and confidence-building.


Meh. When I interviewed at Amazon, I learned that "tree" was the only data structure. If you have a memory of something called a linked list, you're obviously wrong. Other things I learned from Amazon interviews are that queuing and control theory don't exist and there are only two programming languages: C++ and Python.

I do not work at Amazon.

I think what I'm saying is a large company like Amazon has figured out how to get by with a small subset of "historical" CS knowledge. Maybe .edu is responding to industry needs by only teaching kids the three or four concepts they need to pass an interview at Amazon.


> a large company like Amazon has figured out how to get by with a small subset of "historical" CS knowledge.

All software companies do this. They focus on just the aspects of engineering that directly serve to produce the products and services that particular company produces.

This is why if someone has only worked at a single company -- regardless of how prestigious that company is -- their education and experience is incomplete. A well-rounded engineer will have worked at a few different companies that are doing very different things from each other.


I was going to say "or at least do a little continuing education." I think that would work for some, but you're probably right... many (most? all? some?) engineers will learn better in an environment where the whole team has bought into various assumptions and you're hip deep in it every day.


Yes. I'm a huge proponent of continuing education (which can come in a variety of forms, not just in a classroom). I think it's a huge shame that this isn't part of software dev culture like it is in most other professional fields.

But a large part of education is actually practicing what you learn in a realistic and persistent way. You can do that outside of a work environment, but it's easiest inside of one. And since most devs advance their salaries and careers through changing jobs anyway, it makes sense to me to include learning new skills as one of the things you look for in a new position.


Many of the software engineering concepts taught at universities are desperately outdated; literally the only UML thing I ever wanted to use was the sequence diagram. Waterfall/TDD/agile etc. all turned out to be fads that lost their substance over time. All that matters in interviews these days is LeetCode-Hard, scalable system design and, most importantly, being likable/relatable. Theoretical computer science is now floating in its own cathedrals, completely separated from practice even in highly practical fields like distributed systems/algos. ML is just math.


IMO agile didn't turn out to be a fad. Rather it became so normalized (in some form, not necessarily a pure one) that it's just not a hot topic any more.


I'd say it got subverted and emptied, now it's just a set of silly rituals with some drummer setting the pace.


> Maybe it's because of the advent of self taught engineers?

I doubt that it's this. For much of the modern history of the software industry, almost all engineers were self-taught, but this is a more recent phenomenon. And it's one I've seen as much from college graduates as others.

Personally, I think it's because a lot of software engineering is climbing higher up the ladder of abstraction. Having a career as a dev without understanding the low-level basics of how computers work is entirely viable now. Just a couple of decades back, it wasn't.


1) fewer people know how the machine and network works. This includes infrastructure experts.

2) there’s too much to learn and everyone wants to be a generalist. I’ve spent my entire career levelling up and I still don’t know enough most days.


> everyone wants to be a generalist

Weird. I'm the opposite: I REALLY love what I specialize in (idk why) and loathe almost everything else (idk why). I consider it a personal pathology that I just have to live with.


From my broad sample over the years, most of the good software engineers and architects are - and always have been - 'self-taught', even if many also happen to have classical CS degrees.

My experience is that good engineers getting things done is something that happens in engineer-led companies that pay well. If you're increasingly seeing the not-so-effective engineers, then it's perhaps a reflection that the company is one that doesn't try to, or can't, attract talent?

This doesn't make it a bad place to be. All that matters is that people are happy and worklife is enjoyable.


There is a difference between knowing specifics and knowing how, when, and why to find them.

It's easy to teach specifics. It's hard to teach people how to find them. It's tricky to teach people when to start looking; and most of the biggest mysteries in software development can be boiled down to the elusive "why".

In my experience, the biggest confounding factor is "secrecy", a word usually written out as: "proprietary".

There are two approaches to learning: definition and inference.

A public system, like Linux, can be learned from its roots, each definition built on the last.

A secretive system, like Windows, must be inferred. Every piece of knowledge is limited to the "best guess", and whatever conditions that guess was tested with.

Unfortunately, most people are working with secretive systems, so inference is the only strategy they get any practice with. That's the wrong strategy to apply when writing software, because - at the very least - one can read their own codebase.

Inference can teach you something close to specifics. It can tell you "when" to start looking (now and always, because your testing conditions are fragile). Unfortunately, inference can never truly answer you, "why".

Remove the secrecy, and everyone will know to value the basics, because those basics lay at the foundations of every answer.


While I tend to agree (my personal bugbear is understanding what the abstractions are hiding, e.g. containers are cgroups), as others have pointed out to me, there’s always another abstraction to peel away. Until you go back to Shockley, someone can always laugh at your moral high ground. EDIT: The irony of citing Shockley and using the phrase “moral high ground” isn’t lost on me, but hopefully for this particular point his physics work can be disconnected from his other beliefs.

That said, while I do believe that there’s a practical limit to how much depth of knowledge is required for most jobs, it’s perilously close to what is helpful to know. For example, while you don’t need to know how a B+tree works to write data into a database, knowing that the DB uses that (and in what capacity) can be enormously helpful when determining how to structure your table schema. Or at a higher level, you don’t need to know SQL to use an ORM, but it makes spotting performance problems much easier if you do.
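
(To make that last point concrete: the classic N+1 query pattern is easy to miss if you only read the ORM calls, and obvious once you can picture the SQL underneath. A hedged sketch with an invented ORM API, not any real library:)

    // "orm" and its methods are made up purely for illustration.
    async function loadAuthorsNaively(orm: any) {
      const authors = await orm.table('authors').all();   // 1 query for the authors...
      for (const author of authors) {
        // ...plus N more, one per author: the N+1 pattern.
        author.posts = await orm.table('posts').where({ authorId: author.id }).all();
      }
      return authors;
    }

    // Knowing the SQL underneath, you'd reach for one join (or the ORM's eager loading):
    //   SELECT * FROM authors JOIN posts ON posts.author_id = authors.id;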


I wonder sometimes if agile methodology exacerbates the problem. I've never been in a shop with a mature process so maybe we're just 'doing it wrong', but the incremental improvement philosophy doesn't really allow for that big picture comprehension that was so important for me when I first started 20+ years ago.


> I considered widespread and universal about Software Engineering and Computer Science

Could be relative, depending on level of experience, opportunity for hands-on implementation and retention of knowledge.

When I interviewed at an eCommerce startup, they were all over DB concepts and transactions. Then I was at an enterprise, where they swore by design patterns and heavily pushed the wrong patterns in the wrong places because a senior engineer said so. Then I went to a hardware manufacturer, and they had zero concept of any best practices; it was bit manipulation and inline optimization all the way through.

Depending on where one is exposed for a long enough time, some concepts are lost, some are reinforced.

At the end of the day, theories don't bring any value by themselves; whatever gets the job done and brings in the money that keeps the machine churning away is what's crucial.


I'd say it is the opposite.

Back when I first got online, most people calling themselves programmers were self-taught and they usually had a very strong interest in learning adjacent skills so we had plenty of discussions about best practices and data types on IRC.

And then the mass-manufactured 6-week bootcamp "consultants" arrived and basically just used whatever algorithm shows up first on StackOverflow. Those folks are mostly in it for the money, so they won't learn about algorithms in their free time because there is no direct link from knowing them to a salary increase. They also rarely participate in group discussions, but instead jump in and interrupt everyone else to ask for help on their specific problem.

And that's why some of the old self-taught people now look like 10x engineers (in comparison to those bootcamp consultants).


You're right most engineers are missing core concepts. You're wrong that it was ever different than it is today.

I suspect your observation results from your improvement over time. Plus, your relative skill level monotonically increases as others retire and juniors replace them - even if you stay stagnant.

There's other causes too;

- Major tech fads (remind me - is OOP in fashion or not right now?)

- New topical emphases for industry (is it more important to take that second data structures class, or that first LLM class?)

- A changing definition of who is an engineer (should business analysts be deploying pipelines in the data warehouse?)

All this to say: there's probably more people with a fantastic understanding of software engineering today than there ever has been. And the likelihood of working with one of them might still be lower than it ever has been.


> - Major tech fads (remind me - is OOP in fashion or not right now?)

It depends on the definition of OOP, which we can never agree on. This site gives Alan Kay's definition: "OOP to me means only messaging, local retention and protection and hiding of state-process, and extreme LateBinding of all things." https://wiki.c2.com/?AlanKaysDefinitionOfObjectOriented

My personal feeling is that people are tired of talking about OOP. It is as useful as arguing about what Agile is. I find it more interesting to talk about the motivations behind the new computer languages people are inventing. E.g., Rust claims its main purpose is enhanced safety, speed, and concurrency, or the ability to run multiple computations in parallel. E.g., Kotlin claims it makes coding concise with less boilerplate.


> Maybe it's because of the advent of self taught engineers?

I doubt it. I'm self-taught and I don't think I do any of this. Some of the worst engineers I've ever worked with were those fresh out of a comp sci course. I find ability in this field comes with time and one's effort, not educational background – it simply takes a lot of time and effort to be a good engineer.

I think what you're noticing is likely the result of the average Covid-era software engineering hire. I've spoken about this in other comments, but there was such massive demand for software engineers during the pandemic that companies were forced to lower standards to the point where almost anyone who had written code at some point in their lives could get a job fairly easily.

I'm a contractor so tend to work at companies with quite large contractor numbers and over the last year I noticed a lot of these crappy Covid hires were no longer getting contract renewals and were not being rehired. It's mostly normalised for me now and I'm back to working with people who are competent again. However, if you were at a company that hired a lot of new permanent employees during Covid then I'm assuming that's going to be a drag that lasts years unless the company is letting those Covid hires go.

Over the last decade I'd say the quality of the average software engineer has improved quite a bit. But my perspective of the average software engineer could be quite biased.


> Maybe it's because of the advent of self taught engineers?

Engineers, especially software engineers, are far less frequently self taught now than 20 or 30 years ago. There's far more to learn, far more opportunity for formal education, and (don't laugh, I'm serious) the level of professionalism required is significantly higher now. When I was a kid you could get an entry level programming or web dev job on the strength of building a web site over the holidays. Now, most places you need some sort of formal qualification to be an intern.


> Maybe it's because of the advent of self taught engineers?

All good engineers are self-taught whether they went to university for CS or not. Almost everything required to become skilled or knowledgeable happens outside the classroom. If you don't have an interest or passion to be deeply competent, it won't happen through osmosis because someone else is teaching you, you still have to do the hard work and experimentation on your own. Mediocrity in software engineering has been the norm as long as I've been in it.

Also, "self-taught" engineers aren't new, and some software domains like hardcore systems software seem to be mostly self-taught. In the past I think it was even more prevalent due to the dearth of easily accessible information and learning material. Blogs and tutorials weren't a thing, you had to learn a lot of things experimentally, which often had the side effect of much deeper understanding than if you'd just read about it. The massive increase in available software engineering knowledge is a huge improvement but it also reduces the need for understanding the material. If you can google a solution to a software problem you don't necessarily need to understand why or how it solves a problem.


Most new devs want to build things. Their motivation is driven by app ideas.

It takes a while to develop a fascination for the internals of programming which comes after you start building things.

The problem is that a normal person would look at the industry, look at what is the most popular, and think: this must be the way it should be done.

But pick any point in programming history and you would then just be learning the latest fad.

And you can only know how bad something is once you understand it completely in both theory and real-world experience. Only then are you adequately able to criticize it. So when the new hyped thing breaks through to adoption, it's then at least 10 years until people can adequately point at the holes in it... and see which stick around and which get fixed.

Yet every single new thing, people still convince themselves that this is the way of the future forever.

I think an unbiased History of Programming Languages is probably the best thing people can learn. After learning a few modern languages of course. Then you can see why things are the way they are.

It's funny to hear people criticize frontend for churn - which is true. But at least the language has stayed the same. Backend is actually incurring a lot more churn, not only in the libraries and frameworks, like the frontend world, but in the whole programming language too.


Same questions here.

The other day I was interviewing with a recruiter for a Go dev position and she asked, "what is a map?" I couldn't believe such a basic question was being asked, explained it, then asked her why such a basic question; she said most applicants couldn't tell her what a map was (shock face).

Next question was "what is a hash table?", even more shock...
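
(For anyone reading along, and as a rough TypeScript sketch rather than Go: a map is an associative container of key-value pairs, and a hash table is the usual way to implement one.)

    // A map: lookup by key instead of by position.
    const ages = new Map<string, number>();
    ages.set('ada', 36);
    ages.set('grace', 45);
    console.log(ages.get('ada')); // 36

    // A hash table backs this: roughly bucket = hash(key) % bucketCount, with
    // collisions resolved (e.g. by chaining), giving O(1) average-case lookups.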

I'm a Computer Engineer who started to program when I was 12 on a 286.


I’ve used Project Euler's Problem 001 as an interview filter and more than one senior developer failed it… not surprised they don’t know what a map is either
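
(For context, Problem 001 is just "sum the multiples of 3 or 5 below 1000"; a minimal sketch of the kind of answer that filter looks for:)

    // Project Euler, Problem 1: sum of natural numbers below 1000 divisible by 3 or 5.
    let sum = 0;
    for (let n = 1; n < 1000; n++) {
      if (n % 3 === 0 || n % 5 === 0) {
        sum += n;
      }
    }
    console.log(sum); // 233168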


Yes, my opinion is these problems arise naturally because laissez-faire attitudes are permitted in the interest of saving money at all costs.


Do you have any data to suggest that new engineers today are choosing the wrong data structures at higher rates than new engineers in the past?


I don’t think software engineers, as a category, have ever been defined by strong CS fundamentals.

On the other hand, I think increasingly few SWEs enter the professional workforce with prior “casual” (meaning hobby or similar) experience. Whether or not that’s a bad thing probably depends on whether you think programmers should love to program vs. have professional boundaries.


Have you ever considered that some software engineers enter the field not out of a natural calling but for purely economic reasons?

For a lot of people in Europe, the paycheck immediately turns you into a middle-class person with a stable income and benefits, and you get to work with more or less smart people.

Plus, it’s all over the media how the hi-tech sector always lacks talent and how there’s no formal admission process for these kinds of jobs outside FAANG.

Where I’m going with this is that you’re actually right. I’ve worked with people without a degree who were born for this job, but I’ve also worked with those who entered this career for purely economic reasons.

The economically motivated no-degree guys, especially those in their late 20s or early 30s, may have a tendency to produce a result, but it tends to be of lower quality than that of a CS grad.

With a CS grad, we can speak the same language about data structures and patterns, and we generally get each other. But with the off-the-street guys you never know if they get you. And this is a big problem once you build software at scale.

Adding to that, I’ve encountered situations where the SSEs had just discovered design patterns and for no good reason decided to practice what they learned right in the production codebase. This later led to code rot, decreased portability and a general waste of engineering time (which is money too). When you join the company at a later stage, you can see that a self-trained SSE was just juggling shiny new things (including when to apply them, which could’ve been learned at school), and it’s a train wreck to work with such people.

Also, once you question what was the engineering justification for it, they may immediately take offense and become hostile rather than engage in what is expected to be a basic engineering discussion.

So I would say that the overall quality of working software engineers is decreasing across the board, yes. My own experience proves this.


> Software Engineering and Computer Science

These are not necessarily the same thing (depends on the school), and many people who program computers are neither engineers nor scientists in any field, let alone in computing and software.

In all fields, not only programming, if you hire people without formal training, you get people without formal training. Sometimes they will acquire the same learning organically over many years (and maybe even surpass their capital-E Engineer brethren), sometimes they won't. You may even prefer them to not have the rigidity of formal training.

Moreover, this is not a new thing, nor, necessarily, a bad thing. The Wright Brothers weren't trained engineers. Qualified engineers of the time would have considered them woefully under-educated even for simplistic design work, and yet they built a working plane when no one else had.


I think far more of us were self-taught during the dotcom boom era and have built a wide base of knowledge to draw from.

The people we hire now don't seem to have learned anything they weren't forced to learn in school. The profession is becoming more blue collar but much of what we do isn't routine.


It's like anything else. Once an industry reaches a stage of development it abstracts away the previous stage. Think of it like working on cars. At one time to work on a car you needed to be able to fabricate cams. Now you just order parts. Same for electronics and just about everything else. A car technician might be able to tell you how a transmission works, and maybe they could even rebuild one, but usually they'll leave that to specialists and just order the part.

Modern developers slap parts together, because the work of building them has been done. I'd say, if the modern developer doesn't understand the parts they're slapping together, that's the fault of both the developer and the person who made the part.


I don't think they made us write any unit tests in college. What wrong data structure are people using?


A dev I worked with thought it would be a good idea to use an object in JavaScript to solve a problem. He made the first key 0, the second key 1, and so on. Object keys are really strings, so there was extra logic to make sure they were cast properly back into numbers when needed.

I asked him why he didn't just use an array and he just kind of had a mind blown look on his face.
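
(A minimal TypeScript sketch of the difference, for anyone following along:)

    // What he built: an object whose keys only look numeric.
    const items: { [index: string]: string } = {};
    items[0] = 'a';                    // the key is silently coerced to the string "0"
    items[1] = 'b';
    console.log(Object.keys(items));   // ["0", "1"] -- strings, hence the casting logic

    // What an array gives you for free: real numeric indices, length, iteration.
    const list: string[] = ['a', 'b'];
    console.log(list.length);          // 2
    for (const x of list) { console.log(x); }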


Fun fact: all arrays in PHP are also associative. I once worked on code that managed to delete one of the array index keys. That was fun to debug.


How much do you pay?


I am self-taught so I have limited understanding of the types of degrees engineers usually pursue these days. I do know that more and more universities are offering focused software engineering degrees vs traditional CS degrees.

Can anyone comment if these newer degrees are better/worse at imparting the kind of theoretical knowledge of which OP speaks?

> engineers putting a lot of logic in their unit tests

Like is there a class which is supposed to teach these things? Seems like the kind of thing you have to learn through experience.

Regarding being self-taught... I find consistently that I generally outperform other traditionally educated engineers on my teams on a theoretical level, but I think this comes from many, many years of deep reading and small, focused experiments. I also haven't worked with a team comprised of only 99th percentile engineers but I assume that's not very common.

I don't even know how all of the nuance which drives sound theoretical frameworks can be imparted to a student in a 4 year curriculum, so I wager that a lot of this is taught on the job. So my next question is, what is the average developer team environment like, is there learning on the job, is emphasis placed on "getting shit done" vs "getting it done right", etc. My career has been entirely startups and independent consulting so I also have little insight into the bulk of the industry.

Anyway, I wouldn't discount self-taught engineers. If you know a self-taught engineer who has broken into the professional scene and isn't a junior, they likely have a comprehensive skillset and deep practical knowledge which converts to theoretical knowledge. Many self-taught engineers have a lifetime of experience.

I generally agree with you that software engineering today seems less rigorous. I also wonder if the internet is just making it easier to cross-examine engineering practices across the industry, and if there's always been such a distribution of people who deeply care about getting things right vs people who simply see programming as a job in which they should invest the minimum.


I've noticed this myself, no theoretical computer science concepts. To be honest it's very rare nowadays that new hires actually have a computer science background and are not self taught with a totally different background.


>I've noticed a lot of them didn't have the "basic" theoretical concepts that I considered widespread and universal about Software Engineering and Computer Science

It's not required to be employed though.

Go try to build a list of basic, universal theoretical concepts and I guarantee someone will disagree with you about something on it. Someone else can just as easily declare that some knowledge you omitted is basic, or that some knowledge you included is not. Now that you've gotten something wrong, should we be questioning you because you missed something "basic"?


Your experience reads simply like there is a lack of practice. The discipline most definitely takes practice. It's not surprising that new engineers are not well practiced – that is what it means to be new.


“The children now love luxury; they have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise. Children are now tyrants, not the servants of their households. They no longer rise when elders enter the room. They contradict their parents, chatter before company, gobble up dainties at the table, cross their legs, and tyrannize their teachers.” ― Socrates

What are you (you reader!) doing to support your Junior developers?



Be careful with your judgments. Reading your complaint makes me wonder about your skills!

One should distinguish between computer science and SW engineering. The former is full of theoretical concepts, and the latter consists primarily of opinion. I know SW attracts people who like hard boundaries/divisions, but amongst the engineering disciplines I've been exposed to [1], SW is the least cut and dried.

Data structures are CS. Unit testing is SW engineering. The former is as clear cut as math, and the latter is a relatively new introduction that has widely differing opinions amongst its expert practitioners. Of the senior, good SW engineers I've worked with, you have some who are pro-logic in unit tests. Others who are anti-logic. I fall in the latter camp, and I counsel young developers that way, but I won't ding a senior developer in a code review for it. I am no more right than he is.
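
(To make the "logic in unit tests" debate concrete, a minimal Jest-style sketch of the two camps; square() here is just a stand-in function under test:)

    const square = (n: number) => n * n;   // stand-in for the code under test

    // "Logic in the test": compact, but the test re-derives its own expected values,
    // so the test itself now has behavior that can be wrong.
    test('squares small integers (loop)', () => {
      for (let n = 0; n < 5; n++) {
        expect(square(n)).toBe(n * n);
      }
    });

    // "No logic": repetitive, but every expectation is a plain, inspectable fact.
    test('squares small integers (spelled out)', () => {
      expect(square(2)).toBe(4);
      expect(square(3)).toBe(9);
    });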

Data structures: The wise engineer knows when it matters. A very popular C++ book advocates vector usage even when theory says a map/set is better. Why? Because vectors have been optimized like crazy, and for data sets of a certain size, the vector almost always performs better. So he recommends getting data of representative (or max expected) size and benchmarking.

Have you done that for your code?

Even if it would be faster, how much of a bottleneck is the existing code? If only 2% of the time is spent on this portion of the code, does it matter if you speed up that 2% tenfold?
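
(To put a number on that: if the 2% hotspot becomes ten times faster it shrinks to 0.2%, so total runtime drops from 100% to 98.2% - an overall gain of under 2%. Amdahl's law in miniature.)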

[1] My background is in one of the "hard" engineering - not CS.


In general, I think the quality of your average developer has gone down significantly in the last 5-10 years.

The market wanted a lot more developers and there weren't enough immigrants to fill those positions, so tons of companies lowered the bar for entry.

Your average dev these days is just so much worse which leads to a lot of the observations you have.


I'm self taught and also went to a boot camp around 8 years ago. Been working since then. I think I'm a pretty decent developer but often wonder if I ought to get a computer science degree (and also constantly kick myself for not staying in school when I was younger).


The amount of free material available on the web or on YouTube is actually mind blowing these days.

I got a CS degree almost 30 years ago (Computer Engineering actually so a mix of hardware and software) and have spent the last 3 years diving back in and relearning things I have a much deeper respect for. Surprisingly little has actually fundamentally changed, btw, in 30 years.

But there are wonderful resources like MIT’s original SICP book as well as video lectures (in glorious 80s retro splendor no less!), the Carnegie Mellon Databaseology courses, there’s a good Advanced Programming in the Unix Environment course, Crafting Interpreters, Smalltalk-80 Bluebook, talks by Alan Kay or Joe Armstrong, etc, etc, etc…

Once you start to get a sense of your own filter, as well as YouTube’s algos getting tuned in for what you actually want… I’ve actually been truly amazed. Lately I’ve been hacking around my own simple VMs and compiler theory in a way that I couldn’t have previously imagined.


I have only ever dealt with theory for interviews. One of my jobs is for a leetcoding unicorn and I can’t say I have ever heard the word “data structure” in my two years here.

I view it as something to regurgitate on command when interviewed. It is otherwise worthless knowledge.


Many developers today don't have basic theoretical concepts. Many developers 4 years ago didn't. Many developers 10 years ago didn't. Etc. This is not new. What have you done to improve this situation in your current company though?


How to lay things out is the biggest one I see. Everything crammed into a single function, files that are two thousand lines long for no good reason. Lots of side effects and names that aren't sufficiently descriptive.


From my last hiring experiences (frontend dev): people are not aware of the underlying technologies. Frameworks are all they know now.

I partially understand that people might not be interested in the details of how things work. This gets them stuck on non-trivial tasks, but whatever.

One thing I don't understand: they no longer search for "interview questions" before the interview.

I always avoided such questions, because I felt the answers were memorized word for word, but now asking about basic concepts gets no answer or a wrong one.

I hope that's only the case in the frontend world, where things "don't matter" that much.


No. And what you posted aren't "theoretical" concepts, either.


Group theory. Stream processing. Testing and quality assurance in general. Technical debt. Everything in The Mythical Man-Month and Code Complete, which are a bit more practical than theoretical, but still.


If this is the U.S., then you are correct on the lack-of-knowledge part, not on the self-taught part. The U.S. university system is such now that if you are not self-taught, you will know nothing. The credits are made up, the curriculum is barren of experienced engineers, and the fed is handing out accreditation like candy at a parade for political purposes. The era where experienced engineers would be teaching at any of these schools is over - they're working private. This is a natural consequence of the racketeering that federal student loans and grants have created.

It matters far more now what nationality and skin color you are, and how many studies you've published, quality or not. You can guess which political party decided to make it that way.


I think that we just need to realize that we exist to make money for companies. Customers don't care about how a CPU cache works, or what the runtime of an algorithm is. They care about software that solves their problem. Sometimes you need to know the theory side to solve customer problems, but more often than not you really don't. Most of us are not actual computer scientists, even if we have a grad degree in the subject.

On top of this, if you're not flexing the "theory" muscle often, it's easy to forget a lot of the details even if you have formal education. I think this applies to a lot of us who have been out of school for a bit.

I love theory, would love to work on it for a living, but here I am.


Same for me, I'm watching the MIT OCW videos now that I'm 43, because I feel like it's the right moment to learn theory, to tie my experience together.


Do you seriously think that whenever it was you started in this industry there weren't all sorts of complaints about new engineers not knowing anything?


How would you recommend to learn more or brush up on these concepts? Any particular resources you'd point to?


This phenomenon is not exclusive to engineering. It's happening in way more professions than this one.


Welcome to the era of the burger-flipper developer. Two months ago they were at the grill, then they entered some code academy (or "sourcery", as is fashionable) and now they're welcoming you. If they survive a year in the industry it's even worse, as "experience" kicks in.


Juniors are gonna junior.


The youth is lost!


Computer science and software engineering are completely separate disciplines, alike in dignity — but trouble arises when you confuse one for the other.

Computer science is a theoretical discipline, ideally suited to being taught in an academic setting, and concerns itself with the study of computation, including rigorous mathematical proofs, scientific experimentation, and complete thought experiments like super-Turing computation. It can sometimes offer insight into thorny programming problems, but for 90% of programs isn't directly applicable. Where research results from CS do become applicable to programming, they're typically implemented into libraries that can be used by software engineers with no deep understanding of the underlying research. Many subfields of computer science don't require the ability to write executable programs at all, though applied computer scientists are valuable to bridge research with the world of software engineering.

Software engineering is a practical discipline, ideally learnt ‘on the job’ in a self-taught or apprenticeship structure. It concerns itself with how to make good programs in a corporate environment, and in addition to practical programming concerns itself with other hard problems in building reliable software: testing, requirements analysis, and team collaboration dynamics. It's very rare for software engineering tasks to require knowledge of computer science concepts, though occasionally software engineers might choose to implement some CS research.

When people mix these things up, chaos ensues. For example, interviewing software engineers based on theoretical knowledge of algorithms, or demanding a CS degree as a prerequisite for your software engineering job, is largely useless and a terrible predictor of the quality of software they'll output — in the unlikely event that they need a sophisticated algorithm, a good software engineer can pull in a library or, in the worst case, search for it and translate some pseudocode. Likewise both disciplines end up getting maligned for not being the other — I've heard people both bemoaning that self-taught programmers can't balance a red–black tree (why would they?) as well as grumbling that computer science graduates learn weird languages like Haskell and Prolog but can't write a Python unit test.

Furthermore, because companies expect software engineering skill from computer science graduates, computer science degrees start incorporating more software engineering material at the cost of theoretical computer science material — and they inevitably don't teach it very well, because practical skills are better learnt on the job. Theoretical skills for research positions are increasingly shunted into masters or PhD levels. If we're not careful we'll end up with the worst of both worlds, where CS degrees are useless both for theoretical CS research and for software engineering, but for historical reasons are considered entry-level requirements for both professions.

tl;dr Companies should stop expecting CS degrees and knowledge for software engineering jobs, where it's all but irrelevant, and provide better on-the-job training and apprenticeship schemes. Universities should stop turning their CS degrees into software engineering degrees, and stick to teaching theoretical CS — because universities exist precisely to teach those things that are hard to learn on the job, or not immediately relevant to professional practice. If they do offer software engineering education it should be in a dedicated SE degree that can focus on SE theory, around e.g. organizational structure and quality assurance. We don't expect physicists to be able to design and build a house, or architects to answer interview questions about quantum field theory; why are we so determined to erase the distinction between theory and practice in the case of software?



