Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past or the future — it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from] — and the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free?
I appreciate this categorization on many fronts:
- It explains how non-computer science majors have frequently succeeded. In music, you will never find a classical musician without formal training, but frequently find pop musicians who have little or none.
- It shows how some of the same ideas keep being "discovered" by one group while another group sighs and exclaims "we knew about that years ago..."
- Pop culture is inherently youthful, prone to extremes, sensitive to new trends. Sound familiar?
- It explains a vast amount of decision making that goes on in industry. Rather than choosing the best solution from a somewhat abstract technical perspective, people select a solution that is new and hip (or that fits what the boss was doing in the heyday of his youth).
It is overly simplistic to dismiss real-world code as "sucking" (though of course much of it does). It is like comparing Bach and <fill in your favorite rock/pop star here>. Both produce valid creations that accomplish something, but they were made with vastly different intentions and for different purposes.
Real world code? All code sucks. And Alan Kay says something more damning than that: he says that the whole industry is a pop culture. We are living in the murky anterenaissance and precious few seem bothered by it.
I didn't say that. Imagine the great classical Greeks watching the dark ages take place. You could imagine their horror. Not bitter, just horrified. And with good reason.
Alan Kay has every right to be bitter. He's a great software architect, and came up with OOP as a perfectly reasonable policy for managing complexity. Now most people associate OOP with commercial garbage and that VisitorSingletonFactory nonsense.
If you came up with a set of really good ideas that were later bastardized and mutated into the sort of business horseshit that's destroying software, wouldn't you be pissed off?
In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
The hilarious thing about this is, of course, that Shakespeare actually is popular culture; we glorify it only because it predates the word we coined for it. It's good, of course, but praising it as some kind of pop-culture antithesis betrays a lack of understanding of culture itself. It's an example of how many people mistake something that was extremely mainstream (just 400 years earlier) for arthouse fare.
And to return to the issue of code quality: academia is not a magical land of perfect code. The code I've read in academia, in scientific and bioinformatics projects, was easily some of the worst I've ever read, and I currently work on a huge codebase that started 10 years ago as someone's attempt to learn a new programming language. The "best" code comes either from obsession, or from requirements that extend down to the code quality itself — either because the code is used for teaching, or because e.g. reliability requirements demand actual proof of correctness (which is rare, but some industrial code does carry this requirement; see the Maeslantkering storm surge barrier).
Yeah - these are somewhat fuzzy categories and not diametrically opposed. I just think that there are certain pieces of software that are sort of "high culture" - meaning so influential and well thought out (Unix) or elegant (Lisp) that they merit close examination by later practitioners.
As for academia, I think there is again a difference in intention (publication vs. production code), not necessarily quality. Academics tend to emphasize mathematical/theoretical underpinnings and work from the general toward the specific — and some of the best keep an eye on specific applications. Coders in industry tend to work to solve today's problems — and the best do so in a way that is reasonably general and abstracted. Again, I tend to keep this as a separate grouping in my mind.
http://www.drdobbs.com/architecture-and-design/interview-wit...