So universities are conspicuous consumption, internships are conspicuous consumption, entrepreneurship is conspicuous consumption.
Perhaps the simple fact is that most people are motivated by the need for social validation, and only a tiny tiny fraction of the population is driven instead by things like intellectual curiosity, desire to help others / solve societal problems, etc. So any class of activity that becomes open to a sufficient number of entrants will eventually turn into a form of conspicuous consumption.
Now scientific research is conspicuous consumption (MIT Media Lab?), book authorship is conspicuous consumption, and political candidacy is conspicuous consumption. Are there any activities left that don't function as a form of conspicuous consumption?
Surely anything is conspicuous consumption if you do it conspicuously and consumptively enough?
(That's a joke, but it also isn't: in a media panopticon society everything is conspicuous, and in an industrial/consumerist society everything is consumption... that leaves only activities which are productive or collective but done privately. That leaves... a few of the religions? Someone should break out the Baudrillard at this point; we are not the first to address this question)
Edit: I forgot the major media event of the time, duh; protesting, while very conspicuous, is intrinsically anticonsumerist, and getting injured and teargassed as part of it is sacrificial rather than consumerist.
>Are there any activities left that don't function as a form of conspicuous consumption?
I would guess anything that doesn't involve exchanging money for an object or experience, in which the quality and/or duration of the object/experience increases with increased cost.
Some ideas off the top of my head:
- Volunteering your time locally in your community (soup kitchen, tutoring underprivileged kids, coaching youth sports, etc.)
- Building interpersonal relationships with new people
- Putting work in to maintain existing interpersonal relationships
- Meditation, mindfulness
- Building a tangible skill that takes intense study/practice over a time scale of multiple years to be considered an expert (craftsmanship, visual arts, martial arts, athletics, etc.)
> universities are conspicuous consumption, internships are conspicuous consumption, entrepreneurship is conspicuous consumption
> scientific research is conspicuous consumption (MIT Media Lab?), book authorship is conspicuous consumption, and political candidacy is conspicuous consumption
Is the paper making all these claims? I don't see them.
Then I don't understand who this question is supposed to be directed at:
> Are there any activities left that don't function as a form of conspicuous consumption?
My answer would be that none of the activities you list (with the possible exception of the kind of wannabe entrepreneurship the paper is talking about) are conspicuous consumption. But it seemed like you were directing the question at the authors of the paper, not me.
> Then I don't understand who this question is supposed to be directed at:
To the HN community of course.
> My answer would be that none of the activities you list (with the possible exception of the kind of wannabe entrepreneurship the paper is talking about) are conspicuous consumption.
That’s a valid response, although I don’t agree. Note that I’m not regarding these activities to be entirely about conspicuous consumption, just like the paper doesn’t claim all of entrepreneurship is Veblenian. FWIW, the paper does cite another work that claims some internships can be viewed as a form of conspicuous consumption.
> Note that I’m not regarding these activities to be entirely about conspicuous consumption
If your definition of an activity being conspicuous consumption is that somebody, somewhere, can use it for conspicuous consumption, I think you're right that it's going to be hard to find an activity that isn't "conspicuous consumption" by this definition. But I would say that's a problem with your definition.
Well, surely it's a matter of degree (and also chronological trends) that we're talking about here. My writing may not be perfectly clear, but other commenters seem to understand my point; nuanced discussions may be difficult, but they're not impossible if we're engaging in good faith.
I think it was a rhetorical question directed at me. I got a chuckle out of it, it made me go 'hmm...', and it brightened my day by about one micromort's worth.
You make it sound as if the author is some middling engineer building smartphones for Apple.
An EE professor at a well-regarded research university deals with the scientific method all the time. Whatever opinion you may have about the article, as far as I can tell the author is certainly not on "the sidelines" and is more than qualified to write about the state of science.
Japan has a long history of appreciating quirky (often useless) inventions, I think there’s a certain aesthetic quality to these gadgets that resists easy definition.
Lots of great resources here, but as mentioned in the essay, signaling based on external validation is still crucial. (OP seems to have won a Thiel Fellowship; I'd assume that has a stronger signaling value than your typical college degree.) As much as I hate it, the world runs on limited resources, and people and organizations have no choice but to rely on some sort of signaling scheme as a filtering mechanism.
I agree, but the signal can be pretty small. I contributed to Firefox, and that was good enough to get hired by a web browser company without a degree. After that, work experience was good enough for pretty much anything.
This is tangential to the author's argument but it's amusing to see how the word "kawaii" has taken a life of its own in the West.
In Japan we often make gratuitous use of English words (yokomoji, lit. "sideways-written characters") to make things sound more interesting than they really are, seems like it works the other way around too.
> In France too, [...] hilarious when management uses English words
Recently (well... back when going to the office was still a thing) I had a meeting with a senior manager at our company. At the end of the meeting, he grabbed his iPhone and said "I am going to take a selfie of the whiteboard" (in French except for the word 'selfie') thinking that meant any kind of picture.
We have the same thing happening in Germany as well, and I don't know a single person who thinks it sounds good. The (typically) senior PMs seem oblivious to how everyone makes fun of them behind their backs because of how ridiculous they sound.
The legal drinking age in Japan is 20 (the age of majority is being lowered to 18 in 2022, though the drinking age stays at 20), but attitudes toward teenagers drinking alcohol have historically been extremely lax here.
This is interesting! Seems we can make everything interactive these days with a simple mix of electronics and ML. Discoverability might be an issue though, I already have trouble remembering all the gesture combinations on the Macbook trackpad.
It’s also not clear if the ML-based classification scheme can handle the vast individual differences among the general public, in real-life scenarios. (Wasn't this why Motion Sense on Pixel 4 wasn’t as precise as initially expected?) Performance may take a further hit with wear and tear, if these are embedded in clothing like hoodie drawstrings.
Granted, 16 isn't a lot, but that's actually a higher citation count than the vast majority of academic papers. Anyway, thinking of citations as a reliable proxy for paper quality is like ranking artists by how many followers they have on Twitter.
> Some people never do research again after completing a Ph.D. For such people, the Ph.D. was largely a waste of time.
Is it? Some of my friends went on to start successful companies after getting their PhDs. I'm a bit envious of them actually, I didn't have that option as my research was on an obscure topic with zero commercial potential.
> The Ph.D. is a tremendous opportunity. You get to pick an advisor in any research area you like and then you get to do research in that area, receive mentoring, think deeply on problems, publish papers, become famous, while paying zero tuition for 6 years and receiving a salary.
Yes, it is a great opportunity, but considering how massively universities benefit from the work generated by their PhD students, those students are grossly underpaid (at least in STEM).
Overall this seems like good practical advice, but it's firmly written from the perspective of a true believer in the PhD system (and academia in general). My take is a bit different: I really think academia is broken in some fundamental ways.
> Some people never do research again after completing a Ph.D. For such people, the Ph.D. was largely a waste of time
I'm in this camp (haven't published since I graduated and joined industry), and I also strongly disagree with this statement. The value of the PhD is in how it changes the way you think, how you break down big fuzzy problems, how it helps you navigate the literature when you need help, and how it helps you know where to look and who to ask.
I've worked on much harder problems than my PhD, but none have felt as hard as the PhD, because of the PhD.
The problem is the definition of "research". If you call it "publication", then yes, it seems like a waste.
If you call it "development of new widgets via principled approaches consistent with or beyond the state of the art", it starts to feel like a Ph.D can be training to do really great work in industry.
Very much this. I have spent a considerable post-PhD period developing "new widgets", as you put it, and I constantly feel the benefit of (a) the early training of my PhD and (b) the commercial work done within the framework of (a). That seems a bit obscure, so think of it via this analogy:
Person A never does any serious training in a sport, and then goes on to spend 10 years playing "hit and giggle" tennis (say). They probably get a bit better.
Person B spends several years in their youth doing tennis lessons at a high level. They then never take another tennis lesson, ever, but they spend the same 10 years as person A playing recreational tennis, using the skills they have built in their youth as a framework.
I feel that a lot of the stuff I've done recently builds more on the work I did as a commercial researcher (2006-2017, particularly) than it does on my PhD work, but I also think that the 2006-2017 work greatly benefited from my PhD.
I have one anecdata point, kind of. It's not PhD level, but a master's versus the non-academic route.
TL;DR: my friend is more practical (e.g. better at band-aid solutions). I integrate theory and practice better (e.g. diagnosing issues from silicon up to the high level). When things are simple, he is faster. When things are harder, I am the only one who can solve them.
===== THE WHOLE STORY ======
I simply did a bachelor's + master's in CS (security + web/mobile). My friend is a semi self-taught web developer and (soon to be) pentester.
Friend in Web:
When he became serious about web development, he went to a coding bootcamp. When I started teaching web, he had 1 year of company experience.
Me in Web:
I had some hobby experience with web, but because I had CS fundamentals and a good teaching style, I was hired to start teaching his course.
Result:
My friend was more practical than me. He came up with more band-aid-style solutions, which were sometimes warranted (time constraints) and sometimes weren't. For me, it helped me bridge theory and practice.
Alrighty, round 2: pentesting.
Friend in Pentesting:
It took him a year (!) to get his bearings and find a curriculum he wanted to learn. In that year he learned a lot about pentesting, which is how he could verify that he had found "the magic bullet" of curriculums. By the way, the "magic bullet" for entry-level pentesters is: go to hackthebox.eu, and if you want to get certified (i.e. so recruiters will notice you), do the OSCP. He did a lot of different stuff before he came to this conclusion (honorable mention: VHL - Virtual Hacking Labs).
Me in Pentesting:
My friend invited me to join him on hackthebox.eu because he knew I had done courses in web + network security, binary & malware analysis, and hardware security. I go in and slay the boxes together with him. The key difference: he is fast; I am slow, but I am capable of hacking the most difficult levels (which they call "insane" boxes).
Result:
We teach each other a bit of what we know. He helped me get faster with easy boxes. I helped him to (almost) hack insane boxes. In doing so, I taught him x64 assembly and some C.
Surely this is a question where many opinions will be biased.
If you did a PhD and stayed in academia, then you are likely to feel that having people do PhDs and leave academia is a waste of resources (maybe this is wrong and PhD students are so cheap that they are net contributors).
If you did a PhD and left academia, then you are perhaps likely to feel that your PhD bestowed benefits on you outside of academia, and that those benefits ought to be afforded to people in later generations.
I suppose if one doesn't have a PhD, then one might have either opinion, or some other opinion, e.g. that PhDs are entirely a waste of public money.
>> Some people never do research again after completing a Ph.D. For such people, the Ph.D. was largely a waste of time.
> Is it? Some of my friends went on to start successful companies after getting their PhDs. I'm a bit envious of them actually, I didn't have that option as my research was on an obscure topic with zero commercial potential.
There's also plenty of industry positions that benefit from you having a PhD. Certain fields can basically require them, like Chemistry. In CS, you might be able to get a lot of them without one, but the PhD lets you get paid to develop your skill set in niche areas.
> Is it? Some of my friends went on to start successful companies after getting their PhDs. I'm a bit envious of them actually, I didn't have that option as my research was on an obscure topic with zero commercial potential.
Agreed, a PhD is absolutely not useless in this industry, even if you don't end up in the research community. Understanding where the research frontiers of various fields are and being able to quickly find / digest relevant technical papers feels like a superpower. The gap between an undergrad education and a research frontier is enormous, and only working on a PhD really gives you the time and incentive to cross it. Having done it once, it gets easier to do it again.
For me, it has turned a huge volume of "unknown unknowns" into "known unknowns" and equipped me with the tools to then convert those into "knowns". Without it I'd be a fine coder, sure. With it I can work on a different tier of projects, and direct my career much better.
The costs are very real, though. Giving up ~6 years of early career earnings in a high-paying industry is utterly insane; you will never, ever make it up short of your startup lottery ticket number coming up. It's a meat grinder for mental health. Dozens of things outside of your control can go wrong and torpedo your aspirations. It is the right choice only for a vanishingly small minority.
> Some people never do research again after completing a Ph.D. For such people, the Ph.D. was largely a waste of time.
This statement reflects the writer's large ego, not reality. It's a shame how prominent this point of view is in the scientific community. I've seen similar rude statements from academic scientists who say that working in R&D at a corporation isn't "real science".
I think the intention was to refer to people who do no further research at all, whether academic or private (e.g. repetitive work with significant exploration involved). The last paragraph of section 2.6 is about doing research work in private companies, no?
I agree with the author that there is little value in doing a PhD if you don't intend to keep doing some form of research afterwards. There are alternative programs or work that would be much more applicable.
6 years of a grad student salary is a big financial sacrifice compared to 6 years of a software industry salary, even at non-Silicon-Valley salary levels.
The opportunity cost of a PhD can be pretty high. In CS you could theoretically make back a lot of that money with a higher salary after graduation if you then enter industry, but the gap in lifetime earnings up until that point is huge.
Let's say you spend 6 years in grad school on a stipend of $35k per year. You think you are being paid to get a PhD, but that's just because you are ignoring opportunity cost.
If you took your talents to industry instead, let's estimate an entry-level salary. I will intentionally make it lower than average. Let's say $100k per year, which is on the low end in Silicon Valley for an entry-level engineer, and let's also make the ridiculous assumption that such an engineer would not get a raise for the entire 6-year period.
After 6 years, the hypothetical grad student was paid $210k total, and the hypothetical entry-level engineer was paid $600k total. The PhD cost $600k - $210k = $390k. Would you pay $390k for a PhD?
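For anyone who wants to plug in their own numbers, here's a minimal Python sketch of that back-of-the-envelope calculation (the stipend and salary figures are just the assumptions above, not data):

    # Back-of-the-envelope opportunity cost of a PhD.
    # Figures are the assumptions from this comment, not real data.
    YEARS = 6
    STIPEND = 35_000      # assumed grad student stipend per year
    INDUSTRY = 100_000    # assumed flat entry-level industry salary per year

    grad_total = YEARS * STIPEND        # $210,000
    industry_total = YEARS * INDUSTRY   # $600,000
    opportunity_cost = industry_total - grad_total

    print(f"Grad school pay:  ${grad_total:,}")
    print(f"Industry pay:     ${industry_total:,}")
    print(f"Opportunity cost: ${opportunity_cost:,}")  # $390,000

Swap in your own local salary and stipend figures; once raises and equity are included, the gap only grows.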
Yes, of course, even if you increased it to $800k.
Several thousand brilliant people make this decision every year.
Think of it as paying $390k to buy 5-6 years of time, and it starts to sound very different.
5-6 years of time at the prime of your life where your basic needs are taken care of, without any enforceable expectations of you.
You can go sit in on any of the hundreds of interesting things being taught around you.
You can pursue hobbies, go camping in the middle of the week.
If you are lucky, you get to work on interesting bleeding edge projects which may or may not make business sense.
For this you get nearly any resource you ask for.
Labs in good universities are rather well funded.
Supercomputer time to run a random program, easy!
Also, most companies are happy to hand out huge amounts of cloud credits and reference hardware.
Compared to making $390k, are you surprised that at least some people would make this decision?
Also, $35k per year + well-paid internships (~$40k for a summer) is a rather comfortable amount of money for a 25-year-old in most situations.
And you get a high salary post-PhD only if you are in some hot field. I got one of the earliest PhDs in neural networks, but this was in 1992, long before the world discovered machine learning. It took a while for my salary to catch up, although I can't complain; the last decade has been good.
I don't do any academic research, but I do a LOT of science for my job. My PhD definitely provided me with skills and knowledge I could not have easily acquired in the workplace. I'm working at an almost-FAANG doing distributed systems maintenance and design.