This is a good example of an important general point: human social instincts are a really huge security vulnerability. The price we pay for adaptability is that we can't have a hardwired instinct of what's normal, so the solution evolution came up with was to program us to take our cue on that from the people around us. The downside is that if we put ourselves in an environment where people are doing bad things - working overtime, doing a PhD, taking drugs, bullying, working for a totalitarian regime, basically anything whatsoever - it won't be long before our brains start telling us it's perfectly normal and finding excuses for the harmful effects. Look at this guy: he admits it ruined his life, but he still thinks it was okay because he liked his initial group of co-workers.
Conclusion: do not put yourself in an environment where the people around you are doing bad things. If you are already in such an environment, talk to someone outside it, get their view on the matter, and take it seriously. And if you think you aren't in such an environment, consider doing that anyway as a reality check.
> the solution evolution came up with was to program us to take our cue on that from the people around us
Exactly, I cannot agree more. One of the most striking examples of this is a classic experiment where people are no longer able to trust their own judgment in a very simple matter and are compelled to give an obviously wrong answer to avoid deviating from the norm:
That was an interesting read. I'm not sure it actually proves your point though, seeing as two thirds of the subjects did not yield.
Also, from the end of the article:
> Asch's 1951 report emphasised the predominance of independence over yielding saying "the facts that were being judged were, under the circumstances, the most decisive." However, a 1990 survey of US introductory psychology textbooks found that most ignored independence, instead reported a misleading summary of the results as reflecting complete power of the situation to produce conformity of behavior and belief.
It's worth pointing out that 75% of the subjects did yield at some point.
"25 percent of the sample consistently defied majority opinion, with the rest conforming on some trials."
Ie, when taking the test individually, everyone got the correct answer almost all the time. But when taking the test in a group, 75% gave the incorrect answer at some point.
To be precise, he admits that it ruined his life, but he doesn't regret it. Regret is a fairly specific emotion, and not regretting something doesn't mean you think it was okay.
No, they cannot do it for free otherwise. Research takes a lot of time, and if you're not getting money for it you can't keep doing it (you need to pay rent, buy food, etc.). Perhaps you meant to say that the people in research would like to do that research regardless of the money it earns them; that seems true enough.
The funding cycles will also wear out the people who aren't in it for the money, except for the ones that keep winning; it's a rat race regardless of your interest in cash. Most research is not about being "on the brink of discovering something [...]"; that is a romantic idea of research which is exceedingly rare.
I meant what I said. I believe a lot of researchers (in academia) would do it in their spare time, after hours, weekends, etc, even if it wasn't also their means for income. Their hobby and curiosities happen to align with a job that pays living wage.
Money in the bank is not the end goal for everyone. You can work a job to get paid so that one day you don't have to work anymore. Or, you can work a job that pays sufficiently and you'll happily do until your last days of life.
Lastly, most research is on the brink of discovering something. Whether or not that discovery has a life-changing impact (or any application whatsoever) is another argument altogether.
Maximizing discounted future cash flow often gives you the option of scaling back those hours you're trading for cash. Want more leisure time? Earn more during the hours that you are working!
No matter your goal(s) you can always treat it like that.
Just that with most humanly shaped goals, the formula would be too complicated to be tractable, especially since you have to account for all your guesses of the unknown. People don't even know what makes them happy before they live through it, and you can only live through so many experiences.
So we "give up" and rely on rules-of-thumb and call them wisdom. And mostly we should!
It's not all discounted future cashflow, it's discounted future utility. And a lot of people get the best part of their utility from family, friends, and relationships. Finding a reasonably prosperous position from which to secure a future with your family may be the best investment one can make.
Academia's contribution to that position is a murky proposition (added debt without added earnings potential undermines financial security). And while there are certainly immediate rewards from the pursuit of intellectual goals and from your friends at university, you can also get not-too-dissimilar rewards and more cash doing something like programming computers. And then you can use the cash to do things like: honeymoon with your wife in South America, enroll your child in a really good arts or music program, or spoil your grandchildren with a study-abroad semester in Europe.
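To make the discounted-utility framing concrete, here's a toy sketch of the comparison (the yearly figures and the 5% discount rate are invented purely for illustration, not claims about real academic or industry pay):

```python
# Toy comparison of two life paths by discounted utility per year.
# All numbers are made up; only the discounting mechanic matters here.

def discounted_total(utilities, rate=0.05):
    """Sum of yearly utility values, discounted back to the present."""
    return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

# Hypothetical: six lean grad-school years followed by better years,
# versus a flatter path that pays off immediately.
academia = [30] * 6 + [80] * 24
industry = [60] * 30

print(discounted_total(academia))  # present value of the academic path
print(discounted_total(industry))  # present value of the industry path
```

The point isn't the specific numbers; it's that discounting makes early lean years weigh more heavily than distant prosperous ones.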
That's an argument against going into postdoc academia, not getting the Ph.D. Getting the degree typically costs you nothing except for the time invested.
Economics tells us that an activity that takes time costs the opportunity cost of that time. Whatever else you could have done with that time, you now can't. This puts people behind where they could have been on career, savings, family relationships, and so on.
If I could live my life over, I would never have gone to grad school. And I'd have probably had kids 10 years earlier than I did.
Don't know about btilly. Here are some general reasons:
- more overlap of your lifetime with the kids lifetime (so, see more of your kids)
- get the scary parts over earlier, and enjoy more of the rest
- if female, easier time conceiving
So I'd have more energy to enjoy them, have more of a window between them growing up and being old, and so I'd have better odds of living long enough to get to know my grandchildren.
Let's put some numbers on that. My first child was born when I was 35. I turn 48 this year, and will not have all my children in college until I'm nearly 60. I'm not looking forward to figuring out college when I'd like to retire.
Moving on, my family health history sucks - I do not have a male relative who has reached 60 without a heart attack and/or stroke. That includes 2 brothers that I know of. I'm working to take care of myself with diet+exercise, but already have hypertension. If my children have children as late as I did, there won't be a grandchild until I'm in my 70s. I have very few male relatives who survived to 70, and none arrived there with their minds intact. There is reason to believe that I could do better, but realistically there is a good chance that I won't.
I was aware of most of these facts in my 20s. They just didn't seem relevant to the life choices that I was making then. In retrospect, I wish I'd chosen differently. I won't push my children towards any particular life decisions, but I hope to give them tools to make choices that better reflect what they want in the long term than I did.
The tech/startup world has realized relatively quickly that 100 hour workweeks are detrimental to productivity, retention, and overall success. Seems like articles such as this come out on a weekly basis, along with other articles on 'progressive' practices such as unlimited vacation, flex working hours, mandatory time off, half-day Fridays, etc... and it's only taken a dozen or so years.
The medical profession, on the other hand, has struggled with this for almost a hundred years. The norm was for residents to work 100+ hour weeks, for 3-5 years. Non-stop. Only recently have they given themselves a pat on the back for achieving a [mandatory 80 hour workweek](https://en.wikipedia.org/wiki/Medical_resident_work_hours#To...).
Seems odd that people building apps and games have figured out that working insane hours is bad, but the people responsible for treating the sick and wounded are still chugging along, bleary eyed and overworked.
The work is very different. If I'm coding and I'm tired, I can go home and get some sleep. When I get in the next day, the code is exactly how I left it.
A doctor can't put their patients in suspended animation while they get some shut-eye. The patient's condition may worsen or improve. They may make vague complaints that are clues which can only be pieced together if you were around to pick up all of them.
Likewise, it's very very difficult for a doctor to serialize every bit of data they've gleaned from a patient to pass on to the next doctor to come on rotation.
Because of that "continuity of care"—having one doctor around to continuously interact with the same patient—is really important. It would definitely be great for doctors to work shorter hours, but it does have some real costs. Patients aren't text files.
4 * 12 = 48 hours a week, with 2 people covering each day and 4 people on the rota. Long hours are mostly a staffing issue; 24/7/365 coverage is very hard when you don't have a lot of people.
Note that staggered hand-offs work best; you really don't want everyone to leave at the same time.
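For anyone who wants to sanity-check that arithmetic, here's the back-of-the-envelope version (the shift pattern below just mirrors the numbers above):

```python
import math

# Two 12-hour shifts cover each day; cap everyone at four shifts a week
# (4 * 12 = 48 hours) and see how many people the rota needs.
shift_hours = 12
shifts_per_day = 2                       # day shift + night shift
shifts_per_week = 7 * shifts_per_day     # 14 shifts to fill
max_shifts_per_person = 4                # 48-hour cap

people_needed = math.ceil(shifts_per_week / max_shifts_per_person)
print(people_needed)                                   # 4 people
print(shifts_per_week * shift_hours / people_needed)   # 42.0 average hours each
```

So four people keep everyone at or under 48 hours, and the average actually comes out closer to 42.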
My understanding from friends in the medical field is that doctors also nap during their shift. Obviously not as beneficial as a good night's sleep and probably depends on what's going on during their shift but it does alter the comparison a bit.
Yeah, but even a short nap really boosts productivity. I wish startup culture were more cool with naps. Spoken as someone who has worked at large and small startups and who sometimes, after lunch, just really needs a nap.
Occasionally after lunch, I'll get hit with this intense wave of sleepiness, which no amount of caffeine seems to be able to fix. The best I can do is stare at my screen blankly through half-closed eyes while my brain is more or less shut off. It can last up to an hour. It's horrible, I've literally looked up motels in the area to see if I can nap during my lunch break but no luck :(
Do you have any evidence that's a relevant factor in practice? Eg: how many times does a doctor see the same patient again in a shift? Is it really relevant across all medical positions/specialties?
But patients can and should be more similar to text files. It's a real tragedy that patients die simply because a doctor failed to write accurate notes for the next shift.
> 'progressive' practices such as unlimited vacation
Unlimited vacation is not progressive and it is also unhealthy.
It causes feelings of guilt while on vacation. Often there's an unspoken obligation to check in (email & chat) while away, and it leads to taking fewer vacation days, not more. When you leave the company, they have no obligation to pay out accrued vacation days, since none are defined (this may be a Canadian thing). It can also come with an unspoken culture of overtime, since overtime can supposedly be made up with "more time off".
Having worked with unlimited vacation, if I am ever offered it again, I will decline and negotiate defined vacation into my employment agreement.
I had unlimited vacation on my last job, and I think I personally did great with it (4-5 weeks per year, very little email checking and such).
But on average, I'd say you're right. Many of my coworkers took almost zero vacation. At one point, the CTO just started scheduling a week off at a time for people who hadn't taken one in a year+.
Unlimited vacation should come with a mandatory minimum vacation taken per year.
Funny enough, of all places, banks have such rules. You have to take at least two weeks of leave as a single block. That was introduced after rogue traders hid losses from the beancounters by shuffling their positions around every day.
I don't think all banks introduced these measures to the same degree and at the same time. So there might be plenty of data around for some enterprising statistician to tease out the effect on productivity from these natural experiments. (https://en.wikipedia.org/wiki/Natural_experiment)
The point is to ensure that all tasks are periodically handed off to someone else - i.e., when you're off, the task doesn't wait for you to do it when you get back, a designated replacement has to do it.
It has the following benefits:
* if hiding something requires certain "adjustments" to periodic reports, having someone else produce them will often expose it. In the particular rogue trader case, he was falsifying control documents; if someone else had done it even once, it would have been detected.
* if a particular customer or portfolio has problems that should have been escalated but haven't been, having someone else handle them independently during your vacation will often expose that.
* if you suddenly need to replace someone because they leave, die, or get fired, you have a good understanding of what needs to be handed off and how to do it, and you have people who have done it before. It ensures, so to speak, that you don't have an absurdly low "bus factor".
In addition, since it's a standing policy and not a one-off event, it pushes people to do things in a way that can be handed off (because they will have to be), and it deters fraud, because you expect to get found out instead of knowing that you are the sole keeper of some information and can "adjust" it as needed.
> The tech/startup world has realized relatively quickly that 100 hour workweeks are detrimental to productivity, retention, and overall success.
Not that quickly ;-) My first job at a startup was in 1992, I think. I didn't take a single day off (even national holidays) for 18 months: 12 hour days during the week, 8 on the weekend. After that, I took a year off to pretend to do a master's degree (but really just hung out in the gym and lifted weights). And then I went straight into a job where I became much more sensible and only worked 90 hour weeks.
I remember reading Jamie Zawinski's writing at the time. There are some remnants of it around the internet, but because of this: https://www.jwz.org/blog/2011/11/watch-a-vc-use-my-name-to-s... he's not keen for people to link to it. I still remember with great fondness the post about trying to answer a survey of how many hours a week he worked. He said something like 120, but the field on the computer only allowed 2 digits. JWZ was incredulous that this could happen given that the guy who wrote the code surely worked more than 100 hours a week.
Back in those days, this was the way we thought you were supposed to do it. Even though Peopleware was written in 1987, we all read it and promptly ignored its advice (BTW, when was the last time you saw a programmer's desk of the recommended size? ahem....)
Even today (getting pretty darn close to the 30th anniversary of Peopleware) how many teams believe that crunch is not just necessary, but desirable? Hey, the fact that we don't crunch is one of the biggest selling points of the group I work with. Lots of great programmers are willing to give up salary in exchange for a no-crunch life.
Lately there is general acceptance amongst programmers that a 40 hour week is a good idea (this is not always shared by entrepreneurs). However, my experience is that before about the year 2010, the concept was ludicrous -- even ridiculed. I'm really happy to see the tide turn, but this is a fairly recent phenomenon.
From what my doctor friends have told me, it is all about information hand-off, or lack thereof. When you have been on duty for 8 hours you have all the details of all the situations that are happening currently, but it is very difficult to hand off all that information to the next person, so it is just 'easier' to keep working and to be there when that info is needed.
It seems like that could be remedied with overlapping shifts, but that costs more money. I wonder how medical malpractice factors in.
Why can't everything be in the patient's chart? I do healthcare IT and doctors seem to do their digital 'paperwork' sometimes weeks after the visit. I know because we get calls to unlock the records so they can modify them. If all this data was in the chart, problem solved.
Sounds like we need to make data capture going into the charts much more slick and smooth. I'm thinking of a voice-driven personal assistant using the latest in AI. So you can just tell it what you're doing, and what you're thinking about the patient's problems, and it organizes it automatically, and makes it easy for others to retrieve (via voice).
Adding in some kind of unobtrusive HUD would be awesome too.
As a physician I can tell you it's not about data entry and data retrieval. It takes time to abstract into words the cognitive model in a physician's brain that represents the patient's condition and progress. Words are static and the format we are forced into documenting is also arthritic and linear. It does not convey the fluid and fuzzy nature of a patient's condition and progress.
Treating a sick patient is a little like reverse engineering a binary. Disassembly into blocks takes some guesswork. Pattern recognition plays a role- therefore you already need to know design patterns before you even approach a binary. Properly codifying your guesses can be helped by using something like IDA but sometimes that slows you down and you can go just as fast using ollydbg/gdb/whatever. This is because the construct is in your brain and you are using these software tools to gather data points, not construct some model on paper that stands like a sculpture or perhaps a Rube-Goldberg device that is a rough approximation of the original source code.
Treating patients begins with such a puzzle. You need to know what diseases look like, what the key distinguishing features of look-alike diseases are. Then you also need to have a feel for the time course of these processes. On this mental scaffold you pin data points gleaned from the tests you order. Trying to communicate this whole mental structure along with the relevant data points and also pointing out the confusing chaff can be a tremendous waste of time. Verbal communication is a lot faster. The abstraction/deabstraction bottleneck is avoided. Instead, approximations are presented so the model is quickly understood by the next physician. The next level of detail is then overlaid. Then another level of detail is overlaid and so on, like building a fractal image. The advantage of this approach is that the big picture is conveyed first without a lot of detail that could be misleading if it receives the wrong emphasis. Also, the receiving physician can interrogate the process early in the big picture phase to verify major diagnostic and treatment decisions.
Writing text is a linear process - it is a serial data stream. However, puzzle solving requires constant zooming in and out as well as non-linear jumping around; puzzle solving is not well suited to a linear data stream. But writing and reading text is all that we are taught from grade school on. Even the attempt to structure knowledge into an 'outline' is too restrictive. Interestingly, when we teach children how to use this linear tool of text writing, we do not use linear methods. Rather, a sentence is approached as a whole entity and then broken down until the meaning is understood. We use this game of 'diagramming sentences' to break down, analyze and communicate the parts of our sentences. The most meaningful parts are the subject and verb. Then we sometimes add an object. Adjectives and adverbs are modifiers. Prepositions are key fulcrums as well. The similarity to medicine is that a patient's condition is analogous to a sentence in English.
When we discuss a patient's condition, there is a similar pattern-the principal parts are presented first. Additional levels of detail are then added to embellish and refine the picture. The model morphs and morphs again as certain facts are emphasized and others de-emphasized. Finally it locks into one of the 'disease patterns' we recognize.
Electronic medical records are not any different from the notes we used to write. They are a skeuomorphic sham that does not permit better or faster comprehension. Computers have permitted an amazing use of graphics and improved design patterns that have changed the content on the net. If you saw a web site designed in 1990 you would recognize it immediately. Today's sites are much more dynamic. Font choices, formatting, parallax page movement, drop downs, animation and sound focus our attention to communicate information quickly. Hyperlinks permit non-linear navigation. Electronic medical records make no use of that. Graphic primitives, hyperlinks or audio cues are not built into these programs. Instead we are forced to use relics from a previous era. Why? That's a WHOLE 'NOTHER can 'o worms. There are over 450 electronic medical record systems, most of which do not talk to each other. It feels like I am using WordStar. Remember that program??
Improving patient care needs to begin with the patient. We need to be able to give a person an iPad with a picture of a body on it. As they point and click and expand relevant body parts, they are given choices of typical symptoms. The whole patient-centered part can be in any language - it would be mostly pictures with a few words. As they continue to point and click, the iPad generates the text that represents their choices. The text could be in the physician's language. A person could do this while waiting for the physician. So when the physician gets to the patient, you've just saved half an hour. And maybe avoided some unnecessary tests. And maybe gotten them the care they needed. Faster.
I'm imagining different kinds of 3-D visualizations that would usefully show the current state of the diagnosis, with the reported symptoms, test results, current theories, timeline and more.
Why isn't it possible for a programmer to step away from their desk at any time, and another programmer sit down and pick up right where you left off? Synthesis and planning happen between your ears, and there's a lot of overhead to handing that off. It's not a simple problem.
That is a factor for the length of the shift, but not the total amount - where I live, when professions for various reasons need 24 or 12 hour shifts running continuously, the standard way to do it is a "1 on / 3 off" continuous schedule, which comes out to an average 42 hour work week. I.e., either you work a full 24 hour shift and then have three days off, or you have a 12 hour shift every other day, no matter what day of the week it is.
It has some advantages and disadvantages compared to a normal work schedule (essentially, less commute but many weekends/holidays away from family), but you can definitely live with that.
However, having 12 hour shifts every day? Just hire two employees then, because you have enough work that requires two full employees.
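(For reference, the 42-hour figure is just the schedule arithmetic:

```python
# One 24-hour shift every four days ("1 on / 3 off") averaged over a week.
hours_per_cycle = 24     # one full shift
days_per_cycle = 4       # 1 day on, 3 days off
print(hours_per_cycle / days_per_cycle * 7)  # 42.0 hours/week on average
```

The 12-hours-every-other-day variant works out to the same average.)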
> The medical profession, on the other hand, has struggled with this for almost a hundred years.
IMO, this is all about money. For residents to work shorter hours, either the cost of care would have to go way up or each resident's total pay would have to go way down. Payers won't accept the former, and current and prospective residents won't accept the latter (largely because the cost of medical education is driven by expectations set by the status quo -- they literally can't afford a cut).
This may be partially true, but my anecdotal evidence points elsewhere. I live in a very poor city in the US, and most of the private hospitals here run a surplus. The problem (like in tech) is finding adequate talent. Doctors and nurse practitioners (the people that are licensed to actually provide medical care) are in extremely short supply. The money problem is at the University level. Medical programs are so rigorous and expensive, that schools cannot expand and graduate enough students to meet the demand. Hospitals and clinics are expanding, but medical schools are not. So the students that do graduate and move on to residency are overworked due to lack of adequate staff for the facility (rather than lack of money). Residents are extremely cheap compared to boarded physicians and nurse practitioners. I know personally if the hospitals I work for could hire more residents, they would in a snap, but the local University won't fund the programs.
The only solutions are to either get more money into schools to graduate more practitioners, or hope that new technology and innovation can somehow alleviate the burden of work on the existing care providers.
Or we could open medical credentialing to foreign talent. But that would mean our trade policy would have to disadvantage elites rather than simply leaving the working class open to global competition while protecting doctors, lawyers, and content conglomerates.
That could be part of a solution. Another one is to be very careful about how much training to require for which function. Eg lots of routine work in hospitals could be done by people with less training than a full doctor.
Some places around the world are moving in that direction.
> The only solutions are to either get more money into schools to graduate more practitioners, or hope that new technology and innovation can somehow alleviate the burden of work on the existing care providers.
Why can't hospitals simply say to highschool graduates: We will "sponsor" your medical study, i.e. you won't have to take any debt for medical school education, but afterwards you will have to work, say, 10 years for us (with a priori defined salaries)?
This does occur in some states. Some students get loans repaid if they contract to work in rural areas for 5 years. I think this is still a symptom of too few graduates. The acceptance rate of US medical schools fluctuates around 20%, so there is no shortage of applicants. There is a shortage of available slots (both in school, and residency positions). According to an old NIH study, a student's tuition only covers about 60% of the cost of a four year med-school education. Universities are also required to pay a portion of the salaries for residents in affiliate hospitals. So lots of money going out of schools, but not much coming in.
This varies greatly by specialty. Surgeons (and anesthesiologists) are working 80 hours/week or more. Most non surgical specialties are well under 80. Pediatricians, dermatologists, psychiatrists and a handful of others probably average around 50 hours/week over the entire residency, with another 10 hours/week of home study (equivalent to a developer reading technical articles from hacker news & stackoverflow).
My girlfriend's brother is a doctor, and falls asleep very easily. He told us that during his residency, he once fell asleep during heart surgery while holding open the patient's chest (with some kind of crowbar-like medical instrument whose name I forget). Fortunately he maintained muscle tension so nothing bad happened. The surgeon noticed he was sleeping and told him to take a walk.
He also falls asleep driving, standing on the bus, at the dinner table, etc.
"He also falls asleep driving" - I don't know where you are, but in EU this would prohibit you from ever having a licence, just like being a diagnosed alcoholic would(unless you get a note from a doctor saying that you are fine to drive).
This doesn't exactly seem like the healthiest (or most efficient) routine.
> A pattern emerged in my days during this period. Get up, get washed and dressed, wash down a couple of caffeine tablets with some strong coffee, pick up a red bull, a coke and some cereal bars on my way into the office. Once at work I’d be at my desk non-stop; all meals would be eaten at my desk, though sometimes I’d not eat at all. When I did eat, it would usually mean a sandwich from the supermarket for lunch. Work would often provide dinner, invariably takeout. Sugary caffeinated drinks and sweets would sustain me for the rest of the time.
Working 80 hour weeks is not okay, but working 40 hour weeks is also probably not okay when you're relying on this sort of sustenance.
Thank you so much for sharing your experience and helping build such a renowned game. Hopefully people can take strategies away from your experience to avoid a similar fate!
Yeah, I don't understand this either as a programmer. Maybe it works better in a big team, but working alone there's no way I can work even 8 hours productively every day; realistically it's more like 3-4 hours of intense thinking. Maybe on a good day I can do 6-7 hours, sometimes even more, but continuing that pattern really degrades the code and the problem solving in a major way.
Better to take breaks, do something else, just relax for a day or two even if nothing is coming out, and then continue when it comes naturally.
They might have shipped sooner had they let everyone work normal hours. It's not just that the studies agree sustained crunch is a bad idea; I've seen several times in person that sustained crunch beyond the first few weeks lowers overall productivity. People start goofing off, or just staring at code without making much progress, or writing huge mountains of code when a little stream of code would have sufficed with more forethought. And the code that they do write, they have to revisit later to fix all the bugs caused by sloppy coding.
I have personally seen someone in crunch replace a few terse lines, written in 15 minutes, that worked, with pages of boilerplate code across a few dozen files that must have taken days. His response: "I did not understand what you did."
Ask? I mean no wonder that team was working 80 hour weeks. If you stop thinking then even very simple problems become monumental challenges.
Whenever I work more than 10-11 hours, I usually come back to the code the next day and wonder WTF I was thinking. Problems with straightforward solutions solved with large amounts of garbage code, wonky designs, lots of stuff that just gets thrown out wholesale. It is literally counterproductive.
It depends on what you do. When you're in more of a lead developer/team leader position, you don't only code on your own but also routinely have to help and direct other devs, and it's easier to slip into working a lot longer (partly because you keep getting fresh problems, partly because you aren't working on these problems alone).
I can easily imagine that leading even a small-ish team of 5 to 10 other programmers who don't all start and finish the day at the exact same moment (so each may be working 9 hour days, but they stretch over a 12 hour period) can make you "naturally" work longer hours if you don't keep an eye out for it.
I've seen this pattern as an engineer and as an engineering director. You can always put the blame for these things on leadership's inability to make decisions. Too much time is spent in the theoretical, "what's possible" headspace, so no final decisions get made that give the team anchor points. It says a lot that the switch from open world to level transitions (a major decision) happened so late. Saying that there was no real plan to get to the end... having that plan is the job of an executive.
The best executives I've ever seen, for better or worse, will regularly make decisions on the project so there's a continual sense of marching towards goals. They may not always be the right decisions, but making the wrong decisions can be less damaging than making no decisions.
Great points, but remember that executive behavior is a by-product of the company culture and environment. I've seen higher-ups afraid to make these important strategic decisions in a timely fashion, and they were rightfully worried: sadly, some people do not share the sentiment that failure can be just another data point, and they punish it harshly.
Yep and the company the author worked at was founded by Peter Molyneux who is infamous for dreaming up stuff that's not possible in the real world with real constraints.
That was my thought. Peter has a history of big ideas that have a hard time translating into final products. Those kinds of minds are integral to a great product, but past a certain point they can also be a hindrance.
I never understood how some people who work in tech have no ability to say no. I understand that passion runs deep but that's no reason to become a slave.
There is no way whatsoever that I would work 80 hours a week for an extended period unless I were working on my own startup or company and the time that I invest would benefit me in the future. And even then, I wouldn't do it.
Because someone else will say yes. Even if you win the battle you'll lose the war. I imagine that's why raising awareness, engaging in public discourse about it and pushing for reform or change industry-wide is a common theme.
And you're probably working on games because you're passionate about them so considering the hop over to a job doing $boring_thing vs. crunch time abuse is choosing between a rock and a hard place.
I think the games industry is to males what magazine journalism can be to females. Seen many (including friends) putting in loads of hours and working cheaply, because the competition for spots is so intense.
I don't mean any disrespect to the author or his team but this sounds much like Stockholm syndrome (1). I see symptoms even in many "Ask HN: Who is hiring?" posts. Yes, we work long hours but we play games at lunch. We have snacks and drinks!
Programming is a craft, game programming even more so. Passion runs high; people will work longer and harder for less money than at other jobs. The competition in the workplace and in the market is intense.
Willful moderation can never be applied systemically in this industry. There are two ways out: 1) six sigma, industrial-engineering-style management of your processes, and 2) lucking into a clever idea that's not hard to implement. 1) leads to a safe, if well-loved, game. 2) is great of course, in the way that hoping you'll win the lottery is great.
Might be tempted to add a 3) de-risking by middleware, but all that does is raise the bar for everyone. Good for consumers, not helpful for devs.
My opinions are based on working in the game industry for ~6 years, with months of 100 hour weeks under my belt. Stressed to the point of bleeding out of random parts of my body.
Just can't see it ever changing. I just lol at unionization.
Do people working on game engines licensed to other companies work hours more in line with the rest of the software industry, or more in line with the rest of game development?
What about people working on cloud-y games (like WoW) that don't have one big bang release, but have more continuous development?
Don't know. I worked in the industry when Quake, Unreal, and if I'm being generous, maybe RenderWare and LithTech, were the only real middleware options, and most people still just rolled their own engines.
I ultimately burned out while working on an MMO, so I think the answer to your second question is that it's no different... because the launch is for sure a very, very big bang release.
Thanks. I was thinking more after the initial release of the MMO.
There's probably also a difference between people closer to the creation of new content (for all the updates and expansion with firm deadlines) and the people keeping the infrastructure running.
I know the feeling. At my first startup a long long time ago I worked 100 hrs/week for 4.5 months and almost died. You can somehow get through but it takes too much out of you. Game companies do this routinely of course.
Does the middle and senior management put the same time in? I guess they have a much higher financial stake in things so even experiencing the horrible results first hand might not dissuade them.
They also often got where they are because they put in the extra time.
Not so much because it actually made them productive, but because it made them look productive. Also, there are people who naturally deal less badly with long hours, and they tend to rise to the top here.
I got into computer programming because I wanted to make video games. By the time I graduated, I no longer had that interest and I don't have any feelings of missing out (though I'm deeply envious of how indie devs like Lucas Pope and Toby Fox -- Papers, Please and Undertale, respectively -- have been able to create such artistic, clever, and commercially popular work)... but I would love to experience a standard game dev cycle/crunch. Other types of user-facing software development I can imagine... but how software engineering interacts with the artistic components of development (never mind playtesting and marketing) is something that seems so different from anything I've ever worked on.
Side note: Eurogamer had an absolutely epic insider write-up of Lionhead. I'm surprised it didn't make the front page, but it was one of the best things I've ever read about the industry, and I've never played the Fable games: https://hn.algolia.com/?query=Lionhead:%20The%20inside%20sto...
For what it's worth, I LOVED Fable. I'm amazed to hear that the original plan was to have it as an open world instead of the load screen sectioned maps. I can't imagine how much harder that would have been. Also strange to hear they made that compromise very late in the development.
"It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way ...."
-A Tale of Two Cities, Charles Dickens
This is very relevant to the Bay Area tech scene. This area literally has the income disparity of a 3rd world country.
I don't really understand why people put themselves through that stress without an equally high personal payout. I mean, the stress he faced was pretty much as big as what startup founders face. But startup founders have a chance of ending up rich or with a much bigger business. What could he gain from it? A bonus?
And as a side note: if the description of his work sounds like your work, you can gain a lot more performance, fun and results if you spend some of your time learning in more depth how everything about your job works. The debugging drudgery shrinks once you know enough. And not all of it can be learned just by doing.
I guess this just comes down to a difference of mindset - I work my job because of the concrete benefit I get from it (mostly money, but also work schedule, environment, etc.). I'm not interested in being driven to grind my life to dust for a shot at a little bit of hero-glory. To be totally honest, I hadn't heard of this game or this person before.
Crunch is a serious issue for the entire video game industry. It's only made worse by the idiotic treatment of unhealthy stress and unrealistic deadlines as a badge of honour. In fact, rumours about the terrible way game studios treat their employees, along with a brief firsthand glimpse of it during a summer job, are the reason I didn't immediately go into the game industry after graduating, despite the fact that I had dedicated myself wholeheartedly to pursuing a career as a game programmer ever since I was a kid. After 3.5 years working on the development of surgical simulators, I finally took the plunge into the game industry. It's one of the best decisions I've ever made.
When I first started working at Ghost Games (an EA studio) I heard a lot of horror stories about the crunch of the previous project. What made me put this aside and take the job was how everyone at the interview spoke sincerely about how crunch is a bad thing, the damage it had done to the studio and all the ongoing efforts to ensure it wouldn't happen again. In the end, the project I joined (the reboot of Need For Speed) finished without anything which could reasonably be called a crunch and I only had to put in a few hours of overtime, including one Saturday. This made me feel so relieved and I'm now looking forward to a long career in this field.
Of course, EA has a history of crunch as bad as the next game company, but there is a strong feeling that they've realised that it's a problem and they are doing everything they can to mitigate it. It's simply not good for business to burn through talent at a rapid pace and build a reputation for ruining people's lives.
> The consequences for me were devastating. I was briefly prescribed anti-psychotics at my lowest point. I experienced migraines, complete with terrifying tunnel vision, blackouts, severe depression, anxiety, panic attacks, paranoia, hallucinations and thought insertion.
That sounds pretty terrifying - wonder if there's a way to be resilient to this or if everyone is affected this way given enough time.
I was working on a system that was doing CSG of AI nav meshes. If enough floating point error crept in, eventually a shared edge of a navmesh would split apart and you'd be left with the AI walking around non-existent obstructions. I spent an unimaginable amount of time trying to deal with this. One night when I was walking back to my apartment, I walked over a section of the sidewalk that had plywood laying on it. The edges of two pieces of plywood had about an inch gap between them, and I remember my heart jumping in panic as though I was "seeing the bug" again - I basically hallucinated my work problem into real life. That was towards the end of working in the game industry...
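For anyone curious what that failure mode looks like in code, here's a minimal illustrative sketch (not from any real navmesh codebase): exact floating-point comparison treats two nearly identical corners as different vertices, so the "shared" edge splits, and the usual mitigation is welding vertices that fall within a small tolerance.

```python
# Illustrative only: tiny floating-point drift makes a "shared" corner
# look like two different vertices, and epsilon welding merges them back.

def weld(vertices, eps=1e-5):
    """Snap 2D vertices that are within eps of each other onto one point."""
    welded = []
    for v in vertices:
        for w in welded:
            if abs(v[0] - w[0]) < eps and abs(v[1] - w[1]) < eps:
                break          # reuse the existing, nearly identical vertex
        else:
            welded.append(v)   # genuinely new vertex
    return welded

a = (1.0, 2.0)
b = (1.0 + 1e-9, 2.0)      # the same corner after accumulated CSG error
print(a == b)               # False: exact comparison sees two edges -> split mesh
print(len(weld([a, b])))    # 1: tolerance welding keeps the edge shared
```

Picking eps is its own nightmare, of course, which is part of why this class of bug eats so much time.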
Under good conditions that can be a fun thing. I once spent a whole evening changing parameters, drawing black and white fractal trees. When I walked out into the night, I perceived RL trees as the software would render them. I wonder how much of this overlap goes on without us even realizing it :)
I was hallucinating while driving home after a barrage of 70-80 hour weeks at a gamedev startup. Of course it was on a completely empty road and it was minor things in the periphery (the sky looked like giant evil monsters duking it out), but I probably shouldn't have been on the road.
But I had already been at the office for 16 hours, and there was nowhere to sleep there but my desk.
Before too long I ended up in the hospital for a heart attack scare (it wasn't a heart attack, but it seemed like one), and work stopped asking me to put in so many hours for a while.
The games I spent so much time and energy on didn't do so well for various reasons (unrelated to the quality of the games, I had polished them up as well as I could considering the shifting demands and firm deadlines), and the company ended up biting the bullet.
He joined a team that was goofing off, which made it an extremely fun place to work. Wouldn't life be easy without deadlines? So when they actually had to produce something, they were far behind where they should have been. The fault for this situation probably lies with the management but ... as a developer (who's ostensibly getting paid), you should expect to do work at work.
He also states that he was sad to see the "Big Blue Box" culture and company disappear. That company would have instead gone out of business if someone (the parent company?) hadn't stepped in and changed things.
That is also my feeling after reading the article. He almost seems to stress that his best moments at work were the parties and the lunch time football games. Sure they contribute to the well being of a project team, but shouldn't the best moment be shipping the product out the door?