I'm going to reply to several thoughts I've seen in this thread.
1) As u/DevX101 said, "The article is based on a collection of anecdotes from responses to a Facebook question: '75% of med students and residents are taking either stimulants or antidepressants or both. True or false?'"
It looks like the 75% number is invented and not actually reflective of survey results.
2) Med students would be exactly the type of people that would know that caffeine is technically a stimulant. There are also a decent number of students with valid prescriptions for ADHD, which they would be on regardless of medical school.
3) Medical school has always been hard. But the amount of information one has to intake and regurgitate seems to have steadily increased. Meanwhile, medical school remains four years long. Technology has evolved to help students absorb more, faster: spaced repetition, watching lectures at 2x speed, massive Q-banks. (A quick sketch of the spacing math is at the end of this comment.) But you have to wonder whether this is sustainable.
Five years of medical school is a bit untenable because of the debt. Additionally, our education system is not set up to move the preclinical years to undergrad (which I believe other countries do).
EDIT
4) I suspect depression is seriously underdiagnosed in med students and MORE students should be taking antidepressants than those currently on them.
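Re point 3, to make the spaced-repetition math concrete: below is a minimal sketch of the SM-2 scheduling rule (the SuperMemo algorithm that tools like Anki build on). The function name and the demo grades are mine, purely for illustration.

```python
# Minimal sketch of the SM-2 spaced-repetition rule. The function name and
# demo values are illustrative, not from any particular library.

def sm2_review(quality, repetitions, interval_days, ease=2.5):
    """quality: self-rated recall 0-5; returns (repetitions, interval, ease)."""
    if quality < 3:                      # failed recall: start the card over
        return 0, 1, ease
    # Nudge the ease factor by how hard the recall felt (floor of 1.3).
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if repetitions == 0:
        interval_days = 1                # first success: see it again tomorrow
    elif repetitions == 1:
        interval_days = 6                # second success: about a week out
    else:
        interval_days = round(interval_days * ease)  # then grow geometrically
    return repetitions + 1, interval_days, ease

# Five decent reviews push a card from tomorrow out to ~4 months:
reps, interval, ease = 0, 0, 2.5
for q in [5, 4, 5, 4, 5]:
    reps, interval, ease = sm2_review(q, reps, interval, ease)
    print(f"next review in {interval} day(s), ease {ease:.2f}")
```

That geometric growth in intervals is the whole trick: each fact costs a few seconds per month to retain instead of a full re-read, which is how students keep cramming a growing syllabus into the same four years.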
> 4) I suspect depression is seriously underdiagnosed in med students and MORE students should be taking antidepressants than those currently on them.
I find that seriously messed up. Maybe it's not 75%, but let's assume some extra margin of students should be taking antidepressants just to exist in the highly competitive med school environment. Further, let's assume that antidepressants keep improving and competitive programs keep raising the bar based on entrants and dropout rates.
Our public health goal seems to be to develop a society so stressful that psychiatric drugs are used not to correct an imbalance in the patient relative to a healthy control, but to manufacture unhealthy people who can survive hyper-competition and are maximally profitable for institutions.
I'm confused as to why blanket "antidepressants" are being treated as performance-enhancing drugs in the same line as stimulants (as u/ravenstine has posited in a sibling comment). Med school is psychologically taxing, which can cause or exacerbate depression. People who have clinical depression should get some sort of treatment for it.
Yes, I agree with you that the environment needs to be changed so that it is more hospitable from a mental health standpoint. But this doesn't change the fact that you have medical students right now that are not being treated for the clinical depression from which they are suffering.
One of the fundamental issues with the statement "75% of med students are on antidepressants or stimulants" is that it conflates appropriate medication for diagnosed disorders with illegal misuse of Schedule II stimulants for competitive advantage. Your typical first-line antidepressants (e.g., sertraline, escitalopram) aren't scheduled and are pretty easy to obtain via a PCP visit. They don't offer a "competitive advantage" unless people consider stuff like "being happy" or "not wanting to kill oneself" an unfair competitive advantage in medical school.
> keep raising the bar based on entrants and dropout rates.
Dropout rates in med school are actually extremely low, at least in the US. Burnout rates as physicians, however...
You have to think about which system will lead to the most economic growth.
If you have constant problems and no social support network, you need to spend your way out of all of those problems. That leads to people maxing out their credit, working as hard as possible for their employer, and being afraid to leave their jobs. Which equals mondo economic growth.
We used to have a society where people generally took care of each other and made a little money for specialty items as part of their career. Money was used to change your position, not to maintain it. The economy was much smaller then.
For doctors, brain drugs → knowledge → saving lives.
For athletes, body drugs → strength → showing off.
"Showing off" isn't really something we require much of in society, or see as a moral good (however entertaining we happen to find it when it is done well.) "Saving lives" is something we want to happen as much as possible, to the point that we might be willing to ask the people doing so to harm themselves a bit to get it done.
A better comparison to doctors might be, say, firefighters: body drugs → strength → saving lives.
Would it make sense for a firefighter to take anabolic steroids if it increased the maximum weight they could lift-and-carry out of a burning building? I don't know. I don't think there's an automatic answer to that.
Showing that you can maintain a stress state for years of schooling doesn't really tell us whether you will retain any information after you're done, or be a safe doctor by retirement age. I would guess today's doctors will have a much higher chance of dementia than equally educated but never sleep-deprived peers.
Similarly, a firefighter who does steroids might be better until he dies of a heart attack at an inopportune time, causing additional deaths. But the real risk, as far as I am concerned, is that if steroids are allowed and the tests are standardized around them, then his eventual choice is to take steroids or not be a firefighter. Play that out enough and you have a doped-up society that is quantitatively better on whatever you choose to measure, but highly toxic: needing bailouts for long-term side effects, and taking short-term labor discounts until you figure out where the no-free-lunch theorem has stashed the harm.
Playing devil's advocate, if you believe that we are a capitalist society (not advocating for or against capitalism) and operating under an efficient market then society does in fact value professional athletes on an individual basis more than doctors as they are paid far more.
Also I would argue that professional athletes playing their chosen sport for the entertainment of those watching are doing far more than showing off, they are in fact paid employees of high revenue corporations.
With that said, I don't think athletes should be using anabolics, nor do I think doctors should be using antidepressants except under the recommendation of a neutral psychiatrist.
> society does in fact value professional athletes on an individual basis more than doctors as they are paid far more
Yes, we value professional athletes more as people. However, we value the work of doctors more. In fact, the doctors themselves value the work more. That's why—like firefighters, or rescue workers, or police, or soldiers—they're willing to "burn their lives" (or risk their lives) to get the job done.
Micro-econ 101 isn't enough to explain this effect. You need an understanding of the desirability of jobs (and preference-functions that go into making jobs desirable), and how people are willing to trade off capturing less value as pay, for satisfying more of their other preferences.
You're right that professional athletes are paid employees of high-revenue corporations. Which is to say: they're paid a lot because 1. the corporation is getting a lot of benefit from their work, but 2. the athlete themselves is not getting much terminal-preference-satisfaction from the job itself, and so the athlete demands high monetary compensation for the work. (Compare: coal miners, oil-rig workers, etc. These people have risky jobs that they don't have any intrinsic desire to do; they're highly paid jobs because nobody would do the job if they weren't.) In the case of professional athletes, it's not quite that nobody would do the job, but rather that nobody with as much skill would do the job—the talent pool as a whole is large, but the top of the talent pool (the people everyone wants to see, and so the only valuable people from advertisers' perspectives) is small enough to create a seller's market for that talent.
Doctors, meanwhile, want the job (saving lives) to get done more than anyone else. That's often a large part of why they became doctors—because their preference-function ranks "saving lives" quite highly, so they will enjoy a little "saving lives" more than a large amount of something else. Thus, even though we as a society also value saving lives, we don't have to compensate doctors as much as professional athletes for doing it.
I hear what you are saying but I think your argument is flawed. Very few professions burn lives and physical and mental well-being to the level that professional sports do. Athletes in the martial sports very much put their lives at risk, even at the amateur level, for essentially nothing, knowing that most of them will not make it. These athletes train and risk their well-being from an early age for the chance to 'go pro'. Also, judging from the number of athletes busted for using PEDs, they are more than willing to risk their future health for current rewards, as they love what they do.
I would also argue that while doctors do indeed want to help people, most of them would not go into the profession unless they were well compensated.
But the number of professional-athlete jobs is limited enough that only the best need apply, while there are far more roles for doctors, so you get a variety of skill levels and the market prices them as such.
Don't get me wrong, I would much rather my money go to educating doctors to the maximum of their ability than to paying to watch sports. I just don't think, based on the way pro athletes are praised and viewed, that society feels the same way. Doctors have unfortunately become a commodity in much the same way as the police and fire departments or sanitation. But doctors have the distinction of working in a very much for-profit industry, so they are paid more than police, firemen, etc. Society does indeed value the role they play as a profession more than sports overall, but as a service, not as individuals.
> These athletes train and risk their well-being from an early age for the chance to 'go pro'.
My understanding is that there isn't much overlap between the sports with the most highly-paid athletes, and the sports with the highest risk to long-term health. Martial athletes, or even participants in full-contact sports like American Football or Rugby, aren't nearly as well-paid as participants in sports like basketball, baseball, or soccer. Heck, e-sports "athletes" are highly paid as well, and there is next to no long-term health risk in what they do.
I'm not sure why this is true, but I'll hypothesize anyway: corporations don't want to invest in athletes who can't retire to a life of being a charismatic PR mouthpiece for said corporation. If you get permanent brain damage, your value as a spokesperson goes way down. So corporations don't tend to be as interested in those sports—at least, from the POV of sponsoring the athletes.
There's probably an interesting curve you can compute by summing up the "athlete sponsorship expenses" for a given sport, and then dividing by the sum of the expenses of other market-interest-correlated activities that those corporations engage in, like franchise merchandising or sports video-game production. I would bet that the riskier a sport is, the less they spend on branding the athletes, in proportion to how much they spend on branding the teams, the country's league as a whole, or the sport itself.
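To make that metric concrete, here's a toy sketch. The function name and the placeholder numbers are hypothetical; I don't have real sponsorship figures.

```python
def athlete_branding_share(athlete_sponsorship, other_branding_spend):
    """Fraction of a sport's total branding spend that goes to individual
    athletes rather than to teams, the league, or the sport itself.
    Both inputs are hypothetical aggregates; no real figures are used."""
    total = athlete_sponsorship + other_branding_spend
    return athlete_sponsorship / total if total else 0.0

# Purely illustrative call with made-up numbers (say, $M per year):
print(f"{athlete_branding_share(3.0, 7.0):.0%} of branding spend goes to athletes")
```

The bet above is then that this share falls as a sport's long-term health risk rises; you'd check it by ranking sports on this share and on some injury index, and seeing whether the rankings run in opposite directions.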
And certainly, kids want to be athletes, regardless of the risk. I would argue that 1. many kids choose this path long before they can accurately weigh the risks and rewards of a career path, and then 2. they get stuck in it, because they (and their parents) have invested so much effort into cultivating their talent in the sport. (It's not simply loss aversion, but more like being 1% of the way into cornering the market on a lottery drawing by buying all the tickets. If you stopped there, you almost certainly wouldn't win, and would just lose all the money you put in; but if you continue, there's a clear point at which winning becomes increasingly probable, so as long as you can continue down that path, you feel incentivized to do so.)
This still means, though, that when you interview the average Olympic athlete (the people who have "won" this competition) and ask them what they do for fun... they don't have much to say. They've put everything into this one bet; they have no other talents or hobbies or passions, because they never had time to cultivate them.
> Society does indeed value the role they play as a profession more than sports overall, but as a service, not as individuals.
Yes, correct, that's closer to what I meant than what I said myself. :) Look at it like buying a smartphone: just because there's huge demand for the product, doesn't mean the average worker at a Foxconn factory is getting rich.
The middle-man—the hospital, in this case—is satisfying the societal demand, and therefore is "getting rich"; the doctors, meanwhile, only get rich to the degree that they manage to negotiate better pay from the hospital. That negotiation is sometimes explicit, but is frequently implicit: a kind of collective bargaining in which the social-status moves of doctors as a group push up or down the salary level that no doctor with their "pride" would accept less than. (You can tell that this is happening because of the existence of "free clinics." Voluntary work is different-in-kind, so it frequently crops up in industries where the workers are too prideful to ever work "for cheap." Without this pride, you wouldn't see "free clinics", but rather budget clinics.)
Educated and physically capable men with some wealth typically became warriors in the past. They certainly didn't become blacksmiths/builders/engineers/whatever. Maybe there's something to be said for an entire industry and culture around letting this type of person act out their aggression in a less violent way. It probably serves a huge good to a stable society in general, so that the builders can build in relative peace. If you believe that prices reflect value, then it's definitely more good to pay these dudes to tackle each other than to pay for doctors and teachers - can't care for the raped and pillaged.
Funny, though, that we now get the less-fortunate rungs of society to fight the wars.
I agree it does seem very odd to say that, but if one looks at our modern society and the way athletes are paid and idolized, the statement is for the most part true. Most people could not name five doctors, but they could name ten athletes. I'm not saying I like it, but I think it's correct.
That's one way to look at it, but are "they" (as a group, not individually) really paid more? I think the total amount of money spent on medical services far exceeds the total amount spent on professional sports or even entertainment as a whole.
Plenty of soldiers' lives depend on other soldiers' 100m dash times in combat... yet the military is quite good at preventing illegal drug use... and the drugs that soldiers most want to take are drugs that would harm physical performance
No, they don't. The marginal running ability (or strength) of a soldier does not save lives. That is, a soldier who runs a 12.5s 100m dash will not save more lives than one who runs a 12.6s dash.
Obviously there are minimum physical requirements to serve in the military, but once you're in you are not going to start saving more lives because you can run a little bit faster.
Anecdotal reports indicate that illegal steroids and similar performance enhancing drugs are quite common in ground combat units. The military doesn't routinely test for those.
Top-tier schools are full of PEDs at this point. I think software developers might be less aware of how much more effective a drug like Adderall is when you pair it with an antidepressant, or even just caffeine. That removes the main problems of stimulants in the short term.
Look up the percentage of professional athletes that have an ADD diagnosis, which grants them a waiver to use amphetamines during a game.
To me the key point is #1. From reading the article, it seems that the number thrown around is 50-75% of students being on antidepressants. And yet the author of the post adds in stimulants when asking for responses. Even then, they don't provide anything but anecdotes. They run a ‘survey’ of 1800 people and don’t provide a single statistic as a result.
From this, it's hard to draw any conclusion at all about the prevalence of antidepressant or stimulant use. I'm surprised to see such shoddy work on HN's frontpage; the title is provocative and leads to discussion, but there's nothing to back it up.
> "Medical school has always been hard. But the amount information one has to intake and regurgitate seems to have steadily increased. Meanwhile, medical school remains four years long..."
It would seem to me the med profession is an ideal place for technology / AI. Docs aren't magicians and House (the TV doc) is fictional.
Furthermore, it would seem to me that the aggregation of patient data / symptoms / outcomes would be more beneficial than the single opinion of the one or two docs a patient sees.
I'm not trying to dismiss the human element or the knowledge / experience of any single doc, but certainly that shouldn't remain so siloed.
<sidebar>
My father had a (severe) stroke almost 18 months ago. He's doing well and ultimately got good / great care. That said, my sense was most docs favored their (subjective) opinions over what I presumed would be known best practices, etc. Mind you, my assessment is subjective. Nonetheless, it was a pattern that existed across multiple shifts and med facilities.
> It would seem to me the med profession is an ideal place for technology / AI. Docs aren't magicians and House (the TV doc) is fictional.
As someone who works on healthcare AI, you'll have to believe me when I say AI physicians are the jetpack of the 21st century. The populace might think they're right around the corner technologically, but there are a lot of obstacles, both technical and human, that make it pretty unrealistic. The need for physicians isn't going away anytime soon. In the same way, the profession of piloting hasn't been threatened by autopilot.
> The populace might think they're right around the corner technologically, but there are a lot of obstacles, both technical and human, that make it pretty unrealistic.
And let's not forget regulatory/legal obstacles. Even if we convince doctors and patients of the benefits of AI in healthcare we still have to teach the old farts in Washington what AI is, what AI isn't, and convince them it isn't dangerous. Anyone who watched the Facebook hearings can see how steep that learning curve will be.
If AI could ever come close to replacing doctors, hospitals and insurance companies will be throwing so much money at congress that they'll get whatever regulations they want passed. The problem is that AI actually has to sort-of-work-ish first.
AI is extremely far from being ready for field hospital work. The back office is hidden from you as a patient, but you have to realize that hospitals are a huge mess. Really, really messy places that remind you that medicine is still in its infancy and that medieval times are not far behind. AI absolutely needs a source of information to be efficient. In fact, this is directly related to why doctors work long hours. The information is so scattered, unpredictable and irregular that delegating it to AI today is both physically and financially unthinkable. It will be one day, but not today, and not in the coming years.
> Docs aren't magicians and House (the TV doc) is fictional.
House (the TV doc) wasn't, IIRC, a particularly-strong example of an intelligent/clever/omniscient/"magical" doctor. House was:
1. willing to do non-therapeutic experimental procedures for the sake of differential diagnosis. (I.e., he was willing to do small amounts of harm to patients in order to get their bodies to tell him something, where other docs would only ever try procedures in DDx that had some likelihood of being ameliorative in-and-of themselves.)
2. good at synthesizing a true patient history from relatively little—and much of that potentially-false—information.
Other than that, he was just a regular doctor (who was a specialist in infectious disease and nephrology) who was good at team-building by hiring experts of complementary specialties, and then focus-firing all that group expertise at the problem at once, rather than doing the standard medical specialist-to-specialist chart handoff.
Really, any doctor could be House, if they set out to do so and practiced the skills involved. The real fiction was a hospital willing to set up the kind of "diagnostic medicine" department that House headed, where you've got five experts working on one patient at a time.
> That said, my sense was most docs favored their (subjective) opinions over what I presumed would be known best practices, etc.
My impression, from having a few doctor friends, is that "best practice" is generally either old news or p-hacked hooey. There are many therapeutic approaches that very obviously work in clinical practice, even though—according to medical academia—their effect-size is "negligible." (For example: https://slatestarcodex.com/2018/11/07/ssris-an-update/)
Ideally, you'd collect statistics on what doctors believed from informed clinical experience, and then teach other doctors that as clinical best-practice. Right now, though, we just teach them the stuff we've "proven" in academia, and then let the real world slap them in the face later on.
Medicine is still as much an art as a science. Evidence based medicine is expanding and some doctors don't follow known best practices as frequently as they ought to, yet for many areas we simply don't have the large-scale clinical studies necessary to establish best practices. And even when there are established treatment protocols, those often have to be tweaked to fit individual patients. A lot of this is based on tacit knowledge and intuition which can only be gained through experience.
AI diagnostic tools could be useful in some limited circumstances. The problem is in gathering the necessary data (especially coded data) to feed into the AI, so it won't necessarily save any effort. The need for human radiologists to read images may decline as pattern recognition technology improves.
Doctors 100% take advantage of technology, contrary to many comments here. There are search engines that are basically Google but for everything medicine-related. Your doctor doesn't know the dosage for most medications; they just look it up on one of those sites.
1) I find it HIGHLY concerning (albeit not surprising) that an M.D. would show such a lack of knowledge of survey bias [0]. It's not a problem to say "hey guys, I have a highly unscientific poll here to just get feeler numbers", but to assert the number with such confidence as to put it in the title is journalistic malpractice.
> 4) I suspect depression is seriously underdiagnosed in med students and MORE students should be taking antidepressants than those currently on them.
Antidepressants are a terrible solution to depression and should only be used in rare cases. They blunt your feelings so that you can keep on with your depressing life choices without making the changes you need to be really happy. Depression is an indication that you need to make major life changes. It's not a chemical imbalance.
Your "one size fits all" approach to depression is greatly at odds with the great diversity of presentations that exist. There are absolutely roles for biological solutions like antidepressants to play for many people.
Humans are remarkably similar. It's popular to think that we are each so very special, but our differences, despite standing out to our brains which are tuned to recognise them, are small. Especially physiologically. We love to hear that we are special snowflakes even when the news is bad. This works out great for big pharma.
But, I can go a step further and prescribe a cure for depression that will work for 99% of the population, that means you.
1) Get outside into nature at least 3 times a week.
2) Break a sweat doing some physical activity at least 3 times a week.
3) Spend less than 10hrs combined per week on your phone or computer doing non-productive things like watching shows, flipping through social media or reading HN.
4) Eat healthy food every day.
5) Sleep 8hrs per day. If you are following the above this should not be a problem.
6) Maintain your relationships with family and friends on a daily basis. This means spending time making meaningful connections, perhaps doing some of the things listed above.
If you do the above consistently for at least six months and don't see significant results then perhaps you need the drugs but the drugs will have subtle but profound costs in terms of quality of life.
If you are unable to do any of the above then you are not living in an environment which promotes good mental and physical health.
But hey, it's so much easier to pop a pill and everyone else is doing it.
"Depression is an indication that you need to make major life changes. It's not a chemical imbalance" is about as un-researched and un-qualified of a statement one could make. Google Scholar exists. Wikipedia exists. Everyone is has access to these resources.
I'm with you, especially on 4). Truth be told, doctors work long hours because they are used as the last line of defense for the patient in a resource-constrained system. This in itself is certainly not optimal, but it seems no one can really come up with a good solution for the time being.
Hours and shifts are a very complex problem for health care, and at the risk of appearing presumptuous, I would say that non-MDs and MDs alike do not have a sufficiently clear understanding of all the factors at play to make propositions in full knowledge of the consequences. Only healthcare policy specialists can really find their way in those waters.
That said, I would love for my shifts to shorten. Whether that's a good thing for society at large is anyone's guess.
More antidepressants are not the answer. I've been on them, and while they do "help", they're actually quite horrible in the long run. Messing with your brain's serotonin has all kinds of known and unknown side effects. I think they're OK as a short-term solution, but resolving the root of the problem is the best way to alleviate depression. Students now, as in the past, have always had a lot of social, financial, and parental pressure to excel in school; however, the financial burden of medical school has become ridiculously high, so the risk of financial ruin in case of failure or dropout is accordingly higher. Add to that the increasing course loads and complications arising from continual changes in local and national regulations, and it's no wonder students are depressed at alarming rates.
Ok, but that means that in your country medicine is an undergrad program. In the US you need a 4 year degree before entering medical school, which is another 4 years.
Interesting. Does the requisite 4yr degree need to be in a relevant field, like biology? Or does a literature or accounting degree count for just as much?
> There are also a decent number of students with valid prescriptions for ADHD, which they would be on regardless of medical school.
This seems far from a foregone conclusion. What makes you think that people use prescription drugs without regard to their life circumstances?
Many people acquire prescriptions for their stimulants of choice based specifically on their life plans. I literally personally know two people who, in preparation for (one medical, one law) school, secured supplies of amphetamine from their doctor.
Of course they all use stimulants. I don't know how else they could handle the workload. The scary part is that it continues into residency. There are surgery residents working 24h+ shifts. We have laws against overworking truck drivers, but somehow the medical profession gets a pass.
And Johns Hopkins estimates 250K deaths a year from medical errors. I wonder how many of these are from lack of sleep.
A friend runs a residency program and I've talked with them about it a few times. This is definitely getting better, but the number of hours still seems high to me. As a developer, I'm strongly against extended hours because I know how quickly the error rate creeps up. (And I probably wouldn't know that if I weren't doing TDD and pair programming, because the first thing that goes for me is ability to notice my poor performance.) My basic question was: shouldn't doctors work 8 hours and then go home?
The big difference between writing code and doing medicine is that patients won't stay the same when a doctor leaves for the day. With 8-hour shifts and 40-hour weeks, covering a patient around the clock requires 4-5 people. Those people will have 21 handoffs during that week. Each one of those handoffs is an opportunity for information to get lost, for understanding to fade, for followups not to happen. If people work 12 hours, that's only 14 handoffs. 16 and it's 10. 24 and it's 7.
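The arithmetic is just shift boundaries in a 168-hour week. A quick back-of-the-envelope sketch (my own framing, counting one handoff per whole shift, which reproduces the rough numbers above):

```python
HOURS_PER_WEEK = 7 * 24  # 168

def weekly_handoffs(shift_hours):
    # One handoff per shift boundary: count how many whole shifts
    # fit into a week of round-the-clock coverage of one patient.
    return HOURS_PER_WEEK // shift_hours

for shift in (8, 12, 16, 24):
    print(f"{shift}h shifts -> {weekly_handoffs(shift)} handoffs/week")
# 8h -> 21, 12h -> 14, 16h -> 10, 24h -> 7
```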
Obviously, at some point the harm from overwork outweighs the harm from handoffs. But it's not an easy decision to make. When I'm debugging some weird, urgent problem, I know how valuable it is to stay with it, to keep all the state loaded in my head until I figure it out. And hospitals are full of weird, urgent problems.
This is really important and an often-missed point. The handover can cause more errors than a fatigued doctor, so you would need to solve this problem at the same time (e.g., what if there were a way to change doctors without a handover?).
Realistically, nursing is a job that is far more structured than is being an MD. This allows nurses to function much better as a group.
That's precisely the crux of the matter and the (questionable) reason for the hours. Nursing teams are the stable basis on which the MDs rely to fight the huge pile of s*** work that escapes the standard pathways into which we try to shoehorn every patient.
I find it extremely unlikely that so many patients have handovers so difficult and conditions so acute that 24h+ shifts are necessary as the norm. Even assuming some patients and situations are like that, it should not be normal (which it is), and there should be enough recovery afterward (which there really is not).
Never underestimate the ability of people or systems to fuck up.
When I had back surgery, due to a miscommunication at the nursing staff turnover, I wasn't given any pain relief from 3 hours after a major surgery until late the next day.
My grandmother was given PT on the wrong limb. Again, poor handoff.
A friend got a big congratulations from the OB doing rounds when she was in the hospital. Small problem: she was there for complications of miscarriage.
That does not sound like a defense of the current system at all. It sounds like a lack of process and system, relying on individual memory and recall instead.
In particular, both the OB mix-up and the PT on the wrong limb are not just handover errors. First, I am a bit surprised that babies and miscarriages mix - they don't here (so you know which it is based on the room, but I think the cause of the split is something else). PT on the wrong limb is an error of not checking what you are supposed to do before administering - but it also sounds like an error people are more likely to make under time pressure or when tired and falling into routine.
All in all, after major procedure you are in hospital for days and they have to move you between doctors and nurses many times.
I hope it doesn’t sound like a defense. IMO this is a broken system that needlessly kills people as a matter of bureaucratic inertia.
Surgeries performed by the Ob/Gyns are housed on a specific floor. The situation was that there wasn’t a transition as the shifts changed due to understaffing, and the person doing rounds didn’t bother to read the chart.
In my surgical-recovery case, the cause was poor situational awareness and the burden on the nurses. The data was located in 4 different EMRs and they missed it.
Half my family is engaged in various medical professions. Every one of them is dissatisfied with how these systems work.
I had foot surgery a few months ago. When I came to from GA, my nurse was worried that my heart rate was a little too high. He did an ultrasound of my bladder, turns out no one had cathed me and I had 1200 ml of urine. Apparently the discomfort of an overfull bladder was causing higher heart rate. Nurse was shocked when he saw the ultrasound, ran to get the equipment to drain the bladder.
I know. I also know software development teams that believe that long nights are simply necessary for every single deadline and that everything is always at the last moment. And I know teams that make deadlines without long nights - not always, but often enough, and sometimes ahead of schedule.
The "it is totally impossible without 28h+ regular shifts" sounds like a team in the first category.
Not blaming individuals here; these problems are cultural and systemic, and largely stem from leadership.
I've heard this rationalization from medical professionals before.
They need to solve the problem with handoffs. It's a big problem, and solving it requires rewriting a lot of the process, but there's nothing more to it. It's solvable.
The institution however, has beaten any imagination out of them and replaced it with resistance to change.
How would you propose to solve the problem? Most attempts thus far have relied upon getting attending doctors to do detailed data entry about patient state into an EHR, but that by itself is time-consuming and detracts from immediate patient care.
Maybe someday AI and NLP technology will serve to automate much of the data capture, but we're many years away from those technologies being a practical reality.
What about doubling or tripling staff and then having them work in teams of two with half their shifts overlapping with someone that will be staying longer? Would that work?
All I'm interested in is whether it would work, not the cost. There are radical proposals that would manage cost better, but there's no point in considering them if it wouldn't help.
I could propose as many pat, armchair solutions as the next HN reader :) However, it's clearly not a problem that's conducive to quick fixes, or it would have been sorted.
That said, the arguments for the status quo are laughable. There's no other field of human endeavour where people claim that 24-hour shifts are >safer< than 8-hour shifts. It's not like handoffs have even been eliminated: handoffs are just done 3 times less often, but by profoundly sleep-deprived people, who have to reiterate the last 24 hours to the incoming staff.
When people defend behaviours of this nature, it's a 'culture smell': to a greater or lesser extent, the proponents have been indoctrinated to cling to a local maximum.
Your experience with coding for a long time is exactly like mine, but no one is going to die if I stay up late and write terrible code; otherwise the clear answer would be to just be time-inefficient and pick it up the next day. If there's an edge case where an emergency happens in hour 7 of an 8-hour scheduled shift, I guess that's fine; you can go hard like that from time to time and be fine, but not constantly. And this really is an edge case, since most medical care is not in an emergency setting.
I agree that most medical care isn't urgent. But I think all doctors should be able to handle emergencies. It's similar to how I think we should work to never have urgent production issues, but that all developers should be able to debug urgent production issues.
The book "Why we sleep" talks a lot about this - apparently the doctor who created the first residency program was a cocaine addict, which let him stay awake for days at a time: https://en.m.wikipedia.org/wiki/William_Stewart_Halsted
The reduced fatigue is offset by the increased number of patient handoffs between shifts. More handoffs = more errors. My wife is a resident and she supports 30 hour shifts for residents for this reason. The difference between your judgement at hour 23 and hour 29 is less of a risk factor than a 20% increase in handoffs.
Retired neurosurgical anesthesiologist here (38 years experience in top-tier academic centers: UCLA/USC/UVA). It was always my practice — from my first year of residency in 1977 until I retired as an attending faculty anesthesiologist in 2015 — to refuse relief at 3-4 pm when the night crew came on duty IF I had a patient who'd been unstable or whose management was especially complex. I didn't believe I could fully read the relieving anesthesiologist into the case because so much of what I'd been through with the patient had been idiosyncratic and it had taken me hours to get a "feel" for what worked and what their safe parameters were. Relief teams loved me, you can be sure. Oh, one more thing: when I did hand over a relatively straightforward case, rather than quickly dress and get the heck out of Dodge, I'd go down to the cafeteria, get a snack, come back to the anesthesia ready room, shmooze a bit, and then go back to the OR I'd left and quietly enter, saying to the relief doc, "Did I leave anything out, or is there anything on the anesthesia record that isn't clear?" Quite often there WAS a question or three whose answer(s) helped orient the relief anesthesiologist.
Thank you for going the extra mile for your patients. I'm sure you saved lives that way.
Too many hackers make the mistake of thinking that human bodies are deterministic machines that respond consistently to inputs just like computers. The reality of medical care is far more complex.
I'm assuming "quite often" was not 90% of the time. To that end, was there a common theme to the times this happened? Not necessarily automatic automation obvious, but some learning.
And, I do want to echo the sibling post. Just because I am asking questions like this does not mean that I am not thankful for the job you do. Thanks!
No common themes: a million and one different loose ends, many of which never came up before and never would again, having to do with the inability to read lab values written in tiny spaces, shorthand to myself not known by anyone else, etc.
This feels unlikely. I get that there would be a lot. But the number is almost certainly smaller than you are giving credit.
Just with the two you gave, automating the recording of lab results should help. Such that you never have to copy from a readout to a record. That should just happen. (Consider how many shipments and other things are transferred with minimal errors every day.)
Similarly, why have a shorthand that is known only to you? We have plenty of things like that in computers. Not everyone has my macros. That said, they expand to known things. Or contract from them.
So, I would hope we have folks working to help in this area. I'm not convinced we don't. I do think it gets presented adversarially all too often. There is no need for that.
No, we're not. Clinical software is the 9th circle of hell for every healthcare worker, as it was made not for patients, but for billing efficiency.
As a result, handoffs get no benefits from tech. Computers make it even worse by increasing administrative loads.
> The difference between your judgement at hour 23 and hour 29 is less of a risk factor than a 20% increase in handoffs.
What about the difference in judgement at hour 8 vs hour 29? Honestly, the fact that there's not much benefit for working "way too many hours" vs "way too many hours + 6" doesn't surprise me. It's the difference between "a reasonable number of hours" and "way too many hours" that should have more of an impact.
And how much would handoff issues be reduced if everyone was working an 8 hour shift?
Do you have an idea of how much 8-hour shifts would cost? It is not pretty. It is doable, but has numerous drawbacks as well. The first being that you would not want surgery done by someone trained on 8-hour shifts without a complete overhaul of medical education, which would also cost you an arm and a leg.
> We have laws against overworking truck drivers, but somehow the medical profession gets a pass.
It's a feedback loop. Those residents are working themselves to the bone aiming for a big payoff in a high prestige, high income profession.
The fix you want is isomorphic to just hiring a bunch more doctors to handle the workload, which will dilute the existing pool making it both lower income and lower prestige.
Basically, Residents are willing to work hard because they want to be part of a profession where residents have to work hard. That's not really an issue of regulatory structure (though I totally agree that work hour limits are a common sense thing that should absolutely be enforced by some authority).
On paper, yes. In fact, this is a social dilemma, and your point of view will hold until you yourself end up in the hospital and complain because your doctor does not know his/her job. I don't mean that aggressively. I unfortunately know that from first-hand experience.
Yes, of course. In fact, the problem is that when we make major changes to the healthcare system, the truth is that no one has any idea of where we'll end up a few years later.
That's why there is a lot of posturing on both sides. I can judge how extreme opinions will impact my immediate professional surroundings, but besides that your judgement is as valid as mine.
> I wonder how many of these are from lack of sleep.
A lot of the deaths from lack of sleep wouldn't be counted as medical errors. E.g., medical errors include accidentally doing surgery on the wrong person, but don't include fucking up a surgery that you actually need to do.
But, what if you got the wrong patient because you were sleepy and it was affecting your judgement and clarity of thinking? I'd say there's a decent amount that could still fall under that.
That's why there are multiple pre-op checkpoints with nurses involved. One doctor cannot "get the wrong patient" all by him/herself. It takes a five-person team fuck-up for that to happen.
Yeah, my point was more that the JHU estimates are purposely a lower bound. E.g., they use only data from hospitals, even though most procedures happen in outpatient or nursing home settings. So whatever the real number of accidental injuries and deaths from sleep deprivation is, you can't get it as just a percentage of the JHU estimate.
> There are surgery residents working 24h+ shifts. We have laws against overworking truck drivers, but somehow the medical profession gets a pass.
That is so true. During my Zivildienst[0], nurses were required, either by law or by collective agreement, to have at least ten hours of free time / rest between two shifts. The morning shift was 06:00 to 14:00, the late shift was 13:00 to 21:00, so if somebody worked the late shift one day and the early shift the next day, they could either leave an hour earlier on the first day or come in an hour later on the second day. But doctors working 24-hour shifts were not seen as a problem.
In Germany, this is weirdly intertwined with the pay structure of medical doctors, so any attempt to improve working conditions would also result in lower wages, which the medical doctors do not like, either, so the whole situation is somewhat stuck.
EDIT: [0] When we had a military draft in Germany, one could become a conscientious objector and serve in a civilian institution instead, for example Hospitals or nursing homes.
Near the end of the book Why We Sleep, the author specifically talks about this: by his account, lack of sleep makes a surgeon 170% more likely to cause an error during surgery. Scary to ponder...
He mentions that the guy who started the residency program at Johns Hopkins, Halsted, turned out to be a cocaine addict. Halsted stressed that a doctor needs to work all the time. No one knew about his drug addiction until he had passed away.
He even tried to get rid of the addiction by checking into a rehab up north, but came away with a morphine addiction in addition to the cocaine addiction.
That's not correct. These rules are for normal shifts where residents are caring for patients. There might be "down time" to grab lunch or catch up on notes, but it's not enough to go take a nap.
Perhaps other fields manage their manpower better, like airline pilots and crew; I think they operate under strict duty limits that are impossible to bypass without breaking rules. I think even train locomotive crews are under very strict duty limits.
Airline pilots are not really comparable. They don't routinely hand off control of the airplane to another set of pilots in the middle of an in-flight emergency. Every intensive care patient is essentially a non-stop crisis.
Many of the respondents to this informal poll classified coffee or other energy drinks as stimulants, so I'm not surprised by this. 64% of Americans drink coffee every day.
This would have been more interesting if antidepressant usage were isolated.
You dug into the sources? The author of the linked article wrote, "I discovered that 75% of med students (and new doctors) are now on psychiatric medications."
The article is based on a collection of anecdotes from responses to a Facebook question: “75% of med students and residents are taking either stimulants or antidepressants or both. True or false?”
I drink large amounts of coffee. Previously, I bought a 4oz jar of anhydrous caffeine powder. I've also been on 10mg BID amphetamine.
To compare the two: the amphetamine feels more artificial, bodily-wise. However, ingesting about 500mg of caffeine has a similar bodily feel to 10mg of amphetamine. The amphetamine did some tricks like time-delayed release whereas the caffeine did not, which does make them harder to compare. The caffeine did stay just as long, though, and provided a much longer bodily high.
These days, I just drink coffee and stay away from pills and powder. Safer.
I don't really know how anyone could honestly compare caffeine and amphetamines. One is like a knife and the other is like a 50 cal machine gun, as far as stimulation goes. There is a reason amphetamines are a schedule II drug and caffeine is not.
Drug scheduling is a joke. Schedule I is defined: "Substances in this schedule have no currently accepted medical use in the United States, a lack of accepted safety for use under medical supervision, and a high potential for abuse."
How do alcohol and cigarettes not fit this definition!? Meanwhile, edible cannabis is relatively safe and Ecstasy has shown promise as a component of PTSD treatment. Just a few examples...
And unfortunately, the GP's appeal to the fact that $thing is scheduled is the same type of idiocy that we see throughout government.
"Well, marijuana is a schedule 1, so we can't test it for legitimate applications. And since it's schedule 1, it has no medical applications".
There is absolutely no scientific or chemical basis for "schedule". It was originally a way for Republicans (Nixon, Reagan, etc.) to use the law to arrest hippies and black people. We need look no further than the crack/powder cocaine sentencing disparities: crack was inner-city and primarily used by black people, whereas powder cocaine was used more by whites outside the inner-city core.
It's about dosage. 5 mg of amphetamines is really just a bit of a boost. Of course, if you start cutting lines and snorting them to stay awake and dancing until noon the next day, it's a different story.
> I don't really know how anyone could honestly compare caffeine and amphetamines.
They are both stimulants. I've done both, like I said. So yes, for my body (n=1) I can indeed compare them. And it was sincere and honest.
And the only reason caffeine isn't Schedule II is the same reason alcohol and nicotine aren't either. That's purely a societal reason, with no basis in chemistry. If caffeine/coffee had been discovered last year, it would be a scheduled drug. Look no further than kratom.
Yes, 500mg of caffeine is kinda similar to 10mg of amphetamine, but you can easily take more amphetamine, up to a gram in some cases. It's very much easier to abuse, and has a far stronger, more profound and longer-lasting effect.
It also has some nasty side effects that come with abuse, mental problems being the first: depression, anxiety, mood swings...
Indeed, this is a very misleading title. Medical researchers might understand that caffeine is a stimulant, but as a layperson I assumed they meant Adderall or similar.
Prescription stimulants like amphetamine and methylphenidate are more potent than caffeine.
What if the headline had said "75% of med students are on drugs" but actually it just meant that 75% of med students occasionally take acetaminophen or aspirin for headaches? Those are indeed drugs, but wouldn't you feel misled? People drinking coffee is not newsworthy, so one wouldn't expect an article about it. I haven't looked at the data the article is based on, so I don't know if something like this is what's going on, but I think this is the point village-idiot was trying to make.
No shit speed and meth are stronger than coffee. I don't think that was the point being made, or at least it's probably safe to assume the commenter you are replying to didn't intend to imply that.
Sorry to be so blunt, but casually ignoring things like modafinil, which seems to be the go-to drug for staving off sleep if you want to be able to pass drug tests, is a bit of a mistake.
Subjectively, most users of modafinil find it "weaker" than coffee because there is no rush associated with it as there is with coffee; you just can't sleep when you take it.
So if the point you're making is actually just virtue signalling with the "well, some drugs are worse than others because I guess they are classified as controlled?" bullshit, then please provide an actual reason why the comment was disagreeable.
??? I'm not sure you know what virtue signaling is. Also your tone is very unpleasant.
The point the comment was making was very simple: providing a single percentage for a category that lumps together things that are very different is misleading. In the extreme, saying "95% of the US takes painkillers at rates considered addiction", where painkiller is defined as aspirin or heroin, is not a very useful metric.
1. Methylphenidate (Ritalin) is not "meth", that's methamphetamine.
2. I'm not judging people who use any of these drugs, whatever their purpose. If it were up to me, all drug laws would be repealed and the state would have no say about what anyone is allowed to put in their own bodies. I'm only pointing out that there are stark differences between caffeine and drugs like amphetamine, methylphenidate, and, yes, even modafinil, and lumping them all together as "stimulants" isn't useful to this discussion. (If modafinil is no better than caffeine, why does anyone bother going through the hassle and expense of obtaining it when caffeine is dirt-cheap and easily accessible?)
This discussion is not about vilifying drugs or people who use them; it's about the unreasonable, inhumane workload that medical students are expected to handle. If 75% of medical students feel the need for and take steps to obtain prescription drugs, whether illegally or by getting a prescription, then there is clearly something wrong with medical school. If 75% of medical students drink coffee or other caffeinated beverages, that is not a cause for concern (for me, at least).
I'm curious if you could reason this out for me. If I drink a few hundred litres of tea a day to keep me alert, then why is that different from taking Ritalin? I mean, it's obviously impractical and probably quite bad for me, and I should absolutely take Ritalin instead. But I'd love to know why the desire to stay alert, to the extent that I need to seek out drugs because I can't get enough sleep, is any less alarming if my drug of choice is tea (i.e., caffeine) rather than Ritalin.
Because caffeine consumption is usually a given in the general population, and significantly weaker than more specifically engineered and harder-to-acquire prescription drugs. Three pots of coffee in an hour is a poor equivalent to taking Ritalin.
If I considered walking across the room as a form of workout, I could pretty confidently say that 99.99% of Americans exercise once a day. But that wouldn't yield me much in the way of insight.
Coffee is quite stimulating by itself, and a caffeine addiction can probably ruin your life via lost sleep and anxiety. I'm fine with it being classified as a stimulant, although I wish they pulled caffeine out into its own category.
> So you are saying a disability is disqualifying to be a doctor? No that is not correct.
No, you are wrong. Mental health among doctors is a huge issue in the field right now. In some states having a recorded case of depression is enough for you to have a license review.
Behind closed doors everyone will tell you to lie about your mental history. If you need help, you better make damn sure you're going to someone who will take cash and not keep any records of your appointments.
Doctors are literally having their entire careers put at risk because they admit they're human.
Certain types. The article talks about this in the part about your medical board not being your friend. Medical doctors, lawyers, police officers, and the military have strict requirements about even antidepressants, and you can be disqualified for having just taken them.
My wife is a psychiatrist and sees a psychiatrist off the record because she's afraid of the stigma affecting her career. It's just one data point, but I find it telling that a psychiatrist is afraid that other psychiatrists will know she takes meds to treat a psychiatric condition. Hers is anxiety-related.
In my view it's pretty widespread that people are hesitant to seek certain kinds of care because of the potential effect of a "pre existing condition" on their employability.
Exactly. I don't think it's because the medical board itself would stop them, but more because the stigma attached to it would lead to people questioning them and their ability to do their job, not to mention the issues with insurance. I'd say that likely has a lot more to do with doctors wanting to see others off-the-record... I know of some teachers who do the same thing with psychiatrists, because they don't want it getting out to the students' parents.
So wait, I could have been an air force pilot after all, and their annoying eyesight and physical requirements are illegal?
That is absolutely ridiculous; the ADA doesn't imply that anyone with any disability must be accommodated for any job, just that reasonable accommodation be made. For instance, someone in a wheelchair will be unable to do many jobs regardless of accommodation, such as framing carpentry and roofing.
As someone who has gotten ADA accommodations related to mental health, there are absolutely limits. Due to my hand tremors, no hospital in the nation would allow me to be a surgeon, regardless of qualifications. Ditto for police departments, due to said mental-health issues.
I think the vast majority of people would agree with you. Clearly, some jobs require certain physical attributes.
However, which jobs would a person with a psychiatric disorder be disqualified for? Couldn’t one argue that all jobs that require a brain would be off-limits? Probably not, but where is the line drawn?
And that is as much because of the potential for the condition to be used for blackmail as it is any real limits it imposes on the work you would be doing.
And for some jobs, e.g. you want to work undercover for the CIA as a field agent, your mental state is not what would be strictly considered normal.
This whole discussion is completely absurd. Good soldiers, doctors, intelligence agents, fighter pilots and astronauts are often far outside of the bell curve in some important traits. That's why they ended up in the job. It would be impossible to find good candidates if everyone was completely honest with all the things that should disqualify them from the job. If you were a cynic, you'd call it a test of good judgement.
You can bet that almost everyone in these jobs has some skeleton in the closet that would officially be a problem. Some years of depression or other psychiatric illness, a proclivity for scandalous sex, having used an illegal drug a couple of times or even regularly, an unofficial child somewhere, some hidden medical issue that popped up once and was never talked about again, etc. With the common case that the issue has little, no or even positive effect on job performance.
This is anecdotally true from off-the-record conversations with exceptional individuals I know. The list of disqualifying parameters is just there so the bureaucrats can cover their ass when an aneurysm causes the loss of a billion-dollar aircraft, and to prevent blackmail with information that can be easily found.
But regarding blackmail, I think the bigger problem is that honest and harmless things can be used for blackmail in the first place. Everyone has smoked a joint and had some compromising photos taken; it shouldn't be such a big deal.
The ADA has limits. You can't be in the military if you are missing a limb, and you can't be a police officer if you have serious mental-health issues. There are common-sense limits to the ADA.
Varies by jurisdiction, but there are a lot of places, including many areas in the USA, where this is very much an issue, affecting licensing and credentialing. Would it survive a court challenge? Probably not, but I don't know a lot of doctors who would take the risk of reputation, career prospects, lost income, etc. to find out.
Med school -> Memorization school. The medical profession needs to figure out how to use technology more effectively in its processes. I'm convinced that having doctors spend years doing rote memorization hinders their critical thinking. It also puts a ton of stress on medical students, who have to spend an exorbitant amount of time drilling. Of course, fixing that would harm the medical industrial complex, so we must make being a doctor as difficult as possible.
You have to build the background knowledge to even be able to speak the language of medicine. This is not difficult "just because", it's difficult because it's a huge amount of baseline knowledge that must be assimilated, before you can even begin to think critically.
This is like telling someone "we need to figure out how to use technology to prevent people from spending years of rote memorization before they can speak <X_foreign_language>!"
You can undoubtedly prioritize grammar over vocabulary, no? You are not thinking in terms of systems that let us find the "words" we need, guided by our mastery of the grammar and our ability to reason about what must be said in a given situation. What you said is valid only if we imagine speaking a language unassisted by such systems, which is exactly what I feel must be left behind.
Edit:
Also, this is what happens in practice anyway! Doctors don't remember a lot and constantly have to look things up (as is understandable and expected). You must be a master of the "grammar of medicine," though. Otherwise you won't be able to construct well-formed sentences... aka medical thoughts.
You need some memorization to give concrete instances of the grammar, of course; however, going overboard on the "vocabulary" is not constructive and shifts the focus onto knowing a lot of words rather than knowing how to construct great sentences. It's like a writer who knows a lot of uncommon words but ends up having nothing meaningful to say... because the power of writing is not in the vocabulary alone, but in the craft of composition.
The problem with that analogy is that med school treats medicine like a natural language such as English, when you could simplify the material and treat it like a constructed language.
For example, if you named the finger bones something like "left, thumb, two," they'd take little effort to recall. Instead people need to memorize terms like "distal phalanges." Math is arguably worse, with redundant notations for the same thing. So I see the benefit of avoiding the creation of multiple systems.
I don't think that's a good example. There are only 206 bones in the adult human body. It doesn't take very long to memorize those. The real memorization time is spent on other stuff for which we don't even have common English names. You could invent new, simpler names for every anatomical part, every disease, every symptom, but that would be a huge undertaking and would only make memorization marginally easier.
My point is: if some of med school is obvious cruft that could be removed and/or altered to be more useful in practice, why not?
PS: I would include tendons, joints, muscles, etc., not just the 206 bones, in that example. Also, that was rather off the cuff; left/right could confuse things, so maybe you'd want port vs. starboard or something.
You’re really displaying your total ignorance of medicine now - our anatomical naming conventions are highly specific.
Anterior, superior, distal, medial... hell, even phalanx means finger. Your lack of knowledge is causing you to make very general assumptions about the teaching and learning of medicine that are incorrect.
Phalanx also means finger to most students of history. It is, after all, Greek for Finger (secondary to Aristotle dubbing the bones so).
What's your point? It seems to be that medical terminology (or the expert body of knowledge that distinguishes someone with anatomical or medical knowledge from the layperson) is an artificial construct which imposes undue restrictions on the easy transfer of this knowledge - guild behaviour.
Every realm of knowledge has its own language which must be learned in order to be proficient. In briefly reading your other replies in this thread, it seems like you feel that computing and programming don't have these restrictions - that is absolutely not true, just ask anyone who has been trying to learn programming themselves but is stuck on the difference between a method and a function (only to find out they are the same thing).
It is not possible to be an expert without gathering the knowledge, the language, and the real understanding necessary to partake in the field. To assume that a specialised field people spend decades learning is actually just arcane crud and cruft disguising a hidden simplicity is simple arrogance.
I think you’re getting the wrong impression. I find medical papers tend to be easier to read as a non-specialist than recent mathematics papers.
Math, however, tries to be as clear as possible, and more elegant notation generally (though not always) wins over time. Mathematicians don't still use Roman numerals even though they worked.
PS: Languages change over time; programmers dropped the term "subroutine." This can look messy, but it's beneficial over time.
I may be getting the wrong impression - but what is the point you’re trying to make - and did you just choose a shitty example?
"Left, thumb, two" carries so many problems - do you mean the palmar or volar surface? Medial or lateral? I can accurately describe a lesion at the base of the left thenar eminence and everyone who has studied anatomy anywhere in the world will understand the exact part of the body I am talking about. In another, more specific example, I might describe petechial haemorrhages proximal to the ileocaecal valve.
These words aren’t in common usage but again, every medically trained human in the world would know exactly what I am describing.
I understand your last comment about (technical) languages changing over time - medicine is no different. See for instance the Dukes grading of bowel cancer, now largely superseded by TNM grading, which carries specific histological and pathological meanings (as Dukes does) but is applicable to all cancers.
Yeah, the idea is to skip names as the foundation altogether. Instead, characterize the spaces by their properties and symmetries, and when there's an exact number of things (like bones) use a GUI (body diagram, decision tree) to aid recall, so people can just learn the names on the job.
Ah, yes, the best way to learn all the intricacies of a human: osmosis.
Good luck attaining knowledge that takes 8+ years of focused study in today's medical training through osmosis over 30 years of "on the job" training, being paid $0 because you are still "learning" and, unlike a resident physician, can't contribute to anyone's actual medical care during that time.
“Distal phalanges” are the far-away phalanges, the bones in the extremities arrayed like a phalanx.
The problem is that language changed after that, and the expectation for those words to make intuitive sense changed in the broader culture doctors were pulled from.
They’re then faced with having to rename everything, breaking with the established record and creating confusion or having to explicitly teach the names.
I’m also curious what you think is redundant in math notation.
Although I would welcome the use of common names instead of Latin names, the names in anatomy and pathology are annoying but not a hindrance. The difficult thing is to remember causes and symptoms of diseases, innervation of muscles, drug pathways, etc. That would be hard to simplify.
Instead of Latin/Greek root words, you'd have to learn a new system of common words. And not just new trainees, but also practicing physicians, because everyone has to speak the same language.
You fail to appreciate that there is quite a bit of skills development and problem solving training. The first two years are more memorization heavy, as you have to learn the language and basics. Years 3 and 4 are more focused on navigating the health care system, interacting with patients, performing technical skills, physical examination, use of imaging, diagnosis, etc. Memorization is only a fraction of what you do to become a competent physician.
First two years of medical school are in classroom, second two years are clinical rotations.
Residency is after medical school, and that's the first time you start earning a meager paycheck. In medical school, even when you're "working" with patients, you still pay tuition.
Also, let's be real, the type of cramming these students have to do is antithetical to long-term retention. Most of the knowledge is then heavily reinforced on the job. Cramming for the test is a "first draft" for the rest of one's career, just as taking notes in lecture (also a bad idea, btw) is a first draft of preparing for the test.
If my prescription required citations, my test data, the doctor's conjectures (because most conditions cannot yet be diagnosed with much certainty), and formal reasoning to connect it all, I couldn't care less how much the doctor had memorized.
(Before somebody complains of layman mob rule, remember checking a proof vs inventing a proof does and should require far less expertise.)
All learning requires some memorization. That doesn't mean all memorization is equally good. Memorize what is hard to look up, and compress what you need to memorize as much as possible; the compression process is itself integral to understanding and generalizing.
Medical school today would be like learning programming by memorizing the x86 ISA spec.
The x86 ISA spec is a doddle compared to organic chemistry.
Which is a pre-med subject.
There is a minimum amount of material in any subject that needs to be available for rapid recall in order to be an expert in that subject.
A wide variety of different areas have been studied to relate performance to level of learned information. Linguistic fluency requires easy recall of vocabulary. High rankings in chess depend on the recognition of thousands of distinct patterns. Musical improvisation requires thousands of hours of tedious repetition and practice. Effective programming requires a body of knowledge encompassing a variety of topics, a lot of which will be memorised as a side-effect of repeated exposure.
But if you think you can be a better doctor than a doctor by scanning the indices of medical textbooks -- realising a little late that you are unfamiliar with a lot of the terms and also unsure in what order the concepts need to be assimilated before you can apply them -- then at least make sure your will is up to date.
Orgo is exactly what I'm talking about. Whether it's harder or easier is beside the point: it's simply too low a level to be useful.
I'm all for a liberal-arts approach of understanding other layers of abstraction for more context, but empirically this is not how the vast majority of premeds experience the orgo requirement. And if liberal arts were really the goal, then corresponding amounts of public health classes would also be required (layers of abstraction above and below, at equal distance). But that's not the case. Clearly, then, it's not about a broader perspective but about weeding people out arbitrarily for the sake of some combination of wages and egos.
I'm not a doctor or on that track. But I am a musician and programmer, at say 1000+ hours for the former, and 10,000s of hours for the latter. Not once have I engaged in flash-card-style rote memorization for either, especially not for any written test. I am certainly full of random facts by now, but the knowledge I hold most valuable is not that which I could also get from API docs or stack overflow.
No doctor, med student, or premed I have ever met studied their field remotely like the way I've studied mine. Is being a doctor inherently that different? I can't see why, and when I look at economics and public health research, the arguments I'm making leap out at me.
(I also studied Chinese for a few years when I was younger and definitely did not do enough flash cards. Now that is a field (language learning) where rote memorization is inherent to the problem at hand.)
> The medical profession needs to figure out how to use technology more effectively in its processes.
My wife is in med school. It has nothing to do with technology.
It comes down to an insane risk aversion present throughout all of medicine. If you can't prove an alternative method is empirically better, it won't even get tried. If it's not tried, you can't get the numbers.
Even when things are obvious improvements or carry limited down-sides, the whole medical field is hesitant to pursue them.
I think the main failure to use technology is actually in communication, specifically in handoffs, communication with patients, etc. Paper records persist even though the infrastructure that made paper records work (including a population of young people willing and able to run that system) is largely gone. Practices need to write referrals, and will try to fax documents over, which then need to be re-entered. HIPAA, rather than encouraging EMR systems, makes professionals skittish about putting anything into a computer. Rather than medical information breaches, we suffer from medical information not getting to the right people.
You can't really get around the drilling. Medicine is fundamentally different than programming in this regard. You have to know the names, not just the concepts, and there are a TON of names.
There IS technology that helps with memorization. Many, many students use Anki for spaced repetition. Most subscription study sources offer some kind of spaced rep (USMLERx Flash Facts, Firecracker, etc). There are also Sketchy and Picmonic for pictorial memorization of microbes and drugs.
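For the curious, the scheduling behind tools like Anki is simple interval math. Here's a minimal Python sketch of an SM-2-style scheduler, an illustrative approximation of the algorithm family Anki descends from, not Anki's exact implementation:

    # Minimal SM-2-style spaced-repetition scheduler (illustrative only).
    def next_review(interval_days, ease, grade):
        """Return (new_interval_days, new_ease) after one review.
        grade: 0-5 self-rating of recall quality; >= 3 counts as a pass.
        ease:  growth multiplier controlling how fast intervals expand (~2.5 at start).
        """
        if grade < 3:
            return 1, max(1.3, ease - 0.2)  # failed: see the card again tomorrow
        # passed: adjust ease per SM-2, then grow the interval
        ease = max(1.3, ease + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
        if interval_days == 0:
            return 1, ease
        if interval_days == 1:
            return 6, ease
        return round(interval_days * ease), ease

    # A card graded 4 ("good") on three consecutive reviews:
    ivl, ease = 0, 2.5
    for _ in range(3):
        ivl, ease = next_review(ivl, ease, 4)
        print(ivl)  # 1, then 6, then 15 days

The point of the growing intervals is that each review lands just before you'd otherwise forget, which is why a few minutes a day can maintain thousands of cards.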
You can't get around drilling in anything, but the point is that students do an OBSCENE amount of it, and forget most of it. Also, as others have said, much of the material is irrelevant to the actual practice of medicine and will be forgotten quickly. E.g., you don't need to know organic chemistry to be anything but a medical researcher (which should be a distinct profession), yet the system inflates the requirements to keep people in school longer and maintain control. It's a corrupt system in my eyes, and people fall for this bullshit that it's so difficult to be a practicing doctor... it only is when you add extraneous material that isn't necessary for the practice of medicine.
Most doctors aren't researchers. You don't need that much education to practice medicine and fill the role people need when they seek a doctor. It has to change, because the current state of affairs is out of control. People are losing everything they have to a predatory medical industry.
>If you don't have a knowledge base, you simply can not practice medicine because you don't understand how things actually work...
Nonsense. Why does a heart surgeon need to know anything about psychiatry? What does a radiologist need to know about cellular biology?
The equivalent for programming would be forcing someone to understand how transistors work at the atomic level, CPU architecture, and OS structure before letting them write a webapp. It's definitely one of the problems in medicine: there's way too much useless knowledge required, leading to a limited supply of people able and willing to master it. Plus, it makes it much more expensive and time-consuming to train all of those people.
Knowing about CPU architecture and OS structure helps me to reason about the application code I write. Do I understand those things as much as someone who specializes in them? No of course not. But a certain level of baseline knowledge does have its benefits.
They seem to follow a pretty similar career arc. If you take a typical CS grad they're going to go through a couple years as a junior dev, couple more as a more senior one, and then five years or more in they are ready to be in charge of their own project. That's pretty similar to the intern -> resident -> attending pattern that doctors follow.
Honestly, the biggest difference between having a career in programming and being a doctor is the gatekeeping. In programming, you can drop out of high school and 20 years later be an expert in the field. There's a path for talented people to go from junior -> dev -> senior -> lead -> principal -> whatever. There's no path like that in medicine. Sure, as a nurse you can go from LPN -> RN -> NP, but that's where it ends. Your only option at that point is to spend 4 years and hundreds of thousands of dollars. And there's really no reason for that.
They’re not directly comparable topics like that. A doctor definitely needs to know more than a computer scientist. But all of medicine compared to all of computer science is not an easy comparison to make accurately, since they're both such broad topics - in medicine, everything from psychology to pharmacology to genetic engineering; in computer science, everything from transistors to complex mathematics to artificial intelligence safety. They're both topics that are impossible for one human to understand in their entirety, but society requires doctors to know a lot more of medicine.
Compliance with medication regimens and follow-up care can affect outcomes in cardiology, as in other areas. Understanding a patient's psychiatric illness, psychosocial functioning, and other factors can help predict treatment compliance. That in turn may influence the treatment regimen the cardiologist recommends.
Medications that are prone to abuse, ones that may be more complex to take, ones that can be dangerous if someone takes too many... all are things the cardiologist should be aware of which takes some basic psychiatric knowledge.
There may be few in cardiology specifically, but many physical-health meds can definitely create psychiatric side effects. How do you differentiate those from an existing psychiatric illness?
There's a large body of literature recognizing the links between cardiac problems and mental health; ignoring them results in poorer outcomes, increased system utilization and costs, etc. Saying "this is my area, that's yours" exacerbates this problem.
> >Heart surgeons have patients with psychiatric problems.
> Which they don't treat. That's why we have psychiatrists.
People with mental illness also have heart problems that need heart surgery. It'd be nice if heart surgeons had a bit of understanding of psychiatry so they can avoid diagnostic overshadowing. This is a significant problem that contributes to the shortened lifespan of people with severe mental illness.
The common theme in these subthreads is that you consider your ignorance as a superior source of knowledge. That since you personally don't know about a thing, the thing doesn't exist.
It's difficult to unhorse a debating partner who refuses to allow any evidence but their own imagination.
Just let it go. You are making an apex fool of yourself in the eyes of each and every health professional reading this. It would be strictly impossible for hospitals to function based on your opinions, which is why the supporting data you are calling for does not exist.
We don't pretend to program better than you do or demand solid data to support your pet language/paradigm. Have a little understanding of others for a change.
You are part of the problem, not the solution. Working on that would be the real improvement.
>It would be strictly impossible for hospitals to function based on your opinions, which is why the supporting data you are calling for does not exist.
Oh nonsense. There's plenty of evidence that lower trained providers are just as effective as doctors.
Of course we don't have studies comparing the equivalent for radiologists, not because such a study would be strictly impossible, but because it would be strictly illegal.
There's also evidence that the way doctors are trained isn't the only possible way.
Humanities and social science majors who omitted organic chemistry, physics, and calculus, and did not take the MCAT do just as well as traditional students:
>Just let it go. You are making an apex fool of yourself in the eyes of each and every health professional reading this
Put up or shut up. Where is your evidence-based argument that cardiologists who did a psych rotation have measurably better patient outcomes than those who didn't? Or that a radiologist with more biochem knowledge is a better one?
Well, you clearly don't understand what you are reading. Of course there are improvements to be made. Of course we can take shortcuts. No one denies that. Same as you could bypass your shining CS degree and certifications that you are so proud of and still be a lead developer. But as they say "there is no free lunch". That is what you should be thinking about.
Or does your self-righteousness exclude the possibility that three professionals telling you you misunderstand could have a point? I certainly earn less than you do and have no MCAT; does that make me more truthful? CRNAs also have to learn a lot about cell biology; is that also useless cruft? Something that is taught in med school is to critically review one's own reasoning!
Forget your petty certitudes and slow down. If there was an easy way to do much better, even the AMA could do nothing to stop it. I certainly don't expect to persuade you and therefore will not take the time to "put up" and brandish more useless Cochrane reviews, but you might consider the remote possibility that you are wrong. And extremely arrogant.
Instead of searching for additional academic papers, I went on to read your comment history.
Now I understand better. You are the iconic portrait of the American self-made man/woman! Going out on a limb, I'd say you have a high chance of being against universal care, medical abortion, and other niceties.
So yeah, now I can very confidently say that you are downright nefarious to your country's healthcare system. Not just to MDs. The whole shebang.
If you feel insulted although I took great care in staying within what could still be defined as polite if strong disagreement, let me just make clear that your behaviour is par for the course. Let's also just highlight that it's the only time over this whole thread that you do not directly oppose what I'm saying :)
> Nonsense. Why does a heart surgeon need to know anything about psychiatry? What does a radiologist need to know about cellular biology?
Medicine doesn't resemble neatly modular systems with crisp APIs. A nodding familiarity with distant specialities is required because very frequently a patient cannot be well treated by a single specialist.
I have a psychiatrist, an endocrinologist and an orthopaedic surgeon. They all take an intense interest in what the others have investigated, diagnosed or treated. More to the point, they have enough overlapping knowledge to profit from their interest.
> The equivalent for programming would be forcing someone to understand how transistors work at the atomic level, CPU architecture, and OS structure before letting them write a webapp.
Webapps don't often come down to distinguishing between conditions ranging from benign through uncomfortable, disruptive, and disabling up to fatal. Besides, our hardware cousins have striven mightily to hide the messiness of reality from us; medicine has no such abstraction layer to protect it in the same way.
>As a radiologist, I have to know a lot about cellular biology to understand and predict the imaging manifestations of cellular disease.
A given radiologist shouldn't be predicting the imaging manifestations of cellular disease. They should be using their knowledge of how known diseases or conditions appear on scans to diagnose the issue. The people who need to understand things on a deeper level are the ones trying to devise new tests or scans to expand front-line radiologists' diagnostic abilities.
One of the problems with letting people in a profession set the requirements for that profession is that they exaggerate them to limit competition and increase their earnings. You end up with things like radiologists saying they need to know cellular biology. It's simply ludicrous on its face.
Radiology is a particularly egregious case. If it weren't for legal protection, the profession would look dramatically different. It's obviously stupid to spend a bunch of time training radiologists to interpret and diagnose from a hundred different kinds of scans. Basic efficiency would be to train person A to interpret chest X-rays, person B to do CT scans, and person C for leg X-rays, and have each spend all day doing that one thing. Instead, radiologists spend a decade learning how to interpret a hundred different things and spend 1/100 of their time on each of them.
How do you interpret an image if you don't have an understanding of what is in it?
In other words, let's say that a patient has a fever. Chest X-ray was done.
A high-school-educated "radiologist" sees the image. Does he understand the anatomy? The variations of anatomy? Pathological manifestations of potential causes of fever? How to exclude an image artifact versus include a potential sign of pathology? What about findings unrelated to the fever that need to be identified, further characterised, and followed up with further imaging? What about signs of infectious fluid versus non-infectious fluid like blood or extravasated fluid?
What about when the ordering physician wants to discuss the findings with the "radiologist"? Will that "radiologist" actually understand anything that the physician is talking about?
There is a role for AI assisting in rapid analysis of radiological studies, however radiologists can never be replaced because AI will never perform to the level of or have the same functions as a physician radiologist.
>How do you interpret an image if you don't have an understanding of what is in it?
Because you're looking to identify patterns and match them to known ones. To use an analogy, I can teach you to identify statues of Hindu Gods without teaching you anything about Hinduism. For example, to identify Ganesh you need to know that he has an elephant head. Knowing that he has an elephant head because Shiva cut his human head off doesn't really help you.
>In other words, let's say that a patient has a fever. Chest X-ray was done.
>High-school educated "radiologist" sees the image. Does he understand the anatomy? The variations of anatomy? Pathological manifestations of potential causes of fever? How to exclude image artifact versus include potential sign of pathology? What about findings that are not related to fever but need to be identified, further characterised and further imaging required for follow-up? What about signs of infectious fluid versus non-infectious fluid like blood or extravasated fluid?
All of the things you mentioned are things that a radiologist interpreting a chest X-ray needs to know. What you need to explain is why someone interpreting a chest X-ray needs to understand Organic Chemistry. And the vein structure of the leg. And the typical development pattern of a child. And the various mental illnesses a person might have. And interpret a knee MRI.
>There is a role for AI assisting in rapid analysis of radiological studies, however radiologists can never be replaced because AI will never perform to the level of or have the same functions as a physician radiologist.
No, but I could easily take 10 people, train them each 1/10th of what a radiologist studies, and have them perform just as well as 10 radiologists by routing the right stuff to the right person.
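To make the routing idea concrete, here's a toy Python sketch of the dispatch being proposed; the study types and reader names are hypothetical:

    # Toy sketch of "route each study to a narrow specialist" (hypothetical labels).
    READERS = {
        ("xray", "chest"): "reader_A",  # trained only on chest X-rays
        ("ct", "head"): "reader_B",     # trained only on head CTs
        ("xray", "leg"): "reader_C",    # trained only on leg X-rays
    }

    def route(modality, body_part):
        # The catch the radiologists in this thread raise: anything
        # unexpected or spanning body regions has no narrow owner.
        return READERS.get((modality, body_part), "escalate_to_generalist")

    print(route("xray", "chest"))  # reader_A
    print(route("mr", "abdomen"))  # escalate_to_generalist

The dispatch itself is trivial; the dispute in this thread is about how often cases fall through to the escalation path and who is competent to handle them.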
The average Radiologist does not only interpret a single modality or the same body part day in and day out (Radiograph, CT, MR, ultrasound, nuclear scintigraphy, PET, or mammography). We are prepared by our extensive residency to competently read any and all of these upon graduation. We do not interpret these in isolation; we are constantly comparing to prior studies across modalities, often across overlapping body regions. I completely disagree that barely trained hyperspecialists would be sufficient.
While specialization is coming to all of Medicine, we tend to cluster by body region/disease state, rather than by modality. This is because all of the modalities provide complementary information and you must be able to cross reference across the various manifestations of disease.
All of medical knowledge is iterative. Unlike programming, you can’t just abstract the low-level programming. The premedical curriculum provides the baseline knowledge to understand pharmacology, which is essential when trying to understand our interventions in physiology and pathophysiology. We need to know both physiology and pathophys when interpreting imaging to know how disease manifests and what is abnormal versus post therapy related change.
We have to know how referring clinicians will treat disease, and know the major complications to look out for on imaging. We discuss their treatment plans during tumor boards and need to speak the language of the treating teams so we can tailor our interpretations to be useful.
There are no easy shortcuts here. NPs and PAs are a living experiment at a shortcut, but what I see day in and day out is that the people they consult (radiology and pathology) need to know even more clinical medicine to help the inexperienced NP or PA in knowing what to do when something happens that deviates from the protocol. Many many many times I will call with a semi-urgent unexpected finding, and just get silence on the other end of the phone. They don’t know what to do, whereas on weekends or nights when I get residents or attendings, I don’t hear this complete absence of understanding.
> I completely disagree that barely trained hyperspecialists would be sufficient.
>While specialization is coming to all of Medicine, we tend to cluster by body region/disease state, rather than by modality.
These are contradictory statements. If Doctors typically go:
No Training -> Trained in everything -> Specialized in body region/disease state
You can easily go:
No Training -> Trained in body region/disease state -> Specialized in body region/disease state
So even if we assume it's not possible to specialize radiologists any further than they already are we can still cut the training time and difficulty.
>We have to know how referring clinicians will treat disease, and know the major complications to look out for on imaging. We discuss their treatment plans during tumor boards and need to speak the language of the treating teams so we can tailor our interpretations to be useful. We need to know both physiology and pathophys when interpreting imaging to know how disease manifests and what is abnormal versus post therapy related change.
No one is denying that there are things in medicine that are complicated and require very skilled people. What you describe is the end state of a fully educated and experienced doctor. Medicine is more or less the only field that makes you become that before you start working. Can you take a reasonably smart person off the street and have them designing plans to treat tumors in a year? No. Can you take that person and train them to distinguish normal-appearing lungs from cancerous ones in that time? Probably. And can that person learn on the job and develop the ability to design a treatment plan over the course of 10 years of working? For sure. And they wouldn't learn everything taught in med school; they would pick up only the things relevant to the job they are trying to perform.
>All of medical knowledge is iterative. Unlike programming, you can’t just abstract the low-level programming. The premedical curriculum provides the baseline knowledge to understand pharmacology, which is essential when trying to understand our interventions in physiology and pathophysiology. We need to know both physiology and pathophys when interpreting imaging to know how disease manifests and what is abnormal versus post therapy related change.
Going back to the comment that kicked this all off: if I handed you a cellular biology final from med school, do you think you'd be able to pass it? What about the USMLE?
>There are no easy shortcuts here. NPs and PAs are a living experiment at a shortcut, but what I see day in and day out is that the people they consult (radiology and pathology) need to know even more clinical medicine to help the inexperienced NP or PA in knowing what to do when something happens that deviates from the protocol. Many many many times I will call with a semi-urgent unexpected finding, and just get silence on the other end of the phone. They don’t know what to do, whereas on weekends or nights when I get residents or attendings, I don’t hear this complete absence of understanding.
Every other field has the ability to take inexperienced people and make them experienced. There's no reason medicine can't do the same. Your clueless NP and PA should have more experienced people to go to for help when they experience something new. The next time they get that call they wouldn't be as clueless.
What you are describing is literally how medical training works. Residency IS on-the-job training. Medical school provides the foundation to then learn on-the-job; residency transitions them with graded responsibilities to independent practice.
As far as your targeted questions, yes, I do think I would pass a cellular biology test from medical school. I wouldn't get 95%+ but I would expect myself to get at least 80% correct.
I'd estimate the overwhelming majority of practicing physicians would pass the USMLE if they took it cold. They probably wouldn't do as well as they did after many weeks of dedicated study, but they would be above the minimum threshold.
>What you are describing is literally how medical training works. Residency IS on-the-job training. Medical school provides the foundation to then learn on-the-job; residency transitions them with graded responsibilities to independent practice.
It's not, because you can't start on the path without going to medical school. And you don't need medical school to start learning radiology. I messed up my elbow in high school. They showed me the X-ray and said this is a fracture, this is a chip, etc. It wasn't difficult to understand, and with enough time and practice I'd be able to see them in other X-rays. I don't see where something like organic chemistry comes into play.
>As far as your targeted questions, yes, I do think I would pass a cellular biology test from medical school. I wouldn't get 95%+ but I would expect myself to get at least 80% correct.
So ~20% of what you learned in cellular biology isn't needed as a radiologist, assuming you only remember what you've actually needed.
There are a thousand reasons why your theories make absolutely no sense at all in the real world. You also make baseless and downright false assumptions due to your complete ignorance.
People like you are part of the issues that make the job harder than it should be.
Maybe you could take a break and re-evaluate whether you have the elements to make these judgement calls? Or enter med school and see if your theories hold water?
You “could” teach a high school student all aspects of medicine, but it’s going to take so long for them to build the groundwork that by the time they can meaningfully contribute, you’d never make the return on investment. This is actually how medical training was done in the years prior to the Flexner report.
From what I can tell in this exchange, you have an innate distrust of credentialing: you think that because there is some excess, the entire system must be wasteful. I'd argue that this is the Dunning-Kruger effect in action.
And claiming that 20% of cellular bio is unnecessary is a silly metric when in the US 70% is a passing score... I don't think that logical conclusion follows.
Yes we have more tools to look up information that we have forgotten, but as another poster here says “we aren’t bootstrapping the search each time”. If there’s something we’ve forgotten, it’s much more effective knowing where to start the search.
I agree there’s a lot wrong with our current training paradigm; the cost of medical school education is #1 with several terrible downstream effects.
However, there are no shortcuts. You have to put in the time to master the material, else you are doing your patients a disservice.
>From what I can tell in this exchange is that you have an innate distrust of credentialing, because you think because there is some excess therefore the entire system is wasteful. I’d argue that this is the DunningKruger effect in action.
>And claiming that 20% of cellular bio is unnecessary is a silly metric when in the US 70% is passing... I don’t think that logical conclusion applies.
You're saying that you don't know 20% of the cellular biology they teach in med school. You are a practicing radiologist. Therefore, 20% of the cellular biology they teach is not necessary to be a radiologist.
I have certifications in my field. Developer, Senior Developer, and Lead Developer. If I were to retake the tests, I'd get a 100% on the Developer and Senior one without studying for a minute. Those are good tests. They reflect what you need to know to be effective. The Lead one isn't. I'd probably not pass if I took it today even though I've worked several years as a lead developer since I passed it. So despite actually working in the field and becoming a better developer I'd do worse on the test. That means it is a bad test.
So when you say that the overwhelming majority "wouldn't do as well as they did after many weeks of dedicated study" that makes it a bad test. It means that a good chunk of it is meaningless hoop jumping irrelevant to practicing physicians. A competent practicing physician should breeze through a well designed certification exam.
>I agree there’s a lot wrong with our current training paradigm; the cost of medical school education is #1 with several terrible downstream effects.
>However, there are no shortcuts. You have to put in the time to master the material, else you are doing your patients a disservice.
If someone who wants to be a geriatric doctor skips pediatrics, I don't see the disservice. Or take someone like a registered nurse midwife, someone with 20 years of experience delivering babies. The only way for them to become competent to prescribe Pitocin or antibiotics, or to use forceps, is 4 years and $200k worth of school? It doesn't pass the smell test.
Many medical practices are organised as you describe (for example, surgeries specialising in nothing but hernias), but medical cases often transcend simple boundaries and a broad amount of understanding is required to detect that boundary violation (such as realising it's not a hernia, or that an organ near the hernia seems to be diseased, or that the patient is going into anaphylactic shock because of a surprise reaction to anaesthetics, or ... or ... or ... or ... ).
> or that the patient is going into anaphylactic shock because of a surprise reaction to anaesthetics, or ... or ... or ... or ... ).
And if I go into anaphylactic shock during my hernia surgery should I be grateful that my surgeon did a Pediatric, OBGYN, Neurology, Psychology, and Oncology rotation? Or that they can draw the Krebs cycle?
The level of aggressive ignorance displayed in comments like this is quite shocking. I'm puzzled as to why engineers and developers think they're qualified to give advice to other professionals in fields where they have no education or experience. What causes that type of hubris?
Okay, that was sarcastic. But in my view no doctor knows all of those subjects in depth. Pre-med students take the basic STEM courses in order to pass the MCAT, then forget most of what they learned, and those courses were all just superficial intro courses anyway.
To be fair, the same can be said of most engineers.
Many med students (wisely) major in a STEM subject, so they may get into more depth than someone who majored in some kind of "studies." But what they know is a function of what they actually studied, learned, and kept up with after college, which does not make me super optimistic.
Since you haven't been through medical education and training, you wouldn't understand that while certain details are forgotten, the core knowledge that serves as the basis for further learning is retained.
Do you remember every detail of your first grade education? Probably not, but it has served as a foundation for everything else you learned in life. How do you learn calculus if you don't learn numbers and letters?
You are way overstating the knowledge of most practicing physicians. You are right that most of them learned these topics at some point, but they quickly forgot them. I worked with a pediatrician who works in underprivileged areas and conducts public health research, and she had forgotten the difference between a diploid and a haploid cell (for anyone not bio-savvy, this is bio 101), yet she still does a fine job treating patients and understands the problems of the communities she works in.
Different people have different strengths and collaboration is all about enabling others to use their strengths. You do not need to learn most of the things on your list before you can make a positive contribution to medicine. Your list would exclude most physicians from contributing.
Yeah, med school/boards require you to memorize some trivia about some or all of those fields. That doesn't mean such knowledge is clinically useful. By necessity, most of that knowledge will be superficial because there is far too much to cover. The stuff that's useful will be retained, while the rest disappears.
As you conclude, this is really a social/economic problem masquerading as a technical one.
The problem is that doctors (and lawyers) are the best-unionized workers in the country. I'm all for unions, but when a certain group of laborers is much better protected than everyone else, and their work is legally fenced off in a way that creates extra monopoly power, the asymmetries erase the benefits to society.
1. All the memorization acts as gatekeeping to tighten supply. This is in addition to other explicit quotas.
2. Junior people (residents, young associates) do the boring drudgery (and in medicine are under-compensated relative to later career stages). Internally, this is a hazing ritual, pure and simple. Externally, it is again terrible for productivity: removing the drudgery from the top of the food chain removes the psychological yearning for improvement from those with more power, and in the residents' case it exacerbates the lack of competition that would force change.
The long hours of residents, plus the stigma discussed here, also expose the rank hypocrisy.
Your hypothesis seems to be that things would improve if doctors were less protected. If you put that into practice, you will see your already damaged healthcare system crumble to ruins.
You have to realize that, whether you like it or not, docs are the last line of defense. They are the ones putting in the hours to keep the ship afloat, which is why you have to pay them well.
You are right in saying there are dishonest people in medicine, but if you want change then target the industry and their lobbyists. It will be much better for you and everyone else, and will bring some of the changes you seek.
I'm not even arguing a massive conspiracy, to be clear. Not all doctors are Tom Price. I don't know how it works with the ADA, but my guess is the stuff is mainly tradition, and economics informs tradition unconsciously.
And yes, the institutions that most want to pay doctors less are totally untrustworthy with that money. So I am not advocating letting them do that.
I guess I'd like:
1. Government health care. Nobody can rationally price the value of their own life from their deathbed; markets make no sense for health.
2. After that (which fixes the deepest problems), switch to an apprentice style program where everyone starts as a nurse, and then some go back to school to learn more.
The way society valorizes doctors above all others (including, implicitly, above nurses) - this "last line of defense" stuff, etc. - strikes me as ungrounded and, frankly, classist or sexist in many instances. There are other well-meaning individuals, and focusing on the individual level, whether patient or caregiver, is not the right approach to systemic problems.
What I meant to say is that the current structure really is like that, unfortunately. It can be changed, but probably only at a high cost and with a profound cultural change. Your current social contract as a doc is:
1. Do the job that is traditionally yours.
2. Handle everything unplanned alone, because no one else will do it.
The second part is the problem, and is largely due to resource constraints.
What's clear is that the system is already in pretty bad shape, so I'm just saying that the frontline workers are not a suitable target for pressure.
BTW, your view of the nurse-to-doctor transition would have been OK 50 years ago. Nowadays, nurses are hyperspecialized in their own niches. If you meant to say "you should know how to make a bed before doing heart surgery," I heartily agree that it would do everyone great good if docs experienced what life is like lower down the care ladder.
Thanks for replying this far after the life of the thread.
I was aware that nurses are far more trained today. But is their training that orthogonal to the doctors'? I was thinking of doctors starting as nurses, in tandem with nurses being allowed to operate more machines, do more radiology, etc.
Well, yes you can do that for some theoretical knowledge, but it wouldn't be very practical on the job because indeed, they are not the same jobs at all. The role of nurses is to handle the expected as efficiently as possible. The job of docs is to handle anything unexpected. That is why a wide education is necessary.
You may also be surprised by the fact that specialist nurses are well paid and often would never want to "upgrade" to being a doc, precisely because of the additional constraints. Finally, the ego of specialist nurses rivals that of docs, and they are even more strongly unionized.
I don't think it's really possible to keep the same quality of care for cheaper by changing education, because armies of residents and fellows are effectively doing slave labor, and you'd have trouble finding anything cheaper and more efficient than a highly trained slave. In my opinion, well-applied technology is the only long-term solution.
My three-year (1977-1980) anesthesiology residency at UCLA Hospital was wonderful. Loved almost every minute, including the all-nighters. And no, I wasn't on antidepressants or stimulants other than occasional coffee.
Final year of residency — schedule as senior resident in the house overnight:
Day 1 (weekday): Report for duty at 3 pm, take over from the day team with night call team. Try to send everyone on day team home by 5 pm. Cover hospital and all services for Code Blues; emergency intubations; difficult catheterizations/IVs; patients who arrived late and required pre-op history & physical before next day's surgery; anesthesia for deliveries in OB; anesthesia for cases ongoing from day plus emergencies from ER and in-house. Done at 7 am the next day (Day 2). So 16 hours. No sleep as a rule.
Day 2: Off (home to sleep during the day to make up for the missed night; to bed that night as usual).
Day 3: Report for work at 7 am; light duties such as pre-op workups of ambulatory patients; start IVs/arterial lines in pre-op area; Code Blue coverage of hospital and all services.
Off at 3 pm.
Day 4: Repeat Day 1. This schedule applied for six months.
Note: If Day 1 fell on a Saturday, Sunday, or holiday, report to work at 7 am (rather than 3 pm); otherwise identical to a weekday. Off at 7 am the next day, so a 24-hour shift. Sometimes a nap or two.
And are you aware of all the studies regarding irregular sleep schedules, and humans' inability to judge their own exhaustion and how it affects their behavior?
Great job enjoying such circumstances, but the evidence would point to it being a sub-par way to have residents perform at their best. Surely we could have regular-length shifts (with fixed start times so as to work with a reversed circadian rhythm) and more overlap to solve the patient hand-off problem.
I just sent this article to my psychiatrist, whose practice includes medical students. His response:
"Yep. I believe it, mostly. Would estimate 50-66% but medical boards suppress self-reporting."
I was introduced to daily caffeine use by the most religious Christian engineer I worked with at the time. We would drink two cups at once to get a jolt.
I don't think I need to ask about engineering, as many of my coworkers have 4+ cups of coffee a day. I can't imagine how many of them take other stimulants/anti-depressants, etc...
From professionals to laborers, why would we assume drug usage is any different?
Does any form of Christianity besides Mormonism have a stance against caffeine? I know many Mormons from professionals to laborers who abstain from caffeine, nicotine (another common stimulant), and alcohol. (Of course many others who don't.)
In modern Catholicism, coffee could be sinful if it is disruptive to your life or connection with God. This is very much the same as alcohol which is normally fine, but can have a detrimental effect in excess or if addiction develops.
There is also some folk history that when coffee started to become popular in Europe, several clergy were concerned about this Islamic drink (coffee having been introduced to traders by Muslims, who cannot drink alcohol). Since wine is central to Christian liturgy, they were concerned that Muslims had developed an alternative to wine and this could be the work of Satan.
This was allegedly put to rest when Pope Clement VIII tasted coffee and deemed it to be an acceptable Catholic beverage.
For me, coffee is my stimulant, alcohol is my relaxer/antidepressant. I've never used any illegal drug, or used a prescription drug without an indicated reason and prescription for it.
I don't. I take a vitamin sometimes when I remember, but I don't drink booze, tea, or coffee, don't smoke, and don't take antidepressants, focus drugs, anything. I also don't play with essential oils, homeopathy, crystals, or other placebos. Heck, I don't even take painkillers or headache medicine unless I've got stitches. So, while I would agree that most probably do, you can't say "everyone."
This 75% number is not from a formal study. It supposedly comes from three different psychiatrists who all quoted the same 75% figure.
The author verified the 75% number by asking her Facebook followers.
> Do most med students require psych drugs for day-to-day survival? I turned my question over to Facebook: “75% of med students and residents are taking either stimulants or antidepressants or both. True or false?”
She cherry-picked a few Facebook responses to include in the article, but even some of those don't support the author's claim:
> but I have no idea if it’s 75%…I don’t know enough of my class well enough to have that info, nor do I think anyone does…there are usually cliques of up to 25 people, but for people to say they know for sure details of 75% of their class would be hard for me to believe but maybe…there is a lot of it, I agree with that.
And
> Being completely honest 75% seems a bit high, but I wouldn’t be that surprised if it were true
The author's real point is buried at the bottom: Medical licensing requires doctors to disclose their psychiatric history and medication use. Admitting previous psychiatric problems on your licensing form requires a detailed explanation of the condition and treatment. Doctors know this, so they have a perverse incentive to hide their treatment or even avoid treatment at all for fear of risking their careers.
They're both effectively performance-enhancing drugs, and I'm not sure I'm totally against this. Anti-depressants boost your stress tolerance and stimulants enhance your focus and productivity. Unless it's been shown that usage of these damages people permanently in the long term, the net outcome is a more effective medical industry.
I think the problem doesn't lie as much in the use of the drug, but in the environment that leads to consumption. If we assume strong social stigma against it, and yet a majority of students end up doing it, it means there is a very strong pressure to increase performance by whatever means necessary - that is to say, that students are not in a state where they consider their workload bearable. That's in a pool of people that have been somewhat preselected for being high achievers and willing to work a lot.
While medicine is a field where there should be high expectations for students (given what's at stake), I don't think keeping students and workers around the very edge of what they can mentally handle is at all healthy or worth it in the long term. We don't want burnout, we don't want to have them suffer from stress-related issues, we don't want to increase their chances of making mistakes at work, and from an ethical viewpoint we don't want them going through a miserable life even if it's a net win for society (at least I don't).
100% agree with this. We see the same mental health epidemic in PhD students. Clearly, there needs to be a paradigm shift where we can enable people to achieve high levels of education (and hold them to high standards) without making it a nonstop do-or-die mission. Mental wellness training should definitely be part of the curriculum for medical students (and everyone).
Stimulants all cause receptor downregulation. Users end up needing more unless they take breaks to resensitize (a couple of weeks off, then a tiny dose, then another week off).
The antidepressants are worse. In people with violent thoughts (are there people without?), they can make people more likely to act on those thoughts. In everyone else they destroy libido and cause physical withdrawal. Anecdotally, I have also noticed that people on them don't draw boundaries where they should. They are OK with treatment from employers and friends that they shouldn't be OK with.
Yet for others, stimulants and/or antidepressants are a potentially life-stabilizing treatment that cannot be achieved any other way. The only thing your dismissive comment could achieve is to make people who need them feel discouraged (which is potentially dangerous).
The context is 75% of a profession in training taking these drugs to handle the pressure they are under. Not people with ADHD or chronic depression/anxiety that doesn't respond to therapy.
I also meant to say that stimulants and antidepressants shouldn't be expected to solve the entire problem. Stimulants, in particular, are a double-edged sword that can amplify sleep deprivation, anxiety, poor diet, etc., and make problems worse in the long term. But I wouldn't hesitate to encourage someone to seek medication they need. I can see how the line gets a bit blurry with medical and PhD students: at what point does chronic academic stress cross over into anxiety/depression/ADD? We don't want people popping Adderall as an academic requirement.
In any case, I think preventative mental health education and destigmatization would be a more effective solution overall.
Until there is a multi-decade study on the long-term effects of this in combination with a stressful job with long hours, I would advise caution. I would really like to know the heart-attack rate in those people over 40 compared to non-users.
I have a number of friends who are presently in medical school who come from a wide variety of backgrounds. The ~75% number seems highly consistent with my experiences. Replace "coffee" with "adderall" and it would remain consistent. Some would say they would take adderall just to get the drive to continue their studies despite otherwise being stressed, burned out, depressed, disinterested, etc. as their end goal that they've been working towards for ~10+ years has been becoming a doctor.
I always find it interesting that we go to the medical system for advice about staying healthy, yet the people who live in that system and give that advice inhabit a profoundly unhealthy environment. A lot of doctors don't have first-hand experience of how to live a healthy lifestyle.
Is there typically no screening for doctors during job interviews or at some interval after getting a job?
Not sure if everyone is aware of this, but in the tech industry, the more prestigious the company or role, the less likely there is any drug testing. It’s a bit ironic. Just one anecdote: I’m not aware that any of the top 5 software-focused companies require developers or related tech roles to ever take a test.
On the other hand, it’s very common for a developer or tech role to be screened at companies like Allstate Insurance. I’ve never worked at Allstate and don’t mean to imply they’re worse than average (anyone can google the requirement). It’s just meant to represent a typical example of large-company corporate-America IT, where tech is not the core competency of the business.
I can see drawing a line between truck drivers and developers (or any role where it’s easy for a mistake to cost lives). However, if truck drivers are almost always tested to get a job:
I've always worried that an SSRI would decrease my mental performance. Maybe worth revisiting. Can they help you learn better / focus more by blunting depression / stress?
Biochemistry and pharmacology are very, very, very complex. Psychopharmacology is very, very, very complex raised to the power of Knuth on an acid trip channeling the ghost of HP Lovecraft.
The practical upshot of which is that the various drugs affect various people quite differently. I have trialled various pharmaceuticals over the years for my assortment of mental health issues. Some I will never take again. Some I take only if I really need to. Some I take every day. Some work great for me but suck for people I know. Some suck for me but have changed others' lives.
The only way to know how a given medicine will affect you is to try it yourself under medical supervision.
Or the opposite. The ramp-up period (4-8 weeks before the brain rebalances its chemistry in response to the drug) is hell, and cessation of the drug (or running out without access to a pharmacy) causes near-total loss of cognitive ability for a few days (with brain zaps to boot).
That's why I'm not using my bupropion. I got a prescription last month to treat seasonal affective disorder, and my psychiatrist wants me to use it every winter then cycle off in the spring, but the side effects both coming on and off seem awful.
Depression and anxiety can result in a rather wide range of cognitive symptoms. For many people, that's what eventually convinces them to seriously pursue treatment. Yes, there is a risk that side effects of SSRIs can affect cognition, but there are many alternatives. Ultimately you're trying to optimize your wellness, whatever that may look like.
Depression is not healthy for the brain. It raises inflammation in the brain and could possibly cause irreversible damage. It seems like SSRIs protect against this. [0]
"I was told by the psychologist at my med school’s campus assistance program, that 75%..."
Okay, this intuitively sounds right. It makes sense that overwork and sleep deprivation are both triggers for issues like depression and would also drive those same students to seek stimulants to overcome them.
However! From the title to the contents, this rate is presented as fact. In reality it's a single anecdotal piece of information combined with some Facebook "me too" chime-ins. Hardly a quality survey.
Maybe this particular school has a significantly worse problem, or even the opposite: their rates might represent the lower bound of this issue. So while this all fits with our preconceptions of med school issues, I'm reluctant to take the particulars as a true representation of the issue.
Dr. Wible, who wrote the article on med students on medication (antidepressants and/or stimulants), did a TEDMED talk on "Why doctors kill themselves" (http://www.idealmedicalcare.org/why-doctors-kill-themselves/), which addresses the problems with the medical system itself.
If the healer is mentally ill (and most of them are), how can we expect the best medical care? The fact that med students take medication is just the tip of the iceberg.
Can someone explain to me why we have very highly paid medical staff working crazy hours, as opposed to simply having more medical staff working normal hours and getting paid proportionally less? Is it because we can’t train enough of these people so we must overwork the existing ones to meet the demand for healthcare?
(Obviously from the patients’ point of view they’d rather not have any important medical decisions made by sleep deprived doctors.)
I'm surprised this hasn't been solved by lawsuits yet. I have to assume that a surgery going wrong when the surgeon hasn't slept in 24h is pretty much an automatic guilty verdict.
Because the actual surgeon is not a resident at the end of a 24 hour shift, it's a staff doctor on call or a local doctor with privileges at that hospital or surgery center.
It takes something like a decade to train a doctor, and a bit over a decade ago is when the American Medical Association stopped trying so hard to limit the number of new doctors; that's when we got the first new school of medicine since the '80s. The biggest barrier to entry right now is probably the residency requirements that make up a few of those training years: there are only so many hospitals and only so many slots in each hospital, and residency funding is gated by federal funds.
Hand-offs are also dangerous in many parts of medicine. So, oftentimes, hospitals want the decisions made by a single shift to mitigate the risk of transcription errors between shifts.
I was in a hospital for a few days thanks to a relative being in critical care. I can tell you that no two practitioners are ever in the same room at the same time, so the standard of care is already one of continuous hand-offs and transcriptions.
I think the medical field will be one of those that benefit most from AI. Routine surgeries could be performed by medical robots like those being developed at Cal Berkeley, with human supervision in case something unexpected comes up.
This will shift the supply-and-demand equation, lower medical costs, and make life easier for doctors.
>This will shift the supply-and-demand equation, lower medical costs, and make life easier for doctors
It's time to start shifting this discussion over to the names of the professions:
>Physician
and
>Pharmacist
These jobs are often limited not by the amount of physical and mental labor but by the bureaucratic paperwork needed to get government-mandated approval.
My wife is a Doctor of Physical Therapy, and Physicians have the stamp needed to get a patient treatment. $380M has been paid by physicians to politicians to entrench their position.
It seems we need to use AI to remove the bureaucracy. This is true for pharma, insurance, and hospitals. That is how AI will reduce costs.
Using AI for surgery would terrify me. Some algorithm that does it right 95% of the time and, the other 5% of the time, does something completely random and inexplicable...
Anything in the medical field would have to be extensively tested; this isn't a web app where "move fast and break things" applies.
And it would primarily be used for routine surgeries, defaulting to human intervention under uncertainty. Most of these surgeries are basically identical, and the AI could improve by sharing data across all surgeries performed around the world. Not to mention how bad humans are at surgery, especially when fatigued.
AI is the only way to fix healthcare: a doctor/surgeon can work effectively only a few hours a day and takes nearly 30 years to educate from childhood. An AI could do surgery and diagnose patients nearly 24 hours a day, 7 days a week.
I agree that medicine is one of the only domains that never industrialized, remaining an artisanal profession with a highly skilled, highly paid professional in front of every customer. But I am more concerned about the failure modes of AI.
That is not how AI works. Such an algorithm would not make it into an actual surgery. Even humans make mistakes. The point is to get the model to perform better than humans.
Surgical robots -- at least the ones that actually exist and work today -- are designed more like traditional safety-critical robotics platforms and less like a one-off case study for a nice visualization/table for an end-to-end deep RL paper. There is no "the model".
Even in the presence of superhuman medical AI, we should still insist on human experts (i.e., highly trained doctors) not just in the loop, but at the center of it. This adds another required skill to the repertoire of doctors—understanding the AI systems that they are utilizing (overseeing?) and communicating risk/results to patients.
In other words, this still leaves the same problem—students getting crushed by the medical education gauntlet.
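To make the earlier point concrete that real surgical robots are layered safety-critical platforms rather than a single end-to-end model, here is a minimal illustrative sketch in Python. Every name, interface, and limit below is hypothetical, invented for illustration; it is not any real robot's API. The idea is just that whatever proposes motions (learned or classical) sits behind independent, dead-simple bounds checks, and any violation hands control back to the human.

    # Hypothetical sketch of a layered safety architecture -- not a real system.
    # The planner (ML or classical) never drives the motors directly: an
    # independent envelope check gates every command, and any violation
    # freezes the robot and hands control to the human surgeon.
    from dataclasses import dataclass

    @dataclass
    class Command:
        dx_mm: float    # proposed tool-tip displacement per control tick
        dy_mm: float
        dz_mm: float
        force_n: float  # proposed applied force

    MAX_STEP_MM = 0.5   # hard per-tick motion bound (assumed value)
    MAX_FORCE_N = 2.0   # hard force bound (assumed value)

    def within_envelope(cmd: Command) -> bool:
        # Deliberately trivial and auditable -- not learned.
        return (abs(cmd.dx_mm) <= MAX_STEP_MM
                and abs(cmd.dy_mm) <= MAX_STEP_MM
                and abs(cmd.dz_mm) <= MAX_STEP_MM
                and 0.0 <= cmd.force_n <= MAX_FORCE_N)

    def control_tick(planner, sensors, human):
        cmd = planner.propose(sensors.read())  # any planner may sit here
        if not within_envelope(cmd):
            human.take_over(reason="command outside safety envelope")
            return None  # robot holds position; the surgeon decides
        return cmd       # only bounded commands ever reach the actuators

The safety case then rests on the tiny, verifiable within_envelope check and the hand-over path, not on the planner being right 100% of the time.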
I just read "How to Change Your Mind" by Michael Pollan. It appears we are inching closer to freeing psychedelic drugs (psilocybin and LSD, for example) for medical use via psychedelic therapists. Pollan goes over a lot of new research and revisits older research (1950s-1970s) showing the effectiveness of these drugs, when administered by a professional, in helping treat depression and break addictions to substances like alcohol and nicotine.
Yes, they are not a guaranteed solution. But research from Johns Hopkins University School of Medicine and others shows they are more effective than any current pharmaceutical drug.
And that doctor making critical decisions about you, or even worse, that surgeon cutting into you, has been on back-to-back shifts for 24+ hours.
There are many documentaries about the sheer mess our medical system is in, yet everyone remains ignorant and allows it to burn forever. Like guns, apparently you have to be a victim first before you change your mind and give a damn.
I would also be on Ritalin if I could get it. I tried it a couple of times (my son took it briefly but stopped because "it made him feel weird"). A single delayed release pill easily doubles my productivity, improves my memory, and makes the work far more enjoyable.
Something seems bizarrely wrong to me when you're excited about a medication you borrowed from your kid, who felt side effects significant enough that he discontinued use.
When I was a kid, I used it for years until I stopped because it made me feel weird. For me, "weird" meant antisocial, never hungry, and feeling crestfallen for hours when it wore off.
I took it again in university until I stopped because by then the "crestfallen" feeling was replaced consistently by suicidal thoughts.
To each their own, and everyone can experience different effects, but you seem very ignorant of what taking Ritalin is actually like. It's not a magic productivity pill. You pay for every bit of it.
I don't see anything "bizarrely wrong" with observing a quantifiable doubling of productivity and improvement in enjoyment of life, all from taking a very small dose.
When (and where) I was growing up ADHD wasn't really a diagnosis you could get. I strongly suspect I have it, at least the attention deficit part, but apparently the only way to properly diagnose it is before you're a fully grown adult, so I'm kind of SOL since I'm in my 40s.
You, on the other hand, seem to be quite judgmental towards people you don't know. That's what's "bizarrely wrong".
Very high workload with long hours, high performance demands depending on the stage of training you are in, an oscillating shift-based schedule, minimal time off to decompress, cultural pressure, and competition.
This has to be a big component of it. Working 16 to 28 hour consecutive shifts, often over night, where a mistake can potentially be fatal, with very irregular sleep hours. That has to be a recipe for disaster. After I learned that was what residents’ schedules were like I was terrified of going to the hospital.
Compare this with countries that have very little licensure/regulation of the profession, and I'm sure those doctors and students are much less stressed.
And I'm sure a number of them can also be dismissed as "bad doctors", but not all.
This isn't sympathy, it's practicality. I've been on SSRIs. The side effects are nothing to joke about. The withdrawals when coming off of them are brutal. It was hard enough writing code while on them. I would rather my doctor not have to be using them while treating me.
Instead, let's get them the help they need to not have to be on antidepressants. I'm all for hard work but there is no reason whatsoever to make med students and medical providers work hours that we restrict truck drivers from working.
Med students accumulate horrendous debt and don’t actually make real money until they finish both med school and residency. That’s an 8-year lag in money-making compared to a comp-sci grad who goes into tech.
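As a rough back-of-envelope sketch of that lag (every figure below is an assumption picked for easy arithmetic, not sourced data): suppose the CS grad earns a flat salary from year 1 while the med student debt-finances four years of school and then earns a resident's salary for four more.

    # Back-of-envelope only; every number is an assumed placeholder.
    cs_salary = 100_000        # assumed flat tech salary, years 1-8
    tuition_per_year = 60_000  # assumed med school cost, debt-financed, years 1-4
    resident_salary = 60_000   # assumed resident pay, years 5-8

    cs_total = 8 * cs_salary                                # earnings, years 1-8
    md_total = -4 * tuition_per_year + 4 * resident_salary  # debt, then residency

    print(f"CS grad after 8 years:  ${cs_total:,}")   # $800,000
    print(f"MD track after 8 years: ${md_total:,}")   # $0
    print(f"Gap when attending pay starts: ${cs_total - md_total:,}")  # $800,000

Even with flat numbers and no interest on the debt, the med student starts attending-level pay roughly the CS grad's entire eight years of earnings behind.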
That is why it's mostly people with wealthy parents who attempt med school. Trust me, I lived with med students in a housing co-op for a while. Where CS students rent an apartment in a 30-story block on the outskirts of town, med students live almost in the city center and rent a whole house together. It's not even comparable.
And do you think antidepressants are paid for by healthcare? Someone needs to buy them as well. Most CS students I know are happy when they can afford the frozen pizza from the supermarket.
And that's also the main point here: it is a sign of wealth that you can afford to worry about depression. If you are not wealthy, you may be just as depressed, probably even more so, but you can't spend time worrying about it because you're worrying about rent and food.
So I'm supposed to trust your anecdote on the wealth of med students based on you living with med students for some time? I went through med school and have many friends who went to other med schools. There is a wide range of med schools and support mechanisms. While it may be true that premeds come from wealthier backgrounds than CS students in US, that's something that needs separate sources. If you think CS students are happy when they can afford frozen pizza, it differs from my memory of CS at Stanford, so there's also a huge range of affluence among CS students depending on where they go.
Also if someone is diagnosed with clinical depression, insurance will definitely pay for antidepressants. That diagnosis can be made by student health.