Hacker News

Okay this example makes sense. No prescription == no drugs (especially for painkillers and other things that people abuse).

If you think this is a bad thing, then it's an organizational problem and has nothing to do with computers.



A human can see in the computer that the prescription is regularly refilled, and can even call the doctor if in doubt. You can also check the prescription afterwards and, if my story doesn't check out, do something about it. The migraine was pretty bad; to me it was like refusing to treat a broken leg because of a missing insurance card or something.


There are very good reasons why a doctor might decide to change a prescription, and listening to the patient over the doctor in some of those cases would not only be a bad idea, it could be fatal or cause long-lasting harm.

For example:

- Evidence of abuse of a drug.

- A second prescription which interacts badly with the first.

- Changing health circumstances (pregnancy, some sort of deficiency, failing internal systems, etc).

The correct thing to do in a case like this is to stop and coordinate with the prescribing physician. Having a pharmacist notice a change from how things were done before, hear the patient say it's a mistake, and ignore it without consulting the prescribing physician is almost never a good idea.


You're really blaming the wrong people. Blame the doctor for screwing it up, your wife for not checking the prescription, or the government for creating/enforcing the laws that would have put not only the chemist but their entire franchise out of business for "doing the right thing". Laws are laws, broskie, don't expect everybody to break them for you.

If the migraine was really bad (bad enough to cause tangible damage) maybe you should sue the doctor for damages, or if it wasn't that bad, report him/her and go somewhere else next time.


So in effect he's right to suggest the pharmacist should just be replaced by a robot, because you would have them follow the rules no matter what, with all humanity stripped out. If there was more than one pharmacist available, it wouldn't have killed him/her to go and take a look at the wife in the car.


The pharmacist can, and should, use their human judgement to refuse to dispense drugs even when prescribed. But they don't and shouldn't use their human judgement to dispense drugs that weren't prescribed; this is by design and for good reason. It's a two-person rule: you only get the drugs if both the pharmacist and the doctor formally agreed you should get them.


The problem seems to be that the drug had indeed been prescribed, but the prescription was written incorrectly. This is not about "should we freely sell drugs", but about "should we use human judgment in addition to paper orders".

Clearly, some commenters are also trying hard to be robot-swappable. :)


The prescription was for the wrong drug, i.e. the wrong drug was prescribed and conversely the right drug was not prescribed. The pharmacist can, should, and quite possibly did use their human judgement to refuse to dispense the wrong drug, even though it was prescribed. But by design they don't have the authority to dispense the right drug without getting a doctor to formally prescribe it.


I really wouldn't be surprised if it turned out a large percentage (over 35%) of hacker news commenters originate from robots / ai / chatbots being trained / tested surreptitiously.


It doesn't matter if the pharmacist thinks -- or even knows -- that they are doing the "right thing": pharmacists cannot prescribe medications. Their life would be over in a heartbeat if anybody ever found out that they had provided a drug to somebody without a prescription. Additionally, pharmacists should use their best judgement to deny valid prescriptions, similar to how a bartender would use their best judgement to not sell alcohol to a customer.


You are right and make an interesting point. Part of the advantage of dealing with a human is that one hopes they can deal with whatever strange problem is thrown at them.

If you can't deal with corner cases, or have none, then leaning on rules/software is the way to go.

I think of vending machines and parking meters as automation of simple tasks, but what to do when they break?

Google seems to take this approach, trying to "automate all the things". It works most of the time, but when it goes wrong it's very frustrating to correct.


When they (the machines) break, a human should be able to override. For example, I was once stuck in a malfunctioning parking garage (no ability to pay, barrier stayed shut), so I lifted the barrier and let everyone out without paying. This could be illegal, but keeping me in a parking garage without telling me how long it will take is also a crime. Please don't ever build parking garages a human can't open (meaning: don't let a robot decide whether a human is allowed to leave).


This is a completely different scenario to the one described above. You yourself are potentially breaking the law (if the boom gate is damaged, assisting in potential theft of services rendered, etc.) to prevent your car being stuck in the parking lot (which very likely has a clause in the conditions of entry, so it may not be illegal at all).

The two key points are that you are committing the act yourself, not asking somebody else to, and that it is only potentially a crime, and even then a relatively minor one that could be argued strongly in court.

You are not asking somebody else to illegally sell you opioids (for example) with no prescription. From what it sounds like, nobody was in real danger, just pain, and you expect people to risk everything they've spent their entire life working for just to prevent an hour of pain for some stranger?


We don't have a suing culture so much here so I don't think that is relevant. Also, I feel that medical personnel should minimize suffering to the best of their ability, they don't take an oath swearing to fully emulate an emotionless robot.


Get a grip on reality, dude. Nobody is going to risk their livelihood to stop somebody being in relatively[0] moderate pain for an hour. The fact that you expect strangers to do this for you is really very selfish.

Not sure how suing is related to the comment you replied to. If a chemist hands out drugs to people without prescriptions and gets caught, they will lose their licence, be barred for life, lose their store, get a massive fine, and in some cases serve jail time. Nothing to do with suing.

[0] Relative to medical emergencies.


If so, these people should be replaced by robots asap.


> If there was more than one pharmacist available it wouldn't have killed him/her to go and take a look at the wife in the car.

Depending on the jurisdiction, pharmacists aren't allowed to prescribe stuff or treat people on the spot. That's business of a (licensed) doctor.


My mother had a pharmacy and she often called the doctor to resolve these issues. The doctor is a phone call away, and the pharmacist's job is to do whatever they can to improve the patient's condition.


In which case you blame the programmer who wrote the code, not the employee following the software. Your idea of deferring to the rules over all common sense is the core of the problem. It doesn't matter whether we are talking about government rules, where they'll imprison you if you break them, or corporate rules, where they'll fire and blacklist you if you break them.


Note the irony in saying the pharmacist should have trusted the machine while ignoring the established rules.


Hell, most humane countries already do it that way. Based on anecdotes over the years, my pharmacist parents in Norway wouldn't have a problem with this.


That's more about CYA. The pharmacist would risk a lot by giving out the medicine without a proper prescription. They may know it's OK, but they won't risk an overzealous prosecution, for example.


Except you are asking that pharmacist to break the law. If they were fired afterward, would you be the one to take them and their family in?



