
So, in effect, he's right to suggest the pharmacist should just be replaced by a robot, because you would have them follow the rules no matter what, with all humanity stripped out. If there was more than one pharmacist available, it wouldn't have killed him/her to go and take a look at the wife in the car.


The pharmacist can, and should, use their human judgement to refuse to dispense drugs even when prescribed. But they don't and shouldn't use their human judgement to dispense drugs that weren't prescribed; this is by design and for good reason. It's a two-person rule: you only get the drugs if both the pharmacist and the doctor formally agree you should get them.
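(For what it's worth, that two-person rule is the same dual-authorization pattern you see in access control. A minimal sketch in Python, with purely illustrative names rather than anything from a real pharmacy system:)

    # Sketch of a two-person rule: the action proceeds only when two
    # independent roles have each approved it (names are illustrative).
    def may_dispense(prescribed_by_doctor: bool, approved_by_pharmacist: bool) -> bool:
        # Neither party alone is sufficient; both approvals are required.
        return prescribed_by_doctor and approved_by_pharmacist

    # The pharmacist can veto, but cannot originate a prescription alone:
    assert may_dispense(True, True)
    assert not may_dispense(True, False)
    assert not may_dispense(False, True)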


The problem seems to be that a drug was indeed prescribed, but the prescription was written incorrectly. This is not about "should we freely sell drugs", but about "should we use human judgment in addition to paper orders".

Clearly, some commenters are also trying hard to be robot-swappable. :)


The prescription was for the wrong drug, i.e. the wrong drug was prescribed and conversely the right drug was not prescribed. The pharmacist can, should, and quite possibly did use their human judgement to refuse to dispense the wrong drug, even though it was prescribed. But by design they don't have the authority to dispense the right drug without getting a doctor to formally prescribe it.


I really wouldn't be surprised if it turned out a large percentage (over 35%) of hacker news commenters originate from robots / ai / chatbots being trained / tested surreptitiously.


It doesn't matter if the pharmacist thinks -- or even knows -- that they are doing the "right thing": pharmacists cannot prescribe medications. Their life would be over in a heartbeat if anybody ever found out that he or she illegally provided a drug to somebody without a prescription. Additionally, pharmacists should use their best judgement to refuse to fill even valid prescriptions when warranted, similar to how a bartender would use their best judgement to refuse to sell alcohol to a customer.


You are right and make an interesting point. Part of the advantage of dealing with a human is that one hopes they can deal with whatever strange problem is thrown at them.

If you can’t deal with corner cases, or there are none, then leveraging rules/software is the way to go.

I think of vending machines and parking meters as automation of simple tasks, but what to do when they break?

I think Google tries this by attempting to “automate all the things”. It works most of the time, but when it goes wrong it is very frustrating to correct.


When they (the machines) break, a human should be able to override. For example, I was once stuck in a malfunctioning parking garage (no ability to pay, barrier stayed shut), so I lifted the barrier and let everyone out without paying. That may have been illegal, but keeping me in a parking garage without telling me how long it will take is also a crime. Please don't ever build parking garages a human can't open (meaning: don't let a robot decide whether a human is allowed to leave).


This is a completely different scenario from the one described above. You yourself are potentially breaking the law (if the boom gate is damaged, assisting in potential theft of services rendered, etc.) to prevent your car being stuck in the parking lot (which very likely has a clause in its conditions of entry, so it may not be illegal).

The two key points are that you are committing the crime yourself, not asking somebody else to, and that it is in fact only potentially a crime; even then it's relatively minor and could be argued strongly in court.

You are not asking somebody else to illegally sell you opioids (for example) with no prescription. From what it sounds like, nobody was in real danger, just pain, and you expect people to risk everything they've spent their entire life working for just to prevent an hour of pain for some stranger?


We don't have a suing culture so much here so I don't think that is relevant. Also, I feel that medical personnel should minimize suffering to the best of their ability, they don't take an oath swearing to fully emulate an emotionless robot.


Get a grip on reality, dude. Nobody is going to risk their livelihood to stop somebody being in relatively[0] moderate pain for an hour. The fact that you expect strangers to do this for you is really very selfish.

Not sure how suing is related to the comment you replied to. If a chemist hands out drugs to people without prescriptions and they get caught, they will lose their licence, be barred for life, lose their store, get a massive fine, and in some cases serve jail time. Nothing to do with suing.

[0] Relative to medical emergencies.


If so, these people should be replaced by robots asap.


> If there was more than one pharmacist available it wouldn't have killed him/her to go and take a look at the wife in the car.

Depending on the jurisdiction, pharmacists aren't allowed to prescribe stuff or treat people on the spot. That's the business of a (licensed) doctor.




