[mod.comp-soc] more on Pharmaceutical databases

taylor@hplabsc.UUCP (07/01/86)

--------
This article is from amdcad!phil (Phil Ngai)
 and was received on Tue Jun 24 20:14:24 1986
--------
 
>The question is - who's responsible?
>
>The pharmacist for relying on the computer for information when they 
>should have known that alpha + beta are potentially fatal from the
>years of pharmaceutical school?
>
>The Computer programmer who designed the faulty database/knowledge
>system that failed to issue a warning about the fatal drug interaction?
>
>The Person testing the system before installation to verify that it
>does indeed have all the knowledge and information it's meant to have?
>
>Or the person taking the drugs, for trusting a pharmacist who uses
>flawed computer systems?

My feeling is that the manufacturer of the system is responsible.  The
pharmacist bought the system to automate his operation. If he had to
manually check for drug interactions then he is not receiving the
service he paid for.
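
It is worth being concrete about the failure mode. Here is a minimal
sketch in Python of how such a check might work; the drug names, the
table, and check_interaction are hypothetical, not taken from any real
product. The point is that a lookup against an incomplete interaction
table fails silently: a missing entry looks exactly like a safe
combination.

  # Hypothetical interaction table, keyed by unordered pairs of drug names.
  # The fatal alpha + beta entry was simply never added -- that is the bug.
  INTERACTIONS = {
      frozenset({"alpha", "gamma"}): "may cause severe drowsiness",
  }

  def check_interaction(drug_a, drug_b):
      """Return a warning string, or None if no interaction is on record."""
      return INTERACTIONS.get(frozenset({drug_a, drug_b}))

  warning = check_interaction("alpha", "beta")
  if warning is None:
      print("no interaction on record")   # all the pharmacist ever sees
  else:
      print("WARNING:", warning)

The absence of a warning is indistinguishable from the absence of data,
which is exactly why the pharmacist cannot tell that the system is
incomplete.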

The person taking the drugs has also paid for a service and has every
right to expect to receive a safe and effective product.

The manufacturer will have to pay the damages. Neither the programmer
nor the QA person is likely to have the resources to pay them.
Perhaps the FDA should play a role here much as the FAA is responsible
for certifying the airworthiness of aircraft. Note that the FAA is
not, to my knowledge, liable for crashes caused by a bad design.  I
would not expect the FDA to be either. But they could play an
important role as an independent watchdog.  In any case, the
manufacturer should be encouraged to set up a design and release
process that reduces the chance of buggy software getting out.
Practices that seem expensive in man-hours and time to market, such as
code reviews and thorough testing by well-paid engineers (most
companies treat QA as a second-class group, if they have one at all),
might seem less expensive if the manufacturer were held liable when its
product did not perform as advertised.
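
To make "complete testing" concrete, here is one sketch, reusing the
hypothetical check_interaction above: an acceptance test that compares
the shipped table against an independently maintained reference list of
known-dangerous combinations. The list and the names are assumptions
for illustration only.

  # Hypothetical acceptance test: every pair on an independent reference
  # list of known-dangerous combinations must produce a warning.
  KNOWN_DANGEROUS = [
      ("alpha", "beta"),    # the fatal pair from the scenario
      ("alpha", "gamma"),
  ]

  def test_all_known_interactions_flagged():
      missing = [pair for pair in KNOWN_DANGEROUS
                 if check_interaction(*pair) is None]
      assert not missing, "no warning issued for: %s" % (missing,)

A release process that ran such a test against an independent list
could have caught the missing alpha + beta entry before the product
shipped, at a cost that is small next to the liability.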

>My feeling is that the burden of legal responsibility is on the
>parties that did something wrong, or, by proxy, failed to do something
>"right" with criminal intent, in this situation.

I'm not sure there was any criminal *intent* in this example.
Negligence, perhaps.

> For example, if the pharmacist was later shown to have known that the 
> combination was potentially lethal but trusted the computer to be right 
> when he might be 'confused' then he is indeed responsible for the death 
> of the patient.

I think he would bear some responsibility, but it would be hard to show
that he knew the combination was lethal. It also seems unlikely that he
would sell the combination if he knew it. You might show that he
*should* have known it, that it was covered in one of his classes. But
it is entirely reasonable for him to rely on the system he bought to
perform as advertised. If it doesn't, and he has to check every
interaction by hand, he'd be better off without it.

>Recently, due to a number of lawsuits brought by women with health problems, 
>the courts have ruled that A. H. Robins must offer restitution to women 
>adversely affected by the IUD.  This could potentially cost the company
>hundreds of millions of dollars - perhaps driving it out of business
>entirely.
>
>There was NO criminal intent when they introduced the IUD, however, and 
>they also had the approval of the FDA (the branch of the American government 
>involved with testing and approving foods and drugs for consumer purchase).  
>
>The point?  That the US government feels that the act is all, and that
>the intention is irrelevant.

I think the intention is relevant but not an absolute defense. What's
important here is that a company offered a product for sale which
turned out to hurt people. When you offer a product on the marketplace,
you are promising that it does the job and is safe to use. If it is not
safe to use, even if it was used improperly, you, the manufacturer, are
liable. Do people know the lawnmower-as-hedge-trimmer story, or should
I repeat it here? I know that much of the software available does not
carry this kind of promise, but that is partly due to the willingness
of most software consumers to accept less than what they thought they
were buying.  Didn't I read recently about a politician who bought a
computer that didn't perform as claimed? She responded by introducing
legislation requiring stores to make good on their claims. That is the
general attitude with regard to consumer goods.

[Note that this example is essentially moot.  See the article posted
 previously about the Dalkon Shield in this group...  --Dave]

>Enough legalities, however.  The more interesting questions in the
>hypothetical scenario presented are the moral ones.  Should the programmer, 
>hearing about this tragedy, feel responsible in any sense for the death?  

Yes, they should feel responsible. They should know that their software
will be making life-and-death decisions and exercise the appropriate
level of caution. It is unlikely they would be held legally liable,
however; they don't have enough money for anyone to care.

[I think that moral/ethical responsibility is independent of legal liability
 and/or financial status... --Dave]

>From another direction, legally the designer would not be, at least
>in states like Colorado.  Here in Colorado engineers are licensed 
>by the state after taking rigorous exams, or are vouched for by 
>the employer, who must accept some legal responsibility for this.
>Hewlett Packard chooses the second route.  This is one of the main
>reasons that engineers almost always MUST have degrees to be able
>to perform certain tasks.

In one sense that is a reason. In another sense, the reason is that
engineers (civil engineers always, electrical engineers sometimes,
software engineers sometimes) do things that people's lives depend on.
The regulations reflect that reality.

I have some other thoughts with regard to holding manufacturers of
vaccines liable and whether this is good for society overall but I
think such things belong in another forum.

 Phil Ngai +1 408 749 5720
 UUCP: {ucbvax,decwrl,ihnp4,allegra}!amdcad!phil
 ARPA: amdcad!phil@decwrl.dec.com