
BLU Discuss list archive



The threat of UCITA



Randall Hofland <rhofland at gis.NET> writes:

	    I wonder if these proposed UCITA restrictions on user rights will
	instead have a dramatically positive effect for "Freeware" and the
	OpenSource movement.

This is something that I've wondered about, too. The reasoning is
perhaps best explained by a parallel: suppose that in a highway
construction project (such as the Big Dig), the prime developer
(typically the state) were to order the contractors to use steel
beams from a certain supplier. Suppose, furthermore, that that
supplier refused to supply any specs for the steel, claiming that
they were "proprietary" information. Then suppose that an overpass
were to collapse because of the poor quality of the steel in the
supports.

In such a case, the first lawsuit would probably be against the
construction firm. But they would simply produce the contracts,
point out that they were required to use steel whose properties they
were not allowed to know, and argue that they therefore couldn't be
held liable for the results. The liability would rest with whoever
ordered them to use materials of unknown quality, and with the
suppliers of the steel (who presumably had the specs).

This isn't hypothetical.  There have been many such court cases.

Programmers writing software for proprietary computer systems are in
exactly this situation. If my code is running on a system whose code
I can't see, I can't possibly attest to the quality of my own
product, because it can and will fail due to failures in the
underlying system that I am not permitted to learn about.
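
To make that concrete, here is a hypothetical Python sketch (the
"opaque_vendor_lib" module and its settle() call are invented for
illustration, not any real product): my code can only wrap the
closed component and catch its failures after the fact; it cannot
audit them, reproduce them, or rule them out in advance.

    # Hypothetical sketch: my program depends on a closed, binary-only
    # component.  The names here are made up; any real black-box
    # dependency puts me in the same position.
    try:
        import opaque_vendor_lib        # no source, no specs available
    except ImportError:
        opaque_vendor_lib = None

    def process_order(order):
        """Do my part of the work, then hand off to the black box."""
        if opaque_vendor_lib is None:
            raise RuntimeError("closed vendor component is not installed")
        try:
            # Whatever happens inside this call is invisible to me.
            return opaque_vendor_lib.settle(order)
        except Exception as err:
            # Was that my bug or the vendor's?  Without the source, I
            # can't tell, so I can't honestly certify my own product.
            raise RuntimeError("failure inside the closed component") from err

The defensive wrapper buys no real assurance; all it can do is mark
where the unverifiable boundary sits.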

We can expect to see exactly this argument in the courts if this
sort of liability law goes into effect. The liability will
ultimately rest with whoever made the decision to use a computer
system with secret internal components whose specs and quality can't
be determined by the programmers.

Systems built of open code have an obvious advantage here.

I've seen a similar argument used by security specialists. One of
the standard pieces of advice is that if you are serious about
security, there is one rule that is easy to understand: you don't
allow ANY software on your systems unless you have ALL the source
code (and the time to study it). If you accept anything without all
the source, you have no idea what may be hidden inside it or what it
might be doing to you without your knowledge. "Anyone who uses
proprietary binary-only software in a secure environment is simply
incompetent."

So far, secret "black-box" software has succeeded because, when it
fails, there are no legal repercussions against either the vendor or
the person who made the purchase decision. In particular, managers
can decide to buy from Microsoft or IBM because they know that they
can't be held accountable for the product's failures. They can order
their workers to use such systems and write software for them, while
denying them the information required to build reliable software.
But if we get laws that actually put teeth behind liability for
software failures, the people who sell the end-user software have an
obvious defense: "I can't be held accountable for the failures of a
computer system if I was denied access to its code."




