janm@dramba.neis.oz (Jan Mikkelsen) (01/04/91)
[Somehow, I don't think this is the right place for this discussion.
Followups to sci.crypt]

In article <18875@rpp386.cactus.org> jfh@rpp386.cactus.org (John F Haugh II) writes:
>The amount of information required to go from Einstein to
>a working nuclear device is non-trivial.  The amount of
>effort required to go from FIPS PUB 46 to a working DES
>machine is trivial.  I have a copy of FIPS PUB 46 sitting
>somewheres in this room.  It is for DES as well ...

See below ...

>                        The whole world of cryptography
>requires that cryptosystems be open to examination.  DES
>is published so that everyone may stare at it and uncover
>any holes they might find.  So, while I need to know how to
>create the appropriate neutron density to get a bomb to go
>"BOOM" instead of "bzzt", with DES I only need to read the
>PUBs and start typing.

Sure, many people have written pieces of software which perform DES
encryption.  It is considerably more difficult to design a piece of
hardware with specific characteristics, for example, very high
encryption speed, tamper resistance, small size, or the ability to
operate in a hostile environment.  These are the things which cannot
be built from the FIPS specification alone.  These should be
sensitive, not the algorithm itself (which isn't).

I suspect that it was simply easier for the Americans to restrict all
implementations than to spend a lot of time figuring out which
implementations to restrict and which not to.  Of course, this is
pure supposition.

Perhaps it would make more sense to allow software for export, but
not real, physical hardware.  Ultimately, it is the hardware that
does all the work.
--
Jan Mikkelsen    janm@dramba.neis.oz.AU or janm%dramba.neis.oz@metro.ucc.su.oz.au
"She really is."
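(As a rough illustration of why going from FIPS PUB 46 to working DES
code is considered trivial: DES is a 16-round Feistel cipher, and the
C sketch below shows that bare structure.  It is a toy, not DES and
not any poster's code -- the round function and "key schedule" here
are invented stand-ins -- but real DES adds only the published
permutation tables, S-boxes, and key schedule inside the same
skeleton.)

/*
 * Toy Feistel network in the shape of DES: 16 rounds, 64-bit blocks,
 * decryption = the same routine with the subkeys applied in reverse.
 * The round function and key schedule are made up for illustration.
 */
#include <stdint.h>
#include <stdio.h>

#define ROUNDS 16

/* Stand-in round function; DES would expand the half-block, XOR it
 * with a subkey, and push the result through the published S-boxes
 * and P permutation. */
static uint32_t f(uint32_t half, uint32_t subkey)
{
    uint32_t x = half ^ subkey;
    return (x << 3 | x >> 29) + 0x9e3779b9u * (x | 1);
}

static uint64_t feistel(uint64_t block, const uint32_t k[ROUNDS], int decrypt)
{
    uint32_t L = (uint32_t)(block >> 32), R = (uint32_t)block;

    for (int i = 0; i < ROUNDS; i++) {
        uint32_t sub = decrypt ? k[ROUNDS - 1 - i] : k[i];
        uint32_t tmp = R;
        R = L ^ f(R, sub);              /* one Feistel round */
        L = tmp;
    }
    /* swap the halves at the end, as DES does, so the same loop
     * inverts itself when the subkeys are fed in reverse order */
    return ((uint64_t)R << 32) | L;
}

int main(void)
{
    uint32_t k[ROUNDS];
    for (int i = 0; i < ROUNDS; i++)
        k[i] = 0x01234567u * (i + 1);   /* toy "key schedule" */

    uint64_t p = 0x0123456789abcdefULL;
    uint64_t c = feistel(p, k, 0);
    printf("plain  %016llx\ncipher %016llx\nagain  %016llx\n",
           (unsigned long long)p, (unsigned long long)c,
           (unsigned long long)feistel(c, k, 1));
    return 0;
}

(The last line of output matches the first; everything DES-specific
is just tables to be typed in from the PUB.)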
gnu@hoptoad.uucp (John Gilmore) (01/04/91)
People can endlessly debate the small points of the rules; I want to
understand the big ones.  WHY SHOULD PRIVACY TECHNOLOGY BE ILLEGAL?
Why does the US government think that privacy is something neither its
subjects, nor the citizens of other countries, should have?

Back to details...

From: jfh@rpp386.cactus.org (John F Haugh II)
> Hopefully you will mention in your letter that DES should not be
> restricted by the Commerce Department either.  There is no reason
> to restrict DES software (or even hardware).

True.  Commerce Dept. rules are that software which is freely available
to the public is treated like documents, e.g. it can be exported to any
destination under the no-paperwork General License GTDA.  But this
limits commercial usage of encryption, which is a serious problem;
multinational companies are at a severe disadvantage in computer
security if they do their R&D in the US, because they can't export the
result.

DES is not the be-all and end-all of encryption either.  It's just the
"sticking point" where the Munitions people refuse to allow export.
There should be no controls on the import, export, or use of encryption.

From: bhoughto@hopi.intel.com (Blair P. Houghton)
> . . . there's something to be said for prohibiting the
> export of sensitive technologies, regardless of the availability
> of related scientific information.

What exactly is "sensitive" about the availability of PRIVACY?

From: janm@dramba.neis.oz (Jan Mikkelsen)
> It is considerably more difficult to design a piece of hardware
> with specific characteristics, for example, very high encryption speed,
> tamper resistance, small size, or the ability to operate in a hostile
> environment. . .  These should be sensitive, not the algorithm itself

What exactly is sensitive about the ability to produce a tamper-resistant
package?  Do we not wish anyone who wants a tamper-resistant package to
have one?  The only reason I can see for outlawing tamper resistance is
if the government wants to undetectably tamper with our things.

Small size?  What is sensitive about SMALL devices that provide privacy?
If privacy itself is OK, why not portable privacy?

High-speed encryption?  I presume the problem is high volume, not high
speed.  If privacy itself is OK, what business is it of the government's
how much data you choose to keep private?  I would think that the
government would encourage people with a lot of private data (credit
card companies, gun registration lists, payroll information for large
companies, etc) to have good means for keeping their information
private.

Hostile environments?  Hostile to what?  Certainly a privacy-assuring
device should operate in environments hostile to privacy :-).  High
temperatures, humidity, radiation, etc?  I don't think techniques for
heat-sinking, sealing, shielding, etc are export-controlled, though
there are some that are classified (and thus aren't even available to
the U.S. public).
--
John Gilmore      {sun,pacbell,uunet,pyramid}!hoptoad!gnu      gnu@toad.com
Just say no to thugs.  The ones who lock up innocent drug users come to mind.
barmar@think.com (Barry Margolin) (01/04/91)
In article <14511@hoptoad.uucp> gnu@hoptoad.uucp (John Gilmore) writes:
>People can endlessly debate the small points of the rules; I want to
>understand the big ones.  WHY SHOULD PRIVACY TECHNOLOGY BE ILLEGAL?
>Why does the US government think that privacy is something neither its
>subjects, nor the citizens of other countries, should have?

There are a couple of reasons.  First of all, it's high-tech, and there
are export regulations on most of our higher technologies.  I think the
purpose of this is to try to make sure we maintain the lead in
*applications* of high technology; for instance, we can maintain the
lead in weather simulation, which generally requires supercomputers, by
making it hard for foreigners to get supercomputers.  Also, smuggling
high-tech devices to enemy nations is frequently done by pretending to
be a purchaser from a friendly nation.

As far as DES in particular is concerned, the NSA is extremely (read
"overly") paranoid about foreigners getting our encryption technology.
A few years ago the NSA tried to get all research on cryptology declared
"unclassified but sensitive."  This would have required all papers on
cryptology to be sent to the NSA for approval to publish, and foreigners
would generally not be allowed to attend conferences on cryptology.
It's not clear whether they're worried about foreigners learning how to
break our codes or use codes that we can't break; it's probably some of
both.

The academic community went up in arms about those restrictions, and I
think the NSA eventually gave up.  However, they did manage to get the
Commerce Dept to restrict export of encryption mechanisms, and this has
stuck.  Since no large companies depend heavily on such devices for
their income, there wasn't enough complaint to prevent it.
--
Barry Margolin, Thinking Machines Corp.

barmar@think.com
{uunet,harvard}!think!barmar
janm@dramba.neis.oz (Jan Mikkelsen) (01/05/91)
In article <14511@hoptoad.uucp> gnu@hoptoad.uucp (John Gilmore) writes:
>People can endlessly debate the small points of the rules; I want to
>understand the big ones.  WHY SHOULD PRIVACY TECHNOLOGY BE ILLEGAL?
>Why does the US government think that privacy is something neither its
>subjects, nor the citizens of other countries, should have?

I agree, privacy technology should not be illegal.  I cannot see the
justification for restricting software DES implementations, nor most
hardware implementations.  I have a couple of Schlumberger M64 smart
cards lying around which do DES in a monolithic chip, with secure key
storage.  I don't know what the situation with devices like this is in
the United States, but I suspect that it would be very hard to enforce
a restriction on devices such as these.

There are, however, other aspects of an implementation which I can see
the justification for treating as sensitive, and which have nothing to
do with DES, or any other cryptosystem.  For example:

>                                        I don't think techniques for
>heat-sinking, sealing, shielding, etc are export-controlled, though
>there are some that are classified (and thus aren't even available to the
>U.S. public).

Now, what can be done about making cryptosystems more available to the
masses?
--
Jan Mikkelsen    janm@dramba.neis.oz.AU or janm%dramba.neis.oz@metro.ucc.su.oz.au
"She really is."
allbery@NCoast.ORG (Brandon S. Allbery KB8JRR) (01/05/91)
As quoted from <14511@hoptoad.uucp> by gnu@hoptoad.uucp (John Gilmore):
+---------------
| People can endlessly debate the small points of the rules; I want to
| understand the big ones.  WHY SHOULD PRIVACY TECHNOLOGY BE ILLEGAL?
| Why does the US government think that privacy is something neither its
| subjects, nor the citizens of other countries, should have?
+---------------

The rest of your message continues the implication that it's all a plot
to make privacy illegal.

That isn't the intent.  Despite the fact that it's all for nought, the
U.S. government is worried about hostile foreign powers violating *its
own* privacy by decrypting its DES-encrypted data.  Considering that
anyone who wants to type in code from Andrew S. Tanenbaum's COMPUTER
NETWORKS can bring up DES, this is a bit silly, but nonetheless your
assumption that it's Big Brother out to get us is equally silly.

++Brandon
--
Me: Brandon S. Allbery                    VHF/UHF:  KB8JRR on 220, 2m, 440
Internet: allbery@NCoast.ORG              Packet:  KB8JRR @ WA8BXN
America OnLine: KB8JRR                    AMPR:  KB8JRR.AmPR.ORG [44.70.4.88]
uunet!usenet.ins.cwru.edu!ncoast!allbery  Delphi: ALLBERY
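(Indeed, DES software is not only in textbooks.  The System V/XSI C
library exposes it directly through setkey(3) and encrypt(3), which
operate on 64-element arrays of '0'/'1' characters.  Below is a
minimal sketch, not taken from any poster, assuming a system where
those routines are present and actually perform DES; some systems
need -lcrypt, and newer or international releases may ship with the
routines stubbed out.)

#define _XOPEN_SOURCE 700
#include <stdio.h>
#include <stdlib.h>     /* setkey() */
#include <unistd.h>     /* encrypt() */

/* Expand 8 bytes into 64 chars of '0'/'1', most significant bit
 * first, which is the representation setkey()/encrypt() expect. */
static void to_bits(const unsigned char *bytes, char *bits)
{
    for (int i = 0; i < 64; i++)
        bits[i] = ((bytes[i / 8] >> (7 - i % 8)) & 1) ? '1' : '0';
}

int main(void)
{
    const unsigned char key[]   = "K3YBYTES";  /* 8 key bytes (parity bits ignored) */
    const unsigned char plain[] = "8CHARBLK";  /* one 64-bit plaintext block */
    char keybits[64], block[64];

    to_bits(key, keybits);
    to_bits(plain, block);

    setkey(keybits);                /* load the DES key schedule */
    encrypt(block, 0);              /* 0 = encrypt the block in place */
    printf("ciphertext bits: %.64s\n", block);

    encrypt(block, 1);              /* 1 = decrypt */
    printf("recovered bits:  %.64s\n", block);
    return 0;
}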
amanda@visix.com (Amanda Walker) (01/08/91)
In article <1991Jan5.022309.19716@NCoast.ORG> allbery@ncoast.ORG (Brandon S. Allbery KB8JRR) writes:
>That isn't the intent.  Despite the fact that it's all for nought, the U.S.
>government is worried about hostile foreign powers violating *its own* privacy
>by decrypting its DES-encrypted data.

Not so, as I understand it.  DES is only approved for unclassified data.
For any kind of classified communication, other (non-public) encryption
methods are used.
--
Amanda Walker                                            amanda@visix.com
Visix Software Inc.                                ...!uunet!visix!amanda
--
"I was born in Iowa--I just *work* in outer space"  --Star Trek IV
lupienj@hpwadac.hp.com (John Lupien) (01/09/91)
In article <1991Jan3.232017.15364@Think.COM> barmar@think.com (Barry Margolin) writes:
>In article <14511@hoptoad.uucp> gnu@hoptoad.uucp (John Gilmore) writes:
>>WHY SHOULD PRIVACY TECHNOLOGY BE ILLEGAL?
>
>There are a couple of reasons.

"Ostensibly", of course.

>First of all, it's high-tech, and there are
>export regulations on most of our higher technologies.  I think the purpose
>of this is to try to make sure we maintain the lead in *applications* of
>high technology; for instance, we can maintain the lead in weather
>simulation, which generally requires supercomputers, by making it hard for
>foreigners to get supercomputers.

Kind of a bogus argument.  All that really does is ensure that foreign
supercomputer markets will be supplied by foreign supercomputer
manufacturers.  Similarly for other high tech, of course.

>Also, smuggling high-tech devices to
>enemy nations is frequently done by pretending to be a purchaser from a
>friendly nation.

Yes it is, but this doesn't relate to the question, which in context
could be re-cast as "should unfriendly nations have privacy?"  The bit
about "unfriendly nations" is kind of transient, too: Iraq was a better
friend than Iran for some time after the Iranian revolution.

>As far as DES in particular is concerned, the NSA is extremely (read
>"overly") paranoid about foreigners getting our encryption technology.

Well, perhaps that's not quite what "the NSA" is concerned about.  The
NSA is in charge of national security.  They want information related
to national security to be secure, and that may involve the use of
encryption.  If so, decryption becomes problematic: they do not want
"others" to be able to decrypt security-related information.  Rumors
that DES is breakable would make the DES issue largely moot, if true,
but DES is not the only cryptographic technology which the NSA seeks
to control.

>A few years ago the NSA tried to get all research on cryptology declared
>"unclassified but sensitive."  This would have required all papers on
>cryptology to be sent to the NSA for approval to publish, and foreigners
>would generally not be allowed to attend conferences on cryptology.
>It's not clear whether they're worried about foreigners learning how to
>break our codes or use codes that we can't break; it's probably some of
>both.

I would guess that it's more of the latter.  Specifically, US citizens
are subject to eavesdropping along with everybody else, and the
possibility that the content of their communications is not available
to the eavesdroppers has an unsettling effect on the policy makers who
benefit from that eavesdropping.

>The academic community went up in arms about those restrictions, and I
>think the NSA eventually gave up.  However, they did manage to get the
>Commerce Dept to restrict export of encryption mechanisms, and this has
>stuck.  Since no large companies depend heavily on such devices for their
>income, there wasn't enough complaint to prevent it.

Well, that seems a bit out of line with reality.  Banks, insurance
companies, and major financial institutions of many kinds use
encryption as the backbone of their financial networks.  The management
of these companies is naturally unwilling to stick its neck out.

---
John R. Lupien
lupienj@hpwarq.hp.com