David_Conrad%Wayne-MTS@um.cc.umich.edu (04/12/90)
Dave Ihnat <ignatz@chinet.chi.il.us> writes:
>So the point I was making is that in an environment which doesn't even
>provide underlying hardware support for protection, it's impossible to
>make a secure, safe system no matter how good you are in software
>development.  Having the hardware, however, does not guarantee such
>security; but id [sic] does make it possible.

Having the hardware neither guarantees such security nor makes it
possible; what it does make possible is a greater degree of security,
and that is, in itself, a good thing.  But a completely safe, completely
secure system is impossible unless no changes could be made.  (If no
changes could be made, then, of course, we must ask ourselves how such a
system was brought into being, and then realize that no such system can
exist.)

As long as some changes can be made, whether they are loopholes due to
an imperfect strategy (because even if the security system could be
perfectly implemented, it would also have to be perfect in its
conception), or whether they are changes considered proper under most
conditions, some program could exploit that ability to make changes and
create harmful or virulent code.

Hardware support for security makes the virus writer's job more
difficult and the virus interceptor's job easier, which, as I said, is
good.  But do not confuse increased security with complete security.

Regards,
David R. Conrad
+-------------------------------------------------------------------------+
| David R. Conrad      (preferred) dconrad%wayne-mts@um.cc.umich.edu      |
| /\/\oore Soft\/\/are             dave@thundercat.com                    |
| Disclaimer: No one necessarily shares my views, but anyone is free to.  |
+-------------------------------------------------------------------------+