ems1@att.att.com (06/20/89)
It's good to see that the debate on gragu versus active shields is alive and well again. I don't think it was ever conclusively resolved in our last go-round. (And I should know! :-)

One problem with the "pro" active shield arguments seems to be what I call the "Maginot Syndrome". The Maginot Line, for those who may not know, was a *massive* construction of defensive installations built by the French, who then "retired", secure in the knowledge that no attacking force could possibly get through their shield. Alas, a way was found around their static defense, and disaster followed. Would-be builders of active shields need to consider strategies for the "unthinkable" breaching of their defenses. It need not be the end of everything. In fact, by careful planning now, we can make the attacker's job much more difficult.

In this scenario, let us say that some nation or group finds itself under attack by "victorious" gray goo, its active shields crumbling rapidly. What happens next? Well, one of the most effective strategies for surviving an attack is simply not to be there when it lands. Prudence would demand that the members of the group being attacked disperse themselves as far and as fast as possible. Then the infection will not be able to spread. (As a pre-emptive measure, setting up numerous space colonies representing your group seems a good idea. The kid with lots of brothers is usually left alone by the street toughs.)

What about the individuals already infected, their body tissues being rapidly destroyed? For these individuals, we have the Lifeboat. The Lifeboat, prepared in advance of the attack, is a tough enclosure designed to support an analog of one individual's thought processes, probably through nanotech modeling of synapses and other brain structures. (Note that I say "thought process". I don't believe a static map of one's brain would provide any continuity of identity.)
With one's essential self (i.e. mind) transferred into the Lifeboat, the attacker's job has just become much harder. Instead of just disabling fragile biological processes until death occurs, they must first locate, and then penetrate, the Lifeboat enclosure to be sure of killing you. If the attack fails, then like MacArthur, you can say, "I shall return". (You DID remember to keep your body map up to date, didn't you?)

The Lifeboat, with its nanotech brain analog, will probably be smaller than a pea (possibly *much* smaller) and covered by some material at least as tough as diamond. It also probably contains a small bit of fissionable material to power the "nanobrain". It must also contain a few general-purpose assemblers, and must have some means of being opened from the inside. Of course some "escape route", or means of migrating your thought process from its usual home into the nanobrain as quickly as possible, must be devised. (I can think of a number of possible techniques, though.)

Somewhere outside of the Lifeboat (though probably not attached) is a radio beacon transmitting the details of the attack to any friendly forces that may be listening. (Another reason for the space colonies mentioned earlier.) The Beacon also transmits the last and latest update of your mind-state to your offsite archives, thus ensuring that at least a copy of "you" will survive the attack. This Beacon should probably work as a type of deadman switch, i.e. if no further updates are received, your friends and allies will hold a brief service (for the original you) and then quickly construct an exact duplicate, thus denying the attacker any satisfaction for your real death (except momentarily). (Note again that I don't believe any actual transfer of *identity* via radio will be possible for a long time. If you had that, then you'd have long-range mind transfer, and you'd never have to worry about any of this mess again.)
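The deadman-switch behavior of the Beacon can be sketched in present-day terms. This is purely a toy illustration; the class name, the weekly timeout, and the timestamp scheme are my own assumptions, not anything from the posts:

```python
import time

class DeadmanBeacon:
    """Presumes the owner dead once mind-state updates stop arriving."""

    def __init__(self, timeout_s, now=None):
        self.timeout_s = timeout_s
        self.last_update = now if now is not None else time.time()

    def update(self, now=None):
        """Record a fresh mind-state update (the 'heartbeat')."""
        self.last_update = now if now is not None else time.time()

    def presumed_dead(self, now=None):
        """True once the expected update interval has been exceeded,
        signalling the friends and allies to restore from the archives."""
        now = now if now is not None else time.time()
        return (now - self.last_update) > self.timeout_s

# Usage: weekly updates; more than a week of silence triggers restoration.
beacon = DeadmanBeacon(timeout_s=7 * 24 * 3600, now=0)
beacon.update(now=6 * 24 * 3600)                   # day-6 update arrives
assert not beacon.presumed_dead(now=7 * 24 * 3600)  # only 1 day of silence
assert beacon.presumed_dead(now=14 * 24 * 3600)     # 8 days: presumed dead
```

The design choice mirrors the post: absence of signal, not presence of one, is what triggers the duplicate's construction, so jamming the Beacon hurts the attacker rather than helping.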
This Lifeboat-and-Beacon defense makes any lasting victory for the attacker virtually impossible. To "win" they would have to:

1. Destroy or fatally poison your body. (easy, unfortunately)
2. Locate and destroy your Lifeboat. (much tougher; the real death)
3. Locate and destroy every radio'd copy. (horribly tough; requires FTL to be possible at all)

Possible scenarios for the "castaway" include:

1. The attackers have left or been forced away. Rebuild a reasonably accurate copy of your original body, and resume life where you left off.
2. The attackers continue to hold the field. Wait for outside rescue, or if you get angry enough, build a tough robot body and fight back. ("You dirty rats! I was *really* attached to that body!" :-)

Any attack whose result is a tough army of cyborgs bent on revenge would probably not be tried more than once.

This article was written to show that the attack/defense problem is not as simple as "either active shields work, or it's complete disaster", and that if the defenders are creative enough, and shed a few preconceptions, a reasonably safe and secure life in a nanotechnological world is certainly possible. So, keep your Lifeboat supplies fresh, and don't forget those Archives!

Ed Strong {princeton,mccc}!nanotech!ems
landman@sun.com (Howard A. Landman) (06/22/89)
In article <Jun.20.01.04.27.1989.19358@athos.rutgers.edu> mtuxo!ems1@att.att.com writes:
>The Lifeboat, prepared in advance of the attack, is a tough enclosure
>designed to support an analog of one individual's thought processes,
>probably through nanotech modeling of synapses and other brain structures.

Except for the enclosure, this isn't much different from a brain-enhancing implant. Assuming each individual has more nanotech compute power in their head (the easiest place to interface to gray matter) than gray-matter compute power, and more nanotech memory as well, it might well be that the implant will contain most of the individual's consciousness anyway. All you need to do is implant it in a new (cloned?) body.

>The Beacon also transmits the last
>and latest update of your mind-state to your offsite archives, thus
>ensuring that at least a copy of "you" will survive the attack.

If you do the calculations for the necessary bandwidth, assuming that implants of say 100x the brain's capacity must also be "backed up", and assuming a world population of at least 10 billion (or optimistically a city population of at least 100,000) sharing the same ether, my guess is that this won't really be practical.

>(Note again that I don't believe any actual transfer of *identity*
>via radio will be possible for a long time. If you had that, then
>you'd have long-range mind transfer, and you'd never have to worry
>about any of this mess again.)

Why not? How does that differ from long-range mind backup?

> or if you get angry enough, build a tough robot body and fight back.

This assumes that you're not *already* living in such a body, but that it's readily available. Might not many choose to avoid the messiness of rebuilding themselves (clones take time), and just stay constantly a robot? There might have to be laws governing this ...

Howard A. Landman
landman@sun.com
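Landman's bandwidth objection can be made concrete with a back-of-envelope calculation. The synapse count and bytes-per-synapse figures below are illustrative assumptions of mine; the 100x implant factor and 10-billion population are his:

```python
# Back-of-envelope: aggregate traffic if everyone broadcasts full backups.
SYNAPSES = 1e15            # assumed rough synapse count of a human brain
BYTES_PER_SYNAPSE = 1      # assumed: one byte of state per synapse
IMPLANT_FACTOR = 100       # Landman's "100x the brain's capacity" implant
POPULATION = 10e9          # his assumed world population

bytes_per_person = SYNAPSES * BYTES_PER_SYNAPSE * (1 + IMPLANT_FACTOR)
total_bytes = bytes_per_person * POPULATION

# Even spread over a full week of continuous transmission, the shared
# channel would need an aggregate rate on the order of:
week_s = 7 * 24 * 3600
rate = total_bytes / week_s
print(f"{bytes_per_person:.1e} bytes/person, {rate:.1e} bytes/s aggregate")
```

Under these assumptions the shared ether would need over 10^21 bytes per second, which is the point of his objection, and the motivation for the incremental scheme in the reply that follows.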
ems1@att.att.com (06/24/89)
In article <Jun.21.14.10.01.1989.3011@athos.rutgers.edu> you write:
>In article <Jun.20.01.04.27.1989.19358@athos.rutgers.edu> mtuxo!ems1@att.att.com writes:
>>The Lifeboat, prepared in advance of the attack, is a tough enclosure
>>designed to support an analog of one individual's thought processes,
>>probably through nanotech modeling of synapses and other brain structures.
>Except for the enclosure, this isn't much different from a brain-enhancing
>implant. Assuming each individual has more nanotech compute power in their
>head (easiest place to interface to gray matter) than gray matter compute
>power, and more nanotech memory as well, it might well be that the implant
>will contain most of the individual's consciousness anyway. All you need
>to do is implant it in a new (cloned?) body.

Agreed. Probably most of us information gluttons will spend our time that way, for the high "baud rates" possible. On the other hand, some of us prefer natural childbirth, for instance.

>>The Beacon also transmits the last
>>and latest update of your mind-state to your offsite archives, thus
>>ensuring that at least a copy of "you" will survive the attack.
>If you do the calculations for the necessary bandwidth, assuming that
>implants of say 100x the brain's capacity must also be "backed up", and
>assuming a world population of at least 10 billion (or optimistically a
>city population of at least 100,000) sharing the same ether, my guess is
>that this won't really be practical.

You're assuming the Beacon needs to send a complete backup. It perhaps wasn't clear from my earlier posting, but the "latest update" I foresee will be an incremental type, just a record (mind-diff?) of the changes since your last update was sent, probably a week ago. Transmitting one week's worth of experience by Beacon is a lot more achievable than trying to send the experience of an entire lifetime. How much does someone change in one week?
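The "mind-diff" idea above is just differential backup applied to mind-state. A toy sketch, with a plain dictionary standing in for the state (the function names and the synapse-keyed representation are entirely my own illustration):

```python
def mind_diff(old, new):
    """Record only what changed since the last full backup."""
    changed = {k: v for k, v in new.items() if old.get(k) != v}
    removed = [k for k in old if k not in new]
    return {"changed": changed, "removed": removed}

def apply_diff(old, diff):
    """Reconstruct the current state at the archive from old state + diff."""
    state = dict(old)
    state.update(diff["changed"])
    for k in diff["removed"]:
        del state[k]
    return state

# One week of change is far smaller than a whole lifetime of state.
last_backup = {"synapse_1": 0.3, "synapse_2": 0.7, "synapse_3": 0.1}
this_week = {"synapse_1": 0.3, "synapse_2": 0.9, "synapse_4": 0.5}
diff = mind_diff(last_backup, this_week)
assert apply_diff(last_backup, diff) == this_week
assert len(diff["changed"]) == 2   # only synapse_2 and synapse_4 moved
```

The archive applies each weekly diff to its last full copy, so the Beacon only ever transmits the (presumably small) weekly delta rather than the whole lifetime.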
>>(Note again that I don't believe any actual transfer of *identity*
>>via radio will be possible for a long time. If you had that, then
>>you'd have long-range mind transfer, and you'd never have to worry
>>about any of this mess again.)
>Why not? How does that differ from long-range mind backup?

Well, mind *transfer* probably would require that complete backup you were assuming earlier, for one thing. Mind transfer would also imply that you could escape from any attacker at the speed of light. This would be a lot more foolproof than either active shields or lifeboats. (Although the attacker *could* locate all your archives, and have henchmen waiting at each possible destination.)

>> or if you get angry enough, build a tough robot body and fight back.
>This assumes that you're not *already* living in such a body, but that it's
>readily available. Might not many choose to avoid the messiness of rebuilding
>themselves (clones take time), and just stay constantly a robot? There might
>have to be laws governing this ...

The small number of "general-purpose assemblers" (yeah, I know, what do *they* look like?) are included in the Lifeboat to enable you to build *whatever* you may need later, assuming there are freely available raw materials on hand. People who preferred to spend most of their time in robot bodies might, in time, come to view natural bodies as slimy and disgusting. Sort of like super-Victorianism? Some things are no fun unless they're messy, however :-)

As for laws, about the only changes I foresee would be amendments to the existing anti-discrimination laws, adding the clause "or preferred bodily form" to the statutes already on the books. Am I missing something?

> Howard A. Landman
> landman@sun.com

Ed Strong {princeton,mccc}!nanotech!ems
dmocsny@uceng.uc.edu (daniel mocsny) (06/27/89)
In article <Jun.24.00.14.09.1989.23716@athos.rutgers.edu>, mtuxo!ems1@att.att.com writes:
[ about emergency mind backups ]
> You're assuming the Beacon needs to send a complete backup. It perhaps
> wasn't clear from my earlier posting, but the "latest update" I foresee
> will be an incremental type, just a record (mind-diff ?) of the changes
> since your last update was sent, probably a week ago. Transmitting 1
> week's worth of experience by Beacon is a lot more achievable than
> trying to send the experience of an entire lifetime. How much does
> someone change in 1 week?

Exactly. And why bother to save that week? Or even one day? Just have your brain backed up every night while you sleep. That way broadcast bandwidth isn't any problem, as you can use your regular optical fibers that carry your daily gigabytes anyway. One way to cut down the transmission would be to compress the data. Perhaps experiences have a compact fractal representation.

If we had LOTS of CPU time to throw away, we could have a central site running its own simulations of everyone and their lives. If the model learned, it could continuously refine its performance based on the daily backups. Eventually it should be predicting future experiences with some accuracy, especially if everyone else's experiences in the same localities were coming in simultaneously. That way, even if you got blown away without backing up in a week or so, the simulation could probably piece you back together by extrapolating forward from your past and by interpolating with its other knowledge of events in your area.

I can see a great story line here (but it must surely have been done already). Imagine someone who keeps getting blown away and restored, and each time the destruction is serious enough to delete the memory of how death occurred. The protagonist knows that (s)he keeps getting killed and losing the memory of the week or so leading up to it, but (s)he can't figure out how the enemy is getting away with it.
You have a situation where the protagonist is investigating her/his own death in a desperate attempt to prevent it from recurring. If the adversary can take out a bigger chunk of time with each killing, the protagonist dies in stages back to infancy. (That wouldn't work, but it's a cute idea anyway. Maybe the killer attempts to corrupt your backups too.) The attacker cannot permanently kill the protagonist, but (s)he may be able to chisel away all memories, and so eliminate the present form of the protagonist.

Dan Mocsny
dmocsny@uceng.uc.edu

[I recall a novel by Jack Vance along these lines entitled something like "To Live Forever"... There is also of course the famous Null-A series by van Vogt. --JoSH]
mike@uunet.uu.net (Mike Higgins) (06/30/89)
In article <Jun.27.01.01.49.1989.12569@athos.rutgers.edu> dmocsny@uceng.uc.edu (daniel mocsny) writes:
>I can see a great story line here (but it must surely have been done
>already). Imagine someone who keeps getting blown away and restored,
>and each time the destruction is serious enough to delete the
>memory of how death occurred. The protagonist knows that (s)he keeps
>getting killed and losing the memory of the week or so leading up to
>it, but (s)he can't figure out how the enemy is getting away with it.

The short story "The Phantom of Kansas" by John Varley is a situation almost exactly like you describe: the protagonist is killed several times and loses several weeks, so her insurance agency starts paying for backups more often. The officials are unable to figure out who is doing it, so she becomes her own detective and tracks down the "phantom" in an environmental park made to recreate the Kansas Great Plains ecology.

Mike Higgins
...ucbvax!cogsci!well!fico2!everexn!mike
"Never trust a machine you can't program"