[net.space] supernovae & gravity waves

kcarroll (08/04/82)

   A supernova's gravitational effect on nearby stars might not be
negligible, for at least two reasons. The first is that the supernova
need not produce a spherically-symmetric nebula; hence the center of
mass of the supernova remnant would be slightly different from that
of the pre-nova star. It might even be significantly different, as I
believe that the amount of the star blown off by the supernova would
be on the order of 50%, rather than 0.1% or so (although this is
based on a rather hazy recollection of some Asimov article; how's
<that> for an example of appealing to higher authority?). Working out
the center of mass of the nebula would be tricky, as not all the gas
in it would be radiating (i.e. we wouldn't be able to see much of
it); hence it would be hard to correlate the mass-center shift with
effects on other stars in any useful manner.
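   To put a rough number on it, here's a toy calculation (my own
made-up numbers, not from any reference). One caveat I should flag:
momentum conservation pins the <total> center of mass in place, so
the remnant must recoil opposite the ejecta; what a neighbouring star
would feel is the change from the mass no longer being a single point.

# Toy sketch: a 10-solar-mass star blows off 50% of itself (the
# figure recalled above) asymmetrically; how much does the pull on a
# star 1 pc away change? All values below are assumed, for illustration.

G     = 6.674e-11          # m^3 kg^-1 s^-2
M_SUN = 1.989e30           # kg
PC    = 3.086e16           # m

M     = 10 * M_SUN         # pre-nova star (assumed mass)
m_ej  = 0.5 * M            # ejected mass, the ~50% figure above
m_rem = M - m_ej           # remnant
D     = 1.0 * PC           # distance to the neighbouring star (assumed)
x_ej  = 0.1 * PC           # ejecta center of mass, displaced toward it
x_rem = -m_ej * x_ej / m_rem   # remnant recoils; total CoM stays put

a_before = G * M / D**2
a_after  = G * m_rem / (D - x_rem)**2 + G * m_ej / (D - x_ej)**2

print(f"pull before      : {a_before:.3e} m/s^2")
print(f"pull after       : {a_after:.3e} m/s^2")
print(f"fractional change: {(a_after - a_before) / a_before:+.2%}")

Even with the total center of mass fixed, splitting the mass in two
like this changes the pull on the neighbour by a few percent once
the ejecta has travelled a tenth of the way there.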
   The other possible effect is based on the presumed wave nature of
gravity (appropriate, as this experiment is designed to measure the
gravity-wave propagation speed, which can only be done if gravity
<is> wavelike). From what I've read of supernovae, they are quite
rapid events, involving large masses being shuffled about very
quickly. This ought to generate a rapid change in the local
gravitational field, perhaps analogous to a shock wave in air.
The effects of the passage of this wave are what we would look for
in nearby stars and nebulae. I can imagine it causing a ripple
through a nebula, or a sudden shift in the period of a variable star.
(Of course, as far as astrophysics goes, I have more imagination
than real knowledge.)
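   For what it's worth, here is the timing geometry such an
experiment would hinge on: you see the supernova flash, then wait to
see the nearby star react, and the lag between the two encodes the
gravity-wave speed. A rough sketch (all three distances are made-up
example values of mine):

# Lag between seeing the SN and seeing a nearby star react, if
# gravity propagates at speed v_g. Distances below are assumptions.

C  = 2.998e8               # m/s, speed of light
PC = 3.086e16              # m
YR = 3.156e7               # s

D_sn   = 100.0 * PC        # Earth -> supernova (assumed)
D_star = 100.5 * PC        # Earth -> test star (assumed)
d      = 2.0 * PC          # supernova -> test star (assumed)

def observed_lag(v_g):
    """Seconds between the SN flash and the star's visible reaction."""
    t_flash = D_sn / C                # SN light to Earth
    t_react = d / v_g + D_star / C    # gravity to star, then light to us
    return t_react - t_flash

for frac in (1.0, 0.99, 1.01):        # v_g = c, 1% slower, 1% faster
    print(f"v_g = {frac:.2f} c -> lag = {observed_lag(frac * C) / YR:.3f} yr")

With these numbers, a 1% change in the gravity-wave speed moves the
lag by only a few weeks out of roughly eight years, which is why the
distances involved must be known very well; hence the next problem.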
   The place where this experiment might really fall down is
the use of "known" distances between stars as part of the
calculation of the speed of gravity. To my knowledge, it's
only for the very near stars that we have any precise idea
of distance. The farther stars are too far away for parallax to
give an accurate distance measurement, so their apparent magnitudes
must be used instead. But the ABSOLUTE magnitudes of the stars
are not really known, so this isn't all that accurate, either.
As a result, I'd be surprised if the distance of a star more than
(say) 10 parsecs away was known to better than (say) 5% accuracy.
(Can anybody out there in net.space-land confirm or refute this?)
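   For reference, the two yardsticks in question are the standard
formulas: d = 1/p for parallax, and the distance modulus
m - M = 5 log10(d / 10 pc) for magnitudes. A quick sketch (the
example values are made up):

def dist_from_parallax(p_arcsec):
    """Distance in parsecs from annual parallax in arcseconds."""
    return 1.0 / p_arcsec

def dist_from_magnitudes(m_app, M_abs):
    """Distance in parsecs from the distance modulus m - M = 5 log10(d/10)."""
    return 10.0 ** ((m_app - M_abs + 5.0) / 5.0)

print(dist_from_parallax(0.1))         # 0.1" of parallax -> 10 pc
print(dist_from_magnitudes(5.0, 1.0))  # m = 5, M = 1 -> ~63 pc

# The weak link: an error dM in the assumed absolute magnitude scales
# the derived distance by 10**(dM/5); even 0.1 mag is already ~4.7%.
print(10.0 ** (0.1 / 5.0) - 1.0)       # ~0.047

So a tenth of a magnitude of error in the assumed absolute magnitude
already gives nearly the 5% distance error guessed at above.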

Kieran A. Carroll
...decvax!utzoo!kcarroll

gdw (08/06/82)

#R:utzoo:-233800:harpo:11700001:000:290
harpo!gdw    Aug  6 16:21:00 1982

I thought that distances were measured using "red shift", viz.
the Doppler shift (toward longer wavelengths) of a star's spectrum
due to its velocity away from Earth, caused by the expanding
universe. This velocity divided by the Hubble constant gives the
distance of the star with good accuracy.
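
In formula form, the relation is d = v / H0, with v = cz for small
redshifts. A minimal sketch (the H0 value here is an assumption of
mine; estimates of it have varied a great deal):

C  = 2.998e5          # km/s, speed of light
H0 = 70.0             # km/s per Mpc (assumed value)

def dist_from_redshift(z):
    """Distance in megaparsecs from redshift z; low-z approximation."""
    return C * z / H0  # v = c*z, then d = v / H0

print(dist_from_redshift(0.01))   # z = 0.01 -> ~43 Mpc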