[comp.graphics] Comp.graphics.images

aoki@faerie.Berkeley.EDU (Paul M. Aoki) (06/17/89)

Sorry for taking so long to get back to the group on this.
(An even bigger "sorry" to the people whose mail I didn't 
respond to.  It's been MS-thesis-completion time 'round these
here parts .. )  Anyway, the mail and group response to the
suggestion of comp.graphics.images was entirely positive 
(if somewhat guardedly so in one case) so I think it would
be worthwhile to give the group-creation process a shot.

I think it would be better to try the formal call-for-discussion
and call-for-votes around Aug/Sep, when the undergrads come 
back -- the net always seems to slow down a bit during summertime.
Well, ok, I have another motive for delaying: I'm moving this 
summer and I probably won't be really settled in San Diego until 
then.  If anyone thinks the delay is pointless, fire away in 
news.groups (it's a free country; I'll even send you a sample CFD 
if you like).
--
Paul M. Aoki		aoki@postgres.Berkeley.EDU	     uunet!ucbvax!aoki

joe@vixen.uucp (Joe Hitchens) (11/10/89)

tmb@wheaties.ai.mit.edu (Thomas M. Breuel) writes:

> I think it is wrong to mix a newsgroup for program sources and tools
> with a newsgroup for images.

I agree with this.


> Furthermore, I fear that USENET doesn't have the bandwidth to transmit
> images.

This would be an argument for moderation.  The moderator could filter
out images that are not very good or interesting.  But then we have
to trust the tastes of the moderator.


> So, in summary, a "YES" vote for a source newsgroup devoted to tools
> for creating, converting, and manipulating images, and a strong "NO"
> against a newsgroup for distributing grey level or color images.

How about this for a really keen idea ...

  2 newsgroups.
  One for image conversion, utilities whatever.
  One for actual images.
  The "images" group is moderated.  Moderator collects all the images,
  algorithmically shrinks them to a standard 64 x 64 pixel iconized
  version of the original image and posts THAT.
  Then if someone gets a tiny pic that looks interesting, they mail
  the moderator and he sends the ACTUAL image.

Does this help the bandwidth problem or would every one get into a
mad rush of "mailing for pictures" and end up making the problem worse?
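The iconizing step in Joe's scheme is straightforward to sketch. The function below is purely illustrative (the name, the 64x64 default, and the plain grayscale list-of-rows representation are all assumptions, not anything from an actual proposal); it shrinks an image by box averaging:

```python
def iconize(pixels, w, h, side=64):
    """Shrink a w x h grayscale image (list of rows) to side x side by box averaging."""
    icon = []
    for oy in range(side):
        row = []
        for ox in range(side):
            # Source block of pixels covered by this output pixel
            x0 = ox * w // side
            x1 = max(x0 + 1, (ox + 1) * w // side)
            y0 = oy * h // side
            y1 = max(y0 + 1, (oy + 1) * h // side)
            total = sum(pixels[y][x] for y in range(y0, y1) for x in range(x0, x1))
            row.append(total // ((x1 - x0) * (y1 - y0)))
        icon.append(row)
    return icon
```

A 128x128 input, for instance, averages each 2x2 block into one output pixel.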
I am a computer artist and would really like to distribute some of the
things I have done.  I like the idea of an "images" newsgroup,
but don't really feel qualified to make an intelligent YES or NO vote
on it.
I would vote "YES" for the utilities group however.

j.h.

-- 
___________________________________________________________________________
Joe Hitchens -- Artist, Sculptor, Animator of Sculpture, Iconographer Adept
joe@vixen  ...!uunet!iconsys!caeco!vixen!joe         Phone: (801) 292-2190

twheeler@jarthur.Claremont.EDU (Theodore Wheeler) (11/11/89)

Would a group called alt.images, or something to that extent, work?
I believe that this would make it so that sites receiving
comp.(everything) would not receive it by default. This way people
with no anonymous FTP access could have access to the images.

                              -T.J. Wheeler

davidbe@sco.COM (The Cat in the Hat) (11/14/89)

news.groups's own joe@vixen.UUCP (Joe Hitchens) said:
-tmb@wheaties.ai.mit.edu (Thomas M. Breuel) writes:
-
-> I think it is wrong to mix a newsgroup for program sources and tools
-> with a newsgroup for images.
-
-I agree with this.

Likewise.

-> Furthermore, I fear that USENET doesn't have the bandwidth to transmit
-> images.
-
-This would be an argument for moderation.  The moderator could filter
-out images that are not very good or interesting.  But then we have
-to trust the tastes of the moderator.

It's the best argument for moderation.  Are you volunteering?

-How about this for a really keen idea ...
-
-  2 newsgroups.
-  One for image conversion, utilities whatever.
-  One for actual images.

How about this.  Comp.graphics is used for utilities.  Comp.graphics.images
is used for the pretty pictures.

Comp.graphics.images is MODERATED (by someone who isn't me).  Everyone lives
happily ever after.

-- 
        David Bedno aka davidbe@sco.COM: Speaking from but not for SCO.

		   The keyboard's been drinking, not me.

mehl@atanasoff.cs.iastate.edu (Mark M Mehl) (11/16/89)

davidbe@sco.COM (The Cat in the Hat) writes:
|news.groups's own joe@vixen.UUCP (Joe Hitchens) said:
|-tmb@wheaties.ai.mit.edu (Thomas M. Breuel) writes:
|-> I think it is wrong to mix a newsgroup for program sources and tools
|-> with a newsgroup for images.

|-I agree with this.

|Likewise.

Likewise again.

|-> Furthermore, I fear that USENET doesn't have the bandwidth to transmit
|-> images.

Yes.  Please appreciate that someone else is paying the phone bill to
transmit all these images to thousands of machines.  Earlier, someone
suggested simply setting up an archive that others can FTP from.  This is
certainly the most sensible solution, particularly for large images.

I appreciate that not all sites can FTP (although they should be able
to).  These sites should get a friend (on the Internet) to simply
e-mail the files directly to them.  Also, if these sites simply dialed
directly into the archive and used Kermit to download what they wanted,
that would be better than posting these sources all over the world.

|-How about 2 newsgroups.
|- One for image conversion, utilities whatever; one for actual images.

|How about this.  Comp.graphics is used for utilities.  Comp.graphics.images
|is used for the pretty pictures.

If you're proposing a source group, put "sources" or "binaries" in the
group name so everybody knows what it is.  Isn't this the obvious thing
to do?  Something like:

   comp.binaries.graphics    --or--    comp.graphics.binaries

A "source" group by any other name would simply be misleading to
potential users.
-- 
 /\ Mark M Mehl, alias Superticker (Supertickler to some)
<><> Internet: mehl@atanasoff.cs.IAstate.edu
 \/ UUCP: {{mailrus,umix}!sharkey,hplabs!hp-lsd,uunet}!atanasoff!mehl
Disclaimer: You got to be kidding; who would want to claim anything I said?

sg04@GTE.COM (Steven Gutfreund) (11/17/89)

I have only seen the tail end of this discussion, so I apologize if this
question has been previously answered:

What sort of compression will be used for these images?  Also, are there
better publicly available compression programs than unix COMPRESS (Lempel-Ziv)?
-- 
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Yechezkal Shimon Gutfreund		 		  sgutfreund@gte.com
GTE Laboratories, Waltham MA			    harvard!bunny!sgutfreund
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

bio_zwbb@jhunix.HCF.JHU.EDU (William Busa) (11/17/89)

	It would seem that the most serious objection to formation of
comp.graphics.images is the rather pervasive fear that such a group would
simply overwhelm the Net's bandwidth. I share this fear, and therefore at
present I oppose formation of this group. Still, it would be nice if we
had some data, however crude, on which to base our judgement. Therefore, I
suggest that the originator(s) of this proposal post a survey in the
relevant newsgroups concerning the level of use such a group would see. I
envision questions such as "How often would you post to this group?" "How
many images per posting?" "How large?" "What distribution?"

	Now, one can fault such a survey on any number of grounds, and it
would be wrong to take the results as anything but the roughest sketch of
users' present estimation of their intentions, but even as such I think it
would provide useful information. My primary concern is to know whether
such a group would cause a 1% increase in Net traffic, a 10% increase, or
100% or 1000%??? I think a poll would (at best) give us an
order-of-magnitude estimate. Statisticians please save your flames; I
recognize and readily acknowledge that this poll would have no statistical
significance.
-- 
Dr. William Busa, Dept. of Biology, The Johns Hopkins University, Charles
& 34th Sts., Baltimore, MD 21218              (301) 338-8207

bio_zwbb@jhunix.hcf.jhu.edu                 uunet!mimsy!jhunix!bio_zwbb

jarnot@canisius.UUCP (Repo Man) (11/17/89)

In article <7821@bunny.GTE.COM>, sg04@GTE.COM (Steven Gutfreund) writes:
> What sort of compression will be used for these images?  Also, are there
> better publicly available compression programs than unix COMPRESS (Lempel-Ziv)?

If the images are GIF pictures, they are already compressed, simply
because they ARE GIF pictures: the image data is compressed with a
Lempel-Ziv-type method.  I'm not sure how much space will be saved
by running compress on the uuencoded image.


-- 
  ...!{decvax|watmath|allegra|rocksvax}!sunybcs!canisius!jarnot    
                      jarnot@klaatu.cs.canisius.edu     
Canisius College:  the small Buffalo college that KNOWS it's small, but
		     still has a big ego anyways... 

brad@looking.on.ca (Brad Templeton) (11/19/89)

There is data to support the fear of being overwhelmed.  Just look at
the download libraries on many BBSs and commercial services like
Compuserve.  Many are chock full of GIF files (mostly, it seems, nude
women and soft-core) to the point of surpassing PD binaries.
-- 
Brad Templeton, ClariNet Communications Corp. -- Waterloo, Ontario 519/884-7473

mherman@alias.UUCP (Michael Herman) (11/20/89)

I think anyone can guess that there are a lot of images waiting to be
sent when (and if) comp.graphics.images is finally formed.

I think the group may need a "modulator" more than a moderator for its
first 6 or so months of operation - to act as a throttle and to prevent
duplication.  After that, just cut it loose.

For statistics purposes, I have 28 GIF images of all sorts of
things.  All are from various BBSs.  They total 2.5 million bytes - I
would be confident quoting an average of 100K bytes per image (a little
high, but not much).

Perhaps someone can look at the USENET stats and pick a monthly budget
for the group (i.e. what is a reasonable number of 100K byte images per
month to ship over USENET?).

Maybe we should just have an archive somewhere and the newsgroup is
only used for posting the most popular/interesting/requested image of
the day/week/month?

... just some of my thoughts.

greg@sj.ate.slb.com (Greg Wageman) (11/21/89)

Opinions expressed are the responsibility of the author.

In article <2586@canisius.UUCP> jarnot@canisius.UUCP (Repo Man) writes:
>In article <7821@bunny.GTE.COM>, sg04@GTE.COM (Steven Gutfreund) writes:
>> What sort of compression will be used for these images?  Also, are there
>> better publicly available compression programs than unix COMPRESS (Lempel-Ziv)?
>
>If the images are GIF pictures, they are already compressed due to the fact
>that they ARE GIF pictures.  The image data is compressed with a Lempel-
>Ziv type compression method.  I'm not sure how much space will be saved
>from running compress on the uuencoded image.

Here are some examples:

Size-----v
        28744 Nov  1 10:56 fall.gif    (Pure GIF)
        39631 Nov 20 17:07 fall.uue    (Uuencoded GIF)
        35921 Nov 20 17:07 fall.uue.Z  (Uuencoded, compressed)

        20776 Oct 10 10:23 vase.gif
        28653 Nov 20 16:59 vase.uue
        26153 Nov 20 17:00 vase.uue.Z

The uuencoded and compressed versions of the GIF images, while smaller
than the uncompressed, uuencoded versions, are still about 25% larger
than the pure GIF data.

These numbers will, of course, vary from image to image.
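For reference, the percentage overhead in the listing above can be recomputed directly from the posted sizes:

```python
# Greg's measured sizes: (pure GIF, uuencoded-then-compressed), in bytes
sizes = {"fall.gif": (28744, 35921), "vase.gif": (20776, 26153)}

for name, (gif, uue_z) in sizes.items():
    overhead = (uue_z / gif - 1) * 100
    print(f"{name}: {overhead:.0f}% larger than the raw GIF")  # ~25-26%
```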

Copyright 1989 Greg Wageman	DOMAIN: greg@sj.ate.slb.com
Schlumberger Technologies	UUCP:   {uunet,decwrl,amdahl}!sjsca4!greg
San Jose, CA 95110-1397		BIX: gwage  CIS: 74016,352  GEnie: G.WAGEMAN
        Permission granted for not-for-profit reproduction only.

dave@imax.com (Dave Martindale) (11/22/89)

In article <635@alias.UUCP> mherman@alias.UUCP (Michael Herman) writes:
>
>For statistics purposes, I have 28 GIF of images of all sorts of
>things.  All are from various BBSs.  They total 2.5 million bytes - I
>would be confident quoting an average of 100K bytes per image (a little
>high but not much).

What are the characteristics of the "average" image that might be
posted to comp.graphics.images (or equivalent)?  What resolution, and
how many bits/pixel?  Should there be some agreed-upon upper limit on
these values?

For example, how about sending 1024x768 24-bit images, which are 2.3 MB
each?  Even 8-bit images of that size are still 790 KB.

If we go to TV resolution (640x480), images are 300 KB for 8-bit and
900 KB for 24-bit.  Are these still too large?  Only when images are
limited to about 256 squared and 8 bits per pixel do you get under the
100 KB barrier.
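The arithmetic behind these figures is easy to reproduce; it is just width x height x depth, converted from bits to bytes:

```python
def raw_size(width, height, bits_per_pixel):
    """Uncompressed image size in bytes."""
    return width * height * bits_per_pixel // 8

print(raw_size(1024, 768, 24))  # 2359296 bytes, the ~2.3 MB figure
print(raw_size(1024, 768, 8))   # 786432 bytes, the ~790 KB figure
print(raw_size(640, 480, 8))    # 307200 bytes, ~300 KB at TV resolution
print(raw_size(640, 480, 24))   # 921600 bytes, ~900 KB
print(raw_size(256, 256, 8))    # 65536 bytes: under the 100 KB mark
```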

Of course, you can't transmit 8-bit raw data over Usenet; it has to be
encoded, which increases its size by 34% or so.  You can save some
space with some images by various encoding schemes, but then everyone
has to have software to decode that format, and no compression schemes
do very well on scanned images, which have a lot of noise.

Also, the compression used in sending news batches typically gets 50%
compression on the text files that make up news articles, but does
considerably worse on image files.  Thus, a 100 KB image file could
cost up to twice as much money to transmit as a 100 KB text article.

Ultimately, there needs to be a consensus on how big an image can be
before the cost of sending it outweighs the benefit of distributing
it.  What is this size?

My own opinion is that sending images larger than 50-100 KB or so is
using too much bandwidth for the potential usefulness of the images.

dnwiebe@CIS.OHIO-STATE.EDU (Dan N Wiebe) (11/22/89)

------
The uuencoded and compressed versions of the GIF images, while smaller
than the uncompressed, uuencoded versions, are still about 25% larger
than the pure GIF data.
------

	This is probably just because I'm dense, or something, but what
good is a compressed uuencoded file?  Uuencode converts an 8-bit file
to a 7-bit file so that it can be sent through mail or some other ASCII
service, right?  Since this makes a file both larger *and* unusable
(you have to uudecode it first), it would seem to me that the only reason
for the existence of a uuencoded file would be transmission, and once that
was done, it would be uudecoded to shrink it back down and make it useful.
Unix compress produces an 8-bit file, doesn't it?  So using compress on
a uuencoded file, it would seem, defeats the one and only purpose of
uuencoding it in the first place.
	It looks to me like a more profitable route to compress and *then*
uuencode, producing file.gif.Z.uue instead of file.gif.uue.Z, but even
this isn't too sensible because compress can rarely make a .GIF file
any smaller.
	Thanks for all the forthcoming explanations...

Dan Wiebe
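Dan's ordering question can be probed with a quick experiment. The sketch below uses Python's zlib and base64 as stand-ins for compress and uuencode (an assumption: the real 1989 compress is LZW, not deflate, and uuencode's line format differs slightly from base64, so exact numbers will differ), with random bytes standing in for already-LZW-packed GIF data:

```python
import base64
import random
import zlib

random.seed(0)
# Already-compressed image data looks essentially random to a second compressor
gif_like = bytes(random.randrange(256) for _ in range(45000))

# Encoding for 7-bit transport expands the data by about a third
encoded = base64.b64encode(gif_like)
expansion = len(encoded) / len(gif_like)  # exactly 4/3 here

# Compressing already-compressed data gains essentially nothing
recompressed = zlib.compress(gif_like, 9)

print(expansion, len(recompressed) / len(gif_like))
```

This illustrates both halves of the thread's observation: the transport encoding is pure overhead, and a second compression pass can't claw much of it back from data that is already dense.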

jordan@Morgan.COM (Jordan Hayes) (11/22/89)

Dan N Wiebe <dnwiebe@CIS.OHIO-STATE.EDU> writes:

	This is probably just because I'm dense, or something, but what
	good is a compressed uuencoded file?

Right you are; however, notice the following:

% compress -v fall.gif
fall.gif: Compression: -43.30% -- file unchanged

Since GIF is a compression technique itself, Lempel-Ziv (or most other
compression techniques -- !jaw, are you out there?) won't be of much
help.

btoa, rather than uuencode, will do a slightly better job (127% vs. 138%
in the fall.gif example) ...
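Jordan's two percentages fall straight out of the line formats. The sketch below assumes uuencode's standard 45-byte input lines and a btoa output line length of around 78 characters (the btoa figure is an assumption; implementations vary):

```python
# uuencode: each 45 input bytes -> 1 length char + 60 data chars + newline = 62 chars
uu_ratio = 62 / 45

# btoa: each 4 input bytes -> 5 output chars, plus roughly one newline per 78 chars
btoa_ratio = (5 / 4) * (79 / 78)

print(round(uu_ratio * 100), round(btoa_ratio * 100))  # 138 and 127, as quoted
```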

/jordan

greg@sj.ate.slb.com (Greg Wageman) (11/28/89)

Opinions expressed are the responsibility of the author.

In article <8911212014.AA06822@ironwood.cis.ohio-state.edu> dnwiebe@CIS.OHIO-STATE.EDU (Dan N Wiebe) writes:
>------
(I wrote:)
>The uuencoded and compressed versions of the GIF images, while smaller
>than the uncompressed, uuencoded versions, are still about 25% larger
>than the pure GIF data.
>------
>
>	This is probably just because I'm dense, or something, but what
>good is a compressed uuencoded file?  Uuencode converts an 8-bit file
>to a 7-bit file so that it can be sent through mail or some other ASCII
>service, right?  Since this makes a file both larger *and* unusable
>(you have to uudecode it first), it would seem to me that the only reason
>for the existence of a uuencoded file would be transmission, and once that
>was done, it would be uudecoded to shrink it back down and make it useful.

Well, yes and no.  A GIF file would, as you say, be uuencoded and
posted uncompressed.  However, the news software routinely compresses
news articles before transmission.  The purpose of the experiment was
to see if the compression would make up for the loss due to
uuencoding, and the answer seems to be "no".

This means that the space required for posting binary images to a
newsgroup is greater than that required for binary FTP access, even
allowing for transmission-time compression.


Copyright 1989 Greg Wageman	DOMAIN: greg@sj.ate.slb.com
Schlumberger Technologies	UUCP:   {uunet,decwrl,amdahl}!sjsca4!greg
San Jose, CA 95110-1397		BIX: gwage  CIS: 74016,352  GEnie: G.WAGEMAN
        Permission granted for not-for-profit reproduction only.

jallen@netxdev.DHL.COM (John Allen) (12/07/89)

In article <8911212014.AA06822@ironwood.cis.ohio-state.edu> dnwiebe@CIS.OHIO-STATE.EDU (Dan N Wiebe) writes:
>------
>The uuencoded and compressed versions of the GIF images, while smaller
>than the uncompressed, uuencoded versions, are still about 25% larger
>than the pure GIF data.
>------
>	It looks to me like a more profitable route to compress and *then*
>uuencode, producing file.gif.Z.uue instead of file.gif.uue.Z, but even
>this isn't too sensible because compress can rarely make a .GIF file
>any smaller.

The GIF standard uses a slightly modified 12-bit Lempel-Ziv algorithm
to compress the image data.  Since compress is also Lempel-Ziv, the size
of a GIF is about equivalent to a 12-bit compress.

For those familiar with Lempel-Ziv, I have found that a typical
320x256x8-bit image will have something between three and five RESETs.
Since LZW compression gets the greatest "mileage" at the end of the
compressed stream, each RESET is very expensive to the overall file
size.  So any gain obtained from compressing a GIF file will come
from full 16-bit compression.

Perhaps it would be worthwhile to lobby for an enhancement to the GIF
standard which permits 16 bit LZW compression.  If GIF did use 16 bit
compression, the result should be much smaller.
=============================================================================
John Allen, NetExpress Communications, Inc.    usenet: jallen@netxcom.DHL.COM
1953 Gallows Road, Suite 300                   phone:  (703) 749-2238
Vienna, Virginia, 22182                        telex:  901 976

mjs@cbnews.ATT.COM (martin.j.shannon) (12/08/89)

In article <2045@netxcom.DHL.COM> jallen@netxdev.UUCP (John Allen) writes:
>The GIF standard uses a slightly modified 12 bit Lempel-Ziv algorithm
>to compress the image data.  Since compress is also Lempel-Ziv, the size
>of a GIF is about equivalent to a 12 bit compress.

Plus the header information, plus the palette, neither of which is
compressed at all.

>So any gain obtained from compressing a GIF file will come
>from full 16 bit compression.

Certainly many images would gain from using 16-bit compression, but
doesn't that pose some serious memory usage problems for 8086-based
machines (and other small-address-space machines, too)?  I do all my
GIF viewing & manipulating on a '386 running SVR3.2, so moving up to
16-bit compress doesn't bother *me*, but is it a good idea to make it
so difficult for the small machines?
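Marty's memory worry can be quantified. A minimal LZW string table holds one (prefix code, appended byte) entry per code; at 16-bit codes the prefix itself needs two bytes, so the table alone outgrows a single 64 KB segment on an 8086. The entry layout here is an assumption for the sketch; real implementations vary:

```python
def lzw_table_bytes(code_bits, prefix_bytes=2, suffix_bytes=1):
    """Rough size of an LZW string table: one (prefix code, appended byte) per code."""
    return (2 ** code_bits) * (prefix_bytes + suffix_bytes)

print(lzw_table_bytes(12))  # 12288 bytes: fits easily in one 64 KB segment
print(lzw_table_bytes(16))  # 196608 bytes: three full 64 KB segments for the table alone
```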

>Perhaps it would be worthwhile to lobby for an enhancement to the GIF
>standard which permits 16 bit LZW compression.  If GIF did use 16 bit
>compression, the result should be much smaller.

This is probably a very good idea, but there are a few other things
that really want to be added to the specification, as well.  Among them
are: a comment/copyright block (arbitrary text that *must* be displayed
when the picture is viewed); aspect ratio indicator; an RLE mode (many
of the pictures I've seen would compress much smaller if they were 1st
run through an RLE encoder); there are others.  I made a list about a
year ago, when I first figgered out how to read a GIF file, but I don't
have it handy -- I'll try to dig it up & post it.
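As an illustration of the RLE mode Marty suggests, a byte-oriented run-length encoder can be this simple. The (count, value) pair format below is invented for the sketch, not anything from the GIF specification:

```python
def rle_encode(data: bytes) -> bytes:
    """Run-length encode as (count, value) byte pairs, with runs capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes((run, data[i]))
        i += run
    return bytes(out)
```

On images with large flat areas (the kind Marty describes), long runs collapse to two bytes each; on noisy scans it would do the opposite, which is why it belongs in the format as an optional mode rather than a replacement for LZW.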


-- 
Marty Shannon; AT&T Bell Labs; Liberty Corner, NJ, USA
(Affiliation is given for identification only:
I don't speak for them; they don't speak for me.)