modperl_apache_speedup_via_compression

This is part of The Pile, a partial archive of some open source mailing lists and newsgroups.



Subject: RE: More Speed -> mod_perl Module for HTML Compression 
From: Geoffrey Young <gyoung@laserlink.net>
Date: Thu, 30 Nov 2000 13:36:06 -0500


Nigel Hamilton [mailto:nigel@e1mail.com] wrote:

> 	I'm trying to reduce the amount of data sent from server to
> browser by using compression ---> hopefully accelerating the time to
> serve a page.
> 
> 	Does anyone know of a mod_perl module that compresses HTML and a
> companion Javascript procedure that decompresses the data on the
> client-side?
> 
> 	I know there are Gzip modules that zip files on the way back to
> the browser ... but I'm after something that zips on the server and  
> decompresses transparently in Javascript across all browsers. 
> Ideally I
> want to do: document.write(uncompressed-contents) in Javascript on the
> client-side.
> 
> 	Has anyone come up with something for this?
> 
> 	Also for average-sized files, does the time taken to perform the
> decompression/compression negate any speed increase gained by 
> reduced file
> size?

there's mod_gzip, available from
http://www.remotecommunications.com/apache/mod_gzip/
which I've played with and looks pretty good

or Apache::Compress, available from CPAN, which also works rather nicely
(and is Apache::Filter ready, so you can chain PerlHandlers into it)

just beware that not all browsers that claim to accept gzip compression
actually do...
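
(For reference, a minimal httpd.conf sketch of the setups Geoff mentions; the
plain Apache::Compress hookup mirrors the configuration quoted later in this
thread, and the chained variant is only an illustration of Apache::Filter-style
stacking, assuming a Filter-aware upstream handler such as Apache::SSI.)

# compress static HTML with Apache::Compress
PerlModule Apache::Compress
<FilesMatch "\.(htm|html)$">
	SetHandler	perl-script
	PerlHandler	Apache::Compress
</FilesMatch>

# or stack a Filter-aware handler in front of it via Apache::Filter
<Files ~ "\.shtml$">
	SetHandler	perl-script
	PerlSetVar	Filter On
	PerlHandler	Apache::SSI Apache::Compress
</Files>

===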
Subject: Re: More Speed -> mod_perl Module for HTML Compression 
From: Robin Berjon <robin@knowscape.com>
Date: Thu, 30 Nov 2000 19:37:39 +0100

At 18:14 30/11/2000 +0000, Nigel Hamilton wrote:
>	Also for average-sized files, does the time taken to perform the
>decompression/compression negate any speed increase gained by reduced file
>size?

I don't have numbers to back this, but I'm ready to bet that a decompression
algorithm meant to run within a browser is going to be *slow*.

Why do you want to do this? I can't see any reason for wanting it
instead of using gzip encoding, which will be much faster and is a
well-proven method of sending large files faster.

===

Subject: Re: More Speed -> mod_perl Module for HTML Compression
From: Matt Sergeant <matt@sergeant.org>
Date: Thu, 30 Nov 2000 18:50:51 +0000 (GMT)

On Thu, 30 Nov 2000, Nigel Hamilton wrote:

> Hi,
> 	I'm trying to reduce the amount of data sent from server to
> browser by using compression ---> hopefully accelerating the time to
> serve a page.
>
> 	Does anyone know of a mod_perl module that compresses HTML and a
> companion Javascript procedure that decompresses the data on the
> client-side?
>
> 	I know there are Gzip modules that zip files on the way back to
> the browser ... but I'm after something that zips on the server and
> decompresses transparently in Javascript across all browsers. Ideally I
> want to do: document.write(uncompressed-contents) in Javascript on the
> client-side.
>
> 	Has anyone come up with something for this?

Nobody here would be mad enough to do this... Is it on an intranet? If
not, you'll never get me visiting your site - I don't enable javascript
generally.

> 	Also for average-sized files, does the time taken to perform the
> decompression/compression negate any speed increase gained by reduced file
> size?

I don't think so, but it probably depends a huge amount on the size of
your pipe and how many pages you're hoping to serve. For example, I'm on
a 64K pipe, so CPU isn't the limiting factor of what I can serve - the
pipe is. So I gzip and can serve more pages.
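
(To put rough numbers on that: a 64 kbit/s link moves at most about 8 KB/s, so
assuming, say, a 40 KB page and a typical 3:1 or 4:1 gzip ratio for HTML, the
uncompressed page needs roughly 5 seconds of wire time versus 1-2 seconds
compressed; the CPU cost of gzipping it is small by comparison.)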

===

Subject: RE: More Speed -> mod_perl Module for HTML Compression
From: Matt Sergeant <matt@sergeant.org>
Date: Thu, 30 Nov 2000 19:33:07 +0000 (GMT)

On Thu, 30 Nov 2000, Geoffrey Young wrote:

> there's mod_gzip, available from
> http://www.remotecommunications.com/apache/mod_gzip/
> which I've played with and looks pretty good
>
> or Apache::Compress, available from CPAN, which also works rather nicely
> (and is Apache::Filter ready, so you can chain PerlHandlers into it)
>
> just beware that not all browsers that claim to accept gzip compression
> actually do...

No, it's the other way around: not all browsers that can accept gzip send
out Accept-Encoding: gzip. Notably early versions of IE4.
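
(For illustration, the negotiation being discussed comes down to a check like
this in a mod_perl 1.x content handler; a minimal sketch with a made-up helper
name, not the code of any particular module:)

# decide whether this client advertised gzip support
sub client_accepts_gzip {
    my $r = shift;                                    # Apache request object
    my $accept = $r->header_in('Accept-Encoding') || '';
    return $accept =~ /\bgzip\b/i;                    # e.g. "Accept-Encoding: gzip, deflate"
}

As Matt notes, a check like this under-reports: early IE4 builds can handle
gzip but never send the header.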

===

Subject: RE: More Speed -> mod_perl Module for HTML Compression
From: Geoffrey Young <gyoung@laserlink.net>
Date: Thu, 30 Nov 2000 14:44:53 -0500

> -----Original Message-----
> From: Matt Sergeant [mailto:matt@sergeant.org]
> Sent: Thursday, November 30, 2000 2:33 PM
> To: Geoffrey Young
> Cc: 'Nigel Hamilton'; mod_perl list
> Subject: RE: More Speed -> mod_perl Module for HTML Compression
> 
> 
> > just beware that not all browsers that claim to accept gzip 
> compression
> > actually do...
> 
> No its the other way around. Not all browsers that can accept 
> gzip send
> out Accept-Encoding: gzip. Notably early versions of IE4.

I was basing that on discussions on the mod_gzip list and the following
(from the mod_gzip code)

     * 5. Many browsers ( such as Netscape 4.75 for UNIX ) are unable
     *    to handle Content-encoding only for specific kinds of HTML
     *    transactions such as Style Sheets even though the browser
     *    says it is HTTP 1.1 compliant and is suppying the standard
     *    'Accept-encoding: gzip' field. According to the IETF
     *    specifications any user-agent that says it can accept
     *    encodings should be able to do so for all types of HTML
     *    transactions but this is simply not the current reality.
     *    Some will, some won't... even if they say they can.

I don't have any first hand experience with it, though...
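
(A common defensive reaction to the behaviour described in that comment is to
compress only text/html responses and leave stylesheets, scripts and images
alone; a minimal sketch of such a guard for a mod_perl 1.x handler, not taken
from mod_gzip itself:)

# only consider compression for plain HTML responses
sub compressible {
    my $r = shift;
    my $type = $r->content_type || '';
    return $type =~ m{^text/html}i;
}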

===

Subject: Apache::Compress patch
From: Geoffrey Young <gyoung@laserlink.net>
Date: Thu, 14 Dec 2000 08:33:53 -0500

hi ken...


something has been bugging me in Apache::Compress for a while now - it
_always_ tries to compress output.

my thought here is that all filters in the pipeline should be able to return
DECLINED and have apache's defaults take over.  unfortunately, Compress
basically only uses the browser compatibility (and some minimal filehandle
checks) to determine its action, which is either to compress or not compress
(but it always involves a print in either case).

The reason this is important is that if there are two (or more) filters in
the pipeline and they all return DECLINED because a file is not found (which
is better than NOT_FOUND because later handlers may deal with the situation
of null input differently), then Compress tries to compress null and you get
a null document back instead of a 404.

also, this comment from mod_gzip has me thinking...

     * Default to 300 bytes as a minimum size requirement for it
     * to even be worth a compression attempt. This works well as a
     * minimum for both GZIP and ZLIB which are both LZ77 based and,
     * as such, always have the potential to actually increase the
     * size of the file.

here's a quick patch that solves both issues (well, we count characters
instead of bytes, but it's close enough, and 'use bytes' seemed to complain
about the uninitialized stuff even more than without it)

--- Compress.pm.old     Thu Dec 14 08:22:15 2000
+++ Compress.pm Thu Dec 14 08:21:52 2000
@@ -35,10 +35,13 @@
   return SERVER_ERROR unless $fh;
   
   if ($can_gzip) {
+    local $/;
+    local $^W;  # length() gives an uninitialized warning. hmmm...
+    my $file = <$fh>;
+    return DECLINED unless length($file) > 300;
     $r->content_encoding('gzip');
     $r->send_http_header;
-    local $/;
-    print Compress::Zlib::memGzip(<$fh>);
+    print Compress::Zlib::memGzip($file);
   } else {
     $r->send_http_header;
     $r->send_fd($fh);


BTW, I've been thinking of trying to integrate a perl interface into
mod_gzip (since all the compression code is contained within it and it has
lots of built-in intelligence), but since my C is poor I'm at a loss as to
where to start (can't use h2xs since it doesn't have a *.h file).  If anyone
is interested in guiding the C-challenged...

===
Subject: Apache::Compress + mod_proxy problem
From: Edward Moon <em@mooned.org>
Date: Fri, 15 Dec 2000 18:44:19 -0800 (PST)

I've run into a problem with Apache::Compress in dealing with mod_proxy'ed
content. The author of Apache::Compress suggested that I post the problem
here.

I'm running apache 1.3.14, mod_perl 1.24_01, & Apache::Compress 1.003 on a
RedHat 6.2 linux box.

I get an internal server error whenever I try to request a file ending
with .htm/.html on a proxied server.

Meaning:
www.company.com/local/index.html - OK, contents are compressed
www.company.com/proxy/           - OK, contents are NOT compressed
www.company.com/proxy/index.html - 500 error w/ Apache::Compress
www.company.com/proxy/script.pl  - OK, contents are NOT compressed

Apache::Compress tries to map the URI to the filesystem, ends up with
an undefined filehandle for content residing on the remote server, and
returns SERVER_ERROR.

I've modified Apache::Compress to special case proxy requests, but I
can't figure out how to do a redirect to mod_proxy from within
Apache::Compress.

Is this the best way to resolve this issue? I've been reading the Eagle
book and I'm guessing I could use a Translation Handler to handle this as
well.

Here are the relevant sections of httpd.conf:
PerlModule Apache::Compress
<FilesMatch "\.(htm|html)$">
	SetHandler	perl-script
	PerlHandler	Apache::Compress
</FilesMatch>

<IfModule mod_proxy.c>
	ProxyRequests	On
	ProxyVia Off
	ProxyPass		/proxy	http://internalserver/
	ProxyPassReverse	/proxy	http://internalserver/
</IfModule>
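
(For reference, the translation-handler route mentioned above would look
roughly like this; a sketch of the standard Eagle book proxy pattern, reusing
the /proxy prefix and internal host from the configuration above, and not the
fix the thread eventually settled on:)

package My::ProxyTrans;                       # hypothetical module name
use Apache::Constants qw(OK DECLINED);

sub handler {
    my $r = shift;
    return DECLINED unless $r->uri =~ m{^/proxy/(.*)$};
    my $url = "http://internalserver/$1";
    $r->proxyreq(1);                          # mark this as a proxy request
    $r->uri($url);
    $r->filename("proxy:$url");
    $r->handler('proxy-server');              # hand the request to mod_proxy
    return OK;
}
1;

# httpd.conf:  PerlTransHandler My::ProxyTrans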



===

Subject: RE: Apache::Compress patch
From: Geoffrey Young <gyoung@laserlink.net>
Date: Mon, 18 Dec 2000 15:31:27 -0500

> -----Original Message-----
> From: Geoffrey Young [mailto:gyoung@laserlink.net]
> Sent: Thursday, December 14, 2000 8:34 AM
> To: 'Ken Williams'
> Cc: 'modperl@apache.org'
> Subject: Apache::Compress patch
> 
> 
> hi ken...
> 
> 
> something has been bugging me in Apache::Compress for a while now - it
> _always_ tries to compress output.
> 
[snip]

whoops on that patch...  it didn't print filtered output that was less than
300 characters *doh*.  This should do the trick (against Compress.pm
1.003)...

 --Geoff

--- Compress.pm.old     Thu Dec 14 08:22:15 2000
+++ Compress.pm Mon Dec 18 15:29:26 2000
@@ -35,10 +35,23 @@
   return SERVER_ERROR unless $fh;
   
   if ($can_gzip) {
-    $r->content_encoding('gzip');
-    $r->send_http_header;
     local $/;
-    print Compress::Zlib::memGzip(<$fh>);
+    local $^W;  # length() gives an uninitialized warning. hmmm...
+    my $file = <$fh>;
+
+    my $length = length($file);
+
+    return DECLINED unless $length;
+
+    if ($length < 300) {
+      $r->send_http_header;
+      $r->print($file);
+    }
+    else {
+      $r->content_encoding('gzip');
+      $r->send_http_header;
+      print Compress::Zlib::memGzip($file);
+    }
   } else {
     $r->send_http_header;
     $r->send_fd($fh);



===

Subject: RE: Apache::Compress patch
From: "Ken Williams" <ken@forum.swarthmore.edu>
Date: Tue, 19 Dec 2000 22:40:14 -0600

Hi Geoff,

Does the following patch work just as well for you?  I'm trying to keep
using $r->send_fd if at all possible when we're not compressing.

===================================================================
--- Compress.pm 2000/11/05 05:36:46     1.3
+++ Compress.pm 2000/12/20 04:34:35
@@ -34,14 +34,24 @@
   }
   return SERVER_ERROR unless $fh;
   
+  my $buff;
+  unless (defined read($fh, $buff, 300)) {
+    $r->log_error("Can't read from filehandle '$fh': $!");
+    return SERVER_ERROR;
+  }
+
+  return DECLINED unless length $buff;
+  $can_gzip = 0 if eof($fh);
+
   if ($can_gzip) {
     $r->content_encoding('gzip');
     $r->send_http_header;
     local $/;
-    print Compress::Zlib::memGzip(<$fh>);
+    print Compress::Zlib::memGzip($buff . <$fh>);
   } else {
     $r->send_http_header;
-    $r->send_fd($fh);
+    print $buff;
+    $r->send_fd($fh) unless eof($fh);
   }
   
   return OK;
===================================================================

I'll trust the mod_gzip docs that 300 is a reasonable limit.

Thanks for the patch.


gyoung@laserlink.net (Geoffrey Young) wrote:
>> From: Geoffrey Young [mailto:gyoung@laserlink.net]
>> Sent: Thursday, December 14, 2000 8:34 AM
>> To: 'Ken Williams'
>> Cc: 'modperl@apache.org'
>> Subject: Apache::Compress patch
>> 
>> 
>> hi ken...
>> 
>> 
>> something has been bugging me in Apache::Compress for a while now - it
>> _always_ tries to compress output.
>> 
>[snip]
>
>whoops on that patch...  it didn't print filtered output that was less than
>300 characters *doh*.  This should do the trick (against Compress.pm
>1.003)...
>
>--Geoff
>
>--- Compress.pm.old     Thu Dec 14 08:22:15 2000
>+++ Compress.pm Mon Dec 18 15:29:26 2000
>@@ -35,10 +35,23 @@
>   return SERVER_ERROR unless $fh;
>   
>   if ($can_gzip) {
>-    $r->content_encoding('gzip');
>-    $r->send_http_header;
>     local $/;
>-    print Compress::Zlib::memGzip(<$fh>);
>+    local $^W;  # length() gives an uninitialized warning. hmmm...
>+    my $file = <$fh>;
>+
>+    my $length = length($file);
>+
>+    return DECLINED unless $length;
>+
>+    if ($length < 300) {
>+      $r->send_http_header;
>+      $r->print($file);
>+    }
>+    else {
>+      $r->content_encoding('gzip');
>+      $r->send_http_header;
>+      print Compress::Zlib::memGzip($file);
>+    }
>   } else {
>     $r->send_http_header;
>     $r->send_fd($fh);
>
>
>

===

Subject: [Solved]: Apache::Compress + mod_proxy problem
From: Edward Moon <em@mooned.org>
Date: Fri, 22 Dec 2000 18:33:44 -0800 (PST)

Here's a patch for Apache::Compress that passes off proxied requests to
mod_proxy.

Without this patch Apache::Compress will return an internal server error
since it can't find the proxied URI on the local filesystem.

Much of the patch was lifted from chapter 7 of the Eagle book.

Right now the code requires you to write proxy settings twice (i.e. in the
FilesMatch block for Apache::Compress and in the mod_proxy section):

<FilesMatch "\.(htm|html)$">
	SetHandler	perl-script
	PerlSetVar	PerlPassThru '/proxy/ => http://private.company.com/, /other/ => http://www.someother.co.uk'
	PerlHandler	Apache::Compress
</FilesMatch>

ProxyRequests On
ProxyPass		/proxy		http://private.company.com/
ProxyPassReverse	/proxy		http://private.company.com/
ProxyPass		/other		http://www.someother.co.uk/
ProxyPassReverse	/other		http://www.someother.co.uk/

34a35,49
> 
>   if ($r->proxyreq) {
>       use Apache::Proxy;
>       my $uri = $r->uri();
> 
>       my %mappings = split /\s*(?:,|=>)\s*/, $r->dir_config('PerlPassThru');
> 
>       for my $src (keys %mappings) {
> 	  next unless $uri =~ s/^$src/$mappings{$src}/;
>       }
> 
>       my $status = Apache::Proxy->pass($r, $uri);
>       return $status;
>   }
> 


===

To: Geoffrey Young <gyoung@laserlink.net>
From: Matt Sergeant <matt@sergeant.org>
Subject: RE: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 19:53:05 +0000 (GMT)

On Thu, 30 Nov 2000, Geoffrey Young wrote:

> > > just beware that not all browsers that claim to accept gzip
> > compression
> > > actually do...
> >
> > No its the other way around. Not all browsers that can accept
> > gzip send
> > out Accept-Encoding: gzip. Notably early versions of IE4.
>
> I was basing that on discussions on the mod_gzip list and the following
> (from the mod_gzip code)
>
>      * 5. Many browsers ( such as Netscape 4.75 for UNIX ) are unable
>      *    to handle Content-encoding only for specific kinds of HTML
>      *    transactions such as Style Sheets even though the browser
>      *    says it is HTTP 1.1 compliant and is suppying the standard
>      *    'Accept-encoding: gzip' field. According to the IETF
>      *    specifications any user-agent that says it can accept
>      *    encodings should be able to do so for all types of HTML
>      *    transactions but this is simply not the current reality.
>      *    Some will, some won't... even if they say they can.
>
> I don't have any first hand experience with it, though...

Yikes, that's really dumb. I guess it's both ways around, then...

<shameless_plug>
So really your best bet is to just use AxKit, which will compress just
your HTML content and won't handle your CSS files or anything else :-)
</shameless_plug>

===

To: "'Wiswell, Virginia'" <Virginia.Wiswell@dowjones.com>
From: Geoffrey Young <gyoung@laserlink.net>
Subject: RE: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 15:07:03 -0500

> -----Original Message-----
> From: Wiswell, Virginia [mailto:Virginia.Wiswell@dowjones.com]
> Sent: Thursday, November 30, 2000 2:52 PM
> To: 'Geoffrey Young'; 'Nigel Hamilton'; mod_perl list
> Subject: RE: More Speed -> mod_perl Module for HTML Compression 
> 
> 
>  
> > there's mod_gzip, available from
> > http://www.remotecommunications.com/apache/mod_gzip/
> > which I've played with and looks pretty good
> > 
> > or Apache::Compress, available from CPAN, which also works 
> > rather nicely
> > (and is Apache::Filter ready, so you can chain PerlHandlers into it)
> > 
> > just beware that not all browsers that claim to accept gzip 
> > compression
> > actually do...
> > 
> 
> geoff - is there any documentation as to which browsers will 
> or will not
> handle gzip compression?

I don't think so - from lurking around mod_gzip, I think the folks at Remote
Communications have a pretty good grip on what _really_ accepts compression
and what doesn't (or so I've gathered), but I don't think it's documented
anywhere authoritative (that I know about, at least)

a look at the mod_gzip code
(http://12.17.228.52/mod_gzip/src/1.3.14.3/mod_gzip.txt; search for 'Basic
sanity checks completed and we are still here') lists lots of
Accept-Encoding problems, but only mentions Netscape 4.75 on unix by name...

===

To: "'Matt Sergeant'" <matt@sergeant.org>,
From: "Paul G. Weiss" <PGWeiss@arity.com>
Subject: RE: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 15:22:21 -0500

Actually it's both, then.  I've had to hack up mod_gzip to
not send compressed data if the following is true:

	1.  The browser is Netscape
	2.  The URL is a javascript file (ends in .js).

Netscape sends Accept-Encoding: gzip for javascript files
and then doesn't know what to do with them.  
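
(The same exception, expressed as a guard one could put in a mod_perl handler
instead of in mod_gzip's C source; a sketch only, and the User-Agent test for
"is this Netscape" is an assumption rather than part of Paul's actual change:)

# don't gzip .js responses for Netscape, even though it sends
# Accept-Encoding: gzip when requesting them
sub suppress_gzip {
    my $r  = shift;
    my $ua = $r->header_in('User-Agent') || '';
    my $is_netscape = ($ua =~ /Mozilla/ && $ua !~ /MSIE/);   # rough Netscape test
    return $is_netscape && $r->uri =~ /\.js$/i;
}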

===

To: Nigel Hamilton <nigel@e1mail.com>
From: Joshua Chamas <joshua@chamas.com>
Subject: Re: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 12:23:35 -0800

Nigel Hamilton wrote:
> 
>         Does anyone know of a mod_perl module that compresses HTML and a
> companion Javascript procedure that decompresses the data on the
> client-side?
> 
>         I know there are Gzip modules that zip files on the way back to
> the browser ... but I'm after something that zips on the server and
> decompresses transparently in Javascript across all browsers. Ideally I
> want to do: document.write(uncompressed-contents) in Javascript on the
> client-side.
> 

To add to Matt's comments (and likely Ged would agree), you'll probably
find that gzip compression is better supported cross-browser than
any JavaScript you could come up with.  JavaScript breaks in lots of
ways all the time if you just look at IE4-IE5, NS4.0x-NS4.7x.  And then
look at them on NT/2000 vs. 95/98/ME, that'll really kill ya.

===

To: "mod_perl list" <modperl@apache.org>
From: "Dave Kaufman" <david@gigawatt.com>
Subject: Re: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 15:33:13 -0500

"Matt Sergeant" <matt@sergeant.org> wrote:

> On Thu, 30 Nov 2000, Geoffrey Young wrote:
>
> > just beware that not all browsers that claim to accept gzip compression
> > actually do...
>
> No its the other way around. Not all browsers that can accept gzip send
> out Accept-Encoding: gzip. Notably early versions of IE4.

Right, and in response to Nigel's assumption:

> ...I'm after something that zips on the server and
> decompresses transparently in Javascript across all browsers.

I believe you'll find that far more browsers will already transparently
decompress your server-gzipped content for you than you will find
JavaScript-enabled browsers that can successfully decompress the content
for you.

Another reason is that you can *detect* Accept-Encoding: gzip in a
browser's request headers (and even work around the IE versions that don't
send this by looking at their USER_AGENT headers) and know beforehand
whether gzipped content can be decoded by that browser or not, while there
is no such early-warning system to assure you that JavaScript will be
enabled.  So, since most modern browsers *already* decompress gzip on the
fly, why would you want to add all the necessary JS code (to the
content size) and then ask the JavaScript interpreter to do what the browser
already knows how to do?
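
(The User-Agent workaround mentioned here amounts to trusting certain IE
builds even when they omit the header; a sketch under that assumption, with
the version pattern purely illustrative since the thread never pins down
exactly which builds are safe:)

# client advertises gzip, or is an IE build believed to handle it anyway
sub client_handles_gzip {
    my $r      = shift;
    my $accept = $r->header_in('Accept-Encoding') || '';
    my $ua     = $r->header_in('User-Agent')      || '';
    return 1 if $accept =~ /\bgzip\b/i;
    return 1 if $ua =~ /MSIE [45]/;               # illustrative only
    return 0;
}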

===
To: Geoffrey Young <gyoung@laserlink.net>
From: "G.W. Haywood" <ged@www.jubileegroup.co.uk>
Subject: RE: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 20:40:50 +0000 (GMT)

Hi all,

On Thu, 30 Nov 2000, an assortment of correspondents wrote:

>>> beware that not all browsers that claim to accept gzip compression
>>> actually do...
>> 
>> No its the other way around. Not all browsers that can accept gzip send
>> out Accept-Encoding: gzip. Notably early versions of IE4.
> 
> I was basing that on discussions on the mod_gzip list and the following

I think it's safe to say that most browsers (mentioning no two in
particular:) do exactly what they're supposed to do very rarely.

Especially if JavaScript is involved.

===

To: mod_perl list <modperl@apache.org>
From: Jim Winstead <jimw@trainedmonkey.com>
Subject: Re: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 15:44:09 -0500

On Nov 30, Geoffrey Young wrote:
> I was basing that on discussions on the mod_gzip list and the following
> (from the mod_gzip code)
> 
>      * 5. Many browsers ( such as Netscape 4.75 for UNIX ) are unable
>      *    to handle Content-encoding only for specific kinds of HTML
>      *    transactions such as Style Sheets even though the browser
>      *    says it is HTTP 1.1 compliant and is suppying the standard
>      *    'Accept-encoding: gzip' field. According to the IETF
>      *    specifications any user-agent that says it can accept
>      *    encodings should be able to do so for all types of HTML
>      *    transactions but this is simply not the current reality.
>      *    Some will, some won't... even if they say they can.
> 
> I don't have any first hand experience with it, though...

i don't have any first-hand experience with it either (and don't
doubt at all that there are browser bugs in the implementations),
but the language of that comment is atrocious. there's no such
thing as an "html transaction". all the http/1.1 rfc (2616) has to
say on the matter is that if the browser sends an accept-encoding
header that lists an encoding type (such as gzip) with a non-zero
q-value, the server may send the response using that content-encoding.
it doesn't matter what type of data is being served (the server
could gzip gif images if it really wanted).

nothing beats just having a reasonable test environment to be able
to test the major browsers against your site. with something like
vmware, you can even use a single box to act as most of your
platforms. (you could probably even do better by having a macintosh
and using one of the virtual-intel-pc applications available for
it.)

===

To: "Nigel Hamilton" <nigel@e1mail.com>
From: "Ken Williams" <ken@forum.swarthmore.edu>
Subject: Re: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 15:37:11 -0600

nigel@e1mail.com (Nigel Hamilton) wrote:
>	I'm trying to reduce the amount of data sent from server to
>browser by using compression ---> hopefully accelerating the time to
>serve a page.
>
>	Does anyone know of a mod_perl module that compresses HTML and a
>companion Javascript procedure that decompresses the data on the
>client-side?
>
>	I know there are Gzip modules that zip files on the way back to
>the browser ... but I'm after something that zips on the server and  
>decompresses transparently in Javascript across all browsers. Ideally I
>want to do: document.write(uncompressed-contents) in Javascript on the
>client-side.

I think you've got a slight misconception about how gzip HTTP
compression works.  It's perfectly transparent, in that browsers that
support compression will decompress the file automatically, and the user
will never know that the page was compressed in the first place.  That's
much smoother than the javascript decompression you propose, which I
can't help thinking will turn into a real headache, perhaps even a
nightmare.

In particular, it seems like you think that users have to manually
decompress gzipped content, but that's not the case.  Just thought I'd
point that out in case that was the confusion.

mod_gzip, Apache::Compress, or Apache::Gzip are solutions here.
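
(To make the transparency concrete: the whole exchange is visible only at the
HTTP level, roughly like this, with the hostname and body invented for
illustration.)

GET /index.html HTTP/1.1
Host: www.example.com
Accept-Encoding: gzip

HTTP/1.1 200 OK
Content-Type: text/html
Content-Encoding: gzip

The browser inflates the body before rendering it; nothing in the page, and
nothing the user does, refers to the compression.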

===

To: "'Matt Sergeant'" <matt@sergeant.org>,
From: "Paul G. Weiss" <PGWeiss@arity.com>
Subject: RE: More Speed -> mod_perl Module for HTML Compression
Date: Thu, 30 Nov 2000 16:48:11 -0500

While we're on the subject.

IE4 sends out Accept-Encoding: gzip for both
html and javascript requests.

However,

it will not understand any compressed javascript files
until it has first loaded a compressed html file.

Once it has loaded a compressed html file, it can then
successfully load a non-compressed html file that 
references a compressed javascript file.

Clear as mud?

===

To: "Ken Williams" <ken@forum.swarthmore.edu>
From: merlyn@stonehenge.com (Randal L. Schwartz)
Subject: Re: More Speed -> mod_perl Module for HTML Compression
Date: 30 Nov 2000 14:37:32 -0800

>>>>> "Ken" == Ken Williams <ken@forum.swarthmore.edu> writes:

Ken> In particular, it seems like you think that users have to manually
Ken> decompress gzipped content, but that's not the case.  Just thought I'd
Ken> state it if that was the confusion.

Ken> mod_gzip, Apache::Compress, or Apache::Gzip are solutions here.

Or even my cool compressing pre-forking tiny proxy at
  <http://www.stonehenge.com/merlyn/WebTechniques/col34.html>

Neatly proxies, but sends compressed text across slow links if
the browser understands that.

===

