otto at ottodestruct.com
Tue Sep 18 19:23:38 GMT 2007
On 9/18/07, Omry Yadan <omry at yadan.net> wrote:
> the uncompressed file?
> this depends on the client machine speed and on its network speed.
While true, I think it's pretty obvious that, for most of the JS code we
have in WordPress, executing the interpreted unpacking code costs more
than just downloading the uncompressed file.
Take Prototype, for example. It's about 69 kb. Packing it gives you
something like 34 kb. While a 50% reduction is nice, it's still only a 34k
difference. With any high speed connection, you're talking less than a
second. It's only a few extra packets over an already setup TCP
connection. And it's cached that way as well, eliminating any
unpacking evaluation time.
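To put rough numbers on "less than a second," here is a back-of-the-envelope sketch of how long 34 KB takes to transfer at a few assumed link speeds (the rates are illustrative, not measurements, and real transfers add latency and TCP effects):

```python
# Hypothetical arithmetic: time to move the ~34 KB saved by packing
# over links of various assumed speeds. Ignores latency and TCP slow
# start, so these are lower bounds on real-world transfer time.

def transfer_seconds(size_bytes: float, link_mbps: float) -> float:
    """Seconds to move size_bytes over a link running at link_mbps megabits/s."""
    return (size_bytes * 8) / (link_mbps * 1_000_000)

saved = 34 * 1024  # the ~34 KB difference between packed and unpacked
for mbps in (1, 8, 100):  # assumed "high speed" connection rates
    print(f"{mbps:>3} Mbit/s: {transfer_seconds(saved, mbps):.3f} s")
```

Even on a modest 1 Mbit/s link the saved bytes account for well under a third of a second, which is the point being made above.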
Now, as your JS gets much, much larger, sure, it's going to shift the other way.
On the other hand, simple gzip compression reduces Prototype from 69k
down to like 6k. And that does make a noticeable difference in both
bandwidth consumption as well as rendering speed. The impact to the
browsers is minimized simply because the decompression phase is built
into the browser itself, so the overhead is minimal.
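You can see why gzip does so well on JavaScript with the standard gzip module: source code is highly repetitive, and gzip exploits that. The sample text below is a stand-in, not actual Prototype source, so the exact ratio will differ file to file:

```python
import gzip

# Stand-in for a JS library: repetitive source text, as real JS tends
# to be (repeated keywords, identifiers, and idioms). Not real Prototype.
sample_js = ("function extend(dst, src) { for (var k in src) "
             "{ dst[k] = src[k]; } return dst; }\n") * 500

raw = sample_js.encode("utf-8")
packed = gzip.compress(raw)
print(f"raw: {len(raw)} bytes, gzipped: {len(packed)} bytes "
      f"({len(packed) / len(raw):.1%} of original)")
```

On a real library the ratio is less extreme than on this artificial sample, but JS routinely gzips to a tenth of its original size, which matches the 69k-to-6k figure above.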
On the other hand, I do see what's being said regarding combining JS
files into one, however I'm uncertain if that helps enough to make any
real difference. A few extra requests combined with more browser
caching when they're separated might make a bigger difference. Too
often the "web 2.0" development type people only look at total page
load size and ignore the effects of smart browser caching.
> 3. I think the greatest benefit will not come from reducing the size of
Total agreement, and this actually argues against JS-packing and
combining JS files as is being discussed. The enqueue script is the
way to go with this stuff.