Javascript Compression

So what’s the best way to compress JavaScript code? I found a couple of online services that didn’t work, and this Perl script, which seemed nice but was perhaps a little overzealous, since the resulting code errored out.

32 thoughts on “Javascript Compression”

  1. Not file compression, code compression. Shortening variables and functions, removing unnecessary whitespace, et cetera.
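
    To make the distinction concrete, here is a tiny invented before-and-after (the function is made up for the example):

        // before: readable source
        function calculateTotal(itemPrice, quantity) {
            return itemPrice * quantity;
        }

        // after code compression: same behaviour, fewer bytes
        function c(a,b){return a*b}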

  2. Also serving CSS and JS through gzip can cause weird errors, which is why they’re off by default when using mod_gzip.
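
    For reference, a minimal sketch of turning them back on in a mod_gzip 1.3 configuration (directive spellings from memory, so worth checking against your build’s documentation):

        mod_gzip_on Yes
        mod_gzip_item_include mime ^application/x-javascript
        mod_gzip_item_include mime ^text/css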

  3. “Shortening variables and functions, removing unnecessary whitespace, et cetera.”

    This is always a bad idea, especially if you are talking about JavaScript, which is visible in the source. I absolutely loathe MATLAB code that’s optimised in this way; it becomes illegible.

  4. How big are these scripts, Matt? I mean, I’d strip out whitespace by hand, test it to see if that has any effect, and see where it goes from there. Functions and variables shouldn’t be shortened unless you’re absolutely certain they won’t bite ya… (but you already knew that of course, I’m just commenting).

    Maybe there’s a PHP script somewhere for this?

    #Now quite intrigued at the idea#…..Hmmm, off to Google! #Laughs#

  5. Dean’s packer dramatically reduces the performance of the JavaScript file. It is only good for small, simple code, and it has problems with complex object-oriented scripts: because of the renaming, you cannot extend an object in a different file from the one where it is defined (see the sketch after this comment). For the most complex JavaScript these things are very important. For other small scripts, I think simple gzip is fine.

    In qooxdoo we use a self-written Python script for compression and for extracting documentation in the style of javadoc or doxygen. It works well on multiple interdependent files, but it does not shorten variable or function names.
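
    A minimal sketch of the cross-file renaming problem described above (file and object names are invented):

        // widgets.js - packed with name shortening, so MyWidget may become something like "A"
        function MyWidget() { this.x = 0; }

        // extensions.js - packed separately, so this reference keeps the original name
        MyWidget.prototype.move = function (dx) { this.x += dx; }; // fails: MyWidget was renamed away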

  6. Are you just trying to get initialization and initial download amounts/times down?

    Could you have stubs that auto-load additional chunks of javascript on the fly?
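
    A minimal sketch of such a stub (the function and file names are hypothetical):

        // Append a script tag to fetch an extra chunk of JavaScript on demand
        function loadScript(url, onload) {
            var s = document.createElement("script");
            s.type = "text/javascript";
            s.src = url;
            s.onload = onload;
            document.getElementsByTagName("head")[0].appendChild(s);
        }

        // Only download the drag-and-drop code the first time it is needed
        loadScript("dragdrop.js", function () { initDragDrop(); });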

  7. The best approach would be to store the files uncompressed and serve them through a compression script, IMO. As for code that won’t run after compression, the trick is to write clean and well-structured code: although JS works without semicolons at the end of a line, newlines are part of the excess whitespace that is usually stripped during compression, leaving you with a big mess if you didn’t use them (see the example below). (“tag soup” vs “code soup”? ;))
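
    A two-line example of that failure mode, assuming a compressor that simply collapses all whitespace:

        // source, relying on newlines instead of semicolons
        var a = 1
        var b = 2

        // after newline stripping: one line, and a syntax error
        var a = 1 var b = 2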

  8. I work on a web browser (Danger hiptop / T-Mobile Sidekick) and I loathe dealing with shortened JavaScript. It makes it a nightmare to work out why things are going wrong.

    If you just want to strip comments and stuff, you could try using the C preprocessor. If you’re serving with gzip compression, having expressive variable and function names shouldn’t hurt you too much.
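
    For example, with GNU cpp (which strips /* */ and // comments by default; -P suppresses the #line markers it would otherwise add):

        cpp -P script.js > script.stripped.js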

  9. Sebastien, everything you said about my compressor is wrong. It affects performance a lot less than others (maybe the least). The name shortening is optional but if used properly gives big savings. Version 2.0 has been out for a few months now and I’m very pleased with it. It reduces IE7 from 90K source to 23K.

  10. While we’re all discussing the different methods and merits of javascript compression, let me ask a simple question: why? The average javascript is just a couple of dozen kilobytes in size. How much is making the thing unreadable and error-ridden actually going to save?

  11. Matt,
    I have a system that I have set up. It is not automated… yet. I have tweaked the JavaScript crunchinator from http://www.brainjar.com and made it much faster. Before and after I crunch, I run the code through a local copy of JSLint that has a few tweaks to skip a few of its overly strict rules. This ensures that I have all of the proper syntax to start with, so crunching the code never breaks anything; I then check it again after the crunching just to make sure nothing’s lost anyway. This has been 100% effective to date.
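
    That pipeline, sketched as shell commands (jslint-cli and crunch stand in for whatever local wrappers you have; both names are hypothetical):

        jslint-cli src/app.js              # 1. confirm the source is syntactically clean
        crunch src/app.js > build/app.js   # 2. crunch it
        jslint-cli build/app.js            # 3. confirm the crunch broke nothing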

  12. I think there is a runtime improvement as well. Even though Java is compiled, I think that JavaScript is more or less interpreted. Imagine how long pages would take to load if all scripts had to be compiled first.

    Java and JavaScript are about as much alike as a car and a carpet.

  13. Just another option (to confuse and confound!):

    http://dojotoolkit.org/~alex/js_with_compress.jar

    This is a modified version of Rhino (the Mozilla JS interpreter written in Java). We (the Dojo project) have added a “compressor” method to the built-in Decompiler class. Since we’re operating on the token stream from the parser, this system is somewhat less prone to errors and “overzealousness”. A new “-c” flag sends the “compressed” version of the input to stderr. Normal invocation is:

    java -jar js_with_compress.jar -c in.js > out.js 2>&1

    This tool will NOT mangle your public APIs. We’re using it as a part of the build process in Dojo and it’s been working well for some time now. It’s much less error prone than the regexp-based stripper we used in netWindows.

    Source for the tool (i.e., a zipped up version of my anonymous CVS checkout) is available in the same directory as the Jar file.

    Regards

  14. There are actually three very good reasons for compressing javascript code: first, to improve download times, second, to reduce bandwidth utilisation, and third, to stop people nicking your code. Most of the objections listed here are along the lines of “but I hate javascript compression because it makes the code more difficult to read”. Well, that’s part of the intention.

    If the world were a nice place where everyone played nicely together and credited their sources, we wouldn’t have to do this, but I get fed up with people taking the code from OnOneMap (http://ononemap.com) and using it to attempt to build a rival service without crediting us for it.

    I’ve not found a good compressor yet, but I’m still looking.

  15. I’ve looked at all these compressors, but none seem to handle the more complex structures and objects. I have been trying to code this part for a while: automatically determining which variables and functions are user-accessible, as opposed to ones that are buried and never accessed directly by the user (a small example follows this comment). That bit is a sticking point. JSLint does the cleaning of the code before the packing/compressing, but it’s hard trying to map objects.

    Even though it makes things unreadable, it’s true that 50-75% savings on code size for deployed JS, as opposed to the source, makes things faster (there are people still out there who don’t have whopping ADSL links or OC-48 lines).

    (I credit the people whose code I have used to work out parts; that’s just common courtesy.)
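
    A small invented example of the distinction described above:

        // "distance" may be called from the page, so a compressor must keep its name;
        // dx and dy are buried locals that can safely be shortened.
        function distance(x1, y1, x2, y2) {
            var dx = x2 - x1, dy = y2 - y1;
            return Math.sqrt(dx * dx + dy * dy);
        }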

  16. Why do compression? Because many of the nifty things you can do with JavaScript, such as dragging objects and file-upload progress indicators, take up a lot of space. I just finished a script that lets the user drag objects around and resize them, and the code takes up 15 KB before any kind of compression. One of my favorite compressors is located here: http://hometown.aol.de/_ht_a/memtronic/ , though it does error out occasionally. There is another one that often breaks but sometimes works when the Memtronic cruncher does not; I can’t find it at the moment (and I really need it).

  17. I still cannot find it, but the compressor has been updated. Also, for those pesky problems that can trip up the compressor, go to http://www.jslint.com to find out what the problem is. If there are errors before compression, they can really trip up the process later on.

  18. For anyone still looking for an answer to this, especially if you have genuinely complex code to run, the best solution I’ve found is this:

    Create a file, call it js.php, that can be invoked like this (the query-string format shown is an assumption, inferred from the file names in step 2):

        <script type="text/javascript" src="js.php?DOM.js,animations.js,pages/home.js"></script>

    What it should do is:
    1. Hash together the entire query string and treat the MD5 hash as a filename.
    2. If that file exists in the cache, compare the modification time of each file in the JavaScript directory, e.g. “DOM.js”, “animations.js”, “pages/home.js”.
    3. If ANY of the files has a modification time greater than that of the archived file, or the archive doesn’t exist, (re)build it:
    a. Read each javascript file
    b. Use a regular expression to kill comments and blank lines.
    c. Concatenate all the files together, in order, into one file (You’ll get better gzip compression this way too).
    d. Store two versions of this file in your cache directory — one as .js, and another gzipped copy as .gz
    4. Serve the appropriate file depending on whether the user’s browser explicitly advertises gzip support in its Accept-Encoding header.

    This will seriously reduce both transmission time and processing time. And since it is a dynamic script, browsers are much better at expiring the content and checking for a new version when users visit the page, which is important if you are updating your site a lot. Once you feel that a set of scripts is stable, you can send the appropriate headers to make the client’s browser stop polling for new versions every time. And you can always compress/obfuscate the scripts beforehand.
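
    A rough PHP sketch of the scheme in this comment (the js/ and cache/ directories, the query-string format, and the comment-stripping regexes are all assumptions; real code would want stricter path validation):

        <?php
        // js.php: concatenate, strip, cache, and gzip a set of scripts
        $qs    = $_SERVER['QUERY_STRING'];
        $files = explode(',', $qs);
        $key   = md5($qs);                 // step 1: MD5 of the whole query string
        $plain = "cache/$key.js";
        $gz    = "cache/$key.gz";

        // steps 2-3: rebuild if the cached copy is missing or any source file is newer
        $stale = !file_exists($plain);
        foreach ($files as $f) {
            $path = 'js/' . str_replace('..', '', $f);   // crude traversal guard
            if (!$stale && filemtime($path) > filemtime($plain)) {
                $stale = true;
            }
        }

        if ($stale) {
            $out = '';
            foreach ($files as $f) {
                $src = file_get_contents('js/' . str_replace('..', '', $f));
                $src = preg_replace('!/\*.*?\*/!s', '', $src);    // step 3b: kill /* */ comments...
                $src = preg_replace('!^\s*//.*$!m', '', $src);    // ...whole-line // comments...
                $src = preg_replace('!^[ \t]*\r?\n!m', '', $src); // ...and blank lines
                $out .= $src . "\n";                              // step 3c: concatenate in order
            }
            file_put_contents($plain, $out);                      // step 3d: store both versions
            file_put_contents($gz, gzencode($out, 9));
        }

        // step 4: serve gzip only when the browser advertises support for it
        header('Content-Type: application/x-javascript');
        $accept = isset($_SERVER['HTTP_ACCEPT_ENCODING']) ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';
        if (strpos($accept, 'gzip') !== false) {
            header('Content-Encoding: gzip');
            readfile($gz);
        } else {
            readfile($plain);
        }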
