JavaScript libraries have grown tremendously over the years.
To reduce bandwidth requirements, these libraries are usually minified and GZIP compressed.
Surprisingly, most Content Delivery Networks (CDNs) deliver only a mediocre level of GZIP compression.
Zopfli (open source) is currently the best GZIP compressor and produces the smallest files.
It produces files fully compatible with the DEFLATE algorithm (used by GZIP), but in order to find even better compression it relies on a random walk (trial-and-error) approach.
I run a brute-force search to find optimal Zopfli parameters.
That means: when decompressed by the browser, my files are 100% identical to the original minified versions.
There is no difference for the developer, the administrator, or the end user,
except that my files are smaller and therefore load a tiny bit faster.
size(mine) < size(original) but content(mine) = content(original)
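The brute-force idea can be sketched with plain gzip: try every compression level, keep the smallest result, and verify that decompressing it reproduces the original byte for byte. This is only an analogy with a tiny search space; the real search runs over Zopfli's parameters, and the file name here is a stand-in, not one of the actual libraries.

```shell
set -e
printf 'function add(a,b){return a+b}\n' > sample.min.js   # stand-in for a minified library

# Try every gzip level and keep the smallest output - the same idea as
# brute-forcing Zopfli's parameters, just with a much smaller search space.
best_size=999999999
for level in 1 2 3 4 5 6 7 8 9; do
  gzip -c -"$level" sample.min.js > "try_$level.gz"
  size=$(wc -c < "try_$level.gz" | tr -d ' ')
  if [ "$size" -lt "$best_size" ]; then
    best_size=$size
    cp "try_$level.gz" best.gz
  fi
done

# Decompressing the winner must reproduce the original byte for byte:
# size(mine) < size(original) but content(mine) = content(original).
gunzip -c best.gz > roundtrip.js
cmp sample.min.js roundtrip.js
echo "best: $best_size bytes"
```

Because every level produces a valid DEFLATE stream, any browser's decompressor accepts the winner; only the effort spent searching differs.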
Everything you find on this website can be used in non-commercial as well as commercial projects without any obligations.
However, if you are a super-kind person, then please send me a short mail. Or a postcard. Or both.
My server analyzed these already minified and zipped projects and compressed them even further:
On average, my files are smaller
than the smallest files you find on these major public content delivery networks:
Not a single file
from these CDNs is smaller or has the same size as my compressed files.
Take a look at some statistics
How To Use It
Select your desired library from the table above. Then either:
- hotlink to my server's copy, or
- download the smallest file and store it on your own server.
As mentioned before, after decompression my files are 100% identical
to their original version.
The only difference is that I spent much more time finding an (almost) optimal compression.
No matter which CDN you use, it's always a good idea to have a fallback in case the CDN is unreachable or - even worse -
delisted your specific library.
Please be aware that the window.jQuery check (last line) is specific to jQuery;
each library needs its own check:
||Load Local Fallback
||<script>window.angular || document.write('<script src="local_server_path/angular.min.js">\x3C/script>')</script>
||<script>window.Backbone || document.write('<script src="local_server_path/backbone.min.js">\x3C/script>')</script>
||<script>$.fn.modal || document.write('<script src="local_server_path/bootstrap.min.js">\x3C/script>')</script>
||<script>window.d3 || document.write('<script src="local_server_path/d3.min.js">\x3C/script>')</script>
||<script>typeof(dojo) !== "undefined" || document.write('<script src="local_server_path/dojo.min.js">\x3C/script>')</script>
||<script>window.Ember || document.write('<script src="local_server_path/ember.min.js">\x3C/script>')</script>
||<script>window.jQuery || document.write('<script src="local_server_path/jquery.min.js">\x3C/script>')</script>
||<script>window.ko || document.write('<script src="local_server_path/knockout.min.js">\x3C/script>')</script>
||<script>window._ || document.write('<script src="local_server_path/lodash.min.js">\x3C/script>')</script>
||<script>window.React || document.write('<script src="local_server_path/react.min.js">\x3C/script>')</script>
||<script>window.THREE || document.write('<script src="local_server_path/three.min.js">\x3C/script>')</script>
||<script>window._ || document.write('<script src="local_server_path/underscore.min.js">\x3C/script>')</script>
Instead of falling back to your local server, you can also fall back to another CDN.
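Putting both pieces together for jQuery, the CDN reference comes first and the fallback check directly after it (the CDN URL below is a placeholder, not a real location):

```html
<!-- 1. try the CDN copy (placeholder URL) -->
<script src="https://cdn.example.com/jquery/jquery.min.js"></script>
<!-- 2. if the CDN failed, window.jQuery is undefined: load the local copy -->
<script>window.jQuery || document.write('<script src="local_server_path/jquery.min.js">\x3C/script>')</script>
```

The `\x3C` escape keeps the literal `</script>` inside the string from prematurely closing the surrounding script element.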
I created this website as a hobby project because every time a byte is wasted, a kitten cries. Seriously ;-)
By the way: if you run a small website it's totally fine to hotlink to the compressed files on my server.
However, high-traffic websites should copy my files to their own server.
And in the best of all worlds, major CDNs would replace their (too) large versions ... well, I'm just dreaming.
pigz is a fast, open-source, multi-threaded implementation of GZIP,
written by one of the original authors of gzip.
However, when using compression level 11, pigz
actually switches to the slower Zopfli algorithm and isn't multi-threaded anymore.
Direct calls to Zopfli offer the highest level of configuration and are therefore used for my brute-force search.
Ken Silverman wrote the closed-source KZIP
compression program and Jonathon Fowler
ported it to Linux.
defluff was created by Joachim Henke.
DeflOpt is a tool by Ben Jos Walbeehm.
Website made by Stephan Brumme in 2015,
and still improving in 2020.
All timestamps are displayed in Central European Time. See my changelog.
No Flash, not even images or external CSS files - everything is squeezed into a single HTML file,
which was handsomely compressed before being released into the wild internet - obviously.
Please visit my homepage
and my blog.
email: minime (at) stephan-brumme.com
All trademarks are property of their respective owners. You know, the boring legal stuff.