The original file has 629246 bytes (614.5k) and is available from the project website.
There you can find the official minified version, too, which brings down the size to 163844 bytes (160.0k, 26%).
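
If you want to double-check the size of the official minified build yourself, a simple byte count should come out at roughly the number above (assuming the release server still serves the same file):
curl --silent --location https://download.dojotoolkit.org/release-1.10.1/dojo.js | wc -c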

After GZIP compression these minified files vary in size:
Source Size Location
Boot 64897 bytes (63.4k) CDN
cdnjs 56231 bytes (54.9k) CDN
gzip -6 (default) 55967 bytes (54.7k) local copy
gzip -9 55909 bytes (54.6k) local copy
7zip -mx=9 -tgzip 53245 bytes (52.0k) local copy
libdeflate -12 53240 bytes (52.0k) local copy
kzip -s0 -rn -b1 53196 bytes (51.9k) local copy
pigz -11 -n 53187 bytes (51.9k) local copy
zultra 53186 bytes (51.9k) local copy
Zopfli 53014 bytes (51.8k) local copy
Zopfli (defluff) 53013 bytes (51.8k) local copy
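
To reproduce the plain gzip figures locally, something along these lines should come very close to the table (save the official minified build as dojo-1.10.1.min.js first; exact byte counts can differ by a few bytes depending on whether the gzip header stores a file name and timestamp):
gzip -6 -n -c dojo-1.10.1.min.js | wc -c
gzip -9 -n -c dojo-1.10.1.min.js | wc -c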

perma-link to the smallest file on my server:
http://minime.stephan-brumme.com/files/dojo/dojo-1.10.1.min.js (or via HTTPS)

You will automatically get the smallest Dojo 1.10.1 file. ETag caching is available, and
if your browser doesn't support GZIP decompression, the uncompressed version is sent instead.
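
If you want to see that content negotiation in action, a quick header check with curl should show both the Content-Encoding and the ETag (the actual response headers aren't reproduced here):
curl --silent --head --header "Accept-Encoding: gzip" https://minime.stephan-brumme.com/files/dojo/dojo-1.10.1.min.js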

Currently best Zopfli settings

Save 3217 bytes by using my Dojo 1.10.1 Zopfli version instead of the best available CDN: 53014 vs. 56231 bytes for cdnjs, i.e. 5.72% smaller.
You can use my super-compressed files for whatever purpose you like as long as you respect the library's original license agreement.
There are no restrictions from my side - but please avoid hot-linking if you run a high-traffic website.

These command-line settings yielded the best compression ratio so far (Linux version of zopfli-krzymod):
zopfli --i1000000 --mb8 --mls2 --bsr13 --lazy --ohh

(found January 30, 2020)
Description Value Parameter
iterations 1000000  --i1000000
maximum blocks 8  --mb8
maximum length score 2  --mls2
block splitting recursion 13  --bsr13
lazy matching in LZ77 yes  --lazy
optimized Huffman headers yes  --ohh
initial random W for iterations 1  --rw1
initial random Z for iterations 2  --rz2
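
Assuming the zopfli-krzymod binary is simply called zopfli and behaves like stock Zopfli (writing its output next to the input file with a .gz suffix), the result above should be reproducible with:
zopfli --i1000000 --mb8 --mls2 --bsr13 --lazy --ohh dojo-1.10.1.min.js
Be warned that a million iterations on a 160k file take a very long time; start with a much lower --i value to get a feeling for the runtime.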

Even Smaller Files Thanks To Defluff

Zopfli's output can be further optimized by the defluff tool.
In this particular case, defluff saves 1 more byte (53013 bytes).
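
As far as I know, defluff operates as a simple stdin/stdout filter on DEFLATE streams, so chaining it after Zopfli is a one-liner (file names are just examples):
defluff < dojo-1.10.1.min.js.gz > dojo-1.10.1.min.defluff.js.gz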

Verify file integrity

After decompression, my uncompressed files are identical to the original ones:

MD5:
curl --silent --compressed https://download.dojotoolkit.org/release-1.10.1/dojo.js --location | md5sum
41630a93e668ff46178295a702c9a90e  -
curl --silent --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.10.1.min.zopfli.js.gz | md5sum
41630a93e668ff46178295a702c9a90e  -

SHA1:
curl --silent --compressed https://download.dojotoolkit.org/release-1.10.1/dojo.js --location | sha1sum
95850d11ed2b15628fba4253d8fc36d47fd6db77  -
curl --silent --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.10.1.min.zopfli.js.gz | sha1sum
95850d11ed2b15628fba4253d8fc36d47fd6db77  -
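
If hash comparisons feel too indirect, a byte-for-byte check with cmp (using bash process substitution) works just as well:
cmp <(curl --silent --compressed --location https://download.dojotoolkit.org/release-1.10.1/dojo.js) \
    <(curl --silent --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.10.1.min.zopfli.js.gz) \
    && echo identical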

All listed CDNs deliver identical contents:
CDN Size (compressed) MD5 (uncompressed) Timestamp
Boot 64897 bytes 41630a93e668ff46178295a702c9a90e March 19, 2015 @ 15:28
cdnjs 56231 bytes 41630a93e668ff46178295a702c9a90e February 8, 2015 @ 14:45

Note: only the MD5 hashes are shown to keep things simple.

Other Versions

Available Dojo versions at minime.stephan-brumme.com:

1.17.3, 1.17.2, 1.17.1, 1.17.0,
1.16.5, 1.16.4, 1.16.3, 1.16.2, 1.16.1, 1.16.0,
1.15.6, 1.15.5, 1.15.4, 1.15.3, 1.15.2, 1.15.1, 1.15.0,
1.14.9, 1.14.8, 1.14.7, 1.14.6, 1.14.5, 1.14.4, 1.14.3, 1.14.2, 1.14.1, 1.14.0,
1.13.10, 1.13.9, 1.13.8, 1.13.7, 1.13.6, 1.13.5, 1.13.4, 1.13.3, 1.13.2, 1.13.1, 1.13.0,
1.12.11, 1.12.10, 1.12.9, 1.12.8, 1.12.7, 1.12.6, 1.12.5, 1.12.4, 1.12.3, 1.12.2, 1.12.1,
1.11.13, 1.11.12, 1.11.11, 1.11.10, 1.11.9, 1.11.8, 1.11.7, 1.11.6, 1.11.5, 1.11.4, 1.11.3, 1.11.2, 1.11.1, 1.11.0,
1.10.10, 1.10.9, 1.10.8, 1.10.7, 1.10.6, 1.10.5, 1.10.4, 1.10.3, 1.10.2, 1.10.1, 1.10.0,
1.9.11, 1.9.10, 1.9.9, 1.9.8, 1.9.7, 1.9.6, 1.9.5, 1.9.4, 1.9.3, 1.9.2, 1.9.1, 1.9.0,
1.8.14, 1.8.13, 1.8.12, 1.8.11, 1.8.10, 1.8.9, 1.8.8, 1.8.7, 1.8.6, 1.8.5, 1.8.4, 1.8.3, 1.8.2, 1.8.1, 1.8.0,
1.7.12, 1.7.11, 1.7.10, 1.7.9, 1.7.8, 1.7.7, 1.7.6, 1.7.5, 1.7.4, 1.7.3, 1.7.2, 1.7.1, 1.7.0,
1.6.5, 1.6.4, 1.6.3, 1.6.2, 1.6.1, 1.6.0,
1.5.6, 1.5.5, 1.5.4, 1.5.3, 1.5.2, 1.5.1, 1.5.0,
1.4.8, 1.4.7, 1.4.6, 1.4.5, 1.4.4, 1.4.3, 1.4.2, 1.4.1, 1.4.0,
1.3.3, 1.3.2, 1.3.1, 1.3.0,
1.2.3, 1.2.2

The project site contains an overview of how well these versions were compressed.
Other interesting projects are AngularJS, BackboneJS, Bootstrap, D3, Ember, jQuery, Knockout, lodash, React, Socket.IO, ThreeJS, UnderscoreJS and Vue.

Changelog

Best Zopfli parameters so far:
Size Improvement Parameters Found
53014 bytes -1 byte zopfli --i1000000 --mls2 --bsr13 --lazy --ohh January 30, 2020 @ 08:37
53015 bytes -5 bytes zopfli --i100000 --mls2 --bsr13 --lazy --ohh September 25, 2015 @ 23:58
53020 bytes -4 bytes zopfli --i10000 --mls2 --bsr13 --lazy --ohh September 25, 2015 @ 21:05
53024 bytes -2 bytes zopfli --i1000 --mls2 --bsr13 --lazy --ohh September 22, 2015 @ 14:55
53026 bytes -1 byte zopfli --i1000 --mls2 --bsr14 --lazy --ohh September 22, 2015 @ 14:49
53027 bytes zopfli --i100 --mls2 --bsr17 --lazy --ohh September 22, 2015 @ 11:21

If there are multiple parameter sets yielding the same compressed size, only the first one found is shown.

Most recent activity on March 16, 2022 @ 19:14.

Heatmaps

This Zopfli heatmap visualizes how compression changes when the --bsr and --mls parameters are modified.
Each cell contains the best file size achieved, in bytes (on the website, hovering over a cell shows the number of iterations).

Good parameters are shown in green, bad ones in red; the best and worst values are bold as well.
The brightness of the blue background indicates how many iterations were processed:
10,000, 100,000 or 1,000,000.
bsr \ mls
2 4 8 16 32 64 128 256 512 1024 2048 4096 8192 16384 32768
53065 53071 53089 53090 53090 53093 53096 53102 53092 53094 53118 53094 53097 53130 53102
53030 53036 53043 53039 53090 53091 53098 53107 53092 53100 53099 53100 53108 53097 53101
53054 53027 53045 53039 53091 53090 53091 53098 53094 53105 53097 53103 53102 53099 53100
53088 53124 53131 53090 53101 53097 53092 53091 53094 53115 53096 53093 53094 53109 53096
53023 53027 53038 53038 53086 53093 53100 53097 53093 53094 53106 53107 53104 53106 53094
53017 53025 53039 53040 53089 53091 53097 53097 53093 53091 53101 53092 53104 53100 53100
53047 53052 53052 53090 53088 53093 53095 53091 53093 53103 53100 53095 53092 53096 53103
53016 53025 53048 53037 53091 53089 53104 53103 53093 53099 53100 53112 53101 53097 53093
53014 53050 53089 53038 53096 53089 53089 53091 53094 53096 53101 53103 53102 53096 53094
53014 53025 53035 53039 53089 53096 53091 53098 53093 53102 53099 53093 53104 53101 53095
53017 53038 53042 53039 53089 53086 53087 53092 53102 53115 53099 53093 53101 53111 53094
53025 53018 53034 53089 53091 53088 53092 53088 53093 53107 53098 53093 53094 53097 53095
53014 53049 53037 53038 53088 53091 53090 53103 53093 53102 53099 53093 53099 53096 53101
53016 53050 53034 53036 53088 53090 53099 53097 53092 53102 53102 53092 53101 53098 53092
53017 53025 53036 53037 53089 53090 53097 53097 53092 53104 53101 53094 53101 53098 53095
53016 53024 53033 53040 53089 53089 53091 53091 53092 53106 53099 53107 53101 53097 53095
53089 53072 53090 53091 53091 53089 53097 53097 53093 53102 53099 53093 53095 53107 53100
53014 53023 53038 53037 53090 53089 53098 53092 53093 53096 53098 53092 53095 53096 53103
53024 53068 53089 53038 53088 53089 53098 53096 53092 53097 53099 53094 53100 53098 53091
53087 53077 53109 53089 53091 53089 53106 53090 53093 53101 53101 53093 53092 53096 53092
53058 53075 53089 53090 53089 53089 53091 53091 53093 53096 53103 53092 53094 53099 53094
53056 53078 53093 53089 53087 53089 53091 53092 53093 53101 53098 53092 53093 53096 53093
53017 53028 53034 53038 53089 53087 53091 53090 53092 53096 53103 53092 53093 53099 53094

Due to the Monte Carlo design of my search algorithm, not all parameters have reached the same number of iterations yet:
Iterations Min. Bytes Reduction Coverage
100 53027 bytes 100%
1,000 53024 bytes -3 bytes 100%
10,000 53020 bytes -4 bytes 100%
100,000 53015 bytes -5 bytes 3.48%
1,000,000 53014 bytes -1 byte 1.16%
10,000,000
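
Stripped down to its essence, such a sweep is just a nested loop over --bsr and --mls; the sketch below uses made-up value ranges, a low iteration count and Zopfli's -c switch (write to stdout), while the real search also varies the iteration budget and the --rw/--rz seeds:
for bsr in 2 3 5 8 13 17; do
  for mls in 2 4 8 16 32; do
    size=$(zopfli --i1000 --mb8 --mls$mls --bsr$bsr --lazy --ohh -c dojo-1.10.1.min.js | wc -c)
    echo "bsr=$bsr mls=$mls -> $size bytes"
  done
done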

KZIP has far fewer options available for tuning/optimization. I only played around with the number of blocks (parameter -n):
Blocks Min. Bytes Compared To Best Zopfli Compared To Best KZIP
53262 bytes +248 bytes (+0.47%) +66 bytes
53196 bytes +182 bytes (+0.34%)
53213 bytes +199 bytes (+0.38%) +17 bytes
53261 bytes +247 bytes (+0.47%) +65 bytes
53304 bytes +290 bytes (+0.55%) +108 bytes
53319 bytes +305 bytes (+0.58%) +123 bytes
53330 bytes +316 bytes (+0.60%) +134 bytes
53358 bytes +344 bytes (+0.65%) +162 bytes
53402 bytes +388 bytes (+0.73%) +206 bytes

Non-DEFLATE Algorithms

Archivers based on completely different compression algorithms often produce superior results.
Unfortunately, browsers only support gzip compression at the moment.
However, support for Brotli is constantly growing.
Algorithm Program Parameters Size Compared To Best Zopfli
ZPAQ zpaq zpaq -method 69 38852 bytes -14162 bytes (-26.71%)
RAR (proprietary) RAR rar a -m5 -md64m -mc63:128t -mt1 46088 bytes -6926 bytes (-13.06%)
PPMd 7zip 7za a -mx=9 -m0=ppmd 47563 bytes -5451 bytes (-10.28%)
Brotli brotli brotli -q 11 48599 bytes -4415 bytes (-8.33%)
LZMA2 xz xz -9 49784 bytes -3230 bytes (-6.09%)
Zstandard zstd zstd -19 51256 bytes -1758 bytes (-3.32%)
Burrows-Wheeler transform bzip2 bzip2 -9 51262 bytes -1752 bytes (-3.30%)
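
The Brotli row, for example, should be straightforward to reproduce with the reference command-line tool (again assuming a local copy named dojo-1.10.1.min.js):
brotli -q 11 -c dojo-1.10.1.min.js | wc -c
Of course, serving such files only helps clients that announce the matching Accept-Encoding, which is why the gzip/Zopfli variant remains the safe default.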

Detailed Analysis

I wrote a DEFLATE decoder in JavaScript. On the website you can start a client-side analysis of the smallest gzipped files (it may take a second).


Notes: pigz is a fast open source multi-threaded implementation of gzip written by one of the original authors of gzip.
However, when using compression level 11, pigz actually switches to the slower Zopfli algorithm and isn't multi-threaded anymore.
KrzyMOD's extensions to Zopfli offer the highest level of configuration and are therefore used for my brute-force search.
Ken Silverman wrote the closed-source KZIP compression program and Jonathon Fowler ported it to Linux.
Defluff was created by Joachim Henke; DeflOpt is a tool by Ben Jos Walbeehm.
