The original file has 629493 bytes (614.7k) and is available from the project website.
There you can also find the official minified version, which brings the size down to 163811 bytes (160.0k, 26% of the original).

After GZIP compression, the size of the minified file varies depending on the source and the compressor used:
Compressor / CDN    Size                  Source
Boot                64881 bytes (63.4k)   CDN
cdnhttps            56311 bytes (55.0k)   CDN
cdnjs               56230 bytes (54.9k)   CDN
gzip -6 (default)   55961 bytes (54.6k)   local copy
gzip -9             55904 bytes (54.6k)   local copy
7zip -mx=9 -tgzip   53240 bytes (52.0k)   local copy
libdeflate -12      53236 bytes (52.0k)   local copy
kzip -s0 -rn -b1    53191 bytes (51.9k)   local copy
pigz -11 -n         53185 bytes (51.9k)   local copy
Zopfli              53010 bytes (51.8k)   local copy
Zopfli (defluff)    53009 bytes (51.8k)   local copy
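
The table already names the relevant switches for the local copies. A rough sketch of how such files can be produced (the -c redirections and output filenames are my own choice, I assume the minified file is saved as dojo-1.10.3.min.js; kzip and the Zopfli/defluff variants are covered further down):

gzip -9 -c dojo-1.10.3.min.js > dojo-1.10.3.min.gzip9.js.gz
7za a -tgzip -mx=9 dojo-1.10.3.min.7zip.js.gz dojo-1.10.3.min.js
pigz -11 -n -c dojo-1.10.3.min.js > dojo-1.10.3.min.pigz.js.gz

Exact byte counts may differ by a few bytes depending on gzip header fields (stored filename and timestamp).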

perma-link to the smallest file on my server:
http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.js

You will automatically get the smallest Dojo 1.10.3 file; ETag caching is supported, and
if your browser doesn't accept GZIP compression, the uncompressed version is sent instead.
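
A quick way to observe this behavior is to inspect the response headers with curl; just a sketch, the exact headers depend on the server:

curl --head --header "Accept-Encoding: gzip" http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.js
curl --head http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.js
curl --head --header 'If-None-Match: "<ETag from a previous response>"' http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.js

The first response should announce Content-Encoding: gzip, the second shouldn't, and the third should come back as 304 Not Modified as long as the cached copy is still current (replace the ETag placeholder with the value the server actually sent).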

Currently best Zopfli settings

Save 3220 bytes by using my Dojo 1.10.3 Zopfli version instead of the best available CDN: 53010 instead of 56230 bytes, i.e. cdnjs is 6.07% larger.
You can use my super-compressed files for whatever purpose you like as long as you respect the library's original license agreement.
There are no restrictions from my side - but please avoid hot-linking if you run a high-traffic website.

These command-line settings yielded the best compression ratio so far (Linux version of zopfli-krzymod):
zopfli --i100000 --mb8 --mls2 --bsr16 --lazy --ohh

(found December 7, 2015)
Description                       Value    Parameter
iterations                        100000   --i100000
maximum blocks                    8        --mb8
maximum length score              2        --mls2
block splitting recursion         16       --bsr16
lazy matching in LZ77             yes      --lazy
optimized Huffman headers         yes      --ohh
initial random W for iterations   1        --rw1
initial random Z for iterations   2        --rz2
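
Applied to the minified file, the whole call might look like this (just a sketch: the input filename dojo-1.10.3.min.js is my own choice, and zopfli-krzymod, like plain Zopfli, appends .gz to the input name). Expect a very long runtime with 100000 iterations.

zopfli --i100000 --mb8 --mls2 --bsr16 --lazy --ohh dojo-1.10.3.min.js
ls -l dojo-1.10.3.min.js.gz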

Even Smaller Files Thanks To Defluff

Zopfli's output can be further optimized by the defluff tool.
In this particular case, defluff saves 1 more byte (53009 bytes).
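
A sketch of this post-processing step, assuming defluff acts as a plain stdin/stdout filter (the filenames are placeholders of my own):

defluff < dojo-1.10.3.min.js.gz > dojo-1.10.3.min.defluff.js.gz
gzip -dc dojo-1.10.3.min.defluff.js.gz | md5sum

The second command merely double-checks that the re-optimized stream still decompresses to exactly the same JavaScript.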

Verify file integrity

After decompression, my files are byte-for-byte identical to the original ones:

MD5:
curl --silent --compressed http://download.dojotoolkit.org/release-1.10.3/dojo.js --location | md5sum
0c720b7cc29cc1028951a5bac3ddf95c  -
curl --silent --compressed http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.zopfli.js.gz | md5sum
0c720b7cc29cc1028951a5bac3ddf95c  -

SHA1:
curl --silent --compressed http://download.dojotoolkit.org/release-1.10.3/dojo.js --location | sha1sum
1f01ee88d6fab7f82b60a3d01a234ebefee331e4  -
curl --silent --compressed http://minime.stephan-brumme.com/files/dojo/dojo-1.10.3.min.zopfli.js.gz | sha1sum
1f01ee88d6fab7f82b60a3d01a234ebefee331e4  -

These CDNs send you the original file:
CDN     Size (compressed)   MD5 (uncompressed)                 Timestamp
Boot    64881 bytes         0c720b7cc29cc1028951a5bac3ddf95c   March 19, 2015 @ 15:29
cdnjs   56230 bytes         0c720b7cc29cc1028951a5bac3ddf95c   February 8, 2015 @ 14:45

And some CDNs send you a different file:
CDN        Size (compressed)   MD5 (uncompressed)                 Timestamp
cdnhttps   56311 bytes         4d0d1da5756596ecc65c808bb106491a   December 24, 2015 @ 07:33

Diff against the official dojo.js:
< /*
< Copyright (c) 2004-2011, The Dojo Foundation All Rights R [...]
< Available via Academic Free License >= 2.1 OR the modifie [...]
< see: http://dojotoolkit.org/license for details
< */
<
< /*
< This is an optimized version of Dojo, built for deploymen [...]
< development. To get sources and documentation, please vis [...]
<
[...]

Note: only the MD5 hashes are shown to keep things simple.
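
Such a comparison boils down to downloading both copies and diffing them; a rough sketch where $CDN_URL stands for the respective CDN link:

curl --silent --compressed --location http://download.dojotoolkit.org/release-1.10.3/dojo.js > official.js
curl --silent --compressed --location "$CDN_URL" > cdn.js
md5sum official.js cdn.js
diff official.js cdn.js | head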

Other Versions

Available Dojo versions at minime.stephan-brumme.com:

1.13.0,
1.12.3, 1.12.2, 1.12.1,
1.11.5, 1.11.4, 1.11.3, 1.11.2, 1.11.1, 1.11.0,
1.10.9, 1.10.8, 1.10.7, 1.10.6, 1.10.5, 1.10.4, 1.10.3, 1.10.2, 1.10.1, 1.10.0,
1.9.11, 1.9.10, 1.9.9, 1.9.8, 1.9.7, 1.9.6, 1.9.5, 1.9.4, 1.9.3, 1.9.2, 1.9.1, 1.9.0,
1.8.12, 1.8.11, 1.8.10, 1.8.9, 1.8.8, 1.8.7, 1.8.6, 1.8.5, 1.8.4, 1.8.3, 1.8.2, 1.8.1, 1.8.0,
1.7.10, 1.7.9, 1.7.8, 1.7.7, 1.7.6, 1.7.5, 1.7.4, 1.7.3, 1.7.2, 1.7.1, 1.7.0,
1.6.3, 1.6.2, 1.6.1, 1.6.0,
1.5.4, 1.5.3, 1.5.2, 1.5.1, 1.5.0,
1.4.6, 1.4.5, 1.4.4, 1.4.3, 1.4.2, 1.4.1, 1.4.0,
1.3.3, 1.3.2, 1.3.1, 1.3.0,
1.2.3

The project site contains an overview of how well these versions were compressed.
Other interesting projects are AngularJS, BackboneJS, Bootstrap, D3, Ember, jQuery, Knockout, lodash, React, Socket.IO, ThreeJS, UnderscoreJS and Vue.

Changelog

Best Zopfli parameters so far:
Size          Improvement   Parameters                                     Found
53010 bytes   -1 byte       zopfli --i100000 --mls2 --bsr16 --lazy --ohh   December 7, 2015 @ 11:19
53011 bytes   -3 bytes      zopfli --i100000 --mls2 --bsr22 --lazy --ohh   December 7, 2015 @ 10:43
53014 bytes   -1 byte       zopfli --i10000 --mls2 --bsr13 --lazy --ohh    September 24, 2015 @ 19:40
53015 bytes   -7 bytes      zopfli --i10000 --mls2 --bsr18 --lazy --ohh    September 24, 2015 @ 19:13
53022 bytes   -1 byte       zopfli --i1000 --mls2 --bsr15 --lazy --ohh     September 22, 2015 @ 13:32
53023 bytes   -6 bytes      zopfli --i1000 --mls4 --bsr6 --lazy --ohh      September 21, 2015 @ 13:54
53029 bytes                 zopfli --i100 --mls4 --bsr12 --lazy --ohh      September 21, 2015 @ 13:05

If there are multiple parameter sets yielding the same compressed size, only the first one found is shown.

Most recent activity on June 16, 2016 @ 14:40.

Heatmaps

This Zopfli heatmap visualizes how compression changes when modifying the --bsr and --mls parameters.
Each cell shows the best file size achieved in bytes (on the website, hovering over a cell reveals the number of iterations).

Good parameters are green, bad ones are red; the best and worst values are shown in bold as well.
The brightness of the blue background color indicates how many iterations were processed:
10,000 or 100,000.
[Heatmap data table: rows correspond to --bsr values, columns to --mls values from 2 to 32768; cell values range from 53010 bytes (best) to 53138 bytes (worst).]

Due to the Monte Carlo design of my search algorithm, not all parameters have reached the same number of iterations yet:
Iterations   Min. Bytes    Reduction   Coverage
100          53024 bytes               100%
1,000        53022 bytes   -2 bytes    100%
10,000       53014 bytes   -8 bytes    100%
100,000      53010 bytes   -4 bytes    3.48%
1,000,000
10,000,000
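
For the curious, here is a heavily simplified bash sketch of such a random parameter search. It is not my actual test harness, and the parameter ranges are made up for the example (it requires zopfli-krzymod and GNU stat):

FILE=dojo-1.10.3.min.js
BEST=999999999
while true; do
  BSR=$((RANDOM % 30 + 2))           # random block splitting recursion
  MLS=$((2 ** (RANDOM % 15 + 1)))    # random maximum length score: 2, 4, ..., 32768
  zopfli --i100 --mls$MLS --bsr$BSR --lazy --ohh "$FILE"
  SIZE=$(stat -c%s "$FILE.gz")
  if [ "$SIZE" -lt "$BEST" ]; then
    BEST=$SIZE
    echo "new best: $SIZE bytes (--bsr$BSR --mls$MLS)"
  fi
done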

KZIP has far fewer options available for tuning/optimization. I only played around with the number of blocks (parameter -n):
Blocks   Min. Bytes    Compared To Best Zopfli   Compared To Best KZIP
         53253 bytes   +243 bytes (+0.46%)       +62 bytes
         53191 bytes   +181 bytes (+0.34%)
         53211 bytes   +201 bytes (+0.38%)       +20 bytes
         53254 bytes   +244 bytes (+0.46%)       +63 bytes
         53295 bytes   +285 bytes (+0.54%)       +104 bytes
         53317 bytes   +307 bytes (+0.58%)       +126 bytes
         53314 bytes   +304 bytes (+0.57%)       +123 bytes
         53351 bytes   +341 bytes (+0.64%)       +160 bytes
         53395 bytes   +385 bytes (+0.73%)       +204 bytes

Non-DEFLATE Algorithms

Archivers based on completely different compression algorithms often produce superior results.
Unfortunately, browsers only support gzip compression at the moment.
Algorithm                   Program   Parameters                         Size          Compared To Best Zopfli
ZPAQ                        zpaq      zpaq -method 69                    38854 bytes   -14156 bytes (-26.70%)
RAR (proprietary)           RAR       rar a -m5 -md64m -mc63:128t -mt1   46084 bytes   -6926 bytes (-13.07%)
PPMd                        7zip      7za a -mx=9 -m0=ppmd               47559 bytes   -5451 bytes (-10.28%)
Brotli                      brotli    brotli -q 11                       48604 bytes   -4406 bytes (-8.31%)
LZMA2                       xz        xz -9                              49764 bytes   -3246 bytes (-6.12%)
Burrows-Wheeler transform   bzip2     bzip2 -9                           51212 bytes   -1798 bytes (-3.39%)
ZSTD                        zstd      zstd -19                           51253 bytes   -1757 bytes (-3.31%)
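
The open-source entries are easy to reproduce. A sketch that merely measures the compressed sizes without keeping the archives (assuming the minified file is stored as dojo-1.10.3.min.js; zpaq, RAR and PPMd are omitted because those tools produce archive files rather than plain streams):

FILE=dojo-1.10.3.min.js
brotli -q 11 -c "$FILE" | wc -c
xz -9 -c "$FILE" | wc -c
bzip2 -9 -c "$FILE" | wc -c
zstd -19 -c "$FILE" | wc -c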

Detailed Analysis

I wrote a DEFLATE decoder in JavaScript. Click the button below to start a client-side analysis of the smallest gzipped files (this may take a second):


Notes: pigz is a fast, open-source, multi-threaded implementation of gzip written by one of the original authors of gzip.
However, when using compression level 11, pigz actually switches to the slower Zopfli algorithm and isn't multi-threaded anymore.
KrzyMOD's extensions to Zopfli offer the highest level of configuration and are therefore used for my brute-force search.
Ken Silverman wrote the closed-source KZIP compression program, and Jonathon Fowler ported it to Linux.
Defluff was created by Joachim Henke; DeflOpt is a tool by Ben Jos Walbeehm.
