The original file has 377390 bytes (368.5k) and is available from the project website.
There you can also find the official minified version, which brings the size down to 90277 bytes (88.2k), i.e. 24% of the original.

After GZIP compression, the size of the minified file varies depending on which tool or CDN produced it:
Source               Size                  Type
Boot                 35397 bytes (34.6k)   CDN
Baidu                30914 bytes (30.2k)   CDN
cdnjs                30906 bytes (30.2k)   CDN
gzip -6 (default)    30819 bytes (30.1k)   local copy
gzip -9              30792 bytes (30.1k)   local copy
Yandex               30772 bytes (30.1k)   CDN
Sina                 30769 bytes (30.0k)   CDN
libdeflate -12       29421 bytes (28.7k)   local copy
7zip -mx=9 -tgzip    29404 bytes (28.7k)   local copy
kzip -s0 -rn -b3     29386 bytes (28.7k)   local copy
zultra               29358 bytes (28.7k)   local copy
pigz -11 -n          29350 bytes (28.7k)   local copy
Zopfli               29256 bytes (28.6k)   local copy
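
The "local copy" numbers can be reproduced roughly as follows (a sketch: it assumes the official minified file has been saved as dojo-1.5.1.min.js, and the exact sizes depend on the tool versions):

gzip -9 -n -c dojo-1.5.1.min.js > dojo-1.5.1.min.js.gz9    # plain gzip, maximum level, no stored name/timestamp
zopfli --i1000 dojo-1.5.1.min.js                           # Zopfli with 1000 iterations, writes dojo-1.5.1.min.js.gz
stat -c %s dojo-1.5.1.min.js.gz9 dojo-1.5.1.min.js.gz      # print the compressed sizes in bytes (GNU coreutils' stat)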

perma-link to the smallest file on my server:
http://minime.stephan-brumme.com/files/dojo/dojo-1.5.1.min.js (or via HTTPS)

You will automatically get the smallest Dojo 1.5.1 file. ETag caching is available, and
if your browser doesn't support GZIP decompression, the uncompressed version will be sent.
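
A header-only curl request shows the content negotiation and ETag caching in action (the exact header values depend on the server's current state):

curl --head --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.5.1.min.js
# the response should contain an ETag and Content-Encoding: gzip; repeat with that ETag:
curl --head --compressed -H 'If-None-Match: "<etag-from-first-response>"' https://minime.stephan-brumme.com/files/dojo/dojo-1.5.1.min.js
# an unchanged file is answered with 304 Not Modified and no body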

Currently best Zopfli settings

Save 1513 bytes by using my Dojo 1.5.1 Zopfli version instead of the best available CDN: 29256 vs. 30769 bytes, which makes the best CDN copy (Sina) 5.17% larger.
You can use my super-compressed files for whatever purpose you like as long as you respect the library's original license agreement.
There are no restrictions from my side - but please avoid hot-linking if you run a high-traffic website.

These command-line settings yielded the best compression ratio so far (Linux version of zopfli-krzymod):
zopfli --i1000000 --mb8 --mls16 --bsr22 --lazy --ohh

(found February 28, 2020)
Description                       Value     Parameter
iterations                        1000000   --i1000000
maximum blocks                    8         --mb8
maximum length score              16        --mls16
block splitting recursion         22        --bsr22
lazy matching in LZ77             yes       --lazy
optimized Huffman headers         yes       --ohh
initial random W for iterations   1         --rw1
initial random Z for iterations   2         --rz2
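
Written out as a single command line, the table corresponds to the following invocation (a sketch; the input file name is just an example, and the command shown above omits --rw1/--rz2, presumably because those are the defaults):

zopfli --i1000000 --mb8 --mls16 --bsr22 --lazy --ohh --rw1 --rz2 dojo-1.5.1.min.js   # writes dojo-1.5.1.min.js.gz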

Verify file integrity

After decompression, my files are identical to the original ones:

MD5:
curl --silent --compressed https://download.dojotoolkit.org/release-1.5.1/dojo.js --location | md5sum
c9e1b24fb1ce730d6066686e0921a8c2  -
curl --silent --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.5.1.min.zopfli.js.gz | md5sum
c9e1b24fb1ce730d6066686e0921a8c2  -

SHA1:
curl --silent --compressed https://download.dojotoolkit.org/release-1.5.1/dojo.js --location | sha1sum
64d08651b6f0f0d2663373c39f8024c98e86e57f  -
curl --silent --compressed https://minime.stephan-brumme.com/files/dojo/dojo-1.5.1.min.zopfli.js.gz | sha1sum
64d08651b6f0f0d2663373c39f8024c98e86e57f  -
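
The same check works against a locally saved copy, e.g. after downloading the Zopfli file under its original name:

gzip -dc dojo-1.5.1.min.zopfli.js.gz | sha1sum   # must print 64d08651b6f0f0d2663373c39f8024c98e86e57f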

These CDNs send you the original file:
CDN      Size (compressed)   MD5 (uncompressed)                 Timestamp
Boot     35397 bytes         c9e1b24fb1ce730d6066686e0921a8c2   March 19, 2015 @ 15:32
cdnjs    30906 bytes         c9e1b24fb1ce730d6066686e0921a8c2   February 8, 2015 @ 14:45
Yandex   30772 bytes         c9e1b24fb1ce730d6066686e0921a8c2   June 20, 2013 @ 11:59
Sina     30769 bytes         c9e1b24fb1ce730d6066686e0921a8c2   April 25, 2019 @ 14:07

And some CDNs send you a different file:
CDN     Size (compressed)   MD5 (uncompressed)                 Comment / Diff            Timestamp
Baidu   30914 bytes         5afb63c304d724239b304c1b16a77034   only whitespace differs   January 7, 2015 @ 10:16

Note: only the MD5 hashes are shown to keep things simple.
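
A comparison like the Baidu one can be reproduced with a whitespace-insensitive diff (a sketch; $BAIDU_DOJO_URL is a placeholder for Baidu's dojo.js address, which is not listed here):

diff --ignore-all-space \
    <(curl --silent --location --compressed https://download.dojotoolkit.org/release-1.5.1/dojo.js) \
    <(curl --silent --compressed "$BAIDU_DOJO_URL")
# no output means the two files differ only in whitespace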

Other Versions

Available Dojo versions at minime.stephan-brumme.com:

1.17.3, 1.17.2, 1.17.1, 1.17.0,
1.16.5, 1.16.4, 1.16.3, 1.16.2, 1.16.1, 1.16.0,
1.15.6, 1.15.5, 1.15.4, 1.15.3, 1.15.2, 1.15.1, 1.15.0,
1.14.9, 1.14.8, 1.14.7, 1.14.6, 1.14.5, 1.14.4, 1.14.3, 1.14.2, 1.14.1, 1.14.0,
1.13.10, 1.13.9, 1.13.8, 1.13.7, 1.13.6, 1.13.5, 1.13.4, 1.13.3, 1.13.2, 1.13.1, 1.13.0,
1.12.11, 1.12.10, 1.12.9, 1.12.8, 1.12.7, 1.12.6, 1.12.5, 1.12.4, 1.12.3, 1.12.2, 1.12.1,
1.11.13, 1.11.12, 1.11.11, 1.11.10, 1.11.9, 1.11.8, 1.11.7, 1.11.6, 1.11.5, 1.11.4, 1.11.3, 1.11.2, 1.11.1, 1.11.0,
1.10.10, 1.10.9, 1.10.8, 1.10.7, 1.10.6, 1.10.5, 1.10.4, 1.10.3, 1.10.2, 1.10.1, 1.10.0,
1.9.11, 1.9.10, 1.9.9, 1.9.8, 1.9.7, 1.9.6, 1.9.5, 1.9.4, 1.9.3, 1.9.2, 1.9.1, 1.9.0,
1.8.14, 1.8.13, 1.8.12, 1.8.11, 1.8.10, 1.8.9, 1.8.8, 1.8.7, 1.8.6, 1.8.5, 1.8.4, 1.8.3, 1.8.2, 1.8.1, 1.8.0,
1.7.12, 1.7.11, 1.7.10, 1.7.9, 1.7.8, 1.7.7, 1.7.6, 1.7.5, 1.7.4, 1.7.3, 1.7.2, 1.7.1, 1.7.0,
1.6.5, 1.6.4, 1.6.3, 1.6.2, 1.6.1, 1.6.0,
1.5.6, 1.5.5, 1.5.4, 1.5.3, 1.5.2, 1.5.1, 1.5.0,
1.4.8, 1.4.7, 1.4.6, 1.4.5, 1.4.4, 1.4.3, 1.4.2, 1.4.1, 1.4.0,
1.3.3, 1.3.2, 1.3.1, 1.3.0,
1.2.3, 1.2.2

The project site contains an overview of how well these versions were compressed.
Other interesting projects are AngularJS, BackboneJS, Bootstrap, D3, Ember, jQuery, Knockout, lodash, React, Socket.IO, ThreeJS, UnderscoreJS and Vue.

Changelog

Best Zopfli parameters so far:
Size          Improvement   Parameters                                       Found
29256 bytes   -1 byte       zopfli --i1000000 --mls16 --bsr22 --lazy --ohh   February 28, 2020 @ 00:46
29257 bytes   -3 bytes      zopfli --i100000 --mls16 --bsr22 --lazy --ohh    December 2, 2015 @ 09:01
29260 bytes   -4 bytes      zopfli --i10000 --mls16 --bsr22 --lazy --ohh     October 14, 2015 @ 09:21
29264 bytes   -4 bytes      zopfli --i1000 --mls16 --bsr22 --lazy --ohh      September 22, 2015 @ 03:07
29268 bytes                 zopfli --i100 --mls16 --bsr22 --lazy --ohh       September 21, 2015 @ 10:58

If there are multiple parameter sets yielding the same compressed size, only the first one found is shown.

Most recent activity on July 20, 2020 @ 12:49.

Heatmaps

This Zopfli heatmap visualizes how the compressed size changes when the --bsr and --mls parameters are modified.
Each cell contains the best file size achieved, in bytes (on the website, hovering over a cell shows the number of iterations).

Good parameters are shown in green, bad ones in red; the best and worst are bold as well.
The brightness of the blue background color indicates how many iterations were processed:
10,000, 100,000 or 1,000,000.
[Heatmap table: one row per --bsr value, one column per --mls value (2, 4, 8, ..., 32768). All cells lie within roughly 100 bytes of each other; the smallest value, 29256 bytes, is the current best and was achieved with --bsr22 --mls16.]

Due to the Monte Carlo design of my search algorithm, not all parameters have reached the same number of iterations yet (a sketch of such a random search follows the table below):
Iterations   Min. Bytes    Reduction   Coverage
100          29268 bytes               100%
1,000        29264 bytes   -4 bytes    100%
10,000       29260 bytes   -4 bytes    100%
100,000      29257 bytes   -3 bytes    1.16%
1,000,000    29256 bytes   -1 byte     0.29%
10,000,000
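
A minimal sketch of such a random search (not the actual tool used here; the parameter ranges and file names are assumptions):

for run in $(seq 1 100); do                        # 100 random trials, just for illustration
    bsr=$((RANDOM % 30 + 2))                       # random --bsr (range is an assumption)
    mls=$((2 ** (RANDOM % 15 + 1)))                # random --mls: 2, 4, 8, ..., 32768
    zopfli --i10000 --mb8 --bsr$bsr --mls$mls --lazy --ohh dojo-1.5.1.min.js
    echo "$(stat -c %s dojo-1.5.1.min.js.gz) --bsr$bsr --mls$mls" >> results.txt
    rm dojo-1.5.1.min.js.gz
done
sort -n results.txt | head -1                      # smallest size found and its parameters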

KZIP has far fewer options available for tuning/optimization. I only played around with the number of blocks (parameter -n):
Blocks   Min. Bytes    Compared To Best Zopfli   Compared To Best KZIP
         29394 bytes   +138 bytes (+0.47%)       +8 bytes
         29388 bytes   +132 bytes (+0.45%)       +2 bytes
         29426 bytes   +170 bytes (+0.58%)       +40 bytes
         29386 bytes   +130 bytes (+0.44%)
         29425 bytes   +169 bytes (+0.58%)       +39 bytes
         29416 bytes   +160 bytes (+0.55%)       +30 bytes
         29427 bytes   +171 bytes (+0.58%)       +41 bytes
         29394 bytes   +138 bytes (+0.47%)       +8 bytes
         29438 bytes   +182 bytes (+0.62%)       +52 bytes

Non-DEFLATE Algorithms

Archivers based on completely different compression algorithms often produce superior results.
Unfortunately, browsers only support gzip compression at the moment; however, support for Brotli is constantly growing.
Algorithm                               Program   Parameters                         Size          Compared To Best Zopfli
ZPAQ (Wikipedia)                        zpaq      zpaq -method 69                    23187 bytes   -6069 bytes (-20.74%)
RAR (proprietary)                       RAR       rar a -m5 -md64m -mc63:128t -mt1   26124 bytes   -3132 bytes (-10.71%)
PPMd (Wikipedia)                        7zip      7za a -mx=9 -m0=ppmd               26996 bytes   -2260 bytes (-7.72%)
Brotli (Wikipedia)                      brotli    brotli -q 11                       27148 bytes   -2108 bytes (-7.21%)
LZMA2 (Wikipedia)                       xz        xz -9                              28076 bytes   -1180 bytes (-4.03%)
Zstandard (Wikipedia)                   zstd      zstd -19                           28848 bytes   -408 bytes (-1.39%)
Burrows-Wheeler transform (Wikipedia)   bzip2     bzip2 -9                           28929 bytes   -327 bytes (-1.12%)
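
Since Brotli is the most likely of these to become usable on the web, a pre-compressed Brotli file for browsers sending "Accept-Encoding: br" could be created like this (a sketch; the file names are just examples):

brotli -q 11 -o dojo-1.5.1.min.js.br dojo-1.5.1.min.js   # same quality setting as in the table above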

Detailed Analysis

I wrote a DEFLATE decoder in JavaScript. On the website, a button starts a client-side analysis of the smallest gzipped files (it may take a second).


Notes: pigz is a fast open source multi-threaded implementation of gzip written by one of the original authors of gzip.
However, when using compression level 11, pigz actually switches to the slower Zopfli algorithm and isn't multi-threaded anymore.
KrzyMOD's extensions to Zopfli offer the highest level of configurability and are therefore used for my brute-force search.
Ken Silverman wrote the closed-source KZIP compression program and Jonathon Fowler ported it to Linux.
Defluff was created by Joachim Henke; DeflOpt is a tool by Ben Jos Walbeehm.
