The original file has 266057 bytes (259.8k) and is available from the project website.
There you can also find the official minified version, which brings the size down to 93636 bytes (91.4k, i.e. 35% of the original).

After GZIP compression, these minified files vary in size:

Source             Size                 Type
Microsoft          42638 bytes (41.6k)  CDN
Boot               38749 bytes (37.8k)  CDN
jsdelivr           38749 bytes (37.8k)  CDN
Baidu              33718 bytes (32.9k)  CDN
cdnjs              33714 bytes (32.9k)  CDN
gzip -6 (default)  33487 bytes (32.7k)  local copy
Google             33471 bytes (32.7k)  CDN
cdnhttps           33467 bytes (32.7k)  CDN
gzip -9            33439 bytes (32.7k)  local copy
libdeflate -12     32404 bytes (31.6k)  local copy
7zip -mx=9 -tgzip  32370 bytes (31.6k)  local copy
kzip -s0 -rn -b1   32327 bytes (31.6k)  local copy
Yandex             32275 bytes (31.5k)  CDN
pigz -11 -n        32261 bytes (31.5k)  local copy
Zopfli             32247 bytes (31.5k)  local copy
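The gzip -6 vs. gzip -9 comparison above is easy to reproduce. The sketch below uses a synthetic file instead of jQuery, so the absolute byte counts will differ from the table; only the ordering matters. It assumes a typical Linux userland (gzip, wc):

```shell
# Compare gzip's default level (-6) with its maximum (-9) on a synthetic file.
# -n omits the filename/timestamp from the header, so the output is reproducible.
seq 1 5000 | tr -d '\n' > sample.txt
gzip -6 -n -c sample.txt | wc -c
gzip -9 -n -c sample.txt | wc -c   # typically no larger than the -6 result
```

The difference between levels is small, which is exactly why squeezing out the last few bytes requires tools like Zopfli or KZIP.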

Permalink to the smallest file on my server:
http://minime.stephan-brumme.com/files/jquery/jquery-1.8.3.min.js (or via HTTPS)

You will automatically get the smallest jQuery 1.8.3 file; ETag caching is available, and
if your browser doesn't support GZIP decompression, the uncompressed version will be sent instead.

Currently best Zopfli settings

Save 28 bytes by using my jQuery 1.8.3 Zopfli version instead of the best available CDN (0.09% smaller than Yandex: 32247 vs. 32275 bytes).
You can use my super-compressed files for whatever purpose you like as long as you respect the library's original license agreement.
There are no restrictions from my side - but please avoid hot-linking if you run a high-traffic website.

These command-line settings yielded the best compression ratio so far (Linux version of zopfli-krzymod):
zopfli --i100000 --mb8 --mls16384 --bsr22 --lazy --ohh

(found January 8, 2016)
Description                      Value   Parameter
iterations                       100000  --i100000
maximum blocks                   8       --mb8
maximum length score             16384   --mls16384
block splitting recursion        22      --bsr22
lazy matching in LZ77            yes     --lazy
optimized Huffman headers        yes     --ohh
initial random W for iterations  1       --rw1
initial random Z for iterations  2       --rz2

Verify file integrity

After decompression, my uncompressed files are identical to the original ones:

MD5:
curl --silent --compressed https://code.jquery.com/jquery-1.8.3.min.js --location | md5sum
3576a6e73c9dccdbbc4a2cf8ff544ad7  -
curl --silent --compressed https://minime.stephan-brumme.com/files/jquery/jquery-1.8.3.min.zopfli.js.gz | md5sum
3576a6e73c9dccdbbc4a2cf8ff544ad7  -

SHA1:
curl --silent --compressed https://code.jquery.com/jquery-1.8.3.min.js --location | sha1sum
06e872300088b9ba8a08427d28ed0efcdf9c6ff5  -
curl --silent --compressed https://minime.stephan-brumme.com/files/jquery/jquery-1.8.3.min.zopfli.js.gz | sha1sum
06e872300088b9ba8a08427d28ed0efcdf9c6ff5  -
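This check can be scripted. The sketch below is self-contained: it round-trips a locally generated file through gzip and compares checksums, the same way you would compare a decompressed download against the published hash (assumes gzip and md5sum are installed):

```shell
# Round-trip a file through gzip and verify the checksum survives unchanged.
echo 'var x = 42;' > demo.js
expected=$(md5sum demo.js | cut -d ' ' -f 1)
gzip -9 -n -c demo.js > demo.js.gz
actual=$(gzip -d -c demo.js.gz | md5sum | cut -d ' ' -f 1)
[ "$actual" = "$expected" ] && echo OK   # prints OK
```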

These CDNs send you the original file:
CDN       Size (compressed)  MD5 (uncompressed)                Timestamp
Boot      38749 bytes        3576a6e73c9dccdbbc4a2cf8ff544ad7  March 18, 2015 @ 09:42
jsdelivr  38749 bytes        3576a6e73c9dccdbbc4a2cf8ff544ad7  (invalid)
cdnjs     33714 bytes        3576a6e73c9dccdbbc4a2cf8ff544ad7  (invalid)
cdnhttps  33467 bytes        3576a6e73c9dccdbbc4a2cf8ff544ad7  December 24, 2015 @ 07:33

And some CDNs send you a slightly different file:
CDN        Size (compressed)  MD5 (uncompressed)                Comment / Diff           Timestamp
Microsoft  42638 bytes        e1288116312e4728f98923c79b034b67  only whitespace differs  (invalid)
Baidu      33718 bytes        e1288116312e4728f98923c79b034b67  only whitespace differs  (invalid)
Google     33471 bytes        e1288116312e4728f98923c79b034b67  only whitespace differs  (invalid)
Yandex     32275 bytes        e1288116312e4728f98923c79b034b67  only whitespace differs  June 15, 2015 @ 21:17

Note: only the MD5 hashes are shown to keep things simple.

Other Versions

Available jQuery versions at minime.stephan-brumme.com:

3.3.1, 3.3.0,
3.2.1, 3.2.0,
3.1.1, 3.1.0,
3.0.0,
2.2.4, 2.2.3, 2.2.2, 2.2.1, 2.2.0,
2.1.4, 2.1.3, 2.1.2, 2.1.1, 2.1.0,
2.0.3, 2.0.2, 2.0.1, 2.0.0,
1.12.4, 1.12.3, 1.12.2, 1.12.1, 1.12.0,
1.11.3, 1.11.2, 1.11.1, 1.11.0,
1.10.2, 1.10.1, 1.10.0,
1.9.1, 1.9.0,
1.8.3, 1.8.2, 1.8.1, 1.8.0,
1.7.2, 1.7.1, 1.7.0,
1.6.4, 1.6.3, 1.6.2, 1.6.1, 1.6,
1.5.2, 1.5.1, 1.5,
1.4.4, 1.4.3, 1.4.2, 1.4.1, 1.4,
1.3.2, 1.3.1, 1.3,
1.2.6, 1.2.5, 1.2.4, 1.2.3, 1.2.2, 1.2.1, 1.2,
1.1.4, 1.1.3, 1.1.2, 1.1.1, 1.1,
1.0.4, 1.0.3, 1.0.2, 1.0.1, 1.0

The project site contains an overview of how well these versions compress.
Other interesting projects are AngularJS, BackboneJS, Bootstrap, D3, Dojo, Ember, Knockout, lodash, React, Socket.IO, ThreeJS, UnderscoreJS and Vue.

Changelog

Best Zopfli parameters so far:
Size         Improvement  Parameters                                        Found
32247 bytes  -3 bytes     zopfli --i100000 --mls16384 --bsr22 --lazy --ohh  January 8, 2016 @ 05:40
32250 bytes  -1 byte      zopfli --i100000 --mls2048 --bsr9 --lazy --ohh    September 2, 2015 @ 00:04
32251 bytes  -1 byte      zopfli --i100000 --mls2048 --bsr12 --lazy --ohh   September 1, 2015 @ 23:35
32252 bytes  -1 byte      zopfli --i10000 --mls2048 --bsr9 --lazy --ohh     September 1, 2015 @ 21:02
32253 bytes  -3 bytes     zopfli --i1000 --mls2048 --bsr12 --lazy --ohh     September 1, 2015 @ 20:36
32256 bytes               zopfli --i100 --mls1024 --bsr10 --lazy --ohh      September 1, 2015 @ 20:17

If there are multiple parameter sets yielding the same compressed size, only the first one found is shown.

Most recent activity on June 16, 2016 @ 14:36.

Heatmaps

This Zopfli heatmap visualizes how compression changes when modifying the --bsr and --mls parameters.
Each cell shows the best file size achieved, in bytes (on the interactive page, hovering over a cell reveals the number of iterations).

Good parameters are shown in green, bad ones in red; the best and worst values are bold as well.
The brightness of the blue background color indicates how many iterations were processed:
10,000 or 100,000.
bsr \ mls
2 4 8 16 32 64 128 256 512 1024 2048 4096 8192 16384 32768
32258 32259 32260 32260 32273 32274 32259 32259 32254 32255 32251 32256 32252 32258 32258
32258 32259 32258 32258 32258 32259 32257 32257 32253 32254 32253 32251 32256 32257 32259
32253 32253 32257 32256 32257 32257 32256 32257 32254 32253 32253 32253 32255 32252 32258
32253 32253 32253 32254 32256 32256 32257 32257 32257 32257 32254 32257 32253 32253 32258
32253 32253 32256 32256 32256 32254 32255 32257 32254 32253 32251 32258 32255 32252 32258
32253 32253 32256 32258 32256 32259 32255 32257 32255 32253 32250 32250 32250 32257 32259
32253 32253 32256 32256 32257 32256 32257 32257 32252 32254 32251 32258 32250 32251 32258
32259 32259 32256 32256 32257 32257 32255 32257 32253 32254 32252 32254 32250 32252 32258
32253 32253 32256 32254 32257 32256 32253 32257 32253 32253 32251 32254 32254 32253 32258
32253 32253 32253 32255 32257 32256 32257 32256 32254 32253 32251 32254 32255 32253 32258
32253 32253 32255 32255 32258 32259 32256 32257 32254 32253 32251 32254 32250 32253 32259
32253 32253 32256 32255 32256 32257 32257 32257 32257 32255 32252 32258 32252 32252 32258
32253 32253 32256 32256 32257 32256 32255 32257 32254 32253 32251 32254 32250 32252 32258
32253 32253 32256 32256 32252 32255 32257 32257 32256 32257 32254 32258 32252 32258 32258
32253 32253 32253 32255 32257 32257 32256 32257 32256 32255 32257 32258 32252 32254 32258
32253 32253 32255 32255 32252 32257 32254 32255 32256 32255 32252 32257 32252 32253 32259
32253 32253 32256 32256 32258 32256 32255 32256 32257 32256 32253 32258 32253 32252 32258
32253 32253 32256 32256 32256 32256 32258 32257 32256 32257 32252 32258 32252 32252 32258
32253 32253 32253 32255 32256 32257 32255 32257 32258 32255 32253 32258 32252 32247 32258
32253 32253 32256 32255 32257 32256 32252 32257 32257 32256 32254 32257 32252 32252 32258
32253 32253 32253 32255 32257 32256 32256 32257 32256 32256 32252 32258 32253 32252 32258
32253 32253 32253 32256 32256 32256 32255 32257 32257 32255 32252 32257 32252 32252 32258
32254 32254 32256 32255 32257 32257 32255 32257 32256 32255 32252 32258 32258 32252 32258

Due to the Monte Carlo design of my search algorithm, not all parameters have reached the same number of iterations yet:
Iterations  Min. Bytes   Reduction  Coverage
100         32255 bytes             100%
1,000       32250 bytes  -5 bytes   100%
10,000      32250 bytes             100%
100,000     32247 bytes  -3 bytes   16.23%
1,000,000
10,000,000

KZIP has far fewer options available for tuning/optimization. I only played around with the number of blocks (parameter -n):
Blocks Min. Bytes Compared To Best Zopfli Compared To Best KZIP
32328 bytes +81 bytes (+0.25%) +1 byte
32327 bytes +80 bytes (+0.25%)
32359 bytes +112 bytes (+0.35%) +32 bytes
32392 bytes +145 bytes (+0.45%) +65 bytes
32398 bytes +151 bytes (+0.47%) +71 bytes
32393 bytes +146 bytes (+0.45%) +66 bytes
32428 bytes +181 bytes (+0.56%) +101 bytes
32466 bytes +219 bytes (+0.68%) +139 bytes
32495 bytes +248 bytes (+0.77%) +168 bytes

Non-DEFLATE Algorithms

Archivers based on completely different compression algorithms often produce superior results.
Unfortunately, browsers only support gzip compression at the moment.
Algorithm                              Program  Parameters                        Size         Compared To Best Zopfli
ZPAQ (Wikipedia)                       zpaq     zpaq -method 69                   26166 bytes  -6081 bytes (-18.86%)
RAR (proprietary)                      RAR      rar a -m5 -md64m -mc63:128t -mt1  26902 bytes  -5345 bytes (-16.58%)
PPMd (Wikipedia)                       7zip     7za a -mx=9 -m0=ppmd              28623 bytes  -3624 bytes (-11.24%)
Brotli (Wikipedia)                     brotli   brotli -q 11                      29830 bytes  -2417 bytes (-7.50%)
Burrows-Wheeler transform (Wikipedia)  bzip2    bzip2 -9                          30122 bytes  -2125 bytes (-6.59%)
LZMA2 (Wikipedia)                      xz       xz -9                             31052 bytes  -1195 bytes (-3.71%)
Zstandard (Wikipedia)                  zstd     zstd -19                          31603 bytes  -644 bytes (-2.00%)
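One structural reason DEFLATE falls behind these algorithms: its back-reference window is only 32 KiB, so repeated data lying farther apart than that cannot be exploited, while most of the tools above use much larger windows or context models. The self-contained sketch below demonstrates the limit with gzip alone, using random data so the sizes are easy to reason about:

```shell
# DEFLATE's 32 KiB window: a duplicate 100 KB away is invisible to gzip,
# so compressing the doubled file costs roughly twice as much.
head -c 100000 /dev/urandom > a.bin   # ~100 KB of incompressible random data
cat a.bin a.bin > aa.bin              # exact repeat at distance 100000 > 32768
gzip -9 -n -c a.bin  | wc -c
gzip -9 -n -c aa.bin | wc -c          # close to double the previous number
```

A large-window compressor such as xz or zstd would encode the second copy of a.bin as a single long match and aa.bin would compress to barely more than a.bin.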

Detailed Analysis

I wrote a DEFLATE decoder in JavaScript; on the project page, clicking a button starts a client-side analysis of the smallest gzipped files (it may take a second).
Notes: pigz is a fast open source multi-threaded implementation of gzip written by one of the original authors of gzip.
However, when using compression level 11, pigz actually switches to the slower Zopfli algorithm and isn't multi-threaded anymore.
KrzyMOD's extensions to Zopfli offer the highest level of configurability and are therefore used for my brute-force search.
Ken Silverman wrote the closed-source KZIP compression program and Jonathon Fowler ported it to Linux.
Defluff was created by Joachim Henke; DeflOpt is a tool by Ben Jos Walbeehm.
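You don't need a full decoder to peek at the container format: per the gzip specification, every .gz file starts with the magic bytes 1f 8b followed by the compression method byte, where 08 means DEFLATE. A quick look with standard tools (assumes gzip and od from a typical Linux userland):

```shell
# Show the first three gzip header bytes: magic 1f 8b, then method 08 (DEFLATE).
echo 'hello' | gzip -9 -n | od -An -t x1 -N 3 | tr -d ' '
# prints 1f8b08
```

Everything the DEFLATE decoder analyzes (block types, Huffman trees, match lengths) comes after this fixed header.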

website made by Stephan Brumme in 2015 and still improving in 2018.
all timestamps are displayed in central european time. see my changelog.
no flash, not even images or external css files - and everything squeezed into a single html file.
which was handsomely compressed before releasing it into the wild internet - obviously.

please visit my homepage and my blog, too.
email: minime (at) stephan-brumme.com