Hi all,

According to the TeX in Practice book, "the largest dimension value that can be represented in a TeX program is 16383.99999pt. This value is assigned to a dimension register \maxdimen [...]." This is a tiny bit smaller than 2^14 pt. As a layman TeX user, my naive guess is that inside the engine, dimension registers are stored in an int32_t or some other fixed-length integer type, with the engine implementing its own fixed-point arithmetic on top of that. Either way, whenever an overflow is about to happen, the engine prints a "! Dimension too large" error.
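
For concreteness, here is a minimal C sketch of how such a fixed-point representation could look. It assumes TeX's usual "scaled point" unit (1pt = 2^16 sp) and a maximum of 2^30 - 1 sp; this is only my own illustration of the idea, not the engine's actual source:

    /* Sketch of a TeX-style fixed-point dimension: 1pt = 2^16 sp,
     * values kept in a 32-bit signed integer, maximum 2^30 - 1 sp.
     * Illustration only, not the engine's real code. */
    #include <stdint.h>
    #include <stdio.h>

    typedef int32_t scaled;                     /* a dimension, in scaled points (sp) */

    #define UNITY     ((scaled)1 << 16)         /* 1pt = 65536 sp            */
    #define MAX_DIMEN (((scaled)1 << 30) - 1)   /* \maxdimen = 1073741823 sp */

    /* Add two nonnegative dimensions; complain, as TeX does, on overflow. */
    static scaled add_dimen(scaled a, scaled b)
    {
        if (a > MAX_DIMEN - b) {
            fprintf(stderr, "! Dimension too large.\n");
            return MAX_DIMEN;
        }
        return a + b;
    }

    int main(void)
    {
        printf("\\maxdimen = %.5fpt\n", (double)MAX_DIMEN / UNITY); /* 16383.99998pt */
        add_dimen(16000 * UNITY, 1000 * UNITY);                     /* triggers the error */
        return 0;
    }

With these constants the maximum comes out a whisker under 2^14 pt, which is where the quoted limit comes from.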

This error also occurs when using LuaTeX. My gut feeling is that the current limit of (almost) 16384 pt is mostly historic; could it nowadays be loosened somehow? Perhaps the dimension calculations already happen in Lua and the error is only kept for legacy reasons; or perhaps they are still done in C, and if an int32_t is indeed used, switching to int64_t could conceivably raise the maximum dimension to astronomical lengths, on the order of tens of millions of kilometres, far more than anyone would ever reach by accident.
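
To put rough numbers on that, here is a back-of-the-envelope calculation. It assumes a hypothetical 64-bit engine would keep 1pt = 2^16 sp and reserve the same two "spare" bits (sign plus one guard bit) that the 32-bit representation does; the constants are my assumptions for illustration, not anything LuaTeX actually implements:

    #include <stdio.h>

    int main(void)
    {
        const double sp_per_pt = 65536.0;               /* 1pt = 2^16 sp                    */
        const double pt_in_m   = 0.0254 / 72.27;        /* 1 TeX point ~ 0.3515 mm          */
        const double max32_sp  = 1073741823.0;          /* 2^30 - 1: today's \maxdimen      */
        const double max64_sp  = 4611686018427387903.0; /* 2^62 - 1: hypothetical 64-bit cap */

        printf("32-bit maximum: %.2f m\n",  max32_sp / sp_per_pt * pt_in_m);           /* ~5.76 m   */
        printf("64-bit maximum: %.1e km\n", max64_sp / sp_per_pt * pt_in_m / 1000.0);  /* ~2.5e7 km */
        return 0;
    }

Going from 30 to 62 magnitude bits would multiply the ceiling by 2^32, taking it from about 5.76 metres to roughly 25 million kilometres.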

Looking forward to hearing your thoughts.

Best,
Sjors Gielen

P.S.: The discussion I am hoping to provoke is theoretical, but for background, my motivation comes from using the 'mdframed' package to split a large frame over many pages. Our automated testing system creates mdframed environments containing test steps and their outcomes; we have apparently crossed a threshold in the number of steps at which the accumulated vbox height exceeds \maxdimen before the vbox is split over multiple pages. If the error is ignored, the generated PDF looks fine. We can also end the mdframed environment every N steps and immediately start a new one, which is an acceptable workaround, but I'm a bit unsatisfied that it is necessary. Perhaps this is better considered a bug in mdframed than in the typesetting engine, but I still find the theoretical question behind \maxdimen interesting, hence this e-mail.