"OMG it's a 25GB game patch, must be tons of amazing new content that they're keeping quiet about."

*Could* be, sure, and stealth launches do happen sometimes, but generally speaking, if it's a big patch with no discernible reason, not attached to anything upcoming that's already been announced, the actual reason is probably something very boring and usually not game-related.

(Very) incomplete list of reasons for big patches that are completely unrelated to new content, from personal experience:

- Somebody added an extra field to some header somewhere, so everything that comes after the header moves and compresses with slightly different context; plus, content chunking puts chunk boundaries in different places now, so all the compressed data after that point changes
- We updated compilers and our assets are built using fast math, YOLO
- Sound assets use Opus in float (not fixed-point) mode, which uses RCPSS/RSQRTSS during encode, and we switched our build machines between Intel and AMD, so presto, new audio for everyone
- we changed audio codecs
- we changed lossless data codecs for some or all data
- we changed _something_ and don't know for sure what or why it's a 15GB patch but we only noticed 4 days before cert and it was too scary to figure out and fix before then, so here's hoping it's fine
- this one simple bugfix in our vector math library had bigger ripple effects than we thought
- this one simple bugfix in the platform libm had bigger ripple effects than we thought
- this one simple bugfix in our compiled asset caching had bigger ripple effects than we thought
- who are we kidding, past a certain scale there's really no such thing as a simple bugfix
- given all of the above, reproducible asset builds are Actually Hard, so we use content-addressed storage and just cache build artifacts... unfortunately, something went wrong and corrupted that store, so we had to rebuild everything, and this is what happened
- these normal maps used to be BC5 and now are ASTC 5x5 on this target, which saves us a bunch of desperately-needed memory at runtime but also means we get to re-distribute new normal maps for ALL THE THINGS
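The header/chunk-boundary entry above is easy to demonstrate. Here's a toy Python sketch with tiny fixed-size chunks and made-up data (real packagers use much larger chunks, and some use content-defined boundaries, but the mechanism is the same): insert one byte near the front, and every chunk after it hashes differently, so a chunk-diff patcher has to ship nearly the whole file again.

```python
import hashlib

CHUNK = 8  # toy chunk size; real systems use something like 64 KB-1 MB

def chunk_hashes(data: bytes) -> list[str]:
    """Split into fixed-size chunks and hash each one."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()[:8]
            for i in range(0, len(data), CHUNK)]

original = b"HEADER|asset-one|asset-two|asset-three|"
patched  = b"HEADERX|asset-one|asset-two|asset-three|"  # one extra header byte

a, b = chunk_hashes(original), chunk_hashes(patched)
# Every chunk from the insertion point onward now hashes differently.
changed = sum(x != y for x, y in zip(a, b))
print(f"{changed} of {len(a)} chunks changed")  # -> 5 of 5 chunks changed
```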
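For scale on the BC5-to-ASTC switch: both formats use 128-bit (16-byte) blocks, so the savings come purely from texels per block, 4x4 for BC5 versus 5x5 for ASTC 5x5. A quick back-of-envelope sketch (toy texture size, single mip level):

```python
def compressed_size(width, height, block_w, block_h, block_bytes=16):
    """Bytes for a block-compressed texture, rounding up partial blocks."""
    blocks_x = -(-width // block_w)   # ceiling division
    blocks_y = -(-height // block_h)
    return blocks_x * blocks_y * block_bytes

bc5  = compressed_size(2048, 2048, 4, 4)  # BC5: 16 texels/block -> 8.0 bpp
astc = compressed_size(2048, 2048, 5, 5)  # ASTC 5x5: 25 texels/block -> 5.12 bpp
print(f"BC5: {bc5 / 2**20:.2f} MiB, ASTC 5x5: {astc / 2**20:.2f} MiB "
      f"({100 * (1 - astc / bc5):.0f}% smaller)")
```

Roughly a third off every normal map at runtime, which is exactly why it's worth it, and also exactly why every one of those textures lands in the patch.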
@rygorous you forgot “there were apparently some not well understood limits in window or file size analysis in the platform packaging system and of the two folks who understood it one left the company and the other is scrambling to understand it and he’s in a different time zone and actually isn’t supposed to be working on this anyway so don’t get your hopes up.” All purely hypothetical of course.
@rygorous or “the platform actually doesn’t provide a way to know the final patch size until it’s done processing on their backend and/or QA installs it from the retail test environment and that is a manual process that takes a few hours for each build.” Again purely hypothetical.
@rygorous Build determinism issue: fused multiply-add. Different build configs or compiler versions strike again!
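The FMA issue, concretely: a fused multiply-add rounds once, while a separate multiply and add round twice, so whether the compiler contracts `a*b + c` into an FMA changes results in the low bits. A Python sketch of the difference, using `fractions.Fraction` to emulate the single rounding a hardware FMA performs (the constants are arbitrary):

```python
from fractions import Fraction

a, b, c = 0.1, 0.2, -0.02

unfused = a * b + c                      # round(a*b), then round(product + c)
exact   = Fraction(a) * Fraction(b) + Fraction(c)  # exact rational arithmetic
fused   = float(exact)                   # one rounding, like a hardware FMA

# Both results are tiny, but they are NOT bit-identical.
print(unfused == fused, unfused, fused)
```

Bake assets on one build config that contracts to FMA and another that doesn't, and every float-derived byte can shift like this, so the artifacts hash differently even though "nothing changed".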
@rygorous The game is made with Unity, doesn't use asset bundles, nothing changed, and you just made a new build.
(TFW what you thought was just a transient local cache is actually a load-bearing persistent data store that it was never designed to be)
@rygorous that is oddly specific. Would you be referring to something with a 3-letter acronym that starts with a D, ends with a C, and has a D in the middle, that typically runs on Windows shares?
@TheIneQuation DDC is one example (although FWIW we don't generally run this on Windows shares at Epic), but I've also built one of these myself (15 years ago at the company I was at at the time) and interacted with multiple RAD customers who also had versions of this, all fairly similar in big picture terms
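For anyone who hasn't built one of these derived-data caches: the core is tiny, and the hard part is the key. A minimal sketch, assuming a local-filesystem backend and entirely made-up names (real systems like DDC have multiple backends, eviction, and networking on top):

```python
import hashlib
import pathlib
import tempfile

class ArtifactCache:
    """Toy content-addressed build-artifact cache (hypothetical API)."""

    def __init__(self, root: str):
        self.root = pathlib.Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def key(self, source: bytes, toolchain: str) -> str:
        # The key must capture EVERYTHING that affects the output: source
        # bytes, compiler version, flags. Miss one input (fast-math, FMA
        # contraction, CPU-specific float paths) and "identical" keys can
        # map to different artifacts across rebuilds.
        return hashlib.sha256(toolchain.encode() + b"\0" + source).hexdigest()

    def get_or_build(self, source: bytes, toolchain: str, build) -> bytes:
        path = self.root / self.key(source, toolchain)
        if path.exists():
            return path.read_bytes()  # cache hit: trust the stored artifact
        artifact = build(source)
        path.write_bytes(artifact)
        return artifact

cache = ArtifactCache(tempfile.mkdtemp())
art = cache.get_or_build(b"shader source", "clang-15 -O2",
                         lambda src: b"compiled:" + src)
print(art)  # -> b'compiled:shader source'
```

The failure modes in this thread fall straight out of that `get_or_build`: if any real input isn't folded into the key, or the store gets corrupted, the cache silently serves stale or wrong artifacts, and nobody notices until the patch size does.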
@rygorous surely you mean "not anymore"? 🙂 I distinctly recall DefaultEngine.ini with site-specific UNC addresses. 🙃 But that was 5+ years ago, admittedly.
@rygorous In a totally different context I recently spent hours to keep a cache that was no longer just a cache from getting cleared. Ah, the beautiful, elegant, exact science of computing! 😭
@rygorous A couple jobs back, the studio I was working at started having our nightly clean build fail, though everyone's workstations and the normal CI build were fine. Turns out we had either moved some file or changed something in the build so that a particular object file was no longer getting rebuilt, but it was still cached for any iterative builds. So until we fixed it, everyone's workspace was a ticking time bomb that would blow up when they ran a clean build.
@rygorous This brings up repressed World of Warcraft memories... I built the asset server. 😰
@stilescrisis I built one 15 years ago for the (long defunct) game company I was at at the time, and felt simultaneously validated and disheartened after coming to Epic and learning UE had a conceptually very similar system (DDC) with mostly the same strengths and headaches...
@rygorous WoW's system is probably a lot more database-centric than most, but a lot of these asset problems are very relatable. (Although we didn't ever YOLO things... if there was a large patch, we'd know why and it'd have a darn good reason.)

@stilescrisis I've perpetrated one of these that should've been a database but wasn't, interacted with a few more, and at Epic UE has its own variant (DDC) with multiple backends that offer different sets of guarantees.

Our monitoring for this for Fortnite has some gaps and systemic problems (that are being worked on). I wish I could say we never rolled out a big patch primarily because we noticed too late to realistically fix it, but I'd be lying.