It's understandable that Unreal needs to touch a lot of files when starting the editor. But what if I told you that more than 5500 of those files are not needed for the editor to start at all and are just adding multiple seconds to the editor launch time?
Let's fix this!
#UE5 #gamedev
https://larstofus.com/2025/09/27/speeding-up-the-unreal-editor-launch-by-not-opening-5500-files/
Speeding up the Unreal Editor launch by … not opening 5500 files?

In my last article I wrote about some tooltip optimization to reduce the start time of the Unreal Editor by 2-5 seconds. Turns out people do really care about their editor start time. So much that …

Larst Of Us

@LarsThiessen Excellent work! (also, I see @superluminal screenshots, I like :))

I suspect finding and fixing these types of small slowdowns could be a full-time job at this scale. My last several years at Unity were similar, but it is an uphill battle. Yes, you find a ton of small inefficiencies all the time, and they are easily fixable. The problem is that there are several hundred people sometimes adding *new* inefficiencies.

@aras @LarsThiessen @superluminal then again the "source available" nature of Unreal means there are potentially also more people looking at and fixing them. (That, and the average developer at Epic and even the average user of Unreal is much more technical than at Unity.)
@sschoener @LarsThiessen @superluminal Yeah, source availability is… well, I think you know my opinion on what Unity should do (or should have done)

@sschoener @aras @superluminal

True, but more people potentially fixing issues doesn't help if the owner of the code base doesn't have an efficient system to review and merge the incoming pull requests.
Not that I blame them, that's a complex task even many open source communities struggle with :/

@sschoener @aras @superluminal
I'll take everything back, the first of the two changes already got merged. I didn't expect a pull request submitted on a Saturday to already be merged on Monday :o
@LarsThiessen @aras @superluminal maybe it also helped that you wrote a blog post about it and got many eyes on it?
@sschoener @aras @superluminal
Yeah, I *strongly* suspect that being the case :D
I wouldn't complain if that motivated more developers to blog about their work, though. I could use some more reading material ;)
@aras @LarsThiessen @superluminal but it would be surprising if Unreal didn't have some subset of regression tests, each of which fails if it doesn't complete within some time limit on some reference hardware config...?
@aras (Which isn't to say that new inefficiencies can't be added under new conditions. But surely the most common conditions can be accounted for...?)
@JamesWidman @aras @LarsThiessen @superluminal I think that, rather than being surprising, having no such tests would be the norm, and any performance regression testing would be an unexpected bonus. But yes, that would be a nice thing to have, at least in theory. It tends to be very noisy, though, and issues where you're slowly degrading performance over time (often depending on content) are very hard to find with regression tests on their own.
@JamesWidman @aras @LarsThiessen @superluminal in this example, the amount of work scales with how many content packs you have on your local machine. If you have a stripped down bare regression test for performance, then it'd potentially miss the issue entirely if it has no content packs. If it does have content packs your next problem is that when the code was originally added it may have been fine, but the accumulation of content packs over time makes it increasingly noticeable.
@JamesWidman @aras @LarsThiessen @superluminal Which is not to say that was the issue here, it's just how it often goes with regression tests for performance in games. That said, tracking the editor boot time over time is just a good thing to do, and having some benchmark which you always want to stay under is also good. If it blows that budget you can get somebody to investigate. Usually this just happens when people complain or get annoyed enough :')

@dotstdy it seems like assets/content packs would need to be part of some regression test suite (and therefore checked into that suite's repository)?

(here it definitely helps to have a VCS that supports checking out only a strict subset of a large mono-repo.)

@dotstdy tracking performance with asset sets that reflect real-world use cases would of course require A Big Goddamn Mono-Repo, but then, it's not as if Epic doesn't have the resources for this. They ought to be able to have constant inputs and reproducible outputs (including a reproducible duration as one of the outputs for each test in the suite).

And it would help their users to make things faster (which seems like it would help their business?)

@JamesWidman the better question is whether a big automated system would be more efficient or effective than just having somebody check it occasionally. Also note the content packs in a default install are controlled by Epic, but that's not generally true. Those packs are coming from third parties in general.
@JamesWidman @dotstdy
My guess is that they probably have *some* form of regression testing, but, as mentioned, there is no way to ensure it catches everything. Even if they use the Fortnite repo (which would literally be a "big goddamn mono-repo"), it probably wouldn't catch these specific issues with the starter content, since that branch probably doesn't use it.
@aras @superluminal
Thanks, glad you liked it.
I wonder how many fulltime jobs you could actually fill with that amount of stuff if you also include all the plugins and tools that ship with the engine. It's probably a decent number and with all the Fortnite-money Epic could easily pay for them ;)

@LarsThiessen Good stuff!

Here are two other crazy performance catastrophes related to assets:

- if you pass a list of map names when cooking, it'll take a ridiculous amount of time to resolve them to map assets. Use fully qualified names instead.

- The blueprint editor, when it opens for the first time, will scan the asset database for blueprint libraries to populate the functions-list UI, and *load them*. This can be very slow if your blueprint libraries have hard references to assets...

@LarsThiessen Nice! FWIW I believe FindFirstFile needs to open the *directory* that is being scanned - not each file. There’s no syscall per file. It’s not super clear from the article if UE misuses the APIs or if it scans 5500 folders (which could be way more files naturally).

And also while folder scanning is actually quite efficient in Windows, you can have file system filters installed that slow down the process substantially. Notably, pretty much any antivirus software will be a problem.

@zeux Thanks for the info, I was also a bit confused by that part. I would have expected only the folders to be opened, but in my profiling this wasn't the case: the profiler reported 4957 calls to "OpenFile", and in the end 5500 files were listed, so I assume this call happens for every file (with the delta explained by the profiler's resolution).
But it's very possible that this is still not the ideal way 👀
@LarsThiessen I hope this can be merged into the main branch. I don't want to compile UE and run a custom version of the engine.

@LarsThiessen Good work! I applied the changes on my 5.6 and it gives a nice speed bump.

Two additional methods:
- Disable UE's always-enabled plugins (Apple, Linux, VR). I have a tool for this at https://github.com/DarknessFX/UEPlugins_DisableDefault .

- Change Engine/Config/BaseEngine.ini to default to DX12 with SM6 (the default is SM5); this saves shader compilation time when creating a new uproject.

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX12
+D3D12TargetedShaderFormats=PCD3D_SM6
