CMake: Further increase RAM threshold for regular compile jobs.

Not great, but the alternative solution (putting several 'main'
targets, like `bf_blenkernel`, into the heavy compile pool) would be
even worse.
This commit is contained in:
Bastien Montagne
2025-07-08 17:33:03 +02:00
parent fc32c26722
commit 9e8ffef07b

@@ -1827,16 +1827,23 @@ Define the maximum number of concurrent heavy compilation jobs, for ninja build
 set(_compile_heavy_jobs)
 set(_compile_heavy_jobs_max)
-# Heuristics: Assume 3Gb of RAM is needed per regular compile job.
+# Heuristics: Assume 4Gb of RAM is needed per regular compile job.
 # Typical RAM peak usage of these is actually way less than 1GB usually,
 # but this also accounts for the part of the physical RAM being used by other unrelated
 # processes on the system, and the part being used by the 'heavy' compile and linking jobs.
+#
+# FIXME:
+# There are a few files in 'normal' compile job pool now that require a significant amount of RAM
+# (e.g. `blenkernel/intern/volume.cc` can require almost 5GB of RAM in debug + ASAN builds). Until
+# we can add individual files to the heavy compile pool job (not possible currently with CMake),
+# this is the best that can be done. Alternative solution would be to put several whole targets
+# (like `bf_blenkernel`) into the heavy pool, but that is likely even worse of a workaround.
+#
 # If there are 'enough' cores available, cap the maximum number of regular jobs to
 # `number of cores - 1`, otherwise allow using all cores if there is enough RAM available.
 # This allows to ensure that the heavy jobs won't get starved by too many normal jobs,
 # since the former usually take a long time to process.
-math(EXPR _compile_jobs "${_TOT_MEM} / 3000")
+math(EXPR _compile_jobs "${_TOT_MEM} / 4000")
 if(${_NUM_CORES} GREATER 3)
   math(EXPR _compile_jobs_max "${_NUM_CORES} - 1")
 else()
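
The heuristic in the hunk divides total physical RAM (in MB) by a per-job estimate of 4 GB and, on machines with more than three cores, caps the regular pool at `number of cores - 1` so heavy jobs are not starved. A minimal Python sketch of that arithmetic follows; the function name, the final clamp to at least one job, and the behavior of the `else()` branch (which the diff truncates) are illustrative assumptions, not part of the actual CMake code:

```python
def regular_compile_jobs(total_mem_mb: int, num_cores: int) -> int:
    """Sketch of the heuristic: assume ~4 GB of RAM per regular compile job."""
    # Mirrors: math(EXPR _compile_jobs "${_TOT_MEM} / 4000")
    jobs = total_mem_mb // 4000
    if num_cores > 3:
        # Reserve one core so long-running heavy compile/link jobs keep a slot.
        jobs = min(jobs, num_cores - 1)
    # Assumption: never schedule fewer than one regular job.
    return max(jobs, 1)

# A 32 GB / 16-core machine: 32768 // 4000 = 8 jobs, well under the 15-job cap.
print(regular_compile_jobs(32768, 16))  # -> 8
```

On RAM-rich but core-poor machines the core cap dominates (e.g. 64 GB / 4 cores yields 3 jobs), while on RAM-poor machines the memory term dominates, which is exactly why raising the divisor from 3000 to 4000 reduces concurrency only where RAM is the bottleneck.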