Core: introduce MemoryCounter API

We often run into situations where it would be useful to cheaply estimate the
memory usage of some value (e.g. a mesh or a volume). Examples of where this
came up in the past:
* Undo step size.
* Caching of volume grids.
* Caching of loaded geometries for import geometry nodes.

Generally, most caching systems would benefit from knowing how much memory they
currently use, so they can make better decisions about which data to free and
when. The goal of this patch is to introduce a simple, general API for counting
memory usage that is independent of any specific caching system. I'm doing this
to resolve the chicken-and-egg problem that caches need to know memory usage,
but counting memory usage is of little use before any cache consumes it.
Implementing caching and memory counting at the same time makes both harder
than implementing them one after the other.

The main difficulty with counting memory usage is that some memory may be
shared via implicit sharing, and we want to avoid counting such memory twice.
How exactly shared memory is treated depends on the use case, so the API makes
no specific assumptions about it. The gathered memory usage is not expected to
be exact, only a decent approximation; it is neither a lower nor an upper bound
unless a specific type guarantees that. Caching systems generally rely on
heuristics to decide what to free and when anyway.
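
To make the double-counting concern concrete, here is a minimal sketch of how
such a counter could remember which sharing info it has already seen, so that a
shared buffer is only counted once. The `SharingInfo` placeholder and all member
names are illustrative only, not the actual Blender types.

```cpp
#include <cstdint>
#include <unordered_set>

/* Placeholder for an implicit-sharing handle; only its address matters here. */
struct SharingInfo {};

class MemoryCounterSketch {
 private:
  int64_t counted_bytes_ = 0;
  /* Sharing infos that have already been counted. */
  std::unordered_set<const SharingInfo *> counted_shared_;

 public:
  /* Count memory that is uniquely owned by the caller. */
  void add(const int64_t bytes)
  {
    counted_bytes_ += bytes;
  }

  /* Count shared memory only the first time its sharing info is seen. */
  void add_shared(const SharingInfo *sharing_info, const int64_t bytes)
  {
    if (sharing_info == nullptr) {
      /* The memory is not actually shared, count it directly. */
      this->add(bytes);
      return;
    }
    if (counted_shared_.insert(sharing_info).second) {
      counted_bytes_ += bytes;
    }
  }

  int64_t counted_bytes() const
  {
    return counted_bytes_;
  }
};
```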

There are two sides to this API:
1. Get the amount of memory used by one or more values. This side is used by
   caching systems and/or systems that want to present the used memory to the
   user.
2. Report how much memory a value uses. This side is implemented by types that
   can report their memory usage, such as meshes.

```cpp
/* Get how much memory is used by two meshes together. */
MemoryCounter memory;
mesh_a->count_memory(memory);
mesh_b->count_memory(memory);
int64_t bytes_used = memory.counted_bytes();

/* Report how much memory this mesh uses. */
void Mesh::count_memory(blender::MemoryCounter &memory) const
{
  memory.add_shared(this->runtime->face_offsets_sharing_info,
                    this->face_offsets().size_in_bytes());

  /* Forward memory counting to lower level types. This should be fairly common. */
  CustomData_count_memory(this->vert_data, this->verts_num, memory);
}

void CustomData_count_memory(const CustomData &data,
                             const int totelem,
                             blender::MemoryCounter &memory)
{
  for (const CustomDataLayer &layer : Span{data.layers, data.totlayer}) {
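    /* Passing a callback allows the size computation to be skipped when the
     * shared data does not have to be counted again. */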
    memory.add_shared(layer.sharing_info, [&](blender::MemoryCounter &shared_memory) {
      /* Not quite correct for all types, but this is only a rough approximation anyway. */
      const int64_t elem_size = CustomData_get_elem_size(&layer);
      shared_memory.add(totelem * elem_size);
    });
  }
}
```
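
On the consuming side, a caching system could sum up what its entries use and
compare that against a budget. The sketch below assumes hypothetical `Cache` and
`CacheEntry` types; only `blender::MemoryCounter` and its `counted_bytes()` come
from the API introduced here.

```cpp
/* Hypothetical budget check for a cache of meshes. `Cache` and `CacheEntry`
 * are illustrative types, not part of this patch. */
bool cache_exceeds_budget(const Cache &cache, const int64_t budget_bytes)
{
  blender::MemoryCounter memory;
  for (const CacheEntry &entry : cache.entries()) {
    /* Memory shared between entries should only contribute once to the total. */
    entry.mesh->count_memory(memory);
  }
  return memory.counted_bytes() > budget_bytes;
}
```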

Pull Request: https://projects.blender.org/blender/blender/pulls/126295