This page describes how to execute long-running console commands so that they do not run out of memory. An example is a custom import command or an indexing command.
To avoid quickly running out of memory while executing such commands, you should make sure to:
Avoid letting Stash (the persistence cache) use too much memory in production:
If your system is live, or you otherwise need to keep the cache enabled, disable the Stash InMemory cache, as it does not limit the number of items it holds and grows without bound:
stash:
    caches:
        default:
            inMemory: false
Also, if you use the FileSystem driver, make sure memKeyLimit is set to a low number; the default is 200, and it can be lowered like this:
stash:
    caches:
        default:
            FileSystem:
                memKeyLimit: 100
If your setup is offline and the cache is cold, there is no risk of serving stale cache, and you can disable the Stash cache completely. This will improve the performance of import scripts:
stash:
    caches:
        default:
            # Note: In eZ Publish 5.3 and earlier, "drivers" is called "handlers"
            drivers: [ BlackHole ]
            inMemory: false
For logging with Monolog, if you use either the default fingers_crossed handler or the buffer handler, make sure to specify buffer_size to limit how large the buffer grows before it gets flushed:
monolog:
    handlers:
        main:
            type: fingers_crossed
            buffer_size: 200
Run the command with PHP's memory limit disabled:
php -d memory_limit=-1 app/console <command>
Avoid enabling Xdebug (a PHP extension for debugging/profiling PHP) when running the command, as it will cause PHP to use much more memory.
Even when everything is configured as described above, memory use will still grow by at least 1 kB for each iteration of indexing/inserting a content item, after roughly the first 100 rounds. This is expected behavior; to be able to handle more iterations, you will have to do one or several of the following:
The recommended way to completely avoid "memory leaks" in PHP in the first place is to use separate processes. For console scripts this is typically done via process forking, which is quite easy to do with Symfony.
The things you will need to do:
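As a rough illustration of the process-based approach, the batching can also be driven from a wrapper shell script: each batch runs in a fresh PHP process, so all memory is returned to the OS when that child process exits. Note that the command name app:import and the --offset/--limit options below are hypothetical; substitute your own command and whatever batching options it supports.

```shell
#!/bin/sh
# Sketch: process a large data set in fixed-size batches, one PHP process per
# batch, so memory leaked inside PHP never accumulates across batches.
# For illustration only: fall back to a no-op when php is not installed.
command -v php >/dev/null 2>&1 || php() { :; }

TOTAL=2000   # total number of items to process (assumed known up front)
BATCH=500    # items handled per child process
offset=0
batches=0
while [ "$offset" -lt "$TOTAL" ]; do
    # Each invocation starts with a clean heap; memory_limit is disabled
    # per the recommendation above. Errors from a missing app/console are
    # ignored here to keep the sketch self-contained.
    php -d memory_limit=-1 app/console app:import \
        --offset="$offset" --limit="$BATCH" 2>/dev/null
    offset=$((offset + BATCH))
    batches=$((batches + 1))
done
echo "ran $batches batches"
```

The same idea applies regardless of how the child processes are spawned; a PHP parent script forking workers achieves the identical effect of bounding per-process memory growth.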