Introduce new cronjob to regularly cleanup outdated lock files if file based lock provider is being used #39372
+112 −6
Description (*)
When Magento is set up to use file based locking, we need to keep the directory that stores these lock files under control.
I'm introducing a cronjob that runs once per day and searches for lock files that haven't been modified in the last 24 hours and can therefore be safely removed. This keeps the contents of the lock files directory under control.
This cronjob only does something when the lock provider is configured to use files; it does nothing when one of the other providers is used (database, which is the default, zookeeper, or cache).
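The core of the cleanup logic can be sketched outside of Magento as follows. This is a simplified Python illustration, not the PR's actual PHP implementation: the `var/locks` directory and the 24-hour threshold come from the description above, while the function name and structure are hypothetical.

```python
import os
import time

def cleanup_lock_files(lock_dir, max_age_seconds=86400):
    """Delete lock files not modified within the last 24 hours.

    Standalone sketch of the cronjob's behaviour; in the PR this runs
    once per day and only when the file lock provider is configured.
    Returns the number of files deleted, which the PR logs as
    "Deleted xxx old lock files".
    """
    deleted = 0
    now = time.time()
    for name in os.listdir(lock_dir):
        path = os.path.join(lock_dir, name)
        if not os.path.isfile(path):
            continue
        if now - os.path.getmtime(path) > max_age_seconds:
            try:
                os.remove(path)
                deleted += 1
            except OSError:
                # Another process may have removed or re-acquired the
                # lock between the stat and the unlink; skip it.
                pass
    return deleted
```

A real implementation would additionally need to avoid racing with a process that still holds the lock, which is why only files untouched for a full day are considered.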
Related Pull Requests
Fixed Issues (if relevant)
Manual testing scenarios (*)
Keep an eye on the var/locks directory; the expectation is that no file should be older than 48 hours.
Check the var/log/system.log file for the phrase "Deleted xxx old lock files". After 48 hours, it should appear at least once, maybe twice or more, and the xxx should in theory be bigger than 0 at least once.
Questions or comments
There is no module under app/code/Magento that is related to Locks, so I picked the Magento/Backend one, because it already has a similar cronjob to keep caches clean.
If this isn't a good idea, how should we identify lock files? Maybe an extra check can be added to only delete files that are 0 bytes, and skip files that are bigger?
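The extra safety check suggested above could look like the sketch below. It assumes, as the question does, that genuine lock files are empty (0 bytes); the helper name is hypothetical and not part of the PR.

```python
import os
import time

def is_deletable_lock_file(path, max_age_seconds=86400):
    """Treat a file as a stale lock only if it is BOTH old and empty.

    Combining the age check with a 0-byte size check would avoid
    deleting unrelated non-empty files that happen to live in the
    configured lock directory (e.g. when it points at /tmp).
    """
    st = os.stat(path)
    is_old = time.time() - st.st_mtime > max_age_seconds
    is_empty = st.st_size == 0
    return is_old and is_empty
```

The trade-off is that any lock file that ever received content would never be cleaned up, so this only works if the file lock provider is known to create empty files.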
The existing integration test used /tmp, so people may use this directory in real life as well, and deleting all kinds of random files from it might not be a good idea.
Contribution checklist (*)