cleanup isn't concurrent-safe #3242

Comments
Guess we can write some marker file or something to prevent this from happening. Other ideas are welcome as well.
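The marker-file idea above can be sketched as follows. This is a hypothetical illustration, not RuboCop's actual implementation: the method name and marker filename are made up, and a crashed process would leave a stale marker behind (a real version would need some expiry handling). The key point is that `File::CREAT | File::EXCL` makes marker creation atomic, so exactly one process wins the right to clean up.

```ruby
require "tmpdir"

# Sketch: only the process that atomically creates the marker file
# performs the cleanup; concurrent processes see EEXIST and skip it.
def cleanup_with_marker(cache_dir)
  marker = File.join(cache_dir, ".cleanup_in_progress")
  begin
    fd = File.open(marker, File::CREAT | File::EXCL | File::WRONLY)
  rescue Errno::EEXIST
    return false # another process is already cleaning up; skip quietly
  end
  begin
    yield # perform the actual cleanup while holding the marker
    true
  ensure
    fd.close
    File.delete(marker) # naive: a crashed process would leave a stale marker
  end
end
```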
Another way would be to ignore errors raised here:

```ruby
sorted = files.sort_by { |path| File.mtime(path) }
remove_files(sorted, dirs, remove_count, verbose)
```

We know the files were there an instant ago, so they were most likely removed by another process.
mikegee added a commit to mikegee/rubocop that referenced this issue on Jan 27, 2017:
We already ignore `Errno::ENOENT` during the removal, but `File.mtime` might raise too.
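Extending that tolerance to the sort could look like the sketch below. The helper name is hypothetical; the idea is to stat each file once and drop any path that vanishes mid-iteration, instead of letting `Errno::ENOENT` escape from `File.mtime`.

```ruby
# Sketch: sort cache files by mtime, silently dropping any file that a
# concurrent process removed between globbing and stat'ing.
def sort_by_mtime_tolerantly(files)
  files.filter_map do |path|
    begin
      [File.mtime(path), path]
    rescue Errno::ENOENT
      nil # removed by a concurrent process; treat as already cleaned up
    end
  end.sort_by(&:first).map(&:last)
end
```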
bbatsov pushed a commit that referenced this issue on Jan 28, 2017.
Expected behavior

I can run many `rubocop` processes at the same time and they all finish.

Actual behavior

If two `rubocop` processes finish running at the same time, one of them can clean up the cache directory, and the other throws.

Steps to reproduce the problem

Run several `rubocop` processes concurrently so that two finish at roughly the same time. One of them should error out in result_cache.rb:35.

RuboCop version
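The race can be simulated deterministically without running two real processes. In this sketch the explicit `File.delete` stands in for a second `rubocop` process winning the cleanup race; the assumption (consistent with the code quoted above) is that cleanup globs the cache directory and then sorts the results by mtime.

```ruby
require "fileutils"

# Simulates the race: a file listed by the glob is deleted (as a
# concurrent cleanup would do) before File.mtime gets to stat it.
def mtime_sort_races?(cache_dir)
  path = File.join(cache_dir, "cache_entry")
  FileUtils.touch(path)
  files = Dir.glob(File.join(cache_dir, "*")) # snapshot of cache contents
  File.delete(path)                           # concurrent cleanup wins here
  files.sort_by { |p| File.mtime(p) }         # stats a file that is now gone
  false
rescue Errno::ENOENT
  true
end
```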