- Include a detailed description of the bug or suggestion
- `pip list` of the virtual environment you are using
- pytest and operating system versions
- Minimal example if possible
I have a pytest suite that uses fixtures extensively to run some functions with all combinations of options. The result is that pytest collects and runs something like 13,000 tests. Unfortunately, on AppVeyor/Windows at least, this exhausts the system memory and the tests fail. No large objects are created by the tests themselves; it's simply that pytest's in-memory overhead for 13,000+ tests is significant. It would be great to be able to tell pytest to step through these tests 1,000 at a time or so, so that the test suite still runs on memory-constrained systems. Something like --batch-size=1000.
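For anyone hitting the same limit, here is a minimal sketch of a conftest.py workaround that only runs one slice of the collected tests per invocation. The --batch-size and --batch-index option names are hypothetical (pytest has no such built-in flags), and note the caveat: this only deselects tests after collection, so the full suite is still collected and peak collection memory may not improve much.

```python
# conftest.py -- rough sketch of a batching workaround, not a pytest feature.
# The --batch-size/--batch-index options are invented for this example.

def pytest_addoption(parser):
    parser.addoption("--batch-size", type=int, default=0,
                     help="run tests in batches of this size (0 = run everything)")
    parser.addoption("--batch-index", type=int, default=0,
                     help="which batch to run (0-based)")


def pytest_collection_modifyitems(config, items):
    size = config.getoption("--batch-size")
    if not size:
        return
    index = config.getoption("--batch-index")
    start, stop = index * size, (index + 1) * size
    selected = items[start:stop]
    deselected = items[:start] + items[stop:]
    if deselected:
        # Report the rest as deselected and keep only the current batch.
        config.hook.pytest_deselected(items=deselected)
        items[:] = selected
```

With this in place, something like `pytest --batch-size=1000 --batch-index=3` would run only the fourth thousand of the collected tests, and a CI script could loop over the batch indices in separate pytest processes.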
Possibly. Without knowing the details of the implementation, though, I think it may require changes to the test collection process in core pytest. That's just a dumb guess though.