Feature #4244
open
performance testing configs repo
Added by Peter Manev almost 4 years ago.
Updated over 3 years ago.
Description
Master ticket.
Open to suggestions.
We should be able to come up with a way to share configs and setups for performance testing. For example, configs/pcaps for testing with pktgen, trex, etc.
Whatever can be public of course.
I am thinking of a repo similar to suricata-verify, and at the same time a repo that anyone can pull (or contribute to) to run a test and share results.
This could also cover specific performance corner cases and be included in regular QA, e.g. as part of the per-release process.
- Assignee set to Peter Manev
something like:
- folder structure per tool
- for each tool we could have:
  - per test case (let's say 40G ISP traffic mix, 10G corporate traffic mix, SMB/NFS/KRB5 mix, etc.)
  - a readme explaining the test setup if needed: "how" to run, exact command lines, version of the tool used, link to the redmine ticket
  - a folder with a config
  - a folder with example or needed pcaps
  - a script (optional) if needed to facilitate testing
- also add in a custom.yaml if needed
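The layout above might end up looking something like this (all names are illustrative only, not an agreed convention):

```
<tool>/                      # e.g. trex/, pktgen/
  <test-case>/               # e.g. 40g-isp-traffic-mix/
    README.md                # setup, exact command lines, tool version, ticket link
    configs/                 # suricata.yaml, custom.yaml if needed
    pcaps/                   # example or required pcaps, with sha256 sums
    run.sh                   # optional helper script to facilitate testing
```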
One thing that should be made more detailed/explained is that these test cases are not "copy-paste" production configs.
We might also need specific compile options per test: e.g. a debug/ASAN build.
One thing to be very careful about with those settings is that they need to be well controlled, as they are quite heavy on performance and we might end up chasing our tails. These will be specific cases where the config is not maxing out the testing setup, but rather puts enough pressure to expose a certain wanted or unwanted condition (for example, rs_ functions appearing in the top 20 in perf top).
It could also be interesting to add the expected/recommended duration of each test/config.
It would also make sense if all that info were kept in a structured format (YAML, Python config, etc.) so that it can be picked up by a QA tool and used for running tests in an automated fashion.
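As a rough sketch of such a structured format, a per-test descriptor could carry the tool, command line, configs, pcaps (with sums), and expected duration, and a QA tool could validate it before running. All field names below are assumptions, not an agreed schema; the descriptor is shown as a Python dict but could equally live in a YAML file:

```python
# Hypothetical per-test descriptor and a minimal validator.
# Field names are illustrative only.

REQUIRED_FIELDS = {"name", "tool", "tool_version", "command", "configs",
                   "pcaps", "expected_duration_s", "redmine_ticket"}

def validate_descriptor(desc: dict) -> list:
    """Return a list of problems; an empty list means the descriptor looks usable."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - desc.keys())]
    # every pcap entry should carry its sha256 sum
    for pcap in desc.get("pcaps", []):
        if "sha256" not in pcap:
            problems.append(f"pcap without sha256: {pcap.get('path', '?')}")
    return problems

example = {
    "name": "40g-isp-traffic-mix",
    "tool": "trex",
    "tool_version": "v2.88",
    "command": "./t-rex-64 -f cap2/isp_mix.yaml -d 300",
    "configs": ["suricata.yaml", "custom.yaml"],
    "pcaps": [{"path": "pcaps/isp_mix.pcap", "sha256": "0" * 64}],
    "expected_duration_s": 300,
    "redmine_ticket": "#4244",
}

print(validate_descriptor(example))  # prints [] when the descriptor is complete
```

A flat, declarative schema like this would let the same descriptor drive both a human-readable readme and an automated QA run.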
All the updates above were actually written with trex in mind, so this could differ for another tool (e.g. pktgen).
sha256 sums of any pcaps provided should be included in the doc/config of each test as well.
Pre-run checks need to be done too: is everything that is supposed to be up actually up, do checksums match, do versions match, etc.