earlephilhower / ezfio

Simple NVME/SAS/SATA SSD test framework for Linux and Windows

FIO 3.12 from Bluestop and comparing results

Hr46ph opened this issue

Hello,

Thanks for the script. I've been figuring out fio command-line parameters to troubleshoot some performance issues, so this comes in handy and takes some work off my hands.

Can you give me any indication of how long a run should take? Any reference? I need to test four 256 GB NVMe drives on two systems: an Asus ROG laptop with a 7th-gen Core i7 on Windows 10, and an HPE ML10 Gen9 with a Xeon E3 v5 on Arch Linux (using a PCIe 3.0 x4 expansion card).

Should Windows results be comparable to Linux results?

My final question: you mention issues with the Bluestop FIO 3.1 build, and their latest is now 3.12. Are there any known issues running ezfio with this version?

Thank you.

Anywhere from 4 hours to 4 days or longer is possible.

Runtime really depends on the speed of the devices under test when fully loaded. ezfio preconditions the drive, normally the longest part, by filling it twice sequentially (a relatively fast operation even on consumer SSDs, except for some QLC drives which I've heard can drop below 100 MB/s writes after sustained bursts), running some tests, then filling it twice with random writes (which can be insanely slow on consumer SSDs compared to data-center-class ones) and running more tests, plus a 20-minute stability run.
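If it helps to ballpark things before committing a machine for days, here is a rough back-of-the-envelope sketch (not part of ezfio) of how the preconditioning-dominated runtime scales with capacity. The throughput figures and the fixed-test overhead are placeholder assumptions you'd replace with your drives' sustained write speeds.

```python
# Rough runtime estimate for an ezfio run, dominated by the preconditioning fills.
# Every number below is an assumption for illustration; substitute the sustained
# write speeds of your actual drives.

def estimate_runtime_hours(capacity_gb,
                           seq_write_mbps=1500.0,   # assumed sustained sequential write speed
                           rand_write_mbps=300.0,   # assumed sustained random write speed
                           fixed_hours=1.5):        # assumed fixed tests + 20-minute stability run
    """Return a rough total runtime estimate in hours."""
    capacity_mb = capacity_gb * 1024
    seq_fill = 2 * capacity_mb / seq_write_mbps / 3600    # two sequential fills
    rand_fill = 2 * capacity_mb / rand_write_mbps / 3600  # two random-write fills
    return seq_fill + rand_fill + fixed_hours

if __name__ == "__main__":
    # Example: one 256 GB NVMe drive with the assumed speeds above.
    print(f"~{estimate_runtime_hours(256):.1f} hours")
```

A small consumer drive with decent sustained writes can finish in a few hours; a large drive that falls to QLC-style write speeds is where the multi-day runs come from.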

Windows and Linux should be roughly comparable; differences mostly come down to the NVMe driver and the CPU scheduler.

I haven't run with FIO 3.12 specifically, but I would be surprised if it had any issues. The result format has stayed relatively consistent across FIO releases.

Thanks for the quick response!

One more question, if you don't mind. I just noticed I can specify a percentage of the drive to use. I'm running the Linux test at 10% alongside a Windows test (same model drive), and all the results so far are very comparable.

If I run on 10%, is the running time also roughly 10% (i.e., does it scale linearly)?

Would you say there's any benefit to running it on the full disk? I don't suspect a problem with the drives per se, but rather with the slot or expansion card they're in (in the ML10). I still need to establish a baseline, hence running the tests on Windows too.

Thanks again!

It's not going to be linear: the scale factor only affects the preconditioning size. The 20-minute sustained run won't shrink, nor will the other tests, so the four preconditioning fills (the longest bits) get shorter while everything else stays constant.
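To put rough numbers on that (illustrative assumptions only, not values reported by ezfio):

```python
# Why a 10% run is not 10% of the time: only the preconditioning fills scale
# with utilization. The hours below are illustrative assumptions.

precondition_hours_at_full = 3.0   # assumed preconditioning time at 100% utilization
fixed_hours = 1.5                  # assumed fixed tests + 20-minute stability run

for utilization in (1.0, 0.10):
    total = precondition_hours_at_full * utilization + fixed_hours
    print(f"{utilization:>4.0%} utilization: ~{total:.1f} h total")
```

So a 10% run is faster, but nowhere near ten times faster.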

For just sanity-checking PCIe slots, I suppose 10% would be fine, but you'll want to make sure you TRIM the drives between runs. The problem is that by only filling 10% of the drive per run, you could be testing peak performance on one run and sustained performance on the next, and those numbers are going to be very different.
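If you end up scripting the repeated runs, something along these lines can handle the TRIM step on Linux between iterations. It assumes util-linux's blkdiscard is installed, needs root, and destroys everything on the device, so double-check the path before using it.

```python
# Minimal sketch: discard (TRIM) an entire block device between ezfio runs on Linux.
# Requires root and util-linux's blkdiscard; this wipes the whole device.
import subprocess
import sys

def trim_whole_device(device: str) -> None:
    """Issue a full-device discard, e.g. trim_whole_device("/dev/nvme0n1")."""
    subprocess.run(["blkdiscard", device], check=True)

if __name__ == "__main__":
    trim_whole_device(sys.argv[1])
```

On Windows, a retrim from the Optimize Drives tool should serve the same purpose between runs.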