I've only got a 1080p screen to test with, so that should really be the lowest common denominator.
I'd suggest doing the run on the Fullscreen Ultra preset to get a worst-case scenario.
For those with higher-resolution panels, runs at both 1080p and native resolution (4K panels could run at 1440p too, if you're up for it) would give good indicators as well.
I've actually started collating the results into an Excel sheet (I might have had too much time on my hands...), so I'm happy to update it with more accurate figures. I think it would also be interesting to see the difference higher-end cards achieve at higher resolutions.
| CMDR | CPU | GPU | RAM | OS | Resolution | Unigine Score | FPS |
| --- | --- | --- | --- | --- | --- | --- | --- |
| BL1p | Ryzen 3700X | Nvidia RTX 3060 12GB | 32GB | W10 | 1920 x 1080 | 2627 | <30 |
| -VR- Max Factor | i5-8600 | Nvidia RTX 2070 | 16GB | | | | >40 |
| CMDR The Riz | i7-8750H | Nvidia RTX 2070 | 32GB | | 1920 x 1080 | | <40 (not Ultra) |
| FalconFly | Ryzen 3600 | AMD RX 580 | 16GB | | 1920 x 1080 | | <30 |
| VR Lucian Arkright | Ryzen 3600 | Nvidia GTX 980 | 16GB | W10 | 1920 x 1080 | 1578 | <30 |
| MCCART | Ryzen 2600X | AMD RX 5600 XT | 32GB | W10 | 1920 x 1080 | | <30 |
| sede | i7-4771 | Nvidia GTX 780M | | | | 709 | |
| Athlon-uk | i7-10700K | Nvidia RTX 3070 | | | 1920 x 1080 | 4307 | |
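For anyone who'd rather collate these results with a script than by hand, here's a minimal sketch that parses a pipe-delimited table like the one above and ranks the entries by Unigine score. The table text is embedded as a sample (only the rows that reported a score); `ranked_by_score` is a name I've made up for illustration, not part of any benchmark tool.

```python
# Sketch: parse a pipe-delimited results table and rank entries by Unigine score.
# Rows without a numeric score are skipped.

TABLE = """\
CMDR | CPU | GPU | RAM | OS | Resolution | Unigine Score | FPS
BL1p | Ryzen 3700X | Nvidia RTX 3060 12GB | 32GB | W10 | 1920 x 1080 | 2627 | <30
VR Lucian Arkright | Ryzen 3600 | Nvidia GTX 980 | 16GB | W10 | 1920 x 1080 | 1578 | <30
sede | i7-4771 | Nvidia GTX 780M | | | | 709 |
Athlon-uk | i7-10700K | Nvidia RTX 3070 | | | 1920 x 1080 | 4307 |
"""

def ranked_by_score(table: str):
    """Return (cmdr, gpu, score) tuples sorted by Unigine score, highest first."""
    lines = table.strip().splitlines()
    header = [h.strip() for h in lines[0].split("|")]
    rows = []
    for line in lines[1:]:
        cells = [c.strip() for c in line.split("|")]
        record = dict(zip(header, cells))
        score = record.get("Unigine Score", "")
        if score.isdigit():  # keep only rows that reported a numeric score
            rows.append((record["CMDR"], record["GPU"], int(score)))
    return sorted(rows, key=lambda r: r[2], reverse=True)

for cmdr, gpu, score in ranked_by_score(TABLE):
    print(f"{score:>5}  {cmdr}  ({gpu})")
```

The same function works on the full table; rows with a blank score column (like the FPS-only entries) simply drop out of the ranking.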