
Group test of 49 graphics cards in No Man’s Sky

No Man’s Sky is a project with a long and complicated history. It was released on PC back in 2016, and everyone remembers what happened then: the developers’ ambitions collided with a limited budget and timeframe. It seemed that No Man’s Sky would forever remain a rough draft of the game it could have been had Hello Games not rushed the release. Instead, the developers made every effort to eliminate the worst problems and then, over less than six years, published a large number of technical and content updates. As a result, No Man’s Sky’s ratings have gradually improved, and the game now sits among the top 100 most popular titles on Steam.

Recently the seventh expedition, Leviathan, arrived in the game, and we have timed another benchmark marathon to coincide with this event. The graphics engine of No Man’s Sky has undergone many changes; in particular, the developers have replaced the original OpenGL API with the low-level Vulkan API. As a result, earlier estimates of how fast individual GPUs run the game are no longer relevant, and PC hardware itself does not stand still either. Let’s see which graphics cards of previous generations can still handle the current version of No Man’s Sky, and which settings the newer hardware can afford.

Graphics quality settings

As in most modern games, graphics detail in No Man’s Sky is adjusted with a global switch that controls the individual settings. When testing graphics cards of a given performance level, we used the Standard, Enhanced and Ultra profiles. The first sets the settings to their minimum values, except for base complexity and ambient occlusion, while the last runs the graphics at maximum settings, except for 8x anisotropic filtering instead of the available 16x. The Enhanced profile is an intermediate option in terms of both image quality and performance.

Graphics settings in tests
Setting                  Standard    Enhanced    Ultra
Texture quality          Standard    Enhanced    Ultra
Animation quality        Standard    Enhanced    Ultra
Shadow quality           Standard    Enhanced    Ultra
Post-processing          Standard    Enhanced    Ultra
Reflections              Standard    Enhanced    Ultra
Volumetric effects       Standard    Enhanced    Ultra
Terrain tessellation     Standard    Enhanced    Ultra
Planet quality           Standard    Enhanced    Ultra
Base complexity          Standard    Standard    Ultra
Anisotropic filtering    1x          2x          8x
GTAO                     Standard    Enhanced    Ultra
Anti-aliasing            Off         TAA         TAA

In addition, No Man’s Sky supports image upscaling via DLSS on NVIDIA RTX-series graphics cards and, more recently, via first-generation FSR, which has virtually no hardware restrictions. However, as the benchmarks will show, upscaling in No Man’s Sky is only relevant on low-performance hardware or when playing on a 4K monitor.

The game uses the ubiquitous TAA algorithm as its primary anti-aliasing tool; it is enabled by default with the Enhanced and Ultra presets. And we have to admit that Hello Games still has work to do on this component in future updates: it is rare for TAA to produce such a blurry picture. No Man’s Sky also offers DLAA, the still rarely used full-frame anti-aliasing based on a neural network, which, of course, is only available to owners of NVIDIA accelerators.

Finally, the game caps the frame rate at 240 FPS, but this is unlikely to bother owners of powerful graphics accelerators paired with high-refresh-rate gaming monitors: among the graphics cards participating in the test, only the GeForce RTX 3090 approached the 240 FPS mark, and only at 1080p.

[Screenshot gallery: image quality comparison of the Standard, Enhanced and Ultra presets]

Test bench and testing methodology

Test bench
CPU: AMD Ryzen 9 5950X (4.4 GHz, fixed on all cores)
Motherboard: ASUS ROG Strix X570-E Gaming (Resizable BAR enabled)
RAM: G.Skill Trident Z RGB F4-3200C14D-16GTZR, 4 × 8 GB (3600 MT/s, CL17)
Storage: Intel SSD 760p, 2048 GB
Power supply: Corsair AX1200i, 1200 W
CPU cooler: Corsair iCUE H115i RGB PRO XT
Case: open test bench
Operating system: Windows 10 Pro x64
AMD GPU driver (all graphics cards): AMD Radeon Software Adrenalin 2020 Edition 22.5.2
NVIDIA GPU driver (all graphics cards): NVIDIA GeForce Game Ready Driver 512.95

Performance testing was performed with the OCAT utility. The average and minimum frame rates are derived from the array of individual frame render times that OCAT writes to the results file.

The average frame rate in the charts is the reciprocal of the average frame render time. To estimate the minimum frame rate, the number of frames rendered within each second of the test run is counted, and the value corresponding to the 1st percentile of that distribution is taken.
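For illustration, here is a minimal Python sketch of this calculation. It assumes the OCAT results file is a PresentMon-style CSV with a per-frame MsBetweenPresents column holding render times in milliseconds; the file name and column name are assumptions for the example, not part of our procedure.

    # A minimal sketch of the two metrics described above, assuming the OCAT
    # results file is a PresentMon-style CSV with a "MsBetweenPresents" column
    # holding per-frame render times in milliseconds.
    import csv

    def fps_metrics(path):
        with open(path, newline="") as f:
            frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

        # Average FPS is the reciprocal of the average frame time (converted to seconds).
        avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

        # Count how many frames were rendered within each whole second of the run.
        frames_per_second = {}
        elapsed_ms = 0.0
        for ft in frame_times_ms:
            elapsed_ms += ft
            second = int(elapsed_ms // 1000)
            frames_per_second[second] = frames_per_second.get(second, 0) + 1

        # The "minimum" FPS is the 1st percentile of that per-second distribution.
        counts = sorted(frames_per_second.values())
        index = max(0, int(0.01 * len(counts)) - 1)
        min_fps = float(counts[index])

        return avg_fps, min_fps

    # Hypothetical usage with an example file name:
    # avg, low = fps_metrics("ocat_result.csv")
    # print("Average FPS: %.1f, 1st percentile FPS: %.1f" % (avg, low))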

Test participants:

  • AMD Radeon RX 6900 XT (1825/2250MHz, 16Gb/s, 16GB);
  • AMD Radeon RX 6800 XT (1825/2250MHz, 16Gb/s, 16GB);
  • AMD Radeon RX 6800 (1700/2105MHz, 16Gb/s, 16GB);
  • AMD Radeon RX 6700 XT (2321/2581MHz, 16Gb/s, 12GB);
  • AMD Radeon RX 6600 XT (2064/2607MHz, 16Gb/s, 8GB);
  • AMD Radeon RX 6600 (1626/2491MHz, 14Gb/s, 8GB);
  • AMD Radeon RX 6500 XT (2420/2825MHz, 18Gb/s, 4GB);
  • AMD Radeon RX 5700 XT (1605/1905MHz, 14Gb/s, 8GB);
  • AMD Radeon RX 5700 (1465/1725MHz, 14Gb/s, 8GB);
  • AMD Radeon RX 5600 XT OC (1420/1750MHz, 14Gb/s, 6GB);
  • AMD Radeon RX 5600 XT (1235/1620MHz, 12Gb/s, 6GB);
  • AMD Radeon RX 5500 XT (1607/1845MHz, 14Gb/s, 8GB);
  • AMD Radeon RX 5500 XT (1607/1845MHz, 14Gb/s, 4GB);
  • AMD Radeon VII (1400/1750MHz, 2Gb/s, 16GB);
  • AMD Radeon RX Vega 64 LC (1406/1677MHz, 1.89Gb/s, 8GB);
  • AMD Radeon RX Vega 64 (1247/1546MHz, 1.89Gb/s, 8GB);
  • AMD Radeon RX Vega 56 (1156/1471MHz, 1.6Gb/s, 8GB);
  • AMD Radeon RX 590 (1469/1545MHz, 8Gb/s, 8GB);
  • AMD Radeon RX 580 (1257/1340MHz, 8Gb/s, 8GB);
  • AMD Radeon RX 570 (1168/1244MHz, 7Gb/s, 4GB);
  • AMD Radeon RX 560 16CU (1175/1275MHz, 7Gb/s, 4GB);
  • AMD Radeon RX 560 14CU (1090/1175MHz, 7Gb/s, 4GB);
  • NVIDIA GeForce RTX 3090 (1395/1695MHz, 19.5Gb/s, 24GB);
  • NVIDIA GeForce RTX 3080 Ti (1365/1665MHz, 19Gb/s, 12GB);
  • NVIDIA GeForce RTX 3080 (1440/1710MHz, 19Gb/s, 10GB);
  • NVIDIA GeForce RTX 3070 Ti (1575/1770MHz, 19Gb/s, 8GB);
  • NVIDIA GeForce RTX 3070 (1500/1730MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 3060 Ti (1410/1665MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 3060 (1320/1837MHz, 15Gb/s, 12GB);
  • NVIDIA GeForce RTX 3050 (1550/1780MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 2080 Ti Founders Edition (1350/1635MHz, 14Gb/s, 11GB);
  • NVIDIA GeForce RTX 2080 SUPER (1650/1815MHz, 15.5Gb/s, 8GB);
  • NVIDIA GeForce RTX 2080 Founders Edition (1515/1800MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 2070 SUPER (1605/1770MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 2070 Founders Edition (1410/1710MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 2060 SUPER (1470/1650MHz, 14Gb/s, 8GB);
  • NVIDIA GeForce RTX 2060 (1365/1680MHz, 14Gb/s, 6GB);
  • NVIDIA GeForce GTX 1660 Ti (1500/1800MHz, 12Gb/s, 6GB);
  • NVIDIA GeForce GTX 1660 SUPER (1530/1830MHz, 14Gb/s, 6GB);
  • NVIDIA GeForce GTX 1660 (1530/1785MHz, 8Gb/s, 6GB);
  • NVIDIA GeForce GTX 1650 SUPER (1530/1770MHz, 8Gb/s, 4GB);
  • NVIDIA GeForce GTX 1650 (1485/1725MHz, 8Gb/s, 4GB);
  • NVIDIA GeForce GTX 1080 Ti (1480/1582MHz, 11Gb/s, 11GB);
  • NVIDIA GeForce GTX 1080 (1607/1733MHz, 10Gb/s, 8GB);
  • NVIDIA GeForce GTX 1070 Ti (1607/1683MHz, 8Gb/s, 8GB);
  • NVIDIA GeForce GTX 1070 (1506/1683MHz, 8Gb/s, 8GB);
  • NVIDIA GeForce GTX 1060 (1506/1708MHz, 9Gb/s, 6GB);
  • NVIDIA GeForce GTX 1060 (1506/1708MHz, 8Gb/s, 3GB);
  • NVIDIA GeForce GTX 1050 Ti (1290/1392MHz, 7Gb/s, 4GB).

Note. In parentheses after the names of the graphics cards are the base and boost frequencies according to the specifications of each device. Factory-overclocked graphics cards are brought to the reference parameters (or as close as possible) when this can be done without manually adjusting the clock frequency curve; otherwise, the manufacturer’s settings are used.


About the author

Dylan Harris

Dylan Harris is fascinated by tests and reviews of computer hardware.
