
From time to time I see posts on the Ubuntu Forums about whether the newer AMD graphics cards (R7, R9) will work with Ubuntu. To find out for sure, I requested and Canonical graciously provided an XFX "Double Dissipation" R9 290X.

So do these things work on Ubuntu?

The answer is "Yes" and "No." "Yes" -- and very well -- if the proprietary driver is used with the video acceleration packages. "No" -- not really -- if the current open source driver is used.

The testbed:
CPU: AMD Phenom II 1100T
Motherboard: Gigabyte 990FXA UD3
RAM: 16GB DDR3 1600
PSU: Silverstone Strider 1500W (Bronze, and about 5 years old)
SSD: 250GB Samsung 840 EVO (firmware update applied)
HDD: 500GB mechanical SATA
Case: Cooler Master HAF 912. 2x 200mm fans, 2x 140mm fans, middle drive cage removed to allow for the length of the card.
Monitors: 3-monitor setup. 2x DVI, 1x DisplayPort
 
 
The subject:
 
XFX R9 290X Double Dissipation
 
Releases Tested:
 
12.04.4 LTS, 14.04.1 LTS, 14.10, 15.04 (development)
 
Hardware acceleration packages installed when the proprietary driver was installed:

xvba-va-driver, libva-glx1, libva-egl1, vainfo
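
For anyone following along at home, those can be pulled in with a single apt-get line. A minimal sketch (the package names are exactly the ones listed above, current for the releases tested):

    # Install the VA-API video acceleration stack used alongside fglrx.
    sudo apt-get update
    sudo apt-get install xvba-va-driver libva-glx1 libva-egl1 vainfo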

Tests conducted:
  1. Install Ubuntu fresh from DVD and update.
  2. Test performance of the adapter using the open source Radeon driver.
  3. Test installation of the proprietary driver via Additional Drivers, with the packages required for hardware acceleration installed via Synaptic.
  4. Reinstall Ubuntu fresh from DVD and update.
  5. Test installation of the proprietary driver from the terminal, including the packages required for hardware acceleration.
  6. Test the performance of the proprietary driver.

Performance of each driver was tested with:

  1. Video content from the web.
  2. An .avi file from my network, "Big Buck Bunny". This provides movement from slow to fast with textures that are easy to pick out.
  3. A DVD movie with complex "chase" scenes and complex background movement.
  4. Team Fortress 2 on Steam.
  5. GLMark2 from the Phoronix Test Suite.
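
For reference, the GLMark2 runs look roughly like this. A sketch, assuming the Phoronix Test Suite is already installed (the standalone glmark2 package also works if you would rather skip the suite):

    # Run GLMark2 through the Phoronix Test Suite; it will offer to
    # download the test profile on the first run.
    phoronix-test-suite benchmark glmark2

    # Or run the standalone binary directly:
    sudo apt-get install glmark2
    glmark2 --fullscreen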

 
 
Summary:
 
  1. Installation went as easily as expected, with one exception: while installing 12.04, the adapter drove only two of the three monitors (one DVI monitor and the DisplayPort monitor). Installation was otherwise unremarkable, and 14.04, 14.10 and 15.04 did not exhibit this behavior at all.
  2. Graphical performance of the DE and applications with the open source driver was marginally satisfactory. 3D effects such as the cube performed terribly, since the work was being done by the CPU.
  3. Video, gaming and GLMark2 performance with the open source driver was very poor. Much of the graphical processing ended up being dumped on the CPU, with predictable results. Despite that, the fact that the dedicated open source driver developers are able to get it to work at all continues to amaze me!
  4. Installation of the proprietary driver was unremarkable using both the Additional Drivers method and the terminal method (as described in section 2.1 of https://help.ubuntu.com/community/BinaryDriverHowto/AMD), which I added to the wiki and have been helping to maintain for several years. I did make some edits to that section after this testing.
  5. Graphical performance of the DE and applications using the proprietary driver was crisp and 3D was outstanding. However, some tearing was noted when Tear Free was disabled.
  6. Video, gaming and GLMark2 performance with the proprietary AMD driver were excellent -- as long as the settings were optimized for each.
  7. Those who are interested in purchasing one of these need to be very careful about the PSU they choose. Although the Silverstone Strider ST1500 provides double the 750 watts recommended by AMD, the amperage rating on the four available 12V PCI-E rails was insufficient to run this card from a single rail. On the PSU used, each rail is rated at 25A, but the card's nominal published maximum wattage works out to approximately 30A at 12V for 360W, and since this unit can approach or surpass 400W when stressed or overclocked, at least 35A should be provided (the arithmetic is sketched just below this list). In the end, I had to plug an 8-pin PCI-E power cable into one 25A 12V PCI-E rail and a 6-pin power cable into another rail. Without that, the initial current spike on power-up tripped the over-current/over-voltage protection on the PSU and the computer would not finish the POST or send any signal to the monitors. This PSU was built when top-of-the-line single-GPU cards were rated at about 220W -- and 18A was sufficient. Fortunately, Silverstone included enough cables with the PSU to fill a computer case by themselves.
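
The amperage arithmetic from item 7, for anyone sizing a PSU for one of these. A back-of-the-envelope sketch using the wattage figures discussed above:

    # P = V * I, so I = P / V.
    echo "scale=1; 360 / 12" | bc    # ~30A at 12V for the published 360W
    echo "scale=1; 400 / 12" | bc    # ~33A when stressed/overclocked, hence 35A+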

 

Findings - OS installation, installation of driver using Additional Drivers, installation of driver using the command line:

1. 12.04.4

As indicated, during initial installation of the OS, one monitor was not provided a signal by the adapter. Although disconcerting, this did not keep me from completing the installation. However, after rebooting, the adapter was still not providing a signal to that monitor. That being the case, I did not attempt any performance testing with the open source driver and moved on to installing the proprietary driver.
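
If you run into the same missing-signal behavior, a quick way to see what the session thinks is connected is xrandr. A diagnostic sketch (output names such as DVI-0 vary from card to card, so substitute your own):

    # List the outputs the driver has detected and whether each is active.
    xrandr --query

    # A monitor that shows as connected but dark can sometimes be enabled manually:
    xrandr --output DVI-0 --auto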

Installation of the proprietary driver using the Additional Drivers method was entirely unremarkable.
 
Installation of the driver via the command line was unremarkable.
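
For the curious, the terminal method boils down to something like this. A sketch of the steps from section 2.1 of the wiki page linked in the summary (fglrx-updates can be substituted for fglrx):

    # Install the proprietary driver plus the video acceleration stack.
    sudo apt-get update
    sudo apt-get install fglrx xvba-va-driver libva-glx1 libva-egl1 vainfo

    # Write an initial xorg.conf for fglrx, then reboot.
    sudo amdconfig --initial
    sudo reboot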
 
With the proprietary driver, all three monitors were supplied with a signal.
 
2. 14.04.1, 14.10, 15.04 (development)
 
Installation was a straightforward process as expected, with all three monitors receiving a signal.

Installation using the Additional Drivers method was unremarkable.

Installation using the terminal was unremarkable.


 

Findings - 3D performance of the Desktop Environment (typical, all releases):

 
1. Performance with the open source driver:
 
Most 3D desktop effects were terribly slow and jerky, if they worked at all. Rotating the cube, in particular, was a nightmare.
 
 
2. Performance with the proprietary driver:
 
3D desktop effects, as expected, were very sharp and very fast.

 
 
Findings - video, game and GLMark2 performance (typical, all releases):
 
1. Performance with the open source driver:
 
Not tested on 12.04.4 as indicated above.
 
On 14.04.1, 14.10 and 15.04 (development), performance was marginal for ordinary desktop use and extremely poor for anything graphics-intensive.
 
Web content stuttered and tore.
 
The .avi file stuttered, halted, tore and pixelated.
 
The DVD fared no better than the .avi.
 
I could not get Team Fortress 2 to run at all due to Steam's driver requirements.
 
The GLMark2 tests appeared to be handled almost exclusively by the CPU, which ran between 50% and 75% for the duration of the testing. Tearing, stuttering and complete halts of about half a second were observed. The final score was 36 FPS.
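
The CPU-bound behavior is easy to confirm from a terminal: if the OpenGL renderer string reports llvmpipe rather than the Hawaii GPU, Mesa has fallen back to software rendering. A sketch (glxinfo lives in the mesa-utils package):

    # Check which renderer is actually doing the drawing.
    sudo apt-get install mesa-utils
    glxinfo | grep -i "opengl renderer"
    # "llvmpipe" means the CPU is rendering; a Gallium/HAWAII string
    # means the open source radeon driver is driving the GPU.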

 
 
2. Performance with the proprietary driver; all settings at highest quality; Tear Free enabled; packages xvba-va-driver, libva-glx1, libva-egl1 and vainfo installed (typical, all releases):
 
Web content performed flawlessly.
 
The .avi file, played across my LAN in SMPlayer, was flawless. Movement was smooth and consistent in both the slow and fast moving portions of the video, and no tearing or pixelation was noted. With the hardware acceleration packages installed, the GPU/CPU loads in my conky showed the GPU loaded in varying degrees from about 10% to (rarely) 30% while the CPU was barely affected -- an indication that the entire process was being handled properly by the GPU.
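
A quick way to confirm the acceleration stack is actually wired up is vainfo. A sketch (with xvba-va-driver in place it should list decode profiles rather than an initialization error):

    # Verify that VA-API can reach the GPU's decode engine through XvBA.
    vainfo
    # Expect a list of supported profiles (H.264 and friends) ending in
    # VAEntrypointVLD; an error here means video is decoding on the CPU.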
 
The DVD movie, played in SMPlayer, handled the "chase" scenes with complex background movement flawlessly. The GPU load occasionally spiked to 100% momentarily, but not for a sustained period, and rarely exceeded 30%. Again, the GPU/CPU loads indicated that the work was being handled properly by the GPU.
 
Team Fortress 2 was also flawless, if a bit "slow" in frame rate. The GPU took the lion's share of the load, but I did note some increased load on the CPU; I don't know whether that is simply a function of how TF2 is designed.
 
GLMark2 performed flawlessly at 1920x1080 (if the frame rate was a bit "pokey"): even with all settings at highest quality and Tear Free enabled, it produced a consistent 60 FPS through all the tests. (The argument about FPS notwithstanding, 45 - 60 FPS is about what it takes for most humans to perceive smooth motion on a monitor.) GLMark2 exercised the GPU pretty well, but the load never came close to 100% sustained, only spiking there occasionally and otherwise rarely rising above 50%. The highest observed temperature of the GPU was 59C and the highest observed fan speed was 36%. With regard to temperature and fan speed, it should be noted that this "Double Dissipation" model has two large, lower-rpm fans rather than the single high-rpm reference fan, which produces much better cooling and much lower noise.
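
The temperature and fan numbers above came from polling the card while the tests ran. A sketch using the aticonfig tool that ships with the Catalyst driver (amdconfig is the same tool under another name):

    # Poll GPU temperature, clocks/load and fan speed during a run.
    aticonfig --odgt                          # current GPU temperature
    aticonfig --odgc                          # current clocks and GPU load
    aticonfig --pplib-cmd "get fanspeed 0"    # current fan speed, in percent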

 
 
3. Performance with the proprietary driver, all settings at highest quality (including Wait for Vertical Refresh), Tear Free disabled (typical, all releases):
 
Some occasional minor tearing was noted when playing web content. Frame rate was consistent.
 
Occasional transient minor tearing and pixelation were noted when playing the .avi file. Frame rate was consistent.
 
Minor tearing was noted when playing the DVD movie in SMPlayer. Frame rate was consistent.
 
Minor tearing was noted with Team Fortress 2. There was some stuttering, indicating that the frame rate suffered.
 
GLMark2 demonstrated some minimal tearing and a few periods of odd pixelation. Oddly, the frame rate degraded to 57 FPS.
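
For these runs, Tear Free was toggled in amdcccle (under Display Options). It can also be flipped from a terminal; a sketch using the persistent configuration store key that was commonly documented for Catalyst at the time (X must be restarted for it to take effect):

    # 0 disables Tear Free, 1 enables it; restart X afterwards.
    sudo aticonfig --set-pcs-u32=DDX,EnableTearFreeDesktop,0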

 
 
4. Performance with the proprietary driver, set to use application settings, Tear Free enabled (typical, all releases):
 
I could not see any difference in the video tests between this and when all settings were at high quality. Obviously those applications were being allowed to use optimum values, and Tear Free was key.
 
The TF2 and GLMark2 frame rates increased back to 60 FPS. But that's really not what gamers will want to hear.
 
5. Performance with the proprietary driver, set to use application settings, Tear Free disabled (typical, all releases):
 
I could see some hints of tearing in videos.  Or was it my imagination?
 
But this is where the game and GLMark2 really got to shine -- at least in demonstrating the computing power of the GPU. Tear Free and overriding application settings seem to knock the wind out of them as far as computing performance goes.
 
TF2 was much faster (in terms of frame rate) and the GLMark2 score was pretty good at 2077 -- or about 2000 FPS more than my 75Hz monitors can display. I don't know whether gamers would find that acceptable for any sort of bragging rights or not.
 
Max observed temperature was 65C, max fan speed was 55% and the load hovered near 100%. Obviously the card was working harder than it was before.
 
Unfortunately, this did cause some significant disruption in the smoothness of some of the DE eye candy, like window movement, etc.

 
 
6. Just for giggles and grins: same settings as item 4, but with all 3D effects and compositing turned off in the Desktop Environment (tested on 14.10 only, on a lark):
 
GLMark2 score rose to 5193. Zounds! I think I'll go ahead and pick up that 5200Hz monitor at the local computer store!

 

Conclusion:

As a non-gamer, what this sort of thing represents to a science geek like me is the computing capability of the GPU for GPGPU work -- that's a lot of power for programs designed for GPUs. Miners would certainly love to have four of these things (or two of the 2x GPU units) hooked up to a PSU that supplies the requisite amperage.
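
If GPGPU is the attraction, the Catalyst stack exposes the card through OpenCL, and that is easy to sanity-check. A sketch (the clinfo package may not be available on the older releases tested here):

    # Confirm the GPU shows up as an OpenCL compute device.
    sudo apt-get install clinfo
    clinfo | grep -i -e "platform name" -e "device name"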

 


Legal Disclaimer:  I am not an Ubuntu apologist, but I do use Ubuntu and Kubuntu.  I am not a Fedora apologist, but I do use Fedora.  I am not a Windows apologist, but I do use Windows.  I'm not a FOSS apologist, but I use FOSS tools when I can and when they fit the job at hand.  I don't find any sense in a religious affiliation with tools and operating systems.  That's just asinine.