Tag Archives: NVIDIA

Running NVIDIA On GNOME’s X.Org Session May Get A Lot Smoother



Canonical’s Daniel van Vugt continues doing a lot of interesting performance investigations and optimizations around improving the experience of GNOME not only for Ubuntu but the upstream components. His latest focus has been on NVIDIA enhancements and now for the X.Org session there is a merge request pending to provide for a smoother experience.

This week Van Vugt opened a merge request that provides a “significant improvement” to frame-rate smoothness for NVIDIA’s proprietary Linux graphics driver running on GNOME under the X.Org session (the change doesn’t affect the Wayland session).

The change drops the threaded swap wait used with the NVIDIA driver. As Van Vugt explains: “So ‘threaded swap wait’ provided better sub-frame phase accuracy, but at the expense of frame rates. And as soon as it starts causing frame drops, that one and only benefit is lost. There is no reason to keep it.”

Daniel also added, “Noticed way better responsiveness when videos are running in Chrome or CPU is running at 100% (e.g. Handbrake encoding videos). The choppiness is gone. Using a NVIDIA 1080 Ti here using 430.14 driver.”

This code depends upon Daniel’s earlier patches from months ago that consolidated all the frame-throttling code into clutter-stage-cogl. That prerequisite is itself a big win, addressing NVIDIA CPU usage problems as well as mouse cursor issues at 60Hz when the display’s refresh rate is higher, among other benefits of that reworked code.

So let’s hope this merge request lands during the current GNOME 3.33 development cycle so that September’s GNOME 3.34 will be looking good on the performance front. Several optimizations have already been merged this cycle, while many more changes are still pending or stuck in the review queue.


Ubuntu to Package Proprietary Nvidia Driver » Linux Magazine


According to reports, Ubuntu developers are planning to add the proprietary NVIDIA drivers to the ISO of the next release of Ubuntu (19.10).

However, these drivers will not be activated/enabled by default.

The reason for bundling these drivers is simple. As mentioned in the Launchpad bug report, “On Ubuntu desktop, without a network connection, the user can elect to install 3rd party drivers (which states that it’ll install graphics driver) but even if the user selects this option, Nvidia proprietary drivers won’t be installed because they are not on the pool of the ISO.”

With the drivers baked into the ISO, users can install them without an Internet connection. To ensure there won’t be any licensing issues, Will Cooke of Canonical said they have worked with NVIDIA to confirm that Canonical is allowed to distribute the drivers on the ISO. Depending on user feedback, Canonical might also backport this to earlier releases of Ubuntu.

Canonical will continue to offer the open-source Nouveau driver as the default for NVIDIA cards.




Nouveau Gets Initial Support For NVIDIA TU117 (GeForce GTX 1650)



While it missed the main DRM pull request for Linux 5.2, the Nouveau DRM driver now has initial support for NVIDIA’s Turing TU117, the GPU powering the new GeForce GTX 1650 series.

Nouveau DRM maintainer Ben Skeggs of Red Hat committed this TU117 enablement to the driver’s staging tree on Thursday. The support is largely based on the existing Turing TU106 GPU support and amounts to just 36 lines of code.
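As a rough illustration (in Python rather than the driver’s C, and with hypothetical names), enabling a new GPU in a table-driven driver like this mostly amounts to adding one chipset-ID entry that reuses an existing ASIC’s configuration:

```python
# Illustrative sketch only -- not Nouveau's actual code. The driver
# selects per-chipset function tables from an ID read out of the GPU;
# TU117 enablement largely means adding one entry modeled on TU106.
TURING_CHIPSETS = {
    0x164: "TU104",
    0x166: "TU106",
    0x167: "TU117",  # new entry, reusing the TU106-style configuration
}

def lookup_chipset(chipset_id: int) -> str:
    """Return the ASIC name for a recognized Turing chipset ID."""
    try:
        return TURING_CHIPSETS[chipset_id]
    except KeyError:
        raise ValueError(f"unsupported chipset 0x{chipset_id:03x}")
```

The ID values echo Nouveau’s NV16x numbering for Turing parts but are shown purely for illustration; the real enablement lives in the driver’s C chipset tables.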

Like the existing Turing support in this open-source NVIDIA Linux driver, it is currently limited to kernel mode-setting (display) support. Nouveau doesn’t yet offer any hardware acceleration for Turing GPUs, as it remains blocked waiting on NVIDIA to release the signed firmware images needed for initialization.

But even once those Turing firmware blobs are released, there will still be the same issue seen with Maxwell / Pascal / Volta: the GPUs run only at their boot clock frequencies, with no re-clocking support to drive the hardware at its optimal clocks. Overcoming that challenge will require additional firmware support or workarounds around the PMU handling. Until that happens, Nouveau performance on anything newer than the GeForce GTX 700 series remains very slow.

At least the GeForce GTX 1650 does run well on the proprietary NVIDIA driver as outlined in our GeForce GTX 1650 Linux review. If you care about open-source driver support, however, the Radeon RX 570 is a much better bet.


NVIDIA “AltMode” Open-Source Driver Heading To Mainline Kernel With Linux 5.2



There’s a new open-source NVIDIA driver heading to the mainline kernel with Linux 5.2, but don’t get too excited.

The NVIDIA AltMode driver queued up for entrance into the Linux 5.2 kernel is for handling VirtualLink devices with the newest RTX Turing graphics cards that have a USB Type-C connector.

Previously we saw NVIDIA post a new i2c driver for the USB-C connections on their newest Turing graphics cards, while this latest addition is a simple driver for enabling the Type-C Alternate Mode for VirtualLink devices.

With this new code from NVIDIA, Linux 5.2+ will have support for VirtualLink devices.

This was queued in the USB-next code-base following Intel’s patches for UCSI DisplayPort Alternate Mode support across a few commits, also destined for Linux 5.2.

NVIDIA also supplied a larger code contribution: firmware flashing support for their Type-C controller, in order to perform firmware upgrades under Linux.

While their USB Type-C/VirtualLink code bits are open-source, there are signed firmware blobs involved in bringing up their USB controller support.

Valve’s upcoming “Index” VR headset might be among the first VR HMDs supporting a VirtualLink interface.


NVIDIA GeForce GTX 1650 Linux Gaming Performance & Benchmarks Review


This week NVIDIA introduced the $149 USD Turing-powered GTX 1650 graphics card. On launch day I picked up the ASUS GeForce GTX 1650 4GB Dual-Fan Edition (Dual-GTX1650-O4G) for Linux testing, and the initial GTX 1650 Linux performance benchmarks are now out, run under Ubuntu against an assortment of lower-end and older AMD Radeon and NVIDIA GeForce graphics cards.

For $149+ USD, the GeForce GTX 1650 features 896 CUDA cores, a 1485MHz base clock with a 1665MHz boost clock, 4GB of GDDR5 video memory, and Volta-based NVENC video capabilities (not the newer Turing NVENC, but still good, especially compared to older generations of NVIDIA GPUs). It has just a 75 Watt TDP, meaning no external PCI Express power connector is required.

In the case of the ASUS Dual-GTX1650-O4G, I was able to acquire it on launch day for $160 USD, though other models do hit the $149 price point. This particular ASUS SKU uses the same 1485MHz base clock, but its GPU boost clock can reach 1725MHz compared to the 1665MHz reference clock. There is also an ASUS GPU Boost Clock mode under Windows that reaches 1755MHz. No manual overclocking was attempted with this graphics card, since you can read about GPU overclocking on plenty of other websites while we focus on the Linux support and performance aspects.
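For context, the factory overclock on this SKU works out to roughly 3.6 percent over the reference boost clock, using the figures quoted above:

```python
# Factory overclock of the ASUS Dual-GTX1650-O4G boost clock versus
# NVIDIA's reference boost clock, using the clocks quoted above.
reference_boost_mhz = 1665
asus_boost_mhz = 1725
oc_percent = (asus_boost_mhz - reference_boost_mhz) / reference_boost_mhz * 100
print(f"Factory boost overclock: {oc_percent:.1f}%")  # about 3.6%
```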

The ASUS GeForce GTX 1650 Dual-Fan Edition features outputs for DVI-D, HDMI 2.0b, and DisplayPort 1.4. The GTX 1650 does support driving three displays simultaneously. This ASUS graphics card with two fans is a standard dual-slot form factor and the card measures in at 20.4 x 11.5 x 3.7 cm.

This GTX 1650 graphics card worked fine under Linux in conjunction with the new NVIDIA 430.09 beta Linux driver. The initial round of tests was run on Ubuntu 19.04 x86_64 with the Linux 5.0 kernel. No problems were encountered in the time spent thus far benchmarking a variety of OpenGL and Vulkan Linux games (including Steam Play / DXVK titles) and some OpenCL/CUDA compute workloads.