
View Full Version : Graphics performance


Klutzine
06-29-2010, 01:17 AM
Win7 (DirectX):
Up to 200 fps, never less than 100.
Linux (OpenGL):
Sauerbraten: Never less than 60 fps.
RO: Down to 10-12 fps, usually around 25 fps, never better than 30-36 fps.

Ubuntu Lucid, 4Gb mem, nVidia GTX 480.

What gives?

Freduardo
06-29-2010, 07:37 AM
What driver version are you using? Afaik the older one gives way better performance.

Klutzine
06-29-2010, 12:17 PM
195.36.24

But I use the system for things other than RO, so I'm not going to downgrade my video drivers for the sake of one game!

rabidweezle
07-08-2010, 03:58 AM
Downgrading drivers on any OS is like shooting yourself in the foot. Let's seriously fix this. Even if you have to get Icculus (www.icculus.org), the guy who has ported more commercial games to Linux than anyone else and who works on the SDL libraries, over here to put this together right for you, give it a shot. It's a big turnoff when a game runs worse natively than World of Warcraft does under Wine. Seriously, this is not ranting: Ryan Gordon can get you up and running really stable. Look at Aquaria, Prey, Lugaru; the list goes on. This man gets it done, and he does it right.

As a **temp** fix for those affected: open game.cfg in your Regnum folder and set this line under [debug]

dbg_disable_shaders = 1

You will see a drastic improvement in FPS.
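If you'd rather script the change than edit by hand, here is a minimal sketch. The real path to game.cfg varies by install, so this demonstrates the substitution on a throwaway copy:

```shell
# Flip the temporary shader workaround in a game.cfg-style file.
# We work on a throwaway copy here; point CFG at your real game.cfg
# (the one in your Regnum folder) once you've checked the result.
CFG=$(mktemp)
printf '[debug]\ndbg_disable_shaders = 0\n' > "$CFG"   # sample [debug] section
# Set the flag to 1; sed keeps a .bak backup of the original file.
sed -i.bak 's/^dbg_disable_shaders *= *0/dbg_disable_shaders = 1/' "$CFG"
grep '^dbg_disable_shaders' "$CFG"
```

The `.bak` file keeps the original, so reverting the workaround is just a copy back.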

Forgeon
07-11-2010, 08:46 AM
RO: Down to 10-12 fps, usually around 25 fps, never better than 30-36 fps.

Ubuntu Lucid, 4Gb mem, nVidia GTX 480.

What gives?

You have a problem... of configuration, or CPU.
For me in RO: usually 60 fps (limited by sync-to-vblank, my screen's refresh rate), 30 fps in some scenes, with everything at max quality except AA and anisotropic filtering at 4 each, shader 4 (I just disable terrain shadows because their display is bogus), at 1920x1200x32 fullscreen.

Ubuntu Lucid (amd64), more memory (DDR3-10666; but 4 GB should be OK if nothing else is using it), nVidia GTX 470, i7 at 3+ GHz (though the CPU doesn't reach 100% in RO, and the default scheduler governor is "ondemand", lowest frequency 1.6 GHz), no overclocking, everything stock. Lucid was installed fresh at the end of June 2010 (well, from a Karmic DVD, updated to Lucid right after install).
(Sound was a problem with RO until I installed the libopenal1 package.)

The nvidia driver version is 195.36.24 (packages nvidia-current / nvidia-settings / nvidia-185-modaliases / nvidia-glx-185 ....), from the standard package repositories (no experimental/parallel repository).

How is your cooling? (My GTX 470 sits at 72°C with just two screens and web browsing, disks at 31-35°C, motherboard/GPU ambient reported by the 470 at 50°C; I have no access to the CPU temperature.) Maybe you're being thermally throttled at the graphics card (max temp is 105°C) or the CPU?

ov3rcl0ck
07-13-2010, 05:55 AM
The engine is very poorly done as of now. It relies too much on the CPU, and it only uses one core at that; performance could be several times better if it were properly coded to use hardware acceleration.

The engine definitely does not deliver graphics quality in line with its resource usage. With this resource usage and FPS I should be getting much better shaders, much better graphics, and better models. This is a major issue with a lot of older games.

Today's graphics processing power is insane, but barely anything uses it. We should start programming games as if we were putting them on a console. Consoles have shitty CPUs and little system RAM, whereas they usually have really nice GPUs and dedicated GPU RAM, so you're basically forced to program them well.

Klutzine
07-13-2010, 09:12 PM
Forgeon: We seem to have pretty much the same system, with you having more RAM while I have a somewhat better graphics card, so it is strange that performance varies so much. There is no temperature problem; moreover, RO is the only program I know of with such poor Linux performance. In short, it is not a hardware problem. (After all, our graphics cards are both state of the art.)

Which brings me to Ov3rcl0ck's points: the fact that RO depends so heavily on the CPU for graphics processing is simply stone-age programming. If that isn't fixed, the game will be off the radar in a year or two. And there's no excuse for it either; Nvidia gives programmers and gamers first-rate Linux support. As for Linux users who game on other chipsets (like some laptop users), they're probably aware that they can't expect state-of-the-art graphics.

So NGD really has to get its act together in terms of graphics/OpenGL programming. Soon it'll be too late.

Arafails
07-13-2010, 11:53 PM
Consoles have shitty CPUs and low system RAM, where as they usually have really nice GPUs and dedicated GPU RAM, so you're forced to program them well basically.

Actually, game consoles these days have pretty good CPUs, except for the PS3, which has an exceptionally good one (if you have a Cell BE and you're programming for the GPU, you're doing something wrong). The PPC processors in them (yes, that covers the Wii, Xbox 360, and PS3) may not be clocked very fast, but they can process a heck of a lot of data per cycle, much more than your desktop processor can. They are, after all, based on modern mainframe technology. The GPUs are pretty much there for predictable microcode shader paths. That said, the strong points of these processors require you to think more about what you do with each cycle rather than trying to cram as many instructions as possible through in serial.

I'm sure Regnum could improve in the area of GPU utilisation, though. Threaded input and network code would be nice too (we have all these SMP desktop systems, yet so many games still have their input and network code bound by framerate! WTF?!)

The declaration a while back that hardware skinning relies on having a good CPU worried me. If it originated from a developer, I hope it was a case of whispers changing GPU in the original statement to CPU.

For a more direct reply to the topic, I tried out the 256.34 nvidia drivers and they seem to work faster.

ov3rcl0ck
07-14-2010, 12:05 AM
Actually, game consoles these days have pretty good CPUs, except for the PS3 which has an exceptionally good cpu set (If you have a CellBE and you're programming for the GPU you're doing something wrong).

Yeah, the PS3 has a Cell processor, and the Xbox 360 has a tri-core (yes, oddly enough, it's true). But the result is that companies can now make the same mistakes on consoles that they make with computer games: they can rely on the CPU too much. It won't be long until you have to worry about laggy console games.

It's kinda like what all the old computer gurus say: no need to optimize code, with all this new-age processing horsepower you can make any heap of code work now. It's pretty true, and becoming more so.

Another thing: I don't see why NGD bothers having the engine support both DX and OpenGL, when OpenGL is lighter and, IMO, performs better. On top of that, OpenGL works on just about all modern platforms, including Windows. So they're just making more work for themselves...

My theory on why most game companies make games in DX is that it's easy to whip up a badly coded game with it. DX is slower and heavier, but easier. OpenGL is faster and lighter, but takes more work.

But that doesn't apply to NGD, because they use both (either/or), so this isn't the case...

I don't get it.

However, you should try changing it to OpenGL; you might get better FPS. Go to your Regnum install folder, open the "live" directory, and in the file "game.cfg" change "vg_renderizer=directx" to "vg_renderizer=opengl".
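That edit can be sketched as a one-liner. The real file is live/game.cfg in the Regnum install; this demonstrates on a throwaway copy so nothing in an actual install is touched:

```shell
# Switch the renderer setting in a game.cfg-style file.
# Working on a throwaway copy; substitute your real live/game.cfg path.
CFG=$(mktemp)
echo 'vg_renderizer=directx' > "$CFG"          # sample line from game.cfg
# Rewrite the renderer line; sed leaves a .bak backup of the original.
sed -i.bak 's/^vg_renderizer=directx/vg_renderizer=opengl/' "$CFG"
grep '^vg_renderizer' "$CFG"
```

As noted later in the thread, on Linux opengl is the only valid value anyway, so this mainly matters for Windows installs.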

Klutzine
07-14-2010, 10:20 AM
I totally agree that NGD should concentrate on one graphics platform, and if they want to keep their Linux players, it had better be OpenGL. (Can vg_renderizer be set to anything other than vg_renderizer=opengl in Linux, btw?) But they would have to put some major effort into it, because OpenGL in RO just doesn't cut it at the moment compared to DirectX performance under Windows. Part of the problem is probably that NGD uses its own proprietary engine?

I'm not sure what dbg_disable_shaders=1 does.

Ashnurazg
07-14-2010, 10:30 AM
But they would have to put some major effort into it, because OpenGL in RO just doesn't cut it at the moment as compared to DirectX performance under Win.
I get a good frame rate with OpenGL under Windows.
DirectX crashes when I use Shader 3.0. :fury:

I'm not sure what dbg_disable_shaders=1 does.
It disables the Shaders (http://en.wikipedia.org/wiki/Shader).

Part of the problem is probably that NGD uses its own proprietary engine?
Yes, that's why I support the Open source Petition (http://www.regnumonlinegame.com/forum/showthread.php?t=63322).

Can vg_renderizer be set to anything else than vg_renderizer=opengl in Linux, btw?
No, Linux has no renderer other than opengl.

ieti
07-14-2010, 11:00 AM
I read on the ioquake3 mailing lists about plans to move load from the CPU to the GPU by adding VBOs and GLSL to the engine. I've never programmed with OpenGL and don't know how much this would help, but if it can bring more FPS and make the engine less CPU-dependent, that would be great.

Another point: a lot of Linux users run NVidia cards because their drivers are better supported and less problematic. Would it help to add a module that uses CUDA or OpenCL to move some load onto the GPU (where it's detected as possible, of course)?

Multithreading could also help split the load across multiple cores.

Zas_
07-14-2010, 03:16 PM
I'm on holiday; here I have a four-year-old PC with WinXP and Xubuntu in dual boot and an ATI RV370 video card with 128 MB:
- under Xubuntu, max performance is 12 fps at 1024x768 with the fixed pipeline: unplayable.
- under WinXP, 65 fps at 1024x768 with DirectX 9 and shader 2; 55 fps with OpenGL.

What a pain to have to boot WinXP to play the only good Linux-native MMORPG around...

Klutzine
07-14-2010, 05:45 PM
I understand that dbg_disable_shaders=1 disables the shaders! I wasn't too clear; what I meant was: 1. why is it in the debug section, and 2. what's the difference from the fixed pipeline? Anyway, I have an nVidia GTX 480; there should be absolutely no reason to turn down the graphics quality...

ov3rcl0ck
07-15-2010, 03:28 AM
I totally agree that NGD should concentrate on one graphics platform, and if they want to keep their Linux players, it had better be OpenGL. (Can vg_renderizer be set to anything other than vg_renderizer=opengl in Linux, btw?) But they would have to put some major effort into it, because OpenGL in RO just doesn't cut it at the moment compared to DirectX performance under Windows. Part of the problem is probably that NGD uses its own proprietary engine?

I'm not sure what dbg_disable_shaders=1 does.

vg_renderizer will give an error if set to anything else, I believe. dbg_disable_shaders, I believe, does what the "Fixed Pipeline" option for shaders does: it just disables shaders for older cards that can't handle them and would otherwise show graphical glitches or mega lag. So I assume, anyway.

Klutzine
07-15-2010, 02:43 PM
vg_renderizer will give an error if set to anything else, I believe. dbg_disable_shaders, I believe, does what the "Fixed Pipeline" option for shaders does: it just disables shaders for older cards that can't handle them and would otherwise show graphical glitches or mega lag. So I assume, anyway.

I have assumed the same, actually, and while graphics are certainly not the main reason I play RO, the pedestrian implementation of OpenGL really annoys me, seeing what could be done.

I've been fooling around a bit with OpenCL, and it's simply amazing. While a lot of that may not be directly relevant (after all, the GTX 480 is sort of a supercomputer on a board), it's disheartening to see the possibilities NGD has chosen to pass by. If they had a decent OpenGL implementation, they could really make a splash. (Posting that last remark here to avoid having to deal with the programming illiterates in that other thread.)

Jack-Ruby
07-21-2010, 11:57 AM
I had some small problems early on.

Now it works better:
jackruby:~$ nvidia-settings -g | egrep "OpenGL|rendering"
direct rendering: Yes
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce 8400 GS/PCI/SSE2
OpenGL version string: 2.1.2 NVIDIA 173.14.22
OpenGL extensions:


I modified the file game.cfg, setting dbg_disable_shaders to 1.


http://h.imagehost.org/view/0002/dbg_disable_shaders_a_0 dbg_disable_shaders=0
http://h.imagehost.org/view/0655/dbg_disable_shaders_a_1 dbg_disable_shaders=1


My file "game.cfg"
http://pastebin.com/ddNG6VMQ

I still have to figure out how to avoid the crucifixion.