
View Full Version : Support for i965/GMA X3100 linux driver.


Prometheus
09-22-2007, 10:39 AM
Hi, I bought a new notebook a few days ago which has one of the newest Intel chipsets (GM965), so I got a i965 (or GMA X3100) onboard GFX chip.

Has anyone got Regnum running on such a configuration?

I have tested the game under Windows where it seems to be fine (only tried DirectX though).

Or is the linux driver not supported at all?

Edit: Other 3D games work just fine (Tremulous, Warsow, Glest)

DuoMaxwell
09-22-2007, 11:36 AM
The X3100 supports OpenGL 1.5, but like all IGPs its gaming performance sucks. This also goes for the ones from NVIDIA and ATI; the slowest dedicated graphics card in the same series will be much faster than they are.

If all else fails, see if you can exchange the laptop for something with an NVIDIA chipset.

But right now you're the first I've seen ask about the GMA cards under Linux. Try Windows again under OpenGL.

Prometheus
09-22-2007, 07:53 PM
The X3100 supports OpenGL 1.5, but like all IGPs its gaming performance sucks. This also goes for the ones from NVIDIA and ATI; the slowest dedicated graphics card in the same series will be much faster than they are.

No doubt about that, but it works quite well under Windows (I have tested both Direct3D and OpenGL now), so I can't see any reason why it shouldn't work under Linux too.

Regnum just tells me that the GFX card is not supported and I was wondering why; glxinfo gives all the extensions you'll ever need *g*. As I said before, other 3D games under Linux work fine with the X3100.

Froste
09-22-2007, 07:57 PM
Probably because of S3TC; you can read other threads about it and how to enable it in DRI.
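For anyone who'd rather skip the driconf GUI, here is a minimal sketch of setting the override by hand. The option name `force_s3tc_enable` and the driver name `i965` are assumptions based on Mesa's driconf option list; adjust them for your driver, and note this overwrites any existing `~/.drirc`, so back it up first.

```shell
# Sketch: enable Mesa's S3TC override by writing ~/.drirc directly.
# ASSUMPTION: option name "force_s3tc_enable" and driver "i965" from
# Mesa's driconf options; change "driver" to match your hardware.
cat > "$HOME/.drirc" <<'EOF'
<driconf>
  <device screen="0" driver="i965">
    <application name="all">
      <option name="force_s3tc_enable" value="true" />
    </application>
  </device>
</driconf>
EOF
echo "wrote $HOME/.drirc"
```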

Prometheus
09-23-2007, 07:09 PM
Probably because of S3TC; you can read other threads about it and how to enable it in DRI.
Thanks for pointing this out, but I saw this too and already tried; still the same message. I think it isn't as simple as that :(

stickboy144
10-16-2007, 07:02 PM
So basically this game works with this chipset under Windows but not under Linux? How lame.

I don't see why the developers even put in this box telling us our video card isn't right. Why not let us brute-force it with cards that obviously will run it? That would solve a lot of the queries on this forum. Did you get it working in Linux with that Intel in the end?

toxigenicpoem
10-17-2007, 03:01 AM
The X3100 supports OpenGL 1.5, but like all IGPs its gaming performance sucks. This also goes for the ones from NVIDIA and ATI; the slowest dedicated graphics card in the same series will be much faster than they are.

If all else fails, see if you can exchange the laptop for something with an NVIDIA chipset.

But right now you're the first I've seen ask about the GMA cards under Linux. Try Windows again under OpenGL.


I would agree they are slower, but "sucks" might be too strong a word. :) Depending on what you're doing, an IGP may be just fine. I really don't need anything faster than an FX5500 (NVIDIA), since I don't play many PC games; I hardly use my X1300 Pro, but my Xpress 200M IGP outperforms the FX5500, so all in all it works pretty slick for me.

Now of course I can't play intensive games like F.E.A.R. on it, but for anything DirectX 9.0b or earlier it works great.


@ Prometheus

So you're saying you enabled DRI?

DuoMaxwell
10-18-2007, 09:38 AM
Really? Well, it's kind of to be expected of the FX5500; it was nothing more than an overclocked FX5200, and the entire FX line from NVIDIA sucked. Since the GeForce 6x00 series, though, they have improved by leaps and bounds. Sure, all the mid- and low-range DX10-capable cards are considered slow compared to the previous generation, but the same is said of ATI's cards. The high-end cards are still good performers, but ATI is in a bit of trouble with its current midrange: they cost the same or more than NVIDIA's current cards but are slower. The HD 2600XT, the best of ATI's midrange DX10 cards, is slower than the GeForce 8600GT, and the 8600GTS, the next step up from NVIDIA before the high end, blows it clean out of the water for around the same price.

Now for a different angle: I build comps myself, and it only costs around $50 more for a machine with a mobo without an IGP and with a card such as an NVIDIA 8500GT, their lowest-end current card that doesn't use TurboCache. I have yet to build a machine that wasn't ordered for an office environment without a real GPU card, for the simple fact that for a small amount more the machine is much more capable. IGPs leech off the slow system RAM and sometimes even the CPU, slowing the entire machine down just to render the 2D graphics used all day every day. Why have a machine half as powerful as it could be, for a nominal price increase? Often the low-end cards come with rebates and free Windows games as well. The only things to be careful of with the low-end cards are whether they use TurboCache, which also leeches the bulk of the card's VRAM from the system RAM and is thus counterproductive, and whether the card has crippled memory bandwidth: there are some 64-bit versions of the 8500GT floating around, and there are GDDR2 and GDDR3 RAM versions of the 8500GT and 8600GT. The GDDR3 version of the 8600GT is around 2x as powerful as the crippled GDDR2 versions for the same price.

Don't believe me? Look up the numbers on the tech review sites.

toxigenicpoem
10-18-2007, 07:43 PM
Nope, I don't disagree with you there that the performance gain from going standalone is real. I just don't happen to have a problem with IGP performance, especially when I feel the performance is just as good as a standalone card from the previous generation, even considering it has half the vertex and pixel pipes/shaders.

I'm a developer for a living, so this is just my general experience as well. I was just saying it all comes down to what the use is to determine whether "sucks" applies. :biggrin: For instance, I agree that TurboMemory/HyperMemory is slow. But there is a difference between painting effects and painting loaded memory objects used to pilot the OS. I've seen negligible performance difference in later-model IGPs in paint methods, standalone vs. IGP, if I'm painting native objects that already exist within the memory space. Of course this would be moot if we were talking about AIGLX or WDDM. However, like you said, once I need to create a dynamic allocation of video buffers to paint something complex that doesn't exist, the draw time is considerably slower while offloading these objects to system RAM instead of dedicated GPU RAM.

I also agree with you that in a desktop setup, for the cost of going standalone vs. IGP, the extra power is the better choice. But so far, from what I've seen as far as pricing goes, IGPs still don't lose my vote in the mobile market, unless of course you're going to be doing some heavy gaming.

I see you're a cheesehead, btw ;-) I'm in Saint Paul, MN! And thanks for the Regnum comic on your StumbleUpon. Can you elaborate on your DSL setup a little more for me? I couldn't really make out where your connectors are going; I thought that was pretty interesting.

DuoMaxwell
10-18-2007, 10:15 PM
Ok then, from a developer's standpoint you may rarely be actually using your GPU. Go download the Doom 3, Quake 4 and Enemy Territory: Quake Wars demos for Linux and try them on your IGP while monitoring the framerate, especially during a larger firefight. You'll notice that if the game drops below 30 frames a second you'll start to see it stutter, and the higher you have the graphics settings, the worse this becomes. While you may be able to make do with it in single player most of the time, at least at the lower levels, as the game gets harder with more things going on at once on the screen the problem compounds to the point where you will graphically lag and the game will kill you. The same thing happens when you play these games online: you will get graphical lag in a firefight there against other real people, who will be much smarter than the computer AI... well, not all of them.

Now you may be thinking: what does this have to do with Regnum? Well, the same applies when you are in the average-sized fort war here. During certain times of the day you cannot blame the lag you are getting on NGD's shitty server; the lag is on your end. For a game like Regnum, since it isn't usually fast paced, many players like to max out the graphical settings, which is fine for grinding mobs, but if you are going on a hunt or to a fort war you'll want to cut back your graphical settings to help ensure you get that first cast in.

Now yes, in a laptop the GPUs are branded as cards that they are not; a 7600 Go doesn't hold a candle to the 7600GS, let alone the 7600GT. The only really powerful laptop GPUs are found in the 17"+ sized desktop replacements from companies like Sager, Boxxtech, Falcon Northwest and VoodooPC; often these monsters are little more than $5000 desktop comps with a built-in monitor and a UPS. Who the hell buys a laptop with an AMD FX-60, 4GB of RAM, 3-disk RAID0 and two 7900GTX Go in SLI anyway? You can't be getting more than 5 mins of battery life out of that monster lol.

The grey wire connects to the phone company side of the box that connects to the pole; it goes to a splitter. The beige wire heading off to the right goes up to the 2nd floor to my DSL gateway modem. The other line is a DSL line filter, to keep the damn thing working right, going into the black wire, which is the internal house wiring. I really should be less lazy and make it into something that fits in the box lol.

Prometheus
12-07-2007, 10:58 PM
@ Prometheus

So you're saying you enabled DRI?

Yes, DRI is enabled and S3TC is enabled too. I just played OpenArena (the open-source version of Quake III Arena) under Linux and it works very well. I just don't get why RO doesn't detect my card. :(

Dupa_z_Zasady
01-22-2008, 10:45 PM
Hi, I bought a new notebook a few days ago which has one of the newest Intel chipsets (GM965), so I got a i965 (or GMA X3100) onboard GFX chip.

Has anyone got Regnum running on such a configuration?

I have tested the game under Windows where it seems to be fine (only tried DirectX though).

Or is the linux driver not supported at all?

Edit: Other 3D games work just fine (Tremulous, Warsow, Glest)

Well, I compiled Mesa and DRM from the git repository and there are significant steps forward. I was even able to launch the game for a few seconds in safe mode, though it looks awful at the moment. But on Sunday it didn't load at all. ;)
I've even managed to take a screenshot. Look for yourself and be scared. :P

http://img517.imageshack.us/img517/6048/screenshot2008012221231bn0.jpg (http://imageshack.us)

Miraculix
01-23-2008, 02:35 AM
funky! :guitar:

Prometheus
01-27-2008, 08:13 PM
My most recent attempts now show the "gamigo games" logo, then crash with this message:

game: intel_regions.c:231: intel_region_data: Assertion `dst_offset + dstx + width + (dsty + height - 1) * dst->pitch * dst->cpp <= dst->pitch * dst->cpp * dst->height' failed.
Saving backtrace to crash_backtrace_3909.log
Got SIGABRT (aborted)

going_nutz
01-28-2008, 05:22 PM
I am yet another happy user of those X3100 Intel cards.
I am running Ubuntu Linux 7.10 (Gutsy Gibbon) and Regnum tells me my video card is too old.

That's odd.
Isn't there just some kind of parameter I can pass to rolauncher to bypass the test and just use the card?

Would be very happy!

regards

going_nutz
01-28-2008, 06:00 PM
Ok, did it like this:

http://www.regnumonline.com.ar/forum/showthread.php?t=15740&highlight=s3tc

Unfortunately I am getting the same error as above:
$ game: intel_regions.c:231: intel_region_data: Assertion `dst_offset + dstx + width + (dsty + height - 1) * dst->pitch * dst->cpp <= dst->pitch * dst->cpp * dst->height' failed.
Saving backtrace to crash_backtrace_7954.log
Got SIGABRT (aborted)

Backtrace:
http://paste.pocoo.org/show/24298/

I hope you can get it fixed.

regards

going_nutz
01-28-2008, 06:30 PM
Just to be precise: the above error occurred on Arch Linux. Trying it on Ubuntu with the exact same steps still results in the "Unsupported Video Card" thing...

Prometheus
01-29-2008, 03:51 PM
Just to be precise: the above error occurred on Arch Linux. Trying it on Ubuntu with the exact same steps still results in the "Unsupported Video Card" thing...

The "Unsupported Video Card" message disappeared in the most recent Debian Unstable, so I think upgrading Mesa on your Ubuntu may solve that.

The other error seems to be caused by a problem in the Intel video driver, as you can clearly see that the assertion is raised in the intel video module of X.Org. Maybe we should try a git version like Dupa did, or even contact the corresponding mailing list.

CyberTribe
01-30-2008, 10:04 PM
Up-to-date Gentoo.

The game runs on:
Display controller: Intel Corporation Mobile 945GM/GMS, 943/940GML Express Integrated Graphics Controller (rev 03)

though some texture issues are present.

I'll play around with settings and see if I can solve them.

All I did was this: http://www.regnumonline.com.ar/forum/showthread.php?t=15740&highlight=s3tc

Attached screenshot: the current texture issues.

going_nutz
02-01-2008, 07:48 PM
Gnaaarf.

Come on guys, please fix it! :)

Dupa_z_Zasady
02-26-2008, 03:57 PM
game runs on
Display controller: Intel Corporation Mobile 945GM/GMS, 943/940GML Express Integrated Graphics Controller (rev 03)

http://www.regnumonline.com.ar/forum/showthread.php?t=15740&highlight=s3tc



Too bad that we're talking about a different chipset/card. :>
Sorry, I just could not hold myself back.

I tried the new git DRI and Mesa and there is no significant change. Do you know where to drop error logs? Maybe the developers will be able to do something about it, because it seems to be a shader problem now. And unfortunately the computer was hanging on other 3D applications. But remember, these are bleeding-edge drivers I use.

WesMatte
05-17-2008, 09:14 PM
http://homepage.hispeed.ch/rscheidegger/dri_experimental/s3tc_index.html

I've used an old ATI Radeon 9200 and tried driconf and the "s3tc fix" to make the game install/run, but with visual errors (like most people).

I downloaded the libtxc_dxtn070518.tar.gz file from the link above, and disabled the "s3tc fix" in driconf. Then I installed mesa-common-dev and build-essential with apt-get, extracted the files from libtxc_dxtn070518.tar.gz, and did a "make" and "sudo make install" in the libtxc_dxtn* folder. Started the game and it looks ok now. :thumb:
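If anyone wants to double-check that the library actually got installed, here is a quick sketch. It assumes the library landed in a path the dynamic linker scans (e.g. /usr/local/lib); the exact install path depends on the tarball's Makefile.

```shell
# Sketch: confirm the dynamic linker can see libtxc_dxtn after `make install`.
# ASSUMPTION: it installed to a directory in the ldconfig search path.
if ldconfig -p 2>/dev/null | grep -q "libtxc_dxtn"; then
  echo "libtxc_dxtn: visible to the loader"
else
  echo "libtxc_dxtn: not found (check /usr/local/lib and run 'sudo ldconfig')"
fi
```

If it shows up here but the game still renders garbage, the problem is likely elsewhere in the driver stack rather than the S3TC codec.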

edpl
06-25-2008, 11:17 AM
Almost works... after the intro the game crashes and says:

game: intel_regions.c:231: intel_region_data: Assertion `dst_offset + dstx + width + (dsty + height - 1) * dst->pitch * dst->cpp <= dst->pitch * dst->cpp * dst->height' failed.
Saving backtrace to crash_backtrace_7175.log
Got SIGABRT (aborted)

If I try to run with export LIBGL_ALWAYS_INDIRECT=1, it crashes the whole X server oO

Dupa_z_Zasady
06-25-2008, 01:37 PM
Almost works... after the intro the game crashes and says:

game: intel_regions.c:231: intel_region_data: Assertion `dst_offset + dstx + width + (dsty + height - 1) * dst->pitch * dst->cpp <= dst->pitch * dst->cpp * dst->height' failed.
Saving backtrace to crash_backtrace_7175.log
Got SIGABRT (aborted)

If I try to run with export LIBGL_ALWAYS_INDIRECT=1, it crashes the whole X server oO

See this thread:
http://regnumonline.com.ar/forum/showthread.php?t=23060