Cube & Cube 2 FORUM


General Thread

by Aardappel on 01/05/2002 01:55, 15527 messages, last message: 03/01/2024 13:02, 12364823 views, last view: 10/08/2024 06:43

for questions, announcements etc.

#9044: Re: Problem with shaders

by SanHolo on 09/13/2007 16:01, refers to #9042

The OS X drivers for the latest ATI cards seem to be bad; maybe that's the cause.
What framerates do you get without shaders?

#9045: Re: win 32 exception 0xc0000000005

by acinger on 09/13/2007 16:57, refers to #9043

f721c4058dc45abdd267e7d1ef55e6f6

#9046: Re: win 32 exception 0xc0000000005

by acinger on 09/13/2007 16:58, refers to #9043

Sauerbraten Summer Edition
Amd Athlon 64 3400+
1 GB Ram
ATI Radeon 9800, 256 MB

#9047: Re: win 32 exception 0xc0000000005

by Drakas on 09/13/2007 17:12, refers to #9046

and which OS?

#9048: Re: win 32 exception 0xc0000000005

by acinger on 09/13/2007 17:15, refers to #9047

Windows XP Home 32 bit

#9049: Re: Problem with shaders

by Xannin on 09/13/2007 18:14, refers to #9044

Without shaders it runs at about 20 fps.

#9050: Re: Problem with shaders

by a baby rabbit on 09/13/2007 19:46, refers to #9049

That sounds just wrong given your Mac hardware. SanHolo - I wasn't aware of such driver issues; what is your source?

#9051: Re: Problem with shaders

by Xannin on 09/13/2007 21:45, refers to #9050

That's what it runs at with 1920 x 1200 resolution and 4x FSAA.
Also, I did some checking around and read in several places that there are driver problems with all the new iMacs.

#9052: ..

by Acord on 09/13/2007 23:16

Well, the resolution is VERY high. That and the 4x FSAA are going to take a major bite out of any card's performance. The higher the res, the bigger the hit.

#9053: Problems connecting to master server

by dussander on 09/14/2007 08:47

I'm trying to request the in-game server list and I keep getting the "Master Server Not Replying" message. I'm not behind a router; I'm on WildBlue satellite. Any ideas?

BTW, I'm running XP with the firewall turned off.

Message me on AIM at HerrDussander if you have any ideas.
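One quick way to separate a connectivity problem from a client-side problem is to test whether the master server's port is reachable at all. A minimal sketch in Python - the host and port below are assumptions for illustration; check the game's docs or server config for the real master-server address:

```python
import socket

def can_reach(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical master-server address -- substitute the real host/port.
if __name__ == "__main__":
    print(can_reach("sauerbraten.org", 28787))
```

If this prints False while other sites load fine, the problem is likely the satellite link or an upstream filter rather than the game.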

#9054: Re: Problem with shaders

by Quin on 09/14/2007 18:41, refers to #9042

Well, firstly, try the 'ati_*_bug' variables (ati_skybox_bug fixes the 'see through random parts of the level' issue for me on an R9550).

Next, try messing with your driver settings. Disable Catalyst AI if it is available, and try changing the OpenGL depth buffer (or perhaps even /zpass 0 from the Sauerbraten console).

Last but not least, you could try some of the suggestions on the wiki.

http://cube.wikispaces.com/
http://cube.wikispaces.com/FAQ

Hope this helps point you in the right direction, let us know how you go and if you have any success at all.
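For reference, the workarounds above can also go into autoexec.cfg so they persist across runs. A sketch in CubeScript - the variable names are the ones mentioned in this thread, but whether your build exposes every ati_*_bug variable is an assumption:

```
// Workarounds from this thread -- paste into autoexec.cfg, or type each
// with a leading / in the in-game console.
ati_skybox_bug 1  // works around seeing through random parts of the level on some ATI drivers
zpass 0           // disable the Z-prepass, which can also misrender on ATI
```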

#9055: Re: Problem with shaders

by SanHolo on 09/14/2007 19:02, refers to #9050

I read about the "driver issues" for the new R600 cards on OS X in various benchmark reports. Some games (UT 2004, Doom III, Quake 4) perform worse on the new MBP with the GeForce 8600M than they do on the MBP with the Radeon X1600.

http://www.macworld.com/2007/06/firstlooks/mbpbench/index.php

#9056: Re: Problem with shaders

by SanHolo on 09/14/2007 19:02, refers to #9055

Not R600 of course, I mean the GeForce-8 series. Sorry. =)

#9057: Re: Problem with shaders

by demosthenes on 09/14/2007 20:42, refers to #9056

That's not a driver issue, except with ATI.

The X1600 I looked at just now had a 600MHz core clock and a 1GHz memory clock, compared to the 8600GT I looked at which had a 540MHz core clock and an 800MHz memory clock. Plus, the 8600s use GDDR2 while the X1600s use GDDR3.

Going by a quick glance at those benchmarks, the ATI should be performing comparatively better than it is.
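The clock-rate comparison can be made concrete. Assuming both cards use double-data-rate memory on a 128-bit bus (an assumption - bus width varies by board), peak memory bandwidth from the figures quoted above works out as:

```python
def peak_bandwidth_gbs(mem_clock_mhz, bus_width_bits=128):
    """Rough peak memory bandwidth in GB/s: DDR moves two transfers per clock."""
    return mem_clock_mhz * 2 * bus_width_bits / 8 / 1000

# Figures quoted above: X1600 at a 1 GHz memory clock, 8600 at 800 MHz.
print(peak_bandwidth_gbs(1000))  # X1600: 32.0 GB/s
print(peak_bandwidth_gbs(800))   # 8600:  25.6 GB/s
```

Under those assumptions the X1600 has about 25% more memory bandwidth, which is the basis for expecting it to benchmark better.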

#9058: Re: Problem with shaders

by Xannin on 09/14/2007 22:23, refers to #9054

Thanks Quin, /zpass 0 fixed the problem with seeing through the level. I messed around with turning different shaders on and off, thinking that one shader in particular might have been causing the problem with the models. When I turned off dynamic shadows it seemed to help, and the models would sometimes be visible. However, when they were visible the models were missing random faces and were all somewhat green.

#9059: Message censored by administrator

by oijrsqxatz on 09/15/2007 18:47

#9060: Re: Problem with shaders

by SanHolo on 09/15/2007 22:09, refers to #9057

You're referring to the desktop versions; I am talking about the mobile versions. The mobile X1600 has a 450/470 MHz clock rate. I don't know the exact figures for the mobile 8600, but its clock rates are almost twice as high.

Plus, it would make no sense to build a slower GPU into the next generation of a laptop.

#9061: Re: Problem with shaders

by yetanotherdemosthenescomputer on 09/15/2007 22:52, refers to #9060

To you, the end-user, no, it does not.

However, name-dropping can sell, and sell big. Saying that you've got a GeForce 8 card in a laptop makes more of an impression on the average user than saying you've got an X1600. So, even though the (desktop) X1600 is more powerful than the (desktop) 8600GT by clock speeds, at least, you can still sell more laptops with the "new" chipset than you would with the "old" one, even though the machines are practically identical.

That is, of course, before GPU architecture is brought into it, at which point the lower clock rates of one (desktop) card versus the other may not actually reflect performance. Either way, the main point of the 8 series is its DX10 cores (an 8600 is roughly identical to a 7600 except for the DX version). Since Mac OS can't really use DX (it being Windows-only), there's no real reason to put an 8-series card in there when it's less powerful than a 7-series card might be for a similar price.

Eh, whatever.

#9062: ..

by Julius on 09/15/2007 23:00

AFAIK Mac OS X and Linux can use the "DX10" features of the GeForce 8 series via OpenGL extensions and OpenGL 2.0.
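Concretely, "DX10-class" features show up in OpenGL as extensions such as GL_EXT_gpu_shader4, GL_EXT_geometry_shader4 and GL_EXT_texture_array. A sketch of how one might test a driver's extension string for them - the sample strings below are made up for illustration; a real program would get the string from glGetString(GL_EXTENSIONS):

```python
# OpenGL extensions roughly corresponding to the DX10 feature set.
DX10_CLASS = {"GL_EXT_gpu_shader4", "GL_EXT_geometry_shader4", "GL_EXT_texture_array"}

def has_dx10_class_features(extension_string):
    """True if every DX10-class extension appears in the space-separated list."""
    return DX10_CLASS <= set(extension_string.split())

# Made-up extension strings for illustration.
print(has_dx10_class_features(
    "GL_EXT_gpu_shader4 GL_EXT_geometry_shader4 GL_EXT_texture_array"))  # True
print(has_dx10_class_features("GL_ARB_multitexture GL_EXT_texture3D"))   # False
```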

#9063: Re: ..

by yetanotherdemosthenescomputer on 09/15/2007 23:05, refers to #9062

What?! Links, please. I have never heard of OGL and DX being used in conjunction with each other. It would be interesting to read about if it's true.




content by Aardappel & eihrul © 2001-2024
website by SleepwalkR © 2001-2024