3DMaze low FPS


Moru

When I set the screen size to 1680x1050 and multisampling to 4 in fullscreen, I get low FPS in certain areas. If I go to the south edge of the big corridor on the left (on the screenshot below) and look a bit to the left, I get 60 FPS. If I look straight ahead, I get 30 FPS. If I move forward to the point indicated in the screenshot, it switches back to 60 FPS. Why could this be? It works fine in 1024x768.

Graphics card: GF 6600 GT, fresh drivers
Windows XP sp3


[attachment deleted by admin]

Kuron

Quote: "Why could this be?"
My guess would be:

"I set the Screen size to 1680x1050 and multisampling to 4"

The higher the resolution and sample rate, the slower the FPS generally is, since there is so much more work for the graphics card to do.
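
As a rough back-of-the-envelope figure: 1680x1050 is about 1.76 million pixels per frame versus roughly 0.79 million at 1024x768, so more than twice the fill work before anything else, and 4x multisampling on top of that means the card has to store and resolve up to four samples per pixel.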

Moru

I can play other games at the same resolution, with 4x AA, full detail and shadows, at 35 FPS, so I'm not sure that is the case, unless GLBasic is that much slower at drawing? But it does seem to vary depending on the CPU load. If I run programs in the background consuming 5% of the CPU, I get 30 FPS in more places than when I'm not running anything that consumes CPU.

Kuron

You can really only compare if the other games are using OpenGL and the same version of OpenGL. 

I am guessing that GLBasic is not as optimized as other 3D engines.  I would try it on my end, but I do not have any system capable of that high of a resolution.  For that resolution and the sampling rate, the FPS you get seems very reasonable to me.

Moru

I can't find any OpenGL games. I thought I had one, but it had a setting for using DX7 or DX9... Any suggestions for a free OpenGL game where you can change the resolution?

Hemlos

Moru, Kuron is right: the higher the resolution and the more multisampling, the lower the FPS. Just because you didn't notice the FPS difference in another game with a similar setup doesn't mean that isn't a fact.

Perhaps you are viewing more vertices at that spot, or textures that are too high-resolution for a long viewing distance. That always drags the FPS down, no matter what video card you have and no matter which OpenGL version the game was made with.

Try optimizing the walls to have fewer vertices? I assume one or more walls are using more than 4 vertices; reduce them to 4. And reduce the resolution of the textures that are further away: either make separate lower-resolution textures for distant surfaces, add mipmapping, or do both.


Optimizing 3D:
Try MIPMAPPING, especially if you are trying to render objects and textures at a distance.
Mipmap that back wall for sure! It is VERY high resolution and can be reduced with little noticeable difference.
It is far enough away that if you reduced its resolution you wouldn't notice, and the video card would certainly be doing less work.
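
Since GLBasic renders through OpenGL, here is roughly what mipmapping means at the OpenGL level (a minimal C sketch, not GLBasic syntax; upload_mipmapped_texture, pixels, width and height are made-up names for whatever your image loader provides):

Code:
#include <GL/gl.h>
#include <GL/glu.h>

/* Minimal sketch: upload a texture with a full mipmap chain so the driver
   can sample smaller, pre-scaled levels for distant surfaces.
   'pixels', 'width' and 'height' are assumed to come from your image loader. */
void upload_mipmapped_texture(GLuint tex, int width, int height, const unsigned char *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Trilinear filtering: blend between mipmap levels at a distance. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* gluBuild2DMipmaps scales the image down level by level
       (old fixed-function path, fits a GF 6600 era card). */
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGBA, width, height,
                      GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

The point is that the card then reads a small pre-scaled copy of the texture for that far-away back wall instead of filtering the full-size image every frame.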


Also, I'm not 100% sure about this, but you can also try optimizing with X_FOG. I think it does something similar to mipmapping, in that it cuts down on the hard detail that gets rendered in the distance.
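
For reference, the fixed-function OpenGL fog that a command like X_FOG would correspond to looks roughly like this (a C sketch under that assumption; check the GLBasic help for the actual X_FOG parameters, and the distances here are just example values for a maze-sized scene):

Code:
#include <GL/gl.h>

/* Sketch: linear fog that fades geometry out towards the far plane. */
void enable_linear_fog(void)
{
    GLfloat fog_color[4] = { 0.5f, 0.5f, 0.5f, 1.0f };

    glEnable(GL_FOG);
    glFogi(GL_FOG_MODE, GL_LINEAR);   /* fade linearly between start and end */
    glFogfv(GL_FOG_COLOR, fog_color);
    glFogf(GL_FOG_START, 200.0f);     /* fully visible closer than this */
    glFogf(GL_FOG_END, 1000.0f);      /* fully fogged beyond this */
    glHint(GL_FOG_HINT, GL_NICEST);
}

Note that fog on its own does not skip any rendering work; the saving comes when you combine it with a shorter far clip plane or simply stop drawing geometry beyond the fog end.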


About mipmapping: LUCASARTS games have a special way of reducing the pixels and vertices being rendered.
They make layered versions of each object (frames), with each frame successively and significantly reduced in vertices and texture resolution, so they look almost like blocks when far away. Those versions are the ones rendered at a very far distance; you don't need to see all the details if something is very far off.
Same thing with the textures: they shrink the images significantly for distant views, which is essentially mipmapping. They optimize both images and objects in layers depending on how far from the camera they are. For each object they build multiple versions, matched to the same number of textures, each at a reduced resolution. It is the old-school way to mipmap... I think they invented it, lol.

Basically, anything that reduces the resolution of distant textures and the number of vertices rendered will improve your FPS.
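
What that LucasArts approach boils down to is distance-based level of detail: keep a few pre-built versions of each object and pick one per frame from the camera distance. A minimal sketch of the selection logic (plain C; LodObject, the handle arrays and the 300/800 thresholds are all invented for illustration):

Code:
#include <math.h>

/* Sketch: discrete level-of-detail selection.
   Each object keeps a few pre-built meshes, from detailed (index 0)
   down to a near-box shape, plus matching low-resolution textures. */
typedef struct {
    float x, y, z;        /* object position */
    int   mesh_id[3];     /* detailed, medium, coarse mesh handles */
    int   texture_id[3];  /* matching texture handles */
} LodObject;

/* Pick which version to draw from the distance to the camera.
   The 300/800 thresholds are arbitrary example values. */
int pick_lod(const LodObject *obj, float cam_x, float cam_y, float cam_z)
{
    float dx = obj->x - cam_x, dy = obj->y - cam_y, dz = obj->z - cam_z;
    float dist = sqrtf(dx * dx + dy * dy + dz * dz);

    if (dist < 300.0f) return 0;  /* close: full detail */
    if (dist < 800.0f) return 1;  /* mid range: reduced mesh and texture */
    return 2;                     /* far away: blocky version */
}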

PS: here's a list I found on Google of OpenGL games. The listing is OLD, but you can see what was made in OpenGL... I didn't know Quake was OpenGL :)
http://home.pacbell.net/freundj/openglgames/

Kuron

Moru, you really can't compare the performance of OpenGL with DirectX, even if it is on the same computer.  Some graphics cards handle DX better and some handle OpenGL better.  It is not even a brand thing anymore; it varies between model numbers within the same brand.  I wish I could be of more help, but I haven't had a chance to mess with the 3D side of GLBasic too much.  I am not a 3D artist, and all the models I bought for my game are in B3D format, which is not supported by GLBasic's conversion program.

Kitty Hello

which brings up the old question: Who can write a b3d import filter?
I *think* you can access the triangles from b3d models with Blitz. But I'm not an expert.

Hemlos

Quote from: Kitty Hello on 2008-Aug-26
which brings up the old question: Who can write a b3d import filter?
I *think* you can access the triangles from b3d models with Blitz. But I'm not an expert.

Hatonastick posted a reply, but it's gone now... I don't know what happened to it.

Here is the link he provided to the B3D format spec:
http://www.blitzbasic.com/sdkspecs/sdkspecs/b3dfile_specs.txt
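
For whoever wants to have a go: according to that spec, a .b3d file is a tree of chunks, each one a 4-character tag followed by a 32-bit little-endian size. Here is a minimal C sketch that only walks the top-level chunks, written from the linked document rather than tested against real files, so treat it as a starting point:

Code:
#include <stdio.h>
#include <string.h>

/* Per the linked spec, every chunk is: 4-byte tag, 4-byte little-endian
   size, then 'size' bytes of payload (which may contain nested chunks). */
static unsigned int read_u32_le(FILE *f)
{
    unsigned char b[4];
    if (fread(b, 1, 4, f) != 4) return 0;
    return b[0] | (b[1] << 8) | (b[2] << 16) | ((unsigned int)b[3] << 24);
}

int list_b3d_chunks(const char *path)
{
    FILE *f = fopen(path, "rb");
    char tag[5] = {0};
    if (!f) return -1;

    /* The file starts with a "BB3D" chunk wrapping everything else. */
    if (fread(tag, 1, 4, f) != 4 || strncmp(tag, "BB3D", 4) != 0) {
        fclose(f);
        return -1;
    }
    read_u32_le(f);                       /* size of the BB3D chunk */
    printf("version: %u\n", read_u32_le(f));

    /* The children of BB3D are TEXS, BRUS and NODE chunks; the triangles
       Kitty is after live further down in NODE -> MESH -> TRIS. */
    while (fread(tag, 1, 4, f) == 4) {
        unsigned int size = read_u32_le(f);
        printf("chunk %s, %u bytes\n", tag, size);
        fseek(f, size, SEEK_CUR);         /* skip payload; a real loader would recurse */
    }
    fclose(f);
    return 0;
}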

Kuron

Take that info with a grain of salt.  When somebody tried to implement B3D support in a product competing with GLBasic, it was quickly found that the version described in the docs is somewhat different from what B3D currently uses.

Hemlos

Quote: "Who can write a b3d import filter?"


{points at Kuron}  :rtfm:

Hatonastick

Well...  Wouldn't surprise me, but if that is true, it's not the smartest decision Mark has ever made.  In the long run you could potentially do more damage to yourself, your customers, and potential allies (people who would add B3D import/export to 3D editors, converters, etc.) than to your competitor.  Mind you, it would probably be minor damage no matter which path was taken.  ;/

If push comes to shove, I'd be willing to have a go at using that document to read a B3D file or two, just to see whether it works or not.  I don't know the GLBasic format, so I'm not sure I could make a converter.  Plus, these days I couldn't code my way out of a wet paper bag that has previously had holes gnawed in it by savage rodents.  :|

Kuron

Quote from: Hatonastick on 2008-Aug-27: "it's not the smartest decision Mark has ever made."
Mark has made more than his fair share of blunders over the past few years.  Between the docs and examining some existing B3D models, somebody should be able to write a converter, but the real question is why bother?  B3D is a dead format attached to a dead product and nobody is selling models exclusively in the B3D format now. 

Kitty Hello

Ah. I thought there were a lot of libraries selling B3D models. Strange. That question came up quite frequently.