Quake Custom Engines
Discuss modified Quake engines here, I guess. What engines do you use? What are the pros/cons of existing engines? What features would you like to see implemented/removed?
@Izhido 
MH's now dead engine, DirectQ, could easily clock 5000+ fps on an id1 map. 
@Baker 
I'm pretty sure that's true, no doubt about it.
Do we have, however, some kind of display that can actually show those 5K FPS?
That was really kind of my point. :) 
 
Here is a partial list of "blockbuster" map releases in the last few years:

a) A Roman Wilderness of Pain [aka ARWOP]
b) The Altar of Storms [aka ne_ruins]
c) Rubicon Rumble
d) Remake Quake

All of these will grind down to single-digit frames per second in a traditional GLQuake-style engine, whereas a DirectQ or FTEQW, instead of getting 12 fps, will get 150 or 300 fps.

And these kinds of mega-maps have been "the norm" in the last 5-6 years. There are plenty more where those came from. 
 
Whoa. Can they run on vanilla? I'd love to test them on an iPhone ;) 
 
a) requires enhanced network protocol (FitzQuake 666 or similar)
b) requires enhanced network protocol (FitzQuake 666 or similar)
c) requires BSP2 support, enhanced network protocol (FitzQuake 666 or similar) and should have alpha masked texture support.
d) requires the Remake Quake engine bundled in the download (BSP2, alpha masked texture support, Remake Quake's protocol 999, hurt blurs, other 2D effects) 
 
What kind of hardware is actually limited to low FPS in those maps? Laptops with Intel GPUs? No system I've played them on with Nvidia GPUs has had issues. 
 
With what engine? Quakespasm since late 2014 or thereabouts has used vertex arrays, so performance is about triple the norm on maps with tons and tons of monsters. 
 
Usually Quakespasm. 
@Izhido 
try them. fix stuff that doesn't work. job done.
it's not like engines that support this stuff are closed source, so it shouldn't be that hard considering the stuff you've previously managed. 
@Izhido 
Sample source code implementing protocol 666 download source

Every single protocol 666 change is marked with:

#ifdef FITZQUAKE_PROTOCOL

Yeah, every last change is marked. 
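For illustration, a guarded change of that kind might look something like the sketch below. The macro values and names here are assumptions based on the FitzQuake convention, not copied from the actual sample source:

```c
/* Hypothetical sketch: compile-time guard around a protocol-666 change.
   Names/values are illustrative, not taken from the real download. */
#define FITZQUAKE_PROTOCOL

#ifdef FITZQUAKE_PROTOCOL
#define PROTOCOL_VERSION 666   /* extended FitzQuake protocol */
#else
#define PROTOCOL_VERSION 15    /* original NetQuake protocol */
#endif

static int current_protocol(void)
{
    return PROTOCOL_VERSION;
}
```

Wrapping every change this way makes it easy to diff a stock engine against the 666-capable one, or to compile the protocol support out entirely.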
 
Do we have, however, some kind of display that can actually show those 5K FPS?

That's kinda not the part that's important; it's a bit of a strawman, in fact. As some of us have hinted, running DM3 at 5k FPS is actually not an interesting problem. Take 200 FPS: say someone has a 144Hz display and bumps the target framerate to 200 to allow headroom for transient peaks. Running DM3 at 200 FPS isn't an interesting problem either. I'm pretty sure that even a low-end phone can do that nowadays.

Running a big, complex map with big, complex scenes at 200 FPS - now that's an interesting problem. You can solve anything if you throw enough brute force at it; but solving it on lower-end hardware too: even more interesting.

Historically the Quake engine was viewed as "can't do big scenes" or "unsuitable for open areas", but it turned out that none of that was actually true. It's obviously not the best engine for this kind of thing, but a primary cause of its historical problems was in the implementation of its renderer, and it wasn't too difficult to make the necessary changes to solve those problems.

So do that, and running DM3 at thousands of FPS will also happen, but as a side effect, not as the primary goal. 
A Few Questions Regarding Indexed Colors And Lightmap Blending 
1. per-texel lightmap blending?

How does quake's software renderer achieve per-texel blending? example

2. how and where in the rendering pipeline does "palettization" occur?

I assume that illegal colors are produced after a lightmap is blended with the diffuse texture? If so, my best guess is that palettizing is the last step before being drawn, though I have a feeling this is probably wrong.

3. a practical way to achieve this look for a modern game?

Software render? GL render? GL render + GLSL shaders or some other form of post processing?

Here is a screenshot palettized in Photoshop. I'm currently using these janky mockups to help inform art direction. Obviously, this method is sub-optimal and I suspect this is not close to what an actual game renderer would output. It's a pain in the ass because this rendering style depends on an appropriate art style else it all turns to shit... hence the previous questions.

After spending months testing color palettes, I have a whole new level of respect for the id art team. Cramming Quake into essentially 240-ish colors is an art form in itself. 
Kill 
I apologize that I can't comment on any of your questions, but I do have to say that screenshot is extremely fucking killer. 
 
I concur. Love the light design on the door and the left & middle monsters. Would be thrilled to see them in Quake. 
I Appreciate The Compliments 
 
 
You're welcome. Are these for a Quake mod or a standalone project? 
Standalone Project 
unless the project dies, in which case the assets would probably be handed over to the quake community. 
 
You have a ModDB page or something for this project? I'd like to know more and keep myself informed. Perhaps send a PM or post in General Abuse? I don't want to clutter the thread with OT. 
#600 
for the first and second questions:

1. the lighting is blended with the textures in a "surface cache" which is in texture space so it's always 1 pixel per texel (rather than in screen space where one texel covers many pixels.) That's why you don't see banding that crosses over the middle of texels.

2. the entire renderer is 8-bit so it never has to be "palettized" at run time. 8-bit texture data is combined with lightmap data in a lookup table called colormap.lmp, where the resulting value is also 8-bit. So I guess it's the creation of this lookup table (shipped with the game) where the palettization takes place.
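A sketch of how such a lookup table could be generated offline: for each (texture colour, light level) pair, darken the palette colour and store the nearest palette index. The layout, light-level count, and the absence of fullbright handling here are simplifications for illustration; the real colormap.lmp was generated by id's own tools.

```c
/* Sketch: build a colormap-style table from a 256-entry palette.
   Light 0 = full brightness in this sketch. */
enum { PAL_SIZE = 256, LIGHT_LEVELS = 64 };

typedef struct { unsigned char r, g, b; } rgb_t;

/* nearest palette index to an RGB value, by squared distance */
static int nearest_index(const rgb_t *pal, int r, int g, int b)
{
    int best = 0;
    long bestdist = 3L * 256 * 256;
    for (int i = 0; i < PAL_SIZE; i++) {
        long dr = pal[i].r - r, dg = pal[i].g - g, db = pal[i].b - b;
        long d = dr * dr + dg * dg + db * db;
        if (d < bestdist) { bestdist = d; best = i; }
    }
    return best;
}

/* colormap[colour][light] = palette index of 'colour' darkened to 'light' */
static void build_colormap(const rgb_t *pal,
                           unsigned char colormap[PAL_SIZE][LIGHT_LEVELS])
{
    for (int c = 0; c < PAL_SIZE; c++)
        for (int light = 0; light < LIGHT_LEVELS; light++) {
            int r = pal[c].r * (LIGHT_LEVELS - light) / LIGHT_LEVELS;
            int g = pal[c].g * (LIGHT_LEVELS - light) / LIGHT_LEVELS;
            int b = pal[c].b * (LIGHT_LEVELS - light) / LIGHT_LEVELS;
            colormap[c][light] = (unsigned char)nearest_index(pal, r, g, b);
        }
}
```

Because the nearest-colour search happens once at table-build time, the runtime renderer only ever does cheap 8-bit lookups, which is exactly why it stays fast.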

#3 is complicated and other people probably have better answers than me. 
 
huh, pretty interesting. that clears a few things up. feels like something that specialized could have some drawbacks. it would be super cool though. devil daggers looks really nice, i wonder how they did it?

@mugwump - no, nothing public currently. I'll make sure you get the word when it's out there :) 
OK, Thanks! 
Oops, Didn't Mean To Check The Q2 Icon... 
...stupid fat fingers on stupid phone! 
The Way FTE Does It... 
#2 something like:
diffuseindex = texture[s][t];                        /* 8-bit texel from the diffuse texture */
lightlevel = lightmap[lm_s][lm_t];                   /* light level sampled at that texel */
pixelvalue = colourmap[diffuseindex][lightlevel];    /* final 8-bit palette index */

so yeah, a simple 2d lookup.
the surface cache is just a cache that provides a lower res source lightmap, at an appropriate mipmap level.

#3 r_softwarebanding in FTE replicates it with glsl (essentially the same, except the colourmap already has the rgb values expanded to save an extra lookup). taniwha's glsl renderer also does this.
this lookup actually happens 3 times over in fte so that lits work as expected. if the lighting is always grey then it's identical(ish) to software rendering.
there's some fiddly stuff required to get the lightmap sample changes snapped to match the diffuse.
and the choice of which mipmap to use follows standard opengl, based on pixel distance with an annoying bias from the slope of the surface, unlike software rendering picking some midpoint and then using a single mip level for the entire surface. so you might want to tweak the d_mipcap cvar to use only the first 3 mip levels or something (yes, FTE uses the proper mipmaps too). this is actually resolution-dependent.
talking about resolution, r_renderscale 0.25 will scale the rendered res down. negative values in rev 5026 will give you nearest filtering (with a vid_restart, annoyingly).
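The per-channel idea above (one colourmap fetch per colour channel so coloured lights work) might be sketched in C as follows. This is an assumption reconstructed from the description, not FTE's actual GLSL; here the colourmap stores pre-expanded RGB values rather than palette indices, as described:

```c
/* Sketch: three lookups per pixel, one per channel, so each channel
   can carry its own light level (coloured .lit lighting). */
enum { LIGHT_LEVELS = 64 };

typedef struct { unsigned char r, g, b; } rgb_t;

/* colourmap[diffuseindex][light] holds pre-expanded RGB */
static rgb_t shade_pixel(rgb_t colourmap[256][LIGHT_LEVELS],
                         unsigned char diffuseindex,
                         unsigned char light_r,
                         unsigned char light_g,
                         unsigned char light_b)
{
    rgb_t out;
    out.r = colourmap[diffuseindex][light_r].r;
    out.g = colourmap[diffuseindex][light_g].g;
    out.b = colourmap[diffuseindex][light_b].b;
    return out;
}
```

When the three light levels are equal (grey lighting), all three fetches hit the same colourmap entry, which is why the result is identical(ish) to classic software rendering in that case.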


I have no idea what you'd need to set up to get DP to replicate this. In theory just paletted textures instead of rgb ones. you can probably just provide greyscale or whatever for the base mip level, but autogeneration of the other mips will just result in noise.
if you were to hack some palettization logic into glsl, then my suggestion to you would be to use a 3d texture, but again you'd probably need to hack dp's sourcecode for that.
failing that you'd have to express the colour ramps mathematically. 
Killpixel 
2) See r_surf.c

3) Are you sure you want to do this?

The restrictions of 8-bit color software rendering are great for scientific purposes (because they help to expose the needs and the limits of color transformation algorithms), but not for artistic purposes.

After studying software rendering for so long, I'm sure that being truly faithful to its limitations isn't worth the effort. It's too painfully restricting.

You can get good, painless results by reducing the smoothness of each color channel individually. Let's say, by zeroing the lowest 4 bits of each channel, you'll limit the colors to 16 shades per channel, for a total of 4096 colors. This way, you'll be able to create low-color assets without fighting against palette limitations. 
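A minimal sketch of that suggestion (function names are mine, for illustration only):

```c
/* Zero the low 4 bits of each channel: 16 shades per channel,
   16*16*16 = 4096 possible colours. */
typedef struct { unsigned char r, g, b; } rgb_t;

static unsigned char quantize_channel(unsigned char v)
{
    return (unsigned char)(v & 0xF0); /* keep only the top 4 bits */
}

static rgb_t quantize_rgb(rgb_t c)
{
    c.r = quantize_channel(c.r);
    c.g = quantize_channel(c.g);
    c.b = quantize_channel(c.b);
    return c;
}
```

Run over every texel of an asset (or as a final post-process step), this gives the banded low-colour look without ever fighting a fixed 256-entry palette.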
Website copyright © 2002-2024 John Fitzgibbons. All posts are copyright their respective authors.