Quake Custom Engines
Discuss modified Quake engines here, I guess. What engines do you use? What are the pros/cons of existing engines? What features would you like to see implemented/removed?
 
Do we have, however, some kind of display that can actually show those 5K FPS?

That's kinda not the part that's important; it's a bit of a strawman, in fact. As some of us have hinted, running DM3 at 5k FPS is actually not an interesting problem. Take 200 FPS instead: say someone has a 144Hz display and bumps the target framerate to 200 to allow headroom for transient peaks. Running DM3 at 200 FPS isn't an interesting problem either; I'm pretty sure that even a low-end phone can do that nowadays.

Running a big, complex map with big, complex scenes at 200 FPS - now that's an interesting problem. You can solve anything if you throw enough brute force at it, but solving it on lower-end hardware too is even more interesting.

Historically the Quake engine was viewed as "can't do big scenes" or "unsuitable for open areas", but it turned out that none of that was actually true. It's obviously not the best engine for this kind of thing, but a primary cause of its historical problems was the implementation of its renderer, and it wasn't too difficult to make the necessary changes to solve those problems.

So do that, and running DM3 at thousands of FPS will also happen, but as a side-effect, not as the primary goal. 
A Few Questions Regarding Indexed Colors And Lightmap Blending 
1. per-texel lightmap blending?

How does Quake's software renderer achieve per-texel blending? example

2. how and where in the rendering pipeline does "palettization" occur?

I assume that illegal colors are produced after a lightmap is blended with the diffuse texture? If so, my best guess is that palettizing is the last step before the frame is drawn, though I have a feeling this is almost certainly wrong.

3. a practical way to achieve this look for a modern game?

Software renderer? GL renderer? GL renderer + GLSL shaders, or some other form of post-processing?

Here is a screenshot palettized in Photoshop. I'm currently using these janky mockups to help inform art direction. Obviously, this method is sub-optimal and I suspect it's not close to what an actual game renderer would output. It's a pain in the ass because this rendering style depends on an appropriate art style, or else it all turns to shit... hence the previous questions.

After spending months testing color palettes, I have a whole new level of respect for the id art team. Cramming Quake into essentially 240-ish colors is an art form in itself. 
Kill 
I apologize that I can't comment on any of your questions, but I do have to say that screenshot is extremely fucking killer. 
 
I concur. Love the light design on the door and the left & middle monsters. Would be thrilled to see them in Quake. 
I Appreciate The Compliments 
 
 
You're welcome. Are these for a Quake mod or a standalone project? 
Standalone Project 
unless the project dies, in which case the assets would probably be handed over to the quake community. 
 
You have a ModDB page or something for this project? I'd like to know more and keep myself informed. Perhaps send a PM or post in General Abuse? I don't want to clutter the thread with OT. 
#600 
for the first and second questions:

1. The lighting is blended with the textures in a "surface cache", which is in texture space, so it's always 1 pixel per texel (rather than in screen space, where one texel covers many pixels). That's why you don't see banding that crosses over the middle of texels.

2. The entire renderer is 8-bit, so it never has to be "palettized" at run time. 8-bit texture data is combined with lightmap data via a lookup table called colormap.lmp, where the resulting value is also 8-bit. So I guess it's the creation of this lookup table (shipped with the game) where the palettization takes place (see the sketch below).

#3 is complicated and other people probably have better answers than me. 
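
A minimal C sketch of that lookup, assuming colormap.lmp's usual layout of 64 light levels by 256 palette indices (the names and the light polarity here are my assumptions, not id's actual code):

#include <stdint.h>

/* colormap.lmp: 64 rows (light levels) of 256 palette indices each,
   precomputed offline, so nothing is palettized at run time */
extern uint8_t colormap[64][256];

/* texel: 8-bit palette index from the texture
   light: 0 (brightest) .. 63 (darkest), from the surface-cached lightmap */
static uint8_t shade_texel(uint8_t texel, uint8_t light)
{
    return colormap[light][texel]; /* result is another 8-bit palette index */
}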
 
Huh, pretty interesting. That clears a few things up. Feels like something that specialized could have some drawbacks. It would be super cool though. Devil Daggers looks really nice, I wonder how they did it?

@mugwump - no, nothing public currently. I'll make sure you get the word when it's out there :) 
OK, Thanks! 
Oops, Didn't Mean To Check The Q2 Icon... 
...stupid fat fingers on stupid phone! 
The Way FTE Does It... 
#2 something like:
diffuseindex = texture[s][t];          // 8-bit palette index from the diffuse texture
lightlevel = lightmap[lm_s][lm_t];     // light level for this texel
pixelvalue = colourmap[diffuseindex][lightlevel];  // 2D lookup into the colormap table

so yeah, a simple 2d lookup.
the surface cache is just a cache that provides a lower res source lightmap, at an appropriate mipmap level.

#3 r_softwarebanding in FTE replicates it with GLSL (essentially the same, except the colourmap already has the RGB values expanded, to save an extra lookup). taniwha's GLSL renderer also does this.
This lookup actually happens three times over in FTE so that .lit files work as expected. If the lighting is always grey then it's identical(ish) to software rendering.
There's some fiddly stuff required to get the lightmap sample changes snapped to match the diffuse.
And the choice of which mipmap to use follows standard OpenGL behaviour, based on pixel distance with an annoying bias from the slope of the surface, unlike software rendering, which picks some midpoint or other and then uses a single mip level for the entire surface. So you might want to tweak the d_mipcap cvar to use only the first 3 mip levels or something (yes, FTE uses the proper mipmaps too). This is actually resolution dependent.
Talking about resolution, r_renderscale 0.25 will scale the rendered resolution down; negative values in rev 5026 will give you nearest filtering (with a vid_restart, annoyingly).
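
For reference, the cvars from this post collected into a hypothetical config snippet (the exact values and the two-value d_mipcap form are my guesses; check the engine's docs):

r_softwarebanding 1   // software-style palette/colormap shading via glsl
d_mipcap "0 2"        // only the first 3 mip levels (assumed min/max form)
r_renderscale -0.25   // quarter res; negative = nearest filtering (rev 5026+, needs vid_restart)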


I have no idea what you'd need to set up to get DP to replicate this. In theory, just paletted textures instead of RGB ones. You can probably just provide greyscale or whatever for the base mip level, but autogeneration of the other mips will just result in noise.
If you were to hack some palettization logic into GLSL, my suggestion would be to use a 3D texture, but again you'd probably need to hack DP's source code for that (sketched below).
Failing that, you'd have to express the colour ramps mathematically. 
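
To make the 3D-texture idea concrete, here's a hedged C sketch of the CPU-side equivalent: build a 32x32x32 table once from the palette, then map any RGB value to its nearest palette index (all names are made up for illustration):

#include <limits.h>
#include <stdint.h>

static uint8_t rgb_to_index[32 * 32 * 32]; /* built once from the palette */

static void build_lut(const uint8_t palette[256][3])
{
    for (int r = 0; r < 32; r++)
        for (int g = 0; g < 32; g++)
            for (int b = 0; b < 32; b++) {
                int best = 0, bestdist = INT_MAX;
                for (int i = 0; i < 256; i++) {
                    int dr = (r << 3) - palette[i][0];
                    int dg = (g << 3) - palette[i][1];
                    int db = (b << 3) - palette[i][2];
                    int dist = dr * dr + dg * dg + db * db;
                    if (dist < bestdist) { bestdist = dist; best = i; }
                }
                rgb_to_index[(r << 10) | (g << 5) | b] = (uint8_t)best;
            }
}

/* quantize each channel to 5 bits and look up the nearest palette index */
static uint8_t palettize(uint8_t r, uint8_t g, uint8_t b)
{
    return rgb_to_index[((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3)];
}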
Killpixel 
2) See r_surf.c

3) Are you sure you want to do this?

The restrictions of 8-bit color software rendering are great for scientific purposes (because they help to expose the needs and the limits of color transformation algorithms), but not for artistic purposes.

After studying software rendering for so long, I'm sure that being truly faithful to its limitations isn't worth the effort. It's too painfully restricting.

You can get good, painless results by reducing the smoothness of each color channel individually. Say you zero the lowest 4 bits of each channel: that limits the colors to 16 shades per channel, for a total of 4096 colors. This way, you'll be able to create low-color assets without fighting against palette limitations. 
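
As a C sketch, that per-channel reduction is a one-liner:

/* zero the low 4 bits: 16 shades per channel, 16^3 = 4096 total colors */
static unsigned char quantize_channel(unsigned char c)
{
    return c & 0xF0;
}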
8-bit Is A Worthwhile Artistic Goal. 
A lot of players like/prefer the paletted look. Yes, I'm sure a lot of it is down to nostalgia, but that doesn't make it any less a legitimate reason to pursue the 8-bit look for artistic purposes. 
 
@spike - interesting. I didn't realize FTE had these features out of the box. I'll revisit FTE this weekend. Honestly, I can't recall why I switched to DP; I think it was an issue with q3 shaders or something. r_softwarebanding and r_renderscale are some tasty morsels. Is there presently a cvar to control the distance at which a mip is used? Are mips generated by the engine? If so, can I create mips manually to be used instead? If I recall correctly, FTE supports FBSP?

@mankrip - It's very restrictive, yes. Sometimes that's a good thing. I am very fond of the idea of an 8-bit software renderer. It's a romantic idea; I can see why it would ultimately be dismissed as impractical by many.

4096 is a lot of colors. I'm currently using 16-shade textures... after lightmaps (some of which are colored) you end up with a fairly large range that just doesn't look right. It's neither 8-bit nor hi-fi graphics... it's sort of a cheap impression, but not genuine.

If this were magical christmas land, I would have a software renderer with a 512-color palette, or something along those lines. What would that even be? It's not 8-bit, but not high color either (I think).

@kinn - I agree entirely. In my case, it's not really nostalgia. Some of my first Doom/Quake experiences were with DP and Doomsday. I later discovered software rendering and immediately preferred the look. There is something magical about it, almost like an animated painting... I can't quite put my finger on it. 
 
If so, can I create mips manually to be used instead?
A while back in the mapping help thread, Rick told me about a program called MipDip that supposedly allows you to replace the submips. You can find it on Quake Terminus. I tested it briefly but didn't understand how it works; I'll need to look into it more. 
 
@killpixel, sorry, forgot to mention that r_softwarebanding only works with paletted source textures, which means the feature has only really been tested with q1bsp+q2bsp+q1map (with wads). The engine doesn't generate any mips itself in this mode (so watch out for that too).
I could give you some workarounds for q3bsp, but it's probably better if I just add palettization logic so that you won't have to tread so carefully... and get wads+wals working properly for paletted mips.

Anyway, the basic idea is that you use 'fte_program defaultwall#EIGHTBIT' in your shader (giving you glsl-based non-q3-style shaders) with a 'map $colourmap' (which preprocesses the colormap.lmp), then FTE pulls in the palette+lightmap+etc textures too.
For q1+q2, this already happens by default, but q3 content is much more likely to have shaders overriding things, and FTE's explicitness means you may want to modify those (rough example below). 
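
If I'm reading that right, a wall shader would look roughly like this (the surrounding q3-shader boilerplate is my guess; only the fte_program and map lines come straight from the post):

textures/example/wall
{
    fte_program defaultwall#EIGHTBIT   // FTE's glsl path with 8-bit colormap shading
    {
        map $colourmap                 // the preprocessed colormap.lmp
    }
}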
#614 
If this were magical christmas land, I would have a software renderer with a 512-color palette, or something along those lines. What would that even be?

This is one of the possibilities I've studied for a long time, and it isn't worth it. The problem isn't just the size of the palette; it's the performance.

A 512-color palette would use 9-bit values for addressing. However, those 9 bits would have to be stored in 16-bit variables, so each color index would require twice the memory anyway, increasing the amount of cache misses. They would also require bigger color transformation tables and extra bounds checking, which would result in significantly slower performance.

I don't remember all the specifics, but when comparing all possibilities, I realized that the best options would be either to use direct RGB color values (as Unreal does) or to keep the renderer in 8-bit indexed color. And since a number of operations are faster to perform in 8-bit indexed color, due to dealing with a single index value instead of individual values for 3 color channels, I decided to stay in 8-bit. 
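
Rough numbers behind that argument, assuming Quake's 64 light levels (a back-of-envelope sketch, not measured data):

/* 8-bit:   uint8_t  index; colormap = 64 * 256 * 1 byte  = 16 KB
   "9-bit": uint16_t index; colormap = 64 * 512 * 2 bytes = 64 KB
   Twice the bytes moved per pixel, four times the table competing
   for cache. */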
And Thus 
mankrip became the dither master. 
 
@spike

forgot to mention that r_softwarebanding only works with paletted source textures

I do use indexed/paletted textures (not Quake's palette), though this might not be what you mean?

@mankrip

This is one of the possibilities I've studied for a long time, and it isn't worth it. The problem isn't just the size of the palette; it's the performance.

I had a feeling...

So, I really have 2(ish) options: 8-bit (with all its limits) or some sort of post-processing shader.

I don't understand how it works, but a GLSL shader seems like a brute-force method that's not as accurate as a true 8-bit renderer. I imagine it functioning the way I do it in Photoshop: the scene is rendered, then "palettized", then sent to the display.

@mugwump

Thanks for the link to the GLSL shader... I've messed with that and a couple of other shaders in DP and never got them to work. They either don't work at all or end up looking all messed up... :/ 
 
a GLSL shader seems like a brute-force method that's not as accurate as a true 8-bit renderer. I imagine it functioning the way I do it in Photoshop: the scene is rendered, then "palettized", then sent to the display.

AFAIK, that's exactly it.

The "correct" way would probably be to quantize textures to a global 8-bit color palette, generate 8-bit color transformation tables (lighting, blending, etc.) from the palette, and apply them using a fragment shader.

Faithfully replicating Quake's lighting is more complicated, because lightmaps should be upscaled 16x with bilinear filtering, and then quantized to 64 levels. Those 64 levels are then used to index the 64 levels of the color shading map (colormap.lmp). 
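
The quantization step in isolation, as a C sketch (the bilinearly filtered sample itself is assumed to come from the texture unit or an upscaled lightmap):

/* map a filtered lightmap sample in 0.0 .. 1.0 to the 64 levels
   that index colormap.lmp */
static int light_level(float sample)
{
    int level = (int)(sample * 63.0f + 0.5f);
    return level < 0 ? 0 : (level > 63 ? 63 : level);
}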
 
AFAIK, that's exactly it.

That isn't desirable IMO, though it may seem trivial to others.

Not only does it seem inefficient, but the pre-palettized scene is just an approximation and sometimes won't look the way it's intended once palettized. 
 
Designing for software rendering is partially about defining the intended looks, and partially about accepting the resulting looks. 