General Abuse
Talk about anything in here. If you've got something newsworthy, please submit it as news. If it seems borderline, submit it anyway and a mod will either approve it or move the post back to this thread.

News submissions: https://celephais.net/board/submit_news.php
My Mistake 
Who calls me a mess 
Respect Teh History 
Mankrip, I would have to think that, given it was 1996 and that Quake went through development hell, changing from a flight sim to a virtual-reality game to an online-focused mess, the shortcomings of the Quake MDLs are deliberate.

The models are not properly UV wrapped, there is no attempt to pack islands, and most skins are direct orthographic projections, which is massively inappropriate for anything other than flat planes. The result is stretching and a lack of detail. Models in other games of the era exhibit better UV wrapping, and in the case of PlayStation/N64 games benefited from higher than 8-bit color. Quake 2's md2 format did literally nothing to rectify this, although some of its models exhibit better island packing/wrapping.
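The stretching from a direct orthographic projection is easy to quantify: a surface tilted away from the projection plane only receives a cos-of-tilt fraction of the texel density a parallel surface would get. A quick sketch of the arithmetic (my own illustration, not anything from id's tools):

```python
import math

def texel_density_factor(tilt_degrees):
    """Fraction of texel density an orthographically projected surface
    receives, relative to a surface parallel to the projection plane."""
    return math.cos(math.radians(tilt_degrees))

# A face tilted 60 degrees gets half the texel density; at 85 degrees
# it is down to under 9%, hence the smearing on near-perpendicular faces.
for tilt in (0, 30, 60, 85):
    print(tilt, round(texel_density_factor(tilt), 3))
```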

Given that Quake was software rendered and meant to be played at a paltry 320x240, combined with Carmack's strict polygon limitations, the artists probably said "no one will notice". Considering how little mesh there is to work with, it is shocking the animation is as lively as it is, especially next to the wooden animations of the era. That, and there is no skeletal animation system.

However, when Unreal came around shortly after with properly UV wrapped models weighted to skeletons, it blew Quake out of the water. And every game based on each subsequent iteration of Unreal tech stole the thunder from same-generation id Tech games.

So ya, study the low poly Lovecraftian meshes, but not the lack of polish that went into them. 
MAX_***** Of Modern Engines 
For some reason, I'm currently getting déjà vu of having already asked this question before. If so, a thousand apologies; I just need a reminder.

The original Quake source code has these MAX_***** macros all over the place declaring limits for the various elements in the game (entities, brushes, textures, models, whatnot). Understandably so, due to the memory limitations of early PCs and their operating systems.

I wonder, however, if modern iterations of the game engine have had these limitations removed entirely, considering that GBs of RAM, GHz of processor speed, tens of CPU / thousands of GPU cores, and TBs of storage are nowadays commonplace for the average computer user. Are these limitations gone in current Quake engines?
Ask Negke About That One. 
 
#31032 
given the "save every bit" philosophy of the Q1 mdl format (e.g. the "onseam" silliness), I would imagine the lack of UV islands was to avoid having to duplicate vertices, which would increase the filesize.
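For anyone who hasn't dug into it: each mdl vertex stores a single s,t pair plus an onseam flag, triangles carry a facesfront flag, and at load time the engine shifts the s coordinate of seam vertices on back-facing triangles by half the skin width. A sketch in Python of that resolution step (the shift itself matches the vanilla alias model loading code; the function name is mine):

```python
def mdl_uv(s, t, onseam, facesfront, skinwidth):
    """Resolve the effective texture coordinate for one triangle corner,
    the way the engine does when building alias model triangle lists."""
    if not facesfront and onseam:
        s += skinwidth // 2  # back-facing tris sample the right half of the skin
    return s, t

# A seam vertex at s=10 on a 296-wide skin:
print(mdl_uv(10, 40, onseam=True, facesfront=True, skinwidth=296))   # (10, 40)
print(mdl_uv(10, 40, onseam=True, facesfront=False, skinwidth=296))  # (158, 40)
```

So one stored vertex serves both the front and back halves of the skin, at the cost of forcing the skin to be a front/back projection rather than freely placed islands.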

I disagree with the Q1 models not being "polished". I think that considering the constraints that teh carmack gave to the artists, plus the state of the modelling apps at the time, they turned out pretty well. 
Thank God People Can Disagree On Here. 
Healthy debate and discussion FTW. 
#31033 
Izhido - such limits have been removed where it makes sense to remove them (when I say "remove", that typically means the limit is massively increased to the point where it's not really a limit for the mapper any more) - see engines such as Quakespasm and FTE. Off the top of my head, Sepulchre is probably the best showcase of what modern Q1 map formats, compilers and engines can handle.

Some limits are built into file formats though, and this can make things controversial: changing the limit there means changing the file format, every engine then has to recognise and parse the new format, no one can agree on what the new format should be, and it's just a logistical mess. Very rarely, agreement happens, and so we have the bsp2 file format, which lets you make maps larger than anyone would really ever need.
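For a sense of what bsp2 actually changes: the 16-bit fields in the vanilla lumps get widened. A hedged sketch of the two node layouts using Python's struct module (the vanilla layout is from the released Quake source; the bsp2 layout is as I recall it from QuakeSpasm's headers, so treat the exact widths as an assumption):

```python
import struct

# Vanilla Q1 dnode_t: 16-bit children, bounds and face counts.
DNODE_V29  = struct.Struct("<i 2h 6h 2H")  # planenum, children[2], mins/maxs, firstface, numfaces
# BSP2 node: children and face counts widened to 32 bits, bounds to floats
# (layout per QuakeSpasm's dlnode_t, quoted from memory -- verify before relying on it).
DNODE_BSP2 = struct.Struct("<i 2i 6f 2I")

print(DNODE_V29.size, DNODE_BSP2.size)  # 24 44
```

The per-node cost nearly doubles, which is part of why nobody bumps a format casually.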
#31037 
Got it. So, if I understand correctly, there have been no efforts to introduce dynamic memory management to the engines that would, in theory, remove the need for hardcoded limits on their capabilities (however high they may be)? I'm asking just to have my thoughts in order before diving into the FTE / spasm / etc. source code for answers...
 
efforts to introduce dynamic memory management to the engines that would, in theory, remove the need to have hardcoded limits

I recall some talk of this in engine threads here, but I don't know to what extent this has been implemented (if at all). I'm at the limits of my knowledge on the subject here, so best check with engine chappos like spike, ericw, metlslime etc. 
 
the vanilla behaviour is for map lumps to be consecutively allocated onto the hunk.
this means you don't pay any memory costs for unused faces or whatever.
it also means that the engine has no real need to enforce any of those max_map_* limits, it only has the hunk size to worry about.
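the hunk is essentially a bump allocator over one big block: each lump gets carved off the front consecutively, so you only pay for the bytes a map actually uses. a toy sketch of the idea (not the real Hunk_Alloc, which also handles alignment, allocation names and a temp area at the far end):

```python
class Hunk:
    """Toy bump allocator: one buffer, a cursor, no per-lump headroom."""
    def __init__(self, size):
        self.buf = bytearray(size)
        self.used = 0

    def alloc(self, nbytes):
        if self.used + nbytes > len(self.buf):
            raise MemoryError("hunk alloc failed on %d bytes" % nbytes)
        start = self.used
        self.used += nbytes
        return memoryview(self.buf)[start:start + nbytes]

hunk = Hunk(16 * 1024 * 1024)   # a -heapsize style budget
faces = hunk.alloc(5000 * 20)   # pays only for the faces this map has,
leafs = hunk.alloc(1200 * 28)   # not for some MAX_MAP_* worth of them
```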

there are some exceptions though, like max_map_leafs, which is used to size various static buffers around the engine, but since Sepulcher neither FTE nor QuakeSpasm has any of those MAX_MAP_* limits remaining (ignoring MAX_MAP_HULLS, which is part of the file format so doesn't really count).

Note that file format limits are not always as strict as they first seem. A few limits have been raised by just redefining variables as unsigned, thereby doubling the limit. The most creative expression of this is the max_leafs and max_nodes limits, which still need to share a 16-bit index, resulting in some interesting logic to decide whether a node's child is a leaf or a node instead of just checking the sign.
The progs.dat format has a similar tweak to double the numpr_globals limit, while being careful to not break statements that do not refer to globals.
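the node-child decode mentioned above looks roughly like this; the signed branch is the vanilla behaviour, and the unsigned branch matches what i remember of QuakeSpasm's node loading, so treat it as a sketch:

```python
def decode_child_signed(child):
    """Vanilla: a negative 16-bit child means leaf number -(child + 1)."""
    if child >= 0:
        return ("node", child)
    return ("leaf", -(child + 1))

def decode_child_unsigned(child, numnodes):
    """Unsigned extension: nodes and leafs still share one 16-bit index,
    so anything at or past numnodes is read as leaf 65535 - child."""
    if child < numnodes:
        return ("node", child)
    return ("leaf", 65535 - child)

print(decode_child_signed(-1))              # ('leaf', 0)
print(decode_child_unsigned(40000, 35000))  # ('leaf', 25535)
```

the unsigned form only works because a map never uses the full 65535 of both nodes and leafs at once, which is exactly the "interesting logic" being referred to.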

Either way, just because you CAN have large maps doesn't mean that you should. Or to put that another way, I don't remember to hit quicksave often enough. 
 
Some engines have dynamic allocation, but not all. 
MarkV Tries To Have Dynamic Maximum Edict Allocation 
(And fails) 
 
The artistic skills employed in the Q2 models are much superior

I'm talking about the art, not the file format.

In Q1 there's the squished dead fish head, the badly offset knight skin, the badly bent knight sword in one frame of its animations, the lack of a vore idle pose, stuff like that.

It's not about technical limitations. It's more likely about lack of time to correct mistakes before they released the game.

For example, Hexen II models were created with vastly superior skills, despite using the same format. 
 
The Hexen 2 models are better yes, but still have crap stuff - there's some cat-headed man thing whose legs collapse like squished cardboard boxes during his run animation for example. 
 
in general, if you want to study great low-poly models, looking at stuff from the 90s might not be the best strategy.

In modern times, ultra-low polycounts are still used for some mobile games, and can be found in some indie games and whatnot for stylistic reasons; if you dig around, you can find some incredible artistry on display in this area.
 
Rareware N64 games also had great animations with incredibly low polycounts. Banjo & Kazooie, Conker's Bad Fur Day, and so on.

Turok 2 also has great animations, despite the wonky physics.

Unreal may have had skeletal animations, but they were too floaty and IMO inferior to Q1 model animations. Unreal II and UT fixed that. 
Q1 Models 
While talking of Q1 models: I've got this strange habit no one could explain yet.

As a fan of Q1 models, I had long been searching for the Sailor Moon models of Usaki. I finally found them and tried to convert them from Q2 to Q1. This worked, but the texture file got lost. Normally I split up a base model, import it again as DXF, and start texturing again.
A lot of work, but I don't mind.

Then I thought Noesis would help me out. No, not the slightest idea how to add the skin file.
Then I found Qwalk, and yes, there were my models, without the weapon, but all right; that can be fixed with merging.

But my question: these models grew to something like 3 MB, quite large. I found that loading them into QuArK 4.07 and saving them would diminish them back to 750 KB.
The only warning is "Some vertices have the same front and back side, you will get strange effects", but in game there is no sign of them.

What's the reason for this warning, and would it be destructive?
MadFox 
QuArK sucks for modeling. It splits the vertices shared between the front and the back sides of the texture, making the model filesize larger and killing the Gouraud shading across front/back seams.

The 3MB thing, however, seems to be an issue with triangles having both a backside and a frontside.

(triangle backside&frontside isn't the same thing as texture backside&frontside) 
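The filesize cost of splitting shared vertices is easy to estimate: the mdl format stores 4 bytes per vertex in every frame (the packed trivertx_t) plus a 12-byte stvert_t per vertex, so every duplicated vertex gets paid for again in every frame. A back-of-envelope sketch (the per-struct sizes are from the Quake source; the example counts are made up):

```python
TRIVERTX_SIZE = 4   # packed x, y, z plus normal index, per vertex per frame
STVERT_SIZE   = 12  # onseam, s, t (three 32-bit ints), per vertex

def split_cost(duplicated_verts, numframes):
    """Extra bytes added when an editor splits shared seam vertices."""
    return duplicated_verts * (numframes * TRIVERTX_SIZE + STVERT_SIZE)

# e.g. 150 split verts on a model with 200 animation frames:
print(split_cost(150, 200))  # 121800 bytes
```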
Right 
Thanks for your answer, mankrip.

I'm already glad I caught the models with Qwalk. Sad that I don't understand the Noesis manual. It seems there are not many Q1 model converters.

Yes, QuArK ain't the best model studio.
Got to row with the paddles at hand.
Hehehehe... 
With Oculus Quest out in the field, I’m wondering, given the team behind the hardware and SDK, what would happen if somebody like me or you guys attempted to publish one of our engines to the Oculus Store 😂 
Website copyright © 2002-2024 John Fitzgibbons. All posts are copyright their respective authors.