Mapping Help
This is the place to ask about mapping problems, techniques, and bug fixing, and pretty much anything else you want to do in the level editor.

For questions about coding, check out the Coding Help thread:
that's bloody top shelf. Cheers mate :} 
has anyone tried using triangular pyramids instead of triangular prisms for trisouping?
allows a lot more freedom (almost like working with individual faces!) at the expense of a bit more work.

...or does everyone use pyramids now? 
I Think 
Most people use pyramids now. Just banging some terrain together only needs triangles, but they tend to produce bugs.

Of course, there's always: 
Most people use pyramids now

this sounds messy - do you have any links to examples? 
Not really - and yeah, it can potentially be a mess.

I can post a couple of test maps later on if you like. 
i shouldn't bother unless you've already made something - i doubt i can be swayed from the path... 
yes, it's quite messy in the viewports. 
We've got a test folder full of odds and sods, there's definitely at least one or two examples in there.

Can't check from the office though. 
So, what are all the cool kids using these days to turn a Maya animation into a Quake .mdl? 
Never Mind 
just found the modelling help thread.

Lunaran to the rescue it would seem :} 
Not Sure 
If Noesis is mentioned on there:

It's a multi-format model tool that also supports Quake stuff. Could be a decent alternative. 
More Options 
More options can't be a bad thing, right? My current favoured method is to export to MD3 files, and then compile them into Q1 format using md3tomdl. I like calling it compiling because it suggests the same kind of workflow as iteratively tweaking/compiling/testing a map. It also makes the process sound irreversible, which reminds you to always keep the model editable in a more suitable format. Like keeping your master recordings in wav files rather than converting them back and forth to mp3 each time you want to remix... 
I prefer to hit fewer steps on exporting, to be honest - Quake can be pretty finicky about exporting, and the more steps there are the more likely things are to screw up.

And it's more work.

I'm still using Qme for many things, which can throw a wobbler every now and again and refuse to work.

Holy shit I'm disagreeing with Preach. 
Fair Point 
The way I look at it is that it lets me cut out the step which involves QMe entirely. Before I'd get the model geometry designed and animated in gmax, then do the conversion and finish off the model in QMe. Usually the finishing off is just stuff like rotating and scaling properly, handling the skin.

The problem I had was that then if I realised I wanted to make some deeper change to the model, like un-mirroring a section of skin or adding extra detail to the mesh, it would mean ditching all that work in QMe and going back to the gmax version. So for me running an extra batch file to compile and copy the model to the right place is a timesaver, but it depends on your workflow.

(PS: cutting out QMe means you can use smoothing groups on your mesh!) 
(PS: cutting out QMe means you can use smoothing groups on your mesh!)

Quake's model lighting uses smoothing groups?? 
i don't think that's what he meant???(??) because it doesn't. it smooths wherever faces are connected. 
Smoothing Groups 
There's Gouraud shading on the models, driven by the normal vector that each animation vertex encodes. The Quake format is a bit crude in how the normals are stored: each normal gets a single byte, used as an index into a fixed lookup table of normal vectors.

How these vectors are calculated is entirely up to the exporting program. QMe recalculates them every time you save a model by averaging the normals of all the faces attached to a given vertex (per frame of animation), then finding which of the fixed normal vectors best approximates the result.

If you split the vertices at a point then this changes the averaging calculations, because each triangle now joins to a different vertex. It might be necessary to split some vertices for skinmapping purposes or other nefarious tricks. This can create visual seams on a model if not handled correctly; tomorrow I will post a before-and-after example model and some more tools for messing with vertex normals... 
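The averaging-and-quantising step described above can be sketched like this. Note the six-entry table below is only a stand-in: the real MDL format uses a fixed table of 162 unit vectors, and the function names here are made up for illustration.

```python
import math

# Hypothetical stand-in for Quake's fixed 162-entry normal lookup table.
NORMAL_TABLE = [
    (0.0, 0.0, 1.0), (0.0, 0.0, -1.0),
    (1.0, 0.0, 0.0), (-1.0, 0.0, 0.0),
    (0.0, 1.0, 0.0), (0.0, -1.0, 0.0),
]

def average_normal(face_normals):
    """Average the normals of all faces sharing a vertex, then renormalize."""
    x = sum(n[0] for n in face_normals)
    y = sum(n[1] for n in face_normals)
    z = sum(n[2] for n in face_normals)
    length = math.sqrt(x * x + y * y + z * z) or 1.0
    return (x / length, y / length, z / length)

def best_normal_index(normal):
    """Return the index of the table entry that best matches the normal,
    judged by the largest dot product."""
    dots = [normal[0] * t[0] + normal[1] * t[1] + normal[2] * t[2]
            for t in NORMAL_TABLE]
    return max(range(len(NORMAL_TABLE)), key=dots.__getitem__)

# Two faces tilted either side of straight-up average out to straight up,
# so the vertex gets the (0, 0, 1) table entry:
avg = average_normal([(0.6, 0.0, 0.8), (-0.6, 0.0, 0.8)])
print(best_normal_index(avg))  # -> 0
```

Since only the single byte index is stored per vertex per frame, any tool that rewrites the model (like QMe on save) is free to recompute these indices, which is exactly why saving can change the shading.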
wow, that's more complex than i imagined it would be.

are you saying there's a way to remove or at least reduce the seam on split edges with some normal trickery? 
The Fix-up 
For an example see:
The model azoth_pre is the version in the released Travail mod, and seams are visible on the insides of his legs. azoth_post has been fixed in this regard through the use of a Perl script. The script was designed to unify the normals of all sets of vertices which
a) occupy the same position on the model in every frame, and
b) have the same colour pixel lying under their position on the skin

The script applies rule b in this case for the sake of the metal prongs that protrude from the belt. Their triangles form a continuous mesh with the torso, but because the colour at the vertex is different from the torso's, they get a distinct normal. This gives us a smoothing group!

I will admit that I discovered this worked well on the model by luck. My plan was to create an extra skin for the model with just solid blocks of colour to define the smoothing groups, then have the script just delete the extra skin when it was done. As it happens, if your seams are skinned carefully then this isn't necessary.
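The grouping rules a) and b) can be sketched in Python. The original was a Perl script that isn't shown in the thread, so the data layout and function name below are hypothetical; only the grouping criteria come from the post.

```python
from collections import defaultdict

def smoothing_groups(vertices):
    """Group vertices that should share a unified normal.

    Each vertex is a dict with:
      'positions'   - tuple of its coordinates in every animation frame
      'skin_colour' - palette index of the skin pixel under the vertex
    Vertices match when both fields are identical (rules a and b).
    """
    groups = defaultdict(list)
    for i, v in enumerate(vertices):
        key = (v["positions"], v["skin_colour"])
        groups[key].append(i)
    return list(groups.values())

verts = [
    {"positions": ((0, 0, 0), (0, 0, 1)), "skin_colour": 14},  # torso
    {"positions": ((0, 0, 0), (0, 0, 1)), "skin_colour": 14},  # torso, split copy
    {"positions": ((0, 0, 0), (0, 0, 1)), "skin_colour": 6},   # metal prong
]
print(smoothing_groups(verts))  # -> [[0, 1], [2]]
```

Once grouped, each group's normals would be averaged and written back to every member, so split copies of the same point shade identically and the seam disappears, while the differently-coloured prong keeps its own normal.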

Another example that might be quicker to test is the waterfall:
If you open the model in QMe and save it again so that the vertex normals get reset, you should be able to see the difference it makes by looking at the water coming over the lip of the waterfall: the "squareness" is higher. In this case a script was necessary to fix the normals, because even in gmax the segments of the model were physically disconnected. 
I think we can all agree time hasn't been kind to Qme.

I'm having the same problem just now actually - sank too much time into Qme and ended up having to sacrifice it all to make some changes.

I've wondered about the smoothing before actually, since Qme either visually breaks the mesh apart or welds it into one, depending on which of two options you pick.

Next up is the IQM format, which basically allows boned models in Quake. Apparently Gb has it exporting meshes directly from Blender into Quake - no other steps involved at all. 
It's worth mentioning that IQM needs engine support. RMQengine has it, and I think DP and others are considering it. 
Applying Transparent Shader 
Ok, so I have a brush and have applied a texture of a fence, which I need to be transparent... Where do I apply the shader? On the sides of the brush? 
You Need 
Engine support - RMQ has that as well. I think DP supports it also, but I'm not the guy to answer the question well.

Basically yes, apply the texture to the brush and it'll work. We're using { in front of the texture name to designate partial transparency, same way that * makes a texture 'liquid'. 
It's For Q3 Engine 
The fence texture goes on the face that's visible to the players ("inside"), all other faces get the nodraw texture (or nodraw_nonsolid depending on the shader settings). Also check if the shader is one-sided or double-sided - if it's the former, and the fence is supposed to be seen from both sides, then the texture needs to be applied to both sides (or the shader changed, which would be the better solution). 
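A shader along the lines described above might look like this. This is a hypothetical sketch in standard Q3 shader syntax, not the actual transparent.shader from the thread; the texture paths are made up.

```
textures/example/fence
{
	surfaceparm trans
	surfaceparm alphashadow
	cull none              // double-sided: visible from both sides
	{
		map textures/example/fence.tga
		alphaFunc GE128    // pixels below 50% alpha are discarded
		depthWrite
		rgbGen identity
	}
	{
		map $lightmap
		rgbGen identity
		blendFunc filter
		depthFunc equal
	}
}
```

With `cull none` set, the texture only needs to go on one face of the brush; without it, a one-sided shader must be applied to both sides, as noted above. `alphaFunc GE128` cuts out the transparent regions, which for this approach need to be stored in the texture's alpha channel rather than as black pixels.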
I was given a shader called transparent.shader, which apparently does the same thing. But when I say I want a transparent fence, I mean I want the black bits on the texture to disappear. On the texture there is a fence, and in between each piece of metal on the fence is black, which should really be nothing. I need to remove this black. Negke, was that what you were talking about? If so, thank you, I will try your method. 
Website copyright © 2002-2017 John Fitzgibbons. All posts are copyright their respective authors.