Planning for new map importer


xdaniel

Vertex blending is something that is really needed, especially for unlit interior maps. I'm also thinking it would be very useful to have a dialog that lists flags and which ones have been used, purely for housekeeping, so that users don't wind up accidentally using the same flag twice. Perhaps also a parsing-style dialog before the map is imported, i.e. whether to split collision and graphics by group or by texture, since people tend to use different 'modelling styles'. And perhaps something for skipping faces and vertices from the graphics or collision when converting to display lists, for testing optimization.


  • Allow groups for collision editing; being able to set exits, the collision types that make Link jump/dive properly, and the right sounds depending on the surface walked on would be a major plus.
  • Add exits/entrances properly, and add the new maps to the DMA table with a CRC fix (so one could, say, go from Hyrule Field to the new map/area without crashing, or dying and crashing).
  • Allow skybox features (as there are different skyboxes).
  • Allow the flow of time to be changed, so the user can slow it down, speed it up, or have time stand still.
  • Allow the use of multiple object sets (e.g. child Link/adult Link, or depending on how the flow of time is set: child Link day/night and adult Link day/night).
  • Allow vertex shading and blending; as others have said, this will save time and polygons for modelers trying to make areas.
  • Fix the generation of S10.5 format values that was causing the UV errors in SO (see the sketch after this list).
  • Allow rendering of doors, or at least a representation of them; placing door actors that only show up as a small cube is a huge hassle when you don't know which side of the cube is which, or whether it's at the correct height.
  • Use a different file format; .obj is too limiting. Something like .3ds would be a better representation, since it can keep vertex shading, normals, etc., which saves having to generate normals (though that should stay available as an option you can check if you need it) and hoping they don't come out weird.
  • Fix having to press Enter for every value to make it stick; otherwise it doesn't save properly.
  • Add a room list to help users understand transitions and which map is which, e.g.:
        Room 0 - map1.3ds
        Room 1 - map2.3ds
        Room 2 - map3.3ds

  • Allow changing the save location of the scene/map.
  • Add waypoints, with the ability to copy waypoints from a scene within the game (a lot of NPCs in OoT/MM use these, and not having them makes adding NPCs almost useless).
  • Add animated textures (something that isn't so hackish; the game does this naturally, and we should look into documenting it and making it a feature - this would add A LOT of life to maps).
  • Keep in mind that we don't need to spoon-feed every user. I understand making it more user friendly, but at the same time the users really need to be able to do some research and understand what skyboxes, collision, waterboxes, etc. are before even doing this. I honestly didn't have much trouble understanding SO's functions because I know the basics already, and that information is readily available via the wiki. Not to mention someone from the community could make a tutorial on how to use the options and what they do if people have problems. I don't see a reason to overcomplicate your work because people are too lazy to read.
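Regarding the S10.5 point in the list above: S10.5 is a signed fixed-point format with 5 fractional bits, so a texel coordinate gets stored multiplied by 32. A minimal sketch in C of that conversion (the function and the clamping are my own illustration, not SO's actual code):

[code]
#include <stdint.h>

/* Convert a normalized UV coordinate (0.0..1.0, as read from an .obj)
 * into an S10.5 fixed-point texture coordinate for an N64 vertex.
 * tex_size is the texture width (for U) or height (for V) in texels. */
static int16_t uv_to_s10_5(float uv, int tex_size)
{
    float texels = uv * (float)tex_size;        /* normalized -> texel space */
    int32_t fixed = (int32_t)(texels * 32.0f);  /* 5 fractional bits */

    /* Clamp to the signed 16-bit range so large or negative UVs don't wrap. */
    if (fixed >  32767) fixed =  32767;
    if (fixed < -32768) fixed = -32768;
    return (int16_t)fixed;
}
[/code]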

Getting the obvious bugs out, and getting the basics working would make the most sense.

 

For the love of the Triforce, make a *USEFUL* TWO-OBJ SYSTEM! One .obj for collision, one for the polys/textures/vertices... whatever you call it. For beginners/prototyping maps, you could always choose the same file for both.

SO does this already, BUT then it goes and matches the collision types to the textures in the poly map, which is a pain in the ass! I worked around it by building the map twice, once for polys, once for collision, but what a pain. There is no reason to change the idea of matching collision to the OBJ textures - it's a simple, intuitive, beautiful system, if only it had been implemented differently.

Other than that, all values entered should be kept in HEX format this time around, even in the XML, or whatever build files you choose to use this time. Converting my custom collision types to decimal in previous versions of SO was terrible. Collision in general kept going in the wrong direction in later versions of SO. You *ALWAYS* need to be able to enter manual values when defining collision strings - I had the proper string for ice and couldn't use it.

Also, a *USEFUL* override for texture conversion, so you can make fences, gates, windows, etc. using the ALPHA channel of the 16-bit RGBA textures. Again, the feature is there in SO, but it disables conversion for all textures, which is completely worthless. It needs to be per texture.

 

I mean, why can't I have a map with 20 textures, only one of them specifically 16-bit RGBA? It's impossible to port many maps back from OZMAV dumps without this feature.
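For context, the 16-bit RGBA format being talked about packs each texel as 5 bits of red, green and blue plus a single alpha bit, which is exactly what makes cut-out fences, gates and windows possible. A minimal sketch in C of that packing (the function name and the 50% alpha threshold are my own choices for illustration):

[code]
#include <stdint.h>

/* Pack 8-bit-per-channel RGBA into the N64's 16-bit RGBA format:
 * RRRRRGGG GGBBBBBA - 5 bits per color channel, 1 bit of alpha. */
static uint16_t pack_rgba16(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    uint16_t r5 = r >> 3;              /* 8-bit -> 5-bit */
    uint16_t g5 = g >> 3;
    uint16_t b5 = b >> 3;
    uint16_t a1 = (a >= 128) ? 1 : 0;  /* anything at or above 50% alpha is opaque */

    return (uint16_t)((r5 << 11) | (g5 << 6) | (b5 << 1) | a1);
}
[/code]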

 

Oh, and the skybox string used by SO isn't used anywhere in the game. It's incorrect. It's what messes up the lighting on SO maps. It's a two-byte fix that never gets taken care of.


The skybox settings are what messes up lighting in SO? Why, this is the first time I've ever heard of that! All I knew about messed-up lighting in SO was that the normal calculation was shitty! What about, you know, telling me what exactly the problem is, if you know what it is? I'm 99% sure that I was never told about this. And you've still not told me. Which two bytes are they? /rant

 

...right. Those complaints are valid; there were a lot of dumb decisions, and there's a lot of broken functionality - like how you're supposed to be able to enter collision types manually, but that's broken IIRC and I never fixed it. However, the XML files weren't supposed to be edited manually in the first place (although granted, you need to do so with SO at times), so if I can't get writing and reading of hexadecimal values in XML going without a stupid amount of overhead (keyword: serialization), I'll keep them decimal.


Too big of a focus on sophisticated code - getting hung up on Reflection, XML, modularity of the code, etc., etc.

The second reason I have to work on alone. I have to find a balance between code that simply works and code that is clean and "professional", for lack of a better description.

I know that feel :( I don't like writing anything I can't use in multiple projects, which is funny, because I rarely ever reuse anything.

 

 

 

Question for the advanced users: How much abstraction should the program do? Which aspects of a scene or room file should be in the program in the rawest form possible, and what could be abstracted to the point that it's simple for end users to get a result, but that one would need to dive into the generated files with a hex editor to change specific, less important properties?
I think most of the stuff should be kept as low-level as possible to maintain 1:1 import and export of scenes, but it would be neat if you could use shaders to control the combiner.
 

 

Question for the beginners: What is difficult to understand or perhaps even intimidating about the current map importers, SharpOcarina in particular? What do you think is missing from SO, what could be made easier?
Some of my ideas:

  • Fix resource locating when opening files (I'm assuming you already plan on doing this)

  • Add support for importing existing scenes and maps (again, assuming you already plan on this)
  • Use the same .obj as the model for the collision mesh unless the user changes it
  • Add an option for reversing faces
  • Allow applying scale to mesh data so you can scale up higher than 20x
  • Add some sort of indicator, like a box or grid, for object size to help the user keep things to scale
  • Warn the user if a particular texture exceeds 4096 bytes when importing and provide solutions (e.g. scaling it down, changing the format, etc.) - see the sketch after this list
  • Allow moving objects with the mouse and snapping objects to surfaces
  • Allow the user to load lists of song and actor names
  • 1-click actor adding: just give a list of all the actors and let the user select one, and have SO take care of managing which groups need to be added
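A rough sketch in C of the kind of check meant by the texture-size point above (the 4096-byte limit comes from the list; the warning text and function name are just mine):

[code]
#include <stdio.h>

/* Warn if a texture would exceed the 4096-byte budget mentioned above.
 * bits_per_texel depends on the texture format used on import
 * (e.g. 16 for 16-bit RGBA, 8 for 8-bit color-index). */
static int texture_too_big(int width, int height, int bits_per_texel)
{
    int bytes = (width * height * bits_per_texel) / 8;
    if (bytes > 4096) {
        fprintf(stderr,
                "Warning: %dx%d texture is %d bytes (limit 4096); "
                "consider scaling it down or changing the format.\n",
                width, height, bytes);
        return 1;
    }
    return 0;
}
[/code]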

I like the idea of adding a dummy box/cylinder with roughly the same/appropriate dimensions as the actor, possibly with a 2D image of what the actor looks like for easy reference. This would add the possibility of snapping an actor to a specific surface. The problem with that is that actors can have different appearances depending on what object is loaded.


It's a good thing xDan didn't say to be harsh... lol

 

Yeah... I pretty much made a dick out of myself, didn't I? :( :( :(

Point taken. Sorry XDaniel. Forgive me?

SO uses: 11 00 00 00 00 00 01 00
A common skybox command is: 11 00 00 00 01 00 00 00
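To make the "two byte fix" concrete, here's a minimal sketch in C that just diffs the two commands quoted above (the byte values are taken straight from this post; the meaning of the individual fields isn't specified here, so the sketch only reports which offsets differ):

[code]
#include <stdio.h>

/* The 8-byte scene header command as SO writes it, and the variant
 * commonly found in the game's own scenes, per the values quoted above. */
static const unsigned char so_skybox[8]     = { 0x11, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x00 };
static const unsigned char common_skybox[8] = { 0x11, 0x00, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00 };

int main(void)
{
    /* Report which byte offsets differ - the "two byte fix". */
    for (int i = 0; i < 8; i++) {
        if (so_skybox[i] != common_skybox[i])
            printf("byte %d differs: SO writes %02X, the game uses %02X\n",
                   i, so_skybox[i], common_skybox[i]);
    }
    return 0;
}
[/code]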

 

Personally, as far as a new scene development program goes, I'm not ready to see Sharp Ocarina go just yet. I've been sick(er) and haven't been around. The last release I knew of was v6?

 

I can work around the other issues I've listed, other than the need to disable texture conversion. So if you could find it in your heart to forgive me and turn out a new release of Sharp Ocarina with the ability to disable texture conversion on a texture-by-texture basis, I'd be happy already.


I like the idea of adding a dummy box/cylinder with roughly the same/appropriate dimensions as the actor, possibly with a 2D image of what the actor looks like for easy reference. This would add the possibility of snapping an actor to a specific surface. The problem with that is that actors can have different appearances depending on what object is loaded.

I meant more like general units, e.g. the size of a crawlspace, door, or push-block.


Haven't been... well, "working" is the wrong word I guess, but I haven't been doing stuff with/for this over the last few days. Also, I guess I'll look into that skybox command and maybe texture conversion; not sure if I can work the latter in on a per texture/material basis tho. And I wasn't really mad or anything, just a bit annoyed that a bug in SO had a solution and I didn't know about it :P


What is difficult to understand or perhaps even intimidating about the current map importers, SharpOcarina in particular? What do you think is missing from SO, what could be made easier?

 

For me, actors in SharpOcarina are the most difficult part, or at least the part that's most intimidating.

 

It wouldn't be intimidating for me at all if there was a dropdown menu where I could pick and choose what actors I want, instead of having to look through all these large actor lists, which are on various forums, and then insert the hex codes.

 

 

That is the most/only intimidating part for me. I started importing maps a couple of days ago, so it's safe to say I have the least experience with this. The only actor I've successfully loaded into my map is a door.


I think the only issues I have with the current one are the lighting, music not working, and the bugs where you must change something somewhere else to keep your changes. A prime example is the actors: you won't be able to change an actor's number or variable unless you move its position.

 

Music, lighting, easier actor movement, and showing the name of the actor - that's all I would need.


(2 weeks later...)

Okay, first of all, I haven't forgotten about this thread. A lot of thinking is being done on this subject, but I cannot and will not promise anything for now.

 

Next, I'd like to start a bit of a survey about everyone's computers. This is regarding the 3D preview that an importer as I imagine it (meaning SharpOcarina-ish) would have, and how accurate I could make it for, for example, per-surface and per-vertex transparency and colorization. OpenGL can only do so much there before I'd have to use more advanced features - in short, shaders - to simulate what the N64's color combiner can do.

 

So, what kind of graphics card or chipset do you have? An Intel GMA, an older Radeon or GeForce? A more modern Radeon HD, GeForce GTX or somesuch? Please post what model it is! And while less important, so it's no big deal if you don't know, how much video RAM do you have? In my case, it would be an ATI Radeon HD 5450 with 512 MB RAM.

 


So, what kind of graphics card or chipset do you have? An Intel GMA, an older Radeon or GeForce? A more modern Radeon HD, GeForce GTX or somesuch? Please post what model it is! And while less important, so it's no big deal if you don't know, how much video RAM do you have? In my case, it would be an ATI Radeon HD 5450 with 512 MB RAM.

I use an Intel HD (Sandy Bridge), but I don't see why this would be important. With the exception of hardware texture mirroring (which you would most likely implement a software fallback for), there's no reason any architecture-specific code would be needed to emulate the N64 combiner in GLSL.

 

I use an Intel HD (Sandy Bridge), but I don't see why this would be important. With the exception of hardware texture mirroring (which you would most likely implement a software fallback for), there's no reason any architecture-specific code would be needed to emulate the N64 combiner in GLSL.

 

ARB assembly language (the extension ARB_fragment_program in particular), not GLSL. To explain the situation in more detail: from all I know, when rendering with OpenGL's fixed-function pipeline - which is what I'd use because I'm used to it, despite it being deprecated in OpenGL 3.0 and removed in 3.1, and the other way would be shaders anyway - transparency/translucency can only come from one source, the diffuse color, but there'd be two possible sources in the model converter: material color (Ka or Kd, plus Tr/d for translucency in an .obj) and vertex color (not officially part of the .obj format, but it would/should be editable in-program at least).
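To illustrate that limitation, a minimal fixed-function sketch in C; pre-multiplying the two alpha sources on the CPU before submitting the vertex is just my assumption of a possible workaround, not what any existing tool does:

[code]
#include <GL/gl.h>

/* With the fixed-function pipeline, only one alpha value per vertex ever
 * reaches blending. If both a material alpha (the .obj's Tr/d) and a
 * per-vertex alpha exist, they have to be combined by hand beforehand.
 * Intended to be called between glBegin()/glEnd(). */
static void submit_vertex(float x, float y, float z,
                          float r, float g, float b,
                          float vertex_alpha, float material_alpha)
{
    float alpha = vertex_alpha * material_alpha;  /* combined on the CPU */
    glColor4f(r, g, b, alpha);
    glVertex3f(x, y, z);
}
[/code]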

 

Both of them could be used at the same time, e.g. having translucent water via a material that's not fully opaque and having a decal of some kind (different grass, a pathway or whatever) that fades out around the edges, or even both on the same surface, but they cannot be rendered correctly at the same time. It might be possible to work around this limitation for surfaces that use just one of the two "types" of transparency, but I cannot say for sure; I'd need to try this out. Also, while texture mirroring is actually easily done - IIRC via ARB_texture_mirrored_repeat, which should be supported by pretty much everything out there - there's something else that just cannot be done otherwise, and which I'd like to support somehow: multi-texturing. Well, I guess it can be done (ARB_multitexture), but that method appears to be too limited to accurately preview how it would look in-game.
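For reference, a minimal sketch in C of turning on mirrored wrapping, assuming either the ARB extension is advertised or the context is OpenGL 1.4+, where GL_MIRRORED_REPEAT is core; the extension-string check is my own addition:

[code]
#include <string.h>
#include <GL/gl.h>

/* Older gl.h headers may not define the core constant; the ARB token
 * has the same value. */
#ifndef GL_MIRRORED_REPEAT
#define GL_MIRRORED_REPEAT 0x8370
#endif

static void set_mirrored_wrap(GLuint texture)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);

    /* Only attempt it if the driver actually advertises the extension
     * (an OpenGL version check for 1.4+ would work just as well). */
    if (ext && strstr(ext, "GL_ARB_texture_mirrored_repeat")) {
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_MIRRORED_REPEAT);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_MIRRORED_REPEAT);
    }
    /* else: mirror the texture in software as a fallback */
}
[/code]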

 

(Hope that made some sense, I'm kinda tired by now, pretty late over here)


ARB assembly language (the extension ARB_fragment_program in particular), not GLSL. To explain the situation in more detail: from all I know, when rendering with OpenGL's fixed-function pipeline - which is what I'd use because I'm used to it, despite it being deprecated in OpenGL 3.0 and removed in 3.1

ARB extensions are part of GLSL, and acknowledged by the GLSL standard. Shaders are compiled at runtime for your GPU anyway, so writing machine-specific code for each architecture only places a large, unneeded burden on your shoulders :/ Also, GLSL shaders were introduced in OpenGL 2.1, so it's safe to say that all architectures support them to varying degrees.

and the other way would be shaders anyway - transparency/translucency can only come from one source, the diffuse color, but there'd be two possible sources in the model converter, material color (Ka or Kd, plus Tr/d for translucency in an .obj) and vertex color (not officially part of the .obj format, but would/should be editable in-program at least).

I'm not sure exactly how this ties together, but there are 2 combiner modes made using 4 separate equations:
[code:nocode]
color 1 = (a0 - b0) * c0 + d0
alpha 1 = (Aa0 - Ab0) * Ac0 + Ad0
color 2 = (a1 - b1) * c1 + d1
alpha 2 = (Aa1 - Ab1) * Ac1 + Ad1
[/code]

You'd need to support all 4 equations to emulate the combiner properly, so using only ambient and diffuse colors won't work (unless I'm mistaken and you're referring to how you're going to import the .obj models, not render them?). Also, just some simple advice: checking whether the fields are COMBINED or TEXEL1 will let you know if multi-texturing is being used, and it might be best to pass the arguments for the combiner to your shaders via a matrix uniform.
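For illustration, a minimal sketch in C of a single combiner cycle exactly as written in the equations above (the struct and function names are mine, and this covers only one cycle, not the full two-cycle pipeline):

[code]
/* One cycle of the color combiner: out = (a - b) * c + d, per channel. */
typedef struct { float r, g, b, a; } color_t;

static float combine1(float a, float b, float c, float d)
{
    return (a - b) * c + d;
}

static color_t combine_cycle(color_t a, color_t b, color_t c, color_t d)
{
    color_t out;
    out.r = combine1(a.r, b.r, c.r, d.r);
    out.g = combine1(a.g, b.g, c.g, d.g);
    out.b = combine1(a.b, b.b, c.b, d.b);
    out.a = combine1(a.a, b.a, c.a, d.a);  /* the alpha cycle has its own a/b/c/d sources */
    return out;
}
[/code]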

IIRC via ARB_texture_mirrored_repeat, which should be supported by pretty much everything out there - there's something else that just cannot be done otherwise, and which I'd like to support somehow: multi-texturing. Well, I guess it can be done (ARB_multitexture), but that method appears to be too limited to accurately preview how it would look in-game.

AFAIK, texture_mirrored_repeat is not an ARB extension but a proposed EXT extension, and Intel HD architectures like mine don't support it.

I'm just talking about rendering the .obj models, plus vertex colors that would be added to the parsed .obj model in the program. When not using shaders and just relying on OGL's fixed-function stuff - so glColor, glVertex, etc. instead of vertex shaders - I can specify more than one color, but only the alpha channel of one of them (diffuse) can be used to specify translucency when rendering.

 

With ARB_fragment_program, I'd just have to write a few lines of ARB assembly (see the sample code on the Wikipedia page I linked to) to combine the necessary colors, which is thus also far from a large burden, as long as the OpenGL implementation/graphics card/drivers/whatever of the system that the program runs on supports the extension. That's why I started the whole survey thing in the first place: when I used that extension before, some people ended up having rendering problems because their hardware did not support the extension, or did not support it correctly. And as for combiner emulation, that's even less of a problem as it's already being almost fully emulated by OZMAV2 (or libbadRDP rather, in C), SayakaGL (C#) and another handful of "my" programs, so taking that code and reusing it is just copy and paste.
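As a reference for what using that extension involves, a minimal sketch in C of loading an ARB fragment program; it assumes the GL_ARB_fragment_program entry points have already been resolved (for example via GLEW), and the error handling is reduced to a single check:

[code]
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>

static GLuint load_fragment_program(const char *source)
{
    GLuint prog;
    GLint err_pos;

    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(source), source);

    /* A parse error leaves its offset in GL_PROGRAM_ERROR_POSITION_ARB. */
    glGetIntegerv(GL_PROGRAM_ERROR_POSITION_ARB, &err_pos);
    if (err_pos != -1) {
        glDeleteProgramsARB(1, &prog);
        return 0;
    }

    glEnable(GL_FRAGMENT_PROGRAM_ARB);
    return prog;
}
[/code]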

 

Also, regarding ARB_texture_mirrored_repeat, it is ARB and over ten years old: http://www.opengl.org/registry/specs/ARB/texture_mirrored_repeat.txt

 


With ARB_fragment_program, I'd just have to write a few lines of ARB assembly (see the sample code on the Wikipedia page I linked to) to combine the necessary colors, which is thus also far from a large burden, as long as the OpenGL implementation/graphics card/drivers/whatever of the system that the program runs on supports the extension.

Looks like more work to me.

 

ARB fragment program:
[code:nocode]
!!ARBfp1.0
TEMP color;
MUL color, fragment.texcoord[0].y, 2.0;
ADD color, 1.0, -color;
ABS color, color;
ADD result.color, 1.0, -color;
MOV result.color.a, 1.0;
END
[/code]

GLSL:
[code:nocode]
#version 120
uniform sampler2D sampler;
void main() {
    gl_FragColor = texture2D(sampler, gl_TexCoord[0].xy);
}
[/code]

And as for combiner emulation, that's even less of a problem as it's already being almost fully emulated by OZMAV2 (or libbadRDP rather, in C), SayakaGL (C#) and another handful of "my" programs, so taking that code and reusing it is just copy and paste.

And I was just trying to help because I figured out the bit encoding for the combiner instructions and have some experience using GLSL. No need to be rude...

 

Also, regarding ARB_texture_mirrored_repeat, it is ARB and over ten years old: http://www.opengl.org/registry/specs/ARB/texture_mirrored_repeat.txt

I'd like to know how ARB_texture_mirrored_repeat is a 10-year-old ARB extension when the first GLSL standard was published in 2004. Wiki's got something wrong.

 

Also, the Sandy Bridge GPU that my PC uses was released in 2011 and does not support the texture mirroring extension.

 

 

EDIT:

 

Sorry if that last part sounded sarcastic, it's not intended to be; I'm actually curious... GLSL was introduced as only an extension to 1.4 and wasn't standardized until 2004, so I'm honestly confused about how there were ARB standards before then...


Looks like more work to me.

 

GLSL is more work for me because I've never worked with GLSL, while I know ARB assembly (enough of it, at least).

 

And I was just trying to help because I figured out the bit encoding for the combiner instructions and have some experience using GLSL. No need to be rude...

 

That wasn't supposed to be rude at all; I'm sorry if it came across as such. It just seemed to me like you think that I barely have an idea how the combiner works or how to emulate it, despite having co-created things like OZMAV2. And that "my" was in quotation marks to signify the co-creation thing, as I'm not the sole author of at least the C-based projects I've worked on; it wasn't meant to imply, for example, anything in comparison to your programs or somesuch.

 

I'd like to know how ARB_texture_mirrored_repeat is a 10-year-old ARB extension when the first GLSL standard was published in 2004. Wiki's got something wrong. Also, the Sandy Bridge GPU that my PC uses was released in 2011 and does not support the texture mirroring extension. EDIT: Sorry if that last part sounded sarcastic, it's not intended to be; I'm actually curious... GLSL was introduced as only an extension to 1.4 and wasn't standardized until 2004, so I'm honestly confused about how there were ARB standards before then...

 

ARB is short for the Architecture Review Board of OpenGL, which created GLSL. They've approved/created extensions to OpenGL - ones that don't even have anything to do with shaders, to boot - since long before GLSL even existed. And as you can see from the spec document for ARB_texture_mirrored_repeat, which is right on the OpenGL website and not Wikipedia or anywhere else, its status is "Complete. Approved by ARB on October 16, 2001." So yes, it is over ten years old, and I have no idea why your 2011 GPU doesn't support it when my rather low-end 2010 GPU does, just like my previous 2007 one (GeForce 8500 GT) did. Intel didn't care? Too old and useless an extension in their eyes? Really, no idea.


That wasn't supposed to be rude at all, I'm sorry if it came across as such. It just seemed to me like you think that I barely have an idea how the combiner works or how to emulate it

I wasn't exactly sure how much you knew about it, considering that you've only used the OpenGL 1/2 API for your modding tools up until this point. I figured that rather than decoding the combiner equations, you were probably detecting specific packets.
