Everything posted by SoulofDeity
-
I needed to extract some files from Ocarina of Time so I could look at some crap, so I ported z64dump3 to Linux. I also made some small improvements to the filename listing. So here's the source to z64dump4: http://www.mediafire.com/file/r5e4qr7weni7o1u/z64dump4.c/file As you can see below, it categorizes all the actors, and the filenames tell you what the actor number is and which object number it uses. Maps are also matched up with their scene files. The tool works on all Zelda ROMs, including Majora's Mask. If I can figure out how N64SoundTool works, I'll release a 5th version that dumps the ctl / tbl files.
-
This is easier said than done. N64 games don't have a native file system, and I doubt that most developers even bothered to make one. The Zelda games and Animal Forest (a.k.a. Animal Crossing N64) share the same file system format from what I've heard, but outside of that, you're looking at something that differs on a game-by-game basis.
-
Aside from nOVL, you could use a linker script to create a flat binary, then write a script that uses the 'readelf' tool with the '-r' option to list the relocations and their types, and then use a tool like 'awk' to convert the output into a list of entries that you can pipe onto the end of the file. If you're asking whether there's some way of making a vanilla gcc output an overlay, then the answer is no; at the very least, you'd need to make a new target for libBFD and recompile gcc against it.

I'm just stating what the N64 developer's manual itself tells game developers to do. It emphasizes that developers shouldn't use very large numbers in their matrices (which is uncommon anyhow), and has mildly informative charts showing where the values lose precision during conversions or calculations. The whole point of doing the calculations in floating point is that the number of artifacts is significantly reduced.
-
The Matrix Format

This is confusing as hell to explain, so I think it's best to start by explaining how 32-bit fixed-point calculations work.

A 32-bit fixed-point value has the format 's15.16', where the 's' represents the sign bit, the 15 represents the 15 integer bits, and the 16 represents the 16 fractional bits. For the fractional part, a value of 0 is equal to 0, and a value of 65,536 would be equal to 1. Note that you can never actually have a fractional value of 1, because the carried bit exceeds the 16-bit range and the value wraps back around to 0. You can convert to this format simply by multiplying by 65,536 (e.g. 0.5 * 65,536 = 32,768 = 0x8000), and convert back by dividing by 65,536. It's important to note that the carried bit mentioned before will actually overflow into the integer part of the number, just like how 0.9 + 0.2 = 1.1.

In addition to the encoding of the fractional number information, the s15.16 format also uses two's complement encoding. This means that when the sign bit is set, the actual integer value is 0 - (32,768 - the_value); e.g. a value of 32,765 with the sign bit set (0xFFFD) would equal 0 - (32,768 - 32,765) = -3. The same rule also applies to the fractional value, where the actual fractional value is 0 - (1.0 - the_value) when the sign bit is set. That said, this is just a side effect of the two's complement encoding, and has nothing to do particularly with the fractional number format.

The last thing we need to cover before we can get on to the actual matrix format is how overflow from addition and multiplication is handled for fixed-point values. First things first: because of the way that the fractional component is encoded, fixed-point addition, subtraction, multiplication, and division are all equivalent to integer addition, subtraction, multiplication, and division, which makes things much easier. The second thing you need to know is that the overflow from 32-bit addition and multiplication can easily be captured just by using 64-bit arithmetic; the confusing part is converting back to a 32-bit value. So how does the N64 do it? It extracts the "middle" 32 bits of the 64-bit value. This seems like some sort of sorcery, but just try it:

1.5 = (1 << 16) + (0.5 * 65536) = 0x00018000
0x00018000 * 0x00018000 = 0x0000000240000000
0x0000000240000000 -> 0x0000 00024000 0000 -> 0x00024000
0x00024000 >> 16 = 0x0002 = 2
0x00024000 & 0xFFFF = 0x4000 = 16384
16384 / 65536 = 0.25
2 + 0.25 = 2.25
1.5 * 1.5 = 2.25

Crazy, huh? That said, some of the values in rotation and projection matrices have really small fractional values that are prone to generate errors when multiplied against factors with much higher values. As a result, the N64 manual states that developers should perform their actual calculations using floating point and only convert the matrices to fixed point when they're uploading to the RSP.

Speaking of which, now that we understand how the math for each individual value works, it's time to look at the matrix format itself:

struct Matrix {
    int16_t  integer_parts[4][4];
    uint16_t fractional_parts[4][4];
};

Well, that makes no fucking sense, but that's what it is. If you're unfamiliar with C syntax, a [4][4] array is laid out in memory the same as a [16] element array. Technically speaking, gbi.h defines the matrix as a 'long [4][4]', which requires bit-shifting integer parts together and fractional parts together (which I personally find hideous and confusing).
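To make the middle-32-bits trick and the split layout concrete, here's a minimal C sketch; fx_mul and pack_matrix are my names, and the struct is repeated from above:

```c
#include <stdint.h>

/* s15.16 multiply: compute in 64 bits, keep the "middle" 32. */
int32_t fx_mul(int32_t a, int32_t b) {
    int64_t wide = (int64_t)a * (int64_t)b;   /* s31.32 intermediate */
    return (int32_t)(wide >> 16);             /* the middle 32 bits */
}

struct Matrix {
    int16_t  integer_parts[4][4];
    uint16_t fractional_parts[4][4];
};

/* Convert a float matrix into the split integer/fraction layout. */
void pack_matrix(const float in[4][4], struct Matrix *out) {
    for (int r = 0; r < 4; r++)
        for (int c = 0; c < 4; c++) {
            int32_t fixed = (int32_t)(in[r][c] * 65536.0f);   /* to s15.16 */
            out->integer_parts[r][c]    = (int16_t)(fixed >> 16);
            out->fractional_parts[r][c] = (uint16_t)(fixed & 0xFFFF);
        }
}
```

e.g. fx_mul(0x00018000, 0x00018000) returns 0x00024000, the 2.25 from the worked example above.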
-
I wasn't going to post this because I figured it'd be a waste of time, but then I figured, why not.

The Zelda64 Overlay Format

The overlays in the Zelda64 games are fairly simple object files. Each one is just a dump of the ".text" section, followed by a dump of the ".data" section, and then a dump of the ".rodata" section. Immediately after this, there's a small structure with the format:

struct {
    uint32_t size_of_text_section_in_bytes;
    uint32_t size_of_data_section_in_bytes;
    uint32_t size_of_rodata_section_in_bytes;
    uint32_t size_of_bss_section_in_bytes;
    uint32_t number_of_relocations;
};

This is just common knowledge to people who are familiar with object files, but the ".text" section contains the code, the ".data" section contains data that's both readable and writable, the ".rodata" section contains data that's read-only, and the ".bss" section contains data that's initialized to 0 (hence the reason there is no dump of the ".bss" section in the overlay file).

Immediately after this structure come the relocations themselves. Each relocation is a 32-bit value. The uppermost 4 bits are the section type:

enum {
    TEXT_SECTION   = 0,
    DATA_SECTION   = 1,
    RODATA_SECTION = 2,
    BSS_SECTION    = 3
};

The next 4 bits are the relocation type. Each relocation type corresponds exactly to the MIPS relocation types of the ELF file format. As you can see, relocations are actually pretty simple to do: you just extract the operands, apply the calculation, and write the result back (a sketch follows below). That said, there are a lot of different relocation types, so if you want to fully support the linking and/or loading of object files, it's going to be a bit of a pain in the ass.

Immediately after all of the relocations, padding is added to the file such that it's 16-byte aligned minus 4 bytes, and there must be more than 0 bytes of padding. So if the file ends with:

00001000: xx xx xx xx xx xx xx xx xx xx xx xx xx xx xx xx
00001010: xx xx xx xx xx xx xx xx xx xx xx xx xx xx xx xx
00001020: xx xx xx xx xx xx xx xx xx xx xx xx

then you'll need to add 16 bytes of padding. Immediately after this padding is a 32-bit pointer to the structure that contains the section sizes and the number of relocations.
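To make the relocation walk concrete, here's a hedged C sketch. The post above only pins down the top two nibbles, so treating the low 24 bits as an offset into the section is my assumption, and only R_MIPS_32 (an absolute 32-bit pointer) is handled:

```c
#include <stdint.h>

enum { R_MIPS_32 = 2 };   /* from the ELF MIPS relocation types */

/* section_base[] holds the load addresses of .text/.data/.rodata/.bss;
 * load_delta is the new load address minus the address the overlay
 * was linked at. */
void apply_relocs(uint8_t *section_base[4], const uint32_t *relocs,
                  uint32_t count, int32_t load_delta) {
    for (uint32_t i = 0; i < count; i++) {
        uint32_t section = relocs[i] >> 28;            /* uppermost 4 bits */
        uint32_t type    = (relocs[i] >> 24) & 0xF;    /* next 4 bits */
        uint32_t offset  = relocs[i] & 0x00FFFFFF;     /* assumed */
        if (type == R_MIPS_32) {
            uint32_t *target = (uint32_t *)(section_base[section] + offset);
            *target += (uint32_t)load_delta;           /* shift the pointer */
        }
        /* HI16/LO16 come in pairs and need the usual sign-carry
         * handling; they're left out of this sketch. */
    }
}
```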
The Combiner Modes

The combiner is executed after the texturing unit samples and filters texels from the tiles (which is also after the rasterizer generates fragments from the vertices). There are 2 types of combining: color combining and alpha combining. Both use the equation (A - B) * C + D, but the operands that they accept differ.

For the color combiner:

enum {
    CC_A_COMBINED_COLOR    = 0,  // the combined color from cycle 1
    CC_A_TEXEL0_COLOR      = 1,  // the color of the texel for tile 0
    CC_A_TEXEL1_COLOR      = 2,  // the color of the texel for tile 1
    CC_A_PRIMITIVE_COLOR   = 3,  // the primitive color, set via the 0xFA SetPrimColor command
    CC_A_SHADE_COLOR       = 4,  // the shade (vertex) color
    CC_A_ENVIRONMENT_COLOR = 5,  // the environment color, set via the 0xFB SetEnvColor command
    CC_A_1                 = 6,  // vec3 { 1.0, 1.0, 1.0 } rgb
    CC_A_NOISE             = 7,  // a random color
    CC_A_0                 = 8   // vec3 { 0.0, 0.0, 0.0 } rgb
    // values 9..15 are all the same as value 8
};

enum {
    CC_B_COMBINED_COLOR    = 0,  // the combined color from cycle 1
    CC_B_TEXEL0_COLOR      = 1,  // the color of the texel for tile 0
    CC_B_TEXEL1_COLOR      = 2,  // the color of the texel for tile 1
    CC_B_PRIMITIVE_COLOR   = 3,  // the primitive color, set via the 0xFA SetPrimColor command
    CC_B_SHADE_COLOR       = 4,  // the shade (vertex) color
    CC_B_ENVIRONMENT_COLOR = 5,  // the environment color, set via the 0xFB SetEnvColor command
    CC_B_CENTER            = 6,  // the chroma key center, set via the SetKeyR and SetKeyGB commands
    CC_B_K4                = 7,  // the chroma key K4 constant, set via the SetConvert command
    CC_B_0                 = 8   // vec3 { 0.0, 0.0, 0.0 } rgb
    // values 9..15 are all the same as value 8
};

enum {
    CC_C_COMBINED_COLOR    = 0,  // the combined color from cycle 1
    CC_C_TEXEL0_COLOR      = 1,  // the color of the texel for tile 0
    CC_C_TEXEL1_COLOR      = 2,  // the color of the texel for tile 1
    CC_C_PRIMITIVE_COLOR   = 3,  // the primitive color, set via the 0xFA SetPrimColor command
    CC_C_SHADE_COLOR       = 4,  // the shade (vertex) color
    CC_C_ENVIRONMENT_COLOR = 5,  // the environment color, set via the 0xFB SetEnvColor command
    CC_C_SCALE             = 6,  // the chroma key scale, set via the SetKeyR and SetKeyGB commands
    CC_C_COMBINED_ALPHA    = 7,  // the combined alpha from cycle 1
    CC_C_TEXEL0_ALPHA      = 8,  // the alpha of the texel for tile 0
    CC_C_TEXEL1_ALPHA      = 9,  // the alpha of the texel for tile 1
    CC_C_PRIMITIVE_ALPHA   = 10, // the primitive alpha, set via the 0xFA SetPrimColor command
    CC_C_SHADE_ALPHA       = 11, // the shade (vertex) alpha
    CC_C_ENVIRONMENT_ALPHA = 12, // the environment alpha, set via the 0xFB SetEnvColor command
    CC_C_LOD               = 13, // the actual lod fraction
    CC_C_PRIMITIVE_LOD     = 14, // the primitive lod fraction, set via the 0xFA SetPrimColor command
    CC_C_K5                = 15, // the chroma key K5 constant, set via the SetConvert command
    CC_C_0                 = 16  // vec3 { 0.0, 0.0, 0.0 } rgb
    // values 17..31 are all the same as value 16
};

enum {
    CC_D_COMBINED_COLOR    = 0,  // the combined color from cycle 1
    CC_D_TEXEL0_COLOR      = 1,  // the color of the texel for tile 0
    CC_D_TEXEL1_COLOR      = 2,  // the color of the texel for tile 1
    CC_D_PRIMITIVE_COLOR   = 3,  // the primitive color, set via the 0xFA SetPrimColor command
    CC_D_SHADE_COLOR       = 4,  // the shade (vertex) color
    CC_D_ENVIRONMENT_COLOR = 5,  // the environment color, set via the 0xFB SetEnvColor command
    CC_D_1                 = 6,  // vec3 { 1.0, 1.0, 1.0 } rgb
    CC_D_0                 = 7   // vec3 { 0.0, 0.0, 0.0 } rgb
};

Well, that seems messy as fuck, but there's no helping that; it's just a side effect of them trying to cram as much crap into a single command as possible.
Anyhow, here are the alpha combiner modes:

enum {
    AC_ABD_COMBINED_ALPHA    = 0,  // the combined alpha from cycle 1
    AC_ABD_TEXEL0_ALPHA      = 1,  // the alpha of the texel for tile 0
    AC_ABD_TEXEL1_ALPHA      = 2,  // the alpha of the texel for tile 1
    AC_ABD_PRIMITIVE_ALPHA   = 3,  // the primitive alpha, set via the 0xFA SetPrimColor command
    AC_ABD_SHADE_ALPHA       = 4,  // the shade (vertex) alpha
    AC_ABD_ENVIRONMENT_ALPHA = 5,  // the environment alpha, set via the 0xFB SetEnvColor command
    AC_ABD_1                 = 6,  // 1.0
    AC_ABD_0                 = 7   // 0.0
};

enum {
    AC_C_LOD               = 0,  // the lod fraction
    AC_C_TEXEL0_ALPHA      = 1,  // the alpha of the texel for tile 0
    AC_C_TEXEL1_ALPHA      = 2,  // the alpha of the texel for tile 1
    AC_C_PRIMITIVE_ALPHA   = 3,  // the primitive alpha, set via the 0xFA SetPrimColor command
    AC_C_SHADE_ALPHA       = 4,  // the shade (vertex) alpha
    AC_C_ENVIRONMENT_ALPHA = 5,  // the environment alpha, set via the 0xFB SetEnvColor command
    AC_C_PRIMITIVE_LOD     = 6,  // the primitive lod fraction, set via the 0xFA SetPrimColor command
    AC_C_0                 = 7   // 0.0
};

One thing that should also be mentioned (if you haven't inferred it already from the comments above) is that the 0xFC SetCombine command actually sets 2 color combine modes and 2 alpha combine modes at the same time. In single-cycle mode, the 2 modes are exactly the same, and the COMBINED_COLOR / COMBINED_ALPHA values are unused. In double-cycle mode, the two modes are different. Double-cycle mode is typically used for multitexturing.

Algebraically speaking, because the combining equation is (A - B) * C + D, the following statements hold true:

Rule 1: if B = 0, then (A - B) * C + D = A * C + D
Rule 2: if C = 0, then (A - B) * C + D = D
Rule 3: if D = 0, then (A - B) * C + D = (A - B) * C
Rule 4: if B = 0 and D = 0, then (A - B) * C + D = A * C

Given these statements, here are some common color combiner modes:

Modulate = TEXEL0_COLOR * SHADE_COLOR  // See Rule 4
Decal    = TEXEL0_COLOR                // See Rule 2
Blend    = (PRIMITIVE_COLOR - ENVIRONMENT_COLOR) * TEXEL0_COLOR + ENVIRONMENT_COLOR

and here are some common alpha combiner modes:

Select   = TEXEL0_ALPHA                // See Rule 2
Multiply = TEXEL0_ALPHA * TEXEL1_ALPHA // See Rule 4
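The equation itself is trivial once the operands are selected; per channel, one combiner cycle is just this (values normalized to 0.0..1.0 here for clarity, while the hardware works in fixed point):

```c
typedef struct { float r, g, b; } Color;

/* One color-combiner cycle: (A - B) * C + D, per channel. */
Color combine(Color a, Color b, Color c, Color d) {
    Color out = {
        (a.r - b.r) * c.r + d.r,
        (a.g - b.g) * c.g + d.g,
        (a.b - b.b) * c.b + d.b,
    };
    return out;
}

/* e.g. Modulate: combine(texel0, zero, shade, zero) == texel0 * shade */
```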
-
That's pretty badass!
-
This is pretty awesome, but I'm not really sure you can call it 'decompilation'... disassembly with more human-readable mnemonics, maybe. Now, if you're really going for decompilation, some suggestions I'd offer:

- Scan each of the code branches, adding any argument registers (a0, a1, a2, a3) to a whitelist when you encounter an instruction that reads or writes them. This will let you detect when you've encountered a function that takes fewer than 4 arguments.
- Check when an instruction tries to read or write a value on the stack in a location before the stack pointer has been adjusted. IIRC, the MIPS calling convention has the caller do cleanup, so any space that the callee allocates on the stack should only be used for local variables. Combined with the previous check, you may be able to use this to determine the exact number of arguments for each function.
- Check for cases when lui/addiu and lui/ori instructions are used together. This usually means a 32-bit value is being loaded into a register (see the sketch after this list).
- Check if conditional branches or jumps are jumping to a location within the same function. This implies that there is a loop.

In any case, true decompilation is really difficult to perform effectively. Some of the more advanced decompilers resort to heuristics and whatnot.
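Here's a minimal C sketch of the lui/addiu and lui/ori pairing heuristic. It only checks adjacent instructions; a real pass would track which register the lui wrote until it's overwritten:

```c
#include <stdio.h>
#include <stdint.h>

/* 'insns' holds already-byteswapped 32-bit MIPS instruction words. */
void find_32bit_loads(const uint32_t *insns, int count) {
    for (int i = 0; i + 1 < count; i++) {
        uint32_t op0 = insns[i] >> 26;
        uint32_t op1 = insns[i + 1] >> 26;
        if (op0 != 0x0F)                      /* 0x0F = lui */
            continue;
        uint32_t upper = (insns[i] & 0xFFFF) << 16;
        if (op1 == 0x09)                      /* 0x09 = addiu, sign-extends */
            printf("0x%08X\n",
                   upper + (uint32_t)(int32_t)(int16_t)(insns[i + 1] & 0xFFFF));
        else if (op1 == 0x0D)                 /* 0x0D = ori */
            printf("0x%08X\n", upper | (insns[i + 1] & 0xFFFF));
    }
}
```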
-
I've been doing a little research on how the game's physics work. I've been using the debug ROM, but this information should translate over nonetheless...

In the debug ROM RAM map, there are 2 functions called CheckCollision_setAT (@0x8005d79c) and CheckCollision_setOT (@0x8005dc4c). The second one (setOT) is called the most often. It seems to be used for all the collision in the game that should only require simple bounds checks; e.g. whether or not you're walking into a bush, tree, sign, NPC, or any other object. The first one (setAT) seems to be used for all collision that requires complex checks that traverse an actor's joint hierarchy; it's used for sword swings, rolling, Epona's feet, projectiles, explosions, and enemy attacks from what I can see so far. They aren't called that often, maybe once per frame for setOT, so I think they might be called merely to set the location of the actor and object instance tables for the physics engine or something.

As far as motion is concerned, it's been documented that each actor instance has 3 position/rotation pairs. In a header for some custom actor examples in the old vg64tools svn trunk, I noticed that someone named one of the fields 'acceleration'. From this, I have a reasonable hunch that the 3 position/rotation pairs are basically past/present/future transforms for an actor. The past transform should remain unchanged from the last iteration of the game loop; the future transform is where the actor should be in the next iteration; and the current transform would be some kind of linear interpolation between the two, calculated with respect to the game's physics (see the sketch below). That is to say, actors can basically ignore the physics engine as long as the physics engine is aware of the simple and complex (hierarchical) collision data for the object.

As for the collision format itself, I've found quotes online that the Zelda64 engine is based on the Mario 64 engine, which uses a fast sweeping-ellipsoids algorithm; and there's a function in the game called 'ClObjJntSph_set3' (Collision Object - Joint Sphere?). While the game appears to never call this function, I think it gives a subtle clue that the complex hierarchical collision checking works on a joint-by-joint basis.
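If that hunch is right, the per-frame blend would be just a lerp between the two poses. A minimal sketch, with names that are mine and not the game's:

```c
typedef struct { float x, y, z; } Vec3;

/* t = 0 gives the past pose, t = 1 the future pose; the physics step
 * would pick t based on how far the frame has progressed. */
Vec3 lerp_vec3(Vec3 past, Vec3 future, float t) {
    Vec3 out = {
        past.x + (future.x - past.x) * t,
        past.y + (future.y - past.y) * t,
        past.z + (future.z - past.z) * t,
    };
    return out;
}
```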
-
Too much, effort-wise. Creating a custom game engine is really complex, especially when you start getting into audio and physics. But even beyond that, you still have to write tools for creating scenes and converting resources before you can even think about getting started on making the game itself. I had completed the module loading system, completed the actor loader (for both native code and Lua scripts), written the OpenGL mesh renderer, and was a good way through writing a new tool (based on a fixed version of one of my prior tools, z64dump3) to rip models from the game in FBX format. Maybe I'll continue working on it in the future as a personal project, but for now, I just want to be productive.
-
Unity5's new canvas system is making it really easy to make GUIs that scale to different resolutions. Here, you can see 4:3 and 16:9 aspect ratios. ~Pixel Perfect
-
Well, in light of some recent realizations, I decided to drop the idea of using my own game engine and go back to Unity3D. I usually prototype my ideas in it anyhow, and I'm more productive that way. Besides, Unity5 seems to make a couple of things easier...

Ignoring the 'Ethan' standard-asset model in the middle of the screen, Unity5 allows you to attach a Skybox component to a camera to override the default skybox. The skybox is just a material, which in this case uses a ShaderLab shader that interpolates between a 'clear day' and a 'clear dusk' skybox. Since the skybox uses an orientation that's independent of the camera, I can parent the sun, moon, and stars to the 'Sky Camera' and rotate it to simulate a moving sky. This is a much cleaner way of implementing custom skyboxes than what I was doing before.

It's going to take a little time to get things back to the more impressive state that the project was in before, but it'll be worth it in the long run. I can still make it moddable like I planned with CX, by generating assets at runtime, but that's not high up on my to-do list.
-
Adding Text and Collision to custom actor?
SoulofDeity replied to mooliecool's topic in Modifications
I don't know how the game actually does collision, but I do know how you could do it. The game has a segment table that contains the offsets of each segment in memory; segments 2 and 3 are the map and scene (can't remember which is which, though). The scene file has collision meshes in it. If you know the math, you can implement raycasting or box/capsule/sphere-on-triangle-soup collision detection.

As for the text, how Navi perceives an actor depends on the actor's type tag, which is a good place to start as far as making your actor targetable goes. Flotonic wrote a tool a while back that could modify the game's text, which should still be on the maco64 website. As for opening the textbox, I would try to find a specific message in the message table in RAM using Nemu64's debugger, set a watchpoint and breakpoint on it, and then talk to the character or whatever opens the dialog to see if you can find the function that loads it. Sorry I don't have exact solutions; not much is known about the game's internal functions.
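Since everything above hinges on segment addresses, here's roughly how resolving one works. The 0xSSOOOOOO layout is well documented for Zelda64, but the table's location in RAM differs per version, so 'segment_table' here is an assumption:

```c
#include <stdint.h>

extern uint32_t segment_table[16];   /* physical base address of each segment */

/* A segment address has the form 0xSSOOOOOO: SS = segment, OOOOOO = offset. */
uint32_t resolve_segaddr(uint32_t segaddr) {
    uint32_t segment = (segaddr >> 24) & 0x0F;
    uint32_t offset  = segaddr & 0x00FFFFFF;
    return segment_table[segment] + offset;
}
```
-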
Project CX - Zelda Modding About To Get Real
SoulofDeity replied to SoulofDeity's topic in Community Projects
These are just some personal notes I've been taking on the special effects in the games:

Shadows
- Spot shadow: used for anything other than the player, or when the player is in the air (to make it easier to predict the landing spot); it's just a horizontal plane drawn at the character's feet.
- Foot shadow: used only by the player; has a length and direction that depend on the angle between the player and the light source. Each foot has a separate foot shadow for each light source.
- Real shadow: unused; it requires a stencil buffer and is thus expensive.

Stars
- Stars are only seen in Majora's Mask and are rendered independently of each other. The north star is red, stars that make up constellations are various shades of blue, and all other stars are various shades of gray or white that 'twinkle' in and out of visibility when the camera moves.

Rain
- Splashes are emitted independently of the rain on a 'splash plane' drawn at the same level as the player's feet; however, the size of the plane depends on the resolution of the screen.

Trails
- Used by the player's footsteps, Deku Link spins, and sword slashes. The footstep and Deku Link spin trails never clip through the floor like shadows do, so it's possible that they're implemented with a stencil buffer.

Coronas
- Coronas, drawn around most point light sources like torches and fairies, are occluded via raycasting from the camera, but drawn over everything else in a scene (except GUI elements like menus and text boxes).

Heat Distortion
- Heat distortion, like you see while in the Death Mountain Crater, is implemented by 'jiggling' the frustum of the camera.

Skeletal Animation
- In OoT / MM, skinning (binding vertices to a skeleton) is performed via a technique known as 'linear blend skinning'. First, forward kinematics are applied: the local-space transformation matrix of each joint is converted into world space by traversing the hierarchy of the skeleton from the root node down to each of the terminal leaf nodes (see the sketch below). Second, any necessary inverse kinematics are applied. This is used to do things like correct the placement of the player's feet or make a character turn their head to look at something, and entails traversing the hierarchy of the skeleton backwards from the terminal leaf nodes up an arbitrary number of limbs.
- It should be noted that, while OoT/MM does use linear blend skinning, which has been standard in the game-design industry for many years, dual-quaternion skinning is ~2.5x faster and has fewer artifacts.
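The forward-kinematics pass is the mechanical part of this, so here's a minimal C sketch of it; the Joint layout is illustrative, not the game's:

```c
/* Column-major 4x4 multiply: out = a * b. */
void mat4_mul(const float a[16], const float b[16], float out[16]) {
    for (int c = 0; c < 4; c++)
        for (int r = 0; r < 4; r++) {
            float sum = 0.0f;
            for (int k = 0; k < 4; k++)
                sum += a[k * 4 + r] * b[c * 4 + k];
            out[c * 4 + r] = sum;
        }
}

typedef struct Joint {
    float local[16];            /* local-space transform */
    float world[16];            /* filled in by the FK pass */
    struct Joint *children;
    int child_count;
} Joint;

/* Walk the skeleton from the root down, converting each joint's
 * local-space matrix into world space. */
void forward_kinematics(Joint *joint, const float parent_world[16]) {
    mat4_mul(parent_world, joint->local, joint->world);
    for (int i = 0; i < joint->child_count; i++)
        forward_kinematics(&joint->children[i], joint->world);
}
```
-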
Project CX - Zelda Modding About To Get Real
SoulofDeity replied to SoulofDeity's topic in Community Projects
OoT3D/MM3D-style graphics are also a possibility. Really, you can do anything. What I'm interested in is the physics and special effects that polish the game and make it look amazing. Basically, I want to break down each special effect to understand every individual piece, find the most optimal alternative possible, and then put the effect back together. E.g., for the Gerudo Valley video, one of the things that I mentioned was the heat distortion effect. OoT and MM on the N64 implemented this effect in Death Mountain Crater by jiggling the frustum of the camera; it's not as great an effect, though, because it lacks the refraction you see in the UE video. -
Project CX - Zelda Modding About To Get Real
SoulofDeity replied to SoulofDeity's topic in Community Projects
Just something to note about my previous post regarding game aesthetics: the one reason I refuse to use the Unreal Engine is that... I can't. Literally. On all of my computers, the engine is completely unusable, with terrible graphics glitches and an average of about 1 FPS. Rather than trying to use the latest and greatest features out there and expecting people to have $1200 SLI rigs just to play games, I'm interested in finding alternative means of implementing the same graphics effects without incurring huge amounts of overhead. -
Project CX - Zelda Modding About To Get Real
SoulofDeity replied to SoulofDeity's topic in Community Projects
So, I've been thinking a lot today about the graphics capabilities of the engine, and I think that we shouldn't limit ourselves to lightweight N64-style graphics, but merely make them fallbacks to meet hardware constraints. Why? Because N64-style graphics are perfect if you're trying to target mobile devices, but if I'm going to play the game on my PC, I want it to look like this: I doubt that any of you will disagree.

Now, the question is, how do we reach THAT? Well, there are several special effects he's pulling off here if you look closely. It's a combination of bloom lighting, global illumination (possibly radiosity or a lightmass), heat wave distortion, motion blur, a wind zone, particle systems, and volumetric fog. Bloom lighting, heat wave distortion, and motion blur are all simple (well... intermediate, I should say) post-processing effects that can be performed with shaders. Global illumination is a bit painful to implement, but basically allows for extremely realistic lighting by having light bounce off of objects. Particle systems and volumetric fog are both controlled by the game's physics. As for the map he's using, the way the walls protrude tells me that he's not using a terrain generator. Most likely, he sculpted it in a tool like ZBrush, baked some bump and/or displacement maps, and decimated the mesh to reduce the polycount.

Now, I'm not saying it'll be easy, but the only things that I think will truly be difficult to implement here are the global illumination and the volumetric fog (mainly because I'm not very familiar with the techniques involved). It mostly boils down to creativity and imagination when it comes to these things; I can assure you that the same set of tools in a different pair of hands might not have looked even half as good.

---

The quality of graphics aside, another thing I've been thinking about is the implementation of extreme weather, because I honestly think my heart would explode out of my chest if I saw something as epic as a tornado ripping its way through Hyrule Field. Especially if it was in that UE-style graphics. >_>

Well, aesthetically, you guys now have an idea of the kind of scalability I'm hoping to achieve. The engine should be compact and fast enough to run very well on an embedded platform, but it shouldn't stand in your way if you want to make something that covers computer monitors in grey matter. -
Well, you can do whatever you want as far as the game is concerned. The engine supports a text-based interchange format for all the assets by default (note: the graphics API is still flexible enough to write your own model loaders if you so wish). As far as reverse engineering goes, I'm working on a new tool called z64rip that basically does a shitload of black magic to find as many assets as possible in the games and rip/convert them into manageable assets. The scripting part of the game engine has been a bit of a pain; enough so that I've considered just going back to Unity, but I think that ultimately I'll just stick with making my own engine.
-
Right now, I'm working on one of the biggest projects I've ever written, one which will probably spark a whole new life into the Zelda modding community. This new project, dubbed "Project CX", is an easily portable, lightweight, cross-platform engine for creating Zelda or Zelda-like RPGs. Unique to this engine is that it is designed specifically with the intention of being modded. All quests, scenes, models, effects, fonts, user interfaces, actors, sound banks, music, and so forth (that is to say, anything that is not specifically related to the core of the engine) is configured via a very simple interchange format that I've designed. Furthermore, the game will connect to a configurable list of repositories that lets you download new content in the form of mods. The goal of this project is to breathe new life into the modding community by lowering the barrier of complexity.

Architecturally, the engine revolves around a quest system. Quests are a series of temporary or persistent events. Events can be used to execute scripts, load scenes, play cutscenes, or start and stop other quests. Scenes are a set of entrances, exits, paths, and cutscenes associated with a map and a collision/navigation mesh. A map is a set of rooms associated with a set of models, events, triggers, and actors to be spawned. An actor is a script (Lua) associated with a model, status, inventory, and controller. A controller is an abstract interface that can accept input from the user, a pathfinder, a script, AI, etc. The controller's job is to generate a sequence of actions that is processed by the navigation / animation / physics framework. This vastly simplifies the task of creating new enemies and NPCs for the game (see the sketch below for the general shape of that interface).

In addition to the engine itself, I'm also writing a new tool called z64rip that extracts models (objects/maps), scenes, music, and soundbanks from the Zelda64 games into my engine's interchange format, plus a Blender import/export script for my model format. You can expect within a few weeks from today (12/7/2016) to be able to see practically all scenes and maps from OoT and MM imported into the engine.

Progress-wise, z64rip can already dynamically decompress any Zelda64 game and locate the scene and sequence pointer tables in their code files; the next step there is iterating through the scene table to construct models and save them to disk. On Project CX, I've recently finished designing the interchange format and writing the module loading system, which includes the initialization of the transparent graphics interface that can bind a camera to move through a scene. I'm currently working on the code for loading and rendering models.

I can't put an exact date on how long it will take. Some days I'm much more productive than others, but I have a very good feeling that I will have something to show sometime between now and mid-January. I'll be keeping everyone up to date as progress is made.
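Purely as illustration of the controller abstraction described above (none of these names come from the actual Project CX source):

```c
/* Hypothetical sketch: a controller emits actions; the engine's
 * navigation / animation / physics framework consumes them.  The same
 * interface can be backed by user input, a pathfinder, a Lua script,
 * or an AI. */
typedef enum { ACTION_MOVE, ACTION_JUMP, ACTION_ATTACK } ActionType;

typedef struct {
    ActionType type;
    float params[4];            /* e.g. direction and speed */
} Action;

typedef struct Controller {
    /* Fill 'out' with up to 'max' actions for this frame; return count. */
    int (*poll)(struct Controller *self, Action *out, int max);
    void *state;                /* backend-specific data */
} Controller;
```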
-
Um. I didn't leave because of a conflict with other members; I just lost interest in modding in general for a while. Also, this project is as alive or dead as you want it to be. I started it fully with the intention of it being a community project (which is why the source code is on GitHub). I personally stopped working on it because Unity doesn't integrate well with GitHub, and I was spending more time rewriting other people's code to play nicely together than I was writing my own code; it was becoming a bit too tedious for me. Also, downloading commits was a nightmare, because Unity wanted to recompile all of the audio files each and every time...

Currently, I'm working on writing my own Zelda64 game engine from scratch that doesn't share the same weaknesses as Unity and is designed from the ground up specifically to be moddable. It uses a custom text-based interchange format I've designed for things like:

Quests - a quest file is a series of events that can execute scripts, load scenes, etc.
Scenes - a scene file is a collection of entrances, exits, paths, and cutscenes associated with a collision/navigation mesh and one or more maps
Models
Actors - an actor file is a Lua script associated with a model, inventory, and controller that can perform actions managed by the game's navigation / animation / physics framework
Soundbanks - a soundbank file is a collection of instruments associated with keymapped, multisampled stereo sounds with volume envelopes; used by music sequences

With the exception of images and sound, you will be able to replace or extend any content you want in the game using nothing but a text editor. I'm also working on tools to work with the interchange format and better extract resources from the Zelda64 games (e.g. a tool to extract actual entrances, exits, paths, etc. from scenes in the games).
-
Help with loading a separate sheath for Biggorons Sword?
SoulofDeity replied to PwnzLPs's topic in Modifications
Changing the model in memory does not change it permanently throughout the entire game. The point of using segments is that zobjs can be loaded anywhere in memory and still work, as long as the entries in the segment table point at the right places. Knowing this, all you have to do is add a single DE command into Link's zobj that points to the sheath (branch, no return). Then, write a hook somewhere in the player actor file that looks up the physical address of the zobj in the segment table (it'll be the 7th entry, because models use segment 6), adds the offset of the custom DE command + 4, and changes the offset there depending on which sword Link has equipped.
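Something like this, where every name and offset is hypothetical and only illustrates the idea:

```c
#include <stdint.h>

extern uint32_t segment_table[16];    /* physical segment bases */
#define CUSTOM_DE_OFFSET 0x5000       /* assumed offset of the added DE command */

/* Resolve Link's zobj through the segment table (segment 6 = 7th entry),
 * then rewrite the operand of the custom DE command; +4 skips the
 * command word itself. */
void update_sheath(int equipped_sword, const uint32_t sheath_offsets[3]) {
    uint32_t zobj_phys = segment_table[6];
    uint32_t *operand  = (uint32_t *)(uintptr_t)(zobj_phys + CUSTOM_DE_OFFSET + 4);
    *operand = sheath_offsets[equipped_sword];
}
```
-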
I'm currently using version 5. It seems to have this neat feature where it detects when you're using old APIs and rewrites your code for you.
-
Instead of jumping right into the dirty work, I've wanted to play around a bit. I managed to create a fairly decent rain effect. A stillshot doesn't really do it justice; in game, the splashes on the ground almost make it look like there's flowing water. I originally figured out how the effect worked from watching a tutorial on YouTube, but the dingus who made the tutorial built the effect rotated 270 degrees on the X-axis so the cone-shaped emitters would point upwards; this hack only works in preview mode, so I had to spend the next hour or so tinkering with the velocity over lifetime and figuring out how to simulate the particles in world space. The difficult thing now is trying to figure out how to render it so that it follows the camera.

EDIT: I just finished migrating the skybox from the old project. There's a major improvement in the rendering pipeline, though. The old steps were:

1. clear color and depth
2. render sky
3. clear depth
4. render everything else

The new steps are:

1. clear depth
2. render sky (with writing to the depth buffer disabled)
3. render everything else

This takes advantage of the fact that, since the skybox is the only thing being rendered on the sky layer, there is no need to worry about which order its polygons are drawn in; and because the skybox is always rendered, there is no need to clear the color of the screen.

EDIT #2: For the sake of resolving my own annoyances, I fixed the rotation and scaling of the FBX model for the skybox. This allowed me to remove some of the setup code in the start function and rotate on the up-axis instead of the right-axis (which makes more sense). I also changed the "SetSkybox" function to be static and call it only once during the start function rather than once every frame. Now, if you want to change the skybox in game, you just call "Environment.SetSkybox (blend_value, skybox1_material, skybox2_material)".

EDIT #3: Some notes I wrote out regarding how to optimize performance in some areas and solve a few problems:

- When exporting FBX files from Blender, rotate them on the X-axis by -90 degrees, apply the transformation to the object, and then, in the export options, set it to +Y forward, Z up. This ensures that the model is oriented correctly when it's imported into Unity.
- When importing animations, use the legacy mode. My animation importer will then strip the keys for translation and scale. This not only reduces the amount of memory the animation uses, but also allows you to use the same animations on multiple skeletons.
- Almost (if not) all models should be using fake shadows, so make sure you change the "Cast Shadows" property of the mesh renderer to "Off".
- Only terrains need to receive shadows, so for everything else, uncheck the "Receive Shadows" property of the mesh renderer.
- Because the game uses simplistic lighting, uncheck the "Use Light Probes" property of the mesh renderer.
- Anything that doesn't move should be marked as static.
- Make sure the format of each texture is set to "Truecolor". By default, it's set to "Compressed", which causes distortion.
- Make sure the wrap mode of each texture is set to "Clamp". Unity doesn't support separate X/Y clamp/repeat options, so repeating should always be done in the shader instead, using the 'frac' function.
- Never use the default Unity shaders or surface shaders. They perform a ton of slow lighting calculations you don't need.
Here, you can see the use of the 'frac' function to reproduce the texture repeat on the grass, despite the fact that the texture's wrap mode is set to Clamp. This should solve that annoying problem with the edges on the top of mountains in Hyrule Field. The math behind the trick is below.
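A tiny sketch of the per-axis trick, written in C for clarity (in the actual shader it's the same expression with HLSL's frac, which is x - floor(x)):

```c
#include <math.h>

/* Repeat on one axis with frac-style wrapping while the texture itself
 * stays clamped; the other axis keeps the clamp. */
void wrap_uv(float u, float v, float *out_u, float *out_v) {
    *out_u = u - floorf(u);                            /* repeat on U */
    *out_v = v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);  /* clamp on V */
}
```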
-
I may sound like a prick when I say this, but I could rewrite that, with support for orbiting the player, in just 4 lines of code (see the sketch below). The difficult things with camera movement are dealing with collision so objects don't get between the player and the camera, deciding how far the camera should be from the player when the player is falling or running, where to position the camera when you're locked onto a target, and how to handle cutscenes.
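For what it's worth, the 4 lines would be plain spherical coordinates around a target; all names here are mine, not from the code being replied to:

```c
#include <math.h>

/* Orbit camera: yaw/pitch in radians, dist in world units. */
void orbit(const float target[3], float yaw, float pitch, float dist,
           float eye[3]) {
    eye[0] = target[0] + dist * cosf(pitch) * sinf(yaw);
    eye[1] = target[1] + dist * sinf(pitch);
    eye[2] = target[2] + dist * cosf(pitch) * cosf(yaw);
}
```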
-
Nope, you're right. I just haven't been keeping up to date on what's been going on with Unity. I actually already have 5 installed; I just haven't been able to tinker with it because I've had some problems with my laptop overheating lately.
-
The biggest issues with using Unity are the inability to use post-processing framebuffer effects without buying the Pro version, and dealing with version control. Unreal Engine would be an option, but it runs stupidly slow on my PC. I've also looked into other engines, like Ogre3d (which I gave up trying to compile after a couple of days) and Irrlicht (which doesn't do skeletal animation or shaders very well). Graphics and audio actually aren't that hard; there was a seminar I saw once where a game designer showed how it's possible to create a simple 2D game in just 8 hours. The most difficult things to deal with are physics (e.g. collision and rigid bodies) and the creation of artwork for the game.