September 24, 2016 at 1:53 am (CST)
I finally got around to releasing version 4.2 of Noesis. It features a new architecture/platform-agnostic disassembler and debugger, an assortment of new rendering features (including a PBR shading model), and a lot of other random fixes and additions.
I've been sitting on this build for over half a year, so it's worth going over some of the stuff added in this and other semi-recent builds. The things I throw into Noesis don't tend to get any exposure or explanation when they're added, because most of it is done on a whim or to meet a need for something else I happen to be working on. So let's have a little fun. I think you're really going to like what I have in store for you today, as we beat the physically-based devil out of Noesis together.
Referring to any of the BRDFs that game developers have collectively settled on over the last few years as "physically-based" is a lot like calling a McDonald's hamburger meat-based. In some sense, you're sort of correct, but you're really just trying to make yourself feel better about all the feces you're ingesting with that burger. Such is the brave new world of PBR in videogames. It's the same old shit, with a little less antibiotic filler.
That said, the fact that we now have a standardized way of working a Fresnel term into our BRDF, and we generally no longer have to deal with every developer handling specular distribution in a slightly different way with some manner of hacked-in exponent, makes things easier for tools like Noesis.
I initially implemented a BRDF based on the same Disney paper that everyone else in the industry has based their implementations on. The default Noesis BRDF sticks with GGX/GTR (still defaulting the exponent to 2) for the distribution function, GGX/Smith for the geometric shadowing function (by default, roughness remapping is not performed here, as most games don't seem to be retaining this concept), and the Schlick approximation for the Fresnel function. The diffuse model is still Lambert, however, whereas there looks to be a healthy mix of Lambert and the Disney diffuse model in the current crop of PBR games. (Lambert is cheaper, although arguably on current consoles this can matter much less with careful dependency ordering - ALU cycles are often hidden by being horribly memory-bound in lighting passes with deferred renderers.)
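For reference, the standard textbook forms of the terms named above look like this. This is a plain-Python sketch of the math, not the actual Noesis shader code, and the parameter conventions (perceptual roughness squared as alpha, scalar F0) are my assumptions:

```python
import math

def d_ggx(n_dot_h, roughness):
    # GTR distribution with the exponent at 2, i.e. plain GGX/Trowbridge-Reitz
    a2 = (roughness * roughness) ** 2
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

def g_smith_ggx(n_dot_v, n_dot_l, roughness):
    # Smith geometric shadowing for GGX, with no roughness remapping
    # (matching the Noesis default described above)
    a2 = (roughness * roughness) ** 2
    def g1(n_dot_x):
        return 2.0 * n_dot_x / (n_dot_x + math.sqrt(a2 + (1.0 - a2) * n_dot_x * n_dot_x))
    return g1(n_dot_v) * g1(n_dot_l)

def f_schlick(v_dot_h, f0):
    # Schlick's approximation to the Fresnel term
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

def specular_brdf(n_dot_l, n_dot_v, n_dot_h, v_dot_h, roughness, f0):
    # Standard Cook-Torrance assembly of the three terms
    d = d_ggx(n_dot_h, roughness)
    g = g_smith_ggx(n_dot_v, n_dot_l, roughness)
    f = f_schlick(v_dot_h, f0)
    return (d * g * f) / (4.0 * n_dot_l * n_dot_v)
```

The Lambert diffuse term mentioned alongside these is just albedo divided by pi.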
For specular anisotropy, I wanted to maintain some semblance of parity with IBL, and importance sampling at runtime isn't especially viable, so anisotropy is implemented universally by shifting the specular halfangle along the geometric bitangent. Although I felt kind of filthy taking this approach, it turns out that Far Cry 4 is doing something very similar, so hopefully it's a good enough fit for other games that do implement some form of anisotropic distribution.
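The half-angle shift can be sketched roughly like this. This is only a guess at the general shape of the trick; the scaling and sign conventions for the anisotropy amount are my assumptions, not the actual Noesis (or Far Cry 4) implementation:

```python
import math

def shift_half_vector(h, bitangent, aniso):
    # Push the specular half-angle vector along the geometric bitangent,
    # then renormalize. aniso = 0 leaves the half vector untouched.
    shifted = [hc + aniso * bc for hc, bc in zip(h, bitangent)]
    length = math.sqrt(sum(c * c for c in shifted))
    return [c / length for c in shifted]
```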
Although everyone is still doing something ever so slightly different, because graphics programmers like to feel special, you should be able to fit most PBR content you come across to the default Noesis model, or at least beat it into close submission by tweaking material parameters and BRDF flags. Not many games are implemented using PBR shading models in Noesis yet, but current examples include Carmageddon: Reincarnation (above), and Fallout 4 (below), although FO4 has been hacked in for testing retroactively, and is only enabled when specifically invoked.
It should also be noted that HDR, bloom, etc. are disabled in Noesis by default, for the sake of maintaining compatibility and speed for all those people still running on Intel integrated graphics chips from 2001. While the PBR path is enabled by default, BRDF output will be clamped and output directly to gamma space, and it will be difficult to get a real indication of whether you're seeing correct lighting values this way. To enable HDR, go to "Tools->Data Viewer" in the Noesis menu, and "Persistent settings->View" in the data viewer. You'll find lots of rendering options at the top here, including options for rendering and globally overriding environment maps. Noesis also has a new pixel picker for the HDR framebuffer, which can help you verify that you're treating data correctly and have everything in the correct color space.
It's also worth noting that by default, Noesis attempts to look for pre-existing diffuse irradiance and pre-convolved GGX specular cubemaps for a given environment map when it's referenced by a PBR material. If Noesis can't find any existing textures, it'll calculate new ones, which can be time-consuming. Once the textures have been calculated, you can use the "Export from Preview" option to spit the cubemaps out so that you don't need to wait on anything else referencing the same environment map.
The idea here was to create a protocol for debugging, disassembly, memory viewing/editing, etc. which could place as much burden on the host or the client as desired. The Noesis Universal Debugger is the client that sits on top of this new protocol. The client itself is still pretty new and barebones, and currently looks like this:
The debugging protocol itself is designed to support any architecture with any addressing mode and any number of physical or logical cores, with the possibility of debugging multiple different architectures in the same instance. At the same time, the design allows either the host or the client to implement any number of key debugging features, such as actually generating disassembly or providing a callstack. The host/target implementation is very black-boxed and should drop easily into just about any codebase with very few external hooks. I've been using it on a PSX emulator to good effect.
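The protocol itself hasn't been released, but the per-core target description implied above might look something like this purely hypothetical sketch. Every field name here is my invention, for illustration only:

```python
from dataclasses import dataclass

@dataclass
class TargetDescription:
    # Hypothetical illustration: one record per debuggable core, so a
    # single session can mix architectures (e.g. a main CPU plus a
    # custom coprocessor), each with its own addressing mode.
    arch_name: str              # e.g. "mips-r3000"
    address_bits: int           # addressing mode is per-core, not global
    core_index: int
    host_disassembles: bool     # the host *or* the client may own disasm
    host_provides_callstack: bool
```

The point of a record like this is that the burden of each feature can land on whichever side is better equipped to carry it.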
This thing ended up being a lot more time-consuming than I'd hoped, and I ran out of steam about halfway through. It's the main reason I hadn't made a build of Noesis for over half a year, being perpetually trapped between "I don't feel like working on this shit anymore" and "I want to finish that before I make a new build". Now that it's up and running reasonably, though, I'll be continuing to chip away at it. As it evolves, I'll be adding support for it to more programs (particularly emulators), and if it ever reaches a point of suitable maturity, I'll share the protocol and associated code in case anyone else wants to get on board with it.
The motivation for creating this thing came from seeing that most emulators with built-in debuggers have really terrible debuggers, and external support for IDA Pro is quite rare, probably because IDA Pro is costly and IDA Free is crippled in critical ways. Adding support for a new architecture/system in IDA can also be very cumbersome, whereas the Noesis Universal Debugger protocol has been designed so that you can take support for a core architecture and very easily extend it for the particular quirks of a system (such as, for example, custom opcodes or coprocessors). This makes it trivial to add support on the emulator side, without even having to touch the client, and it keeps the door open for the client to be implemented entirely inside something like IDA while retaining all of the niceties of the protocol's flexibility.
Given the amount of free time I have lately, furthering this seems like something of a pipe dream, but I guess we'll see. It's useful for reverse engineering PSX games in the meantime.
It's been a while now, but I wrote a DICOM loader quite a few releases ago. If you don't know what DICOM is, it's a truly awful medical transfer and imaging standard, with a somewhat architecturally flawed core that's been bloated in unspeakable ways for the last 20 years. The specification is thousands of pages long, with a good many hundreds of those pages touching on information necessary to correctly support the format. The Noesis DICOM loader is probably one of the more complete and robust DICOM loaders in existence, and it supports a huge variety of transfer syntaxes, at great cost to my personal sanity. Noesis can load data as image slices, or visualize it in 3D using a number of techniques.
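As a small taste of the format: a DICOM Part 10 file starts with a 128-byte preamble followed by the magic bytes "DICM", which is the first check any loader performs before wading into the thousands of pages of the rest. A minimal sketch (not the Noesis loader itself):

```python
def looks_like_dicom(data: bytes) -> bool:
    # DICOM Part 10 files begin with a 128-byte preamble (often all
    # zeroes, but its contents are unspecified) followed by "DICM".
    return len(data) >= 132 and data[128:132] == b"DICM"
```

Everything after that prefix is a stream of tagged data elements whose encoding depends on the negotiated transfer syntax, which is where the real pain begins.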
It so happens that I've been dying over the last few years (I lost my gallbladder last year, and I've been subjected to dozens of tests as doctors attempt to figure out why I'm dying), and when I needed a brain scan, I was quick to ask for the results on a disc. They were, predictably, in DICOM format, and my head is now included with Noesis. This is kind of what I'd look like if you cut my face off, but without the blood and gurgling:
I don't think my head is really that fat. The dimensions are derived directly from image space without a proper scale and bias. I'm not fat, I just have a non-relative Z axis. Anyway, the above was generated using a .noevmarch file which is included in the Noesis distribution. I'll touch on that a bit more in a second. You can pull the iso-level back to get an idea of what it would look like if you cut my face off and then threw acid all over the remains of my head, too:
If you find yourself dying as well, and the doctors want to take pictures of your insides, don't forget to ask for a copy of your scan data. You haven't lived until you've used Noesis to fly through your own innards. The data they'll hand you these days is almost always going to be DICOM, but Noesis also supports the Analyze 7.5 and NIfTI-1 formats commonly used in medical imaging.
I'm a fan of voxels. Not as big of one as Ken Silverman, but who is? I added support for the KVX format used in the Build engine games (like Shadow Warrior) quite a few releases ago.
As with DICOM and any other volume data, KVX files can be loaded using a .noevmarch file. Initially, this was a way of setting data up with marching cubes, but it has come to support other methods for meshing scalar fields as well, with a variety of options. The armor pictured above from Shadow Warrior, for example:
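To give an idea of what a mesher like marching cubes does with this kind of volume data: it only emits triangles for cells whose corner samples straddle the iso-level. A sketch of that first step, illustrative only and not Noesis's implementation:

```python
def count_active_cells(field, dims, iso):
    # field is a flat list of scalar samples in x-fastest order.
    # A cell "crosses" the iso-surface when its 8 corner samples are
    # not all above or all below the iso-level; only those cells get
    # triangles in a marching cubes pass.
    nx, ny, nz = dims
    idx = lambda x, y, z: (z * ny + y) * nx + x
    active = 0
    for z in range(nz - 1):
        for y in range(ny - 1):
            for x in range(nx - 1):
                corners = [field[idx(x + dx, y + dy, z + dz)]
                           for dz in (0, 1) for dy in (0, 1) for dx in (0, 1)]
                above = sum(1 for c in corners if c >= iso)
                if 0 < above < 8:
                    active += 1
    return active
```

Pulling the iso-level up or down, as with the face/acid shots above, changes which cells cross the surface and therefore what gets meshed.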
Noesis also features a voxelizer, for going the other way around. It's been refined a fair amount, and designed to work on any model you can throw at it, including concave geometry with holes. Keeping in theme, here's Duke from Duke Nukem Forever, pre and post voxelization:
The voxelizer samples texture and vertex colors from the model for each voxel. If you're a hipster indie dev, you can probably make a whole game out of this feature.
Noesis features a secret software renderer. It's very flexible, and allows you to specify pixel rendering callbacks to do all manner of custom rendering, with access to unclipped barycentric coordinates and some other useful per-triangle data.
Although I've written a test tools plugin that renders the active scene through the software renderer (pictured above), it's really only intended for development use. One plugin uses it for an amazing hack, where models are rendered into UV space, using a custom callback to determine the alpha coverage of each triangle, which is then used to determine what the default blend mode on that mesh should be. The renderer is multi-threaded, and will suck the life out of every core in a given machine pretty effectively, so it can be especially useful to spin up for all manner of one-off rendering tasks that you don't want to conflict with Noesis rendering. (although Noesis does expose a hardware rendering API which ensures compatibility across different Noesis renderers, it isn't something you want to be using on, say, the export thread) The API is included in a header in the native plugin SDK.
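The per-pixel callback idea can be illustrated with a minimal edge-function rasterizer. Here the "callback" just counts covered pixels, but it's computed with the same barycentric weights a real callback would receive; this is a sketch of the technique, not the actual Noesis software rendering API:

```python
def rasterize_coverage(tri, width, height):
    # Minimal single-triangle rasterizer over a width x height grid,
    # testing pixel centers with edge functions. tri is three (x, y)
    # vertices in counter-clockwise order, in pixel space.
    (x0, y0), (x1, y1), (x2, y2) = tri
    def edge(ax, ay, bx, by, px, py):
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)
    area = edge(x0, y0, x1, y1, x2, y2)
    covered = 0
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5
            w0 = edge(x1, y1, x2, y2, px, py)
            w1 = edge(x2, y2, x0, y0, px, py)
            w2 = edge(x0, y0, x1, y1, px, py)
            if area != 0 and all((w / area) >= 0.0 for w in (w0, w1, w2)):
                # (w0/area, w1/area, w2/area) are the barycentric
                # coordinates a per-pixel callback would be handed
                covered += 1
    return covered
```

The UV-space alpha-coverage hack mentioned above is essentially this loop run over every triangle in UV space, accumulating how many covered pixels are actually opaque.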
A lot of random game formats have been updated or added recently. A couple of significant undertakings were Final Fantasy XI and Ultima Online.
I know I saw a format specification for FFXI a long time ago. I thought nothing of it at the time, and I'm not sure how complete it was, but I certainly came up empty when I went looking this time around. I really only wanted to add this game to round out support for the mainline FF games, and ended up having to redo someone else's work in figuring out the model and animation formats. Thankfully, I was able to find a snippet for decrypting the maps and map objects, which saved me from that. I put the source code for the loader up on my site here, so no one else has to re-endure the same pain, although I expect interest is mostly dead at this point.
I also went through the trouble of figuring out the model and animation formats for the higher-detail character creation/selection models, which I think was a first. Somewhat bizarrely, they used entirely different model and animation formats for those things.
Ultima Online, on the other hand, has very well-documented formats. I was addicted to UO for several years when it first came out. Then the whole MMO thing wore off, and I've had some level of hatred for them ever since for what they threaten to do to my life if I let myself enjoy them in any way. But I kind of felt like I had to support UO in Noesis, after all these years, because it's one of those special games that stole a big chunk of my life. The interesting thing about the UO support is that I decided to do it all in Python. This amounted to making a UO renderer mostly via BLT calls for the MULTI objects:
These things are handled like standard image importers, but there's a lot more going on under the hood than the usual image importer, as you can see in the script. I added support for all of the main asset types for the classic client, which works from the day 1 CD image data all the way up to the current official classic client distribution available for download. As with all animated image formats, this means you can export them to animated GIFs with the click of a button. Internet time!
I'll probably go back to chipping away at more formats now that some of these big-ticket items are finished up, in my rare bits of free time.
Until next time, my friends, or until my internal organs finish failing.
11 comments in total.
Post a comment
January 26, 2017 at 5:39 pm (CST)
Wow, this is amazing. I definitely will be using the voxelizer and the debugger :)
January 8, 2017 at 5:35 pm (CST)
Incredible. I'm in love with voxelization.
Thank you very much.
November 4, 2016 at 3:15 pm (CST)
Outstanding update. Above and beyond all expectations. Many thanks.
November 1, 2016 at 9:02 am (CST)
Stupid question, but I was wondering if Noesis will get support for Scarface's Pure3D format?
October 17, 2016 at 3:59 am (CST)
Knocked it out of the park! Every time I come here I'm amazed at what a dedicated programmer can do. The FFXI implementation is a little wonky but really appreciated. You're also the first person I'm aware of to work through the character selection model format. Thanks again, sir. Made my year for sure.
October 2, 2016 at 10:24 pm (CST)
Amazing work, thank you!!!
September 30, 2016 at 5:06 am (CST)
Thanks for your answer. I finally managed to make something that looks good by transforming points into triangles and then linking them to make a mesh. It's not as proper as I'd hoped, but it works.
September 29, 2016 at 9:40 pm (CST)
There are a few options there, from insane to relatively easy:
1) Write a visualizer plugin. You can stuff whatever data you want into your model under a "userdata" tag, like lines, points, or any kind of primitive you want to do custom rasterization on, then look for that tag in your visualizer in the rendering callback. Here you have the option of using the hardware rendering (NGL) API to draw lines/points in the scene, or you can use the software renderer and composite if you want to get extra insane with some form of custom rasterization. NGL hardware lines would probably suffice for your purpose.
2) Just add the line as triangles. There are various functions for meshing splines as well that you could incorporate here. There's an example script that shows how to generate geometry from 2D font splines. It was up on the Google code repo, which should still be around in archival mode. Unless they've started killing those already.
3) Add the line as 2D triangles, and apply a material expression (setExpr_vpos_x/y/z) in order to do your own transform on the line to keep it oriented to the view. (this won't necessarily give you good results when exporting to other formats, though - keeping in mind many model formats don't support the concept of "line")
4) Add the lines as detached joints in the scene hierarchy, if you want them to be visible in the skeleton/transform view.
I'd probably recommend 2 if you want something that's going to fit well enough to all of your potential model export targets.
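For anyone landing here later, option 2 boils down to expanding each segment into a thin quad. A 2D sketch of the idea (a hypothetical helper, not part of the Noesis API; for a 3D model you'd offset along a chosen normal instead):

```python
import math

def line_to_triangles(p0, p1, thickness=0.01):
    # Expand a 2D segment into a thin quad (two triangles) by offsetting
    # both endpoints half the thickness along the segment's perpendicular.
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    nx = -dy / length * thickness * 0.5
    ny = dx / length * thickness * 0.5
    a = (p0[0] + nx, p0[1] + ny)
    b = (p0[0] - nx, p0[1] - ny)
    c = (p1[0] - nx, p1[1] - ny)
    d = (p1[0] + nx, p1[1] + ny)
    return [(a, b, c), (a, c, d)]
```

The resulting triangles export cleanly to formats that have no concept of a "line" primitive, which is the main reason to prefer this route.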
PS- Thanks Beaves.
September 29, 2016 at 12:26 am (CST)
Many thanks for your awesome work!
I recently discovered Noesis and started writing some Python plugins for it.
I would like to know if it's possible to draw a line between two vertices (like wireframes) in a Python plugin in a proper way.
September 29, 2016 at 10:53 am (CST)
Really great post for such an incredible amount of work! I think you really are one of those 10x programmers.