No Man’s Sky is a space exploration game that uses procedural generation to create its environments and assets (textures, models, terrain, etc.). I had been very excited ever since the game was announced back in 2013, not only for the game itself, but mostly for the chance to start researching the game files and find out how the game works. The game turned out to be one of the most controversial games ever released, but its inner workings are still very interesting.

Anyone who installs the game can easily notice that it is really small for its scale. The main reason is that the game works on a very small set of assets and uses procedural generation to create literally hundreds of variants from them. I’m going to concentrate on the content related to the game’s 3D models, because this is where I am (always) interested. I’m also going to separate the article into 3 main categories: Geometry, Textures and Animation.


So, within the game files, pure geometry data (vertex and index buffers etc.) is stored in files with the extension “.GEOMETRY.MBIN”. With just those files one can write very simple parsers that load that geometry into modeling software. BUT this file is not enough by itself: it actually acts as a container for the pure geometry data.

The game loads assets in a scenegraph fashion. This means that all model assets are defined as separate scene files with custom object hierarchies, multiple mesh parts, objects of multiple types (joints, lights, interacts, collisions, other scene files) etc. That kind of information is stored in “.SCENE.MBIN” files. These are the actual descriptors of a particular scene, and usually they reference one geometry container, from which all the scene’s mesh parts get their respective geometry information.
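To make the scenegraph idea concrete, here is a toy Python sketch of how such a scene tree might be represented. The field names and attribute keys are my own invention for illustration, not the actual SCENE.MBIN schema.

```python
# Hypothetical sketch of the scene-graph layout described above.
# Names like BATCHSTART/BATCHCOUNT are illustrative, not the real schema.
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    name: str                      # e.g. "_HEAD_01", "JointNeck"
    node_type: str                 # "MESH", "JOINT", "LIGHT", "REFERENCE", ...
    attributes: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

# A scene usually points at one geometry container; each mesh part
# indexes into it via offsets/counts stored as attributes.
scene = SceneNode("TRICERATOPS", "MODEL",
                  {"GEOMETRY": "TRICERATOPS.GEOMETRY.MBIN"})
head = SceneNode("_HEAD_01", "MESH",
                 {"BATCHSTART": 0, "BATCHCOUNT": 1200})
scene.children.append(head)

def count_meshes(node):
    """Recursively count mesh parts in the scene tree."""
    n = 1 if node.node_type == "MESH" else 0
    return n + sum(count_meshes(c) for c in node.children)

print(count_meshes(scene))  # 1
```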

So far there is nothing new; this is pretty common stuff for a ton of games out there. What is different about No Man’s Sky (at least I have never encountered it before) is that a scene file does not contain just one complete, hand-crafted creature that can spawn in the game under certain circumstances, and this is where procedural generation kicks in.

Just to give a short idea of how this looks, I’ll attach some pictures of the NMS Model Viewer I created.

Triceratops model


As you can see, at first look the scene is a total mess. You can’t even tell what you’re seeing, and you’re obviously not seeing something that could spawn in the game as-is.

The key to this story is the actual names of the objects. One can clearly notice that there is some relation between them, and that there should be a way for the game to know how to combine some of those parts, and how to exclude others, when creating a model.

So with that in mind, I started searching for other files that might handle that kind of organization of the scene, and it seems that this is what the “.DESCRIPTOR.MBIN” files do. Not all models have such files. It turns out that only procedurally generated models have them, while static models (like the astronaut model or some trailer-crafted material) don’t.

When I first started reversing the NMS file formats, I began by parsing those descriptor files, but at the time they looked like nonsense to me and I could not figure out what they did. So after parsing the scenes, and actually knowing what I was searching for, I turned back to the descriptor files. These files look very similar to the scene files. They have the same kind of part definitions and so on, BUT they can only reference either mesh parts or whole scene files. The way those files work goes like this:


The procedure works like this: there is a main part whose identity has to be decided, and that is the _HEAD_ part in this example. Parts whose names start with an underscore belong to a descriptor group, and only one of them will be selected for the final model. As you can see, this part is defined under a TkResourceDescriptorList. Those elements contain a “Descriptors” property whose children are all the candidates for the selection. All you have to do next is select one of the “Descriptors” property’s children; this is how, for example, the head model is selected. Now, for that specific head model, there is another “Descriptors” property with its own list of available options, and again you have to select one of them. And so on.

After doing that for all the items in the DESCRIPTOR.MBIN file, what you have at the end is a part selection that leads to a unique, complete model.
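The traversal can be sketched in a few lines of Python. The dict layout below only loosely mirrors the TkResourceDescriptorList / “Descriptors” structure, and the equal-probability choice is my own stand-in for whatever weighting the engine really uses:

```python
# Sketch of the descriptor traversal: pick one candidate per group,
# then recurse into the winner's own groups. Equal probabilities are
# an assumption (the files' "Chance" values are all 0).
import random

def select_parts(descriptor_list, rng=random):
    """Walk a list of descriptor groups and return the part names
    that make up one unique model."""
    chosen = []
    for group in descriptor_list:                  # e.g. the "_HEAD_" group
        candidate = rng.choice(group["Descriptors"])
        chosen.append(candidate["Id"])
        chosen += select_parts(candidate.get("Children", []), rng)
    return chosen

# Toy data loosely shaped like a descriptor file.
descriptors = [{
    "TypeId": "_HEAD_",
    "Descriptors": [
        {"Id": "_HEAD_01", "Children": [{
            "TypeId": "_HORNS_",
            "Descriptors": [{"Id": "_HORNS_BIG"}, {"Id": "_HORNS_SMALL"}],
        }]},
        {"Id": "_HEAD_02"},
    ],
}]

print(select_parts(descriptors, random.Random(42)))
```

Running it repeatedly with different seeds yields different branches, which is exactly the “unique branch of the tree” idea described below.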


Fighter Ship – Exported from No Mans Model Viewer


The whole procedure is actually a tree traversal. The root of the tree is that full-of-parts scene and what you want at the end is a unique branch of the tree which represents a solid model with just the necessary parts.

The key to this procedure is how the part selection is made. If you look closely at the XML code above, every part has a “Chance” property, but it is set to “0” in pretty much all of them. I guess that the actual probabilities for the part selection are either decided at runtime by the engine, or set in other game parameter files. In my model viewer, I’ve randomized the selection: all parts have equal probability of being selected, which leads to pretty diverse models.

A random generation of creatures based on the triceratops model


Fighter Ships – ProcGen (those cubes are just decals that have not been applied)


Dropships – ProcGen




Diplodocus model from the E3 trailer



Personally, I’ve played the game for about 70 hours, and in all that time I NEVER encountered a creature like the diplodocus one. This means that either the engine is faulty and those parts are never selected (which I doubt), or their selection chance is so small that they end up super rare in the game. A lot of discussion (and mostly hatred) has gone into missing content and content that appears only in gameplay trailers and the like. I can’t speak about general game functionality or gameplay features, but from examining nearly all the creature models in the game files, I can say that there is TONS of content which, due to engine decisions(?), doesn’t appear very often (or at all) in the game. If you ask me, the procedurally generated diplodocus models are 10 times better than the static ones, and if they wanted, they could easily instruct their engine to load the static models (and of course all the trailer content) whenever they wanted. So, good or bad, this is probably a design decision.



This is how the main part of the procedural model generation works. It is a very elegant and clever procedure, because it is very easy for artists to add new content to the procedural generation. In fact, for every new part they add, the total number of combinations grows multiplicatively (if the part is available in all tree paths). From what I know, they have 2 or 3 artists working on the models. The mindblowing thing about this generation procedure is that if they had double the number of people working EXCLUSIVELY on that part, the game content (just for the creatures) would be hundreds of times larger. And this fact alone shows me the capabilities and the potential of the NMS game engine.
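The arithmetic behind that claim is easy to check: with independent descriptor groups, the number of distinct models is the product of the candidate counts per group, so adding one candidate to every group multiplies the total. A quick Python check (the group sizes are hypothetical):

```python
# Total distinct models = product of candidate counts per descriptor
# group, assuming groups are selected independently.
from math import prod

group_sizes = [4, 3, 5, 2, 6]          # hypothetical candidate counts
total = prod(group_sizes)
print(total)                            # 720

# One extra candidate in every group:
total_bigger = prod(n + 1 for n in group_sizes)
print(total_bigger)                     # 5 * 4 * 6 * 3 * 7 = 2520
```

So a modest amount of extra art multiplies the combination count, which is why doubling the art team would pay off so disproportionately.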


Textures are another quite complicated story regarding NMS models. As I mentioned in the Geometry section, all mesh parts are defined in SCENE.MBIN files. Entries in those files look something like this:


As you can see, there are several attributes which I’m not going to describe right now, but I’ll talk a bit about the last one, which defines the material that the mesh uses. As you can see, this node references a material file. If we now take a look at that material file, it looks like this:

The important part of the material files is the “Samplers” section: it’s obvious that this section defines the textures to be used on the model part. Here is the interesting bit. In static models, all referenced textures are nice, proper textures which can be used directly on the models, no harm at all. BUT when the mesh is used in a procedurally generated model, only the normal map is a proper texture. The diffuse texture, which contains all the color information of the part, is an empty white texture.

At first I thought that the colors and textures were, again, decided at runtime, but that’s not the case. Those texture files are always accompanied by “.TEXTURE.MBIN” files which, guess what, act exactly like the model descriptor files: they define a way to combine textures in order to compose the final diffuse texture of the model. Besides designing different model parts, the game artists have also provided multiple different textures for each part. So, traversing that file in the same fashion as the descriptor file, one can compute the final diffuse texture of the procedurally generated model. And here is the even better part: even if two models are identical geometry-wise, they can end up with completely different colors, marks, shapes etc. thanks to those procedurally generated textures.

Sample markings texture

The texture details can differ quite a lot during procedural generation. It looks like the textures are built up in layers, so I’ll give an example of how a creature’s procedurally generated texture is assembled. Usually, at the bottom there is a base texture which provides basic color and shading for the model (such texture names end in .BASE.DDS). On the next layer there is an underbelly texture (.UNDERBELLY.DDS) which gives more detail to the belly of a creature. Then there is another layer which adds some more detail to random parts of the model (.UNDERLAYER.X.DDS). Next are the markings (.MARKINGS.X.DDS), which obviously define the most noticeable skin details of the model. On the next layer there are again some skin details which have to sit on top of the markings (.SKIN.DDS), and finally, on the last layer, there is another texture (TOP.X.DDS) which gives some more detail to specific parts of the model.

There seems to be a maximum of 8 layers allowed in the procedural textures (usually just 5 or 6 are used). Obviously, there are a lot of textures that need to be blended together. This is why all textures are also accompanied by an appropriate mask texture containing proper alpha information, so that the blending can be as accurate as possible. Also, most of the time, textures are accompanied by the appropriate normal map, which handles the detail of each part.
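The layer stack described above boils down to repeated alpha “over” compositing, bottom to top, using each mask as the layer’s alpha. A minimal per-pixel sketch (colors as RGB tuples in [0, 1]; the sample values are made up):

```python
# Bottom-to-top "over" compositing of texture layers, one pixel at a time.
def blend_over(bottom, top, alpha):
    """Standard 'over' blend: top over bottom with the mask's alpha."""
    return tuple(t * alpha + b * (1.0 - alpha) for b, t in zip(bottom, top))

def compose(layers):
    """layers: list of (rgb, mask_alpha) pairs, bottom layer first."""
    out = layers[0][0]
    for rgb, alpha in layers[1:]:
        out = blend_over(out, rgb, alpha)
    return out

base     = ((0.5, 0.4, 0.3), 1.0)   # .BASE.DDS (fully opaque bottom)
markings = ((0.1, 0.1, 0.1), 0.5)   # .MARKINGS.X.DDS with a 50% mask
print(compose([base, markings]))     # ≈ (0.3, 0.25, 0.2)
```

A real implementation would run this over whole DDS images, but the per-pixel math is the same.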


If this texture mess was not enough, even with all that blending going on, the final texture is still not properly colored. This is yet another genius technical trick from the developers. What they wanted was for the creatures to have colors similar to the environment, and the way they implement that in the game is by using palettes. I’m not 100% sure how this works in the game, but I’m going to describe the way I think it works, and (because I’ve already applied it in my viewer) it should be pretty accurate.







So, during planet creation (or system creation), specific colors are selected which are going to be used for the whole planet’s population. I’m talking about color selection because the game files contain some very specific 8×8 color palettes (under PCBANKS/TEXTURES/PALETTES). Those palettes probably come in different groupings: the 64 colors contained in a palette are usually grouped into groups of 4 or 8 colors. So when the game sets up a planet environment, it makes a selection from those groups to be used later on. The groups can be easily identified when looking at the palettes, because each one actually forms a gradient between 2 boundary colors.


The actual indexing within the selected group is done with information contained in the .TEXTURE.MBIN file. A texture entry in those files looks like this:

Again, I won’t talk about what the rest of the options do. What we care about in this entry is the “Palette” property. From this property’s information we can find out which color we should select for our part. In this case we see that we need to index the “Fur” palette (which comes in groups of 4, as seen above), and from the selected group we need the “Alternative1” color. This is how colors in palettes are indexed. The “ColourAlt” property values can be “Primary”, “Alternative1”, “Alternative2”, “Alternative3” etc. This means that Primary is the first color of the group, Alternative1 the second, and so on. Again, I’m not 100% sure about that, but this is the way I’m using it, and it makes sense given the nature of the color groups in the palettes.

So by now we have one base color for a specific texture; what do we do with it? Simple: multiply it with the texture color. The default game textures usually come in a bluish, neutral-looking color; after the multiplication with the palette color, the texture gets a proper vivid color.

Again this is done for every single texture that is to be applied to the model. So if there are 8 of them, we fetch them, we combine them with proper palette colors, we blend them bottom to top and we apply them to the model.
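Putting the palette lookup and the multiply together, here is a hedged Python sketch. The group size of 4, the “ColourAlt” offsets and the toy grey palette are my assumptions based on the files, not a confirmed spec:

```python
# Palette lookup + tint, as I understand it from the .TEXTURE.MBIN files.
# Offsets within a group follow the "ColourAlt" naming.
ALT_OFFSET = {"Primary": 0, "Alternative1": 1,
              "Alternative2": 2, "Alternative3": 3}

def palette_colour(palette, group_index, colour_alt, group_size=4):
    """palette: flat list of 64 RGB tuples, read row by row from the
    8x8 palette image; groups are consecutive runs of group_size."""
    return palette[group_index * group_size + ALT_OFFSET[colour_alt]]

def tint(texel, colour):
    """Multiply the neutral (bluish) texel by the palette colour."""
    return tuple(t * c for t, c in zip(texel, colour))

# Toy palette: 64 grey levels standing in for the real gradient groups.
palette = [(i / 63.0,) * 3 for i in range(64)]
col = palette_colour(palette, group_index=2, colour_alt="Alternative1")
print(tint((0.6, 0.7, 1.0), col))
```

In my understanding, this lookup-then-multiply step runs once per layer before the layers are blended, which is what ties every creature’s coloring to its planet’s palette selection.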

With all those texture combinations, diffuse textures can REALLY vary and make models with the same geometry have completely different skins. This procedural texture generation technique, combined with the procedural model generation, is enough to give generated models a unique look. Again, as I wrote above about the procedural model generation, if dedicated artists were working exclusively on adding more layer variations and more detailed color palettes, the outcome would be even richer and more detailed.



Triceratops rig skeletal animation



Astronaut walking cycle


Spider rig procedurally generated model, slow walking animation


To be honest, I haven’t researched animations as extensively as geometry and textures. All I did (it was not as easy as it sounds though :P) was parse them and successfully play them back in my model viewer. Still, there are many parts of this category that are very interesting and unique.

First of all, model skeletons are defined in the SCENE.MBIN files. They are actually hierarchical joint systems to which the model parts attach through vertex skinning. Nothing unusual here. The interesting part is that, as I mentioned in the Geometry section, there are multiple mesh parts in a SCENE.MBIN file, so in order to handle the animation and movement of all those parts, the defined joint system covers ALL scene parts.

Animations work exactly the same way: they animate the full skeleton, even though the final generated model may not use even half of the joints. At first look this seems like a waste of resources. In fact, the only overhead is the parsing of the full animation file; at runtime, the SCENE and GEOMETRY MBIN files provide enough information to upload to the GPU just the data needed for each particular model part.

Playing the game, I know that there is a final procedural step going on that has to do with the joint skeleton. Somehow the engine can modify the skeleton: lower the model’s center of gravity, make legs longer or shorter, modify head sizes and so on. I haven’t researched this enough to find out how it is controlled, but I know it is happening, and it adds a completely new dimension to creature procedural generation, because it is actually a way of modifying the final shape of the model. In fact, the final shape can be modified to the extent that it no longer looks very similar to the initial base model. Also, the IK calculations required to apply the old animations to the new joint skeleton lead to modified animations which, depending on how heavy the modification is, can make creatures look completely different.



I tried to explain how the game works, the way I have understood it after working on the game files for over 3 weeks. I concentrated on creature generation, but the same principles hold for the rest of the content (ships, NPCs, buildings, plants). When I started working on the files, everyone was excited about the game; everything looked new and different. But as the weeks passed, everything started looking the same. So the question is: is procedural generation worth it in NMS?

There is no straight answer to this question.

As a software developer and a video game reverse engineer, I’d say it’s TOTALLY worth it. Technically, I have never seen game mechanics similar to NMS’s, and I doubt I’ll ever see them in any upcoming game that uses procedural generation tech, simply because no one else will try to create fantasy worlds with that much randomness in them. From a technical perspective, No Man’s Sky is a real GEM, and everyone who tries to deny that is simply lying to himself, or has no idea (or doesn’t care at all) how games work. Its game engine has so much potential, and I just can’t stop thinking about the game we would have if this engine were in the hands of a much larger game studio. Even with those initial asset limitations, I’m still satisfied with the creature variations we can get ingame (though I do believe that with some proper tuning we could get even more out of the existing assets).

As a gamer, though, my feelings are mixed. Even if this is not my general style of playing, I knew from the moment I preordered No Man’s Sky that this is a game where I can just chill out and inspect the environment, the plants, the creatures. At first glance they may seem like things you’ve seen before, but on a second look most of them will be different. Maybe just the textures are different, but they are different; maybe it’s a tiny horn on a creature’s head, maybe different creature markings, maybe a small ship accessory. The content IS there (even the gameplay content, which they could spawn if they wanted); you can’t say it isn’t, and what you get is the result of a very, very good procedural generation procedure. In fact, the content that NMS’s engine creates for a system with 2-3 planets exceeds BY FAR the assets you may see in ARK, for example. And of course nothing will be 100% polished like all the dinos in ARK, but that’s the beauty of it. It’s an engine that can create the most gorgeous and majestic creatures, and at the same time the lamest creatures that ever existed in a video game. This is why I bought the game, and this is what I love about it. I live for that moment when, after exploring all the boring stuff, I suddenly land on the most beautiful planet I’ve ever seen. Again, I’m not comparing RPG or gameplay features; I’m talking about procedural content only. This doesn’t mean it can’t get any better. I believe it can, and I expect them to make it happen.

On the other hand, if someone is not determined to chill out, be patient and pay attention to detail, it’s totally not worth it, and all the procedurally generated content really is a waste of resources. It’s not an action FPS where you are supposed to be hunted all the time and rush every single second, and if the devs were clear about anything at all, it was this. Everything will look the same if someone plays like that. Even if the trailer content spawned in the game, and every second planet had those lush environments and those huge dinos and that sandworm, it would still get boring, it would still be the same stuff after the 3rd system, simply because you still wouldn’t be paying attention. You can’t blame the game or the developers for you not exploring in an exploration game. Also, procedural generation is not, and will never be, a means of creating sensible content out of nowhere. At least not in the near future. You don’t just plug in a math equation and create a new creature. You can do that with terrain, plants, rocks and maybe ships (with some really huge question marks around texturing), but with living, moving things like creatures and NPCs it gets so complicated that it’s practically impossible (and of course, if it were possible, they would have no reason not to do it; they already do it for everything else). So if someone expected to see a brand new alien every 10 feet, sorry, but this is an expectation issue and not a dev problem. If there is a way to get that type of content, it’s the way NMS’s engine does it.

After all that research, I know that the game has enough content to at least differentiate everything on each planet, so I can’t blame the content or the engine for not delivering; I have to blame the engine tuning and its configuration. I also have to blame those cursed multiplatform releases and the publisher. I’m 10000000000000000000% sure that the devs were rushed to release the game. The game we got is not even close to finished, and obviously not even close to reaching 80% of the capabilities of the underlying game engine. From inspecting the files, this is CRYSTAL CLEAR: it’s closer to a tech demo than a game. Trying to deliver the same stuff on PC and PS4 simply butchers the game, and trying to make it work on lower-end specs at as high framerates as possible probably butchers it even more. Personally, I’m expecting updates, and LOTS of them. I can forgive a lot of HG’s mistakes around the game’s release: overpricing, lack of communication, even the lack of features (like multiplayer, which honestly I don’t give a sh*t about). BUT what I can’t forgive is that, even considering the pretty messed-up and pressured pre-release situation, they didn’t at least deliver an overall ingame engine configuration. What modders are doing right now is diving into the files and finding ways of making that VERY SAME ENGINE create richer and more diverse content, and most of the time they succeed, simply because it IS capable of delivering far better stuff than it currently does. All those options should have been accessible to every single player, not discovered only by modders. Obviously they chose not to do it because they wanted all users to have the same universe, so that sharing waypoints, creatures and planets makes sense. But they should have included it anyway: force offline play and prove to all gamers what the engine is capable of.

For some reason, I’m convinced that HG is going to deliver sooner or later. You simply don’t abandon 4+ years of work on an engine which is, in fact, great. And for the HG-conspiracy fans: really guys, there are a thousand other ways they could have taken our money and run, and it would have happened a lot sooner.

Till the game engine blossoms…


Samuel Eto’o FIFA 17

So now that the compression obstacle is down, we can finally start working on the game models 😀
The bad news is that, due to the new engine, models (like all the rest of the game files) come in a new container format which will need a lot of testing and trial & error to parse properly. It’s more complicated than the rx3 format, probably because it supports more functionality.

The good news is that the pure geometry information in the FIFA 17 files is much better organized, and therefore easier to parse, than it was in the past. Right now it’s just vertices and triangles, so stay tuned for a new -update- post.


Literally the whole world plays Pokemon GO, so I couldn’t miss the chance to play around with the game’s files :P. So far I’ve been working with the Android version of the game.

The reason I started playing around was obviously to find the pokemon models. It turns out, though, that the game (at least the Android version) does not ship with the pokemon models. Since the game needs a continuous connection to the Niantic servers in order to function, the pokemon models are actually downloaded the moment you encounter them. This makes model dumping really difficult, to the point where it’s actually impossible right now.

Eventually, sometime in the future, someone is going to succeed in fetching the models, and then I will be able to work on them. For now, I was limited to the models available in the files of the apk. The most common model files are the player models, so I started working on those.

The game is made with Unity, and therefore all its assets are in unity bundle files. The models should be in some Unity-specific geometry container as well, but all the tools I found on the net failed to open the model assets, so I had to work on them manually. It was not that difficult, tbh. I didn’t automate the process because, like I said, there are not many models to import so far, which is why the process is a bit manual right now. But as I said, locating the vertex & index buffers is really easy.

The only “weird” thing I found in the model structure is that the UV buffers are completely separate from the vertex buffer. So, in order to get the UVs, you need to locate that buffer as well. It is located right after the vertex buffer, and it contains the information for all the UV maps of the model (just 2 in most cases).
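As an illustration, here is how reading such a layout could look in Python with the struct module. The strides, the float format and the 2 UV maps are assumptions for the example, not the actual Unity asset layout:

```python
# Sketch: a packed vertex buffer followed immediately by a separate
# UV buffer. Strides/formats are assumed for illustration.
import struct

def read_buffers(data, vert_count, uv_maps=2):
    verts, uvs = [], []
    off = 0
    for _ in range(vert_count):               # 3 floats per position
        verts.append(struct.unpack_from("<3f", data, off))
        off += 12
    for _ in range(vert_count * uv_maps):     # UV buffer starts right after
        uvs.append(struct.unpack_from("<2f", data, off))
        off += 8
    return verts, uvs

# One vertex at (1, 2, 3) with two UV pairs.
blob = (struct.pack("<3f", 1, 2, 3)
        + struct.pack("<2f", 0.5, 0.5)
        + struct.pack("<2f", 0.25, 0.75))
v, uv = read_buffers(blob, vert_count=1)
print(v, uv)   # [(1.0, 2.0, 3.0)] [(0.5, 0.5), (0.25, 0.75)]
```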

Textures, especially in the Android version of the game, usually come in 2 formats: either the PVR format (pretty much all model textures use this), or, if the texture is pretty big and takes too much space, a raw ASTC format (wiki).

Recently, I got an iPhone (*.ipa) version of the game. Once I find some more time, I’m going to check what files are available there and whether the textures are in a more common format.

For now I’m posting some screens of the male and female character models of Pokemon GO. I’ve packed those textured models and made them available for download in the download section.



Over the last couple of weeks, I spent some time playing ARK: Survival Evolved. It is a great survival game, featuring prehistory-inspired creatures which are simply STUNNING. The detail ingame is amazing, and in general the game’s graphics are simply EPIC.

The game was made in Unreal Engine 4, and afaik the company offers the full dev kit for modders to start creating and adding new stuff to the game. The game has amazing modding potential, and it’s up to the modders’ imagination to make it even more beautiful than it already is.

Enough with the info 😛

As I mentioned, the game was made in UE4. UE4 uses .uasset files: obviously a binary and of course undocumented file format which the engine uses to store literally everything, from textures, models and materials to shaders, icons and skeletons. The good thing is that through the dev kit one can access all this via the UE4 engine software. The bad thing is that it’s a 42 GB download + 4 GB for the engine; and of course this is the easy way, which nobody likes. There is also another tool called UE Viewer which unfortunately does not support any of the game files so far, although it is supposed to support UE4 stuff…

So, once again, what I did was quickly investigate the container format. It looks like it has a proper structure, but it is not that easy to determine precisely, because of the tons of different content an asset can contain. Fortunately, I quickly figured out the polygon and texture formats, and I am able to manually export textures and models out of those files (I didn’t even bother examining other asset files).

Most textures are 2048×2048, but for the large creatures the game uses 4096×4096 textures. It’s obvious that in quite crowded regions within the game, the VRAM is going to explode xD. You’ll need a really good GFX card in order to enjoy the game in full; otherwise, the 1024×1024 and 512×512 mipmaps will always work xD. The texture assets are quite consistent, providing all the possible mipmaps of each image. In terms of the file format, for some reason the low-res mipmaps are explicitly defined near the start of the file, and the highest 4 or 5 mipmaps are stored afterwards, as they normally would be in a DDS texture.

As for the models, there seem to be 4 different LODs for each model in the game. The model assets should contain mesh splits as well, but I wasn’t able to work out how that is done so far. Locating the vertex/UV/index buffers was quite easy, and this is how I managed to get the models into Blender.

So here are some renders I did of some of the creatures I extracted:






You may notice some seams in the pics above; that’s not a mistake or anything. Those areas probably overlap during animation, so they look normal ingame.

I can extract pretty much all of them, but for now I just grabbed the ones I thought would be cool 😛


PS: If someone downloads those 42 GBs, let me know, because I want to cross-check some stuff 😀




Blender Importer Script for NBA 2K15 is almost complete.

Model information such as subparts, UV maps, binormals and tangents, as well as blend indices and blend weights, has all been successfully parsed.
I also wrote some code to parse the model skeleton into Blender, which will eventually allow animating the model 😀

Some more pics about the model skeletons and the subparts:

The script will be released once the exporter is done as well.
Stay tuned for more news


For quite some time now, I have been working on the 2K15 archives. There is a whole new packaging and archiving scheme this year: IMMENSELY complicated, but still reversible. I am writing an archive explorer, which after A LOT of testing is quite capable, and it will be released in time. The reason for this post is to talk about NBA 2K15 textures. First of all, I think you may all have noticed that almost ALL developers have switched to DirectX 11. This means they use DDS textures with the new DX10 headers (although using the same known DXT/BC compressions), and for the time being there are no previewing tools for them.

Now, what 2K packs in the archives as textures are files like this:


These seem like normal DDS files; they are packed together with their appropriate DDS headers, but they are a bit scrambled. They need some kind of conversion, which I managed to figure out:


I am planning to add a texture viewer to the explorer; I’ll see how it goes.


Now some more textures i managed to extract from the game, just for showcase:


More news when its time 🙂



I took some time to examine the model bones again. Just to remind everyone: this is the reason the FIFA exporter is not capable of exporting custom meshes as head, hair or boot models.

I managed to find the bone head positions from the transformation matrices, but that does not seem to work with all the bones. This is still heavily WIP.

Despite that, it is nice to know how many bones control a FIFA 15 face (and they are more than 30).



FIFA 15 has introduced some changes to all model textures.

The most important one is that all textures have doubled in size.

Face textures are now 1024×1024, ball textures 1024×1024, and boot textures 512×512.

The second thing I noticed is that they introduced a new compression for the normal maps. From now on, normal maps will be compressed with BC5 compression. This is a fairly new type, probably optimized for normal maps and the latest versions of DirectX.