[Sumukh] Generally, it went like this:
For static objects:
For dynamic objects:
(David) For model/texture loading, Anthony implemented a model loader using Assimp, a model importer library. We also used Assimp for importing animation data, based on a LearnOpenGL article on implementing animation in OpenGL, and imported animations from Mixamo in the .fbx format. One catch with Mixamo animations is that it can be tricky to get a model uploaded with the correct textures. My workaround for this was as follows:
We used Blender exclusively for the 3D modeling of individual scenes and props, as well as for UV unwrapping. The actual materials were created with the excellent Substance Painter. Rather than attaching materials directly in Blender, we had a custom configuration-based system that let us write material data in .ini files, so we could change properties such as textures, tints, and brightness on the fly without having to reopen Blender. To get things into our game, we used Assimp for model loading and SOIL2 for material loading.
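The original .ini material system isn't shown, but a minimal sketch gives the idea; the section layout, key names (diffuse, tint, brightness), and the MaterialDesc struct here are all hypothetical, not the team's actual format.

```cpp
#include <cassert>
#include <map>
#include <sstream>
#include <string>

// Hypothetical material record; field names are assumptions, not the real system's.
struct MaterialDesc {
    std::string diffuseTexture;
    float tint[3] = {1.f, 1.f, 1.f};
    float brightness = 1.f;
};

// Parse a minimal "[name]" / "key=value" ini stream into named materials.
// Editing the .ini and reloading it changes materials without touching Blender.
std::map<std::string, MaterialDesc> parseMaterialIni(std::istream& in) {
    std::map<std::string, MaterialDesc> mats;
    std::string line, current;
    while (std::getline(in, line)) {
        if (line.empty() || line[0] == ';') continue;        // blank line or comment
        if (line.front() == '[' && line.back() == ']') {     // "[crate]" starts a section
            current = line.substr(1, line.size() - 2);
            mats[current];                                   // create the entry
            continue;
        }
        auto eq = line.find('=');
        if (eq == std::string::npos || current.empty()) continue;
        std::string key = line.substr(0, eq);
        std::string val = line.substr(eq + 1);
        MaterialDesc& m = mats[current];
        if (key == "diffuse") m.diffuseTexture = val;
        else if (key == "brightness") m.brightness = std::stof(val);
        else if (key == "tint") {
            std::istringstream ts(val);
            ts >> m.tint[0] >> m.tint[1] >> m.tint[2];
        }
    }
    return mats;
}
```

A reload-on-keypress hook in the game loop is all that's needed on top of this to get the "on the fly" behavior described.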
We also used Blender to stitch the individual scenes and props together, as well as to tag them with the metadata that our ActivatorRegistrator system (mentioned above) took advantage of.
For modeling and animations, everything was done in Maya 2015 and the student version of Maya 2016, and exported as OBJ and FBX files. Everything was first tested in Open 3D Model Viewer to make sure models and animations looked good before going to the graphics team. Ultimately, all model assets used were in FBX format. To get the textures for more complex and custom designs, such as the character model and the deerstalker, we created automatic UV maps and took UV snapshots in Maya. The UV snapshots were saved as PNG files and imported into Photoshop to paint, then imported back into Maya as texture color files. We learned that transparency in the texture file can make the model transparent and awkward-looking, so future artists, be careful! To model the character from 2D images, Tiffany drew orthographic views of the character (front and side) and imported them into Maya on intersecting planes; the intersecting planes act as guides for 3D modeling. Animations were done with keyframes, with the first and last frames identical for looping animations such as running and punching. Maya has a great rigging tool called HumanIK, which automatically generates a skeleton and an advanced rig for you; the rig makes the animation process smoother. However, we had problems getting HumanIK rigs to load in Assimp, so it's safer to first manually create a skeleton with the Joint Tool. We were able to get animations working with just the skeleton and no IK handles, though it does require a bit more attention to detail (e.g. how the rest of the body would move with a lowered arm). Before animating, ensure the skin is bound to the right joints by checking the Paint Skin Weights Tool. Maya is a very advanced tool with a steep learning curve that caused many frustrations, but because it's so widely used in the industry, the suffering was worth it! I expect a similar experience with 3ds Max and Blender, so there's no escape either way! [Tiffany]
[Lauren] All the models were created using Maya. Once the models were finished, they were exported as .obj files and sent to Thomas to be imported into the game. Keeping the files small was important, because if they were too big the initial startup of the game would be slowed significantly. Because of this, the models couldn't be very smooth, so we took a more angular design approach.
[Thomas] We did not use a library to load meshes, instead opting to write that code manually. Textures were loaded using the SDL2_image library (version 2.0.1) so that we could support any image file format Lauren wanted to use without difficulty. It's worth noting, however, that textures loaded in this manner are flipped vertically when sent to OpenGL; to work around this, our shaders used 1 - y for all texture lookups. While meshes and material data were loaded from the Wavefront OBJ format, we only supported a single material per model. In addition, we only supported meshes made entirely of triangles; we could not find a setting in Maya that would automatically triangulate meshes, so each mesh had to be manually triangulated before exporting.
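The 1 - y shader trick works, but the flip can also be fixed once at load time by reversing the row order of the pixel buffer before uploading with glTexImage2D. A sketch for a tightly packed RGBA8 buffer (the function name is ours; a real SDL_Surface may also have row padding via its pitch field, which this ignores):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Flip an RGBA8 image vertically in place, so row 0 becomes the bottom row,
// matching OpenGL's bottom-up texture convention. Assumes no row padding.
void flipVertically(std::vector<uint8_t>& pixels, int width, int height) {
    const int stride = width * 4;                       // bytes per row (RGBA)
    for (int row = 0; row < height / 2; ++row) {
        // Swap this row with its mirror row from the bottom of the image.
        std::swap_ranges(pixels.begin() + row * stride,
                         pixels.begin() + (row + 1) * stride,
                         pixels.begin() + (height - 1 - row) * stride);
    }
}
```

Doing the flip once on the CPU keeps every shader's texture lookups plain, at the cost of a pass over each image at load time.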
For our game, we didn't have to support animation, since all of our animation comes through the physics library (pieces falling apart). Therefore, we only needed to worry about loading models and applying the correct textures. We used tinyobjloader, an open-source loader, to load our OBJ models, and SOIL to load image assets such as the texture map, normal map, and metallic and gloss maps. Shaders then apply all the effects. One problem we faced was using the metallic and gloss maps correctly; due to time constraints and problems with the shader and the maps themselves, we ended up not using them at all.
I would suggest using .mtl files first. The properties they contain are really useful: things like the object's diffuse, specular, and ambient colors make writing shaders much easier.
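The properties mentioned map to the Ka (ambient), Kd (diffuse), and Ks (specular) statements in a .mtl file. A minimal reader for just those rows might look like this; it assumes a single material per file and skips everything else, so it is a sketch rather than a full .mtl parser:

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Just the color terms the text mentions; a full .mtl has many more fields.
struct MtlColors {
    float Ka[3] = {0, 0, 0};  // ambient
    float Kd[3] = {0, 0, 0};  // diffuse
    float Ks[3] = {0, 0, 0};  // specular
};

// Read Ka/Kd/Ks rows from a single-material .mtl stream; other lines are skipped.
MtlColors parseMtl(std::istream& in) {
    MtlColors m;
    std::string tag;
    while (in >> tag) {
        float* dst = tag == "Ka" ? m.Ka
                   : tag == "Kd" ? m.Kd
                   : tag == "Ks" ? m.Ks : nullptr;
        if (dst) { in >> dst[0] >> dst[1] >> dst[2]; }
        else     { std::string rest; std::getline(in, rest); }  // skip the line
    }
    return m;
}
```

These three vectors can be uploaded as shader uniforms directly, which is exactly why they make writing a basic Phong shader easier.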
[Alex] Modeling was done with Blender 2.66, though Maya or 3ds Max would have worked just as well. When the model is done and textured, Blender is used to export the .obj file and the .mtl file. In the final version, we used Blender to export the .obj file and four .png files corresponding to the model's diffuse, specular, metallic, and normal maps. The diffuse images were taken from the internet, and CrazyBump was used to generate the specular, metallic, and normal .pngs.
We used 3ds Max for modeling and Assimp to load 3D models and textures. To load animation, we found an MD2 loader through Google: http://www.flipcode.com/archives/MD2_Model_Loader.shtml. This loader has a very simple interface of only three functions: load reads the animation data, setframe selects a specific frame, and draw renders the current frame. However, the MD2 file format doesn't contain per-vertex normals, which means we had to use each triangle's face normal instead. This loader also doesn't support textures.
https://ogldev.org/www/tutorial38/tutorial38.html. This tutorial covers loading MD5 (skeletal) animation using Assimp.
David Wang: Unless you have someone as good as Wes at importing FBX files, I would probably stay away from that format if you're working in Maya. I found a way to export MD5 files from FBX files in Blender with a Python script, but that takes a little searching. I remember MD5 being a much easier alternative, though by the time we figured that out, Wes had already solved the FBX issue. .dae files cause bone issues in Maya, and .obj files are for static meshes only. If you do plan to export FBX files, animate in Maya LT and use its game exporter. From there you're able to export multiple "takes" for one model in a single FBX file, which saves memory and load time at the start of the game. Running "select -hi" in Maya LT's script editor selects all of the bones at once for keyframing. You're welcome, future animators.
Wesley Lau: For graphics, animation was a pain to figure out. I first had an initial model loader that got static models to load fine, but extending it to animation was the frustrating part. There is not a lot of documentation out there, but perhaps I will post the code for our animation loader online if any future 125 students want to see how our pipeline (code-side) worked. Here are the steps I took after receiving the FBX model file from David:
1. Using Assimp, load all the mesh, bone, and animation data.
2. Start by drawing the initial mesh (the bind pose).
3. Take the bone matrices of each mesh and apply the necessary calculations to animate them while running an animation timer.
4. Draw onto the screen using VAOs (or immediate-mode glVertex3* calls).
The characters were modeled using Maya (1,000-8,000 triangles was the range for the characters, but most averaged around 2,000). A combination of reducing triangles and smoothing normals in Maya allowed us to have high-res-looking models with a low poly count. The models were then exported to ZBrush for texturing and UV mapping; ZBrush UV-maps a model at the press of one button. From ZBrush, each model was exported back into Maya just to make sure everything mapped correctly, and then exported in .obj or .fbx format. ZBrush is amazing, but it is usually used for film modeling, where polycount doesn't matter, so that was a huge learning process; still, it's awesome to paint your models with. The 2D graphics were all done in Photoshop. Other textures were found online with corresponding bump maps.
For loading meshes and models, we used both an OBJ loader and an FBX loader. The OBJ loader simply amounts to code that counts up the number of vertices, normals, and texture coordinates, then fills in arrays with their data while reading down the file (based on whether a line starts with v, vn, or vt). The last part of the OBJ file is the faces, each with three points. Each point is in the format x/y/z, where x is the index of the vertex, y the texture coordinate, and z the normal; since these counts start at 1, you should subtract 1 from each value when indexing your arrays. The FBX loader was based on code we found for the FBX format. The logic is similar to an OBJ loader, except it has functions for getting the index of a polygon and then the vertex and vertex normal for that polygon. We added code to calculate the center of the model and adjust the points so they were centered at 0, but other than that, no real troubleshooting was done. While the OBJ loader was really straightforward, we wouldn't recommend the FBX file type: it contains quite a lot of extra information, and we had one file that for an unknown reason took longer to load even though it was smaller and simpler than the other models.
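The OBJ logic described, including the subtract-1 index adjustment and the centering pass, can be sketched as follows. This is a triangle-only reader for v and f lines; vt/vn handling, error checking, and the team's actual struct layout are omitted or assumed:

```cpp
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

struct ObjMesh {
    std::vector<float> positions;  // xyz triples, in file order
    std::vector<int>   posIndex;   // 0-based vertex index per face corner
};

// Minimal triangle-only OBJ reader for "v x y z" and "f a/b/c a/b/c a/b/c".
ObjMesh loadObj(std::istream& in) {
    ObjMesh mesh;
    std::string tag;
    while (in >> tag) {
        if (tag == "v") {
            float x, y, z;
            in >> x >> y >> z;
            mesh.positions.insert(mesh.positions.end(), {x, y, z});
        } else if (tag == "f") {
            for (int corner = 0; corner < 3; ++corner) {
                std::string tok;
                in >> tok;                                    // e.g. "3/3/3"
                int v = std::stoi(tok.substr(0, tok.find('/')));
                mesh.posIndex.push_back(v - 1);               // OBJ counts from 1
            }
        } else {
            std::string rest;
            std::getline(in, rest);                           // skip other lines
        }
    }
    return mesh;
}

// Shift all positions so the mesh's centroid sits at the origin,
// like the centering step described above.
void centerMesh(ObjMesh& mesh) {
    float c[3] = {0, 0, 0};
    const size_t n = mesh.positions.size() / 3;
    if (n == 0) return;
    for (size_t i = 0; i < mesh.positions.size(); ++i) c[i % 3] += mesh.positions[i];
    for (size_t i = 0; i < mesh.positions.size(); ++i) mesh.positions[i] -= c[i % 3] / n;
}
```

A production loader would also de-duplicate v/vt/vn combinations into a single interleaved vertex buffer, since OpenGL indexes all attributes together.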
The other piece of code we needed was the DDSTextureLoader from Microsoft. It was needed because Microsoft deprecated the older texture-loading functions, and the textures had to be in DDS format for it to work. We found that converting the textures to DDS in Visual Studio worked best and caused no problems. Beyond that, it simply involved calling DirectX::CreateDDSTextureFromFile(), which Microsoft documents in detail. Although it did require pulling in outside code, the DDSTextureLoader was easy to use and is recommended if you are using the latest version of DirectX.
For the art, we relied on 3ds Max to texture, skin, and animate the models. The model mesh itself was created in Sculptris, a free program. While Sculptris is very helpful for making detailed models that look good, it does not allow much control over the size or vertices of the model or the number of triangles, and the cross-compatibility of Sculptris models with other 3D modeling programs is dubious at best. Given the option to do it again, even though it offers less ease and flexibility, I would try creating, skinning, and rigging the model all in one program. Even though that would probably have resulted in a simpler-looking model, it would be easier to use a model built within the program (Maya or 3ds Max) and take advantage of all of that program's features.
We used Assimp to load Blender models. The problem with Assimp is that it doesn't allow bones with the same name; the artist needs to be aware of this and make sure every bone is named differently. Animations also need to start from frame 0. We had an incident where an animation was off by one or two frames and Assimp failed to work. One more thing for graphics programmers to note: artists usually don't know the maximum number of bone attachments per vertex, because modeling software is pretty high-level. This can result in fewer bone attachments inside the program than the artist designed, which will warp the model. The solution is to make the maximum attachment count larger than what is usually needed.
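The duplicate-bone-name pitfall is cheap to catch before the rig ever reaches Assimp. A small preflight check over the exported bone name list (the function name and the idea of running it as a validation pass are ours):

```cpp
#include <cassert>
#include <set>
#include <string>
#include <vector>

// Return the first duplicated bone name, or "" if all names are unique.
// Run this on the rig's bone list before handing the file to Assimp.
std::string findDuplicateBone(const std::vector<std::string>& boneNames) {
    std::set<std::string> seen;
    for (const std::string& name : boneNames)
        if (!seen.insert(name).second)   // insert fails -> already present
            return name;
    return "";
}
```

Reporting the offending name back to the artist is far friendlier than a mysterious import failure deep in the loader.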
Assimp is a pretty decent library, and so far I haven't found any other library that's easier to use, so I'll definitely use it again.