
Josh

Staff
  • Posts

    23,138
  • Joined

  • Last visited

Everything posted by Josh

  1. Okay, I believe this will be fixed when a new build goes up...
  2. I am seeing something weird on my 1080
  3. Josh

    POINT

    I corrected the function definition:

    "Point", sol::overload(
        [](Entity& e, shared_ptr<Entity> target) { e.Point(target); },
        [](Entity& e, shared_ptr<Entity> target, int a) { e.Point(target, a); },
        [](Entity& e, shared_ptr<Entity> target, int a, float r) { e.Point(target, a, r); },
        [](Entity& e, shared_ptr<Entity> target, int a, float r, float z) { e.Point(target, a, r, z); },
        [](Entity& e, Vec3& target) { e.Point(target); },
        [](Entity& e, Vec3& target, int a) { e.Point(target, a); },
        [](Entity& e, Vec3& target, int a, float r) { e.Point(target, a, r); },
        [](Entity& e, Vec3& target, int a, float r, float z) { e.Point(target, a, r, z); },
        [](Entity& e, float x, float y, float z) { e.Point(x, y, z); },
        [](Entity& e, float x, float y, float z, int a) { e.Point(x, y, z, a); },
        [](Entity& e, float x, float y, float z, int a, float r) { e.Point(x, y, z, a, r); },
        [](Entity& e, float x, float y, float z, int a, float r, float roll) { e.Point(x, y, z, a, r, roll); }
    ),
  4. Josh

    POINT

    Test:

    local world = CreateWorld()
    local box = CreateBox(world)
    local box2 = CreateBox(world)
    box:SetPosition(2, 0, 0)
    box:Point(box2, 2, 1, 0)

    Result:

    sol: no matching function call takes this number of arguments and the specified types
  5. I changed the function definition to support NULL for the parent parameter. (See the sketch for item 5 after this list.)
  6. Yes, please send your project.
  7. This is a protected member that was meant for possible future development. I am making this member private. I also added a read-only font member to the Interface class. This will only be non-NULL when a 3D GUI is created. (See the sketch for item 7 after this list.)
  8. This is looking pretty good. Everything except animations is loading now. I tried the Step file loading, but it failed on my first try, and there are a lot of posts saying it's not really supported, as I suspected.

    #include "UltraEngine.h"
    #include "ComponentSystem.h"
    #include "assimp/include/assimp/Importer.hpp"
    #include "assimp/include/assimp/scene.h"
    #include "assimp/include/assimp/postprocess.h"
    #include "assimp/include/assimp/pbrmaterial.h"

    using namespace UltraEngine;

    shared_ptr<Model> AILoadModel(const aiScene* scene, aiNode* node, shared_ptr<World> world, shared_ptr<Entity> parent, const std::vector<shared_ptr<Material> >& materials) {
        auto model = CreateModel(world);
        model->SetParent(parent);
        model->name = std::string(node->mName.C_Str());

        // Process meshes attached to the current node (if any)
        for (unsigned int i = 0; i < node->mNumMeshes; i++) {
            MeshPrimitives mode = MeshPrimitives(0);
            aiMesh* aimesh = scene->mMeshes[node->mMeshes[i]];
            switch (aimesh->mPrimitiveTypes) {
            case aiPrimitiveType_POINT: mode = MESH_POINTS; break;
            case aiPrimitiveType_LINE: mode = MESH_LINES; break;
            case aiPrimitiveType_POLYGON:
                if (aimesh->mNumFaces > 0 and aimesh->mFaces[0].mNumIndices == 4) mode = MESH_QUADS;
                break;
            case aiPrimitiveType_TRIANGLE: mode = MESH_TRIANGLES; break;
            }
            if (mode != 0 and aimesh->HasPositions()) {
                auto mesh = model->AddMesh(mode);
                Vec3 pos, norm;
                Vec2 texcoords;
                Vec4 color;
                std::array<int, 4> bones;
                std::array<float, 4> weights;
                for (int v = 0; v < aimesh->mNumVertices; ++v) {
                    pos.x = aimesh->mVertices[v].x;
                    pos.y = aimesh->mVertices[v].y;
                    pos.z = aimesh->mVertices[v].z;
                    if (aimesh->HasNormals()) {
                        norm.x = aimesh->mNormals[v].x;
                        norm.y = aimesh->mNormals[v].y;
                        norm.z = aimesh->mNormals[v].z;
                    }
                    if (aimesh->GetNumUVChannels() > 0) {
                        texcoords.x = aimesh->mTextureCoords[0][v].x;
                        texcoords.y = 1.0f - aimesh->mTextureCoords[0][v].y;
                    }
                    mesh->AddVertex(pos, norm, texcoords);
                    if (aimesh->HasVertexColors(0) and aimesh->GetNumColorChannels() == 4) {
                        color.r = aimesh->mColors[0][v].r;
                        color.g = aimesh->mColors[0][v].g;
                        color.b = aimesh->mColors[0][v].b;
                        color.a = aimesh->mColors[0][v].a;
                        mesh->SetVertexColor(v, color);
                    }
                    if (aimesh->GetNumUVChannels() > 1) {
                        texcoords.x = aimesh->mTextureCoords[1][v].x;
                        texcoords.y = 1.0f - aimesh->mTextureCoords[1][v].y;
                        mesh->SetVertexTexCoords(v, texcoords, 1);
                    }
                    if (aimesh->HasBones()) {
                        for (int n = 0; n < Min(aimesh->mNumBones, 4); ++n) {
                            //bones[n] = aimesh->mBones[n]->mNode;
                            weights[n] = aimesh->mBones[n]->mWeights->mWeight;
                        }
                        //mesh->SetVertexBones();
                    }
                    if (mode == MESH_POINTS) mesh->AddIndice(v);
                }
                if (mode != MESH_POINTS) {
                    for (int v = 0; v < aimesh->mNumFaces; ++v) {
                        switch (mode) {
                        case MESH_LINES:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[0], aimesh->mFaces[v].mIndices[1]);
                            break;
                        case MESH_TRIANGLES:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[0], aimesh->mFaces[v].mIndices[1], aimesh->mFaces[v].mIndices[2]);
                            break;
                        case MESH_QUADS:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[3], aimesh->mFaces[v].mIndices[2], aimesh->mFaces[v].mIndices[1], aimesh->mFaces[v].mIndices[0]);
                            break;
                        }
                    }
                }
                mesh->SetMaterial(materials[aimesh->mMaterialIndex]);
                mesh->UpdateBounds();
                if (aimesh->HasNormals() and aimesh->GetNumUVChannels() > 0) mesh->UpdateTangents();
            }
        }
        for (int n = 0; n < node->mNumChildren; ++n) AILoadModel(scene, node->mChildren[n], world, model, materials);
        model->UpdateBounds();
        return model;
    }

    shared_ptr<Model> ImportModel(shared_ptr<World> world, const WString& path) {
        Assimp::Importer importer;
        auto d = CurrentDir();
        const aiScene* scene = importer.ReadFile(path.ToUtf8String().c_str(), aiProcess_JoinIdenticalVertices | aiProcess_SortByPType);
        if (not scene or scene->mFlags & AI_SCENE_FLAGS_INCOMPLETE or not scene->mRootNode) {
            return nullptr;
        }
        std::vector<shared_ptr<Texture> > textures;
        if (scene->HasTextures()) {
            for (unsigned int i = 0; i < scene->mNumTextures; ++i) {
                aiTexture* aitex = scene->mTextures[i];
                auto buffer = CreateBuffer(aitex->mWidth * aitex->mHeight * 4);
                memcpy(buffer->Data(), aitex->pcData, buffer->GetSize());
                auto pixmap = CreatePixmap(aitex->mWidth, aitex->mHeight, TEXTURE_RGBA, buffer);
                auto mipchain = pixmap->BuildMipchain();
                auto tex = CreateTexture(TEXTURE_2D, aitex->mWidth, aitex->mHeight, TEXTURE_RGBA, mipchain);
            }
        }
        std::vector<shared_ptr<Material> > materials;
        if (scene->HasMaterials()) {
            for (unsigned int i = 0; i < scene->mNumMaterials; ++i) {
                auto mtl = CreateMaterial();
                auto aimtl = scene->mMaterials[i];
                aiString path;
                if (aimtl->GetTextureCount(aiTextureType_DIFFUSE) > 0) {
                    if (aimtl->GetTexture(aiTextureType_DIFFUSE, 0, &path) == aiReturn_SUCCESS) {
                        auto tex = LoadTexture(std::string(path.C_Str()));
                        mtl->SetTexture(tex, TEXTURE_BASE);
                    }
                }
                if (aimtl->GetTextureCount(aiTextureType_NORMALS) > 0) {
                    if (aimtl->GetTexture(aiTextureType_NORMALS, 0, &path) == aiReturn_SUCCESS) {
                        auto tex = LoadTexture(std::string(path.C_Str()));
                        mtl->SetTexture(tex, TEXTURE_NORMAL);
                    }
                }
                if (aimtl->GetTextureCount(aiTextureType_EMISSIVE) > 0) {
                    if (aimtl->GetTexture(aiTextureType_EMISSIVE, 0, &path) == aiReturn_SUCCESS) {
                        auto tex = LoadTexture(std::string(path.C_Str()));
                        mtl->SetTexture(tex, TEXTURE_EMISSION);
                    }
                }
                float metalness;
                if (AI_SUCCESS == aimtl->Get(AI_MATKEY_GLTF_PBRMETALLICROUGHNESS_METALLIC_FACTOR, metalness)) {
                    mtl->SetMetalness(metalness);
                }
                float roughness;
                if (AI_SUCCESS == aimtl->Get(AI_MATKEY_GLTF_PBRMETALLICROUGHNESS_ROUGHNESS_FACTOR, roughness)) {
                    mtl->SetRoughness(roughness);
                }
                aiColor4D pbrSpecularGlossiness;
                if (AI_SUCCESS == aimtl->Get(AI_MATKEY_GLTF_PBRSPECULARGLOSSINESS, pbrSpecularGlossiness)) {
                    mtl->SetShaderFamily(LoadShaderFamily("Shaders/SpecularGloss.fam"));
                    mtl->SetSpecular(Vec3(pbrSpecularGlossiness.r, pbrSpecularGlossiness.g, pbrSpecularGlossiness.b));
                    mtl->SetGlossiness(pbrSpecularGlossiness.a);
                }
                int unlit;
                if (AI_SUCCESS == aimtl->Get(AI_MATKEY_GLTF_UNLIT, unlit) && unlit == 1) {
                    mtl->SetShaderFamily(LoadShaderFamily("Shaders/Unlit.fam"));
                }
                aiColor4D diffuseColor(0.f, 0.f, 0.f, 0.f);
                aimtl->Get(AI_MATKEY_COLOR_DIFFUSE, diffuseColor);
                aiColor3D emissiveColor(0.f, 0.f, 0.f);
                aimtl->Get(AI_MATKEY_COLOR_EMISSIVE, emissiveColor);
                mtl->SetColor(diffuseColor.r, diffuseColor.g, diffuseColor.b, diffuseColor.a);
                mtl->SetEmission(emissiveColor.r, emissiveColor.g, emissiveColor.b);
                materials.push_back(mtl);
            }
        }
        auto model = AILoadModel(scene, scene->mRootNode, world, NULL, materials);
        //for (int n = 0; n < scene->mNumAnimations; ++n)
        //{
        //    aiAnimation* animation = scene->mAnimations[n];
        //    animation->mChannels[0].
        //}
        return model;
    }

    int main(int argc, const char* argv[]) {
        RegisterComponents();
        auto cl = ParseCommandLine(argc, argv);

        //Load FreeImage plugin (optional)
        auto fiplugin = LoadPlugin("Plugins/FITextureLoader");

        //Get the displays
        auto displays = GetDisplays();

        //Create a window
        auto window = CreateWindow("Ultra Engine", 0, 0, 1280 * displays[0]->scale, 720 * displays[0]->scale, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);

        //Create a framebuffer
        auto framebuffer = CreateFramebuffer(window);

        //Create a world
        auto world = CreateWorld();
        auto camera = CreateCamera(world);
        camera->AddComponent<CameraControls>();
        camera->Move(0, 0, 3);
        camera->Turn(0, 180, 0);
        camera->SetClearColor(0.25);

        //Load a model
        auto model = ImportModel(world, "test.gltf");//load your own model here

        auto light = CreateDirectionalLight(world);
        light->SetRotation(34, 36 + 180, 0);

        //Set environment maps
        const WString remotepath = "https://raw.githubusercontent.com/UltraEngine/Documentation/master/Assets";
        auto specmap = LoadTexture(remotepath + "/Materials/Environment/Storm/specular.dds");
        auto diffmap = LoadTexture(remotepath + "/Materials/Environment/Storm/diffuse.dds");
        world->SetEnvironmentMap(specmap, ENVIRONMENTMAP_SPECULAR);
        world->SetEnvironmentMap(diffmap, ENVIRONMENTMAP_DIFFUSE);

        //Main loop
        while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false) {
            world->Update();
            world->Render(framebuffer);
        }
        return 0;
    }
  9. It seems pretty easy to load models with this:

    #include "UltraEngine.h"
    #include "ComponentSystem.h"
    #include "assimp/include/assimp/Importer.hpp"
    #include "assimp/include/assimp/scene.h"
    #include "assimp/include/assimp/postprocess.h"

    using namespace UltraEngine;

    shared_ptr<Model> AILoadModel(const aiScene* scene, aiNode* node, shared_ptr<World> world, shared_ptr<Entity> parent) {
        auto model = CreateModel(world);
        model->SetParent(parent);
        model->name = std::string(node->mName.C_Str());

        // Process meshes attached to the current node (if any)
        for (unsigned int i = 0; i < node->mNumMeshes; i++) {
            MeshPrimitives mode = MeshPrimitives(0);
            aiMesh* aimesh = scene->mMeshes[node->mMeshes[i]];
            switch (aimesh->mPrimitiveTypes) {
            case aiPrimitiveType_POINT: mode = MESH_POINTS; break;
            case aiPrimitiveType_LINE: mode = MESH_LINES; break;
            case aiPrimitiveType_POLYGON: mode = MESH_QUADS; break;
            case aiPrimitiveType_TRIANGLE: mode = MESH_TRIANGLES; break;
            }
            if (mode != 0 and aimesh->HasPositions()) {
                auto mesh = model->AddMesh(mode);
                Vec3 pos, norm;
                Vec2 texcoords;
                for (int v = 0; v < aimesh->mNumVertices; ++v) {
                    pos.x = aimesh->mVertices[v].x;
                    pos.y = aimesh->mVertices[v].y;
                    pos.z = aimesh->mVertices[v].z;
                    if (aimesh->HasNormals()) {
                        norm.x = aimesh->mNormals[v].x;
                        norm.y = aimesh->mNormals[v].y;
                        norm.z = aimesh->mNormals[v].z;
                    }
                    if (aimesh->GetNumUVChannels() > 0) {
                        texcoords.x = aimesh->mTextureCoords[0]->x;
                        texcoords.y = aimesh->mTextureCoords[0]->y;
                    }
                    mesh->AddVertex(pos, norm, texcoords);
                    if (aimesh->GetNumUVChannels() > 1) {
                        texcoords.x = aimesh->mTextureCoords[1]->x;
                        texcoords.y = aimesh->mTextureCoords[1]->y;
                        mesh->SetVertexTexCoords(v, texcoords);
                    }
                    if (mode == MESH_POINTS) mesh->AddIndice(v);
                }
                if (mode != MESH_POINTS) {
                    for (int v = 0; v < aimesh->mNumFaces; ++v) {
                        switch (mode) {
                        case MESH_LINES:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[0], aimesh->mFaces[v].mIndices[1]);
                            break;
                        case MESH_TRIANGLES:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[2], aimesh->mFaces[v].mIndices[1], aimesh->mFaces[v].mIndices[0]);
                            break;
                        case MESH_QUADS:
                            mesh->AddPrimitive(aimesh->mFaces[v].mIndices[3], aimesh->mFaces[v].mIndices[2], aimesh->mFaces[v].mIndices[1], aimesh->mFaces[v].mIndices[0]);
                            break;
                        }
                    }
                }
                if (aimesh->mMaterialIndex) {
                    auto aimtl = scene->mMaterials[aimesh->mMaterialIndex];
                    aiString diffuseTexturePath, normalTexturePath, metalTexturePath, roughnessTexturePath, displacementTexturePath, occlusionTexturePath, specularTexturePath, glossTexturePath;
                    if (aimtl->GetTextureCount(aiTextureType_DIFFUSE)) aimtl->GetTexture(aiTextureType_DIFFUSE, 0, &diffuseTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_NORMALS)) aimtl->GetTexture(aiTextureType_NORMALS, 0, &normalTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_METALNESS)) aimtl->GetTexture(aiTextureType_METALNESS, 0, &metalTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_DIFFUSE_ROUGHNESS)) aimtl->GetTexture(aiTextureType_NORMALS, 0, &roughnessTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_SPECULAR)) aimtl->GetTexture(aiTextureType_SPECULAR, 0, &specularTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_SHININESS)) aimtl->GetTexture(aiTextureType_SPECULAR, 0, &glossTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_DISPLACEMENT)) aimtl->GetTexture(aiTextureType_DISPLACEMENT, 0, &displacementTexturePath);
                    if (aimtl->GetTextureCount(aiTextureType_AMBIENT_OCCLUSION)) aimtl->GetTexture(aiTextureType_AMBIENT_OCCLUSION, 0, &displacementTexturePath);
                    aiColor3D diffuseColor(0.f, 0.f, 0.f);
                    aimtl->Get(AI_MATKEY_COLOR_DIFFUSE, diffuseColor);
                    aiColor3D emissiveColor(0.f, 0.f, 0.f);
                    aimtl->Get(AI_MATKEY_COLOR_EMISSIVE, emissiveColor);
                }
                mesh->UpdateBounds();
                if (aimesh->HasNormals() and aimesh->GetNumUVChannels() > 0) mesh->UpdateTangents();
            }
        }
        for (int n = 0; n < node->mNumChildren; ++n) AILoadModel(scene, node->mChildren[n], world, model);
        model->UpdateBounds();
        return model;
    }

    shared_ptr<Model> ImportModel(shared_ptr<World> world, const WString& path) {
        Assimp::Importer importer;
        auto d = CurrentDir();
        const aiScene* scene = importer.ReadFile(path.ToUtf8String().c_str(), aiProcess_JoinIdenticalVertices | aiProcess_SortByPType);
        if (not scene or scene->mFlags & AI_SCENE_FLAGS_INCOMPLETE or not scene->mRootNode) {
            return nullptr;
        }
        auto model = AILoadModel(scene, scene->mRootNode, world, NULL);
        return model;
    }

    int main(int argc, const char* argv[]) {
        RegisterComponents();
        auto cl = ParseCommandLine(argc, argv);

        //Load FreeImage plugin (optional)
        auto fiplugin = LoadPlugin("Plugins/FITextureLoader");

        //Get the displays
        auto displays = GetDisplays();

        //Create a window
        auto window = CreateWindow("Ultra Engine", 0, 0, 1280 * displays[0]->scale, 720 * displays[0]->scale, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);

        //Create a framebuffer
        auto framebuffer = CreateFramebuffer(window);

        //Create a world
        auto world = CreateWorld();
        auto camera = CreateCamera(world);
        camera->AddComponent<CameraControls>();
        camera->Move(0, 0, -3);

        //Load a model
        auto model = ImportModel(world, "test.fbx");//load your own model here

        auto light = CreateDirectionalLight(world);
        light->SetRotation(34, 36, 0);

        //Main loop
        while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false) {
            world->Update();
            world->Render(framebuffer);
        }
        return 0;
    }
  10. Two meters is a very high near range. It should be way less than that. (See the sketch for item 10 after this list.)
  11. I suspect the 28th step is triggering a garbage collection step...
  12. It should be limited only by your system memory.
  13. Hmmm, that is a lot! I think your idea to disable reprojection based on a camera setting is probably better.
  14. Here is what looks like an extension to the Blender glTF exporter / importer that adds support for this feature: https://github.com/takahirox/glTF-Blender-IO-MSFT-lod I have not tried it myself yet.
  15. You would probably want to use two cameras to render this: first draw everything under the transparent window, then apply the post-effect, then render the window in another pass. Honestly, it sounds pretty hard to set up in a general-purpose way. (See the sketch for item 15 after this list.)
  16. Every SketchFab model is available in glTF format. glTF is the only file format I know of that supports a PBR material system. FBX does not. The channel splitting and compression stuff PBR materials require is pretty complex: https://www.ultraengine.com/learn/pbrmaterials For trees it might not matter, but if you get into metal surfaces I think you might start to appreciate the glTF export process more.
  17. It sounds like you are moving a lot of work from the Ultra editor into Blender, and you are setting up a very specific scene in Blender so it works with your export process. This is fine if it works for you.

    Note that Blender's DDS file format support is 15 years out of date and they seem uninterested in fixing it, so it does not support artifact-free texture compression (BC7, and BC5 for normal maps). DXT-compressed texture data looks blocky and often develops a greenish tint that looks especially pronounced in dark areas. Interestingly, I came across a neural network designed specifically to remove these artifacts: https://github.com/n00mkrad/cupscale/releases/ https://developer.valvesoftware.com/wiki/Restoring_Texture_After_DXT_Compression https://drive.google.com/file/d/1WXBNdlqDWV10a3L1_U7zuNkSpN9aBF0M/view?usp=sharing

    My point is, if you want artifact-free compressed textures, PNG textures in Blender aren't going away, and you need some post-export step to convert them to BC7 and BC5 DDS files. This tool is the easiest way to do that for glTF files:

    For your purposes, it sounds like it's not the file format that matters so much as the export process you have set up. If so, you could probably export glTF files just as easily using the same process, but if your current approach is working then by all means use it. In my view, there is no need for any "source file" anymore because glTF is the source file as well as the final game-ready file. Fortunately, the export capabilities of the engine are quite good, so even if data does get sent into a proprietary format, it can still be saved as a glTF or OBJ without much trouble...
  18. The reason it is tied to the terrain is that each instance's position is determined programmatically rather than stored in memory. The x and z positions are calculated from the instance number, and the y position comes from the terrain heightmap elevation. Because no per-instance position is stored in memory, each instance consumes only one bit. Without the terrain, how would the algorithm figure out the position of each instance? (See the sketch for item 18 after this list.)
  19. You probably need to restart Visual Studio after editing the environment variable. If you launch VS from the Ultra Editor, you would need to restart the editor as well, since I think child processes inherit their parent's environment variables, which are captured at process start and do not update when the user changes them later.
  20. Wouldn't that be a post-processing effect? You cannot write to and read from the same color texture at the same time. (See the sketch for item 20 after this list.)
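
Sketch for item 5: one way a Lua-facing function can accept NULL/nil for a parent parameter is to bind it through sol::optional, which turns Lua nil into an empty optional. This only illustrates the pattern; BindAttachTo and the "AttachTo" name are made up and are not the actual engine binding.

    #include "UltraEngine.h"
    #include <sol/sol.hpp>
    using namespace UltraEngine;

    // Hypothetical binding: "AttachTo" is a made-up name used only to show the idea.
    // Passing nil from Lua produces an empty optional, which becomes a null parent.
    void BindAttachTo(sol::state& L) {
        L.set_function("AttachTo", [](Entity& e, sol::optional<shared_ptr<Entity>> parent) {
            e.SetParent(parent.value_or(nullptr));
        });
    }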
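
Sketch for item 7: roughly what the described change looks like. The member names and class layout below are assumed for illustration and are not copied from the engine headers.

    #include "UltraEngine.h"
    using namespace UltraEngine;

    // Stand-in for the engine's Interface class; names are assumed.
    class MockInterface {
        shared_ptr<Font> reservedfont; // formerly protected, now private; kept for possible future development
    public:
        shared_ptr<Font> font; // treat as read-only; only non-NULL when a 3D GUI is created
    };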
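
Sketch for item 10: a more typical camera range setup, assuming the camera exposes a SetRange(near, far) method as in earlier engine versions. A near plane around 0.1 meters keeps nearby geometry visible; 2 meters would clip anything close to the camera.

    #include "UltraEngine.h"
    using namespace UltraEngine;

    int main(int argc, const char* argv[]) {
        auto displays = GetDisplays();
        auto window = CreateWindow("Near range", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);
        auto framebuffer = CreateFramebuffer(window);
        auto world = CreateWorld();
        auto camera = CreateCamera(world);
        camera->SetRange(0.1f, 1000.0f); // assumed API: near plane well under a meter, far plane as needed
        auto box = CreateBox(world);
        box->SetPosition(0, 0, 3);
        while (not window->Closed() and not window->KeyDown(KEY_ESCAPE)) {
            world->Update();
            world->Render(framebuffer);
        }
        return 0;
    }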
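
Sketch for item 15: the rough shape of the two-camera idea, splitting the scene with render layers. SetRenderLayers, SetClearMode, and the CLEAR_DEPTH flag are assumptions based on how layered cameras usually work; treat this as an outline, not a tested setup.

    #include "UltraEngine.h"
    using namespace UltraEngine;

    // Outline only: layer numbers and the clear-mode flag are assumptions.
    void SetupTwoPassWindow(shared_ptr<World> world, shared_ptr<Entity> transparentwindow) {
        // First camera draws everything except the transparent window (layer 1);
        // the post-effect would be attached to this camera.
        auto maincamera = CreateCamera(world);
        maincamera->SetRenderLayers(1);

        // Second camera draws only the transparent window (layer 2) on top of the
        // first pass, without clearing the color that was already rendered.
        auto overlaycamera = CreateCamera(world);
        overlaycamera->SetRenderLayers(2);
        overlaycamera->SetClearMode(CLEAR_DEPTH);

        transparentwindow->SetRenderLayers(2);
    }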
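
Sketch for item 18: how a position can be rebuilt from nothing but an instance index and the terrain, which is why each instance only needs one bit of storage. The grid spacing, heightmap lookup, and function names are made up for illustration; they are not the engine's actual implementation.

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    struct InstancePos { float x, y, z; };

    // x and z come purely from the instance number; y comes from the terrain heightmap,
    // so no per-instance position ever needs to be stored.
    InstancePos GetInstancePosition(uint32_t index, uint32_t gridwidth, float spacing, const std::vector<float>& heightmap) {
        uint32_t col = index % gridwidth;
        uint32_t row = index / gridwidth;
        float x = float(col) * spacing;
        float z = float(row) * spacing;
        float y = heightmap[row * gridwidth + col];
        return { x, y, z };
    }

    int main() {
        std::vector<float> heightmap(16 * 16, 1.5f); // flat 16 x 16 test terrain
        auto p = GetInstancePosition(37, 16, 2.0f, heightmap);
        std::printf("%f %f %f\n", p.x, p.y, p.z); // 10, 1.5, 4
        return 0;
    }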
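
Sketch for item 20: why post-processing effects ping-pong between two textures instead of reading and writing the same one. This is a CPU-side mock-up of the idea, not engine or GPU code.

    #include <cstdio>
    #include <utility>
    #include <vector>

    int main() {
        // Two "color textures": one is read, the other is written, never both on the same buffer.
        std::vector<float> src(4, 1.0f), dst(4, 0.0f);
        for (int pass = 0; pass < 3; ++pass) {
            for (size_t i = 0; i < src.size(); ++i) dst[i] = src[i] * 0.5f; // the "effect"
            std::swap(src, dst); // the output of this pass becomes the input of the next
        }
        std::printf("%f\n", src[0]); // 0.125 after three half-brightness passes
        return 0;
    }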