
klepto2


Posts posted by klepto2

  1. A small addition to your TODO list (if you consider it):

    • Fix SSR in combination with refraction (the environment is too dark and the reflection of refracted surfaces is broken):
    • image.png.f85027c308f0a8530dd7dc2b158b970a.png
    • Add the removed animated-texture feature back (it would make some things a lot easier).
    • Clean up the shaders (remove experimental values etc.).
    • Maybe add some more hooks (just a few ideas, but these would improve the flexibility of the engine a lot; see the sketch after this list):
      • BEFORE_TRANSFER / RENDER and AFTER_TRANSFER / RENDER
      • BEFORE_CAMERA_RENDER / AFTER_CAMERA_RENDER
      • BEFORE_PFX_RENDER / AFTER_PFX_RENDER
    • Add a way to get the internal shader index of a texture.
    • Maybe consider adding more than just one matrix for user-defined values in shaders. Some complicated shaders need a lot of input values, and storing them in textures can be a bit complicated.
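
    To make the hook idea a bit more concrete, here is a rough sketch of how the proposed camera hooks could be used, assuming the same callback-registration pattern as the existing transfer/render hooks (the BEFORE_/AFTER_CAMERA_RENDER IDs are only my proposal, and the exact AddHook signature may differ):

    // Hypothetical usage of the proposed hooks - these hook IDs do not exist yet,
    // and the registration call is assumed to follow the current AddHook pattern.
    void BeforeCameraRender(shared_ptr<Object> source, shared_ptr<Object> extra)
    {
        // e.g. swap a render target or bind extra per-camera data here
    }
    
    void AfterCameraRender(shared_ptr<Object> source, shared_ptr<Object> extra)
    {
        // e.g. read back statistics or restore state here
    }
    
    world->AddHook(HOOKID_BEFORE_CAMERA_RENDER, BeforeCameraRender, nullptr); // proposed ID
    world->AddHook(HOOKID_AFTER_CAMERA_RENDER, AfterCameraRender, nullptr);   // proposed ID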

     

    • Upvote 1
  2. Small update:

    image.thumb.png.7589cad91df68a3d95281208d068b52c.png

    This is more or less an adaptation of the Leadwerks water, but unlike my previous screenshot it doesn't need a modified refraction shader. On the bottom left you can see a (realtime) generated alpha mask for the water (generated via a texture buffer and a top-down camera rendering the whole covered terrain space). This texture is used in a custom water shader family which supports animated textures (normals) and calculates the alpha level based on the generated alpha texture. With that it should be possible to add foam as well, and later to generate a distance field based on this map to animate waves, foam and other interactions in realtime.
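
    For reference, the mask camera setup is roughly this (a simplified sketch; maskSize, the camera placement and the terrainCenter/waterHeight values are my own placeholders, while CreateTextureBuffer, CreateCamera and SetRenderTarget are the same calls shown in the snippets below):

    // Sketch: render the covered terrain area from above into a texture buffer
    // to produce the water alpha mask. Projection setup is omitted for brevity.
    const int maskSize = 512;                       // placeholder resolution
    auto maskBuffer = CreateTextureBuffer(maskSize, maskSize);
    auto maskCamera = CreateCamera(world);
    maskCamera->SetRenderTarget(maskBuffer);
    maskCamera->SetPosition(terrainCenter.x, waterHeight + 100.0f, terrainCenter.z); // placeholders
    maskCamera->SetRotation(90, 0, 0);              // look straight down onto the terrain
    // A custom water shader family then samples the resulting mask texture.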

    • Like 3
  3. 16 minutes ago, reepblue said:

    I might want to check this out for myself as I was getting weird issues when making an in-game console a few months ago.

    Minus the AMD issue, looks like it's just little things left to do. I really think you can get the early access release out before Christmas. :)

    With the working texture buffer and Camera->SetRenderTarget it should work very well.

  4. Indeed, I am using a posteffect on the render camera. After removing it, the buffer creates and uses the correct texture. Background: I want to capture the underlying terrain to build a simple distance field; therefore I need a posteffect that just outputs the height below the water, and that is what the custom posteffect is for.

  5. I am experimenting with the render-to-texture feature and noticed something weird, or maybe it is intended that way, I don't know.

    If I create a TextureBuffer like this:

    auto screenBuffer = CreateTextureBuffer(512, 512);
    auto rendercamera = CreateCamera(world);
    rendercamera->SetFov(90);
    rendercamera->SetRenderTarget(screenBuffer);

    the actual textures for this buffer are not 512 x 512; instead they are the same size as the main framebuffer. This leads to some issues later, as you need to remap the texture coordinates to map to the correct space.

    image.thumb.png.c1f14d0bab9ba527c4c49523464a1bb8.png
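
    Until this is resolved, the remapping I mean looks roughly like this (my own helper, assuming the rendered 512 x 512 image ends up in a sub-region of the larger texture):

    // Remap UVs from the requested 512x512 space into the actual framebuffer-sized texture.
    // RemapBufferUV, requestedSize and framebufferSize are my own names, not engine API.
    Vec2 RemapBufferUV(const Vec2& uv, const Vec2& requestedSize, const Vec2& framebufferSize)
    {
        // Only a requestedSize-sized region of the texture contains the rendered image,
        // so scale the normalized coordinates down accordingly.
        return Vec2(uv.x * requestedSize.x / framebufferSize.x,
                    uv.y * requestedSize.y / framebufferSize.y);
    }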

    • Confused 1
  6. 9 hours ago, Josh said:

    Wait...wouldn't the Z axis be half as big each mipmap?

    
    #include "UltraEngine.h"
    
    using namespace UltraEngine;
    
    int main(int argc, const char* argv[])
    {
        auto plg = LoadPlugin("Plugins/FITextureLoader");
    
        int framecount = 128;
        std::vector<shared_ptr<Pixmap> > pixmaps(framecount);
        for (int n = 0; n < framecount; ++n)
        {
            pixmaps[n] = LoadPixmap("https://raw.githubusercontent.com/UltraEngine/Documentation/master/Assets/Materials/Animations/water1_" + String(n) + ".png");
        }
    
        //Build mipmaps
        iVec3 size = iVec3(pixmaps[0]->size.x, pixmaps[0]->size.y, pixmaps.size());
        auto mipchain = pixmaps;
        while (true)
        {
            auto osize = size;
            size.x = Max(1, size.x / 2);
            size.y = Max(1, size.y / 2);
            size.z = Max(1, size.z / 2);
            for (int n = 0; n < size.z; ++n)
            {
                auto a = pixmaps[n * 2 + 0];
                auto b = pixmaps[n * 2 + 1];
                auto mipmap = CreatePixmap(osize.x, osize.x, pixmaps[0]->format);
                for (int x = 0; x < pixmaps[0]->size.x; ++x)
                {
                    for (int y = 0; y < pixmaps[0]->size.y; ++y)
                    {
                        int rgba0 = a->ReadPixel(x, y);
                        int rgba1 = b->ReadPixel(x, y);
                        int rgba = Rgba((Red(rgba0) + Red(rgba1)) / 2, (Green(rgba0) + Green(rgba1)) / 2, (Blue(rgba0) + Blue(rgba1)) / 2, (Alpha(rgba0) + Alpha(rgba1)) / 2);
                        mipmap->WritePixel(x, y, rgba);
                    }
                }
                mipmap = mipmap->Resize(size.x, size.y);
                pixmaps[n] = mipmap;
                mipchain.push_back(mipmap);
            }
            if (size == iVec3(1, 1, 1)) break;
        }
    
        //Convert to compressed format
        for (int n = 0; n < mipchain.size(); ++n)
        {
            mipchain[n] = mipchain[n]->Convert(TEXTURE_BC5);
        }
    
        //Save
        SaveTexture(GetPath(PATH_DESKTOP) + "/water1.dds", TEXTURE_3D, mipchain, framecount);
    
        return 0;
    }

    Of course, you are right. Then there must be another error in it, because nothing is animated with miplevel > 0. I'll take another look at it.

  7. Hi Josh. As shown above, I use your generated water1.dds, and as you can see the dds works but is generated wrong (only miplevel 0 is animating). As it is a volume texture used for animation, the z component is of constant size and must not be divided for the mipmap chain; the chain should always have 128 slices for each miplevel.

    int framecount = 128;
    std::vector<shared_ptr<Pixmap> > pixmaps(framecount);
    for (int n = 0; n < framecount; ++n)
    {
        pixmaps[n] = LoadPixmap("Materials/Animations/water1_" + String(n) + ".png");
    }
    
    //Build mipmaps
    iVec3 size = iVec3(pixmaps[0]->size.x, pixmaps[0]->size.y, pixmaps.size());
    auto mipchain = pixmaps;
    while (true)
    {
        auto osize = size;
        size.x = Max(1, size.x / 2);
        size.y = Max(1, size.y / 2);
        size.z = Max(1, size.z);
        for (int n = 0; n < size.z - 1; ++n)
        {
            Print(n);
            auto a = pixmaps[n + 0];
            auto b = pixmaps[n + 1];
            auto mipmap = CreatePixmap(osize.x, osize.y, pixmaps[0]->format);
            for (int x = 0; x < osize.x; x++)
            {
                for (int y = 0; y < osize.y; y++)
                {
                    int rgba0 = a->ReadPixel(x, y);
                    int rgba1 = b->ReadPixel(x, y);
                    int rgba = RGBA((Red(rgba0) + Red(rgba1)) / 2, (Green(rgba0) + Green(rgba1)) / 2, (Blue(rgba0) + Blue(rgba1)) / 2, (Alpha(rgba0) + Alpha(rgba1)) / 2);
                    mipmap->WritePixel(x, y, rgba);
                }
            }
            mipmap = mipmap->Resize(size.x, size.y);
            pixmaps[n] = mipmap;
            mipchain.push_back(mipmap);
        }
        if (size.x == 1 && size.y == 1) break;
    }
    
    SaveTexture("water2.dds", TEXTURE_3D, mipchain, framecount);

    This should work correctly, but SaveTexture now complains about a wrong miplevel count:

    Saving texture "water2.dds"
    Error: Incorrect numbers of images in mipchain

    I assume this is because the z component is calculated wrongly in the SaveTexture validation.
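
    To make the mismatch concrete, here is a small standalone calculation of how many images each convention produces, assuming 256 x 256 frames (the size is only an example; only the counting logic matters). If SaveTexture validates against the standard 3D count, the fixed-z chain will always be rejected:

    #include <algorithm>
    #include <cstdio>
    
    // Compare the image count of a standard 3D mipchain (z halved per level) with the
    // count produced when z stays fixed at the frame count, as an animated volume needs.
    int main()
    {
        int w = 256, h = 256, frames = 128; // 256x256 is only an assumed frame size
    
        int standardCount = 0, fixedZCount = 0;
        int sx = w, sy = h, sz = frames;
        while (true)
        {
            standardCount += sz;      // a standard 3D mip level has sz slices
            fixedZCount   += frames;  // the animated volume keeps all 128 slices per level
            if (sx == 1 && sy == 1 && sz == 1) break;
            sx = std::max(1, sx / 2);
            sy = std::max(1, sy / 2);
            sz = std::max(1, sz / 2);
        }
        printf("standard 3D mipchain: %d images\n", standardCount);   // 256 for this size
        printf("fixed-z (animated) chain: %d images\n", fixedZCount); // 1152 for this size
        return 0;
    }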

  8. 1 minute ago, SpiderPig said:

    That looks good.  Is this an infinite water plane or will it work with any custom shape?

    @Josh A lot has changed since I last compiled a custom shader - is this still possible?  I get a few errors currently; I see Base.frag is now base_frag.glsl but I'm not sure if I should be using that anymore...

    This is currently just a plane generated with CreatePlane, plus some customizations to the PBR and refraction shaders to restore functionality Josh has currently disabled (animated textures, texture scaling). It is more or less a test of how easy it is to get properly refracted materials in the engine. In Leadwerks you have to create your own renderer for custom water (if not deferred) and fetch a refraction and a reflection texture yourself, but in Ultra Engine this works (nearly) out of the box.

    • Like 1
  9. A small experiment with the refraction shader and some modifications to get animated textures to work:

    image.thumb.png.bf5bcb4d2b3d13050596ccf0497d026c.png

    You can see that the water blends nicely with the terrain. The normal map is the animated 3D texture Josh created in his blog. Unfortunately the normal isn't currently stored in the refraction inputs, but this should be no problem later. The blending is calculated from the actual thickness of the water volume and could (in theory) be modified by some kind of visibility parameter, or maybe the alpha value or something like that.

    • Like 4
  10. While Start and Update are verbs, they also have a different meaning than Collide.

    Maybe a guideline would help:

    If something is definitely called every time (like Start or Update), use the plain verb.

    If something is called on a specific condition, it should (in my opinion) be called something like On..., e.g. OnCollision.

     

    The problem with Collide(...) in this context is that it can be misinterpreted: it is actually the reaction to a collision and does not perform the collision itself.
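
    A tiny sketch of the convention I mean (the class and method names are only illustrative, not the engine's actual component interface):

    // Illustration of the proposed naming guideline - not actual engine API.
    class Player : public Component
    {
    public:
        void Start();                               // always called once: plain verb
        void Update();                              // always called every frame: plain verb
        void OnCollision(shared_ptr<Entity> other); // event callback: the "On" prefix makes
                                                    // clear it reacts to a collision rather
                                                    // than performing one
    };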

     

     

    • Like 1
    • Upvote 1
  11. Just a few things I encountered; I am currently evaluating how they could be solved.

    Background: I want to add a very basic ocean (or water) material which first mimics the shallow water of the Leadwerks engine. Therefore, I have checked your animated texture blog post: https://www.ultraengine.com/community/blogs/entry/2715-animated-textures/

    What I noticed: the current shaders don't support animations, and I guess this is due to the changes in the material pipeline. Artifacts of this are still in "base.vert.glsl" and I was able to recreate them, but "fragment.glsl" currently just uses 2D texture lookups and the Texture_Animation define is not passed to the fragment stage.

    If I understand it correctly, the refraction is a deferred step after everything is rendered, so technically it is a posteffect that runs before all other posteffects, takes the transparent objects as input and calculates the refraction based on the material. So back to my thinking: water, like every other liquid, doesn't have a fixed alpha level, so just setting the alpha of an object only works for very rare edge cases like a glass or the fluid in the glass (very thin layers of liquid). Maybe it would be possible to attach some kind of function to the refraction step where the real transparency is based on the real thickness of the current medium (based on the view-dependent ray from the surface to the underlying depth).

    image.thumb.png.bbb51c355b7ec8cedd45871999cc0638.png
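
    The kind of thickness-based transparency I have in mind is basically Beer-Lambert absorption; a minimal sketch (the function name and the absorption coefficient are my own placeholders, not engine parameters):

    #include <cmath>
    
    // Beer-Lambert style transparency: the further the view ray travels through the medium
    // (surface depth vs. underlying scene depth), the more opaque the result becomes.
    float ThicknessAlpha(float thicknessMeters, float absorption)
    {
        return 1.0f - std::exp(-absorption * thicknessMeters);
    }
    
    // e.g. ThicknessAlpha(0.05f, 4.0f) is nearly transparent (a thin film of water),
    // while ThicknessAlpha(5.0f, 4.0f) is almost fully opaque (deep water).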

     

     

    • Upvote 1
  12. A small but simple addition to the CameraControls component:

    class CameraControls : public Component
    {
    public:
        bool freelookstarted = false;
    	float mousesmoothing = 0.0f;
    	float mouselookspeed = 1.0f;
    	float movespeed = 4.0f;
    	Vec2 freelookmousepos;
    	Vec3 freelookrotation;
    	Vec2 lookchange;
    	Vec2 multiplier = Vec2(10.0,0.25);
    
    	KeyCode key_slowdown = KEY_CONTROL;
    	KeyCode key_speedup = KEY_SHIFT;
    	KeyCode key_up = KEY_E;
    	KeyCode key_down = KEY_Q;
    	KeyCode key_left = KEY_A;
    	KeyCode key_right = KEY_D;
    	KeyCode key_forward = KEY_W;
    	KeyCode key_backward = KEY_S;
    
    	bool freelookMode = true;
    
    	virtual void Update()
        {
    		auto window = ActiveWindow();
    		if (window == NULL) return;
    
    		if (freelookMode)
    		{
    			if (!freelookstarted)
    			{
    				freelookstarted = true;
    				freelookrotation = entity->GetRotation(true);
    				freelookmousepos = window->GetMouseAxis();
    			}
    			auto newmousepos = window->GetMouseAxis();
    			lookchange.x = lookchange.x * mousesmoothing + (newmousepos.y - freelookmousepos.y) * 100.0f * mouselookspeed * (1.0f - mousesmoothing);
    			lookchange.y = lookchange.y * mousesmoothing + (newmousepos.x - freelookmousepos.x) * 100.0f * mouselookspeed * (1.0f - mousesmoothing);
    			if (Abs(lookchange.x) < 0.001f) lookchange.x = 0.0f;
    			if (Abs(lookchange.y) < 0.001f) lookchange.y = 0.0f;
    			if (lookchange.x != 0.0f or lookchange.y != 0.0f)
    			{
    				freelookrotation.x += lookchange.x;
    				freelookrotation.y += lookchange.y;
    				entity->SetRotation(freelookrotation, true);
    			}
    			freelookmousepos = newmousepos;
    		}
    
    		float speed = movespeed / 60.0f;
    		if (window->KeyDown(key_speedup))
    		{
    			speed *= multiplier.x;
    		}
    		else if (window->KeyDown(key_slowdown))
    		{
    			speed *= multiplier.y;
    		}
    
    		if (window->KeyDown(key_up)) entity->Translate(0, speed, 0);
    		if (window->KeyDown(key_down)) entity->Translate(0, -speed, 0);
    		if (window->KeyDown(key_right)) entity->Move(speed, 0, 0);
    		if (window->KeyDown(key_left)) entity->Move(-speed, 0, 0);
    		if (window->KeyDown(key_forward)) entity->Move(0, 0, speed);
    		if (window->KeyDown(key_backward)) entity->Move(0, 0, -speed);
        }
    }; 

    It lets you configure the keys for moving and the speed factors for slowing down or speeding up, and it lets you suspend the freelook mode.
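
    Attaching it should work like any other component; a minimal usage sketch (the AddComponent<T>() call is an assumption here, the exact attachment API may differ):

    // Attach the component to a camera entity - the AddComponent call is assumed.
    auto camera = CreateCamera(world);
    auto controls = camera->AddComponent<CameraControls>();
    controls->movespeed = 8.0f;      // tweak the public fields directly
    controls->freelookMode = true;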

    • Like 1
  13. With the latest version I get the following when a terrain is in the scene:

    Program: ...s\Shadow\Documents\Ultra Engine\Environment\Environment_d.exe
    File: C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.33.31629\include\vector
    Line: 1748
    
    Expression: vector subscript out of range
    >	Environment_d.exe!std::vector<class UltraCull::CullingLOD,class std::allocator<class UltraCull::CullingLOD> >::operator[](unsigned __int64)	Unbekannt
     	Environment_d.exe!UltraCull::CullingModel::Draw(class std::shared_ptr<class UltraCull::CameraVisibilityList> &,class std::map<class std::shared_ptr<class UltraRender::RenderMesh>,class std::shared_ptr<class UltraCull::SurfaceBatch>,struct std::less<class std::shared_ptr<class UltraRender::RenderMesh> >,class std::allocator<struct std::pair<class std::shared_ptr<class UltraRender::RenderMesh> const ,class std::shared_ptr<class UltraCull::SurfaceBatch> > > > &,class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingCamera>,class std::shared_ptr<class UltraRender::RenderCamera>,bool,int)	Unbekannt
     	Environment_d.exe!UltraCull::CullingNode::Draw(class std::shared_ptr<class UltraCull::CameraVisibilityList> &,class std::map<class std::shared_ptr<class UltraRender::RenderMesh>,class std::shared_ptr<class UltraCull::SurfaceBatch>,struct std::less<class std::shared_ptr<class UltraRender::RenderMesh> >,class std::allocator<struct std::pair<class std::shared_ptr<class UltraRender::RenderMesh> const ,class std::shared_ptr<class UltraCull::SurfaceBatch> > > > &,class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingCamera>,class std::shared_ptr<class UltraRender::RenderCamera>,bool,int)	Unbekannt
     	Environment_d.exe!UltraCull::CullingTerrain::Draw(class std::shared_ptr<class UltraCull::CameraVisibilityList> &,class std::map<class std::shared_ptr<class UltraRender::RenderMesh>,class std::shared_ptr<class UltraCull::SurfaceBatch>,struct std::less<class std::shared_ptr<class UltraRender::RenderMesh> >,class std::allocator<struct std::pair<class std::shared_ptr<class UltraRender::RenderMesh> const ,class std::shared_ptr<class UltraCull::SurfaceBatch> > > > &,class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingCamera>,class std::shared_ptr<class UltraRender::RenderCamera>,bool,int)	Unbekannt
     	Environment_d.exe!UltraCull::CullingOctreeNode::DrawEntities(class std::shared_ptr<class UltraCull::CameraVisibilityList> &,class std::map<class std::shared_ptr<class UltraRender::RenderMesh>,class std::shared_ptr<class UltraCull::SurfaceBatch>,struct std::less<class std::shared_ptr<class UltraRender::RenderMesh> >,class std::allocator<struct std::pair<class std::shared_ptr<class UltraRender::RenderMesh> const ,class std::shared_ptr<class UltraCull::SurfaceBatch> > > > &,class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingCamera>,class std::shared_ptr<class UltraRender::RenderCamera>,class UltraEngine::Vec3 const &,int,int,bool,int)	Unbekannt
     	Environment_d.exe!UltraCull::CullingOctreeNode::Draw(class std::shared_ptr<class UltraCull::CameraVisibilityList> &,class std::map<class std::shared_ptr<class UltraRender::RenderMesh>,class std::shared_ptr<class UltraCull::SurfaceBatch>,struct std::less<class std::shared_ptr<class UltraRender::RenderMesh> >,class std::allocator<struct std::pair<class std::shared_ptr<class UltraRender::RenderMesh> const ,class std::shared_ptr<class UltraCull::SurfaceBatch> > > > &,class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingCamera>,class std::shared_ptr<class UltraRender::RenderCamera>,class UltraEngine::Vec3 const &,int,bool,bool,int)	Unbekannt
     	Environment_d.exe!UltraCull::CullingCamera::BuildVisibilityLists(class std::shared_ptr<class UltraCull::CullingWorld>,class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::VisibilityList>)	Unbekannt
     	Environment_d.exe!UltraCull::CullingWorld::BuildVisibilityLists(class std::shared_ptr<class UltraCull::CullingBuffer>,class std::shared_ptr<class UltraCull::CullingWorld>)	Unbekannt
     	Environment_d.exe!UltraCull::CullingThreadManager::Update(bool)	Unbekannt
     	Environment_d.exe!UltraCore::ThreadManager::EntryPoint(class std::shared_ptr<class UltraEngine::Object>)	Unbekannt
     	Environment_d.exe!UltraEngine::Thread::thread_function(void *)	Unbekannt
     	ucrtbased.dll!00007fffce663010()	Unbekannt
     	kernel32.dll!00007ff86b7374b4()	Unbekannt
     	ntdll.dll!00007ff86bde26a1()	Unbekannt

     

  14. I tested some of the terrain samples and modified them a bit:

    Some things I noticed:

    • LoadHeightmap currently only supports r16 files in a specific format; other image files (like png) just result in flat terrains.
      • auto pixmap = LoadPixmap("Materials/hm_riverside.png");
        pixmap = pixmap->Resize(terrainSize, terrainSize);
        
        for (int x = 0; x < terrainSize; x++)
            for (int y = 0; y < terrainSize; y++)
            {
                auto p = Red(pixmap->ReadPixel(x, y)) / 255.0;
                terrain->SetHeight(x, y, p);
            }
      • The above sample works; maybe a heightmap loader plugin would help.
    • pixmap->Sample(x,y) just returns Vec4(0) for every pixel
    • Heightmaps bigger than 1024 will result in an Access Violation exception in newton_d.dll.
    • Tessellation seems to be broken currently (not a big priority, but it would be nice to see it in action):
      • The terrain has gaps or is just black for everything not painted with the first material.
      • The skybox is black as well when using tessellation.
    14 hours ago, DoomSlayer said:

    Wow! This is water simulation in Ultra Engine?

    No, but it is on its way ;)

    @Josh: A nice addition to the camera system would be a way to specify an override for the shader family or pass; this way you could set up cameras which just render depth data and don't use any other material switches. And I don't know if it is already there: clip planes.

    Background: I am experimenting with a way (I have done this in Leadwerks) to create a top-down flowmap / SDF field for river and ocean rendering (foam detection etc.).

     

     

    • Thanks 1
  15. Well, it wouldn't break it, but it would make access to it a bit harder. I don't mean my implementations, as I already create my own views for cubemaps, but why create extra views if the correct ones are already there? Providing this info doesn't hurt and could encourage other users to extend the engine. Maybe, and I think I have mentioned this idea already, it would be nicer to have some external scope to get all this information. That way you can keep the main classes clean, but for those who need it you can provide this information (undocumented of course, maybe supported later).
    I could imagine something like an extra namespace like UltraEngine::Transfer, where you could have something like getvktexturedata(shared_ptr<Texture>).

    This is just my opinion: I like it when the code is protected, but also accessible for advanced use cases. The decision is up to you.
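
    Something along these lines is what I imagine (entirely hypothetical, just to make the idea concrete; none of these declarations exist in the engine):

    // Hypothetical "advanced access" scope - nothing here is existing engine API.
    namespace UltraEngine::Transfer
    {
        // Raw Vulkan handles behind a texture, for users extending the renderer themselves.
        struct VkTextureData
        {
            VkImage image;
            VkImageView view;
            VkFormat format;
        };
    
        VkTextureData getvktexturedata(shared_ptr<Texture> texture);
    }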

    • Upvote 1
  16. I am currently refactoring and reorganizing my compute pipeline to gain more performance and to make it a bit more flexible to use:

    In my current implementation you always have one shader with one set of data, so internally the descriptor set is always switched and the buffers and textures are rebound on every dispatch. While this is no problem for one-shot computes, it can and will slow down the pipeline for recurring compute passes like particles or realtime PBR calculations. So I started refactoring it to allow shared descriptors between multiple compute shaders. You could call them something like ShaderGroups: you can still get the same behaviour as with my first approach, but you can also group several compute shaders with the same descriptor set into one batch.

    This is how it will look (kind of pseudo code):

    	auto pipeline = CreateComputePipeline(world, ComputeHook::TRANSFER);
    	auto sharedPipelineDescriptor = CreateComputeDescriptor();
    	sharedPipelineDescriptor->AddSampler(0, texture1, 0); //slot, texture, set
    	sharedPipelineDescriptor->AddSampler(1, texture2, 0); //slot, texture, set
    	sharedPipelineDescriptor->AddSampler(0, texture3, 1); //slot, texture, set
    	sharedPipelineDescriptor->AddUniformBuffer(1, &buffer, 1); //slot, buffer, set
    	sharedPipelineDescriptor->SetPushConstantSize(sizeof(PushData));
    
    	auto environmentComputeShader = CreateComputeShader("compute/environment.comp.spv", sharedPipelineDescriptor);
    	auto particleComputeShader = CreateComputeShader("compute/particleDescriptor.comp.spv", sharedPipelineDescriptor);
    	
    	environmentComputeShader->SetPushConstant(&pushdata1);
    	particleComputeShader->SetPushConstant(&pushdata2);
    	environmentComputeShader->SetDispatchSize(16,16,1);
    	particleComputeShader->SetDispatchSize(8,8,8);
    
    	pipeline->AddShader(particleComputeShader);
    	pipeline->AddShader(environmentComputeShader);
    
    	pipeline->Dispatch();

    What do you think? Any ideas on how to improve the code experience?

     

  17. The new character controller and physics look very smooth.

    The PBR pipeline still seems to be somehow broken: the result from this sample

     

     

    now looks like this:

     image.thumb.png.92b3b218faf9789b7de828f4bc99c155.png

    Also it seems that the IBL textures have nearly no effect on the ambient colors.

     

    Side question: is it possible to get the current VkRenderPass in the render hook? If not, is it possible to add a custom render pipeline or render pass to the actual renderer?

  18. Another small feature request:

    Currently you can configure the Validation_Layers the Vulkan backend uses through the json file. How about the same for enabled device features?

    Background: for some things, like collecting rendering stats, additional features are required which need to be enabled at creation time of the device,

    e.g. pipelineStatisticsQuery.

    It would also be nice to have these settings available from code, so you could add and remove features in code before starting the actual rendering.
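
    For context, this is the kind of thing that has to happen when the logical device is created and can't be changed afterwards; a plain Vulkan sketch (not engine API) of enabling that feature:

    // Plain Vulkan: device features such as pipelineStatisticsQuery must be requested
    // when the logical device is created; they cannot be toggled later at runtime.
    VkPhysicalDeviceFeatures enabledFeatures = {};
    enabledFeatures.pipelineStatisticsQuery = VK_TRUE;
    
    VkDeviceCreateInfo deviceInfo = {};
    deviceInfo.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    deviceInfo.pEnabledFeatures = &enabledFeatures;
    // ... queue create infos, extensions etc. as usual, then vkCreateDevice(...)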
