
SpiderPig

Members
  • Posts

    2,306
  • Joined

  • Last visited

Everything posted by SpiderPig

  1. Okay, I'll just keep reporting if it breaks again.
  2. I'm running Visual Studio 2022 v17.8.2 and Release builds will no longer compile. I have the latest Ultra update. I tried rolling back VS to before the last update, but the rollback only seems to go back one version and then the option is no longer available. What do you suggest here @Josh? Should we hold off updating VS until you've updated on your end?
  3. Tried that. I can't seem to reproduce it in a smaller project yet either...
  4. Hey @Josh I seem to be getting a lot of random crashes since updating an hour ago. Sometimes it crashes to memory.h but sometimes there's no useful information at all. I'll see if I can replicate it in a smaller project...
  5. Yeah I think I'll just use 2D GUI and anything 3D just overlay in my main window.
  6. I'm using Ultra as a simulator for an AI that will eventually be deployed to physical hardware. I just wanted a few diagnostic screens that presented data in a few real-time 3D viewports, but I think there are other ways I can achieve similar results. Multiple 3D windows was just my preferred option. Ultimately, if a panel could have a texture assigned without being a 3D interface, that would work best. Thanks for giving it a try.
  7. I only need one world. But if it needed more to work that's fine too.
  8. I have three screens and wanted to put one window on each, but another option might be to create one window, stretch it across all three screens, and then create whatever 3D viewports I need inside the interface.
  9. So far the only way I can get something to work is by rendering a camera to a texture buffer, then taking a snapshot of that buffer per frame and setting the resulting pixmap to the panel. Dreadfully slow, but no extra framebuffer required. I may have to be satisfied with just one window then.
  10. Wouldn't I still need a framebuffer for the second window? The problem seems to be with two framebuffers...
  11. That's what I was thinking too. Though it seems a panel can only have a texture assigned to it if the interface has been created with a world, and I can't get two interfaces working with one world. I can create another world for this purpose, but then I need two framebuffers, one for each window, and that doesn't work very well.
  12. Is it possible to create two or more windows, each with a UI and 3D content? I can't think of a way to do it that doesn't involve multiple framebuffers.
  13. I fixed it:

```cpp
auto ui = CreateInterface(world, font, framebuffer->size);
ui->background->SetColor(0, 0, 0, 0);
ui->SetRenderLayers(RENDERLAYER_1); // <----- Needs this!!
```
  14. Thanks, I'll try a different approach then.
  15. I'm rendering two worlds to two separate windows with a UI overlay. The moment I render the second world it causes the FPS to drop, and rendering becomes very choppy in both windows. I've also noticed that the window position in CreateWindow() seems to have no effect.

```cpp
#include "UltraEngine.h"
using namespace UltraEngine;

#define RENDERLAYER_1 2
#define RENDERLAYER_2 4

int main(int argc, const char* argv[])
{
    auto displays = GetDisplays();

    // WINDOW A
    auto window = CreateWindow("WindowA", 0, 0, 512, 512, displays[0]);
    auto framebuffer = CreateFramebuffer(window);
    auto world = CreateWorld();
    world->RecordStats(true);
    auto font = LoadFont("Fonts/arial.ttf");
    auto camera = CreateCamera(world);
    camera->SetPosition(0, 0, -2);
    auto ui_camera = CreateCamera(world, UltraEngine::PROJECTION_ORTHOGRAPHIC);
    ui_camera->SetRenderLayers(RENDERLAYER_1);
    ui_camera->SetPosition(framebuffer->size.x / 2.0f, framebuffer->size.y / 2.0f);
    ui_camera->SetClearMode(UltraEngine::CLEAR_DEPTH);
    auto ui = CreateInterface(world, font, framebuffer->size);
    ui->background->SetColor(0, 0, 0, 0);
    ui->SetRenderLayers(RENDERLAYER_1);

    // WINDOW B
    auto worldB = CreateWorld();
    worldB->RecordStats(true);
    auto windowB = CreateWindow("WindowB", 800, 0, 512, 512, displays[0]);
    auto framebufferB = CreateFramebuffer(windowB);
    auto cameraB = CreateCamera(worldB);
    cameraB->SetPosition(0, 0, -2);
    auto ui_cameraB = CreateCamera(worldB, UltraEngine::PROJECTION_ORTHOGRAPHIC);
    ui_cameraB->SetRenderLayers(RENDERLAYER_2);
    ui_cameraB->SetPosition(framebufferB->size.x / 2.0f, framebufferB->size.y / 2.0f);
    ui_cameraB->SetClearMode(UltraEngine::CLEAR_DEPTH);
    auto uiB = CreateInterface(worldB, font, framebufferB->size);
    uiB->background->SetColor(0, 0, 0, 0);
    uiB->SetRenderLayers(RENDERLAYER_2);

    auto fpsA = CreateLabel("FPS : 0", 10, 10, 100, 20, ui->root);
    auto fpsB = CreateLabel("FPS : 0", 10, 10, 100, 20, uiB->root);

    auto boxA = CreateBox(world);
    boxA->SetColor(1, 0, 0);
    auto boxB = CreateBox(worldB);
    boxB->SetColor(0, 1, 0);

    while (true)
    {
        while (UltraEngine::PeekEvent())
        {
            auto event = UltraEngine::WaitEvent();
            switch (event.id)
            {
            case UltraEngine::EVENT_WINDOWCLOSE:
                if (event.source == window) { return 0; }
                break;
            }
            ui->ProcessEvent(event);
            uiB->ProcessEvent(event);
        }
        fpsA->SetText("FPS : " + WString(world->renderstats.framerate));
        fpsB->SetText("FPS : " + WString(worldB->renderstats.framerate));
        boxA->Turn(0.1f, 0.1f, 0.2f);
        boxB->Turn(0.2f, 0.1f, 0.1f);
        worldB->Update();
        worldB->Render(framebufferB, false);
        world->Update();
        world->Render(framebuffer, false);
    }
    return 0;
}
```
  16. While the example here works, it seems that if you overlay an interface over a 3D environment, the panel won't show the assigned texture. I first noticed this in my project, where I wanted to show the real-time render of a second camera on the panel.

```cpp
#include "UltraEngine.h"
using namespace UltraEngine;

int main(int argc, const char* argv[])
{
    auto displays = GetDisplays();
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0]);
    auto framebuffer = CreateFramebuffer(window);
    auto world = CreateWorld();
    auto font = LoadFont("Fonts/arial.ttf");
    auto ui = CreateInterface(world, font, framebuffer->size);
    ui->background->SetColor(0, 0, 0, 0);

    // Create and position camera
    auto camera = CreateCamera(world);
    camera->SetPosition(0, 1, -2);

    int RENDERLAYER_1 = 2;
    auto ui_camera = CreateCamera(world, UltraEngine::PROJECTION_ORTHOGRAPHIC);
    ui_camera->SetRenderLayers(RENDERLAYER_1);
    ui_camera->SetPosition(framebuffer->size.x / 2.0f, framebuffer->size.y / 2.0f);
    ui_camera->SetClearMode(UltraEngine::CLEAR_DEPTH);

    auto panel = CreatePanel(0, 0, 512, 512, ui->root);
    auto tex = LoadTexture("https://raw.githubusercontent.com/UltraEngine/Documentation/master/Assets/Materials/Ground/river_small_rocks_diff_4k.dds");
    panel->SetTexture(tex);

    auto box = CreateBox(world);
    box->SetColor(1, 0, 0);

    while (true)
    {
        while (UltraEngine::PeekEvent())
        {
            auto event = UltraEngine::WaitEvent();
            switch (event.id)
            {
            case UltraEngine::EVENT_WINDOWCLOSE:
                if (event.source == window) { return 0; }
                break;
            }
            ui->ProcessEvent(event);
        }
        world->Update();
        world->Render(framebuffer);
    }
    return 0;
}
```
  17. Thanks, that worked a treat. Should we be able to change the format of a texture buffer, like to RGBA, or is RGBA16 the only format that will work for camera targets?
  18. Thanks. It's actually three times slower to do that per pixel than to convert the whole pixmap and access the pixel data directly. Am I correct in saying that for an RGBA16 format, the data here is 2 bytes per channel?

```cpp
char* data = pixmap->pixels->Data();
```

I need a value between 0 and 255, but I'm not positive that bit shifting like this is the right way, and then I'm not positive how to get back to the 0-255 range:

```cpp
auto r = (data[index] << 8) | data[index + 1];
auto g = (data[index + 2] << 8) | data[index + 3];
auto b = (data[index + 4] << 8) | data[index + 5];
auto a = (data[index + 6] << 8) | data[index + 7];
index += 8;
```
  19. How can I access the pixel data from a pixmap generated with TextureBuffer::Capture()? I want to access the pixel data directly like this:

```cpp
auto captures = texture_buffer->GetCaptures();
if (captures.size() != 0)
{
    auto pixmap = captures[0];
    auto data = pixmap->pixels->Data();
    int x = 0, y = 0, stride = 4;
    auto i = ((y * pixmap->size.x) + x) * stride;
    auto r = data[0];
    auto g = data[1];
    auto b = data[2];
    auto a = data[3];
}
```

I've also tried this, but it shows an error saying "Bytes per block must be four or less":

```cpp
auto pix = pixmap->ReadPixel(x, y);
auto r = Red(pix);
auto g = Green(pix);
auto b = Blue(pix);
auto a = Alpha(pix);
```

I can convert the pixmap from TEXTURE_RGB16 to TEXTURE_RGBA and both approaches work, but the conversion is far too slow for my situation (I'm doing it once per frame).
  20. Can we please get a GetMaterial() method for the Sprite class?
  21. Just a minor error in the C++ section for Framebuffer::Capture():

```cpp
auto path = GetPath(PATH_DESKTOP) .. "/screenshot.jpg";
pixmap:Save(path);
```

Should be:

```cpp
auto path = GetPath(PATH_DESKTOP) + "/screenshot.jpg";
pixmap->Save(path);
```
  22. Perfect! I'll give it a go. Thank you.
  23. I'm thinking about giving this a go, as it's not really a real-time application I want this for. How slow do you think this could be for a 512x512 image rendered with a camera? 1 FPS?
  24. SpiderPig

    Finishing Touches

    Good to see all your hard work has paid off.
  25. I tried this some time ago. I only ever got PNG to export in the glTF file, even if I imported the textures as DDS. What I ended up doing was, as you say, using the asset editor.