
Rastar

Posts posted by Rastar

  1. @Olby: That is a very important argument. However, we one-man shows are probably more dependent on easy access to assets than studios are. With a tool like Quixel's Ddo, even I (and I'm by no means an artist) can produce great-looking assets that just work in the engine. And I would bet that with all the big engines on PBR now, sites like Arteria3D and Dexsoft will go that way as well.

  2. @YouGroove: I'm afraid it isn't as simple as that. PBR shading also uses diffuse maps, normal maps etc., but they are prepared differently. For example, the diffuse map must not contain any lighting information (like baked AO) but just the physical diffuse color. Compared to a traditional diffuse map, the PBR variant usually looks very bright and flat.

     

    Metals are handled differently: in the specular PBR workflow the diffuse color of a metal is black (0,0,0), and all color information comes from the (RGB) specular color, while non-metals typically have a grey specular color with a value of about 5% across the board.

     

    The glossiness/roughness values are characteristic of a material, so you use physically measured values for them (which also means you get realistic results if you stick to that). Then there is energy conservation, which makes sure a material doesn't reflect more light than it receives.

     

    And as I wrote, you have to work completely in linear space to get good results, which e.g. requires converting the textures (which are usually encoded in gamma space) to linear space before shading (and then back to gamma for display on a monitor). And preferably you use HDR textures (which are tone-mapped back to 8 bit) to work with higher precision and a greater color range.
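
    A minimal sketch of that conversion, just the standard sRGB formulas written out in C++ (nothing Leadwerks-specific, and the function names are made up):

    #include <cmath>

    // Exact sRGB <-> linear conversion; a simple pow(c, 2.2) is a common approximation.
    float SrgbToLinear(float c) {
        return (c <= 0.04045f) ? c / 12.92f : std::pow((c + 0.055f) / 1.055f, 2.4f);
    }

    float LinearToSrgb(float c) {
        return (c <= 0.0031308f) ? c * 12.92f : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
    }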

  3. Are we talking about the same thing here? I don't mean image-based lighting with spherical harmonics parameters precomputed from an (HDR) background texture captured using light probes. That is used for indirect lighting (and, by the way, is the method of choice for global illumination in the current CryEngine).

     

    I mean changing the shading calculations to include energy conservation, microsurface structure (similar to Cook-Torrance) and real-world material properties (instead of ad-hoc specularity parameters). This simply gives more consistent shading under different lighting conditions. Have a look here (and at the little sketch after the links):

     

    http://www.marmoset.co/toolbag/learn/pbr-theory

    https://www.marmoset.co/toolbag/learn/pbr-practice
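
    The little sketch I mean: a toy energy-conserving specular term (normalized Blinn-Phong plus a Schlick Fresnel approximation), written in C++ for readability. This only illustrates the idea, it is not the Leadwerks shading code, and the names are made up:

    #include <cmath>
    #include <algorithm>

    struct Vec3f { float x, y, z; };

    float Dot(const Vec3f& a, const Vec3f& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // f0 = reflectance at normal incidence (~0.04, i.e. roughly 4-5%, for most non-metals)
    float FresnelSchlick(float f0, float cosTheta) {
        return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
    }

    // n, h, v: unit normal, half-vector and view vector
    float SpecularBlinnPhong(const Vec3f& n, const Vec3f& h, const Vec3f& v,
                             float shininess, float f0) {
        const float pi = 3.14159265f;
        float ndoth = std::max(Dot(n, h), 0.0f);
        float norm  = (shininess + 8.0f) / (8.0f * pi);   // keeps the lobe roughly energy-conserving
        return norm * std::pow(ndoth, shininess) * FresnelSchlick(f0, std::max(Dot(h, v), 0.0f));
    }

    The diffuse term is then weighted so that diffuse plus specular together never exceed the incoming light.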

    • Upvote 2
  4. I was wondering,

     

    are there any plans for moving to physically-based rendering in the short/medium term? There are so many DCC tools moving to PBR, it's getting more difficult to get along without it... Substance Painter+Designer, Quixel DDo+its Megascans library, of course Marmoset Toolbag, 3DCoat will have PBR materials soon... and I think it looks great as well and makes an artist's life easier. Of course, textures+shaders in linear color space and HDR would be prerequisites, but they were on the menu anyway, weren't they ;)

     

    But joking aside, I'm really wondering if/when this will be coming to Leadwerks.

    • Upvote 2
  5. Leadwerks is cool because it

     

    - has a simple, clean API that allows you to do pretty low-level stuff in an easy way (e.g. procedural mesh generation)

    - provides full access to all shader stages, making it the perfect tool to try out rendering algorithms

    - has a very friendly and helpful community

    - is affordable and royalty-free.

    • Upvote 1
  6. It's more about the dynamic tessellation of grids in terrain and ocean rendering. The idea is to render the vertices plus specified attributes into a buffer that stays on the GPU. That way the mesh generated by the tessellation and/or geometry shaders wouldn't have to be redone from scratch every frame, but could instead be modified in subsequent frames to account for camera movement etc.

     

    As an example, in this paper they are using transform feedbacks for exactly this purpose:

     

    http://www.hindawi.com/journals/tswj/2014/979418/
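
    A very rough, raw-OpenGL sketch of such a capture pass (this is plain GL, not the Leadwerks API; the program, the buffer and the "outPosition" varying name are made up):

    #include <GL/glew.h>   // or whatever GL loader is in use

    // One-time setup: tell the linker which shader outputs to capture.
    void SetupFeedbackCapture(GLuint prog) {
        const char* varyings[] = { "outPosition" };   // add more attributes as needed
        glTransformFeedbackVaryings(prog, 1, varyings, GL_INTERLEAVED_ATTRIBS);
        glLinkProgram(prog);   // must be (re)linked after declaring the varyings
    }

    // Run the tessellation pipeline and capture the generated vertices into
    // feedbackBuffer instead of rasterizing them.
    void CaptureTessellatedGrid(GLuint prog, GLuint feedbackBuffer, GLsizei patchVertexCount) {
        glUseProgram(prog);
        glPatchParameteri(GL_PATCH_VERTICES, 4);   // quad patches, just as an example
        glEnable(GL_RASTERIZER_DISCARD);           // we only want the vertices, no pixels
        glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, feedbackBuffer);

        glBeginTransformFeedback(GL_TRIANGLES);    // matches the tessellation evaluation output
        glDrawArrays(GL_PATCHES, 0, patchVertexCount);
        glEndTransformFeedback();

        glDisable(GL_RASTERIZER_DISCARD);
        // feedbackBuffer now holds the expanded mesh and can be reused/refined next frame.
    }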

  7. Is it currently possible to use transform feedbacks with LE 3.2? If not, is there a chance this might be added? Maybe even in 3.3? :rolleyes:

    Transform feedbacks would make the geometry and tessellation shaders even more valuable, and there are a number of very interesting algorithms, especially for (guess what) terrain and ocean rendering, that make heavy use of them.

     

    EDIT: Oh, and I would be happy with a purely experimental, fully undocumented and completely unsupported implementation ;)

  8. A cutoff value of 0.5 is hard-coded in the fragment shader. I don't have Leadwerks access right now, but it should read something like

     

    // alpha test: discard texels whose alpha is below the hard-coded 0.5 cutoff
    if (color.a < 0.5) discard;
    

     

    You could lower that value to see if it gives better results (maybe first saving a copy of the shader). Also, do the leaf textures use color bleed, meaning the leaf edges are smeared or surrounded by some green color? That helps with some artifacts you might get with alpha testing and mipmapping.

  9. OK, I did some more tests, this time without the confusing terrain stuff... It seems it is not really the call to SetPosition() itself but rather the number of objects that causes the slowdown. However, I still don't really understand why.

     

    Attached is a little App (sketched below) that

     

    1) creates a pivot

     

    2) creates a configurable number of boxes/meshes/pivots (boxCount in line 40) and parents them to the pivot

     

    3) sets the pivot's position every game loop (line 97).

     

    Running in debug mode I get

     

    ~135 fps for 0 children

    ~105 fps for 100 children

    ~65-75 fps for 250 children

    ~45-60 fps for 500 children

     

    Results for the three entity types are similar. There is nothing else in the scene, not even a light. I wouldn't expect that sort of slowdown - am I doing something fundamentally wrong? Does the scene tree not like flat and wide structures?

     

    Thanks for any hints!

     

    App.cpp
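
    The relevant parts look roughly like this - a sketch from memory, not a copy of the attached App.cpp, and it assumes the usual Leadwerks 3 C++ calls (Pivot::Create, Model::Box, Entity::SetParent, Entity::SetPosition):

    #include "Leadwerks.h"
    using namespace Leadwerks;

    const int boxCount = 250;          // number of children parented to the pivot
    Pivot* pivot = NULL;

    // Steps 1 and 2: one pivot with boxCount children attached to it
    void CreateTestHierarchy() {
        pivot = Pivot::Create();
        for (int i = 0; i < boxCount; ++i) {
            Model* box = Model::Box();
            box->SetPosition((float)i, 0.0f, 0.0f);
            box->SetParent(pivot);
        }
    }

    // Step 3: called once per game loop; this single call produces the slowdown
    void MovePivot(float t) {
        pivot->SetPosition(0.0f, 0.0f, t * 0.001f);
    }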

  10. While looking around in the Leadwerks header files, I noticed a Get/SetHDR() member in the camera class. That of course made me curious:

     

    1) Is it already possible to render to HDR buffers/textures? If so, is there also an option to define the tone mapping (or how is this mapped back to 8 bits/channel)?

     

    2) This raises a second question: is Leadwerks able to do the rendering in linear space? The default shaders obviously expect the colors to be in linear space (there is no gamma up/downscaling there). However, a typical texture will be provided in gamma space. Does Leadwerks do any conversions here? I know that there are hardware sampler states for this, so that the GPU does the conversion automatically upon reading/writing. Are they used, or is it possible to activate that by configuration?
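
    For reference, this is the raw OpenGL mechanism I mean (plain GL, not a documented Leadwerks setting; the names are made up): an sRGB internal format makes the sampler convert gamma to linear on reads, and GL_FRAMEBUFFER_SRGB converts linear back to gamma on writes.

    #include <GL/glew.h>

    void UploadSrgbTexture(const unsigned char* pixels, int w, int h) {
        GLuint tex;
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        // GL_SRGB8_ALPHA8: texels are stored in gamma space, but sampled as linear values
        glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        glGenerateMipmap(GL_TEXTURE_2D);
    }

    void EnableGammaCorrectOutput() {
        // Shader outputs stay linear; the hardware encodes them to gamma on write.
        glEnable(GL_FRAMEBUFFER_SRGB);
    }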

  11. Yes, sorry for the confusion. I am not using the Leadwerks terrain but my own terrain implementation, which does indeed consist of mesh patches. And I'd like to move it with the camera, since the mesh density is higher close to the entity's origin, so it should be centered around the viewer. Think of moving a magnifying glass over a map.

  12. It's just the one call per game loop. And actually, I have the same drop when I parent my entity (and its children) directly to the camera.

     

    EDIT: Yes, I am actually moving the terrain. I don't think the number of vertices plays a role here, since they are in object space coordinates and not moved individually. No, something else must be updated during the SetPosition() call, since the position transformation itself should be pretty fast.

  13. I am playing around with a custom terrain once again. The terrain has a number of patches parented to it, depending on the terrain size these can be between a few dozen and several hundred. They are all directly parented to the terrain, no sub-hierarchy.

     

    I have added a hook

     

    void CameraPositionHook(Entity* entity) {
        MT::Terrain* terrain = static_cast<MT::Terrain*>(entity);
        Camera* camera = terrain->GetCamera();
        if (camera == NULL) return;        // check the camera before using it
        Vec3 position = camera->position;  // follow the camera horizontally...
        position.y = 0;                    // ...but keep the terrain at height 0
        terrain->SetPosition(position);
    }
    

     

    which I add as an UpdateWorldHook to the terrain, and the SetPosition() call causes the observed frame rate drop.

  14. I would like to move an entity together with the camera. Since I don't want it to rotate with the camera, I just called ent->SetPosition(camera->position + offset) every frame. However, that seems to be really expensive - doing so gives me a framerate drop from 100-110 down to 50-60 (in Debug). Is Leadwerks doing some expensive calculations during this call?

     

    EDIT: OK, the frame rate drop seems to depend on the number of children that my entity has, so I guess the engine goes down the tree and recalculates the transform? However, that still seems to be too costly.
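
    Just to illustrate why an eager update scales with the number of children, and what a lazy scheme could look like - a generic scene-graph sketch, not Leadwerks internals:

    #include <vector>

    // Generic scene-graph node (not Leadwerks code). SetPosition() only flags the
    // subtree as dirty; world matrices are rebuilt at most once, when actually needed.
    struct Node {
        std::vector<Node*> children;
        bool worldDirty = true;
        // Mat4 local, world;             // omitted for brevity

        void SetPosition(/*const Vec3& p*/) {
            // update the local matrix here ...
            MarkDirty();                   // cheap: flag writes only, no matrix math
        }

        void MarkDirty() {
            if (worldDirty) return;        // subtree was already flagged
            worldDirty = true;
            for (Node* c : children) c->MarkDirty();
        }

        void UpdateWorld(/*const Mat4& parentWorld*/) {
            if (worldDirty) {
                // world = parentWorld * local;   // done at most once per node per frame
                worldDirty = false;
            }
            for (Node* c : children) c->UpdateWorld(/*world*/);
        }
    };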

  15. I don't think you need a lot of vector *calculus* in game programming, only if you do a lot of physics stuff (and even then maybe only if you write your own physics engine).

     

    Mostly you'll be dealing with plain vector algebra. Maxwell's equations, the gradient/Laplace operators... that is the territory of partial differential equations, which are too costly to solve in real time most of the time anyway.
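
    To be concrete about the kind of vector algebra that does come up all the time (just a generic illustration, nothing engine-specific): a Lambert diffuse term is nothing but a dot product.

    #include <algorithm>

    struct Vec3f { float x, y, z; };

    float Dot(const Vec3f& a, const Vec3f& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    // n = unit surface normal, l = unit direction towards the light
    float LambertDiffuse(const Vec3f& n, const Vec3f& l) {
        return std::max(Dot(n, l), 0.0f);   // surfaces facing away receive no light
    }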

    • Upvote 1
  16. I know I'm getting on everybody's nerves (at least Josh's) :rolleyes: but...

     

    API for editor plugins, pretty please!

     

    As a teaser: two days ago, Side Effects released a very affordable Indie version of Houdini - and also of the Houdini Engine, which has an API for embedding it in other applications. API-wise it shouldn't be too hard to hook that up to Leadwerks - if only there were an editor plugin API for it... And then you could procedurally generate geometry, create road networks etc.

     

    Pretty please...

    • Upvote 2