Blog Comments posted by Rastar

  1. Very excited that you're considering large-scale terrain for Leadwerks 5; that topic has been an interest of mine for ages. I find the algorithm from "Real-Time Deformable Terrain Rendering with DirectX 11" by Egor Yusov in GPU Pro 3 particularly interesting. The basic observation is that the terrain height at any grid point usually deviates only slightly from a value linearly interpolated from its neighbors, so only the delta from that interpolation is stored, giving a data set that can be compressed very efficiently. The terrain is then reconstructed at run time from those delta sets according to the required LOD level. There might be some interesting ideas for you in there (and by now OpenGL also provides the relevant features, like texture arrays).
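
    A minimal sketch of the encoding idea as I read it (the function names are mine, this is not Yusov's actual code): predict each finer-level sample from its coarse neighbors and store only the difference.

        // Sketch of the residual scheme described above (hypothetical helper functions).
        // A sample between two coarse-level neighbors is predicted by linear interpolation;
        // only the (small, well-compressible) difference is stored.
        float predictHeight(float hLeft, float hRight)
        {
            return 0.5 * (hLeft + hRight);
        }

        float encodeDelta(float hActual, float hLeft, float hRight)
        {
            return hActual - predictHeight(hLeft, hRight);   // goes into the compressed data set
        }

        float reconstructHeight(float delta, float hLeft, float hRight)
        {
            return predictHeight(hLeft, hRight) + delta;     // done at run time, per LOD level
        }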

  2. That's cool, I'll have to try them out. I had already started working on some environment probes myself for my PBR experiments, but without editor support the workflow is a little cumbersome.

    Question: Is there, or will there be, an API to generate the cubemaps at runtime? I know this is time-consuming, but for changing outdoor conditions (time of day) it would be nice if the maps could be regenerated at runtime, say every 10 minutes or so.

  3. Ooooooh, that's one feature I've been eagerly waiting for :) Really happy that you're improving the graphics of Leadwerks again. Question: I'd be interested to play around with several tone mapping operators (I guess similar to what the "iris adjustment" does) - will the iris adjustment be a simple post-processing effect attached to the camera (in which case it could easily be replaced)?
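
    Just to illustrate what I mean by swapping operators, here is a rough sketch of such a camera post-processing shader using a plain Reinhard curve; the uniform and texture names are made up, not the actual Leadwerks conventions.

        #version 400
        // Hypothetical post-effect fragment shader: exposure plus Reinhard tone mapping,
        // then conversion back to gamma space. Names are made up for illustration.
        uniform sampler2D hdrBuffer;   // the scene rendered to an HDR target
        uniform float exposure;        // could be driven by the iris adjustment's average luminance

        in vec2 vTexCoords;
        out vec4 fragColor;

        void main()
        {
            vec3 hdr    = texture(hdrBuffer, vTexCoords).rgb * exposure;
            vec3 mapped = hdr / (hdr + vec3(1.0));                // Reinhard operator
            fragColor   = vec4(pow(mapped, vec3(1.0 / 2.2)), 1.0);
        }

    Trying a different operator would then just mean replacing the one "mapped" line.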

  4. Actually I did, but in all those back-and-forth changes between the two pipelines I must have made a mistake. I created a clean new project and did the "classic" shot again:

     

    http://images.akamai.steamusercontent.com/ugc/318997198277429861/AA729F838AEFA16B397DE94A706744F4ED18E51C/

     

    But a direct comparison is really difficult. For example, in this new shot the light intensity is higher (1.2 instead of 0.8) than in the PBR shots, to bring out some highlights.

  5. That's actually not too easy, because the assets have to be so different. I tried the following: I used a barrel model from Dexsoft's Industrial3 collection and applied a rusted steel texture from gametextures.com (which is available in both classic and PBR variants):

     

    Classic

     

    http://images.akamai.steamusercontent.com/ugc/318997198276597375/6C91BE3851233BE9365238615B71AC72E1012060/

     

     

    and PBR

     

    http://images.akamai.steamusercontent.com/ugc/318997198276599144/73A9480C1634DF36C7D0BE8E3DBBAF46143C8179/

  6. Can you not just pass your data in place of the inherent ambientlight uniform?

     

    I am using spherical harmonics (think Marmoset Skyshop), which have 9 float parameters, so the 4 channels of the ambient color aren't sufficient. Come to think of it: I am currently using just one cubemap, but there could be many in the scene, so the coefficients (and reflection cubemap values) would depend on the object's position. It might be best to keep these calculations in the materials.
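
    For context, this is roughly how those coefficients get evaluated in the shader (a standard second-order SH evaluation; the uniform name is mine, and the coefficients are assumed to already include the band constants and cosine-lobe weights):

        // Sketch: diffuse ambient term from second-order spherical harmonics.
        // shCoeffs holds nine RGB coefficients for one probe; n is the world-space normal.
        uniform vec3 shCoeffs[9];

        vec3 evalSH(vec3 n)
        {
            return shCoeffs[0]
                 + shCoeffs[1] * n.y
                 + shCoeffs[2] * n.z
                 + shCoeffs[3] * n.x
                 + shCoeffs[4] * (n.x * n.y)
                 + shCoeffs[5] * (n.y * n.z)
                 + shCoeffs[6] * (3.0 * n.z * n.z - 1.0)
                 + shCoeffs[7] * (n.x * n.z)
                 + shCoeffs[8] * (n.x * n.x - n.y * n.y);
        }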

  7. I can't give you a direct comparison. It should be slower, but not by an awful lot. Apart from the ambient lighting there are no additional texture lookups, and counting that would be an unfair comparison anyway, since it's an additional feature. But yes, the shader calculations are more involved (more dot products, a pow() for the Schlick approximation, etc.). My code isn't optimized yet, and some optimizations would actually need changes to the Leadwerks engine:

    • the conversion of textures from gamma (sRGB) to linear space could be done by the GPU using its sRGB sampling. Right now I have to transform the colors to linear using pow(col, 2.2) and back using pow(col, 1/2.2), as in the sketch below the list.
    • I am currently doing the ambient lighting in the materials because I don't know how to pass the required data to the lighting shaders.
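
    For illustration, the conversions from the first bullet and the Schlick term mentioned above look roughly like this in my shaders (a sketch, not the exact code):

        // Manual gamma/linear conversions that hardware sRGB sampling and sRGB
        // framebuffers would make unnecessary:
        vec3 toLinear(vec3 c) { return pow(c, vec3(2.2)); }       // sRGB texture -> linear
        vec3 toGamma(vec3 c)  { return pow(c, vec3(1.0 / 2.2)); } // linear -> display

        // Schlick's Fresnel approximation, one of the extra per-pixel costs mentioned above:
        vec3 fresnelSchlick(vec3 F0, float VdotH)
        {
            return F0 + (1.0 - F0) * pow(1.0 - VdotH, 5.0);
        }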

  8. I think with "high-quality" Josh is not so much referring to texture size (2k should be enough) but to how meaningful the displacement map is. This is a grey-scale map, with black meaning "no vertical height" and white "full displacement". Often it is generated by some tool from the diffuse map (a photo of a rock surface), but that will be noisy due to colored specks etc. Using it as a displacement map will create unrealistic spikes. It's better to generate it from rocks etc. sculpted in a 3D modelling tool.
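
    For context, the map is typically consumed like this in a vertex or tessellation evaluation shader (a rough sketch with made-up names), which is why noise in it turns straight into spiky geometry:

        // Sketch: displacing a vertex along its normal by the grey value of the map.
        uniform sampler2D displacementMap;
        uniform float displacementScale;   // world-space height of "full displacement"

        vec3 displace(vec3 position, vec3 normal, vec2 uv)
        {
            float h = texture(displacementMap, uv).r;   // 0 = black, 1 = white
            return position + normal * (h * displacementScale);
        }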

  9. Congratulations on a successful year 2014, and may 2015 be even better! I really like the improvements in the model editor; those are real time savers.

     

    Looking forward to the new landscape tools! If I may utter some wishes (I know I do that most of the time...):

     

    1) It would be great if at least the vegetation painter did not depend on the built-in terrain, but could work with anything derived from Model/Surface. This would allow painting on meshes as well, and also working with the home-grown terrain solutions that some of us are playing around with ;-)

     

    2) Some folks (like, for example, me :) ) are using tools like WorldMachine that can also create distribution maps for vegetation, rocks, etc. (in the form of textures or ASCII files). It would be cool if those could be used with the vegetation tool as well. The format/encoding/meaning of the RGB values would be up to you, as long as we're able to import a texture and/or x/y/something file that defines the vegetation distribution according to the Leadwerks specification.

  10. To a degree, I do experience this, yes. Not getting sick, but rather a little dizzy and "headachy". I normally don't get seasick and can easily read in a car. Apparently, you need a few days or even weeks to train your brain for this new experience, but I haven't gotten that far.

     

    A normal HUD wouldn't help, I guess. And it flies a little in the face of the immersive experience, since it doesn't move when you turn your head. The usual recommendation is to have a 3D user interface in front of the player. These things are probably among the trickiest issues in VR: you have to come up with different interaction schemes tailored to the new experience.

  11. Well, I'm on unsafe ground here because I have next to no modeling experience. From what I understand, the other formats (FBX included) are somewhat unreliable, sometimes due to changes between versions, sometimes due to flaky exporters. Whether looking at this makes sense probably depends on the reasoning behind your Blender exporter; I just noticed that your description sounded like you're planning something similar and thought this could save some effort.

  12. About that Blender plugin: you might consider using OpenGEX. It is an open format for exporting models from modeling tools and importing them into game engines (yes, designed by the owner of the C4 engine, but that shouldn't rule it out). It was born out of the frustrations that model conversions to the proprietary formats cause (especially for animated models), and the failure of Collada to solve this. Plugins for 3ds Max and Maya are already available (open sourced), and a Blender plugin will follow soon. The task would then be to convert that into the mdl format, which shouldn't be too hard.

  13. Hi shadmar, that's an honor :) Yeah, texturing will be another topic... A selector-based texturing solution similar to Josh's would be great (read: memory-friendly). In addition, World Machine exports very nice splat maps (e.g. tracing the paths of erosion activity); I would like to make use of those as well.

  14. Yes, that might work as well. What I have in mind, though, is a LOD calculation in the control shader that gives as its result the wanted edge length of the tessellated triangles. This result should be identical for the shared edge of two adjacent patches. The outer tessellation factor would then be calculated by dividing the total edge length of the patch by this number, leading to coincident vertices. Hopefully :) (rough sketch below)
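
    Roughly what I have in mind for the control shader, as a sketch: the made-up uniform stands in for the per-edge result of the LOD calculation, and the mapping of edges to outer levels has to match the evaluation shader's layout.

        #version 400
        layout(vertices = 4) out;

        // Wanted length of the tessellated triangle edges; in the real thing this would be
        // the result of the LOD calculation, computed only from data on the shared edge so
        // that both adjacent patches arrive at the same value.
        uniform float targetEdgeLength;

        float edgeFactor(vec3 a, vec3 b)
        {
            // Depends only on the two shared corner vertices, so the neighboring patch
            // computes exactly the same outer factor and the edge vertices coincide.
            return clamp(distance(a, b) / targetEdgeLength, 1.0, 64.0);
        }

        void main()
        {
            gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;

            if (gl_InvocationID == 0)
            {
                vec3 p0 = gl_in[0].gl_Position.xyz;
                vec3 p1 = gl_in[1].gl_Position.xyz;
                vec3 p2 = gl_in[2].gl_Position.xyz;
                vec3 p3 = gl_in[3].gl_Position.xyz;

                gl_TessLevelOuter[0] = edgeFactor(p3, p0);
                gl_TessLevelOuter[1] = edgeFactor(p0, p1);
                gl_TessLevelOuter[2] = edgeFactor(p1, p2);
                gl_TessLevelOuter[3] = edgeFactor(p2, p3);

                gl_TessLevelInner[0] = max(gl_TessLevelOuter[1], gl_TessLevelOuter[3]);
                gl_TessLevelInner[1] = max(gl_TessLevelOuter[0], gl_TessLevelOuter[2]);
            }
        }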
