Everything posted by Rastar

  1. Hi shadmar, that's an honor! Yeah, texturing will be another topic... A selector-based texturing solution similar to Josh's would be great (read: memory-friendly). In addition, World Machine exports very nice splat maps (e.g. defining the paths of erosion activity), and I would like to make use of those as well.
  2. Thanks for the info. You mean the "bumpiness slider", right? True, that only becomes active when the Normal Map flag is set. You mean "height" as in "height map"? Is it then saved to the alpha channel (and is that the reason an alpha channel appears on the Information tab)?
  3. At this point just a little WIP note to make sure Josh doesn't take over the blog space... I am still tuning the tessellation procedure. Temporarily (?) I have switched from my beautifully crafted concentric mesh rings to a uniform grid structure - it is more difficult to get a smooth LOD transition for those rings than for a uniform mesh, and I have enough problems with that already. So I'll first try to get things right with the simpler mesh structure and then see if things work as well with the rings. I made two little videos of my experiments. The first runs in wireframe mode and shows the tessellation changing with the moving camera. Notice the intense "activity" when moving close to the ground. http://steamcommunity.com/sharedfiles/filedetails/?id=229124907 This is also visible in the second video, which shows a similar scene in normal shading mode: the terrain "bubbles" when flying close over it. In part this is certainly due to the heightmap resolution: I had to switch from a one-channel, 8bit heightmap to a two-channel, 16bit format to get this effect under control (otherwise the terrain was boiling...). But still, the horizontal resolution is low, with a 4096 texture covering 16km (roughly 4 meters per texel). So when a new vertex is created or moved around, it might suddenly lie on a different texel of the heightmap and be displaced differently. http://steamcommunity.com/sharedfiles/filedetails/?id=229127339 Well, a lot of stuff to do...
  4. What exactly does it do? I've noticed that an alpha channel is available if this is checked (which isn't really needed for a normal map). Does Leadwerks use an uncompressed 32bit format when it is checked, or what is its meaning?
  5. I mentioned this before, but I think I forgot to make an "official" suggestion: I would really love to be able to use quads as a drawing primitive, mainly for tessellation - quads IMHO lead to a more evenly distributed tessellation. So something like Surface:AddQuad(index0..index3), which would also implicitly mean a glPatchParameteri(GL_PATCH_VERTICES, 4). An unofficial, unsupported, "use at your own risk" version would be sufficient for now.
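     For illustration, this is the standard OpenGL 4 quad-domain setup that such a primitive would feed on the shader side (just a sketch, not existing Leadwerks code - the layout qualifiers assume the four control points per patch that the glPatchParameteri call above would set up):

     #version 400
     // Tessellation control shader for quad patches: four control points per patch.
     layout(vertices = 4) out;

     void main()
     {
         // Pass the control point through unchanged.
         gl_out[gl_InvocationID].gl_Position = gl_in[gl_InvocationID].gl_Position;
         if (gl_InvocationID == 0)
         {
             // Quads use four outer and two inner tessellation factors.
             gl_TessLevelOuter[0] = 8.0;
             gl_TessLevelOuter[1] = 8.0;
             gl_TessLevelOuter[2] = 8.0;
             gl_TessLevelOuter[3] = 8.0;
             gl_TessLevelInner[0] = 8.0;
             gl_TessLevelInner[1] = 8.0;
         }
     }

     // The matching evaluation shader would then declare:
     // layout(quads, fractional_even_spacing, ccw) in;

     That quad domain is what gives the more uniform triangle distribution I mean.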
  6. Is there a way to get the "naked" projection matrix (rather than the projectioncameramatrix), either in a shader or in a script?
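     (In case it helps others: as an untested workaround I could probably reconstruct it inside a shader, assuming projectioncameramatrix is the projection multiplied by the view matrix - a sketch:)

     // Untested workaround sketch: recover the bare projection matrix,
     // assuming projectioncameramatrix = projection * camerainversematrix.
     uniform mat4 projectioncameramatrix;
     uniform mat4 camerainversematrix;

     mat4 getProjectionMatrix()
     {
         // inverse() exists since GLSL 1.40, but it isn't cheap -
         // a dedicated uniform set from script would still be nicer.
         return projectioncameramatrix * inverse(camerainversematrix);
     }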
  7. Rastar

    Up! (Part 2)

     Yes, that might work as well. What I have in mind, though, is a LOD calculation in the control shader that yields the desired edge length of the tessellated triangles. This result should be identical for the two adjacent patch edges. The outer tessellation factor would then be calculated by dividing the total edge length of the patch by this number, leading to coincident vertices. Hopefully...
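     Roughly along these lines (an untested sketch, names are placeholders):

     // Untested sketch: derive the outer tessellation factor for one patch edge
     // from the desired length of the tessellated triangle edges. Since the result
     // depends only on the two shared corner vertices (and a symmetric LOD value),
     // both patches touching the edge compute the same factor and their vertices
     // should coincide.
     float outerTessFactor(vec3 cornerA, vec3 cornerB, float desiredEdgeLength)
     {
         float patchEdgeLength = distance(cornerA, cornerB);
         return max(1.0, patchEdgeLength / desiredEdgeLength);
     }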
  8. Rastar

    Up! (Part 1)

    Great, looking forward to your feedback!
  9. Rastar

    Up! (Part 2)

     After torturing you with a walk-through of my terrain vertex shader, I will do the same harm again, this time with the fragment shader... Without further ado, here we go:

     #version 400
     //Uniforms
     uniform sampler2D texture0;//diffuse map
     uniform sampler2D texture1;//normal map

     After the obligatory version information, this time two texture uniforms are declared: the (already used) diffuse map and an additional normal map.

     //Inputs
     in vec2 ex_texcoords0;
     in vec4 ex_color;
     in mat3 nmat;

     Using that "naming-convention trick", three variables of the vertex shader are wired into the fragment shader, namely the UV coordinates, the vertex color and the matrix for transforming normals into view space.

     out vec4 fragData0;
     out vec4 fragData1;
     out vec4 fragData2;
     out vec4 fragData3;

     And now I have to take a deeper breath: Leadwerks 3.1 uses a deferred renderer. This means that all lighting calculations are done behind the scenes in a separate processing step that takes place after the fragment shader has run. So you won't see lighting calculations for diffuse reflection, specular reflection etc. here, but rather a filling-up of several buffers:

     fragData0 contains the diffuse color (rgb) and transparency (alpha) of the fragment
     fragData1 stores the normal (rgb) and specularity (alpha)
     fragData2 holds the emissive color (rgb) and a flag for lighting on/off (alpha)
     fragData3 - no idea if this is being used and what it might hold

     vec4 outcolor = ex_color;

     That's it for the overture, now into the main play. The vertex color is passed to the outcolor variable. Actually, the result up to here would look something like this: an amorphous white mass with holes in it. Those holes are the cracks described earlier, caused by T-junctions between mesh levels. Hopefully, this can later be healed by a healthy dose of tessellation potion.

     outcolor *= vec4((texture(texture0,ex_texcoords0)).xyz, 1.0);
     fragData0 = outcolor;

     To make things a bit more interesting, the outcolor is multiplied by a sample of the diffuse map (retrieved using our self-calculated, global UV coordinates). The result is then stored in the diffuse buffer.

     vec3 normal = texture(texture1,ex_texcoords0).xzy * 2.0 - 1.0;
     normal = normalize(nmat*normal);
     fragData1 = vec4(normal*0.5+0.5,0.0);

     A bit more code - the normal map is sampled and "extended". Since a texture can store values between 0.0 and 1.0, but every (xyz) component of a normal can range from -1 to +1, the sampled values have to be remapped into that range. The normal is transformed into view space, normalized again and stored in the normal buffer (mapped back to a range of 0.0 to 1.0). The specularity is set to 0.0.

     int materialflags=1;
     fragData2 = vec4(0.0,0.0,0.0,materialflags/255.0);

     No emission, and the lighting flag is set to on. And here we go, the final result of our labor:

     What's next? I'll try to heal those cracks using tessellation, and I'll probably need some time to figure that out. To win some time, the next post will be about the general structure of a (pass-through) tessellation shader.

     PS: Of course the entry images to these two posts were done using klepto2's and shadmar's wonderful Posteffect Pack. Thanks, you two!
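     For reference, here is how the pieces above fit together in one listing (just the fragments from this post assembled inside the required main() function):

     #version 400

     //Uniforms
     uniform sampler2D texture0;//diffuse map
     uniform sampler2D texture1;//normal map

     //Inputs from the vertex shader
     in vec2 ex_texcoords0;
     in vec4 ex_color;
     in mat3 nmat;

     //Outputs to the deferred renderer's buffers
     out vec4 fragData0;
     out vec4 fragData1;
     out vec4 fragData2;
     out vec4 fragData3;

     void main(void)
     {
         //Diffuse: vertex color times the global diffuse map
         vec4 outcolor = ex_color;
         outcolor *= vec4((texture(texture0,ex_texcoords0)).xyz, 1.0);
         fragData0 = outcolor;

         //Normal: sample, expand to [-1,1], transform to view space, repack
         vec3 normal = texture(texture1,ex_texcoords0).xzy * 2.0 - 1.0;
         normal = normalize(nmat*normal);
         fragData1 = vec4(normal*0.5+0.5,0.0);

         //No emission, lighting flag on
         int materialflags=1;
         fragData2 = vec4(0.0,0.0,0.0,materialflags/255.0);
     }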
  10. Rastar

    Up! (Part 1)

     I feel a bit like I'm spamming the community blog... but of course that won't stop me from doing it again! With the basic mesh generated, it's now time to dive into shader programming. I will displace the vertices vertically using a heightmap, and then apply a global color and normal map to make the terrain look more... terrainy. I'll explain the shader line by line. All of this will actually be very basic (so, shadmar and klepto: Move along, nothing to see here... whew, we're amongst ourselves now!). But it will still be so lengthy that I've split the post in two parts.

     I've put the files for this in my Dropbox folder (due to the large textures it's about 100MB) at https://www.dropbox.com/s/clvbremwdjhdp9c/Terrain.zip. Just extract the folder into your project root and attach the MTTerrain_v2.lua script to a pivot entity. All textures were generated using World Machine (it's from their Tutorial 4 file). Side note: You are not allowed to use them commercially, but I guess you didn't plan to anyway... If you enter a "Total terrain size" of 16384 and a "Vertical scale" of 2625 you have the dimensions that the terrain was generated for.

     Leadwerks 3.1 provides access to four different stages of the shader pipeline: vertex, tessellation, geometry and fragment shader (the tessellation stage actually consists of two programmable parts, the control and evaluation shaders). At least the vertex and the fragment stage have to be present for your shader to compile and produce any output on the screen, and that's what I'll be using here.

     1) The vertex shader is the first programmable stage. It is executed for every vertex of the rendered model. It cannot produce or drop any geometry, but it can be used to change vertex positions, transform between different spaces, calculate colors and lighting and much more.

     2) The fragment shader is executed for every fragment (potential pixel on your screen). Its job is to produce the final color value for each fragment.

     For a short summary of the shader stages see the OpenGL wiki. Leadwerks shaders are programmed in GLSL, a special C-like shading language. I won't go into detail about syntax, data types and so on, but rather focus on two things that have always confused me (and still do): how data flows in, between and out of shaders, and the various transformations between coordinate spaces. So, here is the vertex shader for the terrain (I have stripped it down to almost the bare minimum that I need):

     #version 400
     #define MAX_INSTANCES 256

     It starts easily enough with the minimum GLSL version number required by this shader. For the more recent releases, the GLSL and OpenGL version numbers are identical. Then it defines the maximum number of (hardware) instances for a model. Frankly, I don't know what happens when you exceed that limit or how large that number can be - maybe somebody can chime in?

     //Uniforms
     uniform vec4 materialcolordiffuse;
     uniform sampler2D texture0;//diffuse+height map
     uniform mat4 projectioncameramatrix;
     uniform mat4 camerainversematrix;
     uniform instancematrices { mat4 matrix[MAX_INSTANCES];} entity;

     Then a couple of uniforms follow. Uniform values are identical across vertices. The values above are defined and filled by Leadwerks. The first two are defined in the material editor as the diffuse color and diffuse (first) texture. The next three are Leadwerks internals and lead us into the wonderful world of coordinate transformations: Starting with the last, this is an array of entity transforms, i.e. the values for position, rotation and scale of an entity. This is hardware instancing at play: instead of rendering the same model with numerous draw calls, the different entity matrices of those models are stored in a uniform buffer and passed to the shader, which renders them in a single draw call. By multiplying a vertex with such a matrix it is transformed into world space (from local to global coordinates). The camerainversematrix (aka "view matrix") transforms vertices into camera space, i.e. into a space with the camera at the origin and typically looking down the z axis. Finally, the projectioncameramatrix transforms into camera space and then on into "clip space", the final perspective view.

     uniform float terrainSize;
     uniform float verticalScale;

     The next two lines are uniforms that I have defined myself and that are set by the MTTerrain.lua script like this:

     shader:SetFloat("terrainSize", self.terrainSize)
     shader:SetFloat("verticalScale", self.verticalScale)

     They contain the overall size of the terrain (in meters) and its maximum height (also in meters).

     //Attributes
     in vec3 vertex_position;
     in vec4 vertex_color;

     //Outputs
     out vec4 ex_color;
     out vec2 ex_texcoords0;
     out mat3 nmat;

     And finally the per-vertex (non-uniform) inputs and outputs of the vertex shader follow. The "ins" are predefined and filled by Leadwerks (they contain the object-space vertex position and vertex color). The "outs" are arbitrary variable names that are then provided to later shader stages and can be used there by a simple naming convention. So an "out vec4 ex_color" can be accessed in another shader by defining "in vec4 ex_color".

     Then the magic happens: every shader stage needs a void main(void) function that does the actual processing (though you can define additional functions to structure your code).

     mat4 entitymatrix = entity.matrix[gl_InstanceID];
     mat4 entitymatrix_=entitymatrix;
     entitymatrix_[0][3]=0.0;
     entitymatrix_[1][3]=0.0;
     entitymatrix_[2][3]=0.0;
     entitymatrix_[3][3]=1.0;
     vec4 modelvertexposition = entitymatrix_ * vec4(vertex_position,1.0);

     First, the entity matrix of the current instance (identified by gl_InstanceID) is retrieved from the array and stored in the variable entitymatrix. Now, as a little trick Leadwerks stores a per-entity color in the (otherwise unused) fourth column of that matrix - the entitymatrix_ variable is relieved of that burden (if you select an entity in the scene explorer and switch to the Appearance tab - the "Diffuse" down there - that's it!). Then the (object-space) vertex position is transformed into world space.

     ex_texcoords0 = modelvertexposition.xz / terrainSize + 0.5;

     The (first) UV coordinates of a mesh are passed into the shader in an attribute called vertex_texcoord0, but that is of no use here: I want to apply a color and normal map that covers the entire terrain (not just a single mesh patch), so I have to calculate the coordinates myself. Since the terrain is centered on the origin, dividing the vertex position by the terrain size and adding 0.5 gives the correct range from 0.0 to 1.0.

     modelvertexposition.y = (texture(texture0, ex_texcoords0)).a * verticalScale;

     Tadaa - this innocent-looking line does all the magic of the vertex displacement. Using our calculated UV coordinates, it samples the alpha channel of the diffuse texture (where I have encoded the height map), multiplies that value (which again will be between 0.0 and 1.0) with the maximum height and displaces the vertex by the result in the y direction. Simple.

     gl_Position = projectioncameramatrix * modelvertexposition;

     Every vertex shader must return the vertex position in the gl_Position variable, in this case in clip space coordinates.

     nmat = mat3(camerainversematrix[0].xyz,camerainversematrix[1].xyz,camerainversematrix[2].xyz);
     nmat = nmat * mat3(entitymatrix[0].xyz,entitymatrix[1].xyz,entitymatrix[2].xyz);
     ex_color = vec4(entitymatrix[0][3],entitymatrix[1][3],entitymatrix[2][3],entitymatrix[3][3]);
     ex_color *= vec4(1.0-vertex_color.r,1.0-vertex_color.g,1.0-vertex_color.b,vertex_color.a) * materialcolordiffuse;

     Almost done... Finally, a matrix for transforming normals into camera space and a vertex color value are calculated and exported.

     What's next? I know, I know, that's not really a joy to wrap your head around. Part 2 goes quickly over the fragment shader (it will be much shorter) and then presents the final result. Stamina!
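     And for reference, the pieces above assembled into one listing (a sketch of how they fit together inside main(); the complete files are in the Dropbox package linked above):

     #version 400
     #define MAX_INSTANCES 256

     //Uniforms
     uniform vec4 materialcolordiffuse;
     uniform sampler2D texture0;//diffuse+height map
     uniform mat4 projectioncameramatrix;
     uniform mat4 camerainversematrix;
     uniform instancematrices { mat4 matrix[MAX_INSTANCES];} entity;
     uniform float terrainSize;
     uniform float verticalScale;

     //Attributes
     in vec3 vertex_position;
     in vec4 vertex_color;

     //Outputs
     out vec4 ex_color;
     out vec2 ex_texcoords0;
     out mat3 nmat;

     void main(void)
     {
         //Per-instance entity matrix, with the entity color stripped out
         mat4 entitymatrix = entity.matrix[gl_InstanceID];
         mat4 entitymatrix_=entitymatrix;
         entitymatrix_[0][3]=0.0;
         entitymatrix_[1][3]=0.0;
         entitymatrix_[2][3]=0.0;
         entitymatrix_[3][3]=1.0;

         //Transform into world space and compute global UV coordinates
         vec4 modelvertexposition = entitymatrix_ * vec4(vertex_position,1.0);
         ex_texcoords0 = modelvertexposition.xz / terrainSize + 0.5;

         //Displace vertically using the height map in the alpha channel
         modelvertexposition.y = (texture(texture0, ex_texcoords0)).a * verticalScale;
         gl_Position = projectioncameramatrix * modelvertexposition;

         //Normal matrix and per-entity color
         nmat = mat3(camerainversematrix[0].xyz,camerainversematrix[1].xyz,camerainversematrix[2].xyz);
         nmat = nmat * mat3(entitymatrix[0].xyz,entitymatrix[1].xyz,entitymatrix[2].xyz);
         ex_color = vec4(entitymatrix[0][3],entitymatrix[1][3],entitymatrix[2][3],entitymatrix[3][3]);
         ex_color *= vec4(1.0-vertex_color.r,1.0-vertex_color.g,1.0-vertex_color.b,vertex_color.a) * materialcolordiffuse;
     }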
  11. Try importing the 16bit format (r16), that should work. I can't remember right now if LE supports the r16 extension; if not, just rename it to "raw". Whether 8bit is enough depends on your vertical terrain scale, but in many cases the 256 values are a little too few - for a 1000m terrain that means a vertical resolution of about 4m. I use 16bit formats most of the time. EDIT: One thing that comes to mind: if I remember correctly, the Basic Edition of World Machine by default generates a terrain of 513x513, so you'd have to untick that checkbox under Project Parameters to make it 512x512. Also, you have to create a 512x512 terrain in Leadwerks instead of the default 1024x1024.
  12. I'm afraid you would have to spend $249 for that experiment. If you like, I can generate the meshes for you (you could define your terrain in the Basic Edition and send it over). You wouldn't be allowed to use them commercially, of course. Just PM me if that's an option.
  13. World Machine does, but only in the Professional edition (for the tiled build). It can export meshes in the obj format, and you can optionally run a mesh simplification before exporting them.
  14. I am not sure what you mean by "video tab"... In the editor there is an option under Tools->Options; the behavior during run-time is controlled via the vwait flag of Context:Sync().
  15. It's available in Lua as well. As for examples, it's pretty straightforward: if your shader for example has a

     uniform float my_parameter;

     then you can set that from a Lua script using

     shader:SetFloat("my_parameter", 42.0)
  16. I would like some documentation on this as well. My guess (I have no access to my PC right now to try it out) is: basically everything is shared between an entity and its instances, with the following exceptions: 1) the transforms (position, rotation, scale), and 2) the entity color (i.e. entity:SetColor()). If you look in one of the vertex shaders, you'll see that the entity matrix is indexed with gl_InstanceID, so that's why the transforms are per instance. In addition, Leadwerks encodes the entity color in the fourth column of the entity matrix, so that is per instance as well. Everything else (like the material diffuse color) is passed in as a uniform, so it will be shared between instances. But again, this is just my guess from the code; official documentation on this would be nice...
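     For illustration, this is the pattern from the vertex shaders I mean (assuming the instancematrices uniform block is declared as usual):

     //Per-instance transform, indexed by the hardware instance ID
     mat4 entitymatrix = entity.matrix[gl_InstanceID];

     //Per-instance color, encoded in the otherwise unused fourth column
     vec4 entitycolor = vec4(entitymatrix[0][3], entitymatrix[1][3],
                             entitymatrix[2][3], entitymatrix[3][3]);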
  17. As Aggror said - possible: yes, easy: no. You basically have two options: 1) Use the built-in terrain and map system. Split your terrain over several maps and change the maps during run-time. You could pre-load the next map in a separate thread to reduce swapping time, but the transition will be visually noticeable, so you might need a loading screen or something like that. 2) Implement a streaming terrain solution. Leadwerks currently doesn't provide a lot in that area (few engines do), so you would have to do everything yourself. For example, you could use tiled meshes arranged in a checkerboard-like pattern and load/unload them (and their contents) as the player moves. It probably depends a little on how much experience in game development you have. If you are just starting out, consider doing something smaller, otherwise you might quickly get frustrated. Otherwise - it's a fascinating journey ahead of you, but a long and twisted one...
  18. @martyj: There are a couple of questions that you should probably answer for yourself before putting work into this:

     What is actually the "real-world" size of your terrain? If the heightmap size you have given is the number of pixels - how many units (meters) per pixel do you have? More than 1 m/pixel?

     Very large maps will create problems due to floating point errors far away from the origin (say, more than 16k in either direction), leading to rendering artifacts and physics problems. There are ways to counter this, but it's additional complexity.

     Do you really need such a large terrain? Does your game idea call for it? How fast will the player move through the terrain? How are you filling such an enormous terrain with interesting gameplay?

     The dimensions of your heightmap are slightly unusual; normally you will find powers of two, and most often square dimensions (like 1024x1024, 2048x2048 and so on), not least because GPUs are optimized for handling textures of those dimensions.

     Objects (trees, enemies, houses, ...) for such a terrain can't all be kept in memory simultaneously; they will have to be streamed in and out according to player movement.

     EDIT: @MajorAF, just saw your post in the Tutorial Request thread. As far as I know, Combat Helo uses terrain tiles that are streamed in and out of memory. But that's not part of the built-in Leadwerks 2 terrain, it's a custom implementation. You might ask community member "Flexman" about this, he's one of the authors. But again, all of these large terrain solutions are hand-crafted and don't come with the engine.
  19. Rastar

    Patchwork

    Thanks, Aggror - nothing compared to your tutorials, of course...
  20. Rastar

    Patchwork

    Oh, cool, didn't know that - thanks!
  21. "Shader model 4.0" roughly means OpenGL 3.3 (OpenGL4 has Shader Model 5)
  22. Rastar

    Patchwork

     As described in my last post, I'd like to create a homegrown system for rendering large terrains. The basic idea is to use a set of planar mesh patches that are displaced vertically using a heightmap in the shaders, and to use OpenGL 4's tessellation features to create a more detailed and optimally structured mesh during run-time. Also mentioned before was the fact that this might be a bit of trial and error, so bear with me, and if you see me running in the wrong direction - please let me know!

     Mesh patches

     To displace a mesh in the shaders I first need a mesh... I use an approach similar to the one described in http://developer.nvidia.com/GPUGems2/gpugems2_chapter02.html, but with a simpler patch structure. The basic idea of using such patches is 1) to use a small set of static vertex buffers to keep memory consumption and required CPU-GPU bandwidth low, and 2) to allow patches to be culled. Since the final mesh structure will be determined later on in the tessellation shaders, I think I don't have to be overly concerned with the patch structure. It only has to provide a good enough base for later tessellation, and to potentially cover a large terrain. I will use an arrangement like this: there will be just one base patch, scaled and moved around to create concentric rings of meshes centered on the viewer. Since I am using the Model:Instance() call, and Leadwerks 3.1 by default uses hardware instancing, this should mean that the terrain will actually be rendered in a single draw call (am I right here?). The innermost level (Level 0) is comprised of 4x4 patches with the highest resolution, while every higher-numbered level consists of twelve patches that are double the size of the next lower level (as seen by the grid structure in the lower left).

     There are three parameters to control the terrain generation:

     Patch size - the number of quads per side in a patch, so a value of 8 results in an 8x8 patch
     Total terrain size - the size of the terrain in Leadwerks units (usually meters)
     Quad size level 0 - the size of a quad in Leadwerks units

     Using those values, the number of required levels is calculated; afterwards the quad size for level 0 is recalculated to make sure the numbers match. It is probably best to use powers of two for those numbers to reduce the possibility of rounding errors, which might result in vertices on the boundaries not lying on top of each other (e.g. 8 - 4096 - 1).

     Code, issues to solve

     I have attached the code for generating such a mesh structure below. To use it, just create a pivot entity in your scene and attach the script to it. When running the app, you should see something like this: Yes, not very impressive, I know - but we have to start somewhere. Unfortunately, Leadwerks scripts currently can't run in the editor, and there is no way to render a scene in wireframe mode outside the editor when using Lua (there is an unofficial C++ call for this, though), so we have to hope the triangles are all nicely aligned... EDIT: With shadmar's wireframe material it is actually possible to visualize the triangle structure (thanks!). Looking good so far.

     There are a couple of issues with this code, among them:

     The automatically calculated axis-aligned bounding boxes (AABB) don't know about the shader displacement later on, so I will have to manually correct them.
     At the boundary between two levels there are lots of T junctions, where a vertex of the lower level lies on an edge (and not a vertex) of the next higher level. This will generate cracks in the surface once we displace the heights. I plan to solve this in the tessellation control stage (hope it works, it's just an idea up to now...).

     What's next?

     In the next post I'll use a shader to do the vertex displacement. This is actually something like a one-liner, so I'll use the remaining space to describe some of the things about shaders that I think I have understood.

     Oh, by the way: I took the entry picture for this post last week while cross-country skiing in the Alps on the German-Austrian border (I'm rubbish at it, but what the heck!). Just to give an idea of what I'm aiming for - isn't nature just beautiful! I'll probably also create a dataset for that region, now that I have some reference images... Oh, and at the end of the month I'm going on a biking trip to Mallorca, my "test terrain"; hopefully I'll be able to take some nice images there, too.
  23. Ah, it seems GetSurface() is not available for an instanced model. So

     local model = Model:Create()
     model:AddSurface()
     local surface = model:GetSurface(0)

     runs fine, but the following

     local model2 = model:Instance()
     local surf2 = model2:GetSurface(0)

     gives the described error.
  24. I am trying to get the first surface of a model with

     patch:GetSurface(0)

     but only get

     attempt to call method 'GetSurface' (a nil value)

     Is that not exposed to Lua?