Everything posted by Rastar

  1. Next Steps

    Well, I'm on unsafe ground here because I have next to no modeling experience. From what I understand, the other formats (FBX included) are somewhat unreliable, sometimes due to changes between versions, sometimes due to flaky exporters. Whether looking at this makes sense probably depends on the reasoning behind the Blender exporter; I just noticed that your description sounded like you're planning something similar and thought this could save some effort.
  2. Next Steps

    About that Blender plugin: You might consider using OpenGEX. It is an open format for exporting models from modeling tools and importing them into game engines (yes, designed by the owner of the C4 engine, but that shouldn't rule it out). It was born out of the frustrations that model conversions for the proprietary formats cause (especially for animated models), and the failure of Collada to solve this. Plugins for 3DS Max and Maya are already available (open sourced); Blender will follow soon. The task would then be to convert that into mdl format, which shouldn't be too hard.
  3. The LE3 displacement shader already tessellates, but the algorithm and its configurability could probably be improved upon (as Josh stated above). And yeah, that tessellated terrain is nice, but it's a pretty complicated algorithm (described in ShaderX7): they render a point cloud in a first pass to get information about the terrain structure, and then calculate the tessellation factors in a second pass.
  4. Actually - no, I don't (maybe shadmar and klepto know something?). It might be better to start with a forward renderer to learn shader programming, because everything is right in front of your eyes, and playing around with different light types, lighting models etc. is possible. I started digging into shaders with Leadwerks 3.0 (which has a forward renderer), and that was helpful, at least for me. And it might be easier to understand what is going on in a deferred renderer if you have done everything yourself first.
  5. It's not working for two reasons: 1) All those special variables for vertex normal, light position and transformation matrices are not part of GLSL. They are passed in by name by the engine, and every engine has its own naming convention. If you look at a Leadwerks shader, the uniforms there have names like vertex_normal, entity_matrix (the 'M' of the model-view-projection transform) and projectioncameramatrix (the 'VP'). So the variables you're using in your code don't carry any values. But even if you used the Leadwerks variables, this concrete shader wouldn't work in Leadwerks 3.1, because 2) it's written for a forward renderer, and Leadwerks 3.1 has a deferred renderer. In a deferred renderer, the lighting calculations (here the diffuse shading with the NdotL term) are done for you behind the scenes. The shader's main job is to fill the fragData G-buffers for diffuse color, normal, emission etc. that the engine needs for its calculations.
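     To illustrate the point, here is a minimal sketch of what a deferred-style fragment shader mostly does: it computes no lighting at all and only writes the G-buffer. The names (texture0, ex_texcoords0, fragData0/1/2) follow the Leadwerks convention described above, but treat them as assumptions and check the stock shaders for the authoritative layout.

     ```glsl
     #version 400
     // Sketch of a deferred-style fragment shader: no NdotL term anywhere,
     // we only fill the G-buffer and let the engine's lighting pass do the rest.
     // Uniform/output names are assumed from the Leadwerks convention above.
     uniform sampler2D texture0;   // diffuse map

     in vec2 ex_texcoords0;        // UVs passed from the vertex shader
     in vec3 ex_normal;            // normal passed from the vertex shader

     out vec4 fragData0;           // diffuse color
     out vec4 fragData1;           // normal
     out vec4 fragData2;           // emission / material data

     void main()
     {
         fragData0 = texture(texture0, ex_texcoords0);              // albedo only, unlit
         fragData1 = vec4(normalize(ex_normal) * 0.5 + 0.5, 1.0);   // pack [-1,1] -> [0,1]
         fragData2 = vec4(0.0);                                     // no emission
     }
     ```

     The engine later reads these buffers in its lighting pass, which is why the diffuse NdotL calculation from the forward shader has no place here.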
  6. I faintly remember Shadmar doing something like this in the early 3.0 days..
  7. I am not too familiar with the various raw formats (I think every manufacturer has its own, or even several variants containing some specific metadata). And I don't know if Leadwerks is able to skip this metadata. A "really raw" RAW file for heightmap import contains just the pixel values (only one channel) as floating point values, preferably in 16-bit format, i.e. two bytes per pixel. Maybe you can convert the file using Photoshop or a similar tool? Or you could write a little conversion routine.
  8. Yeah, I know, the suspense was too much... :-) Who am I to torture you any longer? Here is the eagerly awaited next step in the tessellation pipeline - the evaluation shader.

     After the tessellation factors have been defined in the control shader, a fixed (non-programmable) tessellation stage takes over and creates all those little vertices and triangles for you. The result is then passed into the evaluation stage as patches whose size was defined in the control shader by the layout (vertices = 3) out; statement. Or rather, you get access to every single vertex of the patch (already existing and newly created) and to the original vertex data at the same time. But we're getting ahead of ourselves - let's do this line by line.

     Ouverture

     Like the control stage, the evaluation shader starts with a layout definition statement:

         layout (triangles, equal_spacing, ccw) in;

     So we're expecting triangles to be passed in (the default). The next parameter can be a bit tricky: it defines how the new vertices are distributed along the edges. There are three options:

     - equal_spacing: Every segment of a subdivided edge has the same length. This only works with integral tessellation factors, so fractional factors are rounded up to the next integer. The disadvantage is that new vertices pop into existence when you increase the factor past an integer boundary (say from 3.9 to 4.1), which can be noticeable.
     - fractional_odd_spacing: The tessellation factor is rounded up to the next odd integer (for example from 4.7 to 5). The edge is then divided into (in this case) three equal-length segments plus two additional shorter ones, which together cover the fractional remainder and grow smoothly as the factor increases.
     - fractional_even_spacing: Same as above, but the factor is rounded up to the next even integer (here from 4.7 to 6).

     Sounds complicated, actually is a little complicated, but the simple rule is: in most cases where the tessellation factor is calculated (e.g. by a LOD formula), both fractional options give better results than equal_spacing, because new vertices are created close to existing ones and smoothly glide along the edge as the factor changes, giving less popping (you can see the effect in a video that I posted earlier on).

     The final parameter in the layout statement defines the winding order of the triangles the tessellator generates, counterclockwise (ccw) or clockwise (cw) - this is the orientation the rasterizer later uses for backface culling, so it should match your other geometry.

     Next come the interface blocks:

         in ControlData {
             vec2 texcoord0;
             vec4 color;
             vec3 normal;
             vec3 binormal;
             vec3 tangent;
         } cData[];

         out VertexData {
             vec2 texcoord0;
             vec4 color;
             vec3 normal;
             vec3 binormal;
             vec3 tangent;
         } vData;

     So you get passed in the data for the original patch vertices as an array, and you have to produce corresponding data for every vertex in the patch. The evaluation shader is called once for every vertex (newly created or not). But where is its data? Well, you have to create it, which brings us into the

     Main play

     All the information the tessellation stage gives you for a vertex is its position - in barycentric coordinates. These are three numbers between 0 and 1 that define the position of the vertex relative to the original corner vertices of the patch. You use them to interpolate all vertex data from the corresponding values at the corners. Don't let your head spin - it basically comes down to applying the same formula to every vertex attribute you want to pass along (the formula for quads is different, of course):

         vData.texcoord0 = cData[0].texcoord0 * gl_TessCoord.x
                         + cData[1].texcoord0 * gl_TessCoord.y
                         + cData[2].texcoord0 * gl_TessCoord.z;

     The built-in vector variable gl_TessCoord contains the mentioned barycentric coordinates. As you can see, you multiply the attribute of every corner vertex with the corresponding barycentric coordinate to get the value for the newly minted one - in this case for the first set of UV coordinates, but the same formula applies to color, normal etc. Only the position uses an additional built-in variable:

         vec4 vPos = (gl_TessCoord.x * gl_in[0].gl_Position)
                   + (gl_TessCoord.y * gl_in[1].gl_Position)
                   + (gl_TessCoord.z * gl_in[2].gl_Position);

     As you have probably guessed, gl_in contains the original vertices and the field gl_Position their positions. And last but not least, here again you have to export the vertex position in the special variable gl_Position:

         gl_Position = projectioncameramatrix * vPos;

     If you remember, I removed the projection of vertices from world into camera space from the vertex shader, so that we can later calculate tessellation factors using world-space data. But at some point before the fragment shader we have to apply this projection, and this is actually the last chance to do that...

     Finale

     Well, nothing left to say - the pass-through tessellation shader is done! I have attached a small zip file; if you extract it into a project folder you will have

     - the PassThrough.shader (in Shaders->Model)
     - a PassThrough.mat (in Materials->Developer)
     - a simple triangle.map (well, in Maps...) containing a box with the shader applied.

     The material displays the box in wireframe mode so you can see the effect (by the way, you have to run the game to see that, it won't be shown in-editor). Since a pass-through isn't too exciting, you might want to experiment a little with the shader: modify the tessellation factors (gl_TessLevelInner and gl_TessLevelOuter in the control shader) and the edge subdivision mode (equal, odd, even) in the evaluation shader.

     What's next? Those manual factors aren't of great help, so next we'll calculate the tessellation dynamically based on camera distance and other factors. Hope to see you again!
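     Putting the pieces from this post together, a complete pass-through evaluation shader might look roughly like this. It follows the structure and naming described above (cData/vData, projectioncameramatrix); the exact names in the attached PassThrough.shader may differ slightly, so treat this as a sketch rather than the shipped file.

     ```glsl
     #version 400
     // Pass-through tessellation evaluation shader sketch:
     // interpolate all attributes of the corner vertices using the
     // barycentric coordinates in gl_TessCoord, then project.
     layout (triangles, equal_spacing, ccw) in;

     uniform mat4 projectioncameramatrix;   // Leadwerks 'VP' uniform

     in ControlData {
         vec2 texcoord0;
         vec4 color;
         vec3 normal;
         vec3 binormal;
         vec3 tangent;
     } cData[];

     out VertexData {
         vec2 texcoord0;
         vec4 color;
         vec3 normal;
         vec3 binormal;
         vec3 tangent;
     } vData;

     void main()
     {
         // Same interpolation formula for every attribute
         vData.texcoord0 = cData[0].texcoord0 * gl_TessCoord.x
                         + cData[1].texcoord0 * gl_TessCoord.y
                         + cData[2].texcoord0 * gl_TessCoord.z;
         vData.color     = cData[0].color    * gl_TessCoord.x
                         + cData[1].color    * gl_TessCoord.y
                         + cData[2].color    * gl_TessCoord.z;
         vData.normal    = cData[0].normal   * gl_TessCoord.x
                         + cData[1].normal   * gl_TessCoord.y
                         + cData[2].normal   * gl_TessCoord.z;
         vData.binormal  = cData[0].binormal * gl_TessCoord.x
                         + cData[1].binormal * gl_TessCoord.y
                         + cData[2].binormal * gl_TessCoord.z;
         vData.tangent   = cData[0].tangent  * gl_TessCoord.x
                         + cData[1].tangent  * gl_TessCoord.y
                         + cData[2].tangent  * gl_TessCoord.z;

         // Position comes from the built-in gl_in array
         vec4 vPos = gl_TessCoord.x * gl_in[0].gl_Position
                   + gl_TessCoord.y * gl_in[1].gl_Position
                   + gl_TessCoord.z * gl_in[2].gl_Position;

         // Last chance to apply the world-to-clip projection
         gl_Position = projectioncameramatrix * vPos;
     }
     ```

     Note that the interpolated normals, binormals and tangents are no longer unit length after interpolation; for a pass-through that doesn't matter, but a lighting or displacement shader would want to renormalize them.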
  9. Ah, there's one glitch: The maximum tessellation factor is 64, so you won't be able to, e.g., tessellate one large BSP face into thousands of triangles per edge.
  10. This somehow reminds me of a blog about the abuse of tessellation in Crysis 2 :-) http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2 An idea might be to include the displacement map resolution in the calculation of the tessellation factors: Check how many texels would lie on the edges of a patch and tessellate accordingly (and of course lower those factors depending on viewing distance). I was planning to use something like that for my terrain experiments.
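     As a sketch of that idea, a tessellation control shader could derive an edge factor from how many displacement-map texels the edge spans. Everything here is hypothetical illustration - heightMapSize, texelsPerTriangle and the cData block are made-up names, not part of any existing Leadwerks shader:

     ```glsl
     // Hypothetical helper for a tessellation control shader: subdivide an
     // edge so each generated segment covers roughly texelsPerTriangle
     // texels of the displacement map. All names are illustrative.
     uniform float heightMapSize;      // displacement map resolution in texels, e.g. 1024.0
     uniform float texelsPerTriangle;  // desired texel density per segment, e.g. 8.0

     float EdgeFactor(vec2 uv0, vec2 uv1)
     {
         // length of the edge measured in displacement-map texels
         float texels = distance(uv0, uv1) * heightMapSize;
         // one subdivision per texelsPerTriangle texels, clamped to the
         // hardware maximum tessellation factor of 64
         return clamp(texels / texelsPerTriangle, 1.0, 64.0);
     }

     // In main() one would then set, for instance:
     //   gl_TessLevelOuter[0] = EdgeFactor(cData[1].texcoord0, cData[2].texcoord0);
     // and scale the result down with camera distance for LOD.
     ```

     The point of tying the factor to the texel count is that tessellating finer than the displacement map's resolution adds triangles without adding any new surface detail, which is exactly the Crysis 2 problem from the linked article.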
  11. This is one of the first things I'd do if Leadwerks had an editor plugin system - a general tool for placing splines and an extension that extrudes or generates geometry along those splines. For the "megatexture baking" option that Josh mentioned we would of course also need to be able to programmatically alter the heightmap and modify the texture. Come on Josh, set the community on fire :-) I'm not sure if the texture baking works in all cases, though - only for very high resolution terrains, I guess (like 1m per px or so), because otherwise the heightmap couldn't be modified to include the road geometry, and the road shape would follow the bumps and dents of the terrain. It did in LE 2.x, actually, and I never liked that. In those cases a mesh might still have to be placed on top of the terrain, but the baked texture could still be used for rendering the road at higher view distances.
  12. Any action needed on our behalf?
  13. Shadmar's code expects a diffuse detail texture in the "Texture 4" slot, which is empty in your screenshot. If you want to use the base diffuse for this, change the code to outcolor *= texture(texture0, ex_texcoords0*4.0); EDIT: I probably misunderstood your setup... You want to use the cliff texture as the detail texture, right? And in your screenshot, are you using shadmar's code for that very texture?
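     For context, detail texturing usually just means sampling a second map at a higher tiling frequency and modulating the base color with it. A minimal sketch along those lines (texture4 standing in for the "Texture 4" slot; the slot and output names are assumptions, not shadmar's actual code):

     ```glsl
     // Sketch of detail-texture modulation in a fragment shader.
     // texture0 = base diffuse, texture4 = detail map ("Texture 4" slot).
     uniform sampler2D texture0;
     uniform sampler2D texture4;

     in vec2 ex_texcoords0;
     out vec4 fragData0;

     void main()
     {
         vec4 outcolor = texture(texture0, ex_texcoords0);
         // tile the detail map 4x and modulate the base color;
         // the 2.0 makes a mid-gray detail map leave the color unchanged
         outcolor.rgb *= texture(texture4, ex_texcoords0 * 4.0).rgb * 2.0;
         fragData0 = outcolor;
     }
     ```

     Replacing texture4 with texture0 here is exactly the one-line change suggested above for reusing the base diffuse as its own detail map.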
  14. Cooool! Which reminds me, I still have to do the evaluation blog post... :-( What exactly do you mean by "smearing" effect?
  15. I am referring to this post http://www.leadwerks.com/werkspace/blog/41/entry-1156-leadwerks-31-pre-orders-now-available-indie-edition-coming-to-steam-january-6th/ Under "Exemptions" it states "Leadwerks 3.0 customers who backed the Leadwerks for Linux Kickstarter project for $99 or more will receive the 3.1 upgrade for free." So that's pretty definite. But then, it also says "There is no purchase necessary to upgrade the mobile add-ons from Leadwerks 3.0 to Leadwerks 3.1" As I said, my guess is that the updater currently isn't aware of all those different cases.
  16. I'm having the same issue. Now, I was a Kickstarter backer for the Indie Edition ($100). To my understanding that entitles you to an upgrade from 3.0 to 3.1 (which makes some sense since it's the same amount as the upgrade fee). My guess is that at least my problems have something to do with that. Either the updater doesn't take those cases into account, or I was wrong about that upgrade. In any case, this corrupted my 3.0 installation, since it updates all project and content files but not the engine itself.
  17. These should do the trick:

      Model::GetSurface(index)
      Vec3 Surface::GetVertexPosition(index)
      Surface::SetVertexPosition(index, Vec3 newPos)

      Of course you'll need a way to identify the indices.
  18. I have to weigh in here as well. I have no problem with the Ouya being ditched in favor of SteamOS, that's a sound decision. However, simply dropping both mobile platforms is not acceptable. Up to now I was under the impression that they would be ported to 3.1, only at a later stage (remember Josh's experiments with a deferred renderer for the iPad?). Just a little over a year ago Leadwerks put its focus on mobile (and drove some veteran members of the community away with that decision), only to drop that again now and go for Linux? I understand this is a one-man show, but changing course this easily and often is not the solution. I mean, many of us (me included) will probably never get a game out of the door, but even if we tried, we couldn't, because during the two-year-something development time the engine would have changed at least twice, with support being dropped for the engine we started with. Don't know if I'm sad, angry or disillusioned. All of them, I guess.
  19. Which means: We really need some more information on what the difference between those two versions is before we can state our preference. To me the most important questions are:
      - What are the supported publishing paths for the two editions?
      - Will there be any differences feature-wise (apart from the obvious tighter integration with the Steam Workshop)?
      - Are there differences in the release cycle (e.g. always Steam first, stand-alone next)?
  20. No, pretty sure the shaders couldn't be used, only the textures.
  21. Don't know if Josh's opinion on this has changed, but just FYI: There was a discussion on this back in the LE2.x days: http://www.leadwerks.com/werkspace/topic/5750-le-with-substance/page__hl__substance
  22. One slightly insane sounding idea would be to release source code for the editor - free of charge to licensees. That way Josh wouldn't reveal any of his IP regarding rendering etc. but really could jump-start community contributions. I guess many of us don't want to fool around with the core engine but would love to improve the workflow, deal with a few UI bugs, inconsistencies and omissions. And I think the only way to stay afloat in these crazy times is to 1) have a few differentiating factors and 2) leverage the community for development and content. Just a silly thought..
    You should be OK here. The license conditions state that a) you need a Pro version (regardless if it's Indie Solo/Enterprise or Enterprise) to use the models and textures shipped with the product (and asset packs) commercially, and b) everything you do with FP is your own IP. At least that's the way I understood it.
  24. Maybe it's possible to implement some of this using a geometry shader? I can't remember if LE2.5 gives you access to this, although the OpenGL version it uses provides it.