Everything posted by Rastar

  1. The path parameter for Camera:AddPostEffect also accepts a Lua script which does the shader handling. I have no access to Leadwerks right now, but iirc the Bloom effect uses this approach (have a look at Shaders/PostProcess).
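     A minimal sketch of such a post-effect script (the Script:Render signature is the one these scripts use, as quoted later in this thread; the shader path and the draw calls are my assumptions from memory, so treat them as illustrative):

         function Script:Render(camera, context, buffer, depth, diffuse, normals, emission)
             -- lazily load the effect shader once (path is hypothetical)
             if self.shader == nil then
                 self.shader = Shader:Load("Shaders/PostProcess/myeffect.shader")
             end
             diffuse:Bind(1)  -- make the scene color available to the shader
             self.shader:Enable()
             -- draw a fullscreen rectangle with the effect shader active
             context:DrawRect(0, 0, buffer:GetWidth(), buffer:GetHeight())
             self.shader:Disable()
         end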
  2. You would have to make this into a uniform (uniform int downgrade = 6;) and can then set it via Shader:SetInt().
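     For example (a sketch; "shader" stands for whatever shader reference your script holds):

         -- GLSL side: uniform int downgrade = 6;
         shader:SetInt("downgrade", 6)  -- override the default from Lua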
  3. Hi shadmar, thanks for the suggestion - will try over the weekend!
  4. Is the bullet scaled in-editor to values < 1? If so, maybe Instance() starts with a scale of 1. Just a wild guess, don't have access to LE right now...
  5. Bump. Anybody? I've put a screenshot of my WIP in the gallery (http://www.leadwerks.com/werkspace/page/viewitem?fileid=462724265), but without indirect specular lighting from (special) cubemaps this won't be the real deal ;-)
  6. After some time off I am picking up my PBR stuff for Leadwerks again. For the indirect specular lighting I need special cubemaps that I would like to generate using the Leadwerks API, and I would prefer to stay in Lua. I am currently doing it like this:

         local cameraBuffer = Buffer:Create(self.resolution, self.resolution, 1, 0)
         local cubemap = Texture:CubeMap(self.resolution, self.resolution, Texture.RGB)
         cameraBuffer:Enable()
         -- render the scene once for each of the six cube faces
         for i = 1, 6 do
             camera:SetRotation(self.cameraRotations[i])
             cameraBuffer:SetColorTexture(cubemap, 0, i-1)
             App.world:Render()
         end
         cameraBuffer:Disable()
         Context:SetCurrent(App.context)

     Running this code gives me an "Asset map value is different" error, though the game continues normally. What does this error mean? Also: what is the second parameter to Buffer:SetColorTexture()? The mipmap level? I'm assuming that a cubemap is created by specifying the width and height of a single face (rather than width*6), is this correct? And as far as I can see, there is currently no way to save a texture to disk (or is there?). Would it be possible to add this to the (Lua) API?
  7. The texture editor has a "Generate Mipmaps" flag. I take it that when set, Leadwerks will calculate the mipmaps for a texture (and import only the first level when unchecked)? I actually need to import textures that already have mipmaps, and those must not be overwritten during import because they carry information other than plain mipmapping (e.g. specular cubemaps giving the irradiance for several levels of specular power). Is there a way to import those into Leadwerks?
  8. Yes, start small and grow big. Begin with a pass-through shader that does nothing but transform coordinates and output color, and experiment with that. You might find this thread http://www.leadwerks.com/werkspace/topic/8203-internal-shader-uniforms/page__hl__uniform useful as it lists the uniforms (constants passed into the shader) that you need. Also, be aware that Leadwerks has a deferred renderer - many examples you might find online are for a forward renderer. The big difference is that in a forward shader you do the lighting calculations right away, whereas in deferred rendering you first write certain values into a so-called G-buffer, and Leadwerks does the lighting calculations afterwards in a separate pass. The most important buffers you'll need in the beginning are fragData0 for the diffuse color and fragData1 for the normals. Also, don't hesitate to ask here what exactly you don't understand about the inner workings of the Leadwerks shaders!
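     As a rough illustration (this is not Leadwerks' actual shader source; the uniform and varying names are assumptions), a minimal deferred fragment shader could look like this:

         #version 400
         uniform sampler2D texture0;   // diffuse map (binding name assumed)
         in vec2 ex_texcoords0;        // UVs from the vertex stage (name assumed)
         in vec3 ex_normal;            // interpolated normal (name assumed)
         out vec4 fragData0;           // G-buffer: diffuse color
         out vec4 fragData1;           // G-buffer: normals

         void main()
         {
             fragData0 = texture(texture0, ex_texcoords0);
             fragData1 = vec4(normalize(ex_normal) * 0.5 + 0.5, 1.0); // pack [-1,1] into [0,1]
         }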
  9. Ahh, OK. And those 7 ambients? By the way, it seems the camera (range, zoom) uniforms aren't passed to the ambient lighting shaders. While that is fine for "ordinary", view-independent ambient terms, I would like to calculate an ambient specular contribution. Would it be possible to at least hand those two to the ambient lighting shaders so I can calculate the view direction vector? Thanks again!
  10. Ummmm, sorry for a silly question, but I can't see the forest for the trees right now... what are those combinations (8*2*3) again? And those 7 for the ambient?
  11. Ahhh, texelFetch, that was the missing clue - thanks, shadmar!
  12. For some reason I seem to be unable to bind the normals buffer in a PostProcess Lua file. In function Script:Render(camera,context,buffer,depth,diffuse,normals,emission) I try to bind the normals to texture unit 2 using normals:Bind(2), but the result is complete blackness. Are they actually passed in to that function? Do I have to do something special/different?
  13. Is there any way to pass additional uniforms to the lighting shaders (directionallight.shader, ...)? (Provided I have modified the sources and added those uniforms.) Which is equivalent to asking: is there a way to get a reference to the loaded shaders? I guess when I call Shader::Load() for those shaders I get a separate instance that I then set uniforms on, not the one actually used by the render code?
  14. I haven't tried Mixamo stuff in LE, but you might find something in this thread http://www.leadwerks.com/werkspace/topic/11449-fbx-animations-not-working/page__hl__mixamo However, I'm pretty sure the crawler animations won't work on your Mixamo characters. Animations are created for a specific bone structure (skeleton), and the bone hierarchies, scales and orientations of the crawler and a Mixamo character are certainly different. You would have to retarget the crawler animations to your character.
  15. Mmh, OK, wasn't aware of that. Is that for performance reasons? Otherwise it would seem like an unnecessary duplication of shaders that are identical except for those texture lookup code lines.
  16. I was trying to create some test spheres for shading experiments with no textures assigned, just fixed material colors. It seems they still use textures from some other material. E.g., if I create a new empty material in MyGame and assign it the diffuse+normal shader, it shows a concrete texture in the material editor preview. And when I create a brush sphere in the introduction scene, assign it the material and run the game, it is rendered using the tire material.
  17. I'm actually thinking about making this an item for the commercial workshop. But there's still a lot of work ahead, and it requires some additional technical features in Leadwerks, so no promises yet...
  18. Ahh, I see... I guess we're referring to different things here: you to DCC applications and textures, me to shaders and the G-buffer. The material flags I was mentioning are stored in the alpha channel of the G-buffer's fragData2 - per pixel, not per material. By "0 or 1 anyway" I meant that since I only use 6 bits for metalness it can only have 64 different values, but that doesn't hurt much since it is usually set to 1 (metal) or 0 (insulator).
  19. Yes, I know; what I meant is that I can reuse that channel to either store roughness or metalness. By the way, I finally understood how best fit normals work, and I guess that means I can't compress those to two channels since the vectors aren't normalized. So I'll probably go ahead and just store the metalness in the upper 6 bits of the material flags channel, since that value is usually 0 or 1 anyway.
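      A sketch of that packing (how many low bits stay reserved for the flags is my assumption):

          // pack: 6-bit metalness in the upper bits, 2 low bits left for flags
          float packMetalnessFlags(float metalness, int flags)
          {
              int m = int(clamp(metalness, 0.0, 1.0) * 63.0 + 0.5); // quantize to 6 bits
              return float((m << 2) | (flags & 3)) / 255.0;         // fits an 8-bit alpha channel
          }

          // unpack in the lighting shader
          float unpackMetalness(float alpha)
          {
              int raw = int(alpha * 255.0 + 0.5);
              return float(raw >> 2) / 63.0;
          }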
  20. Hej shadmar, thanks for the tip! Yes, that would work, although I actually just need one single extra channel since I can reuse the specularity in the normal's alpha. I got the normal compression working for the non-BFN case (looking OK, though I guess some additional bits in the normal buffer would help). I'm still having artifacts for BFN and don't understand why, but I won't give up... ;-)
  21. Ahhh, got it... It's better to kick out the y component and store xz: y is usually the largest component of a normal (since normals mostly point upwards), and reconstructing one of the smaller components instead would introduce a much larger error.
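      In code, the reconstruction would look roughly like this (assuming the normal's y component is non-negative; otherwise a sign bit would be needed somewhere):

          vec2 encodeNormal(vec3 n)    // store only x and z, mapped to [0,1]
          {
              return n.xz * 0.5 + 0.5;
          }

          vec3 decodeNormal(vec2 enc)  // rebuild y from unit length
          {
              vec3 n;
              n.xz = enc * 2.0 - 1.0;
              n.y = sqrt(max(0.0, 1.0 - dot(n.xz, n.xz))); // y assumed >= 0
              return n;
          }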
  22. Actually, going for PBR ;-) - so in a sense yes, but storing metalness and roughness.
  23. I would like to stuff an additional parameter into the G-buffer. The only place I can think of (without adding another texture like fragData3) is the normals: compress them to two channels and reconstruct them in the lighting shaders. However, I can't seem to get this working without artifacts, and I am unsure if this is due to the compression algorithm. Question: is the normals texture (fragData1) 8 bit per channel or 16 bit per channel? Could the multisampling somehow get in my way here?
  24. Hi, for a post effect I need to reconstruct the pixels' camera-space position from the depth buffer. I am a little confused since Leadwerks seems to do some things differently than textbook OpenGL, so I have a few questions. I find this snippet e.g. in directionallight.shader (and in slightly different form in klepto's sky shader):

          screencoord = vec3(((gl_FragCoord.x/buffersize.x)-0.5) * 2.0 * (buffersize.x/buffersize.y),
                             ((-gl_FragCoord.y/buffersize.y)+0.5) * 2.0,
                             depthToPosition(depth,camerarange));
          screencoord.x *= screencoord.z / camerazoom;
          screencoord.y *= -screencoord.z / camerazoom;
          screennormal = normalize(screencoord);
          if (!isbackbuffer) screencoord.y *= -1.0;

      I understand how the depthToPosition() function calculates the z value, but the x and y coordinates confuse me. First of all, it seems that Leadwerks stores the y position in the backbuffer inverted compared to the front buffer, hence those isbackbuffer -> * -1.0 lines (here and also in other shader parts)? Is that also the reason that the y coordinate uses 0.5 - gl_FragCoord.y rather than gl_FragCoord.y - 0.5? The first line basically seems to compute normalized device coordinates, but why is the x coordinate multiplied by buffersize.x/buffersize.y? This code seems to assume a symmetric FOV with equal FOV values for x and y, correct?
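      For reference, here is my reading of what those lines compute, rewritten as a function (it uses the same uniforms as the snippet above; this is just my interpretation, not Leadwerks source):

          vec3 reconstructPosition(float depth)
          {
              // normalized device coordinates, x pre-scaled by the aspect ratio
              vec2 ndc;
              ndc.x = (gl_FragCoord.x / buffersize.x - 0.5) * 2.0 * (buffersize.x / buffersize.y);
              ndc.y = (0.5 - gl_FragCoord.y / buffersize.y) * 2.0;
              float z = depthToPosition(depth, camerarange);  // view-space depth
              vec3 pos = vec3(ndc.x * z / camerazoom,         // undo the projection:
                              -ndc.y * z / camerazoom,        // divide by zoom, scale by depth
                              z);
              if (!isbackbuffer) pos.y *= -1.0;               // flip y for the front buffer
              return pos;
          }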