Blog Entries posted by Josh

  1. Josh
    Leadwerks Game Engine 4.1 beta is now available on the beta branch on Steam. This is the final version 4.1, barring any bug fixes that are made before the release later this month. I am going to focus on just resolving bug reports and adding new documentation for the next two weeks.
     

     
    4.1 adds environment probes, which have been covered previously, and volumetric lighting. Point and spot lights can display a volumetric effect by adjusting the "Volume strength" setting, found under Light properties in the editor. You can also use the new commands Light::SetVolumetricStrength() and Light::GetVolumetricStrength(). At this time, directional lights should use the godray shader instead, but I might try this effect on directional lights as well before the release.
     
    Volumetric lighting is an expensive effect to render, but my Intel 5200 is running it at a fast speed, and high-end cards should have no problem with it. Lights with a large range are more expensive than lights with a small range. The performance will be exactly the same for any intensity level above zero (the default). The effect is disabled when the low-quality light setting is in use.
     
    Volumetric lighting looks best in dark scenes with lots of contrast. Spotlights tend to look a little better, as point lights don't have the same well-defined shape. Use these sparingly for dramatic lighting effects in key positions. It looks especially nice when you have a fence or grate in front of the light to create lots of interesting light beams.
     
    The bloom effect has also been improved, and SSAO, godrays, and iris adjustment post-processing effects have been added. The bloom effect is using the mixing equation from Leadwerks 2, along with an improved Gaussian blur function from the RasterGrid blog.
     

     
    New entity icons, courtesy of reepblue, have been created.
     
    Have fun with the new features and let me know if you have any questions.
  2. Josh
    I came across a very interesting presentation that talks about how to avoid "banding" artifacts in game graphics. The author uses dithering to add noise to an image and break up the visible lines your eye can detect. This even works with audio. When noise is added to the mix, the original tune appears to be higher-fidelity than it actually is:
     


     
    I was able to use this concept to reduce the banding caused by shadow acne in a low-resolution spotlight with a large volume. Here is the original situation, purposely created to maximize the banding effect:
     

     
    And here is the result when some slight dithering is added:
     

     
    The trick to dithering is to know how much noise to add. If we add too much it starts becoming very apparent:
     

     
    You want to calculate your noise amplitude as the difference between the two discrete levels you are trying to bridge. In the case of the spotlight shader, random noise is being added to the z coordinate of the shadow lookup, so I want the noise amplitude to be equal to the minimum depth difference the shadowmap can display. I calculated this as follows, but more experimentation is needed to make sure it's right:
     

    float noiselevel = 0.000001 * lightrange.y;
    shadowcoord.z += rand(lightnormal.xy) * noiselevel - noiselevel * 0.5;
     
    You also want to make sure your random seed is really random. Using the screen coordinate alone produces bad results because the same random seeds will stay in place as you look around and cause visible patterns. If you use the "currenttime" shader uniform the noise will constantly change as the camera stays still, resulting in a film grain effect. I find it is best to multiply the xy components of the fragment coordinate by some other vec2 value.
     
    Here's another example with a directional light exhibiting a banding appearance due to a low angle on the ground:
     

     
    And here is the improved image with dithering applied:
     

     
    This code does not eliminate shadow acne, it just breaks it up with some random noise so that your eye can't as easily detect a continuous line.
     
    The same technique can be used to improve the appearance of the new godrays shader I am working on. With only 16 samples for the rays, banding is very easily visible in this image.
     

     
    When we add a random offset to the ray starting position, with an amplitude equal to the length of the distance between steps, banding disappears.
     

     
    Of course more samples are always better, but even at higher sample counts, dithering makes a huge improvement in quality. Leadwerks Engine 2 actually used 64 samples, with worse results than what I got using just 16 samples above. The ease with which I can now modify shaders in Leadwerks Editor and see the results instantly really helped develop these techniques.
  3. Josh
    Environment probes are now available on the beta branch. To access them, you must set the "UnlockBetaFeatures" setting in the config file to 1, then start the editor. Environment probes are available in the "Effects" object creation category.
     
    Environment probes should be placed with one in each room. Use the object scale to make the probe's volume fill the room. (Like decals, probes will display a bounding box when selected.) You do not have to worry about covering every single space as the GI effect will blend in pretty well with the regular old ambient light level. Outdoor scenes do not need probes covering everywhere. There is not presently an entity icon for probes so you will have to select them from the scene panel. The AI & Events map has been updated with probes added, and it just took a couple of minutes.
     

     
    The color value of the probe (under the "Appearance" tab in the object properties) is used to control the amount of ambient lighting the probe contributes. The specular color of the probe is used to control the strength of the reflections it casts.
     
    Materials can control their reflectivity with their specular color and specular texture, if it exists. You may find some materials are highly reflective and need to be toned down a bit, so just make their specular color darker. The roughness setting is used to control how sharp or blurry reflections are. Higher roughness = blurrier reflections.
     
    When you first create an environment probe, its cubemap will be rendered immediately. You should re-render it when you have made some changes to your map and want to see the updated global illumination. To do this, select the Tools > Build Global Illumination menu item, which will update all environment probes in the scene.
     
    Global illumination is an inexpensive effect because it is precomputed. However, environment probes will be skipped if the lighting quality is set to the lowest setting.
  4. Josh
    First, there's a lot of confusion about what HDR actually is, and I want to clear it up. HDR does not automatically mean "iris adjustment" although the two often go together. HDR, at its simplest, means that colors won't get washed out during the rendering pipeline.
     
    Game graphics inputs are 8-bit-per-channel textures. The output is also 8-bit (unless you have a 10-bit monitor). So if we only perform one rendering step, it is impossible to lose color resolution. Even if our lighting step brightens a pixel beyond the brightest displayable color (255,255,255), it doesn't matter, because your monitor can't display the difference anyway.
     
    However, if we have post-processing effects that darken the screen again, we CAN lose color resolution. I made an extreme example to showcase this idea. Here's my "brighten" post-processing effect, which increases the brightness of each pixel by four times:

    #version 400
    uniform sampler2D texture1;
    uniform bool isbackbuffer;
    uniform vec2 buffersize;
    out vec4 fragData0;

    void main()
    {
        vec2 tcoord = vec2(gl_FragCoord.xy / buffersize);
        if (isbackbuffer) tcoord.y = 1.0 - tcoord.y;
        vec4 texcolor = texture(texture1, tcoord);
        fragData0 = texcolor * 4.0;
    }
     
    And here's my "darken" post-processing effect, which reverses this effect:

    #version 400
    uniform sampler2D texture1;
    uniform bool isbackbuffer;
    uniform vec2 buffersize;
    out vec4 fragData0;

    void main()
    {
        vec2 tcoord = vec2(gl_FragCoord.xy / buffersize);
        if (isbackbuffer) tcoord.y = 1.0 - tcoord.y;
        vec4 texcolor = texture(texture1, tcoord);
        fragData0 = texcolor / 4.0;
    }
     
    Here is a simple scene with no post-processing effect applied:

     
    If I place the brighten effect first, and follow it by the darken effect, the color will get clamped at the max value of (255,255,255) and then darkened, resulting in a washed out image:

     
    If we use the darken effect first, and then follow up with the brighten effect, the color gets compressed and inaccurate. I had to crank the multiplier up to 60 to show the effect, but you get the idea.

     
    HDR solves this problem by using higher-accuracy float buffers in the middle stage processing. In this example, it results in an image that is identical to what we could see if neither of these post-processing effects were in use:

     
    In real world usage, it can be used to prevent post-processing effects from washing out your colors. However, unless you are using a post-processing effect that darkens the whole screen, even bright pixels, you're not going to see any difference. This is where the iris adjustment shader comes in. Most effects actually brighten the screen, but iris adjustment is one that uniformly dims the screen, regardless of pixel brightness. That's why this effect is closely associated with HDR.
    Texture Formats
    When rendering to floating-point textures, you might think that would increase memory usage a lot. We do have the option to render to 32-bit floating point RGBA textures. However, we also have half-float 16-bit textures, which will probably produce the same results in most cases. But wait, we don't actually need the alpha channel at this stage in the rendering pipeline, and there's a compressed float format available called GL_R11F_G11F_B10F. This eliminates the alpha channel and uses the extra bits to add precision to the color, packing a floating point RGB image into the same space as an 8-bit RGBA image. I just tested, and it even works on Intel integrated graphics! It is likely that in the future this will be the default setting, and HDR will always be enabled by default, at no cost. To start with I am going to make it an explicit option until we figure out how well it works on all hardware, and whether we need a higher-resolution RGBA16F format. 
    You will have access to this feature in the next beta build.
  5. Josh
    A "roughness" property has been added in the material editor. This is a simple slider with a value from zero to one. (The default material roughness value is 0.5, so all your existing materials won't turn into mirrors when the update comes). Changing the roughness value will have no visible effect unless an environment probe is visible in the scene, although I could modify the lighting shaders to make this control gloss in the specular calculation:
     

     
    Two new commands have also been added, Material::SetRoughness and Material::GetRoughness.
     
    All of the channels in our 4-texture gbuffer are already in use. To avoid allocating an additional texture to store the per-pixel roughness value, I used two flags in the material flags value to make a simple 2-bit float. This is encoded in the geometry pass as follows:

    if (materialroughness >= 0.5)
    {
        materialflags += 32;
        if (materialroughness >= 0.75) materialflags += 64;
    }
    else
    {
        if (materialroughness >= 0.25) materialflags += 64;
    }
     
    The probe shader then decodes it into a "roughness" integer:

    int roughness = 1;
    if ((32 & materialflags) != 0) roughness += 4;
    if ((64 & materialflags) != 0) roughness += 2;
     
    The reflection cubemap lookup is then performed as follows:

    miplevel = max(int(textureQueryLod(texture5, shadowcoord).y), roughness);
    vec4 specular = textureLod(texture5, shadowcoord, miplevel) * specularity * lightspecular;
     
    This allows us to pack the roughness value into the gbuffer while avoiding additional texture memory use. We also have one final unused flag in the material flags value we can allocate in the future for additional functionality. Although this only allows four possible roughness values, I think it is a good solution. Just remember that "rough surface" equals "blurry reflection". The image below corresponds to a low roughness value and the reflections are sharp and clear.
     

     
    Notice that the reflections do not appear on the directly lit spheres, due to the "maximum" blend mode described here. While not technically accurate, this is a good way of dealing with the limitations of 8-bit color until ultrabright 10-bit monitors (or more) become the norm.
     
    Here is a shot using a medium roughness value, around 0.5:
     

     
    Finally, a very rough surface gives indistinct reflections that still look great because they correspond to the surrounding environment:
     


    Shader Updates
    To solve the problem described here, all shaders have been updated so that gbuffer normals are stored in world space instead of camera space. If you are using the beta branch, you must update your project to get the new shaders or lighting will not appear correctly. 
    Any third party shaders must be updated for this change. Most model shaders will have a section of code in the vertex program that looks something like this:

    mat3 nmat = mat3(camerainversematrix[0].xyz, camerainversematrix[1].xyz, camerainversematrix[2].xyz);
    nmat *= mat3(entitymatrix[0].xyz, entitymatrix[1].xyz, entitymatrix[2].xyz);
     
    The camera matrix multiplication can be removed and the code simplified to that below:

    mat3 nmat = mat3(entitymatrix);
     
    Some post-processing shaders retrieve the pixel normal with a piece of code like this in the fragment program:

    vec3 normal = normalize(normaldata.xyz*2.0-1.0);
     
    To update these shaders, first declare a new uniform before the main function:

    uniform mat3 camerainversenormalmatrix;
     
    And multiply the world normal by this to get the screen normal:

    vec3 normal = camerainversenormalmatrix * normalize(normaldata.xyz*2.0-1.0);

    Leadwerks Game Engine 4.1
    The new global illumination feature will be released in Leadwerks Game Engine 4.1. For comparison you can see a screenshot below of the Leadwerks 4.0 renderer, which looks good: 

     
    But the Leadwerks 4.1 render of the same scene looks absolutely fantastic:
     

     
    Until now, directly illuminated surfaces in Leadwerks looked the best, and now shaded areas look equally beautiful, or even better. Remaining tasks include testing on AMD and Intel hardware, which I have not done yet. I also have to do more work to selectively render objects in the GI reflection render. Objects that cast dynamic shadows are presently skipped, but I need to actually re-render all shadows so their shadows aren't visible in the reflection. I also need to remove entities like particle emitters and adjust quality settings so the GI render isn't rendering reflective water and other unnecessary things. Finally, the probe shader needs more work so it can handle rotation of the probe entity, which it presently does not do.
  6. Josh
    I was able to work out the shader fixes and now I can show you a good schematic of how the new lighting system works. Instead of a single light source, we now have three types of lighting that are combined for the final render. Direct lighting is the standard deferred lighting model you know and love from Leadwerks. This looks great, but shadowed areas have always looked pretty flat.

     
    The second component is the new GI ambient lighting. This is a view-independent lookup on the local cubemap that gives us a rough approximation of single-bounce radiosity. Notice how the lightened areas are lit up as a result of the direct lighting in the previous image. The bluish rectangle on the floor is a result of the sky color reflecting from above:

     
    Finally, we have the new specular reflections, a high-res dramatic reflective effect that changes with the view position:

     
    When all three lighting contributions are combined, it looks like this:

     
    And when you add textures, it looks like this:

     
    Instead of like this:

  7. Josh
    I've been puzzling over a problem that exhibits itself on flat surfaces. Cubemap reflections appear "jittery", as if they are being rounded off to a lower-precision float. At first I thought this might mean that our best-fit normals routine, which uses 8 bits per channel, might lack the precision needed for cubemap reflections. However, when I changed the normal texture in the gbuffer to a 32-bit floating-point RGBA texture, the problem remained. This error is also visible in Igor's SSLR shader.
     
    I found that the camera matrix multiplication in the model vertex shader was causing this, but I couldn't figure out why. In the model shader, the normal is multiplied by the camera normal matrix:

    mat3 nmat = camerainversenormalmatrix;
    nmat *= mat3(entitymatrix[0].xyz, entitymatrix[1].xyz, entitymatrix[2].xyz);
    ex_normal = normalize(nmat * vertex_normal);
    ex_tangent = normalize(nmat * vertex_tangent);
    ex_binormal = normalize(nmat * vertex_binormal);
     
    Then that operation is reversed in the probe lighting shader:

    shadowcoord = cameranormalmatrix * reflect(screennormal*flipcoord,normal);
     
    However, if I stored the fragment normal in world space rather than camera space, the error went away. This makes some sense because in world space the value is not constantly changing as you rotate the camera. In the video below I switch the code back and forth to adjust the two shaders and show the difference.
     


     
    Changing this will require the matrix multiplication to be changed in pretty much every lighting, post-processing, and model shader, which isn't a huge deal. I still don't understand why using a 32-bit floating-point RGBA texture for the normal texture didn't fix the problem, though. I also verified that the camera matrix was not changing between the model drawing and the probe drawing.
     
    Part of the idea for the design for this system came from watching footage of the new Doom game, which makes good use of a parallax cubemapping effect all over the environment. You can see lots of lovely reflections in the pools of blood on the floor here:
     


  8. Josh
    After resolving a few odds and ends I was able to create a proof of concept of the deferred environment probe idea I talked about earlier. The environment probe is basically the same as a point light. They have a range and affect a finite area. The color property can be used to adjust the intensity of the ambient lighting / reflections. Basically, you want to cover your indoor environments with these, but it's okay if you miss some spots. The environment probes fade out gradually so you don't have to be too strict about coverage. Also, if two probes overlap, the brightest value will be used, but they will not add their lighting together.
     

     
    There are still some issues to figure out, like how to control the miplevel that is used in the cubemap lookup and how to automate cubemap rendering in the editor. I'm not going to try to make the cubemaps update in real-time in the game, because it would be too slow.
     
    The screenshots below show the same scene with flat ambient lighting:

     
    And with an environment probe placed in the scene:

  9. Josh
    The environment probe feature is coming along nicely, spawning a whole new system I am tentatively calling "Deferred Image-Based Lighting". This adds image-based lighting to your scenes that accentuates the existing deferred renderer and makes for higher quality screenshots with lots of next-gen detail. The probes handle both image-based ambient lighting and reflections.
     
    Shadmar helped by implementing parallax cubemap correction. This provides an approximate volume the cubemap is rendered to and gives us a closer approximation of reflections. You can see a good side-by-side comparison of it vs. conventional cubemaps in Half-Life 2. It works best when the volume of the room is defined by the shape of the environment probe, so I changed probes to use a box for rendering instead of a sphere. The box is controlled by the object's scale, just like with decals, although I wonder if there would be a way to build these out of brushes in the future, since our CSG building tools are so convenient. In the screenshot below you can see I have added two environment probes to a map, one filling the volume of each room. Placing probes does require some amount of artistry as it isn't a 100% perfect system, but with a little experimentation it's easy to get really beautiful results.

     
    Once probes are placed you will select a menu item to build global illumination. GI will be built for all probes and the results are instantly visible. The shot below is only using an environment probe for lighting, together with an SSAO post-processing effect. Because the ambient lighting and reflections are image-based, any bright surface can act as a light source even though it isn't really one.
     

     
    When combined with our direct deferred lighting, shadowed areas become much more complex and interesting. Note the light bouncing off the floor and illuminating the ceiling.
     

     
    The new feature works seamlessly with all existing materials and shaders, since it is rendered in a deferred step. It can be combined with a conventional ambient light level, or you can set the ambient light to black and just rely on probes to provide indirect lighting.
  10. Josh
    Previously I talked about the idea of implementing a new "Ambient Point Light" into the engine. This would work by rendering the surrounding environment to a cubemap, just like a regular point light does. However, this type of light would render to a color buffer instead of just a depth buffer.
     
    The light could then be rendered to display ambient reflections and soft lighting on the surrounding environment. While it would not provide perfect real-time reflections, it would give an extra boost to the appearance of scenes that make heavy use of reflective surfaces.
     
    One of the problems in this type of system is how to handle overlapping lights. It would look weird to have an area where two lights are combining to make the ambient light or reflections brighter.
     
    I found I could create a new "lighten" blend mode with the following code:

    glBlendFunc(GL_ONE, GL_ONE);
    glBlendEquation(GL_MAX);
     
    This ensures that a light will only brighten a pixel up to its own value, and never beyond. If the pixel is already brighter than the light color it will have no effect. Below you can see two spotlights using this new blend mode. Notice that the area where the two lights both illuminate is never any brighter than either one.
     

     
    This also means soft ambient lighting will only appear in dark areas, and will have no effect on brightly lit surfaces, as it should.
  11. Josh
    As described previously, the open-market model store approach does not work for Leadwerks and I am shifting my focus to providing DLCs made by third-party authors. The first thing I want to do is create more consistent branding that all DLC model packs will use. Here's what I came up with:
     

     

     

     

     
    I'm also developing terminology to apply consistently to say what the DLC contains. The meaning of "Material Pack" and "Model Pack" is obvious. "Action Figures" will refer to character models that are scripted and completely ready to use. "Weapon Pack" will likewise refer to weapons that are scripted and completely ready to use.
     
    Unscripted models without sounds are something I am hesitant to add. I feel like the expectation for DLCs will be that each item is completely ready to use. This is one of the ways this approach is more limiting than the Workshop Store.
  12. Josh
    The Leadwerks Mercenary Action Figure has been released as a DLC. Previously, I talked about how the Workshop Store was not selling much content, and that we need game-ready content to be available to use in Leadwerks. At the same time, DLC sales are quite good, which seems contradictory.
     
    To date, this item has total sales of $299 since release and cost $2000 to produce. (Additional characters will be a bit cheaper if we reuse animations.) After Steam's cut, I've lost $1800 on this project at this point. For that reason, I have halted production of additional characters.
     
    By releasing this item as a DLC, I can answer a few questions:
    Will a single model released as a DLC sell?
    Can we produce our own exclusive HD content in a sustainable manner?
    What is the difference between Workshop Store sales and DLC sales for the same exact item? I am very interested in getting this statistic and showing it to Valve.

     
    If this DLC item sells enough to cover my costs of production by August 1st (three months), then I will begin production of two additional characters and make it a pack. If that sells then we can start producing all kinds of character packs for every type of game, using the same studio to give us consistent artwork. Here are some of the other sketches that were created during production.
     

  13. Josh
    An update is available on the beta branch with the following changes:
    Fixed System::Print() command, which was printing char* values as bools.
    Added optional parameter to the end of Entity::SetInput().
    Added optional parameter to the end of Emitter::SetVelocity().
    Fixed light occlusion culling bug.
    Fixed a sound source position bug.

  14. Josh
    As discussed before, sales in the Workshop Store have been measly compared to what the DLCs regularly do. People just don't want to use it. Without revenue coming in, I can't convince third parties that it is a good idea to sell through this system (it isn't), and no new products will get added. We need a large library of content to be available for use in Leadwerks, but this isn't working. So I am moving on to another approach.
     
    The "Workshop" menu item in the website header has been placed under the "Community" menu where it is less prominent. Any products you purchased through the Workshop Store will remain available, forever. The Workshop will continue to be a place for the community to share files.
     
    I considered associating DLC purchases with Workshop items so that the two systems were merged, at least from the user's perspective, but decided against it, for now. Discovery takes place in the Steam Store, not the Workshop interface, and there are more important issues to tackle first. I may tie these together in the future so that installing a DLC goes through the Workshop interface. What we need right now is to have lots of content available, and to make sure that content is selling so that more will be added. If that isn't happening, nothing else matters.
     
    A new update adds a tab in the Workshop window titled "DLC". This is where all your purchased DLCs will be listed, and this is how they can be installed to your project.
     

     
    Okay, so monetizing the Workshop failed, and I am calling it now. Now what?
     
    At the same time the Workshop was struggling, our DLC sales of model packs have been very, very good. In fact, the DLCs will regularly sell more in a single day than the Workshop Store has done during its entire three month existence. It's clear that one approach works and the other does not.
     
    What I don't like about DLCs is that they require more oversight by me, they can really only be sold as packs rather than individual models, the quality control and standards have to be higher and thus more limiting, and there is a limited amount of store space available to show them. Any revenue splits I do have to be calculated manually, and me doing that myself each month just isn't an option. On Monday I am meeting with an accounting firm so that they can handle this task for me each month. I plan to start releasing third-party model packs as DLCs, where they will actually get sales, and I will outsource the handling of royalty payments so I don't have to deal with it.
     
    This didn't work out the way I planned, but the fact remains we have a method of selling content that works and provides sales for third parties comparable to the other model stores they sell content through. We just need to recognize our own unique strengths and play to them instead of fighting what the user wants.
  15. Josh
    An update is available on the beta branch which makes changes to the way DLCs are installed. The SciFi Interior Pack has been added as a paid Workshop item. You can obtain this item by either purchasing it through the Workshop Store interface, or buying the DLC on our Steam Store page. At the time of this writing, the other model pack DLCs have not been added yet.
     
    To install the SciFi Interior Pack DLC you simply browse to it in the Workshop interface. If you own the DLC the "Buy" button will say "Install" and allow you to install the item. The "Subscribed Items" tab has been removed completely.
     

     
    We have found that customers overwhelmingly prefer to purchase DLCs through the Steam store, and do not purchase Workshop store items at nearly the same rate. These changes allow us to simplify the install process and will allow us to "promote" Workshop items to DLCs, where they will get good sales.
     
    I don't understand the reason for this behavior, but there is more than a 10x difference in sales.
  16. Josh
    I sent my mockups off to the designer to create our very own hoodie as a super tournament prize. This garment will come with not one, not two, but three separate printed graphics on a high-quality American Apparel hooded fleece. Here's roughly what it will look like:
     

     
    And here is a t-shirt design for a future production run:
     

  17. Josh
    If you entered the Winter Games Tournament, I am shipping prizes to you now (late, I know).
     
    You will receive a friend request from the Leadwerks Software Steam account. Please accept this. I will need your mailing address in order to ship your prize.
  18. Josh
    The Leadwerks Merc character, who I think will go down in history over the next few years as second only to the infamous "Crawler", is an experiment. First of all, I wanted a completely custom-made character to develop AI with. This ensured that I was able to get the model made exactly to my specs so that we would have a template for other characters to follow. Fortunately, the script I wrote can easily be used with other models like Arteria's Strike Troop.
     
    The quality of this model is really high and I am really happy to be able to provide high-end content like this for use with Leadwerks. The cost of production was $2965, plus $125 for the weapon model, for a total of $3090. Additional characters can be made at a slightly lower cost, if I choose to go forward with production. At $9.99, we need to sell 309 copies to break even.
     
    Leadwerks model packs sell quite well as DLCs, but so far Workshop Store sales have been slower. It seems that getting people to use the new store is a task itself, because people are unfamiliar with the process.
     
    In the first two weeks, the Merc character has sold 13 copies for a total of $129 in sales (times 70% after the Steam tax). Assuming a flat rate of sales, this means that the character will take one year and four months before sales have covered costs.
     
    This is not an acceptable rate of cost recovery. Based on these results, I am not going forward with additional custom characters. I'm not sure how this will develop, but I am halting all production for now.
  19. Josh
    There are three low-level advancements I would like to make to Leadwerks Game Engine in the future:
    Move Leadwerks over to the new Vulkan graphics API.
    Replace Windows API and GTK with our own custom UI. This will closely resemble the Windows GUI but allow new UI features that are presently impossible, and give us more independence from each operating system.
    Compile the editor with BlitzMaxNG. This is a BMX-to-C++ translator that allows compilation with GCC (or VS, I suppose). This would allow the editor to be built in 64-bit mode.

     
    None of these enhancements will result in more or better games, and thus they do not support our overarching goal. Each will also involve a significant amount of backtracking. For example, a new Vulkan renderer is pretty much guaranteed to be slower than our existing OpenGL renderer for the first six months, and it won't even run on a Mac. A new GUI will involve lots of bugs that set us back.
     
    This is likely to be stuff that I just explore slowly in the background. I'm not going to take the next three months to replace our renderer with Vulkan. But this is the direction I want to move in.
  20. Josh
    An update is available on the beta branch which resolves the AMD rendering issues on Windows 10 with the 1x MSAA setting enabled:
    http://www.leadwerks.com/werkspace/topic/13273-graphical-glitch/
     
    The following components have changed:
    Lighting shaders adjusted to handle regular 2D textures (MSAA level 0)
    Editor updated, for Windows.
    Lua interpreter and debug interpreter updated, for Windows.

     
    The C++ library has not been updated yet.
  21. Josh
    Happy Friday.
     

     
    One thing I will point out is that success is much more about creative experimentation than it is about sheer willpower. You have a little control over willpower, but not a ton. It's much better to keep the same amount of effort and just be smarter about what you do, than to try really really hard. That means taking a lot of risks and experiencing a lot of small low-cost failures to find what works.
  22. Josh
    Note: this is highly experimental and subject to cancellation, change, revision, etc. Now on with the blog...
     
    Shadmar and Reepblue have paved the way for this research, and I am experimenting with some of these ideas in Leadwerks.
     
    Environment probes in Leadwerks are an experimental entity that creates a vantage point from which a cubemap is generated. The cubemap provides a 360 degree view of the surrounding environment in a single texture. This texture can then be used in the deferred lighting routine to provide more realistic ambient lighting and reflections.
     
    Let's start off with the limitations. These are not an end-all be-all solution to global illumination or reflections. Environment probes must be placed by hand, and like a light, only cover a restricted area. The smaller the area, the more accurate they can be, but they will never provide 100% physically accurate reflections. They also do not reflect dynamic objects and can only enhance static lighting. Rendering the cubemaps is a fairly expensive step and should not be performed in real-time.
     
    It was fairly simple to implement a new entity type and make it render the surrounding scene into a cubemap. A new parameter was added to the Camera::SetRenderTarget() command to allow the user to specify a cube face index, from zero to five. The resulting cubemap is displayed on a sphere in the editor.
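    Since a cubemap always has exactly six faces, the probe render pass amounts to looping over face indices 0 through 5. A minimal sketch of that loop (Python, with a hypothetical render_face callback standing in for the engine's actual Camera::SetRenderTarget() call):

```python
# A cubemap has six faces, indexed 0..5, one per axis direction.
CUBE_FACES = ["+X", "-X", "+Y", "-Y", "+Z", "-Z"]

def render_probe(render_face):
    """Render the surrounding scene once per cube face."""
    for index, axis in enumerate(CUBE_FACES):
        render_face(index, axis)   # e.g. camera->SetRenderTarget(texture, index)

# Record the face order instead of actually rendering.
faces = []
render_probe(lambda i, a: faces.append((i, a)))
print(len(faces))  # 6 faces, indices 0 through 5
```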
     

     
    The next problem is what to do with the generated cubemaps. Source Engine uses the closest environment probe to each object when rendering reflective materials. This results in popping as objects move between different probes. We don't want that.
     
    It occurred to me that we could perform an additive step in the ambient light calculation by uploading the closest 4-8 cubemaps into the ambient light full-screen shader. This is basically like performing a forward rendering pass, though, and I realized it was better for each probe to be rendered on its own, in a deferred step, just like a point light. So now, instead of thinking of environment probes as area hints of what reflections should look like, I am considering them to be "ambient lights".
     
    Environment probes / ambient lights would add two types of lighting to the scene. They would first use a low-frequency sample of the cubemap (probably by just using a low-res mipmap) and use this to add ambient light to the scene. This would do things like make a neon sign cast lighting onto the surrounding environment. Another calculation could be performed at the same time for reflective surfaces, possibly using the material's specular term as the reflectivity at each pixel. You would probably want to use pure black ambient light with this, so that the lighting just came from the environment itself.
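    As a rough illustration of the low-frequency ambient term: averaging a face's pixels approximates reading a low-res mip level. This Python sketch uses made-up pixel values; the actual shader math is undecided:

```python
# Hypothetical sketch: approximate the "low-frequency sample" of one cube
# face by averaging its pixels -- the same idea as sampling a 1x1 mip level.
def ambient_term(face_pixels):
    """face_pixels: list of (r, g, b) tuples for one cube face."""
    n = len(face_pixels)
    return tuple(sum(p[c] for p in face_pixels) / n for c in range(3))

# A face dominated by a bright neon-green sign tints the ambient light green.
face = [(0.1, 0.9, 0.2)] * 3 + [(0.05, 0.05, 0.05)]
ambient = ambient_term(face)
print(ambient)  # green channel dominates
```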
     
    The good of this is that with some extra care, you could create an environment with detailed reflections, more varied ambient lighting, and it would be completely compatible with how we do everything right now.
     
    The downside is that you will have to place a lot of overlapping environment probes to fill up your map, it may be tedious to cover every part of it, and your reflections won't react to dynamic objects or changes in lighting without expensive re-renders of the cubemaps.
     
    If it works out as described above, I see this as an additional tool you can use, if wanted, to achieve new looks. It isn't an end-all solution for completely dynamic reflections, but it could be used to enhance some types of maps.
     
    More as it develops.
  23. Josh
    My productivity has not been good lately because I am agonizing over where to live. I don't like where I am right now. Decisions like this were easy in college because I would just go where the best program was for what I was interested in. It's more difficult now to know if I am making the right choice.
     
    I can "go big" and move to San Francisco, which weirdly has become the capital of tech. I lived there before and know all the craziness to expect. I used to work at 24th and Mission (before Leadwerks existed) which was and still is a dangerous neighborhood, but if you go one block north there's now an Alamo Drafthouse where drug dealers used to offer me crack cocaine. (I was gentrifying the Mission before it was cool.) San Francisco is a stressful place full of insecure nasty people, but there's also a ton of stuff to do, and it's full of world-class tech talent and a lot of interesting cool people from all walks of life.
     
    The other option is to move out to the midwest somewhere and buy a house for cheap, and just chill out and work. I won't have to spend time fighting with insane traffic, trying to get groceries, and dodging hobo poop on the sidewalk.
     
    At the same time, Leadwerks has reached a level of success where I can easily get the attention of people in the game industry I previously did not have access to. The 10,000 user announcement (set to hit 20,000 this summer) was a big deal.
     
    Basically, I need to get relocated in the next two weeks, open a new office, and get back to work. I'm having a hard time deciding this.
  24. Josh
    The stock scripts that come with Leadwerks form a continually expanding game framework that can be used in many different ways. The reuse we get out of the door, switch, trigger, and AI scripts is really great. Every time I add a new script I can count on it being used in many different ways.
     
    Sometimes this requires a custom model in order to develop and demonstrate usage of the script. It's best for me to have one model made exactly to my specs, and then make everything else conform to that template.
     
    There are additional scripts I want to release in the near future that add more gameplay elements:
    TurretAI, with custom model.
    Enterable vehicles with controls for guns, if present.
    Third-person player controls, with options for fixed rotation or over-the-shoulder views.

     
    I also want to make the projectile prefab loaded by the soldierAI script a modifiable script value, so that we can easily add new types of projectiles. This would also work with turrets or vehicle-mounted guns.
     
    What other ideas for widely reusable scripts do you have?
  25. Josh
    You might not have noticed this, but Workshop items can now include the following file formats:
    "bmp","jpg","jpeg","png","tga","dds","psd","blend","fbx","obj","3ds","x","dae"
     
    When Leadwerks installs Workshop items, the source art files are unzipped first, then the final files are unzipped, in order to avoid triggering a reconversion. If you import an FBX and make some changes to it, you can safely include the FBX in your Workshop package, and it will install without reconverting the source file.
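    The install ordering can be sketched like this (Python; the file names are hypothetical and the real installer's logic may differ):

```python
# Sketch of the install order described above: source art files are
# extracted first, then final converted assets, so that the final files
# overwrite any auto-converted output and no reconversion is triggered.
SOURCE_EXTS = {"bmp", "jpg", "jpeg", "png", "tga", "dds", "psd",
               "blend", "fbx", "obj", "3ds", "x", "dae"}

def install_order(files):
    """Return files with source art first and final assets second."""
    is_source = lambda f: f.rsplit(".", 1)[-1].lower() in SOURCE_EXTS
    sources = [f for f in files if is_source(f)]
    finals = [f for f in files if not is_source(f)]
    return sources + finals

# Hypothetical package contents: .fbx/.png are source art, .mdl/.tex are final.
print(install_order(["soldier.mdl", "soldier.fbx", "skin.tex", "skin.png"]))
# -> ['soldier.fbx', 'skin.png', 'soldier.mdl', 'skin.tex']
```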