Blog Entries posted by Josh

  1. Josh
    I've begun implementing unicode in Leadwerks Game Engine 5.  It's not quite as simple as "switch all string variables to another data type".
    First, I will give you a simple explanation of what unicode is.  I am not an expert so feel free to make any corrections in the comments below.
    When computers first started drawing text we used a single byte for each character.  One byte can describe 256 different values and the English language only has 26 letters, 10 numbers, and a few other characters for punctuation so all was well.  No one cared or thought about supporting other languages like German with funny umlauts or the thousands of characters in the Chinese language.
    Then some people who were too smart for their own good invented a really complicated system called unicode.  Unicode allows characters beyond the 256 character limit of a byte because it can use more than one byte per character.  But unicode doesn't really store a letter, because that would be too easy.  Instead it stores a "code point" which is an abstract concept.  Unfortunately the people who originally invented unicode were locked away in a mental asylum where they remain to this day, so no one in the real world actually understands what a code point is.
    There are several kinds of unicode but the one favored by nerds who don't write software is UTF-8.  UTF-8 uses just one byte per character, unless it uses two, or sometimes four.  Because each character can be a different length there is no way to quickly get a single letter of a string.  It would be like trying to get a single byte of a compressed zip file; you have to decompress the entire file to read a byte at a certain position.  This means that commands like Replace(), Mid(), Upper(), and basically any other string manipulation commands simply will not work with UTF-8 strings.
    Nonetheless, some people still promote UTF-8 religiously because they think it sounds cool and they don't actually write software.  There is even a UTF-8 Everywhere Manifesto.  You know who else had a manifesto?  This guy, that's who:

    Typical UTF-8 proponent.
    Here's another person with a "manifesto":

    The Unabomber (unibomber? Coincidence???)
    The fact is that anyone who writes a manifesto is evil, therefore UTF-8 proponents are evil and should probably be imprisoned for crimes against humanity.  Microsoft sensibly solved this problem by using something called a "wide string" for all the Windows internals.  A C++ wide string (std::wstring) is a string made up of wchar_t values instead of char values.  (The std::string data type is sometimes called a "narrow string").  In C++ you can set the value of a wide string by placing a letter "L" (for long?) in front of the string:
    std::wstring s = L"Hello, how are you today?";

    On Windows a wchar_t value is composed of two bytes (the C++ standard actually leaves the size up to the implementation, so it is four bytes on Linux, but the idea is the same).  A two-byte wide string cannot display a character with an index greater than 65535, but no one uses those characters so it doesn't matter.  Wide strings are basically a different kind of unicode called UTF-16 and these will actually work with string manipulation commands (yes there are exceptions if you are trying to display ancient Vietnamese characters from the 6th century but no one cares about that).
    For more detail you can read this article about the technical details and history of unicode (thanks @Einlander).
    First Pass
    At first I thought "no problem, I will just turn all string variables into wstrings and be done with it".  However, after a couple of days it became clear that this would be problematic.  Leadwerks interfaces with a lot of third-party libraries like Steamworks and Lua that make heavy use of strings.  Typically these libraries will accept a char* value for the input, which we know might be UTF-8 or it might not (another reason UTF-8 is evil).  The engine ended up with a TON of string conversions that I might be doing for no reason.  I got the compiler down to 2991 errors before I started questioning whether this was really needed.
    Exactly what do we need unicode strings for?  There are three big uses:
      1. Read and save files.
      2. Display text in different languages.
      3. Print text to the console and log.
    Reading files is mostly an automatic process because the user typically uses relative file paths.  As long as the engine internally uses a wide string to load files the user can happily use regular old narrow strings without a care in the world (and most people probably will).
    Drawing text to the screen or on a GUI widget is very important for supporting different languages, but that is only one use.  Is it really necessary to convert every variable in the engine to a wide string just to support this one feature?
    Printing strings is even simpler.  Can't we just add an overload to print a wide string when one is needed?
    I originally wanted to avoid mixing wide and narrow strings, but even with unicode support most users are probably not even going to need to worry about using wide strings at all.  Even if they have language files for different translations of their game, they are still likely to just load some strings automatically without writing much code.  I may even add a feature that does this automatically for displayed text.  So with that in mind, I decided to roll everything back and convert only the parts of the engine that would actually benefit from unicode and wide strings.
    Second Try + Global Functions
    To make the API simpler Leadwerks 5 will make use of some global functions instead of trying to turn everything into a class.  Below are the string global functions I have written:
    std::string String(const std::wstring& s);
    std::string Right(const std::string& s, const int length);
    std::string Left(const std::string& s, const int length);
    std::string Replace(const std::string& s, const std::string& from, const std::string& to);
    int Find(const std::string& s, const std::string& token);
    std::vector<std::string> Split(const std::string& s, const std::string& sep);
    std::string Lower(const std::string& s);
    std::string Upper(const std::string& s);

    There are equivalent functions that work with wide strings.
    std::wstring StringW(const std::string& s);
    std::wstring Right(const std::wstring& s, const int length);
    std::wstring Left(const std::wstring& s, const int length);
    std::wstring Replace(const std::wstring& s, const std::wstring& from, const std::wstring& to);
    int Find(const std::wstring& s, const std::wstring& token);
    std::vector<std::wstring> Split(const std::wstring& s, const std::wstring& sep);
    std::wstring Lower(const std::wstring& s);
    std::wstring Upper(const std::wstring& s);

    The System::Print() command has become a global Print() command with a couple of overloads for both narrow and wide strings:
    void Print(const std::string& s);
    void Print(const std::wstring& s);

    The file system commands are now global functions as well.  File system commands can accept a wide or narrow string, but any functions that return a path will always return a wide string:
    std::wstring SpecialDir(const std::string);
    std::wstring CurrentDir();
    bool ChangeDir(const std::string& path);
    bool ChangeDir(const std::wstring& path);
    std::wstring RealPath(const std::string& path);
    std::wstring RealPath(const std::wstring& path);

    This means if you call ReadFile("info.txt") with a narrow string the file will still be loaded even if it is located somewhere like "C:/Users/约书亚/Documents" and it will work just fine.  This is ideal since Lua 5.3 doesn't support wide strings, so your game will still run on computers around the world as long as you just use local paths like this:
    LoadModel("Models/car.mdl");

    Or you can specify the full path with a wide string:
    LoadModel(CurrentDir() + L"Models/car.mdl");

    The window creation and text drawing functions will also get an overload that accepts wide strings.  Here's a window created with a Chinese title:

    So in conclusion, unicode will be used in Leadwerks and will work for the most part without you needing to know or do anything different, allowing games you develop (and Leadwerks itself) to work correctly on computers all across the world.
  2. Josh
    I've built a modified version of ReepBlue's C++ animation manager class (based off my Lua script) into the engine. This adds a new command and eliminates the need for the animation manager script.
     

    void Entity::PlayAnimation(const std::string& sequence, const float speed = 1.0f, const int blendtime = 500, const int mode = 0, const std::string endhook = "")
    void Entity::PlayAnimation(const int index, const float speed = 1.0f, const int blendtime = 500, const int mode = 0, const std::string endhook = "")
     
    The animation manager only updates when the entity is drawn, so this does away with the most common use of the entity draw hook. This will make the engine a bit simpler to use, and existing code will still work as before.

    Usage in Lua
    Although the animation manager script has been removed from the default scripts, you do not need to modify your existing projects. 
    The AI, weapon, and animation scripts have been updated to use the built-in animation manager. Previously these scripts would create an AnimationManager lua table and call it like this:

    self.animationmanager:SetAnimationSequence("Idle",0.02)
     
    These calls now look like this:

    self.entity:PlayAnimation("Idle",0.02)
     
    One-shot animations that call a callback when the end of the sequence is reached used to look like this:

    self.animationmanager:SetAnimationSequence("Death",0.04,300,1,self,self.EndDeath)
     
    Now we just pass the function name in:

    self.entity:PlayAnimation("Death",0.04,300,1,"EndDeath")
     
    Again, as long as you have the old AnimationManager.lua script in your project, your existing scripts do not need to be updated.
     
    Animation callbacks are not yet supported in C++.
     
    This update will be available later today, for C++ and Lua, on all platforms.
  3. Josh
    Heat haze is a difficult problem. A particle emitter is created with a transparent material, and each particle warps the background a bit. The combined effect of lots of particles gives the whole background a nice shimmering wavy appearance. The problem is that when two particles overlap one another they don't blend together, because the last particle drawn is using the background of the solid world for the refracted image. This can result in a "popping" effect when particles disappear, as well as apparent seams on the edges of polygons.

    In order to do transparency with refraction the right way, we are going to render all our transparent objects into a separate color texture and then draw that texture on top of the solid scene. We do this in order to accommodate multiple layers of transparency and refraction. Now, the correct way to handle multiple layers would be to render the solid world, render the first transparent object, then switch to another framebuffer and use the previous framebuffer's color attachment as the source of your refraction image. This could be done per-object, although flipping back and forth between two framebuffers could get very expensive, and even that wouldn't help when surfaces overlap within a single object, like the particles in an emitter.
    If we render all the transparent surfaces into a single image, we can blend their normals, refractive index, and other properties, and come up with a single refraction vector that combines the underlying surfaces in the best way possible.
    To do this, the transparent surface color is rendered into the first color attachment. Unlike deferred lighting, the pixels at this point are fully lit.

    The screen normals are stored in an additional color attachment. I am using world normals in this shot but later below I switched to screen normals:

    These images are drawn on top of the solid scene to render all transparent objects at once. Here we see the green box in the foreground is appearing in the refraction behind the glass dragon.

    To prevent this from happening, we need to add another color texture to the framebuffer and render the pixel Z position into it. I am using the R32_SFLOAT format. I use the separate blend mode feature in Vulkan, and set the blend mode to minimum so that the smallest value always gets saved in the texture. The Z-position is divided by the camera far range in the fragment shader, so that the saved values are always between 0 and 1. The clear color for this attachment is set to 1,1,1,1, so any value written into the buffer will replace the background. Note this is the depth of the transparent pixels, not the whole scene, so the area in the center where the dragon is occluded by the box is pure white, since those pixels were not drawn.

    In the transparency pass, the Z position of the transparent pixel is compared to the Z position at the refracted texcoords. If the refracted position is closer to the camera than the transparent surface, the refraction is disabled for that pixel and the background directly behind the pixel is shown instead. There is some very slight red visible in the refraction, but no green.

    Now let's see how well this handles heat haze / distortion. We want to prevent the problem when two particles overlap. Here is what a particle emitter looks like when rendered to the transparency framebuffer, this time using screen-space normals. The particles aren't rotating so there are visible repetitions in the pattern, but that's okay for now.

    And finally here is the result of the full render. As you can see, the seams and popping are gone, and we have a heavy but smooth distortion effect. Particles can safely overlap without causing any artifacts, as their normals are just blended together and combined to create a single refraction angle.

  4. Josh
    An update is available for the new Turbo Game Engine beta.
      • Fixed compiling errors with latest Visual Studio version
      • Fixed compatibility problems with AMD hardware
      • Process is now DPI-aware
      • Added high-resolution depth buffer (additional parameter on CreateContext())
    Subscribers can download the new version here.
  5. Josh
    In games we think of terrain as a flat plane subdivided into patches, but did you know the Earth is actually round? Scientists say that as you travel across the surface of the planet, a gradual slope can be detected, eventually wrapping all the way around to form a spherical shape! At small scales we can afford to ignore the curvature of the Earth but as we start simulating bigger and bigger terrains this must be accounted for. This is a big challenge. How do you turn a flat square shape into a sphere? One way is to make a "quad sphere", which is a subdivided cube with each vertex set to the same distance from the center:

    I wanted to be able to load in GIS datasets so we could visualize real Earth data. The problem is these datasets are stored using a variety of projection methods. Mercator projections are able to display the entire planet on a flat surface, but they suffer from severe distortion near the north and south poles. This problem is so bad that most datasets using Mercator projections cut off the data above and below 75 degrees or so:

    Cubic projections are my preferred method. This matches the quad sphere geometry and allows us to cover an entire planet with minimal distortion. However, few datasets are stored this way:

    It's not really feasible to re-map data into one preferred projection method. These datasets are enormous. They are so big that if I started processing images now on one computer, it might take 50 years to finish. We're talking thousands of terabytes of data that can be streamed in, most of which the user will never see even if they spend hours flying around the planet.
    There are many other projection methods:

    How can I make our terrain system handle a variety of projection methods to display data from multiple sources? This was a difficult problem I struggled with for some time before the answer came to me.
    The solution is to use a user-defined callback function that transforms a flat terrain into a variety of shapes. The callback function is used for culling, physics, raycasting, pathfinding, and any other system in which the CPU uses the terrain geometry:
    #ifdef DOUBLE_FLOAT
    void Terrain::Transform(void TransformCallback(const dMat4& matrix, dVec3& position, dVec3& normal, dVec3& tangent, const std::array<double, 16>& userparams), std::array<double, 16> userparams)
    #else
    void Terrain::Transform(void TransformCallback(const Mat4& matrix, Vec3& position, Vec3& normal, Vec3& tangent, const std::array<float, 16>& userparams), std::array<float, 16> userparams)
    #endif

    An identical function is used in the terrain vertex shader to warp the visible terrain into a matching shape. This idea is similar to the vegetation system in Leadwerks 4, which simultaneously calculates vegetation geometry in the vertex shader and on the CPU, without actually passing any data back and forth.
    void TransformTerrain(in mat4 matrix, inout vec3 position, inout vec3 normal, inout vec3 tangent, in mat4 userparams)

    The following callback can be used to handle quad sphere projection. The position of the planet is stored in the first three user parameters, and the planet radius is stored in the fourth parameter. It's important to note that the position supplied to the callback is the terrain point's position in world space before the heightmap displacement is applied. The normal is just the default terrain normal in world space. If the terrain is not rotated, then the normal will always be (0,1,0), pointing straight up. After the callback is run the heightmap displacement will be applied to the point, in the direction of the new normal. We also need to calculate a tangent vector for normal mapping. This can be done most easily by taking the original position, adding the original tangent vector, transforming that point, and normalizing the vector between that and our other transformed position.
    #ifdef DOUBLE_FLOAT
    void TransformTerrainPoint(const dMat4& matrix, dVec3& position, dVec3& normal, dVec3& tangent, const std::array<double, 16>& userparams)
    #else
    void TransformTerrainPoint(const Mat4& matrix, Vec3& position, Vec3& normal, Vec3& tangent, const std::array<float, 16>& userparams)
    #endif
    {
        //Get the position and radius of the sphere
    #ifdef DOUBLE_FLOAT
        dVec3 center = dVec3(userparams[0], userparams[1], userparams[2]);
    #else
        Vec3 center = Vec3(userparams[0], userparams[1], userparams[2]);
    #endif
        auto radius = userparams[3];

        //Get the tangent position before any modification
        auto tangentposition = position + tangent;

        //Calculate the ground normal
        normal = (position - center).Normalize();

        //Calculate the transformed position
        position = center + normal * radius;

        //Calculate transformed tangent
        auto tangentposnormal = (tangentposition - center).Normalize();
        tangentposition = center + tangentposnormal * radius;
        tangent = (tangentposition - position).Normalize();
    }

    And we have a custom terrain shader with the same calculation defined below:
    #ifdef DOUBLE_FLOAT
    void TransformTerrain(in dmat4 matrix, inout dvec3 position, inout dvec3 normal, inout dvec3 tangent, in dmat4 userparams)
    #else
    void TransformTerrain(in mat4 matrix, inout vec3 position, inout vec3 normal, inout vec3 tangent, in mat4 userparams)
    #endif
    {
    #ifdef DOUBLE_FLOAT
        dvec3 tangentpos = position + tangent;
        dvec3 tangentnormal;
        dvec3 center = userparams[0].xyz;
        double radius = userparams[0].w;
    #else
        vec3 tangentpos = position + tangent;
        vec3 tangentnormal;
        vec3 center = userparams[0].xyz;
        float radius = userparams[0].w;
    #endif
        //Transform normal
        normal = normalize(position - center);

        //Transform position
        position = center + normal * radius;

        //Transform tangent
        tangentnormal = normalize(tangentpos - center);
        tangentpos = center + tangentnormal * radius;
        tangent = normalize(tangentpos - position);
    }

    Here is how we apply a transform callback to a terrain:
    #ifdef DOUBLE_FLOAT
    std::array<double, 16> params = {};
    #else
    std::array<float, 16> params = {};
    #endif
    params[0] = position.x;
    params[1] = position.y;
    params[2] = position.z;
    params[3] = radius;
    terrain->Transform(TransformTerrainPoint, params);

    We also need to apply a custom shader family to the terrain material, so our special vertex transform code will be used:
    auto family = LoadShaderFamily("Shaders/CustomTerrain.json");
    terrain->material->SetShaderFamily(family);

    When we do this, something amazing happens to our terrain:

    If we create six terrains and position and rotate them around the center of the planet, we can merge them into a single spherical planet. The edges where the terrains meet don't line up on this planet because we are just using a single heightmap that doesn't wrap. You would want to use a data set split up into six faces:
    All our terrain features like texture splatting, LOD, tessellation, and streaming data are retained with this system. Terrain can be warped into any shape to support any projection method or other weird and wonderful ideas you might have.
  6. Josh
    It's December, which means Leadwerks for Linux is nearly here! Last week I had to get into more depth with our tutorials. We're creating a series of maps that demonstrate simple game mechanics. The goal is to show how to set up game interactions by attaching scripts to objects and connecting them in the flowgraph editor, without getting into any actual programming. These lessons center around a first-person shooter, but are applicable to many types of games.
     
    I had to get my hands dirty in the lessons another programmer was developing because there were a few bugs preventing them from working right. As we started getting into some more advanced interactions and physics we found some things that had to be fixed. Most of this revolved around the character controller physics and all the different situations the player can encounter. It's now working correctly, and we will be able to demonstrate some fun physics puzzles you can set up in the editor.
     
    There's just a few more steps to getting Leadwerks for Linux out. The new tutorials need to be written and recorded, which is actually the easy part. The final lesson needs a full mini-level designed. I'm still having trouble with the GTK UI, but it's the kind of thing that I just need to keep plugging away at. Since the Nvidia and ATI drivers for Linux work flawlessly, we aren't dependent on any third parties fixing their code, which is a big relief.
     
    Before Kickstarter, we ran a Greenlight campaign to put Leadwerks on Steam. Originally, this was going to be released last summer, but I decided to focus on Linux development and release them around the same time. Leadwerks for Steam is a more limited version of what the Kickstarter backers are getting, and will initially only be available for Windows.
     

    Building FPS Mechanics
    Getting into the lessons was fun for me because I got to work on some gameplay design. When building a game, it's important to decide what to focus on. There are an infinite number of ways you can waste your time, so it's critical that you decide what is important and focus on that. We could have implemented an endless number of little interactions in our gameplay lessons, and it's easy to do with our flow graph system, but I decided I wanted to focus primarily on the basic combat and feel of the player. There's a big difference between a demo where you walk around in a game level holding a gun, and a first-person shooter that feels like you are interacting with the game. I wanted to capture that feeling of being in the game, because it will benefit all the derivative games and demos for Linux that are built off of this. With that in mind, I set out to analyze exactly what it is that makes a first-person shooter feel interactive. 
    Player Movement
    The first basic thing you can do to enhance the player experience is add footstep sounds. Don't use just any footsteps; if they don't sound good, it won't feel right. IndieGameModels.com happened to release a pack of footstep noises that are excellent. I negotiated a special license for some of the sounds and will include them in Leadwerks 3.1, free for use. One subtle thing I learned from Valve games is that it's better to play the "landing" sound when the player jumps, not when they land. It gives a much better feeling of interaction with the level, for some reason.
     
    A very slight camera bob was added so that when you jump, it feels like your arms have inertia. I also tried smoothing the rotation of the arms so they lag a little as you look around, but it looked terrible and they were jumping around on the screen. So while a little inertia can be a subtle indicator of motion, too much is definitely not good.
     
    Ballistics
    How do you really tell when a gun is being fired? What makes it "feel" right? I had to look at some games from the all-time masters of first-person combat, id Software. I added a muzzle flash sprite that appears for a fraction of a second when you fire, as well as a point light at the tip of the barrel. The deferred renderer in Leadwerks 3.1 makes the light cast dynamic shadows, giving a convincing feeling of space and depth.
     
    When a bullet strikes a wall or other surface, you need some visual cue of what happened. I set up two emitters that get copied and played with a one-shot animation whenever a bullet collides with something. One emitter throws off solid chunks with a high gravity value, while the other is a soft smoke that slowly emits from the point of impact. A secondary blood effect was created for when living objects are shot.
     
    Combat
    When I initially implemented combat with our crawler character (the new AI navigation system makes this great!) I found the action to be...strange. I had a pistol with unlimited ammo and no reloads, and taking down a pack of crawlers was extremely easy. I assumed I had just not given them enough health, or made the bullets too powerful, and just kept working on it. It wasn't until I added two other aspects of gameplay that I realized how big of an influence they have on the game mechanics.
     

     
    Reloading your weapon doesn't just give you a feeling of "realism". It actually introduces a whole new aspect of gameplay. Not only do you have to ration your ammo, but you have to ration your clip and not fire too many shots too quickly. Once you get surrounded by a pack of crawlers and you are trying to reload your weapon to get another shot off, you can quickly get taken down. So the speed of reloading and size of the clip actually become critical design decisions on what kind of game you want.
     
    The second thing I added was making the camera bounce around when hit. When a crawler slaps you, the camera will be pushed off to the side. You can see this in Doom 3 when the player gets hit by a monster. I found this enhancement has two effects:
    First, it's unpleasant. Having your view suddenly disoriented is a jarring feeling that comes through the game to affect you in real life. It's almost as good as the pain vest idea, which I see someone besides me also had. When you have the possibility of an unpleasant experience, it makes you afraid, changes the way you play, and makes enemies feel much more menacing.
    Second, it adds a new gameplay mechanic. Getting slapped around makes it harder to aim. If you get surrounded by a pack of enemies, it gets very difficult to hit anything. So not letting them get near you becomes very important.

     
    I think that by focusing on these core elements and providing a structure Linux game developers can build on, we can foster development of a great many derivative projects that make use of these mechanics. The fact that it's happening on Linux is exciting, because it feels like we are doing something truly new, almost like the early days of first-person shooters.
     
    It's hard to predict the exact release date of Leadwerks for Linux, but we're getting close. My original estimate for the development timeline turned out to be pretty accurate on almost everything. Stay tuned!
  7. Josh
    It doesn't happen to me much anymore but when I get stuck on a difficult problem, I get very restless and short-tempered. I guess that's the point where most programmers give up on a project. Since I can't physically wrestle code to the ground and give it a good beating Billy Batts style, the fight or flight response probably is not conducive to sitting in front of a piece of glass and figuring out code. On the other hand, I have a tendency to get difficult things done, no matter how hard they are. Maybe a more patient programmer in the same situation would be too patient to produce useful output under his own volition.
     
    My understanding of how recast works is it turns upwards-facing triangles into a grid, then uses that grid to construct a navigation mesh. The results seem very robust, but the fundamental approach is an approximation. It is possible to construct a perfectly accurate navigation mesh out of CSG solids, because they have volumetric properties an arbitrary polygon soup does not. On the other hand, my instinct says people will prefer an approximation that works with any arbitrary geometry over a mathematically correct solution that requires constructive solid geometry.
     
    Another approach is to simply let the user construct the navigation mesh in the editor. Since the pieces need to be convex, a CSG editor is ideal for this. This also has the possibility of adding more advanced functionality like climbing up walls and jumping. (This is how Left 4 Dead 2 works.) Still, I know people will value a pretty-good solution that works with arbitrary polygon geometry more, and that gets into a land of approximations I don't really have any experience with. I suspect it would involve a lot of adjustments and fine-tuning to get the desired results.
     
    You can see here, the results are definitely an approximation, but they're a pretty good one:

     
    So here are the options I am looking at:
    1. Hire someone with existing knowledge of recast to write the implementation.
    2. Find someone with existing knowledge of recast and work off their knowledge to write the implementation myself.
    3. Stare at incomprehensible code for days on end and hope that somehow imparts knowledge of how to use the library. Maybe other people can understand this, but I am awful at deciphering other people's code.
    4. Write my own polygon-based solution.
    5. Write my own CSG-based solution.
     
    I think the CSG-based solution is the best technical choice, but I think it would cause a lot of complaints. It's fine for Valve, but I think a lot of people would just get mad if I tried to explain why arbitrary triangle meshes have no volumetric properties.
     
    Another frightening thing about recast is that from what I am reading, a lot of people are using the demo application as a tool to generate their navmesh data, and just loading the saved files that produces in their own game. That's completely unacceptable. We need to be able to generate this data in our own editor. I know it's possible, but the lack of people able to do this is an indication of the difficulty of the task.
     
    The pathfinding stuff is actually the last bit of research I have to complete before I know how everything in Leadwerks3D works. The rest is just a matter of hard work, but all the unknown factors we started with will be resolved.
     

  8. Josh
    Sometimes I run into situations where I don't really know how to structure things. I don't mind this, because it usually results in some really elegant design once I figure out what to do. I just play with ideas and try not to force anything, and when the right idea arises, I will recognize it.
     
    Explaining a problem to someone else can help facilitate that process. How many times have you solved a difficult problem right after you posted a description of it on a forum somewhere? The procedure of explaining it logically to someone else can help you think more clearly about it. And so, we have today's blog topic.
     
    Below is a very rough start to the script editor. The syntax highlighting system was written about a year ago, and works beautifully, using the native text widget on both Windows and Mac.

     
    In the Leadwerks3D source code, there is a base class called an "AssetEditor". From this class the material, model, shader, texture, font, and script editor classes are derived. Like the other asset editor windows, only one instance of the script editor window will be allowed open at any time. Unlike the other asset editor windows, which display only one asset at a time, the script editor will use tabs to display multiple files. Scripts aren't a typical asset like a material or a model, so it's fine for them to behave a little differently.
     
    Any Leadwerks3D application can have its Lua state debugged. The engine uses networking commands to communicate with a debugger on a specified port. This means the engine can communicate with a debugger whether it's part of a C++ program, C# app, or standalone Lua interpreter. The debugger can display the Lua callstack and shows all variables and values in the Lua state, including full examination of C++ objects and members!
     
    I do not intend for Leadwerks3D to "play a game" in the editor. We've tried that approach and there are a lot of problems. I want Leadwerks3D to be a very solid and stable level editor, with a visual interface to do everything you need. I also want better consistency between Lua and C++ programs. Therefore, Leadwerks3D will use a run game system more similar to 3D World Studio than the Leadwerks Engine editor. A dialog will allow you to choose the application to run, command-line parameters, and other settings. These will be saved between sessions, so you can hit a key to do a "Quick Launch" and run your game. It would be possible to hook the Lua debugger into any application as it is launched, which could be very helpful.
     
Let's go back to the script editor now. My inclination is to have F5 launch an interpreter and call the script for the currently selected tab. However, I don't think it's a good idea to use multiple game launch modes; I already described a uniform game launch mode for both Lua and C++ applications. On the other hand, forcing that on someone who is working in the script editor and just wants to run something really quickly seems counter-intuitive.
     
There's also the question of whether we want to provide a standalone script editor and debugger outside of Leadwerks3D. Or should the debugger be a standalone application as well, since someone might want to use it with a C++ application? You can see there are a lot of options and a lot of possible ways to set this up.
     
    What about Lua compile errors? I can print that out in the engine log, but how will the editor display it? If a compile error occurs, should the program pause and display the line it occurred at? What if the user just doesn't care, and wants the program to keep going?
     
Alternatively, the user may want to just hit F5 in the script editor and check for basic syntax errors, which the command luaL_loadstring() will detect.
     
    That's pretty much all my questions at this point. I don't expect anyone to come along and solve my problems, but the process of describing and discussing the issues will help me come to a resolution.
  9. Josh

    Articles
    As I have explained before, I plan for Ultra Engine to use glTF for our main 3D model file format, so that your final game models can be easily loaded back into a modeling program for editing whenever you need. glTF supports a lot of useful features and is widely supported, but there are a few missing pieces of information I need to add into it. Fortunately, this JSON-based file format has a mechanism for extensions that add new features and data to the format. In this article I will describe the custom extensions I am adding for Ultra Engine.
    ULTRA_triangle_quads
All glTF models are triangle meshes, but we want to support quad meshes, primarily because they are better for tessellation. This extension gets added to the primitives block. If the "quads" value is set to true, this indicates that the triangle indices are stored in a manner such that the first four indices of every six form a quad:
"extensions": {
    "ULTRA_triangle_quads": {
        "quads": true
    }
}
There is no other glTF extension for quads, and so there is no way to export a glTF quad mesh from any modeling program. To get quad meshes into Ultra Engine you can load an OBJ file and then resave it as glTF. Here is a glTF file using quads that was created this way. You can see the tessellation creates an even distribution of polygons:

    For comparison, here is the same mesh saved as triangles and tessellated. The long thin triangles result in a very uneven distribution of polygons. Not good!

    The mesh still stores triangle data so the file can be loaded back into a 3D modeling program without any issues.
    Here is another comparison that shows how triangle (on the left) and quads (on the right) tessellate:

    ULTRA_material_displacement
    This extension adds displacement maps to glTF materials, in a manner that is consistent with how other textures are stored:
"ULTRA_material_displacement": {
    "displacementTexture": {
        "index": 3,
        "offset": -0.035,
        "strength": 0.05
    }
}
The extension indicates a texture index, a maximum displacement value in meters, and a uniform offset, also in meters. This can be used to store material displacement data for tessellation or parallax mapping. Here is a model loaded straight from a glTF file with displacement info and tessellation:

     
    If the file is loaded in other programs, the displacement info will just be skipped.
    ULTRA_vertex_displacement
    Our game engine uses a per-vertex displacement factor to control how displacement maps affect geometry. This extension adds an extra attribute into the primitives structure to store these values:
"primitives": [
    {
        "attributes": {
            "NORMAL": 1,
            "POSITION": 0,
            "TEXCOORD_0": 2
        },
        "indices": 3,
        "material": 0,
        "mode": 4,
        "extensions": {
            "ULTRA_vertex_displacement": {
                "DISPLACEMENT": 7
            }
        }
    }
]
This can be used to prevent cracks from appearing at texcoord seams.

    Here you can see the displacement value being loaded back from a glTF file it has been saved into. I'm using the vertex color to visually verify that it's working right:

    ULTRA_extended_material
    This extension adds other custom parameters that Ultra Engine uses. glTF handles almost everything we want to do, and there are just a few settings to add. Since the Ultra Engine material format is JSON-based, it's easy to just insert the extra parameters into the glTF like so:
"ULTRA_extended_material": {
    "shaderFamily": "PBR",
    "shadow": true,
    "tessellation": true
}
In reality I do not feel that this extension is very well-defined and do not expect it to see any adoption outside of Ultra Engine. I made the displacement parameters a separate extension because they are well-defined, and there might be an opportunity to work with other application developers using that extension.
    Here we can see the per-material shadow property is disabled when it is loaded from the glTF:

    For comparison, here is what the default setting looks like:

    These extensions are simply meant to add special information that Ultra Engine uses into the glTF format. I do not currently have plans to try to get other developers to adopt my standards. I just want to add the extra information that our game engine needs, while also ensuring compatibility with the rest of the glTF ecosystem. If you are writing a glTF importer or exporter and would like to work together to improve the compatibility of our applications, let me know!
    I used the Rock Moss Set 02 model pack from Polyhaven in this article.
  10. Josh
    Leadwerks Game Engine 4.4 has been updated on the beta branch on Steam.
    Networking finished and documented. GUI finished. All new physics features finished. The character controller physics and picking up objects has been improved and made smoother.  There is a problem with the player sliding down slopes, as seen in the FPS Character Controller example map.  I will work this out.
    I also noticed during testing that picking up some objects in the FPS / AI map will freeze the game.  Will check it out.
    I have not actually tested compiling on Linux yet, because my Linux machine is in the office and I am at home right now.  I'm heading in this afternoon, at which point I will complete Linux testing.
    The only other problem is that vehicles are not working yet.  I'm not sure yet how I will proceed with this.
    Updating C++ Projects
    The following changes are needed to update your C++ projects:
    Visual Studio
    Add these include header search directories:
$(LeadwerksHeaderPath)\Libraries\NewtonDynamics\packages\thirdParty\timeTracker
Add these input libraries:
newton_d.lib;dContainers_d.lib;dCustomJoints_d.lib; (debug)
newton.lib;dContainers.lib;dCustomJoints.lib; (release)
Code::Blocks
    Add these include header search directories:
$(LeadwerksPath)/Include/Libraries/NewtonDynamics/packages/thirdParty/timeTracker
You also need the dev files for libcurl:
sudo apt-get install libcurl4-openssl-dev
This is pretty much the finished 4.4, so please test it and post any bug reports you have.  Thank you.
  11. Josh
    I wanted to work on something a bit easier before going back into voxel ray tracing, which is another difficult problem. "Something easier" was terrain, and it ended up consuming the entire month of August, but I think you will agree it was worthwhile.
    In Leadwerks Game Engine, I used clipmaps to pre-render the terrain around the camera to a series of cascading textures. You can read about the implementation here:
    This worked very well with the hardware we had available at the time, but did result in some blurriness in the terrain surface at far distances. At the time this was invented, we had some really severe hardware restrictions, so this was the best solution then. I also did some experiments with tessellation, but a finished version was never released.
    New Terrain System
    Vulkan gives us a lot more freedom to follow our dreams. When designing a new system, I find it useful to come up with a list of attributes I care about, and then look for the engineering solution that best meets those needs.
    Here's what we want:
Unlimited number of texture layers
Pixel-perfect resolution at any distance
Support for tessellation, including physics that match the tessellated surface
Fast performance independent of the number of texture layers (more layers should not slow down the renderer)
Hardware tessellation is easy to make a basic demo for, but it is hard to turn it into a usable feature, so I decided to attack this first. You can read my articles about the implementation below. Once I got the system worked out for models, it was pretty easy to carry that over to terrain.
    So then I turned my attention to the basic terrain system. In the new engine, terrain is a regular old entity. This means you can move it, rotate it, and even flip it upside down to make a cave level. Ever wonder what a rotated terrain looks like?

    Now you know.
    You can create multiple terrains, instead of just having one terrain per world like in Leadwerks. If you just need a little patch of terrain in a mostly indoor scene, you can create one with exactly the dimensions you want and place it wherever you like. And because terrain is running through the exact same rendering path as models, shadows work exactly the same.
    Here is some of the terrain API, which will be documented in the new engine:
shared_ptr<Terrain> CreateTerrain(shared_ptr<World> world, const int tilesx, const int tiles, const int patchsize = 32, const int LODLevels = 4)
shared_ptr<Material> Terrain::GetMaterial(const int x, const int y, const int index = 0)
float Terrain::GetHeight(const int x, const int y, const bool global = true)
void Terrain::SetHeight(const int x, const int y, const float height)
void Terrain::SetSlopeConstraints(const float minimum, const float maximum, const float range, const int layer)
void Terrain::SetHeightConstraints(const float minimum, const float maximum, const float range, const int layer)
int Terrain::AddLayer(shared_ptr<Material> material)
void Terrain::SetMaterial(const int index, const int x, const int y, const float strength = 1.0, const int threadindex = 0)
Vec3 Terrain::GetNormal(const int x, const int y)
float Terrain::GetSlope(const int x, const int y)
void Terrain::UpdateNormals()
void Terrain::UpdateNormals(const int x, const int y, const int width, const int height)
float Terrain::GetMaterialStrength(const int x, const int y, const int index)
What I came up with is flexible; it can be used in three ways:
1. Create one big terrain split up into segments (like Leadwerks Engine does, except non-square terrains are now supported).
2. Create small patches of terrain to fit in a specific area.
3. Create many terrains and tile them to simulate very large areas.
Updating Normals
I spent almost a full day trying to calculate terrain normals in local space. When they were scaled by a non-uniform scale, the PN Quads started to produce waves. I finally realized that normals cannot really be scaled. The scaled vector, even if normalized, is not the correct normal. I searched for some information on this issue, but the only thing I could find was a few mentions of an article called "Abnormal Normals" by Eric Haines, and it seems the original article has gone down the memory hole. In retrospect it makes sense if I picture the normal vectors rotating instead of shifting along each axis. So the bottom line is that the normals of any surface have to be recalculated if a non-uniform scale is used.
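For the record, the correct transform for normals is the inverse transpose of the model matrix; for a pure non-uniform scale that reduces to dividing each normal component by the corresponding scale factor and renormalizing. A minimal sketch with my own vector type:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// For a non-uniform scale (sx, sy, sz), the inverse transpose is the
// diagonal matrix (1/sx, 1/sy, 1/sz). Dividing (rather than multiplying)
// by the scale, then renormalizing, gives the correct normal direction.
Vec3 ScaleNormal(Vec3 n, float sx, float sy, float sz) {
    return Normalize({ n.x / sx, n.y / sy, n.z / sz });
}
```

Naively multiplying the normal by the scale and renormalizing produces a vector that leans the wrong way, which is exactly the kind of error that shows up as waves in tessellated geometry.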
    I'm doing more things on the CPU in this design because the terrain system is more complex, and because it's a lot harder to get Vulkan to do anything. I might move it over to the GPU in the future but for right now I will stick with the CPU. I used multithreading to improve performance by a lot:
    Physics
    Newton Dynamics provides a way to dynamically calculate triangles for collision. This will be used to calculate a high-res collision mesh on-the-fly for physics. (For future development.) Something similar could probably be done for the picking system, but that might not be a great idea to do.
    Materials
    At first I thought I would implement a system where one terrain vertex just has one material, but it quickly became apparent that this would result in very "square" patterns, and that per-vertex blending between multiple materials would be needed. You can see below the transitions between materials form a blocky pattern.

    So I came up with a more advanced system that gives nice smooth transitions between multiple materials, but is still very fast:

    The new terrain system supports up to 256 different materials per terrain. I've worked out a system that runs fast no matter how many material layers you use, so you don't have to be concerned at all about using too many layers. You will run out of video memory before you run out of options.
    Each layer uses a PBR material with full support for metalness, roughness, and reflections. This allows a wider range of materials, like slick shiny obsidian rocks and reflective ice. When combined with tessellation, it is possible to make snow that actually looks like snow.

     Instancing
    Like any other entity, terrain can be copied or instantiated. If you make an instance of a terrain, it will use the same height, material, normal, and alpha data as the original. When the new editor arrives, I expect that will allow you to modify one terrain and see the results appear on the other instance immediately. A lot of "capture the flag" maps have two identical sides facing each other, so this could be good for that.

    Final Shots
    Loading up "The Zone" with a single displacement map added to one material produced some very nice results.



    The new terrain system will be very flexible, it looks great, and it runs fast. (Tessellation requires a high-end GPU, but can be disabled.) I think this is one of the features that will make people very excited about using the new Turbo Game Engine when it comes out.
  12. Josh
Terrain building commands, an often-requested feature, are being implemented in Leadwerks 5. Here is my script to create a terrain. It creates a 256 x 256 terrain with one terrain point every meter and a maximum height of +/- 50 meters:
--Create terrain
local terrain = CreateTerrain(world,256,256)
terrain:SetScale(256,100,256)
Here is what it looks like:

    A single material layer is then added to the terrain.
--Add a material layer
local mtl = LoadMaterial("Materials/Dirt/dirt01.mat")
local layerID = terrain:AddLayer(mtl)
We don't have to do anything else to make the material appear, because by default the entire terrain is set to use the first layer, if a material is available there:

    Next we will raise a few terrain points.
--Modify terrain height
for x=-5,5 do
    for y=-5,5 do
        h = (1 - (math.sqrt(x*x + y*y)) / 5) * 20
        terrain:SetElevation(127 + x, 127 + y, h)
    end
end
And then we will update the normals for that whole section, all at once. Notice that we specify a larger grid for the normals update, because the terrain points next to the ones we modified will have their normals affected by the change in height of the neighboring pixel.
--Update normals of modified and neighboring points
terrain:UpdateNormals(127 - 6, 127 - 6, 13, 13)
Now we have a small hill.

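The one-point border used in that UpdateNormals call can be wrapped in a small helper that expands a modified region by one point on each side and clamps it to the terrain bounds. This is a hypothetical convenience written for illustration, not part of the engine API:

```cpp
#include <algorithm>

struct Rect { int x, y, width, height; };

// A point's normal depends on its neighbors' heights, so the normals
// update region must be one point larger than the modified region on
// every side, clamped to the terrain bounds.
Rect NormalUpdateRegion(int x, int y, int width, int height, int terrainsize) {
    int x0 = std::max(x - 1, 0);
    int y0 = std::max(y - 1, 0);
    int x1 = std::min(x + width + 1, terrainsize);
    int y1 = std::min(y + height + 1, terrainsize);
    return { x0, y0, x1 - x0, y1 - y0 };
}
```

For the 11 x 11 region modified above (starting at 122,122), this yields the 13 x 13 region starting at 121,121 that the script passes to UpdateNormals.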
    Next let's add another layer and apply it to terrain points that are on the side of the hill we just created:
--Add another layer
mtl = LoadMaterial("Materials/Rough-rockface1.json")
rockLayerID = terrain:AddLayer(mtl)

--Apply layer to sides of hill
for x=-5,5 do
    for y=-5,5 do
        slope = terrain:GetSlope(127 + x, 127 + y)
        alpha = math.min(slope / 15, 1.0)
        terrain:SetMaterial(rockLayerID, 127 + x, 127 + y, alpha)
    end
end
We could improve the appearance by giving it a more gradual change in the rock layer alpha, but it's okay for now.

    This gives you an idea of the basic terrain building API in Leadwerks 5, and it will serve as the foundation for more advanced terrain features. This will be included in the next beta.
  13. Josh
    I spent a lot of time last weekend making sure resources are correctly shared between rendering contexts. It's surprising how many commercial games make you restart the game to switch graphics resolutions, and I find it annoying. Leadwerks Engine 3 uses a small hidden window with an OpenGL context to create the OpenGL 3.3 contexts, and it just stays open so there is always a persistent context with which resources are shared. Textures, shaders, and vertex buffers can all be shared between OpenGL contexts, but oddly, frame buffer objects cannot. This is probably because FBOs are small objects that don't consume large amounts of memory, but it still seems like a strange design choice. I got around this problem by using the persistent background context whenever any FBO commands are called, so buffers will continue to work after you delete a context. So I guess the way to describe that is I start with something that is sort of awkward to work with, and encapsulate it in something that makes more sense, to me at least.
     
Because the background context is created in the OpenGL3GraphicsDriver constructor, you can start calling 3D commands as soon as you create the driver object, without creating a visible 3D window! Weird and cool. No idea yet if it will work like this on MacOS, but I'll find out soon enough, since I ordered my iMac last week. I got the 27-inch model with the 3.2 GHz dual-core CPU, and upgraded the GPU to an ATI 5750. I chose the 3.2 GHz dual core over the 2.8 GHz quad core because I have found in general usage, my quad core rarely goes over 50% usage, and I would rather have a faster clock speed per core.
     
    I said earlier that the window/context design was a little tricky to figure out, especially when you take into consideration the external windows people will want to use. In Leadwerks Engine 2, this was accomplished via a custom buffer, where callbacks were used to retrieve the context dimensions, and the user was responsible for setting up an OpenGL window. Well, initializing pixel format and OpenGL version on a window is a somewhat tricky thing, and if it's possible I would like to avoid making you deal with that. I ended up with a window design that is quite a lot more advanced than the simple Graphics() command in LE2. The window is created from a GUIDriver, which implies other parts of a cross-platform GUI might one day be included. The design is modeled similarly to MaxGUI for BlitzMax. To create a window, we do this:

    Gadget* window = CreateWindow("My window",0,0,1024,768,NULL,WINDOW_FULLSCREEN)
    Then you can create a graphics context on the window (or any other gadget):

    Context* context = GraphicsDriver->CreateContext(window)
    We can check for events in our game loop like this:

    while (PeekEvent()) { Event ev = WaitEvent(); switch (ev.id) { case EVENT_WINDOWCLOSE: return 0; } }
    At this time, windows are the only supported gadget, but the framework is there for adding additional gadgets in the future. This system can also be used to implement the skinned game GUI as well as native interface elements. Before Rick says anything, yes, there will be a custom event handler function you can attach to a gadget instead of polling events.
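For illustration, the callback approach could look something like this sketch, using my own stand-in types rather than the engine's actual API:

```cpp
#include <functional>
#include <queue>

struct Event { int id; };
const int EVENT_WINDOWCLOSE = 1;

// A gadget that dispatches events to an attached handler when one is set,
// and otherwise queues them for the polling loop shown above.
struct Gadget {
    std::function<void(const Event&)> handler;
    std::queue<Event> pending;

    void EmitEvent(const Event& ev) {
        if (handler) handler(ev);   // callback style
        else pending.push(ev);      // fall back to polling
    }
};
```

Attaching a handler would then be as simple as `window.handler = [](const Event& ev) { /* ... */ };`, and gadgets without a handler keep working with the PeekEvent/WaitEvent loop.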
     
    You can render to an external window just by supplying the HWND (on Windows) to the GUIDriver->CreateGadget(HWND hwnd) command. This will create a "Gadget" object from any valid hwnd, and it can then have a context created on it, like the above example.
     
    Simple deferred lighting is working, just using a directional light with no shadows. On both AMD and NVidia cards, the engine can render 16x MSAA deferred lighting. The gbuffer format in Leadwerks Engine 3 is only 12 bytes per pixel. Per-pixel motion blur will add a couple more bytes:
     
    color0 (RGBA8)
    diffuse.r
    diffuse.g
    diffuse.b
    specular intensity
     
    color1 (RG11B10)
    normal.x
    normal.y
    normal.z
     
    color2 (RGBA8)
    emission.r
    emission.g
    emission.b
    materialid
     
Material properties are sent in an array to the GPU, and the material ID is used to look up properties like specular reflection color, gloss value, etc. So this is a very efficient usage of texture bandwidth. My GeForce 9800 GTX can handle 1920x1080 with 8x MSAA, but 16x seems to go over a threshold and the card crawls. I don't know if you'll be using 16x MSAA in a lot of games, but if nothing else it makes for fantastic screenshots, and the lower-resolution antialias options are still there. I personally don't see a big improvement past 4x multisampling in most games.
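As a sanity check, the 12-bytes-per-pixel figure follows from each of the three targets being a 32-bit format: RGBA8 is four bytes, and RG11B10 packs three components into a single 32-bit word. A struct mirroring the layout (the names are mine):

```cpp
#include <cstdint>

// Mirror of the three gbuffer render targets described above.
struct GBufferTexel {
    uint8_t  color0[4];  // diffuse.rgb + specular intensity (RGBA8)
    uint32_t color1;     // packed normal.xyz (RG11B10)
    uint8_t  color2[4];  // emission.rgb + material id (RGBA8)
};

static_assert(sizeof(GBufferTexel) == 12, "gbuffer is 12 bytes per pixel");
```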
     

     
    Here's my current test program. It's more low-level than you will have to work with. You won't have to create the gbuffer and draw lighting yourself like I am here, but you might like seeing how things work internally:

#include "le3.h"

using namespace le3;

int main()
{
    InitFileFactories();

    //Create GUI driver
    GUIDriver* guidriver = new WindowsGUIDriver;

    //Create a window
    Gadget* window = guidriver->CreateWindow("Leadwerks",0,0,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE);
    if (!window)
    {
        Print("Failed to create window");
        return 0;
    }

    //Create graphics driver
    GraphicsDriver* graphicsdriver = new OpenGL3GraphicsDriver;
    if (!graphicsdriver->IsSupported())
    {
        Print("Graphics driver not supported.");
        return 0;
    }

    //Create a graphics context
    Context* context = CreateContext(window,0);
    if (!context)
    {
        Print("Failed to create context");
        return 0;
    }

    //Create world
    World* world = new World;

    //Create a camera
    Camera* camera = CreateCamera();
    camera->SetClearColor(0.5,0.5,0.5,1);

    //Load a model
    LoadModel("Models/train_sd40.mdl");

    //Create gbuffer
    #define SAMPLES 16
    Buffer* gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES);
    Texture* normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES);
    Texture* emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES);
    gbuffer->SetColor(normals,1);
    gbuffer->SetColor(emission,2);
    delete normals;
    delete emission;

    //Set up light shader
    Shader* lightshader = LoadShader("shaders/light/directional.shd");
    Mat4 lightmatrix = Mat4(1,0,0,0, 0,1,0,0, 1,0,0,0, 0,0,0,1);
    lightmatrix *= camera->mat;
    lightshader->SetUniformMat4("lightmatrix",lightmatrix);
    lightshader->SetUniformVec4("lightcolor",Vec4(1,0,0,1));
    lightshader->SetUniformVec4("ambientlight",Vec4(0,0,0,1));
    lightshader->SetUniformVec2("camerarange",camera->range);
    lightshader->SetUniformFloat("camerazoom",camera->zoom);

    //Delete and recreate the graphics context, just because we can
    //Resources are shared, so you can change screen resolution with no problems
    delete context;
    delete window;
    window = guidriver->CreateWindow("Leadwerks",200,200,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE);
    context = CreateContext(window,0);

    float yaw = 0;

    while (true)
    {
        //Print(graphicsdriver->VidMemUsage());
        if (!window->Minimized())
        {
            yaw += 0.25;

            //Adjust the camera
            camera->SetPosition(0,0,0,false);
            camera->SetRotation(0,yaw,0,false);
            camera->Move(0,2,-10,false);

            //Update the time step
            UpdateTime();

            //Render to buffer
            gbuffer->Enable();
            camera->Render();

            //Switch back to the window background
            context->Enable();

            //Enable shader and bind textures
            lightshader->Enable();
            gbuffer->depthcomponent->Bind(0);
            gbuffer->colorcomponent[0]->Bind(1);
            gbuffer->colorcomponent[1]->Bind(2);
            gbuffer->colorcomponent[2]->Bind(3);

            //Draw image onto window
            graphicsdriver->DrawRect(0,0,context->GetWidth(),context->GetHeight());

            //Turn the shader off
            lightshader->Disable();

            //Swap the back buffer
            context->Swap(false);
        }

        //Handle events
        while (PeekEvent())
        {
            Event ev = WaitEvent();
            switch (ev.id)
            {
                case EVENT_WINDOWRESTORE:
                    ResumeTime();
                    break;
                case EVENT_WINDOWMINIMIZE:
                    PauseTime();
                    break;
                case EVENT_WINDOWSIZE:
                    //Recreate the gbuffer
                    delete gbuffer;
                    gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES);
                    normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES);
                    emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES);
                    gbuffer->SetColor(normals,1);
                    gbuffer->SetColor(emission,2);
                    delete normals;
                    delete emission;
                    break;
                case EVENT_WINDOWCLOSE:
                    //Print OpenGL error to make sure nothing went wrong
                    Print(String(glGetError()));
                    //Exit program
                    return 0;
            }
        }
    }
}
There's been some debate about the use of constructors, and although it would be nice to be able to use a constructor for everything, that does not seem possible. I use a lot of abstract classes, and there is no way to use an abstract class constructor to create an object. If there were a way to turn the object into a derived class in its own constructor, that would work, but it's not supported. You certainly wouldn't want to have to call new OpenGL3Buffer, new DirectX11Buffer, or new OpenGL4Buffer depending on the graphics driver. The point of abstract classes is that you can just call their commands without knowing or caring what the derived class is. So if anyone has any other ideas, I'm all ears, but there doesn't seem to be any way around this.
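For what it's worth, the standard workaround is a factory: the driver exposes a virtual creation function that returns the right derived object through the abstract interface, which is roughly what the driver classes above already do. A self-contained sketch with illustrative names:

```cpp
#include <memory>
#include <string>

// Abstract interface: callers never name the derived class.
struct Buffer {
    virtual ~Buffer() = default;
    virtual std::string Description() const = 0;
};

struct OpenGL3Buffer : Buffer {
    std::string Description() const override { return "OpenGL3"; }
};

struct GraphicsDriver {
    virtual ~GraphicsDriver() = default;
    // Factory method: each driver creates its own buffer type.
    virtual std::unique_ptr<Buffer> CreateBuffer() const = 0;
};

struct OpenGL3GraphicsDriver : GraphicsDriver {
    std::unique_ptr<Buffer> CreateBuffer() const override {
        return std::make_unique<OpenGL3Buffer>();
    }
};
```

Code written against GraphicsDriver and Buffer never needs to know which rendering API is active; swapping in a hypothetical DirectX11GraphicsDriver would require no changes at the call sites.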
     
What's next? I need to get some text up onscreen, and the FreeType library looks pretty good. I'll be getting the Mac version set up soon. And I am eager to get the engine working together with BlitzMax, so I can work on the editor. The graphics features of LE3 are great, but I think there are two even more important aspects. The first is the art pipeline. I've designed a system that is the absolute easiest way to get assets into the engine. More details will come later, but let's just say it's heavy on the drag and drop. The other important aspect of LE3 is the interactions system. I have concluded that programming, while necessary, is a lousy way of controlling interactions between complex objects. The Lua implementation in LE2 was good because it provided a way for people to easily share programmed objects, but it's the interactions of objects that make a system interesting, and a lot of object-oriented spaghetti is not a good way to handle this. Using the interactions system in LE3, with inputs and outputs for each object, is something I think people will really like working with.
  14. Josh
    Environment probes are now available on the beta branch. To access them, you must set the "UnlockBetaFeatures" setting in the config file to 1, then start the editor. Environment probes are available in the "Effects" object creation category.
     
    Environment probes should be placed with one in each room. Use the object scale to make the probe's volume fill the room. (Like decals, probes will display a bounding box when selected.) You do not have to worry about covering every single space as the GI effect will blend in pretty well with the regular old ambient light level. Outdoor scenes do not need probes covering everywhere. There is not presently an entity icon for probes so you will have to select them from the scene panel. The AI & Events map has been updated with probes added, and it just took a couple of minutes.
     

     
    The color value of the probe (under the "Appearance" tab in the object properties) is used to control the amount of ambient lighting the probe contributes. The specular color of the probe is used to control the strength of the reflections it casts.
     
    Materials can control their reflectivity with their specular color and specular texture, if it exists. You may find some materials are highly reflective and need to be toned down a bit, so just make their specular color darker. The roughness setting is used to control how sharp or blurry reflections are. Higher roughness = blurrier reflections.
     
    When you first create an environment probe, its cubemap will be rendered immediately. You should re-render it when you have made some changes to your map and want to see the updated global illumination. To do this, select the Tools > Build Global Illumination menu item, which will update all environment probes in the scene.
     
    Global illumination is an inexpensive effect because it is precomputed. However, environment probes will be skipped if the lighting quality is set to the lowest setting.
  15. Josh
    I've taken your suggestions and incorporated the fixes into the new documentation system here:
    https://www.leadwerks.com/learn
     
    All classes and commands are alphabetized, with classes listed first.
     
    All pages in the API Reference should be working now.
     
    If a page does not appear in the old docs, or if a command does not have an example, it will not appear in the new docs, as I am not changing the content right now.
     
    Please let me know if the table of contents of pages have any errors.
  16. Josh
    Explore our reimagining of the Chernobyl nuclear exclusion zone with The Zone asset pack.  This package contains over three gigabytes of high-quality game assets prepared to take advantage of the latest Leadwerks features.  Use our ready-made map (included) to start your game or create your own post-apocalyptic environment.
    Get it now on Steam with a discount during launch week.


    "The Zone" DLC includes the following assets:
24 terrain textures
    11 buildings (plus 2 background buildings)
    4 types of bridges
    2 types of fences
    6 crates
    3 cargo containers
    12 signs
    13 rocks
    7 gravestones
    8 plants
    39 junk and debris models
    20 furniture models
    Barriers
    Railway tracks
    Diesel locomotive and boxcar
    2 skyboxes
    And much more...

    TECHNICAL SPECIFICATIONS
    Polygon Count: 30-46985
    Textures: Diffuse, Normal, Specular
    Texture Resolution: 64x64, 128x128, 256x256, 512x512, 1024x1024, 2048x2048
    Collision Shapes: Yes
    Source Files: FBX, MAX, PSD, JPG, PNG, BMP



     
  17. Josh
    Leadwerks GUI is now functioning, on the beta branch, Windows only, Lua interpreter only.
     

     
    GUI Class
    static GUI* Create(Context* context)
    Creates a new GUI
     
    Widget Class
    static Widget* Create(const int x, const int y, const int width, const int height, Widget* parent, const int style=0)
    Creates a new Widget. Widgets can be made into buttons, dropdown boxes, or anything else by attaching a script.
     
    virtual bool SetScript(const std::string& path, const bool start = true)
    Sets a widget's script for drawing and logic
     
    virtual void SetText(const std::string& text)
    Sets a text string for the widget that can be retrieved with GetText().
     
    The widget script will call GUI drawing commands:
    virtual void DrawImage(Image* image, const int x, const int y);//lua
    virtual void DrawImage(Image* image, const int x, const int y, const int width, const int height);//lua
    virtual void SetColor(const float r, const float g, const float b);//lua
    virtual void SetColor(const float r, const float g, const float b, const float a);//lua
    virtual void DrawRect(const int x, const int y, const int width, const int height, const int fillmode = 0, const int radius = 0);//lua
    virtual void DrawText(std::string& text, const int x, const int y, const int width, const int height, const int style = 0);//lua
    virtual void DrawLine(const int x0, const int y0, const int x1, const int y1);//lua
     
    Images are also supported (these are loaded from a tex file):
    static Image* Load(std::string& path, GUI* gui);
     
    Images are a little funny. If the GUI is context-based it will internally use a texture. If the GUI is window-based the image will store a bitmap that varies with the operating system. The point is, the GUI drawing commands will work the same on either.
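The design described above, where one set of drawing commands works over either a GPU texture or an OS bitmap, is a classic backend abstraction. Here is a minimal sketch of the pattern; all class names are hypothetical, not Leadwerks' internals:

```cpp
#include <cassert>
#include <memory>
#include <string>

// Abstract storage backend: one interface, two implementations.
// These class names are illustrative, not Leadwerks' internal API.
class ImageBackend
{
public:
    virtual ~ImageBackend() = default;
    virtual std::string Describe() const = 0;
};

// Used when the GUI renders into a graphics context.
class TextureBackend : public ImageBackend
{
public:
    std::string Describe() const override { return "GPU texture"; }
};

// Used when the GUI renders directly into an OS window.
class BitmapBackend : public ImageBackend
{
public:
    std::string Describe() const override { return "OS bitmap"; }
};

// The image delegates to whichever backend it was created with, so
// GUI drawing code never needs to know which storage is in use.
class GuiImage
{
public:
    explicit GuiImage(std::unique_ptr<ImageBackend> b) : backend(std::move(b)) {}
    std::string Storage() const { return backend->Describe(); }
private:
    std::unique_ptr<ImageBackend> backend;
};
```

Widget scripts call the same DrawImage() either way; only the construction path differs.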
     
    The button script looks like this and provides a fully functional button:

Script.pushed=false
    Script.hovered=false

    function Script:Draw()
    	--System:Print("Paint Button")
    	local pos = self.widget:GetPosition(true)
    	local gui = self.widget:GetGUI()
    	gui:SetColor(1,1,1,1)
    	if self.pushed then
    		gui:SetColor(0.2,0.2,0.2)
    	else
    		if self.hovered then
    			gui:SetColor(0.3,0.3,0.3)
    		else
    			gui:SetColor(0.25,0.25,0.25)
    		end
    	end
    	gui:DrawRect(pos.x,pos.y,self.widget.size.width,self.widget.size.height,0,3)
    	gui:SetColor(0.9,0.9,0.9)
    	local text = self.widget:GetText()
    	if text~="" then
    		if self.pushed then
    			gui:DrawText(text,pos.x+1,pos.y+1,self.widget.size.width,self.widget.size.height,Text.Center+Text.VCenter)
    		else
    			gui:DrawText(text,pos.x,pos.y,self.widget.size.width,self.widget.size.height,Text.Center+Text.VCenter)
    		end
    	end
    	gui:DrawRect(pos.x,pos.y,self.widget.size.width,self.widget.size.height,1,3)
    end

    function Script:MouseEnter(x,y)
    	self.hovered = true
    	self.widget:Redraw()
    end

    function Script:MouseLeave(x,y)
    	self.hovered = false
    	self.widget:Redraw()
    end

    function Script:MouseMove(x,y)
    	--System:Print("MouseMove")
    end

    function Script:MouseDown(button,x,y)
    	--System:Print("MouseDown")
    	self.pushed=true
    	self.widget:Redraw()
    end

    function Script:MouseUp(button,x,y)
    	--System:Print("MouseUp")
    	local gui = self.widget:GetGUI()
    	self.pushed=false
    	if self.hovered then
    		EventQueue:Emit(Event.WidgetAction,self.widget)
    	end
    	self.widget:Redraw()
    end

    function Script:KeyDown(button,x,y)
    	--System:Print("KeyDown")
    end

    function Script:KeyUp(button,x,y)
    	--System:Print("KeyUp")
    end
     
    The button uses the new EventQueue class to emit an event:

    EventQueue:Emit(Event.WidgetAction,self.widget)
     
    The main script can then poll events and find out when the button is pushed. This code is inserted into the main loop to do that:

while EventQueue:Peek() do
    	local event = EventQueue:Wait()
    	if event.id == Event.WidgetAction then
    		if event.source == button then
    			System:Print("The button was pressed!")
    		end
    	end
    end
     
    Support for event callbacks will also be added.
     
    The full main script to create a GUI and handle events looks like this:

--Initialize Steamworks (optional)
    Steamworks:Initialize()

    --Set the application title
    title="$PROJECT_TITLE"

    --Create a window
    local windowstyle = window.Titlebar + window.Resizable-- + window.Hidden
    if System:GetProperty("fullscreen")=="1" then windowstyle=windowstyle+window.FullScreen end
    window=Window:Create(title,0,0,System:GetProperty("screenwidth","1024"),System:GetProperty("screenheight","768"),windowstyle)
    --window:HideMouse()

    --Create the graphics context
    context=Context:Create(window)
    if context==nil then return end

    --Create a GUI
    local gui = GUI:Create(context)

    --Create a new widget
    local button = Widget:Create(20,20,300,50,gui:GetBase())

    --Set the widget's script to make it a button
    button:SetScript("Scripts/GUI/Button.lua")

    --Set the button text
    button:SetText("Button")

    --Create a world
    world=World:Create()
    world:SetLightQuality((System:GetProperty("lightquality","1")))

    --Load a map
    local mapfile = System:GetProperty("map","Maps/start.map")
    if Map:Load(mapfile)==false then return end
    --window:Show()

    while window:KeyDown(Key.Escape)==false do

    	--Process events
    	while EventQueue:Peek() do
    		local event = EventQueue:Wait()
    		if event.id == Event.WidgetAction then
    			if event.source == button then
    				System:Print("The button was pressed!")
    			end
    		end
    	end

    	--If window has been closed, end the program
    	if window:Closed() then break end

    	--Handle map change
    	if changemapname~=nil then
    		--Clear all entities
    		world:Clear()
    		--Load the next map
    		Time:Pause()
    		if Map:Load("Maps/"..changemapname..".map")==false then return end
    		Time:Resume()
    		changemapname = nil
    	end

    	--Update the app timing
    	Time:Update()

    	--Update the world
    	world:Update()

    	--Render the world
    	world:Render()

    	--Render statistics
    	context:SetBlendMode(Blend.Alpha)
    	if DEBUG then
    		context:SetColor(1,0,0,1)
    		context:DrawText("Debug Mode",2,2)
    		context:SetColor(1,1,1,1)
    		context:DrawStats(2,22)
    		context:SetBlendMode(Blend.Solid)
    	else
    		--Toggle statistics on and off
    		if (window:KeyHit(Key.F11)) then showstats = not showstats end
    		if showstats then
    			context:SetColor(1,1,1,1)
    			context:DrawText("FPS: "..Math:Round(Time:UPS()),2,2)
    		end
    	end

    	--Refresh the screen
    	context:Sync(true)
    end
     
    At this point the system contains everything you need to begin writing your own widget scripts. Small changes may occur in the API before the feature is finalized, but this is pretty close to the final product.
  18. Josh
One of my main goals in 2016 is to create a built-in store for Leadwerks users to buy and sell 3D models and other items through Steam. It took me a while to understand how this works:
    The client (Leadwerks Editor) sends a message to the Leadwerks.com server requesting that a purchase be made.
    The Leadwerks.com server sends a message to the Steam server requesting that a purchase be initialized.
    Steam displays a dialog to confirm the transaction in Leadwerks Editor.
    The Steam server returns the result of the transaction to the Leadwerks.com server.
    If the transaction was successful, the Leadwerks server stores a record of the transaction in an SQL database on the server.
    The editor receives a message confirming the transaction and downloads the purchased item.

     
It's sort of complicated, and it's not really something I am used to dealing with. The first step is to restore our SSL certificate, which I let lapse because we weren't using it for anything anymore. I talked to a web developer about this project a few months ago; it was put on hold over the holidays, but I will contact him again soon.
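The purchase steps listed above amount to a small state machine. Here is a toy sketch of the transaction's progression; the names are hypothetical, and the real flow runs over HTTPS between the editor, the Leadwerks server, and Steam:

```cpp
#include <cassert>

// States in the purchase round trip described above.
// This is a toy model of the message flow, not real networking code.
enum class TxnState { Requested, Initialized, Confirmed, Recorded, Delivered, Failed };

// Each step advances the transaction only if the previous step succeeded.
TxnState AdvanceTransaction(TxnState state, bool stepSucceeded)
{
    if (!stepSucceeded) return TxnState::Failed;
    switch (state)
    {
        case TxnState::Requested:   return TxnState::Initialized; // Leadwerks server asks Steam to initialize
        case TxnState::Initialized: return TxnState::Confirmed;   // user confirms in the Steam dialog
        case TxnState::Confirmed:   return TxnState::Recorded;    // record the sale in the SQL database
        case TxnState::Recorded:    return TxnState::Delivered;   // editor downloads the purchased item
        default:                    return state;                 // terminal states stay put
    }
}
```

A failure at any step, such as the user canceling the Steam dialog, drops the transaction into the Failed state instead of advancing it.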
  19. Josh
    A new update is now available on the beta branch. This completes our migration to the newer SteamUGC system for Workshop content. Downloads should be working reliably now, but please let me know if you experience any trouble.
     
    Steam protocol calls are now supported on Linux. Most notably, the Steam interface will open the Workshop now, instead of it opening in a web page.
     
    An extra color option for the perspective viewport has been added, as well as several new color schemes. Select the Tools > Options menu, then the Colors tab, and press the Load button to see them all.
     
    Leadwerks Game Player has also been updated and should work reliably.
     
    (Build 611789 was updated to 620411.)
  20. Josh
    The beta branch has been updated. The following changes have been made:
Rolled beta branch back to release version, with the changes below.
    Added new FBX converter.
    Fixed Visual Studio project template debug directory.
    Fixed Visual Studio project template Windows Platform SDK version problem.

    If everything is okay with this then it will go out on the default branch soon.
  21. Josh
    Leadwerks Engine 2.43
    ...will be released tomorrow, along with a new source distro. I've fixed a number of bugs, but I don't like compiling releases when I am tired because there's a lot of little steps to mess up, so I will do it in the morning.
     
    Leadwerks Engine 3
    Optics was always my favorite subject in physics, and I've been getting some amazing results lately by modeling computer graphics after real lighting phenomena.
     
    Once I decided to make the materials system like 3ds max, everything became easy. The engine chooses an "ubershader" variation based on what texture slots a material has a texture assigned to. Creating a normal mapped material is as easy as creating a material and adding two textures. The engine will assume texture slot 0 is the diffuse map and slot 1 is the normal map, and will load a shader based on that. Predefined slots include diffuse, normal, specular, displacement, reflection, emission, refraction, and opacity maps. Of course, you can still explicitly assign a shader if you need something special. The material below was created just by dragging some textures into different slots and adjusting their strength:
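The slot-based selection described above can be sketched as building a bitmask key from the filled texture slots, which then selects an ubershader variation. This is an illustrative sketch, not the engine's source:

```cpp
#include <cassert>

// Predefined texture slots, as described above. Only the first few
// are listed; the enum values are an assumption for illustration.
enum TextureSlot
{
    SLOT_DIFFUSE = 0,
    SLOT_NORMAL = 1,
    SLOT_SPECULAR = 2,
    SLOT_DISPLACEMENT = 3
    // ...reflection, emission, refraction, opacity
};

// Build a bitmask describing which slots have a texture assigned.
// The engine can use this key to pick an ubershader variation, so a
// material with slots 0 and 1 filled automatically gets the
// normal-mapped shader variant.
unsigned int ShaderKey(const bool slotFilled[], int slotCount)
{
    unsigned int key = 0;
    for (int i = 0; i < slotCount; ++i)
    {
        if (slotFilled[i]) key |= (1u << i);
    }
    return key;
}
```

A diffuse-plus-normal material produces key 3, while a diffuse-only material produces key 1, and each key maps to one compiled shader variation.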

     
Cubemaps are built into the model ubershader, and there's support for reflection, refraction, or both, using a fresnel term to combine them. Chromatic aberration is also supported, which splits refracted light into its RGB components:
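A fresnel term makes reflectivity rise toward grazing angles, which is what lets a shader blend reflection and refraction plausibly. Here is a sketch using Schlick's common approximation; the f0 value in the test is an assumption for illustration, and this is not the engine's actual shader:

```cpp
#include <cassert>
#include <cmath>

// Schlick's approximation of the Fresnel term: reflectivity rises
// toward 1.0 at grazing angles. f0 is the reflectance at normal
// incidence (roughly 0.04 for glass).
float FresnelSchlick(float cosTheta, float f0)
{
    return f0 + (1.0f - f0) * std::pow(1.0f - cosTheta, 5.0f);
}

// Combine reflected and refracted light with the fresnel term, as
// the ubershader does conceptually. Single-channel for brevity.
float MixReflectRefract(float reflected, float refracted, float cosTheta, float f0)
{
    float f = FresnelSchlick(cosTheta, f0);
    return f * reflected + (1.0f - f) * refracted;
}
```

Looking straight at a surface (cosTheta near 1) you mostly see refraction; at a glancing angle (cosTheta near 0) the reflection dominates.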

     
While I was getting into all these advanced optics, I decided to take a stab at color grading, and the results are great. You create a 3D texture (it's easier than it sounds) which gets used as a color lookup table in the post-processing filter. To make a new color table you can just run the source 2D image through a Photoshop filter and save it. Color grading gives a scene an overall feel and makes colors look consistent. It's a technique used extensively in film, starting with 2000's O Brother, Where Art Thou?:

     
Here's another example:

     
    And here's a simple shot in the engine. The original:

     
    And graded with a "cool" color table:

     
It's more advanced than just tinting the screen a certain color, because this effect can actually emphasize a specific range of colors.
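Conceptually, the grading step is just a lookup: each input color indexes into the 3D table and is replaced by the color stored there. Here is a minimal sketch using nearest-neighbor sampling; a real shader samples a filtered 3D texture, and the flat-array RGB layout here is an assumption for illustration:

```cpp
#include <cassert>
#include <cstddef>

// Look up a color in a 3D lookup table using nearest-neighbor
// sampling. The table is a flat array of size*size*size RGB triples,
// indexed as [blue][green][red].
const float* LookupColor(const float* table, int size, float r, float g, float b)
{
    // Quantize each channel (0..1) to the nearest table coordinate.
    int ri = static_cast<int>(r * (size - 1) + 0.5f);
    int gi = static_cast<int>(g * (size - 1) + 0.5f);
    int bi = static_cast<int>(b * (size - 1) + 0.5f);
    std::size_t index = (static_cast<std::size_t>(bi) * size * size +
                         static_cast<std::size_t>(gi) * size +
                         static_cast<std::size_t>(ri)) * 3;
    return &table[index];
}
```

An identity table maps every color to itself; running the table image through a Photoshop filter bakes that filter into the lookup, and the post-process then applies it to every pixel.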
     
    Preparing the Asset Store
    Now that Leadwerks.com is hosted on our own dedicated server, I was able to create a more secure folder to store asset store files in that can only be accessed by the forum system. Previously, we had dynamic download URLs working, but the same file could still be downloaded from any browser within a few minutes before the dynamic URL changed. This will give better security for asset store merchants. According to my big flowchart, the only things left to do are to fix the side-scrolling main page and set up a merchant account with our bank, and then we'll be ready to launch.
     
    Great movie and soundtrack, by the way:


  22. Josh
    An update is available on the beta branch with the following changes:
    Implemented "CollisionMesh"-named limbs in models.
    Added "Strip vertex colors" option in model editor.
    Workshop items can now be submitted as free or curated paid items, although no mechanism is in place yet to actually purchase items.

     

     
    Curated items appear in the Workshop and can be voted for, but cannot be downloaded.
     

     
    You can enter your bank information for receiving payments, and you can even create a revenue split between multiple people.
     

     
    Although purchasing items is not yet supported, this is a first step that allows you to prepare for when it is implemented. Any paid items you submit will be viewable, but cannot be used or downloaded by other users yet.
  23. Josh
    Leadwerks Engine SDK 2.31 is now available. A new SDK installer in the download area allows you to download different versions of the SDK. You must have an activated Leadwerks account to download the new SDK installer.
     
    New features include lighting optimizations for point and spot lights. You can read about this feature in detail here. Another new feature is character controller crouching behavior. Note that the origin of character controllers has been moved to the very bottom of the controller, instead of the vertical center. In addition, particle emitters will now store the particle velocity in the first texture coordinate array of the surface. This allows the implementation of directional particles, for elongated sparks or other non-circular particles.
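Storing the velocity per-vertex lets a shader orient and stretch each particle along its direction of motion. Here is a minimal 2D sketch of the orientation step; the struct and function are hypothetical helpers, not the engine's shader:

```cpp
#include <cassert>
#include <cmath>

// Minimal 2D vector for the sketch.
struct Vec2 { float x, y; };

// Returns the particle's long-axis direction (normalized velocity),
// which a shader can use to orient an elongated spark quad. The
// velocity comes from the surface's first texture coordinate array.
Vec2 ParticleAxis(Vec2 velocity)
{
    float len = std::sqrt(velocity.x * velocity.x + velocity.y * velocity.y);
    if (len == 0.0f) return Vec2{0.0f, 1.0f}; // degenerate: treat as a point particle
    return Vec2{velocity.x / len, velocity.y / len};
}
```

The quad's corners are then offset along this axis by the desired stretch length, producing the elongated, non-circular particles described above.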
     
Shadow updating can be a bottleneck in rendering. To make matters worse, point lights require a total of six passes when their shadows are refreshed. We solved this problem by only redrawing the shadows of objects that have moved. The updated shadow is then combined with the rest of the cached shadow buffer to make the final shadow map. The net result is fully dynamic point light shadows using a proper six-sided shadow map, with less rendering cost and better quality than dual-paraboloid shadow maps.
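The caching idea can be sketched as a per-face dirty flag: only the cube faces containing a moved object get redrawn, and the remaining faces are reused from the cached shadow buffer. This is an illustrative sketch, not the engine's renderer:

```cpp
#include <cassert>
#include <vector>

// A shadow-casting object as seen by one point light. The struct is
// a hypothetical helper for this sketch.
struct ShadowCaster
{
    int cubeFace; // which of the 6 cube faces (0..5) this caster falls on
    bool moved;   // did the object move since the last shadow update?
};

// Mark which of the light's six cube faces need re-rendering. Faces
// with no moved casters keep their cached shadow contents.
std::vector<bool> FacesToRedraw(const std::vector<ShadowCaster>& casters)
{
    std::vector<bool> redraw(6, false);
    for (const ShadowCaster& c : casters)
    {
        if (c.moved) redraw[c.cubeFace] = true;
    }
    return redraw;
}
```

In a mostly static scene only one or two faces are typically dirty per frame, so the six-pass cost is paid only when something actually moves.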
     




  24. Josh
    I've been working with "Klepto2" to integrate the Scintilla text control into the new Leadwerks editor. Scintilla is very powerful and feature-rich, but is somewhat insane to work with. You've got to have a Scintilla expert onboard to integrate it successfully, which we fortunately do.
     
Previously, I was relying on calls to a Debug:Stop() function to control breakpoints. These were hard-coded into your script program, and breakpoints could not be dynamically added or removed. Since Scintilla gives us the ability to edit breakpoints in the editor, I rewrote the debugging interface to provide more communication between the application and the debugger. Breakpoints can now be added and removed while the program is running. You can also pause the program at any time from the editor and see where the code is executing.
     
    The script editor includes a great debugger that lets you view the entire contents of the Lua virtual machine. As a finishing touch, I added in some icons from Microsoft's Visual Studio icon pack. As a programmer, the result is something I really like working with:

     
    We think having a professional-grade code IDE integrated in the editor will provide a smoother and more seamless user experience, so you don't have to switch back and forth with third-party IDEs.
  25. Josh
    Lately I've been talking a lot about making non-programmers happy, so here's something for the coders. B)
     
    For Leadwerks3D documentation, I considered three possibilities:
    -IPB-based docs
    -Wiki
    -Offline CHM files
     
    Each has their own strengths and weaknesses, as always seems to be the case.
     
    IPB
A bit slower, harder to organize, harder to edit, looks really good and consistent with the site; search requires quite a few clicks to get where you want to go, and the missing table of contents is a bummer for long pages.
     
    Wiki
A target for spam, doesn't match the main website, user-written docs cause fragmentation and a mix of official and unofficial information, nice table of contents, good searchability.
     
    Offline CHM
Fast, doesn't require an internet connection, easy to edit, but Windows-only.
     
    I dismissed offline docs simply because we wouldn't be able to post a link to refer people to a certain page. I decided on a wiki because of the better search functionality, table of contents, and ease of editing.
     
    Here's what I have in mind for the docs. Because Leadwerks3D is written in C++, we're able to document members, constructors, operators, and other stuff we haven't had access to in the past. Best of all, almost everything works exactly the same in Lua, including function and operator overloading.
    http://www.leadwerks.com/newwiki/index.php?title=Vec3
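As an illustration of the kind of overloaded operators those pages can now document, here is a minimal Vec3 sketch; the real Leadwerks class has many more members and methods:

```cpp
#include <cassert>

// A minimal Vec3 with overloaded operators, of the kind the new docs
// describe for both C++ and Lua. Illustrative sketch only.
struct Vec3
{
    float x, y, z;

    // Component-wise addition.
    Vec3 operator+(const Vec3& v) const { return Vec3{x + v.x, y + v.y, z + v.z}; }

    // Scalar multiplication.
    Vec3 operator*(float s) const { return Vec3{x * s, y * s, z * s}; }
};
```

Because Lua supports operator metamethods, the equivalent Lua code (a + b, a * 2) reads the same as the C++, which is why one documentation page can serve both languages.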
     
    I'm also a big fan of the ease with which we can add internal links. For example, the Vec3 members in the AABB class documentation link straight to the Vec3 class page:
    http://www.leadwerks.com/newwiki/index.php?title=AABB#Members
     
    To avoid spam, and to make sure I can answer any questions about the contents of the wiki, it's going to be read-only. Community tutorials have been very useful in the past, and we're going to continue that going forward, either through a subforum or database for Leadwerks3D tutorials.
     
I think having one page per command, with an example for every single command, has been more redundant and time-consuming than necessary. I want to list the syntax for each command only once, like this:
    http://www.leadwerks.com/newwiki/index.php?title=Entity#SetPosition
     
    Instead of making a lot of redundant examples (Do we really need a page of code for Entity::SetPosition() AND Entity::SetRotation()?) I hope to provide more comprehensive examples showing how to actually do something, and how the functions can be used together to do interesting things.
     
    In general, I want to go into much more depth with the Leadwerks3D documentation, on all aspects of using the program, because I plan for this iteration of our technology to have a pretty long life. If you have any suggestions, just let me know.