Blog Entries posted by Josh

  1. Josh
    A new update is available on the beta branch on Steam. I've merged the OpenGL2 and OpenGL4 graphics driver classes into a new class called simply "OpenGLGraphicsDriver". Everything should work, but let me know if you find any behavior changes between 4.2 and this new beta.
     
    OGG support has also been added, thanks to MartyJ's help. The following header search paths must be added to C++ projects:
    ..\..\Source\Libraries\libogg\include
    ..\..\Source\Libraries\libvorbis\include
    ..\..\Source\Libraries\libvorbis\lib
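    For reference, once the library paths are added, loading an .ogg file should work through the existing sound API. Here's a minimal sketch, assuming the usual Sound and Source classes and a hypothetical file path:
    #include "Leadwerks.h"

    using namespace Leadwerks;

    void PlayMusic()
    {
        //Load the compressed .ogg file like any other sound asset (hypothetical path)
        Sound* sound = Sound::Load("Sound/Music/theme.ogg");
        if (sound == NULL) return;

        //Create a source to play the loaded sound
        Source* source = Source::Create();
        source->SetSound(sound);
        source->Play();
    }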

  2. Josh
    The Leadwerks 2 terrain system was expansive and very fast, which allowed rendering of huge landscapes. However, it had some limitations. Texture splatting was done in real-time in the pixel shader. Because of the limitations of hardware texture units, only four texture units per terrain were supported. This limited the ability of the artist to make terrains with a lot of variation. The landscapes were beautiful, but somewhat monotonous.
    With the Leadwerks 3 terrain system, I wanted to retain the advantages of terrain in Leadwerks 2, but overcome some of the limitations. There were three different approaches we could use to increase the number of terrain textures.
    1. Increase the number of textures used in the shader.
    2. Allow up to four textures per terrain chunk. These would be determined either programmatically based on which texture layers were in use on that section, or defined by the artist.
    3. Implement a virtual texture system like id Software used in the game "Rage".
    Since Leadwerks 3 runs on mobile devices as well as PC and Mac, we couldn't use any more texture units than we had before, so the first option was out. The second option is how Crysis handles terrain layers. If you start painting layers in the Crysis editor, you will see "old" layers disappear as you paint new ones on. This struck me as a bad approach because it would either involve the engine "guessing" which layers should have priority, or involve a tedious process of user-defined layers for each terrain chunk.
    A virtual texturing approach seemed like the ideal choice. Basically, this would render near sections of the terrain at a high resolution, and far sections of the terrain at low resolutions, with a shader that chose between them. If done correctly, the result should be the same as using one impossibly huge texture (like 1,048,576 x 1,048,576 pixels) at a much lower memory cost. However, there were some serious challenges to be overcome, so much so that I added a disclaimer in our Kickstarter campaign basically saying "this might not work".
    Previous Work
    id Software pioneered this technique with the game Rage (a previous implementation was in Quake Wars). However, id's "megatexture" technique had some serious downsides. First, the data size requirements of storing completely unique textures for the entire world were prohibitive. "Rage" takes about 20 gigs of hard drive space, with terrains much smaller than the size I wanted to be able to use. The second problem with id's approach is that both games using this technique have some pretty blurry textures in the foreground, although the unique texturing looks beautiful from a distance.

    I decided to overcome the data size problem by dynamically generating the megatexture data, rather than storing it on the hard drive. This involves a pre-rendering step where layers are rendered to the terrain virtual textures, and then the virtual textures are applied to the terrain. Since id's art pipeline was basically just conventional texture splatting combined with "stamps" (decals), I didn't see any reason to permanently store that data. I did not have a simple solution to the blurry texture problem, so I just went ahead and started implementing my idea, with the understanding that the texture resolution issue could kill it.
    I had two prior examples to work from. One was a blog from a developer at Frictional Games (Amnesia: The Dark Descent and Penumbra). The other was a presentation describing the technique's use in the game Halo Wars. In both of these games, a fixed camera distance could be relied on, which made the job of adjusting texture resolution much easier. Leadwerks, on the other hand, is a general-purpose game engine for making any kind of game. Would it be possible to write an implementation that would provide acceptable texture resolution for everything from flight sims to first-person shooters? I had no idea if it would work, but I went forward anyway.
    Implementation
    Because both Frictional Games and id had split the terrain into "cells" and used a texture for each section, I tried that approach first. Our terrain already gets split up and rendered in identical chunks, but I needed smaller pieces for the near sections. I adjusted the algorithm to render the nearest chunks in smaller pieces. I then allocated a 2048x2048 texture for each inner section, and used a 1024x1024 texture for each outer section:

    The memory requirements of this approach can be calculated as follows:
    1024 * 1024 * 4 * 12 = 50331648 bytes
    2048 * 2048 * 4 * 8 = 134217728 bytes
    Total = 184549376 bytes = 176 megabytes
    176 megs is a lot of texture data. In addition, the texture resolution wasn't even that good at near distances. You can see my attempt with this approach in the image below. The red area is beyond the virtual texture range, and only uses a single low-res baked texture. The draw distance was low, the memory consumption high, and the resolution was too low.

    This was a failure, and I thought maybe this technique was just impractical for anything but very controlled cases in certain games. I wasn't ready to give up yet without trying one last approach. Instead of allocating textures for a grid section, I tried creating a radiating series of textures extending away from the camera:

    The resulting resolution wasn't great, but the memory consumption was a lot lower, and terrain texturing was now completely decoupled from the terrain geometry. I found by adjusting the distances at which the texture switches, I could get a pretty good resolution in the foreground. I was using only three texture stages, so I increased the number to six and found I could get a good resolution at all distances, using just six 1024x1024 textures. The memory consumption for this was just 24 megabytes, a very reasonable number. Since the texturing is independent from terrain geometry, the user can fine-tune the texture distances to accommodate flight sims, RPGs, or whatever kind of game they are making.
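    To illustrate the idea, here is a rough sketch of how a pixel's distance from the camera could be mapped to one of the radiating texture stages. The doubling of range per stage and the names are assumptions for the example, not the engine's actual code:
    //Pick which radiating virtual texture stage covers a given camera distance.
    //'baserange' is the radius covered by the innermost 1024x1024 texture, and
    //each stage is assumed to cover twice the range of the previous one.
    int SelectVirtualTextureStage(float distance, float baserange, int stagecount)
    {
        float range = baserange;
        for (int stage = 0; stage < stagecount; ++stage)
        {
            if (distance <= range) return stage;
            range *= 2.0f;
        }
        //Beyond the last stage, fall back to the coarsest texture
        return stagecount - 1;
    }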

    The last step was to add some padding around each virtual texture, so the virtual textures do not have to be completely redrawn each time the camera moves. I used a padding value of 0.25 times the size of the texture range, so the various virtual textures only get redrawn once in a while.
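    Here is a small sketch of that padding rule. The structure and names are hypothetical; the point is just that a stage is only re-rendered once the camera has drifted more than a quarter of its range from the position it was last rendered around:
    #include <cmath>

    struct VirtualTextureStage
    {
        float range;            //world-space radius this stage covers
        float centerx, centerz; //camera position the stage was last rendered around
    };

    bool StageNeedsRedraw(const VirtualTextureStage& stage, float camerax, float cameraz)
    {
        float dx = camerax - stage.centerx;
        float dz = cameraz - stage.centerz;
        float drift = std::sqrt(dx * dx + dz * dz);
        //The 0.25 padding factor described above
        return drift > stage.range * 0.25f;
    }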
    Advantages of Virtual Textures
    First, because the terrain shader only has to perform a few texture lookups per pixel with almost no math, the new terrain shader runs much faster than the old one. Since the bottleneck for most games is pixel fillrate, this will make Leadwerks 3 games faster. Second, this allows us to use any number of texture layers on a terrain, with virtually no difference in rendering speed. This gives artists the flexibility to paint anything they want on the terrain, without worrying about budgets and constraints. A third advantage is that this allows the addition of "stamps", which are decals rendered directly into the virtual texture. This allows you to add craters, clumps of rocks, and other details directly onto the terrain. The cost of rendering them in is negligible, and the resulting virtual texture runs at the exact same speed, no matter how much detail you pack into it. Below you can see a simple example. The smiley face is baked into the virtual texture, not rendered on top of the terrain:

    Conclusion
    The texture resolution problem I feared might make this approach impossible was solved by using a graduated series of six textures radiating out around the camera. I plan to implement some reasonable default settings, and it's only a minor hassle for the user to adjust the virtual texture draw distances beyond that.
    Rendering the virtual textures dynamically eliminates the high space requirements of id's megatexture technique, and also gets rid of the problems of paging texture data dynamically from the hard drive. At the same time, most of the flexibility of the megatexture technique is retained.
    Having the ability to paint terrain with any number of texture layers, plus the added stamping feature gives the artist a lot more flexibility than our old technique offered, and it even runs faster than the old terrain. This removes a major source of uncertainty from the development of Leadwerks 3.1 and turned out to be one of my favorite features in the new engine.
  3. Josh
    Day/night cycles are something I have thought about for a long time. There are several possible approaches to simulating them, but it wasn't until today that I decided which is best. Here are some options I considered:
     
    1. Shader-based skies with particle clouds.
    This is the method Crysis employs. An atmospheric scattering shader creates the sky background. The mathematics are pretty complex, but the results for an empty blue sky look great. Particle clouds are used to place sparse clouds in the sky.
     
    Pros: Clear skies look great.
    Cons: Not very good for overcast skies, clouds look worse than a skybox, expensive to render.
     
    2. Animated skyboxes.
    The idea is to play a looping sequence of pre-rendered skybox frames covering a 24-hour cycle, and interpolate between the nearest two frames at all times.
     
    Pros: Potentially very high quality results, with a low cost of rendering.
    Cons: Long rendering times to generate skybox textures, no ability to tweak and adjust skies.
     
    The solution I finally arrived at is similar to the day/night cycles in STALKER: Call of Pripyat. A skybox with uniform lighting is used, so that it is not apparent from which direction the sunlight should be coming. The ambient light level, directional light color, fog, and skybox color are adjusted according to a set of colors for a 24-hour cycle. A large sun corona can be used to simulate the sun moving across the sky, and the directional light angle can be synced with the sun.
     
    Pros: Adjustable and dynamic, easy to add new skyboxes, low cost to render. Skyboxes can be changed to alter weather.
    Cons: Skyboxes with a distinct direction of lighting won't look good.
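    To make the color cycle concrete, here is a minimal sketch of the interpolation, using placeholder types and key colors rather than the actual engine API. The same blend would be applied to the ambient light, directional light color, fog color, and skybox color:
    #include <cmath>

    struct Color { float r, g, b, a; };

    Color Blend(const Color& a, const Color& b, float t)
    {
        Color c;
        c.r = a.r + (b.r - a.r) * t;
        c.g = a.g + (b.g - a.g) * t;
        c.b = a.b + (b.b - a.b) * t;
        c.a = a.a + (b.a - a.a) * t;
        return c;
    }

    //'keys' holds one color per hour (24 entries); 'hour' is the time of day in the range [0,24)
    Color GetCycleColor(const Color keys[24], float hour)
    {
        int h0 = (int)std::floor(hour) % 24;
        int h1 = (h0 + 1) % 24;            //wrap midnight back to hour 0
        float t = hour - std::floor(hour); //blend factor between the two nearest keys
        return Blend(keys[h0], keys[h1], t);
    }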
     
    Here are some of my results using this method:



    Of course, you are still free to implement any method you choose, but I am pretty satisfied with this approach for my own work, and I will make a scripted entity available to control these settings.
  4. Josh
    Here is the amazing first shot of Leadwerks Engine 3. Behold not one, but TWO triangles! It's a dual display of isosceles inspiration.
     

     
    Believe it or not, this screenshot actually demonstrates a feature that's new for LE3...hardware multisampling. Look carefully at the edges of the triangle:
     

     
    This isn't the first time I've done multisampling, but until now it hasn't been possible to use with deferred rendering. It's now possible to combine these techniques with OpenGL4. You'll have our awesome lighting combined with high-quality 16x antialiasing for raytracer-like graphics. Oh yeah, multiple layers of transparency, each with proper light and shadows, will also be supported.
     
    I ran into problems getting Code::Blocks to link the GLEW library. Something struck me when I was Googling around for docs. I became aware that most Code::Blocks programmers are probably coding for Linux, and all the examples I read just assumed I was using MS Visual Studio. I didn't give in to the C++ monster just so I could go off on another "non-standard" (if there is such a thing) development route, so I am back on MSVC. I really don't like the poor (or complete lack of) error handling, but I am just going to have to adjust my coding style to this environment. If anyone knows if either the pro version of MSVC or the Intel compiler will stop on the first error and select the line it occurs on, I would gladly pay for that functionality.
     
    As annoying as C++ can be, there are some compelling reasons to write the next iteration of our technology in it. First, I need C++ source code to have something I can sell to professional studios. Second, it is extremely difficult to find programmers willing to work in either BlitzMax or C, and dealing with the interface between BlitzMax and C libraries can get ugly. It's not so technically difficult, but you are doing something in the world no one else is, so there's not any support for it. Having Leadwerks Engine 3 written in pure C++ code allows me to work with other programmers who can handle various tasks like networking, some special effects, and ports to other platforms. It also makes it easier to include external libraries for added features. Finally, only C++ can be used for all the platforms I want to support, including Android, iPhone, XBox, PS3, Wii, etc.
     
    C++ programmers will like this a lot because the LE3 command set allows them more power than they have had before. It's possible to write your own drivers for physics, graphics, and device input, and have it all work with the core engine. You can also extend existing classes. This is particularly useful for entity programming.
     
    Should you switch to C++ for LE3? If you've already got another language you're comfortable with, I don't see any reason to. When you are programming a game, flexibility and production efficiency are more important than the raw speed of the final program. Let's say that C++ gave a 25% speed increase over another language. This is probably not realistic, but for the sake of argument let's use that number. That still doesn't mean much, if the user can get a 300% speed increase by switching the GPU or changing some quality settings. So unless you want to support a lot of extra platforms or you just like C++, I don't see any big advantage when it comes to actual game authoring. I need a C interface for use with BlitzMax, because I intend to continue making the editor with that language. Therefore, our policy of allowing you to "code the way you want", with any language you want, will continue.
     
    By the way, Khronos has done a really great job with the design of OpenGL 3.1 and onwards. A few points of interest are that immediate mode rendering is finally done away with:

    glBegin(GL_TRIANGLES);
    glVertex3f(1,2,3);
    glVertex3f(1,4,3);
    glVertex3f(1,2,4);
    glEnd();
    No more! It doesn't make sense to have three different ways of drawing primitives, and it just makes the drivers overly complicated. You should learn the one right way of doing things, from the start.
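    For comparison, here is a minimal sketch of the "one right way" using a vertex buffer. It assumes a core-profile context is already current, GLEW has been initialized, and 'program' is a compiled shader with a vec3 position attribute at location 0:
    #include <GL/glew.h>

    void DrawTriangle(GLuint program)
    {
        //The same triangle as above, submitted through a vertex buffer object
        const GLfloat vertices[] = { 1,2,3,  1,4,3,  1,2,4 };

        GLuint vao, vbo;
        glGenVertexArrays(1, &vao);
        glBindVertexArray(vao);

        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

        //Generic attribute 0 holds the vertex positions
        glEnableVertexAttribArray(0);
        glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, 0);

        glUseProgram(program);
        glDrawArrays(GL_TRIANGLES, 0, 3);

        glDeleteBuffers(1, &vbo);
        glDeleteVertexArrays(1, &vao);
    }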
     
    Also, the shaders have all the remnants of the old fixed function pipeline cleaned out. The GPU doesn't have a "color array" or a "texcoord array". It just has vertex arrays and you decide what they are, and what format they should be in. So I can do things like compress surface normals into four bytes and not worry about needing the color array for something else.
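    As a quick sketch of the compressed-normal idea, a generic attribute can be fed four signed bytes per vertex and normalized back to the -1..1 range by the driver; the attribute location here is an arbitrary choice for the example:
    #include <GL/glew.h>

    //Bind a buffer of packed normals (four signed bytes per vertex) to attribute 2
    void BindPackedNormals(GLuint normalbuffer)
    {
        glBindBuffer(GL_ARRAY_BUFFER, normalbuffer);
        glEnableVertexAttribArray(2);
        glVertexAttribPointer(2, 4, GL_BYTE, GL_TRUE, 0, 0);
    }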
     
    I didn't get much into it, but it looks like shaders have a nicer implementation of what used to be called "varyings". Variables are just "in" or "out" in the vertex and fragment shaders. Uniforms remain unchanged. I might have a play at geometry shaders, but two years later I am still not sure what they are for, other than something that would have been nice when stencil shadows were popular. Of course, what I am really looking forward to is the OpenGL4 hardware tessellation.
     
    Anyways, I am pretty comfortable with C++, and am happy to be back into graphics, my favorite area of technology. Thanks again to Roland Strålberg for his advice with some C++ details. Have a good week!
  5. Josh
    I spent a lot of time last weekend making sure resources are correctly shared between rendering contexts. It's surprising how many commercial games make you restart the game to switch graphics resolutions, and I find it annoying. Leadwerks Engine 3 uses a small hidden window with an OpenGL context to create the OpenGL 3.3 contexts, and it just stays open so there is always a persistent context with which resources are shared. Textures, shaders, and vertex buffers can all be shared between OpenGL contexts, but oddly, frame buffer objects cannot. This is probably because FBOs are small objects that don't consume large amounts of memory, but it still seems like a strange design choice. I got around this problem by using the persistent background context whenever any FBO commands are called, so buffers will continue to work after you delete a context. So I guess the way to describe that is I start with something that is sort of awkward to work with, and encapsulate it in something that makes more sense, to me at least.
     
    Because the background context is created in the OpenGL3GraphicsDriver constructor, you can start calling 3D commands as soon as you create the driver object, without creating a visible 3D window! Weird and cool. No idea yet if it will work like this on MacOS, but I'll find out soon enough, since I ordered my iMac last week. I got the 27-inch model with the 3.2 ghz dual core CPU, and upgraded the GPU to an ATI 5750. I chose the 3.2 ghz dual core over the 2.8 ghz quad core because I have found in general usage, my quad core rarely goes over 50% usage, and I would rather have a faster clock speed per core.
     
    I said earlier that the window/context design was a little tricky to figure out, especially when you take into consideration the external windows people will want to use. In Leadwerks Engine 2, this was accomplished via a custom buffer, where callbacks were used to retrieve the context dimensions, and the user was responsible for setting up an OpenGL window. Well, initializing pixel format and OpenGL version on a window is a somewhat tricky thing, and if it's possible I would like to avoid making you deal with that. I ended up with a window design that is quite a lot more advanced than the simple Graphics() command in LE2. The window is created from a GUIDriver, which implies other parts of a cross-platform GUI might one day be included. The design is modeled similarly to MaxGUI for BlitzMax. To create a window, we do this:

    Gadget* window = CreateWindow("My window",0,0,1024,768,NULL,WINDOW_FULLSCREEN);
    Then you can create a graphics context on the window (or any other gadget):

    Context* context = GraphicsDriver->CreateContext(window);
    We can check for events in our game loop like this:

    while (PeekEvent())
    {
        Event ev = WaitEvent();
        switch (ev.id)
        {
            case EVENT_WINDOWCLOSE:
                return 0;
        }
    }
    At this time, windows are the only supported gadget, but the framework is there for adding additional gadgets in the future. This system can also be used to implement the skinned game GUI as well as native interface elements. Before Rick says anything, yes, there will be a custom event handler function you can attach to a gadget instead of polling events.
     
    You can render to an external window just by supplying the HWND (on Windows) to the GUIDriver->CreateGadget(HWND hwnd) command. This will create a "Gadget" object from any valid hwnd, and it can then have a context created on it, like the above example.
     
    Simple deferred lighting is working, just using a directional light with no shadows. On both AMD and NVidia cards, the engine can render 16x MSAA deferred lighting. The gbuffer format in Leadwerks Engine 3 is only 12 bytes per pixel. Per-pixel motion blur will add a couple more bytes:
     
    color0 (RGBA8)
    diffuse.r
    diffuse.g
    diffuse.b
    specular intensity
     
    color1 (RG11B10)
    normal.x
    normal.y
    normal.z
     
    color2 (RGBA8)
    emission.r
    emission.g
    emission.b
    materialid
     
    Material properties are sent in an array to the GPU, and the material ID is used to look up properties like specular reflection color, gloss value, etc. So this is a very efficient usage of texture bandwidth. My GEForce 9800 GTX can handle 1920x1080 with 8x MSAA, but 16x seems to go over a threshold and the card crawls. I don't know if you'll be using 16x MSAA in a lot of games, but if nothing else it makes for fantastic screen shots, and the lower resolution antialias options are still there. I personally don't see a big improvement past 4x multisampling in most games.
     

     
    Here's my current test program. It's more low-level than you will have to work with. You won't have to create the gbuffer and draw lighting yourself like I am here, but you might like seeing how things work internally:

    #include "le3.h" using namespace le3; int main() { InitFileFactories(); //Create GUI driver GUIDriver* guidriver = new WindowsGUIDriver; //Create a window Gadget* window = guidriver->CreateWindow("Leadwerks",0,0,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE); if (!window) { Print("Failed to create window"); return 0; } //Create graphics driver GraphicsDriver* graphicsdriver = new OpenGL3GraphicsDriver; if (!graphicsdriver->IsSupported()) { Print("Graphics driver not supported."); return 0; } //Create a graphics context Context* context = CreateContext(window,0); if (!context) { Print("Failed to create context"); return 0; } //Create world World* world = new World; //Create a camera Camera* camera = CreateCamera(); camera->SetClearColor(0.5,0.5,0.5,1); //Load a model LoadModel("Models/train_sd40.mdl"); //Create gbuffer #define SAMPLES 16 Buffer* gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES); Texture* normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES); Texture* emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES); gbuffer->SetColor(normals,1); gbuffer->SetColor(emission,2); delete normals; delete emission; //Set up light shader Shader* lightshader = LoadShader("shaders/light/directional.shd"); Mat4 lightmatrix = Mat4(1,0,0,0, 0,1,0,0, 1,0,0,0, 0,0,0,1); lightmatrix *= camera->mat; lightshader->SetUniformMat4("lightmatrix",lightmatrix); lightshader->SetUniformVec4("lightcolor",Vec4(1,0,0,1)); lightshader->SetUniformVec4("ambientlight",Vec4(0,0,0,1)); lightshader->SetUniformVec2("camerarange",camera->range); lightshader->SetUniformFloat("camerazoom",camera->zoom); //Delete and recreate the graphics context, just because we can //Resources are shared, so you can change screen resolution with no problems delete context; delete window; window = guidriver->CreateWindow("Leadwerks",200,200,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE); context = CreateContext(window,0); float yaw = 0; while (true) { //Print(graphicsdriver->VidMemUsage()); if (!window->Minimized()) { yaw +=0.25; //Adjust the camera camera->SetPosition(0,0,0,false); camera->SetRotation(0,yaw,0,false); camera->Move(0,2,-10,false); //Update the time step UpdateTime(); //Render to buffer gbuffer->Enable(); camera->Render(); //Switch back to the window background context->Enable(); //Enable shader and bind textures lightshader->Enable(); gbuffer->depthcomponent->Bind(0); gbuffer->colorcomponent[0]->Bind(1); gbuffer->colorcomponent[1]->Bind(2); gbuffer->colorcomponent[2]->Bind(3); //Draw image onto window graphicsdriver->DrawRect(0,0,context->GetWidth(),context->GetHeight()); //Turn the shader off lightshader->Disable(); //Swap the back buffer context->Swap(false); } //Handle events while (PeekEvent()) { Event ev = WaitEvent(); switch (ev.id) { case EVENT_WINDOWRESTORE: ResumeTime(); break; case EVENT_WINDOWMINIMIZE: PauseTime(); break; case EVENT_WINDOWSIZE: //Recreate the gbuffer delete gbuffer; gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES); normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES); emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES); gbuffer->SetColor(normals,1); gbuffer->SetColor(emission,2); delete normals; delete emission; break; case EVENT_WINDOWCLOSE: //Print OpenGL error to make sure nothing went wrong Print(String(glGetError())); //Exit program return 0; } } } }
    There's been some debate about the use of constructors. It would be nice to be able to use a constructor for everything, but that does not seem possible. I use a lot of abstract classes, and there is no way to use an abstract class constructor to create an object. If there was a way to turn the object into a derived class in its own constructor, that would work, but it's not supported. You certainly wouldn't want to have to call new OpenGL3Buffer, new DirectX11Buffer, or new OpenGL4Buffer depending on the graphics driver. The point of abstract classes is so you can just call their commands without knowing or caring what their derived class is. So if anyone has any other ideas, I'm all ears, but there doesn't seem to be any other way around this.
     
    What's next? I need to get some text up onscreen, and the FreeType library looks pretty good. I'll be getting the Mac version set up soon. And I am eager to get the engine working together with BlitzMax, so I can work on the editor. The graphics features of LE3 are great, but I think there are two even more important aspects. The first is the art pipeline. I've designed a system that is the absolute easiest way to get assets into the engine. More details will come later, but let's just say it's heavy on the drag and drop. The other important aspect of LE3 is the interactions system. I have concluded that programming, while necessary, is a lousy way of controlling interactions between complex objects. The Lua implementation in LE2 was good because it provided a way for people to easily share programmed objects, but it's the interactions of objects that make a system interesting, and a lot of object-oriented spaghetti is not a way to handle this. Using the interactions system in LE3 with inputs and outputs for each object is something that I think people will really like working with.
  6. Josh
    Sometimes I run into situations where I don't really know how to structure things. I don't mind this, because it usually results in some really elegant design once I figure out what to do. I just play with ideas and try not to force anything, and when the right idea arises, I will recognize it.
     
    Explaining a problem to someone else can help facilitate that process. How many times have you solved a difficult problem right after you posted a description of it on a forum somewhere? The procedure of explaining it logically to someone else can help you think more clearly about it. And so, we have today's blog topic.
     
    Below is a very rough start to the script editor. The syntax highlighting system was written about a year ago, and works beautifully, using the native text widget on both Windows and Mac.

     
    In the Leadwerks3D source code, there is a base class called an "AssetEditor". From this class the material, model, shader, texture, font, and script editor classes are derived. Like the other asset editor windows, only one instance of the script editor window will be allowed open at any time. Unlike the other asset editor windows, which display only one asset at a time, the script editor will use tabs to display multiple files. Scripts aren't a typical asset like a material or a model, so it's fine for them to behave a little differently.
     
    Any Leadwerks3D application can have its Lua state debugged. The engine uses networking commands to communicate with a debugger on a specified port. This means the engine can communicate with a debugger whether it's part of a C++ program, C# app, or standalone Lua interpreter. The debugger can display the Lua callstack and shows all variables and values in the Lua state, including full examination of C++ objects and members!
     
    I do not intend for Leadwerks3D to "play a game" in the editor. We've tried that approach and there are a lot of problems. I want Leadwerks3D to be a very solid and stable level editor, with a visual interface to do everything you need. I also want better consistency between Lua and C++ programs. Therefore, Leadwerks3D will use a run game system more similar to 3D World Studio than the Leadwerks Engine editor. A dialog will allow you to choose the application to run, command-line parameters, and other settings. These will be saved between sessions, so you can hit a key to do a "Quick Launch" and run your game. It would be possible to hook the Lua debugger into any application as it is launched, which could be very helpful.
     
    Let's go back to the script editor now. My inclination is to have F5 launch an interpreter and run the script in the currently selected tab. However, I don't think it's a good idea to have multiple game launch modes; I already described a uniform game launch mode for both Lua and C++ applications. On the other hand, going through that seems sort of counter-intuitive if you are working in the script editor and just want to run something really quickly.
     
    There's also the question of whether we want to provide a standalone script editor and debugger outside of Leadwerks3D. Or should the debugger be a standalone application as well, since someone might want to use it with a C++ application? You see, there are a lot of options and a lot of possible ways to set this up.
     
    What about Lua compile errors? I can print that out in the engine log, but how will the editor display it? If a compile error occurs, should the program pause and display the line it occurred at? What if the user just doesn't care, and wants the program to keep going?
     
    Alternatively, the user may want to just hit F5 in the script editor and check for basic syntax errors, which the command luaL_loadstring() will detect.
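    For reference, here is a small sketch of that syntax-only check using the standard Lua C API; luaL_loadstring compiles the chunk without running it, so only syntax errors are reported:
    extern "C"
    {
        #include <lua.h>
        #include <lauxlib.h>
    }
    #include <string>

    bool CheckScriptSyntax(const std::string& source, std::string& errormessage)
    {
        lua_State* L = luaL_newstate();
        //Compile only; the chunk is not executed
        int result = luaL_loadstring(L, source.c_str());
        //On failure the error message is left on top of the stack
        if (result != 0) errormessage = lua_tostring(L, -1);
        lua_close(L);
        return result == 0;
    }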
     
    That's pretty much all my questions at this point. I don't expect anyone to come along and solve my problems, but the process of describing and discussing the issues will help me come to a resolution.
  7. Josh
    As I finish up the FPS Weapons Pack, I am struck by just how much work it takes to build a game, beyond the engine programming. I wrote Leadwerks, and it still took me several days to script the behavior of these weapons. I also had the advantage of being able to add new entity classes, as I needed two new entity types to complete the task: LensFlares and Sprites.
     
    I like the flexibility of Leadwerks to make any type of game, and I think having an abstract 3D command set to do anything is really powerful. However, I feel like we're still missing something, and it's not more graphical features.
     
    The output of the community can be seen on the games page and in the Leadwerks Workshop. There are a lot more Leadwerks 3 games than Leadwerks 2 games, which is a good sign. In most cases, I think the bottleneck is on content. I don't simply mean 3D models. Those are fairly easy to obtain, although it is very hard to find a large collection of artwork with consistent art direction. The problem is there is a big gap between having a 3D model and having a scripted game-ready item you can drop into the map and start playing. Considering how much work it has taken me just to set up the first two DLCs, it's ridiculous to expect each individual to reproduce all that work themselves.
     
    You can categorize most game technology into low, mid, and high-level features. Leadwerks 2 was only strong in the lowest level category. Leadwerks 3 adds very strong support for the mid-level category, thanks to the editor. Traditionally, everything beyond that has been left up to the user to implement.
     

     
    This is something that needs to change. Low-level game technology is not very valuable anymore. I think the success of Leadwerks will be based on connecting games with players on Steam. To make that happen, we need more games. I want to make the development process a little more like modding a game, with more game-ready content ready to use. Anyone should be able to make a game without any programming at all. When you're ready to dig into Lua or C++, it's there, but you should be able to get up and running without it.
     


     
    Fortunately, the community has responded positively to the first DLC and I think the FPS Weapons Pack will sell well. I've started production of our own game assets using just a few artists. Why are we producing our own stuff, rather than reselling existing work?
    • There's a ton of content out there already, but most of it is very bad quality.
    • Even the good quality stuff doesn't feel consistent because it comes from different sources.
    • Interactive objects like characters and weapons still require a ton of work to make game-ready. Oftentimes animations have all kinds of trouble and glitches.
    • It takes a ton of work getting things ready for use. Why not just make it to spec from the start, instead of scrounging for content that oftentimes is not fit for purpose?
    • Since sales cover the cost of production, I'd rather just sell them myself and make a profit. (Two revenue streams are better than one because it gives you more options.)

     
    Right now I have my environment artist, my character artist, and will likely rely on the same person for future weapon packs. All Leadwerks content will come from the same artists, so it will all look like it fits together. The first character pack is in production now.
     
    So I think you can expect that I will start taking on more responsibility for gameplay stuff that has in the past been left up to the end user. I won't attempt to create every game under the sun, but I think Leadwerks should have more default gameplay stuff the user can modify to suit their own needs.
  8. Josh
    I'm really happy to see all the great and scary games coming out for the Halloween Game Tournament. I have something of my own to share with you.
     
    The Leadwerks 3 vegetation system is a groundbreaking new technology. All vegetation instances are procedural and dynamically placed, so that physics and rendering of any amount of instances can occur with zero memory usage. This demo proves that the idea actually works!
     
    This demo shows the new Leadwerks Game Engine 3 vegetation system in action. It uses three vegetation layers for trees, brushes, and dense grass. Without any adjustment of vegetation settings for different hardware, I am getting the following results:
    NVidia GEForce GTX 780: 301 FPS
    AMD Radeon R9 290: 142 FPS
    Intel HD Graphics 400: 15 FPS

     
    The view distances can be further adjusted to make the system run faster on low-end hardware.
     

  9. Josh
    A new easy-to-use networking system is coming soon to Leadwerks Game Engine.  Built on the Enet library, Leadwerks networking provides a fast and easy way to quickly set up multiplayer games.  Each computer in the game is either a server or a client.  The server hosts the game and clients can join and leave the game at will.  On the other hand, when the server leaves the game, the game is over!

    Creating a Client
    You can soon create a client with one command in Leadwerks:
    client = Client:Create()
    To connect to a server, you need to know the IP address of that computer:
    client:Connect("63.451.789.3")
    To get information from the other computer, we simply update the client and retrieve a message:
    local message = client:Update()
    if message.id == Message.Connected then
        print("Connected to server")
    elseif message.id == Message.Disconnected then
        print("Disconnected from server")
    elseif message.id == Message.Chat then
        print("New chat message: "..message.stream:ReadString());
    end
    You can even send messages, consisting of a simple message ID, a string, or a stream.
    client:Send(Message.Chat,"Hello, how are you today?")
    There are two optional flags you can use to control the way your messages are sent.  If you specify Message.Ordered, your packets will arrive in the order they were sent (they won't necessarily otherwise).  You can use this for updating the position of an object, so that the most recent information is always used.  The Message.Reliable flag should be used for important messages that you don't want to miss.  UDP packets are not guaranteed to ever arrive at their destination, but messages sent with this flag are.  Just don't use it for everything, since it is slower!
    When we're ready to leave the game, we can do that just as easily:
    client:Disconnect()
    A dedicated server does not have anyone playing the game.  The whole computer is used only for processing physics and sending and receiving information.  You can create a dedicated server, but it's better to let your players host their own games.  That way there's always a game to join, and you don't have to buy an extra computer and keep it running all the time.
    Creating a Server
    Your game should be able to run both as a client and as a server, so any player can host or join a game.  Creating the game server is just as easy.
    local server = Server:Create(port)
    Once the server is created, you can look up your IP address and ask a friend to join your game.  They would then type the IP address into their game and join.
    The server can send and receive messages, too.  Because the server can be connected to multiple clients, it must specify which client to send the message to.  Fortunately, the Message structure contains the Peer we received a message from.  A peer just means "someone else's computer".  If your computer is the client, the server you connect to is a peer.  If your computer is the server, all the other clients are peers:
    local message = client:Update()
    if message.id == Message.Connected then
        player2 = message.peer
    end
    You can use the peer object to send a message back to that computer:
    server:Send(peer, Message.Chat, "I am doing just great! Thanks for asking.")
    If you want to boot a player out of your game, that's easy too:
    server:Disconnect(peer)
    The broadcast command can be used to send the same message out to all clients:
    server:Broadcast(Message.Chat, "I hope you are all having a great time in my cool chat program!")
    Public Games
    You can make your game public, allowing anyone else in the world who has the game to play with you.  You specify a name for your game, a description of your server, and call this command to send a message to the Leadwerks server:
    server:Publish("SuperChat","My SuperChat Server of Fun")
    All client machines, anywhere in the world, can retrieve a list of public games and choose one to join:
    for n=0,client:CountServers("SuperChat")-1 do
        local remotegame = client:GetServer(n)
        print(remotegame.address)
        print(remotegame.description)
    end
    This is a lot easier than trying to type in one person's IP address.  For added control, you can even host a games database on your own server, and redirect your game to get information from there.
  10. Josh
    Leadwerks 4.x will see a few more releases, each remaining backwards compatible with previous versions.  Leadwerks 5.0 is the point where we will introduce changes that break compatibility with previous versions.  The changes will support a faster design that relies more on multi-threading, updates to take advantage of C++11 features, and better support for non-English speaking users worldwide.  Changes are limited to code; all asset files including models, maps, shaders, textures, and materials will continue to be backwards-compatible.
    Let's take a look at some of the new stuff.
    Shared Pointers
    C++11 shared pointers eliminate the need for manual reference counting.  Using the auto keyword will make it easier to update Leadwerks 4 code when version 5 arrives.  You can read more about the use of shared pointers in Leadwerks 5 here.
    Unicode Support
    To support the entire world's languages, Leadwerks 5 will make use of Unicode everywhere.  All std::string variables will be replaced by std::wstring.  Lua will be updated to the latest version, 5.3.  This is not compatible with LuaJIT, but the main developer has left the LuaJIT project and it is time to move on.  Script execution time is not a bottleneck; Leadwerks 5 gains a much longer window of time for your game code to run, and I don't recommend people build complex VR games in Lua.  So I think it is time to update.
    Elimination of Bound Globals
    To assist with multithreaded programming, I am leaning towards a stateless design with all commands like World::GetCurrent() removed.  An entity needs to be explicitly told which world it belongs to upon creation, or it must be created as a child of another entity:
    auto entity = Pivot::Create(world);
    I am considering encapsulating all global variables into a GameEngine object:
    class GameEngine
    {
    public:
        std::map<std::string, std::weak_ptr<Asset> > loadedassets;
        shared_ptr<GraphicsEngine> graphicsengine;
        shared_ptr<PhysicsEngine> physicsengine;
        shared_ptr<NetworkEngine> networkengine;
        shared_ptr<SoundEngine> soundengine;
        shared_ptr<ScriptEngine> scriptengine; //replaces Interpreter class
    };
    A world would need the GameEngine object supplied upon creation:
    auto gameengine = GameEngine::Create();
    auto world = World::Create(gameengine);
    When the GameEngine object goes out of scope, the entire game gets cleaned up and everything is deleted, leaving nothing in memory.
    New SurfaceBuilder Class
    To improve efficiency in Leadwerks 5, surfaces will no longer be stored in system memory, and surfaces cannot be modified once they are created.  If you need a modifiable surface, you can create a SurfaceBuilder object.
    auto builder = SurfaceBuilder::Create(gameengine);
    builder->AddVertex(0,0,0);
    builder->AddVertex(0,0,1);
    builder->AddVertex(0,1,1);
    builder->AddTriangle(0,1,2);
    auto surface = model->AddSurface(builder);
    When a model is first loaded, before it is sent to the rendering thread for drawing, you can access the builder object that is loaded for each surface:
    auto model = Model::Load("Models/box.mdl", Asset::Unique);
    for (int i=0; i<model->CountSurfaces(); ++i)
    {
        auto surface = model->GetSurface(i);
        shared_ptr<SurfaceBuilder> builder = surface->GetBuilder();
        if (builder != nullptr)
        {
            for (int v=0; v<builder->CountVertices(); ++v)
            {
                Vec3 position = builder->GetVertexPosition(v);
            }
        }
    }
    98% of the time there is no reason to keep vertex and triangle data in system memory.  For special cases, the SurfaceBuilder class does the job, and includes functions that were previously found in the Surface class like UpdateNormals().  This will prevent surfaces from being modified by the user when they are in use in the rendering thread.
    A TextureBuilder class will be used internally when loading textures and will operate in a similar manner.  Pixel data will be retained in system memory until the first render.  These classes have the effect of keeping all OpenGL (or other graphics API) code contained inside the rendering thread, which leads to our next new feature...
    Asynchronous Loading
    Because surfaces and textures defer all GPU calls to the rendering thread, there is no reason we can't safely load these assets on another thread.  The LoadASync function will simply return true or false depending on whether the file was able to be opened:
    bool result = Model::LoadASync("Models/box.mdl");
    The result of the load will be given in an event:
    while (gameengine->eventqueue->Peek())
    {
        auto event = gameengine->eventqueue->Wait();
        if (event.id == Event::AsyncLoadResult)
        {
            if (event.extra->GetClass() == Object::ModelClass)
            {
                auto model = static_cast<Model*>(event.source.get());
            }
        }
    }
    Thank goodness for shared pointers, or this would be very difficult to keep track of!
    Asynchronous loading of maps is a little more complicated, but with proper encapsulation I think we can do it.  The script interpreter will get a mutex that is locked whenever a Lua script executes so scripts can be run from separate threads:
    gameengine->scriptengine->execmutex->Lock();
    //Do Lua stuff
    gameengine->scriptengine->execmutex->Unlock();
    This allows you to easily do things like make an animated loading screen.  The code for this would look something like below:
    Map::LoadASync("Maps/start.map", world);
    while (true)
    {
        while (EventQueue::Peek())
        {
            auto event = EventQueue::Wait();
            if (event.id == Event::AsyncLoadResult)
            {
                break;
            }
        }
        loadsprite->Turn(0,0,1);
        world->Render(); //Context::Sync() might be done automatically here, not sure yet...
    }
    All of this will look pretty familiar to you, but with the features of C++11 and the new design of Leadwerks 5 it opens up a lot of exciting possibilities.
  11. Josh
    Now that we have our voxel light data in a 3D texture we can generate mipmaps and perform cone step tracing. The basic idea is to cast a ray out from each side of each voxel and use a lower-resolution mipmap for each ray step. We start with mipmap 1 at a distance that is 1.5 texels away from the position we are testing, and then double the distance with each step of the ray. Because we are using linear filtering we don't have to make the sample coordinates line up exactly to a texel center, and it will work fine:

    Here are the first results when cone step tracing is applied. You can see light bouncing off the floor and reflecting on the ceiling:

    This dark hallway is lit with indirect lighting:

    There's lots of work left to do, but we can see here the basic idea works.
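    For anyone curious, here is a rough sketch of the cone step loop described above, written as plain C++ for illustration (the real version runs in a shader). The sampleVoxel callback stands in for a linearly filtered fetch from the mipmapped 3D texture, and the front-to-back accumulation rule is an assumption for the example, not the engine's actual code:
    #include <functional>

    struct Float3 { float x, y, z; };
    struct Float4 { float r, g, b, a; };

    Float4 ConeTrace(Float3 origin, Float3 dir, float texelsize, int maxmip,
                     const std::function<Float4(Float3, int)>& sampleVoxel)
    {
        Float4 accum = { 0, 0, 0, 0 };
        float distance = texelsize * 1.5f; //start 1.5 texels away, at mipmap 1
        int mip = 1;
        while (mip <= maxmip && accum.a < 1.0f)
        {
            Float3 p = { origin.x + dir.x * distance,
                         origin.y + dir.y * distance,
                         origin.z + dir.z * distance };
            Float4 s = sampleVoxel(p, mip);
            //Front-to-back blend: nearer samples occlude farther ones
            accum.r += (1.0f - accum.a) * s.a * s.r;
            accum.g += (1.0f - accum.a) * s.a * s.g;
            accum.b += (1.0f - accum.a) * s.a * s.b;
            accum.a += (1.0f - accum.a) * s.a;
            distance *= 2.0f; //double the distance with each step
            mip += 1;         //and sample the next, lower-resolution mipmap
        }
        return accum;
    }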
  12. Josh
    TLDR: I made a long-term bet on VR and it's paying off. I haven't been able to talk much about the details until now.
    Here's what happened:
    Leadwerks 3.0 was released during GDC 2013. I gave a talk on graphics optimization and also had a booth at the expo. Something else significant happened that week.  After the expo closed I walked over to the Oculus booth and they let me try out the first Rift prototype.
    This was a pivotal time both for us and for the entire game industry. Mobile was on the downswing but there were new technologies emerging that I wanted to take advantage of. Our Kickstarter campaign for Linux support was very successful, reaching over 200% of its goal. This coincided with a successful Greenlight campaign to bring Leadwerks Game Engine to Steam in the newly-launched software section. The following month Valve announced the development of SteamOS, a Linux-based operating system for the Steam Machine game consoles. Because of our work in Linux and our placement in Steam, I was fortunate enough to be in close contact with much of the staff at Valve Software.
    The Early Days of VR
    It was during one of my visits to Valve HQ that I was able to try out a prototype of the technology that would go on to become the HTC Vive. In September of 2014 I bought an Oculus Rift DK2 and first started working with VR in Leadwerks. So VR has been something I have worked on in the background for a long time, but I was looking for the right opportunity to really put it to work. In 2016 I felt it was time for a technology refresh, so I wrote a blog about the general direction I wanted to take Leadwerks in. A lot of it centered around VR and performance. I didn't really know exactly how things would work out, but I knew I wanted to do a lot of work with VR.
    A month later I received a message on this forum that went something like this (as I recall):
    I thought "Okay, some stupid teenager, where is my ban button?", but when I started getting emails with nasa.gov return addresses I took notice.
    Now, Leadwerks Software has a long history of use in the defense and simulation industries, with orders for software from Northrop Grumman, Lockheed Martin, the British Royal Navy, and probably some others I don't know about. So NASA making an inquiry about software isn't too strange. What was strange was that they were very interested in meeting in person.
    Mr. Josh Goes to Washington
    I took my first trip to Goddard Space Center in January 2017 where I got a tour of the facility. I saw robots, giant satellites, rockets, and some crazy laser rooms that looked like a Half-Life level. It was my eleven year old self's dream come true. I was also shown some of the virtual reality work they are using Leadwerks Game Engine for. Basically, they were taking high-poly engineering models from CAD software and putting them into a real-time visualization in VR. There are some good reasons for this. VR gives you a stereoscopic view of objects that is far superior to a flat 2D screen. This makes a huge difference when you are viewing complex mechanical objects and planning robotic movements. You just can't see things on a flat screen the same way you can see them in VR. It's like the difference between looking at a photograph of an object versus holding it in your hands.

    What is even going on here???
    CAD models are procedural, meaning they have a precise mathematical formula that describes their shape. In order to render them in real-time, they have to be converted to polygonal models, but these objects can be tens of millions of polygons, with details down to threading on individual screws, and they were trying to view them in VR at 90 frames per second! Now with virtual reality, if there is a discrepancy between what your visual system and your vestibular system perceive, you will get sick to your stomach. That's why it's critical to maintain a steady 90 Hz frame rate. The engineers at NASA told me they first tried to use Unity3D but it was too slow, which is why they came to me. Leadwerks was giving them better performance, but it still was not fast enough for what they wanted to do next. I thought "these guys are crazy, it cannot be done".
    Then I remembered something else people said could never be done.

    So I started to think "if it were possible, how would I do it?" They had also expressed interest in an inverse kinematics simulation, so I put together this robotic arm control demo in a few days, just to show what could easily be done with our physics system.
     
    A New Game Engine is Born
    With the extreme performance demands of VR and my experience writing optimized rendering systems, I saw an opportunity to focus our development on something people can't live without: speed. I started building a new renderer designed specifically around the way modern PC hardware works. At first I expected to see performance increases of 2-3x. Instead what we are seeing are 10-40x performance increases under heavy loads. During this time I stayed in contact with people at NASA and kept them up to date on the capabilities of the new technology.
    At this point there was still nothing concrete to show for my efforts. NASA purchased some licenses for the Enterprise edition of Leadwerks Game Engine, but the demos I made were free of charge and I was paying my own travel expenses. The cost of plane tickets and hotels adds up quickly, and there was no guarantee any of this would work out. I did not want to talk about what I was doing on this site because it would be embarrassing if I made a lot of big plans and nothing came of it. But I saw a need for the technology I created and I figured something would work out, so I kept working away at it.
    Call to Duty
    Today I am pleased to announce I have signed a contract to put our virtual reality expertise to work for NASA. As we speak, I am preparing to travel to Washington D.C. to begin the project. In the future I plan to provide support for aerospace, defense, manufacturing, and serious games, using our new technology to help users deliver VR simulations with performance and realism beyond anything that has been possible until now.
    My software company and relationship with my customers (you) is unaffected. Development of the new engine will continue, with a strong emphasis on hyper-realism and performance. I think this is a direction everyone here will be happy with. I am going to continue to invest in the development of groundbreaking new features that will help in the aerospace and defense industries (now you understand why I have been talking about 64-bit worlds) and I think a great many people will be happy to come along for the ride in this direction.
    Leadwerks is still a game company, but our core focus is on enabling and creating hyper-realistic VR simulations. Thank you for your support and all the suggestions and ideas you have provided over the years that have helped me create great software for you. Things are about to get very interesting. I can't wait to see what you all create with the new technology we are building.
     
  13. Josh
    I recorded some clips from Dave's latest version of his scene he is working on. Somehow he managed to get more detail and faster speed. The octree optimizations in version 2.32 help a lot here. The renderer is really good at dealing with lots of small objects strewn across a scene, even if it did take some trouble before we got it working completely right. So here's a short video we'll be using to showcase the capabilities of Leadwerks Engine:
     
    Please share this on Facebook, Twitter, YouTube, etc. We want to get lots of views on this one. Thanks!
     


     

     
    Suggestions are welcome, as I will be doing a second cut with some small improvements.
  14. Josh
    What an interesting first week. I compiled a C++ program for Android, made a programming language, learned about iPhone development, and figured out a lot of C++ stuff I did not know.
     
    It's nice to see that a port to Android will work pretty much like I was hoping. I just write an abstract driver for every system, and have specific drivers that extend that class:
    class GraphicsDriver {};
    class GL4GraphicsDriver : public GraphicsDriver {};
     
    Then when you have something like a surface that is dependent on the graphics driver, you do this:
    surface = GetGraphicsDriver().CreateSurface()
     
    And a GL4Surface object is returned. Of course, this is just the internal workings, and you will only have to call CreateSurface(). The GL4Surface is an extension of the Surface class, the same way the GL4 graphics driver extends the base graphics driver class.
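    To spell the pattern out, here is a simplified sketch; the class and method names are illustrative rather than the engine's exact API:
    //Abstract surface type and its OpenGL 4 implementation
    class Surface
    {
    public:
        virtual ~Surface() {}
    };

    class GL4Surface : public Surface
    {
        //OpenGL 4 specific data (vertex buffer handles, etc.) would live here
    };

    //Abstract driver; each derived driver returns its own surface type
    class GraphicsDriver
    {
    public:
        virtual ~GraphicsDriver() {}
        virtual Surface* CreateSurface() = 0;
    };

    class GL4GraphicsDriver : public GraphicsDriver
    {
    public:
        Surface* CreateSurface() { return new GL4Surface; }
    };

    //Calling code only ever sees the abstract classes:
    //Surface* surface = GetGraphicsDriver()->CreateSurface();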
     
    I would like to get something running on my HTC Evo, but the details of an Android, XBox, or PS3 version aren't too important right now. What is important is to get the C++ core done, in a way that makes it easy to add support for more platforms in the future. So as planned, you'll get a .lib file for C++, a .dll you can use with any language, and multiple Lua scripts can be attached to any entity.
     
    A roadmap of the development plan can be viewed here. I hope to accomplish most of stages 1 and 2 myself, and then recruit additional coders for the last leg. I don't intend to write the networking code myself, either. I figure it will be easier to develop streamed terrain from the beginning, instead of trying to tack it on two years from now, so I will see what I can do about that. I've never seen an infinite streamed world with the density and complexity I want, but Leadwerks seems to do best when we do really aggressive development:
    http://leadwerks.com/werkspace/index.php?/page/roadmap
     
    What I like best about this process is the code I write now is ForeverCode: It's good for any platform, and it will last for the life of the engine, which will be a very long time. I really can't imagine ever writing a version 4, because version 3 is being designed to be perfect.
     
    At this point I would like to thank Roland Stralberg, Mika Heinonen, and Ed Upton for their feedback and wisdom.
  15. Josh
    When I saw specs for the graphics card they are using in the new iMacs, I knew it was time for Leadwerks to come to Mac. Steam for Mac also recently came out, so the time seems right for Mac gaming. Of course, some guy blew up the Roseville Galleria, so the Apple store was closed. I ordered the 27" iMac with a 3.2 ghz dual core processor and upgraded the graphics card to an ATI 5750. Here's what they sent me:

    The computer case/monitor (it's all one piece) is a solid piece of aluminum that doesn't bend or crackle when you pick it up, like most PC equipment. It's very nice to have a solid geometric shape with such stability.
     
    The keyboard looks weird at first, but I got used to typing on the low flat keys very quickly. The mouse has a touch surface instead of a mouse wheel, which works well for scrolling pages, but doesn't work at all for switching weapons in a game. The shape of the mouse is not very ergonomic, and I don't think I will be able to use it for extended periods of time unless I figure out a better way to hold it. The mouse and keyboard are both wireless. In fact, I have a computer with speakers, mouse, keyboard, camera, and internet, and only one cord is coming out of it, for power. It definitely cuts down on space and mess.
     
The monitor is the most beautiful I have ever seen. It's 27 inches with a native resolution of 2560x1440, fine enough that you can't make out individual pixels. The desktop background looks like a photograph; I have a black and white photo in the background and I can see individual grains of sand on a beach. It's really incredible. Unfortunately, the OS hasn't been adapted for a resolution this fine. There is no way to adjust the font size for the whole interface, so if you have the monitor set at its native resolution you will be looking at text that would be about a size 6 or 7 on a 1920x1080 monitor of the same physical size. You can lower the resolution, but then text gets slightly blurred. A few programs like Xcode and Safari let you adjust some of the text sizes, so that is what I am getting by with for now. This giant image is squeezed onto a 27" monitor:

    The speakers embedded in the monitor (that I didn't even know existed) are surprisingly good. I sort of got used to my bass-heavy Altec Lansing speakers with bad shielding and a constant buzz. It's nice to have good clear speakers and no subwoofer to trip over.
     
The ATI 5750 has 720 stream processors (equivalent to about 144 NVIDIA stream processors), but I was apprehensive about performance because it is a mobile card. Source engine games ran great with max settings at the monitor's native resolution of 2560x1440. Remember, this is running in native OSX, not Windows with Boot Camp:

    I wanted to see how a more demanding engine would do, so I installed Windows 7 on a separate partition and ran Crysis. I was surprised to see the game running with max settings at about 18 frames per second, at the monitor's massive resolution. Of course, lowering the screen resolution improved performance. So that was not bad at all.

    I spent some time with the Mac developer tool Xcode. It's about the same as Visual Studio. It uses GCC to compile, which cuts the ten minute build time in Visual Studio down to about a minute. I could get the same results on Windows using Code::Blocks so that's not really a Mac thing, but it's nice to work with. I am pleased to say that Leadwerks Engine 3 is now compiling for Windows and MacOS. The Mac version has no graphics or GUI yet, but it compiles without any errors, and that is a big achievement. Linux should be easy now, since it's a lot more like Mac than Mac is like Windows. I also tried out the iPhone SDK and was pleased to see the iPhone emulator starts up almost instantly. By contrast, the Android SDK took me about 45 minutes to install the first time, and it took ten minutes for the emulator to start.
     
So my conclusion is that I am very pleased with just about every aspect of the Mac. You could reasonably get away with only having an iMac for development, because it allows you to make games for Windows, Mac, Linux, Android, and iPhone. I may make MacOS my main operating system, although I have to keep some other machines around for compatibility testing. It's taking some extra time to get the cross-platform support for Leadwerks Engine 3 in from the beginning, but I think a couple extra weeks spent now will be well worth it in the long run. Soon enough the code will be platform-agnostic, meaning I can just work with my own code, instead of trying to figure out somebody else's designs.
     
    I picked up Left 4 Dead 1 and 2 last week in a Steam Halloween sale. I'm starting with the first one, and I absolutely love it. I've always liked co-op play since Duke Nukem, and there hasn't been nearly enough of it available. I also love zombie movies, so Left 4 Dead is right up my alley.


  16. Josh
So after a lot of learning and mistakes, I finally have Leadwerks Engine 3 running on OSX Snow Leopard, Lion, and iOS. Rather than write out a detailed blog, I'll just throw a lot of random thoughts your way:
     
-OpenGLES2 is a nice blend of OpenGL 2 and OpenGL 3.3. I'm really surprised how easily my OpenGL 3 shaders translate into OpenGLES2 shader code, with only a few changes. In fact, the iOS renderer looks exactly like the deferred OpenGL 3 renderer, except for shadows and some post-effects. The games I have seen running on the iPhone are severely underpowered compared to what they could look like. This little device has quite a lot of power, and you can be sure I will find a way to max out the visuals.
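To give a rough idea of how small the differences are, one way to share a single shader source between the two targets is to prepend a target-specific header when the shader is loaded. This is just my own minimal sketch of that approach, not the engine's actual code:

#include <string>

std::string BuildShaderSource(const std::string& body, bool gles2)
{
    // GLSL ES 1.00 requires a default float precision in fragment shaders,
    // while desktop GLSL 3.30 wants the #version directive up front.
    std::string header = gles2 ? "precision mediump float;\n" : "#version 330\n";
    // The handful of remaining differences (attribute/varying vs. in/out,
    // texture2D() vs. texture(), gl_FragColor vs. a user-declared output)
    // are small enough to handle with a few edits or #defines in the body.
    return header + body;
}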
     

     
-iOS uses its own texture compression format. This sucks, because it means either you are using uncompressed textures for cross-platform compatibility, or you have to have separate textures for your iOS build. The OpenGL spec really should have a defined compressed format, but I guess there was some patent issue with DXTC compression, or something like that.
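If you do keep separate texture sets, a small runtime check is enough to decide which flavor to load. Here is a minimal sketch (my own illustration, not engine code), using the extension string the driver reports via glGetString(GL_EXTENSIONS):

#include <cstring>
#include <string>

static bool HasExtension(const char* extensions, const char* name)
{
    return extensions != 0 && std::strstr(extensions, name) != 0;
}

std::string ChooseTextureFormat(const char* extensions)
{
    if (HasExtension(extensions, "GL_IMG_texture_compression_pvrtc")) return "PVRTC"; // iOS devices
    if (HasExtension(extensions, "GL_EXT_texture_compression_s3tc")) return "DXTC";   // desktop GPUs
    if (HasExtension(extensions, "GL_OES_compressed_ETC1_RGB8_texture")) return "ETC1"; // common on Android
    return "RGBA8"; // uncompressed fallback that works everywhere
}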
     
    -Lua and my other cross-platform libraries seem to work just fine with iOS, which is fantastic. I really had no idea when I started this project whether it would really work like I hoped or not, but everything looks good.
     
    -The iOS port was probably the hardest one, due to the use of Objective-C. Android should be pretty easy after this. The PS3 is another platform I am really interested in, and my guess is LE3 could be ported to PS3 in a few days or less.
     
    -OSX Lion has some very appealing characteristics related to Leadwerks Engine 3. I'm under NDA and can't say exactly what they are, but it's very good for Mac and graphics. BTW, the gestures they are adding to OSX Lion are really fantastic, and reason enough to get the upgrade.
     

     
    There's still a ton of work to do before I have an actual product ready to release, but the plan is working and we're on track for a fantastic 3D development system for a wide variety of platforms.
  17. Josh
So after about three weeks of pain and frustration, I have successfully calculated my first path using Recast navigation. This has been a new experience for me. I've integrated half a dozen low-level C++ libraries and never had any serious trouble, but Recast Navigation is something else.
     
    The technology underlying Recast is impressive. First they take triangle geometry, convert it to voxels, then calculate navigation, and convert it back into rough polygons. You can read about the process in more detail here:
    https://sites.google.com/site/recastnavigation/MikkoMononen_RecastSlides.pdf
     
The results seem pretty reliable, and although regenerating a tile is slow, it can be done on a separate thread and doesn't have to be updated all that often. I have two criticisms of the library.
     
    First, the code is a total mess. I can't complain too much since it's a free library, and I definitely appreciate Mikko putting it out there, but integrating it into the engine was a hellish process. I wouldn't even call Recast a code library so much as it is an open-source program. The amount of work it took to figure out what was going on was far beyond any library I have worked with, and it could have been wrapped up into a set of much simpler commands (which is what my end result was).
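To give a sense of scale, even the runtime query side, which is the easy half, takes a fair amount of setup. Here is roughly what a single path query against Detour looks like once a navmesh already exists; this is my own stripped-down sketch with no error checking, not the engine's wrapper:

#include "DetourNavMesh.h"
#include "DetourNavMeshQuery.h"

// Writes up to 256 waypoints (x,y,z triples) into 'points' and returns the count.
int QueryPath(const dtNavMesh* navmesh, const float* start, const float* end, float* points)
{
    dtNavMeshQuery* query = dtAllocNavMeshQuery();
    query->init(navmesh, 2048);

    dtQueryFilter filter;
    const float extents[3] = {2.0f, 4.0f, 2.0f}; // search box around each endpoint

    dtPolyRef startRef = 0, endRef = 0;
    float nearestStart[3], nearestEnd[3];
    query->findNearestPoly(start, extents, &filter, &startRef, nearestStart);
    query->findNearestPoly(end, extents, &filter, &endRef, nearestEnd);

    dtPolyRef polys[256];
    int polyCount = 0;
    query->findPath(startRef, endRef, nearestStart, nearestEnd, &filter, polys, &polyCount, 256);

    unsigned char flags[256];
    dtPolyRef refs[256];
    int pointCount = 0;
    query->findStraightPath(nearestStart, nearestEnd, polys, polyCount, points, flags, refs, &pointCount, 256);

    dtFreeNavMeshQuery(query);
    return pointCount;
}

The navmesh construction side is far more verbose than this, which is why wrapping it all behind a couple of engine commands was worth the effort.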
     
Second, I believe converting polygons to voxels and back is the technically wrong way to go about this. Constructive solid geometry is perfect for navmeshes, autogenerated or otherwise. It would be very fast to generate, it could be dynamically updated quickly, and the results would be accurate virtually all the time. However, it would require all navigation geometry to be modeled with CSG, and although I would be fine with that, I know others will want to use arbitrary polygonal geometry. So that's one thing I would do differently if I were making an engine just for myself. That's an example of why you need to be flexible with your goals sometimes.
     
Still, it's great to be able to build navigation data for any scene and make a character walk around in it without any waypoints or ray casts. Below is my very first result with Recast. There are some obviously strange results with the path, but the point is that it's working, to some extent. It's a minor miracle I am able to plot any kind of path at all. The hard part is done, and I'm sure I'll get any remaining issues ironed out pretty quickly:

     
Building AI features right into the engine will give us better integration than using a third-party add-on, or trying to build it on top of the engine. For example, we can use physics geometry instead of visual polygons, and make the engine automatically invalidate and update sections of a map when objects move. This will give you powerful AI features that work perfectly with the rest of the engine, so you can control characters with just a few lines of script code. When this is done, it's reasonable to expect to be able to program something like Left 4 Dead in just a couple hundred lines of script (or less, if you just feel like dragging some premade scripts around and attaching them to entities).
     
    Once the pathfinding is all tidied up, it will be a big moment, because it means at that point I will know everything I need to finish Leadwerks3D! When setting out with Leadwerks3D, the things I hadn't done before were what I was most worried about. I attacked the unknown issues first. Now that we are coming to the end of this long research and development phase, all that remains is a lot of hard work. It's all stuff I have done before, so I expect the remaining tasks to go pretty quickly. It's also pretty awesome to have a clear picture of how this massive piece of technology all works...four platforms, with rendering technology stretching from fixed-function graphics all the way to Shader Model 5 hardware tessellation...plus the easiest art pipeline ever and a script and flowgraph system that no one has ever done. The scope of this engine is just so much bigger than anything I have ever done, and it actually works! B)
  18. Josh
    I have the basis of CSG editing written into the engine. It was a bit challenging to figure out just how to structure the Brush class. We have a lightweight ConvexHull entity, which is used for the camera frustum. This is an invisible volume that changes frequently, and is used to test whether objects intersect the camera's field of view. I didn't want to bloat this class with a lot of renderable surfaces, texture coordinates, vertex normals, etc. I initially thought I would derive the Brush class from the ConvexHull class, and make brushes a more complex renderable "thing" in the world.
     
I didn't want to create multiple rendering pathways in the octree system, so it made sense to derive the Brush class from the Entity class, letting all the entity rendering pathways be used to tell whether a brush should be drawn. However, both the Entity and ConvexHull classes are derived from the base class in Leadwerks3D, the Object class. This creates the diamond inheritance problem, where there are two Objects a Brush is derived from:

     
    In the end, I derived the Brush class from the Entity class, and just gave it a ConvexHull for a member. This meets the needs of the system and is simple:

    class Brush : public Entity { ConvexHull* convexhull; ... };
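Side by side, the two options look like this (condensed; these aren't the engine's actual declarations):

class Object {};
class Entity : public Object {};
class ConvexHull : public Object {};

// Rejected: inheriting from both would put two separate Objects inside every Brush.
// class Brush : public Entity, public ConvexHull {};

// Chosen: inherit the entity rendering pathways, and hold the hull as a member.
class Brush : public Entity
{
public:
    ConvexHull* convexhull;
};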
     
    So with that out of the way, it's on to editing. There are three types of objects you can create in the Leadwerks3D editor. You can drag a model into the scene from the asset browser. You can create and modify brushes like in 3D World Studio. You can also create special entities like lights, particle emitters, sound emitters, and other items.
     
I'm ready to start CSG editing, but the thing that strikes me right away is I have no idea how object creation should work. You can drag models into the scene right now, from the Asset Browser. However, CSG brushes work better when they are drawn out, like architectural drafting:

     
    For "point entities" 3D World Studio lets you position a crosshair, then hit enter to create the current object:

     
    I'm not sure how we should handle brush and model editing in here. Usually when I run into a problem and am not sure how to proceed, I struggle with it a few days, but when I'm done, I am certain the approach I chose was best. Here's another one of those situations. Feel free to discuss and explore ideas with me in the comments below. Here's what the editor looks like at the moment:

     
    --Edit--
     
    Here's something I am playing with. The "Objects" menu contains a lot of brushes and special entities you can create:

     
    This idea actually goes back to the old Quake 3 Arena editor, Radiant (which was otherwise a pretty difficult tool to use!):

     
You would still be able to drag a model into the scene from the Asset Browser at any time, but the brush and entity creation would be more like 3D World Studio. Of course, we also want the ability to move, rotate, and scale things in the 3D viewport! I generally don't like having multiple ways to do a thing, but the editing choices here make good sense. I'll keep playing with ideas and see what I come up with. Feel free to chime in, in the comments below.
  19. Josh
The Leadwerks community project "Midnight Salsa" was one of my favorite things that came out of Leadwerks 2. Watching the videos of everyone collaborating across the globe was really interesting, and with Aggror's leadership they did pull off what they set out to do: make a playable zombie FPS. However, there were some weak points that showed me where my effort would be most useful.
     
    When you go through the first set of doors, a carefully placed light above the door shines to illuminate a sitting zombie as the doors open. I noticed right away the animation transitions weren't completely smooth. Although our animation system makes it very easy to transition between frames and do all sorts of interesting things, the logic that manages multiple animation sequences blending in and out can get a little messy. I approached this as a general problem and wrote an animation manager script in Lua that handles a "stack" of animations. When you set a new animation sequence, it is added to the top of the stack and blended in over a specified amount of time. Once it is completely blended in, all animations beneath it in the stack are removed. This process can handle any number of transitions smoothly, so you can just always set the latest one and there will never be more than two or three sequences active in the stack.
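The real thing is a Lua script, but the core of the idea boils down to something like this C++-style sketch (names are mine, not the engine's):

#include <vector>
#include <algorithm>

struct AnimationLayer
{
    int sequence;      // which animation sequence this layer plays
    float startTime;   // when the layer was pushed onto the stack
    float blendLength; // how long the blend-in takes
    float weight;      // 0..1, recomputed every frame
};

class AnimationStack
{
public:
    void SetSequence(int sequence, float time, float blendLength)
    {
        AnimationLayer layer = {sequence, time, blendLength, 0.0f};
        layers.push_back(layer); // new sequences always go on top
    }

    void Update(float time)
    {
        for (size_t i = 0; i < layers.size(); ++i)
        {
            float t = layers[i].blendLength > 0.0f ? (time - layers[i].startTime) / layers[i].blendLength : 1.0f;
            layers[i].weight = std::min(std::max(t, 0.0f), 1.0f);
        }
        // Once the top layer is fully blended in, nothing beneath it contributes anymore,
        // so the older layers can be discarded.
        if (!layers.empty() && layers.back().weight >= 1.0f)
        {
            layers.erase(layers.begin(), layers.end() - 1);
        }
        // Each remaining layer would then be applied to the skeleton, weighted by layer.weight.
    }

private:
    std::vector<AnimationLayer> layers;
};

Because stale layers are culled as soon as the newest one reaches full weight, the stack stays down to the two or three sequences that are actually blending at any moment.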
     
The second problem that affects a lot of people is AI. How do you get characters to go where you want them to go? My advice in the past was to implement a system of connected nodes, as I did in my game "Metal". I finally solved this problem once and for all with a relatively recent technique called navmesh pathfinding. This provides more accurate pathfinding that isn't limited to node points, and it's completely dynamic. If the level changes, the pathfinding adjusts to it. This level of integration is only possible because Leadwerks 3 was built with pathfinding baked into the core, which makes it very easy to use. Implementing this system was a huge investment of time and effort, and there was no guarantee it would succeed, so it probably wasn't feasible to add on top of Leadwerks 2. In retrospect, I really should have built node-based pathfinding into Leadwerks 2 so we could all have a common foundation for AI.
     
In one of the project videos, the project leader mentions having to send the level model back to the artist for resizing. This causes a delay in progress while everyone waits for the new model. Our new CSG level design tools eliminate this problem because the map can be immediately adjusted right in the editor, without any waiting around or reimporting. Even if the artist wanted to take their time to make it perfect, a quick modification could be made to give the other team members something to test with. When combined with the built-in navmesh pathfinding, you can visualize the navigable areas as you build the level. There's never any guessing or going back to redo things because your enemies don't have room to get through a hallway or something.
     
Finally, there is the matter of game object interaction. Sometimes you have special interactions in a map, and abstract classes aren't always the best way to manage that. In the past, gameplay mechanics like doors and switches were exclusively the domain of expert programmers. Although Leadwerks 2 had a built-in messaging system, too much was left undefined. Consequently, even the programmers who were able to implement advanced interactions each isolated themselves from the community of developers, because they were branching the engine. Our new flowgraph system provides a shared framework we can all utilize and share code within. This means reusable scripts can be written that can be used in many games, by many developers, and our scripts can "talk" to each other through the flow graph. In the map below I've set up a collision trigger (shown in grey) and attached it to two doors (shown in green). When the player walks into the collision trigger, the doors open. This simple example shows how anybody can set up advanced game interactions in their map.

     
    Our primary focus for Leadwerks 3 is delivering gameplay. I think all of these features will facilitate collaboration and give the community some much needed high-level frameworks. Not only does it make building games faster, it gives us all a common framework to work together within.
  20. Josh
    Back when Leadwerks 2 was first launched, it was just a BlitzMax programming SDK, intended to be a modern replacement for Blitz3D. It turned out Blitz users generally did not want it, and instead the audience it attracted was C++ programmers who like graphics. A visual editor was introduced and it was made as good as possible for a product that started with the intention of just being a programming SDK.
     
    Leadwerks 3.0 was launched with a strong emphasis on the tools that Leadwerks 2 lacked, and mobile. It was pretty clear that the main target was mobile. Once again though, the intended audience and the audience that came to me were two different things. There's still a lot of C++ programmers on Windows, but the appeal has broadened to include a lot of modders, artists, and beginners who want an easy to use editor, and of course Linux users. My favorite aspect of Leadwerks 3 is that it serves both advanced programmers and total beginners, with no compromises in the design. (Big win!)

    Leadwerks for Mobile
    My mobile offering, on the other hand, has not done well. There are two problems: 
    I estimate that mobile has accounted for around 80% of the time I spent debugging things in Leadwerks 3.0, yet it accounts for less than 10% of revenue. (It wasn't 80% of the problems, but because the whole process of developing on mobile is painfully slow, it takes more time to do simple things.) Sending an application from the computer to a device can take several minutes, the debuggers are slow, and the whole process is much less transparent than PC development. It's not terribly challenging in an intellectual sense, but it eats a lot of time. For example, it took about two weeks to implement our terrain system, but it took an extra week after that just to get it working on iOS and Android. That week could have been spent adding a different feature, but instead I was messing around with various tablets just trying to make them work.
     
The other problem is that there's a big disparity between mobile and PC graphics. I saw a huge upswing in interest when I started showing shots of the OpenGL 4.0 renderer. We have moved beyond just being a graphics engine, but I know the renderer is a big part of the appeal of Leadwerks. On mobile, the hardware is comparatively limited, so it's much harder for me to do anything that makes Leadwerks mobile stand out.
     
    If I could just hire one engineer dedicated to mobile support, the first problem would be solved, because it wouldn't cut into my time. However, mobile has accounted for less than 10% of revenue in the last year. I didn't even break even on my development costs. PC sales have been consistently strong, but mobile doesn't even make enough to pay for its own maintenance. So what's been happening is the PC side of the business is subsidizing the mobile side.
     
    I expect the second problem will be solved within a couple of years. Nvidia's Tegra 4 chips can supposedly run regular OpenGL 4 (not ES). When those become the norm, it could give us total convergence of graphics between the PC and mobile. At that point, I might actually have a mobile product that stands out and provides something different.

    Leadwerks Tomorrow
    Now, as I am shipping the Kickstarter rewards and about to launch 3.1 proper, I have to think about where I want Leadwerks to be in 6 months, and in 12 months. There are three main areas I want to move forward in: 
    Leadwerks Editor
    Adding new tools to make life easier.
Refining the workflow to eliminate any remaining "sticky" points.
    Bug fixes, to make it a super slick and polished experience.

     
    The Engine
    New features in graphics, physics, networking, etc.
    Performance optimization.
    Improved compatibility across all OpenGL 4 hardware.

     
    Third-Party Technologies
    Blender, SteamOS, Steam Workshop, virtual reality, etc.

     
    Leadwerks on the PC is in a very good position today. Virtually every computer being sold today will run Leadwerks. We came into desktop Linux at just the right time, and I am very optimistic about that space. SteamOS is opening up the living room for the first time to us lowly indies. Think about where we are now, and where this community can be a year from now...there's a ton of opportunity for all of us, but I need to focus on where we're winning. If I continue to subsidize our mobile efforts, it's going to slow down development and result in a buggier product with fewer features. So I think it's pretty clear mobile should be discontinued as of version 3.1, in favor of a stronger PC-centric product.
     
It sucks because I lost a lot of money on mobile, and the people most willing to take a chance on a new venture also lost out. But at the same time, if I had just stayed still on Windows and continued along the Leadwerks 2 path, none of the great things we have going for us now would have happened. It may make sense to pick mobile up again, when the hardware can provide enough power, and when there is more interest from the community. So far the response has been pretty much what I expected: a handful of (understandably) disappointed people, but most of our community was never really into mobile, and the new growth is coming from other parts of the PC ecosystem, not mobile.
     
    Support for the OUYA was added rather thoughtlessly as a stretch goal for the Kickstarter campaign. Given the OUYA's failure, it should obviously be scrapped and replaced with SteamOS support. Since Android was a part of that, I am giving Kickstarter backers the ability to opt out of the campaign instead of receiving the final rewards. Kickstarter was an overall good experience, but I probably won't ever do another one, unless the project is really well-defined and doesn't have a lot of unknown parts.
     
    Desktop Linux, VR, and SteamOS are on the upswing, and all of these play to the strengths of Leadwerks. If 2013 was the year of finding our place, I think 2014 is going to be a year of reaching out to the gaming world and making some really astonishing developments.
  21. Josh
    Based on a little observation, I've made the following changes to the publish routine on the beta branch:
    The FPS Player script has been updated so that the footstep sound files are explicitly named, so they don't get missed in the publish step. You will need to update your project to get the new script (or just copy FPSPlayer.lua to your project).
    There is now a checkbox to "Only include used files". If this is checked, then the new packing routine will be used, otherwise the old behavior of including all files is used.
    The file extension filter will now be used so you can include files like XML, TXT, or whatever else you want.
    The routine has been fixed to pick up terrain textures and include those.
    A few people suggested a "tagging" system so the user can mark files to be included or not. This is a very bad idea. The whole point of the selective file inclusion system is that the user doesn't want to maintain a clean directory. So adding a manual system to overcome the limitations of the automatic system...it would be easier just to delete the unwanted files. When your feature requests lead to more feature requests, to fix the problems of the previous feature, that is a strong sign of bad design. So we won't be going down that path.
    If you are using the selective file inclusion and you want to force a file to be included that would otherwise be impossible to detect, an easy way to do this is to just add a commented-out line in your main script that names the file:
--myfile.tex

Anyways, the new build is available now on the beta branch. Let me know how you like it.
  22. Josh
    Here's what's next for Leadwerks Game Engine.
     
    Game Launcher
    Leadwerks Game Launcher will be released in early preview mode when 50 games are reached. Right now we have 13. Filling the content up is a high priority, and the Summer Game Tournament will bring us closer to that goal.
     
    I am mentioning this first because I want to emphasize that the number of games posted is my measure of how well I'm doing. Every action should be considered in terms of how it helps you guys publish more games.
     
    Usability Improvements
Since the learning process for Leadwerks is now very well defined, we're able to identify and improve minor hiccups in the workflow. I'm getting a lot of great feedback here. The tutorials have helped by giving us a common reference point we can use to communicate how things should be done, and what can be improved.
     
Documentation will continue to be improved as we gain more experience. I am holding off on recording any videos until we get the material finalized.
     
    Here are some of the things I want to modify or improve:
    New icon set
    Improve 3D control widget
    Use name in script entity field instead of drag+drop
    World:FindEntity() Lua command
    Revert material editor texture list to old style
    Implement web browser in Linux GUI for welcome page and Workshop.
    Add built-in help window for better documentation layout (will still pull data from online pages)
    Improve project manager and new project window, using web interface.
    Research is being performed on improvements to the script editor.

     
    Game Templates
    The 2D and 3D platformer templates are the easiest to make and are good examples for learning. These will likely come next. A racing game template is fun and simple, and may come after that.
     
    DLC
    A soldier pack would add a lot of new options to gameplay. We saw with the zombie pack that many people were able to use these items in a lot of different ways, and it added a lot of depth to gameplay. I think a soldier pack with characters that can fight against or with the player, or against zombies, each other, etc. would be a great addition.
     
    New Features
    I make decisions on new features based on the following question: What will result in more games? The things I think are important are a vegetation system like Leadwerks 2 had (but better), decals, and some built-in GUI commands. It is likely that the vegetation system will be the next big feature.
     
    Built-in Model Store
We have had some setbacks on this, but we're researching another way to make it happen.
     
    That's quite a lot of different stuff.
     
    As always, any of these items are subject to change.
  23. Josh
    An update is available on the beta branch with the following changes:
    Implemented "CollisionMesh"-named limbs in models.
    Added "Strip vertex colors" option in model editor.
    Workshop items can now be submitted as free or curated paid items, although no mechanism is in place yet to actually purchase items.

     

     
    Curated items appear in the Workshop and can be voted for, but cannot be downloaded.
     

     
    You can enter your bank information for receiving payments, and you can even create a revenue split between multiple people.
     

     
    Although purchasing items is not yet supported, this is a first step that allows you to prepare for when it is implemented. Any paid items you submit will be viewable, but cannot be used or downloaded by other users yet.
  24. Josh
    I realized there are two main ways a plugin is going to be written, either as a Lua script or as a DLL. So I started experimenting with making a JSON file that holds the plugin info and tells the engine where to load it from:
    { "plugin": { "title": "Game Analytics", "description": "Add analytics to your game. Visit www.gameanalytics.com to create your free account.", "author": "© Leadwerks Software. All Rights Reserved.", "url": "https://www.turboengine.com", "library": "GameAnalytics.dll" } } { "plugin": { "title": "Flip Normals", "description": "Reverse faces of model in Model Editor.", "author": "© Leadwerks Software. All Rights Reserved.", "url": "https://www.turboengine.com", "scriptfile": "FlipNormals.lua" } } I like this because the plugin info can be loaded and displayed in the editor without actually loading the plugin.
    I also wanted to try using a JSON file to control script properties. For example, this file "SlidingDoor.json" goes in the same folder as the script and contains all the properties the engine will create when the script is attached to an entity:
    { "script": { "properties": { "enabled": { "label": "Enabled", "type": "boolean", "value": true, "description": "If disabled the door will not move until it is enabled." }, "distance": { "label": "Distance", "type": "float", "value": [1,0,0], "description": "Distance the door should move, in global space." }, "opensound": { "label": "Open Sound", "type": "sound", "value": null, "description": "Sound to play when door opens." }, "closedelay": { "label": "Close Delay", "type": "integer", "value": 2000, "minvalue": 0, "description": "The number of milliseconds a door will stay open before closing again. Set this to 0 to leave open." } } } } I like that it is absolutely explicit, and it allows for more information than the comments allow in Leadwerks 4. There is actually official tools for validating the data. The json data types map very closely to Lua. However, it is more typing than just quickly writing a line of Lua code.
    While we're at it, let's take a look at what a JSON-based scene file format might look like:
    { "scene": { "entities": [ { "type": "Model", "name": "main door", "id": "1c631222-0ec1-11e9-ab14-d663bd873d93", "path": "Models/Doors/door01.gltf", "position": [0,0,0], "euler": [90,0,0], "rotation": [0,0,0,1], "scale": [1,1,1], "matrix": [[1,0,0,0],[0,1,0,0],[0,0,1,0],[0,0,0,1]], "mass": 10, "color": [1,1,1,1], "static": false, "scripts": [ { "path": "Scripts/Objects/Doors/SlidingDoor.lua", "properties": { "distance": [1,0,0], "movespeed": 5 } }, { "path": "Scripts/Objects/Effects/Pulse.lua" } ] } ] } }  
  25. Josh

This is an update on the progress of our voxel raytracing system. VXRT is designed to provide all the reflection information that PBR materials use. If a picture is worth a thousand words, then this counts as a 5,000-word article.
    Direct lighting:

    Global illumination:

    Specular reflection:

    Skybox component:

    Final combined image:
