Blog Entries posted by Josh

  1. Josh
    Leadwerks Game Engine 4.4 features an upgrade to the latest version of Newton Dynamics, along with a bunch of new features to enhance physics.
    Kinematic Controller
    The new kinematic controller is a joint that lets you specify a position, rotation (Euler or quaternion), or a 4x4 matrix to orient the body to.  You can set the maximum linear and angular force the joint may use to orient the entity.  This allows you to create a kinematic controller that only affects position, only affects rotation, or one that controls both at once.  In the video below I am using a kinematic controller to create a simple IK system with two hinge joints.  The end effector is controlled by the mouse position, while the base entity stays in place, since it has zero (infinite) mass:
The kinematic controller provides much more stable collisions than the Entity PhysicsSetPosition() and PhysicsSetRotation() commands, and should be used in place of these.  In fact, these commands will be removed from the documentation and should not be used anymore, although they will be left in the engine to ensure your code continues to work.  The FPS player script will be updated to use a kinematic controller for objects you are holding, which will eliminate the energetic collisions the script currently produces if you pick up a crate and push it into the wall.
    The new joint commands are as follows:
static Joint* Kinematic(Entity* entity, const Vec3& position);
virtual void SetTargetMatrix(const Mat4& mat);
virtual void SetTargetPosition(const float x, const float y, const float z, const float blend = 0.5);
virtual void SetTargetPosition(const Vec3& pos, const float blend = 0.5);
virtual void SetTargetRotation(const float pitch, const float yaw, const float roll, const float blend = 0.5);
virtual void SetTargetRotation(const Vec3& rotation, const float blend = 0.5);
virtual void SetTargetRotation(const Quat& rotation, const float blend = 0.5);
For improved consistency in the API, the joint SetAngle function will be renamed SetTargetAngle, but a copy of the old command will remain in the engine:
virtual void SetTargetAngle(const float angle);
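As a quick illustration, here is how these commands could fit together in a minimal program. Only the Joint::Kinematic and SetTargetPosition calls come from the list above; the rest is ordinary Leadwerks 4 boilerplate, and the circular-motion target is just an invented example:
//Minimal sketch: drive a box around with the new kinematic controller.
#include "Leadwerks.h"
using namespace Leadwerks;

int main()
{
    Window* window = Window::Create();
    Context* context = Context::Create(window);
    World* world = World::Create();
    Camera* camera = Camera::Create();
    camera->Move(0, 2, -6);

    Model* box = Model::Box();
    box->SetPosition(0, 2, 0);
    box->SetMass(1);

    //Attach the kinematic controller at the body's current position
    Joint* controller = Joint::Kinematic(box, box->GetPosition());

    float angle = 0;
    while (!window->Closed())
    {
        //Move the target in a circle; the joint applies force to follow it.
        //The blend value (0.5) controls how stiffly the body tracks the target.
        angle += 1;
        controller->SetTargetPosition(Vec3(Math::Cos(angle) * 2, 2, Math::Sin(angle) * 2), 0.5);

        world->Update();
        world->Render();
        context->Sync();
    }
    return 0;
}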
Joint Friction
Hinge joints can now accept a friction value to make them more resistant to swinging around.  I used this in the example below to make the joints less "loose", while a kinematic controller positions the green box:
    New Vehicle Model
    Newton 3.14 features a new vehicle model with a realistic simulation of a slip differential.  Power is adjusted to each wheel according to the resistance on each tire.

    Watch closely as the wheels below act just like a real car does when its tires slip:
The realistic vehicle model gives vehicles a much more visceral and fun feeling.  The new vehicle also uses actual bodies for the tires, instead of convex raycasts, so the sudden bouncing the old vehicles could exhibit if the chassis didn't encompass the tires is eliminated.
    Springs
Slider and hinge joints now have optional spring behavior you can enable with one command.  Use this to make your own custom suspension system, or anything else you need.
void SetSpring(const float spring)
These changes will be available next week on the beta branch on Steam.
  2. Josh
    An update is available on the beta branch which adds the new vegetation system. You can open the attached map in the editor to see the system in action. Be sure to create a new project, or update your existing one, in order to get the new shaders and models.
    vegetationsample.zip
     
    To get this update early you must opt in to the beta branch on Steam.
     
The new vegetation system is special because it does not use persistent objects stored in memory. Rendering and physics are executed according to a distribution algorithm that dynamically generates instances each frame. This allows the system to manage an infinite number of instances, with much denser placement and faster performance than a conventional vegetation system would allow (such as what Leadwerks 2 used).
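To make the idea concrete, here is a minimal sketch of how a hash-based distribution can regenerate identical instances every frame without storing any of them. All the names here are illustrative, not the engine's actual internals:
//Sketch: deterministic instance generation per terrain cell.
//Because position and rotation derive from a hash of the cell coordinates,
//no per-instance data is ever stored; instances near the camera are
//regenerated identically every frame.
#include <cstdint>
#include <vector>

struct Instance { float x, y, z, rotation; };

//Simple integer hash; any good mixing function works here
uint32_t Hash(uint32_t x, uint32_t y, uint32_t seed)
{
    uint32_t h = x * 374761393u + y * 668265263u + seed * 2246822519u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return h ^ (h >> 16);
}

//Generate the instances for one cell of a vegetation layer
std::vector<Instance> GenerateCell(int cx, int cz, float cellsize, float density, uint32_t seed)
{
    std::vector<Instance> instances;
    int count = int(cellsize * cellsize * density);
    for (int n = 0; n < count; ++n)
    {
        uint32_t h = Hash(cx * 7919 + n, cz * 4791 + n, seed);
        float fx = (h & 0xffff) / 65535.0f;          //0..1 within the cell
        float fz = ((h >> 16) & 0xffff) / 65535.0f;
        Instance inst;
        inst.x = (cx + fx) * cellsize;
        inst.z = (cz + fz) * cellsize;
        inst.y = 0; //would be sampled from the terrain heightmap
        inst.rotation = float(Hash(h, n, seed) % 360);
        instances.push_back(inst);
    }
    return instances;
}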
     

     
    You can select the vegetation tool in the terrain tools dropdown box. Usage is pretty self-explanatory.
     
Models that are to be used as vegetation layers must have a material with a vegetation shader applied. There are two new slots in the shaders list in the material editor, for the vegetation and vegetation shadow shaders. These can be selected from the "Shaders\Vegetation" folder.
     
    Models with a default physics shape and collision enabled will be collideable.
     
A new "Pick mode" setting has been added in the material editor. This allows you to disable picking on some materials (like leaves and grass), so bullets can pass through objects that should not stop them.
     
    At this time, vegetation will only appear when texture+lighting render mode is in use.
     
The default models have been generously donated by Pure3D.de.
  3. Josh
One thing I love about constructive solid geometry modeling is that texture mapping is sooooo much simpler than in 3ds Max. Most of the time the automatic texture mapping works fine, and when you do need to adjust texture mapping by hand, CSG texture mapping tools are still much easier. The justify buttons line a texture up along a face, or a group of faces.
     
    Although using these tools is fun and easy, programming them is another matter. I dreaded the implementation of the texture justify buttons, but it wasn't that hard. It doesn't account for rotation and scale yet, but a basic implementation turned out to be surprisingly easy. I guess after writing 3D World Studio five times, I am starting to get the hang of this:

     
    Smooth groups are working as well. Leadwerks3D will be the first CSG editor ever (AFAIK) to support real-time smooth groups. In the example below, we have a beveled corner we want to make rounded:

     
    To round the corner, select all the faces you want smoothed together, and press one of the smoothing group buttons. This will assign the selected faces to use the smoothing group you press. You can add multiple smooth groups to a face. Faces that have one or more smooth groups in common will be smoothed together:
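Under the hood, smoothing groups are typically stored as a bitmask on each face, and a vertex normal is averaged across the adjacent faces that share at least one group bit. Here is a minimal sketch of that idea (not the editor's actual code):
//Sketch: computing a vertex normal from face smoothing-group bitmasks.
//Faces sharing at least one smoothing group bit are averaged together.
#include <cmath>
#include <vector>

struct Vec3
{
    float x, y, z;
    Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
};

struct Face
{
    Vec3 normal;
    unsigned int smoothgroups; //one bit per smoothing group
};

//Average the normals of all faces around a vertex that share a group
//with the face the vertex belongs to
Vec3 SmoothedNormal(const Face& face, const std::vector<Face>& adjacentfaces)
{
    Vec3 n = face.normal;
    for (const Face& other : adjacentfaces)
    {
        if (&other == &face) continue;
        if (face.smoothgroups & other.smoothgroups) n = n + other.normal;
    }
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    if (len > 0) { n.x /= len; n.y /= len; n.z /= len; }
    return n;
}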

     
    Here is the resulting rounded corner:

     
    Finally, I am starting to narrow down the visual theme for the new editor. Since we use the standard Windows and Mac interfaces on each operating system, it makes sense to use the standard Windows Office/VS icons whenever possible. One side benefit is they also look great on Mac, strangely enough:

     

     

     

     

    That's all for now. I've had a lot of fun designing the "perfect" workflow for game development, and can't wait to show you all the finished product this summer!
  4. Josh
I'm working to move our Workshop implementation over to the newer SteamUGC API. Some things were recently updated, and that is still being sorted out. I'm also finishing up the game player.
     
    The first incarnation of Leadwerks Game Launcher was pretty utilitarian (and ugly):

     
I realized this was pretty drab for a product aimed at consumers, so I designed a more outlandish, colorful interface that was purely web-based:

     
    This one looks nice, but you can tell it will start to feel dated quickly. I'm also concerned about the user's perception of this product. It is meant to be a software application, not a thinly wrapped web page. I'm also sensitive to the danger of this product being associated with garbage like this, enough so that I decided to get rid of the "Game Player" name and call it something else. The name "Game Launcher" is more closely associated with initialization interfaces like the one used in Amnesia. It's just more of an acceptable PC gaming thing:

     
    My design also took up a lot of space and looked a lot like the iTunes App Store. It is not my goal to build a redundant marketplace within Steam, and that is what this looks like.
     
    So with that in mind, I came up with a third design that makes better use of the GUI and minimizes the web elements:
     

     

     
The buttons in the lower left-hand corner are Unicode characters, interestingly.
  5. Josh
So after a lot of learning and mistakes, I finally have Leadwerks Engine 3 running on OSX Snow Leopard, Lion, and iOS. Rather than write out a detailed blog, I'll just throw a lot of random thoughts your way:
     
-OpenGLES2 is a nice blend of OpenGL 2 and OpenGL 3.3. I'm really surprised my OpenGL 3 shaders translate easily into OpenGLES2 shader code, with only a few changes. In fact, the iOS renderer looks exactly like the deferred OpenGL 3 renderer, except for shadows and some post-effects. The games I have seen running on the iPhone are severely underpowered compared to what they could look like. This little device has quite a lot of power, and you can be sure I will find a way to max out the visuals.
     

     
-iOS uses its own texture compression format (PVRTC). This sucks, because it means either you are using uncompressed textures for cross-platform compatibility, or you have to have separate textures for your iOS build. The OpenGL spec really should have a defined compressed format, but I guess there was some patent issue with DXTC compression, or something like that.
     
    -Lua and my other cross-platform libraries seem to work just fine with iOS, which is fantastic. I really had no idea when I started this project whether it would really work like I hoped or not, but everything looks good.
     
    -The iOS port was probably the hardest one, due to the use of Objective-C. Android should be pretty easy after this. The PS3 is another platform I am really interested in, and my guess is LE3 could be ported to PS3 in a few days or less.
     
    -OSX Lion has some very appealing characteristics related to Leadwerks Engine 3. I'm under NDA and can't say exactly what they are, but it's very good for Mac and graphics. BTW, the gestures they are adding to OSX Lion are really fantastic, and reason enough to get the upgrade.
     

     
    There's still a ton of work to do before I have an actual product ready to release, but the plan is working and we're on track for a fantastic 3D development system for a wide variety of platforms.
  6. Josh
It is now possible to compile shaders into a single self-contained file that can be loaded by any Vulkan program, but it's not obvious how this is done. After poking around for a while I found all the pieces I needed to put this together.
    Compiling
    First, you need to compile each shader stage from a source code file into a precompiled SPIR-V file. There are several tools available to do this, but I prefer GLSlangValidator because it supports the Google #include extension. Put your vertex shader code in a text file named "shader.vert" and your pixel shader code in a file called "shader.frag". Create a .bat file in the same directory with the following contents:
glslangValidator.exe "shader.vert" -V -o "vert.spv"
glslangValidator.exe "shader.frag" -V -o "frag.spv"
Run the bat file and two .spv files will be saved.
    Linking
    Now we want to combine our two files representing different shader stages into a single file. This is done with the link tool from Khronos. Add the following lines to your .bat file to compile the two .spv files into one. It will also delete the existing files to clean things up a little:
spirv-link "vert.spv" "frag.spv" -o "shader.spv"
del "vert.spv"
del "frag.spv"
This will save a single file named "shader.spv" that you can load as one shader module and use for different stages in Vulkan.
    Here are the required executables and a .bat file:
    BuildShader.zip
    Parsing
    If you always use vertex and fragment stages then there is no problem, but what if the combined .spv file contains other stages, or is missing a fragment stage? We can easily account for this with a minimal SPIR-V file parser. We're not going to include any big bloated libraries to do this because we only need some basic information about what stages are contained in the shader. Fortunately, the SPIR-V specification is pretty simple and it doesn't take much code to extract the information we want:
std::string entrypointname[6];
auto stream = ReadFile(L"Shaders/shader.spv");

// Parse SPIR-V data
Assert(stream->ReadInt() == 0x07230203);
int version = stream->ReadInt();
int genmagnum = stream->ReadInt();
int bound = stream->ReadInt();
int reserved = stream->ReadInt();
bool stages[6] = {false,false,false,false,false,false};

// Instruction stream
while (stream->Ended() == false)
{
    int pos = stream->GetPos();
    unsigned int bytes = stream->ReadUInt();
    int opcode = LOWORD(bytes);
    int wordcount = HIWORD(bytes);
    if (opcode == 15)
    {
        int executionmodel = stream->ReadInt();
        Assert(executionmodel >= 0);
        if (executionmodel < 6)
        {
            stream->ReadInt(); // entry point
            stages[executionmodel] = true;
            entrypointname[executionmodel] = stream->ReadString();
        }
    }
    stream->Seek(pos + wordcount * 4);
}
This code even retrieves the entry point name for each stage, so you can be sure you are loading the shader correctly.
    Here are the different shader stages from the SPIR-V specification:
0: Vertex
1: TessellationControl
2: TessellationEvaluation
3: Geometry
4: Fragment
5: GLCompute
That's it! We now have a standard single-file shader format for Vulkan programs. Your code for creating these will look something like this:
VkShaderModule shadermodule;

// Create shader module
VkShaderModuleCreateInfo shaderCreateInfo = {};
shaderCreateInfo.sType = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
shaderCreateInfo.codeSize = bank->GetSize();
shaderCreateInfo.pCode = reinterpret_cast<const uint32_t*>(bank->buf);
VkAssert(vkCreateShaderModule(device->device, &shaderCreateInfo, nullptr, &shadermodule));

// Create vertex stage info
VkPipelineShaderStageCreateInfo vertShaderStageInfo = {};
vertShaderStageInfo.sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO;
vertShaderStageInfo.stage = VK_SHADER_STAGE_VERTEX_BIT;
vertShaderStageInfo.module = shadermodule;
vertShaderStageInfo.pName = entrypointname[0].c_str();

VkPipelineShaderStageCreateInfo fragShaderStageInfo = {};
if (stages[4])
{
    // Create fragment stage info
    fragShaderStageInfo.sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO;
    fragShaderStageInfo.stage = VK_SHADER_STAGE_FRAGMENT_BIT;
    fragShaderStageInfo.module = shadermodule;
    fragShaderStageInfo.pName = entrypointname[4].c_str();
}

// Create your graphics pipeline...
  7. Josh
    I got my idea working with spot lights! You can simply use a shadow mode of 2 to indicate an object or light should be considered "static". A light that has the static mode set will use two shadow maps; one for static objects and one for dynamic objects. In the image below, the walls and room are static, and the oildrum has the regular dynamic shadow mode set. As you can see, redrawing the shadow only requires 650 triangles. Without this feature, the whole room and everything in it would have to be redrawn any time something moved!
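In practice this should just mean flagging anything that never moves. A minimal sketch, assuming an entity SetShadowMode method that takes the mode value described above (the entity names and the dynamic-mode constant are illustrative):
//Sketch: mark static geometry so it renders into the static shadow map.
//Mode 2 = "static" per the text above; other values are assumptions.
room->SetShadowMode(2);     //walls and floor: drawn into the static map once
light->SetShadowMode(2);    //light keeps separate static and dynamic shadow maps
oildrum->SetShadowMode(1);  //assumed dynamic mode: redrawn when it moves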
     



    Best of all, if the light moves or if any static object within the light's volume moves, both the static and dynamic shadowmaps are redrawn. The process is seamless for the end user. Dynamic lighting is usually dynamic, and baked lighting is usually static. I'm not sure what this qualifies as, but one thing is for sure...it's optimal!
     
    This will yield a huge performance increase for scenes with lots of point and spot lights, as long as those lights remain in a fixed position. You should avoid having lots of moving point and spot lights.
  8. Josh
    The OpenGL 4 renderer for Leadwerks 3.1 is well underway. I began implementing deferred lighting in earnest last Friday. Previous work included setting up MSAA on deferred buffers, as well as implementation of the Leadwerks Buffer class, which functions as a thin wrapper around OpenGL FBOs (frame buffer objects). The engine already tracks relationships between objects and lights, and this system was able to be used to provide a list of shadow-casting objects for each point and spot light.
     
    Because I've been working with deferred rendering since 2008, I've had time to study the bottlenecks and invented my own optimizations to improve performance with this type of renderer. Leadwerks uses special proprietary techniques to make shadow map rendering extremely fast.
     
    I began with the simplest light type, a spot light. These use a perspective projection onto a single shadowmap. This took about two days to implement.

     
    Point lights work by projecting six images onto a cubemap shadowmap. This was a little trickier but I got it done after another two days:

     
    The hardest to implement was directional light shadows. We began using cascaded shadowmaps back in 2008, and the industry has since concluded they are in fact the best choice for performance and quality. It took a few extra days to get this one working, but now it's done:

     
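For background, cascaded shadow maps split the view frustum into several depth ranges, each with its own shadow map. The split distances are commonly computed with the "practical split scheme," which blends logarithmic and uniform spacing. Here is a sketch of that standard calculation (illustrative, not necessarily the exact weights Leadwerks uses):
//Sketch: practical split scheme for cascaded shadow maps.
//Blends logarithmic spacing (good resolution distribution) with
//uniform spacing (stability), weighted by lambda.
#include <cmath>
#include <vector>

std::vector<float> CascadeSplits(float nearplane, float farplane, int cascades, float lambda = 0.75f)
{
    std::vector<float> splits(cascades + 1);
    for (int i = 0; i <= cascades; ++i)
    {
        float f = float(i) / float(cascades);
        float logsplit = nearplane * std::pow(farplane / nearplane, f);
        float linsplit = nearplane + (farplane - nearplane) * f;
        splits[i] = lambda * logsplit + (1.0f - lambda) * linsplit;
    }
    return splits; //splits[0] == near plane, splits[cascades] == far plane
}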
    This video is a very nice illustration of the technique:


     
    And here is my implementation in action. With just a little more adjustment, it will be perfect:


     
    The Leadwerks 3.1 deferred renderer has a number of advantages over the Leadwerks 2 renderer:
    Deferred renderer can now use 32x hardware MSAA instead of "fake" antialias techniques.
    Shadow edge filtering is much softer, less "grainy".
    Cubemap shadow maps are used to render point lights in a single pass.
    Higher default resolutions and settings to take advantage of modern hardware.

     
    These factors combine to provide an extremely clean and beautiful rendered image. When I'm done, I think the Leadwerks 3.1 renderer will look more like high-quality cg than what we think of as real-time renders.
  9. Josh
    After a lot of research and development, Leadwerks GUI is almost ready to release.  The goal with this system was to create an in-game GUI that was customizable, extendable, and could also serve as a windowed GUI for application development in the future.
    Widgets
    The GUI system is based on the Widget class.  Once a GUI is created on a rendering context you can add widgets to it.  Each widget is a rectangular container with a padding area.  The widgets can be arranged in a hierarchy and their bounds will clip the contents of their child widgets, both for rendering and mouse interaction.
The GUI is automatically rendered onto the screen inside the call to Context:Sync().
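For illustration, creating a GUI and attaching widgets might look something like this; the constructor and method names here are assumptions based on the description above, not the final API:
//Sketch (hypothetical API): create a GUI on the rendering context
//and attach widgets to it. The script assigned to each widget
//determines what kind of control it is.
GUI* gui = GUI::Create(context);

Widget* panel = Widget::Create(20, 20, 300, 200, gui->GetBase());
panel->SetScript("Scripts/GUI/Panel.lua");

//Child widgets are clipped by their parent, for rendering and mouse input
Widget* button = Widget::Create(10, 10, 100, 30, panel);
button->SetScript("Scripts/GUI/Button.lua");
button->SetText("OK");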
    Widget Scripts
    Each widget has a Lua script to control rendering and behavior, similar to the way Lua scripts work in our entity system.  The script assigned to a widget controls what type of widget it is, how it looks, and how it interacts with mouse and keyboard input.  A set of widget scripts are provided to create a variety of controls including buttons, checkboxes, text entry boxes, list boxes, text display areas, choice boxes, sliders, and more.
    You can create your own widget scripts to add new types of controls, like for an RPG interface or something else.  The script below shows how the tabber widget is implemented.
--Styles
if Style==nil then Style={} end
if Style.Panel==nil then Style.Panel={} end
Style.Panel.Border=1
Style.Panel.Group=2

--Initial values
Script.indent=1
Script.tabsize = iVec2(72,28)
Script.textindent=6
Script.tabradius=5

function Script:Start()
    self.widget:SetPadding(self.indent,self.indent,self.tabsize.y+self.indent,self.indent)
end

function Script:MouseLeave()
    if self.hovereditem~=nil then
        self.hovereditem = nil
        local scale = self.widget:GetGUI():GetScale()
        local pos = self.widget:GetPosition(true)
        local sz = self.widget:GetSize(true)
        self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
        --self.widget:Redraw()
    end
end

function Script:Draw(x,y,width,height)
    local gui = self.widget:GetGUI()
    local pos = self.widget:GetPosition(true)
    local sz = self.widget:GetSize(true)
    local scale = self.widget:GetGUI():GetScale()
    local n
    local sel = self.widget:GetSelectedItem()

    --Draw border
    gui:SetColor(0)
    gui:DrawRect(pos.x,pos.y+self.tabsize.y*scale,sz.width,sz.height-self.tabsize.y*scale,1)

    --Draw unselected tabs
    for n=0,self.widget:CountItems()-1 do
        if n~=sel then self:DrawTab(n) end
    end

    --Draw selected tab
    if sel>-1 then self:DrawTab(sel) end

    ---Panel background
    gui:SetColor(0.25)
    gui:DrawRect(pos.x+1,pos.y+self.tabsize.y*scale+1,sz.width-2,sz.height-self.tabsize.y*scale-2)
end

function Script:DrawTab(n)
    local gui = self.widget:GetGUI()
    local pos = self.widget:GetPosition(true)
    local sz = self.widget:GetSize(true)
    local scale = self.widget:GetGUI():GetScale()
    local s = self.widget:GetItemText(n)
    local textoffset=2*scale
    if self.widget:GetSelectedItem()==n then textoffset=0 end
    local leftpadding=0
    local rightpadding=0
    if self.widget:GetSelectedItem()==n then
        gui:SetColor(0.25)
        if n>0 then leftpadding = scale*1 end
        rightpadding = scale*1
    else
        gui:SetColor(0.2)
    end
    gui:DrawRect(-leftpadding+pos.x+n*(self.tabsize.x)*scale,textoffset+pos.y,rightpadding+leftpadding+self.tabsize.x*scale+1,self.tabsize.y*scale+self.tabradius*scale+1,0,self.tabradius*scale)
    gui:SetColor(0)
    gui:DrawRect(-leftpadding+pos.x+n*(self.tabsize.x)*scale,textoffset+pos.y,rightpadding+leftpadding+self.tabsize.x*scale+1,self.tabsize.y*scale+self.tabradius*scale+1,1,self.tabradius*scale)
    if self.widget:GetSelectedItem()~=n then
        gui:SetColor(0)
        gui:DrawLine(pos.x+n*self.tabsize.x*scale,pos.y+self.tabsize.y*scale,pos.x+n*self.tabsize.x*scale+self.tabsize.x*scale,pos.y+self.tabsize.y*scale)
    end
    if self.hovereditem==n and self.widget:GetSelectedItem()~=n then
        gui:SetColor(1)
    else
        gui:SetColor(0.7)
    end
    gui:DrawText(s,pos.x+(n*self.tabsize.x+self.textindent)*scale,textoffset+pos.y+self.textindent*scale,(self.tabsize.x-self.textindent*2)*scale-2,(self.tabsize.y-self.textindent*2)*scale-1,Text.VCenter+Text.Center)
end

function Script:MouseDown(button,x,y)
    if button==Mouse.Left then
        if self.hovereditem~=self.widget:GetSelectedItem() and self.hovereditem~=nil then
            self.widget.selection=self.hovereditem
            local scale = self.widget:GetGUI():GetScale()
            local pos = self.widget:GetPosition(true)
            local sz = self.widget:GetSize(true)
            self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
            EventQueue:Emit(Event.WidgetAction,self.widget,self.hovereditem)
        end
    elseif button==Mouse.Right then
        if self.hovereditem~=self.widget:GetSelectedItem() and self.hovereditem~=nil then
            EventQueue:Emit(Event.WidgetMenu,self.widget,self.hovereditem,x,y)
        end
    end
end

function Script:KeyDown(keycode)
    if keycode==Key.Right or keycode==Key.Down then
        local item = self.widget:GetSelectedItem() + 1
        if item<self.widget:CountItems() then
            self.widget.selection=item
            local scale = self.widget:GetGUI():GetScale()
            local pos = self.widget:GetPosition(true)
            local sz = self.widget:GetSize(true)
            self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
            EventQueue:Emit(Event.WidgetAction,self.widget,item)
        end
    elseif keycode==Key.Left or keycode==Key.Up then
        local item = self.widget:GetSelectedItem() - 1
        if item>-1 and self.widget:CountItems()>0 then
            self.widget.selection=item
            local scale = self.widget:GetGUI():GetScale()
            local pos = self.widget:GetPosition(true)
            local sz = self.widget:GetSize(true)
            self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
            EventQueue:Emit(Event.WidgetAction,self.widget,item)
        end
    elseif keycode==Key.Tab then
        local item = self.widget:GetSelectedItem() + 1
        if item>self.widget:CountItems()-1 then item=0 end
        if self.widget:CountItems()>1 then
            self.widget.selection=item
            local scale = self.widget:GetGUI():GetScale()
            local pos = self.widget:GetPosition(true)
            local sz = self.widget:GetSize(true)
            self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
            EventQueue:Emit(Event.WidgetAction,self.widget,item)
        end
    end
end

function Script:MouseMove(x,y)
    local prevhovereditem = self.hovereditem
    self.hovereditem = nil
    local scale = self.widget:GetGUI():GetScale()
    local sz = self.widget:GetSize(true)
    if x>=0 and y>=0 and x<sz.width and y<self.tabsize.y*scale then
        local item = math.floor(x / (self.tabsize.x*scale))
        if item>=0 and item<self.widget:CountItems() then self.hovereditem=item end
    end
    if self.hovereditem==self.widget:GetSelectedItem() and prevhovereditem==nil then return end
    if self.hovereditem==nil and prevhovereditem==self.widget:GetSelectedItem() then return end
    if prevhovereditem~=self.hovereditem then
        local pos = self.widget:GetPosition(true)
        local sz = self.widget:GetSize(true)
        self.widget:GetGUI():Redraw(pos.x,pos.y,sz.width,self.tabsize.y*scale+1)
    end
end
Widget Rendering
    Widgets are buffered and rendered with an advanced system that draws only the portions of the screen that need to be updated.  The GUI is rendered into a texture, and then the composite image is drawn onscreen.  This means you can have very complex interfaces rendering in real-time game menus with virtually no performance cost.
    By default, no images are used to render the UI so you don't have to include any extra files in your project.
    Widget Items
    Each widget stores a list of items you can add, remove, and edit.  These are useful for list boxes, choice boxes, and other custom widgets.
    GUI Events
Leadwerks 4.4 introduces a new concept into your code, the event queue.  This stores a list of events that have occurred.  When you retrieve an event it is removed from the queue:
while EventQueue:Peek() do
    local event = EventQueue:Wait()
    if event.source == widget then
        print("OK!")
    end
end
Resolution Independence
    Leadwerks GUI is designed to operate at any resolution.  Creation and positioning of widgets uses a coordinate system based on a 1080p monitor, but the GUI can use a global scale to make the interface scale up or down to accommodate any DPI including 4K and 8K monitors.  The image below is rendering the interface at 200% scaling on a 4K monitor.

A default script will be provided that you can include from Main.lua to build up a menu system for starting and quitting games, and handling common graphical features and other settings.

    Leadwerks GUI will be released in Leadwerks Game Engine 4.4.
  10. Josh
    In this blog I am going to explain my overarching plan to bring Leadwerks into the future and build our community.
    First, regarding technology, I believe VR is the future. VR right now is basically where mobile was in 2006. The big publishers won't touch it and indies have a small but enthusiastic market they can target without fear of competition from EA. I have an HTC Vive and it is AWESOME! After using it, I don't really want to play games on a 2D monitor anymore. I want to be in the game, but there is still a serious lack of decent content out there.

    At the same time, I think Vulkan is eventually going to be the graphics API we want to migrate to. Although Apple still isn't supporting it, Vulkan provides the best chance at a real cross-platform graphics API in the future.

    We eventually want to move everything over to 64-bit and drop all support for 32-bit Windows (which almost no one has nowadays).
    None of these issues are pressing. In reality, a switch over to Vulkan graphics, if done right, would result in no apparent change in the user experience for you guys. This is more about future-proofing and performance gains.
There is one last big thing I can do that could yield a huge increase in performance. Right now Leadwerks runs your game code, which makes rendering and physics calls when you choose. A concurrent architecture would run your game logic, physics, rendering, and AI in four separate threads, all the time, constantly. The data exchange between these threads is a complicated matter and would likely involve some restrictions on your code. It would also break backwards compatibility with our existing API examples. So this is probably something to discuss with Leadwerks 5, and may be quite a ways off in the future.
    However, it's important that we start developing now with a clear idea of where we want to go. If information changes between now and then I can always change our course in the right direction.
    Where we are today:
OpenGL 4.0
Windows API / GTK / Cocoa
64/32-bit
Sequential architecture
Where we are going:
Vulkan 1.0
Leadwerks GUI on Windows, Mac, Linux
64-bit
Concurrent architecture
What if VR Flops?
It doesn't matter. Developing for VR simply means adding in the OpenVR SDK and focusing on performance gains. This isn't like mobile where there's a completely different API and texture formats. VR will not hold back Leadwerks on the PC. It will help it. VR needs a solid 90 FPS (although it's perfectly fine to lower your quality settings to achieve that). So developing for VR means focusing on performance gains over flexibility. We have a good idea now of how people use Leadwerks and I think we can make those decisions wisely.
    How to Begin?
    I just outlined a plan to replace some of our low-level APIs and a major architectural change. However, I'm not willing to go back and do that until I feel like we have really polished Leadwerks Game Engine and given the users what they need. There's always a temptation to go back to the low-level details, but that just makes you an eternal tinkerer. Leadwerks is good because it's the easiest way to build your own games. That is the whole reason for Leadwerks to exist and we mustn't forget that.
    If you read our reviews on Steam, it is pretty clear what the user needs. We are sitting right now with a solid 75% ("positive") approval rating, and a community that is actively churning out games. We are almost all the way there, but Leadwerks needs to take one last step to really connect with what the user needs. In my opinion, Leadwerks Game Engine is only as good as the default templates that come with it. If we can't provide polished game examples that look and feel like a real commercial game, it isn't fair to expect the end user to come up with one. I plan to add the following game templates to Leadwerks:
Shoot-em-up space side-scroller
Contra-like side scroller
Simple racing game against AI opponents (like 4x4 Evo)
RPG / dungeon crawler
Next, I plan to add built-in offline help documentation and video tutorials. By default, the editor is going to open the new help system automatically when the program starts. Too many people come on the forum asking questions about YouTube videos from years ago and still don't go to the official tutorials.
If we are going to make video tutorials, and I know that at some point we are going to revise the GUI, then it makes sense to revise the GUI first. Otherwise we will end up with a lot of video content that looks obsolete once we switch over to a new GUI. This is why I am working on Leadwerks GUI now. (Leadwerks GUI will also make it easier for us to switch the editor over to BlitzMaxNG, so that it can be compiled in 64-bit mode as pure C++.)

    The revised GUI, built-in help system, and additional game templates will raise our approval rating on Steam up to 80%, which is considered "very positive". Once I have a game development system with an 80% positive rating and (by then) 20,000 users on Steam, that gives us quite a bit of leverage. Success breeds success, and I can use those figures to our advantage when negotiating with third parties.
    Here are the steps we will take, in order:
Leadwerks GUI
Add Mac support
Built-in offline help with videos
New game templates
Achieve 80% approval rating on Steam
Once we hit the 80% mark, that's when I know Leadwerks is really giving the end user what they need. I can then go back into some of the internals and work on our next iteration of technology:
Move completely over to 64-bit
Vulkan graphics
Concurrent multithreaded architecture
This plan gives us a path forward into emerging technologies, in balance with the needs of the end user and the realities of the business. We won't have a major interruption of development, and the game tournaments will continue as this is happening.
    As always, this plan is subject to change and I may decide to do things differently depending on how circumstances develop.
  11. Josh
    I wanted to get a working example of the plugin system together. One thing led to another, and...well, this is going to be a very different blog.
    GMF2
    The first thing I had to do is figure out a way for plugins to communicate with the main program. I considered using a structure they both share in memory, but those always inevitably get out of sync when either the structure changes or there is a small difference between the compilers used for the program and DLL. This scared me so I went with a standard file format to feed the data from the plugin to the main program.
GLTF is a pretty good format but has some problems that make me look for something else to use in our model loader plugins.
It's text-based and loads slower.
It's extremely complicated.
There are 3-4 different versions of the file format and many many options to split it across multiple files, binary, text, or binary-to-text encoding.
It's meant for web content, not PC games.
Tons of missing functionality is added through a weird plugin system. For example, DDS is supported through a plugin, but a backup PNG has to be included in case the loader doesn't support the extension.
The GMF file format was used in Leadwerks. It's a custom fast-loading chunk-based binary file format. GMF2 is a simpler flat binary format updated with some extra features:
Vertices are stored in a single array ready to load straight into GPU memory.
Vertex displacement is supported.
Compressed bitangents.
Quad and line meshes are supported.
LOD
Materials and textures can be packed into the format or loaded from external files.
PBR and Blinn-Phong materials
Mesh bounding boxes are supported so they don't have to be calculated at load time. This means the vertex array never has to be iterated through, making large meshes load faster.
I am not sure yet if GMF2 will be used for actual files or if it is just going to be an internal data format for plugins to use. GLTF will continue to be supported, but the format is too much of a mess to use for plugin data.
    Here's a cool five-minute logo:

    The format looks something like this:
char[4] ID "GMF2"
int version 200
int root //ID of the root entity
int texture_count //number of textures
int textures_pos //file position for texture array
int materials_count //number of materials
int materials_pos //file position for materials
int nodes_count //number of nodes
int nodes_pos //file position for nodes
As you can see, it is really easy to read and really easy to write. There's enough complexity in this already. I'm not bored. I don't need to introduce unnecessary complexity into the design just so I can show off. There are real problems that need to be solved and making a "tricky" file format is not one of them.
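To show just how simple the layout is, here is a minimal sketch that reads the header fields above; the struct is an illustration based on the field list, not official Turbo Plugin SDK code:
//Sketch: reading the GMF2 header fields listed above from a file
#include <cstdio>
#include <cstdint>

struct GMF2Header
{
    char id[4];          //"GMF2"
    int32_t version;     //200
    int32_t root;        //ID of the root entity
    int32_t texture_count, textures_pos;
    int32_t materials_count, materials_pos;
    int32_t nodes_count, nodes_pos;
};

bool ReadGMF2Header(const char* path, GMF2Header& header)
{
    FILE* file = fopen(path, "rb");
    if (!file) return false;
    bool ok = fread(&header, sizeof(header), 1, file) == 1;
    fclose(file);
    //Validate the magic ID before trusting any offsets
    return ok && header.id[0] == 'G' && header.id[1] == 'M' && header.id[2] == 'F' && header.id[3] == '2';
}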
    In Leadwerks 2, we had a bit of code called the "GMF SDK". This was written in BlitzMax, and allowed construction of a GMF file with easy commands. I've created new C++ code to create GMF2 files:
//Create the file
GMFFile* file = new GMFFile;

//Create a model
node = new GMFNode(file, GMF_TYPE_MODEL);

//Add an LOD level
GMFLOD* lod = node->AddLOD();

//Add a mesh
GMFMesh* mesh = lod->AddMesh(3); //triangle mesh

//Add a vertex
mesh->AddVertex(0,0,0, 0,0,1, 0,0, 0,0, 255,255,255,255);
mesh->AddVertex(0,0,1, 0,0,1, 0,0, 0,0, 255,255,255,255);
mesh->AddVertex(0,1,1, 0,0,1, 0,0, 0,0, 255,255,255,255);

//Add a triangle
mesh->AddIndice(0);
mesh->AddIndice(1);
mesh->AddIndice(2);
Once your GMFFile is constructed you can save it into memory with one command. The Turbo Plugin SDK is a little more low-level than the engine, so it includes a MemWriter class to help with this, since the engine Stream class is not present.
    As a test I am writing a Quake 3 MD3 import plugin and will provide the project and source as an example of how to use the Turbo Plugin SDK.

    Packages
    The ZIP virtual file system from Leadwerks is being expanded and formalized. You can load a Package object to add new virtual files to your project. These will also be loadable from the editor, so you can add new packages to a project, and the files will appear in the asset browser and file dialogs. (Package files are read-only.) Plugins will allow packages to be loaded from file formats besides ZIP, like Quake WADs or Half-Life GCF files. Notice we keep all our loaded items in variables or arrays because we don't want them to get auto-deleted.
//Load Quake 3 plugins
auto pk3reader = LoadPlugin("Plugins/PK3.dll");
auto md3loader = LoadPlugin("Plugins/MD3.dll");
auto bsploader = LoadPlugin("Plugins/Q3BSP.dll");
Next we load the game package files straight from the Quake game directory. This is just like the package system from Leadwerks.
//Load Quake 3 game packages
std::wstring q3apath = L"C:/Program Files (x86)/Steam/steamapps/common/Quake 3 Arena/baseq3";
auto dir = LoadDir(q3apath);
std::vector<shared_ptr<Package> > q3apackages;
for (auto file : dir)
{
    if (Lower(ExtractExt(file)) == L"pk3")
    {
        auto pak = LoadPackage(q3apath + L"/" + file);
        if (pak) q3apackages.push_back(pak);
    }
}
Now we can start loading content directly from the game.
//Load up some game content from Quake!
auto head = LoadModel(world, "models/players/bitterman/head.md3");
auto upper = LoadModel(world, "models/players/bitterman/upper.md3");
auto lower = LoadModel(world, "models/players/bitterman/lower.md3");
auto scene = LoadScene(world, "maps/q3ctf2.bsp");
Modding
    I have a soft spot for modding because that is what originally got me into computer programming and game development. I was part of the team that made "Checkered Flag: Gold Cup" which was a spin-off on the wonderful Quake Rally mod:
    I expect in the new editor you will be able to browse through game files just as if they were uncompressed in your project file, so the new editor can act as a modding tool, for those who are so inclined. It's going to be interesting to see what people do with this. We can configure the new editor to run a launch script that handles map compiling and then launches the game. All the pieces are there to make the new editor a tool for modding games, like Discreet's old Gmax experiment.

I am going to provide official support for Quake 3 because all the file formats are pretty easy to load, and because Gmax can export to MD3 and it would be fun to load Gmax models. Other games can be supported by adding plugins.
    So here are some of the things the new plugin system will allow:
Load content directly from other games and use it in your own game. I don't recommend using copyrighted game assets for commercial projects, but you could make a "mod" that replaces the engine and requires the player to own the original game. You could probably safely use content from any Valve games and release a commercial game on Steam.
Use the editor as a tool to make Quake or other maps.
Add plugin support for new file formats.
This might all be a silly waste of time, but it's a way to get the plugin system working, and if we can make something flexible enough to build Quake maps, well that is a good test of the robustness of the system.
     
  12. Josh
I've begun working on an implementation of voxel cone tracing for global illumination. This technique could potentially offer a way to perform real-time indirect lighting on the entire scene, as well as real-time reflections that don't depend on having the reflected surface onscreen, as screen-space reflection does.
I plan to perform the GI calculations all on a background CPU thread, compress the resulting textures using DXTC, and upload them to the GPU as they are completed. This means the cost of GI should be quite low, although there is going to be some latency in the time it takes for the indirect lighting to match changes to the scene. We might continue to use SSR for detailed reflections and only use GI for semi-static light bounces, or it might be fast enough for moving real-time reflections. The GPU-based implementations I have seen of this technique are technically impressive but suffer from terrible performance, and we want something fast enough to run in VR.
    The first step is to be able to voxelize models. The result of the voxelization operation is a bunch of points. These can be fed into a geometry shader that generates a box around each one:
void main()
{
    vec4 points[8];
    points[0] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, -0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
    points[1] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, -0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
    points[2] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, 0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
    points[3] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, 0.5f * voxelsize.y, -0.5f * voxelsize.z, 0.0f));
    points[4] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, -0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
    points[5] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, -0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
    points[6] = projectioncameramatrix[0] * (geometry_position[0] + vec4(0.5f * voxelsize.x, 0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));
    points[7] = projectioncameramatrix[0] * (geometry_position[0] + vec4(-0.5f * voxelsize.x, 0.5f * voxelsize.y, 0.5f * voxelsize.z, 0.0f));

    vec3 normals[6];
    normals[0] = (vec3(-1,0,0));
    normals[1] = (vec3(1,0,0));
    normals[2] = (vec3(0,-1,0));
    normals[3] = (vec3(0,1,0));
    normals[4] = (vec3(0,0,-1));
    normals[5] = (vec3(0,0,1));

    //Left
    geometry_normal = normals[0];
    gl_Position = points[0]; EmitVertex();
    gl_Position = points[4]; EmitVertex();
    gl_Position = points[3]; EmitVertex();
    gl_Position = points[7]; EmitVertex();
    EndPrimitive();

    //Right
    geometry_normal = normals[1];
    gl_Position = points[1]; EmitVertex();
    gl_Position = points[2]; EmitVertex();
    ...
}
Here's a goblin whose polygons have been turned into Lego blocks.

    Now the thing most folks nowadays don't realize is that if you can voxelize a goblin, well then you can voxelize darn near anything.

    Global illumination will then be calculated on the voxels and fed to the GPU as a 3D texture. It's pretty complicated stuff but I am very excited to be working on this right now.

    If this works, then I think environment probes are going to completely go away forever. SSR might continue to be used as a low-latency high-resolution first choice when those pixels are available onscreen. We will see.
    It is also interesting that the whole second-pass reflective water technique will probably go away as well, since this technique should be able to handle water reflections just like any other material.
  13. Josh
    I have proven beyond a shadow of a doubt that I am very bad at shipping posters.  The Winter Game Tournament was completed in January and I have yet to deliver your prizes.  If you have a job opening for someone to ship prizes, or to ship anything at all, I am probably the worst candidate you could ever hope to find to fill said position.  Seriously.
Part of the problem (although it's still not an excuse) has been that it is actually incredibly difficult to have custom USB keychains made.  These took a long time because my description of black monochrome printing with the text vertically aligned was apparently too complicated.

    The first proof I got used a beautiful white-on-silver-so-light-it's-practically-white color scheme.

    The second attempt's failure was slightly more subtle.  Slightly.  Is it just me, or does the logo obviously beg to be centered by the vertical center of the text, instead of the image dimensions?  Is this really that hard?:

    At this point I was all like:

    So I drew them a picture of something that to me seemed extremely obvious.  Finally, I got the corrected proof of our glorious USB keychain and have authorized a limited production of this one-of-a-kind item.  I will give props to the graphics designer for choosing the dark gray (subtle blue?) print color, which is much better than #000000:

    All I wanted was a little taste in design.
    So here's what I am going to do.  Each entrant in the Winter Games Tournament will receive one limited-edition poster, with one limited-edition Leadwerks USB keychain inside the poster tube, plus one Leadwerks sticker (which I have a ton of).  Because of the delay and the fact that I suck at shipping prizes, I am bumping up the 4GB USB drive to a whopping 16GB.  That's four times the jigglebytes, and enough to back up the entire Leadwerks.com website on.
In other news, the acquisition of American Apparel by Gildan Activewear has actually affected our supply chain for the fabulous Leadwerks line of clothing.  The official Leadwerks hoodie in our signature gray color is currently only available in sizes small and extra small.

    I contacted SpreadShirt.com about the matter and received the following explanation:
    The first and most pertinent question is why is the little mermaid working in customer service for SpreadShirt?

    The second question is when will the official Leadwerks hoodie become available again?  The garment can be gotten in other colors, but I do not feel that any of these less appealing colors adequately represent the brand of our game engine.  This is most distressing and I will continue to look for a solution.
    The 2017 Summer Game Tournament will proceed once I have shipped the prizes out from the previous tournament.  However, as I have proven that I am not a reliable shipper of prizes, the tournament is going back to our original prizeless model.  A roundup of entries from all game developers will be posted at the end and fun will be had by all.
  14. Josh
    AI is always a fun programming topic, and it's even more fun when you're mixing a physics-based character controller with dynamic navmesh pathfinding.
     
We planned on using navmesh pathfinding from the very start of the design of the new engine. Because it was integrated from the beginning, our implementation works really nicely. Pathfinding is completely automatic and dynamic. There are no commands you need to call to make it work. When part of the scene changes, the navigation data for that sector is automatically recalculated. If you haven't seen it already, you can check out some videos on the Leadwerks YouTube channel:


     
Character controllers are a special physics body used to control the movement of characters in the game. They are used both by players as a control method and by enemies and NPCs. Sometimes they will be controlled by keyboard or touch input, and sometimes they are controlled by AI navigation. Consequently, the character controller class has to be able to handle both.
     
    There are so many cool things I can do that it's fun and a little scary. Right now I am toying with the following API. The first two commands would just make the character constantly move to the position or follow the entity you specify, so there's only need to call them once, and the engine will handle the rest:

bool CharacterController::GoToPoint(const float& x, const float& y, const float& z)
bool CharacterController::Follow(Entity* entity)
void CharacterController::Stop()
     
    And then you still have manual controls, which are analogous to "UpdateController" from LE2:

    CharacterController::SetInput(const float& angle, const float& movement, const float& strafe...)
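Putting the proposed API together, AI and player movement might look something like this. Since the class is still being designed, treat this as a sketch of intent; GetCharacterController, the key constants, and the variable names are assumptions:
//Sketch: one NPC follows the player, another walks to a fixed point.
//Both calls only need to be issued once; the engine then handles
//pathfinding and steering continuously.
npc1->GetCharacterController()->Follow(player);
npc2->GetCharacterController()->GoToPoint(10, 0, 4);

//The player character is still driven manually each frame,
//analogous to UpdateController from LE2.
float angle = camera->GetRotation().y;
float move = (window->KeyDown(Key::W) - window->KeyDown(Key::S)) * speed;
float strafe = (window->KeyDown(Key::D) - window->KeyDown(Key::A)) * speed;
playercontroller->SetInput(angle, move, strafe);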
     
    In the screenshot below, I control the rightmost character with the keyboard, while the others are programmed to follow me:

  15. Josh
    It's been nearly a year since I made the decision to port our OpenGL renderer over to Vulkan, and it has been very difficult work, but we are getting back up to speed. I was able to get skinned animation working for the first time in Vulkan yesterday, using a slightly modified version of our original animation shader code.
    The system works exactly the same as in Leadwerks 4, with a few differences:
Animation features are confined to the Model class only, and are no longer a general Entity feature. Bones are no longer an entity. This reduces a lot of overhead when animation is processed. There will be some type of "AttachToBone" method in the entity class you can use to place a weapon in a character's hand.
The Model class has a new Skeleton member. A skeleton is only present in animated models.
Skeletons can be shared between models. This allows the engine to process animation once and multiple models can share the skeleton. The models can be instances of the same model, or different models. This is very good for rendering large crowds of enemies. (However, you might not even need to bother with this type of optimization, because 1000 unique skeletons works with no issues, as seen below.)
Animation is executed on one or more separate threads which makes it very very fast.
Animated characters are now batched instead of drawn one at a time.
With the OpenGL prototype renderer we saw very good performance with this technique, so it will be interesting to see how Vulkan compares:
    I have not added support for GLTF animation loading yet, which will give us a wide variety of ready-to-use animated models.
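As a rough illustration of the skeleton sharing described above (the method names here are assumptions; the final API may differ):
//Sketch (hypothetical API): share one skeleton between many models
//so animation is processed once for the whole crowd.
auto soldier = LoadModel(world, "Models/soldier.gmf");
auto skeleton = soldier->GetSkeleton();
skeleton->PlayAnimation("run");

for (int n = 0; n < 100; ++n)
{
    auto clone = soldier->Instance();
    clone->SetSkeleton(skeleton); //all instances reuse the animated pose
    clone->SetPosition(n % 10, 0, n / 10);
}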
  16. Josh
After a couple of days of work I got point light shadows working in the new clustered forward renderer. This time around I wanted to see if I could get a more natural look for shadow edges, as well as reduce or eliminate shadow acne. Shadow acne is an effect that occurs when the resolution of the shadow map is too low, and incorrect depth comparisons start being made with the lit pixels. By default, any shadow mapping algorithm will look like this, because not every pixel onscreen has an exact match in the shadow map when the depth comparison is made:

    We can add an offset to the shadow depth value to eliminate this artifact:
    However, this can push the shadow back too far, and it's hard to come up with values that cover all cases. This is especially problematic with point lights that are placed very close to a wall. This is why the editor allows you to adjust the light range of each light, on an individual basis.
I came across a technique called variance shadow mapping. I saw this paper years ago, but never took the time to implement it because it just wasn't a big priority. This works by writing the depth and depth squared values into a GL_RG texture (I use 32-bit floating points). The resulting image is then blurred, and the variance of the values can be calculated from the average squared depth stored in the green channel.


    Then we use Chebyshev's inequality to get an average shadow value:
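For reference, here is the standard computation from the variance shadow mapping paper, written as plain C-style code rather than Leadwerks' actual shader:
//Sketch: Chebyshev upper bound used in variance shadow mapping.
//meandepth = E[d] and meandepthsquared = E[d^2] come from the
//blurred red/green channels of the shadow map.
float ChebyshevUpperBound(float meandepth, float meandepthsquared, float fragmentdepth)
{
    //Fully lit if the fragment is in front of the average occluder
    if (fragmentdepth <= meandepth) return 1.0f;

    //Variance = E[d^2] - E[d]^2, clamped to avoid divide-by-zero
    float variance = meandepthsquared - meandepth * meandepth;
    variance = variance > 0.00001f ? variance : 0.00001f;

    //P(d >= t) <= variance / (variance + (t - mean)^2)
    float d = fragmentdepth - meandepth;
    return variance / (variance + d * d);
}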

    So it turns out, statistics is actually good for something useful after all. Here are the results:

The shadow edges are actually soft, without any graininess or pixelation. There is a black border on the edge of the cubemap faces, but I think this is caused by my calculated cubemap face not matching the one the hardware uses to perform the texture lookup, so it can probably be fixed.
    As an added bonus, this eliminates the need for a shadow offset. Shadow acne is completely gone, even in the scene below with a light that is extremely close to the floor.

The banding you are seeing is added in the JPEG compression and is not visible in the original render.
    Finally, because the texture filtering is so smooth, shadowmaps look much higher resolution than with PCF filtering. By increasing the light range, I can light the entire scene, and it looks great just using a 1024x1024 cube shadow map.

    VSMs are also quite fast because they only require a single texture lookup in the final pass. So we get better image quality, and probably slightly faster speed. Taking extra time to pay attention to small details like this is going to make your games look great soon!
  17. Josh
    An update is available on the beta branch on Steam that adds support for multiplayer games with the following features:
NAT punch-through with relay server fallback.
Connectionless peer-to-peer UDP messages with multiple channels and optional reliable flag.
Public server list of available games to play.
Voice-over-IP for in-game chat (and taunts).
The new multiplayer system will open up a new range of game types that can be easily created with Leadwerks Game Engine.
    These features are still being tested and are only available in the Windows build right now.
  18. Josh
Step 1. Build the object you want to make into a prefab. Here I have a staircase I want to make into a reusable object.

     
    Step 2. Drag all the objects onto one, to make it a parent of the others. Prefabs need one top-level entity.

     
    Step 3. Click on the top-most object in the hierarchy and select the "Save as Prefab" menu item. You'll get a file save dialog so you can choose the file to save.

     
    Step 4. Begin using your prefab by dragging it into the scene from the asset browser.

     
    Prefabs will always be reloaded from the original prefab file. Even if you overwrite the prefab file, your changes will be instantly reflected throughout the entire scene. If you make any changes to an object loaded from a prefab that would make that object so it is no longer instanced, a dialog box will warn you and ask if you want to proceed before turning it into a unique object.
     
    Prefabs can be made from all entities in the editor, not just CSG brushes.
  19. Josh
    The stock scripts that come with Leadwerks form a continually expanding game framework that can be used in many different ways. The reuse we get out of the door, switch, trigger, and AI scripts is really great. Every time I add a new script I can count on it being used in many different ways.
     
    Sometimes this requires a custom model in order to develop and demonstrate usage of the script. It's best for me to have one model made exactly to my specs, and then make everything else conform to that template.
     
    There are additional scripts I want to release in the near future that add more gameplay elements:
    TurretAI, with custom model.
    Enterable vehicles with controls for guns, if present.
    Third-person player controls, with options for fixed rotation or over-the-shoulder views.

     
    I also want to make the projectile prefab loaded by the soldierAI script a modifiable script value, so that we can easily add new types of projectiles. This would also work with turrets or vehicle-mounted guns.
     
    What other ideas for widely reusable scripts do you have?
  20. Josh
    I was thinking this morning how Apple has influenced my life and the company Leadwerks. You might ask why I am talking about this, since we are only now coming out with our first product for any Apple operating system. Well, before Windows existed, back in the 1980's, this was my first computer:
     

     
    The first "game" I ever made was in Macintosh Paint. I drew a road from a top-down view, with barriers randomly placed on either side. I drew a race car, zoomed in the image, and used the selection lasso to cut it from the page. By dragging the selected pixels to the top of the screen, I could make the image scroll, causing the road to move beneath my car, and I had to move the mouse left and right to avoid the barriers. (I tried to forget the order I had placed them in.)
     
I also used this machine to paint my first textures, before texture mapping was invented. I drew sides of buildings with variations of windows and doors. Although the monitor displayed only two colors, black and white, you could approximate gray by stippling pixels. There was also the option to use colored pencils once you printed out the images. I would print these on paper and then construct buildings out of paper with them, a precursor to the work we would do later with CSG modeling.
     
    My uncle also wrote some of the early Mac games, including "Save the Farm", and "Hot Air Balloon". I actually have the original apps and amazingly they run on my 2010 iMac with an emulator.
     

     
    Last year, as we began development of Leadwerks3D, I bought my first Apple products in my adult life, including a 27" iMac, iPhone 4, and iPad 2. I've been really impressed with the quality and design of these products, and although I am not quite sure their OpenGL 3.2 drivers are there yet, I think high-end graphics on Mac has a good future. (I had it confirmed from another developer that OpenGL 3.2 MRTs do work on Mac, so I still have some experimentation to do. I suspect the multisampled texture format I am using may be causing a problem...) Ever since I started putting out videos of our new engine running on mobile devices, there has been a tremendous response from developers, both inside the Leadwerks community and beyond.
     
    I think the rise of the iPhone as a gaming device was sort of accidental. They had a device that could run apps, then they discovered people were playing games more and more on them. However, all indie game developers should be grateful Apple developed a thriving ecosystem for third party developers, because without them I don't think we would see the resurgence of indie game development like we have. Even if you are developing primarily for Android, you have to admit that platform would not be as good as it is without the influence of iOS.
     

     
    In a sea of mediocrity, one man cared about his work enough to make sure that everything his company put out was a cohesive reflection of his tastes and standards. I don't agree with every decision Apple has made, but I have a great deal of respect for the way Steve Jobs had a coherent vision and brought it to life, instead of just blindly reacting to what he thought would make money. He rejected the "low quality, low confidence, low price, low effort" approach the rest of the tech industry has taken, and won. Without him I think Leadwerks would be without what I think will be our most successful platforms, and indie game developers would not have the same opportunities that are present today. We should all be grateful for that.
     


     

    http://www.youtube.com/watch?v=uwOjsYNcUW0
  21. Josh
The beta branch of the professional version has been upgraded to use Visual Studio 2017. You can download the release candidate for free here:
    https://www.visualstudio.com/vs/visual-studio-2017-rc/
     
Your existing projects should work with VS 2017 with no changes, but if you are in the middle of a project you might want to hold off while we test. New projects you create will open with VS 2017 by default.
     

  22. Josh
A new update is available to beta testers. This makes some pretty big changes, so I wanted to release it before doing any additional work on the post-processing effects system.
    Terrain Fixed
The terrain system is working again, with examples for both Lua and C++.
    New Configuration Options
    New settings have been added in the "Config/settings.json" file:
    "MultipassCubemap": false, "MaxTextures": 512, "MaxCubemaps": 16, "MaxShadowmaps": 64, "MaxIntegerTextures": 32, "MaxUIntegerTextures": 32, "MaxCubeShadowmaps": 64, "MaxVolumeTextures": 16, "LuaErrorCommand": "code", "LuaErrorCommandArguments": "-g \"$(CurrentFile)\":$(LineNumber) \"$(AppDir)\"" The max texture values will allow you to reduce the array size the engine requires for textures. If you have gotten an error message about "not enough texture units" this setting can be used to bring your application down under the limit your hardware has.
The Lua settings define the command that is run when a Lua error occurs. By default this opens Visual Studio Code and displays the file and line number the error occurred on. For example, an error on line 10 of "Scripts/Main.lua" would expand to something like: code -g "Scripts/Main.lua":10 "C:/MyGame" (the paths here are hypothetical).
    String Classes
I've implemented two string classes for better string handling. The String and WString classes are derived from both std::string / std::wstring AND the Object class, which means they can be used anywhere an object is accepted (like the Event.source member). 8-bit character strings will automatically convert to wide strings, but not the other way around. The Load commands used to have two overloads each, one for narrow and one for wide strings. These have been replaced with a single command that accepts a WString, so you can call LoadTexture("brick.dds") without having to specify a wide string like this: L"brick.dds".
The global string functions like Trim, Right, Mid, etc. have been added as methods on the two string classes. Eventually the global functions will be phased out.
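Here is a quick sketch of how the new classes fit together, based on the behavior described above (the exact return types are my assumption):
String path = "  brick.dds  "; // an 8-bit string with stray whitespace
WString wpath = path.Trim(); // Trim() is now a method; the narrow result converts to wide automatically
auto tex = LoadTexture(wpath); // the Load commands now take a single WString parameter
auto tex2 = LoadTexture("brick.dds"); // narrow literals convert implicitly, so no L"" prefix is needed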
    Lua Integration in Visual Studio Code
    Lua integration in Visual Studio Code is just about finished and it's amazing! Errors are displayed, debugging works great, and console output is displayed, just like any serious modern programming language. Developing with Lua in Leadwerks 5 is going to be a blast!

    Lua launch options are now available for Debug, Release, Debug 64f, and Release 64f.
    I feel the Lua support is good enough now that the .bat files are not needed. It's easier just to open VSCode and copy the example you want to run into Main.lua. These are currently located in "Scripts/Examples" but they will be moved into the documentation system in time.
The black console window is going away; all executables are now compiled as windowed applications by default, not console apps. The console output is still available in Visual Studio's debug output, or it can be piped to a file with a .bat launcher.
    See the notes here on how to get started with VSCode and Lua.
  23. Josh
    I've been working on water. It's a challenge to get it to work properly with the post-effects stack and lighting, something I never did quite right in Leadwerks 2. However, the results are turning into the best implementation of water I've ever made.
Water blends naturally with the post-effects stack.
    Water reflection/refraction equation and behavior is improved over Leadwerks 2.
Water ripples don't have an obvious direction of flow and look very realistic, without looking "wrong" when placed in certain maps (e.g. water flowing sideways against a coastline).

     
I am not attempting to do 3D ocean waves like in Crysis and some other games. Although those techniques came close to looking real, I feel they haven't aged well; they sit in the uncanny valley and look a bit like saran wrap in motion. They also aren't as versatile as the water I am making, and only look good in a few specific types of settings.
     
    Here's a WIP image with a bunch of post-effects applied to make sure everything works right together.
     

     
    I also met with Valve and asked them to add a desktop version of the in-app purchase dialog for user items. This will allow other users to easily purchase the items you publish for use with Leadwerks.
     

  24. Josh
    Textures in LE3 give you more control. You can retrieve all or part of any mipmap of a texture to system memory, edit the texture, then send it back to the GPU. Fortunately, the engine checks the bounds of the texture so you don't have to worry about producing any blue screens of death, as can happen when you're having too much fun with GPU memory.
     
    Here's some simple code to retrieve the texture from video memory into a system memory bank, modify it, and send it back to the graphics card:

Bank* pixels = new Bank( texture->GetMipmapSize(0) ); // allocate a bank big enough for mipmap 0
texture->GetPixels( pixels ); // copy the mipmap from GPU memory into the bank
pixels->PokeByte( (y*width + x)*4, 255 ); // modify one byte, assuming 4-byte pixels in row-major order
texture->SetPixels( pixels ); // send the modified pixels back to the graphics card
    There are overloaded functions for raw memory pointers, but the bank syntax has a little extra error checking, since this kind of thing can easily produce BSODs.
     
If you want to get really fancy, there are additional parameters that let you retrieve or send part of a texture to and from the GPU. This is something I use extensively in the terrain editing routines, which take place on the GPU (that's why terrain editing is so fast). I use a shader to modify part of the terrain heightmap, and then I have to retrieve the modified heightmap back to system memory. Because the textures can be pretty big, it's more efficient to retrieve a sub-rectangle of pixels instead of the whole texture.
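The exact parameter list isn't shown in this post, so treat the following as a hypothetical sketch of sub-rectangle retrieval, assuming GetPixels() accepts a mipmap level and a region:
Bank* region = new Bank( 64 * 64 * 4 ); // room for a 64x64 block of 4-byte pixels
heightmap->GetPixels( region, 0, 32, 32, 64, 64 ); // assumed parameters: mip level, x, y, width, height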
     
    I ran into some trouble with DDS compression, as the image below demonstrates. This is what happens when you accidentally offset DDS mipmap data by four bytes. (This has been fixed):

The texture loader can handle background texture loading, so if you want you can have those cool blurry textures when a level starts that get sharper after a few seconds. This cuts the texture load time down to practically zero, which is especially nice for large scenes with lots of big textures.
     
All media files in LE3 can be reloaded with a Reload() class function. This is necessary for some of the art pipeline features in the editor, so it's being built in from the very beginning. It is pretty cool to tap a key and watch your textures reload one mipmap at a time. It also means you can change the texture quality setting on the fly without reloading textures manually: when the setting changes, the engine automatically reloads every texture that came from a file.
     
    As we saw last week, here is the proper way to load a texture and apply it to a material in LE3:

Texture* tex = LoadTexture("brickwall01.dds"); // load the texture from disk
mat->SetTexture(tex); // the material stores its own reference to the texture
delete tex; // release our reference; the material keeps the texture alive
    It occurred to me this can be simplified with an alternate overloaded function:

    mat->SetTexture("brickwall01.dds");
    The same can be done with shaders. If it succeeds, true is returned, otherwise false is returned:

    mat->SetShader("Shaders/simple.shader");
    So that's a nice convenient feature.
     
    It was pointed out that my Copy(int unique) syntax was confusing, and I agree, because I couldn't even remember which was which. Therefore you will have an Instance() and Copy() function for textures, shaders, entities, materials, etc. The Instance() function will act like LE2's CopyEntity(), and the Copy() function will make a unique copy that can be modified without altering the original.
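To make the distinction concrete, here is a minimal sketch using a material (the SetColor() calls are just for illustration):
Material* shared = mat->Instance(); // shares data with the original, like LE2's CopyEntity()
shared->SetColor(1, 0, 0, 1); // changes the original material too, since the data is shared
Material* unique = mat->Copy(); // a fully unique duplicate
unique->SetColor(0, 1, 0, 1); // safe; the original is untouched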
     
    Texture animation is supported, out of the box. I'm using the old Quake naming convention where you add +framenumber to the end of the texture to specify an animation. For example, if you load "fire+01.dds", the engine will look for "fire+02.dds", "fire+03.dds", etc., until all frames are loaded. The texture will automatically switch frames with an animation speed you specify. It should be possible to convert AVI files into a sequence of separate frames, but I haven't looked into it yet.
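For example, loading the first frame might look like this (the speed function shown is a placeholder, not a final API):
Texture* fire = LoadTexture("fire+01.dds"); // the +01 suffix makes the engine also load fire+02.dds, fire+03.dds, ...
// fire->SetAnimationSpeed(15.0); // placeholder name for setting the frame rate; the final API may differ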
     
That's all I have for now. It feels like I am really in the thick of LE3 development. I've already done most of the graphics stuff with LE2, so I know how that works. What I am really looking forward to is the high-level features, particularly the script and flowgraph implementation. I like to attack unknowns first, because the more I know, the better I can design the engine. We already know we can do fantastic graphics, so it makes sense to focus on areas I haven't developed as extensively in the past: networking, gameplay, and the art pipeline. Therefore, we may see a Leadwerks 3 beta with just basic graphics and a focus on gameplay features first.
     
    Please "like" this blog!