Blog Entries posted by Josh

  1. Josh
    In Leadwerks Game Engine 4, bones are a type of entity. This is nice because all the regular entity commands work just the same on them, and there is not much to think about. However, for ultimate performance in Leadwerks 5 we treat bones differently. Each model can have a skeleton made up of bones. Skeletons can be unique for each model, or shared between models. Animation occurs on a skeleton, not a model. When a skeleton is animated, every model that uses that skeleton will display the same motion. Skeleton animations are performed on one or more separate threads, and the skeleton bones are able to skip a lot of overhead that would be required if they were entities, like updating the scene octree. This system allows for thousands of animated characters to be displayed with basically no impact on framerate:
    This is not some special case I created that isn't practical for real use. This is the default animation system, and you can always rely on it working this fast.
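    As a rough illustration of the idea, here is a minimal sketch of how one skeleton can drive any number of models. The class names are hypothetical, not the actual Leadwerks 5 API:

```cpp
#include <cassert>
#include <memory>

// Illustrative sketch of the shared-skeleton design: animation advances the
// skeleton once (potentially on a worker thread), and every model
// referencing that skeleton shows the same pose for free.
struct Skeleton {
    int frame = 0;
    void Update() { ++frame; } // stand-in for evaluating the animation
};

struct Model {
    std::shared_ptr<Skeleton> skeleton;
    int Pose() const { return skeleton->frame; } // pose comes from the skeleton
};
```

    Because the pose lives on the skeleton rather than on each model, a crowd of identical characters costs one animation update instead of hundreds.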
    There is one small issue that this design neglects, however, and that is the simple situation where we want to attach a weapon or other item to an animated character, that isn't built into that model. In Leadwerks 4 we would just parent the model to the bone, but we said bones are no longer entities so this won't work. So how do we do this? The answer is with bone attachments:
    auto gun = LoadModel(world, "Models/gun.gltf");
    gun->Attach(player, "R_Hand");

    Every time the model animation is updated, the gun orientation will be forced to match the bone we specified. If we want to adjust the orientation of the gun relative to the bone, then we can use a pivot:
    auto gun = LoadModel(world, "Models/gun.gltf");
    auto gunholder = CreatePivot(world);
    gunholder->Attach(player, "R_Hand");
    gun->SetParent(gunholder);
    gun->SetRotation(0, 180, 0);

    Attaching an entity to a bone does not take ownership of the entity, so you need to maintain a handle to it until you want it to be deleted. One way to do this is to just parent the entity to the model you are attaching to:
    auto gun = LoadModel(world, "Models/gun.gltf");
    gun->Attach(player, "R_Hand");
    gun->SetParent(player);
    gun = nullptr; //player model keeps the gun in scope

    In this example I created some spheres to indicate where the model bones are, and attached them to the model bones:
    Bone attachments involve three separate objects: the entity, the model, and the bone. This would be totally impossible to do without weak pointers in C++.
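    Here is a minimal sketch of how such an attachment might be modeled with std::weak_ptr; the class names are illustrative, not the actual Leadwerks 5 types. The bone's weak references never take ownership, so an attached entity disappears as soon as its last owner releases it:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Hypothetical sketch: Entity and Bone here are illustrative stand-ins.
struct Entity { std::string name; };

struct Bone {
    std::string name;
    // Weak references: attaching never extends an entity's lifetime.
    std::vector<std::weak_ptr<Entity>> attachments;
};

// Copy the bone's transform to every attached entity that is still alive,
// and prune the ones whose owners have released them. Returns the number
// of entities that were updated.
inline int UpdateAttachments(Bone& bone) {
    int updated = 0;
    auto it = bone.attachments.begin();
    while (it != bone.attachments.end()) {
        if (auto entity = it->lock()) {
            // ... copy the bone orientation to the entity here ...
            ++updated;
            ++it;
        } else {
            it = bone.attachments.erase(it); // owner let the entity expire
        }
    }
    return updated;
}
```

    The lock() call either produces a temporary shared_ptr (the entity is alive and safe to update) or fails, in which case the stale entry is removed.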
  2. Josh
    Wow, two updates in two days! This one adds new features and bug fixes.
     
    Fixes
    http://www.leadwerks.com/werkspace/topic/10019-directional-lights-dont-cast-shadows/
    http://www.leadwerks.com/werkspace/topic/10030-values-in-appearance-tab-ignored/
    http://www.leadwerks.com/werkspace/topic/10020-values-in-entities-general-tab-stuck/
    http://www.leadwerks.com/werkspace/topic/8138-pointlight-viewrange-bug
    http://www.leadwerks.com/werkspace/topic/9919-wobbly-fbx-imports/
     
    New Features
    Global skybox setting! Select the scene root, and choose the skybox you want from the properties editor. You can select either a texture or a material. The new command Camera::SetSkybox will do this in code.
    Export VMF files! You can now use Leadwerks Editor to make maps for Left 4 Dead or other games, if you are so inclined. Texture mapping planes are perfectly preserved in the export:

     

     
    I hope to get a few more bugs fixed and get the Linux build running with Steam before the release of the Leadwerks Workshop.
  3. Josh
    I feel like I've resolved the design challenges I was facing in my last blog. A menu item is always checked, indicating which type of object will be created when the user clicks in a viewport. There are two types of objects that can be created: brushes and entities. Brushes are constructive solid geometry objects that can be sketched out to quickly design a building. Entities are created at a point in space, and include lights, sounds, and particle emitters. As I have described it, this is pretty much identical to the 3D World Studio workflow.
     
    Models in the Asset Browser are another thing. This is where the Leadwerks3D editor is very different from 3D World Studio. Models can be dragged into the scene from the Asset Browser, in either the 3D or 2D viewports.
     
    In the 3D viewport, we will have position/rotation/scale control similar to that found in Leadwerks Editor, the editor for Leadwerks Engine 2. We will improve this with a button to toggle between global and local space. Since 3D World Studio has no 3D editing mode, this can be implemented without interfering with our powerful classic CSG workflow.
     

     
    I think the design I have laid out here will be simple enough to figure out without reading a manual, require the fewest number of mouse clicks possible, and most of all feel natural and fun to use, a quality that I think modern modeling applications have largely lost.
  4. Josh
    I've now got the Vulkan renderer drawing multiple different models in one single pass. This is done by merging all mesh geometry into a single vertex and index buffer and using indirect drawing. I implemented this originally in OpenGL and was able to translate the technique over to Vulkan. This can allow an entire scene to be drawn in just one or a few draw calls. This will make a tremendous improvement in performance in complex scenes like The Zone. In that scene in Leadwerks the slow step is the rendering routine on the CPU churning through thousands of OpenGL commands, and this design effectively eliminates that entire bottleneck.

    There is no depth buffer in use in the above image, so some triangles appear on top of others they are behind.
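    The batching step described above can be sketched in plain C++. This is an illustrative example, not engine code; the command struct mirrors the layout of Vulkan's VkDrawIndexedIndirectCommand, which is what vkCmdDrawIndexedIndirect consumes:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Matches the field layout of VkDrawIndexedIndirectCommand.
struct DrawIndexedIndirectCommand {
    uint32_t indexCount;
    uint32_t instanceCount;
    uint32_t firstIndex;
    int32_t  vertexOffset;
    uint32_t firstInstance;
};

struct Mesh {
    std::vector<float>    vertices; // illustrative: 3 floats per vertex
    std::vector<uint32_t> indices;
};

// All meshes appended into one shared vertex/index buffer, plus one
// indirect command per mesh so the whole batch can be submitted with a
// single indirect draw call.
struct MergedGeometry {
    std::vector<float>    vertices;
    std::vector<uint32_t> indices;
    std::vector<DrawIndexedIndirectCommand> commands;
};

inline MergedGeometry MergeMeshes(const std::vector<Mesh>& meshes) {
    MergedGeometry out;
    for (const auto& mesh : meshes) {
        DrawIndexedIndirectCommand cmd{};
        cmd.indexCount    = static_cast<uint32_t>(mesh.indices.size());
        cmd.instanceCount = 1;
        // Offsets record where this mesh begins in the shared buffers.
        cmd.firstIndex    = static_cast<uint32_t>(out.indices.size());
        cmd.vertexOffset  = static_cast<int32_t>(out.vertices.size() / 3);
        cmd.firstInstance = 0;
        out.commands.push_back(cmd);
        out.vertices.insert(out.vertices.end(), mesh.vertices.begin(), mesh.vertices.end());
        out.indices.insert(out.indices.end(), mesh.indices.begin(), mesh.indices.end());
    }
    return out;
}
```

    On the GPU side, the commands array would be uploaded to a buffer and consumed by one indirect draw, which is how the per-object CPU overhead disappears.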
    Vulkan provides a lot of control when transferring memory into VRAM, and as a result we saw an 80% performance improvement over OpenGL in our first performance comparison. I have set up a system that uses staging buffers to transfer bits of memory from the CPU into shared memory buffers on the GPU. Another interesting capability is the ability to transfer multiple chunks of data between buffers in just one command.
    However, that control comes at a cost of complexity. At the moment, the above code works fine on Intel graphics but crashes on my discrete Nvidia card. This makes sense because of the way Vulkan handles memory. You have to explicitly synchronize memory yourself using a pipeline barrier. Since Intel graphics just uses system memory I don't think it will have any problems with memory synchronization like a discrete card will.
    That will be the next step, and it is really a complex topic, but my usage of it will be limited, so I think in the end my own code will turn out to be pretty simple. I expect Vulkan 2.0 will introduce a lot of simplified paths that will become the default, because this stuff is really just too hard for both beginners and experts. There's no reason for memory not to be synced automatically; you're just playing with fire otherwise.
  5. Josh
    An update has been posted with the following fixes:
    http://www.leadwerks.com/werkspace/topic/7847-undo-does-not-refresh-gizmos/
    http://www.leadwerks.com/werkspace/topic/7842-bsp-mesh-export/
  6. Josh
    A new update is available on the beta branch which fixes two bugs.
     
    Undo not working correctly with copied objects:
    http://www.leadwerks.com/werkspace/topic/11888-undo-after-ctrl-duplicate-is-messed-up/
     
    Water plane movement and culling problem:
    http://www.leadwerks.com/werkspace/topic/11895-water-plane-doesnt-cover-all-terrain/
  7. Josh
    There were numerous small issues related to the 2.32 release, but I am chipping away at them:
    http://leadwerks.com/werkspace/index.php?app=tracker&showproject=1&catfilter=2
     
    I'm not going to add any new features until every bug is fixed.
     
    Remember that the more complete and easy to reproduce a report is, the faster it will be to fix. Thanks for the feedback!
  8. Josh
    Several people have asked me for my hardware recommendations for a new Linux gaming machine. I get rather frustrated by PC manufacturers who load computers up with expensive Intel CPUs and extra RAM you don't need, and then forget to include a graphics card. Or they proclaim PC gaming is not dead and proudly announce a "gaming machine" with a shoddy GPU that costs more than my first car. I've been building my own PCs since high school, and I know you can build a PC with superior performance at a lower price than the garbage PC vendors are selling today. So I am writing this article to tell you what I would buy if I wanted to build a high-quality, future-proof gaming PC with big performance on a small budget.
     
    These components were chosen to give great performance at an affordable price, with maximum upgradability. I expect this computer would run any PC game today with maximum settings at 1920x1080 resolution. Components were chosen specifically for gaming, thus more importance is placed on the GPU and the CPU is relatively more lightweight. If you do not want to assemble the components yourself, any computer shop will do it for less than $100. I only chose components from reputable brands I have purchased hardware from before, no cheap brands or refurbished parts.
     
    For Linux gaming, I recommend buying AMD CPUs. The integrated graphics chips in Intel CPUs may cause problems with detection of a discrete graphics card and make it very difficult to install graphics drivers. AMD CPUs also give a better price/performance ratio with somewhat slower single-threaded speeds at a much lower price. For gaming, the single-threaded CPU speed isn't actually that important since the intensive tasks like rendering and occlusion culling are typically offloaded onto the GPU, and modern game engines like Leadwerks make extensive use of multi-threading.
     
    Disclaimer: I have not built a machine with these exact components. I am not responsible if it doesn't work, do so at your own risk, blah, blah, blah. Now on to the parts...
     
    Motherboard
    Asus M5A78L-M LX3 Desktop Motherboard (Socket AM3+):
    http://www.newegg.com/Product/Product.aspx?Item=N82E16813131935
    Price: $44.99
     
    CPU
    AMD FX-4300
    http://www.newegg.com/Product/Product.aspx?Item=N82E16819113287
    Price: $109.99
     
    RAM
    Corsair 4GB DDR3
    http://www.newegg.com/Product/Product.aspx?Item=N82E16820233349
    Price: $39.99
     
    Graphics Card
    MSI Nvidia GeForce 650
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814127703
    Price: $94.99
     
    Case
    Cooler Master Elite 350 with 500w power supply
    http://www.newegg.com/Product/Product.aspx?Item=N82E16811119269
    Price: $59.99
     
    Hard drive
    Seagate Barracuda 500GB
    http://www.newegg.com/Product/Product.aspx?Item=N82E16822148767
    Price: $59.99
     
    Optical drive
    ASUS DVD Writer
    http://www.newegg.com/Product/Product.aspx?Item=N82E16827135305
    Price: $19.99
     
    Total cost: $429.93
     
    Other cost-cutting tips
    If you have any old computer, chances are you can reuse the hard drive, memory, optical drive, and even the case. Eliminating these items would bring the cost down to a ridiculously affordable $249.97.
    I could have got the price down to about $350 if I used really cheap parts, but I don't recommend doing this.
    TigerDirect.com may have some prices even lower than NewEgg.com, but I find their pricing information to be confusing. I don't ever consider manufacturer rebates, since you have no guarantee you will ever actually receive a check.

     
    Improving performance:
    If you want better performance, invest in the graphics card. When shopping for graphics cards, just look at the number of CUDA cores. This will give you a pretty linear estimate of performance. (On ATI cards, divide the number of "stream processors" by five and that is roughly equivalent. But I don't recommend buying an ATI card for Linux gaming at this time.) My next choice up would be the Nvidia GeForce 670, which provides 1344 CUDA cores versus the 650's 384, at a price of $299.99:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16814121707
    If I were to upgrade the CPU, I would get the FX-8320, which only costs $159.99 and gives the best price/performance ratio, according to cpubenchmark.net:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16819113285
    An SSD will significantly improve game load times, and will generally make your computer feel much snappier whenever files are accessed. However, the prices are still high compared to hard drives, like this 256 GB PNY SSD for $179.99:
    http://www.newegg.com/Product/Product.aspx?Item=N82E16820178456

     
    Conclusion
    The companies selling PCs don't seem to have a clue about gaming hardware requirements. Fortunately, you can build your own gaming rig and get better performance for much less money than anything they are selling. A good gaming PC doesn't need to be expensive. My recommended build here costs less than $430, and can be less than $250 if you have any spare parts lying around or an old computer you can cannibalize.
     
    The upgradability of the PC means future upgrades can cost very little and add significant performance to your hardware. My suggestions for improving performance would raise the price by about $365, giving you a mid-high machine for $795. I don't recommend buying the very highest-end parts because the price/performance balance just isn't there, and a year from now whatever you bought will be second-best anyways.
     
    When you're done building your killer Linux gaming machine, remember to take the money you didn't spend on Windows and use it to buy some great Linux games.
  9. Josh
    As the maker of a game engine primarily aimed at Indie game developers, I have the opportunity to observe behavior of many individuals working together. This allows me to observe interactions from a perspective the individual participants in the system sometimes can't see.
     
    There are many steps in game development that can be performed by people with different skill sets. The workflow for Leadwerks is designed on this premise. At its simplest, Leadwerks can be used as a visual design tool, with mapping tools and the flowgraph editor for game logic. Beneath this lies the entity script layer, and beneath that is the C++ API at its core.
     
    Ultimately, my goal is to facilitate cooperation between different users who occupy different niches in the content production pipeline. Mappers, programmers, designers, and artists should be able to benefit from one another's work without entering into structured agreements. Ideally, they will all just churn away at what they are good at, like the cells in a human body keeping the organism healthy and productive.
     
    Taking a very broad view of the matter, you really need three things to achieve that vision:
    Free sharing of items within the community.
    Protection of intellectual property rights.
    Tracking of the chain-of-authorship and support for derivative works.

     
    This is challenging because some of those goals may seem to contradict others. For example, if I upload a model on Turbosquid I certainly don't want someone else turning around and reselling it as their own work, because that violates my intellectual property rights. On the other hand, if we don't allow frictionless exchange of content, individuals further up the value chain never get a chance to add their valuable contributions to the ecosystem.
     
    That's why I'm trying a new design with this system, one that has never been done before. In my next update I will talk in more detail about my design and how this system will change the way content is produced.
  10. Josh
    Previously, I described the goals and philosophy that were guiding my design of our implementation of the Leadwerks Workshop on Steam. To review, the goals were:
    1. Frictionless sharing of items within the community.
    2. Protection of intellectual property rights.
    3. Tracking of the chain-of-authorship and support for derivative works.
     
    In this update I will talk more specifically about how our implementation meets these goals.
     
    Our implementation of the Steam Workshop allows Leadwerks developers to publish game assets directly to Steam. A Workshop item is typically a pack of similar files, like a model or texture pack, rather than single files:

     
    To add an item to Leadwerks, simply hit the "Subscribe" button in Steam and the item will become available in a new section of the asset browser:

     
    You can drag Workshop files into your scene and use them, just like a regular file. However, the user never needs to worry about managing these files; all subscribed items are available in the editor, no matter what project you are working on. When a file is used in a map or applied to a model, a unique global ID for that file is saved, rather than a file path. This allows the item author to continue updating and improving the file, without users ever having to re-download files, extract zip archives, or deal with any other mess. Effectively, we are bringing the convenience of Steam's updating system to our community, so that you can work together more effectively. Here's one of the tutorial maps using materials from a sci-fi texture pack from the Workshop. When the map is saved, the unique file IDs are stored so I can share the map with others.
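    Conceptually, the lookup works like a registry that maps a stable file ID to whatever local path the content currently lives at. This is only an illustrative sketch of the idea, not the actual Leadwerks implementation:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <string>
#include <unordered_map>

// Hypothetical sketch: maps store the stable Workshop file ID, and the
// local path is resolved at load time, so authors can update or rename
// files without breaking maps that reference them.
class WorkshopRegistry {
public:
    void Register(uint64_t fileid, const std::string& localpath) {
        paths[fileid] = localpath;
    }
    std::optional<std::string> Resolve(uint64_t fileid) const {
        auto it = paths.find(fileid);
        if (it == paths.end()) return std::nullopt;
        return it->second;
    }
private:
    std::unordered_map<uint64_t, std::string> paths;
};
```

    Because a map stores only the ID, an author pushing an update simply re-registers the same ID against the new content, and every downstream map picks it up automatically.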

     
    Publishing your own Workshop packages is easy. A built-in dialog allows you to set a title, description, and a preview image. You can add additional images and even videos to your item in Steam:

     
    Leadwerks even has support for derivative works. You can create a model, prefab, or map that uses another Workshop file and publish it to Steam. Since Leadwerks tracks the original ID of any Workshop items you used, they will always be pulled from the original source. This allows an entirely new level of content authors to add value to items downstream from their origin, in a way similar to how Linux distributions have grown and evolved. For example, maybe you don't have the artistic skill to make every single texture you need for a house, but you can put together a pretty nice house model and paint it with another user's textures. You can then upload that model right back to the Workshop, without "ripping off" the texture artist; their original package will still be needed to load the textures. It's perfectly fine to change the name of your Workshop package at any time, and you never need to worry about your file names conflicting with files in other packages. (If you decide you want to change a lot of file names, it's best to just create a new package so that you don't interrupt the work of users "downstream" from you.)
     
    Uninstalling a Workshop package just requires you to hit the "unsubscribe" button on the item's page in the Steam Workshop. No more hunting around for stray zip files! You can easily check out other users' work, use whatever you like, and unsubscribe from the packages you don't like, with no mess at all.
     
    How Do I Get It?
    The Leadwerks Workshop beta begins today. You must be a member of the Leadwerks Developer group on Steam to access the Workshop. A limited number of beta invites are being sent out. Once the system is completely polished, we will make it available to the entire Leadwerks community.
  11. Josh
    Following my previous update about correcting file path cases, I am now able to load all our maps in Leadwerks. The power of this tool running natively on Linux is starting to show:
     

     
    The next step is to implement a file system watcher to detect file changes. Leadwerks automatically monitors the current project directory, and will reload assets whenever a file changes. This allows you to keep an image open in an image editor like GIMP, and any time you save, your changes will be reflected inside Leadwerks. It's a great workflow for artists and just the kind of feature I wanted to bring to Linux with this project.
     
    Linux has a built-in file system watcher API called "inotify". Interestingly, this API was added to Linux in 2005, well after the iPod was released, but there appears to be no connection. The "i" in "inotify" stands for "inode". Dennis Ritchie explains:
     
    In truth, I don't know either. It was just a term that we started to use. "Index" is my best guess, because of the slightly unusual file system structure that stored the access information of files as a flat array on the disk, with all the hierarchical directory information living aside from this. Thus the i-number is an index in this array, the i-node is the selected element of the array. (The "i-" notation was used in the 1st edition manual; its hyphen was gradually dropped.)
     
    The inotify system is pretty straightforward to implement, with a procedural C interface. One thing that tripped me up was the odd layout of the inotify_event structure. It actually has a variable-length char array built into the end of the structure, so the structure does not have a fixed size. I don't believe I have ever encountered this design before, but I am also usually dealing with std::string classes.
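    Here is a sketch of how such a buffer can be walked; the helper names are mine, but the stride logic follows the inotify(7) man page: each record occupies sizeof(inotify_event) + len bytes, so you cannot index the buffer like a plain array.

```cpp
#include <cassert>
#include <cstddef>
#include <cstring>
#include <string>
#include <vector>
#include <sys/inotify.h> // Linux-only

// A plain C++ view of one event, with the name copied out of the
// variable-length tail of the kernel structure.
struct WatchEvent {
    int wd = 0;
    uint32_t mask = 0;
    std::string name;
};

// Walk the raw buffer returned by read() on an inotify descriptor.
// The stride of each record is sizeof(inotify_event) plus the length of
// the (NUL-padded) name that follows it.
inline std::vector<WatchEvent> ParseInotifyBuffer(const char* buffer, size_t length) {
    std::vector<WatchEvent> events;
    size_t offset = 0;
    while (offset + sizeof(inotify_event) <= length) {
        auto* ev = reinterpret_cast<const inotify_event*>(buffer + offset);
        WatchEvent out;
        out.wd = ev->wd;
        out.mask = ev->mask;
        if (ev->len > 0) out.name = ev->name; // NUL-terminated
        events.push_back(out);
        offset += sizeof(inotify_event) + ev->len;
    }
    return events;
}
```

    In a real watcher the buffer comes straight from read(); the parsing is identical.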
     
    One drawback of inotify is that it isn't recursive. While searching for information on the design, I came across this post by Robert Love, one of the two guys who wrote it (the other being John McCutchan). I disagree with his rationale for omitting recursion; just because the performance is not as optimal as he would like does not mean the end user's tastes and preferences will change. I can't imagine a scenario in which you wouldn't want the system to work recursively, or at least have the option to. In any case, implementing a recursive watch system was fairly easy. The entire file watch system from zero to finished only took about half a day. So we can mark this as one of the features where I overestimated the time and complexity of implementation.
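    A recursive watch can be sketched as one inotify_add_watch call per directory in the tree. This is illustrative code, not the Leadwerks implementation; a complete watcher would also add a watch whenever an IN_CREATE | IN_ISDIR event reports a new subdirectory.

```cpp
#include <cassert>
#include <cstdint>
#include <filesystem>
#include <map>
#include <string>
#include <sys/inotify.h> // Linux-only

namespace fs = std::filesystem;

// Since inotify only watches a single directory, a "recursive" watch is
// built by registering one watch per subdirectory. The returned map lets
// event handlers translate a watch descriptor back into a directory path.
inline std::map<int, std::string> WatchTree(int fd, const std::string& root) {
    std::map<int, std::string> watches; // watch descriptor -> directory path
    uint32_t mask = IN_CREATE | IN_DELETE | IN_MODIFY | IN_MOVED_FROM | IN_MOVED_TO;
    int wd = inotify_add_watch(fd, root.c_str(), mask);
    if (wd >= 0) watches[wd] = root;
    for (const auto& entry : fs::recursive_directory_iterator(root)) {
        if (!entry.is_directory()) continue;
        wd = inotify_add_watch(fd, entry.path().c_str(), mask);
        if (wd >= 0) watches[wd] = entry.path().string();
    }
    return watches;
}
```

    The wd-to-path map matters because inotify events only report the file name relative to the watched directory, not the full path.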
     
    File creation, deletion, and modify events are now firing reliably so I am going to plug this code into the editor and start testing it out. Because the editor already works reliably on Windows and OS X, I don't anticipate any problems. I do not have a firm release date yet, but as you can surmise, we are nearing completion of Leadwerks for Linux.
  12. Josh
    HDR skyboxes are important in PBR rendering because sky reflections get dampened by surface color. It's not really for the sky itself; rather, we don't want bright reflections to get clamped and washed out, so we need colors that go beyond the displayable range of 0-1.
    Polyhaven has a large collection of free photo-based HDR environments, but they are all stored in EXR format as sphere maps. What we want are cubemaps stored in a single DDS file, preferably using texture compression.
    We're going to use this online tool to convert the sphere maps to cubemaps. There are other ways to do this, and some may be better, but this is what I was able to find right away. By default, the page shows a background we can use for testing:

    Let's process and save the image. Since we are just testing, it's fine to use a low resolution for now:

    The layout is very important. We will select the last option, which exports each face as a separate image:

    Processing and saving the image will result in a zip file downloaded to your computer, with six files:
    px.hdr
    nx.hdr
    py.hdr
    ny.hdr
    pz.hdr
    nz.hdr

    These images correspond to the positive and negative directions on the X, Y, and Z axes.
    We will start by just converting a single image, px.hdr. We will convert this pixmap to the format TEXTURE_BC6H, which corresponds to VK_FORMAT_BC6H_UFLOAT_BLOCK in Vulkan, or DXGI_FORMAT_BC6H_UF16 in the DDS file format. We need to load the FreeImage plugin for HDR file import, and the ISPC plugin for fast BC6H compression. The R16G16B16A16_SFLOAT is an intermediate format we need to convert the 32-bit floating point HDR file to, because the ISPC compressor expects this as the input format for BC6H compression:
    auto plg = LoadPlugin("Plugins/FITextureLoader");
    auto plg2 = LoadPlugin("Plugins/ISPCTexComp");
    auto pixmap = LoadPixmap(GetPath(PATH_DESKTOP) + "/px.hdr");
    pixmap = pixmap->Convert(TextureFormat(VK_FORMAT_R16G16B16A16_SFLOAT));
    pixmap = pixmap->Convert(TEXTURE_BC6H);
    pixmap->Save(GetPath(PATH_DESKTOP) + "/px.dds");

    When we open the resulting DDS file in Visual Studio it appears quite dark, but I think this is due to how the HDR image is stored. I'm guessing colors are being scaled to a range between zero and one in the HDR file:

    Now that we know how to convert and save a single 2D texture, let's try saving a more complex cube map texture with mipmaps. To do this, we need to pass an array of pixmaps to the SaveTexture function, containing all the mipmaps in the DDS file, in the correct order. Microsoft's documentation on the DDS file format is very helpful here.
    std::vector<std::shared_ptr<Pixmap> > mipchain;
    WString files[6] = { "px.hdr", "nx.hdr", "py.hdr", "ny.hdr", "pz.hdr", "nz.hdr" };
    for (int n = 0; n < 6; ++n)
    {
        auto pixmap = LoadPixmap(GetPath(PATH_DESKTOP) + "/" + files[n]);
        pixmap = pixmap->Convert(TextureFormat(VK_FORMAT_R16G16B16A16_SFLOAT));
        Assert(pixmap);
        mipchain.push_back(pixmap->Convert(TEXTURE_BC6H));
        while (true)
        {
            auto size = pixmap->size;
            size /= 2;
            pixmap = pixmap->Resize(size.x, size.y);
            Assert(pixmap);
            mipchain.push_back(pixmap->Convert(TEXTURE_BC6H));
            if (size.x == 4 and size.y == 4) break;
        }
    }
    SaveTexture(GetPath(PATH_DESKTOP) + "/skybox.dds", TEXTURE_CUBE, mipchain, 6);

    When we open the resulting DDS file in Visual Studio we can select different cube map faces and mipmap levels, and see that things generally look like we expect them to:

    Now is a good time to load our cube map up in the engine and make sure it does what we think it does:
    auto skybox = LoadTexture(GetPath(PATH_DESKTOP) + "/skybox.dds");
    Assert(skybox);
    world->SetSkybox(skybox);

    When we run the program we can see the skybox appearing. It still appears very dark in the engine:

    We can change the sky color to lighten it up, and the image comes alive. We don't have any problems with the dark colors getting washed out, because this is an HDR image:
    world->SetSkyColor(4.0);
    Now that we have determined that our pipeline works, let's try converting a sky image at high resolution. I will use this image from Polyhaven because it has a nice interesting sky. However, there's one problem. Polyhaven stores the image in EXR format, and the cube map generator I am using only loads HDR files. I used this online converter to convert my EXR to HDR format, but there are probably lots of other tools that can do the job.
    Once we have the new skybox loaded in the cubemap generator, we will again process and save it, but this time at full resolution:

    Save the file like before and run the DDS creation code. If you are running in debug mode, it will take a lot longer to process the image this time, but when it's finished the results will be worthwhile:

    This will be very interesting to view in VR. Our final DDS file is 120 MB, much smaller than the 768 MB an uncompressed R16G16B16A16 image would require. This gives us better performance and requires a lot less hard drive space. When we open the file in Visual Studio we can browse through the different faces and mipmap levels, and everything seems to be in order:
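    A quick back-of-the-envelope check of those numbers: BC6H stores each 4x4 block in 16 bytes, or 1 byte per texel, versus 8 bytes per texel for uncompressed R16G16B16A16, so compression buys roughly a factor of eight per mip level. The helper below is just illustrative arithmetic, assuming a square cubemap face and a mip chain that stops at 4x4 like the code above:

```cpp
#include <cassert>
#include <cstdint>

// Total bytes for a six-face cubemap with mip levels from facesize down to
// minmip (inclusive), at a given number of bytes per texel.
// BC6H is 1 byte per texel; uncompressed R16G16B16A16 is 8.
inline uint64_t CubemapBytes(uint32_t facesize, uint32_t bytesperpixel, uint32_t minmip = 4) {
    uint64_t total = 0;
    for (uint32_t size = facesize; size >= minmip; size /= 2) {
        total += uint64_t(size) * size * bytesperpixel * 6; // six faces
    }
    return total;
}
```

    For a 4096-pixel face, the uncompressed base level alone comes to exactly 768 MB, which matches the figure quoted above, and the full BC6H mip chain lands in the same ballpark as the 120 MB file.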

    Undoubtedly this process will get easier and more automated with visual tools, but it's important to get the technical basis laid down first. With complete and in-depth support for the DDS file format, we can rely on a widely supported format for textures, instead of black-box file formats that lock your game content away.
  13. Josh
    One of my main goals in 2016 is to build a built-in store for Leadwerks users to buy and sell 3D models and other items through Steam. It took me a while to understand how this works:
    The client (Leadwerks Editor) sends a message to the Leadwerks.com server requesting that a purchase be made.
    The Leadwerks.com server sends a message to the Steam server requesting that a purchase be initialized.
    Steam displays a dialog to confirm the transaction in Leadwerks Editor.
    The Steam server returns the result of the transaction to the Leadwerks.com server.
    If the transaction was successful, the Leadwerks server stores a record of the transaction in an SQL database on the server.
    The editor receives a message confirming the transaction and downloads the purchased item.

     
    It's sort of complicated, and it's not really something I am used to dealing with. The first step is to restore our SSL certificate, which I let lapse because we aren't using it for anything anymore. I have a web developer I talked to a few months ago about this project, and put it on hold because of the holidays, but will contact him again soon.
  14. Josh
    The Leadwerks 4 renderer was built for maximum flexibility. The Leadwerks 5 renderer is being built first and foremost for great graphics with maximum speed. This is the fundamental difference between the two designs. VR is the main driving force for this direction, but all games will benefit.
    Multithreaded Design
    Leadwerks 4 does make use of multithreading in some places, but it is fairly simplistic. In Leadwerks 5 the entire architecture is based around separate threads, which is challenging but a lot of fun for me to develop. I worked out a way to create a command buffer on the main thread that stores a list of commands for the rendering thread to perform during the next rendering frame. (Thanks for the tip on lambda functions, @Crazycarpet.) Each object in the main thread has a simplified object it is associated with that lives in the rendering thread. For example, each Camera has a RenderCamera object that corresponds to it. Here's how changes in the main thread get added to a command buffer to be executed when the rendering thread is ready:
    void Camera::SetClearColor(const float r, const float g, const float b, const float a)
    {
        clearcolor.x = r;
        clearcolor.y = g;
        clearcolor.z = b;
        clearcolor.w = a;
    #ifdef LEADWERKS_5
        GameEngine::cullingthreadcommandbuffer.push_back(
            [rendercamera = this->rendercamera, clearcolor = this->clearcolor]()
            {
                rendercamera->clearcolor = clearcolor;
            }
        );
    #endif
    }

    The World::Render() command is still there for conceptual consistency, but what it really does is add all the accumulated commands onto a stack of command buffers for the rendering thread to evaluate whenever it's ready:
    void World::Render(shared_ptr<Buffer> buffer)
    {
        //Add render call onto command buffer
        GameEngine::cullingthreadcommandbuffer.push_back(std::bind(&RenderWorld::AddToRenderQueue, this->renderworld));

        //Copy command buffer onto culling command buffer stack
        GameEngine::CullingThreadCommandBufferMutex->Lock();
        GameEngine::cullingthreadcommandbufferstack.push_back(GameEngine::cullingthreadcommandbuffer);
        GameEngine::CullingThreadCommandBufferMutex->Unlock();

        //Clear the command buffer and start over
        GameEngine::cullingthreadcommandbuffer.clear();
    }

    The rendering thread is running in a loop inside a function that looks something like this:
    shared_ptr<SharedObject> GameEngine::CullingThreadEntryPoint(shared_ptr<SharedObject> o)
    {
        while (true)
        {
            //Get the number of command stacks that are queued
            CullingThreadCommandBufferMutex->Lock();
            int count = cullingthreadcommandbufferstack.size();
            CullingThreadCommandBufferMutex->Unlock();

            //For each command stack
            for (int i = 0; i < count; ++i)
            {
                //For each command
                for (int n = 0; n < cullingthreadcommandbufferstack[i].size(); ++n)
                {
                    //Execute command
                    cullingthreadcommandbufferstack[i][n]();
                }
            }

            //Remove the executed command stacks (more may have been appended since)
            CullingThreadCommandBufferMutex->Lock();
            cullingthreadcommandbufferstack.erase(cullingthreadcommandbufferstack.begin(), cullingthreadcommandbufferstack.begin() + count);
            CullingThreadCommandBufferMutex->Unlock();

            //Render queued worlds
            for (auto it = RenderWorld::renderqueue.begin(); it != RenderWorld::renderqueue.end(); ++it)
            {
                (it->first)->Render(nullptr);
            }
        }
        return nullptr;
    }

    I am trying to design the system for maximum flexibility with the thread speeds so that we can experiment with different frequencies for each stage. This is why the rendering thread goes through and executes all commands in all accumulated command buffers before going on to actually render any queued world. This prevents the rendering thread from rendering an extra frame when another one has already been received (which shouldn't really happen, but we will see).
    As you can see, the previously expensive World::Render() command now does almost nothing before returning to your game loop. I am also going to experiment with running the game loop and the rendering loop at different speeds. So let's say previously your game was running at 60 FPS and 1/3 of that time was spent rendering the world. This left you with about 11 milliseconds to execute your game code, or things would start to slow down. With the new design your game code could have up to 33 milliseconds to execute without compromising the framerate. That means your code could be three times more complex, and you would not have to worry so much about efficiency, since the rendering thread will keep blazing away at a much faster rate.
    The game loop is a lot simpler now, with just the two commands you need to update and render the world. This gives you a chance to adjust some objects after physics and before rendering. A basic Leadwerks 5 program is really simple:
    #include "Leadwerks.h"
    using namespace Leadwerks;

    int main(int argc, const char *argv[])
    {
        auto window = CreateWindow("MyGame");
        auto context = CreateContext(window);
        auto world = CreateWorld();
        auto camera = CreateCamera(world);

        while (true)
        {
            if (window->KeyHit(KEY_ESCAPE) or window->Closed()) return 0;
            world->Update();
            world->Render(context);
        }
    }
    This may cause problems if you try to do something fancy like render a world to a buffer and then use that buffer as a texture in another world. We might lose some flexibility there, and if we do I will prioritize speed over having lots of options.
    Clustered Forward Rendering
    Leadwerks has used a deferred renderer since version 2.1. Version 2.0 was a forward renderer with shadowmaps, and it didn't work very well. At the time, GPUs were not very good at branching logic. If you had an if / else statement, the GPU would perform BOTH branches (including expensive texture lookups) and take the result of the "true" one. To get around this problem, the engine would generate a new version of a shader each time a new combination of lights was onscreen, causing periodic microfreezes whenever a new shader was loaded. In 2.1 we switched to a deferred renderer, which eliminated these problems. Due to increasingly smart graphics hardware and more flexible modern APIs, a new technique called clustered forward rendering is now possible, offering flexibility similar to a deferred renderer with the increased speed of a forward renderer. Here is a nice article that describes the technique:
    http://www.adriancourreges.com/blog/2016/09/09/doom-2016-graphics-study/
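    To make the idea concrete, here is an illustrative sketch (my own, not engine code) of the central bookkeeping in clustered shading: the view frustum is divided into a 3D grid of clusters, each fragment maps its screen position and view-space depth to a cluster index, and shading then loops only over the lights assigned to that cluster. Depth slices are spaced logarithmically, a common choice that keeps clusters roughly uniform in view space.

```cpp
#include <cassert>
#include <cmath>

// Map a fragment's screen position and view-space depth to a cluster index
// in a gridX x gridY x gridZ frustum grid (illustration only).
struct ClusterGrid {
    int gridX, gridY, gridZ;        // cluster counts per axis
    int screenWidth, screenHeight;  // resolution in pixels
    float nearPlane, farPlane;      // view-space depth range

    int ClusterIndex(int px, int py, float viewDepth) const {
        int cx = px * gridX / screenWidth;
        int cy = py * gridY / screenHeight;
        // Logarithmic depth slicing: slice = gridZ * log(z/near) / log(far/near)
        float slice = std::log(viewDepth / nearPlane) / std::log(farPlane / nearPlane);
        int cz = (int)std::floor(slice * gridZ);
        if (cz < 0) cz = 0;
        if (cz >= gridZ) cz = gridZ - 1;
        return cx + cy * gridX + cz * gridX * gridY;
    }
};
```

    A light assignment pass fills a per-cluster light list on the CPU or in a compute shader; the fragment shader then only needs the cluster index computed above to find its lights.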

    This approach is also more scalable. Extra renders to the normal buffer and other details can be skipped for better scaling on integrated graphics and slower hardware. I'm not really targeting slow hardware as a priority, but I wouldn't be surprised if it ran extremely fast on integrated graphics when the settings are turned down. Of course, the system requirements will be the same because we need modern API features to do this.
    I'm still a little foggy on how custom post-processing effects will be implemented. There will definitely be more standard features built into the renderer. For example, SSR will be mixed with probe reflections and a quality setting (off, static, dynamic) will determine how much processing power is used for reflections. If improved performance and integration comes at the cost of reduced flexibility in the post-process shaders, I will choose that option, but so far I don't foresee any problems.
    Vulkan Graphics
    The new renderer is being developed with OpenGL 4.1 so that I can make a more gradual progression, but I am very interested in moving to Vulkan once I have the OpenGL build worked out. Valve made an agreement with the developers of MoltenVK to release the SDK for free. This code translates Vulkan API calls into Apple's Metal API, so you basically have Vulkan running on Mac (sort of). I previously contacted the MoltenVK team about a special license for Leadwerks that would allow you guys to release your games on Mac without buying a MoltenVK license, but we did not reach any agreement and at the time the whole prospect seemed pretty shaky. With Valve supporting this I feel more confident going in this direction. In fact, due to the design of our engine, it would be possible to outsource the task of a Vulkan renderer without granting any source code access or complicating the internals of the engine one bit.
  15. Josh
    My goal with Leadwerks 3 has always been to give the people what they need to make playable games. What they really need. Creative projects are interesting because it's easy to get derailed. You can lose interest in an idea, or you might avoid a critical problem you don't know how to solve, and instead spend time doing fun little things that make you feel like you're making progress when you're not. For example, let's say I am making a dungeon crawler. I have some basic gameplay but I can't figure out how to make my AI do what I want, so I decide I will spend my time painting flags in Photoshop for the 112 factions my game will have. Avoidance kills projects.
     
    It's easy to see this from the outside, but when it's your own work it's much harder to separate and see the progress from an objective perspective. It requires a sense of balance so you can manage the different aspects of a game in tandem and allow them to grow together, never letting one focus become too disproportionately large so that it consumes other critical aspects. Project management.
     
    From my perspective, I try to focus on the most critical problem that holds back games. One huge obstacle we overcame was AI. AI programming used to be an advanced topic only a few people could get to. Now it is easy and everyone can tinker around with it, because we have a standard system it's built on. Same thing goes with level interactions. Implementing simple button-door interactions used to be the stuff only expert programmers could do, but now we have a standard system for it. This opens up the doors for progress and makes my goal of turning players into makers more achievable.
     
    So at the point we're at now, what is needed most? More features (and bug fixes) are very important, but do these things move the ball forward? Do these things improve our ability to get playable games produced? Not really. I will continue to add new stuff, but we should recognize more features aren't really the bottleneck in game production.
     
    If you look through the recent screenshots on Steam, it's pretty obvious what's missing. You have all these cool screenshots of level design, which I love, but the scenes are lifeless and empty. The AI and flowgraph systems are innovative features that make gameplay easy to implement. But something is missing, and I think that is animated characters.
     
    Animated characters are very difficult to produce from scratch. There are large collections of these you can buy online, but they still require hitboxes, scripting, sounds, scaling, and various setup procedures. Although Leadwerks makes the import process as easy as possible, there's some stuff you just can't automate.
     
    Another issue is there actually isn't a lot of usable stuff out there, even the paid items. You have problems with consistency from different artists, target polygon counts, animation quality, etc. It's easy to find stuff that looks good at a glance, but then you find out for some reason or another, it just won't work or is incomplete.
     
    So it seems clear that animated characters are a huge unsolved problem: The demands of producing these are far beyond most indies, and there isn't a real usable source of content they can purchase, when you get into the details of it. I even looked at Mixamo Fuse, and although they try to solve the problem, their characters look cartoonish and their animations are ridiculously expensive. I can get custom animations made for much less than they charge just to use an animation on one single model.
     
    I think art content is part of what Leadwerks should provide, though I have not done this in the past. The prospect of creating every kind of character you could need is daunting. Where to even begin? I think The Game Creators have done a decent job of this with their FPSC model packs, but those took years to build up. This needs to be pursued intelligently, with a plan.
     
    Several years ago I wrote a simple program called Character Shop. I had a set of animations produced, and the program allowed the user to weight meshes to a skeleton and export an animated character. I have no plans at this time to bring this back, but the same basic idea is still valid: build your animations once and use them for all characters:

     
    I've commissioned a set of animations to get started with. These will use a standard human skeleton and have animations that can be used and reused. This can be used by both my own artists working on official content and artists in the Leadwerks community who want to just weight their mesh on a standard skeleton and import the animations in the Leadwerks model editor. Once the animations library is built, the cost of creating additional characters is fairly low.
     
    So the first animated character pack is on the way. The other big need is first-person view weapons, and I will see if these can be handled in the same manner. Fortunately, I don't have to do anything to make these happen, so it's back to coding for me!
  16. Josh
    It's quite a challenge to get something semi-deployable to work on four platforms. I never intended to become an expert on the subject, but here are some of the interesting things I have picked up recently.
     
    Code generation mode in Visual Studio
    By default, Visual Studio builds programs in debug mode that won't run on consumer's machines. Fix it by changing the code generation mode:
    http://www.leadwerks...grams-cant-run/
     
    Static Libraries in Xcode
    Xcode has no "exclude from build" option. If you include a debug and release static library with the same name, the compiler will choose either one or the other, and cause errors in one build. The way around this is to set both to "Optional" in the "Build Phases" project settings.
    https://devforums.ap...170482?tstart=0
     
    Android NDK Path
    To relocate a project, you have to alter the NDK Builder setting "Project_Path". The NDK settings are hidden away pretty well in the Project>Android settings.
     
    And slightly unrelated, but equally maddening...
     
    Polled Input + Notifications on OSX in BlitzMax = death
    http://blitzmax.com/...php?topic=99043
  17. Josh
    The beta branch on Steam is updated with a new build.

    Fixes and Enhancements
    The camera movement options have been made more sane. The range of possible values is a little more practical now. A mouse smoothing option has been added that makes the camera motion in the perspective viewport feel a lot more smooth and natural. I also found a bug that caused the camera move speed to change when a map was loaded. It was an annoyance that was almost unnoticeable but it feels much better now. 
    The editor is also using a different garbage collection mode. I believe this fixes the problems some people were having with memory allocation during some operations.
     
    The Leadwerks Workshop on Steam is almost ready to open. A "Workshop" menu has been added for easier access to this functionality.

    Mapping and Modular Props
    While preparing content for the Workshop, I dug into my old backups disks looking for models and textures we could use. From Leadwerks 2 we got "The Zone", a huge pack of buildings, structures, and props inspired by the S.T.A.L.K.E.R. series: 

     
    Going further back to the days of 3D World Studio, I found a set of industrial props that look a lot like something you would find in Counter Strike Source. With the aid of Macklebee, we imported a bunch of items into Leadwerks 3.1. By adding normal maps, many items are quite usable still today. Here are a few samples:
     

     

     
    My favorite items are the modular model sets, like these chain link fence segments. They are sized and oriented so they will easily line up in your map perfectly along the power-of-two grid lines:
     

     
    These pipe pieces can be dragged into the scene and made into interesting configurations with just a few clicks. They are sized precisely so you don't have to worry about lining them up exactly, you just pop them into the scene and they work:
     

     
    I love this kind of stuff because you can easily drag some pieces into the scene and make something of your own, without a lot of effort. The best is when pieces just line up magically without you really thinking about it. For example, I decided to try building a chain link fence around a large electrical tower. Because the tower was sized precisely, I was able to line up fence pieces that exactly matched the dimensions of the tower base:
     

     
    Getting a chance to play with all these props again reminded me of why I got into game development in the first place. Building your own map is like having an infinite supply of digital Legos that never run out, plus you can run around inside your creation and fight monsters, race cars, or do anything you want. How is that not the most awesome thing ever? I'll be writing some guidelines on modeling modular props so that it's easy for the map designer to use them like this, and in the future I think more of the items we supply will be designed to be used like this.
  18. Josh
    A new easy-to-use networking system is coming soon to Leadwerks Game Engine.  Built on the Enet library, Leadwerks networking provides a fast and easy way to quickly set up multiplayer games.  Each computer in the game is either a server or a client.  The server hosts the game and clients can join and leave the game at will.  On the other hand, when the server leaves the game, the game is over!

    Creating a Client
    You can soon create a client with one command in Leadwerks:
    client = Client:Create() To connect to a server, you need to know the IP address of that computer:
    client:Connect("63.45.178.93")
    To get information from the other computer, we simply update the client and retrieve a message:
    local message = client:Update()
    if message.id == Message.Connected then
        print("Connected to server")
    elseif message.id == Message.Disconnected then
        print("Disconnected from server")
    elseif message.id == Message.Chat then
        print("New chat message: "..message.stream:ReadString())
    end
    You can even send messages, consisting of a simple message ID, a string, or a stream.
    client:Send(Message.Chat,"Hello, how are you today?") There are two optional flags you can use to control the way your messages are sent.  If you specify Message.Ordered, your packets will arrive in the order they were sent (they won't necessarily otherwise).  You can use this for updating the position of an object, so that the most recent information is always used.  The Message.Reliable flag should be used for important messages that you don't want to miss.  UDP packets are not guaranteed to ever arrive at their destination, but messages sent with this flag are.  Just don't use it for everything, since it is slower!
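    To see why the Ordered flag matters for position updates, consider what unordered UDP delivery does to a moving object: a packet that arrives late can overwrite a newer position with an older one. The usual remedy, sketched below with hypothetical names, is a sequence number that lets the receiver discard stale packets (this is an illustration of the concept, not Enet's internals):

```cpp
#include <cassert>
#include <cstdint>

// Drop out-of-date position updates by tagging each packet with a sequence
// number and ignoring anything older than the newest update already applied.
struct PositionReceiver {
    uint32_t newestSequence = 0;
    float x = 0, y = 0, z = 0;

    // Returns true if the update was applied, false if it was stale.
    bool Receive(uint32_t sequence, float px, float py, float pz) {
        if (sequence <= newestSequence) return false; // stale packet, discard
        newestSequence = sequence;
        x = px; y = py; z = pz;
        return true;
    }
};
```

    Reliable delivery, by contrast, requires acknowledgements and retransmission, which is why it costs more and should be reserved for messages that genuinely cannot be missed.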
    When we're ready to leave the game, we can do that just as easily:
    client:Disconnect() A dedicated server does not have anyone playing the game.  The whole computer is used only for processing physics and sending and receiving information.  You can create a dedicated server, but it's better to let your players host their own games.  That way there's always a game to join, and you don't have to buy an extra computer and keep it running all the time.
    Creating a Server
    Your game should be able to run both as a client and as a server, so any player can host or join a game.  Creating the game server is just as easy.
    local server = Server:Create(port) Once the server is created, you can look up your IP address and ask a friend to join your game.  They would then type the IP address into their game and join.
    The server can send and receive messages, too.  Because the server can be connected to multiple clients, it must specify which client to send the message to.  Fortunately, the Message structure contains the Peer we received a message from.  A peer just means "someone else's computer".  If your computer is the client, the server you connect to is a peer.  If your computer is the server, all the other clients are peers:
    local message = server:Update()
    if message.id == Message.Connected then
        player2 = message.peer
    end
    You can use the peer object to send a message back to that computer:
    server:Send(peer, Message.Chat, "I am doing just great! Thanks for asking.") If you want to boot a player out of your game, that's easy too:
    server:Disconnect(peer) The broadcast command can be used to send the same message out to all clients:
    server:Broadcast(Message.Chat, "I hope you are all having a great time in my cool chat program!") Public Games
    You can make your game public, allowing anyone else in the world who has the game to play with you.  You specify a name for your game, a description of your server, and call this command to send a message to the Leadwerks server:
    server:Publish("SuperChat","My SuperChat Server of Fun") All client machines, anywhere in the world, can retrieve a list of public games and choose one to join:
    for n = 0, client:CountServers("SuperChat") - 1 do
        local remotegame = client:GetServer(n)
        print(remotegame.address)
        print(remotegame.description)
    end
    This is a lot easier than trying to type in one person's IP address.  For added control, you can even host a games database on your own server, and redirect your game to get information from there.
  19. Josh
    The new engine features advanced image and texture manipulation commands that allow a much deeper level of control than the mostly automated pipeline in Leadwerks Game Engine 4. This article is a deep dive into the new image and texture system, showing how to load, modify, and save textures in a variety of file formats and compression modes.
    Texture creation has been finalized. Here is the command:
    shared_ptr<Texture> CreateTexture(const TextureType type, const int width, const int height, const TextureFormat format = TEXTURE_RGBA, const std::vector<shared_ptr<Pixmap> > mipchain = {}, const int layers = 1, const TextureFlags = TEXTURE_DEFAULT, const int samples = 0); It seems like once you get to about 6-7 function parameters, it starts to make more sense to fill in a structure and pass that to a function the way Vulkan and DirectX do. Still, I don't want to go down that route unless I have to, and there is little reason to introduce that inconsistency into the API just for a handful of long function syntaxes.
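    For comparison, the struct-based style used by Vulkan and DirectX would look something like this. The names below are hypothetical, purely to illustrate the trade-off being weighed:

```cpp
#include <cassert>
#include <memory>
#include <vector>

struct Pixmap {}; // stand-in for the engine's pixmap class

// Hypothetical descriptor struct gathering the long parameter list into one
// object with sane defaults, so call sites only set the fields they care about.
struct TextureDesc {
    int type = 0;      // e.g. TEXTURE_2D
    int width = 1;
    int height = 1;
    int format = 0;    // e.g. TEXTURE_RGBA
    std::vector<std::shared_ptr<Pixmap>> mipchain;
    int layers = 1;
    int flags = 0;     // e.g. TEXTURE_DEFAULT
    int samples = 0;
};

// A creation function would then take one argument instead of eight:
// shared_ptr<Texture> CreateTexture(const TextureDesc& desc);
```

    The cost of this style is that a single descriptor struct becomes an odd man out in an API that otherwise uses plain parameter lists, which is the inconsistency mentioned above.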
    The type parameter can be TEXTURE_2D, TEXTURE_3D, or TEXTURE_CUBE. The mipchain parameter contains an array of images for all miplevels of the texture.
    We also have a new SaveTexture command which takes texture data and saves it into a file. The engine has built-in support for saving DDS files, and plugins can also provide support for additional file formats. (In Lua we provide a table of pixmaps rather than an STL vector.)
    bool SaveTexture(const std::wstring& filename, const TextureType type, const std::vector<shared_ptr<Pixmap> > mipchain, const int layers = 1, const SaveFlags = SAVE_DEFAULT); This allows us to load a pixmap from any supported file format and resave it as a texture, or as another image file, very easily:
    --Load image
    local pixmap = LoadPixmap("Materials/77684-blocks18c_1.jpg")

    --Convert to RGBA if not already
    if pixmap.format ~= TEXTURE_RGBA then
        pixmap = pixmap:Convert(TEXTURE_RGBA)
    end

    --Save pixmap to texture file
    SaveTexture("OUTPUT.dds", TEXTURE_2D, {pixmap}, 1, SAVE_DEFAULT)
    If we open the saved DDS file in Visual Studio 2019 we see it looks exactly like the original, and is still in uncompressed RGBA format. Notice there are no mipmaps shown for this texture because we only saved the top-level image.

    Adding Mipmaps
    To add mipmaps to the texture file, we can specify SAVE_BUILD_MIPMAPS in the flags parameter of the SaveTexture function.
    SaveTexture("OUTPUT.dds", TEXTURE_2D, {pixmap}, 1, SAVE_BUILD_MIPMAPS) When we do this we can see the different mipmap levels displayed in Visual Studio, and we can verify that they look correct.

    Compressed Textures
    If we want to save a compressed texture, there is a problem. The SAVE_BUILD_MIPMAPS flag won't work with compressed texture formats, because we cannot perform a bilinear sample on compressed image data. (We would have to decompress, interpolate, and then recompress each block, which could lead to slow processing times and visible artifacts.) To save a compressed DDS texture with mipmaps we will need to build the mipmap chain ourselves and compress each pixmap before saving.
    This script will load a JPEG image as a pixmap, generate mipmaps by resizing the image, convert each mipmap to BC1 compression format, and save the entire mip chain as a single DDS texture file:
    --Load image
    local pixmap = LoadPixmap("Materials/77684-blocks18c_1.jpg")

    local mipchain = {}
    table.insert(mipchain, pixmap)

    --Generate mipmaps
    local w = pixmap.size.x
    local h = pixmap.size.y
    local mipmap = pixmap
    while (w > 1 and h > 1) do
        w = math.max(1, w / 2)
        h = math.max(1, h / 2)
        mipmap = mipmap:Resize(w, h)
        table.insert(mipchain, mipmap)
    end

    --Convert each image to BC1 (DXT1) compression
    for n = 1, #mipchain do
        mipchain[n] = mipchain[n]:Convert(TEXTURE_BC1)
    end

    --Save mipchain to texture file
    SaveTexture("OUTPUT.dds", TEXTURE_2D, mipchain, 1, SAVE_DEFAULT)
    If we open this file in Visual Studio 2019 we can inspect the individual mipmap levels and verify they are being saved into the file. Also note that the correct texture format is displayed.
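    One detail worth knowing when generating chains like this: a full mip chain keeps halving until both axes reach 1x1, so a 256x256 image has 9 levels and a 512x256 image has 10. Here is a small helper (my own sketch, not engine API) that computes every level's dimensions, which is handy for preallocating the chain:

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Compute the dimensions of every level in a full mip chain, halving each
// axis (and clamping at 1) until both reach 1x1.
std::vector<std::pair<int, int>> MipChainSizes(int width, int height) {
    std::vector<std::pair<int, int>> sizes;
    sizes.push_back({width, height});
    while (width > 1 || height > 1) {
        width = std::max(1, width / 2);
        height = std::max(1, height / 2);
        sizes.push_back({width, height});
    }
    return sizes;
}
```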

    This system gives us fine control over every aspect of texture files. For example, if you wanted to write a mipmap filter that blurred the image a bit with each resize, with this system you could easily do that.
    Building Cubemaps
    We can also save cubemaps into a single DDS or Basis file by providing additional images. In this example we will load a skybox strip that consists of six images side-by-side, copy sections of the sky to different images, and then save them all as a single DDS file.
    Here is the image we will be loading. It's already laid out in the order +X, -X, +Y, -Y, +Z, -Z, which is what the DDS format uses internally.

    First we will load the image as a pixmap and check to make sure it is six times as wide as it is high:
    --Load skybox strip
    local pixmap = LoadPixmap("Materials/zonesunset.png")
    if pixmap.size.x ~= pixmap.size.y * 6 then
        Print("Error: Wrong image aspect.")
        return
    end
    Next we will create a series of six pixmaps and copy a section of the image to each one, using the CopyRect method.
    --Copy each face to a different pixmap
    local faces = {}
    for n = 1, 6 do
        faces[n] = CreatePixmap(pixmap.size.y, pixmap.size.y, pixmap.format)
        pixmap:CopyRect((n - 1) * pixmap.size.y, 0, pixmap.size.y, pixmap.size.y, faces[n], 0, 0)
    end
    To save a cubemap you must set the type to TEXTURE_CUBE and the layers value to 6:
    --Save as cube map SaveTexture("CUBEMAP.dds", TEXTURE_CUBE, faces, 6, SAVE_DEFAULT) And just like that, you've got your very own cubemap packed into a single DDS file. You can switch through all the faces of the cubemap by changing the frames value on the right:

    For uncompressed cubemaps, we can just specify SAVE_BUILD_MIPMAPS and mipmaps will automatically be created and saved in the file:
    --Save as cube map, with mipmaps SaveTexture("CUBEMAP+mipmaps.dds", TEXTURE_CUBE, faces, 6, SAVE_BUILD_MIPMAPS) Opening this DDS file in Visual Studio 2019, we can view all cubemap faces and verify that the mipmap levels are being generated and saved correctly:

    In Leadwerks Game Engine 4 we store skyboxes in large uncompressed images because DXT compression does not handle gradients very well and causes bad artifacts. The new BC7 compression mode, however, is good enough to handle skyboxes and takes the same space as DXT5 in memory. We already learned how to save compressed textures. The only difference with a skybox is that you store mipmaps for each cube face, in the following order:
    for f = 1, faces do
        for m = 1, miplevels do
    The easy way, though, is to just save the RGBA images as a Basis file, with mipmaps. The Basis compressor will handle everything for us and give us a smaller file:
    --Save as cube map, with mipmaps SaveTexture("CUBEMAP+mipmaps.basis", TEXTURE_CUBE, faces, 6, SAVE_BUILD_MIPMAPS) The outputted .basis file works correctly when loaded in the engine. Here are the sizes of this image in different formats, with mipmaps included:
    Uncompressed RGBA: 32 MB
    BC7 compressed DDS: 8 MB
    Zipped uncompressed RGBA: 4.58 MB
    Zipped BC7 compressed DDS: 3.16 MB
    Basis file: 357 KB
    Note that the Basis file still takes the same amount of memory as the BC7 file, once it is loaded onto the GPU. Also note that a skybox in Leadwerks Game Engine 5 consumes less than 1% of the hard drive space of a skybox in Leadwerks 4.
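    The uncompressed and BC7 figures can be sanity-checked with a little arithmetic (my own helper, not engine code). RGBA stores 4 bytes per pixel; BC7 stores 16 bytes per 4x4 block, i.e. 1 byte per pixel; and a full mip chain adds roughly one third on top of the base level. Assuming (for illustration) six 1024x1024 faces, the totals come out to about 32 MB and 8 MB:

```cpp
#include <cassert>

// Total bytes for a cubemap with a full mip chain. faceSize must be a power
// of two; bytesPer16Pixels is 64 for uncompressed RGBA, 16 for BC7.
long long CubemapBytes(int faceSize, int bytesPer16Pixels) {
    long long total = 0;
    for (int w = faceSize; ; w /= 2) {
        total += (long long)w * w * bytesPer16Pixels / 16;
        if (w == 1) break;
    }
    return total * 6; // six cube faces
}
```

    (This slightly underestimates the compressed case, since mips smaller than 4x4 still occupy one whole block, but the error is negligible.)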
    Another option is to save the cubemap faces as individual images and then assemble them in a tool like ATI's CubemapGen. This was actually the first approach I tried while writing this article. I loaded the original .tex file and saved out a bunch of PNG images, as shown below.

    Modifying Images
    The CopyRect method allows us to copy sections of images from one to another as long as the images use the same format. We can also copy from one section of an image to another area on itself. This code will load a pixmap, copy a rectangle from one section of the image to another, and resave it as a simple uncompressed DDS file with no mipmaps:
    --Load image
    local pixmap = LoadPixmap("Materials/77684-blocks18c_1.jpg")

    --Convert to RGBA if not already
    if pixmap.format ~= TEXTURE_RGBA then
        pixmap = pixmap:Convert(TEXTURE_RGBA)
    end

    --Let's make some changes :D
    pixmap:CopyRect(0, 0, 256, 256, pixmap, 256, 256)

    --Save uncompressed DDS file
    SaveTexture("MODIFIED.dds", TEXTURE_2D, {pixmap}, 1, SAVE_DEFAULT)
    When you open the DDS file you can see the copy operation worked:

    We can even modify compressed images that use any of the FourCC-type compression modes. This includes BC1-BC5. The only difference is that instead of copying individual pixels, we are now working with 4x4 blocks of pixels. The easiest way to handle this is to just divide all your numbers by four:
    --Convert to compressed format
    pixmap = pixmap:Convert(TEXTURE_BC1)

    --We specify blocks, not pixels. Blocks are 4x4 squares.
    pixmap:CopyRect(0, 0, 64, 64, pixmap, 64, 64)

    --Save compressed DDS file
    SaveTexture("COMPRESSED+MODIFIED.dds", TEXTURE_2D, {pixmap}, 1, SAVE_DEFAULT)
    When we open the image in Visual Studio the results appear identical, but note that the format is compressed BC1. This means we can perform modifications on compressed images without ever decompressing them:

    The Pixmap class has a GetSize() method which returns the image size in pixels, and it also has a GetBlocks() function. With uncompressed images, these two methods return the same iVec2 value. With compressed formats, the GetBlocks() value will be the image width and height in pixels, divided by four. This will help you make sure you are not drawing outside the bounds of the image. Note that this technique will work just fine with BC1-BC5 format, but will not work with BC6h and BC7, because these use more complex data formats.
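    The pixel-to-block bookkeeping can be summarized in a few lines (a sketch with my own names, not the engine's API). BC1 and BC4 store 8 bytes per 4x4 block, while BC2, BC3, and BC5 store 16. Block counts are rounded up here, which only matters for dimensions that are not multiples of four:

```cpp
#include <cassert>

// Block dimensions and storage size for a FourCC-compressed image.
struct BlockInfo {
    int blocksX, blocksY;
    long long bytes;
};

BlockInfo ComputeBlocks(int width, int height, int bytesPerBlock) {
    BlockInfo info;
    info.blocksX = (width + 3) / 4;  // round up to whole 4x4 blocks
    info.blocksY = (height + 3) / 4;
    info.bytes = (long long)info.blocksX * info.blocksY * bytesPerBlock;
    return info;
}
```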
    Texture SetSubPixels
    We have a new texture method that functions like CopyRect but works on textures that are already loaded on the GPU. As we saw with pixmaps, it is perfectly safe to modify even compressed data as long as we remember that we are working with blocks, not pixels. This is the command syntax:
    void SetSubPixels(shared_ptr<Pixmap> pixmap, int x, int y, int width, int height, int dstx, int dsty, const int miplevel = 0, const int layer = 0); To do this we will load a texture, and then load a pixmap. Remember, pixmaps are image data stored in system memory and textures are stored on the GPU. If the loaded pixmap's format does not match the texture pixel format, then we will convert the pixmap to match the texture:
    --Load a texture
    local texture = LoadTexture("Materials/77684-blocks18c_1.jpg")

    --Load the pixmap
    local stamp = LoadPixmap("Materials/stamp.png")
    if stamp.format ~= texture.format then
        stamp = stamp:Convert(texture.format)
    end
    To choose a position to apply the pixmap to, I used the camera pick function and used the picked texture coordinate to calculate an integer offset for the texture:
    local mpos = window:GetMousePosition()
    if window:MouseDown(MOUSE_LEFT) == true and mpos:DistanceToPoint(mouseposition) > 50 then
        mouseposition = mpos
        local mousepos = window:GetMousePosition()
        local pick = camera:Pick(framebuffer, mousepos.x, mousepos.y, 0, true, 0)
        if pick ~= nil then
            local texcoords = pick:GetTexCoords()
            texture:SetSubPixels(stamp, 0, 0, stamp.size.x, stamp.size.y, texcoords.x * texture.size.x - stamp.size.x / 2, texcoords.y * texture.size.y - stamp.size.y / 2, 0, 0)
        end
    end
    And here it is in action, in a Lua script example to be included in the next beta release. Note that this command does not perform any type of blending; it only sets raw texture data.

    What you see above is not a bunch of decals. It's just one single texture that has had its pixels modified in a destructive way. The stamps we applied cannot be removed except by drawing over them with something else, or reloading the texture.
    The above textures are uncompressed images, but if we want to make this work with FourCC-type compressed images (everything except BC6h and BC7) we need to take into account the block size. This is easy because with uncompressed images the block size is 1 and with compressed images it is 4:
    texture:SetSubPixels(stamp, 0, 0, stamp.size.x / stamp.blocksize, stamp.size.y / stamp.blocksize, (texcoords.x * texture.size.x - stamp.size.x / 2) / stamp.blocksize, (texcoords.y * texture.size.y - stamp.size.y / 2) / stamp.blocksize, 0, 0)
    Now our routine will work with uncompressed or compressed images. I have it working in the image below, but I don't know if it will look much different from the uncompressed version.
     
    I'm not 100% sure on the block / pixel part yet. Maybe I will just make a rule that says "units must be divisible by four for compressed images".
    Anyways, there you have it. This goes much deeper than the texture commands in Leadwerks Game Engine 4 and will allow a variety of user-created tools and extensions to be written for the new level editor. It all starts with the code API, and this is a fantastic set of features, if I do say so myself. Which I do.
  20. Josh
    We've now got 40 games in Leadwerks Game Launcher and I am giving serious thought to the final launch. The application is presently in early release mode with about 2500 users. We want that number to be 100,000 users. Leadwerks Game Launcher is a really unique application; it's distributed exclusively through Steam, it's a standalone app, and it's free. This is not really like anything else out there, so how do you most effectively promote it?
     
    I am running an experiment with Facebook ads. I figured we wanted a young male demographic, in the U.S. (for now), with an interest in Steam, Zynga, or Desura. This may be adjusted in the future with more tests. For example, we should be able to optimize the campaign by comparing results between these keywords and focusing on the most effective ones, and by varying the wording of the text.
     
    There's not much data yet, but at this point we can see we presently have a calculated user acquisition cost of $0.83 per user (money spent versus link clicks). Based on that calculation, we can assume that reaching 100,000 users through Facebook advertising would cost $80,925. Which is actually something I can make happen if needed, but I don't think we will be relying exclusively on Facebook ads for this. I think our exposure through Steam will be our most significant channel, and it's free.
     
    Anyways, I had the idea a year ago that promoting a bundle of Leadwerks games would be more effective than having each developer individually promoting their own game (although you are certainly still encouraged to do so). Now we're starting to see the manifestation of a more consumer-oriented Leadwerks Software, and it's very interesting. I think a lot of things will be learned in 2016.
     
    I thought this post would be interesting information for you guys. I do not trust Facebook's ad statistics and this requires confirmation through Steam stats, but I am starting to gather data.
     

  21. Josh
    I recently added a new "SoldierAI" script to handle enemies that can shoot back and coordinate with one another. This was developed for the new Mercenary Action Figure. In this blog I will explain how I designed the behaviors to work.
     
    Just like the MonsterAI script, the soldier AI script includes a "teamid" value. This has three possible values, "good", "bad", and "neutral". Good guys and bad guys will attack each other, but not their own team or neutral characters.
     

     
    At its simplest, the AI I wanted was similar to the old enemies in the original Quake game. When you are far away, they shoot at you. When you are close enough, they will stab / claw / chainsaw you with a melee attack. This was easy to implement by adjusting behavior based on distance. Instead of making the soldiers stand there shooting in bursts, I decided to make them run forward in between shots, until they reach an optimum shooting distance. They will also move towards the player, using the navmesh, if their view of the player is blocked, as you can see in the end of this video when the player hides behind a truck:
     


     
    The Soldier AI has two ways of identifying an enemy. (In the script, this value is called "target".) They may visually sense the enemy in front of them, or hear them if they are close enough. They may also begin attacking if they receive damage from an enemy. Even if they are already chasing one enemy, if another enemy attacks them and is closer than their target, they will switch to fight the new attacker. I found this necessary when I saw a soldier shooting at a distant crawler, while another one right next to him was clawing at him.
     


     
    When a soldier is alerted to the presence of an enemy, all characters on his team within a certain distance will also become aware of the enemy. This means that if one soldier sees you, the whole pack will come looking for you, so you'd better be prepared!
     
    Finally, the bullets the soldier shoots are managed by a new "Projectile" script and a "tracer" prefab. The script is written in a general-purpose way so that it can also handle arrows, rockets, and other types of projectiles. This will allow the soldier AI to work with a variety of characters with different types of weapons.
     
    To make the soldier AI script work with any character model, you just need the following animations:
    Death
    Fire
    Hit
    Idle
    Attack1
    Attack2 (optional)
    Run
    Walk

     
    You also need a child limb named "muzzle" in the location where you want bullets to emit from.
  22. Josh
    In this series of blogs I am going to describe my implementation of AI for a new Workshop Store model, the Leadwerks turret. The model is shown below, without the weapon inserted.
     

     
    Before writing any code I first decided the main behaviors I wanted this script to involve.
    The turret should scan the area to identify an enemy (using the team ID to differentiate between enemy types) and choose the closest target. Unlike the soldier AI, the turret AI will not change targets when another enemy attacks it.
    If the target is lost or occluded, another target will be chosen.
    The turret will fire a continuous rapid stream of bullets whenever an enemy is in view. We want this to be an obstacle you have to take cover from, or you will quickly be killed.
    The turret can be killed by either incurring enough damage to bring the health down to zero, or by knocking it over to deactivate it. If the turret is damaged to the point of "death" it will shoot sparks out once and then emit a cloud of smoke. The smoke will be continuous, to clearly indicate the turret is inactive.
    The turret movement will not use the physics system. The model is relatively small and I can't think of any really useful reasons I would want the motion to interact with physics, so I'm not going to implement this.

     
    The first draft of the script uses the World:ForEachEntityInAABBDo() function to perform a search and find a target enemy within a certain range. I set a breakpoint at line 18 to confirm that the script was successfully finding the player. Notice that a distance check is used so the turret will locate the closest enemy.

    Script.target = nil --enemy to shoot
    Script.mode = "idle"
    Script.lastsearchtime = 0
    Script.searchfrequency = 500
    Script.range = 20

    function TurretSearchHook(entity, extra)
        if entity.script ~= nil then
            if type(entity.script.health) == "number" then
                if entity.script.health > 0 then
                    local d = extra:GetDistance(entity)
                    if d < extra.script.range then
                        if extra.script.target ~= nil then
                            if extra.script:GetDistance(extra.script.target) > d then
                                extra.script.target = entity.script
                            end
                        else
                            extra.script.target = entity.script
                        end
                    end
                end
            end
        end
    end

    function Script:UpdateWorld()
        --Search for target
        if self.target == nil then
            local currenttime = Time:GetCurrent()
            if currenttime - self.lastsearchtime > self.searchfrequency then
                self.lastsearchtime = currenttime
                local pos = self.entity:GetPosition(true)
                local aabb = AABB(pos - Vec3(self.range), pos + Vec3(self.range))
                self.entity.world:ForEachEntityInAABBDo(aabb, "TurretSearchHook", self.entity)
            end
        end
    end
  23. Josh
    I've expanded the turret AI to include a visibility test. This is performed initially when selecting a target, and is also performed on a continuous basis to make sure the target stays in view. To avoid overloading the engine I am only performing the visibility test at most once every 500 milliseconds. This provides near-instantaneous updating while preventing the picking from becoming a bottleneck and slowing down performance.
     
    Adding bullets was mostly a copy and paste job from the SoldierAI script. Because we have a self-contained tracer prefab already assembled, it is easy to reassign this prefab for use with the turret.
     

     
    Still left to do:
    The FPSGun script expects hit objects to be either animated enemies or physics objects. Some revision will be required to handle the turret objects, so that bullets hurt the turret and also apply a force at the hit point, potentially tipping it over.
    Smoke needs to emit from the turret once it is killed, to signify that it is no longer operational.
    The script needs to detect when the turret is tipped over and deactivate the AI when this occurs.
    The model needs to be adjusted so the pivots are oriented correctly for pointing the gun at the target object.

     
    My updated script is below.

    --Public
    Script.target = nil --enemy to shoot
    Script.range = 20
    Script.health = 100
    Script.projectilepath = "Prefabs/Projectiles/tracer.pfb"
    Script.shoot1sound = "" --path "Fire 1 sound" "Wav file (*.wav):wav|Sound"
    Script.shoot2sound = "" --path "Fire 2 sound" "Wav file (*.wav):wav|Sound"
    Script.shoot3sound = "" --path "Fire 3 sound" "Wav file (*.wav):wav|Sound"

    --Private
    Script.lastfiretime = 0
    Script.mode = "idle"
    Script.lastsearchtime = 0
    Script.searchfrequency = 500
    Script.firefrequency = 50

    function Script:Start()
        --Load sounds
        self.sound = {}
        self.sound.shoot = {}
        if self.shoot1sound ~= "" then self.sound.shoot[1] = Sound:Load(self.shoot1sound) end
        if self.shoot2sound ~= "" then self.sound.shoot[2] = Sound:Load(self.shoot2sound) end
        if self.shoot3sound ~= "" then self.sound.shoot[3] = Sound:Load(self.shoot3sound) end
        self.projectile = Prefab:Load(self.projectilepath)
        if self.projectile ~= nil then self.projectile:Hide() end
        self.turret = self.entity:FindChild("turret")

        --Add muzzleflash to gun
        self.muzzle = self.entity:FindChild("muzzle")
        if self.muzzle ~= nil then
            local mtl = Material:Create()
            mtl:SetBlendMode(5)
            self.muzzle:SetMaterial(mtl)
            mtl:Release()
            self.muzzleflash = Sprite:Create()
            self.muzzleflash:SetSize(0.35, 0.35)
            local pos = self.muzzle:GetPosition(true)
            self.muzzleflash:SetPosition(pos, true)
            self.muzzleflash:SetParent(self.muzzle, true)
            mtl = Material:Load("Materials/Effects/muzzleflash.mat")
            if mtl then
                self.muzzleflash:SetMaterial(mtl)
                mtl:Release()
                mtl = nil
            end
            local light = PointLight:Create()
            light:SetRange(5)
            light:SetColor(1, 0.75, 0)
            light:SetParent(self.muzzleflash, flash)
            if light.world:GetLightQuality() < 2 then light:SetShadowMode(0) end
            self.muzzleflash:Hide()
        end
    end

    function Script:Release()
        if self.projectile ~= nil then
            self.projectile:Release()
            self.projectile = nil
        end
    end

    function TurretSearchHook(entity, extra)
        if entity ~= extra then
            if entity.script ~= nil then
                if type(entity.script.health) == "number" then
                    if entity.script.health > 0 then
                        local d = extra:GetDistance(entity)
                        if d < extra.script.range then
                            if extra.script.target ~= nil then
                                if extra.script:GetDistance(extra.script.target) > d then
                                    if extra.script:GetTargetVisible(entity.script) then
                                        extra.script.target = entity.script
                                    end
                                end
                            else
                                if extra.script:GetTargetVisible(entity.script) then
                                    extra.script.target = entity.script
                                end
                            end
                        end
                    end
                end
            end
        end
    end

    function Script:GetTargetVisible(target)
        if target == nil then target = self.target end
        if target == nil then return false end
        local pickinfo = PickInfo()
        local p0 = self.entity:GetAABB().center
        local p1 = target.entity:GetAABB().center
        local pickmode0 = self.entity:GetPickMode()
        local pickmode1 = target.entity:GetPickMode()
        self.entity:SetPickMode(0)
        target.entity:SetPickMode(0)
        local result = not self.entity.world:Pick(p0, p1, pickinfo, 0, false, Collision.LineOfSight)
        self.entity:SetPickMode(pickmode0)
        target.entity:SetPickMode(pickmode1)
        return result
    end

    function Script:UpdateWorld()
        if self.health >= 0 then
            local currenttime = Time:GetCurrent()

            --Stop shooting if target is dead
            if self.target ~= nil then
                if self.target.health <= 0 then self.target = nil end
            end

            --Search for target
            if self.target == nil then
                if currenttime - self.lastsearchtime > self.searchfrequency then
                    self.lastsearchtime = currenttime
                    local pos = self.entity:GetPosition(true)
                    local aabb = AABB(pos - Vec3(self.range), pos + Vec3(self.range))
                    self.entity.world:ForEachEntityInAABBDo(aabb, "TurretSearchHook", self.entity)
                end
            end

            --Continuous visibility test
            if self.target ~= nil then
                if currenttime - self.lastsearchtime > self.searchfrequency then
                    self.lastsearchtime = currenttime
                    if self:GetTargetVisible(self.target) == false then
                        self.target = nil
                        return
                    end
                end
            end

            if self.target ~= nil then
                --Motion tracking
                if self.turret ~= nil then
                    local p0 = self.turret:GetPosition(true)
                    local p1 = self.target.entity:GetAABB().center
                    local yplane = p1 - p0
                    yplane.y = 0
                    self.turret:AlignToVector(yplane, 1, 0.1 / Time:GetSpeed(), 0)
                end

                --Shoot
                if self.muzzle ~= nil and self.projectile ~= nil then
                    if currenttime - self.lastfiretime > self.firefrequency then
                        self.lastfiretime = currenttime
                        local bullet = self.projectile:Instance()
                        self.muzzleflash:Show()
                        self.muzzleflash:EmitSound(self.sound.shoot[#self.sound.shoot])
                        self.muzzleflash:SetAngle(math.random(0, 360))
                        self.muzzleflashtime = currenttime
                        if bullet ~= nil then
                            bullet:Show()
                            bullet:SetPosition(self.muzzle:GetPosition(true), true)
                            bullet:SetRotation(self.muzzle:GetRotation(true), true)
                            if bullet.script ~= nil then
                                bullet.script.owner = self
                                if type(bullet.script.Enable) == "function" then
                                    bullet.script:Enable()
                                end
                                bullet:Turn(Math:Random(-3, 3), Math:Random(-3, 3), 0)
                            end
                        end
                    end
                end
            end
        end
        self:UpdateMuzzleFlash()
    end

    function Script:UpdateMuzzleFlash()
        if self.muzzleflashtime then
            if Time:GetCurrent() - self.muzzleflashtime < 30 then
                self.muzzleflash:Show()
            else
                self.muzzleflash:Hide()
            end
        end
    end
  24. Josh

    Articles
    I have two goals for the art pipeline in the new game engine.
    Eliminate one-way 3D model conversions.
    Use the same game asset files on all platforms.
    In Leadwerks, 3D models get converted into the proprietary MDL format. Although the format is pretty straightforward and simple, there is no widespread support for loading it back into 3D modeling programs. This means you need to keep a copy of your 3D model in both FBX and MDL format. You may also want to keep an additional file for the modeling program it was made in, such as a 3D Studio Max MAX file. I wanted to simplify this by relying on the widely-supported glTF file format, which can be used for final game-ready models but can still be easily loaded back into Blender or another program.
    Texture Compression
    Texture compression is where this all gets a lot more tricky. Texture compression is an important technique that can reduce video memory usage to about 25% what it would be otherwise. When texture compression is used games run faster, load faster, and look better because they can use higher resolution images. There are two problems with texture compression.
    Supported texture compression methods vary wildly across PC and mobile platforms.
    Many 3D modeling programs do not support modern texture compression formats.
    On the PC, BC7 is the best compression format to use for most images. BC5 is a two-channel format appropriate for normal maps, with Z reconstruction in the shader. BC6H can be used to compress HDR RGB images (mostly skyboxes). BC4 is a single-channel compressed format that is rarely used but could be useful for some special cases. The older DXTC formats should no longer be used, as they have worse artifacts at the same data size.
    Here are a few programs that do not support the newer BC formats, even though they are the standard for modern games:
    Windows Explorer
    Blender
    Microsoft 3D Object Viewer
    Now, the glTF model format does not actually support DDS format textures at all. The MSFT_texture_dds extension adds support for this by adding an extra image source into the file. glTF loaders that support this extension can load the DDS files, while programs that do not recognize it will ignore it and load the original PNG or JPEG files:
    "textures": [ { "source": 0, "extensions": { "MSFT_texture_dds": { "source": 1 } } } ], "images": [ { "uri": "defaultTexture.png" }, { "uri": "DDSTexture.dds" } ] I don't know any software that supports this extension except Ultra Engine. Even Microsoft's own 3D Object Viewer app does not support DDS files.
    The Ultra Engine editor has a feature that allows you to convert a model's textures to DDS. This will resave all textures in DDS format, and change the model materials to point to the DDS files instead of the original format the model uses (probably PNG):

    When a glTF file is saved after this step, the resulting glTF will keep copies of both the original PNG and the saved DDS files. The resulting glTF file can be loaded in Ultra Engine ready to use in your game, with DDS files applied, but it can still be loaded in programs that do not support DDS files, and the original PNG files will be used instead:

    The same model can be loaded back into Blender in case any changes need to be made. If you resave it, the references to the DDS images will be lost, so just repeat the DDS conversion step in the Ultra Engine to finalize the updated version of your model.

    To cut down on the size of your game files, you can just skip PNG files in the final packaging step so only DDS files are included.
    Basis Universal
    We still have to deal with the issue of hardware support for different codecs. This table from Unity's documentation shows the problem clearly:

    We do want to support some mobile-based VR platforms, so this is something that needs to be considered now. Basis Universal is a library from the author of Crunch that solves this problem by introducing a platform-agnostic intermediate compressed format that can be quickly transcoded into various texture compression formats, without going through a slow decompression and recompression step. We can generate Basis files in the Ultra Engine editor just like we did for DDS. There is a glTF extension that supports Basis textures, so glTF models can be saved that reference the Basis files.

    The basis textures are stored in the glTF file the same way the DDS extension works:
    "textures": [ { "source": 0, "extensions": { "KHR_texture_basisu": { "source": 1 } } } ], "images": [ { "mimeType": "image/png", "bufferView": 1 }, { "mimeType": "image/ktx2", "bufferView": 2 } ] The resulting glTF files can be loaded as game-ready models with compressed textures on PC or mobile platforms, and can still be loaded back into Blender or another 3D modeling program. We don't have to store different copies of our textures in different compression formats or go through a slow conversion step when publishing the game to other platforms. If your game is only intended to run on PC platforms, then DDS is the simplest path to take, but if you plan to support mobile-based VR devices in the future then Basis solves that problem nicely.
    Here is our revised table:

    I was also working with the KTX2 format for a while, but I ended up only using it for its support for Basis compression. DDS and Basis cover all your needs and are more widely supported.
    Basis also supports a very lossy but efficient ETC1S method, similar to Crunch, as well as uncompressed texture data, but in my opinion the main reason to use Basis is its UASTC format.
    This article features the lantern sample glTF model from Microsoft.
  25. Josh

    Articles
    In Leadwerks, required files were always a slightly awkward issue. The engine requires a BFN texture and a folder of shaders, in order to display anything. One of my goals is to make the Ultra Engine editor flexible enough to work with any game. It should be able to load the folder of an existing game, even if it doesn't use Ultra Engine, and display all the models and scenes with some accuracy. Of course the Quake game directory isn't going to include a bunch of Ultra Engine shaders, so what to do?
    One solution could be to load shaders and other files from the editor directory, but this introduces other issues. My solution is to build shaders, shader families, and the default BRDF texture into the engine itself. This is done with a utility that reads a list of files to include, then loads each one and turns it into an array in C++ code that gets compiled into the engine. The code looks like this:
    if (rpath == RealPath("Shaders/Sky.json"))
    {
        static const std::array<uint64_t, 62> data = {0x61687322090a0d7bULL,0x6c696d6146726564ULL,0xd7b090a0d3a2279ULL,0x746174732209090aULL,0x9090a0d3a226369ULL,0x66220909090a0d7bULL,0xa0d3a2274616f6cULL,0x9090a0d7b090909ULL,0x555141504f220909ULL,0x909090a0d3a2245ULL,0x90909090a0d7b09ULL,0x6c75616665642209ULL,0x909090a0d3a2274ULL,0x909090a0d7b0909ULL,0x6573616222090909ULL,0x90909090a0d3a22ULL,0x909090a0d7b0909ULL,0x7265762209090909ULL,0x5322203a22786574ULL,0x532f737265646168ULL,0x762e796b532f796bULL,0x227670732e747265ULL,0x9090909090a0d2cULL,0x6d67617266220909ULL,0x5322203a22746e65ULL,0x532f737265646168ULL,0x662e796b532f796bULL,0x227670732e676172ULL,0x909090909090a0dULL,0x9090909090a0d7dULL,0x7d090909090a0d7dULL,0xd2c7d0909090a0dULL,0x756f64220909090aULL,0x90a0d3a22656c62ULL,0x909090a0d7b0909ULL,0x45555141504f2209ULL,0x90909090a0d3a22ULL,0x9090909090a0d7bULL,0x746c756166656422ULL,0x90909090a0d3a22ULL,0x90909090a0d7b09ULL,0x2265736162220909ULL,0x9090909090a0d3aULL,0x90909090a0d7b09ULL,0x7472657622090909ULL,0x685322203a227865ULL,0x6b532f7372656461ULL,0x34365f796b532f79ULL,0x732e747265762e66ULL,0x9090a0d2c227670ULL,0x7266220909090909ULL,0x3a22746e656d6761ULL,0x7265646168532220ULL,0x6b532f796b532f73ULL,0x72662e6634365f79ULL,0xd227670732e6761ULL,0x7d0909090909090aULL,0x7d09090909090a0dULL,0xd7d090909090a0dULL,0x90a0d7d0909090aULL,0xa0d7d090a0d7d09ULL,0xcdcdcdcdcdcdcd7dULL};
        auto buffer = CreateBuffer(489);
        buffer->Poke(0, (const char*)data.data(), 489);
        return CreateBufferStream(buffer);
    }
    An unsigned 64-bit integer is used for the data type, as this results in the smallest generated code file size.
    Files are searched for in the following order:
    1. A file on the hard drive in the specified path.
    2. A file from a loaded package with the specified relative path.
    3. A file built into the engine.
    Therefore, if your game includes a modified version of a shader, the shader module will still be loaded from the file in your game directory. However, if you don't include any shaders at all, the engine will just fall back on its own set of shaders compiled into the core engine.
    This gives Ultra Engine quite a lot more flexibility in loading scenes and models, and allows creation of 3D applications that can work without any required files at all, while still allowing for user control over the game shaders.
    The screenshot here shows the Ultra Engine editor loading a Leadwerks project folder and displaying 3D graphics using the Ultra Engine renderer, even though the Leadwerks project does not contain any of the shaders and other files Ultra Engine needs to run:
