Blog Entries posted by Josh

  1. Josh
    My work with the MD3 format and its funny short-integer vertex positions got me thinking about vertex array sizes. (Interestingly, the Quake 2 MD2 format uses a single char for vertex positions!)
    Smaller vertex formats run faster, so it made sense to look into this. Here was our vertex format before, weighing in at 72 bytes:
    struct Vertex
    {
        Vec3 position;
        float displacement;
        Vec3 normal;
        Vec2 texcoords[2];
        Vec4 tangent;
        unsigned char color[4];
        unsigned char boneweights[4];
        unsigned char boneindices[4];
    };

    According to the OpenGL wiki, it is okay to replace the normals with a signed char. And if this works for normals, it will also work for tangents, since they are just another vector.
    I also implemented half floats for the texture coordinates.
    Here is the vertex structure now at a mere 40 bytes, about half the size:
    struct Vertex
    {
        Vec3 position;
        signed char normal[3];
        signed char displacement;
        unsigned short texcoords[4];
        signed char tangent[4];
        unsigned char color[4];
        unsigned char boneweights[4];
        unsigned char boneindices[4];
    };

    Everything works with no visible loss of quality. Half-floats for the position would reduce the size by an additional 6 bytes, but would likely incur two bytes of padding since it would no longer be aligned to four bytes, like most GPUs prefer. So it would really only save four bytes, which is not worth it for half the precision.
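    The packing itself is just a quantization step. Here is the idea sketched in a few lines of Lua (this only illustrates the math; the engine does it in C++ when filling the vertex buffer):

    --Illustration of the quantization only: map a unit-length vector component
    --in the range [-1, 1] to a signed byte in the range [-127, 127].
    function PackSignedByte(f)
        f = math.max(-1, math.min(1, f)) --clamp to the valid range
        return math.floor(f * 127 + 0.5) --round to the nearest integer
    end

    print(PackSignedByte(1.0))  --> 127
    print(PackSignedByte(0.0))  --> 0
    print(PackSignedByte(-0.5)) --> -63

    The half-float texture coordinates work the same way, except each value is converted to a 16-bit floating-point number instead of being quantized to an integer.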
    Another interesting thing I might work into the GMF format is Draco mesh compression by Google. This is only for shrinking file sizes, and does not lead to any increased performance. The bulk of your game data is going to be in texture files, so this isn't too important but might be a nice extra.
  2. Josh
    Now that we have our voxel light data in a 3D texture we can generate mipmaps and perform cone step tracing. The basic idea is to cast a ray out from each side of each voxel and use a lower-resolution mipmap for each ray step. We start with mipmap 1 at a distance that is 1.5 texels away from the position we are testing, and then double the distance with each step of the ray. Because we are using linear filtering we don't have to make the sample coordinates line up exactly to a texel center, and it will work fine:
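    In Lua-flavored pseudocode, one cone march looks something like this. SampleVoxelMip() is a hypothetical stand-in for the textureLod() call the real shader makes into the voxel 3D texture, and the front-to-back accumulation shown is just the usual way the samples are combined, not something specific to this implementation:

    --Sketch of one cone march. origin and dir are the surface position and the
    --cone direction, and texelsize is the size of one voxel at mip 0.
    function ConeTrace(origin, dir, texelsize, maxsteps)
        local result = Vec4(0, 0, 0, 0)
        local dist = 1.5 * texelsize --start 1.5 texels away from the surface
        local mip = 1                --begin with mipmap level 1
        for n = 1, maxsteps do
            local s = SampleVoxelMip(origin + dir * dist, mip)
            result = result + s * (1 - result.w) --accumulate front to back
            if result.w >= 1 then break end      --fully occluded, stop early
            dist = dist * 2 --double the distance with each step of the ray
            mip = mip + 1   --each step samples the next lower-resolution mip
        end
        return result
    end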

    Here are the first results when cone step tracing is applied. You can see light bouncing off the floor and reflecting on the ceiling:

    This dark hallway is lit with indirect lighting:

    There's lots of work left to do, but we can see here the basic idea works.
  3. Josh
    To make an entity in Leadwerks3D behave, you can drag a script from the asset browser to the actors list in the entity properties editor. Each entity can have multiple scripts attached to it. Each script can expose different variables, with control over how they are displayed in the GUI. For example, you can specify upper and lower limits for a spinner or a list of choices for a drop-down box. Each script attached to an entity, along with its set of properties, is called an "actor" because...it acts.
     
    To handle the interaction of different actors, a flowgraph is used. You can drag actors from the entity properties editor into the flowgraph editor. Each actor has inputs and outputs defined in its script. You can connect the output of one actor to the input of another. For example, you can connect the "tick" output of a timer script to the "Resume" input of a mover script, to make the mover translate and rotate an entity at regular intervals when the timer ticks:


     
    I hope this makes it more clear how entity interactions can be handled in Leadwerks3D, even without programming. With this system, we're all working on one big game together, and we can share functional objects without any difficulty.
  4. Josh
    At this point I would like to share some plans with you on continued development over the next several months. I generally avoid detailed roadmaps for two reasons:
    It's more fun to announce something on short notice than to say "we're going to do X, Y, and Z, by these dates...". If people know too far in advance they sort of take it for granted, and it doesn't have as big an impact.
    Plans change. When they do, people often consider cancelled plans to be "promised features". I like to dynamically adjust my plans as I gather more information and see how people react to the software.

     
    I do have a plan for Leadwerks 3.3 and 3.4. They will each likely include some useful features and graphical enhancements.
     
    Another idea I want to expand on is pre-made game-ready content. The simplest manifestation of this is a model pack, like the SciFi interior DLC that was released. A more advanced implementation would be something like a character or weapon pack. Rather than just offering 3D models, I want these to be finished game-ready items with scripts and sounds that you can drop into your game and get instant functionality. I find the idea of no-programming game development very exciting, when you can do it without the terrible limitations of other products that attempt this. When you're ready, you can start cracking open scripts and looking around, and if you want to go a step beyond that, you can always drill down to the C++ layer. Leadwerks is unique in that it's perfectly suited for noobs, but is capable of taking you all the way to the professional level, if you have that desire. We're not really taking advantage of that concept yet, but I hope to soon.
     
    I am also planning on something that will give your Workshop games a lot of exposure, so if you have a Lua game you want to share, now is a good time to submit it. Don't worry if it's incomplete or "not good enough". The Workshop is for prototyping and testing each other's ideas. I love playing the simple games that are up there now, and each update brings something new and exciting. The ease with which people can produce simple games is my measure of how good of a job I'm doing.
     
    Oh yeah, and official Linux support on Steam is coming soon.
     
    That's all for now. To all newcomers, welcome to the community, and to the old-timers, thank you for helping them.
  5. Josh
    Physics simulations typically require physical geometry to be provided in a series of convex hulls that approximate the shape of the object. This is because the intersection of two convex objects can be calculated fast enough for use in real-time simulations.
     
    One solution for creating this type of physical geometry is to create a low-detail model of the object in a modeling program. However, this process can be tedious and is often overlooked by artists. Auto-generating this type of geometry from the visual mesh itself would be ideal.
     
    Convex decomposition is a technique that takes an arbitrary polygonal mesh and transforms it into a series of convex hulls. Unfortunately, until recently such algorithms were unreliable and produced physics geometry that was far too high-detail for real-time physics, as shown in the image below:
     

     
    One solution was to model a low-poly polygonal mesh and then run that through the convex decomposition routine. Although this relieved the artist from the requirement of modeling shapes as separate convex objects, it created a situation that was the worst of both worlds; physics geometry still had to be modeled by hand, but the unreliability of a convex decomposition algorithm also had to be accounted for. For this reason, I have chosen to avoid using this technique until now. However, physics geometry is our bottleneck right now in getting more content into Leadwerks, and this is a problem that has been on my mind.
     
    VHACD is a new(er) convex decomposition library. Apparently it's been around for a while, but until recently I had only heard of HACD, which did not produce very usable results. VHACD works by converting polygonal geometry into voxel data and creating a series of convex shapes from this data. Integrating this into Leadwerks took just two days, as it was very similar to the old HACD code, which was integrated at one point but never saw the light of day because I did not feel it was good enough to release. The results of the new algorithm are, in most cases, about as good as hand-modeled geometry. This is generated purely from the visual geometry, so there is no need to hand-model a low-detail collision mesh:
     

     

     

     

     

     
    You can try this feature out right now if you opt in to the beta branch on Steam, on Windows. Open a model in the editor and select the Tools > Convex Decomposition... menu item to open the convex decomposition dialog.
     
    Thanks to Khaled Mamou and Julio Jerez for their work in this area.
  6. Josh
    I am experimenting with a system for creating a sequence of actions using Lua coroutines. This allows you to define a bunch of behavior at startup and let the game just run without having to keep track of a lot of states.
    You can add coroutines to entities and they will be executed in order. The first one will complete, and then the next one will start.
    A channel parameter allows you to have separate stacks of commands so you can have multiple sequences running on the same object. For example, you might have one channel that controls entity colors while another channel is controlling position.
    function Script:Start()
        local MotionChannel = 0
        local ColorChannel = 1
        local turnspeed = 1
        local colorspeed = 3

        --Rotate back and forth at 1 degree / second
        self:AddCoroutine(MotionChannel, ChangeRotation, 0, 45, 0, turnspeed)
        self:AddCoroutine(MotionChannel, ChangeRotation, 0, -45, 0, turnspeed)
        self:LoopCoroutines(MotionChannel) --keeps the loop going instead of just running once

        --Flash red and black every 3 seconds
        self:AddCoroutine(ColorChannel, ChangeColor, 1, 0, 0, 1, colorspeed)
        self:AddCoroutine(ColorChannel, ChangeColor, 0, 0, 0, 1, colorspeed)
        self:LoopCoroutines(ColorChannel) --keeps the loop going instead of just running once
    end

    There's no Update() function! Where do the coroutine functions come from? These can be in the script itself, or they can be general-use functions loaded from another script. For example, you can see an example of a MoveToPoint() coroutine function in this thread.
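    Under the hood, a channel is really just a queue of coroutines that gets pumped every frame. Here is a stripped-down sketch of the idea in plain Lua (not the actual engine implementation, and the Wait() function is only an example of what a coroutine function might look like):

    --Minimal sketch of per-channel coroutine queues in plain Lua.
    Script = Script or {}
    local unpack = unpack or table.unpack

    --Queue a function as a coroutine on the given channel.
    function Script:AddCoroutine(channel, func, ...)
        self.channels = self.channels or {}
        self.channels[channel] = self.channels[channel] or {}
        local args = {...}
        table.insert(self.channels[channel], coroutine.create(function() func(self, unpack(args)) end))
    end

    --Called once per frame: resume the first coroutine in each channel and
    --discard it when it finishes, so the next one in the queue starts.
    function Script:UpdateCoroutines()
        for channel, queue in pairs(self.channels or {}) do
            local co = queue[1]
            if co ~= nil then
                assert(coroutine.resume(co))
                if coroutine.status(co) == "dead" then table.remove(queue, 1) end
            end
        end
    end

    --A coroutine function simply yields until its work is done, for example:
    function Wait(self, duration)
        local start = Time:GetCurrent()
        while Time:GetCurrent() - start < duration do
            coroutine.yield()
        end
    end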
    The same script could be created using an Update function but it would involve a lot of stored states. I started to write it out actually for this blog, but then I said "ah screw it, I don't want to write all that" so you will have to use your imagination.
    Now if you can imagine a game like the original Warcraft, you might have a script function like this that is called when the player assigns a peasant to collect wood:
    function Script:CollectWood()
        self:ClearCoroutines(0)
        self:AddCoroutine(0, self.GoToForestAndFindATree)
        self:AddCoroutine(0, self.ChopDownTree)
        self:AddCoroutine(0, self.GoToCastle)
        self:AddCoroutine(0, self.Wait, 6000)
        self:AddCoroutine(0, self.DepositWood, 100)
        self:LoopCoroutines(0)
    end

    I wonder if there is some way to create a sub-loop, so that if the NPC gets distracted they can carry out some other actions and then return to the sequence they were in before, at the same point in the sequence.
    Of course this would work really well for cutscenes or any other type of one-time sequence of events.
  7. Josh
    I previously reported that Leadwerks Game Launcher had reached 4,000 users. My records in Steam have changed, and they now reflect a user base of 1,765 users. I think the reporting for free applications was not working correctly, because this number is much closer to what I would expect based on game subscriptions.
     
    The application is still in early release mode and has not been officially launched, and it will have low visibility before the final release. This gives us a period of time to work out any glitches, add support for SteamOS, and add to the number of games offered.
  8. Josh
    I'm building the VR project template for Leadwerks 4.5.  Although you can enable VR in any project, this template is specifically designed to provide some of your most common room-scale VR features:
    Teleportation movement, which prevents motion sickness.
    Picking up and throwing objects. (It's actually really fun!)
    To start with I am creating the art assets for the teleport effect. This is basically what I want:

    Your controller shoots a beam which ends in an indicator when it hits an upwards-facing slope. Typically this beam will be somewhat arced.  Why the curve? This allows you to climb up to areas above you:

    As always, I am starting with the game assets. I don't believe in using programmer art because it hurts your understanding of what you are trying to create, it's uninspiring, and you will end up writing your code twice, once you get the final artwork and realize all the mistakes you made.
    I started with textures. I know I want a circular indicator on the floor, a misty spinning effect rising off it, and a beam. I'm going to make all my textures grayscale so that I can control the color with the entity color value and dynamically change it in the game.  Here are my textures I created in about ten minutes in Paint Shop Pro:



    The first texture above is clamped along the X and Y axes and the second one is clamped along the Y axis. I am using uncompressed textures for all of these because they have a lot of soft gradients.
    I created my materials with the following settings, again leaving everything white:

    In 3ds Max I created my indicator model. It's just a plane with a cylinder on top, with the end caps removed:

    When I import it into Leadwerks and apply my materials, the model looks like this:

    I'll show you why I am using uncompressed textures. You can see in this shot the edge of the ring has some ugly artifacts when texture compression is used:

    Here's a closeup. Not something I want to see in VR:

    Now I am going to create an instance of the model in the editor and adjust the color. I want a bright blue glowy color. I am setting the color to RGB 128,255,255 and cranking the intensity way up to 2.0. This effectively sets the entity color to 256,512,512. This color is multiplied by the texture color at each pixel and then clamped to 0-255 (the maximum color range of the monitor). That means that the brightest spots on the material will reach a full 255,255,255 white color and look really intense, while darker parts will be tinted blue:
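    Here is that per-channel math written out as a tiny Lua illustration (not actual shader code), using 0-1 color values instead of 0-255:

    --The grayscale texture value is multiplied by the entity color times the
    --intensity, then clamped to the displayable range.
    function ShadePixel(texel, color, intensity)
        local out = {}
        for i = 1, 3 do
            out[i] = math.min(texel * color[i] * intensity, 1.0)
        end
        return out
    end

    --A midtone texel (0.6) with color 128,255,255 (0.5,1,1) at intensity 2.0
    --hits full white in the green and blue channels: {0.6, 1.0, 1.0}
    local bright = ShadePixel(0.6, {0.5, 1.0, 1.0}, 2.0)

    --A dark texel (0.2) stays tinted blue: {0.2, 0.4, 0.4}
    local dark = ShadePixel(0.2, {0.5, 1.0, 1.0}, 2.0)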

    Notice the object isn't just a flat color, but has a range of color from blue to white. To get this effect I had to increase the intensity over 1.0 to create colors brighter than RGB 255,255,255, and I had to have some red in the color. If I had set the color to RGB 0,255,255 the red channel would never increase and I would have a flat color like this. Not so good:

    If I had set the color to RGB 128,255,255 but left the intensity at 1.0 I would also have a solid color:

    Finally I added a script to the model and saved it as a prefab. The script just rotates the model around slowly on its Y axis, which I think will look pretty good. I'm going to perform the rotation in the Draw() function so it doesn't get called if the object is hidden or offscreen, and I don't think anyone will notice if the rotation doesn't update when they look away:
    function Script:Draw()
        self.entity:Turn(0, 0.1 * Time:GetSpeed(), 0)
    end

    That's it for now. The next step will be to create my teleportation mechanic in VR.
  9. Josh
    The next step is to put our scene into VR and look at everything. I'm about six feet tall and the player model seems the same height, so that all looks right. The controllers, as you can see below, are all working with full articulation of all controls. The teleport indicator looks good too.
    Now we're going to start our VR player script. We start with a new script and initialize the VR environment in the Start() function. A camera is also created in code, and I deleted the camera from the scene.
    function Script:Start()
        if VR:Enable()~=true then
            Print("Error: Failed to initialize VR environment.")
        end
        self.camera = Camera:Create()
    end

    I wanted to look at the controller models and see if there was a child that indicated where the tip of the controller was, for aiming a beam. The code below will wait until a controller model is loaded and then print out the names of all its children.
    function Script:Start()
        if VR:Enable()~=true then
            Print("Error: Failed to initialize VR environment.")
        end
        self.camera = Camera:Create()
        self.partslisted = false
    end

    function Script:UpdateWorld()
        if self.partslisted==false then
            local model = VR:GetControllerModel(VR.Left)
            if model==nil then
                model = VR:GetControllerModel(VR.Right)
            end
            if model~=nil then
                self.partslisted = true
                local n
                for n=0,model:CountChildren()-1 do
                    local subobject = model:GetChild(n)
                    local name = subobject:GetKeyValue("name")
                    System:Print(name)
                end
            end
        end
    end

    The result is below. Three blank lines mean that three subobjects in there have no names. You can tell what some of the subobjects are, but none of them look like what I am looking for.
    body
    button
    led
    lgrip
    rgrip
    scroll_wheel
    status
    sys_button
    trackpad
    trackpad_scroll_cut
    trackpad_touch
    trigger

    None of those look like what I am looking for, so I am just going to use the controller model itself and hope for the best. The code below creates a box sticking out of the controller.
    function Script:Start()
        if VR:Enable()==false then
            Print("Error: Failed to initialize VR environment.")
        end
        self.camera = Camera:Create()
    end

    function Script:UpdateWorld()
        local controller = VR:GetControllerModel(VR.Right)
        if controller~=nil then
            if self.beam==nil then
                self.beam = Model:Box(0.05,0.05,4)
            end
            if self.beam:GetParent()~=controller then
                self.beam:SetPosition(0,0,0)
                self.beam:SetRotation(0,0,0)
                self.beam:SetParent(controller,false)
                self.beam:Move(0,0,-2)
            end
        end
    end

    As a result, we've got this funny light saber-looking thing:

    Now let's replace the box with a sprite and apply our third teleport material to it. We'll make the sprite rotate around its own Z axis to face the camera using the SetViewMode() command.
    function Script:Start()
        if VR:Enable()==false then
            Print("Error: Failed to initialize VR environment.")
        end
        self.camera = Camera:Create()
    end

    function Script:UpdateWorld()
        local controller = VR:GetControllerModel(VR.Right)
        if controller~=nil then
            if self.beam==nil then
                self.beam = Sprite:Create()
                self.beam:SetViewMode(6)
                self.beam:SetSize(0.05,4)
                self.beam:SetColor(1,2,2)
                local mtl = Material:Load("Models/VR/teleport3.mat")
                if mtl~=nil then
                    self.beam:SetMaterial(mtl)
                    mtl:Release()
                    mtl = nil
                end
            end
            if self.beam:GetParent()~=controller then
                self.beam:SetPosition(0,0,0)
                self.beam:SetRotation(0,0,0)
                self.beam:SetParent(controller,false)
                self.beam:Move(0,0,-2)
            end
        end
    end

    Now it really looks like a light saber!

    The next step is to add picking so we can use the beam to place the teleporter. I deleted the teleport indicator prefab from the map and loaded a copy of the prefab up in the Start() function:
    function Script:Start()
        if VR:Enable()==false then
            Print("Error: Failed to initialize VR environment.")
        end

        --Create the player camera
        self.camera = Camera:Create()

        --Load the teleport indicator prefab
        self.teleportindicator = Prefab:Load("Models/VR/teleport.pfb")
        self.teleportindicator:Hide()
    end

    function Script:UpdateWorld()
        local controller = VR:GetControllerModel(VR.Right)
        if controller~=nil then
            --Create the teleporter beam
            if self.beam==nil then
                self.beam = Sprite:Create()
                self.beam:SetViewMode(6)
                self.beam:SetSize(0.05,4)
                self.beam:SetColor(1,2,2)
                local mtl = Material:Load("Models/VR/teleport3.mat")
                if mtl~=nil then
                    self.beam:SetMaterial(mtl)
                    mtl:Release()
                    mtl = nil
                end
            end

            --Reparent the beam, if needed
            if self.beam:GetParent()~=controller then
                self.beam:SetPosition(0,0,0)
                self.beam:SetRotation(0,0,0)
                self.beam:SetParent(controller,false)
                self.beam:Move(0,0,-2)
            end

            if VR:GetControllerButtonDown(VR.Right,VR.TouchpadButton)==true then
                local world = self.entity.world
                local pickinfo = PickInfo()
                local p0 = controller:GetPosition(true)
                local p1 = Transform:Point(0,0,-4,controller,nil)
                if world:Pick(p0, p1, pickinfo, 0, true)==true then
                    self.teleportindicator:SetPosition(pickinfo.position)
                    self.teleportindicator:Translate(0,0.05,0)
                    self.teleportindicator:Show()
                end
            end
        end
    end

    We can move the teleporter indicator around. It climbs straight up walls and there are lots of problems, but you can see the beginnings of a teleporter:
    The next thing we want to do is add a normal check to prevent the teleporter from working if the picked slope is too high. You can get the slope in degrees of any normal with the following bit of code:
    local slope = 90 - Math:ASin(pickinfo.normal.y)

    We're also going to hide the indicator when a valid teleportation destination isn't picked. Here's the complete code:
    function Script:Start()
        if VR:Enable()==false then
            Print("Error: Failed to initialize VR environment.")
        end

        --Create the player camera
        self.camera = Camera:Create()

        --Create teleporter beam
        self.beam = Sprite:Create()
        self.beam:SetViewMode(6)
        self.beam:SetSize(0.05,20)
        self.beam:SetColor(1,2,2)
        local mtl = Material:Load("Models/VR/teleport3.mat")
        if mtl~=nil then
            self.beam:SetMaterial(mtl)
            mtl:Release()
            mtl = nil
        end
        self.beam:Hide()

        --Load the teleport indicator prefab
        self.teleportindicator = Prefab:Load("Models/VR/teleport.pfb")
        self.teleportindicator:Hide()
    end

    function Script:UpdateWorld()
        self.teleportindicator:Hide()
        self.beam:Hide()
        local controller = VR:GetControllerModel(VR.Right)
        if controller~=nil then
            --Reparent the beam, if needed
            if self.beam:GetParent()~=controller then
                self.beam:SetPosition(0,0,0)
                self.beam:SetRotation(0,0,0)
                self.beam:SetParent(controller,false)
                self.beam:Move(0,0,-10)
            end

            --Activate teleporter
            if VR:GetControllerButtonDown(VR.Right,VR.TouchpadButton)==true then
                local world = self.entity.world
                local pickinfo = PickInfo()
                local p0 = controller:GetPosition(true)
                local p1 = Transform:Point(0,0,-20,controller,nil)
                if world:Pick(p0, p1, pickinfo, 0, true)==true then
                    local slope = 90 - Math:ASin(pickinfo.normal.y)
                    if slope<35 then
                        self.teleportindicator:SetPosition(pickinfo.position)
                        self.teleportindicator:Translate(0,0.05,0)
                        self.teleportindicator:Show()
                        self.beam:Show()
                    end
                end
            end
        end
    end

    And now we will add the actual teleport mechanic. We take the picked position, subtract the camera's current XZ position, and add to the existing VR offset. This took a little trial and error, but it works perfectly:
    --Check if teleporter is active and the button was released
    if self.beam:Hidden()==false then
        if VR:GetControllerButtonDown(VR.Right,VR.TouchpadButton)==false then
            local offset = VR:GetOffset()
            local pos = self.teleportindicator:GetPosition()
            local campos = self.camera:GetPosition(true)
            pos.x = pos.x - campos.x + offset.x
            pos.y = pos.y - 0.05
            pos.z = pos.z - campos.z + offset.z
            VR:SetOffset(pos)
        end
    end

    Here it is in action. We cannot yet teleport up to the highest blocks, but we can freely move around the map.
    Next we will make the beam arc so we can climb up to platforms above us.
  10. Josh
    The final step to our VR teleporter mechanic is to make the beam arc. This allows us to climb up to areas above our head.

    The trick to this is to take our single beam picking mechanic and split it up into a lot of little segments, for both the intersection test and the visual display. I decided to make a kind of simple physics simulation for this, rather than using bezier curves or another method. The basic idea is you have a starting position and a velocity. You move a point along that velocity, and then add gravity to the velocity and repeat. You don't have to worry about resistance or anything like that. The result is a nice arc that mimics a ball thrown into the air.
    --Loop through segments making an arc
    for n=1,self.maxbeamsteps do
        p1 = p0 + velocity * speed
        velocity.y = velocity.y + gravity
        p0 = p1 --advance to the start of the next segment
    end

    The other feature I added was smooth movement during teleportation. It's very fast, taking less than a second, but it does a really good job of communicating motion. I myself am sensitive to motion sickness and am able to use this method of locomotion with no problems.
    --Update offset position
    local pos = VR:GetOffset()
    local d = self.targetoffset:DistanceToPoint(pos)
    local speed = 1.0
    if speed>d then speed=d end
    pos = pos + (self.targetoffset - pos):Normalize() * speed
    VR:SetOffset(pos)

    I also decided to make the teleportation beam and indicator green or red, depending on its state, because the meaning of that is universal and easily understood. Here's the complete script, in only 150 lines of code:
    function Script:Start()
        --Initialize VR
        if VR:Enable()==true then
            VR:SetTrackingSpace(VR.Roomspace)
        else
            System:Print("Error: Failed to initialize VR environment.")
        end

        --Create the player camera
        self.camera = Camera:Create()
        self.camera:SetPosition(self.entity:GetPosition(true))
        self.camera:Translate(0,1.6,0)
        self.camera:SetRotation(self.entity:GetRotation(true))
        VR:CenterTracking()

        --Create teleporter beam
        local beam = Sprite:Create()
        beam:SetViewMode(6)
        beam:SetSize(0.05,20)
        beam:SetColor(1,2,1)
        local mtl = Material:Load("Models/VR/teleport3.mat")
        if mtl~=nil then
            beam:SetMaterial(mtl)
            mtl:Release()
            mtl = nil
        end
        beam:Hide()

        --Create beam segments
        self.maxbeamsteps=32
        self.beam = {}
        self.beam[1] = beam
        local n
        for n=2,self.maxbeamsteps do
            self.beam[n] = tolua.cast(self.beam[1]:Instance(),"Sprite")
        end

        --Load the teleport indicator prefab
        self.teleportindicator = Prefab:Load("Models/VR/teleport.pfb")
        self.teleportindicator:SetColor(1,2,1)
        self.teleportindicator:Hide()

        --Initialize offset
        self.targetoffset = VR:GetOffset()
    end

    function Script:UpdateWorld()
        local window = Window:GetCurrent()
        if window~=nil then
            if window:KeyHit(Key.F2) then
                VR:CenterTracking()
            end
        end

        --Check if teleporter is active and the button was released
        if self.teleportindicator:Hidden()==false then
            if VR:GetControllerButtonDown(VR.Right,VR.TouchpadButton)==false then
                local offset = VR:GetOffset()
                local pos = self.teleportindicator:GetPosition()
                local campos = self.camera:GetPosition(true)
                pos.x = pos.x - campos.x + offset.x
                pos.y = pos.y - 0.05
                pos.z = pos.z - campos.z + offset.z
                self.targetoffset = pos
            end
        end

        --Hide beam and indicator
        self.teleportindicator:Hide()
        for n=1,self.maxbeamsteps do
            self.beam[n]:Hide()
            self.beam[n]:SetColor(2,1,1)
        end

        local controller = VR:GetControllerModel(VR.Right)
        if controller~=nil then
            --Activate teleporter
            if VR:GetControllerButtonDown(VR.Right,VR.TouchpadButton)==true then
                local world = self.entity.world
                local pickinfo = PickInfo()
                local p0 = controller:GetPosition(true)
                local velocity = Transform:Normal(0,0,-1,controller,nil)
                local speed = 1
                local n
                local gravity = -0.1

                --Loop through segments making an arc
                for n=1,self.maxbeamsteps do
                    p1 = p0 + velocity * speed
                    velocity.y = velocity.y + gravity
                    self.beam[n]:Show()
                    self.beam[n]:SetPosition((p0+p1)*0.5,true)
                    self.beam[n]:AlignToVector(p1-p0,2)
                    self.beam[n]:SetSize(0.05,p0:DistanceToPoint(p1)+0.02)
                    if world:Pick(p0, p1, pickinfo, 0, true)==true then
                        --Correct the length of the last beam segment
                        self.beam[n]:SetSize(0.05,p0:DistanceToPoint(pickinfo.position)+0.02)
                        self.beam[n]:SetPosition((p0+pickinfo.position)*0.5,true)

                        --Cancel if slope is too steep
                        local slope = 90 - Math:ASin(pickinfo.normal.y)
                        if slope>35 then break end

                        --Show the indicator
                        self.teleportindicator:SetPosition(pickinfo.position)
                        self.teleportindicator:Translate(0,0.05,0)
                        self.teleportindicator:Show()

                        --Recolor the beam
                        for n=1,self.maxbeamsteps do
                            self.beam[n]:SetColor(1,2,1)
                        end
                        break
                    end
                    p0 = p1
                end
            end
        end

        --Update offset position
        local pos = VR:GetOffset()
        local d = self.targetoffset:DistanceToPoint(pos)
        local speed = 1.0
        if speed>d then speed=d end
        pos = pos + (self.targetoffset - pos):Normalize() * speed
        VR:SetOffset(pos)
    end

    function Script:Detach()
        self.teleportindicator:Release()
        self.teleportindicator = nil
        for n=1,self.maxbeamsteps do
            self.beam[n]:Release()
        end
        self.beam = nil
        self.camera:Release()
        self.camera = nil
    end

    And here it is in action. This will be standard in Leadwerks Game Engine 4.5:
     
  11. Josh
    This tutorial demonstrates how to create a high-quality skybox for Leadwerks Game Engine using Vue.
    Download
    Cloudy Blue Skies.zip
    FixVueCubemap.zip
    Required Third-Party Programs
    Vue Esprit
    Exporter Module
    Loading the Example
    Run Vue and select the File > Open menu item.  Extract the zip file above and open the file "Cloudy Blue Skies.vue".

    Atmosphere and Clouds
    You can modify the appearance of the sky with the Atmosphere Editor. Select the Atmosphere > Atmosphere Editor menu item to open this dialog.

    The clouds tab lets you adjust various properties of the cloud layers and add new ones. Skyboxes look best with multiple layers of different kinds of clouds, so don't expect to get the perfect look with just one layer.

    The load button to the right side of the cloud layer list will let you select from a wide range of different cloud types.

    Experiment with different cloud layers to get the look you want. The "Detail amount" setting in particular will really enhance the image, but don't overdo it. You can right-click and drag the mouse to look around in the main panel, so be sure to take a look around to see how the clouds affect the entire sky.
    Lighting
    To edit the sunlight properties in Vue, select the sunlight object in the World Browser on the right side of the main window.

    You can match the exact rotation of the default sunlight angle in Leadwerks to make your skybox line up exactly to the scene lighting. The default sunlight angle in Leadwerks is (55,-35,0). In Vue this corresponds to the values (145,0,215). To get these values we add 90 degrees to the pitch and subtract the yaw from 180. Note in Vue the order of the yaw and roll are switched.
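    Written out as a quick helper (just an illustration, not a built-in command), the conversion looks like this:

    --Convert a Leadwerks sunlight rotation to the values Vue expects.
    --Vue takes the angles in (pitch, roll, yaw) order.
    function LeadwerksToVueSunAngle(pitch, yaw, roll)
        return pitch + 90, roll, 180 - yaw
    end

    print(LeadwerksToVueSunAngle(55, -35, 0)) --> 145  0  215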

    The sun color is very important for the overall composition of our image. In real life we're used to seeing a very high range of light levels in the sky. Computer monitors cannot represent the same range of colors, so images can easily become washed out and lose details. We want to adjust the sun color so we can get the most detail within the spectrum of a 32-bit color display. Like the rotation, the sun color can be modified in the sun properties.

    If the sunlight color is too bright, the image will be overexposed and the cloud shape will become washed out.

    If the sunlight is too dark, it will look gray and desaturated.

    The right sun brightness will give a balanced look between bright spots and shadows. This is the part in the process that requires the most artistic sense. It's a good idea to look at some screenshots or photos for comparison as you adjust your settings.

    You will get quite a lot of variation in brightness across the sky, so be sure to take a look around the whole scene when you are adjusting lighting. You can also adjust the main camera's exposure value to vary the brightness of the rendered image.
    If you want to hide the sun from view you can do this by setting the "Size of the sun" and "Size of the corona" settings both to zero under the "Sun" tab in the Atmosphere Editor.
    Exporting
    To export our skybox the exporter module must be installed. Select the File > Export Sky menu item and the export dialog will appear.

    The "Supporting geometry" setting should be set to "Cube". Set the X value to 1536 and the Y value to 2048. This controls the width and height of the saved image. When we press the OK button, the sky will be rendered out into a vertical cube cross with those dimensions. Each face of the cubemap will be 512x512 pixels.

    By default, your skybox will be exported to the file "Documents\e-on software\Vue 2015\Objects\Atmosphere.bmp". The exported cube cross is a nonstandard orientation. To convert this into a cubemap strip ready to load in Leadwerks, use the FixVueCubemap.exe utility posted above.  Drag your exported image file onto the executable, and it will save out a cube strip in PNG format that is ready to load in Leadwerks.

    Importing
    To import your skybox into Leadwerks, just drag the cubemap strip PNG file onto the Leadwerks main window. Open the converted texture from the Leadwerks Asset Browser. Set the texture mode to "Cubemap", uncheck the "Generate Mipmaps" checkbox, and check the clamp mode for the X, Y, and Z axes. Press the Save button to reconvert the texture and it will appear in a 3D view.

    You can use the skybox in the current map by selecting it in the scene settings.

    Disabling mipmap generation will reduce the size of a 1024x1024 cubemap from 32 to 24 MB. Due to the way the image is displayed, mipmaps aren't needed anyways.
    Final Render
    For the final render, we want each cubemap face to be 1024x1024 pixels. However, we can get a better quality image if we render at a larger resolution and then downsample the image. In Vue, select the File > Export menu item again to open the export dialog. This time enter 6144 for the X value and 8192 for the Y value. Don't press the OK button until you are ready to take a long break, because the image will take a long time to render. When you're done you will have a huge image file of your skybox with a 2048x2048 area for each cubemap face.
    If we resize the image file in a regular paint program, it will create seams along the edges of the cubemap faces. Instead, we're going to pass a parameter to the conversion utility to tell it to downsample the image by a factor of 50%. The "downsample.bat" file is set up to do this, so just double-click on this to launch the executable with the correct parameters. The resulting cubemap strip will be 6144x1024 pixels, with a 1024x1024 area for each face. However, due to the original rendering resolution this will appear less grainy than if we had rendered directly to this resolution.
    Import this texture into Leadwerks as before and enjoy your finished high-quality skybox. Always do a low-resolution pass before rendering the final image, as it can take a long time to process.
  12. Josh
    We now accept popular cryptocurrencies in our store and Marketplace through Coinbase Commerce. That's right, you can now buy software like Ultra App Kit using Bitcoin, Ethereum, Litecoin, Dai, or Bitcoin Cash (if you can figure it out!). Right now it's a novelty, but it's worth trying. Maybe by 2022 it will be your only option?

    Sadly, Dogecoin is not one of the currently supported coins. Soon?
  13. Josh
    The built-in level design tools in Leadwerks3D are great for quickly sketching out a game level. Most of this remains unchanged from the design of 3D World Studio, with a few extras like smooth groups and new primitive types:

     
    When a point entity object type is selected in the side panel, the object creation widget changes to a simple cross hair:

     
    The selection box with tabs that CSG brushes use is great for quickly moving, scaling, rotating, and shearing brushes, but it isn't that great for point entities. That's why I'm implementing a different control style for selected point entities. A 3D widget like the one 3ds Max uses will appear in the 3D and 2D viewports:

     
    To give the user more control, they can choose between global and local coordinate systems to move and rotate entities:

     
    We think this will provide the best combination of fast level editing with CSG brushes, and quick object manipulation for other types of objects.
     
    --EDIT--
     
    Here's an updated view of the 3D control widget. Of all the implementations of this I have seen, I like the Crysis Editor's the best. The next step will be to make the widget change appearance when the mouse is hovered over it. As has been requested in the past, the widget stays on top of everything drawn, and always scales to appear the same size at any distance.

  14. Josh
    In order to get ready for the Workshop, and to prepare for texture locking, some preliminary work needed to be done. First, I needed to improve the CSG texture mapping routine. It was modified to take into account a texture's dimensions. This means that wide, flat textures like this trim piece will be mapped to their texel area, not with the same aspect ratio as a square image:

     
    The default mapping scale is that 1024 texels map to 256 centimeters. This ensures it is easy to line geometry up to match details in the textures, and most materials will just automatically map to the size you want. However, there are many wall textures out there that are only 512x512 texels, so I added a "Mapping scale" feature in the material editor. This can be used to give any material a default mapping multiplier, so it will always show up at the intended scale, without having to adjust texcoords each time you use it.
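    The arithmetic behind that multiplier is simple. Here is an illustration of the convention described above (the exact scale value you want will depend on the look you are after):

    --With the default scale, 1024 texels span 256 centimeters (4 texels per cm).
    local texels_per_cm = 1024 / 256     --= 4
    --A 512x512 wall texture at that density only spans 128 cm...
    local coverage = 512 / texels_per_cm --= 128
    --...so a mapping multiplier of 2.0 makes it cover the same 256 cm as a 1024 texture.
    local scale = 256 / coverage         --= 2.0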
     
    A pick material mouse tool has been added. This is like a color/eye dropper that lets you click something in the scene, and the picked object's material will be shown and highlighted in the asset browser. This is just a nice tool for convenience, and I found I really wanted something like this as I was working.
     
    Finally, a "Treat as one" option has been added in the face properties. When this is checked, the justify buttons will treat all selected faces as one when aligning their texture coordinates.
     
    Justify left with "Treat as one" disabled acts like this, aligning each individual object separately:

     
    And it acts like this when "Treat as one" is enabled:

  15. Josh
    All this month I have been working on a sort of academic paper for a conference I will be speaking at towards the end of the year. This paper covers details of my work for the last three years, and includes benchmarks that demonstrate the performance gains I was able to get as a result of the new design, based on an analysis of modern graphics hardware.
    I feel like my time spent has not been very efficient. I have not written any code in a while, but it's not like I was working that whole time. I had to just let the ideas settle for a bit.
    Activity doesn't always mean progress.
    Anyways, I am wrapping up now, and am very pleased with the results. It all turned out much much better than I was expecting.
  16. Josh
    Leadwerks3D will ship with a finished game demo to demonstrate how to use the software. Darkness Awaits is a third-person dungeon explorer with a 45-degree view. It's like a cross between Diablo and Legend of Zelda: A Link to the Past. This is an idea I had back in my very early days of game development, before I even learned to program. This was originally done in the Quake 1 engine. It didn't really go anywhere, but what we had was awesome. You could run around and shoot skeletons with flaming arrows, in a third-person view. My job was actually map design. My favorite part was this cozy little house you started off in with a fireplace and a warm feeling, before going outside into a snow storm to kill monsters.
     
    And so, the project is being resurrected to demonstrate game development with Leadwerks3D. Below are a few possible logo styles. Which is your favorite?
     




  17. Josh
    Development of Leadwerks 3 was (theoretically) completed today. We still have a lot of testing and bug hunting to do, but all features for release are written.
     
    This is where it started:
    http://www.leadwerks.com/werkspace/blog/1/entry-500-day-1/
  18. Josh
    After a much-needed vacation I'm back, and ready to rock.

    IGDA Summit
    Everything at the IGDA Summit was mobile, mobile, mobile. Based on this kind of behavior in the industry, you'll have to forgive me for thinking at one time that mobile was the next big thing. There are two problems: Perhaps half the companies there were support services to help you monetize your app. I haven't heard the word "monetize" so much since the height of the Social boom, when people had huge numbers of free users and were looking for a way to convert them into paying customers (which didn't happen). I take it as a bad sign. If you are a corn farmer, and a bunch of door to door salesmen come to your farm offering to help you monetize your corn selling business, you probably have a problem.
    At the same time, all the indie developers unanimously said they were having trouble making money, and that the market was very crowded. The general consensus was that it wasn't enough just to make a good mobile game. You had to have a way to promote it. One developer told me his marketing plan was to create a Facebook page. I guess he had never gone that far before, and was confident it would be the answer. Another guy complained he couldn't even give away his totally free game, with no ads or in-app purchases.

     
    So to me right now mobile looks like one of those things where a bunch of companies operate at a loss for a few years in hopes they will get acquired by EA, and then 90% of them disappear. Does not feel like a promising place if you are actually trying to provide goods to customers in exchange for money. (Of course there will still be some major successes, but those seem like the exception.)
     
    I was the only Linux developer at the event, as far as I know. Based on the strength of our recent Kickstarter campaign, Linux is arguably our lead development platform now. Which I would not have guessed a year ago.

    Leadwerks 3.1
    Development of Leadwerks 3.1 for Linux and Steam begins in earnest now. I'm focusing on graphics and terrain first, since those tasks cannot be done by anyone but me. Linux and Steam integration can be performed by the other devs, and I will just contribute to those depending on how much time I have. I made a lot of progress on terrain recently, and my goal is to have the terrain editor implemented by the end of the week. This will be included in the 3.0 version and rolled out when it is available. 
    I will also be attending to any outstanding bug reports. Having the current version 3.0 in use before the graphics upgrade in 3.1 is a great way to evaluate the editor and make sure everything is solid.
     
    So I am looking forward to the development of 3.1 over the next few months. After all the recent excitement, some quiet coding time sounds great to me.
  19. Josh
    Scripting support is very important in the new engine. The script implementation in the old engine was great for quickly adding animation and simple behavior to objects, but wasn't capable of supporting advanced gameplay. I've put a lot of work into our implementation of Lua script to bring it to a new level.
     
    In the old engine, if a script error occurred the program would simply crash. The new engine supports detailed error handling and will automatically open files and show you where an error occurs in a script:

     
    You can insert breakpoints in your program with the Debug:Stop() command. When a breakpoint is hit, the program pauses. You can browse all variables running in the virtual machine in the script debug pane. You can even see actual memory addresses of C++ objects in the Lua virtual machine!:
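    For example, dropping Debug:Stop() into a script method pauses the game at that line so you can inspect everything (self.health here is just a made-up variable):

    function Script:UpdateWorld()
        if self.health < 0 then
            Debug:Stop() --execution pauses here and the debugger takes over
        end
    end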

     
    Of course, you can also step through your script line by line with code stepping:

     
    You can modify object scripts, and when you hit F5 the current scene will be saved and loaded in your project. The editor launches the game as an external executable, so you don't have to worry about messing up your editing session, and you can even debug the virtual machine in C++ programs. This makes the editor stable and solid, and game iteration is still very fast.

     
    By focusing on improved Lua support and really digging into the details of the language, I hope we can provide a scripting environment that is very capable and pleasant to use.
  20. Josh
    Leadwerks Game Engine 5 Beta now supports debugging Lua in Visual Studio Code. To get started, install the Lua Debugger extension by DevCat.
    Open the project folder in VSCode and press F5. Choose the Lua debugger if you are prompted to select an environment.
    You can set breakpoints and step through Lua code, viewing variables and the callstack. All printed output from your game will be visible in the Debug Console within the VS Code interface.

    Having first-class support for Lua code in a professional IDE is a dream come true. This will make development with Lua in Leadwerks Game Engine 5 a great experience.
  21. Josh
    I am usually very excited to read about new graphical techniques, and even new hardware approaches, even if they are presently impractical for real use. I was very interested in Intel's Larabee project, even though I didn't expect to see usable results for years.
     
    However, sometimes articles get published which are nothing but snake oil to raise stock prices. The uninformed reader doesn't know the difference, and these articles are usually written in such a way that they sound authoritative and knowledgeable. It's unfair to consumers, it's unfair to stockholders, and it hurts the industry, because customers become unable to differentiate between legitimate claims and marketing nonsense. This one is so over-the-top, I have to say something.
     
    In an attempt to stay relevant in real-time graphics, Intel, the company that single-handedly destroyed the PC gaming market with their integrated graphics chips, is touting anti-aliasing on the CPU.
     
    There's a nice explanation with diagrams that make this sound like an exciting new technique Intel engineers came up with. The algorithm looks for edges and attempts to smooth them out:

     
    It's so advanced, that I wrote this exact same algorithm back in 2007, just for fun. Below are my images from it.
     
    Original:

     
    Processed:

     
    The reason this is dishonest is because you would never do this in real-time on the CPU. It may be possible, but you can always perform antialiasing on the GPU an order of magnitude faster, whether the device is a PC, a netbook, or a cell phone. I don't think Sebastian Anthony has any clue what he is writing about, nor should he be expected to, since he isn't a graphics programmer.
     
    Furthermore, swapping images between the GPU and the CPU requires the CPU to wait for the GPU to "catch up" to the current instructions. You can see they completely gloss over this important aspect of the graphics pipeline:
    Normally, graphics are a one-way street from the CPU, to the GPU, to the monitor. The CPU throws instructions at the GPU and says "get this done ASAP". The GPU renders as fast as it can, but there is a few milliseconds delay between when the CPU says to do something, and when the GPU actually does it. Sending data back to the CPU forces the CPU to wait and sync with what the GPU is doing, causing a delay significant enough that you NEVER do this in a real-time renderer. This is why occlusion queries have a short delay when used to hide occluded objects; the CPU doesn't get the results of the query until a few frames later. If I made the CPU wait to get the results before proceeding, the savings you would gain by hiding occluded geometry would be completely negligible compared to the enormous slowdown you would experience!
     
    What Intel is suggesting would be like if you went to the post office to mail a letter, and you weren't allowed to leave the building until the person you were sending it to received your letter and wrote back. They're making these claims with full knowledge of how ridiculous they are, and counting on the public's ignorance to let it slide by unchallenged.
     
    So no, Sebastian, this is not going to "take a little wind out of AMD’s heterogeneous computing sails". Please check with me first next time before you reprint Intel's claims on anything related to graphics. If any Intel executives would like to discuss this with me over lunch (your treat) so that I can explain to you how to turn the graphics division of your company around, I live near your main headquarters.
  22. Josh
    It's 12:30 in the morning, but I have the model scripts working the way I want. Thanks to "Nilium" for his tips, and to everyone who gave their feedback. I am confident in this revision. I have a lot of scripts that need to be slightly altered, but it's just a matter of adjusting the function structure, not really changing any existing code.
     
    Here's the light_ambient script:

    require("scripts/class")
    require("scripts/linkedlist")

    local class=CreateClass(...)

    function class:Spawn(model)
        local object=self.super:Spawn(model)

        function object:Update()
            AmbientLight(self.model.color)
        end

        function object:SetKey(key,value)
            if key=="color" then
                local returnvalue=self.super:SetKey(key,value)
                self:Update()
                return returnvalue
            elseif key=="intensity" then
                local returnvalue=self.super:SetKey(key,value)
                self:Update()
                return returnvalue
            else
                return self.super:SetKey(key,value)
            end
            return 1
        end

        function object:Kill(model)
            local model,object
            --Iterate through all instances of this class.
            --If any instances are found, use them to set
            --the ambient light value.
            for model,object in pairs(self.class.instances) do
                if object~=self then
                    object:Update()
                    break
                end
            end
            self.super:Kill()
        end

        object.model:SetKey("intensity","0.5")

        --Call the update function before finishing.
        --This can only be called after the function is declared.
        --We don't actually need to call it here since setting the intensity key
        --will do that for us, but it's still a good idea as a general rule.
        object:Update()
    end

    function class:Cleanup()
        --Restore default ambient light setting.
        AmbientLight(Vec3(0.5,0.5,0.5))
        self.super:Cleanup()
    end