nick.ace

Members
  • Posts

    647
  • Joined

  • Last visited

Posts posted by nick.ace

  1. Yes, you would need an animated (or at the very least rigged) mesh. A static mesh isn't going to be very useful for this.

     

    You don't need to recode anything. You just need to call FindChild() to get references to the bones in code, and set the position and rotation of each one to the corresponding ragdoll body you made in the video. The model will still be allowed to animate; you just need to not set the position or rotation while it's doing this.

    • Upvote 1
  2. Does anyone know how to write tessellation shaders? I looked at the example and was able to integrate it with my custom shader, but the control and evaluation shaders from diffuse-normal-specular-tess.shader don't really seem to do anything. I want to be able to take a triangle and subdivide it into an arbitrary number of triangles. There seems to be a lack of tutorials online for tessellation shaders :P.

  3. Yeah, sure. You need to call FindChild() from your mesh; that function takes a string where you can specify a bone name. The bone is an entity, so you can call SetPosition() and SetRotation() on it with the position and rotation of the corresponding ragdoll joint (from GetPosition() and GetRotation()). Make sure you set the global flag to true though. You should be all set with this, but it'll end up being a lot of copy/pasted code since you have to do this for each joint.
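    A rough sketch of what that loop might look like in a Lua script (the bone names and the self.ragdollParts table are my own assumptions; use whatever names your rig actually has):

    ```lua
    -- Sketch: drive a skinned model's bones from ragdoll rigid bodies.
    -- self.ragdollParts is an assumed table mapping bone names to the
    -- ragdoll entities you created for each limb.
    function Script:UpdateWorld()
        for boneName, ragdollPart in pairs(self.ragdollParts) do
            local bone = self.entity:FindChild(boneName)
            if bone ~= nil then
                -- Pass true for the global flag so everything lines up in world space
                bone:SetPosition(ragdollPart:GetPosition(true), true)
                bone:SetRotation(ragdollPart:GetRotation(true), true)
            end
        end
    end
    ```

    Using a table like this at least avoids copy/pasting the same two lines for every joint by hand.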

    • Upvote 1
  4. Is this a physics object? If so, it might not be the best idea to combine absolute position commands with the natural physics of the ball. I would suggest doing one or the other. Also, there are two issues with using SetVelocity() the way you are. One is that it's not in the UpdatePhysics() loop, which means it may not synchronize completely with the physics. The other is that the ball still has acceleration, and collisions combined with absolute movement can cause physics glitches with super acceleration. Have you taken a look at the marble demo? It might be a good place to look for how to handle the ball.
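    For comparison, here's a minimal sketch of driving the ball with forces from inside UpdatePhysics() instead (the self.moveForce property and the key bindings are assumptions, not anything from your script):

    ```lua
    -- Sketch: move a physics-driven ball with forces rather than SetVelocity().
    -- UpdatePhysics() runs in step with the simulation, so forces applied here
    -- stay synchronized with collisions.
    function Script:Start()
        self.moveForce = 10.0 -- assumed tuning value
    end

    function Script:UpdatePhysics()
        local window = Window:GetCurrent()
        local force = Vec3(0, 0, 0)
        if window:KeyDown(Key.W) then force.z = force.z + self.moveForce end
        if window:KeyDown(Key.S) then force.z = force.z - self.moveForce end
        -- Global flag true: apply the force in world space
        self.entity:AddForce(force, true)
    end
    ```

    This way the physics engine still owns the ball's motion, and collisions behave normally.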

  5. Looks good! I really like the environment and the HUD! For the ambient lighting, if you are thinking of changing it, why not just interpolate it across the mine? Have multiple pivots and then do some sort of interpolation from there. Even if you have hundreds of these pivots, you could just break up the distance checking per frame to keep performance reasonable if you needed to do that.
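    The interpolation idea could be as simple as an inverse-distance weighted blend between pivots, something like this sketch (self.ambientPivots as a table of {pivot, color} pairs and the SetAmbientLight call are assumptions; adapt to however your version exposes ambient lighting):

    ```lua
    -- Sketch: blend the ambient color between hand-placed pivots by
    -- inverse distance to the player, so lighting shifts smoothly
    -- as you move through the mine.
    function Script:UpdateWorld()
        local pos = self.player:GetPosition(true)
        local r, g, b, weightSum = 0, 0, 0, 0
        for _, entry in ipairs(self.ambientPivots) do
            local d = pos:DistanceToPoint(entry.pivot:GetPosition(true))
            local w = 1.0 / math.max(d, 0.001) -- closer pivots weigh more
            r = r + entry.color.x * w
            g = g + entry.color.y * w
            b = b + entry.color.z * w
            weightSum = weightSum + w
        end
        if weightSum > 0 then
            self.entity.world:SetAmbientLight(Vec4(r / weightSum, g / weightSum, b / weightSum, 1))
        end
    end
    ```

    With hundreds of pivots you'd spread the distance checks across frames as mentioned above, instead of looping over all of them every frame.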

  6. How did you get the ragdoll physics to work so well? I remember spending hours only to find out that the joints are super-elastic when I tried to do it.

     

    Attaching isn't too difficult. All you do is get the model's children and set the position of each bone to the ragdoll's limbs (you don't have to parent all the bones though). One issue I remember having is that sometimes the model will disappear because its bounding box kind of gets messed up. I don't remember how to fix that part, but one thing that can help a little is to reposition the mesh to the torso of the ragdoll and manipulate the bones there.

    • Upvote 2
  7. Sorry, I think I need to reword my question. I'm trying to write to a custom buffer using a model fragment shader. I want to access this buffer using a post-process shader. Is there a way to do this? Right now the only way I can think of accomplishing this is to cram values into the depth/normal/diffuse buffers, but then that messes up other post-processing effects.

  8. Lol, I think it's funny how everyone on this thread was on one of those posted threads giving the same advice!

     

    In all seriousness, I actually thought Sketchup was very intuitive! I threw everyone for a loop with this suggestion though. Anim8or is a pretty decent free program for fine-tuned modelling. Despite its age, it's still great for creating low-poly objects. I've used many other 3D modelling programs though.

  9. You may have seen this game before as "Snoop Dogg Simulator" but has since undergone a change into "Meme Simulator" so that it can be released publicly without myself getting in trouble.

     

    Just to preface, I'm not a lawyer, and this is not legal advice.

     

    That's not how intellectual property law works in the U.S. at least. Yes, changing the name may remove some trademark infringements. But you are still using art that could infringe upon more trademarks and copyrights.

  10. This isn't so much of a game dev challenge as it is an algorithm issue. It's easy to keep rooms from intersecting by using a 2D or 3D array. The difficult part is making the result be what you want. You need to define how you want the procedural generation to go.

     

    In particular, I would look at these:

    https://en.wikipedia.org/wiki/Category:Graph_algorithms

     

    I would particularly look at the minimum spanning tree ones:

    https://en.wikipedia.org/wiki/Kruskal%27s_algorithm

    https://en.wikipedia.org/wiki/Prim%27s_algorithm

     

    Then, you should have a decent base to create this. But you still need to position the nodes and stuff. It can get somewhat complicated on the conceptual end or the coding end, so if you haven't programmed before, then this might not be the best project to start out with.
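    To make the minimum spanning tree suggestion concrete, here's a plain-Lua sketch of Kruskal's algorithm over room centers (the room/edge representation is my own; no engine calls involved):

    ```lua
    -- Sketch: Kruskal's algorithm to decide which rooms to connect with
    -- corridors. Rooms are {x, y} centers; edges are weighted by squared
    -- distance, so the tree prefers short corridors.
    local function find(parent, i)
        while parent[i] ~= i do
            parent[i] = parent[parent[i]] -- path compression
            i = parent[i]
        end
        return i
    end

    local function kruskal(rooms)
        -- Build every candidate edge between room pairs
        local edges = {}
        for i = 1, #rooms do
            for j = i + 1, #rooms do
                local dx = rooms[i].x - rooms[j].x
                local dy = rooms[i].y - rooms[j].y
                edges[#edges + 1] = {i = i, j = j, w = dx * dx + dy * dy}
            end
        end
        table.sort(edges, function(a, b) return a.w < b.w end)

        -- Greedily take the shortest edge that doesn't form a cycle
        local parent = {}
        for i = 1, #rooms do parent[i] = i end
        local tree = {}
        for _, e in ipairs(edges) do
            local ri, rj = find(parent, e.i), find(parent, e.j)
            if ri ~= rj then
                parent[ri] = rj
                tree[#tree + 1] = e -- corridor between rooms e.i and e.j
            end
        end
        return tree -- #rooms - 1 edges connecting every room
    end
    ```

    The resulting tree guarantees every room is reachable with no redundant corridors; you can always add a few extra edges back in afterwards if you want loops.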

  11. When you published, did you choose the option to automatically pack files (do you see a .zip file containing your items)? This may not work with your script if this is the case because your file is being dynamically loaded.

  12. I think Brutile means set a breakpoint, but it's weird that it wouldn't crash in debug mode. I'm kind of confused. Is it crashing on startup or when you are in game reading a note? Those are two totally different things.

  13. If other programs can open it without errors, then it's an issue with Leadwerks. I don't know if Josh programmed the decoder (he may have used a library). But anyway, there are many different ways to represent the alpha channel in PNG images (see section 6.2):

    https://www.w3.org/TR/PNG/#4Concepts.PNGImageTransformation

     

    According to a StackOverflow post I saw, alpha channels are handled differently (they could not be placed in the tRNS chunk):

    https://www.w3.org/TR/PNG-Decoders.html

    https://www.w3.org/TR/PNG/#11tRNS

     

    I may have this backwards (I've never actually written a decoder for this), but the more common form of PNG images uses 24-bit (RGB) palettes with a tRNS chunk to map alpha values to those entries (which can save on memory since undefined alpha values will default to 255). However, you could also have 32-bit (RGBA) palette entries. This is apparently less common, but still valid. So, Substance Designer must be using this. The downside to something like this is that you have a larger palette, but texture accesses could be quicker since the palette is all in one spot in memory. I think that's why this method isn't as popular. With 4K textures though, maybe 32-bit palettes are better for quicker decoding.

  14. Sorry, I don't think I explained it well. I want to set a global uniform across all shaders (since I don't think there's a way to access the vegetation shader in code since it's tied to the terrain class).

  15. It's not the DXTn compression methods that cause this, though. Those are specific texture compression schemes that are better for quick texture lookups in VRAM. Once the PNG is decoded in the Leadwerks editor, it is encoded with DXTn compression (except when you select uncompressed). The TEX file, I think, contains the new texture, which is why when you change the PNG source file, the TEX texture will also change. The problem is with the PNG being decoded.

  16. This can't be a problem with SD since it works with other programs. There are multiple ways to encode PNG images, and it's possible that Leadwerks only decodes it in certain way(s). I'm thinking SD is exporting 32-bit palette color PNGs or something, and since it's not the most common format, Leadwerks doesn't directly support it. I don't really have the time or desire to investigate the actual encoding techniques used, but this sounds like the issue to me.

    • Upvote 1
  17. Yes it is application specific. Until today I had never seen an 8-bit representation of the alpha channel in a PNG as I do with, say, a Targa. I did a bit of reading and Photoshop is specifically hiding the alpha channel in PNGs.

     

    From the Adobe help desk: Transparency, alpha channels, and PNG

     

    PNG does not support arbitrary alpha channels like other formats such as TIFF. PNG specifies that the fourth channel in a file is transparency, and only transparency. When you open a PNG file with transparency in Photoshop, it is considered a single layer image. It is not a flat background image. Alpha channels can contain anything, while transparency is a specific channel relationship. You can have multiple alpha channels per document, but only one transparency channel. Photoshop handles transparency and alpha channels separately.

     

    That doesn't make any sense. A PNG can contain whatever it wants. The compression scheme is purely an algorithm on multichannel data (probably designed to better compress RGBA, but that's not a limitation). For example, in the early days of GPGPU computing, textures were used to perform non-graphics calculations. It's Adobe's fault if they want to handle a PNG like that. They probably mention TIFF because they have business interests in that format lol. But even that format seems to have the same bit-depth, so they would have to be dividing a channel or doing some weird math. I haven't used Photoshop so I don't really know the difference between single-layer and background image. Anyway, I think this is the technical reason why Photoshop and Leadwerks both have issues importing some types of compressed PNGs:

    http://graphicdesign.stackexchange.com/questions/5888/why-cant-photoshop-properly-open-this-png

     

    I actually didn't even notice this problem until this texture came around.

     

    @ScrotieFlapWack

    Sorry wrong texture. Open it up in Paint.NET or something and resave it. The encoding is probably not handled correctly by the Leadwerks importer, and this is probably the same issue that Photoshop faces.

    • Upvote 1
  18. @ScrotieFlapWack

    When I open the diffuse texture up in Leadwerks, it works. I had to change the compression type though. Every type of compression except DXT1 (the default one) worked.

    • Upvote 1
  19. That one was straight out of substance designer, I always was under the impression png didn't write to an independent channel.

     

    It depends on the type of encoding for PNG (there are multiple types) as well as the program used to view it. A lot of image formats can vary a bit in their encoding (Paint.NET gives you some options). There are also API limitations, such as if you open a 32-bit .png in MS Paint, it'll be opaque. If you open a 32-bit .png in Paint.NET, then it'll be transparent.

    https://en.wikipedia.org/wiki/Portable_Network_Graphics
