Everything posted by Niosop

  1. Awesome, that's almost exactly what I'm going for. Will add a couple extra textures (base texture + 4 in RGBA channels), and normal/spec maps for each, for a total of 10. Then I just have to write a little program that will let you walk around your scene and splat the textures onto your static objects to add variety. Have it export the data you painted to files and make a custom post-load function that will look through and apply them, and we're good to go. Thanks so much, this will help save a lot of time.
  2. Niosop

    Revisiting my code

    Yes, that would be very helpful.
  3. Would you recommend making it a user-defined-size RGBA texture that maps the other textures instead (like the terrain layers do)? That has the advantage of not requiring a high vertex count to be able to add layers, but it may require a second non-overlapping/non-tiling UV set to be defined on the mesh to work properly if the mesh uses tiling or shared UV space. Doing it per-vertex lets you use it regardless of UV layout, but doesn't work well on very low poly meshes. I guess having both methods available would be ideal, but I figure I'll just focus on one for now. Where does using a single shared instance of models/meshes/surfaces help performance? Just RAM, or VRAM/GPU as well? I'd assume every instance is still an additional draw call, right?
  4. Well, I'm trying to implement vertex painting so you can use the same model/mesh many times but each can have different vertex colors. Eternal Crisis is working on a shader that will blend different textures based on the vertex color. Hopefully it will make it so that you can have a ton of uniquely textured instances of objects without the overhead of creating a new texture for each of them. I thought about using the decal system, but that creates a new surface and thus a new draw call for each, which could get really expensive. I'll try just creating a new mesh/surface on the fly and see how that works. Guess it will probably only work for things loaded with LoadMesh and not LoadModel, thus no LOD would be possible. Not even sure how well it will work with mipmapping. Only one way to find out I guess unless Josh has attempted this in the past and has any comments about the viability/hurdles/etc.
  5. Yeah, his physics mesh isn't used at all, I just use a cylinder around it. Did you see the second video w/ him actually colliding w/ stuff? The reason his mesh is so far off is just because of how I exported him from the modeling program. He's off the origin in the frames I'm using. The code accounts for that as far as the animation, model position and collision cylinder, but the editor shows the auto-generated one off by that amount. He's also kinematic, so collisions on him are off, but his Lua script creates a collision cylinder that is updated to be centered around him every Update. This keeps him from being affected by collisions (which caused really jerky movements and sent him flying away sometimes) but allows him to affect other things. You can download the Unity locomotion system at http://unity3d.com/support/resources/unity-extensions/locomotion-ik and view it in action at
  6. Is there any way to set the vertex color of an instance of a surface? SetVertexColor changes it for all instances of the surface. I suppose I could reconstruct the surface of the mesh in code and swap it out, but would be nice if there was a built in way.
  7. So you want a stripped down game type that does:

        dofile('Scripts/constants/keycodes.lua')
        fw = GetGlobalObject("framewerk")

        while KeyHit(KEY_ESCAPE) == 0 do
            fw:Update()
            fw:Render()
            Flip(0)
        end

    Then you handle all keyboard input, camera control, etc. in separate objects that you drop in? That's fine, but it doesn't fix any of the problems you mentioned. In your example, what if you drop in both a first person camera object and a third person camera object? They're going to fight with each other and you're in "camera controller hell", or "keyboard input hell", or whatever.

    Your method is totally usable, and I'm all for breaking stuff down into self contained components; I'll possibly use this paradigm in my own code. But it's hard to tell where the divide should occur. In order to avoid conflicts you have to break your scripts down into ridiculously small pieces: "This script rotates the camera around the X axis in the negative direction by the value you set in the 'mouse sensitivity' box when you move the mouse down", then another one to rotate the camera up, then another to handle the left mouse button, another one to parent the camera to some object you define, etc, etc.

    It might be kind of cool to have a totally visual system like that where you never have to write a single line of code to make it do something (a la Kismet), but performance would tank due to function call overhead, and you'd be in the same position of "This script handles mouse movement, but I want to use this horizontal mouse movement script with it... is there a vertical only version I can download?"

    Anyways, it's not my intent to disparage your idea; I think compartmentalization is great. I'm just not sure it's Josh's job to decide how far things should be broken down when we can easily just include a README file saying to save the above code to a file and set it as the default game file in the editor.
  8. They are game scripts. You really can't strip them down, or when you hit the "play in game" button it won't do anything: they have to create a camera, position it, and handle keyboard input. If you're pre-packaging an object w/ a Lua script, then you should package that *object* so it's self contained and does what it does regardless of the game type. If you're packaging a game type (FPS, third person, RTS, etc.) then you make a game script. Those are just example ones; everyone should either make their own, or not use one and handle that logic in compiled code. If you have multiple scripts that control the camera, you're going to need one game script that knows how to switch between them anyways. So essentially there is no "base lua file" that you are meant to use. There are just examples of a couple of possible ones, and one is set as the default so when you hit the "Play" button it actually does something.
  9. Are you talking about base.lua? Or the game play scripts like fpscontroller.lua, driver.lua, etc?
  10. Really? Have a link to it? Eternal Crisis is working on one now, and I'm working on a little editor that will let you do the painting and save out a vert color file, but if there's already a shader written it would make EC's job easier.
  11. Niosop

    LinePick

    It does if you use the correct collision mask. 5 worked in my case, but it might also cause it to collide with other things. Mess around w/ the value and see if you can find one that works for you. I don't know if the collision is a bitwise mask and 5 is actually taking into account 4 and 1 (whatever they are) or if it means something else. I should probably read up on it more.
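    For illustration, if collision type really is a bitmask, then 5 would just be flags 4 and 1 combined. A rough sketch (what the individual bits actually mean is a guess; only the arithmetic is the point):

        local rayStart = Vec3(0, 10, 0)            -- example pick endpoints
        local rayEnd = Vec3(0, -10, 0)
        local mask = 4 + 1                         -- if it is bitwise, 5 = both groups at once
        local pick = LinePick(rayStart, rayEnd, 0, mask)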
  12. At some point in the future it would be nice if the editor would allow you to vertex paint meshes on a per instance basis. This would let us add almost infinite variety to meshes without having to generate a separate texture for each instance (moss on rocks, cracks on pillars, etc, etc). Ideally we'd be able to select up to 5 materials for a mesh, and use the RGBA channels to determine blending between them kind of like the terrain system currently does. I suppose the .sbx file would have to store vertex color information along with the position/scale/rotation info it already does or reference a separate file for each mesh that had vertex painting applied. A separate file by sbx assigned ID number might be best because then it could be binary and take up much less room. It could be just like the terrain editor paint feature, just allowing you to pick materials instead of just textures. But even just using something exactly like the terrain shader would be fine (diffuse/normal/specular).
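    To be concrete about the blending: each painted channel would weight one material, with whatever is left over going to the base. The Lua below is only a sketch of that math (the real thing would live in a fragment shader, and all the names are made up):

        -- Sketch: combine five material samples using a painted RGBA weight
        function BlendFive(base, m1, m2, m3, m4, w)
            -- w.r, w.g, w.b, w.a are the painted weights, each in the 0..1 range
            local baseWeight = math.max(0, 1 - (w.r + w.g + w.b + w.a))
            return base * baseWeight + m1 * w.r + m2 * w.g + m3 * w.b + m4 * w.a
        end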
  13. Niosop

    LinePick

    LinePick(Vec3 start, Vec3 end, radius, collision type)

    Example usage (modified from what's in the ActionSnake game):

        function GetGroundHeight(ent)
            local v = ent:GetPosition(1)
            local v1 = Vec3(v.x, v.y + 10, v.z)
            local v2 = Vec3(v.x, v.y - 10, v.z)
            local Pick = LinePick(v1, v2, 0, 5)
            local groundheight = ent.position.y
            local CollisionEntityName = ""
            if Pick ~= nil then
                CollisionEntityName = Pick.entity:GetKey("class")
                if CollisionEntityName == "Terrain" then
                    groundheight = Pick.position.y
                end
            end
            return groundheight
        end

    Note: I'm not sure about the 5 for collision type. It works for what I'm using it for though. For most things it seems the BlitzMax usage is what is used for the Lua implementation.
  14. Fraps works pretty well but you only get 30 seconds of recording at a time w/ the free version.
  15. Does anyone know how UE3/UDK handles materials? I love their system. Do they have one big "uber" shader that does different stuff depending on what inputs are passed, or does it dynamically pick a shader depending on what's passed in? Are all the possible effects handled in the shader, or does it go through some code that updates the shader with new values (for panning, sine wave generation, etc.)? I think if a similar material editor could be added to LWE it would bring us one step closer to going toe to toe with the big boys.

    Here's some examples of cool stuff you can do with it:
    http://www.hourences.com/book/tutorialsue3mated2.htm (example of the sine and time functions, UV coordinate scaling)
    http://www.chrisalbeluhn.com/UDK_Asset_Position_Offsets_Texturet_Tutorial.html (texture offsetting based on world coordinates)
    http://www.3dbuzz.com/vbforum/simplevideo.php?v=894&t=-0.5 (detail normals)

    I don't doubt that anything that can be done w/ the UDK material editor can be done in LWE using a custom shader and some code to pass values to it, but it requires deep knowledge of GLSL to do so. A visual system like this would open up a lot of possibilities for us non-shader-guru types.

    So what do you shader gurus think? Would a single uber-shader work? What's the limit on shader size? Would it have a big performance impact? It seems that the current system is pretty much a couple of big shaders that have parts selectively enabled/disabled depending on which defines are passed. Any insight on how this might be implemented would be appreciated. I don't mind diving in and getting my hands dirty with it, I just wanted to get some feedback from people who know what they're talking about before I waste a bunch of time going down the wrong path. For instance, can the shader handle changing values based on time, or would the program have to handle that by modifying a uniform?
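    For that last part, the fallback I'm picturing is just driving a uniform from the main loop each frame, something like this (myShader and SetUniform are hypothetical here, I don't know what the actual LWE call would be, if there even is one; fw is the usual framewerk object):

        -- Sketch: feed elapsed time to a shader so it can do panning/sine effects itself
        local startTime = os.clock()
        while KeyHit(KEY_ESCAPE) == 0 do
            local t = os.clock() - startTime       -- seconds since start
            myShader:SetUniform("time", t)         -- hypothetical call, for illustration only
            fw:Update()
            fw:Render()
            Flip(0)
        end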
  16. Niosop

    What is Dot3?

    dot3 is just a naming convention. They should be standard tangent space normals. Are you using a specular shader? If so, it uses the alpha channel of the normalmap as the specular map I think. So you might want to go in and tone down the alpha because it's probably set to a solid 255.
  17. Isn't it in pixels? Can the mouse move fractional pixels?
  18. Maybe it's the PositionEntity call that's not working. Try entity.cameraPivotCube:SetPosition(entity.cameraPivot:GetPosition(1), 1) or similar? Why not just parent the cubes to the pivots instead of manually moving them?
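    If you do parent them, it's a one-time call and the cube follows the pivot from then on, e.g. (assuming those are the same entities from your script):

        -- Parent the cube once; no per-frame repositioning needed
        entity.cameraPivotCube:SetParent(entity.cameraPivot)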
  19. someThing:SetParent(otherThing)
  20. This is kind of cool too, results are very pretty: http://code.google.com/p/pixelcity/ Josh, any chance of adding a function to TTexture that will give us a pointer to the actual raw texture data? So if we use mytexture = CreateTexture( 512, 512, TEXTURE_RGBA ) then we could call mytexture.GetTextureData and get a pointer to an array of RGBA values or something? Maybe we can already do something similar; I really don't know how it works. I'd like to procedurally generate some textures at runtime, I just can't figure out a way to get access to the data to modify it.
  21. Here's something a little sleeker: http://hackaday.com/2009/10/26/head-mounted-computer/
  22. Why are you running spawn in your Kill? You should call base_Kill(model), but call it AFTER you do your other stuff. As an alternative, you could do:

        function Kill(model)
            entity.playerPivotCube:SetParent(model)
            base_Kill(model)
        end

    This should cause the playerPivotCube to get destroyed along with its parent. I haven't tested this though, so it could result in bad things like your dog getting sick on your carpet or something.
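    In other words, the first option is just a matter of ordering (the comment is a placeholder for whatever your own script needs to do):

        function Kill(model)
            -- do your other stuff here first
            base_Kill(model)   -- then let the stock Kill run last
        end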
  23. I think it might be a poly limit as opposed to an object limit. Maybe see if it doesn't like files w/ >64k polys?
  24. I see what you mean Naughty Alien, I just don't see the point. Why would I want to duplicate then strip out movement information when I can just export the files I have and let the computer do it for me? If you had an already normalized animation I guess it would be helpful. Just different approaches I think. You prefer modeling/animating, I prefer coding. Here's a quick test w/ simple one-way physics interactions using a cylinder: