
wmaass

Members
  • Posts: 32
  • Joined
  • Last visited

Everything posted by wmaass

  1. Thanks Josh. I don't have 2.5, so I just turned off Bloom and the problem went away. Sorry to ask (it has been a while), but can you point me in the general direction of the sync tool so that I can update?
  2. It has been a long while since I last worked with the Leadwerks Engine. I dusted off an old project and I'm seeing some weird rendering (see below). I'm on 2.27, and this does not happen with a clean project generated from the wizard. The problem project uses Framework, so I am wondering if something is going on there. It also happens in the 2.40 editor. Has anyone seen this? http://www.maasscreativelabs.com/huh.jpg
  3. The only functional difference I ran into is that, at least for now, there is no audio support for the Asus, but that is because OpenNI does not yet support it. The biggest thing for me was that the Asus permits commercial use while the Kinect does not, though that is changing soon. The fine details of Kinect commercial use are still a question, though. The Asus is also much smaller and does not require a special power supply, and the USB cable can be extended quite a bit, which is important to me. I also went the Wiimote route in the beginning, then moved to FaceAPI, which was good but not practical for my projects. I have also done some simple color-based fast object tracking, but it isn't as robust. I will have to check out jDome, sounds cool. I have access to three-screen projection systems; this should be wild in that kind of setup as well.
  4. It feels like I don't ever want to play a game without it...ever. No disorientation at all, the immersion is incredible. It provides more of an impact than 3D glasses, and without wearing anything. Integration is really easy. That would be cool, the sky is the limit with this. I am using the Asus sensor but the Kinect sensor works with it as well.
  5. Thanks. I am now working on my own laser/machine vision based weapon system. Could make for a killer FPS or training system.
  6. Been experimenting with integrating OpenNI using the Asus Xtion Pro sensor. The results are pretty good, I think; video below.
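
    A minimal sketch of the glue involved on the Leadwerks side, assuming the head position has already been read from the OpenNI skeleton data (GetTrackedHeadPosition is a hypothetical placeholder for that step, and the scale and offsets are made-up values to tune):

    #include "framewerk.h"
    using namespace leadwerks;

    // Hypothetical placeholder: returns the tracked head position in millimeters,
    // the way OpenNI reports joint positions. Replace with real sensor data.
    TVec3 GetTrackedHeadPosition()
    {
        return Vec3(0,0,0);
    }

    // Move the framework camera to follow the tracked head.
    void UpdateCameraFromHead( TEntity camera )
    {
        TVec3 head = GetTrackedHeadPosition();
        // Millimeters to meters, offset around an assumed rest position of (0, 2, -1.5).
        PositionEntity( camera, Vec3( head.X*0.001f, 2.0f + head.Y*0.001f, -1.5f + head.Z*0.001f ) );
    }

    Calling something like this each frame with fw.GetMain().GetCamera(), before fw.Render(), would be one way to wire it in.
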
  7. Why is it that dealing with a simple rolling ball is so awkward for these complex physics engines? Anyway, does anyone know if you can remove the forces acting on a body, or just make it sleep? I ask so that I can stop my ball from rolling when its velocity is below a certain threshold. EDIT: I didn't see SetBodyVelocity right away; looks like I can use that.
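
    A minimal sketch of that idea, assuming the ball's body handle is called ballBody and that GetBodyVelocity and SetBodyOmega are available in this engine version (the 0.05 m/s threshold is an arbitrary value to tune):

    #include "framewerk.h"
    #include <math.h>
    using namespace leadwerks;

    // Clamp a slow-rolling ball to a full stop once it is effectively at rest.
    void StopBallIfSlow( TBody ballBody )
    {
        TVec3 v = GetBodyVelocity( ballBody );
        float speed = sqrtf( v.X*v.X + v.Y*v.Y + v.Z*v.Z );
        if( speed < 0.05f )
        {
            SetBodyVelocity( ballBody, Vec3(0,0,0) );  // kill linear motion
            SetBodyOmega( ballBody, Vec3(0,0,0) );     // kill any remaining spin
        }
    }
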
  8. Thanks for the reply. I concede that I might have to play with it for it to "look" right. No big deal.
  9. Hi all, I'm putting together a little mini-putt game for use in a full-size golf simulator. The sim has a sensor that measures all of the launch conditions of any shot. I want to take the measured ball velocity and translate it into something I can use with AddForce. Looking around on the net I found: "One newton (N) of force is defined as the amount of force needed to accelerate 1 kilogram (kg) of mass at a rate of 1 meter per second squared (m/s²)." A golf ball is roughly 0.046 kg, so if I have a measured ball speed of, say, 5 m/s then: force = 0.046 * 5.0^2 = 1.15. So then I can add force like so: AddBodyForce ballBody, Vec3(0, 0, 1.15), 0. Does this sound right?
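
    For reference, mass times velocity squared works out to an energy-like quantity rather than a force in newtons, and since the sensor already reports a velocity, one simpler mapping is to hand that velocity to the body directly. A minimal sketch, assuming ballBody already exists and the shot travels along +Z (a real shot would use the measured direction as well):

    #include "framewerk.h"
    using namespace leadwerks;

    // Start the putt using the speed measured by the simulator's sensor.
    void LaunchBall( TBody ballBody, float measuredSpeed )  // measuredSpeed in m/s
    {
        // The sensor already gives a velocity, so apply it directly instead of
        // deriving a force from it.
        SetBodyVelocity( ballBody, Vec3( 0.0f, 0.0f, measuredSpeed ) );
    }

    LaunchBall( ballBody, 5.0f ) would then start the 5 m/s putt from the example above.
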
  10. peh!...I was setting a res that my new monitor does not support. All is well now.
  11. Yep, did that before posting. Going to try a few things now.
  12. Hi all, I'm returning to LE after a long break. I got a new/old PC recently and installed 2.27. I pulled up an old project and tried it out, only to get this error: "GLSL 1.20 not supported", etc. I have a GTX 285, so it seems to me it should work. Has anyone encountered this issue?
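
    One quick way to see what the driver is actually reporting is to print the GL strings once a context exists, i.e. after Graphics() has been called. A minimal sketch, assuming the SDK project already links against opengl32:

    #include "framewerk.h"
    #include <windows.h>
    #include <gl/gl.h>
    #include <stdio.h>
    using namespace leadwerks;

    #ifndef GL_SHADING_LANGUAGE_VERSION
    #define GL_SHADING_LANGUAGE_VERSION 0x8B8C   // not defined in the old Windows gl.h
    #endif

    int main()
    {
        Initialize();
        RegisterAbstractPath("C:/Program Files/Leadwerks Engine SDK");
        Graphics(800,600);   // a GL context must exist before glGetString is valid

        printf("Renderer:     %s\n", (const char*)glGetString(GL_RENDERER));
        printf("GL version:   %s\n", (const char*)glGetString(GL_VERSION));
        printf("GLSL version: %s\n", (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));

        return Terminate();
    }

    A GTX 285 reports GLSL well above 1.20, so if those strings look sane the problem is likely elsewhere (an old driver, or the unsupported resolution mentioned above).
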
  13. So here I am revisiting the 3D thing. I have some folks interested in it enough to pursue it. Maybe someone knows the answer to this: does LE support quad buffering? If it does, then NVIDIA 3D Vision could be used in theory. I've found a passive 3D solution (which I prefer) that would work as well, but it also requires quad buffering.
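
    For context, quad-buffered stereo means the GL context is created with a stereo-capable pixel format and every frame is rendered twice, once into each back buffer, so the support has to come from the engine itself. A sketch of what that looks like at the raw Win32/OpenGL level (hdc is assumed to be the render window's device context; consumer GeForce drivers generally refuse the request, while Quadro-class drivers expose it):

    #include <windows.h>
    #include <gl/gl.h>

    // Ask Windows for a stereo-capable (quad-buffered) pixel format.
    // Returns 0 if the driver does not expose one.
    int ChooseStereoPixelFormat( HDC hdc )
    {
        PIXELFORMATDESCRIPTOR pfd = {0};
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER | PFD_STEREO;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 32;
        pfd.cDepthBits = 24;

        int format = ChoosePixelFormat( hdc, &pfd );
        if( format == 0 ) return 0;

        // ChoosePixelFormat may fall back to a mono format, so verify the stereo flag.
        DescribePixelFormat( hdc, format, sizeof(pfd), &pfd );
        return ( pfd.dwFlags & PFD_STEREO ) ? format : 0;
    }

    // With a stereo context current, each frame is then drawn twice:
    //   glDrawBuffer(GL_BACK_LEFT);   ...render the left-eye view...
    //   glDrawBuffer(GL_BACK_RIGHT);  ...render the right-eye view...
    //   SwapBuffers(hdc);
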
  14. Yes, exactly. You can get a free version of the API that has enough in it to do this, but to actually sell anything using it you have to pay. I needed to get something going in order to provide a convincing demo, so the people I work with would devote some resources to something similar (but better). If you want to try it and run into any problems, let me know; I might be able to help you get it running. You will need a decent webcam. I used an Xbox cam at first but then switched to a security camera connected to an analog-to-digital video converter.
  15. Yeah, that's the stuff I've got going in LE with FaceAPI. It is pretty straightforward to get going if anyone else wants to try. Using a Wiimote, as creative as it is, isn't very practical.
  16. After playing around for several hours I became tired of the glasses. The effect was cool for a little while but this stuff messes with my head way too much. I concur on head tracking, it is far more compelling than active/passive 3D. I've managed to get some very good head tracking going with LE via FaceAPI and am currently working on a proprietary head tracking setup. I am lucky in that I work with some of the best machine vision guys on the planet. It's exciting stuff. I really wanted to see what my current head tracking setup would look like WITH the active glasses, hence my question about NVidia 3D vision relative to LE. I got a chance to see something like this recently and wanted to replicate it.
  17. It looks like, unless you have a Quadro card, you can't just flip a switch in the NVIDIA Control Panel. I purchased the bundle, which comes with a Samsung monitor that can do 120 Hz, the glasses, and the emitter. It is a nice package, though it is a shame it does not support Windows XP.
  18. I found this but I don't have a Quadro. http://www.seereal.com/download/drivers/NVidia_OpenGLStereo_EN.pdf
  19. So my NVIDIA 3D Vision bundle arrived today; pretty cool. Now the question is, how can I make the Leadwerks engine take advantage of it? I'm looking around for info, but if anyone has any ideas, awesome.
  20. DOF Settings

    Interesting. Must be something different between the versions of Framewerk.
  21. DOF Settings

    You got it, it was the HDR borking it. Check it out now. Thanks!
  22. DOF Settings

    Here is a simplified version using C++. Thanks for your help, by the way.

    #include "framewerk.h"
    #include "ProcessScene.h"

    using namespace leadwerks;

    int main( int argn, char* argv[] )
    {
        Initialize();

        TMesh mesh;
        TLight light;
        TMaterial material;

        RegisterAbstractPath("C:/Program Files/Leadwerks Engine SDK");
        Graphics(800,600);

        Framewerk fw;
        if( !fw.Create() )
        {
            MessageBoxA(0,"Failed to initialize engine.",NULL,0);
            return 1;
        }

        // Renderer effects
        fw.GetRenderer().SetSkybox( LoadMaterial("abstract::FullskiesBlueClear0016_2_L.mat") );
        fw.GetRenderer().SetSSAO( true );
        fw.GetRenderer().SetGodRays( true );
        fw.GetRenderer().SetBloom( true );
        fw.GetRenderer().SetHDR( true );
        fw.GetRenderer().SetAntialias( false );

        // Water
        fw.GetRenderer().SetWater( true );
        fw.GetRenderer().SetWaterHeight( 1.0 );
        fw.GetRenderer().SetWaterSoftness( 0.8 );
        fw.GetRenderer().SetWaterAmplitude( 2.1 );

        // Distance fog
        fw.GetRenderer().SetDistanceFog( true );
        fw.GetRenderer().SetDistanceFogColor( Vec4(1,1,1,0.7) );
        fw.GetRenderer().SetDistanceFogRange( Vec2(10,75) );

        // Far depth of field
        fw.GetRenderer().SetFarDOF( true );
        fw.GetRenderer().SetFarDOFRange( Vec2(5,8) );
        fw.GetRenderer().SetFarDOFStrength( 1.0 );

        // Scene: a single textured cube and a directional light
        material = LoadMaterial("abstract::cobblestones.mat");
        mesh = CreateCube();
        PaintEntity( mesh, material );
        PositionEntity( mesh, Vec3(0.0f,2.0f,0.0f) );

        light = CreateDirectionalLight();
        RotateEntity( light, Vec3(45) );

        PositionEntity( fw.GetMain().GetCamera(), Vec3(0,2,-1.5) );

        // Game loop
        while( !KeyHit() && !AppTerminate() )
        {
            if( !AppSuspended() ) // Skip updating while the app is suspended
            {
                TurnEntity( mesh, Vec3(AppSpeed()*0.5f) );
                fw.Update();
                fw.Render();

                // Send to screen
                Flip(0);
            }
        }

        // Done
        return Terminate();
    }