
Flexman


Posts posted by Flexman

  1. Cheers Jen, that makes sense.

    I'll try and be clear about why I asked the original question.

    Expected behaviour: with VR enabled, given a simple scene, when I tilt my head on different axes I expect to see the scene shift accordingly.

    Observed behaviour: scene elements like the skybox and terrain appear to parallax against foreground objects, causing nausea, mostly when looking up and down. I initially put this down to improper distortion. It's not something I've experienced when replicating the scene in another engine.

    What I tried: After adding trees to the scene, the observed height of a tree (apparent image height as seen within the headset) appears to change when looking up and down.

     

    Thoughts/suggestions?

  2. Rift. If you say it's implemented then I must somehow be bypassing it. Using C++ only and a bare map: terrain, a cockpit model and a directional light. No post-processing filters or anything. If those are required I'll give it another go, but I didn't read anything that suggested otherwise. Screenshot of the mirror display attached.

    #include "Leadwerks.h"

    using namespace Leadwerks;

    int main(const int argc, const char* argv[])
    {
        // Bare-bones setup: window, context, world, camera, then VR, then the map.
        Window* window = Window::Create("VR Test", 0, 0, 1024, 768, Window::Titlebar);
        Context* context = Context::Create(window);
        World* world = World::Create();
        Camera* camera = Camera::Create();
        VR::Enable();
        Map::Load("Maps/start.map");
      ...
      ...
    }

     

    le_vr_mirror.jpg

  3. The artist can create a number of 3D props that use the same material, placing the textures from all the props on one texture. If they map the texture coords appropriately then it's a fairly trivial optimization.

     

    Since props using the same material get batched together, it saves on internal material switching. Reducing draw calls in Unity is a quick and easy level optimization, especially for mobile devices. For desktops I'm not sure it's such a great return on effort if you're only talking about a few instances. But in principle (and in practice) it works. There's a small UV-remap sketch after this post.

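    A minimal, tool-agnostic sketch of the remap implied above: each prop's original 0..1 texture coords get squeezed into its own cell of the shared atlas. The 2x2 grid, names and values are illustrative and not tied to any particular engine or exporter.

    #include <cstdio>

    struct UV { float u, v; };

    // Remap a 0..1 UV into cell (col, row) of a gridCols x gridRows atlas.
    UV RemapToAtlas(UV uv, int col, int row, int gridCols, int gridRows)
    {
        UV out;
        out.u = (col + uv.u) / gridCols;
        out.v = (row + uv.v) / gridRows;
        return out;
    }

    int main()
    {
        UV corner = { 1.0f, 1.0f };
        UV packed = RemapToAtlas(corner, 0, 1, 2, 2);   // first column, second row of a 2x2 atlas
        std::printf("%.2f %.2f\n", packed.u, packed.v); // prints 0.50 1.00
        return 0;
    }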
  4. Use ModelViewer.exe and drill down through the model tree-control on the left until you find the bone you want (or don't, if it's missing). At least you can confirm what is in the model and identify any export issues you may (or may not) have.

  5. I just read the pinned post about Intel/ATI graphics GPUs; we recently came across this in the wild when testing our game.

     

    Is there some method in the Leadwerks engine to enumerate and specify a graphics device before initialization? Some of us have multiple displays and want to run in true full-screen without having to use a borderless/full-screen window, which is a performance sink.

  6. I bought into the LE3 mobile SDKs to show support. I'm glad it wasn't such a huge outlay. What I *wanted* was LE2.5 with OpenGL 4 and better tools, maybe some of the streaming stuff that was talked about. Linux is the new toy-boy thanks to Steam; I don't know how long that will last. Ouya got a lot of press, then reality arrived in the post. SteamOS is cooling off and we've not seen much in the way of a breakout yet. I remain hopeful and enthused by some of the content posted by new LE3 users.

     

    It's frustrating that, to this day, I still spend most of my time working around LE 2.5 limitations to achieve things that should be quite simple. But the awesome thing is, every big problem I came up against I've been able to resolve, with a lot of heartache and creative use of the semi-open nature of the engine. In the end, that's much more rewarding. (Source would have been more awesome.)

     

    I'll just finish by saying I have had a very real love-hate relationship with the Leadwerks engine. Hell, they didn't even send me a t-shirt :) even though the company used a personal photo of our Christmas tree, with the kids' presents under it, in a promotional post last year. It was funny as hell to see (especially our lovely brown curtains).

     

    I lost out, but then I knew that ahead of time anyway. Josh needs to do this. Last time Josh changed tack, some of the old community deserted for other engines because they were not interested in mobile. Josh went ahead and did it anyway, but you have to keep in mind that there's a certain Silicon Valley mentality where it's OK to try things and fail. You learn and try the new thing. It's something people could learn from. Failure is an option.

     

    I still want a bloody t-shirt from the tight git for the Xmas tree thing though :)

     

    I look forward to seeing a renewed focus on the desktop, which is what he's best at. Maybe upkeep of the mobile engine could be entrusted to the community under a strict usage license. Just a thought, if it's not going to be doing anything in future.

     

    All the above is just my personal ramblings.

    • Upvote 2
  7. Funny that for DK2 they went to the old tried and tested technology that TrackIR uses for head-tracking. Another unit to stick on top of the monitor.

     

    The guys from Tested.com gave their impressions of the unit and the demos they'd seen.

     

    I think there's a space for small indie games and apps here. It looks like everyone else is kicking back and waiting to see what happens. There was even a BlitzMax demo for DK1 kicking around. Now that they've got the translational movement drift issue sorted, it might be worth looking at again. I can see it being as useful as TrackIR: something you keep around for those few occasions when it enhances an already good experience. But if you're using it just for the experience of VR, it's going to get tired quickly.

  8. Not had to do this myself but I'll jump in just because it's an LE2.5 question.

     

    There are at least two states when in water: floating on the surface, and swimming underwater (turn off gravity for the body when buoyancy is neutral).

     

    If I had to tackle it, I'd start with a cylinder body with buoyancy, and when the character enters water deep enough that it should float, update the controller position with the forces from the buoyant "float" entity.

     

    If we want to swim underwater, we don't do this. Instead we'll need to turn off the buoyancy of our float and ramp the damping up really high to "swimulate" the weight of water, but still update the controller with our float body's forces and so on. There's a rough sketch of the idea after this post.

     

    Might work.
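    A rough, engine-agnostic sketch of those two states, assuming everything is driven from a single float body. FloatBody and SyncController are made-up stand-ins for the buoyant cylinder and the LE2.5 controller update described above, not real Leadwerks calls, and all the constants are illustrative.

    #include <cstdio>

    enum WaterState { ON_SURFACE, UNDERWATER };

    struct FloatBody
    {
        float y = 0.0f;              // vertical position of the float cylinder
        float vy = 0.0f;             // vertical velocity
        float buoyancy = 14.0f;      // upward force while buoyancy is active (made-up value)
        float gravity = -9.81f;
        float damping = 0.1f;        // low damping while bobbing on the surface
        WaterState state = ON_SURFACE;

        void SetState(WaterState s)
        {
            state = s;
            if (s == UNDERWATER)
            {
                // Underwater: gravity and buoyancy off, damping ramped up to fake the weight of water.
                gravity = 0.0f;
                damping = 4.0f;
            }
            else
            {
                gravity = -9.81f;
                damping = 0.1f;
            }
        }

        void Step(float dt)
        {
            float force = gravity + (state == ON_SURFACE ? buoyancy : 0.0f);
            vy += force * dt;
            vy -= vy * damping * dt; // crude linear damping
            y += vy * dt;
        }
    };

    // Hypothetical glue: the real thing would copy the float body's motion onto the character controller.
    void SyncController(const FloatBody& body) { std::printf("controller y = %.2f\n", body.y); }

    int main()
    {
        FloatBody floater;
        for (int i = 0; i < 60; ++i) floater.Step(1.0f / 60.0f); // bob at the surface
        SyncController(floater);

        floater.SetState(UNDERWATER);                            // dive: no buoyancy, heavy damping
        for (int i = 0; i < 60; ++i) floater.Step(1.0f / 60.0f);
        SyncController(floater);
        return 0;
    }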

  9. This is what I do every day professionally: a combination of automated, manual and unit tests. We've just adopted Coded UI for GUI testing.

     

    It always IS a pain, but it's how you ensure quality and reduce the number of bugs that reach your customers and give a ****ty impression of your product. Even having those testing processes in place demonstrates you take quality seriously.

     

    You can do it for graphics; certainly you can have tests that check physics, parenting, hiding and movement, and confirm these entities behaved as expected (see the sketch after this post). Generally test cases are different from unit tests and involve manual testing to confirm that when you do action "a" you get result "b". (Think of all those small examples in the documentation; they could be tweaked into a regression test suite to be run every major release.)

     

    End users shouldn't have to do this for middleware; your own code, maybe.
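    A minimal sketch of that kind of check, using a dummy Entity rather than the real Leadwerks class: each case performs one action and asserts the expected result. In a real suite each case would wrap one of the small documentation examples and run against the actual engine build.

    #include <cassert>

    struct Entity
    {
        float x = 0, y = 0, z = 0;
        bool hidden = false;
        Entity* parent = nullptr;

        void SetParent(Entity* p) { parent = p; }
        void Hide() { hidden = true; }
        void Move(float dx, float dy, float dz) { x += dx; y += dy; z += dz; }
    };

    // One "do action a, expect result b" case per behaviour we care about.
    void Test_Parenting() { Entity a, b; b.SetParent(&a); assert(b.parent == &a); }
    void Test_Hiding()    { Entity a; a.Hide(); assert(a.hidden); }
    void Test_Movement()  { Entity a; a.Move(0, 1, 0); assert(a.y == 1.0f); }

    int main()
    {
        Test_Parenting();
        Test_Hiding();
        Test_Movement();
        return 0; // all checks passed
    }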

  10. I'm fascinated by Rastar's terrain mesh tessellation exploration.

     

    With large landscapes, once you've licked the problem of the terrain mesh, you have to deal with vegetation, then any civilization layer for roads/buildings, and finally collisions. It all adds up to (as Rastar says) a long and twisted journey. So I would seriously consider something smaller if you can get away with it.

     

    Having multiple terrain objects in a scene, with everything in each parented to its terrain entity, would be ideal.
