Rastar

Members
  • Posts

    421
  • Joined

  • Last visited

Posts posted by Rastar

  1. Probably the only time you have to take the folder layout into account is when you're developing for both Windows and OSX. You can't publish from Windows to OSX (only to Android), and on OSX you can only publish to iOS. But developing for both Windows and OSX doesn't work out of the box, or at least I haven't found out how. When I create a new class in Visual Studio, it is created in Projects/Windows. I then remove it from the project again, move it in the file system to "Source" and use "Add existing item...". But this is cumbersome, of course - I hope there is an easier way to do this.

  2. I am procedurally generating a model with several surfaces. Many of them have the same basic vertex structure and will be displacement mapped in the shader. Is it possible to reuse those surfaces? I would like to Draw them several times with different shader parameters.

  3. Hi,

     

    how do you structure your source folders when developing for several platforms? By default, under Visual Studio and XCode, when you create new C++ classes they are created in the Projects/Windows and Projects/MacOS folders. But many of them are (and should be) platform-independent, so I guess they should go under Source. However, I couldn't find a nice way to integrate that folder into a VS solution (it's possible under XCode, though).

     

    So: What's your preferred way of doing this?

  4. Bending light? Am I Albert Einstein or what? No, actually I want to project some quantum-dynamic strings into hyperspace and then genetically modify their submolecular megatexture before transforming the space-time fabric into super-entropic multi-convex Voronoi regions, silly!

     

    Or to say it in plain words: I am trying to use some third-party code for drawing skies and clouds...

  5. I am trying to draw some stuff *before* and *after* everything else is rendered, but I'm having a hard time doing that (see the sketch after this list):

    1. Anything that I do before World->Render() doesn't appear - I guess that method clears the buffer? Also, the camera and projection matrices aren't set at that time (I could compute them myself, though).
    2. I can't figure out a proper way to use DrawHooks. I have tried pivots and the camera as the wrapped entity, but neither seems to receive the DrawHook call. If I use one of the other entities (e.g. Model::Box()) - would my drawing code be rendered *in addition to* or *instead of* the box drawing itself? Also, how could I influence the time when that object is rendered?
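
    For reference, this is roughly where I've been trying to hook in - the loop is more or less the stock C++ template as far as I remember it, and the MyCustomDraw*() calls are just placeholders for my own GL code, not Leadwerks API:

        #include "App.h"                 // stock template files, if I remember the layout right
        using namespace Leadwerks;

        void MyCustomDrawBefore();       // placeholders for my own GL drawing, NOT Leadwerks API
        void MyCustomDrawAfter();

        bool App::Loop()
        {
            if (window->Closed() || window->KeyDown(Key::Escape)) return false;

            Time::Update();
            world->Update();

            MyCustomDrawBefore();        // 1. anything drawn here never shows up (buffer cleared by Render()?)
            world->Render();
            MyCustomDrawAfter();         // the "after everything else" pass I'd like to have

            context->Sync();
            return true;
        }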

  6. I need those two matrices from the Camera object. I found these undocumented members:

     

    projectioncameramatrix

    projectionmatrix

     

    So: I guess the latter one is the ProjectionMatrix I'm looking for? Then, is projectioncameramatrix the ModelView*Projection combined matrix? If so, where is the pure ModelView?
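
    In case it helps to compare against something: this is the relationship I would expect, but it is purely my assumption, written with GLM instead of the Leadwerks classes (cameraWorld and modelWorld are hypothetical inputs, meaning the camera's and the entity's world transforms):

        #include <glm/glm.hpp>

        // My guess at how the undocumented members fit together (GLM notation, not Leadwerks API):
        glm::mat4 GuessProjectionCamera(const glm::mat4 &projection, const glm::mat4 &cameraWorld)
        {
            glm::mat4 view = glm::inverse(cameraWorld); // view = inverse of the camera's world matrix
            return projection * view;                   // presumably what projectioncameramatrix holds
        }

        glm::mat4 GuessModelView(const glm::mat4 &cameraWorld, const glm::mat4 &modelWorld)
        {
            return glm::inverse(cameraWorld) * modelWorld; // the "pure" ModelView, assembled per entity
        }

    If that's right, there simply is no single ModelView stored on the camera - it only exists per entity, which would explain why I can't find it.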

  7. I'm playing around with a third-party library which defines an enum for various renderers, among them "OPENGL". That seems to clash with the OPENGL preprocessor macro in the standard project template. Is there a way to solve this? Can I rename the macro? And do a search&replace in the Leadwerks header file?

     

    It would probably be safer to name the LE macros something like __OPENGL__, __OPENGLES__ etc.
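
    One workaround I'm considering (untested, and the third-party header name below is made up) is to push the macro aside just around the offending include:

        #pragma push_macro("OPENGL")    // supported by MSVC, GCC and Clang
        #undef OPENGL
        #include "ThirdPartyRenderer.h" // hypothetical header that declares its own OPENGL enum value
        #pragma pop_macro("OPENGL")     // restore the Leadwerks macro afterwards

    That would keep the Leadwerks headers untouched, which seems preferable to a search&replace.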

  8. Alright, I'm in. I'm gonna try something that I've been thinking about for a while which unfortunately is way above my head. Which means I most certainly won't finish... But at least I'll try to push myself and learn something along the way.

     

    Elements, ey? Love that topic!

  9. If you look at the docs (User Guide) it says

     

    Collision type

     

    This is the object's collision type. Collision types control what kinds of objects collide with one another. The following collisions will be recognized:

    • Character : Character
    • Character : Prop
    • Character : Scene
    • Character : Trigger (detection only, no physical reaction)
    • Prop : Prop
    • Prop : Scene
    • Prop : Debris
    • Debris : Scene

    So by using setCollisionType you're telling the engine what kind of object your entity is, and these rules define whether a collision will be registered when your entity intersects another object.
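
    A minimal sketch of how that looks in code - the exact names are from memory, so treat them as an assumption rather than gospel:

        #include "Leadwerks.h"
        using namespace Leadwerks;

        // Mark what kind of object each entity is; the rules above then decide what collides.
        void SetupCollision(Model* crate, Entity* player)
        {
            crate->SetCollisionType(Collision::Prop);        // a physical prop
            player->SetCollisionType(Collision::Character);  // the player controller
            // Character:Prop and Prop:Scene appear in the table above, so the crate
            // will collide with both the player and the scene geometry.
        }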

  10. This isn't useless - AABB intersection is an inexpensive check because the geometry is simple and no transformations have to be done. If it fails, the objects won't collide and you're done. If it succeeds you can continue doing (more expensive) checks using tighter, object-aligned volumes.

     

    I haven't used it, but looking at the docs I would guess that Entity::CollisionHook will let you know when two objects physically collide and lets you react to that.
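
    Just to illustrate why the broad phase is so cheap, an axis-aligned overlap check boils down to a few comparisons (plain C++, nothing Leadwerks-specific):

        struct AABB { float min[3], max[3]; };

        bool Overlaps(const AABB &a, const AABB &b)
        {
            // Two axis-aligned boxes intersect only if they overlap on every axis.
            for (int i = 0; i < 3; ++i)
            {
                if (a.max[i] < b.min[i] || b.max[i] < a.min[i]) return false; // separated on this axis
            }
            return true; // passed the cheap test -> worth running the expensive narrow phase
        }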

  11. Yes, of course. But in principle you can move to every position on your terrain, and then you need the high-res texture *at that point*. And if the texture was generated offline, it would have to have the high resolution everywhere. I am not talking about applying a grass texture somewhere, but rather a color map for the entire terrain.

  12. Yes, that is *the* tutorial everybody mentions when talking about World Machine...

     

    As I said, I'm not sure how exactly the new texturing works. But it seems to render a high-resolution texture at close range and lower res in the distance (or rather, same resolution everywhere, but more pixels per meter in the near distance and fewer in the far distance). Now, if you wanted to use a WM color map for this, it would have to support the close range for the whole terrain, meaning it would have to be (prohibitively) huge.
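
    Just to put a (completely made-up) number on "prohibitively huge": a 4096 m x 4096 m terrain colored at 1 texel per centimeter everywhere would be 409,600 x 409,600 texels, roughly 168 billion of them - at 4 bytes per texel that's over 600 GB uncompressed. So baking the close-up detail into a single offline color map really isn't an option.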

     

    In WM, you use selectors (based on height, slope, direction, convexity, ...) to isolate a piece of terrain that you want to color in a certain way. I guess that in Josh's texturing system this is done on the fly, so you specify selection criteria as well, but the coloring is done at run time. But this would mean you couldn't use pre-calculated color maps.

     

    Again, I'm not really an expert on this, just my guess. Please, somebody, correct me if I'm wrong.

  13. World Machine is a wonderful and extremely powerful terrain generator. It works similarly to the method Josh described for procedural terrain generation in a recent post, but it is node-based. So you assemble a terrain generation process by linking several kinds of nodes, and you will find things like Voronoi, Perlin noise and erosion nodes (and many, many more). It takes some time to learn how to get the effect you want, but then the possibilities are endless. What makes it invaluable for me is the possibility (in the Pro version) to generate tiled terrains, so you can for example create 16x16 tiles at a resolution of up to 4096 and use them with a streaming system.

     

    World Machine's main output is a heightmap, and you will no doubt be able to import that into Leadwerks 3. However, you can produce many more outputs, like a (whole-terrain) diffuse map, normal map, splat map and light maps. I haven't fully understood how the new terrain system in LE3 is going to work, for example whether the texture blending maps are generated on the fly, or whether you can import one generated in a tool like World Machine.

     

    The latter would be nice. As an example: WM has very powerful erosion features that not only shape your terrain but also create deposition and flow maps. When water grinds off the rocks of a mountain, it washes the sand away and finally leaves it somewhere. WM generates a wear map (where were the mountains eroded?), a flow map (which path did the water take?) and a deposition map (where did the material end up?). You can use that information to realistically color your terrain and place vegetation (the sediments are rich in nutrients and water is close by, so plants are more likely to grow there) in a procedural way.

     

    Tl;dr: You will certainly be able to use heightmaps from WM in Leadwerks 3, but I'm not sure about the other outputs. However, there are use cases for those other WM textures, so let's hope that the new terrain system will allow that.

  14. There are two versions of the Kinect, but the XBox version will work on Windows. If you have the Kinect that came bundled with the XBox you might need an additional power cable/USB adapter; the stand-alone version comes with one. The XBox Kinect does not have the near-view functionality of the Windows Kinect sensor (which makes sense in a desktop setting), and you are not allowed to make/sell Windows applications using an XBox Kinect.

     

    Apart from that, there is the OpenNI initiative (mainly supported by PrimeSense, who developed most of the Kinect technology). They provide an open API (that internally is mapped onto the MS API when you're on Windows), and they also have an OSX implementation for that (see http://www.openni.org/).

  15. Well, I'm not much of a Blender artist myself; however, a quick search turned up three interesting Blender add-ons:

     

    http://wiki.blender.org/index.php/Extensions:2.6/Py/Scripts/Curve/Sapling_Tree#Instructions

    http://wiki.blender.org/index.php/Extensions:2.6/Py/Scripts/Curve/Ivy_Gen

    http://blenderthings.blogspot.nl/p/the-spacetree-blender-add-on.html

     

    I would definitely agree that those Forester trees don't match the quality of your work, for example (I've got a few in my collection...), but I thought that for someone like me who isn't too much into modeling it is a nice way to quickly generate a variety of acceptable models.

  16. Don't know if that's been mentioned here before, but I just stumbled across a very nice tree generator, Forester Pro (http://www.hptware.co.uk/forester.htm). You start from tree templates (you can create custom ones as well) and change/randomize several parameters to create new tree variants. It really has a lot of functionality and costs just $20 for an indie license (there is a lite version for non-commercial purposes as well).

  17. Thanks for the link! Yes, I saw that. It might actually be too good for my purposes, and it's still quite hefty in terms of resources (e.g. they need about 45 MB of memory per tree model for the z-tree thing). The second (older) paper I linked uses a tiled volumetric texture, which sounds lighter in terms of requirements, but they are ignoring dynamic lighting.

     

    Still, what a fascinating topic! If only I were a little more knowledgeable... :-)

  18. I've finally received my Ouya. It ended up at German customs because they didn't attach a sales receipt to the outside (@Josh: Don't forget that, at least when sending stuff to Germany, or your customers will have to pay extra and waste time visiting the customs office... ;-) ). When I said the package contained a "games console", the lady at customs immediately asked, "An Ouya? That's the only reference number I know by heart...". So quite a few Ouyas must have been sent to Germany, at least to Berlin where I live, which in a sense is good news.

     

    Of course they didn't send the additional controller that I ordered, so I won't touch the thing and will even send it back if they don't at least confirm the issue. If that goes well, I'll post some experiences here.

     

    UPDATE: I emailed Ouya support and got a response (containing an attached invoice document for my second controller) within an hour or so. They seem to be improving on that front... So it's time to play around with this little thing!
