Friday, July 15, 2016

Intelligent Terrain Synthesis

Don't you hate it when your favorite TV series puts out an episode that is just clips of stuff that happened in earlier episodes? This post has some of that but hopefully will provide you with a better idea of how we see Procedural Generation in the near future.

This video shows the new procedural terrain system to be released in Voxel Farm 3:



In case you want to find out more about what is happening under the hood, these previous posts may help:

Geometry is Destiny Part I and Part II
Introducing Pollock
Terrain Synthesis

The idea is simple. Instead of asking an artist to labor over a hundred different assets, either by hand or by using complex generation tools like World Machine, we now have a synthetic entity that can do some of that work through a mix of AI and simulation. You do not need to be an expert, or initiated at all in the arts of procedural generation, to get a satisfactory outcome.

Why are AI and simulation important? After working for a while in procedural generation, it became clear to me there was no workaround to the entropy problem. I believe it can be stated like this: viewers of procedurally generated content will perceive only the "seed" information, not the "expanded" data. Yes, you may have a noise function that can output terabytes of data, but all this data will be compressed by the human psyche to the few bytes it takes to express the noise function itself. I posted about this problem in more detail here:

Uncanny Valley of Procedural Generation
Procedural Information
Evolution of Procedural

This does not mean all procedural generation is bad. It means it must produce information in order to be good. Good procedural generation is closer to simulation, automation and AI. You cannot have information without work being done, and if work is to be done, better to leave it to the machine.
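To make the compression argument concrete, here is a toy value-noise heightfield in Python (an illustration for this post, not Voxel Farm code): it can emit as many terrain samples as you ask for, yet every sample collapses back to the seed and a few bytes of code.

```python
import hashlib

def lattice_value(seed: int, x: int, y: int) -> float:
    """Deterministic pseudo-random value in [0, 1) for a lattice point."""
    h = hashlib.sha256(f"{seed}:{x}:{y}".encode()).digest()
    return int.from_bytes(h[:8], "big") / 2**64

def height(seed: int, x: float, y: float) -> float:
    """Bilinear value noise: a smooth heightfield defined everywhere."""
    x0, y0 = int(x // 1), int(y // 1)
    fx, fy = x - x0, y - y0
    v00 = lattice_value(seed, x0, y0)
    v10 = lattice_value(seed, x0 + 1, y0)
    v01 = lattice_value(seed, x0, y0 + 1)
    v11 = lattice_value(seed, x0 + 1, y0 + 1)
    top = v00 + (v10 - v00) * fx
    bot = v01 + (v11 - v01) * fx
    return top + (bot - top) * fy

# The "terabytes" of terrain collapse back to the seed: same seed, same world.
assert height(42, 3.25, 7.5) == height(42, 3.25, 7.5)
```

However much of this terrain you sample, the viewer ultimately perceives only the seed and the interpolation rule.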

The video at the top shows our first attempt at having AI do procedural generation. Stay tuned, because more is coming.


26 comments:

  1. It seems very promising, though the switch between biomes seemed a bit abrupt =P.

    Also, at the generation point, instead of having cardinal directions, would it be possible to have an arrow whose point you can drag around (possibly with a number next to it in degrees, so you can get very precise directions)? That would give finer control over wind direction. Also, I'd suggest adding a unit to the temperature (like Celsius) so that people know what to expect from the temperature range =P.

    But yea, as I said: Looks very promising and very cool!

    ReplyDelete
    Replies
    1. Thanks for the tips. About wind direction: you are painting with very broad strokes, so being able to specify finer than 45-degree increments may not make a big difference.

      Delete
    2. The streaks of desert seem to show pretty clearly what angle the wind goes though, especially at the biome-edges. Or is this just a coincidence?

      Delete
    3. It is no coincidence. You are right, with finer angles you could break the streaks. But I think we should break the streaks even if the wind direction is constant. We will throw in some "turbulence" for wind and temperature to break this pattern.

      Delete
  4. I wonder if it's possible to generate voxel sand dunes (not just a texture) based off your wind and temperature content?

      Delete
    5. In theory yes, since dunes are made by wind. We do not have a dune generator algorithm. The Pollock logic that turns photos into heightmaps cannot do dunes very well. This is because dunes are shaped like crescents, and the algorithm cannot really see the large-scale asymmetry.

      I love dunes and they are fairly common in fiction scenes. This ranks high in our wish list, but we are not working on this yet.

      Delete
  2. What speed was the video playback running at? 2x? 4x? I guess the loading screens would take too long at 1x, but it's a bit difficult to figure out what's going on in the biome image selection part.

    I don't remember if you answered this in a previous post, but how do the images work? Are the elevation values extracted through some 3D perspective trickery, or does it only use the colors/patterns from the image? If you're actually able to extract the height values from images that's really cool.

    Finally, your water looks a bit odd. Is the water surface modeled as a sing wave or some other repetitive function? I think it would look more realistic if there were additional random frequency components. Of course, that's a bit off topic.

    ReplyDelete
  3. Sorry, I meant to say "sine wave".

    ReplyDelete
    Replies
    1. At the moment the system cannot see depth from a single photograph. It finds elevation patterns and stores them in a probabilistic model. You can then use the model to predict new terrain given a starting point. The model is fractal as well so you can ask for answers at different frequencies or scales, and the answer will match how the source looks at that scale. This means you could synthesize from mountains to small rocks (only mountains are shown in this video).
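
      As a toy illustration of the idea (hypothetical code, not the actual Pollock model): estimate a roughness value per scale from an exemplar profile, then drive midpoint displacement with those learned amplitudes, so the synthesized detail matches the source statistics at each scale.

```python
import random
import statistics

def scale_roughness(exemplar, max_levels=4):
    """Estimate detail amplitude per scale: std of height differences
    between samples 2**level apart in a 1D exemplar profile."""
    rough = []
    for level in range(max_levels):
        step = 2 ** level
        diffs = [exemplar[i + step] - exemplar[i]
                 for i in range(len(exemplar) - step)]
        rough.append(statistics.pstdev(diffs) if len(diffs) > 1 else 0.0)
    return rough  # rough[0] is the finest scale

def synthesize(left, right, rough, rng, level=None):
    """Midpoint displacement driven by the learned per-scale roughness."""
    if level is None:
        level = len(rough) - 1  # start at the coarsest scale
    if level < 0:
        return [left, right]
    mid = (left + right) / 2 + rng.gauss(0.0, rough[level])
    a = synthesize(left, mid, rough, rng, level - 1)
    b = synthesize(mid, right, rough, rng, level - 1)
    return a + b[1:]

# Learn roughness from a made-up exemplar ridge, then predict a new
# profile between two anchor elevations.
exemplar = [0, 2, 5, 4, 7, 9, 6, 8, 11, 10, 12, 9, 7, 8, 5, 6, 4]
profile = synthesize(100.0, 140.0, scale_roughness(exemplar), random.Random(1))
```

      Querying the model at a different level would correspond to asking for terrain at another scale, as described above.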

      There are other simulation processes applied to this, for instance snow, which is computed using a physically-based model.

      All this is happening when biomes are generated. A typical continent will have from 15 to 20 biomes. This computes in about 5 minutes. This generation runs only once, it creates maps that are used later by the real-time component.

      Never mind the water, we did not pay much attention to it. Something I did not mention about this generation: we get other very useful maps almost for free here, like ocean currents and wave direction, or cloud probability.

      Delete
  4. Are you into the whole bitcoin decentralized currency thing? :D

    Because I would love to buy Voxel Farm Creators with bitcoins and finally start using it after several years of following your blog.

    Otherwise I would need to get a credit card or PayPal acc. ;)

    ReplyDelete
    Replies
    1. Debit-cards work as well I'm pretty sure (If that's a thing where you're from (I don't know much about international economics))

      Delete
    2. At the moment we have to take whatever our online merchant supports, and bitcoin is not supported. It is not a trivial task to roll out another payment processor. The update system is linked to your account, so you can get updates for the products while your subscription remains active. There is actual programming involved in making these things work, and we would rather spend the effort on voxels and procedural stuff.

      Delete
    3. I just couldn't wait and bought it now :D

      Just installing it now, and I am really curious what the Creators edition offers :)

      Maybe I will make a YouTube account for stuff about Voxel Farm, which is why I want to ask: is it OK with you to show your software in YouTube videos?

      Many thanks for this great piece of software ;)

      Delete
    4. It's me again.

      I played around a bit with Voxel Studio now. I am still a bit clumsy with the movement and selection, but I am slowly getting into it.

      After getting familiar with the movement, I tried to make a big hole in one of the mountains and realized that your LOD system works really well.
      It's fun to reminisce about the coffee mug problem while seeing the hole I made kilometers away. :)

      PS: After making the payment I realized that the whole licensing gets managed by your online merchant ^^

      Delete
    5. Thanks for your support. You are free to create and post videos and live streams from the software. We can hook you up with an early access version of Voxel Studio 3. Just email your request to support at voxelfarm.com and they will send you a link.

      Delete
  5. Hello Miguel,

    A bit off-topic, but are there any active projects currently using your engine besides Landmark? I saw a video of Dual Universe recently and thought that elements of it bear a striking resemblance to your work.

    ReplyDelete
    Replies
    1. Yes, there are several active projects. It takes time to develop a game, from concept to studio green-light. For this reason you do not hear about them for years. EQN was a special case because they chose to develop in public and crowdsource some of the game creation. But most of the development occurs in secret.

      About people using the Indie and Pro licenses, this is something we do not track. Dual Universe confirmed publicly they are not using Voxel Farm. I saw this headline somewhere on the Internet so it must be true :)

      Delete
    2. EQ:N is also a good example of WHY studios usually don't announce development early, since it got cancelled, thus disappointing a lot of people, even though games get cancelled quite often, and most people don't hear about it. =/.

      Delete
    3. Yes, cancellation is a constant and necessary aspect of the industry.

      I can think of two specific Voxel Farm games that got canned which I am very happy were never announced. These were so high profile that a public cancellation would have resulted in all sorts of death threats and civil unrest.

      Of course we are not telling what these games were, or who was making them, so do not bother asking. I do hope the story becomes public one day in the future because they would have been very cool games.

      Delete
  6. Hey Miguel, I have a question. Clearly from your screenshots, you have a method to render objects a huge distance from the camera. You have integration with Unity, and I assume at least a beta integration with UE4. This classic Gamasutra article, http://www.gamasutra.com/view/feature/131393/a_realtime_procedural_universe_.php?print=1, describes the problem: the limited numerical resolution of the Z buffer fails over planetary-scale distances, so terrain like yours will render incorrectly.

    The solution requires rejiggering the renderer, or camera hacks in Unity. Do you support or know of an implementation of an appropriate large scale renderer for UE4 or Unity? All large scale games have one at some level, such as GTA5, but all my google searches reveal people having nothing but problems with this.

    ReplyDelete
    Replies
    1. Voxel Farm worlds can be very large, so the engine's coordinate system uses 64-bit double-precision floating point. When feeding a scene to a 32-bit system like Unity or Unreal, each mesh chunk is positioned relative to a reference frame stored in 64-bit coordinates. The reference frame is initialized at the starting viewer position. Unless the viewer travels a very large distance, the mesh chunk positions can be expressed in 32-bit coordinates with little to no error. If the frame of reference needs updating, this is something the application (your game) needs to do in a single frame. It involves translating all objects so the rendering remains identical, but it is quite feasible.
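
      A minimal sketch of such a rebasing scheme, with made-up names (this is not the Voxel Farm API): chunks keep 64-bit world positions, the renderer only ever sees small offsets from a reference origin, and the origin snaps to the viewer when it drifts too far.

```python
REBASE_THRESHOLD = 10_000.0  # metres before 32-bit error becomes a concern

class Scene:
    def __init__(self):
        self.origin = (0.0, 0.0, 0.0)   # 64-bit world-space reference frame
        self.chunks = {}                # chunk id -> 64-bit world position

    def local_position(self, chunk_id):
        """32-bit-safe position handed to the renderer (Unity/UE4 side)."""
        wx, wy, wz = self.chunks[chunk_id]
        ox, oy, oz = self.origin
        return (wx - ox, wy - oy, wz - oz)

    def maybe_rebase(self, viewer_pos):
        """If the viewer strayed too far (horizontally) from the origin,
        move the origin to the viewer. All local positions shift by the
        same amount, so the rendered image is unchanged."""
        dx = viewer_pos[0] - self.origin[0]
        dz = viewer_pos[2] - self.origin[2]
        if (dx * dx + dz * dz) ** 0.5 > REBASE_THRESHOLD:
            self.origin = viewer_pos

scene = Scene()
scene.chunks["a"] = (1_000_000.0, 50.0, 2_000_000.0)  # huge world coords
scene.maybe_rebase((999_900.0, 0.0, 2_000_100.0))
# After rebasing, the chunk's local coords are small and 32-bit friendly.
```

      The rebase must happen in a single frame, exactly as described above, so the viewer never sees the world shift.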

      Since voxels produce non-intersecting geometry, that is, only visible surfaces produce triangles, the z-buffer issue is largely minimized. You can run our Unity 3D or UE4 demos and see there is no flickering or z-fighting even for very distant chunks (20 km away or more).

      For planetary scales, this approach begins to break. We encountered this issue when running the new prototype code for the multi-clipmap scenes (seen here: http://procworld.blogspot.ca/2016/01/riding-voxels-into-sunset.html). There are a couple of ways to get around this, but we are still deciding on the best approach.

      Delete
    2. Yes, of course, 64 bit coordinates are obvious. I was talking about the latter - your engine can presumably generate a scene for an entire planet, using the global methods you discuss in previous posts, without having full detail level data available for every single cube of the surface. You generate that data in realtime when a player visits the regions, presumably.

      This is straightforward - your engine does it at a higher level of quality than anyone else, but a half dozen engines can generate the planet scale scenes. The question is how you render it in UE4 or Unity. There's a camera hack in Unity but it's a hack...

      Delete
  7. Hello! I just wanted to ask... is there a planned timeframe for this update, or a "we hope this can be in the program around X month"? Or if I buy a Voxel Studio license can I use this even if it is only by using a beta/alpha unstable no-guarantees-it-will-work-correctly release? I really want this to create maps for playing RPGs with friends, even if it's not perfect yet.

    ReplyDelete