Phony…

Generate grass on cpu, adapt to local terrain
Generate grass on gpu, no adaptation necessary
Geometry shader: particle system to simulate gravity, generate leaf mesh on terrain surface.

First: finish scenegraph manager, implement basic terrain tile class (flat, no heightmap)
Add shader support for dx9 renderer
Start dx10 engine, test geometry shader, texture arrays, …

implementation time

a first output of the new engine architecture

Currently, the engine only renders one type of mesh with one texture, but as long as they remain static (nested or not – doesn’t matter), the architecture is able to draw about 3k of these objects on an old x800 at approximately 20fps (debug):

[screenshot: ~3k tiger meshes]

Since I can’t (yet) animate that amount of objects simultaneously, I have to invest some more time into optimizing the internal vector and matrix classes. If the number of objects is reduced to about 1k, the engine can handle updating all their matrices and still run at a decent framerate (~15-20fps, release).

make some noise

An outline of a simple zoomable heightmap generation algorithm:

  • create a 256×256 noise map
  • upon zooming in, create a new one as soon as the zoom factor reaches (1.5x iteration level)
  • sample it at the respective coordinates to make it serve as the base for the new texture
  • use global coordinates (relative to the sphere) to create new texture iteration
  • iteration++
  • when zooming further in towards (2x iteration level), blend the new texture over the old one to hide transition artifacts
  • when zooming out again, blend it out between (2x iteration level) and (1.5x iteration level)
  • discard the texture upon reaching (1.5x iteration level)
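The blend schedule above can be sketched as a small helper. The 1.5x/2x thresholds come from the list; the linear ramp and the function name are my assumptions:

```cpp
#include <algorithm>

// Cross-fade factor for the heightmap texture at `level`: 0 before the
// texture is created (zoom < 1.5x level), 1 once it fully covers its
// parent (zoom >= 2x level), linear in between. Zooming out runs the
// same ramp in reverse, so the fade-out comes for free.
float blendFactor(float zoom, int level) {
    const float fadeIn  = 1.5f * level;  // texture gets created here
    const float fadeOut = 2.0f * level;  // texture fully covers its parent here
    if (fadeOut <= fadeIn) return 1.0f;  // level 0: base texture, always visible
    return std::clamp((zoom - fadeIn) / (fadeOut - fadeIn), 0.0f, 1.0f);
}
```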

using this approach, we will only have 256×256 heightmaps in graphics memory, albeit n of them.

i.e. if the target planet has a circumference (edge size) of 40’000km (Earth) and the maximum zoom level is a 1m edge size, we would end up with approx. 25 textures.
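The ~25 figure checks out if each iteration halves the covered edge length (an assumption, but consistent with the estimate – any constant subdivision factor works the same way):

```cpp
#include <cmath>

// Number of halving iterations needed to get from the planet's full edge
// length down to the target resolution: log2(start / end).
double iterationCount(double startEdgeMetres, double endEdgeMetres) {
    return std::log2(startEdgeMetres / endEdgeMetres);
}
// iterationCount(40'000'000.0, 1.0) ≈ 25.3 textures
```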

One approach would be to only keep the minimum amount of textures (thus limiting the view distance) in memory and discarding the rest, re-rendering them on demand.

The same algorithm can be applied to movement parallel to the ground: if the camera moves further than (0.25x edge width of current iteration), a new texture at the same iteration level but with new base coordinates is computed and blended over the old one between (0.25x edge width) and (0.33x edge width).
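The ground-parallel variant follows the same pattern; a sketch, with the 0.25x/0.33x thresholds taken from the paragraph above and the linear ramp again assumed:

```cpp
#include <algorithm>

// Blend factor for the replacement texture after the camera has moved
// `movedDistance` across a tile of the given edge width: 0 below a
// quarter edge width, 1 at a third, linear in between.
float groundBlend(float movedDistance, float edgeWidth) {
    const float start = 0.25f * edgeWidth;
    const float end   = edgeWidth / 3.0f;   // ~0.33x
    return std::clamp((movedDistance - start) / (end - start), 0.0f, 1.0f);
}
```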

    ideas in no particular order

    game:
    general:

    • usually one sun per starsystem
    • different star systems can only be reached via jump, not using sublight engines (it would take years otherwise)
    • this eliminates the need for rendering each star individually since they don’t alter their positions visually when travelling at sublight or less.
    • upon arrival in a new star system: the skybox texture has to be calculated only once. Star positions within the same star system are fixed.
    • galaxy / planets: fractal-based
    • generate planets as they get discovered
    • person who discovers a previously unknown planet gets to name it
    • first person to reach a new starsystem gets to name it
    • generate planet: server creates db entry with resource distribution etc., sends seed string to client
    • client: parses seed string, calls fractal function to calculate planet surface in desired resolution
    • since fractal function is deterministic, planet will look exactly the same on each client that receives the same seed string
    • careful: discovering a new planet triggers server-side generation of seed string and resource initialisation, not client-side.
    • Client only gets seed string and current resource distribution, as queried from the server-side db
    • and: iphone/web interface (ship movement / trading / ingame messaging)
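The seed-string scheme hinges on the fractal function being deterministic. A minimal sketch of that property – `hashSeed`/`heightAt` are illustrative names, and FNV-1a plus an integer finaliser stand in for the actual fractal function:

```cpp
#include <cstdint>
#include <string>

// Turn the server's seed string into a fixed-width integer seed (FNV-1a).
uint32_t hashSeed(const std::string& seed) {
    uint32_t h = 2166136261u;
    for (unsigned char c : seed) { h = (h ^ c) * 16777619u; }
    return h;
}

// Sample a deterministic height at lattice point (x, y). Every step is
// pure integer math, so each client that receives the same seed string
// reconstructs the identical surface.
float heightAt(uint32_t seed, int32_t x, int32_t y) {
    uint32_t h = seed ^ (uint32_t(x) * 0x9E3779B9u) ^ (uint32_t(y) * 0x85EBCA6Bu);
    h ^= h >> 16; h *= 0x45D9F3Bu; h ^= h >> 16;  // integer finaliser
    return float(h & 0xFFFF) / 65535.0f;          // normalised height [0, 1]
}
```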

    ship configuration:

    • each ship class has a max. complexity (# of elements that can be added)
    • each element has a complexity level, so that the maxed out ship doesn’t get too complex
    • elements must be connected to other elements of the ship, cannot be free flying
    • elements can be added in any place and/or angle as long as they touch one of the ship’s components or are close enough to one
    • crucial elements (elements that connect “leaves” to the ship’s body) cannot be removed if they’re the only connector a leaf has to the ship’s body
    • elements can be researched/constructed, bought, stolen or found/salvaged from wreckages (both in space and on planets)
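The "crucial element" rule boils down to a connectivity check: an element may be removed only if the remaining elements still form one connected ship. A sketch, where `adjacency` (element index → connected element indices) is a hypothetical representation and element 0 is assumed to be the ship's core:

```cpp
#include <queue>
#include <vector>

// True if `element` can be removed without orphaning any other element,
// i.e. everything except `element` is still reachable from the core (0).
bool canRemove(const std::vector<std::vector<int>>& adjacency, int element) {
    const int n = int(adjacency.size());
    if (n <= 1 || element == 0) return false;   // the core itself stays
    std::vector<bool> seen(n, false);
    seen[element] = true;                       // pretend it is already gone
    std::queue<int> todo;
    todo.push(0);
    seen[0] = true;
    int reached = 1;
    while (!todo.empty()) {
        int cur = todo.front(); todo.pop();
        for (int next : adjacency[cur])
            if (!seen[next]) { seen[next] = true; ++reached; todo.push(next); }
    }
    return reached == n - 1;  // all remaining elements still connected
}
```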

    element examples:

    • cargo container
    • crew quarter
    • engine room
    • exhaust
    • cockpit
    • solar panel
    • certain elements (such as a solar panel) have animation attached to them. All elements attached to an animated element get transformed as well (cargo container connected to a solar panel gets pulled towards/pushed away from the ship if the panel folds or unfolds)
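The propagation described in the last bullet falls out naturally if each element stores its transform relative to its parent. A sketch with a one-dimensional "fold offset" standing in for a full 4x4 matrix (names and structure are illustrative):

```cpp
#include <vector>

// Each element stores its position relative to its parent, so animating a
// parent (e.g. a folding solar panel) moves every attached element too.
struct Element {
    float localOffset;          // position relative to parent
    std::vector<int> children;  // indices into the element array
};

// Accumulate parent offsets down the tree into absolute positions.
void worldOffsets(const std::vector<Element>& elems, int root,
                  float parentWorld, std::vector<float>& out) {
    out[root] = parentWorld + elems[root].localOffset;
    for (int c : elems[root].children)
        worldOffsets(elems, c, out[root], out);
}
```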

    jump effect:
    if own ship jumps:

    • wavelength compression of surrounding (blueshift, fade out of all visible light, fade in of invisible spectrum, etc.)

    if other ship jumps:

    • ship starts to glow (star flare, blinding effect on the viewer, scales with distance); at the moment of the jump:
    • ship disappears, leaving a glowing “hull” of itself that fades out and expands quickly (a shockwave in the shape of the ship), as well as a short trail pointing in the direction the ship departed.
    • jump isn’t instant, but has a charge-time of ~30s?

    sublight: all stars (everything) in front of the ship are blue-shifted, everything behind the ship red-shifted (can be done using shaders).
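For the shader-side shift, the relativistic longitudinal Doppler factor is one plausible model (an assumption – a real shader would fold this into a per-star hue remap rather than work in wavelengths):

```cpp
#include <cmath>

// Observed wavelength for a star straight ahead/behind, with beta = v/c
// positive when moving towards the star: approaching compresses the
// wavelength (blue-shift), receding stretches it (red-shift).
double shiftedWavelength(double wavelengthNm, double beta) {
    return wavelengthNm * std::sqrt((1.0 - beta) / (1.0 + beta));
}
```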