
Map Generation Project

By Joe Howarth

Purpose

This project seeks to procedurally create plausible geography and display it as a map. The generator will be used for creating the starting conditions in a historical population dynamics simulation.

How to Run

npm i
npm run serve

Methodology

Unlike many map and terrain generators, this project does not use noise as its fundamental building block. Instead, it builds a Delaunay triangulation over semi-random points and combines elevations from several hill, mountain, basin and slope functions. Once the base geology is in place, each triangle's slope, downhill neighbor, water flux and erosion rate are calculated. Eroding the initial terrain leads to natural-looking ravines, plains, fingered mountains and long river valleys.
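
As a rough illustration, here is a minimal sketch of the downhill/flux pass, assuming per-triangle `heights` and `neighbors` arrays (names are illustrative, not the project's actual API):

```js
// Sketch: route each triangle's rainfall to its lowest neighbor and accumulate flux.
// heights[t] is the elevation of triangle t; neighbors[t] lists adjacent triangle ids.
function computeFlux(heights, neighbors) {
  const n = heights.length;

  // Each triangle drains to its lowest neighbor, or nowhere if it is a local minimum.
  const downhill = heights.map((h, t) => {
    let best = -1;
    for (const nb of neighbors[t]) {
      const lowest = best === -1 ? h : heights[best];
      if (heights[nb] < lowest) best = nb;
    }
    return best;
  });

  // Visit triangles from highest to lowest so upstream flux is settled first.
  const order = [...Array(n).keys()].sort((a, b) => heights[b] - heights[a]);
  const flux = new Float64Array(n).fill(1); // uniform rainfall per triangle
  for (const t of order) {
    if (downhill[t] !== -1) flux[downhill[t]] += flux[t];
  }
  return { downhill, flux };
}
```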

Rivers can easily be found by thresholding water flux. Similarly, coast outlines and terrain contours can be made by finding edges that straddle some height value. These three features share a common routine for combining neighboring segments into longer rivers, coastlines and contours.
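
A hedged sketch of that shared merging step, assuming segments are given as pairs of endpoint ids (the names and representation are assumptions):

```js
// Sketch: glue edge segments (pairs of endpoint ids) that share an endpoint
// into longer polylines; usable for rivers, coastlines and contours alike.
function mergeSegments(segments) {
  const pool = segments.map(s => [...s]);
  const paths = [];
  while (pool.length) {
    const path = pool.pop();
    let grew = true;
    while (grew) {
      grew = false;
      for (let i = 0; i < pool.length; i++) {
        const [a, b] = pool[i];
        if (a === path[path.length - 1]) path.push(b);
        else if (b === path[path.length - 1]) path.push(a);
        else if (a === path[0]) path.unshift(b);
        else if (b === path[0]) path.unshift(a);
        else continue;
        pool.splice(i, 1);
        grew = true;
        break;
      }
    }
    paths.push(path);
  }
  return paths;
}

// e.g. rivers: keep edges whose triangle exceeds a flux threshold, then merge
// const rivers = mergeSegments(edges.filter(e => flux[e.tri] > THRESHOLD)
//                                   .map(e => [e.p0, e.p1]));
```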

Cities and towns are placed by scoring candidate triangles on (a) high water flux, (b) distance from other settlements and (c) whether the triangle is coastal. The first several placed are classified as cities; they get the best locations and carry the highest repulsion factors. Towns follow, but are less affected by distance, so they "fill in the gaps".
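
A minimal sketch of how such a scoring function could look; the weights, terms and helper names are assumptions, not the project's actual formula:

```js
// Sketch: score a candidate triangle for settlement placement; weights and
// helper names (isCoastal, repulsion) are illustrative.
function settlementScore(t, flux, isCoastal, placed, centroids, repulsion) {
  let score = Math.sqrt(flux[t]);              // (a) favor high water flux
  for (const p of placed) {                    // (b) stay away from existing settlements
    const dx = centroids[2 * t] - centroids[2 * p];
    const dy = centroids[2 * t + 1] - centroids[2 * p + 1];
    score -= repulsion / (Math.sqrt(dx * dx + dy * dy) + 1e-9);
  }
  if (isCoastal(t)) score += 1;                // (c) bonus for coastal triangles
  return score;
}

// Cities are placed first with a large repulsion; towns reuse the same scoring
// with a smaller repulsion, so they fill in the gaps.
```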

Data Structures

Array Based DCEL

For performance, the DCEL graph is represented with flat arrays and indices rather than heap-allocated objects:

  • halfedges: maps edge index to the opposite half-edge index
  • triangles: maps edge index to vertex index
  • points: maps point index to x,y coords
  • centroids: maps triangle id to the x,y coord of its centroid

To gain full power from this structure, many transformation functions were defined, for example (a few are sketched after the list):

  • edges_of_triangle
  • triangle_of_edge
  • next_edge
  • prev_edge
  • edges_around_point etc.
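
A minimal sketch of a few of these helpers, assuming the usual flat half-edge convention in which triangle t owns half-edges 3t, 3t+1 and 3t+2:

```js
// Sketch: flat-array DCEL traversal helpers. halfedges[e] is the opposite
// half-edge of e, or -1 on the hull.
const edgesOfTriangle = t => [3 * t, 3 * t + 1, 3 * t + 2];
const triangleOfEdge  = e => Math.floor(e / 3);
const nextEdge        = e => (e % 3 === 2) ? e - 2 : e + 1;
const prevEdge        = e => (e % 3 === 0) ? e + 2 : e - 1;

// Collect the half-edges around a point, given one incoming half-edge `start`.
function edgesAroundPoint(halfedges, start) {
  const result = [];
  let incoming = start;
  do {
    result.push(incoming);
    incoming = halfedges[nextEdge(incoming)];
  } while (incoming !== -1 && incoming !== start);
  return result;
}
```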

This structure is ideally suited for static DCELs, but suffers when updates are required.

After the initial DCEL was built, many triangles near the edges were too obtuse and disrupted the slope and erosion calculations, so these triangles were removed. This posed a problem for the index-based DCEL because many operations map over all points, centroids, triangles etc. and expect a contiguous array. This was solved by creating a new contiguous array, 'triIDs', which maps into the now hole-y base DCEL generated by the Delaunay triangulation.
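
A minimal sketch of how such a contiguous index might be maintained; `isRemoved` and the variable names are assumptions:

```js
// Sketch: after removing overly obtuse boundary triangles, keep a contiguous
// list of surviving triangle ids so map/filter-style passes still work.
function buildTriIDs(numTriangles, isRemoved) {
  const triIDs = [];                                        // contiguous index -> original triangle id
  const invTriIDs = new Int32Array(numTriangles).fill(-1);  // original id -> contiguous index
  for (let t = 0; t < numTriangles; t++) {
    if (!isRemoved(t)) {
      invTriIDs[t] = triIDs.length;
      triIDs.push(t);
    }
  }
  return { triIDs, invTriIDs };
}

// Later passes (slope, erosion, flux, ...) iterate over triIDs and index back
// into the original, now sparse, DCEL arrays.
```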

On top of this, conventional adjacency lists and centroids allowed easy mapping over the graph when calculating slopes, downhills, erosion, flux, city sites and rivers.

Parts to Consider for Computational Geometry Final Project

Use of DCEL adaptation

see above

Planar point location

A key feature of maps is visually displaying information and allowing interaction. To enable this, there needs to be a mapping from coordinates to polygonal regions.

Given that a Delaunay triangulation already exists, I first turned to Kirkpatrick's triangle-refinement planar point location algorithm. Finding independent sets and intersecting triangles worked; however, linking the old DCEL representation with the 'refined' DCEL proved too cumbersome with the static DCEL I had previously chosen. Additionally, the fraction of removed triangles was relatively low, causing high space usage and (non-asymptotic) runtimes. My progress on this can be found under src/map_gen/kirkpatrick.js.

As a sanity check, I implemented a naive/brute-force method that loops over all triangles and checks whether the point is in any of them. Surprisingly, this method was relatively fast for normal map sizes; however, it was clearly non-optimal.
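
A sketch of the brute-force locate, using a standard sign-of-cross-product point-in-triangle test (the data layout here is assumed):

```js
// Signed area test used by the point-in-triangle check.
function sign(px, py, ax, ay, bx, by) {
  return (px - bx) * (ay - by) - (ax - bx) * (py - by);
}

// A point is inside (or on) a triangle if it is not on both sides of its edges.
function pointInTriangle(px, py, ax, ay, bx, by, cx, cy) {
  const d1 = sign(px, py, ax, ay, bx, by);
  const d2 = sign(px, py, bx, by, cx, cy);
  const d3 = sign(px, py, cx, cy, ax, ay);
  const hasNeg = d1 < 0 || d2 < 0 || d3 < 0;
  const hasPos = d1 > 0 || d2 > 0 || d3 > 0;
  return !(hasNeg && hasPos);
}

// Brute force: O(n) per query over triVerts = [[ax,ay,bx,by,cx,cy], ...].
function naiveLocate(px, py, triVerts) {
  for (let t = 0; t < triVerts.length; t++) {
    if (pointInTriangle(px, py, ...triVerts[t])) return t;
  }
  return -1;
}
```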

The next approach I tried involved 'walking' the graph toward the query point. This was achieved by moving to the adjacent triangle with the minimum squared distance to the query point, checking whether that triangle contained the point and, if not, repeating. This significantly improved over naive and was further aided by using the last query's result as the start location. Therefore, if the triangle under the mouse needed to be found every frame, the query would be nearly instant.
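
A minimal sketch of the walking locate, with assumed `centroids`, `neighbors` and `containsPoint` helpers; a small guard is added to bail out if no neighbor gets closer:

```js
// Sketch: walk from a starting triangle toward the query point, always stepping
// to the neighbor whose centroid is closest (squared distance) to the query.
function walkLocate(px, py, start, centroids, neighbors, containsPoint) {
  const dist2 = t => {
    const dx = centroids[2 * t] - px, dy = centroids[2 * t + 1] - py;
    return dx * dx + dy * dy;
  };
  let t = start;
  while (!containsPoint(t, px, py)) {
    let best = -1, bestD = dist2(t);
    for (const nb of neighbors[t]) {
      const d = dist2(nb);
      if (d < bestD) { bestD = d; best = nb; }
    }
    if (best === -1) return -1; // no strictly closer neighbor: fall back to a slower method
    t = best;
  }
  return t;
}

// Caching the last result and passing it as the next call's `start` makes
// per-frame mouse queries nearly instant.
```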

Expanding on the idea of good starting locations, a grid can be used so that a query point is hashed to a grid cell (bucketed) and the walk starts no more than a fixed distance away from any point on the map. If the number of buckets is proportional to the number of triangles, then there is a constant number of triangles on the shortest path between a bucket's starting triangle and any point within that bucket, so queries are constant time with linear space. In practice, a very small grid was enough to achieve fast queries even on large maps. This approach leverages the regularity of the Delaunay triangulation of this type of map, which the more general Kirkpatrick algorithm could not exploit.
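
A hedged sketch of the grid-bucketed starting location, assuming map coordinates lie in [0, width] x [0, height]; names are illustrative:

```js
// Sketch: bucket the map into a coarse grid and remember one triangle per cell,
// so each query can start its walk from a nearby triangle.
function buildGrid(centroids, numTriangles, width, height, cols, rows) {
  const grid = new Int32Array(cols * rows).fill(-1);
  for (let t = 0; t < numTriangles; t++) {
    const cx = Math.min(cols - 1, Math.floor(centroids[2 * t] / width * cols));
    const cy = Math.min(rows - 1, Math.floor(centroids[2 * t + 1] / height * rows));
    grid[cy * cols + cx] = t; // any triangle in the cell works as a start
  }
  return grid;
}

function gridStart(px, py, grid, width, height, cols, rows) {
  const cx = Math.min(cols - 1, Math.floor(px / width * cols));
  const cy = Math.min(rows - 1, Math.floor(py / height * rows));
  return grid[cy * cols + cx]; // may be -1 for an empty cell; then fall back
}

// Usage: locate = (px, py) => walkLocate(px, py, gridStart(px, py, ...), ...);
```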

Planar Point Data

| Number of Triangles | Distance from last query (km) | Naive (ms) | Vectoring (ms) | Vectoring w/ grid (ms) |
| --- | --- | --- | --- | --- |
| 294500 | 19.75 | 63.40 | 0.20 | 0.086 |
| 294500 | 20.8 | 79.3 | 0.31 | 0.22 |
| 294500 | 149 | 58.5 | 5.1 | 0.144 |
| 294500 | 185.3 | 59.3 | 3.33 | 0.13 |
| 23950 | 6.31 | 6.12 | 0.13 | 0.056 |
| 23950 | 227.3 | 5.69 | 1.297 | 0.098 |
| 1994 | 41.1 | 0.45 | 0.19 | 0.032 |
| 1994 | 93.1 | 0.75 | 0.21 | 0.028 |

Scaling relative to N = 1e3 (factor, average ms). [Note: the grid uses a constant size, not one proportional to the number of triangles.]

| N | Naive (rel, ms) | Vectoring (rel, ms) | Vectoring w/ grid (rel, ms) |
| --- | --- | --- | --- |
| 1e3 | 1, 0.60 | 1, 0.20 | 1, 0.03 |
| 2e4 | 9.8, 5.9 | 3.5, 0.71 | 2.5, 0.077 |
| 3e5 | 108.5, 65.1 | 11.2, 2.24 | 4.83, 0.145 |

References