
Random Complex Notes

Christopher M Overton edited this page May 13, 2015 · 12 revisions

5/5/15 At the moment I am working on a random monotone polygon generator, although additional work is needed on this problem.

The algorithm's general workings are as follows:

  1. Generate a base primitive polygon type (3 vertices, 3 edges).
  2. An iterator continues until MaxSize sides have been generated for the n gon.
  3. As 2. continues, we load the queue Q with the sides of the m gon (where m <= n) once we have cycled through all sides of the m gon. This repeats cyclically until the condition in 2. is met.
  4. A subdivision point x is chosen, and the neighboring edges and points are determined, since two neighboring points and edges are needed for a given edge subdivision.
  5. I have chosen for the moment (with deficits still to be worked on) a cubic interpolation method for smoothing the n polygon, or, over repeated generations, for yielding a smoother n gon at higher orders of n. This has a requirement, however: the four points of the set p must satisfy x0 = -1, x1 = 0, x2 = 1, and x3 = 2 on the given x interval. To achieve this, I rescale the set of points by 1/abs(maxx - minx), where minx and maxx are determined from the given interval. This rescales maxx to maxx/(maxx - minx) = 1 + minx/(maxx - minx), which becomes 1 after translation, so that the four points of the set p map with p1x = 0 and p2x = 1. Then for p0 and p3 we determine the slopes of their respective edges relative to p1 and p2, and compute, via the point-slope formula under the same scaling and translation, the points at x0 = -1 and x3 = 2, which gives p0 and p3 for the desired set. Once we have the set p, we need only supply the subdivision point x for the given edge interval.
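The rescaling and point-slope step above can be sketched as follows. This is a minimal illustration, not the actual code in RandomComplex.py: the helper name is mine, the slope-based placement of the outer points assumes monotone x across the four points, and I use numpy's `polyfit` to fit the unique cubic through them.

```python
import numpy as np

def cubic_subdivide(p0, p1, p2, p3, t):
    """Interpolate a subdivision point on edge (p1, p2) at parameter
    t in [0, 1], using the neighbor points p0 and p3 (hypothetical
    sketch of the rescaling scheme described above)."""
    # Rescale/translate so that p1 maps to x = 0 and p2 to x = 1,
    # mirroring the 1/abs(maxx - minx) scaling in the notes.
    minx, maxx = p1[0], p2[0]
    s = 1.0 / abs(maxx - minx)
    x = [(px - minx) * s for px, _ in (p0, p1, p2, p3)]
    y = [py for _, py in (p0, p1, p2, p3)]
    # Point-slope placement of the outer points at x = -1 and x = 2.
    m01 = (y[1] - y[0]) / (x[1] - x[0])   # slope of edge (p0, p1)
    m23 = (y[3] - y[2]) / (x[3] - x[2])   # slope of edge (p2, p3)
    xs = [-1.0, 0.0, 1.0, 2.0]
    ys = [y[1] - m01, y[1], y[2], y[2] + m23]
    # The unique cubic through four points, evaluated at t.
    coeffs = np.polyfit(xs, ys, 3)
    return (p1[0] + t * (p2[0] - p1[0]), float(np.polyval(coeffs, t)))
```

For collinear input the fitted cubic degenerates to the line itself, which is a quick sanity check on the scheme.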

The deficits of this algorithm occur where the incident angle between neighboring edges is greater than pi radians. In this case the edge curve can become concave on one side, with the inverse effect on the other. Higher orders of n typically produce n gons that are 'scimitar' shaped. :)

Likely the resolution to this is angle testing neighboring edges and then using an alternate algorithm which reduces the differences between edge neighbors where the angle differences are large (above pi radians).

A simple method that I can think of, for instance, would compute the centroid and the normal of an edge (simply the negative reciprocal of the edge's slope).
An outward extension from a given subdivision point increases the distance of that point from the centroid of the m gon. At this point I haven't settled on a particular metric for how far to extend along the normal axis, but for now I may use something as simple as
1/2 the length of the edge, for example.
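A sketch of that simple method, assuming the provisional half-edge-length offset mentioned above (the function name and argument layout are hypothetical):

```python
import math

def push_out_midpoint(a, b, centroid):
    """Push the midpoint of edge (a, b) outward along the edge normal,
    away from the polygon centroid, by half the edge length."""
    mx, my = (a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0
    ex, ey = b[0] - a[0], b[1] - a[1]
    length = math.hypot(ex, ey)
    nx, ny = -ey / length, ex / length      # unit normal to the edge
    # Flip the normal if it points toward the centroid.
    if (mx - centroid[0]) * nx + (my - centroid[1]) * ny < 0:
        nx, ny = -nx, -ny
    d = 0.5 * length                         # offset: half the edge length
    return (mx + d * nx, my + d * ny)
```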

update 5/7/15:

So I've managed to include a test (currently unused, as I've been testing another algorithm subpart mentioned below) which determines whether or not an edge is suitable for cubic interpolation. The idea goes as follows: if a vertex of the edge is a relative x minimum or maximum with respect to its neighbors, the test flags the edge as invalid for interpolation methods. Pretty simple, but important, since a relative x minimum or maximum between neighboring points means that an interpolated edge may yield a concave result.
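The test can be sketched like this (a hypothetical helper, not the code in the repository):

```python
def edge_ok_for_interpolation(prev_pt, a, b, next_pt):
    """Return False if either endpoint of edge (a, b) is a relative
    x minimum or maximum among its neighbors, flagging the edge as
    unsuitable for cubic interpolation (possible concavity)."""
    def is_x_extremum(left, mid, right):
        # mid is a relative extremum if its x is <= both or >= both neighbors
        return (mid[0] <= left[0] and mid[0] <= right[0]) or \
               (mid[0] >= left[0] and mid[0] >= right[0])
    return not (is_x_extremum(prev_pt, a, b) or is_x_extremum(a, b, next_pt))
```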

The alternate method, which is actually easier to implement for building higher order n polygons, computes the normal vector of a given edge, but this means we need the elements necessary for ensuring that our normal in the 2d plane points in the desired direction. Namely:

  1. The walk direction of the primitive polygon is established as clockwise at the outset. That is, we use the signed area computation (the cross product sum over the ordered walk vertices of the polygon) to determine direction. If the walk direction is counter clockwise, we reverse it: the first vertex is kept in position and the remainder of the ordered list is reversed, as shown in code.
  2. With the direction ordered, I've incorporated a list 'rotation heir' which tracks direction as we compute new edges and remove old ones. We track direction by computing the nearest vertices to the parent orders of the rotation heir hash set. With this hash it is then an easy lookup to ensure that the direction on a given edge always follows the appropriate clockwise walk pattern. This means the b-a vector is always in the clockwise direction, where for an edge (a, b), a is the start and b is the end of the walk on that edge. Any positive-angle rotation of this vector therefore points away from the n gon's centroid and not towards it, which in turn means any vertex added in the direction of the rotated vector is non concave...this still doesn't solve the concavity problem in full, however.
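Step 1 above can be sketched with the standard shoelace formula (a minimal illustration; the sign convention assumes conventional y-up axes, and the function names are mine):

```python
def signed_area(poly):
    """Signed area via the shoelace / cross-product sum.
    Positive for counter-clockwise vertex order (y-up axes)."""
    s = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        s += x0 * y1 - x1 * y0
    return s / 2.0

def make_clockwise(poly):
    """If the walk is counter-clockwise, keep the first vertex in
    place and reverse the remainder, as described in step 1."""
    if signed_area(poly) > 0:                # counter-clockwise
        return [poly[0]] + poly[:0:-1]       # reverse all but poly[0]
    return list(poly)
```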

The goal that I am working towards...one considered method for computing a triangulated subdivision of the n-gon works by rescaling the set of vertices of the polygon while retaining the original relation between vertices and their neighboring positions relative to the old set. The idea is similar in some respects to the layered onion approach, except that one produces a quad subdivision with each scaled set of added vertices; in topological graphics terms, this is basically a scaled extrusion of points, building one set of faces into the next iteration. There are some limitations and downsides to this, however. Once one has a set of quad polygons subdividing the given n gon area as desired, one can compute the triangulation of all such quads.
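A minimal sketch of that scaled extrusion, assuming a simple uniform scale toward the centroid (the scale factor and function names are illustrative, not taken from the scripts):

```python
def inset_ring(poly, factor=0.5):
    """Scale the vertex set toward the centroid, keeping each new
    vertex paired with its original (the 'scaled extrusion')."""
    cx = sum(x for x, _ in poly) / len(poly)
    cy = sum(y for _, y in poly) / len(poly)
    return [(cx + factor * (x - cx), cy + factor * (y - cy)) for x, y in poly]

def quad_ring(outer, inner):
    """Quads bridging the outer ring to the inset ring; each quad can
    later be split along a diagonal into two triangles."""
    n = len(outer)
    return [(outer[i], outer[(i + 1) % n],
             inner[(i + 1) % n], inner[i]) for i in range(n)]
```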

update 5/8/15: I've opted to try another approach here, using conformal geometry in the construction of more primitive n gon types; by conformal one means conforming the geometry, in this case, to an ellipse/circle as a prescribed boundary condition for the set of vertices. I've coded this under the subset file heading RandomComplex2.py . The problem that I've found, especially with polar coordinate computations using an eccentric elliptical form (where e > 0), is that highly eccentric elliptic forms tend to increase the degree of acceleration/deceleration, and hence the velocity change per angle change, relative to a given focus. This means that 45 degrees corresponds geometrically to the ellipse in a different way than, say, 135 degrees, in so far as the distance between neighboring vertices under equal step subdivisions of theta is concerned; in other words, as one increases theta in equal angle steps, neighboring vertices become more closely bunched together as one approaches pi radians (in the polar form). This leads to an n gon geometry with quasi-curved forms at one end and a more linear form at the other end of the ellipse: one portion of the geometry appears more linear while the other more closely approximates the curvature of the ellipse itself. A solution to this problem is to force the use of a circle when using equal angle step subdivisions, and, when conforming to an elliptic type data set, instead to scale on either axis or both with different scaling values, which appears to provide greater visual uniformity in the linear appearance of the geometry.
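The bunching effect is easy to demonstrate numerically. This sketch samples the focal polar form r = a(1 - e^2)/(1 + e*cos(theta)) at equal angle steps and measures how uneven the neighbor spacing becomes (vertices bunch near one apse and spread near the other); the function names are mine, not from RandomComplex2.py:

```python
import math

def focal_polar_points(a, e, steps):
    """Points on an ellipse (semi-major axis a, eccentricity e) sampled
    at equal polar-angle steps about a focus."""
    pts = []
    for k in range(steps):
        t = 2 * math.pi * k / steps
        r = a * (1 - e * e) / (1 + e * math.cos(t))
        pts.append((r * math.cos(t), r * math.sin(t)))
    return pts

def chord_spread(pts):
    """Ratio of longest to shortest neighbor distance; 1.0 means
    perfectly uniform spacing (as on a circle)."""
    n = len(pts)
    d = [math.dist(pts[i], pts[(i + 1) % n]) for i in range(n)]
    return max(d) / min(d)
```

For e = 0 (a circle) the spread is exactly 1, while even moderate eccentricities produce chords several times longer on one side than the other, which is the non-uniformity described above.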

Since the conformal approach appears to provide a method for controlling the convexity of generated n gon geometry, I will likely also consider, as in the previous RandomComplex.py script method, constructing from a given primitive 3 gon a corresponding CircumCircle, and then generating random edge subdivisions while conforming normal-distance boundaries to the boundary of the circumcircle. This ensures that any n gon edge subdivision is always guaranteed to produce a convex n gon. As it turns out, one would likely suspect that any convex conformal geometry is in turn likely to produce another convex n gon geometry when considering higher order edge subdivision and normal-conformed distance translation of newly created vertices.

Update 5/11/15: I've added a third polygon generator which computes, from a 3 gon primitive, a circumcircle, that is, the circle whose circumference includes the three points of the 3 gon. With the circumcircle's center and radius computed, for any edge subdivision of the triangle we can determine not only a subdivision point but also the position which converts the 3 gon to a higher order n gon, with the subdivision vertex translated to the circumcircle's boundary (along the normal relative to its original position). For all the fuss, one might ask what the difference would be from simply picking three arbitrary points on a randomly centered and sized circle? Technically this might be the same, although one would again need to check any randomly selected point for nearest neighbor nodes (at least some hash-tracking algorithm); otherwise one would contend with edge crossings. I'll likely modify the algorithm further so that the circumcircle method is used for n gon subdivision up to a user-prescribed value m, and for n >= m interpolant methods are instead used for subdivision. To ensure the integrity of convexity in the n gon, I'll likely include coordinate rotation to ensure that monotonic x coordinate positions result for the boundary conditions. This can be assumed, I believe, provided that the n gon already satisfies convexity conditions, meaning the curvature does not switch sign between vertices; in other words, non convexity implies that for four given vertices there exists no local rotation which ensures monotonicity along a given coordinate axis.
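The circumcircle computation itself is standard; a self-contained sketch using the perpendicular-bisector determinant formula (names are mine, not the code in the third generator):

```python
import math

def circumcircle(a, b, c):
    """Center and radius of the circle through three non-collinear
    points, via the standard determinant formula."""
    ax, ay = a
    bx, by = b
    cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    r = math.hypot(ax - ux, ay - uy)
    return (ux, uy), r
```

With the center and radius in hand, translating a subdivision vertex to the circle boundary is just a matter of normalizing its offset from the center to length r.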

So, I've also completed an addition to RandomComplex3.py which includes cubic interpolation on a given edge as part of the subdivision smoothing process. It is worth noting that I've restricted this method to n gon iterations beyond, say, 10 edges, which would otherwise keep adding points on the circle with each new edge. If one continued supplying vertices of the n gon in that manner, one would merely approximate the circle, which is uninteresting. In this case, with higher n gon subdivision iterations, one would like to preserve some aspect of the primitive n gon type while supplying a new method which interpolates each new edge. Again, however, even where one can ensure, as with rotated coordinates, that coordinate positions are more likely to be monotonic, that is, ordered on the x axis so that for a given edge pair (a,b) the neighbors n1 and n2 satisfy n1x <= ax <= bx <= n2x, we still have the possibility of a non convex polygon appearing in the iteration cycle. However, with a sufficiently well defined primitive polygon type of, say, 5 or more convex edges (using the circumcircle method), I have generally found that interpolations appear more stably approximated under randomization to a convex polygon; that is to say, even if the result is non convex, it includes less area outside the defined boundaries of the polygon (via the vertex-to-vertex line test). This is potentially nice, since the vertex set of the polygon is then more easily scalable relative to other non convex polygon generating procedures, which would be less stable for scaling.

5/12/15

So after a number of trials fixing the cubic spline interpolation algorithm for a given edge segment, it appears that curve stability of iterated n gon subdivisions becomes problematic at higher degree; that is, the curve oscillations lead to vertices moving further away from, not closer to, the original subdivided edge. I suspect this is likely owing to maintaining continuity between first derivatives at either edge endpoint coupled with fixed boundary conditions, where larger differences between the first derivatives at the endpoints give rise to these oscillations. A little bit of online research yielded a corner cutting method called Chaikin's method; alternately, I've implemented a randomized method which is purely additive in the subdivision process, as opposed to being additive and destructive. As in the original version of this algorithm type, I take a randomized proportion of the original edge length, and this forms a normal translation from the midpoint of the subdivided edge.
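For reference, Chaikin's corner cutting replaces each edge of the closed polygon with the two points at 1/4 and 3/4 along it, doubling the vertex count per pass; a minimal sketch (not the randomized variant described above):

```python
def chaikin(poly, iterations=1):
    """Chaikin's corner-cutting subdivision of a closed polygon:
    each edge (a, b) is replaced by its 1/4 and 3/4 points."""
    for _ in range(iterations):
        new = []
        n = len(poly)
        for i in range(n):
            (ax, ay), (bx, by) = poly[i], poly[(i + 1) % n]
            new.append((0.75 * ax + 0.25 * bx, 0.75 * ay + 0.25 * by))
            new.append((0.25 * ax + 0.75 * bx, 0.25 * ay + 0.75 * by))
        poly = new
    return poly
```

Unlike the interpolating cubic, this scheme only ever cuts corners, so it cannot push vertices outward and preserves convexity, which is exactly the stability property the interpolation approach was missing.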

Not to get myself too sidetracked here, but this has rekindled some interest in topographical contouring, which brings up an interesting side note, since I was thinking about random topographical map generation from the standpoint of circle packing routines, and then adding to this, with circumcircles described for the packing, the definition of potentially closed contour paths. Here I've also considered randomized gradient fields to be used in determining scalars inside the closed contour region, so defining the shape of the original polygon, which presumably may have been smoothed mathematically by any prior iteration. This still leaves the contour connectivity problem in 3d modeling, especially where normals between edges vary significantly.

Here is an example polygon generated with inset subdivision, and then given random height mapping along the defined contour face loops: