
Implement additional filters for filtering extracted shorelines #143

Open
2320sharon opened this issue Jun 1, 2023 · 5 comments
Labels: help wanted (Extra attention is needed), V2 (for version 2 of coastseg)

@2320sharon (Collaborator)

This issue is for tracking ideas related to improving the quality of the extracted shorelines.
Here are a few ideas from the CoastSeg team, with some inspiration from recent research:

  1. Try using the geopandas simplify method on the shoreline vectors (a rough sketch of ideas 1 and 2 follows this list)
  2. Try dropping any sections of the shoreline that fall outside of the shoreline buffer range
  3. Apply a vertex filter
  • This filter compares each vertex to its neighbors and drops the vertex if it is more than 3 standard deviations away from them
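
A minimal sketch of ideas 1 and 2, assuming the extracted shorelines are in a GeoJSON file and that a separate buffer polygon layer is available; the file names and the 5 m tolerance are placeholders, not CoastSeg defaults:

import geopandas as gpd

# placeholder paths, not actual CoastSeg outputs
shorelines = gpd.read_file("extracted_shorelines_lines.geojson")
buffer_polygons = gpd.read_file("shoreline_buffer.geojson")  # hypothetical buffer layer

# work in a metric CRS so the simplify tolerance is in meters
utm_crs = shorelines.estimate_utm_crs()
shorelines = shorelines.to_crs(utm_crs)
buffer_polygons = buffer_polygons.to_crs(utm_crs)

# idea 1: Douglas-Peucker simplification of the shoreline vectors
shorelines["geometry"] = shorelines.simplify(tolerance=5)  # placeholder tolerance

# idea 2: drop the sections of each shoreline outside the buffer
shorelines = gpd.clip(shorelines, buffer_polygons)

shorelines.to_file("extracted_shorelines_filtered.geojson")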
2320sharon added the help wanted and V2 labels and self-assigned this on Jun 1, 2023
@2320sharon (Collaborator, Author)

tide_modeling_workflow.zip
Attached are the scripts and files I used to test the FES2014 tide model.

The zip file contains the following:

Data Files

  1. test_points.geojson: a series of points around the world used to test the tide model
  2. tide_regions_map.geojson: bounding boxes for each tide region around the world; the tide model is clipped to each region so that predictions are faster
  3. tides_for_points.json: the output of the testing_matrix_for_bad_tides.py script, which tries to find a tide for every point in points_with_no_tide.json by increasing and decreasing the search radius around each point and the time window
  4. points_with_no_tide.json: the geometry of each point for which no tide was found, as identified by the isolate_points_with_no_tide.py script

Tide Model Setup Scripts

  • If you already have the tide model set up and clipped to the regions, there is no need to run these scripts.
  1. aviso_fes_tides.py: used to download the FES2014 model
  2. clip_and_write_new_nc_files_cmd_line.py: a command-line script I made that clips the downloaded tide model to the tide regions specified in tide_regions_map.geojson (a rough sketch of the clipping step follows this list)
  • the instructions are at the top of the script
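
Not the actual clip_and_write_new_nc_files_cmd_line.py script, just a rough sketch of the clipping idea using xarray and geopandas; the paths and coordinate names are assumptions and may need adjusting to match the real FES2014 files:

import geopandas as gpd
import xarray as xr

# assumed paths; substitute the real locations of the model files and the regions map
regions = gpd.read_file("tide_regions_map.geojson")
constituent = xr.open_dataset("fes2014/ocean_tide/m2.nc")

for i, region in regions.iterrows():
    minx, miny, maxx, maxy = region.geometry.bounds
    # assumes the NetCDF coordinates are named "lon" and "lat" and are sorted ascending;
    # the real FES2014 grids may need longitude wrapping (0-360) before slicing
    clipped = constituent.sel(lon=slice(minx, maxx), lat=slice(miny, maxy))
    clipped.to_netcdf(f"clipped_region_{i}_m2.nc")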

Tide Testing Files

  1. prototype_model_tides_coastseg.py: for each point in test_points.geojson, attempts to find the tide and prints it
  2. isolate_points_with_no_tide.py: isolates only the points that did not have a tide and saves them to a file named points_with_no_tide.json
  3. testing_matrix_for_bad_tides.py: finds the tide for each point in points_with_no_tide.json by increasing and decreasing the search radius around the point and the time window (a rough sketch of this retry strategy follows this list)
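
Not the actual script, just a rough sketch of the retry strategy it describes; predict_tide is a hypothetical stand-in for whatever pyTMD call is used, and the radius and time offsets are placeholder values:

import itertools
from datetime import timedelta

def predict_tide(lon, lat, when):
    """Hypothetical stand-in for the actual pyTMD prediction call."""
    raise NotImplementedError

def find_tide_with_offsets(lon, lat, when,
                           radii=(0.0, 0.05, 0.1, 0.2),  # degrees; placeholder values
                           hour_offsets=(0, -1, 1)):      # hours; placeholder values
    """Try progressively larger location offsets and shifted times until a tide is found."""
    for radius, hours in itertools.product(radii, hour_offsets):
        offsets = [(radius, 0), (-radius, 0), (0, radius), (0, -radius)]
        for dx, dy in offsets:
            tide = predict_tide(lon + dx, lat + dy, when + timedelta(hours=hours))
            if tide is not None:
                return tide
    return None  # no tide found for any offset combination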

Recipe for the Environment

conda create -n tide_modeling python=3.10 -y
conda activate tide_modeling
pip install pyTMD matplotlib
conda install -c conda-forge geopandas xarray netCDF4

@mlundine commented Feb 29, 2024

See below for the vertex filter; it is best applied at the end, on the extracted_shorelines_lines.geojson file.

import geopandas as gpd
import numpy as np
import os

def vertex_filter(shorelines):
    """
    Iterative 3-sigma filter on the number of vertices per shoreline.
    Repeatedly drops shorelines with too many vertices until every
    shoreline left in the file has a vertex count below mean + 3*std.

    Saves output to the same directory with the same name but with '_vtx' appended.

    inputs:
    shorelines (str): path to the extracted shorelines geojson
    outputs:
    new_path (str): path to the filtered file
    """
    gdf = gpd.read_file(shorelines)

    # count the vertices of each shoreline
    gdf['vtx'] = gdf.geometry.apply(lambda geom: len(geom.coords))
    filter_gdf = gdf.copy()

    # keep filtering until the number of shorelines stops changing
    count = len(gdf)
    new_count = None
    while count != new_count:
        count = len(filter_gdf)
        sigma = np.std(filter_gdf['vtx'])
        mean = np.mean(filter_gdf['vtx'])
        limit = mean + 3 * sigma
        filter_gdf = filter_gdf[filter_gdf['vtx'] < limit]
        if mean < 5:
            # shorelines are already very short; stop to avoid over-filtering
            break
        new_count = len(filter_gdf)

    new_path = os.path.splitext(shorelines)[0] + '_vtx.geojson'
    filter_gdf.to_file(new_path)
    return new_path

@mlundine commented Feb 29, 2024

Smoothing the vector shorelines, probably best implemented after tidal correction.

(attached image)

import os
import geopandas as gpd
import numpy as np
import shapely


def arr_to_LineString(coords):
    """
    Makes a LineString feature from a list/array of xy coordinates
    inputs: coords
    outputs: line
    """
    points = [shapely.geometry.Point(xy) for xy in coords]
    line = shapely.geometry.LineString(points)
    return line

def chaikins_corner_cutting(coords, refinements=5):
    """
    Smooths out lines or polygons with Chaikin's corner-cutting method
    """
    coords = np.asarray(coords)
    for _ in range(refinements):
        # duplicate each vertex, then blend each copy with its neighbor
        # using 3:1 weights, keeping the endpoints fixed
        L = coords.repeat(2, axis=0)
        R = np.empty_like(L)
        R[0] = L[0]
        R[2::2] = L[1:-1:2]
        R[1:-1:2] = L[2::2]
        R[-1] = L[-1]
        coords = L * 0.75 + R * 0.25
    return coords

def smooth_lines(shorelines):
    """
    Smooths out shorelines with Chaikin's method
    saves output with '_smooth' appended to original filename in same directory

    inputs:
    shorelines (str): path to extracted shorelines
    outputs:
    save_path (str): path of output file
    """
    dirname = os.path.dirname(shorelines)
    save_path = os.path.join(dirname, os.path.splitext(os.path.basename(shorelines))[0] + '_smooth.geojson')
    lines = gpd.read_file(shorelines)
    new_lines = lines.copy()
    for i in range(len(lines)):
        line = lines.iloc[i]
        # get the xy coordinates, smooth them, and rebuild the geometry
        coords = np.array(line.geometry.coords)
        refined = chaikins_corner_cutting(coords)
        refined_geom = arr_to_LineString(refined)
        new_lines.loc[new_lines.index[i], 'geometry'] = refined_geom
    new_lines.to_file(save_path)
    return save_path
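
For reference, a minimal usage sketch chaining the two helpers, assuming vertex_filter and smooth_lines from the snippets above are available in the same session; the input path is a placeholder:

# placeholder path to an extracted-shorelines file
input_path = "extracted_shorelines_lines.geojson"

# drop outlier shorelines by vertex count, then smooth the survivors
filtered_path = vertex_filter(input_path)    # writes ..._vtx.geojson
smoothed_path = smooth_lines(filtered_path)  # writes ..._vtx_smooth.geojson
print(smoothed_path)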

@2320sharon (Collaborator, Author)

Thanks for posting these here. Your code is very clean and well structured.
Wouldn't it make sense to smooth the shorelines before determining the shoreline position along each transect, since we would want the best possible shorelines to be used?

@mlundine

Ahh, you're right, I forgot the tidal correction is applied at the transect intersections.
