IMAGEIO FFMPEG_WRITER WARNING when creating MP4 timelapse animation & other warning #512 #513

Open: wants to merge 18 commits into base: development

Changes from all commits
14 changes: 14 additions & 0 deletions .github/FUNDING.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,14 @@
# These are supported funding model platforms

github: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: # Replace with a single Ko-fi username
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
liberapay: # Replace with a single Liberapay username
issuehunt: # Replace with a single IssueHunt username
lfx_crowdfunding: # Replace with a single LFX Crowdfunding project-name e.g., cloud-foundry
polar: # Replace with a single Polar username
buy_me_a_coffee: voskilianp
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']
3 changes: 2 additions & 1 deletion .gitignore
@@ -15,4 +15,5 @@
*.xml
/deprecated
/worked_example
/.virtual_documents
/.virtual_documents
example/Narrabeen_Profiles.csv
30 changes: 16 additions & 14 deletions README.md
@@ -12,12 +12,22 @@ CoastSat is an open-source software toolkit written in Python that enables users

![Alt text](https://github.com/kvos/CoastSat/blob/master/doc/example.gif)

*Finding CoastSat useful? Show your support with a GitHub star — it’s a simple click that helps others discover it* ⭐️

:star: **If you like the repo put a star on it!** :star:
#### Latest toolbox updates

:arrow_forward: *(2024/04/26)*
CoastSat v2.5: contributions from @2320sharon and @DanieTheron to improve the download updates and cloud masking for Landsat.

:arrow_forward: *(2023/11/09)*
CoastSat v2.4: bug fixes, a new function to create animations, the S2_HARMONIZED collection, better instructions for the gcloud installation

:arrow_forward: *(2023/07/07)*
CoastSat v2.3: addition of a better cloud mask for Sentinel-2 imagery using the s2cloudless collection on GEE

#### Additional resources

:point_right: Visit the [CoastSat website](http://coastsat.wrl.unsw.edu.au/) to explore and download regional-scale datasets of satellite-derived shorelines and beach slopes generated with CoastSat in different regions (Pacific Rim, US Atlantic coast).
:point_right: Visit the [CoastSat website](http://coastsat.wrl.unsw.edu.au/) to explore and download existing datasets of satellite-derived shorelines and beach slopes generated with CoastSat in the Pacific and Atlantic basins.

:point_right: Useful publications describing the toolbox:

@@ -29,22 +39,14 @@ CoastSat is an open-source software toolkit written in Python that enables users
- Beach slope dataset for Australia: https://doi.org/10.5194/essd-14-1345-2022

:point_right: Other repositories and extensions related to the toolbox:
- [CoastSat.slope](https://github.com/kvos/CoastSat.slope): estimates the beach-face slope from the satellite-derived shorelines obtained with CoastSat.
- [CoastSeg](https://github.com/dbuscombe-usgs/CoastSeg): an interactive toolbox for downloading satellite imagery, applying image segmentation models, mapping shoreline positions and more.
- [SDS_Benchmark](https://github.com/SatelliteShorelines/SDS_Benchmark): testbed for satellite-derived shorelines mapping algorithms and validation against benchmark datasets.
- [CoastSat.slope](https://github.com/kvos/CoastSat.slope): estimates the beach-face slope from the satellite-derived shorelines obtained with CoastSat.
- [CoastSat.PlanetScope](https://github.com/ydoherty/CoastSat.PlanetScope): shoreline extraction for PlanetScope Dove imagery (near-daily since 2017 at 3m resolution).
- [CoastSeg](https://github.com/dbuscombe-usgs/CoastSeg): image segmentation, deep learning, doodler.
- [CoastSat.islands](https://github.com/mcuttler/CoastSat.islands): 2D planform measurements for small reef islands.
- [InletTracker](https://github.com/VHeimhuber/InletTracker): monitoring of intermittent open/close estuary entrances.
- [CoastSat.Maxar](https://github.com/kvos/CoastSat.Maxar): shoreline extraction on Maxar World-View images (in progress)

#### Latest toolbox updates

:arrow_forward: *(2023/11/09)*
CoastSat v2.4: bug & fixes, function to create animations, S2_HARMONIZED collection, better instructions on gcloud installations

:arrow_forward: *(2023/07/07)*
CoastSat v2.3: addition of a better cloud mask for Sentinel-2 imagery using the s2cloudless collection on GEE

- [InletTracker](https://github.com/VHeimhuber/InletTracker): monitoring of intermittent open/close estuary entrances.

### Project description

Satellite remote sensing can provide low-cost long-term shoreline data capable of resolving the temporal scales of interest to coastal scientists and engineers at sites where no in-situ field measurements are available. CoastSat enables the non-expert user to extract shorelines from Landsat 5, Landsat 7, Landsat 8, Landsat 9 and Sentinel-2 images.
14 changes: 14 additions & 0 deletions authenticate.py
@@ -0,0 +1,14 @@
import ee
# script to test the Earth Engine connection before running the example scripts
def authenticate_and_initialize():
    try:
        # Authenticate the Earth Engine session.
        ee.Authenticate()
        # Initialize the Earth Engine module.
        ee.Initialize()
        print("Authentication successful!")
    except Exception as e:
        print(f"Authentication failed: {e}")

if __name__ == "__main__":
    authenticate_and_initialize()
50 changes: 33 additions & 17 deletions coastsat/SDS_download.py
@@ -145,7 +145,7 @@ def retrieve_images(inputs):
# get epsg code
im_epsg = int(im_meta['bands'][0]['crs'][5:])

# get quality flags (geometric and radiometric quality)
# get geometric accuracy, radiometric quality and tilename for Landsat
if satname in ['L5','L7','L8','L9']:
if 'GEOMETRIC_RMSE_MODEL' in im_meta['properties'].keys():
acc_georef = im_meta['properties']['GEOMETRIC_RMSE_MODEL']
@@ -156,6 +156,10 @@
rad_quality = im_meta['properties']['IMAGE_QUALITY']
elif satname in ['L8','L9']:
rad_quality = im_meta['properties']['IMAGE_QUALITY_OLI']
# add tilename (path/row)
tilename = '%03d%03d'%(im_meta['properties']['WRS_PATH'],im_meta['properties']['WRS_ROW'])
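A standalone sketch of the Landsat tile string built above (path/row values are invented for illustration):

```python
# Landsat tile name: WRS path and row, each zero-padded to 3 digits
wrs_path, wrs_row = 89, 83  # hypothetical scene
tilename = '%03d%03d' % (wrs_path, wrs_row)
print(tilename)  # 089083
```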

# get geometric accuracy, radiometric quality and tilename for S2
elif satname in ['S2']:
# Sentinel-2 products don't provide a georeferencing accuracy (RMSE as in Landsat)
# but they have a flag indicating if the geometric quality control was PASSED or FAILED
@@ -186,15 +190,17 @@
' raise an issue at https://github.com/kvos/CoastSat/issues'+
' and add your inputs in text (not a screenshot please).')
rad_quality = 'PASSED'

# add tilename (MGRS name)
tilename = im_meta['properties']['MGRS_TILE']

# select image by id
image_ee = ee.Image(im_meta['id'])

# for S2 add s2cloudless probability band
if satname == 'S2':
if len(im_dict_s2cloudless[i]) == 0:
raise Exception('could not find matching s2cloudless image, raise issue on Github at'+
'https://github.com/kvos/CoastSat/issues and provide your inputs.')
print('Warning: S2cloudless mask for image %s is not available yet, try again tomorrow.'%im_date)
continue
im_cloud = ee.Image(im_dict_s2cloudless[i]['id'])
cloud_prob = im_cloud.select('probability').rename('s2cloudless')
image_ee = image_ee.addBands(cloud_prob)
@@ -237,7 +243,7 @@

# create filename for image
for key in bands.keys():
im_fn[key] = im_date + '_' + satname + '_' + inputs['sitename'] + '_' + key + suffix
im_fn[key] = im_date + '_' + satname + '_' + tilename + '_' + inputs['sitename'] + '_' + key + suffix
# if multiple images taken at the same date add 'dupX' to the name (duplicate number X)
duplicate_counter = 0
while im_fn['ms'] in all_names:
@@ -304,7 +310,7 @@

# create filename for both images (ms and pan)
for key in bands.keys():
im_fn[key] = im_date + '_' + satname + '_' + inputs['sitename'] + '_' + key + suffix
im_fn[key] = im_date + '_' + satname + '_' + tilename + '_' + inputs['sitename'] + '_' + key + suffix
# if multiple images taken at the same date add 'dupX' to the name (duplicate number X)
duplicate_counter = 0
while im_fn['ms'] in all_names:
@@ -375,8 +381,7 @@

# create filename for the three images (ms, swir and mask)
for key in bands.keys():
im_fn[key] = im_date + '_' + satname + '_' \
+ inputs['sitename'] + '_' + key + suffix
im_fn[key] = im_date + '_' + satname + '_' + tilename + '_' + inputs['sitename'] + '_' + key + suffix
# if multiple images taken at the same date add 'dupX' to the name (duplicate)
duplicate_counter = 0
while im_fn['ms'] in all_names:
@@ -410,7 +415,7 @@
width, height = SDS_tools.get_image_dimensions(image_path)
# write metadata in a text file for easy access
filename_txt = im_fn['ms'].replace('_ms','').replace('.tif','')
metadict = {'filename':filename_ms,'epsg':im_epsg,
metadict = {'filename':filename_ms,'tile':tilename,'epsg':im_epsg,
'acc_georef':acc_georef,'image_quality':rad_quality,
'im_width':width,'im_height':height}
with open(os.path.join(filepaths[0],filename_txt + '.txt'), 'w') as f:
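Illustrative sketch (all values invented) of the new filename pattern and the `'tile'` entry added to the metadata dict:

```python
# build the new filename: date_satname_tilename_sitename_key.tif
im_date, satname, tilename = '2020-05-10-10-30-15', 'S2', '56HLH'
sitename, key, suffix = 'NARRA', 'ms', '.tif'
im_fn = im_date + '_' + satname + '_' + tilename + '_' + sitename + '_' + key + suffix

# the metadata text file now stores the same tile under the 'tile' key
metadict = {'filename': im_fn, 'tile': tilename, 'epsg': 32756,
            'acc_georef': 'PASSED', 'image_quality': 'PASSED',
            'im_width': 500, 'im_height': 500}
print(im_fn)  # 2020-05-10-10-30-15_S2_56HLH_NARRA_ms.tif
```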
@@ -471,7 +476,7 @@ def get_metadata(inputs):
# if a folder has been created for the given satellite mission
if satname in os.listdir(filepath):
# update the metadata dict
metadata[satname] = {'filenames':[],'dates':[],'epsg':[],'acc_georef':[],
metadata[satname] = {'filenames':[],'dates':[],'tilename':[],'epsg':[],'acc_georef':[],
'im_quality':[],'im_dimensions':[]}
# directory where the metadata .txt files are stored
filepath_meta = os.path.join(filepath, satname, 'meta')
@@ -483,6 +488,7 @@
# read them and extract the metadata info
with open(os.path.join(filepath_meta, im_meta), 'r') as f:
filename = f.readline().split('\t')[1].replace('\n','')
tilename = f.readline().split('\t')[1].replace('\n','')
epsg = int(f.readline().split('\t')[1].replace('\n',''))
acc_georef = f.readline().split('\t')[1].replace('\n','')
im_quality = f.readline().split('\t')[1].replace('\n','')
@@ -500,6 +506,7 @@
# store the information in the metadata dict
metadata[satname]['filenames'].append(filename)
metadata[satname]['dates'].append(date)
metadata[satname]['tilename'].append(tilename)
metadata[satname]['epsg'].append(epsg)
metadata[satname]['acc_georef'].append(acc_georef)
metadata[satname]['im_quality'].append(im_quality)
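Since values are read positionally (one `readline` per field), the parsing can be sketched like this (sample values invented):

```python
import io

# metadata .txt files are parsed line by line, in a fixed order:
# each line is 'key<TAB>value\n'
meta_txt = ('filename\t2020-05-10-10-30-15_S2_56HLH_NARRA_ms.tif\n'
            'tile\t56HLH\n'
            'epsg\t32756\n')
f = io.StringIO(meta_txt)
filename = f.readline().split('\t')[1].replace('\n', '')
tilename = f.readline().split('\t')[1].replace('\n', '')
epsg = int(f.readline().split('\t')[1].replace('\n', ''))
print(tilename, epsg)  # 56HLH 32756
```

Note that inserting the tile line makes the format order-sensitive: metadata files written before this change have no tile line, so reading them with the new code would shift every subsequent field by one.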
@@ -596,13 +603,22 @@ def check_images_available(inputs):
# remove from download list the images that are already existing
if satname in metadata_existing:
if len(metadata_existing[satname]['dates']) > 0:
first_date = metadata_existing[satname]['dates'][0] - timedelta(days=1)
last_date = metadata_existing[satname]['dates'][-1] + timedelta(days=1)
date_list = [datetime.fromtimestamp(_['properties']['system:time_start']/1000, tz=pytz.utc) for _ in im_dict_T1[satname]]
idx_new = np.where([np.logical_or(_< first_date, _ > last_date) for _ in date_list])[0]
# only keep images corresponding to dates that are not already existing
im_dict_T1[satname] = [im_dict_T1[satname][_] for _ in idx_new]
print('%s: %d images already exist, %s to download'%(satname, len(date_list)-len(idx_new), len(idx_new)))
# get all the available dates for the requested imagery
avail_date_list = [datetime.fromtimestamp(image['properties']['system:time_start']/1000, tz=pytz.utc).replace(microsecond=0)
                   for image in im_dict_T1[satname]]
# if no images are available, skip this satellite
if len(avail_date_list) == 0:
    print(f'{satname}: There are {len(avail_date_list)} images available, {len(metadata_existing[satname]["dates"])} images already exist, {len(avail_date_list)} to download')
    continue
# get the dates of the images that are already downloaded
downloaded_dates = metadata_existing[satname]['dates']
# if no images have been downloaded yet, keep everything already in im_dict_T1[satname]
if len(downloaded_dates) == 0:
    print(f'{satname}: There are {len(avail_date_list)} images available, {len(downloaded_dates)} images already exist, {len(avail_date_list)} to download')
    continue
# get the indices of the images that have not been downloaded yet
idx_new = np.where([avail_date not in downloaded_dates for avail_date in avail_date_list])[0]
im_dict_T1[satname] = [im_dict_T1[satname][index] for index in idx_new]
print('%s: %d images already exist, %d to download' % (satname, len(avail_date_list)-len(idx_new), len(idx_new)))
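The membership test replaces the old first/last-date range check, so dates missing from the middle of an already-downloaded period are now re-queued. A toy sketch of the filtering (a plain comprehension here in place of `np.where`, with invented dates):

```python
from datetime import datetime, timezone

avail_date_list = [datetime(2024, 1, d, tzinfo=timezone.utc) for d in (5, 10, 15)]
downloaded_dates = [datetime(2024, 1, 5, tzinfo=timezone.utc),
                    datetime(2024, 1, 15, tzinfo=timezone.utc)]
# keep only timestamps not already downloaded; Jan 10 falls inside the old
# first/last range and would have been skipped by the previous logic
idx_new = [i for i, d in enumerate(avail_date_list) if d not in downloaded_dates]
print(idx_new)  # [1]
```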

# if only S2 is in sat_list, stop here as no Tier 2 for Sentinel
if len(inputs['sat_list']) == 1 and inputs['sat_list'][0] == 'S2':
18 changes: 12 additions & 6 deletions coastsat/SDS_preprocess.py
@@ -402,11 +402,14 @@ def is_set(x, n):
if sum(sum(cloud_mask)) > 0 and sum(sum(~cloud_mask)) > 0:
cloud_mask = morphology.remove_small_objects(cloud_mask, min_size=40, connectivity=1)

if cloud_mask_issue:
    cloud_mask = np.zeros_like(im_QA, dtype=bool)
    for value in cloud_values:
        cloud_mask_temp = np.isin(im_QA, value)
        elem = morphology.square(6)  # use a square of width 6 pixels
        cloud_mask_temp = morphology.binary_opening(cloud_mask_temp, elem)  # perform image opening
        # remove objects with less than min_size connected pixels
        cloud_mask_temp = morphology.remove_small_objects(cloud_mask_temp, min_size=100, connectivity=1)
        cloud_mask = np.logical_or(cloud_mask, cloud_mask_temp)

return cloud_mask
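A standalone sketch of the per-QA-value opening strategy, using `scipy.ndimage` in place of `skimage.morphology` and an invented QA value, to show why the opening removes speckle but keeps real cloud patches:

```python
import numpy as np
from scipy import ndimage

im_QA = np.zeros((20, 20), dtype=np.int32)
im_QA[2:10, 2:10] = 22280   # a large cloud patch (made-up QA value)
im_QA[15, 15] = 22280       # an isolated noisy pixel

cloud_mask = np.zeros_like(im_QA, dtype=bool)
for value in [22280]:
    mask_temp = np.isin(im_QA, value)
    # binary opening (erosion then dilation) removes detections thinner
    # than the structuring element, i.e. the isolated pixel
    mask_temp = ndimage.binary_opening(mask_temp, structure=np.ones((3, 3)))
    cloud_mask = np.logical_or(cloud_mask, mask_temp)

print(cloud_mask[5, 5], cloud_mask[15, 15])  # True False
```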

@@ -820,8 +823,10 @@ def get_reference_sl(metadata, settings):
# create figure
fig, ax = plt.subplots(1,1, figsize=[18,9], tight_layout=True)
mng = plt.get_current_fig_manager()
# mng.window.showMaximized() raises AttributeError with the Tk backend:
# '_tkinter.tkapp' object has no attribute 'showMaximized'
# maximize the window through tkinter instead ('-zoomed' works on Linux;
# on Windows the equivalent is mng.window.state('zoomed'))
mng.window.wm_attributes('-zoomed', True)
# loop through the images
for i in range(len(filenames)):

# read image
@@ -1008,3 +1013,4 @@ def press(event):
'download more images and try again')

return pts_coords

16 changes: 13 additions & 3 deletions coastsat/SDS_shoreline.py
@@ -89,7 +89,15 @@ def extract_shorelines(metadata, settings):
sitename = settings['inputs']['sitename']
filepath_data = settings['inputs']['filepath']
collection = settings['inputs']['landsat_collection']
filepath_models = os.path.join(os.getcwd(), 'classification', 'models')
# Get the current directory of the script
current_dir = os.path.abspath(os.path.dirname(__file__))

# Navigate to the 'models' directory
filepath_models = os.path.abspath(os.path.join(current_dir, '..', 'classification', 'models'))

# Verify the path to ensure it is correct
print(f"The model file path is: {filepath_models}")
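The switch from `os.getcwd()` to a path anchored at `__file__` can be sketched as a small helper (paths are illustrative, not from the repo):

```python
import os

def models_dir(module_file):
    """Resolve ../classification/models relative to a module file,
    independent of the current working directory."""
    current_dir = os.path.abspath(os.path.dirname(module_file))
    return os.path.abspath(os.path.join(current_dir, '..', 'classification', 'models'))

print(models_dir('/repo/coastsat/SDS_shoreline.py'))  # /repo/classification/models
```

This makes `extract_shorelines` find the classifier models even when the caller launched Python from outside the repository root.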

# initialise output structure
output = dict([])
# create a subfolder to store the .jpg images showing the detection
@@ -805,8 +813,10 @@ def show_detection(im_ms, cloud_mask, im_labels, shoreline,image_epsg, georef,
fig = plt.figure()
fig.set_size_inches([18, 9])
mng = plt.get_current_fig_manager()
# mng.window.showMaximized() raises AttributeError with the Tk backend:
# '_tkinter.tkapp' object has no attribute 'showMaximized'
# maximize the window through tkinter instead ('-zoomed' works on Linux;
# on Windows the equivalent is mng.window.state('zoomed'))
mng.window.wm_attributes('-zoomed', True)
# according to the image shape, decide whether it is better to have the images
# in vertical subplots or horizontal subplots
if im_RGB.shape[1] > 2.5*im_RGB.shape[0]:
25 changes: 18 additions & 7 deletions coastsat/SDS_tools.py
@@ -653,24 +653,35 @@ def get_closest_datapoint(dates, dates_ts, values_ts):
values corresponding to the input dates

"""
# Check that the time-series covers the range of the input dates
if dates[0] < dates_ts[0] or dates[-1] > dates_ts[-1]:
    raise Exception('Time-series do not cover the range of your input dates')

# Get the closest point to each date (no interpolation)
temp = []

for i, date in enumerate(dates):
    print('\rExtracting closest points: %d%%' % int((i+1)*100/len(dates)), end='')
    # Find the first item in dates_ts that falls on or after the current date
    try:
        closest_date = min(item for item in dates_ts if item >= date)
        temp.append(values_ts[dates_ts.index(closest_date)])
    except ValueError:
        # min() raises ValueError on an empty generator, i.e. no date in
        # dates_ts is on or after the current date
        raise ValueError(f"No date in time series is greater than or equal to {date}")

values = np.array(temp)

return values
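For each query date the function returns the value at the first time-series sample on or after that date. A toy example:

```python
from datetime import datetime, timedelta

# time series sampled every 10 days: Jan 1, 11, 21, 31, Feb 10
dates_ts = [datetime(2023, 1, 1) + timedelta(days=10*k) for k in range(5)]
values_ts = [0.0, 1.0, 2.0, 3.0, 4.0]
date = datetime(2023, 1, 15)

# first timestamp on or after the query date: 2023-01-21 -> index 2
closest_date = min(item for item in dates_ts if item >= date)
value = values_ts[dates_ts.index(closest_date)]
print(value)  # 2.0
```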


###################################################################################################
# GEODATAFRAMES AND READ/WRITE GEOJSON
###################################################################################################
2 changes: 1 addition & 1 deletion coastsat/SDS_transects.py
@@ -341,7 +341,7 @@ def compute_intersection_QC(output, transects, settings):
med_intersect[i] = np.nanmedian(xy_rot[0,:])
max_intersect[i] = np.nanmax(xy_rot[0,:])
min_intersect[i] = np.nanmin(xy_rot[0,:])
n_intersect[i] = len(xy_rot[0,:])
n_intersect[i] = np.sum(~np.isnan(xy_rot[0, :])) # count only non-nan values

# quality control the intersections using dispersion metrics (std and range)
condition1 = std_intersect <= settings['max_std']
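The corrected count ignores NaN entries, which the raw `len()` call included. A minimal sketch (values invented):

```python
import numpy as np

# cross-shore distances of shoreline points intersecting one transect;
# NaNs mark points discarded by earlier quality checks
xy_rot = np.array([[10.2, np.nan, 11.5, 9.8]])
n_intersect = np.sum(~np.isnan(xy_rot[0, :]))  # count only valid intersections
print(n_intersect)  # 3
```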