Deploy blog to new server. (#89)
* Initial static website

* Add deploy steps for website

* Oap website update (#80)

* Initial static website

* Add deploy steps for website

* Add deploy action

* reference dockerfile in correct dir

* reference correct branch in deploy

* Fix branch on trigger names

* Add initial blog (#87)

* Update source of blog Dockerfile

* Differentiate blog and website actions

* Fix missing build stage for blog

* Update dockerfile so that submodule blog theme can be installed

* Change workdir for hugo build step

* Fix theme case

* Change blog base url to null to see if it fixes image loading issue

* Folder placeholders

* Change 2023 update to draft
scottcha committed Nov 27, 2023
1 parent b427146 commit 560de16
Showing 33 changed files with 514 additions and 1 deletion.
2 changes: 1 addition & 1 deletion .github/workflows/deploy.yml
@@ -1,4 +1,4 @@
-name: Deploy to Ubuntu Server
+name: Deploy Website to Ubuntu Server
 
 on:
   push:
49 changes: 49 additions & 0 deletions .github/workflows/deploy_blog.yml
@@ -0,0 +1,49 @@
name: Deploy Blog to Ubuntu Server

on:
  push:
    branches:
      - master
      - ppe

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Login to GitHub Container Registry
        uses: docker/login-action@v1
        with:
          registry: ghcr.io
          username: ${{ github.repository_owner }}
          password: ${{ secrets.GHCR_TOKEN_OAP }}
      - name: Build and push Docker image
        uses: docker/build-push-action@v2
        with:
          context: .
          push: true
          tags: ghcr.io/${{ github.repository_owner }}/oap_blog:latest
          file: Blog/Dockerfile
      - name: SSH into server and deploy
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SERVER_HOST }}
          username: ${{ secrets.SERVER_USERNAME }}
          password: ${{ secrets.SERVER_PASSWORD }}
          port: ${{ secrets.SSH_PORT }}
          script: |
            echo "${{ secrets.GHCR_TOKEN }}" | docker login ghcr.io -u ${{ github.repository_owner }} --password-stdin
            if [ "${{ github.ref }}" = 'refs/heads/master' ]
            then
              docker stop oap_blog_prod || true
              docker rm oap_blog_prod || true
              docker pull ghcr.io/${{ github.repository_owner }}/oap_blog:latest
              docker run -d --name oap_blog_prod -p 83:80 ghcr.io/${{ github.repository_owner }}/oap_blog:latest
            elif [ "${{ github.ref }}" = 'refs/heads/ppe' ]
            then
              docker stop oap_blog_ppe || true
              docker rm oap_blog_ppe || true
              docker pull ghcr.io/${{ github.repository_owner }}/oap_blog:latest
              docker run -d --name oap_blog_ppe -p 8083:80 ghcr.io/${{ github.repository_owner }}/oap_blog:latest
            fi
7 changes: 7 additions & 0 deletions .gitignore
@@ -322,3 +322,10 @@ TestData/5.MLData/
TestData/4.GFSFiltered1xInterpolationZarr/lat_lon_union.csv
study.pkl
DataPipelineNotebooks/.last_checked

# Hugo
/Blog/public/
/Blog/resources/_gen/
/Blog/hugo_stats.json
/Blog/themes/*/node_modules/
/Blog/themes/*/dist/
3 changes: 3 additions & 0 deletions .gitmodules
@@ -0,0 +1,3 @@
[submodule "themes/LoveIt"]
    path = Blog/themes/LoveIt
    url = https://github.com/dillonzq/LoveIt.git
Empty file added Blog/.hugo_build.lock
32 changes: 32 additions & 0 deletions Blog/Dockerfile
@@ -0,0 +1,32 @@
# Start from the Ubuntu image and name this stage as 'builder'
# Note: alpine linux isn't compatible with the hugo command (without hacks around glibc)
FROM ubuntu:latest AS builder

# Install Hugo and Git
RUN apt-get update && \
    apt-get install -y wget ca-certificates git && \
    wget https://github.com/gohugoio/hugo/releases/download/v0.120.4/hugo_extended_0.120.4_Linux-64bit.tar.gz && \
    tar xzf hugo_extended_0.120.4_Linux-64bit.tar.gz && \
    mv hugo /usr/local/bin/ && \
    rm hugo_extended_0.120.4_Linux-64bit.tar.gz

# Copy your Hugo site source into the Docker container
COPY . /src

# Set the working directory
WORKDIR /src

# Initialize Git and update the LoveIt theme submodule
RUN git init && \
    git submodule update --init --recursive

WORKDIR /src/Blog

# Build your Hugo site
RUN hugo --minify --config hugo.toml

# Use an Apache HTTP Server Docker image to serve your Hugo site
FROM httpd:2.4-alpine

# Copy the built Hugo site from the builder container to the Apache HTTP Server container
COPY --from=builder /src/Blog/public/ /usr/local/apache2/htdocs/
5 changes: 5 additions & 0 deletions Blog/archetypes/default.md
@@ -0,0 +1,5 @@
+++
title = '{{ replace .File.ContentBaseName "-" " " | title }}'
date = {{ .Date }}
draft = true
+++
181 changes: 181 additions & 0 deletions Blog/assets/css/_page/_home.scss
@@ -0,0 +1,181 @@
.home {
  .home-profile {
    @include transform(translateY(16vh));
    padding: 0 0 .5rem;
    text-align: center;

    .home-avatar {
      padding: .5rem;

      img {
        display: inline-block;
        max-width: 100%;
        height: auto;
        margin: 0 auto;
        @include transition(all 0.4s ease);

        &:hover {
          position: relative;
          @include transform(translateY(-.75rem));
        }
      }
    }

    .home-title {
      font-size: 1.25rem;
      font-weight: bold;
      margin: 0;
      padding: .5rem;
    }

    .home-subtitle {
      font-size: 1rem;
      font-weight: normal;
      margin: 0;
    }

    .links {
      padding: .5rem;
      font-size: 1.5rem;

      a * {
        vertical-align: text-bottom;
      }

      img {
        height: 1.5rem;
        padding: 0 .25rem;
      }
    }

    .home-disclaimer {
      font-size: 1rem;
      line-height: 1.5rem;
      font-weight: normal;
      margin: 0;
      padding: .5rem;
      color: $global-font-secondary-color;

      [theme=dark] & {
        color: $global-font-secondary-color-dark;
      }
    }
  }
}

.home[data-home=posts] {
  .home-profile {
    @include transform(translateY(0));
    padding-top: 2rem;
  }

  .home-avatar img {
    width: 32rem;
  }

  .summary {
    padding-top: 1rem;
    padding-bottom: .8rem;
    color: $global-font-color;
    border-bottom: 1px dashed $global-border-color;

    [theme=dark] & {
      color: $global-font-color-dark;
      border-bottom: 1px dashed $global-border-color-dark;
    }

    .featured-image-preview {
      width: 4rem;
      height: 4rem;
      margin-right: 1rem;
      @include transition(transform 0.4s ease);

      img {
        width: 100%;
        height: 100%;
        object-fit: cover;

        &.lazyloaded {
          @include object-fit(cover);
        }
      }

      &:hover {
        @include transform(scale(1.01));
      }
    }

    .title-and-preview {
      display: flex;
      flex-direction: row;
      align-items: center;
    }

    .single-title {
      font-size: 1.25rem;
      line-height: 140%;
      margin: 0.4rem 0;
    }

    .content {
      @include box(vertical);
      -webkit-line-clamp: 3;
      margin-top: .3rem;
      width: 100%;
      overflow: hidden;
      text-overflow: ellipsis;
      @include overflow-wrap(break-word);
      color: $global-font-secondary-color;

      [theme=dark] & {
        color: $global-font-secondary-color-dark;
      }

      h2,
      h3,
      h4,
      h5,
      h6,
      p {
        font-size: 1rem;
        line-height: 1.5;
        display: inline;

        &::after {
          content: "\A";
          white-space: pre;
        }
      }

      h2 {
        font-size: 1.125rem;
      }

      @include link(false, true);

      b, strong {
        color: $global-font-secondary-color;

        [theme=dark] & {
          color: $global-font-secondary-color-dark;
        }
      }
    }

    .post-footer {
      margin-top: .4rem;
      display: flex;
      justify-content: space-between;
      align-items: center;
      font-size: .875rem;

      @include link(false, false);

      .post-tags {
        padding: 0;

        @include link(true, true);
      }
    }
  }
}
[7 binary files added (apparently the PNG figures referenced by the blog post below) are not rendered in the diff view.]
72 changes: 72 additions & 0 deletions Blog/content/posts/20181201_18-19-season-preview/index.md
@@ -0,0 +1,72 @@
+++
title = '18-19 Season Preview'
date = 2018-12-01T11:48:48-07:00
draft = false
+++

## Summary

We have fixed several critical issues that prevented the model from generalizing and that hurt last season's accuracy. We are also publishing the accuracy goal a model must meet before experimental forecasts are published again, along with an analysis of where errors still affect the model and the plan to address them.

## Looking back on 17-18 Season

Before looking at the plans for the coming season, I want to provide an update on how we did and what we learned last season. I launched the Open Avalanche Project in early March of 2018, at the tail end of the last Northern Hemisphere avalanche season. Although everything was deployed and published as experimental, I want to report on how accurate those forecasts were. They performed very poorly. The full analysis is available here: [https://github.com/scottcha/OpenAvalancheProject/blob/develop/ML/ForecastAccuracyAnalysis17-18.ipynb](https://github.com/scottcha/OpenAvalancheProject/blob/develop/ML/ForecastAccuracyAnalysis17-18.ipynb), but the net is that, measured both regionally and at a single point per region, overall accuracy was less than 30%. Investigation turned up several critical errors that contributed to this:

1. The train/test split was not done across time, which led to model overfitting.
2. The training pipeline made different assumptions about date alignment than the prediction pipeline did.
3. I was incorrectly using 0 as a proxy for missing data even though 0 carries meaning in this model: a snow depth of 0, for example, is a real measurement, not missing data (see the sketch just below).
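
The post doesn't include code, but the third issue is easy to illustrate. A minimal pandas sketch, where the column names and the -9999 sentinel are hypothetical stand-ins for however the raw feed marks a missing reading:

```python
import numpy as np
import pandas as pd

# Hypothetical model inputs; -9999 stands in for "sensor gave no reading".
df = pd.DataFrame({
    "snow_depth_cm": [0.0, 42.0, -9999.0],
    "wind_speed_ms": [5.0, -9999.0, 12.0],
})

# Wrong: mapping missing readings to 0 conflates "no data" with a real
# measurement; 0 cm of snow depth is meaningful to an avalanche model.
# df = df.replace(-9999.0, 0.0)

# Better: mark missing values explicitly so downstream imputation can
# distinguish them from true zeros.
df = df.replace(-9999.0, np.nan)
print(df.isna().sum())
```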

All of these issues have been examined and will be rectified before other models are published.

That said, a lot was learned in getting the end-to-end pipeline built, and in the future it will be easier to spend less effort there and more on building the best forecasts possible.

## Looking forward to the 18-19 Season

Over the summer I made a few major updates to the pipeline, resolving the first two issues above (the third will be resolved once new models are published).

Over the summer I also invested in bringing in additional data from the Utah Avalanche Center, which greatly expanded the data available to train on and added regional coverage of a continental climate.

Now that the train/test split is done on a season boundary (training currently runs from the 13-14 season through the 16-17 season, and the 17-18 season is used as the test set), we have accuracy numbers that are better indicators of real-world model performance.
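
A season-boundary split is simple to express. Here is a minimal sketch, assuming a hypothetical `season` column and toy data rather than the project's actual schema:

```python
import pandas as pd

# Hypothetical table of daily features labeled with the winter they belong to.
data = pd.DataFrame({
    "season": ["16-17", "16-17", "17-18", "17-18"],
    "snow_depth_cm": [120.0, 135.0, 80.0, 95.0],
    "danger": ["Considerable", "High", "Moderate", "Considerable"],
})

train_seasons = ["13-14", "14-15", "15-16", "16-17"]
test_seasons = ["17-18"]

# Keeping whole seasons together puts every day of a winter (and its storm
# cycles) on one side of the split, so the test set cannot leak into training.
train = data[data["season"].isin(train_seasons)]
test = data[data["season"].isin(test_seasons)]
print(len(train), len(test))
```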

## When models will be live

The experience and learnings from last year influenced our position that we should move cautiously in publishing even experimental forecasts. We want to keep people from reading too much into the forecasts, and we want to build a broad basis of support for the methodology across the public and avalanche forecasting communities. We'll continue to make all of our work public, but **we will only publish experimental forecasts generated from models with a full-season accuracy > 75%**. That number seeks to strike a balance between being useful enough to move the conversation forward and keeping even this experimental work from being misinterpreted or taken out of context. While we are close to achieving 75% accuracy on a continental snowpack, we still have some work to get there (and the gap is wider for the coastal snowpack).

**Coastal Model Performance**

This basic Random Forest Classifier struggles to get the High forecasts correct a majority of the time and has an overall accuracy of only 57.3% for the 17-18 test season. I've attempted more complex models (XGBoost, LSTM DNN) and neither showed a significant improvement in accuracy.
![Coastal Confusion Matrix](coastal17-18Confusion.PNG)
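
The evaluation described here amounts to fitting a classifier and reading off its accuracy and confusion matrix. A self-contained scikit-learn sketch on synthetic stand-in data (the real features and danger labels are in the notebook linked above):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)
# Synthetic stand-ins: 8 weather-derived features, 4 danger classes.
X_train, y_train = rng.normal(size=(500, 8)), rng.integers(0, 4, size=500)
X_test, y_test = rng.normal(size=(100, 8)), rng.integers(0, 4, size=100)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)

print(f"accuracy: {accuracy_score(y_test, pred):.3f}")
print(confusion_matrix(y_test, pred))  # rows: true class, cols: predicted
```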

Breaking the forecasts down by month, there was no clear pattern of some parts of the winter being harder to predict than others.
![Coastal By Date](NWACByDate.PNG)

Breaking down by region did demonstrate that the Olympics contributed the most to the error.
![Coastal By Region](NWACByRegion.PNG)
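
These by-date and by-region breakdowns (and the by-problem one below) are grouped error rates. A minimal sketch over a hypothetical evaluation frame holding the model's predicted and the forecaster-issued danger ratings:

```python
import pandas as pd

# Hypothetical evaluation frame: one row per region-day.
results = pd.DataFrame({
    "date": pd.to_datetime(["2017-12-03", "2018-01-15", "2018-02-20"]),
    "region": ["Olympics", "Snoqualmie Pass", "Olympics"],
    "predicted": ["Moderate", "Considerable", "High"],
    "actual": ["Considerable", "Considerable", "High"],
})
results["error"] = results["predicted"] != results["actual"]

# Grouped error rates mirror the by-month and by-region charts.
print(results.groupby(results["date"].dt.month)["error"].mean())
print(results.groupby("region")["error"].mean())
```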

Across all avalanche problem types, the problems most associated with the model getting the Coastal forecast wrong are Wind Slab, followed by Loose Wet and then Persistent Slab and Cornices.

Errors when Wind Slab is a forecasted avalanche problem:
![Coastal By Wind Slab](NWACByProblem.PNG)

**Continental Model Performance**

The model has better skill here, performing more evenly across the forecast values than in the coastal region, with an overall accuracy of 72.1%.

![Continental Confusion Matrix](Continental17-18Confusion.PNG)

As in the coastal forecast, there was no clear error pattern across dates.

Breaking down by region demonstrates that the Uintas region contributes the most to the error.
![Continental by Region](UACByRegion.PNG)

As in the Coastal forecast, Wind Slab is the avalanche problem most correlated with model error, followed by Storm Slab and Persistent Slab.

Errors when Wind Slab is a forecasted avalanche problem:
![Continental by Wind Slab](UACByProblem.PNG)

## Next Steps

1. **Improve wind data:** The analysis shows a clear need to examine the wind values going into the forecasts and to determine whether Wind Slab forecasting can be improved.
2. **Get additional data:** Today the model uses only two forecast regions for training and evaluation. We are taking steps to get data from other avalanche centers to expand the data available for training.
3. **Explore other useful features/models:** While the models are designed to approximate some aspects of snowpack evolution, it's not clear how to connect them with existing state-of-the-art physics models of the snowpack. We need to investigate this, as well as other possible modeling goals (such as predicting avalanche problems).
4. **Continue conversations with the community:** As we learn more about what is possible in automatically generating avalanche forecasts, we want to continue the conversation about the appropriate and innovative uses which may be possible.
