# Progressive Hack Night

Website for Progressive Hack Night NYC.

Forked from [Chi Hack Night](https://chihacknight.org/).

RSVP

## Running locally

This website is built using [Jekyll](https://jekyllrb.com/). You will need to install it first.

```bash
git clone https://github.com/ProgressiveHackNight/proghacknight.org.git
cd proghacknight.org
jekyll serve -w
```

Then open your web browser and navigate to http://localhost:4000.
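Since the repository ships a `Gemfile` and `Gemfile.lock`, you can also run Jekyll through Bundler so the gem versions match the lockfile. A minimal sketch of that workflow (assuming Ruby and RubyGems are already installed):

```
$ gem install bundler
$ bundle install
$ bundle exec jekyll serve -w
```

`bundle exec` ensures the Jekyll version pinned in `Gemfile.lock` is the one that serves the site, which avoids build differences between machines.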

## Dependencies

* [Jekyll](https://jekyllrb.com/) - static site generator built in Ruby
* [Bootstrap 3](https://getbootstrap.com/docs/3.4/) - HTML and CSS layouts
* [DataTables](https://datatables.net/) - for searching and sorting tables
* [Mustache](https://github.com/janl/mustache.js) - JavaScript templating library (used on the projects page)
* [jQuery Address](https://github.com/asual/jquery-address) - for deep linking URLs on the projects page

## To Deploy

Deployment uses [s3_website](https://github.com/laurilehmijoki/s3_website).

1. Generate `s3_website.yml`:

   ```
   $ s3_website cfg create
   ```

2. Fill out `s3_website.yml` with your S3 credentials:

   ```yaml
   s3_id: <%= ENV['S3_ID'] %>
   s3_secret: <%= ENV['S3_SECRET'] %>
   s3_bucket: blog.example.com
   ```

3. Run `s3_website cfg apply` to configure your bucket to function as an S3 website:

   ```
   $ s3_website cfg apply
   ```

4. Push the site to S3:

   ```
   $ s3_website push
   ```

At any later time, when you would like to synchronise your local website with the S3 website, simply run `s3_website push` again. It will calculate the difference, upload the new and changed files, and delete the obsolete ones.
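Beyond credentials, `s3_website.yml` accepts a few options that are useful for a static site. A hedged sketch (the bucket name and values here are illustrative, not this site's actual configuration):

```yaml
s3_id: <%= ENV['S3_ID'] %>
s3_secret: <%= ENV['S3_SECRET'] %>
s3_bucket: blog.example.com

# Upload gzip-compressed copies of compressible assets (HTML, CSS, JS)
gzip: true

# Cache-Control max-age, in seconds, for uploaded files
max_age: 300
```

s3_website detects a Jekyll project automatically and pushes the generated `_site` directory, so the build output path normally does not need to be configured.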