Read today's world news in less than 10kB


# WorldNews-10K


This project was created for the 10kB challenge by 10k Apart. Its main goal is to present today's world news in less than 10kB. To do this, we fetch JSON from the /r/WorldNews subreddit via Reddit's API and parse out titles, article URLs, comment counts, and comment URLs. The article URLs are fed to a scraper, which downloads article images and pulls additional article descriptions from Open Graph metadata. The images are compressed, and preview versions are additionally scaled down and base64-encoded. These previews are displayed on the initial load of the application with a blur filter on top. After the page has completely loaded with these minimal resources, it lazy-loads the high-resolution versions of the images and two web fonts.
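The fetch-and-parse step might look roughly like the sketch below. This is an illustration only, not the project's actual scraper code; the `parseListing` helper is hypothetical, and the field names (`title`, `url`, `num_comments`, `permalink`) follow Reddit's public listing API.

```javascript
// Extract the fields the app needs from a Reddit listing response
// (the JSON returned by e.g. https://www.reddit.com/r/worldnews/top.json).
function parseListing(listing) {
  return listing.data.children.map(({ data }) => ({
    title: data.title,
    articleUrl: data.url,
    commentCount: data.num_comments,
    // permalink is relative, so prefix Reddit's host to get the comments URL.
    commentsUrl: `https://www.reddit.com${data.permalink}`,
  }));
}
```

Each resulting entry carries everything needed to render one headline: the title, the link to the article, and the comment count with its discussion link.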

## Stack

The application uses Express.js with Pug as the template engine. Image processing is done with Jimp, together with a few small helper libraries that convert image streams from one format to another. Open Graph metadata is scraped with Metascraper, and node-fetch fetches JSON and images over HTTP. The stylesheet is written in indented Sass and compiled at server runtime by node-sass-middleware.

## How to build the project

You need the following dependencies to build this project locally:

  1. Git - To clone the project.
  2. Node 6.0.0 or greater - The project is written in ECMAScript 2015, which is only supported by Node 6 or later.
  3. node-gyp - node-sass-middleware requires node-gyp to be installed globally to build correctly.
  4. Forever - To start the scraper and server as daemons.

Once you have these, install the required node modules with `npm install`. You can then start the application in production mode with `npm start` and stop it with `npm stop`; these scripts launch and shut down Forever daemons for both the scraper job and the server. To debug the application, use `npm run debug:job` and `npm run debug:server`. To run the tests, install Mocha globally and run `npm test`.

## Note on server performance

The original online version of this project uses NGINX to reverse proxy the Express application and to serve everything except the index, which is dynamic. This offloads static file serving from Express.
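That setup might be sketched as the NGINX server block below. This is an illustrative fragment under assumed paths and port; the real configuration ships in the repo's `nginx.conf` and may differ.

```nginx
server {
    listen 80;

    # Static assets (compressed images, CSS, fonts) are served
    # directly by NGINX, bypassing Express entirely.
    location /static/ {
        root /var/www/worldnews-10k;  # assumed deploy path
        expires 7d;
    }

    # Only the dynamic index is proxied through to the Express app.
    location / {
        proxy_pass http://127.0.0.1:3000;  # assumed Express port
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```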