Download Hacker News (HN) stories and comments using their official APIs
HackerNewsDownloader

Introduction

This is a small program to download Hacker News (HN) stories and comments using the official public API. The code uses the REST API to retrieve items and paginate backward through time, and it respects the API rate limit of 10,000 requests per hour.
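For reference, the 10,000 requests/hour limit works out to a minimum spacing of 360 ms between calls, so a multi-second pause per request stays comfortably inside it. A minimal sketch of that arithmetic (this helper is illustrative, not part of the program):

```csharp
using System;

static class RateLimitSketch
{
    // Minimum spacing between requests needed to stay under a per-hour limit.
    public static TimeSpan MinDelay(int requestsPerHour) =>
        TimeSpan.FromMilliseconds(3_600_000.0 / requestsPerHour);
}

// RateLimitSketch.MinDelay(10_000) is 360 ms; sleeping 4,000 ms between
// requests, as the downloader does, uses only a fraction of the quota.
```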

Hacker News: https://news.ycombinator.com/
Hacker News public API: https://hn.algolia.com/api

Points of Interest

The output is a JSON file containing an array of objects. Each object is the response returned by one API call; within it, the hits property contains the actual list of items.
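Schematically, each element of the output array looks something like the following (an illustrative sketch; the exact set of fields is whatever Algolia returns, not this hand-picked subset):

```json
[
  {
    "hits": [
      {
        "title": "Example story",
        "author": "someuser",
        "created_at_i": 1401408000,
        "objectID": "7812345"
      }
    ],
    "nbHits": 1,
    "hitsPerPage": 1000
  }
]
```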

As the output files are very large, the code uses JSON.NET in streaming mode to read and write them. It uses RestSharp for making the API calls.
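As a sketch of the streaming approach (the file path and method name here are hypothetical; the repo's JsonNetUtils.cs holds the actual helpers), JSON.NET's JsonTextReader lets you walk a huge JSON array one object at a time instead of loading the whole file:

```csharp
using System;
using System.IO;
using System.Linq;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

static class StreamingReadSketch
{
    // Reads one response object at a time from a large JSON array file.
    public static void ReadPages(string path)
    {
        using (var reader = new JsonTextReader(new StreamReader(path)))
        {
            while (reader.Read())
            {
                if (reader.TokenType == JsonToken.StartObject)
                {
                    // JObject.Load consumes exactly one object from the stream,
                    // so memory use stays bounded by one page, not the whole file.
                    var page = JObject.Load(reader);
                    var hits = page["hits"];
                    Console.WriteLine("Page with {0} hits", hits?.Count() ?? 0);
                }
            }
        }
    }
}
```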

Data Files for Download

The output generated by this program as of May 29, 2014 is available at https://github.com/sytelus/HackerNewsData. It contains all of the stories and comments posted on HN from 2006 through May 29, 2014. See that repo for more information and download links.

The Code

While Algolia's HN API is simple, the documentation is minimal and omits a few pieces, such as the hitsPerPage query parameter and the best way to paginate through results so you can download all of the posts and comments. Here's the simple C# function I wrote using JSON.NET and RestSharp to download the HN data:

private static IEnumerable<JObject> GetHnItems(string itemType)
{
    const string baseUrl = @"https://hn.algolia.com/api/v1/search_by_date?tags={2}&hitsPerPage={0}&numericFilters=created_at_i<{1}";
    var restClient = new RestClient();
    var offset = DateTime.UtcNow.ToUnixTime();
    var limit = 1000;

    var hitCount = 0;
    do
    {
        var request = new RestRequest(baseUrl.FormatEx(limit, offset, itemType), Method.GET);

        var response = restClient.Execute(request);
        if (response.StatusCode == HttpStatusCode.OK)
        {
            var responseJson = JObject.Parse(response.Content);
            var hits = responseJson["hits"];
            hitCount = hits.Count();

            yield return responseJson;

            if (hitCount > 0)
            {
                offset = hits.Min(h => h["created_at_i"].Value<long>());
                Thread.Sleep(4000);
            }
        }
        else
            throw new Exception("Received Response {0}-{1}, Body {2} for request {3}, Error: {4}".FormatEx(response.StatusCode, response.StatusDescription, response.Content, response.ResponseUri, response.ErrorMessage));
    } while (hitCount > 0);
}
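The function yields one response page at a time, each page's minimum created_at_i becoming the offset for the next request, so a caller can stream results straight to disk. A hedged sketch of such a consumer (the output file name and wiring are illustrative, not the repo's exact code):

```csharp
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

// Illustrative consumer: stream each response page into one large JSON array
// without buffering all pages in memory.
using (var writer = new JsonTextWriter(new StreamWriter("stories.json")))
{
    writer.WriteStartArray();
    foreach (var page in GetHnItems("story"))   // tag can be "story" or "comment"
        page.WriteTo(writer);                   // append this page's object to the array
    writer.WriteEndArray();
}
```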

More Info

See blog entry http://shitalshah.com/p/downloading-all-of-hacker-news-posts-and-comments/

License

MIT License as described in LICENSE.txt.