To reduce the amount of work done in-request, reddit defers many tasks to asynchronous message queues. These jobs are defined by the reddit-consumer- jobs in the upstart directory. The sections that follow explain each queue consumer job.
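The basic pattern can be sketched as follows. This is a minimal stand-in, not reddit's actual implementation: an in-memory `queue.Queue` takes the place of the real message broker, and the `handle_request`, `consumer_loop`, and message field names are hypothetical.

```python
import queue
import threading

# Hypothetical in-memory stand-in for the real message broker;
# in production, each reddit-consumer-* job runs a loop like consumer_loop.
vote_q = queue.Queue()

def handle_request(link_id, direction):
    """In-request work stays small: just enqueue a message for later."""
    vote_q.put({"link": link_id, "dir": direction})

def consumer_loop(process, q, stop):
    """Generic consumer: pull messages off the queue and process them."""
    while not stop.is_set() or not q.empty():
        try:
            msg = q.get(timeout=0.1)
        except queue.Empty:
            continue
        process(msg)
        q.task_done()

processed = []
stop = threading.Event()
worker = threading.Thread(target=consumer_loop,
                          args=(processed.append, vote_q, stop))
worker.start()
handle_request("t3_abc", 1)
vote_q.join()   # wait until the consumer has handled the message
stop.set()
worker.join()
```

The key property is that the request handler returns as soon as the message is enqueued; the expensive processing happens in a separate consumer process.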
Voting on a link or comment inserts an item onto one of these queues. The queue processors receive a message for each vote and update scores, karma, and cached listings accordingly.
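In miniature, a vote consumer might look like the sketch below. The message fields and the dict-backed storage are illustrative assumptions (the real consumer also updates cached listings and de-duplicates repeated votes, which this omits).

```python
# Hypothetical storage: net score per thing, karma per thing's author.
scores = {}
karma = {}

def process_vote(msg):
    """Apply one vote message: adjust the voted thing's score and
    credit (or debit) the karma of the thing's author."""
    scores[msg["thing"]] = scores.get(msg["thing"], 0) + msg["dir"]
    karma[msg["author"]] = karma.get(msg["author"], 0) + msg["dir"]

votes = [
    {"thing": "t3_1", "author": "alice", "dir": 1},
    {"thing": "t3_1", "author": "alice", "dir": 1},
    {"thing": "t1_2", "author": "bob", "dir": -1},
]
for vote in votes:
    process_vote(vote)
```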
When a link is submitted, it is added to the scraper_q for deferred processing. This queue processor works through submitted links, scraping each URL for media embed information and thumbnails.
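A simplified sketch of the scraping step: pull Open Graph hints (thumbnail, embed metadata) out of a page. The real consumer fetches the submitted URL first; to keep this self-contained it parses a canned document, and the `MediaScraper` class and chosen fields are assumptions rather than reddit's actual scraper.

```python
from html.parser import HTMLParser

class MediaScraper(HTMLParser):
    """Collect Open Graph thumbnail and embed hints from an HTML page."""
    def __init__(self):
        super().__init__()
        self.thumbnail = None
        self.embed = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop, content = attrs.get("property"), attrs.get("content")
        if prop == "og:image":
            self.thumbnail = content
        elif prop in ("og:title", "og:type"):
            self.embed[prop] = content

page = """<html><head>
<meta property="og:title" content="A cat video">
<meta property="og:type" content="video">
<meta property="og:image" content="http://example.com/thumb.jpg">
</head></html>"""

scraper = MediaScraper()
scraper.feed(page)
```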
After a comment is created, this queue processor does the work of updating the cached comment tree data structures.
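The cached tree update amounts to splicing the new comment into a parent-to-children mapping, roughly as below. The flat-dict representation is a deliberate simplification of whatever structure the real consumer maintains.

```python
# Hypothetical cached comment tree: parent id -> ordered child comment ids.
comment_tree = {}

def add_comment(tree, comment_id, parent_id):
    """What the comment-tree consumer does, in miniature: append the new
    comment under its parent in the cached mapping."""
    tree.setdefault(parent_id, []).append(comment_id)

add_comment(comment_tree, "c1", "link")
add_comment(comment_tree, "c2", "link")
add_comment(comment_tree, "c3", "c1")   # a reply to c1
```

Doing this in a consumer rather than in-request means a burst of replies to one popular link serializes through one process instead of contending across app servers.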
This processor inserts items onto the /comments listings. It exists because the lock contention of prepending to the list from every app server was too high.
When new links, comments, or subreddits are created, or when existing ones are edited, the Amazon CloudSearch index needs to be updated for search to work. This queue processor handles batching up and sending updates to Amazon for processing.
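The batching step might look like the sketch below: collect per-thing changes into CloudSearch-style "add" operations and flush them as one payload. The function names and field layout are illustrative, and the actual HTTP upload to CloudSearch is omitted; the 5 MB figure reflects CloudSearch's documented per-batch upload limit, but treat it as an assumption here.

```python
import json

def make_batch(changes):
    """Turn (thing_id, fields) pairs into CloudSearch-style 'add' operations.
    Field names are illustrative, not reddit's real search schema."""
    return [{"type": "add", "id": thing_id, "fields": fields}
            for thing_id, fields in changes]

def flush(batch, max_bytes=5 * 1024 * 1024):
    """Serialize a batch for upload; a real consumer would send and clear
    once the payload nears the size limit (upload call omitted here)."""
    payload = json.dumps(batch).encode()
    assert len(payload) <= max_bytes, "batch too large; split before sending"
    return payload

batch = make_batch([("t3_1", {"title": "hello"}),
                    ("t1_2", {"body": "world"})])
payload = flush(batch)
```

Batching matters because each upload carries fixed overhead; sending one document per edit would swamp the indexing endpoint.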