
Add rate limiting to Reddit integration#1015

Merged
garrrikkotua merged 5 commits into main from improvement/adapt-reddit
Jun 30, 2023

Conversation

@garrrikkotua
Contributor

@garrrikkotua garrrikkotua commented Jun 22, 2023

Changes proposed ✍️

What

🤖 Generated by Copilot at 5ba51d7

This pull request adds rate limiting functionality to the Reddit API integration using a Redis cache and a rate limiter class. It modifies the services/libs/integrations and services/apps/integration_stream_worker packages to use the @crowd/redis package as a dependency and pass the global cache instance to the Reddit API functions. It also adds the increment method to the ICache and RedisCache interfaces and classes, and the RateLimiter and handleRateLimit modules to the services/libs/redis and services/libs/integrations packages respectively.
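A minimal sketch of the flow described above. The class and method names (`RateLimiter`, `checkRateLimit`, `incrementRateLimit`, and the `increment` cache method) follow the PR description, but an in-memory `Map` stands in for the Redis-backed `RedisCache` from `@crowd/redis`, and the exact window/limit semantics are assumptions, not the actual implementation:

```typescript
// Sketch only: in-memory stand-in for the Redis cache. The real cache
// lives in @crowd/redis; `increment` bumps a counter and applies a TTL,
// mirroring the method this PR adds to ICache/RedisCache.
class InMemoryCache {
  private store = new Map<string, { value: number; expiresAt: number }>()

  async increment(key: string, by: number, ttlSeconds: number): Promise<number> {
    const now = Date.now()
    const entry = this.store.get(key)
    if (!entry || entry.expiresAt <= now) {
      this.store.set(key, { value: by, expiresAt: now + ttlSeconds * 1000 })
      return by
    }
    entry.value += by
    return entry.value
  }

  async get(key: string): Promise<number> {
    const entry = this.store.get(key)
    if (!entry || entry.expiresAt <= Date.now()) return 0
    return entry.value
  }
}

// Hypothetical shape of the RateLimiter added in services/libs/redis/src:
// a counter per key, capped at maxRequests per interval.
class RateLimiter {
  constructor(
    private cache: InMemoryCache,
    private maxRequests: number,
    private intervalSeconds: number,
    private key: string,
  ) {}

  // Rejects when the current window's counter has reached the limit.
  async checkRateLimit(endpoint: string): Promise<void> {
    const used = await this.cache.get(this.key)
    if (used >= this.maxRequests) {
      throw new Error(`Rate limit exceeded for ${endpoint}`)
    }
  }

  // Bumps the counter; the TTL bounds the rate-limit window.
  async incrementRateLimit(): Promise<void> {
    await this.cache.increment(this.key, 1, this.intervalSeconds)
  }
}
```

Keeping the counter in Redis (rather than in-process, as in this sketch) is what lets multiple integration workers share one Reddit rate-limit budget.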

🤖 Generated by Copilot at 5ba51d7

To fetch data from Reddit with ease
We must handle the rate limit keys
We use RateLimiter
And globalCache with it
To avoid errors from the API's

Why

How

🤖 Generated by Copilot at 5ba51d7

  • Implement rate limiting logic for Reddit API requests using RedisCache and RateLimiter classes
  • Add globalCache property to IProcessStreamContext interface and pass it to processStream function in integrationStreamService.ts
  • Add handleRateLimit.ts module to export getRateLimiter function that returns a configured RateLimiter instance using globalCache and Reddit API parameters
  • Import getRateLimiter function and use it to create a local rateLimiter variable in each Reddit API function (getComments, getMoreComments, getPosts) in integrations/reddit/api directory
  • Call checkRateLimit and incrementRateLimit methods of rateLimiter variable before and after making a Reddit API request in each Reddit API function (getComments, getMoreComments, getPosts) in integrations/reddit/api directory
  • Add RateLimiter class to services/libs/redis/src directory that implements the rate limiting logic using RedisCache and rate limit parameters
  • Export RateLimiter class from services/libs/redis/src/index.ts and import it in handleRateLimit.ts module
  • Add increment method to RedisCache class in services/libs/redis/src/cache.ts and ICache interface in services/libs/types/src/caching.ts that increments a cache value by a given amount and sets an expiration time
  • Add dependency on local @crowd/redis package to services/libs/integrations package
    • Add @crowd/redis package to dependencies and node_modules sections of services/libs/integrations/package-lock.json file
    • Add @crowd/redis package to dependencies section of services/libs/integrations/package.json file
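Per the steps above, each Reddit API function brackets its request with `checkRateLimit` before and `incrementRateLimit` after. A hedged sketch of that shape — the limiter here is an in-process stub standing in for the Redis-backed instance that `getRateLimiter` returns, and the actual HTTP call is replaced with placeholder data:

```typescript
// Interface mirroring the two RateLimiter methods this PR calls around
// each Reddit API request.
interface RateLimiterLike {
  checkRateLimit(endpoint: string): Promise<void>
  incrementRateLimit(): Promise<void>
}

// Stub limiter: a plain in-process counter. The real one is created by
// getRateLimiter (handleRateLimit.ts) over globalCache / @crowd/redis.
function makeStubLimiter(maxRequests: number): RateLimiterLike {
  let used = 0
  return {
    async checkRateLimit(endpoint: string) {
      if (used >= maxRequests) throw new Error(`rate limited on ${endpoint}`)
    },
    async incrementRateLimit() {
      used += 1
    },
  }
}

// Hypothetical shape of getPosts after this PR: check the limit, make the
// request (stubbed here), then record the request against the limit.
async function getPosts(
  subreddit: string,
  rateLimiter: RateLimiterLike,
): Promise<string[]> {
  await rateLimiter.checkRateLimit('getPosts')
  const posts = [`r/${subreddit}: post-1`, `r/${subreddit}: post-2`] // stand-in for the Reddit HTTP call
  await rateLimiter.incrementRateLimit()
  return posts
}
```

Incrementing only after the request returns means a failed call still counts against the window only if the increment ran; whether the real implementation counts failed requests is not specified in this PR description.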

Checklist ✅

  • Label appropriately with Feature, Improvement, or Bug.
  • Add screenshots to the PR description for relevant FE changes.
  • New backend functionality has been unit-tested.
  • API documentation has been updated (if necessary) (see docs on API documentation).
  • Quality standards are met.

@garrrikkotua garrrikkotua added the Improvement Created by Linear-GitHub Sync label Jun 22, 2023
@garrrikkotua garrrikkotua requested a review from themarolt June 22, 2023 14:23
@garrrikkotua garrrikkotua merged commit bdf3e61 into main Jun 30, 2023
@garrrikkotua garrrikkotua deleted the improvement/adapt-reddit branch June 30, 2023 11:12