Merge branch 'main' into ui-stuff

ishaan812 committed Nov 5, 2023
2 parents b65f04b + 0db51e4 commit f8b61a5
Showing 9 changed files with 142 additions and 12 deletions.
4 changes: 3 additions & 1 deletion .eslintrc.json
@@ -15,7 +15,9 @@
"eslint:recommended",
"plugin:@typescript-eslint/recommended",
"plugin:jest/recommended",
"prettier"
"prettier",
"modular/best-practices",
"modular/style"
],
"rules": {
// The following rule is enabled only to supplement the inline suppression
58 changes: 54 additions & 4 deletions README.md
@@ -1,13 +1,63 @@


# GitHub OpenAPI Search

The goal of this project is to provide a robust yet easy way to search Github for Swagger and OpenAPI definitions. Understanding that there is a lot of noise available, that we only care about OpenAPIs that validate, and that the Github API has rate limits that require you to automate the crawling over time. Providing a robust open-source solution that will crawl public Github repositories for machine-readable API definitions.
The project will consist of developing an open-source API that allows you to pass in search parameters and then utilize the GitHub API to perform the search, helping simplify the search interface, make rate limits visible as part of the response, and handle conducting a search in an asynchronous way, allowing the user to make a call to initiate, but then separate calls to receive results over time as results come in, helping show outcomes over time.
The goal of this project is to provide a robust yet easy way to search GitHub for OpenAPI and Swagger definitions. It accounts for the fact that there is a lot of noise out there, that we only care about OpenAPI definitions that validate, and that the GitHub API has rate limits that require crawling to be automated over time, so it provides a robust open-source solution that crawls public GitHub repositories for machine-readable API definitions.
The project consists of an open-source API that accepts search parameters and uses the GitHub API to perform the search. It simplifies the search interface and handles searching asynchronously: the user makes one call to initiate a search and then separate calls to receive results as they come in over time.
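For illustration, a minimal client-side sketch of that initiate-then-poll flow, assuming the server runs on `localhost:8080` and exposes the `POST /openapi` and `GET /search` routes referenced elsewhere in this commit; the exact query parameters and response shapes are assumptions, not a documented contract.

```ts
// Hypothetical initiate-then-poll client for the search API (Node 18+ global fetch).
const BASE = 'http://localhost:8080';

async function searchOpenApis(query: string): Promise<void> {
  // 1. Initiate a crawl/search; results are indexed asynchronously on the server.
  await fetch(`${BASE}/openapi?q=${encodeURIComponent(query)}`, { method: 'POST' });

  // 2. Poll the already-indexed results as they accumulate over time.
  for (let attempt = 1; attempt <= 5; attempt++) {
    const res = await fetch(`${BASE}/search?q=${encodeURIComponent(query)}`);
    const results = await res.json();
    console.log(`Poll ${attempt}:`, results);
    await new Promise((resolve) => setTimeout(resolve, 10_000)); // wait before polling again
  }
}

searchOpenApis('petstore').catch(console.error);
```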

## Tech Stack

- Node JS/Express JS
- Typescript
- Octokit.JS
- Jest
- Jest (For testing)
- Docker
- Python (Scripting)
- ElasticSearch

## Dev Runbook
Dependencies: NodeJS 19, npm, a GitHub API key
How to get a GitHub API key: https://docs.github.com/en/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens

## Setting up OpenAPI Search with Docker Compose

1. Clone the repository to your local machine.
2. Make sure you have Docker installed locally.
3. Run `docker compose up`.
4. Two containers should now be running: Elasticsearch (the database) and an instance of the server (a sanity-check sketch follows this list).
5. To load the database with OpenAPI files, run
`python scripts/seed_script.py` from the repository root (takes around 2-3 hours).
(To configure which organisations are crawled, edit `scripts/assets/organisations1.txt`; `scripts/assets/organisations2.txt` covers the next 1000 organisations.)
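To confirm both containers came up, a small sketch that probes Elasticsearch and the server's `/ping` route; the ports (`9200` for Elasticsearch, `8080` for the server) are assumptions based on the defaults used elsewhere in this README and in `scripts/seed_script.py`.

```ts
// Quick health probe for the two containers started by docker compose (Node 18+).
async function checkStack(): Promise<void> {
  // Elasticsearch root endpoint returns cluster metadata when the node is up.
  const es = await fetch('http://localhost:9200');
  console.log('Elasticsearch:', es.ok ? 'up' : `status ${es.status}`);

  // The API server exposes a /ping route (see src/app.ts in this commit).
  const api = await fetch('http://localhost:8080/ping');
  console.log('API server:', api.ok ? await api.text() : `status ${api.status}`);
}

checkStack().catch(console.error);
```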

## Setting up the server manually

1. Clone the repository to your local machine.
2. Run `npm i`.
3. Make a `.env` file in the project root and add the variables (see the sketch after this list for how they are read):
**PORT** = the port you want to host the API on
**GITHUB_API_KEY** = your GitHub API key
**ES_HOST** = the location of the Elasticsearch DB
4. Run `npm run build:watch` in one terminal.
5. In another terminal, run `npm run start` to start the server on the port specified in `.env`.
6. The Node.js server should now be running. To test it, go to `localhost:{{PORT}}`, where you will see the admin panel through which you can call some of the APIs.
7. To load the database with OpenAPI files, run
`python scripts/seed_script.py` from the repository root (takes around 2-3 hours).
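For reference, a trimmed sketch of how these `.env` values are consumed at startup, mirroring the fragments of `src/app.ts` visible in this commit; the `PORT` fallback used here is an assumption.

```ts
import dotenv from 'dotenv';

dotenv.config();

// ES_HOST falls back to 'localhost' when unset (as in src/app.ts); the
// Elasticsearch client is then built against http://<ES_HOST>:9200.
const esHost = process.env.ES_HOST || 'localhost';

// GITHUB_API_KEY authenticates the Octokit client used to crawl GitHub.
const githubApiKey = process.env.GITHUB_API_KEY;

// PORT is where Express listens; the 8080 fallback here is an assumption.
const port = Number(process.env.PORT) || 8080;

console.log({ esHost, port, hasGithubKey: Boolean(githubApiKey) });
```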

## Setting up ElasticSearch locally (Manually)
1. `docker pull docker.elastic.co/elasticsearch/elasticsearch:8.8.2`
2. `docker network create elastic`
3. `docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" -e "xpack.security.enabled=false" docker.elastic.co/elasticsearch/elasticsearch:8.8.2`
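Once the container is up, cluster health can be checked the same way `checkClusterHealth` in `src/DB/dbutils.ts` does. The sketch below assumes the legacy `elasticsearch` npm package (inferred from the `es.Client({ host })` call in `src/app.ts`); the package choice and host option are assumptions.

```ts
import * as es from 'elasticsearch';

// Minimal cluster health probe, modelled on checkClusterHealth() in src/DB/dbutils.ts.
const esClient = new es.Client({ host: 'http://localhost:9200' });

async function checkClusterHealth(): Promise<void> {
  try {
    // cat.health() resolves to a one-line text summary, e.g. "... green ...".
    const response = await esClient.cat.health();
    console.log('Cluster health:', response);
  } catch (error) {
    console.error('Elasticsearch is not reachable:', error);
  }
}

checkClusterHealth();
```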

## Loading Details
Currently, we only index OpenAPI files from the top 1000 most popular organisations on GitHub (based on stars), although more organisations can be indexed by adding them to the `scripts/assets/organisations.txt` file.


## API Endpoints
[![Run in Postman](https://run.pstmn.io/button.svg)](https://app.getpostman.com/run-collection/19841716-f1801bb7-b189-429b-a875-91b115d349a2?action=collection%2Ffork&source=rip_markdown&collection-url=entityId%3D19841716-f1801bb7-b189-429b-a875-91b115d349a2%26entityType%3Dcollection%26workspaceId%3D5ebe19fb-61d4-47a7-9cae-de3834853f6b)


🚧Under Construction
71 changes: 71 additions & 0 deletions package-lock.json

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions package.json
@@ -12,6 +12,7 @@
"@types/jest": "~29.5",
"@types/node": "^18.16.19",
"@types/openapi-to-postmanv2": "^3.2.1",

"@types/supertest": "^2.0.12",
"@typescript-eslint/eslint-plugin": "~5.59",
"@typescript-eslint/parser": "~5.59",
5 changes: 2 additions & 3 deletions scripts/seed_script.py
@@ -4,13 +4,13 @@

import requests


def call_local_endpoint(query):
    url = f'http://localhost:8080/openapi?{query}'
    print(f"Calling {url}")

    try:
        response = requests.post(url)

        # Check if the response was successful (status code 200)
        if response.status_code == 200:
            print("Request to localhost:8080/search was successful!")
@@ -36,5 +36,4 @@ def loadbyorganisations(filename):
#Get Open API files
loadbyorganisations("scripts/assets/organisations1.txt")

#Have to load Swagger Files too
#Load by Repository

3 changes: 3 additions & 0 deletions src/DB/dbutils.ts
@@ -1,5 +1,6 @@
import { esClient } from '../app.js';


export async function checkClusterHealth(): Promise<string> {
  try {
    const response = await esClient.cat.health();
@@ -23,6 +24,7 @@ export async function BulkStoreToDB(validFiles: any): Promise<void> {
  }
}


export async function DeleteDocumentWithId(Id: string): Promise<void> {
  try {
    const index = 'openapi';
@@ -71,3 +73,4 @@ export async function GetDocumentWithId(id: string): Promise<any> {
    console.error('Error getting document from database:', error);
  }
}

7 changes: 5 additions & 2 deletions src/app.ts
@@ -19,11 +19,13 @@ dotenv.config();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const rootDir = path.join(__dirname);

const octokit = new CustomOctokit({
  userAgent: 'github-openapi-search/v0.0.1',
  auth: process.env.GITHUB_API_KEY,
  throttle: {
    onRateLimit: (retryAfter, options): boolean => {

      octokit.log.warn(
        `Request quota exhausted for request ${options.method} ${options.url}`,
      );
@@ -38,6 +40,7 @@ const octokit = new CustomOctokit({
  },
});


const esHost = process.env.ES_HOST || 'localhost';
const esClient = new es.Client({
  host: 'http://' + esHost + ':9200',
@@ -47,7 +50,6 @@ const esClient = new es.Client({
const app = express();
app.set('view engine', 'pug');
app.set('views', path.join(rootDir, 'templates'));

app.get('/search', async (_req, _res) => {
  const query = _req.query.q as string;
  const results = await passiveSearch(query);
@@ -80,6 +82,7 @@ app.use('/ping', async (_req, _res) => {
  _res.send(response);
});


app.get('/openapi/:id', async (_req, _res) => {
  const id = _req.params.id;
  GetDocumentWithId(id).then((response) => {
@@ -114,5 +117,5 @@ app.get('/file/:id', (_req, _res) => {
});
})


export { octokit, esClient, app };

3 changes: 1 addition & 2 deletions src/searchtools/search.ts
@@ -28,7 +28,6 @@ export async function activeSearch(
      per_page: 100,
    },
    (response: any) => {
      console.log(response);
      files = files.concat(response.data);
      if (files.length >= 200) {
        processCount++;
@@ -73,6 +72,7 @@ export async function activeSearch(
  return validFiles;
}


export async function passiveSearch(query: string): Promise<any> {
  try {
    if (esClient === undefined) {
@@ -105,6 +105,5 @@ export async function passiveSearch(query: string): Promise<any> {
      return error;
    }
  }

  return 'Database not found';
}
2 changes: 2 additions & 0 deletions src/updatetools/updateutils.ts
@@ -1,3 +1,4 @@

import OASNormalize from 'oas-normalize';
import { octokit, esClient } from '../app.js';
import { DeleteDocumentWithId, CreateDocument } from '../DB/dbutils.js';
@@ -110,3 +111,4 @@ export async function UpdateDocument(document: any): Promise<void> {
    }
  });
}
