This repository has been archived by the owner on Jan 5, 2023. It is now read-only.

feat: add data aggregation #18

Merged · 2 commits · Dec 15, 2021
3 changes: 2 additions & 1 deletion .gitignore
@@ -1,4 +1,5 @@
/node_modules/
.env
/prod/
/data/**.csv
/data/**.csv
/data/**.md
6 changes: 6 additions & 0 deletions README.md
@@ -18,6 +18,12 @@ To run this tool, complete the following steps:

The tool will then begin collecting data, aggregating the results, and storing them in `.csv` files. The files are saved to the `data` directory and can be opened with a text editor or most spreadsheet programs.

## Aggregating Data

After following the steps in the "Generating Data" section above, you can use the tool to aggregate the data into a markdown file. Run `npm run aggregate` to start the automated process. The tool reads each `.csv` file within the `data` directory and parses it into a markdown list. The first column in each `.csv` file should be the contributor's _name_, and the second column should be the contributor's _url_ (GitHub, Twitter, YouTube, etc.); any columns after the second are ignored. Each entry is generated in the `- [name](url)` format so that the name renders as a link. If the `url` is empty, the entry falls back to the `- name` format.

The list is sorted alphabetically and written to `data/contributors.md`, from which you can copy and paste it to whichever platform you use to announce top contributors.
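
For example (the names and URLs below are purely illustrative), a `data/example.csv` file such as:

```csv
name,url
Ada Lovelace,https://github.com/ada
Grace Hopper,
```

would produce the following entries in `data/contributors.md` (the first row is treated as a header and skipped):

```md
- [Ada Lovelace](https://github.com/ada)
- Grace Hopper
```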

## Feedback and Bugs

If you have feedback or a bug report, please feel free to open a GitHub issue!
1 change: 1 addition & 0 deletions package.json
@@ -4,6 +4,7 @@
"description": "A tool to generate lists of contributors",
"main": "./prod/index.js",
"scripts": {
"aggregate": "node -r dotenv/config prod/modules/aggregateData.js",
"build": "tsc",
"lint": "eslint src --max-warnings 0",
"start": "node -r dotenv/config prod/index.js",
35 changes: 35 additions & 0 deletions src/modules/aggregateData.ts
@@ -0,0 +1,35 @@
import { readFile, writeFile } from "fs/promises";
import { join } from "path";

import { getFileNames } from "../utils/getFileNames";
import { logHandler } from "../utils/logHandler";

(async () => {
  logHandler.log("info", "Beginning data aggregation...");
  const fileNames = await getFileNames();
  const globalContributors: string[] = [];
  for (const file of fileNames) {
    logHandler.log("info", `Aggregating data from ${file}...`);
    const filePath = join(process.cwd(), "data", file);
    const content = await readFile(filePath, "utf8");
    // Skip the header row, then map each remaining row to a list entry.
    const contributors = content.split("\n").slice(1);
    const mappedContributors = contributors.map((el) => {
      if (!el) {
        return "";
      }
      // The first column is the contributor's name, the second is their URL.
      const [name, url] = el.split(",");
      return url ? `- [${name}](${url})` : `- ${name}`;
    });
    // Drop the empty strings produced by blank lines.
    globalContributors.push(...mappedContributors.filter((el) => el));
  }
  logHandler.log("info", "Writing data to output file...");
  const outputFile = join(process.cwd(), "data", "contributors.md");
  // Sort case-insensitively, ignoring the markdown punctuation around names.
  const sorted = globalContributors.sort((a, b) =>
    a
      .replace(/\W/gi, "")
      .toLowerCase()
      .localeCompare(b.replace(/\W/gi, "").toLowerCase())
  );
  await writeFile(outputFile, sorted.join("\n"));
  logHandler.log("info", "Data aggregation complete!");
})();
19 changes: 19 additions & 0 deletions src/utils/getFileNames.ts
@@ -0,0 +1,19 @@
import { readdir } from "fs/promises";
import { join } from "path";

import { logHandler } from "./logHandler";

/**
 * Reads the file names within the data directory, excluding any
 * that are not .csv files.
 *
 * @returns {Promise<string[]>} A promise resolving to an array of .csv file names.
 */
export const getFileNames = async () => {
  logHandler.log("info", "Reading files in the data directory...");
  const dataDirectory = join(process.cwd(), "data");
  const files = await readdir(dataDirectory);
  const filtered = files.filter((file) => file.endsWith(".csv"));
  logHandler.log("info", `Found ${filtered.length} CSV files.`);
  return filtered;
};