Merge pull request #31 from ArkeeAgency/antoinekm/cli
Add cli
goenning committed Mar 6, 2024
2 parents 9c35961 + 15984d8 commit 93686dd
Showing 16 changed files with 5,784 additions and 658 deletions.
13 changes: 13 additions & 0 deletions .changeset/config.json
@@ -0,0 +1,13 @@
{
  "$schema": "https://unpkg.com/@changesets/config@3.0.0/schema.json",
  "commit": false,
  "fixed": [["google-indexing-script"]],
  "changelog": [
    "@changesets/changelog-github",
    { "repo": "goenning/google-indexing-script" }
  ],
  "linked": [],
  "access": "public",
  "baseBranch": "main",
  "updateInternalDependencies": "patch"
}
7 changes: 7 additions & 0 deletions .github/pull_request_template.md
@@ -0,0 +1,7 @@
**What did I change?**

<!-- A brief description of the change. -->

**Why did I change it?**

<!-- A brief description of why the change was made. -->
29 changes: 29 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,29 @@
name: CI
on: [push]

jobs:
  build:
    name: Build, lint, and test on Node ${{ matrix.node }} and ${{ matrix.os }}

    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        node: ["18.x", "20.x"]
        os: [ubuntu-latest]

    steps:
      - name: Checkout repo
        uses: actions/checkout@v4

      - name: Use Node ${{ matrix.node }}
        uses: actions/setup-node@v4
        with:
          node-version: ${{ matrix.node }}
          cache: "npm"

      - name: Install Dependencies
        run: npm install

      - name: Build
        run: npm run build

37 changes: 37 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,37 @@
name: Release
on:
  push:
    branches:
      - main

jobs:
  release:
    if: github.repository == 'goenning/google-indexing-script'

    runs-on: ubuntu-latest

    steps:
      - name: Checkout repo
        uses: actions/checkout@v4

      - name: Use Node
        uses: actions/setup-node@v4
        with:
          cache: "npm"

      - name: Install Dependencies
        run: npm install

      - name: Build
        run: npm run build

      - name: Create Release Pull Request or Publish to npm
        uses: changesets/action@v1
        with:
          publish: npm run release
          version: npm run version
          commit: "release version"
          title: "release version"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          NPM_TOKEN: ${{ secrets.NPM_TOKEN }}
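
The release workflow invokes `npm run release` and `npm run version`, but those scripts are not part of this diff. With changesets they conventionally map to `changeset publish` and `changeset version`, so the project's `package.json` presumably contains something like this (an assumption, shown only to make the workflow self-explanatory):

```json
{
  "scripts": {
    "version": "changeset version",
    "release": "changeset publish"
  }
}
```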
37 changes: 37 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,37 @@
# Contributing to Google Indexing Script

Before jumping into a PR, be sure to search [existing PRs](/goenning/google-indexing-script/pulls) or [issues](/goenning/google-indexing-script/issues) for an open or closed item that relates to your submission.

# Developing

All pull requests should be opened against `main`.

1. Clone the repository
```bash
git clone https://github.com/goenning/google-indexing-script.git
```

2. Install dependencies
```bash
npm install
```

3. Install the CLI globally
```bash
npm install -g .
```

4. Run the development bundle
```bash
npm run dev
```

5. See how to [use it](/README.md#installation) and make your changes!

# Building

After making your changes, you can build the project with the following command:

```bash
npm run build
```
111 changes: 95 additions & 16 deletions README.md
@@ -16,30 +16,109 @@ You can read more about the motivation behind it and how it works in this blog post.

## Preparation

1. Follow this [guide](https://developers.google.com/search/apis/indexing-api/v3/prereqs) from Google. By the end of it, you should have a project on Google Cloud with the Indexing API enabled and a service account with the `Owner` permission on your sites.
2. Make sure you enable both the `Google Search Console API` and the `Web Search Indexing API` on your [Google Project ➤ API Services ➤ Enabled API & Services](https://console.cloud.google.com/apis/dashboard).
3. [Download the JSON](https://github.com/goenning/google-indexing-script/issues/2) file with the credentials of your service account and save it in the same folder as the script. The file should be named `service_account.json`.


## Installation

### Using CLI

Install the CLI globally on your machine.

```bash
npm i -g google-indexing-script
```

### Using the repository

Clone the repository to your machine.

```bash
git clone https://github.com/goenning/google-indexing-script.git
cd google-indexing-script
```

Install and build the project.

```bash
npm install
npm run build
npm i -g .
```

> [!NOTE]
> Ensure you are using an up-to-date Node.js version, with a preference for v20 or later. Check your current version with `node -v`.

## Usage

<details open>
<summary>With <code>service_account.json</code> <i>(recommended)</i></summary>

Create a `.gis` directory in your home folder and move the `service_account.json` file there.

```bash
mkdir ~/.gis
mv service_account.json ~/.gis
```

Run the script with the domain or URL you want to index.

```bash
gis <domain or url>
# `domain` property on gsc
gis seogets.com
# `url prefix` property on gsc
gis https://seogets.com
```

When in doubt, try both 😀

Here are some other ways to run the script:

```bash
# custom path to service_account.json
gis seogets.com --path /path/to/service_account.json
# long version command
google-indexing-script seogets.com
# cloned repository
npm run index seogets.com
```
</details>

<details>
<summary>With environment variables</summary>

Open `service_account.json` and copy the `client_email` and `private_key` values.

Run the script with the domain or URL you want to index.

```bash
GIS_CLIENT_EMAIL=your-client-email GIS_PRIVATE_KEY=your-private-key gis seogets.com
```
</details>
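
Under the hood, the CLI presumably reads these variables straight from the environment. A minimal sketch of that resolution logic (the helper name and return shape are assumptions for illustration, not the script's actual code):

```javascript
// Hypothetical sketch: resolve service-account credentials from the
// environment. Variable names match the README; everything else is assumed.
function resolveCredentials(env) {
  const clientEmail = env.GIS_CLIENT_EMAIL;
  const privateKey = env.GIS_PRIVATE_KEY;
  if (!clientEmail || !privateKey) {
    throw new Error("Set GIS_CLIENT_EMAIL and GIS_PRIVATE_KEY");
  }
  // Private keys passed through the shell often arrive with escaped
  // newlines, so unescape them before use.
  return {
    client_email: clientEmail,
    private_key: privateKey.replace(/\\n/g, "\n"),
  };
}
```

Passing credentials this way avoids keeping `service_account.json` on disk, which can be convenient in CI environments.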

<details>
<summary>With arguments <i>(not recommended)</i></summary>

Open `service_account.json` and copy the `client_email` and `private_key` values.

Once you have the values, run the script with the domain or URL you want to index, the client email, and the private key.

```bash
gis seogets.com --client-email your-client-email --private-key your-private-key
```
</details>

Here's an example of what you should expect:

![](./output.png)

> [!IMPORTANT]
> - Your site must have 1 or more sitemaps submitted to Google Search Console. Otherwise, the script will not be able to find the pages to index.
> - You can run the script as many times as you want. It will only index the pages that are not already indexed.
> - Sites with a large number of pages might take a while to index, be patient.

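
The "only index pages that are not already indexed" behavior amounts to filtering the sitemap URLs by their Search Console inspection status before requesting indexing. A simplified sketch of that idea (the function name and status labels are assumptions, not the script's actual code):

```javascript
// Hypothetical sketch of the re-run behavior: pages already reported as
// indexed are skipped, so repeated runs only touch the remaining URLs.
const INDEXED = "Submitted and indexed"; // assumed status label

function pagesToRequest(pages) {
  return pages.filter((page) => page.status !== INDEXED);
}

const pages = [
  { url: "https://seogets.com/", status: "Submitted and indexed" },
  { url: "https://seogets.com/blog", status: "Discovered - currently not indexed" },
];
console.log(pagesToRequest(pages).map((p) => p.url));
// only the not-yet-indexed URL remains
```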
## 📄 License

