
feat: Adding custom token separators to search index #1280

Merged: 2 commits merged into main from tokenize on Jun 22, 2023

Conversation

adilansari (Contributor) commented on Jun 21, 2023

Create a search index

$ ➔  http PUT localhost:8081/v1/search/indexes/shoes

{
  "schema": {
    "title": "shoes_index",
    "properties": {
      "name": {
        "type": "string"
      }
    },
    "options": {
      "token_separators": ["-"]
    }
  }
}
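
For reference, here is a minimal Go sketch of how a request body with these options serializes. Only the SearchSchemaOptions type and its TokenSeparators field come from this PR; CreateIndexRequest, IndexSchema, and Field are hypothetical stand-ins used purely to reproduce the JSON shape above (the slice is shown without a pointer, per the review note further down).

package main

import (
	"encoding/json"
	"fmt"
)

// From this PR: per-index options, currently just the token separators.
type SearchSchemaOptions struct {
	TokenSeparators []string `json:"token_separators,omitempty"`
}

// Hypothetical wrapper types mirroring the request body shown above.
type Field struct {
	Type string `json:"type"`
}

type IndexSchema struct {
	Title      string              `json:"title"`
	Properties map[string]Field    `json:"properties"`
	Options    SearchSchemaOptions `json:"options"`
}

type CreateIndexRequest struct {
	Schema IndexSchema `json:"schema"`
}

func main() {
	req := CreateIndexRequest{
		Schema: IndexSchema{
			Title:      "shoes_index",
			Properties: map[string]Field{"name": {Type: "string"}},
			Options:    SearchSchemaOptions{TokenSeparators: []string{"-"}},
		},
	}
	body, _ := json.MarshalIndent(req, "", "  ")
	fmt.Println(string(body)) // prints the same JSON as the PUT body above
}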

Retrieve a search index

$ ➔  http GET localhost:8081/v1/projects/todo_app/search/indexes
HTTP/1.1 200 OK
Content-Length: 169
Content-Type: application/json
Date: Wed, 21 Jun 2023 23:42:48 GMT
Server-Timing: total;dur=2
Vary: Origin

{
    "indexes": [
        {
            "name": "shoes",
            "schema": {
                "options": {
                    "token_separators": [
                        "-"
                    ]
                },
                "properties": {
                    "name": {
                        "type": "string"
                    }
                },
                "source": {
                    "type": "external"
                },
                "title": "shoes_index"
            }
        }
    ]
}

Typesense collection

$ ➔  http GET localhost:8108/collections X-TYPESENSE-API-KEY:<replace_me>
HTTP/1.1 200 OK
Connection: keep-alive
accept-ranges: none
access-control-allow-origin: *
content-encoding: gzip
content-type: application/json; charset=utf-8
transfer-encoding: chunked
vary: accept-encoding

[
    {
        "created_at": 1687390839,
        "default_sorting_field": "",
        "enable_nested_fields": true,
        "fields": [
            {
                "facet": false,
                "index": true,
                "infix": false,
                "locale": "",
                "name": "name",
                "optional": true,
                "sort": false,
                "type": "string"
            },
            {
                "facet": false,
                "index": false,
                "infix": false,
                "locale": "",
                "name": "_tigris_id",
                "optional": true,
                "sort": false,
                "type": "string"
            },
            {
                "facet": false,
                "index": false,
                "infix": false,
                "locale": "",
                "name": "_tigris_created_at",
                "optional": true,
                "sort": false,
                "type": "int64"
            },
            {
                "facet": false,
                "index": false,
                "infix": false,
                "locale": "",
                "name": "_tigris_updated_at",
                "optional": true,
                "sort": false,
                "type": "int64"
            }
        ],
        "name": "1:411:shoes",
        "num_documents": 0,
        "symbols_to_index": [],
        "token_separators": [
            "-"
        ]
    }
]
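
As a quick sanity check of what the separator buys (hypothetical; assumes a document has been indexed into the collection, and reuses the internal collection name and placeholder API key from the transcript above, via Typesense's standard documents and search endpoints):

$ ➔  http POST localhost:8108/collections/1:411:shoes/documents X-TYPESENSE-API-KEY:<replace_me> name="air-jordan"
$ ➔  http GET "localhost:8108/collections/1:411:shoes/documents/search?q=jordan&query_by=name" X-TYPESENSE-API-KEY:<replace_me>

With "-" configured as a token separator, "air-jordan" is indexed as the tokens "air" and "jordan", so the query "jordan" can match the document.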

@reviewpad reviewpad bot added the medium label Jun 21, 2023
reviewpad bot commented on Jun 22, 2023

Reviewpad Report

‼️ Errors

  • Unconventional commit detected: 'rejecting update on token separators' (c0138c7)

A Collaborator commented on the new struct in the diff:

type SearchSchemaOptions struct {
	TokenSeparators *[]string `json:"token_separators,omitempty"`
}

You don't need a pointer to it; the slice will be empty if token_separators is not set.
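
To illustrate the point (a standalone sketch, not code from this PR): with encoding/json, a nil slice tagged omitempty is simply left out when marshaling, and a missing field leaves the slice nil when unmarshaling, so the extra pointer indirection buys nothing.

package main

import (
	"encoding/json"
	"fmt"
)

type SearchSchemaOptions struct {
	// Plain slice instead of *[]string: nil when token_separators is absent.
	TokenSeparators []string `json:"token_separators,omitempty"`
}

func main() {
	// Marshaling: the nil slice is dropped by omitempty.
	unset, _ := json.Marshal(SearchSchemaOptions{})
	set, _ := json.Marshal(SearchSchemaOptions{TokenSeparators: []string{"-"}})
	fmt.Println(string(unset)) // {}
	fmt.Println(string(set))   // {"token_separators":["-"]}

	// Unmarshaling: a missing field leaves the slice nil (length 0).
	var opts SearchSchemaOptions
	_ = json.Unmarshal([]byte(`{}`), &opts)
	fmt.Println(opts.TokenSeparators == nil, len(opts.TokenSeparators)) // true 0
}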

@adilansari adilansari merged commit 104895b into main Jun 22, 2023
3 checks passed
@adilansari adilansari deleted the tokenize branch June 22, 2023 18:13
@tigrisdata-argocd-bot (Collaborator) commented:

🎉 This PR is included in version 1.0.0-beta.121 🎉

The release is available on GitHub release

Your semantic-release bot 📦🚀
