
Creating multiple records for a single document #74

Closed

AlanFoster opened this issue May 26, 2020 · 2 comments
AlanFoster commented May 26, 2020

Hi, I'm following the walkthrough from: https://www.youtube.com/watch?v=VSkXyuXzwlc&feature=emb_logo

The walkthrough shows how to create search records for a single blog post.

The solution creates multiple records on Algolia from a single blog post, none of which have an objectID, and removes the resulting duplication through the slug attribute in the index's settings.

// https://github.com/jlengstorf/gatsby-algolia/blob/f3f6792332fb5d08acccecd11f4011ad5a0522d1/gatsby-config.js#L47-L53
// Each chunk of a post is pushed as its own record with no objectID;
// the records for one post only share the slug.
chunks.push({
  slug: post.fields.slug,
  date: post.frontmatter.date,
  title: post.frontmatter.title,
  excerpt: post.frontmatter.excerpt,
})
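
For context, the deduplication in the video relies on Algolia's distinct feature in the index settings; roughly something like this (the index name and client setup here are just illustrative, not from the walkthrough):

// Sketch only: dedupe records that share a slug using Algolia's "distinct" feature.
const algoliasearch = require('algoliasearch')

const client = algoliasearch(process.env.ALGOLIA_APP_ID, process.env.ALGOLIA_ADMIN_KEY)
const index = client.initIndex('Posts')

index.setSettings({
  attributeForDistinct: 'slug', // group records from the same post by slug
  distinct: true,               // only return one record per slug in search results
})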

However, this doesn't work in the latest release of gatsby-plugin-algolia due to the following check added in https://github.com/algolia/gatsby-plugin-algolia/pull/27/files#diff-fda05457e393bada716f508859bfc604R103-R107:

if (objects.length > 0 && !objects[0].objectID) {
  report.panic(
    `failed to index to Algolia. Query results do not have 'objectID' key`
  );
}

When adding an objectID:

objectID: node.id

Because every chunk of a post then shares the same objectID, the last write to Algolia wins and overwrites the earlier records for that blog post. The result is a single record per blog post (the last one written) rather than the multiple search snippets per post that are required.
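
To illustrate (a sketch, not the original code; splitPostIntoChunks is a hypothetical helper): if the objectID comes from the parent node, every chunk produces the same objectID, so Algolia treats them as one object.

// Sketch only: every chunk inherits the parent node's id as its objectID,
// so these records all collide on the same object in Algolia.
const records = splitPostIntoChunks(post).map(chunk => ({
  objectID: post.id,          // identical for every chunk of this post
  slug: post.fields.slug,
  title: post.frontmatter.title,
  excerpt: chunk,
}))
// Indexing `records` leaves only the last chunk for this post in the index.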

Is the approach taken in the video still the correct way to chunk blog posts into multiple search records on Algolia, or is this a regression in the library?


Haroenv commented Jun 1, 2020

If you are splitting a single node into multiple records, you also need to split the id, for example by appending the chunk index. In pseudo-code:

splitIntoChunks(singleObject).map((item, index) => ({
  ...item,
  objectID: item.id + '-' + index,
}));

This is needed to be able to differentiate between every "part" of the item.

This is not a bug in the library, but rather a fix: without an objectID, partial updates can't work, and every item would still need to be updated on each run.
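
For example, inside a gatsby-plugin-algolia query this could look roughly like the following (the splitIntoChunks helper and the query fields are illustrative, not taken from your setup):

// Sketch only: per-chunk objectIDs inside a gatsby-plugin-algolia query transformer.
const postQuery = `{
  allMarkdownRemark {
    nodes {
      id
      excerpt(pruneLength: 5000)
      fields { slug }
      frontmatter { title date }
    }
  }
}`

const queries = [
  {
    query: postQuery,
    transformer: ({ data }) =>
      data.allMarkdownRemark.nodes.flatMap(node =>
        splitIntoChunks(node.excerpt).map((chunk, index) => ({
          // Unique per chunk and stable across builds, so partial updates work.
          objectID: `${node.id}-${index}`,
          slug: node.fields.slug,
          title: node.frontmatter.title,
          date: node.frontmatter.date,
          excerpt: chunk,
        }))
      ),
    indexName: 'Posts',
    settings: { attributeForDistinct: 'slug', distinct: true },
  },
]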

@AlanFoster (Author)

Thanks!
