
[RMP] Support for simple ML/CF Models (like Implicit) in Merlin Models and Systems #104

Open
7 of 10 tasks
viswa-nvidia opened this issue Feb 17, 2022 · 6 comments


viswa-nvidia commented Feb 17, 2022

Problem:

Latent factor models enable discovery of the underlying structure between users and items from their interactions. These approaches have long been popular for leveraging implicit feedback data. Customers who are using simple models via Implicit and LightFM want to be able to deploy those models within the Merlin ecosystem.

Goal:

Constraints:

Systems

  • Serve Implicit/LightFM as a self-contained op, with everything required to serve included in the exported Triton model directory (see the sketch after this list)
    • Requires installing the Python package in the tritonserver environment where it will run.
    • Decompose serving into separate operators (retrieval via nearest-neighbour search through the embedding space)
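
A rough sketch of what the self-contained op could look like in a Merlin Systems ensemble follows. The `PredictImplicit` operator name, module path, and `num_to_recommend` parameter are assumptions for illustration rather than confirmed API; the exported directory still has to be served by a tritonserver environment with `implicit` installed.

```python
# Hypothetical sketch: exporting an Implicit model as a self-contained operator
# in a Merlin Systems / Triton ensemble. Operator name and module path are assumed.
from merlin.schema import ColumnSchema, Schema
from merlin.systems.dag import Ensemble
from merlin.systems.dag.ops.implicit import PredictImplicit  # assumed module path

input_schema = Schema([ColumnSchema("user_id", dtype="int64")])

# `model` is a trained implicit model (e.g. AlternatingLeastSquares).
serving_ops = ["user_id"] >> PredictImplicit(model, num_to_recommend=10)

ensemble = Ensemble(serving_ops, input_schema)
# Writes the Triton model repository (config.pbtxt + serialized model artifacts),
# so everything needed to serve lives inside the exported directory.
ensemble.export("/models")
```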

Blocking issues

  • Inference is blocked by issues with serializing Implicit models (a possible workaround is sketched below)
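
As a possible workaround (a sketch, assuming the blocker is pickling the fitted model object), the learned factor matrices could be persisted directly with numpy and the model rebuilt at load time. Newer `implicit` releases also ship a native save/load, but availability depends on the version pinned in the serving environment.

```python
# Sketch of a serialization workaround for an Implicit ALS model: persist only the
# learned factor matrices and rebuild the model at load time. Assumes a CPU model,
# where user_factors/item_factors are plain numpy arrays.
import numpy as np
import implicit


def save_als(model, path):
    np.savez(path, user_factors=model.user_factors, item_factors=model.item_factors)


def load_als(path, factors=64):
    data = np.load(path)
    model = implicit.als.AlternatingLeastSquares(factors=factors)
    model.user_factors = data["user_factors"]
    model.item_factors = data["item_factors"]
    return model
```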

Starting Point:

Merlin-models

Wrap Implicit and LightFM in the high-level model API (see the sketch below):
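
A rough sketch of what the wrapper usage could look like; the module path, class name, and evaluation API are assumptions for illustration rather than the finalised Merlin Models interface.

```python
# Hypothetical usage of a high-level Merlin Models wrapper around Implicit.
# Module path and class name are assumed; LightFM would follow the same pattern.
from merlin.io import Dataset
from merlin.models.implicit import AlternatingLeastSquares  # assumed wrapper

train = Dataset("train.parquet")
valid = Dataset("valid.parquet")

model = AlternatingLeastSquares(factors=64)
model.fit(train)                  # builds the user-item interaction matrix from the Dataset
metrics = model.evaluate(valid)   # ranking-style metrics (e.g. precision@k)
```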

NVTabular

N/A

Merlin-systems

Examples and Docs (To happen in 22.09)

viswa-nvidia (Author) commented:

@benfred, please add the tickets related to this epic here.

@karlhigley karlhigley changed the title [RMP] Simple ML/CF Models (Ben) [RMP] Simple ML/CF Models Feb 17, 2022
@karlhigley karlhigley added this to the Merlin release 22.04 milestone Feb 18, 2022
viswa-nvidia (Author) commented:

@EvenOldridge @benfred, is batch prediction in the NVTabular section required for 22.04?

@karlhigley karlhigley removed this from the Merlin 22.05 milestone Apr 6, 2022
@EvenOldridge EvenOldridge added this to the Merlin 22.06 milestone Apr 6, 2022
@EvenOldridge EvenOldridge changed the title [RMP] Simple ML/CF Models [RMP] Inference support for simple ML/CF Models in Implicit May 2, 2022
@karlhigley karlhigley changed the title [RMP] Inference support for simple ML/CF Models in Implicit [RMP] Support for simple ML/CF Models (like Implicit) in Merlin Models and Systems May 20, 2022
oliverholworthy (Member) commented:

For the serving part: my understanding is that, for both Implicit and LightFM, the way to serve these at scale is similar to the two-tower / bi-encoder design of the DLRM example we have.

The item embeddings are stored in an approximate-nearest-neighbour search index, and we search through this space using a user embedding (for user->item recommendations).

The difference from the DLRM example is in how the user embedding is computed at serving time.

Computing the User Embedding:

  • LightFM represents users and items as an aggregation (sum) of their feature embeddings. In this case we would look up all the feature embeddings corresponding to the user and then sum them.
  • Implicit supports a few different models, each with the concept of a user and item factor (embedding).
    The alternating least squares (ALS) implementation appears to be the only one that supports calculating the user factor (embedding) for unseen/new users, by running the solver given the items the user has interacted with and some model parameters (I'm currently unsure whether this would be feasible in practice in a serving scenario, or how fast this inference would be).
    However, where the user factors/embeddings are already computed, we can look them up at query time. A sketch of both cases follows after this list.
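
A minimal sketch of both cases, assuming a fitted LightFM model with a user feature matrix and a fitted Implicit ALS model; exact method signatures (`recalculate_user` in particular) vary between library versions, so treat this as illustrative only.

```python
# Sketch: computing the user embedding at serving time for each library.
import numpy as np
from scipy.sparse import csr_matrix


def lightfm_user_embedding(lightfm_model, user_features: csr_matrix, user_id: int):
    # LightFM: the user representation is the (weighted) sum of the user's
    # feature embeddings. user_features has shape (n_users, n_user_features),
    # user_embeddings has shape (n_user_features, no_components).
    return np.asarray(user_features[user_id] @ lightfm_model.user_embeddings).ravel()


def implicit_user_embedding(als_model, user_id: int, user_items_row: csr_matrix = None):
    # Known user: look up the stored factor directly.
    if user_items_row is None:
        return als_model.user_factors[user_id]
    # Unseen/new user: re-solve the factor from the items they interacted with.
    # recalculate_user is (to my knowledge) only available on the ALS model, and
    # its signature differs between implicit versions.
    return als_model.recalculate_user(user_id, user_items_row)
```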

It seems that one of the main questions is where we store the user/item/feature embeddings for fast lookup at serving time, ideally with support for versioning (for migrating to a new model version) and updating (for new users and items).

For vector/embedding-based item->user, user->user, and item->item recommendations, the two main components are:

  • Storing user/item/feature embeddings (key-value lookup), plus optional aggregation (e.g. in the case of LightFM)
  • Approximate nearest-neighbour search through the item or user space.
    We currently have the QueryFaiss operator, which is limited to a pre-computed index (so the index would need to be re-computed and Triton re-deployed for new items to be considered). Issue for extending the nearest-neighbour candidate retrieval: Add integrations for other nearest neighbor search tools systems#10

We could start with a similar approach to the nearest-neighbour implementation we have now. A first pass could store the user/feature embeddings as static data inside Triton, albeit with similar challenges to having the item-embedding index stored as file(s) inside Triton (updating, and scaling to large numbers of items/users). Alternatively, they could be managed in a database outside Triton.
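
A first-pass sketch of those two components, using plain FAISS for the nearest-neighbour index (the kind of pre-built index the QueryFaiss operator wraps) and an in-memory dict standing in for the user-embedding key-value store; `model` here is assumed to be a trained Implicit ALS model.

```python
# Sketch: pre-computed item index + user-embedding lookup for user->item retrieval.
import faiss
import numpy as np

item_embeddings = model.item_factors.astype("float32")   # shape (n_items, dim)
user_embeddings = model.user_factors.astype("float32")   # shape (n_users, dim)

# 1. Key-value lookup for user embeddings. A dict is a toy stand-in for static
#    files inside Triton or an external database, which is the open question above.
user_store = {user_id: emb for user_id, emb in enumerate(user_embeddings)}

# 2. Pre-computed nearest-neighbour index over the item space
#    (inner-product search, matching matrix-factorisation scoring).
index = faiss.IndexFlatIP(item_embeddings.shape[1])
index.add(item_embeddings)


def recommend(user_id: int, k: int = 10):
    query = user_store[user_id].reshape(1, -1)
    scores, item_ids = index.search(query, k)
    return item_ids[0], scores[0]
```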

viswa-nvidia (Author) commented:

@benfred / @oliverholworthy, please fill in the Problem, Goal, and Constraints sections in the description. You may have provided this in the comments; please summarise it there. Let me know if you run into any difficulties.

viswa-nvidia (Author) commented:

@oliverholworthy, I have added 22.07 in the description to specify that the task is in scope for 22.07. If it needs to be moved out to a future release, please specify.

oliverholworthy (Member) commented:

I've updated the description to make it clearer which tasks are in scope for 22.07. I think we can achieve a self-contained operator for Implicit for the upcoming release.

@karlhigley karlhigley added this to the Merlin 22.12 milestone Oct 19, 2022
@viswa-nvidia viswa-nvidia modified the milestones: Merlin 23.01, Backlog Nov 15, 2022
@viswa-nvidia viswa-nvidia removed the epic label Dec 15, 2022