Pre-processing an Index to use custom IDs and searches #641
Comments
Your understanding is correct. If you want to use string IDs, you'll have to manage the mapping from IDs to strings in the calling code.
Hi, maybe a similar question: can we add new data in a dict-like format, where the keys are strings? Or must users do the <string key, int key> mapping themselves? @mdouze
Yes, the users should do the mapping themselves.
@mdouze, OK, got it. So does FAISS plan to support this feature (using strings as the index IDs, like a dict data structure)?
No. Faiss is not intended as a full-featured DBMS.
Closing, as the initial question has been addressed. Feel free to keep commenting if you have further questions.
OK, thanks for your reply.
@mdouze I'm a bit new to this codebase. How would one go about handling their own mapping?
@sjakati98 The internal IDs are sequential, so you just have to maintain a map from your custom IDs to internal IDs. You can use
That seems simple enough. Thanks!
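The bookkeeping described above can be sketched without faiss at all. In this numpy-only illustration, the class name `StringIdFlatIP` and the brute-force inner-product search are my own stand-ins (not faiss API); the point is the two side tables that translate between user-supplied string keys and faiss-style sequential internal IDs:

```python
import numpy as np

class StringIdFlatIP:
    """Hypothetical helper: sequential internal ids + a string-key side table.

    Brute-force inner product stands in for faiss.IndexFlatIP here; with
    real faiss you would keep the same two tables next to the index.
    """

    def __init__(self, d):
        self.vectors = np.empty((0, d), dtype="float32")
        self.internal_to_key = []   # position i -> string key
        self.key_to_internal = {}   # string key -> position i

    def add(self, key, vec):
        # The internal id is simply the insertion position, mirroring
        # how faiss assigns sequential ids on add().
        self.key_to_internal[key] = len(self.internal_to_key)
        self.internal_to_key.append(key)
        row = np.asarray(vec, dtype="float32")[None, :]
        self.vectors = np.vstack([self.vectors, row])

    def search(self, query, k):
        # Inner-product scores against all stored vectors,
        # then translate the top internal ids back to string keys.
        scores = self.vectors @ np.asarray(query, dtype="float32")
        top = np.argsort(-scores)[:k]
        return [(self.internal_to_key[i], float(scores[i])) for i in top]

idx = StringIdFlatIP(2)
idx.add("doc-a", [1.0, 0.0])
idx.add("doc-b", [0.0, 1.0])
idx.add("doc-c", [1.0, 1.0])
print(idx.search([1.0, 0.2], 2))  # "doc-c" first, then "doc-a"
```

The same two tables (a dict for key-to-id lookups, a list for id-to-key lookups) are all the calling code needs to keep alongside a real faiss index.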
Summary
I am using IndexFlatIP, and I understand that the ID of each vector will be the position it assumes in the index after being added. As a possible way to use custom IDs, I found IndexIDMap, which provides a function called add_with_ids. This structure also has its own search function, which implicitly searches the underlying index, so I think the search can be done directly on the IndexIDMap variable with the same performance.
I would like to know whether my understanding is correct, and whether there is also a way to use custom string IDs.