However, all the adding and searching happens in memory, and if the application closes or crashes the data is lost. My question, since faiss supports writing an index to disk, is how to implement the most efficient index-saving strategy.
It really depends on the operating conditions.
One approach is with two indexes: one big one with most of the vectors, and one in which you add new vectors. At search time, you search in both.
Then, every 10k adds, you can:

1. save the small index (fast) with incremental file names,
2. merge the small index into the big one (fast, in RAM),
3. clear the small index.
At recovery time, you then need to load the small indexes to reconstruct the big one. You could also have a background job that merges the small indexes on disk.
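A minimal sketch of this strategy in Python, assuming flat L2 indexes; the dimensionality, the checkpoint file names, and the NumPy buffer used to merge into the big index are illustrative assumptions, not something prescribed above:

```python
import numpy as np
import faiss

d = 128                      # vector dimensionality (assumed)
CHECKPOINT_EVERY = 10_000    # flush the small index every 10k adds

big = faiss.IndexFlatL2(d)    # holds most of the vectors
small = faiss.IndexFlatL2(d)  # receives newly added vectors
pending = []                  # raw float32 vectors backing the small index
checkpoint_id = 0

def add(vectors: np.ndarray):
    """Add new vectors (float32, shape (n, d)) and checkpoint every ~10k."""
    global checkpoint_id
    small.add(vectors)
    pending.append(vectors)
    if small.ntotal >= CHECKPOINT_EVERY:
        checkpoint_id += 1
        # 1) persist the small index under an incremental file name (fast)
        faiss.write_index(small, f"small_{checkpoint_id:04d}.index")
        # 2) merge its vectors into the big in-RAM index (fast)
        big.add(np.vstack(pending))
        # 3) clear the small index and its buffer
        small.reset()
        pending.clear()

def search(queries: np.ndarray, k: int):
    """Search both indexes and keep the k best hits overall."""
    D1, I1 = big.search(queries, k)
    D2, I2 = small.search(queries, k)
    # small-index ids are shifted so they match the position the vectors
    # will occupy in the big index after the next merge
    I2 = np.where(I2 >= 0, I2 + big.ntotal, I2)
    D, I = np.hstack([D1, D2]), np.hstack([I1, I2])
    order = np.argsort(D, axis=1)[:, :k]
    return np.take_along_axis(D, order, axis=1), np.take_along_axis(I, order, axis=1)

def recover(small_paths):
    """Rebuild the big index by replaying the small checkpoints on disk."""
    rebuilt = faiss.IndexFlatL2(d)
    for path in sorted(small_paths):
        chunk = faiss.read_index(path)
        rebuilt.add(chunk.reconstruct_n(0, chunk.ntotal))
    return rebuilt
```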
Thanks, I think this would work without data loss in case of failure.
Another question: how do you handle metadata for the vectors? The results of a distance search might not be relevant. For example, in our application we have a clientId and a categoryId for each vector, and other attributes as well, so the topK results that are returned might not belong to that clientId. Is there an index that supports adding attributes for the vectors inside the index as well?
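One common workaround, sketched here purely as an illustration (it is not an answer given in this thread): wrap the index in faiss.IndexIDMap, keep the attributes in an external map keyed by the same ids, and post-filter the search results. The over-fetch factor and the field names below are assumptions:

```python
import numpy as np
import faiss

d = 128
index = faiss.IndexIDMap(faiss.IndexFlatL2(d))
metadata = {}  # vector id -> {"clientId": ..., "categoryId": ...}

def add(vec_id: int, vector: np.ndarray, client_id: int, category_id: int):
    # vector must be float32; ids must be int64
    index.add_with_ids(vector.reshape(1, d), np.array([vec_id], dtype="int64"))
    metadata[vec_id] = {"clientId": client_id, "categoryId": category_id}

def search_for_client(query: np.ndarray, k: int, client_id: int):
    # over-fetch (the factor of 4 is arbitrary), then drop hits from other clients
    D, I = index.search(query.reshape(1, d), 4 * k)
    hits = [(dist, i) for dist, i in zip(D[0], I[0])
            if i >= 0 and metadata[i]["clientId"] == client_id]
    return hits[:k]
```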
Running on:
Interface:
About my app:
However, all the adding and searching happens in memory, and if the application closes or crashes the data is lost. faiss supports writing an index to disk, for example:
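A minimal example, assuming the standard faiss.write_index / faiss.read_index serialization calls and an illustrative file name:

```python
import numpy as np
import faiss

d = 128                                      # assumed dimensionality
index = faiss.IndexFlatL2(d)                 # the in-memory index
index.add(np.random.rand(1000, d).astype("float32"))

faiss.write_index(index, "my_index.faiss")   # persist the index to disk
index = faiss.read_index("my_index.faiss")   # reload it after a restart
```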
My question is: how do I implement the most efficient index-saving strategy? The options I can think of are:

1. Block all requests while the index is being written to file, for every new vector added. This will lead to a decrease in performance.
2. Periodically write the index in the background, after every 10,000 new vectors (a sketch of this option follows the list). If the application crashes, the unwritten new vectors will be lost.
3. Some other strategy?
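A sketch of option 2, assuming a single index protected by a lock and a hypothetical file name; cloning the index before writing keeps requests unblocked during the save, but the stated drawback remains: vectors added after the last save are lost on a crash.

```python
import threading
import numpy as np
import faiss

SAVE_EVERY = 10_000          # assumed checkpoint interval
INDEX_PATH = "index.faiss"   # assumed file name

d = 128
index = faiss.IndexFlatL2(d)
lock = threading.Lock()
unsaved = 0

def add_vectors(vectors: np.ndarray):
    """Add vectors; every SAVE_EVERY additions, snapshot and save in the background."""
    global unsaved
    snapshot = None
    with lock:
        index.add(vectors)
        unsaved += vectors.shape[0]
        if unsaved >= SAVE_EVERY:
            snapshot = faiss.clone_index(index)  # copy so writing happens outside the lock
            unsaved = 0
    if snapshot is not None:
        # persist the snapshot without blocking incoming add/search requests
        threading.Thread(target=faiss.write_index,
                         args=(snapshot, INDEX_PATH), daemon=True).start()
```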
Please help me. I have been scratching my head for the last 2 weeks regarding this problem.