cache provider in feature store instance #1924
Conversation
Hi @DvirDukhan. Thanks for your PR. I'm waiting for a feast-dev member to verify that this patch is reasonable to test. If it is, they should reply with /ok-to-test. Once the patch is verified, the new status will be reflected by the ok-to-test label. I understand the commands that are listed here. Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/lgtm
Signed-off-by: DvirDukhan <dvir@redislabs.com>
Force-pushed from 3feb8eb to 1223652
/kind housekeeping
/lgtm
/ok-to-test
Codecov Report
@@ Coverage Diff @@
## master #1924 +/- ##
=======================================
Coverage 82.34% 82.35%
=======================================
Files 96 96
Lines 7490 7492 +2
=======================================
+ Hits 6168 6170 +2
Misses 1322 1322
Flags with carried forward coverage won't be shown. Click here to find out more.
Continue to review full report at Codecov.
@@ -118,7 +119,7 @@ def project(self) -> str:

     def _get_provider(self) -> Provider:
         # TODO: Bake self.repo_path into self.config so that we dont only have one interface to paths
-        return get_provider(self.config, self.repo_path)
+        return self._provider
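For context, returning a cached provider implies it is built once up front. A minimal sketch of what the counterpart initialization might look like, assuming it happens in `FeatureStore.__init__` (the constructor body and import paths below are illustrative, not the exact Feast code):

```python
from feast.infra.provider import Provider, get_provider  # import path assumed
from feast.repo_config import RepoConfig


class FeatureStore:
    def __init__(self, config: RepoConfig, repo_path: str):
        # Illustrative constructor: the real __init__ also resolves repo_path
        # and loads feature_store.yaml, which is elided here.
        self.config = config
        self.repo_path = repo_path
        # Build the provider (and with it the online/offline store clients)
        # once, instead of on every _get_provider() call.
        self._provider = get_provider(self.config, self.repo_path)

    def _get_provider(self) -> Provider:
        return self._provider
```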
Only nit here is: what if e.g. feature_store.yaml changes?
An alternate approach might be to mimic what we do for the registry, i.e. have a config that specifies the cache TTL (which can be infinite), after which it does a refresh.
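Not what this PR implements, but a rough sketch of that registry-style TTL idea. The `cache_ttl_seconds` knob and the wrapper class below are hypothetical, assuming `get_provider(config, repo_path)` as used elsewhere in this PR:

```python
from datetime import datetime, timedelta
from typing import Optional

from feast.infra.provider import Provider, get_provider  # import path assumed


class ProviderCache:
    """Hypothetical TTL cache for the provider; a TTL of 0 means never expire."""

    def __init__(self, config, repo_path, cache_ttl_seconds: int = 0):
        self._config = config
        self._repo_path = repo_path
        self._ttl = timedelta(seconds=cache_ttl_seconds)
        self._provider: Optional[Provider] = None
        self._cached_at: Optional[datetime] = None

    def get(self) -> Provider:
        now = datetime.utcnow()
        expired = (
            self._provider is not None
            and self._ttl.total_seconds() > 0
            and now - self._cached_at > self._ttl
        )
        if self._provider is None or expired:
            # On refresh one could also re-read feature_store.yaml so that
            # config changes are picked up without restarting the server.
            self._provider = get_provider(self._config, self._repo_path)
            self._cached_at = now
        return self._provider
```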
wouldn't it force you to create a new instance? or "refresh" it?
Yeah, it would recreate the provider after the TTL expires.
As is, this would force users to restart the feature server in order to see changes from the yaml config, which seems fine as a default behavior. But I can see a world where someone wants the original behavior, e.g. for debugging.
I think this use case is different. We need caching for the registry for online retrieval. In the case of the provider, I don't see a downside to having users create a new feature store instance when the config changes, or is there a good use case? It's not super clear to me how debugging would be impacted significantly by having real caching with expiration.
I'm cool with this going in as is, but I see value in keeping a feature server running while messing around with parameters to get it working.
It could basically also be a binary debug-mode flag that, when true, continuously re-instantiates the provider instead of using a real cache.
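As a sketch of that idea (the `provider_debug_mode` flag name is made up here for illustration, not an actual Feast config option):

```python
def _get_provider(self) -> Provider:
    # Hypothetical debug escape hatch: when enabled, rebuild the provider on
    # every call so feature_store.yaml edits are picked up without a restart.
    if getattr(self.config, "provider_debug_mode", False):
        return get_provider(self.config, self.repo_path)
    return self._provider
```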
Signed-off-by: DvirDukhan <dvir@redislabs.com>
/lgtm
[APPROVALNOTIFIER] This PR is APPROVED. This pull request has been approved by: achals, adchia, DvirDukhan. The full list of commands accepted by this bot can be found here. The pull request process is described here.
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment.
What this PR does / why we need it:
This PR caches the provider instance on the feature store object, and with it the offline and online store instances. This reduces the time spent in functions that use them, since these objects no longer need to be re-instantiated on every call.
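For illustration, with the provider cached on the instance, repeated calls like the ones below reuse the same provider and store clients instead of rebuilding them each time (feature names and entity values are placeholders, and the exact get_online_features signature may differ by Feast version):

```python
from feast import FeatureStore

store = FeatureStore(repo_path=".")

# Each call goes through _get_provider(), which now returns the cached
# provider instead of constructing a new one.
for driver_id in (1001, 1002, 1003):
    store.get_online_features(
        features=["driver_hourly_stats:conv_rate"],
        entity_rows=[{"driver_id": driver_id}],
    ).to_dict()
```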
Which issue(s) this PR fixes:
None
Does this PR introduce a user-facing change?: