
[Requirement] Custom allocator support for gpu_cache #454

Open
mfbalin opened this issue Jun 27, 2024 · 1 comment
mfbalin commented Jun 27, 2024

Is your feature request related to a problem? Please describe.
We use gpu_cache in the Deep Graph Library (DGL) project and would like gpu_cache to support a custom allocator, so that in our project it can allocate through the PyTorch allocator.

yingcanw (Collaborator) commented Jul 3, 2024

Thanks for your feedback. We have abstracted the allocator in the latest EmbedCache, so you can implement a custom allocator for both host and device memory allocation.
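
To illustrate what such a custom allocator could look like, here is a minimal C++ sketch. The `IAllocator` interface below is a placeholder, not the actual EmbedCache abstraction (check the gpu_cache headers for the real signatures); the device path delegates to PyTorch's CUDA caching allocator via `c10::cuda::CUDACachingAllocator::raw_alloc`/`raw_delete`, which is roughly what the DGL use case above asks for.

```cpp
#include <cstddef>
#include <cuda_runtime.h>
#include <c10/cuda/CUDACachingAllocator.h>  // PyTorch's CUDA caching allocator

// Placeholder interface standing in for the abstracted allocator in
// EmbedCache; the real method names and signatures may differ.
struct IAllocator {
  virtual ~IAllocator() = default;
  virtual void* allocate_host(std::size_t nbytes) = 0;
  virtual void* allocate_device(std::size_t nbytes) = 0;
  virtual void free_host(void* ptr) = 0;
  virtual void free_device(void* ptr) = 0;
};

// Custom allocator that takes pinned host memory from the CUDA runtime and
// routes device allocations through PyTorch's caching allocator.
struct TorchAllocator final : IAllocator {
  void* allocate_host(std::size_t nbytes) override {
    void* ptr = nullptr;
    // Pinned host memory; production code should check the return status.
    cudaHostAlloc(&ptr, nbytes, cudaHostAllocDefault);
    return ptr;
  }
  void* allocate_device(std::size_t nbytes) override {
    // Allocates on the current device from the same pool PyTorch tensors use.
    return c10::cuda::CUDACachingAllocator::raw_alloc(nbytes);
  }
  void free_host(void* ptr) override { cudaFreeHost(ptr); }
  void free_device(void* ptr) override {
    c10::cuda::CUDACachingAllocator::raw_delete(ptr);
  }
};
```

Drawing device memory from PyTorch's caching allocator keeps the cache and the framework from fragmenting GPU memory against each other; error handling is omitted here for brevity.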

yingcanw self-assigned this Jul 3, 2024