Enable taking derivatives with respect to multiple variables simultaneously #13

Open
thomasahle opened this issue Jan 5, 2025 · 0 comments
Labels: enhancement (New feature or request)

Comments

@thomasahle (Owner)

Background
Our grad mechanism currently only handles differentiation w.r.t. a single Variable. For certain use cases (like computing Jacobians or Hessians, or typical backprop on a large network), we might want partial derivatives with respect to all parameters in one pass.

Potential Approaches

  • Extend the Derivative class to store a tuple/list of variables.
  • Reuse the single-variable derivative machinery, but memoize intermediate results so shared sub-expressions are not re-evaluated multiple times (see the sketch after this list).
  • Exploit the isomorphic hashing approach to avoid redundant computations (caching repeated sub-graphs).
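
To make the reuse point concrete, here is a minimal, self-contained sketch in plain Python. It does not use tensorgrad's Derivative/Variable classes; all names (Node, var, add, mul, grads) are illustrative. A single post-order traversal of the expression graph accumulates adjoints for every requested variable at once, so a node shared by several partials is processed only one time; the same idea carries over to symbolic adjoints (expressions instead of floats).

```python
from dataclasses import dataclass, field


@dataclass(eq=False)  # identity-based equality/hash, so nodes can be set members and dict keys
class Node:
    value: float
    inputs: list = field(default_factory=list)   # [(input_node, local_derivative)]
    adjoint: float = 0.0


def var(x: float) -> Node:
    return Node(value=x)


def add(a: Node, b: Node) -> Node:
    return Node(value=a.value + b.value, inputs=[(a, 1.0), (b, 1.0)])


def mul(a: Node, b: Node) -> Node:
    return Node(value=a.value * b.value, inputs=[(a, b.value), (b, a.value)])


def grads(output: Node, variables: list) -> dict:
    """d(output)/d(v) for every v in `variables`, from a single reverse sweep."""
    # Post-order DFS gives a topological order; each node is visited exactly once,
    # even if it appears in several places in the expression.
    order, seen = [], set()

    def visit(node: Node) -> None:
        if node in seen:
            return
        seen.add(node)
        for inp, _ in node.inputs:
            visit(inp)
        order.append(node)

    visit(output)
    output.adjoint = 1.0
    for node in reversed(order):          # each node is finished before its inputs
        for inp, local in node.inputs:
            inp.adjoint += node.adjoint * local

    return {v: v.adjoint for v in variables}


# f(x, y) = x*y + x  =>  df/dx = y + 1 = 5,  df/dy = x = 3
x, y = var(3.0), var(4.0)
f = add(mul(x, y), x)
g = grads(f, [x, y])
print(g[x], g[y])   # 5.0 3.0
```

In the symbolic setting, the isomorphic-hashing idea would complement this: structurally identical sub-graphs could be recognized even when they are distinct objects, so their local derivatives only need to be computed once.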

Additional Context

  • This is important for scenarios like second-order optimization (Hessian-based methods) or certain advanced autodiff use cases.
@thomasahle added the "enhancement" label on Jan 5, 2025