Complete the build algorithms #457

Closed
zolkis opened this issue Aug 24, 2023 · 1 comment · Fixed by #603
zolkis commented Aug 24, 2023

Complete the missing parts of the build()/buildSync() algorithm.
Part of that is resolving #448.

a-sully commented Jan 25, 2024

What other parts are missing? Is there more to this than just adding the traversal algorithm (#457)?

inexorabletash added a commit to inexorabletash/webnn that referenced this issue Feb 28, 2024
As discussed in webmachinelearning#572:

- Define an "operator" concept, with inputs, outputs, activations.
- Defer "platform operator" and "platform operand" to build.
- Standardize the graph connection steps across builder methods.
- Simplify activation, input and constant steps.

Not covered in this change:

- Erroring if input's [[builder]] doesn't match `this`
- Build algorithm - covered by webmachinelearning#448 and webmachinelearning#457
- gru() is missing steps to populate output. Added "Issue" note.
- Introducing "Validate arguments" section for each method.
- Introducing "Calculate output shape" section for each method.

For webmachinelearning#549 and webmachinelearning#572.
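The "operator" concept and the standardized graph-connection steps described in the commit above can be sketched roughly as follows. This is a hypothetical illustration, not the spec's actual internal slots; the names `MLOperatorSketch`, `MLOperandSketch`, and `connect` are assumptions.

```typescript
// Hypothetical sketch of the "operator" concept: a record linking input
// and output operands, with an optional fused activation. The spec's
// internal slot names may differ.
interface MLOperatorSketch {
  name: string;                  // e.g. "conv2d"
  inputs: MLOperandSketch[];
  outputs: MLOperandSketch[];
  activation?: MLOperatorSketch; // fused activation, if any
}

interface MLOperandSketch {
  operator?: MLOperatorSketch;   // the operator that produces this operand
  dependents: MLOperatorSketch[]; // operators that consume this operand
}

// Standardized graph-connection step, shared by all builder methods:
// wire an operator to the operands it consumes and produces.
function connect(op: MLOperatorSketch): void {
  for (const input of op.inputs) input.dependents.push(op);
  for (const output of op.outputs) output.operator = op;
}
```

Deferring "platform operator" and "platform operand" to build() means these records stay purely structural until the graph is compiled.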
fdwr pushed a commit that referenced this issue Mar 12, 2024
* Content: Define operand concept, simplify graph connection steps

As discussed in #572:

- Define an "operator" concept, with inputs, outputs, activations.
- Defer "platform operator" and "platform operand" to build.
- Standardize the graph connection steps across builder methods.
- Simplify activation, input and constant steps.

Not covered in this change:

- Erroring if input's [[builder]] doesn't match `this`
- Build algorithm - covered by #448 and #457
- gru() is missing steps to populate output. Added "Issue" note.
- Introducing "Validate arguments" section for each method.
- Introducing "Calculate output shape" section for each method.

For #549 and #572.

* Trivial fixes - missing spaces, wrong operation name

* lstm(): Fix output2 shape calculation

In tip-of-tree, the "desc" MLOperandDescriptor is populated, used,
then modified and conditionally used again (if `returnSequence` is
true) when creating "output2". This was broken in previous commits in
this branch. Restore the intent, using a separate "desc2"
MLOperandDescriptor only conditionally populated and then
conditionally used.
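The desc/desc2 pattern described above can be sketched as follows. This is an illustrative guess at the shape of the fix, not the spec's actual steps; the function name and dimension layout are assumptions.

```typescript
// Hypothetical sketch: instead of mutating and reusing "desc", a separate
// "desc2" is only populated (and used) when returnSequence is true.
interface MLOperandDescriptorSketch {
  dataType: string;
  dimensions: number[];
}

function lstmOutputDescriptors(
  numDirections: number, batchSize: number, hiddenSize: number,
  steps: number, returnSequence: boolean
): MLOperandDescriptorSketch[] {
  const desc: MLOperandDescriptorSketch = {
    dataType: "float32",
    dimensions: [numDirections, batchSize, hiddenSize],
  };
  const descriptors = [desc, { ...desc }]; // output0 and output1
  if (returnSequence) {
    // Conditionally populated, conditionally used; "desc" stays untouched.
    const desc2: MLOperandDescriptorSketch = {
      dataType: "float32",
      dimensions: [steps, numDirections, batchSize, hiddenSize],
    };
    descriptors.push(desc2);
  }
  return descriptors;
}
```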

* Update index.bs

Co-authored-by: Ningxin Hu <ningxin.hu@intel.com>

* Initial review feedback:

* Add other operand inputs
* PreLU -> PReLU
* Define split() output shapes calculations

* Introduce definition for computational graph with inputs and constants

* Revise Programming Model section

---------

Co-authored-by: Ningxin Hu <ningxin.hu@intel.com>
inexorabletash added a commit to inexorabletash/webnn that referenced this issue Mar 13, 2024
- Adds validation that passed outputs are neither inputs nor
  constants, matching the Chromium implementation.

- Traverses the graph, starting with outputs, to visit all connected
  operands.

- Previously the outputs were iterated over to process inputs and
  constants, which didn't make any sense. Inputs are hooked up to
  [[inputsDescriptors]]. Nothing is specifically mentioned about
  constants, but an issue about transferring is referenced. (webmachinelearning#566)

- The impact of MLGraphBuilder re-use (webmachinelearning#567) is called out, since it
  could allow for removing the graph traversal.

- Populates graph's [[outputDescriptors]], which was previously just
  missing. (webmachinelearning#448)

- Makes most of the validation behavior of build() happen
  synchronously rather than "in parallel". A promise is still
  returned, of course.

- Converting the graph into an implementation-defined format is called
  out, which remains in "in parallel" steps.

Fixes webmachinelearning#448, fixes webmachinelearning#457, fixes webmachinelearning#552.
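The validation and traversal steps described in this commit can be sketched as below. This is a simplified illustration under assumed names (`OperandNode`, `kind`, `collectGraph`), not the spec's algorithm: requested outputs must not be inputs or constants, and the graph is walked from the outputs to visit every connected operand.

```typescript
// Hypothetical sketch of build()'s synchronous validation + traversal.
type OperandKind = "input" | "constant" | "intermediate";

interface OperandNode {
  name: string;
  kind: OperandKind;
  inputs: OperandNode[]; // operands this one is computed from
}

// Reject outputs that are inputs or constants, then traverse from the
// outputs to collect all connected operands.
function collectGraph(outputs: OperandNode[]): Set<OperandNode> {
  for (const op of outputs) {
    if (op.kind === "input" || op.kind === "constant") {
      throw new TypeError(`output "${op.name}" is an input or constant`);
    }
  }
  const visited = new Set<OperandNode>();
  const stack = [...outputs];
  while (stack.length > 0) {
    const node = stack.pop()!;
    if (visited.has(node)) continue;
    visited.add(node);
    stack.push(...node.inputs); // follow edges toward the graph's inputs
  }
  return visited;
}
```

In the spec, these steps run synchronously (so errors reject the promise early), while conversion to an implementation-defined format stays in the "in parallel" steps.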
@inexorabletash inexorabletash self-assigned this Mar 13, 2024
@fdwr fdwr closed this as completed in #603 Mar 16, 2024
@fdwr fdwr closed this as completed in 1062297 Mar 16, 2024