MXNet hook saving more tensors than specified #327

Open
rahul003 opened this issue Aug 14, 2020 · 0 comments

import shutil

# Imports assumed from the smdebug MXNet test suite this snippet comes from.
from smdebug.mxnet import ReductionConfig, SaveConfig
from smdebug.mxnet.hook import Hook as t_hook
from smdebug.trials import create_trial
from tests.mxnet.mnist_gluon_model import run_mnist_gluon_model


def test_save_shapes(out_dir, hook=None):
    hook_created = False
    if hook is None:
        hook_created = True
        global_reduce_config = ReductionConfig(save_raw_tensor=True)
        global_save_config = SaveConfig(save_steps=[0, 1])

        hook = t_hook(
            out_dir=out_dir,
            save_config=global_save_config,
            include_collections=[
                "weights",
                "biases",
                "gradients",
                "default",
                "ReluActivation",
                "flatten",
            ],
            reduction_config=global_reduce_config,
        )
        hook.get_collection("ReluActivation").include(["relu*"])
        hook.get_collection("ReluActivation").save_config = SaveConfig(save_steps=[1])
        hook.get_collection("flatten").include(["flatten*"])
        hook.get_collection("ReluActivation").save_config = SaveConfig(save_steps=[1])
    
    run_mnist_gluon_model(hook=hook, num_steps_train=10, num_steps_eval=10)
    
    tr = create_trial(out_dir)
    print(0, len(tr.tensor_names(step=0)))
    print(1, len(tr.tensor_names(step=1)))
    if hook_created:
        shutil.rmtree(out_dir)

At step 0 the hook should save only 21 tensors, and 31 at step 1. Instead, both steps save 31 tensors.
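
For reference, a minimal sketch (not part of the original report) of how that expectation could be checked with assertions instead of prints, assuming the counts above and the same smdebug trial API used in the snippet:

from smdebug.trials import create_trial


def check_per_step_tensor_counts(out_dir):
    # Hypothetical helper; encodes the expected behavior described above:
    # collections gated to step 1 should not contribute tensors at step 0.
    tr = create_trial(out_dir)
    step0 = len(tr.tensor_names(step=0))
    step1 = len(tr.tensor_names(step=1))
    assert step0 == 21, f"expected 21 tensors at step 0, got {step0}"
    assert step1 == 31, f"expected 31 tensors at step 1, got {step1}"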
