Hey all!
I've got a very large (130K x 130K) but sparse directed adjacency matrix that I'm trying to train an autoencoder on. The entries are signed: a positive value indicates a positive connection and a negative value indicates a negative connection.
I'm basing my code off of Tutorial6.ipynb, but updated to use things like `RandomLinkSplit`. My code looks like the following:
I was thinking that I could use it as such:
but I'm running into `IndexError: too many indices for array: array is 0-dimensional, but 1 were indexed`.
I'm reading through the torch_geometric.data.Data documentation, and it explicitly mentions:

> x (torch.Tensor, optional) - Node feature matrix with shape [num_nodes, num_node_features].
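For reference, here's a toy sketch of what that `x` shape means when a graph has no natural node features. The identity-feature fallback below is just one common convention (an assumption on my part, not something from the tutorial); learned `nn.Embedding` features per node are another option:

```python
import numpy as np

# Hypothetical small graph standing in for the 130K-node case.
num_nodes = 5

# x must be [num_nodes, num_node_features] -- per-node features, NOT rows of
# the adjacency matrix. With no natural features, a common fallback is an
# identity matrix (a one-hot ID per node).
x_identity = np.eye(num_nodes, dtype=np.float32)

print(x_identity.shape)  # [num_nodes, num_node_features]
```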
My issue is that I'm not sure what to pass in for `x`. I have 130K "samples", and each sample can form a connection with any of the other 130K samples in the connectivity matrix. In a standard autoencoder we train the model to reconstruct its input; in my case, would my model be reconstructing the 130K connections? That would mean I'd need to pass in rows of size N x 130_000 for training?
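For what it's worth, a sparse signed adjacency shouldn't require materializing 130K-wide rows: it can stay in edge-list form, which is the shape `torch_geometric.utils.from_scipy_sparse_matrix` returns (an `edge_index` plus an `edge_weight`). A minimal numpy/scipy sketch of that conversion, with made-up entries:

```python
import numpy as np
from scipy.sparse import coo_matrix

# Toy signed, sparse, directed adjacency (stand-in for the 130K x 130K matrix).
rows = np.array([0, 1, 2])
cols = np.array([1, 2, 0])
signs = np.array([+1.0, -1.0, +1.0])  # +1 = positive edge, -1 = negative edge
adj = coo_matrix((signs, (rows, cols)), shape=(4, 4))

# The edge-list form PyG works with: a [2, num_edges] index array plus a
# [num_edges] weight vector -- memory scales with the number of edges, not N**2.
edge_index = np.vstack([adj.row, adj.col])
edge_weight = adj.data

print(edge_index.shape, edge_weight)
```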
I found another issue where the user is using `from_scipy_sparse_matrix`, Model Outputs The Same Value for All Graphs in Binary Classification, but that user's issue is different in that they have a separate dataset.