Hi, my dataset is very large and cannot be handled by Tangram, so I wonder if it is possible to add a parameter or API option, such as a batch size, to address this problem. Thanks a lot.
Hey @HelloWorldLTY, currently there isn't a way to process batches.
Could you let me know the dimensions of your dataset, both single-cell and spatial?
I will try to help you with a workaround!
Thank you so much for your patience!
Since your spatial data is very large, one thing you could do is split it into several parts and then continue with your mapping.
For example, if you have many tissue segments, you could map each segment separately.
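A minimal sketch of this split-and-map idea, assuming the spatial data carries a per-spot tissue-segment label. The segment labels, counts, and the `adata_sp`, `adata_sc`, and `markers` names in the comments are hypothetical; the Tangram calls referenced in the comments (`tg.pp_adatas`, `tg.map_cells_to_space`) are the package's usual entry points, but check your installed version's API before relying on them.

```python
import numpy as np

# Hypothetical setup: each spatial spot is labeled with one of a few
# tissue segments (here simulated with random integers).
rng = np.random.default_rng(0)
n_spots = 10_000
segments = rng.integers(0, 4, size=n_spots)  # 4 hypothetical tissue segments

# Group spot indices by segment so each segment can be mapped independently.
splits = {seg: np.flatnonzero(segments == seg) for seg in np.unique(segments)}

for seg, idx in splits.items():
    # With AnnData objects this step would subset the spatial data, e.g.:
    #   adata_sp_seg = adata_sp[idx].copy()
    # and then run the mapping on the smaller object, e.g.:
    #   tg.pp_adatas(adata_sc, adata_sp_seg, genes=markers)
    #   ad_map = tg.map_cells_to_space(adata_sc, adata_sp_seg)
    print(f"segment {seg}: {len(idx)} spots")

# Sanity check: every spot belongs to exactly one segment subset.
assert sum(len(idx) for idx in splits.values()) == n_spots
```

Mapping each segment separately keeps the per-run memory footprint proportional to the largest segment rather than the full dataset; the per-segment mappings can then be combined downstream.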