
Regression tasks: Epistemic and Aleatoric Uncertainty Estimation #22

Open
feracero opened this issue Mar 13, 2023 · 8 comments
feracero commented Mar 13, 2023

Hello, I am trying to use this repository for regression tasks (the existing examples seem to focus on classification).

I would like to estimate epistemic and aleatoric uncertainty for my Bayesian neural network, as described in Section 3.1 of https://arxiv.org/pdf/2204.09308.pdf

Could you please provide some guidance on how to obtain the mean and variance used by the final layer to generate the output samples? That way, one could estimate epistemic uncertainty for regression tasks.

Thank you!
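The decomposition asked about here follows from the law of total variance over Monte Carlo forward passes: the variance of the predicted means gives the epistemic part, and the mean of the predicted variances gives the aleatoric part. A minimal plain-Python sketch, assuming the final layer outputs a (mean, variance) pair per stochastic pass; all names are illustrative, not part of this repo's API:

```python
# Sketch: decomposing predictive uncertainty from Monte Carlo forward passes.
# `mc_samples` holds one (mean, variance) pair per sampled set of weights.
# Illustrative stand-in, not the bayesian-torch API.

def decompose_uncertainty(mc_samples):
    """mc_samples: list of (mean, variance) tuples, one per MC forward pass.

    Law of total variance:
      aleatoric = E_w[ Var(y | x, w) ]  (mean of the predicted variances)
      epistemic = Var_w[ E[y | x, w] ]  (variance of the predicted means)
    """
    n = len(mc_samples)
    means = [m for m, _ in mc_samples]
    variances = [v for _, v in mc_samples]
    aleatoric = sum(variances) / n
    mean_of_means = sum(means) / n
    epistemic = sum((m - mean_of_means) ** 2 for m in means) / n
    return epistemic, aleatoric

# Example: five MC passes whose means disagree slightly.
samples = [(1.0, 0.2), (1.1, 0.25), (0.9, 0.2), (1.05, 0.22), (0.95, 0.18)]
epistemic, aleatoric = decompose_uncertainty(samples)
total = epistemic + aleatoric  # total predictive variance
```

In practice the (mean, variance) pairs would come from repeated forward passes of the Bayesian model, each pass sampling a fresh set of weights.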


famura commented Mar 4, 2024

Hi @feracero, I am currently also thinking about using this repo for regression tasks. Did you have success?

@ranganathkrishnan (and other contributors) would it be possible to add an example? One thing that I am not sure of is how to modify the loss computation, which is exemplified for classification in the Training snippet section.

giulioturrisi commented

@famura, did you manage to perform a regression with this repo? Or did you end up using another one? (If yes, please let me know which :D)


famura commented Apr 17, 2024

@giulioturrisi I was putting it off due to other projects. It is still on my plate to try it within the next 1-2 months, though. What is your experience?

giulioturrisi commented

@famura I have just started looking around for libraries, actually. If I find something nice, I will ping you.

@ranganathkrishnan ranganathkrishnan self-assigned this Apr 17, 2024
@ranganathkrishnan ranganathkrishnan added the enhancement New feature or request label Apr 17, 2024
ranganathkrishnan (Contributor) commented

> Hi @feracero, I am currently also thinking about using this repo for regression tasks. Did you have success?
>
> @ranganathkrishnan (and other contributors) would it be possible to add an example? One thing that I am not sure of is how to modify the loss computation, which is exemplified for classification in the Training snippet section.

Hi @famura, it should be straightforward to use a model with LinearReparameterization layers together with torch.nn.MSELoss() for a regression task. I will add a regression example to the repo.
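For readers landing here before that example exists: the change for regression amounts to swapping cross-entropy for MSE while still adding the KL term contributed by the Bayesian layers. The sketch below mimics a single reparameterized Gaussian weight in plain Python so it is self-contained; `mu`, `rho`, and `kl_weight` are illustrative stand-ins, not bayesian-torch API:

```python
import math
import random

# Stand-in for one LinearReparameterization-style weight: posterior
# N(mu, sigma^2), prior N(0, 1). Illustrative only; the real layers
# vectorize this over full weight tensors.

def sigma_from_rho(rho):
    return math.log1p(math.exp(rho))  # softplus keeps sigma positive

def sample_weight(mu, rho):
    eps = random.gauss(0.0, 1.0)
    return mu + sigma_from_rho(rho) * eps  # reparameterization trick

def kl_to_standard_normal(mu, rho):
    sigma = sigma_from_rho(rho)
    # Closed form: KL( N(mu, sigma^2) || N(0, 1) )
    return 0.5 * (sigma ** 2 + mu ** 2 - 1.0 - 2.0 * math.log(sigma))

def regression_loss(xs, ys, mu, rho, kl_weight):
    # MSE on one sampled weight plus the weighted KL term -- the regression
    # analogue of adding KL to the cross-entropy loss for classification.
    w = sample_weight(mu, rho)
    mse = sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    return mse + kl_weight * kl_to_standard_normal(mu, rho)
```

With the real library, the same shape of computation applies: compute the task loss with `torch.nn.MSELoss()` and add the KL returned by the Bayesian layers.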


famura commented May 8, 2024

Nice, thank you @ranganathkrishnan.

Is there a specific reason why only LSTMs, and not GRUs or RNNs, are supported here?
Or, in other words, why did you have to re-code the LSTM forward pass here instead of using the one from PyTorch?

Update: I think my questions can be answered with "Because we need the KL from the layers that make up the LSTM"

ranganathkrishnan (Contributor) commented

> Nice, thank you @ranganathkrishnan.
>
> Is there a specific reason why only LSTMs, and not GRUs or RNNs, are supported here? Or, in other words, why did you have to re-code the LSTM forward pass here instead of using the one from PyTorch?
>
> Update: I think my questions can be answered with "Because we need the KL from the layers that make up the LSTM"

Hi @famura, no specific reason; we included a reference implementation of a Bayesian LSTM for time-series prediction tasks. Contributions are welcome through PRs: if you end up implementing Bayesian GRU and RNN layers, please send a pull request. Thanks!

staco-tx-mli commented

> Hi @feracero, I am currently also thinking about using this repo for regression tasks. Did you have success?
>
> @ranganathkrishnan (and other contributors) would it be possible to add an example? One thing that I am not sure of is how to modify the loss computation, which is exemplified for classification in the Training snippet section.

> Hi @famura It should be straightforward to use model with LinearReparameterization layers with torch.nn.MSELoss() for regression task. I will add an example for regression in the repo.

Hello, I am currently also looking into using this library for a regression task. How does one weight the KL divergence relative to the MSE loss? In my case, I have multiple outputs, all of which are standardized (μ = 0, σ = 1) based on the training data.
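On the weighting question, two conventions are common. The classification training snippet referenced earlier in this thread adds the KL term scaled down by the batch size; under an ELBO derivation the KL appears once over the whole dataset, so weighting it by 1/N per training example is the other usual choice. A hedged sketch of both (the function names are mine, not the library's, and scalars stand in for tensors):

```python
# Two common KL-weighting conventions for a Bayesian regression loss.

def batch_scaled_loss(mse, kl, batch_size):
    # Per mini-batch: task loss plus KL divided by the batch size,
    # mirroring the pattern used in the classification training snippet.
    return mse + kl / batch_size

def elbo_scaled_loss(mse_sum, kl, num_train):
    # ELBO view: squared error summed over all N training points plus the
    # full KL once, normalized by N (i.e. KL weighted by 1/N per example).
    return (mse_sum + kl) / num_train
```

With standardized targets (μ = 0, σ = 1), the MSE already sits on a roughly unit scale, so the trade-off is governed mainly by this KL scale factor; annealing the KL weight from 0 up to its final value over the early epochs is a common practical refinement.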
