Reshape Targets and Mean for RHS in Cholesky solver #2587

Open
wants to merge 2 commits into main

Conversation

turquoisedragon2926

This is a potential fix for bug #2577, regarding fantasy models for multitask GPs.

Code changes

Added reshaping to align targets and fant_mean in the calculation of small_system_rhs, so that the tensor shapes line up in the fantasy-data update step for the GP model.
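
For illustration, a minimal sketch of the kind of change described: the names targets, fant_mean, and small_system_rhs come from the summary above, but the surrounding code, shapes, and exact reshape are assumptions rather than the actual diff.

# Hypothetical before: for a multitask GP, targets arrives with shape
# (n, t) while fant_mean has been flattened to (n * t,), so the
# right-hand side below does not line up.
# small_system_rhs = targets - fant_mean

# Hypothetical after: flatten targets to match fant_mean before forming
# the right-hand side of the small Cholesky system.
small_system_rhs = targets.reshape(fant_mean.shape) - fant_mean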

Tests

None added

Documentation

None updated

Examples

To test, one can run:

import torch
import gpytorch

class MultitaskGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood, n_tasks):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.MultitaskMean(
            gpytorch.means.ConstantMean(), num_tasks=n_tasks
        )
        self.covar_module = gpytorch.kernels.MultitaskKernel(
            gpytorch.kernels.RBFKernel(), num_tasks=n_tasks, rank=1
        )

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultitaskMultivariateNormal(mean_x, covar_x)


input_dim = 1
output_dim = 2
n_train = 10
train_x = torch.randn(n_train, input_dim)
train_y = torch.randn(n_train, output_dim)

likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=output_dim)
model = MultitaskGPModel(train_x, train_y, likelihood, output_dim)

model.train()
model.eval()  # switch to eval mode so calls return posterior predictions

# get a posterior to fill in caches
model(torch.randn(n_train, input_dim))

# Generate some new data and get fantasy model
n_new = 5
new_x = torch.randn(n_new, input_dim)
new_y = torch.randn(n_new, output_dim)

model.get_fantasy_model(new_x, new_y)

for varying values of output_dim; a small harness for this is sketched below.
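
For example, the snippet can be wrapped in a loop to exercise several task counts; this reuses MultitaskGPModel, train_x, new_x, n_train, n_new, and input_dim from above:

for output_dim in (2, 3, 4):
    likelihood = gpytorch.likelihoods.MultitaskGaussianLikelihood(num_tasks=output_dim)
    model = MultitaskGPModel(train_x, torch.randn(n_train, output_dim), likelihood, output_dim)
    model.eval()
    model(torch.randn(n_train, input_dim))  # fill in prediction caches
    model.get_fantasy_model(new_x, torch.randn(n_new, output_dim))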

@gpleiss (Member) left a comment:

You should assume that all of these variables could come from batched GPs, so the reshape call could cause some issues.
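
For example (a hypothetical illustration of this concern, not code from the PR): with batched GPs the targets carry extra leading batch dimensions, so a reshape written for the unbatched case can run without error while silently mixing entries across the batch:

import torch

targets = torch.randn(3, 10, 2)  # batch of 3 GPs, 10 points, 2 tasks
# Unbatched-style flatten: no error is raised, but rows now mix together
# points, tasks, and batch elements.
flat_wrong = targets.reshape(10, -1)  # shape (10, 6); batch axis destroyed
# Batch-aware flatten: keeps every leading batch dimension intact.
flat_safe = targets.reshape(*targets.shape[:-2], -1)  # shape (3, 20)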

Labels: None yet
Projects: None yet
3 participants