
For convolutional layers, convert_to_analog and convert_to_analog_mapped behave differently #627

Open
coreylammie opened this issue Mar 13, 2024 · 2 comments
Labels: bug (Something isn't working), good first issue (Good for newcomers)

Comments

coreylammie (Contributor) commented Mar 13, 2024

Description

For convolutional layers, convert_to_analog and convert_to_analog_mapped behave differently. For linear layers, they behave the same.

This is unexpected behavior considering the following change log entry:

aihwkit/CHANGELOG.md

Lines 120 to 121 in d993b5a

* `convert_to_analog` now also considers mapping. Set
`mapping.max_input_size = 0` and `mapping.max_output_size = 0` to avoid this. (\#512)
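
Following that change-log note, mapping can presumably be switched off on the config so that `convert_to_analog` keeps a single tile per layer. A minimal sketch (the `rpu_config.mapping` fields are the ones used in the MWE below; `analog_model` and the reuse of `model` are illustrative):

from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.nn.conversion import convert_to_analog

# Sketch: setting both mapping limits to 0 disables weight splitting,
# so convert_to_analog should keep one tile per layer.
rpu_config = InferenceRPUConfig()
rpu_config.mapping.max_input_size = 0
rpu_config.mapping.max_output_size = 0
analog_model = convert_to_analog(model, rpu_config)  # `model` as in the MWE below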

MWE:

import torch
import torch.nn as nn
import torch.nn.functional as F
from aihwkit.simulator.configs import InferenceRPUConfig
from aihwkit.nn.conversion import convert_to_analog, convert_to_analog_mapped

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 32, 3, 1)
        self.conv2 = nn.Conv2d(32, 64, 3, 1)
        self.dropout1 = nn.Dropout(0.25)
        self.dropout2 = nn.Dropout(0.5)
        self.fc1 = nn.Linear(9216, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = self.conv1(x)
        x = F.relu(x)
        x = self.conv2(x)
        x = F.relu(x)
        x = F.max_pool2d(x, 2)
        x = self.dropout1(x)
        x = torch.flatten(x, 1)
        x = self.fc1(x)
        x = F.relu(x)
        x = self.dropout2(x)
        x = self.fc2(x)
        output = F.log_softmax(x, dim=1)
        return output
    
if __name__ == "__main__":
    model = Net()
    rpu_config = InferenceRPUConfig()
    # Force small tiles so that weight splitting (mapping) across tiles is easy to observe.
    rpu_config.mapping.max_input_size = 10
    rpu_config.mapping.max_output_size = 10
    model_1 = convert_to_analog(model, rpu_config)
    print('convert_to_analog - conv1')
    for tile in model_1.conv1.analog_tiles():
        print("\t", tile)

    print('convert_to_analog - fc2')
    for tile in model_1.fc2.analog_tiles():
        print("\t", tile)

    model_2 = convert_to_analog_mapped(model, rpu_config)
    print('convert_to_analog_mapped - conv1')
    for tile in model_2.conv1.analog_tiles():
        print("\t", tile)

    print('convert_to_analog_mapped - fc2')
    for tile in model_2.fc2.analog_tiles():
        print("\t", tile)
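
A quick way to quantify the mismatch (a sketch using only the names from the MWE above; exact tile counts depend on the layer shapes) is to count how many tiles each conversion produces per layer:

n_conv1_analog = len(list(model_1.conv1.analog_tiles()))  # convert_to_analog
n_conv1_mapped = len(list(model_2.conv1.analog_tiles()))  # convert_to_analog_mapped
n_fc2_analog = len(list(model_1.fc2.analog_tiles()))
n_fc2_mapped = len(list(model_2.fc2.analog_tiles()))
# Per the report, the fc2 counts should match while the conv1 counts are expected to differ.
print("conv1 tiles:", n_conv1_analog, "vs", n_conv1_mapped)
print("fc2 tiles:", n_fc2_analog, "vs", n_fc2_mapped)
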
@coreylammie coreylammie added the bug Something isn't working label Mar 13, 2024
maljoras (Collaborator) commented:

That's a known issue; they are not supposed to behave the same. The mapped version is deprecated, but if I remember correctly the conv layer has not yet been adapted to the tile module array.

PabloCarmona (Collaborator) commented:

We are currently running some tests and investigating how to adapt the new Conv2d layer to our tile modules.

@PabloCarmona PabloCarmona added good first issue Good for newcomers and removed status:doing labels Jan 16, 2025