Models

import torch

print(torch.cuda.is_available())
True

source

get_model_class

 get_model_class (model_name:str)

source

regist_model

 regist_model (model_class)
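Taken together, `regist_model` and `get_model_class` suggest a simple name-keyed model registry. Below is a minimal self-contained sketch of that pattern; the dict-based storage, lookup by `__name__`, and decorator usage are illustrative assumptions, not this library's actual implementation:

```python
# Sketch of a name-keyed model registry (assumed behavior, for illustration).
_MODEL_REGISTRY = {}

def regist_model(model_class):
    # Register a class under its own name so it can be looked up later.
    _MODEL_REGISTRY[model_class.__name__] = model_class
    return model_class  # returning the class lets this double as a decorator

def get_model_class(model_name: str):
    # Resolve a registered class by name; fail loudly on unknown names.
    try:
        return _MODEL_REGISTRY[model_name]
    except KeyError:
        raise KeyError(f"Unknown model: {model_name!r}") from None

@regist_model
class TinyModel:
    pass

print(get_model_class('TinyModel').__name__)  # -> TinyModel
```

With this sketch, any class decorated with `regist_model` can later be recovered by name via `get_model_class`.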

NMFlow

Noise Modeler


source

NMFlow

 NMFlow (in_ch=1, ch_exp_coef=1.0, width_exp_coef=2.0, num_bits=16,
         conv_net_feats=16, pre_arch='UD',
         arch='NE|SAL|SDL|CL2|SAL|SDL|CL2', device='cuda', codes=None)

Base class for all neural network modules.

Your models should also subclass this class.

Modules can also contain other Modules, allowing them to be nested in a tree structure. You can assign the submodules as regular attributes:

import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))

Submodules assigned in this way will be registered, and their parameters will be converted too when you call to(), etc.

Note: as per the example above, an __init__() call to the parent class must be made before assignment on the child.

training (bool): whether this module is in training or evaluation mode.
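The `arch` and `pre_arch` arguments above read as pipe-separated sequences of layer codes (NE, SAL, SDL, CL2, UD, ...). A minimal sketch of splitting such a string into an ordered build list; the helper name `parse_arch` and the idea that the flow is assembled by iterating these codes are assumptions for illustration:

```python
def parse_arch(arch: str) -> list:
    # Split a pipe-separated architecture string into its ordered layer codes.
    return [code for code in arch.split('|') if code]

print(parse_arch('NE|SAL|SDL|CL2|SAL|SDL|CL2'))
# -> ['NE', 'SAL', 'SDL', 'CL2', 'SAL', 'SDL', 'CL2']
```

Each code would then select one building block when the network is constructed, in the order given.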

# NMFlow(arch="NE|SAL|SDL|CL2")

Denoiser


source

NMFlowDenoiser

 NMFlowDenoiser (denoiser, kwargs_flow, flow_pth_path, num_bits=8)


NMFlowGAN

Generator


source

NMFlowGANGenerator

 NMFlowGANGenerator (kwargs_unet, kwargs_flow)


kwargs_unet = {
        'depth': 1,
}
kwargs_flow = {
        'device': 'cuda',
        'arch': 'NE',
        'num_bits': 8,
}

model = NMFlowGANGenerator(kwargs_unet, kwargs_flow).cuda()

noisy = torch.randint(256, [1, 1, 2, 2], device=kwargs_flow['device'])
clean = torch.randint(256, [1, 1, 2, 2], device=kwargs_flow['device'])

kwargs = dict()
kwargs['camera'] = torch.tensor([2], dtype=torch.float32, device=kwargs_flow['device'])

z, objectives, y, x = model(noisy, clean, kwargs=kwargs)
print('clean: ', clean)
print('noisy: ', noisy)
print('z: ', z)
print('obj: ', objectives)
print('x: ', x)
print('y: ', y)
clean:  tensor([[[[ 41,  36],
          [125, 172]]]], device='cuda:0')
noisy:  tensor([[[[163,   3],
          [237, 143]]]], device='cuda:0')
z:  tensor([[[[ 0.4746, -0.1307],
          [ 0.4361, -0.1141]]]], device='cuda:0')
obj:  tensor([-22.1807], device='cuda:0')
x:  tensor([[[[-41.,  71.],
          [-79.,  83.]]]], device='cuda:0')
y:  tensor([[[[-67.2944, 142.6038],
          [148.0360, 288.9552]]]], device='cuda:0', grad_fn=<AddBackward0>)

Test

NMFlow(**kwargs_flow).cuda().sample(kwargs)
tensor([[[[ 13.,   0.],
          [118., 255.]]]], device='cuda:0')
UNet(**kwargs_unet).cuda()(clean * 1.)
tensor([[[[ 40.4377,  36.2475],
          [125.0727, 171.7873]]]], device='cuda:0', grad_fn=<AddBackward0>)

Critic


source

NMFlowGANCritic

 NMFlowGANCritic (in_ch=1, nc=64, num_bits=8)

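`num_bits=8` appears throughout these constructors and suggests inputs are integer intensities in [0, 2**num_bits - 1] (e.g. 0–255 for 8 bits). A common normalization for feeding such values into a network, shown here as a plain-Python sketch (an assumption about how `num_bits` is used, not code from this library):

```python
def normalize_intensity(value: float, num_bits: int = 8) -> float:
    # Map an integer intensity in [0, 2**num_bits - 1] onto [0.0, 1.0].
    return value / (2 ** num_bits - 1)

print(normalize_intensity(255))                 # -> 1.0
print(normalize_intensity(0))                   # -> 0.0
print(normalize_intensity(65535, num_bits=16))  # -> 1.0
```

The same scheme would explain `num_bits=16` in NMFlow above: 16-bit raw sensor data spans 0–65535 rather than 0–255.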


source

Discriminator_96

 Discriminator_96 (in_nc=3, nc=64)

Discriminator for 96x96 inputs; see Kai Zhang's KAIR repository: https://github.com/cszn/KAIR

GAN Denoiser


source

NMFlowGANDenoiser

 NMFlowGANDenoiser (denoiser, kwargs_flow, kwargs_unet, pretrained_path,
                    num_bits=8)


DnCNNFlowGAN


source

DnCNNFlowGAN

 DnCNNFlowGAN (kwargs_dncnn, kwargs_unet, kwargs_flow, pretrained_path,
               num_bits=8)


UNetFlowGAN


source

UNetFlowGAN

 UNetFlowGAN (kwargs_unet, kwargs_flow, pretrained_path, num_bits=8)



source

MyUNetFlowGAN

 MyUNetFlowGAN (kwargs_myunet, kwargs_unet, kwargs_flow, pretrained_path,
                num_bits=8)
