In [1]:
require 'nngraph';
The first type of new module you'll encounter is the Identity module. It simply takes whatever input it receives and passes it on, unchanged, to the next layer.
In [2]:
a = torch.Tensor{1,2,3}
Important note: the code below is plain nn, not nngraph yet. However, the Identity module plays a central role in nngraph.
In [3]:
module1 = nn.Identity()
In [4]:
module1:forward(a)
Out[4]:
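As a quick sanity check (a minimal sketch; the gradient values are made up for illustration), the backward pass of nn.Identity likewise hands gradients through unchanged:
In [ ]:
-- Returns the gradient tensor exactly as given.
module1:backward(a, torch.Tensor{0.1, 0.2, 0.3})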
Here is how this would be written in nngraph:
In [5]:
-- Notice the extra (). Calling the module a second time wraps it in a graph node;
-- any arguments to that call are the parent nodes it is connected to.
x1 = nn.Identity()()
m = nn.gModule({x1},{x1})
In [6]:
m:forward(a)
Out[6]:
gModule is the master module: it specifies the input and output nodes of the graph. In the graph above, the input node and the output node are both x1.
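If you have graphviz installed, nngraph can also render the graph, which helps when debugging larger graphs; a quick sketch (the title string is arbitrary):
In [ ]:
-- Draws the forward graph of m.
graph.dot(m.fg, 'identity_graph')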
Let's try something more complex than this.
x = a + b
In [7]:
-- Declare some tensors
t1 = torch.Tensor{1,2,3}
t2 = torch.Tensor{3,4,5}
Without nngraph.
In [8]:
-- nn.CAddTable sums the tensors in its input table element-wise. Their dimensions must match.
a = nn.CAddTable()
In [9]:
a:forward({t1,t2})
Out[9]:
With nngraph.
In [10]:
a = nn.Identity()()
b = nn.Identity()()
x = nn.CAddTable()({a,b})
m = nn.gModule({a,b},{x})
In [11]:
m:forward({t1,t2})
Out[11]:
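Gradients flow backwards through the whole graph as well. A minimal sketch (the all-ones gradient is just for illustration); since the graph computes a sum, each input receives the output gradient unchanged:
In [ ]:
-- Returns a table with one gradient per input node.
m:backward({t1, t2}, torch.Tensor{1, 1, 1})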
Element-wise subtraction and multiplication
In [12]:
a = nn.Identity()()
b = nn.Identity()()
x = nn.CSubTable()({a,b})
m = nn.gModule({a,b},{x})
In [13]:
m:forward({t1,t2})
Out[13]:
In [14]:
a = nn.Identity()()
b = nn.Identity()()
x = nn.CMulTable()({a,b})
m = nn.gModule({a,b},{x})
In [15]:
m:forward({t1,t2})
Out[15]:
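A gModule can also have several outputs. A minimal sketch that computes both the difference and the element-wise product in a single graph:
In [ ]:
a = nn.Identity()()
b = nn.Identity()()
sub = nn.CSubTable()({a, b})
mul = nn.CMulTable()({a, b})
m2 = nn.gModule({a, b}, {sub, mul})
-- Returns a table: {t1 - t2, element-wise product of t1 and t2}
m2:forward({t1, t2})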
Select one element out of a table.
In [16]:
k = {5, torch.Tensor{1,2,3}}
In [17]:
k
Out[17]:
In [18]:
a = nn.Identity()()
x = nn.SelectTable(1)(a)
m = nn.gModule({a},{x})
In [19]:
m:forward(k)
Out[19]:
A negative index selects from the end of the table.
In [22]:
a = nn.Identity()()
x = nn.SelectTable(-1)(a) -- -1 selects the last entry, here the tensor
m = nn.gModule({a},{x})
In [23]:
m:forward(k)
Out[23]:
nn.Narrow(dim, offset, size) narrows dimension dim, keeping size elements starting at index offset. For example, take a 5x2 tensor of ones:
In [24]:
torch.Tensor(5,2):fill(1)
Out[24]:
In [25]:
-- Keep 3 rows of dimension 1, starting at row 2: a 3x2 result.
nn.Narrow(1,2,3):forward(torch.Tensor(5,2):fill(1))
Out[25]:
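Narrow works on any dimension. For instance (a minimal sketch), keeping a single column of the same tensor:
In [ ]:
-- Keep 1 column of dimension 2, starting at column 2: a 5x1 tensor of ones.
nn.Narrow(2, 2, 1):forward(torch.Tensor(5,2):fill(1))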
nn.LookupTable(vocab_size, word_vec_size) creates a learnable vector of size word_vec_size for each index from 1 to vocab_size; here vocab_size is 4 and word_vec_size is 5.
In [26]:
m = nn.LookupTable(4, 5)
In [27]:
m:forward(torch.Tensor{4})
Out[27]:
In [28]:
m.weight
Out[28]:
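Looking up several indices at once returns one row per index; a minimal sketch:
In [ ]:
-- A 3x5 tensor: rows 1, 2 and 4 of m.weight.
m:forward(torch.Tensor{1, 2, 4})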
nn.ConcatTable applies every module added to it to the same input, in parallel, and returns a table of their outputs.
In [29]:
m = nn.ConcatTable()
In [30]:
m:add(nn.Linear(5,2));
In [31]:
m:add(nn.Linear(5,3));
In [24]:
m:forward(torch.randn(5))
Out[24]:
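A common follow-up pattern (a minimal sketch) is to join the parallel outputs back into a single vector with nn.JoinTable:
In [ ]:
net = nn.Sequential()
net:add(m)               -- the ConcatTable above: outputs tensors of size 2 and 3
net:add(nn.JoinTable(1)) -- concatenates them into one size-5 vector
net:forward(torch.randn(5))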
nn.SplitTable splits a tensor into a table of tensors along a given dimension.
In [26]:
m = nn.SplitTable(1)
In [27]:
m:forward(torch.rand(3,2))
Out[27]:
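Splitting along the second dimension instead gives one tensor per column; a minimal sketch:
In [ ]:
-- A table of two tensors, each of size 3.
nn.SplitTable(2):forward(torch.rand(3,2))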