Practical Dependent Types in Haskell: Type-Safe Neural Networks (Part 1) - Exercises

This IHaskell Jupyter notebook contains my attempts at the exercises posed at the end of Practical Dependent Types in Haskell: Type-Safe Neural Networks (Part 1) by Justin Le.

Original author: David Banas capn.freako@gmail.com
Original date: January 10, 2018

Copyright © 2018 David Banas; all rights reserved worldwide.

Language Pragmas, Imports, and Base Type Definitions


In [1]:
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE ExplicitForAll #-}
{-# LANGUAGE GADTs #-}
{-# LANGUAGE KindSignatures #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeOperators #-}

import Control.Monad.Random
import Data.Singletons.Prelude
import Data.Singletons.TypeLits
import GHC.TypeLits
import Numeric.LinearAlgebra.Static

data Weights i o = W { wBiases :: !(R o)
                     , wNodes  :: !(L o i)
                     }                      -- an "o x i" layer
                     
data Network :: Nat -> [Nat] -> Nat -> * where
    O     :: !(Weights i o)
          -> Network i '[] o
    (:&~) :: KnownNat h
          => !(Weights i h)
          -> !(Network h hs o)
          -> Network i (h ': hs) o
infixr 5 :&~

randomWeights :: (MonadRandom m, KnownNat i, KnownNat o)
              => m (Weights i o)
randomWeights = do
    s1 :: Int <- getRandom
    s2 :: Int <- getRandom
    let wB = randomVector  s1 Uniform * 2 - 1
        wN = uniformSample s2 (-1) 1
    return $ W wB wN
    
randomNet :: forall m i hs o. (MonadRandom m, KnownNat i, SingI hs, KnownNat o)
          => m (Network i hs o)
randomNet = go sing
  where
    go :: forall h hs'. KnownNat h
       => Sing hs'
       -> m (Network h hs' o)
    go = \case
        SNil            ->     O <$> randomWeights
        SNat `SCons` ss -> (:&~) <$> randomWeights <*> go ss
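
As a quick sanity check of randomNet (a sketch, not one of the executed cells here), the SingI hs constraint lets the hidden structure be chosen entirely at the type level and the generator run with evalRandIO:

-- Sketch: generate a 5-[4,3]-1 network; the shape comes solely from the
-- type annotation, via the SingI '[4, 3] instance.
sampleNet :: IO (Network 5 '[4, 3] 1)
sampleNet = evalRandIO randomNet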

Popping off the input layer

Write a function that “pops” the input layer off of a Network, returning both the input layer’s weights and the rest of the network, (Weights i h, Network h hs o).


In [2]:
pop :: (KnownNat i, KnownNat o, KnownNat h)
    => Network i (h ': hs) o
    -> (Weights i h, Network h hs o)
pop (w :&~ n) = (w, n)

Think about what its type would have to be. Could it possibly be called with a network that cannot be popped? (that is, that has only one layer?)

No, because the (h ': hs) in pop's type signature requires at least one hidden layer, so applying it to a single-layer network is flagged as an error at compile time.

Let's confirm this...


In [3]:
r = do
  (w1 :: Weights 5 3) <- randomWeights
  (w2 :: Weights 3 1) <- randomWeights
  let r1 = pop $ O w2
  return r1


<interactive>:4:18: error:
• Couldn't match type ‘'[]’ with ‘h : hs’
Expected type: Network 3 (h : hs) 1
Actual type: Network 3 '[] 1
• In the second argument of ‘($)’, namely ‘O w2’
In the expression: pop $ O w2
In an equation for ‘r1’: r1 = pop $ O w2
• Relevant bindings include
r1 :: (Weights 3 h, Network h hs 1) (bound at <interactive>:4:7)
r :: m (Weights 3 h, Network h hs 1) (bound at <interactive>:1:1)

Okay, good, our error was caught at compile time as expected. Now, let's make sure a correct case goes through...


In [4]:
r = do
  (w1 :: Weights 5 3) <- randomWeights
  (w2 :: Weights 3 1) <- randomWeights
  let r1 = pop $ w1 :&~ O w2
  return r1

Okay, looks good.
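
To make the shape bookkeeping explicit (a sketch, not one of the cells above), specialising pop at a concrete shape shows exactly what comes back:

-- Sketch: at i = 5, h = 3, hs = '[], o = 1 the popped pair is
-- (Weights 5 3, Network 3 '[] 1); anything else would not type-check.
popCheck :: Network 5 '[3] 1 -> (Weights 5 3, Network 3 '[] 1)
popCheck = pop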

Point-by-point Addition of Two Networks

Write a function that takes two networks of the same dimensions and adds together their weights. Remember that L m n has a Num instance that adds the matrices together element-by-element.


In [5]:
addW :: (KnownNat i, KnownNat o)
     => Weights i o
     -> Weights i o
     -> Weights i o
addW (W b1 w1) (W b2 w2) = W (b1 + b2) (w1 + w2)

addN :: (KnownNat i, KnownNat o)
     => Network i hs o
     -> Network i hs o
     -> Network i hs o
addN (O w1) (O w2) = O (addW w1 w2)
addN (w1 :&~ n1) (w2 :&~ n2) = addW w1 w2 :&~ addN n1 n2

Could this function ever be accidentally called on two networks that have different internal structures?

I don't think so, since i, hs, and o are all shared by the two argument types in the signature, so any structural mismatch should be rejected, but let's confirm...


In [6]:
-- Test different network depths.
r = do
  (w1 :: Weights 5 3) <- randomWeights
  (w2 :: Weights 3 1) <- randomWeights
  let r1 = addN (w1 :&~ O w2) (O w1)
  return r1


<interactive>:4:32: error:
• Couldn't match type ‘'[]’ with ‘'[3]’
Expected type: Network 5 '[3] 1
Actual type: Network 5 '[] 3
• In the second argument of ‘addN’, namely ‘(O w1)’
In the expression: addN (w1 :&~ O w2) (O w1)
In an equation for ‘r1’: r1 = addN (w1 :&~ O w2) (O w1)

Good, we expected failure.


In [7]:
-- Test different network widths.
r = do
  (w1 :: Weights 5 3) <- randomWeights
  (w2 :: Weights 3 1) <- randomWeights
  let r1 = addN (w1 :&~ O w2) (w2 :&~ O w1)
  return r1


<interactive>:4:39: error:
• Couldn't match type ‘5’ with ‘1’
Expected type: Network 1 '[] 3
Actual type: Network 5 '[] 3
• In the second argument of ‘(:&~)’, namely ‘O w1’
In the second argument of ‘addN’, namely ‘(w2 :&~ O w1)’
In the expression: addN (w1 :&~ O w2) (w2 :&~ O w1)
<interactive>:4:32: error:
• Couldn't match type ‘3’ with ‘5’
Expected type: Network 5 '[3] 1
Actual type: Network 3 '[1] 3
• In the second argument of ‘addN’, namely ‘(w2 :&~ O w1)’
In the expression: addN (w1 :&~ O w2) (w2 :&~ O w1)
In an equation for ‘r1’: r1 = addN (w1 :&~ O w2) (w2 :&~ O w1)

Good, we expected failure.


In [8]:
-- Test a correct case.
r = do
  (w1 :: Weights 5 3) <- randomWeights
  (w2 :: Weights 3 1) <- randomWeights
  let r1 = addN (w1 :&~ O w2) (w1 :&~ O w2)
  return r1

Good, we expected success.
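
For completeness, addN also composes directly with randomNet; a sketch (assumed, not an executed cell):

-- Sketch: summing two freshly generated networks of the same declared
-- shape; any mismatch in i, hs, or o would be rejected by the compiler.
sumOfRandom :: MonadRandom m => m (Network 5 '[3] 1)
sumOfRandom = addN <$> randomNet <*> randomNet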

Returning the hidden layer structure

Write a function that takes a Network i hs o and returns the singleton representing its hidden layer structure — Sing hs


In [12]:
hiddenSing :: (SingI hs) => Network i hs o -> Sing hs
hiddenSing (n :: Network i hs o) = sing :: Sing hs

r = do
  net <- randomNet
  return $ hiddenSing (net :: Network 5 '[3] 1)

:t r


r :: forall (m :: * -> *). MonadRandom m => m (Sing '[3])
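
As an aside, the SingI constraint on hiddenSing is not strictly necessary. A sketch of an alternative (not verified in this notebook) rebuilds the singleton by recursing on the network itself, using the KnownNat h evidence carried by each (:&~) constructor:

hiddenSing' :: Network i hs o -> Sing hs
hiddenSing' = \case
    O _       -> SNil
    _ :&~ net -> SNat `SCons` hiddenSing' net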