Neural Network from Scratch

The mathematics and working of a neural network are shown in this blog.
Categories: ML, Coding

Author: Kishan Ved

Published: August 9, 2023

Neural Network from scratch

This Colab contains forward propagation in a neural network, written from scratch in Python.

The aim is to create a neural network that predicts whether coffee beans have been roasted properly, based on the roasting temperature and duration.

The logic of each neuron is the same as logistic regression: a weighted sum of the inputs followed by a sigmoid activation.
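As a minimal sketch of that idea (the names neuron, w and b here are illustrative and not used later in the post), a single neuron computes a weighted sum of its inputs plus a bias, then passes it through the sigmoid:

import numpy as np

def neuron(x, w, b):
    # One neuron = logistic regression: linear combination, then sigmoid
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))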

Generating coffee roasting data

import numpy as np

def load_coffee_data():
    # Creates a coffee roasting data set.
    # roasting duration: 12-15 minutes is best
    # temperature range: 175-260 C is best

    rng = np.random.default_rng(2)
    X = rng.random(400).reshape(-1,2)   # 200 samples, 2 features
    X[:,1] = X[:,1] * 4 + 11.5          # duration spans 11.5-15.5 min; 12-15 min is best
    X[:,0] = X[:,0] * (285-150) + 150   # temperature spans 150-285 C; 175-260 C (~350-500 F) is best
    Y = np.zeros(len(X))

    i = 0
    for t, d in X:  # t = temperature, d = duration
        y = -3/(260-175)*t + 21  # boundary line: hotter roasts require shorter durations
        if (t > 175 and t < 260 and d > 12 and d < 15 and d <= y):
            Y[i] = 1
        else:
            Y[i] = 0
        i += 1

    return (X, Y.reshape(-1,1))
X,Y = load_coffee_data()
print(X)
[[185.31763812  12.69396457]
 [259.92047498  11.86766377]
 [231.01357101  14.41424211]
 ...
 [220.61000617  12.7998907 ]
 [284.99434167  12.72829382]]
print(Y)
[[1.]
 [0.]
 [0.]
 ...
 [1.]
 [0.]]

Normalizing the data

def Normalization_(X): # Same as tensorflow.keras.layers.Normalization(axis=-1)
  n_samples = X.shape[0]
  n_features = X.shape[1]
  mean_X = np.zeros(n_features)
  std_X = np.zeros(n_features)

  # Per-feature mean
  for i in range(n_features):
    for j in range(n_samples):
      mean_X[i] += X[j][i]
    mean_X[i] /= n_samples

  # Per-feature (population) standard deviation
  for i in range(n_features):
    for j in range(n_samples):
      std_X[i] += (X[j][i]-mean_X[i])**2
    std_X[i] /= n_samples
    std_X[i] = std_X[i]**0.5

  # Z-score each feature; note this modifies X in place
  for i in range(n_features):
    if std_X[i] == 0: std_X[i] = 0.000001 # guard against division by zero
    for j in range(n_samples):
      X[j][i] = (X[j][i]-mean_X[i])/std_X[i]

  return X
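For comparison, the same z-score normalization can be written in a few vectorized NumPy lines (a sketch, not used in the rest of the post):

def normalize_vectorized(X):
    mu = X.mean(axis=0)                        # per-feature mean
    sigma = X.std(axis=0)                      # per-feature (population) std
    sigma = np.where(sigma == 0, 1e-6, sigma)  # guard against division by zero
    return (X - mu) / sigma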

The normalized data looks like this:

Xn = Normalization_(X)
print(Xn)
[[-0.83455487 -0.65287939]
 [ 1.03230561 -1.38514475]
 [ 0.3089396   0.87162554]
 ...
 [ 0.04860092 -0.55900796]
 [ 1.65975375 -0.62245691]]

Activation function - the Sigmoid function

def sigmoid(z):
    z = np.clip(z, -500, 500 ) # protect against overflow
    g = 1.0/(1.0+np.exp(-z))
    return g
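A quick sanity check: the sigmoid squashes any real input into (0, 1), and sigmoid(0) = 0.5:

print(sigmoid(np.array([-4.0, 0.0, 4.0]))) # approx. [0.018 0.5 0.982]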

Dense function to create a new layer in our neural network

def dense_(a_in, W, b, g): # This function creates a layer in our neural network
  # Each column of W holds the weight vector w of one neuron

  n = W.shape[1] # Number of neurons in the layer (one weight column per neuron)
  a_out = np.zeros(n) # Output/activation of this layer

  for i in range(n):
    w = W[:,i] # Extract the i-th neuron's weights (a column of W)
    z = np.dot(w, a_in) + b[i]
    a_out[i] = g(z)
  return a_out
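The loop above evaluates one neuron at a time. Since each z is just a dot product, the whole layer can also be computed with a single matrix product; a vectorized sketch equivalent to dense_ for a 1-D input:

def dense_vectorized(a_in, W, b, g):
    # a_in @ W computes all the per-neuron dot products at once
    return g(a_in @ W + b)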

Sequential function to connect layers together one after another

def sequential_(X, W1, b1, W2, b2):
  a1 = dense_(X, W1, b1, sigmoid)  # hidden layer
  a2 = dense_(a1, W2, b2, sigmoid) # output layer
  return a2
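The same pattern extends to any depth by iterating dense_ over a list of (W, b) pairs. A small sketch (sequential_n and params are illustrative names, not used below):

def sequential_n(x, params, g):
    # params is a list of (W, b) tuples, one per layer
    a = x
    for W, b in params:
        a = dense_(a, W, b, g)
    return a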

Inferencing our neural network

Inferencing a neural network means computing its predictions using pre-trained values of the parameters; no training is performed here.

W1_tmp = np.array( [[-8.93,  0.29, 12.9 ], [-0.1,  -7.32, 10.81]] ) # 2 X 3
b1_tmp = np.array( [-9.82, -9.28,  0.96] )
W2_tmp = np.array( [[-31.18], [-27.59], [-32.56]] ) # 3 X 1
b2_tmp = np.array( [15.41] )
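These shapes define a 2-3-1 network: two inputs, a hidden layer of three neurons, and a single output neuron. A quick shape check (illustrative):

print(W1_tmp.shape, b1_tmp.shape) # (2, 3) (3,)
print(W2_tmp.shape, b2_tmp.shape) # (3, 1) (1,)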

Predict function to predict values based on our neural network

def predict_(X,W1_tmp,b1_tmp,W2_tmp,b2_tmp):
  Xn = Normalization_(X) # normalizes X by its own statistics (see the caveat below)
  n = Xn.shape[0]
  pred = np.zeros(n)
  for i in range(n):
    pred[i] = sequential_(Xn[i],W1_tmp,b1_tmp,W2_tmp,b2_tmp)

  yhat = (pred>=0.5).astype(int) # threshold the probabilities at 0.5
  return yhat
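One caveat: Normalization_ computes the mean and standard deviation of whatever array it is given, so predict_ normalizes the test set by its own statistics rather than by the training set's. In a proper pipeline, the statistics are fitted on the training data once and reused at prediction time. A minimal sketch, where normalize_with is a hypothetical helper not used elsewhere in this post:

def normalize_with(X, mu, sigma):
    # Apply previously fitted (training-set) statistics to new data
    return (X - mu) / np.where(sigma == 0, 1e-6, sigma)

# mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)  # fit once on training data
# Xn_test = normalize_with(X_test, mu, sigma)            # reuse at prediction time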

Testing our neural network

X_tst = np.array([
    [200,13.9],  # positive example
    [200,17]])   # negative example
predictions = predict_(X_tst, W1_tmp, b1_tmp, W2_tmp, b2_tmp)
print(predictions)
[1 0]