Deep Learning in Clojure from Scratch to GPU: Learning a Regression

The network returned a vector of ones because the output activation is the sigmoid function, which saturates at 1 for large inputs. Here we are doing regression, which is more difficult: the network has to learn to approximate the actual real value of the function. We can never expect to get the exact floating point values that the real function returns, especially not for test observations that the network hasn't seen during the learning phase, but the difference should stay within an acceptable range.
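To see why a sigmoid output layer produces a vector of ones, here is a minimal sketch in plain Clojure (no Neanderthal, just `Math/exp`; the function name `sigmoid` and the sample pre-activations are mine, not from the original article). Sigmoid squashes every input into the open interval (0, 1), so any reasonably large pre-activation maps to a value indistinguishable from 1 in floating point, and no real-valued regression target outside (0, 1) can ever be reached.

```clojure
(defn sigmoid
  "Logistic sigmoid: 1 / (1 + e^-x). Output is always strictly between 0 and 1."
  [^double x]
  (/ 1.0 (+ 1.0 (Math/exp (- x)))))

;; For large pre-activations the output saturates very close to 1,
;; so a whole layer of them prints as a vector of ones.
(map sigmoid [5.0 10.0 20.0])

;; A regression target such as 42.0 is unreachable: sigmoid < 1 always,
;; which is why the output layer must not use sigmoid for regression.
(sigmoid 1000.0)
```

This is why a linear (identity) activation is the usual choice for the output layer in regression, while sigmoid remains fine for the hidden layers.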

Source: dragan.rocks