Multilayer Perceptron With Sigmoid Activation Produces Straight Line On Sin(2x) Regression
I'm trying to approximate noisy data from the sin(2x) function using a multilayer perceptron:

```python
# Get data
datasets = gen_datasets()

# Add noise
datasets['ysin_train'] = add_noise(datasets['ysin_train'])
```
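For context, a minimal sketch of what the data setup might look like is below; `gen_datasets()` and `add_noise()` are the asker's own helpers, so the sampling range, number of points, and noise scale here are assumptions:

```python
import numpy as np

# Hypothetical stand-in for gen_datasets()/add_noise():
# sample x on a fixed grid and add Gaussian noise to the sin(2x) targets.
rng = np.random.RandomState(0)
x_train = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1).astype(np.float32)
y_clean = np.sin(2.0 * x_train)
y_train = y_clean + rng.normal(scale=0.1, size=y_clean.shape).astype(np.float32)
```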
Solution 1:
The default value of stddev=1.0 in tf.random_normal, which you use for weight & bias initialization, is huge. Try an explicit value of stddev=0.01 for the weights; as for the biases, common practice is to initialize them to zero.

As an initial approach, I would also try a higher learning_rate of 0.01 (or maybe not - see the answer to a related question here).
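To make the suggestion concrete, here is a minimal sketch of how the initialization and learning rate could look in TF 1.x (matching the question's tf.random_normal); the layer sizes and the single-hidden-layer structure are assumptions, not the asker's actual network:

```python
import tensorflow as tf  # TF 1.x style, as in the question

# Assumed layer sizes for a 1-hidden-layer MLP regressing sin(2x)
n_in, n_hidden, n_out = 1, 20, 1

# Weights: small explicit stddev instead of the default stddev=1.0
W1 = tf.Variable(tf.random_normal([n_in, n_hidden], stddev=0.01))
W2 = tf.Variable(tf.random_normal([n_hidden, n_out], stddev=0.01))

# Biases: initialized to zero, as suggested above
b1 = tf.Variable(tf.zeros([n_hidden]))
b2 = tf.Variable(tf.zeros([n_out]))

x = tf.placeholder(tf.float32, [None, n_in])
y_true = tf.placeholder(tf.float32, [None, n_out])

hidden = tf.sigmoid(tf.matmul(x, W1) + b1)
y_pred = tf.matmul(hidden, W2) + b2  # linear output for regression

loss = tf.reduce_mean(tf.square(y_pred - y_true))
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(loss)
```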