Neural Network
Are there any methods to use floats as inputs in a Multi-Layer Perceptron neural net?
Yes. You just... use the floats as the inputs. You may want to transform them in some way first: scaling, shifting by a bias, or some other preprocessing function that uses domain knowledge about the inputs and possibly the problem.
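A minimal sketch of that kind of preprocessing (Python/numpy; the statistics here are made-up placeholders, and in practice you'd compute them from your training data):

```python
import numpy as np

def standardize(x, mean, std):
    """Scale float inputs to roughly zero mean and unit variance.

    mean/std would normally be computed from the training set;
    the values below are placeholders for illustration only."""
    return (x - mean) / std

# Hypothetical raw float inputs (e.g. sensor readings).
raw = np.array([12.5, 0.003, 980.0])
mean = np.array([10.0, 0.002, 1000.0])
std = np.array([5.0, 0.001, 50.0])

net_input = standardize(raw, mean, std)  # values now near [-1, 1]
```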
Thanks for the quick answer, I see now that it might work with floats.
The problem is compatible with float inputs, of course.
And the bias is unnecessary, I think, because the inputs in my NN will never be 0.
In fact, it's usually the case that input units will accept floats. This is because the units higher up in the network must almost always accept floats, assuming the outputs of the neurons are smooth rather than step functions.
Since it's easier to program one general unit than several different types of unit, your input units will be just like your hidden ones, and so must accept floats.
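A minimal sketch of such a general-purpose unit (Python; the weights and inputs are just illustrative values, not from any real network):

```python
import math

def unit(inputs, weights, bias):
    """One general unit: a weighted sum of float inputs passed
    through a smooth (sigmoid) activation. The same code works
    whether the inputs come from the outside world or from the
    outputs of a previous layer."""
    s = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1.0 / (1.0 + math.exp(-s))

# Float inputs feed straight in; no special input-unit type needed.
out = unit([0.7, -1.3, 2.5], [0.1, 0.4, -0.2], bias=0.05)
```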
Quote: Original post by brolin
And the bias is unnecessary, I think, because the inputs in my NN will never be 0.
You might not need a bias unit, but you should be certain that you're never going to have zero as an input. It's common, even when using floats, to still have a zero input.
I would be amazed if you didn't need a bias neuron. Unless you have reason to believe that your target function passes through the origin, the bias is necessary to allow the fitted function to shift away from the origin.
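A toy illustration of that point with a single linear unit (hypothetical numbers):

```python
# Without a bias, a linear unit is forced through the origin:
# output(0) = w * 0 = 0 no matter what w is. If the target is,
# say, f(x) = 2x + 3, which does not pass through the origin,
# no choice of w alone can fit it; the bias b supplies the offset.

def unit_no_bias(x, w):
    return w * x

def unit_with_bias(x, w, b):
    return w * x + b

target = lambda x: 2 * x + 3
print(unit_no_bias(0.0, w=5.0))                        # always 0.0 at x = 0
print(unit_with_bias(0.0, w=2.0, b=3.0) == target(0.0))  # True: bias matches the offset
```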
-Kirk
If by your question about using floats as inputs you're actually asking "can an MLP network be used to map a continuous input space into a continuous/discrete output space?", then the answer is yes.
However, you'll find that you achieve better performance with fewer neurons if you use a basis function network. If you want a purely continuous mapping, radial basis function (RBF) networks are very useful. If you need to map from continuous inputs to discrete decisions, a fuzzy-neural or b-spline network can do this for you.
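A minimal sketch of an RBF network's forward pass (Python/numpy; the centers, widths, and weights are made-up placeholders, and in practice you'd fit them, e.g. centers by clustering and output weights by least squares):

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a simple RBF network: Gaussian basis
    functions around fixed centers, combined linearly.
    phi_i(x) = exp(-||x - c_i||^2 / (2 * sigma_i^2))"""
    d2 = np.sum((centers - x) ** 2, axis=1)   # squared distance to each center
    phi = np.exp(-d2 / (2.0 * widths ** 2))   # basis function activations
    return phi @ weights                       # linear output layer

# Placeholder parameters for a 2-input, 3-basis-function network.
centers = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
widths = np.array([0.5, 0.5, 0.5])
weights = np.array([1.0, -2.0, 0.5])

y = rbf_forward(np.array([0.2, 0.8]), centers, widths, weights)
```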
Cheers,
Timkin