java,data-structures,machine-learning,perceptron

If I understand your question, you're trying to count how many instances of each word in the global dictionary occur in a given file. I'd recommend creating an array of integers, where the index is the word's index into the global dictionary and the value is the number of occurrences of...
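A minimal Python sketch of that idea; the dictionary contents and helper names here are illustrative, not from the original post:

```python
# counts[i] holds how many times global_dict[i] appears in the document's
# tokens; index_of gives O(1) lookup from word to dictionary index.
global_dict = ["the", "atheism", "perceptron", "java"]
index_of = {word: i for i, word in enumerate(global_dict)}

def count_occurrences(tokens, dictionary, index_of):
    counts = [0] * len(dictionary)
    for token in tokens:
        i = index_of.get(token)
        if i is not None:          # ignore words outside the dictionary
            counts[i] += 1
    return counts

document = "the perceptron the java".split()
print(count_occurrences(document, global_dict, index_of))  # [2, 0, 1, 1]
```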

machine-learning,neural-network,perceptron

Changing the learning rate to 0.075 fixed the issue.

artificial-intelligence,neural-network,perceptron

No, you can't, not with the weights unchanged. But sigmoids are continuous approximations of binary threshold units, so the behaviour should be similar. The page says this: "Now replace all the perceptrons in the network by sigmoid neurons, and multiply the weights and biases by a positive constant c>0. Show that..."
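A quick numerical illustration of why scaling works away from the decision boundary (the pre-activation value 0.3 is chosen arbitrarily for the example):

```python
import math

# Scaling weights and biases by a large positive constant c makes a
# sigmoid neuron behave like a perceptron, provided the pre-activation
# z = w.x + b is nonzero.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

z = 0.3   # some nonzero pre-activation
for c in (1, 10, 100):
    print(c, sigmoid(c * z))
# As c grows, sigmoid(c*z) approaches 1 for z > 0 (and 0 for z < 0);
# at z == 0 it stays at 0.5, which is where the equivalence breaks down.
```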

java,machine-learning,perceptron

Let's establish some vocabulary up front (I guess you are using the 20-newsgroups dataset):
- "Class label": what you're trying to predict; in your binary case this is "atheism" vs. the rest.
- "Feature vector": what you input to your classifier.
- "Document": a single e-mail from the dataset...

math,machine-learning,neural-network,linear-algebra,perceptron

A linear function is f(x) = a x + b. If we take another linear function g(z) = c z + d, and apply g(f(x)) (which would be the equivalent of feeding the output of one linear layer as the input to the next linear layer) we get g(f(x)) =...
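The algebra can be checked numerically; the parameter values below are arbitrary:

```python
# With f(x) = a*x + b and g(z) = c*z + d, the composition is
# g(f(x)) = c*(a*x + b) + d = (c*a)*x + (c*b + d),
# which is again a single linear function - so stacking linear layers
# adds no expressive power.
a, b = 2.0, 1.0    # parameters of f (illustrative values)
c, d = 3.0, -4.0   # parameters of g

f = lambda x: a * x + b
g = lambda z: c * z + d
h = lambda x: (c * a) * x + (c * b + d)   # the collapsed linear form

for x in (-1.0, 0.0, 2.5):
    assert g(f(x)) == h(x)
print("composition equals one linear function")
```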

Some binary classifiers have an uncalibrated decision_function method that yields positive or negative values with the decision threshold at zero. It is possible to use that to compute a calibrated probability estimate of correct classification; see this ongoing pull request for instance: https://github.com/scikit-learn/scikit-learn/pull/1176...
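A hedged sketch of the underlying idea (Platt-style logistic calibration of raw scores); the parameters A and B here are illustrative, not fitted:

```python
import math

# Map raw decision_function scores to probabilities with a logistic
# (Platt) calibration p = 1 / (1 + exp(A*s + B)). In practice A and B
# are learned on held-out data (e.g. by scikit-learn's
# CalibratedClassifierCV); the values below are made up for illustration.
def calibrated_probability(score, A=-1.5, B=0.0):
    return 1.0 / (1.0 + math.exp(A * score + B))

for s in (-2.0, 0.0, 2.0):
    print(s, calibrated_probability(s))
# A score of 0 (the decision threshold) maps to probability 0.5;
# large positive scores map toward 1, large negative toward 0.
```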

python,algorithm,neural-network,perceptron

The problem is that you are not recomputing the output after the weights change, so the error signal remains constant and the weights change in the same way on every iteration. Change the code as follows:

def update(theta, x, n, target, output):
    for i in range(0, len(x)):
        output[i] = evaluate(theta, x[i])  # This line is...
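A self-contained sketch of the fix, with made-up data and a toy evaluate function (the original post's full code isn't shown here, so these bodies are assumptions):

```python
# Recompute the model's output at the start of every update so the error
# signal reflects the current weights. Names (theta, evaluate, update, n)
# follow the snippet above; the data and learning rate are illustrative.
def evaluate(theta, xi):
    # simple perceptron: threshold on the dot product
    s = sum(t * v for t, v in zip(theta, xi))
    return 1 if s > 0 else 0

def update(theta, x, n, target, output):
    for i in range(len(x)):
        output[i] = evaluate(theta, x[i])  # recompute before using the error
        error = target[i] - output[i]
        for j in range(len(theta)):
            theta[j] += n * error * x[i][j]
    return theta

# learn logical OR (with a bias input fixed at 1)
x = [[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]]
target = [0, 1, 1, 1]
theta = [0.0, 0.0, 0.0]
output = [0, 0, 0, 0]
for _ in range(10):
    update(theta, x, 0.1, target, output)
print([evaluate(theta, xi) for xi in x])  # → [0, 1, 1, 1]
```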

matlab,neural-network,linear-regression,backpropagation,perceptron

A neural network will generally not find or encode a formula like t = a + b*X1 + c*X2, unless you build a really simple one with no hidden layers and a linear output. If you did, then you could read the values [a,b,c] from the weights attached to the bias, input...
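To illustrate the no-hidden-layer case, here is a sketch (the data and true coefficients are made up) where plain gradient descent on a linear model recovers [a, b, c] directly as the bias and input weights:

```python
# A "network" with no hidden layer and linear output is just
# t = w0 + w1*X1 + w2*X2, so gradient descent on squared error drives
# the weights toward the generating coefficients [a, b, c].
true_a, true_b, true_c = 2.0, 3.0, -1.0
data = [(x1, x2, true_a + true_b * x1 + true_c * x2)
        for x1 in (-1.0, 0.0, 1.0, 2.0) for x2 in (-1.0, 0.5, 1.5)]

w = [0.0, 0.0, 0.0]          # [bias, weight for X1, weight for X2]
lr = 0.05
for _ in range(2000):
    for x1, x2, t in data:
        pred = w[0] + w[1] * x1 + w[2] * x2
        err = pred - t
        w[0] -= lr * err
        w[1] -= lr * err * x1
        w[2] -= lr * err * x2
print(w)  # ≈ [2.0, 3.0, -1.0]
```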

matlab,neural-network,perceptron

The tansig activation function essentially makes it possible for a neuron to become inactive due to saturation. A linear neuron is always active, so if one linear neuron has bad parameters, it will always affect the outcome of the classification. A higher number of neurons yields a higher probability of bad...
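A small numerical illustration of the saturation argument (pre-activation values chosen arbitrarily):

```python
import math

# A tansig/tanh neuron's gradient, 1 - tanh(z)^2, collapses for large |z|,
# so a saturated neuron barely influences (or learns from) the output,
# whereas a linear neuron's gradient is constant and never switches off.
for z in (0.0, 2.0, 5.0):
    tanh_grad = 1.0 - math.tanh(z) ** 2
    linear_grad = 1.0
    print(z, tanh_grad, linear_grad)
# tanh gradient: 1.0 at z=0, about 0.07 at z=2, about 0.0002 at z=5
```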

java,machine-learning,perceptron

Not sure if I understand you correctly. Training and test sets need to have the exact same format. To test, you just evaluate the equation with the learned weights on the features of your test set. In principle, you should generate test and training data together to ensure they're as equal as...

c++,neural-network,openframeworks,perceptron

The following may work:

output = 1 / (1 + exp(-net));

You may also like tanh(net) or 1 / (1 + abs(net)), which are much faster according to this answer....
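For comparison, the three activations mentioned above side by side, in Python rather than the C++ of the snippet:

```python
import math

def logistic(net):
    return 1.0 / (1.0 + math.exp(-net))   # range (0, 1)

def tanh_act(net):
    return math.tanh(net)                 # range (-1, 1)

def fast_as_written(net):
    # As literally written in the answer; note the commonly cited "fast
    # sigmoid" is net / (1 + abs(net)), which may be what was meant here.
    return 1.0 / (1.0 + abs(net))

for net in (-2.0, 0.0, 2.0):
    print(net, logistic(net), tanh_act(net), fast_as_written(net))
```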