machine learning - Autoencoder and Neural Network Overfitting in terms of parameter number?


I have 1100 sequences in 2 classes: 400 in class 1 and 700 in class 2. I have used a 1-hidden-layer autoencoder with 2 neurons to capture features. The initial features are the tri-grams of each sequence, so each sequence is represented by 6860 tri-gram counts. As a result, the input vectors are sparse.
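A minimal sketch of the tri-gram featurization described above, in plain Python (the post does not say which tool was actually used). Each sequence maps to counts of its character tri-grams; stacked over the full tri-gram vocabulary, these counts form the sparse 6860-dimensional input vector.

```python
# Count character tri-grams of a sequence; the full feature vector would
# align these counts against the 6860-entry tri-gram vocabulary.
from collections import Counter

def trigram_counts(seq):
    """Count the character tri-grams of one sequence."""
    return Counter(seq[i:i + 3] for i in range(len(seq) - 2))

print(trigram_counts("ACGTACGT"))
```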

Now, if I calculate the parameters of the network, I have:

    6860 * 2 = 13720 parameters (1st layer)
    2 * 6860 = 13720 parameters (2nd layer)
    ---------------------------------------
               27440 parameters (in total)
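The tally above covers weights only; as a sketch of the arithmetic (bias terms, if used, would add another 2 + 6860):

```python
# Weight count for the 6860 -> 2 -> 6860 autoencoder described in the post.
n_in, n_hidden = 6860, 2

encoder = n_in * n_hidden   # input -> hidden
decoder = n_hidden * n_in   # hidden -> reconstruction
total = encoder + decoder
print(encoder, decoder, total)  # 13720 13720 27440
```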

Now, that is way too many parameters compared to the number of data points. Hence, I used a dropout value of 0.98 on both the input->hidden and hidden->output layers, which makes the number of active parameters 13720 * 0.02 = 274 on each layer, 548 parameters in total.

Now, after training, I tried the encoder on test data of 500 sequences and extracted the 2-dimensional hidden-layer representation. I then fed that data into a single-hidden-layer neural network with 5 neurons to classify, and got around 90% accuracy.
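The downstream classifier can be sketched as follows, here with scikit-learn's `MLPClassifier` (an assumption, since the post does not name a library); toy 2-D points stand in for the real hidden-layer encodings:

```python
# Single-hidden-layer classifier with 5 neurons, trained on 2-D encodings.
from sklearn.neural_network import MLPClassifier
import numpy as np

rng = np.random.default_rng(0)
encodings = rng.normal(size=(100, 2))                    # stand-in for autoencoder outputs
labels = (encodings[:, 0] + encodings[:, 1] > 0).astype(int)

clf = MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
clf.fit(encodings, labels)
print(clf.score(encodings, labels))
```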

My questions: Is the autoencoder overfitting? Is the classifying neural network overfitting? I am worried about the low number of data points. Does the use of dropout seem sensible?

Try increasing the hidden layer size with dropout turned off until you fit the data; using that hidden layer size, you can then start increasing the dropout parameter to get a feel for how the model behaves.

You may also want to add an alpha parameter to tune the weight updates.
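If the networks are scikit-learn MLPs (an assumption), `alpha` is the L2 regularization strength, and sweeping it is an alternative or complement to dropout for controlling overfitting. A minimal sketch of such a sweep on toy 2-D data:

```python
# Compare several L2 regularization strengths (`alpha`) on a small MLP.
from sklearn.neural_network import MLPClassifier
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 2))
y = (X[:, 0] > 0).astype(int)

scores = {}
for alpha in (1e-4, 1e-2, 1.0):
    clf = MLPClassifier(hidden_layer_sizes=(5,), alpha=alpha,
                        max_iter=2000, random_state=0)
    clf.fit(X, y)
    scores[alpha] = clf.score(X, y)
print(scores)
```

In practice you would compare validation (not training) accuracy across the swept values.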

You may also have luck aggregating some of the parameters together.

