python - Autoencoder not learning while training
Python 3.5.2, TensorFlow 1.0.0
I am somewhat new to programming autoencoders and am trying to implement a simple network here to familiarize myself with them. I have used the same input data in a CNN, which was able to classify with an accuracy of 98%. The data consist of 2000 rows, where each row is a signal. I am trying a stack of 3 autoencoder layers with 512, 256 and 64 nodes:
    import math
    import numpy as np
    import tensorflow as tf

    class dimensions:
        input_width, input_height = 1, 1024
        batch_size = 50
        layer = [input_width * input_height, 512, 256, 64]
        learningrate = 0.001

    def myencoder(x, corrupt_prob, dimensions):
        # mix the corrupted and the clean signal according to corrupt_prob
        current_input = corrupt(x) * corrupt_prob + x * (1 - corrupt_prob)

        # build the encoder, keeping the weights for the tied decoder
        encoder = []
        for layer_i, n_output in enumerate(dimensions.layer[1:]):
            n_input = int(current_input.get_shape()[1])
            W = tf.Variable(
                tf.random_uniform([n_input, n_output],
                                  -1.0 / math.sqrt(n_input),
                                  1.0 / math.sqrt(n_input)))
            b = tf.Variable(tf.zeros([n_output]))
            encoder.append(W)
            output = tf.nn.tanh(tf.matmul(current_input, W) + b)
            current_input = output

        # latent representation
        z = current_input
        encoder.reverse()

        # build the decoder using the same (transposed) weights
        for layer_i, n_output in enumerate(model.layer[:-1][::-1]):
            W = tf.transpose(encoder[layer_i])
            b = tf.Variable(tf.zeros([n_output]))
            output = tf.nn.tanh(tf.matmul(current_input, W) + b)
            current_input = output

        # now we have the reconstruction through the network
        y = current_input
        # cost function measures pixel-wise difference
        cost = tf.sqrt(tf.reduce_mean(tf.square(y - x)))
        return z, y, cost

    sess = tf.Session()
    model = dimensions()
    data_train, data_test, label_train, label_test = load_data(datainfo, folder)

    x = tf.placeholder(tf.float32, [model.batch_size, model.input_height * model.input_width])
    corrupt_prob = tf.placeholder(tf.float32, [1])
    z, y, cost = myencoder(x, corrupt_prob, dimensions)
    train_step = tf.train.AdamOptimizer(model.learningrate).minimize(cost)

    lossfun = np.zeros(steps)
    sess.run(tf.global_variables_initializer())

    for i in range(steps):
        train_data = batchdata(data_train, model.batch_size)
        epoch_loss = 0
        for j in range(model.batch_size):
            sess.run(train_step, feed_dict={x: train_data, corrupt_prob: [1.0]})
            c = sess.run(cost, feed_dict={x: train_data, corrupt_prob: [1.0]})
            epoch_loss += c
        lossfun[i] = epoch_loss
        print('epoch', i, 'completed out of', steps, 'loss:', epoch_loss)
My loss curve (x axis: number of iterations, y axis: loss) stays flat: the loss doesn't decrease and the network doesn't learn anything. Any help is appreciated!
In the function myencoder, the weight variables W and b are initialized in every training step, so whatever the optimizer learns in one step is thrown away in the next.
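As a minimal sketch of the fix, assuming the helpers from the question (myencoder, batchdata, data_train, steps and the dimensions class) are available: build the graph, and with it every tf.Variable, exactly once before the training loop, run the initializer once, and keep only sess.run calls inside the loop.

    import tensorflow as tf

    model = dimensions()

    # Build the graph ONCE: myencoder is the only place tf.Variable is
    # called, so the same weights persist across all training steps.
    x = tf.placeholder(tf.float32, [model.batch_size, 1024])
    corrupt_prob = tf.placeholder(tf.float32, [1])
    z, y, cost = myencoder(x, corrupt_prob, dimensions)
    train_step = tf.train.AdamOptimizer(model.learningrate).minimize(cost)

    sess = tf.Session()
    sess.run(tf.global_variables_initializer())  # initialize ONCE, before the loop

    for i in range(steps):
        train_data = batchdata(data_train, model.batch_size)
        # Inside the loop there is only sess.run: no graph construction and
        # no re-initialization, so Adam keeps updating the same W and b.
        _, c = sess.run([train_step, cost],
                        feed_dict={x: train_data, corrupt_prob: [1.0]})
        print('epoch', i, 'completed out of', steps, 'loss:', c)

If myencoder (or tf.global_variables_initializer) is instead called inside the loop, the weights are reset to fresh random values on every iteration and the loss just hovers around its starting value, which matches the flat curve described above.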