python - tflearn label encoding with a large number of classes -


I am trying to adapt the convolutional neural network example from tflearn to a classification task with ~12000 distinct class labels and more than 1 million training examples. The number of labels is apparently a problem in terms of memory consumption when one-hot encoding them. I first map the string labels to contiguous integers, then pass these as a list to the to_categorical() function. The following code leads to a MemoryError:

trainy = to_categorical(trainy, nb_classes=n_classes) 
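To illustrate the setup described above, here is a minimal sketch of the label-to-integer mapping step and a back-of-the-envelope estimate of why a full one-hot matrix runs out of memory (the label strings are hypothetical placeholders, not from the original code):

```python
import numpy as np

# Hypothetical string labels; the names are illustrative only.
labels = ["cat", "dog", "cat", "bird"]

# Map each distinct string label to a contiguous integer index.
label_to_index = {label: i for i, label in enumerate(sorted(set(labels)))}
trainY = np.array([label_to_index[l] for l in labels])
print(trainY)  # integer-encoded labels, e.g. [1 2 1 0]

# A dense one-hot matrix for ~1,000,000 examples x 12,000 classes would need
# roughly 1_000_000 * 12_000 * 4 bytes ~= 48 GB as float32 -- hence the MemoryError.
full_one_hot_bytes = 1_000_000 * 12_000 * 4
print(full_one_hot_bytes / 1e9, "GB")
```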

Do I have to encode the labels differently, or should I use a different loss function than cross-entropy? Can I train in batches with tflearn - can I pass a generator to the DNN.fit() function?

Thanks for any advice!

In the regression layer, you can specify that the labels you feed in should be one-hot encoded on the fly:

tflearn.layers.regression(incoming_net,
                          loss='categorical_crossentropy',
                          batch_size=64,
                          to_one_hot=True,
                          n_classes=12000) 

This way you should not get a memory error, because the labels are encoded batch by batch during training.
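The memory saving is easy to see: only one batch of labels is ever expanded at a time. A minimal NumPy sketch of what per-batch one-hot encoding amounts to (the helper function name is my own, not tflearn's):

```python
import numpy as np

def one_hot_batch(indices, n_classes):
    """One-hot encode a single batch of integer labels.

    This mirrors what an on-the-fly encoder does per batch instead of
    materializing the full (n_examples, n_classes) matrix up front.
    """
    out = np.zeros((len(indices), n_classes), dtype=np.float32)
    out[np.arange(len(indices)), indices] = 1.0
    return out

batch = np.array([0, 3, 11999])          # integer labels for one batch
encoded = one_hot_batch(batch, 12000)
print(encoded.shape)                     # (3, 12000)

# Per-batch cost with batch_size=64: 64 * 12000 * 4 bytes ~= 3 MB,
# versus tens of gigabytes for encoding the whole training set at once.
```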

