I am following the TensorFlow wide (linear model) tutorial: https://www.tensorflow.org/tutorials/wide
There are many categorical features that have to be converted to a sparse representation with tf.feature_column.categorical_column_with_vocabulary_list().
But I do not want to use the predefined Estimator:
m = tf.estimator.LinearClassifier(
model_dir=model_dir, feature_columns=base_columns + crossed_columns)
I prefer to use a custom NN model, with:
estimator = tf.contrib.learn.Estimator(model_fn=model)
estimator.fit(input_fn=input_fn(df, num_epochs=100, shuffle=True),
              steps=100)
So in model(), there will be
def model(features, labels, mode):
...
node = tf.add(tf.matmul(features, w), b)
...
Then I got an error like:
TypeError: Failed to convert object of type <class 'dict'> to Tensor.
Contents: {'education': <tf.Tensor
'random_shuffle_queue_DequeueUpTo:1' shape=(?,) dtype=string>, 'age':
<tf.Tensor 'random_shuffle_queue_DequeueUpTo:2' shape=(?,) dtype=float64> ...
My question is how to convert the features to a tensor that can be used as input.
I hope I have described the question clearly. Thank you in advance.
features is a dict of Tensors. You can get a single Tensor with features['education'], but that Tensor is still of type string, so you still cannot feed it into tf.add(tf.matmul(features, w), b). You should first process your string-typed features into numerical features with something like tf.feature_column.categorical_column_with_vocabulary_list(), for example as sketched below.
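A minimal sketch (TF 1.x API); the vocabulary values here are made up for illustration, use the ones that actually occur in your data:

import tensorflow as tf

# Hypothetical vocabulary for the 'education' key of the features dict.
education = tf.feature_column.categorical_column_with_vocabulary_list(
    'education',  # must match a key in the features dict
    ['Bachelors', 'HS-grad', 'Masters', 'Doctorate'])

# A categorical column is sparse; wrap it in an indicator (one-hot) or
# embedding column before it can feed a dense op such as tf.matmul.
education_one_hot = tf.feature_column.indicator_column(education)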
You can check the official DNN implementation: in the def dnn_logit_fn part, it uses feature_column_lib.input_layer to build the input layer from features and columns, where columns is a list of tf.feature_column.* definitions.
When you define a tf.feature_column.*, such as tf.feature_column.categorical_column_with_vocabulary_list(), its first parameter is a string key that must exist in features.keys(). That key connects a tensor from features to the feature column and tells TensorFlow how to process the raw input (string) tensor into a numerical feature tensor, as in the sketch below.
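Putting it together, here is a minimal sketch of a custom model_fn that uses tf.feature_column.input_layer to turn the features dict into one dense tensor. It assumes the tf.estimator API (rather than tf.contrib.learn), integer binary labels, and the hypothetical 'age'/'education' columns from above; adapt the columns and the network to your data.

import tensorflow as tf

# Hypothetical feature columns matching keys in the features dict.
age = tf.feature_column.numeric_column('age')
education = tf.feature_column.indicator_column(
    tf.feature_column.categorical_column_with_vocabulary_list(
        'education', ['Bachelors', 'HS-grad', 'Masters', 'Doctorate']))
columns = [age, education]

def model(features, labels, mode):
    # input_layer converts the dict of raw tensors into a single dense float
    # tensor of shape (batch_size, total_feature_dimension).
    net = tf.feature_column.input_layer(features, columns)

    # Equivalent of tf.add(tf.matmul(net, w), b); w and b are created for you.
    logits = tf.layers.dense(net, units=2)
    predictions = tf.argmax(logits, axis=1)

    if mode == tf.estimator.ModeKeys.PREDICT:
        return tf.estimator.EstimatorSpec(mode, predictions=predictions)

    loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits)
    train_op = tf.train.AdamOptimizer().minimize(
        loss, global_step=tf.train.get_global_step())
    return tf.estimator.EstimatorSpec(
        mode, loss=loss, train_op=train_op, predictions=predictions)

estimator = tf.estimator.Estimator(model_fn=model)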