In [ ]:
#@title Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
This tutorial provides an example of how to load pandas dataframes into a tf.data.Dataset.
This tutorial uses a small dataset provided by the Cleveland Clinic Foundation for Heart Disease. There are several hundred rows in the CSV. Each row describes a patient, and each column describes an attribute. We will use this information to predict whether a patient has heart disease, which in this dataset is a binary classification task.
In [ ]:
import pandas as pd
import tensorflow as tf
Download the CSV file containing the heart dataset.
In [ ]:
csv_file = tf.keras.utils.get_file('heart.csv', 'https://storage.googleapis.com/applied-dl/heart.csv')
Read the CSV file using pandas.
In [ ]:
df = pd.read_csv(csv_file)
In [ ]:
df.head()
In [ ]:
df.dtypes
Convert the thal column, which is an object in the dataframe, to a discrete numerical value.
In [ ]:
df['thal'] = pd.Categorical(df['thal'])
df['thal'] = df.thal.cat.codes
In [ ]:
df.head()
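If you would rather avoid implying an order among the categories, a one-hot encoding is an alternative. As a sketch only (the rest of this tutorial keeps the integer codes), pd.get_dummies produces one indicator column per category:
In [ ]:
# Illustration only: one indicator column per thal code.
pd.get_dummies(df['thal'], prefix='thal').head()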
Use tf.data.Dataset.from_tensor_slices to read the values from a pandas dataframe.
One of the advantages of using tf.data.Dataset is that it allows you to write simple, highly efficient data pipelines. Read the loading data guide to find out more.
In [ ]:
target = df.pop('target')
In [ ]:
dataset = tf.data.Dataset.from_tensor_slices((df.values, target.values))
In [ ]:
for feat, targ in dataset.take(5):
  print('Features: {}, Target: {}'.format(feat, targ))
Since a pd.Series implements the __array__ protocol, it can be used transparently nearly anywhere you would use a np.array or a tf.Tensor.
In [ ]:
tf.constant(df['thal'])
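The same holds for NumPy. As a quick illustration, np.asarray consumes the series directly thanks to the same protocol:
In [ ]:
import numpy as np

# pd.Series satisfies __array__, so NumPy accepts it without any conversion code.
np.asarray(df['thal'])[:5]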
Shuffle and batch the dataset.
In [ ]:
train_dataset = dataset.shuffle(len(df)).batch(1)
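A dataset this small needs no further tuning, but as a sketch of how additional pipeline stages chain on, prefetch lets tf.data prepare the next batch while the model trains on the current one:
In [ ]:
# Optional: overlap input preparation with model execution.
# AUTOTUNE lets tf.data choose the prefetch buffer size.
train_dataset = train_dataset.prefetch(tf.data.experimental.AUTOTUNE)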
In [ ]:
def get_compiled_model():
  model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(10, activation='relu'),
    tf.keras.layers.Dense(1)
  ])

  model.compile(optimizer='adam',
                loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
                metrics=['accuracy'])
  return model
In [ ]:
model = get_compiled_model()
model.fit(train_dataset, epochs=15)
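As a quick check of the trained model (evaluated on the training data here purely for illustration; a real workflow would evaluate on a held-out test split):
In [ ]:
# Illustration only: evaluating on the training set overstates performance.
loss, accuracy = model.evaluate(train_dataset)
print('Accuracy: {:.2f}'.format(accuracy))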
Passing a dictionary as an input to a model is as easy as creating a matching dictionary of tf.keras.layers.Input layers, applying any preprocessing, and stacking them up using the functional API. You can use this as an alternative to feature columns.
In [ ]:
inputs = {key: tf.keras.layers.Input(shape=(), name=key) for key in df.keys()}
x = tf.stack(list(inputs.values()), axis=-1)
x = tf.keras.layers.Dense(10, activation='relu')(x)
output = tf.keras.layers.Dense(1)(x)
model_func = tf.keras.Model(inputs=inputs, outputs=output)
model_func.compile(optimizer='adam',
                   loss=tf.keras.losses.BinaryCrossentropy(from_logits=True),
                   metrics=['accuracy'])
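Before training, you can sanity-check the dict-input model by calling it on a small batch built straight from the dataframe. This is a minimal sketch; the explicit cast to float32 is an assumption that matches the default dtype of the Input layers above:
In [ ]:
# Build a batch of 5 examples as a dict of 1-D float32 tensors
# (float32 is the default dtype of tf.keras.layers.Input).
sample_batch = {key: tf.cast(df[key][:5], tf.float32) for key in df.keys()}
model_func(sample_batch)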
The easiest way to preserve the column structure of a pd.DataFrame when used with tf.data is to convert the pd.DataFrame to a dict, and slice that dictionary.
In [ ]:
dict_slices = tf.data.Dataset.from_tensor_slices((df.to_dict('list'), target.values)).batch(16)
In [ ]:
for dict_slice in dict_slices.take(1):
  print(dict_slice)
In [ ]:
model_func.fit(dict_slices, epochs=15)
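Finally, a short sketch of inference with the dict-based pipeline. predict returns raw logits, since the final Dense layer has no activation, so apply a sigmoid to turn them into probabilities:
In [ ]:
# predict uses only the features from the (features, labels) tuples and
# returns logits; sigmoid maps them to probabilities.
predictions = model_func.predict(dict_slices.take(1))
tf.sigmoid(predictions[:5])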