Fairness Indicators TensorBoard Plugin Example Colab

Overview

In this activity, you'll use Fairness Indicators for TensorBoard. With the plugin, you can visualize fairness evaluations for your runs and easily compare performance across groups.

Importing

Run the following code to install the required libraries.


In [0]:
!pip install fairness_indicators 'absl-py<0.9,>=0.7'
!pip install google-api-python-client==1.8.3
!pip install tensorboard-plugin-fairness-indicators
!pip install tensorflow-serving-api==2.2.0rc2

Restart the runtime. After the runtime restarts, continue with the following cells; you do not need to re-run the cells above.


In [0]:
# %tensorflow_version 1.x  # Uncomment this line if running in Google Colab.
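
Optionally, confirm that the runtime picked up the expected TensorFlow version after the restart. This check is not part of the original walkthrough; it only verifies the environment:


In [0]:
import tensorflow as tf
print(tf.__version__)  # expect a 1.x release in Colab with the magic above uncommented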

In [0]:
import datetime
import os
import tempfile
from tensorboard_plugin_fairness_indicators import summary_v2
import tensorflow as tf

# example_model.py is provided in the fairness_indicators package to train
# and evaluate an example model.
from fairness_indicators import example_model

tf.compat.v1.enable_eager_execution()

Data and Constants


In [0]:
# To learn more about the dataset, see the Fairness Indicators Example Colab at:
# https://github.com/tensorflow/fairness-indicators/blob/master/fairness_indicators/documentation/examples/Fairness_Indicators_Example_Colab.ipynb

train_tf_file = tf.keras.utils.get_file(
    'train.tf',
    'https://storage.googleapis.com/civil_comments_dataset/train_tf_processed.tfrecord')
validate_tf_file = tf.keras.utils.get_file(
    'validate.tf',
    'https://storage.googleapis.com/civil_comments_dataset/validate_tf_processed.tfrecord')

BASE_DIR = tempfile.gettempdir()
TEXT_FEATURE = 'comment_text'
LABEL = 'toxicity'
FEATURE_MAP = {
    # Label:
    LABEL: tf.io.FixedLenFeature([], tf.float32),
    # Text:
    TEXT_FEATURE: tf.io.FixedLenFeature([], tf.string),

    # Identities:
    'sexual_orientation': tf.io.VarLenFeature(tf.string),
    'gender': tf.io.VarLenFeature(tf.string),
    'religion': tf.io.VarLenFeature(tf.string),
    'race': tf.io.VarLenFeature(tf.string),
    'disability': tf.io.VarLenFeature(tf.string),
}
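
As an optional sanity check (not part of the original walkthrough), you can parse a single record from the training file with the FEATURE_MAP above; because eager execution is enabled, the dataset can be iterated directly:


In [0]:
# Read one serialized tf.Example and decode it with FEATURE_MAP.
dataset = tf.data.TFRecordDataset(filenames=[train_tf_file])
for serialized in dataset.take(1):
  example = tf.io.parse_single_example(serialized, FEATURE_MAP)
  print(example[LABEL])         # toxicity label, a float scalar
  print(example[TEXT_FEATURE])  # raw comment text, a string scalar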

Train the Model


In [0]:
model_dir = os.path.join(BASE_DIR, 'train',
                         datetime.datetime.now().strftime('%Y%m%d-%H%M%S'))

classifier = example_model.train_model(model_dir,
                                       train_tf_file,
                                       LABEL,
                                       TEXT_FEATURE,
                                       FEATURE_MAP)
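
If you want to confirm that training wrote checkpoints to disk, you can list the model directory; this is optional and not part of the original walkthrough:


In [0]:
print(tf.io.gfile.listdir(model_dir))  # checkpoint and graph files from training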

Run TensorFlow Model Analysis with Fairness Indicators

This step might take 2 to 5 minutes.


In [0]:
tfma_eval_result_path = os.path.join(BASE_DIR, 'tfma_eval_result')

example_model.evaluate_model(classifier,
                             validate_tf_file,
                             tfma_eval_result_path,
                             'gender',
                             LABEL,
                             FEATURE_MAP)
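
Before moving to TensorBoard, you can optionally inspect the raw TFMA output in Python. A minimal sketch, assuming tensorflow_model_analysis was installed as a dependency of the packages above:


In [0]:
import tensorflow_model_analysis as tfma

# Load the evaluation result written by evaluate_model and list the slices.
eval_result = tfma.load_eval_result(tfma_eval_result_path)
print([slice_key for slice_key, _ in eval_result.slicing_metrics])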

Visualize Fairness Indicators in TensorBoard

Below you will visualize Fairness Indicators in TensorBoard and compare the performance of each slice of the data on selected metrics. You can adjust the baseline comparison slice and the displayed threshold(s) using the drop-down menus at the top of the visualization, and select different evaluation runs using the drop-down menu in the top-left corner.

Write Fairness Indicators Summary

Write a summary file containing all of the information required to visualize Fairness Indicators in TensorBoard.


In [0]:
import tensorflow.compat.v2 as tf2

writer = tf2.summary.create_file_writer(
    os.path.join(model_dir, 'fairness_indicators'))
with writer.as_default():
  summary_v2.FairnessIndicators(tfma_eval_result_path, step=1)
writer.close()
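
To populate the run drop-down mentioned above with more than one entry, write one summary per evaluation run, each under its own subdirectory of the log directory. A minimal sketch that reuses the same evaluation result purely for illustration; in practice you would point it at a second TFMA output:


In [0]:
writer_2 = tf2.summary.create_file_writer(
    os.path.join(model_dir, 'fairness_indicators_run_2'))
with writer_2.as_default():
  # Substitute the output path of a second evaluation here.
  summary_v2.FairnessIndicators(tfma_eval_result_path, step=1)
writer_2.close()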

Launch TensorBoard

Navigate to the "Fairness Indicators" tab to visualize Fairness Indicators.


In [0]:
%load_ext tensorboard

In [0]:
%tensorboard --logdir=$model_dir