Applying CFL to the Visual Bars Dataset

This example demonstrates how to run a basic CFL experiment on Visual Bars Data.

[1]:
import numpy as np
import visual_bars.generate_visual_bars_data as vbd
from cfl.experiment import Experiment

1. Loading Data

Key Points for Loading Data

  1. The data should consist of a data set X and a data set Y that are aligned (each row in X corresponds to the row with the same index in Y)

  2. X should be the causal data set and Y should be the effect data set

  3. For most instances of CFL, X and Y should be reshaped such that each one is a 2D array with dimensions (n_samples, n_features). Some instances of CFL require that X be a 4-D array.
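As a sketch of point 3 (using made-up array shapes, not the visual bars data itself), flattening image-like data into the 2D (n_samples, n_features) layout with NumPy might look like:

```python
import numpy as np

# hypothetical example data: 100 samples of 10x10 "images" and a scalar effect
X = np.random.rand(100, 10, 10)
Y = np.random.rand(100)

# flatten each 10x10 sample into a single 100-feature row
X_2d = X.reshape(X.shape[0], -1)   # shape: (100, 100)
# give Y an explicit feature axis
Y_2d = Y.reshape(-1, 1)            # shape: (100, 1)

print(X_2d.shape, Y_2d.shape)
```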

Load the Visual Bars Data

To create a visual bars data set, we need to import the file generate_visual_bars_data.py. If you are generating the visual bars data from outside the root directory of the cfl repository, add the visual_bars directory path to the PYTHONPATH (as you did for the cfl package) for easy importing.

See the visual bars page for background on the visual bars data set.
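One way to make this import work from another directory (a sketch; the actual path to your clone of the repository will differ) is to append the visual_bars directory to sys.path before importing:

```python
import sys

# hypothetical location of the cfl repository clone; adjust to your setup
sys.path.append('/path/to/cfl/visual_bars')

# the module can then be imported directly, e.g.:
# import generate_visual_bars_data as vbd
```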

[2]:
# In order to generate visual bars data, we set the number of images we want to generate
# (`n_samples`), the size of each image in pixels (`im_shape`), and the intensity of random noise
# in the image (`noise_lvl`). To have reproducible results, we also will set a random seed.

# create visual bars data
n_samples = 5000
im_shape = (10, 10)
noise_lvl = 0.03
random_state = 180

vb_data = vbd.VisualBarsData(n_samples=n_samples, im_shape=im_shape,
                             noise_lvl=noise_lvl, set_random_seed=random_state)

[3]:
# We save an array of images to a variable `X` and the array of target behavior to a variable
# `Y`. Note that `X` and `Y` are aligned - they must have the same number of observations, and
# the nth image in `X` must correspond to the nth target value in `Y`.

# retrieve the images and the target
X = vb_data.getImages()
Y = vb_data.getTarget()

# X and Y have the same number of rows
print(X.shape)
print(Y.shape)
print(X.shape[0]==Y.shape[0])


(5000, 10, 10)
(5000,)
True

2. Shaping the Data

Because our X dataset is a set of images, we will use CondExpCNN for our CDE. This CDE requires that X be a 4-D array with dimensions (n_samples, im_height, im_width, n_channels) and that Y be 2-dimensional, with shape (n_samples, n_features).

The ‘basic’ CDE CondExpMod, which is better suited to non-spatially-organized data, requires a different input shape than CondExpCNN. For further guidelines about shaping your data, see info about CDEs.

[4]:
# reformat X and Y into the right shapes for the neural net

# expand X
X = np.expand_dims(X, -1)
print(X.shape)  # black-and-white images have just one channel

# expand Y
Y = np.expand_dims(Y, -1)
print(Y.shape)
(5000, 10, 10, 1)
(5000, 1)

3. Setting up the CFL Pipeline

We will now set up an Experiment. The Experiment will create a CFL pipeline and automatically save all the parameters, results, and the trained models generated during this experiment.

CFL takes in 3 sets of parameters, each in dictionary form:

  • data_info

  • CDE_params

  • cluster_params

For further details on the meaning of these parameters, consult the documentation for the clusterer and the CondExp base class.

In this case, we use a convolutional neural net for the CDE, and K-means for clustering. Consult the documentation for the other available models.

Note that any parameters we don’t specify are given default values (printed below).

Here we are only interested in clustering the cause space. For an example of how to cluster both the cause and effect spaces, please refer to the El Niño tutorial.

[5]:
# the parameters should be passed in dictionary form
data_info = {'X_dims' : X.shape,
             'Y_dims' : Y.shape,
             'Y_type' : 'categorical' #options: 'categorical' or 'continuous'
            }

# pass in empty parameter dictionaries to use the default parameter values (not
# allowed for data_info)
cnn_params = {  'model'            : 'CondExpCNN',
                'model_params'     : {
                    'filters'          : [8],
                    'input_shape'      : data_info['X_dims'][1:],
                    'kernel_size'      : [(4, 4)],
                    'pool_size'        : [(2, 2)],
                    'padding'          : ['same'],
                    'conv_activation'  : ['relu'],
                    'dense_units'      : 16,
                    'dense_activation' : 'relu',
                    'output_activation': None,

                    'batch_size'  : 84, # parameters for training
                    'n_epochs'    : 500,
                    'optimizer'   : 'adam',
                    'opt_config'  : {'lr' : 1e-5},
                    'loss'        : 'mean_squared_error',
                    'best'        : True,
                    'verbose'     : 1
                }
            }

cause_cluster_params = {'model' : 'KMeans',
                        'model_params' : {
                            'n_clusters' : 4,
                            'verbose'    : 1
                        }
}

# steps of this CFL pipeline
block_names = ['CondDensityEstimator', 'CauseClusterer']
block_params = [cnn_params, cause_cluster_params]

# folder to save results to
save_path = 'visual_bars_results'

# create the experiment!
my_exp = Experiment(X_train=X, Y_train=Y, data_info=data_info,
                    block_names=block_names, block_params=block_params,
                    results_path=save_path)
save_path 'visual_bars_results' doesn't exist, creating now.
All results from this run will be saved to visual_bars_results/experiment0000
Block: verbose not specified in input, defaulting to 1
CondExpBase: weights_path not specified in input, defaulting to None
CondExpBase: show_plot not specified in input, defaulting to True
CondExpBase: standardize not specified in input, defaulting to False
CondExpBase: tb_path not specified in input, defaulting to None
CondExpBase: optuna_callback not specified in input, defaulting to None
CondExpBase: optuna_trial not specified in input, defaulting to None
CondExpBase: early_stopping not specified in input, defaulting to False
CondExpBase: checkpoint_name not specified in input, defaulting to tmp_checkpoints
Block: tune not specified in input, defaulting to False
Block: user_input not specified in input, defaulting to True
Block: verbose not specified in input, defaulting to 1

Note that any parameters not specified will be given default values (shown in the print statements).
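The fallback behavior can be pictured as a dictionary merge (an illustrative sketch only; the actual default handling lives inside the cfl Block classes):

```python
# illustrative defaults, mirroring a few of the values printed above
defaults = {'verbose': 1, 'weights_path': None, 'standardize': False}

# suppose we only specified 'verbose'
user_params = {'verbose': 0}

# user-supplied values override defaults; everything else falls back
resolved = {**defaults, **user_params}
print(resolved)  # {'verbose': 0, 'weights_path': None, 'standardize': False}
```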

4. Training with a CFL object

We now train the CFL object on our data.

[6]:
train_results = my_exp.train()
#################### Beginning CFL Experiment training. ####################
Beginning CondDensityEstimator training...
No GPU device detected.
Epoch 1/500
WARNING:tensorflow:AutoGraph could not transform <function Model.make_train_function.<locals>.train_function at 0x1a44e105f0> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: 'arguments' object has no attribute 'posonlyargs'
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
37/45 [=======================>......] - ETA: 0s - loss: 0.4977
WARNING:tensorflow:AutoGraph could not transform <function Model.make_test_function.<locals>.test_function at 0x1a46436050> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: 'arguments' object has no attribute 'posonlyargs'
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
45/45 [==============================] - 3s 19ms/step - loss: 0.4909 - val_loss: 0.4782
Epoch 2/500
45/45 [==============================] - 1s 13ms/step - loss: 0.4648 - val_loss: 0.4531
Epoch 3/500
45/45 [==============================] - 1s 11ms/step - loss: 0.4406 - val_loss: 0.4297
Epoch 4/500
45/45 [==============================] - 1s 11ms/step - loss: 0.4181 - val_loss: 0.4081
Epoch 5/500
45/45 [==============================] - 1s 13ms/step - loss: 0.3971 - val_loss: 0.3875
Epoch 6/500
45/45 [==============================] - 0s 10ms/step - loss: 0.3774 - val_loss: 0.3683
Epoch 7/500
45/45 [==============================] - 1s 14ms/step - loss: 0.3589 - val_loss: 0.3505
Epoch 8/500
45/45 [==============================] - 1s 14ms/step - loss: 0.3418 - val_loss: 0.3335
Epoch 9/500
45/45 [==============================] - 0s 11ms/step - loss: 0.3257 - val_loss: 0.3180
Epoch 10/500
45/45 [==============================] - 0s 9ms/step - loss: 0.3106 - val_loss: 0.3036
Epoch 11/500
45/45 [==============================] - 0s 9ms/step - loss: 0.2968 - val_loss: 0.2897
Epoch 12/500
45/45 [==============================] - 0s 10ms/step - loss: 0.2839 - val_loss: 0.2769
Epoch 13/500
45/45 [==============================] - 0s 11ms/step - loss: 0.2720 - val_loss: 0.2653
Epoch 14/500
45/45 [==============================] - 0s 10ms/step - loss: 0.2610 - val_loss: 0.2545
Epoch 15/500
45/45 [==============================] - 0s 9ms/step - loss: 0.2509 - val_loss: 0.2450
Epoch 16/500
45/45 [==============================] - 0s 11ms/step - loss: 0.2419 - val_loss: 0.2362
Epoch 17/500
45/45 [==============================] - 0s 10ms/step - loss: 0.2338 - val_loss: 0.2284
Epoch 18/500
45/45 [==============================] - 0s 9ms/step - loss: 0.2266 - val_loss: 0.2215
Epoch 19/500
45/45 [==============================] - 1s 12ms/step - loss: 0.2203 - val_loss: 0.2155
Epoch 20/500
45/45 [==============================] - 0s 8ms/step - loss: 0.2150 - val_loss: 0.2106
Epoch 21/500
45/45 [==============================] - 1s 13ms/step - loss: 0.2104 - val_loss: 0.2064
Epoch 22/500
45/45 [==============================] - 0s 10ms/step - loss: 0.2065 - val_loss: 0.2029
Epoch 23/500
45/45 [==============================] - 1s 12ms/step - loss: 0.2033 - val_loss: 0.1998
Epoch 24/500
45/45 [==============================] - 0s 11ms/step - loss: 0.2006 - val_loss: 0.1973
Epoch 25/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1983 - val_loss: 0.1952
Epoch 26/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1964 - val_loss: 0.1934
Epoch 27/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1948 - val_loss: 0.1919
Epoch 28/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1934 - val_loss: 0.1905
Epoch 29/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1921 - val_loss: 0.1893
Epoch 30/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1910 - val_loss: 0.1883
Epoch 31/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1900 - val_loss: 0.1873
Epoch 32/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1890 - val_loss: 0.1864
Epoch 33/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1881 - val_loss: 0.1855
Epoch 34/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1873 - val_loss: 0.1848
Epoch 35/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1865 - val_loss: 0.1840
Epoch 36/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1857 - val_loss: 0.1833
Epoch 37/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1850 - val_loss: 0.1826
Epoch 38/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1842 - val_loss: 0.1819
Epoch 39/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1835 - val_loss: 0.1813
Epoch 40/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1829 - val_loss: 0.1807
Epoch 41/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1822 - val_loss: 0.1801
Epoch 42/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1816 - val_loss: 0.1796
Epoch 43/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1810 - val_loss: 0.1791
Epoch 44/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1804 - val_loss: 0.1786
Epoch 45/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1798 - val_loss: 0.1781
Epoch 46/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1792 - val_loss: 0.1776
Epoch 47/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1787 - val_loss: 0.1771
Epoch 48/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1782 - val_loss: 0.1766
Epoch 49/500
45/45 [==============================] - 1s 16ms/step - loss: 0.1776 - val_loss: 0.1762
Epoch 50/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1771 - val_loss: 0.1757
Epoch 51/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1766 - val_loss: 0.1752
Epoch 52/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1762 - val_loss: 0.1748
Epoch 53/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1757 - val_loss: 0.1744
Epoch 54/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1752 - val_loss: 0.1739
Epoch 55/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1748 - val_loss: 0.1735
Epoch 56/500
45/45 [==============================] - 1s 14ms/step - loss: 0.1744 - val_loss: 0.1731
Epoch 57/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1739 - val_loss: 0.1727
Epoch 58/500
45/45 [==============================] - 1s 16ms/step - loss: 0.1735 - val_loss: 0.1723
Epoch 59/500
45/45 [==============================] - 1s 15ms/step - loss: 0.1731 - val_loss: 0.1720
Epoch 60/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1727 - val_loss: 0.1716
Epoch 61/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1723 - val_loss: 0.1712
Epoch 62/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1719 - val_loss: 0.1708
Epoch 63/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1715 - val_loss: 0.1705
Epoch 64/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1711 - val_loss: 0.1701
Epoch 65/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1708 - val_loss: 0.1697
Epoch 66/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1704 - val_loss: 0.1694
Epoch 67/500
45/45 [==============================] - 1s 16ms/step - loss: 0.1701 - val_loss: 0.1690
Epoch 68/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1697 - val_loss: 0.1687
Epoch 69/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1694 - val_loss: 0.1683
Epoch 70/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1690 - val_loss: 0.1679
Epoch 71/500
45/45 [==============================] - 1s 17ms/step - loss: 0.1687 - val_loss: 0.1676
Epoch 72/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1683 - val_loss: 0.1673
Epoch 73/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1680 - val_loss: 0.1669
Epoch 74/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1677 - val_loss: 0.1666
Epoch 75/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1673 - val_loss: 0.1663
Epoch 76/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1670 - val_loss: 0.1660
Epoch 77/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1667 - val_loss: 0.1657
Epoch 78/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1664 - val_loss: 0.1653
Epoch 79/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1661 - val_loss: 0.1650
Epoch 80/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1658 - val_loss: 0.1647
Epoch 81/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1655 - val_loss: 0.1644
Epoch 82/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1652 - val_loss: 0.1641
Epoch 83/500
45/45 [==============================] - 1s 17ms/step - loss: 0.1649 - val_loss: 0.1638
Epoch 84/500
45/45 [==============================] - 1s 16ms/step - loss: 0.1646 - val_loss: 0.1635
Epoch 85/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1643 - val_loss: 0.1632
Epoch 86/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1640 - val_loss: 0.1629
Epoch 87/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1638 - val_loss: 0.1626
Epoch 88/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1634 - val_loss: 0.1624
Epoch 89/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1632 - val_loss: 0.1621
Epoch 90/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1629 - val_loss: 0.1618
Epoch 91/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1626 - val_loss: 0.1616
Epoch 92/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1623 - val_loss: 0.1613
Epoch 93/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1621 - val_loss: 0.1611
Epoch 94/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1618 - val_loss: 0.1608
Epoch 95/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1616 - val_loss: 0.1606
Epoch 96/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1613 - val_loss: 0.1603
Epoch 97/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1611 - val_loss: 0.1601
Epoch 98/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1608 - val_loss: 0.1598
Epoch 99/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1606 - val_loss: 0.1596
Epoch 100/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1603 - val_loss: 0.1594
Epoch 101/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1601 - val_loss: 0.1592
Epoch 102/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1598 - val_loss: 0.1589
Epoch 103/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1596 - val_loss: 0.1587
Epoch 104/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1593 - val_loss: 0.1585
Epoch 105/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1591 - val_loss: 0.1583
Epoch 106/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1588 - val_loss: 0.1581
Epoch 107/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1586 - val_loss: 0.1579
Epoch 108/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1584 - val_loss: 0.1577
Epoch 109/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1582 - val_loss: 0.1575
Epoch 110/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1580 - val_loss: 0.1573
Epoch 111/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1578 - val_loss: 0.1572
Epoch 112/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1576 - val_loss: 0.1570
Epoch 113/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1574 - val_loss: 0.1568
Epoch 114/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1572 - val_loss: 0.1566
Epoch 115/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1570 - val_loss: 0.1565
Epoch 116/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1568 - val_loss: 0.1563
Epoch 117/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1566 - val_loss: 0.1561
Epoch 118/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1564 - val_loss: 0.1560
Epoch 119/500
45/45 [==============================] - 0s 7ms/step - loss: 0.1562 - val_loss: 0.1558
Epoch 120/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1560 - val_loss: 0.1557
Epoch 121/500
45/45 [==============================] - 1s 14ms/step - loss: 0.1559 - val_loss: 0.1555
Epoch 122/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1557 - val_loss: 0.1554
Epoch 123/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1555 - val_loss: 0.1553
Epoch 124/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1553 - val_loss: 0.1551
Epoch 125/500
45/45 [==============================] - 1s 17ms/step - loss: 0.1552 - val_loss: 0.1550
Epoch 126/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1550 - val_loss: 0.1549
Epoch 127/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1548 - val_loss: 0.1548
Epoch 128/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1547 - val_loss: 0.1547
Epoch 129/500
45/45 [==============================] - 1s 16ms/step - loss: 0.1545 - val_loss: 0.1545
Epoch 130/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1544 - val_loss: 0.1544
Epoch 131/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1542 - val_loss: 0.1543
Epoch 132/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1541 - val_loss: 0.1542
Epoch 133/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1540 - val_loss: 0.1541
Epoch 134/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1538 - val_loss: 0.1540
Epoch 135/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1537 - val_loss: 0.1538
Epoch 136/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1536 - val_loss: 0.1537
Epoch 137/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1534 - val_loss: 0.1536
Epoch 138/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1533 - val_loss: 0.1535
Epoch 139/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1532 - val_loss: 0.1534
Epoch 140/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1531 - val_loss: 0.1533
Epoch 141/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1529 - val_loss: 0.1532
Epoch 142/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1528 - val_loss: 0.1531
Epoch 143/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1527 - val_loss: 0.1530
Epoch 144/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1526 - val_loss: 0.1529
Epoch 145/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1525 - val_loss: 0.1528
Epoch 146/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1524 - val_loss: 0.1527
Epoch 147/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1523 - val_loss: 0.1526
Epoch 148/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1521 - val_loss: 0.1525
Epoch 149/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1520 - val_loss: 0.1524
Epoch 150/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1519 - val_loss: 0.1524
Epoch 151/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1519 - val_loss: 0.1523
Epoch 152/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1517 - val_loss: 0.1522
Epoch 153/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1516 - val_loss: 0.1521
Epoch 154/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1515 - val_loss: 0.1520
Epoch 155/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1514 - val_loss: 0.1519
Epoch 156/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1513 - val_loss: 0.1518
Epoch 157/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1512 - val_loss: 0.1518
Epoch 158/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1511 - val_loss: 0.1517
Epoch 159/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1510 - val_loss: 0.1516
Epoch 160/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1510 - val_loss: 0.1515
Epoch 161/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1509 - val_loss: 0.1514
Epoch 162/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1508 - val_loss: 0.1514
Epoch 163/500
45/45 [==============================] - 1s 15ms/step - loss: 0.1507 - val_loss: 0.1513
Epoch 164/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1506 - val_loss: 0.1512
Epoch 165/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1505 - val_loss: 0.1511
Epoch 166/500
45/45 [==============================] - 1s 14ms/step - loss: 0.1504 - val_loss: 0.1511
Epoch 167/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1503 - val_loss: 0.1510
Epoch 168/500
45/45 [==============================] - 1s 17ms/step - loss: 0.1502 - val_loss: 0.1509
Epoch 169/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1502 - val_loss: 0.1509
Epoch 170/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1501 - val_loss: 0.1508
Epoch 171/500
45/45 [==============================] - 1s 14ms/step - loss: 0.1500 - val_loss: 0.1507
Epoch 172/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1499 - val_loss: 0.1507
Epoch 173/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1498 - val_loss: 0.1506
Epoch 174/500
45/45 [==============================] - 1s 25ms/step - loss: 0.1498 - val_loss: 0.1505
Epoch 175/500
45/45 [==============================] - 1s 17ms/step - loss: 0.1497 - val_loss: 0.1504
Epoch 176/500
45/45 [==============================] - 1s 14ms/step - loss: 0.1496 - val_loss: 0.1504
Epoch 177/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1495 - val_loss: 0.1503
Epoch 178/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1494 - val_loss: 0.1503
Epoch 179/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1494 - val_loss: 0.1502
Epoch 180/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1493 - val_loss: 0.1501
Epoch 181/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1492 - val_loss: 0.1501
Epoch 182/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1491 - val_loss: 0.1500
Epoch 183/500
45/45 [==============================] - 1s 11ms/step - loss: 0.1491 - val_loss: 0.1500
Epoch 184/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1490 - val_loss: 0.1499
Epoch 185/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1489 - val_loss: 0.1498
Epoch 186/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1489 - val_loss: 0.1497
Epoch 187/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1488 - val_loss: 0.1497
Epoch 188/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1487 - val_loss: 0.1496
Epoch 189/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1486 - val_loss: 0.1496
Epoch 190/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1486 - val_loss: 0.1495
Epoch 191/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1485 - val_loss: 0.1494
Epoch 192/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1484 - val_loss: 0.1494
Epoch 193/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1484 - val_loss: 0.1493
Epoch 194/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1483 - val_loss: 0.1493
Epoch 195/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1483 - val_loss: 0.1492
Epoch 196/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1482 - val_loss: 0.1492
Epoch 197/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1481 - val_loss: 0.1491
Epoch 198/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1480 - val_loss: 0.1491
Epoch 199/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1480 - val_loss: 0.1490
Epoch 200/500
45/45 [==============================] - 0s 11ms/step - loss: 0.1479 - val_loss: 0.1490
Epoch 201/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1479 - val_loss: 0.1489
Epoch 202/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1478 - val_loss: 0.1489
Epoch 203/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1477 - val_loss: 0.1488
Epoch 204/500
45/45 [==============================] - 1s 12ms/step - loss: 0.1477 - val_loss: 0.1488
Epoch 205/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1476 - val_loss: 0.1487
Epoch 206/500
45/45 [==============================] - 1s 13ms/step - loss: 0.1476 - val_loss: 0.1487
Epoch 207/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1475 - val_loss: 0.1486
Epoch 208/500
45/45 [==============================] - 0s 9ms/step - loss: 0.1475 - val_loss: 0.1486
Epoch 209/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1474 - val_loss: 0.1485
Epoch 210/500
45/45 [==============================] - 0s 8ms/step - loss: 0.1473 - val_loss: 0.1485
Epoch 211/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1473 - val_loss: 0.1484
Epoch 212/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1472 - val_loss: 0.1484
...
Epoch 500/500
45/45 [==============================] - 0s 10ms/step - loss: 0.1387 - val_loss: 0.1433
../_images/dataset_applications_cfl_code_intro_14_1.png
WARNING:tensorflow:AutoGraph could not transform <function Model.make_predict_function.<locals>.predict_function at 0x104345680> and will run it as-is.
Please report this to the TensorFlow team. When filing the bug, set the verbosity to 10 (on Linux, `export AUTOGRAPH_VERBOSITY=10`) and attach the full output.
Cause: 'arguments' object has no attribute 'posonlyargs'
To silence this warning, decorate the function with @tf.autograph.experimental.do_not_convert
Loading parameters from  tmp_checkpoints01042022182345/best_weights
Saving parameters to  visual_bars_cf/experiment0000/trained_blocks/CondDensityEstimator
CondDensityEstimator training complete.
Beginning CauseClusterer training...
Initialization complete
Iteration 0, inertia 31.74104881286621
Iteration 1, inertia 27.955549240112305
Iteration 2, inertia 27.403724670410156
Iteration 3, inertia 27.310035705566406
Converged at iteration 3: center shift 8.516711204720195e-06 within tolerance 1.0708475112915039e-05.
...
Initialization complete
Iteration 0, inertia 30.616464614868164
Iteration 1, inertia 27.48357582092285
Iteration 2, inertia 27.311649322509766
Iteration 3, inertia 27.28844451904297
Converged at iteration 3: center shift 9.807829428609693e-07 within tolerance 1.0708475112915039e-05.
CauseClusterer training complete.
Experiment training complete.

Calling train() on the CFL Experiment returns a dictionary of dictionaries. The keys of the outer dictionary correspond to the Blocks in the CFL pipeline; in this case (and, at present, in every case) they are CondDensityEstimator and CauseClusterer. The keys of each inner dictionary are the results returned by that Block. Below, we access the cause macrostate labels:

[7]:
# Below, we print the first few `x_lbls`.
# We can see that there are 4 classes in the data, and
# that they are represented by the numbers `0` through `3`. Each of these labels tells us the
# macrovariable to which the corresponding visual bars image was assigned.

cause_cluster_results = train_results['CauseClusterer']
print(cause_cluster_results['x_lbls'][:20])
[0 3 1 1 3 3 0 2 1 0 1 1 3 0 3 1 2 1 2 3]
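To confirm how many macrostates were found and how the samples are distributed across them, we can tally the labels with np.unique. Here is a minimal sketch using only the 20 labels printed above; in a real run you would pass the full `x_lbls` array instead.

```python
import numpy as np

# The first 20 cause labels printed above, copied here for illustration;
# in practice, use the full `cause_cluster_results['x_lbls']` array.
x_lbls = np.array([0, 3, 1, 1, 3, 3, 0, 2, 1, 0, 1, 1, 3, 0, 3, 1, 2, 1, 2, 3])

# count how many samples fall into each macrostate
classes, counts = np.unique(x_lbls, return_counts=True)
print(classes)  # [0 1 2 3]
print(dict(zip(classes.tolist(), counts.tolist())))  # {0: 4, 1: 7, 2: 3, 3: 6}
```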

5. Optional: Predicting on a New Dataset

To predict on different data with the same, already-trained CFL pipeline, we create a second dataset, register it with the Experiment, and call the predict method on the new dataset.

[8]:
n_samples = 100
im_shape = (10, 10)
noise_lvl = 0.03
random_state = 180

# make a second dataset for prediction
vb_data = vbd.VisualBarsData(n_samples=n_samples, im_shape=im_shape,
                             noise_lvl=noise_lvl, set_random_seed=random_state)

# retrieve the images and the target
X_new = vb_data.getImages()
Y_new = vb_data.getTarget()

# reshape X and Y into the shapes the neural net expects by adding a trailing axis
X_new_CNN = np.expand_dims(X_new, -1)
Y_new_CNN = np.expand_dims(Y_new, -1)

# put X, Y into a new Dataset object
# and register the new dataset with the Experiment
my_exp.add_dataset(X=X_new_CNN, Y=Y_new_CNN, dataset_name='predict_data')

# predict!
results_new = my_exp.predict('predict_data')
Beginning Experiment prediction.
Beginning CondDensityEstimator prediction...
CondDensityEstimator prediction complete.
Beginning CauseClusterer prediction...
Initialization complete
Iteration 0, inertia 0.9525160789489746
Iteration 1, inertia 0.69085294008255
Iteration 2, inertia 0.6027211546897888
Iteration 3, inertia 0.5640036463737488
Iteration 4, inertia 0.5532782077789307
Iteration 5, inertia 0.5506182312965393
Iteration 6, inertia 0.5469791889190674
Iteration 7, inertia 0.5403821468353271
Iteration 8, inertia 0.5382898449897766
Iteration 9, inertia 0.5369626879692078
Converged at iteration 9: strict convergence.
Initialization complete
Iteration 0, inertia 1.0982351303100586
Iteration 1, inertia 0.5481781363487244
Iteration 2, inertia 0.5420354604721069
Converged at iteration 2: strict convergence.
Initialization complete
Iteration 0, inertia 0.5787597894668579
Iteration 1, inertia 0.5436096787452698
Iteration 2, inertia 0.5417847633361816
Converged at iteration 2: strict convergence.
Initialization complete
Iteration 0, inertia 0.6528603434562683
Iteration 1, inertia 0.5439485907554626
Iteration 2, inertia 0.5420354604721069
Converged at iteration 2: strict convergence.
Initialization complete
Iteration 0, inertia 0.7406123876571655
Iteration 1, inertia 0.5712636113166809
Iteration 2, inertia 0.5498159527778625
Iteration 3, inertia 0.5469791889190674
Iteration 4, inertia 0.5403821468353271
Iteration 5, inertia 0.5382898449897766
Iteration 6, inertia 0.5369626879692078
Converged at iteration 6: strict convergence.
Initialization complete
Iteration 0, inertia 0.6956158876419067
Iteration 1, inertia 0.5762235522270203
Iteration 2, inertia 0.5420354604721069
Converged at iteration 2: strict convergence.
Initialization complete
Iteration 0, inertia 0.9158010482788086
Iteration 1, inertia 0.5723662376403809
Iteration 2, inertia 0.5420354604721069
Converged at iteration 2: strict convergence.
Initialization complete
Iteration 0, inertia 0.6088575124740601
Iteration 1, inertia 0.5423613786697388
Converged at iteration 1: strict convergence.
Initialization complete
Iteration 0, inertia 0.9155406951904297
Iteration 1, inertia 0.6416553258895874
Iteration 2, inertia 0.5822334885597229
Iteration 3, inertia 0.5619029402732849
Iteration 4, inertia 0.5532782077789307
Iteration 5, inertia 0.5506182312965393
Iteration 6, inertia 0.5469791889190674
Iteration 7, inertia 0.5403821468353271
Iteration 8, inertia 0.5382898449897766
Iteration 9, inertia 0.5369626879692078
Converged at iteration 9: strict convergence.
Initialization complete
Iteration 0, inertia 0.5804789662361145
Iteration 1, inertia 0.5425180196762085
Iteration 2, inertia 0.5389862656593323
Iteration 3, inertia 0.5369626879692078
Converged at iteration 3: strict convergence.
CauseClusterer prediction complete.
Prediction complete.
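The np.expand_dims calls in the cell above add a trailing axis so that X becomes a 4-D array of shape (n_samples, height, width, channels), as a convolutional CDE expects, and Y becomes 2-D. A quick shape check with stand-in arrays (the sizes mirror the dataset above):

```python
import numpy as np

# stand-in arrays with the same shapes as the visual bars data above
X_demo = np.zeros((100, 10, 10))  # 100 grayscale 10x10 images
Y_demo = np.zeros(100)            # one target value per image

X_demo_CNN = np.expand_dims(X_demo, -1)  # add a channel axis -> 4-D
Y_demo_CNN = np.expand_dims(Y_demo, -1)  # add a feature axis -> 2-D

print(X_demo_CNN.shape)  # (100, 10, 10, 1)
print(Y_demo_CNN.shape)  # (100, 1)
```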
[9]:
# take a look at some of the results for the new data set
cluster_results_new = results_new['CauseClusterer']
cluster_results_new['x_lbls'][:20]
[9]:
array([2, 0, 0, 2, 2, 3, 3, 1, 2, 1, 2, 3, 2, 2, 0, 2, 1, 0, 1, 0],
      dtype=int32)

6. Visualize Results

We can view some images with their predicted labels using the viewImagesAndLabels() function.

[10]:
import visual_bars.visual_bars_vis as vis

vis.viewImagesAndLabels(X_new, im_shape=im_shape, n_examples=10, x_lbls=cluster_results_new['x_lbls'])
../_images/dataset_applications_cfl_code_intro_21_0.png

Conclusion

As we can see, CFL has done a good job of recovering the observational partition, with some errors: for the most part, the images with a horizontal bar, a vertical bar, both, or neither have been separated into distinct classes. Try this experiment again with a different CDE or a different sample size, and see how your results differ!