LUCID Advanced
In this example, we dive deeper into canonical inverse design by exploring different configurations. Note that the default configuration works well for many examples. The first part of the code is taken from the basic TensorFlow example.
[1]:
import pandas as pd
import tensorflow as tf
from canonical_sets.data import Adult
from canonical_sets.models import ClassifierTF
from canonical_sets import LUCID
tf.keras.utils.set_random_seed(42)
data = Adult()
model = ClassifierTF(2)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
callback = tf.keras.callbacks.EarlyStopping(monitor="val_accuracy", patience=5)
model.fit(data.train_data.to_numpy(), data.train_labels.to_numpy(), epochs=200,
validation_data=(data.val_data.to_numpy(), data.val_labels.to_numpy()), callbacks=[callback])
model.evaluate(data.test_data.to_numpy(), data.test_labels.to_numpy())
example_data = data.train_data
outputs = pd.DataFrame([[0, 1]], columns=["<=50K", ">50K"])
Epoch 1/200
755/755 [==============================] - 2s 2ms/step - loss: 0.4249 - accuracy: 0.7994 - val_loss: 0.3708 - val_accuracy: 0.8294
Epoch 2/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3649 - accuracy: 0.8265 - val_loss: 0.3563 - val_accuracy: 0.8356
Epoch 3/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3572 - accuracy: 0.8312 - val_loss: 0.3523 - val_accuracy: 0.8376
Epoch 4/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3541 - accuracy: 0.8338 - val_loss: 0.3491 - val_accuracy: 0.8351
Epoch 5/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3523 - accuracy: 0.8345 - val_loss: 0.3471 - val_accuracy: 0.8371
Epoch 6/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3511 - accuracy: 0.8344 - val_loss: 0.3465 - val_accuracy: 0.8369
Epoch 7/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3501 - accuracy: 0.8360 - val_loss: 0.3456 - val_accuracy: 0.8357
Epoch 8/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3494 - accuracy: 0.8360 - val_loss: 0.3441 - val_accuracy: 0.8384
Epoch 9/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3488 - accuracy: 0.8361 - val_loss: 0.3433 - val_accuracy: 0.8392
Epoch 10/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3483 - accuracy: 0.8364 - val_loss: 0.3428 - val_accuracy: 0.8405
Epoch 11/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3477 - accuracy: 0.8368 - val_loss: 0.3419 - val_accuracy: 0.8404
Epoch 12/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3472 - accuracy: 0.8363 - val_loss: 0.3440 - val_accuracy: 0.8374
Epoch 13/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3468 - accuracy: 0.8370 - val_loss: 0.3412 - val_accuracy: 0.8419
Epoch 14/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3463 - accuracy: 0.8365 - val_loss: 0.3407 - val_accuracy: 0.8414
Epoch 15/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3458 - accuracy: 0.8375 - val_loss: 0.3405 - val_accuracy: 0.8410
Epoch 16/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3457 - accuracy: 0.8370 - val_loss: 0.3400 - val_accuracy: 0.8392
Epoch 17/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3453 - accuracy: 0.8371 - val_loss: 0.3398 - val_accuracy: 0.8417
Epoch 18/200
755/755 [==============================] - 1s 1ms/step - loss: 0.3450 - accuracy: 0.8377 - val_loss: 0.3387 - val_accuracy: 0.8409
471/471 [==============================] - 0s 928us/step - loss: 0.3456 - accuracy: 0.8366
Canonical inverse design depends on a few hyperparameters, such as the number of samples, the number of epochs, and the learning rate.
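Conceptually, inverse design holds the trained model fixed and runs gradient descent on the input itself until the output approaches the desired target. A minimal numpy sketch of that idea (the one-unit "model", weights, and values below are purely illustrative, not LUCID's internals):

```python
import numpy as np

# Conceptual sketch of gradient-based inverse design (illustrative only):
# freeze the model and optimize the *input* so the prediction moves
# towards the desired target class.
w = np.array([0.8, -0.5, 0.3])            # frozen "model": one logistic unit
x = np.random.uniform(-0.1, 0.1, size=3)  # random initial input vector
target, lr = 1.0, 1.0                     # desired probability; lr=1 as below

for _ in range(5):                        # numb_of_epochs=5
    p = 1.0 / (1.0 + np.exp(-x @ w))      # prediction for the current input
    x -= lr * (p - target) * w            # BCE gradient w.r.t. the input
```

Each step moves the input, not the weights; after a few epochs the prediction for the optimized input is close to the target.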
[2]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 13.97it/s]
[2]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | ...
---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 9.924375e-01 | 0.007562 | 0.953400 | -0.239609 | 0.846492 | -0.476615 | -0.361806 | -0.763818 | ...
1 | 5 | 4.237517e-22 | 1.000000 | 1.904700 | 0.029004 | 2.135167 | 1.446826 | 0.500876 | 0.590019 | ...
1 | 6 | 4.276351e-07 | 1.000000 | 1.904700 | 0.029004 | 2.135167 | 1.446826 | 0.500876 | 0.590019 | ...
2 | 1 | 2.573124e-01 | 0.742688 | -0.008990 | 0.521141 | 0.374523 | -0.503690 | 0.244093 | -0.368689 | ...
2 | 5 | 2.811741e-07 | 1.000000 | 0.237662 | 0.590786 | 0.708648 | -0.004985 | 0.467768 | -0.017669 | ...

5 rows × 106 columns (one-hot encoded categorical columns omitted for readability)
You will notice that there are 6 epochs instead of the requested 5 (numb_of_epochs). This is because the extra_epoch argument is True by default. We will go into more detail later in this tutorial; for now, we simply set it to False.
[3]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, extra_epoch=False)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 24.28it/s]
[3]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | ...
---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 9.924375e-01 | 0.007562 | 0.953400 | -0.239609 | 0.846492 | -0.476615 | -0.361806 | -0.763818 | ...
1 | 5 | 4.237517e-22 | 1.000000 | 1.904700 | 0.029004 | 2.135167 | 1.446826 | 0.500876 | 0.590019 | ...
2 | 1 | 2.573124e-01 | 0.742688 | -0.008990 | 0.521141 | 0.374523 | -0.503690 | 0.244093 | -0.368689 | ...
2 | 5 | 2.811741e-07 | 1.000000 | 0.237662 | 0.590786 | 0.708648 | -0.004985 | 0.467768 | -0.017669 | ...
3 | 1 | 1.750611e-02 | 0.982494 | 0.461862 | 0.364942 | 0.525183 | 0.198825 | -0.529045 | -0.943877 | ...

5 rows × 106 columns (one-hot encoded categorical columns omitted for readability)
Besides these hyperparameters, you can also set the range of the uniform distribution from which the initial input vectors are sampled.
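A small numpy sketch of what the low and high arguments control (the bounds here are illustrative; only the sampling range changes, the rest of the procedure is untouched):

```python
import numpy as np

# Initial input vectors are drawn uniformly from [low, high];
# narrowing the range keeps the starting points closer to zero.
rng = np.random.default_rng(0)
x_default = rng.uniform(-1.0, 1.0, size=5)  # a wider starting spread
x_narrow = rng.uniform(-0.5, 0.5, size=5)   # low=-0.5, high=0.5 as in the next cell
```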
[4]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=False)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 23.21it/s]
[4]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | ...
---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 9.113827e-01 | 0.088617 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | ...
1 | 5 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | ...
2 | 1 | 3.457343e-01 | 0.654266 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | ...
2 | 5 | 3.462221e-09 | 1.000000 | 0.326914 | 0.354148 | 0.636203 | 0.418232 | 0.422583 | 0.287298 | ...
3 | 1 | 1.070129e-01 | 0.892987 | 0.230931 | 0.182471 | 0.262591 | 0.099413 | -0.264523 | -0.471939 | ...

5 rows × 106 columns (one-hot encoded categorical columns omitted for readability)
There is also the pre- and post-processing of the categorical features. Because the initial input vectors are drawn from a uniform distribution, they do not look like real examples with cleanly one-hot encoded categorical features. Setting one_hot_pre to True one-hot encodes the features by setting the maximum value within each categorical group to 1 and all the others to 0. Note that the default value is False, so the inverse design starts from the randomly drawn values.
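The projection described above can be sketched as follows (an assumption about the mechanism, not LUCID's exact code): within each group of dummy columns, the largest value becomes 1 and the rest become 0.

```python
import numpy as np

# Sketch of the one_hot_pre projection (illustrative assumption):
# the maximum value in a categorical group is set to 1, all others to 0.
def project_one_hot(group_values):
    out = np.zeros_like(group_values)
    out[np.argmax(group_values)] = 1.0
    return out

workclass = np.array([0.2, -0.4, 0.9, 0.1])  # a randomly drawn group
print(project_one_hot(workclass))            # [0. 0. 1. 0.]
```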
[5]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=False,
one_hot_pre=True)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 24.87it/s]
[5]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | ...
---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 5.788748e-01 | 0.421125 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | ...
1 | 5 | 2.729235e-14 | 1.000000 | 1.031588 | 0.036876 | 1.174923 | 0.883625 | 0.322295 | 0.407777 | ...
2 | 1 | 8.363793e-01 | 0.163621 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | ...
2 | 5 | 8.151322e-20 | 1.000000 | 0.797228 | 0.486947 | 1.273311 | 1.369164 | 0.849085 | 0.956622 | ...
3 | 1 | 5.620487e-01 | 0.437951 | 0.230931 | 0.182471 | 0.262591 | 0.099413 | -0.264523 | -0.471939 | ...

5 rows × 106 columns (one-hot encoded categorical columns omitted for readability)
It may seem that the output has hardly changed (though note that the outcomes have). To get a better understanding of what is happening, we can set one_hot_post to False (it is True by default).
[6]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=False,
one_hot_pre=True, one_hot_post=False)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 22.12it/s]
[6]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | Workclass+Federal-gov | Workclass+Local-gov | ...
---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 5.788748e-01 | 0.421125 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | 0.000000 | 0.000000 | ...
1 | 5 | 2.729235e-14 | 1.000000 | 1.031588 | 0.036876 | 1.174923 | 0.883625 | 0.322295 | 0.407777 | 0.391457 | 0.084103 | ...
2 | 1 | 8.363793e-01 | 0.163621 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | 0.000000 | 0.000000 | ...
2 | 5 | 8.151322e-20 | 1.000000 | 0.797228 | 0.486947 | 1.273311 | 1.369164 | 0.849085 | 0.956622 | 0.565591 | 0.121515 | ...
3 | 1 | 5.620487e-01 | 0.437951 | 0.230931 | 0.182471 | 0.262591 | 0.099413 | -0.264523 | -0.471939 | 0.000000 | 0.000000 | ...

5 rows × 106 columns (remaining categorical columns omitted for readability)
Looking at the categorical variables, we now see that only the first epoch is one-hot encoded. Finally, setting both options to False:
[7]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=False,
one_hot_pre=False, one_hot_post=False)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 22.86it/s]
[7]:
sample | epoch | <=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | Workclass+Federal-gov | Workclass+Local-gov | ...
---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 1 | 9.113827e-01 | 0.088617 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | -0.258234 | -0.181466 | ...
1 | 5 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0.358077 | -0.049054 | ...
2 | 1 | 3.457343e-01 | 0.654266 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | -0.222784 | 0.429865 | ...
2 | 5 | 3.462221e-09 | 1.000000 | 0.326914 | 0.354148 | 0.636203 | 0.418232 | 0.422583 | 0.287298 | 0.011015 | 0.480096 | ...
3 | 1 | 1.070129e-01 | 0.892987 | 0.230931 | 0.182471 | 0.262591 | 0.099413 | -0.264523 | -0.471939 | 0.411676 | -0.085919 | 0.223464 | -0.234049 | 0.480370 | -0.321910 | -0.453082 | -0.005884 | -0.129471 | 0.155574 | -0.499954 | -0.356985 | 0.358976 | 0.182810 | -0.167967 | -0.475184 | -0.457797 | -0.110417 | 0.443011 | 0.418612 | -0.185273 | -0.127779 | -0.233250 | -0.358126 | 0.091519 | -0.036920 | 0.335330 | -0.004566 | -0.482285 | -0.062105 | -0.365962 | -0.167599 | ... | -0.455733 | -0.051803 | -0.033444 | 0.460267 | -0.242194 | -0.055889 | 0.059008 | 0.349989 | -0.374671 | 0.428501 | 0.079709 | -0.037467 | -0.056606 | -0.450109 | -0.224974 | 0.119929 | 0.419012 | 0.420682 | -0.243936 | 0.036146 | -0.194076 | -0.132148 | -0.148903 | -0.067192 | -0.496738 | -0.045131 | -0.186041 | -0.236350 | 0.072413 | -0.008981 | 0.219689 | -0.036214 | 0.351113 | 0.288333 | 0.041047 | 0.446285 | 0.146386 | -0.078596 | -0.273668 | -0.337543 |
5 rows × 106 columns
We see that none of the epochs is one-hot encoded. The pre-processing happens before the inverse design and thus affects the outcomes, whereas the post-processing happens after the inverse design and does not. By default, we apply post-processing to all epochs for a clean view, but no pre-processing, so we start from the initially random vectors. To see whether the “real” examples at the end would lead to very different conclusions than the “unrealistic” examples we get from the inverse design, we one-hot encode the categorical features and run an additional forward pass. This is done by setting the extra_epoch argument to True (which it is by default).
[8]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=True,
one_hot_pre=False, one_hot_post=False)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 14.34it/s]
[8]:
<=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | Workclass+Federal-gov | Workclass+Local-gov | Workclass+Private | Workclass+Self-emp-inc | Workclass+Self-emp-not-inc | Workclass+State-gov | Workclass+Without-pay | Education+10th | Education+11th | Education+12th | Education+1st-4th | Education+5th-6th | Education+7th-8th | Education+9th | Education+Assoc-acdm | Education+Assoc-voc | Education+Bachelors | Education+Doctorate | Education+HS-grad | Education+Masters | Education+Preschool | Education+Prof-school | Education+Some-college | Martial Status+Divorced | Martial Status+Married-AF-spouse | Martial Status+Married-civ-spouse | Martial Status+Married-spouse-absent | Martial Status+Never-married | Martial Status+Separated | Martial Status+Widowed | Occupation+Adm-clerical | Occupation+Armed-Forces | ... | Country+Canada | Country+China | Country+Columbia | Country+Cuba | Country+Dominican-Republic | Country+Ecuador | Country+El-Salvador | Country+England | Country+France | Country+Germany | Country+Greece | Country+Guatemala | Country+Haiti | Country+Holand-Netherlands | Country+Honduras | Country+Hong | Country+Hungary | Country+India | Country+Iran | Country+Ireland | Country+Italy | Country+Jamaica | Country+Japan | Country+Laos | Country+Mexico | Country+Nicaragua | Country+Outlying-US(Guam-USVI-etc) | Country+Peru | Country+Philippines | Country+Poland | Country+Portugal | Country+Puerto-Rico | Country+Scotland | Country+South | Country+Taiwan | Country+Thailand | Country+Trinadad&Tobago | Country+United-States | Country+Vietnam | Country+Yugoslavia | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
sample | epoch
1 | 1 | 9.113827e-01 | 0.088617 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | -0.258234 | -0.181466 | 0.464079 | -0.236350 | -0.058994 | 0.109871 | 0.363621 | 0.363758 | 0.174881 | 0.159874 | 0.235758 | -0.277246 | -0.327934 | 0.370415 | -0.439861 | 0.183689 | 0.171238 | 0.111018 | -0.439863 | 0.477769 | -0.061048 | 0.032595 | -0.496868 | -0.248733 | 0.358490 | -0.074702 | 0.235819 | 0.422043 | -0.346526 | 0.492259 | -0.317668 | 0.440113 | ... | -0.475167 | 0.012549 | 0.112786 | -0.219651 | -0.047868 | 0.031221 | 0.488635 | -0.494022 | -0.298712 | -0.243124 | -0.159255 | -0.446998 | -0.173506 | -0.301642 | 0.329215 | 0.137122 | -0.414559 | -0.279840 | 0.181176 | -0.365590 | 0.456514 | -0.363476 | 0.297702 | -0.001706 | -0.195292 | -0.265794 | 0.393008 | -0.122463 | -0.423655 | -0.466193 | -0.038387 | 0.373824 | -0.149391 | 0.323521 | 0.447199 | -0.432959 | -0.202160 | 0.119161 | -0.198243 | -0.325154 |
5 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0.358077 | -0.049054 | 0.764119 | 0.235373 | -0.187175 | 0.054723 | -1.520012 | -0.038638 | -0.290532 | -0.258323 | -0.151349 | -0.839940 | -0.960510 | -0.123262 | -0.654546 | 0.121049 | 0.358001 | 0.689186 | -0.669042 | 0.831165 | -2.376504 | 0.814956 | -0.560042 | -0.440323 | 1.407497 | 0.853604 | 0.103010 | -0.185685 | -0.560385 | 0.526152 | -0.290558 | -0.143325 | ... | -0.003762 | -0.356417 | -1.317336 | 0.121711 | -0.978367 | -0.057567 | 0.241150 | -0.466904 | 0.056865 | 0.032397 | -0.529028 | -0.598071 | -0.101765 | -0.965958 | -0.269870 | 0.094564 | -0.176903 | -0.465037 | -0.089950 | 0.191408 | 1.105003 | -0.399136 | 0.472703 | -0.293078 | -0.756070 | -1.180237 | -1.506882 | -0.709832 | 0.025622 | -0.544580 | -0.227058 | -0.139960 | -0.507906 | -0.377702 | 0.487784 | -0.993232 | -0.457052 | 0.292582 | -0.649984 | 0.138931 | |
6 | 7.542913e-07 | 0.999999 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0.000000 | 0.000000 | 1.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | ... | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | |
2 | 1 | 3.457343e-01 | 0.654266 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | -0.222784 | 0.429865 | -0.220405 | 0.448835 | -0.297983 | 0.065478 | -0.119697 | 0.308107 | -0.372446 | 0.451340 | 0.228827 | -0.324455 | 0.192759 | -0.407167 | 0.056402 | -0.397506 | -0.456307 | -0.107309 | 0.191054 | 0.226977 | 0.169032 | 0.091555 | 0.379001 | 0.037658 | -0.402178 | -0.358704 | 0.462280 | 0.298104 | -0.493246 | 0.445812 | 0.155423 | 0.410685 | ... | -0.003134 | -0.185595 | -0.469695 | -0.407376 | 0.089020 | 0.134038 | 0.466344 | -0.485120 | 0.233160 | 0.132863 | 0.001685 | -0.379063 | 0.357940 | 0.202411 | 0.068614 | 0.481099 | -0.409647 | 0.227080 | -0.011366 | -0.219501 | 0.385330 | 0.307728 | -0.060551 | 0.141648 | 0.131039 | -0.491563 | -0.278342 | -0.283107 | -0.279731 | 0.241137 | 0.441554 | -0.233563 | 0.247273 | -0.497929 | 0.309356 | 0.306809 | -0.445467 | -0.401064 | 0.053227 | 0.262911 |
5 | 3.462221e-09 | 1.000000 | 0.326914 | 0.354148 | 0.636203 | 0.418232 | 0.422583 | 0.287298 | 0.011015 | 0.480096 | -0.106584 | 0.627784 | -0.346608 | 0.044558 | -0.834257 | 0.155457 | -0.549002 | 0.292696 | 0.081978 | -0.537914 | -0.047210 | -0.594444 | -0.025039 | -0.421268 | -0.385458 | 0.112019 | 0.104114 | 0.361039 | -0.709340 | 0.388345 | 0.355035 | -0.035022 | -0.004235 | -0.006550 | 0.411899 | 0.067561 | -0.574374 | 0.458669 | 0.165707 | 0.189356 | ... | 0.175695 | -0.325563 | -1.012214 | -0.277880 | -0.263966 | 0.100356 | 0.372461 | -0.474833 | 0.368049 | 0.237382 | -0.138589 | -0.436373 | 0.385155 | -0.049599 | -0.158650 | 0.464955 | -0.319492 | 0.156825 | -0.114218 | -0.008203 | 0.631336 | 0.294201 | 0.005836 | 0.031116 | -0.081693 | -0.838459 | -0.999068 | -0.505927 | -0.109297 | 0.211401 | 0.369981 | -0.428468 | 0.111270 | -0.763939 | 0.324752 | 0.094268 | -0.542161 | -0.335276 | -0.118142 | 0.438962 |
5 rows × 106 columns
The extra epoch is now the only one whose categorical features are one-hot encoded. By applying post-processing again, you will see that the outcomes do not change, but that the categorical features are now one-hot encoded across all epochs.
[9]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=True,
one_hot_pre=False, one_hot_post=True)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 13.95it/s]
[9]:
<=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | Workclass+Federal-gov | Workclass+Local-gov | Workclass+Private | Workclass+Self-emp-inc | Workclass+Self-emp-not-inc | Workclass+State-gov | Workclass+Without-pay | Education+10th | Education+11th | Education+12th | Education+1st-4th | Education+5th-6th | Education+7th-8th | Education+9th | Education+Assoc-acdm | Education+Assoc-voc | Education+Bachelors | Education+Doctorate | Education+HS-grad | Education+Masters | Education+Preschool | Education+Prof-school | Education+Some-college | Martial Status+Divorced | Martial Status+Married-AF-spouse | Martial Status+Married-civ-spouse | Martial Status+Married-spouse-absent | Martial Status+Never-married | Martial Status+Separated | Martial Status+Widowed | Occupation+Adm-clerical | Occupation+Armed-Forces | ... | Country+Canada | Country+China | Country+Columbia | Country+Cuba | Country+Dominican-Republic | Country+Ecuador | Country+El-Salvador | Country+England | Country+France | Country+Germany | Country+Greece | Country+Guatemala | Country+Haiti | Country+Holand-Netherlands | Country+Honduras | Country+Hong | Country+Hungary | Country+India | Country+Iran | Country+Ireland | Country+Italy | Country+Jamaica | Country+Japan | Country+Laos | Country+Mexico | Country+Nicaragua | Country+Outlying-US(Guam-USVI-etc) | Country+Peru | Country+Philippines | Country+Poland | Country+Portugal | Country+Puerto-Rico | Country+Scotland | Country+South | Country+Taiwan | Country+Thailand | Country+Trinadad&Tobago | Country+United-States | Country+Vietnam | Country+Yugoslavia | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
sample | epoch
1 | 1 | 9.113827e-01 | 0.088617 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | |
6 | 7.542913e-07 | 0.999999 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | |
2 | 1 | 3.457343e-01 | 0.654266 | -0.004495 | 0.260570 | 0.187262 | -0.251845 | 0.122047 | -0.184344 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 | 3.462221e-09 | 1.000000 | 0.326914 | 0.354148 | 0.636203 | 0.418232 | 0.422583 | 0.287298 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 rows × 106 columns
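The effect of this one-hot post-processing can be mimicked with plain pandas: within each group of dummy columns sharing a prefix, keep only the column with the largest value as 1 and zero out the rest. A minimal sketch on a toy frame (the `one_hot_post` helper and the values are made up for illustration; this is not the library's actual implementation):

```python
import pandas as pd

# Toy frame with one numeric feature and one categorical group ("Workclass")
df = pd.DataFrame({
    "Age": [1.35, 0.33],
    "Workclass+Private": [0.76, -0.11],
    "Workclass+State-gov": [0.05, 0.04],
})

def one_hot_post(frame, prefix):
    """Within one categorical group, set the argmax column to 1 and the rest to 0."""
    cols = [c for c in frame.columns if c.startswith(prefix + "+")]
    out = frame.copy()
    winners = out[cols].idxmax(axis=1)  # winning dummy column per row
    out[cols] = 0
    for i, w in zip(out.index, winners):
        out.loc[i, w] = 1
    return out

print(one_hot_post(df, "Workclass"))
```

Numerical features such as Age are left untouched; only the dummy columns are snapped to a valid one-hot encoding.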
Finally, we have mostly been focusing on the first and last epoch, but the log_every_n argument allows us to monitor many more epochs and to perform a dynamic analysis.
[10]:
lucid = LUCID(model, outputs, example_data, numb_of_samples=10,
numb_of_epochs=5, lr=1, low=-0.5, high=0.5, extra_epoch=True,
one_hot_pre=False, one_hot_post=True, log_every_n=1)
lucid.results.head()
100%|██████████| 10/10 [00:00<00:00, 12.68it/s]
[10]:
<=50K | >50K | Age | fnlwgt | Education-Num | Capital Gain | Capital Loss | Hours per week | Workclass+Federal-gov | Workclass+Local-gov | Workclass+Private | Workclass+Self-emp-inc | Workclass+Self-emp-not-inc | Workclass+State-gov | Workclass+Without-pay | Education+10th | Education+11th | Education+12th | Education+1st-4th | Education+5th-6th | Education+7th-8th | Education+9th | Education+Assoc-acdm | Education+Assoc-voc | Education+Bachelors | Education+Doctorate | Education+HS-grad | Education+Masters | Education+Preschool | Education+Prof-school | Education+Some-college | Martial Status+Divorced | Martial Status+Married-AF-spouse | Martial Status+Married-civ-spouse | Martial Status+Married-spouse-absent | Martial Status+Never-married | Martial Status+Separated | Martial Status+Widowed | Occupation+Adm-clerical | Occupation+Armed-Forces | ... | Country+Canada | Country+China | Country+Columbia | Country+Cuba | Country+Dominican-Republic | Country+Ecuador | Country+El-Salvador | Country+England | Country+France | Country+Germany | Country+Greece | Country+Guatemala | Country+Haiti | Country+Holand-Netherlands | Country+Honduras | Country+Hong | Country+Hungary | Country+India | Country+Iran | Country+Ireland | Country+Italy | Country+Jamaica | Country+Japan | Country+Laos | Country+Mexico | Country+Nicaragua | Country+Outlying-US(Guam-USVI-etc) | Country+Peru | Country+Philippines | Country+Poland | Country+Portugal | Country+Puerto-Rico | Country+Scotland | Country+South | Country+Taiwan | Country+Thailand | Country+Trinadad&Tobago | Country+United-States | Country+Vietnam | Country+Yugoslavia | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
sample | epoch
1 | 1 | 9.113827e-01 | 0.088617 | 0.476700 | -0.119804 | 0.423246 | -0.238308 | -0.180903 | -0.381909 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
2 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | |
3 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | |
4 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | |
5 | 2.751082e-21 | 1.000000 | 1.350318 | 0.126873 | 1.606689 | 1.528067 | 0.611333 | 0.861375 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ... | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
5 rows × 106 columns
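With every epoch logged, one simple form of dynamic analysis is to trace how a feature evolves across epochs for each sample. A sketch on a toy frame with the same (sample, epoch) MultiIndex shape as lucid.results (the values are made up):

```python
import pandas as pd

# Toy results frame mimicking the (sample, epoch) MultiIndex of lucid.results
idx = pd.MultiIndex.from_product([[1, 2], [1, 2, 3]], names=["sample", "epoch"])
results = pd.DataFrame({"Age": [0.48, 0.90, 1.35, -0.00, 0.20, 0.33]}, index=idx)

# One column per sample, one row per epoch: the trajectory of "Age"
trajectories = results["Age"].unstack("sample")
print(trajectories)
```

Plotting such trajectories (e.g. with `trajectories.plot()`) shows how quickly each sample converges during the inverse design.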
Hopefully, canonical inverse design seems less mysterious by now! Note that many of the arguments have sensible defaults that apply to most cases, so you are not required to specify all of them every time. If anything is still unclear, feel free to open an issue or a PR!