Best Practices Optimization¶
ML/AI Best Practices: "Selecting Surrogate Model Form/Size for Optimization"¶
In this notebook we demonstrate the use of model and solver statistics to select the best surrogate model. For this purpose, different models were trained offline with ALAMO, PySMO (three basis forms), and TensorFlow Keras. The surrogates are imported into the notebook, and the IDAES flowsheet is constructed and solved.
1. Introduction¶
This example demonstrates autothermal reformer optimization using surrogates built with the ALAMO, PySMO, and Keras trainers, and compares key indicators of model performance. In this notebook, IPOPT is run with solver statistics collected for flowsheets containing ALAMO, PySMO Polynomial, PySMO RBF, PySMO Kriging, and Keras surrogate models, to assess each model type for flowsheet integration and tractability.
2. Problem Statement¶
Within the context of a larger Natural Gas Fuel Cell (NGFC) system, the autothermal reformer unit generates syngas from air, steam, and natural gas. Two input variables are considered in this example: the reformer bypass fraction and the natural gas to steam ratio. The reformer bypass fraction (also called the internal reformation percentage) plays an important role in the final syngas composition and is typically a controlled variable in this process. The natural gas to steam ratio affects the final syngas reactions and the heat duty required by the reactor. The syngas is then used as fuel by a solid-oxide fuel cell (SOFC) to generate electricity and heat.
The autothermal reformer is typically modeled with the IDAES Gibbs reactor, which is robust once initialized; however, overall model robustness suffers because of the number of components in the reaction, scaling issues for the Lagrange multipliers, and the Gibbs free energy minimization formulation. Substituting rigorously trained and validated surrogates for the rigorous unit model equations increases the robustness of the problem.
2.1. Inputs:¶
- Bypass fraction (dimensionless) - fraction of natural gas that bypasses the AR unit and is fed directly to the power island
- NG-Steam Ratio (dimensionless) - ratio of natural gas to steam fed to the AR unit
2.2. Outputs:¶
- Steam flowrate (kg/s) - inlet steam fed to AR unit
- Reformer duty (kW) - required energy input to AR unit
- Composition (dimensionless) - outlet mole fractions of components (Ar, C2H6, C3H8, C4H10, CH4, CO, CO2, H2, H2O, N2, O2)
from IPython.display import Image
Image("AR_PFD.png")
3. Training Surrogates¶
Previous Jupyter notebooks demonstrated the workflow to import data, train surrogate models using ALAMO, PySMO, and Keras, and generate IDAES validation plots. To keep this notebook simple, it loads surrogate models that were trained offline.
Note that the training/loading method includes a "retrain" argument in case the user wants to retrain all surrogate models. Since retraining runs ALAMO, PySMO (Polynomial, Radial Basis Function, and Kriging basis types), and Keras, it takes about one hour to train all models.
Each run will overwrite the serialized JSON files of previously trained surrogates if retraining is enforced. To retrain an individual surrogate, simply delete the corresponding JSON file before running this notebook (for Keras, delete the keras_surrogate/ folder).
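Once serialized, each trained surrogate can be reloaded directly from its JSON file (or, for Keras, from the saved folder) using the corresponding IDAES surrogate class. A minimal sketch, assuming the file and folder names follow the serialization convention described above:

from idaes.core.surrogate.alamopy import AlamoSurrogate
from idaes.core.surrogate.pysmo_surrogate import PysmoSurrogate
from idaes.core.surrogate.keras_surrogate import KerasSurrogate

# File/folder names below are assumptions based on the serialization convention above
alamo_surrogate = AlamoSurrogate.load_from_file("alamo_surrogate.json")
pysmo_poly_surrogate = PysmoSurrogate.load_from_file("pysmo_poly_surrogate.json")
keras_surrogate = KerasSurrogate.load_from_folder("keras_surrogate")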
from AR_training_methods import train_load_surrogates
train_load_surrogates(retrain=False)
# setting retrain to True will take ~ 1 hour to run, best to load if possible
# setting retrain to False will only generate missing surrogates (only if JSON/folder doesn't exist)
# this method trains surrogates and serializes to JSON, so no objects are returned from the method itself
# imports to capture long output
from io import StringIO
import sys
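These imports support a simple pattern for capturing verbose solver or training output so it does not flood the notebook: temporarily redirect sys.stdout into a StringIO buffer and restore it afterwards. A minimal sketch of that pattern (the print statement stands in for any long-running, verbose call):

# Redirect stdout into a buffer while running a verbose call, then restore it
capture = StringIO()
old_stdout, sys.stdout = sys.stdout, capture
try:
    print("...long solver or training output would appear here...")
finally:
    sys.stdout = old_stdout
captured_text = capture.getvalue()  # inspect or summarize as needed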
Loading existing surrogate models and training missing models. Any training output will print below; otherwise, models will be loaded without any further output.

[Cell output truncated: TensorFlow environment warnings (CPU-only run, no GPU libraries found) followed by the Keras training log, one line per epoch (Epoch 1/1000 onward), with training and validation loss/MAE/MSE decreasing from about 0.37 to about 0.003.]
0.0030 - val_loss: 0.0029 - val_mae: 0.0434 - val_mse: 0.0029 Epoch 368/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0433 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0421 - val_mse: 0.0028 Epoch 369/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0429 - mse: 0.0030 - val_loss: 0.0027 - val_mae: 0.0417 - val_mse: 0.0027 Epoch 370/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0429 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0426 - val_mse: 0.0028 Epoch 371/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0431 - mse: 0.0030 - val_loss: 0.0029 - val_mae: 0.0434 - val_mse: 0.0029 Epoch 372/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0433 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0427 - val_mse: 0.0028 Epoch 373/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0433 - mse: 0.0030 - val_loss: 0.0029 - val_mae: 0.0434 - val_mse: 0.0029 Epoch 374/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0433 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0430 - val_mse: 0.0028 Epoch 375/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0431 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0427 - val_mse: 0.0028 Epoch 376/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0433 - mse: 0.0030 - val_loss: 0.0029 - val_mae: 0.0432 - val_mse: 0.0029 Epoch 377/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0431 - mse: 0.0030 - val_loss: 0.0027 - val_mae: 0.0420 - val_mse: 0.0027 Epoch 378/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0030 - mae: 0.0427 - mse: 0.0030 - val_loss: 0.0026 - val_mae: 0.0403 - val_mse: 0.0026 Epoch 379/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0422 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0419 - val_mse: 0.0028 Epoch 380/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0428 - mse: 0.0030 - val_loss: 0.0030 - val_mae: 0.0437 - val_mse: 0.0030 Epoch 381/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0432 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0428 - val_mse: 0.0028 Epoch 382/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0428 - mse: 0.0029 - val_loss: 0.0027 - val_mae: 0.0419 - val_mse: 0.0027 Epoch 383/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0426 - mse: 0.0029 - val_loss: 0.0027 - val_mae: 0.0420 - val_mse: 0.0027 Epoch 384/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0429 - mse: 0.0030 - val_loss: 0.0028 - val_mae: 0.0425 - val_mse: 0.0028 Epoch 385/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0030 - mae: 0.0424 - mse: 0.0030 - val_loss: 0.0026 - val_mae: 0.0403 - val_mse: 0.0026 Epoch 386/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0419 - mse: 0.0029 - val_loss: 0.0027 - val_mae: 0.0418 - val_mse: 0.0027 Epoch 387/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0436 - mse: 0.0030 - val_loss: 0.0029 - val_mae: 0.0435 - val_mse: 0.0029 Epoch 388/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0029 - mae: 0.0424 - mse: 0.0029 - val_loss: 0.0026 - val_mae: 0.0407 - val_mse: 0.0026 Epoch 389/1000 3/3 
[==============================] - 0s 16ms/step - loss: 0.0030 - mae: 0.0421 - mse: 0.0030 - val_loss: 0.0025 - val_mae: 0.0403 - val_mse: 0.0025 Epoch 390/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0418 - mse: 0.0030 - val_loss: 0.0026 - val_mae: 0.0407 - val_mse: 0.0026 Epoch 391/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0424 - mse: 0.0029 - val_loss: 0.0029 - val_mae: 0.0435 - val_mse: 0.0029 Epoch 392/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0430 - mse: 0.0029 - val_loss: 0.0028 - val_mae: 0.0429 - val_mse: 0.0028 Epoch 393/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0427 - mse: 0.0029 - val_loss: 0.0027 - val_mae: 0.0419 - val_mse: 0.0027 Epoch 394/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0029 - mae: 0.0421 - mse: 0.0029 - val_loss: 0.0025 - val_mae: 0.0403 - val_mse: 0.0025 Epoch 395/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0417 - mse: 0.0029 - val_loss: 0.0025 - val_mae: 0.0401 - val_mse: 0.0025 Epoch 396/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0416 - mse: 0.0029 - val_loss: 0.0027 - val_mae: 0.0412 - val_mse: 0.0027 Epoch 397/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0420 - mse: 0.0029 - val_loss: 0.0026 - val_mae: 0.0409 - val_mse: 0.0026 Epoch 398/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0028 - mae: 0.0416 - mse: 0.0028 - val_loss: 0.0025 - val_mae: 0.0403 - val_mse: 0.0025 Epoch 399/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0415 - mse: 0.0028 - val_loss: 0.0026 - val_mae: 0.0409 - val_mse: 0.0026 Epoch 400/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0417 - mse: 0.0028 - val_loss: 0.0027 - val_mae: 0.0420 - val_mse: 0.0027 Epoch 401/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0421 - mse: 0.0029 - val_loss: 0.0028 - val_mae: 0.0422 - val_mse: 0.0028 Epoch 402/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0030 - mae: 0.0440 - mse: 0.0030 - val_loss: 0.0029 - val_mae: 0.0438 - val_mse: 0.0029 Epoch 403/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0029 - mae: 0.0425 - mse: 0.0029 - val_loss: 0.0025 - val_mae: 0.0404 - val_mse: 0.0025 Epoch 404/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0029 - mae: 0.0415 - mse: 0.0029 - val_loss: 0.0025 - val_mae: 0.0395 - val_mse: 0.0025 Epoch 405/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0408 - mse: 0.0028 - val_loss: 0.0027 - val_mae: 0.0411 - val_mse: 0.0027 Epoch 406/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0424 - mse: 0.0028 - val_loss: 0.0030 - val_mae: 0.0439 - val_mse: 0.0030 Epoch 407/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0431 - mse: 0.0029 - val_loss: 0.0028 - val_mae: 0.0429 - val_mse: 0.0028 Epoch 408/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0423 - mse: 0.0028 - val_loss: 0.0026 - val_mae: 0.0411 - val_mse: 0.0026 Epoch 409/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0029 - mae: 0.0417 - mse: 0.0029 - val_loss: 0.0026 - val_mae: 0.0410 - val_mse: 0.0026 Epoch 410/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0413 - mse: 
0.0028 - val_loss: 0.0029 - val_mae: 0.0434 - val_mse: 0.0029 Epoch 411/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0432 - mse: 0.0029 - val_loss: 0.0028 - val_mae: 0.0426 - val_mse: 0.0028 Epoch 412/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0027 - mae: 0.0412 - mse: 0.0027 - val_loss: 0.0025 - val_mae: 0.0397 - val_mse: 0.0025 Epoch 413/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0029 - mae: 0.0412 - mse: 0.0029 - val_loss: 0.0024 - val_mae: 0.0386 - val_mse: 0.0024 Epoch 414/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0408 - mse: 0.0028 - val_loss: 0.0028 - val_mae: 0.0422 - val_mse: 0.0028 Epoch 415/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0423 - mse: 0.0028 - val_loss: 0.0027 - val_mae: 0.0413 - val_mse: 0.0027 Epoch 416/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0409 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0395 - val_mse: 0.0024 Epoch 417/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0408 - mse: 0.0028 - val_loss: 0.0024 - val_mae: 0.0386 - val_mse: 0.0024 Epoch 418/1000 3/3 [==============================] - 0s 13ms/step - loss: 0.0028 - mae: 0.0406 - mse: 0.0028 - val_loss: 0.0027 - val_mae: 0.0417 - val_mse: 0.0027 Epoch 419/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0028 - mae: 0.0419 - mse: 0.0028 - val_loss: 0.0027 - val_mae: 0.0413 - val_mse: 0.0027 Epoch 420/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0409 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0393 - val_mse: 0.0024 Epoch 421/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0404 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0395 - val_mse: 0.0024 Epoch 422/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0403 - mse: 0.0027 - val_loss: 0.0026 - val_mae: 0.0408 - val_mse: 0.0026 Epoch 423/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0413 - mse: 0.0027 - val_loss: 0.0027 - val_mae: 0.0414 - val_mse: 0.0027 Epoch 424/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0407 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0391 - val_mse: 0.0024 Epoch 425/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0027 - mae: 0.0404 - mse: 0.0027 - val_loss: 0.0023 - val_mae: 0.0385 - val_mse: 0.0023 Epoch 426/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0027 - mae: 0.0402 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0390 - val_mse: 0.0024 Epoch 427/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0404 - mse: 0.0027 - val_loss: 0.0025 - val_mae: 0.0393 - val_mse: 0.0025 Epoch 428/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0403 - mse: 0.0027 - val_loss: 0.0023 - val_mae: 0.0383 - val_mse: 0.0023 Epoch 429/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0399 - mse: 0.0027 - val_loss: 0.0023 - val_mae: 0.0382 - val_mse: 0.0023 Epoch 430/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0401 - mse: 0.0027 - val_loss: 0.0027 - val_mae: 0.0416 - val_mse: 0.0027 Epoch 431/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0411 - mse: 0.0027 - val_loss: 0.0026 - val_mae: 0.0407 - val_mse: 0.0026 Epoch 432/1000 3/3 
[==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0403 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0390 - val_mse: 0.0024 Epoch 433/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0399 - mse: 0.0027 - val_loss: 0.0024 - val_mae: 0.0389 - val_mse: 0.0024 Epoch 434/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0406 - mse: 0.0027 - val_loss: 0.0026 - val_mae: 0.0408 - val_mse: 0.0026 Epoch 435/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0404 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0390 - val_mse: 0.0024 Epoch 436/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0397 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0393 - val_mse: 0.0024 Epoch 437/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0397 - mse: 0.0026 - val_loss: 0.0026 - val_mae: 0.0408 - val_mse: 0.0026 Epoch 438/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0027 - mae: 0.0415 - mse: 0.0027 - val_loss: 0.0027 - val_mae: 0.0419 - val_mse: 0.0027 Epoch 439/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0407 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0394 - val_mse: 0.0024 Epoch 440/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0399 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0392 - val_mse: 0.0024 Epoch 441/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0398 - mse: 0.0026 - val_loss: 0.0025 - val_mae: 0.0400 - val_mse: 0.0025 Epoch 442/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0403 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0396 - val_mse: 0.0024 Epoch 443/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0399 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0388 - val_mse: 0.0024 Epoch 444/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0394 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0393 - val_mse: 0.0024 Epoch 445/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0397 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0392 - val_mse: 0.0024 Epoch 446/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0397 - mse: 0.0025 - val_loss: 0.0024 - val_mae: 0.0389 - val_mse: 0.0024 Epoch 447/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0396 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0385 - val_mse: 0.0023 Epoch 448/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0026 - mae: 0.0392 - mse: 0.0026 - val_loss: 0.0022 - val_mae: 0.0374 - val_mse: 0.0022 Epoch 449/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0026 - mae: 0.0392 - mse: 0.0026 - val_loss: 0.0024 - val_mae: 0.0388 - val_mse: 0.0024 Epoch 450/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0392 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0373 - val_mse: 0.0023 Epoch 451/1000 3/3 [==============================] - 0s 15ms/step - loss: 0.0025 - mae: 0.0387 - mse: 0.0025 - val_loss: 0.0022 - val_mae: 0.0367 - val_mse: 0.0022 Epoch 452/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0384 - mse: 0.0025 - val_loss: 0.0022 - val_mae: 0.0374 - val_mse: 0.0022 Epoch 453/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0386 - mse: 
0.0025 - val_loss: 0.0023 - val_mae: 0.0382 - val_mse: 0.0023 Epoch 454/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0392 - mse: 0.0025 - val_loss: 0.0025 - val_mae: 0.0394 - val_mse: 0.0025 Epoch 455/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0393 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0379 - val_mse: 0.0023 Epoch 456/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0025 - mae: 0.0385 - mse: 0.0025 - val_loss: 0.0021 - val_mae: 0.0365 - val_mse: 0.0021 Epoch 457/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0026 - mae: 0.0381 - mse: 0.0026 - val_loss: 0.0021 - val_mae: 0.0359 - val_mse: 0.0021 Epoch 458/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0381 - mse: 0.0025 - val_loss: 0.0024 - val_mae: 0.0397 - val_mse: 0.0024 Epoch 459/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0397 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0384 - val_mse: 0.0023 Epoch 460/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0390 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0383 - val_mse: 0.0023 Epoch 461/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0398 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0388 - val_mse: 0.0023 Epoch 462/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0392 - mse: 0.0025 - val_loss: 0.0022 - val_mae: 0.0369 - val_mse: 0.0022 Epoch 463/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0380 - mse: 0.0024 - val_loss: 0.0022 - val_mae: 0.0371 - val_mse: 0.0022 Epoch 464/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0384 - mse: 0.0024 - val_loss: 0.0024 - val_mae: 0.0392 - val_mse: 0.0024 Epoch 465/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0390 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0379 - val_mse: 0.0023 Epoch 466/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0387 - mse: 0.0024 - val_loss: 0.0023 - val_mae: 0.0382 - val_mse: 0.0023 Epoch 467/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0024 - mae: 0.0383 - mse: 0.0024 - val_loss: 0.0021 - val_mae: 0.0362 - val_mse: 0.0021 Epoch 468/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0024 - mae: 0.0376 - mse: 0.0024 - val_loss: 0.0021 - val_mae: 0.0357 - val_mse: 0.0021 Epoch 469/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0376 - mse: 0.0024 - val_loss: 0.0022 - val_mae: 0.0368 - val_mse: 0.0022 Epoch 470/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0388 - mse: 0.0024 - val_loss: 0.0022 - val_mae: 0.0373 - val_mse: 0.0022 Epoch 471/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0024 - mae: 0.0381 - mse: 0.0024 - val_loss: 0.0020 - val_mae: 0.0358 - val_mse: 0.0020 Epoch 472/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0376 - mse: 0.0025 - val_loss: 0.0020 - val_mae: 0.0356 - val_mse: 0.0020 Epoch 473/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0384 - mse: 0.0024 - val_loss: 0.0024 - val_mae: 0.0392 - val_mse: 0.0024 Epoch 474/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0392 - mse: 0.0024 - val_loss: 0.0022 - val_mae: 0.0368 - val_mse: 0.0022 Epoch 475/1000 3/3 
[==============================] - 0s 16ms/step - loss: 0.0024 - mae: 0.0378 - mse: 0.0024 - val_loss: 0.0020 - val_mae: 0.0356 - val_mse: 0.0020 Epoch 476/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0024 - mae: 0.0372 - mse: 0.0024 - val_loss: 0.0021 - val_mae: 0.0361 - val_mse: 0.0021 Epoch 477/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0375 - mse: 0.0023 - val_loss: 0.0022 - val_mae: 0.0377 - val_mse: 0.0022 Epoch 478/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0025 - mae: 0.0394 - mse: 0.0025 - val_loss: 0.0023 - val_mae: 0.0379 - val_mse: 0.0023 Epoch 479/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0377 - mse: 0.0023 - val_loss: 0.0020 - val_mae: 0.0355 - val_mse: 0.0020 Epoch 480/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0024 - mae: 0.0372 - mse: 0.0024 - val_loss: 0.0020 - val_mae: 0.0350 - val_mse: 0.0020 Epoch 481/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0368 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0363 - val_mse: 0.0021 Epoch 482/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0374 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0361 - val_mse: 0.0021 Epoch 483/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0370 - mse: 0.0023 - val_loss: 0.0020 - val_mae: 0.0356 - val_mse: 0.0020 Epoch 484/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0368 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0357 - val_mse: 0.0021 Epoch 485/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0368 - mse: 0.0023 - val_loss: 0.0020 - val_mae: 0.0353 - val_mse: 0.0020 Epoch 486/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0367 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0355 - val_mse: 0.0021 Epoch 487/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0369 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0362 - val_mse: 0.0021 Epoch 488/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0372 - mse: 0.0023 - val_loss: 0.0020 - val_mae: 0.0354 - val_mse: 0.0020 Epoch 489/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0023 - mae: 0.0366 - mse: 0.0023 - val_loss: 0.0019 - val_mae: 0.0340 - val_mse: 0.0019 Epoch 490/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0359 - mse: 0.0023 - val_loss: 0.0019 - val_mae: 0.0342 - val_mse: 0.0019 Epoch 491/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0370 - mse: 0.0023 - val_loss: 0.0022 - val_mae: 0.0379 - val_mse: 0.0022 Epoch 492/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0378 - mse: 0.0023 - val_loss: 0.0021 - val_mae: 0.0359 - val_mse: 0.0021 Epoch 493/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0365 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0346 - val_mse: 0.0019 Epoch 494/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0360 - mse: 0.0023 - val_loss: 0.0019 - val_mae: 0.0343 - val_mse: 0.0019 Epoch 495/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0358 - mse: 0.0022 - val_loss: 0.0022 - val_mae: 0.0371 - val_mse: 0.0022 Epoch 496/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0377 - mse: 
0.0023 - val_loss: 0.0022 - val_mae: 0.0373 - val_mse: 0.0022 Epoch 497/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0367 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0346 - val_mse: 0.0019 Epoch 498/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0358 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0336 - val_mse: 0.0019 Epoch 499/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0355 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0345 - val_mse: 0.0019 Epoch 500/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0359 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0346 - val_mse: 0.0019 Epoch 501/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0369 - mse: 0.0022 - val_loss: 0.0020 - val_mae: 0.0351 - val_mse: 0.0020 Epoch 502/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0361 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0339 - val_mse: 0.0018 Epoch 503/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0359 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0332 - val_mse: 0.0018 Epoch 504/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0353 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0333 - val_mse: 0.0018 Epoch 505/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0360 - mse: 0.0022 - val_loss: 0.0020 - val_mae: 0.0360 - val_mse: 0.0020 Epoch 506/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0022 - mae: 0.0366 - mse: 0.0022 - val_loss: 0.0019 - val_mae: 0.0345 - val_mse: 0.0019 Epoch 507/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0354 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0331 - val_mse: 0.0018 Epoch 508/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0022 - mae: 0.0351 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0330 - val_mse: 0.0018 Epoch 509/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0357 - mse: 0.0022 - val_loss: 0.0021 - val_mae: 0.0364 - val_mse: 0.0021 Epoch 510/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0363 - mse: 0.0021 - val_loss: 0.0018 - val_mae: 0.0337 - val_mse: 0.0018 Epoch 511/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0022 - mae: 0.0353 - mse: 0.0022 - val_loss: 0.0018 - val_mae: 0.0328 - val_mse: 0.0018 Epoch 512/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0343 - mse: 0.0021 - val_loss: 0.0020 - val_mae: 0.0356 - val_mse: 0.0020 Epoch 513/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0372 - mse: 0.0022 - val_loss: 0.0023 - val_mae: 0.0390 - val_mse: 0.0023 Epoch 514/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0022 - mae: 0.0375 - mse: 0.0022 - val_loss: 0.0020 - val_mae: 0.0352 - val_mse: 0.0020 Epoch 515/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0358 - mse: 0.0021 - val_loss: 0.0018 - val_mae: 0.0333 - val_mse: 0.0018 Epoch 516/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0345 - mse: 0.0021 - val_loss: 0.0019 - val_mae: 0.0341 - val_mse: 0.0019 Epoch 517/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0356 - mse: 0.0021 - val_loss: 0.0021 - val_mae: 0.0372 - val_mse: 0.0021 Epoch 518/1000 3/3 
[==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0364 - mse: 0.0021 - val_loss: 0.0019 - val_mae: 0.0339 - val_mse: 0.0019 Epoch 519/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0021 - mae: 0.0348 - mse: 0.0021 - val_loss: 0.0017 - val_mae: 0.0323 - val_mse: 0.0017 Epoch 520/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0344 - mse: 0.0021 - val_loss: 0.0018 - val_mae: 0.0331 - val_mse: 0.0018 Epoch 521/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0350 - mse: 0.0020 - val_loss: 0.0019 - val_mae: 0.0343 - val_mse: 0.0019 Epoch 522/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0350 - mse: 0.0020 - val_loss: 0.0018 - val_mae: 0.0326 - val_mse: 0.0018 Epoch 523/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0020 - mae: 0.0342 - mse: 0.0020 - val_loss: 0.0016 - val_mae: 0.0311 - val_mse: 0.0016 Epoch 524/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0021 - mae: 0.0337 - mse: 0.0021 - val_loss: 0.0017 - val_mae: 0.0327 - val_mse: 0.0017 Epoch 525/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0343 - mse: 0.0020 - val_loss: 0.0018 - val_mae: 0.0333 - val_mse: 0.0018 Epoch 526/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0350 - mse: 0.0020 - val_loss: 0.0018 - val_mae: 0.0330 - val_mse: 0.0018 Epoch 527/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0338 - mse: 0.0020 - val_loss: 0.0016 - val_mae: 0.0314 - val_mse: 0.0016 Epoch 528/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0336 - mse: 0.0020 - val_loss: 0.0016 - val_mae: 0.0312 - val_mse: 0.0016 Epoch 529/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0335 - mse: 0.0020 - val_loss: 0.0017 - val_mae: 0.0320 - val_mse: 0.0017 Epoch 530/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0349 - mse: 0.0020 - val_loss: 0.0017 - val_mae: 0.0324 - val_mse: 0.0017 Epoch 531/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0019 - mae: 0.0335 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0309 - val_mse: 0.0016 Epoch 532/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0020 - mae: 0.0332 - mse: 0.0020 - val_loss: 0.0016 - val_mae: 0.0303 - val_mse: 0.0016 Epoch 533/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0339 - mse: 0.0020 - val_loss: 0.0017 - val_mae: 0.0322 - val_mse: 0.0017 Epoch 534/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0019 - mae: 0.0331 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0305 - val_mse: 0.0016 Epoch 535/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0020 - mae: 0.0330 - mse: 0.0020 - val_loss: 0.0016 - val_mae: 0.0312 - val_mse: 0.0016 Epoch 536/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0330 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0312 - val_mse: 0.0016 Epoch 537/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0019 - mae: 0.0328 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0307 - val_mse: 0.0016 Epoch 538/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0328 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0317 - val_mse: 0.0016 Epoch 539/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0336 - mse: 
0.0019 - val_loss: 0.0017 - val_mae: 0.0328 - val_mse: 0.0017 Epoch 540/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0340 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0314 - val_mse: 0.0016 Epoch 541/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0019 - mae: 0.0329 - mse: 0.0019 - val_loss: 0.0015 - val_mae: 0.0296 - val_mse: 0.0015 Epoch 542/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0019 - mae: 0.0328 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0310 - val_mse: 0.0016 Epoch 543/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0337 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0318 - val_mse: 0.0016 Epoch 544/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0330 - mse: 0.0019 - val_loss: 0.0015 - val_mae: 0.0294 - val_mse: 0.0015 Epoch 545/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0320 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0303 - val_mse: 0.0016 Epoch 546/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0323 - mse: 0.0018 - val_loss: 0.0016 - val_mae: 0.0318 - val_mse: 0.0016 Epoch 547/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0331 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0316 - val_mse: 0.0016 Epoch 548/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0018 - mae: 0.0326 - mse: 0.0018 - val_loss: 0.0015 - val_mae: 0.0296 - val_mse: 0.0015 Epoch 549/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0019 - mae: 0.0320 - mse: 0.0019 - val_loss: 0.0014 - val_mae: 0.0282 - val_mse: 0.0014 Epoch 550/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0312 - mse: 0.0019 - val_loss: 0.0016 - val_mae: 0.0314 - val_mse: 0.0016 Epoch 551/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0339 - mse: 0.0019 - val_loss: 0.0018 - val_mae: 0.0338 - val_mse: 0.0018 Epoch 552/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0336 - mse: 0.0018 - val_loss: 0.0016 - val_mae: 0.0309 - val_mse: 0.0016 Epoch 553/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0019 - mae: 0.0331 - mse: 0.0019 - val_loss: 0.0015 - val_mae: 0.0297 - val_mse: 0.0015 Epoch 554/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0325 - mse: 0.0018 - val_loss: 0.0017 - val_mae: 0.0331 - val_mse: 0.0017 Epoch 555/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0337 - mse: 0.0018 - val_loss: 0.0016 - val_mae: 0.0312 - val_mse: 0.0016 Epoch 556/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0324 - mse: 0.0018 - val_loss: 0.0015 - val_mae: 0.0298 - val_mse: 0.0015 Epoch 557/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0317 - mse: 0.0018 - val_loss: 0.0014 - val_mae: 0.0287 - val_mse: 0.0014 Epoch 558/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0311 - mse: 0.0018 - val_loss: 0.0016 - val_mae: 0.0313 - val_mse: 0.0016 Epoch 559/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0332 - mse: 0.0018 - val_loss: 0.0016 - val_mae: 0.0322 - val_mse: 0.0016 Epoch 560/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0017 - mae: 0.0324 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0293 - val_mse: 0.0014 Epoch 561/1000 3/3 
[==============================] - 0s 16ms/step - loss: 0.0018 - mae: 0.0315 - mse: 0.0018 - val_loss: 0.0014 - val_mae: 0.0283 - val_mse: 0.0014 Epoch 562/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0315 - mse: 0.0017 - val_loss: 0.0017 - val_mae: 0.0333 - val_mse: 0.0017 Epoch 563/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0018 - mae: 0.0332 - mse: 0.0018 - val_loss: 0.0015 - val_mae: 0.0301 - val_mse: 0.0015 Epoch 564/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0314 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0294 - val_mse: 0.0014 Epoch 565/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0312 - mse: 0.0017 - val_loss: 0.0015 - val_mae: 0.0298 - val_mse: 0.0015 Epoch 566/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0320 - mse: 0.0017 - val_loss: 0.0016 - val_mae: 0.0324 - val_mse: 0.0016 Epoch 567/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0324 - mse: 0.0017 - val_loss: 0.0015 - val_mae: 0.0294 - val_mse: 0.0015 Epoch 568/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0017 - mae: 0.0307 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0282 - val_mse: 0.0014 Epoch 569/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0303 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0291 - val_mse: 0.0014 Epoch 570/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0322 - mse: 0.0017 - val_loss: 0.0015 - val_mae: 0.0312 - val_mse: 0.0015 Epoch 571/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0017 - mae: 0.0319 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0283 - val_mse: 0.0014 Epoch 572/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0018 - mae: 0.0307 - mse: 0.0018 - val_loss: 0.0013 - val_mae: 0.0269 - val_mse: 0.0013 Epoch 573/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0296 - mse: 0.0017 - val_loss: 0.0015 - val_mae: 0.0312 - val_mse: 0.0015 Epoch 574/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0331 - mse: 0.0017 - val_loss: 0.0016 - val_mae: 0.0320 - val_mse: 0.0016 Epoch 575/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0320 - mse: 0.0017 - val_loss: 0.0014 - val_mae: 0.0288 - val_mse: 0.0014 Epoch 576/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0308 - mse: 0.0017 - val_loss: 0.0013 - val_mae: 0.0269 - val_mse: 0.0013 Epoch 577/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0017 - mae: 0.0299 - mse: 0.0017 - val_loss: 0.0013 - val_mae: 0.0278 - val_mse: 0.0013 Epoch 578/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0305 - mse: 0.0016 - val_loss: 0.0014 - val_mae: 0.0285 - val_mse: 0.0014 Epoch 579/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0016 - mae: 0.0299 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0270 - val_mse: 0.0013 Epoch 580/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0297 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0276 - val_mse: 0.0013 Epoch 581/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0297 - mse: 0.0016 - val_loss: 0.0014 - val_mae: 0.0289 - val_mse: 0.0014 Epoch 582/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0314 - mse: 
0.0016 - val_loss: 0.0014 - val_mae: 0.0297 - val_mse: 0.0014 Epoch 583/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0016 - mae: 0.0306 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0272 - val_mse: 0.0013 Epoch 584/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0016 - mae: 0.0296 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0266 - val_mse: 0.0013 Epoch 585/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0294 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0279 - val_mse: 0.0013 Epoch 586/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0298 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0281 - val_mse: 0.0013 Epoch 587/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0301 - mse: 0.0016 - val_loss: 0.0013 - val_mae: 0.0275 - val_mse: 0.0013 Epoch 588/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0016 - mae: 0.0293 - mse: 0.0016 - val_loss: 0.0012 - val_mae: 0.0260 - val_mse: 0.0012 Epoch 589/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0015 - mae: 0.0289 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0285 - val_mse: 0.0013 Epoch 590/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0303 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0277 - val_mse: 0.0013 Epoch 591/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0016 - mae: 0.0299 - mse: 0.0016 - val_loss: 0.0012 - val_mae: 0.0264 - val_mse: 0.0012 Epoch 592/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0291 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0286 - val_mse: 0.0013 Epoch 593/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0305 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0278 - val_mse: 0.0013 Epoch 594/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0015 - mae: 0.0287 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0253 - val_mse: 0.0012 Epoch 595/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0015 - mae: 0.0281 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0247 - val_mse: 0.0012 Epoch 596/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0283 - mse: 0.0015 - val_loss: 0.0013 - val_mae: 0.0277 - val_mse: 0.0013 Epoch 597/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0296 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0268 - val_mse: 0.0012 Epoch 598/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0287 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0261 - val_mse: 0.0012 Epoch 599/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0284 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0257 - val_mse: 0.0012 Epoch 600/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0014 - mae: 0.0283 - mse: 0.0014 - val_loss: 0.0012 - val_mae: 0.0268 - val_mse: 0.0012 Epoch 601/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0285 - mse: 0.0014 - val_loss: 0.0012 - val_mae: 0.0265 - val_mse: 0.0012 Epoch 602/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0286 - mse: 0.0014 - val_loss: 0.0013 - val_mae: 0.0275 - val_mse: 0.0013 Epoch 603/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0015 - mae: 0.0295 - mse: 0.0015 - val_loss: 0.0012 - val_mae: 0.0270 - val_mse: 0.0012 Epoch 604/1000 3/3 
[==============================] - 0s 16ms/step - loss: 0.0015 - mae: 0.0287 - mse: 0.0015 - val_loss: 0.0011 - val_mae: 0.0246 - val_mse: 0.0011 Epoch 605/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0284 - mse: 0.0014 - val_loss: 0.0012 - val_mae: 0.0270 - val_mse: 0.0012 Epoch 606/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0290 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0257 - val_mse: 0.0011 Epoch 607/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0014 - mae: 0.0278 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0243 - val_mse: 0.0011 Epoch 608/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0014 - mae: 0.0273 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0249 - val_mse: 0.0011 Epoch 609/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0014 - mae: 0.0285 - mse: 0.0014 - val_loss: 0.0012 - val_mae: 0.0270 - val_mse: 0.0012 Epoch 610/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0014 - mae: 0.0284 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0246 - val_mse: 0.0011 Epoch 611/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0014 - mae: 0.0272 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0238 - val_mse: 0.0011 Epoch 612/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0014 - mae: 0.0266 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0248 - val_mse: 0.0011 Epoch 613/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0285 - mse: 0.0014 - val_loss: 0.0012 - val_mae: 0.0274 - val_mse: 0.0012 Epoch 614/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0014 - mae: 0.0284 - mse: 0.0014 - val_loss: 0.0011 - val_mae: 0.0247 - val_mse: 0.0011 Epoch 615/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0271 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0253 - val_mse: 0.0011 Epoch 616/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0276 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0253 - val_mse: 0.0011 Epoch 617/1000 3/3 [==============================] - 0s 17ms/step - loss: 0.0013 - mae: 0.0274 - mse: 0.0013 - val_loss: 0.0010 - val_mae: 0.0242 - val_mse: 0.0010 Epoch 618/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0013 - mae: 0.0267 - mse: 0.0013 - val_loss: 0.0010 - val_mae: 0.0235 - val_mse: 0.0010 Epoch 619/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0264 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0247 - val_mse: 0.0011 Epoch 620/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0272 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0253 - val_mse: 0.0011 Epoch 621/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0274 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0245 - val_mse: 0.0011 Epoch 622/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0013 - mae: 0.0266 - mse: 0.0013 - val_loss: 9.9468e-04 - val_mae: 0.0232 - val_mse: 9.9468e-04 Epoch 623/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0260 - mse: 0.0013 - val_loss: 9.9667e-04 - val_mae: 0.0233 - val_mse: 9.9667e-04 Epoch 624/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0265 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0256 - val_mse: 0.0011 Epoch 625/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0013 - mae: 
0.0272 - mse: 0.0013 - val_loss: 0.0011 - val_mae: 0.0253 - val_mse: 0.0011 Epoch 626/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0268 - mse: 0.0013 - val_loss: 0.0010 - val_mae: 0.0243 - val_mse: 0.0010 Epoch 627/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0013 - mae: 0.0264 - mse: 0.0013 - val_loss: 9.9029e-04 - val_mae: 0.0232 - val_mse: 9.9029e-04 Epoch 628/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0259 - mse: 0.0012 - val_loss: 0.0010 - val_mae: 0.0243 - val_mse: 0.0010 Epoch 629/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0272 - mse: 0.0013 - val_loss: 0.0010 - val_mae: 0.0247 - val_mse: 0.0010 Epoch 630/1000 3/3 [==============================] - 0s 17ms/step - loss: 0.0012 - mae: 0.0264 - mse: 0.0012 - val_loss: 9.6052e-04 - val_mae: 0.0227 - val_mse: 9.6052e-04 Epoch 631/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0012 - mae: 0.0254 - mse: 0.0012 - val_loss: 9.7069e-04 - val_mae: 0.0232 - val_mse: 9.7069e-04 Epoch 632/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0264 - mse: 0.0012 - val_loss: 0.0011 - val_mae: 0.0267 - val_mse: 0.0011 Epoch 633/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0013 - mae: 0.0277 - mse: 0.0013 - val_loss: 0.0010 - val_mae: 0.0240 - val_mse: 0.0010 Epoch 634/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0012 - mae: 0.0261 - mse: 0.0012 - val_loss: 9.3565e-04 - val_mae: 0.0226 - val_mse: 9.3565e-04 Epoch 635/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0012 - mae: 0.0254 - mse: 0.0012 - val_loss: 9.0488e-04 - val_mae: 0.0217 - val_mse: 9.0488e-04 Epoch 636/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0258 - mse: 0.0012 - val_loss: 9.4948e-04 - val_mae: 0.0232 - val_mse: 9.4948e-04 Epoch 637/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0012 - mae: 0.0255 - mse: 0.0012 - val_loss: 9.1181e-04 - val_mae: 0.0222 - val_mse: 9.1181e-04 Epoch 638/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0248 - mse: 0.0012 - val_loss: 9.2338e-04 - val_mae: 0.0225 - val_mse: 9.2338e-04 Epoch 639/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0257 - mse: 0.0012 - val_loss: 9.6467e-04 - val_mae: 0.0234 - val_mse: 9.6467e-04 Epoch 640/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0259 - mse: 0.0012 - val_loss: 9.4741e-04 - val_mae: 0.0231 - val_mse: 9.4741e-04 Epoch 641/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0256 - mse: 0.0011 - val_loss: 9.3018e-04 - val_mae: 0.0227 - val_mse: 9.3018e-04 Epoch 642/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0012 - mae: 0.0249 - mse: 0.0012 - val_loss: 8.8523e-04 - val_mae: 0.0215 - val_mse: 8.8523e-04 Epoch 643/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0245 - mse: 0.0011 - val_loss: 9.9985e-04 - val_mae: 0.0245 - val_mse: 9.9985e-04 Epoch 644/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0012 - mae: 0.0264 - mse: 0.0012 - val_loss: 9.5870e-04 - val_mae: 0.0235 - val_mse: 9.5870e-04 Epoch 645/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0012 - mae: 0.0262 - mse: 0.0012 - val_loss: 8.7885e-04 - val_mae: 0.0213 - val_mse: 8.7885e-04 Epoch 646/1000 3/3 [==============================] - 0s 
12ms/step - loss: 0.0011 - mae: 0.0248 - mse: 0.0011 - val_loss: 9.5380e-04 - val_mae: 0.0235 - val_mse: 9.5380e-04 Epoch 647/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0011 - mae: 0.0256 - mse: 0.0011 - val_loss: 8.7075e-04 - val_mae: 0.0220 - val_mse: 8.7075e-04 Epoch 648/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0249 - mse: 0.0011 - val_loss: 8.7422e-04 - val_mae: 0.0221 - val_mse: 8.7422e-04 Epoch 649/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0011 - mae: 0.0245 - mse: 0.0011 - val_loss: 8.4718e-04 - val_mae: 0.0213 - val_mse: 8.4718e-04 Epoch 650/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0011 - mae: 0.0241 - mse: 0.0011 - val_loss: 8.3768e-04 - val_mae: 0.0210 - val_mse: 8.3768e-04 Epoch 651/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0241 - mse: 0.0011 - val_loss: 8.4415e-04 - val_mae: 0.0213 - val_mse: 8.4415e-04 Epoch 652/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0240 - mse: 0.0011 - val_loss: 8.3980e-04 - val_mae: 0.0214 - val_mse: 8.3980e-04 Epoch 653/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0011 - mae: 0.0242 - mse: 0.0011 - val_loss: 8.5774e-04 - val_mae: 0.0216 - val_mse: 8.5774e-04 Epoch 654/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0011 - mae: 0.0241 - mse: 0.0011 - val_loss: 8.2203e-04 - val_mae: 0.0209 - val_mse: 8.2203e-04 Epoch 655/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0241 - mse: 0.0010 - val_loss: 8.4277e-04 - val_mae: 0.0217 - val_mse: 8.4277e-04 Epoch 656/1000 3/3 [==============================] - 0s 16ms/step - loss: 0.0010 - mae: 0.0240 - mse: 0.0010 - val_loss: 8.0094e-04 - val_mae: 0.0208 - val_mse: 8.0094e-04 Epoch 657/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0236 - mse: 0.0010 - val_loss: 8.2144e-04 - val_mae: 0.0212 - val_mse: 8.2144e-04 Epoch 658/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0240 - mse: 0.0010 - val_loss: 8.6371e-04 - val_mae: 0.0221 - val_mse: 8.6371e-04 Epoch 659/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0245 - mse: 0.0010 - val_loss: 8.2287e-04 - val_mae: 0.0212 - val_mse: 8.2287e-04 Epoch 660/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0238 - mse: 0.0010 - val_loss: 8.0889e-04 - val_mae: 0.0209 - val_mse: 8.0889e-04 Epoch 661/1000 3/3 [==============================] - 0s 12ms/step - loss: 0.0010 - mae: 0.0235 - mse: 0.0010 - val_loss: 8.2066e-04 - val_mae: 0.0213 - val_mse: 8.2066e-04 Epoch 662/1000 3/3 [==============================] - 0s 11ms/step - loss: 0.0010 - mae: 0.0239 - mse: 0.0010 - val_loss: 8.0823e-04 - val_mae: 0.0211 - val_mse: 8.0823e-04 Epoch 663/1000 3/3 [==============================] - 0s 16ms/step - loss: 9.9341e-04 - mae: 0.0237 - mse: 9.9341e-04 - val_loss: 7.8661e-04 - val_mae: 0.0206 - val_mse: 7.8661e-04 Epoch 664/1000 3/3 [==============================] - 0s 16ms/step - loss: 9.9725e-04 - mae: 0.0232 - mse: 9.9725e-04 - val_loss: 7.5625e-04 - val_mae: 0.0198 - val_mse: 7.5625e-04 Epoch 665/1000 3/3 [==============================] - 0s 11ms/step - loss: 9.7694e-04 - mae: 0.0230 - mse: 9.7694e-04 - val_loss: 7.9756e-04 - val_mae: 0.0210 - val_mse: 7.9756e-04 Epoch 666/1000 3/3 [==============================] - 0s 11ms/step - loss: 9.8644e-04 - mae: 0.0238 - mse: 9.8644e-04 - 
[... epoch-by-epoch Keras training output for epochs 667–1000 omitted; training and validation losses continue to decrease steadily, finishing at Epoch 1000/1000 with loss: 1.1746e-04 - mae: 0.0083 - mse: 1.1746e-04 - val_loss: 8.1322e-05 - val_mae: 0.0068 - val_mse: 8.1322e-05 ...]
INFO:tensorflow:Assets written to: keras_surrogate/assets
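The long per-epoch log above comes from Keras's default verbose fitting inside train_load_surrogates. As a minimal, purely illustrative sketch (the network architecture, dummy data, and variable names below are assumptions, not the notebook's actual training code), the output can be silenced and training stopped early once the validation loss plateaus:
import numpy as np
import tensorflow as tf

# Illustrative stand-in for the surrogate network and scaled training data that
# train_load_surrogates builds internally (2 inputs, 13 outputs); layer sizes are assumptions.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(2,)),
    tf.keras.layers.Dense(20, activation="tanh"),
    tf.keras.layers.Dense(13),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae", "mse"])

x_train = np.random.rand(100, 2)   # dummy (bypass fraction, NG/steam ratio) samples
y_train = np.random.rand(100, 13)  # dummy (steam flow, reformer duty, 11 mole fractions)

# stop once val_loss stops improving, and silence the per-epoch progress log
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=50, restore_best_weights=True
)
model.fit(
    x_train, y_train,
    epochs=1000,
    validation_split=0.2,
    verbose=0,
    callbacks=[early_stop],
)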
4. Build and Run IDAES Flowsheet¶
This step builds an IDAES flowsheet and imports the surrogate model objects. As shown in the prior three examples, a single model object accounts for all input and output variables, and the serialized model saved earlier may be imported into a single SurrogateBlock() component. While the serialization method and file structure differ slightly between the ALAMO, PySMO and Keras Python wrappers, all three are imported into IDAES flowsheets in much the same way, as shown below.
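For reference, the loading pattern is the same for all three wrapper types; a minimal sketch is shown below (the file and folder names are assumed to match those produced by the training step):

```python
# Sketch: loading the serialized surrogates (names assumed from the training step)
from idaes.core.surrogate.alamopy import AlamoSurrogate
from idaes.core.surrogate.pysmo_surrogate import PysmoSurrogate
from idaes.core.surrogate.keras_surrogate import KerasSurrogate

alamo_model = AlamoSurrogate.load_from_file('alamo_surrogate.json')      # single JSON file
pysmo_model = PysmoSurrogate.load_from_file('pysmo_rbf_surrogate.json')  # one JSON file per basis type
keras_model = KerasSurrogate.load_from_folder('keras_surrogate')         # saved model folder

# Each loaded object is then passed to SurrogateBlock().build_model(...)
# together with the flowsheet input/output variable lists, as done in
# build_flowsheet() below.
```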
4.1 Build IDAES Flowsheet¶
This method builds an instance of the IDAES flowsheet model and solves it with IPOPT. It allows the user to select a case and the surrogate model type to be used (alamo, pysmo_poly, pysmo_rbf, pysmo_krig, or keras). The case argument is a list of values for the input variables (in order: bypass split fraction and natural gas to steam ratio). The method then fixes the input variable values so that IPOPT solves a square problem.
# Import standard-library, Pyomo and IDAES libraries
import sys
from io import StringIO  # used to capture long surrogate-loading output

from pyomo.environ import ConcreteModel, SolverFactory, value, Var, Constraint, Set
from pyomo.contrib.parmest.utils.ipopt_solver_wrapper import ipopt_solve_with_stats
from idaes.core.surrogate.surrogate_block import SurrogateBlock
from idaes.core.surrogate.alamopy import AlamoSurrogate
from idaes.core.surrogate.pysmo_surrogate import PysmoSurrogate
from idaes.core.surrogate.keras_surrogate import KerasSurrogate
from idaes.core import FlowsheetBlock
def build_flowsheet(case, surrogate_type='alamo'):
    print(case, ' ', surrogate_type)

    # create the IDAES model and flowsheet
    m = ConcreteModel()
    m.fs = FlowsheetBlock(dynamic=False)

    # create flowsheet input variables
    m.fs.bypass_frac = Var(initialize=0.80, bounds=[0.1, 0.8], doc='natural gas bypass fraction')
    m.fs.ng_steam_ratio = Var(initialize=0.80, bounds=[0.8, 1.2], doc='natural gas to steam ratio')

    # create flowsheet output variables
    m.fs.steam_flowrate = Var(initialize=0.2, doc="steam flowrate")
    m.fs.reformer_duty = Var(initialize=10000, doc="reformer heat duty")
    m.fs.AR = Var(initialize=0, doc="AR fraction")
    m.fs.C2H6 = Var(initialize=0, doc="C2H6 fraction")
    m.fs.C3H8 = Var(initialize=0, doc="C3H8 fraction")
    m.fs.C4H10 = Var(initialize=0, doc="C4H10 fraction")
    m.fs.CH4 = Var(initialize=0, doc="CH4 fraction")
    m.fs.CO = Var(initialize=0, doc="CO fraction")
    m.fs.CO2 = Var(initialize=0, doc="CO2 fraction")
    m.fs.H2 = Var(initialize=0, doc="H2 fraction")
    m.fs.H2O = Var(initialize=0, doc="H2O fraction")
    m.fs.N2 = Var(initialize=0, doc="N2 fraction")
    m.fs.O2 = Var(initialize=0, doc="O2 fraction")

    # create input and output variable object lists for flowsheet
    inputs = [m.fs.bypass_frac, m.fs.ng_steam_ratio]
    outputs = [m.fs.steam_flowrate, m.fs.reformer_duty, m.fs.AR, m.fs.C2H6, m.fs.C3H8,
               m.fs.C4H10, m.fs.CH4, m.fs.CO, m.fs.CO2, m.fs.H2, m.fs.H2O, m.fs.N2, m.fs.O2]

    # create the Pyomo/IDAES block that corresponds to the surrogate;
    # the else branch handles all three PySMO basis types (avoids nested switches)
    # capture long output from loading surrogates (don't need to print it)
    stream = StringIO()
    oldstdout = sys.stdout
    sys.stdout = stream
    if surrogate_type == 'alamo':
        surrogate = AlamoSurrogate.load_from_file('alamo_surrogate.json')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(surrogate, input_vars=inputs, output_vars=outputs)
    elif surrogate_type == 'keras':
        keras_surrogate = KerasSurrogate.load_from_folder('keras_surrogate')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(keras_surrogate,
                                   formulation=KerasSurrogate.Formulation.FULL_SPACE,
                                   input_vars=inputs, output_vars=outputs)
    else:  # surrogate is one of the three PySMO basis options
        surrogate = PysmoSurrogate.load_from_file(str(surrogate_type) + '_surrogate.json')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(surrogate, input_vars=inputs, output_vars=outputs)

    # revert to standard output
    sys.stdout = oldstdout

    # fix input values and solve flowsheet
    m.fs.bypass_frac.fix(case[0])
    m.fs.ng_steam_ratio.fix(case[1])

    solver = SolverFactory('ipopt')
    try:  # attempt to solve problem
        [status_obj, solved, iters, time, regu] = ipopt_solve_with_stats(m, solver)
    except Exception:  # retry solving one more time
        [status_obj, solved, iters, time, regu] = ipopt_solve_with_stats(m, solver)

    return (status_obj['Problem'][0]['Number of variables'],
            status_obj['Problem'][0]['Number of constraints'],
            value(m.fs.steam_flowrate), value(m.fs.reformer_duty),
            value(m.fs.C2H6), value(m.fs.CH4), value(m.fs.H2), value(m.fs.O2),
            value(iters), value(time))  # don't report regu
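As a quick sanity check, the method can be called directly for a single case before running the full comparison. A hypothetical example is shown below (input values chosen for illustration only, within the variable bounds defined above):

```python
# Hypothetical single-case run with the PySMO RBF surrogate
(nvar, ncon, steam, duty,
 x_c2h6, x_ch4, x_h2, x_o2, n_iter, cpu_time) = build_flowsheet(
    [0.5, 1.0], surrogate_type='pysmo_rbf')

print(f"Model size: {nvar} variables, {ncon} constraints")
print(f"IPOPT: {n_iter} iterations in {cpu_time:.3f} s")
print(f"Steam flow = {steam:.3f} kg/s, reformer duty = {duty:.1f} kW")
```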
4.2 Model Size/Form Comparison¶
As mentioned above, best practice in the IDAES ML/AI workflow is to analyze model and solver statistics and performance in order to select the best surrogate model, considering model size, model form, model trainer, and so on. This section provides a rigorous analysis of solver performance comparing the different surrogate models (ALAMO, PySMO Polynomial, PySMO RBF, PySMO Kriging, and Keras).
To obtain the results, we run the flowsheet for ten different simulation cases for each surrogate model type, and collect and compare IPOPT iteration and runtime statistics. Additionally, since the simulation cases are drawn from the training data set, we can also compare model performance (absolute error between measured and predicted output values).
# Import Auto-reformer training data
import numpy as np
import pandas as pd
np.set_printoptions(precision=6, suppress=True)
csv_data = pd.read_csv(r'reformer-data.csv') # 2800 data points
# extracting 10 data points out of 2800 data points, randomly selecting 10 cases to run
case_data = csv_data.sample(n = 10)
# selecting columns that correspond to Input Variables
inputs = np.array(case_data.iloc[:, :2])
# selecting columns that correspond to Output Variables
cols = ["Steam_Flow", "Reformer_Duty", "C2H6", "CH4", "H2", "O2"]
outputs = np.array(case_data[cols])
# For results comparison with minimum memory usage we will extract the values to plot on each pass
# note that the entire model could be returned and saved on each loop if desired
# create empty dictionaries so we may easily index results as we save them
# for convenience while plotting, each output variable has its own dictionary
# indexed by (case number, trainer type)
trainers = ['alamo', 'pysmo_poly', 'pysmo_rbf', 'pysmo_krig', 'keras']
cases = range(len(inputs))
model_vars = {}
model_cons = {}
steam_flow_error = {}
reformer_duty_error = {}
conc_C2H6 = {}
conc_CH4 = {}
conc_H2 = {}
conc_O2 = {}
ipopt_iters = {}
ipopt_time = {}
# run flowsheet for each trainer and save results
i = 0
for case in inputs:  # each case is a value pair (bypass_frac, ng_steam_ratio)
    i = i + 1
    for trainer in trainers:
        [numvar, numcon, sf, rd, eth, meth, hyd, oxy, iters, time] = build_flowsheet(case, surrogate_type=trainer)
        model_vars[trainer] = numvar  # will be overwritten, but has the same values for all cases
        model_cons[trainer] = numcon  # will be overwritten, but has the same values for all cases
        steam_flow_error[(i, trainer)] = abs((sf - value(outputs[i-1, 0])) / value(outputs[i-1, 0]))
        reformer_duty_error[(i, trainer)] = abs((rd - value(outputs[i-1, 1])) / value(outputs[i-1, 1]))
        conc_C2H6[(i, trainer)] = abs(eth - value(outputs[i-1, 2]))
        conc_CH4[(i, trainer)] = abs(meth - value(outputs[i-1, 3]))
        conc_H2[(i, trainer)] = abs(hyd - value(outputs[i-1, 4]))
        conc_O2[(i, trainer)] = abs(oxy - value(outputs[i-1, 5]))
        ipopt_iters[(i, trainer)] = iters
        ipopt_time[(i, trainer)] = time
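The dictionaries filled in above are used for plotting later in the notebook; they can also be collapsed into a single table for a quick side-by-side look. A minimal sketch of one way to do this is shown below (hypothetical post-processing, not part of the original workflow):

```python
# Hypothetical summary table: model size, mean IPOPT statistics, and mean error per trainer
n_cases = len(inputs)
summary = pd.DataFrame({
    trainer: {
        'variables': model_vars[trainer],
        'constraints': model_cons[trainer],
        'mean iterations': np.mean([ipopt_iters[(c, trainer)] for c in range(1, n_cases + 1)]),
        'mean time (s)': np.mean([ipopt_time[(c, trainer)] for c in range(1, n_cases + 1)]),
        'mean steam flow rel. error': np.mean([steam_flow_error[(c, trainer)]
                                               for c in range(1, n_cases + 1)]),
    }
    for trainer in trainers
}).T
print(summary)
```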
[0.373913 1.073684] alamo Ipopt 3.13.2: output_file=/tmp/tmpvx8tkqa6ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.69e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.69e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.373913 1.073684] pysmo_poly 2023-03-04 01:45:10 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmpb4zr52jbipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). 
For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.69e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.11e-16 0.00e+00 -1.0 1.69e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.1102230246251565e-16 1.1102230246251565e-16 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.1102230246251565e-16 1.1102230246251565e-16 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.373913 1.073684] pysmo_rbf 2023-03-04 01:45:10 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmp_k_gpsfnipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. 
All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.69e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.69e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.373913 1.073684] pysmo_krig 2023-03-04 01:45:11 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmp3jvr02baipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.69e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.11e-16 0.00e+00 -1.0 1.69e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.1102230246251565e-16 1.1102230246251565e-16 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.1102230246251565e-16 1.1102230246251565e-16 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.373913 1.073684] keras Ipopt 3.13.2: output_file=/tmp/tmpx8k65dbiipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 5.00e+03 2.86e+01 -1.0 2.43e+04 - 1.69e-02 5.09e-01f 1 2 0.0000000e+00 4.70e+03 2.73e+01 -1.0 1.50e+04 - 6.99e-01 6.14e-02h 1 3 0.0000000e+00 4.69e+03 2.33e+04 -1.0 1.41e+04 - 9.99e-01 8.33e-04h 1 4r 0.0000000e+00 4.69e+03 1.00e+03 3.7 0.00e+00 - 0.00e+00 2.61e-07R 6 5r 0.0000000e+00 4.69e+03 1.49e+03 3.7 4.17e+05 - 2.39e-02 4.72e-04f 1 6r 0.0000000e+00 4.44e+03 9.19e+02 2.3 3.45e+05 - 1.00e+00 1.26e-02f 1 7r 0.0000000e+00 3.35e+03 6.64e+02 2.3 1.77e+02 - 1.00e+00 2.47e-01f 1 8r 0.0000000e+00 1.15e+02 2.62e+01 1.6 1.31e+01 - 8.09e-01 9.66e-01f 1 9r 0.0000000e+00 4.77e+01 1.34e+02 0.9 2.11e+00 - 8.55e-01 5.87e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 10r 0.0000000e+00 3.10e+01 1.61e+02 0.2 2.17e+01 - 6.43e-01 3.51e-01f 1 11r 0.0000000e+00 7.42e+00 9.04e+01 0.2 8.78e+01 - 8.80e-01 7.65e-01f 1 12r 0.0000000e+00 2.03e+00 2.16e+02 0.2 1.45e+02 - 4.64e-01 7.35e-01f 1 13r 0.0000000e+00 3.22e-01 4.95e+01 0.2 9.27e+00 - 6.18e-01 8.46e-01f 1 14r 0.0000000e+00 2.49e-01 3.45e+01 -0.5 1.27e+01 - 8.31e-01 1.00e+00f 1 15r 0.0000000e+00 2.67e-01 1.91e-01 -0.5 2.30e+02 - 1.00e+00 1.00e+00f 1 16r 0.0000000e+00 2.81e-01 1.14e+00 -2.9 1.18e+00 - 9.11e-01 9.34e-01f 1 17r 0.0000000e+00 2.47e-01 7.20e+01 -2.9 4.00e+03 - 5.73e-01 3.01e-01f 1 18r 0.0000000e+00 1.64e-01 1.57e+01 -2.9 3.15e+03 - 8.46e-01 1.00e+00f 1 19r 0.0000000e+00 1.62e-01 6.06e-02 -2.9 3.24e+02 - 1.00e+00 1.00e+00h 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 20r 0.0000000e+00 1.62e-01 4.54e-07 -2.9 5.43e-01 - 1.00e+00 1.00e+00h 1 21r 0.0000000e+00 1.62e-01 1.87e-02 -6.5 1.07e-01 - 1.00e+00 1.00e+00f 1 22r 0.0000000e+00 5.89e-02 6.56e+01 -6.5 4.69e+04 - 2.21e-01 1.12e-01f 1 23r 0.0000000e+00 3.55e-02 2.51e-02 -6.5 1.14e+02 -4.0 1.00e+00 1.00e+00f 1 24r 0.0000000e+00 2.74e-02 1.14e-02 -6.5 3.42e+02 -4.5 1.00e+00 1.00e+00f 1 25r 0.0000000e+00 2.99e-03 2.94e-01 -6.5 1.03e+03 -5.0 1.00e+00 1.00e+00f 1 26r 0.0000000e+00 1.01e-03 3.75e+01 -6.5 3.21e+03 -5.4 1.00e+00 3.89e-02f 1 27r 0.0000000e+00 1.01e-03 9.49e+02 -6.5 3.24e+01 - 9.84e-01 5.09e-07h 2 28r 0.0000000e+00 6.75e-09 1.41e-02 -6.5 5.60e+00 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 28 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 6.7509925427700068e-09 6.7509925427700068e-09 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 6.7509925427700068e-09 6.7509925427700068e-09 Number of objective function evaluations = 36 Number of objective gradient evaluations = 6 Number of equality constraint evaluations = 36 Number of inequality constraint evaluations = 0 Number of equality 
constraint Jacobian evaluations = 30 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 28 Total CPU secs in IPOPT (w/o function evaluations) = 0.052 Total CPU secs in NLP function evaluations = 0.001 EXIT: Optimal Solution Found. [0.394203 0.936842] alamo Ipopt 3.13.2: output_file=/tmp/tmpwd0vo7ojipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.42e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.42e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.394203 0.936842] pysmo_poly 2023-03-04 01:45:11 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmpnc2x52_uipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. 
Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.43e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.43e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.394203 0.936842] pysmo_rbf 2023-03-04 01:45:11 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmprawu45ymipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. 
All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.42e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.42e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.394203 0.936842] pysmo_krig 2023-03-04 01:45:11 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmppnm2pigkipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.43e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 1.43e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.394203 0.936842] keras Ipopt 3.13.2: output_file=/tmp/tmpc4wx4ca9ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 4.45e+03 2.23e+01 -1.0 2.16e+04 - 2.38e-02 5.64e-01f 1 2 0.0000000e+00 3.94e+03 2.04e+01 -1.0 1.22e+04 - 8.08e-01 1.15e-01h 1 3 0.0000000e+00 3.93e+03 6.87e+03 -1.0 1.09e+04 - 1.00e+00 2.59e-03h 1 4 0.0000000e+00 3.93e+03 2.59e+08 -1.0 1.09e+04 - 1.00e+00 2.64e-05h 1 5r 0.0000000e+00 3.93e+03 1.00e+03 3.6 0.00e+00 - 0.00e+00 1.32e-07R 2 6r 0.0000000e+00 3.91e+03 1.51e+03 3.6 4.42e+05 - 4.86e-02 8.49e-04f 1 7r 0.0000000e+00 3.71e+03 1.13e+03 2.2 2.57e+05 - 6.03e-01 1.34e-02f 1 8r 0.0000000e+00 2.08e+03 6.63e+02 1.5 1.17e+02 - 7.03e-01 2.93e-01f 1 9r 0.0000000e+00 1.20e+03 3.84e+02 1.5 9.38e+00 - 5.47e-01 3.72e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 10r 0.0000000e+00 9.33e+02 2.92e+02 1.5 5.30e+00 - 6.17e-01 2.12e-01f 1 11r 0.0000000e+00 3.33e+02 5.83e+02 1.5 4.04e+00 - 9.98e-01 6.29e-01f 1 12r 0.0000000e+00 1.72e+02 2.26e+02 0.8 1.46e+00 - 6.10e-01 4.85e-01f 1 13r 0.0000000e+00 7.97e+01 1.50e+02 0.8 4.95e+00 - 4.81e-01 5.37e-01f 1 14r 0.0000000e+00 3.72e+01 1.06e+02 0.8 1.42e+01 - 3.64e-01 5.34e-01f 1 15r 0.0000000e+00 1.38e+01 1.80e+02 0.1 2.46e+01 - 4.44e-01 6.32e-01f 1 16r 0.0000000e+00 1.02e+01 1.26e+02 0.1 1.09e+02 - 4.90e-01 2.66e-01f 1 17r 0.0000000e+00 2.73e+00 2.75e+01 0.1 1.32e+02 - 6.82e-01 7.38e-01f 1 18r 0.0000000e+00 5.41e-01 2.79e+02 -0.6 1.00e+02 - 2.85e-01 8.09e-01f 1 19r 0.0000000e+00 1.73e-01 3.46e+01 -0.6 2.13e+02 - 6.26e-01 1.00e+00f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 20r 0.0000000e+00 2.00e-01 8.81e-01 -0.6 1.51e+01 - 1.00e+00 1.00e+00f 1 21r 0.0000000e+00 2.20e-01 2.38e+00 -2.0 1.04e+01 - 8.83e-01 1.00e+00f 1 22r 0.0000000e+00 1.99e-01 4.29e+01 -2.0 1.61e+03 - 9.43e-01 5.10e-01f 1 23r 0.0000000e+00 1.77e-01 1.28e+00 -2.0 8.11e+02 - 1.00e+00 1.00e+00f 1 24r 0.0000000e+00 1.77e-01 4.92e-04 -2.0 1.19e+01 - 1.00e+00 1.00e+00h 1 25r 0.0000000e+00 1.77e-01 4.46e-02 -4.5 3.56e-01 - 1.00e+00 1.00e+00f 1 26r 0.0000000e+00 8.87e-02 8.58e+01 -4.5 1.31e+04 - 5.41e-01 3.66e-01f 1 27r 0.0000000e+00 8.87e-02 2.76e+02 -4.5 2.70e+04 - 1.00e+00 1.97e-05f 1 28r 0.0000000e+00 6.07e-02 6.08e+02 -4.5 5.23e+03 - 1.00e+00 3.76e-01f 1 29r 0.0000000e+00 6.07e-02 6.79e+02 -4.5 5.01e+03 - 6.80e-01 1.16e-05f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 30r 0.0000000e+00 2.66e-02 8.47e+02 -4.5 2.07e+02 - 1.54e-05 5.61e-01f 1 31r 0.0000000e+00 5.72e-07 1.28e-01 -4.5 9.08e+01 - 1.00e+00 1.00e+00h 1 32r 0.0000000e+00 3.63e-08 3.06e-10 -4.5 2.27e-03 - 1.00e+00 1.00e+00h 1 33r 0.0000000e+00 2.02e-10 4.06e-03 -6.8 5.20e-03 - 1.00e+00 1.00e+00f 1 Number of Iterations....: 33 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint 
violation....: 9.6116170578142146e-11 2.0190782379359007e-10 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 9.6116170578142146e-11 2.0190782379359007e-10 Number of objective function evaluations = 36 Number of objective gradient evaluations = 7 Number of equality constraint evaluations = 36 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 35 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 33 Total CPU secs in IPOPT (w/o function evaluations) = 0.053 Total CPU secs in NLP function evaluations = 0.001 EXIT: Optimal Solution Found. [0.789855 0.978947] alamo Ipopt 3.13.2: output_file=/tmp/tmph0kitvbwipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 8.38e+02 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 8.38e+02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.789855 0.978947] pysmo_poly 2023-03-04 01:45:12 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmpkl0j0v04ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 8.19e+02 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 8.19e+02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.789855 0.978947] pysmo_rbf 2023-03-04 01:45:12 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmp104m1uy3ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 8.31e+02 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 8.31e+02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.789855 0.978947] pysmo_krig 2023-03-04 01:45:13 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmp90rwzqunipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 8.33e+02 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 8.33e+02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.789855 0.978947] keras Ipopt 3.13.2: output_file=/tmp/tmprozqrxdpipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.66e+03 9.55e+01 -1.0 1.54e+04 - 9.70e-03 8.38e-01f 1 2 0.0000000e+00 7.69e-03 7.30e+00 -1.0 3.10e+03 - 9.77e-01 1.00e+00h 1 3 0.0000000e+00 6.53e-06 6.59e-04 -1.0 9.81e+01 - 1.00e+00 1.00e+00h 1 4 0.0000000e+00 3.33e-16 6.83e-15 -2.5 2.73e-02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 4 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 3.3306690738754696e-16 3.3306690738754696e-16 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 3.3306690738754696e-16 3.3306690738754696e-16 Number of objective function evaluations = 5 Number of objective gradient evaluations = 5 Number of equality constraint evaluations = 5 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 5 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 4 Total CPU secs in IPOPT (w/o function evaluations) = 0.009 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.191304 0.936842] alamo Ipopt 3.13.2: output_file=/tmp/tmpzp5k77dmipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.191304 0.936842] pysmo_poly
2023-03-04 01:45:13 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.191304 0.936842] pysmo_rbf
2023-03-04 01:45:13 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.191304 0.936842] pysmo_krig
2023-03-04 01:45:14 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.191304 0.936842] keras
Number of nonzeros in equality constraint Jacobian...: 2567
Number of nonzeros in Lagrangian Hessian.............: 80
Total number of variables............................: 231
        variables with lower and upper bounds: 192
Total number of equality constraints.................: 231
Number of Iterations....: 37
Constraint violation....: 2.8782755068235133e-09
Total CPU secs in IPOPT (w/o function evaluations) = 0.066
EXIT: Optimal Solution Found.

[0.231884 0.863158] alamo
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.231884 0.863158] pysmo_poly
2023-03-04 01:45:15 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.231884 0.863158] pysmo_rbf
2023-03-04 01:45:15 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.231884 0.863158] pysmo_krig
2023-03-04 01:45:15 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.231884 0.863158] keras
Number of nonzeros in equality constraint Jacobian...: 2567
Number of nonzeros in Lagrangian Hessian.............: 80
Total number of variables............................: 231
        variables with lower and upper bounds: 192
Total number of equality constraints.................: 231
Number of Iterations....: 35
Constraint violation....: 3.5900145078926471e-09
Total CPU secs in IPOPT (w/o function evaluations) = 0.058
EXIT: Optimal Solution Found.

[0.150725 0.978947] alamo
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.150725 0.978947] pysmo_poly
2023-03-04 01:45:15 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.150725 0.978947] pysmo_rbf
2023-03-04 01:45:16 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.150725 0.978947] pysmo_krig
2023-03-04 01:45:16 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.150725 0.978947] keras
Number of nonzeros in equality constraint Jacobian...: 2567
Number of nonzeros in Lagrangian Hessian.............: 80
Total number of variables............................: 231
        variables with lower and upper bounds: 192
Total number of equality constraints.................: 231
Number of Iterations....: 40
Constraint violation....: 2.2348750627898539e-10 (scaled) / 5.8571458794176579e-10 (unscaled)
Total CPU secs in IPOPT (w/o function evaluations) = 0.071
EXIT: Optimal Solution Found.

[0.728986 0.8 ] alamo
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.728986 0.8 ] pysmo_poly
2023-03-04 01:45:17 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.728986 0.8 ] pysmo_rbf
2023-03-04 01:45:17 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.728986 0.8 ] pysmo_krig
2023-03-04 01:45:17 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging
Number of nonzeros in equality constraint Jacobian...: 13
Total number of variables............................: 13
Total number of equality constraints.................: 13
Number of Iterations....: 1
Total CPU secs in IPOPT (w/o function evaluations) = 0.000
EXIT: Optimal Solution Found.

[0.728986 0.8 ] keras
Ipopt 3.13.2: output_file=/tmp/tmpifzp2nmfipopt_out max_iter=500 max_cpu_time=120
This is Ipopt version 3.13.2, running with linear solver ma27.
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.85e+02 7.45e+01 -1.0 1.34e+04 - 1.31e-02 9.82e-01f 1 2 0.0000000e+00 1.84e+00 7.36e+00 -1.0 1.36e+03 - 9.85e-01 9.90e-01h 1 3 0.0000000e+00 8.29e-03 6.17e+02 -1.0 1.17e+01 - 1.00e+00 9.95e-01h 1 4 0.0000000e+00 1.14e-12 7.50e-09 -1.0 5.24e-02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 4 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.1421974477343610e-12 1.1421974477343610e-12 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.1421974477343610e-12 1.1421974477343610e-12 Number of objective function evaluations = 5 Number of objective gradient evaluations = 5 Number of equality constraint evaluations = 5 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 5 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 4 Total CPU secs in IPOPT (w/o function evaluations) = 0.009 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.1 1.052632] alamo Ipopt 3.13.2: output_file=/tmp/tmp81ejpo4sipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.76e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.76e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.1 1.052632] pysmo_poly 2023-03-04 01:45:18 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmpm__miaa1ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.76e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.76e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.1 1.052632] pysmo_rbf 2023-03-04 01:45:18 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmpi3_g5t44ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.76e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.76e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.1 1.052632] pysmo_krig 2023-03-04 01:45:18 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmpddu1wopbipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.76e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.76e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.1 1.052632] keras Ipopt 3.13.2: output_file=/tmp/tmphrej6gaeipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 6.94e+03 1.76e+01 -1.0 2.87e+04 - 1.70e-02 3.19e-01h 1 2 0.0000000e+00 6.84e+03 3.62e+01 -1.0 2.79e+04 - 4.99e-01 1.50e-02h 1 3 0.0000000e+00 6.84e+03 1.62e+05 -1.0 2.77e+04 - 6.70e-01 1.63e-04h 1 4r 0.0000000e+00 6.84e+03 1.00e+03 3.8 0.00e+00 - 0.00e+00 4.08e-07R 3 5r 0.0000000e+00 6.83e+03 1.00e+03 3.8 3.01e+05 - 3.05e-04 1.76e-04f 1 6r 0.0000000e+00 6.15e+03 9.33e+02 1.7 3.01e+05 - 1.00e+00 2.19e-02f 1 7r 0.0000000e+00 5.87e+03 2.21e+02 1.7 1.45e+03 - 1.00e+00 4.55e-02f 1 8r 0.0000000e+00 1.71e+02 1.14e+01 1.7 2.31e+01 - 1.00e+00 9.71e-01f 1 9r 0.0000000e+00 3.30e+01 3.29e+01 1.0 2.93e+00 - 7.11e-01 8.09e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 10r 0.0000000e+00 1.08e+01 6.07e+01 0.3 2.38e+01 - 7.91e-01 6.77e-01f 1 11r 0.0000000e+00 3.16e+00 5.91e+01 -0.4 1.13e+02 - 6.01e-01 7.15e-01f 1 12r 0.0000000e+00 4.91e-01 9.80e+01 -0.4 1.84e+02 - 4.54e-01 8.69e-01f 1 13r 0.0000000e+00 5.47e-01 5.54e-01 -0.4 3.65e+01 - 1.00e+00 1.00e+00f 1 14r 0.0000000e+00 5.79e-01 8.85e+00 -1.8 8.39e+00 - 7.92e-01 9.31e-01f 1 15r 0.0000000e+00 5.94e-01 3.48e-01 -1.8 1.21e+03 - 1.00e+00 1.00e+00f 1 16r 0.0000000e+00 5.99e-01 8.32e-02 -2.6 1.40e+01 - 1.00e+00 1.00e+00f 1 17r 0.0000000e+00 5.98e-01 1.08e+01 -4.0 2.64e+03 - 9.98e-01 1.98e-01f 1 18r 0.0000000e+00 5.92e-01 5.46e+02 -4.0 6.33e+03 - 1.00e+00 2.97e-02f 1 19r 0.0000000e+00 4.36e-01 1.25e+02 -4.0 6.24e+03 - 5.20e-01 7.74e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 20r 0.0000000e+00 3.14e-01 6.71e+00 -4.0 5.98e+03 - 1.00e+00 1.00e+00f 1 21r 0.0000000e+00 2.93e-01 4.50e+01 -4.0 9.77e+02 - 1.05e-01 1.00e+00h 1 22r 0.0000000e+00 2.96e-01 3.23e-02 -4.0 1.08e+02 - 1.00e+00 1.00e+00h 1 23r 0.0000000e+00 2.96e-01 1.34e-06 -4.0 1.89e+00 - 1.00e+00 1.00e+00h 1 24r 0.0000000e+00 2.96e-01 1.59e-02 -6.0 6.99e-02 - 1.00e+00 1.00e+00f 1 25r 0.0000000e+00 2.93e-01 1.07e-02 -6.0 1.07e+02 -4.0 1.00e+00 1.00e+00f 1 26r 0.0000000e+00 2.86e-01 2.95e-02 -6.0 3.22e+02 -4.5 1.00e+00 1.00e+00f 1 27r 0.0000000e+00 2.62e-01 2.46e-01 -6.0 9.70e+02 -5.0 1.00e+00 1.00e+00f 1 28r 0.0000000e+00 1.90e-01 1.79e+00 -6.0 2.98e+03 -5.4 1.00e+00 1.00e+00f 1 29r 0.0000000e+00 9.10e-02 1.14e+02 -6.0 1.06e+04 -5.9 1.00e+00 3.63e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 30r 0.0000000e+00 9.10e-02 1.77e+02 -6.0 8.38e+04 -6.4 3.75e-02 9.95e-08f 1 31r 0.0000000e+00 9.10e-02 1.23e+03 -6.0 5.94e+03 - 9.93e-02 1.06e-07h 1 32r 0.0000000e+00 5.97e-02 5.23e+02 -6.0 8.30e+03 - 1.09e-07 1.32e-01f 1 33r 0.0000000e+00 1.23e-02 4.78e+02 -6.0 2.43e+04 -6.0 9.08e-02 8.54e-02f 1 34r 0.0000000e+00 1.23e-02 6.82e+02 -6.0 1.33e+04 - 1.00e+00 1.68e-07f 1 35r 0.0000000e+00 6.45e-03 3.68e+02 -6.0 3.06e+02 - 1.00e+00 4.75e-01f 1 36r 0.0000000e+00 6.45e-03 
1.16e+03 -6.0 1.61e+02 - 1.00e+00 4.98e-07h 2 37r 0.0000000e+00 6.45e-03 9.69e+02 -6.0 1.37e+02 - 2.77e-07 5.21e-07h 2 38r 0.0000000e+00 1.01e-06 1.06e-01 -6.0 1.26e+02 - 1.00e+00 1.00e+00f 1 39r 0.0000000e+00 1.42e-09 1.11e-11 -6.0 2.71e-02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 39 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.4219911775859018e-09 1.4219911775859018e-09 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.4219911775859018e-09 1.4219911775859018e-09 Number of objective function evaluations = 45 Number of objective gradient evaluations = 6 Number of equality constraint evaluations = 45 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 41 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 39 Total CPU secs in IPOPT (w/o function evaluations) = 0.081 Total CPU secs in NLP function evaluations = 0.001 EXIT: Optimal Solution Found. [0.181159 0.915789] alamo Ipopt 3.13.2: output_file=/tmp/tmp8l8r2b6bipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.18e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.18e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.181159 0.915789] pysmo_poly 2023-03-04 01:45:19 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmptmly5cj1ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.18e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.18e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.181159 0.915789] pysmo_rbf 2023-03-04 01:45:19 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmpscnfi1adipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.18e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.18e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.181159 0.915789] pysmo_krig 2023-03-04 01:45:19 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmprtrhzg6yipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 2.18e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.11e-16 0.00e+00 -1.0 2.18e+04 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.1102230246251565e-16 1.1102230246251565e-16 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.1102230246251565e-16 1.1102230246251565e-16 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.181159 0.915789] keras Ipopt 3.13.2: output_file=/tmp/tmphufymnb9ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 6.39e+03 1.02e+01 -1.0 2.50e+04 - 3.28e-02 3.73e-01h 1 2 0.0000000e+00 6.26e+03 3.54e+01 -1.0 2.24e+04 - 6.42e-01 2.10e-02h 1 3 0.0000000e+00 6.26e+03 1.60e+05 -1.0 2.21e+04 - 9.58e-01 2.32e-04h 1 4r 0.0000000e+00 6.26e+03 1.00e+03 3.8 0.00e+00 - 0.00e+00 2.91e-07R 4 5r 0.0000000e+00 6.25e+03 2.09e+03 3.8 3.11e+05 - 1.47e-02 2.36e-04f 1 6r 0.0000000e+00 5.70e+03 1.53e+03 1.7 3.00e+05 - 4.78e-01 2.00e-02f 1 7r 0.0000000e+00 5.28e+03 1.24e+03 1.7 1.44e+03 - 4.43e-01 4.17e-02f 1 8r 0.0000000e+00 1.77e+03 6.34e+02 1.7 2.66e+01 - 4.92e-01 4.90e-01f 1 9r 0.0000000e+00 4.78e+02 1.83e+02 1.7 8.86e+00 - 8.97e-01 6.11e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 10r 0.0000000e+00 1.99e+02 1.90e+02 1.0 2.55e+00 - 3.44e-01 5.69e-01f 1 11r 0.0000000e+00 4.51e+01 3.22e+02 1.0 1.87e+00 - 2.88e-01 7.61e-01f 1 12r 0.0000000e+00 3.28e+01 2.08e+02 0.3 1.21e+01 - 4.06e-01 2.71e-01f 1 13r 0.0000000e+00 1.59e+01 3.63e+02 0.3 4.02e+01 - 2.07e-01 5.16e-01f 1 14r 0.0000000e+00 8.72e+00 1.90e+02 0.3 5.92e+01 - 4.66e-01 4.54e-01f 1 15r 0.0000000e+00 2.56e+00 1.86e+02 0.3 8.14e+01 - 4.22e-01 7.15e-01f 1 16r 0.0000000e+00 5.68e-01 3.48e+02 0.3 3.99e+01 - 3.71e-01 7.89e-01f 1 17r 0.0000000e+00 3.53e-01 1.75e+01 0.3 4.73e-01 - 9.52e-01 1.00e+00f 1 18r 0.0000000e+00 4.20e-01 3.66e+01 -1.1 4.42e+00 - 6.56e-01 9.37e-01f 1 19r 0.0000000e+00 4.50e-01 3.27e-01 -1.1 5.69e+02 - 1.00e+00 1.00e+00f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 20r 0.0000000e+00 4.62e-01 9.19e-01 -2.7 8.25e+00 - 9.24e-01 1.00e+00f 1 21r 0.0000000e+00 4.64e-01 2.15e+00 -2.7 3.81e+03 - 1.00e+00 4.94e-01f 1 22r 0.0000000e+00 4.62e-01 1.03e+02 -2.7 1.44e+03 - 1.00e+00 5.41e-02f 1 23r 0.0000000e+00 4.20e-01 4.76e+00 -2.7 1.28e+03 - 1.00e+00 1.00e+00f 1 24r 0.0000000e+00 4.20e-01 2.72e-03 -2.7 7.83e+01 - 1.00e+00 1.00e+00h 1 25r 0.0000000e+00 4.20e-01 3.25e-02 -6.1 9.91e-02 - 1.00e+00 1.00e+00f 1 26r 0.0000000e+00 3.54e-01 1.81e+02 -6.1 1.40e+04 - 3.85e-01 1.60e-01f 1 27r 0.0000000e+00 2.42e-01 3.47e+02 -6.1 2.76e+04 - 6.12e-01 1.54e-01f 1 28r 0.0000000e+00 2.42e-01 2.97e+02 -6.1 2.22e+05 - 6.85e-02 3.86e-08f 1 29r 0.0000000e+00 1.70e-01 2.46e+02 -6.1 5.10e+03 - 7.43e-01 5.98e-01f 1 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 30r 0.0000000e+00 1.30e-01 5.86e+02 -6.1 5.25e+03 - 3.65e-07 3.03e-01f 1 31r 0.0000000e+00 1.29e-01 5.37e+02 -6.1 8.42e+01 -4.0 1.00e+00 4.45e-02f 1 32r 0.0000000e+00 1.68e-02 3.90e+03 -6.1 6.09e+03 - 1.30e-07 8.25e-01f 1 33r 0.0000000e+00 1.68e-02 3.90e+03 -6.1 3.79e+02 -4.5 1.00e+00 4.85e-06h 1 34r 0.0000000e+00 1.03e-02 2.19e+03 -6.1 3.78e+02 - 1.00e+00 4.40e-01f 1 35r 0.0000000e+00 1.03e-02 2.82e+03 -6.1 2.12e+02 - 1.00e+00 3.21e-07h 2 36r 0.0000000e+00 1.03e-02 2.90e+03 -6.1 1.16e+01 - 
2.75e-07 2.24e-07h 2 37r 0.0000000e+00 6.60e-03 1.02e+03 -6.1 1.33e+02 - 1.39e-07 3.62e-01f 1 38r 0.0000000e+00 5.25e-07 9.13e-02 -6.1 8.49e+01 - 1.00e+00 1.00e+00f 1 39r 0.0000000e+00 4.37e-10 8.41e-12 -6.1 1.28e-02 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 39 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 1.6708709416057843e-10 4.3655745685100555e-10 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 1.6708709416057843e-10 4.3655745685100555e-10 Number of objective function evaluations = 46 Number of objective gradient evaluations = 6 Number of equality constraint evaluations = 46 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 41 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 39 Total CPU secs in IPOPT (w/o function evaluations) = 0.068 Total CPU secs in NLP function evaluations = 0.001 EXIT: Optimal Solution Found. [0.668116 1.073684] alamo Ipopt 3.13.2: output_file=/tmp/tmpa0u1s9twipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 5.48e+03 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 5.48e+03 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.668116 1.073684] pysmo_poly 2023-03-04 01:45:20 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly Ipopt 3.13.2: output_file=/tmp/tmpu31eoio3ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 5.48e+03 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 5.48e+03 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.668116 1.073684] pysmo_rbf 2023-03-04 01:45:20 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf Ipopt 3.13.2: output_file=/tmp/tmp0290qjj0ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 5.49e+03 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 5.49e+03 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.668116 1.073684] pysmo_krig 2023-03-04 01:45:20 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging Ipopt 3.13.2: output_file=/tmp/tmpaa7y_xoqipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 13 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 0 Total number of variables............................: 13 variables with only lower bounds: 0 variables with lower and upper bounds: 0 variables with only upper bounds: 0 Total number of equality constraints.................: 13 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 5.48e+03 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 5.48e+03 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 1 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00 Number of objective function evaluations = 2 Number of objective gradient evaluations = 2 Number of equality constraint evaluations = 2 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 2 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 1 Total CPU secs in IPOPT (w/o function evaluations) = 0.000 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found. [0.668116 1.073684] keras Ipopt 3.13.2: output_file=/tmp/tmpyzgcav85ipopt_out max_iter=500 max_cpu_time=120 ****************************************************************************** This program contains Ipopt, a library for large-scale nonlinear optimization. Ipopt is released as open source code under the Eclipse Public License (EPL). For more information visit http://projects.coin-or.org/Ipopt This version of Ipopt was compiled from source code available at https://github.com/IDAES/Ipopt as part of the Institute for the Design of Advanced Energy Systems Process Systems Engineering Framework (IDAES PSE Framework) Copyright (c) 2018-2019. See https://github.com/IDAES/idaes-pse. This version of Ipopt was compiled using HSL, a collection of Fortran codes for large-scale scientific computation. All technical papers, sales and publicity material resulting from use of the HSL codes within IPOPT must contain the following acknowledgement: HSL, a collection of Fortran codes for large-scale scientific computation. See http://www.hsl.rl.ac.uk. ****************************************************************************** This is Ipopt version 3.13.2, running with linear solver ma27. 
Number of nonzeros in equality constraint Jacobian...: 2567 Number of nonzeros in inequality constraint Jacobian.: 0 Number of nonzeros in Lagrangian Hessian.............: 80 Total number of variables............................: 231 variables with only lower bounds: 0 variables with lower and upper bounds: 192 variables with only upper bounds: 0 Total number of equality constraints.................: 231 Total number of inequality constraints...............: 0 inequality constraints with only lower bounds: 0 inequality constraints with lower and upper bounds: 0 inequality constraints with only upper bounds: 0 iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls 0 0.0000000e+00 1.02e+04 0.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0 1 0.0000000e+00 1.21e+03 9.52e+01 -1.0 1.91e+04 - 1.03e-02 8.81e-01f 1 2 0.0000000e+00 3.78e-03 5.31e+00 -1.0 2.73e+03 - 9.68e-01 1.00e+00h 1 3 0.0000000e+00 1.80e-06 6.88e-04 -1.0 4.11e+01 - 1.00e+00 1.00e+00h 1 4 0.0000000e+00 2.22e-16 1.83e-15 -2.5 1.85e-03 - 1.00e+00 1.00e+00h 1 Number of Iterations....: 4 (scaled) (unscaled) Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00 Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00 Constraint violation....: 2.2204460492503131e-16 2.2204460492503131e-16 Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00 Overall NLP error.......: 2.2204460492503131e-16 2.2204460492503131e-16 Number of objective function evaluations = 5 Number of objective gradient evaluations = 5 Number of equality constraint evaluations = 5 Number of inequality constraint evaluations = 0 Number of equality constraint Jacobian evaluations = 5 Number of inequality constraint Jacobian evaluations = 0 Number of Lagrangian Hessian evaluations = 4 Total CPU secs in IPOPT (w/o function evaluations) = 0.009 Total CPU secs in NLP function evaluations = 0.000 EXIT: Optimal Solution Found.
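The solver statistics printed above can also be captured programmatically rather than read from the console log. Below is a minimal sketch, assuming a dictionary of flowsheets keyed by trainer name; the toy model and the `_demo` names are illustrative only, and the real model_vars and model_cons dictionaries used by the plots in the next cell are assembled in the notebook's earlier cells.
from pyomo.environ import ConcreteModel, Var, Constraint, SolverFactory
from pyomo.opt import TerminationCondition
from idaes.core.util.model_statistics import number_variables, number_total_constraints

def _toy_model():
    # stand-in for a surrogate flowsheet so the sketch runs on its own
    m = ConcreteModel()
    m.x = Var(initialize=1.0)
    m.c = Constraint(expr=m.x == 2.0)
    return m

flowsheets_demo = {"alamo": _toy_model(), "keras": _toy_model()}
model_vars_demo, model_cons_demo, solved_ok_demo = {}, {}, {}
solver = SolverFactory("ipopt")
for trainer, m in flowsheets_demo.items():
    model_vars_demo[trainer] = number_variables(m)          # total Pyomo variables
    model_cons_demo[trainer] = number_total_constraints(m)  # total Pyomo constraints
    results = solver.solve(m)
    solved_ok_demo[trainer] = (
        results.solver.termination_condition == TerminationCondition.optimal
    )
print(model_vars_demo, model_cons_demo, solved_ok_demo)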
We can visualize these results by plotting each of the quantities above, with one data series per surrogate trainer. Note that some data series may overlap where the values are identical across cases:
from matplotlib import pyplot as plt
# create figure/axes for each plot sequentially, plotting each trainer as a separate data series
# Comparing model sizes
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # plot number of constraints vs number of variables for each trainer
    ax.plot(model_vars[trainer], model_cons[trainer], label=trainer, marker='o')
# add info to plot
ax.set_xlabel('Number of Variables')
ax.set_ylabel('Number of Constraints')
ax.set_title('Comparison of Model Size')
ax.legend()
plt.yscale('log')
plt.xscale('log')
print('Note: PySMO generates identical IPOPT model size (model constraints and variables) '
'regardless of surrogate expression size.')
plt.show()
print()
print('Process variable predictions displayed with relative error:')
print()
# Steam Flow Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    sf = [steam_flow_error[(i,j)] for (i,j) in steam_flow_error if j == trainer]
    ax.plot(cases, sf, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Relative Error')
ax.set_title('Steam Flow Prediction')
ax.legend()
plt.yscale('log')
plt.show()
# Reformer Duty Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    rd = [reformer_duty_error[(i,j)] for (i,j) in reformer_duty_error if j == trainer]
    ax.plot(cases, rd, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Relative Error')
ax.set_title('Reformer Duty Prediction')
ax.legend()
plt.yscale('log')
plt.show()
# C2H6 Mole Fraction Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    eth = [conc_C2H6[(i,j)] for (i,j) in conc_C2H6 if j == trainer]
    ax.plot(cases, eth, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Absolute Error')
ax.set_title('C2H6 Mole Fraction Prediction (O(1E-2))')
ax.legend()
plt.yscale('log')
plt.show()
print()
print('Mole fraction predictions displayed with absolute error:')
print()
# CH4 Mole Fraction Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    meth = [conc_CH4[(i,j)] for (i,j) in conc_CH4 if j == trainer]
    ax.plot(cases, meth, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Absolute Error')
ax.set_title('CH4 Mole Fraction Prediction (O(1E-1))')
ax.legend()
plt.yscale('log')
plt.show()
# H2 Mole Fraction Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    hyd = [conc_H2[(i,j)] for (i,j) in conc_H2 if j == trainer]
    ax.plot(cases, hyd, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Absolute Error')
ax.set_title('H2 Mole Fraction Prediction (O(1E-1))')
ax.legend()
plt.yscale('log')
plt.show()
# O2 Mole Fraction Prediction
fig = plt.figure()
ax = fig.add_subplot()
for trainer in trainers:
    # pick out the points that use that trainer and plot them against case number
    oxy = [conc_O2[(i,j)] for (i,j) in conc_O2 if j == trainer]
    ax.plot(cases, oxy, label=trainer)
# add info to plot
ax.set_xlabel('Cases')
ax.set_ylabel('Absolute Error')
ax.set_title('O2 Mole Fraction Prediction (O(1E-20))')
ax.legend()
plt.yscale('log')
plt.show()
Note: PySMO generates identical IPOPT model size (model constraints and variables) regardless of surrogate expression size.
Process variable predictions displayed with relative error:
Mole fraction predictions displayed with absolute error:
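If a single scalar per trainer is easier to compare than the case-by-case curves above, the same error dictionaries can be averaged directly. The sketch below (not part of the original notebook) assumes the `steam_flow_error` and `reformer_duty_error` dictionaries built earlier, keyed by `(case, trainer)`, and the `trainers` list used throughout:

```python
# Optional summary (sketch): mean relative error per surrogate trainer, computed
# from the same dictionaries used for the plots above.
import statistics

for trainer in trainers:
    sf = [steam_flow_error[(i, j)] for (i, j) in steam_flow_error if j == trainer]
    rd = [reformer_duty_error[(i, j)] for (i, j) in reformer_duty_error if j == trainer]
    print(f"{trainer:12s}  steam flow: {statistics.mean(sf):.3e}   "
          f"reformer duty: {statistics.mean(rd):.3e}")
```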
4.3 Comparing Surrogate Optimization¶
Extending this analysis, we will run a single optimization scenario for each surrogate model and compare the results. As in the previous notebooks detailing the ALAMO, PySMO and Keras workflows, we will maximize hydrogen production while limiting nitrogen to at most 34 mol% in the product stream.
# Import additional Pyomo libraries
from pyomo.environ import Objective, maximize
def run_optimization(surrogate_type='alamo'):
    print(surrogate_type)
    # create the IDAES model and flowsheet
    m = ConcreteModel()
    m.fs = FlowsheetBlock(dynamic=False)
    # create flowsheet input variables
    m.fs.bypass_frac = Var(initialize=0.80, bounds=[0.1, 0.8], doc='natural gas bypass fraction')
    m.fs.ng_steam_ratio = Var(initialize=0.80, bounds=[0.8, 1.2], doc='natural gas to steam ratio')
    # create flowsheet output variables
    m.fs.steam_flowrate = Var(initialize=0.2, doc="steam flowrate")
    m.fs.reformer_duty = Var(initialize=10000, doc="reformer heat duty")
    m.fs.AR = Var(initialize=0, doc="Ar (argon) fraction")
    m.fs.C2H6 = Var(initialize=0, doc="C2H6 fraction")
    m.fs.C3H8 = Var(initialize=0, doc="C3H8 fraction")
    m.fs.C4H10 = Var(initialize=0, doc="C4H10 fraction")
    m.fs.CH4 = Var(initialize=0, doc="CH4 fraction")
    m.fs.CO = Var(initialize=0, doc="CO fraction")
    m.fs.CO2 = Var(initialize=0, doc="CO2 fraction")
    m.fs.H2 = Var(initialize=0, doc="H2 fraction")
    m.fs.H2O = Var(initialize=0, doc="H2O fraction")
    m.fs.N2 = Var(initialize=0, doc="N2 fraction")
    m.fs.O2 = Var(initialize=0, doc="O2 fraction")
    # create input and output variable object lists for flowsheet
    inputs = [m.fs.bypass_frac, m.fs.ng_steam_ratio]
    outputs = [m.fs.steam_flowrate, m.fs.reformer_duty, m.fs.AR, m.fs.C2H6, m.fs.C3H8,
               m.fs.C4H10, m.fs.CH4, m.fs.CO, m.fs.CO2, m.fs.H2, m.fs.H2O, m.fs.N2, m.fs.O2]
    # create the Pyomo/IDAES block that corresponds to the surrogate
    # call correct PySMO object to use below (will let us avoid nested switches)
    # capture long output from loading surrogates (don't need to print it)
    stream = StringIO()
    oldstdout = sys.stdout
    sys.stdout = stream
    if surrogate_type == 'alamo':
        surrogate = AlamoSurrogate.load_from_file('alamo_surrogate.json')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(surrogate, input_vars=inputs, output_vars=outputs)
    elif surrogate_type == 'keras':
        keras_surrogate = KerasSurrogate.load_from_folder('keras_surrogate')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(keras_surrogate,
                                   formulation=KerasSurrogate.Formulation.FULL_SPACE,
                                   input_vars=inputs, output_vars=outputs)
    else:  # surrogate is one of the three pysmo basis options
        surrogate = PysmoSurrogate.load_from_file(str(surrogate_type) + '_surrogate.json')
        m.fs.surrogate = SurrogateBlock()
        m.fs.surrogate.build_model(surrogate, input_vars=inputs, output_vars=outputs)
    # revert to standard output
    sys.stdout = oldstdout
    # unfix input values and add the objective/constraint to the model
    m.fs.bypass_frac.unfix()
    m.fs.ng_steam_ratio.unfix()
    m.fs.obj = Objective(expr=m.fs.H2, sense=maximize)
    m.fs.con = Constraint(expr=m.fs.N2 <= 0.34)
    solver = SolverFactory('ipopt')
    try:  # attempt to solve problem
        [status_obj, solved, iters, time, regu] = ipopt_solve_with_stats(m, solver)
    except:  # retry solving one more time
        [status_obj, solved, iters, time, regu] = ipopt_solve_with_stats(m, solver)
    return inputs, outputs, iters, time  # don't report regu

# create list objects to store data, run optimization
results = {}
for trainer in trainers:
    inputs, outputs, iters, time = run_optimization(trainer)
    results[('IPOPT iterations', trainer)] = iters
    results[('Solve time', trainer)] = time
    for var in inputs:
        results[(var.name, trainer)] = value(var)
    for var in outputs:
        results[(var.name, trainer)] = value(var)
(IPOPT logs truncated to the key statistics for each optimization)

alamo
  Ipopt 3.13.2 (ma27): 15 variables (2 bounded), 13 equality + 1 inequality constraints
  Number of Iterations....: 13    Objective: -3.31558597e-01    EXIT: Optimal Solution Found.

pysmo_poly
  2023-03-04 01:45:23 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=poly
  Ipopt 3.13.2 (ma27): 15 variables (2 bounded), 13 equality + 1 inequality constraints
  Number of Iterations....: 12    Objective: -3.31429435e-01    EXIT: Optimal Solution Found.

pysmo_rbf
  2023-03-04 01:45:23 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=rbf
  Ipopt 3.13.2 (ma27): 15 variables (2 bounded), 13 equality + 1 inequality constraints
  Number of Iterations....: 14    Objective: -3.31687584e-01    EXIT: Optimal Solution Found.

pysmo_krig
  2023-03-04 01:45:24 [INFO] idaes.core.surrogate.pysmo_surrogate: Decode surrogate. type=kriging
  Ipopt 3.13.2 (ma27): 15 variables (2 bounded), 13 equality + 1 inequality constraints
  Number of Iterations....: 13    Objective: -3.31698451e-01    EXIT: Optimal Solution Found.

keras
  Ipopt 3.13.2 (ma27): 233 variables (194 bounded), 231 equality + 1 inequality constraints
  Number of Iterations....: 17    Objective: -3.32563214e-01    EXIT: Optimal Solution Found.
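Before tabulating the full results, a quick sanity check on the stored values can confirm that every surrogate respects the nitrogen limit at its optimum. This is a minimal sketch (not part of the original notebook) that uses the `results` dictionary populated above; the `'fs.N2'` key matches the `var.name` values stored in the loop:

```python
# Sanity check (sketch): each optimum should satisfy the 34 mol% N2 limit, and the
# iteration counts give a first indication of relative tractability.
for trainer in trainers:
    n2 = results[('fs.N2', trainer)]
    iters = results[('IPOPT iterations', trainer)]
    print(f"{trainer:12s}  N2 = {n2:.4f} (limit 0.34)   IPOPT iterations = {iters}")
```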
# print results as a table
df_index = ['IPOPT iterations', 'Solve time']
for var in inputs:
    df_index.append(var.name)
for var in outputs:
    df_index.append(var.name)
df_cols = trainers
df = pd.DataFrame(index=df_index, columns=df_cols)
for i in df_index:
    for j in df_cols:
        # use .loc to avoid pandas chained-assignment issues
        df.loc[i, j] = results[(i, j)]
df  # display results table
| | alamo | pysmo_poly | pysmo_rbf | pysmo_krig | keras |
|---|---|---|---|---|---|
| IPOPT iterations | 13 | 12 | 14 | 13 | 17 |
| Solve time (s) | 0.002 | 0.002 | 0.007 | 0.005 | 0.034 |
| fs.bypass_frac | 0.1 | 0.1 | 0.1 | 0.1 | 0.1 |
| fs.ng_steam_ratio | 1.138445 | 1.111183 | 1.124305 | 1.124276 | 1.104679 |
| fs.steam_flowrate | 1.241674 | 1.21194 | 1.226153 | 1.226191 | 1.195974 |
| fs.reformer_duty | 39390.78895 | 38820.997576 | 39062.177089 | 39072.795322 | 38028.680499 |
| fs.AR | 0.004107 | 0.004103 | 0.004107 | 0.004107 | 0.00411 |
| fs.C2H6 | 0.000406 | 0.000545 | 0.000545 | 0.000519 | 0.000448 |
| fs.C3H8 | 0.000089 | 0.000119 | 0.000119 | 0.000114 | 0.00011 |
| fs.C4H10 | 0.000051 | 0.000068 | 0.000068 | 0.000065 | 0.000072 |
| fs.CH4 | 0.012764 | 0.016972 | 0.016991 | 0.016224 | 0.017682 |
| fs.CO | 0.104531 | 0.104863 | 0.104856 | 0.104849 | 0.10598 |
| fs.CO2 | 0.053986 | 0.053488 | 0.053528 | 0.05353 | 0.053167 |
| fs.H2 | 0.331559 | 0.331429 | 0.331688 | 0.331698 | 0.332563 |
| fs.H2O | 0.151056 | 0.148414 | 0.148918 | 0.148931 | 0.147103 |
| fs.N2 | 0.34 | 0.34 | 0.34 | 0.34 | 0.34 |
| fs.O2 | 0.0 | -0.0 | 0.0 | 0.0 | 0.0 |
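All five surrogates drive the bypass fraction to its lower bound (0.1) and the N2 constraint to its 0.34 limit, and the optimal H2 mole fractions differ by well under 1%. As a short sketch (assuming the `df` DataFrame built in the cell above), that spread can be quantified directly from the table:

```python
# Spread of the optimal H2 mole fraction across surrogate types (sketch).
h2 = df.loc['fs.H2'].astype(float)
spread = h2.max() - h2.min()
print(f"H2 range: {h2.min():.6f} to {h2.max():.6f}  "
      f"(spread = {spread:.6f}, {100 * spread / h2.mean():.2f}% of the mean)")
```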