2025-11-04 13:17:48.960408: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
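The first log line names the `TF_ENABLE_ONEDNN_OPTS` switch for reproducible numerics. A minimal sketch of setting it from Python; note it must be set before TensorFlow is imported, or it has no effect:

```python
import os

# Disable oneDNN custom ops to avoid the floating-point round-off differences
# the log warns about. This must run BEFORE `import tensorflow`.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "0"

# import tensorflow as tf  # import TensorFlow only after the variable is set
```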
2025-11-04 13:17:48.972985: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762258668.987232 1508936 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762258668.994101 1508936 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762258669.008893 1508936 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:17:49.012209: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/tune/impl/tuner_internal.py:144: RayDeprecationWarning: The `RunConfig` class should be imported from `ray.tune` when passing it to the Tuner. Please update your imports. See this issue for more context and migration options: https://github.com/ray-project/ray/issues/49454. Disable these warnings by setting the environment variable: RAY_TRAIN_ENABLE_V2_MIGRATION_WARNINGS=0
  _log_deprecation_warning(
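The RayDeprecationWarning above names both the fix and an opt-out. A hedged sketch of each, assuming a Ray Tune version that still emits the V2 migration warnings:

```python
import os

# Option 1 (preferred): import RunConfig from ray.tune when passing it to Tuner,
# as the warning and https://github.com/ray-project/ray/issues/49454 describe:
#   from ray.tune import RunConfig   # instead of ray.train / ray.air
# Option 2: silence the migration warnings with the variable the log names.
os.environ["RAY_TRAIN_ENABLE_V2_MIGRATION_WARNINGS"] = "0"
```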
2025-11-04 13:17:52,106	INFO worker.py:1927 -- Started a local Ray instance.
2025-11-04 13:17:52,805	INFO tune.py:253 -- Initializing Ray automatically. For cluster usage or custom Ray initialization, call `ray.init(...)` before `Tuner(...)`.
2025-11-04 13:17:52,881	INFO trial.py:182 -- Creating a new dirname dir_49751_42bb because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,884	INFO trial.py:182 -- Creating a new dirname dir_49751_4cd6 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,888	INFO trial.py:182 -- Creating a new dirname dir_49751_f199 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,890	INFO trial.py:182 -- Creating a new dirname dir_49751_7ba2 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,892	INFO trial.py:182 -- Creating a new dirname dir_49751_b529 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,894	INFO trial.py:182 -- Creating a new dirname dir_49751_05d3 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,897	INFO trial.py:182 -- Creating a new dirname dir_49751_d43d because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,899	INFO trial.py:182 -- Creating a new dirname dir_49751_6639 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,901	INFO trial.py:182 -- Creating a new dirname dir_49751_fb5a because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,904	INFO trial.py:182 -- Creating a new dirname dir_49751_6171 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,906	INFO trial.py:182 -- Creating a new dirname dir_49751_bdb5 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,909	INFO trial.py:182 -- Creating a new dirname dir_49751_bbf9 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,912	INFO trial.py:182 -- Creating a new dirname dir_49751_5d81 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,916	INFO trial.py:182 -- Creating a new dirname dir_49751_b1f7 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,919	INFO trial.py:182 -- Creating a new dirname dir_49751_441a because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,922	INFO trial.py:182 -- Creating a new dirname dir_49751_59e6 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,926	INFO trial.py:182 -- Creating a new dirname dir_49751_b605 because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,930	INFO trial.py:182 -- Creating a new dirname dir_49751_319c because trial dirname 'dir_49751' already exists.
2025-11-04 13:17:52,933	INFO trial.py:182 -- Creating a new dirname dir_49751_2694 because trial dirname 'dir_49751' already exists.
1 GPU(s) detected and VRAM set to crossover mode.
This activity cannot be balanced (by downsampling)
last message repeated 76 times
Starting the search for the model's optimal hyperparameters
╭─────────────────────────────────────────────────────────────────╮
│ Configuration for experiment     ESANN_hyperparameters_tuning   │
├─────────────────────────────────────────────────────────────────┤
│ Search algorithm                 BasicVariantGenerator          │
│ Scheduler                        AsyncHyperBandScheduler        │
│ Number of trials                 20                             │
╰─────────────────────────────────────────────────────────────────╯

View detailed results here: /mnt/nvme1n2/git/uniovi-simur-wearablepermed-data/output/Paper_results/cases_dataset_M/case_M_ESANN_acc_gyr_superclasses_CPA_METs/ESANN_hyperparameters_tuning
To visualize your results with TensorBoard, run: `tensorboard --logdir /tmp/ray/session_2025-11-04_13-17-51_401054_1508936/artifacts/2025-11-04_13-17-52/ESANN_hyperparameters_tuning/driver_artifacts`

Trial status: 20 PENDING
Current time: 2025-11-04 13:17:53. Total running time: 0s
Logical resource usage: 0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status       N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    PENDING            4   adam            relu                                  128                 64                  3          0.000218484         65 │
│ trial_49751    PENDING            4   adam            tanh                                   64                128                  5          9.98547e-05         55 │
│ trial_49751    PENDING            4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131 │
│ trial_49751    PENDING            3   rmsprop         relu                                   64                128                  5          0.000380128         91 │
│ trial_49751    PENDING            4   adam            tanh                                   32                 32                  3          2.97881e-05         59 │
│ trial_49751    PENDING            3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94 │
│ trial_49751    PENDING            4   adam            relu                                   64                128                  3          1.6132e-05         114 │
│ trial_49751    PENDING            2   adam            relu                                   64                 64                  3          2.51454e-05        141 │
│ trial_49751    PENDING            3   adam            relu                                   32                 64                  3          0.000152599        138 │
│ trial_49751    PENDING            4   adam            tanh                                   64                 64                  5          0.00199444          93 │
│ trial_49751    PENDING            2   rmsprop         relu                                   32                128                  5          0.000810222         51 │
│ trial_49751    PENDING            3   adam            tanh                                   32                 32                  5          0.00337379          79 │
│ trial_49751    PENDING            4   adam            tanh                                  128                128                  3          7.68947e-05         83 │
│ trial_49751    PENDING            4   adam            relu                                  128                128                  5          1.32913e-05        108 │
│ trial_49751    PENDING            2   adam            tanh                                   64                 64                  3          4.37965e-05        125 │
│ trial_49751    PENDING            4   adam            tanh                                   64                 32                  3          8.13867e-05        140 │
│ trial_49751    PENDING            4   adam            tanh                                  128                128                  3          1.21494e-05         85 │
│ trial_49751    PENDING            4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105 │
│ trial_49751    PENDING            3   rmsprop         tanh                                   32                128                  5          0.00135033         103 │
│ trial_49751    PENDING            4   adam            relu                                   64                 32                  3          0.00148826         119 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            2 │
│ epochs                            51 │
│ funcion_activacion              relu │
│ numero_filtros                   128 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  32 │
│ tasa_aprendizaje             0.00081 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            3 │
│ epochs                            94 │
│ funcion_activacion              relu │
│ numero_filtros                    32 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00004 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            2 │
│ epochs                           141 │
│ funcion_activacion              relu │
│ numero_filtros                    64 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00003 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           119 │
│ funcion_activacion              relu │
│ numero_filtros                    32 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00149 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                            93 │
│ funcion_activacion              tanh │
│ numero_filtros                    64 │
│ optimizador                     adam │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00199 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           114 │
│ funcion_activacion              relu │
│ numero_filtros                   128 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00002 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                            85 │
│ funcion_activacion              tanh │
│ numero_filtros                   128 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00001 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                            59 │
│ funcion_activacion              tanh │
│ numero_filtros                    32 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  32 │
│ tasa_aprendizaje             0.00003 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            3 │
│ epochs                            79 │
│ funcion_activacion              tanh │
│ numero_filtros                    32 │
│ optimizador                     adam │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  32 │
│ tasa_aprendizaje             0.00337 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           105 │
│ funcion_activacion              relu │
│ numero_filtros                    64 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00007 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                            65 │
│ funcion_activacion              relu │
│ numero_filtros                    64 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00022 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           131 │
│ funcion_activacion              tanh │
│ numero_filtros                   128 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00007 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           108 │
│ funcion_activacion              relu │
│ numero_filtros                   128 │
│ optimizador                     adam │
│ tamanho_filtro                     5 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00001 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
(train_cnn_ray_tune pid=1510570) 2025-11-04 13:17:56.215372: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
(train_cnn_ray_tune pid=1510568) 2025-11-04 13:17:56.203231: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
(train_cnn_ray_tune pid=1510574) WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
(train_cnn_ray_tune pid=1510574) E0000 00:00:1762258676.230446 1511727 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
(train_cnn_ray_tune pid=1510568) E0000 00:00:1762258676.238513 1511729 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
(train_cnn_ray_tune pid=1510570) W0000 00:00:1762258676.291813 1511736 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
(train_cnn_ray_tune pid=1510570) 2025-11-04 13:17:56.297853: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
(train_cnn_ray_tune pid=1510570) To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
(train_cnn_ray_tune pid=1510574) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
(train_cnn_ray_tune pid=1510574)   warnings.warn(
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575672: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575718: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:167] env: CUDA_VISIBLE_DEVICES=""
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575726: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:170] CUDA_VISIBLE_DEVICES is set to an empty string - this hides all GPUs from CUDA
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575730: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:178] verbose logging is disabled. Rerun with verbose logging (usually --v=1 or --vmodule=cuda_diagnostics=1) to get more diagnostic output from this module
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575735: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:183] retrieving CUDA diagnostic information for host: simur-MS-7B94
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575738: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:190] hostname: simur-MS-7B94
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575959: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:197] libcuda reported version is: 570.133.7
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575993: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:201] kernel reported version is: 570.133.7
(train_cnn_ray_tune pid=1510574) 2025-11-04 13:17:59.575998: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:291] kernel version seems to match DSO: 570.133.7
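The `CUDA_ERROR_NO_DEVICE` block above follows from `CUDA_VISIBLE_DEVICES=""`: Ray exports an empty device list to trials that requested no GPU resources. A sketch of the two usual remedies; the resource dictionary and device index below are illustrative assumptions, not values taken from this run:

```python
import os

# Inside Ray Tune, GPUs are exposed per trial by requesting them as resources,
# e.g. tune.with_resources(trainable, {"cpu": 1, "gpu": 1}); otherwise Ray sets
# CUDA_VISIBLE_DEVICES="" for the worker and cuInit fails as logged above.
# Outside Ray, exposing a specific GPU manually looks like this (index assumed):
os.environ["CUDA_VISIBLE_DEVICES"] = "0"  # expose only the first GPU
```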
╭─────────────────────────────────────╮
│ Trial trial_49751 config            │
├─────────────────────────────────────┤
│ N_capas                           4 │
│ epochs                           55 │
│ funcion_activacion             tanh │
│ numero_filtros                  128 │
│ optimizador                    adam │
│ tamanho_filtro                    5 │
│ tamanho_minilote                 64 │
│ tasa_aprendizaje             0.0001 │
╰─────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                           140 │
│ funcion_activacion              tanh │
│ numero_filtros                    32 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00008 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            3 │
│ epochs                           138 │
│ funcion_activacion              relu │
│ numero_filtros                    64 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  32 │
│ tasa_aprendizaje             0.00015 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            2 │
│ epochs                           125 │
│ funcion_activacion              tanh │
│ numero_filtros                    64 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00004 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            3 │
│ epochs                            91 │
│ funcion_activacion              relu │
│ numero_filtros                   128 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  64 │
│ tasa_aprendizaje             0.00038 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            4 │
│ epochs                            83 │
│ funcion_activacion              tanh │
│ numero_filtros                   128 │
│ optimizador                     adam │
│ tamanho_filtro                     3 │
│ tamanho_minilote                 128 │
│ tasa_aprendizaje             0.00008 │
╰──────────────────────────────────────╯
Trial trial_49751 started with configuration:
╭──────────────────────────────────────╮
│ Trial trial_49751 config             │
├──────────────────────────────────────┤
│ N_capas                            3 │
│ epochs                           103 │
│ funcion_activacion              tanh │
│ numero_filtros                   128 │
│ optimizador                  rmsprop │
│ tamanho_filtro                     5 │
│ tamanho_minilote                  32 │
│ tasa_aprendizaje             0.00135 │
╰──────────────────────────────────────╯
(train_cnn_ray_tune pid=1510574) Model: "sequential"
(train_cnn_ray_tune pid=1510574) ┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
(train_cnn_ray_tune pid=1510574) ┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
(train_cnn_ray_tune pid=1510574) ┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
(train_cnn_ray_tune pid=1510574) │ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ layer_normalization             │ (None, 6, 128)         │           256 │
(train_cnn_ray_tune pid=1510574) │ (LayerNormalization)            │                        │               │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ dropout (Dropout)               │ (None, 6, 128)         │             0 │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ layer_normalization_1           │ (None, 6, 128)         │           256 │
(train_cnn_ray_tune pid=1510574) │ (LayerNormalization)            │                        │               │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ global_average_pooling1d        │ (None, 128)            │             0 │
(train_cnn_ray_tune pid=1510574) │ (GlobalAveragePooling1D)        │                        │               │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ dropout_2 (Dropout)             │ (None, 128)            │             0 │
(train_cnn_ray_tune pid=1510574) ├─────────────────────────────────┼────────────────────────┼───────────────┤
(train_cnn_ray_tune pid=1510574) │ dense (Dense)                   │ (None, 4)              │           516 │
(train_cnn_ray_tune pid=1510574) └─────────────────────────────────┴────────────────────────┴───────────────┘
(train_cnn_ray_tune pid=1510574)  Total params: 243,204 (950.02 KB)
(train_cnn_ray_tune pid=1510574)  Trainable params: 243,204 (950.02 KB)
(train_cnn_ray_tune pid=1510574)  Non-trainable params: 0 (0.00 B)
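The per-layer counts in the summary above can be cross-checked analytically. A minimal sketch, assuming a channels-last input of shape (6, 250) and kernel_size=5 with "same" padding: the kernel size follows from conv1d_1's 82,048 params (5·128·128 + 128), while the 250 input channels are inferred from conv1d's 160,128 params and are not confirmed by the log.

```python
# Analytic parameter counts for the "sequential" model summarized above.
# Assumptions (inferred, not logged): input shape (6, 250), kernel_size=5,
# padding="same" so the length-6 axis is preserved; Dropout and
# GlobalAveragePooling1D contribute zero parameters.

def conv1d_params(kernel_size, in_ch, filters):
    # weights: kernel_size * in_ch * filters, plus one bias per filter
    return kernel_size * in_ch * filters + filters

def layernorm_params(channels):
    # one gamma and one beta per channel
    return 2 * channels

def dense_params(in_units, out_units):
    # weight matrix plus one bias per output unit
    return in_units * out_units + out_units

layers = {
    "conv1d":                conv1d_params(5, 250, 128),   # 160,128
    "layer_normalization":   layernorm_params(128),        # 256
    "conv1d_1":              conv1d_params(5, 128, 128),   # 82,048
    "layer_normalization_1": layernorm_params(128),        # 256
    "dense":                 dense_params(128, 4),         # 516
}

total = sum(layers.values())
print(layers)
print("total:", total)  # matches the logged 243,204
```

Under these assumptions the totals reproduce the logged 243,204 exactly, which is a useful sanity check when tuning filter counts or kernel sizes.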
(train_cnn_ray_tune pid=1510574) Epoch 1/51
(train_cnn_ray_tune pid=1510561)  Total params: 407,812 (1.56 MB)
(train_cnn_ray_tune pid=1510561)  Trainable params: 407,812 (1.56 MB)
(train_cnn_ray_tune pid=1510574) progress: steps 1–46/328, ~18 ms/step, accuracy 0.156 → 0.261, loss 2.423 → 2.138
(train_cnn_ray_tune pid=1510543) progress: step 1/164, accuracy 0.266, loss 2.182
(train_cnn_ray_tune pid=1510566) progress: steps 5–22/328, ~40 ms/step, accuracy 0.316 → 0.299, loss 2.066 → 2.041
(train_cnn_ray_tune pid=1510572) Model: "sequential" [repeated 19x across cluster] (Ray deduplicates logs by default. Set RAY_DEDUP_LOGS=0 to disable log deduplication, or see https://docs.ray.io/en/master/ray-observability/user-guides/configure-logging.html#log-deduplication for more options.)
(train_cnn_ray_tune pid=1510572) Deduplicated summary rows from the other trials: conv1d / layer_normalization (None, 6, 128) and dropout rows [repeated 19–239x across cluster], a dropout_4 (None, 128) row [repeated 67x], global_average_pooling1d (None, 128) and dense (None, 4) rows [repeated 19x].
(train_cnn_ray_tune pid=1510572)  Total params: 245,508 (959.02 KB) [repeated 14x across cluster]
(train_cnn_ray_tune pid=1510572)  Trainable params: 245,508 (959.02 KB) [repeated 14x across cluster]
(train_cnn_ray_tune pid=1510572)  Non-trainable params: 0 (0.00 B) [repeated 19x across cluster]
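Ray collapsed the duplicated worker summaries above into "[repeated Nx across cluster]" lines. Per the notice in the log, deduplication can be turned off with an environment variable; a minimal sketch (set it before the Ray driver process starts):

```shell
# Show every worker's output verbatim instead of Ray's
# "[repeated Nx across cluster]" summaries.
export RAY_DEDUP_LOGS=0
```

With 20 concurrent trials this multiplies log volume considerably, so it is mainly useful when debugging a single misbehaving worker.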
(train_cnn_ray_tune pid=1510566) Epoch 1/103 [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566)  Total params: 325,508 (1.24 MB) [repeated 4x across cluster]
(train_cnn_ray_tune pid=1510566)  Trainable params: 325,508 (1.24 MB) [repeated 4x across cluster]
(train_cnn_ray_tune pid=1510566) progress: steps 24–38/328, ~43 ms/step, accuracy ≈0.296, loss 2.040 → 2.033
(train_cnn_ray_tune pid=1510573) progress: steps 6–8/328, ~28 ms/step, accuracy ≈0.30, loss ≈2.08
(train_cnn_ray_tune pid=1510561) progress: steps 2–15/82, ~115 ms/step, accuracy 0.313 → 0.266, loss 2.106 → 2.148
(train_cnn_ray_tune pid=1510542) progress: steps 5–30/82, ~65 ms/step, accuracy ≈0.249, loss ≈2.10
(train_cnn_ray_tune pid=1510566) progress: steps 78–80/328, ~44 ms/step, accuracy ≈0.292, loss ≈1.976
(train_cnn_ray_tune pid=1510571) progress: steps 55–57/328, ~35 ms/step, accuracy ≈0.254, loss ≈2.125
(train_cnn_ray_tune pid=1510569) progress: step 1/164, accuracy 0.266, loss 1.865
(train_cnn_ray_tune pid=1510569) progress: step 2/164, accuracy 0.238, loss 1.983 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510564) progress: steps 15–17/164, ~45 ms/step, accuracy ≈0.253, loss ≈1.91 [repeated 136x across cluster]
(train_cnn_ray_tune pid=1510549) progress: steps 3–7/164, ~100 ms/step, accuracy 0.264 → 0.257, loss ≈2.134
(train_cnn_ray_tune pid=1510563) progress: steps 1–3/164, accuracy 0.500 → 0.396, loss 1.340 → 1.558
(train_cnn_ray_tune pid=1510542) progress: steps 34–35/82, ~64 ms/step, accuracy ≈0.249, loss ≈2.096
(train_cnn_ray_tune pid=1510543) progress: step 135/164, ~31 ms/step, accuracy 0.261, loss 2.163 [repeated 97x across cluster]
(train_cnn_ray_tune pid=1510549) progress: steps 4–5/164, ~105 ms/step, accuracy ≈0.258, loss ≈2.133
(train_cnn_ray_tune pid=1510571) progress: steps 74–77/328, ~37 ms/step, accuracy ≈0.253, loss ≈2.123
(train_cnn_ray_tune pid=1510572) progress: step 1/82, accuracy 0.258, loss 2.247 [repeated 7x across cluster]
(train_cnn_ray_tune pid=1510566) progress: step 105/328, ~48 ms/step, accuracy 0.291, loss 1.954 [repeated 59x across cluster]
(train_cnn_ray_tune pid=1510569) progress: steps 24–25/164, ~75 ms/step, accuracy 0.254, loss 2.064 [repeated 35x across cluster]
(train_cnn_ray_tune pid=1510568) 164/164 steps, 10s, 41 ms/step - accuracy: 0.2476 - loss: 2.1423 - val_accuracy: 0.3736 - val_loss: 1.3727
(train_cnn_ray_tune pid=1510568) Epoch 2/94
(train_cnn_ray_tune pid=1510575) progress: steps 20–21/164, ~67 ms/step, accuracy ≈0.251, loss ≈2.048 [repeated 3x across cluster]
(train_cnn_ray_tune pid=1510541) progress: steps 41–45/82, ~75 ms/step, accuracy ≈0.237, loss ≈2.181 [repeated 54x across cluster]
(train_cnn_ray_tune pid=1510567) progress: step 27/82, ~123 ms/step, accuracy 0.253, loss 2.156 [repeated 90x across cluster]
(train_cnn_ray_tune pid=1510562) progress: steps 42–43/82, ~135 ms/step, accuracy ≈0.243, loss ≈2.108
(train_cnn_ray_tune pid=1510564) progress: steps 37–39/164, ~47 ms/step, accuracy ≈0.255, loss ≈1.901
(train_cnn_ray_tune pid=1510543) progress: steps 1–3/164, accuracy ≈0.24, loss ≈2.07 [repeated 2x across cluster]
(train_cnn_ray_tune pid=1510563) progress: steps 107–109/164, ~47 ms/step, accuracy ≈0.263, loss ≈1.826 [repeated 230x across cluster]
(train_cnn_ray_tune pid=1510570) progress: step 3/164, ~48 ms/step, accuracy 0.239, loss 2.067 [repeated 2x across cluster]
(train_cnn_ray_tune pid=1510564) progress: step 135/164, ~44 ms/step, accuracy 0.263, loss 1.813 [repeated 269x across cluster]
(train_cnn_ray_tune pid=1510542) 82/82 steps, 12s, 94 ms/step - accuracy: 0.2479 - loss: 2.0737 - val_accuracy: 0.3634 - val_loss: 1.3419
(train_cnn_ray_tune pid=1510542) progress: step 1/82, accuracy 0.234, loss 2.190 [repeated 3x across cluster]
(train_cnn_ray_tune pid=1510549) progress: steps 44–48/164, ~87 ms/step, accuracy 0.258, loss ≈2.105 [repeated 42x across cluster]
(train_cnn_ray_tune pid=1510543) 164/164 steps, 10s, 42 ms/step - accuracy: 0.2630 - loss: 2.1539 - val_accuracy: 0.3666 - val_loss: 1.3286 [repeated 2x across cluster]
(train_cnn_ray_tune pid=1510542) Epoch 2/105 [repeated 3x across cluster]
(train_cnn_ray_tune pid=1510549) progress: steps 49–50/164, ~87 ms/step, accuracy 0.258, loss ≈2.106 [repeated 2x across cluster]
(train_cnn_ray_tune pid=1510541) progress: step 3/82, ~68 ms/step, accuracy 0.255, loss 1.963 [repeated 36x across cluster]
(train_cnn_ray_tune pid=1510572) progress: step 63/82, ~128 ms/step, accuracy 0.250, loss 2.197 [repeated 140x across cluster]
(train_cnn_ray_tune pid=1510542) progress: steps 36–37/82, ~72 ms/step, accuracy ≈0.259, loss ≈2.000 [repeated 22x across cluster]
(train_cnn_ray_tune pid=1510564) progress: steps 2–4/164, ~50 ms/step, accuracy ≈0.29, loss ≈1.48
(train_cnn_ray_tune pid=1510562) progress: steps 70–71/82, ~136 ms/step, accuracy ≈0.247, loss ≈2.099 [repeated 2x across cluster]
(train_cnn_ray_tune pid=1510569) progress: steps 71–73/164, ~75 ms/step, accuracy ≈0.256, loss ≈2.082
(train_cnn_ray_tune pid=1510563) progress: steps 1–2/164, accuracy ≈0.30, loss ≈1.79
(train_cnn_ray_tune pid=1510566) progress: step 256/328, ~54 ms/step, accuracy 0.294, loss 1.841
[1m258/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m3s[0m 54ms/step - accuracy: 0.2944 - loss: 1.8400[32m [repeated 234x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 35ms/step - accuracy: 0.4896 - loss: 1.2509 [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m261/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m3s[0m 54ms/step - accuracy: 0.2945 - loss: 1.8380[32m [repeated 303x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m1s[0m 73ms/step - accuracy: 0.2555 - loss: 2.1007
[1m149/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m1s[0m 73ms/step - accuracy: 0.2555 - loss: 2.1009
[1m150/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m1s[0m 73ms/step - accuracy: 0.2554 - loss: 2.1011
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m14s[0m 95ms/step - accuracy: 0.2421 - loss: 2.1521 - val_accuracy: 0.3473 - val_loss: 1.3579
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 113ms/step - accuracy: 0.2812 - loss: 2.1253[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510561)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m18s[0m 165ms/step - accuracy: 0.2572 - loss: 2.1175 - val_accuracy: 0.3680 - val_loss: 1.4622
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m 34/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 40ms/step - accuracy: 0.3926 - loss: 1.3512[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m 42/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 40ms/step - accuracy: 0.3892 - loss: 1.3510
[1m 44/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 39ms/step - accuracy: 0.3883 - loss: 1.3512[32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510575)[0m 
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 63ms/step - accuracy: 0.2734 - loss: 1.4709 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 65ms/step - accuracy: 0.2656 - loss: 1.4723
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m 1/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 223ms/step - accuracy: 0.2891 - loss: 2.1416
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 95ms/step - accuracy: 0.2617 - loss: 2.1214  
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m7s[0m 82ms/step - accuracy: 0.2588 - loss: 1.9844 - val_accuracy: 0.3690 - val_loss: 1.3190
[36m(train_cnn_ray_tune pid=1510575)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m18s[0m 70ms/step - accuracy: 0.2701 - loss: 1.8491 - val_accuracy: 0.3704 - val_loss: 1.2903[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m Epoch 3/105[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m 75/328[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m9s[0m 39ms/step - accuracy: 0.3835 - loss: 1.3509 
[1m 76/328[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m9s[0m 40ms/step - accuracy: 0.3835 - loss: 1.3509[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m79/82[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 68ms/step - accuracy: 0.2575 - loss: 2.0188[32m [repeated 46x across cluster][0m
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m 9/82[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 124ms/step - accuracy: 0.2633 - loss: 2.0281[32m [repeated 83x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m11/82[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 79ms/step - accuracy: 0.2931 - loss: 1.8983
[1m12/82[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 78ms/step - accuracy: 0.2915 - loss: 1.9021[32m [repeated 37x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 47ms/step - accuracy: 0.3160 - loss: 2.0776  
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 37ms/step - accuracy: 0.3119 - loss: 2.0901
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 67ms/step - accuracy: 0.3066 - loss: 1.7725  
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 62ms/step - accuracy: 0.3077 - loss: 1.7688
[36m(train_cnn_ray_tune pid=1510567)[0m 
[1m19/82[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m7s[0m 117ms/step - accuracy: 0.2507 - loss: 2.1034
[1m20/82[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m7s[0m 116ms/step - accuracy: 0.2509 - loss: 2.1029[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m162/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 33ms/step - accuracy: 0.2957 - loss: 2.0085
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 33ms/step - accuracy: 0.2957 - loss: 2.0082[32m [repeated 220x across cluster][0m
[36m(train_cnn_ray_tune pid=1510573)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m15s[0m 47ms/step - accuracy: 0.3003 - loss: 1.3634 

Trial status: 20 RUNNING
Current time: 2025-11-04 13:18:23. Total running time: 30s
Logical resource usage: 20.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status       N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING            4   adam            relu                                  128                 64                  3          0.000218484         65 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                128                  5          9.98547e-05         55 │
│ trial_49751    RUNNING            4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131 │
│ trial_49751    RUNNING            3   rmsprop         relu                                   64                128                  5          0.000380128         91 │
│ trial_49751    RUNNING            4   adam            tanh                                   32                 32                  3          2.97881e-05         59 │
│ trial_49751    RUNNING            3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94 │
│ trial_49751    RUNNING            4   adam            relu                                   64                128                  3          1.6132e-05         114 │
│ trial_49751    RUNNING            2   adam            relu                                   64                 64                  3          2.51454e-05        141 │
│ trial_49751    RUNNING            3   adam            relu                                   32                 64                  3          0.000152599        138 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                 64                  5          0.00199444          93 │
│ trial_49751    RUNNING            2   rmsprop         relu                                   32                128                  5          0.000810222         51 │
│ trial_49751    RUNNING            3   adam            tanh                                   32                 32                  5          0.00337379          79 │
│ trial_49751    RUNNING            4   adam            tanh                                  128                128                  3          7.68947e-05         83 │
│ trial_49751    RUNNING            4   adam            relu                                  128                128                  5          1.32913e-05        108 │
│ trial_49751    RUNNING            2   adam            tanh                                   64                 64                  3          4.37965e-05        125 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                 32                  3          8.13867e-05        140 │
│ trial_49751    RUNNING            4   adam            tanh                                  128                128                  3          1.21494e-05         85 │
│ trial_49751    RUNNING            4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105 │
│ trial_49751    RUNNING            3   rmsprop         tanh                                   32                128                  5          0.00135033         103 │
│ trial_49751    RUNNING            4   adam            relu                                   64                 32                  3          0.00148826         119 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
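The trial table above shows 20 configurations drawn from a small hyperparameter space. A stdlib-only sketch of how that space could be reconstructed follows; the parameter names come from the table columns, while the ranges are inferred from the listed values and are assumptions, not the project's actual Ray Tune configuration (where they would map to `tune.choice` / `tune.randint` / `tune.loguniform`):

```python
import math
import random

# Hypothetical reconstruction of the search space implied by the trial table.
# Ranges are inferred from the logged values and are assumptions only.
def sample_config(rng: random.Random) -> dict:
    return {
        "N_capas": rng.choice([2, 3, 4]),
        "optimizador": rng.choice(["adam", "rmsprop"]),
        "funcion_activacion": rng.choice(["relu", "tanh"]),
        "tamanho_minilote": rng.choice([32, 64, 128]),
        "numero_filtros": rng.choice([32, 64, 128]),
        "tamanho_filtro": rng.choice([3, 5]),
        # log-uniform over the roughly 1e-5 .. 5e-3 span seen in the table
        "tasa_aprendizaje": math.exp(rng.uniform(math.log(1e-5), math.log(5e-3))),
        "epochs": rng.randint(50, 145),
    }
```

Each call yields one candidate configuration shaped like a row of the table; in the actual run Ray Tune draws 20 such configs and schedules one trial per CPU, matching the `20.0/20 CPUs` resource line.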
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m141/328[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m7s[0m 40ms/step - accuracy: 0.3849 - loss: 1.3446[32m [repeated 274x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m158/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 40ms/step - accuracy: 0.2637 - loss: 1.4647
[1m159/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 40ms/step - accuracy: 0.2638 - loss: 1.4644
[1m160/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 40ms/step - accuracy: 0.2639 - loss: 1.4642
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m17s[0m 106ms/step - accuracy: 0.2227 - loss: 2.2324
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m12s[0m 80ms/step - accuracy: 0.2352 - loss: 2.2054 
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 41ms/step - accuracy: 0.2795 - loss: 1.4411  
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 36ms/step - accuracy: 0.2783 - loss: 1.4310
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m17s[0m 105ms/step - accuracy: 0.2969 - loss: 1.4418[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510567)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m19s[0m 151ms/step - accuracy: 0.2531 - loss: 2.1489 - val_accuracy: 0.3192 - val_loss: 1.4239[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m 13/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 77ms/step - accuracy: 0.2486 - loss: 2.1514[32m [repeated 25x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 42/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 39ms/step - accuracy: 0.2755 - loss: 2.0093
[1m 44/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 38ms/step - accuracy: 0.2755 - loss: 2.0102[32m [repeated 34x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 38ms/step - accuracy: 0.2552 - loss: 2.2422  
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 39ms/step - accuracy: 0.2639 - loss: 2.1617
[1m  6/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 42ms/step - accuracy: 0.2690 - loss: 2.1302
[1m  7/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 45ms/step - accuracy: 0.2726 - loss: 2.1085
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m36s[0m 113ms/step - accuracy: 0.4062 - loss: 1.3263
[1m  2/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 56ms/step - accuracy: 0.4375 - loss: 1.2726 [32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m7s[0m 78ms/step - accuracy: 0.2574 - loss: 2.0183 - val_accuracy: 0.3420 - val_loss: 1.3900
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m8s[0m 46ms/step - accuracy: 0.2500 - loss: 1.8530 - val_accuracy: 0.3466 - val_loss: 1.3477[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m Epoch 3/140[32m [repeated 11x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 28ms/step - accuracy: 0.2387 - loss: 2.1145 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 31ms/step - accuracy: 0.2473 - loss: 2.0933
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 76/328[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m9s[0m 40ms/step - accuracy: 0.2731 - loss: 2.0276 
[1m 78/328[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m9s[0m 39ms/step - accuracy: 0.2728 - loss: 2.0288
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m62/82[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m1s[0m 63ms/step - accuracy: 0.2719 - loss: 1.9049[32m [repeated 41x across cluster][0m
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m50/82[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m3s[0m 122ms/step - accuracy: 0.2570 - loss: 2.0384[32m [repeated 159x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m66/82[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 62ms/step - accuracy: 0.2717 - loss: 1.9047
[1m67/82[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 62ms/step - accuracy: 0.2716 - loss: 1.9047[32m [repeated 43x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m7s[0m 80ms/step - accuracy: 0.2644 - loss: 1.9510 - val_accuracy: 0.3708 - val_loss: 1.3192
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 64ms/step - accuracy: 0.2773 - loss: 1.8837  
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 64ms/step - accuracy: 0.2708 - loss: 1.9013
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m55/82[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m3s[0m 123ms/step - accuracy: 0.2569 - loss: 2.0393
[1m56/82[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m3s[0m 123ms/step - accuracy: 0.2568 - loss: 2.0396[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m2s[0m 39ms/step - accuracy: 0.2824 - loss: 1.3964
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m2s[0m 39ms/step - accuracy: 0.2825 - loss: 1.3962[32m [repeated 241x across cluster][0m
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m 59/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m8s[0m 84ms/step - accuracy: 0.2531 - loss: 2.0862[32m [repeated 247x across cluster][0m
[36m(train_cnn_ray_tune pid=1510555)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m25s[0m 157ms/step - accuracy: 0.3281 - loss: 1.5272[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510555)[0m 
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 67ms/step - accuracy: 0.3164 - loss: 1.5479 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 65ms/step - accuracy: 0.3238 - loss: 1.5533
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m 92/328[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m12s[0m 54ms/step - accuracy: 0.3595 - loss: 1.3772[32m [repeated 54x across cluster][0m
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m 93/328[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m12s[0m 55ms/step - accuracy: 0.3593 - loss: 1.3773
[1m 94/328[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m12s[0m 55ms/step - accuracy: 0.3591 - loss: 1.3774[32m [repeated 33x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m3s[0m 69ms/step - accuracy: 0.2559 - loss: 2.0914
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m3s[0m 69ms/step - accuracy: 0.2559 - loss: 2.0915
[1m113/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m3s[0m 69ms/step - accuracy: 0.2559 - loss: 2.0916
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 1/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 124ms/step - accuracy: 0.2891 - loss: 1.8276
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 68ms/step - accuracy: 0.2891 - loss: 1.8323  [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m7s[0m 42ms/step - accuracy: 0.2866 - loss: 1.9501 - val_accuracy: 0.3827 - val_loss: 1.3023[32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 5/141[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510572)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m11s[0m 132ms/step - accuracy: 0.2649 - loss: 2.0844 - val_accuracy: 0.3539 - val_loss: 1.5532
[36m(train_cnn_ray_tune pid=1510555)[0m 
[1m 25/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m9s[0m 72ms/step - accuracy: 0.3281 - loss: 1.5281 
[1m 26/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m9s[0m 72ms/step - accuracy: 0.3277 - loss: 1.5282
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m46/82[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m2s[0m 66ms/step - accuracy: 0.2834 - loss: 1.8282[32m [repeated 62x across cluster][0m
[36m(train_cnn_ray_tune pid=1510572)[0m 
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 101ms/step - accuracy: 0.2513 - loss: 2.0990[32m [repeated 118x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 33ms/step - accuracy: 0.2865 - loss: 2.0117  
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m51/82[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m2s[0m 66ms/step - accuracy: 0.2827 - loss: 1.8295
[1m52/82[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m1s[0m 66ms/step - accuracy: 0.2825 - loss: 1.8296[32m [repeated 26x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 40ms/step - accuracy: 0.2977 - loss: 1.3667  
[1m  4/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 44ms/step - accuracy: 0.3034 - loss: 1.3683
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m6s[0m 70ms/step - accuracy: 0.2707 - loss: 1.9033 - val_accuracy: 0.3606 - val_loss: 1.3909
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m11/82[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 110ms/step - accuracy: 0.2778 - loss: 2.0136
[1m12/82[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 107ms/step - accuracy: 0.2763 - loss: 2.0145[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m167/328[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m8s[0m 51ms/step - accuracy: 0.3552 - loss: 1.3763
[1m168/328[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m8s[0m 52ms/step - accuracy: 0.3552 - loss: 1.3763[32m [repeated 224x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m280/328[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m1s[0m 38ms/step - accuracy: 0.2647 - loss: 2.0794[32m [repeated 224x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m7s[0m 81ms/step - accuracy: 0.2702 - loss: 1.8902 - val_accuracy: 0.3529 - val_loss: 1.3483
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 80ms/step - accuracy: 0.2793 - loss: 1.8205  
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 67ms/step - accuracy: 0.2791 - loss: 1.8256
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m 1/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m13s[0m 165ms/step - accuracy: 0.2891 - loss: 1.8059[32m [repeated 11x across cluster][0m
[36m(train_cnn_ray_tune pid=1510573)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m12s[0m 37ms/step - accuracy: 0.3229 - loss: 1.3477 
[1m  5/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 35ms/step - accuracy: 0.3187 - loss: 1.3510
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  6/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 64ms/step - accuracy: 0.2802 - loss: 2.0691[32m [repeated 30x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m 37/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 40ms/step - accuracy: 0.4638 - loss: 1.2137
[1m 39/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 40ms/step - accuracy: 0.4641 - loss: 1.2129[32m [repeated 26x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 1/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 147ms/step - accuracy: 0.2109 - loss: 1.8137
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 65ms/step - accuracy: 0.2285 - loss: 1.8058  [32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m14s[0m 43ms/step - accuracy: 0.2605 - loss: 1.8976 - val_accuracy: 0.4343 - val_loss: 1.2618[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m Epoch 3/138[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510567)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m12s[0m 140ms/step - accuracy: 0.2512 - loss: 2.1138 - val_accuracy: 0.3269 - val_loss: 1.4164[32m [repeated 3x across cluster][0m
(train_cnn_ray_tune pid=1510541)
82/82 ━━━━━━━━━━━━━━━━━━━━ 6s 75ms/step - accuracy: 0.2795 - loss: 1.8300 - val_accuracy: 0.3845 - val_loss: 1.3750
(train_cnn_ray_tune pid=1510542)
82/82 ━━━━━━━━━━━━━━━━━━━━ 6s 77ms/step - accuracy: 0.2641 - loss: 1.8651 - val_accuracy: 0.3508 - val_loss: 1.3663
(train_cnn_ray_tune pid=1510555)
164/164 ━━━━━━━━━━━━━━━━━━━━ 11s 69ms/step - accuracy: 0.3215 - loss: 1.5095 - val_accuracy: 0.5056 - val_loss: 1.1939 [repeated 8x across cluster]
(train_cnn_ray_tune pid=1510555) Epoch 4/91 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510572)
82/82 ━━━━━━━━━━━━━━━━━━━━ 10s 122ms/step - accuracy: 0.2690 - loss: 2.0448 - val_accuracy: 0.3732 - val_loss: 1.5448
(train_cnn_ray_tune pid=1510541)
82/82 ━━━━━━━━━━━━━━━━━━━━ 6s 73ms/step - accuracy: 0.2682 - loss: 1.7730 - val_accuracy: 0.4055 - val_loss: 1.3361
(train_cnn_ray_tune pid=1510569)
164/164 ━━━━━━━━━━━━━━━━━━━━ 11s 70ms/step - accuracy: 0.2577 - loss: 2.1078 - val_accuracy: 0.2837 - val_loss: 1.3974 [repeated 6x across cluster]
(train_cnn_ray_tune pid=1510569) Epoch 4/114 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510567)
82/82 ━━━━━━━━━━━━━━━━━━━━ 11s 136ms/step - accuracy: 0.2565 - loss: 2.0883 - val_accuracy: 0.3322 - val_loss: 1.4091 [repeated 3x across cluster]
(train_cnn_ray_tune pid=1510541)
82/82 ━━━━━━━━━━━━━━━━━━━━ 6s 74ms/step - accuracy: 0.2710 - loss: 1.7054 - val_accuracy: 0.4129 - val_loss: 1.3269
(train_cnn_ray_tune pid=1510570)
164/164 ━━━━━━━━━━━━━━━━━━━━ 7s 45ms/step - accuracy: 0.2922 - loss: 1.9035 - val_accuracy: 0.4101 - val_loss: 1.2739 [repeated 5x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 8/141 [repeated 7x across cluster]
(train_cnn_ray_tune pid=1510542)
82/82 ━━━━━━━━━━━━━━━━━━━━ 7s 81ms/step - accuracy: 0.2572 - loss: 1.8579 - val_accuracy: 0.3504 - val_loss: 1.3989
(train_cnn_ray_tune pid=1510572)
82/82 ━━━━━━━━━━━━━━━━━━━━ 11s 133ms/step - accuracy: 0.2817 - loss: 1.9779 - val_accuracy: 0.3796 - val_loss: 1.5508
(train_cnn_ray_tune pid=1510541)
82/82 ━━━━━━━━━━━━━━━━━━━━ 7s 80ms/step - accuracy: 0.2818 - loss: 1.6540 - val_accuracy: 0.3996 - val_loss: 1.3170
Trial status: 20 RUNNING
Current time: 2025-11-04 13:18:53. Total running time: 1min 0s
Logical resource usage: 20.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status       N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs │
├───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING            4   adam            relu                                  128                 64                  3          0.000218484         65 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                128                  5          9.98547e-05         55 │
│ trial_49751    RUNNING            4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131 │
│ trial_49751    RUNNING            3   rmsprop         relu                                   64                128                  5          0.000380128         91 │
│ trial_49751    RUNNING            4   adam            tanh                                   32                 32                  3          2.97881e-05         59 │
│ trial_49751    RUNNING            3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94 │
│ trial_49751    RUNNING            4   adam            relu                                   64                128                  3          1.6132e-05         114 │
│ trial_49751    RUNNING            2   adam            relu                                   64                 64                  3          2.51454e-05        141 │
│ trial_49751    RUNNING            3   adam            relu                                   32                 64                  3          0.000152599        138 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                 64                  5          0.00199444          93 │
│ trial_49751    RUNNING            2   rmsprop         relu                                   32                128                  5          0.000810222         51 │
│ trial_49751    RUNNING            3   adam            tanh                                   32                 32                  5          0.00337379          79 │
│ trial_49751    RUNNING            4   adam            tanh                                  128                128                  3          7.68947e-05         83 │
│ trial_49751    RUNNING            4   adam            relu                                  128                128                  5          1.32913e-05        108 │
│ trial_49751    RUNNING            2   adam            tanh                                   64                 64                  3          4.37965e-05        125 │
│ trial_49751    RUNNING            4   adam            tanh                                   64                 32                  3          8.13867e-05        140 │
│ trial_49751    RUNNING            4   adam            tanh                                  128                128                  3          1.21494e-05         85 │
│ trial_49751    RUNNING            4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105 │
│ trial_49751    RUNNING            3   rmsprop         tanh                                   32                128                  5          0.00135033         103 │
│ trial_49751    RUNNING            4   adam            relu                                   64                 32                  3          0.00148826         119 │
╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
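The trial table's hyperparameter columns (`N_capas`, `optimizador`, `funcion_activacion`, `tamanho_minilote`, `numero_filtros`, `tamanho_filtro`, `tasa_aprendizaje`, `epochs`) correspond to the Ray Tune search space being sampled. A minimal, stdlib-only sketch of how such configs could be drawn is below; the column names are taken from the table, but the value sets, the learning-rate bounds (inferred from the table's ~1e-5 to ~3e-3 spread), and the sampler itself are illustrative assumptions, not the project's actual Tune definitions (which would use `tune.choice` / `tune.loguniform`):

```python
import math
import random

# Hypothetical search space mirroring the columns of the trial table above.
SEARCH_SPACE = {
    "N_capas": [2, 3, 4],
    "optimizador": ["adam", "rmsprop"],
    "funcion_activacion": ["relu", "tanh"],
    "tamanho_minilote": [32, 64, 128],
    "numero_filtros": [32, 64, 128],
    "tamanho_filtro": [3, 5],
    "epochs": list(range(50, 151)),
}

def log_uniform(low: float, high: float) -> float:
    """Sample log-uniformly over [low, high], as tune.loguniform would."""
    return math.exp(random.uniform(math.log(low), math.log(high)))

def sample_config() -> dict:
    """Draw one trial config, analogous to what each RUNNING trial received."""
    config = {key: random.choice(values) for key, values in SEARCH_SPACE.items()}
    # Learning rates in the table span roughly 1e-5 .. 1e-2 (assumed bounds).
    config["tasa_aprendizaje"] = log_uniform(1e-5, 1e-2)
    return config
```

Each of the 20 RUNNING trials in the table is one such independent sample, which is why the mini-batch sizes (32/64/128) map directly onto the differing steps-per-epoch counts (328/164/82) seen in the worker logs.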
(train_cnn_ray_tune pid=1510542)
49/49 ━━━━━━━━━━━━━━━━━━━━ 2s 23ms/step
(train_cnn_ray_tune pid=1510564)
164/164 ━━━━━━━━━━━━━━━━━━━━ 8s 46ms/step - accuracy: 0.3142 - loss: 1.3534 - val_accuracy: 0.3645 - val_loss: 1.2865 [repeated 6x across cluster]
(train_cnn_ray_tune pid=1510567) Epoch 5/108 [repeated 11x across cluster]
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m 8/89[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m12/89[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[1m15/89[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m19/89[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[1m22/89[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m32s[0m 201ms/step - accuracy: 0.3281 - loss: 1.5627
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m8s[0m 50ms/step - accuracy: 0.3125 - loss: 1.5697  
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m26/89[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m1s[0m 17ms/step
[1m30/89[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m34/89[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 16ms/step
[1m37/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m40/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m51/82[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m1s[0m 62ms/step - accuracy: 0.2991 - loss: 1.6017[32m [repeated 34x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m43/89[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 18ms/step
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m45/89[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 19ms/step
[1m48/89[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 19ms/step
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m31/82[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m6s[0m 118ms/step - accuracy: 0.2573 - loss: 2.0314[32m [repeated 105x across cluster][0m
[36m(train_cnn_ray_tune pid=1510542)[0m 
[1m51/89[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 19ms/step
[1m54/89[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 19ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m58/82[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m1s[0m 62ms/step - accuracy: 0.2981 - loss: 1.6022
[1m59/82[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m1s[0m 62ms/step - accuracy: 0.2979 - loss: 1.6023[32m [repeated 22x across cluster][0m
(train_cnn_ray_tune pid=1510542) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510542)   _log_deprecation_warning(
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:56.738629: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`. [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:56.759923: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) WARNING: All log messages before absl::InitializeLog() is called are written to STDERR [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) E0000 00:00:1762258676.789012 1511862 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) E0000 00:00:1762258676.797401 1511862 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) W0000 00:00:1762258676.818610 1511862 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once. [repeated 76x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:56.824590: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations. [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags. [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead. [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566)   warnings.warn( [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956203: E external/local_xla/xla/stream_executor/cuda/cuda_platform.cc:51] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: CUDA_ERROR_NO_DEVICE: no CUDA-capable device is detected [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956314: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:167] env: CUDA_VISIBLE_DEVICES="" [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956323: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:170] CUDA_VISIBLE_DEVICES is set to an empty string - this hides all GPUs from CUDA [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956331: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:178] verbose logging is disabled. Rerun with verbose logging (usually --v=1 or --vmodule=cuda_diagnostics=1) to get more diagnostic output from this module [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956337: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:183] retrieving CUDA diagnostic information for host: simur-MS-7B94 [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956341: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:190] hostname: simur-MS-7B94 [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956831: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:197] libcuda reported version is: 570.133.7 [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956891: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:201] kernel reported version is: 570.133.7 [repeated 19x across cluster]
(train_cnn_ray_tune pid=1510566) 2025-11-04 13:17:59.956894: I external/local_xla/xla/stream_executor/cuda/cuda_diagnostics.cc:291] kernel version seems to match DSO: 570.133.7 [repeated 19x across cluster]
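The oneDNN notice repeated by every worker above is informational only; if bit-identical results across runs matter, the variable it names can be exported before launching the tuning script, as a minimal sketch (the launch command below is a placeholder, not the project's actual entry point):

```shell
# Turn off oneDNN custom ops so floating-point results do not vary
# with computation order (per the TensorFlow notice above).
export TF_ENABLE_ONEDNN_OPTS=0
# python tune_cnn.py   # placeholder for the actual launch command
```

Because the variable is read at TensorFlow import time, it must be set before the Python process starts, not from inside the training function.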
(train_cnn_ray_tune pid=1510567) 82/82 - 12s - 147ms/step - accuracy: 0.2517 - loss: 2.1087 - val_accuracy: 0.3371 - val_loss: 1.4140 [repeated 3x across cluster]
(train_cnn_ray_tune pid=1510542) 89/89 - 2s - 19ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:18:57. Total running time: 1min 4s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s              61.237 │
│ time_total_s                  61.237 │
│ training_iteration                 1 │
│ val_accuracy                 0.34761 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:18:57. Total running time: 1min 4s
(train_cnn_ray_tune pid=1510542) 82/82 - 7s - 85ms/step - accuracy: 0.2740 - loss: 1.8019 - val_accuracy: 0.3476 - val_loss: 1.3916
(train_cnn_ray_tune pid=1510541) 82/82 - 6s - 74ms/step - accuracy: 0.2950 - loss: 1.6031 - val_accuracy: 0.4213 - val_loss: 1.2996
(train_cnn_ray_tune pid=1510549) 164/164 - 20s - 89ms/step - accuracy: 0.2936 - loss: 1.9106 - val_accuracy: 0.3638 - val_loss: 1.4540 [repeated 7x across cluster]
(train_cnn_ray_tune pid=1510541) Epoch 9/65 [repeated 8x across cluster]
(train_cnn_ray_tune pid=1510572) 82/82 - 10s - 127ms/step - accuracy: 0.2831 - loss: 1.9517 - val_accuracy: 0.3838 - val_loss: 1.5609
(train_cnn_ray_tune pid=1510541) 82/82 - 6s - 70ms/step - accuracy: 0.2912 - loss: 1.5700 - val_accuracy: 0.4059 - val_loss: 1.2847
(train_cnn_ray_tune pid=1510565) 328/328 - 14s - 43ms/step - accuracy: 0.2652 - loss: 2.0610 - val_accuracy: 0.3588 - val_loss: 1.3682 [repeated 11x across cluster]
(train_cnn_ray_tune pid=1510541) Epoch 10/65 [repeated 14x across cluster]
[36m(train_cnn_ray_tune pid=1510573)[0m 
[1m267/328[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m1s[0m 32ms/step - accuracy: 0.3739 - loss: 1.3188
[1m268/328[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m1s[0m 32ms/step - accuracy: 0.3739 - loss: 1.3188
[1m270/328[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m1s[0m 32ms/step - accuracy: 0.3739 - loss: 1.3188
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m20s[0m 126ms/step - accuracy: 0.3281 - loss: 1.6567
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 33ms/step - accuracy: 0.3281 - loss: 1.6545  [32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 32/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 33ms/step - accuracy: 0.2624 - loss: 2.0021 
[1m 33/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 34ms/step - accuracy: 0.2617 - loss: 2.0042
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 35/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 34ms/step - accuracy: 0.2608 - loss: 2.0073 
[1m 37/328[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 34ms/step - accuracy: 0.2601 - loss: 2.0103
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m18/82[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m3s[0m 58ms/step - accuracy: 0.2566 - loss: 1.5735[32m [repeated 30x across cluster][0m
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m26/82[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m6s[0m 114ms/step - accuracy: 0.2592 - loss: 2.0365[32m [repeated 98x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m31/82[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m3s[0m 59ms/step - accuracy: 0.2632 - loss: 1.5623
[1m32/82[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m2s[0m 59ms/step - accuracy: 0.2635 - loss: 1.5619[32m [repeated 20x across cluster][0m
[36m(train_cnn_ray_tune pid=1510561)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m11s[0m 139ms/step - accuracy: 0.2971 - loss: 1.9010 - val_accuracy: 0.3599 - val_loss: 1.5343[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m22/82[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m6s[0m 114ms/step - accuracy: 0.2586 - loss: 2.0404
[1m23/82[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m6s[0m 113ms/step - accuracy: 0.2587 - loss: 2.0393[32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m121/164[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m1s[0m 39ms/step - accuracy: 0.2705 - loss: 1.6439
[1m123/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m1s[0m 39ms/step - accuracy: 0.2705 - loss: 1.6437[32m [repeated 259x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m126/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m1s[0m 39ms/step - accuracy: 0.2705 - loss: 1.6433[32m [repeated 262x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m19s[0m 123ms/step - accuracy: 0.2656 - loss: 2.0972[32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m14s[0m 88ms/step - accuracy: 0.2461 - loss: 2.0644 
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  8/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 76ms/step - accuracy: 0.2421 - loss: 2.0472[32m [repeated 37x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m12s[0m 78ms/step - accuracy: 0.2407 - loss: 2.0392
[1m  6/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m12s[0m 77ms/step - accuracy: 0.2410 - loss: 2.0427[32m [repeated 26x across cluster][0m
[36m(train_cnn_ray_tune pid=1510573)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m12s[0m 38ms/step - accuracy: 0.3594 - loss: 1.2973 
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m6s[0m 38ms/step - accuracy: 0.2770 - loss: 1.8553 - val_accuracy: 0.3648 - val_loss: 1.3224[32m [repeated 5x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m Epoch 11/94[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m6s[0m 71ms/step - accuracy: 0.2727 - loss: 1.5473 - val_accuracy: 0.3841 - val_loss: 1.2922
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m20s[0m 125ms/step - accuracy: 0.2812 - loss: 1.6625
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 30ms/step - accuracy: 0.2691 - loss: 1.7967  [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 32ms/step - accuracy: 0.3012 - loss: 1.5721 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 34ms/step - accuracy: 0.2837 - loss: 1.5833[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 62ms/step - accuracy: 0.2726 - loss: 1.5475[32m [repeated 19x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m254/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m2s[0m 35ms/step - accuracy: 0.3081 - loss: 1.5524
[1m256/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m2s[0m 35ms/step - accuracy: 0.3080 - loss: 1.5524
[1m258/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m2s[0m 35ms/step - accuracy: 0.3080 - loss: 1.5524
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m70/82[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m1s[0m 115ms/step - accuracy: 0.2615 - loss: 2.0241[32m [repeated 155x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m79/82[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 62ms/step - accuracy: 0.2722 - loss: 1.5481
[1m80/82[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 62ms/step - accuracy: 0.2724 - loss: 1.5479[32m [repeated 14x across cluster][0m
[36m(train_cnn_ray_tune pid=1510572)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m10s[0m 121ms/step - accuracy: 0.2754 - loss: 1.9567 - val_accuracy: 0.4038 - val_loss: 1.5292
(train_cnn_ray_tune pid=1510562) 82/82 - 10s 124ms/step - accuracy: 0.2617 - loss: 2.0218 - val_accuracy: 0.3652 - val_loss: 1.3767
(train_cnn_ray_tune pid=1510572) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510572)   _log_deprecation_warning(
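The warning above asks for `ray.tune.report` instead of the deprecated `ray.train.report` inside functions driven by Ray Tune. A minimal sketch of what the migrated trainable could look like; `build_model` and the data arguments are placeholders for illustration, not names taken from this project:

```python
# Hypothetical sketch of the migration the RayDeprecationWarning asks for.
# `build_model`, `X_train`, and `y_train` are placeholders, not names
# from this repository.
def train_cnn_ray_tune(config, X_train=None, y_train=None):
    from ray import tune  # was: from ray import train

    model = build_model(config)  # placeholder model factory
    history = model.fit(
        X_train, y_train,
        epochs=1,
        validation_split=0.2,
    )

    # was: train.report({"val_accuracy": ...})
    tune.report({"val_accuracy": history.history["val_accuracy"][-1]})
```

Only the reporting call changes; the metric dict keeps the same shape, so the `val_accuracy` column in the trial tables is unaffected.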

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:16. Total running time: 1min 23s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             79.8381 │
│ time_total_s                 79.8381 │
│ training_iteration                 1 │
│ val_accuracy                 0.40379 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:16. Total running time: 1min 23s
(train_cnn_ray_tune pid=1510561) 82/82 - 10s 127ms/step - accuracy: 0.2882 - loss: 1.9032 - val_accuracy: 0.3599 - val_loss: 1.5085 [repeated 2x across cluster]

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:19. Total running time: 1min 27s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             84.0283 │
│ time_total_s                 84.0283 │
│ training_iteration                 1 │
│ val_accuracy                  0.3599 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:19. Total running time: 1min 27s
(train_cnn_ray_tune pid=1510541) 82/82 - 10s 64ms/step - accuracy: 0.2884 - loss: 1.5195 - val_accuracy: 0.4006 - val_loss: 1.2789
(train_cnn_ray_tune pid=1510573) 328/328 - 11s 34ms/step - accuracy: 0.3838 - loss: 1.3090 - val_accuracy: 0.4231 - val_loss: 1.2276 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510573) Epoch 7/79 [repeated 12x across cluster]

Trial status: 17 RUNNING | 3 TERMINATED
Current time: 2025-11-04 13:19:23. Total running time: 1min 30s
Logical resource usage: 17.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              4   adam            relu                                  128                 64                  3          0.000218484         65                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                128                  5          9.98547e-05         55                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                128                  5          0.000380128         91                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   32                 32                  3          2.97881e-05         59                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              4   adam            relu                                   64                128                  3          1.6132e-05         114                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 64                  5          0.00199444          93                                              │
│ trial_49751    RUNNING              2   rmsprop         relu                                   32                128                  5          0.000810222         51                                              │
│ trial_49751    RUNNING              3   adam            tanh                                   32                 32                  5          0.00337379          79                                              │
│ trial_49751    RUNNING              4   adam            relu                                  128                128                  5          1.32913e-05        108                                              │
│ trial_49751    RUNNING              2   adam            tanh                                   64                 64                  3          4.37965e-05        125                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 32                  3          8.13867e-05        140                                              │
│ trial_49751    RUNNING              4   adam            tanh                                  128                128                  3          1.21494e-05         85                                              │
│ trial_49751    RUNNING              3   rmsprop         tanh                                   32                128                  5          0.00135033         103                                              │
│ trial_49751    RUNNING              4   adam            relu                                   64                 32                  3          0.00148826         119                                              │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
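The status table above enumerates one hyperparameter configuration per trial: number of conv layers (`N_capas`), optimizer, activation, minibatch size, filter count, filter size, learning rate, and epoch budget. A hedged, self-contained sketch of a sampler for that space; with Ray Tune these would typically be `tune.choice(...)` and `tune.loguniform(...)`, but here plain `random` stands in so the sketch runs on its own, and the exact ranges are inferred from the values visible in the table, not taken from the project's config:

```python
# Hypothetical sampler for the search space implied by the trial table.
# Discrete choices and the learning-rate / epoch ranges are inferred
# from the listed trials (lr ~ 1e-5..3e-3, epochs ~ 51..141).
import random

SEARCH_SPACE = {
    "N_capas": [2, 3, 4],
    "optimizador": ["adam", "rmsprop"],
    "funcion_activacion": ["relu", "tanh"],
    "tamanho_minilote": [32, 64, 128],
    "numero_filtros": [32, 64, 128],
    "tamanho_filtro": [3, 5],
}

def sample_config(rng=random):
    """Draw one trial configuration from the inferred search space."""
    config = {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}
    # log-uniform learning rate, covering the magnitudes seen in the table
    config["tasa_aprendizaje"] = 10 ** rng.uniform(-5, -2)
    config["epochs"] = rng.randint(50, 150)
    return config
```

Each row in the table corresponds to one such draw; Ray Tune schedules the draws across the 20 available CPUs, which is why 17 trials run concurrently.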
(train_cnn_ray_tune pid=1510562) 82/82 - 9s 114ms/step - accuracy: 0.2777 - loss: 1.9638 - val_accuracy: 0.3694 - val_loss: 1.3993
(train_cnn_ray_tune pid=1510567) 82/82 - 9s 113ms/step - accuracy: 0.2607 - loss: 2.0735 - val_accuracy: 0.3564 - val_loss: 1.4481
(train_cnn_ray_tune pid=1510541) 82/82 - 5s 60ms/step - accuracy: 0.2947 - loss: 1.5031 - val_accuracy: 0.4129 - val_loss: 1.2735
(train_cnn_ray_tune pid=1510570) 164/164 - 6s 37ms/step - accuracy: 0.3204 - loss: 1.7543 - val_accuracy: 0.4371 - val_loss: 1.2443 [repeated 7x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 13/141 [repeated 10x across cluster]
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m235/328[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m3s[0m 41ms/step - accuracy: 0.4314 - loss: 1.2472
[1m236/328[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m3s[0m 41ms/step - accuracy: 0.4314 - loss: 1.2471[32m [repeated 323x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 28ms/step - accuracy: 0.3108 - loss: 1.3450 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 29ms/step - accuracy: 0.3308 - loss: 1.3424
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m6s[0m 66ms/step - accuracy: 0.2916 - loss: 1.7351[32m [repeated 230x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m119/164[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m1s[0m 28ms/step - accuracy: 0.3077 - loss: 1.7612
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 113ms/step - accuracy: 0.2812 - loss: 1.9920[32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m17s[0m 354ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 4/49[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 18ms/step  
[1m 8/49[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 16ms/step
[36m(train_cnn_ray_tune pid=1510575)[0m 
[1m  2/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m8s[0m 53ms/step - accuracy: 0.3555 - loss: 1.3504 [32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 33ms/step - accuracy: 0.2969 - loss: 1.8727  
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m12/49[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 15ms/step
[1m16/49[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 16ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m19/49[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 16ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m23/49[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 16ms/step
[1m26/49[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 16ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m29/49[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 16ms/step
[1m32/49[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 17ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m37/49[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 16ms/step
[1m42/49[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 15ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m5s[0m 60ms/step - accuracy: 0.3000 - loss: 1.4623 - val_accuracy: 0.3961 - val_loss: 1.2733
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m47/49[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 15ms/step
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 31ms/step - accuracy: 0.5625 - loss: 0.9577
[1m  5/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 31ms/step - accuracy: 0.5384 - loss: 1.0120 
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 21ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 21ms/step
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m5s[0m 31ms/step - accuracy: 0.3073 - loss: 1.7635 - val_accuracy: 0.4473 - val_loss: 1.2358[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 14/141[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 1/89[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 45ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m 5/89[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 16ms/step
[1m10/89[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m 22/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 33ms/step - accuracy: 0.5392 - loss: 1.0374
[1m 24/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 32ms/step - accuracy: 0.5387 - loss: 1.0385 
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m14/89[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m1s[0m 14ms/step
[1m19/89[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 72ms/step - accuracy: 0.2969 - loss: 1.7947
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 28ms/step - accuracy: 0.3073 - loss: 1.8164 
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m24/89[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m0s[0m 13ms/step
[1m28/89[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 13ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m32/89[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 14ms/step
[1m36/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
[36m(train_cnn_ray_tune pid=1510565)[0m   _log_deprecation_warning(
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m40/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m43/89[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 14ms/step
[1m48/89[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m23/82[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m3s[0m 52ms/step - accuracy: 0.2957 - loss: 1.4399[32m [repeated 82x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m1s[0m 26ms/step - accuracy: 0.3224 - loss: 1.6985
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m1s[0m 26ms/step - accuracy: 0.3225 - loss: 1.6983
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m1s[0m 26ms/step - accuracy: 0.3225 - loss: 1.6982
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m51/89[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 14ms/step
[1m54/89[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510562)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m9s[0m 103ms/step - accuracy: 0.2621 - loss: 2.0333 - val_accuracy: 0.3732 - val_loss: 1.4189
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m58/89[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 2/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 56ms/step - accuracy: 0.2793 - loss: 1.4433 [32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 43ms/step - accuracy: 0.2561 - loss: 2.0807  
[1m  4/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 46ms/step - accuracy: 0.2546 - loss: 2.0825
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m62/89[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m67/89[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m26/82[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m2s[0m 51ms/step - accuracy: 0.2957 - loss: 1.4401
[1m27/82[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m2s[0m 51ms/step - accuracy: 0.2958 - loss: 1.4401[32m [repeated 32x across cluster][0m
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m73/89[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 14ms/step
[1m78/89[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 14ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m83/89[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 13ms/step
[1m89/89[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 13ms/step
[36m(train_cnn_ray_tune pid=1510565)[0m 
[1m89/89[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 14ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:32. Total running time: 1min 39s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             96.2637 │
│ time_total_s                 96.2637 │
│ training_iteration                 1 │
│ val_accuracy                  0.3585 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:32. Total running time: 1min 39s
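The completed-epoch lines in this log carry the only durable signal (accuracy, loss, val_accuracy, val_loss); they can be reduced to metric dictionaries with a small regex. A minimal stdlib sketch — the field names are taken from the log lines themselves, everything else (function name, sample line) is illustrative:

```python
import re

# Keras prints metrics as "name: value" pairs; timings such as
# "9s" or "113ms/step" carry no colon and are therefore skipped.
METRIC_RE = re.compile(r"(\w+): ([0-9.]+)")

def parse_keras_metrics(line: str) -> dict:
    """Extract the metric name/value pairs from one Keras progress line."""
    return {name: float(value) for name, value in METRIC_RE.findall(line)}

line = ("82/82 - 9s 113ms/step - accuracy: 0.2607 - loss: 2.0735"
        " - val_accuracy: 0.3564 - val_loss: 1.4481")
metrics = parse_keras_metrics(line)
# metrics["val_accuracy"] == 0.3564
```

Epoch markers like `Epoch 13/141` contain no colon-separated pairs and parse to an empty dict, so the same function can be run over every line of the log unconditionally.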
Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:34. Total running time: 1min 41s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             98.8178 │
│ time_total_s                 98.8178 │
│ training_iteration                 1 │
│ val_accuracy                 0.37324 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:34. Total running time: 1min 41s
(train_cnn_ray_tune pid=1510575) 164/164 - 7s 44ms/step - accuracy: 0.3914 - loss: 1.2945 - val_accuracy: 0.4860 - val_loss: 1.1997 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510575) Epoch 9/93 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 4s 52ms/step - accuracy: 0.2975 - loss: 1.4424 - val_accuracy: 0.3813 - val_loss: 1.2759 [repeated 2x across cluster]

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:36. Total running time: 1min 44s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             100.985 │
│ time_total_s                 100.985 │
│ training_iteration                 1 │
│ val_accuracy                 0.36131 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:36. Total running time: 1min 44s
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m47/82[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m1s[0m 41ms/step - accuracy: 0.3006 - loss: 1.4420
[1m49/82[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m1s[0m 41ms/step - accuracy: 0.3005 - loss: 1.4422[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.3079 - loss: 1.6915
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m3s[0m 25ms/step - accuracy: 0.3065 - loss: 1.6942[32m [repeated 312x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.2986 - loss: 1.5056 
[1m  6/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.2761 - loss: 1.5258[32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 23ms/step - accuracy: 0.3143 - loss: 1.7405[32m [repeated 218x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 40ms/step - accuracy: 0.2917 - loss: 1.4153 
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 114ms/step - accuracy: 0.2969 - loss: 2.0442
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 43ms/step - accuracy: 0.2691 - loss: 2.0599  [32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510573)[0m 
[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m27s[0m 83ms/step - accuracy: 0.3438 - loss: 1.2808[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510555)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 40ms/step - accuracy: 0.4731 - loss: 1.1661 [32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510567)[0m 
[1m87/89[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 15ms/step
[1m89/89[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 15ms/step[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m  3/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10s[0m 32ms/step - accuracy: 0.5399 - loss: 0.9400
[1m  5/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 29ms/step - accuracy: 0.5496 - loss: 0.9529 
[36m(train_cnn_ray_tune pid=1510574)[0m 
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m10s[0m 30ms/step - accuracy: 0.5444 - loss: 1.0294 - val_accuracy: 0.5727 - val_loss: 1.0231[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510574)[0m Epoch 8/51[32m [repeated 11x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m76/82[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 40ms/step - accuracy: 0.3054 - loss: 1.4217[32m [repeated 36x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m4s[0m 47ms/step - accuracy: 0.3005 - loss: 1.4415 - val_accuracy: 0.3862 - val_loss: 1.2697
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m115/164[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m1s[0m 22ms/step - accuracy: 0.3304 - loss: 1.6559
[1m118/164[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m0s[0m 22ms/step - accuracy: 0.3303 - loss: 1.6558
[1m120/164[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m0s[0m 22ms/step - accuracy: 0.3302 - loss: 1.6557[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m80/82[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 41ms/step - accuracy: 0.3053 - loss: 1.4215
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 40ms/step - accuracy: 0.3053 - loss: 1.4215[32m [repeated 11x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m4s[0m 47ms/step - accuracy: 0.3053 - loss: 1.4214 - val_accuracy: 0.3950 - val_loss: 1.2682
[36m(train_cnn_ray_tune pid=1510575)[0m 
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 37ms/step - accuracy: 0.3759 - loss: 1.2565  
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m 3/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 40ms/step - accuracy: 0.2756 - loss: 1.4543 
[1m 4/82[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 44ms/step - accuracy: 0.2795 - loss: 1.4482
[36m(train_cnn_ray_tune pid=1510566)[0m 
[1m324/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 31ms/step - accuracy: 0.4362 - loss: 1.2383
[1m326/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 31ms/step - accuracy: 0.4362 - loss: 1.2382[32m [repeated 290x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  4/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 18ms/step - accuracy: 0.3415 - loss: 1.5666 
[1m  7/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 18ms/step - accuracy: 0.3360 - loss: 1.5866[32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m1s[0m 40ms/step - accuracy: 0.2689 - loss: 1.9785[32m [repeated 254x across cluster][0m
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m15s[0m 93ms/step - accuracy: 0.2969 - loss: 1.7436
[1m  3/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 44ms/step - accuracy: 0.3264 - loss: 1.6515 [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m11s[0m 73ms/step - accuracy: 0.3281 - loss: 1.3543[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  4/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 19ms/step - accuracy: 0.3258 - loss: 1.6151 [32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m5s[0m 30ms/step - accuracy: 0.2601 - loss: 1.5307 - val_accuracy: 0.3662 - val_loss: 1.2924[32m [repeated 12x across cluster][0m
[36m(train_cnn_ray_tune pid=1510563)[0m Epoch 15/140[32m [repeated 12x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m16s[0m 346ms/step
[1m 6/49[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 12ms/step  
[36m(train_cnn_ray_tune pid=1510569)[0m 
[1m11/49[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 12ms/step
[1m16/49[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 12ms/step
[36m(train_cnn_ray_tune pid=1510569)[0m /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454[32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510569)[0m   _log_deprecation_warning([32m [repeated 2x across cluster][0m
(train_cnn_ray_tune pid=1510541) 82/82 - 4s 44ms/step - accuracy: 0.3039 - loss: 1.4190 - val_accuracy: 0.3926 - val_loss: 1.2682
(train_cnn_ray_tune pid=1510569) 49/49 - 1s 19ms/step
(train_cnn_ray_tune pid=1510569) 89/89 - 1s 13ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:48. Total running time: 1min 55s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s              112.29 │
│ time_total_s                  112.29 │
│ training_iteration                 1 │
│ val_accuracy                 0.32725 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:48. Total running time: 1min 55s
(train_cnn_ray_tune pid=1510541) 82/82 - 4s 43ms/step - accuracy: 0.3132 - loss: 1.3914 - val_accuracy: 0.4034 - val_loss: 1.2654
(train_cnn_ray_tune pid=1510563) 164/164 - 5s 28ms/step - accuracy: 0.2717 - loss: 1.5041 - val_accuracy: 0.3718 - val_loss: 1.2920 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510563) Epoch 16/140 [repeated 12x across cluster]

Trial status: 13 RUNNING | 7 TERMINATED
Current time: 2025-11-04 13:19:53. Total running time: 2min 0s
Logical resource usage: 13.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              4   adam            relu                                  128                 64                  3          0.000218484         65                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                128                  5          9.98547e-05         55                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                128                  5          0.000380128         91                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 64                  5          0.00199444          93                                              │
│ trial_49751    RUNNING              2   rmsprop         relu                                   32                128                  5          0.000810222         51                                              │
│ trial_49751    RUNNING              3   adam            tanh                                   32                 32                  5          0.00337379          79                                              │
│ trial_49751    RUNNING              2   adam            tanh                                   64                 64                  3          4.37965e-05        125                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 32                  3          8.13867e-05        140                                              │
│ trial_49751    RUNNING              3   rmsprop         tanh                                   32                128                  5          0.00135033         103                                              │
│ trial_49751    RUNNING              4   adam            relu                                   64                 32                  3          0.00148826         119                                              │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
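The trial table above implies the hyperparameter search space: the column names and discrete value sets can be read straight off the rows, while the learning-rate and epoch ranges are guesses from the sampled values, not the real Tune config. In Ray Tune this would be written with `tune.choice`, `tune.loguniform`, and `tune.randint`; the sketch below mimics it with the standard `random` module so it is self-contained.

```python
# Hedged reconstruction of the search space implied by the trial table.
# Value sets come from the table columns; the loguniform bounds for
# tasa_aprendizaje and the epoch range are assumptions, not the real config.
import math
import random

def sample_config(rng=random):
    return {
        "N_capas": rng.choice([2, 3, 4]),
        "optimizador": rng.choice(["adam", "rmsprop"]),
        "funcion_activacion": rng.choice(["relu", "tanh"]),
        "tamanho_minilote": rng.choice([32, 64, 128]),
        "numero_filtros": rng.choice([32, 64, 128]),
        "tamanho_filtro": rng.choice([3, 5]),
        # stand-in for tune.loguniform(1e-5, 3e-3)
        "tasa_aprendizaje": math.exp(rng.uniform(math.log(1e-5), math.log(3e-3))),
        # stand-in for tune.randint over the epoch counts seen in the table
        "epochs": rng.randint(50, 141),
    }

config = sample_config()
```

Sampling the learning rate in log space matches the spread seen in the table (values from ~1.3e-5 up to ~3.4e-3 across trials).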
(train_cnn_ray_tune pid=1510541) 82/82 - 4s 42ms/step - accuracy: 0.3213 - loss: 1.3875 - val_accuracy: 0.4199 - val_loss: 1.2623
(train_cnn_ray_tune pid=1510566) 328/328 - 10s 31ms/step - accuracy: 0.4406 - loss: 1.2149 - val_accuracy: 0.4779 - val_loss: 1.2067 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510566) Epoch 8/103 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510563) 49/49 - 1s 16ms/step
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 38ms/step - accuracy: 0.3173 - loss: 1.3800 - val_accuracy: 0.4091 - val_loss: 1.2604
(train_cnn_ray_tune pid=1510563) 89/89 - 1s 9ms/step
(train_cnn_ray_tune pid=1510563) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510563)   _log_deprecation_warning(

Trial trial_49751 finished iteration 1 at 2025-11-04 13:19:57. Total running time: 2min 4s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             121.581 │
│ time_total_s                 121.581 │
│ training_iteration                 1 │
│ val_accuracy                 0.35218 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:19:57. Total running time: 2min 4s
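Each trial above reports `val_accuracy` once and then completes after a single iteration, so choosing a winner reduces to a max over the reported results. With Ray Tune this is `results.get_best_result(metric="val_accuracy", mode="max")` on the returned `ResultGrid`; the plain helper below mimics that for the terminated trials in the status table. The trial labels are shorthand invented here, not real trial names.

```python
# Hedged sketch: pick the best trial by reported val_accuracy, mimicking
# ResultGrid.get_best_result(metric="val_accuracy", mode="max").
def best_trial(results):
    """Return the (label, metrics) pair with the highest val_accuracy."""
    return max(results, key=lambda r: r[1]["val_accuracy"])

# Three TERMINATED rows from the status table, with invented shorthand labels.
terminated = [
    ("rmsprop/tanh/128", {"val_accuracy": 0.359902}),
    ("adam/tanh/128",    {"val_accuracy": 0.403792}),
    ("adam/relu/64",     {"val_accuracy": 0.327247}),
]
name, metrics = best_trial(terminated)  # → "adam/tanh/128"
```

Of the terminated trials visible so far, the adam/tanh configuration with batch size 128 leads at val_accuracy ≈ 0.404, consistent with the table.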
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 33ms/step - accuracy: 0.3111 - loss: 1.3811 - val_accuracy: 0.4112 - val_loss: 1.2592
(train_cnn_ray_tune pid=1510564) 164/164 - 4s 21ms/step - accuracy: 0.3969 - loss: 1.2824 - val_accuracy: 0.5011 - val_loss: 1.1937 [repeated 13x across cluster]
(train_cnn_ray_tune pid=1510564) Epoch 19/119 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 33ms/step - accuracy: 0.3192 - loss: 1.3677 - val_accuracy: 0.4280 - val_loss: 1.2561
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 36ms/step - accuracy: 0.3208 - loss: 1.3695 - val_accuracy: 0.4287 - val_loss: 1.2561
(train_cnn_ray_tune pid=1510543) 164/164 - 3s 18ms/step - accuracy: 0.3328 - loss: 1.5564 - val_accuracy: 0.4617 - val_loss: 1.2300 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510543) Epoch 24/125 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 39ms/step - accuracy: 0.3429 - loss: 1.3527 - val_accuracy: 0.4161 - val_loss: 1.2529
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m 16/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 20ms/step - accuracy: 0.3990 - loss: 1.2586
[1m 19/164[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 20ms/step - accuracy: 0.4005 - loss: 1.2586[32m [repeated 278x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m  4/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.3799 - loss: 1.2554 
[1m  7/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.3956 - loss: 1.2492[32m [repeated 5x across cluster][0m
(train_cnn_ray_tune pid=1510568) 164/164 - 4s 22ms/step - accuracy: 0.2945 - loss: 1.6353 - val_accuracy: 0.3697 - val_loss: 1.2645 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510568) Epoch 25/94 [repeated 11x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 39ms/step - accuracy: 0.3288 - loss: 1.3573 - val_accuracy: 0.4091 - val_loss: 1.2517
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 37ms/step - accuracy: 0.3371 - loss: 1.3490 - val_accuracy: 0.4368 - val_loss: 1.2452
(train_cnn_ray_tune pid=1510543) 164/164 - 3s 20ms/step - accuracy: 0.3300 - loss: 1.5432 - val_accuracy: 0.4617 - val_loss: 1.2278 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510543) Epoch 27/125 [repeated 14x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 4s 43ms/step - accuracy: 0.3444 - loss: 1.3398 - val_accuracy: 0.4414 - val_loss: 1.2443
(train_cnn_ray_tune pid=1510555) 49/49 - 0s 14ms/step
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 35ms/step - accuracy: 0.3418 - loss: 1.3387 - val_accuracy: 0.4579 - val_loss: 1.2422
(train_cnn_ray_tune pid=1510570) 164/164 - 3s 19ms/step - accuracy: 0.3310 - loss: 1.5695 - val_accuracy: 0.4789 - val_loss: 1.1873 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 28/141 [repeated 13x across cluster]
(train_cnn_ray_tune pid=1510555) 49/49 - 1s 15ms/step
(train_cnn_ray_tune pid=1510555) 89/89 - 1s 11ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:20:22. Total running time: 2min 29s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             146.211 │
│ time_total_s                 146.211 │
│ training_iteration                 1 │
│ val_accuracy                 0.58251 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:20:22. Total running time: 2min 29s
(train_cnn_ray_tune pid=1510555) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510555)   _log_deprecation_warning(
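The warning above asks for a one-line migration inside the trainable. A minimal sketch of that change, assuming a Ray Tune function trainable like this run's `train_cnn_ray_tune` (the helper name `report_metrics` is hypothetical, and the import is guarded so the module loads even where Ray is absent):

```python
def report_metrics(metrics: dict) -> None:
    """Report per-iteration metrics from inside a Tune function trainable.

    Deprecated pattern:  from ray import train; train.report(metrics)
    Current pattern:     from ray import tune;  tune.report(metrics)
    (see https://github.com/ray-project/ray/issues/49454)
    """
    try:
        from ray import tune  # new, non-deprecated location of report()
        tune.report(metrics)
    except ImportError:
        # Ray is not installed in this environment; nothing to report.
        pass
```

Inside the trainable, an end-of-epoch call such as `report_metrics({"val_accuracy": val_acc})` replaces the deprecated `ray.train.report` call and silences the warning.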

Trial status: 11 RUNNING | 9 TERMINATED
Current time: 2025-11-04 13:20:23. Total running time: 2min 30s
Logical resource usage: 11.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              4   adam            relu                                  128                 64                  3          0.000218484         65                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                128                  5          9.98547e-05         55                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 64                  5          0.00199444          93                                              │
│ trial_49751    RUNNING              2   rmsprop         relu                                   32                128                  5          0.000810222         51                                              │
│ trial_49751    RUNNING              3   adam            tanh                                   32                 32                  5          0.00337379          79                                              │
│ trial_49751    RUNNING              2   adam            tanh                                   64                 64                  3          4.37965e-05        125                                              │
│ trial_49751    RUNNING              3   rmsprop         tanh                                   32                128                  5          0.00135033         103                                              │
│ trial_49751    RUNNING              4   adam            relu                                   64                 32                  3          0.00148826         119                                              │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 36ms/step - accuracy: 0.3486 - loss: 1.3387 - val_accuracy: 0.4666 - val_loss: 1.2376
(train_cnn_ray_tune pid=1510574) 328/328 - 7s 21ms/step - accuracy: 0.5958 - loss: 0.9592 - val_accuracy: 0.5751 - val_loss: 1.0162 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510574) Epoch 14/51 [repeated 10x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 34ms/step - accuracy: 0.3521 - loss: 1.3346 - val_accuracy: 0.4747 - val_loss: 1.2300
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 34ms/step - accuracy: 0.3451 - loss: 1.3356 - val_accuracy: 0.4772 - val_loss: 1.2290
(train_cnn_ray_tune pid=1510570) 164/164 - 3s 19ms/step - accuracy: 0.3432 - loss: 1.5521 - val_accuracy: 0.4877 - val_loss: 1.1803 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 31/141 [repeated 11x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 33ms/step - accuracy: 0.3629 - loss: 1.3146 - val_accuracy: 0.4860 - val_loss: 1.2200
(train_cnn_ray_tune pid=1510541) 82/82 - 3s 35ms/step - accuracy: 0.3614 - loss: 1.3264 - val_accuracy: 0.4831 - val_loss: 1.2193
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m3s[0m 20ms/step - accuracy: 0.2948 - loss: 1.5453 - val_accuracy: 0.4010 - val_loss: 1.2518[32m [repeated 13x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m Epoch 32/94[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 17ms/step - accuracy: 0.3380 - loss: 1.5189 [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m17/82[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m1s[0m 30ms/step - accuracy: 0.3762 - loss: 1.3032[32m [repeated 21x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m67/82[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 30ms/step - accuracy: 0.3720 - loss: 1.3073
[1m69/82[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 30ms/step - accuracy: 0.3719 - loss: 1.3073[32m [repeated 26x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m3s[0m 34ms/step - accuracy: 0.3717 - loss: 1.3074 - val_accuracy: 0.4933 - val_loss: 1.2119
(train_cnn_ray_tune pid=1510573) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510573)   _log_deprecation_warning(
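The RayDeprecationWarning above asks for a one-line migration inside the trainable: report metrics through `ray.tune.report` instead of `ray.train.report` when the function is passed to Ray Tune. A minimal sketch of that change — the trainable body and the metric value are placeholders, not this project's actual code, and the import guard only lets the sketch load without Ray installed:

```python
# Sketch of the migration requested by the warning above.
# The trainable body and metric value below are placeholders.
try:
    from ray import tune  # preferred import for code passed to Ray Tune
    report = tune.report
except ImportError:  # allow this sketch to load without Ray installed
    def report(metrics):
        return metrics

def train_cnn_ray_tune(config):
    # ... build and fit the Keras model from `config` here ...
    val_accuracy = 0.0  # placeholder for the real validation metric
    # Old (deprecated): ray.train.report({"val_accuracy": val_accuracy})
    report({"val_accuracy": val_accuracy})
```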
(train_cnn_ray_tune pid=1510573) 49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 10ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:20:40. Total running time: 2min 47s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             164.172 │
│ time_total_s                 164.172 │
│ training_iteration                 1 │
│ val_accuracy                 0.49368 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:20:40. Total running time: 2min 47s

Trial trial_49751 finished iteration 1 at 2025-11-04 13:20:40. Total running time: 2min 47s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             164.546 │
│ time_total_s                 164.546 │
│ training_iteration                 1 │
│ val_accuracy                 0.57303 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:20:40. Total running time: 2min 47s
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 29ms/step - accuracy: 0.3712 - loss: 1.3162 - val_accuracy: 0.4954 - val_loss: 1.2066
(train_cnn_ray_tune pid=1510543) 164/164 ━━━━━━━━━━━━━━━━━━━━ 3s 16ms/step - accuracy: 0.3491 - loss: 1.4512 - val_accuracy: 0.4603 - val_loss: 1.2212 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510541) Epoch 36/65 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 26ms/step - accuracy: 0.3789 - loss: 1.3073 - val_accuracy: 0.5091 - val_loss: 1.1965
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 25ms/step - accuracy: 0.3859 - loss: 1.2940 - val_accuracy: 0.5109 - val_loss: 1.1896
(train_cnn_ray_tune pid=1510575) 164/164 ━━━━━━━━━━━━━━━━━━━━ 4s 22ms/step - accuracy: 0.4351 - loss: 1.2200 - val_accuracy: 0.5021 - val_loss: 1.1767 [repeated 13x across cluster]
(train_cnn_ray_tune pid=1510575) Epoch 23/93 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 26ms/step - accuracy: 0.3931 - loss: 1.2928 - val_accuracy: 0.5151 - val_loss: 1.1740
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 26ms/step - accuracy: 0.4016 - loss: 1.2828 - val_accuracy: 0.5225 - val_loss: 1.1731
(train_cnn_ray_tune pid=1510541) 82/82 ━━━━━━━━━━━━━━━━━━━━ 2s 25ms/step - accuracy: 0.4005 - loss: 1.2834 - val_accuracy: 0.5158 - val_loss: 1.1700
(train_cnn_ray_tune pid=1510570) 164/164 ━━━━━━━━━━━━━━━━━━━━ 2s 14ms/step - accuracy: 0.3466 - loss: 1.4851 - val_accuracy: 0.4961 - val_loss: 1.1660 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510541) Epoch 41/65 [repeated 15x across cluster]

Trial status: 9 RUNNING | 11 TERMINATED
Current time: 2025-11-04 13:20:53. Total running time: 3min 0s
Logical resource usage: 9.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              4   adam            relu                                  128                 64                  3          0.000218484         65                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                128                  5          9.98547e-05         55                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                 64                  5          0.00199444          93                                              │
│ trial_49751    RUNNING              2   adam            tanh                                   64                 64                  3          4.37965e-05        125                                              │
│ trial_49751    RUNNING              3   rmsprop         tanh                                   32                128                  5          0.00135033         103                                              │
│ trial_49751    RUNNING              4   adam            relu                                   64                 32                  3          0.00148826         119                                              │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           2   rmsprop         relu                                   32                128                  5          0.000810222         51        1           164.546          0.573034 │
│ trial_49751    TERMINATED           3   adam            tanh                                   32                 32                  5          0.00337379          79        1           164.172          0.49368  │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
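The `tasa_aprendizaje` column in the table above spans roughly 1.2e-5 to 3.4e-3 with values spread across each decade, which is consistent with a log-uniform prior over the learning rate (what Ray Tune's `tune.loguniform` produces). The exact search space is an assumption not shown in this log; a stdlib-only sketch of log-uniform sampling, with a hypothetical 1e-5 to 1e-2 range:

```python
import math
import random

def sample_loguniform(low, high, rng):
    """Draw log-uniformly from [low, high], i.e. uniformly in log space,
    in the manner of Ray Tune's tune.loguniform."""
    return math.exp(rng.uniform(math.log(low), math.log(high)))

rng = random.Random(0)
lrs = [sample_loguniform(1e-5, 1e-2, rng) for _ in range(1000)]
# Every draw stays in range, and each of the three decades
# (1e-5..1e-4, 1e-4..1e-3, 1e-3..1e-2) receives roughly a third of them.
assert all(1e-5 <= lr <= 1e-2 for lr in lrs)
```

Sampling uniformly in log space is what gives trials in the table learning rates like 1.3e-5 and 2.0e-3 with similar frequency, rather than crowding every draw toward the top of the range as a plain uniform distribution would.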
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 25ms/step - accuracy: 0.4070 - loss: 1.2763 - val_accuracy: 0.5242 - val_loss: 1.1495
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 14/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 13ms/step - accuracy: 0.2746 - loss: 1.5647
[1m 18/164[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 13ms/step - accuracy: 0.2757 - loss: 1.5647[32m [repeated 238x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 26/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 13ms/step - accuracy: 0.4336 - loss: 1.2641[32m [repeated 80x across cluster][0m
[36m(train_cnn_ray_tune pid=1510575)[0m 
[1m  5/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 16ms/step - accuracy: 0.4449 - loss: 1.1812 
[1m  8/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 17ms/step - accuracy: 0.4359 - loss: 1.1886[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m19s[0m 60ms/step - accuracy: 0.4062 - loss: 1.3298
[1m  6/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 12ms/step - accuracy: 0.4444 - loss: 1.2727 [32m [repeated 5x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 54/164[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m1s[0m 12ms/step - accuracy: 0.3057 - loss: 1.4887
[1m 58/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m1s[0m 12ms/step - accuracy: 0.3053 - loss: 1.4895
[1m 62/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m1s[0m 12ms/step - accuracy: 0.3048 - loss: 1.4905[32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 60ms/step - accuracy: 0.3281 - loss: 1.3604[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m82/82[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 27ms/step - accuracy: 0.4092 - loss: 1.2756 - val_accuracy: 0.5316 - val_loss: 1.1478
[36m(train_cnn_ray_tune pid=1510564)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m3s[0m 17ms/step - accuracy: 0.4631 - loss: 1.1766 - val_accuracy: 0.5298 - val_loss: 1.0910[32m [repeated 12x across cluster][0m
[36m(train_cnn_ray_tune pid=1510564)[0m Epoch 35/119[32m [repeated 14x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m 12/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 10ms/step - accuracy: 0.3277 - loss: 1.4183
[1m 17/164[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 10ms/step - accuracy: 0.3325 - loss: 1.4087
[36m(train_cnn_ray_tune pid=1510541)[0m 
[1m80/82[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 21ms/step - accuracy: 0.4149 - loss: 1.2576[32m [repeated 20x across cluster][0m
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 24ms/step - accuracy: 0.4152 - loss: 1.2576 - val_accuracy: 0.5267 - val_loss: 1.1345
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 27ms/step - accuracy: 0.4173 - loss: 1.2521 - val_accuracy: 0.5302 - val_loss: 1.1329
(train_cnn_ray_tune pid=1510543) 164/164 - 2s 14ms/step - accuracy: 0.3495 - loss: 1.4060 - val_accuracy: 0.4628 - val_loss: 1.2185 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510543) Epoch 44/125 [repeated 17x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 25ms/step - accuracy: 0.4324 - loss: 1.2488 - val_accuracy: 0.5256 - val_loss: 1.1206
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 25ms/step - accuracy: 0.4321 - loss: 1.2409 - val_accuracy: 0.5298 - val_loss: 1.1208
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 26ms/step - accuracy: 0.4373 - loss: 1.2351 - val_accuracy: 0.5337 - val_loss: 1.1123
(train_cnn_ray_tune pid=1510568) 164/164 - 3s 15ms/step - accuracy: 0.3080 - loss: 1.4684 - val_accuracy: 0.4508 - val_loss: 1.2380 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510568) Epoch 44/94 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 26ms/step - accuracy: 0.4375 - loss: 1.2276 - val_accuracy: 0.5428 - val_loss: 1.1141
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 25ms/step - accuracy: 0.4469 - loss: 1.2159 - val_accuracy: 0.5284 - val_loss: 1.1060
(train_cnn_ray_tune pid=1510568) 164/164 - 2s 15ms/step - accuracy: 0.3179 - loss: 1.4422 - val_accuracy: 0.4452 - val_loss: 1.2383 [repeated 13x across cluster]
(train_cnn_ray_tune pid=1510568) Epoch 46/94 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 26ms/step - accuracy: 0.4503 - loss: 1.2140 - val_accuracy: 0.5365 - val_loss: 1.0998
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 26ms/step - accuracy: 0.4454 - loss: 1.2051 - val_accuracy: 0.5439 - val_loss: 1.0965
(train_cnn_ray_tune pid=1510570) 164/164 - 2s 13ms/step - accuracy: 0.3716 - loss: 1.4126 - val_accuracy: 0.5239 - val_loss: 1.1483 [repeated 13x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 50/141 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 25ms/step - accuracy: 0.4514 - loss: 1.2058 - val_accuracy: 0.5460 - val_loss: 1.0914
(train_cnn_ray_tune pid=1510564) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510564)   _log_deprecation_warning(
(train_cnn_ray_tune pid=1510575) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510575)   _log_deprecation_warning(
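The RayDeprecationWarning above comes from calling `ray.train.report` inside a function passed to Ray Tune. A minimal sketch of the migration the warning asks for; the trainable body and the `run_one_epoch` helper are hypothetical stand-ins (the project's real trainable is not shown in this log), and only the `ray.tune.report` call is taken from the warning itself:

```python
# Hypothetical Tune trainable illustrating the suggested migration:
# report metrics through ray.tune.report instead of ray.train.report.
def train_cnn_ray_tune(config):
    from ray import tune  # was: from ray import train

    for _ in range(config["epochs"]):
        val_accuracy = run_one_epoch(config)  # hypothetical per-epoch training step
        # was: train.report({"val_accuracy": val_accuracy})
        tune.report({"val_accuracy": val_accuracy})
```

Only the import and the `report` call change; the metrics dict keeps the same shape, so the reported `val_accuracy` still drives trial scheduling as before.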
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 25ms/step - accuracy: 0.4675 - loss: 1.1935 - val_accuracy: 0.5425 - val_loss: 1.0900
(train_cnn_ray_tune pid=1510564) 49/49 - 1s 11ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:20. Total running time: 3min 27s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             204.577 │
│ time_total_s                 204.577 │
│ training_iteration                 1 │
│ val_accuracy                 0.53055 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:20. Total running time: 3min 27s

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:20. Total running time: 3min 27s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             204.732 │
│ time_total_s                 204.732 │
│ training_iteration                 1 │
│ val_accuracy                 0.49403 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:20. Total running time: 3min 27s
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 22ms/step - accuracy: 0.4650 - loss: 1.1859 - val_accuracy: 0.5421 - val_loss: 1.0826
(train_cnn_ray_tune pid=1510566) 328/328 - 6s 17ms/step - accuracy: 0.4743 - loss: 1.1542 - val_accuracy: 0.5383 - val_loss: 1.1474 [repeated 11x across cluster]
(train_cnn_ray_tune pid=1510566) Epoch 19/103 [repeated 12x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 22ms/step - accuracy: 0.4579 - loss: 1.1920 - val_accuracy: 0.5502 - val_loss: 1.0839

Trial status: 7 RUNNING | 13 TERMINATED
Current time: 2025-11-04 13:21:23. Total running time: 3min 30s
Logical resource usage: 7.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              4   adam            relu                                  128                 64                  3          0.000218484         65                                              │
│ trial_49751    RUNNING              4   adam            tanh                                   64                128                  5          9.98547e-05         55                                              │
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    RUNNING              2   adam            tanh                                   64                 64                  3          4.37965e-05        125                                              │
│ trial_49751    RUNNING              3   rmsprop         tanh                                   32                128                  5          0.00135033         103                                              │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 64                  5          0.00199444          93        1           204.732          0.494031 │
│ trial_49751    TERMINATED           2   rmsprop         relu                                   32                128                  5          0.000810222         51        1           164.546          0.573034 │
│ trial_49751    TERMINATED           3   adam            tanh                                   32                 32                  5          0.00337379          79        1           164.172          0.49368  │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                 32                  3          0.00148826         119        1           204.577          0.530548 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
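The hyperparameter columns in the status table imply the search space sketched below. This is a reconstruction inferred from the observed trial values, not the project's actual Tune config: the categorical choices are read off the table, while the learning-rate and epoch bounds are assumptions chosen to cover the sampled values.

```python
# Search space inferred from the trial table above (hypothetical reconstruction).
search_space = {
    "N_capas": [2, 3, 4],                 # number of convolutional layers
    "optimizador": ["adam", "rmsprop"],   # optimizer
    "funcion_activacion": ["relu", "tanh"],
    "tamanho_minilote": [32, 64, 128],    # mini-batch size
    "numero_filtros": [32, 64, 128],      # filters per conv layer
    "tamanho_filtro": [3, 5],             # kernel size
    "tasa_aprendizaje": (1e-5, 1e-2),     # learning rate, assumed log-uniform bounds
    "epochs": (50, 150),                  # assumed epoch-budget bounds
}

def in_space(trial):
    """Check that a trial row from the table falls inside the inferred space."""
    lr_lo, lr_hi = search_space["tasa_aprendizaje"]
    ep_lo, ep_hi = search_space["epochs"]
    return (
        trial["N_capas"] in search_space["N_capas"]
        and trial["optimizador"] in search_space["optimizador"]
        and trial["funcion_activacion"] in search_space["funcion_activacion"]
        and trial["tamanho_minilote"] in search_space["tamanho_minilote"]
        and trial["numero_filtros"] in search_space["numero_filtros"]
        and trial["tamanho_filtro"] in search_space["tamanho_filtro"]
        and lr_lo <= trial["tasa_aprendizaje"] <= lr_hi
        and ep_lo <= trial["epochs"] <= ep_hi
    )

# Best TERMINATED trial from the table (val_accuracy 0.582514):
best = {"N_capas": 3, "optimizador": "rmsprop", "funcion_activacion": "relu",
        "tamanho_minilote": 64, "numero_filtros": 128, "tamanho_filtro": 5,
        "tasa_aprendizaje": 0.000380128, "epochs": 91}
```

Among the 13 terminated trials, the `best` row above has the highest reported `val_accuracy` (0.582514); the continuous values in the table are consistent with log-uniform sampling, but the actual distributions used by the project are not visible in this log.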
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 21ms/step - accuracy: 0.4594 - loss: 1.1871 - val_accuracy: 0.5446 - val_loss: 1.0792
(train_cnn_ray_tune pid=1510570) 164/164 - 2s 11ms/step - accuracy: 0.3888 - loss: 1.3789 - val_accuracy: 0.5284 - val_loss: 1.1364
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4600 - loss: 1.1893 - val_accuracy: 0.5442 - val_loss: 1.0744
(train_cnn_ray_tune pid=1510571) 328/328 - 4s 12ms/step - accuracy: 0.4583 - loss: 1.1806 - val_accuracy: 0.5390 - val_loss: 1.0708 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 55/141 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4720 - loss: 1.1726 - val_accuracy: 0.5527 - val_loss: 1.0673
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 21ms/step - accuracy: 0.4651 - loss: 1.1770 - val_accuracy: 0.5527 - val_loss: 1.0637
(train_cnn_ray_tune pid=1510570) 164/164 - 2s 11ms/step - accuracy: 0.3695 - loss: 1.3862 - val_accuracy: 0.5298 - val_loss: 1.1344 [repeated 5x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 19ms/step - accuracy: 0.4748 - loss: 1.1695 - val_accuracy: 0.5513 - val_loss: 1.0672
(train_cnn_ray_tune pid=1510549) 164/164 - 4s 21ms/step - accuracy: 0.3819 - loss: 1.3253 - val_accuracy: 0.4821 - val_loss: 1.2100 [repeated 6x across cluster]
(train_cnn_ray_tune pid=1510543) Epoch 58/125 [repeated 15x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 21ms/step - accuracy: 0.4762 - loss: 1.1615 - val_accuracy: 0.5548 - val_loss: 1.0637
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4749 - loss: 1.1531 - val_accuracy: 0.5576 - val_loss: 1.0595
(train_cnn_ray_tune pid=1510570) 164/164 - 2s 11ms/step - accuracy: 0.3808 - loss: 1.3640 - val_accuracy: 0.5320 - val_loss: 1.1333 [repeated 5x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4752 - loss: 1.1601 - val_accuracy: 0.5548 - val_loss: 1.0527
(train_cnn_ray_tune pid=1510566) 328/328 - 5s 14ms/step - accuracy: 0.4736 - loss: 1.1568 - val_accuracy: 0.5481 - val_loss: 1.1697 [repeated 6x across cluster]
(train_cnn_ray_tune pid=1510568) Epoch 58/94 [repeated 14x across cluster]
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4814 - loss: 1.1486 - val_accuracy: 0.5527 - val_loss: 1.0532
(train_cnn_ray_tune pid=1510541) 82/82 - 2s 20ms/step - accuracy: 0.4742 - loss: 1.1548 - val_accuracy: 0.5572 - val_loss: 1.0595
(train_cnn_ray_tune pid=1510541) /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
(train_cnn_ray_tune pid=1510541)   _log_deprecation_warning(
(train_cnn_ray_tune pid=1510570) 164/164 - 2s 11ms/step - accuracy: 0.4019 - loss: 1.3411 - val_accuracy: 0.5334 - val_loss: 1.1306 [repeated 8x across cluster]

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:41. Total running time: 3min 48s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             225.353 │
│ time_total_s                 225.353 │
│ training_iteration                 1 │
│ val_accuracy                 0.55723 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:41. Total running time: 3min 48s
(train_cnn_ray_tune pid=1510549) 164/164 - 3s 20ms/step - accuracy: 0.3841 - loss: 1.3021 - val_accuracy: 0.4765 - val_loss: 1.2088 [repeated 4x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 63/141 [repeated 14x across cluster]

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:45. Total running time: 3min 52s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             228.867 │
│ time_total_s                 228.867 │
│ training_iteration                 1 │
│ val_accuracy                 0.55513 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:45. Total running time: 3min 52s
(train_cnn_ray_tune pid=1510570) 164/164 - 1s 8ms/step - accuracy: 0.3969 - loss: 1.3358 - val_accuracy: 0.5386 - val_loss: 1.1219 [repeated 11x across cluster]
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 14ms/step - accuracy: 0.3873 - loss: 1.3056 - val_accuracy: 0.4796 - val_loss: 1.2107[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 67/141[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  9/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 6ms/step - accuracy: 0.3958 - loss: 1.3269 
[1m 18/164[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 6ms/step - accuracy: 0.3966 - loss: 1.3304[32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 122ms/step
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m37/49[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 1ms/step  
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m9s[0m 197ms/step
[1m20/49[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 3ms/step  
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 48/164[0m [32m━━━━━[0m[37m━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3439 - loss: 1.3719[32m [repeated 24x across cluster][0m
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m 1/89[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 23ms/step
[36m(train_cnn_ray_tune pid=1510543)[0m 
[1m33/89[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step 
[1m68/89[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 2ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:49. Total running time: 3min 56s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             233.479 │
│ time_total_s                 233.479 │
│ training_iteration                 1 │
│ val_accuracy                 0.47226 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:49. Total running time: 3min 56s
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m 1/89[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m2s[0m 24ms/step
[1m18/89[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 3ms/step 
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m156/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 12ms/step - accuracy: 0.4043 - loss: 1.2850
[1m160/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 12ms/step - accuracy: 0.4041 - loss: 1.2851[32m [repeated 27x across cluster][0m
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m127/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 12ms/step - accuracy: 0.4055 - loss: 1.2845[32m [repeated 7x across cluster][0m

Trial trial_49751 finished iteration 1 at 2025-11-04 13:21:49. Total running time: 3min 57s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             233.973 │
│ time_total_s                 233.973 │
│ training_iteration                 1 │
│ val_accuracy                 0.48174 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:21:49. Total running time: 3min 57s
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 5ms/step - accuracy: 0.4163 - loss: 1.2942 - val_accuracy: 0.5428 - val_loss: 1.1169[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 30ms/step - accuracy: 0.3594 - loss: 1.2397
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3920 - loss: 1.3046 [32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m73/89[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 3ms/step
[1m89/89[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 3ms/step[32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 36ms/step - accuracy: 0.4062 - loss: 1.2146[32m [repeated 6x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m186/328[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4906 - loss: 1.1364
[1m200/328[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4907 - loss: 1.1359[32m [repeated 112x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m294/328[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4914 - loss: 1.1338
[1m307/328[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 4ms/step - accuracy: 0.4915 - loss: 1.1334
[1m320/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 4ms/step - accuracy: 0.4916 - loss: 1.1330
[36m(train_cnn_ray_tune pid=1510549)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 14ms/step - accuracy: 0.4038 - loss: 1.2853 - val_accuracy: 0.4817 - val_loss: 1.2119
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 72/141[32m [repeated 13x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3323 - loss: 1.3480 
[1m 28/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3289 - loss: 1.3612[32m [repeated 5x across cluster][0m

Trial status: 17 TERMINATED | 3 RUNNING
Current time: 2025-11-04 13:21:53. Total running time: 4min 0s
Logical resource usage: 3.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94                                              │
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    RUNNING              3   adam            relu                                   32                 64                  3          0.000152599        138                                              │
│ trial_49751    TERMINATED           4   adam            relu                                  128                 64                  3          0.000218484         65        1           225.353          0.557233 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                128                  5          9.98547e-05         55        1           233.973          0.481742 │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 64                  5          0.00199444          93        1           204.732          0.494031 │
│ trial_49751    TERMINATED           2   rmsprop         relu                                   32                128                  5          0.000810222         51        1           164.546          0.573034 │
│ trial_49751    TERMINATED           3   adam            tanh                                   32                 32                  5          0.00337379          79        1           164.172          0.49368  │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           2   adam            tanh                                   64                 64                  3          4.37965e-05        125        1           233.479          0.472261 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
│ trial_49751    TERMINATED           3   rmsprop         tanh                                   32                128                  5          0.00135033         103        1           228.867          0.555126 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                 32                  3          0.00148826         119        1           204.577          0.530548 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
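The hyperparameter columns in the status table above (N_capas, optimizador, funcion_activacion, tamanho_minilote, numero_filtros, tamanho_filtro, tasa_aprendizaje, epochs) are consistent with a random search over categorical choices plus a log-uniform learning rate. A minimal stdlib sketch of how one such trial configuration could be sampled — the choice lists and the `epochs`/learning-rate bounds are assumptions read off the table, not the project's actual Ray Tune search-space definition:

```python
import math
import random

# Assumed search space, inferred from the trial table above.
SEARCH_SPACE = {
    "N_capas": [2, 3, 4],
    "optimizador": ["adam", "rmsprop"],
    "funcion_activacion": ["relu", "tanh"],
    "tamanho_minilote": [32, 64, 128],
    "numero_filtros": [32, 64, 128],
    "tamanho_filtro": [3, 5],
    "epochs": (50, 150),               # assumed integer range
    "tasa_aprendizaje": (1e-5, 5e-3),  # assumed log-uniform range
}

def sample_config(rng: random.Random) -> dict:
    """Draw one trial configuration, mirroring what a random search would sample."""
    lr_lo, lr_hi = SEARCH_SPACE["tasa_aprendizaje"]
    ep_lo, ep_hi = SEARCH_SPACE["epochs"]
    return {
        "N_capas": rng.choice(SEARCH_SPACE["N_capas"]),
        "optimizador": rng.choice(SEARCH_SPACE["optimizador"]),
        "funcion_activacion": rng.choice(SEARCH_SPACE["funcion_activacion"]),
        "tamanho_minilote": rng.choice(SEARCH_SPACE["tamanho_minilote"]),
        "numero_filtros": rng.choice(SEARCH_SPACE["numero_filtros"]),
        "tamanho_filtro": rng.choice(SEARCH_SPACE["tamanho_filtro"]),
        "epochs": rng.randint(ep_lo, ep_hi),
        # log-uniform: sample uniformly in log space, then exponentiate
        "tasa_aprendizaje": math.exp(rng.uniform(math.log(lr_lo), math.log(lr_hi))),
    }

config = sample_config(random.Random(0))
```

In Ray Tune itself the equivalent would typically be expressed with `tune.choice(...)` and `tune.loguniform(...)`; this sketch only illustrates the shape of the space.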
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 28/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5048 - loss: 1.0866[32m [repeated 11x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 14/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3493 - loss: 1.3532 
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 5ms/step - accuracy: 0.3393 - loss: 1.3633 - val_accuracy: 0.4937 - val_loss: 1.2168[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m8s[0m 27ms/step - accuracy: 0.5625 - loss: 0.9891
[1m 14/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5222 - loss: 1.0625 [32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 32ms/step - accuracy: 0.2656 - loss: 1.3414[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 31ms/step - accuracy: 0.4844 - loss: 1.1566
[1m 14/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4418 - loss: 1.2325 
[1m 28/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4378 - loss: 1.2438
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m295/328[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.5020 - loss: 1.0958
[1m308/328[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 4ms/step - accuracy: 0.5019 - loss: 1.0964[32m [repeated 93x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m128/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4059 - loss: 1.2960
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4064 - loss: 1.2955
[1m156/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 4ms/step - accuracy: 0.4067 - loss: 1.2949
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 79/141[32m [repeated 17x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 14/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.4892 - loss: 1.1205 
[1m 28/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.4895 - loss: 1.1150[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m157/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 4ms/step - accuracy: 0.4232 - loss: 1.2699[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m133/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3531 - loss: 1.3441
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3531 - loss: 1.3446
[1m159/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 4ms/step - accuracy: 0.3531 - loss: 1.3450
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 4ms/step - accuracy: 0.4155 - loss: 1.2658 - val_accuracy: 0.5397 - val_loss: 1.1050[32m [repeated 17x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5s[0m 31ms/step - accuracy: 0.3594 - loss: 1.3857
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4403 - loss: 1.2368 [32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 29ms/step - accuracy: 0.4531 - loss: 1.2513[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 27/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3198 - loss: 1.3609
[1m 39/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3270 - loss: 1.3578[32m [repeated 92x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 86/141[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4233 - loss: 1.3051 
[1m 29/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4243 - loss: 1.2961[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m323/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 4ms/step - accuracy: 0.5230 - loss: 1.0807[32m [repeated 9x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 5ms/step - accuracy: 0.4345 - loss: 1.2442 - val_accuracy: 0.5485 - val_loss: 1.0945[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 29ms/step - accuracy: 0.4844 - loss: 1.1630
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4340 - loss: 1.2331 [32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 31ms/step - accuracy: 0.3594 - loss: 1.2912[32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4043 - loss: 1.2840 
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 53/164[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3538 - loss: 1.3436
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3565 - loss: 1.3411[32m [repeated 99x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454[32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m   _log_deprecation_warning([32m [repeated 3x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m268/328[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.5119 - loss: 1.0748
[1m281/328[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.5116 - loss: 1.0757
[1m294/328[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 4ms/step - accuracy: 0.5113 - loss: 1.0765
[36m(train_cnn_ray_tune pid=1510568)[0m Epoch 88/94[32m [repeated 16x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4311 - loss: 1.2282 
[1m 29/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4364 - loss: 1.2321[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 55/164[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3596 - loss: 1.3256[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 5ms/step - accuracy: 0.4405 - loss: 1.2452 - val_accuracy: 0.5435 - val_loss: 1.0894[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m8s[0m 27ms/step - accuracy: 0.6250 - loss: 0.8740
[1m 14/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5567 - loss: 1.0245 [32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 30ms/step - accuracy: 0.4375 - loss: 1.2150[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 14/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5138 - loss: 1.0763 [32m [repeated 2x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3863 - loss: 1.3035
[1m 80/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3857 - loss: 1.3047[32m [repeated 95x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 14/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5192 - loss: 1.0371  
[1m 28/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 4ms/step - accuracy: 0.5013 - loss: 1.0703
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 99/141[32m [repeated 17x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3738 - loss: 1.2966 
[1m 28/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.3836 - loss: 1.2972[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m7s[0m 151ms/step
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m36/49[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m0s[0m 1ms/step  
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 4ms/step
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 4ms/step
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m 1/89[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 20ms/step
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m38/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step 
[1m74/89[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:22:13. Total running time: 4min 20s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             258.051 │
│ time_total_s                 258.051 │
│ training_iteration                 1 │
│ val_accuracy                 0.50913 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:22:13. Total running time: 4min 20s
[36m(train_cnn_ray_tune pid=1510568)[0m 
[1m89/89[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m158/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 3ms/step - accuracy: 0.4458 - loss: 1.2223[32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 4ms/step - accuracy: 0.4326 - loss: 1.2273 - val_accuracy: 0.5414 - val_loss: 1.0829[32m [repeated 15x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.3906 - loss: 1.2758
[1m 17/164[0m [32m━━[0m[37m━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.4464 - loss: 1.2331 [32m [repeated 4x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.5781 - loss: 1.0924[32m [repeated 8x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
[36m(train_cnn_ray_tune pid=1510571)[0m   _log_deprecation_warning(
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m 15/164[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m0s[0m 4ms/step - accuracy: 0.4433 - loss: 1.1989 
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m 98/164[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.4448 - loss: 1.2170
[1m114/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.4454 - loss: 1.2174
[1m130/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.4457 - loss: 1.2177
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m156/328[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.5201 - loss: 1.0698
[1m171/328[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 3ms/step - accuracy: 0.5200 - loss: 1.0702[32m [repeated 69x across cluster][0m
[36m(train_cnn_ray_tune pid=1510570)[0m Epoch 106/141[32m [repeated 10x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 16/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 3ms/step - accuracy: 0.5612 - loss: 1.0359 
[1m 31/328[0m [32m━[0m[37m━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 3ms/step - accuracy: 0.5516 - loss: 1.0444[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m323/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 3ms/step - accuracy: 0.5241 - loss: 1.0624[32m [repeated 7x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m6s[0m 135ms/step
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m38/49[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 1ms/step  
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 3ms/step
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 4ms/step
[36m(train_cnn_ray_tune pid=1510570)[0m 
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 4ms/step - accuracy: 0.4503 - loss: 1.2036 - val_accuracy: 0.5404 - val_loss: 1.0752[32m [repeated 12x across cluster][0m
[36m(train_cnn_ray_tune pid=1510571)[0m 
[1m 1/89[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m1s[0m 19ms/step
[1m38/89[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step 

Trial trial_49751 finished iteration 1 at 2025-11-04 13:22:20. Total running time: 4min 27s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             264.735 │
│ time_total_s                 264.735 │
│ training_iteration                 1 │
│ val_accuracy                 0.57022 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:22:20. Total running time: 4min 27s
(train_cnn_ray_tune pid=1510571) 89/89 - 0s 2ms/step
(train_cnn_ray_tune pid=1510570) Epoch 114/141 [repeated 10x across cluster]

Trial status: 19 TERMINATED | 1 RUNNING
Current time: 2025-11-04 13:22:23. Total running time: 4min 30s
Logical resource usage: 1.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    RUNNING              2   adam            relu                                   64                 64                  3          2.51454e-05        141                                              │
│ trial_49751    TERMINATED           4   adam            relu                                  128                 64                  3          0.000218484         65        1           225.353          0.557233 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                128                  5          9.98547e-05         55        1           233.973          0.481742 │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94        1           258.051          0.509129 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           3   adam            relu                                   32                 64                  3          0.000152599        138        1           264.735          0.570225 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 64                  5          0.00199444          93        1           204.732          0.494031 │
│ trial_49751    TERMINATED           2   rmsprop         relu                                   32                128                  5          0.000810222         51        1           164.546          0.573034 │
│ trial_49751    TERMINATED           3   adam            tanh                                   32                 32                  5          0.00337379          79        1           164.172          0.49368  │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           2   adam            tanh                                   64                 64                  3          4.37965e-05        125        1           233.479          0.472261 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
│ trial_49751    TERMINATED           3   rmsprop         tanh                                   32                128                  5          0.00135033         103        1           228.867          0.555126 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                 32                  3          0.00148826         119        1           204.577          0.530548 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
(train_cnn_ray_tune pid=1510570) 164/164 - 1s 3ms/step - accuracy: 0.4590 - loss: 1.1958 - val_accuracy: 0.5404 - val_loss: 1.0721 [repeated 9x across cluster]
2025-11-04 13:22:38,302	INFO tune.py:1009 -- Wrote the latest version of all result files and experiment state to '/mnt/nvme1n2/git/uniovi-simur-wearablepermed-data/output/Paper_results/cases_dataset_M/case_M_ESANN_acc_gyr_superclasses_CPA_METs/ESANN_hyperparameters_tuning' in 0.0051s.
(train_cnn_ray_tune pid=1510570) Epoch 123/141 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) 164/164 - 1s 3ms/step - accuracy: 0.4600 - loss: 1.1904 - val_accuracy: 0.5341 - val_loss: 1.0666 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 132/141 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) 164/164 - 1s 3ms/step - accuracy: 0.4701 - loss: 1.1768 - val_accuracy: 0.5351 - val_loss: 1.0612 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) Epoch 141/141 [repeated 9x across cluster]
(train_cnn_ray_tune pid=1510570) 49/49 - 0s 2ms/step

Trial trial_49751 finished iteration 1 at 2025-11-04 13:22:38. Total running time: 4min 45s
╭──────────────────────────────────────╮
│ Trial trial_49751 result             │
├──────────────────────────────────────┤
│ checkpoint_dir_name                  │
│ time_this_iter_s             282.484 │
│ time_total_s                 282.484 │
│ training_iteration                 1 │
│ val_accuracy                  0.5323 │
╰──────────────────────────────────────╯

Trial trial_49751 completed after 1 iterations at 2025-11-04 13:22:38. Total running time: 4min 45s

Trial status: 20 TERMINATED
Current time: 2025-11-04 13:22:38. Total running time: 4min 45s
Logical resource usage: 1.0/20 CPUs, 0/1 GPUs (0.0/1.0 accelerator_type:G)
[36m(train_cnn_ray_tune pid=1510570)[0m /home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/ray/train/_internal/session.py:772: RayDeprecationWarning: `ray.train.report` should be switched to `ray.tune.report` when running in a function passed to Ray Tune. This will be an error in the future. See this issue for more context: https://github.com/ray-project/ray/issues/49454
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762258958.442462 1508936 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
[36m(train_cnn_ray_tune pid=1510570)[0m   _log_deprecation_warning(
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Trial name     status         N_capas   optimizador     funcion_activacion       tamanho_minilote     numero_filtros     tamanho_filtro     tasa_aprendizaje     epochs     iter     total time (s)     val_accuracy │
├──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┤
│ trial_49751    TERMINATED           4   adam            relu                                  128                 64                  3          0.000218484         65        1           225.353          0.557233 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                128                  5          9.98547e-05         55        1           233.973          0.481742 │
│ trial_49751    TERMINATED           4   rmsprop         tanh                                  128                128                  5          7.1455e-05         131        1            84.0283         0.359902 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                128                  5          0.000380128         91        1           146.211          0.582514 │
│ trial_49751    TERMINATED           4   adam            tanh                                   32                 32                  3          2.97881e-05         59        1            96.2637         0.358497 │
│ trial_49751    TERMINATED           3   rmsprop         relu                                   64                 32                  5          3.70795e-05         94        1           258.051          0.509129 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                128                  3          1.6132e-05         114        1           112.29           0.327247 │
│ trial_49751    TERMINATED           2   adam            relu                                   64                 64                  3          2.51454e-05        141        1           282.484          0.532303 │
│ trial_49751    TERMINATED           3   adam            relu                                   32                 64                  3          0.000152599        138        1           264.735          0.570225 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 64                  5          0.00199444          93        1           204.732          0.494031 │
│ trial_49751    TERMINATED           2   rmsprop         relu                                   32                128                  5          0.000810222         51        1           164.546          0.573034 │
│ trial_49751    TERMINATED           3   adam            tanh                                   32                 32                  5          0.00337379          79        1           164.172          0.49368  │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          7.68947e-05         83        1            79.8381         0.403792 │
│ trial_49751    TERMINATED           4   adam            relu                                  128                128                  5          1.32913e-05        108        1           100.985          0.361306 │
│ trial_49751    TERMINATED           2   adam            tanh                                   64                 64                  3          4.37965e-05        125        1           233.479          0.472261 │
│ trial_49751    TERMINATED           4   adam            tanh                                   64                 32                  3          8.13867e-05        140        1           121.581          0.352177 │
│ trial_49751    TERMINATED           4   adam            tanh                                  128                128                  3          1.21494e-05         85        1            98.8178         0.373244 │
│ trial_49751    TERMINATED           4   rmsprop         relu                                  128                 64                  5          7.17235e-05        105        1            61.237          0.347612 │
│ trial_49751    TERMINATED           3   rmsprop         tanh                                   32                128                  5          0.00135033         103        1           228.867          0.555126 │
│ trial_49751    TERMINATED           4   adam            relu                                   64                 32                  3          0.00148826         119        1           204.577          0.530548 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
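Picking the winning configuration out of a results table like the one above amounts to taking the row with the highest `val_accuracy` (Ray Tune exposes this via `ResultGrid.get_best_result(metric="val_accuracy", mode="max")`). A minimal standalone sketch, with the trial records abbreviated to three of the rows shown:

```python
# Three of the trial records from the table above (abbreviated to the
# fields relevant for selection).
trials = [
    {"optimizador": "adam",    "tasa_aprendizaje": 0.000218484, "val_accuracy": 0.557233},
    {"optimizador": "rmsprop", "tasa_aprendizaje": 0.000380128, "val_accuracy": 0.582514},
    {"optimizador": "rmsprop", "tasa_aprendizaje": 7.17235e-05, "val_accuracy": 0.347612},
]

# Select the trial that maximizes validation accuracy.
best = max(trials, key=lambda t: t["val_accuracy"])
print(best["optimizador"], best["val_accuracy"])  # → rmsprop 0.582514
```

This matches the best row reported in the table (val_accuracy 0.582514, the 3-layer rmsprop/relu trial).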

Best hyperparameters: {'N_capas': 3, 'optimizador': 'rmsprop', 'funcion_activacion': 'relu', 'tamanho_minilote': 64, 'numero_filtros': 128, 'tamanho_filtro': 5, 'tasa_aprendizaje': 0.0003801283536206291, 'epochs': 91}
(train_cnn_ray_tune pid=1510570) 89/89 - 0s 1ms/step
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
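The parameter counts in the summary are consistent with 'same'-padded Conv1D blocks over an input of shape (6, 250); the 250-channel input is an inference from the first layer's 160,128 parameters, not something the log states. A small arithmetic check:

```python
def conv1d_params(kernel, in_ch, filters):
    # weight tensor (kernel * in_ch * filters) plus one bias per filter
    return kernel * in_ch * filters + filters

def layernorm_params(ch):
    # gamma and beta, one each per channel
    return 2 * ch

k, f = 5, 128   # tamanho_filtro=5, numero_filtros=128 from the best config
in_ch = 250     # inferred: 5 * 250 * 128 + 128 == 160_128

total = (conv1d_params(k, in_ch, f) + layernorm_params(f)   # block 1
         + conv1d_params(k, f, f) + layernorm_params(f)     # block 2
         + conv1d_params(k, f, f) + layernorm_params(f)     # block 3
         + f * 4 + 4)                                       # Dense(4) head
print(total)  # → 325508, matching "Total params" in the summary
```

Dropout and GlobalAveragePooling1D contribute no parameters, which is why the trainable total equals the sum of the convolution, normalization, and dense layers alone.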
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762258960.320695 1576440 service.cc:152] XLA service 0x7a987800be20 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762258960.320728 1576440 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:22:40.364794: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762258960.528373 1576440 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762258963.303022 1576440 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 - 9s 31ms/step - accuracy: 0.2641 - loss: 2.0038 - val_accuracy: 0.4628 - val_loss: 1.2464
Epoch 2/91

164/164 - 0s 2ms/step - accuracy: 0.3053 - loss: 1.6793 - val_accuracy: 0.4761 - val_loss: 1.2092
Epoch 3/91

164/164 - 0s 2ms/step - accuracy: 0.3307 - loss: 1.5002 - val_accuracy: 0.4944 - val_loss: 1.1885
Epoch 4/91

164/164 - 0s 2ms/step - accuracy: 0.3491 - loss: 1.3983 - val_accuracy: 0.5063 - val_loss: 1.1743
Epoch 5/91

164/164 - 0s 2ms/step - accuracy: 0.3871 - loss: 1.3132 - val_accuracy: 0.5344 - val_loss: 1.1405
Epoch 6/91

164/164 - 0s 2ms/step - accuracy: 0.4269 - loss: 1.2631 - val_accuracy: 0.5372 - val_loss: 1.0978
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 1.1271
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4497 - loss: 1.2011 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4440 - loss: 1.2089
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4435 - loss: 1.2119
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4424 - loss: 1.2141
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4424 - loss: 1.2141 - val_accuracy: 0.5636 - val_loss: 1.0856
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.2000
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4744 - loss: 1.1751 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4718 - loss: 1.1749
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4718 - loss: 1.1747
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4726 - loss: 1.1732
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4730 - loss: 1.1724 - val_accuracy: 0.5755 - val_loss: 1.0577
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5156 - loss: 1.2338
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4831 - loss: 1.1657 
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4852 - loss: 1.1545
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4842 - loss: 1.1512
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4845 - loss: 1.1477
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4848 - loss: 1.1465 - val_accuracy: 0.5569 - val_loss: 1.0494
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.2235
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4814 - loss: 1.1559 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4844 - loss: 1.1431
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4863 - loss: 1.1389
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4880 - loss: 1.1353
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4888 - loss: 1.1334 - val_accuracy: 0.5702 - val_loss: 1.0462
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.4531 - loss: 1.1654
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4956 - loss: 1.1195 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4986 - loss: 1.1162
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5004 - loss: 1.1121
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5007 - loss: 1.1107
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5005 - loss: 1.1100 - val_accuracy: 0.5685 - val_loss: 1.0452
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 1.0498
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5165 - loss: 1.0793 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5145 - loss: 1.0758
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5145 - loss: 1.0769
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5151 - loss: 1.0771
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5152 - loss: 1.0777 - val_accuracy: 0.5881 - val_loss: 1.0424
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4688 - loss: 1.1294
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5246 - loss: 1.0556 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5270 - loss: 1.0578
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5276 - loss: 1.0588
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5282 - loss: 1.0600
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5283 - loss: 1.0613 - val_accuracy: 0.5772 - val_loss: 1.0458
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6094 - loss: 0.9484
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5412 - loss: 1.0423 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5414 - loss: 1.0456
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5420 - loss: 1.0465
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5414 - loss: 1.0481
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5409 - loss: 1.0485 - val_accuracy: 0.5783 - val_loss: 1.0479
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.9165
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5334 - loss: 1.0357 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5352 - loss: 1.0382
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5356 - loss: 1.0387
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5352 - loss: 1.0393
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5349 - loss: 1.0395 - val_accuracy: 0.5811 - val_loss: 1.0378
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5781 - loss: 0.9973
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5418 - loss: 1.0207 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5394 - loss: 1.0256
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5393 - loss: 1.0281
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5396 - loss: 1.0287
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5396 - loss: 1.0291 - val_accuracy: 0.5772 - val_loss: 1.0381
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5781 - loss: 0.9352
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5749 - loss: 0.9683 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5635 - loss: 0.9910
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5595 - loss: 0.9999
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5568 - loss: 1.0053
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5552 - loss: 1.0076 - val_accuracy: 0.5734 - val_loss: 1.0351
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6094 - loss: 1.0195
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5661 - loss: 1.0156 
[1m 75/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5630 - loss: 1.0151
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5613 - loss: 1.0131
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5605 - loss: 1.0119
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5599 - loss: 1.0119 - val_accuracy: 0.5762 - val_loss: 1.0427
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6562 - loss: 1.0012
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5563 - loss: 1.0165 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5548 - loss: 1.0105
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5541 - loss: 1.0077
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5532 - loss: 1.0072
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5526 - loss: 1.0071 - val_accuracy: 0.5811 - val_loss: 1.0237
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6406 - loss: 0.9023
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5679 - loss: 0.9755 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5654 - loss: 0.9808
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5650 - loss: 0.9822
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5640 - loss: 0.9844
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5632 - loss: 0.9858 - val_accuracy: 0.5825 - val_loss: 1.0392
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.6406 - loss: 0.8489
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5779 - loss: 0.9414 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5706 - loss: 0.9575
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5693 - loss: 0.9629
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5692 - loss: 0.9659
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5693 - loss: 0.9676 - val_accuracy: 0.5727 - val_loss: 1.0333
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6250 - loss: 0.9847
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5827 - loss: 0.9757 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5746 - loss: 0.9790
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5709 - loss: 0.9799
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5705 - loss: 0.9790
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5708 - loss: 0.9782 - val_accuracy: 0.5765 - val_loss: 1.0454
Epoch 23/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5938 - loss: 0.9113
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5863 - loss: 0.9294 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5853 - loss: 0.9358
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5836 - loss: 0.9437
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5822 - loss: 0.9497
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5815 - loss: 0.9523 - val_accuracy: 0.5776 - val_loss: 1.0309
Epoch 24/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6094 - loss: 0.8965
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5894 - loss: 0.9287 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5889 - loss: 0.9373
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5885 - loss: 0.9436
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5869 - loss: 0.9488
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5862 - loss: 0.9510 - val_accuracy: 0.5797 - val_loss: 1.0370

49/49 - 1s 17ms/step
Saved model to disk.
(train_cnn_ray_tune pid=1510570) 156/164 - 0s 3ms/step - accuracy: 0.4767 - loss: 1.1609 [repeated 4x across cluster]
(train_cnn_ray_tune pid=1510570) 164/164 - 1s 3ms/step - accuracy: 0.4765 - loss: 1.1611 - val_accuracy: 0.5323 - val_loss: 1.0564 [repeated 4x across cluster]

=== RUN 1 ===

--- TRAIN (run 1) ---

--- TEST (run 1) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
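The class array, class count, and label mapping printed above can be reproduced with a short sketch. This is a hedged reconstruction, not the project's actual code; `activity_labels` is a hypothetical stand-in for the dataset's label column, and the lexicographic ordering assumed here matches what `numpy.unique` would return:

```python
# Build a class-name -> integer-index mapping, as printed in the log above.
# `activity_labels` is a hypothetical stand-in for the dataset's label column.
activity_labels = [
    "SEDENTARY", "LIGHT-INTENSITY", "MODERATE-INTENSITY",
    "VIGOROUS-INTENSITY", "SEDENTARY",
]

classes = sorted(set(activity_labels))  # lexicographic order, like numpy.unique
label_map = {name: idx for idx, name in enumerate(classes)}

print(classes)       # ['LIGHT-INTENSITY', 'MODERATE-INTENSITY', 'SEDENTARY', 'VIGOROUS-INTENSITY']
print(len(classes))  # 4
print(label_map)     # {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```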
This activity can't be balanced (in a downsampling way)
(previous line repeated 76x)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
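The parameter counts in the summary above can be verified by hand. Assuming input windows of shape (6, 250), as the array shapes printed below suggest, and Conv1D layers with `kernel_size=5` and `padding='same'` (hyperparameters the log itself does not print, so treat them as inferred), the totals work out exactly:

```python
# Recompute the Keras summary's parameter counts from layer shapes.
# Assumed (inferred, not printed) hyperparameters: kernel_size=5, padding='same'.
filters, kernel, n_classes = 128, 5, 4
in_channels = 250                              # features per timestep of a (6, 250) window

conv1 = filters * (in_channels * kernel + 1)   # conv1d: 160,128
conv_rest = filters * (filters * kernel + 1)   # conv1d_1 / conv1d_2: 82,048 each
layer_norm = 2 * filters                       # 256: one gamma and one beta per channel
dense = filters * n_classes + n_classes        # 516: weights + biases

total = conv1 + 2 * conv_rest + 3 * layer_norm + dense
print(total)  # 325508, matching "Total params: 325,508"
```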
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 15ms/step

89/89 - 0s 944us/step
Global accuracy score (validation) = 58.18 [%]
Global F1 score (validation) = 57.04 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.23823588 0.1748967  0.5379061  0.04896132]
 [0.11456525 0.07572161 0.7915519  0.01816126]
 [0.23595648 0.17331995 0.5418735  0.04885005]
 ...
 [0.16062635 0.10313669 0.36410865 0.37212834]
 [0.36402708 0.2153293  0.33013588 0.09050778]
 [0.43830162 0.49692428 0.03856174 0.02621235]]
(1545, 4)
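The (1545, 4) array above holds per-class softmax probabilities, and the (1545, 1) integer array before it is recovered by taking the argmax of each row. A minimal sketch using the first two printed rows (pure Python in place of `numpy.argmax`, and `inv_map` derived from the label mapping above):

```python
# Convert softmax probability rows to predicted class indices via per-row argmax.
probs = [
    [0.23823588, 0.17489670, 0.53790610, 0.04896132],  # first row printed above
    [0.11456525, 0.07572161, 0.79155190, 0.01816126],
]
inv_map = {0: 'LIGHT-INTENSITY', 1: 'MODERATE-INTENSITY',
           2: 'SEDENTARY', 3: 'VIGOROUS-INTENSITY'}

preds = [max(range(len(row)), key=row.__getitem__) for row in probs]
print(preds)                        # [2, 2] -> matches the leading [[2.], [2.], ...] above
print([inv_map[p] for p in preds])  # ['SEDENTARY', 'SEDENTARY']
```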
-------------------------------------------------

Global accuracy score (train) = 63.53 [%]
Global accuracy score (test) = 57.15 [%]
Global F1 score (train) = 62.93 [%]
Global F1 score (test) = 57.07 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.49      0.40      0.44       400
MODERATE-INTENSITY       0.49      0.56      0.53       400
         SEDENTARY       0.63      0.72      0.67       400
VIGOROUS-INTENSITY       0.68      0.61      0.65       345

          accuracy                           0.57      1545
         macro avg       0.57      0.57      0.57      1545
      weighted avg       0.57      0.57      0.57      1545
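The aggregate rows of the report follow from the per-class rows: "macro avg" is the unweighted mean of the per-class F1 scores, while "weighted avg" weights each class by its support. Recomputing both from the printed values (a sanity check on the table, not the project's code):

```python
# Recompute the report's "macro avg" and "weighted avg" F1 from its per-class rows.
f1 = {'LIGHT-INTENSITY': 0.44, 'MODERATE-INTENSITY': 0.53,
      'SEDENTARY': 0.67, 'VIGOROUS-INTENSITY': 0.65}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}

macro = sum(f1.values()) / len(f1)                                  # unweighted mean
weighted = sum(f1[c] * support[c] for c in f1) / sum(support.values())

print(round(macro, 2))     # 0.57
print(round(weighted, 2))  # 0.57
```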

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762258992.104605 1579546 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
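The repeated message above points at a per-activity balancing step that downsamples classes and skips activities it cannot equalize. A minimal sketch of downsampling-based balancing, as one plausible reading of that step (function name and strategy are assumptions, not the project's actual code):

```python
import numpy as np

def downsample_balance(X, y, rng=None):
    """Downsample every class to the size of the smallest class (assumed strategy)."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    # Keep a random subset of n_min indices per class, then restore original order
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    keep.sort()
    return X[keep], y[keep]
```

When a class already sits at (or below) the target size there is nothing to drop, which is the situation the log message likely reports.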
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
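The Param # column of the summary can be reproduced arithmetically. A Conv1D layer has kernel_size × in_channels × filters weights plus one bias per filter, and a LayerNormalization over 128 channels has a gamma and beta pair. The split of the first layer's 1,250 weights-per-filter into kernel size × input channels (125 × 10 below) is an assumption — only the product is determined by the summary:

```python
def conv1d_params(kernel_size, in_channels, filters):
    # weights: kernel_size * in_channels * filters, plus one bias per filter
    return kernel_size * in_channels * filters + filters

def layer_norm_params(channels):
    return 2 * channels               # gamma and beta per channel

total = (
    conv1d_params(125, 10, 128)       # conv1d: 160,128 (125 x 10 is one consistent split)
    + layer_norm_params(128)          # layer_normalization: 256
    + conv1d_params(5, 128, 128)      # conv1d_1: 82,048  -> implies kernel size 5
    + layer_norm_params(128)
    + conv1d_params(5, 128, 128)      # conv1d_2: 82,048
    + layer_norm_params(128)
    + 128 * 4 + 4                     # dense: 516 (128 inputs -> 4 classes)
)
print(total)  # 325508, matching "Total params" in the summary
```

Dropout and GlobalAveragePooling1D contribute zero parameters, which is why the total is carried entirely by the three convolutions, the three normalizations, and the output dense layer.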
Epoch 1/91
I0000 00:00:1762258993.796407 1579677 service.cc:152] XLA service 0x78e91400bd90 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762258993.796433 1579677 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:23:13.839875: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762258994.007965 1579677 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762258996.779459 1579677 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10:47[0m 4s/step - accuracy: 0.2344 - loss: 2.2898
[1m 29/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2542 - loss: 2.0972  
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2662 - loss: 2.0545
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2732 - loss: 2.0308
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2763 - loss: 2.0126
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 19ms/step - accuracy: 0.2779 - loss: 1.9995
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m9s[0m 31ms/step - accuracy: 0.2780 - loss: 1.9990 - val_accuracy: 0.4670 - val_loss: 1.2677
Epoch 2/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.2344 - loss: 2.1311
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2775 - loss: 1.8124 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2829 - loss: 1.7812
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2881 - loss: 1.7535
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2911 - loss: 1.7384
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.2925 - loss: 1.7294 - val_accuracy: 0.4659 - val_loss: 1.2134
Epoch 3/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.2656 - loss: 1.6620
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2979 - loss: 1.5870 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3040 - loss: 1.5655
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3092 - loss: 1.5487
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.3132 - loss: 1.5357
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3148 - loss: 1.5305 - val_accuracy: 0.4940 - val_loss: 1.2019
Epoch 4/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.4062 - loss: 1.2472
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3498 - loss: 1.3813 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3508 - loss: 1.3885
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3528 - loss: 1.3879
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3537 - loss: 1.3861
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3540 - loss: 1.3848 - val_accuracy: 0.5011 - val_loss: 1.1972
Epoch 5/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4531 - loss: 1.2873
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4189 - loss: 1.3108 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4093 - loss: 1.3158
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4034 - loss: 1.3190
[1m145/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4022 - loss: 1.3182
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4019 - loss: 1.3176 - val_accuracy: 0.5309 - val_loss: 1.1532
Epoch 6/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4375 - loss: 1.2631
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4286 - loss: 1.2670 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4247 - loss: 1.2680
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4226 - loss: 1.2694
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4207 - loss: 1.2693
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4206 - loss: 1.2680 - val_accuracy: 0.5456 - val_loss: 1.1259
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 27ms/step - accuracy: 0.4688 - loss: 1.2591
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4349 - loss: 1.2454 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4403 - loss: 1.2379
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4431 - loss: 1.2317
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4446 - loss: 1.2278
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4454 - loss: 1.2255 - val_accuracy: 0.5632 - val_loss: 1.0812
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.4844 - loss: 1.1784
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4779 - loss: 1.1664 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4755 - loss: 1.1695
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4758 - loss: 1.1683
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4750 - loss: 1.1681
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4749 - loss: 1.1679 - val_accuracy: 0.5646 - val_loss: 1.0626
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.2697
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4928 - loss: 1.1289 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4893 - loss: 1.1288
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4873 - loss: 1.1315
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4864 - loss: 1.1334
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4859 - loss: 1.1345 - val_accuracy: 0.5706 - val_loss: 1.0449
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5156 - loss: 1.2327
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5094 - loss: 1.1252 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5119 - loss: 1.1139
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5092 - loss: 1.1140
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5083 - loss: 1.1142
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5077 - loss: 1.1145 - val_accuracy: 0.5885 - val_loss: 1.0375
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6094 - loss: 0.9241
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5193 - loss: 1.0836 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5146 - loss: 1.0895
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5150 - loss: 1.0884
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5151 - loss: 1.0885
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5152 - loss: 1.0886 - val_accuracy: 0.5853 - val_loss: 1.0525
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 30ms/step - accuracy: 0.5156 - loss: 1.2011
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5023 - loss: 1.1049 
[1m 64/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5073 - loss: 1.0917
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5121 - loss: 1.0843
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5138 - loss: 1.0815
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5149 - loss: 1.0795 - val_accuracy: 0.5902 - val_loss: 1.0256
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 25ms/step - accuracy: 0.4688 - loss: 1.0873
[1m 30/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5021 - loss: 1.0880 
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5058 - loss: 1.0829
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5074 - loss: 1.0802
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5096 - loss: 1.0775
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5118 - loss: 1.0750 - val_accuracy: 0.5857 - val_loss: 1.0250
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.6719 - loss: 0.9180
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5492 - loss: 1.0417 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5454 - loss: 1.0494
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5417 - loss: 1.0527
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5398 - loss: 1.0541
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5391 - loss: 1.0539 - val_accuracy: 0.5927 - val_loss: 1.0441
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.0974
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5423 - loss: 1.0065 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5466 - loss: 1.0151
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5463 - loss: 1.0190
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5461 - loss: 1.0218
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5458 - loss: 1.0231 - val_accuracy: 0.5987 - val_loss: 1.0289
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.1440
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5314 - loss: 1.0544 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5417 - loss: 1.0387
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5436 - loss: 1.0348
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5433 - loss: 1.0340
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5439 - loss: 1.0328 - val_accuracy: 0.5885 - val_loss: 1.0302
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.1092
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5322 - loss: 1.0433 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5406 - loss: 1.0325
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5444 - loss: 1.0283
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5469 - loss: 1.0257
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5478 - loss: 1.0242 - val_accuracy: 0.6011 - val_loss: 1.0275
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6250 - loss: 0.8660
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5803 - loss: 0.9758 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5696 - loss: 0.9886
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5662 - loss: 0.9927
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5646 - loss: 0.9948
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5639 - loss: 0.9960 - val_accuracy: 0.5941 - val_loss: 1.0231
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6875 - loss: 0.7757
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5949 - loss: 0.9408 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5895 - loss: 0.9550
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5836 - loss: 0.9633
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5799 - loss: 0.9682
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5774 - loss: 0.9715 - val_accuracy: 0.5916 - val_loss: 1.0378
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6562 - loss: 0.9913
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5894 - loss: 0.9649 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5833 - loss: 0.9679
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5811 - loss: 0.9702
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5780 - loss: 0.9745
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5769 - loss: 0.9760 - val_accuracy: 0.5811 - val_loss: 1.0689
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5938 - loss: 1.0024
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5749 - loss: 0.9849 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5742 - loss: 0.9798
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5739 - loss: 0.9789
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5746 - loss: 0.9763
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5750 - loss: 0.9749 - val_accuracy: 0.5857 - val_loss: 1.0336
Epoch 22/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5768 - loss: 0.9580 - val_accuracy: 0.5832 - val_loss: 1.0486
Epoch 23/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5873 - loss: 0.9541 - val_accuracy: 0.5934 - val_loss: 1.0449
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 1: 57.15 [%]
F1-score captured in run 1: 57.07 [%]

=== RUN 2 ===

--- TRAIN (run 2) ---

--- TEST (run 2) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
This activity can't be balanced (in a downsampling way)
[... message repeated 77 times ...]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step
89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 978us/step
Global accuracy score (validation) = 58.25 [%]
Global F1 score (validation) = 57.2 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.10009154 0.06235737 0.8243227  0.01322838]
 [0.16104987 0.10801999 0.6839211  0.04700911]
 [0.39323795 0.45943144 0.08959211 0.05773854]
 ...
 [0.14842479 0.08616646 0.5113363  0.2540724 ]
 [0.17205569 0.1408303  0.10685241 0.5802617 ]
 [0.3603182  0.4580837  0.03889625 0.14270188]]
(1545, 4)
-------------------------------------------------

Global accuracy score (train) = 63.09 [%]
Global accuracy score (test) = 52.1 [%]
Global F1 score (train) = 62.46 [%]
Global F1 score (test) = 51.89 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.30      0.34       400
MODERATE-INTENSITY       0.45      0.49      0.47       400
         SEDENTARY       0.56      0.70      0.62       400
VIGOROUS-INTENSITY       0.69      0.60      0.64       345

          accuracy                           0.52      1545
         macro avg       0.52      0.52      0.52      1545
      weighted avg       0.52      0.52      0.51      1545

2025-11-04 13:23:40.680504: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:23:40.691817: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259020.704946 1582659 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259020.709208 1582659 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259020.719207 1582659 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:23:40.722568: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259023.078095 1582659 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
[... message repeated 77 times ...]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259024.757421 1582789 service.cc:152] XLA service 0x70c19401d630 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259024.757468 1582789 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:23:44.799499: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259024.968053 1582789 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259027.696159 1582789 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2712 - loss: 2.0263 - val_accuracy: 0.4621 - val_loss: 1.2531
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3059 - loss: 1.7534 - val_accuracy: 0.4796 - val_loss: 1.2004
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3197 - loss: 1.5474 - val_accuracy: 0.4909 - val_loss: 1.1922
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3533 - loss: 1.4130 - val_accuracy: 0.5112 - val_loss: 1.1873
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3741 - loss: 1.3200 - val_accuracy: 0.5165 - val_loss: 1.1552
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4201 - loss: 1.2653 - val_accuracy: 0.5372 - val_loss: 1.1291
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4403 - loss: 1.2186 - val_accuracy: 0.5597 - val_loss: 1.0979
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4786 - loss: 1.1615 - val_accuracy: 0.5509 - val_loss: 1.0844
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4869 - loss: 1.1457 - val_accuracy: 0.5822 - val_loss: 1.0786
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4856 - loss: 1.1288 - val_accuracy: 0.5646 - val_loss: 1.0664
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.1612
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5080 - loss: 1.1022 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5143 - loss: 1.1001
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5149 - loss: 1.0989
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5140 - loss: 1.0993
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5139 - loss: 1.0983 - val_accuracy: 0.5688 - val_loss: 1.0806
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5156 - loss: 1.1087
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4989 - loss: 1.0700 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5058 - loss: 1.0668
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5077 - loss: 1.0678
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5087 - loss: 1.0694
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5089 - loss: 1.0710 - val_accuracy: 0.5629 - val_loss: 1.0667
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4844 - loss: 1.0636
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5256 - loss: 1.0581 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5311 - loss: 1.0533
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5331 - loss: 1.0517
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5317 - loss: 1.0526
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5304 - loss: 1.0535 - val_accuracy: 0.5643 - val_loss: 1.0631
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 0.9746
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5418 - loss: 1.0474 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5386 - loss: 1.0494
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5364 - loss: 1.0483
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5347 - loss: 1.0478
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5339 - loss: 1.0481 - val_accuracy: 0.5572 - val_loss: 1.0688
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.5781 - loss: 1.0952
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5566 - loss: 1.0282 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5490 - loss: 1.0284
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5458 - loss: 1.0302
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5434 - loss: 1.0328
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5423 - loss: 1.0340 - val_accuracy: 0.5713 - val_loss: 1.0632
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5156 - loss: 1.0820
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5324 - loss: 1.0414 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5411 - loss: 1.0305
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5404 - loss: 1.0306
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5400 - loss: 1.0310
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5402 - loss: 1.0305 - val_accuracy: 0.5632 - val_loss: 1.0597
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.5625 - loss: 1.0515
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5516 - loss: 1.0246 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5468 - loss: 1.0312
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5456 - loss: 1.0316
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5464 - loss: 1.0294
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5467 - loss: 1.0280 - val_accuracy: 0.5702 - val_loss: 1.0458
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5781 - loss: 0.9956
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5768 - loss: 0.9944 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5661 - loss: 0.9963
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5608 - loss: 0.9982
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5585 - loss: 0.9998
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5581 - loss: 1.0003 - val_accuracy: 0.5765 - val_loss: 1.0544
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 0.8985
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5481 - loss: 0.9735 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5496 - loss: 0.9841
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5516 - loss: 0.9880
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5523 - loss: 0.9901
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5526 - loss: 0.9913 - val_accuracy: 0.5650 - val_loss: 1.0576
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 1.1631
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5577 - loss: 0.9999 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5622 - loss: 0.9910
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5626 - loss: 0.9875
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5634 - loss: 0.9862
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5634 - loss: 0.9858 - val_accuracy: 0.5723 - val_loss: 1.0681
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 1.1288
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5767 - loss: 0.9766 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5753 - loss: 0.9775
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5737 - loss: 0.9783
[1m145/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5726 - loss: 0.9789
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5723 - loss: 0.9791 - val_accuracy: 0.5808 - val_loss: 1.0685
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 1.0746
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5656 - loss: 0.9901 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5653 - loss: 0.9828
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5648 - loss: 0.9812
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5655 - loss: 0.9791
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5658 - loss: 0.9781 - val_accuracy: 0.5720 - val_loss: 1.0721

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 2: 52.1 [%]
F1-score captured in run 2: 51.89 [%]

=== RUN 3 ===

--- TRAIN (run 3) ---

--- TEST (run 3) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
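A label mapping like the one logged above (alphabetically sorted class names mapped to integer indices) can be reproduced with `numpy.unique`; the `labels` array below is a hypothetical sample, not the actual dataset:

```python
import numpy as np

# Hypothetical labels; the real ones come from the windowed activity data
labels = np.array(["SEDENTARY", "LIGHT-INTENSITY", "VIGOROUS-INTENSITY",
                   "MODERATE-INTENSITY", "SEDENTARY"])

classes = np.unique(labels)  # unique class names, sorted alphabetically
label_map = {name: idx for idx, name in enumerate(classes)}
print(label_map)
```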
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
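The parameter counts in the summary above are consistent with three Conv1D(128) blocks with kernel size 5 over (6, 250) inputs (kernel size 5 follows from 160,128 = 5·250·128 + 128; this is an inference from the counts, not confirmed by the log). A quick arithmetic check:

```python
def conv1d_params(in_ch, filters, kernel):
    # weights (kernel * in_channels * filters) plus one bias per filter
    return kernel * in_ch * filters + filters

def dense_params(in_dim, units):
    return in_dim * units + units

layer_norm = 2 * 128                     # gamma + beta per feature -> 256
conv0 = conv1d_params(250, 128, 5)       # 160,128 as in the summary
conv_rest = conv1d_params(128, 128, 5)   # 82,048 for conv1d_1 / conv1d_2
head = dense_params(128, 4)              # 516 for the Dense(4) classifier

total = conv0 + 2 * conv_rest + 3 * layer_norm + head
print(conv0, conv_rest, head, total)     # 160128 82048 516 325508
```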

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 945us/step
Global accuracy score (validation) = 57.48 [%]
Global F1 score (validation) = 55.75 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.37173057 0.466175   0.11319878 0.04889574]
 [0.23161669 0.1611569  0.5606427  0.04658373]
 [0.08718113 0.04302511 0.85873556 0.01105815]
 ...
 [0.226827   0.129868   0.34601405 0.29729104]
 [0.09885282 0.07705909 0.03692997 0.78715813]
 [0.27538496 0.36396208 0.02258191 0.33807105]]
(1545, 4)
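Class predictions are obtained by taking the argmax over each row of the (1545, 4) probability matrix; a sketch for the first printed row, assuming the label order from the mapping logged earlier:

```python
import numpy as np

CLASSES = ["LIGHT-INTENSITY", "MODERATE-INTENSITY", "SEDENTARY", "VIGOROUS-INTENSITY"]

# First row of the printed (1545, 4) softmax probability matrix
probs = np.array([0.37173057, 0.466175, 0.11319878, 0.04889574])
pred = int(probs.argmax())   # index of the most probable class
print(CLASSES[pred])         # MODERATE-INTENSITY
```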
-------------------------------------------------

Global accuracy score (train) = 61.68 [%]
Global accuracy score (test) = 54.3 [%]
Global F1 score (train) = 60.43 [%]
Global F1 score (test) = 54.4 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.29      0.34       400
MODERATE-INTENSITY       0.43      0.57      0.49       400
         SEDENTARY       0.69      0.70      0.69       400
VIGOROUS-INTENSITY       0.70      0.61      0.65       345

          accuracy                           0.54      1545
         macro avg       0.55      0.55      0.54      1545
      weighted avg       0.55      0.54      0.54      1545
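The macro and weighted averages in the report above follow directly from the per-class f1 scores and supports; recomputing them from the printed values:

```python
# Per-class f1 and support, as printed in the classification report
f1 = {"LIGHT-INTENSITY": 0.34, "MODERATE-INTENSITY": 0.49,
      "SEDENTARY": 0.69, "VIGOROUS-INTENSITY": 0.65}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

# macro avg: unweighted mean over classes
macro_f1 = sum(f1.values()) / len(f1)
# weighted avg: mean weighted by class support
weighted_f1 = sum(f1[c] * support[c] for c in f1) / sum(support.values())
print(round(macro_f1, 2), round(weighted_f1, 2))  # 0.54 0.54
```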

2025-11-04 13:24:11.254616: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:24:11.266056: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259051.279494 1585700 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259051.283731 1585700 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259051.293522 1585700 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259051.293542 1585700 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259051.293544 1585700 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259051.293545 1585700 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:24:11.296710: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259053.677519 1585700 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
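The parameter counts in the summary are internally consistent with an input window of shape (6, 250) and Conv1D kernel size 5 (both inferred from the counts; the log itself does not state them). A quick arithmetic check of each row:

```python
# Conv1D parameter count: filters * (kernel_size * in_channels) + filters (bias)
def conv1d_params(filters, kernel_size, in_channels):
    return filters * kernel_size * in_channels + filters

p_conv1 = conv1d_params(128, 5, 250)  # first Conv1D, assumed 250 input channels
p_conv2 = conv1d_params(128, 5, 128)  # second and third Conv1D (128 -> 128)
p_ln = 2 * 128                        # LayerNormalization: gamma + beta per channel
p_dense = 128 * 4 + 4                 # Dense(4) head: weights + bias

total = p_conv1 + 2 * p_conv2 + 3 * p_ln + p_dense
print(p_conv1, p_conv2, p_ln, p_dense, total)
# -> 160128 82048 256 516 325508, matching the summary rows and total
```

Dropout and GlobalAveragePooling1D layers contribute 0 parameters, which is why only the Conv1D, LayerNormalization, and Dense rows are non-zero.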
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259055.374044 1585818 service.cc:152] XLA service 0x7e8bbc00b770 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259055.374091 1585818 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:24:15.426563: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259055.591449 1585818 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259058.306633 1585818 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 30ms/step - accuracy: 0.2723 - loss: 1.9927 - val_accuracy: 0.4196 - val_loss: 1.2318
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3022 - loss: 1.6796 - val_accuracy: 0.4835 - val_loss: 1.2030
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3322 - loss: 1.4969 - val_accuracy: 0.4733 - val_loss: 1.1932
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3535 - loss: 1.3881 - val_accuracy: 0.5007 - val_loss: 1.1728
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3921 - loss: 1.3181 - val_accuracy: 0.5256 - val_loss: 1.1436
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4153 - loss: 1.2603 - val_accuracy: 0.5621 - val_loss: 1.1128
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4564 - loss: 1.2081 - val_accuracy: 0.5548 - val_loss: 1.0913
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4739 - loss: 1.1598 - val_accuracy: 0.5660 - val_loss: 1.0765
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4881 - loss: 1.1289 - val_accuracy: 0.5565 - val_loss: 1.0595
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4968 - loss: 1.1194 - val_accuracy: 0.5614 - val_loss: 1.0534
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5071 - loss: 1.1013 - val_accuracy: 0.5583 - val_loss: 1.0606
Epoch 12/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5157 - loss: 1.0789 - val_accuracy: 0.5597 - val_loss: 1.0489
Epoch 13/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5218 - loss: 1.0674 - val_accuracy: 0.5815 - val_loss: 1.0544
Epoch 14/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5373 - loss: 1.0382 - val_accuracy: 0.5723 - val_loss: 1.0490
Epoch 15/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5494 - loss: 1.0302 - val_accuracy: 0.5815 - val_loss: 1.0500
Epoch 16/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5330 - loss: 1.0368 - val_accuracy: 0.5706 - val_loss: 1.0560
Epoch 17/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5510 - loss: 1.0149 - val_accuracy: 0.5699 - val_loss: 1.0487
Epoch 18/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5536 - loss: 1.0069 - val_accuracy: 0.5857 - val_loss: 1.0408
Epoch 19/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5587 - loss: 0.9985 - val_accuracy: 0.5699 - val_loss: 1.0609
Epoch 20/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5531 - loss: 0.9892 - val_accuracy: 0.5734 - val_loss: 1.0518
Epoch 21/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5699 - loss: 0.9792 - val_accuracy: 0.5755 - val_loss: 1.0445
Epoch 22/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5780 - loss: 0.9682 - val_accuracy: 0.5716 - val_loss: 1.0441
Epoch 23/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5786 - loss: 0.9594 - val_accuracy: 0.5769 - val_loss: 1.0545

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 3: 54.3 [%]
F1-score captured in run 3: 54.4 [%]
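The per-run accuracy and F1 above are presumably macro-averaged over the four intensity classes (an assumption; the log does not state which averaging is used). A self-contained sketch of macro F1 on a toy 4-class example:

```python
def macro_f1(y_true, y_pred, n_classes):
    """Macro F1: unweighted mean of per-class F1 scores."""
    f1s = []
    for c in range(n_classes):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / n_classes

# Toy labels for 4 classes (0..3), not data from this run
y_true = [0, 1, 2, 3, 2, 2, 1, 0]
y_pred = [0, 1, 2, 1, 2, 0, 1, 0]
print(macro_f1(y_true, y_pred, 4))  # -> 0.6
```

Macro averaging weights each class equally regardless of support, which matters here since the "can't be balanced" warnings indicate the class distribution stays skewed.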

=== RUN 4 ===

--- TRAIN (run 4) ---

--- TEST (run 4) ---
Classes: ['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
Number of classes: 4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
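The label mapping shown is consistent with enumerating the distinct class names in sorted order; a minimal sketch (the helper name is hypothetical, not from this codebase):

```python
def build_label_mapping(labels):
    """Map each distinct class name to an integer index, in sorted order."""
    return {name: idx for idx, name in enumerate(sorted(set(labels)))}

# Example: duplicates and arbitrary order collapse to the same mapping
classes = ['SEDENTARY', 'LIGHT-INTENSITY', 'VIGOROUS-INTENSITY',
           'MODERATE-INTENSITY', 'SEDENTARY']
print(build_label_mapping(classes))
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```

These integer indices are what the Dense(4) softmax head at the end of the model predicts.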
This activity can't be balanced (in a downsampling way)  (message repeated 77 times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
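The parameter counts in the summary can be verified arithmetically. A Conv1D kernel size of 5 is inferred here from the first layer's 160,128 parameters over a 250-channel input; it is an assumption, not a value read from the training script:

```python
# Parameter-count formulas for the layer types in the summary above.
def conv1d_params(in_ch, out_ch, kernel):
    return in_ch * kernel * out_ch + out_ch   # kernel weights + bias

def layernorm_params(features):
    return 2 * features                       # gamma + beta

def dense_params(in_f, out_f):
    return in_f * out_f + out_f               # weights + bias

total = (conv1d_params(250, 128, 5)           # conv1d      -> 160,128
         + layernorm_params(128)              # layer_norm  ->     256
         + 2 * (conv1d_params(128, 128, 5)    # conv1d_1/2  ->  82,048 each
                + layernorm_params(128))      # layer_norm  ->     256 each
         + dense_params(128, 4))              # dense       ->     516
print(total)  # 325508, matching "Total params"
```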
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step
89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 906us/step
Global accuracy score (validation) = 57.34 [%]
Global F1 score (validation) = 56.55 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.36376193 0.28054556 0.31531966 0.04037289]
 [0.07621062 0.05110758 0.8638394  0.00884245]
 [0.11059619 0.072248   0.8056331  0.01152277]
 ...
 [0.22073427 0.12761815 0.4583974  0.19325015]
 [0.17880377 0.13100694 0.16705917 0.5231301 ]
 [0.3499794  0.41129103 0.04017484 0.19855472]]
(1545, 4)
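The first array above holds the true class ids, shape (1545, 1); the second holds per-window softmax probabilities, shape (1545, 4). Predicted classes come from an argmax over the class axis, sketched here on a three-row stand-in (illustrative values, not the real arrays):

```python
import numpy as np

y_true = np.array([[2.0], [2.0], [3.0]])      # true class ids, shape (n, 1)
proba = np.array([[0.36, 0.28, 0.32, 0.04],   # softmax rows, shape (n, 4)
                  [0.08, 0.05, 0.86, 0.01],
                  [0.18, 0.13, 0.17, 0.52]])

y_pred = proba.argmax(axis=1)                 # most probable class per row
accuracy = (y_pred == y_true.ravel()).mean()  # flatten (n, 1) -> (n,)
print(y_pred, accuracy)                       # [0 2 3] 0.6666666666666666
```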
-------------------------------------------------

Global accuracy score (train) = 63.54 [%]
Global accuracy score (test) = 51.46 [%]
Global F1 score (train) = 63.26 [%]
Global F1 score (test) = 51.73 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.37      0.30      0.33       400
MODERATE-INTENSITY       0.41      0.54      0.47       400
         SEDENTARY       0.63      0.65      0.64       400
VIGOROUS-INTENSITY       0.68      0.59      0.63       345

          accuracy                           0.51      1545
         macro avg       0.53      0.52      0.52      1545
      weighted avg       0.52      0.51      0.51      1545
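In the report above, `macro avg` gives each of the four classes equal weight, while `weighted avg` weights each class by its support; the two only coincide when classes are balanced. Recomputing both averages from the per-class f1-scores and supports printed in the report:

```python
f1 = {"LIGHT": 0.33, "MODERATE": 0.47, "SEDENTARY": 0.64, "VIGOROUS": 0.63}
support = {"LIGHT": 400, "MODERATE": 400, "SEDENTARY": 400, "VIGOROUS": 345}

macro_f1 = sum(f1.values()) / len(f1)               # unweighted mean
total_support = sum(support.values())               # 1545 windows
weighted_f1 = sum(f1[c] * support[c] for c in f1) / total_support
print(round(macro_f1, 2), round(weighted_f1, 2))    # 0.52 0.51
```

The slightly smaller weighted average reflects VIGOROUS-INTENSITY's smaller support (345 vs 400) despite its relatively high f1-score.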

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259084.480200 1588827 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259086.156380 1588935 service.cc:152] XLA service 0x70db0000d150 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259086.156438 1588935 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:24:46.207180: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259086.370095 1588935 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259089.079170 1588935 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10:38[0m 4s/step - accuracy: 0.2500 - loss: 1.9965
[1m 28/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2481 - loss: 2.1667  
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2602 - loss: 2.1058
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2671 - loss: 2.0705
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2718 - loss: 2.0420
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 18ms/step - accuracy: 0.2748 - loss: 2.0240
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m9s[0m 31ms/step - accuracy: 0.2749 - loss: 2.0234 - val_accuracy: 0.4698 - val_loss: 1.2504
Epoch 2/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.2188 - loss: 1.8498
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2891 - loss: 1.7904 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2962 - loss: 1.7613
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3002 - loss: 1.7417
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3025 - loss: 1.7278
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3038 - loss: 1.7182 - val_accuracy: 0.4677 - val_loss: 1.2197
Epoch 3/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 25ms/step - accuracy: 0.2969 - loss: 1.6523
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3091 - loss: 1.6028 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3165 - loss: 1.5782
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3209 - loss: 1.5627
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3234 - loss: 1.5518
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3249 - loss: 1.5456 - val_accuracy: 0.4810 - val_loss: 1.1995
Epoch 4/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.2892
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3641 - loss: 1.3894 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3571 - loss: 1.3998
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3541 - loss: 1.4023
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3525 - loss: 1.4015
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3526 - loss: 1.3999 - val_accuracy: 0.4937 - val_loss: 1.2011
Epoch 5/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.2656 - loss: 1.5103
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3513 - loss: 1.3784 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3636 - loss: 1.3664
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3682 - loss: 1.3616
[1m132/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3712 - loss: 1.3574
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3741 - loss: 1.3527 - val_accuracy: 0.5225 - val_loss: 1.1673
Epoch 6/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.4375 - loss: 1.2110
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4168 - loss: 1.2827 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4154 - loss: 1.2788
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4162 - loss: 1.2769
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4170 - loss: 1.2751
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4178 - loss: 1.2732 - val_accuracy: 0.5270 - val_loss: 1.1428
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.3906 - loss: 1.2652
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4358 - loss: 1.2287 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4439 - loss: 1.2256
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4464 - loss: 1.2229
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4467 - loss: 1.2220
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4469 - loss: 1.2212 - val_accuracy: 0.5390 - val_loss: 1.1190
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5312 - loss: 1.0635
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4594 - loss: 1.1970 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4623 - loss: 1.1971
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4648 - loss: 1.1922
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4659 - loss: 1.1884
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4662 - loss: 1.1867 - val_accuracy: 0.5558 - val_loss: 1.0815
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.1179
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4891 - loss: 1.1471 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4833 - loss: 1.1470
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4816 - loss: 1.1456
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4810 - loss: 1.1448
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4807 - loss: 1.1451 - val_accuracy: 0.5558 - val_loss: 1.0940
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4531 - loss: 1.2224
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4637 - loss: 1.1769 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4700 - loss: 1.1609
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4742 - loss: 1.1522
[1m131/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4774 - loss: 1.1467
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4801 - loss: 1.1426 - val_accuracy: 0.5583 - val_loss: 1.1113
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0260
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4786 - loss: 1.1227 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4869 - loss: 1.1128
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4893 - loss: 1.1109
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4913 - loss: 1.1100
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4923 - loss: 1.1095 - val_accuracy: 0.5576 - val_loss: 1.0744
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.3438 - loss: 1.2432
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4926 - loss: 1.1111 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5035 - loss: 1.1006
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5072 - loss: 1.0975
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5082 - loss: 1.0962
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5085 - loss: 1.0954 - val_accuracy: 0.5706 - val_loss: 1.0574
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5000 - loss: 1.1027
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5172 - loss: 1.0847 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5178 - loss: 1.0822
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5196 - loss: 1.0797
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5212 - loss: 1.0773
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5214 - loss: 1.0764 - val_accuracy: 0.5646 - val_loss: 1.0630
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5000 - loss: 1.1548
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5314 - loss: 1.0658 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5312 - loss: 1.0631
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5305 - loss: 1.0615
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5294 - loss: 1.0607
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5286 - loss: 1.0600 - val_accuracy: 0.5874 - val_loss: 1.0536
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5938 - loss: 0.9860
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5246 - loss: 1.0475 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5270 - loss: 1.0489
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5295 - loss: 1.0472
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5305 - loss: 1.0470
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5312 - loss: 1.0471 - val_accuracy: 0.5762 - val_loss: 1.0465
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5156 - loss: 1.0240
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5390 - loss: 1.0155 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5362 - loss: 1.0259
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5350 - loss: 1.0288
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5341 - loss: 1.0309
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5339 - loss: 1.0316 - val_accuracy: 0.5850 - val_loss: 1.0453
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5478 - loss: 1.0223 - val_accuracy: 0.5864 - val_loss: 1.0620
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5619 - loss: 1.0022 - val_accuracy: 0.5801 - val_loss: 1.0628
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5623 - loss: 0.9905 - val_accuracy: 0.5748 - val_loss: 1.0672
Epoch 20/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5566 - loss: 0.9964 - val_accuracy: 0.5636 - val_loss: 1.0648
Epoch 21/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5652 - loss: 0.9802 - val_accuracy: 0.5769 - val_loss: 1.0637

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 4: 51.46 [%]
F1-score captured in run 4: 51.73 [%]

=== RUN 5 ===

--- TRAIN (run 5) ---

--- TEST (run 5) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
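The layer parameter counts in the summary above can be checked by hand: they are consistent with 250 input channels per window (the data batches below have shape `(N, 6, 250)`) and a convolution kernel size of 5. A quick arithmetic sketch (the kernel size is inferred from the counts, not stated in the log):

```python
def conv1d_params(kernel_size: int, in_channels: int, filters: int) -> int:
    # Keras Conv1D: one kernel weight per (tap, input channel, filter), plus a bias per filter.
    return kernel_size * in_channels * filters + filters

def dense_params(in_features: int, units: int) -> int:
    # Dense: weight matrix plus bias vector.
    return in_features * units + units

conv0 = conv1d_params(5, 250, 128)   # 160,128 -> matches conv1d
conv_n = conv1d_params(5, 128, 128)  # 82,048  -> matches conv1d_1 and conv1d_2
layer_norm = 128 + 128               # 256: one gamma and one beta per channel
head = dense_params(128, 4)          # 516     -> matches dense

total = conv0 + 2 * conv_n + 3 * layer_norm + head
print(total)  # 325508, the total reported in the summary
```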
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 918us/step
Global accuracy score (validation) = 56.46 [%]
Global F1 score (validation) = 55.64 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.12070926 0.07781659 0.7833628  0.01811143]
 [0.11438675 0.07248855 0.79684263 0.01628205]
 [0.16406561 0.09906639 0.6880382  0.04882989]
 ...
 [0.1130296  0.06182274 0.23280063 0.592347  ]
 [0.33431724 0.32880992 0.05185578 0.2850171 ]
 [0.35609454 0.4527555  0.03361947 0.15753041]]
(1545, 4)
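The `(1545, 4)` matrix above holds per-window softmax probabilities, one column per class in the label mapping printed earlier; the predicted class for each window is the row-wise argmax. A minimal pure-Python sketch, using the first and last probability rows from the log:

```python
label_map = {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1,
             'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
inv_label_map = {v: k for k, v in label_map.items()}

# First and last probability rows printed above.
probs = [[0.12070926, 0.07781659, 0.7833628, 0.01811143],
         [0.35609454, 0.4527555, 0.03361947, 0.15753041]]

def argmax(row):
    # Index of the largest probability in the row.
    return max(range(len(row)), key=row.__getitem__)

pred_labels = [inv_label_map[argmax(row)] for row in probs]
print(pred_labels)  # ['SEDENTARY', 'MODERATE-INTENSITY']
```

Note that the first prediction matches the first true label (class 2, SEDENTARY) in the target vector printed above, while the last row disagrees with its true label (class 3), consistent with the ~56% test accuracy.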
-------------------------------------------------

Global accuracy score (train) = 62.56 [%]
Global accuracy score (test) = 56.25 [%]
Global F1 score (train) = 62.12 [%]
Global F1 score (test) = 56.49 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.44      0.37      0.40       400
MODERATE-INTENSITY       0.47      0.59      0.53       400
         SEDENTARY       0.66      0.68      0.67       400
VIGOROUS-INTENSITY       0.72      0.61      0.66       345

          accuracy                           0.56      1545
         macro avg       0.57      0.56      0.56      1545
      weighted avg       0.57      0.56      0.56      1545

2025-11-04 13:25:12.393517: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:25:12.405126: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259112.418397 1591761 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259112.422479 1591761 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259112.432622 1591761 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:25:12.435811: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259114.788481 1591761 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259116.449476 1591869 service.cc:152] XLA service 0x73374000b980 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259116.449562 1591869 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:25:16.492198: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259116.655178 1591869 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259119.355991 1591869 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2713 - loss: 2.0129 - val_accuracy: 0.4498 - val_loss: 1.3443
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3141 - loss: 1.6870 - val_accuracy: 0.4905 - val_loss: 1.1992
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3216 - loss: 1.5370 - val_accuracy: 0.5063 - val_loss: 1.1961
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3498 - loss: 1.3999 - val_accuracy: 0.5197 - val_loss: 1.1839
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3854 - loss: 1.3267 - val_accuracy: 0.5379 - val_loss: 1.1527
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4214 - loss: 1.2636 - val_accuracy: 0.5523 - val_loss: 1.1266
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.3906 - loss: 1.2689
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4416 - loss: 1.2152 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4480 - loss: 1.2061
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4529 - loss: 1.2007
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4551 - loss: 1.1981
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4556 - loss: 1.1976 - val_accuracy: 0.5702 - val_loss: 1.0980
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0707
[1m 30/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5060 - loss: 1.1276 
[1m 64/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4881 - loss: 1.1520
[1m 97/164[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4830 - loss: 1.1606
[1m129/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4807 - loss: 1.1644
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4790 - loss: 1.1662
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4790 - loss: 1.1662 - val_accuracy: 0.5685 - val_loss: 1.0809
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.2574
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4850 - loss: 1.1576 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4889 - loss: 1.1454
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4907 - loss: 1.1422
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4917 - loss: 1.1402
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4919 - loss: 1.1393 - val_accuracy: 0.5762 - val_loss: 1.0775
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5156 - loss: 1.0039
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5063 - loss: 1.1044 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5020 - loss: 1.1104
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5016 - loss: 1.1117
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5016 - loss: 1.1115
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5019 - loss: 1.1113 - val_accuracy: 0.5822 - val_loss: 1.0712
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.0102
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5081 - loss: 1.0764 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5098 - loss: 1.0852
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5088 - loss: 1.0891
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5092 - loss: 1.0909
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5093 - loss: 1.0914 - val_accuracy: 0.5846 - val_loss: 1.0732
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4844 - loss: 1.0161
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5261 - loss: 1.0810 
[1m 75/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5295 - loss: 1.0786
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5287 - loss: 1.0775
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5278 - loss: 1.0771
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5272 - loss: 1.0771 - val_accuracy: 0.5555 - val_loss: 1.0689
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4375 - loss: 1.0595
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5306 - loss: 1.0426 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5324 - loss: 1.0462
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5311 - loss: 1.0506
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5292 - loss: 1.0548
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5281 - loss: 1.0565 - val_accuracy: 0.5720 - val_loss: 1.0538
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6406 - loss: 0.8337
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5562 - loss: 1.0362 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5465 - loss: 1.0444
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5440 - loss: 1.0460
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5422 - loss: 1.0464
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5412 - loss: 1.0471 - val_accuracy: 0.5611 - val_loss: 1.0828
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 0.9300
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5508 - loss: 1.0108 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5494 - loss: 1.0270
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5461 - loss: 1.0335
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5445 - loss: 1.0368
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5441 - loss: 1.0373 - val_accuracy: 0.5772 - val_loss: 1.0842
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0866
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5226 - loss: 1.0556 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5277 - loss: 1.0462
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5326 - loss: 1.0397
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5360 - loss: 1.0339
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5374 - loss: 1.0317 - val_accuracy: 0.5787 - val_loss: 1.0696
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5781 - loss: 1.0142
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5245 - loss: 1.0462 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5283 - loss: 1.0414
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5323 - loss: 1.0375
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5354 - loss: 1.0341
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5370 - loss: 1.0321 - val_accuracy: 0.5857 - val_loss: 1.0686
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5000 - loss: 1.1788
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5444 - loss: 1.0639 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5518 - loss: 1.0360
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5542 - loss: 1.0260
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5550 - loss: 1.0220
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5561 - loss: 1.0193 - val_accuracy: 0.5751 - val_loss: 1.0864

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 5: 56.25 [%]
F1-score captured in run 5: 56.49 [%]

=== RUN 6 ===

--- TRAIN (run 6) ---

--- TEST (run 6) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
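The mapping above assigns indices in alphabetical order of the class names, which is what scikit-learn's LabelEncoder does. A minimal sketch (the variable names are illustrative, not taken from the pipeline's code):

```python
# Rebuild the class-name -> integer-index mapping printed in the log.
# Sorting alphabetically mirrors sklearn.preprocessing.LabelEncoder,
# which numbers the unique labels in lexicographic order.
classes = ['LIGHT-INTENSITY', 'MODERATE-INTENSITY', 'SEDENTARY', 'VIGOROUS-INTENSITY']
label_map = {name: idx for idx, name in enumerate(sorted(set(classes)))}
print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```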
This activity can't be balanced (in a downsampling way)
(previous message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
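The parameter counts in the summary can be checked by hand: a Conv1D layer has kernel_size x in_channels x filters weights plus one bias per filter, LayerNormalization has a gain and a bias per feature, and Dense has in x out weights plus out biases. Assuming a channels-last input of shape (6, 250) and kernel size 5 (values inferred from the counts, not stated in the log), the totals reproduce exactly:

```python
def conv1d_params(kernel, in_ch, filters):
    # weights: kernel * in_ch * filters, plus one bias per filter
    return kernel * in_ch * filters + filters

def layernorm_params(features):
    # one gain and one bias per feature
    return 2 * features

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

total = (conv1d_params(5, 250, 128)    # conv1d:              160,128
         + layernorm_params(128)       # layer_normalization:     256
         + conv1d_params(5, 128, 128)  # conv1d_1:             82,048
         + layernorm_params(128)       # layer_normalization_1:   256
         + conv1d_params(5, 128, 128)  # conv1d_2:             82,048
         + layernorm_params(128)       # layer_normalization_2:   256
         + dense_params(128, 4))       # dense:                   516
print(total)  # 325508, matching "Total params: 325,508"
```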
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 920us/step
Global accuracy score (validation) = 57.48 [%]
Global F1 score (validation) = 56.24 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.13773131 0.09093001 0.7405229  0.03081572]
 [0.21191877 0.16825292 0.58318627 0.03664205]
 [0.21269682 0.1683894  0.583629   0.03528473]
 ...
 [0.10343016 0.0705755  0.06191561 0.76407874]
 [0.13947937 0.08148441 0.13425912 0.6447771 ]
 [0.4190907  0.4720625  0.04199351 0.06685336]]
(1545, 4)
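The (1545, 1) column appears to hold ground-truth class indices, while the (1545, 4) matrix holds per-window softmax outputs: note the last printed probability row peaks at index 1 even though the corresponding label is 3, consistent with a misclassification. The predicted class for each window is the per-row argmax; a minimal sketch using three of the rows printed above:

```python
# Per-row argmax recovers the predicted class index from the softmax output.
rows = [
    [0.13773131, 0.09093001, 0.7405229, 0.03081572],   # first printed row
    [0.10343016, 0.0705755, 0.06191561, 0.76407874],   # third-from-last row
    [0.4190907, 0.4720625, 0.04199351, 0.06685336],    # last printed row
]
preds = [max(range(4), key=row.__getitem__) for row in rows]
print(preds)  # [2, 3, 1] -> SEDENTARY, VIGOROUS-INTENSITY, MODERATE-INTENSITY
```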
-------------------------------------------------

Global accuracy score (train) = 60.86 [%]
Global accuracy score (test) = 55.02 [%]
Global F1 score (train) = 60.02 [%]
Global F1 score (test) = 54.67 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.41      0.29      0.34       400
MODERATE-INTENSITY       0.47      0.62      0.53       400
         SEDENTARY       0.64      0.69      0.67       400
VIGOROUS-INTENSITY       0.70      0.60      0.65       345

          accuracy                           0.55      1545
         macro avg       0.55      0.55      0.55      1545
      weighted avg       0.55      0.55      0.54      1545
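The report's bottom rows follow from the per-class rows: macro avg is the unweighted mean of the four per-class scores, while weighted avg weights each class by its support. Reproducing the f1-score column:

```python
# Per-class f1 and support, as printed in the classification report above.
f1 = [0.34, 0.53, 0.67, 0.65]
support = [400, 400, 400, 345]

macro = sum(f1) / len(f1)                                        # unweighted mean
weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)  # support-weighted mean
print(round(macro, 2), round(weighted, 2))  # 0.55 0.54, matching the report
```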

2025-11-04 13:25:41.307259: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259143.677479 1594399 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(previous message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
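The parameter counts in this summary pin down the architecture: with a channels-last input of shape (6, 250) (matching the array shapes printed later in the log), the first Conv1D count of 160,128 implies a kernel size of 5. A minimal arithmetic check of the summary, assuming that inferred kernel size (the log does not state it directly):

```python
# Per-layer parameter formulas for the layers listed in the summary above.
def conv1d_params(kernel, in_channels, filters):
    # Conv1D: one (kernel x in_channels) weight per filter, plus a bias per filter.
    return kernel * in_channels * filters + filters

def layernorm_params(features):
    # LayerNormalization over the feature axis: gamma + beta per feature.
    return 2 * features

def dense_params(in_features, units):
    return in_features * units + units

total = (
    conv1d_params(5, 250, 128)    # conv1d: 160,128
    + layernorm_params(128)       # layer_normalization: 256
    + conv1d_params(5, 128, 128)  # conv1d_1: 82,048
    + layernorm_params(128)       # layer_normalization_1: 256
    + conv1d_params(5, 128, 128)  # conv1d_2: 82,048
    + layernorm_params(128)       # layer_normalization_2: 256
    + dense_params(128, 4)        # dense: 516
)
print(total)  # 325508, matching "Total params" in the summary
```

Dropout and GlobalAveragePooling1D contribute no parameters, which is why they show 0 in the table.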
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259145.352882 1594529 service.cc:152] XLA service 0x7fa94800bfb0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259145.352936 1594529 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:25:45.397714: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259145.561240 1594529 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259148.298792 1594529 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2737 - loss: 2.0357 - val_accuracy: 0.4614 - val_loss: 1.3492
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3075 - loss: 1.7328 - val_accuracy: 0.4649 - val_loss: 1.2127
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3250 - loss: 1.5416 - val_accuracy: 0.4796 - val_loss: 1.1934
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3433 - loss: 1.4207 - val_accuracy: 0.5242 - val_loss: 1.1756
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3893 - loss: 1.3252 - val_accuracy: 0.5348 - val_loss: 1.1461
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4090 - loss: 1.2789 - val_accuracy: 0.5513 - val_loss: 1.1321
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4369 - loss: 1.2299 - val_accuracy: 0.5583 - val_loss: 1.1039
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4628 - loss: 1.1760 - val_accuracy: 0.5548 - val_loss: 1.0795
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4849 - loss: 1.1403 - val_accuracy: 0.5646 - val_loss: 1.0818
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4984 - loss: 1.1167 - val_accuracy: 0.5650 - val_loss: 1.0632
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5089 - loss: 1.0874 - val_accuracy: 0.5787 - val_loss: 1.0589
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5149 - loss: 1.0748 - val_accuracy: 0.5621 - val_loss: 1.0553
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5286 - loss: 1.0715 - val_accuracy: 0.5565 - val_loss: 1.0560
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5215 - loss: 1.0754 - val_accuracy: 0.5593 - val_loss: 1.0585
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5227 - loss: 1.0445 - val_accuracy: 0.5653 - val_loss: 1.0525
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5508 - loss: 1.0248 - val_accuracy: 0.5590 - val_loss: 1.0440
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5622 - loss: 1.0104 - val_accuracy: 0.5534 - val_loss: 1.0492
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5594 - loss: 0.9997 - val_accuracy: 0.5664 - val_loss: 1.0425
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5678 - loss: 0.9890 - val_accuracy: 0.5579 - val_loss: 1.0687
Epoch 20/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5737 - loss: 0.9839 - val_accuracy: 0.5685 - val_loss: 1.0629
Epoch 21/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5652 - loss: 0.9746 - val_accuracy: 0.5783 - val_loss: 1.0506
Epoch 22/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5696 - loss: 0.9696 - val_accuracy: 0.5590 - val_loss: 1.0692
Epoch 23/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5767 - loss: 0.9547 - val_accuracy: 0.5769 - val_loss: 1.0610

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 6: 55.02 [%]
F1-score captured in run 6: 54.67 [%]

=== RUN 7 ===

--- TRAIN (run 7) ---

--- TEST (run 7) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
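The class list, class count, and mapping printed above are exactly what a sorted-unique pass over the activity labels produces. The pipeline's own code is not shown in the log, so this is a sketch assuming the mapping is built with `np.unique` and an enumerated dict:

```python
import numpy as np

# Hypothetical label array; order doesn't matter, np.unique sorts it.
labels = np.array(['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
                   'MODERATE-INTENSITY'])

classes = np.unique(labels)  # sorted unique class names
mapping = {name: idx for idx, name in enumerate(classes)}

print(classes)       # ['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
print(len(classes))  # 4
print(mapping)       # {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```

Because `np.unique` sorts lexicographically, the integer IDs follow alphabetical order of the class names, which is why SEDENTARY maps to 2 rather than 0.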
This activity can't be balanced (in a downsampling way)
[last message repeated 76 more times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
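The parameter counts in the summary above can be reproduced from the layer shapes alone. A quick arithmetic check, assuming a Conv1D kernel size of 5 (an inference from the counts; the summary does not print the kernel size):

```python
def conv1d_params(kernel_size, in_channels, filters):
    # weight tensor (kernel_size, in_channels, filters) plus one bias per filter
    return kernel_size * in_channels * filters + filters

conv0 = conv1d_params(5, 250, 128)   # first Conv1D sees 250 input channels -> 160,128
conv_n = conv1d_params(5, 128, 128)  # the two later Conv1D layers -> 82,048 each
ln = 2 * 128                         # LayerNormalization: gamma + beta per feature -> 256
dense = 128 * 4 + 4                  # Dense to 4 classes -> 516

total = conv0 + 2 * conv_n + 3 * ln + dense
print(total)  # 325,508, matching "Total params" above
```

Dropout and GlobalAveragePooling1D contribute no parameters, which is why the remaining rows show 0.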
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 912us/step
Global accuracy score (validation) = 57.41 [%]
Global F1 score (validation) = 56.37 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.07067136 0.03976003 0.8807954  0.00877328]
 [0.11759622 0.07047898 0.7951682  0.01675662]
 [0.10838885 0.06549589 0.8113405  0.01477464]
 ...
 [0.06783697 0.04422224 0.06756473 0.82037604]
 [0.23293173 0.21282037 0.03748621 0.51676166]
 [0.40808362 0.49473542 0.03218193 0.0649991 ]]
(1545, 4)
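The (1545, 4) matrix holds per-window softmax probabilities over the four classes; a predicted class is the argmax of each row. The (1545, 1) column above appears to be the per-window class label (an assumption from context): the first printed probability row's argmax (class 2) matches the first label, while the last row is split between classes 0 and 1 and its argmax disagrees with the printed 3, the kind of row behind the ~53 % test accuracy. A minimal check using the two rows copied verbatim from the log:

```python
# Rows copied from the probability matrix printed above
first_row = [0.07067136, 0.03976003, 0.8807954, 0.00877328]
last_row  = [0.40808362, 0.49473542, 0.03218193, 0.0649991]

def predict(row):
    # index of the largest class probability
    return max(range(len(row)), key=row.__getitem__)

assert abs(sum(first_row) - 1.0) < 1e-6  # softmax rows sum to ~1
print(predict(first_row))  # 2 (confident: 88 %)
print(predict(last_row))   # 1 (uncertain: 49 % vs 41 %)
```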
-------------------------------------------------

Global accuracy score (train) = 63.56 [%]
Global accuracy score (test) = 53.27 [%]
Global F1 score (train) = 62.98 [%]
Global F1 score (test) = 53.24 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.31      0.35       400
MODERATE-INTENSITY       0.47      0.56      0.51       400
         SEDENTARY       0.60      0.65      0.63       400
VIGOROUS-INTENSITY       0.67      0.63      0.65       345

          accuracy                           0.53      1545
         macro avg       0.53      0.54      0.53      1545
      weighted avg       0.53      0.53      0.53      1545
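The report's average rows can be cross-checked by hand: "macro avg" is the unweighted mean of the per-class scores, while "weighted avg" weights each class by its support. A sketch using the rounded per-class F1 values printed above (the macro figure can therefore differ in the last digit from the report's own, which averages unrounded scores):

```python
# Per-class rows copied from the report: (precision, recall, f1, support)
rows = {
    "LIGHT-INTENSITY":    (0.39, 0.31, 0.35, 400),
    "MODERATE-INTENSITY": (0.47, 0.56, 0.51, 400),
    "SEDENTARY":          (0.60, 0.65, 0.63, 400),
    "VIGOROUS-INTENSITY": (0.67, 0.63, 0.65, 345),
}

n = sum(s for *_, s in rows.values())                          # total support: 1545
macro_f1 = sum(f1 for _, _, f1, _ in rows.values()) / len(rows)
weighted_f1 = sum(f1 * s for _, _, f1, s in rows.values()) / n

print(n, macro_f1, weighted_f1)  # weighted F1 ~0.531, matching the 0.53 row
```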

2025-11-04 13:26:12.183333: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:26:12.194900: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259172.207946 1597512 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259172.212154 1597512 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259172.222193 1597512 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:26:12.225307: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259174.553422 1597512 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
I0000 00:00:1762259176.203952 1597645 service.cc:152] XLA service 0x73706401e0b0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259176.203983 1597645 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:26:16.246753: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259176.420266 1597645 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259179.184976 1597645 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2824 - loss: 1.9698 - val_accuracy: 0.4410 - val_loss: 1.2511
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3136 - loss: 1.6684 - val_accuracy: 0.4702 - val_loss: 1.2026
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3326 - loss: 1.5083 - val_accuracy: 0.4912 - val_loss: 1.1897
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3647 - loss: 1.3898 - val_accuracy: 0.5060 - val_loss: 1.1671
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4033 - loss: 1.3099 - val_accuracy: 0.5176 - val_loss: 1.1318
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4170 - loss: 1.2521 - val_accuracy: 0.5456 - val_loss: 1.0965
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4577 - loss: 1.1958 - val_accuracy: 0.5593 - val_loss: 1.0775
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4798 - loss: 1.1603 - val_accuracy: 0.5604 - val_loss: 1.0564
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4908 - loss: 1.1374 - val_accuracy: 0.5583 - val_loss: 1.0609
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4910 - loss: 1.1132 - val_accuracy: 0.5572 - val_loss: 1.0701
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5053 - loss: 1.0927 - val_accuracy: 0.5632 - val_loss: 1.0562
Epoch 12/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5076 - loss: 1.0847 - val_accuracy: 0.5702 - val_loss: 1.0466
Epoch 13/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5273 - loss: 1.0680 - val_accuracy: 0.5755 - val_loss: 1.0498
Epoch 14/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5238 - loss: 1.0615 - val_accuracy: 0.5660 - val_loss: 1.0413
Epoch 15/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5394 - loss: 1.0414 - val_accuracy: 0.5741 - val_loss: 1.0460
Epoch 16/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5390 - loss: 1.0395 - val_accuracy: 0.5611 - val_loss: 1.0850
Epoch 17/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5476 - loss: 1.0308 - val_accuracy: 0.5636 - val_loss: 1.0570
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5469 - loss: 0.9592
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5653 - loss: 0.9995 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5615 - loss: 0.9994
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5595 - loss: 1.0009
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5590 - loss: 1.0010
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5587 - loss: 1.0010 - val_accuracy: 0.5653 - val_loss: 1.0595
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.9787
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5499 - loss: 0.9856 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5499 - loss: 0.9927
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5510 - loss: 0.9965
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5523 - loss: 0.9977
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5530 - loss: 0.9983 - val_accuracy: 0.5660 - val_loss: 1.0536

49/49 - 1s 17ms/step
Saved model to disk.

Accuracy captured in run 7: 53.27 [%]
F1-score captured in run 7: 53.24 [%]

=== RUN 8 ===

--- TRAIN (run 8) ---

--- TEST (run 8) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
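The printed class list and label mapping are consistent with sorting the unique class names alphabetically and numbering them in order. A minimal sketch of that convention (an assumption about the pipeline, not its actual code):

```python
# Hypothetical illustration: derive the printed label mapping from raw labels.
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']  # example raw activity labels

classes = sorted(set(labels))  # unique names in alphabetical order
mapping = {name: idx for idx, name in enumerate(classes)}

print(classes)
print(mapping)  # matches the "Label mapping" line above
```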
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
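The Param # column of the summary can be checked arithmetically, assuming kernel_size=5 for the Conv1D layers (inferred from the counts, not printed) and input windows of shape (6, 250) as the later "(1545, 6, 250)" print suggests:

```python
# Sanity-check the parameter counts in the model summary above.
# Assumptions: Conv1D kernel_size=5; input shape (6, 250); 4 output classes.
k, f, c_in, n_classes = 5, 128, 250, 4

conv1d    = k * c_in * f + f           # first Conv1D: kernel weights + biases = 160,128
conv1d_n  = k * f * f + f              # conv1d_1 / conv1d_2 (128 input channels) = 82,048
layernorm = 2 * f                      # LayerNormalization: gamma + beta = 256
dense     = f * n_classes + n_classes  # classifier head = 516

total = conv1d + 2 * conv1d_n + 3 * layernorm + dense
print(total)  # 325508, matching "Total params" in the summary
```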
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 15ms/step

89/89 - 0s 947us/step
Global accuracy score (validation) = 56.25 [%]
Global F1 score (validation) = 55.27 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.3494122  0.51002276 0.07428005 0.066285  ]
 [0.2789959  0.21723042 0.4565147  0.04725901]
 [0.10584166 0.06501867 0.81926507 0.00987468]
 ...
 [0.15051262 0.08639491 0.53260005 0.23049247]
 [0.23976865 0.15856318 0.35813224 0.24353597]
 [0.4144436  0.49240386 0.04991498 0.04323746]]
(1545, 4)
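The (1545, 4) matrix above holds per-class softmax probabilities, one row per test window; the predicted class index is the row-wise argmax. A sketch using the first printed row:

```python
# Turn a row of softmax probabilities into a predicted class.
# First printed row of the (1545, 4) probability matrix above.
row = [0.3494122, 0.51002276, 0.07428005, 0.066285]

pred = max(range(len(row)), key=row.__getitem__)  # index of the largest probability
classes = ['LIGHT-INTENSITY', 'MODERATE-INTENSITY', 'SEDENTARY', 'VIGOROUS-INTENSITY']
print(pred, classes[pred])  # -> 1 MODERATE-INTENSITY
```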
-------------------------------------------------

Global accuracy score (train) = 61.27 [%]
Global accuracy score (test) = 52.94 [%]
Global F1 score (train) = 60.73 [%]
Global F1 score (test) = 52.63 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.27      0.32       400
MODERATE-INTENSITY       0.45      0.60      0.51       400
         SEDENTARY       0.61      0.66      0.64       400
VIGOROUS-INTENSITY       0.68      0.60      0.64       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.53      1545
      weighted avg       0.53      0.53      0.52      1545
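In the report above, "macro avg" is the unweighted mean of the per-class F1 scores, while "weighted avg" weights each class by its support, which is why the two rows differ slightly. Checking against the printed per-class values:

```python
# Recompute the report's averaged F1 rows from its per-class values.
f1      = [0.32, 0.51, 0.64, 0.64]  # per-class F1 from the report above
support = [400, 400, 400, 345]      # per-class support

macro    = sum(f1) / len(f1)                                       # unweighted mean
weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)  # support-weighted

print(macro, weighted)  # approx. 0.53 and 0.52, matching the report
```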

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259204.093619 1600259 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Epoch 1/91
I0000 00:00:1762259205.769799 1600390 service.cc:152] XLA service 0x7e49c400c930 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259205.769842 1600390 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:26:45.815880: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259205.984150 1600390 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259208.727002 1600390 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 - 9s 31ms/step - accuracy: 0.2856 - loss: 1.9803 - val_accuracy: 0.4635 - val_loss: 1.2461
Epoch 2/91
164/164 - 0s 2ms/step - accuracy: 0.2981 - loss: 1.7317 - val_accuracy: 0.4902 - val_loss: 1.2311
Epoch 3/91
164/164 - 0s 2ms/step - accuracy: 0.3207 - loss: 1.5420 - val_accuracy: 0.5070 - val_loss: 1.1837
Epoch 4/91
164/164 - 0s 2ms/step - accuracy: 0.3478 - loss: 1.4089 - val_accuracy: 0.5183 - val_loss: 1.1808
Epoch 5/91
164/164 - 0s 2ms/step - accuracy: 0.3844 - loss: 1.3189 - val_accuracy: 0.5411 - val_loss: 1.1622
Epoch 6/91
164/164 - 0s 2ms/step - accuracy: 0.4174 - loss: 1.2637 - val_accuracy: 0.5411 - val_loss: 1.1259
Epoch 7/91
164/164 - 0s 2ms/step - accuracy: 0.4459 - loss: 1.2137 - val_accuracy: 0.5418 - val_loss: 1.0956
Epoch 8/91
164/164 - 0s 2ms/step - accuracy: 0.4704 - loss: 1.1677 - val_accuracy: 0.5565 - val_loss: 1.0764
Epoch 9/91
164/164 - 0s 2ms/step - accuracy: 0.4889 - loss: 1.1462 - val_accuracy: 0.5667 - val_loss: 1.0618
Epoch 10/91
164/164 - 0s 2ms/step - accuracy: 0.5010 - loss: 1.1157 - val_accuracy: 0.5650 - val_loss: 1.0501
Epoch 11/91
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4936 - loss: 1.1408 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5030 - loss: 1.1240
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5076 - loss: 1.1153
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5100 - loss: 1.1093
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5105 - loss: 1.1075 - val_accuracy: 0.5674 - val_loss: 1.0385
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4219 - loss: 1.1858
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4832 - loss: 1.1195 
[1m 76/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4941 - loss: 1.1058
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5014 - loss: 1.1000
[1m149/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5062 - loss: 1.0956
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5077 - loss: 1.0941 - val_accuracy: 0.5646 - val_loss: 1.0536
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4844 - loss: 1.0412
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5143 - loss: 1.0739 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5231 - loss: 1.0664
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5244 - loss: 1.0642
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5244 - loss: 1.0653
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5245 - loss: 1.0654 - val_accuracy: 0.5667 - val_loss: 1.0585
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4062 - loss: 1.1921
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5292 - loss: 1.0454 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5295 - loss: 1.0481
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5296 - loss: 1.0511
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5304 - loss: 1.0514
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5308 - loss: 1.0512 - val_accuracy: 0.5948 - val_loss: 1.0359
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.1171
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4936 - loss: 1.0913 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5120 - loss: 1.0674
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5194 - loss: 1.0587
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5232 - loss: 1.0553
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5251 - loss: 1.0536 - val_accuracy: 0.5667 - val_loss: 1.0403
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0641
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5368 - loss: 1.0286 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5365 - loss: 1.0277
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5362 - loss: 1.0275
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5370 - loss: 1.0268
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5379 - loss: 1.0261 - val_accuracy: 0.5734 - val_loss: 1.0424
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6094 - loss: 0.9106
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5482 - loss: 0.9994 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5501 - loss: 0.9977
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5513 - loss: 0.9978
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5525 - loss: 0.9984
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5526 - loss: 1.0003 - val_accuracy: 0.5779 - val_loss: 1.0293
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.1647
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5475 - loss: 1.0342 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5543 - loss: 1.0182
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5564 - loss: 1.0133
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5565 - loss: 1.0119
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5567 - loss: 1.0113 - val_accuracy: 0.5909 - val_loss: 1.0402
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5938 - loss: 0.9647
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5748 - loss: 0.9871 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5693 - loss: 0.9912
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5669 - loss: 0.9913
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5654 - loss: 0.9921
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5648 - loss: 0.9923 - val_accuracy: 0.5811 - val_loss: 1.0517
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.6406 - loss: 0.8336
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5463 - loss: 1.0099 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5526 - loss: 1.0038
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5576 - loss: 0.9976
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5603 - loss: 0.9944
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5613 - loss: 0.9931 - val_accuracy: 0.5980 - val_loss: 1.0521
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.0676
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5400 - loss: 0.9879 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5499 - loss: 0.9835
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5560 - loss: 0.9819
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5595 - loss: 0.9802
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5613 - loss: 0.9790 - val_accuracy: 0.5846 - val_loss: 1.0305
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.0260
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5661 - loss: 0.9585 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5692 - loss: 0.9644
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5722 - loss: 0.9646
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5737 - loss: 0.9645
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5747 - loss: 0.9643 - val_accuracy: 0.5790 - val_loss: 1.0393

49/49 - 17ms/step
Saved model to disk.

Accuracy captured in run 8: 52.94 [%]
F1-score captured in run 8: 52.63 [%]

=== RUN 9 ===

--- TRAIN (run 9) ---

--- TEST (run 9) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
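The sorted class array, class count, and label mapping printed above are consistent with a sorted-unique encoding. A minimal sketch of how such a mapping is built (the `labels` sample here is illustrative, not the real data):

```python
# Build a label -> index mapping from raw activity labels.
# Sorted-unique ordering matches the alphabetical order in the log above.
labels = ["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
          "MODERATE-INTENSITY", "SEDENTARY"]
classes = sorted(set(labels))
mapping = {name: idx for idx, name in enumerate(classes)}
print(classes)       # alphabetical class order
print(len(classes))  # class count
print(mapping)
```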
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
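The Param # column in the summary can be checked by hand. Assuming channels-last input windows of shape (6, 250) and `kernel_size=5` with `padding='same'` (values consistent with the counts above, not stated explicitly in the log):

```python
def conv1d_params(kernel, c_in, c_out):
    # weights: kernel * c_in * c_out, plus one bias per output filter
    return kernel * c_in * c_out + c_out

p_conv1 = conv1d_params(5, 250, 128)   # first Conv1D
p_conv2 = conv1d_params(5, 128, 128)   # second/third Conv1D
p_ln    = 2 * 128                      # LayerNormalization: gamma + beta
p_dense = 128 * 4 + 4                  # Dense(4) head

total = p_conv1 + 2 * p_conv2 + 3 * p_ln + p_dense
print(total)  # 325,508, matching "Total params" in the summary
```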
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2ms/step

49/49 - 14ms/step

89/89 - 947us/step
Global accuracy score (validation) = 57.23 [%]
Global F1 score (validation) = 55.45 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.07413686 0.04709338 0.8705962  0.00817357]
 [0.35467428 0.54236674 0.04532144 0.05763752]
 [0.4035364  0.46215457 0.09860842 0.03570064]
 ...
 [0.07887482 0.06350099 0.05024891 0.80737525]
 [0.16346104 0.11087964 0.22191735 0.50374186]
 [0.27945662 0.4579173  0.02092732 0.24169879]]
(1545, 4)
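Each row of the (1545, 4) matrix above is a probability distribution over the four intensity classes; the predicted class is the row-wise argmax, which lines up with the (1545, 1) label column printed before it. Using the first row from the log:

```python
# First probability row from the log above; the predicted class is its argmax.
row = [0.07413686, 0.04709338, 0.8705962, 0.00817357]
pred = max(range(len(row)), key=row.__getitem__)
print(pred)  # 2 -> SEDENTARY, matching the first entry of the label column
```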
-------------------------------------------------

Global accuracy score (train) = 62.21 [%]
Global accuracy score (test) = 52.56 [%]
Global F1 score (train) = 60.82 [%]
Global F1 score (test) = 51.48 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.36      0.19      0.25       400
MODERATE-INTENSITY       0.43      0.64      0.51       400
         SEDENTARY       0.62      0.68      0.65       400
VIGOROUS-INTENSITY       0.69      0.61      0.65       345

          accuracy                           0.53      1545
         macro avg       0.52      0.53      0.51      1545
      weighted avg       0.52      0.53      0.51      1545
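The macro and weighted averages in the report can be recomputed from the per-class f1 scores and supports. Because the table values are rounded to two decimals, the last digit of the recomputed averages may differ slightly from sklearn's internal (unrounded) computation:

```python
# Per-class f1 and support, copied from the classification report above.
f1 = {"LIGHT-INTENSITY": 0.25, "MODERATE-INTENSITY": 0.51,
      "SEDENTARY": 0.65, "VIGOROUS-INTENSITY": 0.65}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

macro = sum(f1.values()) / len(f1)                              # unweighted mean
weighted = sum(f1[k] * support[k] for k in f1) / sum(support.values())
print(macro, weighted)
```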

2025-11-04 13:27:12.264206: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:27:12.275563: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259232.288562 1603279 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259232.292535 1603279 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259232.302393 1603279 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:27:12.305644: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259234.647780 1603279 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
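The parameter counts in the summary can be cross-checked by hand. Assuming channels-last windows of shape (6, 250) (matching the array shapes printed later in the log) and kernel_size=5 convolutions (inferred from the counts themselves, since 5 × 250 × 128 + 128 = 160,128), the layer totals follow from the standard formulas:

```python
def conv1d_params(kernel_size, in_channels, filters):
    # weight tensor: kernel_size * in_channels * filters, plus one bias per filter
    return kernel_size * in_channels * filters + filters

def layernorm_params(channels):
    # one gamma and one beta per channel
    return 2 * channels

def dense_params(in_features, units):
    return in_features * units + units

total = (
    conv1d_params(5, 250, 128)    # conv1d:              160,128
    + layernorm_params(128)       # layer_normalization:     256
    + conv1d_params(5, 128, 128)  # conv1d_1:             82,048
    + layernorm_params(128)       # layer_normalization_1:   256
    + conv1d_params(5, 128, 128)  # conv1d_2:             82,048
    + layernorm_params(128)       # layer_normalization_2:   256
    + dense_params(128, 4)        # dense:                   516
)
print(total)  # 325508, matching "Total params" above
```

Dropout and global average pooling contribute no parameters, which is why their rows show 0.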
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259236.285216 1603397 service.cc:152] XLA service 0x74b1bc004160 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259236.285284 1603397 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:27:16.335087: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259236.497505 1603397 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259239.181442 1603397 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2751 - loss: 2.0168 - val_accuracy: 0.4642 - val_loss: 1.2652
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3044 - loss: 1.7167 - val_accuracy: 0.4726 - val_loss: 1.1884
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3188 - loss: 1.5047 - val_accuracy: 0.4926 - val_loss: 1.1892
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3630 - loss: 1.3842 - val_accuracy: 0.5098 - val_loss: 1.1667
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3896 - loss: 1.3170 - val_accuracy: 0.5193 - val_loss: 1.1342
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4313 - loss: 1.2581 - val_accuracy: 0.5372 - val_loss: 1.1050
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4442 - loss: 1.2047 - val_accuracy: 0.5639 - val_loss: 1.0788
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4776 - loss: 1.1756 - val_accuracy: 0.5646 - val_loss: 1.0592
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4736 - loss: 1.1487 - val_accuracy: 0.5783 - val_loss: 1.0617
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5118 - loss: 1.1141 - val_accuracy: 0.5650 - val_loss: 1.0517
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5095 - loss: 1.0950 - val_accuracy: 0.5692 - val_loss: 1.0557
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5193 - loss: 1.0755 - val_accuracy: 0.5664 - val_loss: 1.0480
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5444 - loss: 1.0446 - val_accuracy: 0.5741 - val_loss: 1.0455
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5268 - loss: 1.0515 - val_accuracy: 0.5758 - val_loss: 1.0415
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5440 - loss: 1.0199 - val_accuracy: 0.5706 - val_loss: 1.0507
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5541 - loss: 1.0098 - val_accuracy: 0.5822 - val_loss: 1.0613
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5613 - loss: 1.0066 - val_accuracy: 0.5874 - val_loss: 1.0528
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5659 - loss: 0.9949 - val_accuracy: 0.5650 - val_loss: 1.0575
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5682 - loss: 0.9799 - val_accuracy: 0.5492 - val_loss: 1.0834

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 9: 52.56 [%]
F1-score captured in run 9: 51.48 [%]

=== RUN 10 ===

--- TRAIN (run 10) ---

--- TEST (run 10) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
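The class list and the integer mapping above are in alphabetical order, which is what `numpy.unique` (or a plain sorted set) produces. A stdlib-only reconstruction of that mapping:

```python
# Example label stream containing all four intensity classes (illustrative data).
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']

# Unique class names, sorted alphabetically as in the log output above.
classes = sorted(set(labels))

# Map each class name to its integer index.
mapping = {name: idx for idx, name in enumerate(classes)}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```

The `4` printed in the log is simply the number of distinct classes, i.e. `len(classes)`.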
This activity can't be balanced (in a downsampling way)   [line repeated 77×]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5:24[0m 993ms/step
[1m 61/328[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 841us/step  
[1m124/328[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 820us/step
[1m190/328[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 799us/step
[1m261/328[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 776us/step
[1m326/328[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 775us/step
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step  
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 922us/step
Global accuracy score (validation) = 54.39 [%]
Global F1 score (validation) = 50.99 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.27431554 0.21891466 0.4510824  0.05568746]
 [0.3192039  0.49159563 0.0589853  0.1302152 ]
 [0.29741156 0.23903465 0.4190216  0.04453215]
 ...
 [0.12429472 0.07840982 0.3616036  0.4356919 ]
 [0.11817874 0.13959056 0.03186647 0.7103642 ]
 [0.3592318  0.52192134 0.03410722 0.08473969]]
(1545, 4)
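The (1545, 4) matrix above holds one softmax probability row per window; hard class predictions are obtained by a row-wise arg-max (the (1545, 1) column printed earlier appears to be the integer ground-truth labels, not these arg-max predictions). A minimal sketch using the first and last probability rows copied from the output above:

```python
# Row-wise arg-max over softmax probabilities -> predicted class indices.
# Rows copied from the printed (1545, 4) matrix; class indices follow the
# LIGHT/MODERATE/SEDENTARY/VIGOROUS = 0/1/2/3 mapping used elsewhere in the log.
probs = [
    [0.27431554, 0.21891466, 0.4510824, 0.05568746],  # largest at index 2 (SEDENTARY)
    [0.3592318, 0.52192134, 0.03410722, 0.08473969],  # largest at index 1 (MODERATE)
]
preds = [max(range(len(row)), key=row.__getitem__) for row in probs]
print(preds)  # [2, 1]
```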
-------------------------------------------------

Global accuracy score (train) = 60.76 [%]
Global accuracy score (test) = 51.2 [%]
Global F1 score (train) = 57.84 [%]
Global F1 score (test) = 49.06 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.35      0.13      0.19       400
MODERATE-INTENSITY       0.41      0.68      0.51       400
         SEDENTARY       0.65      0.63      0.64       400
VIGOROUS-INTENSITY       0.62      0.63      0.63       345

          accuracy                           0.51      1545
         macro avg       0.51      0.52      0.49      1545
      weighted avg       0.50      0.51      0.49      1545
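The macro and weighted averages at the bottom of the report follow directly from the per-class rows. As a sanity check, they can be recomputed by hand (F1 values and supports copied from the table above):

```python
# Recompute the report's macro avg (unweighted mean over classes) and
# weighted avg (support-weighted mean) from the per-class F1 scores.
f1 = {"LIGHT-INTENSITY": 0.19, "MODERATE-INTENSITY": 0.51,
      "SEDENTARY": 0.64, "VIGOROUS-INTENSITY": 0.63}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

macro_f1 = sum(f1.values()) / len(f1)                  # -> ~0.49, as reported
n = sum(support.values())                              # 1545 test windows
weighted_f1 = sum(f1[c] * support[c] for c in f1) / n  # -> ~0.49, as reported
print(macro_f1, weighted_f1)
```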

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259264.200514 1606034 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
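The layer list above fully determines the parameter counts, so the totals can be verified with the standard Keras formulas. A minimal pure-Python check, no TensorFlow needed; the kernel size of 5 and the channels-last (6, 250) input are inferred from the counts, not stated in the log:

```python
# Sanity-check the parameter counts in the model summary above.
# Assumed hyperparameters (inferred, hypothetical): Conv1D kernel_size=5,
# input windows of shape (6, 250) laid out channels-last.

def conv1d_params(in_ch, filters, kernel_size):
    # kernel weights (kernel_size * in_ch * filters) plus one bias per filter
    return kernel_size * in_ch * filters + filters

def layernorm_params(features):
    # one gamma and one beta per feature
    return 2 * features

def dense_params(in_features, units):
    return in_features * units + units

total = (
    conv1d_params(250, 128, 5)    # conv1d                -> 160,128
    + layernorm_params(128)       # layer_normalization   ->     256
    + conv1d_params(128, 128, 5)  # conv1d_1              ->  82,048
    + layernorm_params(128)       # layer_normalization_1 ->     256
    + conv1d_params(128, 128, 5)  # conv1d_2              ->  82,048
    + layernorm_params(128)       # layer_normalization_2 ->     256
    + dense_params(128, 4)        # dense                 ->     516
)
print(total)  # 325508, matching the summary's total (all trainable)
```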
Epoch 1/91
I0000 00:00:1762259265.875863 1606165 service.cc:152] XLA service 0x72fcc000bae0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259265.875894 1606165 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:27:45.920093: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259266.083037 1606165 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259268.826738 1606165 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2715 - loss: 2.0210 - val_accuracy: 0.4800 - val_loss: 1.2667
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3052 - loss: 1.6920 - val_accuracy: 0.4663 - val_loss: 1.2267
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3264 - loss: 1.4958 - val_accuracy: 0.4849 - val_loss: 1.2056
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3572 - loss: 1.3784 - val_accuracy: 0.5179 - val_loss: 1.1881
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3899 - loss: 1.3112 - val_accuracy: 0.5260 - val_loss: 1.1529
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4175 - loss: 1.2647 - val_accuracy: 0.5435 - val_loss: 1.1141
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4537 - loss: 1.2091 - val_accuracy: 0.5530 - val_loss: 1.0935
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4823 - loss: 1.1627 - val_accuracy: 0.5583 - val_loss: 1.0817
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4771 - loss: 1.1460 - val_accuracy: 0.5681 - val_loss: 1.0730
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5023 - loss: 1.1148 - val_accuracy: 0.5558 - val_loss: 1.0487
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5053 - loss: 1.0916 - val_accuracy: 0.5621 - val_loss: 1.0489
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5086 - loss: 1.0898 - val_accuracy: 0.5755 - val_loss: 1.0646
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5191 - loss: 1.0708 - val_accuracy: 0.5702 - val_loss: 1.0602
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5243 - loss: 1.0596 - val_accuracy: 0.5527 - val_loss: 1.0454
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5521 - loss: 1.0302 - val_accuracy: 0.5730 - val_loss: 1.0529
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5441 - loss: 1.0369 - val_accuracy: 0.5621 - val_loss: 1.0543
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5473 - loss: 1.0258 - val_accuracy: 0.5699 - val_loss: 1.0556
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5613 - loss: 1.0075 - val_accuracy: 0.5734 - val_loss: 1.0547
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5707 - loss: 0.9849 - val_accuracy: 0.5748 - val_loss: 1.0537

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 10: 51.2 [%]
F1-score captured in run 10: 49.06 [%]

=== RUN 11 ===

--- TRAIN (run 11) ---

--- TEST (run 11) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
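The mapping printed above assigns integer codes in alphabetical order of the class names, which is what you get by enumerating the sorted unique labels. A minimal sketch of how such a mapping is typically built (the repo's actual helper is not shown in this log, so treat this as an assumed reconstruction):

```python
import numpy as np

# Hypothetical reconstruction: np.unique returns the class names sorted,
# and enumerating them reproduces the mapping printed in the log.
labels = np.array(["SEDENTARY", "LIGHT-INTENSITY", "VIGOROUS-INTENSITY",
                   "MODERATE-INTENSITY", "SEDENTARY"])
classes = np.unique(labels)                      # sorted unique class names
label_map = {c: i for i, c in enumerate(classes)}
print(label_map)
```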
This activity can't be balanced (in a downsampling way)
[previous message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
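The parameter counts in the summary are consistent with three Conv1D blocks of 128 filters with kernel size 5 and 'same' padding over (6, 250) inputs, LayerNormalization over the 128-channel axis, and a 4-way Dense head. The kernel size of 5 is inferred from the counts rather than stated in the log; a quick arithmetic check using the standard Keras formulas:

```python
# Recompute each row of the summary from standard Keras parameter formulas:
#   Conv1D:              kernel_size * in_channels * filters + filters
#   LayerNormalization:  2 * normalized_dim  (gamma and beta)
#   Dense:               in_dim * out_dim + out_dim
conv1d    = 5 * 250 * 128 + 128   # 160,128  (first conv sees 250 input channels)
conv1d_n  = 5 * 128 * 128 + 128   #  82,048  (later convs see 128 channels)
layernorm = 2 * 128               #     256
dense     = 128 * 4 + 4           #     516
total = conv1d + 2 * conv1d_n + 3 * layernorm + dense
print(total)                      # matches "Total params: 325,508"
```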
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
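The two shapes printed above are (windows, channels, window_length) tensors: 1,545 test windows and 10,469 train windows, each with 6 IMU channels of 250 samples. A minimal sketch of how fixed-length, non-overlapping windows can be cut from a continuous (channels, samples) recording; the repo's actual windowing code (and any overlap it may use) is not shown in this log:

```python
import numpy as np

def make_windows(signal: np.ndarray, win_len: int = 250) -> np.ndarray:
    """Split a (channels, samples) recording into (n_windows, channels, win_len),
    dropping the incomplete tail."""
    channels, samples = signal.shape
    n = samples // win_len
    trimmed = signal[:, : n * win_len]
    # (channels, n, win_len) -> (n, channels, win_len)
    return trimmed.reshape(channels, n, win_len).transpose(1, 0, 2)

recording = np.random.randn(6, 2600)   # dummy 6-channel signal
windows = make_windows(recording)
print(windows.shape)                   # (10, 6, 250)
```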

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 906us/step
Global accuracy score (validation) = 57.79 [%]
Global F1 score (validation) = 57.37 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.21415718 0.14011541 0.6156503  0.03007722]
 [0.39083844 0.39353672 0.17556345 0.0400614 ]
 [0.36824957 0.46557528 0.10519491 0.06098024]
 ...
 [0.15045054 0.10418276 0.17342578 0.5719409 ]
 [0.15423645 0.10157157 0.19104403 0.55314803]
 [0.43782026 0.46474987 0.04047596 0.05695388]]
(1545, 4)
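The two arrays printed above are the integer ground-truth labels, shape (1545, 1), and the per-window softmax probabilities, shape (1545, 4). Predicted classes come from an argmax over the class axis; e.g. the first row peaks at index 2 (SEDENTARY), agreeing with its label. A sketch using the first and last rows shown, assuming accuracy is computed from the argmax as usual:

```python
import numpy as np

# First and last rows of the probability matrix printed above.
proba = np.array([[0.21415718, 0.14011541, 0.6156503 , 0.03007722],
                  [0.15045054, 0.10418276, 0.17342578, 0.5719409 ]])
y_true = np.array([[2.], [3.]]).ravel().astype(int)

y_pred = proba.argmax(axis=1)          # most probable class per window
accuracy = (y_pred == y_true).mean()
print(y_pred, accuracy)                # [2 3] 1.0
```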
-------------------------------------------------

Global accuracy score (train) = 61.04 [%]
Global accuracy score (test) = 52.69 [%]
Global F1 score (train) = 61.12 [%]
Global F1 score (test) = 53.13 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.35      0.37       400
MODERATE-INTENSITY       0.42      0.50      0.46       400
         SEDENTARY       0.61      0.69      0.65       400
VIGOROUS-INTENSITY       0.74      0.57      0.64       345

          accuracy                           0.53      1545
         macro avg       0.54      0.53      0.53      1545
      weighted avg       0.54      0.53      0.53      1545
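The table above is a standard sklearn-style classification report. Each per-class f1-score is the harmonic mean of that class's precision and recall; for LIGHT-INTENSITY, 2·0.39·0.35/(0.39+0.35) ≈ 0.37. A minimal recomputation from the rounded figures in the report:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Rows from the report above (values rounded to 2 decimals in the log).
print(round(f1(0.39, 0.35), 2))   # LIGHT-INTENSITY    -> 0.37
print(round(f1(0.61, 0.69), 2))   # SEDENTARY          -> 0.65
print(round(f1(0.74, 0.57), 2))   # VIGOROUS-INTENSITY -> 0.64
```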

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259293.712246 1608779 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
[previous message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259295.416064 1608908 service.cc:152] XLA service 0x7ad73800cee0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259295.416126 1608908 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:28:15.462093: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259295.624314 1608908 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259298.322868 1608908 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2822 - loss: 2.0102 - val_accuracy: 0.4673 - val_loss: 1.2531
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3105 - loss: 1.7051 - val_accuracy: 0.4740 - val_loss: 1.2190
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3388 - loss: 1.5244 - val_accuracy: 0.4944 - val_loss: 1.1933
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3586 - loss: 1.4168 - val_accuracy: 0.5018 - val_loss: 1.1743
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3880 - loss: 1.3281 - val_accuracy: 0.5239 - val_loss: 1.1579
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4094 - loss: 1.2721 - val_accuracy: 0.5358 - val_loss: 1.1326
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4429 - loss: 1.2286 - val_accuracy: 0.5467 - val_loss: 1.0915
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4770 - loss: 1.1801 - val_accuracy: 0.5537 - val_loss: 1.0678
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4890 - loss: 1.1486 - val_accuracy: 0.5509 - val_loss: 1.0605
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5008 - loss: 1.1177 - val_accuracy: 0.5527 - val_loss: 1.0593
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5038 - loss: 1.1038 - val_accuracy: 0.5604 - val_loss: 1.0596
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5156 - loss: 1.0088
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5026 - loss: 1.0817 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5042 - loss: 1.0837
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5056 - loss: 1.0844
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5077 - loss: 1.0832
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5089 - loss: 1.0823 - val_accuracy: 0.5625 - val_loss: 1.0691
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.1780
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5111 - loss: 1.1179 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5078 - loss: 1.1070
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5088 - loss: 1.1008
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5106 - loss: 1.0962
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5119 - loss: 1.0935 - val_accuracy: 0.5674 - val_loss: 1.0554
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5156 - loss: 1.0355
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5176 - loss: 1.0690 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5194 - loss: 1.0689
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5212 - loss: 1.0654
[1m133/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5225 - loss: 1.0634
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5235 - loss: 1.0619 - val_accuracy: 0.5702 - val_loss: 1.0452
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 1.0601
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5381 - loss: 1.0521 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5306 - loss: 1.0539
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5288 - loss: 1.0550
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5292 - loss: 1.0531
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5300 - loss: 1.0511 - val_accuracy: 0.5702 - val_loss: 1.0552
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5000 - loss: 1.0223
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5443 - loss: 1.0030 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5443 - loss: 1.0068
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5435 - loss: 1.0122
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5429 - loss: 1.0159
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5429 - loss: 1.0177 - val_accuracy: 0.5614 - val_loss: 1.0441
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.6250 - loss: 0.9221
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5559 - loss: 1.0197 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5513 - loss: 1.0276
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5492 - loss: 1.0293
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5479 - loss: 1.0288
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5479 - loss: 1.0281 - val_accuracy: 0.5709 - val_loss: 1.0682
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5156 - loss: 1.0464
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5446 - loss: 1.0043 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5450 - loss: 1.0034
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5469 - loss: 1.0044
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5487 - loss: 1.0036
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5492 - loss: 1.0036 - val_accuracy: 0.5650 - val_loss: 1.0649
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5781 - loss: 1.0874
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5640 - loss: 1.0135 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5621 - loss: 1.0068
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5628 - loss: 1.0016
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5622 - loss: 1.0007
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5620 - loss: 1.0002 - val_accuracy: 0.5636 - val_loss: 1.0623
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.0965
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5493 - loss: 1.0072 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5579 - loss: 0.9982
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5570 - loss: 1.0005
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5571 - loss: 0.9992
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5580 - loss: 0.9971 - val_accuracy: 0.5681 - val_loss: 1.0679
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5312 - loss: 0.9250
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5885 - loss: 0.9321 
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5834 - loss: 0.9449
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5796 - loss: 0.9513
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5777 - loss: 0.9553
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5764 - loss: 0.9580 - val_accuracy: 0.5779 - val_loss: 1.0578

[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m18s[0m 395ms/step
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 17ms/step  
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 17ms/step
Saved model to disk.

Accuracy captured in run 11: 52.69 [%]
F1-score captured in run 11: 53.13 [%]

=== RUN 12 ===

--- TRAIN (run 12) ---

--- TEST (run 12) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
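A minimal sketch of how a mapping like the one printed above could be produced: sort the unique class names and enumerate them. The variable names here are illustrative, not taken from the project's code.

```python
# Illustrative reconstruction of the label mapping; `labels`, `classes`,
# and `mapping` are hypothetical names, not the project's actual variables.
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']
classes = sorted(set(labels))                       # alphabetical class order
mapping = {name: i for i, name in enumerate(classes)}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
y = [mapping[name] for name in labels]              # integer targets for training
```

Alphabetical ordering explains why SEDENTARY maps to 2 rather than 0.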
This activity can't be balanced (in a downsampling way)
(the message above is printed 77 times in this run)
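The repeated message above is emitted when an activity has fewer windows than the downsampling target, so it cannot be cut down to the target size. A hedged sketch of such a check (this is not the project's actual implementation; `downsample` and its parameters are hypothetical):

```python
import random

def downsample(windows, target, seed=0):
    """Keep `target` randomly chosen windows for one activity.

    Hypothetical helper: if the activity already has fewer windows than
    `target`, downsampling is impossible and all windows are kept.
    """
    if len(windows) < target:
        print("This activity can't be balanced (in a downsampling way)")
        return windows                          # too few samples: keep them all
    rng = random.Random(seed)                   # reproducible selection
    return rng.sample(windows, target)

# an activity with 3 windows cannot be reduced to 5
kept = downsample([[0.1], [0.2], [0.3]], target=5)
```

Under this assumption the warning is informational: the minority class simply passes through unbalanced.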
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
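The parameter counts in the summary are consistent with standard Keras formulas, assuming kernel_size=5 and 250 input channels (the windows printed later have shape (·, 6, 250)). A quick arithmetic check:

```python
# Parameter-count check for the model summary above.
# Assumptions (not stated in the log): Conv1D kernel_size=5, padding='same',
# input shape (6, 250), i.e. 6 time steps with 250 channels each.
def conv1d_params(kernel_size, in_channels, filters):
    return kernel_size * in_channels * filters + filters   # weights + biases

def layernorm_params(channels):
    return 2 * channels                                    # gamma + beta

def dense_params(in_features, units):
    return in_features * units + units

total = (
    conv1d_params(5, 250, 128)       # conv1d: 160,128
    + layernorm_params(128)          # layer_normalization: 256
    + conv1d_params(5, 128, 128)     # conv1d_1: 82,048
    + layernorm_params(128)          # layer_normalization_1: 256
    + conv1d_params(5, 128, 128)     # conv1d_2: 82,048
    + layernorm_params(128)          # layer_normalization_2: 256
    + dense_params(128, 4)           # dense: 516
)
print(total)  # 325508, matching "Total params: 325,508"
```

Dropout and GlobalAveragePooling1D contribute no parameters, which matches the zeros in the table.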
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s - 2ms/step

49/49 - 1s - 15ms/step

89/89 - 0s - 910us/step
Global accuracy score (validation) = 59.09 [%]
Global F1 score (validation) = 57.81 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.39220458 0.47327372 0.09604501 0.0384767 ]
 [0.1383322  0.08676007 0.737997   0.03691076]
 [0.25497383 0.18157846 0.52000743 0.04344038]
 ...
 [0.20453106 0.13126197 0.40219074 0.2620163 ]
 [0.22633131 0.1768716  0.09366969 0.50312734]
 [0.42153814 0.48232123 0.05287072 0.04326997]]
(1545, 4)
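The (1545, 4) array above holds one softmax probability row per test window; the predicted class is the argmax of each row. Using the first three rows from the log and the label mapping printed earlier:

```python
# First three probability rows copied from the log above.
probs = [
    [0.39220458, 0.47327372, 0.09604501, 0.0384767],
    [0.1383322,  0.08676007, 0.737997,   0.03691076],
    [0.25497383, 0.18157846, 0.52000743, 0.04344038],
]
class_names = ['LIGHT-INTENSITY', 'MODERATE-INTENSITY',
               'SEDENTARY', 'VIGOROUS-INTENSITY']

# argmax of each row -> predicted class index
preds = [max(range(len(row)), key=row.__getitem__) for row in probs]
print([class_names[i] for i in preds])
# ['MODERATE-INTENSITY', 'SEDENTARY', 'SEDENTARY']
```

Each row sums to 1 (up to rounding), as expected for softmax outputs.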
-------------------------------------------------

Global accuracy score (train) = 62.96 [%]
Global accuracy score (test) = 52.62 [%]
Global F1 score (train) = 62.44 [%]
Global F1 score (test) = 52.37 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.40      0.28      0.32       400
MODERATE-INTENSITY       0.45      0.57      0.50       400
         SEDENTARY       0.58      0.66      0.62       400
VIGOROUS-INTENSITY       0.70      0.61      0.65       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.52      1545
      weighted avg       0.52      0.53      0.52      1545
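In the classification report above, the macro average is the unweighted mean of the per-class scores, while the weighted average weights each class by its support. Recomputing both from the per-class F1 column:

```python
# Per-class F1 and support values copied from the report above.
f1 = {'LIGHT-INTENSITY': 0.32, 'MODERATE-INTENSITY': 0.50,
      'SEDENTARY': 0.62, 'VIGOROUS-INTENSITY': 0.65}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}

macro = sum(f1.values()) / len(f1)                                   # plain mean
weighted = sum(f1[c] * support[c] for c in f1) / sum(support.values())
print(round(macro, 2), round(weighted, 2))  # 0.52 0.52
```

Here the two averages coincide at two decimals because the supports are nearly balanced; with a more skewed split they would diverge.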

2025-11-04 13:28:41.518029: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259323.898324 1611711 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(the message above is printed 77 times in this run)
Model: "sequential"
(identical layer table to the summary above: 325,508 trainable params, 0 non-trainable)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259325.572453 1611838 service.cc:152] XLA service 0x734ad00046c0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259325.572491 1611838 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:28:45.615359: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259325.785426 1611838 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259328.494825 1611838 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 - 9s 31ms/step - accuracy: 0.2695 - loss: 2.0114 - val_accuracy: 0.4688 - val_loss: 1.2320
Epoch 2/91
164/164 - 0s 2ms/step - accuracy: 0.2970 - loss: 1.7135 - val_accuracy: 0.4891 - val_loss: 1.1908
Epoch 3/91
164/164 - 0s 2ms/step - accuracy: 0.3099 - loss: 1.5294 - val_accuracy: 0.4933 - val_loss: 1.2002
Epoch 4/91
164/164 - 0s 2ms/step - accuracy: 0.3450 - loss: 1.4023 - val_accuracy: 0.5204 - val_loss: 1.1831
Epoch 5/91
164/164 - 0s 2ms/step - accuracy: 0.3726 - loss: 1.3243 - val_accuracy: 0.5404 - val_loss: 1.1474
Epoch 6/91
164/164 - 0s 2ms/step - accuracy: 0.4036 - loss: 1.2894 - val_accuracy: 0.5737 - val_loss: 1.1158
Epoch 7/91
164/164 - 0s 2ms/step - accuracy: 0.4402 - loss: 1.2192 - val_accuracy: 0.5709 - val_loss: 1.0614
Epoch 8/91
164/164 - 0s 2ms/step - accuracy: 0.4714 - loss: 1.1779 - val_accuracy: 0.5808 - val_loss: 1.0713
Epoch 9/91
164/164 - 0s 2ms/step - accuracy: 0.4719 - loss: 1.1495 - val_accuracy: 0.5685 - val_loss: 1.0490
Epoch 10/91
164/164 - 0s 2ms/step - accuracy: 0.4852 - loss: 1.1292 - val_accuracy: 0.5667 - val_loss: 1.0347
Epoch 11/91
164/164 - 0s 2ms/step - accuracy: 0.5032 - loss: 1.1167 - val_accuracy: 0.5755 - val_loss: 1.0449
Epoch 12/91
164/164 - 0s 2ms/step - accuracy: 0.5217 - loss: 1.0882 - val_accuracy: 0.5737 - val_loss: 1.0475
Epoch 13/91
164/164 - 0s 2ms/step - accuracy: 0.5124 - loss: 1.0688 - val_accuracy: 0.5902 - val_loss: 1.0315
Epoch 14/91
164/164 - 0s 2ms/step - accuracy: 0.5291 - loss: 1.0605 - val_accuracy: 0.5815 - val_loss: 1.0252
Epoch 15/91
164/164 - 0s 2ms/step - accuracy: 0.5320 - loss: 1.0562 - val_accuracy: 0.5748 - val_loss: 1.0315
Epoch 16/91
164/164 - 0s 2ms/step - accuracy: 0.5361 - loss: 1.0351 - val_accuracy: 0.5878 - val_loss: 1.0289
Epoch 17/91
164/164 - 0s 2ms/step - accuracy: 0.5490 - loss: 1.0159 - val_accuracy: 0.5832 - val_loss: 1.0388
Epoch 18/91
164/164 - 0s 2ms/step - accuracy: 0.5489 - loss: 1.0056 - val_accuracy: 0.5878 - val_loss: 1.0282
Epoch 19/91
164/164 - 0s 2ms/step - accuracy: 0.5570 - loss: 0.9961 - val_accuracy: 0.5839 - val_loss: 1.0374
49/49 - 1s 16ms/step
Saved model to disk.

Accuracy captured in run 12: 52.62 [%]
F1-score captured in run 12: 52.37 [%]

=== RUN 13 ===

--- TRAIN (run 13) ---

--- TEST (run 13) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
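The mapping printed above follows from sorting the unique class names and enumerating them; a minimal sketch of how such a mapping can be built (the sample list below is illustrative, not the actual dataset):

```python
# Build an integer label mapping from string class names.
labels = ["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
          "MODERATE-INTENSITY", "SEDENTARY"]

# sorted(set(...)) yields the same ordering as np.unique on strings
classes = sorted(set(labels))
label_map = {name: idx for idx, name in enumerate(classes)}
```

Because the names are sorted lexicographically, LIGHT-INTENSITY gets 0, MODERATE-INTENSITY 1, SEDENTARY 2, and VIGOROUS-INTENSITY 3, matching the mapping in the log.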
This activity can't be balanced (in a downsampling way)
[message repeated 77 times in total]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 15ms/step

89/89 - 0s 950us/step
Global accuracy score (validation) = 56.74 [%]
Global F1 score (validation) = 56.17 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.39786047 0.44530278 0.09605131 0.06078543]
 [0.11284931 0.06521071 0.81051546 0.01142451]
 [0.39786047 0.44530278 0.09605131 0.06078543]
 ...
 [0.15578318 0.0828104  0.6696883  0.09171807]
 [0.30422142 0.2259294  0.3860857  0.0837635 ]
 [0.3568228  0.5098509  0.03265432 0.10067197]]
(1545, 4)
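The (1545, 4) array above holds per-window softmax probabilities over the four intensity classes; the predicted class is the argmax of each row. A minimal sketch, using the first two rows printed above:

```python
# Convert per-class softmax probabilities to predicted class indices.
# Rows copied from the printed (1545, 4) probability matrix.
proba = [
    [0.39786047, 0.44530278, 0.09605131, 0.06078543],
    [0.11284931, 0.06521071, 0.81051546, 0.01142451],
]

# argmax per row: index of the highest-probability class
pred = [row.index(max(row)) for row in proba]
# row 1 -> class 1 (MODERATE-INTENSITY), row 2 -> class 2 (SEDENTARY)
```

Comparing these indices against the (1545, 1) true-label column shown above is what produces the accuracy and F1 figures that follow.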
-------------------------------------------------

Global accuracy score (train) = 61.26 [%]
Global accuracy score (test) = 54.24 [%]
Global F1 score (train) = 61.31 [%]
Global F1 score (test) = 54.72 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.41      0.44      0.42       400
MODERATE-INTENSITY       0.47      0.43      0.45       400
         SEDENTARY       0.59      0.71      0.64       400
VIGOROUS-INTENSITY       0.76      0.60      0.67       345

          accuracy                           0.54      1545
         macro avg       0.56      0.54      0.55      1545
      weighted avg       0.55      0.54      0.54      1545
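The macro and weighted averages in the report follow directly from the per-class F1 scores and supports: macro is the unweighted mean, weighted is the support-weighted mean. A pure-arithmetic sketch using the values from the table:

```python
# Reproduce the report's macro and weighted F1 averages by hand.
f1 = {"LIGHT-INTENSITY": 0.42, "MODERATE-INTENSITY": 0.45,
      "SEDENTARY": 0.64, "VIGOROUS-INTENSITY": 0.67}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

macro_f1 = sum(f1.values()) / len(f1)                        # ~0.545 -> 0.55
total = sum(support.values())                                # 1545
weighted_f1 = sum(f1[c] * support[c] for c in f1) / total    # ~0.541 -> 0.54
```

The weighted average sits slightly below the macro average here because the smallest class (VIGOROUS-INTENSITY, support 345) has the highest F1.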

2025-11-04 13:29:11.086784: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:29:11.098067: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259351.111411 1614459 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259351.115698 1614459 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259351.125947 1614459 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:29:11.129171: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259353.493978 1614459 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
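The repeated message above comes from a class-balancing step that downsamples each activity to a common window count. A minimal sketch of the kind of logic that would emit such a warning — the function name, signature, and behavior here are illustrative assumptions, not the project's actual implementation:

```python
import numpy as np

def downsample_balance(X, y, target, seed=0):
    """Downsample each class in (X, y) to at most `target` samples.

    Classes with fewer than `target` samples cannot be balanced by
    downsampling, so they are kept whole and a warning is printed.
    """
    rng = np.random.default_rng(seed)
    keep = []
    for cls in np.unique(y):
        idx = np.flatnonzero(y == cls)
        if len(idx) < target:
            print("This activity can't be balanced (in a downsampling way)")
            keep.extend(idx)  # minority class: keep every sample
        else:
            keep.extend(rng.choice(idx, size=target, replace=False))
    keep = np.sort(np.asarray(keep))
    return X[keep], y[keep]
```

Because minority classes are kept whole, the resulting set is only approximately balanced whenever this warning fires.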
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
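For reference, the printed summary can be reproduced with a Keras `Sequential` model along these lines. This is a minimal sketch: the input shape (6, 250), kernel_size=5, `padding="same"`, the activations, and the dropout rates are assumptions inferred from the parameter counts (Conv1D params = kernel_size × in_channels × filters + filters; 160,128 = 5·250·128 + 128 and 82,048 = 5·128·128 + 128), not taken from the source code.

```python
import keras
from keras import layers

# Hedged reconstruction of the model from the printed summary.
model = keras.Sequential([
    layers.Input(shape=(6, 250)),                      # assumed input shape
    layers.Conv1D(128, 5, padding="same", activation="relu"),   # 160,128 params
    layers.LayerNormalization(),                       # 256 params (gamma+beta)
    layers.Dropout(0.3),                               # rate is an assumption
    layers.Conv1D(128, 5, padding="same", activation="relu"),   # 82,048 params
    layers.LayerNormalization(),
    layers.Dropout(0.3),
    layers.Conv1D(128, 5, padding="same", activation="relu"),   # 82,048 params
    layers.LayerNormalization(),
    layers.Dropout(0.3),
    layers.GlobalAveragePooling1D(),                   # (None, 128)
    layers.Dropout(0.3),
    layers.Dense(4, activation="softmax"),             # 4 intensity classes, 516 params
])
model.summary()
```

The per-layer counts sum to the 325,508 total parameters shown above.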
Epoch 1/91
I0000 00:00:1762259355.178631 1614576 service.cc:152] XLA service 0x7e3c3800bf10 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259355.178680 1614576 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:29:15.223090: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259355.392862 1614576 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259358.154779 1614576 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2860 - loss: 1.9873 - val_accuracy: 0.4484 - val_loss: 1.2958
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.2874 - loss: 1.7504 - val_accuracy: 0.4688 - val_loss: 1.2311
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3197 - loss: 1.5588 - val_accuracy: 0.4860 - val_loss: 1.1845
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3511 - loss: 1.4402 - val_accuracy: 0.5049 - val_loss: 1.1717
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3880 - loss: 1.3318 - val_accuracy: 0.5246 - val_loss: 1.1495
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4132 - loss: 1.2750 - val_accuracy: 0.5555 - val_loss: 1.1085
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4445 - loss: 1.2174 - val_accuracy: 0.5316 - val_loss: 1.0804
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4696 - loss: 1.1759 - val_accuracy: 0.5492 - val_loss: 1.0608
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4758 - loss: 1.1526 - val_accuracy: 0.5565 - val_loss: 1.0470
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4971 - loss: 1.1170 - val_accuracy: 0.5562 - val_loss: 1.0572
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5106 - loss: 1.1052 - val_accuracy: 0.5839 - val_loss: 1.0333
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5159 - loss: 1.0823 - val_accuracy: 0.5576 - val_loss: 1.0514
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5084 - loss: 1.0802 - val_accuracy: 0.5548 - val_loss: 1.0505
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5331 - loss: 1.0419 - val_accuracy: 0.5790 - val_loss: 1.0254
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5401 - loss: 1.0430 - val_accuracy: 0.5836 - val_loss: 1.0336
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5497 - loss: 1.0156 - val_accuracy: 0.5720 - val_loss: 1.0431
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5493 - loss: 1.0143 - val_accuracy: 0.5836 - val_loss: 1.0456
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5627 - loss: 0.9989 - val_accuracy: 0.5843 - val_loss: 1.0323
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5664 - loss: 0.9810 - val_accuracy: 0.5492 - val_loss: 1.0546

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 13: 54.24 [%]
F1-score captured in run 13: 54.72 [%]
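The per-run accuracy and weighted F1-score reported above can be computed with scikit-learn. A minimal sketch with placeholder predictions — `y_true`/`y_pred` are illustrative arrays, not this run's data:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Illustrative ground-truth and predicted class indices (0..3),
# standing in for the run's actual test labels.
y_true = np.array([2, 0, 1, 3, 2, 2])
y_pred = np.array([2, 0, 1, 1, 2, 0])

acc = accuracy_score(y_true, y_pred) * 100
f1 = f1_score(y_true, y_pred, average="weighted") * 100  # support-weighted mean of per-class F1
print(f"Accuracy captured in the run: {acc:.2f} [%]")
print(f"F1-score captured in the run: {f1:.2f} [%]")
```

`average="weighted"` matches the "weighted avg" row of the classification report printed earlier in the log.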

=== RUN 14 ===

--- TRAIN (run 14) ---

--- TEST (run 14) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
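The class list, class count, and label mapping printed above can be derived from the raw label array. A minimal sketch — the `y_train` variable and its contents are illustrative assumptions:

```python
import numpy as np

# Placeholder activity labels standing in for the dataset's label column.
y_train = np.array(["SEDENTARY", "LIGHT-INTENSITY", "VIGOROUS-INTENSITY",
                    "MODERATE-INTENSITY", "SEDENTARY"])

classes = np.unique(y_train)                      # sorted unique labels
print(classes)
print(len(classes))
label_map = {c: i for i, c in enumerate(classes)}  # label -> integer class index
print("Label mapping:", label_map)
```

`np.unique` sorts lexicographically, which is why 'LIGHT-INTENSITY' maps to 0 and 'VIGOROUS-INTENSITY' to 3.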
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
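The Param # column in the summary above can be reproduced by hand. A minimal sketch — the Conv1D kernel size (5) is not printed in the log, but it is the only value consistent with 160,128 = 128 × (5 × 250 + 1) given the (6, 250) input windows:

```python
# Reproduce the Param # column of the Keras model summary above.
# Kernel size 5 is inferred from the counts, not stated in the log.

def conv1d_params(filters, kernel_size, in_channels):
    # one (kernel_size x in_channels) weight matrix per filter, plus one bias each
    return filters * (kernel_size * in_channels + 1)

def layer_norm_params(channels):
    return 2 * channels  # one gamma and one beta per channel

def dense_params(units, in_features):
    return units * (in_features + 1)  # weights plus biases

conv1d     = conv1d_params(128, 5, 250)  # first conv sees 250 input channels
conv1d_1   = conv1d_params(128, 5, 128)
conv1d_2   = conv1d_params(128, 5, 128)
layer_norm = layer_norm_params(128)      # same for all three LayerNormalizations
dense      = dense_params(4, 128)        # 4 activity-intensity classes

total = conv1d + conv1d_1 + conv1d_2 + 3 * layer_norm + dense
print(conv1d, conv1d_1, layer_norm, dense, total)
# 160128 82048 256 516 325508 — matching the summary line by line
```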
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 942us/step
Global accuracy score (validation) = 54.63 [%]
Global F1 score (validation) = 52.37 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.09515108 0.06237116 0.83111596 0.01136177]
 [0.11674577 0.07778378 0.7918281  0.01364229]
 [0.37475282 0.40135428 0.18950728 0.03438558]
 ...
 [0.14990777 0.08942442 0.29254234 0.4681254 ]
 [0.13006246 0.08714726 0.13954668 0.6432437 ]
 [0.38452595 0.5078949  0.05755074 0.05002845]]
(1545, 4)
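A (1545, 4) softmax matrix like the one above is collapsed to class indices with a row-wise argmax. A minimal sketch using two probability rows copied from the output above (the `...`-elided rows are left as-is):

```python
# Row-wise argmax turns each 4-way softmax row into a single class index.
# These two rows are copied verbatim from the probability matrix above.
row_first = [0.09515108, 0.06237116, 0.83111596, 0.01136177]
row_late  = [0.13006246, 0.08714726, 0.13954668, 0.6432437]

def argmax(row):
    # index of the largest probability in the row
    return max(range(len(row)), key=lambda i: row[i])

print(argmax(row_first))  # 2 (third class has probability 0.83)
print(argmax(row_late))   # 3 (fourth class has probability 0.64)
```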
-------------------------------------------------

Global accuracy score (train) = 60.79 [%]
Global accuracy score (test) = 52.49 [%]
Global F1 score (train) = 58.48 [%]
Global F1 score (test) = 51.19 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.38      0.18      0.25       400
MODERATE-INTENSITY       0.42      0.59      0.49       400
         SEDENTARY       0.60      0.71      0.65       400
VIGOROUS-INTENSITY       0.69      0.63      0.66       345

          accuracy                           0.52      1545
         macro avg       0.52      0.53      0.51      1545
      weighted avg       0.52      0.52      0.51      1545
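The macro and weighted averages in the report follow directly from the per-class F1 scores and supports: macro is the unweighted mean over classes, weighted is the support-weighted mean. A quick check against the numbers above:

```python
# Macro vs. weighted averaging of the per-class F1 scores reported above.
f1      = [0.25, 0.49, 0.65, 0.66]   # LIGHT, MODERATE, SEDENTARY, VIGOROUS
support = [400, 400, 400, 345]       # rows of the 1545-sample test set

# Macro: plain mean over classes, every class counts equally.
macro_f1 = sum(f1) / len(f1)

# Weighted: each class contributes in proportion to its support.
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / sum(support)

print(macro_f1, weighted_f1)
# macro_f1 ≈ 0.5125 and weighted_f1 ≈ 0.5072 — both shown as 0.51 in the report
```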

2025-11-04 13:29:40.665664: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259383.042103 1617227 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
[Previous line repeated 76 more times]
Model: "sequential"
(layer table identical to the first model summary above)
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259384.738627 1617330 service.cc:152] XLA service 0x7d520c00b540 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259384.738696 1617330 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:29:44.789337: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259384.957548 1617330 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259387.688618 1617330 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2756 - loss: 1.9640 - val_accuracy: 0.4652 - val_loss: 1.2741
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3120 - loss: 1.7244 - val_accuracy: 0.4793 - val_loss: 1.2070
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3172 - loss: 1.5508 - val_accuracy: 0.4923 - val_loss: 1.1893
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3412 - loss: 1.4166 - val_accuracy: 0.5067 - val_loss: 1.1760
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3753 - loss: 1.3341 - val_accuracy: 0.5221 - val_loss: 1.1575
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4117 - loss: 1.2675 - val_accuracy: 0.5341 - val_loss: 1.1298
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4575 - loss: 1.2050 - val_accuracy: 0.5544 - val_loss: 1.0901
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4727 - loss: 1.1727 - val_accuracy: 0.5534 - val_loss: 1.0821
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4950 - loss: 1.1426 - val_accuracy: 0.5614 - val_loss: 1.0711
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4982 - loss: 1.1115 - val_accuracy: 0.5636 - val_loss: 1.0744
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5047 - loss: 1.0983 - val_accuracy: 0.5607 - val_loss: 1.0662
Epoch 12/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5207 - loss: 1.0751 - val_accuracy: 0.5650 - val_loss: 1.0597
Epoch 13/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5245 - loss: 1.0654 - val_accuracy: 0.5629 - val_loss: 1.0625
Epoch 14/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5275 - loss: 1.0609 - val_accuracy: 0.5664 - val_loss: 1.0682
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5535 - loss: 1.0306 - val_accuracy: 0.5755 - val_loss: 1.0666
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5396 - loss: 1.0275 - val_accuracy: 0.5678 - val_loss: 1.0648
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5529 - loss: 1.0142 - val_accuracy: 0.5667 - val_loss: 1.0794

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 14: 52.49 [%]
F1-score captured in run 14: 51.19 [%]

=== RUN 15 ===

--- TRAIN (run 15) ---

--- TEST (run 15) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
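The mapping above is consistent with sorting the unique class names alphabetically and enumerating them (as `np.unique` or `sklearn.preprocessing.LabelEncoder` would). A minimal sketch with hypothetical stand-in labels, not the project's actual loader code:

```python
# Hypothetical stand-in labels; the real ones come from the dataset loader.
y = ["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
     "MODERATE-INTENSITY", "SEDENTARY"]

classes = sorted(set(y))                       # sorted unique class names
label_map = {c: i for i, c in enumerate(classes)}
y_encoded = [label_map[c] for c in y]

print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```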
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
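The parameter counts in the summary are mutually consistent with Conv1D layers of 128 filters and kernel size 5 over 250 input channels (matching the (1545, 6, 250) window arrays printed below). The filter count and kernel size here are inferred from the printed counts, not read from the training script:

```python
# Parameter-count check for the model summary above.
def conv1d_params(kernel, in_ch, filters):
    return kernel * in_ch * filters + filters   # kernel weights + bias

def layernorm_params(ch):
    return 2 * ch                               # gamma + beta

total = (conv1d_params(5, 250, 128)    # conv1d   -> 160,128
         + layernorm_params(128)       # ->     256
         + conv1d_params(5, 128, 128)  # conv1d_1 ->  82,048
         + layernorm_params(128)
         + conv1d_params(5, 128, 128)  # conv1d_2 ->  82,048
         + layernorm_params(128)
         + 128 * 4 + 4)                # dense    ->     516

print(total)  # 325508, matching "Total params: 325,508"
```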
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 922us/step
Global accuracy score (validation) = 57.51 [%]
Global F1 score (validation) = 56.93 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.1430824  0.08419897 0.7579638  0.01475489]
 [0.2775117  0.192356   0.4840579  0.04607437]
 [0.13136673 0.07724227 0.7672386  0.0241524 ]
 ...
 [0.1693567  0.0999333  0.3366045  0.3941055 ]
 [0.2482966  0.16135398 0.24133573 0.3490137 ]
 [0.4121904  0.4836682  0.05529737 0.04884398]]
(1545, 4)
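The (1545, 1) column above holds the true class indices and the (1545, 4) matrix the per-class softmax probabilities; the predicted class is the argmax of each row. A minimal sketch of turning probabilities into an accuracy score, using toy numbers rather than the run's data:

```python
import numpy as np

# Toy stand-ins for the run's y_true (1545, 1) and probabilities (1545, 4).
y_true = np.array([[2.], [2.], [3.]])
proba = np.array([[0.14, 0.08, 0.76, 0.02],
                  [0.28, 0.19, 0.48, 0.05],
                  [0.17, 0.10, 0.34, 0.39]])

y_pred = proba.argmax(axis=1)                  # predicted class per window
accuracy = (y_pred == y_true.ravel()).mean()
print(y_pred, accuracy)  # [2 2 3] 1.0
```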
-------------------------------------------------

Global accuracy score (train) = 60.48 [%]
Global accuracy score (test) = 51.78 [%]
Global F1 score (train) = 60.38 [%]
Global F1 score (test) = 52.22 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.36      0.35      0.35       400
MODERATE-INTENSITY       0.46      0.47      0.47       400
         SEDENTARY       0.63      0.62      0.63       400
VIGOROUS-INTENSITY       0.64      0.64      0.64       345

          accuracy                           0.52      1545
         macro avg       0.52      0.52      0.52      1545
      weighted avg       0.52      0.52      0.52      1545
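In the report, "macro avg" is the unweighted mean of the per-class scores, while "weighted avg" weights each class by its support; with supports of 400/400/400/345 the two nearly coincide. A quick check on the printed F1 column (two-decimal values, so only approximate):

```python
f1 = {"LIGHT-INTENSITY": 0.35, "MODERATE-INTENSITY": 0.47,
      "SEDENTARY": 0.63, "VIGOROUS-INTENSITY": 0.64}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

macro = sum(f1.values()) / len(f1)                                   # plain mean
weighted = sum(f1[c] * support[c] for c in f1) / sum(support.values())
print(round(macro, 2), round(weighted, 2))  # 0.52 0.52
```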

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259411.912433 1619774 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259413.589130 1619917 service.cc:152] XLA service 0x784e6800cee0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259413.589168 1619917 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:30:13.632078: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259413.794928 1619917 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259416.511800 1619917 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2802 - loss: 1.9600 - val_accuracy: 0.4508 - val_loss: 1.2387
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.2954 - loss: 1.7118 - val_accuracy: 0.4737 - val_loss: 1.2138
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3476 - loss: 1.4770 - val_accuracy: 0.5046 - val_loss: 1.1891
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3681 - loss: 1.3817 - val_accuracy: 0.4996 - val_loss: 1.1822
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3957 - loss: 1.3159 - val_accuracy: 0.5179 - val_loss: 1.1588
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4201 - loss: 1.2549 - val_accuracy: 0.5274 - val_loss: 1.1233
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4608 - loss: 1.2001 - val_accuracy: 0.5376 - val_loss: 1.0981
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4788 - loss: 1.1645 - val_accuracy: 0.5481 - val_loss: 1.0822
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4884 - loss: 1.1276 - val_accuracy: 0.5513 - val_loss: 1.0688
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5014 - loss: 1.1168 - val_accuracy: 0.5530 - val_loss: 1.0608
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5099 - loss: 1.0849 - val_accuracy: 0.5674 - val_loss: 1.0569
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5162 - loss: 1.0774 - val_accuracy: 0.5660 - val_loss: 1.0493
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5203 - loss: 1.0720 - val_accuracy: 0.5586 - val_loss: 1.0656
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5276 - loss: 1.0441 - val_accuracy: 0.5713 - val_loss: 1.0581
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5424 - loss: 1.0334 - val_accuracy: 0.5702 - val_loss: 1.0504
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5530 - loss: 1.0150 - val_accuracy: 0.5625 - val_loss: 1.0482
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5516 - loss: 1.0073 - val_accuracy: 0.5734 - val_loss: 1.0577
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5549 - loss: 1.0026 - val_accuracy: 0.5685 - val_loss: 1.0736
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5651 - loss: 0.9943 - val_accuracy: 0.5734 - val_loss: 1.0527
Epoch 20/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5758 - loss: 0.9655 - val_accuracy: 0.5678 - val_loss: 1.0569
Epoch 21/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5813 - loss: 0.9600 - val_accuracy: 0.5671 - val_loss: 1.0696

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 15: 51.78 [%]
F1-score captured in run 15: 52.22 [%]

=== RUN 16 ===

--- TRAIN (run 16) ---

--- TEST (run 16) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
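The mapping above is consistent with enumerating the sorted unique class names; a minimal sketch of that step (variable names are illustrative, not taken from the actual pipeline):

```python
# Hypothetical reconstruction of the label mapping: collect the unique
# class names, sort them lexicographically, and assign consecutive indices.
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']
classes = sorted(set(labels))                    # deterministic order
mapping = {c: i for i, c in enumerate(classes)}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```

The integer codes then serve as class indices for the 4-unit softmax output seen in the model summary.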
This activity can't be balanced (in a downsampling way)
[last message repeated 76 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
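The per-layer parameter counts in the summary can be reproduced arithmetically. The kernel size (5) and the 250 input channels are inferred from the counts themselves, not stated in the log:

```python
# Inferred from the summary: Conv1D kernel_size=5 over input shape (6, 250),
# 128 filters with 'same' padding (length stays 6), LayerNormalization has
# one gamma and one beta per channel, and a final Dense(4) head.
k, c_in, f, n_cls = 5, 250, 128, 4
conv1d     = k * c_in * f + f     # 160,128 (weights + bias)
conv1d_n   = k * f * f + f        #  82,048 (conv1d_1 and conv1d_2)
layer_norm = 2 * f                #     256 (gamma and beta)
dense      = f * n_cls + n_cls    #     516
total = conv1d + 2 * conv1d_n + 3 * layer_norm + dense
print(total)  # 325508, matching "Total params: 325,508"
```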
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 908us/step
Global accuracy score (validation) = 57.2 [%]
Global F1 score (validation) = 55.53 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.07508515 0.04338294 0.8711097  0.01042216]
 [0.37559107 0.4801103  0.08429866 0.05999991]
 [0.125946   0.07783736 0.76470864 0.03150803]
 ...
 [0.13459915 0.09308495 0.27549744 0.4968184 ]
 [0.11060023 0.09500894 0.0717127  0.7226781 ]
 [0.3565716  0.5440654  0.03135574 0.06800719]]
(1545, 4)
-------------------------------------------------

Global accuracy score (train) = 62.85 [%]
Global accuracy score (test) = 53.33 [%]
Global F1 score (train) = 61.74 [%]
Global F1 score (test) = 52.68 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.39      0.25      0.31       400
MODERATE-INTENSITY       0.45      0.56      0.50       400
         SEDENTARY       0.61      0.71      0.66       400
VIGOROUS-INTENSITY       0.68      0.62      0.65       345

          accuracy                           0.53      1545
         macro avg       0.53      0.54      0.53      1545
      weighted avg       0.53      0.53      0.52      1545
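The macro and weighted averages in the report follow directly from the per-class scores and supports. Recomputing from the rounded values shown (last-digit differences against the report are expected, since the report averages the unrounded scores):

```python
# Per-class F1 and support as printed in the classification report above.
f1      = {'LIGHT-INTENSITY': 0.31, 'MODERATE-INTENSITY': 0.50,
           'SEDENTARY': 0.66, 'VIGOROUS-INTENSITY': 0.65}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}
macro    = sum(f1.values()) / len(f1)          # unweighted mean over classes
weighted = (sum(f1[c] * support[c] for c in f1)
            / sum(support.values()))           # support-weighted mean
print(round(macro, 2), round(weighted, 3))  # 0.53 0.526
```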

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259442.165160 1622706 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
[last message repeated 76 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
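The `Param #` column of the summary above can be reproduced arithmetically. The sketch below is a hypothetical reconstruction: the kernel size of 5 and the channels-last input of shape (6, 250) are inferred from the printed shapes and parameter counts, not stated anywhere in the log.

```python
# Hypothetical reconstruction of the Param # column of the summary above.
# Assumes Conv1D with kernel_size=5 and padding='same' on a (6, 250) input
# in channels-last format; both values are inferred from the counts.

def conv1d_params(kernel_size, in_channels, filters):
    # weight tensor: kernel_size * in_channels * filters, plus one bias per filter
    return kernel_size * in_channels * filters + filters

def layer_norm_params(features):
    # LayerNormalization learns gamma and beta, one of each per feature
    return 2 * features

def dense_params(in_features, units):
    return in_features * units + units

total = (
    conv1d_params(5, 250, 128)    # conv1d   -> 160,128
    + layer_norm_params(128)      # layer_normalization -> 256
    + conv1d_params(5, 128, 128)  # conv1d_1 -> 82,048
    + layer_norm_params(128)      # layer_normalization_1 -> 256
    + conv1d_params(5, 128, 128)  # conv1d_2 -> 82,048
    + layer_norm_params(128)      # layer_normalization_2 -> 256
    + dense_params(128, 4)        # dense    -> 516
)
print(total)  # -> 325508, matching the "Total params" line above
```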
Epoch 1/91
I0000 00:00:1762259443.850109 1622843 service.cc:152] XLA service 0x7ee1a400b770 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259443.850162 1622843 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:30:43.899728: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259444.065886 1622843 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259446.778257 1622843 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2744 - loss: 2.0538 - val_accuracy: 0.4568 - val_loss: 1.2693
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3077 - loss: 1.7494 - val_accuracy: 0.4856 - val_loss: 1.1950
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3301 - loss: 1.5596 - val_accuracy: 0.4860 - val_loss: 1.1869
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3524 - loss: 1.4250 - val_accuracy: 0.5130 - val_loss: 1.1765
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3706 - loss: 1.3611 - val_accuracy: 0.5305 - val_loss: 1.1648
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4029 - loss: 1.2813 - val_accuracy: 0.5442 - val_loss: 1.1375
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4314 - loss: 1.2285 - val_accuracy: 0.5348 - val_loss: 1.1113
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4577 - loss: 1.1807 - val_accuracy: 0.5600 - val_loss: 1.0835
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4855 - loss: 1.1387 - val_accuracy: 0.5636 - val_loss: 1.0738
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4951 - loss: 1.1219 - val_accuracy: 0.5579 - val_loss: 1.0720
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4879 - loss: 1.1179 - val_accuracy: 0.5636 - val_loss: 1.0524
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5010 - loss: 1.0896 - val_accuracy: 0.5639 - val_loss: 1.0686
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5222 - loss: 1.0639 - val_accuracy: 0.5787 - val_loss: 1.0479
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5293 - loss: 1.0588 - val_accuracy: 0.5758 - val_loss: 1.0517
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5315 - loss: 1.0424 - val_accuracy: 0.5537 - val_loss: 1.0496
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5472 - loss: 1.0268 - val_accuracy: 0.5513 - val_loss: 1.0617
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5454 - loss: 1.0317 - val_accuracy: 0.5534 - val_loss: 1.0509
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5653 - loss: 0.9891 - val_accuracy: 0.5495 - val_loss: 1.0508

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 16: 53.33 [%]
F1-score captured in run 16: 52.68 [%]

=== RUN 17 ===

--- TRAIN (run 17) ---

--- TEST (run 17) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
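The printed mapping assigns indices in sorted label order, which suggests the pipeline enumerates the sorted unique class names. The pure-Python sketch below (an assumption about how the mapping is built, not the project's actual code) reproduces it:

```python
# Sketch: derive the label-to-index mapping from sorted unique class names.
# The class list is copied from the array printed in the log above.
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY', 'MODERATE-INTENSITY']
mapping = {name: idx for idx, name in enumerate(sorted(set(labels)))}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```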
This activity can't be balanced (in a downsampling way)
(last message repeated 76 times)
Model: "sequential"
(summary identical to the previous run: Total params 325,508; Trainable params 325,508; Non-trainable params 0)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 914us/step
Global accuracy score (validation) = 54.81 [%]
Global F1 score (validation) = 52.52 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.09818254 0.06949746 0.82164335 0.01067667]
 [0.11292537 0.07936186 0.79378754 0.01392519]
 [0.09624685 0.06817531 0.8253271  0.01025078]
 ...
 [0.15586202 0.10113589 0.24962872 0.49337333]
 [0.16800165 0.11953522 0.14071931 0.57174385]
 [0.36872244 0.45867682 0.05213324 0.12046751]]
(1545, 4)
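The two arrays printed above appear to be, respectively, the predicted class ids (shape (1545, 1)) and the per-class probabilities (shape (1545, 4)). Going from one to the other is an argmax over each row; a minimal pure-Python sketch, using the first printed probability row:

```python
# Sketch: predicted class id = index of the largest per-class probability.
# Row values are copied from the first row of the (1545, 4) array above.
row = [0.09818254, 0.06949746, 0.82164335, 0.01067667]
pred = max(range(len(row)), key=lambda i: row[i])
print(pred)  # 2, i.e. SEDENTARY, matching the first entry of the (1545, 1) array
```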
-------------------------------------------------

Global accuracy score (train) = 60.0 [%]
Global accuracy score (test) = 51.59 [%]
Global F1 score (train) = 58.15 [%]
Global F1 score (test) = 50.32 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.33      0.17      0.22       400
MODERATE-INTENSITY       0.42      0.64      0.51       400
         SEDENTARY       0.61      0.67      0.64       400
VIGOROUS-INTENSITY       0.70      0.60      0.64       345

          accuracy                           0.52      1545
         macro avg       0.51      0.52      0.50      1545
      weighted avg       0.51      0.52      0.50      1545
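The "macro avg" row of the report above is the unweighted mean of the per-class scores. A quick sanity check with the f1-score values copied from the table:

```python
# Sanity-check of the macro-averaged f1-score in the classification report.
# Per-class f1 values are copied verbatim from the printed table.
f1 = {
    'LIGHT-INTENSITY': 0.22,
    'MODERATE-INTENSITY': 0.51,
    'SEDENTARY': 0.64,
    'VIGOROUS-INTENSITY': 0.64,
}
macro_f1 = sum(f1.values()) / len(f1)  # unweighted mean over the 4 classes
print(f"{macro_f1:.2f}")  # 0.50, matching the macro avg row
```

The "weighted avg" row differs only in weighting each class by its support (400, 400, 400, 345).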

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259471.258155 1625368 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
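The repeated message suggests a per-activity downsampling step that skips any activity whose window count is already at or below the target size. A hypothetical sketch of that idea (the function and its names are ours, not taken from this codebase):

```python
import numpy as np

def downsample_activity(windows: np.ndarray, target: int, seed: int = 0):
    """Randomly keep `target` windows per activity; return the input
    unchanged (and warn) when the activity has too few windows to be
    balanced by downsampling."""
    if len(windows) <= target:
        print("This activity can't be balanced (in a downsampling way)")
        return windows
    rng = np.random.default_rng(seed)
    keep = rng.choice(len(windows), size=target, replace=False)
    return windows[keep]

big = np.zeros((500, 6))    # activity with surplus windows
small = np.zeros((100, 6))  # activity below the target size
print(downsample_activity(big, 400).shape)    # -> (400, 6)
print(downsample_activity(small, 400).shape)  # -> (100, 6), with the warning
```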
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259472.927348 1625506 service.cc:152] XLA service 0x7f734400adf0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259472.927393 1625506 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:31:12.972503: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259473.142344 1625506 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259475.874041 1625506 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10:39[0m 4s/step - accuracy: 0.2188 - loss: 2.2287
[1m 28/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2609 - loss: 2.0609  
[1m 65/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2691 - loss: 2.0216
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2732 - loss: 1.9907
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2763 - loss: 1.9693
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 19ms/step - accuracy: 0.2780 - loss: 1.9557
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m9s[0m 31ms/step - accuracy: 0.2780 - loss: 1.9552 - val_accuracy: 0.4621 - val_loss: 1.2350
Epoch 2/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.2500 - loss: 1.8804
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2846 - loss: 1.7814 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2901 - loss: 1.7589
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2937 - loss: 1.7381
[1m133/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2965 - loss: 1.7224
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.2992 - loss: 1.7091 - val_accuracy: 0.4740 - val_loss: 1.2161
Epoch 3/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.2812 - loss: 1.6087
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3092 - loss: 1.5360 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3165 - loss: 1.5244
[1m110/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3187 - loss: 1.5158
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3202 - loss: 1.5082
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3213 - loss: 1.5039 - val_accuracy: 0.4695 - val_loss: 1.2086
Epoch 4/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.2812 - loss: 1.4739
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3232 - loss: 1.4406 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3311 - loss: 1.4228
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3380 - loss: 1.4107
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3427 - loss: 1.4025
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3460 - loss: 1.3965 - val_accuracy: 0.4846 - val_loss: 1.1905
Epoch 5/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4844 - loss: 1.2542
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4134 - loss: 1.2929 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4058 - loss: 1.2965
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4047 - loss: 1.2957
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4042 - loss: 1.2947
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4038 - loss: 1.2938 - val_accuracy: 0.5323 - val_loss: 1.1559
Epoch 6/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5000 - loss: 1.2155
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4568 - loss: 1.2197 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4454 - loss: 1.2372
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4410 - loss: 1.2419
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4393 - loss: 1.2427
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4387 - loss: 1.2420 - val_accuracy: 0.5372 - val_loss: 1.1128
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 1.2218
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4576 - loss: 1.2020 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4535 - loss: 1.2032
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4540 - loss: 1.2017
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4535 - loss: 1.2011
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4535 - loss: 1.2004 - val_accuracy: 0.5555 - val_loss: 1.0938
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.3906 - loss: 1.2399
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4599 - loss: 1.1899 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4663 - loss: 1.1810
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4690 - loss: 1.1769
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4700 - loss: 1.1734
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4699 - loss: 1.1723 - val_accuracy: 0.5618 - val_loss: 1.0889
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.1433
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4850 - loss: 1.1499 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4829 - loss: 1.1470
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4834 - loss: 1.1439
[1m130/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4829 - loss: 1.1428
[1m162/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 2ms/step - accuracy: 0.4828 - loss: 1.1420
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4828 - loss: 1.1418 - val_accuracy: 0.5751 - val_loss: 1.0825
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.5156 - loss: 1.1533
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4967 - loss: 1.1343 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5040 - loss: 1.1216
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5056 - loss: 1.1180
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5055 - loss: 1.1177
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5055 - loss: 1.1167 - val_accuracy: 0.5671 - val_loss: 1.0983
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.1560
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5187 - loss: 1.0868 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5135 - loss: 1.0939
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5114 - loss: 1.0970
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5113 - loss: 1.0969
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5116 - loss: 1.0962 - val_accuracy: 0.5730 - val_loss: 1.0655
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 1.0894
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5054 - loss: 1.0913 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5112 - loss: 1.0887
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5136 - loss: 1.0879
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5142 - loss: 1.0876
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5142 - loss: 1.0875 - val_accuracy: 0.5737 - val_loss: 1.0676
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5312 - loss: 0.9875
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5192 - loss: 1.0785 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5177 - loss: 1.0814
[1m 98/164[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5176 - loss: 1.0804
[1m133/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5175 - loss: 1.0792
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5178 - loss: 1.0775 - val_accuracy: 0.5674 - val_loss: 1.0688
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5938 - loss: 0.9497
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5239 - loss: 1.0546 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5246 - loss: 1.0553
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5253 - loss: 1.0557
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5261 - loss: 1.0550
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5268 - loss: 1.0548 - val_accuracy: 0.5772 - val_loss: 1.0639
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 0.9966
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5287 - loss: 1.0611 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5323 - loss: 1.0520
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5331 - loss: 1.0472
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5328 - loss: 1.0459
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5326 - loss: 1.0454 - val_accuracy: 0.5474 - val_loss: 1.0605
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.0800
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5463 - loss: 1.0146 
[1m 75/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5454 - loss: 1.0171
[1m110/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5462 - loss: 1.0183
[1m147/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5458 - loss: 1.0203
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5456 - loss: 1.0210 - val_accuracy: 0.5699 - val_loss: 1.0578
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4375 - loss: 1.0375
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5352 - loss: 1.0195 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5402 - loss: 1.0140
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5423 - loss: 1.0140
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5428 - loss: 1.0156
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5429 - loss: 1.0165 - val_accuracy: 0.5632 - val_loss: 1.0664
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 0.9075
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5574 - loss: 0.9947 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5504 - loss: 1.0075
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5501 - loss: 1.0109
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5497 - loss: 1.0136
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5497 - loss: 1.0142 - val_accuracy: 0.5636 - val_loss: 1.0782
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.4688 - loss: 1.0602
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5347 - loss: 1.0296 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5418 - loss: 1.0215
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5453 - loss: 1.0168
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5481 - loss: 1.0142
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5503 - loss: 1.0121 - val_accuracy: 0.5709 - val_loss: 1.0644
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5938 - loss: 0.9589
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5629 - loss: 0.9778 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5627 - loss: 0.9830
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5621 - loss: 0.9883
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5623 - loss: 0.9908
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5624 - loss: 0.9913 - val_accuracy: 0.5741 - val_loss: 1.0551
Epoch 21/91

164/164 - 0s 2ms/step - accuracy: 0.5591 - loss: 0.9913 - val_accuracy: 0.5706 - val_loss: 1.0781
Epoch 22/91

164/164 - 0s 2ms/step - accuracy: 0.5708 - loss: 0.9784 - val_accuracy: 0.5758 - val_loss: 1.0709
Epoch 23/91

164/164 - 0s 2ms/step - accuracy: 0.5763 - loss: 0.9625 - val_accuracy: 0.5716 - val_loss: 1.0588
Epoch 24/91

164/164 - 0s 2ms/step - accuracy: 0.5869 - loss: 0.9430 - val_accuracy: 0.5657 - val_loss: 1.0762
Epoch 25/91

164/164 - 0s 2ms/step - accuracy: 0.5853 - loss: 0.9507 - val_accuracy: 0.5762 - val_loss: 1.0559

49/49 - 1s 17ms/step
Saved model to disk.

Accuracy captured in run 17: 51.59 [%]
F1-score captured in run 17: 50.32 [%]

=== RUN 18 ===

--- TRAIN (run 18) ---

--- TEST (run 18) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
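The label mapping printed above is the standard alphabetical encoding. A minimal sketch of how such a mapping can be derived (the `labels` list is a hypothetical stand-in for the dataset's intensity labels, not taken from the run):

```python
# Hypothetical stand-in for the dataset's activity-intensity labels.
labels = ["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
          "MODERATE-INTENSITY", "SEDENTARY"]

# Sorting the unique class names alphabetically reproduces the order
# printed in the log; enumerate assigns consecutive integer ids.
classes = sorted(set(labels))
label_map = {name: idx for idx, name in enumerate(classes)}
print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```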
This activity can't be balanced (in a downsampling way)  (message repeated 77 times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
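The parameter counts in the summary pin down the architecture details: assuming a kernel size of 5 with bias (an inference from the counts, not stated in the log), the printed numbers reproduce exactly. A check in plain arithmetic:

```python
# Conv1D parameters = in_channels * kernel_size * filters + filters (bias).
def conv1d_params(in_ch, filters, k):
    return in_ch * k * filters + filters

k = 5                                 # inferred kernel size (assumption)
p_conv0 = conv1d_params(250, 128, k)  # first Conv1D: 250 input channels
p_conv  = conv1d_params(128, 128, k)  # each of the two following Conv1D layers
p_ln    = 2 * 128                     # LayerNormalization: gamma + beta
p_dense = 128 * 4 + 4                 # Dense(4) classification head
total = p_conv0 + 2 * p_conv + 3 * p_ln + p_dense
print(p_conv0, p_conv, total)  # 160128 82048 325508
```

These match the summary's 160,128 / 82,048 per-layer counts and the 325,508 total.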
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 14ms/step

89/89 - 0s 921us/step
Global accuracy score (validation) = 57.37 [%]
Global F1 score (validation) = 56.86 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.08031513 0.04905053 0.85832626 0.01230804]
 [0.22498867 0.1616964  0.5725845  0.04073046]
 [0.09626385 0.06143386 0.8293209  0.01298142]
 ...
 [0.13650864 0.07396243 0.4328279  0.35670108]
 [0.24986991 0.2480957  0.03603684 0.46599755]
 [0.29917732 0.32646868 0.02714729 0.34720674]]
(1545, 4)
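The (1545, 1) class column and the (1545, 4) softmax matrix above are consistent: each integer label is the argmax of the corresponding probability row. A minimal check on two rows copied from the printout:

```python
# Two rows lifted from the (1545, 4) probability matrix above.
proba = [
    [0.08031513, 0.04905053, 0.85832626, 0.01230804],  # an early row
    [0.24986991, 0.24809570, 0.03603684, 0.46599755],  # a late row
]

# The predicted class per window is the index of the largest probability
# (0=LIGHT, 1=MODERATE, 2=SEDENTARY, 3=VIGOROUS-INTENSITY).
pred = [max(range(4), key=row.__getitem__) for row in proba]
print(pred)  # [2, 3]
```

This matches the leading 2.0 and trailing 3.0 entries of the class column.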
-------------------------------------------------

Global accuracy score (train) = 64.05 [%]
Global accuracy score (test) = 52.49 [%]
Global F1 score (train) = 63.82 [%]
Global F1 score (test) = 52.58 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.38      0.33      0.35       400
MODERATE-INTENSITY       0.46      0.47      0.46       400
         SEDENTARY       0.56      0.71      0.63       400
VIGOROUS-INTENSITY       0.73      0.61      0.66       345

          accuracy                           0.52      1545
         macro avg       0.53      0.53      0.53      1545
      weighted avg       0.52      0.52      0.52      1545
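The macro and weighted averages in the report follow from the per-class rows; recomputing them from the rounded per-class F1 values and supports (approximate, since the printed values are rounded to two decimals):

```python
# Per-class f1 and support copied from the report above (rounded values).
f1      = [0.35, 0.46, 0.63, 0.66]   # LIGHT, MODERATE, SEDENTARY, VIGOROUS
support = [400, 400, 400, 345]

macro = sum(f1) / len(f1)                                          # unweighted mean
weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)  # support-weighted
print(round(macro, 2), round(weighted, 2))  # 0.53 0.52
```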

2025-11-04 13:31:40.537086: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:31:40.548451: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259500.561845 1628664 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259500.566008 1628664 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259500.576197 1628664 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:31:40.579567: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259502.926340 1628664 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  (message repeated 77 times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259504.591314 1628797 service.cc:152] XLA service 0x7c23e401d520 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259504.591357 1628797 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:31:44.637218: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259504.798259 1628797 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259507.519716 1628797 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 - 9s 30ms/step - accuracy: 0.2819 - loss: 1.9298 - val_accuracy: 0.4382 - val_loss: 1.2740
Epoch 2/91

164/164 - 0s 2ms/step - accuracy: 0.3104 - loss: 1.6541 - val_accuracy: 0.4779 - val_loss: 1.1991
Epoch 3/91

164/164 - 0s 2ms/step - accuracy: 0.3230 - loss: 1.5165 - val_accuracy: 0.5049 - val_loss: 1.1743
Epoch 4/91

164/164 - 0s 2ms/step - accuracy: 0.3498 - loss: 1.4103 - val_accuracy: 0.5211 - val_loss: 1.1702
Epoch 5/91

164/164 - 0s 2ms/step - accuracy: 0.3951 - loss: 1.3120 - val_accuracy: 0.5400 - val_loss: 1.1311
Epoch 6/91

164/164 - 0s 2ms/step - accuracy: 0.4217 - loss: 1.2626 - val_accuracy: 0.5562 - val_loss: 1.0949
Epoch 7/91

164/164 - 0s 2ms/step - accuracy: 0.4473 - loss: 1.2105 - val_accuracy: 0.5544 - val_loss: 1.0751
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4219 - loss: 1.2498
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4741 - loss: 1.1707 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4753 - loss: 1.1653
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4756 - loss: 1.1619
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4746 - loss: 1.1607
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4745 - loss: 1.1601 - val_accuracy: 0.5527 - val_loss: 1.0570
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.5625 - loss: 1.1370
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4871 - loss: 1.1274 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4880 - loss: 1.1259
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4893 - loss: 1.1261
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4898 - loss: 1.1280
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4904 - loss: 1.1281 - val_accuracy: 0.5765 - val_loss: 1.0436
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5156 - loss: 1.1207
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4749 - loss: 1.1560 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4875 - loss: 1.1408
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4932 - loss: 1.1318
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4957 - loss: 1.1265
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4979 - loss: 1.1227 - val_accuracy: 0.5737 - val_loss: 1.0467
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0716
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5241 - loss: 1.0850 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5177 - loss: 1.0877
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5157 - loss: 1.0872
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5143 - loss: 1.0871
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5138 - loss: 1.0868 - val_accuracy: 0.5790 - val_loss: 1.0425
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.5156 - loss: 1.1431
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5396 - loss: 1.0655 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5347 - loss: 1.0662
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5326 - loss: 1.0664
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5311 - loss: 1.0669
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5303 - loss: 1.0672 - val_accuracy: 0.5706 - val_loss: 1.0378
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.6562 - loss: 0.9360
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5191 - loss: 1.0834 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5214 - loss: 1.0806
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5238 - loss: 1.0763
[1m149/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5251 - loss: 1.0728
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5257 - loss: 1.0714 - val_accuracy: 0.5748 - val_loss: 1.0367
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6562 - loss: 0.9978
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5279 - loss: 1.0509 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5305 - loss: 1.0475
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5310 - loss: 1.0475
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5315 - loss: 1.0469
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5322 - loss: 1.0461 - val_accuracy: 0.5787 - val_loss: 1.0308
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4844 - loss: 0.9794
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5256 - loss: 1.0244 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5277 - loss: 1.0295
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5290 - loss: 1.0321
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5309 - loss: 1.0333
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5318 - loss: 1.0339 - val_accuracy: 0.5797 - val_loss: 1.0277
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5469 - loss: 0.9594
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5583 - loss: 1.0175 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5541 - loss: 1.0158
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5525 - loss: 1.0152
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5509 - loss: 1.0160
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5505 - loss: 1.0162 - val_accuracy: 0.5797 - val_loss: 1.0288
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6562 - loss: 0.9204
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5705 - loss: 1.0065 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5662 - loss: 1.0041
[1m110/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5621 - loss: 1.0057
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5594 - loss: 1.0072
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5587 - loss: 1.0074 - val_accuracy: 0.5741 - val_loss: 1.0303
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5938 - loss: 1.0522
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5681 - loss: 0.9978 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5656 - loss: 0.9940
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5652 - loss: 0.9927
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5645 - loss: 0.9926
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5639 - loss: 0.9934 - val_accuracy: 0.5730 - val_loss: 1.0260
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 0.9947
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5656 - loss: 0.9544 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5642 - loss: 0.9673
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5626 - loss: 0.9738
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5622 - loss: 0.9782
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5623 - loss: 0.9802 - val_accuracy: 0.5779 - val_loss: 1.0514
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.9093
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5833 - loss: 0.9690 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5793 - loss: 0.9724
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5782 - loss: 0.9729
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5764 - loss: 0.9751
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5758 - loss: 0.9762 - val_accuracy: 0.5741 - val_loss: 1.0416
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5469 - loss: 0.9886
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5837 - loss: 0.9550 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5803 - loss: 0.9641
[1m110/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5786 - loss: 0.9683
[1m145/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5772 - loss: 0.9698
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5762 - loss: 0.9705 - val_accuracy: 0.5646 - val_loss: 1.0578
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.8574
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5888 - loss: 0.9256 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5829 - loss: 0.9414
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5812 - loss: 0.9473
[1m143/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5804 - loss: 0.9501
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5800 - loss: 0.9514 - val_accuracy: 0.5737 - val_loss: 1.0357
Epoch 23/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4375 - loss: 1.3471
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5714 - loss: 0.9844 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5775 - loss: 0.9671
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5789 - loss: 0.9611
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5793 - loss: 0.9579
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5792 - loss: 0.9570 - val_accuracy: 0.5744 - val_loss: 1.0526

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 18: 52.49 [%]
F1-score captured in run 18: 52.58 [%]

=== RUN 19 ===

--- TRAIN (run 19) ---

--- TEST (run 19) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
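The mapping printed above assigns indices in alphabetical order of the class names. A minimal sketch of how such a mapping is typically built (an assumption about this pipeline, not its actual code):

```python
# Hypothetical reconstruction: enumerate the sorted unique class names,
# which reproduces the alphabetical index assignment seen in the log.
raw_labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
              'MODERATE-INTENSITY', 'SEDENTARY']

label_map = {name: idx for idx, name in enumerate(sorted(set(raw_labels)))}
print(label_map)
```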
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
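The Param # column of the summary is internally consistent: it implies Conv1D layers with kernel size 5 (inferred from the counts, not stated in the log) acting on the (6, 250) inputs printed below, each followed by a LayerNormalization with one gamma and one beta per channel. A quick arithmetic check:

```python
# Re-derive the Keras summary's parameter counts (arithmetic check only,
# not the training code; kernel_size=5 is inferred from the counts).
def conv1d_params(kernel, in_ch, filters):
    return kernel * in_ch * filters + filters  # kernel weights + biases

def layer_norm_params(channels):
    return 2 * channels  # gamma + beta, one pair per channel

params = [
    conv1d_params(5, 250, 128),   # conv1d                -> 160,128
    layer_norm_params(128),       # layer_normalization   ->     256
    conv1d_params(5, 128, 128),   # conv1d_1              ->  82,048
    layer_norm_params(128),       # layer_normalization_1 ->     256
    conv1d_params(5, 128, 128),   # conv1d_2              ->  82,048
    layer_norm_params(128),       # layer_normalization_2 ->     256
    128 * 4 + 4,                  # dense, 4 classes      ->     516
]
print(sum(params))  # 325508, matching "Total params" above
```

Dropout and GlobalAveragePooling1D layers contribute zero parameters, which is why they are omitted from the sum.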
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 967us/step
Global accuracy score (validation) = 57.9 [%]
Global F1 score (validation) = 56.49 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.3886258  0.42297706 0.13412547 0.05427163]
 [0.44282866 0.35121667 0.12194764 0.08400702]
 [0.246268   0.17462416 0.5307738  0.04833396]
 ...
 [0.12231723 0.07199608 0.3254426  0.4802441 ]
 [0.09432875 0.08127784 0.03823058 0.78616273]
 [0.3528676  0.42479125 0.03010175 0.19223939]]
(1545, 4)
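Each row of the (1545, 4) matrix above is a softmax distribution over the four classes, and the hard prediction is its argmax. A sketch using the first printed row (pure Python, independent of the actual pipeline):

```python
# Map one row of class probabilities to a predicted label, using the
# class order from the label mapping earlier in the log.
classes = ['LIGHT-INTENSITY', 'MODERATE-INTENSITY', 'SEDENTARY', 'VIGOROUS-INTENSITY']
row = [0.3886258, 0.42297706, 0.13412547, 0.05427163]  # first row above

pred_idx = max(range(len(row)), key=row.__getitem__)  # argmax
print(classes[pred_idx])  # MODERATE-INTENSITY
```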
-------------------------------------------------

Global accuracy score (train) = 63.56 [%]
Global accuracy score (test) = 53.2 [%]
Global F1 score (train) = 62.44 [%]
Global F1 score (test) = 52.89 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.40      0.30      0.34       400
MODERATE-INTENSITY       0.46      0.54      0.50       400
         SEDENTARY       0.61      0.69      0.64       400
VIGOROUS-INTENSITY       0.65      0.61      0.63       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.53      1545
      weighted avg       0.53      0.53      0.53      1545
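The "macro avg" and "weighted avg" rows differ only in whether each class's score is weighted by its support. Recomputing them from the report's (rounded) per-class F1 values reproduces both to within rounding error:

```python
# Recompute the report's average rows from its rounded per-class F1 values.
# Inputs are the rounded figures printed above, so agreement is approximate.
f1      = [0.34, 0.50, 0.64, 0.63]   # per-class F1
support = [400, 400, 400, 345]       # per-class sample counts

macro_f1    = sum(f1) / len(f1)                                    # ≈ 0.53
weighted_f1 = sum(f * s for f, s in zip(f1, support)) / sum(support)  # ≈ 0.52-0.53
print(macro_f1, weighted_f1)
```

Here the classes are nearly balanced (support 400/400/400/345), so the two averages almost coincide; with skewed supports they can diverge substantially.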

2025-11-04 13:32:11.373258: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:32:11.384750: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259531.397747 1631774 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259531.401875 1631774 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259531.411683 1631774 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:32:11.414798: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259533.755474 1631774 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259535.428968 1631915 service.cc:152] XLA service 0x7887b0002fb0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259535.429034 1631915 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:32:15.477929: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259535.644064 1631915 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259538.363077 1631915 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2935 - loss: 2.0125 - val_accuracy: 0.4712 - val_loss: 1.2183
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3030 - loss: 1.7345 - val_accuracy: 0.4754 - val_loss: 1.2141
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3114 - loss: 1.5481 - val_accuracy: 0.5109 - val_loss: 1.1967
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3331 - loss: 1.4264 - val_accuracy: 0.5081 - val_loss: 1.1830
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3775 - loss: 1.3372 - val_accuracy: 0.5260 - val_loss: 1.1652
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3999 - loss: 1.2957 - val_accuracy: 0.5288 - val_loss: 1.1317
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4430 - loss: 1.2444 - val_accuracy: 0.5565 - val_loss: 1.1034
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4554 - loss: 1.1884 - val_accuracy: 0.5604 - val_loss: 1.0906
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4783 - loss: 1.1556 - val_accuracy: 0.5600 - val_loss: 1.0567
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4993 - loss: 1.1317 - val_accuracy: 0.5716 - val_loss: 1.0710
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5071 - loss: 1.1131 - val_accuracy: 0.5685 - val_loss: 1.0632
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5174 - loss: 1.0944 - val_accuracy: 0.5744 - val_loss: 1.0464
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5317 - loss: 1.0735 - val_accuracy: 0.5808 - val_loss: 1.0507
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5254 - loss: 1.0733 - val_accuracy: 0.5783 - val_loss: 1.0604
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5305 - loss: 1.0498 - val_accuracy: 0.5829 - val_loss: 1.0450
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5392 - loss: 1.0434 - val_accuracy: 0.5737 - val_loss: 1.0421
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5457 - loss: 1.0279 - val_accuracy: 0.5864 - val_loss: 1.0533
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5673 - loss: 0.9985 - val_accuracy: 0.5892 - val_loss: 1.0462
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5583 - loss: 1.0043 - val_accuracy: 0.5846 - val_loss: 1.0512
Epoch 20/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5669 - loss: 1.0082 - val_accuracy: 0.5843 - val_loss: 1.0672
Epoch 21/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5765 - loss: 0.9778 - val_accuracy: 0.5829 - val_loss: 1.0573
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 19: 53.2 [%]
F1-score captured in run 19: 52.89 [%]
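The log does not show how the F1-score above is computed; a macro-averaged F1 over the four intensity classes is the usual choice for imbalanced activity data. A minimal pure-Python sketch (the project itself may use `sklearn.metrics.f1_score(average="macro")`):

```python
def macro_f1(y_true, y_pred, labels):
    """Macro F1: per-class F1 scores averaged with equal class weight."""
    scores = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        scores.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(scores) / len(scores)
```

Because each class contributes equally to the average, a model that ignores a rare class is penalized even if overall accuracy stays high.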

=== RUN 20 ===

--- TRAIN (run 20) ---

--- TEST (run 20) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
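The label mapping printed above follows the usual pattern of enumerating the sorted unique class names, which is why the indices come out alphabetical. A minimal sketch in plain Python (the project may use `numpy.unique` or a scikit-learn `LabelEncoder` instead):

```python
# Hypothetical sample of activity labels; the real ones come from the dataset.
labels = [
    "SEDENTARY", "LIGHT-INTENSITY", "MODERATE-INTENSITY",
    "VIGOROUS-INTENSITY", "SEDENTARY", "LIGHT-INTENSITY",
]
classes = sorted(set(labels))  # unique names in alphabetical order
mapping = {name: idx for idx, name in enumerate(classes)}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```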
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
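The summary above pins down the layer hyperparameters: with 250 input channels and 128 filters, the first Conv1D's 160,128 parameters imply a kernel size of 5 (5·250·128 + 128), and the 82,048 parameters of the two following Conv1D layers fit the same kernel size on 128 channels. A minimal arithmetic check of that inference (pure Python; the kernel size and `padding='same'` layout are deductions from the summary, not taken from the training script):

```python
# Parameter-count check for the Conv1D stack printed above.
# Assumed hyperparameters (inferred from the summary): kernel_size=5,
# filters=128, input of 6 steps x 250 channels, 4 output classes.

def conv1d_params(kernel, in_ch, filters):
    # weights: kernel * in_ch * filters, plus one bias per filter
    return kernel * in_ch * filters + filters

def layer_norm_params(features):
    # one gamma and one beta per feature
    return 2 * features

params = [
    conv1d_params(5, 250, 128),   # conv1d              -> 160,128
    layer_norm_params(128),       # layer_normalization ->     256
    conv1d_params(5, 128, 128),   # conv1d_1            ->  82,048
    layer_norm_params(128),       # layer_normalization_1
    conv1d_params(5, 128, 128),   # conv1d_2            ->  82,048
    layer_norm_params(128),       # layer_normalization_2
    128 * 4 + 4,                  # dense (128 -> 4)    ->     516
]
total = sum(params)
print(total)  # 325508, matching "Total params" in the summary
```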

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 846us/step
Global accuracy score (validation) = 57.51 [%]
Global F1 score (validation) = 56.16 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.23220201 0.16661172 0.554127   0.04705927]
 [0.08486249 0.05158341 0.85417765 0.0093764 ]
 [0.23730096 0.1710536  0.5423434  0.04930203]
 ...
 [0.16071393 0.09556857 0.5016227  0.24209486]
 [0.3423042  0.35887322 0.0659693  0.23285338]
 [0.3324881  0.4233142  0.04613116 0.19806655]]
(1545, 4)
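The (1545, 4) matrix holds the softmax probabilities over the four intensity classes, and the (1545, 1) column the integer class labels; predicted classes come from a row-wise argmax. A small sketch using the first two probability rows printed above (the class order is assumed to match the label encoding):

```python
# Row-wise argmax over softmax outputs -> predicted class index.
# The two rows are copied verbatim from the probability matrix above.
probs = [
    [0.23220201, 0.16661172, 0.554127,   0.04705927],
    [0.08486249, 0.05158341, 0.85417765, 0.0093764],
]

def argmax(row):
    # index of the largest probability in the row
    return max(range(len(row)), key=lambda i: row[i])

preds = [argmax(row) for row in probs]
print(preds)  # [2, 2] -- both agree with the printed labels [[2.], [2.], ...]
```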
-------------------------------------------------

Global accuracy score (train) = 62.88 [%]
Global accuracy score (test) = 52.69 [%]
Global F1 score (train) = 61.98 [%]
Global F1 score (test) = 51.82 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.40      0.24      0.30       400
MODERATE-INTENSITY       0.44      0.57      0.50       400
         SEDENTARY       0.59      0.72      0.65       400
VIGOROUS-INTENSITY       0.68      0.59      0.63       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.52      1545
      weighted avg       0.52      0.53      0.51      1545
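The overall accuracy in the report is simply per-class recall weighted by support, since each correct prediction is counted once in its true class. Recomputing it from the (rounded) per-class figures above reproduces the reported 0.53 to two decimals:

```python
# Accuracy as support-weighted recall, using the rounded values from the
# classification report above (so the result is approximate).
recall  = {"LIGHT": 0.24, "MODERATE": 0.57, "SEDENTARY": 0.72, "VIGOROUS": 0.59}
support = {"LIGHT": 400,  "MODERATE": 400,  "SEDENTARY": 400,  "VIGOROUS": 345}

correct = sum(recall[c] * support[c] for c in recall)  # estimated correct predictions
total = sum(support.values())                          # 1545 test windows
print(round(correct / total, 2))  # 0.53, matching the reported accuracy row
```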

2025-11-04 13:32:41.642286: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:32:41.653441: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259561.666740 1634708 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259561.670739 1634708 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259561.681042 1634708 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
(message repeated 4 times in total)
2025-11-04 13:32:41.684248: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259564.051692 1634708 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(message repeated 77 times in total)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259565.722912 1634842 service.cc:152] XLA service 0x7761dc01de60 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259565.722948 1634842 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:32:45.766141: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259565.938716 1634842 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259568.661758 1634842 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2908 - loss: 1.9731 - val_accuracy: 0.4466 - val_loss: 1.3128
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3050 - loss: 1.7346 - val_accuracy: 0.4737 - val_loss: 1.2137
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3349 - loss: 1.5249 - val_accuracy: 0.4905 - val_loss: 1.2022
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3578 - loss: 1.4191 - val_accuracy: 0.5067 - val_loss: 1.1892
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3797 - loss: 1.3598 - val_accuracy: 0.5137 - val_loss: 1.1770
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4103 - loss: 1.2854 - val_accuracy: 0.5305 - val_loss: 1.1484
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4373 - loss: 1.2474 - val_accuracy: 0.5425 - val_loss: 1.1118
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4686 - loss: 1.1862 - val_accuracy: 0.5600 - val_loss: 1.0844
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4821 - loss: 1.1564 - val_accuracy: 0.5579 - val_loss: 1.0856
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4879 - loss: 1.1364 - val_accuracy: 0.5681 - val_loss: 1.0618
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5074 - loss: 1.1101 - val_accuracy: 0.5758 - val_loss: 1.0673
Epoch 12/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5108 - loss: 1.0968 - val_accuracy: 0.5716 - val_loss: 1.0550
Epoch 13/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5188 - loss: 1.0825 - val_accuracy: 0.5755 - val_loss: 1.0486
Epoch 14/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5271 - loss: 1.0653 - val_accuracy: 0.5755 - val_loss: 1.0577
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 0.9599
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5472 - loss: 1.0124 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5463 - loss: 1.0174
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5435 - loss: 1.0235
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5420 - loss: 1.0281
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5416 - loss: 1.0299 - val_accuracy: 0.5723 - val_loss: 1.0577
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.4688 - loss: 1.0461
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5278 - loss: 1.0397 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5295 - loss: 1.0426
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5327 - loss: 1.0410
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5345 - loss: 1.0403
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5356 - loss: 1.0399 - val_accuracy: 0.5787 - val_loss: 1.0474
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.0945
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5635 - loss: 1.0116 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5633 - loss: 1.0182
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5627 - loss: 1.0204
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5619 - loss: 1.0212
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5607 - loss: 1.0221 - val_accuracy: 0.5646 - val_loss: 1.0655
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.2640
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5335 - loss: 1.0542 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5424 - loss: 1.0340
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5455 - loss: 1.0280
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5470 - loss: 1.0259
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5479 - loss: 1.0246 - val_accuracy: 0.5797 - val_loss: 1.0638
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.3906 - loss: 1.2136
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5354 - loss: 1.0281 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5454 - loss: 1.0159
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5493 - loss: 1.0119
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5516 - loss: 1.0099
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5525 - loss: 1.0092 - val_accuracy: 0.5804 - val_loss: 1.0436
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5312 - loss: 1.0233
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5641 - loss: 0.9981 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5659 - loss: 0.9939
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5676 - loss: 0.9902
[1m145/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5684 - loss: 0.9891
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5681 - loss: 0.9893 - val_accuracy: 0.5797 - val_loss: 1.0494
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6719 - loss: 0.9030
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5997 - loss: 0.9792 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5929 - loss: 0.9724
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5880 - loss: 0.9713
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5847 - loss: 0.9726
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5828 - loss: 0.9741 - val_accuracy: 0.5730 - val_loss: 1.0452
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5625 - loss: 0.9858
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5593 - loss: 0.9913 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5642 - loss: 0.9818
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5657 - loss: 0.9786
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5664 - loss: 0.9778
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5667 - loss: 0.9773 - val_accuracy: 0.5629 - val_loss: 1.0548
Epoch 23/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5781 - loss: 1.1190
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5748 - loss: 0.9873 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5739 - loss: 0.9851
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5753 - loss: 0.9804
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5774 - loss: 0.9753
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5789 - loss: 0.9721 - val_accuracy: 0.5614 - val_loss: 1.0400
Epoch 24/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 1.0689
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5667 - loss: 0.9854 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5777 - loss: 0.9686
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5801 - loss: 0.9645
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5802 - loss: 0.9629
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5797 - loss: 0.9625 - val_accuracy: 0.5713 - val_loss: 1.0728
Epoch 25/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.9969
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5655 - loss: 0.9715 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5706 - loss: 0.9697
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5763 - loss: 0.9654
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5802 - loss: 0.9619
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5820 - loss: 0.9600 - val_accuracy: 0.5787 - val_loss: 1.0554
Epoch 26/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 1.0744
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6079 - loss: 0.9270 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6009 - loss: 0.9327
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5974 - loss: 0.9371
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5953 - loss: 0.9410
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5949 - loss: 0.9416 - val_accuracy: 0.5734 - val_loss: 1.0675
Epoch 27/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 0.9298
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.6087 - loss: 0.9181 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6032 - loss: 0.9199
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6003 - loss: 0.9226
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5987 - loss: 0.9255
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5982 - loss: 0.9266 - val_accuracy: 0.5783 - val_loss: 1.0619
Epoch 28/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5938 - loss: 0.9598
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6122 - loss: 0.9153 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6061 - loss: 0.9209
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.6044 - loss: 0.9212
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.6028 - loss: 0.9221
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.6015 - loss: 0.9228 - val_accuracy: 0.5779 - val_loss: 1.0612

[1m 1/49[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m19s[0m 414ms/step
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 16ms/step  
[1m49/49[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m1s[0m 17ms/step
Saved model to disk.

Accuracy captured in run 20: 52.69 [%]
F1-score captured in run 20: 51.82 [%]

=== RUN 21 ===

--- TRAIN (run 21) ---

--- TEST (run 21) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
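The mapping printed above enumerates the class names in alphabetical order, which is what you get from sorting the unique labels. A minimal sketch of how such a mapping is typically built (the actual pipeline code is not shown in this log, so variable names here are hypothetical):

```python
# Hypothetical label vector; only the four class names come from the log.
labels = ["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
          "MODERATE-INTENSITY", "SEDENTARY"]

# Sorting the unique names alphabetically (as np.unique would) and
# enumerating them reproduces the 0..3 indices printed in the log.
classes = sorted(set(labels))
label_map = {name: idx for idx, name in enumerate(classes)}
print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```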
This activity can't be balanced (in a downsampling way)
(previous message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
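The parameter counts in the summary can be cross-checked by hand: a Conv1D layer with kernel size k, c input channels, and f filters has k·c·f + f parameters, and LayerNormalization contributes 2 per feature (gamma and beta). Given the (6, 250) input windows shown further down, a kernel size of 5 reproduces every row (the kernel size is inferred from the counts, not stated in the log):

```python
def conv1d_params(kernel, in_ch, filters):
    # kernel * in_ch * filters weights, plus one bias per filter
    return kernel * in_ch * filters + filters

conv0 = conv1d_params(5, 250, 128)   # conv1d: 160,128
conv1 = conv1d_params(5, 128, 128)   # conv1d_1 / conv1d_2: 82,048 each
ln    = 2 * 128                      # layer_normalization: 256 (gamma + beta)
dense = 128 * 4 + 4                  # dense: 516 (four output classes)

# Dropout and GlobalAveragePooling1D have no parameters.
total = conv0 + 2 * conv1 + 3 * ln + dense
print(total)  # 325508, the "Total params" row
```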
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 15ms/step

89/89 - 0s 867us/step
Global accuracy score (validation) = 56.95 [%]
Global F1 score (validation) = 55.75 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.21964781 0.15005746 0.59357697 0.03671767]
 [0.3910658  0.53824115 0.04393312 0.02675996]
 [0.14636719 0.07960481 0.7364651  0.03756284]
 ...
 [0.11095899 0.05982289 0.18605074 0.64316744]
 [0.24735144 0.15040784 0.10235029 0.49989045]
 [0.33471674 0.4418893  0.02489983 0.19849408]]
(1545, 4)
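The (1545, 4) matrix holds per-class softmax probabilities, and the (1545, 1) column printed above it is simply the row-wise argmax of those probabilities. A sketch using two of the rows actually printed in the log:

```python
# Two probability rows copied from the log output above.
probs = [
    [0.21964781, 0.15005746, 0.59357697, 0.03671767],  # first printed row
    [0.11095899, 0.05982289, 0.18605074, 0.64316744],  # a later printed row
]

def argmax(row):
    # index of the largest probability = predicted class id
    return max(range(len(row)), key=row.__getitem__)

preds = [argmax(r) for r in probs]
print(preds)  # [2, 3] -> SEDENTARY, VIGOROUS-INTENSITY under the label mapping
```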
-------------------------------------------------

Global accuracy score (train) = 64.81 [%]
Global accuracy score (test) = 55.15 [%]
Global F1 score (train) = 63.76 [%]
Global F1 score (test) = 55.06 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.41      0.32      0.36       400
MODERATE-INTENSITY       0.47      0.57      0.52       400
         SEDENTARY       0.64      0.69      0.67       400
VIGOROUS-INTENSITY       0.69      0.63      0.66       345

          accuracy                           0.55      1545
         macro avg       0.55      0.55      0.55      1545
      weighted avg       0.55      0.55      0.55      1545
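The macro and weighted averages at the bottom of the report are plain means of the per-class rows: unweighted for macro, support-weighted for weighted. Recomputing them from the printed values confirms both round to 0.55:

```python
# (precision, recall, f1, support) per class, copied from the report above.
rows = {
    "LIGHT-INTENSITY":    (0.41, 0.32, 0.36, 400),
    "MODERATE-INTENSITY": (0.47, 0.57, 0.52, 400),
    "SEDENTARY":          (0.64, 0.69, 0.67, 400),
    "VIGOROUS-INTENSITY": (0.69, 0.63, 0.66, 345),
}

f1s      = [r[2] for r in rows.values()]
supports = [r[3] for r in rows.values()]

macro_f1    = sum(f1s) / len(f1s)                                   # simple mean
weighted_f1 = sum(f * s for f, s in zip(f1s, supports)) / sum(supports)
print(round(macro_f1, 2), round(weighted_f1, 2))  # 0.55 0.55
```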

2025-11-04 13:33:14.468984: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:33:14.480704: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259594.494537 1638297 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259594.498842 1638297 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259594.509077 1638297 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
(previous message repeated 3 more times)
2025-11-04 13:33:14.512302: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259596.878711 1638297 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(previous message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
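The parameter counts in the summary above can be reproduced by hand. A minimal arithmetic check, assuming a kernel size of 5 and a 250-channel input window (both inferred from the table's param counts, not stated anywhere in the log):

```python
# Sanity-check of the parameter counts reported in the Keras summary above.
# Assumed (inferred, not confirmed by the log): kernel_size=5, input shape (6, 250).

def conv1d_params(kernel_size, in_channels, filters):
    # Weights: kernel_size * in_channels * filters, plus one bias per filter.
    return kernel_size * in_channels * filters + filters

def dense_params(in_features, units):
    return in_features * units + units

layers = [
    conv1d_params(5, 250, 128),   # conv1d        -> 160,128
    2 * 128,                      # layer_normalization (gamma + beta) -> 256
    conv1d_params(5, 128, 128),   # conv1d_1      ->  82,048
    2 * 128,                      # layer_normalization_1
    conv1d_params(5, 128, 128),   # conv1d_2      ->  82,048
    2 * 128,                      # layer_normalization_2
    dense_params(128, 4),         # dense         ->     516
]
total = sum(layers)
print(total)  # 325508, matching "Total params: 325,508"
```

Dropout and GlobalAveragePooling1D layers contribute no parameters, which is why they show 0 in the table.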
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259598.560078 1638414 service.cc:152] XLA service 0x79d9ec00af20 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259598.560115 1638414 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:33:18.604998: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259598.780869 1638414 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259601.514889 1638414 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 - 9s 30ms/step - accuracy: 0.2843 - loss: 1.9868 - val_accuracy: 0.4638 - val_loss: 1.2627
Epoch 2/91

164/164 - 0s 2ms/step - accuracy: 0.3056 - loss: 1.6999 - val_accuracy: 0.4726 - val_loss: 1.2430
Epoch 3/91

164/164 - 0s 2ms/step - accuracy: 0.3263 - loss: 1.5378 - val_accuracy: 0.4958 - val_loss: 1.2040
Epoch 4/91

164/164 - 0s 2ms/step - accuracy: 0.3398 - loss: 1.4170 - val_accuracy: 0.5077 - val_loss: 1.1847
Epoch 5/91

164/164 - 0s 2ms/step - accuracy: 0.3656 - loss: 1.3397 - val_accuracy: 0.5179 - val_loss: 1.1597
Epoch 6/91

164/164 - 0s 2ms/step - accuracy: 0.4068 - loss: 1.2835 - val_accuracy: 0.5544 - val_loss: 1.1339
Epoch 7/91

164/164 - 0s 2ms/step - accuracy: 0.4444 - loss: 1.2272 - val_accuracy: 0.5527 - val_loss: 1.1021
Epoch 8/91

164/164 - 0s 2ms/step - accuracy: 0.4580 - loss: 1.1860 - val_accuracy: 0.5569 - val_loss: 1.0852
Epoch 9/91

164/164 - 0s 2ms/step - accuracy: 0.4687 - loss: 1.1653 - val_accuracy: 0.5614 - val_loss: 1.0759
Epoch 10/91

164/164 - 0s 2ms/step - accuracy: 0.4899 - loss: 1.1179 - val_accuracy: 0.5537 - val_loss: 1.0716
Epoch 11/91

164/164 - 0s 2ms/step - accuracy: 0.5091 - loss: 1.0958 - val_accuracy: 0.5572 - val_loss: 1.0638
Epoch 12/91

164/164 - 0s 2ms/step - accuracy: 0.5144 - loss: 1.0946 - val_accuracy: 0.5513 - val_loss: 1.0969
Epoch 13/91

164/164 - 0s 2ms/step - accuracy: 0.5251 - loss: 1.0666 - val_accuracy: 0.5555 - val_loss: 1.0574
Epoch 14/91

164/164 - 0s 2ms/step - accuracy: 0.5280 - loss: 1.0652 - val_accuracy: 0.5565 - val_loss: 1.0873
Epoch 15/91

164/164 - 0s 2ms/step - accuracy: 0.5381 - loss: 1.0462 - val_accuracy: 0.5499 - val_loss: 1.0507
Epoch 16/91

164/164 - 0s 2ms/step - accuracy: 0.5448 - loss: 1.0251 - val_accuracy: 0.5551 - val_loss: 1.0662
Epoch 17/91

164/164 - 0s 2ms/step - accuracy: 0.5498 - loss: 1.0137 - val_accuracy: 0.5629 - val_loss: 1.0614
Epoch 18/91

164/164 - 0s 2ms/step - accuracy: 0.5590 - loss: 0.9998 - val_accuracy: 0.5621 - val_loss: 1.0439
Epoch 19/91

164/164 - 0s 2ms/step - accuracy: 0.5619 - loss: 0.9960 - val_accuracy: 0.5565 - val_loss: 1.0659
Epoch 20/91

164/164 - 0s 2ms/step - accuracy: 0.5693 - loss: 0.9852 - val_accuracy: 0.5723 - val_loss: 1.0440
Epoch 21/91

164/164 - 0s 2ms/step - accuracy: 0.5828 - loss: 0.9764 - val_accuracy: 0.5621 - val_loss: 1.0642
Epoch 22/91

164/164 - 0s 2ms/step - accuracy: 0.5857 - loss: 0.9549 - val_accuracy: 0.5583 - val_loss: 1.0544
Epoch 23/91

164/164 - 0s 2ms/step - accuracy: 0.5848 - loss: 0.9522 - val_accuracy: 0.5692 - val_loss: 1.0422
Epoch 24/91

164/164 - 0s 2ms/step - accuracy: 0.5777 - loss: 0.9546 - val_accuracy: 0.5653 - val_loss: 1.0457
Epoch 25/91

164/164 - 0s 2ms/step - accuracy: 0.5777 - loss: 0.9538 - val_accuracy: 0.5583 - val_loss: 1.0790
Epoch 26/91

164/164 - 0s 2ms/step - accuracy: 0.5897 - loss: 0.9401 - val_accuracy: 0.5639 - val_loss: 1.0760
Epoch 27/91

164/164 - 0s 2ms/step - accuracy: 0.5986 - loss: 0.9145 - val_accuracy: 0.5629 - val_loss: 1.0606
Epoch 28/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.6041 - loss: 0.9165 - val_accuracy: 0.5579 - val_loss: 1.0531

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 21: 55.15 [%]
F1-score captured in run 21: 55.06 [%]

=== RUN 22 ===

--- TRAIN (run 22) ---

--- TEST (run 22) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
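The mapping above is plain alphabetical label encoding: the indices follow the sorted order of the unique class names printed two lines earlier. A minimal stdlib sketch that reproduces it (assuming the pipeline enumerates the sorted unique labels, as e.g. `np.unique` would return them):

```python
# Reproduce the label-to-index mapping shown in the log.
# Assumption: indices follow the alphabetical order of the unique
# class names, which matches the printed array above.
labels = ['LIGHT-INTENSITY', 'MODERATE-INTENSITY',
          'SEDENTARY', 'VIGOROUS-INTENSITY']
mapping = {name: idx for idx, name in enumerate(sorted(set(labels)))}
print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```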
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 999us/step
Global accuracy score (validation) = 56.53 [%]
Global F1 score (validation) = 55.45 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.11660801 0.06781416 0.80482507 0.01075282]
 [0.11308214 0.07289626 0.8025685  0.0114531 ]
 [0.08468156 0.05268316 0.85445285 0.00818244]
 ...
 [0.15726544 0.09269514 0.16994925 0.58009017]
 [0.1844863  0.07112835 0.5399641  0.20442128]
 [0.35867727 0.54845536 0.02551107 0.06735631]]
(1545, 4)
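The (1545, 4) matrix holds one row of per-class softmax probabilities per test window (each row sums to ~1), and a class prediction is the row-wise argmax. A stdlib sketch on the first row printed above:

```python
# Row-wise argmax over softmax probabilities, using the first
# probability row printed above.
row = [0.11660801, 0.06781416, 0.80482507, 0.01075282]
pred = max(range(len(row)), key=row.__getitem__)
print(pred)  # 2, i.e. SEDENTARY under the label mapping above
```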
-------------------------------------------------

Global accuracy score (train) = 65.01 [%]
Global accuracy score (test) = 52.82 [%]
Global F1 score (train) = 64.27 [%]
Global F1 score (test) = 52.73 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.41      0.31      0.35       400
MODERATE-INTENSITY       0.44      0.54      0.48       400
         SEDENTARY       0.60      0.69      0.65       400
VIGOROUS-INTENSITY       0.69      0.58      0.63       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.53      1545
      weighted avg       0.53      0.53      0.52      1545
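As a consistency check, the `macro avg` and `weighted avg` rows follow directly from the per-class F1 scores and supports in the report above (macro: unweighted mean; weighted: mean weighted by support):

```python
# Recompute the macro and weighted F1 averages from the per-class
# rows of the classification report above.
f1      = [0.35, 0.48, 0.65, 0.63]   # per-class f1-score
support = [400, 400, 400, 345]

macro    = sum(f1) / len(f1)
weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)
print(round(macro, 2), round(weighted, 2))  # 0.53 0.52
```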

2025-11-04 13:33:47.114176: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:33:47.125557: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259627.140471 1641864 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259627.144736 1641864 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259627.155392 1641864 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:33:47.158753: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259629.536815 1641864 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259631.205642 1641991 service.cc:152] XLA service 0x77cc44005670 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259631.205697 1641991 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:33:51.257041: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259631.431118 1641991 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259634.159574 1641991 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2761 - loss: 1.9931 - val_accuracy: 0.4772 - val_loss: 1.2540
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3134 - loss: 1.6941 - val_accuracy: 0.4768 - val_loss: 1.2113
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3225 - loss: 1.5553 - val_accuracy: 0.4972 - val_loss: 1.1804
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3435 - loss: 1.4153 - val_accuracy: 0.4982 - val_loss: 1.1707
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3929 - loss: 1.3278 - val_accuracy: 0.5302 - val_loss: 1.1329
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4209 - loss: 1.2523 - val_accuracy: 0.5471 - val_loss: 1.0996
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4509 - loss: 1.2110 - val_accuracy: 0.5516 - val_loss: 1.0741
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4654 - loss: 1.1789 - val_accuracy: 0.5660 - val_loss: 1.0514
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4882 - loss: 1.1332 - val_accuracy: 0.5674 - val_loss: 1.0429
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 1.0741
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4857 - loss: 1.1096 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4844 - loss: 1.1147
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4856 - loss: 1.1173
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4877 - loss: 1.1179
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4889 - loss: 1.1178 - val_accuracy: 0.5769 - val_loss: 1.0424
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4844 - loss: 1.0049
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4751 - loss: 1.0983 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4841 - loss: 1.1035
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4911 - loss: 1.1024
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4945 - loss: 1.1022
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4963 - loss: 1.1011 - val_accuracy: 0.5671 - val_loss: 1.0344
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.4531 - loss: 1.1111
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5185 - loss: 1.0642 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5221 - loss: 1.0639
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5219 - loss: 1.0664
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5218 - loss: 1.0679
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5221 - loss: 1.0689 - val_accuracy: 0.5583 - val_loss: 1.0324
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4688 - loss: 1.1495
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5411 - loss: 1.0474 
[1m 76/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5313 - loss: 1.0551
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5286 - loss: 1.0575
[1m146/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5282 - loss: 1.0578
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5282 - loss: 1.0575 - val_accuracy: 0.5667 - val_loss: 1.0417
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4375 - loss: 1.1166
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5439 - loss: 1.0531 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5361 - loss: 1.0588
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5347 - loss: 1.0580
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5339 - loss: 1.0568
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5334 - loss: 1.0568 - val_accuracy: 0.5643 - val_loss: 1.0273
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 0.9901
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5539 - loss: 1.0054 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5495 - loss: 1.0155
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5469 - loss: 1.0214
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5451 - loss: 1.0244
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5443 - loss: 1.0261 - val_accuracy: 0.5758 - val_loss: 1.0335
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5312 - loss: 1.1298
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5362 - loss: 1.0498 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5341 - loss: 1.0469
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5356 - loss: 1.0438
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5368 - loss: 1.0419
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5374 - loss: 1.0405 - val_accuracy: 0.5787 - val_loss: 1.0453
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5938 - loss: 1.0523
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5536 - loss: 1.0068 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5523 - loss: 1.0078
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5509 - loss: 1.0094
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5506 - loss: 1.0089
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5504 - loss: 1.0090 - val_accuracy: 0.5618 - val_loss: 1.0391
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5625 - loss: 0.9945
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5492 - loss: 1.0178 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5495 - loss: 1.0141
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5497 - loss: 1.0110
[1m147/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5500 - loss: 1.0093
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5503 - loss: 1.0091 - val_accuracy: 0.5783 - val_loss: 1.0398
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.4688 - loss: 0.9911
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5629 - loss: 0.9774 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5599 - loss: 0.9852
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5609 - loss: 0.9851
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5612 - loss: 0.9861
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5612 - loss: 0.9866 - val_accuracy: 0.5681 - val_loss: 1.0448

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 22: 52.82 [%]
F1-score captured in run 22: 52.73 [%]

=== RUN 23 ===

--- TRAIN (run 23) ---

--- TEST (run 23) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
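The label mapping printed above is just an enumeration of the sorted unique class names; a minimal stdlib sketch of how such a mapping can be derived (variable names here are illustrative, not taken from the project code):

```python
# Sorted unique class names, as printed in the log above
classes = sorted({"SEDENTARY", "LIGHT-INTENSITY",
                  "VIGOROUS-INTENSITY", "MODERATE-INTENSITY"})

# Map each class name to its integer label
label_map = {name: idx for idx, name in enumerate(classes)}
print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1,
#  'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```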
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
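The parameter counts in the summary above can be reproduced by hand. Assuming an input window of shape (6, 250) (matching the array shapes printed later in the log) and Conv1D layers with kernel size 5 and 'same' padding — an inference from the counts, not something the log states — the totals work out as follows:

```python
def conv1d_params(kernel_size, in_channels, filters):
    # kernel weights plus one bias per filter
    return kernel_size * in_channels * filters + filters

def layer_norm_params(channels):
    # one gamma and one beta per channel
    return 2 * channels

def dense_params(in_features, out_features):
    # weight matrix plus one bias per output unit
    return in_features * out_features + out_features

total = (conv1d_params(5, 250, 128)    # conv1d:   160,128
         + layer_norm_params(128)      #               256
         + conv1d_params(5, 128, 128)  # conv1d_1:  82,048
         + layer_norm_params(128)      #               256
         + conv1d_params(5, 128, 128)  # conv1d_2:  82,048
         + layer_norm_params(128)      #               256
         + dense_params(128, 4))       # dense:        516
print(total)  # 325508, matching "Total params" above
```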
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 912us/step
Global accuracy score (validation) = 56.25 [%]
Global F1 score (validation) = 55.31 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.2536946  0.20072196 0.49792776 0.04765569]
 [0.0973563  0.06677812 0.8239825  0.01188308]
 [0.25612286 0.20252633 0.49361667 0.04773414]
 ...
 [0.19120188 0.12644713 0.20882806 0.4735229 ]
 [0.37162027 0.24830821 0.32502973 0.05504177]
 [0.43270513 0.48578662 0.04362817 0.03788007]]
(1545, 4)
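The two arrays above are the integer class labels (shape (1545, 1)) and the per-class probabilities (shape (1545, 4)); the predicted class for each window is the argmax over the four probabilities. A stdlib-only sketch on two of the printed rows:

```python
# Two probability rows copied from the log output above
probs = [
    [0.2536946, 0.20072196, 0.49792776, 0.04765569],
    [0.19120188, 0.12644713, 0.20882806, 0.4735229],
]

def argmax(row):
    # index of the largest probability in one row
    return max(range(len(row)), key=row.__getitem__)

preds = [argmax(row) for row in probs]
print(preds)  # [2, 3] -> SEDENTARY, VIGOROUS-INTENSITY
```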
-------------------------------------------------

Global accuracy score (train) = 61.02 [%]
Global accuracy score (test) = 54.76 [%]
Global F1 score (train) = 60.78 [%]
Global F1 score (test) = 54.59 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.42      0.28      0.34       400
MODERATE-INTENSITY       0.45      0.65      0.53       400
         SEDENTARY       0.64      0.67      0.65       400
VIGOROUS-INTENSITY       0.75      0.59      0.66       345

          accuracy                           0.55      1545
         macro avg       0.56      0.55      0.55      1545
      weighted avg       0.56      0.55      0.54      1545
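The per-class precision, recall, and f1-score in the report above follow the standard definitions from raw prediction counts; a minimal stdlib sketch of the formulas (toy data for illustration, not the run's actual predictions):

```python
def prf(y_true, y_pred, label):
    # true positives, false positives, false negatives for one class
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy 4-class example with integer labels 0..3
y_true = [0, 0, 1, 2, 2, 3]
y_pred = [0, 1, 1, 2, 0, 3]
print(prf(y_true, y_pred, 0))  # (0.5, 0.5, 0.5)
```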

2025-11-04 13:34:16.682166: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:34:16.696911: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259656.715929 1644614 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259656.719835 1644614 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259656.731288 1644614 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:34:16.734336: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259659.127218 1644614 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
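The oneDNN notice and the registration warnings in the startup banner above come from TensorFlow's C++ layer; they can be reduced by setting, before TensorFlow is imported, the environment variable the log itself mentions plus TensorFlow's standard C++ log filter `TF_CPP_MIN_LOG_LEVEL`:

```python
import os

# These must be set before `import tensorflow` to take effect
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "0"  # disable oneDNN custom ops
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"   # hide INFO and WARNING C++ logs

# import tensorflow as tf  # import only after the variables are set
```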
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259660.814573 1644757 service.cc:152] XLA service 0x78329c00b610 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259660.814604 1644757 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:34:20.857839: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259661.027658 1644757 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259663.807543 1644757 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 30ms/step - accuracy: 0.2752 - loss: 1.9986 - val_accuracy: 0.4663 - val_loss: 1.3438
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3123 - loss: 1.7148 - val_accuracy: 0.4702 - val_loss: 1.2416
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3237 - loss: 1.5454 - val_accuracy: 0.5147 - val_loss: 1.1879
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3544 - loss: 1.4052 - val_accuracy: 0.5211 - val_loss: 1.1821
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3806 - loss: 1.3420 - val_accuracy: 0.5351 - val_loss: 1.1594
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4186 - loss: 1.2710 - val_accuracy: 0.5348 - val_loss: 1.1316
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4439 - loss: 1.2255 - val_accuracy: 0.5372 - val_loss: 1.0981
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4704 - loss: 1.1731 - val_accuracy: 0.5383 - val_loss: 1.1015
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4715 - loss: 1.1561 - val_accuracy: 0.5435 - val_loss: 1.0843
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4963 - loss: 1.1261 - val_accuracy: 0.5590 - val_loss: 1.0739
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4973 - loss: 1.1191 - val_accuracy: 0.5537 - val_loss: 1.0656
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5148 - loss: 1.0779 - val_accuracy: 0.5699 - val_loss: 1.0644
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5213 - loss: 1.0735 - val_accuracy: 0.5702 - val_loss: 1.0668
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5212 - loss: 1.0552 - val_accuracy: 0.5569 - val_loss: 1.0567
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5276 - loss: 1.0491 - val_accuracy: 0.5751 - val_loss: 1.0609
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5381 - loss: 1.0429 - val_accuracy: 0.5794 - val_loss: 1.0630
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5528 - loss: 1.0183 - val_accuracy: 0.5428 - val_loss: 1.0816
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5551 - loss: 1.0036 - val_accuracy: 0.5625 - val_loss: 1.0713
Epoch 19/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5411 - loss: 1.0110 - val_accuracy: 0.5629 - val_loss: 1.0682

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 23: 54.76 [%]
F1-score captured in run 23: 54.59 [%]

=== RUN 24 ===

--- TRAIN (run 24) ---

--- TEST (run 24) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
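The mapping above assigns indices in alphabetical order of the class names. A minimal sketch of how such a mapping is typically derived from the label column (variable names here are illustrative, not the project's actual code):

```python
# Build an index for each distinct class name, sorted alphabetically —
# this reproduces the label mapping printed in the log above.
labels = ['SEDENTARY', 'LIGHT-INTENSITY', 'VIGOROUS-INTENSITY', 'MODERATE-INTENSITY']
label_map = {name: idx for idx, name in enumerate(sorted(set(labels)))}
print(label_map)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```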
This activity can't be balanced (in a downsampling way)  [message repeated 77 times]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
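The parameter counts in the summary are consistent with Conv1D layers of 128 filters and kernel size 5 over 250 input channels (the kernel size is inferred from the counts; it is not stated in the log). A quick arithmetic check:

```python
def conv1d_params(in_ch, filters, kernel):
    # weights: kernel * in_channels * filters, plus one bias per filter
    return kernel * in_ch * filters + filters

def layernorm_params(features):
    # one gain and one bias per normalized feature
    return 2 * features

total = (conv1d_params(250, 128, 5)      # conv1d:              160,128
         + layernorm_params(128)         # layer_normalization:     256
         + conv1d_params(128, 128, 5)    # conv1d_1:             82,048
         + layernorm_params(128)         # layer_normalization_1:   256
         + conv1d_params(128, 128, 5)    # conv1d_2:             82,048
         + layernorm_params(128)         # layer_normalization_2:   256
         + 128 * 4 + 4)                  # dense (128 -> 4):        516
print(total)  # 325508 — matches "Total params: 325,508"
```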
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 933us/step
Global accuracy score (validation) = 56.5 [%]
Global F1 score (validation) = 55.11 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.2410569  0.18887077 0.5194247  0.05064768]
 [0.08073412 0.05196065 0.85827607 0.00902918]
 [0.3745955  0.4990768  0.06015413 0.06617358]
 ...
 [0.1428496  0.10289185 0.15769999 0.5965586 ]
 [0.33386266 0.336985   0.08981259 0.2393397 ]
 [0.43417713 0.45239127 0.06690487 0.04652679]]
(1545, 4)
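The (1545, 4) matrix holds one softmax row per validation window; the predicted class is the argmax of each row. A pure-Python sketch using the first probability row printed above, whose true label (from the column printed earlier) is 2, i.e. SEDENTARY:

```python
# First probability row from the log output above.
probs = [0.2410569, 0.18887077, 0.5194247, 0.05064768]

# argmax: index of the largest probability is the predicted class
pred = max(range(len(probs)), key=lambda i: probs[i])
print(pred)  # 2 — agrees with the true label for that window
```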
-------------------------------------------------

Global accuracy score (train) = 60.99 [%]
Global accuracy score (test) = 53.33 [%]
Global F1 score (train) = 60.49 [%]
Global F1 score (test) = 53.41 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.38      0.30      0.33       400
MODERATE-INTENSITY       0.45      0.58      0.50       400
         SEDENTARY       0.65      0.66      0.66       400
VIGOROUS-INTENSITY       0.68      0.60      0.64       345

          accuracy                           0.53      1545
         macro avg       0.54      0.54      0.53      1545
      weighted avg       0.54      0.53      0.53      1545
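The macro and weighted averages in the report follow directly from the per-class rows (values are rounded to two decimals in the report, so small discrepancies are expected):

```python
# Per-class F1 and support, copied from the classification report above.
f1 = {'LIGHT-INTENSITY': 0.33, 'MODERATE-INTENSITY': 0.50,
      'SEDENTARY': 0.66, 'VIGOROUS-INTENSITY': 0.64}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}

# macro avg: unweighted mean over classes
macro = sum(f1.values()) / len(f1)

# weighted avg: mean weighted by each class's support
n = sum(support.values())
weighted = sum(f1[c] * support[c] / n for c in f1)

print(round(macro, 2), round(weighted, 2))  # 0.53 0.53
```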

2025-11-04 13:34:46.173100: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:34:46.184733: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259686.197853 1647378 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259686.201909 1647378 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259686.211644 1647378 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:34:46.214890: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259688.566931 1647378 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
[previous message repeated 77 times in total]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
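The per-layer parameter counts in the summary above can be cross-checked by hand. A sketch, assuming Conv1D kernel_size=5 and 250 input channels (values inferred from the counts themselves; they are not stated in the log):

```python
def conv1d_params(kernel, in_ch, filters):
    # Conv1D: kernel * in_channels weights per filter, plus one bias per filter.
    return kernel * in_ch * filters + filters

def layernorm_params(features):
    # LayerNormalization: one gamma and one beta per feature.
    return 2 * features

conv1 = conv1d_params(5, 250, 128)      # first Conv1D on the raw input
conv_rest = conv1d_params(5, 128, 128)  # subsequent Conv1D layers
ln = layernorm_params(128)              # each LayerNormalization
dense = 128 * 4 + 4                     # final Dense over 4 classes

total = conv1 + 2 * conv_rest + 3 * ln + dense
print(conv1, conv_rest, ln, dense, total)  # → 160128 82048 256 516 325508
```

The total matches the 325,508 trainable parameters reported by Keras.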
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259690.225333 1647509 service.cc:152] XLA service 0x70e33c01d8e0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259690.225372 1647509 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:34:50.284748: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259690.452501 1647509 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259693.105994 1647509 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.
164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 30ms/step - accuracy: 0.2765 - loss: 1.9402 - val_accuracy: 0.4136 - val_loss: 1.2796
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3141 - loss: 1.6562 - val_accuracy: 0.4737 - val_loss: 1.2132
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3098 - loss: 1.5121 - val_accuracy: 0.4663 - val_loss: 1.2099
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3578 - loss: 1.3860 - val_accuracy: 0.4944 - val_loss: 1.1850
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4000 - loss: 1.3010 - val_accuracy: 0.5179 - val_loss: 1.1490
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4172 - loss: 1.2576 - val_accuracy: 0.5320 - val_loss: 1.1147
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4461 - loss: 1.2132 - val_accuracy: 0.5527 - val_loss: 1.1087
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4701 - loss: 1.1737 - val_accuracy: 0.5509 - val_loss: 1.0691
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4838 - loss: 1.1262 - val_accuracy: 0.5583 - val_loss: 1.0715
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4946 - loss: 1.1137 - val_accuracy: 0.5702 - val_loss: 1.0695
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5040 - loss: 1.0967 - val_accuracy: 0.5607 - val_loss: 1.0566
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5183 - loss: 1.0776 - val_accuracy: 0.5650 - val_loss: 1.0595
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5121 - loss: 1.0645 - val_accuracy: 0.5621 - val_loss: 1.0497
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5433 - loss: 1.0426 - val_accuracy: 0.5614 - val_loss: 1.0823
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5338 - loss: 1.0233 - val_accuracy: 0.5681 - val_loss: 1.0663
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5467 - loss: 1.0234 - val_accuracy: 0.5534 - val_loss: 1.0926
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5469 - loss: 1.0121 - val_accuracy: 0.5685 - val_loss: 1.0513
Epoch 18/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5557 - loss: 0.9994 - val_accuracy: 0.5551 - val_loss: 1.0800
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.
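Training above was configured for 91 epochs but halted after epoch 18, which is consistent with patience-based early stopping on the validation loss (the best val_loss, 1.0497, occurs at epoch 13, followed by 5 non-improving epochs). A minimal sketch of that stopping rule, with the patience value assumed:

```python
def early_stop_epoch(val_losses, patience=5):
    """Return the 1-based epoch at which patience-based early stopping fires."""
    best, wait = float("inf"), 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, wait = loss, 0  # new best: reset the patience counter
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(val_losses)  # ran to completion without triggering

# Validation losses from the 18 epochs logged above.
val_losses = [1.2796, 1.2132, 1.2099, 1.1850, 1.1490, 1.1147, 1.1087, 1.0691,
              1.0715, 1.0695, 1.0566, 1.0595, 1.0497, 1.0823, 1.0663, 1.0926,
              1.0513, 1.0800]
print(early_stop_epoch(val_losses, patience=5))  # → 18
```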

Accuracy captured in run 24: 53.33 [%]
F1-score captured in run 24: 53.41 [%]
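The per-run accuracy and F1-score above are presumably computed from the test-set predictions; sklearn's `f1_score(..., average='macro')` would be the usual choice. A self-contained macro-F1 sketch in pure Python, on hypothetical labels:

```python
def macro_f1(y_true, y_pred):
    """Unweighted mean of per-class F1 scores (macro average)."""
    f1s = []
    for c in sorted(set(y_true) | set(y_pred)):
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        denom = 2 * tp + fp + fn
        f1s.append(2 * tp / denom if denom else 0.0)
    return sum(f1s) / len(f1s)

# Hypothetical class IDs, NOT taken from the run above.
y_true = [2, 2, 0, 1, 3, 2]
y_pred = [2, 0, 0, 1, 3, 2]
print(round(macro_f1(y_true, y_pred), 4))  # → 0.8667
```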

=== RUN 25 ===

--- TRAIN (run 25) ---

--- TEST (run 25) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
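The label mapping above is the standard sorted-class integer encoding (the order matches what `np.unique` would return). A minimal sketch, with the example window labels assumed:

```python
# Hypothetical per-window activity labels (repeats included).
labels = ['SEDENTARY', 'LIGHT-INTENSITY', 'VIGOROUS-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']

# sorted(set(...)) mirrors np.unique: unique values in sorted order.
classes = sorted(set(labels))
label_to_id = {c: i for i, c in enumerate(classes)}
print(label_to_id)
```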
This activity can't be balanced (in a downsampling way)
[previous message repeated 77 times in total]
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)
328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step
49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step
89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 978us/step
Global accuracy score (validation) = 55.58 [%]
Global F1 score (validation) = 52.65 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.23382035 0.1705302  0.56183136 0.03381814]
 [0.3855168  0.48046696 0.08263082 0.05138543]
 [0.07749201 0.0471945  0.8686456  0.00666783]
 ...
 [0.22183388 0.15093258 0.24892978 0.3783038 ]
 [0.11686472 0.11249855 0.03271331 0.7379234 ]
 [0.38165665 0.49891067 0.08465342 0.0347793 ]]
(1545, 4)
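The (1545, 4) array holds per-class softmax probabilities; the (1545, 1) label column above is presumably the argmax over those four classes. A quick check on the first row, copied from the log:

```python
# first row of the (1545, 4) probability matrix, copied from the log
row = [0.23382035, 0.1705302, 0.56183136, 0.03381814]
pred = max(range(len(row)), key=row.__getitem__)  # argmax over classes
print(pred)  # 2 — matches the first entry of the label column above
```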
-------------------------------------------------

Global accuracy score (train) = 60.13 [%]
Global accuracy score (test) = 52.36 [%]
Global F1 score (train) = 57.98 [%]
Global F1 score (test) = 50.60 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.40      0.17      0.23       400
MODERATE-INTENSITY       0.43      0.66      0.52       400
         SEDENTARY       0.60      0.68      0.64       400
VIGOROUS-INTENSITY       0.67      0.60      0.64       345

          accuracy                           0.52      1545
         macro avg       0.52      0.53      0.51      1545
      weighted avg       0.52      0.52      0.50      1545
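The table above is scikit-learn's `classification_report` format; its per-class figures follow from the confusion counts. A stdlib sketch of precision/recall/F1 for a single class, shown on hypothetical toy labels (not the run's data):

```python
def prf(y_true, y_pred, label):
    # confusion counts for one class treated as the positive label
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# toy example: class 1 has tp=2, fp=1, fn=1
y_true = [0, 0, 1, 1, 1]
y_pred = [0, 1, 1, 1, 0]
p, r, f = prf(y_true, y_pred, 1)
print(p, r, f)  # all 2/3 for this toy case
```

The macro average is the unweighted mean of these per-class scores; the weighted average weights each class by its support.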

2025-11-04 13:35:15.189881: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:35:15.201173: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259715.214196 1650047 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259715.218306 1650047 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259715.228034 1650047 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:35:15.231178: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259717.581829 1650047 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(previous message repeated 77 times)
Model: "sequential" (summary identical to the first run above: 325,508 total params, all trainable)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259719.272068 1650144 service.cc:152] XLA service 0x76472c00c5c0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259719.272123 1650144 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:35:19.315512: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259719.477590 1650144 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259722.156100 1650144 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m10:29[0m 4s/step - accuracy: 0.3125 - loss: 1.9326
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2650 - loss: 2.0584  
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2661 - loss: 2.0398
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2691 - loss: 2.0278
[1m133/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2722 - loss: 2.0137
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 19ms/step - accuracy: 0.2748 - loss: 1.9982
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m9s[0m 31ms/step - accuracy: 0.2749 - loss: 1.9977 - val_accuracy: 0.4723 - val_loss: 1.2383
Epoch 2/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 24ms/step - accuracy: 0.2656 - loss: 1.9661
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2913 - loss: 1.8203 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.2948 - loss: 1.7991
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2958 - loss: 1.7798
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.2965 - loss: 1.7646
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.2969 - loss: 1.7541 - val_accuracy: 0.4775 - val_loss: 1.2045
Epoch 3/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.2812 - loss: 1.6654
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3166 - loss: 1.5753 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3208 - loss: 1.5587
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3215 - loss: 1.5492
[1m135/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3216 - loss: 1.5404
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3220 - loss: 1.5337 - val_accuracy: 0.4975 - val_loss: 1.1875
Epoch 4/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.3438 - loss: 1.4760
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3415 - loss: 1.4266 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3450 - loss: 1.4191
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3464 - loss: 1.4149
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.3479 - loss: 1.4101
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3495 - loss: 1.4058 - val_accuracy: 0.5112 - val_loss: 1.1765
Epoch 5/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.3438 - loss: 1.2793
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3858 - loss: 1.3198 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3890 - loss: 1.3240
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3909 - loss: 1.3232
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.3926 - loss: 1.3213
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.3933 - loss: 1.3201 - val_accuracy: 0.5176 - val_loss: 1.1453
Epoch 6/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.3906 - loss: 1.3481
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4288 - loss: 1.2636 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4220 - loss: 1.2685
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4182 - loss: 1.2712
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4179 - loss: 1.2698
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4180 - loss: 1.2687 - val_accuracy: 0.5404 - val_loss: 1.1067
Epoch 7/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 26ms/step - accuracy: 0.5156 - loss: 1.2092
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4610 - loss: 1.2091 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4622 - loss: 1.2089
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4619 - loss: 1.2069
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4602 - loss: 1.2066
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4595 - loss: 1.2061 - val_accuracy: 0.5523 - val_loss: 1.0857
Epoch 8/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m4s[0m 25ms/step - accuracy: 0.4062 - loss: 1.1994
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4655 - loss: 1.1911 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4659 - loss: 1.1867
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4654 - loss: 1.1829
[1m137/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4662 - loss: 1.1799
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4669 - loss: 1.1777 - val_accuracy: 0.5593 - val_loss: 1.0830
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4062 - loss: 1.3470
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4649 - loss: 1.1747 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4725 - loss: 1.1651
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4762 - loss: 1.1603
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4778 - loss: 1.1569
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4782 - loss: 1.1557 - val_accuracy: 0.5657 - val_loss: 1.0599
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.4688 - loss: 1.0842
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4732 - loss: 1.1443 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4821 - loss: 1.1360
[1m100/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4849 - loss: 1.1326
[1m136/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4876 - loss: 1.1300
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4889 - loss: 1.1285 - val_accuracy: 0.5667 - val_loss: 1.0480
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.0495
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5258 - loss: 1.0503 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5184 - loss: 1.0642
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5138 - loss: 1.0726
[1m145/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5109 - loss: 1.0779
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5099 - loss: 1.0801 - val_accuracy: 0.5621 - val_loss: 1.0519
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4531 - loss: 1.3144
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5091 - loss: 1.0959 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5056 - loss: 1.0941
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5051 - loss: 1.0931
[1m132/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5066 - loss: 1.0907
[1m161/164[0m [32m━━━━━━━━━━━━━━━━━━━[0m[37m━[0m [1m0s[0m 2ms/step - accuracy: 0.5076 - loss: 1.0897
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5078 - loss: 1.0896 - val_accuracy: 0.5695 - val_loss: 1.0514
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5625 - loss: 1.0492
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5124 - loss: 1.0771 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5162 - loss: 1.0770
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5192 - loss: 1.0761
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5198 - loss: 1.0766
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5198 - loss: 1.0769 - val_accuracy: 0.5667 - val_loss: 1.0441
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4062 - loss: 1.1940
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5155 - loss: 1.0680 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5251 - loss: 1.0582
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5293 - loss: 1.0542
[1m149/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5303 - loss: 1.0536
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5306 - loss: 1.0536 - val_accuracy: 0.5621 - val_loss: 1.0344
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5000 - loss: 1.0773
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5463 - loss: 1.0277 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5375 - loss: 1.0355
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5336 - loss: 1.0406
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5324 - loss: 1.0435
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5319 - loss: 1.0448 - val_accuracy: 0.5667 - val_loss: 1.0392
Epoch 16/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5273 - loss: 1.0398 - val_accuracy: 0.5758 - val_loss: 1.0424
Epoch 17/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5439 - loss: 1.0287 - val_accuracy: 0.5741 - val_loss: 1.0441
Epoch 18/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5475 - loss: 1.0169 - val_accuracy: 0.5632 - val_loss: 1.0528
Epoch 19/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5557 - loss: 1.0128 - val_accuracy: 0.5583 - val_loss: 1.0548

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 25: 52.36 [%]
F1-score captured in run 25: 50.6 [%]

=== RUN 26 ===

--- TRAIN (run 26) ---

--- TEST (run 26) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
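The mapping printed above assigns indices in sorted label order. The script that produced this log is not shown, but a minimal sketch consistent with that output (the variable names here are hypothetical):

```python
import numpy as np

# Hypothetical raw activity labels; np.unique returns them sorted.
y_raw = np.array(['SEDENTARY', 'LIGHT-INTENSITY',
                  'VIGOROUS-INTENSITY', 'MODERATE-INTENSITY'])
classes = np.unique(y_raw)

# Build the label -> integer-index mapping shown in the log.
mapping = {label: idx for idx, label in enumerate(classes)}
print(mapping)
```

Sorting makes the mapping deterministic across runs, which matters when the model file saved in one run is reloaded in another.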
This activity can't be balanced (in a downsampling way)
(message repeated 77 times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
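The parameter counts in the summary above can be verified by hand. A small sketch, assuming a kernel size of 5 (inferred from the counts, not stated in the log) and input windows of shape (6, 250):

```python
# Parameter-count formulas for the layers in the summary table.
def conv1d_params(kernel, in_ch, filters):
    # One weight per (kernel position, input channel, filter) plus a bias per filter.
    return kernel * in_ch * filters + filters

def layernorm_params(features):
    # One gamma and one beta per normalized feature.
    return 2 * features

def dense_params(in_feat, out_feat):
    return in_feat * out_feat + out_feat

total = (conv1d_params(5, 250, 128)    # conv1d:   160,128
         + layernorm_params(128)       # layer_normalization: 256
         + conv1d_params(5, 128, 128)  # conv1d_1:  82,048
         + layernorm_params(128)
         + conv1d_params(5, 128, 128)  # conv1d_2:  82,048
         + layernorm_params(128)
         + dense_params(128, 4))       # dense:        516
print(total)  # 325508, matching "Total params" above
```

Dropout and GlobalAveragePooling1D contribute no parameters, so they are omitted from the sum.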
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 1ms/step
Global accuracy score (validation) = 55.34 [%]
Global F1 score (validation) = 52.4 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.22155184 0.17321596 0.56230664 0.04292561]
 [0.1243747  0.0834453  0.76573986 0.02644013]
 [0.135616   0.09176167 0.7512808  0.02134161]
 ...
 [0.16048235 0.10208683 0.526233   0.21119788]
 [0.11418818 0.10901129 0.03434425 0.74245626]
 [0.39613432 0.49928737 0.03830946 0.06626887]]
(1545, 4)
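The (1545, 4) matrix above holds per-class probabilities, and the (1545, 1) column before it holds the true class indices; predictions come from an argmax over the class axis. A toy numpy sketch (illustrative rows, not the actual data):

```python
import numpy as np

# Toy probability rows shaped like the (n, 4) softmax output above.
probs = np.array([[0.22, 0.17, 0.56, 0.04],
                  [0.12, 0.08, 0.77, 0.03],
                  [0.40, 0.50, 0.04, 0.07]])
y_true = np.array([2, 2, 0])

# Most probable class per window, then plain accuracy.
y_pred = probs.argmax(axis=1)
accuracy = (y_pred == y_true).mean()
print(y_pred, accuracy)  # [2 2 1] 0.666...
```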
-------------------------------------------------

Global accuracy score (train) = 59.54 [%]
Global accuracy score (test) = 53.33 [%]
Global F1 score (train) = 57.07 [%]
Global F1 score (test) = 51.35 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.38      0.15      0.21       400
MODERATE-INTENSITY       0.42      0.70      0.53       400
         SEDENTARY       0.63      0.71      0.67       400
VIGOROUS-INTENSITY       0.72      0.59      0.65       345

          accuracy                           0.53      1545
         macro avg       0.54      0.54      0.51      1545
      weighted avg       0.53      0.53      0.51      1545
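In the report above, "macro avg" is the unweighted mean of the per-class scores, while "weighted avg" weights each class by its support. Recomputing from the rounded per-class f1 values (so agreement with the printed averages holds only to rounding):

```python
# Per-class f1-scores and supports copied from the classification report above.
f1      = [0.21, 0.53, 0.67, 0.65]
support = [400, 400, 400, 345]

macro    = sum(f1) / len(f1)                                       # unweighted mean
weighted = sum(f * s for f, s in zip(f1, support)) / sum(support)  # support-weighted
print(macro, weighted)  # ~0.515 and ~0.510
```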

2025-11-04 13:35:44.700431: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:35:44.711785: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259744.724882 1652794 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259744.729011 1652794 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259744.738759 1652794 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
(message repeated 4 times)
2025-11-04 13:35:44.741952: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259747.064433 1652794 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(message repeated 77 times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259748.727748 1652911 service.cc:152] XLA service 0x746d9000c5a0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259748.727802 1652911 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:35:48.776551: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259748.940772 1652911 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259751.616086 1652911 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 30ms/step - accuracy: 0.2746 - loss: 1.9995 - val_accuracy: 0.4603 - val_loss: 1.2260
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3146 - loss: 1.6886 - val_accuracy: 0.4965 - val_loss: 1.1925
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3315 - loss: 1.4958 - val_accuracy: 0.5091 - val_loss: 1.1817
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3549 - loss: 1.3828 - val_accuracy: 0.5190 - val_loss: 1.1779
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3972 - loss: 1.2923 - val_accuracy: 0.5267 - val_loss: 1.1482
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4252 - loss: 1.2500 - val_accuracy: 0.5376 - val_loss: 1.1214
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4346 - loss: 1.2151 - val_accuracy: 0.5411 - val_loss: 1.0916
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4699 - loss: 1.1662 - val_accuracy: 0.5544 - val_loss: 1.0802
Epoch 9/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4844 - loss: 1.1557
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4779 - loss: 1.1406 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4826 - loss: 1.1383
[1m106/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4834 - loss: 1.1376
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4833 - loss: 1.1364
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4838 - loss: 1.1353 - val_accuracy: 0.5614 - val_loss: 1.0715
Epoch 10/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.5938 - loss: 0.9248
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4972 - loss: 1.1199 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.4966 - loss: 1.1214
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4976 - loss: 1.1188
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4988 - loss: 1.1169
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4989 - loss: 1.1159 - val_accuracy: 0.5439 - val_loss: 1.0668
Epoch 11/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4375 - loss: 1.1402
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4916 - loss: 1.1137 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4938 - loss: 1.1081
[1m103/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4950 - loss: 1.1063
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.4963 - loss: 1.1041
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.4976 - loss: 1.1025 - val_accuracy: 0.5650 - val_loss: 1.0644
Epoch 12/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5625 - loss: 0.9631
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5175 - loss: 1.0906 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5141 - loss: 1.0912
[1m104/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5131 - loss: 1.0919
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5123 - loss: 1.0920
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5118 - loss: 1.0913 - val_accuracy: 0.5667 - val_loss: 1.0557
Epoch 13/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.4844 - loss: 1.1040
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5179 - loss: 1.0659 
[1m 70/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5151 - loss: 1.0673
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5161 - loss: 1.0679
[1m139/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5167 - loss: 1.0684
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5173 - loss: 1.0681 - val_accuracy: 0.5516 - val_loss: 1.0637
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5469 - loss: 1.0005
[1m 33/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5547 - loss: 1.0252 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5485 - loss: 1.0346
[1m102/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5454 - loss: 1.0378
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5425 - loss: 1.0398
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5405 - loss: 1.0416 - val_accuracy: 0.5737 - val_loss: 1.0620
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5156 - loss: 1.1048
[1m 37/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5269 - loss: 1.0433 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5310 - loss: 1.0423
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5332 - loss: 1.0412
[1m140/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5344 - loss: 1.0395
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5347 - loss: 1.0385 - val_accuracy: 0.5555 - val_loss: 1.0834
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.6094 - loss: 0.9635
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5637 - loss: 1.0026 
[1m 68/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5503 - loss: 1.0104
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5487 - loss: 1.0096
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5474 - loss: 1.0110
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5464 - loss: 1.0123 - val_accuracy: 0.5685 - val_loss: 1.0641
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 20ms/step - accuracy: 0.5938 - loss: 1.0881
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5619 - loss: 1.0037 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5522 - loss: 1.0064
[1m108/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5486 - loss: 1.0106
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5480 - loss: 1.0118
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5482 - loss: 1.0116 - val_accuracy: 0.5692 - val_loss: 1.0459
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5000 - loss: 1.0991
[1m 32/164[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5666 - loss: 0.9929 
[1m 66/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5621 - loss: 0.9969
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5602 - loss: 0.9986
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5586 - loss: 0.9994
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5578 - loss: 0.9998 - val_accuracy: 0.5636 - val_loss: 1.0482
Epoch 19/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6875 - loss: 0.8252
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5693 - loss: 0.9817 
[1m 67/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5592 - loss: 0.9941
[1m 97/164[0m [32m━━━━━━━━━━━[0m[37m━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5567 - loss: 0.9968
[1m131/164[0m [32m━━━━━━━━━━━━━━━[0m[37m━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5564 - loss: 0.9962
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5568 - loss: 0.9955 - val_accuracy: 0.5439 - val_loss: 1.0579
Epoch 20/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 1.0754
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5715 - loss: 1.0006 
[1m 75/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5702 - loss: 0.9904
[1m111/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5691 - loss: 0.9865
[1m147/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5683 - loss: 0.9844
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5678 - loss: 0.9838 - val_accuracy: 0.5600 - val_loss: 1.0642
Epoch 21/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5469 - loss: 0.9315
[1m 38/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5768 - loss: 0.9524 
[1m 74/164[0m [32m━━━━━━━━━[0m[37m━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5721 - loss: 0.9604
[1m112/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5712 - loss: 0.9640
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5703 - loss: 0.9668
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5698 - loss: 0.9679 - val_accuracy: 0.5657 - val_loss: 1.0653
Epoch 22/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.6719 - loss: 0.8976
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5809 - loss: 0.9615 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5800 - loss: 0.9571
[1m107/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5793 - loss: 0.9565
[1m144/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5788 - loss: 0.9579
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5786 - loss: 0.9584 - val_accuracy: 0.5629 - val_loss: 1.0647

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 17ms/step
Saved model to disk.

Accuracy captured in run 26: 53.33 [%]
F1-score captured in run 26: 51.35 [%]

=== RUN 27 ===

--- TRAIN (run 27) ---

--- TEST (run 27) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
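The class list, count, and mapping above follow from sorting the unique labels and enumerating them; a minimal numpy sketch (the `labels` array here is illustrative, not the project's loader):

```python
import numpy as np

# Illustrative stand-in for the dataset's activity-label column.
labels = np.array(["SEDENTARY", "VIGOROUS-INTENSITY", "LIGHT-INTENSITY",
                   "MODERATE-INTENSITY", "SEDENTARY"])

classes = np.unique(labels)                    # sorted unique class names
label_map = {c: i for i, c in enumerate(classes)}

print(classes)              # ['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
print(len(classes))         # 4
print("Label mapping:", label_map)
```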
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
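The repeated message comes from a per-activity balancing pass that skips activities with too few windows to downsample. A hedged sketch of that kind of balancing; the function name, the `target` parameter, and the threshold logic are assumptions, not the project's actual code:

```python
import numpy as np

def downsample_balance(X, y, target, rng=None):
    """Keep at most `target` windows per class, chosen at random.
    Classes with fewer than `target` windows are left untouched
    (they "can't be balanced in a downsampling way")."""
    rng = np.random.default_rng(rng)
    keep = []
    for cls in np.unique(y):
        idx = np.flatnonzero(y == cls)
        if len(idx) < target:
            print("This activity can't be balanced (in a downsampling way)")
            keep.extend(idx)                       # too small: keep all windows
        else:
            keep.extend(rng.choice(idx, size=target, replace=False))
    keep = np.sort(np.array(keep))                 # preserve temporal order
    return X[keep], y[keep]

# Tiny demo: class 0 has 7 windows (downsampled to 5), class 1 only 3 (kept).
X = np.zeros((10, 6, 250))
y = np.array([0] * 7 + [1] * 3)
Xb, yb = downsample_balance(X, y, target=5, rng=0)
print(Xb.shape)  # (8, 6, 250)
```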
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
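The summary is consistent with three same-padded Conv1D blocks over (6, 250) windows: kernel_size=5 follows from the parameter counts (5·250·128+128 = 160,128 and 5·128·128+128 = 82,048), while the activations and dropout rates are assumptions the summary does not reveal. A sketch with the Keras Sequential API:

```python
import keras
from keras import layers

# Sketch of the summarized network. kernel_size=5 and padding="same" are
# inferred from the parameter counts; ReLU and dropout=0.2 are assumptions.
model = keras.Sequential([
    layers.Input(shape=(6, 250)),      # 6 timesteps x 250 channels per window
    layers.Conv1D(128, 5, padding="same", activation="relu"),
    layers.LayerNormalization(),       # gamma+beta over 128 features = 256 params
    layers.Dropout(0.2),
    layers.Conv1D(128, 5, padding="same", activation="relu"),
    layers.LayerNormalization(),
    layers.Dropout(0.2),
    layers.Conv1D(128, 5, padding="same", activation="relu"),
    layers.LayerNormalization(),
    layers.Dropout(0.2),
    layers.GlobalAveragePooling1D(),   # (None, 6, 128) -> (None, 128)
    layers.Dropout(0.2),
    layers.Dense(4, activation="softmax"),
])
assert model.count_params() == 325_508   # matches "Total params" above
```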
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 14ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 930us/step
Global accuracy score (validation) = 56.71 [%]
Global F1 score (validation) = 54.55 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.38118726 0.504769   0.07771236 0.03633136]
 [0.0820305  0.04399639 0.8646631  0.00931004]
 [0.26309174 0.1976876  0.4883138  0.05090686]
 ...
 [0.09949575 0.06191967 0.15548879 0.68309575]
 [0.18128219 0.10951547 0.22107567 0.48812667]
 [0.36894473 0.4530807  0.03585076 0.14212383]]
(1545, 4)
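The (1545, 4) matrix is the softmax output for the 1545 test windows, and the (1545, 1) column above it holds the corresponding class indices. Recovering class names is an argmax plus the inverted label mapping; a minimal numpy sketch using the first three probability rows printed above:

```python
import numpy as np

label_map = {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1,
             'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
inv_map = {v: k for k, v in label_map.items()}   # index -> class name

# First rows of the softmax output from the log.
proba = np.array([[0.38118726, 0.504769,   0.07771236, 0.03633136],
                  [0.0820305,  0.04399639, 0.8646631,  0.00931004],
                  [0.26309174, 0.1976876,  0.4883138,  0.05090686]])

pred_idx = proba.argmax(axis=1)                  # most probable class per window
pred_names = [inv_map[i] for i in pred_idx]
print(pred_names)  # ['MODERATE-INTENSITY', 'SEDENTARY', 'SEDENTARY']
```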
-------------------------------------------------

Global accuracy score (train) = 62.02 [%]
Global accuracy score (test) = 54.76 [%]
Global F1 score (train) = 60.57 [%]
Global F1 score (test) = 54.24 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.43      0.28      0.34       400
MODERATE-INTENSITY       0.45      0.61      0.52       400
         SEDENTARY       0.63      0.70      0.67       400
VIGOROUS-INTENSITY       0.69      0.61      0.65       345

          accuracy                           0.55      1545
         macro avg       0.55      0.55      0.54      1545
      weighted avg       0.55      0.55      0.54      1545
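In the report, the macro average is the unweighted mean of the per-class scores, while the weighted average weights each class by its support, which is why the two can differ. Checking the arithmetic with the f1 and support values printed above:

```python
# Per-class f1 and support, copied from the classification report above.
f1 = {'LIGHT-INTENSITY': 0.34, 'MODERATE-INTENSITY': 0.52,
      'SEDENTARY': 0.67, 'VIGOROUS-INTENSITY': 0.65}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}

macro = sum(f1.values()) / len(f1)                                    # plain mean
weighted = sum(f1[c] * support[c] for c in f1) / sum(support.values())
print(f"macro={macro:.4f}  weighted={weighted:.4f}")  # macro=0.5450  weighted=0.5413
```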

2025-11-04 13:36:15.219159: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:36:15.230741: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259775.244492 1655821 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259775.248779 1655821 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259775.259031 1655821 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:36:15.262314: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259777.614715 1655821 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
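The per-layer counts in the summary above are consistent with Conv1D layers of kernel size 5 over 250-channel windows (the log later prints inputs of shape (6, 250)). The kernel size is inferred from the counts, not shown in the log; a quick arithmetic sketch of that check:

```python
# Back-of-the-envelope check of the Keras parameter counts in the summary above.
# Conv1D params = kernel_size * in_channels * filters + filters (bias terms).
# kernel size k = 5 is inferred from the counts; it is not printed in the log.
filters, k = 128, 5
in_channels = 250            # per-window feature length, from the (6, 250) shape

conv1 = k * in_channels * filters + filters      # first Conv1D
conv2 = k * filters * filters + filters          # second/third Conv1D (same shape)
ln = 2 * filters                                 # LayerNormalization: gamma + beta
dense = filters * 4 + 4                          # Dense(4) head

total = conv1 + 2 * conv2 + 3 * ln + dense
print(conv1, conv2, ln, dense, total)  # 160128 82048 256 516 325508
```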
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259779.288798 1655931 service.cc:152] XLA service 0x7f6c6000b3e0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259779.288840 1655931 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:36:19.332058: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259779.503252 1655931 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259782.238190 1655931 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 - 9s 31ms/step - accuracy: 0.2735 - loss: 2.0020 - val_accuracy: 0.4589 - val_loss: 1.2312
Epoch 2/91
164/164 - 0s 2ms/step - accuracy: 0.3108 - loss: 1.7128 - val_accuracy: 0.4691 - val_loss: 1.2047
Epoch 3/91
164/164 - 0s 2ms/step - accuracy: 0.3329 - loss: 1.5030 - val_accuracy: 0.4796 - val_loss: 1.1732
Epoch 4/91
164/164 - 0s 2ms/step - accuracy: 0.3699 - loss: 1.3873 - val_accuracy: 0.5151 - val_loss: 1.1502
Epoch 5/91
164/164 - 0s 2ms/step - accuracy: 0.3816 - loss: 1.3247 - val_accuracy: 0.5562 - val_loss: 1.1314
Epoch 6/91
164/164 - 0s 2ms/step - accuracy: 0.4394 - loss: 1.2411 - val_accuracy: 0.5590 - val_loss: 1.0949
Epoch 7/91
164/164 - 0s 2ms/step - accuracy: 0.4550 - loss: 1.1913 - val_accuracy: 0.5506 - val_loss: 1.0666
Epoch 8/91
164/164 - 0s 2ms/step - accuracy: 0.4736 - loss: 1.1618 - val_accuracy: 0.5607 - val_loss: 1.0670
Epoch 9/91
164/164 - 0s 2ms/step - accuracy: 0.4876 - loss: 1.1334 - val_accuracy: 0.5727 - val_loss: 1.0641
Epoch 10/91
164/164 - 0s 2ms/step - accuracy: 0.5015 - loss: 1.1163 - val_accuracy: 0.5804 - val_loss: 1.0532
Epoch 11/91
164/164 - 0s 2ms/step - accuracy: 0.5105 - loss: 1.0954 - val_accuracy: 0.5688 - val_loss: 1.0514
Epoch 12/91
164/164 - 0s 2ms/step - accuracy: 0.5216 - loss: 1.0738 - val_accuracy: 0.5737 - val_loss: 1.0599
Epoch 13/91
164/164 - 0s 2ms/step - accuracy: 0.5287 - loss: 1.0682 - val_accuracy: 0.5685 - val_loss: 1.0410
Epoch 14/91
164/164 - 0s 2ms/step - accuracy: 0.5261 - loss: 1.0518 - val_accuracy: 0.5671 - val_loss: 1.0423
Epoch 15/91
164/164 - 0s 2ms/step - accuracy: 0.5407 - loss: 1.0263 - val_accuracy: 0.5488 - val_loss: 1.0703
Epoch 16/91
164/164 - 0s 2ms/step - accuracy: 0.5366 - loss: 1.0353 - val_accuracy: 0.5590 - val_loss: 1.0694
Epoch 17/91
164/164 - 0s 2ms/step - accuracy: 0.5533 - loss: 1.0079 - val_accuracy: 0.5569 - val_loss: 1.0551
Epoch 18/91
164/164 - 0s 2ms/step - accuracy: 0.5548 - loss: 1.0058 - val_accuracy: 0.5621 - val_loss: 1.0508
49/49 - 1s 16ms/step
Saved model to disk.

Accuracy captured in run 27: 54.76 [%]
F1-score captured in run 27: 54.24 [%]

=== RUN 28 ===

--- TRAIN (run 28) ---

--- TEST (run 28) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
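The printed label mapping is consistent with enumerating the sorted unique class names, as e.g. `numpy.unique` or scikit-learn's `LabelEncoder` would produce. A minimal sketch of that assumed construction (the actual pipeline code is not shown in this log):

```python
# Hypothetical reconstruction of the label mapping printed above:
# enumerate the alphabetically sorted unique class names.
labels = ['SEDENTARY', 'VIGOROUS-INTENSITY', 'LIGHT-INTENSITY',
          'MODERATE-INTENSITY', 'SEDENTARY']  # example window labels

classes = sorted(set(labels))                  # sorted unique class names
mapping = {name: i for i, name in enumerate(classes)}
encoded = [mapping[name] for name in labels]   # integer targets for training

print(mapping)
# {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
```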
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
Model: "sequential"
(summary identical to the one printed for run 27 above; total params: 325,508)
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

[1m  1/328[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m5:14[0m 961ms/step
[1m 52/328[0m [32m━━━[0m[37m━━━━━━━━━━━━━━━━━[0m [1m0s[0m 1000us/step 
[1m114/328[0m [32m━━━━━━[0m[37m━━━━━━━━━━━━━━[0m [1m0s[0m 901us/step 
[1m174/328[0m [32m━━━━━━━━━━[0m[37m━━━━━━━━━━[0m [1m0s[0m 880us/step
[1m241/328[0m [32m━━━━━━━━━━━━━━[0m[37m━━━━━━[0m [1m0s[0m 844us/step
[1m306/328[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 829us/step
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step  
[1m328/328[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m2s[0m 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 891us/step
Global accuracy score (validation) = 56.21 [%]
Global F1 score (validation) = 54.04 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.11264249 0.06690761 0.8070322  0.01341768]
 [0.08802801 0.05011584 0.8514054  0.01045072]
 [0.11277516 0.06697838 0.80678576 0.01346075]
 ...
 [0.16528308 0.14554755 0.08737995 0.6017895 ]
 [0.33802396 0.33780035 0.1042861  0.21988958]
 [0.36943632 0.50908095 0.04047446 0.08100836]]
(1545, 4)
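The (1545, 1) integer array above is consistent with the class indexing used for the (1545, 4) softmax outputs: in the first printed row the largest probability sits at index 2. A minimal sketch of that argmax step (assuming columns follow the class-index order used elsewhere in this log):

```python
# First row of the printed (1545, 4) probability matrix.
probs = [0.11264249, 0.06690761, 0.8070322, 0.01341768]

# Index of the largest probability, i.e. the predicted class id.
pred = max(range(len(probs)), key=probs.__getitem__)
print(pred)  # 2, matching the leading 2.0 entries in the (1545, 1) array
```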
-------------------------------------------------

Global accuracy score (train) = 61.02 [%]
Global accuracy score (test) = 52.75 [%]
Global F1 score (train) = 60.13 [%]
Global F1 score (test) = 51.81 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.35      0.20      0.26       400
MODERATE-INTENSITY       0.43      0.62      0.51       400
         SEDENTARY       0.63      0.70      0.66       400
VIGOROUS-INTENSITY       0.70      0.59      0.64       345

          accuracy                           0.53      1545
         macro avg       0.53      0.53      0.52      1545
      weighted avg       0.52      0.53      0.51      1545
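The macro and weighted averages in the report follow directly from the per-class rows. A sketch of both computations using the F1 column and supports printed above:

```python
# Per-class F1 scores and supports, copied from the classification report.
f1 = {"LIGHT-INTENSITY": 0.26, "MODERATE-INTENSITY": 0.51,
      "SEDENTARY": 0.66, "VIGOROUS-INTENSITY": 0.64}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

# macro avg: unweighted mean over classes (~0.5175, reported as 0.52)
macro_f1 = sum(f1.values()) / len(f1)

# weighted avg: mean weighted by class support (~0.5131, reported as 0.51)
weighted_f1 = sum(f1[c] * support[c] for c in f1) / sum(support.values())

print(round(macro_f1, 2), round(weighted_f1, 2))
```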

2025-11-04 13:36:44.408026: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-11-04 13:36:44.419983: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:467] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
E0000 00:00:1762259804.434589 1658457 cuda_dnn.cc:8579] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
E0000 00:00:1762259804.439000 1658457 cuda_blas.cc:1407] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
W0000 00:00:1762259804.449523 1658457 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259804.449544 1658457 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259804.449546 1658457 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
W0000 00:00:1762259804.449548 1658457 computation_placer.cc:177] computation placer already registered. Please check linkage and avoid linking the same target more than once.
2025-11-04 13:36:44.452906: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259806.803642 1658457 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
Epoch 1/91
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
I0000 00:00:1762259808.460957 1658588 service.cc:152] XLA service 0x7815600052d0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259808.461000 1658588 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:36:48.508232: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259808.670999 1658588 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259811.389413 1658588 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2710 - loss: 1.9354 - val_accuracy: 0.4691 - val_loss: 1.2831
Epoch 2/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3080 - loss: 1.6476 - val_accuracy: 0.4768 - val_loss: 1.2049
Epoch 3/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3341 - loss: 1.4817 - val_accuracy: 0.4895 - val_loss: 1.1869
Epoch 4/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3539 - loss: 1.3876 - val_accuracy: 0.4947 - val_loss: 1.1760
Epoch 5/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3959 - loss: 1.3216 - val_accuracy: 0.5312 - val_loss: 1.1406
Epoch 6/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4241 - loss: 1.2598 - val_accuracy: 0.5495 - val_loss: 1.0969
Epoch 7/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4444 - loss: 1.2091 - val_accuracy: 0.5597 - val_loss: 1.0877
Epoch 8/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4725 - loss: 1.1596 - val_accuracy: 0.5513 - val_loss: 1.0736
Epoch 9/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4872 - loss: 1.1378 - val_accuracy: 0.5643 - val_loss: 1.0688
Epoch 10/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4874 - loss: 1.1302 - val_accuracy: 0.5674 - val_loss: 1.0563
Epoch 11/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5098 - loss: 1.1072 - val_accuracy: 0.5692 - val_loss: 1.0466
Epoch 12/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5139 - loss: 1.0800 - val_accuracy: 0.5618 - val_loss: 1.0432
Epoch 13/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5156 - loss: 1.0671 - val_accuracy: 0.5787 - val_loss: 1.0519
Epoch 14/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5228 - loss: 1.0565 - val_accuracy: 0.5716 - val_loss: 1.0549
Epoch 15/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5363 - loss: 1.0326 - val_accuracy: 0.5758 - val_loss: 1.0561
Epoch 16/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5332 - loss: 1.0248 - val_accuracy: 0.5688 - val_loss: 1.0543
Epoch 17/91
164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5493 - loss: 1.0131 - val_accuracy: 0.5699 - val_loss: 1.0547

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 16ms/step
Saved model to disk.

Accuracy captured in run 28: 52.75 [%]
F1-score captured in run 28: 51.81 [%]

=== RUN 29 ===

--- TRAIN (run 29) ---

--- TEST (run 29) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
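The mapping above assigns indices to the classes in alphabetical order. A minimal sketch of one common way to build it (the project's actual encoding code is not shown in this log):

```python
# Example labels as they might appear in the raw data (hypothetical order).
labels = ["SEDENTARY", "LIGHT-INTENSITY", "VIGOROUS-INTENSITY",
          "MODERATE-INTENSITY", "SEDENTARY"]

# Unique classes in sorted order, as numpy.unique would also return them.
classes = sorted(set(labels))
print(classes)       # the 4 class names, alphabetical
print(len(classes))  # 4

# Enumerate to get the class-name -> integer-id mapping from the log.
label_to_id = {c: i for i, c in enumerate(classes)}
print(label_to_id)
```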
This activity can't be balanced (in a downsampling way)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
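
The parameter counts in the summary are consistent with an input of shape (6, 250), 128 filters, and a Conv1D kernel size of 5; the kernel size is inferred from the counts, not stated in the log. A quick arithmetic check:

```python
filters, kernel, in_channels, n_classes = 128, 5, 250, 4

conv1d = filters * (in_channels * kernel + 1)   # 160,128: weights plus one bias per filter
conv1d_next = filters * (filters * kernel + 1)  # 82,048 each for conv1d_1 and conv1d_2
layer_norm = 2 * filters                        # 256: gamma and beta per channel
dense = filters * n_classes + n_classes         # 516: final classifier

total = conv1d + 2 * conv1d_next + 3 * layer_norm + dense
print(total)  # 325508, matching "Total params: 325,508"
```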
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 ━━━━━━━━━━━━━━━━━━━━ 2s 2ms/step

49/49 ━━━━━━━━━━━━━━━━━━━━ 1s 15ms/step

89/89 ━━━━━━━━━━━━━━━━━━━━ 0s 924us/step
Global accuracy score (validation) = 56.78 [%]
Global F1 score (validation) = 54.08 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.08727251 0.0542022  0.84755117 0.01097405]
 [0.23400073 0.17040169 0.554589   0.04100861]
 [0.11277982 0.06643496 0.7960305  0.02475474]
 ...
 [0.16543277 0.09316811 0.60473114 0.13666807]
 [0.2479307  0.19892076 0.14988312 0.40326536]
 [0.3904901  0.4457982  0.06362834 0.1000834 ]]
(1545, 4)
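
The (1545, 4) array above holds per-window softmax probabilities (each row sums to 1); the predicted intensity class is the argmax of each row. A sketch using the first printed row:

```python
row = [0.08727251, 0.0542022, 0.84755117, 0.01097405]  # first row from the array above
assert abs(sum(row) - 1.0) < 1e-6                      # softmax rows sum to ~1

pred = row.index(max(row))                             # argmax -> predicted class id
print(pred)  # 2, i.e. 'SEDENTARY' under the label mapping printed earlier
```

This agrees with the first entry (2.0) of the true-label column printed above.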
-------------------------------------------------

Global accuracy score (train) = 60.48 [%]
Global accuracy score (test) = 54.5 [%]
Global F1 score (train) = 59.0 [%]
Global F1 score (test) = 53.57 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.43      0.23      0.30       400
MODERATE-INTENSITY       0.45      0.68      0.54       400
         SEDENTARY       0.61      0.69      0.65       400
VIGOROUS-INTENSITY       0.74      0.59      0.66       345

          accuracy                           0.54      1545
         macro avg       0.56      0.55      0.54      1545
      weighted avg       0.55      0.54      0.53      1545
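
The macro and weighted averages in the report follow directly from the per-class F1 scores and supports; recomputing them from the printed (already rounded) values:

```python
f1 = {'LIGHT-INTENSITY': 0.30, 'MODERATE-INTENSITY': 0.54,
      'SEDENTARY': 0.65, 'VIGOROUS-INTENSITY': 0.66}
support = {'LIGHT-INTENSITY': 400, 'MODERATE-INTENSITY': 400,
           'SEDENTARY': 400, 'VIGOROUS-INTENSITY': 345}

macro = sum(f1.values()) / len(f1)  # unweighted mean over classes
weighted = sum(f1[c] * support[c] for c in f1) / sum(support.values())  # support-weighted

print(round(macro, 2), round(weighted, 2))  # 0.54 0.53, matching the report
```

The gap between the two reflects the slightly smaller VIGOROUS-INTENSITY support (345 vs 400).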

/home/simur/git/uniovi-simur-wearablepermed-ml/.venv/lib/python3.12/site-packages/keras/src/layers/core/input_layer.py:27: UserWarning: Argument `input_shape` is deprecated. Use `shape` instead.
  warnings.warn(
I0000 00:00:1762259835.592051 1661020 gpu_device.cc:2019] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 13768 MB memory:  -> device: 0, name: NVIDIA GeForce RTX 4060 Ti, pci bus id: 0000:65:00.0, compute capability: 8.9
1 GPU(s) detected and VRAM set to crossover mode.
This activity can't be balanced (in a downsampling way)
Epoch 1/91
I0000 00:00:1762259837.263214 1661152 service.cc:152] XLA service 0x72d2ec00c040 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
I0000 00:00:1762259837.263269 1661152 service.cc:160]   StreamExecutor device (0): NVIDIA GeForce RTX 4060 Ti, Compute Capability 8.9
2025-11-04 13:37:17.312018: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:269] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
I0000 00:00:1762259837.488063 1661152 cuda_dnn.cc:529] Loaded cuDNN version 91002
I0000 00:00:1762259840.182983 1661152 device_compiler.h:188] Compiled cluster using XLA!  This line is logged at most once for the lifetime of the process.

164/164 ━━━━━━━━━━━━━━━━━━━━ 9s 31ms/step - accuracy: 0.2893 - loss: 1.9863 - val_accuracy: 0.4470 - val_loss: 1.3350
Epoch 2/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.2974 - loss: 1.7419 - val_accuracy: 0.4902 - val_loss: 1.1951
Epoch 3/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3220 - loss: 1.5366 - val_accuracy: 0.4958 - val_loss: 1.2039
Epoch 4/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3429 - loss: 1.4249 - val_accuracy: 0.5046 - val_loss: 1.1815
Epoch 5/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3733 - loss: 1.3356 - val_accuracy: 0.5162 - val_loss: 1.1583
Epoch 6/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.3964 - loss: 1.2881 - val_accuracy: 0.5495 - val_loss: 1.1346
Epoch 7/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4373 - loss: 1.2218 - val_accuracy: 0.5400 - val_loss: 1.1037
Epoch 8/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4569 - loss: 1.1823 - val_accuracy: 0.5376 - val_loss: 1.0806
Epoch 9/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4797 - loss: 1.1504 - val_accuracy: 0.5629 - val_loss: 1.0634
Epoch 10/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4954 - loss: 1.1123 - val_accuracy: 0.5629 - val_loss: 1.0657
Epoch 11/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.4985 - loss: 1.1151 - val_accuracy: 0.5492 - val_loss: 1.0769
Epoch 12/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5064 - loss: 1.0882 - val_accuracy: 0.5653 - val_loss: 1.0513
Epoch 13/91

164/164 ━━━━━━━━━━━━━━━━━━━━ 0s 2ms/step - accuracy: 0.5169 - loss: 1.0663 - val_accuracy: 0.5667 - val_loss: 1.0494
Epoch 14/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.4844 - loss: 1.1255
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5109 - loss: 1.0567 
[1m 73/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5147 - loss: 1.0575
[1m110/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5178 - loss: 1.0583
[1m148/164[0m [32m━━━━━━━━━━━━━━━━━━[0m[37m━━[0m [1m0s[0m 1ms/step - accuracy: 0.5199 - loss: 1.0582
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5206 - loss: 1.0580 - val_accuracy: 0.5607 - val_loss: 1.0644
Epoch 15/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 21ms/step - accuracy: 0.5312 - loss: 1.0622
[1m 35/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5392 - loss: 1.0354 
[1m 69/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5412 - loss: 1.0313
[1m105/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5420 - loss: 1.0318
[1m141/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5412 - loss: 1.0329
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5402 - loss: 1.0344 - val_accuracy: 0.5664 - val_loss: 1.0682
Epoch 16/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.6250 - loss: 0.9321
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5628 - loss: 1.0035 
[1m 71/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5559 - loss: 1.0112
[1m101/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5563 - loss: 1.0114
[1m138/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5550 - loss: 1.0135
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5534 - loss: 1.0154 - val_accuracy: 0.5499 - val_loss: 1.0556
Epoch 17/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 23ms/step - accuracy: 0.5469 - loss: 1.0740
[1m 34/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5586 - loss: 1.0276 
[1m 64/164[0m [32m━━━━━━━[0m[37m━━━━━━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5565 - loss: 1.0244
[1m 99/164[0m [32m━━━━━━━━━━━━[0m[37m━━━━━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5551 - loss: 1.0226
[1m134/164[0m [32m━━━━━━━━━━━━━━━━[0m[37m━━━━[0m [1m0s[0m 2ms/step - accuracy: 0.5540 - loss: 1.0219
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5534 - loss: 1.0210 - val_accuracy: 0.5702 - val_loss: 1.0538
Epoch 18/91

[1m  1/164[0m [37m━━━━━━━━━━━━━━━━━━━━[0m [1m3s[0m 22ms/step - accuracy: 0.4844 - loss: 1.1562
[1m 36/164[0m [32m━━━━[0m[37m━━━━━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5536 - loss: 1.0215 
[1m 72/164[0m [32m━━━━━━━━[0m[37m━━━━━━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5560 - loss: 1.0173
[1m109/164[0m [32m━━━━━━━━━━━━━[0m[37m━━━━━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5546 - loss: 1.0173
[1m142/164[0m [32m━━━━━━━━━━━━━━━━━[0m[37m━━━[0m [1m0s[0m 1ms/step - accuracy: 0.5541 - loss: 1.0169
[1m164/164[0m [32m━━━━━━━━━━━━━━━━━━━━[0m[37m[0m [1m0s[0m 2ms/step - accuracy: 0.5542 - loss: 1.0164 - val_accuracy: 0.5650 - val_loss: 1.0693

49/49 - 1s 16ms/step
Saved model to disk.

Accuracy captured in run 29: 54.5 [%]
F1-score captured in run 29: 53.57 [%]

=== RUN 30 ===

--- TRAIN (run 30) ---

--- TEST (run 30) ---
['LIGHT-INTENSITY' 'MODERATE-INTENSITY' 'SEDENTARY' 'VIGOROUS-INTENSITY']
4
Label mapping: {'LIGHT-INTENSITY': 0, 'MODERATE-INTENSITY': 1, 'SEDENTARY': 2, 'VIGOROUS-INTENSITY': 3}
This activity can't be balanced (in a downsampling way)
(last message repeated 76 more times)
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ conv1d (Conv1D)                 │ (None, 6, 128)         │       160,128 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization             │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout (Dropout)               │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_1 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_1           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_1 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ conv1d_2 (Conv1D)               │ (None, 6, 128)         │        82,048 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ layer_normalization_2           │ (None, 6, 128)         │           256 │
│ (LayerNormalization)            │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_2 (Dropout)             │ (None, 6, 128)         │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ global_average_pooling1d        │ (None, 128)            │             0 │
│ (GlobalAveragePooling1D)        │                        │               │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dropout_3 (Dropout)             │ (None, 128)            │             0 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense (Dense)                   │ (None, 4)              │           516 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 325,508 (1.24 MB)
 Trainable params: 325,508 (1.24 MB)
 Non-trainable params: 0 (0.00 B)
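The layer parameter counts in the summary above can be re-derived arithmetically. A minimal sketch, assuming a kernel size of 5 and channels-last input of shape (6, 250) — both inferred from the printed numbers (250 · 5 · 128 + 128 = 160,128), not taken from the training script:

```python
# Hypothetical re-derivation of the Keras summary's parameter counts.
# Kernel size 5 and the (6, 250) input layout are inferred from the
# numbers in the summary, not read from the actual model code.
def conv1d_params(in_ch, filters, kernel):
    return in_ch * kernel * filters + filters   # kernel weights + bias

def layernorm_params(features):
    return 2 * features                         # gamma + beta

def dense_params(in_features, units):
    return in_features * units + units          # weights + bias

total = (
    conv1d_params(250, 128, 5)      # conv1d: 160,128
    + layernorm_params(128)         # layer_normalization: 256
    + conv1d_params(128, 128, 5)    # conv1d_1: 82,048
    + layernorm_params(128)         # layer_normalization_1: 256
    + conv1d_params(128, 128, 5)    # conv1d_2: 82,048
    + layernorm_params(128)         # layer_normalization_2: 256
    + dense_params(128, 4)          # dense: 516
)
print(total)  # 325508, matching "Total params" above
```

The pooling and dropout layers contribute no parameters, which is why only the convolutions, layer norms, and the final dense layer appear in the sum.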
Loaded model from disk.
(1545, 6, 250)
(10469, 6, 250)

328/328 - 2s 2ms/step

49/49 - 1s 14ms/step

89/89 - 0s 1ms/step
Global accuracy score (validation) = 56.07 [%]
Global F1 score (validation) = 53.05 [%]
[[2.]
 [2.]
 [2.]
 ...
 [3.]
 [3.]
 [3.]]
(1545, 1)
[[0.2723913  0.21665876 0.45240143 0.0585485 ]
 [0.13045967 0.08924674 0.76530373 0.01498991]
 [0.11893378 0.08073916 0.7857774  0.01454964]
 ...
 [0.07342409 0.04995958 0.09642653 0.78018975]
 [0.2679193  0.25850153 0.08940291 0.38417622]
 [0.30808938 0.442719   0.0268516  0.22233996]]
(1545, 4)
-------------------------------------------------

Global accuracy score (train) = 59.98 [%]
Global accuracy score (test) = 53.14 [%]
Global F1 score (train) = 57.65 [%]
Global F1 score (test) = 51.54 [%]
                    precision    recall  f1-score   support

   LIGHT-INTENSITY       0.43      0.17      0.25       400
MODERATE-INTENSITY       0.41      0.67      0.51       400
         SEDENTARY       0.66      0.69      0.67       400
VIGOROUS-INTENSITY       0.65      0.61      0.63       345

          accuracy                           0.53      1545
         macro avg       0.54      0.53      0.52      1545
      weighted avg       0.53      0.53      0.51      1545
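As a sanity check on the report above: the macro average is the unweighted mean of the per-class F1 scores, while the weighted average weights each class by its support. A minimal sketch using only the rounded values printed in the report (so the last digit may differ slightly from scikit-learn's output, which averages the unrounded scores):

```python
# Recomputing the averaged rows of the classification report above
# from the printed (two-decimal) per-class F1 scores and supports.
f1 = {"LIGHT-INTENSITY": 0.25, "MODERATE-INTENSITY": 0.51,
      "SEDENTARY": 0.67, "VIGOROUS-INTENSITY": 0.63}
support = {"LIGHT-INTENSITY": 400, "MODERATE-INTENSITY": 400,
           "SEDENTARY": 400, "VIGOROUS-INTENSITY": 345}

macro_f1 = sum(f1.values()) / len(f1)             # unweighted mean
weighted_f1 = (sum(f1[c] * support[c] for c in f1)
               / sum(support.values()))           # support-weighted mean

print(f"{macro_f1:.3f} {weighted_f1:.3f}")  # 0.515 0.511
```

Note that the minority class (VIGOROUS-INTENSITY, support 345) pulls the weighted average slightly below the macro average here.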


Accuracy captured in run 30: 53.14 [%]
F1-score captured in run 30: 51.54 [%]

=== FINAL SUMMARY ===
Accuracies: [57.15, 52.1, 54.3, 51.46, 56.25, 55.02, 53.27, 52.94, 52.56, 51.2, 52.69, 52.62, 54.24, 52.49, 51.78, 53.33, 51.59, 52.49, 53.2, 52.69, 55.15, 52.82, 54.76, 53.33, 52.36, 53.33, 54.76, 52.75, 54.5, 53.14]
F1-scores: [57.07, 51.89, 54.4, 51.73, 56.49, 54.67, 53.24, 52.63, 51.48, 49.06, 53.13, 52.37, 54.72, 51.19, 52.22, 52.68, 50.32, 52.58, 52.89, 51.82, 55.06, 52.73, 54.59, 53.41, 50.6, 51.35, 54.24, 51.81, 53.57, 51.54]
Accuracy mean: 53.3423 | std: 1.3731
F1 mean: 52.8493 | std: 1.7389
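The mean/std lines above can be reproduced from the printed per-run lists. A minimal sketch for the accuracies, assuming the population standard deviation (ddof = 0, as `numpy.std` uses by default), which matches the printed values:

```python
from statistics import fmean, pstdev

# Per-run accuracies copied from the "Accuracies:" line above.
accuracies = [57.15, 52.1, 54.3, 51.46, 56.25, 55.02, 53.27, 52.94, 52.56,
              51.2, 52.69, 52.62, 54.24, 52.49, 51.78, 53.33, 51.59, 52.49,
              53.2, 52.69, 55.15, 52.82, 54.76, 53.33, 52.36, 53.33, 54.76,
              52.75, 54.5, 53.14]

# pstdev is the population std (divides by n, like numpy's ddof=0 default).
print(round(fmean(accuracies), 4))   # 53.3423
print(round(pstdev(accuracies), 4))  # 1.3731
```

Using the sample standard deviation (`statistics.stdev`, ddof = 1) instead would give a slightly larger spread over the 30 runs.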

Results saved to /mnt/nvme1n2/git/uniovi-simur-wearablepermed-data/output/Paper_results/cases_dataset_M/case_M_ESANN_acc_gyr_superclasses_CPA_METs/metrics_test.npz
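An `.npz` archive like the one saved above can be read back with `numpy.load`. A minimal round-trip sketch; the key names (`accuracies`, `f1_scores`) are assumptions about the archive layout, not taken from the training script — inspect `data.files` on the real file to see its actual keys:

```python
import os
import tempfile

import numpy as np

# Hypothetical round-trip for a metrics archive like metrics_test.npz.
# Key names are illustrative; the real archive's keys may differ.
path = os.path.join(tempfile.mkdtemp(), "metrics_test.npz")
np.savez(path,
         accuracies=np.array([57.15, 52.1, 54.3]),
         f1_scores=np.array([57.07, 51.89, 54.4]))

with np.load(path) as data:
    print(sorted(data.files))       # ['accuracies', 'f1_scores']
    accs = data["accuracies"]       # arrays are materialized on access

print(round(float(accs.mean()), 2))  # 54.52
```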
