[2025-04-03 00:30:16 simmim_finetune] (main_finetune.py 375): INFO Full config saved to checkpoint/human/config.json
[2025-04-03 00:30:16 simmim_finetune] (main_finetune.py 378): INFO AMP_OPT_LEVEL: O0
AUG:
  AUTO_AUGMENT: rand-m9-mstd0.5-inc1
  COLOR_JITTER: 0.4
  CUTMIX: 1.0
  CUTMIX_MINMAX: null
  MIXUP: 0.8
  MIXUP_MODE: batch
  MIXUP_PROB: 1.0
  MIXUP_SWITCH_PROB: 0.5
  RECOUNT: 1
  REMODE: pixel
  REPROB: 0.25
BASE:
- ''
DATA:
  BATCH_SIZE: 128
  DATASET: imagenet
  DATA_PATH: ''
  IMG_SIZE: 224
  INTERPOLATION: bicubic
  MASK_PATCH_SIZE: 32
  MASK_RATIO: 0.6
  NUM_WORKERS: 8
  PIN_MEMORY: true
  TRAIN_PATH: VBench-2.0_human_anomaly/dataset/human_train.jsonl
  VAL_PATH: VBench-2.0_human_anomaly/dataset/human_test.jsonl
EVAL_MODE: false
LOCAL_RANK: 0
LOSS:
  FOCAL: false
  FOCAL_ALPHA: 0.25
  FOCAL_GAMMA: 2.0
MODEL:
  DROP_PATH_RATE: 0.1
  DROP_RATE: 0.0
  LABEL_SMOOTHING: 0.1
  NAME: simmim_finetune
  NUM_CLASSES: 2
  RESUME: ''
  SWIN:
    APE: false
    DEPTHS:
    - 2
    - 2
    - 6
    - 2
    EMBED_DIM: 96
    IN_CHANS: 3
    MLP_RATIO: 4.0
    NUM_HEADS:
    - 3
    - 6
    - 12
    - 24
    PATCH_NORM: true
    PATCH_SIZE: 4
    QKV_BIAS: true
    QK_SCALE: null
    WINDOW_SIZE: 7
  TYPE: vit
  VIT:
    DEPTH: 12
    EMBED_DIM: 768
    INIT_VALUES: 0.1
    IN_CHANS: 3
    MLP_RATIO: 4
    NUM_HEADS: 12
    PATCH_SIZE: 16
    QKV_BIAS: true
    USE_APE: false
    USE_MEAN_POOLING: true
    USE_RPB: true
    USE_SHARED_RPB: false
OUTPUT: checkpoint/human
PRETRAINED: pretrain/simmim_pretrain__vit_base__img224__800ep.pth
PRINT_FREQ: 2
SAVE_FREQ: 5
SEED: 0
TAG: simmim_finetune__vit_base__img224__800ep
TEST:
  CROP: true
THROUGHPUT_MODE: false
TRAIN:
  ACCUMULATION_STEPS: 0
  AUTO_RESUME: true
  BASE_LR: 0.00125
  CLIP_GRAD: 5.0
  EPOCHS: 30
  LAYER_DECAY: 0.65
  LR_SCHEDULER:
    DECAY_EPOCHS: 30
    DECAY_RATE: 0.1
    GAMMA: 0.1
    MULTISTEPS: []
    NAME: cosine
  MIN_LR: 2.5e-07
  OPTIMIZER:
    BETAS:
    - 0.9
    - 0.999
    EPS: 1.0e-08
    MOMENTUM: 0.9
    NAME: adamw
  START_EPOCH: 0
  USE_CHECKPOINT: false
  WARMUP_EPOCHS: 3
  WARMUP_LR: 2.5e-07
  WEIGHT_DECAY: 0.05

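The `TRAIN` block above implies a linear warmup from `WARMUP_LR` to `BASE_LR` over 3 epochs, followed by a cosine decay down to `MIN_LR` at epoch 30. A minimal sketch of that schedule at per-epoch granularity (the actual scheduler very likely steps per iteration, which is why the first `Train:` lines below show the lr creeping up within epoch 0):

```python
import math

# Values taken from the TRAIN section of the config above.
BASE_LR = 0.00125
WARMUP_LR = 2.5e-07
MIN_LR = 2.5e-07
WARMUP_EPOCHS = 3
EPOCHS = 30

def lr_at(epoch: float) -> float:
    """Linear warmup, then cosine anneal (per-epoch granularity)."""
    if epoch < WARMUP_EPOCHS:
        # Linear ramp from WARMUP_LR up to BASE_LR.
        return WARMUP_LR + (BASE_LR - WARMUP_LR) * epoch / WARMUP_EPOCHS
    # Cosine anneal from BASE_LR down to MIN_LR over the remaining epochs.
    progress = (epoch - WARMUP_EPOCHS) / (EPOCHS - WARMUP_EPOCHS)
    return MIN_LR + 0.5 * (BASE_LR - MIN_LR) * (1 + math.cos(math.pi * progress))

print(lr_at(0), lr_at(3), lr_at(30))
```

Note these are the *base* group values; each parameter group additionally multiplies in its `lr_scale` (see the optimizer dump below).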
[2025-04-03 00:30:16 simmim_finetune] (data_finetune.py 88): INFO Fine-tune data transform, is_train=True:
Compose(
    RandomResizedCropAndInterpolation(size=(224, 224), scale=(0.08, 1.0), ratio=(0.75, 1.3333), interpolation=PIL.Image.BICUBIC)
    RandomHorizontalFlip(p=0.5)
    <timm.data.auto_augment.RandAugment object at 0x7f69b37d4f40>
    ToTensor()
    Normalize(mean=tensor([0.4850, 0.4560, 0.4060]), std=tensor([0.2290, 0.2240, 0.2250]))
    <timm.data.random_erasing.RandomErasing object at 0x7f69b37d52a0>
)
[2025-04-03 00:30:16 simmim_finetune] (data_finetune.py 88): INFO Fine-tune data transform, is_train=False:
Compose(
    Resize(size=256, interpolation=bicubic, max_size=None, antialias=True)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225))
)
[2025-04-03 00:30:16 simmim_finetune] (data_finetune.py 26): INFO Build dataset: train images = 73366, val images = 1984
[2025-04-03 00:30:16 simmim_finetune] (main_finetune.py 102): INFO Creating model:vit/simmim_finetune
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 105): INFO VisionTransformer(
  (patch_embed): PatchEmbed(
    (proj): Conv2d(3, 768, kernel_size=(16, 16), stride=(16, 16))
  )
  (pos_drop): Dropout(p=0.0, inplace=False)
  (blocks): ModuleList(
    (0): Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=False)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): Identity()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
    (1-11): 11 x Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=False)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): DropPath()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
  )
  (norm): Identity()
  (fc_norm): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
  (head): Linear(in_features=768, out_features=2, bias=True)
)
[2025-04-03 00:30:18 simmim_finetune] (optimizer.py 70): INFO >>>>>>>>>> Build Optimizer for Fine-tuning Stage
[2025-04-03 00:30:18 simmim_finetune] (optimizer.py 87): INFO No weight decay: {'pos_embed', 'cls_token'}
[2025-04-03 00:30:18 simmim_finetune] (optimizer.py 182): INFO Param groups = {
  "layer_0_no_decay": {
    "group_name": "layer_0_no_decay",
    "weight_decay": 0.0,
    "params": [
      "cls_token",
      "patch_embed.proj.bias"
    ],
    "lr": 4.621507363773394e-06,
    "lr_scale": 0.003697205891018715
  },
  "layer_0_decay": {
    "group_name": "layer_0_decay",
    "weight_decay": 0.05,
    "params": [
      "patch_embed.proj.weight"
    ],
    "lr": 4.621507363773394e-06,
    "lr_scale": 0.003697205891018715
  },
  "layer_1_no_decay": {
    "group_name": "layer_1_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.0.gamma_1",
      "blocks.0.gamma_2",
      "blocks.0.norm1.weight",
      "blocks.0.norm1.bias",
      "blocks.0.attn.q_bias",
      "blocks.0.attn.v_bias",
      "blocks.0.attn.proj.bias",
      "blocks.0.norm2.weight",
      "blocks.0.norm2.bias",
      "blocks.0.mlp.fc1.bias",
      "blocks.0.mlp.fc2.bias"
    ],
    "lr": 7.110011328882144e-06,
    "lr_scale": 0.005688009063105715
  },
  "layer_1_decay": {
    "group_name": "layer_1_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.0.attn.relative_position_bias_table",
      "blocks.0.attn.qkv.weight",
      "blocks.0.attn.proj.weight",
      "blocks.0.mlp.fc1.weight",
      "blocks.0.mlp.fc2.weight"
    ],
    "lr": 7.110011328882144e-06,
    "lr_scale": 0.005688009063105715
  },
  "layer_2_no_decay": {
    "group_name": "layer_2_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.1.gamma_1",
      "blocks.1.gamma_2",
      "blocks.1.norm1.weight",
      "blocks.1.norm1.bias",
      "blocks.1.attn.q_bias",
      "blocks.1.attn.v_bias",
      "blocks.1.attn.proj.bias",
      "blocks.1.norm2.weight",
      "blocks.1.norm2.bias",
      "blocks.1.mlp.fc1.bias",
      "blocks.1.mlp.fc2.bias"
    ],
    "lr": 1.093847896751099e-05,
    "lr_scale": 0.008750783174008792
  },
  "layer_2_decay": {
    "group_name": "layer_2_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.1.attn.relative_position_bias_table",
      "blocks.1.attn.qkv.weight",
      "blocks.1.attn.proj.weight",
      "blocks.1.mlp.fc1.weight",
      "blocks.1.mlp.fc2.weight"
    ],
    "lr": 1.093847896751099e-05,
    "lr_scale": 0.008750783174008792
  },
  "layer_3_no_decay": {
    "group_name": "layer_3_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.2.gamma_1",
      "blocks.2.gamma_2",
      "blocks.2.norm1.weight",
      "blocks.2.norm1.bias",
      "blocks.2.attn.q_bias",
      "blocks.2.attn.v_bias",
      "blocks.2.attn.proj.bias",
      "blocks.2.norm2.weight",
      "blocks.2.norm2.bias",
      "blocks.2.mlp.fc1.bias",
      "blocks.2.mlp.fc2.bias"
    ],
    "lr": 1.682842918078614e-05,
    "lr_scale": 0.013462743344628911
  },
  "layer_3_decay": {
    "group_name": "layer_3_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.2.attn.relative_position_bias_table",
      "blocks.2.attn.qkv.weight",
      "blocks.2.attn.proj.weight",
      "blocks.2.mlp.fc1.weight",
      "blocks.2.mlp.fc2.weight"
    ],
    "lr": 1.682842918078614e-05,
    "lr_scale": 0.013462743344628911
  },
  "layer_4_no_decay": {
    "group_name": "layer_4_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.3.gamma_1",
      "blocks.3.gamma_2",
      "blocks.3.norm1.weight",
      "blocks.3.norm1.bias",
      "blocks.3.attn.q_bias",
      "blocks.3.attn.v_bias",
      "blocks.3.attn.proj.bias",
      "blocks.3.norm2.weight",
      "blocks.3.norm2.bias",
      "blocks.3.mlp.fc1.bias",
      "blocks.3.mlp.fc2.bias"
    ],
    "lr": 2.588989104736329e-05,
    "lr_scale": 0.02071191283789063
  },
  "layer_4_decay": {
    "group_name": "layer_4_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.3.attn.relative_position_bias_table",
      "blocks.3.attn.qkv.weight",
      "blocks.3.attn.proj.weight",
      "blocks.3.mlp.fc1.weight",
      "blocks.3.mlp.fc2.weight"
    ],
    "lr": 2.588989104736329e-05,
    "lr_scale": 0.02071191283789063
  },
  "layer_5_no_decay": {
    "group_name": "layer_5_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.4.gamma_1",
      "blocks.4.gamma_2",
      "blocks.4.norm1.weight",
      "blocks.4.norm1.bias",
      "blocks.4.attn.q_bias",
      "blocks.4.attn.v_bias",
      "blocks.4.attn.proj.bias",
      "blocks.4.norm2.weight",
      "blocks.4.norm2.bias",
      "blocks.4.mlp.fc1.bias",
      "blocks.4.mlp.fc2.bias"
    ],
    "lr": 3.983060161132814e-05,
    "lr_scale": 0.03186448128906251
  },
  "layer_5_decay": {
    "group_name": "layer_5_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.4.attn.relative_position_bias_table",
      "blocks.4.attn.qkv.weight",
      "blocks.4.attn.proj.weight",
      "blocks.4.mlp.fc1.weight",
      "blocks.4.mlp.fc2.weight"
    ],
    "lr": 3.983060161132814e-05,
    "lr_scale": 0.03186448128906251
  },
  "layer_6_no_decay": {
    "group_name": "layer_6_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.5.gamma_1",
      "blocks.5.gamma_2",
      "blocks.5.norm1.weight",
      "blocks.5.norm1.bias",
      "blocks.5.attn.q_bias",
      "blocks.5.attn.v_bias",
      "blocks.5.attn.proj.bias",
      "blocks.5.norm2.weight",
      "blocks.5.norm2.bias",
      "blocks.5.mlp.fc1.bias",
      "blocks.5.mlp.fc2.bias"
    ],
    "lr": 6.127784863281252e-05,
    "lr_scale": 0.049022278906250015
  },
  "layer_6_decay": {
    "group_name": "layer_6_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.5.attn.relative_position_bias_table",
      "blocks.5.attn.qkv.weight",
      "blocks.5.attn.proj.weight",
      "blocks.5.mlp.fc1.weight",
      "blocks.5.mlp.fc2.weight"
    ],
    "lr": 6.127784863281252e-05,
    "lr_scale": 0.049022278906250015
  },
  "layer_7_no_decay": {
    "group_name": "layer_7_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.6.gamma_1",
      "blocks.6.gamma_2",
      "blocks.6.norm1.weight",
      "blocks.6.norm1.bias",
      "blocks.6.attn.q_bias",
      "blocks.6.attn.v_bias",
      "blocks.6.attn.proj.bias",
      "blocks.6.norm2.weight",
      "blocks.6.norm2.bias",
      "blocks.6.mlp.fc1.bias",
      "blocks.6.mlp.fc2.bias"
    ],
    "lr": 9.427361328125001e-05,
    "lr_scale": 0.07541889062500001
  },
  "layer_7_decay": {
    "group_name": "layer_7_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.6.attn.relative_position_bias_table",
      "blocks.6.attn.qkv.weight",
      "blocks.6.attn.proj.weight",
      "blocks.6.mlp.fc1.weight",
      "blocks.6.mlp.fc2.weight"
    ],
    "lr": 9.427361328125001e-05,
    "lr_scale": 0.07541889062500001
  },
  "layer_8_no_decay": {
    "group_name": "layer_8_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.7.gamma_1",
      "blocks.7.gamma_2",
      "blocks.7.norm1.weight",
      "blocks.7.norm1.bias",
      "blocks.7.attn.q_bias",
      "blocks.7.attn.v_bias",
      "blocks.7.attn.proj.bias",
      "blocks.7.norm2.weight",
      "blocks.7.norm2.bias",
      "blocks.7.mlp.fc1.bias",
      "blocks.7.mlp.fc2.bias"
    ],
    "lr": 0.00014503632812500002,
    "lr_scale": 0.11602906250000002
  },
  "layer_8_decay": {
    "group_name": "layer_8_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.7.attn.relative_position_bias_table",
      "blocks.7.attn.qkv.weight",
      "blocks.7.attn.proj.weight",
      "blocks.7.mlp.fc1.weight",
      "blocks.7.mlp.fc2.weight"
    ],
    "lr": 0.00014503632812500002,
    "lr_scale": 0.11602906250000002
  },
  "layer_9_no_decay": {
    "group_name": "layer_9_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.8.gamma_1",
      "blocks.8.gamma_2",
      "blocks.8.norm1.weight",
      "blocks.8.norm1.bias",
      "blocks.8.attn.q_bias",
      "blocks.8.attn.v_bias",
      "blocks.8.attn.proj.bias",
      "blocks.8.norm2.weight",
      "blocks.8.norm2.bias",
      "blocks.8.mlp.fc1.bias",
      "blocks.8.mlp.fc2.bias"
    ],
    "lr": 0.00022313281250000005,
    "lr_scale": 0.17850625000000003
  },
  "layer_9_decay": {
    "group_name": "layer_9_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.8.attn.relative_position_bias_table",
      "blocks.8.attn.qkv.weight",
      "blocks.8.attn.proj.weight",
      "blocks.8.mlp.fc1.weight",
      "blocks.8.mlp.fc2.weight"
    ],
    "lr": 0.00022313281250000005,
    "lr_scale": 0.17850625000000003
  },
  "layer_10_no_decay": {
    "group_name": "layer_10_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.9.gamma_1",
      "blocks.9.gamma_2",
      "blocks.9.norm1.weight",
      "blocks.9.norm1.bias",
      "blocks.9.attn.q_bias",
      "blocks.9.attn.v_bias",
      "blocks.9.attn.proj.bias",
      "blocks.9.norm2.weight",
      "blocks.9.norm2.bias",
      "blocks.9.mlp.fc1.bias",
      "blocks.9.mlp.fc2.bias"
    ],
    "lr": 0.00034328125,
    "lr_scale": 0.274625
  },
  "layer_10_decay": {
    "group_name": "layer_10_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.9.attn.relative_position_bias_table",
      "blocks.9.attn.qkv.weight",
      "blocks.9.attn.proj.weight",
      "blocks.9.mlp.fc1.weight",
      "blocks.9.mlp.fc2.weight"
    ],
    "lr": 0.00034328125,
    "lr_scale": 0.274625
  },
  "layer_11_no_decay": {
    "group_name": "layer_11_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.10.gamma_1",
      "blocks.10.gamma_2",
      "blocks.10.norm1.weight",
      "blocks.10.norm1.bias",
      "blocks.10.attn.q_bias",
      "blocks.10.attn.v_bias",
      "blocks.10.attn.proj.bias",
      "blocks.10.norm2.weight",
      "blocks.10.norm2.bias",
      "blocks.10.mlp.fc1.bias",
      "blocks.10.mlp.fc2.bias"
    ],
    "lr": 0.0005281250000000001,
    "lr_scale": 0.42250000000000004
  },
  "layer_11_decay": {
    "group_name": "layer_11_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.10.attn.relative_position_bias_table",
      "blocks.10.attn.qkv.weight",
      "blocks.10.attn.proj.weight",
      "blocks.10.mlp.fc1.weight",
      "blocks.10.mlp.fc2.weight"
    ],
    "lr": 0.0005281250000000001,
    "lr_scale": 0.42250000000000004
  },
  "layer_12_no_decay": {
    "group_name": "layer_12_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.11.gamma_1",
      "blocks.11.gamma_2",
      "blocks.11.norm1.weight",
      "blocks.11.norm1.bias",
      "blocks.11.attn.q_bias",
      "blocks.11.attn.v_bias",
      "blocks.11.attn.proj.bias",
      "blocks.11.norm2.weight",
      "blocks.11.norm2.bias",
      "blocks.11.mlp.fc1.bias",
      "blocks.11.mlp.fc2.bias"
    ],
    "lr": 0.0008125000000000001,
    "lr_scale": 0.65
  },
  "layer_12_decay": {
    "group_name": "layer_12_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.11.attn.relative_position_bias_table",
      "blocks.11.attn.qkv.weight",
      "blocks.11.attn.proj.weight",
      "blocks.11.mlp.fc1.weight",
      "blocks.11.mlp.fc2.weight"
    ],
    "lr": 0.0008125000000000001,
    "lr_scale": 0.65
  },
  "layer_13_no_decay": {
    "group_name": "layer_13_no_decay",
    "weight_decay": 0.0,
    "params": [
      "fc_norm.weight",
      "fc_norm.bias",
      "head.bias"
    ],
    "lr": 0.00125,
    "lr_scale": 1.0
  },
  "layer_13_decay": {
    "group_name": "layer_13_decay",
    "weight_decay": 0.05,
    "params": [
      "head.weight"
    ],
    "lr": 0.00125,
    "lr_scale": 1.0
  }
}
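The `lr_scale` ladder above follows layer-wise learning-rate decay with `LAYER_DECAY: 0.65`: 14 tiers (layer_0 = patch embed / cls token, layers 1-12 = the transformer blocks, layer_13 = `fc_norm` and `head`), each scaled by 0.65 per tier of distance from the head. A quick reconstruction of those factors:

```python
# Values from the config: LAYER_DECAY 0.65, BASE_LR 0.00125.
# 12 blocks plus an embedding tier (layer_0) and a head tier (layer_13).
LAYER_DECAY = 0.65
BASE_LR = 0.00125
NUM_LAYERS = 14  # layer_0 .. layer_13

# Scale shrinks geometrically with distance from the head tier.
scales = [LAYER_DECAY ** (NUM_LAYERS - 1 - i) for i in range(NUM_LAYERS)]
lrs = [BASE_LR * s for s in scales]

print(scales[0])   # layer_0  -> ~0.003697 (matches the dump)
print(lrs[0])      # layer_0  -> ~4.62e-06
print(scales[13])  # layer_13 -> 1.0
```

So the patch embedding trains at roughly 0.37% of the head's learning rate, which is the standard recipe for preserving low-level pre-trained features while adapting the head.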
[2025-04-03 00:30:18 simmim_finetune] (optimizer.py 105): INFO AdamW (
Parameter Group 0
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_0_no_decay
    lr: 4.621507363773394e-06
    lr_scale: 0.003697205891018715
    maximize: False
    weight_decay: 0.0

Parameter Group 1
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_0_decay
    lr: 4.621507363773394e-06
    lr_scale: 0.003697205891018715
    maximize: False
    weight_decay: 0.05

Parameter Group 2
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_1_no_decay
    lr: 7.110011328882144e-06
    lr_scale: 0.005688009063105715
    maximize: False
    weight_decay: 0.0

Parameter Group 3
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_1_decay
    lr: 7.110011328882144e-06
    lr_scale: 0.005688009063105715
    maximize: False
    weight_decay: 0.05

Parameter Group 4
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_2_no_decay
    lr: 1.093847896751099e-05
    lr_scale: 0.008750783174008792
    maximize: False
    weight_decay: 0.0

Parameter Group 5
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_2_decay
    lr: 1.093847896751099e-05
    lr_scale: 0.008750783174008792
    maximize: False
    weight_decay: 0.05

Parameter Group 6
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_3_no_decay
    lr: 1.682842918078614e-05
    lr_scale: 0.013462743344628911
    maximize: False
    weight_decay: 0.0

Parameter Group 7
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_3_decay
    lr: 1.682842918078614e-05
    lr_scale: 0.013462743344628911
    maximize: False
    weight_decay: 0.05

Parameter Group 8
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_4_no_decay
    lr: 2.588989104736329e-05
    lr_scale: 0.02071191283789063
    maximize: False
    weight_decay: 0.0

Parameter Group 9
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_4_decay
    lr: 2.588989104736329e-05
    lr_scale: 0.02071191283789063
    maximize: False
    weight_decay: 0.05

Parameter Group 10
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_5_no_decay
    lr: 3.983060161132814e-05
    lr_scale: 0.03186448128906251
    maximize: False
    weight_decay: 0.0

Parameter Group 11
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_5_decay
    lr: 3.983060161132814e-05
    lr_scale: 0.03186448128906251
    maximize: False
    weight_decay: 0.05

Parameter Group 12
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_6_no_decay
    lr: 6.127784863281252e-05
    lr_scale: 0.049022278906250015
    maximize: False
    weight_decay: 0.0

Parameter Group 13
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_6_decay
    lr: 6.127784863281252e-05
    lr_scale: 0.049022278906250015
    maximize: False
    weight_decay: 0.05

Parameter Group 14
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_7_no_decay
    lr: 9.427361328125001e-05
    lr_scale: 0.07541889062500001
    maximize: False
    weight_decay: 0.0

Parameter Group 15
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_7_decay
    lr: 9.427361328125001e-05
    lr_scale: 0.07541889062500001
    maximize: False
    weight_decay: 0.05

Parameter Group 16
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_8_no_decay
    lr: 0.00014503632812500002
    lr_scale: 0.11602906250000002
    maximize: False
    weight_decay: 0.0

Parameter Group 17
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_8_decay
    lr: 0.00014503632812500002
    lr_scale: 0.11602906250000002
    maximize: False
    weight_decay: 0.05

Parameter Group 18
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_9_no_decay
    lr: 0.00022313281250000005
    lr_scale: 0.17850625000000003
    maximize: False
    weight_decay: 0.0

Parameter Group 19
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_9_decay
    lr: 0.00022313281250000005
    lr_scale: 0.17850625000000003
    maximize: False
    weight_decay: 0.05

Parameter Group 20
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_10_no_decay
    lr: 0.00034328125
    lr_scale: 0.274625
    maximize: False
    weight_decay: 0.0

Parameter Group 21
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_10_decay
    lr: 0.00034328125
    lr_scale: 0.274625
    maximize: False
    weight_decay: 0.05

Parameter Group 22
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_11_no_decay
    lr: 0.0005281250000000001
    lr_scale: 0.42250000000000004
    maximize: False
    weight_decay: 0.0

Parameter Group 23
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_11_decay
    lr: 0.0005281250000000001
    lr_scale: 0.42250000000000004
    maximize: False
    weight_decay: 0.05

Parameter Group 24
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_12_no_decay
    lr: 0.0008125000000000001
    lr_scale: 0.65
    maximize: False
    weight_decay: 0.0

Parameter Group 25
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_12_decay
    lr: 0.0008125000000000001
    lr_scale: 0.65
    maximize: False
    weight_decay: 0.05

Parameter Group 26
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_13_no_decay
    lr: 0.00125
    lr_scale: 1.0
    maximize: False
    weight_decay: 0.0

Parameter Group 27
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_13_decay
    lr: 0.00125
    lr_scale: 1.0
    maximize: False
    weight_decay: 0.05
)
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 116): INFO number of params: 85763522
[2025-04-03 00:30:18 simmim_finetune] (utils.py 81): INFO All checkpoints found in checkpoint/human: []
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 146): INFO no checkpoint found in checkpoint/human, ignoring auto resume
[2025-04-03 00:30:18 simmim_finetune] (utils.py 99): INFO >>>>>>>>>> Fine-tuned from pretrain/simmim_pretrain__vit_base__img224__800ep.pth ..........
[2025-04-03 00:30:18 simmim_finetune] (utils.py 105): INFO Detect pre-trained model, remove [encoder.] prefix.
[2025-04-03 00:30:18 simmim_finetune] (utils.py 113): INFO >>>>>>>>>> Remapping pre-trained keys for VIT ..........
[2025-04-03 00:30:18 simmim_finetune] (utils.py 210): INFO Expand the shared relative position embedding to each transformer block.
[2025-04-03 00:30:18 simmim_finetune] (utils.py 119): INFO _IncompatibleKeys(missing_keys=['blocks.0.attn.relative_position_index', 'blocks.1.attn.relative_position_index', 'blocks.2.attn.relative_position_index', 'blocks.3.attn.relative_position_index', 'blocks.4.attn.relative_position_index', 'blocks.5.attn.relative_position_index', 'blocks.6.attn.relative_position_index', 'blocks.7.attn.relative_position_index', 'blocks.8.attn.relative_position_index', 'blocks.9.attn.relative_position_index', 'blocks.10.attn.relative_position_index', 'blocks.11.attn.relative_position_index', 'fc_norm.weight', 'fc_norm.bias', 'head.weight', 'head.bias'], unexpected_keys=['mask_token', 'norm.weight', 'norm.bias'])
[2025-04-03 00:30:18 simmim_finetune] (utils.py 123): INFO >>>>>>>>>> loaded successfully 'pretrain/simmim_pretrain__vit_base__img224__800ep.pth'
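The remapping step above strips the `encoder.` prefix so pre-training keys line up with the fine-tune model; the leftover `missing_keys` (per-block `relative_position_index` buffers, the fresh `fc_norm`/`head`) and `unexpected_keys` (`mask_token`, the old final `norm`) are expected and benign. The prefix-stripping pattern is roughly the following hypothetical sketch (not the repository's actual `utils.py`):

```python
def strip_prefix(state_dict: dict, prefix: str = "encoder.") -> dict:
    """If any key carries the pre-training module prefix, keep only those
    keys with the prefix removed; otherwise return the dict unchanged."""
    if any(k.startswith(prefix) for k in state_dict):
        return {k[len(prefix):]: v
                for k, v in state_dict.items() if k.startswith(prefix)}
    return state_dict

ckpt = {"encoder.blocks.0.norm1.weight": 1, "encoder.mask_token": 2}
print(strip_prefix(ckpt))  # {'blocks.0.norm1.weight': 1, 'mask_token': 2}
```

Loading the result with `strict=False` then yields exactly the `_IncompatibleKeys` report shown in the log.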
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 161): INFO Start training
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07]
[2025-04-03 00:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][0/573]	eta 0:58:57 lr 0.000000	time 6.1734 (6.1734)	loss 0.6931 (0.6931)	grad_norm 0.9811 (0.9811)	mem 19956MB
[2025-04-03 00:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][2/573]	eta 0:25:10 lr 0.000002	time 0.8787 (2.6459)	loss 0.6932 (0.6932)	grad_norm 1.1281 (1.0852)	mem 20675MB
[2025-04-03 00:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][4/573]	eta 0:18:23 lr 0.000003	time 0.8785 (1.9394)	loss 0.6932 (0.6932)	grad_norm 0.7244 (1.0045)	mem 20675MB
[2025-04-03 00:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][6/573]	eta 0:15:27 lr 0.000005	time 0.8782 (1.6365)	loss 0.6932 (0.6932)	grad_norm 1.8955 (1.0882)	mem 20675MB
[2025-04-03 00:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][8/573]	eta 0:13:49 lr 0.000006	time 0.8783 (1.4681)	loss 0.6931 (0.6931)	grad_norm 0.6747 (1.0512)	mem 20675MB
[2025-04-03 00:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][10/573]	eta 0:12:46 lr 0.000008	time 0.8784 (1.3611)	loss 0.6932 (0.6931)	grad_norm 2.1558 (1.2129)	mem 20675MB
[2025-04-03 00:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][12/573]	eta 0:12:01 lr 0.000009	time 0.8784 (1.2870)	loss 0.6930 (0.6931)	grad_norm 1.5494 (1.2478)	mem 20675MB
[2025-04-03 00:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][14/573]	eta 0:11:29 lr 0.000010	time 0.8789 (1.2327)	loss 0.6925 (0.6931)	grad_norm 1.6015 (1.2477)	mem 20675MB
[2025-04-03 00:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][16/573]	eta 0:11:03 lr 0.000012	time 0.8782 (1.1911)	loss 0.6925 (0.6930)	grad_norm 2.0447 (1.3555)	mem 20675MB
[2025-04-03 00:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][18/573]	eta 0:10:42 lr 0.000013	time 0.8783 (1.1583)	loss 0.6928 (0.6930)	grad_norm 1.0967 (1.4142)	mem 20675MB
[2025-04-03 00:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][20/573]	eta 0:10:25 lr 0.000015	time 0.8784 (1.1318)	loss 0.6927 (0.6930)	grad_norm 1.6757 (1.4594)	mem 20675MB
[2025-04-03 00:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][22/573]	eta 0:10:11 lr 0.000016	time 0.8786 (1.1098)	loss 0.6913 (0.6929)	grad_norm 1.3450 (1.5239)	mem 20675MB
[2025-04-03 00:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][24/573]	eta 0:09:59 lr 0.000018	time 0.8783 (1.0913)	loss 0.6911 (0.6927)	grad_norm 0.9146 (1.4723)	mem 20675MB
[2025-04-03 00:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][26/573]	eta 0:09:48 lr 0.000019	time 0.8790 (1.0757)	loss 0.6917 (0.6926)	grad_norm 1.2716 (1.4835)	mem 20675MB
[2025-04-03 00:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][28/573]	eta 0:09:38 lr 0.000021	time 0.8785 (1.0621)	loss 0.6898 (0.6924)	grad_norm 1.1812 (1.4591)	mem 20675MB
[2025-04-03 00:30:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][30/573]	eta 0:09:30 lr 0.000022	time 0.8787 (1.0504)	loss 0.6869 (0.6922)	grad_norm 2.0964 (1.4579)	mem 20675MB
[2025-04-03 00:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][32/573]	eta 0:09:22 lr 0.000024	time 0.8787 (1.0401)	loss 0.6888 (0.6920)	grad_norm 1.1033 (1.4408)	mem 20675MB
[2025-04-03 00:30:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][34/573]	eta 0:09:15 lr 0.000025	time 0.8784 (1.0310)	loss 0.6977 (0.6921)	grad_norm 3.4552 (1.4910)	mem 20675MB
[2025-04-03 00:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][36/573]	eta 0:09:09 lr 0.000026	time 0.8785 (1.0230)	loss 0.6832 (0.6919)	grad_norm 1.7955 (1.5050)	mem 20675MB
[2025-04-03 00:30:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][38/573]	eta 0:09:03 lr 0.000028	time 0.8783 (1.0156)	loss 0.6895 (0.6917)	grad_norm 2.9641 (1.5309)	mem 20675MB
[2025-04-03 00:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][40/573]	eta 0:08:57 lr 0.000029	time 0.8783 (1.0090)	loss 0.6849 (0.6913)	grad_norm 1.3962 (1.5260)	mem 20675MB
[2025-04-03 00:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][42/573]	eta 0:08:52 lr 0.000031	time 0.8787 (1.0031)	loss 0.6786 (0.6911)	grad_norm 1.6443 (1.5239)	mem 20675MB
[2025-04-03 00:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][44/573]	eta 0:08:47 lr 0.000032	time 0.8788 (0.9976)	loss 0.6595 (0.6903)	grad_norm 3.4403 (1.5533)	mem 20675MB
[2025-04-03 00:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][46/573]	eta 0:08:43 lr 0.000034	time 0.8787 (0.9926)	loss 0.6817 (0.6899)	grad_norm 1.7981 (1.5480)	mem 20675MB
[2025-04-03 00:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][48/573]	eta 0:08:38 lr 0.000035	time 0.8797 (0.9882)	loss 0.6848 (0.6895)	grad_norm 0.9108 (1.5247)	mem 20675MB
[2025-04-03 00:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][50/573]	eta 0:08:34 lr 0.000037	time 0.8801 (0.9839)	loss 0.6703 (0.6886)	grad_norm 1.1330 (1.5210)	mem 20675MB
[2025-04-03 00:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][52/573]	eta 0:08:30 lr 0.000038	time 0.8783 (0.9800)	loss 0.6465 (0.6874)	grad_norm 1.9864 (1.5469)	mem 20675MB
[2025-04-03 00:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][54/573]	eta 0:08:26 lr 0.000040	time 0.8785 (0.9763)	loss 0.6407 (0.6858)	grad_norm 1.7410 (1.5612)	mem 20675MB
[2025-04-03 00:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][56/573]	eta 0:08:23 lr 0.000041	time 0.8782 (0.9730)	loss 0.6737 (0.6854)	grad_norm 1.5559 (1.5493)	mem 20675MB
[2025-04-03 00:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][58/573]	eta 0:08:19 lr 0.000042	time 0.8806 (0.9698)	loss 0.6733 (0.6850)	grad_norm 1.3683 (1.5713)	mem 20675MB
[2025-04-03 00:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][60/573]	eta 0:08:16 lr 0.000044	time 0.8784 (0.9669)	loss 0.6645 (0.6847)	grad_norm 3.2784 (1.6108)	mem 20675MB
[2025-04-03 00:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][62/573]	eta 0:08:12 lr 0.000045	time 0.8787 (0.9642)	loss 0.6451 (0.6837)	grad_norm 1.7002 (1.5990)	mem 20675MB
[2025-04-03 00:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][64/573]	eta 0:08:09 lr 0.000047	time 0.8825 (0.9616)	loss 0.6626 (0.6827)	grad_norm 1.3296 (1.5846)	mem 20675MB
[2025-04-03 00:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][66/573]	eta 0:08:06 lr 0.000048	time 0.8838 (0.9593)	loss 0.6396 (0.6815)	grad_norm 1.8930 (1.6074)	mem 20675MB
[2025-04-03 00:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][68/573]	eta 0:08:03 lr 0.000050	time 0.8783 (0.9570)	loss 0.6574 (0.6806)	grad_norm 1.0140 (1.6068)	mem 20675MB
[2025-04-03 00:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][70/573]	eta 0:08:00 lr 0.000051	time 0.8782 (0.9548)	loss 0.6287 (0.6789)	grad_norm 2.1965 (1.6152)	mem 20675MB
[2025-04-03 00:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][72/573]	eta 0:07:57 lr 0.000053	time 0.8786 (0.9528)	loss 0.6991 (0.6795)	grad_norm 2.1347 (1.6495)	mem 20675MB
[2025-04-03 00:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][74/573]	eta 0:07:54 lr 0.000054	time 0.8780 (0.9509)	loss 0.6299 (0.6786)	grad_norm 1.4919 (1.6450)	mem 20675MB
[2025-04-03 00:31:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][76/573]	eta 0:07:51 lr 0.000056	time 0.8794 (0.9490)	loss 0.6447 (0.6780)	grad_norm 0.9372 (1.6269)	mem 20675MB
[2025-04-03 00:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][78/573]	eta 0:07:48 lr 0.000057	time 0.8777 (0.9472)	loss 0.6449 (0.6771)	grad_norm 1.6665 (1.6373)	mem 20675MB
[2025-04-03 00:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][80/573]	eta 0:07:46 lr 0.000058	time 0.8778 (0.9455)	loss 0.6371 (0.6765)	grad_norm 1.9422 (1.6325)	mem 20675MB
[2025-04-03 00:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][82/573]	eta 0:07:43 lr 0.000060	time 0.8782 (0.9439)	loss 0.5995 (0.6749)	grad_norm 2.0037 (1.6400)	mem 20675MB
[2025-04-03 00:31:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][84/573]	eta 0:07:40 lr 0.000061	time 0.8780 (0.9424)	loss 0.6023 (0.6744)	grad_norm 1.7320 (1.6472)	mem 20675MB
[2025-04-03 00:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][86/573]	eta 0:07:38 lr 0.000063	time 0.8778 (0.9409)	loss 0.6345 (0.6732)	grad_norm 2.8155 (1.6618)	mem 20675MB
[2025-04-03 00:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][88/573]	eta 0:07:35 lr 0.000064	time 0.8780 (0.9395)	loss 0.6556 (0.6726)	grad_norm 2.0538 (1.6593)	mem 20675MB
[2025-04-03 00:31:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][90/573]	eta 0:07:33 lr 0.000066	time 0.8779 (0.9382)	loss 0.6808 (0.6723)	grad_norm 2.6019 (1.6761)	mem 20675MB
[2025-04-03 00:31:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][92/573]	eta 0:07:30 lr 0.000067	time 0.8781 (0.9369)	loss 0.6766 (0.6716)	grad_norm 2.2939 (1.6942)	mem 20675MB
[2025-04-03 00:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][94/573]	eta 0:07:28 lr 0.000069	time 0.8777 (0.9357)	loss 0.6369 (0.6712)	grad_norm 1.2309 (1.7203)	mem 20675MB
[2025-04-03 00:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][96/573]	eta 0:07:25 lr 0.000070	time 0.8777 (0.9345)	loss 0.6206 (0.6704)	grad_norm 1.4107 (1.7160)	mem 20675MB
[2025-04-03 00:31:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][98/573]	eta 0:07:23 lr 0.000071	time 0.8778 (0.9334)	loss 0.6692 (0.6704)	grad_norm 1.9232 (1.7191)	mem 20675MB
[2025-04-03 00:31:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][100/573]	eta 0:07:20 lr 0.000073	time 0.8780 (0.9323)	loss 0.6467 (0.6702)	grad_norm 1.0487 (1.7168)	mem 20675MB
[2025-04-03 00:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][102/573]	eta 0:07:18 lr 0.000074	time 0.8780 (0.9313)	loss 0.6265 (0.6698)	grad_norm 1.7372 (1.7194)	mem 20675MB
[2025-04-03 00:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][104/573]	eta 0:07:16 lr 0.000076	time 0.8777 (0.9303)	loss 0.6467 (0.6692)	grad_norm 3.7069 (1.7359)	mem 20675MB
[2025-04-03 00:31:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][106/573]	eta 0:07:13 lr 0.000077	time 0.8782 (0.9293)	loss 0.6668 (0.6692)	grad_norm 1.4241 (1.7404)	mem 20675MB
[2025-04-03 00:32:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][108/573]	eta 0:07:11 lr 0.000079	time 0.8781 (0.9284)	loss 0.6441 (0.6685)	grad_norm 1.8681 (1.7433)	mem 20675MB
[2025-04-03 00:32:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][110/573]	eta 0:07:09 lr 0.000080	time 0.8778 (0.9275)	loss 0.6287 (0.6677)	grad_norm 1.6289 (1.7387)	mem 20675MB
[2025-04-03 00:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][112/573]	eta 0:07:07 lr 0.000082	time 0.8781 (0.9266)	loss 0.6691 (0.6673)	grad_norm 2.2118 (1.7449)	mem 20675MB
[2025-04-03 00:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][114/573]	eta 0:07:04 lr 0.000083	time 0.8778 (0.9258)	loss 0.5865 (0.6664)	grad_norm 2.8088 (1.7599)	mem 20675MB
[2025-04-03 00:32:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][116/573]	eta 0:07:02 lr 0.000085	time 0.8776 (0.9250)	loss 0.6104 (0.6654)	grad_norm 1.7641 (1.7579)	mem 20675MB
[2025-04-03 00:32:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][118/573]	eta 0:07:00 lr 0.000086	time 0.8778 (0.9242)	loss 0.6561 (0.6656)	grad_norm 1.7080 (1.7739)	mem 20675MB
[2025-04-03 00:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][120/573]	eta 0:06:58 lr 0.000087	time 0.8778 (0.9235)	loss 0.6221 (0.6654)	grad_norm 1.7304 (1.7875)	mem 20675MB
[2025-04-03 00:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][122/573]	eta 0:06:56 lr 0.000089	time 0.8776 (0.9227)	loss 0.6291 (0.6647)	grad_norm 0.9561 (1.7809)	mem 20675MB
[2025-04-03 00:32:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][124/573]	eta 0:06:53 lr 0.000090	time 0.8778 (0.9220)	loss 0.6502 (0.6641)	grad_norm 1.2835 (1.7781)	mem 20675MB
[2025-04-03 00:32:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][126/573]	eta 0:06:51 lr 0.000092	time 0.8778 (0.9213)	loss 0.6739 (0.6635)	grad_norm 2.4163 (1.7892)	mem 20675MB
[2025-04-03 00:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][128/573]	eta 0:06:49 lr 0.000093	time 0.8778 (0.9207)	loss 0.6174 (0.6626)	grad_norm 2.6756 (1.7975)	mem 20675MB
[2025-04-03 00:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][130/573]	eta 0:06:47 lr 0.000095	time 0.8776 (0.9200)	loss 0.6335 (0.6623)	grad_norm 1.2741 (1.7989)	mem 20675MB
[2025-04-03 00:32:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][132/573]	eta 0:06:45 lr 0.000096	time 0.8776 (0.9194)	loss 0.6325 (0.6616)	grad_norm 1.1067 (1.7937)	mem 20675MB
[2025-04-03 00:32:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][134/573]	eta 0:06:43 lr 0.000098	time 0.8777 (0.9188)	loss 0.6284 (0.6613)	grad_norm 2.8432 (1.8019)	mem 20675MB
[2025-04-03 00:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][136/573]	eta 0:06:41 lr 0.000099	time 0.8777 (0.9182)	loss 0.6641 (0.6611)	grad_norm 2.1132 (1.8122)	mem 20675MB
[2025-04-03 00:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][138/573]	eta 0:06:39 lr 0.000101	time 0.8778 (0.9176)	loss 0.6767 (0.6607)	grad_norm 2.5992 (1.8231)	mem 20675MB
[2025-04-03 00:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][140/573]	eta 0:06:37 lr 0.000102	time 0.8776 (0.9171)	loss 0.6399 (0.6600)	grad_norm 3.5659 (1.8469)	mem 20675MB
[2025-04-03 00:32:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][142/573]	eta 0:06:35 lr 0.000103	time 0.8775 (0.9165)	loss 0.6209 (0.6596)	grad_norm 1.3187 (1.8475)	mem 20675MB
[2025-04-03 00:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][144/573]	eta 0:06:32 lr 0.000105	time 0.8776 (0.9160)	loss 0.6953 (0.6598)	grad_norm 2.7893 (1.8711)	mem 20675MB
[2025-04-03 00:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][146/573]	eta 0:06:30 lr 0.000106	time 0.8776 (0.9155)	loss 0.6062 (0.6592)	grad_norm 2.5538 (1.8836)	mem 20675MB
[2025-04-03 00:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][148/573]	eta 0:06:28 lr 0.000108	time 0.8778 (0.9150)	loss 0.6052 (0.6587)	grad_norm 1.9901 (1.8806)	mem 20675MB
[2025-04-03 00:32:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][150/573]	eta 0:06:26 lr 0.000109	time 0.8774 (0.9145)	loss 0.5874 (0.6581)	grad_norm 2.8241 (1.8946)	mem 20675MB
[2025-04-03 00:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][152/573]	eta 0:06:24 lr 0.000111	time 0.8777 (0.9141)	loss 0.5726 (0.6577)	grad_norm 1.9632 (1.9091)	mem 20675MB
[2025-04-03 00:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][154/573]	eta 0:06:22 lr 0.000112	time 0.8775 (0.9136)	loss 0.6135 (0.6571)	grad_norm 2.9444 (1.9180)	mem 20675MB
[2025-04-03 00:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][156/573]	eta 0:06:20 lr 0.000114	time 0.8775 (0.9132)	loss 0.5705 (0.6563)	grad_norm 1.9513 (1.9148)	mem 20675MB
[2025-04-03 00:32:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][158/573]	eta 0:06:18 lr 0.000115	time 0.8778 (0.9127)	loss 0.7043 (0.6563)	grad_norm 6.7194 (1.9605)	mem 20675MB
[2025-04-03 00:32:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][160/573]	eta 0:06:16 lr 0.000117	time 0.8775 (0.9123)	loss 0.6551 (0.6563)	grad_norm 2.2542 (1.9669)	mem 20675MB
[2025-04-03 00:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][162/573]	eta 0:06:14 lr 0.000118	time 0.8778 (0.9119)	loss 0.5654 (0.6557)	grad_norm 2.6441 (1.9694)	mem 20675MB
[2025-04-03 00:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][164/573]	eta 0:06:12 lr 0.000119	time 0.8775 (0.9115)	loss 0.6464 (0.6555)	grad_norm 2.0507 (1.9854)	mem 20675MB
[2025-04-03 00:32:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][166/573]	eta 0:06:10 lr 0.000121	time 0.8779 (0.9111)	loss 0.6702 (0.6555)	grad_norm 2.3940 (1.9870)	mem 20675MB
[2025-04-03 00:32:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][168/573]	eta 0:06:08 lr 0.000122	time 0.8773 (0.9107)	loss 0.6299 (0.6555)	grad_norm 3.0775 (1.9937)	mem 20675MB
[2025-04-03 00:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][170/573]	eta 0:06:06 lr 0.000124	time 0.8776 (0.9103)	loss 0.5894 (0.6548)	grad_norm 2.3091 (1.9969)	mem 20675MB
[2025-04-03 00:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][172/573]	eta 0:06:04 lr 0.000125	time 0.8775 (0.9099)	loss 0.6866 (0.6548)	grad_norm 1.9630 (1.9987)	mem 20675MB
[2025-04-03 00:32:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][174/573]	eta 0:06:02 lr 0.000127	time 0.8777 (0.9096)	loss 0.6344 (0.6542)	grad_norm 1.5159 (2.0019)	mem 20675MB
[2025-04-03 00:32:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][176/573]	eta 0:06:00 lr 0.000128	time 0.8775 (0.9092)	loss 0.5736 (0.6537)	grad_norm 2.6239 (2.0048)	mem 20675MB
[2025-04-03 00:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][178/573]	eta 0:05:59 lr 0.000130	time 0.8775 (0.9089)	loss 0.6366 (0.6531)	grad_norm 2.3624 (2.0085)	mem 20675MB
[2025-04-03 00:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][180/573]	eta 0:05:57 lr 0.000131	time 0.8774 (0.9086)	loss 0.5897 (0.6523)	grad_norm 1.9338 (2.0115)	mem 20675MB
[2025-04-03 00:33:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][182/573]	eta 0:05:55 lr 0.000133	time 0.8777 (0.9082)	loss 0.6605 (0.6524)	grad_norm 3.0340 (2.0256)	mem 20675MB
[2025-04-03 00:33:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][184/573]	eta 0:05:53 lr 0.000134	time 0.8775 (0.9079)	loss 0.6087 (0.6519)	grad_norm 1.9494 (2.0309)	mem 20675MB
[2025-04-03 00:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][186/573]	eta 0:05:51 lr 0.000135	time 0.8776 (0.9076)	loss 0.6180 (0.6517)	grad_norm 2.2383 (2.0353)	mem 20675MB
[2025-04-03 00:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][188/573]	eta 0:05:49 lr 0.000137	time 0.8776 (0.9073)	loss 0.7505 (0.6522)	grad_norm 6.3494 (2.0665)	mem 20675MB
[2025-04-03 00:33:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][190/573]	eta 0:05:47 lr 0.000138	time 0.8773 (0.9070)	loss 0.5830 (0.6514)	grad_norm 2.3404 (2.0695)	mem 20675MB
[2025-04-03 00:33:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][192/573]	eta 0:05:45 lr 0.000140	time 0.8781 (0.9067)	loss 0.6554 (0.6515)	grad_norm 2.2785 (2.0770)	mem 20675MB
[2025-04-03 00:33:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][194/573]	eta 0:05:43 lr 0.000141	time 0.8772 (0.9064)	loss 0.5698 (0.6509)	grad_norm 3.6197 (2.0835)	mem 20675MB
[2025-04-03 00:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][196/573]	eta 0:05:41 lr 0.000143	time 0.8775 (0.9061)	loss 0.6282 (0.6505)	grad_norm 1.4082 (2.0876)	mem 20675MB
[2025-04-03 00:33:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][198/573]	eta 0:05:39 lr 0.000144	time 0.8772 (0.9058)	loss 0.5828 (0.6498)	grad_norm 2.1922 (2.0880)	mem 20675MB
[2025-04-03 00:33:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][200/573]	eta 0:05:37 lr 0.000146	time 0.8772 (0.9056)	loss 0.5903 (0.6494)	grad_norm 1.6423 (2.0912)	mem 20675MB
[2025-04-03 00:33:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][202/573]	eta 0:05:35 lr 0.000147	time 0.8776 (0.9053)	loss 0.5781 (0.6494)	grad_norm 3.7749 (2.1114)	mem 20675MB
[2025-04-03 00:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][204/573]	eta 0:05:33 lr 0.000149	time 0.8788 (0.9050)	loss 0.5639 (0.6489)	grad_norm 3.3442 (2.1190)	mem 20675MB
[2025-04-03 00:33:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][206/573]	eta 0:05:32 lr 0.000150	time 0.8777 (0.9048)	loss 0.5829 (0.6480)	grad_norm 3.1769 (2.1279)	mem 20675MB
[2025-04-03 00:33:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][208/573]	eta 0:05:30 lr 0.000151	time 0.8776 (0.9045)	loss 0.6128 (0.6477)	grad_norm 3.0344 (2.1340)	mem 20675MB
[2025-04-03 00:33:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][210/573]	eta 0:05:28 lr 0.000153	time 0.8774 (0.9043)	loss 0.6315 (0.6474)	grad_norm 2.9071 (2.1398)	mem 20675MB
[2025-04-03 00:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][212/573]	eta 0:05:26 lr 0.000154	time 0.8772 (0.9040)	loss 0.5814 (0.6469)	grad_norm 2.5894 (2.1436)	mem 20675MB
[2025-04-03 00:33:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][214/573]	eta 0:05:24 lr 0.000156	time 0.8777 (0.9038)	loss 0.6019 (0.6466)	grad_norm 2.3187 (2.1508)	mem 20675MB
[2025-04-03 00:33:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][216/573]	eta 0:05:22 lr 0.000157	time 0.8776 (0.9036)	loss 0.6464 (0.6461)	grad_norm 2.7867 (2.1536)	mem 20675MB
[2025-04-03 00:33:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][218/573]	eta 0:05:20 lr 0.000159	time 0.8775 (0.9033)	loss 0.6184 (0.6455)	grad_norm 3.0268 (2.1602)	mem 20675MB
[2025-04-03 00:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][220/573]	eta 0:05:18 lr 0.000160	time 0.8777 (0.9031)	loss 0.6610 (0.6455)	grad_norm 3.0862 (2.1746)	mem 20675MB
[2025-04-03 00:33:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][222/573]	eta 0:05:16 lr 0.000162	time 0.8779 (0.9029)	loss 0.5564 (0.6452)	grad_norm 1.8113 (2.1791)	mem 20675MB
[2025-04-03 00:33:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][224/573]	eta 0:05:15 lr 0.000163	time 0.8775 (0.9027)	loss 0.6394 (0.6451)	grad_norm 3.7527 (2.1828)	mem 20675MB
[2025-04-03 00:33:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][226/573]	eta 0:05:13 lr 0.000165	time 0.8773 (0.9025)	loss 0.6181 (0.6446)	grad_norm 1.3386 (2.1798)	mem 20675MB
[2025-04-03 00:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][228/573]	eta 0:05:11 lr 0.000166	time 0.8780 (0.9022)	loss 0.5844 (0.6443)	grad_norm 2.2550 (2.1770)	mem 20675MB
[2025-04-03 00:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][230/573]	eta 0:05:09 lr 0.000167	time 0.8776 (0.9020)	loss 0.5845 (0.6440)	grad_norm 2.4941 (2.1744)	mem 20675MB
[2025-04-03 00:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][232/573]	eta 0:05:07 lr 0.000169	time 0.8771 (0.9018)	loss 0.5781 (0.6437)	grad_norm 3.9243 (2.1845)	mem 20675MB
[2025-04-03 00:33:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][234/573]	eta 0:05:05 lr 0.000170	time 0.8772 (0.9016)	loss 0.6578 (0.6434)	grad_norm 2.3412 (2.1881)	mem 20675MB
[2025-04-03 00:33:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][236/573]	eta 0:05:03 lr 0.000172	time 0.8777 (0.9014)	loss 0.6179 (0.6434)	grad_norm 2.4109 (2.1961)	mem 20675MB
[2025-04-03 00:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][238/573]	eta 0:05:01 lr 0.000173	time 0.8776 (0.9012)	loss 0.5567 (0.6432)	grad_norm 2.8034 (2.2070)	mem 20675MB
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][240/573]	eta 0:05:00 lr 0.000175	time 0.8776 (0.9010)	loss 0.6335 (0.6433)	grad_norm 2.7728 (2.2075)	mem 20675MB
[2025-04-03 00:33:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][242/573]	eta 0:04:58 lr 0.000176	time 0.8773 (0.9009)	loss 0.6425 (0.6431)	grad_norm 4.2987 (2.2163)	mem 20675MB
[2025-04-03 00:33:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][244/573]	eta 0:04:56 lr 0.000178	time 0.8774 (0.9007)	loss 0.6345 (0.6428)	grad_norm 2.2073 (2.2164)	mem 20675MB
[2025-04-03 00:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][246/573]	eta 0:04:54 lr 0.000179	time 0.8774 (0.9005)	loss 0.6508 (0.6425)	grad_norm 4.1333 (2.2252)	mem 20675MB
[2025-04-03 00:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][248/573]	eta 0:04:52 lr 0.000181	time 0.8777 (0.9003)	loss 0.5651 (0.6419)	grad_norm 2.3358 (2.2246)	mem 20675MB
[2025-04-03 00:34:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][250/573]	eta 0:04:50 lr 0.000182	time 0.8775 (0.9001)	loss 0.6275 (0.6415)	grad_norm 2.5755 (2.2301)	mem 20675MB
[2025-04-03 00:34:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][252/573]	eta 0:04:48 lr 0.000183	time 0.8773 (0.9000)	loss 0.6420 (0.6416)	grad_norm 2.8925 (2.2393)	mem 20675MB
[2025-04-03 00:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][254/573]	eta 0:04:47 lr 0.000185	time 0.8775 (0.8998)	loss 0.6087 (0.6412)	grad_norm 3.9741 (2.2550)	mem 20675MB
[2025-04-03 00:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][256/573]	eta 0:04:45 lr 0.000186	time 0.8778 (0.8996)	loss 0.5771 (0.6408)	grad_norm 3.8661 (2.2684)	mem 20675MB
[2025-04-03 00:34:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][258/573]	eta 0:04:43 lr 0.000188	time 0.8774 (0.8995)	loss 0.6427 (0.6408)	grad_norm 2.9198 (2.2669)	mem 20675MB
[2025-04-03 00:34:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][260/573]	eta 0:04:41 lr 0.000189	time 0.8776 (0.8993)	loss 0.6248 (0.6407)	grad_norm 1.5568 (2.2672)	mem 20675MB
[2025-04-03 00:34:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][262/573]	eta 0:04:39 lr 0.000191	time 0.8776 (0.8992)	loss 0.6124 (0.6405)	grad_norm 2.1452 (2.2650)	mem 20675MB
[2025-04-03 00:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][264/573]	eta 0:04:37 lr 0.000192	time 0.8776 (0.8990)	loss 0.5483 (0.6401)	grad_norm 2.5892 (2.2712)	mem 20675MB
[2025-04-03 00:34:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][266/573]	eta 0:04:35 lr 0.000194	time 0.8776 (0.8988)	loss 0.5763 (0.6396)	grad_norm 2.6548 (2.2711)	mem 20675MB
[2025-04-03 00:34:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][268/573]	eta 0:04:34 lr 0.000195	time 0.8775 (0.8987)	loss 0.5618 (0.6392)	grad_norm 5.4619 (2.2878)	mem 20675MB
[2025-04-03 00:34:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][270/573]	eta 0:04:32 lr 0.000197	time 0.8775 (0.8985)	loss 0.6272 (0.6394)	grad_norm 4.4999 (2.3021)	mem 20675MB
[2025-04-03 00:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][272/573]	eta 0:04:30 lr 0.000198	time 0.8774 (0.8984)	loss 0.6828 (0.6393)	grad_norm 3.1203 (2.3051)	mem 20675MB
[2025-04-03 00:34:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][274/573]	eta 0:04:28 lr 0.000199	time 0.8772 (0.8982)	loss 0.6355 (0.6392)	grad_norm 3.7793 (2.3117)	mem 20675MB
[2025-04-03 00:34:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][276/573]	eta 0:04:26 lr 0.000201	time 0.8772 (0.8981)	loss 0.6151 (0.6392)	grad_norm 1.4481 (2.3063)	mem 20675MB
[2025-04-03 00:34:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][278/573]	eta 0:04:24 lr 0.000202	time 0.8775 (0.8980)	loss 0.6788 (0.6391)	grad_norm 4.6803 (2.3132)	mem 20675MB
[2025-04-03 00:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][280/573]	eta 0:04:23 lr 0.000204	time 0.8772 (0.8978)	loss 0.6241 (0.6392)	grad_norm 3.0574 (2.3220)	mem 20675MB
[2025-04-03 00:34:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][282/573]	eta 0:04:21 lr 0.000205	time 0.8772 (0.8977)	loss 0.6104 (0.6389)	grad_norm 2.6812 (2.3238)	mem 20675MB
[2025-04-03 00:34:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][284/573]	eta 0:04:19 lr 0.000207	time 0.8772 (0.8975)	loss 0.6600 (0.6386)	grad_norm 2.5675 (2.3257)	mem 20675MB
[2025-04-03 00:34:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][286/573]	eta 0:04:17 lr 0.000208	time 0.8774 (0.8974)	loss 0.6713 (0.6385)	grad_norm 3.8491 (2.3332)	mem 20675MB
[2025-04-03 00:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][288/573]	eta 0:04:15 lr 0.000210	time 0.8772 (0.8973)	loss 0.5873 (0.6385)	grad_norm 2.4789 (2.3367)	mem 20675MB
[2025-04-03 00:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][290/573]	eta 0:04:13 lr 0.000211	time 0.8773 (0.8971)	loss 0.5716 (0.6381)	grad_norm 2.6547 (2.3461)	mem 20675MB
[2025-04-03 00:34:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][292/573]	eta 0:04:12 lr 0.000213	time 0.8773 (0.8970)	loss 0.5596 (0.6376)	grad_norm 3.3094 (2.3483)	mem 20675MB
[2025-04-03 00:34:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][294/573]	eta 0:04:10 lr 0.000214	time 0.8772 (0.8969)	loss 0.6185 (0.6373)	grad_norm 2.7597 (2.3564)	mem 20675MB
[2025-04-03 00:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][296/573]	eta 0:04:08 lr 0.000215	time 0.8772 (0.8968)	loss 0.5684 (0.6368)	grad_norm 5.8740 (2.3722)	mem 20675MB
[2025-04-03 00:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][298/573]	eta 0:04:06 lr 0.000217	time 0.8772 (0.8966)	loss 0.5950 (0.6365)	grad_norm 1.9044 (2.3817)	mem 20675MB
[2025-04-03 00:34:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][300/573]	eta 0:04:04 lr 0.000218	time 0.8772 (0.8965)	loss 0.5881 (0.6361)	grad_norm 2.2866 (2.3821)	mem 20675MB
[2025-04-03 00:34:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][302/573]	eta 0:04:02 lr 0.000220	time 0.8775 (0.8964)	loss 0.6456 (0.6361)	grad_norm 4.9301 (2.3920)	mem 20675MB
[2025-04-03 00:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][304/573]	eta 0:04:01 lr 0.000221	time 0.8773 (0.8963)	loss 0.5514 (0.6360)	grad_norm 2.5637 (2.4024)	mem 20675MB
[2025-04-03 00:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][306/573]	eta 0:03:59 lr 0.000223	time 0.8777 (0.8961)	loss 0.6415 (0.6358)	grad_norm 2.8053 (2.4045)	mem 20675MB
[2025-04-03 00:34:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][308/573]	eta 0:03:57 lr 0.000224	time 0.8773 (0.8960)	loss 0.6290 (0.6356)	grad_norm 3.0141 (2.4068)	mem 20675MB
[2025-04-03 00:34:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][310/573]	eta 0:03:55 lr 0.000226	time 0.8773 (0.8959)	loss 0.6327 (0.6355)	grad_norm 1.2328 (2.4022)	mem 20675MB
[2025-04-03 00:34:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][312/573]	eta 0:03:53 lr 0.000227	time 0.8774 (0.8958)	loss 0.6212 (0.6355)	grad_norm 1.1429 (2.3950)	mem 20675MB
[2025-04-03 00:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][314/573]	eta 0:03:51 lr 0.000229	time 0.8773 (0.8957)	loss 0.6118 (0.6353)	grad_norm 1.0798 (2.3873)	mem 20675MB
[2025-04-03 00:35:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][316/573]	eta 0:03:50 lr 0.000230	time 0.8772 (0.8956)	loss 0.6222 (0.6353)	grad_norm 2.0676 (2.3863)	mem 20675MB
[2025-04-03 00:35:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][318/573]	eta 0:03:48 lr 0.000231	time 0.8774 (0.8955)	loss 0.6786 (0.6354)	grad_norm 3.2055 (2.3877)	mem 20675MB
[2025-04-03 00:35:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][320/573]	eta 0:03:46 lr 0.000233	time 0.8779 (0.8954)	loss 0.6256 (0.6351)	grad_norm 2.0420 (2.3880)	mem 20675MB
[2025-04-03 00:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][322/573]	eta 0:03:44 lr 0.000234	time 0.8772 (0.8953)	loss 0.6179 (0.6349)	grad_norm 2.3554 (2.3856)	mem 20675MB
[2025-04-03 00:35:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][324/573]	eta 0:03:42 lr 0.000236	time 0.8774 (0.8952)	loss 0.6330 (0.6349)	grad_norm 2.2542 (2.3828)	mem 20675MB
[2025-04-03 00:35:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][326/573]	eta 0:03:41 lr 0.000237	time 0.8776 (0.8950)	loss 0.5404 (0.6345)	grad_norm 2.6977 (2.3862)	mem 20675MB
[2025-04-03 00:35:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][328/573]	eta 0:03:39 lr 0.000239	time 0.8772 (0.8949)	loss 0.5718 (0.6343)	grad_norm 3.2525 (2.3893)	mem 20675MB
[2025-04-03 00:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][330/573]	eta 0:03:37 lr 0.000240	time 0.8778 (0.8948)	loss 0.5548 (0.6341)	grad_norm 2.4461 (2.3902)	mem 20675MB
[2025-04-03 00:35:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][332/573]	eta 0:03:35 lr 0.000242	time 0.8774 (0.8947)	loss 0.6361 (0.6341)	grad_norm 2.9292 (2.3905)	mem 20675MB
[2025-04-03 00:35:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][334/573]	eta 0:03:33 lr 0.000243	time 0.8777 (0.8947)	loss 0.5998 (0.6338)	grad_norm 2.3153 (2.3907)	mem 20675MB
[2025-04-03 00:35:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][336/573]	eta 0:03:32 lr 0.000245	time 0.8773 (0.8946)	loss 0.5896 (0.6337)	grad_norm 2.0615 (2.3904)	mem 20675MB
[2025-04-03 00:35:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][338/573]	eta 0:03:30 lr 0.000246	time 0.8771 (0.8945)	loss 0.5971 (0.6336)	grad_norm 2.5603 (2.3926)	mem 20675MB
[2025-04-03 00:35:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][340/573]	eta 0:03:28 lr 0.000247	time 0.8772 (0.8944)	loss 0.6774 (0.6336)	grad_norm 4.1910 (2.4006)	mem 20675MB
[2025-04-03 00:35:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][342/573]	eta 0:03:26 lr 0.000249	time 0.8770 (0.8943)	loss 0.6406 (0.6338)	grad_norm 2.4635 (2.4009)	mem 20675MB
[2025-04-03 00:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][344/573]	eta 0:03:24 lr 0.000250	time 0.8783 (0.8942)	loss 0.5465 (0.6336)	grad_norm 2.5420 (2.4071)	mem 20675MB
[2025-04-03 00:35:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][346/573]	eta 0:03:22 lr 0.000252	time 0.8774 (0.8941)	loss 0.5914 (0.6335)	grad_norm 3.8881 (2.4135)	mem 20675MB
[2025-04-03 00:35:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][348/573]	eta 0:03:21 lr 0.000253	time 0.8777 (0.8940)	loss 0.6331 (0.6334)	grad_norm 1.7251 (2.4124)	mem 20675MB
[2025-04-03 00:35:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][350/573]	eta 0:03:19 lr 0.000255	time 0.8774 (0.8939)	loss 0.6417 (0.6333)	grad_norm 3.3124 (2.4143)	mem 20675MB
[2025-04-03 00:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][352/573]	eta 0:03:17 lr 0.000256	time 0.8774 (0.8938)	loss 0.6558 (0.6333)	grad_norm 2.4043 (2.4158)	mem 20675MB
[2025-04-03 00:35:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][354/573]	eta 0:03:15 lr 0.000258	time 0.8773 (0.8937)	loss 0.5750 (0.6330)	grad_norm 3.1067 (2.4184)	mem 20675MB
[2025-04-03 00:35:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][356/573]	eta 0:03:13 lr 0.000259	time 0.8773 (0.8936)	loss 0.5329 (0.6324)	grad_norm 2.5647 (2.4202)	mem 20675MB
[2025-04-03 00:35:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][358/573]	eta 0:03:12 lr 0.000261	time 0.8775 (0.8936)	loss 0.5939 (0.6321)	grad_norm 4.8307 (2.4288)	mem 20675MB
[2025-04-03 00:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][360/573]	eta 0:03:10 lr 0.000262	time 0.8792 (0.8935)	loss 0.5422 (0.6319)	grad_norm 2.2929 (2.4328)	mem 20675MB
[2025-04-03 00:35:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][362/573]	eta 0:03:08 lr 0.000263	time 0.8775 (0.8934)	loss 0.6450 (0.6321)	grad_norm 4.4543 (2.4414)	mem 20675MB
[2025-04-03 00:35:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][364/573]	eta 0:03:06 lr 0.000265	time 0.8773 (0.8933)	loss 0.6253 (0.6319)	grad_norm 2.5749 (2.4428)	mem 20675MB
[2025-04-03 00:35:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][366/573]	eta 0:03:04 lr 0.000266	time 0.8774 (0.8932)	loss 0.5789 (0.6318)	grad_norm 3.7590 (2.4441)	mem 20675MB
[2025-04-03 00:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][368/573]	eta 0:03:03 lr 0.000268	time 0.8771 (0.8931)	loss 0.6306 (0.6316)	grad_norm 5.1101 (2.4545)	mem 20675MB
[2025-04-03 00:35:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][370/573]	eta 0:03:01 lr 0.000269	time 0.8775 (0.8931)	loss 0.5774 (0.6314)	grad_norm 2.5602 (2.4529)	mem 20675MB
[2025-04-03 00:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][372/573]	eta 0:02:59 lr 0.000271	time 0.8774 (0.8930)	loss 0.6757 (0.6316)	grad_norm 4.9371 (2.4664)	mem 20675MB
[2025-04-03 00:35:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][374/573]	eta 0:02:57 lr 0.000272	time 0.8773 (0.8929)	loss 0.5815 (0.6314)	grad_norm 2.5977 (2.4656)	mem 20675MB
[2025-04-03 00:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][376/573]	eta 0:02:55 lr 0.000274	time 0.8773 (0.8928)	loss 0.6416 (0.6312)	grad_norm 3.3378 (2.4693)	mem 20675MB
[2025-04-03 00:35:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][378/573]	eta 0:02:54 lr 0.000275	time 0.8774 (0.8928)	loss 0.6411 (0.6314)	grad_norm 4.5812 (2.4785)	mem 20675MB
[2025-04-03 00:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][380/573]	eta 0:02:52 lr 0.000277	time 0.8775 (0.8927)	loss 0.5333 (0.6310)	grad_norm 3.2436 (2.4790)	mem 20675MB
[2025-04-03 00:36:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][382/573]	eta 0:02:50 lr 0.000278	time 0.8770 (0.8926)	loss 0.5579 (0.6307)	grad_norm 3.7528 (2.4868)	mem 20675MB
[2025-04-03 00:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][384/573]	eta 0:02:48 lr 0.000279	time 0.8775 (0.8925)	loss 0.5617 (0.6304)	grad_norm 2.7456 (2.4883)	mem 20675MB
[2025-04-03 00:36:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][386/573]	eta 0:02:46 lr 0.000281	time 0.8774 (0.8925)	loss 0.6847 (0.6307)	grad_norm 3.5418 (2.4945)	mem 20675MB
[2025-04-03 00:36:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][388/573]	eta 0:02:45 lr 0.000282	time 0.8774 (0.8924)	loss 0.6332 (0.6306)	grad_norm 4.1305 (2.5024)	mem 20675MB
[2025-04-03 00:36:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][390/573]	eta 0:02:43 lr 0.000284	time 0.8774 (0.8923)	loss 0.6348 (0.6304)	grad_norm 1.8745 (2.5004)	mem 20675MB
[2025-04-03 00:36:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][392/573]	eta 0:02:41 lr 0.000285	time 0.8774 (0.8922)	loss 0.6102 (0.6302)	grad_norm 3.2783 (2.5020)	mem 20675MB
[2025-04-03 00:36:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][394/573]	eta 0:02:39 lr 0.000287	time 0.8772 (0.8922)	loss 0.6240 (0.6303)	grad_norm 1.5741 (2.5132)	mem 20675MB
[2025-04-03 00:36:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][396/573]	eta 0:02:37 lr 0.000288	time 0.8776 (0.8921)	loss 0.5474 (0.6301)	grad_norm 2.5874 (2.5113)	mem 20675MB
[2025-04-03 00:36:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][398/573]	eta 0:02:36 lr 0.000290	time 0.8780 (0.8920)	loss 0.6865 (0.6302)	grad_norm 6.2667 (2.5210)	mem 20675MB
[2025-04-03 00:36:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][400/573]	eta 0:02:34 lr 0.000291	time 0.8772 (0.8920)	loss 0.6508 (0.6301)	grad_norm 1.9217 (2.5177)	mem 20675MB
[2025-04-03 00:36:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][402/573]	eta 0:02:32 lr 0.000293	time 0.8787 (0.8919)	loss 0.5125 (0.6297)	grad_norm 2.5967 (2.5195)	mem 20675MB
[2025-04-03 00:36:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][404/573]	eta 0:02:30 lr 0.000294	time 0.8775 (0.8918)	loss 0.6201 (0.6296)	grad_norm 3.7018 (2.5247)	mem 20675MB
[2025-04-03 00:36:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][406/573]	eta 0:02:28 lr 0.000295	time 0.8772 (0.8918)	loss 0.5999 (0.6296)	grad_norm 2.0166 (2.5251)	mem 20675MB
[2025-04-03 00:36:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][408/573]	eta 0:02:27 lr 0.000297	time 0.8772 (0.8917)	loss 0.5995 (0.6295)	grad_norm 1.8934 (2.5226)	mem 20675MB
[2025-04-03 00:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][410/573]	eta 0:02:25 lr 0.000298	time 0.8773 (0.8916)	loss 0.5997 (0.6293)	grad_norm 1.7988 (2.5195)	mem 20675MB
[2025-04-03 00:36:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][412/573]	eta 0:02:23 lr 0.000300	time 0.8772 (0.8916)	loss 0.6338 (0.6293)	grad_norm 1.8137 (2.5191)	mem 20675MB
[2025-04-03 00:36:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][414/573]	eta 0:02:21 lr 0.000301	time 0.8783 (0.8915)	loss 0.6448 (0.6293)	grad_norm 3.3649 (2.5208)	mem 20675MB
[2025-04-03 00:36:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][416/573]	eta 0:02:19 lr 0.000303	time 0.8774 (0.8914)	loss 0.4884 (0.6287)	grad_norm 2.9023 (2.5213)	mem 20675MB
[2025-04-03 00:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][418/573]	eta 0:02:18 lr 0.000304	time 0.8774 (0.8914)	loss 0.5262 (0.6287)	grad_norm 4.1292 (2.5288)	mem 20675MB
[2025-04-03 00:36:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][420/573]	eta 0:02:16 lr 0.000306	time 0.8775 (0.8913)	loss 0.6360 (0.6287)	grad_norm 2.4260 (2.5312)	mem 20675MB
[2025-04-03 00:36:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][422/573]	eta 0:02:14 lr 0.000307	time 0.8775 (0.8912)	loss 0.5681 (0.6285)	grad_norm 3.9604 (2.5328)	mem 20675MB
[2025-04-03 00:36:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][424/573]	eta 0:02:12 lr 0.000309	time 0.8775 (0.8912)	loss 0.6137 (0.6284)	grad_norm 1.3783 (2.5299)	mem 20675MB
[2025-04-03 00:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][426/573]	eta 0:02:10 lr 0.000310	time 0.8771 (0.8911)	loss 0.6580 (0.6285)	grad_norm 2.1711 (2.5273)	mem 20675MB
[2025-04-03 00:36:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][428/573]	eta 0:02:09 lr 0.000311	time 0.8775 (0.8911)	loss 0.6146 (0.6285)	grad_norm 2.6392 (2.5264)	mem 20675MB
[2025-04-03 00:36:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][430/573]	eta 0:02:07 lr 0.000313	time 0.8776 (0.8910)	loss 0.6190 (0.6284)	grad_norm 1.8906 (2.5234)	mem 20675MB
[2025-04-03 00:36:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][432/573]	eta 0:02:05 lr 0.000314	time 0.8774 (0.8909)	loss 0.6749 (0.6286)	grad_norm 3.0625 (2.5260)	mem 20675MB
[2025-04-03 00:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][434/573]	eta 0:02:03 lr 0.000316	time 0.8774 (0.8909)	loss 0.6364 (0.6286)	grad_norm 2.3399 (2.5228)	mem 20675MB
[2025-04-03 00:36:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][436/573]	eta 0:02:02 lr 0.000317	time 0.8773 (0.8908)	loss 0.6345 (0.6285)	grad_norm 3.9624 (2.5277)	mem 20675MB
[2025-04-03 00:36:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][438/573]	eta 0:02:00 lr 0.000319	time 0.8771 (0.8908)	loss 0.6086 (0.6284)	grad_norm 3.3068 (2.5288)	mem 20675MB
[2025-04-03 00:36:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][440/573]	eta 0:01:58 lr 0.000320	time 0.8776 (0.8907)	loss 0.6639 (0.6283)	grad_norm 4.8962 (2.5346)	mem 20675MB
[2025-04-03 00:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][442/573]	eta 0:01:56 lr 0.000322	time 0.8775 (0.8907)	loss 0.6511 (0.6283)	grad_norm 4.3623 (2.5401)	mem 20675MB
[2025-04-03 00:36:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][444/573]	eta 0:01:54 lr 0.000323	time 0.8771 (0.8906)	loss 0.6357 (0.6283)	grad_norm 4.3826 (2.5460)	mem 20675MB
[2025-04-03 00:36:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][446/573]	eta 0:01:53 lr 0.000325	time 0.8772 (0.8905)	loss 0.5252 (0.6280)	grad_norm 3.0398 (2.5487)	mem 20675MB
[2025-04-03 00:36:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][448/573]	eta 0:01:51 lr 0.000326	time 0.8771 (0.8905)	loss 0.6409 (0.6278)	grad_norm 3.9504 (2.5536)	mem 20675MB
[2025-04-03 00:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][450/573]	eta 0:01:49 lr 0.000327	time 0.8774 (0.8904)	loss 0.5824 (0.6276)	grad_norm 2.9307 (2.5575)	mem 20675MB
[2025-04-03 00:37:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][452/573]	eta 0:01:47 lr 0.000329	time 0.8772 (0.8904)	loss 0.5253 (0.6272)	grad_norm 3.7210 (2.5607)	mem 20675MB
[2025-04-03 00:37:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][454/573]	eta 0:01:45 lr 0.000330	time 0.8773 (0.8903)	loss 0.5277 (0.6269)	grad_norm 3.5452 (2.5622)	mem 20675MB
[2025-04-03 00:37:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][456/573]	eta 0:01:44 lr 0.000332	time 0.8771 (0.8903)	loss 0.4997 (0.6266)	grad_norm 4.2559 (2.5687)	mem 20675MB
[2025-04-03 00:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][458/573]	eta 0:01:42 lr 0.000333	time 0.8772 (0.8902)	loss 0.5448 (0.6264)	grad_norm 4.0526 (2.5743)	mem 20675MB
[2025-04-03 00:37:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][460/573]	eta 0:01:40 lr 0.000335	time 0.8770 (0.8902)	loss 0.6305 (0.6265)	grad_norm 3.7572 (2.5777)	mem 20675MB
[2025-04-03 00:37:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][462/573]	eta 0:01:38 lr 0.000336	time 0.8775 (0.8901)	loss 0.5812 (0.6261)	grad_norm 2.0409 (2.5768)	mem 20675MB
[2025-04-03 00:37:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][464/573]	eta 0:01:37 lr 0.000338	time 0.8774 (0.8901)	loss 0.5716 (0.6260)	grad_norm 3.4485 (2.5788)	mem 20675MB
[2025-04-03 00:37:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][466/573]	eta 0:01:35 lr 0.000339	time 0.8771 (0.8900)	loss 0.5809 (0.6258)	grad_norm 2.2902 (2.5801)	mem 20675MB
[2025-04-03 00:37:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][468/573]	eta 0:01:33 lr 0.000340	time 0.8775 (0.8900)	loss 0.5288 (0.6254)	grad_norm 4.7515 (2.5922)	mem 20675MB
[2025-04-03 00:37:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][470/573]	eta 0:01:31 lr 0.000342	time 0.8772 (0.8899)	loss 0.6522 (0.6254)	grad_norm 3.8336 (2.5957)	mem 20675MB
[2025-04-03 00:37:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][472/573]	eta 0:01:29 lr 0.000343	time 0.8774 (0.8899)	loss 0.5390 (0.6251)	grad_norm 1.9126 (2.5937)	mem 20675MB
[2025-04-03 00:37:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][474/573]	eta 0:01:28 lr 0.000345	time 0.8775 (0.8898)	loss 0.6196 (0.6251)	grad_norm 2.0081 (2.5917)	mem 20675MB
[2025-04-03 00:37:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][476/573]	eta 0:01:26 lr 0.000346	time 0.8775 (0.8898)	loss 0.6597 (0.6251)	grad_norm 2.5390 (2.5932)	mem 20675MB
[2025-04-03 00:37:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][478/573]	eta 0:01:24 lr 0.000348	time 0.8774 (0.8897)	loss 0.6273 (0.6250)	grad_norm 1.6382 (2.5908)	mem 20675MB
[2025-04-03 00:37:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][480/573]	eta 0:01:22 lr 0.000349	time 0.8774 (0.8897)	loss 0.6337 (0.6250)	grad_norm 1.5730 (2.5899)	mem 20675MB
[2025-04-03 00:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][482/573]	eta 0:01:20 lr 0.000351	time 0.8774 (0.8896)	loss 0.6637 (0.6250)	grad_norm 2.2948 (2.5885)	mem 20675MB
[2025-04-03 00:37:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][484/573]	eta 0:01:19 lr 0.000352	time 0.8784 (0.8896)	loss 0.6443 (0.6251)	grad_norm 3.1283 (2.5887)	mem 20675MB
[2025-04-03 00:37:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][486/573]	eta 0:01:17 lr 0.000354	time 0.8773 (0.8895)	loss 0.5971 (0.6251)	grad_norm 1.8982 (2.5870)	mem 20675MB
[2025-04-03 00:37:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][488/573]	eta 0:01:15 lr 0.000355	time 0.8771 (0.8895)	loss 0.6074 (0.6251)	grad_norm 1.8856 (2.5836)	mem 20675MB
[2025-04-03 00:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][490/573]	eta 0:01:13 lr 0.000356	time 0.8772 (0.8894)	loss 0.6107 (0.6250)	grad_norm 1.6100 (2.5811)	mem 20675MB
[2025-04-03 00:37:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][492/573]	eta 0:01:12 lr 0.000358	time 0.8777 (0.8894)	loss 0.5248 (0.6248)	grad_norm 2.4137 (2.5786)	mem 20675MB
[2025-04-03 00:37:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][494/573]	eta 0:01:10 lr 0.000359	time 0.8773 (0.8893)	loss 0.6306 (0.6248)	grad_norm 3.8497 (2.5816)	mem 20675MB
[2025-04-03 00:37:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][496/573]	eta 0:01:08 lr 0.000361	time 0.8774 (0.8893)	loss 0.5726 (0.6245)	grad_norm 2.2323 (2.5802)	mem 20675MB
[2025-04-03 00:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][498/573]	eta 0:01:06 lr 0.000362	time 0.8776 (0.8893)	loss 0.5831 (0.6244)	grad_norm 2.6162 (2.5830)	mem 20675MB
[2025-04-03 00:37:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][500/573]	eta 0:01:04 lr 0.000364	time 0.8774 (0.8892)	loss 0.6566 (0.6242)	grad_norm 2.6187 (2.5841)	mem 20675MB
[2025-04-03 00:37:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][502/573]	eta 0:01:03 lr 0.000365	time 0.8775 (0.8892)	loss 0.5657 (0.6240)	grad_norm 4.9009 (2.5879)	mem 20675MB
[2025-04-03 00:37:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][504/573]	eta 0:01:01 lr 0.000367	time 0.8771 (0.8891)	loss 0.6482 (0.6241)	grad_norm 2.1020 (2.5859)	mem 20675MB
[2025-04-03 00:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][506/573]	eta 0:00:59 lr 0.000368	time 0.8775 (0.8891)	loss 0.5717 (0.6240)	grad_norm 2.1404 (2.5868)	mem 20675MB
[2025-04-03 00:37:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][508/573]	eta 0:00:57 lr 0.000370	time 0.8775 (0.8890)	loss 0.6453 (0.6241)	grad_norm 3.1441 (2.5904)	mem 20675MB
[2025-04-03 00:37:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][510/573]	eta 0:00:56 lr 0.000371	time 0.8776 (0.8890)	loss 0.6332 (0.6240)	grad_norm 4.2447 (2.5929)	mem 20675MB
[2025-04-03 00:37:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][512/573]	eta 0:00:54 lr 0.000372	time 0.8774 (0.8890)	loss 0.6334 (0.6241)	grad_norm 1.6672 (2.5939)	mem 20675MB
[2025-04-03 00:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][514/573]	eta 0:00:52 lr 0.000374	time 0.8771 (0.8889)	loss 0.5920 (0.6240)	grad_norm 1.9868 (2.5902)	mem 20675MB
[2025-04-03 00:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][516/573]	eta 0:00:50 lr 0.000375	time 0.8772 (0.8889)	loss 0.6066 (0.6238)	grad_norm 2.2568 (2.5891)	mem 20675MB
[2025-04-03 00:38:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][518/573]	eta 0:00:48 lr 0.000377	time 0.8775 (0.8888)	loss 0.4705 (0.6235)	grad_norm 2.7724 (2.5907)	mem 20675MB
[2025-04-03 00:38:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][520/573]	eta 0:00:47 lr 0.000378	time 0.8786 (0.8888)	loss 0.5578 (0.6234)	grad_norm 3.8668 (2.5978)	mem 20675MB
[2025-04-03 00:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][522/573]	eta 0:00:45 lr 0.000380	time 0.8771 (0.8888)	loss 0.5642 (0.6230)	grad_norm 2.6224 (2.5995)	mem 20675MB
[2025-04-03 00:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][524/573]	eta 0:00:43 lr 0.000381	time 0.8778 (0.8887)	loss 0.6250 (0.6228)	grad_norm 3.1274 (2.6064)	mem 20675MB
[2025-04-03 00:38:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][526/573]	eta 0:00:41 lr 0.000383	time 0.8772 (0.8887)	loss 0.5876 (0.6224)	grad_norm 2.8904 (2.6101)	mem 20675MB
[2025-04-03 00:38:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][528/573]	eta 0:00:39 lr 0.000384	time 0.8776 (0.8886)	loss 0.6364 (0.6224)	grad_norm 5.5013 (2.6198)	mem 20675MB
[2025-04-03 00:38:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][530/573]	eta 0:00:38 lr 0.000386	time 0.8775 (0.8886)	loss 0.6048 (0.6224)	grad_norm 2.6779 (2.6186)	mem 20675MB
[2025-04-03 00:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][532/573]	eta 0:00:36 lr 0.000387	time 0.8781 (0.8886)	loss 0.6373 (0.6224)	grad_norm 2.3431 (2.6157)	mem 20675MB
[2025-04-03 00:38:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][534/573]	eta 0:00:34 lr 0.000388	time 0.8775 (0.8885)	loss 0.5512 (0.6221)	grad_norm 2.2751 (2.6154)	mem 20675MB
[2025-04-03 00:38:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][536/573]	eta 0:00:32 lr 0.000390	time 0.8777 (0.8885)	loss 0.5063 (0.6218)	grad_norm 3.0872 (2.6147)	mem 20675MB
[2025-04-03 00:38:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][538/573]	eta 0:00:31 lr 0.000391	time 0.8773 (0.8884)	loss 0.6628 (0.6218)	grad_norm 2.6427 (2.6151)	mem 20675MB
[2025-04-03 00:38:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][540/573]	eta 0:00:29 lr 0.000393	time 0.8772 (0.8884)	loss 0.6192 (0.6219)	grad_norm 2.6498 (2.6172)	mem 20675MB
[2025-04-03 00:38:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][542/573]	eta 0:00:27 lr 0.000394	time 0.8773 (0.8884)	loss 0.5688 (0.6216)	grad_norm 2.5856 (2.6200)	mem 20675MB
[2025-04-03 00:38:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][544/573]	eta 0:00:25 lr 0.000396	time 0.8779 (0.8883)	loss 0.5876 (0.6215)	grad_norm 4.7494 (2.6247)	mem 20675MB
[2025-04-03 00:38:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][546/573]	eta 0:00:23 lr 0.000397	time 0.8780 (0.8883)	loss 0.6429 (0.6214)	grad_norm 2.3214 (2.6281)	mem 20675MB
[2025-04-03 00:38:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][548/573]	eta 0:00:22 lr 0.000399	time 0.8777 (0.8883)	loss 0.6430 (0.6215)	grad_norm 2.3207 (2.6259)	mem 20675MB
[2025-04-03 00:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][550/573]	eta 0:00:20 lr 0.000400	time 0.8781 (0.8882)	loss 0.6315 (0.6213)	grad_norm 1.5493 (2.6248)	mem 20675MB
[2025-04-03 00:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][552/573]	eta 0:00:18 lr 0.000402	time 0.8777 (0.8882)	loss 0.6413 (0.6214)	grad_norm 2.2902 (2.6287)	mem 20675MB
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][554/573]	eta 0:00:16 lr 0.000403	time 0.8771 (0.8882)	loss 0.5962 (0.6213)	grad_norm 2.9616 (2.6274)	mem 20675MB
[2025-04-03 00:38:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][556/573]	eta 0:00:15 lr 0.000404	time 0.8774 (0.8881)	loss 0.4891 (0.6211)	grad_norm 2.7915 (2.6276)	mem 20675MB
[2025-04-03 00:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][558/573]	eta 0:00:13 lr 0.000406	time 0.8777 (0.8881)	loss 0.6319 (0.6211)	grad_norm 3.6397 (2.6281)	mem 20675MB
[2025-04-03 00:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][560/573]	eta 0:00:11 lr 0.000407	time 0.8769 (0.8881)	loss 0.5781 (0.6210)	grad_norm 1.5638 (2.6244)	mem 20675MB
[2025-04-03 00:38:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][562/573]	eta 0:00:09 lr 0.000409	time 0.8780 (0.8880)	loss 0.5710 (0.6208)	grad_norm 1.9051 (2.6240)	mem 20675MB
[2025-04-03 00:38:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][564/573]	eta 0:00:07 lr 0.000410	time 0.8787 (0.8880)	loss 0.5964 (0.6207)	grad_norm 2.6433 (2.6240)	mem 20675MB
[2025-04-03 00:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][566/573]	eta 0:00:06 lr 0.000412	time 0.8771 (0.8880)	loss 0.6281 (0.6205)	grad_norm 2.6110 (2.6271)	mem 20675MB
[2025-04-03 00:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][568/573]	eta 0:00:04 lr 0.000413	time 0.8770 (0.8879)	loss 0.5713 (0.6205)	grad_norm 2.4433 (2.6285)	mem 20675MB
[2025-04-03 00:38:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][570/573]	eta 0:00:02 lr 0.000415	time 0.8771 (0.8879)	loss 0.5812 (0.6204)	grad_norm 1.8113 (2.6262)	mem 20675MB
[2025-04-03 00:38:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][572/573]	eta 0:00:00 lr 0.000416	time 0.8771 (0.8879)	loss 0.5931 (0.6202)	grad_norm 2.6565 (2.6253)	mem 20675MB
[2025-04-03 00:38:47 simmim_finetune] (main_finetune.py 260): INFO EPOCH 0 training takes 0:08:28
[2025-04-03 00:38:47 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt0.pth saving......
[2025-04-03 00:38:51 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt0.pth saved !!!
[2025-04-03 00:38:52 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.854 (1.854)	Loss 0.5358 (0.5358)	Acc@1 72.656 (72.656)	Mem 20675MB
[2025-04-03 00:38:53 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.808)	Loss 0.5141 (0.5184)	Acc@1 77.344 (76.302)	Mem 20675MB
[2025-04-03 00:38:54 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.598)	Loss 0.6270 (0.5472)	Acc@1 62.500 (72.812)	Mem 20675MB
[2025-04-03 00:38:54 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.508)	Loss 0.5504 (0.5458)	Acc@1 73.438 (73.549)	Mem 20675MB
[2025-04-03 00:38:55 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.458)	Loss 0.7623 (0.5709)	Acc@1 53.906 (71.267)	Mem 20675MB
[2025-04-03 00:38:55 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.427)	Loss 0.6217 (0.5855)	Acc@1 63.281 (69.389)	Mem 20675MB
[2025-04-03 00:38:56 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.405)	Loss 0.6246 (0.5962)	Acc@1 64.062 (68.269)	Mem 20675MB
[2025-04-03 00:38:56 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.388)	Loss 0.6470 (0.6015)	Acc@1 67.188 (67.812)	Mem 20675MB
[2025-04-03 00:38:57 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 67.893
[2025-04-03 00:38:57 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 67.9%
[2025-04-03 00:38:57 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 67.89%
[2025-04-03 00:38:57 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.7046260686901576e-06, 1.7046260686901576e-06, 2.5326797441073806e-06, 2.5326797441073806e-06, 3.8066084755184914e-06, 3.8066084755184914e-06, 5.766498831535586e-06, 5.766498831535586e-06, 8.781714763869576e-06, 8.781714763869576e-06, 1.342050850592187e-05, 1.342050850592187e-05, 2.0557114262925398e-05, 2.0557114262925398e-05, 3.1536507735238514e-05, 3.1536507735238514e-05, 4.842788230802793e-05, 4.842788230802793e-05, 7.441461242001166e-05, 7.441461242001166e-05, 0.0001143941972076789, 0.0001143941972076789, 0.00017590125072716696, 0.00017590125072716696, 0.00027052748691099477, 0.00027052748691099477, 0.00041610631180919133, 0.00041610631180919133]
[2025-04-03 00:38:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][0/573]	eta 0:21:16 lr 0.000417	time 2.2285 (2.2285)	loss 0.6381 (0.6381)	grad_norm 3.5162 (3.5162)	mem 20675MB
[2025-04-03 00:39:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][2/573]	eta 0:12:38 lr 0.000418	time 0.8771 (1.3282)	loss 0.5222 (0.5818)	grad_norm 2.3270 (2.5013)	mem 20675MB
[2025-04-03 00:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][4/573]	eta 0:10:53 lr 0.000420	time 0.8777 (1.1483)	loss 0.6229 (0.5950)	grad_norm 1.8935 (2.4590)	mem 20675MB
[2025-04-03 00:39:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][6/573]	eta 0:10:07 lr 0.000421	time 0.8774 (1.0711)	loss 0.5245 (0.5937)	grad_norm 3.0668 (2.6304)	mem 20675MB
[2025-04-03 00:39:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][8/573]	eta 0:09:40 lr 0.000423	time 0.8772 (1.0282)	loss 0.5583 (0.5994)	grad_norm 2.3468 (2.7211)	mem 20675MB
[2025-04-03 00:39:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][10/573]	eta 0:09:23 lr 0.000424	time 0.8772 (1.0009)	loss 0.4944 (0.5903)	grad_norm 2.9115 (2.6675)	mem 20675MB
[2025-04-03 00:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][12/573]	eta 0:09:10 lr 0.000426	time 0.8770 (0.9820)	loss 0.6431 (0.5897)	grad_norm 2.3209 (2.6313)	mem 20675MB
[2025-04-03 00:39:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][14/573]	eta 0:09:01 lr 0.000427	time 0.8771 (0.9681)	loss 0.6652 (0.5996)	grad_norm 2.5033 (2.7518)	mem 20675MB
[2025-04-03 00:39:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][16/573]	eta 0:08:53 lr 0.000428	time 0.8777 (0.9576)	loss 0.6704 (0.5991)	grad_norm 4.8500 (3.0118)	mem 20675MB
[2025-04-03 00:39:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][18/573]	eta 0:08:46 lr 0.000430	time 0.8775 (0.9492)	loss 0.6388 (0.6039)	grad_norm 4.7854 (3.1114)	mem 20675MB
[2025-04-03 00:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][20/573]	eta 0:08:41 lr 0.000431	time 0.8775 (0.9424)	loss 0.5938 (0.6007)	grad_norm 2.9240 (3.0754)	mem 20675MB
[2025-04-03 00:39:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][22/573]	eta 0:08:36 lr 0.000433	time 0.8782 (0.9369)	loss 0.5113 (0.5973)	grad_norm 3.0168 (2.9947)	mem 20675MB
[2025-04-03 00:39:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][24/573]	eta 0:08:31 lr 0.000434	time 0.8773 (0.9322)	loss 0.5429 (0.5968)	grad_norm 2.3490 (2.9224)	mem 20675MB
[2025-04-03 00:39:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][26/573]	eta 0:08:27 lr 0.000436	time 0.8772 (0.9282)	loss 0.6377 (0.5965)	grad_norm 3.7281 (2.9134)	mem 20675MB
[2025-04-03 00:39:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][28/573]	eta 0:08:23 lr 0.000437	time 0.8776 (0.9248)	loss 0.6208 (0.5936)	grad_norm 3.8344 (2.9769)	mem 20675MB
[2025-04-03 00:39:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][30/573]	eta 0:08:20 lr 0.000439	time 0.8774 (0.9218)	loss 0.6964 (0.5924)	grad_norm 6.0474 (3.0850)	mem 20675MB
[2025-04-03 00:39:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][32/573]	eta 0:08:17 lr 0.000440	time 0.8775 (0.9192)	loss 0.5359 (0.5927)	grad_norm 1.8453 (3.0842)	mem 20675MB
[2025-04-03 00:39:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][34/573]	eta 0:08:14 lr 0.000442	time 0.8772 (0.9168)	loss 0.6017 (0.5930)	grad_norm 5.3356 (3.1109)	mem 20675MB
[2025-04-03 00:39:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][36/573]	eta 0:08:11 lr 0.000443	time 0.8774 (0.9147)	loss 0.5655 (0.5947)	grad_norm 3.2118 (3.0858)	mem 20675MB
[2025-04-03 00:39:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][38/573]	eta 0:08:08 lr 0.000444	time 0.8773 (0.9128)	loss 0.5789 (0.5942)	grad_norm 3.5895 (3.1006)	mem 20675MB
[2025-04-03 00:39:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][40/573]	eta 0:08:05 lr 0.000446	time 0.8807 (0.9112)	loss 0.5857 (0.5951)	grad_norm 4.0167 (3.1536)	mem 20675MB
[2025-04-03 00:39:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][42/573]	eta 0:08:03 lr 0.000447	time 0.8771 (0.9097)	loss 0.5712 (0.5958)	grad_norm 2.8188 (3.1582)	mem 20675MB
[2025-04-03 00:39:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][44/573]	eta 0:08:00 lr 0.000449	time 0.8771 (0.9083)	loss 0.5954 (0.5964)	grad_norm 3.4644 (3.1386)	mem 20675MB
[2025-04-03 00:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][46/573]	eta 0:07:57 lr 0.000450	time 0.8770 (0.9070)	loss 0.6388 (0.5982)	grad_norm 2.1280 (3.1386)	mem 20675MB
[2025-04-03 00:39:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][48/573]	eta 0:07:55 lr 0.000452	time 0.8774 (0.9058)	loss 0.5803 (0.5968)	grad_norm 2.3362 (3.0966)	mem 20675MB
[2025-04-03 00:39:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][50/573]	eta 0:07:53 lr 0.000453	time 0.8774 (0.9048)	loss 0.5821 (0.5971)	grad_norm 3.4370 (3.1017)	mem 20675MB
[2025-04-03 00:39:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][52/573]	eta 0:07:50 lr 0.000455	time 0.8769 (0.9037)	loss 0.5311 (0.5961)	grad_norm 1.9794 (3.0549)	mem 20675MB
[2025-04-03 00:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][54/573]	eta 0:07:48 lr 0.000456	time 0.8773 (0.9028)	loss 0.5265 (0.5944)	grad_norm 3.8892 (3.0535)	mem 20675MB
[2025-04-03 00:39:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][56/573]	eta 0:07:46 lr 0.000458	time 0.8774 (0.9020)	loss 0.6427 (0.5941)	grad_norm 2.4906 (3.0347)	mem 20675MB
[2025-04-03 00:39:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][58/573]	eta 0:07:44 lr 0.000459	time 0.8786 (0.9012)	loss 0.6150 (0.5947)	grad_norm 1.8577 (3.0012)	mem 20675MB
[2025-04-03 00:39:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][60/573]	eta 0:07:41 lr 0.000460	time 0.8773 (0.9004)	loss 0.5931 (0.5950)	grad_norm 3.2409 (3.0035)	mem 20675MB
[2025-04-03 00:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][62/573]	eta 0:07:39 lr 0.000462	time 0.8774 (0.8997)	loss 0.5556 (0.5939)	grad_norm 3.6735 (2.9946)	mem 20675MB
[2025-04-03 00:39:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][64/573]	eta 0:07:37 lr 0.000463	time 0.8773 (0.8991)	loss 0.5712 (0.5922)	grad_norm 3.2648 (2.9887)	mem 20675MB
[2025-04-03 00:39:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][66/573]	eta 0:07:35 lr 0.000465	time 0.8773 (0.8984)	loss 0.5028 (0.5908)	grad_norm 3.0483 (2.9999)	mem 20675MB
[2025-04-03 00:39:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][68/573]	eta 0:07:33 lr 0.000466	time 0.8771 (0.8979)	loss 0.4591 (0.5875)	grad_norm 4.2825 (3.0199)	mem 20675MB
[2025-04-03 00:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][70/573]	eta 0:07:31 lr 0.000468	time 0.8773 (0.8973)	loss 0.5964 (0.5892)	grad_norm 3.8815 (3.0927)	mem 20675MB
[2025-04-03 00:40:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][72/573]	eta 0:07:29 lr 0.000469	time 0.8772 (0.8968)	loss 0.6888 (0.5894)	grad_norm 3.4499 (3.1080)	mem 20675MB
[2025-04-03 00:40:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][74/573]	eta 0:07:27 lr 0.000471	time 0.8771 (0.8963)	loss 0.5865 (0.5891)	grad_norm 2.8389 (3.0998)	mem 20675MB
[2025-04-03 00:40:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][76/573]	eta 0:07:25 lr 0.000472	time 0.8770 (0.8958)	loss 0.5925 (0.5886)	grad_norm 1.9276 (3.0781)	mem 20675MB
[2025-04-03 00:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][78/573]	eta 0:07:23 lr 0.000474	time 0.8770 (0.8953)	loss 0.5310 (0.5884)	grad_norm 2.5711 (3.0680)	mem 20675MB
[2025-04-03 00:40:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][80/573]	eta 0:07:21 lr 0.000475	time 0.8772 (0.8949)	loss 0.5396 (0.5883)	grad_norm 2.7079 (3.0602)	mem 20675MB
[2025-04-03 00:40:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][82/573]	eta 0:07:19 lr 0.000476	time 0.8781 (0.8945)	loss 0.5021 (0.5871)	grad_norm 3.1265 (3.0584)	mem 20675MB
[2025-04-03 00:40:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][84/573]	eta 0:07:17 lr 0.000478	time 0.8773 (0.8941)	loss 0.5321 (0.5856)	grad_norm 4.1742 (3.0889)	mem 20675MB
[2025-04-03 00:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][86/573]	eta 0:07:15 lr 0.000479	time 0.8772 (0.8938)	loss 0.6958 (0.5868)	grad_norm 5.8975 (3.1216)	mem 20675MB
[2025-04-03 00:40:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][88/573]	eta 0:07:13 lr 0.000481	time 0.8769 (0.8934)	loss 0.6003 (0.5854)	grad_norm 4.8520 (3.1607)	mem 20675MB
[2025-04-03 00:40:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][90/573]	eta 0:07:11 lr 0.000482	time 0.8770 (0.8931)	loss 0.6733 (0.5862)	grad_norm 2.8738 (3.1578)	mem 20675MB
[2025-04-03 00:40:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][92/573]	eta 0:07:09 lr 0.000484	time 0.8773 (0.8927)	loss 0.5882 (0.5861)	grad_norm 1.6261 (3.1421)	mem 20675MB
[2025-04-03 00:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][94/573]	eta 0:07:07 lr 0.000485	time 0.8772 (0.8924)	loss 0.5992 (0.5865)	grad_norm 1.7966 (3.1154)	mem 20675MB
[2025-04-03 00:40:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][96/573]	eta 0:07:05 lr 0.000487	time 0.8772 (0.8921)	loss 0.6672 (0.5875)	grad_norm 2.3934 (3.0947)	mem 20675MB
[2025-04-03 00:40:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][98/573]	eta 0:07:03 lr 0.000488	time 0.8771 (0.8918)	loss 0.5669 (0.5879)	grad_norm 4.4422 (3.0951)	mem 20675MB
[2025-04-03 00:40:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][100/573]	eta 0:07:01 lr 0.000490	time 0.8772 (0.8916)	loss 0.5813 (0.5878)	grad_norm 2.1602 (3.0840)	mem 20675MB
[2025-04-03 00:40:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][102/573]	eta 0:06:59 lr 0.000491	time 0.8773 (0.8913)	loss 0.5001 (0.5871)	grad_norm 2.3584 (3.0677)	mem 20675MB
[2025-04-03 00:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][104/573]	eta 0:06:57 lr 0.000492	time 0.8774 (0.8911)	loss 0.5518 (0.5863)	grad_norm 2.7737 (3.0623)	mem 20675MB
[2025-04-03 00:40:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][106/573]	eta 0:06:56 lr 0.000494	time 0.8778 (0.8908)	loss 0.5204 (0.5853)	grad_norm 4.2391 (3.0724)	mem 20675MB
[2025-04-03 00:40:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][108/573]	eta 0:06:54 lr 0.000495	time 0.8770 (0.8906)	loss 0.5228 (0.5849)	grad_norm 3.3056 (3.0804)	mem 20675MB
[2025-04-03 00:40:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][110/573]	eta 0:06:52 lr 0.000497	time 0.8773 (0.8904)	loss 0.6046 (0.5853)	grad_norm 3.7333 (3.1045)	mem 20675MB
[2025-04-03 00:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][112/573]	eta 0:06:50 lr 0.000498	time 0.8772 (0.8902)	loss 0.5769 (0.5854)	grad_norm 2.8844 (3.0987)	mem 20675MB
[2025-04-03 00:40:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][114/573]	eta 0:06:48 lr 0.000500	time 0.8772 (0.8900)	loss 0.5470 (0.5853)	grad_norm 4.1526 (3.0997)	mem 20675MB
[2025-04-03 00:40:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][116/573]	eta 0:06:46 lr 0.000501	time 0.8772 (0.8897)	loss 0.5467 (0.5846)	grad_norm 3.2872 (3.1202)	mem 20675MB
[2025-04-03 00:40:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][118/573]	eta 0:06:44 lr 0.000503	time 0.8772 (0.8895)	loss 0.6034 (0.5842)	grad_norm 2.0942 (3.1143)	mem 20675MB
[2025-04-03 00:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][120/573]	eta 0:06:42 lr 0.000504	time 0.8771 (0.8894)	loss 0.6012 (0.5845)	grad_norm 2.1863 (3.1083)	mem 20675MB
[2025-04-03 00:40:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][122/573]	eta 0:06:41 lr 0.000506	time 0.8774 (0.8892)	loss 0.4971 (0.5834)	grad_norm 3.8533 (3.1246)	mem 20675MB
[2025-04-03 00:40:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][124/573]	eta 0:06:39 lr 0.000507	time 0.8772 (0.8890)	loss 0.6694 (0.5838)	grad_norm 3.9637 (3.1434)	mem 20675MB
[2025-04-03 00:40:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][126/573]	eta 0:06:37 lr 0.000508	time 0.8772 (0.8888)	loss 0.5881 (0.5836)	grad_norm 3.6634 (3.1576)	mem 20675MB
[2025-04-03 00:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][128/573]	eta 0:06:35 lr 0.000510	time 0.8775 (0.8887)	loss 0.5301 (0.5834)	grad_norm 2.1751 (3.1406)	mem 20675MB
[2025-04-03 00:40:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][130/573]	eta 0:06:33 lr 0.000511	time 0.8788 (0.8885)	loss 0.5889 (0.5835)	grad_norm 1.8358 (3.1255)	mem 20675MB
[2025-04-03 00:40:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][132/573]	eta 0:06:31 lr 0.000513	time 0.8773 (0.8883)	loss 0.6484 (0.5843)	grad_norm 2.7081 (3.1117)	mem 20675MB
[2025-04-03 00:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][134/573]	eta 0:06:29 lr 0.000514	time 0.8770 (0.8882)	loss 0.5648 (0.5836)	grad_norm 2.6854 (3.1134)	mem 20675MB
[2025-04-03 00:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][136/573]	eta 0:06:28 lr 0.000516	time 0.8781 (0.8881)	loss 0.7175 (0.5845)	grad_norm 5.9700 (3.1454)	mem 20675MB
[2025-04-03 00:41:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][138/573]	eta 0:06:26 lr 0.000517	time 0.8776 (0.8879)	loss 0.5395 (0.5837)	grad_norm 5.6669 (3.1668)	mem 20675MB
[2025-04-03 00:41:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][140/573]	eta 0:06:24 lr 0.000519	time 0.8774 (0.8878)	loss 0.5383 (0.5830)	grad_norm 3.5267 (3.1690)	mem 20675MB
[2025-04-03 00:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][142/573]	eta 0:06:22 lr 0.000520	time 0.8772 (0.8876)	loss 0.5189 (0.5832)	grad_norm 3.5994 (3.2045)	mem 20675MB
[2025-04-03 00:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][144/573]	eta 0:06:20 lr 0.000522	time 0.8771 (0.8875)	loss 0.8478 (0.5844)	grad_norm 12.0162 (3.2656)	mem 20675MB
[2025-04-03 00:41:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][146/573]	eta 0:06:18 lr 0.000523	time 0.8773 (0.8874)	loss 0.5044 (0.5843)	grad_norm 2.5747 (3.2851)	mem 20675MB
[2025-04-03 00:41:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][148/573]	eta 0:06:17 lr 0.000524	time 0.8774 (0.8873)	loss 0.5474 (0.5843)	grad_norm 2.4900 (3.2725)	mem 20675MB
[2025-04-03 00:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][150/573]	eta 0:06:15 lr 0.000526	time 0.8774 (0.8871)	loss 0.5822 (0.5844)	grad_norm 2.1959 (3.2637)	mem 20675MB
[2025-04-03 00:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][152/573]	eta 0:06:13 lr 0.000527	time 0.8773 (0.8870)	loss 0.5952 (0.5843)	grad_norm 1.4558 (3.2428)	mem 20675MB
[2025-04-03 00:41:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][154/573]	eta 0:06:11 lr 0.000529	time 0.8774 (0.8869)	loss 0.5911 (0.5841)	grad_norm 1.5590 (3.2315)	mem 20675MB
[2025-04-03 00:41:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][156/573]	eta 0:06:09 lr 0.000530	time 0.8770 (0.8868)	loss 0.6040 (0.5844)	grad_norm 1.5259 (3.2142)	mem 20675MB
[2025-04-03 00:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][158/573]	eta 0:06:07 lr 0.000532	time 0.8769 (0.8867)	loss 0.6364 (0.5850)	grad_norm 2.4469 (3.2011)	mem 20675MB
[2025-04-03 00:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][160/573]	eta 0:06:06 lr 0.000533	time 0.8770 (0.8866)	loss 0.5951 (0.5851)	grad_norm 1.7302 (3.1887)	mem 20675MB
[2025-04-03 00:41:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][162/573]	eta 0:06:04 lr 0.000535	time 0.8776 (0.8865)	loss 0.6447 (0.5854)	grad_norm 2.5942 (3.1823)	mem 20675MB
[2025-04-03 00:41:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][164/573]	eta 0:06:02 lr 0.000536	time 0.8773 (0.8864)	loss 0.6336 (0.5865)	grad_norm 1.9729 (3.1785)	mem 20675MB
[2025-04-03 00:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][166/573]	eta 0:06:00 lr 0.000538	time 0.8770 (0.8863)	loss 0.6141 (0.5867)	grad_norm 1.5459 (3.1623)	mem 20675MB
[2025-04-03 00:41:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][168/573]	eta 0:05:58 lr 0.000539	time 0.8772 (0.8862)	loss 0.5156 (0.5863)	grad_norm 3.0644 (3.1573)	mem 20675MB
[2025-04-03 00:41:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][170/573]	eta 0:05:57 lr 0.000540	time 0.8772 (0.8861)	loss 0.5461 (0.5861)	grad_norm 5.4266 (3.1631)	mem 20675MB
[2025-04-03 00:41:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][172/573]	eta 0:05:55 lr 0.000542	time 0.8771 (0.8860)	loss 0.5604 (0.5857)	grad_norm 2.3091 (3.1636)	mem 20675MB
[2025-04-03 00:41:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][174/573]	eta 0:05:53 lr 0.000543	time 0.8771 (0.8859)	loss 0.4707 (0.5857)	grad_norm 4.3565 (3.1843)	mem 20675MB
[2025-04-03 00:41:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][176/573]	eta 0:05:51 lr 0.000545	time 0.8771 (0.8858)	loss 0.6071 (0.5859)	grad_norm 4.1887 (3.1888)	mem 20675MB
[2025-04-03 00:41:35 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][178/573]	eta 0:05:49 lr 0.000546	time 0.8774 (0.8857)	loss 0.6040 (0.5863)	grad_norm 3.5148 (3.1884)	mem 20675MB
[2025-04-03 00:41:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][180/573]	eta 0:05:48 lr 0.000548	time 0.8770 (0.8856)	loss 0.5626 (0.5858)	grad_norm 3.1299 (3.1976)	mem 20675MB
[2025-04-03 00:41:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][182/573]	eta 0:05:46 lr 0.000549	time 0.8773 (0.8855)	loss 0.6201 (0.5856)	grad_norm 3.4392 (3.1972)	mem 20675MB
[2025-04-03 00:41:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][184/573]	eta 0:05:44 lr 0.000551	time 0.8772 (0.8855)	loss 0.6697 (0.5860)	grad_norm 1.9924 (3.1825)	mem 20675MB
[2025-04-03 00:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][186/573]	eta 0:05:42 lr 0.000552	time 0.8771 (0.8854)	loss 0.6152 (0.5864)	grad_norm 2.7785 (3.1710)	mem 20675MB
[2025-04-03 00:41:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][188/573]	eta 0:05:40 lr 0.000554	time 0.8775 (0.8853)	loss 0.6305 (0.5869)	grad_norm 1.4720 (3.1576)	mem 20675MB
[2025-04-03 00:41:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][190/573]	eta 0:05:39 lr 0.000555	time 0.8773 (0.8852)	loss 0.5792 (0.5867)	grad_norm 2.8677 (3.1526)	mem 20675MB
[2025-04-03 00:41:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][192/573]	eta 0:05:37 lr 0.000556	time 0.8771 (0.8852)	loss 0.6618 (0.5873)	grad_norm 5.6587 (3.1621)	mem 20675MB
[2025-04-03 00:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][194/573]	eta 0:05:35 lr 0.000558	time 0.8770 (0.8851)	loss 0.5945 (0.5869)	grad_norm 1.9257 (3.1525)	mem 20675MB
[2025-04-03 00:41:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][196/573]	eta 0:05:33 lr 0.000559	time 0.8772 (0.8850)	loss 0.6142 (0.5869)	grad_norm 1.8781 (3.1566)	mem 20675MB
[2025-04-03 00:41:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][198/573]	eta 0:05:31 lr 0.000561	time 0.8775 (0.8849)	loss 0.5923 (0.5874)	grad_norm 2.9787 (3.1590)	mem 20675MB
[2025-04-03 00:41:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][200/573]	eta 0:05:30 lr 0.000562	time 0.8771 (0.8849)	loss 0.6373 (0.5875)	grad_norm 4.3784 (3.1661)	mem 20675MB
[2025-04-03 00:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][202/573]	eta 0:05:28 lr 0.000564	time 0.8773 (0.8848)	loss 0.6258 (0.5876)	grad_norm 3.7551 (3.1825)	mem 20675MB
[2025-04-03 00:41:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][204/573]	eta 0:05:26 lr 0.000565	time 0.8773 (0.8847)	loss 0.6719 (0.5879)	grad_norm 2.2967 (3.1807)	mem 20675MB
[2025-04-03 00:42:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][206/573]	eta 0:05:24 lr 0.000567	time 0.8770 (0.8847)	loss 0.5929 (0.5878)	grad_norm 4.2913 (3.1905)	mem 20675MB
[2025-04-03 00:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][208/573]	eta 0:05:22 lr 0.000568	time 0.8775 (0.8846)	loss 0.5878 (0.5880)	grad_norm 3.2407 (3.1845)	mem 20675MB
[2025-04-03 00:42:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][210/573]	eta 0:05:21 lr 0.000570	time 0.8777 (0.8846)	loss 0.4987 (0.5877)	grad_norm 2.4824 (3.1730)	mem 20675MB
[2025-04-03 00:42:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][212/573]	eta 0:05:19 lr 0.000571	time 0.8771 (0.8845)	loss 0.5511 (0.5871)	grad_norm 2.8420 (3.1761)	mem 20675MB
[2025-04-03 00:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][214/573]	eta 0:05:17 lr 0.000572	time 0.8773 (0.8844)	loss 0.5534 (0.5871)	grad_norm 3.2669 (3.1843)	mem 20675MB
[2025-04-03 00:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][216/573]	eta 0:05:15 lr 0.000574	time 0.8780 (0.8844)	loss 0.5894 (0.5873)	grad_norm 3.6964 (3.1906)	mem 20675MB
[2025-04-03 00:42:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][218/573]	eta 0:05:13 lr 0.000575	time 0.8771 (0.8843)	loss 0.5884 (0.5871)	grad_norm 1.7206 (3.1846)	mem 20675MB
[2025-04-03 00:42:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][220/573]	eta 0:05:12 lr 0.000577	time 0.8771 (0.8843)	loss 0.6026 (0.5873)	grad_norm 1.4387 (3.1714)	mem 20675MB
[2025-04-03 00:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][222/573]	eta 0:05:10 lr 0.000578	time 0.8773 (0.8842)	loss 0.5273 (0.5869)	grad_norm 4.7789 (3.1798)	mem 20675MB
[2025-04-03 00:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][224/573]	eta 0:05:08 lr 0.000580	time 0.8773 (0.8842)	loss 0.6521 (0.5875)	grad_norm 4.1639 (3.1820)	mem 20675MB
[2025-04-03 00:42:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][226/573]	eta 0:05:06 lr 0.000581	time 0.8774 (0.8841)	loss 0.5723 (0.5876)	grad_norm 2.0350 (3.1729)	mem 20675MB
[2025-04-03 00:42:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][228/573]	eta 0:05:04 lr 0.000583	time 0.8774 (0.8841)	loss 0.5186 (0.5875)	grad_norm 4.2250 (3.1755)	mem 20675MB
[2025-04-03 00:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][230/573]	eta 0:05:03 lr 0.000584	time 0.8770 (0.8840)	loss 0.5458 (0.5869)	grad_norm 2.9295 (3.1717)	mem 20675MB
[2025-04-03 00:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][232/573]	eta 0:05:01 lr 0.000586	time 0.8773 (0.8840)	loss 0.4744 (0.5865)	grad_norm 3.8367 (3.1739)	mem 20675MB
[2025-04-03 00:42:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][234/573]	eta 0:04:59 lr 0.000587	time 0.8772 (0.8839)	loss 0.6322 (0.5862)	grad_norm 4.2602 (3.1851)	mem 20675MB
[2025-04-03 00:42:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][236/573]	eta 0:04:57 lr 0.000588	time 0.8771 (0.8839)	loss 0.4573 (0.5851)	grad_norm 6.0339 (3.2037)	mem 20675MB
[2025-04-03 00:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][238/573]	eta 0:04:56 lr 0.000590	time 0.8770 (0.8838)	loss 0.6851 (0.5850)	grad_norm 7.8084 (3.2356)	mem 20675MB
[2025-04-03 00:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][240/573]	eta 0:04:54 lr 0.000591	time 0.8770 (0.8838)	loss 0.5851 (0.5848)	grad_norm 3.7221 (3.2389)	mem 20675MB
[2025-04-03 00:42:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][242/573]	eta 0:04:52 lr 0.000593	time 0.8774 (0.8837)	loss 0.6726 (0.5855)	grad_norm 5.0411 (3.2628)	mem 20675MB
[2025-04-03 00:42:33 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][244/573]	eta 0:04:50 lr 0.000594	time 0.8771 (0.8837)	loss 0.6441 (0.5859)	grad_norm 3.9684 (3.2622)	mem 20675MB
[2025-04-03 00:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][246/573]	eta 0:04:48 lr 0.000596	time 0.8775 (0.8836)	loss 0.5326 (0.5857)	grad_norm 3.2999 (3.2588)	mem 20675MB
[2025-04-03 00:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][248/573]	eta 0:04:47 lr 0.000597	time 0.8775 (0.8836)	loss 0.5332 (0.5858)	grad_norm 2.4536 (3.2691)	mem 20675MB
[2025-04-03 00:42:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][250/573]	eta 0:04:45 lr 0.000599	time 0.8770 (0.8835)	loss 0.5967 (0.5858)	grad_norm 1.5644 (3.2566)	mem 20675MB
[2025-04-03 00:42:40 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][252/573]	eta 0:04:43 lr 0.000600	time 0.8773 (0.8835)	loss 0.4575 (0.5849)	grad_norm 2.6611 (3.2537)	mem 20675MB
[2025-04-03 00:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][254/573]	eta 0:04:41 lr 0.000601	time 0.8771 (0.8835)	loss 0.6057 (0.5849)	grad_norm 2.8073 (3.2564)	mem 20675MB
[2025-04-03 00:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][256/573]	eta 0:04:40 lr 0.000603	time 0.8771 (0.8834)	loss 0.6359 (0.5851)	grad_norm 4.7298 (3.2621)	mem 20675MB
[2025-04-03 00:42:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][258/573]	eta 0:04:38 lr 0.000604	time 0.8773 (0.8834)	loss 0.5785 (0.5846)	grad_norm 2.9527 (3.2629)	mem 20675MB
[2025-04-03 00:42:47 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][260/573]	eta 0:04:36 lr 0.000606	time 0.8772 (0.8833)	loss 0.5490 (0.5846)	grad_norm 3.3085 (3.2616)	mem 20675MB
[2025-04-03 00:42:49 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][262/573]	eta 0:04:34 lr 0.000607	time 0.8771 (0.8833)	loss 0.6361 (0.5846)	grad_norm 2.3007 (3.2571)	mem 20675MB
[2025-04-03 00:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][264/573]	eta 0:04:32 lr 0.000609	time 0.8772 (0.8833)	loss 0.5906 (0.5844)	grad_norm 1.8231 (3.2508)	mem 20675MB
[2025-04-03 00:42:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][266/573]	eta 0:04:31 lr 0.000610	time 0.8772 (0.8832)	loss 0.5583 (0.5843)	grad_norm 2.4014 (3.2431)	mem 20675MB
[2025-04-03 00:42:54 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][268/573]	eta 0:04:29 lr 0.000612	time 0.8769 (0.8832)	loss 0.5076 (0.5839)	grad_norm 2.6754 (3.2467)	mem 20675MB
[2025-04-03 00:42:56 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][270/573]	eta 0:04:27 lr 0.000613	time 0.8772 (0.8831)	loss 0.4615 (0.5836)	grad_norm 3.3316 (3.2452)	mem 20675MB
[2025-04-03 00:42:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][272/573]	eta 0:04:25 lr 0.000615	time 0.8772 (0.8831)	loss 0.6444 (0.5835)	grad_norm 3.0483 (3.2468)	mem 20675MB
[2025-04-03 00:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][274/573]	eta 0:04:24 lr 0.000616	time 0.8773 (0.8831)	loss 0.4679 (0.5832)	grad_norm 2.5451 (3.2400)	mem 20675MB
[2025-04-03 00:43:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][276/573]	eta 0:04:22 lr 0.000617	time 0.8773 (0.8830)	loss 0.6176 (0.5833)	grad_norm 4.5426 (3.2416)	mem 20675MB
[2025-04-03 00:43:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][278/573]	eta 0:04:20 lr 0.000619	time 0.8772 (0.8830)	loss 0.5007 (0.5833)	grad_norm 3.4426 (3.2414)	mem 20675MB
[2025-04-03 00:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][280/573]	eta 0:04:18 lr 0.000620	time 0.8770 (0.8830)	loss 0.6238 (0.5833)	grad_norm 1.8168 (3.2345)	mem 20675MB
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][282/573]	eta 0:04:16 lr 0.000622	time 0.8771 (0.8829)	loss 0.6054 (0.5834)	grad_norm 1.4419 (3.2249)	mem 20675MB
[2025-04-03 00:43:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][284/573]	eta 0:04:15 lr 0.000623	time 0.8773 (0.8829)	loss 0.5480 (0.5831)	grad_norm 3.1822 (3.2196)	mem 20675MB
[2025-04-03 00:43:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][286/573]	eta 0:04:13 lr 0.000625	time 0.8772 (0.8829)	loss 0.4862 (0.5829)	grad_norm 2.2263 (3.2135)	mem 20675MB
[2025-04-03 00:43:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][288/573]	eta 0:04:11 lr 0.000626	time 0.8769 (0.8828)	loss 0.5820 (0.5830)	grad_norm 2.6481 (3.2073)	mem 20675MB
[2025-04-03 00:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][290/573]	eta 0:04:09 lr 0.000628	time 0.8770 (0.8828)	loss 0.6050 (0.5829)	grad_norm 2.6042 (3.2039)	mem 20675MB
[2025-04-03 00:43:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][292/573]	eta 0:04:08 lr 0.000629	time 0.8770 (0.8827)	loss 0.6141 (0.5830)	grad_norm 4.7912 (3.2067)	mem 20675MB
[2025-04-03 00:43:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][294/573]	eta 0:04:06 lr 0.000631	time 0.8770 (0.8827)	loss 0.5988 (0.5828)	grad_norm 2.6654 (3.2038)	mem 20675MB
[2025-04-03 00:43:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][296/573]	eta 0:04:04 lr 0.000632	time 0.8772 (0.8827)	loss 0.6312 (0.5831)	grad_norm 2.5830 (3.1968)	mem 20675MB
[2025-04-03 00:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][298/573]	eta 0:04:02 lr 0.000633	time 0.8772 (0.8827)	loss 0.4993 (0.5828)	grad_norm 2.7981 (3.1928)	mem 20675MB
[2025-04-03 00:43:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][300/573]	eta 0:04:00 lr 0.000635	time 0.8771 (0.8826)	loss 0.5821 (0.5828)	grad_norm 3.9979 (3.1934)	mem 20675MB
[2025-04-03 00:43:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][302/573]	eta 0:03:59 lr 0.000636	time 0.8771 (0.8826)	loss 0.5395 (0.5828)	grad_norm 1.9817 (3.1897)	mem 20675MB
[2025-04-03 00:43:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][304/573]	eta 0:03:57 lr 0.000638	time 0.8771 (0.8826)	loss 0.6030 (0.5825)	grad_norm 1.8624 (3.1832)	mem 20675MB
[2025-04-03 00:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][306/573]	eta 0:03:55 lr 0.000639	time 0.8769 (0.8825)	loss 0.4980 (0.5820)	grad_norm 3.6454 (3.1866)	mem 20675MB
[2025-04-03 00:43:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][308/573]	eta 0:03:53 lr 0.000641	time 0.8773 (0.8825)	loss 0.4684 (0.5816)	grad_norm 3.6280 (3.1841)	mem 20675MB
[2025-04-03 00:43:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][310/573]	eta 0:03:52 lr 0.000642	time 0.8775 (0.8825)	loss 0.6968 (0.5821)	grad_norm 3.7472 (3.1850)	mem 20675MB
[2025-04-03 00:43:33 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][312/573]	eta 0:03:50 lr 0.000644	time 0.8773 (0.8825)	loss 0.6485 (0.5820)	grad_norm 2.4462 (3.1849)	mem 20675MB
[2025-04-03 00:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][314/573]	eta 0:03:48 lr 0.000645	time 0.8770 (0.8824)	loss 0.5963 (0.5819)	grad_norm 1.6441 (3.1787)	mem 20675MB
[2025-04-03 00:43:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][316/573]	eta 0:03:46 lr 0.000647	time 0.8771 (0.8824)	loss 0.5304 (0.5819)	grad_norm 2.7835 (3.1756)	mem 20675MB
[2025-04-03 00:43:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][318/573]	eta 0:03:45 lr 0.000648	time 0.8775 (0.8824)	loss 0.6158 (0.5818)	grad_norm 1.5937 (3.1732)	mem 20675MB
[2025-04-03 00:43:40 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][320/573]	eta 0:03:43 lr 0.000649	time 0.8769 (0.8823)	loss 0.6602 (0.5819)	grad_norm 4.0237 (3.1768)	mem 20675MB
[2025-04-03 00:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][322/573]	eta 0:03:41 lr 0.000651	time 0.8771 (0.8823)	loss 0.5852 (0.5821)	grad_norm 1.8620 (3.1749)	mem 20675MB
[2025-04-03 00:43:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][324/573]	eta 0:03:39 lr 0.000652	time 0.8775 (0.8823)	loss 0.5385 (0.5820)	grad_norm 3.0651 (3.1771)	mem 20675MB
[2025-04-03 00:43:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][326/573]	eta 0:03:37 lr 0.000654	time 0.8788 (0.8823)	loss 0.6166 (0.5818)	grad_norm 2.3173 (3.1716)	mem 20675MB
[2025-04-03 00:43:47 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][328/573]	eta 0:03:36 lr 0.000655	time 0.8774 (0.8822)	loss 0.6337 (0.5820)	grad_norm 2.0577 (3.1649)	mem 20675MB
[2025-04-03 00:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][330/573]	eta 0:03:34 lr 0.000657	time 0.8776 (0.8822)	loss 0.4833 (0.5817)	grad_norm 2.3056 (3.1594)	mem 20675MB
[2025-04-03 00:43:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][332/573]	eta 0:03:32 lr 0.000658	time 0.8771 (0.8822)	loss 0.6125 (0.5819)	grad_norm 3.5010 (3.1584)	mem 20675MB
[2025-04-03 00:43:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][334/573]	eta 0:03:30 lr 0.000660	time 0.8770 (0.8822)	loss 0.5508 (0.5816)	grad_norm 2.5477 (3.1598)	mem 20675MB
[2025-04-03 00:43:54 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][336/573]	eta 0:03:29 lr 0.000661	time 0.8772 (0.8821)	loss 0.5856 (0.5819)	grad_norm 1.5546 (3.1525)	mem 20675MB
[2025-04-03 00:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][338/573]	eta 0:03:27 lr 0.000663	time 0.8775 (0.8821)	loss 0.5130 (0.5814)	grad_norm 2.9187 (3.1488)	mem 20675MB
[2025-04-03 00:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][340/573]	eta 0:03:25 lr 0.000664	time 0.8774 (0.8821)	loss 0.4971 (0.5813)	grad_norm 2.2712 (3.1500)	mem 20675MB
[2025-04-03 00:43:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][342/573]	eta 0:03:23 lr 0.000665	time 0.8773 (0.8821)	loss 0.6430 (0.5817)	grad_norm 2.1259 (3.1436)	mem 20675MB
[2025-04-03 00:44:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][344/573]	eta 0:03:21 lr 0.000667	time 0.8771 (0.8821)	loss 0.5732 (0.5817)	grad_norm 4.2570 (3.1550)	mem 20675MB
[2025-04-03 00:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][346/573]	eta 0:03:20 lr 0.000668	time 0.8770 (0.8820)	loss 0.6260 (0.5818)	grad_norm 3.3461 (3.1509)	mem 20675MB
[2025-04-03 00:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][348/573]	eta 0:03:18 lr 0.000670	time 0.8770 (0.8820)	loss 0.5896 (0.5820)	grad_norm 2.0104 (3.1523)	mem 20675MB
[2025-04-03 00:44:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][350/573]	eta 0:03:16 lr 0.000671	time 0.8776 (0.8820)	loss 0.6801 (0.5824)	grad_norm 4.7291 (3.1528)	mem 20675MB
[2025-04-03 00:44:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][352/573]	eta 0:03:14 lr 0.000673	time 0.8785 (0.8820)	loss 0.5840 (0.5826)	grad_norm 1.7655 (3.1455)	mem 20675MB
[2025-04-03 00:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][354/573]	eta 0:03:13 lr 0.000674	time 0.8775 (0.8819)	loss 0.5875 (0.5827)	grad_norm 1.9586 (3.1425)	mem 20675MB
[2025-04-03 00:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][356/573]	eta 0:03:11 lr 0.000676	time 0.8772 (0.8819)	loss 0.5905 (0.5829)	grad_norm 1.7588 (3.1343)	mem 20675MB
[2025-04-03 00:44:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][358/573]	eta 0:03:09 lr 0.000677	time 0.8775 (0.8819)	loss 0.6002 (0.5829)	grad_norm 1.9210 (3.1277)	mem 20675MB
[2025-04-03 00:44:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][360/573]	eta 0:03:07 lr 0.000679	time 0.8773 (0.8819)	loss 0.5232 (0.5829)	grad_norm 2.9823 (3.1307)	mem 20675MB
[2025-04-03 00:44:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][362/573]	eta 0:03:06 lr 0.000680	time 0.8774 (0.8819)	loss 0.5136 (0.5824)	grad_norm 2.9499 (3.1298)	mem 20675MB
[2025-04-03 00:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][364/573]	eta 0:03:04 lr 0.000681	time 0.8774 (0.8818)	loss 0.5597 (0.5824)	grad_norm 3.5546 (3.1308)	mem 20675MB
[2025-04-03 00:44:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][366/573]	eta 0:03:02 lr 0.000683	time 0.8770 (0.8818)	loss 0.5915 (0.5824)	grad_norm 3.6255 (3.1304)	mem 20675MB
[2025-04-03 00:44:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][368/573]	eta 0:03:00 lr 0.000684	time 0.8770 (0.8818)	loss 0.5046 (0.5823)	grad_norm 3.1638 (3.1377)	mem 20675MB
[2025-04-03 00:44:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][370/573]	eta 0:02:59 lr 0.000686	time 0.8776 (0.8818)	loss 0.5281 (0.5821)	grad_norm 3.8573 (3.1380)	mem 20675MB
[2025-04-03 00:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][372/573]	eta 0:02:57 lr 0.000687	time 0.8771 (0.8818)	loss 0.6037 (0.5823)	grad_norm 2.0239 (3.1383)	mem 20675MB
[2025-04-03 00:44:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][374/573]	eta 0:02:55 lr 0.000689	time 0.8774 (0.8817)	loss 0.5662 (0.5820)	grad_norm 1.7851 (3.1335)	mem 20675MB
[2025-04-03 00:44:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][376/573]	eta 0:02:53 lr 0.000690	time 0.8773 (0.8817)	loss 0.5130 (0.5815)	grad_norm 2.3685 (3.1314)	mem 20675MB
[2025-04-03 00:44:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][378/573]	eta 0:02:51 lr 0.000692	time 0.8788 (0.8817)	loss 0.5958 (0.5812)	grad_norm 3.6947 (3.1343)	mem 20675MB
[2025-04-03 00:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][380/573]	eta 0:02:50 lr 0.000693	time 0.8775 (0.8817)	loss 0.4401 (0.5808)	grad_norm 3.5021 (3.1360)	mem 20675MB
[2025-04-03 00:44:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][382/573]	eta 0:02:48 lr 0.000695	time 0.8778 (0.8817)	loss 0.5823 (0.5809)	grad_norm 3.8914 (3.1525)	mem 20675MB
[2025-04-03 00:44:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][384/573]	eta 0:02:46 lr 0.000696	time 0.8775 (0.8817)	loss 0.6558 (0.5810)	grad_norm 5.8962 (3.1586)	mem 20675MB
[2025-04-03 00:44:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][386/573]	eta 0:02:44 lr 0.000697	time 0.8773 (0.8816)	loss 0.6702 (0.5810)	grad_norm 3.7064 (3.1638)	mem 20675MB
[2025-04-03 00:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][388/573]	eta 0:02:43 lr 0.000699	time 0.8772 (0.8816)	loss 0.6066 (0.5811)	grad_norm 3.4450 (3.1653)	mem 20675MB
[2025-04-03 00:44:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][390/573]	eta 0:02:41 lr 0.000700	time 0.8774 (0.8816)	loss 0.5866 (0.5812)	grad_norm 1.5022 (3.1573)	mem 20675MB
[2025-04-03 00:44:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][392/573]	eta 0:02:39 lr 0.000702	time 0.8772 (0.8816)	loss 0.6123 (0.5812)	grad_norm 1.3031 (3.1508)	mem 20675MB
[2025-04-03 00:44:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][394/573]	eta 0:02:37 lr 0.000703	time 0.8771 (0.8816)	loss 0.6103 (0.5813)	grad_norm 3.0238 (3.1470)	mem 20675MB
[2025-04-03 00:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][396/573]	eta 0:02:36 lr 0.000705	time 0.8775 (0.8816)	loss 0.5871 (0.5815)	grad_norm 2.3996 (3.1426)	mem 20675MB
[2025-04-03 00:44:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][398/573]	eta 0:02:34 lr 0.000706	time 0.8773 (0.8815)	loss 0.6224 (0.5816)	grad_norm 2.8928 (3.1408)	mem 20675MB
[2025-04-03 00:44:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][400/573]	eta 0:02:32 lr 0.000708	time 0.8772 (0.8815)	loss 0.6259 (0.5817)	grad_norm 1.7177 (3.1353)	mem 20675MB
[2025-04-03 00:44:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][402/573]	eta 0:02:30 lr 0.000709	time 0.8773 (0.8815)	loss 0.5478 (0.5816)	grad_norm 2.3430 (3.1311)	mem 20675MB
[2025-04-03 00:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][404/573]	eta 0:02:28 lr 0.000711	time 0.8771 (0.8815)	loss 0.5326 (0.5812)	grad_norm 3.0669 (3.1303)	mem 20675MB
[2025-04-03 00:44:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][406/573]	eta 0:02:27 lr 0.000712	time 0.8772 (0.8815)	loss 0.5851 (0.5811)	grad_norm 3.2562 (3.1295)	mem 20675MB
[2025-04-03 00:44:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][408/573]	eta 0:02:25 lr 0.000713	time 0.8778 (0.8815)	loss 0.6544 (0.5813)	grad_norm 3.0361 (3.1296)	mem 20675MB
[2025-04-03 00:44:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][410/573]	eta 0:02:23 lr 0.000715	time 0.8772 (0.8814)	loss 0.6338 (0.5812)	grad_norm 2.7545 (3.1286)	mem 20675MB
[2025-04-03 00:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][412/573]	eta 0:02:21 lr 0.000716	time 0.8769 (0.8814)	loss 0.4706 (0.5810)	grad_norm 2.7709 (3.1252)	mem 20675MB
[2025-04-03 00:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][414/573]	eta 0:02:20 lr 0.000718	time 0.8772 (0.8814)	loss 0.4961 (0.5809)	grad_norm 2.4471 (3.1254)	mem 20675MB
[2025-04-03 00:45:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][416/573]	eta 0:02:18 lr 0.000719	time 0.8773 (0.8814)	loss 0.6332 (0.5811)	grad_norm 2.5240 (3.1199)	mem 20675MB
[2025-04-03 00:45:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][418/573]	eta 0:02:16 lr 0.000721	time 0.8771 (0.8814)	loss 0.5133 (0.5810)	grad_norm 2.7961 (3.1173)	mem 20675MB
[2025-04-03 00:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][420/573]	eta 0:02:14 lr 0.000722	time 0.8770 (0.8814)	loss 0.5136 (0.5806)	grad_norm 2.7036 (3.1164)	mem 20675MB
[2025-04-03 00:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][422/573]	eta 0:02:13 lr 0.000724	time 0.8773 (0.8813)	loss 0.5896 (0.5807)	grad_norm 3.0722 (3.1139)	mem 20675MB
[2025-04-03 00:45:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][424/573]	eta 0:02:11 lr 0.000725	time 0.8773 (0.8813)	loss 0.5604 (0.5804)	grad_norm 2.2229 (3.1175)	mem 20675MB
[2025-04-03 00:45:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][426/573]	eta 0:02:09 lr 0.000727	time 0.8774 (0.8813)	loss 0.6340 (0.5806)	grad_norm 4.5217 (3.1206)	mem 20675MB
[2025-04-03 00:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][428/573]	eta 0:02:07 lr 0.000728	time 0.8770 (0.8813)	loss 0.5579 (0.5805)	grad_norm 3.8842 (3.1205)	mem 20675MB
[2025-04-03 00:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][430/573]	eta 0:02:06 lr 0.000729	time 0.8772 (0.8813)	loss 0.5821 (0.5806)	grad_norm 2.6671 (3.1160)	mem 20675MB
[2025-04-03 00:45:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][432/573]	eta 0:02:04 lr 0.000731	time 0.8770 (0.8813)	loss 0.5784 (0.5803)	grad_norm 1.6192 (3.1111)	mem 20675MB
[2025-04-03 00:45:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][434/573]	eta 0:02:02 lr 0.000732	time 0.8773 (0.8813)	loss 0.6448 (0.5805)	grad_norm 3.3203 (3.1088)	mem 20675MB
[2025-04-03 00:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][436/573]	eta 0:02:00 lr 0.000734	time 0.8772 (0.8812)	loss 0.6277 (0.5807)	grad_norm 2.2468 (3.1036)	mem 20675MB
[2025-04-03 00:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][438/573]	eta 0:01:58 lr 0.000735	time 0.8775 (0.8812)	loss 0.6090 (0.5808)	grad_norm 2.9983 (3.1017)	mem 20675MB
[2025-04-03 00:45:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][440/573]	eta 0:01:57 lr 0.000737	time 0.8774 (0.8812)	loss 0.6067 (0.5810)	grad_norm 2.0086 (3.0957)	mem 20675MB
[2025-04-03 00:45:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][442/573]	eta 0:01:55 lr 0.000738	time 0.8773 (0.8812)	loss 0.4786 (0.5808)	grad_norm 2.5913 (3.0917)	mem 20675MB
[2025-04-03 00:45:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][444/573]	eta 0:01:53 lr 0.000740	time 0.8773 (0.8812)	loss 0.5188 (0.5804)	grad_norm 3.0400 (3.0925)	mem 20675MB
[2025-04-03 00:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][446/573]	eta 0:01:51 lr 0.000741	time 0.8774 (0.8812)	loss 0.6550 (0.5805)	grad_norm 3.0742 (3.0970)	mem 20675MB
[2025-04-03 00:45:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][448/573]	eta 0:01:50 lr 0.000743	time 0.8772 (0.8812)	loss 0.6009 (0.5807)	grad_norm 2.9534 (3.0957)	mem 20675MB
[2025-04-03 00:45:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][450/573]	eta 0:01:48 lr 0.000744	time 0.8770 (0.8811)	loss 0.5013 (0.5803)	grad_norm 2.5661 (3.0939)	mem 20675MB
[2025-04-03 00:45:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][452/573]	eta 0:01:46 lr 0.000745	time 0.8771 (0.8811)	loss 0.5760 (0.5800)	grad_norm 2.1122 (3.0904)	mem 20675MB
[2025-04-03 00:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][454/573]	eta 0:01:44 lr 0.000747	time 0.8770 (0.8811)	loss 0.5262 (0.5799)	grad_norm 2.2633 (3.0869)	mem 20675MB
[2025-04-03 00:45:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][456/573]	eta 0:01:43 lr 0.000748	time 0.8785 (0.8811)	loss 0.4921 (0.5798)	grad_norm 4.9963 (3.0886)	mem 20675MB
[2025-04-03 00:45:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][458/573]	eta 0:01:41 lr 0.000750	time 0.8774 (0.8811)	loss 0.5810 (0.5798)	grad_norm 3.4041 (3.0922)	mem 20675MB
[2025-04-03 00:45:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][460/573]	eta 0:01:39 lr 0.000751	time 0.8774 (0.8811)	loss 0.5946 (0.5796)	grad_norm 3.6850 (3.0964)	mem 20675MB
[2025-04-03 00:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][462/573]	eta 0:01:37 lr 0.000753	time 0.8773 (0.8811)	loss 0.6199 (0.5798)	grad_norm 2.5372 (3.0960)	mem 20675MB
[2025-04-03 00:45:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][464/573]	eta 0:01:36 lr 0.000754	time 0.8776 (0.8811)	loss 0.5957 (0.5800)	grad_norm 2.1584 (3.0976)	mem 20675MB
[2025-04-03 00:45:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][466/573]	eta 0:01:34 lr 0.000756	time 0.8775 (0.8810)	loss 0.5520 (0.5801)	grad_norm 3.5203 (3.0992)	mem 20675MB
[2025-04-03 00:45:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][468/573]	eta 0:01:32 lr 0.000757	time 0.8773 (0.8810)	loss 0.6047 (0.5802)	grad_norm 2.0004 (3.0940)	mem 20675MB
[2025-04-03 00:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][470/573]	eta 0:01:30 lr 0.000759	time 0.8777 (0.8810)	loss 0.5312 (0.5801)	grad_norm 2.6416 (3.0893)	mem 20675MB
[2025-04-03 00:45:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][472/573]	eta 0:01:28 lr 0.000760	time 0.8771 (0.8810)	loss 0.5700 (0.5800)	grad_norm 1.6788 (3.0837)	mem 20675MB
[2025-04-03 00:45:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][474/573]	eta 0:01:27 lr 0.000761	time 0.8772 (0.8810)	loss 0.5362 (0.5797)	grad_norm 2.8685 (3.0831)	mem 20675MB
[2025-04-03 00:45:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][476/573]	eta 0:01:25 lr 0.000763	time 0.8771 (0.8810)	loss 0.6555 (0.5799)	grad_norm 3.1852 (3.0821)	mem 20675MB
[2025-04-03 00:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][478/573]	eta 0:01:23 lr 0.000764	time 0.8771 (0.8810)	loss 0.6087 (0.5798)	grad_norm 3.0559 (3.0824)	mem 20675MB
[2025-04-03 00:46:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][480/573]	eta 0:01:21 lr 0.000766	time 0.8771 (0.8810)	loss 0.5766 (0.5799)	grad_norm 2.0669 (3.0796)	mem 20675MB
[2025-04-03 00:46:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][482/573]	eta 0:01:20 lr 0.000767	time 0.8773 (0.8810)	loss 0.5588 (0.5799)	grad_norm 3.0050 (3.0794)	mem 20675MB
[2025-04-03 00:46:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][484/573]	eta 0:01:18 lr 0.000769	time 0.8772 (0.8809)	loss 0.6060 (0.5801)	grad_norm 1.5821 (3.0742)	mem 20675MB
[2025-04-03 00:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][486/573]	eta 0:01:16 lr 0.000770	time 0.8772 (0.8809)	loss 0.5844 (0.5801)	grad_norm 2.4428 (3.0700)	mem 20675MB
[2025-04-03 00:46:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][488/573]	eta 0:01:14 lr 0.000772	time 0.8774 (0.8809)	loss 0.5525 (0.5799)	grad_norm 2.5490 (3.0702)	mem 20675MB
[2025-04-03 00:46:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][490/573]	eta 0:01:13 lr 0.000773	time 0.8773 (0.8809)	loss 0.6184 (0.5800)	grad_norm 2.5281 (3.0659)	mem 20675MB
[2025-04-03 00:46:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][492/573]	eta 0:01:11 lr 0.000775	time 0.8775 (0.8809)	loss 0.4935 (0.5799)	grad_norm 3.8135 (3.0659)	mem 20675MB
[2025-04-03 00:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][494/573]	eta 0:01:09 lr 0.000776	time 0.8771 (0.8809)	loss 0.5957 (0.5799)	grad_norm 2.0663 (3.0643)	mem 20675MB
[2025-04-03 00:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][496/573]	eta 0:01:07 lr 0.000777	time 0.8778 (0.8809)	loss 0.4962 (0.5795)	grad_norm 5.0818 (3.0683)	mem 20675MB
[2025-04-03 00:46:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][498/573]	eta 0:01:06 lr 0.000779	time 0.8772 (0.8809)	loss 0.4744 (0.5794)	grad_norm 4.4206 (3.0701)	mem 20675MB
[2025-04-03 00:46:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][500/573]	eta 0:01:04 lr 0.000780	time 0.8774 (0.8809)	loss 0.5704 (0.5793)	grad_norm 2.3000 (3.0718)	mem 20675MB
[2025-04-03 00:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][502/573]	eta 0:01:02 lr 0.000782	time 0.8770 (0.8808)	loss 0.5073 (0.5790)	grad_norm 3.3530 (3.0770)	mem 20675MB
[2025-04-03 00:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][504/573]	eta 0:01:00 lr 0.000783	time 0.8780 (0.8808)	loss 0.5928 (0.5790)	grad_norm 2.5710 (3.0738)	mem 20675MB
[2025-04-03 00:46:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][506/573]	eta 0:00:59 lr 0.000785	time 0.8776 (0.8808)	loss 0.6010 (0.5792)	grad_norm 2.4313 (3.0788)	mem 20675MB
[2025-04-03 00:46:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][508/573]	eta 0:00:57 lr 0.000786	time 0.8772 (0.8808)	loss 0.6015 (0.5791)	grad_norm 1.9780 (3.0780)	mem 20675MB
[2025-04-03 00:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][510/573]	eta 0:00:55 lr 0.000788	time 0.8774 (0.8808)	loss 0.5290 (0.5791)	grad_norm 1.7659 (3.0749)	mem 20675MB
[2025-04-03 00:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][512/573]	eta 0:00:53 lr 0.000789	time 0.8775 (0.8808)	loss 0.5867 (0.5788)	grad_norm 4.1422 (3.0761)	mem 20675MB
[2025-04-03 00:46:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][514/573]	eta 0:00:51 lr 0.000791	time 0.8773 (0.8808)	loss 0.4426 (0.5786)	grad_norm 3.3089 (3.0747)	mem 20675MB
[2025-04-03 00:46:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][516/573]	eta 0:00:50 lr 0.000792	time 0.8776 (0.8808)	loss 0.6541 (0.5787)	grad_norm 3.4978 (3.0747)	mem 20675MB
[2025-04-03 00:46:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][518/573]	eta 0:00:48 lr 0.000793	time 0.8775 (0.8808)	loss 0.5140 (0.5786)	grad_norm 3.6779 (3.0750)	mem 20675MB
[2025-04-03 00:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][520/573]	eta 0:00:46 lr 0.000795	time 0.8773 (0.8808)	loss 0.5362 (0.5785)	grad_norm 3.3102 (3.0737)	mem 20675MB
[2025-04-03 00:46:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][522/573]	eta 0:00:44 lr 0.000796	time 0.8773 (0.8807)	loss 0.6616 (0.5785)	grad_norm 4.0416 (3.0775)	mem 20675MB
[2025-04-03 00:46:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][524/573]	eta 0:00:43 lr 0.000798	time 0.8779 (0.8807)	loss 0.4511 (0.5785)	grad_norm 2.8217 (3.0772)	mem 20675MB
[2025-04-03 00:46:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][526/573]	eta 0:00:41 lr 0.000799	time 0.8777 (0.8807)	loss 0.5540 (0.5782)	grad_norm 2.3678 (3.0752)	mem 20675MB
[2025-04-03 00:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][528/573]	eta 0:00:39 lr 0.000801	time 0.8776 (0.8807)	loss 0.5165 (0.5782)	grad_norm 2.2234 (3.0720)	mem 20675MB
[2025-04-03 00:46:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][530/573]	eta 0:00:37 lr 0.000802	time 0.8775 (0.8807)	loss 0.5982 (0.5783)	grad_norm 3.6164 (3.0714)	mem 20675MB
[2025-04-03 00:46:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][532/573]	eta 0:00:36 lr 0.000804	time 0.8773 (0.8807)	loss 0.5618 (0.5783)	grad_norm 1.8178 (3.0705)	mem 20675MB
[2025-04-03 00:46:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][534/573]	eta 0:00:34 lr 0.000805	time 0.8776 (0.8807)	loss 0.5236 (0.5783)	grad_norm 2.2866 (3.0690)	mem 20675MB
[2025-04-03 00:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][536/573]	eta 0:00:32 lr 0.000807	time 0.8775 (0.8807)	loss 0.5928 (0.5781)	grad_norm 2.5931 (3.0690)	mem 20675MB
[2025-04-03 00:46:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][538/573]	eta 0:00:30 lr 0.000808	time 0.8773 (0.8807)	loss 0.6452 (0.5783)	grad_norm 4.3443 (3.0738)	mem 20675MB
[2025-04-03 00:46:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][540/573]	eta 0:00:29 lr 0.000809	time 0.8772 (0.8807)	loss 0.5615 (0.5782)	grad_norm 1.9779 (3.0699)	mem 20675MB
[2025-04-03 00:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][542/573]	eta 0:00:27 lr 0.000811	time 0.8771 (0.8807)	loss 0.5913 (0.5784)	grad_norm 1.5543 (3.0669)	mem 20675MB
[2025-04-03 00:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][544/573]	eta 0:00:25 lr 0.000812	time 0.8776 (0.8807)	loss 0.6570 (0.5786)	grad_norm 3.5210 (3.0667)	mem 20675MB
[2025-04-03 00:46:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][546/573]	eta 0:00:23 lr 0.000814	time 0.8770 (0.8806)	loss 0.5624 (0.5786)	grad_norm 3.8152 (3.0672)	mem 20675MB
[2025-04-03 00:47:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][548/573]	eta 0:00:22 lr 0.000815	time 0.8773 (0.8806)	loss 0.5671 (0.5785)	grad_norm 2.3919 (3.0694)	mem 20675MB
[2025-04-03 00:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][550/573]	eta 0:00:20 lr 0.000817	time 0.8773 (0.8806)	loss 0.5310 (0.5784)	grad_norm 2.8396 (3.0692)	mem 20675MB
[2025-04-03 00:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][552/573]	eta 0:00:18 lr 0.000818	time 0.8774 (0.8806)	loss 0.6089 (0.5785)	grad_norm 4.1406 (3.0696)	mem 20675MB
[2025-04-03 00:47:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][554/573]	eta 0:00:16 lr 0.000820	time 0.8776 (0.8806)	loss 0.5599 (0.5784)	grad_norm 3.1163 (3.0684)	mem 20675MB
[2025-04-03 00:47:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][556/573]	eta 0:00:14 lr 0.000821	time 0.8772 (0.8806)	loss 0.5615 (0.5785)	grad_norm 2.6121 (3.0723)	mem 20675MB
[2025-04-03 00:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][558/573]	eta 0:00:13 lr 0.000823	time 0.8774 (0.8806)	loss 0.4562 (0.5783)	grad_norm 3.4837 (3.0732)	mem 20675MB
[2025-04-03 00:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][560/573]	eta 0:00:11 lr 0.000824	time 0.8771 (0.8806)	loss 0.5393 (0.5781)	grad_norm 3.5682 (3.0752)	mem 20675MB
[2025-04-03 00:47:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][562/573]	eta 0:00:09 lr 0.000825	time 0.8789 (0.8806)	loss 0.6414 (0.5782)	grad_norm 2.5068 (3.0744)	mem 20675MB
[2025-04-03 00:47:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][564/573]	eta 0:00:07 lr 0.000827	time 0.8769 (0.8806)	loss 0.6330 (0.5781)	grad_norm 4.6442 (3.0770)	mem 20675MB
[2025-04-03 00:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][566/573]	eta 0:00:06 lr 0.000828	time 0.8768 (0.8806)	loss 0.6464 (0.5783)	grad_norm 3.0852 (3.0771)	mem 20675MB
[2025-04-03 00:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][568/573]	eta 0:00:04 lr 0.000830	time 0.8769 (0.8805)	loss 0.5721 (0.5781)	grad_norm 1.9492 (3.0756)	mem 20675MB
[2025-04-03 00:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][570/573]	eta 0:00:02 lr 0.000831	time 0.8768 (0.8805)	loss 0.6247 (0.5782)	grad_norm 1.5772 (3.0706)	mem 20675MB
[2025-04-03 00:47:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][572/573]	eta 0:00:00 lr 0.000833	time 0.8775 (0.8805)	loss 0.5356 (0.5781)	grad_norm 2.8791 (3.0692)	mem 20675MB
[2025-04-03 00:47:21 simmim_finetune] (main_finetune.py 260): INFO EPOCH 1 training takes 0:08:24
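Every metric in the train lines above is printed as `latest (running average)` — e.g. `time 0.8775 (0.8805)` means the last step took 0.8775 s while the epoch average is 0.8805 s. This is the standard meter pattern used in Swin/SimMIM-style training scripts; the sketch below is a minimal illustrative version (the class name follows the common `AverageMeter` convention, not necessarily this script's exact implementation):

```python
class AverageMeter:
    """Tracks the most recent value and the running average, printed as `val (avg)`."""

    def __init__(self):
        self.val = 0.0   # most recent measurement
        self.sum = 0.0   # total across all updates
        self.count = 0   # number of samples seen

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n


# Two step times whose mean is 0.8805, mirroring the log format.
meter = AverageMeter()
for t in (0.8773, 0.8837):
    meter.update(t)
print(f"time {meter.val:.4f} ({meter.avg:.4f})")  # -> time 0.8837 (0.8805)
```

The same pattern backs the `loss`, `grad_norm`, and `time` columns, which is why the parenthesized values move slowly while the bare values jump from step to step.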
[2025-04-03 00:47:23 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.843 (1.843)	Loss 0.6794 (0.6794)	Acc@1 63.281 (63.281)	Mem 20675MB
[2025-04-03 00:47:24 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.804)	Loss 0.6123 (0.6345)	Acc@1 69.531 (66.927)	Mem 20675MB
[2025-04-03 00:47:24 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.596)	Loss 0.6605 (0.6451)	Acc@1 66.406 (65.312)	Mem 20675MB
[2025-04-03 00:47:25 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.507)	Loss 0.6988 (0.6492)	Acc@1 60.156 (65.290)	Mem 20675MB
[2025-04-03 00:47:25 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.284 (0.458)	Loss 0.5958 (0.6278)	Acc@1 67.969 (66.580)	Mem 20675MB
[2025-04-03 00:47:26 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.426)	Loss 0.4323 (0.5977)	Acc@1 78.906 (68.111)	Mem 20675MB
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.404)	Loss 0.4804 (0.5806)	Acc@1 77.344 (69.231)	Mem 20675MB
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.388)	Loss 0.4833 (0.5657)	Acc@1 75.781 (70.156)	Mem 20675MB
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 70.413
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 70.4%
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 70.41%
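The final `Acc@1 70.413` is a sample-weighted average of the per-batch accuracies: 1984 test images at `BATCH_SIZE: 128` gives 15 full batches plus a final batch of 64, so the last batch contributes half the weight of the others. A small sketch of this aggregation (illustrative, with made-up batch accuracies rather than the logged ones):

```python
def aggregate_acc(batch_accs, batch_sizes):
    """Combine per-batch top-1 accuracies (in percent) into an epoch accuracy,
    weighting each batch by its number of samples."""
    total = sum(batch_sizes)
    correct = sum(acc / 100.0 * n for acc, n in zip(batch_accs, batch_sizes))
    return 100.0 * correct / total


# 1984 test images = 15 full batches of 128 plus a final batch of 64.
sizes = [128] * 15 + [64]
assert sum(sizes) == 1984

# Hypothetical per-batch accuracies just to show the weighting effect:
print(aggregate_acc([50.0, 100.0], [128, 64]))  # -> 66.66..., not the naive 75.0
```

This weighting is why the parenthesized running accuracy in the `Test:` lines is not a simple mean of the bare per-batch values once the uneven final batch is included.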
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.161795189947956e-06, 3.161795189947956e-06, 4.819350187068095e-06, 4.819350187068095e-06, 7.369434798022155e-06, 7.369434798022155e-06, 1.1292641891797632e-05, 1.1292641891797632e-05, 1.7328345112990673e-05, 1.7328345112990673e-05, 2.6614042376364583e-05, 2.6614042376364583e-05, 4.0899730473862905e-05, 4.0899730473862905e-05, 6.287771216232186e-05, 6.287771216232186e-05, 9.668999168302794e-05, 9.668999168302794e-05, 0.000148708883253345, 0.000148708883253345, 0.0002287379472076789, 0.0002287379472076789, 0.00035185958406050026, 0.00035185958406050026, 0.0005412774869109949, 0.0005412774869109949, 0.0008326896451425248, 0.0008326896451425248]
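The 28 per-group learning rates above are 14 distinct values, each appearing twice (once for weight-decayed parameters, once for the no-weight-decay group such as biases and norms). They come from layer-wise LR decay (`LAYER_DECAY: 0.65` with `DEPTH: 12` in the config): the head gets the full scheduled rate and each layer toward the input is scaled by another factor of 0.65. A minimal sketch of the scale computation (the function name is illustrative, not the script's actual API; the logged values are only approximately geometric because each group runs through its own warmup/min-LR schedule):

```python
def layer_scales(num_layers: int, decay: float):
    """LR scale per depth group: the deepest group (head side) gets 1.0,
    and every step toward the patch embedding multiplies by `decay`."""
    # num_layers + 2 groups: patch embedding, the transformer blocks, the head.
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]


scales = layer_scales(12, 0.65)
# 14 scales spanning roughly three orders of magnitude (0.65**13 ~ 0.0037 up to 1.0),
# matching the spread from ~3.2e-06 to ~8.3e-04 in the log line above.
```

Multiplying the current scheduled base rate by each scale reproduces the shape of the logged list: early ViT blocks barely move while the classification head trains at nearly the full rate.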
[2025-04-03 00:47:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][0/573]	eta 0:21:24 lr 0.000833	time 2.2410 (2.2410)	loss 0.6256 (0.6256)	grad_norm 2.4486 (2.4486)	mem 20675MB
[2025-04-03 00:47:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][2/573]	eta 0:12:40 lr 0.000835	time 0.8773 (1.3325)	loss 0.5078 (0.5646)	grad_norm 3.5614 (2.7590)	mem 20675MB
[2025-04-03 00:47:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][4/573]	eta 0:10:54 lr 0.000836	time 0.8770 (1.1507)	loss 0.6435 (0.5688)	grad_norm 3.9350 (2.9714)	mem 20675MB
[2025-04-03 00:47:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][6/573]	eta 0:10:08 lr 0.000838	time 0.8772 (1.0727)	loss 0.6635 (0.5894)	grad_norm 3.2445 (3.1097)	mem 20675MB
[2025-04-03 00:47:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][8/573]	eta 0:09:41 lr 0.000839	time 0.8772 (1.0295)	loss 0.6114 (0.5789)	grad_norm 3.0467 (3.2045)	mem 20675MB
[2025-04-03 00:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][10/573]	eta 0:09:24 lr 0.000841	time 0.8773 (1.0019)	loss 0.5898 (0.5845)	grad_norm 1.6195 (2.9647)	mem 20675MB
[2025-04-03 00:47:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][12/573]	eta 0:09:11 lr 0.000842	time 0.8772 (0.9829)	loss 0.6334 (0.5797)	grad_norm 3.3613 (3.0217)	mem 20675MB
[2025-04-03 00:47:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][14/573]	eta 0:09:01 lr 0.000844	time 0.8779 (0.9689)	loss 0.6157 (0.5757)	grad_norm 2.3161 (2.9407)	mem 20675MB
[2025-04-03 00:47:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][16/573]	eta 0:08:53 lr 0.000845	time 0.8772 (0.9583)	loss 0.5383 (0.5725)	grad_norm 4.1326 (2.9751)	mem 20675MB
[2025-04-03 00:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][18/573]	eta 0:08:47 lr 0.000847	time 0.8771 (0.9498)	loss 0.5400 (0.5634)	grad_norm 2.8767 (2.9980)	mem 20675MB
[2025-04-03 00:47:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][20/573]	eta 0:08:41 lr 0.000848	time 0.8773 (0.9430)	loss 0.6127 (0.5710)	grad_norm 4.0506 (3.1068)	mem 20675MB
[2025-04-03 00:47:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][22/573]	eta 0:08:36 lr 0.000849	time 0.8770 (0.9374)	loss 0.6335 (0.5760)	grad_norm 3.2527 (3.1578)	mem 20675MB
[2025-04-03 00:47:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][24/573]	eta 0:08:31 lr 0.000851	time 0.8769 (0.9326)	loss 0.5662 (0.5794)	grad_norm 3.5381 (3.2164)	mem 20675MB
[2025-04-03 00:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][26/573]	eta 0:08:27 lr 0.000852	time 0.8776 (0.9286)	loss 0.5793 (0.5783)	grad_norm 2.0233 (3.1499)	mem 20675MB
[2025-04-03 00:47:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][28/573]	eta 0:08:24 lr 0.000854	time 0.8772 (0.9251)	loss 0.6454 (0.5832)	grad_norm 3.6654 (3.1377)	mem 20675MB
[2025-04-03 00:47:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][30/573]	eta 0:08:20 lr 0.000855	time 0.8772 (0.9220)	loss 0.6545 (0.5873)	grad_norm 1.7058 (3.0481)	mem 20675MB
[2025-04-03 00:47:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][32/573]	eta 0:08:17 lr 0.000857	time 0.8771 (0.9194)	loss 0.6045 (0.5882)	grad_norm 2.4019 (2.9916)	mem 20675MB
[2025-04-03 00:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][34/573]	eta 0:08:14 lr 0.000858	time 0.8772 (0.9170)	loss 0.5285 (0.5873)	grad_norm 2.4071 (2.9601)	mem 20675MB
[2025-04-03 00:48:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][36/573]	eta 0:08:11 lr 0.000860	time 0.8775 (0.9149)	loss 0.6487 (0.5910)	grad_norm 3.7343 (2.9805)	mem 20675MB
[2025-04-03 00:48:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][38/573]	eta 0:08:08 lr 0.000861	time 0.8774 (0.9130)	loss 0.6053 (0.5913)	grad_norm 2.0322 (2.9460)	mem 20675MB
[2025-04-03 00:48:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][40/573]	eta 0:08:05 lr 0.000862	time 0.8771 (0.9113)	loss 0.5897 (0.5903)	grad_norm 1.3965 (2.9167)	mem 20675MB
[2025-04-03 00:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][42/573]	eta 0:08:03 lr 0.000864	time 0.8771 (0.9098)	loss 0.5772 (0.5901)	grad_norm 2.3717 (2.8830)	mem 20675MB
[2025-04-03 00:48:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][44/573]	eta 0:08:00 lr 0.000865	time 0.8771 (0.9083)	loss 0.4677 (0.5879)	grad_norm 3.3395 (2.8799)	mem 20675MB
[2025-04-03 00:48:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][46/573]	eta 0:07:58 lr 0.000867	time 0.8771 (0.9071)	loss 0.5580 (0.5857)	grad_norm 3.2371 (2.8832)	mem 20675MB
[2025-04-03 00:48:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][48/573]	eta 0:07:55 lr 0.000868	time 0.8773 (0.9059)	loss 0.4913 (0.5829)	grad_norm 4.9067 (2.9575)	mem 20675MB
[2025-04-03 00:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][50/573]	eta 0:07:53 lr 0.000870	time 0.8774 (0.9048)	loss 0.5820 (0.5832)	grad_norm 3.7925 (3.0292)	mem 20675MB
[2025-04-03 00:48:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][52/573]	eta 0:07:50 lr 0.000871	time 0.8772 (0.9038)	loss 0.5759 (0.5840)	grad_norm 2.3435 (3.0245)	mem 20675MB
[2025-04-03 00:48:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][54/573]	eta 0:07:48 lr 0.000873	time 0.8772 (0.9028)	loss 0.6104 (0.5853)	grad_norm 2.1725 (3.0042)	mem 20675MB
[2025-04-03 00:48:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][56/573]	eta 0:07:46 lr 0.000874	time 0.8773 (0.9020)	loss 0.6874 (0.5854)	grad_norm 2.6922 (3.0206)	mem 20675MB
[2025-04-03 00:48:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][58/573]	eta 0:07:44 lr 0.000876	time 0.8772 (0.9011)	loss 0.6177 (0.5864)	grad_norm 1.3802 (3.0200)	mem 20675MB
[2025-04-03 00:48:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][60/573]	eta 0:07:41 lr 0.000877	time 0.8773 (0.9004)	loss 0.6511 (0.5878)	grad_norm 5.9166 (3.0527)	mem 20675MB
[2025-04-03 00:48:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][62/573]	eta 0:07:39 lr 0.000878	time 0.8773 (0.8997)	loss 0.5503 (0.5871)	grad_norm 3.1438 (3.0551)	mem 20675MB
[2025-04-03 00:48:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][64/573]	eta 0:07:37 lr 0.000880	time 0.8776 (0.8990)	loss 0.5200 (0.5861)	grad_norm 3.1513 (3.0442)	mem 20675MB
[2025-04-03 00:48:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][66/573]	eta 0:07:35 lr 0.000881	time 0.8773 (0.8984)	loss 0.5839 (0.5872)	grad_norm 2.5281 (3.0639)	mem 20675MB
[2025-04-03 00:48:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][68/573]	eta 0:07:33 lr 0.000883	time 0.8775 (0.8978)	loss 0.6622 (0.5883)	grad_norm 3.3262 (3.0615)	mem 20675MB
[2025-04-03 00:48:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][70/573]	eta 0:07:31 lr 0.000884	time 0.8774 (0.8973)	loss 0.7093 (0.5885)	grad_norm 6.0812 (3.1042)	mem 20675MB
[2025-04-03 00:48:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][72/573]	eta 0:07:29 lr 0.000886	time 0.8770 (0.8967)	loss 0.4068 (0.5851)	grad_norm 2.7304 (3.0992)	mem 20675MB
[2025-04-03 00:48:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][74/573]	eta 0:07:27 lr 0.000887	time 0.8774 (0.8962)	loss 0.4582 (0.5837)	grad_norm 3.7903 (3.1096)	mem 20675MB
[2025-04-03 00:48:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][76/573]	eta 0:07:25 lr 0.000889	time 0.8771 (0.8958)	loss 0.6051 (0.5838)	grad_norm 3.4322 (3.0971)	mem 20675MB
[2025-04-03 00:48:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][78/573]	eta 0:07:23 lr 0.000890	time 0.8771 (0.8953)	loss 0.5913 (0.5840)	grad_norm 2.2922 (3.0855)	mem 20675MB
[2025-04-03 00:48:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][80/573]	eta 0:07:21 lr 0.000892	time 0.8770 (0.8949)	loss 0.5940 (0.5834)	grad_norm 4.6699 (3.1093)	mem 20675MB
[2025-04-03 00:48:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][82/573]	eta 0:07:19 lr 0.000893	time 0.8774 (0.8945)	loss 0.5548 (0.5818)	grad_norm 2.4018 (3.1010)	mem 20675MB
[2025-04-03 00:48:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][84/573]	eta 0:07:17 lr 0.000894	time 0.8770 (0.8941)	loss 0.6117 (0.5831)	grad_norm 3.8168 (3.1318)	mem 20675MB
[2025-04-03 00:48:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][86/573]	eta 0:07:15 lr 0.000896	time 0.8773 (0.8937)	loss 0.6662 (0.5829)	grad_norm 2.9354 (3.1273)	mem 20675MB
[2025-04-03 00:48:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][88/573]	eta 0:07:13 lr 0.000897	time 0.8771 (0.8934)	loss 0.6085 (0.5830)	grad_norm 4.0708 (3.1482)	mem 20675MB
[2025-04-03 00:48:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][90/573]	eta 0:07:11 lr 0.000899	time 0.8770 (0.8930)	loss 0.5757 (0.5819)	grad_norm 2.7522 (3.1417)	mem 20675MB
[2025-04-03 00:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][92/573]	eta 0:07:09 lr 0.000900	time 0.8772 (0.8927)	loss 0.5975 (0.5826)	grad_norm 3.9546 (3.1598)	mem 20675MB
[2025-04-03 00:48:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][94/573]	eta 0:07:07 lr 0.000902	time 0.8788 (0.8924)	loss 0.6358 (0.5825)	grad_norm 3.5885 (3.1573)	mem 20675MB
[2025-04-03 00:48:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][96/573]	eta 0:07:05 lr 0.000903	time 0.8773 (0.8921)	loss 0.5713 (0.5829)	grad_norm 2.4174 (3.1483)	mem 20675MB
[2025-04-03 00:48:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][98/573]	eta 0:07:03 lr 0.000905	time 0.8770 (0.8918)	loss 0.4677 (0.5814)	grad_norm 3.5505 (3.1459)	mem 20675MB
[2025-04-03 00:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][100/573]	eta 0:07:01 lr 0.000906	time 0.8770 (0.8916)	loss 0.5428 (0.5813)	grad_norm 2.2411 (3.1336)	mem 20675MB
[2025-04-03 00:48:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][102/573]	eta 0:06:59 lr 0.000908	time 0.8768 (0.8913)	loss 0.4117 (0.5786)	grad_norm 3.7724 (3.1309)	mem 20675MB
[2025-04-03 00:49:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][104/573]	eta 0:06:57 lr 0.000909	time 0.8772 (0.8910)	loss 0.6402 (0.5778)	grad_norm 4.6838 (3.1470)	mem 20675MB
[2025-04-03 00:49:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][106/573]	eta 0:06:55 lr 0.000910	time 0.8770 (0.8908)	loss 0.6158 (0.5785)	grad_norm 3.0120 (3.1714)	mem 20675MB
[2025-04-03 00:49:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][108/573]	eta 0:06:54 lr 0.000912	time 0.8773 (0.8906)	loss 0.5386 (0.5781)	grad_norm 2.6202 (3.1697)	mem 20675MB
[2025-04-03 00:49:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][110/573]	eta 0:06:52 lr 0.000913	time 0.8771 (0.8904)	loss 0.5222 (0.5774)	grad_norm 2.4288 (3.1567)	mem 20675MB
[2025-04-03 00:49:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][112/573]	eta 0:06:50 lr 0.000915	time 0.8772 (0.8901)	loss 0.6013 (0.5779)	grad_norm 3.8965 (3.1514)	mem 20675MB
[2025-04-03 00:49:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][114/573]	eta 0:06:48 lr 0.000916	time 0.8771 (0.8899)	loss 0.5017 (0.5773)	grad_norm 1.9555 (3.1346)	mem 20675MB
[2025-04-03 00:49:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][116/573]	eta 0:06:46 lr 0.000918	time 0.8770 (0.8897)	loss 0.5933 (0.5774)	grad_norm 3.2609 (3.1509)	mem 20675MB
[2025-04-03 00:49:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][118/573]	eta 0:06:44 lr 0.000919	time 0.8772 (0.8895)	loss 0.6379 (0.5781)	grad_norm 3.0258 (3.1384)	mem 20675MB
[2025-04-03 00:49:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][120/573]	eta 0:06:42 lr 0.000921	time 0.8771 (0.8893)	loss 0.6770 (0.5795)	grad_norm 7.0453 (3.1864)	mem 20675MB
[2025-04-03 00:49:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][122/573]	eta 0:06:41 lr 0.000922	time 0.8770 (0.8891)	loss 0.5818 (0.5791)	grad_norm 1.7752 (3.1710)	mem 20675MB
[2025-04-03 00:49:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][124/573]	eta 0:06:39 lr 0.000924	time 0.8772 (0.8890)	loss 0.5480 (0.5786)	grad_norm 3.9915 (3.1773)	mem 20675MB
[2025-04-03 00:49:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][126/573]	eta 0:06:37 lr 0.000925	time 0.8771 (0.8888)	loss 0.5470 (0.5784)	grad_norm 2.5854 (3.1910)	mem 20675MB
[2025-04-03 00:49:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][128/573]	eta 0:06:35 lr 0.000926	time 0.8773 (0.8886)	loss 0.6299 (0.5791)	grad_norm 2.5372 (3.1876)	mem 20675MB
[2025-04-03 00:49:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][130/573]	eta 0:06:33 lr 0.000928	time 0.8774 (0.8885)	loss 0.6201 (0.5802)	grad_norm 3.7940 (3.2024)	mem 20675MB
[2025-04-03 00:49:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][132/573]	eta 0:06:31 lr 0.000929	time 0.8778 (0.8883)	loss 0.5041 (0.5799)	grad_norm 2.6654 (3.2005)	mem 20675MB
[2025-04-03 00:49:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][134/573]	eta 0:06:29 lr 0.000931	time 0.8773 (0.8882)	loss 0.6156 (0.5802)	grad_norm 3.9356 (3.2000)	mem 20675MB
[2025-04-03 00:49:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][136/573]	eta 0:06:28 lr 0.000932	time 0.8770 (0.8880)	loss 0.6094 (0.5808)	grad_norm 2.0587 (3.1844)	mem 20675MB
[2025-04-03 00:49:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][138/573]	eta 0:06:26 lr 0.000934	time 0.8770 (0.8879)	loss 0.5888 (0.5804)	grad_norm 2.0137 (3.1738)	mem 20675MB
[2025-04-03 00:49:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][140/573]	eta 0:06:24 lr 0.000935	time 0.8770 (0.8877)	loss 0.5833 (0.5806)	grad_norm 1.8229 (3.1670)	mem 20675MB
[2025-04-03 00:49:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][142/573]	eta 0:06:22 lr 0.000937	time 0.8771 (0.8876)	loss 0.5283 (0.5806)	grad_norm 2.6711 (3.1538)	mem 20675MB
[2025-04-03 00:49:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][144/573]	eta 0:06:20 lr 0.000938	time 0.8773 (0.8875)	loss 0.5910 (0.5806)	grad_norm 2.9418 (3.1493)	mem 20675MB
[2025-04-03 00:49:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][146/573]	eta 0:06:18 lr 0.000940	time 0.8769 (0.8873)	loss 0.6368 (0.5801)	grad_norm 3.3089 (3.1484)	mem 20675MB
[2025-04-03 00:49:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][148/573]	eta 0:06:17 lr 0.000941	time 0.8774 (0.8872)	loss 0.6171 (0.5802)	grad_norm 3.4662 (3.1430)	mem 20675MB
[2025-04-03 00:49:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][150/573]	eta 0:06:15 lr 0.000942	time 0.8773 (0.8871)	loss 0.5023 (0.5800)	grad_norm 3.3135 (3.1553)	mem 20675MB
[2025-04-03 00:49:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][152/573]	eta 0:06:13 lr 0.000944	time 0.8770 (0.8870)	loss 0.6080 (0.5799)	grad_norm 3.1108 (3.1509)	mem 20675MB
[2025-04-03 00:49:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][154/573]	eta 0:06:11 lr 0.000945	time 0.8771 (0.8868)	loss 0.6054 (0.5803)	grad_norm 2.4832 (3.1478)	mem 20675MB
[2025-04-03 00:49:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][156/573]	eta 0:06:09 lr 0.000947	time 0.8774 (0.8867)	loss 0.6175 (0.5800)	grad_norm 2.7044 (3.1468)	mem 20675MB
[2025-04-03 00:49:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][158/573]	eta 0:06:07 lr 0.000948	time 0.8773 (0.8866)	loss 0.5078 (0.5799)	grad_norm 3.8480 (3.1564)	mem 20675MB
[2025-04-03 00:49:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][160/573]	eta 0:06:06 lr 0.000950	time 0.8771 (0.8865)	loss 0.5440 (0.5796)	grad_norm 3.1893 (3.1517)	mem 20675MB
[2025-04-03 00:49:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][162/573]	eta 0:06:04 lr 0.000951	time 0.8775 (0.8864)	loss 0.5415 (0.5793)	grad_norm 4.2011 (3.1521)	mem 20675MB
[2025-04-03 00:49:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][164/573]	eta 0:06:02 lr 0.000953	time 0.8773 (0.8863)	loss 0.4649 (0.5789)	grad_norm 2.9217 (3.1494)	mem 20675MB
[2025-04-03 00:49:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][166/573]	eta 0:06:00 lr 0.000954	time 0.8769 (0.8862)	loss 0.4643 (0.5780)	grad_norm 2.6305 (3.1358)	mem 20675MB
[2025-04-03 00:49:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][168/573]	eta 0:05:58 lr 0.000956	time 0.8776 (0.8861)	loss 0.4693 (0.5775)	grad_norm 4.7116 (3.1396)	mem 20675MB
[2025-04-03 00:49:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][170/573]	eta 0:05:57 lr 0.000957	time 0.8777 (0.8860)	loss 0.5707 (0.5774)	grad_norm 3.0587 (3.1428)	mem 20675MB
[2025-04-03 00:50:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][172/573]	eta 0:05:55 lr 0.000958	time 0.8772 (0.8859)	loss 0.5305 (0.5768)	grad_norm 2.4765 (3.1378)	mem 20675MB
[2025-04-03 00:50:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][174/573]	eta 0:05:53 lr 0.000960	time 0.8774 (0.8859)	loss 0.4287 (0.5760)	grad_norm 3.9081 (3.1418)	mem 20675MB
[2025-04-03 00:50:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][176/573]	eta 0:05:51 lr 0.000961	time 0.8771 (0.8858)	loss 0.5667 (0.5754)	grad_norm 2.7246 (3.1348)	mem 20675MB
[2025-04-03 00:50:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][178/573]	eta 0:05:49 lr 0.000963	time 0.8776 (0.8857)	loss 0.6302 (0.5753)	grad_norm 7.9344 (3.1614)	mem 20675MB
[2025-04-03 00:50:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][180/573]	eta 0:05:48 lr 0.000964	time 0.8774 (0.8856)	loss 0.5246 (0.5751)	grad_norm 3.3902 (3.1658)	mem 20675MB
[2025-04-03 00:50:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][182/573]	eta 0:05:46 lr 0.000966	time 0.8772 (0.8855)	loss 0.5245 (0.5751)	grad_norm 3.4040 (3.1765)	mem 20675MB
[2025-04-03 00:50:11 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][184/573]	eta 0:05:44 lr 0.000967	time 0.8775 (0.8854)	loss 0.6277 (0.5754)	grad_norm 2.1061 (3.1762)	mem 20675MB
[2025-04-03 00:50:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][186/573]	eta 0:05:42 lr 0.000969	time 0.8771 (0.8854)	loss 0.5625 (0.5751)	grad_norm 2.5257 (3.1739)	mem 20675MB
[2025-04-03 00:50:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][188/573]	eta 0:05:40 lr 0.000970	time 0.8783 (0.8853)	loss 0.5118 (0.5747)	grad_norm 2.8869 (3.1707)	mem 20675MB
[2025-04-03 00:50:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][190/573]	eta 0:05:39 lr 0.000972	time 0.8771 (0.8852)	loss 0.6312 (0.5746)	grad_norm 2.6828 (3.1644)	mem 20675MB
[2025-04-03 00:50:18 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][192/573]	eta 0:05:37 lr 0.000973	time 0.8773 (0.8851)	loss 0.6087 (0.5747)	grad_norm 2.8816 (3.1577)	mem 20675MB
[2025-04-03 00:50:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][194/573]	eta 0:05:35 lr 0.000974	time 0.8775 (0.8851)	loss 0.6663 (0.5751)	grad_norm 4.0732 (3.1547)	mem 20675MB
[2025-04-03 00:50:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][196/573]	eta 0:05:33 lr 0.000976	time 0.8772 (0.8850)	loss 0.5275 (0.5742)	grad_norm 3.8316 (3.1563)	mem 20675MB
[2025-04-03 00:50:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][198/573]	eta 0:05:31 lr 0.000977	time 0.8770 (0.8849)	loss 0.5514 (0.5739)	grad_norm 2.6420 (3.1534)	mem 20675MB
[2025-04-03 00:50:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][200/573]	eta 0:05:30 lr 0.000979	time 0.8776 (0.8849)	loss 0.5437 (0.5738)	grad_norm 2.3444 (3.1476)	mem 20675MB
[2025-04-03 00:50:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][202/573]	eta 0:05:28 lr 0.000980	time 0.8774 (0.8848)	loss 0.6267 (0.5741)	grad_norm 2.5595 (3.1374)	mem 20675MB
[2025-04-03 00:50:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][204/573]	eta 0:05:26 lr 0.000982	time 0.8772 (0.8847)	loss 0.5461 (0.5744)	grad_norm 2.1773 (3.1393)	mem 20675MB
[2025-04-03 00:50:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][206/573]	eta 0:05:24 lr 0.000983	time 0.8774 (0.8847)	loss 0.5648 (0.5747)	grad_norm 1.6970 (3.1383)	mem 20675MB
[2025-04-03 00:50:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][208/573]	eta 0:05:22 lr 0.000985	time 0.8775 (0.8846)	loss 0.5170 (0.5746)	grad_norm 2.7030 (3.1288)	mem 20675MB
[2025-04-03 00:50:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][210/573]	eta 0:05:21 lr 0.000986	time 0.8775 (0.8846)	loss 0.6330 (0.5749)	grad_norm 2.5242 (3.1200)	mem 20675MB
[2025-04-03 00:50:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][212/573]	eta 0:05:19 lr 0.000988	time 0.8774 (0.8845)	loss 0.5327 (0.5747)	grad_norm 3.9917 (3.1182)	mem 20675MB
[2025-04-03 00:50:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][214/573]	eta 0:05:17 lr 0.000989	time 0.8771 (0.8844)	loss 0.5789 (0.5748)	grad_norm 1.8270 (3.1157)	mem 20675MB
[2025-04-03 00:50:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][216/573]	eta 0:05:15 lr 0.000990	time 0.8792 (0.8844)	loss 0.5916 (0.5751)	grad_norm 3.2227 (3.1226)	mem 20675MB
[2025-04-03 00:50:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][218/573]	eta 0:05:13 lr 0.000992	time 0.8776 (0.8843)	loss 0.5782 (0.5752)	grad_norm 1.9307 (3.1146)	mem 20675MB
[2025-04-03 00:50:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][220/573]	eta 0:05:12 lr 0.000993	time 0.8774 (0.8843)	loss 0.6187 (0.5755)	grad_norm 2.6909 (3.1119)	mem 20675MB
[2025-04-03 00:50:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][222/573]	eta 0:05:10 lr 0.000995	time 0.8772 (0.8842)	loss 0.5005 (0.5751)	grad_norm 2.0740 (3.1020)	mem 20675MB
[2025-04-03 00:50:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][224/573]	eta 0:05:08 lr 0.000996	time 0.8799 (0.8842)	loss 0.6336 (0.5751)	grad_norm 3.6167 (3.0992)	mem 20675MB
[2025-04-03 00:50:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][226/573]	eta 0:05:06 lr 0.000998	time 0.8774 (0.8841)	loss 0.5506 (0.5750)	grad_norm 1.8254 (3.0893)	mem 20675MB
[2025-04-03 00:50:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][228/573]	eta 0:05:05 lr 0.000999	time 0.8775 (0.8841)	loss 0.5823 (0.5749)	grad_norm 1.9923 (3.0799)	mem 20675MB
[2025-04-03 00:50:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][230/573]	eta 0:05:03 lr 0.001001	time 0.8770 (0.8840)	loss 0.4850 (0.5744)	grad_norm 5.1594 (3.0902)	mem 20675MB
[2025-04-03 00:50:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][232/573]	eta 0:05:01 lr 0.001002	time 0.8773 (0.8840)	loss 0.5669 (0.5743)	grad_norm 3.5496 (3.0880)	mem 20675MB
[2025-04-03 00:50:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][234/573]	eta 0:04:59 lr 0.001004	time 0.8775 (0.8839)	loss 0.6133 (0.5745)	grad_norm 2.0886 (3.0862)	mem 20675MB
[2025-04-03 00:50:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][236/573]	eta 0:04:57 lr 0.001005	time 0.8778 (0.8839)	loss 0.5955 (0.5744)	grad_norm 1.9348 (3.0791)	mem 20675MB
[2025-04-03 00:50:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][238/573]	eta 0:04:56 lr 0.001006	time 0.8776 (0.8838)	loss 0.5921 (0.5740)	grad_norm 1.9976 (3.0746)	mem 20675MB
[2025-04-03 00:51:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][240/573]	eta 0:04:54 lr 0.001008	time 0.8776 (0.8838)	loss 0.6702 (0.5742)	grad_norm 4.2111 (3.0807)	mem 20675MB
[2025-04-03 00:51:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][242/573]	eta 0:04:52 lr 0.001009	time 0.8776 (0.8837)	loss 0.5949 (0.5738)	grad_norm 1.9175 (3.0767)	mem 20675MB
[2025-04-03 00:51:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][244/573]	eta 0:04:50 lr 0.001011	time 0.8776 (0.8837)	loss 0.5488 (0.5739)	grad_norm 2.4733 (3.0735)	mem 20675MB
[2025-04-03 00:51:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][246/573]	eta 0:04:48 lr 0.001012	time 0.8775 (0.8837)	loss 0.5058 (0.5731)	grad_norm 2.7556 (3.0691)	mem 20675MB
[2025-04-03 00:51:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][248/573]	eta 0:04:47 lr 0.001014	time 0.8776 (0.8836)	loss 0.5941 (0.5735)	grad_norm 2.0876 (3.0730)	mem 20675MB
[2025-04-03 00:51:09 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][250/573]	eta 0:04:45 lr 0.001015	time 0.8771 (0.8836)	loss 0.5941 (0.5733)	grad_norm 2.4935 (3.0720)	mem 20675MB
[2025-04-03 00:51:11 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][252/573]	eta 0:04:43 lr 0.001017	time 0.8776 (0.8835)	loss 0.5900 (0.5737)	grad_norm 2.5284 (3.0660)	mem 20675MB
[2025-04-03 00:51:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][254/573]	eta 0:04:41 lr 0.001018	time 0.8775 (0.8835)	loss 0.5792 (0.5736)	grad_norm 2.0294 (3.0567)	mem 20675MB
[2025-04-03 00:51:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][256/573]	eta 0:04:40 lr 0.001020	time 0.8783 (0.8835)	loss 0.6119 (0.5737)	grad_norm 3.8468 (3.0563)	mem 20675MB
[2025-04-03 00:51:16 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][258/573]	eta 0:04:38 lr 0.001021	time 0.8772 (0.8834)	loss 0.6294 (0.5738)	grad_norm 3.9616 (3.0563)	mem 20675MB
[2025-04-03 00:51:18 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][260/573]	eta 0:04:36 lr 0.001022	time 0.8776 (0.8834)	loss 0.5656 (0.5738)	grad_norm 1.8553 (3.0517)	mem 20675MB
[2025-04-03 00:51:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][262/573]	eta 0:04:34 lr 0.001024	time 0.8776 (0.8833)	loss 0.5766 (0.5736)	grad_norm 4.7324 (3.0578)	mem 20675MB
[2025-04-03 00:51:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][264/573]	eta 0:04:32 lr 0.001025	time 0.8774 (0.8833)	loss 0.5571 (0.5736)	grad_norm 1.8841 (3.0543)	mem 20675MB
[2025-04-03 00:51:23 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][266/573]	eta 0:04:31 lr 0.001027	time 0.8771 (0.8833)	loss 0.5579 (0.5733)	grad_norm 3.9519 (3.0552)	mem 20675MB
[2025-04-03 00:51:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][268/573]	eta 0:04:29 lr 0.001028	time 0.8775 (0.8832)	loss 0.5554 (0.5735)	grad_norm 2.9752 (3.0522)	mem 20675MB
[2025-04-03 00:51:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][270/573]	eta 0:04:27 lr 0.001030	time 0.8772 (0.8832)	loss 0.5449 (0.5730)	grad_norm 2.9259 (3.0515)	mem 20675MB
[2025-04-03 00:51:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][272/573]	eta 0:04:25 lr 0.001031	time 0.8779 (0.8831)	loss 0.5584 (0.5730)	grad_norm 3.1638 (3.0488)	mem 20675MB
[2025-04-03 00:51:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][274/573]	eta 0:04:24 lr 0.001033	time 0.8771 (0.8831)	loss 0.5267 (0.5729)	grad_norm 2.2127 (3.0431)	mem 20675MB
[2025-04-03 00:51:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][276/573]	eta 0:04:22 lr 0.001034	time 0.8775 (0.8831)	loss 0.5622 (0.5730)	grad_norm 3.2412 (3.0498)	mem 20675MB
[2025-04-03 00:51:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][278/573]	eta 0:04:20 lr 0.001036	time 0.8770 (0.8830)	loss 0.5203 (0.5728)	grad_norm 3.7897 (3.0527)	mem 20675MB
[2025-04-03 00:51:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][280/573]	eta 0:04:18 lr 0.001037	time 0.8775 (0.8830)	loss 0.5273 (0.5722)	grad_norm 2.8726 (3.0529)	mem 20675MB
[2025-04-03 00:51:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][282/573]	eta 0:04:16 lr 0.001038	time 0.8775 (0.8830)	loss 0.5280 (0.5719)	grad_norm 2.1181 (3.0526)	mem 20675MB
[2025-04-03 00:51:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][284/573]	eta 0:04:15 lr 0.001040	time 0.8774 (0.8829)	loss 0.6003 (0.5721)	grad_norm 2.5876 (3.0542)	mem 20675MB
[2025-04-03 00:51:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][286/573]	eta 0:04:13 lr 0.001041	time 0.8774 (0.8829)	loss 0.5814 (0.5724)	grad_norm 2.0479 (3.0481)	mem 20675MB
[2025-04-03 00:51:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][288/573]	eta 0:04:11 lr 0.001043	time 0.8772 (0.8829)	loss 0.5823 (0.5723)	grad_norm 1.8820 (3.0432)	mem 20675MB
[2025-04-03 00:51:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][290/573]	eta 0:04:09 lr 0.001044	time 0.8774 (0.8828)	loss 0.5471 (0.5724)	grad_norm 3.2206 (3.0396)	mem 20675MB
[2025-04-03 00:51:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][292/573]	eta 0:04:08 lr 0.001046	time 0.8797 (0.8828)	loss 0.4937 (0.5720)	grad_norm 2.6891 (3.0366)	mem 20675MB
[2025-04-03 00:51:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][294/573]	eta 0:04:06 lr 0.001047	time 0.8775 (0.8828)	loss 0.5055 (0.5719)	grad_norm 3.4277 (3.0355)	mem 20675MB
[2025-04-03 00:51:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][296/573]	eta 0:04:04 lr 0.001049	time 0.8772 (0.8828)	loss 0.5904 (0.5723)	grad_norm 2.4081 (3.0354)	mem 20675MB
[2025-04-03 00:51:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][298/573]	eta 0:04:02 lr 0.001050	time 0.8777 (0.8827)	loss 0.5589 (0.5724)	grad_norm 5.2224 (3.0447)	mem 20675MB
[2025-04-03 00:51:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][300/573]	eta 0:04:00 lr 0.001052	time 0.8776 (0.8827)	loss 0.6105 (0.5726)	grad_norm 1.8271 (3.0382)	mem 20675MB
[2025-04-03 00:51:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][302/573]	eta 0:03:59 lr 0.001053	time 0.8774 (0.8827)	loss 0.5354 (0.5722)	grad_norm 1.9985 (3.0328)	mem 20675MB
[2025-04-03 00:51:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][304/573]	eta 0:03:57 lr 0.001054	time 0.8774 (0.8826)	loss 0.5548 (0.5720)	grad_norm 1.2768 (3.0237)	mem 20675MB
[2025-04-03 00:51:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][306/573]	eta 0:03:55 lr 0.001056	time 0.8779 (0.8826)	loss 0.5399 (0.5722)	grad_norm 3.2115 (3.0243)	mem 20675MB
[2025-04-03 00:52:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][308/573]	eta 0:03:53 lr 0.001057	time 0.8774 (0.8826)	loss 0.5823 (0.5722)	grad_norm 3.2350 (3.0253)	mem 20675MB
[2025-04-03 00:52:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][310/573]	eta 0:03:52 lr 0.001059	time 0.8774 (0.8826)	loss 0.5719 (0.5724)	grad_norm 1.5513 (3.0219)	mem 20675MB
[2025-04-03 00:52:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][312/573]	eta 0:03:50 lr 0.001060	time 0.8772 (0.8825)	loss 0.4912 (0.5721)	grad_norm 2.9071 (3.0248)	mem 20675MB
[2025-04-03 00:52:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][314/573]	eta 0:03:48 lr 0.001062	time 0.8773 (0.8825)	loss 0.5867 (0.5725)	grad_norm 2.0024 (3.0212)	mem 20675MB
[2025-04-03 00:52:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][316/573]	eta 0:03:46 lr 0.001063	time 0.8774 (0.8825)	loss 0.5294 (0.5727)	grad_norm 2.2634 (3.0178)	mem 20675MB
[2025-04-03 00:52:09 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][318/573]	eta 0:03:45 lr 0.001065	time 0.8772 (0.8824)	loss 0.5567 (0.5729)	grad_norm 1.4542 (3.0104)	mem 20675MB
[2025-04-03 00:52:11 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][320/573]	eta 0:03:43 lr 0.001066	time 0.8771 (0.8824)	loss 0.5435 (0.5724)	grad_norm 2.0834 (3.0057)	mem 20675MB
[2025-04-03 00:52:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][322/573]	eta 0:03:41 lr 0.001068	time 0.8772 (0.8824)	loss 0.5354 (0.5725)	grad_norm 2.3573 (2.9995)	mem 20675MB
[2025-04-03 00:52:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][324/573]	eta 0:03:39 lr 0.001069	time 0.8777 (0.8824)	loss 0.4051 (0.5720)	grad_norm 3.7684 (2.9985)	mem 20675MB
[2025-04-03 00:52:16 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][326/573]	eta 0:03:37 lr 0.001070	time 0.8774 (0.8823)	loss 0.5581 (0.5721)	grad_norm 3.0846 (3.0029)	mem 20675MB
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][328/573]	eta 0:03:36 lr 0.001072	time 0.8773 (0.8823)	loss 0.4239 (0.5717)	grad_norm 3.1358 (3.0068)	mem 20675MB
[2025-04-03 00:52:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][330/573]	eta 0:03:34 lr 0.001073	time 0.8773 (0.8823)	loss 0.4552 (0.5712)	grad_norm 2.7125 (3.0040)	mem 20675MB
[2025-04-03 00:52:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][332/573]	eta 0:03:32 lr 0.001075	time 0.8777 (0.8823)	loss 0.5050 (0.5709)	grad_norm 2.9363 (3.0022)	mem 20675MB
[2025-04-03 00:52:23 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][334/573]	eta 0:03:30 lr 0.001076	time 0.8773 (0.8822)	loss 0.5565 (0.5704)	grad_norm 2.9635 (3.0020)	mem 20675MB
[2025-04-03 00:52:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][336/573]	eta 0:03:29 lr 0.001078	time 0.8775 (0.8822)	loss 0.4730 (0.5699)	grad_norm 3.8500 (3.0059)	mem 20675MB
[2025-04-03 00:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][338/573]	eta 0:03:27 lr 0.001079	time 0.8771 (0.8822)	loss 0.6178 (0.5698)	grad_norm 3.0790 (3.0037)	mem 20675MB
[2025-04-03 00:52:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][340/573]	eta 0:03:25 lr 0.001081	time 0.8776 (0.8822)	loss 0.4576 (0.5696)	grad_norm 3.8104 (3.0049)	mem 20675MB
[2025-04-03 00:52:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][342/573]	eta 0:03:23 lr 0.001082	time 0.8773 (0.8821)	loss 0.5825 (0.5696)	grad_norm 3.5028 (3.0116)	mem 20675MB
[2025-04-03 00:52:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][344/573]	eta 0:03:22 lr 0.001084	time 0.8774 (0.8821)	loss 0.5516 (0.5694)	grad_norm 3.7908 (3.0197)	mem 20675MB
[2025-04-03 00:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][346/573]	eta 0:03:20 lr 0.001085	time 0.8771 (0.8821)	loss 0.4891 (0.5691)	grad_norm 2.6515 (3.0182)	mem 20675MB
[2025-04-03 00:52:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][348/573]	eta 0:03:18 lr 0.001086	time 0.8775 (0.8821)	loss 0.4825 (0.5690)	grad_norm 3.3276 (3.0217)	mem 20675MB
[2025-04-03 00:52:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][350/573]	eta 0:03:16 lr 0.001088	time 0.8773 (0.8821)	loss 0.5722 (0.5691)	grad_norm 3.6396 (3.0272)	mem 20675MB
[2025-04-03 00:52:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][352/573]	eta 0:03:14 lr 0.001089	time 0.8775 (0.8820)	loss 0.5520 (0.5688)	grad_norm 3.1187 (3.0237)	mem 20675MB
[2025-04-03 00:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][354/573]	eta 0:03:13 lr 0.001091	time 0.8774 (0.8820)	loss 0.5661 (0.5689)	grad_norm 3.2179 (3.0249)	mem 20675MB
[2025-04-03 00:52:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][356/573]	eta 0:03:11 lr 0.001092	time 0.8772 (0.8820)	loss 0.6118 (0.5691)	grad_norm 2.1083 (3.0220)	mem 20675MB
[2025-04-03 00:52:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][358/573]	eta 0:03:09 lr 0.001094	time 0.8772 (0.8820)	loss 0.6021 (0.5693)	grad_norm 1.7011 (3.0171)	mem 20675MB
[2025-04-03 00:52:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][360/573]	eta 0:03:07 lr 0.001095	time 0.8775 (0.8819)	loss 0.5807 (0.5694)	grad_norm 2.3159 (3.0112)	mem 20675MB
[2025-04-03 00:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][362/573]	eta 0:03:06 lr 0.001097	time 0.8772 (0.8819)	loss 0.5929 (0.5696)	grad_norm 2.1925 (3.0060)	mem 20675MB
[2025-04-03 00:52:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][364/573]	eta 0:03:04 lr 0.001098	time 0.8771 (0.8819)	loss 0.5756 (0.5693)	grad_norm 1.4511 (3.0022)	mem 20675MB
[2025-04-03 00:52:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][366/573]	eta 0:03:02 lr 0.001100	time 0.8774 (0.8819)	loss 0.5636 (0.5693)	grad_norm 3.6189 (3.0001)	mem 20675MB
[2025-04-03 00:52:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][368/573]	eta 0:03:00 lr 0.001101	time 0.8773 (0.8819)	loss 0.6887 (0.5696)	grad_norm 4.2468 (3.0022)	mem 20675MB
[2025-04-03 00:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][370/573]	eta 0:02:59 lr 0.001102	time 0.8772 (0.8818)	loss 0.5620 (0.5696)	grad_norm 2.5148 (2.9993)	mem 20675MB
[2025-04-03 00:52:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][372/573]	eta 0:02:57 lr 0.001104	time 0.8772 (0.8818)	loss 0.5921 (0.5698)	grad_norm 2.1234 (2.9935)	mem 20675MB
[2025-04-03 00:52:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][374/573]	eta 0:02:55 lr 0.001105	time 0.8775 (0.8818)	loss 0.5732 (0.5698)	grad_norm 1.9855 (2.9880)	mem 20675MB
[2025-04-03 00:53:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][376/573]	eta 0:02:53 lr 0.001107	time 0.8772 (0.8818)	loss 0.5705 (0.5697)	grad_norm 2.4695 (2.9844)	mem 20675MB
[2025-04-03 00:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][378/573]	eta 0:02:51 lr 0.001108	time 0.8777 (0.8818)	loss 0.6218 (0.5699)	grad_norm 3.0449 (2.9827)	mem 20675MB
[2025-04-03 00:53:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][380/573]	eta 0:02:50 lr 0.001110	time 0.8775 (0.8818)	loss 0.5492 (0.5699)	grad_norm 4.4352 (2.9852)	mem 20675MB
[2025-04-03 00:53:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][382/573]	eta 0:02:48 lr 0.001111	time 0.8772 (0.8817)	loss 0.5677 (0.5696)	grad_norm 2.0591 (2.9839)	mem 20675MB
[2025-04-03 00:53:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][384/573]	eta 0:02:46 lr 0.001113	time 0.8774 (0.8817)	loss 0.4030 (0.5690)	grad_norm 4.8568 (2.9894)	mem 20675MB
[2025-04-03 00:53:09 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][386/573]	eta 0:02:44 lr 0.001114	time 0.8773 (0.8817)	loss 0.5443 (0.5690)	grad_norm 5.5470 (3.0006)	mem 20675MB
[2025-04-03 00:53:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][388/573]	eta 0:02:43 lr 0.001116	time 0.8773 (0.8817)	loss 0.5405 (0.5689)	grad_norm 3.2418 (3.0005)	mem 20675MB
[2025-04-03 00:53:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][390/573]	eta 0:02:41 lr 0.001117	time 0.8774 (0.8817)	loss 0.6267 (0.5693)	grad_norm 5.2759 (3.0128)	mem 20675MB
[2025-04-03 00:53:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][392/573]	eta 0:02:39 lr 0.001118	time 0.8772 (0.8816)	loss 0.5424 (0.5692)	grad_norm 3.1295 (3.0115)	mem 20675MB
[2025-04-03 00:53:16 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][394/573]	eta 0:02:37 lr 0.001120	time 0.8774 (0.8816)	loss 0.4772 (0.5690)	grad_norm 3.5185 (3.0165)	mem 20675MB
[2025-04-03 00:53:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][396/573]	eta 0:02:36 lr 0.001121	time 0.8775 (0.8816)	loss 0.5703 (0.5691)	grad_norm 2.0385 (3.0115)	mem 20675MB
[2025-04-03 00:53:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][398/573]	eta 0:02:34 lr 0.001123	time 0.8771 (0.8816)	loss 0.5826 (0.5689)	grad_norm 2.9718 (3.0109)	mem 20675MB
[2025-04-03 00:53:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][400/573]	eta 0:02:32 lr 0.001124	time 0.8772 (0.8816)	loss 0.6032 (0.5689)	grad_norm 3.6380 (3.0135)	mem 20675MB
[2025-04-03 00:53:23 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][402/573]	eta 0:02:30 lr 0.001126	time 0.8769 (0.8816)	loss 0.4304 (0.5686)	grad_norm 2.9888 (3.0143)	mem 20675MB
[2025-04-03 00:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][404/573]	eta 0:02:28 lr 0.001127	time 0.8771 (0.8815)	loss 0.5392 (0.5684)	grad_norm 1.8696 (3.0128)	mem 20675MB
[2025-04-03 00:53:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][406/573]	eta 0:02:27 lr 0.001129	time 0.8773 (0.8815)	loss 0.5855 (0.5682)	grad_norm 2.4387 (3.0133)	mem 20675MB
[2025-04-03 00:53:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][408/573]	eta 0:02:25 lr 0.001130	time 0.8775 (0.8815)	loss 0.6244 (0.5685)	grad_norm 2.6930 (3.0156)	mem 20675MB
[2025-04-03 00:53:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][410/573]	eta 0:02:23 lr 0.001131	time 0.8772 (0.8815)	loss 0.5649 (0.5685)	grad_norm 2.1708 (3.0116)	mem 20675MB
[2025-04-03 00:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][412/573]	eta 0:02:21 lr 0.001133	time 0.8769 (0.8815)	loss 0.6038 (0.5684)	grad_norm 5.4888 (3.0229)	mem 20675MB
[2025-04-03 00:53:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][414/573]	eta 0:02:20 lr 0.001134	time 0.8776 (0.8815)	loss 0.4503 (0.5678)	grad_norm 5.4312 (3.0304)	mem 20675MB
[2025-04-03 00:53:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][416/573]	eta 0:02:18 lr 0.001136	time 0.8772 (0.8814)	loss 0.4916 (0.5672)	grad_norm 5.3084 (3.0374)	mem 20675MB
[2025-04-03 00:53:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][418/573]	eta 0:02:16 lr 0.001137	time 0.8775 (0.8814)	loss 0.5220 (0.5671)	grad_norm 5.1882 (3.0427)	mem 20675MB
[2025-04-03 00:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][420/573]	eta 0:02:14 lr 0.001139	time 0.8773 (0.8814)	loss 0.6159 (0.5673)	grad_norm 3.3647 (3.0477)	mem 20675MB
[2025-04-03 00:53:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][422/573]	eta 0:02:13 lr 0.001140	time 0.8797 (0.8814)	loss 0.5903 (0.5671)	grad_norm 1.8642 (3.0462)	mem 20675MB
[2025-04-03 00:53:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][424/573]	eta 0:02:11 lr 0.001142	time 0.8772 (0.8814)	loss 0.5506 (0.5670)	grad_norm 2.2528 (3.0433)	mem 20675MB
[2025-04-03 00:53:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][426/573]	eta 0:02:09 lr 0.001143	time 0.8773 (0.8814)	loss 0.6218 (0.5672)	grad_norm 1.7654 (3.0417)	mem 20675MB
[2025-04-03 00:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][428/573]	eta 0:02:07 lr 0.001145	time 0.8773 (0.8814)	loss 0.6049 (0.5673)	grad_norm 2.5137 (3.0383)	mem 20675MB
[2025-04-03 00:53:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][430/573]	eta 0:02:06 lr 0.001146	time 0.8774 (0.8813)	loss 0.4574 (0.5670)	grad_norm 3.2340 (3.0370)	mem 20675MB
[2025-04-03 00:53:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][432/573]	eta 0:02:04 lr 0.001147	time 0.8771 (0.8813)	loss 0.5708 (0.5671)	grad_norm 1.6633 (3.0328)	mem 20675MB
[2025-04-03 00:53:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][434/573]	eta 0:02:02 lr 0.001149	time 0.8773 (0.8813)	loss 0.4829 (0.5669)	grad_norm 4.0800 (3.0338)	mem 20675MB
[2025-04-03 00:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][436/573]	eta 0:02:00 lr 0.001150	time 0.8773 (0.8813)	loss 0.4915 (0.5668)	grad_norm 2.7039 (3.0319)	mem 20675MB
[2025-04-03 00:53:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][438/573]	eta 0:01:58 lr 0.001152	time 0.8773 (0.8813)	loss 0.6027 (0.5668)	grad_norm 3.7318 (3.0316)	mem 20675MB
[2025-04-03 00:53:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][440/573]	eta 0:01:57 lr 0.001153	time 0.8773 (0.8813)	loss 0.5491 (0.5671)	grad_norm 1.7146 (3.0290)	mem 20675MB
[2025-04-03 00:53:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][442/573]	eta 0:01:55 lr 0.001155	time 0.8775 (0.8813)	loss 0.6625 (0.5674)	grad_norm 2.0559 (3.0274)	mem 20675MB
[2025-04-03 00:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][444/573]	eta 0:01:53 lr 0.001156	time 0.8773 (0.8812)	loss 0.6016 (0.5675)	grad_norm 2.2298 (3.0229)	mem 20675MB
[2025-04-03 00:54:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][446/573]	eta 0:01:51 lr 0.001158	time 0.8772 (0.8812)	loss 0.6114 (0.5677)	grad_norm 1.6074 (3.0191)	mem 20675MB
[2025-04-03 00:54:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][448/573]	eta 0:01:50 lr 0.001159	time 0.8775 (0.8812)	loss 0.5667 (0.5679)	grad_norm 4.6270 (3.0191)	mem 20675MB
[2025-04-03 00:54:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][450/573]	eta 0:01:48 lr 0.001161	time 0.8775 (0.8812)	loss 0.5589 (0.5679)	grad_norm 2.7649 (3.0171)	mem 20675MB
[2025-04-03 00:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][452/573]	eta 0:01:46 lr 0.001162	time 0.8776 (0.8812)	loss 0.5635 (0.5678)	grad_norm 1.9647 (3.0149)	mem 20675MB
[2025-04-03 00:54:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][454/573]	eta 0:01:44 lr 0.001163	time 0.8774 (0.8812)	loss 0.6358 (0.5681)	grad_norm 4.1056 (3.0214)	mem 20675MB
[2025-04-03 00:54:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][456/573]	eta 0:01:43 lr 0.001165	time 0.8773 (0.8812)	loss 0.6194 (0.5682)	grad_norm 1.6140 (3.0171)	mem 20675MB
[2025-04-03 00:54:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][458/573]	eta 0:01:41 lr 0.001166	time 0.8773 (0.8812)	loss 0.5803 (0.5683)	grad_norm 2.6180 (3.0161)	mem 20675MB
[2025-04-03 00:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][460/573]	eta 0:01:39 lr 0.001168	time 0.8774 (0.8811)	loss 0.6281 (0.5685)	grad_norm 1.5410 (3.0130)	mem 20675MB
[2025-04-03 00:54:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][462/573]	eta 0:01:37 lr 0.001169	time 0.8771 (0.8811)	loss 0.5900 (0.5683)	grad_norm 1.3843 (3.0093)	mem 20675MB
[2025-04-03 00:54:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][464/573]	eta 0:01:36 lr 0.001171	time 0.8787 (0.8811)	loss 0.5126 (0.5682)	grad_norm 1.9685 (3.0037)	mem 20675MB
[2025-04-03 00:54:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][466/573]	eta 0:01:34 lr 0.001172	time 0.8774 (0.8811)	loss 0.5217 (0.5682)	grad_norm 2.5434 (3.0039)	mem 20675MB
[2025-04-03 00:54:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][468/573]	eta 0:01:32 lr 0.001174	time 0.8773 (0.8811)	loss 0.5094 (0.5681)	grad_norm 2.5627 (3.0010)	mem 20675MB
[2025-04-03 00:54:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][470/573]	eta 0:01:30 lr 0.001175	time 0.8773 (0.8811)	loss 0.5727 (0.5682)	grad_norm 2.6756 (3.0005)	mem 20675MB
[2025-04-03 00:54:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][472/573]	eta 0:01:28 lr 0.001177	time 0.8773 (0.8811)	loss 0.4890 (0.5681)	grad_norm 4.0948 (3.0024)	mem 20675MB
[2025-04-03 00:54:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][474/573]	eta 0:01:27 lr 0.001178	time 0.8773 (0.8811)	loss 0.5067 (0.5680)	grad_norm 3.0591 (3.0025)	mem 20675MB
[2025-04-03 00:54:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][476/573]	eta 0:01:25 lr 0.001179	time 0.8776 (0.8810)	loss 0.5759 (0.5680)	grad_norm 1.7754 (2.9985)	mem 20675MB
[2025-04-03 00:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][478/573]	eta 0:01:23 lr 0.001181	time 0.8776 (0.8810)	loss 0.4550 (0.5679)	grad_norm 2.4401 (2.9971)	mem 20675MB
[2025-04-03 00:54:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][480/573]	eta 0:01:21 lr 0.001182	time 0.8775 (0.8810)	loss 0.5973 (0.5681)	grad_norm 1.5668 (2.9912)	mem 20675MB
[2025-04-03 00:54:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][482/573]	eta 0:01:20 lr 0.001184	time 0.8772 (0.8810)	loss 0.6239 (0.5681)	grad_norm 1.3238 (2.9864)	mem 20675MB
[2025-04-03 00:54:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][484/573]	eta 0:01:18 lr 0.001185	time 0.8772 (0.8810)	loss 0.6094 (0.5682)	grad_norm 2.2590 (2.9841)	mem 20675MB
[2025-04-03 00:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][486/573]	eta 0:01:16 lr 0.001187	time 0.8771 (0.8810)	loss 0.5902 (0.5685)	grad_norm 3.0105 (2.9831)	mem 20675MB
[2025-04-03 00:54:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][488/573]	eta 0:01:14 lr 0.001188	time 0.8772 (0.8810)	loss 0.5387 (0.5685)	grad_norm 2.4986 (2.9819)	mem 20675MB
[2025-04-03 00:54:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][490/573]	eta 0:01:13 lr 0.001190	time 0.8774 (0.8810)	loss 0.5482 (0.5685)	grad_norm 1.7597 (2.9764)	mem 20675MB
[2025-04-03 00:54:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][492/573]	eta 0:01:11 lr 0.001191	time 0.8775 (0.8810)	loss 0.4652 (0.5682)	grad_norm 2.7569 (2.9741)	mem 20675MB
[2025-04-03 00:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][494/573]	eta 0:01:09 lr 0.001193	time 0.8775 (0.8809)	loss 0.5598 (0.5680)	grad_norm 3.2870 (2.9750)	mem 20675MB
[2025-04-03 00:54:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][496/573]	eta 0:01:07 lr 0.001194	time 0.8777 (0.8809)	loss 0.4199 (0.5675)	grad_norm 3.0184 (2.9757)	mem 20675MB
[2025-04-03 00:54:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][498/573]	eta 0:01:06 lr 0.001195	time 0.8774 (0.8809)	loss 0.5852 (0.5675)	grad_norm 2.8873 (2.9774)	mem 20675MB
[2025-04-03 00:54:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][500/573]	eta 0:01:04 lr 0.001197	time 0.8774 (0.8809)	loss 0.4290 (0.5670)	grad_norm 4.4312 (2.9895)	mem 20675MB
[2025-04-03 00:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][502/573]	eta 0:01:02 lr 0.001198	time 0.8777 (0.8809)	loss 0.5873 (0.5668)	grad_norm 2.9599 (2.9957)	mem 20675MB
[2025-04-03 00:54:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][504/573]	eta 0:01:00 lr 0.001200	time 0.8770 (0.8809)	loss 0.4723 (0.5668)	grad_norm 2.3459 (2.9940)	mem 20675MB
[2025-04-03 00:54:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][506/573]	eta 0:00:59 lr 0.001201	time 0.8771 (0.8809)	loss 0.5163 (0.5665)	grad_norm 2.4242 (2.9936)	mem 20675MB
[2025-04-03 00:54:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][508/573]	eta 0:00:57 lr 0.001203	time 0.8774 (0.8809)	loss 0.4540 (0.5662)	grad_norm 2.3480 (2.9917)	mem 20675MB
[2025-04-03 00:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][510/573]	eta 0:00:55 lr 0.001204	time 0.8774 (0.8809)	loss 0.5828 (0.5662)	grad_norm 2.5899 (2.9914)	mem 20675MB
[2025-04-03 00:54:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][512/573]	eta 0:00:53 lr 0.001206	time 0.8776 (0.8809)	loss 0.5556 (0.5659)	grad_norm 2.5246 (2.9898)	mem 20675MB
[2025-04-03 00:55:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][514/573]	eta 0:00:51 lr 0.001207	time 0.8773 (0.8808)	loss 0.5303 (0.5659)	grad_norm 2.9092 (2.9866)	mem 20675MB
[2025-04-03 00:55:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][516/573]	eta 0:00:50 lr 0.001209	time 0.8776 (0.8808)	loss 0.6538 (0.5660)	grad_norm 3.0631 (2.9853)	mem 20675MB
[2025-04-03 00:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][518/573]	eta 0:00:48 lr 0.001210	time 0.8777 (0.8808)	loss 0.6074 (0.5658)	grad_norm 4.5341 (2.9881)	mem 20675MB
[2025-04-03 00:55:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][520/573]	eta 0:00:46 lr 0.001211	time 0.8773 (0.8808)	loss 0.5979 (0.5659)	grad_norm 2.0210 (2.9862)	mem 20675MB
[2025-04-03 00:55:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][522/573]	eta 0:00:44 lr 0.001213	time 0.8774 (0.8808)	loss 0.6414 (0.5659)	grad_norm 2.3235 (2.9844)	mem 20675MB
[2025-04-03 00:55:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][524/573]	eta 0:00:43 lr 0.001214	time 0.8771 (0.8808)	loss 0.6114 (0.5660)	grad_norm 4.5719 (2.9857)	mem 20675MB
[2025-04-03 00:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][526/573]	eta 0:00:41 lr 0.001216	time 0.8773 (0.8808)	loss 0.5992 (0.5659)	grad_norm 1.7588 (2.9835)	mem 20675MB
[2025-04-03 00:55:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][528/573]	eta 0:00:39 lr 0.001217	time 0.8775 (0.8808)	loss 0.5593 (0.5657)	grad_norm 1.9184 (2.9840)	mem 20675MB
[2025-04-03 00:55:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][530/573]	eta 0:00:37 lr 0.001219	time 0.8774 (0.8808)	loss 0.4676 (0.5656)	grad_norm 2.5305 (2.9807)	mem 20675MB
[2025-04-03 00:55:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][532/573]	eta 0:00:36 lr 0.001220	time 0.8775 (0.8808)	loss 0.5461 (0.5657)	grad_norm 4.7656 (2.9835)	mem 20675MB
[2025-04-03 00:55:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][534/573]	eta 0:00:34 lr 0.001222	time 0.8771 (0.8808)	loss 0.5754 (0.5657)	grad_norm 2.1744 (2.9796)	mem 20675MB
[2025-04-03 00:55:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][536/573]	eta 0:00:32 lr 0.001223	time 0.8776 (0.8807)	loss 0.5210 (0.5656)	grad_norm 3.9065 (2.9837)	mem 20675MB
[2025-04-03 00:55:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][538/573]	eta 0:00:30 lr 0.001225	time 0.8786 (0.8807)	loss 0.5786 (0.5656)	grad_norm 3.1864 (2.9824)	mem 20675MB
[2025-04-03 00:55:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][540/573]	eta 0:00:29 lr 0.001226	time 0.8773 (0.8807)	loss 0.6207 (0.5659)	grad_norm 6.5795 (2.9925)	mem 20675MB
[2025-04-03 00:55:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][542/573]	eta 0:00:27 lr 0.001227	time 0.8773 (0.8807)	loss 0.5487 (0.5657)	grad_norm 2.0724 (2.9894)	mem 20675MB
[2025-04-03 00:55:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][544/573]	eta 0:00:25 lr 0.001229	time 0.8778 (0.8807)	loss 0.5139 (0.5655)	grad_norm 2.8651 (2.9883)	mem 20675MB
[2025-04-03 00:55:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][546/573]	eta 0:00:23 lr 0.001230	time 0.8772 (0.8807)	loss 0.5284 (0.5656)	grad_norm 3.9480 (2.9914)	mem 20675MB
[2025-04-03 00:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][548/573]	eta 0:00:22 lr 0.001232	time 0.8771 (0.8807)	loss 0.5633 (0.5656)	grad_norm 3.2307 (2.9927)	mem 20675MB
[2025-04-03 00:55:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][550/573]	eta 0:00:20 lr 0.001233	time 0.8771 (0.8807)	loss 0.5776 (0.5658)	grad_norm 3.0085 (2.9987)	mem 20675MB
[2025-04-03 00:55:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][552/573]	eta 0:00:18 lr 0.001235	time 0.8771 (0.8807)	loss 0.5868 (0.5661)	grad_norm 3.2323 (3.0009)	mem 20675MB
[2025-04-03 00:55:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][554/573]	eta 0:00:16 lr 0.001236	time 0.8775 (0.8807)	loss 0.5988 (0.5663)	grad_norm 2.0995 (2.9972)	mem 20675MB
[2025-04-03 00:55:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][556/573]	eta 0:00:14 lr 0.001238	time 0.8771 (0.8807)	loss 0.5656 (0.5664)	grad_norm 2.5489 (2.9933)	mem 20675MB
[2025-04-03 00:55:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][558/573]	eta 0:00:13 lr 0.001239	time 0.8771 (0.8806)	loss 0.6617 (0.5666)	grad_norm 3.5782 (2.9936)	mem 20675MB
[2025-04-03 00:55:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][560/573]	eta 0:00:11 lr 0.001241	time 0.8772 (0.8806)	loss 0.4966 (0.5664)	grad_norm 2.6506 (2.9944)	mem 20675MB
[2025-04-03 00:55:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][562/573]	eta 0:00:09 lr 0.001242	time 0.8773 (0.8806)	loss 0.5517 (0.5665)	grad_norm 4.4221 (2.9993)	mem 20675MB
[2025-04-03 00:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][564/573]	eta 0:00:07 lr 0.001243	time 0.8772 (0.8806)	loss 0.5541 (0.5663)	grad_norm 3.4071 (3.0003)	mem 20675MB
[2025-04-03 00:55:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][566/573]	eta 0:00:06 lr 0.001245	time 0.8771 (0.8806)	loss 0.5150 (0.5662)	grad_norm 1.9782 (2.9998)	mem 20675MB
[2025-04-03 00:55:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][568/573]	eta 0:00:04 lr 0.001246	time 0.8772 (0.8806)	loss 0.5001 (0.5661)	grad_norm 3.8605 (3.0078)	mem 20675MB
[2025-04-03 00:55:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][570/573]	eta 0:00:02 lr 0.001248	time 0.8768 (0.8806)	loss 0.5645 (0.5661)	grad_norm 1.5058 (3.0050)	mem 20675MB
[2025-04-03 00:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][572/573]	eta 0:00:00 lr 0.001249	time 0.8772 (0.8806)	loss 0.5659 (0.5659)	grad_norm 2.5463 (3.0034)	mem 20675MB
[2025-04-03 00:55:52 simmim_finetune] (main_finetune.py 260): INFO EPOCH 2 training takes 0:08:24
[2025-04-03 00:55:54 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.838 (1.838)	Loss 0.5226 (0.5226)	Acc@1 76.562 (76.562)	Mem 20675MB
[2025-04-03 00:55:55 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.802)	Loss 0.4773 (0.5025)	Acc@1 80.469 (79.427)	Mem 20675MB
[2025-04-03 00:55:55 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.595)	Loss 0.5504 (0.5209)	Acc@1 76.562 (78.125)	Mem 20675MB
[2025-04-03 00:55:56 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.506)	Loss 0.5409 (0.5185)	Acc@1 74.219 (78.125)	Mem 20675MB
[2025-04-03 00:55:56 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.457)	Loss 0.6864 (0.5359)	Acc@1 56.250 (75.694)	Mem 20675MB
[2025-04-03 00:55:57 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.425)	Loss 0.5645 (0.5430)	Acc@1 74.219 (74.858)	Mem 20675MB
[2025-04-03 00:55:57 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.403)	Loss 0.5952 (0.5510)	Acc@1 67.969 (73.918)	Mem 20675MB
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.387)	Loss 0.5682 (0.5549)	Acc@1 70.312 (73.125)	Mem 20675MB
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 73.034
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 73.0%
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 73.03%
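The `val (avg)` pairs in the Test lines (e.g. `Acc@1 70.312 (73.125)`) and the final `* Acc@1 73.034` come from a sample-weighted running average. A minimal sketch of that meter pattern, assuming (not confirmed by the log) that per-batch Acc@1 is weighted by batch size — with `BATCH_SIZE: 128` and 1984 test images, the 16th batch holds only 64 samples, which is why the final figure can differ from the batch-14 running average:

```python
# Sketch of the running-average meter behind the "val (avg)" pairs and the
# final "* Acc@1". Assumption: averages are weighted by batch sample count.

class AverageMeter:
    """Tracks the latest value and its sample-weighted average."""
    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val: float, n: int = 1):
        self.val = val          # most recent batch value (printed first)
        self.sum += val * n     # accumulate weighted by sample count
        self.count += n

    @property
    def avg(self) -> float:
        return self.sum / self.count  # printed in parentheses


meter = AverageMeter()
for acc in [76.562] * 15:       # hypothetical per-batch accuracies
    meter.update(acc, n=128)    # 15 full batches of 128
meter.update(50.0, n=64)        # smaller final batch: 15*128 + 64 = 1984
```

The smaller last batch pulls the weighted average less than a full batch would, so the unweighted mean of the 16 per-batch accuracies would not reproduce the reported figure.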
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.618964311205754e-06, 4.618964311205754e-06, 7.10602063002881e-06, 7.10602063002881e-06, 1.0932261120525818e-05, 1.0932261120525818e-05, 1.6818784952059678e-05, 1.6818784952059678e-05, 2.587497546211177e-05, 2.587497546211177e-05, 3.98075762468073e-05, 3.98075762468073e-05, 6.124234668480042e-05, 6.124234668480042e-05, 9.421891658940519e-05, 9.421891658940519e-05, 0.00014495210105802794, 0.00014495210105802794, 0.00022300315408667836, 0.00022300315408667836, 0.00034308169720767887, 0.00034308169720767887, 0.0005278179173938336, 0.0005278179173938336, 0.0008120274869109948, 0.0008120274869109948, 0.0012492729784758581, 0.0012492729784758581]
[2025-04-03 00:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][0/573]	eta 0:22:55 lr 0.001219	time 2.4008 (2.4008)	loss 0.5206 (0.5206)	grad_norm 2.5616 (2.5616)	mem 20675MB
[2025-04-03 00:56:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][2/573]	eta 0:13:11 lr 0.001219	time 0.8770 (1.3858)	loss 0.6194 (0.5266)	grad_norm 3.9280 (3.2506)	mem 20675MB
[2025-04-03 00:56:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][4/573]	eta 0:11:12 lr 0.001219	time 0.8772 (1.1826)	loss 0.4969 (0.5232)	grad_norm 3.6400 (3.0892)	mem 20675MB
[2025-04-03 00:56:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][6/573]	eta 0:10:21 lr 0.001219	time 0.8776 (1.0957)	loss 0.6251 (0.5628)	grad_norm 3.1010 (3.2758)	mem 20675MB
[2025-04-03 00:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][8/573]	eta 0:09:51 lr 0.001219	time 0.8772 (1.0473)	loss 0.5797 (0.5552)	grad_norm 2.2849 (3.1031)	mem 20675MB
[2025-04-03 00:56:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][10/573]	eta 0:09:32 lr 0.001219	time 0.8775 (1.0165)	loss 0.6087 (0.5676)	grad_norm 2.1959 (2.9246)	mem 20675MB
[2025-04-03 00:56:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][12/573]	eta 0:09:18 lr 0.001219	time 0.8775 (0.9952)	loss 0.5968 (0.5713)	grad_norm 2.2994 (2.7550)	mem 20675MB
[2025-04-03 00:56:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][14/573]	eta 0:09:07 lr 0.001219	time 0.8778 (0.9797)	loss 0.4850 (0.5637)	grad_norm 3.2802 (2.7444)	mem 20675MB
[2025-04-03 00:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][16/573]	eta 0:08:59 lr 0.001219	time 0.8773 (0.9677)	loss 0.6471 (0.5598)	grad_norm 2.9250 (2.7447)	mem 20675MB
[2025-04-03 00:56:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][18/573]	eta 0:08:51 lr 0.001219	time 0.8773 (0.9583)	loss 0.6287 (0.5598)	grad_norm 3.8701 (2.7907)	mem 20675MB
[2025-04-03 00:56:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][20/573]	eta 0:08:45 lr 0.001219	time 0.8773 (0.9507)	loss 0.6509 (0.5594)	grad_norm 3.5207 (2.8050)	mem 20675MB
[2025-04-03 00:56:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][22/573]	eta 0:08:40 lr 0.001219	time 0.8773 (0.9444)	loss 0.5256 (0.5570)	grad_norm 2.1114 (2.7720)	mem 20675MB
[2025-04-03 00:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][24/573]	eta 0:08:35 lr 0.001219	time 0.8774 (0.9391)	loss 0.3783 (0.5537)	grad_norm 3.0171 (2.8028)	mem 20675MB
[2025-04-03 00:56:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][26/573]	eta 0:08:31 lr 0.001218	time 0.8772 (0.9345)	loss 0.5086 (0.5554)	grad_norm 3.0450 (2.8663)	mem 20675MB
[2025-04-03 00:56:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][28/573]	eta 0:08:27 lr 0.001218	time 0.8774 (0.9307)	loss 0.5835 (0.5552)	grad_norm 1.8042 (2.8189)	mem 20675MB
[2025-04-03 00:56:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][30/573]	eta 0:08:23 lr 0.001218	time 0.8773 (0.9273)	loss 0.4369 (0.5496)	grad_norm 4.2412 (2.9180)	mem 20675MB
[2025-04-03 00:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][32/573]	eta 0:08:20 lr 0.001218	time 0.8772 (0.9243)	loss 0.5425 (0.5478)	grad_norm 2.8379 (2.9176)	mem 20675MB
[2025-04-03 00:56:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][34/573]	eta 0:08:16 lr 0.001218	time 0.8771 (0.9217)	loss 0.6128 (0.5474)	grad_norm 3.6953 (2.9591)	mem 20675MB
[2025-04-03 00:56:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][36/573]	eta 0:08:13 lr 0.001218	time 0.8772 (0.9193)	loss 0.4669 (0.5421)	grad_norm 2.5145 (2.9371)	mem 20675MB
[2025-04-03 00:56:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][38/573]	eta 0:08:10 lr 0.001218	time 0.8777 (0.9172)	loss 0.4974 (0.5418)	grad_norm 4.3272 (2.9688)	mem 20675MB
[2025-04-03 00:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][40/573]	eta 0:08:07 lr 0.001218	time 0.8775 (0.9153)	loss 0.6145 (0.5466)	grad_norm 5.6038 (3.0150)	mem 20675MB
[2025-04-03 00:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][42/573]	eta 0:08:05 lr 0.001218	time 0.8773 (0.9136)	loss 0.5980 (0.5500)	grad_norm 4.3713 (3.0247)	mem 20675MB
[2025-04-03 00:56:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][44/573]	eta 0:08:02 lr 0.001218	time 0.8775 (0.9120)	loss 0.6597 (0.5545)	grad_norm 1.8110 (2.9678)	mem 20675MB
[2025-04-03 00:56:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][46/573]	eta 0:07:59 lr 0.001218	time 0.8772 (0.9106)	loss 0.6208 (0.5557)	grad_norm 2.2205 (2.9511)	mem 20675MB
[2025-04-03 00:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][48/573]	eta 0:07:57 lr 0.001218	time 0.8776 (0.9093)	loss 0.5958 (0.5560)	grad_norm 4.5142 (2.9783)	mem 20675MB
[2025-04-03 00:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][50/573]	eta 0:07:54 lr 0.001218	time 0.8785 (0.9081)	loss 0.5616 (0.5574)	grad_norm 2.2991 (2.9335)	mem 20675MB
[2025-04-03 00:56:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][52/573]	eta 0:07:52 lr 0.001218	time 0.8772 (0.9069)	loss 0.4952 (0.5552)	grad_norm 3.0928 (2.9224)	mem 20675MB
[2025-04-03 00:56:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][54/573]	eta 0:07:50 lr 0.001217	time 0.8776 (0.9059)	loss 0.6125 (0.5576)	grad_norm 2.8763 (2.9388)	mem 20675MB
[2025-04-03 00:56:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][56/573]	eta 0:07:47 lr 0.001217	time 0.8773 (0.9049)	loss 0.5529 (0.5581)	grad_norm 1.8297 (2.9297)	mem 20675MB
[2025-04-03 00:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][58/573]	eta 0:07:45 lr 0.001217	time 0.8771 (0.9040)	loss 0.4637 (0.5565)	grad_norm 2.4450 (2.9206)	mem 20675MB
[2025-04-03 00:56:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][60/573]	eta 0:07:43 lr 0.001217	time 0.8773 (0.9032)	loss 0.6236 (0.5560)	grad_norm 1.7250 (2.8944)	mem 20675MB
[2025-04-03 00:56:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][62/573]	eta 0:07:41 lr 0.001217	time 0.8773 (0.9024)	loss 0.5416 (0.5564)	grad_norm 2.5602 (2.9065)	mem 20675MB
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][64/573]	eta 0:07:38 lr 0.001217	time 0.8774 (0.9016)	loss 0.4145 (0.5535)	grad_norm 3.4219 (2.9140)	mem 20675MB
[2025-04-03 00:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][66/573]	eta 0:07:36 lr 0.001217	time 0.8774 (0.9009)	loss 0.6546 (0.5540)	grad_norm 4.4244 (2.9412)	mem 20675MB
[2025-04-03 00:57:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][68/573]	eta 0:07:34 lr 0.001217	time 0.8771 (0.9002)	loss 0.5418 (0.5539)	grad_norm 1.8623 (2.9478)	mem 20675MB
[2025-04-03 00:57:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][70/573]	eta 0:07:32 lr 0.001217	time 0.8772 (0.8996)	loss 0.5869 (0.5535)	grad_norm 3.1702 (2.9467)	mem 20675MB
[2025-04-03 00:57:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][72/573]	eta 0:07:30 lr 0.001217	time 0.8771 (0.8990)	loss 0.6339 (0.5555)	grad_norm 2.6516 (2.9425)	mem 20675MB
[2025-04-03 00:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][74/573]	eta 0:07:28 lr 0.001217	time 0.8787 (0.8985)	loss 0.5864 (0.5567)	grad_norm 1.8129 (2.9142)	mem 20675MB
[2025-04-03 00:57:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][76/573]	eta 0:07:26 lr 0.001217	time 0.8772 (0.8980)	loss 0.6034 (0.5582)	grad_norm 1.7564 (2.8930)	mem 20675MB
[2025-04-03 00:57:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][78/573]	eta 0:07:24 lr 0.001217	time 0.8776 (0.8975)	loss 0.5931 (0.5597)	grad_norm 2.4905 (2.8921)	mem 20675MB
[2025-04-03 00:57:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][80/573]	eta 0:07:22 lr 0.001217	time 0.8776 (0.8970)	loss 0.6105 (0.5591)	grad_norm 3.7929 (2.8983)	mem 20675MB
[2025-04-03 00:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][82/573]	eta 0:07:20 lr 0.001216	time 0.8772 (0.8966)	loss 0.5130 (0.5584)	grad_norm 1.9359 (2.8781)	mem 20675MB
[2025-04-03 00:57:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][84/573]	eta 0:07:18 lr 0.001216	time 0.8773 (0.8961)	loss 0.5644 (0.5581)	grad_norm 3.8229 (2.8994)	mem 20675MB
[2025-04-03 00:57:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][86/573]	eta 0:07:16 lr 0.001216	time 0.8773 (0.8957)	loss 0.6265 (0.5581)	grad_norm 3.3306 (2.9059)	mem 20675MB
[2025-04-03 00:57:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][88/573]	eta 0:07:14 lr 0.001216	time 0.8775 (0.8953)	loss 0.6264 (0.5594)	grad_norm 3.3019 (2.9139)	mem 20675MB
[2025-04-03 00:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][90/573]	eta 0:07:12 lr 0.001216	time 0.8773 (0.8949)	loss 0.5215 (0.5598)	grad_norm 6.4226 (2.9541)	mem 20675MB
[2025-04-03 00:57:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][92/573]	eta 0:07:10 lr 0.001216	time 0.8774 (0.8946)	loss 0.5598 (0.5595)	grad_norm 1.9520 (2.9525)	mem 20675MB
[2025-04-03 00:57:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][94/573]	eta 0:07:08 lr 0.001216	time 0.8774 (0.8943)	loss 0.5149 (0.5588)	grad_norm 1.8936 (2.9390)	mem 20675MB
[2025-04-03 00:57:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][96/573]	eta 0:07:06 lr 0.001216	time 0.8772 (0.8939)	loss 0.5248 (0.5585)	grad_norm 2.3436 (2.9217)	mem 20675MB
[2025-04-03 00:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][98/573]	eta 0:07:04 lr 0.001216	time 0.8771 (0.8936)	loss 0.5895 (0.5593)	grad_norm 2.4782 (2.9155)	mem 20675MB
[2025-04-03 00:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][100/573]	eta 0:07:02 lr 0.001216	time 0.8776 (0.8933)	loss 0.6191 (0.5595)	grad_norm 1.6751 (2.8970)	mem 20675MB
[2025-04-03 00:57:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][102/573]	eta 0:07:00 lr 0.001216	time 0.8774 (0.8930)	loss 0.4708 (0.5590)	grad_norm 2.8257 (2.8889)	mem 20675MB
[2025-04-03 00:57:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][104/573]	eta 0:06:58 lr 0.001216	time 0.8774 (0.8927)	loss 0.5391 (0.5594)	grad_norm 3.5149 (2.8927)	mem 20675MB
[2025-04-03 00:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][106/573]	eta 0:06:56 lr 0.001216	time 0.8781 (0.8925)	loss 0.6100 (0.5601)	grad_norm 1.9065 (2.8750)	mem 20675MB
[2025-04-03 00:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][108/573]	eta 0:06:54 lr 0.001215	time 0.8771 (0.8922)	loss 0.5906 (0.5601)	grad_norm 4.4795 (2.8910)	mem 20675MB
[2025-04-03 00:57:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][110/573]	eta 0:06:52 lr 0.001215	time 0.8775 (0.8920)	loss 0.5931 (0.5603)	grad_norm 1.5382 (2.8788)	mem 20675MB
[2025-04-03 00:57:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][112/573]	eta 0:06:51 lr 0.001215	time 0.8777 (0.8917)	loss 0.5574 (0.5607)	grad_norm 1.7527 (2.8659)	mem 20675MB
[2025-04-03 00:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][114/573]	eta 0:06:49 lr 0.001215	time 0.8774 (0.8915)	loss 0.5425 (0.5608)	grad_norm 2.9864 (2.8679)	mem 20675MB
[2025-04-03 00:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][116/573]	eta 0:06:47 lr 0.001215	time 0.8777 (0.8913)	loss 0.4571 (0.5602)	grad_norm 2.1810 (2.8503)	mem 20675MB
[2025-04-03 00:57:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][118/573]	eta 0:06:45 lr 0.001215	time 0.8776 (0.8910)	loss 0.4607 (0.5597)	grad_norm 2.4400 (2.8372)	mem 20675MB
[2025-04-03 00:57:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][120/573]	eta 0:06:43 lr 0.001215	time 0.8780 (0.8908)	loss 0.5817 (0.5599)	grad_norm 2.2324 (2.8356)	mem 20675MB
[2025-04-03 00:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][122/573]	eta 0:06:41 lr 0.001215	time 0.8774 (0.8906)	loss 0.4823 (0.5584)	grad_norm 3.1363 (2.8356)	mem 20675MB
[2025-04-03 00:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][124/573]	eta 0:06:39 lr 0.001215	time 0.8771 (0.8904)	loss 0.5916 (0.5588)	grad_norm 2.5749 (2.8363)	mem 20675MB
[2025-04-03 00:57:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][126/573]	eta 0:06:37 lr 0.001215	time 0.8775 (0.8902)	loss 0.5559 (0.5580)	grad_norm 2.4230 (2.8314)	mem 20675MB
[2025-04-03 00:57:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][128/573]	eta 0:06:36 lr 0.001215	time 0.8773 (0.8901)	loss 0.5041 (0.5566)	grad_norm 2.5756 (2.8306)	mem 20675MB
[2025-04-03 00:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][130/573]	eta 0:06:34 lr 0.001215	time 0.8773 (0.8899)	loss 0.5248 (0.5568)	grad_norm 4.9616 (2.8434)	mem 20675MB
[2025-04-03 00:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][132/573]	eta 0:06:32 lr 0.001215	time 0.8774 (0.8897)	loss 0.4999 (0.5554)	grad_norm 3.1189 (2.8477)	mem 20675MB
[2025-04-03 00:57:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][134/573]	eta 0:06:30 lr 0.001215	time 0.8776 (0.8895)	loss 0.4443 (0.5553)	grad_norm 5.1831 (2.8781)	mem 20675MB
[2025-04-03 00:58:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][136/573]	eta 0:06:28 lr 0.001214	time 0.8774 (0.8894)	loss 0.6110 (0.5551)	grad_norm 2.4217 (2.8792)	mem 20675MB
[2025-04-03 00:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][138/573]	eta 0:06:26 lr 0.001214	time 0.8770 (0.8892)	loss 0.5696 (0.5548)	grad_norm 1.7024 (2.8701)	mem 20675MB
[2025-04-03 00:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][140/573]	eta 0:06:24 lr 0.001214	time 0.8772 (0.8891)	loss 0.5597 (0.5550)	grad_norm 2.2871 (2.8618)	mem 20675MB
[2025-04-03 00:58:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][142/573]	eta 0:06:23 lr 0.001214	time 0.8774 (0.8889)	loss 0.5784 (0.5554)	grad_norm 1.6701 (2.8522)	mem 20675MB
[2025-04-03 00:58:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][144/573]	eta 0:06:21 lr 0.001214	time 0.8772 (0.8888)	loss 0.4209 (0.5544)	grad_norm 2.2074 (2.8572)	mem 20675MB
[2025-04-03 00:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][146/573]	eta 0:06:19 lr 0.001214	time 0.8775 (0.8886)	loss 0.5231 (0.5546)	grad_norm 4.2592 (2.8732)	mem 20675MB
[2025-04-03 00:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][148/573]	eta 0:06:17 lr 0.001214	time 0.8772 (0.8885)	loss 0.5537 (0.5539)	grad_norm 1.8908 (2.8748)	mem 20675MB
[2025-04-03 00:58:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][150/573]	eta 0:06:15 lr 0.001214	time 0.8777 (0.8884)	loss 0.5718 (0.5535)	grad_norm 2.7081 (2.8838)	mem 20675MB
[2025-04-03 00:58:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][152/573]	eta 0:06:13 lr 0.001214	time 0.8775 (0.8882)	loss 0.5430 (0.5536)	grad_norm 3.5776 (2.8808)	mem 20675MB
[2025-04-03 00:58:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][154/573]	eta 0:06:12 lr 0.001214	time 0.8787 (0.8881)	loss 0.5898 (0.5543)	grad_norm 2.5600 (2.8764)	mem 20675MB
[2025-04-03 00:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][156/573]	eta 0:06:10 lr 0.001214	time 0.8774 (0.8880)	loss 0.6207 (0.5550)	grad_norm 2.9253 (2.8707)	mem 20675MB
[2025-04-03 00:58:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][158/573]	eta 0:06:08 lr 0.001214	time 0.8776 (0.8879)	loss 0.5329 (0.5550)	grad_norm 1.8950 (2.8567)	mem 20675MB
[2025-04-03 00:58:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][160/573]	eta 0:06:06 lr 0.001214	time 0.8773 (0.8877)	loss 0.6075 (0.5557)	grad_norm 1.4223 (2.8393)	mem 20675MB
[2025-04-03 00:58:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][162/573]	eta 0:06:04 lr 0.001213	time 0.8772 (0.8876)	loss 0.5544 (0.5557)	grad_norm 1.8088 (2.8270)	mem 20675MB
[2025-04-03 00:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][164/573]	eta 0:06:02 lr 0.001213	time 0.8775 (0.8875)	loss 0.6836 (0.5555)	grad_norm 3.9463 (2.8364)	mem 20675MB
[2025-04-03 00:58:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][166/573]	eta 0:06:01 lr 0.001213	time 0.8774 (0.8874)	loss 0.5478 (0.5559)	grad_norm 2.4630 (2.8356)	mem 20675MB
[2025-04-03 00:58:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][168/573]	eta 0:05:59 lr 0.001213	time 0.8776 (0.8873)	loss 0.5796 (0.5559)	grad_norm 2.4523 (2.8346)	mem 20675MB
[2025-04-03 00:58:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][170/573]	eta 0:05:57 lr 0.001213	time 0.8774 (0.8872)	loss 0.5162 (0.5547)	grad_norm 2.9819 (2.8379)	mem 20675MB
[2025-04-03 00:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][172/573]	eta 0:05:55 lr 0.001213	time 0.8777 (0.8871)	loss 0.4796 (0.5536)	grad_norm 3.2775 (2.8411)	mem 20675MB
[2025-04-03 00:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][174/573]	eta 0:05:53 lr 0.001213	time 0.8772 (0.8870)	loss 0.4730 (0.5539)	grad_norm 5.7293 (2.8632)	mem 20675MB
[2025-04-03 00:58:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][176/573]	eta 0:05:52 lr 0.001213	time 0.8772 (0.8869)	loss 0.6248 (0.5547)	grad_norm 1.9355 (2.8622)	mem 20675MB
[2025-04-03 00:58:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][178/573]	eta 0:05:50 lr 0.001213	time 0.8771 (0.8868)	loss 0.4850 (0.5536)	grad_norm 3.7386 (2.8777)	mem 20675MB
[2025-04-03 00:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][180/573]	eta 0:05:48 lr 0.001213	time 0.8774 (0.8867)	loss 0.5403 (0.5537)	grad_norm 2.1936 (2.8760)	mem 20675MB
[2025-04-03 00:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][182/573]	eta 0:05:46 lr 0.001213	time 0.8785 (0.8866)	loss 0.5661 (0.5543)	grad_norm 1.6772 (2.8712)	mem 20675MB
[2025-04-03 00:58:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][184/573]	eta 0:05:44 lr 0.001213	time 0.8771 (0.8865)	loss 0.6161 (0.5545)	grad_norm 1.6492 (2.8579)	mem 20675MB
[2025-04-03 00:58:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][186/573]	eta 0:05:43 lr 0.001213	time 0.8772 (0.8864)	loss 0.5346 (0.5545)	grad_norm 3.0774 (2.8584)	mem 20675MB
[2025-04-03 00:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][188/573]	eta 0:05:41 lr 0.001212	time 0.8777 (0.8863)	loss 0.5247 (0.5540)	grad_norm 3.0330 (2.8572)	mem 20675MB
[2025-04-03 00:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][190/573]	eta 0:05:39 lr 0.001212	time 0.8770 (0.8863)	loss 0.5590 (0.5541)	grad_norm 2.4673 (2.8654)	mem 20675MB
[2025-04-03 00:58:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][192/573]	eta 0:05:37 lr 0.001212	time 0.8772 (0.8862)	loss 0.5347 (0.5543)	grad_norm 2.5152 (2.8640)	mem 20675MB
[2025-04-03 00:58:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][194/573]	eta 0:05:35 lr 0.001212	time 0.8774 (0.8861)	loss 0.4594 (0.5535)	grad_norm 5.9757 (2.8784)	mem 20675MB
[2025-04-03 00:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][196/573]	eta 0:05:34 lr 0.001212	time 0.8779 (0.8860)	loss 0.5845 (0.5528)	grad_norm 4.8035 (2.8910)	mem 20675MB
[2025-04-03 00:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][198/573]	eta 0:05:32 lr 0.001212	time 0.8776 (0.8859)	loss 0.4219 (0.5518)	grad_norm 3.0203 (2.8928)	mem 20675MB
[2025-04-03 00:58:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][200/573]	eta 0:05:30 lr 0.001212	time 0.8776 (0.8859)	loss 0.5579 (0.5521)	grad_norm 5.2832 (2.9096)	mem 20675MB
[2025-04-03 00:58:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][202/573]	eta 0:05:28 lr 0.001212	time 0.8779 (0.8858)	loss 0.5382 (0.5519)	grad_norm 3.0369 (2.9108)	mem 20675MB
[2025-04-03 00:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][204/573]	eta 0:05:26 lr 0.001212	time 0.8772 (0.8857)	loss 0.4891 (0.5516)	grad_norm 3.7452 (2.9171)	mem 20675MB
[2025-04-03 00:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][206/573]	eta 0:05:25 lr 0.001212	time 0.8774 (0.8856)	loss 0.4562 (0.5514)	grad_norm 4.0645 (2.9284)	mem 20675MB
[2025-04-03 00:59:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][208/573]	eta 0:05:23 lr 0.001212	time 0.8773 (0.8856)	loss 0.6102 (0.5515)	grad_norm 1.5641 (2.9245)	mem 20675MB
[2025-04-03 00:59:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][210/573]	eta 0:05:21 lr 0.001212	time 0.8774 (0.8855)	loss 0.6192 (0.5519)	grad_norm 4.9982 (2.9356)	mem 20675MB
[2025-04-03 00:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][212/573]	eta 0:05:19 lr 0.001211	time 0.8773 (0.8854)	loss 0.5042 (0.5517)	grad_norm 3.5616 (2.9357)	mem 20675MB
[2025-04-03 00:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][214/573]	eta 0:05:17 lr 0.001211	time 0.8773 (0.8854)	loss 0.5807 (0.5515)	grad_norm 2.2497 (2.9290)	mem 20675MB
[2025-04-03 00:59:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][216/573]	eta 0:05:16 lr 0.001211	time 0.8774 (0.8853)	loss 0.5411 (0.5516)	grad_norm 3.1516 (2.9303)	mem 20675MB
[2025-04-03 00:59:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][218/573]	eta 0:05:14 lr 0.001211	time 0.8774 (0.8852)	loss 0.5666 (0.5520)	grad_norm 3.4047 (2.9434)	mem 20675MB
[2025-04-03 00:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][220/573]	eta 0:05:12 lr 0.001211	time 0.8774 (0.8852)	loss 0.5372 (0.5517)	grad_norm 3.1973 (2.9439)	mem 20675MB
[2025-04-03 00:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][222/573]	eta 0:05:10 lr 0.001211	time 0.8771 (0.8851)	loss 0.5806 (0.5519)	grad_norm 3.6558 (2.9440)	mem 20675MB
[2025-04-03 00:59:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][224/573]	eta 0:05:08 lr 0.001211	time 0.8773 (0.8850)	loss 0.5302 (0.5517)	grad_norm 2.6046 (2.9401)	mem 20675MB
[2025-04-03 00:59:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][226/573]	eta 0:05:07 lr 0.001211	time 0.8773 (0.8850)	loss 0.4935 (0.5515)	grad_norm 2.3165 (2.9323)	mem 20675MB
[2025-04-03 00:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][228/573]	eta 0:05:05 lr 0.001211	time 0.8768 (0.8849)	loss 0.4105 (0.5505)	grad_norm 2.9773 (2.9321)	mem 20675MB
[2025-04-03 00:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][230/573]	eta 0:05:03 lr 0.001211	time 0.8770 (0.8849)	loss 0.6442 (0.5510)	grad_norm 3.4144 (2.9346)	mem 20675MB
[2025-04-03 00:59:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][232/573]	eta 0:05:01 lr 0.001211	time 0.8770 (0.8848)	loss 0.5618 (0.5511)	grad_norm 2.5923 (2.9282)	mem 20675MB
[2025-04-03 00:59:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][234/573]	eta 0:04:59 lr 0.001211	time 0.8773 (0.8847)	loss 0.4286 (0.5502)	grad_norm 3.4093 (2.9326)	mem 20675MB
[2025-04-03 00:59:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][236/573]	eta 0:04:58 lr 0.001211	time 0.8773 (0.8847)	loss 0.6314 (0.5505)	grad_norm 4.2892 (2.9339)	mem 20675MB
[2025-04-03 00:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][238/573]	eta 0:04:56 lr 0.001210	time 0.8771 (0.8846)	loss 0.6199 (0.5509)	grad_norm 2.5768 (2.9277)	mem 20675MB
[2025-04-03 00:59:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][240/573]	eta 0:04:54 lr 0.001210	time 0.8774 (0.8846)	loss 0.4578 (0.5505)	grad_norm 2.8900 (2.9272)	mem 20675MB
[2025-04-03 00:59:33 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][242/573]	eta 0:04:52 lr 0.001210	time 0.8772 (0.8845)	loss 0.6200 (0.5509)	grad_norm 2.7537 (2.9219)	mem 20675MB
[2025-04-03 00:59:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][244/573]	eta 0:04:50 lr 0.001210	time 0.8773 (0.8845)	loss 0.4590 (0.5501)	grad_norm 2.8222 (2.9222)	mem 20675MB
[2025-04-03 00:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][246/573]	eta 0:04:49 lr 0.001210	time 0.8770 (0.8844)	loss 0.4220 (0.5497)	grad_norm 2.4444 (2.9189)	mem 20675MB
[2025-04-03 00:59:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][248/573]	eta 0:04:47 lr 0.001210	time 0.8773 (0.8844)	loss 0.4528 (0.5494)	grad_norm 2.9049 (2.9197)	mem 20675MB
[2025-04-03 00:59:40 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][250/573]	eta 0:04:45 lr 0.001210	time 0.8771 (0.8843)	loss 0.5273 (0.5492)	grad_norm 3.2832 (2.9223)	mem 20675MB
[2025-04-03 00:59:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][252/573]	eta 0:04:43 lr 0.001210	time 0.8776 (0.8843)	loss 0.6007 (0.5492)	grad_norm 3.7399 (2.9337)	mem 20675MB
[2025-04-03 00:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][254/573]	eta 0:04:42 lr 0.001210	time 0.8773 (0.8842)	loss 0.6855 (0.5498)	grad_norm 5.2277 (2.9474)	mem 20675MB
[2025-04-03 00:59:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][256/573]	eta 0:04:40 lr 0.001210	time 0.8771 (0.8842)	loss 0.5316 (0.5497)	grad_norm 2.1832 (2.9492)	mem 20675MB
[2025-04-03 00:59:47 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][258/573]	eta 0:04:38 lr 0.001210	time 0.8776 (0.8841)	loss 0.6243 (0.5502)	grad_norm 3.9648 (2.9495)	mem 20675MB
[2025-04-03 00:59:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][260/573]	eta 0:04:36 lr 0.001210	time 0.8776 (0.8841)	loss 0.4305 (0.5494)	grad_norm 2.7953 (2.9590)	mem 20675MB
[2025-04-03 00:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][262/573]	eta 0:04:34 lr 0.001209	time 0.8771 (0.8840)	loss 0.4289 (0.5486)	grad_norm 2.5108 (2.9560)	mem 20675MB
[2025-04-03 00:59:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][264/573]	eta 0:04:33 lr 0.001209	time 0.8776 (0.8840)	loss 0.4163 (0.5484)	grad_norm 3.0523 (2.9559)	mem 20675MB
[2025-04-03 00:59:54 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][266/573]	eta 0:04:31 lr 0.001209	time 0.8771 (0.8839)	loss 0.5090 (0.5484)	grad_norm 3.3888 (2.9564)	mem 20675MB
[2025-04-03 00:59:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][268/573]	eta 0:04:29 lr 0.001209	time 0.8772 (0.8839)	loss 0.6197 (0.5487)	grad_norm 5.2983 (2.9704)	mem 20675MB
[2025-04-03 00:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][270/573]	eta 0:04:27 lr 0.001209	time 0.8775 (0.8839)	loss 0.4933 (0.5485)	grad_norm 2.7806 (2.9668)	mem 20675MB
[2025-04-03 01:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][272/573]	eta 0:04:26 lr 0.001209	time 0.8770 (0.8838)	loss 0.5514 (0.5486)	grad_norm 3.9377 (2.9659)	mem 20675MB
[2025-04-03 01:00:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][274/573]	eta 0:04:24 lr 0.001209	time 0.8770 (0.8838)	loss 0.5450 (0.5484)	grad_norm 2.1373 (2.9635)	mem 20675MB
[2025-04-03 01:00:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][276/573]	eta 0:04:22 lr 0.001209	time 0.8773 (0.8837)	loss 0.5948 (0.5480)	grad_norm 3.4252 (2.9716)	mem 20675MB
[2025-04-03 01:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][278/573]	eta 0:04:20 lr 0.001209	time 0.8774 (0.8837)	loss 0.4351 (0.5478)	grad_norm 3.3125 (2.9726)	mem 20675MB
[2025-04-03 01:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][280/573]	eta 0:04:18 lr 0.001209	time 0.8774 (0.8837)	loss 0.5917 (0.5482)	grad_norm 2.9764 (2.9707)	mem 20675MB
[2025-04-03 01:00:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][282/573]	eta 0:04:17 lr 0.001209	time 0.8774 (0.8836)	loss 0.4765 (0.5482)	grad_norm 2.6388 (2.9742)	mem 20675MB
[2025-04-03 01:00:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][284/573]	eta 0:04:15 lr 0.001209	time 0.8775 (0.8836)	loss 0.6349 (0.5486)	grad_norm 3.1878 (2.9699)	mem 20675MB
[2025-04-03 01:00:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][286/573]	eta 0:04:13 lr 0.001209	time 0.8773 (0.8836)	loss 0.6783 (0.5490)	grad_norm 2.1929 (2.9637)	mem 20675MB
[2025-04-03 01:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][288/573]	eta 0:04:11 lr 0.001208	time 0.8771 (0.8835)	loss 0.6015 (0.5494)	grad_norm 1.5175 (2.9557)	mem 20675MB
[2025-04-03 01:00:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][290/573]	eta 0:04:10 lr 0.001208	time 0.8770 (0.8835)	loss 0.5343 (0.5495)	grad_norm 2.2877 (2.9511)	mem 20675MB
[2025-04-03 01:00:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][292/573]	eta 0:04:08 lr 0.001208	time 0.8772 (0.8834)	loss 0.5382 (0.5496)	grad_norm 1.7878 (2.9427)	mem 20675MB
[2025-04-03 01:00:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][294/573]	eta 0:04:06 lr 0.001208	time 0.8774 (0.8834)	loss 0.5031 (0.5492)	grad_norm 2.0211 (2.9373)	mem 20675MB
[2025-04-03 01:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][296/573]	eta 0:04:04 lr 0.001208	time 0.8773 (0.8834)	loss 0.4921 (0.5488)	grad_norm 2.3388 (2.9366)	mem 20675MB
[2025-04-03 01:00:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][298/573]	eta 0:04:02 lr 0.001208	time 0.8771 (0.8833)	loss 0.6481 (0.5492)	grad_norm 3.5459 (2.9376)	mem 20675MB
[2025-04-03 01:00:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][300/573]	eta 0:04:01 lr 0.001208	time 0.8772 (0.8833)	loss 0.6222 (0.5495)	grad_norm 2.3094 (2.9437)	mem 20675MB
[2025-04-03 01:00:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][302/573]	eta 0:03:59 lr 0.001208	time 0.8770 (0.8833)	loss 0.5662 (0.5493)	grad_norm 2.0480 (2.9406)	mem 20675MB
[2025-04-03 01:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][304/573]	eta 0:03:57 lr 0.001208	time 0.8772 (0.8832)	loss 0.4931 (0.5491)	grad_norm 2.9675 (2.9381)	mem 20675MB
[2025-04-03 01:00:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][306/573]	eta 0:03:55 lr 0.001208	time 0.8774 (0.8832)	loss 0.6023 (0.5495)	grad_norm 1.8877 (2.9363)	mem 20675MB
[2025-04-03 01:00:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][308/573]	eta 0:03:54 lr 0.001208	time 0.8777 (0.8832)	loss 0.4988 (0.5495)	grad_norm 3.3478 (2.9331)	mem 20675MB
[2025-04-03 01:00:33 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][310/573]	eta 0:03:52 lr 0.001208	time 0.8778 (0.8831)	loss 0.5621 (0.5495)	grad_norm 1.6570 (2.9274)	mem 20675MB
[2025-04-03 01:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][312/573]	eta 0:03:50 lr 0.001207	time 0.8776 (0.8831)	loss 0.5873 (0.5496)	grad_norm 1.7612 (2.9191)	mem 20675MB
[2025-04-03 01:00:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][314/573]	eta 0:03:48 lr 0.001207	time 0.8773 (0.8831)	loss 0.4717 (0.5497)	grad_norm 2.3925 (2.9161)	mem 20675MB
[2025-04-03 01:00:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][316/573]	eta 0:03:46 lr 0.001207	time 0.8775 (0.8830)	loss 0.5683 (0.5494)	grad_norm 2.5043 (2.9152)	mem 20675MB
[2025-04-03 01:00:40 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][318/573]	eta 0:03:45 lr 0.001207	time 0.8774 (0.8830)	loss 0.4852 (0.5492)	grad_norm 4.0308 (2.9168)	mem 20675MB
[2025-04-03 01:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][320/573]	eta 0:03:43 lr 0.001207	time 0.8777 (0.8830)	loss 0.5417 (0.5492)	grad_norm 2.2119 (2.9150)	mem 20675MB
[2025-04-03 01:00:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][322/573]	eta 0:03:41 lr 0.001207	time 0.8775 (0.8830)	loss 0.4806 (0.5493)	grad_norm 2.4724 (2.9123)	mem 20675MB
[2025-04-03 01:00:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][324/573]	eta 0:03:39 lr 0.001207	time 0.8780 (0.8829)	loss 0.4198 (0.5489)	grad_norm 2.7953 (2.9125)	mem 20675MB
[2025-04-03 01:00:47 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][326/573]	eta 0:03:38 lr 0.001207	time 0.8781 (0.8829)	loss 0.4591 (0.5488)	grad_norm 2.1943 (2.9102)	mem 20675MB
[2025-04-03 01:00:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][328/573]	eta 0:03:36 lr 0.001207	time 0.8784 (0.8829)	loss 0.6505 (0.5490)	grad_norm 3.4617 (2.9079)	mem 20675MB
[2025-04-03 01:00:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][330/573]	eta 0:03:34 lr 0.001207	time 0.8777 (0.8828)	loss 0.5367 (0.5490)	grad_norm 3.9741 (2.9072)	mem 20675MB
[2025-04-03 01:00:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][332/573]	eta 0:03:32 lr 0.001207	time 0.8775 (0.8828)	loss 0.5832 (0.5491)	grad_norm 1.7810 (2.9126)	mem 20675MB
[2025-04-03 01:00:54 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][334/573]	eta 0:03:30 lr 0.001207	time 0.8777 (0.8828)	loss 0.6154 (0.5495)	grad_norm 1.4564 (2.9085)	mem 20675MB
[2025-04-03 01:00:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][336/573]	eta 0:03:29 lr 0.001206	time 0.8777 (0.8828)	loss 0.5093 (0.5493)	grad_norm 4.6078 (2.9167)	mem 20675MB
[2025-04-03 01:00:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][338/573]	eta 0:03:27 lr 0.001206	time 0.8774 (0.8827)	loss 0.4847 (0.5489)	grad_norm 3.9460 (2.9192)	mem 20675MB
[2025-04-03 01:00:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][340/573]	eta 0:03:25 lr 0.001206	time 0.8777 (0.8827)	loss 0.4848 (0.5488)	grad_norm 2.6714 (2.9160)	mem 20675MB
[2025-04-03 01:01:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][342/573]	eta 0:03:23 lr 0.001206	time 0.8784 (0.8827)	loss 0.3939 (0.5481)	grad_norm 4.7811 (2.9190)	mem 20675MB
[2025-04-03 01:01:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][344/573]	eta 0:03:22 lr 0.001206	time 0.8777 (0.8827)	loss 0.6570 (0.5484)	grad_norm 4.4012 (2.9249)	mem 20675MB
[2025-04-03 01:01:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][346/573]	eta 0:03:20 lr 0.001206	time 0.8778 (0.8826)	loss 0.5264 (0.5485)	grad_norm 3.1178 (2.9252)	mem 20675MB
[2025-04-03 01:01:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][348/573]	eta 0:03:18 lr 0.001206	time 0.8794 (0.8826)	loss 0.5226 (0.5481)	grad_norm 1.9187 (2.9224)	mem 20675MB
[2025-04-03 01:01:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][350/573]	eta 0:03:16 lr 0.001206	time 0.8777 (0.8826)	loss 0.6020 (0.5484)	grad_norm 2.0989 (2.9172)	mem 20675MB
[2025-04-03 01:01:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][352/573]	eta 0:03:15 lr 0.001206	time 0.8776 (0.8826)	loss 0.5947 (0.5485)	grad_norm 1.5827 (2.9110)	mem 20675MB
[2025-04-03 01:01:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][354/573]	eta 0:03:13 lr 0.001206	time 0.8777 (0.8826)	loss 0.5450 (0.5485)	grad_norm 1.5909 (2.9056)	mem 20675MB
[2025-04-03 01:01:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][356/573]	eta 0:03:11 lr 0.001206	time 0.8776 (0.8825)	loss 0.5805 (0.5486)	grad_norm 1.7442 (2.9000)	mem 20675MB
[2025-04-03 01:01:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][358/573]	eta 0:03:09 lr 0.001206	time 0.8775 (0.8825)	loss 0.5602 (0.5486)	grad_norm 2.2105 (2.8950)	mem 20675MB
[2025-04-03 01:01:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][360/573]	eta 0:03:07 lr 0.001205	time 0.8791 (0.8825)	loss 0.4941 (0.5485)	grad_norm 3.4360 (2.8929)	mem 20675MB
[2025-04-03 01:01:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][362/573]	eta 0:03:06 lr 0.001205	time 0.8779 (0.8825)	loss 0.3897 (0.5476)	grad_norm 2.5670 (2.8918)	mem 20675MB
[2025-04-03 01:01:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][364/573]	eta 0:03:04 lr 0.001205	time 0.8776 (0.8825)	loss 0.6535 (0.5479)	grad_norm 3.2401 (2.8925)	mem 20675MB
[2025-04-03 01:01:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][366/573]	eta 0:03:02 lr 0.001205	time 0.8778 (0.8824)	loss 0.5724 (0.5483)	grad_norm 3.2484 (2.8977)	mem 20675MB
[2025-04-03 01:01:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][368/573]	eta 0:03:00 lr 0.001205	time 0.8775 (0.8824)	loss 0.5793 (0.5484)	grad_norm 2.8045 (2.8943)	mem 20675MB
[2025-04-03 01:01:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][370/573]	eta 0:02:59 lr 0.001205	time 0.8775 (0.8824)	loss 0.5326 (0.5484)	grad_norm 1.8173 (2.8882)	mem 20675MB
[2025-04-03 01:01:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][372/573]	eta 0:02:57 lr 0.001205	time 0.8778 (0.8824)	loss 0.5112 (0.5485)	grad_norm 2.2213 (2.8840)	mem 20675MB
[2025-04-03 01:01:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][374/573]	eta 0:02:55 lr 0.001205	time 0.8777 (0.8823)	loss 0.5029 (0.5486)	grad_norm 2.8014 (2.8805)	mem 20675MB
[2025-04-03 01:01:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][376/573]	eta 0:02:53 lr 0.001205	time 0.8772 (0.8823)	loss 0.5870 (0.5488)	grad_norm 1.4311 (2.8755)	mem 20675MB
[2025-04-03 01:01:33 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][378/573]	eta 0:02:52 lr 0.001205	time 0.8772 (0.8823)	loss 0.5140 (0.5485)	grad_norm 2.0149 (2.8743)	mem 20675MB
[2025-04-03 01:01:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][380/573]	eta 0:02:50 lr 0.001205	time 0.8773 (0.8823)	loss 0.3991 (0.5482)	grad_norm 2.2065 (2.8684)	mem 20675MB
[2025-04-03 01:01:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][382/573]	eta 0:02:48 lr 0.001204	time 0.8774 (0.8823)	loss 0.6187 (0.5485)	grad_norm 3.6580 (2.8704)	mem 20675MB
[2025-04-03 01:01:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][384/573]	eta 0:02:46 lr 0.001204	time 0.8770 (0.8822)	loss 0.4848 (0.5483)	grad_norm 2.3931 (2.8673)	mem 20675MB
[2025-04-03 01:01:40 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][386/573]	eta 0:02:44 lr 0.001204	time 0.8775 (0.8822)	loss 0.5617 (0.5485)	grad_norm 2.1235 (2.8670)	mem 20675MB
[2025-04-03 01:01:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][388/573]	eta 0:02:43 lr 0.001204	time 0.8775 (0.8822)	loss 0.4523 (0.5484)	grad_norm 3.6327 (2.8674)	mem 20675MB
[2025-04-03 01:01:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][390/573]	eta 0:02:41 lr 0.001204	time 0.8778 (0.8822)	loss 0.5205 (0.5484)	grad_norm 2.5246 (2.8670)	mem 20675MB
[2025-04-03 01:01:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][392/573]	eta 0:02:39 lr 0.001204	time 0.8771 (0.8822)	loss 0.4921 (0.5483)	grad_norm 3.0171 (2.8667)	mem 20675MB
[2025-04-03 01:01:47 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][394/573]	eta 0:02:37 lr 0.001204	time 0.8774 (0.8821)	loss 0.6016 (0.5483)	grad_norm 3.0840 (2.8684)	mem 20675MB
[2025-04-03 01:01:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][396/573]	eta 0:02:36 lr 0.001204	time 0.8772 (0.8821)	loss 0.6219 (0.5487)	grad_norm 2.0026 (2.8640)	mem 20675MB
[2025-04-03 01:01:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][398/573]	eta 0:02:34 lr 0.001204	time 0.8775 (0.8821)	loss 0.5288 (0.5488)	grad_norm 3.4487 (2.8637)	mem 20675MB
[2025-04-03 01:01:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][400/573]	eta 0:02:32 lr 0.001204	time 0.8802 (0.8821)	loss 0.4489 (0.5485)	grad_norm 2.4229 (2.8602)	mem 20675MB
[2025-04-03 01:01:54 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][402/573]	eta 0:02:30 lr 0.001204	time 0.8783 (0.8821)	loss 0.5513 (0.5485)	grad_norm 2.6417 (2.8573)	mem 20675MB
[2025-04-03 01:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][404/573]	eta 0:02:29 lr 0.001204	time 0.8773 (0.8820)	loss 0.5514 (0.5486)	grad_norm 2.1119 (2.8533)	mem 20675MB
[2025-04-03 01:01:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][406/573]	eta 0:02:27 lr 0.001203	time 0.8776 (0.8820)	loss 0.5847 (0.5488)	grad_norm 2.2228 (2.8498)	mem 20675MB
[2025-04-03 01:01:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][408/573]	eta 0:02:25 lr 0.001203	time 0.8773 (0.8820)	loss 0.5264 (0.5487)	grad_norm 2.5238 (2.8469)	mem 20675MB
[2025-04-03 01:02:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][410/573]	eta 0:02:23 lr 0.001203	time 0.8773 (0.8820)	loss 0.6336 (0.5488)	grad_norm 3.7941 (2.8483)	mem 20675MB
[2025-04-03 01:02:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][412/573]	eta 0:02:21 lr 0.001203	time 0.8774 (0.8820)	loss 0.6164 (0.5491)	grad_norm 3.0963 (2.8471)	mem 20675MB
[2025-04-03 01:02:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][414/573]	eta 0:02:20 lr 0.001203	time 0.8773 (0.8820)	loss 0.6423 (0.5493)	grad_norm 2.4744 (2.8457)	mem 20675MB
[2025-04-03 01:02:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][416/573]	eta 0:02:18 lr 0.001203	time 0.8774 (0.8819)	loss 0.5589 (0.5495)	grad_norm 1.9631 (2.8412)	mem 20675MB
[2025-04-03 01:02:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][418/573]	eta 0:02:16 lr 0.001203	time 0.8772 (0.8819)	loss 0.5076 (0.5495)	grad_norm 2.7774 (2.8395)	mem 20675MB
[2025-04-03 01:02:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][420/573]	eta 0:02:14 lr 0.001203	time 0.8771 (0.8819)	loss 0.6045 (0.5495)	grad_norm 1.7215 (2.8389)	mem 20675MB
[2025-04-03 01:02:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][422/573]	eta 0:02:13 lr 0.001203	time 0.8773 (0.8819)	loss 0.6021 (0.5495)	grad_norm 2.4622 (2.8408)	mem 20675MB
[2025-04-03 01:02:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][424/573]	eta 0:02:11 lr 0.001203	time 0.8775 (0.8819)	loss 0.5455 (0.5493)	grad_norm 3.3262 (2.8441)	mem 20675MB
[2025-04-03 01:02:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][426/573]	eta 0:02:09 lr 0.001203	time 0.8773 (0.8818)	loss 0.6169 (0.5496)	grad_norm 1.9528 (2.8437)	mem 20675MB
[2025-04-03 01:02:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][428/573]	eta 0:02:07 lr 0.001203	time 0.8771 (0.8818)	loss 0.5583 (0.5495)	grad_norm 1.8180 (2.8384)	mem 20675MB
[2025-04-03 01:02:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][430/573]	eta 0:02:06 lr 0.001202	time 0.8773 (0.8818)	loss 0.5013 (0.5492)	grad_norm 4.2065 (2.8402)	mem 20675MB
[2025-04-03 01:02:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][432/573]	eta 0:02:04 lr 0.001202	time 0.8774 (0.8818)	loss 0.5888 (0.5494)	grad_norm 3.2079 (2.8422)	mem 20675MB
[2025-04-03 01:02:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][434/573]	eta 0:02:02 lr 0.001202	time 0.8777 (0.8818)	loss 0.6038 (0.5495)	grad_norm 2.1991 (2.8392)	mem 20675MB
[2025-04-03 01:02:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][436/573]	eta 0:02:00 lr 0.001202	time 0.8771 (0.8818)	loss 0.6112 (0.5496)	grad_norm 3.3999 (2.8394)	mem 20675MB
[2025-04-03 01:02:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][438/573]	eta 0:01:59 lr 0.001202	time 0.8775 (0.8817)	loss 0.6328 (0.5499)	grad_norm 2.1290 (2.8358)	mem 20675MB
[2025-04-03 01:02:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][440/573]	eta 0:01:57 lr 0.001202	time 0.8773 (0.8817)	loss 0.4459 (0.5494)	grad_norm 3.1977 (2.8409)	mem 20675MB
[2025-04-03 01:02:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][442/573]	eta 0:01:55 lr 0.001202	time 0.8771 (0.8817)	loss 0.5294 (0.5495)	grad_norm 3.4925 (2.8405)	mem 20675MB
[2025-04-03 01:02:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][444/573]	eta 0:01:53 lr 0.001202	time 0.8770 (0.8817)	loss 0.5537 (0.5497)	grad_norm 2.5984 (2.8391)	mem 20675MB
[2025-04-03 01:02:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][446/573]	eta 0:01:51 lr 0.001202	time 0.8772 (0.8817)	loss 0.5574 (0.5497)	grad_norm 3.0922 (2.8415)	mem 20675MB
[2025-04-03 01:02:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][448/573]	eta 0:01:50 lr 0.001202	time 0.8774 (0.8817)	loss 0.5856 (0.5500)	grad_norm 2.6243 (2.8408)	mem 20675MB
[2025-04-03 01:02:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][450/573]	eta 0:01:48 lr 0.001202	time 0.8773 (0.8817)	loss 0.5995 (0.5500)	grad_norm 1.9296 (2.8363)	mem 20675MB
[2025-04-03 01:02:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][452/573]	eta 0:01:46 lr 0.001201	time 0.8778 (0.8816)	loss 0.4998 (0.5501)	grad_norm 2.2263 (2.8364)	mem 20675MB
[2025-04-03 01:02:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][454/573]	eta 0:01:44 lr 0.001201	time 0.8772 (0.8816)	loss 0.5786 (0.5500)	grad_norm 1.5237 (2.8360)	mem 20675MB
[2025-04-03 01:02:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][456/573]	eta 0:01:43 lr 0.001201	time 0.8770 (0.8816)	loss 0.5978 (0.5499)	grad_norm 2.6431 (2.8389)	mem 20675MB
[2025-04-03 01:02:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][458/573]	eta 0:01:41 lr 0.001201	time 0.8771 (0.8816)	loss 0.5700 (0.5501)	grad_norm 1.8359 (2.8357)	mem 20675MB
[2025-04-03 01:02:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][460/573]	eta 0:01:39 lr 0.001201	time 0.8771 (0.8816)	loss 0.5490 (0.5500)	grad_norm 1.9154 (2.8436)	mem 20675MB
[2025-04-03 01:02:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][462/573]	eta 0:01:37 lr 0.001201	time 0.8772 (0.8816)	loss 0.5520 (0.5498)	grad_norm 1.7624 (2.8425)	mem 20675MB
[2025-04-03 01:02:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][464/573]	eta 0:01:36 lr 0.001201	time 0.8774 (0.8816)	loss 0.4670 (0.5495)	grad_norm 2.6768 (2.8418)	mem 20675MB
[2025-04-03 01:02:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][466/573]	eta 0:01:34 lr 0.001201	time 0.8778 (0.8815)	loss 0.6374 (0.5498)	grad_norm 3.2012 (2.8415)	mem 20675MB
[2025-04-03 01:02:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][468/573]	eta 0:01:32 lr 0.001201	time 0.8774 (0.8815)	loss 0.6018 (0.5499)	grad_norm 3.9873 (2.8417)	mem 20675MB
[2025-04-03 01:02:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][470/573]	eta 0:01:30 lr 0.001201	time 0.8808 (0.8815)	loss 0.5867 (0.5501)	grad_norm 2.0768 (2.8424)	mem 20675MB
[2025-04-03 01:02:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][472/573]	eta 0:01:29 lr 0.001201	time 0.8771 (0.8815)	loss 0.4923 (0.5500)	grad_norm 2.5730 (2.8393)	mem 20675MB
[2025-04-03 01:02:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][474/573]	eta 0:01:27 lr 0.001200	time 0.8769 (0.8815)	loss 0.5066 (0.5500)	grad_norm 2.5956 (2.8359)	mem 20675MB
[2025-04-03 01:02:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][476/573]	eta 0:01:25 lr 0.001200	time 0.8774 (0.8815)	loss 0.4189 (0.5497)	grad_norm 3.1047 (2.8344)	mem 20675MB
[2025-04-03 01:03:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][478/573]	eta 0:01:23 lr 0.001200	time 0.8776 (0.8815)	loss 0.4893 (0.5495)	grad_norm 1.6846 (2.8309)	mem 20675MB
[2025-04-03 01:03:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][480/573]	eta 0:01:21 lr 0.001200	time 0.8770 (0.8815)	loss 0.4316 (0.5490)	grad_norm 4.3014 (2.8367)	mem 20675MB
[2025-04-03 01:03:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][482/573]	eta 0:01:20 lr 0.001200	time 0.8770 (0.8814)	loss 0.5542 (0.5492)	grad_norm 3.3112 (2.8413)	mem 20675MB
[2025-04-03 01:03:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][484/573]	eta 0:01:18 lr 0.001200	time 0.8774 (0.8814)	loss 0.5622 (0.5491)	grad_norm 5.4594 (2.8466)	mem 20675MB
[2025-04-03 01:03:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][486/573]	eta 0:01:16 lr 0.001200	time 0.8775 (0.8814)	loss 0.4859 (0.5489)	grad_norm 2.5345 (2.8505)	mem 20675MB
[2025-04-03 01:03:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][488/573]	eta 0:01:14 lr 0.001200	time 0.8770 (0.8814)	loss 0.5167 (0.5489)	grad_norm 3.8041 (2.8512)	mem 20675MB
[2025-04-03 01:03:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][490/573]	eta 0:01:13 lr 0.001200	time 0.8780 (0.8814)	loss 0.5555 (0.5489)	grad_norm 1.5487 (2.8468)	mem 20675MB
[2025-04-03 01:03:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][492/573]	eta 0:01:11 lr 0.001200	time 0.8774 (0.8814)	loss 0.5758 (0.5488)	grad_norm 1.3987 (2.8432)	mem 20675MB
[2025-04-03 01:03:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][494/573]	eta 0:01:09 lr 0.001200	time 0.8777 (0.8814)	loss 0.5503 (0.5487)	grad_norm 1.9470 (2.8403)	mem 20675MB
[2025-04-03 01:03:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][496/573]	eta 0:01:07 lr 0.001199	time 0.8770 (0.8814)	loss 0.5038 (0.5487)	grad_norm 4.5040 (2.8420)	mem 20675MB
[2025-04-03 01:03:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][498/573]	eta 0:01:06 lr 0.001199	time 0.8770 (0.8814)	loss 0.5674 (0.5487)	grad_norm 2.8979 (2.8412)	mem 20675MB
[2025-04-03 01:03:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][500/573]	eta 0:01:04 lr 0.001199	time 0.8771 (0.8813)	loss 0.6213 (0.5485)	grad_norm 3.8874 (2.8428)	mem 20675MB
[2025-04-03 01:03:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][502/573]	eta 0:01:02 lr 0.001199	time 0.8774 (0.8813)	loss 0.5647 (0.5485)	grad_norm 2.4345 (2.8427)	mem 20675MB
[2025-04-03 01:03:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][504/573]	eta 0:01:00 lr 0.001199	time 0.8772 (0.8813)	loss 0.5586 (0.5483)	grad_norm 3.4270 (2.8432)	mem 20675MB
[2025-04-03 01:03:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][506/573]	eta 0:00:59 lr 0.001199	time 0.8776 (0.8813)	loss 0.6444 (0.5481)	grad_norm 4.4197 (2.8510)	mem 20675MB
[2025-04-03 01:03:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][508/573]	eta 0:00:57 lr 0.001199	time 0.8776 (0.8813)	loss 0.5590 (0.5481)	grad_norm 2.7410 (2.8495)	mem 20675MB
[2025-04-03 01:03:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][510/573]	eta 0:00:55 lr 0.001199	time 0.8772 (0.8813)	loss 0.5888 (0.5481)	grad_norm 3.4828 (2.8524)	mem 20675MB
[2025-04-03 01:03:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][512/573]	eta 0:00:53 lr 0.001199	time 0.8772 (0.8813)	loss 0.4261 (0.5478)	grad_norm 4.0417 (2.8554)	mem 20675MB
[2025-04-03 01:03:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][514/573]	eta 0:00:51 lr 0.001199	time 0.8771 (0.8813)	loss 0.4840 (0.5476)	grad_norm 3.7875 (2.8575)	mem 20675MB
[2025-04-03 01:03:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][516/573]	eta 0:00:50 lr 0.001199	time 0.8770 (0.8813)	loss 0.5820 (0.5478)	grad_norm 1.5298 (2.8564)	mem 20675MB
[2025-04-03 01:03:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][518/573]	eta 0:00:48 lr 0.001199	time 0.8773 (0.8812)	loss 0.4673 (0.5476)	grad_norm 3.2002 (2.8593)	mem 20675MB
[2025-04-03 01:03:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][520/573]	eta 0:00:46 lr 0.001198	time 0.8774 (0.8812)	loss 0.6305 (0.5477)	grad_norm 5.1772 (2.8647)	mem 20675MB
[2025-04-03 01:03:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][522/573]	eta 0:00:44 lr 0.001198	time 0.8771 (0.8812)	loss 0.5869 (0.5478)	grad_norm 4.2677 (2.8699)	mem 20675MB
[2025-04-03 01:03:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][524/573]	eta 0:00:43 lr 0.001198	time 0.8774 (0.8812)	loss 0.4801 (0.5477)	grad_norm 4.3803 (2.8716)	mem 20675MB
[2025-04-03 01:03:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][526/573]	eta 0:00:41 lr 0.001198	time 0.8772 (0.8812)	loss 0.5855 (0.5477)	grad_norm 4.4395 (2.8741)	mem 20675MB
[2025-04-03 01:03:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][528/573]	eta 0:00:39 lr 0.001198	time 0.8774 (0.8812)	loss 0.4418 (0.5476)	grad_norm 3.5617 (2.8792)	mem 20675MB
[2025-04-03 01:03:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][530/573]	eta 0:00:37 lr 0.001198	time 0.8770 (0.8812)	loss 0.5684 (0.5475)	grad_norm 5.8881 (2.8847)	mem 20675MB
[2025-04-03 01:03:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][532/573]	eta 0:00:36 lr 0.001198	time 0.8773 (0.8812)	loss 0.6286 (0.5477)	grad_norm 5.4435 (2.8895)	mem 20675MB
[2025-04-03 01:03:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][534/573]	eta 0:00:34 lr 0.001198	time 0.8771 (0.8812)	loss 0.6002 (0.5478)	grad_norm 1.9143 (2.8909)	mem 20675MB
[2025-04-03 01:03:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][536/573]	eta 0:00:32 lr 0.001198	time 0.8773 (0.8812)	loss 0.6116 (0.5478)	grad_norm 3.5509 (2.8934)	mem 20675MB
[2025-04-03 01:03:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][538/573]	eta 0:00:30 lr 0.001198	time 0.8772 (0.8811)	loss 0.4799 (0.5478)	grad_norm 4.5494 (2.8989)	mem 20675MB
[2025-04-03 01:03:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][540/573]	eta 0:00:29 lr 0.001197	time 0.8769 (0.8811)	loss 0.4997 (0.5476)	grad_norm 2.1575 (2.8974)	mem 20675MB
[2025-04-03 01:03:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][542/573]	eta 0:00:27 lr 0.001197	time 0.8770 (0.8811)	loss 0.5539 (0.5477)	grad_norm 2.6616 (2.8948)	mem 20675MB
[2025-04-03 01:03:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][544/573]	eta 0:00:25 lr 0.001197	time 0.8776 (0.8811)	loss 0.6598 (0.5479)	grad_norm 3.3769 (2.8947)	mem 20675MB
[2025-04-03 01:04:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][546/573]	eta 0:00:23 lr 0.001197	time 0.8770 (0.8811)	loss 0.5516 (0.5479)	grad_norm 1.8191 (2.8970)	mem 20675MB
[2025-04-03 01:04:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][548/573]	eta 0:00:22 lr 0.001197	time 0.8774 (0.8811)	loss 0.6670 (0.5481)	grad_norm 3.4581 (2.8962)	mem 20675MB
[2025-04-03 01:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][550/573]	eta 0:00:20 lr 0.001197	time 0.8771 (0.8811)	loss 0.4876 (0.5481)	grad_norm 2.8822 (2.8975)	mem 20675MB
[2025-04-03 01:04:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][552/573]	eta 0:00:18 lr 0.001197	time 0.8770 (0.8811)	loss 0.6193 (0.5482)	grad_norm 2.9309 (2.8983)	mem 20675MB
[2025-04-03 01:04:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][554/573]	eta 0:00:16 lr 0.001197	time 0.8770 (0.8811)	loss 0.5957 (0.5483)	grad_norm 1.9397 (2.8948)	mem 20675MB
[2025-04-03 01:04:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][556/573]	eta 0:00:14 lr 0.001197	time 0.8771 (0.8810)	loss 0.5333 (0.5484)	grad_norm 1.8595 (2.8911)	mem 20675MB
[2025-04-03 01:04:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][558/573]	eta 0:00:13 lr 0.001197	time 0.8768 (0.8810)	loss 0.5467 (0.5484)	grad_norm 1.6304 (2.8884)	mem 20675MB
[2025-04-03 01:04:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][560/573]	eta 0:00:11 lr 0.001197	time 0.8767 (0.8810)	loss 0.6098 (0.5483)	grad_norm 2.8833 (2.8877)	mem 20675MB
[2025-04-03 01:04:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][562/573]	eta 0:00:09 lr 0.001196	time 0.8766 (0.8810)	loss 0.6042 (0.5484)	grad_norm 2.1166 (2.8838)	mem 20675MB
[2025-04-03 01:04:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][564/573]	eta 0:00:07 lr 0.001196	time 0.8768 (0.8810)	loss 0.4965 (0.5482)	grad_norm 3.3518 (2.8846)	mem 20675MB
[2025-04-03 01:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][566/573]	eta 0:00:06 lr 0.001196	time 0.8773 (0.8810)	loss 0.6246 (0.5481)	grad_norm 2.9684 (2.8844)	mem 20675MB
[2025-04-03 01:04:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][568/573]	eta 0:00:04 lr 0.001196	time 0.8768 (0.8810)	loss 0.6029 (0.5482)	grad_norm 2.7746 (2.8830)	mem 20675MB
[2025-04-03 01:04:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][570/573]	eta 0:00:02 lr 0.001196	time 0.8772 (0.8810)	loss 0.6124 (0.5481)	grad_norm 2.5945 (2.8825)	mem 20675MB
[2025-04-03 01:04:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][572/573]	eta 0:00:00 lr 0.001196	time 0.8770 (0.8810)	loss 0.5940 (0.5482)	grad_norm 2.5405 (2.8813)	mem 20675MB
[2025-04-03 01:04:23 simmim_finetune] (main_finetune.py 260): INFO EPOCH 3 training takes 0:08:24
[2025-04-03 01:04:25 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.946 (1.946)	Loss 0.5584 (0.5584)	Acc@1 66.406 (66.406)	Mem 20675MB
[2025-04-03 01:04:26 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.838)	Loss 0.5169 (0.5305)	Acc@1 71.094 (70.312)	Mem 20675MB
[2025-04-03 01:04:26 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.616)	Loss 0.5792 (0.5430)	Acc@1 71.875 (70.781)	Mem 20675MB
[2025-04-03 01:04:27 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.521)	Loss 0.5699 (0.5411)	Acc@1 70.312 (71.875)	Mem 20675MB
[2025-04-03 01:04:27 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.469)	Loss 0.6256 (0.5408)	Acc@1 66.406 (71.875)	Mem 20675MB
[2025-04-03 01:04:28 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.435)	Loss 0.4775 (0.5349)	Acc@1 80.469 (72.798)	Mem 20675MB
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.412)	Loss 0.5085 (0.5332)	Acc@1 79.688 (73.197)	Mem 20675MB
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.395)	Loss 0.5372 (0.5336)	Acc@1 80.469 (73.802)	Mem 20675MB
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 73.891
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 73.9%
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 73.89%
[2025-04-03 01:04:29 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.432701471889148e-06, 4.432701471889148e-06, 6.813726672468429e-06, 6.813726672468429e-06, 1.0476842365667323e-05, 1.0476842365667323e-05, 1.61124049705887e-05, 1.61124049705887e-05, 2.4782501285852354e-05, 2.4782501285852354e-05, 3.8121111001642593e-05, 3.8121111001642593e-05, 5.864204902593526e-05, 5.864204902593526e-05, 9.021272290946244e-05, 9.021272290946244e-05, 0.0001387829904225812, 0.0001387829904225812, 0.00021350647890430236, 0.00021350647890430236, 0.00032846569195310403, 0.00032846569195310403, 0.0005053260197204914, 0.0005053260197204914, 0.0007774188316703181, 0.0007774188316703181, 0.0011960231577469744, 0.0011960231577469744]
[2025-04-03 01:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][0/573]	eta 0:24:15 lr 0.001196	time 2.5393 (2.5393)	loss 0.4488 (0.4488)	grad_norm 5.6969 (5.6969)	mem 20675MB
[2025-04-03 01:04:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][2/573]	eta 0:13:37 lr 0.001196	time 0.8772 (1.4319)	loss 0.4074 (0.4680)	grad_norm 2.6123 (3.6711)	mem 20675MB
[2025-04-03 01:04:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][4/573]	eta 0:11:28 lr 0.001196	time 0.8791 (1.2106)	loss 0.5854 (0.5111)	grad_norm 2.6571 (3.2540)	mem 20675MB
[2025-04-03 01:04:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][6/573]	eta 0:10:32 lr 0.001196	time 0.8773 (1.1156)	loss 0.5330 (0.5224)	grad_norm 1.8623 (2.9744)	mem 20675MB
[2025-04-03 01:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][8/573]	eta 0:10:00 lr 0.001196	time 0.8771 (1.0628)	loss 0.5569 (0.5186)	grad_norm 2.9449 (2.9109)	mem 20675MB
[2025-04-03 01:04:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][10/573]	eta 0:09:39 lr 0.001196	time 0.8773 (1.0292)	loss 0.5854 (0.5145)	grad_norm 4.5876 (3.2285)	mem 20675MB
[2025-04-03 01:04:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][12/573]	eta 0:09:24 lr 0.001195	time 0.8776 (1.0060)	loss 0.4323 (0.5126)	grad_norm 2.7840 (3.1247)	mem 20675MB
[2025-04-03 01:04:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][14/573]	eta 0:09:12 lr 0.001195	time 0.8804 (0.9891)	loss 0.5595 (0.5190)	grad_norm 2.7267 (3.0491)	mem 20675MB
[2025-04-03 01:04:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][16/573]	eta 0:09:03 lr 0.001195	time 0.8772 (0.9760)	loss 0.5465 (0.5207)	grad_norm 1.7251 (2.8743)	mem 20675MB
[2025-04-03 01:04:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][18/573]	eta 0:08:55 lr 0.001195	time 0.8775 (0.9657)	loss 0.4817 (0.5194)	grad_norm 4.7362 (3.0610)	mem 20675MB
[2025-04-03 01:04:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][20/573]	eta 0:08:49 lr 0.001195	time 0.8770 (0.9574)	loss 0.5680 (0.5243)	grad_norm 3.7335 (3.0841)	mem 20675MB
[2025-04-03 01:04:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][22/573]	eta 0:08:43 lr 0.001195	time 0.8773 (0.9505)	loss 0.5568 (0.5239)	grad_norm 5.3784 (3.1810)	mem 20675MB
[2025-04-03 01:04:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][24/573]	eta 0:08:38 lr 0.001195	time 0.8771 (0.9447)	loss 0.5705 (0.5270)	grad_norm 4.2507 (3.2117)	mem 20675MB
[2025-04-03 01:04:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][26/573]	eta 0:08:34 lr 0.001195	time 0.8772 (0.9398)	loss 0.5630 (0.5287)	grad_norm 2.5299 (3.1284)	mem 20675MB
[2025-04-03 01:04:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][28/573]	eta 0:08:29 lr 0.001195	time 0.8786 (0.9355)	loss 0.4672 (0.5281)	grad_norm 2.8465 (3.0846)	mem 20675MB
[2025-04-03 01:04:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][30/573]	eta 0:08:26 lr 0.001195	time 0.8770 (0.9319)	loss 0.6178 (0.5324)	grad_norm 2.2547 (3.0356)	mem 20675MB
[2025-04-03 01:05:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][32/573]	eta 0:08:22 lr 0.001194	time 0.8773 (0.9287)	loss 0.5631 (0.5344)	grad_norm 1.6365 (2.9524)	mem 20675MB
[2025-04-03 01:05:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][34/573]	eta 0:08:18 lr 0.001194	time 0.8773 (0.9258)	loss 0.5852 (0.5364)	grad_norm 1.9510 (2.9106)	mem 20675MB
[2025-04-03 01:05:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][36/573]	eta 0:08:15 lr 0.001194	time 0.8773 (0.9232)	loss 0.5498 (0.5356)	grad_norm 2.7511 (2.9057)	mem 20675MB
[2025-04-03 01:05:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][38/573]	eta 0:08:12 lr 0.001194	time 0.8773 (0.9209)	loss 0.6158 (0.5395)	grad_norm 2.5375 (2.8880)	mem 20675MB
[2025-04-03 01:05:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][40/573]	eta 0:08:09 lr 0.001194	time 0.8770 (0.9188)	loss 0.4557 (0.5378)	grad_norm 3.1951 (2.8866)	mem 20675MB
[2025-04-03 01:05:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][42/573]	eta 0:08:06 lr 0.001194	time 0.8774 (0.9169)	loss 0.6154 (0.5414)	grad_norm 2.0188 (2.8735)	mem 20675MB
[2025-04-03 01:05:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][44/573]	eta 0:08:04 lr 0.001194	time 0.8775 (0.9152)	loss 0.4661 (0.5420)	grad_norm 3.2672 (2.8847)	mem 20675MB
[2025-04-03 01:05:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][46/573]	eta 0:08:01 lr 0.001194	time 0.8775 (0.9136)	loss 0.6075 (0.5416)	grad_norm 2.6755 (2.8941)	mem 20675MB
[2025-04-03 01:05:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][48/573]	eta 0:07:58 lr 0.001194	time 0.8771 (0.9122)	loss 0.5361 (0.5390)	grad_norm 2.0750 (2.8895)	mem 20675MB
[2025-04-03 01:05:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][50/573]	eta 0:07:56 lr 0.001194	time 0.8772 (0.9109)	loss 0.5433 (0.5393)	grad_norm 2.4403 (2.8708)	mem 20675MB
[2025-04-03 01:05:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][52/573]	eta 0:07:53 lr 0.001194	time 0.8770 (0.9096)	loss 0.5634 (0.5386)	grad_norm 2.8443 (2.8677)	mem 20675MB
[2025-04-03 01:05:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][54/573]	eta 0:07:51 lr 0.001193	time 0.8773 (0.9085)	loss 0.6350 (0.5400)	grad_norm 4.6629 (2.9279)	mem 20675MB
[2025-04-03 01:05:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][56/573]	eta 0:07:49 lr 0.001193	time 0.8769 (0.9074)	loss 0.4829 (0.5403)	grad_norm 2.9063 (2.9061)	mem 20675MB
[2025-04-03 01:05:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][58/573]	eta 0:07:46 lr 0.001193	time 0.8772 (0.9064)	loss 0.4405 (0.5413)	grad_norm 4.3595 (2.9825)	mem 20675MB
[2025-04-03 01:05:25 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][60/573]	eta 0:07:44 lr 0.001193	time 0.8770 (0.9055)	loss 0.5739 (0.5418)	grad_norm 1.9354 (2.9738)	mem 20675MB
[2025-04-03 01:05:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][62/573]	eta 0:07:42 lr 0.001193	time 0.8769 (0.9046)	loss 0.5442 (0.5417)	grad_norm 5.0513 (2.9870)	mem 20675MB
[2025-04-03 01:05:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][64/573]	eta 0:07:40 lr 0.001193	time 0.8771 (0.9038)	loss 0.5192 (0.5413)	grad_norm 3.5791 (3.0309)	mem 20675MB
[2025-04-03 01:05:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][66/573]	eta 0:07:37 lr 0.001193	time 0.8769 (0.9030)	loss 0.3898 (0.5416)	grad_norm 2.2456 (3.0290)	mem 20675MB
[2025-04-03 01:05:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][68/573]	eta 0:07:35 lr 0.001193	time 0.8771 (0.9023)	loss 0.5838 (0.5402)	grad_norm 3.4310 (3.0285)	mem 20675MB
[2025-04-03 01:05:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][70/573]	eta 0:07:33 lr 0.001193	time 0.8768 (0.9016)	loss 0.5291 (0.5406)	grad_norm 2.0880 (3.0283)	mem 20675MB
[2025-04-03 01:05:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][72/573]	eta 0:07:31 lr 0.001193	time 0.8768 (0.9009)	loss 0.5795 (0.5418)	grad_norm 2.1137 (3.0124)	mem 20675MB
[2025-04-03 01:05:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][74/573]	eta 0:07:29 lr 0.001192	time 0.8772 (0.9003)	loss 0.4876 (0.5421)	grad_norm 3.2443 (3.0024)	mem 20675MB
[2025-04-03 01:05:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][76/573]	eta 0:07:27 lr 0.001192	time 0.8773 (0.8997)	loss 0.4646 (0.5399)	grad_norm 4.4473 (3.0349)	mem 20675MB
[2025-04-03 01:05:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][78/573]	eta 0:07:25 lr 0.001192	time 0.8768 (0.8992)	loss 0.5489 (0.5406)	grad_norm 2.0131 (3.0160)	mem 20675MB
[2025-04-03 01:05:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][80/573]	eta 0:07:23 lr 0.001192	time 0.8770 (0.8986)	loss 0.5056 (0.5403)	grad_norm 2.7527 (2.9990)	mem 20675MB
[2025-04-03 01:05:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][82/573]	eta 0:07:20 lr 0.001192	time 0.8770 (0.8981)	loss 0.4217 (0.5397)	grad_norm 4.1813 (3.0148)	mem 20675MB
[2025-04-03 01:05:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][84/573]	eta 0:07:18 lr 0.001192	time 0.8775 (0.8977)	loss 0.6328 (0.5411)	grad_norm 3.2370 (3.0157)	mem 20675MB
[2025-04-03 01:05:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][86/573]	eta 0:07:16 lr 0.001192	time 0.8772 (0.8972)	loss 0.5461 (0.5408)	grad_norm 2.3736 (3.0051)	mem 20675MB
[2025-04-03 01:05:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][88/573]	eta 0:07:14 lr 0.001192	time 0.8773 (0.8968)	loss 0.6000 (0.5421)	grad_norm 1.8176 (2.9881)	mem 20675MB
[2025-04-03 01:05:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][90/573]	eta 0:07:12 lr 0.001192	time 0.8769 (0.8964)	loss 0.5177 (0.5429)	grad_norm 2.6463 (2.9756)	mem 20675MB
[2025-04-03 01:05:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][92/573]	eta 0:07:10 lr 0.001192	time 0.8773 (0.8960)	loss 0.6095 (0.5427)	grad_norm 2.2940 (2.9614)	mem 20675MB
[2025-04-03 01:05:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][94/573]	eta 0:07:08 lr 0.001192	time 0.8772 (0.8956)	loss 0.5300 (0.5433)	grad_norm 2.8898 (2.9658)	mem 20675MB
[2025-04-03 01:05:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][96/573]	eta 0:07:07 lr 0.001191	time 0.8769 (0.8952)	loss 0.5981 (0.5423)	grad_norm 3.1977 (2.9697)	mem 20675MB
[2025-04-03 01:05:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][98/573]	eta 0:07:05 lr 0.001191	time 0.8773 (0.8949)	loss 0.5510 (0.5416)	grad_norm 5.0610 (2.9801)	mem 20675MB
[2025-04-03 01:06:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][100/573]	eta 0:07:03 lr 0.001191	time 0.8770 (0.8946)	loss 0.6196 (0.5423)	grad_norm 3.8825 (3.0139)	mem 20675MB
[2025-04-03 01:06:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][102/573]	eta 0:07:01 lr 0.001191	time 0.8769 (0.8942)	loss 0.4873 (0.5415)	grad_norm 3.1140 (3.0077)	mem 20675MB
[2025-04-03 01:06:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][104/573]	eta 0:06:59 lr 0.001191	time 0.8773 (0.8939)	loss 0.4246 (0.5408)	grad_norm 3.3881 (3.0128)	mem 20675MB
[2025-04-03 01:06:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][106/573]	eta 0:06:57 lr 0.001191	time 0.8787 (0.8936)	loss 0.6577 (0.5412)	grad_norm 2.3644 (3.0018)	mem 20675MB
[2025-04-03 01:06:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][108/573]	eta 0:06:55 lr 0.001191	time 0.8771 (0.8934)	loss 0.5902 (0.5412)	grad_norm 2.2813 (2.9892)	mem 20675MB
[2025-04-03 01:06:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][110/573]	eta 0:06:53 lr 0.001191	time 0.8771 (0.8931)	loss 0.5589 (0.5422)	grad_norm 3.7141 (2.9879)	mem 20675MB
[2025-04-03 01:06:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][112/573]	eta 0:06:51 lr 0.001191	time 0.8771 (0.8928)	loss 0.5521 (0.5430)	grad_norm 1.6096 (2.9657)	mem 20675MB
[2025-04-03 01:06:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][114/573]	eta 0:06:49 lr 0.001191	time 0.8774 (0.8926)	loss 0.6413 (0.5446)	grad_norm 1.5796 (2.9451)	mem 20675MB
[2025-04-03 01:06:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][116/573]	eta 0:06:47 lr 0.001190	time 0.8772 (0.8923)	loss 0.4971 (0.5441)	grad_norm 2.1863 (2.9311)	mem 20675MB
[2025-04-03 01:06:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][118/573]	eta 0:06:45 lr 0.001190	time 0.8773 (0.8921)	loss 0.4116 (0.5429)	grad_norm 2.4916 (2.9241)	mem 20675MB
[2025-04-03 01:06:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][120/573]	eta 0:06:44 lr 0.001190	time 0.8773 (0.8918)	loss 0.6505 (0.5435)	grad_norm 2.4072 (2.9142)	mem 20675MB
[2025-04-03 01:06:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][122/573]	eta 0:06:42 lr 0.001190	time 0.8775 (0.8916)	loss 0.5805 (0.5429)	grad_norm 2.9417 (2.9156)	mem 20675MB
[2025-04-03 01:06:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][124/573]	eta 0:06:40 lr 0.001190	time 0.8772 (0.8914)	loss 0.4410 (0.5408)	grad_norm 4.4767 (2.9315)	mem 20675MB
[2025-04-03 01:06:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][126/573]	eta 0:06:38 lr 0.001190	time 0.8771 (0.8912)	loss 0.4968 (0.5401)	grad_norm 2.1443 (2.9264)	mem 20675MB
[2025-04-03 01:06:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][128/573]	eta 0:06:36 lr 0.001190	time 0.8770 (0.8910)	loss 0.5069 (0.5402)	grad_norm 5.9725 (2.9520)	mem 20675MB
[2025-04-03 01:06:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][130/573]	eta 0:06:34 lr 0.001190	time 0.8769 (0.8908)	loss 0.5659 (0.5396)	grad_norm 2.6571 (2.9589)	mem 20675MB
[2025-04-03 01:06:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][132/573]	eta 0:06:32 lr 0.001190	time 0.8772 (0.8906)	loss 0.5063 (0.5403)	grad_norm 2.5225 (2.9593)	mem 20675MB
[2025-04-03 01:06:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][134/573]	eta 0:06:30 lr 0.001190	time 0.8774 (0.8904)	loss 0.5931 (0.5399)	grad_norm 3.9499 (2.9703)	mem 20675MB
[2025-04-03 01:06:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][136/573]	eta 0:06:29 lr 0.001189	time 0.8773 (0.8902)	loss 0.5295 (0.5401)	grad_norm 2.2315 (2.9590)	mem 20675MB
[2025-04-03 01:06:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][138/573]	eta 0:06:27 lr 0.001189	time 0.8770 (0.8901)	loss 0.6127 (0.5405)	grad_norm 2.4483 (2.9672)	mem 20675MB
[2025-04-03 01:06:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][140/573]	eta 0:06:25 lr 0.001189	time 0.8769 (0.8899)	loss 0.5838 (0.5406)	grad_norm 1.6495 (2.9569)	mem 20675MB
[2025-04-03 01:06:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][142/573]	eta 0:06:23 lr 0.001189	time 0.8772 (0.8897)	loss 0.5056 (0.5407)	grad_norm 2.8660 (2.9548)	mem 20675MB
[2025-04-03 01:06:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][144/573]	eta 0:06:21 lr 0.001189	time 0.8771 (0.8895)	loss 0.4386 (0.5404)	grad_norm 3.1372 (2.9484)	mem 20675MB
[2025-04-03 01:06:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][146/573]	eta 0:06:19 lr 0.001189	time 0.8772 (0.8894)	loss 0.6255 (0.5415)	grad_norm 2.5416 (2.9416)	mem 20675MB
[2025-04-03 01:06:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][148/573]	eta 0:06:17 lr 0.001189	time 0.8772 (0.8892)	loss 0.5315 (0.5417)	grad_norm 2.4179 (2.9352)	mem 20675MB
[2025-04-03 01:06:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][150/573]	eta 0:06:16 lr 0.001189	time 0.8771 (0.8891)	loss 0.5063 (0.5416)	grad_norm 2.2723 (2.9274)	mem 20675MB
[2025-04-03 01:06:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][152/573]	eta 0:06:14 lr 0.001189	time 0.8772 (0.8889)	loss 0.6034 (0.5423)	grad_norm 2.4404 (2.9196)	mem 20675MB
[2025-04-03 01:06:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][154/573]	eta 0:06:12 lr 0.001189	time 0.8775 (0.8888)	loss 0.4490 (0.5413)	grad_norm 3.9174 (2.9214)	mem 20675MB
[2025-04-03 01:06:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][156/573]	eta 0:06:10 lr 0.001188	time 0.8772 (0.8887)	loss 0.6424 (0.5422)	grad_norm 4.2185 (2.9242)	mem 20675MB
[2025-04-03 01:06:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][158/573]	eta 0:06:08 lr 0.001188	time 0.8771 (0.8885)	loss 0.6372 (0.5433)	grad_norm 3.2378 (2.9227)	mem 20675MB
[2025-04-03 01:06:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][160/573]	eta 0:06:06 lr 0.001188	time 0.8775 (0.8884)	loss 0.5593 (0.5439)	grad_norm 2.3838 (2.9171)	mem 20675MB
[2025-04-03 01:06:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][162/573]	eta 0:06:05 lr 0.001188	time 0.8771 (0.8883)	loss 0.5894 (0.5448)	grad_norm 1.7834 (2.9024)	mem 20675MB
[2025-04-03 01:06:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][164/573]	eta 0:06:03 lr 0.001188	time 0.8771 (0.8882)	loss 0.5078 (0.5450)	grad_norm 3.3753 (2.8971)	mem 20675MB
[2025-04-03 01:06:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][166/573]	eta 0:06:01 lr 0.001188	time 0.8770 (0.8880)	loss 0.4902 (0.5447)	grad_norm 3.6394 (2.9032)	mem 20675MB
[2025-04-03 01:06:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][168/573]	eta 0:05:59 lr 0.001188	time 0.8771 (0.8879)	loss 0.6182 (0.5450)	grad_norm 1.7226 (2.8904)	mem 20675MB
[2025-04-03 01:07:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][170/573]	eta 0:05:57 lr 0.001188	time 0.8772 (0.8878)	loss 0.4843 (0.5440)	grad_norm 3.0167 (2.8940)	mem 20675MB
[2025-04-03 01:07:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][172/573]	eta 0:05:55 lr 0.001188	time 0.8771 (0.8877)	loss 0.5682 (0.5437)	grad_norm 3.0995 (2.9069)	mem 20675MB
[2025-04-03 01:07:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][174/573]	eta 0:05:54 lr 0.001188	time 0.8770 (0.8876)	loss 0.5684 (0.5432)	grad_norm 3.7728 (2.9155)	mem 20675MB
[2025-04-03 01:07:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][176/573]	eta 0:05:52 lr 0.001188	time 0.8772 (0.8875)	loss 0.6129 (0.5437)	grad_norm 2.9993 (2.9101)	mem 20675MB
[2025-04-03 01:07:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][178/573]	eta 0:05:50 lr 0.001187	time 0.8770 (0.8873)	loss 0.5820 (0.5446)	grad_norm 2.9821 (2.9096)	mem 20675MB
[2025-04-03 01:07:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][180/573]	eta 0:05:48 lr 0.001187	time 0.8770 (0.8872)	loss 0.4954 (0.5441)	grad_norm 2.2231 (2.9115)	mem 20675MB
[2025-04-03 01:07:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][182/573]	eta 0:05:46 lr 0.001187	time 0.8771 (0.8871)	loss 0.5658 (0.5440)	grad_norm 2.4288 (2.9081)	mem 20675MB
[2025-04-03 01:07:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][184/573]	eta 0:05:45 lr 0.001187	time 0.8773 (0.8870)	loss 0.5560 (0.5438)	grad_norm 2.9232 (2.9043)	mem 20675MB
[2025-04-03 01:07:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][186/573]	eta 0:05:43 lr 0.001187	time 0.8771 (0.8869)	loss 0.6175 (0.5441)	grad_norm 2.9417 (2.9031)	mem 20675MB
[2025-04-03 01:07:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][188/573]	eta 0:05:41 lr 0.001187	time 0.8773 (0.8869)	loss 0.6056 (0.5440)	grad_norm 2.4369 (2.8982)	mem 20675MB
[2025-04-03 01:07:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][190/573]	eta 0:05:39 lr 0.001187	time 0.8771 (0.8868)	loss 0.6146 (0.5440)	grad_norm 3.0713 (2.9008)	mem 20675MB
[2025-04-03 01:07:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][192/573]	eta 0:05:37 lr 0.001187	time 0.8775 (0.8867)	loss 0.5690 (0.5441)	grad_norm 2.3480 (2.8954)	mem 20675MB
[2025-04-03 01:07:22 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][194/573]	eta 0:05:36 lr 0.001187	time 0.8770 (0.8866)	loss 0.4882 (0.5443)	grad_norm 3.8393 (2.8987)	mem 20675MB
[2025-04-03 01:07:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][196/573]	eta 0:05:34 lr 0.001187	time 0.8773 (0.8865)	loss 0.5478 (0.5443)	grad_norm 2.5825 (2.8942)	mem 20675MB
[2025-04-03 01:07:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][198/573]	eta 0:05:32 lr 0.001186	time 0.8773 (0.8864)	loss 0.5822 (0.5449)	grad_norm 2.4112 (2.8924)	mem 20675MB
[2025-04-03 01:07:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][200/573]	eta 0:05:30 lr 0.001186	time 0.8770 (0.8863)	loss 0.4785 (0.5444)	grad_norm 3.7255 (2.8922)	mem 20675MB
[2025-04-03 01:07:29 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][202/573]	eta 0:05:28 lr 0.001186	time 0.8772 (0.8862)	loss 0.5589 (0.5441)	grad_norm 3.6380 (2.8975)	mem 20675MB
[2025-04-03 01:07:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][204/573]	eta 0:05:26 lr 0.001186	time 0.8770 (0.8862)	loss 0.4823 (0.5435)	grad_norm 3.8553 (2.9046)	mem 20675MB
[2025-04-03 01:07:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][206/573]	eta 0:05:25 lr 0.001186	time 0.8781 (0.8861)	loss 0.5788 (0.5435)	grad_norm 2.3846 (2.9057)	mem 20675MB
[2025-04-03 01:07:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][208/573]	eta 0:05:23 lr 0.001186	time 0.8772 (0.8860)	loss 0.5926 (0.5439)	grad_norm 2.5705 (2.9048)	mem 20675MB
[2025-04-03 01:07:36 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][210/573]	eta 0:05:21 lr 0.001186	time 0.8776 (0.8859)	loss 0.5251 (0.5432)	grad_norm 2.9452 (2.9137)	mem 20675MB
[2025-04-03 01:07:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][212/573]	eta 0:05:19 lr 0.001186	time 0.8773 (0.8859)	loss 0.6499 (0.5435)	grad_norm 2.9609 (2.9127)	mem 20675MB
[2025-04-03 01:07:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][214/573]	eta 0:05:18 lr 0.001186	time 0.8772 (0.8858)	loss 0.4476 (0.5427)	grad_norm 3.2385 (2.9165)	mem 20675MB
[2025-04-03 01:07:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][216/573]	eta 0:05:16 lr 0.001186	time 0.8770 (0.8857)	loss 0.6010 (0.5429)	grad_norm 2.4749 (2.9104)	mem 20675MB
[2025-04-03 01:07:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][218/573]	eta 0:05:14 lr 0.001185	time 0.8774 (0.8857)	loss 0.6704 (0.5436)	grad_norm 2.7282 (2.9079)	mem 20675MB
[2025-04-03 01:07:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][220/573]	eta 0:05:12 lr 0.001185	time 0.8770 (0.8856)	loss 0.4470 (0.5434)	grad_norm 3.4262 (2.9081)	mem 20675MB
[2025-04-03 01:07:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][222/573]	eta 0:05:10 lr 0.001185	time 0.8771 (0.8855)	loss 0.5727 (0.5436)	grad_norm 1.6736 (2.8975)	mem 20675MB
[2025-04-03 01:07:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][224/573]	eta 0:05:09 lr 0.001185	time 0.8773 (0.8855)	loss 0.5952 (0.5438)	grad_norm 1.8887 (2.8894)	mem 20675MB
[2025-04-03 01:07:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][226/573]	eta 0:05:07 lr 0.001185	time 0.8771 (0.8854)	loss 0.5537 (0.5440)	grad_norm 2.1878 (2.8821)	mem 20675MB
[2025-04-03 01:07:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][228/573]	eta 0:05:05 lr 0.001185	time 0.8772 (0.8853)	loss 0.5459 (0.5440)	grad_norm 2.4199 (2.8753)	mem 20675MB
[2025-04-03 01:07:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][230/573]	eta 0:05:03 lr 0.001185	time 0.8774 (0.8853)	loss 0.4112 (0.5435)	grad_norm 2.9028 (2.8732)	mem 20675MB
[2025-04-03 01:07:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][232/573]	eta 0:05:01 lr 0.001185	time 0.8771 (0.8852)	loss 0.5587 (0.5441)	grad_norm 2.4712 (2.8747)	mem 20675MB
[2025-04-03 01:07:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][234/573]	eta 0:05:00 lr 0.001185	time 0.8771 (0.8851)	loss 0.5988 (0.5448)	grad_norm 4.1292 (2.8802)	mem 20675MB
[2025-04-03 01:07:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][236/573]	eta 0:04:58 lr 0.001184	time 0.8773 (0.8851)	loss 0.6068 (0.5452)	grad_norm 3.4678 (2.8861)	mem 20675MB
[2025-04-03 01:08:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][238/573]	eta 0:04:56 lr 0.001184	time 0.8772 (0.8850)	loss 0.5935 (0.5453)	grad_norm 1.7551 (2.8852)	mem 20675MB
[2025-04-03 01:08:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][240/573]	eta 0:04:54 lr 0.001184	time 0.8772 (0.8850)	loss 0.5338 (0.5452)	grad_norm 3.2577 (2.8859)	mem 20675MB
[2025-04-03 01:08:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][242/573]	eta 0:04:52 lr 0.001184	time 0.8772 (0.8849)	loss 0.6024 (0.5456)	grad_norm 1.5419 (2.8794)	mem 20675MB
[2025-04-03 01:08:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][244/573]	eta 0:04:51 lr 0.001184	time 0.8771 (0.8849)	loss 0.5818 (0.5456)	grad_norm 2.8694 (2.8800)	mem 20675MB
[2025-04-03 01:08:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][246/573]	eta 0:04:49 lr 0.001184	time 0.8775 (0.8848)	loss 0.4637 (0.5454)	grad_norm 3.2126 (2.8774)	mem 20675MB
[2025-04-03 01:08:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][248/573]	eta 0:04:47 lr 0.001184	time 0.8776 (0.8848)	loss 0.5177 (0.5456)	grad_norm 3.6005 (2.8815)	mem 20675MB
[2025-04-03 01:08:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][250/573]	eta 0:04:45 lr 0.001184	time 0.8775 (0.8847)	loss 0.6215 (0.5458)	grad_norm 2.3900 (2.8791)	mem 20675MB
[2025-04-03 01:08:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][252/573]	eta 0:04:43 lr 0.001184	time 0.8771 (0.8846)	loss 0.5394 (0.5456)	grad_norm 2.7934 (2.8825)	mem 20675MB
[2025-04-03 01:08:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][254/573]	eta 0:04:42 lr 0.001184	time 0.8773 (0.8846)	loss 0.5869 (0.5458)	grad_norm 1.9089 (2.8808)	mem 20675MB
[2025-04-03 01:08:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][256/573]	eta 0:04:40 lr 0.001183	time 0.8773 (0.8845)	loss 0.5765 (0.5459)	grad_norm 1.8868 (2.8757)	mem 20675MB
[2025-04-03 01:08:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][258/573]	eta 0:04:38 lr 0.001183	time 0.8772 (0.8845)	loss 0.5772 (0.5460)	grad_norm 2.5850 (2.8760)	mem 20675MB
[2025-04-03 01:08:20 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][260/573]	eta 0:04:36 lr 0.001183	time 0.8772 (0.8845)	loss 0.5698 (0.5462)	grad_norm 1.7110 (2.8681)	mem 20675MB
[2025-04-03 01:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][262/573]	eta 0:04:35 lr 0.001183	time 0.8772 (0.8844)	loss 0.6256 (0.5467)	grad_norm 2.6922 (2.8646)	mem 20675MB
[2025-04-03 01:08:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][264/573]	eta 0:04:33 lr 0.001183	time 0.8772 (0.8844)	loss 0.5381 (0.5467)	grad_norm 1.8612 (2.8597)	mem 20675MB
[2025-04-03 01:08:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][266/573]	eta 0:04:31 lr 0.001183	time 0.8774 (0.8843)	loss 0.5916 (0.5470)	grad_norm 1.9683 (2.8527)	mem 20675MB
[2025-04-03 01:08:27 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][268/573]	eta 0:04:29 lr 0.001183	time 0.8774 (0.8843)	loss 0.5939 (0.5473)	grad_norm 3.0497 (2.8490)	mem 20675MB
[2025-04-03 01:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][270/573]	eta 0:04:27 lr 0.001183	time 0.8772 (0.8842)	loss 0.6192 (0.5476)	grad_norm 3.5518 (2.8538)	mem 20675MB
[2025-04-03 01:08:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][272/573]	eta 0:04:26 lr 0.001183	time 0.8775 (0.8842)	loss 0.6596 (0.5478)	grad_norm 3.2040 (2.8528)	mem 20675MB
[2025-04-03 01:08:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][274/573]	eta 0:04:24 lr 0.001183	time 0.8770 (0.8841)	loss 0.5720 (0.5481)	grad_norm 2.4804 (2.8531)	mem 20675MB
[2025-04-03 01:08:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][276/573]	eta 0:04:22 lr 0.001182	time 0.8770 (0.8841)	loss 0.5637 (0.5485)	grad_norm 1.7365 (2.8467)	mem 20675MB
[2025-04-03 01:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][278/573]	eta 0:04:20 lr 0.001182	time 0.8775 (0.8840)	loss 0.5440 (0.5486)	grad_norm 1.7322 (2.8383)	mem 20675MB
[2025-04-03 01:08:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][280/573]	eta 0:04:19 lr 0.001182	time 0.8774 (0.8840)	loss 0.5437 (0.5483)	grad_norm 1.7873 (2.8330)	mem 20675MB
[2025-04-03 01:08:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][282/573]	eta 0:04:17 lr 0.001182	time 0.8771 (0.8840)	loss 0.5265 (0.5484)	grad_norm 2.7982 (2.8316)	mem 20675MB
[2025-04-03 01:08:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][284/573]	eta 0:04:15 lr 0.001182	time 0.8771 (0.8839)	loss 0.4063 (0.5481)	grad_norm 3.1347 (2.8308)	mem 20675MB
[2025-04-03 01:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][286/573]	eta 0:04:13 lr 0.001182	time 0.8773 (0.8839)	loss 0.6716 (0.5485)	grad_norm 4.4456 (2.8348)	mem 20675MB
[2025-04-03 01:08:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][288/573]	eta 0:04:11 lr 0.001182	time 0.8771 (0.8838)	loss 0.6129 (0.5489)	grad_norm 2.3662 (2.8333)	mem 20675MB
[2025-04-03 01:08:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][290/573]	eta 0:04:10 lr 0.001182	time 0.8770 (0.8838)	loss 0.5077 (0.5487)	grad_norm 2.1393 (2.8323)	mem 20675MB
[2025-04-03 01:08:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][292/573]	eta 0:04:08 lr 0.001182	time 0.8773 (0.8838)	loss 0.4712 (0.5481)	grad_norm 2.4719 (2.8309)	mem 20675MB
[2025-04-03 01:08:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][294/573]	eta 0:04:06 lr 0.001182	time 0.8770 (0.8837)	loss 0.5725 (0.5483)	grad_norm 3.0260 (2.8286)	mem 20675MB
[2025-04-03 01:08:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][296/573]	eta 0:04:04 lr 0.001181	time 0.8774 (0.8837)	loss 0.4671 (0.5480)	grad_norm 2.8523 (2.8247)	mem 20675MB
[2025-04-03 01:08:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][298/573]	eta 0:04:03 lr 0.001181	time 0.8774 (0.8836)	loss 0.6177 (0.5479)	grad_norm 2.6079 (2.8223)	mem 20675MB
[2025-04-03 01:08:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][300/573]	eta 0:04:01 lr 0.001181	time 0.8773 (0.8836)	loss 0.5893 (0.5477)	grad_norm 2.5562 (2.8198)	mem 20675MB
[2025-04-03 01:08:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][302/573]	eta 0:03:59 lr 0.001181	time 0.8773 (0.8836)	loss 0.5373 (0.5474)	grad_norm 3.3766 (2.8212)	mem 20675MB
[2025-04-03 01:08:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][304/573]	eta 0:03:57 lr 0.001181	time 0.8770 (0.8835)	loss 0.5063 (0.5473)	grad_norm 2.4018 (2.8188)	mem 20675MB
[2025-04-03 01:09:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][306/573]	eta 0:03:55 lr 0.001181	time 0.8773 (0.8835)	loss 0.6609 (0.5477)	grad_norm 4.0562 (2.8207)	mem 20675MB
[2025-04-03 01:09:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][308/573]	eta 0:03:54 lr 0.001181	time 0.8772 (0.8835)	loss 0.5845 (0.5481)	grad_norm 3.1073 (2.8265)	mem 20675MB
[2025-04-03 01:09:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][310/573]	eta 0:03:52 lr 0.001181	time 0.8776 (0.8834)	loss 0.5680 (0.5483)	grad_norm 2.5760 (2.8241)	mem 20675MB
[2025-04-03 01:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][312/573]	eta 0:03:50 lr 0.001181	time 0.8775 (0.8834)	loss 0.5403 (0.5479)	grad_norm 2.6627 (2.8253)	mem 20675MB
[2025-04-03 01:09:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][314/573]	eta 0:03:48 lr 0.001180	time 0.8770 (0.8834)	loss 0.5713 (0.5478)	grad_norm 1.8043 (2.8217)	mem 20675MB
[2025-04-03 01:09:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][316/573]	eta 0:03:47 lr 0.001180	time 0.8774 (0.8833)	loss 0.5172 (0.5474)	grad_norm 2.4369 (2.8204)	mem 20675MB
[2025-04-03 01:09:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][318/573]	eta 0:03:45 lr 0.001180	time 0.8774 (0.8833)	loss 0.6004 (0.5473)	grad_norm 2.8874 (2.8188)	mem 20675MB
[2025-04-03 01:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][320/573]	eta 0:03:43 lr 0.001180	time 0.8770 (0.8833)	loss 0.4718 (0.5472)	grad_norm 4.5430 (2.8220)	mem 20675MB
[2025-04-03 01:09:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][322/573]	eta 0:03:41 lr 0.001180	time 0.8772 (0.8832)	loss 0.6522 (0.5477)	grad_norm 3.1233 (2.8260)	mem 20675MB
[2025-04-03 01:09:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][324/573]	eta 0:03:39 lr 0.001180	time 0.8769 (0.8832)	loss 0.5901 (0.5476)	grad_norm 2.2675 (2.8238)	mem 20675MB
[2025-04-03 01:09:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][326/573]	eta 0:03:38 lr 0.001180	time 0.8770 (0.8832)	loss 0.5508 (0.5474)	grad_norm 2.2436 (2.8205)	mem 20675MB
[2025-04-03 01:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][328/573]	eta 0:03:36 lr 0.001180	time 0.8772 (0.8831)	loss 0.5217 (0.5474)	grad_norm 2.8402 (2.8160)	mem 20675MB
[2025-04-03 01:09:22 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][330/573]	eta 0:03:34 lr 0.001180	time 0.8771 (0.8831)	loss 0.5874 (0.5476)	grad_norm 2.4597 (2.8137)	mem 20675MB
[2025-04-03 01:09:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][332/573]	eta 0:03:32 lr 0.001180	time 0.8771 (0.8831)	loss 0.5459 (0.5472)	grad_norm 1.7715 (2.8129)	mem 20675MB
[2025-04-03 01:09:25 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][334/573]	eta 0:03:31 lr 0.001179	time 0.8774 (0.8830)	loss 0.5441 (0.5472)	grad_norm 2.1349 (2.8106)	mem 20675MB
[2025-04-03 01:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][336/573]	eta 0:03:29 lr 0.001179	time 0.8773 (0.8830)	loss 0.5096 (0.5471)	grad_norm 2.7865 (2.8103)	mem 20675MB
[2025-04-03 01:09:29 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][338/573]	eta 0:03:27 lr 0.001179	time 0.8772 (0.8830)	loss 0.5237 (0.5469)	grad_norm 3.0549 (2.8108)	mem 20675MB
[2025-04-03 01:09:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][340/573]	eta 0:03:25 lr 0.001179	time 0.8776 (0.8830)	loss 0.5160 (0.5468)	grad_norm 2.2091 (2.8108)	mem 20675MB
[2025-04-03 01:09:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][342/573]	eta 0:03:23 lr 0.001179	time 0.8770 (0.8829)	loss 0.4490 (0.5465)	grad_norm 2.9363 (2.8120)	mem 20675MB
[2025-04-03 01:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][344/573]	eta 0:03:22 lr 0.001179	time 0.8775 (0.8829)	loss 0.4886 (0.5468)	grad_norm 2.6732 (2.8154)	mem 20675MB
[2025-04-03 01:09:36 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][346/573]	eta 0:03:20 lr 0.001179	time 0.8773 (0.8829)	loss 0.5022 (0.5469)	grad_norm 3.0393 (2.8152)	mem 20675MB
[2025-04-03 01:09:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][348/573]	eta 0:03:18 lr 0.001179	time 0.8776 (0.8828)	loss 0.5770 (0.5470)	grad_norm 1.4831 (2.8089)	mem 20675MB
[2025-04-03 01:09:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][350/573]	eta 0:03:16 lr 0.001179	time 0.8771 (0.8828)	loss 0.5520 (0.5472)	grad_norm 1.8767 (2.8074)	mem 20675MB
[2025-04-03 01:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][352/573]	eta 0:03:15 lr 0.001178	time 0.8774 (0.8828)	loss 0.6215 (0.5478)	grad_norm 2.5477 (2.8073)	mem 20675MB
[2025-04-03 01:09:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][354/573]	eta 0:03:13 lr 0.001178	time 0.8775 (0.8828)	loss 0.6117 (0.5477)	grad_norm 1.3964 (2.8038)	mem 20675MB
[2025-04-03 01:09:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][356/573]	eta 0:03:11 lr 0.001178	time 0.8773 (0.8827)	loss 0.6523 (0.5479)	grad_norm 2.6087 (2.8007)	mem 20675MB
[2025-04-03 01:09:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][358/573]	eta 0:03:09 lr 0.001178	time 0.8770 (0.8827)	loss 0.3801 (0.5471)	grad_norm 2.7402 (2.7997)	mem 20675MB
[2025-04-03 01:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][360/573]	eta 0:03:08 lr 0.001178	time 0.8775 (0.8827)	loss 0.3818 (0.5463)	grad_norm 3.5270 (2.8032)	mem 20675MB
[2025-04-03 01:09:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][362/573]	eta 0:03:06 lr 0.001178	time 0.8771 (0.8827)	loss 0.5672 (0.5464)	grad_norm 2.8770 (2.8011)	mem 20675MB
[2025-04-03 01:09:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][364/573]	eta 0:03:04 lr 0.001178	time 0.8772 (0.8826)	loss 0.5616 (0.5466)	grad_norm 2.4655 (2.8009)	mem 20675MB
[2025-04-03 01:09:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][366/573]	eta 0:03:02 lr 0.001178	time 0.8772 (0.8826)	loss 0.5718 (0.5469)	grad_norm 3.7408 (2.8042)	mem 20675MB
[2025-04-03 01:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][368/573]	eta 0:03:00 lr 0.001178	time 0.8771 (0.8826)	loss 0.5184 (0.5468)	grad_norm 2.9161 (2.8027)	mem 20675MB
[2025-04-03 01:09:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][370/573]	eta 0:02:59 lr 0.001177	time 0.8774 (0.8826)	loss 0.5640 (0.5467)	grad_norm 2.0661 (2.7998)	mem 20675MB
[2025-04-03 01:09:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][372/573]	eta 0:02:57 lr 0.001177	time 0.8772 (0.8825)	loss 0.5912 (0.5469)	grad_norm 1.7467 (2.7946)	mem 20675MB
[2025-04-03 01:10:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][374/573]	eta 0:02:55 lr 0.001177	time 0.8774 (0.8825)	loss 0.5256 (0.5469)	grad_norm 2.7914 (2.7939)	mem 20675MB
[2025-04-03 01:10:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][376/573]	eta 0:02:53 lr 0.001177	time 0.8771 (0.8825)	loss 0.5684 (0.5469)	grad_norm 2.7611 (2.7934)	mem 20675MB
[2025-04-03 01:10:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][378/573]	eta 0:02:52 lr 0.001177	time 0.8771 (0.8825)	loss 0.5619 (0.5471)	grad_norm 1.5323 (2.7906)	mem 20675MB
[2025-04-03 01:10:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][380/573]	eta 0:02:50 lr 0.001177	time 0.8772 (0.8824)	loss 0.5318 (0.5467)	grad_norm 2.4300 (2.7913)	mem 20675MB
[2025-04-03 01:10:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][382/573]	eta 0:02:48 lr 0.001177	time 0.8773 (0.8824)	loss 0.4977 (0.5466)	grad_norm 3.5306 (2.7932)	mem 20675MB
[2025-04-03 01:10:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][384/573]	eta 0:02:46 lr 0.001177	time 0.8770 (0.8824)	loss 0.5884 (0.5468)	grad_norm 2.7276 (2.7928)	mem 20675MB
[2025-04-03 01:10:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][386/573]	eta 0:02:45 lr 0.001177	time 0.8775 (0.8824)	loss 0.6502 (0.5471)	grad_norm 3.2981 (2.7933)	mem 20675MB
[2025-04-03 01:10:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][388/573]	eta 0:02:43 lr 0.001177	time 0.8774 (0.8824)	loss 0.5573 (0.5473)	grad_norm 2.2983 (2.7904)	mem 20675MB
[2025-04-03 01:10:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][390/573]	eta 0:02:41 lr 0.001176	time 0.8771 (0.8823)	loss 0.5753 (0.5469)	grad_norm 3.6860 (2.7929)	mem 20675MB
[2025-04-03 01:10:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][392/573]	eta 0:02:39 lr 0.001176	time 0.8770 (0.8823)	loss 0.5414 (0.5468)	grad_norm 2.7722 (2.7930)	mem 20675MB
[2025-04-03 01:10:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][394/573]	eta 0:02:37 lr 0.001176	time 0.8773 (0.8823)	loss 0.6539 (0.5471)	grad_norm 2.1176 (2.7898)	mem 20675MB
[2025-04-03 01:10:20 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][396/573]	eta 0:02:36 lr 0.001176	time 0.8770 (0.8823)	loss 0.5960 (0.5473)	grad_norm 1.9439 (2.7861)	mem 20675MB
[2025-04-03 01:10:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][398/573]	eta 0:02:34 lr 0.001176	time 0.8771 (0.8823)	loss 0.5858 (0.5475)	grad_norm 2.0799 (2.7817)	mem 20675MB
[2025-04-03 01:10:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][400/573]	eta 0:02:32 lr 0.001176	time 0.8773 (0.8822)	loss 0.4409 (0.5471)	grad_norm 2.3411 (2.7811)	mem 20675MB
[2025-04-03 01:10:25 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][402/573]	eta 0:02:30 lr 0.001176	time 0.8776 (0.8822)	loss 0.5902 (0.5470)	grad_norm 2.6778 (2.7803)	mem 20675MB
[2025-04-03 01:10:27 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][404/573]	eta 0:02:29 lr 0.001176	time 0.8774 (0.8822)	loss 0.6073 (0.5471)	grad_norm 5.6079 (2.7920)	mem 20675MB
[2025-04-03 01:10:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][406/573]	eta 0:02:27 lr 0.001176	time 0.8783 (0.8822)	loss 0.5754 (0.5473)	grad_norm 2.0154 (2.7901)	mem 20675MB
[2025-04-03 01:10:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][408/573]	eta 0:02:25 lr 0.001175	time 0.8771 (0.8822)	loss 0.6061 (0.5475)	grad_norm 5.1511 (2.7955)	mem 20675MB
[2025-04-03 01:10:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][410/573]	eta 0:02:23 lr 0.001175	time 0.8771 (0.8821)	loss 0.5520 (0.5475)	grad_norm 3.8862 (2.7980)	mem 20675MB
[2025-04-03 01:10:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][412/573]	eta 0:02:22 lr 0.001175	time 0.8772 (0.8821)	loss 0.5920 (0.5478)	grad_norm 1.4271 (2.7921)	mem 20675MB
[2025-04-03 01:10:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][414/573]	eta 0:02:20 lr 0.001175	time 0.8773 (0.8821)	loss 0.5885 (0.5480)	grad_norm 1.9447 (2.7879)	mem 20675MB
[2025-04-03 01:10:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][416/573]	eta 0:02:18 lr 0.001175	time 0.8770 (0.8821)	loss 0.5533 (0.5481)	grad_norm 2.1073 (2.7867)	mem 20675MB
[2025-04-03 01:10:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][418/573]	eta 0:02:16 lr 0.001175	time 0.8771 (0.8821)	loss 0.5085 (0.5480)	grad_norm 2.3179 (2.7844)	mem 20675MB
[2025-04-03 01:10:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][420/573]	eta 0:02:14 lr 0.001175	time 0.8772 (0.8820)	loss 0.5397 (0.5480)	grad_norm 2.2946 (2.7803)	mem 20675MB
[2025-04-03 01:10:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][422/573]	eta 0:02:13 lr 0.001175	time 0.8771 (0.8820)	loss 0.5928 (0.5480)	grad_norm 2.5885 (2.7791)	mem 20675MB
[2025-04-03 01:10:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][424/573]	eta 0:02:11 lr 0.001175	time 0.8774 (0.8820)	loss 0.4336 (0.5480)	grad_norm 4.1366 (2.7854)	mem 20675MB
[2025-04-03 01:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][426/573]	eta 0:02:09 lr 0.001174	time 0.8771 (0.8820)	loss 0.5775 (0.5482)	grad_norm 1.5469 (2.7815)	mem 20675MB
[2025-04-03 01:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][428/573]	eta 0:02:07 lr 0.001174	time 0.8771 (0.8820)	loss 0.5914 (0.5481)	grad_norm 2.5349 (2.7814)	mem 20675MB
[2025-04-03 01:10:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][430/573]	eta 0:02:06 lr 0.001174	time 0.8770 (0.8819)	loss 0.6115 (0.5482)	grad_norm 2.2488 (2.7816)	mem 20675MB
[2025-04-03 01:10:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][432/573]	eta 0:02:04 lr 0.001174	time 0.8775 (0.8819)	loss 0.6026 (0.5485)	grad_norm 2.1218 (2.7787)	mem 20675MB
[2025-04-03 01:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][434/573]	eta 0:02:02 lr 0.001174	time 0.8773 (0.8819)	loss 0.5998 (0.5486)	grad_norm 2.1280 (2.7752)	mem 20675MB
[2025-04-03 01:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][436/573]	eta 0:02:00 lr 0.001174	time 0.8773 (0.8819)	loss 0.5695 (0.5487)	grad_norm 1.7375 (2.7719)	mem 20675MB
[2025-04-03 01:10:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][438/573]	eta 0:01:59 lr 0.001174	time 0.8771 (0.8819)	loss 0.5444 (0.5487)	grad_norm 2.2782 (2.7685)	mem 20675MB
[2025-04-03 01:10:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][440/573]	eta 0:01:57 lr 0.001174	time 0.8772 (0.8819)	loss 0.5876 (0.5487)	grad_norm 2.3563 (2.7661)	mem 20675MB
[2025-04-03 01:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][442/573]	eta 0:01:55 lr 0.001174	time 0.8772 (0.8818)	loss 0.5317 (0.5486)	grad_norm 3.2954 (2.7660)	mem 20675MB
[2025-04-03 01:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][444/573]	eta 0:01:53 lr 0.001173	time 0.8770 (0.8818)	loss 0.5461 (0.5482)	grad_norm 1.9338 (2.7629)	mem 20675MB
[2025-04-03 01:11:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][446/573]	eta 0:01:51 lr 0.001173	time 0.8770 (0.8818)	loss 0.4959 (0.5478)	grad_norm 2.2634 (2.7649)	mem 20675MB
[2025-04-03 01:11:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][448/573]	eta 0:01:50 lr 0.001173	time 0.8771 (0.8818)	loss 0.4700 (0.5476)	grad_norm 2.9972 (2.7662)	mem 20675MB
[2025-04-03 01:11:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][450/573]	eta 0:01:48 lr 0.001173	time 0.8772 (0.8818)	loss 0.6268 (0.5478)	grad_norm 4.4785 (2.7703)	mem 20675MB
[2025-04-03 01:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][452/573]	eta 0:01:46 lr 0.001173	time 0.8773 (0.8818)	loss 0.6225 (0.5480)	grad_norm 2.3081 (2.7702)	mem 20675MB
[2025-04-03 01:11:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][454/573]	eta 0:01:44 lr 0.001173	time 0.8775 (0.8817)	loss 0.3821 (0.5476)	grad_norm 3.6570 (2.7714)	mem 20675MB
[2025-04-03 01:11:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][456/573]	eta 0:01:43 lr 0.001173	time 0.8770 (0.8817)	loss 0.4785 (0.5475)	grad_norm 3.4345 (2.7727)	mem 20675MB
[2025-04-03 01:11:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][458/573]	eta 0:01:41 lr 0.001173	time 0.8773 (0.8817)	loss 0.5398 (0.5475)	grad_norm 2.0866 (2.7714)	mem 20675MB
[2025-04-03 01:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][460/573]	eta 0:01:39 lr 0.001173	time 0.8771 (0.8817)	loss 0.4364 (0.5474)	grad_norm 2.1993 (2.7697)	mem 20675MB
[2025-04-03 01:11:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][462/573]	eta 0:01:37 lr 0.001173	time 0.8773 (0.8817)	loss 0.4216 (0.5472)	grad_norm 3.0810 (2.7686)	mem 20675MB
[2025-04-03 01:11:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][464/573]	eta 0:01:36 lr 0.001172	time 0.8774 (0.8817)	loss 0.5585 (0.5473)	grad_norm 2.5137 (2.7648)	mem 20675MB
[2025-04-03 01:11:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][466/573]	eta 0:01:34 lr 0.001172	time 0.8773 (0.8816)	loss 0.4685 (0.5470)	grad_norm 2.4674 (2.7630)	mem 20675MB
[2025-04-03 01:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][468/573]	eta 0:01:32 lr 0.001172	time 0.8774 (0.8816)	loss 0.4646 (0.5469)	grad_norm 5.1518 (2.7659)	mem 20675MB
[2025-04-03 01:11:25 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][470/573]	eta 0:01:30 lr 0.001172	time 0.8773 (0.8816)	loss 0.5740 (0.5471)	grad_norm 3.5731 (2.7693)	mem 20675MB
[2025-04-03 01:11:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][472/573]	eta 0:01:29 lr 0.001172	time 0.8773 (0.8816)	loss 0.5906 (0.5472)	grad_norm 3.1500 (2.7703)	mem 20675MB
[2025-04-03 01:11:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][474/573]	eta 0:01:27 lr 0.001172	time 0.8772 (0.8816)	loss 0.6471 (0.5476)	grad_norm 2.5726 (2.7717)	mem 20675MB
[2025-04-03 01:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][476/573]	eta 0:01:25 lr 0.001172	time 0.8774 (0.8816)	loss 0.4631 (0.5475)	grad_norm 2.6734 (2.7722)	mem 20675MB
[2025-04-03 01:11:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][478/573]	eta 0:01:23 lr 0.001172	time 0.8772 (0.8816)	loss 0.5836 (0.5478)	grad_norm 1.7844 (2.7681)	mem 20675MB
[2025-04-03 01:11:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][480/573]	eta 0:01:21 lr 0.001172	time 0.8778 (0.8815)	loss 0.5488 (0.5476)	grad_norm 1.7964 (2.7657)	mem 20675MB
[2025-04-03 01:11:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][482/573]	eta 0:01:20 lr 0.001171	time 0.8774 (0.8815)	loss 0.5413 (0.5476)	grad_norm 2.0874 (2.7642)	mem 20675MB
[2025-04-03 01:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][484/573]	eta 0:01:18 lr 0.001171	time 0.8791 (0.8815)	loss 0.6436 (0.5477)	grad_norm 3.5179 (2.7640)	mem 20675MB
[2025-04-03 01:11:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][486/573]	eta 0:01:16 lr 0.001171	time 0.8771 (0.8815)	loss 0.5541 (0.5475)	grad_norm 3.0311 (2.7644)	mem 20675MB
[2025-04-03 01:11:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][488/573]	eta 0:01:14 lr 0.001171	time 0.8773 (0.8815)	loss 0.5609 (0.5473)	grad_norm 2.9380 (2.7677)	mem 20675MB
[2025-04-03 01:11:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][490/573]	eta 0:01:13 lr 0.001171	time 0.8775 (0.8815)	loss 0.5923 (0.5473)	grad_norm 2.4116 (2.7693)	mem 20675MB
[2025-04-03 01:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][492/573]	eta 0:01:11 lr 0.001171	time 0.8773 (0.8815)	loss 0.5125 (0.5469)	grad_norm 3.4681 (2.7712)	mem 20675MB
[2025-04-03 01:11:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][494/573]	eta 0:01:09 lr 0.001171	time 0.8771 (0.8815)	loss 0.4367 (0.5465)	grad_norm 3.6396 (2.7718)	mem 20675MB
[2025-04-03 01:11:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][496/573]	eta 0:01:07 lr 0.001171	time 0.8769 (0.8814)	loss 0.4256 (0.5465)	grad_norm 3.5380 (2.7758)	mem 20675MB
[2025-04-03 01:11:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][498/573]	eta 0:01:06 lr 0.001171	time 0.8775 (0.8814)	loss 0.5592 (0.5467)	grad_norm 1.9079 (2.7771)	mem 20675MB
[2025-04-03 01:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][500/573]	eta 0:01:04 lr 0.001170	time 0.8771 (0.8814)	loss 0.4480 (0.5462)	grad_norm 2.8954 (2.7781)	mem 20675MB
[2025-04-03 01:11:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][502/573]	eta 0:01:02 lr 0.001170	time 0.8771 (0.8814)	loss 0.6407 (0.5464)	grad_norm 1.9544 (2.7763)	mem 20675MB
[2025-04-03 01:11:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][504/573]	eta 0:01:00 lr 0.001170	time 0.8771 (0.8814)	loss 0.5539 (0.5463)	grad_norm 1.7509 (2.7738)	mem 20675MB
[2025-04-03 01:11:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][506/573]	eta 0:00:59 lr 0.001170	time 0.8774 (0.8814)	loss 0.4990 (0.5462)	grad_norm 3.8611 (2.7766)	mem 20675MB
[2025-04-03 01:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][508/573]	eta 0:00:57 lr 0.001170	time 0.8772 (0.8814)	loss 0.5837 (0.5463)	grad_norm 2.9407 (2.7755)	mem 20675MB
[2025-04-03 01:12:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][510/573]	eta 0:00:55 lr 0.001170	time 0.8773 (0.8813)	loss 0.5873 (0.5462)	grad_norm 1.5651 (2.7778)	mem 20675MB
[2025-04-03 01:12:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][512/573]	eta 0:00:53 lr 0.001170	time 0.8771 (0.8813)	loss 0.6331 (0.5462)	grad_norm 3.3631 (2.7789)	mem 20675MB
[2025-04-03 01:12:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][514/573]	eta 0:00:51 lr 0.001170	time 0.8775 (0.8813)	loss 0.5614 (0.5462)	grad_norm 3.7660 (2.7808)	mem 20675MB
[2025-04-03 01:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][516/573]	eta 0:00:50 lr 0.001170	time 0.8777 (0.8813)	loss 0.5386 (0.5462)	grad_norm 1.4506 (2.7781)	mem 20675MB
[2025-04-03 01:12:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][518/573]	eta 0:00:48 lr 0.001169	time 0.8777 (0.8813)	loss 0.5606 (0.5463)	grad_norm 3.7895 (2.7791)	mem 20675MB
[2025-04-03 01:12:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][520/573]	eta 0:00:46 lr 0.001169	time 0.8774 (0.8813)	loss 0.6088 (0.5464)	grad_norm 2.7466 (2.7799)	mem 20675MB
[2025-04-03 01:12:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][522/573]	eta 0:00:44 lr 0.001169	time 0.8773 (0.8813)	loss 0.5016 (0.5464)	grad_norm 2.5996 (2.7779)	mem 20675MB
[2025-04-03 01:12:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][524/573]	eta 0:00:43 lr 0.001169	time 0.8774 (0.8813)	loss 0.6411 (0.5467)	grad_norm 2.3142 (2.7749)	mem 20675MB
[2025-04-03 01:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][526/573]	eta 0:00:41 lr 0.001169	time 0.8774 (0.8813)	loss 0.4806 (0.5464)	grad_norm 2.4205 (2.7749)	mem 20675MB
[2025-04-03 01:12:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][528/573]	eta 0:00:39 lr 0.001169	time 0.8777 (0.8813)	loss 0.5498 (0.5464)	grad_norm 2.2349 (2.7735)	mem 20675MB
[2025-04-03 01:12:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][530/573]	eta 0:00:37 lr 0.001169	time 0.8772 (0.8812)	loss 0.5294 (0.5462)	grad_norm 1.8841 (2.7717)	mem 20675MB
[2025-04-03 01:12:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][532/573]	eta 0:00:36 lr 0.001169	time 0.8776 (0.8812)	loss 0.5968 (0.5460)	grad_norm 2.8848 (2.7716)	mem 20675MB
[2025-04-03 01:12:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][534/573]	eta 0:00:34 lr 0.001168	time 0.8774 (0.8812)	loss 0.4505 (0.5458)	grad_norm 4.6355 (2.7774)	mem 20675MB
[2025-04-03 01:12:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][536/573]	eta 0:00:32 lr 0.001168	time 0.8774 (0.8812)	loss 0.6438 (0.5461)	grad_norm 2.4876 (2.7778)	mem 20675MB
[2025-04-03 01:12:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][538/573]	eta 0:00:30 lr 0.001168	time 0.8775 (0.8812)	loss 0.4765 (0.5459)	grad_norm 4.0886 (2.7798)	mem 20675MB
[2025-04-03 01:12:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][540/573]	eta 0:00:29 lr 0.001168	time 0.8785 (0.8812)	loss 0.6073 (0.5459)	grad_norm 2.3460 (2.7782)	mem 20675MB
[2025-04-03 01:12:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][542/573]	eta 0:00:27 lr 0.001168	time 0.8771 (0.8812)	loss 0.5146 (0.5460)	grad_norm 2.3688 (2.7760)	mem 20675MB
[2025-04-03 01:12:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][544/573]	eta 0:00:25 lr 0.001168	time 0.8776 (0.8812)	loss 0.5341 (0.5461)	grad_norm 2.3265 (2.7731)	mem 20675MB
[2025-04-03 01:12:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][546/573]	eta 0:00:23 lr 0.001168	time 0.8776 (0.8812)	loss 0.5215 (0.5460)	grad_norm 2.0938 (2.7722)	mem 20675MB
[2025-04-03 01:12:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][548/573]	eta 0:00:22 lr 0.001168	time 0.8774 (0.8811)	loss 0.4904 (0.5460)	grad_norm 2.4563 (2.7716)	mem 20675MB
[2025-04-03 01:12:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][550/573]	eta 0:00:20 lr 0.001168	time 0.8771 (0.8811)	loss 0.4378 (0.5458)	grad_norm 4.7447 (2.7739)	mem 20675MB
[2025-04-03 01:12:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][552/573]	eta 0:00:18 lr 0.001167	time 0.8777 (0.8811)	loss 0.6246 (0.5457)	grad_norm 3.0169 (2.7738)	mem 20675MB
[2025-04-03 01:12:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][554/573]	eta 0:00:16 lr 0.001167	time 0.8775 (0.8811)	loss 0.4307 (0.5455)	grad_norm 2.5936 (2.7735)	mem 20675MB
[2025-04-03 01:12:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][556/573]	eta 0:00:14 lr 0.001167	time 0.8776 (0.8811)	loss 0.5927 (0.5454)	grad_norm 3.7933 (2.7750)	mem 20675MB
[2025-04-03 01:12:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][558/573]	eta 0:00:13 lr 0.001167	time 0.8771 (0.8811)	loss 0.5084 (0.5453)	grad_norm 2.3020 (2.7726)	mem 20675MB
[2025-04-03 01:12:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][560/573]	eta 0:00:11 lr 0.001167	time 0.8770 (0.8811)	loss 0.5006 (0.5453)	grad_norm 3.0942 (2.7729)	mem 20675MB
[2025-04-03 01:12:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][562/573]	eta 0:00:09 lr 0.001167	time 0.8771 (0.8811)	loss 0.4905 (0.5449)	grad_norm 3.4750 (2.7736)	mem 20675MB
[2025-04-03 01:12:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][564/573]	eta 0:00:07 lr 0.001167	time 0.8770 (0.8811)	loss 0.4758 (0.5446)	grad_norm 4.1114 (2.7777)	mem 20675MB
[2025-04-03 01:12:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][566/573]	eta 0:00:06 lr 0.001167	time 0.8772 (0.8811)	loss 0.4780 (0.5443)	grad_norm 6.9482 (2.7863)	mem 20675MB
[2025-04-03 01:12:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][568/573]	eta 0:00:04 lr 0.001167	time 0.8773 (0.8810)	loss 0.4711 (0.5443)	grad_norm 3.5944 (2.7883)	mem 20675MB
[2025-04-03 01:12:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][570/573]	eta 0:00:02 lr 0.001166	time 0.8772 (0.8810)	loss 0.4600 (0.5440)	grad_norm 2.5109 (2.7928)	mem 20675MB
[2025-04-03 01:12:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][572/573]	eta 0:00:00 lr 0.001166	time 0.8770 (0.8810)	loss 0.5935 (0.5441)	grad_norm 1.8553 (2.7895)	mem 20675MB
[2025-04-03 01:12:54 simmim_finetune] (main_finetune.py 260): INFO EPOCH 4 training takes 0:08:24
[2025-04-03 01:12:57 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 2.181 (2.181)	Loss 0.5670 (0.5670)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 01:12:57 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.916)	Loss 0.5055 (0.5328)	Acc@1 73.438 (72.656)	Mem 20675MB
[2025-04-03 01:12:58 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.663)	Loss 0.5423 (0.5364)	Acc@1 77.344 (74.219)	Mem 20675MB
[2025-04-03 01:12:58 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.555)	Loss 0.5809 (0.5380)	Acc@1 72.656 (74.330)	Mem 20675MB
[2025-04-03 01:12:59 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.495)	Loss 0.5931 (0.5370)	Acc@1 69.531 (74.132)	Mem 20675MB
[2025-04-03 01:12:59 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.456)	Loss 0.5155 (0.5363)	Acc@1 75.000 (73.722)	Mem 20675MB
[2025-04-03 01:13:00 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.430)	Loss 0.5686 (0.5370)	Acc@1 67.188 (73.317)	Mem 20675MB
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.410)	Loss 0.5229 (0.5353)	Acc@1 75.781 (73.490)	Mem 20675MB
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 73.488
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 73.5%
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 73.89%
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.328871596084595e-06, 4.328871596084595e-06, 6.65079108411777e-06, 6.65079108411777e-06, 1.0222974911861116e-05, 1.0222974911861116e-05, 1.5718642339158572e-05, 1.5718642339158572e-05, 2.4173515304231577e-05, 2.4173515304231577e-05, 3.718101217357467e-05, 3.718101217357467e-05, 5.719254581871788e-05, 5.719254581871788e-05, 8.797952065739973e-05, 8.797952065739973e-05, 0.0001353440973322949, 0.0001353440973322949, 0.00020821267683213362, 0.00020821267683213362, 0.00032031818375496236, 0.00032031818375496236, 0.0004927881944054683, 0.0004927881944054683, 0.0007581266723293234, 0.0007581266723293234, 0.0011663397152891005, 0.0011663397152891005]
[2025-04-03 01:13:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][0/573]	eta 0:22:47 lr 0.001166	time 2.3867 (2.3867)	loss 0.5265 (0.5265)	grad_norm 3.8913 (3.8913)	mem 20675MB
[2025-04-03 01:13:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][2/573]	eta 0:13:09 lr 0.001166	time 0.8776 (1.3820)	loss 0.4152 (0.4570)	grad_norm 2.8604 (3.3869)	mem 20675MB
[2025-04-03 01:13:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][4/573]	eta 0:11:11 lr 0.001166	time 0.8773 (1.1807)	loss 0.6161 (0.5096)	grad_norm 2.2646 (2.9069)	mem 20675MB
[2025-04-03 01:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][6/573]	eta 0:10:20 lr 0.001166	time 0.8775 (1.0944)	loss 0.4687 (0.4966)	grad_norm 5.1948 (3.1151)	mem 20675MB
[2025-04-03 01:13:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][8/573]	eta 0:09:51 lr 0.001166	time 0.8778 (1.0465)	loss 0.5105 (0.5104)	grad_norm 2.5634 (2.9792)	mem 20675MB
[2025-04-03 01:13:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][10/573]	eta 0:09:31 lr 0.001166	time 0.8774 (1.0159)	loss 0.5991 (0.5171)	grad_norm 3.7783 (3.0226)	mem 20675MB
[2025-04-03 01:13:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][12/573]	eta 0:09:18 lr 0.001166	time 0.8776 (0.9948)	loss 0.4986 (0.5187)	grad_norm 2.7292 (2.9459)	mem 20675MB
[2025-04-03 01:13:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][14/573]	eta 0:09:07 lr 0.001165	time 0.8773 (0.9793)	loss 0.5598 (0.5174)	grad_norm 2.8910 (3.1935)	mem 20675MB
[2025-04-03 01:13:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][16/573]	eta 0:08:58 lr 0.001165	time 0.8778 (0.9674)	loss 0.5726 (0.5245)	grad_norm 1.7943 (3.1262)	mem 20675MB
[2025-04-03 01:13:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][18/573]	eta 0:08:51 lr 0.001165	time 0.8777 (0.9581)	loss 0.5455 (0.5238)	grad_norm 2.4778 (3.0224)	mem 20675MB
[2025-04-03 01:13:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][20/573]	eta 0:08:45 lr 0.001165	time 0.8773 (0.9505)	loss 0.4319 (0.5185)	grad_norm 2.2655 (2.9501)	mem 20675MB
[2025-04-03 01:13:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][22/573]	eta 0:08:40 lr 0.001165	time 0.8773 (0.9442)	loss 0.4475 (0.5120)	grad_norm 3.4317 (3.0106)	mem 20675MB
[2025-04-03 01:13:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][24/573]	eta 0:08:35 lr 0.001165	time 0.8778 (0.9390)	loss 0.5247 (0.5145)	grad_norm 2.6870 (2.9827)	mem 20675MB
[2025-04-03 01:13:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][26/573]	eta 0:08:31 lr 0.001165	time 0.8778 (0.9345)	loss 0.6141 (0.5200)	grad_norm 4.3877 (3.0153)	mem 20675MB
[2025-04-03 01:13:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][28/573]	eta 0:08:27 lr 0.001165	time 0.8792 (0.9307)	loss 0.5824 (0.5262)	grad_norm 3.4007 (3.0547)	mem 20675MB
[2025-04-03 01:13:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][30/573]	eta 0:08:23 lr 0.001165	time 0.8776 (0.9274)	loss 0.5274 (0.5264)	grad_norm 2.5070 (2.9983)	mem 20675MB
[2025-04-03 01:13:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][32/573]	eta 0:08:20 lr 0.001164	time 0.8780 (0.9244)	loss 0.5319 (0.5295)	grad_norm 2.1412 (2.9647)	mem 20675MB
[2025-04-03 01:13:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][34/573]	eta 0:08:16 lr 0.001164	time 0.8778 (0.9218)	loss 0.5587 (0.5309)	grad_norm 1.8033 (2.8861)	mem 20675MB
[2025-04-03 01:13:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][36/573]	eta 0:08:13 lr 0.001164	time 0.8775 (0.9195)	loss 0.5619 (0.5310)	grad_norm 1.8825 (2.8402)	mem 20675MB
[2025-04-03 01:13:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][38/573]	eta 0:08:10 lr 0.001164	time 0.8777 (0.9174)	loss 0.5907 (0.5303)	grad_norm 2.5601 (2.8210)	mem 20675MB
[2025-04-03 01:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][40/573]	eta 0:08:07 lr 0.001164	time 0.8778 (0.9155)	loss 0.4552 (0.5259)	grad_norm 2.2604 (2.8161)	mem 20675MB
[2025-04-03 01:13:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][42/573]	eta 0:08:05 lr 0.001164	time 0.8773 (0.9138)	loss 0.5823 (0.5283)	grad_norm 2.9190 (2.8257)	mem 20675MB
[2025-04-03 01:13:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][44/573]	eta 0:08:02 lr 0.001164	time 0.8779 (0.9122)	loss 0.4810 (0.5236)	grad_norm 3.3213 (2.8294)	mem 20675MB
[2025-04-03 01:13:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][46/573]	eta 0:07:59 lr 0.001164	time 0.8771 (0.9108)	loss 0.5734 (0.5246)	grad_norm 2.8516 (2.8192)	mem 20675MB
[2025-04-03 01:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][48/573]	eta 0:07:57 lr 0.001164	time 0.8774 (0.9095)	loss 0.5772 (0.5287)	grad_norm 4.0213 (2.8678)	mem 20675MB
[2025-04-03 01:13:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][50/573]	eta 0:07:55 lr 0.001163	time 0.8774 (0.9082)	loss 0.6172 (0.5305)	grad_norm 2.7766 (2.8648)	mem 20675MB
[2025-04-03 01:13:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][52/573]	eta 0:07:52 lr 0.001163	time 0.8775 (0.9071)	loss 0.4884 (0.5306)	grad_norm 2.7934 (2.8417)	mem 20675MB
[2025-04-03 01:13:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][54/573]	eta 0:07:50 lr 0.001163	time 0.8772 (0.9060)	loss 0.4233 (0.5274)	grad_norm 2.3177 (2.8306)	mem 20675MB
[2025-04-03 01:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][56/573]	eta 0:07:47 lr 0.001163	time 0.8774 (0.9051)	loss 0.5798 (0.5280)	grad_norm 1.4535 (2.7997)	mem 20675MB
[2025-04-03 01:13:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][58/573]	eta 0:07:45 lr 0.001163	time 0.8814 (0.9042)	loss 0.3989 (0.5262)	grad_norm 2.6204 (2.7851)	mem 20675MB
[2025-04-03 01:13:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][60/573]	eta 0:07:43 lr 0.001163	time 0.8775 (0.9034)	loss 0.6089 (0.5276)	grad_norm 2.8775 (2.7731)	mem 20675MB
[2025-04-03 01:13:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][62/573]	eta 0:07:41 lr 0.001163	time 0.8776 (0.9026)	loss 0.5798 (0.5296)	grad_norm 2.1551 (2.7575)	mem 20675MB
[2025-04-03 01:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][64/573]	eta 0:07:39 lr 0.001163	time 0.8777 (0.9019)	loss 0.5214 (0.5312)	grad_norm 1.8704 (2.7498)	mem 20675MB
[2025-04-03 01:14:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][66/573]	eta 0:07:36 lr 0.001162	time 0.8772 (0.9012)	loss 0.5543 (0.5325)	grad_norm 1.5246 (2.7260)	mem 20675MB
[2025-04-03 01:14:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][68/573]	eta 0:07:34 lr 0.001162	time 0.8802 (0.9006)	loss 0.5963 (0.5337)	grad_norm 3.2520 (2.7285)	mem 20675MB
[2025-04-03 01:14:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][70/573]	eta 0:07:32 lr 0.001162	time 0.8774 (0.8999)	loss 0.5907 (0.5352)	grad_norm 2.2841 (2.7075)	mem 20675MB
[2025-04-03 01:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][72/573]	eta 0:07:30 lr 0.001162	time 0.8772 (0.8993)	loss 0.4983 (0.5349)	grad_norm 2.8961 (2.6991)	mem 20675MB
[2025-04-03 01:14:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][74/573]	eta 0:07:28 lr 0.001162	time 0.8772 (0.8988)	loss 0.5435 (0.5357)	grad_norm 3.1340 (2.7107)	mem 20675MB
[2025-04-03 01:14:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][76/573]	eta 0:07:26 lr 0.001162	time 0.8781 (0.8983)	loss 0.5734 (0.5371)	grad_norm 3.0720 (2.7069)	mem 20675MB
[2025-04-03 01:14:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][78/573]	eta 0:07:24 lr 0.001162	time 0.8775 (0.8978)	loss 0.4494 (0.5344)	grad_norm 2.4558 (2.7134)	mem 20675MB
[2025-04-03 01:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][80/573]	eta 0:07:22 lr 0.001162	time 0.8788 (0.8973)	loss 0.5909 (0.5355)	grad_norm 2.6755 (2.7049)	mem 20675MB
[2025-04-03 01:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][82/573]	eta 0:07:20 lr 0.001162	time 0.8770 (0.8968)	loss 0.5566 (0.5336)	grad_norm 3.4208 (2.7218)	mem 20675MB
[2025-04-03 01:14:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][84/573]	eta 0:07:18 lr 0.001161	time 0.8786 (0.8964)	loss 0.5138 (0.5326)	grad_norm 2.4441 (2.7428)	mem 20675MB
[2025-04-03 01:14:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][86/573]	eta 0:07:16 lr 0.001161	time 0.8795 (0.8960)	loss 0.4217 (0.5312)	grad_norm 3.4580 (2.7495)	mem 20675MB
[2025-04-03 01:14:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][88/573]	eta 0:07:14 lr 0.001161	time 0.8774 (0.8956)	loss 0.4282 (0.5309)	grad_norm 4.0963 (2.7723)	mem 20675MB
[2025-04-03 01:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][90/573]	eta 0:07:12 lr 0.001161	time 0.8775 (0.8952)	loss 0.6557 (0.5325)	grad_norm 3.1139 (2.7709)	mem 20675MB
[2025-04-03 01:14:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][92/573]	eta 0:07:10 lr 0.001161	time 0.8773 (0.8949)	loss 0.6546 (0.5340)	grad_norm 3.9653 (2.8050)	mem 20675MB
[2025-04-03 01:14:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][94/573]	eta 0:07:08 lr 0.001161	time 0.8773 (0.8946)	loss 0.5697 (0.5339)	grad_norm 1.6506 (2.7846)	mem 20675MB
[2025-04-03 01:14:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][96/573]	eta 0:07:06 lr 0.001161	time 0.8775 (0.8942)	loss 0.5667 (0.5341)	grad_norm 3.3465 (2.7781)	mem 20675MB
[2025-04-03 01:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][98/573]	eta 0:07:04 lr 0.001161	time 0.8772 (0.8939)	loss 0.5717 (0.5341)	grad_norm 1.8261 (2.7702)	mem 20675MB
[2025-04-03 01:14:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][100/573]	eta 0:07:02 lr 0.001160	time 0.8781 (0.8936)	loss 0.6270 (0.5347)	grad_norm 1.5277 (2.7537)	mem 20675MB
[2025-04-03 01:14:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][102/573]	eta 0:07:00 lr 0.001160	time 0.8773 (0.8933)	loss 0.5363 (0.5351)	grad_norm 3.4192 (2.7492)	mem 20675MB
[2025-04-03 01:14:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][104/573]	eta 0:06:58 lr 0.001160	time 0.8772 (0.8930)	loss 0.5387 (0.5353)	grad_norm 2.6438 (2.7479)	mem 20675MB
[2025-04-03 01:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][106/573]	eta 0:06:56 lr 0.001160	time 0.8772 (0.8928)	loss 0.5286 (0.5356)	grad_norm 1.9248 (2.7315)	mem 20675MB
[2025-04-03 01:14:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][108/573]	eta 0:06:55 lr 0.001160	time 0.8775 (0.8925)	loss 0.6445 (0.5354)	grad_norm 3.2058 (2.7333)	mem 20675MB
[2025-04-03 01:14:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][110/573]	eta 0:06:53 lr 0.001160	time 0.8772 (0.8923)	loss 0.5720 (0.5364)	grad_norm 2.2132 (2.7230)	mem 20675MB
[2025-04-03 01:14:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][112/573]	eta 0:06:51 lr 0.001160	time 0.8774 (0.8920)	loss 0.5360 (0.5367)	grad_norm 1.9073 (2.7164)	mem 20675MB
[2025-04-03 01:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][114/573]	eta 0:06:49 lr 0.001160	time 0.8771 (0.8918)	loss 0.4751 (0.5361)	grad_norm 2.5275 (2.7132)	mem 20675MB
[2025-04-03 01:14:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][116/573]	eta 0:06:47 lr 0.001160	time 0.8775 (0.8915)	loss 0.4033 (0.5340)	grad_norm 2.6869 (2.7160)	mem 20675MB
[2025-04-03 01:14:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][118/573]	eta 0:06:45 lr 0.001159	time 0.8775 (0.8913)	loss 0.6054 (0.5340)	grad_norm 2.9183 (2.7133)	mem 20675MB
[2025-04-03 01:14:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][120/573]	eta 0:06:43 lr 0.001159	time 0.8773 (0.8911)	loss 0.5764 (0.5336)	grad_norm 3.2984 (2.7272)	mem 20675MB
[2025-04-03 01:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][122/573]	eta 0:06:41 lr 0.001159	time 0.8772 (0.8909)	loss 0.5936 (0.5342)	grad_norm 2.8050 (2.7304)	mem 20675MB
[2025-04-03 01:14:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][124/573]	eta 0:06:39 lr 0.001159	time 0.8774 (0.8907)	loss 0.5776 (0.5347)	grad_norm 2.1717 (2.7295)	mem 20675MB
[2025-04-03 01:14:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][126/573]	eta 0:06:38 lr 0.001159	time 0.8771 (0.8905)	loss 0.5420 (0.5350)	grad_norm 5.5897 (2.7541)	mem 20675MB
[2025-04-03 01:14:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][128/573]	eta 0:06:36 lr 0.001159	time 0.8772 (0.8903)	loss 0.5612 (0.5358)	grad_norm 2.9648 (2.7620)	mem 20675MB
[2025-04-03 01:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][130/573]	eta 0:06:34 lr 0.001159	time 0.8775 (0.8902)	loss 0.5389 (0.5353)	grad_norm 6.1022 (2.8115)	mem 20675MB
[2025-04-03 01:14:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][132/573]	eta 0:06:32 lr 0.001159	time 0.8773 (0.8900)	loss 0.5590 (0.5361)	grad_norm 2.8454 (2.8132)	mem 20675MB
[2025-04-03 01:15:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][134/573]	eta 0:06:30 lr 0.001158	time 0.8777 (0.8898)	loss 0.6112 (0.5356)	grad_norm 2.9559 (2.8151)	mem 20675MB
[2025-04-03 01:15:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][136/573]	eta 0:06:28 lr 0.001158	time 0.8776 (0.8896)	loss 0.5780 (0.5356)	grad_norm 2.4471 (2.8138)	mem 20675MB
[2025-04-03 01:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][138/573]	eta 0:06:26 lr 0.001158	time 0.8774 (0.8895)	loss 0.4425 (0.5351)	grad_norm 2.1805 (2.8027)	mem 20675MB
[2025-04-03 01:15:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][140/573]	eta 0:06:25 lr 0.001158	time 0.8772 (0.8893)	loss 0.4975 (0.5353)	grad_norm 2.1111 (2.7908)	mem 20675MB
[2025-04-03 01:15:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][142/573]	eta 0:06:23 lr 0.001158	time 0.8774 (0.8892)	loss 0.4303 (0.5345)	grad_norm 2.8251 (2.7851)	mem 20675MB
[2025-04-03 01:15:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][144/573]	eta 0:06:21 lr 0.001158	time 0.8772 (0.8890)	loss 0.5898 (0.5346)	grad_norm 2.3270 (2.7806)	mem 20675MB
[2025-04-03 01:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][146/573]	eta 0:06:19 lr 0.001158	time 0.8775 (0.8889)	loss 0.4984 (0.5353)	grad_norm 2.0193 (2.7906)	mem 20675MB
[2025-04-03 01:15:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][148/573]	eta 0:06:17 lr 0.001158	time 0.8771 (0.8887)	loss 0.5606 (0.5360)	grad_norm 3.4749 (2.7945)	mem 20675MB
[2025-04-03 01:15:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][150/573]	eta 0:06:15 lr 0.001158	time 0.8774 (0.8886)	loss 0.5603 (0.5359)	grad_norm 1.8574 (2.7882)	mem 20675MB
[2025-04-03 01:15:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][152/573]	eta 0:06:14 lr 0.001157	time 0.8774 (0.8884)	loss 0.5600 (0.5357)	grad_norm 1.9431 (2.7785)	mem 20675MB
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][154/573]	eta 0:06:12 lr 0.001157	time 0.8771 (0.8883)	loss 0.5347 (0.5358)	grad_norm 2.4281 (2.7745)	mem 20675MB
[2025-04-03 01:15:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][156/573]	eta 0:06:10 lr 0.001157	time 0.8773 (0.8882)	loss 0.4667 (0.5362)	grad_norm 4.9911 (2.7911)	mem 20675MB
[2025-04-03 01:15:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][158/573]	eta 0:06:08 lr 0.001157	time 0.8771 (0.8881)	loss 0.5814 (0.5366)	grad_norm 1.7267 (2.7795)	mem 20675MB
[2025-04-03 01:15:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][160/573]	eta 0:06:06 lr 0.001157	time 0.8770 (0.8879)	loss 0.5291 (0.5360)	grad_norm 3.4123 (2.7790)	mem 20675MB
[2025-04-03 01:15:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][162/573]	eta 0:06:04 lr 0.001157	time 0.8770 (0.8878)	loss 0.4984 (0.5353)	grad_norm 3.1103 (2.7816)	mem 20675MB
[2025-04-03 01:15:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][164/573]	eta 0:06:03 lr 0.001157	time 0.8773 (0.8877)	loss 0.4586 (0.5348)	grad_norm 4.4749 (2.7875)	mem 20675MB
[2025-04-03 01:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][166/573]	eta 0:06:01 lr 0.001157	time 0.8773 (0.8876)	loss 0.5976 (0.5354)	grad_norm 4.3475 (2.7956)	mem 20675MB
[2025-04-03 01:15:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][168/573]	eta 0:05:59 lr 0.001156	time 0.8777 (0.8875)	loss 0.6616 (0.5364)	grad_norm 3.3672 (2.7992)	mem 20675MB
[2025-04-03 01:15:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][170/573]	eta 0:05:57 lr 0.001156	time 0.8774 (0.8874)	loss 0.5600 (0.5370)	grad_norm 1.7327 (2.7895)	mem 20675MB
[2025-04-03 01:15:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][172/573]	eta 0:05:55 lr 0.001156	time 0.8772 (0.8872)	loss 0.5768 (0.5375)	grad_norm 1.7799 (2.7810)	mem 20675MB
[2025-04-03 01:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][174/573]	eta 0:05:53 lr 0.001156	time 0.8774 (0.8871)	loss 0.5613 (0.5380)	grad_norm 2.7289 (2.7762)	mem 20675MB
[2025-04-03 01:15:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][176/573]	eta 0:05:52 lr 0.001156	time 0.8775 (0.8870)	loss 0.4866 (0.5370)	grad_norm 6.0913 (2.7991)	mem 20675MB
[2025-04-03 01:15:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][178/573]	eta 0:05:50 lr 0.001156	time 0.8771 (0.8869)	loss 0.5734 (0.5373)	grad_norm 2.6748 (2.7956)	mem 20675MB
[2025-04-03 01:15:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][180/573]	eta 0:05:48 lr 0.001156	time 0.8771 (0.8868)	loss 0.4741 (0.5377)	grad_norm 3.1641 (2.7955)	mem 20675MB
[2025-04-03 01:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][182/573]	eta 0:05:46 lr 0.001156	time 0.8770 (0.8867)	loss 0.5772 (0.5375)	grad_norm 2.1738 (2.7921)	mem 20675MB
[2025-04-03 01:15:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][184/573]	eta 0:05:44 lr 0.001155	time 0.8772 (0.8866)	loss 0.5616 (0.5383)	grad_norm 2.6040 (2.8005)	mem 20675MB
[2025-04-03 01:15:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][186/573]	eta 0:05:43 lr 0.001155	time 0.8772 (0.8866)	loss 0.5804 (0.5389)	grad_norm 1.9243 (2.7939)	mem 20675MB
[2025-04-03 01:15:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][188/573]	eta 0:05:41 lr 0.001155	time 0.8771 (0.8865)	loss 0.5129 (0.5382)	grad_norm 1.7999 (2.7858)	mem 20675MB
[2025-04-03 01:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][190/573]	eta 0:05:39 lr 0.001155	time 0.8772 (0.8864)	loss 0.4272 (0.5379)	grad_norm 3.8211 (2.7901)	mem 20675MB
[2025-04-03 01:15:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][192/573]	eta 0:05:37 lr 0.001155	time 0.8774 (0.8863)	loss 0.6053 (0.5384)	grad_norm 3.5501 (2.7922)	mem 20675MB
[2025-04-03 01:15:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][194/573]	eta 0:05:35 lr 0.001155	time 0.8772 (0.8862)	loss 0.4357 (0.5377)	grad_norm 3.9127 (2.7983)	mem 20675MB
[2025-04-03 01:15:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][196/573]	eta 0:05:34 lr 0.001155	time 0.8769 (0.8861)	loss 0.6252 (0.5383)	grad_norm 3.5259 (2.8014)	mem 20675MB
[2025-04-03 01:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][198/573]	eta 0:05:32 lr 0.001155	time 0.8772 (0.8860)	loss 0.5079 (0.5377)	grad_norm 4.2177 (2.8128)	mem 20675MB
[2025-04-03 01:15:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][200/573]	eta 0:05:30 lr 0.001155	time 0.8773 (0.8860)	loss 0.4584 (0.5371)	grad_norm 3.3449 (2.8131)	mem 20675MB
[2025-04-03 01:16:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][202/573]	eta 0:05:28 lr 0.001154	time 0.8770 (0.8859)	loss 0.6275 (0.5370)	grad_norm 5.0615 (2.8214)	mem 20675MB
[2025-04-03 01:16:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][204/573]	eta 0:05:26 lr 0.001154	time 0.8774 (0.8858)	loss 0.5353 (0.5372)	grad_norm 3.0680 (2.8284)	mem 20675MB
[2025-04-03 01:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][206/573]	eta 0:05:25 lr 0.001154	time 0.8772 (0.8857)	loss 0.5460 (0.5375)	grad_norm 2.2797 (2.8243)	mem 20675MB
[2025-04-03 01:16:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][208/573]	eta 0:05:23 lr 0.001154	time 0.8773 (0.8857)	loss 0.4886 (0.5374)	grad_norm 2.0718 (2.8262)	mem 20675MB
[2025-04-03 01:16:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][210/573]	eta 0:05:21 lr 0.001154	time 0.8772 (0.8856)	loss 0.5047 (0.5365)	grad_norm 1.8450 (2.8276)	mem 20675MB
[2025-04-03 01:16:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][212/573]	eta 0:05:19 lr 0.001154	time 0.8772 (0.8855)	loss 0.4858 (0.5357)	grad_norm 3.5966 (2.8351)	mem 20675MB
[2025-04-03 01:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][214/573]	eta 0:05:17 lr 0.001154	time 0.8771 (0.8855)	loss 0.5212 (0.5356)	grad_norm 3.4111 (2.8368)	mem 20675MB
[2025-04-03 01:16:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][216/573]	eta 0:05:16 lr 0.001154	time 0.8773 (0.8854)	loss 0.5824 (0.5354)	grad_norm 3.8016 (2.8480)	mem 20675MB
[2025-04-03 01:16:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][218/573]	eta 0:05:14 lr 0.001153	time 0.8773 (0.8853)	loss 0.6558 (0.5356)	grad_norm 4.2811 (2.8698)	mem 20675MB
[2025-04-03 01:16:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][220/573]	eta 0:05:12 lr 0.001153	time 0.8773 (0.8853)	loss 0.5238 (0.5353)	grad_norm 4.1192 (2.8886)	mem 20675MB
[2025-04-03 01:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][222/573]	eta 0:05:10 lr 0.001153	time 0.8771 (0.8852)	loss 0.5704 (0.5353)	grad_norm 2.1376 (2.8833)	mem 20675MB
[2025-04-03 01:16:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][224/573]	eta 0:05:08 lr 0.001153	time 0.8773 (0.8851)	loss 0.5555 (0.5348)	grad_norm 2.4999 (2.8812)	mem 20675MB
[2025-04-03 01:16:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][226/573]	eta 0:05:07 lr 0.001153	time 0.8772 (0.8851)	loss 0.5971 (0.5352)	grad_norm 1.9168 (2.8711)	mem 20675MB
[2025-04-03 01:16:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][228/573]	eta 0:05:05 lr 0.001153	time 0.8774 (0.8850)	loss 0.6091 (0.5357)	grad_norm 1.8277 (2.8615)	mem 20675MB
[2025-04-03 01:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][230/573]	eta 0:05:03 lr 0.001153	time 0.8774 (0.8849)	loss 0.5715 (0.5359)	grad_norm 2.1867 (2.8552)	mem 20675MB
[2025-04-03 01:16:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][232/573]	eta 0:05:01 lr 0.001153	time 0.8772 (0.8849)	loss 0.5726 (0.5366)	grad_norm 2.1117 (2.8516)	mem 20675MB
[2025-04-03 01:16:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][234/573]	eta 0:04:59 lr 0.001152	time 0.8773 (0.8848)	loss 0.5849 (0.5365)	grad_norm 2.0899 (2.8485)	mem 20675MB
[2025-04-03 01:16:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][236/573]	eta 0:04:58 lr 0.001152	time 0.8771 (0.8848)	loss 0.4406 (0.5363)	grad_norm 2.7701 (2.8431)	mem 20675MB
[2025-04-03 01:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][238/573]	eta 0:04:56 lr 0.001152	time 0.8774 (0.8847)	loss 0.4843 (0.5358)	grad_norm 2.8304 (2.8449)	mem 20675MB
[2025-04-03 01:16:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][240/573]	eta 0:04:54 lr 0.001152	time 0.8773 (0.8847)	loss 0.5707 (0.5356)	grad_norm 2.7370 (2.8416)	mem 20675MB
[2025-04-03 01:16:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][242/573]	eta 0:04:52 lr 0.001152	time 0.8773 (0.8846)	loss 0.6174 (0.5359)	grad_norm 2.6961 (2.8374)	mem 20675MB
[2025-04-03 01:16:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][244/573]	eta 0:04:51 lr 0.001152	time 0.8777 (0.8846)	loss 0.6226 (0.5358)	grad_norm 4.3051 (2.8414)	mem 20675MB
[2025-04-03 01:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][246/573]	eta 0:04:49 lr 0.001152	time 0.8772 (0.8845)	loss 0.5888 (0.5361)	grad_norm 2.1110 (2.8439)	mem 20675MB
[2025-04-03 01:16:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][248/573]	eta 0:04:47 lr 0.001152	time 0.8772 (0.8845)	loss 0.5124 (0.5356)	grad_norm 1.8139 (2.8434)	mem 20675MB
[2025-04-03 01:16:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][250/573]	eta 0:04:45 lr 0.001151	time 0.8774 (0.8844)	loss 0.6383 (0.5359)	grad_norm 2.7004 (2.8405)	mem 20675MB
[2025-04-03 01:16:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][252/573]	eta 0:04:43 lr 0.001151	time 0.8774 (0.8844)	loss 0.6044 (0.5362)	grad_norm 1.8073 (2.8395)	mem 20675MB
[2025-04-03 01:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][254/573]	eta 0:04:42 lr 0.001151	time 0.8777 (0.8843)	loss 0.5738 (0.5362)	grad_norm 1.6813 (2.8308)	mem 20675MB
[2025-04-03 01:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][256/573]	eta 0:04:40 lr 0.001151	time 0.8774 (0.8843)	loss 0.5251 (0.5363)	grad_norm 2.4386 (2.8321)	mem 20675MB
[2025-04-03 01:16:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][258/573]	eta 0:04:38 lr 0.001151	time 0.8772 (0.8842)	loss 0.5406 (0.5363)	grad_norm 2.0459 (2.8247)	mem 20675MB
[2025-04-03 01:16:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][260/573]	eta 0:04:36 lr 0.001151	time 0.8772 (0.8842)	loss 0.4664 (0.5364)	grad_norm 3.3863 (2.8253)	mem 20675MB
[2025-04-03 01:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][262/573]	eta 0:04:34 lr 0.001151	time 0.8777 (0.8841)	loss 0.5167 (0.5365)	grad_norm 2.4563 (2.8244)	mem 20675MB
[2025-04-03 01:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][264/573]	eta 0:04:33 lr 0.001151	time 0.8773 (0.8841)	loss 0.5173 (0.5360)	grad_norm 2.0806 (2.8222)	mem 20675MB
[2025-04-03 01:16:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][266/573]	eta 0:04:31 lr 0.001150	time 0.8773 (0.8840)	loss 0.5795 (0.5363)	grad_norm 2.5870 (2.8197)	mem 20675MB
[2025-04-03 01:16:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][268/573]	eta 0:04:29 lr 0.001150	time 0.8775 (0.8840)	loss 0.5193 (0.5363)	grad_norm 3.4964 (2.8192)	mem 20675MB
[2025-04-03 01:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][270/573]	eta 0:04:27 lr 0.001150	time 0.8773 (0.8840)	loss 0.5273 (0.5362)	grad_norm 2.2958 (2.8148)	mem 20675MB
[2025-04-03 01:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][272/573]	eta 0:04:26 lr 0.001150	time 0.8775 (0.8839)	loss 0.3615 (0.5353)	grad_norm 3.9786 (2.8218)	mem 20675MB
[2025-04-03 01:17:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][274/573]	eta 0:04:24 lr 0.001150	time 0.8774 (0.8839)	loss 0.4011 (0.5342)	grad_norm 3.3158 (2.8278)	mem 20675MB
[2025-04-03 01:17:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][276/573]	eta 0:04:22 lr 0.001150	time 0.8778 (0.8838)	loss 0.4339 (0.5336)	grad_norm 4.6364 (2.8437)	mem 20675MB
[2025-04-03 01:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][278/573]	eta 0:04:20 lr 0.001150	time 0.8774 (0.8838)	loss 0.5908 (0.5340)	grad_norm 4.2406 (2.8501)	mem 20675MB
[2025-04-03 01:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][280/573]	eta 0:04:18 lr 0.001150	time 0.8775 (0.8838)	loss 0.4488 (0.5337)	grad_norm 2.4505 (2.8480)	mem 20675MB
[2025-04-03 01:17:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][282/573]	eta 0:04:17 lr 0.001149	time 0.8771 (0.8837)	loss 0.5516 (0.5340)	grad_norm 2.8204 (2.8470)	mem 20675MB
[2025-04-03 01:17:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][284/573]	eta 0:04:15 lr 0.001149	time 0.8774 (0.8837)	loss 0.4339 (0.5338)	grad_norm 2.4419 (2.8472)	mem 20675MB
[2025-04-03 01:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][286/573]	eta 0:04:13 lr 0.001149	time 0.8777 (0.8836)	loss 0.5991 (0.5336)	grad_norm 2.0947 (2.8450)	mem 20675MB
[2025-04-03 01:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][288/573]	eta 0:04:11 lr 0.001149	time 0.8775 (0.8836)	loss 0.5786 (0.5339)	grad_norm 2.2404 (2.8411)	mem 20675MB
[2025-04-03 01:17:18 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][290/573]	eta 0:04:10 lr 0.001149	time 0.8774 (0.8836)	loss 0.6024 (0.5339)	grad_norm 2.8502 (2.8418)	mem 20675MB
[2025-04-03 01:17:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][292/573]	eta 0:04:08 lr 0.001149	time 0.8771 (0.8835)	loss 0.4095 (0.5335)	grad_norm 2.3644 (2.8377)	mem 20675MB
[2025-04-03 01:17:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][294/573]	eta 0:04:06 lr 0.001149	time 0.8771 (0.8835)	loss 0.4573 (0.5335)	grad_norm 2.7570 (2.8388)	mem 20675MB
[2025-04-03 01:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][296/573]	eta 0:04:04 lr 0.001149	time 0.8773 (0.8835)	loss 0.4962 (0.5332)	grad_norm 2.3263 (2.8400)	mem 20675MB
[2025-04-03 01:17:25 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][298/573]	eta 0:04:02 lr 0.001148	time 0.8771 (0.8834)	loss 0.5864 (0.5338)	grad_norm 2.6439 (2.8396)	mem 20675MB
[2025-04-03 01:17:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][300/573]	eta 0:04:01 lr 0.001148	time 0.8774 (0.8834)	loss 0.4864 (0.5339)	grad_norm 2.3026 (2.8431)	mem 20675MB
[2025-04-03 01:17:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][302/573]	eta 0:03:59 lr 0.001148	time 0.8775 (0.8834)	loss 0.5675 (0.5338)	grad_norm 2.9833 (2.8476)	mem 20675MB
[2025-04-03 01:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][304/573]	eta 0:03:57 lr 0.001148	time 0.8772 (0.8833)	loss 0.4862 (0.5337)	grad_norm 1.7777 (2.8402)	mem 20675MB
[2025-04-03 01:17:32 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][306/573]	eta 0:03:55 lr 0.001148	time 0.8772 (0.8833)	loss 0.4741 (0.5336)	grad_norm 3.5871 (2.8409)	mem 20675MB
[2025-04-03 01:17:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][308/573]	eta 0:03:54 lr 0.001148	time 0.8771 (0.8833)	loss 0.5756 (0.5338)	grad_norm 3.8067 (2.8417)	mem 20675MB
[2025-04-03 01:17:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][310/573]	eta 0:03:52 lr 0.001148	time 0.8775 (0.8832)	loss 0.5309 (0.5333)	grad_norm 3.4150 (2.8445)	mem 20675MB
[2025-04-03 01:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][312/573]	eta 0:03:50 lr 0.001148	time 0.8775 (0.8832)	loss 0.6396 (0.5337)	grad_norm 2.8937 (2.8465)	mem 20675MB
[2025-04-03 01:17:39 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][314/573]	eta 0:03:48 lr 0.001147	time 0.8773 (0.8832)	loss 0.6984 (0.5340)	grad_norm 4.4782 (2.8493)	mem 20675MB
[2025-04-03 01:17:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][316/573]	eta 0:03:46 lr 0.001147	time 0.8771 (0.8831)	loss 0.5311 (0.5342)	grad_norm 2.8037 (2.8494)	mem 20675MB
[2025-04-03 01:17:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][318/573]	eta 0:03:45 lr 0.001147	time 0.8774 (0.8831)	loss 0.4918 (0.5338)	grad_norm 2.5529 (2.8499)	mem 20675MB
[2025-04-03 01:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][320/573]	eta 0:03:43 lr 0.001147	time 0.8775 (0.8831)	loss 0.5646 (0.5335)	grad_norm 2.8931 (2.8513)	mem 20675MB
[2025-04-03 01:17:46 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][322/573]	eta 0:03:41 lr 0.001147	time 0.8773 (0.8830)	loss 0.5525 (0.5332)	grad_norm 2.7511 (2.8541)	mem 20675MB
[2025-04-03 01:17:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][324/573]	eta 0:03:39 lr 0.001147	time 0.8775 (0.8830)	loss 0.5875 (0.5334)	grad_norm 3.8608 (2.8565)	mem 20675MB
[2025-04-03 01:17:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][326/573]	eta 0:03:38 lr 0.001147	time 0.8773 (0.8830)	loss 0.4005 (0.5329)	grad_norm 3.0119 (2.8602)	mem 20675MB
[2025-04-03 01:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][328/573]	eta 0:03:36 lr 0.001147	time 0.8774 (0.8829)	loss 0.5898 (0.5330)	grad_norm 3.1670 (2.8608)	mem 20675MB
[2025-04-03 01:17:53 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][330/573]	eta 0:03:34 lr 0.001146	time 0.8781 (0.8829)	loss 0.4756 (0.5332)	grad_norm 3.0400 (2.8718)	mem 20675MB
[2025-04-03 01:17:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][332/573]	eta 0:03:32 lr 0.001146	time 0.8773 (0.8829)	loss 0.5203 (0.5332)	grad_norm 2.6674 (2.8697)	mem 20675MB
[2025-04-03 01:17:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][334/573]	eta 0:03:31 lr 0.001146	time 0.8772 (0.8829)	loss 0.5723 (0.5337)	grad_norm 2.3961 (2.8724)	mem 20675MB
[2025-04-03 01:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][336/573]	eta 0:03:29 lr 0.001146	time 0.8779 (0.8828)	loss 0.4747 (0.5337)	grad_norm 2.7584 (2.8718)	mem 20675MB
[2025-04-03 01:18:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][338/573]	eta 0:03:27 lr 0.001146	time 0.8773 (0.8828)	loss 0.5986 (0.5341)	grad_norm 2.5149 (2.8670)	mem 20675MB
[2025-04-03 01:18:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][340/573]	eta 0:03:25 lr 0.001146	time 0.8775 (0.8828)	loss 0.5963 (0.5345)	grad_norm 1.5880 (2.8635)	mem 20675MB
[2025-04-03 01:18:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][342/573]	eta 0:03:23 lr 0.001146	time 0.8771 (0.8828)	loss 0.4380 (0.5342)	grad_norm 3.1702 (2.8629)	mem 20675MB
[2025-04-03 01:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][344/573]	eta 0:03:22 lr 0.001146	time 0.8775 (0.8827)	loss 0.5587 (0.5340)	grad_norm 3.9866 (2.8659)	mem 20675MB
[2025-04-03 01:18:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][346/573]	eta 0:03:20 lr 0.001145	time 0.8775 (0.8827)	loss 0.6967 (0.5347)	grad_norm 3.9564 (2.8660)	mem 20675MB
[2025-04-03 01:18:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][348/573]	eta 0:03:18 lr 0.001145	time 0.8778 (0.8827)	loss 0.5708 (0.5350)	grad_norm 2.2364 (2.8632)	mem 20675MB
[2025-04-03 01:18:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][350/573]	eta 0:03:16 lr 0.001145	time 0.8774 (0.8827)	loss 0.4463 (0.5346)	grad_norm 3.0011 (2.8603)	mem 20675MB
[2025-04-03 01:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][352/573]	eta 0:03:15 lr 0.001145	time 0.8773 (0.8826)	loss 0.5140 (0.5344)	grad_norm 2.4947 (2.8615)	mem 20675MB
[2025-04-03 01:18:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][354/573]	eta 0:03:13 lr 0.001145	time 0.8774 (0.8826)	loss 0.6024 (0.5345)	grad_norm 2.7090 (2.8641)	mem 20675MB
[2025-04-03 01:18:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][356/573]	eta 0:03:11 lr 0.001145	time 0.8771 (0.8826)	loss 0.4437 (0.5345)	grad_norm 4.0759 (2.8661)	mem 20675MB
[2025-04-03 01:18:18 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][358/573]	eta 0:03:09 lr 0.001145	time 0.8771 (0.8826)	loss 0.5363 (0.5344)	grad_norm 2.0781 (2.8628)	mem 20675MB
[2025-04-03 01:18:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][360/573]	eta 0:03:07 lr 0.001145	time 0.8773 (0.8825)	loss 0.6377 (0.5348)	grad_norm 2.2778 (2.8590)	mem 20675MB
[2025-04-03 01:18:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][362/573]	eta 0:03:06 lr 0.001144	time 0.8776 (0.8825)	loss 0.4952 (0.5346)	grad_norm 4.2402 (2.8607)	mem 20675MB
[2025-04-03 01:18:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][364/573]	eta 0:03:04 lr 0.001144	time 0.8775 (0.8825)	loss 0.6603 (0.5351)	grad_norm 2.1555 (2.8561)	mem 20675MB
[2025-04-03 01:18:25 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][366/573]	eta 0:03:02 lr 0.001144	time 0.8773 (0.8825)	loss 0.5964 (0.5353)	grad_norm 2.2700 (2.8510)	mem 20675MB
[2025-04-03 01:18:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][368/573]	eta 0:03:00 lr 0.001144	time 0.8773 (0.8824)	loss 0.5966 (0.5356)	grad_norm 1.5712 (2.8455)	mem 20675MB
[2025-04-03 01:18:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][370/573]	eta 0:02:59 lr 0.001144	time 0.8773 (0.8824)	loss 0.6455 (0.5361)	grad_norm 1.0574 (2.8376)	mem 20675MB
[2025-04-03 01:18:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][372/573]	eta 0:02:57 lr 0.001144	time 0.8771 (0.8824)	loss 0.5889 (0.5363)	grad_norm 1.4954 (2.8328)	mem 20675MB
[2025-04-03 01:18:32 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][374/573]	eta 0:02:55 lr 0.001144	time 0.8772 (0.8824)	loss 0.6091 (0.5366)	grad_norm 3.3449 (2.8301)	mem 20675MB
[2025-04-03 01:18:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][376/573]	eta 0:02:53 lr 0.001144	time 0.8772 (0.8823)	loss 0.6005 (0.5368)	grad_norm 2.5937 (2.8268)	mem 20675MB
[2025-04-03 01:18:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][378/573]	eta 0:02:52 lr 0.001143	time 0.8773 (0.8823)	loss 0.6089 (0.5368)	grad_norm 2.0927 (2.8269)	mem 20675MB
[2025-04-03 01:18:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][380/573]	eta 0:02:50 lr 0.001143	time 0.8773 (0.8823)	loss 0.4302 (0.5364)	grad_norm 2.5191 (2.8265)	mem 20675MB
[2025-04-03 01:18:39 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][382/573]	eta 0:02:48 lr 0.001143	time 0.8773 (0.8823)	loss 0.6264 (0.5366)	grad_norm 2.9823 (2.8314)	mem 20675MB
[2025-04-03 01:18:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][384/573]	eta 0:02:46 lr 0.001143	time 0.8775 (0.8823)	loss 0.5600 (0.5366)	grad_norm 2.9424 (2.8315)	mem 20675MB
[2025-04-03 01:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][386/573]	eta 0:02:44 lr 0.001143	time 0.8771 (0.8822)	loss 0.5975 (0.5371)	grad_norm 2.0921 (2.8305)	mem 20675MB
[2025-04-03 01:18:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][388/573]	eta 0:02:43 lr 0.001143	time 0.8774 (0.8822)	loss 0.6272 (0.5370)	grad_norm 2.8248 (2.8346)	mem 20675MB
[2025-04-03 01:18:46 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][390/573]	eta 0:02:41 lr 0.001143	time 0.8771 (0.8822)	loss 0.5487 (0.5370)	grad_norm 2.1686 (2.8331)	mem 20675MB
[2025-04-03 01:18:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][392/573]	eta 0:02:39 lr 0.001143	time 0.8774 (0.8822)	loss 0.5756 (0.5370)	grad_norm 1.4605 (2.8296)	mem 20675MB
[2025-04-03 01:18:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][394/573]	eta 0:02:37 lr 0.001142	time 0.8771 (0.8822)	loss 0.6096 (0.5374)	grad_norm 2.1690 (2.8260)	mem 20675MB
[2025-04-03 01:18:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][396/573]	eta 0:02:36 lr 0.001142	time 0.8773 (0.8821)	loss 0.4172 (0.5369)	grad_norm 2.4440 (2.8245)	mem 20675MB
[2025-04-03 01:18:53 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][398/573]	eta 0:02:34 lr 0.001142	time 0.8778 (0.8821)	loss 0.5608 (0.5372)	grad_norm 2.6165 (2.8215)	mem 20675MB
[2025-04-03 01:18:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][400/573]	eta 0:02:32 lr 0.001142	time 0.8772 (0.8821)	loss 0.5543 (0.5371)	grad_norm 2.1778 (2.8187)	mem 20675MB
[2025-04-03 01:18:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][402/573]	eta 0:02:30 lr 0.001142	time 0.8773 (0.8821)	loss 0.4320 (0.5368)	grad_norm 2.5030 (2.8168)	mem 20675MB
[2025-04-03 01:18:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][404/573]	eta 0:02:29 lr 0.001142	time 0.8773 (0.8821)	loss 0.4293 (0.5366)	grad_norm 4.9180 (2.8237)	mem 20675MB
[2025-04-03 01:19:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][406/573]	eta 0:02:27 lr 0.001142	time 0.8772 (0.8820)	loss 0.4301 (0.5363)	grad_norm 2.4420 (2.8285)	mem 20675MB
[2025-04-03 01:19:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][408/573]	eta 0:02:25 lr 0.001142	time 0.8775 (0.8820)	loss 0.5973 (0.5365)	grad_norm 3.2010 (2.8335)	mem 20675MB
[2025-04-03 01:19:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][410/573]	eta 0:02:23 lr 0.001141	time 0.8774 (0.8820)	loss 0.4992 (0.5366)	grad_norm 2.1600 (2.8332)	mem 20675MB
[2025-04-03 01:19:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][412/573]	eta 0:02:21 lr 0.001141	time 0.8773 (0.8820)	loss 0.5041 (0.5366)	grad_norm 2.3079 (2.8329)	mem 20675MB
[2025-04-03 01:19:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][414/573]	eta 0:02:20 lr 0.001141	time 0.8773 (0.8820)	loss 0.4256 (0.5361)	grad_norm 3.7600 (2.8354)	mem 20675MB
[2025-04-03 01:19:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][416/573]	eta 0:02:18 lr 0.001141	time 0.8771 (0.8819)	loss 0.4956 (0.5361)	grad_norm 2.7546 (2.8326)	mem 20675MB
[2025-04-03 01:19:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][418/573]	eta 0:02:16 lr 0.001141	time 0.8773 (0.8819)	loss 0.5410 (0.5364)	grad_norm 3.8029 (2.8341)	mem 20675MB
[2025-04-03 01:19:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][420/573]	eta 0:02:14 lr 0.001141	time 0.8774 (0.8819)	loss 0.5257 (0.5362)	grad_norm 2.2001 (2.8338)	mem 20675MB
[2025-04-03 01:19:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][422/573]	eta 0:02:13 lr 0.001141	time 0.8773 (0.8819)	loss 0.5717 (0.5362)	grad_norm 4.7324 (2.8371)	mem 20675MB
[2025-04-03 01:19:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][424/573]	eta 0:02:11 lr 0.001140	time 0.8774 (0.8819)	loss 0.6785 (0.5365)	grad_norm 3.5329 (2.8385)	mem 20675MB
[2025-04-03 01:19:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][426/573]	eta 0:02:09 lr 0.001140	time 0.8772 (0.8819)	loss 0.5736 (0.5366)	grad_norm 2.6027 (2.8361)	mem 20675MB
[2025-04-03 01:19:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][428/573]	eta 0:02:07 lr 0.001140	time 0.8772 (0.8818)	loss 0.5771 (0.5368)	grad_norm 2.0442 (2.8318)	mem 20675MB
[2025-04-03 01:19:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][430/573]	eta 0:02:06 lr 0.001140	time 0.8774 (0.8818)	loss 0.5843 (0.5370)	grad_norm 2.1928 (2.8297)	mem 20675MB
[2025-04-03 01:19:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][432/573]	eta 0:02:04 lr 0.001140	time 0.8774 (0.8818)	loss 0.5306 (0.5368)	grad_norm 1.6299 (2.8260)	mem 20675MB
[2025-04-03 01:19:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][434/573]	eta 0:02:02 lr 0.001140	time 0.8774 (0.8818)	loss 0.6199 (0.5368)	grad_norm 1.6736 (2.8234)	mem 20675MB
[2025-04-03 01:19:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][436/573]	eta 0:02:00 lr 0.001140	time 0.8773 (0.8818)	loss 0.6214 (0.5368)	grad_norm 1.6969 (2.8209)	mem 20675MB
[2025-04-03 01:19:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][438/573]	eta 0:01:59 lr 0.001140	time 0.8775 (0.8817)	loss 0.4762 (0.5368)	grad_norm 3.1335 (2.8204)	mem 20675MB
[2025-04-03 01:19:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][440/573]	eta 0:01:57 lr 0.001139	time 0.8772 (0.8817)	loss 0.5549 (0.5371)	grad_norm 3.3802 (2.8199)	mem 20675MB
[2025-04-03 01:19:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][442/573]	eta 0:01:55 lr 0.001139	time 0.8773 (0.8817)	loss 0.5666 (0.5372)	grad_norm 2.3480 (2.8181)	mem 20675MB
[2025-04-03 01:19:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][444/573]	eta 0:01:53 lr 0.001139	time 0.8772 (0.8817)	loss 0.5464 (0.5371)	grad_norm 2.0700 (2.8150)	mem 20675MB
[2025-04-03 01:19:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][446/573]	eta 0:01:51 lr 0.001139	time 0.8771 (0.8817)	loss 0.4252 (0.5369)	grad_norm 2.5194 (2.8146)	mem 20675MB
[2025-04-03 01:19:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][448/573]	eta 0:01:50 lr 0.001139	time 0.8773 (0.8817)	loss 0.5989 (0.5369)	grad_norm 2.2539 (2.8131)	mem 20675MB
[2025-04-03 01:19:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][450/573]	eta 0:01:48 lr 0.001139	time 0.8773 (0.8817)	loss 0.5067 (0.5369)	grad_norm 4.3756 (2.8190)	mem 20675MB
[2025-04-03 01:19:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][452/573]	eta 0:01:46 lr 0.001139	time 0.8773 (0.8816)	loss 0.6182 (0.5372)	grad_norm 2.8151 (2.8191)	mem 20675MB
[2025-04-03 01:19:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][454/573]	eta 0:01:44 lr 0.001139	time 0.8773 (0.8816)	loss 0.5336 (0.5373)	grad_norm 2.2584 (2.8209)	mem 20675MB
[2025-04-03 01:19:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][456/573]	eta 0:01:43 lr 0.001138	time 0.8772 (0.8816)	loss 0.5245 (0.5373)	grad_norm 2.5536 (2.8202)	mem 20675MB
[2025-04-03 01:19:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][458/573]	eta 0:01:41 lr 0.001138	time 0.8773 (0.8816)	loss 0.4164 (0.5367)	grad_norm 3.8425 (2.8250)	mem 20675MB
[2025-04-03 01:19:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][460/573]	eta 0:01:39 lr 0.001138	time 0.8771 (0.8816)	loss 0.4324 (0.5366)	grad_norm 2.8524 (2.8240)	mem 20675MB
[2025-04-03 01:19:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][462/573]	eta 0:01:37 lr 0.001138	time 0.8774 (0.8816)	loss 0.5494 (0.5364)	grad_norm 3.9537 (2.8281)	mem 20675MB
[2025-04-03 01:19:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][464/573]	eta 0:01:36 lr 0.001138	time 0.8774 (0.8815)	loss 0.6067 (0.5366)	grad_norm 2.8829 (2.8274)	mem 20675MB
[2025-04-03 01:19:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][466/573]	eta 0:01:34 lr 0.001138	time 0.8770 (0.8815)	loss 0.5122 (0.5365)	grad_norm 3.0502 (2.8281)	mem 20675MB
[2025-04-03 01:19:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][468/573]	eta 0:01:32 lr 0.001138	time 0.8772 (0.8815)	loss 0.4868 (0.5363)	grad_norm 2.3939 (2.8278)	mem 20675MB
[2025-04-03 01:19:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][470/573]	eta 0:01:30 lr 0.001137	time 0.8770 (0.8815)	loss 0.5090 (0.5359)	grad_norm 2.5849 (2.8337)	mem 20675MB
[2025-04-03 01:19:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][472/573]	eta 0:01:29 lr 0.001137	time 0.8773 (0.8815)	loss 0.5845 (0.5356)	grad_norm 2.4198 (2.8376)	mem 20675MB
[2025-04-03 01:20:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][474/573]	eta 0:01:27 lr 0.001137	time 0.8770 (0.8815)	loss 0.4420 (0.5354)	grad_norm 2.6332 (2.8382)	mem 20675MB
[2025-04-03 01:20:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][476/573]	eta 0:01:25 lr 0.001137	time 0.8770 (0.8815)	loss 0.5639 (0.5356)	grad_norm 2.9373 (2.8398)	mem 20675MB
[2025-04-03 01:20:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][478/573]	eta 0:01:23 lr 0.001137	time 0.8772 (0.8814)	loss 0.5981 (0.5358)	grad_norm 1.7572 (2.8369)	mem 20675MB
[2025-04-03 01:20:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][480/573]	eta 0:01:21 lr 0.001137	time 0.8774 (0.8814)	loss 0.3885 (0.5356)	grad_norm 2.8001 (2.8360)	mem 20675MB
[2025-04-03 01:20:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][482/573]	eta 0:01:20 lr 0.001137	time 0.8770 (0.8814)	loss 0.4564 (0.5355)	grad_norm 3.4966 (2.8385)	mem 20675MB
[2025-04-03 01:20:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][484/573]	eta 0:01:18 lr 0.001137	time 0.8770 (0.8814)	loss 0.5299 (0.5357)	grad_norm 2.0043 (2.8355)	mem 20675MB
[2025-04-03 01:20:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][486/573]	eta 0:01:16 lr 0.001136	time 0.8771 (0.8814)	loss 0.5328 (0.5358)	grad_norm 2.3370 (2.8328)	mem 20675MB
[2025-04-03 01:20:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][488/573]	eta 0:01:14 lr 0.001136	time 0.8771 (0.8814)	loss 0.4164 (0.5358)	grad_norm 3.6537 (2.8343)	mem 20675MB
[2025-04-03 01:20:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][490/573]	eta 0:01:13 lr 0.001136	time 0.8773 (0.8814)	loss 0.4949 (0.5354)	grad_norm 2.1655 (2.8334)	mem 20675MB
[2025-04-03 01:20:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][492/573]	eta 0:01:11 lr 0.001136	time 0.8773 (0.8813)	loss 0.5661 (0.5356)	grad_norm 2.7526 (2.8326)	mem 20675MB
[2025-04-03 01:20:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][494/573]	eta 0:01:09 lr 0.001136	time 0.8775 (0.8813)	loss 0.5351 (0.5354)	grad_norm 2.2485 (2.8293)	mem 20675MB
[2025-04-03 01:20:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][496/573]	eta 0:01:07 lr 0.001136	time 0.8773 (0.8813)	loss 0.5238 (0.5356)	grad_norm 2.1760 (2.8284)	mem 20675MB
[2025-04-03 01:20:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][498/573]	eta 0:01:06 lr 0.001136	time 0.8771 (0.8813)	loss 0.4311 (0.5355)	grad_norm 2.6450 (2.8272)	mem 20675MB
[2025-04-03 01:20:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][500/573]	eta 0:01:04 lr 0.001136	time 0.8774 (0.8813)	loss 0.5717 (0.5354)	grad_norm 2.4751 (2.8278)	mem 20675MB
[2025-04-03 01:20:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][502/573]	eta 0:01:02 lr 0.001135	time 0.8772 (0.8813)	loss 0.5366 (0.5355)	grad_norm 2.1209 (2.8254)	mem 20675MB
[2025-04-03 01:20:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][504/573]	eta 0:01:00 lr 0.001135	time 0.8775 (0.8813)	loss 0.4541 (0.5356)	grad_norm 4.7465 (2.8310)	mem 20675MB
[2025-04-03 01:20:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][506/573]	eta 0:00:59 lr 0.001135	time 0.8774 (0.8813)	loss 0.5168 (0.5355)	grad_norm 2.8540 (2.8297)	mem 20675MB
[2025-04-03 01:20:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][508/573]	eta 0:00:57 lr 0.001135	time 0.8772 (0.8812)	loss 0.5578 (0.5354)	grad_norm 2.5319 (2.8290)	mem 20675MB
[2025-04-03 01:20:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][510/573]	eta 0:00:55 lr 0.001135	time 0.8772 (0.8812)	loss 0.5331 (0.5354)	grad_norm 2.9249 (2.8298)	mem 20675MB
[2025-04-03 01:20:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][512/573]	eta 0:00:53 lr 0.001135	time 0.8775 (0.8812)	loss 0.5648 (0.5357)	grad_norm 1.6542 (2.8296)	mem 20675MB
[2025-04-03 01:20:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][514/573]	eta 0:00:51 lr 0.001135	time 0.8775 (0.8812)	loss 0.3598 (0.5355)	grad_norm 3.2476 (2.8293)	mem 20675MB
[2025-04-03 01:20:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][516/573]	eta 0:00:50 lr 0.001134	time 0.8773 (0.8812)	loss 0.3933 (0.5353)	grad_norm 3.1940 (2.8283)	mem 20675MB
[2025-04-03 01:20:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][518/573]	eta 0:00:48 lr 0.001134	time 0.8775 (0.8812)	loss 0.6478 (0.5357)	grad_norm 2.3896 (2.8254)	mem 20675MB
[2025-04-03 01:20:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][520/573]	eta 0:00:46 lr 0.001134	time 0.8773 (0.8812)	loss 0.6420 (0.5360)	grad_norm 1.9740 (2.8227)	mem 20675MB
[2025-04-03 01:20:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][522/573]	eta 0:00:44 lr 0.001134	time 0.8772 (0.8812)	loss 0.5485 (0.5360)	grad_norm 2.2407 (2.8193)	mem 20675MB
[2025-04-03 01:20:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][524/573]	eta 0:00:43 lr 0.001134	time 0.8774 (0.8812)	loss 0.5582 (0.5360)	grad_norm 2.5824 (2.8180)	mem 20675MB
[2025-04-03 01:20:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][526/573]	eta 0:00:41 lr 0.001134	time 0.8773 (0.8811)	loss 0.5493 (0.5361)	grad_norm 2.3564 (2.8146)	mem 20675MB
[2025-04-03 01:20:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][528/573]	eta 0:00:39 lr 0.001134	time 0.8776 (0.8811)	loss 0.6191 (0.5362)	grad_norm 1.3568 (2.8104)	mem 20675MB
[2025-04-03 01:20:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][530/573]	eta 0:00:37 lr 0.001134	time 0.8773 (0.8811)	loss 0.5019 (0.5361)	grad_norm 2.0420 (2.8069)	mem 20675MB
[2025-04-03 01:20:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][532/573]	eta 0:00:36 lr 0.001133	time 0.8775 (0.8811)	loss 0.4968 (0.5358)	grad_norm 2.8012 (2.8066)	mem 20675MB
[2025-04-03 01:20:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][534/573]	eta 0:00:34 lr 0.001133	time 0.8772 (0.8811)	loss 0.5913 (0.5356)	grad_norm 3.2217 (2.8073)	mem 20675MB
[2025-04-03 01:20:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][536/573]	eta 0:00:32 lr 0.001133	time 0.8773 (0.8811)	loss 0.5834 (0.5354)	grad_norm 2.4164 (2.8068)	mem 20675MB
[2025-04-03 01:20:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][538/573]	eta 0:00:30 lr 0.001133	time 0.8774 (0.8811)	loss 0.6660 (0.5358)	grad_norm 3.3826 (2.8070)	mem 20675MB
[2025-04-03 01:20:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][540/573]	eta 0:00:29 lr 0.001133	time 0.8772 (0.8811)	loss 0.5746 (0.5360)	grad_norm 2.8233 (2.8065)	mem 20675MB
[2025-04-03 01:20:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][542/573]	eta 0:00:27 lr 0.001133	time 0.8771 (0.8811)	loss 0.4651 (0.5361)	grad_norm 2.7889 (2.8059)	mem 20675MB
[2025-04-03 01:21:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][544/573]	eta 0:00:25 lr 0.001133	time 0.8772 (0.8810)	loss 0.5678 (0.5358)	grad_norm 1.9602 (2.8062)	mem 20675MB
[2025-04-03 01:21:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][546/573]	eta 0:00:23 lr 0.001132	time 0.8775 (0.8810)	loss 0.4460 (0.5358)	grad_norm 3.2202 (2.8059)	mem 20675MB
[2025-04-03 01:21:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][548/573]	eta 0:00:22 lr 0.001132	time 0.8776 (0.8810)	loss 0.6111 (0.5359)	grad_norm 1.7891 (2.8061)	mem 20675MB
[2025-04-03 01:21:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][550/573]	eta 0:00:20 lr 0.001132	time 0.8772 (0.8810)	loss 0.6242 (0.5361)	grad_norm 3.1366 (2.8053)	mem 20675MB
[2025-04-03 01:21:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][552/573]	eta 0:00:18 lr 0.001132	time 0.8771 (0.8810)	loss 0.3926 (0.5358)	grad_norm 3.6517 (2.8062)	mem 20675MB
[2025-04-03 01:21:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][554/573]	eta 0:00:16 lr 0.001132	time 0.8771 (0.8810)	loss 0.4663 (0.5358)	grad_norm 2.6440 (2.8040)	mem 20675MB
[2025-04-03 01:21:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][556/573]	eta 0:00:14 lr 0.001132	time 0.8771 (0.8810)	loss 0.6048 (0.5357)	grad_norm 2.9276 (2.8056)	mem 20675MB
[2025-04-03 01:21:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][558/573]	eta 0:00:13 lr 0.001132	time 0.8768 (0.8810)	loss 0.5030 (0.5359)	grad_norm 2.4108 (2.8052)	mem 20675MB
[2025-04-03 01:21:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][560/573]	eta 0:00:11 lr 0.001132	time 0.8771 (0.8810)	loss 0.4817 (0.5358)	grad_norm 3.3944 (2.8069)	mem 20675MB
[2025-04-03 01:21:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][562/573]	eta 0:00:09 lr 0.001131	time 0.8772 (0.8809)	loss 0.4648 (0.5357)	grad_norm 4.9919 (2.8119)	mem 20675MB
[2025-04-03 01:21:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][564/573]	eta 0:00:07 lr 0.001131	time 0.8770 (0.8809)	loss 0.5942 (0.5357)	grad_norm 2.3579 (2.8117)	mem 20675MB
[2025-04-03 01:21:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][566/573]	eta 0:00:06 lr 0.001131	time 0.8770 (0.8809)	loss 0.5107 (0.5355)	grad_norm 3.8724 (2.8141)	mem 20675MB
[2025-04-03 01:21:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][568/573]	eta 0:00:04 lr 0.001131	time 0.8770 (0.8809)	loss 0.4637 (0.5355)	grad_norm 2.8999 (2.8137)	mem 20675MB
[2025-04-03 01:21:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][570/573]	eta 0:00:02 lr 0.001131	time 0.8772 (0.8809)	loss 0.4680 (0.5354)	grad_norm 2.5334 (2.8116)	mem 20675MB
[2025-04-03 01:21:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][572/573]	eta 0:00:00 lr 0.001131	time 0.8772 (0.8809)	loss 0.4349 (0.5350)	grad_norm 2.7775 (2.8119)	mem 20675MB
[2025-04-03 01:21:26 simmim_finetune] (main_finetune.py 260): INFO EPOCH 5 training takes 0:08:24
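Each training line above reports an `eta` alongside a per-step time and its running average in parentheses. The ETA is consistent with multiplying the running-average step time by the steps remaining; a minimal sketch of that arithmetic (the function name is hypothetical, not the script's actual code):

```python
import datetime

def format_eta(avg_step_time: float, current_idx: int, num_steps: int) -> str:
    """Estimate remaining epoch time from the running-average step time."""
    remaining = num_steps - current_idx - 1  # steps still to run after this one
    return str(datetime.timedelta(seconds=int(avg_step_time * remaining)))

# Values from the log: running-average time 0.8828 s at iter 342 of 573
print(format_eta(0.8828, 342, 573))  # -> 0:03:23, matching the logged eta
```

The same average also explains the epoch wall time: 573 steps at ~0.8809 s each is ~505 s, in line with the logged 0:08:24.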
[2025-04-03 01:21:26 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt5.pth saving......
[2025-04-03 01:21:29 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt5.pth saved !!!
[2025-04-03 01:21:31 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.890 (1.890)	Loss 0.4556 (0.4556)	Acc@1 78.906 (78.906)	Mem 20675MB
[2025-04-03 01:21:31 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.819)	Loss 0.4100 (0.4253)	Acc@1 81.250 (80.990)	Mem 20675MB
[2025-04-03 01:21:32 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.605)	Loss 0.4459 (0.4309)	Acc@1 78.906 (80.156)	Mem 20675MB
[2025-04-03 01:21:32 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.513)	Loss 0.4603 (0.4310)	Acc@1 78.906 (80.692)	Mem 20675MB
[2025-04-03 01:21:33 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.462)	Loss 0.7084 (0.4649)	Acc@1 57.031 (77.604)	Mem 20675MB
[2025-04-03 01:21:33 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.430)	Loss 0.5972 (0.4927)	Acc@1 72.656 (75.994)	Mem 20675MB
[2025-04-03 01:21:34 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.407)	Loss 0.6259 (0.5140)	Acc@1 63.281 (74.219)	Mem 20675MB
[2025-04-03 01:21:35 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.391)	Loss 0.5673 (0.5221)	Acc@1 74.219 (73.958)	Mem 20675MB
[2025-04-03 01:21:35 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 73.992
[2025-04-03 01:21:35 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 74.0%
[2025-04-03 01:21:35 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 73.99%
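The validation loop runs 16 batches over 1984 images (15 full batches of 128 plus a final batch of 64), and the reported overall Acc@1 of 73.992 is a sample-weighted running average rather than a plain mean of per-batch accuracies. A sketch of the usual meter pattern, assuming the common Swin/SimMIM-style implementation (the batch accuracies below are illustrative, not taken from the log):

```python
class AverageMeter:
    """Running average weighted by sample count, as used to aggregate
    per-batch Acc@1 into the final score. A sketch, not the project's
    exact implementation."""
    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, val: float, n: int = 1):
        self.sum += val * n
        self.count += n

    @property
    def avg(self) -> float:
        return self.sum / self.count

# A full batch of 128 and a final partial batch of 64 (illustrative values)
meter = AverageMeter()
for batch_size, acc in [(128, 78.906), (64, 70.3)]:
    meter.update(acc, batch_size)
print(f"{meter.avg:.3f}")
```

Weighting by batch size is why the final 73.992 can differ slightly from the last parenthesized running value (73.958): the short final batch contributes proportionally less.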
[2025-04-03 01:21:35 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.204300324251698e-06, 4.204300324251698e-06, 6.455306949032316e-06, 6.455306949032316e-06, 9.918394064079421e-06, 9.918394064079421e-06, 1.5246220394921124e-05, 1.5246220394921124e-05, 2.3442876288523738e-05, 2.3442876288523738e-05, 3.605311612483546e-05, 3.605311612483546e-05, 5.545348510377656e-05, 5.545348510377656e-05, 8.530020660983978e-05, 8.530020660983978e-05, 0.00013121823969609091, 0.00013121823969609091, 0.00020186136752109264, 0.00020186136752109264, 0.00031054310263647987, 0.00031054310263647987, 0.000477745772044768, 0.000477745772044768, 0.000734980648057519, 0.000734980648057519, 0.0011307266111540589, 0.0011307266111540589]
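The 28 learning rates above come in pairs (weight-decay and no-decay parameter groups at each depth) and reflect the configured LAYER_DECAY of 0.65: the top group sits at the current scheduled lr (~0.0011307, matching the `lr 0.001131` in the training lines), and each earlier block's lr is scaled down by roughly 0.65. A sketch of the layer-wise decay scheme, assuming the BEiT/SimMIM-style grouping (helper name hypothetical):

```python
def layerwise_lr_scales(num_layers: int, decay: float) -> list[float]:
    """Per-depth LR scale factors for layer-wise decay: the classification
    head gets scale 1.0, and each earlier transformer block (down to the
    patch embedding at depth 0) is multiplied by `decay` once more.
    A sketch; the real script builds optimizer parameter groups from these."""
    # num_layers transformer blocks, plus embedding (depth 0) and head
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layerwise_lr_scales(12, 0.65)  # 14 depths for the ViT-Base config
# Top group at ~0.0011307 implies the next distinct group at
# 0.0011307 * 0.65 ~= 0.000735, matching the logged values.
```

The logged ratios deviate very slightly from 0.65 at the smallest groups, consistent with the cosine scheduler's MIN_LR floor (2.5e-07) being applied on top of the scaled rates.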
[2025-04-03 01:21:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][0/573]	eta 0:22:45 lr 0.001131	time 2.3828 (2.3828)	loss 0.5982 (0.5982)	grad_norm 2.6041 (2.6041)	mem 20675MB
[2025-04-03 01:21:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][2/573]	eta 0:13:07 lr 0.001131	time 0.8771 (1.3798)	loss 0.4192 (0.5160)	grad_norm 3.1489 (2.8089)	mem 20675MB
[2025-04-03 01:21:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][4/573]	eta 0:11:10 lr 0.001130	time 0.8775 (1.1792)	loss 0.5912 (0.5501)	grad_norm 1.9354 (2.6453)	mem 20675MB
[2025-04-03 01:21:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][6/573]	eta 0:10:19 lr 0.001130	time 0.8777 (1.0933)	loss 0.6278 (0.5483)	grad_norm 3.8763 (3.0719)	mem 20675MB
[2025-04-03 01:21:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][8/573]	eta 0:09:50 lr 0.001130	time 0.8773 (1.0455)	loss 0.5212 (0.5440)	grad_norm 2.4712 (2.9423)	mem 20675MB
[2025-04-03 01:21:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][10/573]	eta 0:09:31 lr 0.001130	time 0.8770 (1.0150)	loss 0.4489 (0.5423)	grad_norm 3.2003 (2.9172)	mem 20675MB
[2025-04-03 01:21:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][12/573]	eta 0:09:17 lr 0.001130	time 0.8775 (0.9940)	loss 0.4826 (0.5426)	grad_norm 4.9739 (2.9541)	mem 20675MB
[2025-04-03 01:21:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][14/573]	eta 0:09:06 lr 0.001130	time 0.8776 (0.9785)	loss 0.5750 (0.5443)	grad_norm 2.1956 (2.9465)	mem 20675MB
[2025-04-03 01:21:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][16/573]	eta 0:08:58 lr 0.001130	time 0.8773 (0.9667)	loss 0.5625 (0.5397)	grad_norm 1.7454 (2.9128)	mem 20675MB
[2025-04-03 01:21:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][18/573]	eta 0:08:51 lr 0.001129	time 0.8775 (0.9574)	loss 0.6027 (0.5486)	grad_norm 2.1908 (2.8360)	mem 20675MB
[2025-04-03 01:21:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][20/573]	eta 0:08:45 lr 0.001129	time 0.8775 (0.9499)	loss 0.4340 (0.5455)	grad_norm 3.2440 (2.8438)	mem 20675MB
[2025-04-03 01:21:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][22/573]	eta 0:08:39 lr 0.001129	time 0.8774 (0.9437)	loss 0.6125 (0.5457)	grad_norm 4.0383 (2.8833)	mem 20675MB
[2025-04-03 01:21:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][24/573]	eta 0:08:35 lr 0.001129	time 0.8776 (0.9384)	loss 0.5953 (0.5452)	grad_norm 3.5012 (2.8872)	mem 20675MB
[2025-04-03 01:22:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][26/573]	eta 0:08:30 lr 0.001129	time 0.8773 (0.9340)	loss 0.4160 (0.5357)	grad_norm 3.2991 (2.9244)	mem 20675MB
[2025-04-03 01:22:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][28/573]	eta 0:08:26 lr 0.001129	time 0.8773 (0.9301)	loss 0.6206 (0.5351)	grad_norm 2.2509 (2.8870)	mem 20675MB
[2025-04-03 01:22:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][30/573]	eta 0:08:23 lr 0.001129	time 0.8771 (0.9268)	loss 0.5176 (0.5294)	grad_norm 2.3149 (2.8740)	mem 20675MB
[2025-04-03 01:22:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][32/573]	eta 0:08:19 lr 0.001129	time 0.8774 (0.9238)	loss 0.4366 (0.5285)	grad_norm 3.1761 (2.8641)	mem 20675MB
[2025-04-03 01:22:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][34/573]	eta 0:08:16 lr 0.001128	time 0.8771 (0.9212)	loss 0.4866 (0.5266)	grad_norm 2.7495 (2.8898)	mem 20675MB
[2025-04-03 01:22:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][36/573]	eta 0:08:13 lr 0.001128	time 0.8771 (0.9189)	loss 0.4891 (0.5283)	grad_norm 2.0122 (2.8717)	mem 20675MB
[2025-04-03 01:22:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][38/573]	eta 0:08:10 lr 0.001128	time 0.8775 (0.9168)	loss 0.4866 (0.5277)	grad_norm 3.0337 (2.9396)	mem 20675MB
[2025-04-03 01:22:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][40/573]	eta 0:08:07 lr 0.001128	time 0.8777 (0.9149)	loss 0.5476 (0.5303)	grad_norm 2.8694 (2.9306)	mem 20675MB
[2025-04-03 01:22:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][42/573]	eta 0:08:04 lr 0.001128	time 0.8775 (0.9132)	loss 0.6388 (0.5332)	grad_norm 2.4722 (2.9037)	mem 20675MB
[2025-04-03 01:22:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][44/573]	eta 0:08:02 lr 0.001128	time 0.8774 (0.9117)	loss 0.4438 (0.5302)	grad_norm 3.4394 (2.8914)	mem 20675MB
[2025-04-03 01:22:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][46/573]	eta 0:07:59 lr 0.001128	time 0.8771 (0.9103)	loss 0.5083 (0.5279)	grad_norm 2.2532 (2.8717)	mem 20675MB
[2025-04-03 01:22:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][48/573]	eta 0:07:57 lr 0.001127	time 0.8775 (0.9090)	loss 0.5101 (0.5269)	grad_norm 1.5863 (2.8365)	mem 20675MB
[2025-04-03 01:22:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][50/573]	eta 0:07:54 lr 0.001127	time 0.8773 (0.9077)	loss 0.5113 (0.5272)	grad_norm 4.4372 (2.8525)	mem 20675MB
[2025-04-03 01:22:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][52/573]	eta 0:07:52 lr 0.001127	time 0.8773 (0.9066)	loss 0.5196 (0.5271)	grad_norm 2.3066 (2.8673)	mem 20675MB
[2025-04-03 01:22:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][54/573]	eta 0:07:50 lr 0.001127	time 0.8775 (0.9056)	loss 0.5370 (0.5276)	grad_norm 2.0936 (2.8483)	mem 20675MB
[2025-04-03 01:22:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][56/573]	eta 0:07:47 lr 0.001127	time 0.8771 (0.9047)	loss 0.3703 (0.5251)	grad_norm 2.9753 (2.8571)	mem 20675MB
[2025-04-03 01:22:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][58/573]	eta 0:07:45 lr 0.001127	time 0.8770 (0.9037)	loss 0.4013 (0.5237)	grad_norm 5.2370 (2.8936)	mem 20675MB
[2025-04-03 01:22:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][60/573]	eta 0:07:43 lr 0.001127	time 0.8773 (0.9029)	loss 0.4270 (0.5233)	grad_norm 5.2784 (2.9294)	mem 20675MB
[2025-04-03 01:22:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][62/573]	eta 0:07:40 lr 0.001126	time 0.8775 (0.9021)	loss 0.5038 (0.5245)	grad_norm 2.2373 (2.9143)	mem 20675MB
[2025-04-03 01:22:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][64/573]	eta 0:07:38 lr 0.001126	time 0.8792 (0.9014)	loss 0.5859 (0.5261)	grad_norm 3.0370 (2.9155)	mem 20675MB
[2025-04-03 01:22:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][66/573]	eta 0:07:36 lr 0.001126	time 0.8776 (0.9007)	loss 0.4923 (0.5265)	grad_norm 2.8137 (2.8980)	mem 20675MB
[2025-04-03 01:22:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][68/573]	eta 0:07:34 lr 0.001126	time 0.8773 (0.9001)	loss 0.5665 (0.5286)	grad_norm 1.7929 (2.8725)	mem 20675MB
[2025-04-03 01:22:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][70/573]	eta 0:07:32 lr 0.001126	time 0.8781 (0.8995)	loss 0.6412 (0.5311)	grad_norm 2.5339 (2.8639)	mem 20675MB
[2025-04-03 01:22:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][72/573]	eta 0:07:30 lr 0.001126	time 0.8777 (0.8989)	loss 0.4633 (0.5289)	grad_norm 2.3025 (2.8560)	mem 20675MB
[2025-04-03 01:22:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][74/573]	eta 0:07:28 lr 0.001126	time 0.8775 (0.8984)	loss 0.6000 (0.5301)	grad_norm 3.0129 (2.8586)	mem 20675MB
[2025-04-03 01:22:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][76/573]	eta 0:07:26 lr 0.001126	time 0.8774 (0.8978)	loss 0.6253 (0.5299)	grad_norm 2.5291 (2.8635)	mem 20675MB
[2025-04-03 01:22:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][78/573]	eta 0:07:24 lr 0.001125	time 0.8776 (0.8974)	loss 0.3941 (0.5287)	grad_norm 3.2382 (2.8620)	mem 20675MB
[2025-04-03 01:22:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][80/573]	eta 0:07:22 lr 0.001125	time 0.8776 (0.8969)	loss 0.5107 (0.5288)	grad_norm 2.4530 (2.8766)	mem 20675MB
[2025-04-03 01:22:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][82/573]	eta 0:07:20 lr 0.001125	time 0.8775 (0.8965)	loss 0.4164 (0.5277)	grad_norm 3.0920 (2.8778)	mem 20675MB
[2025-04-03 01:22:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][84/573]	eta 0:07:18 lr 0.001125	time 0.8778 (0.8960)	loss 0.5821 (0.5280)	grad_norm 2.0403 (2.8821)	mem 20675MB
[2025-04-03 01:22:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][86/573]	eta 0:07:16 lr 0.001125	time 0.8775 (0.8956)	loss 0.5304 (0.5283)	grad_norm 2.7524 (2.8841)	mem 20675MB
[2025-04-03 01:22:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][88/573]	eta 0:07:14 lr 0.001125	time 0.8774 (0.8952)	loss 0.5574 (0.5279)	grad_norm 2.1428 (2.8849)	mem 20675MB
[2025-04-03 01:22:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][90/573]	eta 0:07:12 lr 0.001125	time 0.8772 (0.8949)	loss 0.5317 (0.5281)	grad_norm 2.1187 (2.8702)	mem 20675MB
[2025-04-03 01:22:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][92/573]	eta 0:07:10 lr 0.001124	time 0.8776 (0.8945)	loss 0.4206 (0.5269)	grad_norm 2.7795 (2.8601)	mem 20675MB
[2025-04-03 01:23:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][94/573]	eta 0:07:08 lr 0.001124	time 0.8781 (0.8942)	loss 0.5864 (0.5280)	grad_norm 2.5445 (2.8541)	mem 20675MB
[2025-04-03 01:23:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][96/573]	eta 0:07:06 lr 0.001124	time 0.8773 (0.8938)	loss 0.5578 (0.5282)	grad_norm 1.8997 (2.8621)	mem 20675MB
[2025-04-03 01:23:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][98/573]	eta 0:07:04 lr 0.001124	time 0.8775 (0.8935)	loss 0.5077 (0.5277)	grad_norm 1.9058 (2.8572)	mem 20675MB
[2025-04-03 01:23:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][100/573]	eta 0:07:02 lr 0.001124	time 0.8775 (0.8932)	loss 0.5574 (0.5279)	grad_norm 3.0592 (2.8873)	mem 20675MB
[2025-04-03 01:23:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][102/573]	eta 0:07:00 lr 0.001124	time 0.8775 (0.8929)	loss 0.5325 (0.5272)	grad_norm 2.6833 (2.8952)	mem 20675MB
[2025-04-03 01:23:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][104/573]	eta 0:06:58 lr 0.001124	time 0.8775 (0.8927)	loss 0.4411 (0.5263)	grad_norm 3.1753 (2.8882)	mem 20675MB
[2025-04-03 01:23:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][106/573]	eta 0:06:56 lr 0.001123	time 0.8774 (0.8924)	loss 0.6099 (0.5270)	grad_norm 2.6921 (2.8833)	mem 20675MB
[2025-04-03 01:23:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][108/573]	eta 0:06:54 lr 0.001123	time 0.8779 (0.8921)	loss 0.4448 (0.5265)	grad_norm 4.3605 (2.9058)	mem 20675MB
[2025-04-03 01:23:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][110/573]	eta 0:06:52 lr 0.001123	time 0.8772 (0.8919)	loss 0.5026 (0.5264)	grad_norm 1.8832 (2.8861)	mem 20675MB
[2025-04-03 01:23:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][112/573]	eta 0:06:51 lr 0.001123	time 0.8771 (0.8916)	loss 0.4329 (0.5267)	grad_norm 2.3853 (2.8745)	mem 20675MB
[2025-04-03 01:23:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][114/573]	eta 0:06:49 lr 0.001123	time 0.8774 (0.8914)	loss 0.6440 (0.5282)	grad_norm 2.8092 (2.8628)	mem 20675MB
[2025-04-03 01:23:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][116/573]	eta 0:06:47 lr 0.001123	time 0.8773 (0.8912)	loss 0.5299 (0.5271)	grad_norm 2.8931 (2.8621)	mem 20675MB
[2025-04-03 01:23:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][118/573]	eta 0:06:45 lr 0.001123	time 0.8773 (0.8910)	loss 0.5741 (0.5281)	grad_norm 2.2996 (2.8564)	mem 20675MB
[2025-04-03 01:23:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][120/573]	eta 0:06:43 lr 0.001122	time 0.8777 (0.8908)	loss 0.6429 (0.5295)	grad_norm 2.2546 (2.8441)	mem 20675MB
[2025-04-03 01:23:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][122/573]	eta 0:06:41 lr 0.001122	time 0.8775 (0.8905)	loss 0.5403 (0.5303)	grad_norm 2.4998 (2.8359)	mem 20675MB
[2025-04-03 01:23:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][124/573]	eta 0:06:39 lr 0.001122	time 0.8775 (0.8904)	loss 0.6172 (0.5308)	grad_norm 2.1673 (2.8242)	mem 20675MB
[2025-04-03 01:23:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][126/573]	eta 0:06:37 lr 0.001122	time 0.8775 (0.8902)	loss 0.5690 (0.5320)	grad_norm 1.6169 (2.8070)	mem 20675MB
[2025-04-03 01:23:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][128/573]	eta 0:06:36 lr 0.001122	time 0.8775 (0.8900)	loss 0.6355 (0.5320)	grad_norm 1.3135 (2.8094)	mem 20675MB
[2025-04-03 01:23:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][130/573]	eta 0:06:34 lr 0.001122	time 0.8776 (0.8898)	loss 0.6076 (0.5325)	grad_norm 2.1862 (2.7984)	mem 20675MB
[2025-04-03 01:23:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][132/573]	eta 0:06:32 lr 0.001122	time 0.8774 (0.8896)	loss 0.4953 (0.5316)	grad_norm 2.1034 (2.7972)	mem 20675MB
[2025-04-03 01:23:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][134/573]	eta 0:06:30 lr 0.001122	time 0.8773 (0.8895)	loss 0.5516 (0.5323)	grad_norm 1.6071 (2.7862)	mem 20675MB
[2025-04-03 01:23:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][136/573]	eta 0:06:28 lr 0.001121	time 0.8776 (0.8893)	loss 0.4782 (0.5318)	grad_norm 2.4368 (2.7761)	mem 20675MB
[2025-04-03 01:23:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][138/573]	eta 0:06:26 lr 0.001121	time 0.8776 (0.8891)	loss 0.6228 (0.5328)	grad_norm 2.5625 (2.7670)	mem 20675MB
[2025-04-03 01:23:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][140/573]	eta 0:06:24 lr 0.001121	time 0.8774 (0.8890)	loss 0.6062 (0.5338)	grad_norm 2.1008 (2.7582)	mem 20675MB
[2025-04-03 01:23:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][142/573]	eta 0:06:23 lr 0.001121	time 0.8776 (0.8888)	loss 0.6044 (0.5341)	grad_norm 1.7144 (2.7484)	mem 20675MB
[2025-04-03 01:23:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][144/573]	eta 0:06:21 lr 0.001121	time 0.8775 (0.8887)	loss 0.4932 (0.5334)	grad_norm 2.8623 (2.7447)	mem 20675MB
[2025-04-03 01:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][146/573]	eta 0:06:19 lr 0.001121	time 0.8775 (0.8886)	loss 0.5173 (0.5333)	grad_norm 1.8068 (2.7302)	mem 20675MB
[2025-04-03 01:23:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][148/573]	eta 0:06:17 lr 0.001121	time 0.8775 (0.8884)	loss 0.5629 (0.5335)	grad_norm 2.0456 (2.7261)	mem 20675MB
[2025-04-03 01:23:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][150/573]	eta 0:06:15 lr 0.001120	time 0.8771 (0.8883)	loss 0.4946 (0.5332)	grad_norm 2.9927 (2.7217)	mem 20675MB
[2025-04-03 01:23:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][152/573]	eta 0:06:13 lr 0.001120	time 0.8772 (0.8881)	loss 0.5080 (0.5323)	grad_norm 3.0940 (2.7290)	mem 20675MB
[2025-04-03 01:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][154/573]	eta 0:06:12 lr 0.001120	time 0.8774 (0.8880)	loss 0.4774 (0.5319)	grad_norm 4.2613 (2.7388)	mem 20675MB
[2025-04-03 01:23:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][156/573]	eta 0:06:10 lr 0.001120	time 0.8775 (0.8879)	loss 0.5873 (0.5323)	grad_norm 2.4111 (2.7390)	mem 20675MB
[2025-04-03 01:23:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][158/573]	eta 0:06:08 lr 0.001120	time 0.8774 (0.8878)	loss 0.5582 (0.5319)	grad_norm 3.0928 (2.7857)	mem 20675MB
[2025-04-03 01:23:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][160/573]	eta 0:06:06 lr 0.001120	time 0.8778 (0.8877)	loss 0.5426 (0.5317)	grad_norm 2.1669 (2.7853)	mem 20675MB
[2025-04-03 01:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][162/573]	eta 0:06:04 lr 0.001120	time 0.8777 (0.8875)	loss 0.4453 (0.5314)	grad_norm 3.1848 (2.7833)	mem 20675MB
[2025-04-03 01:24:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][164/573]	eta 0:06:02 lr 0.001119	time 0.8778 (0.8874)	loss 0.3876 (0.5305)	grad_norm 3.0772 (2.7842)	mem 20675MB
[2025-04-03 01:24:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][166/573]	eta 0:06:01 lr 0.001119	time 0.8780 (0.8873)	loss 0.5973 (0.5308)	grad_norm 3.2512 (2.7891)	mem 20675MB
[2025-04-03 01:24:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][168/573]	eta 0:05:59 lr 0.001119	time 0.8779 (0.8872)	loss 0.5339 (0.5308)	grad_norm 3.3274 (2.7913)	mem 20675MB
[2025-04-03 01:24:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][170/573]	eta 0:05:57 lr 0.001119	time 0.8776 (0.8871)	loss 0.3893 (0.5306)	grad_norm 4.5952 (2.8033)	mem 20675MB
[2025-04-03 01:24:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][172/573]	eta 0:05:55 lr 0.001119	time 0.8776 (0.8870)	loss 0.6060 (0.5310)	grad_norm 2.5878 (2.7984)	mem 20675MB
[2025-04-03 01:24:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][174/573]	eta 0:05:53 lr 0.001119	time 0.8774 (0.8869)	loss 0.5299 (0.5315)	grad_norm 2.0555 (2.7926)	mem 20675MB
[2025-04-03 01:24:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][176/573]	eta 0:05:52 lr 0.001119	time 0.8776 (0.8868)	loss 0.5780 (0.5318)	grad_norm 2.3270 (2.7908)	mem 20675MB
[2025-04-03 01:24:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][178/573]	eta 0:05:50 lr 0.001118	time 0.8776 (0.8867)	loss 0.5586 (0.5321)	grad_norm 2.2519 (2.7867)	mem 20675MB
[2025-04-03 01:24:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][180/573]	eta 0:05:48 lr 0.001118	time 0.8774 (0.8866)	loss 0.4909 (0.5321)	grad_norm 2.6715 (2.7818)	mem 20675MB
[2025-04-03 01:24:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][182/573]	eta 0:05:46 lr 0.001118	time 0.8775 (0.8866)	loss 0.5953 (0.5326)	grad_norm 1.5114 (2.7715)	mem 20675MB
[2025-04-03 01:24:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][184/573]	eta 0:05:44 lr 0.001118	time 0.8775 (0.8865)	loss 0.4959 (0.5325)	grad_norm 1.8397 (2.7600)	mem 20675MB
[2025-04-03 01:24:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][186/573]	eta 0:05:43 lr 0.001118	time 0.8776 (0.8864)	loss 0.5744 (0.5330)	grad_norm 1.8610 (2.7531)	mem 20675MB
[2025-04-03 01:24:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][188/573]	eta 0:05:41 lr 0.001118	time 0.8777 (0.8863)	loss 0.5603 (0.5334)	grad_norm 3.1016 (2.7521)	mem 20675MB
[2025-04-03 01:24:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][190/573]	eta 0:05:39 lr 0.001118	time 0.8776 (0.8862)	loss 0.5552 (0.5339)	grad_norm 1.5412 (2.7425)	mem 20675MB
[2025-04-03 01:24:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][192/573]	eta 0:05:37 lr 0.001117	time 0.8775 (0.8861)	loss 0.5169 (0.5341)	grad_norm 2.7091 (2.7381)	mem 20675MB
[2025-04-03 01:24:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][194/573]	eta 0:05:35 lr 0.001117	time 0.8771 (0.8860)	loss 0.5494 (0.5345)	grad_norm 2.3258 (2.7373)	mem 20675MB
[2025-04-03 01:24:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][196/573]	eta 0:05:34 lr 0.001117	time 0.8772 (0.8860)	loss 0.5625 (0.5349)	grad_norm 3.3509 (2.7388)	mem 20675MB
[2025-04-03 01:24:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][198/573]	eta 0:05:32 lr 0.001117	time 0.8773 (0.8859)	loss 0.4685 (0.5346)	grad_norm 2.1922 (2.7391)	mem 20675MB
[2025-04-03 01:24:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][200/573]	eta 0:05:30 lr 0.001117	time 0.8782 (0.8858)	loss 0.3483 (0.5338)	grad_norm 3.5808 (2.7403)	mem 20675MB
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][202/573]	eta 0:05:28 lr 0.001117	time 0.8812 (0.8858)	loss 0.5626 (0.5335)	grad_norm 3.1095 (2.7443)	mem 20675MB
[2025-04-03 01:24:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][204/573]	eta 0:05:26 lr 0.001117	time 0.8779 (0.8857)	loss 0.6218 (0.5338)	grad_norm 2.7342 (2.7406)	mem 20675MB
[2025-04-03 01:24:38 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][206/573]	eta 0:05:25 lr 0.001116	time 0.8777 (0.8856)	loss 0.6013 (0.5342)	grad_norm 1.8609 (2.7429)	mem 20675MB
[2025-04-03 01:24:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][208/573]	eta 0:05:23 lr 0.001116	time 0.8773 (0.8856)	loss 0.5185 (0.5341)	grad_norm 1.9348 (2.7395)	mem 20675MB
[2025-04-03 01:24:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][210/573]	eta 0:05:21 lr 0.001116	time 0.8771 (0.8855)	loss 0.4403 (0.5337)	grad_norm 3.0691 (2.7356)	mem 20675MB
[2025-04-03 01:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][212/573]	eta 0:05:19 lr 0.001116	time 0.8774 (0.8854)	loss 0.5856 (0.5338)	grad_norm 2.5433 (2.7308)	mem 20675MB
[2025-04-03 01:24:45 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][214/573]	eta 0:05:17 lr 0.001116	time 0.8770 (0.8854)	loss 0.5446 (0.5343)	grad_norm 1.8612 (2.7248)	mem 20675MB
[2025-04-03 01:24:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][216/573]	eta 0:05:16 lr 0.001116	time 0.8774 (0.8853)	loss 0.4946 (0.5336)	grad_norm 2.8260 (2.7249)	mem 20675MB
[2025-04-03 01:24:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][218/573]	eta 0:05:14 lr 0.001116	time 0.8775 (0.8852)	loss 0.5157 (0.5334)	grad_norm 2.7734 (2.7238)	mem 20675MB
[2025-04-03 01:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][220/573]	eta 0:05:12 lr 0.001115	time 0.8775 (0.8852)	loss 0.4113 (0.5329)	grad_norm 2.7251 (2.7224)	mem 20675MB
[2025-04-03 01:24:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][222/573]	eta 0:05:10 lr 0.001115	time 0.8772 (0.8851)	loss 0.6204 (0.5333)	grad_norm 2.6584 (2.7196)	mem 20675MB
[2025-04-03 01:24:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][224/573]	eta 0:05:08 lr 0.001115	time 0.8774 (0.8850)	loss 0.5365 (0.5336)	grad_norm 3.3305 (2.7226)	mem 20675MB
[2025-04-03 01:24:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][226/573]	eta 0:05:07 lr 0.001115	time 0.8777 (0.8850)	loss 0.5787 (0.5336)	grad_norm 2.4304 (2.7199)	mem 20675MB
[2025-04-03 01:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][228/573]	eta 0:05:05 lr 0.001115	time 0.8772 (0.8849)	loss 0.6139 (0.5344)	grad_norm 3.6457 (2.7270)	mem 20675MB
[2025-04-03 01:24:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][230/573]	eta 0:05:03 lr 0.001115	time 0.8773 (0.8849)	loss 0.5625 (0.5340)	grad_norm 1.6294 (2.7225)	mem 20675MB
[2025-04-03 01:25:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][232/573]	eta 0:05:01 lr 0.001115	time 0.8771 (0.8848)	loss 0.5448 (0.5340)	grad_norm 3.4569 (2.7236)	mem 20675MB
[2025-04-03 01:25:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][234/573]	eta 0:04:59 lr 0.001114	time 0.8772 (0.8847)	loss 0.4975 (0.5341)	grad_norm 2.0788 (2.7173)	mem 20675MB
[2025-04-03 01:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][236/573]	eta 0:04:58 lr 0.001114	time 0.8771 (0.8847)	loss 0.5647 (0.5345)	grad_norm 1.6700 (2.7104)	mem 20675MB
[2025-04-03 01:25:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][238/573]	eta 0:04:56 lr 0.001114	time 0.8775 (0.8846)	loss 0.4076 (0.5337)	grad_norm 3.8088 (2.7146)	mem 20675MB
[2025-04-03 01:25:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][240/573]	eta 0:04:54 lr 0.001114	time 0.8775 (0.8846)	loss 0.4789 (0.5335)	grad_norm 3.3880 (2.7164)	mem 20675MB
[2025-04-03 01:25:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][242/573]	eta 0:04:52 lr 0.001114	time 0.8772 (0.8845)	loss 0.5492 (0.5335)	grad_norm 3.5198 (2.7231)	mem 20675MB
[2025-04-03 01:25:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][244/573]	eta 0:04:50 lr 0.001114	time 0.8775 (0.8845)	loss 0.5325 (0.5339)	grad_norm 2.7059 (2.7230)	mem 20675MB
[2025-04-03 01:25:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][246/573]	eta 0:04:49 lr 0.001114	time 0.8773 (0.8844)	loss 0.5515 (0.5336)	grad_norm 2.0995 (2.7231)	mem 20675MB
[2025-04-03 01:25:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][248/573]	eta 0:04:47 lr 0.001113	time 0.8772 (0.8844)	loss 0.4854 (0.5333)	grad_norm 3.4630 (2.7269)	mem 20675MB
[2025-04-03 01:25:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][250/573]	eta 0:04:45 lr 0.001113	time 0.8771 (0.8843)	loss 0.5861 (0.5339)	grad_norm 2.1302 (2.7321)	mem 20675MB
[2025-04-03 01:25:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][252/573]	eta 0:04:43 lr 0.001113	time 0.8775 (0.8843)	loss 0.4872 (0.5340)	grad_norm 2.8011 (2.7308)	mem 20675MB
[2025-04-03 01:25:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][254/573]	eta 0:04:42 lr 0.001113	time 0.8771 (0.8842)	loss 0.4863 (0.5340)	grad_norm 2.2623 (2.7283)	mem 20675MB
[2025-04-03 01:25:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][256/573]	eta 0:04:40 lr 0.001113	time 0.8780 (0.8842)	loss 0.5915 (0.5341)	grad_norm 2.4152 (2.7251)	mem 20675MB
[2025-04-03 01:25:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][258/573]	eta 0:04:38 lr 0.001113	time 0.8775 (0.8841)	loss 0.4065 (0.5331)	grad_norm 2.5367 (2.7249)	mem 20675MB
[2025-04-03 01:25:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][260/573]	eta 0:04:36 lr 0.001113	time 0.8770 (0.8841)	loss 0.5596 (0.5334)	grad_norm 2.7012 (2.7202)	mem 20675MB
[2025-04-03 01:25:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][262/573]	eta 0:04:34 lr 0.001112	time 0.8773 (0.8841)	loss 0.5668 (0.5335)	grad_norm 3.4527 (2.7221)	mem 20675MB
[2025-04-03 01:25:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][264/573]	eta 0:04:33 lr 0.001112	time 0.8773 (0.8840)	loss 0.5496 (0.5336)	grad_norm 2.5604 (2.7187)	mem 20675MB
[2025-04-03 01:25:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][266/573]	eta 0:04:31 lr 0.001112	time 0.8775 (0.8840)	loss 0.4359 (0.5334)	grad_norm 2.7835 (2.7175)	mem 20675MB
[2025-04-03 01:25:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][268/573]	eta 0:04:29 lr 0.001112	time 0.8776 (0.8839)	loss 0.5149 (0.5331)	grad_norm 2.7251 (2.7199)	mem 20675MB
[2025-04-03 01:25:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][270/573]	eta 0:04:27 lr 0.001112	time 0.8774 (0.8839)	loss 0.5902 (0.5338)	grad_norm 3.3007 (2.7247)	mem 20675MB
[2025-04-03 01:25:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][272/573]	eta 0:04:26 lr 0.001112	time 0.8776 (0.8838)	loss 0.4910 (0.5336)	grad_norm 4.5567 (2.7283)	mem 20675MB
[2025-04-03 01:25:38 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][274/573]	eta 0:04:24 lr 0.001112	time 0.8774 (0.8838)	loss 0.6182 (0.5335)	grad_norm 2.1258 (2.7287)	mem 20675MB
[2025-04-03 01:25:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][276/573]	eta 0:04:22 lr 0.001111	time 0.8775 (0.8838)	loss 0.4960 (0.5335)	grad_norm 1.7122 (2.7211)	mem 20675MB
[2025-04-03 01:25:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][278/573]	eta 0:04:20 lr 0.001111	time 0.8773 (0.8837)	loss 0.5708 (0.5337)	grad_norm 1.8208 (2.7131)	mem 20675MB
[2025-04-03 01:25:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][280/573]	eta 0:04:18 lr 0.001111	time 0.8788 (0.8837)	loss 0.5723 (0.5341)	grad_norm 1.9771 (2.7088)	mem 20675MB
[2025-04-03 01:25:45 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][282/573]	eta 0:04:17 lr 0.001111	time 0.8772 (0.8837)	loss 0.5143 (0.5340)	grad_norm 1.9966 (2.7082)	mem 20675MB
[2025-04-03 01:25:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][284/573]	eta 0:04:15 lr 0.001111	time 0.8776 (0.8836)	loss 0.5468 (0.5342)	grad_norm 1.7515 (2.7004)	mem 20675MB
[2025-04-03 01:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][286/573]	eta 0:04:13 lr 0.001111	time 0.8771 (0.8836)	loss 0.5103 (0.5339)	grad_norm 2.2578 (2.6968)	mem 20675MB
[2025-04-03 01:25:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][288/573]	eta 0:04:11 lr 0.001111	time 0.8772 (0.8835)	loss 0.6280 (0.5341)	grad_norm 2.9317 (2.6938)	mem 20675MB
[2025-04-03 01:25:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][290/573]	eta 0:04:10 lr 0.001110	time 0.8775 (0.8835)	loss 0.5894 (0.5344)	grad_norm 2.1757 (2.6902)	mem 20675MB
[2025-04-03 01:25:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][292/573]	eta 0:04:08 lr 0.001110	time 0.8775 (0.8835)	loss 0.4998 (0.5344)	grad_norm 2.6586 (2.6904)	mem 20675MB
[2025-04-03 01:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][294/573]	eta 0:04:06 lr 0.001110	time 0.8774 (0.8834)	loss 0.5124 (0.5345)	grad_norm 2.4638 (2.6885)	mem 20675MB
[2025-04-03 01:25:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][296/573]	eta 0:04:04 lr 0.001110	time 0.8772 (0.8834)	loss 0.4395 (0.5341)	grad_norm 4.1752 (2.6897)	mem 20675MB
[2025-04-03 01:25:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][298/573]	eta 0:04:02 lr 0.001110	time 0.8774 (0.8834)	loss 0.6197 (0.5343)	grad_norm 3.1540 (2.6885)	mem 20675MB
[2025-04-03 01:26:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][300/573]	eta 0:04:01 lr 0.001110	time 0.8774 (0.8833)	loss 0.5968 (0.5349)	grad_norm 3.2302 (2.6905)	mem 20675MB
[2025-04-03 01:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][302/573]	eta 0:03:59 lr 0.001110	time 0.8776 (0.8833)	loss 0.5956 (0.5352)	grad_norm 2.4040 (2.6866)	mem 20675MB
[2025-04-03 01:26:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][304/573]	eta 0:03:57 lr 0.001109	time 0.8773 (0.8833)	loss 0.5694 (0.5354)	grad_norm 2.1427 (2.6821)	mem 20675MB
[2025-04-03 01:26:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][306/573]	eta 0:03:55 lr 0.001109	time 0.8781 (0.8832)	loss 0.5455 (0.5354)	grad_norm 2.8589 (2.6799)	mem 20675MB
[2025-04-03 01:26:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][308/573]	eta 0:03:54 lr 0.001109	time 0.8772 (0.8832)	loss 0.5194 (0.5352)	grad_norm 1.9533 (2.6801)	mem 20675MB
[2025-04-03 01:26:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][310/573]	eta 0:03:52 lr 0.001109	time 0.8774 (0.8832)	loss 0.4409 (0.5350)	grad_norm 3.2530 (2.6789)	mem 20675MB
[2025-04-03 01:26:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][312/573]	eta 0:03:50 lr 0.001109	time 0.8772 (0.8831)	loss 0.3967 (0.5344)	grad_norm 2.6138 (2.6770)	mem 20675MB
[2025-04-03 01:26:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][314/573]	eta 0:03:48 lr 0.001109	time 0.8776 (0.8831)	loss 0.5766 (0.5344)	grad_norm 2.2087 (2.6752)	mem 20675MB
[2025-04-03 01:26:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][316/573]	eta 0:03:46 lr 0.001109	time 0.8776 (0.8831)	loss 0.5693 (0.5346)	grad_norm 3.0841 (2.6796)	mem 20675MB
[2025-04-03 01:26:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][318/573]	eta 0:03:45 lr 0.001108	time 0.8775 (0.8830)	loss 0.5419 (0.5348)	grad_norm 3.5551 (2.6819)	mem 20675MB
[2025-04-03 01:26:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][320/573]	eta 0:03:43 lr 0.001108	time 0.8774 (0.8830)	loss 0.5698 (0.5345)	grad_norm 1.8248 (2.6793)	mem 20675MB
[2025-04-03 01:26:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][322/573]	eta 0:03:41 lr 0.001108	time 0.8774 (0.8830)	loss 0.5322 (0.5346)	grad_norm 2.2120 (2.6773)	mem 20675MB
[2025-04-03 01:26:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][324/573]	eta 0:03:39 lr 0.001108	time 0.8773 (0.8830)	loss 0.5696 (0.5351)	grad_norm 1.7815 (2.6733)	mem 20675MB
[2025-04-03 01:26:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][326/573]	eta 0:03:38 lr 0.001108	time 0.8774 (0.8829)	loss 0.4126 (0.5348)	grad_norm 3.0701 (2.6709)	mem 20675MB
[2025-04-03 01:26:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][328/573]	eta 0:03:36 lr 0.001108	time 0.8778 (0.8829)	loss 0.5228 (0.5350)	grad_norm 1.6942 (2.6665)	mem 20675MB
[2025-04-03 01:26:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][330/573]	eta 0:03:34 lr 0.001108	time 0.8772 (0.8829)	loss 0.4317 (0.5350)	grad_norm 3.3532 (2.6661)	mem 20675MB
[2025-04-03 01:26:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][332/573]	eta 0:03:32 lr 0.001107	time 0.8775 (0.8829)	loss 0.5845 (0.5353)	grad_norm 2.2735 (2.6624)	mem 20675MB
[2025-04-03 01:26:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][334/573]	eta 0:03:30 lr 0.001107	time 0.8774 (0.8828)	loss 0.3780 (0.5348)	grad_norm 2.9379 (2.6630)	mem 20675MB
[2025-04-03 01:26:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][336/573]	eta 0:03:29 lr 0.001107	time 0.8776 (0.8828)	loss 0.5359 (0.5345)	grad_norm 1.8640 (2.6666)	mem 20675MB
[2025-04-03 01:26:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][338/573]	eta 0:03:27 lr 0.001107	time 0.8772 (0.8828)	loss 0.5628 (0.5347)	grad_norm 3.2850 (2.6680)	mem 20675MB
[2025-04-03 01:26:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][340/573]	eta 0:03:25 lr 0.001107	time 0.8773 (0.8828)	loss 0.5779 (0.5348)	grad_norm 2.1824 (2.6661)	mem 20675MB
[2025-04-03 01:26:38 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][342/573]	eta 0:03:23 lr 0.001107	time 0.8772 (0.8827)	loss 0.5300 (0.5345)	grad_norm 3.1914 (2.6694)	mem 20675MB
[2025-04-03 01:26:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][344/573]	eta 0:03:22 lr 0.001107	time 0.8777 (0.8827)	loss 0.5386 (0.5346)	grad_norm 2.9986 (2.6693)	mem 20675MB
[2025-04-03 01:26:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][346/573]	eta 0:03:20 lr 0.001106	time 0.8781 (0.8827)	loss 0.5680 (0.5348)	grad_norm 1.7172 (2.6639)	mem 20675MB
[2025-04-03 01:26:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][348/573]	eta 0:03:18 lr 0.001106	time 0.8774 (0.8827)	loss 0.5118 (0.5350)	grad_norm 2.5424 (2.6625)	mem 20675MB
[2025-04-03 01:26:45 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][350/573]	eta 0:03:16 lr 0.001106	time 0.8776 (0.8827)	loss 0.6447 (0.5353)	grad_norm 1.9122 (2.6582)	mem 20675MB
[2025-04-03 01:26:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][352/573]	eta 0:03:15 lr 0.001106	time 0.8776 (0.8826)	loss 0.4187 (0.5352)	grad_norm 2.5159 (2.6547)	mem 20675MB
[2025-04-03 01:26:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][354/573]	eta 0:03:13 lr 0.001106	time 0.8774 (0.8826)	loss 0.4153 (0.5350)	grad_norm 2.2885 (2.6530)	mem 20675MB
[2025-04-03 01:26:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][356/573]	eta 0:03:11 lr 0.001106	time 0.8774 (0.8826)	loss 0.4859 (0.5349)	grad_norm 2.3062 (2.6501)	mem 20675MB
[2025-04-03 01:26:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][358/573]	eta 0:03:09 lr 0.001106	time 0.8771 (0.8826)	loss 0.5241 (0.5350)	grad_norm 2.5202 (2.6531)	mem 20675MB
[2025-04-03 01:26:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][360/573]	eta 0:03:07 lr 0.001105	time 0.8772 (0.8825)	loss 0.5978 (0.5352)	grad_norm 2.6701 (2.6514)	mem 20675MB
[2025-04-03 01:26:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][362/573]	eta 0:03:06 lr 0.001105	time 0.8773 (0.8825)	loss 0.4588 (0.5347)	grad_norm 1.7606 (2.6483)	mem 20675MB
[2025-04-03 01:26:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][364/573]	eta 0:03:04 lr 0.001105	time 0.8772 (0.8825)	loss 0.6194 (0.5348)	grad_norm 2.5950 (2.6500)	mem 20675MB
[2025-04-03 01:26:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][366/573]	eta 0:03:02 lr 0.001105	time 0.8776 (0.8825)	loss 0.3698 (0.5343)	grad_norm 2.8424 (2.6522)	mem 20675MB
[2025-04-03 01:27:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][368/573]	eta 0:03:00 lr 0.001105	time 0.8785 (0.8824)	loss 0.5697 (0.5344)	grad_norm 2.3867 (2.6511)	mem 20675MB
[2025-04-03 01:27:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][370/573]	eta 0:02:59 lr 0.001105	time 0.8774 (0.8824)	loss 0.5504 (0.5344)	grad_norm 2.6862 (2.6528)	mem 20675MB
[2025-04-03 01:27:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][372/573]	eta 0:02:57 lr 0.001105	time 0.8776 (0.8824)	loss 0.6290 (0.5344)	grad_norm 2.8027 (2.6532)	mem 20675MB
[2025-04-03 01:27:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][374/573]	eta 0:02:55 lr 0.001104	time 0.8771 (0.8824)	loss 0.3763 (0.5339)	grad_norm 2.4794 (2.6557)	mem 20675MB
[2025-04-03 01:27:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][376/573]	eta 0:02:53 lr 0.001104	time 0.8776 (0.8824)	loss 0.4038 (0.5333)	grad_norm 3.3567 (2.6574)	mem 20675MB
[2025-04-03 01:27:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][378/573]	eta 0:02:52 lr 0.001104	time 0.8777 (0.8823)	loss 0.6087 (0.5335)	grad_norm 3.2333 (2.6596)	mem 20675MB
[2025-04-03 01:27:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][380/573]	eta 0:02:50 lr 0.001104	time 0.8776 (0.8823)	loss 0.5880 (0.5336)	grad_norm 3.0734 (2.6583)	mem 20675MB
[2025-04-03 01:27:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][382/573]	eta 0:02:48 lr 0.001104	time 0.8773 (0.8823)	loss 0.5971 (0.5338)	grad_norm 2.3155 (2.6554)	mem 20675MB
[2025-04-03 01:27:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][384/573]	eta 0:02:46 lr 0.001104	time 0.8773 (0.8823)	loss 0.6122 (0.5337)	grad_norm 1.9530 (2.6513)	mem 20675MB
[2025-04-03 01:27:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][386/573]	eta 0:02:44 lr 0.001104	time 0.8784 (0.8823)	loss 0.5755 (0.5338)	grad_norm 1.9437 (2.6484)	mem 20675MB
[2025-04-03 01:27:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][388/573]	eta 0:02:43 lr 0.001103	time 0.8778 (0.8822)	loss 0.5746 (0.5341)	grad_norm 1.5950 (2.6452)	mem 20675MB
[2025-04-03 01:27:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][390/573]	eta 0:02:41 lr 0.001103	time 0.8774 (0.8822)	loss 0.6100 (0.5344)	grad_norm 2.5372 (2.6436)	mem 20675MB
[2025-04-03 01:27:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][392/573]	eta 0:02:39 lr 0.001103	time 0.8775 (0.8822)	loss 0.3866 (0.5343)	grad_norm 2.8014 (2.6432)	mem 20675MB
[2025-04-03 01:27:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][394/573]	eta 0:02:37 lr 0.001103	time 0.8773 (0.8822)	loss 0.5325 (0.5340)	grad_norm 2.0459 (2.6428)	mem 20675MB
[2025-04-03 01:27:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][396/573]	eta 0:02:36 lr 0.001103	time 0.8775 (0.8822)	loss 0.5604 (0.5339)	grad_norm 2.1162 (2.6396)	mem 20675MB
[2025-04-03 01:27:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][398/573]	eta 0:02:34 lr 0.001103	time 0.8774 (0.8821)	loss 0.5577 (0.5335)	grad_norm 3.5793 (2.6449)	mem 20675MB
[2025-04-03 01:27:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][400/573]	eta 0:02:32 lr 0.001102	time 0.8776 (0.8822)	loss 0.5268 (0.5331)	grad_norm 3.5202 (2.6506)	mem 20675MB
[2025-04-03 01:27:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][402/573]	eta 0:02:30 lr 0.001102	time 0.8775 (0.8821)	loss 0.4964 (0.5331)	grad_norm 4.2266 (2.6546)	mem 20675MB
[2025-04-03 01:27:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][404/573]	eta 0:02:29 lr 0.001102	time 0.8777 (0.8821)	loss 0.7347 (0.5333)	grad_norm 5.0916 (2.6637)	mem 20675MB
[2025-04-03 01:27:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][406/573]	eta 0:02:27 lr 0.001102	time 0.8775 (0.8821)	loss 0.6234 (0.5335)	grad_norm 2.6074 (2.6653)	mem 20675MB
[2025-04-03 01:27:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][408/573]	eta 0:02:25 lr 0.001102	time 0.8773 (0.8821)	loss 0.5837 (0.5334)	grad_norm 1.9166 (2.6672)	mem 20675MB
[2025-04-03 01:27:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][410/573]	eta 0:02:23 lr 0.001102	time 0.8774 (0.8821)	loss 0.5176 (0.5335)	grad_norm 2.2422 (2.6673)	mem 20675MB
[2025-04-03 01:27:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][412/573]	eta 0:02:22 lr 0.001102	time 0.8776 (0.8820)	loss 0.5997 (0.5340)	grad_norm 1.4062 (2.6631)	mem 20675MB
[2025-04-03 01:27:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][414/573]	eta 0:02:20 lr 0.001101	time 0.8773 (0.8820)	loss 0.5460 (0.5341)	grad_norm 2.1964 (2.6611)	mem 20675MB
[2025-04-03 01:27:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][416/573]	eta 0:02:18 lr 0.001101	time 0.8777 (0.8820)	loss 0.4580 (0.5338)	grad_norm 3.4644 (2.6622)	mem 20675MB
[2025-04-03 01:27:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][418/573]	eta 0:02:16 lr 0.001101	time 0.8787 (0.8820)	loss 0.4238 (0.5336)	grad_norm 2.7362 (2.6605)	mem 20675MB
[2025-04-03 01:27:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][420/573]	eta 0:02:14 lr 0.001101	time 0.8775 (0.8820)	loss 0.5362 (0.5335)	grad_norm 2.0112 (2.6567)	mem 20675MB
[2025-04-03 01:27:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][422/573]	eta 0:02:13 lr 0.001101	time 0.8775 (0.8820)	loss 0.4821 (0.5332)	grad_norm 3.9585 (2.6678)	mem 20675MB
[2025-04-03 01:27:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][424/573]	eta 0:02:11 lr 0.001101	time 0.8775 (0.8819)	loss 0.5242 (0.5332)	grad_norm 2.3105 (2.6678)	mem 20675MB
[2025-04-03 01:27:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][426/573]	eta 0:02:09 lr 0.001101	time 0.8791 (0.8819)	loss 0.5167 (0.5332)	grad_norm 5.0399 (2.6740)	mem 20675MB
[2025-04-03 01:27:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][428/573]	eta 0:02:07 lr 0.001100	time 0.8777 (0.8819)	loss 0.5298 (0.5329)	grad_norm 3.1058 (2.6759)	mem 20675MB
[2025-04-03 01:27:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][430/573]	eta 0:02:06 lr 0.001100	time 0.8777 (0.8819)	loss 0.6029 (0.5331)	grad_norm 2.3451 (2.6738)	mem 20675MB
[2025-04-03 01:27:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][432/573]	eta 0:02:04 lr 0.001100	time 0.8773 (0.8819)	loss 0.5707 (0.5335)	grad_norm 2.0601 (2.6748)	mem 20675MB
[2025-04-03 01:27:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][434/573]	eta 0:02:02 lr 0.001100	time 0.8777 (0.8819)	loss 0.4598 (0.5335)	grad_norm 3.4457 (2.6765)	mem 20675MB
[2025-04-03 01:28:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][436/573]	eta 0:02:00 lr 0.001100	time 0.8777 (0.8819)	loss 0.5726 (0.5337)	grad_norm 1.1939 (2.6708)	mem 20675MB
[2025-04-03 01:28:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][438/573]	eta 0:01:59 lr 0.001100	time 0.8776 (0.8818)	loss 0.4224 (0.5334)	grad_norm 2.4342 (2.6688)	mem 20675MB
[2025-04-03 01:28:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][440/573]	eta 0:01:57 lr 0.001100	time 0.8775 (0.8818)	loss 0.4960 (0.5331)	grad_norm 2.4778 (2.6677)	mem 20675MB
[2025-04-03 01:28:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][442/573]	eta 0:01:55 lr 0.001099	time 0.8773 (0.8818)	loss 0.6113 (0.5331)	grad_norm 2.6715 (2.6657)	mem 20675MB
[2025-04-03 01:28:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][444/573]	eta 0:01:53 lr 0.001099	time 0.8777 (0.8818)	loss 0.4925 (0.5330)	grad_norm 3.7243 (2.6682)	mem 20675MB
[2025-04-03 01:28:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][446/573]	eta 0:01:51 lr 0.001099	time 0.8774 (0.8818)	loss 0.5671 (0.5332)	grad_norm 5.1338 (2.6726)	mem 20675MB
[2025-04-03 01:28:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][448/573]	eta 0:01:50 lr 0.001099	time 0.8776 (0.8818)	loss 0.5263 (0.5333)	grad_norm 2.1476 (2.6697)	mem 20675MB
[2025-04-03 01:28:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][450/573]	eta 0:01:48 lr 0.001099	time 0.8773 (0.8817)	loss 0.5509 (0.5330)	grad_norm 2.5119 (2.6695)	mem 20675MB
[2025-04-03 01:28:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][452/573]	eta 0:01:46 lr 0.001099	time 0.8774 (0.8817)	loss 0.5307 (0.5330)	grad_norm 1.8573 (2.6669)	mem 20675MB
[2025-04-03 01:28:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][454/573]	eta 0:01:44 lr 0.001098	time 0.8778 (0.8817)	loss 0.5952 (0.5330)	grad_norm 2.2238 (2.6668)	mem 20675MB
[2025-04-03 01:28:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][456/573]	eta 0:01:43 lr 0.001098	time 0.8776 (0.8817)	loss 0.4967 (0.5330)	grad_norm 3.0031 (2.6668)	mem 20675MB
[2025-04-03 01:28:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][458/573]	eta 0:01:41 lr 0.001098	time 0.8773 (0.8817)	loss 0.5433 (0.5329)	grad_norm 2.3307 (2.6676)	mem 20675MB
[2025-04-03 01:28:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][460/573]	eta 0:01:39 lr 0.001098	time 0.8772 (0.8817)	loss 0.3575 (0.5325)	grad_norm 3.2797 (2.6671)	mem 20675MB
[2025-04-03 01:28:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][462/573]	eta 0:01:37 lr 0.001098	time 0.8775 (0.8817)	loss 0.5568 (0.5323)	grad_norm 2.4203 (2.6681)	mem 20675MB
[2025-04-03 01:28:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][464/573]	eta 0:01:36 lr 0.001098	time 0.8777 (0.8816)	loss 0.5795 (0.5324)	grad_norm 2.5611 (2.6701)	mem 20675MB
[2025-04-03 01:28:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][466/573]	eta 0:01:34 lr 0.001098	time 0.8777 (0.8816)	loss 0.5524 (0.5323)	grad_norm 3.7502 (2.6709)	mem 20675MB
[2025-04-03 01:28:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][468/573]	eta 0:01:32 lr 0.001097	time 0.8771 (0.8816)	loss 0.4735 (0.5322)	grad_norm 2.1485 (2.6681)	mem 20675MB
[2025-04-03 01:28:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][470/573]	eta 0:01:30 lr 0.001097	time 0.8771 (0.8816)	loss 0.5266 (0.5321)	grad_norm 2.0769 (2.6663)	mem 20675MB
[2025-04-03 01:28:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][472/573]	eta 0:01:29 lr 0.001097	time 0.8775 (0.8816)	loss 0.5851 (0.5322)	grad_norm 2.2436 (2.6633)	mem 20675MB
[2025-04-03 01:28:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][474/573]	eta 0:01:27 lr 0.001097	time 0.8772 (0.8816)	loss 0.5727 (0.5321)	grad_norm 2.7627 (2.6643)	mem 20675MB
[2025-04-03 01:28:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][476/573]	eta 0:01:25 lr 0.001097	time 0.8773 (0.8816)	loss 0.5924 (0.5321)	grad_norm 3.2863 (2.6644)	mem 20675MB
[2025-04-03 01:28:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][478/573]	eta 0:01:23 lr 0.001097	time 0.8772 (0.8815)	loss 0.5212 (0.5322)	grad_norm 1.9359 (2.6626)	mem 20675MB
[2025-04-03 01:28:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][480/573]	eta 0:01:21 lr 0.001097	time 0.8776 (0.8815)	loss 0.4102 (0.5321)	grad_norm 3.9989 (2.6648)	mem 20675MB
[2025-04-03 01:28:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][482/573]	eta 0:01:20 lr 0.001096	time 0.8778 (0.8815)	loss 0.5692 (0.5321)	grad_norm 1.7257 (2.6628)	mem 20675MB
[2025-04-03 01:28:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][484/573]	eta 0:01:18 lr 0.001096	time 0.8774 (0.8815)	loss 0.5348 (0.5319)	grad_norm 2.6459 (2.6632)	mem 20675MB
[2025-04-03 01:28:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][486/573]	eta 0:01:16 lr 0.001096	time 0.8776 (0.8815)	loss 0.4530 (0.5318)	grad_norm 2.1740 (2.6612)	mem 20675MB
[2025-04-03 01:28:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][488/573]	eta 0:01:14 lr 0.001096	time 0.8773 (0.8815)	loss 0.6127 (0.5317)	grad_norm 3.2362 (2.6662)	mem 20675MB
[2025-04-03 01:28:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][490/573]	eta 0:01:13 lr 0.001096	time 0.8775 (0.8815)	loss 0.5994 (0.5320)	grad_norm 2.2932 (2.6639)	mem 20675MB
[2025-04-03 01:28:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][492/573]	eta 0:01:11 lr 0.001096	time 0.8773 (0.8815)	loss 0.4408 (0.5315)	grad_norm 2.7000 (2.6653)	mem 20675MB
[2025-04-03 01:28:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][494/573]	eta 0:01:09 lr 0.001095	time 0.8774 (0.8814)	loss 0.6155 (0.5317)	grad_norm 3.2962 (2.6667)	mem 20675MB
[2025-04-03 01:28:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][496/573]	eta 0:01:07 lr 0.001095	time 0.8773 (0.8814)	loss 0.4667 (0.5316)	grad_norm 2.1096 (2.6657)	mem 20675MB
[2025-04-03 01:28:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][498/573]	eta 0:01:06 lr 0.001095	time 0.8775 (0.8814)	loss 0.3983 (0.5313)	grad_norm 2.7963 (2.6666)	mem 20675MB
[2025-04-03 01:28:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][500/573]	eta 0:01:04 lr 0.001095	time 0.8772 (0.8814)	loss 0.4015 (0.5312)	grad_norm 2.2371 (2.6654)	mem 20675MB
[2025-04-03 01:28:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][502/573]	eta 0:01:02 lr 0.001095	time 0.8775 (0.8814)	loss 0.5408 (0.5310)	grad_norm 1.8247 (2.6659)	mem 20675MB
[2025-04-03 01:29:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][504/573]	eta 0:01:00 lr 0.001095	time 0.8771 (0.8814)	loss 0.6021 (0.5309)	grad_norm 2.2492 (2.6635)	mem 20675MB
[2025-04-03 01:29:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][506/573]	eta 0:00:59 lr 0.001095	time 0.8774 (0.8814)	loss 0.5325 (0.5310)	grad_norm 2.6231 (2.6630)	mem 20675MB
[2025-04-03 01:29:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][508/573]	eta 0:00:57 lr 0.001094	time 0.8779 (0.8813)	loss 0.4357 (0.5310)	grad_norm 4.6875 (2.6665)	mem 20675MB
[2025-04-03 01:29:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][510/573]	eta 0:00:55 lr 0.001094	time 0.8774 (0.8813)	loss 0.4771 (0.5306)	grad_norm 3.6068 (2.6693)	mem 20675MB
[2025-04-03 01:29:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][512/573]	eta 0:00:53 lr 0.001094	time 0.8777 (0.8813)	loss 0.5359 (0.5306)	grad_norm 2.2753 (2.6675)	mem 20675MB
[2025-04-03 01:29:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][514/573]	eta 0:00:51 lr 0.001094	time 0.8777 (0.8813)	loss 0.5919 (0.5305)	grad_norm 3.2024 (2.6692)	mem 20675MB
[2025-04-03 01:29:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][516/573]	eta 0:00:50 lr 0.001094	time 0.8776 (0.8813)	loss 0.5704 (0.5308)	grad_norm 2.7480 (2.6705)	mem 20675MB
[2025-04-03 01:29:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][518/573]	eta 0:00:48 lr 0.001094	time 0.8778 (0.8813)	loss 0.5929 (0.5310)	grad_norm 3.3694 (2.6711)	mem 20675MB
[2025-04-03 01:29:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][520/573]	eta 0:00:46 lr 0.001094	time 0.8771 (0.8813)	loss 0.5138 (0.5309)	grad_norm 2.2723 (2.6724)	mem 20675MB
[2025-04-03 01:29:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][522/573]	eta 0:00:44 lr 0.001093	time 0.8771 (0.8813)	loss 0.5013 (0.5307)	grad_norm 2.5784 (2.6726)	mem 20675MB
[2025-04-03 01:29:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][524/573]	eta 0:00:43 lr 0.001093	time 0.8771 (0.8813)	loss 0.5200 (0.5307)	grad_norm 3.2131 (2.6727)	mem 20675MB
[2025-04-03 01:29:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][526/573]	eta 0:00:41 lr 0.001093	time 0.8777 (0.8813)	loss 0.5805 (0.5309)	grad_norm 1.9747 (2.6703)	mem 20675MB
[2025-04-03 01:29:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][528/573]	eta 0:00:39 lr 0.001093	time 0.8772 (0.8812)	loss 0.5585 (0.5311)	grad_norm 2.1997 (2.6678)	mem 20675MB
[2025-04-03 01:29:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][530/573]	eta 0:00:37 lr 0.001093	time 0.8774 (0.8812)	loss 0.6078 (0.5313)	grad_norm 2.1951 (2.6663)	mem 20675MB
[2025-04-03 01:29:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][532/573]	eta 0:00:36 lr 0.001093	time 0.8773 (0.8812)	loss 0.6282 (0.5315)	grad_norm 2.1134 (2.6646)	mem 20675MB
[2025-04-03 01:29:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][534/573]	eta 0:00:34 lr 0.001092	time 0.8773 (0.8812)	loss 0.5613 (0.5314)	grad_norm 1.6091 (2.6611)	mem 20675MB
[2025-04-03 01:29:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][536/573]	eta 0:00:32 lr 0.001092	time 0.8775 (0.8812)	loss 0.5126 (0.5312)	grad_norm 3.0779 (2.6620)	mem 20675MB
[2025-04-03 01:29:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][538/573]	eta 0:00:30 lr 0.001092	time 0.8776 (0.8812)	loss 0.5786 (0.5312)	grad_norm 2.3631 (2.6604)	mem 20675MB
[2025-04-03 01:29:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][540/573]	eta 0:00:29 lr 0.001092	time 0.8774 (0.8812)	loss 0.6382 (0.5315)	grad_norm 2.1783 (2.6605)	mem 20675MB
[2025-04-03 01:29:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][542/573]	eta 0:00:27 lr 0.001092	time 0.8773 (0.8812)	loss 0.4624 (0.5315)	grad_norm 2.5733 (2.6583)	mem 20675MB
[2025-04-03 01:29:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][544/573]	eta 0:00:25 lr 0.001092	time 0.8772 (0.8812)	loss 0.5409 (0.5314)	grad_norm 1.7175 (2.6555)	mem 20675MB
[2025-04-03 01:29:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][546/573]	eta 0:00:23 lr 0.001092	time 0.8773 (0.8811)	loss 0.4516 (0.5313)	grad_norm 4.0372 (2.6572)	mem 20675MB
[2025-04-03 01:29:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][548/573]	eta 0:00:22 lr 0.001091	time 0.8774 (0.8811)	loss 0.4253 (0.5312)	grad_norm 3.7142 (2.6569)	mem 20675MB
[2025-04-03 01:29:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][550/573]	eta 0:00:20 lr 0.001091	time 0.8771 (0.8811)	loss 0.4884 (0.5313)	grad_norm 3.6950 (2.6580)	mem 20675MB
[2025-04-03 01:29:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][552/573]	eta 0:00:18 lr 0.001091	time 0.8771 (0.8811)	loss 0.5838 (0.5315)	grad_norm 2.8524 (2.6572)	mem 20675MB
[2025-04-03 01:29:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][554/573]	eta 0:00:16 lr 0.001091	time 0.8771 (0.8811)	loss 0.6073 (0.5315)	grad_norm 3.0486 (2.6581)	mem 20675MB
[2025-04-03 01:29:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][556/573]	eta 0:00:14 lr 0.001091	time 0.8772 (0.8811)	loss 0.4037 (0.5314)	grad_norm 3.4467 (2.6582)	mem 20675MB
[2025-04-03 01:29:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][558/573]	eta 0:00:13 lr 0.001091	time 0.8770 (0.8811)	loss 0.6083 (0.5313)	grad_norm 2.9746 (2.6584)	mem 20675MB
[2025-04-03 01:29:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][560/573]	eta 0:00:11 lr 0.001090	time 0.8770 (0.8811)	loss 0.5850 (0.5313)	grad_norm 2.9044 (2.6619)	mem 20675MB
[2025-04-03 01:29:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][562/573]	eta 0:00:09 lr 0.001090	time 0.8771 (0.8811)	loss 0.6069 (0.5313)	grad_norm 2.6335 (2.6624)	mem 20675MB
[2025-04-03 01:29:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][564/573]	eta 0:00:07 lr 0.001090	time 0.8769 (0.8810)	loss 0.3987 (0.5313)	grad_norm 3.5989 (2.6637)	mem 20675MB
[2025-04-03 01:29:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][566/573]	eta 0:00:06 lr 0.001090	time 0.8771 (0.8810)	loss 0.5190 (0.5313)	grad_norm 3.4345 (2.6635)	mem 20675MB
[2025-04-03 01:29:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][568/573]	eta 0:00:04 lr 0.001090	time 0.8771 (0.8810)	loss 0.4261 (0.5310)	grad_norm 6.0122 (2.6702)	mem 20675MB
[2025-04-03 01:29:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][570/573]	eta 0:00:02 lr 0.001090	time 0.8769 (0.8810)	loss 0.4715 (0.5309)	grad_norm 3.5751 (2.6729)	mem 20675MB
[2025-04-03 01:30:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][572/573]	eta 0:00:00 lr 0.001090	time 0.8772 (0.8810)	loss 0.5837 (0.5310)	grad_norm 1.7219 (2.6701)	mem 20675MB
[2025-04-03 01:30:00 simmim_finetune] (main_finetune.py 260): INFO EPOCH 6 training takes 0:08:24
[2025-04-03 01:30:02 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.800 (1.800)	Loss 0.5045 (0.5045)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 01:30:02 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.789)	Loss 0.4495 (0.4696)	Acc@1 83.594 (79.427)	Mem 20675MB
[2025-04-03 01:30:03 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.284 (0.587)	Loss 0.4871 (0.4742)	Acc@1 78.125 (79.062)	Mem 20675MB
[2025-04-03 01:30:03 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.501)	Loss 0.5061 (0.4751)	Acc@1 77.344 (79.353)	Mem 20675MB
[2025-04-03 01:30:04 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.452)	Loss 0.6639 (0.4977)	Acc@1 59.375 (76.823)	Mem 20675MB
[2025-04-03 01:30:05 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.422)	Loss 0.5544 (0.5109)	Acc@1 76.562 (76.065)	Mem 20675MB
[2025-04-03 01:30:05 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.401)	Loss 0.6131 (0.5231)	Acc@1 64.844 (74.760)	Mem 20675MB
[2025-04-03 01:30:06 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.385)	Loss 0.5467 (0.5270)	Acc@1 76.562 (74.896)	Mem 20675MB
[2025-04-03 01:30:06 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 74.899
[2025-04-03 01:30:06 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 74.9%
[2025-04-03 01:30:06 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 74.90%
[2025-04-03 01:30:06 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.0603524853128716e-06, 4.0603524853128716e-06, 6.22941603230375e-06, 6.22941603230375e-06, 9.566436873828178e-06, 9.566436873828178e-06, 1.470031509155807e-05, 1.470031509155807e-05, 2.2598589272680974e-05, 2.2598589272680974e-05, 3.474978032056237e-05, 3.474978032056237e-05, 5.3443920394226054e-05, 5.3443920394226054e-05, 8.220413589217017e-05, 8.220413589217017e-05, 0.00012645062127362267, 0.00012645062127362267, 0.00019452213724508808, 0.00019452213724508808, 0.00029924754643195783, 0.00029924754643195783, 0.00046036356056560374, 0.00046036356056560374, 0.0007082343515404435, 0.0007082343515404435, 0.0010895740299632739, 0.0010895740299632739]
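# Note: the per-group learning rates logged above follow the layer-wise LR decay configured as LAYER_DECAY: 0.65 (with BASE_LR 0.00125 and a 12-block ViT). A minimal sketch of that scheme, assuming the BEiT/SimMIM-style convention of one scale per depth (patch embedding at the bottom, head at the top); the function name is illustrative, and the logged values differ slightly from this sketch because the cosine scheduler also folds MIN_LR into each group:

```python
# Sketch of layer-wise LR decay (LAYER_DECAY: 0.65 from the config above).
# Each depth d gets scale decay**(num_layers - 1 - d), so deeper layers
# (closer to the head) fine-tune with larger learning rates.

def layer_decay_scales(depth: int = 12, decay: float = 0.65) -> list[float]:
    """Scale factors for patch embed (index 0), blocks 1..depth, head (last)."""
    num_layers = depth + 2  # patch embedding + transformer blocks + head
    return [decay ** (num_layers - 1 - i) for i in range(num_layers)]

scales = layer_decay_scales()          # 14 monotonically increasing factors
current_lr = 0.001090                  # scheduled LR at this step, per the log
group_lrs = [current_lr * s for s in scales]
```

# With 14 scales and two parameter groups per scale (weight-decay and no-weight-decay), this matches the 28 entries printed in the line above, up to the MIN_LR offset.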
[2025-04-03 01:30:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][0/573]	eta 0:22:35 lr 0.001089	time 2.3658 (2.3658)	loss 0.4097 (0.4097)	grad_norm 3.1182 (3.1182)	mem 20675MB
[2025-04-03 01:30:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][2/573]	eta 0:13:04 lr 0.001089	time 0.8773 (1.3743)	loss 0.5588 (0.5284)	grad_norm 2.1076 (2.4375)	mem 20675MB
[2025-04-03 01:30:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][4/573]	eta 0:11:08 lr 0.001089	time 0.8771 (1.1757)	loss 0.5703 (0.5179)	grad_norm 2.4906 (2.6996)	mem 20675MB
[2025-04-03 01:30:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][6/573]	eta 0:10:18 lr 0.001089	time 0.8772 (1.0907)	loss 0.5372 (0.5315)	grad_norm 2.5857 (2.6126)	mem 20675MB
[2025-04-03 01:30:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][8/573]	eta 0:09:49 lr 0.001089	time 0.8772 (1.0435)	loss 0.5533 (0.5364)	grad_norm 2.3041 (2.5715)	mem 20675MB
[2025-04-03 01:30:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][10/573]	eta 0:09:30 lr 0.001089	time 0.8773 (1.0134)	loss 0.5618 (0.5318)	grad_norm 2.6579 (2.6830)	mem 20675MB
[2025-04-03 01:30:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][12/573]	eta 0:09:16 lr 0.001089	time 0.8770 (0.9925)	loss 0.4391 (0.5291)	grad_norm 1.6550 (2.5428)	mem 20675MB
[2025-04-03 01:30:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][14/573]	eta 0:09:06 lr 0.001088	time 0.8778 (0.9773)	loss 0.5216 (0.5346)	grad_norm 2.3478 (2.5301)	mem 20675MB
[2025-04-03 01:30:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][16/573]	eta 0:08:57 lr 0.001088	time 0.8776 (0.9657)	loss 0.5486 (0.5285)	grad_norm 2.0533 (2.5459)	mem 20675MB
[2025-04-03 01:30:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][18/573]	eta 0:08:50 lr 0.001088	time 0.8778 (0.9566)	loss 0.4821 (0.5284)	grad_norm 2.5892 (2.5460)	mem 20675MB
[2025-04-03 01:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][20/573]	eta 0:08:44 lr 0.001088	time 0.8777 (0.9491)	loss 0.4405 (0.5226)	grad_norm 2.7106 (2.5228)	mem 20675MB
[2025-04-03 01:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][22/573]	eta 0:08:39 lr 0.001088	time 0.8779 (0.9430)	loss 0.5125 (0.5237)	grad_norm 2.8172 (2.5297)	mem 20675MB
[2025-04-03 01:30:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][24/573]	eta 0:08:34 lr 0.001088	time 0.8780 (0.9379)	loss 0.6313 (0.5287)	grad_norm 3.4878 (2.5521)	mem 20675MB
[2025-04-03 01:30:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][26/573]	eta 0:08:30 lr 0.001088	time 0.8779 (0.9335)	loss 0.5466 (0.5297)	grad_norm 2.0304 (2.5265)	mem 20675MB
[2025-04-03 01:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][28/573]	eta 0:08:26 lr 0.001087	time 0.8776 (0.9297)	loss 0.5899 (0.5320)	grad_norm 2.6104 (2.5686)	mem 20675MB
[2025-04-03 01:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][30/573]	eta 0:08:23 lr 0.001087	time 0.8777 (0.9264)	loss 0.4932 (0.5313)	grad_norm 3.2698 (2.6120)	mem 20675MB
[2025-04-03 01:30:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][32/573]	eta 0:08:19 lr 0.001087	time 0.8776 (0.9235)	loss 0.5641 (0.5349)	grad_norm 2.5196 (2.6071)	mem 20675MB
[2025-04-03 01:30:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][34/573]	eta 0:08:16 lr 0.001087	time 0.8776 (0.9209)	loss 0.5098 (0.5348)	grad_norm 2.2112 (2.5635)	mem 20675MB
[2025-04-03 01:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][36/573]	eta 0:08:13 lr 0.001087	time 0.8778 (0.9186)	loss 0.5590 (0.5365)	grad_norm 3.8357 (2.5896)	mem 20675MB
[2025-04-03 01:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][38/573]	eta 0:08:10 lr 0.001087	time 0.8776 (0.9166)	loss 0.5815 (0.5388)	grad_norm 2.0547 (2.5632)	mem 20675MB
[2025-04-03 01:30:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][40/573]	eta 0:08:07 lr 0.001086	time 0.8779 (0.9147)	loss 0.5269 (0.5389)	grad_norm 1.8042 (2.5319)	mem 20675MB
[2025-04-03 01:30:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][42/573]	eta 0:08:04 lr 0.001086	time 0.8778 (0.9131)	loss 0.4656 (0.5388)	grad_norm 2.5509 (2.5108)	mem 20675MB
[2025-04-03 01:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][44/573]	eta 0:08:02 lr 0.001086	time 0.8780 (0.9115)	loss 0.6089 (0.5423)	grad_norm 1.9088 (2.4919)	mem 20675MB
[2025-04-03 01:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][46/573]	eta 0:07:59 lr 0.001086	time 0.8783 (0.9102)	loss 0.4815 (0.5415)	grad_norm 4.4977 (2.5382)	mem 20675MB
[2025-04-03 01:30:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][48/573]	eta 0:07:57 lr 0.001086	time 0.8779 (0.9089)	loss 0.4942 (0.5393)	grad_norm 3.3348 (2.5556)	mem 20675MB
[2025-04-03 01:30:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][50/573]	eta 0:07:54 lr 0.001086	time 0.8780 (0.9077)	loss 0.6585 (0.5418)	grad_norm 4.6912 (2.5846)	mem 20675MB
[2025-04-03 01:30:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][52/573]	eta 0:07:52 lr 0.001086	time 0.8783 (0.9066)	loss 0.4327 (0.5418)	grad_norm 4.8583 (2.6366)	mem 20675MB
[2025-04-03 01:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][54/573]	eta 0:07:50 lr 0.001085	time 0.8785 (0.9056)	loss 0.4106 (0.5364)	grad_norm 2.3894 (2.6505)	mem 20675MB
[2025-04-03 01:30:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][56/573]	eta 0:07:47 lr 0.001085	time 0.8779 (0.9047)	loss 0.5668 (0.5359)	grad_norm 1.9798 (2.6544)	mem 20675MB
[2025-04-03 01:30:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][58/573]	eta 0:07:45 lr 0.001085	time 0.8781 (0.9038)	loss 0.4305 (0.5330)	grad_norm 3.5928 (2.6854)	mem 20675MB
[2025-04-03 01:31:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][60/573]	eta 0:07:43 lr 0.001085	time 0.8780 (0.9030)	loss 0.5914 (0.5338)	grad_norm 2.4646 (2.6820)	mem 20675MB
[2025-04-03 01:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][62/573]	eta 0:07:41 lr 0.001085	time 0.8778 (0.9023)	loss 0.4985 (0.5334)	grad_norm 4.3946 (2.7006)	mem 20675MB
[2025-04-03 01:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][64/573]	eta 0:07:38 lr 0.001085	time 0.8787 (0.9015)	loss 0.3641 (0.5318)	grad_norm 3.2098 (2.7197)	mem 20675MB
[2025-04-03 01:31:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][66/573]	eta 0:07:36 lr 0.001084	time 0.8780 (0.9009)	loss 0.4716 (0.5317)	grad_norm 2.4032 (2.7071)	mem 20675MB
[2025-04-03 01:31:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][68/573]	eta 0:07:34 lr 0.001084	time 0.8781 (0.9003)	loss 0.4648 (0.5293)	grad_norm 2.0602 (2.6981)	mem 20675MB
[2025-04-03 01:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][70/573]	eta 0:07:32 lr 0.001084	time 0.8786 (0.8997)	loss 0.5124 (0.5298)	grad_norm 2.4462 (2.7010)	mem 20675MB
[2025-04-03 01:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][72/573]	eta 0:07:30 lr 0.001084	time 0.8781 (0.8991)	loss 0.4046 (0.5263)	grad_norm 3.2564 (2.7246)	mem 20675MB
[2025-04-03 01:31:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][74/573]	eta 0:07:28 lr 0.001084	time 0.8777 (0.8986)	loss 0.5976 (0.5258)	grad_norm 2.3568 (2.7204)	mem 20675MB
[2025-04-03 01:31:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][76/573]	eta 0:07:26 lr 0.001084	time 0.8790 (0.8981)	loss 0.6240 (0.5267)	grad_norm 5.3649 (2.7551)	mem 20675MB
[2025-04-03 01:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][78/573]	eta 0:07:24 lr 0.001083	time 0.8781 (0.8976)	loss 0.4308 (0.5256)	grad_norm 3.1229 (2.7953)	mem 20675MB
[2025-04-03 01:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][80/573]	eta 0:07:22 lr 0.001083	time 0.8778 (0.8972)	loss 0.5631 (0.5273)	grad_norm 2.4336 (2.8076)	mem 20675MB
[2025-04-03 01:31:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][82/573]	eta 0:07:20 lr 0.001083	time 0.8785 (0.8967)	loss 0.6320 (0.5290)	grad_norm 2.8911 (2.8060)	mem 20675MB
[2025-04-03 01:31:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][84/573]	eta 0:07:18 lr 0.001083	time 0.8778 (0.8963)	loss 0.5730 (0.5296)	grad_norm 1.3372 (2.7793)	mem 20675MB
[2025-04-03 01:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][86/573]	eta 0:07:16 lr 0.001083	time 0.8779 (0.8959)	loss 0.5230 (0.5290)	grad_norm 1.8918 (2.7641)	mem 20675MB
[2025-04-03 01:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][88/573]	eta 0:07:14 lr 0.001083	time 0.8775 (0.8955)	loss 0.5102 (0.5282)	grad_norm 2.4871 (2.7715)	mem 20675MB
[2025-04-03 01:31:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][90/573]	eta 0:07:12 lr 0.001083	time 0.8779 (0.8952)	loss 0.4319 (0.5275)	grad_norm 3.1659 (2.7651)	mem 20675MB
[2025-04-03 01:31:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][92/573]	eta 0:07:10 lr 0.001082	time 0.8776 (0.8948)	loss 0.5842 (0.5272)	grad_norm 3.7066 (2.7868)	mem 20675MB
[2025-04-03 01:31:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][94/573]	eta 0:07:08 lr 0.001082	time 0.8771 (0.8945)	loss 0.5502 (0.5262)	grad_norm 2.4279 (2.7908)	mem 20675MB
[2025-04-03 01:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][96/573]	eta 0:07:06 lr 0.001082	time 0.8772 (0.8941)	loss 0.5599 (0.5272)	grad_norm 3.9856 (2.7941)	mem 20675MB
[2025-04-03 01:31:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][98/573]	eta 0:07:04 lr 0.001082	time 0.8775 (0.8938)	loss 0.6216 (0.5279)	grad_norm 2.6491 (2.7850)	mem 20675MB
[2025-04-03 01:31:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][100/573]	eta 0:07:02 lr 0.001082	time 0.8774 (0.8935)	loss 0.5077 (0.5288)	grad_norm 2.9489 (2.7817)	mem 20675MB
[2025-04-03 01:31:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][102/573]	eta 0:07:00 lr 0.001082	time 0.8775 (0.8932)	loss 0.5385 (0.5293)	grad_norm 1.7275 (2.7654)	mem 20675MB
[2025-04-03 01:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][104/573]	eta 0:06:58 lr 0.001081	time 0.8776 (0.8929)	loss 0.4351 (0.5289)	grad_norm 2.7346 (2.7609)	mem 20675MB
[2025-04-03 01:31:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][106/573]	eta 0:06:56 lr 0.001081	time 0.8775 (0.8927)	loss 0.5612 (0.5292)	grad_norm 2.6016 (2.7548)	mem 20675MB
[2025-04-03 01:31:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][108/573]	eta 0:06:55 lr 0.001081	time 0.8773 (0.8925)	loss 0.3737 (0.5285)	grad_norm 2.4137 (2.7444)	mem 20675MB
[2025-04-03 01:31:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][110/573]	eta 0:06:53 lr 0.001081	time 0.8780 (0.8941)	loss 0.6695 (0.5293)	grad_norm 3.4219 (2.7434)	mem 20675MB
[2025-04-03 01:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][112/573]	eta 0:06:52 lr 0.001081	time 0.8774 (0.8939)	loss 0.5132 (0.5290)	grad_norm 1.9814 (2.7412)	mem 20675MB
[2025-04-03 01:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][114/573]	eta 0:06:50 lr 0.001081	time 0.8774 (0.8937)	loss 0.5387 (0.5292)	grad_norm 6.8163 (2.7726)	mem 20675MB
[2025-04-03 01:31:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][116/573]	eta 0:06:48 lr 0.001081	time 0.8771 (0.8935)	loss 0.5072 (0.5298)	grad_norm 3.2057 (2.7848)	mem 20675MB
[2025-04-03 01:31:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][118/573]	eta 0:06:46 lr 0.001080	time 0.8783 (0.8933)	loss 0.4716 (0.5298)	grad_norm 3.0887 (2.7828)	mem 20675MB
[2025-04-03 01:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][120/573]	eta 0:06:44 lr 0.001080	time 0.8772 (0.8930)	loss 0.5391 (0.5298)	grad_norm 3.9048 (2.7910)	mem 20675MB
[2025-04-03 01:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][122/573]	eta 0:06:42 lr 0.001080	time 0.8774 (0.8928)	loss 0.3860 (0.5279)	grad_norm 2.4903 (2.8010)	mem 20675MB
[2025-04-03 01:31:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][124/573]	eta 0:06:40 lr 0.001080	time 0.8775 (0.8926)	loss 0.5105 (0.5280)	grad_norm 2.7627 (2.7924)	mem 20675MB
[2025-04-03 01:31:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][126/573]	eta 0:06:38 lr 0.001080	time 0.8772 (0.8923)	loss 0.5660 (0.5280)	grad_norm 2.9735 (2.7856)	mem 20675MB
[2025-04-03 01:32:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][128/573]	eta 0:06:36 lr 0.001080	time 0.8775 (0.8921)	loss 0.7131 (0.5301)	grad_norm 4.5766 (2.8089)	mem 20675MB
[2025-04-03 01:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][130/573]	eta 0:06:35 lr 0.001079	time 0.8772 (0.8919)	loss 0.5299 (0.5295)	grad_norm 2.6161 (2.8127)	mem 20675MB
[2025-04-03 01:32:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][132/573]	eta 0:06:33 lr 0.001079	time 0.8773 (0.8917)	loss 0.4197 (0.5288)	grad_norm 3.4076 (2.8210)	mem 20675MB
[2025-04-03 01:32:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][134/573]	eta 0:06:31 lr 0.001079	time 0.8782 (0.8915)	loss 0.4237 (0.5280)	grad_norm 3.9486 (2.8324)	mem 20675MB
[2025-04-03 01:32:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][136/573]	eta 0:06:29 lr 0.001079	time 0.8774 (0.8913)	loss 0.5063 (0.5280)	grad_norm 1.6297 (2.8176)	mem 20675MB
[2025-04-03 01:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][138/573]	eta 0:06:27 lr 0.001079	time 0.8778 (0.8911)	loss 0.6244 (0.5288)	grad_norm 2.9081 (2.8119)	mem 20675MB
[2025-04-03 01:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][140/573]	eta 0:06:25 lr 0.001079	time 0.8774 (0.8910)	loss 0.5675 (0.5298)	grad_norm 3.6502 (2.8144)	mem 20675MB
[2025-04-03 01:32:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][142/573]	eta 0:06:23 lr 0.001078	time 0.8774 (0.8908)	loss 0.4516 (0.5296)	grad_norm 2.2650 (2.8063)	mem 20675MB
[2025-04-03 01:32:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][144/573]	eta 0:06:22 lr 0.001078	time 0.8780 (0.8906)	loss 0.4544 (0.5296)	grad_norm 1.8188 (2.7932)	mem 20675MB
[2025-04-03 01:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][146/573]	eta 0:06:20 lr 0.001078	time 0.8774 (0.8905)	loss 0.4585 (0.5291)	grad_norm 3.1983 (2.7881)	mem 20675MB
[2025-04-03 01:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][148/573]	eta 0:06:18 lr 0.001078	time 0.8772 (0.8903)	loss 0.5635 (0.5293)	grad_norm 1.5576 (2.7738)	mem 20675MB
[2025-04-03 01:32:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][150/573]	eta 0:06:16 lr 0.001078	time 0.8771 (0.8901)	loss 0.5096 (0.5281)	grad_norm 2.8870 (2.7756)	mem 20675MB
[2025-04-03 01:32:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][152/573]	eta 0:06:14 lr 0.001078	time 0.8775 (0.8900)	loss 0.6229 (0.5291)	grad_norm 2.7652 (2.7762)	mem 20675MB
[2025-04-03 01:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][154/573]	eta 0:06:12 lr 0.001078	time 0.8775 (0.8898)	loss 0.5991 (0.5296)	grad_norm 3.3223 (2.7797)	mem 20675MB
[2025-04-03 01:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][156/573]	eta 0:06:10 lr 0.001077	time 0.8775 (0.8897)	loss 0.4596 (0.5293)	grad_norm 2.8122 (2.7825)	mem 20675MB
[2025-04-03 01:32:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][158/573]	eta 0:06:09 lr 0.001077	time 0.8772 (0.8895)	loss 0.5392 (0.5294)	grad_norm 3.1310 (2.7786)	mem 20675MB
[2025-04-03 01:32:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][160/573]	eta 0:06:07 lr 0.001077	time 0.8777 (0.8894)	loss 0.6195 (0.5308)	grad_norm 2.1901 (2.7757)	mem 20675MB
[2025-04-03 01:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][162/573]	eta 0:06:05 lr 0.001077	time 0.8774 (0.8893)	loss 0.5839 (0.5307)	grad_norm 3.3111 (2.7802)	mem 20675MB
[2025-04-03 01:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][164/573]	eta 0:06:03 lr 0.001077	time 0.8772 (0.8891)	loss 0.5579 (0.5309)	grad_norm 1.7861 (2.7713)	mem 20675MB
[2025-04-03 01:32:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][166/573]	eta 0:06:01 lr 0.001077	time 0.8772 (0.8890)	loss 0.5555 (0.5311)	grad_norm 1.7138 (2.7575)	mem 20675MB
[2025-04-03 01:32:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][168/573]	eta 0:05:59 lr 0.001076	time 0.8771 (0.8889)	loss 0.4878 (0.5310)	grad_norm 2.2945 (2.7493)	mem 20675MB
[2025-04-03 01:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][170/573]	eta 0:05:58 lr 0.001076	time 0.8772 (0.8887)	loss 0.6012 (0.5315)	grad_norm 1.8442 (2.7364)	mem 20675MB
[2025-04-03 01:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][172/573]	eta 0:05:56 lr 0.001076	time 0.8773 (0.8886)	loss 0.6127 (0.5319)	grad_norm 1.9318 (2.7315)	mem 20675MB
[2025-04-03 01:32:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][174/573]	eta 0:05:54 lr 0.001076	time 0.8772 (0.8885)	loss 0.4387 (0.5317)	grad_norm 4.5916 (2.7430)	mem 20675MB
[2025-04-03 01:32:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][176/573]	eta 0:05:52 lr 0.001076	time 0.8772 (0.8884)	loss 0.5139 (0.5312)	grad_norm 2.7917 (2.7455)	mem 20675MB
[2025-04-03 01:32:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][178/573]	eta 0:05:50 lr 0.001076	time 0.8774 (0.8883)	loss 0.4709 (0.5309)	grad_norm 2.7961 (2.7429)	mem 20675MB
[2025-04-03 01:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][180/573]	eta 0:05:49 lr 0.001075	time 0.8772 (0.8881)	loss 0.4379 (0.5307)	grad_norm 3.3995 (2.7470)	mem 20675MB
[2025-04-03 01:32:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][182/573]	eta 0:05:47 lr 0.001075	time 0.8776 (0.8880)	loss 0.4591 (0.5296)	grad_norm 3.2786 (2.7470)	mem 20675MB
[2025-04-03 01:32:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][184/573]	eta 0:05:45 lr 0.001075	time 0.8774 (0.8879)	loss 0.6273 (0.5305)	grad_norm 3.3623 (2.7498)	mem 20675MB
[2025-04-03 01:32:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][186/573]	eta 0:05:43 lr 0.001075	time 0.8774 (0.8878)	loss 0.3914 (0.5306)	grad_norm 3.8519 (2.7538)	mem 20675MB
[2025-04-03 01:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][188/573]	eta 0:05:41 lr 0.001075	time 0.8774 (0.8877)	loss 0.5435 (0.5308)	grad_norm 3.8826 (2.7566)	mem 20675MB
[2025-04-03 01:32:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][190/573]	eta 0:05:39 lr 0.001075	time 0.8771 (0.8876)	loss 0.6066 (0.5312)	grad_norm 2.0676 (2.7478)	mem 20675MB
[2025-04-03 01:32:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][192/573]	eta 0:05:38 lr 0.001075	time 0.8778 (0.8875)	loss 0.4698 (0.5309)	grad_norm 2.9155 (2.7426)	mem 20675MB
[2025-04-03 01:32:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][194/573]	eta 0:05:36 lr 0.001074	time 0.8772 (0.8874)	loss 0.5443 (0.5310)	grad_norm 1.4644 (2.7318)	mem 20675MB
[2025-04-03 01:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][196/573]	eta 0:05:34 lr 0.001074	time 0.8774 (0.8873)	loss 0.4527 (0.5304)	grad_norm 2.5158 (2.7305)	mem 20675MB
[2025-04-03 01:33:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][198/573]	eta 0:05:32 lr 0.001074	time 0.8770 (0.8873)	loss 0.5715 (0.5309)	grad_norm 2.9020 (2.7286)	mem 20675MB
[2025-04-03 01:33:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][200/573]	eta 0:05:30 lr 0.001074	time 0.8770 (0.8872)	loss 0.5187 (0.5312)	grad_norm 3.6932 (2.7361)	mem 20675MB
[2025-04-03 01:33:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][202/573]	eta 0:05:29 lr 0.001074	time 0.8771 (0.8871)	loss 0.5962 (0.5310)	grad_norm 2.3653 (2.7383)	mem 20675MB
[2025-04-03 01:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][204/573]	eta 0:05:27 lr 0.001074	time 0.8780 (0.8870)	loss 0.6080 (0.5315)	grad_norm 1.9410 (2.7312)	mem 20675MB
[2025-04-03 01:33:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][206/573]	eta 0:05:25 lr 0.001073	time 0.8773 (0.8869)	loss 0.3947 (0.5310)	grad_norm 2.5123 (2.7246)	mem 20675MB
[2025-04-03 01:33:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][208/573]	eta 0:05:23 lr 0.001073	time 0.8772 (0.8868)	loss 0.5943 (0.5314)	grad_norm 2.1884 (2.7177)	mem 20675MB
[2025-04-03 01:33:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][210/573]	eta 0:05:21 lr 0.001073	time 0.8775 (0.8867)	loss 0.4597 (0.5312)	grad_norm 3.0639 (2.7143)	mem 20675MB
[2025-04-03 01:33:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][212/573]	eta 0:05:20 lr 0.001073	time 0.8777 (0.8867)	loss 0.4252 (0.5311)	grad_norm 2.0649 (2.7132)	mem 20675MB
[2025-04-03 01:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][214/573]	eta 0:05:18 lr 0.001073	time 0.8771 (0.8866)	loss 0.3909 (0.5309)	grad_norm 2.5035 (2.7085)	mem 20675MB
[2025-04-03 01:33:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][216/573]	eta 0:05:16 lr 0.001073	time 0.8773 (0.8865)	loss 0.3823 (0.5301)	grad_norm 2.7200 (2.7057)	mem 20675MB
[2025-04-03 01:33:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][218/573]	eta 0:05:14 lr 0.001072	time 0.8774 (0.8864)	loss 0.5947 (0.5299)	grad_norm 2.0727 (2.7065)	mem 20675MB
[2025-04-03 01:33:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][220/573]	eta 0:05:12 lr 0.001072	time 0.8773 (0.8864)	loss 0.4208 (0.5292)	grad_norm 2.4264 (2.7059)	mem 20675MB
[2025-04-03 01:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][222/573]	eta 0:05:11 lr 0.001072	time 0.8779 (0.8863)	loss 0.5998 (0.5298)	grad_norm 2.8031 (2.7062)	mem 20675MB
[2025-04-03 01:33:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][224/573]	eta 0:05:09 lr 0.001072	time 0.8774 (0.8862)	loss 0.4853 (0.5297)	grad_norm 3.5155 (2.7145)	mem 20675MB
[2025-04-03 01:33:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][226/573]	eta 0:05:07 lr 0.001072	time 0.8770 (0.8861)	loss 0.4964 (0.5293)	grad_norm 3.5736 (2.7184)	mem 20675MB
[2025-04-03 01:33:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][228/573]	eta 0:05:05 lr 0.001072	time 0.8771 (0.8861)	loss 0.5414 (0.5296)	grad_norm 2.1886 (2.7128)	mem 20675MB
[2025-04-03 01:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][230/573]	eta 0:05:03 lr 0.001072	time 0.8774 (0.8860)	loss 0.6402 (0.5302)	grad_norm 2.2831 (2.7086)	mem 20675MB
[2025-04-03 01:33:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][232/573]	eta 0:05:02 lr 0.001071	time 0.8773 (0.8859)	loss 0.6012 (0.5300)	grad_norm 1.7675 (2.7098)	mem 20675MB
[2025-04-03 01:33:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][234/573]	eta 0:05:00 lr 0.001071	time 0.8785 (0.8859)	loss 0.4577 (0.5292)	grad_norm 2.4872 (2.7067)	mem 20675MB
[2025-04-03 01:33:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][236/573]	eta 0:04:58 lr 0.001071	time 0.8771 (0.8858)	loss 0.6043 (0.5302)	grad_norm 3.2493 (2.7074)	mem 20675MB
[2025-04-03 01:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][238/573]	eta 0:04:56 lr 0.001071	time 0.8773 (0.8857)	loss 0.5144 (0.5303)	grad_norm 1.9174 (2.7011)	mem 20675MB
[2025-04-03 01:33:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][240/573]	eta 0:04:54 lr 0.001071	time 0.8773 (0.8857)	loss 0.5661 (0.5307)	grad_norm 1.8979 (2.6932)	mem 20675MB
[2025-04-03 01:33:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][242/573]	eta 0:04:53 lr 0.001071	time 0.8772 (0.8856)	loss 0.6188 (0.5310)	grad_norm 1.5582 (2.6854)	mem 20675MB
[2025-04-03 01:33:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][244/573]	eta 0:04:51 lr 0.001070	time 0.8774 (0.8856)	loss 0.6590 (0.5315)	grad_norm 2.4027 (2.6845)	mem 20675MB
[2025-04-03 01:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][246/573]	eta 0:04:49 lr 0.001070	time 0.8771 (0.8855)	loss 0.4942 (0.5317)	grad_norm 2.3624 (2.6818)	mem 20675MB
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][248/573]	eta 0:04:47 lr 0.001070	time 0.8775 (0.8854)	loss 0.5834 (0.5318)	grad_norm 1.5628 (2.6756)	mem 20675MB
[2025-04-03 01:33:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][250/573]	eta 0:04:45 lr 0.001070	time 0.8775 (0.8854)	loss 0.4175 (0.5309)	grad_norm 2.7161 (2.6777)	mem 20675MB
[2025-04-03 01:33:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][252/573]	eta 0:04:44 lr 0.001070	time 0.8774 (0.8853)	loss 0.3654 (0.5300)	grad_norm 2.4471 (2.6749)	mem 20675MB
[2025-04-03 01:33:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][254/573]	eta 0:04:42 lr 0.001070	time 0.8780 (0.8853)	loss 0.4774 (0.5301)	grad_norm 2.0948 (2.6721)	mem 20675MB
[2025-04-03 01:33:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][256/573]	eta 0:04:40 lr 0.001069	time 0.8777 (0.8852)	loss 0.5084 (0.5298)	grad_norm 4.6148 (2.6862)	mem 20675MB
[2025-04-03 01:33:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][258/573]	eta 0:04:38 lr 0.001069	time 0.8773 (0.8852)	loss 0.6018 (0.5296)	grad_norm 3.9237 (2.6933)	mem 20675MB
[2025-04-03 01:33:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][260/573]	eta 0:04:37 lr 0.001069	time 0.8772 (0.8851)	loss 0.6726 (0.5298)	grad_norm 6.7940 (2.7139)	mem 20675MB
[2025-04-03 01:33:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][262/573]	eta 0:04:35 lr 0.001069	time 0.8771 (0.8851)	loss 0.5390 (0.5300)	grad_norm 2.5190 (2.7200)	mem 20675MB
[2025-04-03 01:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][264/573]	eta 0:04:33 lr 0.001069	time 0.8781 (0.8850)	loss 0.5095 (0.5298)	grad_norm 1.8201 (2.7181)	mem 20675MB
[2025-04-03 01:34:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][266/573]	eta 0:04:31 lr 0.001069	time 0.8773 (0.8850)	loss 0.6273 (0.5299)	grad_norm 5.1615 (2.7323)	mem 20675MB
[2025-04-03 01:34:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][268/573]	eta 0:04:29 lr 0.001068	time 0.8774 (0.8849)	loss 0.5823 (0.5301)	grad_norm 3.2379 (2.7356)	mem 20675MB
[2025-04-03 01:34:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][270/573]	eta 0:04:28 lr 0.001068	time 0.8773 (0.8849)	loss 0.5921 (0.5303)	grad_norm 2.5574 (2.7343)	mem 20675MB
[2025-04-03 01:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][272/573]	eta 0:04:26 lr 0.001068	time 0.8773 (0.8848)	loss 0.4682 (0.5295)	grad_norm 3.3141 (2.7386)	mem 20675MB
[2025-04-03 01:34:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][274/573]	eta 0:04:24 lr 0.001068	time 0.8773 (0.8848)	loss 0.5612 (0.5298)	grad_norm 3.6108 (2.7384)	mem 20675MB
[2025-04-03 01:34:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][276/573]	eta 0:04:22 lr 0.001068	time 0.8771 (0.8847)	loss 0.5277 (0.5302)	grad_norm 2.9446 (2.7435)	mem 20675MB
[2025-04-03 01:34:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][278/573]	eta 0:04:20 lr 0.001068	time 0.8771 (0.8847)	loss 0.4367 (0.5301)	grad_norm 3.6080 (2.7464)	mem 20675MB
[2025-04-03 01:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][280/573]	eta 0:04:19 lr 0.001068	time 0.8772 (0.8846)	loss 0.5593 (0.5306)	grad_norm 2.6535 (2.7466)	mem 20675MB
[2025-04-03 01:34:16 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][282/573]	eta 0:04:17 lr 0.001067	time 0.8774 (0.8846)	loss 0.6315 (0.5308)	grad_norm 1.7524 (2.7481)	mem 20675MB
[2025-04-03 01:34:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][284/573]	eta 0:04:15 lr 0.001067	time 0.8773 (0.8845)	loss 0.4961 (0.5309)	grad_norm 2.6993 (2.7449)	mem 20675MB
[2025-04-03 01:34:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][286/573]	eta 0:04:13 lr 0.001067	time 0.8771 (0.8845)	loss 0.5172 (0.5312)	grad_norm 3.0547 (2.7415)	mem 20675MB
[2025-04-03 01:34:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][288/573]	eta 0:04:12 lr 0.001067	time 0.8775 (0.8845)	loss 0.5000 (0.5309)	grad_norm 2.3012 (2.7402)	mem 20675MB
[2025-04-03 01:34:23 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][290/573]	eta 0:04:10 lr 0.001067	time 0.8776 (0.8844)	loss 0.4132 (0.5301)	grad_norm 3.2008 (2.7477)	mem 20675MB
[2025-04-03 01:34:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][292/573]	eta 0:04:08 lr 0.001067	time 0.8775 (0.8844)	loss 0.5278 (0.5297)	grad_norm 2.8611 (2.7451)	mem 20675MB
[2025-04-03 01:34:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][294/573]	eta 0:04:06 lr 0.001066	time 0.8774 (0.8843)	loss 0.4697 (0.5295)	grad_norm 2.4395 (2.7448)	mem 20675MB
[2025-04-03 01:34:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][296/573]	eta 0:04:04 lr 0.001066	time 0.8773 (0.8843)	loss 0.4990 (0.5295)	grad_norm 2.6757 (2.7448)	mem 20675MB
[2025-04-03 01:34:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][298/573]	eta 0:04:03 lr 0.001066	time 0.8773 (0.8843)	loss 0.6173 (0.5299)	grad_norm 2.3887 (2.7457)	mem 20675MB
[2025-04-03 01:34:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][300/573]	eta 0:04:01 lr 0.001066	time 0.8776 (0.8842)	loss 0.5594 (0.5303)	grad_norm 2.9495 (2.7491)	mem 20675MB
[2025-04-03 01:34:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][302/573]	eta 0:03:59 lr 0.001066	time 0.8773 (0.8842)	loss 0.3776 (0.5298)	grad_norm 4.2124 (2.7522)	mem 20675MB
[2025-04-03 01:34:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][304/573]	eta 0:03:57 lr 0.001066	time 0.8775 (0.8841)	loss 0.5498 (0.5298)	grad_norm 1.6422 (2.7464)	mem 20675MB
[2025-04-03 01:34:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][306/573]	eta 0:03:56 lr 0.001065	time 0.8773 (0.8841)	loss 0.5440 (0.5299)	grad_norm 2.1247 (2.7427)	mem 20675MB
[2025-04-03 01:34:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][308/573]	eta 0:03:54 lr 0.001065	time 0.8773 (0.8841)	loss 0.5414 (0.5302)	grad_norm 2.7875 (2.7435)	mem 20675MB
[2025-04-03 01:34:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][310/573]	eta 0:03:52 lr 0.001065	time 0.8771 (0.8840)	loss 0.5700 (0.5307)	grad_norm 1.7861 (2.7403)	mem 20675MB
[2025-04-03 01:34:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][312/573]	eta 0:03:50 lr 0.001065	time 0.8771 (0.8840)	loss 0.4979 (0.5306)	grad_norm 2.3580 (2.7372)	mem 20675MB
[2025-04-03 01:34:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][314/573]	eta 0:03:48 lr 0.001065	time 0.8773 (0.8839)	loss 0.5609 (0.5309)	grad_norm 2.0981 (2.7332)	mem 20675MB
[2025-04-03 01:34:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][316/573]	eta 0:03:47 lr 0.001065	time 0.8771 (0.8839)	loss 0.4296 (0.5301)	grad_norm 2.6805 (2.7340)	mem 20675MB
[2025-04-03 01:34:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][318/573]	eta 0:03:45 lr 0.001064	time 0.8772 (0.8839)	loss 0.5048 (0.5300)	grad_norm 1.3961 (2.7299)	mem 20675MB
[2025-04-03 01:34:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][320/573]	eta 0:03:43 lr 0.001064	time 0.8775 (0.8838)	loss 0.5869 (0.5300)	grad_norm 2.5184 (2.7311)	mem 20675MB
[2025-04-03 01:34:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][322/573]	eta 0:03:41 lr 0.001064	time 0.8788 (0.8838)	loss 0.4173 (0.5300)	grad_norm 3.5774 (2.7374)	mem 20675MB
[2025-04-03 01:34:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][324/573]	eta 0:03:40 lr 0.001064	time 0.8774 (0.8838)	loss 0.6050 (0.5303)	grad_norm 2.7600 (2.7365)	mem 20675MB
[2025-04-03 01:34:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][326/573]	eta 0:03:38 lr 0.001064	time 0.8773 (0.8837)	loss 0.3958 (0.5297)	grad_norm 4.1231 (2.7408)	mem 20675MB
[2025-04-03 01:34:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][328/573]	eta 0:03:36 lr 0.001064	time 0.8777 (0.8837)	loss 0.6935 (0.5301)	grad_norm 2.9695 (2.7427)	mem 20675MB
[2025-04-03 01:34:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][330/573]	eta 0:03:34 lr 0.001063	time 0.8773 (0.8837)	loss 0.5773 (0.5304)	grad_norm 2.0621 (2.7373)	mem 20675MB
[2025-04-03 01:35:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][332/573]	eta 0:03:32 lr 0.001063	time 0.8775 (0.8836)	loss 0.4235 (0.5297)	grad_norm 2.2966 (2.7382)	mem 20675MB
[2025-04-03 01:35:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][334/573]	eta 0:03:31 lr 0.001063	time 0.8772 (0.8836)	loss 0.5632 (0.5295)	grad_norm 1.8341 (2.7335)	mem 20675MB
[2025-04-03 01:35:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][336/573]	eta 0:03:29 lr 0.001063	time 0.8773 (0.8836)	loss 0.4362 (0.5294)	grad_norm 3.6644 (2.7364)	mem 20675MB
[2025-04-03 01:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][338/573]	eta 0:03:27 lr 0.001063	time 0.8771 (0.8836)	loss 0.4578 (0.5294)	grad_norm 2.3597 (2.7337)	mem 20675MB
[2025-04-03 01:35:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][340/573]	eta 0:03:25 lr 0.001063	time 0.8771 (0.8835)	loss 0.6063 (0.5297)	grad_norm 2.9914 (2.7341)	mem 20675MB
[2025-04-03 01:35:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][342/573]	eta 0:03:24 lr 0.001062	time 0.8771 (0.8835)	loss 0.5415 (0.5297)	grad_norm 2.1457 (2.7311)	mem 20675MB
[2025-04-03 01:35:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][344/573]	eta 0:03:22 lr 0.001062	time 0.8774 (0.8835)	loss 0.4887 (0.5297)	grad_norm 1.9891 (2.7265)	mem 20675MB
[2025-04-03 01:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][346/573]	eta 0:03:20 lr 0.001062	time 0.8788 (0.8834)	loss 0.5616 (0.5298)	grad_norm 2.2377 (2.7246)	mem 20675MB
[2025-04-03 01:35:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][348/573]	eta 0:03:18 lr 0.001062	time 0.8773 (0.8834)	loss 0.5686 (0.5298)	grad_norm 2.4680 (2.7220)	mem 20675MB
[2025-04-03 01:35:16 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][350/573]	eta 0:03:16 lr 0.001062	time 0.8774 (0.8834)	loss 0.5462 (0.5295)	grad_norm 2.2665 (2.7215)	mem 20675MB
[2025-04-03 01:35:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][352/573]	eta 0:03:15 lr 0.001062	time 0.8772 (0.8834)	loss 0.3621 (0.5289)	grad_norm 5.9166 (2.7307)	mem 20675MB
[2025-04-03 01:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][354/573]	eta 0:03:13 lr 0.001061	time 0.8773 (0.8833)	loss 0.6252 (0.5292)	grad_norm 2.8229 (2.7306)	mem 20675MB
[2025-04-03 01:35:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][356/573]	eta 0:03:11 lr 0.001061	time 0.8775 (0.8833)	loss 0.5543 (0.5293)	grad_norm 2.7306 (2.7282)	mem 20675MB
[2025-04-03 01:35:23 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][358/573]	eta 0:03:09 lr 0.001061	time 0.8772 (0.8833)	loss 0.5278 (0.5294)	grad_norm 2.4184 (2.7248)	mem 20675MB
[2025-04-03 01:35:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][360/573]	eta 0:03:08 lr 0.001061	time 0.8773 (0.8833)	loss 0.5506 (0.5295)	grad_norm 2.9645 (2.7244)	mem 20675MB
[2025-04-03 01:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][362/573]	eta 0:03:06 lr 0.001061	time 0.8781 (0.8832)	loss 0.5500 (0.5295)	grad_norm 1.9717 (2.7204)	mem 20675MB
[2025-04-03 01:35:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][364/573]	eta 0:03:04 lr 0.001061	time 0.8797 (0.8832)	loss 0.5533 (0.5297)	grad_norm 1.6533 (2.7174)	mem 20675MB
[2025-04-03 01:35:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][366/573]	eta 0:03:02 lr 0.001061	time 0.8772 (0.8832)	loss 0.4396 (0.5293)	grad_norm 3.0467 (2.7169)	mem 20675MB
[2025-04-03 01:35:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][368/573]	eta 0:03:01 lr 0.001060	time 0.8772 (0.8832)	loss 0.6063 (0.5292)	grad_norm 3.9416 (2.7239)	mem 20675MB
[2025-04-03 01:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][370/573]	eta 0:02:59 lr 0.001060	time 0.8774 (0.8831)	loss 0.3885 (0.5288)	grad_norm 4.1032 (2.7262)	mem 20675MB
[2025-04-03 01:35:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][372/573]	eta 0:02:57 lr 0.001060	time 0.8775 (0.8831)	loss 0.4694 (0.5288)	grad_norm 2.9101 (2.7293)	mem 20675MB
[2025-04-03 01:35:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][374/573]	eta 0:02:55 lr 0.001060	time 0.8776 (0.8831)	loss 0.4600 (0.5289)	grad_norm 2.9606 (2.7310)	mem 20675MB
[2025-04-03 01:35:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][376/573]	eta 0:02:53 lr 0.001060	time 0.8776 (0.8831)	loss 0.4238 (0.5285)	grad_norm 3.5612 (2.7330)	mem 20675MB
[2025-04-03 01:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][378/573]	eta 0:02:52 lr 0.001060	time 0.8781 (0.8830)	loss 0.6169 (0.5289)	grad_norm 2.4427 (2.7315)	mem 20675MB
[2025-04-03 01:35:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][380/573]	eta 0:02:50 lr 0.001059	time 0.8771 (0.8830)	loss 0.5224 (0.5288)	grad_norm 2.1402 (2.7287)	mem 20675MB
[2025-04-03 01:35:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][382/573]	eta 0:02:48 lr 0.001059	time 0.8771 (0.8830)	loss 0.4895 (0.5287)	grad_norm 3.5568 (2.7295)	mem 20675MB
[2025-04-03 01:35:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][384/573]	eta 0:02:46 lr 0.001059	time 0.8772 (0.8830)	loss 0.5555 (0.5288)	grad_norm 2.2556 (2.7267)	mem 20675MB
[2025-04-03 01:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][386/573]	eta 0:02:45 lr 0.001059	time 0.8776 (0.8829)	loss 0.5516 (0.5286)	grad_norm 2.4376 (2.7260)	mem 20675MB
[2025-04-03 01:35:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][388/573]	eta 0:02:43 lr 0.001059	time 0.8772 (0.8829)	loss 0.5411 (0.5289)	grad_norm 2.0061 (2.7224)	mem 20675MB
[2025-04-03 01:35:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][390/573]	eta 0:02:41 lr 0.001059	time 0.8772 (0.8829)	loss 0.5891 (0.5291)	grad_norm 1.7192 (2.7183)	mem 20675MB
[2025-04-03 01:35:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][392/573]	eta 0:02:39 lr 0.001058	time 0.8775 (0.8829)	loss 0.5932 (0.5292)	grad_norm 1.9522 (2.7157)	mem 20675MB
[2025-04-03 01:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][394/573]	eta 0:02:38 lr 0.001058	time 0.8774 (0.8828)	loss 0.4538 (0.5289)	grad_norm 2.4874 (2.7142)	mem 20675MB
[2025-04-03 01:35:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][396/573]	eta 0:02:36 lr 0.001058	time 0.8772 (0.8828)	loss 0.5591 (0.5291)	grad_norm 2.7666 (2.7116)	mem 20675MB
[2025-04-03 01:35:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][398/573]	eta 0:02:34 lr 0.001058	time 0.8774 (0.8828)	loss 0.4809 (0.5291)	grad_norm 3.3329 (2.7107)	mem 20675MB
[2025-04-03 01:36:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][400/573]	eta 0:02:32 lr 0.001058	time 0.8774 (0.8828)	loss 0.4834 (0.5291)	grad_norm 2.7265 (2.7108)	mem 20675MB
[2025-04-03 01:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][402/573]	eta 0:02:30 lr 0.001058	time 0.8772 (0.8828)	loss 0.5416 (0.5292)	grad_norm 2.1622 (2.7092)	mem 20675MB
[2025-04-03 01:36:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][404/573]	eta 0:02:29 lr 0.001057	time 0.8776 (0.8827)	loss 0.4932 (0.5288)	grad_norm 5.0838 (2.7166)	mem 20675MB
[2025-04-03 01:36:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][406/573]	eta 0:02:27 lr 0.001057	time 0.8774 (0.8827)	loss 0.3460 (0.5286)	grad_norm 3.5983 (2.7193)	mem 20675MB
[2025-04-03 01:36:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][408/573]	eta 0:02:25 lr 0.001057	time 0.8773 (0.8827)	loss 0.5591 (0.5285)	grad_norm 2.4515 (2.7169)	mem 20675MB
[2025-04-03 01:36:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][410/573]	eta 0:02:23 lr 0.001057	time 0.8774 (0.8827)	loss 0.6037 (0.5290)	grad_norm 3.4383 (2.7199)	mem 20675MB
[2025-04-03 01:36:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][412/573]	eta 0:02:22 lr 0.001057	time 0.8787 (0.8827)	loss 0.4533 (0.5290)	grad_norm 3.8038 (2.7241)	mem 20675MB
[2025-04-03 01:36:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][414/573]	eta 0:02:20 lr 0.001057	time 0.8773 (0.8826)	loss 0.4666 (0.5289)	grad_norm 3.0662 (2.7238)	mem 20675MB
[2025-04-03 01:36:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][416/573]	eta 0:02:18 lr 0.001056	time 0.8773 (0.8826)	loss 0.3722 (0.5283)	grad_norm 6.0262 (2.7306)	mem 20675MB
[2025-04-03 01:36:16 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][418/573]	eta 0:02:16 lr 0.001056	time 0.8773 (0.8826)	loss 0.4728 (0.5283)	grad_norm 2.2131 (2.7291)	mem 20675MB
[2025-04-03 01:36:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][420/573]	eta 0:02:15 lr 0.001056	time 0.8771 (0.8826)	loss 0.5423 (0.5285)	grad_norm 1.7146 (2.7255)	mem 20675MB
[2025-04-03 01:36:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][422/573]	eta 0:02:13 lr 0.001056	time 0.8774 (0.8826)	loss 0.6052 (0.5289)	grad_norm 2.6327 (2.7234)	mem 20675MB
[2025-04-03 01:36:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][424/573]	eta 0:02:11 lr 0.001056	time 0.8775 (0.8826)	loss 0.4442 (0.5285)	grad_norm 2.8369 (2.7254)	mem 20675MB
[2025-04-03 01:36:23 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][426/573]	eta 0:02:09 lr 0.001056	time 0.8770 (0.8825)	loss 0.5564 (0.5288)	grad_norm 2.1203 (2.7212)	mem 20675MB
[2025-04-03 01:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][428/573]	eta 0:02:07 lr 0.001055	time 0.8771 (0.8825)	loss 0.5896 (0.5289)	grad_norm 3.3794 (2.7209)	mem 20675MB
[2025-04-03 01:36:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][430/573]	eta 0:02:06 lr 0.001055	time 0.8771 (0.8825)	loss 0.3754 (0.5286)	grad_norm 2.3832 (2.7185)	mem 20675MB
[2025-04-03 01:36:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][432/573]	eta 0:02:04 lr 0.001055	time 0.8772 (0.8825)	loss 0.5680 (0.5286)	grad_norm 2.7490 (2.7164)	mem 20675MB
[2025-04-03 01:36:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][434/573]	eta 0:02:02 lr 0.001055	time 0.8772 (0.8825)	loss 0.6071 (0.5289)	grad_norm 2.7235 (2.7161)	mem 20675MB
[2025-04-03 01:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][436/573]	eta 0:02:00 lr 0.001055	time 0.8784 (0.8825)	loss 0.5583 (0.5289)	grad_norm 2.8975 (2.7166)	mem 20675MB
[2025-04-03 01:36:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][438/573]	eta 0:01:59 lr 0.001055	time 0.8774 (0.8824)	loss 0.5520 (0.5291)	grad_norm 2.2019 (2.7164)	mem 20675MB
[2025-04-03 01:36:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][440/573]	eta 0:01:57 lr 0.001054	time 0.8771 (0.8824)	loss 0.5045 (0.5287)	grad_norm 3.7012 (2.7197)	mem 20675MB
[2025-04-03 01:36:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][442/573]	eta 0:01:55 lr 0.001054	time 0.8772 (0.8824)	loss 0.4576 (0.5287)	grad_norm 4.5222 (2.7240)	mem 20675MB
[2025-04-03 01:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][444/573]	eta 0:01:53 lr 0.001054	time 0.8775 (0.8824)	loss 0.4088 (0.5285)	grad_norm 3.4629 (2.7249)	mem 20675MB
[2025-04-03 01:36:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][446/573]	eta 0:01:52 lr 0.001054	time 0.8771 (0.8824)	loss 0.5733 (0.5284)	grad_norm 2.6124 (2.7248)	mem 20675MB
[2025-04-03 01:36:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][448/573]	eta 0:01:50 lr 0.001054	time 0.8772 (0.8823)	loss 0.5577 (0.5284)	grad_norm 3.6467 (2.7252)	mem 20675MB
[2025-04-03 01:36:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][450/573]	eta 0:01:48 lr 0.001054	time 0.8768 (0.8823)	loss 0.5100 (0.5281)	grad_norm 2.3236 (2.7281)	mem 20675MB
[2025-04-03 01:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][452/573]	eta 0:01:46 lr 0.001053	time 0.8773 (0.8823)	loss 0.6113 (0.5283)	grad_norm 2.9570 (2.7265)	mem 20675MB
[2025-04-03 01:36:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][454/573]	eta 0:01:44 lr 0.001053	time 0.8773 (0.8823)	loss 0.6487 (0.5287)	grad_norm 3.3413 (2.7259)	mem 20675MB
[2025-04-03 01:36:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][456/573]	eta 0:01:43 lr 0.001053	time 0.8781 (0.8823)	loss 0.6167 (0.5290)	grad_norm 3.4702 (2.7260)	mem 20675MB
[2025-04-03 01:36:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][458/573]	eta 0:01:41 lr 0.001053	time 0.8771 (0.8822)	loss 0.5382 (0.5290)	grad_norm 1.8539 (2.7250)	mem 20675MB
[2025-04-03 01:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][460/573]	eta 0:01:39 lr 0.001053	time 0.8771 (0.8822)	loss 0.5514 (0.5291)	grad_norm 2.2613 (2.7208)	mem 20675MB
[2025-04-03 01:36:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][462/573]	eta 0:01:37 lr 0.001053	time 0.8773 (0.8822)	loss 0.5633 (0.5295)	grad_norm 1.9486 (2.7171)	mem 20675MB
[2025-04-03 01:36:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][464/573]	eta 0:01:36 lr 0.001052	time 0.8770 (0.8822)	loss 0.4456 (0.5292)	grad_norm 2.6670 (2.7174)	mem 20675MB
[2025-04-03 01:36:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][466/573]	eta 0:01:34 lr 0.001052	time 0.8771 (0.8822)	loss 0.4973 (0.5291)	grad_norm 2.0751 (2.7150)	mem 20675MB
[2025-04-03 01:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][468/573]	eta 0:01:32 lr 0.001052	time 0.8772 (0.8822)	loss 0.5633 (0.5292)	grad_norm 1.9495 (2.7120)	mem 20675MB
[2025-04-03 01:37:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][470/573]	eta 0:01:30 lr 0.001052	time 0.8771 (0.8821)	loss 0.5335 (0.5294)	grad_norm 2.6711 (2.7114)	mem 20675MB
[2025-04-03 01:37:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][472/573]	eta 0:01:29 lr 0.001052	time 0.8788 (0.8821)	loss 0.5416 (0.5295)	grad_norm 1.9793 (2.7149)	mem 20675MB
[2025-04-03 01:37:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][474/573]	eta 0:01:27 lr 0.001052	time 0.8775 (0.8821)	loss 0.3925 (0.5293)	grad_norm 2.4772 (2.7132)	mem 20675MB
[2025-04-03 01:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][476/573]	eta 0:01:25 lr 0.001051	time 0.8774 (0.8821)	loss 0.4827 (0.5290)	grad_norm 2.5292 (2.7160)	mem 20675MB
[2025-04-03 01:37:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][478/573]	eta 0:01:23 lr 0.001051	time 0.8769 (0.8821)	loss 0.3834 (0.5285)	grad_norm 3.8459 (2.7199)	mem 20675MB
[2025-04-03 01:37:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][480/573]	eta 0:01:22 lr 0.001051	time 0.8770 (0.8821)	loss 0.5326 (0.5283)	grad_norm 4.0974 (2.7234)	mem 20675MB
[2025-04-03 01:37:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][482/573]	eta 0:01:20 lr 0.001051	time 0.8773 (0.8820)	loss 0.6190 (0.5285)	grad_norm 2.1508 (2.7216)	mem 20675MB
[2025-04-03 01:37:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][484/573]	eta 0:01:18 lr 0.001051	time 0.8769 (0.8820)	loss 0.5870 (0.5286)	grad_norm 2.9608 (2.7211)	mem 20675MB
[2025-04-03 01:37:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][486/573]	eta 0:01:16 lr 0.001051	time 0.8772 (0.8820)	loss 0.5706 (0.5287)	grad_norm 2.2169 (2.7201)	mem 20675MB
[2025-04-03 01:37:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][488/573]	eta 0:01:14 lr 0.001050	time 0.8770 (0.8820)	loss 0.4168 (0.5284)	grad_norm 3.9741 (2.7230)	mem 20675MB
[2025-04-03 01:37:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][490/573]	eta 0:01:13 lr 0.001050	time 0.8773 (0.8820)	loss 0.5345 (0.5284)	grad_norm 2.3950 (2.7198)	mem 20675MB
[2025-04-03 01:37:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][492/573]	eta 0:01:11 lr 0.001050	time 0.8772 (0.8820)	loss 0.4654 (0.5285)	grad_norm 6.5690 (2.7270)	mem 20675MB
[2025-04-03 01:37:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][494/573]	eta 0:01:09 lr 0.001050	time 0.8770 (0.8819)	loss 0.5925 (0.5286)	grad_norm 2.7181 (2.7255)	mem 20675MB
[2025-04-03 01:37:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][496/573]	eta 0:01:07 lr 0.001050	time 0.8773 (0.8819)	loss 0.6075 (0.5287)	grad_norm 2.8764 (2.7245)	mem 20675MB
[2025-04-03 01:37:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][498/573]	eta 0:01:06 lr 0.001050	time 0.8771 (0.8819)	loss 0.5773 (0.5289)	grad_norm 2.6571 (2.7232)	mem 20675MB
[2025-04-03 01:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][500/573]	eta 0:01:04 lr 0.001049	time 0.8771 (0.8819)	loss 0.4633 (0.5286)	grad_norm 5.6827 (2.7299)	mem 20675MB
[2025-04-03 01:37:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][502/573]	eta 0:01:02 lr 0.001049	time 0.8772 (0.8819)	loss 0.4689 (0.5286)	grad_norm 4.9552 (2.7354)	mem 20675MB
[2025-04-03 01:37:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][504/573]	eta 0:01:00 lr 0.001049	time 0.8773 (0.8819)	loss 0.5144 (0.5283)	grad_norm 3.3591 (2.7358)	mem 20675MB
[2025-04-03 01:37:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][506/573]	eta 0:00:59 lr 0.001049	time 0.8773 (0.8819)	loss 0.6268 (0.5285)	grad_norm 2.6543 (2.7354)	mem 20675MB
[2025-04-03 01:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][508/573]	eta 0:00:57 lr 0.001049	time 0.8772 (0.8818)	loss 0.4293 (0.5284)	grad_norm 3.5048 (2.7383)	mem 20675MB
[2025-04-03 01:37:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][510/573]	eta 0:00:55 lr 0.001049	time 0.8769 (0.8818)	loss 0.4716 (0.5286)	grad_norm 2.1746 (2.7379)	mem 20675MB
[2025-04-03 01:37:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][512/573]	eta 0:00:53 lr 0.001048	time 0.8771 (0.8818)	loss 0.5909 (0.5287)	grad_norm 1.6964 (2.7347)	mem 20675MB
[2025-04-03 01:37:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][514/573]	eta 0:00:52 lr 0.001048	time 0.8772 (0.8818)	loss 0.5544 (0.5284)	grad_norm 1.5032 (2.7324)	mem 20675MB
[2025-04-03 01:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][516/573]	eta 0:00:50 lr 0.001048	time 0.8770 (0.8818)	loss 0.5850 (0.5286)	grad_norm 1.8024 (2.7280)	mem 20675MB
[2025-04-03 01:37:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][518/573]	eta 0:00:48 lr 0.001048	time 0.8770 (0.8818)	loss 0.5882 (0.5288)	grad_norm 2.2347 (2.7245)	mem 20675MB
[2025-04-03 01:37:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][520/573]	eta 0:00:46 lr 0.001048	time 0.8769 (0.8818)	loss 0.6179 (0.5288)	grad_norm 2.5437 (2.7236)	mem 20675MB
[2025-04-03 01:37:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][522/573]	eta 0:00:44 lr 0.001048	time 0.8774 (0.8817)	loss 0.5471 (0.5288)	grad_norm 2.1709 (2.7212)	mem 20675MB
[2025-04-03 01:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][524/573]	eta 0:00:43 lr 0.001047	time 0.8772 (0.8817)	loss 0.3967 (0.5285)	grad_norm 2.4560 (2.7192)	mem 20675MB
[2025-04-03 01:37:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][526/573]	eta 0:00:41 lr 0.001047	time 0.8770 (0.8817)	loss 0.5251 (0.5284)	grad_norm 1.5758 (2.7194)	mem 20675MB
[2025-04-03 01:37:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][528/573]	eta 0:00:39 lr 0.001047	time 0.8770 (0.8817)	loss 0.5675 (0.5283)	grad_norm 2.4019 (2.7199)	mem 20675MB
[2025-04-03 01:37:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][530/573]	eta 0:00:37 lr 0.001047	time 0.8769 (0.8817)	loss 0.5084 (0.5283)	grad_norm 2.3937 (2.7188)	mem 20675MB
[2025-04-03 01:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][532/573]	eta 0:00:36 lr 0.001047	time 0.8771 (0.8817)	loss 0.5916 (0.5284)	grad_norm 2.2776 (2.7170)	mem 20675MB
[2025-04-03 01:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][534/573]	eta 0:00:34 lr 0.001047	time 0.8769 (0.8817)	loss 0.5604 (0.5283)	grad_norm 2.4794 (2.7178)	mem 20675MB
[2025-04-03 01:37:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][536/573]	eta 0:00:32 lr 0.001046	time 0.8770 (0.8816)	loss 0.4405 (0.5281)	grad_norm 2.7270 (2.7177)	mem 20675MB
[2025-04-03 01:38:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][538/573]	eta 0:00:30 lr 0.001046	time 0.8770 (0.8816)	loss 0.4739 (0.5283)	grad_norm 2.3302 (2.7211)	mem 20675MB
[2025-04-03 01:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][540/573]	eta 0:00:29 lr 0.001046	time 0.8773 (0.8816)	loss 0.4286 (0.5281)	grad_norm 2.6450 (2.7193)	mem 20675MB
[2025-04-03 01:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][542/573]	eta 0:00:27 lr 0.001046	time 0.8771 (0.8816)	loss 0.5046 (0.5282)	grad_norm 3.2292 (2.7213)	mem 20675MB
[2025-04-03 01:38:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][544/573]	eta 0:00:25 lr 0.001046	time 0.8772 (0.8816)	loss 0.5493 (0.5285)	grad_norm 2.1144 (2.7206)	mem 20675MB
[2025-04-03 01:38:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][546/573]	eta 0:00:23 lr 0.001046	time 0.8771 (0.8816)	loss 0.5839 (0.5287)	grad_norm 2.0311 (2.7188)	mem 20675MB
[2025-04-03 01:38:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][548/573]	eta 0:00:22 lr 0.001045	time 0.8781 (0.8816)	loss 0.4290 (0.5285)	grad_norm 3.4452 (2.7209)	mem 20675MB
[2025-04-03 01:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][550/573]	eta 0:00:20 lr 0.001045	time 0.8775 (0.8816)	loss 0.6155 (0.5286)	grad_norm 3.3321 (2.7205)	mem 20675MB
[2025-04-03 01:38:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][552/573]	eta 0:00:18 lr 0.001045	time 0.8774 (0.8816)	loss 0.6286 (0.5288)	grad_norm 1.9199 (2.7199)	mem 20675MB
[2025-04-03 01:38:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][554/573]	eta 0:00:16 lr 0.001045	time 0.8775 (0.8815)	loss 0.5992 (0.5289)	grad_norm 2.0499 (2.7195)	mem 20675MB
[2025-04-03 01:38:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][556/573]	eta 0:00:14 lr 0.001045	time 0.8770 (0.8815)	loss 0.5905 (0.5287)	grad_norm 2.8850 (2.7197)	mem 20675MB
[2025-04-03 01:38:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][558/573]	eta 0:00:13 lr 0.001045	time 0.8774 (0.8815)	loss 0.6080 (0.5291)	grad_norm 3.1995 (2.7203)	mem 20675MB
[2025-04-03 01:38:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][560/573]	eta 0:00:11 lr 0.001044	time 0.8773 (0.8815)	loss 0.5825 (0.5289)	grad_norm 1.9732 (2.7240)	mem 20675MB
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][562/573]	eta 0:00:09 lr 0.001044	time 0.8772 (0.8815)	loss 0.5231 (0.5288)	grad_norm 4.0521 (2.7275)	mem 20675MB
[2025-04-03 01:38:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][564/573]	eta 0:00:07 lr 0.001044	time 0.8772 (0.8815)	loss 0.5414 (0.5286)	grad_norm 1.6929 (2.7260)	mem 20675MB
[2025-04-03 01:38:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][566/573]	eta 0:00:06 lr 0.001044	time 0.8771 (0.8815)	loss 0.5213 (0.5288)	grad_norm 2.5836 (2.7261)	mem 20675MB
[2025-04-03 01:38:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][568/573]	eta 0:00:04 lr 0.001044	time 0.8772 (0.8815)	loss 0.5326 (0.5289)	grad_norm 2.1870 (2.7246)	mem 20675MB
[2025-04-03 01:38:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][570/573]	eta 0:00:02 lr 0.001044	time 0.8770 (0.8814)	loss 0.6254 (0.5291)	grad_norm 2.1191 (2.7232)	mem 20675MB
[2025-04-03 01:38:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][572/573]	eta 0:00:00 lr 0.001043	time 0.8777 (0.8814)	loss 0.4971 (0.5291)	grad_norm 5.0250 (2.7267)	mem 20675MB
[2025-04-03 01:38:31 simmim_finetune] (main_finetune.py 260): INFO EPOCH 7 training takes 0:08:25
[2025-04-03 01:38:33 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.928 (1.928)	Loss 0.4045 (0.4045)	Acc@1 87.500 (87.500)	Mem 20675MB
[2025-04-03 01:38:34 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.832)	Loss 0.3900 (0.3954)	Acc@1 86.719 (86.458)	Mem 20675MB
[2025-04-03 01:38:34 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.613)	Loss 0.4168 (0.3975)	Acc@1 85.156 (86.250)	Mem 20675MB
[2025-04-03 01:38:35 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.284 (0.519)	Loss 0.4232 (0.3974)	Acc@1 82.031 (85.491)	Mem 20675MB
[2025-04-03 01:38:35 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.467)	Loss 0.7440 (0.4455)	Acc@1 54.688 (80.816)	Mem 20675MB
[2025-04-03 01:38:36 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.433)	Loss 0.6222 (0.4811)	Acc@1 68.750 (78.480)	Mem 20675MB
[2025-04-03 01:38:36 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.410)	Loss 0.6874 (0.5104)	Acc@1 60.156 (75.901)	Mem 20675MB
[2025-04-03 01:38:37 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.394)	Loss 0.6195 (0.5250)	Acc@1 67.188 (74.688)	Mem 20675MB
[2025-04-03 01:38:37 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 74.546
[2025-04-03 01:38:37 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 74.5%
[2025-04-03 01:38:37 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 74.90%
[2025-04-03 01:38:37 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.8986052019145524e-06, 3.8986052019145524e-06, 5.975593242086464e-06, 5.975593242086464e-06, 9.170959457735556e-06, 9.170959457735556e-06, 1.4086907481811087e-05, 1.4086907481811087e-05, 2.1649904441927282e-05, 2.1649904441927282e-05, 3.328528438056759e-05, 3.328528438056759e-05, 5.118586890155266e-05, 5.118586890155266e-05, 7.872522970306816e-05, 7.872522970306816e-05, 0.00012109347709001509, 0.00012109347709001509, 0.00018627539614685654, 0.00018627539614685654, 0.00028655527161892025, 0.00028655527161892025, 0.0004408320031144029, 0.0004408320031144029, 0.000678180820799761, 0.000678180820799761, 0.0010433328480080038, 0.0010433328480080038]
[2025-04-03 01:38:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][0/573]	eta 0:21:15 lr 0.001043	time 2.2254 (2.2254)	loss 0.6269 (0.6269)	grad_norm 3.6352 (3.6352)	mem 20675MB
[2025-04-03 01:38:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][2/573]	eta 0:12:38 lr 0.001043	time 0.8774 (1.3279)	loss 0.6637 (0.6118)	grad_norm 2.9758 (2.8732)	mem 20675MB
[2025-04-03 01:38:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][4/573]	eta 0:10:53 lr 0.001043	time 0.8773 (1.1480)	loss 0.5341 (0.5592)	grad_norm 2.2385 (2.8168)	mem 20675MB
[2025-04-03 01:38:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][6/573]	eta 0:10:07 lr 0.001043	time 0.8776 (1.0710)	loss 0.6269 (0.5777)	grad_norm 1.4614 (2.5681)	mem 20675MB
[2025-04-03 01:38:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][8/573]	eta 0:09:40 lr 0.001043	time 0.8773 (1.0283)	loss 0.4157 (0.5632)	grad_norm 3.4931 (2.6153)	mem 20675MB
[2025-04-03 01:38:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][10/573]	eta 0:09:23 lr 0.001042	time 0.8774 (1.0010)	loss 0.4544 (0.5497)	grad_norm 2.7666 (2.6260)	mem 20675MB
[2025-04-03 01:38:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][12/573]	eta 0:09:10 lr 0.001042	time 0.8775 (0.9821)	loss 0.5158 (0.5472)	grad_norm 1.9001 (2.5284)	mem 20675MB
[2025-04-03 01:38:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][14/573]	eta 0:09:01 lr 0.001042	time 0.8771 (0.9683)	loss 0.5514 (0.5493)	grad_norm 2.4309 (2.5051)	mem 20675MB
[2025-04-03 01:38:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][16/573]	eta 0:08:53 lr 0.001042	time 0.8773 (0.9577)	loss 0.5270 (0.5406)	grad_norm 3.5864 (2.5705)	mem 20675MB
[2025-04-03 01:38:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][18/573]	eta 0:08:46 lr 0.001042	time 0.8782 (0.9494)	loss 0.6116 (0.5452)	grad_norm 3.1759 (2.6636)	mem 20675MB
[2025-04-03 01:38:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][20/573]	eta 0:08:41 lr 0.001042	time 0.8771 (0.9426)	loss 0.5289 (0.5453)	grad_norm 5.2335 (2.7858)	mem 20675MB
[2025-04-03 01:38:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][22/573]	eta 0:08:36 lr 0.001041	time 0.8770 (0.9369)	loss 0.3670 (0.5308)	grad_norm 3.7484 (2.8659)	mem 20675MB
[2025-04-03 01:39:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][24/573]	eta 0:08:31 lr 0.001041	time 0.8774 (0.9322)	loss 0.4708 (0.5255)	grad_norm 2.9046 (3.0001)	mem 20675MB
[2025-04-03 01:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][26/573]	eta 0:08:27 lr 0.001041	time 0.8781 (0.9283)	loss 0.5284 (0.5281)	grad_norm 4.1331 (3.0087)	mem 20675MB
[2025-04-03 01:39:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][28/573]	eta 0:08:24 lr 0.001041	time 0.8773 (0.9248)	loss 0.5861 (0.5296)	grad_norm 3.3479 (3.0060)	mem 20675MB
[2025-04-03 01:39:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][30/573]	eta 0:08:20 lr 0.001041	time 0.8775 (0.9218)	loss 0.5925 (0.5334)	grad_norm 2.2171 (2.9426)	mem 20675MB
[2025-04-03 01:39:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][32/573]	eta 0:08:17 lr 0.001041	time 0.8780 (0.9192)	loss 0.6072 (0.5304)	grad_norm 2.5874 (2.9343)	mem 20675MB
[2025-04-03 01:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][34/573]	eta 0:08:14 lr 0.001040	time 0.8774 (0.9169)	loss 0.5158 (0.5330)	grad_norm 2.9019 (2.9372)	mem 20675MB
[2025-04-03 01:39:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][36/573]	eta 0:08:11 lr 0.001040	time 0.8772 (0.9148)	loss 0.4122 (0.5309)	grad_norm 3.3971 (2.9347)	mem 20675MB
[2025-04-03 01:39:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][38/573]	eta 0:08:08 lr 0.001040	time 0.8775 (0.9129)	loss 0.6081 (0.5328)	grad_norm 2.0773 (2.8929)	mem 20675MB
[2025-04-03 01:39:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][40/573]	eta 0:08:05 lr 0.001040	time 0.8773 (0.9112)	loss 0.3707 (0.5292)	grad_norm 3.2262 (2.8635)	mem 20675MB
[2025-04-03 01:39:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][42/573]	eta 0:08:03 lr 0.001040	time 0.8772 (0.9097)	loss 0.5775 (0.5309)	grad_norm 2.0092 (2.8189)	mem 20675MB
[2025-04-03 01:39:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][44/573]	eta 0:08:00 lr 0.001040	time 0.8777 (0.9083)	loss 0.5629 (0.5308)	grad_norm 2.9343 (2.8148)	mem 20675MB
[2025-04-03 01:39:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][46/573]	eta 0:07:58 lr 0.001039	time 0.8774 (0.9070)	loss 0.4122 (0.5297)	grad_norm 2.9008 (2.7961)	mem 20675MB
[2025-04-03 01:39:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][48/573]	eta 0:07:55 lr 0.001039	time 0.8771 (0.9058)	loss 0.5137 (0.5298)	grad_norm 2.1282 (2.7617)	mem 20675MB
[2025-04-03 01:39:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][50/573]	eta 0:07:53 lr 0.001039	time 0.8774 (0.9048)	loss 0.6050 (0.5316)	grad_norm 1.9920 (2.7201)	mem 20675MB
[2025-04-03 01:39:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][52/573]	eta 0:07:50 lr 0.001039	time 0.8775 (0.9038)	loss 0.5471 (0.5328)	grad_norm 2.3990 (2.7073)	mem 20675MB
[2025-04-03 01:39:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][54/573]	eta 0:07:48 lr 0.001039	time 0.8770 (0.9028)	loss 0.4717 (0.5323)	grad_norm 2.3254 (2.6864)	mem 20675MB
[2025-04-03 01:39:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][56/573]	eta 0:07:46 lr 0.001038	time 0.8782 (0.9020)	loss 0.5866 (0.5334)	grad_norm 2.2134 (2.6772)	mem 20675MB
[2025-04-03 01:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][58/573]	eta 0:07:44 lr 0.001038	time 0.8775 (0.9012)	loss 0.4899 (0.5336)	grad_norm 2.6005 (2.6744)	mem 20675MB
[2025-04-03 01:39:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][60/573]	eta 0:07:41 lr 0.001038	time 0.8774 (0.9004)	loss 0.4647 (0.5326)	grad_norm 2.0647 (2.6618)	mem 20675MB
[2025-04-03 01:39:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][62/573]	eta 0:07:39 lr 0.001038	time 0.8771 (0.8997)	loss 0.3945 (0.5315)	grad_norm 4.6406 (2.6850)	mem 20675MB
[2025-04-03 01:39:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][64/573]	eta 0:07:37 lr 0.001038	time 0.8777 (0.8991)	loss 0.4527 (0.5309)	grad_norm 2.3667 (2.6861)	mem 20675MB
[2025-04-03 01:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][66/573]	eta 0:07:35 lr 0.001038	time 0.8774 (0.8985)	loss 0.4711 (0.5304)	grad_norm 2.3572 (2.6795)	mem 20675MB
[2025-04-03 01:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][68/573]	eta 0:07:33 lr 0.001037	time 0.8773 (0.8979)	loss 0.5040 (0.5303)	grad_norm 3.4883 (2.6870)	mem 20675MB
[2025-04-03 01:39:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][70/573]	eta 0:07:31 lr 0.001037	time 0.8774 (0.8973)	loss 0.5531 (0.5318)	grad_norm 2.3170 (2.6845)	mem 20675MB
[2025-04-03 01:39:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][72/573]	eta 0:07:29 lr 0.001037	time 0.8772 (0.8968)	loss 0.5591 (0.5324)	grad_norm 2.3252 (2.6667)	mem 20675MB
[2025-04-03 01:39:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][74/573]	eta 0:07:27 lr 0.001037	time 0.8770 (0.8963)	loss 0.5619 (0.5333)	grad_norm 1.9369 (2.6508)	mem 20675MB
[2025-04-03 01:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][76/573]	eta 0:07:25 lr 0.001037	time 0.8770 (0.8958)	loss 0.6143 (0.5347)	grad_norm 2.8570 (2.6435)	mem 20675MB
[2025-04-03 01:39:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][78/573]	eta 0:07:23 lr 0.001037	time 0.8773 (0.8954)	loss 0.5154 (0.5352)	grad_norm 2.1576 (2.6460)	mem 20675MB
[2025-04-03 01:39:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][80/573]	eta 0:07:21 lr 0.001036	time 0.8771 (0.8950)	loss 0.4888 (0.5332)	grad_norm 3.6757 (2.6574)	mem 20675MB
[2025-04-03 01:39:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][82/573]	eta 0:07:19 lr 0.001036	time 0.8775 (0.8946)	loss 0.6049 (0.5346)	grad_norm 1.9987 (2.6389)	mem 20675MB
[2025-04-03 01:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][84/573]	eta 0:07:17 lr 0.001036	time 0.8773 (0.8942)	loss 0.5361 (0.5336)	grad_norm 2.2037 (2.6275)	mem 20675MB
[2025-04-03 01:39:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][86/573]	eta 0:07:15 lr 0.001036	time 0.8774 (0.8938)	loss 0.5463 (0.5347)	grad_norm 2.0914 (2.6133)	mem 20675MB
[2025-04-03 01:39:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][88/573]	eta 0:07:13 lr 0.001036	time 0.8774 (0.8935)	loss 0.4717 (0.5346)	grad_norm 2.1948 (2.6139)	mem 20675MB
[2025-04-03 01:39:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][90/573]	eta 0:07:11 lr 0.001036	time 0.8773 (0.8931)	loss 0.5276 (0.5352)	grad_norm 2.1761 (2.5994)	mem 20675MB
[2025-04-03 01:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][92/573]	eta 0:07:09 lr 0.001035	time 0.8775 (0.8928)	loss 0.6072 (0.5345)	grad_norm 3.9611 (2.6329)	mem 20675MB
[2025-04-03 01:40:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][94/573]	eta 0:07:07 lr 0.001035	time 0.8774 (0.8925)	loss 0.5584 (0.5334)	grad_norm 2.7266 (2.6413)	mem 20675MB
[2025-04-03 01:40:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][96/573]	eta 0:07:05 lr 0.001035	time 0.8772 (0.8922)	loss 0.5504 (0.5338)	grad_norm 3.6292 (2.6453)	mem 20675MB
[2025-04-03 01:40:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][98/573]	eta 0:07:03 lr 0.001035	time 0.8771 (0.8919)	loss 0.5543 (0.5339)	grad_norm 4.0796 (2.6808)	mem 20675MB
[2025-04-03 01:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][100/573]	eta 0:07:01 lr 0.001035	time 0.8773 (0.8916)	loss 0.4641 (0.5337)	grad_norm 2.1185 (2.6780)	mem 20675MB
[2025-04-03 01:40:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][102/573]	eta 0:06:59 lr 0.001035	time 0.8772 (0.8914)	loss 0.5598 (0.5341)	grad_norm 1.9920 (2.6711)	mem 20675MB
[2025-04-03 01:40:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][104/573]	eta 0:06:57 lr 0.001034	time 0.8775 (0.8911)	loss 0.5133 (0.5342)	grad_norm 2.0250 (2.6556)	mem 20675MB
[2025-04-03 01:40:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][106/573]	eta 0:06:56 lr 0.001034	time 0.8773 (0.8909)	loss 0.5581 (0.5353)	grad_norm 2.1864 (2.6490)	mem 20675MB
[2025-04-03 01:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][108/573]	eta 0:06:54 lr 0.001034	time 0.8771 (0.8907)	loss 0.5435 (0.5361)	grad_norm 1.7204 (2.6270)	mem 20675MB
[2025-04-03 01:40:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][110/573]	eta 0:06:52 lr 0.001034	time 0.8774 (0.8904)	loss 0.4880 (0.5353)	grad_norm 2.4066 (2.6183)	mem 20675MB
[2025-04-03 01:40:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][112/573]	eta 0:06:50 lr 0.001034	time 0.8777 (0.8902)	loss 0.5382 (0.5355)	grad_norm 2.1393 (2.6081)	mem 20675MB
[2025-04-03 01:40:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][114/573]	eta 0:06:48 lr 0.001033	time 0.8774 (0.8900)	loss 0.6402 (0.5374)	grad_norm 3.0424 (2.6094)	mem 20675MB
[2025-04-03 01:40:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][116/573]	eta 0:06:46 lr 0.001033	time 0.8781 (0.8898)	loss 0.5864 (0.5376)	grad_norm 2.2782 (2.6083)	mem 20675MB
[2025-04-03 01:40:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][118/573]	eta 0:06:44 lr 0.001033	time 0.8770 (0.8896)	loss 0.4648 (0.5379)	grad_norm 2.5441 (2.6115)	mem 20675MB
[2025-04-03 01:40:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][120/573]	eta 0:06:42 lr 0.001033	time 0.8775 (0.8894)	loss 0.5048 (0.5379)	grad_norm 1.8390 (2.5966)	mem 20675MB
[2025-04-03 01:40:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][122/573]	eta 0:06:41 lr 0.001033	time 0.8775 (0.8893)	loss 0.6001 (0.5379)	grad_norm 2.7643 (2.5955)	mem 20675MB
[2025-04-03 01:40:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][124/573]	eta 0:06:39 lr 0.001033	time 0.8776 (0.8891)	loss 0.4596 (0.5369)	grad_norm 3.8390 (2.6037)	mem 20675MB
[2025-04-03 01:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][126/573]	eta 0:06:37 lr 0.001032	time 0.8774 (0.8889)	loss 0.4910 (0.5354)	grad_norm 1.6074 (2.5978)	mem 20675MB
[2025-04-03 01:40:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][128/573]	eta 0:06:35 lr 0.001032	time 0.8772 (0.8888)	loss 0.6032 (0.5352)	grad_norm 3.3149 (2.6001)	mem 20675MB
[2025-04-03 01:40:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][130/573]	eta 0:06:33 lr 0.001032	time 0.8773 (0.8886)	loss 0.5907 (0.5359)	grad_norm 3.2152 (2.6036)	mem 20675MB
[2025-04-03 01:40:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][132/573]	eta 0:06:31 lr 0.001032	time 0.8773 (0.8884)	loss 0.5885 (0.5367)	grad_norm 3.0903 (2.6021)	mem 20675MB
[2025-04-03 01:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][134/573]	eta 0:06:29 lr 0.001032	time 0.8772 (0.8883)	loss 0.3931 (0.5358)	grad_norm 3.8980 (2.6116)	mem 20675MB
[2025-04-03 01:40:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][136/573]	eta 0:06:28 lr 0.001032	time 0.8772 (0.8881)	loss 0.5906 (0.5365)	grad_norm 2.4250 (2.6053)	mem 20675MB
[2025-04-03 01:40:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][138/573]	eta 0:06:26 lr 0.001031	time 0.8774 (0.8880)	loss 0.5173 (0.5369)	grad_norm 2.0609 (2.5967)	mem 20675MB
[2025-04-03 01:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][140/573]	eta 0:06:24 lr 0.001031	time 0.8775 (0.8879)	loss 0.4996 (0.5363)	grad_norm 2.9113 (2.5973)	mem 20675MB
[2025-04-03 01:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][142/573]	eta 0:06:22 lr 0.001031	time 0.8775 (0.8877)	loss 0.5699 (0.5372)	grad_norm 2.5957 (2.5923)	mem 20675MB
[2025-04-03 01:40:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][144/573]	eta 0:06:20 lr 0.001031	time 0.8776 (0.8876)	loss 0.4408 (0.5359)	grad_norm 3.0467 (2.5967)	mem 20675MB
[2025-04-03 01:40:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][146/573]	eta 0:06:18 lr 0.001031	time 0.8774 (0.8875)	loss 0.4088 (0.5346)	grad_norm 2.9057 (2.6090)	mem 20675MB
[2025-04-03 01:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][148/573]	eta 0:06:17 lr 0.001031	time 0.8773 (0.8874)	loss 0.5760 (0.5351)	grad_norm 2.1844 (2.6062)	mem 20675MB
[2025-04-03 01:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][150/573]	eta 0:06:15 lr 0.001030	time 0.8775 (0.8872)	loss 0.6779 (0.5360)	grad_norm 2.9249 (2.6169)	mem 20675MB
[2025-04-03 01:40:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][152/573]	eta 0:06:13 lr 0.001030	time 0.8775 (0.8871)	loss 0.5539 (0.5360)	grad_norm 2.8332 (2.6204)	mem 20675MB
[2025-04-03 01:40:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][154/573]	eta 0:06:11 lr 0.001030	time 0.8773 (0.8870)	loss 0.3995 (0.5352)	grad_norm 4.3676 (2.6300)	mem 20675MB
[2025-04-03 01:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][156/573]	eta 0:06:09 lr 0.001030	time 0.8772 (0.8869)	loss 0.5996 (0.5356)	grad_norm 2.1488 (2.6355)	mem 20675MB
[2025-04-03 01:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][158/573]	eta 0:06:08 lr 0.001030	time 0.8774 (0.8868)	loss 0.5785 (0.5360)	grad_norm 2.2901 (2.6323)	mem 20675MB
[2025-04-03 01:41:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][160/573]	eta 0:06:06 lr 0.001029	time 0.8774 (0.8867)	loss 0.4443 (0.5347)	grad_norm 3.5781 (2.6478)	mem 20675MB
[2025-04-03 01:41:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][162/573]	eta 0:06:04 lr 0.001029	time 0.8775 (0.8866)	loss 0.5689 (0.5349)	grad_norm 2.1811 (2.6481)	mem 20675MB
[2025-04-03 01:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][164/573]	eta 0:06:02 lr 0.001029	time 0.8774 (0.8865)	loss 0.5537 (0.5346)	grad_norm 1.9385 (2.6431)	mem 20675MB
[2025-04-03 01:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][166/573]	eta 0:06:00 lr 0.001029	time 0.8786 (0.8864)	loss 0.5236 (0.5345)	grad_norm 3.1758 (2.6429)	mem 20675MB
[2025-04-03 01:41:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][168/573]	eta 0:05:58 lr 0.001029	time 0.8771 (0.8863)	loss 0.5591 (0.5341)	grad_norm 2.3655 (2.6467)	mem 20675MB
[2025-04-03 01:41:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][170/573]	eta 0:05:57 lr 0.001029	time 0.8775 (0.8862)	loss 0.5846 (0.5345)	grad_norm 2.5702 (2.6398)	mem 20675MB
[2025-04-03 01:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][172/573]	eta 0:05:55 lr 0.001028	time 0.8774 (0.8861)	loss 0.5310 (0.5337)	grad_norm 2.0586 (2.6386)	mem 20675MB
[2025-04-03 01:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][174/573]	eta 0:05:53 lr 0.001028	time 0.8773 (0.8860)	loss 0.4038 (0.5333)	grad_norm 3.4919 (2.6406)	mem 20675MB
[2025-04-03 01:41:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][176/573]	eta 0:05:51 lr 0.001028	time 0.8772 (0.8859)	loss 0.5057 (0.5333)	grad_norm 2.6970 (2.6417)	mem 20675MB
[2025-04-03 01:41:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][178/573]	eta 0:05:49 lr 0.001028	time 0.8771 (0.8859)	loss 0.5852 (0.5339)	grad_norm 2.6567 (2.6406)	mem 20675MB
[2025-04-03 01:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][180/573]	eta 0:05:48 lr 0.001028	time 0.8776 (0.8858)	loss 0.4794 (0.5337)	grad_norm 3.1398 (2.6412)	mem 20675MB
[2025-04-03 01:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][182/573]	eta 0:05:46 lr 0.001028	time 0.8777 (0.8857)	loss 0.3634 (0.5326)	grad_norm 2.4420 (2.6434)	mem 20675MB
[2025-04-03 01:41:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][184/573]	eta 0:05:44 lr 0.001027	time 0.8777 (0.8856)	loss 0.5713 (0.5326)	grad_norm 1.4369 (2.6333)	mem 20675MB
[2025-04-03 01:41:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][186/573]	eta 0:05:42 lr 0.001027	time 0.8775 (0.8855)	loss 0.5366 (0.5321)	grad_norm 2.9106 (2.6331)	mem 20675MB
[2025-04-03 01:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][188/573]	eta 0:05:40 lr 0.001027	time 0.8777 (0.8855)	loss 0.4029 (0.5315)	grad_norm 2.8784 (2.6327)	mem 20675MB
[2025-04-03 01:41:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][190/573]	eta 0:05:39 lr 0.001027	time 0.8777 (0.8854)	loss 0.5072 (0.5313)	grad_norm 3.6782 (2.6383)	mem 20675MB
[2025-04-03 01:41:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][192/573]	eta 0:05:37 lr 0.001027	time 0.8775 (0.8853)	loss 0.5433 (0.5318)	grad_norm 2.4465 (2.6364)	mem 20675MB
[2025-04-03 01:41:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][194/573]	eta 0:05:35 lr 0.001027	time 0.8775 (0.8853)	loss 0.4500 (0.5314)	grad_norm 3.5586 (2.6522)	mem 20675MB
[2025-04-03 01:41:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][196/573]	eta 0:05:33 lr 0.001026	time 0.8775 (0.8852)	loss 0.5780 (0.5314)	grad_norm 2.7087 (2.6487)	mem 20675MB
[2025-04-03 01:41:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][198/573]	eta 0:05:31 lr 0.001026	time 0.8773 (0.8851)	loss 0.4410 (0.5312)	grad_norm 6.4051 (2.6657)	mem 20675MB
[2025-04-03 01:41:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][200/573]	eta 0:05:30 lr 0.001026	time 0.8770 (0.8850)	loss 0.4311 (0.5309)	grad_norm 3.4446 (2.6757)	mem 20675MB
[2025-04-03 01:41:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][202/573]	eta 0:05:28 lr 0.001026	time 0.8771 (0.8850)	loss 0.6105 (0.5314)	grad_norm 2.0185 (2.6688)	mem 20675MB
[2025-04-03 01:41:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][204/573]	eta 0:05:26 lr 0.001026	time 0.8773 (0.8849)	loss 0.5263 (0.5314)	grad_norm 5.7003 (2.6819)	mem 20675MB
[2025-04-03 01:41:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][206/573]	eta 0:05:24 lr 0.001025	time 0.8779 (0.8849)	loss 0.5414 (0.5314)	grad_norm 1.6122 (2.6733)	mem 20675MB
[2025-04-03 01:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][208/573]	eta 0:05:22 lr 0.001025	time 0.8777 (0.8849)	loss 0.5295 (0.5308)	grad_norm 2.2610 (2.6817)	mem 20675MB
[2025-04-03 01:41:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][210/573]	eta 0:05:21 lr 0.001025	time 0.8773 (0.8848)	loss 0.5473 (0.5311)	grad_norm 2.3385 (2.6813)	mem 20675MB
[2025-04-03 01:41:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][212/573]	eta 0:05:19 lr 0.001025	time 0.8780 (0.8848)	loss 0.5619 (0.5305)	grad_norm 2.2022 (2.6792)	mem 20675MB
[2025-04-03 01:41:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][214/573]	eta 0:05:17 lr 0.001025	time 0.8778 (0.8847)	loss 0.5382 (0.5304)	grad_norm 2.3423 (2.6857)	mem 20675MB
[2025-04-03 01:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][216/573]	eta 0:05:15 lr 0.001025	time 0.8775 (0.8847)	loss 0.5416 (0.5303)	grad_norm 1.9611 (2.6827)	mem 20675MB
[2025-04-03 01:41:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][218/573]	eta 0:05:14 lr 0.001024	time 0.8777 (0.8847)	loss 0.5300 (0.5304)	grad_norm 1.7780 (2.6758)	mem 20675MB
[2025-04-03 01:41:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][220/573]	eta 0:05:12 lr 0.001024	time 0.8774 (0.8846)	loss 0.5593 (0.5308)	grad_norm 1.5708 (2.6698)	mem 20675MB
[2025-04-03 01:41:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][222/573]	eta 0:05:10 lr 0.001024	time 0.8777 (0.8846)	loss 0.4570 (0.5302)	grad_norm 2.5583 (2.6678)	mem 20675MB
[2025-04-03 01:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][224/573]	eta 0:05:08 lr 0.001024	time 0.8774 (0.8845)	loss 0.5124 (0.5304)	grad_norm 1.9444 (2.6663)	mem 20675MB
[2025-04-03 01:41:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][226/573]	eta 0:05:06 lr 0.001024	time 0.8773 (0.8845)	loss 0.4466 (0.5302)	grad_norm 3.2917 (2.6774)	mem 20675MB
[2025-04-03 01:42:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][228/573]	eta 0:05:05 lr 0.001024	time 0.8773 (0.8844)	loss 0.5485 (0.5305)	grad_norm 1.5973 (2.6675)	mem 20675MB
[2025-04-03 01:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][230/573]	eta 0:05:03 lr 0.001023	time 0.8777 (0.8844)	loss 0.5608 (0.5306)	grad_norm 1.7093 (2.6630)	mem 20675MB
[2025-04-03 01:42:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][232/573]	eta 0:05:01 lr 0.001023	time 0.8771 (0.8843)	loss 0.5697 (0.5309)	grad_norm 2.7666 (2.6616)	mem 20675MB
[2025-04-03 01:42:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][234/573]	eta 0:04:59 lr 0.001023	time 0.8773 (0.8843)	loss 0.4188 (0.5306)	grad_norm 3.6317 (2.6652)	mem 20675MB
[2025-04-03 01:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][236/573]	eta 0:04:57 lr 0.001023	time 0.8775 (0.8842)	loss 0.3586 (0.5299)	grad_norm 4.3830 (2.6702)	mem 20675MB
[2025-04-03 01:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][238/573]	eta 0:04:56 lr 0.001023	time 0.8774 (0.8842)	loss 0.4461 (0.5292)	grad_norm 2.4463 (2.6723)	mem 20675MB
[2025-04-03 01:42:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][240/573]	eta 0:04:54 lr 0.001022	time 0.8774 (0.8841)	loss 0.5460 (0.5291)	grad_norm 4.1731 (2.6840)	mem 20675MB
[2025-04-03 01:42:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][242/573]	eta 0:04:52 lr 0.001022	time 0.8774 (0.8841)	loss 0.4846 (0.5291)	grad_norm 2.8139 (2.6850)	mem 20675MB
[2025-04-03 01:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][244/573]	eta 0:04:50 lr 0.001022	time 0.8773 (0.8840)	loss 0.5953 (0.5291)	grad_norm 3.2826 (2.6908)	mem 20675MB
[2025-04-03 01:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][246/573]	eta 0:04:49 lr 0.001022	time 0.8772 (0.8840)	loss 0.4944 (0.5291)	grad_norm 1.8327 (2.6865)	mem 20675MB
[2025-04-03 01:42:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][248/573]	eta 0:04:47 lr 0.001022	time 0.8775 (0.8839)	loss 0.5246 (0.5291)	grad_norm 1.8844 (2.6839)	mem 20675MB
[2025-04-03 01:42:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][250/573]	eta 0:04:45 lr 0.001022	time 0.8771 (0.8839)	loss 0.4805 (0.5292)	grad_norm 2.9650 (2.6856)	mem 20675MB
[2025-04-03 01:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][252/573]	eta 0:04:43 lr 0.001021	time 0.8774 (0.8838)	loss 0.6025 (0.5295)	grad_norm 2.1380 (2.6800)	mem 20675MB
[2025-04-03 01:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][254/573]	eta 0:04:41 lr 0.001021	time 0.8773 (0.8838)	loss 0.4670 (0.5292)	grad_norm 2.8861 (2.6800)	mem 20675MB
[2025-04-03 01:42:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][256/573]	eta 0:04:40 lr 0.001021	time 0.8771 (0.8838)	loss 0.5922 (0.5290)	grad_norm 2.5335 (2.6888)	mem 20675MB
[2025-04-03 01:42:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][258/573]	eta 0:04:38 lr 0.001021	time 0.8773 (0.8837)	loss 0.5873 (0.5295)	grad_norm 1.9168 (2.6842)	mem 20675MB
[2025-04-03 01:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][260/573]	eta 0:04:36 lr 0.001021	time 0.8772 (0.8837)	loss 0.5856 (0.5298)	grad_norm 2.0777 (2.6776)	mem 20675MB
[2025-04-03 01:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][262/573]	eta 0:04:34 lr 0.001021	time 0.8771 (0.8836)	loss 0.4981 (0.5290)	grad_norm 2.8072 (2.6805)	mem 20675MB
[2025-04-03 01:42:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][264/573]	eta 0:04:33 lr 0.001020	time 0.8773 (0.8836)	loss 0.3897 (0.5280)	grad_norm 3.5815 (2.6949)	mem 20675MB
[2025-04-03 01:42:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][266/573]	eta 0:04:31 lr 0.001020	time 0.8772 (0.8835)	loss 0.5453 (0.5280)	grad_norm 3.4688 (2.6953)	mem 20675MB
[2025-04-03 01:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][268/573]	eta 0:04:29 lr 0.001020	time 0.8772 (0.8835)	loss 0.5768 (0.5278)	grad_norm 2.7584 (2.7042)	mem 20675MB
[2025-04-03 01:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][270/573]	eta 0:04:27 lr 0.001020	time 0.8770 (0.8835)	loss 0.5041 (0.5279)	grad_norm 2.6222 (2.7061)	mem 20675MB
[2025-04-03 01:42:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][272/573]	eta 0:04:25 lr 0.001020	time 0.8772 (0.8834)	loss 0.4499 (0.5273)	grad_norm 3.0908 (2.7118)	mem 20675MB
[2025-04-03 01:42:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][274/573]	eta 0:04:24 lr 0.001019	time 0.8775 (0.8834)	loss 0.4350 (0.5269)	grad_norm 3.2070 (2.7128)	mem 20675MB
[2025-04-03 01:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][276/573]	eta 0:04:22 lr 0.001019	time 0.8773 (0.8834)	loss 0.5088 (0.5263)	grad_norm 2.5411 (2.7199)	mem 20675MB
[2025-04-03 01:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][278/573]	eta 0:04:20 lr 0.001019	time 0.8773 (0.8833)	loss 0.4689 (0.5256)	grad_norm 2.6065 (2.7235)	mem 20675MB
[2025-04-03 01:42:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][280/573]	eta 0:04:18 lr 0.001019	time 0.8772 (0.8833)	loss 0.5583 (0.5255)	grad_norm 2.7915 (2.7245)	mem 20675MB
[2025-04-03 01:42:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][282/573]	eta 0:04:17 lr 0.001019	time 0.8771 (0.8832)	loss 0.4247 (0.5248)	grad_norm 3.0348 (2.7246)	mem 20675MB
[2025-04-03 01:42:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][284/573]	eta 0:04:15 lr 0.001019	time 0.8774 (0.8832)	loss 0.5482 (0.5248)	grad_norm 2.6882 (2.7276)	mem 20675MB
[2025-04-03 01:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][286/573]	eta 0:04:13 lr 0.001018	time 0.8773 (0.8832)	loss 0.4115 (0.5244)	grad_norm 5.0916 (2.7373)	mem 20675MB
[2025-04-03 01:42:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][288/573]	eta 0:04:11 lr 0.001018	time 0.8774 (0.8832)	loss 0.5533 (0.5243)	grad_norm 2.5890 (2.7400)	mem 20675MB
[2025-04-03 01:42:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][290/573]	eta 0:04:09 lr 0.001018	time 0.8787 (0.8831)	loss 0.5158 (0.5245)	grad_norm 2.0923 (2.7379)	mem 20675MB
[2025-04-03 01:42:56 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][292/573]	eta 0:04:08 lr 0.001018	time 0.8779 (0.8831)	loss 0.4982 (0.5243)	grad_norm 1.7911 (2.7318)	mem 20675MB
[2025-04-03 01:42:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][294/573]	eta 0:04:06 lr 0.001018	time 0.8772 (0.8831)	loss 0.5728 (0.5244)	grad_norm 2.1304 (2.7288)	mem 20675MB
[2025-04-03 01:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][296/573]	eta 0:04:04 lr 0.001018	time 0.8784 (0.8830)	loss 0.6026 (0.5244)	grad_norm 2.0380 (2.7240)	mem 20675MB
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][298/573]	eta 0:04:02 lr 0.001017	time 0.8777 (0.8830)	loss 0.4996 (0.5246)	grad_norm 1.5604 (2.7168)	mem 20675MB
[2025-04-03 01:43:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][300/573]	eta 0:04:01 lr 0.001017	time 0.8774 (0.8830)	loss 0.5025 (0.5241)	grad_norm 2.2049 (2.7151)	mem 20675MB
[2025-04-03 01:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][302/573]	eta 0:03:59 lr 0.001017	time 0.8772 (0.8829)	loss 0.6027 (0.5245)	grad_norm 2.2664 (2.7127)	mem 20675MB
[2025-04-03 01:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][304/573]	eta 0:03:57 lr 0.001017	time 0.8772 (0.8829)	loss 0.5233 (0.5242)	grad_norm 1.8255 (2.7093)	mem 20675MB
[2025-04-03 01:43:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][306/573]	eta 0:03:55 lr 0.001017	time 0.8772 (0.8829)	loss 0.5553 (0.5241)	grad_norm 1.9932 (2.7048)	mem 20675MB
[2025-04-03 01:43:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][308/573]	eta 0:03:53 lr 0.001016	time 0.8772 (0.8828)	loss 0.4762 (0.5242)	grad_norm 2.7432 (2.7060)	mem 20675MB
[2025-04-03 01:43:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][310/573]	eta 0:03:52 lr 0.001016	time 0.8776 (0.8828)	loss 0.4452 (0.5238)	grad_norm 3.9059 (2.7132)	mem 20675MB
[2025-04-03 01:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][312/573]	eta 0:03:50 lr 0.001016	time 0.8775 (0.8828)	loss 0.4731 (0.5234)	grad_norm 2.7572 (2.7143)	mem 20675MB
[2025-04-03 01:43:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][314/573]	eta 0:03:48 lr 0.001016	time 0.8774 (0.8828)	loss 0.5401 (0.5231)	grad_norm 2.9717 (2.7158)	mem 20675MB
[2025-04-03 01:43:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][316/573]	eta 0:03:46 lr 0.001016	time 0.8791 (0.8827)	loss 0.3893 (0.5229)	grad_norm 2.6888 (2.7159)	mem 20675MB
[2025-04-03 01:43:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][318/573]	eta 0:03:45 lr 0.001016	time 0.8771 (0.8827)	loss 0.5283 (0.5231)	grad_norm 2.0808 (2.7153)	mem 20675MB
[2025-04-03 01:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][320/573]	eta 0:03:43 lr 0.001015	time 0.8772 (0.8827)	loss 0.5605 (0.5233)	grad_norm 2.2978 (2.7157)	mem 20675MB
[2025-04-03 01:43:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][322/573]	eta 0:03:41 lr 0.001015	time 0.8775 (0.8827)	loss 0.5575 (0.5233)	grad_norm 2.1395 (2.7121)	mem 20675MB
[2025-04-03 01:43:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][324/573]	eta 0:03:39 lr 0.001015	time 0.8771 (0.8826)	loss 0.3872 (0.5230)	grad_norm 2.7788 (2.7088)	mem 20675MB
[2025-04-03 01:43:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][326/573]	eta 0:03:38 lr 0.001015	time 0.8776 (0.8826)	loss 0.4064 (0.5227)	grad_norm 2.5640 (2.7112)	mem 20675MB
[2025-04-03 01:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][328/573]	eta 0:03:36 lr 0.001015	time 0.8772 (0.8826)	loss 0.6142 (0.5231)	grad_norm 1.9060 (2.7068)	mem 20675MB
[2025-04-03 01:43:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][330/573]	eta 0:03:34 lr 0.001014	time 0.8774 (0.8826)	loss 0.6553 (0.5234)	grad_norm 2.9719 (2.7076)	mem 20675MB
[2025-04-03 01:43:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][332/573]	eta 0:03:32 lr 0.001014	time 0.8772 (0.8825)	loss 0.6296 (0.5238)	grad_norm 2.2505 (2.7052)	mem 20675MB
[2025-04-03 01:43:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][334/573]	eta 0:03:30 lr 0.001014	time 0.8771 (0.8825)	loss 0.5585 (0.5238)	grad_norm 1.6470 (2.7017)	mem 20675MB
[2025-04-03 01:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][336/573]	eta 0:03:29 lr 0.001014	time 0.8771 (0.8825)	loss 0.5008 (0.5238)	grad_norm 2.5888 (2.7011)	mem 20675MB
[2025-04-03 01:43:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][338/573]	eta 0:03:27 lr 0.001014	time 0.8772 (0.8824)	loss 0.5132 (0.5239)	grad_norm 3.1355 (2.7012)	mem 20675MB
[2025-04-03 01:43:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][340/573]	eta 0:03:25 lr 0.001014	time 0.8773 (0.8824)	loss 0.5252 (0.5238)	grad_norm 1.9800 (2.6973)	mem 20675MB
[2025-04-03 01:43:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][342/573]	eta 0:03:23 lr 0.001013	time 0.8773 (0.8824)	loss 0.5307 (0.5237)	grad_norm 2.0430 (2.6941)	mem 20675MB
[2025-04-03 01:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][344/573]	eta 0:03:22 lr 0.001013	time 0.8773 (0.8824)	loss 0.5797 (0.5234)	grad_norm 2.3264 (2.7034)	mem 20675MB
[2025-04-03 01:43:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][346/573]	eta 0:03:20 lr 0.001013	time 0.8771 (0.8823)	loss 0.5761 (0.5238)	grad_norm 1.5428 (2.7003)	mem 20675MB
[2025-04-03 01:43:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][348/573]	eta 0:03:18 lr 0.001013	time 0.8774 (0.8823)	loss 0.6509 (0.5244)	grad_norm 2.5301 (2.6992)	mem 20675MB
[2025-04-03 01:43:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][350/573]	eta 0:03:16 lr 0.001013	time 0.8772 (0.8823)	loss 0.5699 (0.5247)	grad_norm 1.8396 (2.6954)	mem 20675MB
[2025-04-03 01:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][352/573]	eta 0:03:14 lr 0.001013	time 0.8768 (0.8823)	loss 0.5043 (0.5249)	grad_norm 2.3882 (2.6914)	mem 20675MB
[2025-04-03 01:43:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][354/573]	eta 0:03:13 lr 0.001012	time 0.8772 (0.8823)	loss 0.6259 (0.5253)	grad_norm 1.9154 (2.6865)	mem 20675MB
[2025-04-03 01:43:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][356/573]	eta 0:03:11 lr 0.001012	time 0.8773 (0.8822)	loss 0.5889 (0.5252)	grad_norm 1.8753 (2.6865)	mem 20675MB
[2025-04-03 01:43:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][358/573]	eta 0:03:09 lr 0.001012	time 0.8772 (0.8822)	loss 0.5516 (0.5254)	grad_norm 1.6869 (2.6815)	mem 20675MB
[2025-04-03 01:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][360/573]	eta 0:03:07 lr 0.001012	time 0.8774 (0.8822)	loss 0.5502 (0.5256)	grad_norm 1.5513 (2.6770)	mem 20675MB
[2025-04-03 01:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][362/573]	eta 0:03:06 lr 0.001012	time 0.8770 (0.8822)	loss 0.3924 (0.5254)	grad_norm 4.4149 (2.6805)	mem 20675MB
[2025-04-03 01:43:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][364/573]	eta 0:03:04 lr 0.001011	time 0.8775 (0.8821)	loss 0.4233 (0.5253)	grad_norm 3.3941 (2.6809)	mem 20675MB
[2025-04-03 01:44:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][366/573]	eta 0:03:02 lr 0.001011	time 0.8774 (0.8821)	loss 0.5238 (0.5253)	grad_norm 2.7848 (2.6784)	mem 20675MB
[2025-04-03 01:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][368/573]	eta 0:03:00 lr 0.001011	time 0.8773 (0.8821)	loss 0.5694 (0.5255)	grad_norm 2.2559 (2.6786)	mem 20675MB
[2025-04-03 01:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][370/573]	eta 0:02:59 lr 0.001011	time 0.8770 (0.8821)	loss 0.3859 (0.5252)	grad_norm 2.3806 (2.6773)	mem 20675MB
[2025-04-03 01:44:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][372/573]	eta 0:02:57 lr 0.001011	time 0.8771 (0.8821)	loss 0.5309 (0.5252)	grad_norm 3.7257 (2.6776)	mem 20675MB
[2025-04-03 01:44:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][374/573]	eta 0:02:55 lr 0.001011	time 0.8774 (0.8820)	loss 0.6122 (0.5256)	grad_norm 2.4366 (2.6758)	mem 20675MB
[2025-04-03 01:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][376/573]	eta 0:02:53 lr 0.001010	time 0.8772 (0.8820)	loss 0.4820 (0.5254)	grad_norm 3.7698 (2.6778)	mem 20675MB
[2025-04-03 01:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][378/573]	eta 0:02:51 lr 0.001010	time 0.8775 (0.8820)	loss 0.4760 (0.5250)	grad_norm 3.6465 (2.6796)	mem 20675MB
[2025-04-03 01:44:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][380/573]	eta 0:02:50 lr 0.001010	time 0.8774 (0.8820)	loss 0.5729 (0.5251)	grad_norm 2.2517 (2.6785)	mem 20675MB
[2025-04-03 01:44:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][382/573]	eta 0:02:48 lr 0.001010	time 0.8774 (0.8820)	loss 0.4884 (0.5252)	grad_norm 2.2680 (2.6762)	mem 20675MB
[2025-04-03 01:44:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][384/573]	eta 0:02:46 lr 0.001010	time 0.8771 (0.8819)	loss 0.5738 (0.5251)	grad_norm 2.3833 (2.6770)	mem 20675MB
[2025-04-03 01:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][386/573]	eta 0:02:44 lr 0.001009	time 0.8783 (0.8819)	loss 0.5941 (0.5252)	grad_norm 2.0786 (2.6791)	mem 20675MB
[2025-04-03 01:44:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][388/573]	eta 0:02:43 lr 0.001009	time 0.8773 (0.8819)	loss 0.5497 (0.5254)	grad_norm 2.1235 (2.6782)	mem 20675MB
[2025-04-03 01:44:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][390/573]	eta 0:02:41 lr 0.001009	time 0.8771 (0.8819)	loss 0.4860 (0.5254)	grad_norm 1.8509 (2.6757)	mem 20675MB
[2025-04-03 01:44:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][392/573]	eta 0:02:39 lr 0.001009	time 0.8780 (0.8819)	loss 0.6068 (0.5255)	grad_norm 2.4036 (2.6745)	mem 20675MB
[2025-04-03 01:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][394/573]	eta 0:02:37 lr 0.001009	time 0.8772 (0.8819)	loss 0.5798 (0.5258)	grad_norm 4.9033 (2.6839)	mem 20675MB
[2025-04-03 01:44:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][396/573]	eta 0:02:36 lr 0.001009	time 0.8772 (0.8818)	loss 0.5797 (0.5259)	grad_norm 1.1612 (2.6785)	mem 20675MB
[2025-04-03 01:44:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][398/573]	eta 0:02:34 lr 0.001008	time 0.8775 (0.8818)	loss 0.5397 (0.5260)	grad_norm 1.9609 (2.6764)	mem 20675MB
[2025-04-03 01:44:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][400/573]	eta 0:02:32 lr 0.001008	time 0.8779 (0.8818)	loss 0.5563 (0.5262)	grad_norm 2.3927 (2.6743)	mem 20675MB
[2025-04-03 01:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][402/573]	eta 0:02:30 lr 0.001008	time 0.8773 (0.8818)	loss 0.5855 (0.5263)	grad_norm 1.9133 (2.6709)	mem 20675MB
[2025-04-03 01:44:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][404/573]	eta 0:02:29 lr 0.001008	time 0.8772 (0.8818)	loss 0.4538 (0.5263)	grad_norm 3.7244 (2.6732)	mem 20675MB
[2025-04-03 01:44:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][406/573]	eta 0:02:27 lr 0.001008	time 0.8773 (0.8817)	loss 0.4824 (0.5261)	grad_norm 2.1905 (2.6723)	mem 20675MB
[2025-04-03 01:44:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][408/573]	eta 0:02:25 lr 0.001007	time 0.8771 (0.8817)	loss 0.5436 (0.5261)	grad_norm 2.4194 (2.6703)	mem 20675MB
[2025-04-03 01:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][410/573]	eta 0:02:23 lr 0.001007	time 0.8775 (0.8817)	loss 0.5398 (0.5261)	grad_norm 2.5560 (2.6743)	mem 20675MB
[2025-04-03 01:44:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][412/573]	eta 0:02:21 lr 0.001007	time 0.8773 (0.8817)	loss 0.5509 (0.5263)	grad_norm 3.0449 (2.6749)	mem 20675MB
[2025-04-03 01:44:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][414/573]	eta 0:02:20 lr 0.001007	time 0.8771 (0.8817)	loss 0.5745 (0.5265)	grad_norm 1.8551 (2.6754)	mem 20675MB
[2025-04-03 01:44:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][416/573]	eta 0:02:18 lr 0.001007	time 0.8773 (0.8817)	loss 0.4952 (0.5267)	grad_norm 3.4006 (2.6778)	mem 20675MB
[2025-04-03 01:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][418/573]	eta 0:02:16 lr 0.001007	time 0.8778 (0.8816)	loss 0.4630 (0.5266)	grad_norm 2.7304 (2.6760)	mem 20675MB
[2025-04-03 01:44:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][420/573]	eta 0:02:14 lr 0.001006	time 0.8774 (0.8816)	loss 0.5884 (0.5269)	grad_norm 2.0070 (2.6735)	mem 20675MB
[2025-04-03 01:44:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][422/573]	eta 0:02:13 lr 0.001006	time 0.8776 (0.8816)	loss 0.4528 (0.5267)	grad_norm 2.2696 (2.6738)	mem 20675MB
[2025-04-03 01:44:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][424/573]	eta 0:02:11 lr 0.001006	time 0.8773 (0.8816)	loss 0.4980 (0.5264)	grad_norm 2.0348 (2.6711)	mem 20675MB
[2025-04-03 01:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][426/573]	eta 0:02:09 lr 0.001006	time 0.8772 (0.8816)	loss 0.4838 (0.5265)	grad_norm 1.8584 (2.6670)	mem 20675MB
[2025-04-03 01:44:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][428/573]	eta 0:02:07 lr 0.001006	time 0.8774 (0.8816)	loss 0.5248 (0.5265)	grad_norm 2.0449 (2.6635)	mem 20675MB
[2025-04-03 01:44:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][430/573]	eta 0:02:06 lr 0.001006	time 0.8778 (0.8816)	loss 0.5975 (0.5265)	grad_norm 2.3687 (2.6629)	mem 20675MB
[2025-04-03 01:44:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][432/573]	eta 0:02:04 lr 0.001005	time 0.8772 (0.8815)	loss 0.5797 (0.5267)	grad_norm 2.1437 (2.6606)	mem 20675MB
[2025-04-03 01:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][434/573]	eta 0:02:02 lr 0.001005	time 0.8773 (0.8815)	loss 0.5570 (0.5269)	grad_norm 2.0842 (2.6608)	mem 20675MB
[2025-04-03 01:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][436/573]	eta 0:02:00 lr 0.001005	time 0.8775 (0.8815)	loss 0.3794 (0.5261)	grad_norm 3.6236 (2.6633)	mem 20675MB
[2025-04-03 01:45:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][438/573]	eta 0:01:59 lr 0.001005	time 0.8773 (0.8815)	loss 0.5139 (0.5259)	grad_norm 2.4053 (2.6648)	mem 20675MB
[2025-04-03 01:45:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][440/573]	eta 0:01:57 lr 0.001005	time 0.8771 (0.8815)	loss 0.5333 (0.5260)	grad_norm 1.9081 (2.6624)	mem 20675MB
[2025-04-03 01:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][442/573]	eta 0:01:55 lr 0.001004	time 0.8776 (0.8815)	loss 0.5549 (0.5260)	grad_norm 2.4373 (2.6609)	mem 20675MB
[2025-04-03 01:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][444/573]	eta 0:01:53 lr 0.001004	time 0.8774 (0.8815)	loss 0.4800 (0.5255)	grad_norm 2.9050 (2.6635)	mem 20675MB
[2025-04-03 01:45:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][446/573]	eta 0:01:51 lr 0.001004	time 0.8774 (0.8814)	loss 0.4244 (0.5255)	grad_norm 4.0516 (2.6661)	mem 20675MB
[2025-04-03 01:45:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][448/573]	eta 0:01:50 lr 0.001004	time 0.8774 (0.8814)	loss 0.5033 (0.5254)	grad_norm 3.9356 (2.6685)	mem 20675MB
[2025-04-03 01:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][450/573]	eta 0:01:48 lr 0.001004	time 0.8771 (0.8814)	loss 0.4102 (0.5253)	grad_norm 3.1818 (2.6695)	mem 20675MB
[2025-04-03 01:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][452/573]	eta 0:01:46 lr 0.001004	time 0.8773 (0.8814)	loss 0.5975 (0.5255)	grad_norm 2.5498 (2.6693)	mem 20675MB
[2025-04-03 01:45:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][454/573]	eta 0:01:44 lr 0.001003	time 0.8788 (0.8814)	loss 0.5648 (0.5256)	grad_norm 2.3481 (2.6671)	mem 20675MB
[2025-04-03 01:45:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][456/573]	eta 0:01:43 lr 0.001003	time 0.8774 (0.8814)	loss 0.5440 (0.5258)	grad_norm 2.3632 (2.6661)	mem 20675MB
[2025-04-03 01:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][458/573]	eta 0:01:41 lr 0.001003	time 0.8771 (0.8814)	loss 0.4401 (0.5256)	grad_norm 2.7928 (2.6649)	mem 20675MB
[2025-04-03 01:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][460/573]	eta 0:01:39 lr 0.001003	time 0.8772 (0.8813)	loss 0.4939 (0.5255)	grad_norm 1.9726 (2.6607)	mem 20675MB
[2025-04-03 01:45:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][462/573]	eta 0:01:37 lr 0.001003	time 0.8792 (0.8813)	loss 0.4586 (0.5254)	grad_norm 2.4615 (2.6598)	mem 20675MB
[2025-04-03 01:45:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][464/573]	eta 0:01:36 lr 0.001002	time 0.8775 (0.8813)	loss 0.5700 (0.5255)	grad_norm 2.1891 (2.6571)	mem 20675MB
[2025-04-03 01:45:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][466/573]	eta 0:01:34 lr 0.001002	time 0.8774 (0.8813)	loss 0.3375 (0.5253)	grad_norm 4.5092 (2.6611)	mem 20675MB
[2025-04-03 01:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][468/573]	eta 0:01:32 lr 0.001002	time 0.8779 (0.8813)	loss 0.5655 (0.5251)	grad_norm 4.5749 (2.6674)	mem 20675MB
[2025-04-03 01:45:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][470/573]	eta 0:01:30 lr 0.001002	time 0.8774 (0.8813)	loss 0.5638 (0.5253)	grad_norm 2.1712 (2.6662)	mem 20675MB
[2025-04-03 01:45:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][472/573]	eta 0:01:29 lr 0.001002	time 0.8771 (0.8813)	loss 0.5778 (0.5255)	grad_norm 2.8429 (2.6674)	mem 20675MB
[2025-04-03 01:45:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][474/573]	eta 0:01:27 lr 0.001002	time 0.8772 (0.8813)	loss 0.5479 (0.5257)	grad_norm 3.0879 (2.6692)	mem 20675MB
[2025-04-03 01:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][476/573]	eta 0:01:25 lr 0.001001	time 0.8775 (0.8812)	loss 0.6067 (0.5261)	grad_norm 1.3429 (2.6653)	mem 20675MB
[2025-04-03 01:45:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][478/573]	eta 0:01:23 lr 0.001001	time 0.8775 (0.8812)	loss 0.5161 (0.5259)	grad_norm 2.2093 (2.6639)	mem 20675MB
[2025-04-03 01:45:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][480/573]	eta 0:01:21 lr 0.001001	time 0.8777 (0.8812)	loss 0.4818 (0.5259)	grad_norm 2.2620 (2.6604)	mem 20675MB
[2025-04-03 01:45:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][482/573]	eta 0:01:20 lr 0.001001	time 0.8775 (0.8812)	loss 0.5641 (0.5258)	grad_norm 1.5847 (2.6574)	mem 20675MB
[2025-04-03 01:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][484/573]	eta 0:01:18 lr 0.001001	time 0.8773 (0.8812)	loss 0.5678 (0.5259)	grad_norm 2.0699 (2.6565)	mem 20675MB
[2025-04-03 01:45:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][486/573]	eta 0:01:16 lr 0.001000	time 0.8774 (0.8812)	loss 0.4081 (0.5253)	grad_norm 2.5939 (2.6569)	mem 20675MB
[2025-04-03 01:45:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][488/573]	eta 0:01:14 lr 0.001000	time 0.8773 (0.8812)	loss 0.3788 (0.5251)	grad_norm 3.1243 (2.6575)	mem 20675MB
[2025-04-03 01:45:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][490/573]	eta 0:01:13 lr 0.001000	time 0.8773 (0.8812)	loss 0.5355 (0.5251)	grad_norm 2.9783 (2.6577)	mem 20675MB
[2025-04-03 01:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][492/573]	eta 0:01:11 lr 0.001000	time 0.8771 (0.8811)	loss 0.3850 (0.5249)	grad_norm 3.6529 (2.6593)	mem 20675MB
[2025-04-03 01:45:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][494/573]	eta 0:01:09 lr 0.001000	time 0.8778 (0.8811)	loss 0.4278 (0.5248)	grad_norm 3.6474 (2.6615)	mem 20675MB
[2025-04-03 01:45:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][496/573]	eta 0:01:07 lr 0.000999	time 0.8775 (0.8811)	loss 0.5681 (0.5250)	grad_norm 1.5384 (2.6596)	mem 20675MB
[2025-04-03 01:45:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][498/573]	eta 0:01:06 lr 0.000999	time 0.8775 (0.8811)	loss 0.5983 (0.5252)	grad_norm 1.8062 (2.6568)	mem 20675MB
[2025-04-03 01:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][500/573]	eta 0:01:04 lr 0.000999	time 0.8775 (0.8811)	loss 0.4151 (0.5247)	grad_norm 2.6656 (2.6566)	mem 20675MB
[2025-04-03 01:46:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][502/573]	eta 0:01:02 lr 0.000999	time 0.8774 (0.8811)	loss 0.6003 (0.5247)	grad_norm 2.8274 (2.6558)	mem 20675MB
[2025-04-03 01:46:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][504/573]	eta 0:01:00 lr 0.000999	time 0.8773 (0.8811)	loss 0.5449 (0.5247)	grad_norm 2.4486 (2.6546)	mem 20675MB
[2025-04-03 01:46:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][506/573]	eta 0:00:59 lr 0.000999	time 0.8776 (0.8811)	loss 0.5723 (0.5247)	grad_norm 1.4391 (2.6519)	mem 20675MB
[2025-04-03 01:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][508/573]	eta 0:00:57 lr 0.000998	time 0.8772 (0.8811)	loss 0.4143 (0.5242)	grad_norm 2.2588 (2.6533)	mem 20675MB
[2025-04-03 01:46:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][510/573]	eta 0:00:55 lr 0.000998	time 0.8772 (0.8810)	loss 0.5287 (0.5241)	grad_norm 2.9419 (2.6563)	mem 20675MB
[2025-04-03 01:46:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][512/573]	eta 0:00:53 lr 0.000998	time 0.8778 (0.8810)	loss 0.6379 (0.5244)	grad_norm 3.5895 (2.6576)	mem 20675MB
[2025-04-03 01:46:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][514/573]	eta 0:00:51 lr 0.000998	time 0.8774 (0.8810)	loss 0.3744 (0.5240)	grad_norm 3.9501 (2.6627)	mem 20675MB
[2025-04-03 01:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][516/573]	eta 0:00:50 lr 0.000998	time 0.8772 (0.8810)	loss 0.4496 (0.5239)	grad_norm 2.8428 (2.6620)	mem 20675MB
[2025-04-03 01:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][518/573]	eta 0:00:48 lr 0.000997	time 0.8774 (0.8810)	loss 0.5984 (0.5241)	grad_norm 1.8935 (2.6621)	mem 20675MB
[2025-04-03 01:46:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][520/573]	eta 0:00:46 lr 0.000997	time 0.8780 (0.8810)	loss 0.6130 (0.5244)	grad_norm 2.7903 (2.6620)	mem 20675MB
[2025-04-03 01:46:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][522/573]	eta 0:00:44 lr 0.000997	time 0.8773 (0.8810)	loss 0.4995 (0.5242)	grad_norm 3.1397 (2.6640)	mem 20675MB
[2025-04-03 01:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][524/573]	eta 0:00:43 lr 0.000997	time 0.8772 (0.8810)	loss 0.4607 (0.5242)	grad_norm 2.3115 (2.6619)	mem 20675MB
[2025-04-03 01:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][526/573]	eta 0:00:41 lr 0.000997	time 0.8775 (0.8810)	loss 0.4165 (0.5239)	grad_norm 3.1845 (2.6639)	mem 20675MB
[2025-04-03 01:46:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][528/573]	eta 0:00:39 lr 0.000997	time 0.8773 (0.8810)	loss 0.5791 (0.5241)	grad_norm 2.2979 (2.6659)	mem 20675MB
[2025-04-03 01:46:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][530/573]	eta 0:00:37 lr 0.000996	time 0.8773 (0.8809)	loss 0.5350 (0.5242)	grad_norm 2.1460 (2.6641)	mem 20675MB
[2025-04-03 01:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][532/573]	eta 0:00:36 lr 0.000996	time 0.8779 (0.8809)	loss 0.4363 (0.5238)	grad_norm 3.1040 (2.6682)	mem 20675MB
[2025-04-03 01:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][534/573]	eta 0:00:34 lr 0.000996	time 0.8772 (0.8809)	loss 0.4605 (0.5236)	grad_norm 2.8437 (2.6719)	mem 20675MB
[2025-04-03 01:46:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][536/573]	eta 0:00:32 lr 0.000996	time 0.8776 (0.8809)	loss 0.5487 (0.5237)	grad_norm 2.3607 (2.6703)	mem 20675MB
[2025-04-03 01:46:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][538/573]	eta 0:00:30 lr 0.000996	time 0.8771 (0.8809)	loss 0.4405 (0.5237)	grad_norm 2.2329 (2.6699)	mem 20675MB
[2025-04-03 01:46:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][540/573]	eta 0:00:29 lr 0.000995	time 0.8774 (0.8809)	loss 0.6711 (0.5239)	grad_norm 2.9057 (2.6697)	mem 20675MB
[2025-04-03 01:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][542/573]	eta 0:00:27 lr 0.000995	time 0.8775 (0.8809)	loss 0.5565 (0.5240)	grad_norm 2.3375 (2.6683)	mem 20675MB
[2025-04-03 01:46:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][544/573]	eta 0:00:25 lr 0.000995	time 0.8778 (0.8809)	loss 0.5753 (0.5241)	grad_norm 1.2744 (2.6658)	mem 20675MB
[2025-04-03 01:46:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][546/573]	eta 0:00:23 lr 0.000995	time 0.8771 (0.8809)	loss 0.5445 (0.5243)	grad_norm 1.6070 (2.6615)	mem 20675MB
[2025-04-03 01:46:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][548/573]	eta 0:00:22 lr 0.000995	time 0.8773 (0.8809)	loss 0.5511 (0.5242)	grad_norm 1.4765 (2.6616)	mem 20675MB
[2025-04-03 01:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][550/573]	eta 0:00:20 lr 0.000995	time 0.8774 (0.8809)	loss 0.5334 (0.5242)	grad_norm 2.0182 (2.6603)	mem 20675MB
[2025-04-03 01:46:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][552/573]	eta 0:00:18 lr 0.000994	time 0.8774 (0.8808)	loss 0.5203 (0.5241)	grad_norm 2.6816 (2.6606)	mem 20675MB
[2025-04-03 01:46:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][554/573]	eta 0:00:16 lr 0.000994	time 0.8774 (0.8808)	loss 0.5788 (0.5242)	grad_norm 2.8950 (2.6593)	mem 20675MB
[2025-04-03 01:46:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][556/573]	eta 0:00:14 lr 0.000994	time 0.8775 (0.8808)	loss 0.6173 (0.5245)	grad_norm 2.5195 (2.6583)	mem 20675MB
[2025-04-03 01:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][558/573]	eta 0:00:13 lr 0.000994	time 0.8770 (0.8808)	loss 0.5620 (0.5245)	grad_norm 2.2322 (2.6582)	mem 20675MB
[2025-04-03 01:46:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][560/573]	eta 0:00:11 lr 0.000994	time 0.8779 (0.8808)	loss 0.5986 (0.5246)	grad_norm 2.2301 (2.6591)	mem 20675MB
[2025-04-03 01:46:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][562/573]	eta 0:00:09 lr 0.000993	time 0.8773 (0.8808)	loss 0.4896 (0.5244)	grad_norm 1.9152 (2.6571)	mem 20675MB
[2025-04-03 01:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][564/573]	eta 0:00:07 lr 0.000993	time 0.8772 (0.8808)	loss 0.5755 (0.5243)	grad_norm 2.9263 (2.6573)	mem 20675MB
[2025-04-03 01:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][566/573]	eta 0:00:06 lr 0.000993	time 0.8773 (0.8808)	loss 0.5772 (0.5245)	grad_norm 2.2814 (2.6547)	mem 20675MB
[2025-04-03 01:46:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][568/573]	eta 0:00:04 lr 0.000993	time 0.8768 (0.8808)	loss 0.4358 (0.5244)	grad_norm 2.2314 (2.6524)	mem 20675MB
[2025-04-03 01:47:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][570/573]	eta 0:00:02 lr 0.000993	time 0.8769 (0.8808)	loss 0.5563 (0.5245)	grad_norm 3.1684 (2.6529)	mem 20675MB
[2025-04-03 01:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][572/573]	eta 0:00:00 lr 0.000993	time 0.8776 (0.8808)	loss 0.5764 (0.5246)	grad_norm 1.9303 (2.6504)	mem 20675MB
[2025-04-03 01:47:02 simmim_finetune] (main_finetune.py 260): INFO EPOCH 8 training takes 0:08:24
[2025-04-03 01:47:04 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.942 (1.942)	Loss 0.7218 (0.7218)	Acc@1 55.469 (55.469)	Mem 20675MB
[2025-04-03 01:47:05 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.837)	Loss 0.6675 (0.6806)	Acc@1 58.594 (59.115)	Mem 20675MB
[2025-04-03 01:47:05 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.284 (0.616)	Loss 0.7135 (0.6825)	Acc@1 58.594 (59.062)	Mem 20675MB
[2025-04-03 01:47:06 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.521)	Loss 0.6943 (0.6787)	Acc@1 58.594 (59.821)	Mem 20675MB
[2025-04-03 01:47:06 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.468)	Loss 0.4211 (0.6189)	Acc@1 83.594 (65.451)	Mem 20675MB
[2025-04-03 01:47:07 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.435)	Loss 0.3749 (0.5765)	Acc@1 89.844 (69.318)	Mem 20675MB
[2025-04-03 01:47:07 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.412)	Loss 0.3894 (0.5470)	Acc@1 85.938 (71.935)	Mem 20675MB
[2025-04-03 01:47:08 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.395)	Loss 0.3669 (0.5237)	Acc@1 89.062 (73.958)	Mem 20675MB
[2025-04-03 01:47:08 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 74.395
[2025-04-03 01:47:08 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 74.4%
[2025-04-03 01:47:08 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 74.90%
[2025-04-03 01:47:08 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.7208306111414484e-06, 3.7208306111414484e-06, 5.6966195139859105e-06, 5.6966195139859105e-06, 8.736294749131236e-06, 8.736294749131236e-06, 1.3412718187816355e-05, 1.3412718187816355e-05, 2.0607215785793457e-05, 2.0607215785793457e-05, 3.167567362883516e-05, 3.167567362883516e-05, 4.870407031043776e-05, 4.870407031043776e-05, 7.490160366674946e-05, 7.490160366674946e-05, 0.00011520550113799824, 0.00011520550113799824, 0.00017721149724761175, 0.00017721149724761175, 0.00027260533741624787, 0.00027260533741624787, 0.0004193650915218419, 0.0004193650915218419, 0.0006451493286073714, 0.0006451493286073714, 0.0009925096933543396, 0.0009925096933543396]
[2025-04-03 01:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][0/573]	eta 0:23:32 lr 0.000992	time 2.4645 (2.4645)	loss 0.5389 (0.5389)	grad_norm 1.7419 (1.7419)	mem 20675MB
[2025-04-03 01:47:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][2/573]	eta 0:13:23 lr 0.000992	time 0.8776 (1.4071)	loss 0.4469 (0.4967)	grad_norm 2.2812 (2.4838)	mem 20675MB
[2025-04-03 01:47:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][4/573]	eta 0:11:20 lr 0.000992	time 0.8774 (1.1959)	loss 0.5717 (0.5137)	grad_norm 2.1159 (2.3135)	mem 20675MB
[2025-04-03 01:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][6/573]	eta 0:10:26 lr 0.000992	time 0.8774 (1.1051)	loss 0.5899 (0.5426)	grad_norm 2.5770 (2.3822)	mem 20675MB
[2025-04-03 01:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][8/573]	eta 0:09:55 lr 0.000992	time 0.8772 (1.0547)	loss 0.5442 (0.5449)	grad_norm 2.2153 (2.4065)	mem 20675MB
[2025-04-03 01:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][10/573]	eta 0:09:35 lr 0.000991	time 0.8772 (1.0226)	loss 0.4090 (0.5297)	grad_norm 3.2158 (2.5079)	mem 20675MB
[2025-04-03 01:47:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][12/573]	eta 0:09:21 lr 0.000991	time 0.8775 (1.0004)	loss 0.4068 (0.5213)	grad_norm 4.7309 (2.6963)	mem 20675MB
[2025-04-03 01:47:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][14/573]	eta 0:09:10 lr 0.000991	time 0.8773 (0.9842)	loss 0.5500 (0.5158)	grad_norm 1.8359 (2.7673)	mem 20675MB
[2025-04-03 01:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][16/573]	eta 0:09:01 lr 0.000991	time 0.8773 (0.9717)	loss 0.5210 (0.5192)	grad_norm 3.1511 (2.8291)	mem 20675MB
[2025-04-03 01:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][18/573]	eta 0:08:53 lr 0.000991	time 0.8773 (0.9619)	loss 0.6404 (0.5272)	grad_norm 3.0540 (2.8471)	mem 20675MB
[2025-04-03 01:47:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][20/573]	eta 0:08:47 lr 0.000991	time 0.8776 (0.9539)	loss 0.5340 (0.5221)	grad_norm 3.6505 (2.9132)	mem 20675MB
[2025-04-03 01:47:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][22/573]	eta 0:08:41 lr 0.000990	time 0.8778 (0.9474)	loss 0.5806 (0.5278)	grad_norm 1.8277 (2.8529)	mem 20675MB
[2025-04-03 01:47:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][24/573]	eta 0:08:37 lr 0.000990	time 0.8774 (0.9418)	loss 0.4870 (0.5288)	grad_norm 2.2974 (2.8091)	mem 20675MB
[2025-04-03 01:47:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][26/573]	eta 0:08:32 lr 0.000990	time 0.8772 (0.9371)	loss 0.5081 (0.5289)	grad_norm 3.0207 (2.7823)	mem 20675MB
[2025-04-03 01:47:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][28/573]	eta 0:08:28 lr 0.000990	time 0.8775 (0.9331)	loss 0.4525 (0.5279)	grad_norm 1.6363 (2.7104)	mem 20675MB
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][30/573]	eta 0:08:24 lr 0.000990	time 0.8777 (0.9295)	loss 0.4146 (0.5209)	grad_norm 2.3752 (2.6719)	mem 20675MB
[2025-04-03 01:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][32/573]	eta 0:08:21 lr 0.000989	time 0.8773 (0.9264)	loss 0.3926 (0.5188)	grad_norm 3.0522 (2.7018)	mem 20675MB
[2025-04-03 01:47:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][34/573]	eta 0:08:17 lr 0.000989	time 0.8772 (0.9237)	loss 0.6118 (0.5212)	grad_norm 1.9217 (2.6652)	mem 20675MB
[2025-04-03 01:47:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][36/573]	eta 0:08:14 lr 0.000989	time 0.8774 (0.9212)	loss 0.5553 (0.5201)	grad_norm 2.5863 (2.6460)	mem 20675MB
[2025-04-03 01:47:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][38/573]	eta 0:08:11 lr 0.000989	time 0.8773 (0.9190)	loss 0.5578 (0.5191)	grad_norm 2.5356 (2.6350)	mem 20675MB
[2025-04-03 01:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][40/573]	eta 0:08:08 lr 0.000989	time 0.8770 (0.9170)	loss 0.6122 (0.5225)	grad_norm 2.1269 (2.6036)	mem 20675MB
[2025-04-03 01:47:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][42/573]	eta 0:08:05 lr 0.000989	time 0.8771 (0.9152)	loss 0.5059 (0.5194)	grad_norm 2.2062 (2.5867)	mem 20675MB
[2025-04-03 01:47:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][44/573]	eta 0:08:03 lr 0.000988	time 0.8774 (0.9136)	loss 0.5462 (0.5214)	grad_norm 2.5184 (2.5712)	mem 20675MB
[2025-04-03 01:47:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][46/573]	eta 0:08:00 lr 0.000988	time 0.8773 (0.9121)	loss 0.3745 (0.5193)	grad_norm 3.0127 (2.5588)	mem 20675MB
[2025-04-03 01:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][48/573]	eta 0:07:58 lr 0.000988	time 0.8771 (0.9107)	loss 0.4452 (0.5182)	grad_norm 1.7274 (2.5861)	mem 20675MB
[2025-04-03 01:47:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][50/573]	eta 0:07:55 lr 0.000988	time 0.8782 (0.9094)	loss 0.6258 (0.5210)	grad_norm 3.8489 (2.5986)	mem 20675MB
[2025-04-03 01:47:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][52/573]	eta 0:07:53 lr 0.000988	time 0.8774 (0.9083)	loss 0.6566 (0.5240)	grad_norm 2.9605 (2.6286)	mem 20675MB
[2025-04-03 01:47:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][54/573]	eta 0:07:50 lr 0.000987	time 0.8773 (0.9072)	loss 0.3702 (0.5213)	grad_norm 2.1961 (2.6033)	mem 20675MB
[2025-04-03 01:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][56/573]	eta 0:07:48 lr 0.000987	time 0.8782 (0.9062)	loss 0.6048 (0.5219)	grad_norm 3.1416 (2.6317)	mem 20675MB
[2025-04-03 01:48:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][58/573]	eta 0:07:46 lr 0.000987	time 0.8774 (0.9052)	loss 0.6378 (0.5251)	grad_norm 2.2201 (2.6201)	mem 20675MB
[2025-04-03 01:48:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][60/573]	eta 0:07:43 lr 0.000987	time 0.8771 (0.9043)	loss 0.4403 (0.5248)	grad_norm 4.2024 (2.6611)	mem 20675MB
[2025-04-03 01:48:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][62/573]	eta 0:07:41 lr 0.000987	time 0.8784 (0.9035)	loss 0.5636 (0.5260)	grad_norm 2.9476 (2.6630)	mem 20675MB
[2025-04-03 01:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][64/573]	eta 0:07:39 lr 0.000986	time 0.8771 (0.9027)	loss 0.5415 (0.5255)	grad_norm 1.7910 (2.6422)	mem 20675MB
[2025-04-03 01:48:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][66/573]	eta 0:07:37 lr 0.000986	time 0.8770 (0.9020)	loss 0.5440 (0.5254)	grad_norm 2.6912 (2.6335)	mem 20675MB
[2025-04-03 01:48:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][68/573]	eta 0:07:35 lr 0.000986	time 0.8773 (0.9013)	loss 0.3996 (0.5230)	grad_norm 2.5451 (2.6469)	mem 20675MB
[2025-04-03 01:48:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][70/573]	eta 0:07:33 lr 0.000986	time 0.8771 (0.9007)	loss 0.6300 (0.5234)	grad_norm 3.3298 (2.6983)	mem 20675MB
[2025-04-03 01:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][72/573]	eta 0:07:30 lr 0.000986	time 0.8773 (0.9000)	loss 0.5251 (0.5225)	grad_norm 2.5300 (2.6959)	mem 20675MB
[2025-04-03 01:48:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][74/573]	eta 0:07:28 lr 0.000986	time 0.8770 (0.8994)	loss 0.5893 (0.5246)	grad_norm 3.6350 (2.7173)	mem 20675MB
[2025-04-03 01:48:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][76/573]	eta 0:07:26 lr 0.000985	time 0.8771 (0.8989)	loss 0.4964 (0.5243)	grad_norm 3.9963 (2.7389)	mem 20675MB
[2025-04-03 01:48:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][78/573]	eta 0:07:24 lr 0.000985	time 0.8772 (0.8984)	loss 0.5012 (0.5244)	grad_norm 1.9134 (2.7215)	mem 20675MB
[2025-04-03 01:48:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][80/573]	eta 0:07:22 lr 0.000985	time 0.8776 (0.8979)	loss 0.6144 (0.5268)	grad_norm 4.5032 (2.7455)	mem 20675MB
[2025-04-03 01:48:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][82/573]	eta 0:07:20 lr 0.000985	time 0.8782 (0.8974)	loss 0.5405 (0.5278)	grad_norm 2.8491 (2.7461)	mem 20675MB
[2025-04-03 01:48:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][84/573]	eta 0:07:18 lr 0.000985	time 0.8770 (0.8970)	loss 0.4708 (0.5283)	grad_norm 2.6259 (2.7404)	mem 20675MB
[2025-04-03 01:48:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][86/573]	eta 0:07:16 lr 0.000984	time 0.8773 (0.8965)	loss 0.5568 (0.5277)	grad_norm 1.5457 (2.7179)	mem 20675MB
[2025-04-03 01:48:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][88/573]	eta 0:07:14 lr 0.000984	time 0.8774 (0.8961)	loss 0.5405 (0.5289)	grad_norm 2.6692 (2.7248)	mem 20675MB
[2025-04-03 01:48:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][90/573]	eta 0:07:12 lr 0.000984	time 0.8773 (0.8957)	loss 0.4046 (0.5272)	grad_norm 2.4352 (2.7182)	mem 20675MB
[2025-04-03 01:48:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][92/573]	eta 0:07:10 lr 0.000984	time 0.8775 (0.8954)	loss 0.6187 (0.5272)	grad_norm 3.2283 (2.7534)	mem 20675MB
[2025-04-03 01:48:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][94/573]	eta 0:07:08 lr 0.000984	time 0.8774 (0.8950)	loss 0.5918 (0.5284)	grad_norm 2.6144 (2.7471)	mem 20675MB
[2025-04-03 01:48:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][96/573]	eta 0:07:06 lr 0.000983	time 0.8773 (0.8947)	loss 0.5917 (0.5285)	grad_norm 2.3380 (2.7419)	mem 20675MB
[2025-04-03 01:48:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][98/573]	eta 0:07:04 lr 0.000983	time 0.8773 (0.8943)	loss 0.3982 (0.5278)	grad_norm 4.6081 (2.7540)	mem 20675MB
[2025-04-03 01:48:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][100/573]	eta 0:07:02 lr 0.000983	time 0.8770 (0.8940)	loss 0.5884 (0.5284)	grad_norm 3.4136 (2.7702)	mem 20675MB
[2025-04-03 01:48:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][102/573]	eta 0:07:00 lr 0.000983	time 0.8772 (0.8937)	loss 0.5084 (0.5288)	grad_norm 2.7653 (2.7713)	mem 20675MB
[2025-04-03 01:48:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][104/573]	eta 0:06:59 lr 0.000983	time 0.8774 (0.8934)	loss 0.5632 (0.5301)	grad_norm 1.6380 (2.7676)	mem 20675MB
[2025-04-03 01:48:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][106/573]	eta 0:06:57 lr 0.000983	time 0.8778 (0.8931)	loss 0.4410 (0.5285)	grad_norm 3.9419 (2.7792)	mem 20675MB
[2025-04-03 01:48:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][108/573]	eta 0:06:55 lr 0.000982	time 0.8773 (0.8928)	loss 0.5723 (0.5294)	grad_norm 2.8492 (2.7701)	mem 20675MB
[2025-04-03 01:48:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][110/573]	eta 0:06:53 lr 0.000982	time 0.8779 (0.8926)	loss 0.4028 (0.5290)	grad_norm 3.6041 (2.7783)	mem 20675MB
[2025-04-03 01:48:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][112/573]	eta 0:06:51 lr 0.000982	time 0.8776 (0.8923)	loss 0.5852 (0.5294)	grad_norm 1.9415 (2.7664)	mem 20675MB
[2025-04-03 01:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][114/573]	eta 0:06:49 lr 0.000982	time 0.8773 (0.8921)	loss 0.5125 (0.5295)	grad_norm 1.7399 (2.7534)	mem 20675MB
[2025-04-03 01:48:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][116/573]	eta 0:06:47 lr 0.000982	time 0.8775 (0.8919)	loss 0.4497 (0.5293)	grad_norm 3.0939 (2.7533)	mem 20675MB
[2025-04-03 01:48:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][118/573]	eta 0:06:45 lr 0.000981	time 0.8774 (0.8916)	loss 0.6477 (0.5302)	grad_norm 3.0647 (2.7594)	mem 20675MB
[2025-04-03 01:48:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][120/573]	eta 0:06:43 lr 0.000981	time 0.8773 (0.8914)	loss 0.5923 (0.5296)	grad_norm 2.4717 (2.7607)	mem 20675MB
[2025-04-03 01:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][122/573]	eta 0:06:41 lr 0.000981	time 0.8773 (0.8912)	loss 0.5533 (0.5302)	grad_norm 2.5727 (2.7553)	mem 20675MB
[2025-04-03 01:49:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][124/573]	eta 0:06:40 lr 0.000981	time 0.8773 (0.8910)	loss 0.4076 (0.5292)	grad_norm 2.8724 (2.7490)	mem 20675MB
[2025-04-03 01:49:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][126/573]	eta 0:06:38 lr 0.000981	time 0.8779 (0.8908)	loss 0.6301 (0.5298)	grad_norm 2.6170 (2.7517)	mem 20675MB
[2025-04-03 01:49:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][128/573]	eta 0:06:36 lr 0.000980	time 0.8774 (0.8906)	loss 0.6178 (0.5314)	grad_norm 2.5877 (2.7444)	mem 20675MB
[2025-04-03 01:49:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][130/573]	eta 0:06:34 lr 0.000980	time 0.8773 (0.8904)	loss 0.5485 (0.5315)	grad_norm 2.4830 (2.7329)	mem 20675MB
[2025-04-03 01:49:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][132/573]	eta 0:06:32 lr 0.000980	time 0.8776 (0.8902)	loss 0.5209 (0.5310)	grad_norm 2.5640 (2.7370)	mem 20675MB
[2025-04-03 01:49:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][134/573]	eta 0:06:30 lr 0.000980	time 0.8771 (0.8900)	loss 0.4170 (0.5299)	grad_norm 2.4749 (2.7332)	mem 20675MB
[2025-04-03 01:49:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][136/573]	eta 0:06:28 lr 0.000980	time 0.8775 (0.8899)	loss 0.6015 (0.5307)	grad_norm 2.9253 (2.7261)	mem 20675MB
[2025-04-03 01:49:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][138/573]	eta 0:06:27 lr 0.000980	time 0.8775 (0.8897)	loss 0.6266 (0.5307)	grad_norm 2.4075 (2.7338)	mem 20675MB
[2025-04-03 01:49:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][140/573]	eta 0:06:25 lr 0.000979	time 0.8775 (0.8896)	loss 0.5679 (0.5315)	grad_norm 1.8455 (2.7230)	mem 20675MB
[2025-04-03 01:49:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][142/573]	eta 0:06:23 lr 0.000979	time 0.8772 (0.8894)	loss 0.5389 (0.5315)	grad_norm 3.0665 (2.7232)	mem 20675MB
[2025-04-03 01:49:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][144/573]	eta 0:06:21 lr 0.000979	time 0.8772 (0.8892)	loss 0.5387 (0.5317)	grad_norm 1.6327 (2.7112)	mem 20675MB
[2025-04-03 01:49:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][146/573]	eta 0:06:19 lr 0.000979	time 0.8772 (0.8891)	loss 0.5825 (0.5326)	grad_norm 1.9829 (2.7032)	mem 20675MB
[2025-04-03 01:49:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][148/573]	eta 0:06:17 lr 0.000979	time 0.8777 (0.8889)	loss 0.4083 (0.5315)	grad_norm 2.4445 (2.6946)	mem 20675MB
[2025-04-03 01:49:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][150/573]	eta 0:06:15 lr 0.000978	time 0.8772 (0.8888)	loss 0.5640 (0.5320)	grad_norm 2.4839 (2.6855)	mem 20675MB
[2025-04-03 01:49:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][152/573]	eta 0:06:14 lr 0.000978	time 0.8775 (0.8887)	loss 0.5786 (0.5324)	grad_norm 1.8111 (2.6831)	mem 20675MB
[2025-04-03 01:49:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][154/573]	eta 0:06:12 lr 0.000978	time 0.8774 (0.8885)	loss 0.3963 (0.5309)	grad_norm 3.5700 (2.6880)	mem 20675MB
[2025-04-03 01:49:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][156/573]	eta 0:06:10 lr 0.000978	time 0.8770 (0.8884)	loss 0.3849 (0.5299)	grad_norm 3.0148 (2.6938)	mem 20675MB
[2025-04-03 01:49:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][158/573]	eta 0:06:08 lr 0.000978	time 0.8774 (0.8883)	loss 0.4739 (0.5292)	grad_norm 2.5451 (2.6891)	mem 20675MB
[2025-04-03 01:49:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][160/573]	eta 0:06:06 lr 0.000977	time 0.8773 (0.8881)	loss 0.5340 (0.5288)	grad_norm 2.1346 (2.6872)	mem 20675MB
[2025-04-03 01:49:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][162/573]	eta 0:06:04 lr 0.000977	time 0.8772 (0.8880)	loss 0.3708 (0.5284)	grad_norm 2.4190 (2.6867)	mem 20675MB
[2025-04-03 01:49:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][164/573]	eta 0:06:03 lr 0.000977	time 0.8773 (0.8879)	loss 0.5951 (0.5280)	grad_norm 2.5219 (2.6877)	mem 20675MB
[2025-04-03 01:49:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][166/573]	eta 0:06:01 lr 0.000977	time 0.8772 (0.8878)	loss 0.4370 (0.5276)	grad_norm 2.6970 (2.6949)	mem 20675MB
[2025-04-03 01:49:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][168/573]	eta 0:05:59 lr 0.000977	time 0.8788 (0.8877)	loss 0.4436 (0.5270)	grad_norm 2.8056 (2.6949)	mem 20675MB
[2025-04-03 01:49:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][170/573]	eta 0:05:57 lr 0.000977	time 0.8776 (0.8876)	loss 0.5843 (0.5277)	grad_norm 2.0090 (2.6924)	mem 20675MB
[2025-04-03 01:49:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][172/573]	eta 0:05:55 lr 0.000976	time 0.8771 (0.8875)	loss 0.5650 (0.5270)	grad_norm 1.5122 (2.6901)	mem 20675MB
[2025-04-03 01:49:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][174/573]	eta 0:05:54 lr 0.000976	time 0.8780 (0.8874)	loss 0.5222 (0.5268)	grad_norm 3.5650 (2.6947)	mem 20675MB
[2025-04-03 01:49:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][176/573]	eta 0:05:52 lr 0.000976	time 0.8775 (0.8873)	loss 0.3601 (0.5259)	grad_norm 2.6976 (2.6875)	mem 20675MB
[2025-04-03 01:49:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][178/573]	eta 0:05:50 lr 0.000976	time 0.8772 (0.8872)	loss 0.6256 (0.5260)	grad_norm 2.9145 (2.6976)	mem 20675MB
[2025-04-03 01:49:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][180/573]	eta 0:05:48 lr 0.000976	time 0.8773 (0.8871)	loss 0.5977 (0.5267)	grad_norm 3.4770 (2.6976)	mem 20675MB
[2025-04-03 01:49:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][182/573]	eta 0:05:46 lr 0.000975	time 0.8774 (0.8870)	loss 0.5218 (0.5268)	grad_norm 2.1240 (2.6919)	mem 20675MB
[2025-04-03 01:49:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][184/573]	eta 0:05:44 lr 0.000975	time 0.8773 (0.8869)	loss 0.4964 (0.5269)	grad_norm 3.3311 (2.6927)	mem 20675MB
[2025-04-03 01:49:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][186/573]	eta 0:05:43 lr 0.000975	time 0.8770 (0.8868)	loss 0.5223 (0.5260)	grad_norm 1.8395 (2.6865)	mem 20675MB
[2025-04-03 01:49:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][188/573]	eta 0:05:41 lr 0.000975	time 0.8774 (0.8867)	loss 0.5979 (0.5263)	grad_norm 2.0399 (2.6807)	mem 20675MB
[2025-04-03 01:49:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][190/573]	eta 0:05:39 lr 0.000975	time 0.8771 (0.8866)	loss 0.5939 (0.5270)	grad_norm 2.3838 (2.6782)	mem 20675MB
[2025-04-03 01:49:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][192/573]	eta 0:05:37 lr 0.000974	time 0.8772 (0.8865)	loss 0.5442 (0.5271)	grad_norm 2.0415 (2.6722)	mem 20675MB
[2025-04-03 01:50:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][194/573]	eta 0:05:35 lr 0.000974	time 0.8775 (0.8864)	loss 0.3865 (0.5265)	grad_norm 3.4321 (2.6730)	mem 20675MB
[2025-04-03 01:50:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][196/573]	eta 0:05:34 lr 0.000974	time 0.8772 (0.8863)	loss 0.3855 (0.5259)	grad_norm 3.2852 (2.6761)	mem 20675MB
[2025-04-03 01:50:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][198/573]	eta 0:05:32 lr 0.000974	time 0.8776 (0.8863)	loss 0.4747 (0.5257)	grad_norm 2.4029 (2.6828)	mem 20675MB
[2025-04-03 01:50:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][200/573]	eta 0:05:30 lr 0.000974	time 0.8779 (0.8862)	loss 0.4827 (0.5259)	grad_norm 5.5824 (2.6965)	mem 20675MB
[2025-04-03 01:50:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][202/573]	eta 0:05:28 lr 0.000974	time 0.8775 (0.8861)	loss 0.5857 (0.5263)	grad_norm 2.2594 (2.7046)	mem 20675MB
[2025-04-03 01:50:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][204/573]	eta 0:05:26 lr 0.000973	time 0.8776 (0.8860)	loss 0.6377 (0.5264)	grad_norm 2.4893 (2.6995)	mem 20675MB
[2025-04-03 01:50:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][206/573]	eta 0:05:25 lr 0.000973	time 0.8773 (0.8860)	loss 0.4731 (0.5260)	grad_norm 2.3205 (2.6938)	mem 20675MB
[2025-04-03 01:50:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][208/573]	eta 0:05:23 lr 0.000973	time 0.8796 (0.8859)	loss 0.5529 (0.5262)	grad_norm 2.0274 (2.6908)	mem 20675MB
[2025-04-03 01:50:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][210/573]	eta 0:05:21 lr 0.000973	time 0.8776 (0.8858)	loss 0.5154 (0.5260)	grad_norm 2.5435 (2.6918)	mem 20675MB
[2025-04-03 01:50:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][212/573]	eta 0:05:19 lr 0.000973	time 0.8773 (0.8858)	loss 0.4416 (0.5256)	grad_norm 1.7446 (2.6825)	mem 20675MB
[2025-04-03 01:50:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][214/573]	eta 0:05:17 lr 0.000972	time 0.8774 (0.8857)	loss 0.6344 (0.5262)	grad_norm 2.0612 (2.6802)	mem 20675MB
[2025-04-03 01:50:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][216/573]	eta 0:05:16 lr 0.000972	time 0.8774 (0.8856)	loss 0.4157 (0.5258)	grad_norm 2.1661 (2.6769)	mem 20675MB
[2025-04-03 01:50:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][218/573]	eta 0:05:14 lr 0.000972	time 0.8778 (0.8855)	loss 0.5175 (0.5256)	grad_norm 1.6227 (2.6671)	mem 20675MB
[2025-04-03 01:50:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][220/573]	eta 0:05:12 lr 0.000972	time 0.8774 (0.8855)	loss 0.4884 (0.5257)	grad_norm 2.0874 (2.6612)	mem 20675MB
[2025-04-03 01:50:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][222/573]	eta 0:05:10 lr 0.000972	time 0.8773 (0.8854)	loss 0.3935 (0.5249)	grad_norm 4.1668 (2.6631)	mem 20675MB
[2025-04-03 01:50:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][224/573]	eta 0:05:08 lr 0.000971	time 0.8774 (0.8854)	loss 0.3798 (0.5239)	grad_norm 2.2535 (2.6617)	mem 20675MB
[2025-04-03 01:50:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][226/573]	eta 0:05:07 lr 0.000971	time 0.8772 (0.8853)	loss 0.5661 (0.5237)	grad_norm 2.8067 (2.6742)	mem 20675MB
[2025-04-03 01:50:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][228/573]	eta 0:05:05 lr 0.000971	time 0.8772 (0.8852)	loss 0.4539 (0.5237)	grad_norm 2.4305 (2.6730)	mem 20675MB
[2025-04-03 01:50:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][230/573]	eta 0:05:03 lr 0.000971	time 0.8784 (0.8852)	loss 0.5329 (0.5234)	grad_norm 2.8293 (2.6766)	mem 20675MB
[2025-04-03 01:50:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][232/573]	eta 0:05:01 lr 0.000971	time 0.8771 (0.8851)	loss 0.4670 (0.5234)	grad_norm 1.9367 (2.6753)	mem 20675MB
[2025-04-03 01:50:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][234/573]	eta 0:05:00 lr 0.000970	time 0.8770 (0.8851)	loss 0.6175 (0.5235)	grad_norm 1.8306 (2.6733)	mem 20675MB
[2025-04-03 01:50:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][236/573]	eta 0:04:58 lr 0.000970	time 0.8773 (0.8850)	loss 0.6251 (0.5236)	grad_norm 1.9966 (2.6832)	mem 20675MB
[2025-04-03 01:50:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][238/573]	eta 0:04:56 lr 0.000970	time 0.8780 (0.8849)	loss 0.5818 (0.5240)	grad_norm 1.2901 (2.6741)	mem 20675MB
[2025-04-03 01:50:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][240/573]	eta 0:04:54 lr 0.000970	time 0.8771 (0.8849)	loss 0.5866 (0.5245)	grad_norm 1.4934 (2.6676)	mem 20675MB
[2025-04-03 01:50:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][242/573]	eta 0:04:52 lr 0.000970	time 0.8772 (0.8848)	loss 0.5406 (0.5246)	grad_norm 1.7290 (2.6602)	mem 20675MB
[2025-04-03 01:50:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][244/573]	eta 0:04:51 lr 0.000970	time 0.8771 (0.8848)	loss 0.5337 (0.5249)	grad_norm 1.9373 (2.6542)	mem 20675MB
[2025-04-03 01:50:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][246/573]	eta 0:04:49 lr 0.000969	time 0.8773 (0.8847)	loss 0.5313 (0.5246)	grad_norm 1.6075 (2.6527)	mem 20675MB
[2025-04-03 01:50:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][248/573]	eta 0:04:47 lr 0.000969	time 0.8774 (0.8847)	loss 0.5409 (0.5249)	grad_norm 2.0908 (2.6463)	mem 20675MB
[2025-04-03 01:50:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][250/573]	eta 0:04:45 lr 0.000969	time 0.8776 (0.8846)	loss 0.5685 (0.5248)	grad_norm 1.8924 (2.6429)	mem 20675MB
[2025-04-03 01:50:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][252/573]	eta 0:04:43 lr 0.000969	time 0.8774 (0.8846)	loss 0.5458 (0.5250)	grad_norm 2.0142 (2.6407)	mem 20675MB
[2025-04-03 01:50:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][254/573]	eta 0:04:42 lr 0.000969	time 0.8774 (0.8845)	loss 0.6630 (0.5253)	grad_norm 2.3838 (2.6396)	mem 20675MB
[2025-04-03 01:50:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][256/573]	eta 0:04:40 lr 0.000968	time 0.8774 (0.8845)	loss 0.4967 (0.5253)	grad_norm 2.1346 (2.6359)	mem 20675MB
[2025-04-03 01:50:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][258/573]	eta 0:04:38 lr 0.000968	time 0.8774 (0.8844)	loss 0.3982 (0.5251)	grad_norm 3.7269 (2.6387)	mem 20675MB
[2025-04-03 01:50:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][260/573]	eta 0:04:36 lr 0.000968	time 0.8773 (0.8844)	loss 0.5665 (0.5254)	grad_norm 2.1259 (2.6355)	mem 20675MB
[2025-04-03 01:51:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][262/573]	eta 0:04:35 lr 0.000968	time 0.8773 (0.8843)	loss 0.5773 (0.5258)	grad_norm 2.2415 (2.6308)	mem 20675MB
[2025-04-03 01:51:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][264/573]	eta 0:04:33 lr 0.000968	time 0.8774 (0.8843)	loss 0.4056 (0.5249)	grad_norm 2.6244 (2.6292)	mem 20675MB
[2025-04-03 01:51:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][266/573]	eta 0:04:31 lr 0.000967	time 0.8772 (0.8842)	loss 0.6036 (0.5248)	grad_norm 2.8795 (2.6305)	mem 20675MB
[2025-04-03 01:51:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][268/573]	eta 0:04:29 lr 0.000967	time 0.8776 (0.8842)	loss 0.4062 (0.5241)	grad_norm 2.8962 (2.6301)	mem 20675MB
[2025-04-03 01:51:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][270/573]	eta 0:04:27 lr 0.000967	time 0.8777 (0.8841)	loss 0.5955 (0.5245)	grad_norm 2.7288 (2.6347)	mem 20675MB
[2025-04-03 01:51:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][272/573]	eta 0:04:26 lr 0.000967	time 0.8774 (0.8841)	loss 0.5652 (0.5244)	grad_norm 2.0490 (2.6340)	mem 20675MB
[2025-04-03 01:51:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][274/573]	eta 0:04:24 lr 0.000967	time 0.8771 (0.8841)	loss 0.5521 (0.5247)	grad_norm 2.0091 (2.6297)	mem 20675MB
[2025-04-03 01:51:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][276/573]	eta 0:04:22 lr 0.000966	time 0.8773 (0.8840)	loss 0.4619 (0.5245)	grad_norm 2.7559 (2.6281)	mem 20675MB
[2025-04-03 01:51:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][278/573]	eta 0:04:20 lr 0.000966	time 0.8771 (0.8840)	loss 0.5469 (0.5249)	grad_norm 2.9934 (2.6317)	mem 20675MB
[2025-04-03 01:51:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][280/573]	eta 0:04:18 lr 0.000966	time 0.8773 (0.8839)	loss 0.5807 (0.5251)	grad_norm 2.3570 (2.6309)	mem 20675MB
[2025-04-03 01:51:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][282/573]	eta 0:04:17 lr 0.000966	time 0.8775 (0.8839)	loss 0.5236 (0.5247)	grad_norm 2.3137 (2.6346)	mem 20675MB
[2025-04-03 01:51:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][284/573]	eta 0:04:15 lr 0.000966	time 0.8772 (0.8839)	loss 0.6172 (0.5251)	grad_norm 2.9119 (2.6331)	mem 20675MB
[2025-04-03 01:51:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][286/573]	eta 0:04:13 lr 0.000966	time 0.8771 (0.8838)	loss 0.5425 (0.5250)	grad_norm 2.6421 (2.6337)	mem 20675MB
[2025-04-03 01:51:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][288/573]	eta 0:04:11 lr 0.000965	time 0.8774 (0.8838)	loss 0.5329 (0.5251)	grad_norm 2.3602 (2.6301)	mem 20675MB
[2025-04-03 01:51:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][290/573]	eta 0:04:10 lr 0.000965	time 0.8772 (0.8837)	loss 0.6650 (0.5252)	grad_norm 2.7351 (2.6338)	mem 20675MB
[2025-04-03 01:51:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][292/573]	eta 0:04:08 lr 0.000965	time 0.8771 (0.8837)	loss 0.4972 (0.5251)	grad_norm 2.3649 (2.6316)	mem 20675MB
[2025-04-03 01:51:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][294/573]	eta 0:04:06 lr 0.000965	time 0.8774 (0.8837)	loss 0.5926 (0.5252)	grad_norm 1.8801 (2.6263)	mem 20675MB
[2025-04-03 01:51:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][296/573]	eta 0:04:04 lr 0.000965	time 0.8774 (0.8836)	loss 0.4385 (0.5249)	grad_norm 1.8008 (2.6209)	mem 20675MB
[2025-04-03 01:51:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][298/573]	eta 0:04:02 lr 0.000964	time 0.8775 (0.8836)	loss 0.4833 (0.5246)	grad_norm 1.6377 (2.6156)	mem 20675MB
[2025-04-03 01:51:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][300/573]	eta 0:04:01 lr 0.000964	time 0.8775 (0.8836)	loss 0.5605 (0.5244)	grad_norm 1.9457 (2.6141)	mem 20675MB
[2025-04-03 01:51:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][302/573]	eta 0:03:59 lr 0.000964	time 0.8773 (0.8835)	loss 0.6523 (0.5244)	grad_norm 2.4922 (2.6151)	mem 20675MB
[2025-04-03 01:51:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][304/573]	eta 0:03:57 lr 0.000964	time 0.8772 (0.8835)	loss 0.5383 (0.5246)	grad_norm 2.4927 (2.6118)	mem 20675MB
[2025-04-03 01:51:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][306/573]	eta 0:03:55 lr 0.000964	time 0.8773 (0.8835)	loss 0.5669 (0.5248)	grad_norm 2.4667 (2.6152)	mem 20675MB
[2025-04-03 01:51:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][308/573]	eta 0:03:54 lr 0.000963	time 0.8775 (0.8834)	loss 0.5812 (0.5251)	grad_norm 2.3653 (2.6135)	mem 20675MB
[2025-04-03 01:51:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][310/573]	eta 0:03:52 lr 0.000963	time 0.8769 (0.8834)	loss 0.5089 (0.5253)	grad_norm 2.9466 (2.6120)	mem 20675MB
[2025-04-03 01:51:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][312/573]	eta 0:03:50 lr 0.000963	time 0.8770 (0.8834)	loss 0.5875 (0.5258)	grad_norm 1.7082 (2.6084)	mem 20675MB
[2025-04-03 01:51:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][314/573]	eta 0:03:48 lr 0.000963	time 0.8773 (0.8833)	loss 0.4127 (0.5253)	grad_norm 3.1992 (2.6123)	mem 20675MB
[2025-04-03 01:51:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][316/573]	eta 0:03:47 lr 0.000963	time 0.8769 (0.8833)	loss 0.5759 (0.5254)	grad_norm 1.5634 (2.6068)	mem 20675MB
[2025-04-03 01:51:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][318/573]	eta 0:03:45 lr 0.000962	time 0.8770 (0.8833)	loss 0.5249 (0.5257)	grad_norm 1.7388 (2.6019)	mem 20675MB
[2025-04-03 01:51:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][320/573]	eta 0:03:43 lr 0.000962	time 0.8775 (0.8832)	loss 0.5216 (0.5256)	grad_norm 2.5535 (2.5989)	mem 20675MB
[2025-04-03 01:51:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][322/573]	eta 0:03:41 lr 0.000962	time 0.8780 (0.8832)	loss 0.5373 (0.5255)	grad_norm 2.5474 (2.6000)	mem 20675MB
[2025-04-03 01:51:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][324/573]	eta 0:03:39 lr 0.000962	time 0.8773 (0.8832)	loss 0.3982 (0.5250)	grad_norm 3.1256 (2.5995)	mem 20675MB
[2025-04-03 01:51:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][326/573]	eta 0:03:38 lr 0.000962	time 0.8769 (0.8831)	loss 0.4043 (0.5245)	grad_norm 3.7804 (2.6037)	mem 20675MB
[2025-04-03 01:51:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][328/573]	eta 0:03:36 lr 0.000961	time 0.8772 (0.8831)	loss 0.4854 (0.5240)	grad_norm 2.2206 (2.6057)	mem 20675MB
[2025-04-03 01:52:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][330/573]	eta 0:03:34 lr 0.000961	time 0.8773 (0.8831)	loss 0.5347 (0.5240)	grad_norm 3.8684 (2.6094)	mem 20675MB
[2025-04-03 01:52:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][332/573]	eta 0:03:32 lr 0.000961	time 0.8776 (0.8830)	loss 0.5756 (0.5244)	grad_norm 2.6173 (2.6092)	mem 20675MB
[2025-04-03 01:52:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][334/573]	eta 0:03:31 lr 0.000961	time 0.8773 (0.8830)	loss 0.3469 (0.5240)	grad_norm 2.6476 (2.6087)	mem 20675MB
[2025-04-03 01:52:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][336/573]	eta 0:03:29 lr 0.000961	time 0.8770 (0.8830)	loss 0.5590 (0.5242)	grad_norm 1.9510 (2.6069)	mem 20675MB
[2025-04-03 01:52:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][338/573]	eta 0:03:27 lr 0.000961	time 0.8773 (0.8830)	loss 0.5275 (0.5243)	grad_norm 3.8016 (2.6096)	mem 20675MB
[2025-04-03 01:52:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][340/573]	eta 0:03:25 lr 0.000960	time 0.8772 (0.8829)	loss 0.4358 (0.5239)	grad_norm 2.7033 (2.6092)	mem 20675MB
[2025-04-03 01:52:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][342/573]	eta 0:03:23 lr 0.000960	time 0.8772 (0.8829)	loss 0.5033 (0.5238)	grad_norm 2.7886 (2.6098)	mem 20675MB
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][344/573]	eta 0:03:22 lr 0.000960	time 0.8770 (0.8829)	loss 0.5494 (0.5238)	grad_norm 1.4490 (2.6073)	mem 20675MB
[2025-04-03 01:52:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][346/573]	eta 0:03:20 lr 0.000960	time 0.8773 (0.8828)	loss 0.4770 (0.5239)	grad_norm 3.0495 (2.6072)	mem 20675MB
[2025-04-03 01:52:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][348/573]	eta 0:03:18 lr 0.000960	time 0.8773 (0.8828)	loss 0.5440 (0.5241)	grad_norm 2.0830 (2.6048)	mem 20675MB
[2025-04-03 01:52:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][350/573]	eta 0:03:16 lr 0.000959	time 0.8772 (0.8828)	loss 0.4821 (0.5236)	grad_norm 2.0756 (2.6049)	mem 20675MB
[2025-04-03 01:52:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][352/573]	eta 0:03:15 lr 0.000959	time 0.8774 (0.8828)	loss 0.4445 (0.5236)	grad_norm 2.3999 (2.6032)	mem 20675MB
[2025-04-03 01:52:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][354/573]	eta 0:03:13 lr 0.000959	time 0.8773 (0.8827)	loss 0.5735 (0.5236)	grad_norm 2.9306 (2.6047)	mem 20675MB
[2025-04-03 01:52:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][356/573]	eta 0:03:11 lr 0.000959	time 0.8770 (0.8827)	loss 0.3614 (0.5227)	grad_norm 2.2842 (2.6058)	mem 20675MB
[2025-04-03 01:52:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][358/573]	eta 0:03:09 lr 0.000959	time 0.8774 (0.8827)	loss 0.4769 (0.5224)	grad_norm 2.8554 (2.6053)	mem 20675MB
[2025-04-03 01:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][360/573]	eta 0:03:08 lr 0.000958	time 0.8772 (0.8827)	loss 0.5681 (0.5226)	grad_norm 2.2929 (2.6086)	mem 20675MB
[2025-04-03 01:52:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][362/573]	eta 0:03:06 lr 0.000958	time 0.8771 (0.8826)	loss 0.5192 (0.5228)	grad_norm 3.0745 (2.6113)	mem 20675MB
[2025-04-03 01:52:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][364/573]	eta 0:03:04 lr 0.000958	time 0.8775 (0.8826)	loss 0.4572 (0.5227)	grad_norm 3.6726 (2.6134)	mem 20675MB
[2025-04-03 01:52:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][366/573]	eta 0:03:02 lr 0.000958	time 0.8772 (0.8826)	loss 0.4936 (0.5228)	grad_norm 3.7639 (2.6167)	mem 20675MB
[2025-04-03 01:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][368/573]	eta 0:03:00 lr 0.000958	time 0.8770 (0.8826)	loss 0.4638 (0.5229)	grad_norm 3.4714 (2.6188)	mem 20675MB
[2025-04-03 01:52:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][370/573]	eta 0:02:59 lr 0.000957	time 0.8769 (0.8825)	loss 0.4442 (0.5227)	grad_norm 2.6580 (2.6163)	mem 20675MB
[2025-04-03 01:52:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][372/573]	eta 0:02:57 lr 0.000957	time 0.8771 (0.8825)	loss 0.5621 (0.5228)	grad_norm 1.9680 (2.6170)	mem 20675MB
[2025-04-03 01:52:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][374/573]	eta 0:02:55 lr 0.000957	time 0.8770 (0.8825)	loss 0.3687 (0.5225)	grad_norm 4.6524 (2.6223)	mem 20675MB
[2025-04-03 01:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][376/573]	eta 0:02:53 lr 0.000957	time 0.8773 (0.8825)	loss 0.3897 (0.5223)	grad_norm 3.9109 (2.6234)	mem 20675MB
[2025-04-03 01:52:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][378/573]	eta 0:02:52 lr 0.000957	time 0.8773 (0.8824)	loss 0.5807 (0.5222)	grad_norm 2.9703 (2.6258)	mem 20675MB
[2025-04-03 01:52:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][380/573]	eta 0:02:50 lr 0.000956	time 0.8771 (0.8824)	loss 0.5507 (0.5227)	grad_norm 3.8644 (2.6367)	mem 20675MB
[2025-04-03 01:52:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][382/573]	eta 0:02:48 lr 0.000956	time 0.8771 (0.8824)	loss 0.5397 (0.5228)	grad_norm 1.8129 (2.6355)	mem 20675MB
[2025-04-03 01:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][384/573]	eta 0:02:46 lr 0.000956	time 0.8775 (0.8824)	loss 0.5075 (0.5228)	grad_norm 1.9981 (2.6327)	mem 20675MB
[2025-04-03 01:52:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][386/573]	eta 0:02:45 lr 0.000956	time 0.8769 (0.8824)	loss 0.5490 (0.5226)	grad_norm 2.5501 (2.6340)	mem 20675MB
[2025-04-03 01:52:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][388/573]	eta 0:02:43 lr 0.000956	time 0.8772 (0.8823)	loss 0.3869 (0.5219)	grad_norm 3.7818 (2.6397)	mem 20675MB
[2025-04-03 01:52:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][390/573]	eta 0:02:41 lr 0.000955	time 0.8771 (0.8823)	loss 0.5410 (0.5221)	grad_norm 2.2918 (2.6381)	mem 20675MB
[2025-04-03 01:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][392/573]	eta 0:02:39 lr 0.000955	time 0.8773 (0.8823)	loss 0.5464 (0.5217)	grad_norm 2.1484 (2.6366)	mem 20675MB
[2025-04-03 01:52:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][394/573]	eta 0:02:37 lr 0.000955	time 0.8775 (0.8823)	loss 0.5265 (0.5217)	grad_norm 3.1532 (2.6378)	mem 20675MB
[2025-04-03 01:52:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][396/573]	eta 0:02:36 lr 0.000955	time 0.8770 (0.8822)	loss 0.3808 (0.5216)	grad_norm 3.2650 (2.6434)	mem 20675MB
[2025-04-03 01:53:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][398/573]	eta 0:02:34 lr 0.000955	time 0.8774 (0.8822)	loss 0.4686 (0.5214)	grad_norm 3.6383 (2.6470)	mem 20675MB
[2025-04-03 01:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][400/573]	eta 0:02:32 lr 0.000955	time 0.8774 (0.8822)	loss 0.4902 (0.5214)	grad_norm 1.7307 (2.6485)	mem 20675MB
[2025-04-03 01:53:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][402/573]	eta 0:02:30 lr 0.000954	time 0.8777 (0.8822)	loss 0.3987 (0.5213)	grad_norm 6.2202 (2.6578)	mem 20675MB
[2025-04-03 01:53:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][404/573]	eta 0:02:29 lr 0.000954	time 0.8774 (0.8822)	loss 0.6429 (0.5212)	grad_norm 2.7230 (2.6620)	mem 20675MB
[2025-04-03 01:53:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][406/573]	eta 0:02:27 lr 0.000954	time 0.8774 (0.8822)	loss 0.5633 (0.5215)	grad_norm 2.6129 (2.6625)	mem 20675MB
[2025-04-03 01:53:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][408/573]	eta 0:02:25 lr 0.000954	time 0.8786 (0.8821)	loss 0.5445 (0.5213)	grad_norm 1.5492 (2.6615)	mem 20675MB
[2025-04-03 01:53:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][410/573]	eta 0:02:23 lr 0.000954	time 0.8776 (0.8821)	loss 0.4509 (0.5213)	grad_norm 3.2313 (2.6628)	mem 20675MB
[2025-04-03 01:53:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][412/573]	eta 0:02:22 lr 0.000953	time 0.8775 (0.8821)	loss 0.5882 (0.5217)	grad_norm 2.3051 (2.6626)	mem 20675MB
[2025-04-03 01:53:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][414/573]	eta 0:02:20 lr 0.000953	time 0.8775 (0.8821)	loss 0.4246 (0.5212)	grad_norm 3.7700 (2.6643)	mem 20675MB
[2025-04-03 01:53:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][416/573]	eta 0:02:18 lr 0.000953	time 0.8774 (0.8821)	loss 0.5290 (0.5210)	grad_norm 2.0183 (2.6620)	mem 20675MB
[2025-04-03 01:53:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][418/573]	eta 0:02:16 lr 0.000953	time 0.8773 (0.8820)	loss 0.4919 (0.5206)	grad_norm 2.4010 (2.6613)	mem 20675MB
[2025-04-03 01:53:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][420/573]	eta 0:02:14 lr 0.000953	time 0.8774 (0.8820)	loss 0.5664 (0.5208)	grad_norm 2.2901 (2.6612)	mem 20675MB
[2025-04-03 01:53:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][422/573]	eta 0:02:13 lr 0.000952	time 0.8775 (0.8820)	loss 0.3816 (0.5207)	grad_norm 5.6468 (2.6681)	mem 20675MB
[2025-04-03 01:53:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][424/573]	eta 0:02:11 lr 0.000952	time 0.8775 (0.8820)	loss 0.3807 (0.5204)	grad_norm 3.5150 (2.6705)	mem 20675MB
[2025-04-03 01:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][426/573]	eta 0:02:09 lr 0.000952	time 0.8772 (0.8820)	loss 0.5130 (0.5203)	grad_norm 3.6013 (2.6721)	mem 20675MB
[2025-04-03 01:53:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][428/573]	eta 0:02:07 lr 0.000952	time 0.8776 (0.8820)	loss 0.6573 (0.5208)	grad_norm 3.0681 (2.6730)	mem 20675MB
[2025-04-03 01:53:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][430/573]	eta 0:02:06 lr 0.000952	time 0.8776 (0.8819)	loss 0.5525 (0.5211)	grad_norm 3.0562 (2.6727)	mem 20675MB
[2025-04-03 01:53:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][432/573]	eta 0:02:04 lr 0.000951	time 0.8774 (0.8819)	loss 0.5564 (0.5212)	grad_norm 2.4900 (2.6713)	mem 20675MB
[2025-04-03 01:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][434/573]	eta 0:02:02 lr 0.000951	time 0.8774 (0.8819)	loss 0.4614 (0.5212)	grad_norm 3.9052 (2.6728)	mem 20675MB
[2025-04-03 01:53:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][436/573]	eta 0:02:00 lr 0.000951	time 0.8778 (0.8819)	loss 0.5191 (0.5211)	grad_norm 1.6867 (2.6720)	mem 20675MB
[2025-04-03 01:53:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][438/573]	eta 0:01:59 lr 0.000951	time 0.8772 (0.8819)	loss 0.4311 (0.5211)	grad_norm 2.7083 (2.6711)	mem 20675MB
[2025-04-03 01:53:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][440/573]	eta 0:01:57 lr 0.000951	time 0.8774 (0.8819)	loss 0.5628 (0.5213)	grad_norm 1.9256 (2.6685)	mem 20675MB
[2025-04-03 01:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][442/573]	eta 0:01:55 lr 0.000950	time 0.8776 (0.8818)	loss 0.6249 (0.5216)	grad_norm 1.9399 (2.6658)	mem 20675MB
[2025-04-03 01:53:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][444/573]	eta 0:01:53 lr 0.000950	time 0.8775 (0.8818)	loss 0.5790 (0.5215)	grad_norm 2.0956 (2.6654)	mem 20675MB
[2025-04-03 01:53:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][446/573]	eta 0:01:51 lr 0.000950	time 0.8776 (0.8818)	loss 0.4203 (0.5210)	grad_norm 5.4184 (2.6751)	mem 20675MB
[2025-04-03 01:53:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][448/573]	eta 0:01:50 lr 0.000950	time 0.8778 (0.8818)	loss 0.6664 (0.5210)	grad_norm 3.6599 (2.6763)	mem 20675MB
[2025-04-03 01:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][450/573]	eta 0:01:48 lr 0.000950	time 0.8775 (0.8818)	loss 0.6199 (0.5214)	grad_norm 2.8129 (2.6775)	mem 20675MB
[2025-04-03 01:53:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][452/573]	eta 0:01:46 lr 0.000949	time 0.8775 (0.8818)	loss 0.6309 (0.5216)	grad_norm 2.3247 (2.6774)	mem 20675MB
[2025-04-03 01:53:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][454/573]	eta 0:01:44 lr 0.000949	time 0.8775 (0.8818)	loss 0.4286 (0.5212)	grad_norm 3.2256 (2.6803)	mem 20675MB
[2025-04-03 01:53:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][456/573]	eta 0:01:43 lr 0.000949	time 0.8776 (0.8817)	loss 0.4378 (0.5210)	grad_norm 3.2008 (2.6812)	mem 20675MB
[2025-04-03 01:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][458/573]	eta 0:01:41 lr 0.000949	time 0.8774 (0.8817)	loss 0.5925 (0.5212)	grad_norm 2.4102 (2.6802)	mem 20675MB
[2025-04-03 01:53:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][460/573]	eta 0:01:39 lr 0.000949	time 0.8776 (0.8817)	loss 0.5777 (0.5212)	grad_norm 1.9992 (2.6849)	mem 20675MB
[2025-04-03 01:53:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][462/573]	eta 0:01:37 lr 0.000948	time 0.8776 (0.8817)	loss 0.5955 (0.5214)	grad_norm 2.1646 (2.6827)	mem 20675MB
[2025-04-03 01:53:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][464/573]	eta 0:01:36 lr 0.000948	time 0.8774 (0.8817)	loss 0.5284 (0.5212)	grad_norm 1.7281 (2.6797)	mem 20675MB
[2025-04-03 01:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][466/573]	eta 0:01:34 lr 0.000948	time 0.8776 (0.8817)	loss 0.4796 (0.5210)	grad_norm 2.8450 (2.6805)	mem 20675MB
[2025-04-03 01:54:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][468/573]	eta 0:01:32 lr 0.000948	time 0.8771 (0.8817)	loss 0.5849 (0.5212)	grad_norm 3.1924 (2.6804)	mem 20675MB
[2025-04-03 01:54:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][470/573]	eta 0:01:30 lr 0.000948	time 0.8775 (0.8816)	loss 0.5893 (0.5215)	grad_norm 2.4993 (2.6821)	mem 20675MB
[2025-04-03 01:54:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][472/573]	eta 0:01:29 lr 0.000947	time 0.8774 (0.8816)	loss 0.4667 (0.5215)	grad_norm 1.9902 (2.6809)	mem 20675MB
[2025-04-03 01:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][474/573]	eta 0:01:27 lr 0.000947	time 0.8774 (0.8816)	loss 0.5604 (0.5217)	grad_norm 1.6440 (2.6771)	mem 20675MB
[2025-04-03 01:54:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][476/573]	eta 0:01:25 lr 0.000947	time 0.8771 (0.8816)	loss 0.5716 (0.5219)	grad_norm 2.6902 (2.6762)	mem 20675MB
[2025-04-03 01:54:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][478/573]	eta 0:01:23 lr 0.000947	time 0.8777 (0.8816)	loss 0.5457 (0.5221)	grad_norm 2.4204 (2.6766)	mem 20675MB
[2025-04-03 01:54:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][480/573]	eta 0:01:21 lr 0.000947	time 0.8772 (0.8816)	loss 0.4148 (0.5218)	grad_norm 2.1183 (2.6763)	mem 20675MB
[2025-04-03 01:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][482/573]	eta 0:01:20 lr 0.000947	time 0.8798 (0.8816)	loss 0.6053 (0.5220)	grad_norm 1.5772 (2.6731)	mem 20675MB
[2025-04-03 01:54:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][484/573]	eta 0:01:18 lr 0.000946	time 0.8771 (0.8815)	loss 0.5135 (0.5222)	grad_norm 3.0203 (2.6728)	mem 20675MB
[2025-04-03 01:54:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][486/573]	eta 0:01:16 lr 0.000946	time 0.8785 (0.8815)	loss 0.3980 (0.5220)	grad_norm 3.5961 (2.6735)	mem 20675MB
[2025-04-03 01:54:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][488/573]	eta 0:01:14 lr 0.000946	time 0.8773 (0.8815)	loss 0.5673 (0.5217)	grad_norm 2.2177 (2.6749)	mem 20675MB
[2025-04-03 01:54:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][490/573]	eta 0:01:13 lr 0.000946	time 0.8773 (0.8815)	loss 0.3434 (0.5212)	grad_norm 3.0617 (2.6793)	mem 20675MB
[2025-04-03 01:54:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][492/573]	eta 0:01:11 lr 0.000946	time 0.8769 (0.8815)	loss 0.5333 (0.5210)	grad_norm 2.6733 (2.6802)	mem 20675MB
[2025-04-03 01:54:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][494/573]	eta 0:01:09 lr 0.000945	time 0.8772 (0.8815)	loss 0.5917 (0.5213)	grad_norm 3.2663 (2.6818)	mem 20675MB
[2025-04-03 01:54:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][496/573]	eta 0:01:07 lr 0.000945	time 0.8771 (0.8815)	loss 0.5700 (0.5212)	grad_norm 2.3196 (2.6839)	mem 20675MB
[2025-04-03 01:54:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][498/573]	eta 0:01:06 lr 0.000945	time 0.8772 (0.8815)	loss 0.5262 (0.5213)	grad_norm 2.0632 (2.6831)	mem 20675MB
[2025-04-03 01:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][500/573]	eta 0:01:04 lr 0.000945	time 0.8777 (0.8814)	loss 0.5727 (0.5211)	grad_norm 1.7446 (2.6827)	mem 20675MB
[2025-04-03 01:54:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][502/573]	eta 0:01:02 lr 0.000945	time 0.8774 (0.8814)	loss 0.5566 (0.5213)	grad_norm 1.7282 (2.6805)	mem 20675MB
[2025-04-03 01:54:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][504/573]	eta 0:01:00 lr 0.000944	time 0.8776 (0.8814)	loss 0.5149 (0.5212)	grad_norm 3.1411 (2.6807)	mem 20675MB
[2025-04-03 01:54:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][506/573]	eta 0:00:59 lr 0.000944	time 0.8770 (0.8814)	loss 0.4535 (0.5210)	grad_norm 1.8194 (2.6795)	mem 20675MB
[2025-04-03 01:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][508/573]	eta 0:00:57 lr 0.000944	time 0.8774 (0.8814)	loss 0.5836 (0.5211)	grad_norm 1.7364 (2.6777)	mem 20675MB
[2025-04-03 01:54:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][510/573]	eta 0:00:55 lr 0.000944	time 0.8778 (0.8814)	loss 0.4425 (0.5209)	grad_norm 3.2217 (2.6773)	mem 20675MB
[2025-04-03 01:54:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][512/573]	eta 0:00:53 lr 0.000944	time 0.8776 (0.8814)	loss 0.5927 (0.5208)	grad_norm 3.1373 (2.6867)	mem 20675MB
[2025-04-03 01:54:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][514/573]	eta 0:00:52 lr 0.000943	time 0.8792 (0.8814)	loss 0.4431 (0.5208)	grad_norm 2.9293 (2.6870)	mem 20675MB
[2025-04-03 01:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][516/573]	eta 0:00:50 lr 0.000943	time 0.8771 (0.8813)	loss 0.4414 (0.5207)	grad_norm 2.5589 (2.6891)	mem 20675MB
[2025-04-03 01:54:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][518/573]	eta 0:00:48 lr 0.000943	time 0.8773 (0.8813)	loss 0.5717 (0.5209)	grad_norm 3.5724 (2.6923)	mem 20675MB
[2025-04-03 01:54:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][520/573]	eta 0:00:46 lr 0.000943	time 0.8772 (0.8813)	loss 0.5478 (0.5211)	grad_norm 2.4956 (2.6919)	mem 20675MB
[2025-04-03 01:54:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][522/573]	eta 0:00:44 lr 0.000943	time 0.8769 (0.8813)	loss 0.4115 (0.5209)	grad_norm 2.5490 (2.6916)	mem 20675MB
[2025-04-03 01:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][524/573]	eta 0:00:43 lr 0.000942	time 0.8770 (0.8813)	loss 0.5703 (0.5210)	grad_norm 2.0086 (2.6881)	mem 20675MB
[2025-04-03 01:54:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][526/573]	eta 0:00:41 lr 0.000942	time 0.8775 (0.8813)	loss 0.5641 (0.5212)	grad_norm 1.8595 (2.6857)	mem 20675MB
[2025-04-03 01:54:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][528/573]	eta 0:00:39 lr 0.000942	time 0.8774 (0.8813)	loss 0.5082 (0.5211)	grad_norm 2.0887 (2.6842)	mem 20675MB
[2025-04-03 01:54:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][530/573]	eta 0:00:37 lr 0.000942	time 0.8773 (0.8813)	loss 0.6260 (0.5213)	grad_norm 1.9877 (2.6815)	mem 20675MB
[2025-04-03 01:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][532/573]	eta 0:00:36 lr 0.000942	time 0.8770 (0.8812)	loss 0.4272 (0.5210)	grad_norm 2.8812 (2.6821)	mem 20675MB
[2025-04-03 01:55:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][534/573]	eta 0:00:34 lr 0.000941	time 0.8772 (0.8812)	loss 0.5478 (0.5212)	grad_norm 2.1450 (2.6792)	mem 20675MB
[2025-04-03 01:55:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][536/573]	eta 0:00:32 lr 0.000941	time 0.8771 (0.8812)	loss 0.5564 (0.5213)	grad_norm 3.0810 (2.6794)	mem 20675MB
[2025-04-03 01:55:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][538/573]	eta 0:00:30 lr 0.000941	time 0.8772 (0.8812)	loss 0.5165 (0.5210)	grad_norm 2.9135 (2.6819)	mem 20675MB
[2025-04-03 01:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][540/573]	eta 0:00:29 lr 0.000941	time 0.8783 (0.8812)	loss 0.5791 (0.5212)	grad_norm 2.6238 (2.6818)	mem 20675MB
[2025-04-03 01:55:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][542/573]	eta 0:00:27 lr 0.000941	time 0.8771 (0.8812)	loss 0.4018 (0.5211)	grad_norm 2.9857 (2.6813)	mem 20675MB
[2025-04-03 01:55:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][544/573]	eta 0:00:25 lr 0.000940	time 0.8773 (0.8812)	loss 0.5998 (0.5214)	grad_norm 2.5420 (2.6801)	mem 20675MB
[2025-04-03 01:55:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][546/573]	eta 0:00:23 lr 0.000940	time 0.8775 (0.8812)	loss 0.5350 (0.5215)	grad_norm 2.4063 (2.6798)	mem 20675MB
[2025-04-03 01:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][548/573]	eta 0:00:22 lr 0.000940	time 0.8771 (0.8812)	loss 0.6024 (0.5217)	grad_norm 1.7745 (2.6785)	mem 20675MB
[2025-04-03 01:55:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][550/573]	eta 0:00:20 lr 0.000940	time 0.8772 (0.8811)	loss 0.5378 (0.5215)	grad_norm 1.9126 (2.6761)	mem 20675MB
[2025-04-03 01:55:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][552/573]	eta 0:00:18 lr 0.000940	time 0.8795 (0.8811)	loss 0.3670 (0.5210)	grad_norm 4.2363 (2.6848)	mem 20675MB
[2025-04-03 01:55:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][554/573]	eta 0:00:16 lr 0.000939	time 0.8770 (0.8811)	loss 0.5247 (0.5210)	grad_norm 2.5129 (2.6846)	mem 20675MB
[2025-04-03 01:55:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][556/573]	eta 0:00:14 lr 0.000939	time 0.8776 (0.8811)	loss 0.4142 (0.5209)	grad_norm 3.2612 (2.6843)	mem 20675MB
[2025-04-03 01:55:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][558/573]	eta 0:00:13 lr 0.000939	time 0.8768 (0.8811)	loss 0.5689 (0.5211)	grad_norm 2.9054 (2.6835)	mem 20675MB
[2025-04-03 01:55:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][560/573]	eta 0:00:11 lr 0.000939	time 0.8770 (0.8811)	loss 0.3844 (0.5207)	grad_norm 2.2984 (2.6881)	mem 20675MB
[2025-04-03 01:55:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][562/573]	eta 0:00:09 lr 0.000939	time 0.8773 (0.8811)	loss 0.5643 (0.5207)	grad_norm 2.5212 (2.6868)	mem 20675MB
[2025-04-03 01:55:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][564/573]	eta 0:00:07 lr 0.000938	time 0.8775 (0.8811)	loss 0.5325 (0.5209)	grad_norm 2.1397 (2.6847)	mem 20675MB
[2025-04-03 01:55:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][566/573]	eta 0:00:06 lr 0.000938	time 0.8785 (0.8811)	loss 0.4284 (0.5207)	grad_norm 2.3925 (2.6838)	mem 20675MB
[2025-04-03 01:55:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][568/573]	eta 0:00:04 lr 0.000938	time 0.8775 (0.8811)	loss 0.6271 (0.5208)	grad_norm 2.0740 (2.6836)	mem 20675MB
[2025-04-03 01:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][570/573]	eta 0:00:02 lr 0.000938	time 0.8772 (0.8811)	loss 0.5022 (0.5208)	grad_norm 3.2022 (2.6832)	mem 20675MB
[2025-04-03 01:55:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][572/573]	eta 0:00:00 lr 0.000938	time 0.8771 (0.8811)	loss 0.3944 (0.5206)	grad_norm 2.5525 (2.6837)	mem 20675MB
[2025-04-03 01:55:33 simmim_finetune] (main_finetune.py 260): INFO EPOCH 9 training takes 0:08:25
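The `time 0.8771 (0.8811)`, `loss`, and `grad_norm` fields in the Train lines above each pair an instantaneous value with a running mean, and `eta` is derived from that mean. A minimal sketch of the bookkeeping (the class name `AverageMeter` and the exact ETA formula are assumptions based on common Swin/SimMIM logging utilities, not confirmed by this log, but the arithmetic reproduces the printed values):

```python
from datetime import timedelta

class AverageMeter:
    """Running-mean tracker; assumed to back the `val (avg)` pairs in the log."""
    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val, n=1):
        self.val = val           # most recent measurement
        self.sum += val * n      # weighted total
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

# ETA as in `Train: [9/30][258/573] eta 0:04:38 ... time 0.8774 (0.8844)`:
# mean batch time multiplied by the batches left in the epoch.
batch_time_avg = 0.8844
num_steps, idx = 573, 258
eta = timedelta(seconds=int(batch_time_avg * (num_steps - idx)))
# str(eta) -> '0:04:38', matching the logged ETA at that step
```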
[2025-04-03 01:55:35 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.975 (1.975)	Loss 0.5753 (0.5753)	Acc@1 60.156 (60.156)	Mem 20675MB
[2025-04-03 01:55:36 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.848)	Loss 0.5004 (0.5300)	Acc@1 73.438 (67.969)	Mem 20675MB
[2025-04-03 01:55:36 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.623)	Loss 0.5578 (0.5341)	Acc@1 70.312 (69.062)	Mem 20675MB
[2025-04-03 01:55:37 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.526)	Loss 0.5338 (0.5268)	Acc@1 73.438 (70.871)	Mem 20675MB
[2025-04-03 01:55:38 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.472)	Loss 0.5417 (0.5158)	Acc@1 72.656 (72.656)	Mem 20675MB
[2025-04-03 01:55:38 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.438)	Loss 0.4757 (0.5127)	Acc@1 85.938 (73.793)	Mem 20675MB
[2025-04-03 01:55:39 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.414)	Loss 0.4959 (0.5072)	Acc@1 75.000 (74.339)	Mem 20675MB
[2025-04-03 01:55:39 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.397)	Loss 0.4550 (0.5016)	Acc@1 85.156 (75.417)	Mem 20675MB
[2025-04-03 01:55:40 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 75.554
[2025-04-03 01:55:40 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 75.6%
[2025-04-03 01:55:40 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 75.55%
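The running Acc@1 in the Test lines is necessarily weighted by batch size: 16 batches of 128 would cover 2048 images, but the set has 1984, so the final batch holds only 64. That is why the final `* Acc@1 75.554` differs from the last printed running mean of 75.417. A small sketch of that arithmetic; the last-batch accuracy below is an inference worked backwards from the logged averages, not a value the log prints:

```python
# Batch-weighted running accuracy, as the Test lines accumulate it.
n_total = 1984
n_full, last = divmod(n_total, 128)          # 15 full batches + 64 leftover images

# Working backwards from the log (an inference, not logged directly):
correct_first15 = 75.417 / 100 * (15 * 128)  # ~1448 correct in batches 0..14
correct_total = 75.554 / 100 * n_total       # ~1499 correct overall
last_batch_acc = (correct_total - correct_first15) / last * 100
# -> roughly 79.7% on the final 64-image batch
```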
[2025-04-03 01:55:40 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.528976448611793e-06, 3.528976448611793e-06, 5.395551342546166e-06, 5.395551342546166e-06, 8.267205025522124e-06, 8.267205025522124e-06, 1.2685133768562064e-05, 1.2685133768562064e-05, 1.9481947219392734e-05, 1.9481947219392734e-05, 2.993858329759377e-05, 2.993858329759377e-05, 4.6025715725595355e-05, 4.6025715725595355e-05, 7.077515023021317e-05, 7.077515023021317e-05, 0.0001088512033142406, 0.0001088512033142406, 0.00016742974652043667, 0.00016742974652043667, 0.0002575505822222767, 0.0002575505822222767, 0.00039619802176356903, 0.00039619802176356903, 0.0006095017749040189, 0.0006095017749040189, 0.0009376613951200955, 0.0009376613951200955]
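The 28 per-group learning rates printed above follow from the config header: a cosine schedule (`BASE_LR 0.00125`, `MIN_LR 2.5e-07`, 30 epochs) combined with `LAYER_DECAY 0.65`. For a depth-12 ViT that gives 14 decay groups (patch embed, 12 blocks, head), each appearing twice (with and without weight decay). A sketch reproducing them at the epoch-10 boundary — the grouping and the scheduler's per-step timing are assumptions, but the values match the logged list to about four significant figures:

```python
import math

BASE_LR, MIN_LR = 0.00125, 2.5e-07   # from the config header
EPOCHS, DEPTH, DECAY = 30, 12, 0.65  # ViT-Base: 12 blocks -> 14 LR groups

def cosine_lr(base, epoch):
    # Cosine anneal from `base` down to MIN_LR over EPOCHS (epoch may be fractional).
    return MIN_LR + 0.5 * (base - MIN_LR) * (1 + math.cos(math.pi * epoch / EPOCHS))

# Layer-wise decay: the head gets scale 1.0, each earlier group another factor 0.65,
# down to 0.65**13 for the patch embedding.
scales = [DECAY ** (DEPTH + 1 - i) for i in range(DEPTH + 2)]
group_lrs = [cosine_lr(BASE_LR * s, epoch=10) for s in scales]
# group_lrs[0]  ~ 3.529e-06  (first logged group)
# group_lrs[-1] ~ 0.000938   (last logged group; also the `lr` shown at epoch 10)
```

The same formula explains the `lr` column in the Train lines when evaluated at fractional epochs, e.g. `cosine_lr(BASE_LR, 9 + 258/573)` rounds to the logged `lr 0.000968`.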
[2025-04-03 01:55:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][0/573]	eta 0:21:27 lr 0.000938	time 2.2478 (2.2478)	loss 0.4463 (0.4463)	grad_norm 3.0726 (3.0726)	mem 20675MB
[2025-04-03 01:55:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][2/573]	eta 0:12:42 lr 0.000937	time 0.8771 (1.3346)	loss 0.3964 (0.4456)	grad_norm 3.0868 (2.7413)	mem 20675MB
[2025-04-03 01:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][4/573]	eta 0:10:55 lr 0.000937	time 0.8773 (1.1521)	loss 0.5279 (0.4817)	grad_norm 2.0749 (2.3633)	mem 20675MB
[2025-04-03 01:55:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][6/573]	eta 0:10:08 lr 0.000937	time 0.8769 (1.0738)	loss 0.4399 (0.4782)	grad_norm 2.4647 (2.5340)	mem 20675MB
[2025-04-03 01:55:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][8/573]	eta 0:09:42 lr 0.000937	time 0.8772 (1.0302)	loss 0.5656 (0.4892)	grad_norm 3.1725 (2.5808)	mem 20675MB
[2025-04-03 01:55:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][10/573]	eta 0:09:24 lr 0.000937	time 0.8773 (1.0026)	loss 0.5516 (0.4940)	grad_norm 3.0662 (2.6511)	mem 20675MB
[2025-04-03 01:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][12/573]	eta 0:09:11 lr 0.000936	time 0.8778 (0.9835)	loss 0.4457 (0.5001)	grad_norm 5.1701 (3.0030)	mem 20675MB
[2025-04-03 01:55:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][14/573]	eta 0:09:01 lr 0.000936	time 0.8769 (0.9695)	loss 0.4969 (0.4950)	grad_norm 2.3887 (2.9417)	mem 20675MB
[2025-04-03 01:55:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][16/573]	eta 0:08:54 lr 0.000936	time 0.8773 (0.9587)	loss 0.5668 (0.5021)	grad_norm 2.8619 (2.8933)	mem 20675MB
[2025-04-03 01:55:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][18/573]	eta 0:08:47 lr 0.000936	time 0.8775 (0.9503)	loss 0.5804 (0.5081)	grad_norm 1.6993 (2.7765)	mem 20675MB
[2025-04-03 01:55:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][20/573]	eta 0:08:41 lr 0.000936	time 0.8772 (0.9434)	loss 0.5362 (0.5081)	grad_norm 2.6511 (2.7697)	mem 20675MB
[2025-04-03 01:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][22/573]	eta 0:08:36 lr 0.000935	time 0.8771 (0.9377)	loss 0.5114 (0.5125)	grad_norm 2.1608 (2.7043)	mem 20675MB
[2025-04-03 01:56:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][24/573]	eta 0:08:32 lr 0.000935	time 0.8771 (0.9330)	loss 0.4916 (0.5123)	grad_norm 2.0612 (2.6431)	mem 20675MB
[2025-04-03 01:56:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][26/573]	eta 0:08:28 lr 0.000935	time 0.8772 (0.9289)	loss 0.5223 (0.5104)	grad_norm 2.1349 (2.6188)	mem 20675MB
[2025-04-03 01:56:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][28/573]	eta 0:08:24 lr 0.000935	time 0.8773 (0.9254)	loss 0.5937 (0.5116)	grad_norm 2.5993 (2.6348)	mem 20675MB
[2025-04-03 01:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][30/573]	eta 0:08:20 lr 0.000935	time 0.8784 (0.9224)	loss 0.5576 (0.5157)	grad_norm 2.8824 (2.6579)	mem 20675MB
[2025-04-03 01:56:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][32/573]	eta 0:08:17 lr 0.000934	time 0.8772 (0.9197)	loss 0.5817 (0.5203)	grad_norm 2.8516 (2.6608)	mem 20675MB
[2025-04-03 01:56:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][34/573]	eta 0:08:14 lr 0.000934	time 0.8775 (0.9174)	loss 0.4358 (0.5193)	grad_norm 2.5956 (2.6482)	mem 20675MB
[2025-04-03 01:56:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][36/573]	eta 0:08:11 lr 0.000934	time 0.8772 (0.9152)	loss 0.4767 (0.5192)	grad_norm 1.8880 (2.6491)	mem 20675MB
[2025-04-03 01:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][38/573]	eta 0:08:08 lr 0.000934	time 0.8776 (0.9133)	loss 0.5927 (0.5187)	grad_norm 2.5136 (2.6363)	mem 20675MB
[2025-04-03 01:56:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][40/573]	eta 0:08:05 lr 0.000934	time 0.8773 (0.9116)	loss 0.4237 (0.5170)	grad_norm 2.4678 (2.6174)	mem 20675MB
[2025-04-03 01:56:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][42/573]	eta 0:08:03 lr 0.000933	time 0.8773 (0.9101)	loss 0.4510 (0.5163)	grad_norm 2.3615 (2.5902)	mem 20675MB
[2025-04-03 01:56:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][44/573]	eta 0:08:00 lr 0.000933	time 0.8772 (0.9087)	loss 0.4402 (0.5167)	grad_norm 4.8244 (2.6523)	mem 20675MB
[2025-04-03 01:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][46/573]	eta 0:07:58 lr 0.000933	time 0.8772 (0.9074)	loss 0.5309 (0.5185)	grad_norm 3.0300 (2.6444)	mem 20675MB
[2025-04-03 01:56:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][48/573]	eta 0:07:55 lr 0.000933	time 0.8770 (0.9062)	loss 0.3956 (0.5149)	grad_norm 2.4157 (2.6719)	mem 20675MB
[2025-04-03 01:56:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][50/573]	eta 0:07:53 lr 0.000933	time 0.8772 (0.9051)	loss 0.5646 (0.5159)	grad_norm 1.9832 (2.6636)	mem 20675MB
[2025-04-03 01:56:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][52/573]	eta 0:07:51 lr 0.000932	time 0.8772 (0.9040)	loss 0.4864 (0.5160)	grad_norm 2.1173 (2.6521)	mem 20675MB
[2025-04-03 01:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][54/573]	eta 0:07:48 lr 0.000932	time 0.8768 (0.9032)	loss 0.5314 (0.5166)	grad_norm 2.2795 (2.6361)	mem 20675MB
[2025-04-03 01:56:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][56/573]	eta 0:07:46 lr 0.000932	time 0.8787 (0.9023)	loss 0.3509 (0.5140)	grad_norm 3.9265 (2.6506)	mem 20675MB
[2025-04-03 01:56:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][58/573]	eta 0:07:44 lr 0.000932	time 0.8771 (0.9015)	loss 0.5527 (0.5133)	grad_norm 1.8764 (2.6640)	mem 20675MB
[2025-04-03 01:56:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][60/573]	eta 0:07:42 lr 0.000932	time 0.8772 (0.9007)	loss 0.4815 (0.5141)	grad_norm 2.7290 (2.6510)	mem 20675MB
[2025-04-03 01:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][62/573]	eta 0:07:39 lr 0.000931	time 0.8771 (0.9000)	loss 0.4969 (0.5119)	grad_norm 2.2769 (2.6510)	mem 20675MB
[2025-04-03 01:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][64/573]	eta 0:07:37 lr 0.000931	time 0.8776 (0.8994)	loss 0.5724 (0.5124)	grad_norm 1.8904 (2.6405)	mem 20675MB
[2025-04-03 01:56:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][66/573]	eta 0:07:35 lr 0.000931	time 0.8773 (0.8988)	loss 0.5101 (0.5129)	grad_norm 3.0729 (2.6344)	mem 20675MB
[2025-04-03 01:56:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][68/573]	eta 0:07:33 lr 0.000931	time 0.8771 (0.8982)	loss 0.5323 (0.5135)	grad_norm 2.6527 (2.6342)	mem 20675MB
[2025-04-03 01:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][70/573]	eta 0:07:31 lr 0.000931	time 0.8773 (0.8976)	loss 0.5939 (0.5147)	grad_norm 2.1040 (2.6241)	mem 20675MB
[2025-04-03 01:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][72/573]	eta 0:07:29 lr 0.000930	time 0.8771 (0.8971)	loss 0.5024 (0.5149)	grad_norm 2.5292 (2.6195)	mem 20675MB
[2025-04-03 01:56:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][74/573]	eta 0:07:27 lr 0.000930	time 0.8774 (0.8966)	loss 0.5986 (0.5149)	grad_norm 2.1048 (2.6154)	mem 20675MB
[2025-04-03 01:56:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][76/573]	eta 0:07:25 lr 0.000930	time 0.8775 (0.8961)	loss 0.6455 (0.5169)	grad_norm 2.4264 (2.6153)	mem 20675MB
[2025-04-03 01:56:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][78/573]	eta 0:07:23 lr 0.000930	time 0.8770 (0.8956)	loss 0.4992 (0.5183)	grad_norm 2.1926 (2.6170)	mem 20675MB
[2025-04-03 01:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][80/573]	eta 0:07:21 lr 0.000930	time 0.8773 (0.8952)	loss 0.5691 (0.5178)	grad_norm 2.4207 (2.6263)	mem 20675MB
[2025-04-03 01:56:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][82/573]	eta 0:07:19 lr 0.000929	time 0.8770 (0.8948)	loss 0.3825 (0.5153)	grad_norm 2.0523 (2.6210)	mem 20675MB
[2025-04-03 01:56:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][84/573]	eta 0:07:17 lr 0.000929	time 0.8775 (0.8944)	loss 0.5071 (0.5153)	grad_norm 2.6925 (2.6307)	mem 20675MB
[2025-04-03 01:56:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][86/573]	eta 0:07:15 lr 0.000929	time 0.8774 (0.8940)	loss 0.5074 (0.5133)	grad_norm 1.8072 (2.6184)	mem 20675MB
[2025-04-03 01:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][88/573]	eta 0:07:13 lr 0.000929	time 0.8774 (0.8937)	loss 0.5757 (0.5141)	grad_norm 3.4215 (2.6260)	mem 20675MB
[2025-04-03 01:57:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][90/573]	eta 0:07:11 lr 0.000929	time 0.8771 (0.8934)	loss 0.5802 (0.5160)	grad_norm 1.8793 (2.6226)	mem 20675MB
[2025-04-03 01:57:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][92/573]	eta 0:07:09 lr 0.000928	time 0.8771 (0.8930)	loss 0.5311 (0.5167)	grad_norm 2.2114 (2.6100)	mem 20675MB
[2025-04-03 01:57:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][94/573]	eta 0:07:07 lr 0.000928	time 0.8780 (0.8927)	loss 0.5022 (0.5171)	grad_norm 2.9772 (2.6127)	mem 20675MB
[2025-04-03 01:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][96/573]	eta 0:07:05 lr 0.000928	time 0.8774 (0.8924)	loss 0.5527 (0.5175)	grad_norm 2.0340 (2.5928)	mem 20675MB
[2025-04-03 01:57:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][98/573]	eta 0:07:03 lr 0.000928	time 0.8774 (0.8921)	loss 0.5816 (0.5191)	grad_norm 1.5723 (2.5729)	mem 20675MB
[2025-04-03 01:57:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][100/573]	eta 0:07:01 lr 0.000928	time 0.8773 (0.8919)	loss 0.4812 (0.5189)	grad_norm 2.3362 (2.5626)	mem 20675MB
[2025-04-03 01:57:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][102/573]	eta 0:06:59 lr 0.000927	time 0.8783 (0.8916)	loss 0.5702 (0.5188)	grad_norm 2.6954 (2.5677)	mem 20675MB
[2025-04-03 01:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][104/573]	eta 0:06:58 lr 0.000927	time 0.8772 (0.8914)	loss 0.5723 (0.5187)	grad_norm 1.9362 (2.5648)	mem 20675MB
[2025-04-03 01:57:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][106/573]	eta 0:06:56 lr 0.000927	time 0.8777 (0.8911)	loss 0.5622 (0.5197)	grad_norm 3.0103 (2.5690)	mem 20675MB
[2025-04-03 01:57:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][108/573]	eta 0:06:54 lr 0.000927	time 0.8772 (0.8909)	loss 0.4894 (0.5185)	grad_norm 2.8084 (2.5858)	mem 20675MB
[2025-04-03 01:57:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][110/573]	eta 0:06:52 lr 0.000927	time 0.8771 (0.8907)	loss 0.5060 (0.5182)	grad_norm 3.6424 (2.5907)	mem 20675MB
[2025-04-03 01:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][112/573]	eta 0:06:50 lr 0.000926	time 0.8773 (0.8904)	loss 0.5953 (0.5182)	grad_norm 2.0392 (2.5798)	mem 20675MB
[2025-04-03 01:57:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][114/573]	eta 0:06:48 lr 0.000926	time 0.8775 (0.8902)	loss 0.5690 (0.5187)	grad_norm 2.8271 (2.5784)	mem 20675MB
[2025-04-03 01:57:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][116/573]	eta 0:06:46 lr 0.000926	time 0.8772 (0.8900)	loss 0.4925 (0.5187)	grad_norm 2.2955 (2.5717)	mem 20675MB
[2025-04-03 01:57:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][118/573]	eta 0:06:44 lr 0.000926	time 0.8773 (0.8898)	loss 0.5159 (0.5186)	grad_norm 2.5401 (2.5750)	mem 20675MB
[2025-04-03 01:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][120/573]	eta 0:06:43 lr 0.000926	time 0.8778 (0.8896)	loss 0.4155 (0.5171)	grad_norm 2.8260 (2.5811)	mem 20675MB
[2025-04-03 01:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][122/573]	eta 0:06:41 lr 0.000925	time 0.8770 (0.8895)	loss 0.3957 (0.5166)	grad_norm 6.7627 (2.6128)	mem 20675MB
[2025-04-03 01:57:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][124/573]	eta 0:06:39 lr 0.000925	time 0.8771 (0.8893)	loss 0.5306 (0.5169)	grad_norm 2.4462 (2.6075)	mem 20675MB
[2025-04-03 01:57:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][126/573]	eta 0:06:37 lr 0.000925	time 0.8770 (0.8891)	loss 0.4687 (0.5171)	grad_norm 2.9288 (2.6342)	mem 20675MB
[2025-04-03 01:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][128/573]	eta 0:06:35 lr 0.000925	time 0.8771 (0.8889)	loss 0.5031 (0.5180)	grad_norm 2.8142 (2.6389)	mem 20675MB
[2025-04-03 01:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][130/573]	eta 0:06:33 lr 0.000925	time 0.8770 (0.8888)	loss 0.6118 (0.5195)	grad_norm 3.0496 (2.6502)	mem 20675MB
[2025-04-03 01:57:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][132/573]	eta 0:06:31 lr 0.000924	time 0.8790 (0.8886)	loss 0.5346 (0.5200)	grad_norm 2.4733 (2.6511)	mem 20675MB
[2025-04-03 01:57:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][134/573]	eta 0:06:30 lr 0.000924	time 0.8772 (0.8885)	loss 0.5758 (0.5212)	grad_norm 1.4587 (2.6340)	mem 20675MB
[2025-04-03 01:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][136/573]	eta 0:06:28 lr 0.000924	time 0.8774 (0.8883)	loss 0.6252 (0.5220)	grad_norm 1.8955 (2.6250)	mem 20675MB
[2025-04-03 01:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][138/573]	eta 0:06:26 lr 0.000924	time 0.8774 (0.8882)	loss 0.5079 (0.5221)	grad_norm 2.0105 (2.6132)	mem 20675MB
[2025-04-03 01:57:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][140/573]	eta 0:06:24 lr 0.000924	time 0.8776 (0.8880)	loss 0.5898 (0.5229)	grad_norm 1.6249 (2.5998)	mem 20675MB
[2025-04-03 01:57:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][142/573]	eta 0:06:22 lr 0.000923	time 0.8772 (0.8879)	loss 0.5536 (0.5234)	grad_norm 2.3363 (2.5897)	mem 20675MB
[2025-04-03 01:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][144/573]	eta 0:06:20 lr 0.000923	time 0.8777 (0.8878)	loss 0.6151 (0.5248)	grad_norm 3.3581 (2.5908)	mem 20675MB
[2025-04-03 01:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][146/573]	eta 0:06:19 lr 0.000923	time 0.8773 (0.8876)	loss 0.5682 (0.5253)	grad_norm 2.1204 (2.5823)	mem 20675MB
[2025-04-03 01:57:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][148/573]	eta 0:06:17 lr 0.000923	time 0.8769 (0.8875)	loss 0.3931 (0.5244)	grad_norm 3.3499 (2.5878)	mem 20675MB
[2025-04-03 01:57:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][150/573]	eta 0:06:15 lr 0.000923	time 0.8769 (0.8874)	loss 0.5782 (0.5246)	grad_norm 2.2462 (2.5849)	mem 20675MB
[2025-04-03 01:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][152/573]	eta 0:06:13 lr 0.000922	time 0.8771 (0.8872)	loss 0.4389 (0.5241)	grad_norm 2.9425 (2.5808)	mem 20675MB
[2025-04-03 01:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][154/573]	eta 0:06:11 lr 0.000922	time 0.8773 (0.8871)	loss 0.5925 (0.5247)	grad_norm 2.0188 (2.5712)	mem 20675MB
[2025-04-03 01:57:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][156/573]	eta 0:06:09 lr 0.000922	time 0.8773 (0.8870)	loss 0.6433 (0.5256)	grad_norm 2.2713 (2.5693)	mem 20675MB
[2025-04-03 01:58:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][158/573]	eta 0:06:08 lr 0.000922	time 0.8771 (0.8869)	loss 0.4706 (0.5253)	grad_norm 4.0889 (2.5793)	mem 20675MB
[2025-04-03 01:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][160/573]	eta 0:06:06 lr 0.000922	time 0.8771 (0.8868)	loss 0.5473 (0.5253)	grad_norm 3.1773 (2.5833)	mem 20675MB
[2025-04-03 01:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][162/573]	eta 0:06:04 lr 0.000921	time 0.8775 (0.8867)	loss 0.6196 (0.5252)	grad_norm 2.2139 (2.5882)	mem 20675MB
[2025-04-03 01:58:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][164/573]	eta 0:06:02 lr 0.000921	time 0.8774 (0.8866)	loss 0.5443 (0.5247)	grad_norm 2.0045 (2.5987)	mem 20675MB
[2025-04-03 01:58:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][166/573]	eta 0:06:00 lr 0.000921	time 0.8771 (0.8865)	loss 0.4083 (0.5232)	grad_norm 2.4542 (2.5975)	mem 20675MB
[2025-04-03 01:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][168/573]	eta 0:05:58 lr 0.000921	time 0.8773 (0.8864)	loss 0.4491 (0.5220)	grad_norm 2.4664 (2.6001)	mem 20675MB
[2025-04-03 01:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][170/573]	eta 0:05:57 lr 0.000921	time 0.8775 (0.8863)	loss 0.5923 (0.5218)	grad_norm 2.8958 (2.6046)	mem 20675MB
[2025-04-03 01:58:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][172/573]	eta 0:05:55 lr 0.000920	time 0.8773 (0.8862)	loss 0.5303 (0.5224)	grad_norm 2.0592 (2.6215)	mem 20675MB
[2025-04-03 01:58:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][174/573]	eta 0:05:53 lr 0.000920	time 0.8772 (0.8861)	loss 0.5672 (0.5227)	grad_norm 2.4149 (2.6158)	mem 20675MB
[2025-04-03 01:58:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][176/573]	eta 0:05:51 lr 0.000920	time 0.8774 (0.8860)	loss 0.6002 (0.5229)	grad_norm 3.2296 (2.6154)	mem 20675MB
[2025-04-03 01:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][178/573]	eta 0:05:49 lr 0.000920	time 0.8773 (0.8859)	loss 0.5806 (0.5229)	grad_norm 2.8137 (2.6175)	mem 20675MB
[2025-04-03 01:58:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][180/573]	eta 0:05:48 lr 0.000920	time 0.8773 (0.8858)	loss 0.5846 (0.5236)	grad_norm 1.9247 (2.6121)	mem 20675MB
[2025-04-03 01:58:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][182/573]	eta 0:05:46 lr 0.000919	time 0.8773 (0.8858)	loss 0.5469 (0.5238)	grad_norm 1.7661 (2.6044)	mem 20675MB
[2025-04-03 01:58:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][184/573]	eta 0:05:44 lr 0.000919	time 0.8772 (0.8857)	loss 0.4845 (0.5239)	grad_norm 2.8312 (2.6003)	mem 20675MB
[2025-04-03 01:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][186/573]	eta 0:05:42 lr 0.000919	time 0.8772 (0.8856)	loss 0.4134 (0.5236)	grad_norm 2.9978 (2.5988)	mem 20675MB
[2025-04-03 01:58:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][188/573]	eta 0:05:40 lr 0.000919	time 0.8771 (0.8855)	loss 0.5483 (0.5239)	grad_norm 2.2194 (2.5931)	mem 20675MB
[2025-04-03 01:58:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][190/573]	eta 0:05:39 lr 0.000919	time 0.8772 (0.8854)	loss 0.5948 (0.5245)	grad_norm 1.8472 (2.5845)	mem 20675MB
[2025-04-03 01:58:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][192/573]	eta 0:05:37 lr 0.000918	time 0.8771 (0.8854)	loss 0.4355 (0.5241)	grad_norm 2.0825 (2.5771)	mem 20675MB
[2025-04-03 01:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][194/573]	eta 0:05:35 lr 0.000918	time 0.8770 (0.8853)	loss 0.5459 (0.5237)	grad_norm 2.0857 (2.5745)	mem 20675MB
[2025-04-03 01:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][196/573]	eta 0:05:33 lr 0.000918	time 0.8771 (0.8852)	loss 0.4191 (0.5235)	grad_norm 3.7068 (2.5768)	mem 20675MB
[2025-04-03 01:58:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][198/573]	eta 0:05:31 lr 0.000918	time 0.8772 (0.8851)	loss 0.6159 (0.5242)	grad_norm 2.6911 (2.5741)	mem 20675MB
[2025-04-03 01:58:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][200/573]	eta 0:05:30 lr 0.000918	time 0.8796 (0.8851)	loss 0.5930 (0.5246)	grad_norm 2.4508 (2.5732)	mem 20675MB
[2025-04-03 01:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][202/573]	eta 0:05:28 lr 0.000917	time 0.8769 (0.8850)	loss 0.5409 (0.5248)	grad_norm 2.3622 (2.5767)	mem 20675MB
[2025-04-03 01:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][204/573]	eta 0:05:26 lr 0.000917	time 0.8770 (0.8849)	loss 0.5661 (0.5241)	grad_norm 1.9236 (2.5725)	mem 20675MB
[2025-04-03 01:58:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][206/573]	eta 0:05:24 lr 0.000917	time 0.8771 (0.8849)	loss 0.5238 (0.5244)	grad_norm 2.8003 (2.5761)	mem 20675MB
[2025-04-03 01:58:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][208/573]	eta 0:05:22 lr 0.000917	time 0.8770 (0.8848)	loss 0.6354 (0.5249)	grad_norm 2.4937 (2.5733)	mem 20675MB
[2025-04-03 01:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][210/573]	eta 0:05:21 lr 0.000917	time 0.8772 (0.8848)	loss 0.6275 (0.5255)	grad_norm 1.9438 (2.5681)	mem 20675MB
[2025-04-03 01:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][212/573]	eta 0:05:19 lr 0.000916	time 0.8769 (0.8847)	loss 0.5425 (0.5256)	grad_norm 1.9012 (2.5643)	mem 20675MB
[2025-04-03 01:58:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][214/573]	eta 0:05:17 lr 0.000916	time 0.8769 (0.8846)	loss 0.4006 (0.5253)	grad_norm 3.0601 (2.5661)	mem 20675MB
[2025-04-03 01:58:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][216/573]	eta 0:05:15 lr 0.000916	time 0.8770 (0.8846)	loss 0.6044 (0.5252)	grad_norm 3.0196 (2.5678)	mem 20675MB
[2025-04-03 01:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][218/573]	eta 0:05:13 lr 0.000916	time 0.8770 (0.8845)	loss 0.5612 (0.5253)	grad_norm 2.1001 (2.5610)	mem 20675MB
[2025-04-03 01:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][220/573]	eta 0:05:12 lr 0.000916	time 0.8770 (0.8844)	loss 0.5172 (0.5254)	grad_norm 1.8066 (2.5563)	mem 20675MB
[2025-04-03 01:58:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][222/573]	eta 0:05:10 lr 0.000915	time 0.8769 (0.8844)	loss 0.4665 (0.5247)	grad_norm 3.0311 (2.5677)	mem 20675MB
[2025-04-03 01:58:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][224/573]	eta 0:05:08 lr 0.000915	time 0.8773 (0.8843)	loss 0.4259 (0.5245)	grad_norm 2.8351 (2.5662)	mem 20675MB
[2025-04-03 01:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][226/573]	eta 0:05:06 lr 0.000915	time 0.8769 (0.8843)	loss 0.4924 (0.5242)	grad_norm 2.2738 (2.5618)	mem 20675MB
[2025-04-03 01:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][228/573]	eta 0:05:05 lr 0.000915	time 0.8772 (0.8842)	loss 0.4459 (0.5232)	grad_norm 4.7078 (2.5765)	mem 20675MB
[2025-04-03 01:59:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][230/573]	eta 0:05:03 lr 0.000915	time 0.8773 (0.8842)	loss 0.4960 (0.5232)	grad_norm 3.5600 (2.5784)	mem 20675MB
[2025-04-03 01:59:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][232/573]	eta 0:05:01 lr 0.000914	time 0.8770 (0.8841)	loss 0.5531 (0.5227)	grad_norm 2.4559 (2.5799)	mem 20675MB
[2025-04-03 01:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][234/573]	eta 0:04:59 lr 0.000914	time 0.8774 (0.8841)	loss 0.6311 (0.5236)	grad_norm 2.6451 (2.5813)	mem 20675MB
[2025-04-03 01:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][236/573]	eta 0:04:57 lr 0.000914	time 0.8770 (0.8840)	loss 0.5620 (0.5239)	grad_norm 2.9200 (2.5820)	mem 20675MB
[2025-04-03 01:59:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][238/573]	eta 0:04:56 lr 0.000914	time 0.8774 (0.8839)	loss 0.5694 (0.5242)	grad_norm 1.8181 (2.5789)	mem 20675MB
[2025-04-03 01:59:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][240/573]	eta 0:04:54 lr 0.000914	time 0.8788 (0.8839)	loss 0.4726 (0.5235)	grad_norm 2.3537 (2.5852)	mem 20675MB
[2025-04-03 01:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][242/573]	eta 0:04:52 lr 0.000913	time 0.8770 (0.8839)	loss 0.5898 (0.5241)	grad_norm 2.4500 (2.5811)	mem 20675MB
[2025-04-03 01:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][244/573]	eta 0:04:50 lr 0.000913	time 0.8774 (0.8838)	loss 0.5585 (0.5247)	grad_norm 2.4225 (2.5775)	mem 20675MB
[2025-04-03 01:59:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][246/573]	eta 0:04:48 lr 0.000913	time 0.8773 (0.8838)	loss 0.5785 (0.5247)	grad_norm 2.3833 (2.5753)	mem 20675MB
[2025-04-03 01:59:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][248/573]	eta 0:04:47 lr 0.000913	time 0.8771 (0.8837)	loss 0.5447 (0.5245)	grad_norm 1.6823 (2.5705)	mem 20675MB
[2025-04-03 01:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][250/573]	eta 0:04:45 lr 0.000913	time 0.8773 (0.8837)	loss 0.5934 (0.5250)	grad_norm 2.7348 (2.5680)	mem 20675MB
[2025-04-03 01:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][252/573]	eta 0:04:43 lr 0.000912	time 0.8773 (0.8836)	loss 0.4067 (0.5248)	grad_norm 2.0629 (2.5647)	mem 20675MB
[2025-04-03 01:59:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][254/573]	eta 0:04:41 lr 0.000912	time 0.8769 (0.8836)	loss 0.4586 (0.5246)	grad_norm 2.1195 (2.5658)	mem 20675MB
[2025-04-03 01:59:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][256/573]	eta 0:04:40 lr 0.000912	time 0.8772 (0.8835)	loss 0.4930 (0.5246)	grad_norm 2.1040 (2.5630)	mem 20675MB
[2025-04-03 01:59:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][258/573]	eta 0:04:38 lr 0.000912	time 0.8771 (0.8835)	loss 0.5852 (0.5247)	grad_norm 3.1823 (2.5667)	mem 20675MB
[2025-04-03 01:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][260/573]	eta 0:04:36 lr 0.000912	time 0.8774 (0.8835)	loss 0.4449 (0.5247)	grad_norm 2.5980 (2.5679)	mem 20675MB
[2025-04-03 01:59:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][262/573]	eta 0:04:34 lr 0.000911	time 0.8779 (0.8834)	loss 0.4396 (0.5245)	grad_norm 2.2332 (2.5643)	mem 20675MB
[2025-04-03 01:59:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][264/573]	eta 0:04:32 lr 0.000911	time 0.8775 (0.8834)	loss 0.5653 (0.5244)	grad_norm 2.7078 (2.5690)	mem 20675MB
[2025-04-03 01:59:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][266/573]	eta 0:04:31 lr 0.000911	time 0.8774 (0.8833)	loss 0.5815 (0.5247)	grad_norm 2.1724 (2.5722)	mem 20675MB
[2025-04-03 01:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][268/573]	eta 0:04:29 lr 0.000911	time 0.8771 (0.8833)	loss 0.4387 (0.5245)	grad_norm 2.7645 (2.5717)	mem 20675MB
[2025-04-03 01:59:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][270/573]	eta 0:04:27 lr 0.000910	time 0.8771 (0.8833)	loss 0.5527 (0.5246)	grad_norm 3.1837 (2.5732)	mem 20675MB
[2025-04-03 01:59:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][272/573]	eta 0:04:25 lr 0.000910	time 0.8777 (0.8832)	loss 0.5041 (0.5243)	grad_norm 3.3035 (2.5772)	mem 20675MB
[2025-04-03 01:59:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][274/573]	eta 0:04:24 lr 0.000910	time 0.8771 (0.8832)	loss 0.5494 (0.5242)	grad_norm 1.8007 (2.5710)	mem 20675MB
[2025-04-03 01:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][276/573]	eta 0:04:22 lr 0.000910	time 0.8770 (0.8832)	loss 0.5221 (0.5240)	grad_norm 3.0236 (2.5741)	mem 20675MB
[2025-04-03 01:59:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][278/573]	eta 0:04:20 lr 0.000910	time 0.8771 (0.8831)	loss 0.5101 (0.5244)	grad_norm 2.6289 (2.5772)	mem 20675MB
[2025-04-03 01:59:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][280/573]	eta 0:04:18 lr 0.000909	time 0.8773 (0.8831)	loss 0.4901 (0.5242)	grad_norm 2.9019 (2.5798)	mem 20675MB
[2025-04-03 01:59:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][282/573]	eta 0:04:16 lr 0.000909	time 0.8773 (0.8830)	loss 0.5687 (0.5246)	grad_norm 2.5123 (2.5769)	mem 20675MB
[2025-04-03 01:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][284/573]	eta 0:04:15 lr 0.000909	time 0.8776 (0.8830)	loss 0.5798 (0.5248)	grad_norm 4.2840 (2.5835)	mem 20675MB
[2025-04-03 01:59:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][286/573]	eta 0:04:13 lr 0.000909	time 0.8772 (0.8830)	loss 0.5620 (0.5246)	grad_norm 2.6010 (2.5857)	mem 20675MB
[2025-04-03 01:59:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][288/573]	eta 0:04:11 lr 0.000909	time 0.8775 (0.8829)	loss 0.4414 (0.5243)	grad_norm 2.7800 (2.5847)	mem 20675MB
[2025-04-03 01:59:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][290/573]	eta 0:04:09 lr 0.000908	time 0.8770 (0.8829)	loss 0.5459 (0.5240)	grad_norm 1.7035 (2.5854)	mem 20675MB
[2025-04-03 01:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][292/573]	eta 0:04:08 lr 0.000908	time 0.8775 (0.8829)	loss 0.5560 (0.5239)	grad_norm 1.4216 (2.5856)	mem 20675MB
[2025-04-03 02:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][294/573]	eta 0:04:06 lr 0.000908	time 0.8770 (0.8828)	loss 0.4327 (0.5236)	grad_norm 2.5315 (2.5835)	mem 20675MB
[2025-04-03 02:00:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][296/573]	eta 0:04:04 lr 0.000908	time 0.8772 (0.8828)	loss 0.3731 (0.5234)	grad_norm 2.9004 (2.5861)	mem 20675MB
[2025-04-03 02:00:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][298/573]	eta 0:04:02 lr 0.000908	time 0.8777 (0.8828)	loss 0.3969 (0.5231)	grad_norm 4.1377 (2.5942)	mem 20675MB
[2025-04-03 02:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][300/573]	eta 0:04:00 lr 0.000907	time 0.8774 (0.8828)	loss 0.3674 (0.5227)	grad_norm 3.7158 (2.5954)	mem 20675MB
[2025-04-03 02:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][302/573]	eta 0:03:59 lr 0.000907	time 0.8771 (0.8827)	loss 0.3976 (0.5218)	grad_norm 2.9032 (2.5974)	mem 20675MB
[2025-04-03 02:00:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][304/573]	eta 0:03:57 lr 0.000907	time 0.8775 (0.8827)	loss 0.5689 (0.5221)	grad_norm 2.3132 (2.5966)	mem 20675MB
[2025-04-03 02:00:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][306/573]	eta 0:03:55 lr 0.000907	time 0.8776 (0.8827)	loss 0.4070 (0.5218)	grad_norm 3.0858 (2.5988)	mem 20675MB
[2025-04-03 02:00:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][308/573]	eta 0:03:53 lr 0.000907	time 0.8772 (0.8826)	loss 0.5041 (0.5218)	grad_norm 1.9815 (2.5942)	mem 20675MB
[2025-04-03 02:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][310/573]	eta 0:03:52 lr 0.000906	time 0.8777 (0.8826)	loss 0.4589 (0.5217)	grad_norm 3.4162 (2.6043)	mem 20675MB
[2025-04-03 02:00:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][312/573]	eta 0:03:50 lr 0.000906	time 0.8772 (0.8826)	loss 0.5629 (0.5218)	grad_norm 2.8700 (2.6029)	mem 20675MB
[2025-04-03 02:00:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][314/573]	eta 0:03:48 lr 0.000906	time 0.8775 (0.8826)	loss 0.6006 (0.5219)	grad_norm 2.1302 (2.6022)	mem 20675MB
[2025-04-03 02:00:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][316/573]	eta 0:03:46 lr 0.000906	time 0.8774 (0.8825)	loss 0.4511 (0.5218)	grad_norm 3.2795 (2.6037)	mem 20675MB
[2025-04-03 02:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][318/573]	eta 0:03:45 lr 0.000906	time 0.8777 (0.8825)	loss 0.3652 (0.5213)	grad_norm 3.7494 (2.6056)	mem 20675MB
[2025-04-03 02:00:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][320/573]	eta 0:03:43 lr 0.000905	time 0.8773 (0.8825)	loss 0.4820 (0.5211)	grad_norm 2.7112 (2.6030)	mem 20675MB
[2025-04-03 02:00:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][322/573]	eta 0:03:41 lr 0.000905	time 0.8773 (0.8824)	loss 0.6171 (0.5217)	grad_norm 2.1149 (2.6020)	mem 20675MB
[2025-04-03 02:00:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][324/573]	eta 0:03:39 lr 0.000905	time 0.8774 (0.8824)	loss 0.5059 (0.5216)	grad_norm 2.3633 (2.5983)	mem 20675MB
[2025-04-03 02:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][326/573]	eta 0:03:37 lr 0.000905	time 0.8772 (0.8824)	loss 0.4639 (0.5216)	grad_norm 2.6264 (2.5981)	mem 20675MB
[2025-04-03 02:00:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][328/573]	eta 0:03:36 lr 0.000905	time 0.8779 (0.8824)	loss 0.3772 (0.5207)	grad_norm 2.4235 (2.6038)	mem 20675MB
[2025-04-03 02:00:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][330/573]	eta 0:03:34 lr 0.000904	time 0.8778 (0.8824)	loss 0.5412 (0.5210)	grad_norm 3.0096 (2.6061)	mem 20675MB
[2025-04-03 02:00:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][332/573]	eta 0:03:32 lr 0.000904	time 0.8776 (0.8823)	loss 0.5342 (0.5211)	grad_norm 1.6842 (2.6016)	mem 20675MB
[2025-04-03 02:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][334/573]	eta 0:03:30 lr 0.000904	time 0.8778 (0.8823)	loss 0.5718 (0.5213)	grad_norm 2.5305 (2.5988)	mem 20675MB
[2025-04-03 02:00:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][336/573]	eta 0:03:29 lr 0.000904	time 0.8779 (0.8823)	loss 0.5782 (0.5217)	grad_norm 2.3877 (2.5981)	mem 20675MB
[2025-04-03 02:00:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][338/573]	eta 0:03:27 lr 0.000904	time 0.8779 (0.8823)	loss 0.5485 (0.5220)	grad_norm 1.4670 (2.5953)	mem 20675MB
[2025-04-03 02:00:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][340/573]	eta 0:03:25 lr 0.000903	time 0.8777 (0.8822)	loss 0.5180 (0.5220)	grad_norm 2.1588 (2.5923)	mem 20675MB
[2025-04-03 02:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][342/573]	eta 0:03:23 lr 0.000903	time 0.8780 (0.8822)	loss 0.5186 (0.5219)	grad_norm 1.6909 (2.5903)	mem 20675MB
[2025-04-03 02:00:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][344/573]	eta 0:03:22 lr 0.000903	time 0.8787 (0.8822)	loss 0.6192 (0.5219)	grad_norm 1.8347 (2.5866)	mem 20675MB
[2025-04-03 02:00:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][346/573]	eta 0:03:20 lr 0.000903	time 0.8776 (0.8822)	loss 0.6072 (0.5222)	grad_norm 2.2937 (2.5844)	mem 20675MB
[2025-04-03 02:00:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][348/573]	eta 0:03:18 lr 0.000903	time 0.8854 (0.8822)	loss 0.5869 (0.5223)	grad_norm 2.2317 (2.5838)	mem 20675MB
[2025-04-03 02:00:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][350/573]	eta 0:03:16 lr 0.000902	time 0.8778 (0.8822)	loss 0.5293 (0.5219)	grad_norm 1.8865 (2.5829)	mem 20675MB
[2025-04-03 02:00:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][352/573]	eta 0:03:14 lr 0.000902	time 0.8787 (0.8822)	loss 0.5057 (0.5218)	grad_norm 1.9052 (2.5802)	mem 20675MB
[2025-04-03 02:00:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][354/573]	eta 0:03:13 lr 0.000902	time 0.8782 (0.8821)	loss 0.4656 (0.5218)	grad_norm 3.6340 (2.5913)	mem 20675MB
[2025-04-03 02:00:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][356/573]	eta 0:03:11 lr 0.000902	time 0.8776 (0.8821)	loss 0.5375 (0.5220)	grad_norm 2.0494 (2.5912)	mem 20675MB
[2025-04-03 02:00:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][358/573]	eta 0:03:09 lr 0.000902	time 0.8780 (0.8821)	loss 0.4481 (0.5219)	grad_norm 3.0621 (2.5950)	mem 20675MB
[2025-04-03 02:00:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][360/573]	eta 0:03:07 lr 0.000901	time 0.8780 (0.8821)	loss 0.6525 (0.5224)	grad_norm 2.8123 (2.5945)	mem 20675MB
[2025-04-03 02:01:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][362/573]	eta 0:03:06 lr 0.000901	time 0.8779 (0.8821)	loss 0.5502 (0.5222)	grad_norm 3.0343 (2.5960)	mem 20675MB
[2025-04-03 02:01:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][364/573]	eta 0:03:04 lr 0.000901	time 0.8776 (0.8821)	loss 0.4216 (0.5216)	grad_norm 3.2224 (2.5997)	mem 20675MB
[2025-04-03 02:01:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][366/573]	eta 0:03:02 lr 0.000901	time 0.8776 (0.8820)	loss 0.5002 (0.5215)	grad_norm 2.2548 (2.6003)	mem 20675MB
[2025-04-03 02:01:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][368/573]	eta 0:03:00 lr 0.000900	time 0.8777 (0.8820)	loss 0.4339 (0.5215)	grad_norm 4.6295 (2.6079)	mem 20675MB
[2025-04-03 02:01:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][370/573]	eta 0:02:59 lr 0.000900	time 0.8782 (0.8820)	loss 0.5578 (0.5215)	grad_norm 1.8372 (2.6066)	mem 20675MB
[2025-04-03 02:01:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][372/573]	eta 0:02:57 lr 0.000900	time 0.8777 (0.8820)	loss 0.5533 (0.5215)	grad_norm 2.1020 (2.6080)	mem 20675MB
[2025-04-03 02:01:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][374/573]	eta 0:02:55 lr 0.000900	time 0.8776 (0.8820)	loss 0.4384 (0.5214)	grad_norm 2.1815 (2.6048)	mem 20675MB
[2025-04-03 02:01:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][376/573]	eta 0:02:53 lr 0.000900	time 0.8774 (0.8820)	loss 0.3416 (0.5208)	grad_norm 3.0461 (2.6072)	mem 20675MB
[2025-04-03 02:01:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][378/573]	eta 0:02:51 lr 0.000899	time 0.8777 (0.8819)	loss 0.4461 (0.5208)	grad_norm 2.8410 (2.6063)	mem 20675MB
[2025-04-03 02:01:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][380/573]	eta 0:02:50 lr 0.000899	time 0.8780 (0.8819)	loss 0.5521 (0.5208)	grad_norm 2.3300 (2.6071)	mem 20675MB
[2025-04-03 02:01:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][382/573]	eta 0:02:48 lr 0.000899	time 0.8775 (0.8819)	loss 0.3930 (0.5204)	grad_norm 4.0833 (2.6135)	mem 20675MB
[2025-04-03 02:01:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][384/573]	eta 0:02:46 lr 0.000899	time 0.8777 (0.8819)	loss 0.5890 (0.5206)	grad_norm 1.9541 (2.6143)	mem 20675MB
[2025-04-03 02:01:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][386/573]	eta 0:02:44 lr 0.000899	time 0.8777 (0.8819)	loss 0.5804 (0.5209)	grad_norm 2.0042 (2.6129)	mem 20675MB
[2025-04-03 02:01:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][388/573]	eta 0:02:43 lr 0.000898	time 0.8788 (0.8819)	loss 0.5418 (0.5211)	grad_norm 2.4630 (2.6154)	mem 20675MB
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][390/573]	eta 0:02:41 lr 0.000898	time 0.8780 (0.8819)	loss 0.5653 (0.5211)	grad_norm 1.5877 (2.6124)	mem 20675MB
[2025-04-03 02:01:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][392/573]	eta 0:02:39 lr 0.000898	time 0.8779 (0.8818)	loss 0.3858 (0.5205)	grad_norm 2.1952 (2.6110)	mem 20675MB
[2025-04-03 02:01:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][394/573]	eta 0:02:37 lr 0.000898	time 0.8822 (0.8818)	loss 0.4853 (0.5205)	grad_norm 2.7452 (2.6081)	mem 20675MB
[2025-04-03 02:01:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][396/573]	eta 0:02:36 lr 0.000898	time 0.8777 (0.8818)	loss 0.5428 (0.5205)	grad_norm 3.8845 (2.6126)	mem 20675MB
[2025-04-03 02:01:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][398/573]	eta 0:02:34 lr 0.000897	time 0.8782 (0.8818)	loss 0.5613 (0.5206)	grad_norm 1.9031 (2.6099)	mem 20675MB
[2025-04-03 02:01:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][400/573]	eta 0:02:32 lr 0.000897	time 0.8791 (0.8818)	loss 0.5678 (0.5206)	grad_norm 3.1247 (2.6100)	mem 20675MB
[2025-04-03 02:01:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][402/573]	eta 0:02:30 lr 0.000897	time 0.8776 (0.8818)	loss 0.5469 (0.5207)	grad_norm 2.2286 (2.6088)	mem 20675MB
[2025-04-03 02:01:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][404/573]	eta 0:02:29 lr 0.000897	time 0.8897 (0.8818)	loss 0.4316 (0.5204)	grad_norm 2.3141 (2.6115)	mem 20675MB
[2025-04-03 02:01:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][406/573]	eta 0:02:27 lr 0.000897	time 0.8782 (0.8818)	loss 0.3773 (0.5197)	grad_norm 3.2771 (2.6145)	mem 20675MB
[2025-04-03 02:01:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][408/573]	eta 0:02:25 lr 0.000896	time 0.8776 (0.8818)	loss 0.4146 (0.5195)	grad_norm 4.9191 (2.6180)	mem 20675MB
[2025-04-03 02:01:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][410/573]	eta 0:02:23 lr 0.000896	time 0.8774 (0.8817)	loss 0.5169 (0.5197)	grad_norm 1.9882 (2.6157)	mem 20675MB
[2025-04-03 02:01:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][412/573]	eta 0:02:21 lr 0.000896	time 0.8775 (0.8817)	loss 0.4591 (0.5192)	grad_norm 2.4577 (2.6161)	mem 20675MB
[2025-04-03 02:01:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][414/573]	eta 0:02:20 lr 0.000896	time 0.8773 (0.8817)	loss 0.4849 (0.5191)	grad_norm 4.0043 (2.6187)	mem 20675MB
[2025-04-03 02:01:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][416/573]	eta 0:02:18 lr 0.000896	time 0.8774 (0.8817)	loss 0.4674 (0.5190)	grad_norm 2.5403 (2.6180)	mem 20675MB
[2025-04-03 02:01:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][418/573]	eta 0:02:16 lr 0.000895	time 0.8774 (0.8817)	loss 0.6280 (0.5193)	grad_norm 3.0778 (2.6198)	mem 20675MB
[2025-04-03 02:01:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][420/573]	eta 0:02:14 lr 0.000895	time 0.8775 (0.8817)	loss 0.5504 (0.5191)	grad_norm 2.1029 (2.6203)	mem 20675MB
[2025-04-03 02:01:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][422/573]	eta 0:02:13 lr 0.000895	time 0.8777 (0.8817)	loss 0.6036 (0.5192)	grad_norm 2.1425 (2.6215)	mem 20675MB
[2025-04-03 02:01:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][424/573]	eta 0:02:11 lr 0.000895	time 0.8773 (0.8816)	loss 0.5469 (0.5192)	grad_norm 1.8616 (2.6203)	mem 20675MB
[2025-04-03 02:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][426/573]	eta 0:02:09 lr 0.000895	time 0.8770 (0.8816)	loss 0.4168 (0.5190)	grad_norm 3.2436 (2.6219)	mem 20675MB
[2025-04-03 02:01:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][428/573]	eta 0:02:07 lr 0.000894	time 0.8775 (0.8816)	loss 0.3678 (0.5187)	grad_norm 2.4445 (2.6196)	mem 20675MB
[2025-04-03 02:02:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][430/573]	eta 0:02:06 lr 0.000894	time 0.8776 (0.8816)	loss 0.6756 (0.5190)	grad_norm 2.4890 (2.6178)	mem 20675MB
[2025-04-03 02:02:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][432/573]	eta 0:02:04 lr 0.000894	time 0.8771 (0.8816)	loss 0.5178 (0.5189)	grad_norm 3.1974 (2.6182)	mem 20675MB
[2025-04-03 02:02:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][434/573]	eta 0:02:02 lr 0.000894	time 0.8772 (0.8816)	loss 0.6167 (0.5192)	grad_norm 2.2938 (2.6158)	mem 20675MB
[2025-04-03 02:02:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][436/573]	eta 0:02:00 lr 0.000893	time 0.8786 (0.8815)	loss 0.5001 (0.5194)	grad_norm 4.0894 (2.6219)	mem 20675MB
[2025-04-03 02:02:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][438/573]	eta 0:01:59 lr 0.000893	time 0.8772 (0.8815)	loss 0.4718 (0.5191)	grad_norm 2.5906 (2.6214)	mem 20675MB
[2025-04-03 02:02:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][440/573]	eta 0:01:57 lr 0.000893	time 0.8774 (0.8815)	loss 0.5922 (0.5192)	grad_norm 1.9050 (2.6177)	mem 20675MB
[2025-04-03 02:02:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][442/573]	eta 0:01:55 lr 0.000893	time 0.8773 (0.8815)	loss 0.5220 (0.5193)	grad_norm 2.1027 (2.6137)	mem 20675MB
[2025-04-03 02:02:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][444/573]	eta 0:01:53 lr 0.000893	time 0.8774 (0.8815)	loss 0.5868 (0.5195)	grad_norm 2.3807 (2.6118)	mem 20675MB
[2025-04-03 02:02:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][446/573]	eta 0:01:51 lr 0.000892	time 0.8774 (0.8815)	loss 0.5961 (0.5196)	grad_norm 2.6230 (2.6141)	mem 20675MB
[2025-04-03 02:02:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][448/573]	eta 0:01:50 lr 0.000892	time 0.8775 (0.8815)	loss 0.4796 (0.5196)	grad_norm 2.5269 (2.6132)	mem 20675MB
[2025-04-03 02:02:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][450/573]	eta 0:01:48 lr 0.000892	time 0.8774 (0.8815)	loss 0.5149 (0.5198)	grad_norm 1.6131 (2.6113)	mem 20675MB
[2025-04-03 02:02:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][452/573]	eta 0:01:46 lr 0.000892	time 0.8774 (0.8814)	loss 0.4318 (0.5197)	grad_norm 4.3026 (2.6169)	mem 20675MB
[2025-04-03 02:02:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][454/573]	eta 0:01:44 lr 0.000892	time 0.8774 (0.8814)	loss 0.6028 (0.5196)	grad_norm 1.9698 (2.6198)	mem 20675MB
[2025-04-03 02:02:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][456/573]	eta 0:01:43 lr 0.000891	time 0.8772 (0.8814)	loss 0.5199 (0.5196)	grad_norm 2.4191 (2.6183)	mem 20675MB
[2025-04-03 02:02:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][458/573]	eta 0:01:41 lr 0.000891	time 0.8775 (0.8814)	loss 0.6322 (0.5199)	grad_norm 2.6533 (2.6177)	mem 20675MB
[2025-04-03 02:02:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][460/573]	eta 0:01:39 lr 0.000891	time 0.8772 (0.8814)	loss 0.5111 (0.5200)	grad_norm 2.1000 (2.6164)	mem 20675MB
[2025-04-03 02:02:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][462/573]	eta 0:01:37 lr 0.000891	time 0.8775 (0.8814)	loss 0.3962 (0.5198)	grad_norm 4.0347 (2.6198)	mem 20675MB
[2025-04-03 02:02:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][464/573]	eta 0:01:36 lr 0.000891	time 0.8775 (0.8814)	loss 0.4799 (0.5197)	grad_norm 2.7095 (2.6181)	mem 20675MB
[2025-04-03 02:02:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][466/573]	eta 0:01:34 lr 0.000890	time 0.8777 (0.8813)	loss 0.5898 (0.5200)	grad_norm 2.7699 (2.6216)	mem 20675MB
[2025-04-03 02:02:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][468/573]	eta 0:01:32 lr 0.000890	time 0.8772 (0.8813)	loss 0.6347 (0.5203)	grad_norm 2.7736 (2.6207)	mem 20675MB
[2025-04-03 02:02:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][470/573]	eta 0:01:30 lr 0.000890	time 0.8770 (0.8813)	loss 0.4756 (0.5200)	grad_norm 1.6917 (2.6182)	mem 20675MB
[2025-04-03 02:02:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][472/573]	eta 0:01:29 lr 0.000890	time 0.8771 (0.8813)	loss 0.5650 (0.5203)	grad_norm 1.7217 (2.6156)	mem 20675MB
[2025-04-03 02:02:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][474/573]	eta 0:01:27 lr 0.000890	time 0.8770 (0.8813)	loss 0.4518 (0.5200)	grad_norm 3.2515 (2.6174)	mem 20675MB
[2025-04-03 02:02:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][476/573]	eta 0:01:25 lr 0.000889	time 0.8772 (0.8813)	loss 0.3857 (0.5197)	grad_norm 2.9873 (2.6174)	mem 20675MB
[2025-04-03 02:02:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][478/573]	eta 0:01:23 lr 0.000889	time 0.8774 (0.8813)	loss 0.4698 (0.5195)	grad_norm 4.1340 (2.6217)	mem 20675MB
[2025-04-03 02:02:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][480/573]	eta 0:01:21 lr 0.000889	time 0.8773 (0.8812)	loss 0.5056 (0.5191)	grad_norm 2.0251 (2.6191)	mem 20675MB
[2025-04-03 02:02:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][482/573]	eta 0:01:20 lr 0.000889	time 0.8769 (0.8812)	loss 0.4165 (0.5187)	grad_norm 5.6806 (2.6287)	mem 20675MB
[2025-04-03 02:02:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][484/573]	eta 0:01:18 lr 0.000889	time 0.8771 (0.8812)	loss 0.4009 (0.5186)	grad_norm 3.3504 (2.6313)	mem 20675MB
[2025-04-03 02:02:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][486/573]	eta 0:01:16 lr 0.000888	time 0.8770 (0.8812)	loss 0.4924 (0.5185)	grad_norm 4.2167 (2.6381)	mem 20675MB
[2025-04-03 02:02:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][488/573]	eta 0:01:14 lr 0.000888	time 0.8776 (0.8812)	loss 0.4860 (0.5185)	grad_norm 1.9971 (2.6374)	mem 20675MB
[2025-04-03 02:02:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][490/573]	eta 0:01:13 lr 0.000888	time 0.8772 (0.8812)	loss 0.5141 (0.5185)	grad_norm 2.8970 (2.6367)	mem 20675MB
[2025-04-03 02:02:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][492/573]	eta 0:01:11 lr 0.000888	time 0.8772 (0.8812)	loss 0.5691 (0.5187)	grad_norm 1.7296 (2.6352)	mem 20675MB
[2025-04-03 02:02:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][494/573]	eta 0:01:09 lr 0.000887	time 0.8772 (0.8812)	loss 0.4929 (0.5185)	grad_norm 2.8965 (2.6364)	mem 20675MB
[2025-04-03 02:02:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][496/573]	eta 0:01:07 lr 0.000887	time 0.8773 (0.8811)	loss 0.5478 (0.5184)	grad_norm 1.9315 (2.6361)	mem 20675MB
[2025-04-03 02:02:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][498/573]	eta 0:01:06 lr 0.000887	time 0.8770 (0.8811)	loss 0.5261 (0.5181)	grad_norm 2.2373 (2.6359)	mem 20675MB
[2025-04-03 02:03:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][500/573]	eta 0:01:04 lr 0.000887	time 0.8773 (0.8811)	loss 0.4266 (0.5180)	grad_norm 2.2582 (2.6354)	mem 20675MB
[2025-04-03 02:03:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][502/573]	eta 0:01:02 lr 0.000887	time 0.8772 (0.8811)	loss 0.5524 (0.5180)	grad_norm 2.6974 (2.6364)	mem 20675MB
[2025-04-03 02:03:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][504/573]	eta 0:01:00 lr 0.000886	time 0.8772 (0.8811)	loss 0.5088 (0.5181)	grad_norm 2.7989 (2.6372)	mem 20675MB
[2025-04-03 02:03:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][506/573]	eta 0:00:59 lr 0.000886	time 0.8775 (0.8811)	loss 0.4877 (0.5182)	grad_norm 2.4732 (2.6378)	mem 20675MB
[2025-04-03 02:03:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][508/573]	eta 0:00:57 lr 0.000886	time 0.8774 (0.8811)	loss 0.5395 (0.5183)	grad_norm 3.5254 (2.6396)	mem 20675MB
[2025-04-03 02:03:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][510/573]	eta 0:00:55 lr 0.000886	time 0.8771 (0.8811)	loss 0.5157 (0.5183)	grad_norm 2.1888 (2.6379)	mem 20675MB
[2025-04-03 02:03:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][512/573]	eta 0:00:53 lr 0.000886	time 0.8772 (0.8811)	loss 0.5371 (0.5184)	grad_norm 2.7538 (2.6361)	mem 20675MB
[2025-04-03 02:03:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][514/573]	eta 0:00:51 lr 0.000885	time 0.8771 (0.8810)	loss 0.4324 (0.5183)	grad_norm 2.5599 (2.6355)	mem 20675MB
[2025-04-03 02:03:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][516/573]	eta 0:00:50 lr 0.000885	time 0.8774 (0.8810)	loss 0.5563 (0.5186)	grad_norm 1.5066 (2.6332)	mem 20675MB
[2025-04-03 02:03:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][518/573]	eta 0:00:48 lr 0.000885	time 0.8772 (0.8810)	loss 0.5186 (0.5187)	grad_norm 2.5762 (2.6322)	mem 20675MB
[2025-04-03 02:03:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][520/573]	eta 0:00:46 lr 0.000885	time 0.8776 (0.8810)	loss 0.5950 (0.5188)	grad_norm 1.4713 (2.6280)	mem 20675MB
[2025-04-03 02:03:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][522/573]	eta 0:00:44 lr 0.000885	time 0.8774 (0.8810)	loss 0.5959 (0.5190)	grad_norm 1.7974 (2.6268)	mem 20675MB
[2025-04-03 02:03:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][524/573]	eta 0:00:43 lr 0.000884	time 0.8772 (0.8810)	loss 0.4727 (0.5189)	grad_norm 3.2090 (2.6270)	mem 20675MB
[2025-04-03 02:03:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][526/573]	eta 0:00:41 lr 0.000884	time 0.8775 (0.8810)	loss 0.5451 (0.5189)	grad_norm 3.3500 (2.6284)	mem 20675MB
[2025-04-03 02:03:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][528/573]	eta 0:00:39 lr 0.000884	time 0.8771 (0.8810)	loss 0.4609 (0.5188)	grad_norm 2.4264 (2.6301)	mem 20675MB
[2025-04-03 02:03:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][530/573]	eta 0:00:37 lr 0.000884	time 0.8773 (0.8810)	loss 0.5733 (0.5189)	grad_norm 1.9102 (2.6295)	mem 20675MB
[2025-04-03 02:03:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][532/573]	eta 0:00:36 lr 0.000884	time 0.8773 (0.8810)	loss 0.6090 (0.5192)	grad_norm 3.1819 (2.6311)	mem 20675MB
[2025-04-03 02:03:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][534/573]	eta 0:00:34 lr 0.000883	time 0.8778 (0.8809)	loss 0.5429 (0.5194)	grad_norm 2.5735 (2.6298)	mem 20675MB
[2025-04-03 02:03:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][536/573]	eta 0:00:32 lr 0.000883	time 0.8771 (0.8809)	loss 0.4927 (0.5195)	grad_norm 2.7086 (2.6283)	mem 20675MB
[2025-04-03 02:03:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][538/573]	eta 0:00:30 lr 0.000883	time 0.8769 (0.8809)	loss 0.4964 (0.5195)	grad_norm 2.4583 (2.6271)	mem 20675MB
[2025-04-03 02:03:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][540/573]	eta 0:00:29 lr 0.000883	time 0.8775 (0.8809)	loss 0.5971 (0.5198)	grad_norm 2.1303 (2.6246)	mem 20675MB
[2025-04-03 02:03:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][542/573]	eta 0:00:27 lr 0.000883	time 0.8771 (0.8809)	loss 0.5831 (0.5197)	grad_norm 1.7570 (2.6221)	mem 20675MB
[2025-04-03 02:03:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][544/573]	eta 0:00:25 lr 0.000882	time 0.8775 (0.8809)	loss 0.4177 (0.5196)	grad_norm 2.6099 (2.6216)	mem 20675MB
[2025-04-03 02:03:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][546/573]	eta 0:00:23 lr 0.000882	time 0.8771 (0.8809)	loss 0.5232 (0.5195)	grad_norm 1.7096 (2.6214)	mem 20675MB
[2025-04-03 02:03:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][548/573]	eta 0:00:22 lr 0.000882	time 0.8773 (0.8809)	loss 0.3767 (0.5194)	grad_norm 4.1892 (2.6231)	mem 20675MB
[2025-04-03 02:03:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][550/573]	eta 0:00:20 lr 0.000882	time 0.8772 (0.8809)	loss 0.5854 (0.5196)	grad_norm 2.1564 (2.6233)	mem 20675MB
[2025-04-03 02:03:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][552/573]	eta 0:00:18 lr 0.000881	time 0.8774 (0.8809)	loss 0.6031 (0.5194)	grad_norm 2.0225 (2.6221)	mem 20675MB
[2025-04-03 02:03:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][554/573]	eta 0:00:16 lr 0.000881	time 0.8770 (0.8809)	loss 0.5364 (0.5196)	grad_norm 2.7411 (2.6216)	mem 20675MB
[2025-04-03 02:03:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][556/573]	eta 0:00:14 lr 0.000881	time 0.8787 (0.8808)	loss 0.5979 (0.5196)	grad_norm 2.0195 (2.6219)	mem 20675MB
[2025-04-03 02:03:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][558/573]	eta 0:00:13 lr 0.000881	time 0.8784 (0.8808)	loss 0.5316 (0.5195)	grad_norm 2.2611 (2.6201)	mem 20675MB
[2025-04-03 02:03:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][560/573]	eta 0:00:11 lr 0.000881	time 0.8771 (0.8808)	loss 0.5753 (0.5197)	grad_norm 1.8819 (2.6169)	mem 20675MB
[2025-04-03 02:03:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][562/573]	eta 0:00:09 lr 0.000880	time 0.8774 (0.8808)	loss 0.6371 (0.5199)	grad_norm 2.7321 (2.6166)	mem 20675MB
[2025-04-03 02:03:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][564/573]	eta 0:00:07 lr 0.000880	time 0.8770 (0.8808)	loss 0.5643 (0.5200)	grad_norm 2.3484 (2.6152)	mem 20675MB
[2025-04-03 02:03:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][566/573]	eta 0:00:06 lr 0.000880	time 0.8773 (0.8808)	loss 0.5599 (0.5201)	grad_norm 3.7454 (2.6162)	mem 20675MB
[2025-04-03 02:04:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][568/573]	eta 0:00:04 lr 0.000880	time 0.8772 (0.8808)	loss 0.5698 (0.5202)	grad_norm 3.3037 (2.6160)	mem 20675MB
[2025-04-03 02:04:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][570/573]	eta 0:00:02 lr 0.000880	time 0.8768 (0.8808)	loss 0.5265 (0.5201)	grad_norm 2.4219 (2.6152)	mem 20675MB
[2025-04-03 02:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][572/573]	eta 0:00:00 lr 0.000879	time 0.8772 (0.8808)	loss 0.5861 (0.5202)	grad_norm 3.0558 (2.6149)	mem 20675MB
[2025-04-03 02:04:04 simmim_finetune] (main_finetune.py 260): INFO EPOCH 10 training takes 0:08:24
[2025-04-03 02:04:04 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt10.pth saving......
[2025-04-03 02:04:08 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt10.pth saved !!!
[2025-04-03 02:04:10 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.924 (1.924)	Loss 0.3501 (0.3501)	Acc@1 89.062 (89.062)	Mem 20675MB
[2025-04-03 02:04:10 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.284 (0.831)	Loss 0.3063 (0.3248)	Acc@1 89.844 (89.844)	Mem 20675MB
[2025-04-03 02:04:11 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.612)	Loss 0.3667 (0.3335)	Acc@1 85.938 (88.281)	Mem 20675MB
[2025-04-03 02:04:11 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.284 (0.519)	Loss 0.3443 (0.3329)	Acc@1 85.156 (88.058)	Mem 20675MB
[2025-04-03 02:04:12 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.466)	Loss 0.8214 (0.4066)	Acc@1 50.000 (82.031)	Mem 20675MB
[2025-04-03 02:04:13 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.433)	Loss 0.7153 (0.4645)	Acc@1 59.375 (78.054)	Mem 20675MB
[2025-04-03 02:04:13 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.410)	Loss 0.7711 (0.5070)	Acc@1 52.344 (74.820)	Mem 20675MB
[2025-04-03 02:04:14 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.393)	Loss 0.6836 (0.5320)	Acc@1 63.281 (73.021)	Mem 20675MB
[2025-04-03 02:04:14 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 72.581
[2025-04-03 02:04:14 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 72.6%
[2025-04-03 02:04:14 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 75.55%
[2025-04-03 02:04:14 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.325144708678325e-06, 3.325144708678325e-06, 5.075687293656089e-06, 5.075687293656089e-06, 7.768829732083416e-06, 7.768829732083416e-06, 1.1912125791202384e-05, 1.1912125791202384e-05, 1.8286427420616177e-05, 1.8286427420616177e-05, 2.8093045312022015e-05, 2.8093045312022015e-05, 4.318014976033868e-05, 4.318014976033868e-05, 6.639107968082587e-05, 6.639107968082587e-05, 0.00010210020263542153, 0.00010210020263542153, 0.00015703731487326104, 0.00015703731487326104, 0.00024155594908532177, 0.00024155594908532177, 0.00037158461710387673, 0.00037158461710387673, 0.0005716287217478076, 0.0005716287217478076, 0.0008793888827384703, 0.0008793888827384703]
[2025-04-03 02:04:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][0/573]	eta 0:24:17 lr 0.000879	time 2.5431 (2.5431)	loss 0.5880 (0.5880)	grad_norm 1.9041 (1.9041)	mem 20675MB
[2025-04-03 02:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][2/573]	eta 0:13:38 lr 0.000879	time 0.8776 (1.4333)	loss 0.5759 (0.5832)	grad_norm 2.3831 (2.1690)	mem 20675MB
[2025-04-03 02:04:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][4/573]	eta 0:11:29 lr 0.000879	time 0.8774 (1.2114)	loss 0.5943 (0.6043)	grad_norm 1.7234 (2.4426)	mem 20675MB
[2025-04-03 02:04:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][6/573]	eta 0:10:32 lr 0.000879	time 0.8776 (1.1164)	loss 0.4828 (0.5842)	grad_norm 1.9539 (2.3494)	mem 20675MB
[2025-04-03 02:04:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][8/573]	eta 0:10:00 lr 0.000878	time 0.8774 (1.0635)	loss 0.4838 (0.5761)	grad_norm 4.2414 (2.6404)	mem 20675MB
[2025-04-03 02:04:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][10/573]	eta 0:09:39 lr 0.000878	time 0.8772 (1.0297)	loss 0.5520 (0.5689)	grad_norm 2.0554 (2.4970)	mem 20675MB
[2025-04-03 02:04:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][12/573]	eta 0:09:24 lr 0.000878	time 0.8773 (1.0064)	loss 0.4709 (0.5507)	grad_norm 2.1388 (2.4785)	mem 20675MB
[2025-04-03 02:04:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][14/573]	eta 0:09:13 lr 0.000878	time 0.8772 (0.9894)	loss 0.5538 (0.5526)	grad_norm 2.1842 (2.4053)	mem 20675MB
[2025-04-03 02:04:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][16/573]	eta 0:09:03 lr 0.000878	time 0.8774 (0.9763)	loss 0.4272 (0.5464)	grad_norm 2.4640 (2.4084)	mem 20675MB
[2025-04-03 02:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][18/573]	eta 0:08:56 lr 0.000877	time 0.8783 (0.9660)	loss 0.6055 (0.5429)	grad_norm 2.5736 (2.4003)	mem 20675MB
[2025-04-03 02:04:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][20/573]	eta 0:08:49 lr 0.000877	time 0.8774 (0.9577)	loss 0.5422 (0.5457)	grad_norm 2.9591 (2.4407)	mem 20675MB
[2025-04-03 02:04:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][22/573]	eta 0:08:43 lr 0.000877	time 0.8774 (0.9508)	loss 0.4292 (0.5421)	grad_norm 3.9196 (2.4985)	mem 20675MB
[2025-04-03 02:04:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][24/573]	eta 0:08:38 lr 0.000877	time 0.8775 (0.9450)	loss 0.4208 (0.5399)	grad_norm 2.2009 (2.4876)	mem 20675MB
[2025-04-03 02:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][26/573]	eta 0:08:34 lr 0.000877	time 0.8779 (0.9401)	loss 0.4096 (0.5330)	grad_norm 3.4045 (2.5452)	mem 20675MB
[2025-04-03 02:04:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][28/573]	eta 0:08:30 lr 0.000876	time 0.8773 (0.9359)	loss 0.4768 (0.5307)	grad_norm 2.1838 (2.5369)	mem 20675MB
[2025-04-03 02:04:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][30/573]	eta 0:08:26 lr 0.000876	time 0.8775 (0.9321)	loss 0.5542 (0.5293)	grad_norm 2.3862 (2.5343)	mem 20675MB
[2025-04-03 02:04:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][32/573]	eta 0:08:22 lr 0.000876	time 0.8776 (0.9289)	loss 0.5115 (0.5289)	grad_norm 2.0660 (2.5329)	mem 20675MB
[2025-04-03 02:04:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][34/573]	eta 0:08:19 lr 0.000876	time 0.8772 (0.9260)	loss 0.4672 (0.5280)	grad_norm 2.6572 (2.5339)	mem 20675MB
[2025-04-03 02:04:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][36/573]	eta 0:08:15 lr 0.000876	time 0.8772 (0.9234)	loss 0.4793 (0.5288)	grad_norm 2.8148 (2.5416)	mem 20675MB
[2025-04-03 02:04:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][38/573]	eta 0:08:12 lr 0.000875	time 0.8774 (0.9211)	loss 0.5586 (0.5258)	grad_norm 1.9673 (2.5403)	mem 20675MB
[2025-04-03 02:04:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][40/573]	eta 0:08:09 lr 0.000875	time 0.8778 (0.9191)	loss 0.4658 (0.5269)	grad_norm 3.2727 (2.5877)	mem 20675MB
[2025-04-03 02:04:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][42/573]	eta 0:08:07 lr 0.000875	time 0.8775 (0.9172)	loss 0.5688 (0.5288)	grad_norm 1.6479 (2.5616)	mem 20675MB
[2025-04-03 02:04:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][44/573]	eta 0:08:04 lr 0.000875	time 0.8771 (0.9154)	loss 0.5923 (0.5312)	grad_norm 2.3228 (2.5488)	mem 20675MB
[2025-04-03 02:04:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][46/573]	eta 0:08:01 lr 0.000874	time 0.8773 (0.9138)	loss 0.3185 (0.5271)	grad_norm 2.7713 (2.5386)	mem 20675MB
[2025-04-03 02:04:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][48/573]	eta 0:07:59 lr 0.000874	time 0.8773 (0.9124)	loss 0.3698 (0.5221)	grad_norm 2.3569 (2.5483)	mem 20675MB
[2025-04-03 02:05:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][50/573]	eta 0:07:56 lr 0.000874	time 0.8774 (0.9110)	loss 0.6395 (0.5238)	grad_norm 3.5512 (2.5708)	mem 20675MB
[2025-04-03 02:05:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][52/573]	eta 0:07:54 lr 0.000874	time 0.8771 (0.9098)	loss 0.4526 (0.5224)	grad_norm 3.2185 (2.5740)	mem 20675MB
[2025-04-03 02:05:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][54/573]	eta 0:07:51 lr 0.000874	time 0.8772 (0.9087)	loss 0.3766 (0.5205)	grad_norm 3.9122 (2.5972)	mem 20675MB
[2025-04-03 02:05:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][56/573]	eta 0:07:49 lr 0.000873	time 0.8772 (0.9076)	loss 0.5264 (0.5213)	grad_norm 2.0986 (2.6098)	mem 20675MB
[2025-04-03 02:05:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][58/573]	eta 0:07:46 lr 0.000873	time 0.8773 (0.9066)	loss 0.5179 (0.5220)	grad_norm 2.4680 (2.6079)	mem 20675MB
[2025-04-03 02:05:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][60/573]	eta 0:07:44 lr 0.000873	time 0.8772 (0.9057)	loss 0.4414 (0.5216)	grad_norm 3.2259 (2.6394)	mem 20675MB
[2025-04-03 02:05:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][62/573]	eta 0:07:42 lr 0.000873	time 0.8778 (0.9048)	loss 0.6059 (0.5229)	grad_norm 1.9890 (2.6375)	mem 20675MB
[2025-04-03 02:05:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][64/573]	eta 0:07:40 lr 0.000873	time 0.8771 (0.9040)	loss 0.6047 (0.5226)	grad_norm 2.5546 (2.6385)	mem 20675MB
[2025-04-03 02:05:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][66/573]	eta 0:07:37 lr 0.000872	time 0.8771 (0.9032)	loss 0.5127 (0.5217)	grad_norm 1.9542 (2.6376)	mem 20675MB
[2025-04-03 02:05:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][68/573]	eta 0:07:35 lr 0.000872	time 0.8774 (0.9025)	loss 0.6087 (0.5234)	grad_norm 2.6429 (2.6322)	mem 20675MB
[2025-04-03 02:05:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][70/573]	eta 0:07:33 lr 0.000872	time 0.8776 (0.9018)	loss 0.5690 (0.5232)	grad_norm 1.3509 (2.6101)	mem 20675MB
[2025-04-03 02:05:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][72/573]	eta 0:07:31 lr 0.000872	time 0.8771 (0.9012)	loss 0.5613 (0.5237)	grad_norm 1.8930 (2.6076)	mem 20675MB
[2025-04-03 02:05:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][74/573]	eta 0:07:29 lr 0.000872	time 0.8773 (0.9006)	loss 0.5449 (0.5246)	grad_norm 2.1413 (2.5947)	mem 20675MB
[2025-04-03 02:05:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][76/573]	eta 0:07:27 lr 0.000871	time 0.8771 (0.9000)	loss 0.4299 (0.5243)	grad_norm 3.4687 (2.5943)	mem 20675MB
[2025-04-03 02:05:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][78/573]	eta 0:07:25 lr 0.000871	time 0.8773 (0.8995)	loss 0.4743 (0.5252)	grad_norm 2.1214 (2.5864)	mem 20675MB
[2025-04-03 02:05:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][80/573]	eta 0:07:23 lr 0.000871	time 0.8770 (0.8989)	loss 0.5668 (0.5243)	grad_norm 2.1398 (2.5804)	mem 20675MB
[2025-04-03 02:05:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][82/573]	eta 0:07:21 lr 0.000871	time 0.8771 (0.8984)	loss 0.5628 (0.5249)	grad_norm 3.3849 (2.5932)	mem 20675MB
[2025-04-03 02:05:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][84/573]	eta 0:07:19 lr 0.000870	time 0.8772 (0.8979)	loss 0.3909 (0.5228)	grad_norm 3.5857 (2.6003)	mem 20675MB
[2025-04-03 02:05:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][86/573]	eta 0:07:17 lr 0.000870	time 0.8773 (0.8975)	loss 0.5717 (0.5223)	grad_norm 2.7400 (2.6070)	mem 20675MB
[2025-04-03 02:05:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][88/573]	eta 0:07:15 lr 0.000870	time 0.8788 (0.8971)	loss 0.5767 (0.5246)	grad_norm 1.9626 (2.5996)	mem 20675MB
[2025-04-03 02:05:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][90/573]	eta 0:07:13 lr 0.000870	time 0.8774 (0.8967)	loss 0.6028 (0.5253)	grad_norm 1.8719 (2.5945)	mem 20675MB
[2025-04-03 02:05:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][92/573]	eta 0:07:11 lr 0.000870	time 0.8770 (0.8962)	loss 0.4681 (0.5247)	grad_norm 2.4111 (2.5879)	mem 20675MB
[2025-04-03 02:05:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][94/573]	eta 0:07:09 lr 0.000869	time 0.8775 (0.8959)	loss 0.4795 (0.5243)	grad_norm 2.5898 (2.6069)	mem 20675MB
[2025-04-03 02:05:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][96/573]	eta 0:07:07 lr 0.000869	time 0.8775 (0.8956)	loss 0.5970 (0.5251)	grad_norm 2.1199 (2.6010)	mem 20675MB
[2025-04-03 02:05:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][98/573]	eta 0:07:05 lr 0.000869	time 0.8772 (0.8952)	loss 0.5272 (0.5236)	grad_norm 2.0109 (2.5951)	mem 20675MB
[2025-04-03 02:05:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][100/573]	eta 0:07:03 lr 0.000869	time 0.8775 (0.8949)	loss 0.5818 (0.5242)	grad_norm 2.1526 (2.5876)	mem 20675MB
[2025-04-03 02:05:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][102/573]	eta 0:07:01 lr 0.000869	time 0.8779 (0.8946)	loss 0.5958 (0.5247)	grad_norm 4.2939 (2.6053)	mem 20675MB
[2025-04-03 02:05:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][104/573]	eta 0:06:59 lr 0.000868	time 0.8771 (0.8942)	loss 0.5284 (0.5236)	grad_norm 2.7132 (2.6004)	mem 20675MB
[2025-04-03 02:05:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][106/573]	eta 0:06:57 lr 0.000868	time 0.8771 (0.8939)	loss 0.4495 (0.5222)	grad_norm 2.7200 (2.6057)	mem 20675MB
[2025-04-03 02:05:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][108/573]	eta 0:06:55 lr 0.000868	time 0.8774 (0.8936)	loss 0.5780 (0.5222)	grad_norm 2.4481 (2.6200)	mem 20675MB
[2025-04-03 02:05:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][110/573]	eta 0:06:53 lr 0.000868	time 0.8774 (0.8934)	loss 0.4944 (0.5224)	grad_norm 2.7656 (2.6203)	mem 20675MB
[2025-04-03 02:05:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][112/573]	eta 0:06:51 lr 0.000868	time 0.8772 (0.8931)	loss 0.6005 (0.5231)	grad_norm 2.2595 (2.6134)	mem 20675MB
[2025-04-03 02:05:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][114/573]	eta 0:06:49 lr 0.000867	time 0.8775 (0.8929)	loss 0.6405 (0.5237)	grad_norm 2.3168 (2.6169)	mem 20675MB
[2025-04-03 02:05:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][116/573]	eta 0:06:47 lr 0.000867	time 0.8773 (0.8926)	loss 0.5430 (0.5241)	grad_norm 1.4976 (2.6027)	mem 20675MB
[2025-04-03 02:06:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][118/573]	eta 0:06:46 lr 0.000867	time 0.8776 (0.8924)	loss 0.4138 (0.5237)	grad_norm 2.6903 (2.6006)	mem 20675MB
[2025-04-03 02:06:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][120/573]	eta 0:06:44 lr 0.000867	time 0.8775 (0.8921)	loss 0.4656 (0.5236)	grad_norm 2.2585 (2.5903)	mem 20675MB
[2025-04-03 02:06:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][122/573]	eta 0:06:42 lr 0.000866	time 0.8774 (0.8919)	loss 0.5150 (0.5235)	grad_norm 2.2289 (2.5849)	mem 20675MB
[2025-04-03 02:06:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][124/573]	eta 0:06:40 lr 0.000866	time 0.8774 (0.8917)	loss 0.5283 (0.5241)	grad_norm 2.6151 (2.5817)	mem 20675MB
[2025-04-03 02:06:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][126/573]	eta 0:06:38 lr 0.000866	time 0.8778 (0.8915)	loss 0.4562 (0.5230)	grad_norm 2.6238 (2.5839)	mem 20675MB
[2025-04-03 02:06:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][128/573]	eta 0:06:36 lr 0.000866	time 0.8771 (0.8913)	loss 0.4683 (0.5225)	grad_norm 2.4227 (2.5809)	mem 20675MB
[2025-04-03 02:06:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][130/573]	eta 0:06:34 lr 0.000866	time 0.8776 (0.8911)	loss 0.5441 (0.5221)	grad_norm 3.1831 (2.5833)	mem 20675MB
[2025-04-03 02:06:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][132/573]	eta 0:06:32 lr 0.000865	time 0.8773 (0.8909)	loss 0.6158 (0.5225)	grad_norm 2.8804 (2.5910)	mem 20675MB
[2025-04-03 02:06:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][134/573]	eta 0:06:31 lr 0.000865	time 0.8776 (0.8907)	loss 0.5298 (0.5226)	grad_norm 2.6474 (2.5921)	mem 20675MB
[2025-04-03 02:06:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][136/573]	eta 0:06:29 lr 0.000865	time 0.8775 (0.8905)	loss 0.5565 (0.5229)	grad_norm 2.3816 (2.5837)	mem 20675MB
[2025-04-03 02:06:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][138/573]	eta 0:06:27 lr 0.000865	time 0.8772 (0.8903)	loss 0.5457 (0.5232)	grad_norm 1.4744 (2.5759)	mem 20675MB
[2025-04-03 02:06:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][140/573]	eta 0:06:25 lr 0.000865	time 0.8774 (0.8902)	loss 0.4117 (0.5221)	grad_norm 2.4065 (2.5729)	mem 20675MB
[2025-04-03 02:06:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][142/573]	eta 0:06:23 lr 0.000864	time 0.8774 (0.8900)	loss 0.4063 (0.5213)	grad_norm 2.8674 (2.5868)	mem 20675MB
[2025-04-03 02:06:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][144/573]	eta 0:06:21 lr 0.000864	time 0.8772 (0.8898)	loss 0.5324 (0.5205)	grad_norm 1.7284 (2.5858)	mem 20675MB
[2025-04-03 02:06:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][146/573]	eta 0:06:19 lr 0.000864	time 0.8774 (0.8897)	loss 0.3995 (0.5198)	grad_norm 3.3707 (2.5902)	mem 20675MB
[2025-04-03 02:06:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][148/573]	eta 0:06:18 lr 0.000864	time 0.8773 (0.8895)	loss 0.5365 (0.5204)	grad_norm 2.3643 (2.5926)	mem 20675MB
[2025-04-03 02:06:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][150/573]	eta 0:06:16 lr 0.000864	time 0.8772 (0.8894)	loss 0.5002 (0.5202)	grad_norm 1.5238 (2.5825)	mem 20675MB
[2025-04-03 02:06:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][152/573]	eta 0:06:14 lr 0.000863	time 0.8773 (0.8892)	loss 0.3481 (0.5180)	grad_norm 2.9669 (2.5855)	mem 20675MB
[2025-04-03 02:06:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][154/573]	eta 0:06:12 lr 0.000863	time 0.8773 (0.8891)	loss 0.4048 (0.5174)	grad_norm 3.3093 (2.5844)	mem 20675MB
[2025-04-03 02:06:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][156/573]	eta 0:06:10 lr 0.000863	time 0.8775 (0.8890)	loss 0.5218 (0.5176)	grad_norm 2.5931 (2.5947)	mem 20675MB
[2025-04-03 02:06:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][158/573]	eta 0:06:08 lr 0.000863	time 0.8772 (0.8888)	loss 0.5566 (0.5171)	grad_norm 3.2405 (2.5950)	mem 20675MB
[2025-04-03 02:06:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][160/573]	eta 0:06:07 lr 0.000862	time 0.8777 (0.8887)	loss 0.4244 (0.5169)	grad_norm 4.6296 (2.6073)	mem 20675MB
[2025-04-03 02:06:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][162/573]	eta 0:06:05 lr 0.000862	time 0.8776 (0.8886)	loss 0.3465 (0.5161)	grad_norm 2.9406 (2.6200)	mem 20675MB
[2025-04-03 02:06:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][164/573]	eta 0:06:03 lr 0.000862	time 0.8775 (0.8884)	loss 0.5876 (0.5160)	grad_norm 2.3784 (2.6327)	mem 20675MB
[2025-04-03 02:06:42 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][166/573]	eta 0:06:01 lr 0.000862	time 0.8775 (0.8883)	loss 0.4346 (0.5160)	grad_norm 2.1576 (2.6323)	mem 20675MB
[2025-04-03 02:06:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][168/573]	eta 0:05:59 lr 0.000862	time 0.8771 (0.8882)	loss 0.4146 (0.5150)	grad_norm 3.3199 (2.6386)	mem 20675MB
[2025-04-03 02:06:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][170/573]	eta 0:05:57 lr 0.000861	time 0.8772 (0.8881)	loss 0.4653 (0.5145)	grad_norm 2.8176 (2.6411)	mem 20675MB
[2025-04-03 02:06:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][172/573]	eta 0:05:56 lr 0.000861	time 0.8772 (0.8880)	loss 0.5446 (0.5149)	grad_norm 1.5830 (2.6318)	mem 20675MB
[2025-04-03 02:06:49 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][174/573]	eta 0:05:54 lr 0.000861	time 0.8773 (0.8879)	loss 0.6180 (0.5146)	grad_norm 3.2573 (2.6333)	mem 20675MB
[2025-04-03 02:06:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][176/573]	eta 0:05:52 lr 0.000861	time 0.8772 (0.8878)	loss 0.4445 (0.5140)	grad_norm 3.4426 (2.6434)	mem 20675MB
[2025-04-03 02:06:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][178/573]	eta 0:05:50 lr 0.000861	time 0.8771 (0.8876)	loss 0.4330 (0.5135)	grad_norm 2.4004 (2.6431)	mem 20675MB
[2025-04-03 02:06:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][180/573]	eta 0:05:48 lr 0.000860	time 0.8771 (0.8875)	loss 0.5259 (0.5138)	grad_norm 2.1950 (2.6402)	mem 20675MB
[2025-04-03 02:06:56 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][182/573]	eta 0:05:46 lr 0.000860	time 0.8774 (0.8874)	loss 0.5502 (0.5136)	grad_norm 2.0522 (2.6340)	mem 20675MB
[2025-04-03 02:06:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][184/573]	eta 0:05:45 lr 0.000860	time 0.8773 (0.8873)	loss 0.5620 (0.5143)	grad_norm 3.3977 (2.6364)	mem 20675MB
[2025-04-03 02:07:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][186/573]	eta 0:05:43 lr 0.000860	time 0.8774 (0.8872)	loss 0.4221 (0.5136)	grad_norm 3.5533 (2.6467)	mem 20675MB
[2025-04-03 02:07:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][188/573]	eta 0:05:41 lr 0.000860	time 0.8773 (0.8872)	loss 0.5443 (0.5142)	grad_norm 2.7330 (2.6445)	mem 20675MB
[2025-04-03 02:07:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][190/573]	eta 0:05:39 lr 0.000859	time 0.8774 (0.8871)	loss 0.5104 (0.5143)	grad_norm 2.1487 (2.6489)	mem 20675MB
[2025-04-03 02:07:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][192/573]	eta 0:05:37 lr 0.000859	time 0.8774 (0.8870)	loss 0.5386 (0.5149)	grad_norm 2.2434 (2.6429)	mem 20675MB
[2025-04-03 02:07:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][194/573]	eta 0:05:36 lr 0.000859	time 0.8776 (0.8869)	loss 0.4117 (0.5139)	grad_norm 5.2729 (2.6612)	mem 20675MB
[2025-04-03 02:07:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][196/573]	eta 0:05:34 lr 0.000859	time 0.8776 (0.8868)	loss 0.3655 (0.5129)	grad_norm 4.1602 (2.6694)	mem 20675MB
[2025-04-03 02:07:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][198/573]	eta 0:05:32 lr 0.000858	time 0.8773 (0.8867)	loss 0.5755 (0.5137)	grad_norm 3.8053 (2.6755)	mem 20675MB
[2025-04-03 02:07:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][200/573]	eta 0:05:30 lr 0.000858	time 0.8774 (0.8866)	loss 0.5203 (0.5136)	grad_norm 2.5337 (2.6754)	mem 20675MB
[2025-04-03 02:07:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][202/573]	eta 0:05:28 lr 0.000858	time 0.8774 (0.8865)	loss 0.5889 (0.5142)	grad_norm 2.2479 (2.6765)	mem 20675MB
[2025-04-03 02:07:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][204/573]	eta 0:05:27 lr 0.000858	time 0.8777 (0.8865)	loss 0.5102 (0.5147)	grad_norm 2.5428 (2.6766)	mem 20675MB
[2025-04-03 02:07:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][206/573]	eta 0:05:25 lr 0.000858	time 0.8775 (0.8864)	loss 0.5591 (0.5151)	grad_norm 2.1903 (2.6745)	mem 20675MB
[2025-04-03 02:07:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][208/573]	eta 0:05:23 lr 0.000857	time 0.8771 (0.8863)	loss 0.4250 (0.5150)	grad_norm 4.3049 (2.6809)	mem 20675MB
[2025-04-03 02:07:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][210/573]	eta 0:05:21 lr 0.000857	time 0.8771 (0.8862)	loss 0.5407 (0.5156)	grad_norm 2.1407 (2.6756)	mem 20675MB
[2025-04-03 02:07:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][212/573]	eta 0:05:19 lr 0.000857	time 0.8772 (0.8862)	loss 0.5718 (0.5161)	grad_norm 1.9343 (2.6714)	mem 20675MB
[2025-04-03 02:07:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][214/573]	eta 0:05:18 lr 0.000857	time 0.8774 (0.8861)	loss 0.4380 (0.5159)	grad_norm 2.4865 (2.6662)	mem 20675MB
[2025-04-03 02:07:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][216/573]	eta 0:05:16 lr 0.000857	time 0.8775 (0.8860)	loss 0.5163 (0.5163)	grad_norm 1.9246 (2.6600)	mem 20675MB
[2025-04-03 02:07:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][218/573]	eta 0:05:14 lr 0.000856	time 0.8775 (0.8859)	loss 0.5916 (0.5163)	grad_norm 2.2511 (2.6566)	mem 20675MB
[2025-04-03 02:07:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][220/573]	eta 0:05:12 lr 0.000856	time 0.8775 (0.8859)	loss 0.5184 (0.5165)	grad_norm 3.5166 (2.6575)	mem 20675MB
[2025-04-03 02:07:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][222/573]	eta 0:05:10 lr 0.000856	time 0.8773 (0.8858)	loss 0.5404 (0.5164)	grad_norm 1.6075 (2.6540)	mem 20675MB
[2025-04-03 02:07:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][224/573]	eta 0:05:09 lr 0.000856	time 0.8772 (0.8857)	loss 0.5807 (0.5165)	grad_norm 2.4193 (2.6577)	mem 20675MB
[2025-04-03 02:07:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][226/573]	eta 0:05:07 lr 0.000855	time 0.8775 (0.8857)	loss 0.5917 (0.5165)	grad_norm 2.5125 (2.6603)	mem 20675MB
[2025-04-03 02:07:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][228/573]	eta 0:05:05 lr 0.000855	time 0.8773 (0.8856)	loss 0.4134 (0.5156)	grad_norm 3.4099 (2.6637)	mem 20675MB
[2025-04-03 02:07:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][230/573]	eta 0:05:03 lr 0.000855	time 0.8771 (0.8855)	loss 0.6366 (0.5165)	grad_norm 2.3182 (2.6623)	mem 20675MB
[2025-04-03 02:07:40 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][232/573]	eta 0:05:01 lr 0.000855	time 0.8774 (0.8855)	loss 0.5503 (0.5169)	grad_norm 1.6708 (2.6553)	mem 20675MB
[2025-04-03 02:07:42 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][234/573]	eta 0:05:00 lr 0.000855	time 0.8771 (0.8854)	loss 0.5576 (0.5173)	grad_norm 4.0340 (2.6589)	mem 20675MB
[2025-04-03 02:07:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][236/573]	eta 0:04:58 lr 0.000854	time 0.8774 (0.8854)	loss 0.5724 (0.5175)	grad_norm 2.1160 (2.6570)	mem 20675MB
[2025-04-03 02:07:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][238/573]	eta 0:04:56 lr 0.000854	time 0.8770 (0.8853)	loss 0.5339 (0.5180)	grad_norm 2.5014 (2.6547)	mem 20675MB
[2025-04-03 02:07:47 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][240/573]	eta 0:04:54 lr 0.000854	time 0.8774 (0.8852)	loss 0.5170 (0.5181)	grad_norm 1.5547 (2.6484)	mem 20675MB
[2025-04-03 02:07:49 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][242/573]	eta 0:04:52 lr 0.000854	time 0.8772 (0.8852)	loss 0.5404 (0.5179)	grad_norm 2.4006 (2.6468)	mem 20675MB
[2025-04-03 02:07:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][244/573]	eta 0:04:51 lr 0.000854	time 0.8774 (0.8851)	loss 0.5597 (0.5179)	grad_norm 2.1445 (2.6448)	mem 20675MB
[2025-04-03 02:07:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][246/573]	eta 0:04:49 lr 0.000853	time 0.8773 (0.8851)	loss 0.5740 (0.5183)	grad_norm 2.5106 (2.6446)	mem 20675MB
[2025-04-03 02:07:54 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][248/573]	eta 0:04:47 lr 0.000853	time 0.8776 (0.8850)	loss 0.5882 (0.5188)	grad_norm 2.1589 (2.6411)	mem 20675MB
[2025-04-03 02:07:56 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][250/573]	eta 0:04:45 lr 0.000853	time 0.8774 (0.8850)	loss 0.5472 (0.5186)	grad_norm 2.0652 (2.6391)	mem 20675MB
[2025-04-03 02:07:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][252/573]	eta 0:04:44 lr 0.000853	time 0.8772 (0.8849)	loss 0.4219 (0.5178)	grad_norm 2.1667 (2.6359)	mem 20675MB
[2025-04-03 02:08:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][254/573]	eta 0:04:42 lr 0.000853	time 0.8770 (0.8848)	loss 0.4581 (0.5175)	grad_norm 2.0505 (2.6325)	mem 20675MB
[2025-04-03 02:08:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][256/573]	eta 0:04:40 lr 0.000852	time 0.8773 (0.8848)	loss 0.3815 (0.5168)	grad_norm 3.1956 (2.6383)	mem 20675MB
[2025-04-03 02:08:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][258/573]	eta 0:04:38 lr 0.000852	time 0.8775 (0.8848)	loss 0.5692 (0.5173)	grad_norm 2.3442 (2.6392)	mem 20675MB
[2025-04-03 02:08:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][260/573]	eta 0:04:36 lr 0.000852	time 0.8774 (0.8847)	loss 0.5623 (0.5172)	grad_norm 2.7518 (2.6407)	mem 20675MB
[2025-04-03 02:08:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][262/573]	eta 0:04:35 lr 0.000852	time 0.8774 (0.8847)	loss 0.5586 (0.5177)	grad_norm 3.1360 (2.6436)	mem 20675MB
[2025-04-03 02:08:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][264/573]	eta 0:04:33 lr 0.000851	time 0.8775 (0.8846)	loss 0.5869 (0.5183)	grad_norm 1.8733 (2.6387)	mem 20675MB
[2025-04-03 02:08:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][266/573]	eta 0:04:31 lr 0.000851	time 0.8775 (0.8846)	loss 0.4299 (0.5180)	grad_norm 4.3217 (2.6488)	mem 20675MB
[2025-04-03 02:08:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][268/573]	eta 0:04:29 lr 0.000851	time 0.8777 (0.8845)	loss 0.5864 (0.5183)	grad_norm 2.3866 (2.6483)	mem 20675MB
[2025-04-03 02:08:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][270/573]	eta 0:04:27 lr 0.000851	time 0.8776 (0.8845)	loss 0.6235 (0.5188)	grad_norm 1.8099 (2.6418)	mem 20675MB
[2025-04-03 02:08:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][272/573]	eta 0:04:26 lr 0.000851	time 0.8773 (0.8844)	loss 0.4206 (0.5181)	grad_norm 2.6345 (2.6477)	mem 20675MB
[2025-04-03 02:08:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][274/573]	eta 0:04:24 lr 0.000850	time 0.8771 (0.8844)	loss 0.4355 (0.5178)	grad_norm 3.5069 (2.6516)	mem 20675MB
[2025-04-03 02:08:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][276/573]	eta 0:04:22 lr 0.000850	time 0.8779 (0.8843)	loss 0.4537 (0.5179)	grad_norm 3.8156 (2.6539)	mem 20675MB
[2025-04-03 02:08:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][278/573]	eta 0:04:20 lr 0.000850	time 0.8775 (0.8843)	loss 0.4486 (0.5173)	grad_norm 2.4395 (2.6538)	mem 20675MB
[2025-04-03 02:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][280/573]	eta 0:04:19 lr 0.000850	time 0.8774 (0.8843)	loss 0.5774 (0.5175)	grad_norm 1.9287 (2.6498)	mem 20675MB
[2025-04-03 02:08:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][282/573]	eta 0:04:17 lr 0.000850	time 0.8774 (0.8842)	loss 0.5326 (0.5178)	grad_norm 2.2187 (2.6467)	mem 20675MB
[2025-04-03 02:08:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][284/573]	eta 0:04:15 lr 0.000849	time 0.8780 (0.8842)	loss 0.5903 (0.5179)	grad_norm 3.1684 (2.6481)	mem 20675MB
[2025-04-03 02:08:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][286/573]	eta 0:04:13 lr 0.000849	time 0.8774 (0.8841)	loss 0.4600 (0.5177)	grad_norm 2.4762 (2.6460)	mem 20675MB
[2025-04-03 02:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][288/573]	eta 0:04:11 lr 0.000849	time 0.8775 (0.8841)	loss 0.4975 (0.5177)	grad_norm 2.8666 (2.6454)	mem 20675MB
[2025-04-03 02:08:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][290/573]	eta 0:04:10 lr 0.000849	time 0.8772 (0.8841)	loss 0.5760 (0.5180)	grad_norm 3.3281 (2.6480)	mem 20675MB
[2025-04-03 02:08:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][292/573]	eta 0:04:08 lr 0.000848	time 0.8774 (0.8840)	loss 0.5751 (0.5185)	grad_norm 2.0155 (2.6442)	mem 20675MB
[2025-04-03 02:08:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][294/573]	eta 0:04:06 lr 0.000848	time 0.8772 (0.8840)	loss 0.5510 (0.5183)	grad_norm 1.9789 (2.6418)	mem 20675MB
[2025-04-03 02:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][296/573]	eta 0:04:04 lr 0.000848	time 0.8774 (0.8839)	loss 0.5561 (0.5184)	grad_norm 1.9437 (2.6359)	mem 20675MB
[2025-04-03 02:08:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][298/573]	eta 0:04:03 lr 0.000848	time 0.8784 (0.8839)	loss 0.5634 (0.5182)	grad_norm 2.1852 (2.6368)	mem 20675MB
[2025-04-03 02:08:40 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][300/573]	eta 0:04:01 lr 0.000848	time 0.8774 (0.8839)	loss 0.5035 (0.5177)	grad_norm 2.4749 (2.6364)	mem 20675MB
[2025-04-03 02:08:42 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][302/573]	eta 0:03:59 lr 0.000847	time 0.8773 (0.8838)	loss 0.3955 (0.5172)	grad_norm 2.6637 (2.6375)	mem 20675MB
[2025-04-03 02:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][304/573]	eta 0:03:57 lr 0.000847	time 0.8776 (0.8838)	loss 0.4788 (0.5171)	grad_norm 2.3435 (2.6357)	mem 20675MB
[2025-04-03 02:08:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][306/573]	eta 0:03:55 lr 0.000847	time 0.8770 (0.8838)	loss 0.5370 (0.5168)	grad_norm 2.5029 (2.6386)	mem 20675MB
[2025-04-03 02:08:47 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][308/573]	eta 0:03:54 lr 0.000847	time 0.8771 (0.8837)	loss 0.5626 (0.5170)	grad_norm 2.5763 (2.6412)	mem 20675MB
[2025-04-03 02:08:49 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][310/573]	eta 0:03:52 lr 0.000847	time 0.8772 (0.8837)	loss 0.5244 (0.5168)	grad_norm 2.7420 (2.6406)	mem 20675MB
[2025-04-03 02:08:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][312/573]	eta 0:03:50 lr 0.000846	time 0.8775 (0.8836)	loss 0.5561 (0.5171)	grad_norm 3.0153 (2.6443)	mem 20675MB
[2025-04-03 02:08:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][314/573]	eta 0:03:48 lr 0.000846	time 0.8773 (0.8836)	loss 0.5317 (0.5171)	grad_norm 2.0007 (2.6413)	mem 20675MB
[2025-04-03 02:08:54 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][316/573]	eta 0:03:47 lr 0.000846	time 0.8772 (0.8836)	loss 0.5810 (0.5172)	grad_norm 3.6529 (2.6472)	mem 20675MB
[2025-04-03 02:08:56 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][318/573]	eta 0:03:45 lr 0.000846	time 0.8771 (0.8835)	loss 0.5843 (0.5177)	grad_norm 3.6363 (2.6476)	mem 20675MB
[2025-04-03 02:08:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][320/573]	eta 0:03:43 lr 0.000845	time 0.8774 (0.8835)	loss 0.3987 (0.5174)	grad_norm 3.4270 (2.6488)	mem 20675MB
[2025-04-03 02:08:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][322/573]	eta 0:03:41 lr 0.000845	time 0.8771 (0.8835)	loss 0.5228 (0.5175)	grad_norm 2.3918 (2.6472)	mem 20675MB
[2025-04-03 02:09:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][324/573]	eta 0:03:39 lr 0.000845	time 0.8773 (0.8834)	loss 0.6121 (0.5177)	grad_norm 3.2171 (2.6470)	mem 20675MB
[2025-04-03 02:09:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][326/573]	eta 0:03:38 lr 0.000845	time 0.8775 (0.8834)	loss 0.3746 (0.5169)	grad_norm 2.7864 (2.6559)	mem 20675MB
[2025-04-03 02:09:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][328/573]	eta 0:03:36 lr 0.000845	time 0.8772 (0.8834)	loss 0.5848 (0.5168)	grad_norm 3.1897 (2.6620)	mem 20675MB
[2025-04-03 02:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][330/573]	eta 0:03:34 lr 0.000844	time 0.8772 (0.8834)	loss 0.5605 (0.5168)	grad_norm 2.4261 (2.6625)	mem 20675MB
[2025-04-03 02:09:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][332/573]	eta 0:03:32 lr 0.000844	time 0.8772 (0.8833)	loss 0.5581 (0.5170)	grad_norm 2.0334 (2.6588)	mem 20675MB
[2025-04-03 02:09:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][334/573]	eta 0:03:31 lr 0.000844	time 0.8771 (0.8833)	loss 0.5148 (0.5168)	grad_norm 2.5059 (2.6597)	mem 20675MB
[2025-04-03 02:09:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][336/573]	eta 0:03:29 lr 0.000844	time 0.8771 (0.8833)	loss 0.4916 (0.5169)	grad_norm 2.2197 (2.6587)	mem 20675MB
[2025-04-03 02:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][338/573]	eta 0:03:27 lr 0.000844	time 0.8783 (0.8832)	loss 0.4999 (0.5165)	grad_norm 2.0684 (2.6599)	mem 20675MB
[2025-04-03 02:09:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][340/573]	eta 0:03:25 lr 0.000843	time 0.8775 (0.8832)	loss 0.5951 (0.5169)	grad_norm 2.2043 (2.6581)	mem 20675MB
[2025-04-03 02:09:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][342/573]	eta 0:03:24 lr 0.000843	time 0.8770 (0.8832)	loss 0.5168 (0.5171)	grad_norm 2.8283 (2.6576)	mem 20675MB
[2025-04-03 02:09:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][344/573]	eta 0:03:22 lr 0.000843	time 0.8771 (0.8831)	loss 0.4872 (0.5169)	grad_norm 2.7239 (2.6590)	mem 20675MB
[2025-04-03 02:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][346/573]	eta 0:03:20 lr 0.000843	time 0.8774 (0.8831)	loss 0.5373 (0.5167)	grad_norm 2.6227 (2.6617)	mem 20675MB
[2025-04-03 02:09:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][348/573]	eta 0:03:18 lr 0.000842	time 0.8774 (0.8831)	loss 0.5938 (0.5167)	grad_norm 2.1658 (2.6613)	mem 20675MB
[2025-04-03 02:09:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][350/573]	eta 0:03:16 lr 0.000842	time 0.8773 (0.8831)	loss 0.5096 (0.5165)	grad_norm 2.8422 (2.6614)	mem 20675MB
[2025-04-03 02:09:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][352/573]	eta 0:03:15 lr 0.000842	time 0.8770 (0.8830)	loss 0.4335 (0.5159)	grad_norm 2.3306 (2.6672)	mem 20675MB
[2025-04-03 02:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][354/573]	eta 0:03:13 lr 0.000842	time 0.8772 (0.8830)	loss 0.5821 (0.5162)	grad_norm 2.5613 (2.6653)	mem 20675MB
[2025-04-03 02:09:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][356/573]	eta 0:03:11 lr 0.000842	time 0.8773 (0.8830)	loss 0.4509 (0.5162)	grad_norm 3.5446 (2.6683)	mem 20675MB
[2025-04-03 02:09:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][358/573]	eta 0:03:09 lr 0.000841	time 0.8773 (0.8830)	loss 0.5117 (0.5163)	grad_norm 1.9135 (2.6663)	mem 20675MB
[2025-04-03 02:09:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][360/573]	eta 0:03:08 lr 0.000841	time 0.8771 (0.8829)	loss 0.3589 (0.5154)	grad_norm 2.6343 (2.6675)	mem 20675MB
[2025-04-03 02:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][362/573]	eta 0:03:06 lr 0.000841	time 0.8774 (0.8829)	loss 0.5390 (0.5154)	grad_norm 2.1180 (2.6667)	mem 20675MB
[2025-04-03 02:09:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][364/573]	eta 0:03:04 lr 0.000841	time 0.8774 (0.8829)	loss 0.5564 (0.5155)	grad_norm 1.6969 (2.6659)	mem 20675MB
[2025-04-03 02:09:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][366/573]	eta 0:03:02 lr 0.000841	time 0.8774 (0.8828)	loss 0.5401 (0.5153)	grad_norm 1.5879 (2.6649)	mem 20675MB
[2025-04-03 02:09:40 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][368/573]	eta 0:03:00 lr 0.000840	time 0.8775 (0.8828)	loss 0.4782 (0.5149)	grad_norm 2.3606 (2.6655)	mem 20675MB
[2025-04-03 02:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][370/573]	eta 0:02:59 lr 0.000840	time 0.8773 (0.8828)	loss 0.5007 (0.5149)	grad_norm 1.8527 (2.6606)	mem 20675MB
[2025-04-03 02:09:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][372/573]	eta 0:02:57 lr 0.000840	time 0.8772 (0.8828)	loss 0.4854 (0.5146)	grad_norm 2.1844 (2.6610)	mem 20675MB
[2025-04-03 02:09:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][374/573]	eta 0:02:55 lr 0.000840	time 0.8770 (0.8827)	loss 0.4074 (0.5144)	grad_norm 2.7083 (2.6610)	mem 20675MB
[2025-04-03 02:09:47 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][376/573]	eta 0:02:53 lr 0.000839	time 0.8772 (0.8827)	loss 0.4157 (0.5139)	grad_norm 2.7394 (2.6646)	mem 20675MB
[2025-04-03 02:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][378/573]	eta 0:02:52 lr 0.000839	time 0.8770 (0.8827)	loss 0.5257 (0.5139)	grad_norm 2.5001 (2.6628)	mem 20675MB
[2025-04-03 02:09:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][380/573]	eta 0:02:50 lr 0.000839	time 0.8769 (0.8827)	loss 0.4748 (0.5141)	grad_norm 2.6784 (2.6644)	mem 20675MB
[2025-04-03 02:09:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][382/573]	eta 0:02:48 lr 0.000839	time 0.8771 (0.8827)	loss 0.4420 (0.5137)	grad_norm 2.6846 (2.6669)	mem 20675MB
[2025-04-03 02:09:54 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][384/573]	eta 0:02:46 lr 0.000839	time 0.8774 (0.8826)	loss 0.5671 (0.5139)	grad_norm 2.4985 (2.6666)	mem 20675MB
[2025-04-03 02:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][386/573]	eta 0:02:45 lr 0.000838	time 0.8780 (0.8826)	loss 0.4093 (0.5134)	grad_norm 2.4793 (2.6712)	mem 20675MB
[2025-04-03 02:09:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][388/573]	eta 0:02:43 lr 0.000838	time 0.8774 (0.8826)	loss 0.5730 (0.5132)	grad_norm 2.0792 (2.6708)	mem 20675MB
[2025-04-03 02:09:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][390/573]	eta 0:02:41 lr 0.000838	time 0.8772 (0.8826)	loss 0.5098 (0.5132)	grad_norm 5.6724 (2.6768)	mem 20675MB
[2025-04-03 02:10:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][392/573]	eta 0:02:39 lr 0.000838	time 0.8774 (0.8825)	loss 0.5455 (0.5134)	grad_norm 3.4849 (2.6785)	mem 20675MB
[2025-04-03 02:10:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][394/573]	eta 0:02:37 lr 0.000838	time 0.8775 (0.8825)	loss 0.5564 (0.5138)	grad_norm 1.8564 (2.6778)	mem 20675MB
[2025-04-03 02:10:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][396/573]	eta 0:02:36 lr 0.000837	time 0.8771 (0.8825)	loss 0.5357 (0.5140)	grad_norm 1.9592 (2.6746)	mem 20675MB
[2025-04-03 02:10:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][398/573]	eta 0:02:34 lr 0.000837	time 0.8775 (0.8825)	loss 0.5107 (0.5137)	grad_norm 2.3021 (2.6747)	mem 20675MB
[2025-04-03 02:10:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][400/573]	eta 0:02:32 lr 0.000837	time 0.8772 (0.8825)	loss 0.4849 (0.5137)	grad_norm 2.5450 (2.6738)	mem 20675MB
[2025-04-03 02:10:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][402/573]	eta 0:02:30 lr 0.000837	time 0.8771 (0.8824)	loss 0.6133 (0.5141)	grad_norm 1.7721 (2.6729)	mem 20675MB
[2025-04-03 02:10:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][404/573]	eta 0:02:29 lr 0.000836	time 0.8770 (0.8824)	loss 0.5471 (0.5141)	grad_norm 2.5630 (2.6712)	mem 20675MB
[2025-04-03 02:10:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][406/573]	eta 0:02:27 lr 0.000836	time 0.8774 (0.8824)	loss 0.5383 (0.5142)	grad_norm 2.7034 (2.6712)	mem 20675MB
[2025-04-03 02:10:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][408/573]	eta 0:02:25 lr 0.000836	time 0.8774 (0.8824)	loss 0.4354 (0.5141)	grad_norm 3.1044 (2.6754)	mem 20675MB
[2025-04-03 02:10:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][410/573]	eta 0:02:23 lr 0.000836	time 0.8785 (0.8824)	loss 0.5364 (0.5142)	grad_norm 2.7150 (2.6789)	mem 20675MB
[2025-04-03 02:10:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][412/573]	eta 0:02:22 lr 0.000836	time 0.8773 (0.8823)	loss 0.5274 (0.5143)	grad_norm 1.8738 (2.6787)	mem 20675MB
[2025-04-03 02:10:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][414/573]	eta 0:02:20 lr 0.000835	time 0.8771 (0.8823)	loss 0.4666 (0.5143)	grad_norm 2.4350 (2.6778)	mem 20675MB
[2025-04-03 02:10:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][416/573]	eta 0:02:18 lr 0.000835	time 0.8769 (0.8823)	loss 0.5797 (0.5147)	grad_norm 2.2288 (2.6765)	mem 20675MB
[2025-04-03 02:10:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][418/573]	eta 0:02:16 lr 0.000835	time 0.8770 (0.8823)	loss 0.4954 (0.5148)	grad_norm 2.6606 (2.6749)	mem 20675MB
[2025-04-03 02:10:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][420/573]	eta 0:02:14 lr 0.000835	time 0.8775 (0.8823)	loss 0.5071 (0.5148)	grad_norm 1.8457 (2.6739)	mem 20675MB
[2025-04-03 02:10:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][422/573]	eta 0:02:13 lr 0.000835	time 0.8768 (0.8822)	loss 0.4010 (0.5149)	grad_norm 2.6916 (2.6736)	mem 20675MB
[2025-04-03 02:10:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][424/573]	eta 0:02:11 lr 0.000834	time 0.8771 (0.8822)	loss 0.5566 (0.5148)	grad_norm 1.3416 (2.6699)	mem 20675MB
[2025-04-03 02:10:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][426/573]	eta 0:02:09 lr 0.000834	time 0.8781 (0.8822)	loss 0.4953 (0.5150)	grad_norm 2.2122 (2.6688)	mem 20675MB
[2025-04-03 02:10:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][428/573]	eta 0:02:07 lr 0.000834	time 0.8770 (0.8822)	loss 0.5741 (0.5152)	grad_norm 3.5456 (2.6706)	mem 20675MB
[2025-04-03 02:10:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][430/573]	eta 0:02:06 lr 0.000834	time 0.8771 (0.8822)	loss 0.5248 (0.5151)	grad_norm 2.7128 (2.6703)	mem 20675MB
[2025-04-03 02:10:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][432/573]	eta 0:02:04 lr 0.000833	time 0.8778 (0.8821)	loss 0.3832 (0.5149)	grad_norm 3.7403 (2.6717)	mem 20675MB
[2025-04-03 02:10:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][434/573]	eta 0:02:02 lr 0.000833	time 0.8776 (0.8821)	loss 0.5422 (0.5152)	grad_norm 2.6972 (2.6708)	mem 20675MB
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][436/573]	eta 0:02:00 lr 0.000833	time 0.8773 (0.8821)	loss 0.4458 (0.5150)	grad_norm 2.4395 (2.6697)	mem 20675MB
[2025-04-03 02:10:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][438/573]	eta 0:01:59 lr 0.000833	time 0.8789 (0.8821)	loss 0.4114 (0.5148)	grad_norm 5.0134 (2.6737)	mem 20675MB
[2025-04-03 02:10:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][440/573]	eta 0:01:57 lr 0.000833	time 0.8772 (0.8821)	loss 0.3806 (0.5144)	grad_norm 2.9744 (2.6778)	mem 20675MB
[2025-04-03 02:10:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][442/573]	eta 0:01:55 lr 0.000832	time 0.8772 (0.8821)	loss 0.3601 (0.5140)	grad_norm 2.8141 (2.6777)	mem 20675MB
[2025-04-03 02:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][444/573]	eta 0:01:53 lr 0.000832	time 0.8768 (0.8820)	loss 0.3423 (0.5135)	grad_norm 3.3977 (2.6815)	mem 20675MB
[2025-04-03 02:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][446/573]	eta 0:01:52 lr 0.000832	time 0.8771 (0.8820)	loss 0.6059 (0.5135)	grad_norm 3.1252 (2.6835)	mem 20675MB
[2025-04-03 02:10:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][448/573]	eta 0:01:50 lr 0.000832	time 0.8769 (0.8820)	loss 0.4704 (0.5136)	grad_norm 2.7521 (2.6845)	mem 20675MB
[2025-04-03 02:10:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][450/573]	eta 0:01:48 lr 0.000832	time 0.8769 (0.8820)	loss 0.4713 (0.5137)	grad_norm 4.3275 (2.6877)	mem 20675MB
[2025-04-03 02:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][452/573]	eta 0:01:46 lr 0.000831	time 0.8782 (0.8820)	loss 0.5990 (0.5139)	grad_norm 2.8155 (2.6943)	mem 20675MB
[2025-04-03 02:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][454/573]	eta 0:01:44 lr 0.000831	time 0.8771 (0.8820)	loss 0.4507 (0.5135)	grad_norm 3.0792 (2.6970)	mem 20675MB
[2025-04-03 02:10:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][456/573]	eta 0:01:43 lr 0.000831	time 0.8775 (0.8820)	loss 0.5298 (0.5133)	grad_norm 2.1148 (2.6966)	mem 20675MB
[2025-04-03 02:10:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][458/573]	eta 0:01:41 lr 0.000831	time 0.8769 (0.8819)	loss 0.5527 (0.5136)	grad_norm 4.1434 (2.6996)	mem 20675MB
[2025-04-03 02:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][460/573]	eta 0:01:39 lr 0.000830	time 0.8768 (0.8819)	loss 0.4238 (0.5136)	grad_norm 2.0457 (2.6965)	mem 20675MB
[2025-04-03 02:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][462/573]	eta 0:01:37 lr 0.000830	time 0.8773 (0.8819)	loss 0.4669 (0.5133)	grad_norm 2.5212 (2.6959)	mem 20675MB
[2025-04-03 02:11:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][464/573]	eta 0:01:36 lr 0.000830	time 0.8774 (0.8819)	loss 0.3458 (0.5131)	grad_norm 3.3412 (2.6958)	mem 20675MB
[2025-04-03 02:11:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][466/573]	eta 0:01:34 lr 0.000830	time 0.8772 (0.8819)	loss 0.5644 (0.5130)	grad_norm 2.4585 (2.6952)	mem 20675MB
[2025-04-03 02:11:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][468/573]	eta 0:01:32 lr 0.000830	time 0.8774 (0.8819)	loss 0.5677 (0.5129)	grad_norm 3.5708 (2.7065)	mem 20675MB
[2025-04-03 02:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][470/573]	eta 0:01:30 lr 0.000829	time 0.8775 (0.8818)	loss 0.6407 (0.5132)	grad_norm 3.1921 (2.7075)	mem 20675MB
[2025-04-03 02:11:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][472/573]	eta 0:01:29 lr 0.000829	time 0.8775 (0.8818)	loss 0.5381 (0.5134)	grad_norm 2.5304 (2.7064)	mem 20675MB
[2025-04-03 02:11:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][474/573]	eta 0:01:27 lr 0.000829	time 0.8772 (0.8818)	loss 0.4741 (0.5134)	grad_norm 4.1622 (2.7109)	mem 20675MB
[2025-04-03 02:11:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][476/573]	eta 0:01:25 lr 0.000829	time 0.8770 (0.8818)	loss 0.5796 (0.5134)	grad_norm 2.8264 (2.7119)	mem 20675MB
[2025-04-03 02:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][478/573]	eta 0:01:23 lr 0.000829	time 0.8773 (0.8818)	loss 0.5137 (0.5136)	grad_norm 2.4684 (2.7102)	mem 20675MB
[2025-04-03 02:11:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][480/573]	eta 0:01:22 lr 0.000828	time 0.8773 (0.8818)	loss 0.5571 (0.5134)	grad_norm 2.3056 (2.7100)	mem 20675MB
[2025-04-03 02:11:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][482/573]	eta 0:01:20 lr 0.000828	time 0.8784 (0.8818)	loss 0.5054 (0.5137)	grad_norm 2.2913 (2.7103)	mem 20675MB
[2025-04-03 02:11:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][484/573]	eta 0:01:18 lr 0.000828	time 0.8771 (0.8817)	loss 0.5541 (0.5137)	grad_norm 2.3691 (2.7082)	mem 20675MB
[2025-04-03 02:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][486/573]	eta 0:01:16 lr 0.000828	time 0.8770 (0.8817)	loss 0.5350 (0.5139)	grad_norm 2.9443 (2.7071)	mem 20675MB
[2025-04-03 02:11:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][488/573]	eta 0:01:14 lr 0.000827	time 0.8773 (0.8817)	loss 0.6165 (0.5143)	grad_norm 2.6845 (2.7082)	mem 20675MB
[2025-04-03 02:11:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][490/573]	eta 0:01:13 lr 0.000827	time 0.8771 (0.8817)	loss 0.5469 (0.5145)	grad_norm 1.9403 (2.7053)	mem 20675MB
[2025-04-03 02:11:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][492/573]	eta 0:01:11 lr 0.000827	time 0.8771 (0.8817)	loss 0.6075 (0.5146)	grad_norm 1.8793 (2.7023)	mem 20675MB
[2025-04-03 02:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][494/573]	eta 0:01:09 lr 0.000827	time 0.8773 (0.8817)	loss 0.3986 (0.5144)	grad_norm 3.0915 (2.7018)	mem 20675MB
[2025-04-03 02:11:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][496/573]	eta 0:01:07 lr 0.000827	time 0.8769 (0.8817)	loss 0.5325 (0.5142)	grad_norm 2.4130 (2.7008)	mem 20675MB
[2025-04-03 02:11:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][498/573]	eta 0:01:06 lr 0.000826	time 0.8771 (0.8816)	loss 0.5787 (0.5144)	grad_norm 2.1730 (2.6982)	mem 20675MB
[2025-04-03 02:11:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][500/573]	eta 0:01:04 lr 0.000826	time 0.8768 (0.8816)	loss 0.3661 (0.5143)	grad_norm 4.1662 (2.6998)	mem 20675MB
[2025-04-03 02:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][502/573]	eta 0:01:02 lr 0.000826	time 0.8771 (0.8816)	loss 0.3759 (0.5141)	grad_norm 3.0996 (2.7011)	mem 20675MB
[2025-04-03 02:11:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][504/573]	eta 0:01:00 lr 0.000826	time 0.8772 (0.8816)	loss 0.6217 (0.5143)	grad_norm 2.8904 (2.7010)	mem 20675MB
[2025-04-03 02:11:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][506/573]	eta 0:00:59 lr 0.000825	time 0.8772 (0.8816)	loss 0.4570 (0.5143)	grad_norm 2.0058 (2.7018)	mem 20675MB
[2025-04-03 02:11:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][508/573]	eta 0:00:57 lr 0.000825	time 0.8775 (0.8816)	loss 0.5093 (0.5145)	grad_norm 2.4285 (2.7007)	mem 20675MB
[2025-04-03 02:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][510/573]	eta 0:00:55 lr 0.000825	time 0.8773 (0.8816)	loss 0.6081 (0.5145)	grad_norm 2.8408 (2.7023)	mem 20675MB
[2025-04-03 02:11:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][512/573]	eta 0:00:53 lr 0.000825	time 0.8773 (0.8815)	loss 0.4798 (0.5144)	grad_norm 2.5359 (2.7046)	mem 20675MB
[2025-04-03 02:11:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][514/573]	eta 0:00:52 lr 0.000825	time 0.8773 (0.8815)	loss 0.6075 (0.5148)	grad_norm 1.9864 (2.7045)	mem 20675MB
[2025-04-03 02:11:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][516/573]	eta 0:00:50 lr 0.000824	time 0.8773 (0.8815)	loss 0.5786 (0.5147)	grad_norm 1.9176 (2.7049)	mem 20675MB
[2025-04-03 02:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][518/573]	eta 0:00:48 lr 0.000824	time 0.8774 (0.8815)	loss 0.5955 (0.5148)	grad_norm 3.7029 (2.7063)	mem 20675MB
[2025-04-03 02:11:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][520/573]	eta 0:00:46 lr 0.000824	time 0.8774 (0.8815)	loss 0.5440 (0.5147)	grad_norm 2.8807 (2.7073)	mem 20675MB
[2025-04-03 02:11:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][522/573]	eta 0:00:44 lr 0.000824	time 0.8771 (0.8815)	loss 0.4937 (0.5147)	grad_norm 2.3651 (2.7065)	mem 20675MB
[2025-04-03 02:11:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][524/573]	eta 0:00:43 lr 0.000824	time 0.8771 (0.8815)	loss 0.5245 (0.5148)	grad_norm 1.7213 (2.7035)	mem 20675MB
[2025-04-03 02:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][526/573]	eta 0:00:41 lr 0.000823	time 0.8771 (0.8815)	loss 0.4866 (0.5148)	grad_norm 2.4601 (2.7006)	mem 20675MB
[2025-04-03 02:12:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][528/573]	eta 0:00:39 lr 0.000823	time 0.8772 (0.8815)	loss 0.4366 (0.5147)	grad_norm 1.8675 (2.6981)	mem 20675MB
[2025-04-03 02:12:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][530/573]	eta 0:00:37 lr 0.000823	time 0.8771 (0.8814)	loss 0.5870 (0.5146)	grad_norm 1.9190 (2.6960)	mem 20675MB
[2025-04-03 02:12:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][532/573]	eta 0:00:36 lr 0.000823	time 0.8770 (0.8814)	loss 0.3821 (0.5144)	grad_norm 3.9785 (2.6970)	mem 20675MB
[2025-04-03 02:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][534/573]	eta 0:00:34 lr 0.000822	time 0.8775 (0.8814)	loss 0.6246 (0.5144)	grad_norm 2.1832 (2.6973)	mem 20675MB
[2025-04-03 02:12:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][536/573]	eta 0:00:32 lr 0.000822	time 0.8772 (0.8814)	loss 0.5764 (0.5146)	grad_norm 3.0360 (2.6960)	mem 20675MB
[2025-04-03 02:12:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][538/573]	eta 0:00:30 lr 0.000822	time 0.8774 (0.8814)	loss 0.6012 (0.5146)	grad_norm 1.7688 (2.6940)	mem 20675MB
[2025-04-03 02:12:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][540/573]	eta 0:00:29 lr 0.000822	time 0.8779 (0.8814)	loss 0.5697 (0.5150)	grad_norm 1.8119 (2.6920)	mem 20675MB
[2025-04-03 02:12:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][542/573]	eta 0:00:27 lr 0.000822	time 0.8773 (0.8814)	loss 0.6294 (0.5149)	grad_norm 2.1288 (2.6911)	mem 20675MB
[2025-04-03 02:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][544/573]	eta 0:00:25 lr 0.000821	time 0.8775 (0.8814)	loss 0.5358 (0.5148)	grad_norm 2.1062 (2.6920)	mem 20675MB
[2025-04-03 02:12:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][546/573]	eta 0:00:23 lr 0.000821	time 0.8774 (0.8813)	loss 0.4232 (0.5146)	grad_norm 2.8196 (2.6914)	mem 20675MB
[2025-04-03 02:12:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][548/573]	eta 0:00:22 lr 0.000821	time 0.8777 (0.8813)	loss 0.4514 (0.5143)	grad_norm 2.6829 (2.6924)	mem 20675MB
[2025-04-03 02:12:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][550/573]	eta 0:00:20 lr 0.000821	time 0.8774 (0.8813)	loss 0.5225 (0.5143)	grad_norm 2.8930 (2.6957)	mem 20675MB
[2025-04-03 02:12:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][552/573]	eta 0:00:18 lr 0.000821	time 0.8772 (0.8813)	loss 0.4933 (0.5140)	grad_norm 2.9729 (2.6982)	mem 20675MB
[2025-04-03 02:12:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][554/573]	eta 0:00:16 lr 0.000820	time 0.8783 (0.8813)	loss 0.4972 (0.5138)	grad_norm 3.3914 (2.7010)	mem 20675MB
[2025-04-03 02:12:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][556/573]	eta 0:00:14 lr 0.000820	time 0.8774 (0.8813)	loss 0.5596 (0.5140)	grad_norm 3.4374 (2.7031)	mem 20675MB
[2025-04-03 02:12:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][558/573]	eta 0:00:13 lr 0.000820	time 0.8770 (0.8813)	loss 0.4516 (0.5141)	grad_norm 3.7540 (2.7046)	mem 20675MB
[2025-04-03 02:12:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][560/573]	eta 0:00:11 lr 0.000820	time 0.8770 (0.8813)	loss 0.4467 (0.5140)	grad_norm 2.6770 (2.7036)	mem 20675MB
[2025-04-03 02:12:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][562/573]	eta 0:00:09 lr 0.000819	time 0.8771 (0.8813)	loss 0.4312 (0.5138)	grad_norm 3.3026 (2.7030)	mem 20675MB
[2025-04-03 02:12:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][564/573]	eta 0:00:07 lr 0.000819	time 0.8769 (0.8813)	loss 0.4269 (0.5139)	grad_norm 2.9497 (2.7030)	mem 20675MB
[2025-04-03 02:12:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][566/573]	eta 0:00:06 lr 0.000819	time 0.8773 (0.8812)	loss 0.5186 (0.5139)	grad_norm 2.2619 (2.7018)	mem 20675MB
[2025-04-03 02:12:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][568/573]	eta 0:00:04 lr 0.000819	time 0.8771 (0.8812)	loss 0.6449 (0.5140)	grad_norm 2.0394 (2.7004)	mem 20675MB
[2025-04-03 02:12:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][570/573]	eta 0:00:02 lr 0.000819	time 0.8771 (0.8812)	loss 0.6134 (0.5143)	grad_norm 1.9427 (2.6979)	mem 20675MB
[2025-04-03 02:12:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][572/573]	eta 0:00:00 lr 0.000818	time 0.8777 (0.8812)	loss 0.5098 (0.5143)	grad_norm 3.2497 (2.6975)	mem 20675MB
[2025-04-03 02:12:39 simmim_finetune] (main_finetune.py 260): INFO EPOCH 11 training takes 0:08:25
[2025-04-03 02:12:41 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.993 (1.993)	Loss 0.5186 (0.5186)	Acc@1 73.438 (73.438)	Mem 20675MB
[2025-04-03 02:12:42 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.854)	Loss 0.4550 (0.4783)	Acc@1 78.125 (77.083)	Mem 20675MB
[2025-04-03 02:12:42 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.626)	Loss 0.5148 (0.4830)	Acc@1 72.656 (76.406)	Mem 20675MB
[2025-04-03 02:12:43 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.528)	Loss 0.5004 (0.4791)	Acc@1 78.906 (77.455)	Mem 20675MB
[2025-04-03 02:12:43 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.474)	Loss 0.5829 (0.4845)	Acc@1 67.188 (76.910)	Mem 20675MB
[2025-04-03 02:12:44 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.439)	Loss 0.5184 (0.4927)	Acc@1 79.688 (76.562)	Mem 20675MB
[2025-04-03 02:12:44 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.415)	Loss 0.5344 (0.4973)	Acc@1 71.094 (76.202)	Mem 20675MB
[2025-04-03 02:12:45 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.398)	Loss 0.4956 (0.4983)	Acc@1 82.031 (76.667)	Mem 20675MB
[2025-04-03 02:12:45 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.714
[2025-04-03 02:12:45 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 76.7%
[2025-04-03 02:12:45 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 76.71%
[2025-04-03 02:12:45 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.111568614538291e-06, 3.111568614538291e-06, 4.7405318647711745e-06, 4.7405318647711745e-06, 7.246629172821764e-06, 7.246629172821764e-06, 1.1102163492899594e-05, 1.1102163492899594e-05, 1.7033754754557793e-05, 1.7033754754557793e-05, 2.6159279772493483e-05, 2.6159279772493483e-05, 4.019854903085608e-05, 4.019854903085608e-05, 6.179742481295238e-05, 6.179742481295238e-05, 9.502646447771593e-05, 9.502646447771593e-05, 0.00014614806396196756, 0.00014614806396196756, 0.00022479667855312385, 0.00022479667855312385, 0.0003457945471549028, 0.0003457945471549028, 0.0005319451142345628, 0.0005319451142345628, 0.0008183306020494241, 0.0008183306020494241]
[2025-04-03 02:12:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][0/573]	eta 0:20:22 lr 0.000818	time 2.1338 (2.1338)	loss 0.3722 (0.3722)	grad_norm 2.8114 (2.8114)	mem 20675MB
[2025-04-03 02:12:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][2/573]	eta 0:12:20 lr 0.000818	time 0.8779 (1.2969)	loss 0.4195 (0.4602)	grad_norm 2.5245 (2.5394)	mem 20675MB
[2025-04-03 02:12:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][4/573]	eta 0:10:42 lr 0.000818	time 0.8774 (1.1297)	loss 0.5338 (0.4898)	grad_norm 1.8761 (2.4971)	mem 20675MB
[2025-04-03 02:12:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][6/573]	eta 0:09:59 lr 0.000818	time 0.8778 (1.0580)	loss 0.4875 (0.5029)	grad_norm 2.7073 (2.3787)	mem 20675MB
[2025-04-03 02:12:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][8/573]	eta 0:09:35 lr 0.000817	time 0.8776 (1.0183)	loss 0.3732 (0.4942)	grad_norm 3.4208 (2.5920)	mem 20675MB
[2025-04-03 02:12:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][10/573]	eta 0:09:18 lr 0.000817	time 0.8774 (0.9929)	loss 0.4432 (0.5021)	grad_norm 2.3892 (2.5818)	mem 20675MB
[2025-04-03 02:12:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][12/573]	eta 0:09:07 lr 0.000817	time 0.8809 (0.9755)	loss 0.5657 (0.4986)	grad_norm 2.1646 (2.6472)	mem 20675MB
[2025-04-03 02:13:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][14/573]	eta 0:08:58 lr 0.000817	time 0.8781 (0.9626)	loss 0.5342 (0.5048)	grad_norm 1.8718 (2.5922)	mem 20675MB
[2025-04-03 02:13:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][16/573]	eta 0:08:50 lr 0.000816	time 0.8801 (0.9529)	loss 0.5679 (0.5090)	grad_norm 2.3467 (2.5678)	mem 20675MB
[2025-04-03 02:13:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][18/573]	eta 0:08:44 lr 0.000816	time 0.8793 (0.9452)	loss 0.5327 (0.5038)	grad_norm 3.1286 (2.7301)	mem 20675MB
[2025-04-03 02:13:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][20/573]	eta 0:08:39 lr 0.000816	time 0.8776 (0.9388)	loss 0.4460 (0.5072)	grad_norm 2.7249 (2.7142)	mem 20675MB
[2025-04-03 02:13:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][22/573]	eta 0:08:34 lr 0.000816	time 0.8773 (0.9336)	loss 0.5620 (0.5102)	grad_norm 1.8266 (2.7155)	mem 20675MB
[2025-04-03 02:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][24/573]	eta 0:08:30 lr 0.000816	time 0.8781 (0.9293)	loss 0.4085 (0.5056)	grad_norm 2.6225 (2.7258)	mem 20675MB
[2025-04-03 02:13:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][26/573]	eta 0:08:26 lr 0.000815	time 0.8773 (0.9256)	loss 0.5532 (0.5103)	grad_norm 2.2380 (2.6973)	mem 20675MB
[2025-04-03 02:13:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][28/573]	eta 0:08:22 lr 0.000815	time 0.8771 (0.9223)	loss 0.4001 (0.5046)	grad_norm 3.4583 (2.7183)	mem 20675MB
[2025-04-03 02:13:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][30/573]	eta 0:08:19 lr 0.000815	time 0.8777 (0.9194)	loss 0.5385 (0.5035)	grad_norm 2.5638 (2.7950)	mem 20675MB
[2025-04-03 02:13:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][32/573]	eta 0:08:16 lr 0.000815	time 0.8772 (0.9170)	loss 0.4586 (0.5053)	grad_norm 1.5675 (2.7401)	mem 20675MB
[2025-04-03 02:13:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][34/573]	eta 0:08:13 lr 0.000815	time 0.8775 (0.9147)	loss 0.3737 (0.5006)	grad_norm 4.1028 (2.8079)	mem 20675MB
[2025-04-03 02:13:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][36/573]	eta 0:08:10 lr 0.000814	time 0.8775 (0.9128)	loss 0.5735 (0.4999)	grad_norm 2.1101 (2.8111)	mem 20675MB
[2025-04-03 02:13:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][38/573]	eta 0:08:07 lr 0.000814	time 0.8774 (0.9110)	loss 0.6201 (0.5036)	grad_norm 2.0520 (2.7636)	mem 20675MB
[2025-04-03 02:13:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][40/573]	eta 0:08:04 lr 0.000814	time 0.8772 (0.9094)	loss 0.5247 (0.5027)	grad_norm 2.4582 (2.7588)	mem 20675MB
[2025-04-03 02:13:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][42/573]	eta 0:08:02 lr 0.000814	time 0.8773 (0.9080)	loss 0.5493 (0.5044)	grad_norm 1.8972 (2.7525)	mem 20675MB
[2025-04-03 02:13:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][44/573]	eta 0:07:59 lr 0.000813	time 0.8772 (0.9066)	loss 0.5192 (0.5017)	grad_norm 2.4252 (2.8092)	mem 20675MB
[2025-04-03 02:13:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][46/573]	eta 0:07:57 lr 0.000813	time 0.8773 (0.9054)	loss 0.4238 (0.5022)	grad_norm 2.4456 (2.7813)	mem 20675MB
[2025-04-03 02:13:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][48/573]	eta 0:07:54 lr 0.000813	time 0.8772 (0.9043)	loss 0.4343 (0.5015)	grad_norm 3.2281 (2.7879)	mem 20675MB
[2025-04-03 02:13:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][50/573]	eta 0:07:52 lr 0.000813	time 0.8779 (0.9033)	loss 0.6113 (0.5021)	grad_norm 2.5455 (2.7790)	mem 20675MB
[2025-04-03 02:13:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][52/573]	eta 0:07:50 lr 0.000813	time 0.8773 (0.9024)	loss 0.5572 (0.5039)	grad_norm 1.8476 (2.7659)	mem 20675MB
[2025-04-03 02:13:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][54/573]	eta 0:07:47 lr 0.000812	time 0.8775 (0.9015)	loss 0.5142 (0.5064)	grad_norm 3.1100 (2.7591)	mem 20675MB
[2025-04-03 02:13:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][56/573]	eta 0:07:45 lr 0.000812	time 0.8784 (0.9007)	loss 0.4618 (0.5069)	grad_norm 1.9908 (2.7295)	mem 20675MB
[2025-04-03 02:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][58/573]	eta 0:07:43 lr 0.000812	time 0.8774 (0.8999)	loss 0.4716 (0.5065)	grad_norm 1.8819 (2.7153)	mem 20675MB
[2025-04-03 02:13:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][60/573]	eta 0:07:41 lr 0.000812	time 0.8775 (0.8992)	loss 0.4077 (0.5067)	grad_norm 2.4291 (2.6918)	mem 20675MB
[2025-04-03 02:13:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][62/573]	eta 0:07:39 lr 0.000811	time 0.8773 (0.8986)	loss 0.4794 (0.5067)	grad_norm 1.9225 (2.6733)	mem 20675MB
[2025-04-03 02:13:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][64/573]	eta 0:07:37 lr 0.000811	time 0.8774 (0.8979)	loss 0.4078 (0.5053)	grad_norm 3.1457 (2.6681)	mem 20675MB
[2025-04-03 02:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][66/573]	eta 0:07:34 lr 0.000811	time 0.8773 (0.8974)	loss 0.4847 (0.5032)	grad_norm 2.4449 (2.6897)	mem 20675MB
[2025-04-03 02:13:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][68/573]	eta 0:07:32 lr 0.000811	time 0.8774 (0.8968)	loss 0.3765 (0.5022)	grad_norm 4.0908 (2.7064)	mem 20675MB
[2025-04-03 02:13:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][70/573]	eta 0:07:30 lr 0.000811	time 0.8774 (0.8963)	loss 0.5677 (0.5032)	grad_norm 2.8029 (2.7217)	mem 20675MB
[2025-04-03 02:13:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][72/573]	eta 0:07:28 lr 0.000810	time 0.8774 (0.8958)	loss 0.6627 (0.5038)	grad_norm 5.5111 (2.7728)	mem 20675MB
[2025-04-03 02:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][74/573]	eta 0:07:26 lr 0.000810	time 0.8775 (0.8953)	loss 0.6587 (0.5047)	grad_norm 2.8365 (2.7823)	mem 20675MB
[2025-04-03 02:13:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][76/573]	eta 0:07:24 lr 0.000810	time 0.8775 (0.8949)	loss 0.3807 (0.5043)	grad_norm 4.6910 (2.8006)	mem 20675MB
[2025-04-03 02:13:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][78/573]	eta 0:07:22 lr 0.000810	time 0.8775 (0.8945)	loss 0.4279 (0.5017)	grad_norm 3.2341 (2.8088)	mem 20675MB
[2025-04-03 02:13:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][80/573]	eta 0:07:20 lr 0.000810	time 0.8773 (0.8941)	loss 0.5189 (0.5019)	grad_norm 2.8091 (2.7935)	mem 20675MB
[2025-04-03 02:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][82/573]	eta 0:07:18 lr 0.000809	time 0.8777 (0.8937)	loss 0.6592 (0.5029)	grad_norm 3.5088 (2.7962)	mem 20675MB
[2025-04-03 02:14:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][84/573]	eta 0:07:16 lr 0.000809	time 0.8772 (0.8933)	loss 0.4497 (0.5031)	grad_norm 3.1060 (2.7979)	mem 20675MB
[2025-04-03 02:14:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][86/573]	eta 0:07:14 lr 0.000809	time 0.8773 (0.8930)	loss 0.5869 (0.5044)	grad_norm 2.1185 (2.7832)	mem 20675MB
[2025-04-03 02:14:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][88/573]	eta 0:07:12 lr 0.000809	time 0.8778 (0.8927)	loss 0.5625 (0.5057)	grad_norm 1.5304 (2.7587)	mem 20675MB
[2025-04-03 02:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][90/573]	eta 0:07:11 lr 0.000808	time 0.8773 (0.8923)	loss 0.4382 (0.5044)	grad_norm 2.5003 (2.7556)	mem 20675MB
[2025-04-03 02:14:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][92/573]	eta 0:07:09 lr 0.000808	time 0.8772 (0.8920)	loss 0.4716 (0.5044)	grad_norm 3.0463 (2.7477)	mem 20675MB
[2025-04-03 02:14:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][94/573]	eta 0:07:07 lr 0.000808	time 0.8776 (0.8918)	loss 0.6170 (0.5049)	grad_norm 1.9091 (2.7308)	mem 20675MB
[2025-04-03 02:14:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][96/573]	eta 0:07:05 lr 0.000808	time 0.8774 (0.8915)	loss 0.5230 (0.5065)	grad_norm 2.1312 (2.7169)	mem 20675MB
[2025-04-03 02:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][98/573]	eta 0:07:03 lr 0.000808	time 0.8773 (0.8912)	loss 0.3471 (0.5056)	grad_norm 2.8567 (2.7117)	mem 20675MB
[2025-04-03 02:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][100/573]	eta 0:07:01 lr 0.000807	time 0.8775 (0.8910)	loss 0.5676 (0.5071)	grad_norm 2.4845 (2.7113)	mem 20675MB
[2025-04-03 02:14:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][102/573]	eta 0:06:59 lr 0.000807	time 0.8774 (0.8907)	loss 0.5175 (0.5077)	grad_norm 2.2708 (2.6998)	mem 20675MB
[2025-04-03 02:14:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][104/573]	eta 0:06:57 lr 0.000807	time 0.8774 (0.8905)	loss 0.4771 (0.5074)	grad_norm 3.0266 (2.7145)	mem 20675MB
[2025-04-03 02:14:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][106/573]	eta 0:06:55 lr 0.000807	time 0.8775 (0.8902)	loss 0.4990 (0.5075)	grad_norm 1.9397 (2.7148)	mem 20675MB
[2025-04-03 02:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][108/573]	eta 0:06:53 lr 0.000806	time 0.8776 (0.8900)	loss 0.5499 (0.5068)	grad_norm 1.9135 (2.7105)	mem 20675MB
[2025-04-03 02:14:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][110/573]	eta 0:06:51 lr 0.000806	time 0.8773 (0.8898)	loss 0.4787 (0.5061)	grad_norm 3.0851 (2.7096)	mem 20675MB
[2025-04-03 02:14:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][112/573]	eta 0:06:50 lr 0.000806	time 0.8776 (0.8896)	loss 0.5568 (0.5067)	grad_norm 2.8284 (2.7156)	mem 20675MB
[2025-04-03 02:14:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][114/573]	eta 0:06:48 lr 0.000806	time 0.8773 (0.8894)	loss 0.5692 (0.5059)	grad_norm 2.4465 (2.7130)	mem 20675MB
[2025-04-03 02:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][116/573]	eta 0:06:46 lr 0.000806	time 0.8776 (0.8892)	loss 0.5816 (0.5074)	grad_norm 2.3198 (2.7246)	mem 20675MB
[2025-04-03 02:14:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][118/573]	eta 0:06:44 lr 0.000805	time 0.8775 (0.8890)	loss 0.4822 (0.5079)	grad_norm 2.3505 (2.7243)	mem 20675MB
[2025-04-03 02:14:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][120/573]	eta 0:06:42 lr 0.000805	time 0.8778 (0.8889)	loss 0.5537 (0.5076)	grad_norm 2.0718 (2.7195)	mem 20675MB
[2025-04-03 02:14:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][122/573]	eta 0:06:40 lr 0.000805	time 0.8772 (0.8887)	loss 0.4325 (0.5072)	grad_norm 2.2682 (2.7175)	mem 20675MB
[2025-04-03 02:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][124/573]	eta 0:06:38 lr 0.000805	time 0.8774 (0.8885)	loss 0.4359 (0.5058)	grad_norm 2.4774 (2.7223)	mem 20675MB
[2025-04-03 02:14:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][126/573]	eta 0:06:37 lr 0.000804	time 0.8776 (0.8884)	loss 0.5583 (0.5053)	grad_norm 2.2889 (2.7284)	mem 20675MB
[2025-04-03 02:14:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][128/573]	eta 0:06:35 lr 0.000804	time 0.8774 (0.8882)	loss 0.4753 (0.5043)	grad_norm 2.4575 (2.7310)	mem 20675MB
[2025-04-03 02:14:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][130/573]	eta 0:06:33 lr 0.000804	time 0.8772 (0.8881)	loss 0.4768 (0.5036)	grad_norm 3.4755 (2.7369)	mem 20675MB
[2025-04-03 02:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][132/573]	eta 0:06:31 lr 0.000804	time 0.8773 (0.8879)	loss 0.4887 (0.5041)	grad_norm 2.0738 (2.7275)	mem 20675MB
[2025-04-03 02:14:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][134/573]	eta 0:06:29 lr 0.000804	time 0.8773 (0.8878)	loss 0.5704 (0.5049)	grad_norm 2.9189 (2.7505)	mem 20675MB
[2025-04-03 02:14:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][136/573]	eta 0:06:27 lr 0.000803	time 0.8771 (0.8876)	loss 0.4113 (0.5042)	grad_norm 3.3847 (2.7485)	mem 20675MB
[2025-04-03 02:14:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][138/573]	eta 0:06:26 lr 0.000803	time 0.8775 (0.8875)	loss 0.5473 (0.5045)	grad_norm 1.9554 (2.7366)	mem 20675MB
[2025-04-03 02:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][140/573]	eta 0:06:24 lr 0.000803	time 0.8775 (0.8874)	loss 0.5763 (0.5057)	grad_norm 3.5429 (2.7515)	mem 20675MB
[2025-04-03 02:14:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][142/573]	eta 0:06:22 lr 0.000803	time 0.8773 (0.8873)	loss 0.3883 (0.5048)	grad_norm 3.4321 (2.7581)	mem 20675MB
[2025-04-03 02:14:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][144/573]	eta 0:06:20 lr 0.000803	time 0.8776 (0.8871)	loss 0.4828 (0.5054)	grad_norm 2.6484 (2.7526)	mem 20675MB
[2025-04-03 02:14:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][146/573]	eta 0:06:18 lr 0.000802	time 0.8779 (0.8870)	loss 0.5462 (0.5061)	grad_norm 1.5702 (2.7409)	mem 20675MB
[2025-04-03 02:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][148/573]	eta 0:06:16 lr 0.000802	time 0.8784 (0.8869)	loss 0.4234 (0.5056)	grad_norm 3.1642 (2.7409)	mem 20675MB
[2025-04-03 02:14:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][150/573]	eta 0:06:15 lr 0.000802	time 0.8773 (0.8868)	loss 0.4192 (0.5052)	grad_norm 2.4025 (2.7315)	mem 20675MB
[2025-04-03 02:15:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][152/573]	eta 0:06:13 lr 0.000802	time 0.8773 (0.8867)	loss 0.4508 (0.5051)	grad_norm 1.8452 (2.7183)	mem 20675MB
[2025-04-03 02:15:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][154/573]	eta 0:06:11 lr 0.000801	time 0.8774 (0.8866)	loss 0.5582 (0.5061)	grad_norm 2.8056 (2.7153)	mem 20675MB
[2025-04-03 02:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][156/573]	eta 0:06:09 lr 0.000801	time 0.8774 (0.8865)	loss 0.4749 (0.5053)	grad_norm 2.1272 (2.7107)	mem 20675MB
[2025-04-03 02:15:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][158/573]	eta 0:06:07 lr 0.000801	time 0.8776 (0.8864)	loss 0.5402 (0.5059)	grad_norm 1.9613 (2.7030)	mem 20675MB
[2025-04-03 02:15:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][160/573]	eta 0:06:06 lr 0.000801	time 0.8773 (0.8863)	loss 0.4282 (0.5060)	grad_norm 6.1737 (2.7250)	mem 20675MB
[2025-04-03 02:15:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][162/573]	eta 0:06:04 lr 0.000801	time 0.8773 (0.8862)	loss 0.5296 (0.5065)	grad_norm 1.9068 (2.7170)	mem 20675MB
[2025-04-03 02:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][164/573]	eta 0:06:02 lr 0.000800	time 0.8773 (0.8861)	loss 0.4546 (0.5054)	grad_norm 2.4366 (2.7171)	mem 20675MB
[2025-04-03 02:15:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][166/573]	eta 0:06:00 lr 0.000800	time 0.8778 (0.8860)	loss 0.3902 (0.5039)	grad_norm 3.1616 (2.7231)	mem 20675MB
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][168/573]	eta 0:05:58 lr 0.000800	time 0.8776 (0.8859)	loss 0.5341 (0.5044)	grad_norm 2.2979 (2.7166)	mem 20675MB
[2025-04-03 02:15:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][170/573]	eta 0:05:56 lr 0.000800	time 0.8772 (0.8858)	loss 0.5448 (0.5050)	grad_norm 2.8625 (2.7111)	mem 20675MB
[2025-04-03 02:15:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][172/573]	eta 0:05:55 lr 0.000799	time 0.8777 (0.8857)	loss 0.6029 (0.5059)	grad_norm 3.5803 (2.7155)	mem 20675MB
[2025-04-03 02:15:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][174/573]	eta 0:05:53 lr 0.000799	time 0.8774 (0.8856)	loss 0.3425 (0.5053)	grad_norm 2.8685 (2.7141)	mem 20675MB
[2025-04-03 02:15:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][176/573]	eta 0:05:51 lr 0.000799	time 0.8772 (0.8856)	loss 0.4769 (0.5046)	grad_norm 4.0133 (2.7212)	mem 20675MB
[2025-04-03 02:15:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][178/573]	eta 0:05:49 lr 0.000799	time 0.8781 (0.8855)	loss 0.4033 (0.5043)	grad_norm 3.5966 (2.7259)	mem 20675MB
[2025-04-03 02:15:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][180/573]	eta 0:05:47 lr 0.000799	time 0.8775 (0.8854)	loss 0.5770 (0.5048)	grad_norm 3.0327 (2.7292)	mem 20675MB
[2025-04-03 02:15:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][182/573]	eta 0:05:46 lr 0.000798	time 0.8774 (0.8853)	loss 0.5548 (0.5053)	grad_norm 1.8358 (2.7165)	mem 20675MB
[2025-04-03 02:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][184/573]	eta 0:05:44 lr 0.000798	time 0.8774 (0.8853)	loss 0.5528 (0.5056)	grad_norm 2.6627 (2.7125)	mem 20675MB
[2025-04-03 02:15:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][186/573]	eta 0:05:42 lr 0.000798	time 0.8774 (0.8852)	loss 0.5663 (0.5061)	grad_norm 1.8723 (2.7044)	mem 20675MB
[2025-04-03 02:15:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][188/573]	eta 0:05:40 lr 0.000798	time 0.8785 (0.8851)	loss 0.4918 (0.5062)	grad_norm 1.9302 (2.6961)	mem 20675MB
[2025-04-03 02:15:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][190/573]	eta 0:05:38 lr 0.000797	time 0.8772 (0.8850)	loss 0.5368 (0.5067)	grad_norm 5.2299 (2.7091)	mem 20675MB
[2025-04-03 02:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][192/573]	eta 0:05:37 lr 0.000797	time 0.8787 (0.8850)	loss 0.5735 (0.5071)	grad_norm 1.8842 (2.7027)	mem 20675MB
[2025-04-03 02:15:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][194/573]	eta 0:05:35 lr 0.000797	time 0.8773 (0.8849)	loss 0.5204 (0.5067)	grad_norm 1.6782 (2.6961)	mem 20675MB
[2025-04-03 02:15:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][196/573]	eta 0:05:33 lr 0.000797	time 0.8776 (0.8848)	loss 0.5089 (0.5062)	grad_norm 2.1460 (2.6927)	mem 20675MB
[2025-04-03 02:15:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][198/573]	eta 0:05:31 lr 0.000797	time 0.8770 (0.8848)	loss 0.4549 (0.5054)	grad_norm 2.7507 (2.6936)	mem 20675MB
[2025-04-03 02:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][200/573]	eta 0:05:29 lr 0.000796	time 0.8775 (0.8847)	loss 0.3579 (0.5045)	grad_norm 4.0124 (2.7174)	mem 20675MB
[2025-04-03 02:15:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][202/573]	eta 0:05:28 lr 0.000796	time 0.8775 (0.8847)	loss 0.3209 (0.5040)	grad_norm 3.5856 (2.7193)	mem 20675MB
[2025-04-03 02:15:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][204/573]	eta 0:05:26 lr 0.000796	time 0.8770 (0.8846)	loss 0.5054 (0.5043)	grad_norm 3.5503 (2.7234)	mem 20675MB
[2025-04-03 02:15:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][206/573]	eta 0:05:24 lr 0.000796	time 0.8771 (0.8845)	loss 0.3904 (0.5038)	grad_norm 4.0425 (2.7336)	mem 20675MB
[2025-04-03 02:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][208/573]	eta 0:05:22 lr 0.000795	time 0.8775 (0.8847)	loss 0.4933 (0.5038)	grad_norm 3.3626 (2.7349)	mem 20675MB
[2025-04-03 02:15:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][210/573]	eta 0:05:21 lr 0.000795	time 0.8772 (0.8846)	loss 0.6089 (0.5040)	grad_norm 2.4509 (2.7363)	mem 20675MB
[2025-04-03 02:15:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][212/573]	eta 0:05:19 lr 0.000795	time 0.8773 (0.8846)	loss 0.6331 (0.5051)	grad_norm 3.2658 (2.7373)	mem 20675MB
[2025-04-03 02:15:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][214/573]	eta 0:05:17 lr 0.000795	time 0.8771 (0.8845)	loss 0.4545 (0.5048)	grad_norm 2.3619 (2.7366)	mem 20675MB
[2025-04-03 02:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][216/573]	eta 0:05:15 lr 0.000795	time 0.8774 (0.8844)	loss 0.6171 (0.5056)	grad_norm 2.0404 (2.7306)	mem 20675MB
[2025-04-03 02:15:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][218/573]	eta 0:05:13 lr 0.000794	time 0.8785 (0.8844)	loss 0.3601 (0.5049)	grad_norm 2.5413 (2.7283)	mem 20675MB
[2025-04-03 02:16:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][220/573]	eta 0:05:12 lr 0.000794	time 0.8773 (0.8843)	loss 0.5810 (0.5052)	grad_norm 1.6608 (2.7215)	mem 20675MB
[2025-04-03 02:16:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][222/573]	eta 0:05:10 lr 0.000794	time 0.8771 (0.8843)	loss 0.5252 (0.5058)	grad_norm 1.8313 (2.7139)	mem 20675MB
[2025-04-03 02:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][224/573]	eta 0:05:08 lr 0.000794	time 0.8770 (0.8842)	loss 0.5107 (0.5063)	grad_norm 2.5791 (2.7110)	mem 20675MB
[2025-04-03 02:16:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][226/573]	eta 0:05:06 lr 0.000794	time 0.8774 (0.8842)	loss 0.4550 (0.5058)	grad_norm 2.6209 (2.7081)	mem 20675MB
[2025-04-03 02:16:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][228/573]	eta 0:05:05 lr 0.000793	time 0.8771 (0.8841)	loss 0.5287 (0.5056)	grad_norm 2.7160 (2.7123)	mem 20675MB
[2025-04-03 02:16:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][230/573]	eta 0:05:03 lr 0.000793	time 0.8773 (0.8841)	loss 0.4426 (0.5051)	grad_norm 2.5985 (2.7096)	mem 20675MB
[2025-04-03 02:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][232/573]	eta 0:05:01 lr 0.000793	time 0.8773 (0.8840)	loss 0.4589 (0.5052)	grad_norm 3.3476 (2.7124)	mem 20675MB
[2025-04-03 02:16:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][234/573]	eta 0:04:59 lr 0.000793	time 0.8769 (0.8840)	loss 0.4058 (0.5044)	grad_norm 4.8482 (2.7251)	mem 20675MB
[2025-04-03 02:16:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][236/573]	eta 0:04:57 lr 0.000792	time 0.8773 (0.8839)	loss 0.5046 (0.5043)	grad_norm 4.7256 (2.7294)	mem 20675MB
[2025-04-03 02:16:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][238/573]	eta 0:04:56 lr 0.000792	time 0.8771 (0.8839)	loss 0.5236 (0.5049)	grad_norm 2.9677 (2.7300)	mem 20675MB
[2025-04-03 02:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][240/573]	eta 0:04:54 lr 0.000792	time 0.8775 (0.8838)	loss 0.5448 (0.5052)	grad_norm 2.7678 (2.7317)	mem 20675MB
[2025-04-03 02:16:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][242/573]	eta 0:04:52 lr 0.000792	time 0.8773 (0.8838)	loss 0.6053 (0.5059)	grad_norm 2.0359 (2.7258)	mem 20675MB
[2025-04-03 02:16:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][244/573]	eta 0:04:50 lr 0.000792	time 0.8770 (0.8838)	loss 0.4390 (0.5053)	grad_norm 2.0902 (2.7249)	mem 20675MB
[2025-04-03 02:16:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][246/573]	eta 0:04:48 lr 0.000791	time 0.8773 (0.8837)	loss 0.4034 (0.5045)	grad_norm 2.8505 (2.7252)	mem 20675MB
[2025-04-03 02:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][248/573]	eta 0:04:47 lr 0.000791	time 0.8773 (0.8837)	loss 0.5315 (0.5050)	grad_norm 1.6743 (2.7167)	mem 20675MB
[2025-04-03 02:16:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][250/573]	eta 0:04:45 lr 0.000791	time 0.8770 (0.8836)	loss 0.5802 (0.5054)	grad_norm 2.1809 (2.7107)	mem 20675MB
[2025-04-03 02:16:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][252/573]	eta 0:04:43 lr 0.000791	time 0.8773 (0.8836)	loss 0.4137 (0.5052)	grad_norm 1.9851 (2.7101)	mem 20675MB
[2025-04-03 02:16:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][254/573]	eta 0:04:41 lr 0.000790	time 0.8775 (0.8836)	loss 0.5904 (0.5057)	grad_norm 2.1579 (2.7040)	mem 20675MB
[2025-04-03 02:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][256/573]	eta 0:04:40 lr 0.000790	time 0.8772 (0.8835)	loss 0.5143 (0.5055)	grad_norm 3.0449 (2.7057)	mem 20675MB
[2025-04-03 02:16:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][258/573]	eta 0:04:38 lr 0.000790	time 0.8772 (0.8835)	loss 0.3914 (0.5052)	grad_norm 2.6879 (2.7043)	mem 20675MB
[2025-04-03 02:16:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][260/573]	eta 0:04:36 lr 0.000790	time 0.8773 (0.8834)	loss 0.5627 (0.5054)	grad_norm 2.0071 (2.6988)	mem 20675MB
[2025-04-03 02:16:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][262/573]	eta 0:04:34 lr 0.000790	time 0.8770 (0.8834)	loss 0.5440 (0.5051)	grad_norm 1.9742 (2.6976)	mem 20675MB
[2025-04-03 02:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][264/573]	eta 0:04:32 lr 0.000789	time 0.8773 (0.8833)	loss 0.4185 (0.5050)	grad_norm 3.6454 (2.7025)	mem 20675MB
[2025-04-03 02:16:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][266/573]	eta 0:04:31 lr 0.000789	time 0.8772 (0.8833)	loss 0.4757 (0.5052)	grad_norm 2.1424 (2.7023)	mem 20675MB
[2025-04-03 02:16:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][268/573]	eta 0:04:29 lr 0.000789	time 0.8770 (0.8833)	loss 0.5648 (0.5052)	grad_norm 2.5249 (2.7027)	mem 20675MB
[2025-04-03 02:16:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][270/573]	eta 0:04:27 lr 0.000789	time 0.8773 (0.8832)	loss 0.5880 (0.5059)	grad_norm 2.9005 (2.7018)	mem 20675MB
[2025-04-03 02:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][272/573]	eta 0:04:25 lr 0.000788	time 0.8778 (0.8832)	loss 0.5629 (0.5066)	grad_norm 2.0107 (2.7005)	mem 20675MB
[2025-04-03 02:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][274/573]	eta 0:04:24 lr 0.000788	time 0.8770 (0.8832)	loss 0.4238 (0.5066)	grad_norm 3.1488 (2.6990)	mem 20675MB
[2025-04-03 02:16:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][276/573]	eta 0:04:22 lr 0.000788	time 0.8773 (0.8831)	loss 0.5464 (0.5071)	grad_norm 2.0744 (2.6928)	mem 20675MB
[2025-04-03 02:16:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][278/573]	eta 0:04:20 lr 0.000788	time 0.8774 (0.8831)	loss 0.6225 (0.5078)	grad_norm 1.8865 (2.6885)	mem 20675MB
[2025-04-03 02:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][280/573]	eta 0:04:18 lr 0.000788	time 0.8771 (0.8831)	loss 0.5553 (0.5083)	grad_norm 1.5580 (2.6812)	mem 20675MB
[2025-04-03 02:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][282/573]	eta 0:04:16 lr 0.000787	time 0.8774 (0.8830)	loss 0.6209 (0.5089)	grad_norm 1.5426 (2.6734)	mem 20675MB
[2025-04-03 02:16:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][284/573]	eta 0:04:15 lr 0.000787	time 0.8772 (0.8830)	loss 0.5616 (0.5094)	grad_norm 2.3062 (2.6685)	mem 20675MB
[2025-04-03 02:16:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][286/573]	eta 0:04:13 lr 0.000787	time 0.8771 (0.8830)	loss 0.4927 (0.5093)	grad_norm 2.0452 (2.6643)	mem 20675MB
[2025-04-03 02:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][288/573]	eta 0:04:11 lr 0.000787	time 0.8775 (0.8829)	loss 0.4931 (0.5093)	grad_norm 2.3226 (2.6615)	mem 20675MB
[2025-04-03 02:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][290/573]	eta 0:04:09 lr 0.000786	time 0.8774 (0.8829)	loss 0.5523 (0.5092)	grad_norm 2.1248 (2.6613)	mem 20675MB
[2025-04-03 02:17:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][292/573]	eta 0:04:08 lr 0.000786	time 0.8775 (0.8829)	loss 0.6353 (0.5099)	grad_norm 2.7112 (2.6621)	mem 20675MB
[2025-04-03 02:17:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][294/573]	eta 0:04:06 lr 0.000786	time 0.8774 (0.8828)	loss 0.5852 (0.5104)	grad_norm 2.2674 (2.6592)	mem 20675MB
[2025-04-03 02:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][296/573]	eta 0:04:04 lr 0.000786	time 0.8772 (0.8828)	loss 0.3453 (0.5094)	grad_norm 2.9596 (2.6606)	mem 20675MB
[2025-04-03 02:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][298/573]	eta 0:04:02 lr 0.000786	time 0.8771 (0.8828)	loss 0.5704 (0.5098)	grad_norm 3.1670 (2.6604)	mem 20675MB
[2025-04-03 02:17:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][300/573]	eta 0:04:00 lr 0.000785	time 0.8773 (0.8827)	loss 0.5193 (0.5101)	grad_norm 1.7489 (2.6555)	mem 20675MB
[2025-04-03 02:17:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][302/573]	eta 0:03:59 lr 0.000785	time 0.8776 (0.8827)	loss 0.3836 (0.5092)	grad_norm 3.6169 (2.6627)	mem 20675MB
[2025-04-03 02:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][304/573]	eta 0:03:57 lr 0.000785	time 0.8775 (0.8827)	loss 0.5348 (0.5088)	grad_norm 1.9886 (2.6606)	mem 20675MB
[2025-04-03 02:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][306/573]	eta 0:03:55 lr 0.000785	time 0.8775 (0.8827)	loss 0.5542 (0.5089)	grad_norm 2.4489 (2.6610)	mem 20675MB
[2025-04-03 02:17:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][308/573]	eta 0:03:53 lr 0.000784	time 0.8770 (0.8826)	loss 0.3836 (0.5086)	grad_norm 5.4123 (2.6679)	mem 20675MB
[2025-04-03 02:17:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][310/573]	eta 0:03:52 lr 0.000784	time 0.8773 (0.8826)	loss 0.3995 (0.5083)	grad_norm 4.3403 (2.6815)	mem 20675MB
[2025-04-03 02:17:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][312/573]	eta 0:03:50 lr 0.000784	time 0.8772 (0.8826)	loss 0.5424 (0.5082)	grad_norm 3.0189 (2.6818)	mem 20675MB
[2025-04-03 02:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][314/573]	eta 0:03:48 lr 0.000784	time 0.8784 (0.8826)	loss 0.5058 (0.5084)	grad_norm 3.4808 (2.6833)	mem 20675MB
[2025-04-03 02:17:25 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][316/573]	eta 0:03:46 lr 0.000784	time 0.8772 (0.8825)	loss 0.3525 (0.5080)	grad_norm 3.0513 (2.6819)	mem 20675MB
[2025-04-03 02:17:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][318/573]	eta 0:03:45 lr 0.000783	time 0.8773 (0.8825)	loss 0.5670 (0.5081)	grad_norm 2.1645 (2.6785)	mem 20675MB
[2025-04-03 02:17:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][320/573]	eta 0:03:43 lr 0.000783	time 0.8774 (0.8825)	loss 0.5289 (0.5083)	grad_norm 2.0736 (2.6746)	mem 20675MB
[2025-04-03 02:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][322/573]	eta 0:03:41 lr 0.000783	time 0.8773 (0.8825)	loss 0.5640 (0.5082)	grad_norm 2.6485 (2.6770)	mem 20675MB
[2025-04-03 02:17:32 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][324/573]	eta 0:03:39 lr 0.000783	time 0.8771 (0.8824)	loss 0.5369 (0.5082)	grad_norm 2.1783 (2.6753)	mem 20675MB
[2025-04-03 02:17:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][326/573]	eta 0:03:37 lr 0.000782	time 0.8774 (0.8824)	loss 0.4343 (0.5081)	grad_norm 2.4108 (2.6765)	mem 20675MB
[2025-04-03 02:17:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][328/573]	eta 0:03:36 lr 0.000782	time 0.8779 (0.8824)	loss 0.5744 (0.5082)	grad_norm 3.3799 (2.6797)	mem 20675MB
[2025-04-03 02:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][330/573]	eta 0:03:34 lr 0.000782	time 0.8771 (0.8824)	loss 0.4328 (0.5077)	grad_norm 3.9603 (2.6848)	mem 20675MB
[2025-04-03 02:17:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][332/573]	eta 0:03:32 lr 0.000782	time 0.8772 (0.8823)	loss 0.5747 (0.5080)	grad_norm 4.9359 (2.6930)	mem 20675MB
[2025-04-03 02:17:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][334/573]	eta 0:03:30 lr 0.000782	time 0.8771 (0.8823)	loss 0.4527 (0.5077)	grad_norm 3.4227 (2.6952)	mem 20675MB
[2025-04-03 02:17:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][336/573]	eta 0:03:29 lr 0.000781	time 0.8771 (0.8823)	loss 0.4145 (0.5075)	grad_norm 3.9131 (2.6977)	mem 20675MB
[2025-04-03 02:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][338/573]	eta 0:03:27 lr 0.000781	time 0.8772 (0.8823)	loss 0.5878 (0.5078)	grad_norm 3.1993 (2.6986)	mem 20675MB
[2025-04-03 02:17:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][340/573]	eta 0:03:25 lr 0.000781	time 0.8773 (0.8822)	loss 0.6334 (0.5083)	grad_norm 2.4067 (2.6986)	mem 20675MB
[2025-04-03 02:17:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][342/573]	eta 0:03:23 lr 0.000781	time 0.8771 (0.8822)	loss 0.5276 (0.5082)	grad_norm 1.7419 (2.6987)	mem 20675MB
[2025-04-03 02:17:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][344/573]	eta 0:03:22 lr 0.000781	time 0.8772 (0.8822)	loss 0.5642 (0.5085)	grad_norm 2.1118 (2.6949)	mem 20675MB
[2025-04-03 02:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][346/573]	eta 0:03:20 lr 0.000780	time 0.8774 (0.8822)	loss 0.5333 (0.5086)	grad_norm 1.9116 (2.6915)	mem 20675MB
[2025-04-03 02:17:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][348/573]	eta 0:03:18 lr 0.000780	time 0.8770 (0.8821)	loss 0.4546 (0.5086)	grad_norm 3.6950 (2.6924)	mem 20675MB
[2025-04-03 02:17:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][350/573]	eta 0:03:16 lr 0.000780	time 0.8774 (0.8821)	loss 0.4468 (0.5085)	grad_norm 2.4161 (2.6906)	mem 20675MB
[2025-04-03 02:17:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][352/573]	eta 0:03:14 lr 0.000780	time 0.8770 (0.8821)	loss 0.4074 (0.5082)	grad_norm 3.1606 (2.6911)	mem 20675MB
[2025-04-03 02:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][354/573]	eta 0:03:13 lr 0.000779	time 0.8774 (0.8821)	loss 0.6058 (0.5085)	grad_norm 2.5821 (2.6909)	mem 20675MB
[2025-04-03 02:18:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][356/573]	eta 0:03:11 lr 0.000779	time 0.8771 (0.8821)	loss 0.4382 (0.5083)	grad_norm 2.4298 (2.6899)	mem 20675MB
[2025-04-03 02:18:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][358/573]	eta 0:03:09 lr 0.000779	time 0.8774 (0.8820)	loss 0.5575 (0.5086)	grad_norm 2.1708 (2.6886)	mem 20675MB
[2025-04-03 02:18:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][360/573]	eta 0:03:07 lr 0.000779	time 0.8772 (0.8820)	loss 0.5435 (0.5085)	grad_norm 2.5371 (2.6883)	mem 20675MB
[2025-04-03 02:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][362/573]	eta 0:03:06 lr 0.000779	time 0.8771 (0.8820)	loss 0.5212 (0.5087)	grad_norm 2.5776 (2.6883)	mem 20675MB
[2025-04-03 02:18:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][364/573]	eta 0:03:04 lr 0.000778	time 0.8770 (0.8820)	loss 0.5740 (0.5089)	grad_norm 1.9484 (2.6875)	mem 20675MB
[2025-04-03 02:18:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][366/573]	eta 0:03:02 lr 0.000778	time 0.8776 (0.8820)	loss 0.5547 (0.5090)	grad_norm 3.4106 (2.6880)	mem 20675MB
[2025-04-03 02:18:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][368/573]	eta 0:03:00 lr 0.000778	time 0.8772 (0.8819)	loss 0.5432 (0.5091)	grad_norm 2.1030 (2.6845)	mem 20675MB
[2025-04-03 02:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][370/573]	eta 0:02:59 lr 0.000778	time 0.8775 (0.8819)	loss 0.4445 (0.5089)	grad_norm 5.4914 (2.6917)	mem 20675MB
[2025-04-03 02:18:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][372/573]	eta 0:02:57 lr 0.000777	time 0.8773 (0.8819)	loss 0.5925 (0.5088)	grad_norm 2.6052 (2.6947)	mem 20675MB
[2025-04-03 02:18:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][374/573]	eta 0:02:55 lr 0.000777	time 0.8770 (0.8819)	loss 0.5331 (0.5093)	grad_norm 3.9217 (2.6967)	mem 20675MB
[2025-04-03 02:18:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][376/573]	eta 0:02:53 lr 0.000777	time 0.8773 (0.8819)	loss 0.4890 (0.5093)	grad_norm 3.8186 (2.6987)	mem 20675MB
[2025-04-03 02:18:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][378/573]	eta 0:02:51 lr 0.000777	time 0.8772 (0.8818)	loss 0.3970 (0.5092)	grad_norm 2.6834 (2.6970)	mem 20675MB
[2025-04-03 02:18:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][380/573]	eta 0:02:50 lr 0.000777	time 0.8777 (0.8818)	loss 0.5511 (0.5092)	grad_norm 1.8669 (2.6943)	mem 20675MB
[2025-04-03 02:18:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][382/573]	eta 0:02:48 lr 0.000776	time 0.8770 (0.8818)	loss 0.4218 (0.5091)	grad_norm 2.8241 (2.6939)	mem 20675MB
[2025-04-03 02:18:25 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][384/573]	eta 0:02:46 lr 0.000776	time 0.8773 (0.8818)	loss 0.5752 (0.5096)	grad_norm 4.6757 (2.7016)	mem 20675MB
[2025-04-03 02:18:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][386/573]	eta 0:02:44 lr 0.000776	time 0.8779 (0.8818)	loss 0.6102 (0.5100)	grad_norm 2.4811 (2.6994)	mem 20675MB
[2025-04-03 02:18:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][388/573]	eta 0:02:43 lr 0.000776	time 0.8772 (0.8817)	loss 0.5191 (0.5102)	grad_norm 2.6624 (2.6985)	mem 20675MB
[2025-04-03 02:18:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][390/573]	eta 0:02:41 lr 0.000775	time 0.8771 (0.8817)	loss 0.5296 (0.5104)	grad_norm 1.7923 (2.6936)	mem 20675MB
[2025-04-03 02:18:32 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][392/573]	eta 0:02:39 lr 0.000775	time 0.8772 (0.8817)	loss 0.4427 (0.5102)	grad_norm 2.0676 (2.6899)	mem 20675MB
[2025-04-03 02:18:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][394/573]	eta 0:02:37 lr 0.000775	time 0.8770 (0.8817)	loss 0.5428 (0.5101)	grad_norm 1.8224 (2.6888)	mem 20675MB
[2025-04-03 02:18:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][396/573]	eta 0:02:36 lr 0.000775	time 0.8771 (0.8817)	loss 0.5477 (0.5104)	grad_norm 1.8984 (2.6841)	mem 20675MB
[2025-04-03 02:18:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][398/573]	eta 0:02:34 lr 0.000775	time 0.8770 (0.8817)	loss 0.5659 (0.5105)	grad_norm 1.9601 (2.6847)	mem 20675MB
[2025-04-03 02:18:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][400/573]	eta 0:02:32 lr 0.000774	time 0.8772 (0.8816)	loss 0.6026 (0.5105)	grad_norm 2.3516 (2.6859)	mem 20675MB
[2025-04-03 02:18:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][402/573]	eta 0:02:30 lr 0.000774	time 0.8772 (0.8816)	loss 0.4253 (0.5103)	grad_norm 4.1382 (2.6896)	mem 20675MB
[2025-04-03 02:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][404/573]	eta 0:02:28 lr 0.000774	time 0.8772 (0.8816)	loss 0.3623 (0.5100)	grad_norm 3.6158 (2.6917)	mem 20675MB
[2025-04-03 02:18:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][406/573]	eta 0:02:27 lr 0.000774	time 0.8772 (0.8816)	loss 0.4793 (0.5099)	grad_norm 5.2880 (2.6988)	mem 20675MB
[2025-04-03 02:18:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][408/573]	eta 0:02:25 lr 0.000773	time 0.8771 (0.8816)	loss 0.5524 (0.5097)	grad_norm 2.4267 (2.6994)	mem 20675MB
[2025-04-03 02:18:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][410/573]	eta 0:02:23 lr 0.000773	time 0.8772 (0.8816)	loss 0.5268 (0.5098)	grad_norm 2.2502 (2.7047)	mem 20675MB
[2025-04-03 02:18:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][412/573]	eta 0:02:21 lr 0.000773	time 0.8775 (0.8815)	loss 0.4797 (0.5099)	grad_norm 2.0447 (2.7024)	mem 20675MB
[2025-04-03 02:18:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][414/573]	eta 0:02:20 lr 0.000773	time 0.8773 (0.8815)	loss 0.5569 (0.5101)	grad_norm 1.7395 (2.6988)	mem 20675MB
[2025-04-03 02:18:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][416/573]	eta 0:02:18 lr 0.000773	time 0.8770 (0.8815)	loss 0.4931 (0.5101)	grad_norm 2.5410 (2.6983)	mem 20675MB
[2025-04-03 02:18:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][418/573]	eta 0:02:16 lr 0.000772	time 0.8769 (0.8815)	loss 0.3240 (0.5098)	grad_norm 3.3811 (2.6985)	mem 20675MB
[2025-04-03 02:18:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][420/573]	eta 0:02:14 lr 0.000772	time 0.8772 (0.8815)	loss 0.5047 (0.5098)	grad_norm 2.1574 (2.6965)	mem 20675MB
[2025-04-03 02:18:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][422/573]	eta 0:02:13 lr 0.000772	time 0.8772 (0.8815)	loss 0.4827 (0.5098)	grad_norm 2.7088 (2.6978)	mem 20675MB
[2025-04-03 02:19:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][424/573]	eta 0:02:11 lr 0.000772	time 0.8773 (0.8814)	loss 0.5271 (0.5098)	grad_norm 2.0066 (2.6989)	mem 20675MB
[2025-04-03 02:19:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][426/573]	eta 0:02:09 lr 0.000771	time 0.8776 (0.8814)	loss 0.5060 (0.5099)	grad_norm 2.9733 (2.7007)	mem 20675MB
[2025-04-03 02:19:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][428/573]	eta 0:02:07 lr 0.000771	time 0.8771 (0.8814)	loss 0.5637 (0.5101)	grad_norm 2.0106 (2.6974)	mem 20675MB
[2025-04-03 02:19:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][430/573]	eta 0:02:06 lr 0.000771	time 0.8775 (0.8814)	loss 0.4378 (0.5099)	grad_norm 2.9074 (2.6978)	mem 20675MB
[2025-04-03 02:19:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][432/573]	eta 0:02:04 lr 0.000771	time 0.8772 (0.8814)	loss 0.5571 (0.5100)	grad_norm 1.8112 (2.6950)	mem 20675MB
[2025-04-03 02:19:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][434/573]	eta 0:02:02 lr 0.000771	time 0.8774 (0.8814)	loss 0.5221 (0.5100)	grad_norm 3.1475 (2.6955)	mem 20675MB
[2025-04-03 02:19:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][436/573]	eta 0:02:00 lr 0.000770	time 0.8778 (0.8813)	loss 0.4665 (0.5097)	grad_norm 3.0728 (2.6970)	mem 20675MB
[2025-04-03 02:19:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][438/573]	eta 0:01:58 lr 0.000770	time 0.8773 (0.8813)	loss 0.6079 (0.5101)	grad_norm 2.3611 (2.6956)	mem 20675MB
[2025-04-03 02:19:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][440/573]	eta 0:01:57 lr 0.000770	time 0.8771 (0.8813)	loss 0.5294 (0.5099)	grad_norm 2.9766 (2.6952)	mem 20675MB
[2025-04-03 02:19:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][442/573]	eta 0:01:55 lr 0.000770	time 0.8777 (0.8813)	loss 0.6506 (0.5101)	grad_norm 2.6383 (2.6953)	mem 20675MB
[2025-04-03 02:19:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][444/573]	eta 0:01:53 lr 0.000769	time 0.8773 (0.8813)	loss 0.4303 (0.5096)	grad_norm 2.7467 (2.7002)	mem 20675MB
[2025-04-03 02:19:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][446/573]	eta 0:01:51 lr 0.000769	time 0.8773 (0.8813)	loss 0.5683 (0.5097)	grad_norm 2.0257 (2.6991)	mem 20675MB
[2025-04-03 02:19:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][448/573]	eta 0:01:50 lr 0.000769	time 0.8774 (0.8813)	loss 0.6076 (0.5100)	grad_norm 2.2275 (2.6959)	mem 20675MB
[2025-04-03 02:19:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][450/573]	eta 0:01:48 lr 0.000769	time 0.8774 (0.8813)	loss 0.4885 (0.5096)	grad_norm 2.2783 (2.6946)	mem 20675MB
[2025-04-03 02:19:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][452/573]	eta 0:01:46 lr 0.000769	time 0.8770 (0.8812)	loss 0.4688 (0.5093)	grad_norm 3.0678 (2.6965)	mem 20675MB
[2025-04-03 02:19:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][454/573]	eta 0:01:44 lr 0.000768	time 0.8771 (0.8812)	loss 0.5028 (0.5091)	grad_norm 2.5449 (2.6980)	mem 20675MB
[2025-04-03 02:19:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][456/573]	eta 0:01:43 lr 0.000768	time 0.8774 (0.8812)	loss 0.4904 (0.5090)	grad_norm 3.1485 (2.6978)	mem 20675MB
[2025-04-03 02:19:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][458/573]	eta 0:01:41 lr 0.000768	time 0.8773 (0.8812)	loss 0.5102 (0.5089)	grad_norm 3.1251 (2.7045)	mem 20675MB
[2025-04-03 02:19:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][460/573]	eta 0:01:39 lr 0.000768	time 0.8775 (0.8812)	loss 0.5725 (0.5090)	grad_norm 3.4433 (2.7044)	mem 20675MB
[2025-04-03 02:19:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][462/573]	eta 0:01:37 lr 0.000767	time 0.8771 (0.8812)	loss 0.4769 (0.5089)	grad_norm 3.4013 (2.7046)	mem 20675MB
[2025-04-03 02:19:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][464/573]	eta 0:01:36 lr 0.000767	time 0.8774 (0.8812)	loss 0.5211 (0.5091)	grad_norm 2.3270 (2.7063)	mem 20675MB
[2025-04-03 02:19:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][466/573]	eta 0:01:34 lr 0.000767	time 0.8771 (0.8811)	loss 0.5319 (0.5092)	grad_norm 3.4341 (2.7076)	mem 20675MB
[2025-04-03 02:19:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][468/573]	eta 0:01:32 lr 0.000767	time 0.8772 (0.8811)	loss 0.5620 (0.5093)	grad_norm 2.0371 (2.7059)	mem 20675MB
[2025-04-03 02:19:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][470/573]	eta 0:01:30 lr 0.000767	time 0.8773 (0.8811)	loss 0.6544 (0.5098)	grad_norm 2.5339 (2.7054)	mem 20675MB
[2025-04-03 02:19:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][472/573]	eta 0:01:28 lr 0.000766	time 0.8776 (0.8811)	loss 0.4451 (0.5096)	grad_norm 2.2170 (2.7034)	mem 20675MB
[2025-04-03 02:19:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][474/573]	eta 0:01:27 lr 0.000766	time 0.8771 (0.8811)	loss 0.5748 (0.5097)	grad_norm 1.6069 (2.7010)	mem 20675MB
[2025-04-03 02:19:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][476/573]	eta 0:01:25 lr 0.000766	time 0.8775 (0.8811)	loss 0.5352 (0.5097)	grad_norm 1.9123 (2.6982)	mem 20675MB
[2025-04-03 02:19:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][478/573]	eta 0:01:23 lr 0.000766	time 0.8771 (0.8811)	loss 0.3900 (0.5095)	grad_norm 1.9035 (2.6950)	mem 20675MB
[2025-04-03 02:19:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][480/573]	eta 0:01:21 lr 0.000765	time 0.8775 (0.8811)	loss 0.5851 (0.5097)	grad_norm 1.6310 (2.6923)	mem 20675MB
[2025-04-03 02:19:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][482/573]	eta 0:01:20 lr 0.000765	time 0.8777 (0.8811)	loss 0.5610 (0.5096)	grad_norm 2.0098 (2.6941)	mem 20675MB
[2025-04-03 02:19:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][484/573]	eta 0:01:18 lr 0.000765	time 0.8776 (0.8810)	loss 0.5929 (0.5098)	grad_norm 2.7134 (2.6930)	mem 20675MB
[2025-04-03 02:19:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][486/573]	eta 0:01:16 lr 0.000765	time 0.8774 (0.8810)	loss 0.4358 (0.5098)	grad_norm 3.2134 (2.6931)	mem 20675MB
[2025-04-03 02:19:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][488/573]	eta 0:01:14 lr 0.000765	time 0.8771 (0.8810)	loss 0.5492 (0.5096)	grad_norm 2.2428 (2.6921)	mem 20675MB
[2025-04-03 02:19:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][490/573]	eta 0:01:13 lr 0.000764	time 0.8772 (0.8810)	loss 0.4566 (0.5095)	grad_norm 2.2814 (2.6913)	mem 20675MB
[2025-04-03 02:20:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][492/573]	eta 0:01:11 lr 0.000764	time 0.8772 (0.8810)	loss 0.4529 (0.5094)	grad_norm 2.5245 (2.6901)	mem 20675MB
[2025-04-03 02:20:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][494/573]	eta 0:01:09 lr 0.000764	time 0.8774 (0.8810)	loss 0.6033 (0.5096)	grad_norm 2.2008 (2.6886)	mem 20675MB
[2025-04-03 02:20:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][496/573]	eta 0:01:07 lr 0.000764	time 0.8776 (0.8810)	loss 0.4012 (0.5092)	grad_norm 3.4376 (2.6900)	mem 20675MB
[2025-04-03 02:20:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][498/573]	eta 0:01:06 lr 0.000763	time 0.8774 (0.8810)	loss 0.4458 (0.5093)	grad_norm 2.6474 (2.6897)	mem 20675MB
[2025-04-03 02:20:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][500/573]	eta 0:01:04 lr 0.000763	time 0.8775 (0.8810)	loss 0.5357 (0.5095)	grad_norm 1.8930 (2.6892)	mem 20675MB
[2025-04-03 02:20:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][502/573]	eta 0:01:02 lr 0.000763	time 0.8777 (0.8809)	loss 0.3620 (0.5094)	grad_norm 2.5477 (2.6878)	mem 20675MB
[2025-04-03 02:20:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][504/573]	eta 0:01:00 lr 0.000763	time 0.8778 (0.8809)	loss 0.5529 (0.5095)	grad_norm 1.9354 (2.6838)	mem 20675MB
[2025-04-03 02:20:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][506/573]	eta 0:00:59 lr 0.000763	time 0.8786 (0.8809)	loss 0.5101 (0.5096)	grad_norm 2.5073 (2.6832)	mem 20675MB
[2025-04-03 02:20:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][508/573]	eta 0:00:57 lr 0.000762	time 0.8775 (0.8809)	loss 0.6012 (0.5096)	grad_norm 2.2134 (2.6829)	mem 20675MB
[2025-04-03 02:20:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][510/573]	eta 0:00:55 lr 0.000762	time 0.8785 (0.8809)	loss 0.6357 (0.5099)	grad_norm 3.3881 (2.6834)	mem 20675MB
[2025-04-03 02:20:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][512/573]	eta 0:00:53 lr 0.000762	time 0.8773 (0.8809)	loss 0.5250 (0.5100)	grad_norm 1.9377 (2.6802)	mem 20675MB
[2025-04-03 02:20:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][514/573]	eta 0:00:51 lr 0.000762	time 0.8779 (0.8809)	loss 0.5378 (0.5099)	grad_norm 1.5410 (2.6772)	mem 20675MB
[2025-04-03 02:20:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][516/573]	eta 0:00:50 lr 0.000761	time 0.8772 (0.8809)	loss 0.5327 (0.5100)	grad_norm 1.7929 (2.6741)	mem 20675MB
[2025-04-03 02:20:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][518/573]	eta 0:00:48 lr 0.000761	time 0.8773 (0.8809)	loss 0.4909 (0.5100)	grad_norm 2.2883 (2.6726)	mem 20675MB
[2025-04-03 02:20:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][520/573]	eta 0:00:46 lr 0.000761	time 0.8774 (0.8809)	loss 0.4721 (0.5097)	grad_norm 2.5456 (2.6726)	mem 20675MB
[2025-04-03 02:20:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][522/573]	eta 0:00:44 lr 0.000761	time 0.8771 (0.8809)	loss 0.4007 (0.5092)	grad_norm 4.4305 (2.6767)	mem 20675MB
[2025-04-03 02:20:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][524/573]	eta 0:00:43 lr 0.000761	time 0.8772 (0.8808)	loss 0.6003 (0.5093)	grad_norm 2.4832 (2.6785)	mem 20675MB
[2025-04-03 02:20:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][526/573]	eta 0:00:41 lr 0.000760	time 0.8771 (0.8808)	loss 0.4595 (0.5090)	grad_norm 3.3178 (2.6828)	mem 20675MB
[2025-04-03 02:20:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][528/573]	eta 0:00:39 lr 0.000760	time 0.8771 (0.8808)	loss 0.5563 (0.5093)	grad_norm 3.5247 (2.6881)	mem 20675MB
[2025-04-03 02:20:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][530/573]	eta 0:00:37 lr 0.000760	time 0.8772 (0.8808)	loss 0.5358 (0.5093)	grad_norm 1.9260 (2.6866)	mem 20675MB
[2025-04-03 02:20:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][532/573]	eta 0:00:36 lr 0.000760	time 0.8777 (0.8808)	loss 0.3564 (0.5092)	grad_norm 2.7758 (2.6865)	mem 20675MB
[2025-04-03 02:20:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][534/573]	eta 0:00:34 lr 0.000759	time 0.8772 (0.8808)	loss 0.5482 (0.5093)	grad_norm 2.9182 (2.6858)	mem 20675MB
[2025-04-03 02:20:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][536/573]	eta 0:00:32 lr 0.000759	time 0.8774 (0.8808)	loss 0.5634 (0.5095)	grad_norm 1.8675 (2.6849)	mem 20675MB
[2025-04-03 02:20:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][538/573]	eta 0:00:30 lr 0.000759	time 0.8771 (0.8808)	loss 0.4172 (0.5091)	grad_norm 3.0667 (2.6865)	mem 20675MB
[2025-04-03 02:20:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][540/573]	eta 0:00:29 lr 0.000759	time 0.8772 (0.8808)	loss 0.4755 (0.5092)	grad_norm 2.5073 (2.6846)	mem 20675MB
[2025-04-03 02:20:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][542/573]	eta 0:00:27 lr 0.000759	time 0.8775 (0.8808)	loss 0.5389 (0.5092)	grad_norm 3.7790 (2.6854)	mem 20675MB
[2025-04-03 02:20:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][544/573]	eta 0:00:25 lr 0.000758	time 0.8774 (0.8807)	loss 0.4941 (0.5093)	grad_norm 2.5814 (2.6846)	mem 20675MB
[2025-04-03 02:20:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][546/573]	eta 0:00:23 lr 0.000758	time 0.8774 (0.8807)	loss 0.4398 (0.5092)	grad_norm 2.1301 (2.6829)	mem 20675MB
[2025-04-03 02:20:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][548/573]	eta 0:00:22 lr 0.000758	time 0.8773 (0.8807)	loss 0.6793 (0.5094)	grad_norm 5.1752 (2.6875)	mem 20675MB
[2025-04-03 02:20:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][550/573]	eta 0:00:20 lr 0.000758	time 0.8774 (0.8807)	loss 0.5709 (0.5096)	grad_norm 2.6540 (2.6883)	mem 20675MB
[2025-04-03 02:20:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][552/573]	eta 0:00:18 lr 0.000757	time 0.8772 (0.8807)	loss 0.3757 (0.5094)	grad_norm 2.6481 (2.6876)	mem 20675MB
[2025-04-03 02:20:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][554/573]	eta 0:00:16 lr 0.000757	time 0.8771 (0.8807)	loss 0.6060 (0.5093)	grad_norm 3.0492 (2.6894)	mem 20675MB
[2025-04-03 02:20:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][556/573]	eta 0:00:14 lr 0.000757	time 0.8771 (0.8807)	loss 0.5247 (0.5094)	grad_norm 3.1054 (2.6894)	mem 20675MB
[2025-04-03 02:20:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][558/573]	eta 0:00:13 lr 0.000757	time 0.8772 (0.8807)	loss 0.4853 (0.5092)	grad_norm 2.8052 (2.6886)	mem 20675MB
[2025-04-03 02:20:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][560/573]	eta 0:00:11 lr 0.000756	time 0.8771 (0.8807)	loss 0.4362 (0.5090)	grad_norm 3.4799 (2.6905)	mem 20675MB
[2025-04-03 02:21:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][562/573]	eta 0:00:09 lr 0.000756	time 0.8772 (0.8807)	loss 0.5243 (0.5090)	grad_norm 2.0423 (2.6881)	mem 20675MB
[2025-04-03 02:21:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][564/573]	eta 0:00:07 lr 0.000756	time 0.8769 (0.8807)	loss 0.4383 (0.5090)	grad_norm 3.0388 (2.6876)	mem 20675MB
[2025-04-03 02:21:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][566/573]	eta 0:00:06 lr 0.000756	time 0.8771 (0.8807)	loss 0.3602 (0.5086)	grad_norm 4.1477 (2.6915)	mem 20675MB
[2025-04-03 02:21:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][568/573]	eta 0:00:04 lr 0.000756	time 0.8771 (0.8806)	loss 0.3853 (0.5085)	grad_norm 3.9092 (2.6935)	mem 20675MB
[2025-04-03 02:21:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][570/573]	eta 0:00:02 lr 0.000755	time 0.8778 (0.8806)	loss 0.5601 (0.5087)	grad_norm 1.6309 (2.6927)	mem 20675MB
[2025-04-03 02:21:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][572/573]	eta 0:00:00 lr 0.000755	time 0.8769 (0.8806)	loss 0.3907 (0.5085)	grad_norm 1.8827 (2.6904)	mem 20675MB
[2025-04-03 02:21:10 simmim_finetune] (main_finetune.py 260): INFO EPOCH 12 training takes 0:08:24
[2025-04-03 02:21:12 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 2.110 (2.110)	Loss 0.5208 (0.5208)	Acc@1 69.531 (69.531)	Mem 20675MB
[2025-04-03 02:21:13 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.893)	Loss 0.4709 (0.4860)	Acc@1 74.219 (72.656)	Mem 20675MB
[2025-04-03 02:21:13 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.284 (0.649)	Loss 0.5299 (0.4899)	Acc@1 70.312 (72.500)	Mem 20675MB
[2025-04-03 02:21:14 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.545)	Loss 0.5166 (0.4887)	Acc@1 76.562 (73.549)	Mem 20675MB
[2025-04-03 02:21:14 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.487)	Loss 0.5439 (0.4847)	Acc@1 71.094 (74.132)	Mem 20675MB
[2025-04-03 02:21:15 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.450)	Loss 0.4721 (0.4853)	Acc@1 84.375 (75.071)	Mem 20675MB
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.424)	Loss 0.4869 (0.4857)	Acc@1 74.219 (75.000)	Mem 20675MB
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.406)	Loss 0.4481 (0.4820)	Acc@1 82.031 (75.885)	Mem 20675MB
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 75.857
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 75.9%
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 76.71%
[2025-04-03 02:21:16 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.8905881505727593e-06, 2.8905881505727593e-06, 4.393757088906068e-06, 4.393757088906068e-06, 6.706324686341927e-06, 6.706324686341927e-06, 1.0264120990089402e-05, 1.0264120990089402e-05, 1.573765376508552e-05, 1.573765376508552e-05, 2.41584734189257e-05, 2.41584734189257e-05, 3.711358057867982e-05, 3.711358057867982e-05, 5.704451467060922e-05, 5.704451467060922e-05, 8.770749019665447e-05, 8.770749019665447e-05, 0.00013488129869826257, 0.00013488129869826257, 0.00020745638870073652, 0.00020745638870073652, 0.0003191103733199272, 0.0003191103733199272, 0.0004908857342725283, 0.0004908857342725283, 0.0007551555203534531, 0.0007551555203534531]
[2025-04-03 02:21:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][0/573]	eta 0:23:58 lr 0.000755	time 2.5097 (2.5097)	loss 0.5121 (0.5121)	grad_norm 3.4176 (3.4176)	mem 20675MB
[2025-04-03 02:21:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][2/573]	eta 0:13:32 lr 0.000755	time 0.8776 (1.4222)	loss 0.4697 (0.4546)	grad_norm 2.0436 (2.7561)	mem 20675MB
[2025-04-03 02:21:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][4/573]	eta 0:11:25 lr 0.000755	time 0.8772 (1.2048)	loss 0.5391 (0.4894)	grad_norm 1.8593 (2.3436)	mem 20675MB
[2025-04-03 02:21:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][6/573]	eta 0:10:30 lr 0.000754	time 0.8771 (1.1114)	loss 0.5622 (0.5142)	grad_norm 3.2016 (2.4871)	mem 20675MB
[2025-04-03 02:21:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][8/573]	eta 0:09:58 lr 0.000754	time 0.8773 (1.0596)	loss 0.4763 (0.4956)	grad_norm 4.2249 (2.7569)	mem 20675MB
[2025-04-03 02:21:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][10/573]	eta 0:09:37 lr 0.000754	time 0.8771 (1.0266)	loss 0.4415 (0.4946)	grad_norm 4.0271 (2.7881)	mem 20675MB
[2025-04-03 02:21:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][12/573]	eta 0:09:23 lr 0.000754	time 0.8775 (1.0037)	loss 0.6004 (0.5084)	grad_norm 3.8405 (2.8133)	mem 20675MB
[2025-04-03 02:21:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][14/573]	eta 0:09:11 lr 0.000753	time 0.8776 (0.9871)	loss 0.6365 (0.5217)	grad_norm 2.3274 (2.7560)	mem 20675MB
[2025-04-03 02:21:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][16/573]	eta 0:09:02 lr 0.000753	time 0.8774 (0.9743)	loss 0.5046 (0.5258)	grad_norm 2.5910 (2.7322)	mem 20675MB
[2025-04-03 02:21:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][18/573]	eta 0:08:55 lr 0.000753	time 0.8774 (0.9642)	loss 0.5002 (0.5287)	grad_norm 2.0299 (2.6486)	mem 20675MB
[2025-04-03 02:21:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][20/573]	eta 0:08:48 lr 0.000753	time 0.8779 (0.9561)	loss 0.4800 (0.5299)	grad_norm 3.6732 (2.6798)	mem 20675MB
[2025-04-03 02:21:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][22/573]	eta 0:08:43 lr 0.000753	time 0.8772 (0.9493)	loss 0.5463 (0.5266)	grad_norm 2.7090 (2.7333)	mem 20675MB
[2025-04-03 02:21:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][24/573]	eta 0:08:38 lr 0.000752	time 0.8780 (0.9437)	loss 0.5387 (0.5252)	grad_norm 1.5898 (2.6585)	mem 20675MB
[2025-04-03 02:21:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][26/573]	eta 0:08:33 lr 0.000752	time 0.8774 (0.9389)	loss 0.4405 (0.5228)	grad_norm 2.9990 (2.6531)	mem 20675MB
[2025-04-03 02:21:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][28/573]	eta 0:08:29 lr 0.000752	time 0.8774 (0.9347)	loss 0.4176 (0.5204)	grad_norm 2.6232 (2.6643)	mem 20675MB
[2025-04-03 02:21:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][30/573]	eta 0:08:25 lr 0.000752	time 0.8778 (0.9311)	loss 0.6202 (0.5246)	grad_norm 3.0162 (2.6544)	mem 20675MB
[2025-04-03 02:21:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][32/573]	eta 0:08:21 lr 0.000751	time 0.8775 (0.9279)	loss 0.5352 (0.5258)	grad_norm 2.5002 (2.6406)	mem 20675MB
[2025-04-03 02:21:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][34/573]	eta 0:08:18 lr 0.000751	time 0.8777 (0.9250)	loss 0.5196 (0.5248)	grad_norm 2.8231 (2.6579)	mem 20675MB
[2025-04-03 02:21:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][36/573]	eta 0:08:15 lr 0.000751	time 0.8773 (0.9225)	loss 0.4088 (0.5228)	grad_norm 3.1612 (2.6675)	mem 20675MB
[2025-04-03 02:21:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][38/573]	eta 0:08:12 lr 0.000751	time 0.8777 (0.9203)	loss 0.3993 (0.5170)	grad_norm 3.7985 (2.7085)	mem 20675MB
[2025-04-03 02:21:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][40/573]	eta 0:08:09 lr 0.000751	time 0.8771 (0.9182)	loss 0.4606 (0.5166)	grad_norm 3.7153 (2.7347)	mem 20675MB
[2025-04-03 02:21:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][42/573]	eta 0:08:06 lr 0.000750	time 0.8776 (0.9164)	loss 0.4641 (0.5159)	grad_norm 3.2695 (2.7406)	mem 20675MB
[2025-04-03 02:21:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][44/573]	eta 0:08:03 lr 0.000750	time 0.8774 (0.9147)	loss 0.5256 (0.5164)	grad_norm 1.8407 (2.7345)	mem 20675MB
[2025-04-03 02:21:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][46/573]	eta 0:08:01 lr 0.000750	time 0.8777 (0.9131)	loss 0.4059 (0.5159)	grad_norm 4.8120 (2.7635)	mem 20675MB
[2025-04-03 02:22:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][48/573]	eta 0:07:58 lr 0.000750	time 0.8774 (0.9117)	loss 0.5703 (0.5169)	grad_norm 3.1622 (2.7461)	mem 20675MB
[2025-04-03 02:22:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][50/573]	eta 0:07:56 lr 0.000749	time 0.8778 (0.9104)	loss 0.3724 (0.5150)	grad_norm 3.2205 (2.7559)	mem 20675MB
[2025-04-03 02:22:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][52/573]	eta 0:07:53 lr 0.000749	time 0.8775 (0.9092)	loss 0.5306 (0.5145)	grad_norm 1.6484 (2.7302)	mem 20675MB
[2025-04-03 02:22:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][54/573]	eta 0:07:51 lr 0.000749	time 0.8771 (0.9081)	loss 0.4515 (0.5134)	grad_norm 4.1325 (2.7446)	mem 20675MB
[2025-04-03 02:22:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][56/573]	eta 0:07:48 lr 0.000749	time 0.8773 (0.9070)	loss 0.5466 (0.5149)	grad_norm 2.7386 (2.7393)	mem 20675MB
[2025-04-03 02:22:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][58/573]	eta 0:07:46 lr 0.000749	time 0.8776 (0.9061)	loss 0.4454 (0.5112)	grad_norm 2.3157 (2.7359)	mem 20675MB
[2025-04-03 02:22:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][60/573]	eta 0:07:44 lr 0.000748	time 0.8771 (0.9052)	loss 0.4358 (0.5105)	grad_norm 3.7432 (2.7495)	mem 20675MB
[2025-04-03 02:22:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][62/573]	eta 0:07:42 lr 0.000748	time 0.8771 (0.9043)	loss 0.5537 (0.5123)	grad_norm 2.2513 (2.7367)	mem 20675MB
[2025-04-03 02:22:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][64/573]	eta 0:07:39 lr 0.000748	time 0.8771 (0.9035)	loss 0.5884 (0.5131)	grad_norm 2.7858 (2.7264)	mem 20675MB
[2025-04-03 02:22:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][66/573]	eta 0:07:37 lr 0.000748	time 0.8773 (0.9027)	loss 0.4970 (0.5107)	grad_norm 2.4304 (2.7359)	mem 20675MB
[2025-04-03 02:22:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][68/573]	eta 0:07:35 lr 0.000747	time 0.8772 (0.9020)	loss 0.5269 (0.5106)	grad_norm 1.7586 (2.7318)	mem 20675MB
[2025-04-03 02:22:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][70/573]	eta 0:07:33 lr 0.000747	time 0.8773 (0.9014)	loss 0.5340 (0.5111)	grad_norm 3.4994 (2.7415)	mem 20675MB
[2025-04-03 02:22:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][72/573]	eta 0:07:31 lr 0.000747	time 0.8774 (0.9007)	loss 0.3900 (0.5087)	grad_norm 3.7368 (2.7549)	mem 20675MB
[2025-04-03 02:22:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][74/573]	eta 0:07:29 lr 0.000747	time 0.8772 (0.9001)	loss 0.3636 (0.5072)	grad_norm 4.7161 (2.7733)	mem 20675MB
[2025-04-03 02:22:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][76/573]	eta 0:07:27 lr 0.000747	time 0.8775 (0.8996)	loss 0.6259 (0.5098)	grad_norm 2.5165 (2.7630)	mem 20675MB
[2025-04-03 02:22:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][78/573]	eta 0:07:25 lr 0.000746	time 0.8772 (0.8990)	loss 0.5755 (0.5109)	grad_norm 2.4736 (2.7606)	mem 20675MB
[2025-04-03 02:22:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][80/573]	eta 0:07:22 lr 0.000746	time 0.8773 (0.8985)	loss 0.5882 (0.5127)	grad_norm 2.8889 (2.7736)	mem 20675MB
[2025-04-03 02:22:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][82/573]	eta 0:07:20 lr 0.000746	time 0.8786 (0.8980)	loss 0.4552 (0.5133)	grad_norm 3.2754 (2.7716)	mem 20675MB
[2025-04-03 02:22:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][84/573]	eta 0:07:18 lr 0.000746	time 0.8771 (0.8976)	loss 0.4034 (0.5115)	grad_norm 2.7226 (2.7681)	mem 20675MB
[2025-04-03 02:22:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][86/573]	eta 0:07:16 lr 0.000745	time 0.8769 (0.8971)	loss 0.4129 (0.5107)	grad_norm 3.6361 (2.7713)	mem 20675MB
[2025-04-03 02:22:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][88/573]	eta 0:07:14 lr 0.000745	time 0.8770 (0.8967)	loss 0.5027 (0.5102)	grad_norm 3.2833 (2.7758)	mem 20675MB
[2025-04-03 02:22:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][90/573]	eta 0:07:12 lr 0.000745	time 0.8772 (0.8963)	loss 0.4882 (0.5098)	grad_norm 2.0917 (2.7687)	mem 20675MB
[2025-04-03 02:22:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][92/573]	eta 0:07:10 lr 0.000745	time 0.8774 (0.8959)	loss 0.5712 (0.5113)	grad_norm 2.0183 (2.7637)	mem 20675MB
[2025-04-03 02:22:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][94/573]	eta 0:07:08 lr 0.000745	time 0.8770 (0.8955)	loss 0.5372 (0.5118)	grad_norm 3.2981 (2.7630)	mem 20675MB
[2025-04-03 02:22:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][96/573]	eta 0:07:06 lr 0.000744	time 0.8773 (0.8952)	loss 0.4275 (0.5116)	grad_norm 4.5926 (2.7981)	mem 20675MB
[2025-04-03 02:22:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][98/573]	eta 0:07:05 lr 0.000744	time 0.8774 (0.8948)	loss 0.5829 (0.5129)	grad_norm 2.8681 (2.8149)	mem 20675MB
[2025-04-03 02:22:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][100/573]	eta 0:07:03 lr 0.000744	time 0.8772 (0.8945)	loss 0.3666 (0.5120)	grad_norm 2.5975 (2.8083)	mem 20675MB
[2025-04-03 02:22:48 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][102/573]	eta 0:07:01 lr 0.000744	time 0.8773 (0.8942)	loss 0.5405 (0.5121)	grad_norm 2.4036 (2.7954)	mem 20675MB
[2025-04-03 02:22:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][104/573]	eta 0:06:59 lr 0.000743	time 0.8773 (0.8939)	loss 0.3311 (0.5106)	grad_norm 2.8821 (2.7892)	mem 20675MB
[2025-04-03 02:22:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][106/573]	eta 0:06:57 lr 0.000743	time 0.8771 (0.8936)	loss 0.4796 (0.5086)	grad_norm 2.6919 (2.7813)	mem 20675MB
[2025-04-03 02:22:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][108/573]	eta 0:06:55 lr 0.000743	time 0.8771 (0.8933)	loss 0.5557 (0.5074)	grad_norm 2.4030 (2.7843)	mem 20675MB
[2025-04-03 02:22:55 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][110/573]	eta 0:06:53 lr 0.000743	time 0.8778 (0.8930)	loss 0.3450 (0.5061)	grad_norm 3.7747 (2.7881)	mem 20675MB
[2025-04-03 02:22:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][112/573]	eta 0:06:51 lr 0.000743	time 0.8773 (0.8928)	loss 0.3967 (0.5054)	grad_norm 3.6203 (2.8137)	mem 20675MB
[2025-04-03 02:22:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][114/573]	eta 0:06:49 lr 0.000742	time 0.8773 (0.8925)	loss 0.4146 (0.5038)	grad_norm 4.2427 (2.8315)	mem 20675MB
[2025-04-03 02:23:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][116/573]	eta 0:06:47 lr 0.000742	time 0.8772 (0.8923)	loss 0.4372 (0.5032)	grad_norm 2.2309 (2.8331)	mem 20675MB
[2025-04-03 02:23:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][118/573]	eta 0:06:45 lr 0.000742	time 0.8778 (0.8921)	loss 0.6044 (0.5039)	grad_norm 3.7406 (2.8403)	mem 20675MB
[2025-04-03 02:23:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][120/573]	eta 0:06:43 lr 0.000742	time 0.8771 (0.8918)	loss 0.4579 (0.5041)	grad_norm 1.8814 (2.8288)	mem 20675MB
[2025-04-03 02:23:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][122/573]	eta 0:06:42 lr 0.000741	time 0.8775 (0.8916)	loss 0.5113 (0.5029)	grad_norm 2.7774 (2.8432)	mem 20675MB
[2025-04-03 02:23:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][124/573]	eta 0:06:40 lr 0.000741	time 0.8774 (0.8914)	loss 0.4753 (0.5029)	grad_norm 1.6957 (2.8265)	mem 20675MB
[2025-04-03 02:23:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][126/573]	eta 0:06:38 lr 0.000741	time 0.8780 (0.8912)	loss 0.5764 (0.5036)	grad_norm 1.3919 (2.8066)	mem 20675MB
[2025-04-03 02:23:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][128/573]	eta 0:06:36 lr 0.000741	time 0.8772 (0.8910)	loss 0.5818 (0.5044)	grad_norm 2.3679 (2.8032)	mem 20675MB
[2025-04-03 02:23:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][130/573]	eta 0:06:34 lr 0.000740	time 0.8770 (0.8908)	loss 0.4825 (0.5048)	grad_norm 2.3162 (2.7909)	mem 20675MB
[2025-04-03 02:23:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][132/573]	eta 0:06:32 lr 0.000740	time 0.8772 (0.8906)	loss 0.5355 (0.5048)	grad_norm 2.1198 (2.7865)	mem 20675MB
[2025-04-03 02:23:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][134/573]	eta 0:06:30 lr 0.000740	time 0.8776 (0.8904)	loss 0.4180 (0.5046)	grad_norm 2.9257 (2.7845)	mem 20675MB
[2025-04-03 02:23:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][136/573]	eta 0:06:29 lr 0.000740	time 0.8774 (0.8903)	loss 0.6716 (0.5058)	grad_norm 4.0951 (2.7935)	mem 20675MB
[2025-04-03 02:23:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][138/573]	eta 0:06:27 lr 0.000740	time 0.8773 (0.8901)	loss 0.6000 (0.5071)	grad_norm 1.9472 (2.7816)	mem 20675MB
[2025-04-03 02:23:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][140/573]	eta 0:06:25 lr 0.000739	time 0.8772 (0.8899)	loss 0.4940 (0.5061)	grad_norm 3.3572 (2.7860)	mem 20675MB
[2025-04-03 02:23:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][142/573]	eta 0:06:23 lr 0.000739	time 0.8776 (0.8897)	loss 0.4759 (0.5061)	grad_norm 2.2754 (2.7802)	mem 20675MB
[2025-04-03 02:23:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][144/573]	eta 0:06:21 lr 0.000739	time 0.8775 (0.8896)	loss 0.4969 (0.5054)	grad_norm 2.3612 (2.7789)	mem 20675MB
[2025-04-03 02:23:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][146/573]	eta 0:06:19 lr 0.000739	time 0.8775 (0.8894)	loss 0.4381 (0.5054)	grad_norm 3.3001 (2.7891)	mem 20675MB
[2025-04-03 02:23:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][148/573]	eta 0:06:17 lr 0.000738	time 0.8770 (0.8893)	loss 0.4001 (0.5053)	grad_norm 3.7720 (2.7912)	mem 20675MB
[2025-04-03 02:23:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][150/573]	eta 0:06:16 lr 0.000738	time 0.8778 (0.8892)	loss 0.5130 (0.5052)	grad_norm 2.2579 (2.7819)	mem 20675MB
[2025-04-03 02:23:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][152/573]	eta 0:06:14 lr 0.000738	time 0.8793 (0.8890)	loss 0.4178 (0.5052)	grad_norm 2.4821 (2.7752)	mem 20675MB
[2025-04-03 02:23:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][154/573]	eta 0:06:12 lr 0.000738	time 0.8773 (0.8889)	loss 0.5380 (0.5044)	grad_norm 2.0233 (2.7689)	mem 20675MB
[2025-04-03 02:23:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][156/573]	eta 0:06:10 lr 0.000738	time 0.8776 (0.8888)	loss 0.5449 (0.5047)	grad_norm 3.5349 (2.7705)	mem 20675MB
[2025-04-03 02:23:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][158/573]	eta 0:06:08 lr 0.000737	time 0.8777 (0.8886)	loss 0.4656 (0.5045)	grad_norm 3.4248 (2.7701)	mem 20675MB
[2025-04-03 02:23:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][160/573]	eta 0:06:06 lr 0.000737	time 0.8775 (0.8885)	loss 0.5784 (0.5054)	grad_norm 3.3351 (2.7807)	mem 20675MB
[2025-04-03 02:23:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][162/573]	eta 0:06:05 lr 0.000737	time 0.8774 (0.8884)	loss 0.3981 (0.5050)	grad_norm 3.1839 (2.7840)	mem 20675MB
[2025-04-03 02:23:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][164/573]	eta 0:06:03 lr 0.000737	time 0.8774 (0.8883)	loss 0.6061 (0.5053)	grad_norm 2.0233 (2.7823)	mem 20675MB
[2025-04-03 02:23:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][166/573]	eta 0:06:01 lr 0.000736	time 0.8773 (0.8881)	loss 0.3461 (0.5035)	grad_norm 2.9582 (2.7833)	mem 20675MB
[2025-04-03 02:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][168/573]	eta 0:05:59 lr 0.000736	time 0.8778 (0.8880)	loss 0.5549 (0.5044)	grad_norm 2.0453 (2.7800)	mem 20675MB
[2025-04-03 02:23:48 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][170/573]	eta 0:05:57 lr 0.000736	time 0.8775 (0.8879)	loss 0.3790 (0.5041)	grad_norm 2.7724 (2.7831)	mem 20675MB
[2025-04-03 02:23:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][172/573]	eta 0:05:56 lr 0.000736	time 0.8772 (0.8878)	loss 0.4772 (0.5042)	grad_norm 2.0572 (2.7741)	mem 20675MB
[2025-04-03 02:23:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][174/573]	eta 0:05:54 lr 0.000736	time 0.8772 (0.8877)	loss 0.5286 (0.5047)	grad_norm 1.9720 (2.7649)	mem 20675MB
[2025-04-03 02:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][176/573]	eta 0:05:52 lr 0.000735	time 0.8774 (0.8876)	loss 0.3505 (0.5043)	grad_norm 3.1702 (2.7620)	mem 20675MB
[2025-04-03 02:23:55 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][178/573]	eta 0:05:50 lr 0.000735	time 0.8774 (0.8875)	loss 0.5946 (0.5050)	grad_norm 2.9122 (2.7611)	mem 20675MB
[2025-04-03 02:23:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][180/573]	eta 0:05:48 lr 0.000735	time 0.8773 (0.8874)	loss 0.5760 (0.5058)	grad_norm 2.7042 (2.7569)	mem 20675MB
[2025-04-03 02:23:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][182/573]	eta 0:05:46 lr 0.000735	time 0.8773 (0.8873)	loss 0.6202 (0.5062)	grad_norm 2.7587 (2.7598)	mem 20675MB
[2025-04-03 02:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][184/573]	eta 0:05:45 lr 0.000734	time 0.8774 (0.8872)	loss 0.5595 (0.5072)	grad_norm 2.1569 (2.7596)	mem 20675MB
[2025-04-03 02:24:02 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][186/573]	eta 0:05:43 lr 0.000734	time 0.8772 (0.8871)	loss 0.3962 (0.5065)	grad_norm 2.8239 (2.7556)	mem 20675MB
[2025-04-03 02:24:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][188/573]	eta 0:05:41 lr 0.000734	time 0.8773 (0.8870)	loss 0.3632 (0.5055)	grad_norm 5.1803 (2.7674)	mem 20675MB
[2025-04-03 02:24:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][190/573]	eta 0:05:39 lr 0.000734	time 0.8774 (0.8869)	loss 0.4904 (0.5055)	grad_norm 2.7830 (2.7635)	mem 20675MB
[2025-04-03 02:24:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][192/573]	eta 0:05:37 lr 0.000734	time 0.8772 (0.8868)	loss 0.5784 (0.5056)	grad_norm 2.1547 (2.7565)	mem 20675MB
[2025-04-03 02:24:09 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][194/573]	eta 0:05:36 lr 0.000733	time 0.8770 (0.8867)	loss 0.4503 (0.5051)	grad_norm 2.4683 (2.7567)	mem 20675MB
[2025-04-03 02:24:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][196/573]	eta 0:05:34 lr 0.000733	time 0.8770 (0.8866)	loss 0.5232 (0.5051)	grad_norm 2.8931 (2.7555)	mem 20675MB
[2025-04-03 02:24:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][198/573]	eta 0:05:32 lr 0.000733	time 0.8772 (0.8866)	loss 0.4898 (0.5057)	grad_norm 2.1593 (2.7561)	mem 20675MB
[2025-04-03 02:24:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][200/573]	eta 0:05:30 lr 0.000733	time 0.8772 (0.8865)	loss 0.5267 (0.5058)	grad_norm 1.7169 (2.7505)	mem 20675MB
[2025-04-03 02:24:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][202/573]	eta 0:05:28 lr 0.000732	time 0.8771 (0.8864)	loss 0.4998 (0.5059)	grad_norm 2.5842 (2.7516)	mem 20675MB
[2025-04-03 02:24:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][204/573]	eta 0:05:27 lr 0.000732	time 0.8771 (0.8863)	loss 0.3713 (0.5050)	grad_norm 2.3708 (2.7506)	mem 20675MB
[2025-04-03 02:24:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][206/573]	eta 0:05:25 lr 0.000732	time 0.8775 (0.8862)	loss 0.6348 (0.5059)	grad_norm 2.1488 (2.7550)	mem 20675MB
[2025-04-03 02:24:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][208/573]	eta 0:05:23 lr 0.000732	time 0.8782 (0.8862)	loss 0.4827 (0.5060)	grad_norm 3.0220 (2.7540)	mem 20675MB
[2025-04-03 02:24:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][210/573]	eta 0:05:21 lr 0.000731	time 0.8774 (0.8861)	loss 0.5608 (0.5064)	grad_norm 2.5477 (2.7527)	mem 20675MB
[2025-04-03 02:24:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][212/573]	eta 0:05:19 lr 0.000731	time 0.8774 (0.8860)	loss 0.5500 (0.5062)	grad_norm 3.4500 (2.7535)	mem 20675MB
[2025-04-03 02:24:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][214/573]	eta 0:05:18 lr 0.000731	time 0.8775 (0.8859)	loss 0.6012 (0.5072)	grad_norm 1.8786 (2.7487)	mem 20675MB
[2025-04-03 02:24:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][216/573]	eta 0:05:16 lr 0.000731	time 0.8772 (0.8859)	loss 0.6058 (0.5081)	grad_norm 3.0681 (2.7483)	mem 20675MB
[2025-04-03 02:24:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][218/573]	eta 0:05:14 lr 0.000731	time 0.8773 (0.8858)	loss 0.5089 (0.5082)	grad_norm 1.8680 (2.7407)	mem 20675MB
[2025-04-03 02:24:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][220/573]	eta 0:05:12 lr 0.000730	time 0.8774 (0.8857)	loss 0.5633 (0.5083)	grad_norm 1.6424 (2.7313)	mem 20675MB
[2025-04-03 02:24:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][222/573]	eta 0:05:10 lr 0.000730	time 0.8771 (0.8857)	loss 0.4823 (0.5076)	grad_norm 2.1258 (2.7298)	mem 20675MB
[2025-04-03 02:24:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][224/573]	eta 0:05:09 lr 0.000730	time 0.8770 (0.8856)	loss 0.5674 (0.5077)	grad_norm 2.3221 (2.7297)	mem 20675MB
[2025-04-03 02:24:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][226/573]	eta 0:05:07 lr 0.000730	time 0.8776 (0.8855)	loss 0.4093 (0.5077)	grad_norm 3.9385 (2.7315)	mem 20675MB
[2025-04-03 02:24:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][228/573]	eta 0:05:05 lr 0.000729	time 0.8774 (0.8855)	loss 0.3863 (0.5072)	grad_norm 2.8222 (2.7303)	mem 20675MB
[2025-04-03 02:24:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][230/573]	eta 0:05:03 lr 0.000729	time 0.8772 (0.8854)	loss 0.4962 (0.5076)	grad_norm 1.7399 (2.7331)	mem 20675MB
[2025-04-03 02:24:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][232/573]	eta 0:05:01 lr 0.000729	time 0.8774 (0.8853)	loss 0.5645 (0.5078)	grad_norm 3.0933 (2.7385)	mem 20675MB
[2025-04-03 02:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][234/573]	eta 0:05:00 lr 0.000729	time 0.8777 (0.8853)	loss 0.5736 (0.5078)	grad_norm 1.9187 (2.7338)	mem 20675MB
[2025-04-03 02:24:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][236/573]	eta 0:04:58 lr 0.000729	time 0.8772 (0.8852)	loss 0.5469 (0.5079)	grad_norm 2.5485 (2.7314)	mem 20675MB
[2025-04-03 02:24:48 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][238/573]	eta 0:04:56 lr 0.000728	time 0.8774 (0.8852)	loss 0.4385 (0.5074)	grad_norm 4.0127 (2.7385)	mem 20675MB
[2025-04-03 02:24:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][240/573]	eta 0:04:54 lr 0.000728	time 0.8774 (0.8851)	loss 0.5263 (0.5078)	grad_norm 2.4649 (2.7366)	mem 20675MB
[2025-04-03 02:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][242/573]	eta 0:04:52 lr 0.000728	time 0.8774 (0.8851)	loss 0.4883 (0.5079)	grad_norm 2.5111 (2.7334)	mem 20675MB
[2025-04-03 02:24:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][244/573]	eta 0:04:51 lr 0.000728	time 0.8771 (0.8850)	loss 0.5722 (0.5081)	grad_norm 2.4851 (2.7353)	mem 20675MB
[2025-04-03 02:24:55 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][246/573]	eta 0:04:49 lr 0.000727	time 0.8772 (0.8849)	loss 0.5422 (0.5085)	grad_norm 3.0147 (2.7362)	mem 20675MB
[2025-04-03 02:24:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][248/573]	eta 0:04:47 lr 0.000727	time 0.8778 (0.8849)	loss 0.5640 (0.5087)	grad_norm 2.9896 (2.7358)	mem 20675MB
[2025-04-03 02:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][250/573]	eta 0:04:45 lr 0.000727	time 0.8774 (0.8848)	loss 0.5760 (0.5088)	grad_norm 2.0357 (2.7314)	mem 20675MB
[2025-04-03 02:25:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][252/573]	eta 0:04:44 lr 0.000727	time 0.8775 (0.8848)	loss 0.5900 (0.5095)	grad_norm 2.2446 (2.7270)	mem 20675MB
[2025-04-03 02:25:02 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][254/573]	eta 0:04:42 lr 0.000727	time 0.8773 (0.8847)	loss 0.4261 (0.5098)	grad_norm 4.0488 (2.7310)	mem 20675MB
[2025-04-03 02:25:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][256/573]	eta 0:04:40 lr 0.000726	time 0.8772 (0.8847)	loss 0.5118 (0.5102)	grad_norm 1.7335 (2.7260)	mem 20675MB
[2025-04-03 02:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][258/573]	eta 0:04:38 lr 0.000726	time 0.8784 (0.8846)	loss 0.5615 (0.5103)	grad_norm 2.4723 (2.7207)	mem 20675MB
[2025-04-03 02:25:07 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][260/573]	eta 0:04:36 lr 0.000726	time 0.8773 (0.8846)	loss 0.5701 (0.5105)	grad_norm 1.8658 (2.7142)	mem 20675MB
[2025-04-03 02:25:09 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][262/573]	eta 0:04:35 lr 0.000726	time 0.8774 (0.8845)	loss 0.4257 (0.5099)	grad_norm 2.9543 (2.7138)	mem 20675MB
[2025-04-03 02:25:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][264/573]	eta 0:04:33 lr 0.000725	time 0.8774 (0.8845)	loss 0.4686 (0.5094)	grad_norm 2.9188 (2.7160)	mem 20675MB
[2025-04-03 02:25:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][266/573]	eta 0:04:31 lr 0.000725	time 0.8780 (0.8844)	loss 0.4316 (0.5090)	grad_norm 3.2947 (2.7158)	mem 20675MB
[2025-04-03 02:25:14 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][268/573]	eta 0:04:29 lr 0.000725	time 0.8773 (0.8844)	loss 0.3677 (0.5086)	grad_norm 4.5751 (2.7221)	mem 20675MB
[2025-04-03 02:25:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][270/573]	eta 0:04:27 lr 0.000725	time 0.8773 (0.8844)	loss 0.3274 (0.5074)	grad_norm 2.6028 (2.7249)	mem 20675MB
[2025-04-03 02:25:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][272/573]	eta 0:04:26 lr 0.000725	time 0.8771 (0.8843)	loss 0.4205 (0.5066)	grad_norm 3.4192 (2.7292)	mem 20675MB
[2025-04-03 02:25:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][274/573]	eta 0:04:24 lr 0.000724	time 0.8774 (0.8843)	loss 0.3789 (0.5061)	grad_norm 4.3462 (2.7389)	mem 20675MB
[2025-04-03 02:25:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][276/573]	eta 0:04:22 lr 0.000724	time 0.8773 (0.8842)	loss 0.5564 (0.5065)	grad_norm 2.6439 (2.7410)	mem 20675MB
[2025-04-03 02:25:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][278/573]	eta 0:04:20 lr 0.000724	time 0.8774 (0.8842)	loss 0.5823 (0.5069)	grad_norm 2.5268 (2.7538)	mem 20675MB
[2025-04-03 02:25:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][280/573]	eta 0:04:19 lr 0.000724	time 0.8773 (0.8841)	loss 0.5644 (0.5069)	grad_norm 2.6104 (2.7545)	mem 20675MB
[2025-04-03 02:25:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][282/573]	eta 0:04:17 lr 0.000723	time 0.8773 (0.8841)	loss 0.5113 (0.5071)	grad_norm 2.7083 (2.7544)	mem 20675MB
[2025-04-03 02:25:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][284/573]	eta 0:04:15 lr 0.000723	time 0.8771 (0.8841)	loss 0.4658 (0.5065)	grad_norm 2.7831 (2.7566)	mem 20675MB
[2025-04-03 02:25:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][286/573]	eta 0:04:13 lr 0.000723	time 0.8771 (0.8840)	loss 0.4921 (0.5064)	grad_norm 1.9244 (2.7515)	mem 20675MB
[2025-04-03 02:25:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][288/573]	eta 0:04:11 lr 0.000723	time 0.8772 (0.8840)	loss 0.5665 (0.5070)	grad_norm 4.4833 (2.7563)	mem 20675MB
[2025-04-03 02:25:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][290/573]	eta 0:04:10 lr 0.000722	time 0.8774 (0.8839)	loss 0.3590 (0.5068)	grad_norm 2.6807 (2.7543)	mem 20675MB
[2025-04-03 02:25:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][292/573]	eta 0:04:08 lr 0.000722	time 0.8772 (0.8839)	loss 0.5560 (0.5068)	grad_norm 1.5356 (2.7477)	mem 20675MB
[2025-04-03 02:25:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][294/573]	eta 0:04:06 lr 0.000722	time 0.8771 (0.8839)	loss 0.3939 (0.5065)	grad_norm 2.9425 (2.7482)	mem 20675MB
[2025-04-03 02:25:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][296/573]	eta 0:04:04 lr 0.000722	time 0.8773 (0.8838)	loss 0.6013 (0.5066)	grad_norm 2.7712 (2.7481)	mem 20675MB
[2025-04-03 02:25:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][298/573]	eta 0:04:03 lr 0.000722	time 0.8774 (0.8838)	loss 0.4990 (0.5066)	grad_norm 2.0235 (2.7439)	mem 20675MB
[2025-04-03 02:25:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][300/573]	eta 0:04:01 lr 0.000721	time 0.8772 (0.8837)	loss 0.4635 (0.5063)	grad_norm 4.1017 (2.7546)	mem 20675MB
[2025-04-03 02:25:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][302/573]	eta 0:03:59 lr 0.000721	time 0.8772 (0.8837)	loss 0.5397 (0.5062)	grad_norm 1.7320 (2.7573)	mem 20675MB
[2025-04-03 02:25:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][304/573]	eta 0:03:57 lr 0.000721	time 0.8773 (0.8837)	loss 0.5937 (0.5067)	grad_norm 2.8473 (2.7543)	mem 20675MB
[2025-04-03 02:25:48 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][306/573]	eta 0:03:55 lr 0.000721	time 0.8770 (0.8836)	loss 0.4444 (0.5061)	grad_norm 3.6916 (2.7625)	mem 20675MB
[2025-04-03 02:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][308/573]	eta 0:03:54 lr 0.000720	time 0.8772 (0.8836)	loss 0.4794 (0.5060)	grad_norm 3.1134 (2.7628)	mem 20675MB
[2025-04-03 02:25:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][310/573]	eta 0:03:52 lr 0.000720	time 0.8774 (0.8836)	loss 0.5030 (0.5062)	grad_norm 2.0208 (2.7644)	mem 20675MB
[2025-04-03 02:25:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][312/573]	eta 0:03:50 lr 0.000720	time 0.8772 (0.8835)	loss 0.5050 (0.5063)	grad_norm 1.9611 (2.7629)	mem 20675MB
[2025-04-03 02:25:55 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][314/573]	eta 0:03:48 lr 0.000720	time 0.8773 (0.8835)	loss 0.6232 (0.5062)	grad_norm 1.8964 (2.7625)	mem 20675MB
[2025-04-03 02:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][316/573]	eta 0:03:47 lr 0.000720	time 0.8771 (0.8835)	loss 0.5014 (0.5061)	grad_norm 1.8216 (2.7597)	mem 20675MB
[2025-04-03 02:25:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][318/573]	eta 0:03:45 lr 0.000719	time 0.8774 (0.8834)	loss 0.4816 (0.5062)	grad_norm 3.2629 (2.7588)	mem 20675MB
[2025-04-03 02:26:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][320/573]	eta 0:03:43 lr 0.000719	time 0.8773 (0.8834)	loss 0.3985 (0.5058)	grad_norm 4.1402 (2.7623)	mem 20675MB
[2025-04-03 02:26:02 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][322/573]	eta 0:03:41 lr 0.000719	time 0.8775 (0.8834)	loss 0.5847 (0.5062)	grad_norm 2.8022 (2.7593)	mem 20675MB
[2025-04-03 02:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][324/573]	eta 0:03:39 lr 0.000719	time 0.8774 (0.8833)	loss 0.4539 (0.5056)	grad_norm 3.1463 (2.7646)	mem 20675MB
[2025-04-03 02:26:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][326/573]	eta 0:03:38 lr 0.000718	time 0.8780 (0.8833)	loss 0.5946 (0.5062)	grad_norm 2.3447 (2.7642)	mem 20675MB
[2025-04-03 02:26:07 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][328/573]	eta 0:03:36 lr 0.000718	time 0.8772 (0.8833)	loss 0.4306 (0.5061)	grad_norm 4.0724 (2.7656)	mem 20675MB
[2025-04-03 02:26:09 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][330/573]	eta 0:03:34 lr 0.000718	time 0.8776 (0.8833)	loss 0.5151 (0.5059)	grad_norm 3.1577 (2.7669)	mem 20675MB
[2025-04-03 02:26:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][332/573]	eta 0:03:32 lr 0.000718	time 0.8775 (0.8832)	loss 0.5473 (0.5060)	grad_norm 2.1477 (2.7643)	mem 20675MB
[2025-04-03 02:26:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][334/573]	eta 0:03:31 lr 0.000718	time 0.8775 (0.8832)	loss 0.4658 (0.5058)	grad_norm 1.8276 (2.7685)	mem 20675MB
[2025-04-03 02:26:14 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][336/573]	eta 0:03:29 lr 0.000717	time 0.8773 (0.8832)	loss 0.4257 (0.5057)	grad_norm 2.9131 (2.7720)	mem 20675MB
[2025-04-03 02:26:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][338/573]	eta 0:03:27 lr 0.000717	time 0.8772 (0.8831)	loss 0.4843 (0.5060)	grad_norm 9.6073 (2.7920)	mem 20675MB
[2025-04-03 02:26:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][340/573]	eta 0:03:25 lr 0.000717	time 0.8775 (0.8831)	loss 0.6138 (0.5064)	grad_norm 2.8429 (2.7925)	mem 20675MB
[2025-04-03 02:26:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][342/573]	eta 0:03:23 lr 0.000717	time 0.8775 (0.8831)	loss 0.4772 (0.5060)	grad_norm 2.6856 (2.7920)	mem 20675MB
[2025-04-03 02:26:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][344/573]	eta 0:03:22 lr 0.000716	time 0.8774 (0.8831)	loss 0.6127 (0.5062)	grad_norm 2.0001 (2.7937)	mem 20675MB
[2025-04-03 02:26:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][346/573]	eta 0:03:20 lr 0.000716	time 0.8774 (0.8830)	loss 0.5314 (0.5064)	grad_norm 3.2704 (2.7939)	mem 20675MB
[2025-04-03 02:26:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][348/573]	eta 0:03:18 lr 0.000716	time 0.8774 (0.8830)	loss 0.4642 (0.5063)	grad_norm 4.3181 (2.7974)	mem 20675MB
[2025-04-03 02:26:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][350/573]	eta 0:03:16 lr 0.000716	time 0.8772 (0.8830)	loss 0.3628 (0.5056)	grad_norm 5.8447 (2.8086)	mem 20675MB
[2025-04-03 02:26:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][352/573]	eta 0:03:15 lr 0.000715	time 0.8775 (0.8830)	loss 0.5048 (0.5057)	grad_norm 3.0012 (2.8085)	mem 20675MB
[2025-04-03 02:26:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][354/573]	eta 0:03:13 lr 0.000715	time 0.8775 (0.8829)	loss 0.5911 (0.5061)	grad_norm 3.2485 (2.8078)	mem 20675MB
[2025-04-03 02:26:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][356/573]	eta 0:03:11 lr 0.000715	time 0.8771 (0.8829)	loss 0.4711 (0.5061)	grad_norm 3.1743 (2.8095)	mem 20675MB
[2025-04-03 02:26:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][358/573]	eta 0:03:09 lr 0.000715	time 0.8773 (0.8829)	loss 0.5446 (0.5063)	grad_norm 2.6201 (2.8094)	mem 20675MB
[2025-04-03 02:26:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][360/573]	eta 0:03:08 lr 0.000715	time 0.8772 (0.8828)	loss 0.5066 (0.5059)	grad_norm 2.5673 (2.8133)	mem 20675MB
[2025-04-03 02:26:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][362/573]	eta 0:03:06 lr 0.000714	time 0.8776 (0.8828)	loss 0.5176 (0.5060)	grad_norm 7.1159 (2.8373)	mem 20675MB
[2025-04-03 02:26:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][364/573]	eta 0:03:04 lr 0.000714	time 0.8774 (0.8828)	loss 0.4674 (0.5059)	grad_norm 4.5732 (2.8697)	mem 20675MB
[2025-04-03 02:26:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][366/573]	eta 0:03:02 lr 0.000714	time 0.8774 (0.8828)	loss 0.6058 (0.5065)	grad_norm 2.2290 (2.8684)	mem 20675MB
[2025-04-03 02:26:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][368/573]	eta 0:03:00 lr 0.000714	time 0.8773 (0.8827)	loss 0.5131 (0.5065)	grad_norm 2.6603 (2.8652)	mem 20675MB
[2025-04-03 02:26:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][370/573]	eta 0:02:59 lr 0.000713	time 0.8772 (0.8827)	loss 0.4447 (0.5062)	grad_norm 2.8760 (2.8639)	mem 20675MB
[2025-04-03 02:26:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][372/573]	eta 0:02:57 lr 0.000713	time 0.8778 (0.8827)	loss 0.3803 (0.5060)	grad_norm 5.9214 (2.8700)	mem 20675MB
[2025-04-03 02:26:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][374/573]	eta 0:02:55 lr 0.000713	time 0.8773 (0.8827)	loss 0.5490 (0.5061)	grad_norm 2.3854 (2.8675)	mem 20675MB
[2025-04-03 02:26:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][376/573]	eta 0:02:53 lr 0.000713	time 0.8771 (0.8827)	loss 0.5737 (0.5059)	grad_norm 2.2136 (2.8692)	mem 20675MB
[2025-04-03 02:26:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][378/573]	eta 0:02:52 lr 0.000713	time 0.8775 (0.8826)	loss 0.5953 (0.5060)	grad_norm 2.7065 (2.8703)	mem 20675MB
[2025-04-03 02:26:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][380/573]	eta 0:02:50 lr 0.000712	time 0.8772 (0.8826)	loss 0.6004 (0.5065)	grad_norm 2.4972 (2.8694)	mem 20675MB
[2025-04-03 02:26:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][382/573]	eta 0:02:48 lr 0.000712	time 0.8776 (0.8826)	loss 0.3969 (0.5061)	grad_norm 4.0259 (2.8716)	mem 20675MB
[2025-04-03 02:26:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][384/573]	eta 0:02:46 lr 0.000712	time 0.8776 (0.8826)	loss 0.5596 (0.5060)	grad_norm 2.5130 (2.8773)	mem 20675MB
[2025-04-03 02:26:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][386/573]	eta 0:02:45 lr 0.000712	time 0.8774 (0.8825)	loss 0.5904 (0.5065)	grad_norm 3.3467 (2.8794)	mem 20675MB
[2025-04-03 02:27:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][388/573]	eta 0:02:43 lr 0.000711	time 0.8774 (0.8825)	loss 0.4150 (0.5066)	grad_norm 2.5063 (2.8796)	mem 20675MB
[2025-04-03 02:27:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][390/573]	eta 0:02:41 lr 0.000711	time 0.8773 (0.8825)	loss 0.3508 (0.5060)	grad_norm 2.9949 (2.8780)	mem 20675MB
[2025-04-03 02:27:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][392/573]	eta 0:02:39 lr 0.000711	time 0.8776 (0.8825)	loss 0.6574 (0.5066)	grad_norm 3.3215 (2.8769)	mem 20675MB
[2025-04-03 02:27:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][394/573]	eta 0:02:37 lr 0.000711	time 0.8774 (0.8825)	loss 0.5727 (0.5069)	grad_norm 1.6577 (2.8716)	mem 20675MB
[2025-04-03 02:27:07 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][396/573]	eta 0:02:36 lr 0.000711	time 0.8774 (0.8824)	loss 0.3895 (0.5067)	grad_norm 2.6176 (2.8683)	mem 20675MB
[2025-04-03 02:27:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][398/573]	eta 0:02:34 lr 0.000710	time 0.8774 (0.8824)	loss 0.5734 (0.5070)	grad_norm 1.8841 (2.8666)	mem 20675MB
[2025-04-03 02:27:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][400/573]	eta 0:02:32 lr 0.000710	time 0.8773 (0.8824)	loss 0.5357 (0.5073)	grad_norm 1.7823 (2.8622)	mem 20675MB
[2025-04-03 02:27:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][402/573]	eta 0:02:30 lr 0.000710	time 0.8773 (0.8824)	loss 0.4382 (0.5068)	grad_norm 2.6829 (2.8617)	mem 20675MB
[2025-04-03 02:27:14 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][404/573]	eta 0:02:29 lr 0.000710	time 0.8774 (0.8824)	loss 0.5550 (0.5069)	grad_norm 3.3321 (2.8617)	mem 20675MB
[2025-04-03 02:27:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][406/573]	eta 0:02:27 lr 0.000709	time 0.8771 (0.8823)	loss 0.4514 (0.5063)	grad_norm 3.1072 (2.8619)	mem 20675MB
[2025-04-03 02:27:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][408/573]	eta 0:02:25 lr 0.000709	time 0.8775 (0.8823)	loss 0.5216 (0.5065)	grad_norm 3.1856 (2.8631)	mem 20675MB
[2025-04-03 02:27:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][410/573]	eta 0:02:23 lr 0.000709	time 0.8775 (0.8823)	loss 0.6086 (0.5069)	grad_norm 2.5239 (2.8625)	mem 20675MB
[2025-04-03 02:27:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][412/573]	eta 0:02:22 lr 0.000709	time 0.8773 (0.8823)	loss 0.5114 (0.5068)	grad_norm 2.1405 (2.8627)	mem 20675MB
[2025-04-03 02:27:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][414/573]	eta 0:02:20 lr 0.000708	time 0.8776 (0.8823)	loss 0.4722 (0.5069)	grad_norm 2.2454 (2.8594)	mem 20675MB
[2025-04-03 02:27:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][416/573]	eta 0:02:18 lr 0.000708	time 0.8774 (0.8822)	loss 0.5508 (0.5072)	grad_norm 2.2230 (2.8577)	mem 20675MB
[2025-04-03 02:27:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][418/573]	eta 0:02:16 lr 0.000708	time 0.8772 (0.8822)	loss 0.5500 (0.5074)	grad_norm 1.4597 (2.8521)	mem 20675MB
[2025-04-03 02:27:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][420/573]	eta 0:02:14 lr 0.000708	time 0.8772 (0.8822)	loss 0.4134 (0.5075)	grad_norm 2.8393 (2.8511)	mem 20675MB
[2025-04-03 02:27:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][422/573]	eta 0:02:13 lr 0.000708	time 0.8774 (0.8822)	loss 0.6269 (0.5079)	grad_norm 2.5015 (2.8490)	mem 20675MB
[2025-04-03 02:27:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][424/573]	eta 0:02:11 lr 0.000707	time 0.8774 (0.8822)	loss 0.5940 (0.5079)	grad_norm 1.9764 (2.8478)	mem 20675MB
[2025-04-03 02:27:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][426/573]	eta 0:02:09 lr 0.000707	time 0.8775 (0.8821)	loss 0.6428 (0.5081)	grad_norm 2.4661 (2.8466)	mem 20675MB
[2025-04-03 02:27:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][428/573]	eta 0:02:07 lr 0.000707	time 0.8776 (0.8821)	loss 0.3850 (0.5079)	grad_norm 2.6320 (2.8461)	mem 20675MB
[2025-04-03 02:27:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][430/573]	eta 0:02:06 lr 0.000707	time 0.8773 (0.8821)	loss 0.5496 (0.5079)	grad_norm 3.1426 (2.8453)	mem 20675MB
[2025-04-03 02:27:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][432/573]	eta 0:02:04 lr 0.000706	time 0.8774 (0.8821)	loss 0.5714 (0.5081)	grad_norm 2.0825 (2.8416)	mem 20675MB
[2025-04-03 02:27:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][434/573]	eta 0:02:02 lr 0.000706	time 0.8774 (0.8821)	loss 0.5706 (0.5084)	grad_norm 2.4397 (2.8397)	mem 20675MB
[2025-04-03 02:27:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][436/573]	eta 0:02:00 lr 0.000706	time 0.8774 (0.8821)	loss 0.4942 (0.5080)	grad_norm 2.5840 (2.8385)	mem 20675MB
[2025-04-03 02:27:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][438/573]	eta 0:01:59 lr 0.000706	time 0.8772 (0.8820)	loss 0.4195 (0.5079)	grad_norm 3.0779 (2.8363)	mem 20675MB
[2025-04-03 02:27:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][440/573]	eta 0:01:57 lr 0.000706	time 0.8772 (0.8820)	loss 0.5964 (0.5082)	grad_norm 2.5275 (2.8346)	mem 20675MB
[2025-04-03 02:27:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][442/573]	eta 0:01:55 lr 0.000705	time 0.8772 (0.8820)	loss 0.5075 (0.5082)	grad_norm 1.7498 (2.8310)	mem 20675MB
[2025-04-03 02:27:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][444/573]	eta 0:01:53 lr 0.000705	time 0.8773 (0.8820)	loss 0.5732 (0.5083)	grad_norm 2.5854 (2.8293)	mem 20675MB
[2025-04-03 02:27:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][446/573]	eta 0:01:52 lr 0.000705	time 0.8772 (0.8820)	loss 0.6159 (0.5087)	grad_norm 2.2897 (2.8270)	mem 20675MB
[2025-04-03 02:27:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][448/573]	eta 0:01:50 lr 0.000705	time 0.8772 (0.8820)	loss 0.4744 (0.5086)	grad_norm 2.2933 (2.8276)	mem 20675MB
[2025-04-03 02:27:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][450/573]	eta 0:01:48 lr 0.000704	time 0.8774 (0.8819)	loss 0.4951 (0.5088)	grad_norm 2.9419 (2.8273)	mem 20675MB
[2025-04-03 02:27:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][452/573]	eta 0:01:46 lr 0.000704	time 0.8774 (0.8819)	loss 0.5743 (0.5090)	grad_norm 1.9189 (2.8227)	mem 20675MB
[2025-04-03 02:27:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][454/573]	eta 0:01:44 lr 0.000704	time 0.8771 (0.8819)	loss 0.4824 (0.5089)	grad_norm 1.7070 (2.8179)	mem 20675MB
[2025-04-03 02:27:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][456/573]	eta 0:01:43 lr 0.000704	time 0.8773 (0.8819)	loss 0.5606 (0.5091)	grad_norm 2.3091 (2.8143)	mem 20675MB
[2025-04-03 02:28:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][458/573]	eta 0:01:41 lr 0.000703	time 0.8774 (0.8819)	loss 0.4621 (0.5093)	grad_norm 2.8535 (2.8135)	mem 20675MB
[2025-04-03 02:28:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][460/573]	eta 0:01:39 lr 0.000703	time 0.8774 (0.8819)	loss 0.5796 (0.5093)	grad_norm 1.9664 (2.8117)	mem 20675MB
[2025-04-03 02:28:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][462/573]	eta 0:01:37 lr 0.000703	time 0.8771 (0.8818)	loss 0.4433 (0.5091)	grad_norm 3.2850 (2.8134)	mem 20675MB
[2025-04-03 02:28:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][464/573]	eta 0:01:36 lr 0.000703	time 0.8775 (0.8818)	loss 0.5557 (0.5091)	grad_norm 2.8009 (2.8135)	mem 20675MB
[2025-04-03 02:28:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][466/573]	eta 0:01:34 lr 0.000703	time 0.8772 (0.8818)	loss 0.4973 (0.5091)	grad_norm 1.9142 (2.8132)	mem 20675MB
[2025-04-03 02:28:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][468/573]	eta 0:01:32 lr 0.000702	time 0.8777 (0.8818)	loss 0.4731 (0.5088)	grad_norm 4.4510 (2.8169)	mem 20675MB
[2025-04-03 02:28:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][470/573]	eta 0:01:30 lr 0.000702	time 0.8771 (0.8818)	loss 0.4661 (0.5087)	grad_norm 2.7569 (2.8169)	mem 20675MB
[2025-04-03 02:28:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][472/573]	eta 0:01:29 lr 0.000702	time 0.8774 (0.8818)	loss 0.5651 (0.5087)	grad_norm 2.2424 (2.8154)	mem 20675MB
[2025-04-03 02:28:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][474/573]	eta 0:01:27 lr 0.000702	time 0.8773 (0.8817)	loss 0.6039 (0.5086)	grad_norm 2.4561 (2.8173)	mem 20675MB
[2025-04-03 02:28:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][476/573]	eta 0:01:25 lr 0.000701	time 0.8771 (0.8817)	loss 0.5351 (0.5087)	grad_norm 2.5583 (2.8171)	mem 20675MB
[2025-04-03 02:28:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][478/573]	eta 0:01:23 lr 0.000701	time 0.8772 (0.8817)	loss 0.3676 (0.5086)	grad_norm 2.3336 (2.8148)	mem 20675MB
[2025-04-03 02:28:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][480/573]	eta 0:01:21 lr 0.000701	time 0.8772 (0.8817)	loss 0.4361 (0.5085)	grad_norm 3.5208 (2.8156)	mem 20675MB
[2025-04-03 02:28:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][482/573]	eta 0:01:20 lr 0.000701	time 0.8770 (0.8817)	loss 0.5354 (0.5085)	grad_norm 2.6053 (2.8132)	mem 20675MB
[2025-04-03 02:28:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][484/573]	eta 0:01:18 lr 0.000701	time 0.8772 (0.8817)	loss 0.5131 (0.5087)	grad_norm 1.9680 (2.8110)	mem 20675MB
[2025-04-03 02:28:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][486/573]	eta 0:01:16 lr 0.000700	time 0.8771 (0.8817)	loss 0.5038 (0.5085)	grad_norm 3.0890 (2.8123)	mem 20675MB
[2025-04-03 02:28:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][488/573]	eta 0:01:14 lr 0.000700	time 0.8772 (0.8816)	loss 0.5048 (0.5086)	grad_norm 2.8476 (2.8112)	mem 20675MB
[2025-04-03 02:28:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][490/573]	eta 0:01:13 lr 0.000700	time 0.8773 (0.8816)	loss 0.4988 (0.5086)	grad_norm 2.4665 (2.8100)	mem 20675MB
[2025-04-03 02:28:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][492/573]	eta 0:01:11 lr 0.000700	time 0.8779 (0.8816)	loss 0.4210 (0.5086)	grad_norm 4.7357 (2.8114)	mem 20675MB
[2025-04-03 02:28:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][494/573]	eta 0:01:09 lr 0.000699	time 0.8773 (0.8816)	loss 0.5707 (0.5087)	grad_norm 2.8241 (2.8101)	mem 20675MB
[2025-04-03 02:28:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][496/573]	eta 0:01:07 lr 0.000699	time 0.8771 (0.8816)	loss 0.5234 (0.5088)	grad_norm 2.0551 (2.8077)	mem 20675MB
[2025-04-03 02:28:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][498/573]	eta 0:01:06 lr 0.000699	time 0.8773 (0.8816)	loss 0.5012 (0.5088)	grad_norm 2.3055 (2.8064)	mem 20675MB
[2025-04-03 02:28:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][500/573]	eta 0:01:04 lr 0.000699	time 0.8772 (0.8816)	loss 0.5439 (0.5089)	grad_norm 2.6426 (2.8071)	mem 20675MB
[2025-04-03 02:28:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][502/573]	eta 0:01:02 lr 0.000699	time 0.8770 (0.8815)	loss 0.3922 (0.5086)	grad_norm 4.8891 (2.8131)	mem 20675MB
[2025-04-03 02:28:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][504/573]	eta 0:01:00 lr 0.000698	time 0.8773 (0.8815)	loss 0.4647 (0.5083)	grad_norm 2.3817 (2.8126)	mem 20675MB
[2025-04-03 02:28:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][506/573]	eta 0:00:59 lr 0.000698	time 0.8774 (0.8815)	loss 0.5117 (0.5082)	grad_norm 2.4353 (2.8150)	mem 20675MB
[2025-04-03 02:28:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][508/573]	eta 0:00:57 lr 0.000698	time 0.8771 (0.8815)	loss 0.5366 (0.5084)	grad_norm 2.1752 (2.8120)	mem 20675MB
[2025-04-03 02:28:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][510/573]	eta 0:00:55 lr 0.000698	time 0.8771 (0.8815)	loss 0.5195 (0.5085)	grad_norm 2.3440 (2.8098)	mem 20675MB
[2025-04-03 02:28:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][512/573]	eta 0:00:53 lr 0.000697	time 0.8774 (0.8815)	loss 0.4133 (0.5084)	grad_norm 2.5104 (2.8094)	mem 20675MB
[2025-04-03 02:28:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][514/573]	eta 0:00:52 lr 0.000697	time 0.8773 (0.8815)	loss 0.4046 (0.5083)	grad_norm 2.9554 (2.8090)	mem 20675MB
[2025-04-03 02:28:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][516/573]	eta 0:00:50 lr 0.000697	time 0.8773 (0.8815)	loss 0.3646 (0.5081)	grad_norm 2.7621 (2.8077)	mem 20675MB
[2025-04-03 02:28:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][518/573]	eta 0:00:48 lr 0.000697	time 0.8773 (0.8814)	loss 0.4936 (0.5078)	grad_norm 1.8637 (2.8064)	mem 20675MB
[2025-04-03 02:28:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][520/573]	eta 0:00:46 lr 0.000696	time 0.8772 (0.8814)	loss 0.5094 (0.5080)	grad_norm 2.9687 (2.8065)	mem 20675MB
[2025-04-03 02:28:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][522/573]	eta 0:00:44 lr 0.000696	time 0.8771 (0.8814)	loss 0.4476 (0.5077)	grad_norm 2.6453 (2.8062)	mem 20675MB
[2025-04-03 02:28:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][524/573]	eta 0:00:43 lr 0.000696	time 0.8773 (0.8814)	loss 0.3990 (0.5075)	grad_norm 3.0210 (2.8065)	mem 20675MB
[2025-04-03 02:29:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][526/573]	eta 0:00:41 lr 0.000696	time 0.8774 (0.8814)	loss 0.5569 (0.5078)	grad_norm 3.7334 (2.8075)	mem 20675MB
[2025-04-03 02:29:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][528/573]	eta 0:00:39 lr 0.000696	time 0.8771 (0.8814)	loss 0.4667 (0.5077)	grad_norm 1.8390 (2.8047)	mem 20675MB
[2025-04-03 02:29:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][530/573]	eta 0:00:37 lr 0.000695	time 0.8777 (0.8814)	loss 0.4843 (0.5075)	grad_norm 2.7077 (2.8054)	mem 20675MB
[2025-04-03 02:29:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][532/573]	eta 0:00:36 lr 0.000695	time 0.8771 (0.8814)	loss 0.5875 (0.5077)	grad_norm 2.3143 (2.8034)	mem 20675MB
[2025-04-03 02:29:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][534/573]	eta 0:00:34 lr 0.000695	time 0.8771 (0.8814)	loss 0.5275 (0.5076)	grad_norm 2.2322 (2.8034)	mem 20675MB
[2025-04-03 02:29:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][536/573]	eta 0:00:32 lr 0.000695	time 0.8773 (0.8813)	loss 0.4958 (0.5075)	grad_norm 4.3641 (2.8051)	mem 20675MB
[2025-04-03 02:29:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][538/573]	eta 0:00:30 lr 0.000694	time 0.8775 (0.8813)	loss 0.5666 (0.5077)	grad_norm 1.6136 (2.8062)	mem 20675MB
[2025-04-03 02:29:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][540/573]	eta 0:00:29 lr 0.000694	time 0.8771 (0.8813)	loss 0.4966 (0.5078)	grad_norm 2.3744 (2.8053)	mem 20675MB
[2025-04-03 02:29:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][542/573]	eta 0:00:27 lr 0.000694	time 0.8772 (0.8813)	loss 0.4877 (0.5076)	grad_norm 5.6263 (2.8131)	mem 20675MB
[2025-04-03 02:29:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][544/573]	eta 0:00:25 lr 0.000694	time 0.8774 (0.8813)	loss 0.6114 (0.5075)	grad_norm 2.3053 (2.8116)	mem 20675MB
[2025-04-03 02:29:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][546/573]	eta 0:00:23 lr 0.000694	time 0.8771 (0.8813)	loss 0.6140 (0.5078)	grad_norm 2.5559 (2.8099)	mem 20675MB
[2025-04-03 02:29:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][548/573]	eta 0:00:22 lr 0.000693	time 0.8787 (0.8813)	loss 0.4062 (0.5074)	grad_norm 3.8491 (2.8114)	mem 20675MB
[2025-04-03 02:29:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][550/573]	eta 0:00:20 lr 0.000693	time 0.8776 (0.8813)	loss 0.6206 (0.5075)	grad_norm 3.1871 (2.8131)	mem 20675MB
[2025-04-03 02:29:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][552/573]	eta 0:00:18 lr 0.000693	time 0.8773 (0.8813)	loss 0.5934 (0.5076)	grad_norm 1.9089 (2.8102)	mem 20675MB
[2025-04-03 02:29:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][554/573]	eta 0:00:16 lr 0.000693	time 0.8772 (0.8812)	loss 0.4571 (0.5075)	grad_norm 2.4882 (2.8084)	mem 20675MB
[2025-04-03 02:29:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][556/573]	eta 0:00:14 lr 0.000692	time 0.8771 (0.8812)	loss 0.4552 (0.5075)	grad_norm 2.2826 (2.8052)	mem 20675MB
[2025-04-03 02:29:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][558/573]	eta 0:00:13 lr 0.000692	time 0.8770 (0.8812)	loss 0.3724 (0.5073)	grad_norm 3.2234 (2.8047)	mem 20675MB
[2025-04-03 02:29:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][560/573]	eta 0:00:11 lr 0.000692	time 0.8773 (0.8812)	loss 0.4439 (0.5070)	grad_norm 2.3083 (2.8037)	mem 20675MB
[2025-04-03 02:29:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][562/573]	eta 0:00:09 lr 0.000692	time 0.8771 (0.8812)	loss 0.5408 (0.5070)	grad_norm 3.1342 (2.8103)	mem 20675MB
[2025-04-03 02:29:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][564/573]	eta 0:00:07 lr 0.000691	time 0.8773 (0.8812)	loss 0.5517 (0.5071)	grad_norm 3.0408 (2.8122)	mem 20675MB
[2025-04-03 02:29:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][566/573]	eta 0:00:06 lr 0.000691	time 0.8771 (0.8812)	loss 0.6267 (0.5075)	grad_norm 3.0966 (2.8119)	mem 20675MB
[2025-04-03 02:29:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][568/573]	eta 0:00:04 lr 0.000691	time 0.8780 (0.8812)	loss 0.5843 (0.5075)	grad_norm 2.1721 (2.8106)	mem 20675MB
[2025-04-03 02:29:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][570/573]	eta 0:00:02 lr 0.000691	time 0.8769 (0.8812)	loss 0.4630 (0.5075)	grad_norm 4.3045 (2.8121)	mem 20675MB
[2025-04-03 02:29:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][572/573]	eta 0:00:00 lr 0.000691	time 0.8772 (0.8811)	loss 0.3949 (0.5073)	grad_norm 3.3835 (2.8131)	mem 20675MB
[2025-04-03 02:29:41 simmim_finetune] (main_finetune.py 260): INFO EPOCH 13 training takes 0:08:25
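The `time x (y)` columns above are the last and running-average step times, and `eta` is the running average multiplied by the iterations left; the epoch summary `training takes 0:08:25` is consistent with 573 iterations at the final average of 0.8811 s (573 × 0.8811 ≈ 505 s). A minimal sketch of that bookkeeping, with hypothetical helper names modeled on the usual AverageMeter pattern:

```python
import datetime

class AverageMeter:
    """Running average of a scalar (hypothetical helper, modeled on the
    meter commonly used in such training scripts)."""
    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

# Numbers taken from the iteration-502 line above: running-average step
# time 0.8815 s, with 573 - 502 - 1 = 70 iterations left in the epoch.
batch_time = AverageMeter()
batch_time.update(0.8815)
remaining = 573 - 502 - 1
eta = batch_time.avg * remaining
print(str(datetime.timedelta(seconds=int(eta))))  # close to the logged "eta 0:01:02"
```

The printed value differs from the log by a second at most, since the log works from the unrounded running average rather than the four-decimal value shown.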
[2025-04-03 02:29:43 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.923 (1.923)	Loss 0.7388 (0.7388)	Acc@1 52.344 (52.344)	Mem 20675MB
[2025-04-03 02:29:44 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.830)	Loss 0.7056 (0.7118)	Acc@1 57.812 (55.990)	Mem 20675MB
[2025-04-03 02:29:44 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.612)	Loss 0.7403 (0.7107)	Acc@1 53.906 (55.781)	Mem 20675MB
[2025-04-03 02:29:45 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.518)	Loss 0.6834 (0.7002)	Acc@1 63.281 (57.478)	Mem 20675MB
[2025-04-03 02:29:46 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.466)	Loss 0.3824 (0.6312)	Acc@1 86.719 (64.236)	Mem 20675MB
[2025-04-03 02:29:46 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.284 (0.433)	Loss 0.3514 (0.5842)	Acc@1 86.719 (67.827)	Mem 20675MB
[2025-04-03 02:29:47 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.410)	Loss 0.3424 (0.5470)	Acc@1 88.281 (70.913)	Mem 20675MB
[2025-04-03 02:29:47 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.393)	Loss 0.3287 (0.5175)	Acc@1 91.406 (73.542)	Mem 20675MB
[2025-04-03 02:29:48 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 73.942
[2025-04-03 02:29:48 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 73.9%
[2025-04-03 02:29:48 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 76.71%
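Each `Acc@1 x (y)` pair above is the per-batch accuracy and its running average; the final `Acc@1 73.942` over the 1984 test images (15 batches of 128 plus one of 64 at `BATCH_SIZE: 128`) is the sample-weighted mean of the batch accuracies, not a plain mean of the 16 batch values. A sketch, assuming that weighting:

```python
def weighted_acc(batch_accs, batch_sizes):
    """Sample-weighted running accuracy, as in the Acc@1 running column:
    each batch contributes in proportion to its number of images, so the
    short 64-image final batch counts half as much as a 128-image batch."""
    total = sum(batch_sizes)
    return sum(a * n for a, n in zip(batch_accs, batch_sizes)) / total

# Hypothetical two-batch illustration with sizes matching this test set's
# full and final batch sizes.
print(weighted_acc([50.0, 100.0], [128, 64]))  # 66.666..., not the plain mean 75.0
```

With 15 batches of 128 and one of 64 the denominator is exactly the 1984 test images reported in the log.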
[2025-04-03 02:29:48 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.6646244249880715e-06, 2.6646244249880715e-06, 4.039162303072435e-06, 4.039162303072435e-06, 6.153835961663764e-06, 6.153835961663764e-06, 9.40718005180427e-06, 9.40718005180427e-06, 1.4412324805866585e-05, 1.4412324805866585e-05, 2.2112547504423998e-05, 2.2112547504423998e-05, 3.3959043963743085e-05, 3.3959043963743085e-05, 5.2184423131926294e-05, 5.2184423131926294e-05, 8.022346800605433e-05, 8.022346800605433e-05, 0.00012336046012009744, 0.00012336046012009744, 0.00018972506337247146, 0.00018972506337247146, 0.0002918244529915084, 0.0002918244529915084, 0.000448900437020796, 0.000448900437020796, 0.0006905557970658539, 0.0006905557970658539]
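The 28 values above form 14 pairs (two parameter groups per layer depth, commonly the with- and without-weight-decay splits; that pairing is an assumption). They are consistent with `LAYER_DECAY: 0.65` over 14 layer groups (patch embedding, 12 ViT blocks, head) under a cosine schedule floored at `MIN_LR`: each group's peak rate is `BASE_LR × 0.65^(13 − i)`, scaled by the shared cosine factor at the start of epoch 14. A sketch of that computation (the exact scheduler code is assumed, not taken from the repository):

```python
import math

BASE_LR = 0.00125
MIN_LR = 2.5e-07
LAYER_DECAY = 0.65
NUM_GROUPS = 14  # patch embedding + 12 transformer blocks + head

# Peak LR per group: layers nearer the head get larger rates.
scales = [LAYER_DECAY ** (NUM_GROUPS - 1 - i) for i in range(NUM_GROUPS)]

def group_lrs(epoch, total_epochs=30):
    """Cosine-annealed LR for each layer group (sketch; warmup omitted)."""
    cos = 0.5 * (1.0 + math.cos(math.pi * epoch / total_epochs))
    return [MIN_LR + (BASE_LR * s - MIN_LR) * cos for s in scales]

lrs = group_lrs(epoch=14)
# lrs[0] and lrs[-1] land on the smallest and largest logged group values
# (~2.665e-06 and ~6.906e-04). The ratios of scheduled values drift
# slightly away from 0.65 for the shallow groups because the shared
# MIN_LR floor is a larger fraction of their small peak rates.
```

Note that the head group's scheduled rate (~6.9e-4) sits well below the configured `BASE_LR` of 1.25e-3 purely because of the cosine factor at epoch 14.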
[2025-04-03 02:29:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][0/573]	eta 0:23:16 lr 0.000690	time 2.4380 (2.4380)	loss 0.4197 (0.4197)	grad_norm 2.5483 (2.5483)	mem 20675MB
[2025-04-03 02:29:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][2/573]	eta 0:13:18 lr 0.000690	time 0.8772 (1.3984)	loss 0.5035 (0.4915)	grad_norm 2.2595 (2.2266)	mem 20675MB
[2025-04-03 02:29:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][4/573]	eta 0:11:17 lr 0.000690	time 0.8774 (1.1903)	loss 0.4497 (0.4619)	grad_norm 2.9510 (2.5660)	mem 20675MB
[2025-04-03 02:29:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][6/573]	eta 0:10:24 lr 0.000690	time 0.8776 (1.1012)	loss 0.6043 (0.5025)	grad_norm 2.9520 (2.6236)	mem 20675MB
[2025-04-03 02:29:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][8/573]	eta 0:09:54 lr 0.000690	time 0.8773 (1.0517)	loss 0.4236 (0.4953)	grad_norm 2.8722 (2.7316)	mem 20675MB
[2025-04-03 02:29:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][10/573]	eta 0:09:34 lr 0.000689	time 0.8776 (1.0202)	loss 0.5520 (0.5052)	grad_norm 2.3180 (2.6922)	mem 20675MB
[2025-04-03 02:30:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][12/573]	eta 0:09:20 lr 0.000689	time 0.8773 (0.9983)	loss 0.5886 (0.5165)	grad_norm 1.8449 (2.6128)	mem 20675MB
[2025-04-03 02:30:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][14/573]	eta 0:09:09 lr 0.000689	time 0.8776 (0.9823)	loss 0.5494 (0.5172)	grad_norm 2.3625 (2.6188)	mem 20675MB
[2025-04-03 02:30:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][16/573]	eta 0:09:00 lr 0.000689	time 0.8784 (0.9702)	loss 0.5582 (0.5136)	grad_norm 2.2246 (2.6339)	mem 20675MB
[2025-04-03 02:30:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][18/573]	eta 0:08:53 lr 0.000688	time 0.8773 (0.9606)	loss 0.4605 (0.5037)	grad_norm 2.3774 (2.6442)	mem 20675MB
[2025-04-03 02:30:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][20/573]	eta 0:08:46 lr 0.000688	time 0.8782 (0.9528)	loss 0.4282 (0.5016)	grad_norm 2.8780 (2.7048)	mem 20675MB
[2025-04-03 02:30:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][22/573]	eta 0:08:41 lr 0.000688	time 0.8776 (0.9464)	loss 0.5658 (0.5077)	grad_norm 1.9309 (2.6474)	mem 20675MB
[2025-04-03 02:30:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][24/573]	eta 0:08:36 lr 0.000688	time 0.8777 (0.9410)	loss 0.5183 (0.5034)	grad_norm 1.9324 (2.6117)	mem 20675MB
[2025-04-03 02:30:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][26/573]	eta 0:08:32 lr 0.000687	time 0.8772 (0.9363)	loss 0.4018 (0.4969)	grad_norm 3.4244 (2.7119)	mem 20675MB
[2025-04-03 02:30:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][28/573]	eta 0:08:28 lr 0.000687	time 0.8784 (0.9324)	loss 0.5025 (0.4951)	grad_norm 2.7922 (2.7395)	mem 20675MB
[2025-04-03 02:30:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][30/573]	eta 0:08:24 lr 0.000687	time 0.8773 (0.9289)	loss 0.5644 (0.4975)	grad_norm 2.7763 (2.7310)	mem 20675MB
[2025-04-03 02:30:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][32/573]	eta 0:08:20 lr 0.000687	time 0.8774 (0.9258)	loss 0.3572 (0.4940)	grad_norm 2.9298 (2.7706)	mem 20675MB
[2025-04-03 02:30:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][34/573]	eta 0:08:17 lr 0.000687	time 0.8773 (0.9231)	loss 0.3533 (0.4915)	grad_norm 3.0984 (2.7582)	mem 20675MB
[2025-04-03 02:30:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][36/573]	eta 0:08:14 lr 0.000686	time 0.8772 (0.9207)	loss 0.5673 (0.4952)	grad_norm 2.7813 (2.7553)	mem 20675MB
[2025-04-03 02:30:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][38/573]	eta 0:08:11 lr 0.000686	time 0.8779 (0.9185)	loss 0.5573 (0.4934)	grad_norm 4.2729 (2.7993)	mem 20675MB
[2025-04-03 02:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][40/573]	eta 0:08:08 lr 0.000686	time 0.8772 (0.9166)	loss 0.5923 (0.4960)	grad_norm 3.5609 (2.8048)	mem 20675MB
[2025-04-03 02:30:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][42/573]	eta 0:08:05 lr 0.000686	time 0.8785 (0.9149)	loss 0.5405 (0.4970)	grad_norm 2.8740 (2.8327)	mem 20675MB
[2025-04-03 02:30:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][44/573]	eta 0:08:03 lr 0.000685	time 0.8783 (0.9133)	loss 0.6116 (0.4965)	grad_norm 2.3836 (2.8605)	mem 20675MB
[2025-04-03 02:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][46/573]	eta 0:08:00 lr 0.000685	time 0.8773 (0.9118)	loss 0.5662 (0.4994)	grad_norm 3.2546 (2.8732)	mem 20675MB
[2025-04-03 02:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][48/573]	eta 0:07:57 lr 0.000685	time 0.8775 (0.9104)	loss 0.5896 (0.5029)	grad_norm 2.7165 (2.8576)	mem 20675MB
[2025-04-03 02:30:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][50/573]	eta 0:07:55 lr 0.000685	time 0.8779 (0.9092)	loss 0.5762 (0.5014)	grad_norm 2.5402 (2.8741)	mem 20675MB
[2025-04-03 02:30:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][52/573]	eta 0:07:53 lr 0.000685	time 0.8781 (0.9080)	loss 0.3810 (0.4991)	grad_norm 2.3862 (2.8633)	mem 20675MB
[2025-04-03 02:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][54/573]	eta 0:07:50 lr 0.000684	time 0.8780 (0.9070)	loss 0.6054 (0.5030)	grad_norm 2.9439 (2.8615)	mem 20675MB
[2025-04-03 02:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][56/573]	eta 0:07:48 lr 0.000684	time 0.8780 (0.9060)	loss 0.3763 (0.5026)	grad_norm 4.5013 (2.8881)	mem 20675MB
[2025-04-03 02:30:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][58/573]	eta 0:07:46 lr 0.000684	time 0.8778 (0.9051)	loss 0.4694 (0.5034)	grad_norm 2.1899 (2.8626)	mem 20675MB
[2025-04-03 02:30:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][60/573]	eta 0:07:43 lr 0.000684	time 0.8790 (0.9042)	loss 0.4312 (0.5035)	grad_norm 3.0038 (2.8576)	mem 20675MB
[2025-04-03 02:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][62/573]	eta 0:07:41 lr 0.000683	time 0.8779 (0.9034)	loss 0.5253 (0.5049)	grad_norm 3.7014 (2.8550)	mem 20675MB
[2025-04-03 02:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][64/573]	eta 0:07:39 lr 0.000683	time 0.8779 (0.9027)	loss 0.5816 (0.5077)	grad_norm 2.2429 (2.8462)	mem 20675MB
[2025-04-03 02:30:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][66/573]	eta 0:07:37 lr 0.000683	time 0.8786 (0.9020)	loss 0.6337 (0.5092)	grad_norm 2.7398 (2.8465)	mem 20675MB
[2025-04-03 02:30:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][68/573]	eta 0:07:35 lr 0.000683	time 0.8782 (0.9013)	loss 0.5701 (0.5115)	grad_norm 2.3178 (2.8273)	mem 20675MB
[2025-04-03 02:30:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][70/573]	eta 0:07:33 lr 0.000682	time 0.8781 (0.9007)	loss 0.4843 (0.5119)	grad_norm 2.0479 (2.8027)	mem 20675MB
[2025-04-03 02:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][72/573]	eta 0:07:30 lr 0.000682	time 0.8783 (0.9001)	loss 0.4401 (0.5121)	grad_norm 3.1443 (2.8079)	mem 20675MB
[2025-04-03 02:30:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][74/573]	eta 0:07:28 lr 0.000682	time 0.8780 (0.8996)	loss 0.5570 (0.5129)	grad_norm 1.7464 (2.7874)	mem 20675MB
[2025-04-03 02:30:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][76/573]	eta 0:07:26 lr 0.000682	time 0.8778 (0.8990)	loss 0.4538 (0.5115)	grad_norm 2.9322 (2.7946)	mem 20675MB
[2025-04-03 02:30:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][78/573]	eta 0:07:24 lr 0.000682	time 0.8786 (0.8985)	loss 0.4986 (0.5089)	grad_norm 2.4805 (2.7906)	mem 20675MB
[2025-04-03 02:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][80/573]	eta 0:07:22 lr 0.000681	time 0.8780 (0.8980)	loss 0.3674 (0.5066)	grad_norm 4.4468 (2.8267)	mem 20675MB
[2025-04-03 02:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][82/573]	eta 0:07:20 lr 0.000681	time 0.8780 (0.8976)	loss 0.6369 (0.5089)	grad_norm 2.9763 (2.8444)	mem 20675MB
[2025-04-03 02:31:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][84/573]	eta 0:07:18 lr 0.000681	time 0.8783 (0.8971)	loss 0.5208 (0.5095)	grad_norm 2.1254 (2.8501)	mem 20675MB
[2025-04-03 02:31:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][86/573]	eta 0:07:16 lr 0.000681	time 0.8777 (0.8967)	loss 0.5479 (0.5089)	grad_norm 3.3995 (2.8589)	mem 20675MB
[2025-04-03 02:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][88/573]	eta 0:07:14 lr 0.000680	time 0.8781 (0.8963)	loss 0.4278 (0.5089)	grad_norm 3.0899 (2.8739)	mem 20675MB
[2025-04-03 02:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][90/573]	eta 0:07:12 lr 0.000680	time 0.8779 (0.8959)	loss 0.5571 (0.5093)	grad_norm 1.7852 (2.8521)	mem 20675MB
[2025-04-03 02:31:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][92/573]	eta 0:07:10 lr 0.000680	time 0.8780 (0.8956)	loss 0.4764 (0.5080)	grad_norm 2.5657 (2.8509)	mem 20675MB
[2025-04-03 02:31:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][94/573]	eta 0:07:08 lr 0.000680	time 0.8776 (0.8952)	loss 0.4411 (0.5064)	grad_norm 6.1206 (2.8879)	mem 20675MB
[2025-04-03 02:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][96/573]	eta 0:07:06 lr 0.000680	time 0.8779 (0.8949)	loss 0.5891 (0.5072)	grad_norm 3.0501 (2.8858)	mem 20675MB
[2025-04-03 02:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][98/573]	eta 0:07:04 lr 0.000679	time 0.8783 (0.8945)	loss 0.5121 (0.5079)	grad_norm 2.5514 (2.8724)	mem 20675MB
[2025-04-03 02:31:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][100/573]	eta 0:07:02 lr 0.000679	time 0.8778 (0.8942)	loss 0.5630 (0.5082)	grad_norm 2.5382 (2.8649)	mem 20675MB
[2025-04-03 02:31:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][102/573]	eta 0:07:01 lr 0.000679	time 0.8776 (0.8939)	loss 0.4385 (0.5075)	grad_norm 4.0552 (2.8747)	mem 20675MB
[2025-04-03 02:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][104/573]	eta 0:06:59 lr 0.000679	time 0.8780 (0.8937)	loss 0.5611 (0.5082)	grad_norm 1.9878 (2.8664)	mem 20675MB
[2025-04-03 02:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][106/573]	eta 0:06:57 lr 0.000678	time 0.8804 (0.8934)	loss 0.5099 (0.5084)	grad_norm 2.1715 (2.8660)	mem 20675MB
[2025-04-03 02:31:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][108/573]	eta 0:06:55 lr 0.000678	time 0.8788 (0.8931)	loss 0.4688 (0.5089)	grad_norm 2.6052 (2.8597)	mem 20675MB
[2025-04-03 02:31:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][110/573]	eta 0:06:53 lr 0.000678	time 0.8778 (0.8929)	loss 0.3605 (0.5068)	grad_norm 2.5859 (2.8580)	mem 20675MB
[2025-04-03 02:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][112/573]	eta 0:06:51 lr 0.000678	time 0.8781 (0.8926)	loss 0.3652 (0.5056)	grad_norm 4.9591 (2.8729)	mem 20675MB
[2025-04-03 02:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][114/573]	eta 0:06:49 lr 0.000677	time 0.8783 (0.8924)	loss 0.4885 (0.5053)	grad_norm 2.4510 (2.8790)	mem 20675MB
[2025-04-03 02:31:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][116/573]	eta 0:06:47 lr 0.000677	time 0.8777 (0.8922)	loss 0.5980 (0.5063)	grad_norm 3.0443 (2.8780)	mem 20675MB
[2025-04-03 02:31:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][118/573]	eta 0:06:45 lr 0.000677	time 0.8774 (0.8919)	loss 0.4967 (0.5070)	grad_norm 3.2706 (2.8763)	mem 20675MB
[2025-04-03 02:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][120/573]	eta 0:06:43 lr 0.000677	time 0.8782 (0.8917)	loss 0.5486 (0.5078)	grad_norm 4.5608 (2.8823)	mem 20675MB
[2025-04-03 02:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][122/573]	eta 0:06:42 lr 0.000677	time 0.8787 (0.8915)	loss 0.5047 (0.5073)	grad_norm 3.3251 (2.8795)	mem 20675MB
[2025-04-03 02:31:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][124/573]	eta 0:06:40 lr 0.000676	time 0.8773 (0.8913)	loss 0.3812 (0.5066)	grad_norm 2.5353 (2.8702)	mem 20675MB
[2025-04-03 02:31:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][126/573]	eta 0:06:38 lr 0.000676	time 0.8773 (0.8911)	loss 0.4909 (0.5062)	grad_norm 2.0926 (2.8604)	mem 20675MB
[2025-04-03 02:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][128/573]	eta 0:06:36 lr 0.000676	time 0.8772 (0.8909)	loss 0.5555 (0.5069)	grad_norm 2.7683 (2.8544)	mem 20675MB
[2025-04-03 02:31:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][130/573]	eta 0:06:34 lr 0.000676	time 0.8779 (0.8907)	loss 0.3631 (0.5060)	grad_norm 2.9738 (2.8497)	mem 20675MB
[2025-04-03 02:31:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][132/573]	eta 0:06:32 lr 0.000675	time 0.8771 (0.8905)	loss 0.5220 (0.5055)	grad_norm 1.7634 (2.8457)	mem 20675MB
[2025-04-03 02:31:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][134/573]	eta 0:06:30 lr 0.000675	time 0.8770 (0.8903)	loss 0.4916 (0.5060)	grad_norm 2.8598 (2.8403)	mem 20675MB
[2025-04-03 02:31:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][136/573]	eta 0:06:29 lr 0.000675	time 0.8774 (0.8902)	loss 0.4823 (0.5067)	grad_norm 3.6712 (2.8429)	mem 20675MB
[2025-04-03 02:31:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][138/573]	eta 0:06:27 lr 0.000675	time 0.8771 (0.8900)	loss 0.4805 (0.5057)	grad_norm 2.3266 (2.8583)	mem 20675MB
[2025-04-03 02:31:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][140/573]	eta 0:06:25 lr 0.000675	time 0.8774 (0.8898)	loss 0.3605 (0.5052)	grad_norm 3.0563 (2.8572)	mem 20675MB
[2025-04-03 02:31:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][142/573]	eta 0:06:23 lr 0.000674	time 0.8771 (0.8897)	loss 0.5408 (0.5049)	grad_norm 2.2149 (2.8616)	mem 20675MB
[2025-04-03 02:31:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][144/573]	eta 0:06:21 lr 0.000674	time 0.8778 (0.8895)	loss 0.5258 (0.5052)	grad_norm 2.1044 (2.8585)	mem 20675MB
[2025-04-03 02:31:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][146/573]	eta 0:06:19 lr 0.000674	time 0.8774 (0.8894)	loss 0.5668 (0.5059)	grad_norm 2.4669 (2.8527)	mem 20675MB
[2025-04-03 02:32:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][148/573]	eta 0:06:17 lr 0.000674	time 0.8776 (0.8892)	loss 0.5540 (0.5061)	grad_norm 1.9889 (2.8432)	mem 20675MB
[2025-04-03 02:32:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][150/573]	eta 0:06:16 lr 0.000673	time 0.8776 (0.8891)	loss 0.3645 (0.5058)	grad_norm 2.9297 (2.8651)	mem 20675MB
[2025-04-03 02:32:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][152/573]	eta 0:06:14 lr 0.000673	time 0.8778 (0.8889)	loss 0.5121 (0.5060)	grad_norm 2.1560 (2.8558)	mem 20675MB
[2025-04-03 02:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][154/573]	eta 0:06:12 lr 0.000673	time 0.8777 (0.8888)	loss 0.5184 (0.5060)	grad_norm 2.1231 (2.8498)	mem 20675MB
[2025-04-03 02:32:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][156/573]	eta 0:06:10 lr 0.000673	time 0.8780 (0.8887)	loss 0.5240 (0.5063)	grad_norm 2.0230 (2.8531)	mem 20675MB
[2025-04-03 02:32:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][158/573]	eta 0:06:08 lr 0.000672	time 0.8773 (0.8886)	loss 0.5618 (0.5066)	grad_norm 2.5526 (2.8551)	mem 20675MB
[2025-04-03 02:32:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][160/573]	eta 0:06:06 lr 0.000672	time 0.8773 (0.8884)	loss 0.4123 (0.5065)	grad_norm 2.8946 (2.8534)	mem 20675MB
[2025-04-03 02:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][162/573]	eta 0:06:05 lr 0.000672	time 0.8779 (0.8883)	loss 0.5783 (0.5074)	grad_norm 1.8243 (2.8441)	mem 20675MB
[2025-04-03 02:32:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][164/573]	eta 0:06:03 lr 0.000672	time 0.8775 (0.8882)	loss 0.5743 (0.5081)	grad_norm 2.0632 (2.8331)	mem 20675MB
[2025-04-03 02:32:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][166/573]	eta 0:06:01 lr 0.000672	time 0.8775 (0.8881)	loss 0.5926 (0.5077)	grad_norm 2.4530 (2.8293)	mem 20675MB
[2025-04-03 02:32:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][168/573]	eta 0:05:59 lr 0.000671	time 0.8775 (0.8880)	loss 0.5238 (0.5069)	grad_norm 1.8546 (2.8222)	mem 20675MB
[2025-04-03 02:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][170/573]	eta 0:05:57 lr 0.000671	time 0.8775 (0.8878)	loss 0.5691 (0.5066)	grad_norm 1.9209 (2.8162)	mem 20675MB
[2025-04-03 02:32:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][172/573]	eta 0:05:55 lr 0.000671	time 0.8773 (0.8877)	loss 0.5439 (0.5065)	grad_norm 2.3826 (2.8101)	mem 20675MB
[2025-04-03 02:32:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][174/573]	eta 0:05:54 lr 0.000671	time 0.8773 (0.8876)	loss 0.4904 (0.5065)	grad_norm 3.3896 (2.8101)	mem 20675MB
[2025-04-03 02:32:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][176/573]	eta 0:05:52 lr 0.000670	time 0.8775 (0.8875)	loss 0.3975 (0.5057)	grad_norm 2.7250 (2.8100)	mem 20675MB
[2025-04-03 02:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][178/573]	eta 0:05:50 lr 0.000670	time 0.8770 (0.8874)	loss 0.5614 (0.5064)	grad_norm 2.9093 (2.8076)	mem 20675MB
[2025-04-03 02:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][180/573]	eta 0:05:48 lr 0.000670	time 0.8771 (0.8873)	loss 0.5502 (0.5067)	grad_norm 2.4674 (2.8178)	mem 20675MB
[2025-04-03 02:32:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][182/573]	eta 0:05:46 lr 0.000670	time 0.8776 (0.8872)	loss 0.6169 (0.5077)	grad_norm 2.7589 (2.8154)	mem 20675MB
[2025-04-03 02:32:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][184/573]	eta 0:05:45 lr 0.000670	time 0.8775 (0.8871)	loss 0.5952 (0.5076)	grad_norm 1.8766 (2.8063)	mem 20675MB
[2025-04-03 02:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][186/573]	eta 0:05:43 lr 0.000669	time 0.8774 (0.8870)	loss 0.5756 (0.5079)	grad_norm 2.3542 (2.8019)	mem 20675MB
[2025-04-03 02:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][188/573]	eta 0:05:41 lr 0.000669	time 0.8775 (0.8869)	loss 0.5329 (0.5075)	grad_norm 2.1677 (2.7965)	mem 20675MB
[2025-04-03 02:32:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][190/573]	eta 0:05:39 lr 0.000669	time 0.8774 (0.8869)	loss 0.4234 (0.5071)	grad_norm 2.9909 (2.7920)	mem 20675MB
[2025-04-03 02:32:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][192/573]	eta 0:05:37 lr 0.000669	time 0.8771 (0.8868)	loss 0.5325 (0.5072)	grad_norm 1.9929 (2.7887)	mem 20675MB
[2025-04-03 02:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][194/573]	eta 0:05:36 lr 0.000668	time 0.8779 (0.8867)	loss 0.5715 (0.5076)	grad_norm 2.8989 (2.7890)	mem 20675MB
[2025-04-03 02:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][196/573]	eta 0:05:34 lr 0.000668	time 0.8774 (0.8866)	loss 0.3965 (0.5069)	grad_norm 2.2387 (2.7919)	mem 20675MB
[2025-04-03 02:32:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][198/573]	eta 0:05:32 lr 0.000668	time 0.8776 (0.8865)	loss 0.5635 (0.5068)	grad_norm 2.1923 (2.7993)	mem 20675MB
[2025-04-03 02:32:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][200/573]	eta 0:05:30 lr 0.000668	time 0.8771 (0.8864)	loss 0.5049 (0.5073)	grad_norm 2.3056 (2.7956)	mem 20675MB
[2025-04-03 02:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][202/573]	eta 0:05:28 lr 0.000667	time 0.8770 (0.8864)	loss 0.3326 (0.5068)	grad_norm 3.7584 (2.7999)	mem 20675MB
[2025-04-03 02:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][204/573]	eta 0:05:27 lr 0.000667	time 0.8772 (0.8863)	loss 0.4459 (0.5059)	grad_norm 4.9019 (2.8137)	mem 20675MB
[2025-04-03 02:32:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][206/573]	eta 0:05:25 lr 0.000667	time 0.8775 (0.8862)	loss 0.5108 (0.5062)	grad_norm 1.6782 (2.8036)	mem 20675MB
[2025-04-03 02:32:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][208/573]	eta 0:05:23 lr 0.000667	time 0.8771 (0.8861)	loss 0.4279 (0.5060)	grad_norm 2.3994 (2.7959)	mem 20675MB
[2025-04-03 02:32:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][210/573]	eta 0:05:21 lr 0.000667	time 0.8773 (0.8861)	loss 0.4771 (0.5053)	grad_norm 4.0360 (2.8241)	mem 20675MB
[2025-04-03 02:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][212/573]	eta 0:05:19 lr 0.000666	time 0.8775 (0.8860)	loss 0.5091 (0.5055)	grad_norm 2.6525 (2.8231)	mem 20675MB
[2025-04-03 02:32:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][214/573]	eta 0:05:18 lr 0.000666	time 0.8774 (0.8859)	loss 0.5013 (0.5055)	grad_norm 3.1215 (2.8385)	mem 20675MB
[2025-04-03 02:33:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][216/573]	eta 0:05:16 lr 0.000666	time 0.8773 (0.8858)	loss 0.5973 (0.5056)	grad_norm 2.4448 (2.8373)	mem 20675MB
[2025-04-03 02:33:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][218/573]	eta 0:05:14 lr 0.000666	time 0.8774 (0.8858)	loss 0.5527 (0.5060)	grad_norm 3.9006 (2.8391)	mem 20675MB
[2025-04-03 02:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][220/573]	eta 0:05:12 lr 0.000665	time 0.8772 (0.8857)	loss 0.3752 (0.5058)	grad_norm 2.7727 (2.8348)	mem 20675MB
[2025-04-03 02:33:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][222/573]	eta 0:05:10 lr 0.000665	time 0.8771 (0.8856)	loss 0.5297 (0.5060)	grad_norm 2.7106 (2.8303)	mem 20675MB
[2025-04-03 02:33:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][224/573]	eta 0:05:09 lr 0.000665	time 0.8770 (0.8856)	loss 0.5775 (0.5067)	grad_norm 2.3801 (2.8242)	mem 20675MB
[2025-04-03 02:33:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][226/573]	eta 0:05:07 lr 0.000665	time 0.8776 (0.8855)	loss 0.4512 (0.5058)	grad_norm 3.3958 (2.8258)	mem 20675MB
[2025-04-03 02:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][228/573]	eta 0:05:05 lr 0.000664	time 0.8774 (0.8854)	loss 0.5453 (0.5062)	grad_norm 2.3849 (2.8248)	mem 20675MB
[2025-04-03 02:33:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][230/573]	eta 0:05:03 lr 0.000664	time 0.8780 (0.8854)	loss 0.5467 (0.5069)	grad_norm 1.9561 (2.8201)	mem 20675MB
[2025-04-03 02:33:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][232/573]	eta 0:05:01 lr 0.000664	time 0.8773 (0.8853)	loss 0.5985 (0.5069)	grad_norm 1.9304 (2.8184)	mem 20675MB
[2025-04-03 02:33:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][234/573]	eta 0:05:00 lr 0.000664	time 0.8773 (0.8853)	loss 0.5997 (0.5075)	grad_norm 1.7575 (2.8107)	mem 20675MB
[2025-04-03 02:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][236/573]	eta 0:04:58 lr 0.000664	time 0.8774 (0.8852)	loss 0.5012 (0.5077)	grad_norm 3.3751 (2.8085)	mem 20675MB
[2025-04-03 02:33:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][238/573]	eta 0:04:56 lr 0.000663	time 0.8772 (0.8852)	loss 0.4470 (0.5078)	grad_norm 2.7087 (2.8063)	mem 20675MB
[2025-04-03 02:33:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][240/573]	eta 0:04:54 lr 0.000663	time 0.8773 (0.8851)	loss 0.5404 (0.5081)	grad_norm 2.1319 (2.8027)	mem 20675MB
[2025-04-03 02:33:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][242/573]	eta 0:04:52 lr 0.000663	time 0.8776 (0.8850)	loss 0.5936 (0.5086)	grad_norm 2.0926 (2.7991)	mem 20675MB
[2025-04-03 02:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][244/573]	eta 0:04:51 lr 0.000663	time 0.8772 (0.8850)	loss 0.5737 (0.5092)	grad_norm 2.1000 (2.7928)	mem 20675MB
[2025-04-03 02:33:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][246/573]	eta 0:04:49 lr 0.000662	time 0.8771 (0.8849)	loss 0.5426 (0.5093)	grad_norm 1.7089 (2.7930)	mem 20675MB
[2025-04-03 02:33:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][248/573]	eta 0:04:47 lr 0.000662	time 0.8772 (0.8849)	loss 0.5655 (0.5097)	grad_norm 5.4334 (2.8025)	mem 20675MB
[2025-04-03 02:33:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][250/573]	eta 0:04:45 lr 0.000662	time 0.8770 (0.8848)	loss 0.5681 (0.5100)	grad_norm 1.8298 (2.7936)	mem 20675MB
[2025-04-03 02:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][252/573]	eta 0:04:44 lr 0.000662	time 0.8770 (0.8848)	loss 0.5605 (0.5098)	grad_norm 1.3871 (2.7883)	mem 20675MB
[2025-04-03 02:33:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][254/573]	eta 0:04:42 lr 0.000662	time 0.8771 (0.8847)	loss 0.6265 (0.5103)	grad_norm 1.8562 (2.7863)	mem 20675MB
[2025-04-03 02:33:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][256/573]	eta 0:04:40 lr 0.000661	time 0.8773 (0.8847)	loss 0.5274 (0.5102)	grad_norm 5.7905 (2.7951)	mem 20675MB
[2025-04-03 02:33:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][258/573]	eta 0:04:38 lr 0.000661	time 0.8775 (0.8846)	loss 0.4795 (0.5096)	grad_norm 1.8723 (2.7942)	mem 20675MB
[2025-04-03 02:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][260/573]	eta 0:04:36 lr 0.000661	time 0.8772 (0.8846)	loss 0.6333 (0.5105)	grad_norm 2.7263 (2.7917)	mem 20675MB
[2025-04-03 02:33:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][262/573]	eta 0:04:35 lr 0.000661	time 0.8775 (0.8845)	loss 0.5033 (0.5107)	grad_norm 1.9364 (2.7857)	mem 20675MB
[2025-04-03 02:33:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][264/573]	eta 0:04:33 lr 0.000660	time 0.8776 (0.8845)	loss 0.4667 (0.5107)	grad_norm 2.5645 (2.7825)	mem 20675MB
[2025-04-03 02:33:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][266/573]	eta 0:04:31 lr 0.000660	time 0.8776 (0.8844)	loss 0.4946 (0.5104)	grad_norm 2.4290 (2.7785)	mem 20675MB
[2025-04-03 02:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][268/573]	eta 0:04:29 lr 0.000660	time 0.8775 (0.8844)	loss 0.5623 (0.5106)	grad_norm 2.3653 (2.7743)	mem 20675MB
[2025-04-03 02:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][270/573]	eta 0:04:27 lr 0.000660	time 0.8778 (0.8844)	loss 0.5693 (0.5108)	grad_norm 2.2996 (2.7733)	mem 20675MB
[2025-04-03 02:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][272/573]	eta 0:04:26 lr 0.000659	time 0.8771 (0.8843)	loss 0.4550 (0.5104)	grad_norm 2.8035 (2.7720)	mem 20675MB
[2025-04-03 02:33:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][274/573]	eta 0:04:24 lr 0.000659	time 0.8774 (0.8843)	loss 0.5395 (0.5105)	grad_norm 2.1370 (2.7738)	mem 20675MB
[2025-04-03 02:33:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][276/573]	eta 0:04:22 lr 0.000659	time 0.8774 (0.8842)	loss 0.4617 (0.5097)	grad_norm 3.0021 (2.7836)	mem 20675MB
[2025-04-03 02:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][278/573]	eta 0:04:20 lr 0.000659	time 0.8778 (0.8842)	loss 0.5879 (0.5101)	grad_norm 2.1183 (2.7803)	mem 20675MB
[2025-04-03 02:33:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][280/573]	eta 0:04:19 lr 0.000659	time 0.8772 (0.8842)	loss 0.3679 (0.5094)	grad_norm 2.3803 (2.7764)	mem 20675MB
[2025-04-03 02:33:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][282/573]	eta 0:04:17 lr 0.000658	time 0.8774 (0.8841)	loss 0.4996 (0.5096)	grad_norm 4.3205 (2.7801)	mem 20675MB
[2025-04-03 02:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][284/573]	eta 0:04:15 lr 0.000658	time 0.8773 (0.8841)	loss 0.3717 (0.5086)	grad_norm 3.3413 (2.7805)	mem 20675MB
[2025-04-03 02:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][286/573]	eta 0:04:13 lr 0.000658	time 0.8774 (0.8840)	loss 0.4347 (0.5085)	grad_norm 3.6960 (2.7824)	mem 20675MB
[2025-04-03 02:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][288/573]	eta 0:04:11 lr 0.000658	time 0.8773 (0.8840)	loss 0.5337 (0.5085)	grad_norm 2.1243 (2.7839)	mem 20675MB
[2025-04-03 02:34:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][290/573]	eta 0:04:10 lr 0.000657	time 0.8772 (0.8840)	loss 0.5762 (0.5092)	grad_norm 3.3978 (2.7860)	mem 20675MB
[2025-04-03 02:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][292/573]	eta 0:04:08 lr 0.000657	time 0.8773 (0.8839)	loss 0.4264 (0.5088)	grad_norm 3.1685 (2.7985)	mem 20675MB
[2025-04-03 02:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][294/573]	eta 0:04:06 lr 0.000657	time 0.8776 (0.8839)	loss 0.5229 (0.5086)	grad_norm 1.9514 (2.7973)	mem 20675MB
[2025-04-03 02:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][296/573]	eta 0:04:04 lr 0.000657	time 0.8773 (0.8839)	loss 0.4577 (0.5087)	grad_norm 3.1070 (2.7991)	mem 20675MB
[2025-04-03 02:34:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][298/573]	eta 0:04:03 lr 0.000657	time 0.8773 (0.8838)	loss 0.5176 (0.5088)	grad_norm 1.7981 (2.7945)	mem 20675MB
[2025-04-03 02:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][300/573]	eta 0:04:01 lr 0.000656	time 0.8777 (0.8838)	loss 0.5933 (0.5093)	grad_norm 2.2263 (2.7912)	mem 20675MB
[2025-04-03 02:34:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][302/573]	eta 0:03:59 lr 0.000656	time 0.8773 (0.8837)	loss 0.3623 (0.5088)	grad_norm 3.7251 (2.7904)	mem 20675MB
[2025-04-03 02:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][304/573]	eta 0:03:57 lr 0.000656	time 0.8775 (0.8837)	loss 0.3965 (0.5089)	grad_norm 4.3865 (2.7946)	mem 20675MB
[2025-04-03 02:34:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][306/573]	eta 0:03:55 lr 0.000656	time 0.8772 (0.8837)	loss 0.4978 (0.5093)	grad_norm 2.4187 (2.7930)	mem 20675MB
[2025-04-03 02:34:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][308/573]	eta 0:03:54 lr 0.000655	time 0.8774 (0.8836)	loss 0.4127 (0.5090)	grad_norm 3.4999 (2.7930)	mem 20675MB
[2025-04-03 02:34:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][310/573]	eta 0:03:52 lr 0.000655	time 0.8776 (0.8836)	loss 0.4741 (0.5091)	grad_norm 2.3334 (2.7885)	mem 20675MB
[2025-04-03 02:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][312/573]	eta 0:03:50 lr 0.000655	time 0.8774 (0.8836)	loss 0.6048 (0.5096)	grad_norm 2.3956 (2.7843)	mem 20675MB
[2025-04-03 02:34:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][314/573]	eta 0:03:48 lr 0.000655	time 0.8776 (0.8835)	loss 0.4659 (0.5095)	grad_norm 2.6717 (2.7828)	mem 20675MB
[2025-04-03 02:34:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][316/573]	eta 0:03:47 lr 0.000654	time 0.8777 (0.8835)	loss 0.6152 (0.5095)	grad_norm 2.0283 (2.7805)	mem 20675MB
[2025-04-03 02:34:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][318/573]	eta 0:03:45 lr 0.000654	time 0.8775 (0.8835)	loss 0.4910 (0.5091)	grad_norm 3.7654 (2.7804)	mem 20675MB
[2025-04-03 02:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][320/573]	eta 0:03:43 lr 0.000654	time 0.8775 (0.8834)	loss 0.5364 (0.5093)	grad_norm 2.3022 (2.7764)	mem 20675MB
[2025-04-03 02:34:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][322/573]	eta 0:03:41 lr 0.000654	time 0.8774 (0.8834)	loss 0.4135 (0.5088)	grad_norm 7.6655 (2.7986)	mem 20675MB
[2025-04-03 02:34:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][324/573]	eta 0:03:39 lr 0.000654	time 0.8774 (0.8834)	loss 0.5089 (0.5083)	grad_norm 2.5156 (2.8015)	mem 20675MB
[2025-04-03 02:34:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][326/573]	eta 0:03:38 lr 0.000653	time 0.8770 (0.8834)	loss 0.5554 (0.5082)	grad_norm 2.5427 (2.8012)	mem 20675MB
[2025-04-03 02:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][328/573]	eta 0:03:36 lr 0.000653	time 0.8772 (0.8833)	loss 0.4585 (0.5083)	grad_norm 2.2334 (2.7971)	mem 20675MB
[2025-04-03 02:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][330/573]	eta 0:03:34 lr 0.000653	time 0.8772 (0.8833)	loss 0.5527 (0.5085)	grad_norm 2.4441 (2.7979)	mem 20675MB
[2025-04-03 02:34:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][332/573]	eta 0:03:32 lr 0.000653	time 0.8774 (0.8833)	loss 0.4681 (0.5084)	grad_norm 5.0369 (2.8079)	mem 20675MB
[2025-04-03 02:34:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][334/573]	eta 0:03:31 lr 0.000652	time 0.8775 (0.8832)	loss 0.5513 (0.5084)	grad_norm 2.1569 (2.8073)	mem 20675MB
[2025-04-03 02:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][336/573]	eta 0:03:29 lr 0.000652	time 0.8777 (0.8832)	loss 0.3976 (0.5085)	grad_norm 3.0810 (2.8074)	mem 20675MB
[2025-04-03 02:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][338/573]	eta 0:03:27 lr 0.000652	time 0.8776 (0.8832)	loss 0.6017 (0.5088)	grad_norm 2.1684 (2.8054)	mem 20675MB
[2025-04-03 02:34:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][340/573]	eta 0:03:25 lr 0.000652	time 0.8776 (0.8831)	loss 0.5686 (0.5093)	grad_norm 1.9930 (2.8022)	mem 20675MB
[2025-04-03 02:34:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][342/573]	eta 0:03:24 lr 0.000651	time 0.8777 (0.8831)	loss 0.4502 (0.5090)	grad_norm 2.8009 (2.8028)	mem 20675MB
[2025-04-03 02:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][344/573]	eta 0:03:22 lr 0.000651	time 0.8778 (0.8831)	loss 0.4048 (0.5088)	grad_norm 3.0117 (2.8000)	mem 20675MB
[2025-04-03 02:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][346/573]	eta 0:03:20 lr 0.000651	time 0.8775 (0.8831)	loss 0.3884 (0.5086)	grad_norm 2.6725 (2.7984)	mem 20675MB
[2025-04-03 02:34:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][348/573]	eta 0:03:18 lr 0.000651	time 0.8775 (0.8830)	loss 0.4917 (0.5089)	grad_norm 2.9314 (2.7965)	mem 20675MB
[2025-04-03 02:34:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][350/573]	eta 0:03:16 lr 0.000651	time 0.8775 (0.8830)	loss 0.5828 (0.5093)	grad_norm 1.3620 (2.7899)	mem 20675MB
[2025-04-03 02:34:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][352/573]	eta 0:03:15 lr 0.000650	time 0.8780 (0.8830)	loss 0.5002 (0.5095)	grad_norm 1.9361 (2.7855)	mem 20675MB
[2025-04-03 02:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][354/573]	eta 0:03:13 lr 0.000650	time 0.8782 (0.8830)	loss 0.4662 (0.5092)	grad_norm 2.3226 (2.7829)	mem 20675MB
[2025-04-03 02:35:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][356/573]	eta 0:03:11 lr 0.000650	time 0.8775 (0.8829)	loss 0.5125 (0.5093)	grad_norm 2.7408 (2.7834)	mem 20675MB
[2025-04-03 02:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][358/573]	eta 0:03:09 lr 0.000650	time 0.8776 (0.8829)	loss 0.5638 (0.5098)	grad_norm 5.0362 (2.7921)	mem 20675MB
[2025-04-03 02:35:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][360/573]	eta 0:03:08 lr 0.000649	time 0.8773 (0.8829)	loss 0.5219 (0.5100)	grad_norm 2.4681 (2.7900)	mem 20675MB
[2025-04-03 02:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][362/573]	eta 0:03:06 lr 0.000649	time 0.8773 (0.8829)	loss 0.4470 (0.5099)	grad_norm 3.4261 (2.7898)	mem 20675MB
[2025-04-03 02:35:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][364/573]	eta 0:03:04 lr 0.000649	time 0.8775 (0.8828)	loss 0.5386 (0.5103)	grad_norm 1.4983 (2.7853)	mem 20675MB
[2025-04-03 02:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][366/573]	eta 0:03:02 lr 0.000649	time 0.8775 (0.8828)	loss 0.5883 (0.5107)	grad_norm 1.5763 (2.7787)	mem 20675MB
[2025-04-03 02:35:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][368/573]	eta 0:03:00 lr 0.000649	time 0.8778 (0.8828)	loss 0.4881 (0.5107)	grad_norm 2.1901 (2.7759)	mem 20675MB
[2025-04-03 02:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][370/573]	eta 0:02:59 lr 0.000648	time 0.8774 (0.8828)	loss 0.4792 (0.5106)	grad_norm 2.3178 (2.7726)	mem 20675MB
[2025-04-03 02:35:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][372/573]	eta 0:02:57 lr 0.000648	time 0.8775 (0.8828)	loss 0.5540 (0.5105)	grad_norm 1.9791 (2.7692)	mem 20675MB
[2025-04-03 02:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][374/573]	eta 0:02:55 lr 0.000648	time 0.8774 (0.8827)	loss 0.4807 (0.5107)	grad_norm 3.3280 (2.7688)	mem 20675MB
[2025-04-03 02:35:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][376/573]	eta 0:02:53 lr 0.000648	time 0.8773 (0.8827)	loss 0.3909 (0.5106)	grad_norm 3.1545 (2.7698)	mem 20675MB
[2025-04-03 02:35:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][378/573]	eta 0:02:52 lr 0.000647	time 0.8774 (0.8827)	loss 0.5523 (0.5107)	grad_norm 3.3771 (2.7732)	mem 20675MB
[2025-04-03 02:35:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][380/573]	eta 0:02:50 lr 0.000647	time 0.8776 (0.8827)	loss 0.6249 (0.5108)	grad_norm 3.3630 (2.7811)	mem 20675MB
[2025-04-03 02:35:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][382/573]	eta 0:02:48 lr 0.000647	time 0.8773 (0.8826)	loss 0.5301 (0.5110)	grad_norm 1.9736 (2.7789)	mem 20675MB
[2025-04-03 02:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][384/573]	eta 0:02:46 lr 0.000647	time 0.8773 (0.8826)	loss 0.5082 (0.5109)	grad_norm 3.2218 (2.7785)	mem 20675MB
[2025-04-03 02:35:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][386/573]	eta 0:02:45 lr 0.000646	time 0.8772 (0.8826)	loss 0.5702 (0.5112)	grad_norm 2.5676 (2.7767)	mem 20675MB
[2025-04-03 02:35:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][388/573]	eta 0:02:43 lr 0.000646	time 0.8772 (0.8826)	loss 0.4794 (0.5112)	grad_norm 2.2305 (2.7738)	mem 20675MB
[2025-04-03 02:35:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][390/573]	eta 0:02:41 lr 0.000646	time 0.8771 (0.8826)	loss 0.4742 (0.5109)	grad_norm 3.9456 (2.7794)	mem 20675MB
[2025-04-03 02:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][392/573]	eta 0:02:39 lr 0.000646	time 0.8772 (0.8825)	loss 0.5163 (0.5105)	grad_norm 2.1904 (2.7790)	mem 20675MB
[2025-04-03 02:35:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][394/573]	eta 0:02:37 lr 0.000646	time 0.8774 (0.8825)	loss 0.4611 (0.5101)	grad_norm 2.9114 (2.7787)	mem 20675MB
[2025-04-03 02:35:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][396/573]	eta 0:02:36 lr 0.000645	time 0.8772 (0.8825)	loss 0.5083 (0.5100)	grad_norm 4.6923 (2.7830)	mem 20675MB
[2025-04-03 02:35:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][398/573]	eta 0:02:34 lr 0.000645	time 0.8772 (0.8825)	loss 0.5522 (0.5099)	grad_norm 3.4664 (2.7888)	mem 20675MB
[2025-04-03 02:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][400/573]	eta 0:02:32 lr 0.000645	time 0.8773 (0.8824)	loss 0.6045 (0.5103)	grad_norm 3.4778 (2.7907)	mem 20675MB
[2025-04-03 02:35:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][402/573]	eta 0:02:30 lr 0.000645	time 0.8775 (0.8824)	loss 0.4859 (0.5104)	grad_norm 2.7260 (2.7929)	mem 20675MB
[2025-04-03 02:35:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][404/573]	eta 0:02:29 lr 0.000644	time 0.8771 (0.8824)	loss 0.5965 (0.5102)	grad_norm 3.4399 (2.7975)	mem 20675MB
[2025-04-03 02:35:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][406/573]	eta 0:02:27 lr 0.000644	time 0.8773 (0.8824)	loss 0.6115 (0.5103)	grad_norm 3.0772 (2.8005)	mem 20675MB
[2025-04-03 02:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][408/573]	eta 0:02:25 lr 0.000644	time 0.8772 (0.8824)	loss 0.5313 (0.5105)	grad_norm 2.1741 (2.7960)	mem 20675MB
[2025-04-03 02:35:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][410/573]	eta 0:02:23 lr 0.000644	time 0.8770 (0.8823)	loss 0.4437 (0.5104)	grad_norm 2.8096 (2.7946)	mem 20675MB
[2025-04-03 02:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][412/573]	eta 0:02:22 lr 0.000644	time 0.8772 (0.8823)	loss 0.4315 (0.5101)	grad_norm 1.9997 (2.7921)	mem 20675MB
[2025-04-03 02:35:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][414/573]	eta 0:02:20 lr 0.000643	time 0.8776 (0.8823)	loss 0.5650 (0.5104)	grad_norm 1.9406 (2.7887)	mem 20675MB
[2025-04-03 02:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][416/573]	eta 0:02:18 lr 0.000643	time 0.8772 (0.8823)	loss 0.3879 (0.5101)	grad_norm 2.5814 (2.7865)	mem 20675MB
[2025-04-03 02:35:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][418/573]	eta 0:02:16 lr 0.000643	time 0.8773 (0.8823)	loss 0.5356 (0.5102)	grad_norm 1.9420 (2.7831)	mem 20675MB
[2025-04-03 02:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][420/573]	eta 0:02:14 lr 0.000643	time 0.8774 (0.8822)	loss 0.5886 (0.5106)	grad_norm 2.6520 (2.7814)	mem 20675MB
[2025-04-03 02:36:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][422/573]	eta 0:02:13 lr 0.000642	time 0.8780 (0.8822)	loss 0.5965 (0.5108)	grad_norm 2.1625 (2.7779)	mem 20675MB
[2025-04-03 02:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][424/573]	eta 0:02:11 lr 0.000642	time 0.8772 (0.8822)	loss 0.5305 (0.5109)	grad_norm 5.2219 (2.7815)	mem 20675MB
[2025-04-03 02:36:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][426/573]	eta 0:02:09 lr 0.000642	time 0.8773 (0.8822)	loss 0.5098 (0.5107)	grad_norm 2.0353 (2.7792)	mem 20675MB
[2025-04-03 02:36:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][428/573]	eta 0:02:07 lr 0.000642	time 0.8775 (0.8822)	loss 0.6074 (0.5111)	grad_norm 2.7756 (2.7774)	mem 20675MB
[2025-04-03 02:36:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][430/573]	eta 0:02:06 lr 0.000641	time 0.8773 (0.8822)	loss 0.5144 (0.5112)	grad_norm 2.2244 (2.7731)	mem 20675MB
[2025-04-03 02:36:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][432/573]	eta 0:02:04 lr 0.000641	time 0.8775 (0.8821)	loss 0.5418 (0.5114)	grad_norm 3.2095 (2.7737)	mem 20675MB
[2025-04-03 02:36:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][434/573]	eta 0:02:02 lr 0.000641	time 0.8775 (0.8821)	loss 0.5647 (0.5117)	grad_norm 2.3002 (2.7700)	mem 20675MB
[2025-04-03 02:36:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][436/573]	eta 0:02:00 lr 0.000641	time 0.8776 (0.8821)	loss 0.5739 (0.5117)	grad_norm 2.0329 (2.7666)	mem 20675MB
[2025-04-03 02:36:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][438/573]	eta 0:01:59 lr 0.000641	time 0.8772 (0.8821)	loss 0.5278 (0.5119)	grad_norm 2.5092 (2.7646)	mem 20675MB
[2025-04-03 02:36:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][440/573]	eta 0:01:57 lr 0.000640	time 0.8775 (0.8821)	loss 0.3702 (0.5116)	grad_norm 2.9567 (2.7675)	mem 20675MB
[2025-04-03 02:36:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][442/573]	eta 0:01:55 lr 0.000640	time 0.8774 (0.8821)	loss 0.4482 (0.5117)	grad_norm 2.9343 (2.7661)	mem 20675MB
[2025-04-03 02:36:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][444/573]	eta 0:01:53 lr 0.000640	time 0.8775 (0.8820)	loss 0.4870 (0.5118)	grad_norm 2.3153 (2.7631)	mem 20675MB
[2025-04-03 02:36:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][446/573]	eta 0:01:52 lr 0.000640	time 0.8777 (0.8820)	loss 0.4998 (0.5115)	grad_norm 3.1118 (2.7641)	mem 20675MB
[2025-04-03 02:36:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][448/573]	eta 0:01:50 lr 0.000639	time 0.8773 (0.8820)	loss 0.4160 (0.5112)	grad_norm 4.4016 (2.7699)	mem 20675MB
[2025-04-03 02:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][450/573]	eta 0:01:48 lr 0.000639	time 0.8787 (0.8820)	loss 0.5709 (0.5112)	grad_norm 3.2369 (2.7715)	mem 20675MB
[2025-04-03 02:36:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][452/573]	eta 0:01:46 lr 0.000639	time 0.8775 (0.8820)	loss 0.5789 (0.5111)	grad_norm 2.8477 (2.7762)	mem 20675MB
[2025-04-03 02:36:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][454/573]	eta 0:01:44 lr 0.000639	time 0.8772 (0.8820)	loss 0.4488 (0.5109)	grad_norm 3.4534 (2.7765)	mem 20675MB
[2025-04-03 02:36:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][456/573]	eta 0:01:43 lr 0.000638	time 0.8771 (0.8819)	loss 0.5176 (0.5107)	grad_norm 3.1658 (2.7787)	mem 20675MB
[2025-04-03 02:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][458/573]	eta 0:01:41 lr 0.000638	time 0.8771 (0.8819)	loss 0.5543 (0.5108)	grad_norm 2.3498 (2.7798)	mem 20675MB
[2025-04-03 02:36:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][460/573]	eta 0:01:39 lr 0.000638	time 0.8775 (0.8819)	loss 0.4448 (0.5105)	grad_norm 3.3961 (2.7813)	mem 20675MB
[2025-04-03 02:36:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][462/573]	eta 0:01:37 lr 0.000638	time 0.8773 (0.8819)	loss 0.4915 (0.5104)	grad_norm 2.1119 (2.7789)	mem 20675MB
[2025-04-03 02:36:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][464/573]	eta 0:01:36 lr 0.000638	time 0.8773 (0.8819)	loss 0.5640 (0.5103)	grad_norm 2.2621 (2.7788)	mem 20675MB
[2025-04-03 02:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][466/573]	eta 0:01:34 lr 0.000637	time 0.8774 (0.8819)	loss 0.6379 (0.5108)	grad_norm 2.9433 (2.7786)	mem 20675MB
[2025-04-03 02:36:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][468/573]	eta 0:01:32 lr 0.000637	time 0.8774 (0.8818)	loss 0.4044 (0.5107)	grad_norm 1.7727 (2.7751)	mem 20675MB
[2025-04-03 02:36:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][470/573]	eta 0:01:30 lr 0.000637	time 0.8774 (0.8818)	loss 0.6091 (0.5111)	grad_norm 2.3634 (2.7745)	mem 20675MB
[2025-04-03 02:36:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][472/573]	eta 0:01:29 lr 0.000637	time 0.8771 (0.8818)	loss 0.5359 (0.5110)	grad_norm 1.9706 (2.7711)	mem 20675MB
[2025-04-03 02:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][474/573]	eta 0:01:27 lr 0.000636	time 0.8773 (0.8818)	loss 0.5840 (0.5113)	grad_norm 3.3949 (2.7706)	mem 20675MB
[2025-04-03 02:36:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][476/573]	eta 0:01:25 lr 0.000636	time 0.8774 (0.8818)	loss 0.3978 (0.5110)	grad_norm 7.5088 (2.7807)	mem 20675MB
[2025-04-03 02:36:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][478/573]	eta 0:01:23 lr 0.000636	time 0.8788 (0.8818)	loss 0.5519 (0.5112)	grad_norm 1.8734 (2.7788)	mem 20675MB
[2025-04-03 02:36:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][480/573]	eta 0:01:22 lr 0.000636	time 0.8771 (0.8818)	loss 0.5442 (0.5112)	grad_norm 3.0236 (2.7787)	mem 20675MB
[2025-04-03 02:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][482/573]	eta 0:01:20 lr 0.000636	time 0.8773 (0.8817)	loss 0.5756 (0.5114)	grad_norm 2.5652 (2.7761)	mem 20675MB
[2025-04-03 02:36:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][484/573]	eta 0:01:18 lr 0.000635	time 0.8775 (0.8817)	loss 0.5997 (0.5114)	grad_norm 2.0428 (2.7724)	mem 20675MB
[2025-04-03 02:36:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][486/573]	eta 0:01:16 lr 0.000635	time 0.8775 (0.8817)	loss 0.5538 (0.5114)	grad_norm 2.4480 (2.7702)	mem 20675MB
[2025-04-03 02:36:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][488/573]	eta 0:01:14 lr 0.000635	time 0.8774 (0.8817)	loss 0.5096 (0.5115)	grad_norm 2.5634 (2.7698)	mem 20675MB
[2025-04-03 02:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][490/573]	eta 0:01:13 lr 0.000635	time 0.8771 (0.8817)	loss 0.4074 (0.5113)	grad_norm 3.3650 (2.7724)	mem 20675MB
[2025-04-03 02:37:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][492/573]	eta 0:01:11 lr 0.000634	time 0.8786 (0.8817)	loss 0.4530 (0.5113)	grad_norm 2.8756 (2.7723)	mem 20675MB
[2025-04-03 02:37:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][494/573]	eta 0:01:09 lr 0.000634	time 0.8773 (0.8817)	loss 0.3494 (0.5109)	grad_norm 2.9418 (2.7732)	mem 20675MB
[2025-04-03 02:37:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][496/573]	eta 0:01:07 lr 0.000634	time 0.8773 (0.8816)	loss 0.5623 (0.5107)	grad_norm 2.7112 (2.7792)	mem 20675MB
[2025-04-03 02:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][498/573]	eta 0:01:06 lr 0.000634	time 0.8770 (0.8816)	loss 0.5988 (0.5109)	grad_norm 3.2691 (2.7822)	mem 20675MB
[2025-04-03 02:37:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][500/573]	eta 0:01:04 lr 0.000633	time 0.8773 (0.8816)	loss 0.4810 (0.5107)	grad_norm 1.8983 (2.7798)	mem 20675MB
[2025-04-03 02:37:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][502/573]	eta 0:01:02 lr 0.000633	time 0.8793 (0.8816)	loss 0.5704 (0.5109)	grad_norm 1.5561 (2.7770)	mem 20675MB
[2025-04-03 02:37:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][504/573]	eta 0:01:00 lr 0.000633	time 0.8777 (0.8816)	loss 0.4319 (0.5108)	grad_norm 4.3046 (2.7789)	mem 20675MB
[2025-04-03 02:37:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][506/573]	eta 0:00:59 lr 0.000633	time 0.8777 (0.8816)	loss 0.6037 (0.5110)	grad_norm 2.3969 (2.7775)	mem 20675MB
[2025-04-03 02:37:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][508/573]	eta 0:00:57 lr 0.000633	time 0.8772 (0.8816)	loss 0.5983 (0.5112)	grad_norm 2.9167 (2.7761)	mem 20675MB
[2025-04-03 02:37:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][510/573]	eta 0:00:55 lr 0.000632	time 0.8774 (0.8816)	loss 0.5598 (0.5114)	grad_norm 3.2605 (2.7759)	mem 20675MB
[2025-04-03 02:37:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][512/573]	eta 0:00:53 lr 0.000632	time 0.8777 (0.8816)	loss 0.4939 (0.5114)	grad_norm 3.2912 (2.7759)	mem 20675MB
[2025-04-03 02:37:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][514/573]	eta 0:00:52 lr 0.000632	time 0.8772 (0.8815)	loss 0.5254 (0.5115)	grad_norm 2.0064 (2.7752)	mem 20675MB
[2025-04-03 02:37:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][516/573]	eta 0:00:50 lr 0.000632	time 0.8774 (0.8815)	loss 0.5651 (0.5118)	grad_norm 1.5340 (2.7709)	mem 20675MB
[2025-04-03 02:37:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][518/573]	eta 0:00:48 lr 0.000631	time 0.8772 (0.8815)	loss 0.5490 (0.5120)	grad_norm 2.0016 (2.7671)	mem 20675MB
[2025-04-03 02:37:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][520/573]	eta 0:00:46 lr 0.000631	time 0.8775 (0.8815)	loss 0.4148 (0.5119)	grad_norm 2.1934 (2.7640)	mem 20675MB
[2025-04-03 02:37:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][522/573]	eta 0:00:44 lr 0.000631	time 0.8771 (0.8815)	loss 0.4306 (0.5118)	grad_norm 2.0463 (2.7622)	mem 20675MB
[2025-04-03 02:37:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][524/573]	eta 0:00:43 lr 0.000631	time 0.8774 (0.8815)	loss 0.4198 (0.5118)	grad_norm 3.4860 (2.7648)	mem 20675MB
[2025-04-03 02:37:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][526/573]	eta 0:00:41 lr 0.000630	time 0.8775 (0.8815)	loss 0.5685 (0.5119)	grad_norm 1.5946 (2.7610)	mem 20675MB
[2025-04-03 02:37:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][528/573]	eta 0:00:39 lr 0.000630	time 0.8772 (0.8815)	loss 0.5590 (0.5120)	grad_norm 2.0620 (2.7591)	mem 20675MB
[2025-04-03 02:37:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][530/573]	eta 0:00:37 lr 0.000630	time 0.8773 (0.8814)	loss 0.5569 (0.5122)	grad_norm 2.2884 (2.7565)	mem 20675MB
[2025-04-03 02:37:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][532/573]	eta 0:00:36 lr 0.000630	time 0.8776 (0.8814)	loss 0.4376 (0.5122)	grad_norm 2.8271 (2.7548)	mem 20675MB
[2025-04-03 02:37:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][534/573]	eta 0:00:34 lr 0.000630	time 0.8775 (0.8814)	loss 0.5037 (0.5123)	grad_norm 2.1217 (2.7525)	mem 20675MB
[2025-04-03 02:37:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][536/573]	eta 0:00:32 lr 0.000629	time 0.8774 (0.8814)	loss 0.3933 (0.5120)	grad_norm 5.0121 (2.7586)	mem 20675MB
[2025-04-03 02:37:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][538/573]	eta 0:00:30 lr 0.000629	time 0.8772 (0.8814)	loss 0.5395 (0.5121)	grad_norm 2.2951 (2.7589)	mem 20675MB
[2025-04-03 02:37:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][540/573]	eta 0:00:29 lr 0.000629	time 0.8772 (0.8814)	loss 0.5920 (0.5121)	grad_norm 2.0100 (2.7587)	mem 20675MB
[2025-04-03 02:37:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][542/573]	eta 0:00:27 lr 0.000629	time 0.8771 (0.8814)	loss 0.5633 (0.5123)	grad_norm 2.0006 (2.7582)	mem 20675MB
[2025-04-03 02:37:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][544/573]	eta 0:00:25 lr 0.000628	time 0.8771 (0.8814)	loss 0.5565 (0.5125)	grad_norm 2.3353 (2.7575)	mem 20675MB
[2025-04-03 02:37:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][546/573]	eta 0:00:23 lr 0.000628	time 0.8772 (0.8814)	loss 0.4285 (0.5123)	grad_norm 2.7186 (2.7573)	mem 20675MB
[2025-04-03 02:37:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][548/573]	eta 0:00:22 lr 0.000628	time 0.8772 (0.8813)	loss 0.5412 (0.5122)	grad_norm 1.7410 (2.7559)	mem 20675MB
[2025-04-03 02:37:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][550/573]	eta 0:00:20 lr 0.000628	time 0.8775 (0.8813)	loss 0.6118 (0.5124)	grad_norm 2.0686 (2.7536)	mem 20675MB
[2025-04-03 02:37:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][552/573]	eta 0:00:18 lr 0.000628	time 0.8771 (0.8813)	loss 0.6009 (0.5125)	grad_norm 2.3171 (2.7520)	mem 20675MB
[2025-04-03 02:37:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][554/573]	eta 0:00:16 lr 0.000627	time 0.8771 (0.8813)	loss 0.4371 (0.5122)	grad_norm 3.0529 (2.7519)	mem 20675MB
[2025-04-03 02:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][556/573]	eta 0:00:14 lr 0.000627	time 0.8771 (0.8813)	loss 0.5487 (0.5121)	grad_norm 2.4220 (2.7500)	mem 20675MB
[2025-04-03 02:38:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][558/573]	eta 0:00:13 lr 0.000627	time 0.8788 (0.8813)	loss 0.5756 (0.5122)	grad_norm 2.2022 (2.7478)	mem 20675MB
[2025-04-03 02:38:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][560/573]	eta 0:00:11 lr 0.000627	time 0.8772 (0.8813)	loss 0.3981 (0.5121)	grad_norm 3.6727 (2.7484)	mem 20675MB
[2025-04-03 02:38:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][562/573]	eta 0:00:09 lr 0.000626	time 0.8773 (0.8813)	loss 0.3110 (0.5117)	grad_norm 2.5886 (2.7464)	mem 20675MB
[2025-04-03 02:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][564/573]	eta 0:00:07 lr 0.000626	time 0.8776 (0.8813)	loss 0.6472 (0.5119)	grad_norm 2.5384 (2.7476)	mem 20675MB
[2025-04-03 02:38:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][566/573]	eta 0:00:06 lr 0.000626	time 0.8770 (0.8813)	loss 0.5956 (0.5122)	grad_norm 4.1040 (2.7509)	mem 20675MB
[2025-04-03 02:38:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][568/573]	eta 0:00:04 lr 0.000626	time 0.8771 (0.8812)	loss 0.5668 (0.5120)	grad_norm 1.9430 (2.7490)	mem 20675MB
[2025-04-03 02:38:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][570/573]	eta 0:00:02 lr 0.000625	time 0.8773 (0.8812)	loss 0.6139 (0.5123)	grad_norm 2.2032 (2.7483)	mem 20675MB
[2025-04-03 02:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][572/573]	eta 0:00:00 lr 0.000625	time 0.8771 (0.8812)	loss 0.5225 (0.5124)	grad_norm 1.7271 (2.7451)	mem 20675MB
[2025-04-03 02:38:13 simmim_finetune] (main_finetune.py 260): INFO EPOCH 14 training takes 0:08:25
[2025-04-03 02:38:15 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.953 (1.953)	Loss 0.6088 (0.6088)	Acc@1 62.500 (62.500)	Mem 20675MB
[2025-04-03 02:38:15 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.840)	Loss 0.5816 (0.5871)	Acc@1 68.750 (65.625)	Mem 20675MB
[2025-04-03 02:38:16 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.618)	Loss 0.6112 (0.5899)	Acc@1 64.844 (65.312)	Mem 20675MB
[2025-04-03 02:38:16 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.523)	Loss 0.5405 (0.5757)	Acc@1 73.438 (67.188)	Mem 20675MB
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.286 (0.470)	Loss 0.4489 (0.5415)	Acc@1 82.031 (71.007)	Mem 20675MB
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.437)	Loss 0.4061 (0.5204)	Acc@1 85.938 (72.798)	Mem 20675MB
[2025-04-03 02:38:18 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.413)	Loss 0.4183 (0.5035)	Acc@1 85.938 (74.700)	Mem 20675MB
[2025-04-03 02:38:19 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.396)	Loss 0.3744 (0.4877)	Acc@1 86.719 (76.198)	Mem 20675MB
[2025-04-03 02:38:19 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.462
[2025-04-03 02:38:19 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 76.5%
[2025-04-03 02:38:19 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 76.71%
[2025-04-03 02:38:19 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.4361531436476834e-06, 2.4361531436476834e-06, 3.6806325219475996e-06, 3.6806325219475996e-06, 5.595216180870548e-06, 5.595216180870548e-06, 8.540729502290469e-06, 8.540729502290469e-06, 1.3072288458321114e-05, 1.3072288458321114e-05, 2.0043917621445186e-05, 2.0043917621445186e-05, 3.076950094932837e-05, 3.076950094932837e-05, 4.7270398376840955e-05, 4.7270398376840955e-05, 7.265639441916802e-05, 7.265639441916802e-05, 0.00011171177294582506, 0.00011171177294582506, 0.00017179697067914353, 0.00017179697067914353, 0.0002642357364227104, 0.0002642357364227104, 0.0004064492221820441, 0.0004064492221820441, 0.0006252392002733268, 0.0006252392002733268]
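The per-group learning rates logged above can be reproduced approximately from the config: each parameter-group LR is the base LR scaled by LAYER_DECAY (0.65) raised to the layer's distance from the head, then cosine-annealed from BASE_LR = 1.25e-3 toward MIN_LR = 2.5e-7 over the 30 epochs. A minimal sketch, assuming this standard layer-wise-decay recipe and 14 distinct depth scales for ViT-B (12 blocks plus patch embed and head); the logged list duplicates each value because decay and no-decay parameter groups share a scale. A tiny residual (~0.02%) versus the logged values likely comes from warmup handling not visible in the truncated config.

```python
import math

# Assumed hyperparameters, taken from the config dump at the top of this log.
BASE_LR = 1.25e-3      # TRAIN.BASE_LR
MIN_LR = 2.5e-7        # TRAIN.MIN_LR
EPOCHS = 30            # TRAIN.EPOCHS
LAYER_DECAY = 0.65     # TRAIN.LAYER_DECAY
NUM_SCALES = 14        # assumption: 12 ViT-B blocks + patch embed + head

def group_lrs(epoch):
    """Cosine-annealed LR for each layer group at the given epoch boundary,
    ordered from the deepest (most decayed) group to the head."""
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * epoch / EPOCHS))
    lrs = []
    for i in range(NUM_SCALES):
        # Deepest group gets LAYER_DECAY^(NUM_SCALES-1); the head gets scale 1.
        scale = LAYER_DECAY ** (NUM_SCALES - 1 - i)
        lrs.append(MIN_LR + (BASE_LR * scale - MIN_LR) * cos_factor)
    return lrs

# Start of epoch 15: exactly halfway through the cosine schedule.
lrs = group_lrs(15)
```

With `epoch=15` the first and last entries land within ~0.02% of the logged values (`2.436e-06` and `6.252e-04`), and the list increases monotonically toward the head, matching the logged ordering.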
[2025-04-03 02:38:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][0/573]	eta 0:22:36 lr 0.000625	time 2.3666 (2.3666)	loss 0.5898 (0.5898)	grad_norm 3.1457 (3.1457)	mem 20675MB
[2025-04-03 02:38:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][2/573]	eta 0:13:05 lr 0.000625	time 0.8776 (1.3753)	loss 0.4673 (0.4896)	grad_norm 2.2852 (2.9258)	mem 20675MB
[2025-04-03 02:38:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][4/573]	eta 0:11:09 lr 0.000625	time 0.8776 (1.1771)	loss 0.4714 (0.5075)	grad_norm 2.9853 (2.8094)	mem 20675MB
[2025-04-03 02:38:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][6/573]	eta 0:10:19 lr 0.000624	time 0.8773 (1.0919)	loss 0.5567 (0.5155)	grad_norm 1.8701 (2.5581)	mem 20675MB
[2025-04-03 02:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][8/573]	eta 0:09:50 lr 0.000624	time 0.8774 (1.0445)	loss 0.5066 (0.5156)	grad_norm 2.7972 (2.5398)	mem 20675MB
[2025-04-03 02:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][10/573]	eta 0:09:31 lr 0.000624	time 0.8774 (1.0143)	loss 0.4197 (0.5139)	grad_norm 2.5275 (2.5678)	mem 20675MB
[2025-04-03 02:38:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][12/573]	eta 0:09:17 lr 0.000624	time 0.8774 (0.9933)	loss 0.6678 (0.5274)	grad_norm 2.8634 (2.5706)	mem 20675MB
[2025-04-03 02:38:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][14/573]	eta 0:09:06 lr 0.000624	time 0.8774 (0.9780)	loss 0.6367 (0.5263)	grad_norm 2.9901 (2.6321)	mem 20675MB
[2025-04-03 02:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][16/573]	eta 0:08:58 lr 0.000623	time 0.8776 (0.9664)	loss 0.5093 (0.5260)	grad_norm 2.0729 (2.6258)	mem 20675MB
[2025-04-03 02:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][18/573]	eta 0:08:51 lr 0.000623	time 0.8771 (0.9571)	loss 0.4757 (0.5176)	grad_norm 3.4071 (2.6680)	mem 20675MB
[2025-04-03 02:38:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][20/573]	eta 0:08:45 lr 0.000623	time 0.8774 (0.9496)	loss 0.4403 (0.5138)	grad_norm 2.7231 (2.7105)	mem 20675MB
[2025-04-03 02:38:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][22/573]	eta 0:08:39 lr 0.000623	time 0.8782 (0.9434)	loss 0.6219 (0.5178)	grad_norm 2.1722 (2.6693)	mem 20675MB
[2025-04-03 02:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][24/573]	eta 0:08:35 lr 0.000622	time 0.8774 (0.9382)	loss 0.5566 (0.5200)	grad_norm 3.8806 (2.7303)	mem 20675MB
[2025-04-03 02:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][26/573]	eta 0:08:30 lr 0.000622	time 0.8775 (0.9338)	loss 0.5730 (0.5244)	grad_norm 2.6788 (2.7421)	mem 20675MB
[2025-04-03 02:38:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][28/573]	eta 0:08:26 lr 0.000622	time 0.8773 (0.9300)	loss 0.4294 (0.5201)	grad_norm 2.8171 (2.7309)	mem 20675MB
[2025-04-03 02:38:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][30/573]	eta 0:08:23 lr 0.000622	time 0.8781 (0.9267)	loss 0.5797 (0.5193)	grad_norm 3.6163 (2.7821)	mem 20675MB
[2025-04-03 02:38:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][32/573]	eta 0:08:19 lr 0.000621	time 0.8775 (0.9237)	loss 0.5957 (0.5203)	grad_norm 2.6254 (2.8277)	mem 20675MB
[2025-04-03 02:38:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][34/573]	eta 0:08:16 lr 0.000621	time 0.8773 (0.9212)	loss 0.5172 (0.5207)	grad_norm 2.8075 (2.8389)	mem 20675MB
[2025-04-03 02:38:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][36/573]	eta 0:08:13 lr 0.000621	time 0.8772 (0.9188)	loss 0.5441 (0.5222)	grad_norm 2.4528 (2.8220)	mem 20675MB
[2025-04-03 02:38:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][38/573]	eta 0:08:10 lr 0.000621	time 0.8771 (0.9167)	loss 0.5431 (0.5198)	grad_norm 4.1038 (2.8732)	mem 20675MB
[2025-04-03 02:38:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][40/573]	eta 0:08:07 lr 0.000621	time 0.8780 (0.9149)	loss 0.4678 (0.5160)	grad_norm 1.9031 (2.8631)	mem 20675MB
[2025-04-03 02:38:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][42/573]	eta 0:08:04 lr 0.000620	time 0.8773 (0.9132)	loss 0.5826 (0.5156)	grad_norm 2.0262 (2.8399)	mem 20675MB
[2025-04-03 02:39:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][44/573]	eta 0:08:02 lr 0.000620	time 0.8773 (0.9116)	loss 0.5640 (0.5164)	grad_norm 2.3874 (2.8605)	mem 20675MB
[2025-04-03 02:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][46/573]	eta 0:07:59 lr 0.000620	time 0.8774 (0.9102)	loss 0.5983 (0.5190)	grad_norm 1.8916 (2.8116)	mem 20675MB
[2025-04-03 02:39:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][48/573]	eta 0:07:57 lr 0.000620	time 0.8774 (0.9089)	loss 0.5064 (0.5198)	grad_norm 2.6157 (2.7968)	mem 20675MB
[2025-04-03 02:39:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][50/573]	eta 0:07:54 lr 0.000619	time 0.8772 (0.9077)	loss 0.5929 (0.5219)	grad_norm 2.7786 (2.8419)	mem 20675MB
[2025-04-03 02:39:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][52/573]	eta 0:07:52 lr 0.000619	time 0.8770 (0.9066)	loss 0.4788 (0.5182)	grad_norm 2.2821 (2.8407)	mem 20675MB
[2025-04-03 02:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][54/573]	eta 0:07:49 lr 0.000619	time 0.8772 (0.9055)	loss 0.4784 (0.5145)	grad_norm 2.2613 (2.8555)	mem 20675MB
[2025-04-03 02:39:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][56/573]	eta 0:07:47 lr 0.000619	time 0.8773 (0.9046)	loss 0.4842 (0.5154)	grad_norm 2.8915 (2.8513)	mem 20675MB
[2025-04-03 02:39:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][58/573]	eta 0:07:45 lr 0.000619	time 0.8773 (0.9037)	loss 0.5871 (0.5161)	grad_norm 3.9645 (2.8679)	mem 20675MB
[2025-04-03 02:39:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][60/573]	eta 0:07:43 lr 0.000618	time 0.8771 (0.9028)	loss 0.3765 (0.5141)	grad_norm 3.5058 (2.8846)	mem 20675MB
[2025-04-03 02:39:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][62/573]	eta 0:07:40 lr 0.000618	time 0.8769 (0.9020)	loss 0.4132 (0.5118)	grad_norm 2.3637 (2.8833)	mem 20675MB
[2025-04-03 02:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][64/573]	eta 0:07:38 lr 0.000618	time 0.8773 (0.9013)	loss 0.5321 (0.5101)	grad_norm 2.1908 (2.8880)	mem 20675MB
[2025-04-03 02:39:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][66/573]	eta 0:07:36 lr 0.000618	time 0.8774 (0.9006)	loss 0.6067 (0.5113)	grad_norm 2.1269 (2.8738)	mem 20675MB
[2025-04-03 02:39:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][68/573]	eta 0:07:34 lr 0.000617	time 0.8774 (0.9000)	loss 0.5795 (0.5128)	grad_norm 2.1858 (2.8576)	mem 20675MB
[2025-04-03 02:39:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][70/573]	eta 0:07:32 lr 0.000617	time 0.8772 (0.8994)	loss 0.6199 (0.5126)	grad_norm 2.5687 (2.8673)	mem 20675MB
[2025-04-03 02:39:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][72/573]	eta 0:07:30 lr 0.000617	time 0.8772 (0.8988)	loss 0.4166 (0.5137)	grad_norm 3.4443 (2.8876)	mem 20675MB
[2025-04-03 02:39:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][74/573]	eta 0:07:28 lr 0.000617	time 0.8773 (0.8982)	loss 0.3951 (0.5122)	grad_norm 2.5830 (2.8772)	mem 20675MB
[2025-04-03 02:39:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][76/573]	eta 0:07:26 lr 0.000616	time 0.8772 (0.8977)	loss 0.5693 (0.5122)	grad_norm 2.0034 (2.8622)	mem 20675MB
[2025-04-03 02:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][78/573]	eta 0:07:24 lr 0.000616	time 0.8770 (0.8972)	loss 0.5478 (0.5126)	grad_norm 3.0516 (2.8648)	mem 20675MB
[2025-04-03 02:39:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][80/573]	eta 0:07:22 lr 0.000616	time 0.8773 (0.8967)	loss 0.5306 (0.5126)	grad_norm 1.7621 (2.8427)	mem 20675MB
[2025-04-03 02:39:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][82/573]	eta 0:07:20 lr 0.000616	time 0.8771 (0.8963)	loss 0.3716 (0.5094)	grad_norm 4.3364 (2.8580)	mem 20675MB
[2025-04-03 02:39:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][84/573]	eta 0:07:18 lr 0.000616	time 0.8772 (0.8959)	loss 0.3734 (0.5088)	grad_norm 3.0423 (2.8586)	mem 20675MB
[2025-04-03 02:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][86/573]	eta 0:07:16 lr 0.000615	time 0.8771 (0.8955)	loss 0.4974 (0.5099)	grad_norm 2.7350 (2.8532)	mem 20675MB
[2025-04-03 02:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][88/573]	eta 0:07:14 lr 0.000615	time 0.8781 (0.8951)	loss 0.3579 (0.5088)	grad_norm 3.4154 (2.8624)	mem 20675MB
[2025-04-03 02:39:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][90/573]	eta 0:07:12 lr 0.000615	time 0.8773 (0.8947)	loss 0.6006 (0.5093)	grad_norm 2.5061 (2.8597)	mem 20675MB
[2025-04-03 02:39:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][92/573]	eta 0:07:10 lr 0.000615	time 0.8772 (0.8944)	loss 0.4510 (0.5091)	grad_norm 3.5181 (2.8734)	mem 20675MB
[2025-04-03 02:39:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][94/573]	eta 0:07:08 lr 0.000614	time 0.8773 (0.8940)	loss 0.5646 (0.5095)	grad_norm 2.2817 (2.8670)	mem 20675MB
[2025-04-03 02:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][96/573]	eta 0:07:06 lr 0.000614	time 0.8772 (0.8937)	loss 0.3733 (0.5075)	grad_norm 4.8438 (2.8853)	mem 20675MB
[2025-04-03 02:39:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][98/573]	eta 0:07:04 lr 0.000614	time 0.8773 (0.8934)	loss 0.5395 (0.5079)	grad_norm 2.6365 (2.8823)	mem 20675MB
[2025-04-03 02:39:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][100/573]	eta 0:07:02 lr 0.000614	time 0.8771 (0.8931)	loss 0.5499 (0.5079)	grad_norm 2.4336 (2.8953)	mem 20675MB
[2025-04-03 02:39:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][102/573]	eta 0:07:00 lr 0.000613	time 0.8770 (0.8928)	loss 0.5349 (0.5075)	grad_norm 1.8128 (2.8763)	mem 20675MB
[2025-04-03 02:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][104/573]	eta 0:06:58 lr 0.000613	time 0.8776 (0.8925)	loss 0.6158 (0.5080)	grad_norm 2.8928 (2.8868)	mem 20675MB
[2025-04-03 02:39:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][106/573]	eta 0:06:56 lr 0.000613	time 0.8771 (0.8923)	loss 0.5296 (0.5086)	grad_norm 2.9170 (2.8817)	mem 20675MB
[2025-04-03 02:39:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][108/573]	eta 0:06:54 lr 0.000613	time 0.8772 (0.8920)	loss 0.5616 (0.5093)	grad_norm 1.9663 (2.8684)	mem 20675MB
[2025-04-03 02:39:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][110/573]	eta 0:06:52 lr 0.000613	time 0.8771 (0.8918)	loss 0.4220 (0.5088)	grad_norm 3.1142 (2.8694)	mem 20675MB
[2025-04-03 02:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][112/573]	eta 0:06:50 lr 0.000612	time 0.8784 (0.8915)	loss 0.4076 (0.5073)	grad_norm 2.2612 (2.8659)	mem 20675MB
[2025-04-03 02:40:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][114/573]	eta 0:06:49 lr 0.000612	time 0.8771 (0.8913)	loss 0.5754 (0.5075)	grad_norm 2.7075 (2.8616)	mem 20675MB
[2025-04-03 02:40:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][116/573]	eta 0:06:47 lr 0.000612	time 0.8771 (0.8911)	loss 0.4430 (0.5072)	grad_norm 1.8862 (2.8449)	mem 20675MB
[2025-04-03 02:40:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][118/573]	eta 0:06:45 lr 0.000612	time 0.8773 (0.8908)	loss 0.4823 (0.5065)	grad_norm 2.0564 (2.8391)	mem 20675MB
[2025-04-03 02:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][120/573]	eta 0:06:43 lr 0.000611	time 0.8771 (0.8906)	loss 0.5760 (0.5069)	grad_norm 2.5439 (2.8305)	mem 20675MB
[2025-04-03 02:40:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][122/573]	eta 0:06:41 lr 0.000611	time 0.8776 (0.8904)	loss 0.5455 (0.5073)	grad_norm 1.9812 (2.8337)	mem 20675MB
[2025-04-03 02:40:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][124/573]	eta 0:06:39 lr 0.000611	time 0.8772 (0.8902)	loss 0.4800 (0.5072)	grad_norm 3.5886 (2.8321)	mem 20675MB
[2025-04-03 02:40:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][126/573]	eta 0:06:37 lr 0.000611	time 0.8770 (0.8901)	loss 0.5528 (0.5077)	grad_norm 1.9005 (2.8177)	mem 20675MB
[2025-04-03 02:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][128/573]	eta 0:06:35 lr 0.000611	time 0.8770 (0.8899)	loss 0.5118 (0.5077)	grad_norm 3.2944 (2.8268)	mem 20675MB
[2025-04-03 02:40:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][130/573]	eta 0:06:34 lr 0.000610	time 0.8770 (0.8897)	loss 0.4821 (0.5077)	grad_norm 1.6742 (2.8107)	mem 20675MB
[2025-04-03 02:40:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][132/573]	eta 0:06:32 lr 0.000610	time 0.8773 (0.8895)	loss 0.6060 (0.5072)	grad_norm 2.3645 (2.8136)	mem 20675MB
[2025-04-03 02:40:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][134/573]	eta 0:06:30 lr 0.000610	time 0.8773 (0.8893)	loss 0.6086 (0.5067)	grad_norm 2.4233 (2.8084)	mem 20675MB
[2025-04-03 02:40:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][136/573]	eta 0:06:28 lr 0.000610	time 0.8771 (0.8892)	loss 0.6072 (0.5081)	grad_norm 1.6247 (2.8061)	mem 20675MB
[2025-04-03 02:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][138/573]	eta 0:06:26 lr 0.000609	time 0.8773 (0.8890)	loss 0.6080 (0.5087)	grad_norm 2.6402 (2.7954)	mem 20675MB
[2025-04-03 02:40:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][140/573]	eta 0:06:24 lr 0.000609	time 0.8788 (0.8889)	loss 0.5206 (0.5085)	grad_norm 1.8789 (2.7900)	mem 20675MB
[2025-04-03 02:40:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][142/573]	eta 0:06:23 lr 0.000609	time 0.8774 (0.8887)	loss 0.5533 (0.5094)	grad_norm 2.4539 (2.7833)	mem 20675MB
[2025-04-03 02:40:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][144/573]	eta 0:06:21 lr 0.000609	time 0.8771 (0.8886)	loss 0.5468 (0.5100)	grad_norm 1.4455 (2.7666)	mem 20675MB
[2025-04-03 02:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][146/573]	eta 0:06:19 lr 0.000608	time 0.8772 (0.8884)	loss 0.5563 (0.5104)	grad_norm 2.1312 (2.7575)	mem 20675MB
[2025-04-03 02:40:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][148/573]	eta 0:06:17 lr 0.000608	time 0.8774 (0.8883)	loss 0.4770 (0.5098)	grad_norm 2.6471 (2.7552)	mem 20675MB
[2025-04-03 02:40:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][150/573]	eta 0:06:15 lr 0.000608	time 0.8771 (0.8882)	loss 0.4639 (0.5098)	grad_norm 1.8765 (2.7489)	mem 20675MB
[2025-04-03 02:40:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][152/573]	eta 0:06:13 lr 0.000608	time 0.8772 (0.8880)	loss 0.5579 (0.5101)	grad_norm 2.2741 (2.7391)	mem 20675MB
[2025-04-03 02:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][154/573]	eta 0:06:12 lr 0.000608	time 0.8772 (0.8879)	loss 0.6372 (0.5107)	grad_norm 2.6233 (2.7363)	mem 20675MB
[2025-04-03 02:40:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][156/573]	eta 0:06:10 lr 0.000607	time 0.8771 (0.8878)	loss 0.5719 (0.5109)	grad_norm 2.1913 (2.7273)	mem 20675MB
[2025-04-03 02:40:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][158/573]	eta 0:06:08 lr 0.000607	time 0.8774 (0.8877)	loss 0.5262 (0.5110)	grad_norm 2.0677 (2.7200)	mem 20675MB
[2025-04-03 02:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][160/573]	eta 0:06:06 lr 0.000607	time 0.8771 (0.8876)	loss 0.5290 (0.5114)	grad_norm 2.0438 (2.7232)	mem 20675MB
[2025-04-03 02:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][162/573]	eta 0:06:04 lr 0.000607	time 0.8772 (0.8875)	loss 0.5838 (0.5121)	grad_norm 2.9613 (2.7234)	mem 20675MB
[2025-04-03 02:40:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][164/573]	eta 0:06:02 lr 0.000606	time 0.8772 (0.8873)	loss 0.4642 (0.5120)	grad_norm 2.7998 (2.7212)	mem 20675MB
[2025-04-03 02:40:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][166/573]	eta 0:06:01 lr 0.000606	time 0.8777 (0.8872)	loss 0.5557 (0.5122)	grad_norm 2.1799 (2.7167)	mem 20675MB
[2025-04-03 02:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][168/573]	eta 0:05:59 lr 0.000606	time 0.8773 (0.8871)	loss 0.4870 (0.5119)	grad_norm 2.0919 (2.7104)	mem 20675MB
[2025-04-03 02:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][170/573]	eta 0:05:57 lr 0.000606	time 0.8772 (0.8870)	loss 0.5556 (0.5115)	grad_norm 1.9032 (2.7147)	mem 20675MB
[2025-04-03 02:40:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][172/573]	eta 0:05:55 lr 0.000605	time 0.8776 (0.8869)	loss 0.6392 (0.5129)	grad_norm 2.3566 (2.7113)	mem 20675MB
[2025-04-03 02:40:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][174/573]	eta 0:05:53 lr 0.000605	time 0.8781 (0.8868)	loss 0.4168 (0.5119)	grad_norm 3.4783 (2.7168)	mem 20675MB
[2025-04-03 02:40:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][176/573]	eta 0:05:52 lr 0.000605	time 0.8772 (0.8867)	loss 0.5470 (0.5120)	grad_norm 2.0718 (2.7099)	mem 20675MB
[2025-04-03 02:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][178/573]	eta 0:05:50 lr 0.000605	time 0.8773 (0.8866)	loss 0.5493 (0.5126)	grad_norm 2.3357 (2.7060)	mem 20675MB
[2025-04-03 02:40:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][180/573]	eta 0:05:48 lr 0.000605	time 0.8776 (0.8866)	loss 0.4449 (0.5125)	grad_norm 2.2624 (2.6985)	mem 20675MB
[2025-04-03 02:41:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][182/573]	eta 0:05:46 lr 0.000604	time 0.8772 (0.8865)	loss 0.5557 (0.5123)	grad_norm 1.9860 (2.6924)	mem 20675MB
[2025-04-03 02:41:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][184/573]	eta 0:05:44 lr 0.000604	time 0.8772 (0.8864)	loss 0.4210 (0.5119)	grad_norm 2.4964 (2.6881)	mem 20675MB
[2025-04-03 02:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][186/573]	eta 0:05:42 lr 0.000604	time 0.8773 (0.8863)	loss 0.5121 (0.5120)	grad_norm 2.4417 (2.6864)	mem 20675MB
[2025-04-03 02:41:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][188/573]	eta 0:05:41 lr 0.000604	time 0.8774 (0.8862)	loss 0.3952 (0.5114)	grad_norm 2.5780 (2.6862)	mem 20675MB
[2025-04-03 02:41:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][190/573]	eta 0:05:39 lr 0.000603	time 0.8774 (0.8861)	loss 0.5336 (0.5111)	grad_norm 2.5949 (2.6897)	mem 20675MB
[2025-04-03 02:41:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][192/573]	eta 0:05:37 lr 0.000603	time 0.8772 (0.8860)	loss 0.5245 (0.5116)	grad_norm 2.0396 (2.6873)	mem 20675MB
[2025-04-03 02:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][194/573]	eta 0:05:35 lr 0.000603	time 0.8773 (0.8860)	loss 0.4937 (0.5118)	grad_norm 2.1992 (2.6840)	mem 20675MB
[2025-04-03 02:41:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][196/573]	eta 0:05:33 lr 0.000603	time 0.8773 (0.8859)	loss 0.4876 (0.5121)	grad_norm 2.3628 (2.6807)	mem 20675MB
[2025-04-03 02:41:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][198/573]	eta 0:05:32 lr 0.000603	time 0.8773 (0.8858)	loss 0.4326 (0.5118)	grad_norm 4.3785 (2.6913)	mem 20675MB
[2025-04-03 02:41:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][200/573]	eta 0:05:30 lr 0.000602	time 0.8773 (0.8857)	loss 0.5457 (0.5121)	grad_norm 1.7324 (2.6932)	mem 20675MB
[2025-04-03 02:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][202/573]	eta 0:05:28 lr 0.000602	time 0.8773 (0.8856)	loss 0.4684 (0.5114)	grad_norm 2.3001 (2.6917)	mem 20675MB
[2025-04-03 02:41:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][204/573]	eta 0:05:26 lr 0.000602	time 0.8774 (0.8856)	loss 0.4795 (0.5105)	grad_norm 1.9395 (2.6871)	mem 20675MB
[2025-04-03 02:41:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][206/573]	eta 0:05:24 lr 0.000602	time 0.8776 (0.8855)	loss 0.3675 (0.5096)	grad_norm 2.8759 (2.6845)	mem 20675MB
[2025-04-03 02:41:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][208/573]	eta 0:05:23 lr 0.000601	time 0.8773 (0.8854)	loss 0.5717 (0.5095)	grad_norm 2.6687 (2.6903)	mem 20675MB
[2025-04-03 02:41:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][210/573]	eta 0:05:21 lr 0.000601	time 0.8772 (0.8854)	loss 0.5149 (0.5097)	grad_norm 3.4648 (2.6980)	mem 20675MB
[2025-04-03 02:41:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][212/573]	eta 0:05:19 lr 0.000601	time 0.8772 (0.8853)	loss 0.5026 (0.5095)	grad_norm 2.7385 (2.7017)	mem 20675MB
[2025-04-03 02:41:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][214/573]	eta 0:05:17 lr 0.000601	time 0.8770 (0.8852)	loss 0.4425 (0.5096)	grad_norm 2.5138 (2.6992)	mem 20675MB
[2025-04-03 02:41:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][216/573]	eta 0:05:16 lr 0.000600	time 0.8774 (0.8852)	loss 0.4163 (0.5089)	grad_norm 4.1613 (2.7088)	mem 20675MB
[2025-04-03 02:41:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][218/573]	eta 0:05:14 lr 0.000600	time 0.8772 (0.8851)	loss 0.4748 (0.5091)	grad_norm 3.8002 (2.7145)	mem 20675MB
[2025-04-03 02:41:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][220/573]	eta 0:05:12 lr 0.000600	time 0.8773 (0.8850)	loss 0.5552 (0.5097)	grad_norm 3.4136 (2.7138)	mem 20675MB
[2025-04-03 02:41:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][222/573]	eta 0:05:10 lr 0.000600	time 0.8770 (0.8850)	loss 0.4563 (0.5097)	grad_norm 2.0016 (2.7060)	mem 20675MB
[2025-04-03 02:41:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][224/573]	eta 0:05:08 lr 0.000600	time 0.8771 (0.8849)	loss 0.4593 (0.5096)	grad_norm 2.7592 (2.7047)	mem 20675MB
[2025-04-03 02:41:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][226/573]	eta 0:05:07 lr 0.000599	time 0.8770 (0.8848)	loss 0.5836 (0.5096)	grad_norm 1.6482 (2.7019)	mem 20675MB
[2025-04-03 02:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][228/573]	eta 0:05:05 lr 0.000599	time 0.8774 (0.8848)	loss 0.5566 (0.5096)	grad_norm 2.2195 (2.7030)	mem 20675MB
[2025-04-03 02:41:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][230/573]	eta 0:05:03 lr 0.000599	time 0.8772 (0.8847)	loss 0.5252 (0.5099)	grad_norm 2.6279 (2.6993)	mem 20675MB
[2025-04-03 02:41:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][232/573]	eta 0:05:01 lr 0.000599	time 0.8772 (0.8847)	loss 0.5483 (0.5101)	grad_norm 1.6941 (2.6950)	mem 20675MB
[2025-04-03 02:41:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][234/573]	eta 0:04:59 lr 0.000598	time 0.8771 (0.8846)	loss 0.5308 (0.5099)	grad_norm 2.0637 (2.6937)	mem 20675MB
[2025-04-03 02:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][236/573]	eta 0:04:58 lr 0.000598	time 0.8772 (0.8846)	loss 0.5727 (0.5103)	grad_norm 2.0509 (2.6909)	mem 20675MB
[2025-04-03 02:41:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][238/573]	eta 0:04:56 lr 0.000598	time 0.8772 (0.8845)	loss 0.5190 (0.5101)	grad_norm 2.3203 (2.6876)	mem 20675MB
[2025-04-03 02:41:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][240/573]	eta 0:04:54 lr 0.000598	time 0.8774 (0.8845)	loss 0.4557 (0.5099)	grad_norm 3.1018 (2.6956)	mem 20675MB
[2025-04-03 02:41:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][242/573]	eta 0:04:52 lr 0.000597	time 0.8772 (0.8844)	loss 0.4843 (0.5100)	grad_norm 3.6201 (2.6973)	mem 20675MB
[2025-04-03 02:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][244/573]	eta 0:04:50 lr 0.000597	time 0.8772 (0.8844)	loss 0.5926 (0.5101)	grad_norm 3.2815 (2.7004)	mem 20675MB
[2025-04-03 02:41:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][246/573]	eta 0:04:49 lr 0.000597	time 0.8772 (0.8843)	loss 0.5708 (0.5107)	grad_norm 4.2900 (2.7099)	mem 20675MB
[2025-04-03 02:41:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][248/573]	eta 0:04:47 lr 0.000597	time 0.8777 (0.8843)	loss 0.5190 (0.5108)	grad_norm 2.3717 (2.7114)	mem 20675MB
[2025-04-03 02:42:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][250/573]	eta 0:04:45 lr 0.000597	time 0.8773 (0.8842)	loss 0.6278 (0.5112)	grad_norm 2.4916 (2.7129)	mem 20675MB
[2025-04-03 02:42:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][252/573]	eta 0:04:43 lr 0.000596	time 0.8773 (0.8842)	loss 0.3358 (0.5106)	grad_norm 3.4329 (2.7124)	mem 20675MB
[2025-04-03 02:42:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][254/573]	eta 0:04:42 lr 0.000596	time 0.8772 (0.8841)	loss 0.4851 (0.5103)	grad_norm 2.3118 (2.7107)	mem 20675MB
[2025-04-03 02:42:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][256/573]	eta 0:04:40 lr 0.000596	time 0.8776 (0.8841)	loss 0.5331 (0.5104)	grad_norm 1.6075 (2.7045)	mem 20675MB
[2025-04-03 02:42:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][258/573]	eta 0:04:38 lr 0.000596	time 0.8771 (0.8840)	loss 0.5601 (0.5103)	grad_norm 2.1072 (2.6980)	mem 20675MB
[2025-04-03 02:42:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][260/573]	eta 0:04:36 lr 0.000595	time 0.8771 (0.8840)	loss 0.4207 (0.5102)	grad_norm 2.5290 (2.6987)	mem 20675MB
[2025-04-03 02:42:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][262/573]	eta 0:04:34 lr 0.000595	time 0.8773 (0.8839)	loss 0.4663 (0.5102)	grad_norm 2.5911 (2.6980)	mem 20675MB
[2025-04-03 02:42:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][264/573]	eta 0:04:33 lr 0.000595	time 0.8772 (0.8839)	loss 0.6454 (0.5108)	grad_norm 2.6159 (2.6979)	mem 20675MB
[2025-04-03 02:42:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][266/573]	eta 0:04:31 lr 0.000595	time 0.8774 (0.8839)	loss 0.4147 (0.5106)	grad_norm 2.7260 (2.6939)	mem 20675MB
[2025-04-03 02:42:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][268/573]	eta 0:04:29 lr 0.000595	time 0.8774 (0.8838)	loss 0.4630 (0.5107)	grad_norm 1.9308 (2.6890)	mem 20675MB
[2025-04-03 02:42:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][270/573]	eta 0:04:27 lr 0.000594	time 0.8771 (0.8838)	loss 0.5617 (0.5106)	grad_norm 2.0767 (2.6880)	mem 20675MB
[2025-04-03 02:42:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][272/573]	eta 0:04:26 lr 0.000594	time 0.8773 (0.8837)	loss 0.4585 (0.5100)	grad_norm 1.9071 (2.6842)	mem 20675MB
[2025-04-03 02:42:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][274/573]	eta 0:04:24 lr 0.000594	time 0.8774 (0.8837)	loss 0.5325 (0.5103)	grad_norm 1.8193 (2.6777)	mem 20675MB
[2025-04-03 02:42:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][276/573]	eta 0:04:22 lr 0.000594	time 0.8770 (0.8836)	loss 0.3407 (0.5098)	grad_norm 3.9487 (2.6806)	mem 20675MB
[2025-04-03 02:42:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][278/573]	eta 0:04:20 lr 0.000593	time 0.8772 (0.8836)	loss 0.5474 (0.5097)	grad_norm 2.3513 (2.6813)	mem 20675MB
[2025-04-03 02:42:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][280/573]	eta 0:04:18 lr 0.000593	time 0.8773 (0.8836)	loss 0.5218 (0.5099)	grad_norm 2.6732 (2.6781)	mem 20675MB
[2025-04-03 02:42:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][282/573]	eta 0:04:17 lr 0.000593	time 0.8771 (0.8835)	loss 0.5246 (0.5094)	grad_norm 3.3815 (2.6825)	mem 20675MB
[2025-04-03 02:42:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][284/573]	eta 0:04:15 lr 0.000593	time 0.8780 (0.8835)	loss 0.5305 (0.5094)	grad_norm 2.0813 (2.6800)	mem 20675MB
[2025-04-03 02:42:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][286/573]	eta 0:04:13 lr 0.000592	time 0.8774 (0.8835)	loss 0.5708 (0.5092)	grad_norm 2.2148 (2.6836)	mem 20675MB
[2025-04-03 02:42:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][288/573]	eta 0:04:11 lr 0.000592	time 0.8772 (0.8834)	loss 0.5171 (0.5091)	grad_norm 2.1714 (2.6784)	mem 20675MB
[2025-04-03 02:42:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][290/573]	eta 0:04:09 lr 0.000592	time 0.8772 (0.8834)	loss 0.5491 (0.5096)	grad_norm 5.3259 (2.6888)	mem 20675MB
[2025-04-03 02:42:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][292/573]	eta 0:04:08 lr 0.000592	time 0.8773 (0.8834)	loss 0.5611 (0.5097)	grad_norm 2.7628 (2.6966)	mem 20675MB
[2025-04-03 02:42:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][294/573]	eta 0:04:06 lr 0.000592	time 0.8770 (0.8833)	loss 0.4556 (0.5090)	grad_norm 3.9463 (2.7045)	mem 20675MB
[2025-04-03 02:42:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][296/573]	eta 0:04:04 lr 0.000591	time 0.8774 (0.8833)	loss 0.5175 (0.5085)	grad_norm 2.5265 (2.7046)	mem 20675MB
[2025-04-03 02:42:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][298/573]	eta 0:04:02 lr 0.000591	time 0.8772 (0.8833)	loss 0.6082 (0.5085)	grad_norm 2.0732 (2.7059)	mem 20675MB
[2025-04-03 02:42:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][300/573]	eta 0:04:01 lr 0.000591	time 0.8774 (0.8832)	loss 0.4327 (0.5081)	grad_norm 2.4581 (2.7066)	mem 20675MB
[2025-04-03 02:42:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][302/573]	eta 0:03:59 lr 0.000591	time 0.8779 (0.8832)	loss 0.5653 (0.5083)	grad_norm 2.0900 (2.7033)	mem 20675MB
[2025-04-03 02:42:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][304/573]	eta 0:03:57 lr 0.000590	time 0.8772 (0.8832)	loss 0.4179 (0.5074)	grad_norm 3.5422 (2.7143)	mem 20675MB
[2025-04-03 02:42:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][306/573]	eta 0:03:55 lr 0.000590	time 0.8773 (0.8831)	loss 0.5345 (0.5076)	grad_norm 2.1130 (2.7109)	mem 20675MB
[2025-04-03 02:42:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][308/573]	eta 0:03:54 lr 0.000590	time 0.8772 (0.8831)	loss 0.4289 (0.5068)	grad_norm 2.6935 (2.7123)	mem 20675MB
[2025-04-03 02:42:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][310/573]	eta 0:03:52 lr 0.000590	time 0.8782 (0.8831)	loss 0.6354 (0.5074)	grad_norm 3.6016 (2.7142)	mem 20675MB
[2025-04-03 02:42:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][312/573]	eta 0:03:50 lr 0.000590	time 0.8773 (0.8830)	loss 0.6004 (0.5076)	grad_norm 3.8579 (2.7208)	mem 20675MB
[2025-04-03 02:42:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][314/573]	eta 0:03:48 lr 0.000589	time 0.8773 (0.8830)	loss 0.6012 (0.5082)	grad_norm 1.9705 (2.7171)	mem 20675MB
[2025-04-03 02:42:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][316/573]	eta 0:03:46 lr 0.000589	time 0.8771 (0.8830)	loss 0.4909 (0.5083)	grad_norm 3.0898 (2.7167)	mem 20675MB
[2025-04-03 02:43:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][318/573]	eta 0:03:45 lr 0.000589	time 0.8776 (0.8829)	loss 0.6088 (0.5088)	grad_norm 2.8452 (2.7164)	mem 20675MB
[2025-04-03 02:43:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][320/573]	eta 0:03:43 lr 0.000589	time 0.8772 (0.8829)	loss 0.5377 (0.5090)	grad_norm 1.8729 (2.7110)	mem 20675MB
[2025-04-03 02:43:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][322/573]	eta 0:03:41 lr 0.000588	time 0.8770 (0.8829)	loss 0.5187 (0.5086)	grad_norm 1.9526 (2.7086)	mem 20675MB
[2025-04-03 02:43:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][324/573]	eta 0:03:39 lr 0.000588	time 0.8773 (0.8828)	loss 0.5274 (0.5087)	grad_norm 2.5583 (2.7077)	mem 20675MB
[2025-04-03 02:43:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][326/573]	eta 0:03:38 lr 0.000588	time 0.8771 (0.8828)	loss 0.5361 (0.5088)	grad_norm 2.7266 (2.7060)	mem 20675MB
[2025-04-03 02:43:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][328/573]	eta 0:03:36 lr 0.000588	time 0.8774 (0.8828)	loss 0.4986 (0.5089)	grad_norm 2.6732 (2.7099)	mem 20675MB
[2025-04-03 02:43:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][330/573]	eta 0:03:34 lr 0.000587	time 0.8773 (0.8828)	loss 0.6133 (0.5092)	grad_norm 2.5738 (2.7087)	mem 20675MB
[2025-04-03 02:43:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][332/573]	eta 0:03:32 lr 0.000587	time 0.8772 (0.8827)	loss 0.5136 (0.5095)	grad_norm 2.6724 (2.7081)	mem 20675MB
[2025-04-03 02:43:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][334/573]	eta 0:03:30 lr 0.000587	time 0.8772 (0.8827)	loss 0.5440 (0.5096)	grad_norm 2.1567 (2.7037)	mem 20675MB
[2025-04-03 02:43:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][336/573]	eta 0:03:29 lr 0.000587	time 0.8772 (0.8827)	loss 0.4820 (0.5096)	grad_norm 2.2524 (2.7009)	mem 20675MB
[2025-04-03 02:43:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][338/573]	eta 0:03:27 lr 0.000587	time 0.8771 (0.8827)	loss 0.5799 (0.5098)	grad_norm 2.4895 (2.7008)	mem 20675MB
[2025-04-03 02:43:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][340/573]	eta 0:03:25 lr 0.000586	time 0.8771 (0.8826)	loss 0.5645 (0.5099)	grad_norm 1.7727 (2.6961)	mem 20675MB
[2025-04-03 02:43:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][342/573]	eta 0:03:23 lr 0.000586	time 0.8773 (0.8826)	loss 0.5638 (0.5102)	grad_norm 1.6388 (2.6919)	mem 20675MB
[2025-04-03 02:43:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][344/573]	eta 0:03:22 lr 0.000586	time 0.8771 (0.8826)	loss 0.3861 (0.5098)	grad_norm 2.7590 (2.6892)	mem 20675MB
[2025-04-03 02:43:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][346/573]	eta 0:03:20 lr 0.000586	time 0.8772 (0.8826)	loss 0.4394 (0.5094)	grad_norm 2.0951 (2.6893)	mem 20675MB
[2025-04-03 02:43:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][348/573]	eta 0:03:18 lr 0.000585	time 0.8771 (0.8825)	loss 0.5653 (0.5096)	grad_norm 1.9875 (2.6874)	mem 20675MB
[2025-04-03 02:43:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][350/573]	eta 0:03:16 lr 0.000585	time 0.8770 (0.8825)	loss 0.5152 (0.5093)	grad_norm 3.9844 (2.6904)	mem 20675MB
[2025-04-03 02:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][352/573]	eta 0:03:15 lr 0.000585	time 0.8774 (0.8825)	loss 0.5279 (0.5093)	grad_norm 2.7247 (2.6901)	mem 20675MB
[2025-04-03 02:43:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][354/573]	eta 0:03:13 lr 0.000585	time 0.8771 (0.8825)	loss 0.4356 (0.5089)	grad_norm 2.4792 (2.6893)	mem 20675MB
[2025-04-03 02:43:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][356/573]	eta 0:03:11 lr 0.000584	time 0.8768 (0.8824)	loss 0.4364 (0.5090)	grad_norm 4.9571 (2.6963)	mem 20675MB
[2025-04-03 02:43:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][358/573]	eta 0:03:09 lr 0.000584	time 0.8770 (0.8824)	loss 0.4744 (0.5086)	grad_norm 2.9342 (2.7011)	mem 20675MB
[2025-04-03 02:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][360/573]	eta 0:03:07 lr 0.000584	time 0.8769 (0.8824)	loss 0.4811 (0.5087)	grad_norm 3.6854 (2.7043)	mem 20675MB
[2025-04-03 02:43:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][362/573]	eta 0:03:06 lr 0.000584	time 0.8772 (0.8824)	loss 0.5296 (0.5089)	grad_norm 1.4870 (2.6997)	mem 20675MB
[2025-04-03 02:43:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][364/573]	eta 0:03:04 lr 0.000584	time 0.8771 (0.8823)	loss 0.5482 (0.5092)	grad_norm 2.7902 (2.6981)	mem 20675MB
[2025-04-03 02:43:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][366/573]	eta 0:03:02 lr 0.000583	time 0.8772 (0.8823)	loss 0.5722 (0.5096)	grad_norm 1.8190 (2.6945)	mem 20675MB
[2025-04-03 02:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][368/573]	eta 0:03:00 lr 0.000583	time 0.8771 (0.8823)	loss 0.5432 (0.5099)	grad_norm 2.7468 (2.6942)	mem 20675MB
[2025-04-03 02:43:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][370/573]	eta 0:02:59 lr 0.000583	time 0.8773 (0.8823)	loss 0.6185 (0.5105)	grad_norm 2.1351 (2.6911)	mem 20675MB
[2025-04-03 02:43:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][372/573]	eta 0:02:57 lr 0.000583	time 0.8773 (0.8822)	loss 0.6062 (0.5109)	grad_norm 2.1924 (2.6863)	mem 20675MB
[2025-04-03 02:43:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][374/573]	eta 0:02:55 lr 0.000582	time 0.8773 (0.8822)	loss 0.5154 (0.5110)	grad_norm 2.2582 (2.6841)	mem 20675MB
[2025-04-03 02:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][376/573]	eta 0:02:53 lr 0.000582	time 0.8772 (0.8822)	loss 0.4988 (0.5110)	grad_norm 2.4633 (2.6812)	mem 20675MB
[2025-04-03 02:43:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][378/573]	eta 0:02:52 lr 0.000582	time 0.8772 (0.8822)	loss 0.4328 (0.5109)	grad_norm 2.9577 (2.6803)	mem 20675MB
[2025-04-03 02:43:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][380/573]	eta 0:02:50 lr 0.000582	time 0.8779 (0.8821)	loss 0.4685 (0.5106)	grad_norm 2.4039 (2.6799)	mem 20675MB
[2025-04-03 02:43:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][382/573]	eta 0:02:48 lr 0.000582	time 0.8770 (0.8821)	loss 0.5257 (0.5104)	grad_norm 1.9118 (2.6800)	mem 20675MB
[2025-04-03 02:43:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][384/573]	eta 0:02:46 lr 0.000581	time 0.8769 (0.8821)	loss 0.4198 (0.5098)	grad_norm 3.8478 (2.6835)	mem 20675MB
[2025-04-03 02:44:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][386/573]	eta 0:02:44 lr 0.000581	time 0.8771 (0.8821)	loss 0.5644 (0.5102)	grad_norm 1.6134 (2.6802)	mem 20675MB
[2025-04-03 02:44:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][388/573]	eta 0:02:43 lr 0.000581	time 0.8772 (0.8821)	loss 0.5010 (0.5105)	grad_norm 2.3153 (2.6826)	mem 20675MB
[2025-04-03 02:44:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][390/573]	eta 0:02:41 lr 0.000581	time 0.8772 (0.8820)	loss 0.5313 (0.5104)	grad_norm 2.1247 (2.6784)	mem 20675MB
[2025-04-03 02:44:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][392/573]	eta 0:02:39 lr 0.000580	time 0.8774 (0.8820)	loss 0.6081 (0.5107)	grad_norm 2.8379 (2.6778)	mem 20675MB
[2025-04-03 02:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][394/573]	eta 0:02:37 lr 0.000580	time 0.8774 (0.8820)	loss 0.5997 (0.5111)	grad_norm 2.3890 (2.6754)	mem 20675MB
[2025-04-03 02:44:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][396/573]	eta 0:02:36 lr 0.000580	time 0.8779 (0.8820)	loss 0.4545 (0.5111)	grad_norm 3.1158 (2.6780)	mem 20675MB
[2025-04-03 02:44:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][398/573]	eta 0:02:34 lr 0.000580	time 0.8774 (0.8820)	loss 0.4635 (0.5108)	grad_norm 1.9777 (2.6771)	mem 20675MB
[2025-04-03 02:44:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][400/573]	eta 0:02:32 lr 0.000579	time 0.8771 (0.8820)	loss 0.5021 (0.5105)	grad_norm 2.0530 (2.6776)	mem 20675MB
[2025-04-03 02:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][402/573]	eta 0:02:30 lr 0.000579	time 0.8772 (0.8819)	loss 0.3852 (0.5102)	grad_norm 3.5045 (2.6781)	mem 20675MB
[2025-04-03 02:44:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][404/573]	eta 0:02:29 lr 0.000579	time 0.8774 (0.8819)	loss 0.4744 (0.5101)	grad_norm 2.1250 (2.6756)	mem 20675MB
[2025-04-03 02:44:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][406/573]	eta 0:02:27 lr 0.000579	time 0.8771 (0.8819)	loss 0.5520 (0.5099)	grad_norm 4.2966 (2.6836)	mem 20675MB
[2025-04-03 02:44:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][408/573]	eta 0:02:25 lr 0.000579	time 0.8770 (0.8819)	loss 0.3973 (0.5092)	grad_norm 2.8986 (2.6831)	mem 20675MB
[2025-04-03 02:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][410/573]	eta 0:02:23 lr 0.000578	time 0.8772 (0.8819)	loss 0.3858 (0.5088)	grad_norm 2.5042 (2.6830)	mem 20675MB
[2025-04-03 02:44:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][412/573]	eta 0:02:21 lr 0.000578	time 0.8769 (0.8818)	loss 0.4409 (0.5087)	grad_norm 4.8131 (2.6926)	mem 20675MB
[2025-04-03 02:44:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][414/573]	eta 0:02:20 lr 0.000578	time 0.8771 (0.8818)	loss 0.5357 (0.5085)	grad_norm 2.7853 (2.6949)	mem 20675MB
[2025-04-03 02:44:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][416/573]	eta 0:02:18 lr 0.000578	time 0.8773 (0.8818)	loss 0.5416 (0.5085)	grad_norm 3.0624 (2.6961)	mem 20675MB
[2025-04-03 02:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][418/573]	eta 0:02:16 lr 0.000577	time 0.8771 (0.8818)	loss 0.4936 (0.5081)	grad_norm 3.2731 (2.6994)	mem 20675MB
[2025-04-03 02:44:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][420/573]	eta 0:02:14 lr 0.000577	time 0.8771 (0.8818)	loss 0.5007 (0.5082)	grad_norm 4.2987 (2.7036)	mem 20675MB
[2025-04-03 02:44:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][422/573]	eta 0:02:13 lr 0.000577	time 0.8772 (0.8817)	loss 0.5260 (0.5085)	grad_norm 2.3244 (2.7037)	mem 20675MB
[2025-04-03 02:44:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][424/573]	eta 0:02:11 lr 0.000577	time 0.8775 (0.8817)	loss 0.5697 (0.5087)	grad_norm 2.5696 (2.7036)	mem 20675MB
[2025-04-03 02:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][426/573]	eta 0:02:09 lr 0.000577	time 0.8774 (0.8817)	loss 0.4769 (0.5088)	grad_norm 2.6516 (2.7044)	mem 20675MB
[2025-04-03 02:44:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][428/573]	eta 0:02:07 lr 0.000576	time 0.8772 (0.8817)	loss 0.3817 (0.5084)	grad_norm 3.7860 (2.7065)	mem 20675MB
[2025-04-03 02:44:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][430/573]	eta 0:02:06 lr 0.000576	time 0.8772 (0.8817)	loss 0.5763 (0.5087)	grad_norm 2.2165 (2.7036)	mem 20675MB
[2025-04-03 02:44:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][432/573]	eta 0:02:04 lr 0.000576	time 0.8774 (0.8817)	loss 0.6138 (0.5089)	grad_norm 1.9158 (2.6998)	mem 20675MB
[2025-04-03 02:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][434/573]	eta 0:02:02 lr 0.000576	time 0.8776 (0.8817)	loss 0.4494 (0.5087)	grad_norm 2.6274 (2.6995)	mem 20675MB
[2025-04-03 02:44:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][436/573]	eta 0:02:00 lr 0.000575	time 0.8778 (0.8816)	loss 0.5597 (0.5089)	grad_norm 1.6727 (2.6949)	mem 20675MB
[2025-04-03 02:44:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][438/573]	eta 0:01:59 lr 0.000575	time 0.8773 (0.8816)	loss 0.5221 (0.5087)	grad_norm 1.7306 (2.6971)	mem 20675MB
[2025-04-03 02:44:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][440/573]	eta 0:01:57 lr 0.000575	time 0.8771 (0.8816)	loss 0.5838 (0.5091)	grad_norm 2.2629 (2.6965)	mem 20675MB
[2025-04-03 02:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][442/573]	eta 0:01:55 lr 0.000575	time 0.8786 (0.8816)	loss 0.4463 (0.5090)	grad_norm 2.3105 (2.6965)	mem 20675MB
[2025-04-03 02:44:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][444/573]	eta 0:01:53 lr 0.000574	time 0.8769 (0.8816)	loss 0.3724 (0.5088)	grad_norm 2.2811 (2.6938)	mem 20675MB
[2025-04-03 02:44:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][446/573]	eta 0:01:51 lr 0.000574	time 0.8772 (0.8816)	loss 0.4294 (0.5087)	grad_norm 3.2337 (2.6942)	mem 20675MB
[2025-04-03 02:44:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][448/573]	eta 0:01:50 lr 0.000574	time 0.8772 (0.8815)	loss 0.5403 (0.5087)	grad_norm 1.9030 (2.6950)	mem 20675MB
[2025-04-03 02:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][450/573]	eta 0:01:48 lr 0.000574	time 0.8773 (0.8815)	loss 0.4032 (0.5080)	grad_norm 4.0116 (2.6988)	mem 20675MB
[2025-04-03 02:44:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][452/573]	eta 0:01:46 lr 0.000574	time 0.8770 (0.8815)	loss 0.3193 (0.5078)	grad_norm 2.6426 (2.6986)	mem 20675MB
[2025-04-03 02:45:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][454/573]	eta 0:01:44 lr 0.000573	time 0.8773 (0.8815)	loss 0.5052 (0.5076)	grad_norm 2.6991 (2.7019)	mem 20675MB
[2025-04-03 02:45:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][456/573]	eta 0:01:43 lr 0.000573	time 0.8773 (0.8815)	loss 0.5749 (0.5076)	grad_norm 3.2195 (2.7056)	mem 20675MB
[2025-04-03 02:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][458/573]	eta 0:01:41 lr 0.000573	time 0.8769 (0.8815)	loss 0.4789 (0.5075)	grad_norm 4.0446 (2.7076)	mem 20675MB
[2025-04-03 02:45:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][460/573]	eta 0:01:39 lr 0.000573	time 0.8773 (0.8815)	loss 0.5546 (0.5077)	grad_norm 2.3365 (2.7061)	mem 20675MB
[2025-04-03 02:45:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][462/573]	eta 0:01:37 lr 0.000572	time 0.8774 (0.8814)	loss 0.5238 (0.5078)	grad_norm 2.7652 (2.7051)	mem 20675MB
[2025-04-03 02:45:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][464/573]	eta 0:01:36 lr 0.000572	time 0.8773 (0.8814)	loss 0.3885 (0.5074)	grad_norm 2.6918 (2.7089)	mem 20675MB
[2025-04-03 02:45:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][466/573]	eta 0:01:34 lr 0.000572	time 0.8770 (0.8814)	loss 0.5600 (0.5076)	grad_norm 2.6693 (2.7064)	mem 20675MB
[2025-04-03 02:45:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][468/573]	eta 0:01:32 lr 0.000572	time 0.8772 (0.8814)	loss 0.4041 (0.5073)	grad_norm 2.9214 (2.7064)	mem 20675MB
[2025-04-03 02:45:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][470/573]	eta 0:01:30 lr 0.000572	time 0.8772 (0.8814)	loss 0.4470 (0.5072)	grad_norm 2.6099 (2.7056)	mem 20675MB
[2025-04-03 02:45:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][472/573]	eta 0:01:29 lr 0.000571	time 0.8771 (0.8814)	loss 0.5019 (0.5073)	grad_norm 1.7431 (2.7019)	mem 20675MB
[2025-04-03 02:45:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][474/573]	eta 0:01:27 lr 0.000571	time 0.8776 (0.8814)	loss 0.5761 (0.5072)	grad_norm 2.2348 (2.7013)	mem 20675MB
[2025-04-03 02:45:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][476/573]	eta 0:01:25 lr 0.000571	time 0.8772 (0.8813)	loss 0.5819 (0.5071)	grad_norm 2.7513 (2.7012)	mem 20675MB
[2025-04-03 02:45:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][478/573]	eta 0:01:23 lr 0.000571	time 0.8778 (0.8813)	loss 0.6474 (0.5074)	grad_norm 3.0823 (2.7012)	mem 20675MB
[2025-04-03 02:45:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][480/573]	eta 0:01:21 lr 0.000570	time 0.8773 (0.8813)	loss 0.5946 (0.5075)	grad_norm 2.4739 (2.7012)	mem 20675MB
[2025-04-03 02:45:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][482/573]	eta 0:01:20 lr 0.000570	time 0.8774 (0.8813)	loss 0.4589 (0.5075)	grad_norm 2.4369 (2.7004)	mem 20675MB
[2025-04-03 02:45:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][484/573]	eta 0:01:18 lr 0.000570	time 0.8772 (0.8813)	loss 0.4768 (0.5074)	grad_norm 2.7025 (2.7067)	mem 20675MB
[2025-04-03 02:45:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][486/573]	eta 0:01:16 lr 0.000570	time 0.8774 (0.8813)	loss 0.4964 (0.5076)	grad_norm 3.0826 (2.7072)	mem 20675MB
[2025-04-03 02:45:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][488/573]	eta 0:01:14 lr 0.000569	time 0.8772 (0.8813)	loss 0.5276 (0.5075)	grad_norm 1.8024 (2.7042)	mem 20675MB
[2025-04-03 02:45:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][490/573]	eta 0:01:13 lr 0.000569	time 0.8771 (0.8813)	loss 0.5460 (0.5075)	grad_norm 1.4370 (2.7007)	mem 20675MB
[2025-04-03 02:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][492/573]	eta 0:01:11 lr 0.000569	time 0.8774 (0.8812)	loss 0.4043 (0.5074)	grad_norm 3.2114 (2.7021)	mem 20675MB
[2025-04-03 02:45:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][494/573]	eta 0:01:09 lr 0.000569	time 0.8783 (0.8812)	loss 0.5243 (0.5075)	grad_norm 2.0412 (2.7009)	mem 20675MB
[2025-04-03 02:45:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][496/573]	eta 0:01:07 lr 0.000569	time 0.8770 (0.8812)	loss 0.5167 (0.5078)	grad_norm 2.7909 (2.7020)	mem 20675MB
[2025-04-03 02:45:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][498/573]	eta 0:01:06 lr 0.000568	time 0.8772 (0.8812)	loss 0.5177 (0.5079)	grad_norm 2.0492 (2.7002)	mem 20675MB
[2025-04-03 02:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][500/573]	eta 0:01:04 lr 0.000568	time 0.8782 (0.8812)	loss 0.3285 (0.5076)	grad_norm 2.3657 (2.7014)	mem 20675MB
[2025-04-03 02:45:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][502/573]	eta 0:01:02 lr 0.000568	time 0.8770 (0.8812)	loss 0.4107 (0.5072)	grad_norm 2.6392 (2.7007)	mem 20675MB
[2025-04-03 02:45:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][504/573]	eta 0:01:00 lr 0.000568	time 0.8773 (0.8812)	loss 0.4199 (0.5071)	grad_norm 2.3552 (2.6982)	mem 20675MB
[2025-04-03 02:45:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][506/573]	eta 0:00:59 lr 0.000567	time 0.8771 (0.8812)	loss 0.5834 (0.5073)	grad_norm 2.9289 (2.6978)	mem 20675MB
[2025-04-03 02:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][508/573]	eta 0:00:57 lr 0.000567	time 0.8774 (0.8812)	loss 0.4835 (0.5072)	grad_norm 3.1761 (2.6977)	mem 20675MB
[2025-04-03 02:45:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][510/573]	eta 0:00:55 lr 0.000567	time 0.8773 (0.8811)	loss 0.4920 (0.5073)	grad_norm 3.3665 (2.6976)	mem 20675MB
[2025-04-03 02:45:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][512/573]	eta 0:00:53 lr 0.000567	time 0.8775 (0.8811)	loss 0.5749 (0.5076)	grad_norm 2.8052 (2.6985)	mem 20675MB
[2025-04-03 02:45:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][514/573]	eta 0:00:51 lr 0.000567	time 0.8771 (0.8811)	loss 0.6267 (0.5079)	grad_norm 2.6674 (2.6982)	mem 20675MB
[2025-04-03 02:45:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][516/573]	eta 0:00:50 lr 0.000566	time 0.8770 (0.8811)	loss 0.3348 (0.5076)	grad_norm 2.6247 (2.6978)	mem 20675MB
[2025-04-03 02:45:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][518/573]	eta 0:00:48 lr 0.000566	time 0.8781 (0.8811)	loss 0.4835 (0.5075)	grad_norm 2.8412 (2.6971)	mem 20675MB
[2025-04-03 02:45:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][520/573]	eta 0:00:46 lr 0.000566	time 0.8774 (0.8811)	loss 0.3538 (0.5070)	grad_norm 4.3488 (2.6988)	mem 20675MB
[2025-04-03 02:46:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][522/573]	eta 0:00:44 lr 0.000566	time 0.8772 (0.8811)	loss 0.3364 (0.5067)	grad_norm 9.7705 (2.7112)	mem 20675MB
[2025-04-03 02:46:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][524/573]	eta 0:00:43 lr 0.000565	time 0.8774 (0.8811)	loss 0.5643 (0.5068)	grad_norm 2.3812 (2.7108)	mem 20675MB
[2025-04-03 02:46:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][526/573]	eta 0:00:41 lr 0.000565	time 0.8775 (0.8811)	loss 0.5301 (0.5067)	grad_norm 2.9473 (2.7122)	mem 20675MB
[2025-04-03 02:46:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][528/573]	eta 0:00:39 lr 0.000565	time 0.8771 (0.8810)	loss 0.4341 (0.5064)	grad_norm 2.8021 (2.7163)	mem 20675MB
[2025-04-03 02:46:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][530/573]	eta 0:00:37 lr 0.000565	time 0.8770 (0.8810)	loss 0.4832 (0.5063)	grad_norm 2.1801 (2.7140)	mem 20675MB
[2025-04-03 02:46:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][532/573]	eta 0:00:36 lr 0.000564	time 0.8774 (0.8810)	loss 0.5305 (0.5064)	grad_norm 7.1662 (2.7217)	mem 20675MB
[2025-04-03 02:46:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][534/573]	eta 0:00:34 lr 0.000564	time 0.8771 (0.8810)	loss 0.5421 (0.5064)	grad_norm 2.8192 (2.7225)	mem 20675MB
[2025-04-03 02:46:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][536/573]	eta 0:00:32 lr 0.000564	time 0.8774 (0.8810)	loss 0.5691 (0.5066)	grad_norm 2.1750 (2.7213)	mem 20675MB
[2025-04-03 02:46:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][538/573]	eta 0:00:30 lr 0.000564	time 0.8771 (0.8810)	loss 0.4489 (0.5064)	grad_norm 3.6674 (2.7263)	mem 20675MB
[2025-04-03 02:46:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][540/573]	eta 0:00:29 lr 0.000564	time 0.8773 (0.8810)	loss 0.5953 (0.5067)	grad_norm 2.1394 (2.7258)	mem 20675MB
[2025-04-03 02:46:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][542/573]	eta 0:00:27 lr 0.000563	time 0.8769 (0.8810)	loss 0.3812 (0.5062)	grad_norm 3.4330 (2.7279)	mem 20675MB
[2025-04-03 02:46:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][544/573]	eta 0:00:25 lr 0.000563	time 0.8773 (0.8810)	loss 0.6008 (0.5064)	grad_norm 3.5371 (2.7296)	mem 20675MB
[2025-04-03 02:46:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][546/573]	eta 0:00:23 lr 0.000563	time 0.8773 (0.8809)	loss 0.5806 (0.5066)	grad_norm 2.8711 (2.7286)	mem 20675MB
[2025-04-03 02:46:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][548/573]	eta 0:00:22 lr 0.000563	time 0.8776 (0.8809)	loss 0.4308 (0.5065)	grad_norm 2.1118 (2.7259)	mem 20675MB
[2025-04-03 02:46:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][550/573]	eta 0:00:20 lr 0.000562	time 0.8773 (0.8809)	loss 0.5264 (0.5063)	grad_norm 2.9659 (2.7264)	mem 20675MB
[2025-04-03 02:46:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][552/573]	eta 0:00:18 lr 0.000562	time 0.8772 (0.8809)	loss 0.4613 (0.5064)	grad_norm 2.9539 (2.7262)	mem 20675MB
[2025-04-03 02:46:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][554/573]	eta 0:00:16 lr 0.000562	time 0.8771 (0.8809)	loss 0.4704 (0.5060)	grad_norm 2.5725 (2.7272)	mem 20675MB
[2025-04-03 02:46:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][556/573]	eta 0:00:14 lr 0.000562	time 0.8776 (0.8809)	loss 0.4515 (0.5059)	grad_norm 3.8336 (2.7300)	mem 20675MB
[2025-04-03 02:46:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][558/573]	eta 0:00:13 lr 0.000562	time 0.8769 (0.8809)	loss 0.4699 (0.5059)	grad_norm 4.2654 (2.7336)	mem 20675MB
[2025-04-03 02:46:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][560/573]	eta 0:00:11 lr 0.000561	time 0.8778 (0.8809)	loss 0.5927 (0.5059)	grad_norm 4.2816 (2.7385)	mem 20675MB
[2025-04-03 02:46:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][562/573]	eta 0:00:09 lr 0.000561	time 0.8772 (0.8809)	loss 0.4770 (0.5058)	grad_norm 3.3804 (2.7398)	mem 20675MB
[2025-04-03 02:46:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][564/573]	eta 0:00:07 lr 0.000561	time 0.8774 (0.8809)	loss 0.5823 (0.5057)	grad_norm 2.5708 (2.7395)	mem 20675MB
[2025-04-03 02:46:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][566/573]	eta 0:00:06 lr 0.000561	time 0.8769 (0.8809)	loss 0.5647 (0.5058)	grad_norm 2.4223 (2.7413)	mem 20675MB
[2025-04-03 02:46:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][568/573]	eta 0:00:04 lr 0.000560	time 0.8776 (0.8809)	loss 0.5357 (0.5060)	grad_norm 1.8675 (2.7393)	mem 20675MB
[2025-04-03 02:46:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][570/573]	eta 0:00:02 lr 0.000560	time 0.8774 (0.8808)	loss 0.5751 (0.5061)	grad_norm 2.4687 (2.7394)	mem 20675MB
[2025-04-03 02:46:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][572/573]	eta 0:00:00 lr 0.000560	time 0.8771 (0.8808)	loss 0.5283 (0.5061)	grad_norm 2.7372 (2.7401)	mem 20675MB
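Each train line above pairs an instantaneous value with a running average, e.g. `time 0.8771 (0.8808)`, and derives `eta` from the average step time multiplied by the remaining steps (check: 0.8833 s × (573 − 294) ≈ 246 s ≈ the `0:04:06` printed at step 294). A minimal sketch of that bookkeeping; the `AverageMeter` name follows the common timm/Swin convention, and the exact implementation in `main_finetune.py` may differ:

```python
import datetime

class AverageMeter:
    """Tracks the latest value and the running average, as in the
    `time 0.8771 (0.8808)` pairs printed on every train line."""
    def __init__(self):
        self.val = 0.0
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

def eta_string(batch_time_avg, num_steps, idx):
    # Remaining wall-clock time, formatted like "0:04:06" in the log.
    remaining = batch_time_avg * (num_steps - idx)
    return str(datetime.timedelta(seconds=int(remaining)))
```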
[2025-04-03 02:46:44 simmim_finetune] (main_finetune.py 260): INFO EPOCH 15 training takes 0:08:24
[2025-04-03 02:46:44 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt15.pth saving......
[2025-04-03 02:46:47 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt15.pth saved !!!
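The `ckpt15.pth saving...... / saved !!!` pair comes from a checkpoint helper in `utils.py`. A self-contained sketch of that step: the real code uses `torch.save` on the model/optimizer/scheduler state dicts, so `pickle` stands in here to keep the example dependency-free, and the `state` contents shown are assumptions:

```python
import os
import pickle

def save_checkpoint(output_dir, epoch, state, logger=print):
    """Sketch of the per-epoch checkpoint save logged above.
    `state` would normally hold model/optimizer/lr_scheduler
    state dicts plus bookkeeping such as epoch and max accuracy."""
    path = os.path.join(output_dir, f"ckpt{epoch}.pth")
    logger(f"{path} saving......")
    with open(path, "wb") as f:
        pickle.dump(state, f)
    logger(f"{path} saved !!!")
    return path
```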
[2025-04-03 02:46:49 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.875 (1.875)	Loss 0.5941 (0.5941)	Acc@1 64.062 (64.062)	Mem 20675MB
[2025-04-03 02:46:49 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.288 (0.816)	Loss 0.5574 (0.5638)	Acc@1 71.094 (68.750)	Mem 20675MB
[2025-04-03 02:46:50 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.603)	Loss 0.6016 (0.5664)	Acc@1 66.406 (68.750)	Mem 20675MB
[2025-04-03 02:46:50 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.512)	Loss 0.5492 (0.5587)	Acc@1 75.000 (70.089)	Mem 20675MB
[2025-04-03 02:46:51 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.461)	Loss 0.4470 (0.5269)	Acc@1 80.469 (73.003)	Mem 20675MB
[2025-04-03 02:46:52 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.429)	Loss 0.4105 (0.5089)	Acc@1 86.719 (74.503)	Mem 20675MB
[2025-04-03 02:46:52 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.407)	Loss 0.4045 (0.4948)	Acc@1 82.031 (75.541)	Mem 20675MB
[2025-04-03 02:46:53 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.390)	Loss 0.3954 (0.4827)	Acc@1 82.812 (76.615)	Mem 20675MB
[2025-04-03 02:46:53 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.815
[2025-04-03 02:46:53 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 76.8%
[2025-04-03 02:46:53 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 76.81%
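The running `Acc@1` values above are weighted by batch size: 1984 test images at `BATCH_SIZE` 128 give 16 batches, with the last holding only 1984 − 15×128 = 64 images, which is why the final 76.815 differs slightly from the `[14/16]` running value of 76.615. A hedged sketch of the arithmetic; function names are illustrative, not the actual helpers:

```python
def accuracy_top1(logits, targets):
    # Fraction of argmax predictions matching targets, in percent,
    # as in the per-batch Acc@1 column of the test log.
    correct = sum(1 for row, t in zip(logits, targets)
                  if max(range(len(row)), key=row.__getitem__) == t)
    return 100.0 * correct / len(targets)

def overall_acc(batch_accs, batch_sizes):
    # Batch accuracies must be weighted by batch size before averaging;
    # an unweighted mean would over-count the short final batch.
    total = sum(batch_sizes)
    return sum(a * n for a, n in zip(batch_accs, batch_sizes)) / total
```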
[2025-04-03 02:46:53 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.2076774857206484e-06, 2.2076774857206484e-06, 3.322095872840743e-06, 3.322095872840743e-06, 5.03658569917935e-06, 5.03658569917935e-06, 7.6742623550849e-06, 7.6742623550849e-06, 1.1732226441093436e-05, 1.1732226441093436e-05, 1.79752481118758e-05, 1.79752481118758e-05, 2.757989683615636e-05, 2.757989683615636e-05, 4.2356279488895676e-05, 4.2356279488895676e-05, 6.50891758777254e-05, 6.50891758777254e-05, 0.00010006286262977115, 0.00010006286262977115, 0.00015386853455599535, 0.00015386853455599535, 0.00023664649136557103, 0.00023664649136557103, 0.0003639971941495336, 0.0003639971941495336, 0.0005599213522787069, 0.0005599213522787069]
[2025-04-03 02:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][0/573]	eta 0:23:29 lr 0.000560	time 2.4595 (2.4595)	loss 0.4036 (0.4036)	grad_norm 3.6390 (3.6390)	mem 20675MB
[2025-04-03 02:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][2/573]	eta 0:13:22 lr 0.000560	time 0.8773 (1.4053)	loss 0.5999 (0.5034)	grad_norm 2.3411 (3.3430)	mem 20675MB
[2025-04-03 02:46:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][4/573]	eta 0:11:19 lr 0.000559	time 0.8778 (1.1946)	loss 0.4587 (0.4726)	grad_norm 2.7593 (3.5512)	mem 20675MB
[2025-04-03 02:47:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][6/573]	eta 0:10:26 lr 0.000559	time 0.8776 (1.1042)	loss 0.4121 (0.4725)	grad_norm 6.1662 (3.7958)	mem 20675MB
[2025-04-03 02:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][8/573]	eta 0:09:55 lr 0.000559	time 0.8775 (1.0540)	loss 0.5561 (0.4641)	grad_norm 2.5519 (3.5271)	mem 20675MB
[2025-04-03 02:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][10/573]	eta 0:09:35 lr 0.000559	time 0.8783 (1.0222)	loss 0.4576 (0.4685)	grad_norm 3.7541 (3.4579)	mem 20675MB
[2025-04-03 02:47:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][12/573]	eta 0:09:21 lr 0.000558	time 0.8775 (1.0001)	loss 0.4213 (0.4720)	grad_norm 3.9644 (3.3959)	mem 20675MB
[2025-04-03 02:47:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][14/573]	eta 0:09:09 lr 0.000558	time 0.8774 (0.9839)	loss 0.3213 (0.4701)	grad_norm 3.0889 (3.3598)	mem 20675MB
[2025-04-03 02:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][16/573]	eta 0:09:01 lr 0.000558	time 0.8774 (0.9715)	loss 0.4260 (0.4745)	grad_norm 3.5375 (3.3275)	mem 20675MB
[2025-04-03 02:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][18/573]	eta 0:08:53 lr 0.000558	time 0.8771 (0.9617)	loss 0.4747 (0.4743)	grad_norm 2.2524 (3.2520)	mem 20675MB
[2025-04-03 02:47:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][20/573]	eta 0:08:47 lr 0.000558	time 0.8784 (0.9538)	loss 0.5583 (0.4806)	grad_norm 2.1747 (3.1399)	mem 20675MB
[2025-04-03 02:47:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][22/573]	eta 0:08:41 lr 0.000557	time 0.8774 (0.9472)	loss 0.3422 (0.4790)	grad_norm 3.8801 (3.1419)	mem 20675MB
[2025-04-03 02:47:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][24/573]	eta 0:08:36 lr 0.000557	time 0.8774 (0.9417)	loss 0.4219 (0.4757)	grad_norm 2.6184 (3.1848)	mem 20675MB
[2025-04-03 02:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][26/573]	eta 0:08:32 lr 0.000557	time 0.8775 (0.9370)	loss 0.6086 (0.4794)	grad_norm 2.4023 (3.1481)	mem 20675MB
[2025-04-03 02:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][28/573]	eta 0:08:28 lr 0.000557	time 0.8775 (0.9330)	loss 0.4715 (0.4831)	grad_norm 2.5753 (3.0805)	mem 20675MB
[2025-04-03 02:47:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][30/573]	eta 0:08:24 lr 0.000556	time 0.8775 (0.9295)	loss 0.4701 (0.4867)	grad_norm 3.3366 (3.1153)	mem 20675MB
[2025-04-03 02:47:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][32/573]	eta 0:08:21 lr 0.000556	time 0.8777 (0.9264)	loss 0.5089 (0.4889)	grad_norm 3.6726 (3.1330)	mem 20675MB
[2025-04-03 02:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][34/573]	eta 0:08:17 lr 0.000556	time 0.8774 (0.9236)	loss 0.6283 (0.4898)	grad_norm 2.5798 (3.1650)	mem 20675MB
[2025-04-03 02:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][36/573]	eta 0:08:14 lr 0.000556	time 0.8775 (0.9212)	loss 0.4999 (0.4915)	grad_norm 3.0789 (3.1595)	mem 20675MB
[2025-04-03 02:47:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][38/573]	eta 0:08:11 lr 0.000555	time 0.8773 (0.9190)	loss 0.4048 (0.4879)	grad_norm 3.4266 (3.1667)	mem 20675MB
[2025-04-03 02:47:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][40/573]	eta 0:08:08 lr 0.000555	time 0.8778 (0.9170)	loss 0.6356 (0.4934)	grad_norm 3.0247 (3.1425)	mem 20675MB
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][42/573]	eta 0:08:05 lr 0.000555	time 0.8773 (0.9152)	loss 0.5487 (0.4973)	grad_norm 1.5593 (3.0767)	mem 20675MB
[2025-04-03 02:47:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][44/573]	eta 0:08:03 lr 0.000555	time 0.8771 (0.9135)	loss 0.5234 (0.4951)	grad_norm 2.0754 (3.0515)	mem 20675MB
[2025-04-03 02:47:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][46/573]	eta 0:08:00 lr 0.000555	time 0.8775 (0.9120)	loss 0.5594 (0.4958)	grad_norm 1.6403 (3.0294)	mem 20675MB
[2025-04-03 02:47:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][48/573]	eta 0:07:58 lr 0.000554	time 0.8773 (0.9107)	loss 0.5521 (0.4982)	grad_norm 1.4515 (2.9856)	mem 20675MB
[2025-04-03 02:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][50/573]	eta 0:07:55 lr 0.000554	time 0.8773 (0.9094)	loss 0.5888 (0.5009)	grad_norm 2.4500 (2.9723)	mem 20675MB
[2025-04-03 02:47:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][52/573]	eta 0:07:53 lr 0.000554	time 0.8771 (0.9082)	loss 0.4609 (0.5017)	grad_norm 2.4302 (2.9402)	mem 20675MB
[2025-04-03 02:47:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][54/573]	eta 0:07:50 lr 0.000554	time 0.8772 (0.9071)	loss 0.5719 (0.5024)	grad_norm 2.2139 (2.9194)	mem 20675MB
[2025-04-03 02:47:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][56/573]	eta 0:07:48 lr 0.000553	time 0.8771 (0.9061)	loss 0.5746 (0.5019)	grad_norm 1.6778 (2.9160)	mem 20675MB
[2025-04-03 02:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][58/573]	eta 0:07:46 lr 0.000553	time 0.8774 (0.9051)	loss 0.4951 (0.5004)	grad_norm 2.9410 (2.9291)	mem 20675MB
[2025-04-03 02:47:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][60/573]	eta 0:07:43 lr 0.000553	time 0.8773 (0.9043)	loss 0.4390 (0.5001)	grad_norm 3.7763 (2.9273)	mem 20675MB
[2025-04-03 02:47:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][62/573]	eta 0:07:41 lr 0.000553	time 0.8774 (0.9034)	loss 0.5848 (0.4988)	grad_norm 2.4196 (2.9144)	mem 20675MB
[2025-04-03 02:47:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][64/573]	eta 0:07:39 lr 0.000553	time 0.8773 (0.9026)	loss 0.4663 (0.4974)	grad_norm 3.5067 (2.9184)	mem 20675MB
[2025-04-03 02:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][66/573]	eta 0:07:37 lr 0.000552	time 0.8773 (0.9019)	loss 0.5744 (0.4975)	grad_norm 2.6868 (2.9656)	mem 20675MB
[2025-04-03 02:47:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][68/573]	eta 0:07:35 lr 0.000552	time 0.8772 (0.9012)	loss 0.4186 (0.4959)	grad_norm 2.4776 (2.9465)	mem 20675MB
[2025-04-03 02:47:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][70/573]	eta 0:07:32 lr 0.000552	time 0.8773 (0.9006)	loss 0.5610 (0.4974)	grad_norm 2.4740 (2.9259)	mem 20675MB
[2025-04-03 02:47:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][72/573]	eta 0:07:30 lr 0.000552	time 0.8772 (0.8999)	loss 0.4811 (0.4965)	grad_norm 2.6254 (2.9192)	mem 20675MB
[2025-04-03 02:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][74/573]	eta 0:07:28 lr 0.000551	time 0.8773 (0.8994)	loss 0.4540 (0.4963)	grad_norm 3.3550 (2.9299)	mem 20675MB
[2025-04-03 02:48:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][76/573]	eta 0:07:26 lr 0.000551	time 0.8773 (0.8988)	loss 0.5048 (0.4951)	grad_norm 2.5484 (2.9362)	mem 20675MB
[2025-04-03 02:48:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][78/573]	eta 0:07:24 lr 0.000551	time 0.8771 (0.8983)	loss 0.3888 (0.4949)	grad_norm 5.5068 (2.9616)	mem 20675MB
[2025-04-03 02:48:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][80/573]	eta 0:07:22 lr 0.000551	time 0.8772 (0.8978)	loss 0.5541 (0.4954)	grad_norm 2.8127 (2.9698)	mem 20675MB
[2025-04-03 02:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][82/573]	eta 0:07:20 lr 0.000551	time 0.8772 (0.8973)	loss 0.5249 (0.4967)	grad_norm 1.6491 (2.9483)	mem 20675MB
[2025-04-03 02:48:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][84/573]	eta 0:07:18 lr 0.000550	time 0.8773 (0.8969)	loss 0.6007 (0.4987)	grad_norm 1.9451 (2.9349)	mem 20675MB
[2025-04-03 02:48:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][86/573]	eta 0:07:16 lr 0.000550	time 0.8774 (0.8964)	loss 0.5772 (0.4990)	grad_norm 2.7843 (2.9323)	mem 20675MB
[2025-04-03 02:48:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][88/573]	eta 0:07:14 lr 0.000550	time 0.8771 (0.8960)	loss 0.4504 (0.4976)	grad_norm 2.0221 (2.9308)	mem 20675MB
[2025-04-03 02:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][90/573]	eta 0:07:12 lr 0.000550	time 0.8774 (0.8956)	loss 0.5118 (0.4986)	grad_norm 2.7260 (2.9630)	mem 20675MB
[2025-04-03 02:48:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][92/573]	eta 0:07:10 lr 0.000549	time 0.8773 (0.8952)	loss 0.5380 (0.4995)	grad_norm 1.6949 (2.9383)	mem 20675MB
[2025-04-03 02:48:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][94/573]	eta 0:07:08 lr 0.000549	time 0.8773 (0.8949)	loss 0.4636 (0.4996)	grad_norm 2.9708 (2.9368)	mem 20675MB
[2025-04-03 02:48:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][96/573]	eta 0:07:06 lr 0.000549	time 0.8773 (0.8945)	loss 0.4934 (0.4999)	grad_norm 2.3842 (2.9174)	mem 20675MB
[2025-04-03 02:48:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][98/573]	eta 0:07:04 lr 0.000549	time 0.8774 (0.8942)	loss 0.5915 (0.5009)	grad_norm 2.0683 (2.8999)	mem 20675MB
[2025-04-03 02:48:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][100/573]	eta 0:07:02 lr 0.000548	time 0.8774 (0.8939)	loss 0.3993 (0.5004)	grad_norm 3.3725 (2.8925)	mem 20675MB
[2025-04-03 02:48:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][102/573]	eta 0:07:00 lr 0.000548	time 0.8774 (0.8936)	loss 0.3665 (0.4998)	grad_norm 2.7445 (2.8858)	mem 20675MB
[2025-04-03 02:48:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][104/573]	eta 0:06:58 lr 0.000548	time 0.8773 (0.8933)	loss 0.5460 (0.5013)	grad_norm 2.2800 (2.8807)	mem 20675MB
[2025-04-03 02:48:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][106/573]	eta 0:06:57 lr 0.000548	time 0.8774 (0.8930)	loss 0.5508 (0.5019)	grad_norm 2.3336 (2.8717)	mem 20675MB
[2025-04-03 02:48:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][108/573]	eta 0:06:55 lr 0.000548	time 0.8772 (0.8927)	loss 0.4037 (0.5015)	grad_norm 3.3446 (2.8701)	mem 20675MB
[2025-04-03 02:48:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][110/573]	eta 0:06:53 lr 0.000547	time 0.8774 (0.8925)	loss 0.5599 (0.5034)	grad_norm 2.1695 (2.8611)	mem 20675MB
[2025-04-03 02:48:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][112/573]	eta 0:06:51 lr 0.000547	time 0.8771 (0.8922)	loss 0.5221 (0.5033)	grad_norm 1.5361 (2.8513)	mem 20675MB
[2025-04-03 02:48:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][114/573]	eta 0:06:49 lr 0.000547	time 0.8771 (0.8920)	loss 0.5189 (0.5031)	grad_norm 2.3425 (2.8505)	mem 20675MB
[2025-04-03 02:48:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][116/573]	eta 0:06:47 lr 0.000547	time 0.8774 (0.8917)	loss 0.4153 (0.5027)	grad_norm 3.3520 (2.8539)	mem 20675MB
[2025-04-03 02:48:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][118/573]	eta 0:06:45 lr 0.000546	time 0.8774 (0.8915)	loss 0.5289 (0.5034)	grad_norm 1.8931 (2.8363)	mem 20675MB
[2025-04-03 02:48:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][120/573]	eta 0:06:43 lr 0.000546	time 0.8775 (0.8913)	loss 0.5590 (0.5037)	grad_norm 2.0282 (2.8294)	mem 20675MB
[2025-04-03 02:48:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][122/573]	eta 0:06:41 lr 0.000546	time 0.8773 (0.8911)	loss 0.3632 (0.5026)	grad_norm 2.1624 (2.8194)	mem 20675MB
[2025-04-03 02:48:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][124/573]	eta 0:06:40 lr 0.000546	time 0.8774 (0.8909)	loss 0.3954 (0.5021)	grad_norm 6.3571 (2.8448)	mem 20675MB
[2025-04-03 02:48:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][126/573]	eta 0:06:38 lr 0.000546	time 0.8771 (0.8907)	loss 0.5878 (0.5038)	grad_norm 2.0336 (2.8304)	mem 20675MB
[2025-04-03 02:48:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][128/573]	eta 0:06:36 lr 0.000545	time 0.8785 (0.8905)	loss 0.3323 (0.5018)	grad_norm 2.5415 (2.8288)	mem 20675MB
[2025-04-03 02:48:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][130/573]	eta 0:06:34 lr 0.000545	time 0.8773 (0.8903)	loss 0.5350 (0.5017)	grad_norm 2.1958 (2.8244)	mem 20675MB
[2025-04-03 02:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][132/573]	eta 0:06:32 lr 0.000545	time 0.8772 (0.8901)	loss 0.5286 (0.5021)	grad_norm 2.2520 (2.8140)	mem 20675MB
[2025-04-03 02:48:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][134/573]	eta 0:06:30 lr 0.000545	time 0.8775 (0.8899)	loss 0.6052 (0.5017)	grad_norm 2.3674 (2.8131)	mem 20675MB
[2025-04-03 02:48:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][136/573]	eta 0:06:28 lr 0.000544	time 0.8774 (0.8898)	loss 0.5266 (0.5019)	grad_norm 2.2687 (2.8112)	mem 20675MB
[2025-04-03 02:48:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][138/573]	eta 0:06:26 lr 0.000544	time 0.8775 (0.8896)	loss 0.5911 (0.5030)	grad_norm 2.9612 (2.8154)	mem 20675MB
[2025-04-03 02:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][140/573]	eta 0:06:25 lr 0.000544	time 0.8772 (0.8894)	loss 0.5656 (0.5033)	grad_norm 2.9260 (2.8149)	mem 20675MB
[2025-04-03 02:49:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][142/573]	eta 0:06:23 lr 0.000544	time 0.8771 (0.8893)	loss 0.5255 (0.5025)	grad_norm 2.7885 (2.8259)	mem 20675MB
[2025-04-03 02:49:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][144/573]	eta 0:06:21 lr 0.000543	time 0.8773 (0.8891)	loss 0.4268 (0.5010)	grad_norm 2.8700 (2.8403)	mem 20675MB
[2025-04-03 02:49:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][146/573]	eta 0:06:19 lr 0.000543	time 0.8774 (0.8890)	loss 0.6556 (0.5016)	grad_norm 3.2135 (2.8479)	mem 20675MB
[2025-04-03 02:49:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][148/573]	eta 0:06:17 lr 0.000543	time 0.8770 (0.8888)	loss 0.5103 (0.5012)	grad_norm 2.3668 (2.8561)	mem 20675MB
[2025-04-03 02:49:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][150/573]	eta 0:06:15 lr 0.000543	time 0.8774 (0.8887)	loss 0.5335 (0.5016)	grad_norm 2.3606 (2.8529)	mem 20675MB
[2025-04-03 02:49:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][152/573]	eta 0:06:14 lr 0.000543	time 0.8774 (0.8885)	loss 0.5232 (0.5019)	grad_norm 2.5180 (2.8565)	mem 20675MB
[2025-04-03 02:49:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][154/573]	eta 0:06:12 lr 0.000542	time 0.8774 (0.8884)	loss 0.4195 (0.5018)	grad_norm 2.8836 (2.8600)	mem 20675MB
[2025-04-03 02:49:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][156/573]	eta 0:06:10 lr 0.000542	time 0.8775 (0.8883)	loss 0.5727 (0.5018)	grad_norm 2.2720 (2.8531)	mem 20675MB
[2025-04-03 02:49:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][158/573]	eta 0:06:08 lr 0.000542	time 0.8775 (0.8882)	loss 0.3658 (0.5008)	grad_norm 2.4925 (2.8471)	mem 20675MB
[2025-04-03 02:49:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][160/573]	eta 0:06:06 lr 0.000542	time 0.8772 (0.8880)	loss 0.4691 (0.4999)	grad_norm 4.2203 (2.8651)	mem 20675MB
[2025-04-03 02:49:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][162/573]	eta 0:06:04 lr 0.000541	time 0.8772 (0.8879)	loss 0.3959 (0.4996)	grad_norm 2.2560 (2.8612)	mem 20675MB
[2025-04-03 02:49:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][164/573]	eta 0:06:03 lr 0.000541	time 0.8772 (0.8878)	loss 0.5434 (0.4995)	grad_norm 1.9542 (2.8613)	mem 20675MB
[2025-04-03 02:49:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][166/573]	eta 0:06:01 lr 0.000541	time 0.8770 (0.8877)	loss 0.5824 (0.4995)	grad_norm 3.2878 (2.8650)	mem 20675MB
[2025-04-03 02:49:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][168/573]	eta 0:05:59 lr 0.000541	time 0.8774 (0.8876)	loss 0.4053 (0.4991)	grad_norm 2.8807 (2.8621)	mem 20675MB
[2025-04-03 02:49:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][170/573]	eta 0:05:57 lr 0.000541	time 0.8773 (0.8874)	loss 0.4742 (0.4991)	grad_norm 1.9456 (2.8519)	mem 20675MB
[2025-04-03 02:49:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][172/573]	eta 0:05:55 lr 0.000540	time 0.8774 (0.8873)	loss 0.5362 (0.4999)	grad_norm 2.3530 (2.8473)	mem 20675MB
[2025-04-03 02:49:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][174/573]	eta 0:05:54 lr 0.000540	time 0.8772 (0.8872)	loss 0.4637 (0.4991)	grad_norm 3.6066 (2.8674)	mem 20675MB
[2025-04-03 02:49:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][176/573]	eta 0:05:52 lr 0.000540	time 0.8783 (0.8871)	loss 0.4640 (0.4990)	grad_norm 5.1019 (2.8765)	mem 20675MB
[2025-04-03 02:49:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][178/573]	eta 0:05:50 lr 0.000540	time 0.8771 (0.8870)	loss 0.4291 (0.4986)	grad_norm 2.7288 (2.8724)	mem 20675MB
[2025-04-03 02:49:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][180/573]	eta 0:05:48 lr 0.000539	time 0.8771 (0.8869)	loss 0.4229 (0.4985)	grad_norm 3.5495 (2.8750)	mem 20675MB
[2025-04-03 02:49:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][182/573]	eta 0:05:46 lr 0.000539	time 0.8771 (0.8868)	loss 0.5087 (0.4988)	grad_norm 2.5558 (2.8753)	mem 20675MB
[2025-04-03 02:49:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][184/573]	eta 0:05:44 lr 0.000539	time 0.8775 (0.8867)	loss 0.5098 (0.4992)	grad_norm 2.7580 (2.8711)	mem 20675MB
[2025-04-03 02:49:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][186/573]	eta 0:05:43 lr 0.000539	time 0.8772 (0.8866)	loss 0.4990 (0.4987)	grad_norm 3.6121 (2.8770)	mem 20675MB
[2025-04-03 02:49:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][188/573]	eta 0:05:41 lr 0.000538	time 0.8774 (0.8866)	loss 0.5806 (0.4989)	grad_norm 2.0745 (2.8757)	mem 20675MB
[2025-04-03 02:49:42 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][190/573]	eta 0:05:39 lr 0.000538	time 0.8772 (0.8865)	loss 0.5336 (0.4988)	grad_norm 2.7652 (2.8725)	mem 20675MB
[2025-04-03 02:49:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][192/573]	eta 0:05:37 lr 0.000538	time 0.8773 (0.8864)	loss 0.6676 (0.5001)	grad_norm 2.5798 (2.8718)	mem 20675MB
[2025-04-03 02:49:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][194/573]	eta 0:05:35 lr 0.000538	time 0.8774 (0.8863)	loss 0.5304 (0.5003)	grad_norm 1.6991 (2.8634)	mem 20675MB
[2025-04-03 02:49:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][196/573]	eta 0:05:34 lr 0.000538	time 0.8773 (0.8862)	loss 0.5397 (0.5009)	grad_norm 2.1969 (2.8552)	mem 20675MB
[2025-04-03 02:49:49 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][198/573]	eta 0:05:32 lr 0.000537	time 0.8772 (0.8861)	loss 0.5935 (0.5016)	grad_norm 1.6827 (2.8462)	mem 20675MB
[2025-04-03 02:49:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][200/573]	eta 0:05:30 lr 0.000537	time 0.8770 (0.8860)	loss 0.4999 (0.5018)	grad_norm 2.3171 (2.8405)	mem 20675MB
[2025-04-03 02:49:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][202/573]	eta 0:05:28 lr 0.000537	time 0.8771 (0.8860)	loss 0.5155 (0.5012)	grad_norm 2.2930 (2.8382)	mem 20675MB
[2025-04-03 02:49:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][204/573]	eta 0:05:26 lr 0.000537	time 0.8771 (0.8859)	loss 0.5026 (0.5006)	grad_norm 1.7956 (2.8374)	mem 20675MB
[2025-04-03 02:49:56 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][206/573]	eta 0:05:25 lr 0.000536	time 0.8771 (0.8858)	loss 0.4656 (0.5008)	grad_norm 2.2928 (2.8322)	mem 20675MB
[2025-04-03 02:49:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][208/573]	eta 0:05:23 lr 0.000536	time 0.8772 (0.8857)	loss 0.3590 (0.4998)	grad_norm 2.8266 (2.8303)	mem 20675MB
[2025-04-03 02:50:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][210/573]	eta 0:05:21 lr 0.000536	time 0.8773 (0.8857)	loss 0.5560 (0.5002)	grad_norm 1.7287 (2.8229)	mem 20675MB
[2025-04-03 02:50:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][212/573]	eta 0:05:19 lr 0.000536	time 0.8775 (0.8856)	loss 0.6778 (0.5008)	grad_norm 3.3669 (2.8225)	mem 20675MB
[2025-04-03 02:50:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][214/573]	eta 0:05:17 lr 0.000536	time 0.8773 (0.8855)	loss 0.5339 (0.5015)	grad_norm 1.7582 (2.8184)	mem 20675MB
[2025-04-03 02:50:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][216/573]	eta 0:05:16 lr 0.000535	time 0.8773 (0.8855)	loss 0.4217 (0.5008)	grad_norm 2.6989 (2.8198)	mem 20675MB
[2025-04-03 02:50:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][218/573]	eta 0:05:14 lr 0.000535	time 0.8772 (0.8854)	loss 0.5118 (0.5011)	grad_norm 2.3040 (2.8134)	mem 20675MB
[2025-04-03 02:50:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][220/573]	eta 0:05:12 lr 0.000535	time 0.8772 (0.8853)	loss 0.4811 (0.5007)	grad_norm 2.1739 (2.8104)	mem 20675MB
[2025-04-03 02:50:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][222/573]	eta 0:05:10 lr 0.000535	time 0.8774 (0.8853)	loss 0.6418 (0.5018)	grad_norm 3.0599 (2.8080)	mem 20675MB
[2025-04-03 02:50:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][224/573]	eta 0:05:08 lr 0.000534	time 0.8773 (0.8852)	loss 0.5137 (0.5021)	grad_norm 2.2889 (2.8026)	mem 20675MB
[2025-04-03 02:50:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][226/573]	eta 0:05:07 lr 0.000534	time 0.8773 (0.8851)	loss 0.5839 (0.5027)	grad_norm 2.3805 (2.8002)	mem 20675MB
[2025-04-03 02:50:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][228/573]	eta 0:05:05 lr 0.000534	time 0.8774 (0.8851)	loss 0.4216 (0.5024)	grad_norm 2.6404 (2.7980)	mem 20675MB
[2025-04-03 02:50:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][230/573]	eta 0:05:03 lr 0.000534	time 0.8772 (0.8850)	loss 0.3671 (0.5016)	grad_norm 2.7382 (2.7973)	mem 20675MB
[2025-04-03 02:50:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][232/573]	eta 0:05:01 lr 0.000534	time 0.8771 (0.8850)	loss 0.5360 (0.5019)	grad_norm 1.8318 (2.7900)	mem 20675MB
[2025-04-03 02:50:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][234/573]	eta 0:04:59 lr 0.000533	time 0.8769 (0.8849)	loss 0.5551 (0.5027)	grad_norm 2.9870 (2.7888)	mem 20675MB
[2025-04-03 02:50:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][236/573]	eta 0:04:58 lr 0.000533	time 0.8771 (0.8848)	loss 0.4107 (0.5016)	grad_norm 4.8712 (2.7999)	mem 20675MB
[2025-04-03 02:50:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][238/573]	eta 0:04:56 lr 0.000533	time 0.8771 (0.8848)	loss 0.5588 (0.5022)	grad_norm 1.5355 (2.7897)	mem 20675MB
[2025-04-03 02:50:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][240/573]	eta 0:04:54 lr 0.000533	time 0.8774 (0.8847)	loss 0.5387 (0.5026)	grad_norm 2.2243 (2.7879)	mem 20675MB
[2025-04-03 02:50:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][242/573]	eta 0:04:52 lr 0.000532	time 0.8772 (0.8847)	loss 0.5484 (0.5027)	grad_norm 2.1645 (2.7816)	mem 20675MB
[2025-04-03 02:50:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][244/573]	eta 0:04:51 lr 0.000532	time 0.8774 (0.8846)	loss 0.5331 (0.5030)	grad_norm 3.5596 (2.7831)	mem 20675MB
[2025-04-03 02:50:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][246/573]	eta 0:04:49 lr 0.000532	time 0.8773 (0.8846)	loss 0.5478 (0.5032)	grad_norm 1.9125 (2.7794)	mem 20675MB
[2025-04-03 02:50:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][248/573]	eta 0:04:47 lr 0.000532	time 0.8770 (0.8845)	loss 0.3613 (0.5032)	grad_norm 2.7678 (2.7775)	mem 20675MB
[2025-04-03 02:50:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][250/573]	eta 0:04:45 lr 0.000531	time 0.8772 (0.8845)	loss 0.5568 (0.5037)	grad_norm 2.5072 (2.7730)	mem 20675MB
[2025-04-03 02:50:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][252/573]	eta 0:04:43 lr 0.000531	time 0.8772 (0.8844)	loss 0.5600 (0.5042)	grad_norm 2.7697 (2.7705)	mem 20675MB
[2025-04-03 02:50:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][254/573]	eta 0:04:42 lr 0.000531	time 0.8772 (0.8844)	loss 0.5126 (0.5045)	grad_norm 2.2858 (2.7637)	mem 20675MB
[2025-04-03 02:50:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][256/573]	eta 0:04:40 lr 0.000531	time 0.8788 (0.8843)	loss 0.5850 (0.5046)	grad_norm 3.1546 (2.7633)	mem 20675MB
[2025-04-03 02:50:42 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][258/573]	eta 0:04:38 lr 0.000531	time 0.8772 (0.8843)	loss 0.5792 (0.5047)	grad_norm 3.7311 (2.7655)	mem 20675MB
[2025-04-03 02:50:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][260/573]	eta 0:04:36 lr 0.000530	time 0.8773 (0.8842)	loss 0.4900 (0.5045)	grad_norm 3.0713 (2.7646)	mem 20675MB
[2025-04-03 02:50:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][262/573]	eta 0:04:34 lr 0.000530	time 0.8773 (0.8842)	loss 0.5564 (0.5050)	grad_norm 2.7032 (2.7621)	mem 20675MB
[2025-04-03 02:50:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][264/573]	eta 0:04:33 lr 0.000530	time 0.8774 (0.8841)	loss 0.4640 (0.5048)	grad_norm 2.1921 (2.7576)	mem 20675MB
[2025-04-03 02:50:49 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][266/573]	eta 0:04:31 lr 0.000530	time 0.8771 (0.8841)	loss 0.5538 (0.5044)	grad_norm 3.2226 (2.7649)	mem 20675MB
[2025-04-03 02:50:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][268/573]	eta 0:04:29 lr 0.000529	time 0.8773 (0.8840)	loss 0.5724 (0.5049)	grad_norm 2.4469 (2.7633)	mem 20675MB
[2025-04-03 02:50:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][270/573]	eta 0:04:27 lr 0.000529	time 0.8770 (0.8840)	loss 0.4680 (0.5049)	grad_norm 3.4655 (2.7714)	mem 20675MB
[2025-04-03 02:50:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][272/573]	eta 0:04:26 lr 0.000529	time 0.8774 (0.8839)	loss 0.5540 (0.5051)	grad_norm 2.1220 (2.7714)	mem 20675MB
[2025-04-03 02:50:56 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][274/573]	eta 0:04:24 lr 0.000529	time 0.8772 (0.8839)	loss 0.5221 (0.5048)	grad_norm 3.5134 (2.7756)	mem 20675MB
[2025-04-03 02:50:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][276/573]	eta 0:04:22 lr 0.000529	time 0.8771 (0.8839)	loss 0.4505 (0.5045)	grad_norm 3.1484 (2.7801)	mem 20675MB
[2025-04-03 02:51:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][278/573]	eta 0:04:20 lr 0.000528	time 0.8774 (0.8838)	loss 0.6068 (0.5052)	grad_norm 2.5017 (2.7782)	mem 20675MB
[2025-04-03 02:51:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][280/573]	eta 0:04:18 lr 0.000528	time 0.8775 (0.8838)	loss 0.4947 (0.5055)	grad_norm 1.7661 (2.7729)	mem 20675MB
[2025-04-03 02:51:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][282/573]	eta 0:04:17 lr 0.000528	time 0.8789 (0.8838)	loss 0.4811 (0.5051)	grad_norm 2.8353 (2.7734)	mem 20675MB
[2025-04-03 02:51:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][284/573]	eta 0:04:15 lr 0.000528	time 0.8774 (0.8837)	loss 0.4516 (0.5047)	grad_norm 2.9271 (2.7794)	mem 20675MB
[2025-04-03 02:51:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][286/573]	eta 0:04:13 lr 0.000527	time 0.8771 (0.8837)	loss 0.4599 (0.5044)	grad_norm 2.5047 (2.7764)	mem 20675MB
[2025-04-03 02:51:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][288/573]	eta 0:04:11 lr 0.000527	time 0.8774 (0.8836)	loss 0.5179 (0.5046)	grad_norm 2.9190 (2.7757)	mem 20675MB
[2025-04-03 02:51:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][290/573]	eta 0:04:10 lr 0.000527	time 0.8774 (0.8836)	loss 0.5086 (0.5046)	grad_norm 1.9961 (2.7707)	mem 20675MB
[2025-04-03 02:51:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][292/573]	eta 0:04:08 lr 0.000527	time 0.8774 (0.8836)	loss 0.4144 (0.5044)	grad_norm 2.5418 (2.7670)	mem 20675MB
[2025-04-03 02:51:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][294/573]	eta 0:04:06 lr 0.000527	time 0.8773 (0.8835)	loss 0.5207 (0.5046)	grad_norm 3.0759 (2.7654)	mem 20675MB
[2025-04-03 02:51:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][296/573]	eta 0:04:04 lr 0.000526	time 0.8772 (0.8835)	loss 0.5460 (0.5048)	grad_norm 3.4121 (2.7666)	mem 20675MB
[2025-04-03 02:51:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][298/573]	eta 0:04:02 lr 0.000526	time 0.8773 (0.8834)	loss 0.5957 (0.5054)	grad_norm 2.6639 (2.7645)	mem 20675MB
[2025-04-03 02:51:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][300/573]	eta 0:04:01 lr 0.000526	time 0.8771 (0.8834)	loss 0.5534 (0.5051)	grad_norm 3.8850 (2.7694)	mem 20675MB
[2025-04-03 02:51:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][302/573]	eta 0:03:59 lr 0.000526	time 0.8771 (0.8834)	loss 0.5472 (0.5055)	grad_norm 2.5173 (2.7669)	mem 20675MB
[2025-04-03 02:51:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][304/573]	eta 0:03:57 lr 0.000525	time 0.8773 (0.8833)	loss 0.4770 (0.5053)	grad_norm 3.2063 (2.7682)	mem 20675MB
[2025-04-03 02:51:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][306/573]	eta 0:03:55 lr 0.000525	time 0.8773 (0.8833)	loss 0.5816 (0.5058)	grad_norm 4.0932 (2.7690)	mem 20675MB
[2025-04-03 02:51:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][308/573]	eta 0:03:54 lr 0.000525	time 0.8777 (0.8833)	loss 0.5955 (0.5063)	grad_norm 2.9324 (2.7717)	mem 20675MB
[2025-04-03 02:51:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][310/573]	eta 0:03:52 lr 0.000525	time 0.8772 (0.8832)	loss 0.6057 (0.5062)	grad_norm 2.1721 (2.7719)	mem 20675MB
[2025-04-03 02:51:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][312/573]	eta 0:03:50 lr 0.000524	time 0.8771 (0.8832)	loss 0.5275 (0.5065)	grad_norm 2.9930 (2.7715)	mem 20675MB
[2025-04-03 02:51:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][314/573]	eta 0:03:48 lr 0.000524	time 0.8773 (0.8832)	loss 0.5903 (0.5065)	grad_norm 2.4242 (2.7731)	mem 20675MB
[2025-04-03 02:51:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][316/573]	eta 0:03:46 lr 0.000524	time 0.8775 (0.8831)	loss 0.6184 (0.5070)	grad_norm 2.3513 (2.7695)	mem 20675MB
[2025-04-03 02:51:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][318/573]	eta 0:03:45 lr 0.000524	time 0.8773 (0.8831)	loss 0.5357 (0.5071)	grad_norm 2.3501 (2.7678)	mem 20675MB
[2025-04-03 02:51:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][320/573]	eta 0:03:43 lr 0.000524	time 0.8774 (0.8831)	loss 0.6180 (0.5075)	grad_norm 1.9430 (2.7657)	mem 20675MB
[2025-04-03 02:51:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][322/573]	eta 0:03:41 lr 0.000523	time 0.8771 (0.8830)	loss 0.3336 (0.5068)	grad_norm 3.7821 (2.7712)	mem 20675MB
[2025-04-03 02:51:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][324/573]	eta 0:03:39 lr 0.000523	time 0.8773 (0.8830)	loss 0.5036 (0.5069)	grad_norm 2.2612 (2.7675)	mem 20675MB
[2025-04-03 02:51:42 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][326/573]	eta 0:03:38 lr 0.000523	time 0.8773 (0.8830)	loss 0.5760 (0.5069)	grad_norm 2.7884 (2.7667)	mem 20675MB
[2025-04-03 02:51:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][328/573]	eta 0:03:36 lr 0.000523	time 0.8772 (0.8830)	loss 0.4664 (0.5070)	grad_norm 2.9736 (2.7702)	mem 20675MB
[2025-04-03 02:51:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][330/573]	eta 0:03:34 lr 0.000522	time 0.8773 (0.8829)	loss 0.4052 (0.5065)	grad_norm 3.6855 (2.7745)	mem 20675MB
[2025-04-03 02:51:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][332/573]	eta 0:03:32 lr 0.000522	time 0.8774 (0.8829)	loss 0.4957 (0.5064)	grad_norm 3.3592 (2.7767)	mem 20675MB
[2025-04-03 02:51:49 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][334/573]	eta 0:03:31 lr 0.000522	time 0.8772 (0.8829)	loss 0.3694 (0.5057)	grad_norm 2.8551 (2.7799)	mem 20675MB
[2025-04-03 02:51:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][336/573]	eta 0:03:29 lr 0.000522	time 0.8772 (0.8828)	loss 0.3984 (0.5051)	grad_norm 4.1126 (2.7856)	mem 20675MB
[2025-04-03 02:51:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][338/573]	eta 0:03:27 lr 0.000522	time 0.8771 (0.8828)	loss 0.4482 (0.5044)	grad_norm 6.5358 (2.8004)	mem 20675MB
[2025-04-03 02:51:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][340/573]	eta 0:03:25 lr 0.000521	time 0.8771 (0.8828)	loss 0.5424 (0.5044)	grad_norm 2.2713 (2.8052)	mem 20675MB
[2025-04-03 02:51:56 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][342/573]	eta 0:03:23 lr 0.000521	time 0.8774 (0.8827)	loss 0.4633 (0.5039)	grad_norm 3.0122 (2.8053)	mem 20675MB
[2025-04-03 02:51:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][344/573]	eta 0:03:22 lr 0.000521	time 0.8772 (0.8827)	loss 0.3418 (0.5036)	grad_norm 5.9092 (2.8152)	mem 20675MB
[2025-04-03 02:51:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][346/573]	eta 0:03:20 lr 0.000521	time 0.8771 (0.8827)	loss 0.3100 (0.5030)	grad_norm 4.1629 (2.8213)	mem 20675MB
[2025-04-03 02:52:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][348/573]	eta 0:03:18 lr 0.000520	time 0.8775 (0.8827)	loss 0.5546 (0.5031)	grad_norm 2.3048 (2.8220)	mem 20675MB
[2025-04-03 02:52:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][350/573]	eta 0:03:16 lr 0.000520	time 0.8772 (0.8826)	loss 0.6054 (0.5033)	grad_norm 4.7755 (2.8297)	mem 20675MB
[2025-04-03 02:52:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][352/573]	eta 0:03:15 lr 0.000520	time 0.8773 (0.8826)	loss 0.5515 (0.5036)	grad_norm 2.2150 (2.8365)	mem 20675MB
[2025-04-03 02:52:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][354/573]	eta 0:03:13 lr 0.000520	time 0.8773 (0.8826)	loss 0.4807 (0.5037)	grad_norm 2.8807 (2.8344)	mem 20675MB
[2025-04-03 02:52:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][356/573]	eta 0:03:11 lr 0.000520	time 0.8771 (0.8826)	loss 0.5568 (0.5035)	grad_norm 1.8276 (2.8368)	mem 20675MB
[2025-04-03 02:52:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][358/573]	eta 0:03:09 lr 0.000519	time 0.8771 (0.8825)	loss 0.5110 (0.5036)	grad_norm 2.9554 (2.8355)	mem 20675MB
[2025-04-03 02:52:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][360/573]	eta 0:03:07 lr 0.000519	time 0.8771 (0.8825)	loss 0.5313 (0.5039)	grad_norm 2.3520 (2.8353)	mem 20675MB
[2025-04-03 02:52:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][362/573]	eta 0:03:06 lr 0.000519	time 0.8770 (0.8825)	loss 0.5151 (0.5037)	grad_norm 1.7131 (2.8331)	mem 20675MB
[2025-04-03 02:52:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][364/573]	eta 0:03:04 lr 0.000519	time 0.8770 (0.8825)	loss 0.5670 (0.5039)	grad_norm 2.4199 (2.8311)	mem 20675MB
[2025-04-03 02:52:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][366/573]	eta 0:03:02 lr 0.000518	time 0.8772 (0.8824)	loss 0.5555 (0.5044)	grad_norm 2.8406 (2.8308)	mem 20675MB
[2025-04-03 02:52:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][368/573]	eta 0:03:00 lr 0.000518	time 0.8771 (0.8824)	loss 0.5464 (0.5046)	grad_norm 2.1688 (2.8263)	mem 20675MB
[2025-04-03 02:52:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][370/573]	eta 0:02:59 lr 0.000518	time 0.8773 (0.8824)	loss 0.5184 (0.5046)	grad_norm 2.0542 (2.8227)	mem 20675MB
[2025-04-03 02:52:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][372/573]	eta 0:02:57 lr 0.000518	time 0.8771 (0.8824)	loss 0.5549 (0.5048)	grad_norm 1.7366 (2.8186)	mem 20675MB
[2025-04-03 02:52:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][374/573]	eta 0:02:55 lr 0.000518	time 0.8772 (0.8823)	loss 0.6206 (0.5048)	grad_norm 1.8386 (2.8155)	mem 20675MB
[2025-04-03 02:52:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][376/573]	eta 0:02:53 lr 0.000517	time 0.8771 (0.8823)	loss 0.5256 (0.5048)	grad_norm 2.8492 (2.8164)	mem 20675MB
[2025-04-03 02:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][378/573]	eta 0:02:52 lr 0.000517	time 0.8773 (0.8823)	loss 0.4936 (0.5045)	grad_norm 3.4730 (2.8204)	mem 20675MB
[2025-04-03 02:52:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][380/573]	eta 0:02:50 lr 0.000517	time 0.8770 (0.8823)	loss 0.5313 (0.5045)	grad_norm 3.1553 (2.8200)	mem 20675MB
[2025-04-03 02:52:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][382/573]	eta 0:02:48 lr 0.000517	time 0.8771 (0.8822)	loss 0.5658 (0.5043)	grad_norm 2.3910 (2.8194)	mem 20675MB
[2025-04-03 02:52:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][384/573]	eta 0:02:46 lr 0.000516	time 0.8771 (0.8822)	loss 0.5417 (0.5046)	grad_norm 2.7836 (2.8192)	mem 20675MB
[2025-04-03 02:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][386/573]	eta 0:02:44 lr 0.000516	time 0.8773 (0.8822)	loss 0.5994 (0.5044)	grad_norm 2.8444 (2.8215)	mem 20675MB
[2025-04-03 02:52:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][388/573]	eta 0:02:43 lr 0.000516	time 0.8774 (0.8822)	loss 0.4074 (0.5043)	grad_norm 2.8919 (2.8212)	mem 20675MB
[2025-04-03 02:52:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][390/573]	eta 0:02:41 lr 0.000516	time 0.8772 (0.8822)	loss 0.6132 (0.5048)	grad_norm 4.1479 (2.8235)	mem 20675MB
[2025-04-03 02:52:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][392/573]	eta 0:02:39 lr 0.000515	time 0.8771 (0.8821)	loss 0.3305 (0.5043)	grad_norm 2.4725 (2.8228)	mem 20675MB
[2025-04-03 02:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][394/573]	eta 0:02:37 lr 0.000515	time 0.8772 (0.8821)	loss 0.4662 (0.5043)	grad_norm 3.0676 (2.8276)	mem 20675MB
[2025-04-03 02:52:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][396/573]	eta 0:02:36 lr 0.000515	time 0.8773 (0.8821)	loss 0.5613 (0.5045)	grad_norm 2.6378 (2.8265)	mem 20675MB
[2025-04-03 02:52:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][398/573]	eta 0:02:34 lr 0.000515	time 0.8770 (0.8821)	loss 0.5458 (0.5047)	grad_norm 2.1276 (2.8224)	mem 20675MB
[2025-04-03 02:52:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][400/573]	eta 0:02:32 lr 0.000515	time 0.8773 (0.8821)	loss 0.5694 (0.5047)	grad_norm 2.7862 (2.8222)	mem 20675MB
[2025-04-03 02:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][402/573]	eta 0:02:30 lr 0.000514	time 0.8772 (0.8820)	loss 0.4459 (0.5048)	grad_norm 2.7705 (2.8231)	mem 20675MB
[2025-04-03 02:52:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][404/573]	eta 0:02:29 lr 0.000514	time 0.8770 (0.8820)	loss 0.4530 (0.5043)	grad_norm 3.5989 (2.8243)	mem 20675MB
[2025-04-03 02:52:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][406/573]	eta 0:02:27 lr 0.000514	time 0.8772 (0.8820)	loss 0.5510 (0.5047)	grad_norm 1.8731 (2.8217)	mem 20675MB
[2025-04-03 02:52:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][408/573]	eta 0:02:25 lr 0.000514	time 0.8773 (0.8820)	loss 0.4819 (0.5048)	grad_norm 3.1030 (2.8205)	mem 20675MB
[2025-04-03 02:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][410/573]	eta 0:02:23 lr 0.000513	time 0.8770 (0.8820)	loss 0.4575 (0.5042)	grad_norm 2.9498 (2.8225)	mem 20675MB
[2025-04-03 02:52:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][412/573]	eta 0:02:21 lr 0.000513	time 0.8771 (0.8819)	loss 0.4471 (0.5039)	grad_norm 2.8011 (2.8195)	mem 20675MB
[2025-04-03 02:52:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][414/573]	eta 0:02:20 lr 0.000513	time 0.8773 (0.8819)	loss 0.5615 (0.5040)	grad_norm 2.2849 (2.8159)	mem 20675MB
[2025-04-03 02:53:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][416/573]	eta 0:02:18 lr 0.000513	time 0.8770 (0.8819)	loss 0.3494 (0.5037)	grad_norm 3.5614 (2.8211)	mem 20675MB
[2025-04-03 02:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][418/573]	eta 0:02:16 lr 0.000513	time 0.8771 (0.8819)	loss 0.5653 (0.5040)	grad_norm 4.3446 (2.8256)	mem 20675MB
[2025-04-03 02:53:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][420/573]	eta 0:02:14 lr 0.000512	time 0.8771 (0.8819)	loss 0.5451 (0.5039)	grad_norm 2.7893 (2.8254)	mem 20675MB
[2025-04-03 02:53:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][422/573]	eta 0:02:13 lr 0.000512	time 0.8773 (0.8818)	loss 0.5336 (0.5037)	grad_norm 3.0655 (2.8282)	mem 20675MB
[2025-04-03 02:53:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][424/573]	eta 0:02:11 lr 0.000512	time 0.8771 (0.8818)	loss 0.5406 (0.5035)	grad_norm 2.2465 (2.8290)	mem 20675MB
[2025-04-03 02:53:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][426/573]	eta 0:02:09 lr 0.000512	time 0.8770 (0.8818)	loss 0.3824 (0.5034)	grad_norm 6.5436 (2.8407)	mem 20675MB
[2025-04-03 02:53:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][428/573]	eta 0:02:07 lr 0.000511	time 0.8774 (0.8818)	loss 0.3698 (0.5032)	grad_norm 2.4937 (2.8403)	mem 20675MB
[2025-04-03 02:53:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][430/573]	eta 0:02:06 lr 0.000511	time 0.8774 (0.8818)	loss 0.5731 (0.5033)	grad_norm 1.7114 (2.8413)	mem 20675MB
[2025-04-03 02:53:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][432/573]	eta 0:02:04 lr 0.000511	time 0.8773 (0.8818)	loss 0.5924 (0.5035)	grad_norm 2.6868 (2.8410)	mem 20675MB
[2025-04-03 02:53:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][434/573]	eta 0:02:02 lr 0.000511	time 0.8774 (0.8817)	loss 0.4726 (0.5032)	grad_norm 2.2158 (2.8394)	mem 20675MB
[2025-04-03 02:53:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][436/573]	eta 0:02:00 lr 0.000511	time 0.8771 (0.8817)	loss 0.5681 (0.5036)	grad_norm 2.0513 (2.8358)	mem 20675MB
[2025-04-03 02:53:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][438/573]	eta 0:01:59 lr 0.000510	time 0.8774 (0.8817)	loss 0.5014 (0.5034)	grad_norm 2.6588 (2.8332)	mem 20675MB
[2025-04-03 02:53:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][440/573]	eta 0:01:57 lr 0.000510	time 0.8772 (0.8817)	loss 0.5408 (0.5035)	grad_norm 1.4689 (2.8290)	mem 20675MB
[2025-04-03 02:53:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][442/573]	eta 0:01:55 lr 0.000510	time 0.8773 (0.8817)	loss 0.6556 (0.5039)	grad_norm 3.1987 (2.8283)	mem 20675MB
[2025-04-03 02:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][444/573]	eta 0:01:53 lr 0.000510	time 0.8773 (0.8817)	loss 0.4502 (0.5038)	grad_norm 3.2067 (2.8274)	mem 20675MB
[2025-04-03 02:53:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][446/573]	eta 0:01:51 lr 0.000509	time 0.8769 (0.8816)	loss 0.4636 (0.5039)	grad_norm 2.8056 (2.8275)	mem 20675MB
[2025-04-03 02:53:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][448/573]	eta 0:01:50 lr 0.000509	time 0.8773 (0.8816)	loss 0.3381 (0.5033)	grad_norm 3.0287 (2.8305)	mem 20675MB
[2025-04-03 02:53:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][450/573]	eta 0:01:48 lr 0.000509	time 0.8773 (0.8816)	loss 0.4057 (0.5031)	grad_norm 1.7854 (2.8280)	mem 20675MB
[2025-04-03 02:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][452/573]	eta 0:01:46 lr 0.000509	time 0.8772 (0.8816)	loss 0.5568 (0.5035)	grad_norm 2.3180 (2.8262)	mem 20675MB
[2025-04-03 02:53:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][454/573]	eta 0:01:44 lr 0.000509	time 0.8771 (0.8816)	loss 0.4991 (0.5036)	grad_norm 2.9077 (2.8245)	mem 20675MB
[2025-04-03 02:53:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][456/573]	eta 0:01:43 lr 0.000508	time 0.8771 (0.8816)	loss 0.4136 (0.5035)	grad_norm 3.7105 (2.8266)	mem 20675MB
[2025-04-03 02:53:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][458/573]	eta 0:01:41 lr 0.000508	time 0.8773 (0.8815)	loss 0.6115 (0.5039)	grad_norm 3.3080 (2.8257)	mem 20675MB
[2025-04-03 02:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][460/573]	eta 0:01:39 lr 0.000508	time 0.8774 (0.8815)	loss 0.6045 (0.5037)	grad_norm 3.1370 (2.8298)	mem 20675MB
[2025-04-03 02:53:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][462/573]	eta 0:01:37 lr 0.000508	time 0.8771 (0.8815)	loss 0.5240 (0.5039)	grad_norm 1.9498 (2.8255)	mem 20675MB
[2025-04-03 02:53:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][464/573]	eta 0:01:36 lr 0.000507	time 0.8771 (0.8815)	loss 0.4173 (0.5038)	grad_norm 1.9941 (2.8219)	mem 20675MB
[2025-04-03 02:53:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][466/573]	eta 0:01:34 lr 0.000507	time 0.8772 (0.8815)	loss 0.4166 (0.5037)	grad_norm 3.0617 (2.8230)	mem 20675MB
[2025-04-03 02:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][468/573]	eta 0:01:32 lr 0.000507	time 0.8775 (0.8815)	loss 0.3441 (0.5033)	grad_norm 3.3458 (2.8233)	mem 20675MB
[2025-04-03 02:53:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][470/573]	eta 0:01:30 lr 0.000507	time 0.8773 (0.8815)	loss 0.4979 (0.5033)	grad_norm 2.0533 (2.8205)	mem 20675MB
[2025-04-03 02:53:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][472/573]	eta 0:01:29 lr 0.000507	time 0.8774 (0.8814)	loss 0.4091 (0.5028)	grad_norm 3.0785 (2.8211)	mem 20675MB
[2025-04-03 02:53:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][474/573]	eta 0:01:27 lr 0.000506	time 0.8771 (0.8814)	loss 0.5270 (0.5026)	grad_norm 2.7196 (2.8226)	mem 20675MB
[2025-04-03 02:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][476/573]	eta 0:01:25 lr 0.000506	time 0.8771 (0.8814)	loss 0.5003 (0.5026)	grad_norm 2.3549 (2.8221)	mem 20675MB
[2025-04-03 02:53:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][478/573]	eta 0:01:23 lr 0.000506	time 0.8770 (0.8814)	loss 0.4083 (0.5024)	grad_norm 3.2165 (2.8238)	mem 20675MB
[2025-04-03 02:53:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][480/573]	eta 0:01:21 lr 0.000506	time 0.8774 (0.8814)	loss 0.5792 (0.5028)	grad_norm 2.8524 (2.8228)	mem 20675MB
[2025-04-03 02:53:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][482/573]	eta 0:01:20 lr 0.000505	time 0.8773 (0.8814)	loss 0.5846 (0.5032)	grad_norm 2.4857 (2.8239)	mem 20675MB
[2025-04-03 02:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][484/573]	eta 0:01:18 lr 0.000505	time 0.8773 (0.8814)	loss 0.5192 (0.5035)	grad_norm 1.9589 (2.8216)	mem 20675MB
[2025-04-03 02:54:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][486/573]	eta 0:01:16 lr 0.000505	time 0.8772 (0.8813)	loss 0.5226 (0.5036)	grad_norm 3.2683 (2.8210)	mem 20675MB
[2025-04-03 02:54:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][488/573]	eta 0:01:14 lr 0.000505	time 0.8773 (0.8813)	loss 0.4790 (0.5035)	grad_norm 3.4146 (2.8225)	mem 20675MB
[2025-04-03 02:54:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][490/573]	eta 0:01:13 lr 0.000504	time 0.8772 (0.8813)	loss 0.5331 (0.5033)	grad_norm 3.8160 (2.8234)	mem 20675MB
[2025-04-03 02:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][492/573]	eta 0:01:11 lr 0.000504	time 0.8778 (0.8813)	loss 0.5966 (0.5035)	grad_norm 2.6127 (2.8207)	mem 20675MB
[2025-04-03 02:54:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][494/573]	eta 0:01:09 lr 0.000504	time 0.8770 (0.8813)	loss 0.5450 (0.5037)	grad_norm 2.1649 (2.8177)	mem 20675MB
[2025-04-03 02:54:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][496/573]	eta 0:01:07 lr 0.000504	time 0.8774 (0.8813)	loss 0.5474 (0.5038)	grad_norm 2.2770 (2.8158)	mem 20675MB
[2025-04-03 02:54:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][498/573]	eta 0:01:06 lr 0.000504	time 0.8771 (0.8813)	loss 0.5839 (0.5040)	grad_norm 2.7346 (2.8140)	mem 20675MB
[2025-04-03 02:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][500/573]	eta 0:01:04 lr 0.000503	time 0.8774 (0.8812)	loss 0.5915 (0.5042)	grad_norm 3.6573 (2.8132)	mem 20675MB
[2025-04-03 02:54:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][502/573]	eta 0:01:02 lr 0.000503	time 0.8771 (0.8812)	loss 0.5180 (0.5044)	grad_norm 2.4322 (2.8110)	mem 20675MB
[2025-04-03 02:54:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][504/573]	eta 0:01:00 lr 0.000503	time 0.8773 (0.8812)	loss 0.4808 (0.5044)	grad_norm 2.8560 (2.8095)	mem 20675MB
[2025-04-03 02:54:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][506/573]	eta 0:00:59 lr 0.000503	time 0.8774 (0.8812)	loss 0.5037 (0.5045)	grad_norm 1.7923 (2.8065)	mem 20675MB
[2025-04-03 02:54:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][508/573]	eta 0:00:57 lr 0.000502	time 0.8773 (0.8812)	loss 0.5308 (0.5046)	grad_norm 3.1104 (2.8062)	mem 20675MB
[2025-04-03 02:54:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][510/573]	eta 0:00:55 lr 0.000502	time 0.8775 (0.8812)	loss 0.5128 (0.5046)	grad_norm 2.7422 (2.8063)	mem 20675MB
[2025-04-03 02:54:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][512/573]	eta 0:00:53 lr 0.000502	time 0.8772 (0.8812)	loss 0.4280 (0.5046)	grad_norm 2.4998 (2.8049)	mem 20675MB
[2025-04-03 02:54:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][514/573]	eta 0:00:51 lr 0.000502	time 0.8776 (0.8812)	loss 0.5497 (0.5048)	grad_norm 2.2387 (2.8027)	mem 20675MB
[2025-04-03 02:54:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][516/573]	eta 0:00:50 lr 0.000502	time 0.8775 (0.8811)	loss 0.4698 (0.5046)	grad_norm 2.2563 (2.8022)	mem 20675MB
[2025-04-03 02:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][518/573]	eta 0:00:48 lr 0.000501	time 0.8773 (0.8811)	loss 0.6137 (0.5047)	grad_norm 2.5521 (2.8046)	mem 20675MB
[2025-04-03 02:54:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][520/573]	eta 0:00:46 lr 0.000501	time 0.8771 (0.8811)	loss 0.5705 (0.5050)	grad_norm 2.6684 (2.8045)	mem 20675MB
[2025-04-03 02:54:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][522/573]	eta 0:00:44 lr 0.000501	time 0.8773 (0.8811)	loss 0.4972 (0.5051)	grad_norm 3.2661 (2.8045)	mem 20675MB
[2025-04-03 02:54:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][524/573]	eta 0:00:43 lr 0.000501	time 0.8772 (0.8811)	loss 0.5430 (0.5051)	grad_norm 2.9478 (2.8078)	mem 20675MB
[2025-04-03 02:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][526/573]	eta 0:00:41 lr 0.000500	time 0.8773 (0.8811)	loss 0.4818 (0.5050)	grad_norm 2.1304 (2.8058)	mem 20675MB
[2025-04-03 02:54:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][528/573]	eta 0:00:39 lr 0.000500	time 0.8772 (0.8811)	loss 0.4653 (0.5049)	grad_norm 2.2310 (2.8037)	mem 20675MB
[2025-04-03 02:54:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][530/573]	eta 0:00:37 lr 0.000500	time 0.8774 (0.8811)	loss 0.5942 (0.5052)	grad_norm 2.2769 (2.8027)	mem 20675MB
[2025-04-03 02:54:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][532/573]	eta 0:00:36 lr 0.000500	time 0.8770 (0.8811)	loss 0.4085 (0.5050)	grad_norm 2.0970 (2.7998)	mem 20675MB
[2025-04-03 02:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][534/573]	eta 0:00:34 lr 0.000500	time 0.8770 (0.8810)	loss 0.5160 (0.5051)	grad_norm 2.1806 (2.7967)	mem 20675MB
[2025-04-03 02:54:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][536/573]	eta 0:00:32 lr 0.000499	time 0.8773 (0.8810)	loss 0.4725 (0.5052)	grad_norm 3.2912 (2.7986)	mem 20675MB
[2025-04-03 02:54:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][538/573]	eta 0:00:30 lr 0.000499	time 0.8775 (0.8810)	loss 0.5473 (0.5053)	grad_norm 2.6521 (2.7979)	mem 20675MB
[2025-04-03 02:54:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][540/573]	eta 0:00:29 lr 0.000499	time 0.8772 (0.8810)	loss 0.5706 (0.5056)	grad_norm 2.4298 (2.7970)	mem 20675MB
[2025-04-03 02:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][542/573]	eta 0:00:27 lr 0.000499	time 0.8771 (0.8810)	loss 0.5503 (0.5057)	grad_norm 1.5350 (2.7936)	mem 20675MB
[2025-04-03 02:54:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][544/573]	eta 0:00:25 lr 0.000498	time 0.8772 (0.8810)	loss 0.5194 (0.5058)	grad_norm 2.0624 (2.7910)	mem 20675MB
[2025-04-03 02:54:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][546/573]	eta 0:00:23 lr 0.000498	time 0.8786 (0.8810)	loss 0.5698 (0.5058)	grad_norm 2.1421 (2.7904)	mem 20675MB
[2025-04-03 02:54:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][548/573]	eta 0:00:22 lr 0.000498	time 0.8774 (0.8810)	loss 0.4642 (0.5057)	grad_norm 3.8143 (2.7911)	mem 20675MB
[2025-04-03 02:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][550/573]	eta 0:00:20 lr 0.000498	time 0.8771 (0.8810)	loss 0.4980 (0.5058)	grad_norm 1.8058 (2.7897)	mem 20675MB
[2025-04-03 02:55:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][552/573]	eta 0:00:18 lr 0.000498	time 0.8772 (0.8809)	loss 0.4214 (0.5056)	grad_norm 2.9880 (2.7910)	mem 20675MB
[2025-04-03 02:55:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][554/573]	eta 0:00:16 lr 0.000497	time 0.8771 (0.8809)	loss 0.5856 (0.5054)	grad_norm 2.0785 (2.7902)	mem 20675MB
[2025-04-03 02:55:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][556/573]	eta 0:00:14 lr 0.000497	time 0.8773 (0.8809)	loss 0.4826 (0.5055)	grad_norm 3.5597 (2.7900)	mem 20675MB
[2025-04-03 02:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][558/573]	eta 0:00:13 lr 0.000497	time 0.8771 (0.8809)	loss 0.5104 (0.5055)	grad_norm 2.4493 (2.7896)	mem 20675MB
[2025-04-03 02:55:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][560/573]	eta 0:00:11 lr 0.000497	time 0.8780 (0.8809)	loss 0.5654 (0.5056)	grad_norm 2.8470 (2.7901)	mem 20675MB
[2025-04-03 02:55:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][562/573]	eta 0:00:09 lr 0.000496	time 0.8771 (0.8809)	loss 0.5550 (0.5055)	grad_norm 1.8113 (2.7880)	mem 20675MB
[2025-04-03 02:55:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][564/573]	eta 0:00:07 lr 0.000496	time 0.8770 (0.8809)	loss 0.5702 (0.5056)	grad_norm 2.5335 (2.7870)	mem 20675MB
[2025-04-03 02:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][566/573]	eta 0:00:06 lr 0.000496	time 0.8771 (0.8809)	loss 0.4014 (0.5054)	grad_norm 3.2127 (2.7901)	mem 20675MB
[2025-04-03 02:55:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][568/573]	eta 0:00:04 lr 0.000496	time 0.8769 (0.8809)	loss 0.4104 (0.5053)	grad_norm 2.8455 (2.7879)	mem 20675MB
[2025-04-03 02:55:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][570/573]	eta 0:00:02 lr 0.000496	time 0.8768 (0.8808)	loss 0.4869 (0.5053)	grad_norm 2.8392 (2.7871)	mem 20675MB
[2025-04-03 02:55:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][572/573]	eta 0:00:00 lr 0.000495	time 0.8768 (0.8808)	loss 0.5439 (0.5055)	grad_norm 2.1820 (2.7861)	mem 20675MB
[2025-04-03 02:55:18 simmim_finetune] (main_finetune.py 260): INFO EPOCH 16 training takes 0:08:24
[2025-04-03 02:55:20 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.827 (1.827)	Loss 0.5574 (0.5574)	Acc@1 67.969 (67.969)	Mem 20675MB
[2025-04-03 02:55:20 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.798)	Loss 0.4991 (0.5076)	Acc@1 76.562 (74.219)	Mem 20675MB
[2025-04-03 02:55:21 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.592)	Loss 0.5236 (0.5023)	Acc@1 73.438 (74.844)	Mem 20675MB
[2025-04-03 02:55:21 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.504)	Loss 0.4777 (0.4922)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 02:55:22 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.284 (0.455)	Loss 0.5110 (0.4825)	Acc@1 73.438 (76.823)	Mem 20675MB
[2025-04-03 02:55:22 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.424)	Loss 0.4674 (0.4839)	Acc@1 82.031 (77.131)	Mem 20675MB
[2025-04-03 02:55:23 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.403)	Loss 0.4694 (0.4806)	Acc@1 77.344 (77.284)	Mem 20675MB
[2025-04-03 02:55:24 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.387)	Loss 0.4262 (0.4749)	Acc@1 81.250 (77.604)	Mem 20675MB
[2025-04-03 02:55:24 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.571
[2025-04-03 02:55:24 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.6%
[2025-04-03 02:55:24 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.57%
[2025-04-03 02:55:24 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.9817006783268215e-06, 1.9817006783268215e-06, 2.967480558308099e-06, 2.967480558308099e-06, 4.4840649890485264e-06, 4.4840649890485264e-06, 6.817271805572261e-06, 6.817271805572261e-06, 1.040682075407031e-05, 1.040682075407031e-05, 1.592920375175962e-05, 1.592920375175962e-05, 2.442517759435856e-05, 2.442517759435856e-05, 3.749590658297231e-05, 3.749590658297231e-05, 5.760472041160884e-05, 5.760472041160884e-05, 8.85413570710497e-05, 8.85413570710497e-05, 0.00013613618270095864, 0.00013613618270095864, 0.00020935899136235704, 0.00020935899136235704, 0.00032200946622604686, 0.00032200946622604686, 0.0004953178890932621, 0.0004953178890932621]
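The per-group learning rates printed above can be reproduced (approximately) from the config dump: LAYER_DECAY 0.65, MIN_LR 2.5e-7, and a base LR of 0.00125. A minimal sketch follows; the grouping into 14 "layers" (patch embed + 12 blocks + head) and the shared cosine factor of ~0.3961 are assumptions inferred from the logged values, not confirmed by the log itself.

```python
# Layer-wise LR decay sketch: deeper layers get geometrically smaller LRs.
# Assumptions (hypothetical, inferred from the log): 14 layer groups,
# a single cosine factor shared by all groups, and LR floored at MIN_LR.
LAYER_DECAY = 0.65       # TRAIN.LAYER_DECAY from the config dump
MIN_LR = 2.5e-07         # TRAIN.MIN_LR from the config dump
BASE_LR = 0.00125        # TRAIN.BASE_LR from the config dump
NUM_LAYERS = 14          # hypothetical: patch embed + 12 ViT blocks + head

def group_lrs(cosine_factor):
    """Per-group LR: scale each group's base LR by LAYER_DECAY^(depth from top),
    then apply the shared cosine schedule down toward MIN_LR."""
    scales = [LAYER_DECAY ** (NUM_LAYERS - 1 - i) for i in range(NUM_LAYERS)]
    return [MIN_LR + (s * BASE_LR - MIN_LR) * cosine_factor for s in scales]

# Cosine factor chosen to match the logged top-group LR (~0.000495) at epoch 17.
lrs = group_lrs(cosine_factor=0.3961)
```

With this factor, the sketch reproduces both ends of the logged list: the top group comes out near 4.953e-4 and the bottom group near 1.982e-6, and adjacent large groups differ by almost exactly the 0.65 decay ratio.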
[2025-04-03 02:55:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][0/573]	eta 0:21:23 lr 0.000495	time 2.2401 (2.2401)	loss 0.5242 (0.5242)	grad_norm 3.2175 (3.2175)	mem 20675MB
[2025-04-03 02:55:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][2/573]	eta 0:12:40 lr 0.000495	time 0.8771 (1.3321)	loss 0.5072 (0.4958)	grad_norm 3.5904 (3.3390)	mem 20675MB
[2025-04-03 02:55:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][4/573]	eta 0:10:54 lr 0.000495	time 0.8771 (1.1505)	loss 0.4887 (0.4898)	grad_norm 2.6976 (3.1414)	mem 20675MB
[2025-04-03 02:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][6/573]	eta 0:10:08 lr 0.000495	time 0.8773 (1.0727)	loss 0.5032 (0.5022)	grad_norm 2.4922 (2.8182)	mem 20675MB
[2025-04-03 02:55:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][8/573]	eta 0:09:41 lr 0.000494	time 0.8773 (1.0294)	loss 0.5131 (0.5075)	grad_norm 3.1704 (2.8569)	mem 20675MB
[2025-04-03 02:55:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][10/573]	eta 0:09:24 lr 0.000494	time 0.8770 (1.0019)	loss 0.5236 (0.5098)	grad_norm 4.0256 (2.9538)	mem 20675MB
[2025-04-03 02:55:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][12/573]	eta 0:09:11 lr 0.000494	time 0.8787 (0.9829)	loss 0.3819 (0.4874)	grad_norm 2.2303 (2.8852)	mem 20675MB
[2025-04-03 02:55:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][14/573]	eta 0:09:01 lr 0.000494	time 0.8771 (0.9690)	loss 0.5364 (0.4887)	grad_norm 2.6377 (2.8388)	mem 20675MB
[2025-04-03 02:55:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][16/573]	eta 0:08:53 lr 0.000493	time 0.8771 (0.9583)	loss 0.5614 (0.4891)	grad_norm 1.3994 (2.9449)	mem 20675MB
[2025-04-03 02:55:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][18/573]	eta 0:08:47 lr 0.000493	time 0.8770 (0.9498)	loss 0.5202 (0.4884)	grad_norm 1.7801 (2.8868)	mem 20675MB
[2025-04-03 02:55:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][20/573]	eta 0:08:41 lr 0.000493	time 0.8774 (0.9430)	loss 0.6089 (0.4960)	grad_norm 2.7899 (2.8438)	mem 20675MB
[2025-04-03 02:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][22/573]	eta 0:08:36 lr 0.000493	time 0.8774 (0.9374)	loss 0.4790 (0.4968)	grad_norm 2.4577 (2.8054)	mem 20675MB
[2025-04-03 02:55:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][24/573]	eta 0:08:32 lr 0.000493	time 0.8776 (0.9326)	loss 0.3987 (0.4857)	grad_norm 3.0314 (2.8111)	mem 20675MB
[2025-04-03 02:55:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][26/573]	eta 0:08:27 lr 0.000492	time 0.8773 (0.9286)	loss 0.6192 (0.4927)	grad_norm 2.4631 (2.8599)	mem 20675MB
[2025-04-03 02:55:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][28/573]	eta 0:08:24 lr 0.000492	time 0.8773 (0.9251)	loss 0.4477 (0.4876)	grad_norm 2.2350 (2.8551)	mem 20675MB
[2025-04-03 02:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][30/573]	eta 0:08:20 lr 0.000492	time 0.8771 (0.9221)	loss 0.4792 (0.4864)	grad_norm 2.6188 (2.8431)	mem 20675MB
[2025-04-03 02:55:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][32/573]	eta 0:08:17 lr 0.000492	time 0.8774 (0.9194)	loss 0.5445 (0.4899)	grad_norm 2.9068 (2.8207)	mem 20675MB
[2025-04-03 02:55:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][34/573]	eta 0:08:14 lr 0.000491	time 0.8773 (0.9170)	loss 0.5495 (0.4920)	grad_norm 2.0183 (2.7717)	mem 20675MB
[2025-04-03 02:55:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][36/573]	eta 0:08:11 lr 0.000491	time 0.8772 (0.9149)	loss 0.4562 (0.4888)	grad_norm 2.8747 (2.8236)	mem 20675MB
[2025-04-03 02:56:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][38/573]	eta 0:08:08 lr 0.000491	time 0.8774 (0.9130)	loss 0.4265 (0.4867)	grad_norm 2.7844 (2.8349)	mem 20675MB
[2025-04-03 02:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][40/573]	eta 0:08:05 lr 0.000491	time 0.8773 (0.9113)	loss 0.5678 (0.4879)	grad_norm 1.8688 (2.7913)	mem 20675MB
[2025-04-03 02:56:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][42/573]	eta 0:08:03 lr 0.000491	time 0.8773 (0.9098)	loss 0.5714 (0.4905)	grad_norm 1.8527 (2.7590)	mem 20675MB
[2025-04-03 02:56:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][44/573]	eta 0:08:00 lr 0.000490	time 0.8772 (0.9084)	loss 0.5394 (0.4895)	grad_norm 2.2520 (2.7898)	mem 20675MB
[2025-04-03 02:56:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][46/573]	eta 0:07:58 lr 0.000490	time 0.8773 (0.9071)	loss 0.4855 (0.4912)	grad_norm 2.8231 (2.7954)	mem 20675MB
[2025-04-03 02:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][48/573]	eta 0:07:55 lr 0.000490	time 0.8776 (0.9059)	loss 0.4417 (0.4924)	grad_norm 3.7378 (2.7942)	mem 20675MB
[2025-04-03 02:56:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][50/573]	eta 0:07:53 lr 0.000490	time 0.8775 (0.9048)	loss 0.4962 (0.4932)	grad_norm 2.1032 (2.7828)	mem 20675MB
[2025-04-03 02:56:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][52/573]	eta 0:07:50 lr 0.000489	time 0.8774 (0.9038)	loss 0.5648 (0.4948)	grad_norm 3.0786 (2.7995)	mem 20675MB
[2025-04-03 02:56:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][54/573]	eta 0:07:48 lr 0.000489	time 0.8771 (0.9029)	loss 0.4564 (0.4944)	grad_norm 3.7034 (2.8151)	mem 20675MB
[2025-04-03 02:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][56/573]	eta 0:07:46 lr 0.000489	time 0.8771 (0.9020)	loss 0.5306 (0.4956)	grad_norm 2.9384 (2.8192)	mem 20675MB
[2025-04-03 02:56:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][58/573]	eta 0:07:44 lr 0.000489	time 0.8776 (0.9012)	loss 0.5218 (0.4975)	grad_norm 2.2389 (2.8086)	mem 20675MB
[2025-04-03 02:56:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][60/573]	eta 0:07:41 lr 0.000489	time 0.8777 (0.9005)	loss 0.4854 (0.4979)	grad_norm 2.2049 (2.8083)	mem 20675MB
[2025-04-03 02:56:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][62/573]	eta 0:07:39 lr 0.000488	time 0.8775 (0.8998)	loss 0.5996 (0.4975)	grad_norm 2.3703 (2.8023)	mem 20675MB
[2025-04-03 02:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][64/573]	eta 0:07:37 lr 0.000488	time 0.8781 (0.8991)	loss 0.5249 (0.4975)	grad_norm 2.4922 (2.7912)	mem 20675MB
[2025-04-03 02:56:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][66/573]	eta 0:07:35 lr 0.000488	time 0.8771 (0.8985)	loss 0.4604 (0.4974)	grad_norm 1.9593 (2.7638)	mem 20675MB
[2025-04-03 02:56:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][68/573]	eta 0:07:33 lr 0.000488	time 0.8773 (0.8979)	loss 0.3760 (0.4960)	grad_norm 2.5448 (2.7629)	mem 20675MB
[2025-04-03 02:56:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][70/573]	eta 0:07:31 lr 0.000487	time 0.8771 (0.8974)	loss 0.5695 (0.4962)	grad_norm 2.6575 (2.7603)	mem 20675MB
[2025-04-03 02:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][72/573]	eta 0:07:29 lr 0.000487	time 0.8776 (0.8968)	loss 0.5437 (0.4952)	grad_norm 2.0631 (2.7548)	mem 20675MB
[2025-04-03 02:56:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][74/573]	eta 0:07:27 lr 0.000487	time 0.8770 (0.8963)	loss 0.5725 (0.4980)	grad_norm 3.0517 (2.7739)	mem 20675MB
[2025-04-03 02:56:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][76/573]	eta 0:07:25 lr 0.000487	time 0.8774 (0.8959)	loss 0.3671 (0.4960)	grad_norm 3.5764 (2.7870)	mem 20675MB
[2025-04-03 02:56:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][78/573]	eta 0:07:23 lr 0.000487	time 0.8776 (0.8954)	loss 0.5459 (0.4976)	grad_norm 2.1072 (2.7701)	mem 20675MB
[2025-04-03 02:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][80/573]	eta 0:07:21 lr 0.000486	time 0.8775 (0.8950)	loss 0.5198 (0.4976)	grad_norm 3.2407 (2.7741)	mem 20675MB
[2025-04-03 02:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][82/573]	eta 0:07:19 lr 0.000486	time 0.8777 (0.8946)	loss 0.4964 (0.4956)	grad_norm 2.4295 (2.7645)	mem 20675MB
[2025-04-03 02:56:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][84/573]	eta 0:07:17 lr 0.000486	time 0.8775 (0.8942)	loss 0.6001 (0.4973)	grad_norm 2.6025 (2.7498)	mem 20675MB
[2025-04-03 02:56:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][86/573]	eta 0:07:15 lr 0.000486	time 0.8774 (0.8939)	loss 0.5338 (0.4981)	grad_norm 2.1593 (2.7416)	mem 20675MB
[2025-04-03 02:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][88/573]	eta 0:07:13 lr 0.000485	time 0.8772 (0.8935)	loss 0.6021 (0.4979)	grad_norm 2.6529 (2.7504)	mem 20675MB
[2025-04-03 02:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][90/573]	eta 0:07:11 lr 0.000485	time 0.8776 (0.8932)	loss 0.4090 (0.4962)	grad_norm 3.9309 (2.7718)	mem 20675MB
[2025-04-03 02:56:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][92/573]	eta 0:07:09 lr 0.000485	time 0.8775 (0.8929)	loss 0.5805 (0.4978)	grad_norm 1.7379 (2.7600)	mem 20675MB
[2025-04-03 02:56:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][94/573]	eta 0:07:07 lr 0.000485	time 0.8774 (0.8925)	loss 0.5397 (0.4970)	grad_norm 3.0136 (2.7604)	mem 20675MB
[2025-04-03 02:56:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][96/573]	eta 0:07:05 lr 0.000485	time 0.8772 (0.8922)	loss 0.4853 (0.4975)	grad_norm 2.6978 (2.7552)	mem 20675MB
[2025-04-03 02:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][98/573]	eta 0:07:03 lr 0.000484	time 0.8774 (0.8920)	loss 0.4385 (0.4966)	grad_norm 2.9459 (2.7515)	mem 20675MB
[2025-04-03 02:56:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][100/573]	eta 0:07:01 lr 0.000484	time 0.8774 (0.8917)	loss 0.5853 (0.4974)	grad_norm 2.4691 (2.7521)	mem 20675MB
[2025-04-03 02:56:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][102/573]	eta 0:06:59 lr 0.000484	time 0.8772 (0.8914)	loss 0.5002 (0.4973)	grad_norm 2.0096 (2.7414)	mem 20675MB
[2025-04-03 02:56:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][104/573]	eta 0:06:57 lr 0.000484	time 0.8773 (0.8912)	loss 0.4298 (0.4963)	grad_norm 2.3631 (2.7354)	mem 20675MB
[2025-04-03 02:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][106/573]	eta 0:06:56 lr 0.000483	time 0.8776 (0.8909)	loss 0.5217 (0.4953)	grad_norm 2.4758 (2.7341)	mem 20675MB
[2025-04-03 02:57:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][108/573]	eta 0:06:54 lr 0.000483	time 0.8774 (0.8907)	loss 0.3901 (0.4941)	grad_norm 3.0398 (2.7269)	mem 20675MB
[2025-04-03 02:57:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][110/573]	eta 0:06:52 lr 0.000483	time 0.8772 (0.8905)	loss 0.4083 (0.4938)	grad_norm 2.6901 (2.7274)	mem 20675MB
[2025-04-03 02:57:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][112/573]	eta 0:06:50 lr 0.000483	time 0.8779 (0.8903)	loss 0.3850 (0.4922)	grad_norm 3.0703 (2.7415)	mem 20675MB
[2025-04-03 02:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][114/573]	eta 0:06:48 lr 0.000483	time 0.8772 (0.8901)	loss 0.4486 (0.4923)	grad_norm 3.9393 (2.7475)	mem 20675MB
[2025-04-03 02:57:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][116/573]	eta 0:06:46 lr 0.000482	time 0.8773 (0.8899)	loss 0.4882 (0.4928)	grad_norm 1.9546 (2.7329)	mem 20675MB
[2025-04-03 02:57:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][118/573]	eta 0:06:44 lr 0.000482	time 0.8772 (0.8897)	loss 0.5046 (0.4932)	grad_norm 1.9861 (2.7233)	mem 20675MB
[2025-04-03 02:57:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][120/573]	eta 0:06:42 lr 0.000482	time 0.8776 (0.8895)	loss 0.4421 (0.4930)	grad_norm 3.0765 (2.7273)	mem 20675MB
[2025-04-03 02:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][122/573]	eta 0:06:41 lr 0.000482	time 0.8774 (0.8893)	loss 0.6139 (0.4943)	grad_norm 2.9109 (2.7242)	mem 20675MB
[2025-04-03 02:57:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][124/573]	eta 0:06:39 lr 0.000481	time 0.8774 (0.8892)	loss 0.4138 (0.4932)	grad_norm 5.4940 (2.7432)	mem 20675MB
[2025-04-03 02:57:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][126/573]	eta 0:06:37 lr 0.000481	time 0.8774 (0.8890)	loss 0.4627 (0.4938)	grad_norm 2.5804 (2.7470)	mem 20675MB
[2025-04-03 02:57:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][128/573]	eta 0:06:35 lr 0.000481	time 0.8773 (0.8888)	loss 0.4913 (0.4950)	grad_norm 2.8462 (2.7460)	mem 20675MB
[2025-04-03 02:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][130/573]	eta 0:06:33 lr 0.000481	time 0.8775 (0.8887)	loss 0.4916 (0.4943)	grad_norm 2.5048 (2.7469)	mem 20675MB
[2025-04-03 02:57:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][132/573]	eta 0:06:31 lr 0.000481	time 0.8772 (0.8885)	loss 0.5773 (0.4941)	grad_norm 1.8616 (2.7447)	mem 20675MB
[2025-04-03 02:57:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][134/573]	eta 0:06:29 lr 0.000480	time 0.8772 (0.8883)	loss 0.5298 (0.4951)	grad_norm 2.0880 (2.7386)	mem 20675MB
[2025-04-03 02:57:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][136/573]	eta 0:06:28 lr 0.000480	time 0.8775 (0.8882)	loss 0.4757 (0.4954)	grad_norm 2.7381 (2.7310)	mem 20675MB
[2025-04-03 02:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][138/573]	eta 0:06:26 lr 0.000480	time 0.8771 (0.8880)	loss 0.3454 (0.4948)	grad_norm 3.3511 (2.7321)	mem 20675MB
[2025-04-03 02:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][140/573]	eta 0:06:24 lr 0.000480	time 0.8772 (0.8879)	loss 0.5611 (0.4955)	grad_norm 2.3842 (2.7273)	mem 20675MB
[2025-04-03 02:57:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][142/573]	eta 0:06:22 lr 0.000479	time 0.8771 (0.8878)	loss 0.5788 (0.4967)	grad_norm 3.4659 (2.7301)	mem 20675MB
[2025-04-03 02:57:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][144/573]	eta 0:06:20 lr 0.000479	time 0.8776 (0.8876)	loss 0.5851 (0.4974)	grad_norm 2.0026 (2.7190)	mem 20675MB
[2025-04-03 02:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][146/573]	eta 0:06:18 lr 0.000479	time 0.8771 (0.8875)	loss 0.5514 (0.4969)	grad_norm 1.4779 (2.7248)	mem 20675MB
[2025-04-03 02:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][148/573]	eta 0:06:17 lr 0.000479	time 0.8773 (0.8874)	loss 0.4972 (0.4970)	grad_norm 2.4366 (2.7183)	mem 20675MB
[2025-04-03 02:57:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][150/573]	eta 0:06:15 lr 0.000479	time 0.8772 (0.8873)	loss 0.5141 (0.4974)	grad_norm 1.8400 (2.7081)	mem 20675MB
[2025-04-03 02:57:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][152/573]	eta 0:06:13 lr 0.000478	time 0.8777 (0.8871)	loss 0.4615 (0.4967)	grad_norm 2.1300 (2.7091)	mem 20675MB
[2025-04-03 02:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][154/573]	eta 0:06:11 lr 0.000478	time 0.8771 (0.8870)	loss 0.5358 (0.4963)	grad_norm 3.3869 (2.7173)	mem 20675MB
[2025-04-03 02:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][156/573]	eta 0:06:09 lr 0.000478	time 0.8774 (0.8869)	loss 0.4848 (0.4959)	grad_norm 3.0205 (2.7260)	mem 20675MB
[2025-04-03 02:57:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][158/573]	eta 0:06:08 lr 0.000478	time 0.8775 (0.8868)	loss 0.4250 (0.4957)	grad_norm 3.3154 (2.7262)	mem 20675MB
[2025-04-03 02:57:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][160/573]	eta 0:06:06 lr 0.000477	time 0.8775 (0.8867)	loss 0.4409 (0.4958)	grad_norm 3.0913 (2.7323)	mem 20675MB
[2025-04-03 02:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][162/573]	eta 0:06:04 lr 0.000477	time 0.8774 (0.8866)	loss 0.6349 (0.4965)	grad_norm 3.3389 (2.7396)	mem 20675MB
[2025-04-03 02:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][164/573]	eta 0:06:02 lr 0.000477	time 0.8771 (0.8865)	loss 0.5308 (0.4969)	grad_norm 4.9560 (2.7491)	mem 20675MB
[2025-04-03 02:57:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][166/573]	eta 0:06:00 lr 0.000477	time 0.8773 (0.8864)	loss 0.5263 (0.4966)	grad_norm 3.5511 (2.7576)	mem 20675MB
[2025-04-03 02:57:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][168/573]	eta 0:05:58 lr 0.000477	time 0.8774 (0.8863)	loss 0.4275 (0.4963)	grad_norm 3.4192 (2.7574)	mem 20675MB
[2025-04-03 02:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][170/573]	eta 0:05:57 lr 0.000476	time 0.8773 (0.8862)	loss 0.5512 (0.4967)	grad_norm 1.8030 (2.7491)	mem 20675MB
[2025-04-03 02:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][172/573]	eta 0:05:55 lr 0.000476	time 0.8772 (0.8861)	loss 0.5752 (0.4977)	grad_norm 2.1422 (2.7430)	mem 20675MB
[2025-04-03 02:57:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][174/573]	eta 0:05:53 lr 0.000476	time 0.8774 (0.8860)	loss 0.5017 (0.4978)	grad_norm 2.5281 (2.7385)	mem 20675MB
[2025-04-03 02:58:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][176/573]	eta 0:05:51 lr 0.000476	time 0.8771 (0.8859)	loss 0.5790 (0.4985)	grad_norm 1.8585 (2.7294)	mem 20675MB
[2025-04-03 02:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][178/573]	eta 0:05:49 lr 0.000475	time 0.8775 (0.8858)	loss 0.4007 (0.4972)	grad_norm 3.3677 (2.7344)	mem 20675MB
[2025-04-03 02:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][180/573]	eta 0:05:48 lr 0.000475	time 0.8774 (0.8857)	loss 0.5911 (0.4976)	grad_norm 2.6598 (2.7372)	mem 20675MB
[2025-04-03 02:58:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][182/573]	eta 0:05:46 lr 0.000475	time 0.8773 (0.8857)	loss 0.5594 (0.4983)	grad_norm 3.1660 (2.7392)	mem 20675MB
[2025-04-03 02:58:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][184/573]	eta 0:05:44 lr 0.000475	time 0.8770 (0.8856)	loss 0.4633 (0.4987)	grad_norm 2.5267 (2.7339)	mem 20675MB
[2025-04-03 02:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][186/573]	eta 0:05:42 lr 0.000475	time 0.8771 (0.8855)	loss 0.4748 (0.4979)	grad_norm 2.3571 (2.7309)	mem 20675MB
[2025-04-03 02:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][188/573]	eta 0:05:40 lr 0.000474	time 0.8773 (0.8854)	loss 0.5265 (0.4975)	grad_norm 2.3278 (2.7333)	mem 20675MB
[2025-04-03 02:58:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][190/573]	eta 0:05:39 lr 0.000474	time 0.8771 (0.8854)	loss 0.5365 (0.4981)	grad_norm 3.3612 (2.7361)	mem 20675MB
[2025-04-03 02:58:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][192/573]	eta 0:05:37 lr 0.000474	time 0.8772 (0.8853)	loss 0.3955 (0.4980)	grad_norm 3.8997 (2.7413)	mem 20675MB
[2025-04-03 02:58:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][194/573]	eta 0:05:35 lr 0.000474	time 0.8774 (0.8852)	loss 0.4727 (0.4980)	grad_norm 1.9437 (2.7337)	mem 20675MB
[2025-04-03 02:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][196/573]	eta 0:05:33 lr 0.000473	time 0.8774 (0.8851)	loss 0.4384 (0.4977)	grad_norm 2.5386 (2.7312)	mem 20675MB
[2025-04-03 02:58:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][198/573]	eta 0:05:31 lr 0.000473	time 0.8773 (0.8851)	loss 0.5936 (0.4981)	grad_norm 3.6979 (2.7336)	mem 20675MB
[2025-04-03 02:58:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][200/573]	eta 0:05:30 lr 0.000473	time 0.8773 (0.8850)	loss 0.5566 (0.4990)	grad_norm 2.2018 (2.7312)	mem 20675MB
[2025-04-03 02:58:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][202/573]	eta 0:05:28 lr 0.000473	time 0.8772 (0.8850)	loss 0.5553 (0.4993)	grad_norm 1.6488 (2.7287)	mem 20675MB
[2025-04-03 02:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][204/573]	eta 0:05:26 lr 0.000473	time 0.8772 (0.8849)	loss 0.5881 (0.5000)	grad_norm 2.3030 (2.7241)	mem 20675MB
[2025-04-03 02:58:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][206/573]	eta 0:05:24 lr 0.000472	time 0.8771 (0.8848)	loss 0.3690 (0.4994)	grad_norm 3.4150 (2.7362)	mem 20675MB
[2025-04-03 02:58:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][208/573]	eta 0:05:22 lr 0.000472	time 0.8770 (0.8848)	loss 0.4062 (0.4993)	grad_norm 5.3961 (2.7474)	mem 20675MB
[2025-04-03 02:58:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][210/573]	eta 0:05:21 lr 0.000472	time 0.8773 (0.8847)	loss 0.4642 (0.4993)	grad_norm 2.7406 (2.7453)	mem 20675MB
[2025-04-03 02:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][212/573]	eta 0:05:19 lr 0.000472	time 0.8773 (0.8846)	loss 0.3974 (0.4987)	grad_norm 2.4439 (2.7393)	mem 20675MB
[2025-04-03 02:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][214/573]	eta 0:05:17 lr 0.000471	time 0.8771 (0.8846)	loss 0.4429 (0.4982)	grad_norm 3.5074 (2.7456)	mem 20675MB
[2025-04-03 02:58:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][216/573]	eta 0:05:15 lr 0.000471	time 0.8773 (0.8845)	loss 0.3804 (0.4978)	grad_norm 4.8502 (2.7538)	mem 20675MB
[2025-04-03 02:58:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][218/573]	eta 0:05:13 lr 0.000471	time 0.8770 (0.8845)	loss 0.3959 (0.4974)	grad_norm 2.6908 (2.7509)	mem 20675MB
[2025-04-03 02:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][220/573]	eta 0:05:12 lr 0.000471	time 0.8775 (0.8844)	loss 0.6045 (0.4975)	grad_norm 2.6999 (2.7543)	mem 20675MB
[2025-04-03 02:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][222/573]	eta 0:05:10 lr 0.000471	time 0.8773 (0.8843)	loss 0.4039 (0.4968)	grad_norm 2.5910 (2.7742)	mem 20675MB
[2025-04-03 02:58:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][224/573]	eta 0:05:08 lr 0.000470	time 0.8775 (0.8843)	loss 0.5970 (0.4979)	grad_norm 2.4557 (2.7723)	mem 20675MB
[2025-04-03 02:58:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][226/573]	eta 0:05:06 lr 0.000470	time 0.8770 (0.8842)	loss 0.5393 (0.4980)	grad_norm 2.4225 (2.7720)	mem 20675MB
[2025-04-03 02:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][228/573]	eta 0:05:05 lr 0.000470	time 0.8770 (0.8842)	loss 0.5224 (0.4983)	grad_norm 2.3177 (2.7704)	mem 20675MB
[2025-04-03 02:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][230/573]	eta 0:05:03 lr 0.000470	time 0.8774 (0.8841)	loss 0.5952 (0.4985)	grad_norm 2.0127 (2.7653)	mem 20675MB
[2025-04-03 02:58:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][232/573]	eta 0:05:01 lr 0.000469	time 0.8774 (0.8841)	loss 0.5300 (0.4980)	grad_norm 1.8543 (2.7620)	mem 20675MB
[2025-04-03 02:58:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][234/573]	eta 0:04:59 lr 0.000469	time 0.8775 (0.8840)	loss 0.6028 (0.4984)	grad_norm 3.4015 (2.7637)	mem 20675MB
[2025-04-03 02:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][236/573]	eta 0:04:57 lr 0.000469	time 0.8774 (0.8840)	loss 0.3947 (0.4983)	grad_norm 3.6862 (2.7655)	mem 20675MB
[2025-04-03 02:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][238/573]	eta 0:04:56 lr 0.000469	time 0.8772 (0.8839)	loss 0.5882 (0.4988)	grad_norm 1.6669 (2.7598)	mem 20675MB
[2025-04-03 02:58:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][240/573]	eta 0:04:54 lr 0.000469	time 0.8774 (0.8839)	loss 0.4994 (0.4993)	grad_norm 2.0981 (2.7563)	mem 20675MB
[2025-04-03 02:58:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][242/573]	eta 0:04:52 lr 0.000468	time 0.8770 (0.8838)	loss 0.4884 (0.4993)	grad_norm 2.4468 (2.7548)	mem 20675MB
[2025-04-03 02:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][244/573]	eta 0:04:50 lr 0.000468	time 0.8770 (0.8838)	loss 0.4043 (0.4987)	grad_norm 2.8839 (2.7530)	mem 20675MB
[2025-04-03 02:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][246/573]	eta 0:04:48 lr 0.000468	time 0.8771 (0.8837)	loss 0.5612 (0.4987)	grad_norm 1.9274 (2.7494)	mem 20675MB
[2025-04-03 02:59:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][248/573]	eta 0:04:47 lr 0.000468	time 0.8773 (0.8837)	loss 0.5698 (0.4991)	grad_norm 1.8987 (2.7445)	mem 20675MB
[2025-04-03 02:59:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][250/573]	eta 0:04:45 lr 0.000467	time 0.8770 (0.8836)	loss 0.4208 (0.4992)	grad_norm 3.3846 (2.7452)	mem 20675MB
[2025-04-03 02:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][252/573]	eta 0:04:43 lr 0.000467	time 0.8773 (0.8836)	loss 0.5232 (0.4994)	grad_norm 2.4313 (2.7503)	mem 20675MB
[2025-04-03 02:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][254/573]	eta 0:04:41 lr 0.000467	time 0.8772 (0.8835)	loss 0.5067 (0.4991)	grad_norm 1.8789 (2.7476)	mem 20675MB
[2025-04-03 02:59:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][256/573]	eta 0:04:40 lr 0.000467	time 0.8784 (0.8835)	loss 0.4986 (0.4987)	grad_norm 2.5381 (2.7517)	mem 20675MB
[2025-04-03 02:59:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][258/573]	eta 0:04:38 lr 0.000467	time 0.8771 (0.8835)	loss 0.5418 (0.4988)	grad_norm 1.6834 (2.7431)	mem 20675MB
[2025-04-03 02:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][260/573]	eta 0:04:36 lr 0.000466	time 0.8771 (0.8834)	loss 0.5934 (0.4994)	grad_norm 3.0575 (2.7426)	mem 20675MB
[2025-04-03 02:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][262/573]	eta 0:04:34 lr 0.000466	time 0.8772 (0.8834)	loss 0.5116 (0.4993)	grad_norm 2.6129 (2.7422)	mem 20675MB
[2025-04-03 02:59:18 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][264/573]	eta 0:04:32 lr 0.000466	time 0.8772 (0.8833)	loss 0.4442 (0.4997)	grad_norm 2.4312 (2.7398)	mem 20675MB
[2025-04-03 02:59:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][266/573]	eta 0:04:31 lr 0.000466	time 0.8774 (0.8833)	loss 0.4770 (0.4998)	grad_norm 2.9798 (2.7401)	mem 20675MB
[2025-04-03 02:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][268/573]	eta 0:04:29 lr 0.000465	time 0.8771 (0.8833)	loss 0.5017 (0.5000)	grad_norm 2.2108 (2.7363)	mem 20675MB
[2025-04-03 02:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][270/573]	eta 0:04:27 lr 0.000465	time 0.8771 (0.8832)	loss 0.5570 (0.5004)	grad_norm 2.1072 (2.7289)	mem 20675MB
[2025-04-03 02:59:25 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][272/573]	eta 0:04:25 lr 0.000465	time 0.8774 (0.8832)	loss 0.5996 (0.5010)	grad_norm 2.1305 (2.7227)	mem 20675MB
[2025-04-03 02:59:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][274/573]	eta 0:04:24 lr 0.000465	time 0.8771 (0.8831)	loss 0.4199 (0.5007)	grad_norm 2.7869 (2.7201)	mem 20675MB
[2025-04-03 02:59:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][276/573]	eta 0:04:22 lr 0.000465	time 0.8773 (0.8831)	loss 0.5516 (0.5008)	grad_norm 2.4153 (2.7181)	mem 20675MB
[2025-04-03 02:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][278/573]	eta 0:04:20 lr 0.000464	time 0.8771 (0.8831)	loss 0.5697 (0.5009)	grad_norm 2.6827 (2.7183)	mem 20675MB
[2025-04-03 02:59:32 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][280/573]	eta 0:04:18 lr 0.000464	time 0.8772 (0.8830)	loss 0.5495 (0.5014)	grad_norm 1.7025 (2.7119)	mem 20675MB
[2025-04-03 02:59:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][282/573]	eta 0:04:16 lr 0.000464	time 0.8784 (0.8830)	loss 0.3546 (0.5010)	grad_norm 3.3565 (2.7132)	mem 20675MB
[2025-04-03 02:59:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][284/573]	eta 0:04:15 lr 0.000464	time 0.8770 (0.8830)	loss 0.4330 (0.5007)	grad_norm 9.8696 (2.7352)	mem 20675MB
[2025-04-03 02:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][286/573]	eta 0:04:13 lr 0.000463	time 0.8771 (0.8829)	loss 0.5789 (0.5014)	grad_norm 2.6240 (2.7343)	mem 20675MB
[2025-04-03 02:59:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][288/573]	eta 0:04:11 lr 0.000463	time 0.8776 (0.8829)	loss 0.4662 (0.5020)	grad_norm 2.9147 (2.7391)	mem 20675MB
[2025-04-03 02:59:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][290/573]	eta 0:04:09 lr 0.000463	time 0.8773 (0.8829)	loss 0.6249 (0.5023)	grad_norm 1.9242 (2.7348)	mem 20675MB
[2025-04-03 02:59:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][292/573]	eta 0:04:08 lr 0.000463	time 0.8781 (0.8828)	loss 0.5711 (0.5028)	grad_norm 2.3430 (2.7327)	mem 20675MB
[2025-04-03 02:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][294/573]	eta 0:04:06 lr 0.000463	time 0.8772 (0.8828)	loss 0.3697 (0.5026)	grad_norm 3.1107 (2.7363)	mem 20675MB
[2025-04-03 02:59:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][296/573]	eta 0:04:04 lr 0.000462	time 0.8773 (0.8828)	loss 0.5309 (0.5029)	grad_norm 2.3095 (2.7335)	mem 20675MB
[2025-04-03 02:59:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][298/573]	eta 0:04:02 lr 0.000462	time 0.8774 (0.8828)	loss 0.5764 (0.5034)	grad_norm 2.2856 (2.7309)	mem 20675MB
[2025-04-03 02:59:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][300/573]	eta 0:04:00 lr 0.000462	time 0.8775 (0.8827)	loss 0.5531 (0.5038)	grad_norm 2.1114 (2.7283)	mem 20675MB
[2025-04-03 02:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][302/573]	eta 0:03:59 lr 0.000462	time 0.8773 (0.8827)	loss 0.5243 (0.5034)	grad_norm 2.0376 (2.7274)	mem 20675MB
[2025-04-03 02:59:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][304/573]	eta 0:03:57 lr 0.000461	time 0.8776 (0.8827)	loss 0.4211 (0.5029)	grad_norm 3.3817 (2.7325)	mem 20675MB
[2025-04-03 02:59:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][306/573]	eta 0:03:55 lr 0.000461	time 0.8773 (0.8826)	loss 0.4181 (0.5023)	grad_norm 2.4187 (2.7324)	mem 20675MB
[2025-04-03 02:59:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][308/573]	eta 0:03:53 lr 0.000461	time 0.8774 (0.8826)	loss 0.4979 (0.5023)	grad_norm 1.8598 (2.7271)	mem 20675MB
[2025-04-03 02:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][310/573]	eta 0:03:52 lr 0.000461	time 0.8771 (0.8826)	loss 0.4728 (0.5019)	grad_norm 3.0880 (2.7271)	mem 20675MB
[2025-04-03 03:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][312/573]	eta 0:03:50 lr 0.000461	time 0.8774 (0.8825)	loss 0.5871 (0.5024)	grad_norm 2.1483 (2.7251)	mem 20675MB
[2025-04-03 03:00:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][314/573]	eta 0:03:48 lr 0.000460	time 0.8771 (0.8825)	loss 0.4920 (0.5028)	grad_norm 2.5907 (2.7232)	mem 20675MB
[2025-04-03 03:00:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][316/573]	eta 0:03:46 lr 0.000460	time 0.8773 (0.8825)	loss 0.5036 (0.5029)	grad_norm 2.4938 (2.7216)	mem 20675MB
[2025-04-03 03:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][318/573]	eta 0:03:45 lr 0.000460	time 0.8772 (0.8825)	loss 0.6110 (0.5034)	grad_norm 2.7319 (2.7196)	mem 20675MB
[2025-04-03 03:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][320/573]	eta 0:03:43 lr 0.000460	time 0.8771 (0.8824)	loss 0.4522 (0.5032)	grad_norm 2.0512 (2.7190)	mem 20675MB
[2025-04-03 03:00:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][322/573]	eta 0:03:41 lr 0.000459	time 0.8781 (0.8824)	loss 0.5787 (0.5033)	grad_norm 2.3220 (2.7183)	mem 20675MB
[2025-04-03 03:00:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][324/573]	eta 0:03:39 lr 0.000459	time 0.8772 (0.8824)	loss 0.5553 (0.5037)	grad_norm 1.9461 (2.7129)	mem 20675MB
[2025-04-03 03:00:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][326/573]	eta 0:03:37 lr 0.000459	time 0.8772 (0.8823)	loss 0.4556 (0.5032)	grad_norm 2.2565 (2.7141)	mem 20675MB
[2025-04-03 03:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][328/573]	eta 0:03:36 lr 0.000459	time 0.8773 (0.8823)	loss 0.5354 (0.5032)	grad_norm 2.8123 (2.7139)	mem 20675MB
[2025-04-03 03:00:16 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][330/573]	eta 0:03:34 lr 0.000459	time 0.8774 (0.8823)	loss 0.5704 (0.5037)	grad_norm 2.8505 (2.7148)	mem 20675MB
[2025-04-03 03:00:18 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][332/573]	eta 0:03:32 lr 0.000458	time 0.8773 (0.8823)	loss 0.3803 (0.5035)	grad_norm 2.9525 (2.7148)	mem 20675MB
[2025-04-03 03:00:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][334/573]	eta 0:03:30 lr 0.000458	time 0.8786 (0.8823)	loss 0.5528 (0.5040)	grad_norm 2.8151 (2.7133)	mem 20675MB
[2025-04-03 03:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][336/573]	eta 0:03:29 lr 0.000458	time 0.8776 (0.8822)	loss 0.4936 (0.5040)	grad_norm 1.9637 (2.7125)	mem 20675MB
[2025-04-03 03:00:23 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][338/573]	eta 0:03:27 lr 0.000458	time 0.8774 (0.8822)	loss 0.4574 (0.5039)	grad_norm 2.8518 (2.7122)	mem 20675MB
[2025-04-03 03:00:25 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][340/573]	eta 0:03:25 lr 0.000458	time 0.8777 (0.8822)	loss 0.6249 (0.5044)	grad_norm 2.9992 (2.7103)	mem 20675MB
[2025-04-03 03:00:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][342/573]	eta 0:03:23 lr 0.000457	time 0.8774 (0.8822)	loss 0.4811 (0.5046)	grad_norm 2.6452 (2.7105)	mem 20675MB
[2025-04-03 03:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][344/573]	eta 0:03:22 lr 0.000457	time 0.8776 (0.8821)	loss 0.5646 (0.5047)	grad_norm 1.7740 (2.7060)	mem 20675MB
[2025-04-03 03:00:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][346/573]	eta 0:03:20 lr 0.000457	time 0.8773 (0.8821)	loss 0.5433 (0.5045)	grad_norm 2.3700 (2.7078)	mem 20675MB
[2025-04-03 03:00:32 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][348/573]	eta 0:03:18 lr 0.000457	time 0.8777 (0.8821)	loss 0.5758 (0.5043)	grad_norm 2.4685 (2.7117)	mem 20675MB
[2025-04-03 03:00:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][350/573]	eta 0:03:16 lr 0.000456	time 0.8774 (0.8821)	loss 0.5254 (0.5042)	grad_norm 2.2037 (2.7089)	mem 20675MB
[2025-04-03 03:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][352/573]	eta 0:03:14 lr 0.000456	time 0.8778 (0.8821)	loss 0.5316 (0.5042)	grad_norm 2.2704 (2.7053)	mem 20675MB
[2025-04-03 03:00:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][354/573]	eta 0:03:13 lr 0.000456	time 0.8779 (0.8820)	loss 0.5953 (0.5048)	grad_norm 1.9880 (2.7023)	mem 20675MB
[2025-04-03 03:00:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][356/573]	eta 0:03:11 lr 0.000456	time 0.8778 (0.8820)	loss 0.4141 (0.5045)	grad_norm 3.5845 (2.7033)	mem 20675MB
[2025-04-03 03:00:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][358/573]	eta 0:03:09 lr 0.000456	time 0.8775 (0.8820)	loss 0.5473 (0.5048)	grad_norm 2.0144 (2.6996)	mem 20675MB
[2025-04-03 03:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][360/573]	eta 0:03:07 lr 0.000455	time 0.8775 (0.8820)	loss 0.4172 (0.5042)	grad_norm 3.3988 (2.7022)	mem 20675MB
[2025-04-03 03:00:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][362/573]	eta 0:03:06 lr 0.000455	time 0.8776 (0.8820)	loss 0.4463 (0.5038)	grad_norm 2.3328 (2.7000)	mem 20675MB
[2025-04-03 03:00:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][364/573]	eta 0:03:04 lr 0.000455	time 0.8777 (0.8819)	loss 0.5176 (0.5041)	grad_norm 2.1028 (2.7001)	mem 20675MB
[2025-04-03 03:00:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][366/573]	eta 0:03:02 lr 0.000455	time 0.8775 (0.8819)	loss 0.4748 (0.5041)	grad_norm 2.1811 (2.6970)	mem 20675MB
[2025-04-03 03:00:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][368/573]	eta 0:03:00 lr 0.000454	time 0.8777 (0.8819)	loss 0.6250 (0.5042)	grad_norm 2.4912 (2.6957)	mem 20675MB
[2025-04-03 03:00:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][370/573]	eta 0:02:59 lr 0.000454	time 0.8777 (0.8819)	loss 0.5255 (0.5045)	grad_norm 2.5200 (2.6954)	mem 20675MB
[2025-04-03 03:00:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][372/573]	eta 0:02:57 lr 0.000454	time 0.8776 (0.8819)	loss 0.5426 (0.5044)	grad_norm 2.4538 (2.6954)	mem 20675MB
[2025-04-03 03:00:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][374/573]	eta 0:02:55 lr 0.000454	time 0.8780 (0.8818)	loss 0.4562 (0.5043)	grad_norm 2.8326 (2.6938)	mem 20675MB
[2025-04-03 03:00:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][376/573]	eta 0:02:53 lr 0.000454	time 0.8776 (0.8818)	loss 0.5126 (0.5043)	grad_norm 3.1052 (2.6945)	mem 20675MB
[2025-04-03 03:00:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][378/573]	eta 0:02:51 lr 0.000453	time 0.8777 (0.8818)	loss 0.6420 (0.5048)	grad_norm 2.4879 (2.6925)	mem 20675MB
[2025-04-03 03:01:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][380/573]	eta 0:02:50 lr 0.000453	time 0.8776 (0.8818)	loss 0.3789 (0.5046)	grad_norm 3.6055 (2.6977)	mem 20675MB
[2025-04-03 03:01:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][382/573]	eta 0:02:48 lr 0.000453	time 0.8779 (0.8818)	loss 0.4813 (0.5046)	grad_norm 2.8152 (2.6960)	mem 20675MB
[2025-04-03 03:01:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][384/573]	eta 0:02:46 lr 0.000453	time 0.8773 (0.8818)	loss 0.4642 (0.5046)	grad_norm 3.2649 (2.6979)	mem 20675MB
[2025-04-03 03:01:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][386/573]	eta 0:02:44 lr 0.000452	time 0.8777 (0.8817)	loss 0.5588 (0.5046)	grad_norm 2.4627 (2.6965)	mem 20675MB
[2025-04-03 03:01:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][388/573]	eta 0:02:43 lr 0.000452	time 0.8775 (0.8817)	loss 0.4157 (0.5043)	grad_norm 1.9316 (2.6952)	mem 20675MB
[2025-04-03 03:01:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][390/573]	eta 0:02:41 lr 0.000452	time 0.8777 (0.8817)	loss 0.4098 (0.5042)	grad_norm 4.7822 (2.7019)	mem 20675MB
[2025-04-03 03:01:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][392/573]	eta 0:02:39 lr 0.000452	time 0.8777 (0.8817)	loss 0.5274 (0.5045)	grad_norm 2.5688 (2.7013)	mem 20675MB
[2025-04-03 03:01:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][394/573]	eta 0:02:37 lr 0.000452	time 0.8776 (0.8817)	loss 0.4781 (0.5044)	grad_norm 2.4356 (2.7008)	mem 20675MB
[2025-04-03 03:01:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][396/573]	eta 0:02:36 lr 0.000451	time 0.8774 (0.8817)	loss 0.4389 (0.5044)	grad_norm 1.5115 (2.6955)	mem 20675MB
[2025-04-03 03:01:16 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][398/573]	eta 0:02:34 lr 0.000451	time 0.8774 (0.8816)	loss 0.3564 (0.5043)	grad_norm 3.4290 (2.6977)	mem 20675MB
[2025-04-03 03:01:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][400/573]	eta 0:02:32 lr 0.000451	time 0.8776 (0.8816)	loss 0.4758 (0.5044)	grad_norm 2.2973 (2.6948)	mem 20675MB
[2025-04-03 03:01:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][402/573]	eta 0:02:30 lr 0.000451	time 0.8774 (0.8816)	loss 0.4786 (0.5041)	grad_norm 2.9979 (2.6997)	mem 20675MB
[2025-04-03 03:01:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][404/573]	eta 0:02:28 lr 0.000450	time 0.8774 (0.8816)	loss 0.3802 (0.5040)	grad_norm 2.8930 (2.6986)	mem 20675MB
[2025-04-03 03:01:23 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][406/573]	eta 0:02:27 lr 0.000450	time 0.8775 (0.8816)	loss 0.5189 (0.5042)	grad_norm 2.3984 (2.6969)	mem 20675MB
[2025-04-03 03:01:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][408/573]	eta 0:02:25 lr 0.000450	time 0.8774 (0.8816)	loss 0.5116 (0.5039)	grad_norm 1.8558 (2.6974)	mem 20675MB
[2025-04-03 03:01:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][410/573]	eta 0:02:23 lr 0.000450	time 0.8772 (0.8815)	loss 0.6046 (0.5041)	grad_norm 2.0673 (2.6949)	mem 20675MB
[2025-04-03 03:01:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][412/573]	eta 0:02:21 lr 0.000450	time 0.8772 (0.8815)	loss 0.5016 (0.5041)	grad_norm 2.8824 (2.6964)	mem 20675MB
[2025-04-03 03:01:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][414/573]	eta 0:02:20 lr 0.000449	time 0.8772 (0.8815)	loss 0.3581 (0.5035)	grad_norm 3.0803 (2.6979)	mem 20675MB
[2025-04-03 03:01:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][416/573]	eta 0:02:18 lr 0.000449	time 0.8772 (0.8815)	loss 0.4819 (0.5034)	grad_norm 4.5130 (2.7021)	mem 20675MB
[2025-04-03 03:01:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][418/573]	eta 0:02:16 lr 0.000449	time 0.8772 (0.8815)	loss 0.4802 (0.5033)	grad_norm 2.0338 (2.7003)	mem 20675MB
[2025-04-03 03:01:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][420/573]	eta 0:02:14 lr 0.000449	time 0.8773 (0.8815)	loss 0.5908 (0.5034)	grad_norm 2.4725 (2.6995)	mem 20675MB
[2025-04-03 03:01:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][422/573]	eta 0:02:13 lr 0.000448	time 0.8770 (0.8814)	loss 0.4226 (0.5028)	grad_norm 3.2000 (2.7001)	mem 20675MB
[2025-04-03 03:01:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][424/573]	eta 0:02:11 lr 0.000448	time 0.8773 (0.8814)	loss 0.4876 (0.5028)	grad_norm 2.3759 (2.6996)	mem 20675MB
[2025-04-03 03:01:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][426/573]	eta 0:02:09 lr 0.000448	time 0.8774 (0.8814)	loss 0.5322 (0.5030)	grad_norm 2.8660 (2.7004)	mem 20675MB
[2025-04-03 03:01:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][428/573]	eta 0:02:07 lr 0.000448	time 0.8774 (0.8814)	loss 0.6091 (0.5034)	grad_norm 2.4272 (2.6982)	mem 20675MB
[2025-04-03 03:01:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][430/573]	eta 0:02:06 lr 0.000448	time 0.8770 (0.8814)	loss 0.4953 (0.5035)	grad_norm 2.5303 (2.6992)	mem 20675MB
[2025-04-03 03:01:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][432/573]	eta 0:02:04 lr 0.000447	time 0.8775 (0.8814)	loss 0.4246 (0.5033)	grad_norm 2.9933 (2.6993)	mem 20675MB
[2025-04-03 03:01:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][434/573]	eta 0:02:02 lr 0.000447	time 0.8773 (0.8814)	loss 0.4223 (0.5032)	grad_norm 2.3149 (2.6965)	mem 20675MB
[2025-04-03 03:01:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][436/573]	eta 0:02:00 lr 0.000447	time 0.8773 (0.8813)	loss 0.5864 (0.5036)	grad_norm 2.0424 (2.6955)	mem 20675MB
[2025-04-03 03:01:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][438/573]	eta 0:01:58 lr 0.000447	time 0.8772 (0.8813)	loss 0.5532 (0.5038)	grad_norm 3.3440 (2.7049)	mem 20675MB
[2025-04-03 03:01:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][440/573]	eta 0:01:57 lr 0.000447	time 0.8772 (0.8813)	loss 0.3773 (0.5037)	grad_norm 2.8010 (2.7052)	mem 20675MB
[2025-04-03 03:01:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][442/573]	eta 0:01:55 lr 0.000446	time 0.8771 (0.8813)	loss 0.5292 (0.5033)	grad_norm 2.4779 (2.7044)	mem 20675MB
[2025-04-03 03:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][444/573]	eta 0:01:53 lr 0.000446	time 0.8777 (0.8813)	loss 0.3477 (0.5032)	grad_norm 4.5461 (2.7079)	mem 20675MB
[2025-04-03 03:01:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][446/573]	eta 0:01:51 lr 0.000446	time 0.8771 (0.8813)	loss 0.5037 (0.5033)	grad_norm 2.1833 (2.7053)	mem 20675MB
[2025-04-03 03:02:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][448/573]	eta 0:01:50 lr 0.000446	time 0.8770 (0.8812)	loss 0.5577 (0.5034)	grad_norm 3.3179 (2.7048)	mem 20675MB
[2025-04-03 03:02:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][450/573]	eta 0:01:48 lr 0.000445	time 0.8771 (0.8812)	loss 0.4982 (0.5035)	grad_norm 2.2777 (2.7036)	mem 20675MB
[2025-04-03 03:02:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][452/573]	eta 0:01:46 lr 0.000445	time 0.8771 (0.8812)	loss 0.5459 (0.5036)	grad_norm 2.3595 (2.7019)	mem 20675MB
[2025-04-03 03:02:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][454/573]	eta 0:01:44 lr 0.000445	time 0.8771 (0.8812)	loss 0.4075 (0.5033)	grad_norm 3.0647 (2.7031)	mem 20675MB
[2025-04-03 03:02:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][456/573]	eta 0:01:43 lr 0.000445	time 0.8771 (0.8812)	loss 0.5083 (0.5034)	grad_norm 2.7412 (2.7008)	mem 20675MB
[2025-04-03 03:02:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][458/573]	eta 0:01:41 lr 0.000445	time 0.8773 (0.8812)	loss 0.4419 (0.5034)	grad_norm 2.6959 (2.6991)	mem 20675MB
[2025-04-03 03:02:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][460/573]	eta 0:01:39 lr 0.000444	time 0.8772 (0.8812)	loss 0.5984 (0.5038)	grad_norm 2.9130 (2.6990)	mem 20675MB
[2025-04-03 03:02:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][462/573]	eta 0:01:37 lr 0.000444	time 0.8772 (0.8812)	loss 0.4340 (0.5039)	grad_norm 2.1170 (2.6980)	mem 20675MB
[2025-04-03 03:02:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][464/573]	eta 0:01:36 lr 0.000444	time 0.8776 (0.8811)	loss 0.4816 (0.5038)	grad_norm 1.5646 (2.6955)	mem 20675MB
[2025-04-03 03:02:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][466/573]	eta 0:01:34 lr 0.000444	time 0.8772 (0.8811)	loss 0.5318 (0.5040)	grad_norm 2.9587 (2.6942)	mem 20675MB
[2025-04-03 03:02:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][468/573]	eta 0:01:32 lr 0.000443	time 0.8780 (0.8811)	loss 0.4415 (0.5036)	grad_norm 3.8542 (2.6968)	mem 20675MB
[2025-04-03 03:02:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][470/573]	eta 0:01:30 lr 0.000443	time 0.8772 (0.8811)	loss 0.5600 (0.5037)	grad_norm 3.2067 (2.6964)	mem 20675MB
[2025-04-03 03:02:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][472/573]	eta 0:01:28 lr 0.000443	time 0.8774 (0.8811)	loss 0.5094 (0.5038)	grad_norm 2.3777 (2.6955)	mem 20675MB
[2025-04-03 03:02:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][474/573]	eta 0:01:27 lr 0.000443	time 0.8774 (0.8811)	loss 0.4680 (0.5035)	grad_norm 2.1474 (2.6964)	mem 20675MB
[2025-04-03 03:02:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][476/573]	eta 0:01:25 lr 0.000443	time 0.8774 (0.8811)	loss 0.5216 (0.5036)	grad_norm 1.9883 (2.6939)	mem 20675MB
[2025-04-03 03:02:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][478/573]	eta 0:01:23 lr 0.000442	time 0.8770 (0.8811)	loss 0.3915 (0.5030)	grad_norm 3.3644 (2.6962)	mem 20675MB
[2025-04-03 03:02:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][480/573]	eta 0:01:21 lr 0.000442	time 0.8775 (0.8810)	loss 0.4473 (0.5028)	grad_norm 4.1134 (2.7000)	mem 20675MB
[2025-04-03 03:02:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][482/573]	eta 0:01:20 lr 0.000442	time 0.8777 (0.8810)	loss 0.5048 (0.5028)	grad_norm 2.4775 (2.7014)	mem 20675MB
[2025-04-03 03:02:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][484/573]	eta 0:01:18 lr 0.000442	time 0.8772 (0.8810)	loss 0.4144 (0.5027)	grad_norm 4.2816 (2.7149)	mem 20675MB
[2025-04-03 03:02:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][486/573]	eta 0:01:16 lr 0.000442	time 0.8774 (0.8810)	loss 0.4629 (0.5027)	grad_norm 2.4525 (2.7151)	mem 20675MB
[2025-04-03 03:02:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][488/573]	eta 0:01:14 lr 0.000441	time 0.8774 (0.8810)	loss 0.5433 (0.5025)	grad_norm 3.2720 (2.7155)	mem 20675MB
[2025-04-03 03:02:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][490/573]	eta 0:01:13 lr 0.000441	time 0.8775 (0.8810)	loss 0.4307 (0.5023)	grad_norm 2.4640 (2.7165)	mem 20675MB
[2025-04-03 03:02:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][492/573]	eta 0:01:11 lr 0.000441	time 0.8773 (0.8810)	loss 0.4891 (0.5022)	grad_norm 2.1078 (2.7156)	mem 20675MB
[2025-04-03 03:02:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][494/573]	eta 0:01:09 lr 0.000441	time 0.8777 (0.8810)	loss 0.5113 (0.5024)	grad_norm 2.3106 (2.7147)	mem 20675MB
[2025-04-03 03:02:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][496/573]	eta 0:01:07 lr 0.000440	time 0.8773 (0.8810)	loss 0.6026 (0.5027)	grad_norm 3.1305 (2.7160)	mem 20675MB
[2025-04-03 03:02:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][498/573]	eta 0:01:06 lr 0.000440	time 0.8773 (0.8809)	loss 0.3813 (0.5022)	grad_norm 2.6611 (2.7182)	mem 20675MB
[2025-04-03 03:02:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][500/573]	eta 0:01:04 lr 0.000440	time 0.8773 (0.8809)	loss 0.4845 (0.5021)	grad_norm 2.9108 (2.7210)	mem 20675MB
[2025-04-03 03:02:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][502/573]	eta 0:01:02 lr 0.000440	time 0.8774 (0.8809)	loss 0.6158 (0.5024)	grad_norm 2.7398 (2.7246)	mem 20675MB
[2025-04-03 03:02:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][504/573]	eta 0:01:00 lr 0.000440	time 0.8772 (0.8809)	loss 0.5358 (0.5027)	grad_norm 2.6333 (2.7239)	mem 20675MB
[2025-04-03 03:02:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][506/573]	eta 0:00:59 lr 0.000439	time 0.8771 (0.8809)	loss 0.5758 (0.5029)	grad_norm 1.9211 (2.7200)	mem 20675MB
[2025-04-03 03:02:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][508/573]	eta 0:00:57 lr 0.000439	time 0.8776 (0.8809)	loss 0.5537 (0.5032)	grad_norm 2.1296 (2.7170)	mem 20675MB
[2025-04-03 03:02:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][510/573]	eta 0:00:55 lr 0.000439	time 0.8773 (0.8809)	loss 0.6038 (0.5035)	grad_norm 1.4411 (2.7138)	mem 20675MB
[2025-04-03 03:02:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][512/573]	eta 0:00:53 lr 0.000439	time 0.8774 (0.8809)	loss 0.5013 (0.5035)	grad_norm 2.0208 (2.7106)	mem 20675MB
[2025-04-03 03:02:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][514/573]	eta 0:00:51 lr 0.000438	time 0.8772 (0.8809)	loss 0.4764 (0.5035)	grad_norm 2.5168 (2.7094)	mem 20675MB
[2025-04-03 03:02:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][516/573]	eta 0:00:50 lr 0.000438	time 0.8771 (0.8808)	loss 0.4892 (0.5035)	grad_norm 2.0773 (2.7067)	mem 20675MB
[2025-04-03 03:03:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][518/573]	eta 0:00:48 lr 0.000438	time 0.8774 (0.8808)	loss 0.4840 (0.5036)	grad_norm 2.6001 (2.7069)	mem 20675MB
[2025-04-03 03:03:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][520/573]	eta 0:00:46 lr 0.000438	time 0.8773 (0.8808)	loss 0.5342 (0.5036)	grad_norm 2.3750 (2.7047)	mem 20675MB
[2025-04-03 03:03:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][522/573]	eta 0:00:44 lr 0.000438	time 0.8771 (0.8808)	loss 0.5312 (0.5038)	grad_norm 2.5216 (2.7031)	mem 20675MB
[2025-04-03 03:03:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][524/573]	eta 0:00:43 lr 0.000437	time 0.8769 (0.8808)	loss 0.3883 (0.5037)	grad_norm 2.2827 (2.7015)	mem 20675MB
[2025-04-03 03:03:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][526/573]	eta 0:00:41 lr 0.000437	time 0.8773 (0.8808)	loss 0.5098 (0.5038)	grad_norm 3.1658 (2.7027)	mem 20675MB
[2025-04-03 03:03:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][528/573]	eta 0:00:39 lr 0.000437	time 0.8775 (0.8808)	loss 0.4712 (0.5039)	grad_norm 3.4731 (2.7029)	mem 20675MB
[2025-04-03 03:03:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][530/573]	eta 0:00:37 lr 0.000437	time 0.8772 (0.8808)	loss 0.4241 (0.5037)	grad_norm 3.7764 (2.7051)	mem 20675MB
[2025-04-03 03:03:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][532/573]	eta 0:00:36 lr 0.000436	time 0.8771 (0.8808)	loss 0.5650 (0.5035)	grad_norm 2.5906 (2.7065)	mem 20675MB
[2025-04-03 03:03:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][534/573]	eta 0:00:34 lr 0.000436	time 0.8774 (0.8808)	loss 0.4427 (0.5032)	grad_norm 3.0046 (2.7076)	mem 20675MB
[2025-04-03 03:03:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][536/573]	eta 0:00:32 lr 0.000436	time 0.8774 (0.8807)	loss 0.5037 (0.5033)	grad_norm 2.4539 (2.7061)	mem 20675MB
[2025-04-03 03:03:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][538/573]	eta 0:00:30 lr 0.000436	time 0.8774 (0.8807)	loss 0.5733 (0.5036)	grad_norm 3.1255 (2.7061)	mem 20675MB
[2025-04-03 03:03:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][540/573]	eta 0:00:29 lr 0.000436	time 0.8770 (0.8807)	loss 0.5508 (0.5037)	grad_norm 2.0685 (2.7037)	mem 20675MB
[2025-04-03 03:03:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][542/573]	eta 0:00:27 lr 0.000435	time 0.8771 (0.8807)	loss 0.5673 (0.5037)	grad_norm 2.0188 (2.7048)	mem 20675MB
[2025-04-03 03:03:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][544/573]	eta 0:00:25 lr 0.000435	time 0.8772 (0.8807)	loss 0.5012 (0.5039)	grad_norm 2.2761 (2.7045)	mem 20675MB
[2025-04-03 03:03:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][546/573]	eta 0:00:23 lr 0.000435	time 0.8773 (0.8807)	loss 0.6099 (0.5040)	grad_norm 2.3373 (2.7033)	mem 20675MB
[2025-04-03 03:03:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][548/573]	eta 0:00:22 lr 0.000435	time 0.8770 (0.8807)	loss 0.4318 (0.5040)	grad_norm 2.8640 (2.7018)	mem 20675MB
[2025-04-03 03:03:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][550/573]	eta 0:00:20 lr 0.000435	time 0.8771 (0.8807)	loss 0.4857 (0.5040)	grad_norm 2.6052 (2.6997)	mem 20675MB
[2025-04-03 03:03:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][552/573]	eta 0:00:18 lr 0.000434	time 0.8774 (0.8807)	loss 0.5171 (0.5040)	grad_norm 2.9432 (2.6994)	mem 20675MB
[2025-04-03 03:03:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][554/573]	eta 0:00:16 lr 0.000434	time 0.8774 (0.8807)	loss 0.5980 (0.5041)	grad_norm 2.3551 (2.7004)	mem 20675MB
[2025-04-03 03:03:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][556/573]	eta 0:00:14 lr 0.000434	time 0.8776 (0.8806)	loss 0.5198 (0.5041)	grad_norm 2.3159 (2.7006)	mem 20675MB
[2025-04-03 03:03:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][558/573]	eta 0:00:13 lr 0.000434	time 0.8771 (0.8806)	loss 0.5418 (0.5043)	grad_norm 1.8211 (2.6981)	mem 20675MB
[2025-04-03 03:03:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][560/573]	eta 0:00:11 lr 0.000433	time 0.8772 (0.8806)	loss 0.5818 (0.5045)	grad_norm 2.2174 (2.6961)	mem 20675MB
[2025-04-03 03:03:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][562/573]	eta 0:00:09 lr 0.000433	time 0.8776 (0.8806)	loss 0.5488 (0.5043)	grad_norm 2.2175 (2.6935)	mem 20675MB
[2025-04-03 03:03:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][564/573]	eta 0:00:07 lr 0.000433	time 0.8771 (0.8806)	loss 0.6425 (0.5045)	grad_norm 2.5706 (2.6915)	mem 20675MB
[2025-04-03 03:03:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][566/573]	eta 0:00:06 lr 0.000433	time 0.8772 (0.8806)	loss 0.5381 (0.5048)	grad_norm 2.2500 (2.6910)	mem 20675MB
[2025-04-03 03:03:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][568/573]	eta 0:00:04 lr 0.000433	time 0.8771 (0.8806)	loss 0.5356 (0.5046)	grad_norm 3.0309 (2.6934)	mem 20675MB
[2025-04-03 03:03:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][570/573]	eta 0:00:02 lr 0.000432	time 0.8769 (0.8806)	loss 0.4306 (0.5045)	grad_norm 3.6258 (2.6946)	mem 20675MB
[2025-04-03 03:03:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][572/573]	eta 0:00:00 lr 0.000432	time 0.8769 (0.8806)	loss 0.3932 (0.5041)	grad_norm 2.8339 (2.6963)	mem 20675MB
[2025-04-03 03:03:49 simmim_finetune] (main_finetune.py 260): INFO EPOCH 17 training takes 0:08:24
[2025-04-03 03:03:50 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.791 (1.791)	Loss 0.5450 (0.5450)	Acc@1 68.750 (68.750)	Mem 20675MB
[2025-04-03 03:03:51 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.786)	Loss 0.4947 (0.5010)	Acc@1 76.562 (73.958)	Mem 20675MB
[2025-04-03 03:03:52 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.585)	Loss 0.5095 (0.4958)	Acc@1 72.656 (74.062)	Mem 20675MB
[2025-04-03 03:03:52 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.284 (0.499)	Loss 0.4555 (0.4811)	Acc@1 80.469 (75.670)	Mem 20675MB
[2025-04-03 03:03:53 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.451)	Loss 0.4982 (0.4729)	Acc@1 74.219 (76.476)	Mem 20675MB
[2025-04-03 03:03:53 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.421)	Loss 0.4549 (0.4767)	Acc@1 84.375 (76.989)	Mem 20675MB
[2025-04-03 03:03:54 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.400)	Loss 0.4890 (0.4754)	Acc@1 75.000 (76.983)	Mem 20675MB
[2025-04-03 03:03:54 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.384)	Loss 0.4337 (0.4700)	Acc@1 82.031 (77.500)	Mem 20675MB
[2025-04-03 03:03:55 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.671
[2025-04-03 03:03:55 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.7%
[2025-04-03 03:03:55 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.67%
[2025-04-03 03:03:55 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.760698570656696e-06, 1.760698570656696e-06, 2.6206718179437128e-06, 2.6206718179437128e-06, 3.943707583000661e-06, 3.943707583000661e-06, 5.979147221549813e-06, 5.979147221549813e-06, 9.110592819317736e-06, 9.110592819317736e-06, 1.392820143126839e-05, 1.392820143126839e-05, 2.1339906988115546e-05, 2.1339906988115546e-05, 3.274253092172655e-05, 3.274253092172655e-05, 5.028502928112811e-05, 5.028502928112811e-05, 7.727348829559206e-05, 7.727348829559206e-05, 0.00011879419447169041, 0.00011879419447169041, 0.00018267220397338017, 0.00018267220397338017, 0.00028094606474521057, 0.00028094606474521057, 0.00043213661977879584, 0.00043213661977879584]
[2025-04-03 03:03:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][0/573]	eta 0:20:44 lr 0.000432	time 2.1712 (2.1712)	loss 0.5359 (0.5359)	grad_norm 2.3818 (2.3818)	mem 20675MB
[2025-04-03 03:03:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][2/573]	eta 0:12:27 lr 0.000432	time 0.8770 (1.3090)	loss 0.5540 (0.4916)	grad_norm 2.1737 (2.5376)	mem 20675MB
[2025-04-03 03:04:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][4/573]	eta 0:10:46 lr 0.000432	time 0.8771 (1.1366)	loss 0.4868 (0.4913)	grad_norm 2.3774 (2.6344)	mem 20675MB
[2025-04-03 03:04:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][6/573]	eta 0:10:02 lr 0.000431	time 0.8772 (1.0627)	loss 0.5354 (0.4844)	grad_norm 4.8872 (2.9039)	mem 20675MB
[2025-04-03 03:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][8/573]	eta 0:09:37 lr 0.000431	time 0.8775 (1.0217)	loss 0.4003 (0.4835)	grad_norm 2.8022 (2.9818)	mem 20675MB
[2025-04-03 03:04:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][10/573]	eta 0:09:20 lr 0.000431	time 0.8773 (0.9957)	loss 0.6033 (0.4853)	grad_norm 2.4372 (2.9016)	mem 20675MB
[2025-04-03 03:04:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][12/573]	eta 0:09:08 lr 0.000431	time 0.8772 (0.9776)	loss 0.5214 (0.4955)	grad_norm 2.2419 (2.8019)	mem 20675MB
[2025-04-03 03:04:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][14/573]	eta 0:08:59 lr 0.000431	time 0.8771 (0.9643)	loss 0.5801 (0.4965)	grad_norm 2.4731 (2.8266)	mem 20675MB
[2025-04-03 03:04:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][16/573]	eta 0:08:51 lr 0.000430	time 0.8771 (0.9541)	loss 0.5292 (0.4994)	grad_norm 3.4044 (2.9906)	mem 20675MB
[2025-04-03 03:04:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][18/573]	eta 0:08:45 lr 0.000430	time 0.8772 (0.9461)	loss 0.6020 (0.5070)	grad_norm 2.0945 (2.8756)	mem 20675MB
[2025-04-03 03:04:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][20/573]	eta 0:08:39 lr 0.000430	time 0.8772 (0.9396)	loss 0.4970 (0.5015)	grad_norm 1.5856 (2.7934)	mem 20675MB
[2025-04-03 03:04:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][22/573]	eta 0:08:34 lr 0.000430	time 0.8772 (0.9343)	loss 0.5486 (0.5029)	grad_norm 1.7993 (2.7118)	mem 20675MB
[2025-04-03 03:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][24/573]	eta 0:08:30 lr 0.000429	time 0.8771 (0.9298)	loss 0.5247 (0.5070)	grad_norm 1.5682 (2.6244)	mem 20675MB
[2025-04-03 03:04:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][26/573]	eta 0:08:26 lr 0.000429	time 0.8770 (0.9259)	loss 0.5440 (0.5071)	grad_norm 2.0576 (2.6017)	mem 20675MB
[2025-04-03 03:04:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][28/573]	eta 0:08:22 lr 0.000429	time 0.8772 (0.9226)	loss 0.5159 (0.5078)	grad_norm 2.7379 (2.6091)	mem 20675MB
[2025-04-03 03:04:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][30/573]	eta 0:08:19 lr 0.000429	time 0.8772 (0.9197)	loss 0.5214 (0.5092)	grad_norm 1.9749 (2.5736)	mem 20675MB
[2025-04-03 03:04:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][32/573]	eta 0:08:16 lr 0.000429	time 0.8770 (0.9172)	loss 0.4616 (0.5072)	grad_norm 1.8749 (2.5427)	mem 20675MB
[2025-04-03 03:04:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][34/573]	eta 0:08:13 lr 0.000428	time 0.8773 (0.9150)	loss 0.3757 (0.5045)	grad_norm 2.4834 (2.5380)	mem 20675MB
[2025-04-03 03:04:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][36/573]	eta 0:08:10 lr 0.000428	time 0.8770 (0.9130)	loss 0.4687 (0.5008)	grad_norm 1.5718 (2.5268)	mem 20675MB
[2025-04-03 03:04:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][38/573]	eta 0:08:07 lr 0.000428	time 0.8774 (0.9112)	loss 0.5698 (0.5031)	grad_norm 2.8027 (2.5294)	mem 20675MB
[2025-04-03 03:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][40/573]	eta 0:08:04 lr 0.000428	time 0.8775 (0.9096)	loss 0.5934 (0.5061)	grad_norm 2.4494 (2.5158)	mem 20675MB
[2025-04-03 03:04:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][42/573]	eta 0:08:02 lr 0.000427	time 0.8772 (0.9081)	loss 0.5191 (0.5057)	grad_norm 2.2632 (2.5016)	mem 20675MB
[2025-04-03 03:04:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][44/573]	eta 0:07:59 lr 0.000427	time 0.8771 (0.9068)	loss 0.5063 (0.5022)	grad_norm 2.1839 (2.5035)	mem 20675MB
[2025-04-03 03:04:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][46/573]	eta 0:07:57 lr 0.000427	time 0.8770 (0.9055)	loss 0.5290 (0.5023)	grad_norm 2.4292 (2.5054)	mem 20675MB
[2025-04-03 03:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][48/573]	eta 0:07:54 lr 0.000427	time 0.8769 (0.9044)	loss 0.5001 (0.5023)	grad_norm 3.0912 (2.5271)	mem 20675MB
[2025-04-03 03:04:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][50/573]	eta 0:07:52 lr 0.000427	time 0.8771 (0.9034)	loss 0.5555 (0.5042)	grad_norm 2.1121 (2.5184)	mem 20675MB
[2025-04-03 03:04:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][52/573]	eta 0:07:50 lr 0.000426	time 0.8773 (0.9024)	loss 0.5976 (0.5057)	grad_norm 2.5486 (2.5171)	mem 20675MB
[2025-04-03 03:04:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][54/573]	eta 0:07:47 lr 0.000426	time 0.8769 (0.9015)	loss 0.5126 (0.5079)	grad_norm 2.8321 (2.5457)	mem 20675MB
[2025-04-03 03:04:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][56/573]	eta 0:07:45 lr 0.000426	time 0.8773 (0.9007)	loss 0.3598 (0.5031)	grad_norm 2.4900 (2.5481)	mem 20675MB
[2025-04-03 03:04:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][58/573]	eta 0:07:43 lr 0.000426	time 0.8773 (0.8999)	loss 0.5544 (0.5019)	grad_norm 2.0949 (2.5735)	mem 20675MB
[2025-04-03 03:04:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][60/573]	eta 0:07:41 lr 0.000426	time 0.8776 (0.8992)	loss 0.5746 (0.5015)	grad_norm 1.9944 (2.5672)	mem 20675MB
[2025-04-03 03:04:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][62/573]	eta 0:07:39 lr 0.000425	time 0.8771 (0.8986)	loss 0.4259 (0.4998)	grad_norm 2.2682 (2.5591)	mem 20675MB
[2025-04-03 03:04:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][64/573]	eta 0:07:37 lr 0.000425	time 0.8774 (0.8979)	loss 0.4429 (0.4982)	grad_norm 2.9825 (2.5571)	mem 20675MB
[2025-04-03 03:04:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][66/573]	eta 0:07:34 lr 0.000425	time 0.8780 (0.8974)	loss 0.5419 (0.4980)	grad_norm 1.6749 (2.5396)	mem 20675MB
[2025-04-03 03:04:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][68/573]	eta 0:07:32 lr 0.000425	time 0.8775 (0.8968)	loss 0.5558 (0.4977)	grad_norm 1.9705 (2.5479)	mem 20675MB
[2025-04-03 03:04:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][70/573]	eta 0:07:30 lr 0.000424	time 0.8774 (0.8963)	loss 0.5444 (0.4967)	grad_norm 2.0172 (2.5622)	mem 20675MB
[2025-04-03 03:05:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][72/573]	eta 0:07:28 lr 0.000424	time 0.8769 (0.8958)	loss 0.3750 (0.4939)	grad_norm 3.3592 (2.5871)	mem 20675MB
[2025-04-03 03:05:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][74/573]	eta 0:07:26 lr 0.000424	time 0.8771 (0.8953)	loss 0.5311 (0.4962)	grad_norm 2.5645 (2.5849)	mem 20675MB
[2025-04-03 03:05:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][76/573]	eta 0:07:24 lr 0.000424	time 0.8773 (0.8949)	loss 0.5504 (0.4973)	grad_norm 2.0725 (2.5799)	mem 20675MB
[2025-04-03 03:05:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][78/573]	eta 0:07:22 lr 0.000424	time 0.8773 (0.8945)	loss 0.3679 (0.4962)	grad_norm 2.8318 (2.5742)	mem 20675MB
[2025-04-03 03:05:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][80/573]	eta 0:07:20 lr 0.000423	time 0.8772 (0.8941)	loss 0.4357 (0.4966)	grad_norm 3.2766 (2.5890)	mem 20675MB
[2025-04-03 03:05:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][82/573]	eta 0:07:18 lr 0.000423	time 0.8773 (0.8937)	loss 0.4663 (0.4971)	grad_norm 3.2798 (2.5828)	mem 20675MB
[2025-04-03 03:05:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][84/573]	eta 0:07:16 lr 0.000423	time 0.8773 (0.8933)	loss 0.4363 (0.4971)	grad_norm 2.8601 (2.5864)	mem 20675MB
[2025-04-03 03:05:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][86/573]	eta 0:07:14 lr 0.000423	time 0.8772 (0.8930)	loss 0.3584 (0.4953)	grad_norm 3.1823 (2.5861)	mem 20675MB
[2025-04-03 03:05:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][88/573]	eta 0:07:12 lr 0.000422	time 0.8773 (0.8926)	loss 0.3391 (0.4925)	grad_norm 2.6100 (2.5821)	mem 20675MB
[2025-04-03 03:05:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][90/573]	eta 0:07:10 lr 0.000422	time 0.8770 (0.8923)	loss 0.4130 (0.4922)	grad_norm 4.1849 (2.5931)	mem 20675MB
[2025-04-03 03:05:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][92/573]	eta 0:07:09 lr 0.000422	time 0.8772 (0.8920)	loss 0.4042 (0.4923)	grad_norm 2.4921 (2.5880)	mem 20675MB
[2025-04-03 03:05:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][94/573]	eta 0:07:07 lr 0.000422	time 0.8770 (0.8917)	loss 0.5424 (0.4926)	grad_norm 2.5665 (2.5828)	mem 20675MB
[2025-04-03 03:05:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][96/573]	eta 0:07:05 lr 0.000422	time 0.8773 (0.8914)	loss 0.5231 (0.4937)	grad_norm 2.8384 (2.5869)	mem 20675MB
[2025-04-03 03:05:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][98/573]	eta 0:07:03 lr 0.000421	time 0.8772 (0.8911)	loss 0.4648 (0.4936)	grad_norm 2.9716 (2.5868)	mem 20675MB
[2025-04-03 03:05:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][100/573]	eta 0:07:01 lr 0.000421	time 0.8772 (0.8909)	loss 0.4658 (0.4921)	grad_norm 2.3335 (2.5900)	mem 20675MB
[2025-04-03 03:05:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][102/573]	eta 0:06:59 lr 0.000421	time 0.8770 (0.8906)	loss 0.4457 (0.4934)	grad_norm 4.2585 (2.6163)	mem 20675MB
[2025-04-03 03:05:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][104/573]	eta 0:06:57 lr 0.000421	time 0.8771 (0.8904)	loss 0.4286 (0.4937)	grad_norm 2.5475 (2.6132)	mem 20675MB
[2025-04-03 03:05:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][106/573]	eta 0:06:55 lr 0.000421	time 0.8773 (0.8902)	loss 0.5452 (0.4953)	grad_norm 2.1103 (2.6174)	mem 20675MB
[2025-04-03 03:05:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][108/573]	eta 0:06:53 lr 0.000420	time 0.8772 (0.8899)	loss 0.3882 (0.4939)	grad_norm 2.5970 (2.6162)	mem 20675MB
[2025-04-03 03:05:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][110/573]	eta 0:06:51 lr 0.000420	time 0.8774 (0.8897)	loss 0.5588 (0.4931)	grad_norm 2.7101 (2.6191)	mem 20675MB
[2025-04-03 03:05:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][112/573]	eta 0:06:50 lr 0.000420	time 0.8772 (0.8895)	loss 0.4341 (0.4931)	grad_norm 2.9457 (2.6226)	mem 20675MB
[2025-04-03 03:05:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][114/573]	eta 0:06:48 lr 0.000420	time 0.8771 (0.8893)	loss 0.3985 (0.4913)	grad_norm 3.5098 (2.6279)	mem 20675MB
[2025-04-03 03:05:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][116/573]	eta 0:06:46 lr 0.000419	time 0.8771 (0.8891)	loss 0.5754 (0.4924)	grad_norm 3.3024 (2.6300)	mem 20675MB
[2025-04-03 03:05:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][118/573]	eta 0:06:44 lr 0.000419	time 0.8776 (0.8889)	loss 0.5267 (0.4934)	grad_norm 2.8797 (2.6305)	mem 20675MB
[2025-04-03 03:05:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][120/573]	eta 0:06:42 lr 0.000419	time 0.8773 (0.8888)	loss 0.4777 (0.4941)	grad_norm 2.1357 (2.6277)	mem 20675MB
[2025-04-03 03:05:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][122/573]	eta 0:06:40 lr 0.000419	time 0.8770 (0.8886)	loss 0.4872 (0.4940)	grad_norm 3.6201 (2.6316)	mem 20675MB
[2025-04-03 03:05:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][124/573]	eta 0:06:38 lr 0.000419	time 0.8769 (0.8884)	loss 0.3663 (0.4923)	grad_norm 2.5561 (2.6356)	mem 20675MB
[2025-04-03 03:05:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][126/573]	eta 0:06:37 lr 0.000418	time 0.8771 (0.8883)	loss 0.4869 (0.4926)	grad_norm 2.0052 (2.6275)	mem 20675MB
[2025-04-03 03:05:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][128/573]	eta 0:06:35 lr 0.000418	time 0.8774 (0.8881)	loss 0.6472 (0.4948)	grad_norm 3.9580 (2.6365)	mem 20675MB
[2025-04-03 03:05:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][130/573]	eta 0:06:33 lr 0.000418	time 0.8774 (0.8879)	loss 0.5391 (0.4943)	grad_norm 2.6850 (2.6452)	mem 20675MB
[2025-04-03 03:05:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][132/573]	eta 0:06:31 lr 0.000418	time 0.8774 (0.8878)	loss 0.5980 (0.4953)	grad_norm 2.9374 (2.6521)	mem 20675MB
[2025-04-03 03:05:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][134/573]	eta 0:06:29 lr 0.000418	time 0.8771 (0.8877)	loss 0.4728 (0.4958)	grad_norm 2.1811 (2.6465)	mem 20675MB
[2025-04-03 03:05:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][136/573]	eta 0:06:27 lr 0.000417	time 0.8771 (0.8875)	loss 0.5110 (0.4950)	grad_norm 1.3884 (2.6319)	mem 20675MB
[2025-04-03 03:05:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][138/573]	eta 0:06:26 lr 0.000417	time 0.8769 (0.8874)	loss 0.4950 (0.4955)	grad_norm 2.9046 (2.6302)	mem 20675MB
[2025-04-03 03:06:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][140/573]	eta 0:06:24 lr 0.000417	time 0.8772 (0.8872)	loss 0.5987 (0.4964)	grad_norm 1.8370 (2.6201)	mem 20675MB
[2025-04-03 03:06:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][142/573]	eta 0:06:22 lr 0.000417	time 0.8772 (0.8871)	loss 0.3507 (0.4952)	grad_norm 2.4809 (2.6179)	mem 20675MB
[2025-04-03 03:06:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][144/573]	eta 0:06:20 lr 0.000416	time 0.8771 (0.8870)	loss 0.4559 (0.4948)	grad_norm 2.3491 (2.6184)	mem 20675MB
[2025-04-03 03:06:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][146/573]	eta 0:06:18 lr 0.000416	time 0.8770 (0.8869)	loss 0.4183 (0.4933)	grad_norm 2.9052 (2.6215)	mem 20675MB
[2025-04-03 03:06:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][148/573]	eta 0:06:16 lr 0.000416	time 0.8770 (0.8867)	loss 0.5118 (0.4936)	grad_norm 3.6471 (2.6271)	mem 20675MB
[2025-04-03 03:06:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][150/573]	eta 0:06:15 lr 0.000416	time 0.8773 (0.8866)	loss 0.5346 (0.4933)	grad_norm 2.4839 (2.6320)	mem 20675MB
[2025-04-03 03:06:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][152/573]	eta 0:06:13 lr 0.000416	time 0.8769 (0.8865)	loss 0.5445 (0.4944)	grad_norm 2.2026 (2.6296)	mem 20675MB
[2025-04-03 03:06:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][154/573]	eta 0:06:11 lr 0.000415	time 0.8771 (0.8864)	loss 0.5696 (0.4952)	grad_norm 2.5722 (2.6264)	mem 20675MB
[2025-04-03 03:06:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][156/573]	eta 0:06:09 lr 0.000415	time 0.8771 (0.8863)	loss 0.4790 (0.4956)	grad_norm 2.2378 (2.6225)	mem 20675MB
[2025-04-03 03:06:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][158/573]	eta 0:06:07 lr 0.000415	time 0.8771 (0.8862)	loss 0.3976 (0.4957)	grad_norm 2.3711 (2.6179)	mem 20675MB
[2025-04-03 03:06:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][160/573]	eta 0:06:05 lr 0.000415	time 0.8774 (0.8861)	loss 0.5553 (0.4965)	grad_norm 2.0836 (2.6103)	mem 20675MB
[2025-04-03 03:06:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][162/573]	eta 0:06:04 lr 0.000415	time 0.8779 (0.8860)	loss 0.5104 (0.4964)	grad_norm 1.7968 (2.6010)	mem 20675MB
[2025-04-03 03:06:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][164/573]	eta 0:06:02 lr 0.000414	time 0.8771 (0.8859)	loss 0.5126 (0.4965)	grad_norm 3.4171 (2.6023)	mem 20675MB
[2025-04-03 03:06:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][166/573]	eta 0:06:00 lr 0.000414	time 0.8770 (0.8858)	loss 0.3407 (0.4960)	grad_norm 3.0893 (2.6024)	mem 20675MB
[2025-04-03 03:06:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][168/573]	eta 0:05:58 lr 0.000414	time 0.8773 (0.8857)	loss 0.5729 (0.4967)	grad_norm 2.5586 (2.5995)	mem 20675MB
[2025-04-03 03:06:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][170/573]	eta 0:05:56 lr 0.000414	time 0.8770 (0.8856)	loss 0.4750 (0.4969)	grad_norm 2.6349 (2.5964)	mem 20675MB
[2025-04-03 03:06:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][172/573]	eta 0:05:55 lr 0.000413	time 0.8770 (0.8855)	loss 0.4890 (0.4971)	grad_norm 2.1044 (2.5942)	mem 20675MB
[2025-04-03 03:06:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][174/573]	eta 0:05:53 lr 0.000413	time 0.8770 (0.8854)	loss 0.4415 (0.4962)	grad_norm 3.8751 (2.6039)	mem 20675MB
[2025-04-03 03:06:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][176/573]	eta 0:05:51 lr 0.000413	time 0.8772 (0.8854)	loss 0.5500 (0.4969)	grad_norm 1.7046 (2.5963)	mem 20675MB
[2025-04-03 03:06:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][178/573]	eta 0:05:49 lr 0.000413	time 0.8773 (0.8853)	loss 0.5458 (0.4973)	grad_norm 2.3099 (2.5923)	mem 20675MB
[2025-04-03 03:06:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][180/573]	eta 0:05:47 lr 0.000413	time 0.8773 (0.8852)	loss 0.3632 (0.4962)	grad_norm 3.0163 (2.5978)	mem 20675MB
[2025-04-03 03:06:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][182/573]	eta 0:05:46 lr 0.000412	time 0.8773 (0.8851)	loss 0.4201 (0.4957)	grad_norm 3.5709 (2.6012)	mem 20675MB
[2025-04-03 03:06:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][184/573]	eta 0:05:44 lr 0.000412	time 0.8776 (0.8850)	loss 0.6317 (0.4967)	grad_norm 2.2880 (2.5977)	mem 20675MB
[2025-04-03 03:06:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][186/573]	eta 0:05:42 lr 0.000412	time 0.8775 (0.8850)	loss 0.4568 (0.4964)	grad_norm 3.5694 (2.6004)	mem 20675MB
[2025-04-03 03:06:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][188/573]	eta 0:05:40 lr 0.000412	time 0.8772 (0.8849)	loss 0.5881 (0.4976)	grad_norm 2.7588 (2.6053)	mem 20675MB
[2025-04-03 03:06:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][190/573]	eta 0:05:38 lr 0.000412	time 0.8774 (0.8848)	loss 0.5714 (0.4981)	grad_norm 2.7850 (2.6075)	mem 20675MB
[2025-04-03 03:06:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][192/573]	eta 0:05:37 lr 0.000411	time 0.8770 (0.8848)	loss 0.5012 (0.4978)	grad_norm 2.6714 (2.6135)	mem 20675MB
[2025-04-03 03:06:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][194/573]	eta 0:05:35 lr 0.000411	time 0.8773 (0.8847)	loss 0.5413 (0.4981)	grad_norm 3.1758 (2.6151)	mem 20675MB
[2025-04-03 03:06:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][196/573]	eta 0:05:33 lr 0.000411	time 0.8773 (0.8846)	loss 0.5899 (0.4987)	grad_norm 3.1587 (2.6194)	mem 20675MB
[2025-04-03 03:06:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][198/573]	eta 0:05:31 lr 0.000411	time 0.8772 (0.8846)	loss 0.5191 (0.4992)	grad_norm 2.1517 (2.6135)	mem 20675MB
[2025-04-03 03:06:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][200/573]	eta 0:05:29 lr 0.000410	time 0.8770 (0.8845)	loss 0.4645 (0.4989)	grad_norm 2.3312 (2.6098)	mem 20675MB
[2025-04-03 03:06:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][202/573]	eta 0:05:28 lr 0.000410	time 0.8773 (0.8844)	loss 0.4563 (0.4982)	grad_norm 2.4594 (2.6078)	mem 20675MB
[2025-04-03 03:06:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][204/573]	eta 0:05:26 lr 0.000410	time 0.8771 (0.8844)	loss 0.4976 (0.4980)	grad_norm 2.1087 (2.6071)	mem 20675MB
[2025-04-03 03:06:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][206/573]	eta 0:05:24 lr 0.000410	time 0.8771 (0.8843)	loss 0.5709 (0.4980)	grad_norm 2.2293 (2.6055)	mem 20675MB
[2025-04-03 03:06:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][208/573]	eta 0:05:22 lr 0.000410	time 0.8771 (0.8842)	loss 0.4726 (0.4976)	grad_norm 1.7249 (2.6014)	mem 20675MB
[2025-04-03 03:07:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][210/573]	eta 0:05:20 lr 0.000409	time 0.8773 (0.8842)	loss 0.5313 (0.4981)	grad_norm 1.7819 (2.5966)	mem 20675MB
[2025-04-03 03:07:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][212/573]	eta 0:05:19 lr 0.000409	time 0.8773 (0.8841)	loss 0.4580 (0.4977)	grad_norm 3.5849 (2.6033)	mem 20675MB
[2025-04-03 03:07:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][214/573]	eta 0:05:17 lr 0.000409	time 0.8773 (0.8841)	loss 0.5598 (0.4985)	grad_norm 2.8534 (2.6041)	mem 20675MB
[2025-04-03 03:07:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][216/573]	eta 0:05:15 lr 0.000409	time 0.8770 (0.8840)	loss 0.4311 (0.4980)	grad_norm 3.6318 (2.6060)	mem 20675MB
[2025-04-03 03:07:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][218/573]	eta 0:05:13 lr 0.000409	time 0.8769 (0.8839)	loss 0.4018 (0.4978)	grad_norm 2.2861 (2.6016)	mem 20675MB
[2025-04-03 03:07:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][220/573]	eta 0:05:12 lr 0.000408	time 0.8772 (0.8839)	loss 0.4213 (0.4971)	grad_norm 3.4132 (2.6036)	mem 20675MB
[2025-04-03 03:07:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][222/573]	eta 0:05:10 lr 0.000408	time 0.8772 (0.8838)	loss 0.5927 (0.4971)	grad_norm 3.5284 (2.6162)	mem 20675MB
[2025-04-03 03:07:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][224/573]	eta 0:05:08 lr 0.000408	time 0.8775 (0.8838)	loss 0.3279 (0.4963)	grad_norm 2.3917 (2.6173)	mem 20675MB
[2025-04-03 03:07:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][226/573]	eta 0:05:06 lr 0.000408	time 0.8772 (0.8837)	loss 0.5506 (0.4965)	grad_norm 2.0968 (2.6128)	mem 20675MB
[2025-04-03 03:07:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][228/573]	eta 0:05:04 lr 0.000407	time 0.8775 (0.8837)	loss 0.5850 (0.4973)	grad_norm 2.1432 (2.6122)	mem 20675MB
[2025-04-03 03:07:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][230/573]	eta 0:05:03 lr 0.000407	time 0.8778 (0.8837)	loss 0.5443 (0.4971)	grad_norm 3.3071 (2.6236)	mem 20675MB
[2025-04-03 03:07:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][232/573]	eta 0:05:01 lr 0.000407	time 0.8772 (0.8836)	loss 0.4460 (0.4966)	grad_norm 2.8983 (2.6261)	mem 20675MB
[2025-04-03 03:07:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][234/573]	eta 0:04:59 lr 0.000407	time 0.8773 (0.8836)	loss 0.5122 (0.4969)	grad_norm 1.9877 (2.6253)	mem 20675MB
[2025-04-03 03:07:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][236/573]	eta 0:04:57 lr 0.000407	time 0.8773 (0.8835)	loss 0.5838 (0.4970)	grad_norm 2.5624 (2.6289)	mem 20675MB
[2025-04-03 03:07:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][238/573]	eta 0:04:55 lr 0.000406	time 0.8770 (0.8835)	loss 0.5404 (0.4974)	grad_norm 2.0403 (2.6251)	mem 20675MB
[2025-04-03 03:07:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][240/573]	eta 0:04:54 lr 0.000406	time 0.8773 (0.8834)	loss 0.3998 (0.4965)	grad_norm 4.0354 (2.6309)	mem 20675MB
[2025-04-03 03:07:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][242/573]	eta 0:04:52 lr 0.000406	time 0.8770 (0.8834)	loss 0.5213 (0.4969)	grad_norm 2.7445 (2.6272)	mem 20675MB
[2025-04-03 03:07:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][244/573]	eta 0:04:50 lr 0.000406	time 0.8772 (0.8833)	loss 0.3558 (0.4963)	grad_norm 4.2751 (2.6329)	mem 20675MB
[2025-04-03 03:07:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][246/573]	eta 0:04:48 lr 0.000406	time 0.8771 (0.8833)	loss 0.4814 (0.4960)	grad_norm 2.8630 (2.6313)	mem 20675MB
[2025-04-03 03:07:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][248/573]	eta 0:04:47 lr 0.000405	time 0.8771 (0.8832)	loss 0.3352 (0.4952)	grad_norm 2.8669 (2.6326)	mem 20675MB
[2025-04-03 03:07:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][250/573]	eta 0:04:45 lr 0.000405	time 0.8772 (0.8832)	loss 0.3415 (0.4944)	grad_norm 3.1956 (2.6348)	mem 20675MB
[2025-04-03 03:07:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][252/573]	eta 0:04:43 lr 0.000405	time 0.8772 (0.8832)	loss 0.3771 (0.4941)	grad_norm 3.7928 (2.6375)	mem 20675MB
[2025-04-03 03:07:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][254/573]	eta 0:04:41 lr 0.000405	time 0.8770 (0.8831)	loss 0.5682 (0.4941)	grad_norm 3.1108 (2.6376)	mem 20675MB
[2025-04-03 03:07:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][256/573]	eta 0:04:39 lr 0.000404	time 0.8773 (0.8831)	loss 0.5892 (0.4946)	grad_norm 3.4850 (2.6405)	mem 20675MB
[2025-04-03 03:07:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][258/573]	eta 0:04:38 lr 0.000404	time 0.8771 (0.8830)	loss 0.4287 (0.4940)	grad_norm 3.3416 (2.6495)	mem 20675MB
[2025-04-03 03:07:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][260/573]	eta 0:04:36 lr 0.000404	time 0.8771 (0.8830)	loss 0.4208 (0.4940)	grad_norm 3.1175 (2.6503)	mem 20675MB
[2025-04-03 03:07:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][262/573]	eta 0:04:34 lr 0.000404	time 0.8773 (0.8830)	loss 0.5561 (0.4945)	grad_norm 2.7446 (2.6476)	mem 20675MB
[2025-04-03 03:07:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][264/573]	eta 0:04:32 lr 0.000404	time 0.8787 (0.8829)	loss 0.6219 (0.4945)	grad_norm 2.2731 (2.6479)	mem 20675MB
[2025-04-03 03:07:50 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][266/573]	eta 0:04:31 lr 0.000403	time 0.8771 (0.8829)	loss 0.5378 (0.4951)	grad_norm 2.2414 (2.6449)	mem 20675MB
[2025-04-03 03:07:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][268/573]	eta 0:04:29 lr 0.000403	time 0.8774 (0.8829)	loss 0.5417 (0.4952)	grad_norm 2.2358 (2.6420)	mem 20675MB
[2025-04-03 03:07:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][270/573]	eta 0:04:27 lr 0.000403	time 0.8774 (0.8828)	loss 0.5086 (0.4949)	grad_norm 1.6353 (2.6380)	mem 20675MB
[2025-04-03 03:07:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][272/573]	eta 0:04:25 lr 0.000403	time 0.8770 (0.8828)	loss 0.4468 (0.4947)	grad_norm 2.6540 (2.6359)	mem 20675MB
[2025-04-03 03:07:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][274/573]	eta 0:04:23 lr 0.000403	time 0.8773 (0.8828)	loss 0.3834 (0.4946)	grad_norm 2.3276 (2.6328)	mem 20675MB
[2025-04-03 03:07:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][276/573]	eta 0:04:22 lr 0.000402	time 0.8772 (0.8827)	loss 0.3913 (0.4941)	grad_norm 2.6494 (2.6332)	mem 20675MB
[2025-04-03 03:08:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][278/573]	eta 0:04:20 lr 0.000402	time 0.8771 (0.8827)	loss 0.5553 (0.4944)	grad_norm 2.4451 (2.6297)	mem 20675MB
[2025-04-03 03:08:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][280/573]	eta 0:04:18 lr 0.000402	time 0.8776 (0.8827)	loss 0.4080 (0.4943)	grad_norm 3.2596 (2.6302)	mem 20675MB
[2025-04-03 03:08:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][282/573]	eta 0:04:16 lr 0.000402	time 0.8769 (0.8826)	loss 0.4048 (0.4936)	grad_norm 7.8643 (2.6500)	mem 20675MB
[2025-04-03 03:08:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][284/573]	eta 0:04:15 lr 0.000401	time 0.8771 (0.8826)	loss 0.4325 (0.4934)	grad_norm 2.6531 (2.6472)	mem 20675MB
[2025-04-03 03:08:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][286/573]	eta 0:04:13 lr 0.000401	time 0.8773 (0.8826)	loss 0.4376 (0.4933)	grad_norm 2.0532 (2.6457)	mem 20675MB
[2025-04-03 03:08:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][288/573]	eta 0:04:11 lr 0.000401	time 0.8774 (0.8825)	loss 0.5170 (0.4932)	grad_norm 2.7523 (2.6473)	mem 20675MB
[2025-04-03 03:08:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][290/573]	eta 0:04:09 lr 0.000401	time 0.8771 (0.8825)	loss 0.4657 (0.4934)	grad_norm 2.3108 (2.6458)	mem 20675MB
[2025-04-03 03:08:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][292/573]	eta 0:04:07 lr 0.000401	time 0.8772 (0.8825)	loss 0.5403 (0.4934)	grad_norm 2.6641 (2.6556)	mem 20675MB
[2025-04-03 03:08:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][294/573]	eta 0:04:06 lr 0.000400	time 0.8772 (0.8824)	loss 0.5066 (0.4935)	grad_norm 2.5511 (2.6533)	mem 20675MB
[2025-04-03 03:08:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][296/573]	eta 0:04:04 lr 0.000400	time 0.8774 (0.8824)	loss 0.4037 (0.4935)	grad_norm 3.1591 (2.6532)	mem 20675MB
[2025-04-03 03:08:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][298/573]	eta 0:04:02 lr 0.000400	time 0.8773 (0.8824)	loss 0.5127 (0.4933)	grad_norm 2.0459 (2.6504)	mem 20675MB
[2025-04-03 03:08:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][300/573]	eta 0:04:00 lr 0.000400	time 0.8772 (0.8824)	loss 0.5057 (0.4931)	grad_norm 2.0206 (2.6454)	mem 20675MB
[2025-04-03 03:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][302/573]	eta 0:03:59 lr 0.000400	time 0.8775 (0.8823)	loss 0.5020 (0.4932)	grad_norm 2.3773 (2.6425)	mem 20675MB
[2025-04-03 03:08:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][304/573]	eta 0:03:57 lr 0.000399	time 0.8775 (0.8823)	loss 0.5316 (0.4934)	grad_norm 1.9820 (2.6447)	mem 20675MB
[2025-04-03 03:08:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][306/573]	eta 0:03:55 lr 0.000399	time 0.8774 (0.8823)	loss 0.4708 (0.4933)	grad_norm 2.8952 (2.6439)	mem 20675MB
[2025-04-03 03:08:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][308/573]	eta 0:03:53 lr 0.000399	time 0.8773 (0.8822)	loss 0.4312 (0.4933)	grad_norm 3.4455 (2.6433)	mem 20675MB
[2025-04-03 03:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][310/573]	eta 0:03:52 lr 0.000399	time 0.8789 (0.8822)	loss 0.5102 (0.4934)	grad_norm 2.0092 (2.6401)	mem 20675MB
[2025-04-03 03:08:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][312/573]	eta 0:03:50 lr 0.000398	time 0.8774 (0.8822)	loss 0.5277 (0.4934)	grad_norm 2.1163 (2.6375)	mem 20675MB
[2025-04-03 03:08:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][314/573]	eta 0:03:48 lr 0.000398	time 0.8773 (0.8822)	loss 0.5549 (0.4934)	grad_norm 2.4542 (2.6393)	mem 20675MB
[2025-04-03 03:08:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][316/573]	eta 0:03:46 lr 0.000398	time 0.8774 (0.8821)	loss 0.5814 (0.4937)	grad_norm 2.3789 (2.6369)	mem 20675MB
[2025-04-03 03:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][318/573]	eta 0:03:44 lr 0.000398	time 0.8773 (0.8821)	loss 0.5533 (0.4942)	grad_norm 2.5117 (2.6351)	mem 20675MB
[2025-04-03 03:08:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][320/573]	eta 0:03:43 lr 0.000398	time 0.8773 (0.8821)	loss 0.5788 (0.4945)	grad_norm 2.4484 (2.6320)	mem 20675MB
[2025-04-03 03:08:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][322/573]	eta 0:03:41 lr 0.000397	time 0.8772 (0.8821)	loss 0.4969 (0.4948)	grad_norm 2.4243 (2.6332)	mem 20675MB
[2025-04-03 03:08:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][324/573]	eta 0:03:39 lr 0.000397	time 0.8771 (0.8821)	loss 0.4718 (0.4948)	grad_norm 3.0495 (2.6337)	mem 20675MB
[2025-04-03 03:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][326/573]	eta 0:03:37 lr 0.000397	time 0.8771 (0.8820)	loss 0.4815 (0.4948)	grad_norm 1.8129 (2.6313)	mem 20675MB
[2025-04-03 03:08:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][328/573]	eta 0:03:36 lr 0.000397	time 0.8773 (0.8820)	loss 0.4866 (0.4945)	grad_norm 3.4982 (2.6326)	mem 20675MB
[2025-04-03 03:08:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][330/573]	eta 0:03:34 lr 0.000397	time 0.8770 (0.8820)	loss 0.5361 (0.4946)	grad_norm 5.6734 (2.6396)	mem 20675MB
[2025-04-03 03:08:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][332/573]	eta 0:03:32 lr 0.000396	time 0.8769 (0.8820)	loss 0.4760 (0.4943)	grad_norm 2.9658 (2.6449)	mem 20675MB
[2025-04-03 03:08:50 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][334/573]	eta 0:03:30 lr 0.000396	time 0.8773 (0.8819)	loss 0.5605 (0.4947)	grad_norm 1.7593 (2.6407)	mem 20675MB
[2025-04-03 03:08:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][336/573]	eta 0:03:29 lr 0.000396	time 0.8775 (0.8819)	loss 0.4119 (0.4945)	grad_norm 3.2338 (2.6423)	mem 20675MB
[2025-04-03 03:08:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][338/573]	eta 0:03:27 lr 0.000396	time 0.8771 (0.8819)	loss 0.5940 (0.4949)	grad_norm 2.9322 (2.6432)	mem 20675MB
[2025-04-03 03:08:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][340/573]	eta 0:03:25 lr 0.000395	time 0.8772 (0.8819)	loss 0.4700 (0.4947)	grad_norm 3.0376 (2.6453)	mem 20675MB
[2025-04-03 03:08:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][342/573]	eta 0:03:23 lr 0.000395	time 0.8768 (0.8818)	loss 0.3116 (0.4943)	grad_norm 3.1325 (2.6459)	mem 20675MB
[2025-04-03 03:08:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][344/573]	eta 0:03:21 lr 0.000395	time 0.8769 (0.8818)	loss 0.4775 (0.4941)	grad_norm 2.6580 (2.6498)	mem 20675MB
[2025-04-03 03:09:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][346/573]	eta 0:03:20 lr 0.000395	time 0.8771 (0.8818)	loss 0.5282 (0.4940)	grad_norm 2.5250 (2.6506)	mem 20675MB
[2025-04-03 03:09:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][348/573]	eta 0:03:18 lr 0.000395	time 0.8770 (0.8818)	loss 0.3253 (0.4935)	grad_norm 3.1736 (2.6508)	mem 20675MB
[2025-04-03 03:09:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][350/573]	eta 0:03:16 lr 0.000394	time 0.8772 (0.8818)	loss 0.5375 (0.4934)	grad_norm 3.6361 (2.6524)	mem 20675MB
[2025-04-03 03:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][352/573]	eta 0:03:14 lr 0.000394	time 0.8772 (0.8817)	loss 0.6034 (0.4939)	grad_norm 2.4172 (2.6528)	mem 20675MB
[2025-04-03 03:09:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][354/573]	eta 0:03:13 lr 0.000394	time 0.8773 (0.8817)	loss 0.4980 (0.4939)	grad_norm 2.3093 (2.6515)	mem 20675MB
[2025-04-03 03:09:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][356/573]	eta 0:03:11 lr 0.000394	time 0.8773 (0.8817)	loss 0.5313 (0.4943)	grad_norm 2.4557 (2.6531)	mem 20675MB
[2025-04-03 03:09:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][358/573]	eta 0:03:09 lr 0.000394	time 0.8784 (0.8817)	loss 0.4649 (0.4942)	grad_norm 2.7825 (2.6555)	mem 20675MB
[2025-04-03 03:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][360/573]	eta 0:03:07 lr 0.000393	time 0.8774 (0.8817)	loss 0.3315 (0.4938)	grad_norm 2.3265 (2.6546)	mem 20675MB
[2025-04-03 03:09:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][362/573]	eta 0:03:06 lr 0.000393	time 0.8774 (0.8816)	loss 0.5697 (0.4938)	grad_norm 2.3722 (2.6571)	mem 20675MB
[2025-04-03 03:09:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][364/573]	eta 0:03:04 lr 0.000393	time 0.8771 (0.8816)	loss 0.5158 (0.4940)	grad_norm 2.7227 (2.6555)	mem 20675MB
[2025-04-03 03:09:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][366/573]	eta 0:03:02 lr 0.000393	time 0.8773 (0.8816)	loss 0.5564 (0.4945)	grad_norm 2.9777 (2.6578)	mem 20675MB
[2025-04-03 03:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][368/573]	eta 0:03:00 lr 0.000393	time 0.8774 (0.8816)	loss 0.3416 (0.4941)	grad_norm 2.5793 (2.6569)	mem 20675MB
[2025-04-03 03:09:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][370/573]	eta 0:02:58 lr 0.000392	time 0.8773 (0.8816)	loss 0.6129 (0.4946)	grad_norm 2.2443 (2.6570)	mem 20675MB
[2025-04-03 03:09:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][372/573]	eta 0:02:57 lr 0.000392	time 0.8781 (0.8815)	loss 0.5052 (0.4950)	grad_norm 2.0482 (2.6578)	mem 20675MB
[2025-04-03 03:09:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][374/573]	eta 0:02:55 lr 0.000392	time 0.8774 (0.8815)	loss 0.5468 (0.4954)	grad_norm 2.2482 (2.6546)	mem 20675MB
[2025-04-03 03:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][376/573]	eta 0:02:53 lr 0.000392	time 0.8772 (0.8815)	loss 0.4354 (0.4952)	grad_norm 2.4079 (2.6581)	mem 20675MB
[2025-04-03 03:09:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][378/573]	eta 0:02:51 lr 0.000391	time 0.8772 (0.8815)	loss 0.4835 (0.4950)	grad_norm 2.2160 (2.6569)	mem 20675MB
[2025-04-03 03:09:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][380/573]	eta 0:02:50 lr 0.000391	time 0.8771 (0.8815)	loss 0.5005 (0.4950)	grad_norm 1.4588 (2.6518)	mem 20675MB
[2025-04-03 03:09:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][382/573]	eta 0:02:48 lr 0.000391	time 0.8773 (0.8815)	loss 0.4947 (0.4952)	grad_norm 2.9828 (2.6512)	mem 20675MB
[2025-04-03 03:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][384/573]	eta 0:02:46 lr 0.000391	time 0.8774 (0.8814)	loss 0.5103 (0.4951)	grad_norm 2.7022 (2.6493)	mem 20675MB
[2025-04-03 03:09:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][386/573]	eta 0:02:44 lr 0.000391	time 0.8773 (0.8814)	loss 0.5469 (0.4952)	grad_norm 1.8653 (2.6479)	mem 20675MB
[2025-04-03 03:09:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][388/573]	eta 0:02:43 lr 0.000390	time 0.8772 (0.8814)	loss 0.5253 (0.4950)	grad_norm 2.5458 (2.6494)	mem 20675MB
[2025-04-03 03:09:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][390/573]	eta 0:02:41 lr 0.000390	time 0.8773 (0.8814)	loss 0.5480 (0.4953)	grad_norm 2.5925 (2.6483)	mem 20675MB
[2025-04-03 03:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][392/573]	eta 0:02:39 lr 0.000390	time 0.8773 (0.8814)	loss 0.3503 (0.4949)	grad_norm 2.9902 (2.6485)	mem 20675MB
[2025-04-03 03:09:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][394/573]	eta 0:02:37 lr 0.000390	time 0.8783 (0.8814)	loss 0.3937 (0.4948)	grad_norm 3.5102 (2.6507)	mem 20675MB
[2025-04-03 03:09:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][396/573]	eta 0:02:35 lr 0.000390	time 0.8770 (0.8813)	loss 0.5587 (0.4950)	grad_norm 3.1252 (2.6505)	mem 20675MB
[2025-04-03 03:09:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][398/573]	eta 0:02:34 lr 0.000389	time 0.8775 (0.8813)	loss 0.5543 (0.4950)	grad_norm 2.4784 (2.6536)	mem 20675MB
[2025-04-03 03:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][400/573]	eta 0:02:32 lr 0.000389	time 0.8773 (0.8813)	loss 0.6494 (0.4955)	grad_norm 2.4626 (2.6525)	mem 20675MB
[2025-04-03 03:09:50 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][402/573]	eta 0:02:30 lr 0.000389	time 0.8773 (0.8813)	loss 0.4242 (0.4951)	grad_norm 3.1302 (2.6536)	mem 20675MB
[2025-04-03 03:09:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][404/573]	eta 0:02:28 lr 0.000389	time 0.8773 (0.8813)	loss 0.3615 (0.4945)	grad_norm 5.4019 (2.6615)	mem 20675MB
[2025-04-03 03:09:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][406/573]	eta 0:02:27 lr 0.000389	time 0.8774 (0.8813)	loss 0.5082 (0.4947)	grad_norm 2.3146 (2.6623)	mem 20675MB
[2025-04-03 03:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][408/573]	eta 0:02:25 lr 0.000388	time 0.8772 (0.8812)	loss 0.5695 (0.4951)	grad_norm 3.4287 (2.6624)	mem 20675MB
[2025-04-03 03:09:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][410/573]	eta 0:02:23 lr 0.000388	time 0.8775 (0.8812)	loss 0.5797 (0.4950)	grad_norm 2.2523 (2.6637)	mem 20675MB
[2025-04-03 03:09:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][412/573]	eta 0:02:21 lr 0.000388	time 0.8784 (0.8812)	loss 0.4325 (0.4945)	grad_norm 3.2694 (2.6669)	mem 20675MB
[2025-04-03 03:10:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][414/573]	eta 0:02:20 lr 0.000388	time 0.8774 (0.8812)	loss 0.5196 (0.4946)	grad_norm 2.1757 (2.6646)	mem 20675MB
[2025-04-03 03:10:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][416/573]	eta 0:02:18 lr 0.000387	time 0.8772 (0.8812)	loss 0.4301 (0.4946)	grad_norm 2.5213 (2.6620)	mem 20675MB
[2025-04-03 03:10:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][418/573]	eta 0:02:16 lr 0.000387	time 0.8774 (0.8812)	loss 0.5825 (0.4950)	grad_norm 2.0088 (2.6608)	mem 20675MB
[2025-04-03 03:10:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][420/573]	eta 0:02:14 lr 0.000387	time 0.8774 (0.8812)	loss 0.5732 (0.4952)	grad_norm 1.8986 (2.6572)	mem 20675MB
[2025-04-03 03:10:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][422/573]	eta 0:02:13 lr 0.000387	time 0.8774 (0.8812)	loss 0.3339 (0.4950)	grad_norm 3.1680 (2.6566)	mem 20675MB
[2025-04-03 03:10:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][424/573]	eta 0:02:11 lr 0.000387	time 0.8775 (0.8811)	loss 0.6387 (0.4952)	grad_norm 2.9732 (2.6586)	mem 20675MB
[2025-04-03 03:10:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][426/573]	eta 0:02:09 lr 0.000386	time 0.8776 (0.8811)	loss 0.5936 (0.4951)	grad_norm 2.5601 (2.6575)	mem 20675MB
[2025-04-03 03:10:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][428/573]	eta 0:02:07 lr 0.000386	time 0.8773 (0.8811)	loss 0.3659 (0.4950)	grad_norm 2.5086 (2.6600)	mem 20675MB
[2025-04-03 03:10:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][430/573]	eta 0:02:05 lr 0.000386	time 0.8773 (0.8811)	loss 0.5420 (0.4951)	grad_norm 1.8370 (2.6569)	mem 20675MB
[2025-04-03 03:10:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][432/573]	eta 0:02:04 lr 0.000386	time 0.8774 (0.8811)	loss 0.4631 (0.4953)	grad_norm 2.3242 (2.6544)	mem 20675MB
[2025-04-03 03:10:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][434/573]	eta 0:02:02 lr 0.000386	time 0.8772 (0.8811)	loss 0.4854 (0.4954)	grad_norm 2.5117 (2.6526)	mem 20675MB
[2025-04-03 03:10:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][436/573]	eta 0:02:00 lr 0.000385	time 0.8774 (0.8811)	loss 0.5584 (0.4956)	grad_norm 2.0902 (2.6499)	mem 20675MB
[2025-04-03 03:10:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][438/573]	eta 0:01:58 lr 0.000385	time 0.8771 (0.8810)	loss 0.4052 (0.4952)	grad_norm 3.8395 (2.6529)	mem 20675MB
[2025-04-03 03:10:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][440/573]	eta 0:01:57 lr 0.000385	time 0.8771 (0.8810)	loss 0.5473 (0.4953)	grad_norm 2.2869 (2.6515)	mem 20675MB
[2025-04-03 03:10:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][442/573]	eta 0:01:55 lr 0.000385	time 0.8772 (0.8810)	loss 0.5424 (0.4955)	grad_norm 1.8543 (2.6489)	mem 20675MB
[2025-04-03 03:10:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][444/573]	eta 0:01:53 lr 0.000384	time 0.8771 (0.8810)	loss 0.3961 (0.4955)	grad_norm 4.3250 (2.6509)	mem 20675MB
[2025-04-03 03:10:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][446/573]	eta 0:01:51 lr 0.000384	time 0.8774 (0.8810)	loss 0.5411 (0.4956)	grad_norm 2.1389 (2.6489)	mem 20675MB
[2025-04-03 03:10:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][448/573]	eta 0:01:50 lr 0.000384	time 0.8772 (0.8810)	loss 0.5032 (0.4954)	grad_norm 3.8535 (2.6509)	mem 20675MB
[2025-04-03 03:10:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][450/573]	eta 0:01:48 lr 0.000384	time 0.8774 (0.8810)	loss 0.4535 (0.4952)	grad_norm 2.7870 (2.6573)	mem 20675MB
[2025-04-03 03:10:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][452/573]	eta 0:01:46 lr 0.000384	time 0.8772 (0.8809)	loss 0.5532 (0.4955)	grad_norm 2.2947 (2.6558)	mem 20675MB
[2025-04-03 03:10:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][454/573]	eta 0:01:44 lr 0.000383	time 0.8774 (0.8809)	loss 0.5113 (0.4955)	grad_norm 2.2497 (2.6558)	mem 20675MB
[2025-04-03 03:10:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][456/573]	eta 0:01:43 lr 0.000383	time 0.8774 (0.8809)	loss 0.6374 (0.4955)	grad_norm 2.4430 (2.6587)	mem 20675MB
[2025-04-03 03:10:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][458/573]	eta 0:01:41 lr 0.000383	time 0.8777 (0.8809)	loss 0.5413 (0.4956)	grad_norm 2.3174 (2.6562)	mem 20675MB
[2025-04-03 03:10:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][460/573]	eta 0:01:39 lr 0.000383	time 0.8772 (0.8809)	loss 0.5159 (0.4956)	grad_norm 3.1549 (2.6559)	mem 20675MB
[2025-04-03 03:10:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][462/573]	eta 0:01:37 lr 0.000383	time 0.8775 (0.8809)	loss 0.4119 (0.4952)	grad_norm 2.7360 (2.6549)	mem 20675MB
[2025-04-03 03:10:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][464/573]	eta 0:01:36 lr 0.000382	time 0.8775 (0.8809)	loss 0.5169 (0.4954)	grad_norm 2.9346 (2.6543)	mem 20675MB
[2025-04-03 03:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][466/573]	eta 0:01:34 lr 0.000382	time 0.8773 (0.8809)	loss 0.4235 (0.4952)	grad_norm 2.4786 (2.6528)	mem 20675MB
[2025-04-03 03:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][468/573]	eta 0:01:32 lr 0.000382	time 0.8771 (0.8809)	loss 0.5207 (0.4954)	grad_norm 2.8989 (2.6527)	mem 20675MB
[2025-04-03 03:10:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][470/573]	eta 0:01:30 lr 0.000382	time 0.8774 (0.8808)	loss 0.3707 (0.4952)	grad_norm 2.0208 (2.6498)	mem 20675MB
[2025-04-03 03:10:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][472/573]	eta 0:01:28 lr 0.000382	time 0.8774 (0.8808)	loss 0.5522 (0.4955)	grad_norm 1.8134 (2.6463)	mem 20675MB
[2025-04-03 03:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][474/573]	eta 0:01:27 lr 0.000381	time 0.8771 (0.8808)	loss 0.5622 (0.4958)	grad_norm 2.8516 (2.6454)	mem 20675MB
[2025-04-03 03:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][476/573]	eta 0:01:25 lr 0.000381	time 0.8773 (0.8808)	loss 0.5298 (0.4956)	grad_norm 2.0418 (2.6516)	mem 20675MB
[2025-04-03 03:10:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][478/573]	eta 0:01:23 lr 0.000381	time 0.8771 (0.8808)	loss 0.5091 (0.4957)	grad_norm 1.9478 (2.6492)	mem 20675MB
[2025-04-03 03:10:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][480/573]	eta 0:01:21 lr 0.000381	time 0.8772 (0.8808)	loss 0.4737 (0.4953)	grad_norm 1.6871 (2.6468)	mem 20675MB
[2025-04-03 03:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][482/573]	eta 0:01:20 lr 0.000380	time 0.8775 (0.8808)	loss 0.4967 (0.4952)	grad_norm 2.1067 (2.6457)	mem 20675MB
[2025-04-03 03:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][484/573]	eta 0:01:18 lr 0.000380	time 0.8773 (0.8808)	loss 0.4129 (0.4949)	grad_norm 3.5108 (2.6522)	mem 20675MB
[2025-04-03 03:11:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][486/573]	eta 0:01:16 lr 0.000380	time 0.8772 (0.8808)	loss 0.5143 (0.4950)	grad_norm 2.2398 (2.6499)	mem 20675MB
[2025-04-03 03:11:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][488/573]	eta 0:01:14 lr 0.000380	time 0.8773 (0.8807)	loss 0.5277 (0.4950)	grad_norm 2.1123 (2.6487)	mem 20675MB
[2025-04-03 03:11:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][490/573]	eta 0:01:13 lr 0.000380	time 0.8773 (0.8807)	loss 0.5093 (0.4948)	grad_norm 3.0174 (2.6524)	mem 20675MB
[2025-04-03 03:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][492/573]	eta 0:01:11 lr 0.000379	time 0.8771 (0.8807)	loss 0.5460 (0.4947)	grad_norm 2.2770 (2.6539)	mem 20675MB
[2025-04-03 03:11:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][494/573]	eta 0:01:09 lr 0.000379	time 0.8772 (0.8807)	loss 0.5173 (0.4948)	grad_norm 3.1728 (2.6537)	mem 20675MB
[2025-04-03 03:11:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][496/573]	eta 0:01:07 lr 0.000379	time 0.8770 (0.8807)	loss 0.5569 (0.4949)	grad_norm 2.9701 (2.6540)	mem 20675MB
[2025-04-03 03:11:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][498/573]	eta 0:01:06 lr 0.000379	time 0.8772 (0.8807)	loss 0.3285 (0.4947)	grad_norm 2.8760 (2.6535)	mem 20675MB
[2025-04-03 03:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][500/573]	eta 0:01:04 lr 0.000379	time 0.8773 (0.8807)	loss 0.5664 (0.4949)	grad_norm 2.1091 (2.6523)	mem 20675MB
[2025-04-03 03:11:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][502/573]	eta 0:01:02 lr 0.000378	time 0.8773 (0.8807)	loss 0.6490 (0.4950)	grad_norm 2.8899 (2.6527)	mem 20675MB
[2025-04-03 03:11:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][504/573]	eta 0:01:00 lr 0.000378	time 0.8770 (0.8807)	loss 0.5632 (0.4951)	grad_norm 2.4397 (2.6528)	mem 20675MB
[2025-04-03 03:11:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][506/573]	eta 0:00:59 lr 0.000378	time 0.8770 (0.8806)	loss 0.5116 (0.4950)	grad_norm 2.5647 (2.6534)	mem 20675MB
[2025-04-03 03:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][508/573]	eta 0:00:57 lr 0.000378	time 0.8772 (0.8806)	loss 0.5467 (0.4950)	grad_norm 2.1467 (2.6553)	mem 20675MB
[2025-04-03 03:11:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][510/573]	eta 0:00:55 lr 0.000378	time 0.8772 (0.8806)	loss 0.3433 (0.4945)	grad_norm 2.4731 (2.6569)	mem 20675MB
[2025-04-03 03:11:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][512/573]	eta 0:00:53 lr 0.000377	time 0.8769 (0.8806)	loss 0.4367 (0.4945)	grad_norm 1.9683 (2.6549)	mem 20675MB
[2025-04-03 03:11:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][514/573]	eta 0:00:51 lr 0.000377	time 0.8772 (0.8806)	loss 0.4394 (0.4945)	grad_norm 3.3740 (2.6558)	mem 20675MB
[2025-04-03 03:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][516/573]	eta 0:00:50 lr 0.000377	time 0.8774 (0.8806)	loss 0.5269 (0.4943)	grad_norm 2.2178 (2.6577)	mem 20675MB
[2025-04-03 03:11:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][518/573]	eta 0:00:48 lr 0.000377	time 0.8771 (0.8806)	loss 0.5149 (0.4943)	grad_norm 2.0282 (2.6576)	mem 20675MB
[2025-04-03 03:11:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][520/573]	eta 0:00:46 lr 0.000377	time 0.8773 (0.8806)	loss 0.5226 (0.4944)	grad_norm 2.5346 (2.6569)	mem 20675MB
[2025-04-03 03:11:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][522/573]	eta 0:00:44 lr 0.000376	time 0.8774 (0.8806)	loss 0.6064 (0.4948)	grad_norm 2.4167 (2.6551)	mem 20675MB
[2025-04-03 03:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][524/573]	eta 0:00:43 lr 0.000376	time 0.8774 (0.8806)	loss 0.6301 (0.4952)	grad_norm 2.9303 (2.6552)	mem 20675MB
[2025-04-03 03:11:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][526/573]	eta 0:00:41 lr 0.000376	time 0.8771 (0.8805)	loss 0.5451 (0.4952)	grad_norm 2.9051 (2.6556)	mem 20675MB
[2025-04-03 03:11:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][528/573]	eta 0:00:39 lr 0.000376	time 0.8775 (0.8805)	loss 0.5446 (0.4951)	grad_norm 1.9756 (2.6533)	mem 20675MB
[2025-04-03 03:11:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][530/573]	eta 0:00:37 lr 0.000375	time 0.8774 (0.8805)	loss 0.5484 (0.4949)	grad_norm 1.8112 (2.6520)	mem 20675MB
[2025-04-03 03:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][532/573]	eta 0:00:36 lr 0.000375	time 0.8786 (0.8805)	loss 0.5471 (0.4950)	grad_norm 2.2369 (2.6507)	mem 20675MB
[2025-04-03 03:11:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][534/573]	eta 0:00:34 lr 0.000375	time 0.8772 (0.8805)	loss 0.6029 (0.4954)	grad_norm 2.1739 (2.6484)	mem 20675MB
[2025-04-03 03:11:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][536/573]	eta 0:00:32 lr 0.000375	time 0.8771 (0.8805)	loss 0.5584 (0.4956)	grad_norm 2.2402 (2.6466)	mem 20675MB
[2025-04-03 03:11:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][538/573]	eta 0:00:30 lr 0.000375	time 0.8772 (0.8805)	loss 0.5400 (0.4959)	grad_norm 2.1820 (2.6443)	mem 20675MB
[2025-04-03 03:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][540/573]	eta 0:00:29 lr 0.000374	time 0.8771 (0.8805)	loss 0.5450 (0.4958)	grad_norm 6.0034 (2.6529)	mem 20675MB
[2025-04-03 03:11:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][542/573]	eta 0:00:27 lr 0.000374	time 0.8769 (0.8805)	loss 0.4380 (0.4956)	grad_norm 2.8814 (2.6531)	mem 20675MB
[2025-04-03 03:11:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][544/573]	eta 0:00:25 lr 0.000374	time 0.8771 (0.8805)	loss 0.5822 (0.4959)	grad_norm 2.8244 (2.6535)	mem 20675MB
[2025-04-03 03:11:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][546/573]	eta 0:00:23 lr 0.000374	time 0.8773 (0.8805)	loss 0.6042 (0.4961)	grad_norm 3.2544 (2.6540)	mem 20675MB
[2025-04-03 03:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][548/573]	eta 0:00:22 lr 0.000374	time 0.8773 (0.8804)	loss 0.5389 (0.4963)	grad_norm 2.1276 (2.6517)	mem 20675MB
[2025-04-03 03:12:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][550/573]	eta 0:00:20 lr 0.000373	time 0.8773 (0.8804)	loss 0.5705 (0.4963)	grad_norm 1.8940 (2.6493)	mem 20675MB
[2025-04-03 03:12:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][552/573]	eta 0:00:18 lr 0.000373	time 0.8772 (0.8804)	loss 0.5256 (0.4963)	grad_norm 1.8098 (2.6471)	mem 20675MB
[2025-04-03 03:12:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][554/573]	eta 0:00:16 lr 0.000373	time 0.8769 (0.8804)	loss 0.4859 (0.4963)	grad_norm 3.5638 (2.6470)	mem 20675MB
[2025-04-03 03:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][556/573]	eta 0:00:14 lr 0.000373	time 0.8775 (0.8804)	loss 0.3402 (0.4962)	grad_norm 2.6994 (2.6459)	mem 20675MB
[2025-04-03 03:12:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][558/573]	eta 0:00:13 lr 0.000373	time 0.8772 (0.8804)	loss 0.5404 (0.4964)	grad_norm 2.9881 (2.6459)	mem 20675MB
[2025-04-03 03:12:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][560/573]	eta 0:00:11 lr 0.000372	time 0.8770 (0.8804)	loss 0.4819 (0.4966)	grad_norm 2.7041 (2.6449)	mem 20675MB
[2025-04-03 03:12:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][562/573]	eta 0:00:09 lr 0.000372	time 0.8770 (0.8804)	loss 0.5016 (0.4967)	grad_norm 2.9398 (2.6451)	mem 20675MB
[2025-04-03 03:12:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][564/573]	eta 0:00:07 lr 0.000372	time 0.8771 (0.8804)	loss 0.6051 (0.4966)	grad_norm 2.4027 (2.6442)	mem 20675MB
[2025-04-03 03:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][566/573]	eta 0:00:06 lr 0.000372	time 0.8774 (0.8804)	loss 0.5240 (0.4965)	grad_norm 2.1473 (2.6438)	mem 20675MB
[2025-04-03 03:12:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][568/573]	eta 0:00:04 lr 0.000371	time 0.8770 (0.8804)	loss 0.4114 (0.4964)	grad_norm 2.3618 (2.6450)	mem 20675MB
[2025-04-03 03:12:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][570/573]	eta 0:00:02 lr 0.000371	time 0.8771 (0.8804)	loss 0.4991 (0.4965)	grad_norm 2.6543 (2.6438)	mem 20675MB
[2025-04-03 03:12:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][572/573]	eta 0:00:00 lr 0.000371	time 0.8768 (0.8803)	loss 0.5807 (0.4967)	grad_norm 1.7994 (2.6432)	mem 20675MB
[2025-04-03 03:12:19 simmim_finetune] (main_finetune.py 260): INFO EPOCH 18 training takes 0:08:24
[2025-04-03 03:12:21 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.789 (1.789)	Loss 0.5042 (0.5042)	Acc@1 71.094 (71.094)	Mem 20675MB
[2025-04-03 03:12:22 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.786)	Loss 0.4593 (0.4729)	Acc@1 78.125 (75.260)	Mem 20675MB
[2025-04-03 03:12:22 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.585)	Loss 0.4932 (0.4670)	Acc@1 73.438 (76.094)	Mem 20675MB
[2025-04-03 03:12:23 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.499)	Loss 0.4206 (0.4544)	Acc@1 82.812 (77.790)	Mem 20675MB
[2025-04-03 03:12:23 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.451)	Loss 0.5226 (0.4552)	Acc@1 73.438 (77.778)	Mem 20675MB
[2025-04-03 03:12:24 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.421)	Loss 0.4689 (0.4656)	Acc@1 81.250 (77.557)	Mem 20675MB
[2025-04-03 03:12:24 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.400)	Loss 0.5039 (0.4692)	Acc@1 75.781 (77.344)	Mem 20675MB
[2025-04-03 03:12:25 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.384)	Loss 0.4483 (0.4674)	Acc@1 79.688 (77.656)	Mem 20675MB
[2025-04-03 03:12:25 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.571
[2025-04-03 03:12:25 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.6%
[2025-04-03 03:12:25 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.67%
[2025-04-03 03:12:25 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.5470925080495704e-06, 1.5470925080495704e-06, 2.28546936088141e-06, 2.28546936088141e-06, 3.4214337498534707e-06, 3.4214337498534707e-06, 5.1690712713489495e-06, 5.1690712713489495e-06, 7.857744381341991e-06, 7.857744381341991e-06, 1.1994164550562057e-05, 1.1994164550562057e-05, 1.8357887887823696e-05, 1.8357887887823696e-05, 2.8148231483610826e-05, 2.8148231483610826e-05, 4.3210298554052575e-05, 4.3210298554052575e-05, 6.638270943165528e-05, 6.638270943165528e-05, 0.00010203257232027479, 0.00010203257232027479, 0.00015687851522584327, 0.00015687851522584327, 0.00024125688892671787, 0.00024125688892671787, 0.00037106977154344797, 0.00037106977154344797]
[2025-04-03 03:12:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][0/573]	eta 0:23:14 lr 0.000371	time 2.4334 (2.4334)	loss 0.2970 (0.2970)	grad_norm 5.0149 (5.0149)	mem 20675MB
[2025-04-03 03:12:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][2/573]	eta 0:13:17 lr 0.000371	time 0.8771 (1.3964)	loss 0.5749 (0.4485)	grad_norm 2.3179 (3.8723)	mem 20675MB
[2025-04-03 03:12:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][4/573]	eta 0:11:16 lr 0.000371	time 0.8770 (1.1890)	loss 0.4174 (0.4613)	grad_norm 5.4803 (3.9855)	mem 20675MB
[2025-04-03 03:12:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][6/573]	eta 0:10:23 lr 0.000370	time 0.8772 (1.1002)	loss 0.4639 (0.4752)	grad_norm 3.2779 (3.8306)	mem 20675MB
[2025-04-03 03:12:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][8/573]	eta 0:09:53 lr 0.000370	time 0.8773 (1.0508)	loss 0.3611 (0.4584)	grad_norm 3.5356 (3.9438)	mem 20675MB
[2025-04-03 03:12:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][10/573]	eta 0:09:33 lr 0.000370	time 0.8773 (1.0194)	loss 0.3952 (0.4633)	grad_norm 2.8730 (3.6722)	mem 20675MB
[2025-04-03 03:12:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][12/573]	eta 0:09:19 lr 0.000370	time 0.8767 (0.9977)	loss 0.3306 (0.4534)	grad_norm 2.7581 (3.5511)	mem 20675MB
[2025-04-03 03:12:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][14/573]	eta 0:09:08 lr 0.000370	time 0.8769 (0.9817)	loss 0.5132 (0.4567)	grad_norm 2.6672 (3.4181)	mem 20675MB
[2025-04-03 03:12:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][16/573]	eta 0:08:59 lr 0.000369	time 0.8769 (0.9694)	loss 0.4858 (0.4636)	grad_norm 1.8537 (3.2765)	mem 20675MB
[2025-04-03 03:12:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][18/573]	eta 0:08:52 lr 0.000369	time 0.8769 (0.9598)	loss 0.5134 (0.4752)	grad_norm 2.2688 (3.2000)	mem 20675MB
[2025-04-03 03:12:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][20/573]	eta 0:08:46 lr 0.000369	time 0.8770 (0.9520)	loss 0.5363 (0.4786)	grad_norm 2.3856 (3.1768)	mem 20675MB
[2025-04-03 03:12:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][22/573]	eta 0:08:40 lr 0.000369	time 0.8772 (0.9455)	loss 0.5384 (0.4823)	grad_norm 2.1942 (3.1049)	mem 20675MB
[2025-04-03 03:12:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][24/573]	eta 0:08:36 lr 0.000368	time 0.8770 (0.9401)	loss 0.4695 (0.4776)	grad_norm 2.0324 (3.1553)	mem 20675MB
[2025-04-03 03:12:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][26/573]	eta 0:08:31 lr 0.000368	time 0.8769 (0.9355)	loss 0.5665 (0.4765)	grad_norm 2.0274 (3.0996)	mem 20675MB
[2025-04-03 03:12:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][28/573]	eta 0:08:27 lr 0.000368	time 0.8772 (0.9315)	loss 0.4625 (0.4804)	grad_norm 2.8595 (3.0587)	mem 20675MB
[2025-04-03 03:12:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][30/573]	eta 0:08:23 lr 0.000368	time 0.8771 (0.9281)	loss 0.3530 (0.4747)	grad_norm 2.8522 (3.0468)	mem 20675MB
[2025-04-03 03:12:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][32/573]	eta 0:08:20 lr 0.000368	time 0.8769 (0.9250)	loss 0.4095 (0.4754)	grad_norm 2.9443 (3.0130)	mem 20675MB
[2025-04-03 03:12:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][34/573]	eta 0:08:17 lr 0.000367	time 0.8773 (0.9223)	loss 0.3793 (0.4715)	grad_norm 2.9174 (3.0175)	mem 20675MB
[2025-04-03 03:12:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][36/573]	eta 0:08:14 lr 0.000367	time 0.8774 (0.9199)	loss 0.5123 (0.4740)	grad_norm 2.0247 (3.0487)	mem 20675MB
[2025-04-03 03:13:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][38/573]	eta 0:08:11 lr 0.000367	time 0.8773 (0.9178)	loss 0.5415 (0.4759)	grad_norm 2.3164 (2.9993)	mem 20675MB
[2025-04-03 03:13:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][40/573]	eta 0:08:08 lr 0.000367	time 0.8771 (0.9159)	loss 0.4549 (0.4768)	grad_norm 2.7960 (2.9689)	mem 20675MB
[2025-04-03 03:13:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][42/573]	eta 0:08:05 lr 0.000367	time 0.8773 (0.9141)	loss 0.4786 (0.4753)	grad_norm 2.2245 (2.9765)	mem 20675MB
[2025-04-03 03:13:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][44/573]	eta 0:08:02 lr 0.000366	time 0.8774 (0.9125)	loss 0.6637 (0.4784)	grad_norm 3.7872 (2.9973)	mem 20675MB
[2025-04-03 03:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][46/573]	eta 0:08:00 lr 0.000366	time 0.8784 (0.9111)	loss 0.5173 (0.4792)	grad_norm 2.3399 (2.9754)	mem 20675MB
[2025-04-03 03:13:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][48/573]	eta 0:07:57 lr 0.000366	time 0.8772 (0.9097)	loss 0.5558 (0.4789)	grad_norm 2.2257 (2.9515)	mem 20675MB
[2025-04-03 03:13:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][50/573]	eta 0:07:55 lr 0.000366	time 0.8773 (0.9085)	loss 0.4907 (0.4775)	grad_norm 2.0010 (3.0304)	mem 20675MB
[2025-04-03 03:13:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][52/573]	eta 0:07:52 lr 0.000366	time 0.8772 (0.9073)	loss 0.4348 (0.4753)	grad_norm 3.1314 (3.0305)	mem 20675MB
[2025-04-03 03:13:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][54/573]	eta 0:07:50 lr 0.000365	time 0.8771 (0.9062)	loss 0.5536 (0.4778)	grad_norm 1.7094 (3.0096)	mem 20675MB
[2025-04-03 03:13:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][56/573]	eta 0:07:48 lr 0.000365	time 0.8772 (0.9053)	loss 0.6130 (0.4794)	grad_norm 2.1574 (2.9897)	mem 20675MB
[2025-04-03 03:13:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][58/573]	eta 0:07:45 lr 0.000365	time 0.8770 (0.9043)	loss 0.4976 (0.4812)	grad_norm 2.7776 (2.9681)	mem 20675MB
[2025-04-03 03:13:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][60/573]	eta 0:07:43 lr 0.000365	time 0.8770 (0.9035)	loss 0.5195 (0.4829)	grad_norm 1.7529 (2.9261)	mem 20675MB
[2025-04-03 03:13:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][62/573]	eta 0:07:41 lr 0.000365	time 0.8773 (0.9027)	loss 0.5302 (0.4832)	grad_norm 2.6178 (2.9213)	mem 20675MB
[2025-04-03 03:13:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][64/573]	eta 0:07:39 lr 0.000364	time 0.8771 (0.9019)	loss 0.3839 (0.4824)	grad_norm 2.2584 (2.8891)	mem 20675MB
[2025-04-03 03:13:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][66/573]	eta 0:07:36 lr 0.000364	time 0.8772 (0.9012)	loss 0.4519 (0.4820)	grad_norm 3.9181 (2.8925)	mem 20675MB
[2025-04-03 03:13:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][68/573]	eta 0:07:34 lr 0.000364	time 0.8771 (0.9005)	loss 0.5114 (0.4837)	grad_norm 2.8552 (2.8806)	mem 20675MB
[2025-04-03 03:13:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][70/573]	eta 0:07:32 lr 0.000364	time 0.8771 (0.8999)	loss 0.5802 (0.4854)	grad_norm 2.3742 (2.8570)	mem 20675MB
[2025-04-03 03:13:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][72/573]	eta 0:07:30 lr 0.000363	time 0.8774 (0.8993)	loss 0.5682 (0.4854)	grad_norm 2.3589 (2.8549)	mem 20675MB
[2025-04-03 03:13:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][74/573]	eta 0:07:28 lr 0.000363	time 0.8771 (0.8987)	loss 0.4045 (0.4852)	grad_norm 2.4370 (2.8382)	mem 20675MB
[2025-04-03 03:13:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][76/573]	eta 0:07:26 lr 0.000363	time 0.8770 (0.8982)	loss 0.4971 (0.4867)	grad_norm 2.0010 (2.8265)	mem 20675MB
[2025-04-03 03:13:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][78/573]	eta 0:07:24 lr 0.000363	time 0.8772 (0.8977)	loss 0.5088 (0.4867)	grad_norm 2.3798 (2.8106)	mem 20675MB
[2025-04-03 03:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][80/573]	eta 0:07:22 lr 0.000363	time 0.8773 (0.8972)	loss 0.5369 (0.4890)	grad_norm 2.1540 (2.7927)	mem 20675MB
[2025-04-03 03:13:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][82/573]	eta 0:07:20 lr 0.000362	time 0.8770 (0.8967)	loss 0.5405 (0.4904)	grad_norm 2.7821 (2.7846)	mem 20675MB
[2025-04-03 03:13:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][84/573]	eta 0:07:18 lr 0.000362	time 0.8771 (0.8963)	loss 0.5364 (0.4904)	grad_norm 2.1732 (2.7824)	mem 20675MB
[2025-04-03 03:13:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][86/573]	eta 0:07:16 lr 0.000362	time 0.8773 (0.8959)	loss 0.3529 (0.4897)	grad_norm 2.2298 (2.7650)	mem 20675MB
[2025-04-03 03:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][88/573]	eta 0:07:14 lr 0.000362	time 0.8771 (0.8955)	loss 0.4870 (0.4903)	grad_norm 1.8650 (2.7436)	mem 20675MB
[2025-04-03 03:13:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][90/573]	eta 0:07:12 lr 0.000362	time 0.8771 (0.8951)	loss 0.4676 (0.4906)	grad_norm 2.8471 (2.7407)	mem 20675MB
[2025-04-03 03:13:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][92/573]	eta 0:07:10 lr 0.000361	time 0.8773 (0.8947)	loss 0.5481 (0.4919)	grad_norm 2.2201 (2.7245)	mem 20675MB
[2025-04-03 03:13:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][94/573]	eta 0:07:08 lr 0.000361	time 0.8770 (0.8943)	loss 0.4700 (0.4921)	grad_norm 2.2757 (2.7225)	mem 20675MB
[2025-04-03 03:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][96/573]	eta 0:07:06 lr 0.000361	time 0.8771 (0.8940)	loss 0.5128 (0.4913)	grad_norm 2.0557 (2.7143)	mem 20675MB
[2025-04-03 03:13:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][98/573]	eta 0:07:04 lr 0.000361	time 0.8773 (0.8937)	loss 0.5823 (0.4909)	grad_norm 2.5809 (2.7188)	mem 20675MB
[2025-04-03 03:13:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][100/573]	eta 0:07:02 lr 0.000361	time 0.8773 (0.8934)	loss 0.6071 (0.4914)	grad_norm 2.0209 (2.7145)	mem 20675MB
[2025-04-03 03:13:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][102/573]	eta 0:07:00 lr 0.000360	time 0.8774 (0.8931)	loss 0.5456 (0.4918)	grad_norm 1.7421 (2.7055)	mem 20675MB
[2025-04-03 03:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][104/573]	eta 0:06:58 lr 0.000360	time 0.8771 (0.8928)	loss 0.5590 (0.4912)	grad_norm 2.6393 (2.7113)	mem 20675MB
[2025-04-03 03:14:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][106/573]	eta 0:06:56 lr 0.000360	time 0.8769 (0.8925)	loss 0.4531 (0.4920)	grad_norm 3.2757 (2.7135)	mem 20675MB
[2025-04-03 03:14:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][108/573]	eta 0:06:54 lr 0.000360	time 0.8771 (0.8923)	loss 0.4958 (0.4925)	grad_norm 2.3738 (2.7002)	mem 20675MB
[2025-04-03 03:14:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][110/573]	eta 0:06:52 lr 0.000360	time 0.8770 (0.8920)	loss 0.4792 (0.4929)	grad_norm 3.7314 (2.7123)	mem 20675MB
[2025-04-03 03:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][112/573]	eta 0:06:51 lr 0.000359	time 0.8769 (0.8917)	loss 0.5076 (0.4919)	grad_norm 1.6658 (2.7160)	mem 20675MB
[2025-04-03 03:14:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][114/573]	eta 0:06:49 lr 0.000359	time 0.8770 (0.8915)	loss 0.5487 (0.4931)	grad_norm 1.9925 (2.7035)	mem 20675MB
[2025-04-03 03:14:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][116/573]	eta 0:06:47 lr 0.000359	time 0.8771 (0.8913)	loss 0.5012 (0.4934)	grad_norm 3.0924 (2.7071)	mem 20675MB
[2025-04-03 03:14:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][118/573]	eta 0:06:45 lr 0.000359	time 0.8769 (0.8910)	loss 0.5091 (0.4941)	grad_norm 3.0998 (2.7060)	mem 20675MB
[2025-04-03 03:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][120/573]	eta 0:06:43 lr 0.000359	time 0.8771 (0.8908)	loss 0.5389 (0.4951)	grad_norm 2.0640 (2.6955)	mem 20675MB
[2025-04-03 03:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][122/573]	eta 0:06:41 lr 0.000358	time 0.8771 (0.8906)	loss 0.4970 (0.4959)	grad_norm 2.0094 (2.6862)	mem 20675MB
[2025-04-03 03:14:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][124/573]	eta 0:06:39 lr 0.000358	time 0.8769 (0.8904)	loss 0.5449 (0.4969)	grad_norm 2.8131 (2.6951)	mem 20675MB
[2025-04-03 03:14:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][126/573]	eta 0:06:37 lr 0.000358	time 0.8771 (0.8902)	loss 0.3051 (0.4956)	grad_norm 3.2746 (2.6951)	mem 20675MB
[2025-04-03 03:14:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][128/573]	eta 0:06:36 lr 0.000358	time 0.8772 (0.8900)	loss 0.5298 (0.4963)	grad_norm 1.5221 (2.6784)	mem 20675MB
[2025-04-03 03:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][130/573]	eta 0:06:34 lr 0.000357	time 0.8769 (0.8898)	loss 0.5272 (0.4971)	grad_norm 2.7848 (2.6782)	mem 20675MB
[2025-04-03 03:14:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][132/573]	eta 0:06:32 lr 0.000357	time 0.8774 (0.8897)	loss 0.6011 (0.4979)	grad_norm 1.9479 (2.6723)	mem 20675MB
[2025-04-03 03:14:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][134/573]	eta 0:06:30 lr 0.000357	time 0.8773 (0.8895)	loss 0.5240 (0.4980)	grad_norm 2.0620 (2.6616)	mem 20675MB
[2025-04-03 03:14:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][136/573]	eta 0:06:28 lr 0.000357	time 0.8773 (0.8893)	loss 0.4601 (0.4983)	grad_norm 2.7555 (2.6599)	mem 20675MB
[2025-04-03 03:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][138/573]	eta 0:06:26 lr 0.000357	time 0.8771 (0.8892)	loss 0.5403 (0.4985)	grad_norm 2.5608 (2.6601)	mem 20675MB
[2025-04-03 03:14:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][140/573]	eta 0:06:24 lr 0.000356	time 0.8770 (0.8890)	loss 0.5012 (0.4984)	grad_norm 2.4405 (2.6528)	mem 20675MB
[2025-04-03 03:14:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][142/573]	eta 0:06:23 lr 0.000356	time 0.8769 (0.8888)	loss 0.4475 (0.4976)	grad_norm 2.2825 (2.6505)	mem 20675MB
[2025-04-03 03:14:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][144/573]	eta 0:06:21 lr 0.000356	time 0.8784 (0.8887)	loss 0.6167 (0.4989)	grad_norm 3.9692 (2.6575)	mem 20675MB
[2025-04-03 03:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][146/573]	eta 0:06:19 lr 0.000356	time 0.8770 (0.8886)	loss 0.4283 (0.4986)	grad_norm 2.8732 (2.6563)	mem 20675MB
[2025-04-03 03:14:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][148/573]	eta 0:06:17 lr 0.000356	time 0.8772 (0.8884)	loss 0.5160 (0.4976)	grad_norm 3.2944 (2.6628)	mem 20675MB
[2025-04-03 03:14:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][150/573]	eta 0:06:15 lr 0.000355	time 0.8770 (0.8883)	loss 0.4490 (0.4977)	grad_norm 2.9386 (2.6616)	mem 20675MB
[2025-04-03 03:14:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][152/573]	eta 0:06:13 lr 0.000355	time 0.8771 (0.8881)	loss 0.4946 (0.4981)	grad_norm 3.4420 (2.6694)	mem 20675MB
[2025-04-03 03:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][154/573]	eta 0:06:12 lr 0.000355	time 0.8773 (0.8880)	loss 0.5156 (0.4984)	grad_norm 2.4567 (2.6668)	mem 20675MB
[2025-04-03 03:14:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][156/573]	eta 0:06:10 lr 0.000355	time 0.8773 (0.8879)	loss 0.5278 (0.4991)	grad_norm 2.1785 (2.6644)	mem 20675MB
[2025-04-03 03:14:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][158/573]	eta 0:06:08 lr 0.000355	time 0.8772 (0.8878)	loss 0.6199 (0.5006)	grad_norm 2.5057 (2.6614)	mem 20675MB
[2025-04-03 03:14:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][160/573]	eta 0:06:06 lr 0.000354	time 0.8770 (0.8876)	loss 0.4606 (0.5004)	grad_norm 2.1903 (2.6538)	mem 20675MB
[2025-04-03 03:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][162/573]	eta 0:06:04 lr 0.000354	time 0.8770 (0.8875)	loss 0.4911 (0.4997)	grad_norm 2.0763 (2.6531)	mem 20675MB
[2025-04-03 03:14:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][164/573]	eta 0:06:02 lr 0.000354	time 0.8773 (0.8874)	loss 0.5096 (0.4992)	grad_norm 1.8914 (2.6476)	mem 20675MB
[2025-04-03 03:14:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][166/573]	eta 0:06:01 lr 0.000354	time 0.8774 (0.8873)	loss 0.3952 (0.4981)	grad_norm 3.3655 (2.6515)	mem 20675MB
[2025-04-03 03:14:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][168/573]	eta 0:05:59 lr 0.000354	time 0.8771 (0.8872)	loss 0.4757 (0.4981)	grad_norm 2.4450 (2.6467)	mem 20675MB
[2025-04-03 03:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][170/573]	eta 0:05:57 lr 0.000353	time 0.8774 (0.8871)	loss 0.5459 (0.4974)	grad_norm 2.0672 (2.6514)	mem 20675MB
[2025-04-03 03:14:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][172/573]	eta 0:05:55 lr 0.000353	time 0.8775 (0.8870)	loss 0.6074 (0.4983)	grad_norm 1.6147 (2.6424)	mem 20675MB
[2025-04-03 03:15:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][174/573]	eta 0:05:53 lr 0.000353	time 0.8770 (0.8869)	loss 0.5292 (0.4984)	grad_norm 2.3997 (2.6377)	mem 20675MB
[2025-04-03 03:15:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][176/573]	eta 0:05:52 lr 0.000353	time 0.8771 (0.8868)	loss 0.3790 (0.4978)	grad_norm 4.6940 (2.6495)	mem 20675MB
[2025-04-03 03:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][178/573]	eta 0:05:50 lr 0.000353	time 0.8784 (0.8867)	loss 0.5756 (0.4985)	grad_norm 2.9303 (2.6500)	mem 20675MB
[2025-04-03 03:15:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][180/573]	eta 0:05:48 lr 0.000352	time 0.8774 (0.8866)	loss 0.5182 (0.4991)	grad_norm 2.1358 (2.6444)	mem 20675MB
[2025-04-03 03:15:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][182/573]	eta 0:05:46 lr 0.000352	time 0.8770 (0.8865)	loss 0.4920 (0.4993)	grad_norm 2.7752 (2.6496)	mem 20675MB
[2025-04-03 03:15:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][184/573]	eta 0:05:44 lr 0.000352	time 0.8770 (0.8864)	loss 0.4074 (0.4990)	grad_norm 2.1438 (2.6463)	mem 20675MB
[2025-04-03 03:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][186/573]	eta 0:05:43 lr 0.000352	time 0.8772 (0.8863)	loss 0.3558 (0.4982)	grad_norm 4.6653 (2.6624)	mem 20675MB
[2025-04-03 03:15:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][188/573]	eta 0:05:41 lr 0.000352	time 0.8771 (0.8862)	loss 0.4164 (0.4982)	grad_norm 3.7539 (2.6680)	mem 20675MB
[2025-04-03 03:15:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][190/573]	eta 0:05:39 lr 0.000351	time 0.8773 (0.8861)	loss 0.4920 (0.4987)	grad_norm 1.8756 (2.6643)	mem 20675MB
[2025-04-03 03:15:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][192/573]	eta 0:05:37 lr 0.000351	time 0.8771 (0.8861)	loss 0.5289 (0.4991)	grad_norm 2.5573 (2.6610)	mem 20675MB
[2025-04-03 03:15:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][194/573]	eta 0:05:35 lr 0.000351	time 0.8771 (0.8860)	loss 0.5593 (0.4991)	grad_norm 2.7496 (2.6653)	mem 20675MB
[2025-04-03 03:15:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][196/573]	eta 0:05:33 lr 0.000351	time 0.8771 (0.8859)	loss 0.5447 (0.4984)	grad_norm 1.9063 (2.6644)	mem 20675MB
[2025-04-03 03:15:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][198/573]	eta 0:05:32 lr 0.000350	time 0.8772 (0.8858)	loss 0.4414 (0.4978)	grad_norm 2.3267 (2.6635)	mem 20675MB
[2025-04-03 03:15:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][200/573]	eta 0:05:30 lr 0.000350	time 0.8789 (0.8857)	loss 0.5633 (0.4983)	grad_norm 2.1228 (2.6604)	mem 20675MB
[2025-04-03 03:15:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][202/573]	eta 0:05:28 lr 0.000350	time 0.8771 (0.8857)	loss 0.4612 (0.4976)	grad_norm 3.3207 (2.6659)	mem 20675MB
[2025-04-03 03:15:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][204/573]	eta 0:05:26 lr 0.000350	time 0.8772 (0.8856)	loss 0.5980 (0.4983)	grad_norm 2.3616 (2.6660)	mem 20675MB
[2025-04-03 03:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][206/573]	eta 0:05:24 lr 0.000350	time 0.8774 (0.8855)	loss 0.4337 (0.4980)	grad_norm 3.1332 (2.6666)	mem 20675MB
[2025-04-03 03:15:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][208/573]	eta 0:05:23 lr 0.000349	time 0.8772 (0.8854)	loss 0.5588 (0.4987)	grad_norm 2.9924 (2.6648)	mem 20675MB
[2025-04-03 03:15:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][210/573]	eta 0:05:21 lr 0.000349	time 0.8772 (0.8854)	loss 0.6113 (0.4993)	grad_norm 2.5875 (2.6645)	mem 20675MB
[2025-04-03 03:15:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][212/573]	eta 0:05:19 lr 0.000349	time 0.8773 (0.8853)	loss 0.4994 (0.4996)	grad_norm 3.2756 (2.6662)	mem 20675MB
[2025-04-03 03:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][214/573]	eta 0:05:17 lr 0.000349	time 0.8769 (0.8852)	loss 0.5172 (0.4994)	grad_norm 2.7542 (2.6710)	mem 20675MB
[2025-04-03 03:15:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][216/573]	eta 0:05:16 lr 0.000349	time 0.8770 (0.8852)	loss 0.4992 (0.4999)	grad_norm 2.5759 (2.6698)	mem 20675MB
[2025-04-03 03:15:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][218/573]	eta 0:05:14 lr 0.000348	time 0.8787 (0.8851)	loss 0.3545 (0.4991)	grad_norm 2.8941 (2.6762)	mem 20675MB
[2025-04-03 03:15:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][220/573]	eta 0:05:12 lr 0.000348	time 0.8776 (0.8851)	loss 0.3273 (0.4977)	grad_norm 4.4123 (2.6828)	mem 20675MB
[2025-04-03 03:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][222/573]	eta 0:05:10 lr 0.000348	time 0.8775 (0.8850)	loss 0.5298 (0.4978)	grad_norm 1.8259 (2.6735)	mem 20675MB
[2025-04-03 03:15:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][224/573]	eta 0:05:08 lr 0.000348	time 0.8776 (0.8849)	loss 0.5393 (0.4981)	grad_norm 1.9014 (2.6712)	mem 20675MB
[2025-04-03 03:15:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][226/573]	eta 0:05:07 lr 0.000348	time 0.8775 (0.8849)	loss 0.4863 (0.4982)	grad_norm 2.9415 (2.6707)	mem 20675MB
[2025-04-03 03:15:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][228/573]	eta 0:05:05 lr 0.000347	time 0.8772 (0.8848)	loss 0.5395 (0.4983)	grad_norm 1.8881 (2.6695)	mem 20675MB
[2025-04-03 03:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][230/573]	eta 0:05:03 lr 0.000347	time 0.8775 (0.8848)	loss 0.3879 (0.4982)	grad_norm 5.0877 (2.6774)	mem 20675MB
[2025-04-03 03:15:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][232/573]	eta 0:05:01 lr 0.000347	time 0.8774 (0.8847)	loss 0.4066 (0.4979)	grad_norm 5.2230 (2.6863)	mem 20675MB
[2025-04-03 03:15:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][234/573]	eta 0:04:59 lr 0.000347	time 0.8773 (0.8847)	loss 0.5321 (0.4980)	grad_norm 1.8893 (2.6827)	mem 20675MB
[2025-04-03 03:15:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][236/573]	eta 0:04:58 lr 0.000347	time 0.8775 (0.8846)	loss 0.5730 (0.4982)	grad_norm 2.1113 (2.6784)	mem 20675MB
[2025-04-03 03:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][238/573]	eta 0:04:56 lr 0.000346	time 0.8771 (0.8845)	loss 0.5103 (0.4980)	grad_norm 2.8314 (2.6761)	mem 20675MB
[2025-04-03 03:15:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][240/573]	eta 0:04:54 lr 0.000346	time 0.8774 (0.8845)	loss 0.5903 (0.4985)	grad_norm 2.2153 (2.6764)	mem 20675MB
[2025-04-03 03:16:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][242/573]	eta 0:04:52 lr 0.000346	time 0.8770 (0.8844)	loss 0.3676 (0.4981)	grad_norm 3.1222 (2.6756)	mem 20675MB
[2025-04-03 03:16:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][244/573]	eta 0:04:50 lr 0.000346	time 0.8774 (0.8844)	loss 0.5724 (0.4988)	grad_norm 2.8098 (2.6742)	mem 20675MB
[2025-04-03 03:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][246/573]	eta 0:04:49 lr 0.000346	time 0.8773 (0.8843)	loss 0.5462 (0.4984)	grad_norm 1.9774 (2.6764)	mem 20675MB
[2025-04-03 03:16:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][248/573]	eta 0:04:47 lr 0.000345	time 0.8773 (0.8843)	loss 0.3594 (0.4976)	grad_norm 3.0583 (2.6838)	mem 20675MB
[2025-04-03 03:16:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][250/573]	eta 0:04:45 lr 0.000345	time 0.8772 (0.8842)	loss 0.5070 (0.4982)	grad_norm 2.6792 (2.6816)	mem 20675MB
[2025-04-03 03:16:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][252/573]	eta 0:04:43 lr 0.000345	time 0.8771 (0.8842)	loss 0.5340 (0.4985)	grad_norm 3.5962 (2.6844)	mem 20675MB
[2025-04-03 03:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][254/573]	eta 0:04:42 lr 0.000345	time 0.8774 (0.8841)	loss 0.6024 (0.4988)	grad_norm 2.4723 (2.6855)	mem 20675MB
[2025-04-03 03:16:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][256/573]	eta 0:04:40 lr 0.000345	time 0.8769 (0.8841)	loss 0.5777 (0.4991)	grad_norm 2.2192 (2.6832)	mem 20675MB
[2025-04-03 03:16:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][258/573]	eta 0:04:38 lr 0.000344	time 0.8774 (0.8840)	loss 0.5412 (0.4998)	grad_norm 1.6053 (2.6764)	mem 20675MB
[2025-04-03 03:16:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][260/573]	eta 0:04:36 lr 0.000344	time 0.8776 (0.8840)	loss 0.4969 (0.5001)	grad_norm 2.7268 (2.6743)	mem 20675MB
[2025-04-03 03:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][262/573]	eta 0:04:34 lr 0.000344	time 0.8774 (0.8840)	loss 0.5458 (0.5003)	grad_norm 1.7828 (2.6676)	mem 20675MB
[2025-04-03 03:16:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][264/573]	eta 0:04:33 lr 0.000344	time 0.8774 (0.8839)	loss 0.5734 (0.5008)	grad_norm 1.8497 (2.6618)	mem 20675MB
[2025-04-03 03:16:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][266/573]	eta 0:04:31 lr 0.000344	time 0.8770 (0.8839)	loss 0.4931 (0.5002)	grad_norm 2.5144 (2.6623)	mem 20675MB
[2025-04-03 03:16:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][268/573]	eta 0:04:29 lr 0.000343	time 0.8770 (0.8838)	loss 0.4584 (0.4998)	grad_norm 3.0136 (2.6639)	mem 20675MB
[2025-04-03 03:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][270/573]	eta 0:04:27 lr 0.000343	time 0.8773 (0.8838)	loss 0.5620 (0.5002)	grad_norm 1.6649 (2.6592)	mem 20675MB
[2025-04-03 03:16:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][272/573]	eta 0:04:26 lr 0.000343	time 0.8772 (0.8837)	loss 0.4128 (0.4999)	grad_norm 3.0244 (2.6588)	mem 20675MB
[2025-04-03 03:16:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][274/573]	eta 0:04:24 lr 0.000343	time 0.8773 (0.8837)	loss 0.5270 (0.4999)	grad_norm 1.8544 (2.6536)	mem 20675MB
[2025-04-03 03:16:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][276/573]	eta 0:04:22 lr 0.000343	time 0.8770 (0.8837)	loss 0.5386 (0.5004)	grad_norm 2.7076 (2.6524)	mem 20675MB
[2025-04-03 03:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][278/573]	eta 0:04:20 lr 0.000342	time 0.8773 (0.8836)	loss 0.5950 (0.5003)	grad_norm 1.9901 (2.6514)	mem 20675MB
[2025-04-03 03:16:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][280/573]	eta 0:04:18 lr 0.000342	time 0.8773 (0.8836)	loss 0.5692 (0.5004)	grad_norm 1.8190 (2.6482)	mem 20675MB
[2025-04-03 03:16:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][282/573]	eta 0:04:17 lr 0.000342	time 0.8772 (0.8835)	loss 0.3052 (0.4993)	grad_norm 3.3566 (2.6510)	mem 20675MB
[2025-04-03 03:16:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][284/573]	eta 0:04:15 lr 0.000342	time 0.8775 (0.8835)	loss 0.3987 (0.4990)	grad_norm 2.7654 (2.6498)	mem 20675MB
[2025-04-03 03:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][286/573]	eta 0:04:13 lr 0.000341	time 0.8773 (0.8835)	loss 0.5172 (0.4991)	grad_norm 2.0791 (2.6460)	mem 20675MB
[2025-04-03 03:16:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][288/573]	eta 0:04:11 lr 0.000341	time 0.8773 (0.8834)	loss 0.5111 (0.4986)	grad_norm 1.9408 (2.6444)	mem 20675MB
[2025-04-03 03:16:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][290/573]	eta 0:04:09 lr 0.000341	time 0.8771 (0.8834)	loss 0.5537 (0.4990)	grad_norm 2.2456 (2.6448)	mem 20675MB
[2025-04-03 03:16:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][292/573]	eta 0:04:08 lr 0.000341	time 0.8774 (0.8834)	loss 0.6010 (0.4993)	grad_norm 3.4331 (2.6560)	mem 20675MB
[2025-04-03 03:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][294/573]	eta 0:04:06 lr 0.000341	time 0.8772 (0.8833)	loss 0.4451 (0.4986)	grad_norm 2.8644 (2.6579)	mem 20675MB
[2025-04-03 03:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][296/573]	eta 0:04:04 lr 0.000340	time 0.8773 (0.8833)	loss 0.4592 (0.4989)	grad_norm 3.1212 (2.6599)	mem 20675MB
[2025-04-03 03:16:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][298/573]	eta 0:04:02 lr 0.000340	time 0.8772 (0.8832)	loss 0.3770 (0.4984)	grad_norm 4.7416 (2.6658)	mem 20675MB
[2025-04-03 03:16:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][300/573]	eta 0:04:01 lr 0.000340	time 0.8770 (0.8832)	loss 0.3518 (0.4979)	grad_norm 3.6730 (2.6704)	mem 20675MB
[2025-04-03 03:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][302/573]	eta 0:03:59 lr 0.000340	time 0.8772 (0.8832)	loss 0.4468 (0.4977)	grad_norm 2.8024 (2.6696)	mem 20675MB
[2025-04-03 03:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][304/573]	eta 0:03:57 lr 0.000340	time 0.8770 (0.8831)	loss 0.5488 (0.4980)	grad_norm 3.3192 (2.6703)	mem 20675MB
[2025-04-03 03:16:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][306/573]	eta 0:03:55 lr 0.000339	time 0.8770 (0.8831)	loss 0.5088 (0.4982)	grad_norm 3.2034 (2.6713)	mem 20675MB
[2025-04-03 03:16:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][308/573]	eta 0:03:54 lr 0.000339	time 0.8776 (0.8831)	loss 0.5357 (0.4979)	grad_norm 1.4752 (2.6698)	mem 20675MB
[2025-04-03 03:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][310/573]	eta 0:03:52 lr 0.000339	time 0.8770 (0.8830)	loss 0.4168 (0.4978)	grad_norm 3.5261 (2.6707)	mem 20675MB
[2025-04-03 03:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][312/573]	eta 0:03:50 lr 0.000339	time 0.8774 (0.8830)	loss 0.5287 (0.4980)	grad_norm 1.6595 (2.6674)	mem 20675MB
[2025-04-03 03:17:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][314/573]	eta 0:03:48 lr 0.000339	time 0.8771 (0.8830)	loss 0.5298 (0.4983)	grad_norm 2.0890 (2.6634)	mem 20675MB
[2025-04-03 03:17:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][316/573]	eta 0:03:46 lr 0.000338	time 0.8770 (0.8830)	loss 0.4398 (0.4985)	grad_norm 3.4946 (2.6659)	mem 20675MB
[2025-04-03 03:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][318/573]	eta 0:03:45 lr 0.000338	time 0.8769 (0.8829)	loss 0.3927 (0.4983)	grad_norm 3.6394 (2.6679)	mem 20675MB
[2025-04-03 03:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][320/573]	eta 0:03:43 lr 0.000338	time 0.8771 (0.8829)	loss 0.4985 (0.4982)	grad_norm 1.8446 (2.6665)	mem 20675MB
[2025-04-03 03:17:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][322/573]	eta 0:03:41 lr 0.000338	time 0.8772 (0.8829)	loss 0.4548 (0.4978)	grad_norm 2.4044 (2.6655)	mem 20675MB
[2025-04-03 03:17:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][324/573]	eta 0:03:39 lr 0.000338	time 0.8772 (0.8828)	loss 0.5709 (0.4980)	grad_norm 1.9808 (2.6628)	mem 20675MB
[2025-04-03 03:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][326/573]	eta 0:03:38 lr 0.000337	time 0.8772 (0.8828)	loss 0.5518 (0.4984)	grad_norm 1.9186 (2.6580)	mem 20675MB
[2025-04-03 03:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][328/573]	eta 0:03:36 lr 0.000337	time 0.8773 (0.8828)	loss 0.4302 (0.4984)	grad_norm 3.4321 (2.6592)	mem 20675MB
[2025-04-03 03:17:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][330/573]	eta 0:03:34 lr 0.000337	time 0.8771 (0.8827)	loss 0.4706 (0.4983)	grad_norm 3.2431 (2.6605)	mem 20675MB
[2025-04-03 03:17:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][332/573]	eta 0:03:32 lr 0.000337	time 0.8785 (0.8827)	loss 0.3695 (0.4982)	grad_norm 3.1674 (2.6619)	mem 20675MB
[2025-04-03 03:17:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][334/573]	eta 0:03:30 lr 0.000337	time 0.8772 (0.8827)	loss 0.4331 (0.4978)	grad_norm 2.7741 (2.6652)	mem 20675MB
[2025-04-03 03:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][336/573]	eta 0:03:29 lr 0.000336	time 0.8773 (0.8827)	loss 0.4542 (0.4978)	grad_norm 2.0461 (2.6641)	mem 20675MB
[2025-04-03 03:17:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][338/573]	eta 0:03:27 lr 0.000336	time 0.8775 (0.8826)	loss 0.5634 (0.4977)	grad_norm 2.6326 (2.6662)	mem 20675MB
[2025-04-03 03:17:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][340/573]	eta 0:03:25 lr 0.000336	time 0.8775 (0.8826)	loss 0.6130 (0.4980)	grad_norm 2.7587 (2.6649)	mem 20675MB
[2025-04-03 03:17:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][342/573]	eta 0:03:23 lr 0.000336	time 0.8779 (0.8826)	loss 0.5375 (0.4980)	grad_norm 2.2898 (2.6688)	mem 20675MB
[2025-04-03 03:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][344/573]	eta 0:03:22 lr 0.000336	time 0.8773 (0.8826)	loss 0.5838 (0.4985)	grad_norm 2.1651 (2.6671)	mem 20675MB
[2025-04-03 03:17:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][346/573]	eta 0:03:20 lr 0.000335	time 0.8773 (0.8825)	loss 0.4823 (0.4986)	grad_norm 3.2482 (2.6663)	mem 20675MB
[2025-04-03 03:17:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][348/573]	eta 0:03:18 lr 0.000335	time 0.8772 (0.8825)	loss 0.4882 (0.4986)	grad_norm 3.7905 (2.6697)	mem 20675MB
[2025-04-03 03:17:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][350/573]	eta 0:03:16 lr 0.000335	time 0.8777 (0.8825)	loss 0.4356 (0.4981)	grad_norm 2.7547 (2.6783)	mem 20675MB
[2025-04-03 03:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][352/573]	eta 0:03:15 lr 0.000335	time 0.8774 (0.8825)	loss 0.3187 (0.4976)	grad_norm 2.4092 (2.6769)	mem 20675MB
[2025-04-03 03:17:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][354/573]	eta 0:03:13 lr 0.000335	time 0.8773 (0.8824)	loss 0.5480 (0.4979)	grad_norm 3.1276 (2.6756)	mem 20675MB
[2025-04-03 03:17:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][356/573]	eta 0:03:11 lr 0.000334	time 0.8787 (0.8824)	loss 0.4947 (0.4980)	grad_norm 2.0911 (2.6732)	mem 20675MB
[2025-04-03 03:17:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][358/573]	eta 0:03:09 lr 0.000334	time 0.8775 (0.8824)	loss 0.5304 (0.4978)	grad_norm 2.2868 (2.6713)	mem 20675MB
[2025-04-03 03:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][360/573]	eta 0:03:07 lr 0.000334	time 0.8775 (0.8824)	loss 0.3991 (0.4976)	grad_norm 4.4618 (2.6748)	mem 20675MB
[2025-04-03 03:17:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][362/573]	eta 0:03:06 lr 0.000334	time 0.8773 (0.8824)	loss 0.4214 (0.4970)	grad_norm 3.3646 (2.6775)	mem 20675MB
[2025-04-03 03:17:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][364/573]	eta 0:03:04 lr 0.000334	time 0.8773 (0.8823)	loss 0.5929 (0.4975)	grad_norm 2.9459 (2.6790)	mem 20675MB
[2025-04-03 03:17:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][366/573]	eta 0:03:02 lr 0.000333	time 0.8772 (0.8823)	loss 0.5001 (0.4974)	grad_norm 3.2812 (2.6842)	mem 20675MB
[2025-04-03 03:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][368/573]	eta 0:03:00 lr 0.000333	time 0.8770 (0.8823)	loss 0.3851 (0.4974)	grad_norm 2.7789 (2.6833)	mem 20675MB
[2025-04-03 03:17:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][370/573]	eta 0:02:59 lr 0.000333	time 0.8774 (0.8823)	loss 0.5380 (0.4977)	grad_norm 2.7761 (2.6836)	mem 20675MB
[2025-04-03 03:17:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][372/573]	eta 0:02:57 lr 0.000333	time 0.8772 (0.8822)	loss 0.4045 (0.4977)	grad_norm 3.4295 (2.6843)	mem 20675MB
[2025-04-03 03:17:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][374/573]	eta 0:02:55 lr 0.000333	time 0.8774 (0.8822)	loss 0.5221 (0.4975)	grad_norm 1.8383 (2.6834)	mem 20675MB
[2025-04-03 03:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][376/573]	eta 0:02:53 lr 0.000332	time 0.8774 (0.8822)	loss 0.4620 (0.4971)	grad_norm 2.5969 (2.6847)	mem 20675MB
[2025-04-03 03:18:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][378/573]	eta 0:02:52 lr 0.000332	time 0.8777 (0.8822)	loss 0.5479 (0.4973)	grad_norm 1.7487 (2.6814)	mem 20675MB
[2025-04-03 03:18:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][380/573]	eta 0:02:50 lr 0.000332	time 0.8772 (0.8822)	loss 0.4805 (0.4973)	grad_norm 2.3714 (2.6810)	mem 20675MB
[2025-04-03 03:18:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][382/573]	eta 0:02:48 lr 0.000332	time 0.8774 (0.8821)	loss 0.4857 (0.4975)	grad_norm 3.2231 (2.6829)	mem 20675MB
[2025-04-03 03:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][384/573]	eta 0:02:46 lr 0.000332	time 0.8772 (0.8821)	loss 0.3832 (0.4973)	grad_norm 3.6898 (2.6848)	mem 20675MB
[2025-04-03 03:18:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][386/573]	eta 0:02:44 lr 0.000331	time 0.8772 (0.8821)	loss 0.5390 (0.4976)	grad_norm 1.7395 (2.6812)	mem 20675MB
[2025-04-03 03:18:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][388/573]	eta 0:02:43 lr 0.000331	time 0.8774 (0.8821)	loss 0.3763 (0.4972)	grad_norm 2.4213 (2.6803)	mem 20675MB
[2025-04-03 03:18:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][390/573]	eta 0:02:41 lr 0.000331	time 0.8780 (0.8821)	loss 0.4054 (0.4971)	grad_norm 2.9837 (2.6799)	mem 20675MB
[2025-04-03 03:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][392/573]	eta 0:02:39 lr 0.000331	time 0.8776 (0.8820)	loss 0.3596 (0.4968)	grad_norm 2.3061 (2.6785)	mem 20675MB
[2025-04-03 03:18:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][394/573]	eta 0:02:37 lr 0.000331	time 0.8774 (0.8820)	loss 0.4850 (0.4965)	grad_norm 1.9333 (2.6780)	mem 20675MB
[2025-04-03 03:18:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][396/573]	eta 0:02:36 lr 0.000330	time 0.8772 (0.8820)	loss 0.5929 (0.4970)	grad_norm 3.0493 (2.6783)	mem 20675MB
[2025-04-03 03:18:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][398/573]	eta 0:02:34 lr 0.000330	time 0.8771 (0.8820)	loss 0.4748 (0.4971)	grad_norm 2.2624 (2.6778)	mem 20675MB
[2025-04-03 03:18:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][400/573]	eta 0:02:32 lr 0.000330	time 0.8774 (0.8820)	loss 0.6280 (0.4975)	grad_norm 2.5356 (2.6784)	mem 20675MB
[2025-04-03 03:18:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][402/573]	eta 0:02:30 lr 0.000330	time 0.8770 (0.8820)	loss 0.4571 (0.4974)	grad_norm 3.7749 (2.6787)	mem 20675MB
[2025-04-03 03:18:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][404/573]	eta 0:02:29 lr 0.000330	time 0.8770 (0.8819)	loss 0.4469 (0.4975)	grad_norm 1.8466 (2.6748)	mem 20675MB
[2025-04-03 03:18:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][406/573]	eta 0:02:27 lr 0.000329	time 0.8772 (0.8819)	loss 0.4941 (0.4976)	grad_norm 1.9104 (2.6715)	mem 20675MB
[2025-04-03 03:18:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][408/573]	eta 0:02:25 lr 0.000329	time 0.8771 (0.8819)	loss 0.4748 (0.4976)	grad_norm 2.4049 (2.6692)	mem 20675MB
[2025-04-03 03:18:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][410/573]	eta 0:02:23 lr 0.000329	time 0.8770 (0.8819)	loss 0.3985 (0.4975)	grad_norm 2.6267 (2.6666)	mem 20675MB
[2025-04-03 03:18:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][412/573]	eta 0:02:21 lr 0.000329	time 0.8773 (0.8819)	loss 0.3315 (0.4972)	grad_norm 3.0837 (2.6666)	mem 20675MB
[2025-04-03 03:18:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][414/573]	eta 0:02:20 lr 0.000329	time 0.8773 (0.8818)	loss 0.5321 (0.4976)	grad_norm 3.0322 (2.6665)	mem 20675MB
[2025-04-03 03:18:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][416/573]	eta 0:02:18 lr 0.000328	time 0.8773 (0.8818)	loss 0.5388 (0.4979)	grad_norm 2.3289 (2.6654)	mem 20675MB
[2025-04-03 03:18:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][418/573]	eta 0:02:16 lr 0.000328	time 0.8773 (0.8818)	loss 0.5424 (0.4978)	grad_norm 2.2417 (2.6633)	mem 20675MB
[2025-04-03 03:18:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][420/573]	eta 0:02:14 lr 0.000328	time 0.8774 (0.8818)	loss 0.5659 (0.4981)	grad_norm 2.4465 (2.6608)	mem 20675MB
[2025-04-03 03:18:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][422/573]	eta 0:02:13 lr 0.000328	time 0.8777 (0.8818)	loss 0.3667 (0.4978)	grad_norm 2.1845 (2.6597)	mem 20675MB
[2025-04-03 03:18:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][424/573]	eta 0:02:11 lr 0.000328	time 0.8773 (0.8818)	loss 0.4533 (0.4976)	grad_norm 1.8909 (2.6572)	mem 20675MB
[2025-04-03 03:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][426/573]	eta 0:02:09 lr 0.000327	time 0.8772 (0.8817)	loss 0.4880 (0.4976)	grad_norm 2.6311 (2.6588)	mem 20675MB
[2025-04-03 03:18:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][428/573]	eta 0:02:07 lr 0.000327	time 0.8771 (0.8817)	loss 0.5645 (0.4977)	grad_norm 2.7566 (2.6582)	mem 20675MB
[2025-04-03 03:18:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][430/573]	eta 0:02:06 lr 0.000327	time 0.8772 (0.8817)	loss 0.4734 (0.4976)	grad_norm 1.9952 (2.6573)	mem 20675MB
[2025-04-03 03:18:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][432/573]	eta 0:02:04 lr 0.000327	time 0.8770 (0.8817)	loss 0.4170 (0.4974)	grad_norm 6.9428 (2.6690)	mem 20675MB
[2025-04-03 03:18:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][434/573]	eta 0:02:02 lr 0.000327	time 0.8774 (0.8817)	loss 0.5199 (0.4976)	grad_norm 1.8358 (2.6656)	mem 20675MB
[2025-04-03 03:18:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][436/573]	eta 0:02:00 lr 0.000326	time 0.8772 (0.8817)	loss 0.5859 (0.4979)	grad_norm 2.4686 (2.6646)	mem 20675MB
[2025-04-03 03:18:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][438/573]	eta 0:01:59 lr 0.000326	time 0.8772 (0.8816)	loss 0.4822 (0.4979)	grad_norm 3.4864 (2.6665)	mem 20675MB
[2025-04-03 03:18:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][440/573]	eta 0:01:57 lr 0.000326	time 0.8771 (0.8816)	loss 0.4036 (0.4978)	grad_norm 2.2082 (2.6648)	mem 20675MB
[2025-04-03 03:18:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][442/573]	eta 0:01:55 lr 0.000326	time 0.8770 (0.8816)	loss 0.4994 (0.4978)	grad_norm 4.7236 (2.6710)	mem 20675MB
[2025-04-03 03:18:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][444/573]	eta 0:01:53 lr 0.000326	time 0.8774 (0.8816)	loss 0.4318 (0.4976)	grad_norm 2.7968 (2.6720)	mem 20675MB
[2025-04-03 03:18:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][446/573]	eta 0:01:51 lr 0.000325	time 0.8774 (0.8816)	loss 0.5029 (0.4978)	grad_norm 1.8775 (2.6696)	mem 20675MB
[2025-04-03 03:19:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][448/573]	eta 0:01:50 lr 0.000325	time 0.8770 (0.8816)	loss 0.4791 (0.4977)	grad_norm 3.2807 (2.6703)	mem 20675MB
[2025-04-03 03:19:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][450/573]	eta 0:01:48 lr 0.000325	time 0.8774 (0.8815)	loss 0.5087 (0.4976)	grad_norm 2.1135 (2.6689)	mem 20675MB
[2025-04-03 03:19:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][452/573]	eta 0:01:46 lr 0.000325	time 0.8774 (0.8815)	loss 0.5813 (0.4977)	grad_norm 2.6951 (2.6685)	mem 20675MB
[2025-04-03 03:19:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][454/573]	eta 0:01:44 lr 0.000325	time 0.8771 (0.8815)	loss 0.6155 (0.4982)	grad_norm 2.5434 (2.6684)	mem 20675MB
[2025-04-03 03:19:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][456/573]	eta 0:01:43 lr 0.000324	time 0.8772 (0.8815)	loss 0.5443 (0.4985)	grad_norm 2.3969 (2.6685)	mem 20675MB
[2025-04-03 03:19:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][458/573]	eta 0:01:41 lr 0.000324	time 0.8781 (0.8815)	loss 0.5232 (0.4984)	grad_norm 2.6073 (2.6693)	mem 20675MB
[2025-04-03 03:19:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][460/573]	eta 0:01:39 lr 0.000324	time 0.8773 (0.8815)	loss 0.4665 (0.4984)	grad_norm 2.8248 (2.6713)	mem 20675MB
[2025-04-03 03:19:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][462/573]	eta 0:01:37 lr 0.000324	time 0.8774 (0.8815)	loss 0.6366 (0.4989)	grad_norm 2.9147 (2.6708)	mem 20675MB
[2025-04-03 03:19:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][464/573]	eta 0:01:36 lr 0.000324	time 0.8774 (0.8815)	loss 0.6049 (0.4992)	grad_norm 2.1280 (2.6683)	mem 20675MB
[2025-04-03 03:19:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][466/573]	eta 0:01:34 lr 0.000323	time 0.8770 (0.8814)	loss 0.4108 (0.4988)	grad_norm 3.1087 (2.6720)	mem 20675MB
[2025-04-03 03:19:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][468/573]	eta 0:01:32 lr 0.000323	time 0.8771 (0.8814)	loss 0.5100 (0.4988)	grad_norm 2.1637 (2.6695)	mem 20675MB
[2025-04-03 03:19:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][470/573]	eta 0:01:30 lr 0.000323	time 0.8785 (0.8814)	loss 0.5797 (0.4987)	grad_norm 2.1342 (2.6685)	mem 20675MB
[2025-04-03 03:19:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][472/573]	eta 0:01:29 lr 0.000323	time 0.8772 (0.8814)	loss 0.5278 (0.4990)	grad_norm 3.1538 (2.6716)	mem 20675MB
[2025-04-03 03:19:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][474/573]	eta 0:01:27 lr 0.000323	time 0.8774 (0.8814)	loss 0.3880 (0.4987)	grad_norm 3.8061 (2.6729)	mem 20675MB
[2025-04-03 03:19:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][476/573]	eta 0:01:25 lr 0.000322	time 0.8773 (0.8814)	loss 0.5656 (0.4989)	grad_norm 1.8686 (2.6722)	mem 20675MB
[2025-04-03 03:19:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][478/573]	eta 0:01:23 lr 0.000322	time 0.8773 (0.8814)	loss 0.5710 (0.4992)	grad_norm 2.1434 (2.6714)	mem 20675MB
[2025-04-03 03:19:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][480/573]	eta 0:01:21 lr 0.000322	time 0.8773 (0.8814)	loss 0.3676 (0.4991)	grad_norm 3.4791 (2.6733)	mem 20675MB
[2025-04-03 03:19:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][482/573]	eta 0:01:20 lr 0.000322	time 0.8773 (0.8813)	loss 0.4552 (0.4989)	grad_norm 2.5692 (2.6733)	mem 20675MB
[2025-04-03 03:19:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][484/573]	eta 0:01:18 lr 0.000322	time 0.8771 (0.8813)	loss 0.5620 (0.4986)	grad_norm 2.0944 (2.6721)	mem 20675MB
[2025-04-03 03:19:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][486/573]	eta 0:01:16 lr 0.000321	time 0.8774 (0.8813)	loss 0.5335 (0.4988)	grad_norm 2.1675 (2.6696)	mem 20675MB
[2025-04-03 03:19:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][488/573]	eta 0:01:14 lr 0.000321	time 0.8774 (0.8813)	loss 0.5727 (0.4991)	grad_norm 2.1887 (2.6699)	mem 20675MB
[2025-04-03 03:19:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][490/573]	eta 0:01:13 lr 0.000321	time 0.8772 (0.8813)	loss 0.5185 (0.4991)	grad_norm 2.5069 (2.6697)	mem 20675MB
[2025-04-03 03:19:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][492/573]	eta 0:01:11 lr 0.000321	time 0.8774 (0.8813)	loss 0.4408 (0.4988)	grad_norm 2.3435 (2.6701)	mem 20675MB
[2025-04-03 03:19:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][494/573]	eta 0:01:09 lr 0.000321	time 0.8772 (0.8813)	loss 0.4874 (0.4986)	grad_norm 2.7081 (2.6716)	mem 20675MB
[2025-04-03 03:19:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][496/573]	eta 0:01:07 lr 0.000320	time 0.8771 (0.8813)	loss 0.5201 (0.4985)	grad_norm 2.8708 (2.6724)	mem 20675MB
[2025-04-03 03:19:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][498/573]	eta 0:01:06 lr 0.000320	time 0.8773 (0.8812)	loss 0.4827 (0.4987)	grad_norm 3.4891 (2.6743)	mem 20675MB
[2025-04-03 03:19:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][500/573]	eta 0:01:04 lr 0.000320	time 0.8773 (0.8812)	loss 0.3373 (0.4984)	grad_norm 4.0840 (2.6764)	mem 20675MB
[2025-04-03 03:19:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][502/573]	eta 0:01:02 lr 0.000320	time 0.8773 (0.8812)	loss 0.5272 (0.4985)	grad_norm 2.1310 (2.6755)	mem 20675MB
[2025-04-03 03:19:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][504/573]	eta 0:01:00 lr 0.000320	time 0.8784 (0.8812)	loss 0.3671 (0.4984)	grad_norm 4.5160 (2.6792)	mem 20675MB
[2025-04-03 03:19:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][506/573]	eta 0:00:59 lr 0.000319	time 0.8770 (0.8812)	loss 0.5844 (0.4988)	grad_norm 2.6650 (2.6869)	mem 20675MB
[2025-04-03 03:19:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][508/573]	eta 0:00:57 lr 0.000319	time 0.8773 (0.8812)	loss 0.5474 (0.4987)	grad_norm 3.0546 (2.6871)	mem 20675MB
[2025-04-03 03:19:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][510/573]	eta 0:00:55 lr 0.000319	time 0.8773 (0.8812)	loss 0.5009 (0.4988)	grad_norm 3.4388 (2.6879)	mem 20675MB
[2025-04-03 03:19:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][512/573]	eta 0:00:53 lr 0.000319	time 0.8776 (0.8812)	loss 0.6479 (0.4990)	grad_norm 2.1160 (2.6853)	mem 20675MB
[2025-04-03 03:19:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][514/573]	eta 0:00:51 lr 0.000319	time 0.8770 (0.8812)	loss 0.4344 (0.4989)	grad_norm 3.2206 (2.6871)	mem 20675MB
[2025-04-03 03:20:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][516/573]	eta 0:00:50 lr 0.000318	time 0.8772 (0.8812)	loss 0.4663 (0.4986)	grad_norm 2.9014 (2.6887)	mem 20675MB
[2025-04-03 03:20:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][518/573]	eta 0:00:48 lr 0.000318	time 0.8773 (0.8812)	loss 0.3357 (0.4984)	grad_norm 3.4453 (2.6884)	mem 20675MB
[2025-04-03 03:20:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][520/573]	eta 0:00:46 lr 0.000318	time 0.8773 (0.8812)	loss 0.5371 (0.4983)	grad_norm 2.3782 (2.6880)	mem 20675MB
[2025-04-03 03:20:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][522/573]	eta 0:00:44 lr 0.000318	time 0.8773 (0.8812)	loss 0.3980 (0.4982)	grad_norm 3.4820 (2.6911)	mem 20675MB
[2025-04-03 03:20:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][524/573]	eta 0:00:43 lr 0.000318	time 0.8775 (0.8812)	loss 0.6642 (0.4985)	grad_norm 3.3435 (2.6935)	mem 20675MB
[2025-04-03 03:20:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][526/573]	eta 0:00:41 lr 0.000317	time 0.8774 (0.8811)	loss 0.5244 (0.4986)	grad_norm 2.5912 (2.6917)	mem 20675MB
[2025-04-03 03:20:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][528/573]	eta 0:00:39 lr 0.000317	time 0.8774 (0.8811)	loss 0.5839 (0.4989)	grad_norm 1.8560 (2.6905)	mem 20675MB
[2025-04-03 03:20:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][530/573]	eta 0:00:37 lr 0.000317	time 0.8775 (0.8811)	loss 0.6170 (0.4992)	grad_norm 2.0866 (2.6885)	mem 20675MB
[2025-04-03 03:20:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][532/573]	eta 0:00:36 lr 0.000317	time 0.8773 (0.8811)	loss 0.4862 (0.4992)	grad_norm 2.1839 (2.6868)	mem 20675MB
[2025-04-03 03:20:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][534/573]	eta 0:00:34 lr 0.000317	time 0.8773 (0.8811)	loss 0.5590 (0.4994)	grad_norm 2.5292 (2.6860)	mem 20675MB
[2025-04-03 03:20:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][536/573]	eta 0:00:32 lr 0.000316	time 0.8774 (0.8811)	loss 0.4497 (0.4993)	grad_norm 2.2499 (2.6863)	mem 20675MB
[2025-04-03 03:20:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][538/573]	eta 0:00:30 lr 0.000316	time 0.8773 (0.8811)	loss 0.5764 (0.4996)	grad_norm 2.5951 (2.6850)	mem 20675MB
[2025-04-03 03:20:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][540/573]	eta 0:00:29 lr 0.000316	time 0.8773 (0.8811)	loss 0.4156 (0.4994)	grad_norm 3.6213 (2.6877)	mem 20675MB
[2025-04-03 03:20:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][542/573]	eta 0:00:27 lr 0.000316	time 0.8772 (0.8811)	loss 0.5928 (0.4994)	grad_norm 2.1077 (2.6873)	mem 20675MB
[2025-04-03 03:20:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][544/573]	eta 0:00:25 lr 0.000316	time 0.8774 (0.8810)	loss 0.4516 (0.4994)	grad_norm 2.6734 (2.6861)	mem 20675MB
[2025-04-03 03:20:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][546/573]	eta 0:00:23 lr 0.000315	time 0.8775 (0.8810)	loss 0.6075 (0.4999)	grad_norm 2.0156 (2.6857)	mem 20675MB
[2025-04-03 03:20:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][548/573]	eta 0:00:22 lr 0.000315	time 0.8773 (0.8810)	loss 0.4673 (0.4999)	grad_norm 2.4646 (2.6838)	mem 20675MB
[2025-04-03 03:20:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][550/573]	eta 0:00:20 lr 0.000315	time 0.8771 (0.8810)	loss 0.4332 (0.4996)	grad_norm 2.3741 (2.6839)	mem 20675MB
[2025-04-03 03:20:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][552/573]	eta 0:00:18 lr 0.000315	time 0.8771 (0.8810)	loss 0.4573 (0.4996)	grad_norm 1.8133 (2.6812)	mem 20675MB
[2025-04-03 03:20:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][554/573]	eta 0:00:16 lr 0.000315	time 0.8772 (0.8810)	loss 0.5172 (0.4995)	grad_norm 3.3637 (2.6830)	mem 20675MB
[2025-04-03 03:20:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][556/573]	eta 0:00:14 lr 0.000314	time 0.8772 (0.8810)	loss 0.5174 (0.4994)	grad_norm 2.0875 (2.6835)	mem 20675MB
[2025-04-03 03:20:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][558/573]	eta 0:00:13 lr 0.000314	time 0.8771 (0.8810)	loss 0.4965 (0.4992)	grad_norm 2.0963 (2.6831)	mem 20675MB
[2025-04-03 03:20:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][560/573]	eta 0:00:11 lr 0.000314	time 0.8771 (0.8810)	loss 0.5182 (0.4993)	grad_norm 2.1125 (2.6838)	mem 20675MB
[2025-04-03 03:20:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][562/573]	eta 0:00:09 lr 0.000314	time 0.8772 (0.8810)	loss 0.5571 (0.4995)	grad_norm 3.3088 (2.6851)	mem 20675MB
[2025-04-03 03:20:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][564/573]	eta 0:00:07 lr 0.000314	time 0.8771 (0.8809)	loss 0.4990 (0.4997)	grad_norm 2.3665 (2.6841)	mem 20675MB
[2025-04-03 03:20:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][566/573]	eta 0:00:06 lr 0.000313	time 0.8770 (0.8809)	loss 0.4624 (0.4993)	grad_norm 3.8288 (2.6863)	mem 20675MB
[2025-04-03 03:20:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][568/573]	eta 0:00:04 lr 0.000313	time 0.8772 (0.8809)	loss 0.4000 (0.4993)	grad_norm 2.3955 (2.6861)	mem 20675MB
[2025-04-03 03:20:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][570/573]	eta 0:00:02 lr 0.000313	time 0.8770 (0.8809)	loss 0.5295 (0.4995)	grad_norm 1.6866 (2.6866)	mem 20675MB
[2025-04-03 03:20:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][572/573]	eta 0:00:00 lr 0.000313	time 0.8771 (0.8809)	loss 0.4733 (0.4996)	grad_norm 2.1382 (2.6857)	mem 20675MB
[2025-04-03 03:20:50 simmim_finetune] (main_finetune.py 260): INFO EPOCH 19 training takes 0:08:24
[2025-04-03 03:20:52 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 1.781 (1.781)	Loss 0.4570 (0.4570)	Acc@1 74.219 (74.219)	Mem 20675MB
[2025-04-03 03:20:52 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (0.783)	Loss 0.4090 (0.4240)	Acc@1 82.812 (78.385)	Mem 20675MB
[2025-04-03 03:20:53 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.583)	Loss 0.4484 (0.4248)	Acc@1 78.125 (78.438)	Mem 20675MB
[2025-04-03 03:20:54 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.498)	Loss 0.3874 (0.4143)	Acc@1 84.375 (80.134)	Mem 20675MB
[2025-04-03 03:20:54 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.450)	Loss 0.5600 (0.4306)	Acc@1 68.750 (79.080)	Mem 20675MB
[2025-04-03 03:20:55 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.420)	Loss 0.5239 (0.4525)	Acc@1 77.344 (78.196)	Mem 20675MB
[2025-04-03 03:20:55 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.399)	Loss 0.5278 (0.4626)	Acc@1 73.438 (77.704)	Mem 20675MB
[2025-04-03 03:20:56 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.384)	Loss 0.4899 (0.4673)	Acc@1 78.125 (77.552)	Mem 20675MB
[2025-04-03 03:20:56 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.571
[2025-04-03 03:20:56 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.6%
[2025-04-03 03:20:56 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.67%
[2025-04-03 03:20:56 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.3432228032273095e-06, 1.3432228032273095e-06, 1.9655457353863885e-06, 1.9655457353863885e-06, 2.9229656310157406e-06, 2.9229656310157406e-06, 4.3959193165993596e-06, 4.3959193165993596e-06, 6.662001909804925e-06, 6.662001909804925e-06, 1.0148282822428874e-05, 1.0148282822428874e-05, 1.5511791918773407e-05, 1.5511791918773407e-05, 2.3763344374688074e-05, 2.3763344374688074e-05, 3.6458040460710644e-05, 3.6458040460710644e-05, 5.59883421315146e-05, 5.59883421315146e-05, 8.60349600865976e-05, 8.60349600865976e-05, 0.00013226052617134067, 0.00013226052617134067, 0.00020337678168633002, 0.00020337678168633002, 0.00031278640555554434, 0.00031278640555554434]
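The per-parameter-group learning rates logged above come from layer-wise LR decay (`TRAIN.LAYER_DECAY: 0.65` in the config): deeper transformer blocks get larger LRs, each shallower group shrunk by the decay factor, with the cosine schedule and `MIN_LR` floor then applied on top (which is why the logged ratios are close to, but not exactly, 1/0.65). A minimal sketch of the scale computation, assuming the common `scale = decay ** (num_layers + 1 - depth)` grouping (the exact group indices and decay/no-decay split are assumptions; the real logic lives in the repo's optimizer builder):

```python
# Hedged sketch of layer-wise LR decay as configured above
# (LAYER_DECAY: 0.65, ViT depth 12 -> 14 depth groups: patch embed,
# 12 blocks, head). Names and grouping here are illustrative.

def lr_scales(num_layers: int, layer_decay: float) -> list:
    """One multiplicative LR scale per depth group, shallow -> deep.

    Depth 0 (patch embedding) gets the smallest scale; the final
    group (head) is unscaled (1.0).
    """
    return [layer_decay ** (num_layers + 1 - d) for d in range(num_layers + 2)]

scales = lr_scales(num_layers=12, layer_decay=0.65)
# Each shallower group's LR is 0.65x the next deeper one; head is 1.0.
assert abs(scales[-1] - 1.0) < 1e-12
assert abs(scales[-2] / scales[-1] - 0.65) < 1e-12
```

With depth 12 this yields 14 distinct scales, matching the 14 distinct LR values in the logged list (each value appears twice there, presumably for the weight-decay and no-weight-decay parameter splits).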
[2025-04-03 03:20:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][0/573]	eta 0:22:51 lr 0.000313	time 2.3943 (2.3943)	loss 0.5036 (0.5036)	grad_norm 1.9227 (1.9227)	mem 20675MB
[2025-04-03 03:21:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][2/573]	eta 0:13:09 lr 0.000312	time 0.8773 (1.3835)	loss 0.5455 (0.5382)	grad_norm 2.2622 (2.0481)	mem 20675MB
[2025-04-03 03:21:02 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][4/573]	eta 0:11:12 lr 0.000312	time 0.8772 (1.1814)	loss 0.5450 (0.5417)	grad_norm 2.5267 (2.3214)	mem 20675MB
[2025-04-03 03:21:04 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][6/573]	eta 0:10:20 lr 0.000312	time 0.8772 (1.0947)	loss 0.5217 (0.5373)	grad_norm 2.8622 (2.4235)	mem 20675MB
[2025-04-03 03:21:06 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][8/573]	eta 0:09:51 lr 0.000312	time 0.8771 (1.0466)	loss 0.4806 (0.5343)	grad_norm 3.3999 (2.5181)	mem 20675MB
[2025-04-03 03:21:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][10/573]	eta 0:09:31 lr 0.000312	time 0.8772 (1.0159)	loss 0.5123 (0.5194)	grad_norm 2.3931 (2.5496)	mem 20675MB
[2025-04-03 03:21:09 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][12/573]	eta 0:09:18 lr 0.000312	time 0.8774 (0.9947)	loss 0.5751 (0.5104)	grad_norm 2.9554 (2.5943)	mem 20675MB
[2025-04-03 03:21:11 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][14/573]	eta 0:09:07 lr 0.000311	time 0.8774 (0.9792)	loss 0.4051 (0.5092)	grad_norm 3.0115 (2.5892)	mem 20675MB
[2025-04-03 03:21:13 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][16/573]	eta 0:08:58 lr 0.000311	time 0.8775 (0.9673)	loss 0.4692 (0.5060)	grad_norm 2.2459 (2.5651)	mem 20675MB
[2025-04-03 03:21:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][18/573]	eta 0:08:51 lr 0.000311	time 0.8774 (0.9579)	loss 0.4478 (0.5054)	grad_norm 2.9952 (2.5784)	mem 20675MB
[2025-04-03 03:21:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][20/573]	eta 0:08:45 lr 0.000311	time 0.8771 (0.9503)	loss 0.4113 (0.5044)	grad_norm 2.4086 (2.5641)	mem 20675MB
[2025-04-03 03:21:18 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][22/573]	eta 0:08:40 lr 0.000311	time 0.8774 (0.9440)	loss 0.5606 (0.5094)	grad_norm 2.2718 (2.5295)	mem 20675MB
[2025-04-03 03:21:20 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][24/573]	eta 0:08:35 lr 0.000310	time 0.8772 (0.9388)	loss 0.6469 (0.5161)	grad_norm 4.8894 (2.6464)	mem 20675MB
[2025-04-03 03:21:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][26/573]	eta 0:08:31 lr 0.000310	time 0.8775 (0.9343)	loss 0.5362 (0.5196)	grad_norm 2.0186 (2.6226)	mem 20675MB
[2025-04-03 03:21:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][28/573]	eta 0:08:27 lr 0.000310	time 0.8774 (0.9304)	loss 0.5194 (0.5205)	grad_norm 2.1159 (2.5968)	mem 20675MB
[2025-04-03 03:21:25 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][30/573]	eta 0:08:23 lr 0.000310	time 0.8772 (0.9271)	loss 0.3232 (0.5151)	grad_norm 2.4622 (2.6053)	mem 20675MB
[2025-04-03 03:21:27 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][32/573]	eta 0:08:19 lr 0.000310	time 0.8772 (0.9241)	loss 0.5370 (0.5183)	grad_norm 2.8849 (2.5885)	mem 20675MB
[2025-04-03 03:21:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][34/573]	eta 0:08:16 lr 0.000309	time 0.8772 (0.9215)	loss 0.5233 (0.5205)	grad_norm 2.1640 (2.5673)	mem 20675MB
[2025-04-03 03:21:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][36/573]	eta 0:08:13 lr 0.000309	time 0.8772 (0.9191)	loss 0.5404 (0.5228)	grad_norm 1.8843 (2.5243)	mem 20675MB
[2025-04-03 03:21:32 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][38/573]	eta 0:08:10 lr 0.000309	time 0.8774 (0.9170)	loss 0.5652 (0.5247)	grad_norm 1.8012 (2.4969)	mem 20675MB
[2025-04-03 03:21:34 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][40/573]	eta 0:08:07 lr 0.000309	time 0.8773 (0.9151)	loss 0.5271 (0.5260)	grad_norm 2.1607 (2.4733)	mem 20675MB
[2025-04-03 03:21:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][42/573]	eta 0:08:05 lr 0.000309	time 0.8770 (0.9134)	loss 0.5511 (0.5266)	grad_norm 1.8587 (2.4625)	mem 20675MB
[2025-04-03 03:21:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][44/573]	eta 0:08:02 lr 0.000308	time 0.8770 (0.9118)	loss 0.5206 (0.5274)	grad_norm 1.9322 (2.4480)	mem 20675MB
[2025-04-03 03:21:39 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][46/573]	eta 0:07:59 lr 0.000308	time 0.8771 (0.9104)	loss 0.5522 (0.5293)	grad_norm 2.5648 (2.4259)	mem 20675MB
[2025-04-03 03:21:41 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][48/573]	eta 0:07:57 lr 0.000308	time 0.8770 (0.9091)	loss 0.4583 (0.5279)	grad_norm 2.1402 (2.4118)	mem 20675MB
[2025-04-03 03:21:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][50/573]	eta 0:07:54 lr 0.000308	time 0.8780 (0.9079)	loss 0.5376 (0.5272)	grad_norm 2.4575 (2.4059)	mem 20675MB
[2025-04-03 03:21:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][52/573]	eta 0:07:52 lr 0.000308	time 0.8772 (0.9068)	loss 0.5980 (0.5277)	grad_norm 1.3403 (2.3821)	mem 20675MB
[2025-04-03 03:21:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][54/573]	eta 0:07:50 lr 0.000307	time 0.8770 (0.9057)	loss 0.5100 (0.5253)	grad_norm 2.2818 (2.3910)	mem 20675MB
[2025-04-03 03:21:48 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][56/573]	eta 0:07:47 lr 0.000307	time 0.8771 (0.9048)	loss 0.4430 (0.5239)	grad_norm 1.8713 (2.3878)	mem 20675MB
[2025-04-03 03:21:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][58/573]	eta 0:07:45 lr 0.000307	time 0.8771 (0.9039)	loss 0.5334 (0.5249)	grad_norm 1.8412 (2.3793)	mem 20675MB
[2025-04-03 03:21:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][60/573]	eta 0:07:43 lr 0.000307	time 0.8773 (0.9030)	loss 0.4685 (0.5210)	grad_norm 2.7712 (2.3921)	mem 20675MB
[2025-04-03 03:21:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][62/573]	eta 0:07:41 lr 0.000307	time 0.8771 (0.9022)	loss 0.5820 (0.5203)	grad_norm 3.7416 (2.4584)	mem 20675MB
[2025-04-03 03:21:55 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][64/573]	eta 0:07:38 lr 0.000306	time 0.8772 (0.9015)	loss 0.5001 (0.5208)	grad_norm 2.3800 (2.4554)	mem 20675MB
[2025-04-03 03:21:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][66/573]	eta 0:07:36 lr 0.000306	time 0.8772 (0.9008)	loss 0.5442 (0.5204)	grad_norm 2.8375 (2.4583)	mem 20675MB
[2025-04-03 03:21:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][68/573]	eta 0:07:34 lr 0.000306	time 0.8772 (0.9001)	loss 0.3860 (0.5183)	grad_norm 3.0644 (2.4832)	mem 20675MB
[2025-04-03 03:22:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][70/573]	eta 0:07:32 lr 0.000306	time 0.8775 (0.8995)	loss 0.3362 (0.5139)	grad_norm 2.8135 (2.4886)	mem 20675MB
[2025-04-03 03:22:02 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][72/573]	eta 0:07:30 lr 0.000306	time 0.8772 (0.8989)	loss 0.5506 (0.5153)	grad_norm 1.6767 (2.4741)	mem 20675MB
[2025-04-03 03:22:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][74/573]	eta 0:07:28 lr 0.000305	time 0.8772 (0.8984)	loss 0.5019 (0.5163)	grad_norm 2.4161 (2.4749)	mem 20675MB
[2025-04-03 03:22:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][76/573]	eta 0:07:26 lr 0.000305	time 0.8775 (0.8978)	loss 0.5211 (0.5146)	grad_norm 3.3386 (2.4920)	mem 20675MB
[2025-04-03 03:22:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][78/573]	eta 0:07:24 lr 0.000305	time 0.8772 (0.8973)	loss 0.4832 (0.5142)	grad_norm 3.3284 (2.5027)	mem 20675MB
[2025-04-03 03:22:09 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][80/573]	eta 0:07:22 lr 0.000305	time 0.8771 (0.8969)	loss 0.4535 (0.5147)	grad_norm 4.0811 (2.5222)	mem 20675MB
[2025-04-03 03:22:11 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][82/573]	eta 0:07:20 lr 0.000305	time 0.8774 (0.8964)	loss 0.4701 (0.5145)	grad_norm 2.3933 (2.5178)	mem 20675MB
[2025-04-03 03:22:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][84/573]	eta 0:07:18 lr 0.000304	time 0.8773 (0.8960)	loss 0.6316 (0.5151)	grad_norm 3.3560 (2.5379)	mem 20675MB
[2025-04-03 03:22:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][86/573]	eta 0:07:16 lr 0.000304	time 0.8775 (0.8956)	loss 0.5688 (0.5159)	grad_norm 2.2038 (2.5248)	mem 20675MB
[2025-04-03 03:22:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][88/573]	eta 0:07:14 lr 0.000304	time 0.8772 (0.8952)	loss 0.4175 (0.5152)	grad_norm 3.6195 (2.5438)	mem 20675MB
[2025-04-03 03:22:18 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][90/573]	eta 0:07:12 lr 0.000304	time 0.8771 (0.8948)	loss 0.4721 (0.5146)	grad_norm 2.1946 (2.5448)	mem 20675MB
[2025-04-03 03:22:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][92/573]	eta 0:07:10 lr 0.000304	time 0.8772 (0.8944)	loss 0.5144 (0.5157)	grad_norm 2.3245 (2.5429)	mem 20675MB
[2025-04-03 03:22:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][94/573]	eta 0:07:08 lr 0.000303	time 0.8771 (0.8941)	loss 0.3705 (0.5137)	grad_norm 2.3767 (2.5454)	mem 20675MB
[2025-04-03 03:22:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][96/573]	eta 0:07:06 lr 0.000303	time 0.8771 (0.8938)	loss 0.4096 (0.5117)	grad_norm 3.9058 (2.5640)	mem 20675MB
[2025-04-03 03:22:25 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][98/573]	eta 0:07:04 lr 0.000303	time 0.8773 (0.8934)	loss 0.4346 (0.5110)	grad_norm 3.1338 (2.5644)	mem 20675MB
[2025-04-03 03:22:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][100/573]	eta 0:07:02 lr 0.000303	time 0.8773 (0.8931)	loss 0.5511 (0.5103)	grad_norm 2.7590 (2.5743)	mem 20675MB
[2025-04-03 03:22:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][102/573]	eta 0:07:00 lr 0.000303	time 0.8775 (0.8929)	loss 0.3586 (0.5101)	grad_norm 3.4854 (2.5922)	mem 20675MB
[2025-04-03 03:22:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][104/573]	eta 0:06:58 lr 0.000302	time 0.8771 (0.8926)	loss 0.5656 (0.5106)	grad_norm 3.0149 (2.5920)	mem 20675MB
[2025-04-03 03:22:32 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][106/573]	eta 0:06:56 lr 0.000302	time 0.8770 (0.8923)	loss 0.5133 (0.5107)	grad_norm 2.7219 (2.5901)	mem 20675MB
[2025-04-03 03:22:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][108/573]	eta 0:06:54 lr 0.000302	time 0.8774 (0.8920)	loss 0.5273 (0.5102)	grad_norm 2.0546 (2.5915)	mem 20675MB
[2025-04-03 03:22:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][110/573]	eta 0:06:52 lr 0.000302	time 0.8774 (0.8918)	loss 0.5218 (0.5099)	grad_norm 1.4370 (2.5953)	mem 20675MB
[2025-04-03 03:22:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][112/573]	eta 0:06:51 lr 0.000302	time 0.8776 (0.8915)	loss 0.5407 (0.5100)	grad_norm 3.0457 (2.5971)	mem 20675MB
[2025-04-03 03:22:39 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][114/573]	eta 0:06:49 lr 0.000301	time 0.8772 (0.8913)	loss 0.4980 (0.5089)	grad_norm 2.7350 (2.6134)	mem 20675MB
[2025-04-03 03:22:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][116/573]	eta 0:06:47 lr 0.000301	time 0.8775 (0.8911)	loss 0.4702 (0.5084)	grad_norm 2.2401 (2.6229)	mem 20675MB
[2025-04-03 03:22:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][118/573]	eta 0:06:45 lr 0.000301	time 0.8776 (0.8909)	loss 0.3160 (0.5069)	grad_norm 3.4765 (2.6261)	mem 20675MB
[2025-04-03 03:22:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][120/573]	eta 0:06:43 lr 0.000301	time 0.8771 (0.8907)	loss 0.3787 (0.5060)	grad_norm 2.6462 (2.6265)	mem 20675MB
[2025-04-03 03:22:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][122/573]	eta 0:06:41 lr 0.000301	time 0.8772 (0.8905)	loss 0.3758 (0.5040)	grad_norm 4.1698 (2.6651)	mem 20675MB
[2025-04-03 03:22:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][124/573]	eta 0:06:39 lr 0.000301	time 0.8773 (0.8903)	loss 0.5179 (0.5045)	grad_norm 2.0726 (2.6621)	mem 20675MB
[2025-04-03 03:22:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][126/573]	eta 0:06:37 lr 0.000300	time 0.8786 (0.8901)	loss 0.5639 (0.5048)	grad_norm 2.3336 (2.6622)	mem 20675MB
[2025-04-03 03:22:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][128/573]	eta 0:06:36 lr 0.000300	time 0.8779 (0.8899)	loss 0.5471 (0.5045)	grad_norm 2.2177 (2.6572)	mem 20675MB
[2025-04-03 03:22:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][130/573]	eta 0:06:34 lr 0.000300	time 0.8778 (0.8898)	loss 0.4633 (0.5035)	grad_norm 1.7586 (2.6522)	mem 20675MB
[2025-04-03 03:22:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][132/573]	eta 0:06:32 lr 0.000300	time 0.8822 (0.8896)	loss 0.4420 (0.5024)	grad_norm 3.0584 (2.6719)	mem 20675MB
[2025-04-03 03:22:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][134/573]	eta 0:06:30 lr 0.000300	time 0.8864 (0.8896)	loss 0.3617 (0.5018)	grad_norm 2.7879 (2.6758)	mem 20675MB
[2025-04-03 03:22:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][136/573]	eta 0:06:28 lr 0.000299	time 0.8771 (0.8894)	loss 0.5024 (0.5025)	grad_norm 2.6791 (2.6755)	mem 20675MB
[2025-04-03 03:23:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][138/573]	eta 0:06:26 lr 0.000299	time 0.8774 (0.8893)	loss 0.5090 (0.5030)	grad_norm 2.3792 (2.6694)	mem 20675MB
[2025-04-03 03:23:02 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][140/573]	eta 0:06:25 lr 0.000299	time 0.8808 (0.8892)	loss 0.3577 (0.5017)	grad_norm 3.4409 (2.6741)	mem 20675MB
[2025-04-03 03:23:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][142/573]	eta 0:06:23 lr 0.000299	time 0.8782 (0.8891)	loss 0.5937 (0.5027)	grad_norm 2.4667 (2.6736)	mem 20675MB
[2025-04-03 03:23:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][144/573]	eta 0:06:21 lr 0.000299	time 0.8791 (0.8890)	loss 0.5185 (0.5023)	grad_norm 1.9944 (2.6655)	mem 20675MB
[2025-04-03 03:23:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][146/573]	eta 0:06:19 lr 0.000298	time 0.8779 (0.8889)	loss 0.4581 (0.5023)	grad_norm 3.6489 (2.6736)	mem 20675MB
[2025-04-03 03:23:09 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][148/573]	eta 0:06:17 lr 0.000298	time 0.8788 (0.8888)	loss 0.3755 (0.5019)	grad_norm 2.9405 (2.6685)	mem 20675MB
[2025-04-03 03:23:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][150/573]	eta 0:06:15 lr 0.000298	time 0.8881 (0.8887)	loss 0.6328 (0.5027)	grad_norm 3.5374 (2.6689)	mem 20675MB
[2025-04-03 03:23:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][152/573]	eta 0:06:14 lr 0.000298	time 0.8784 (0.8886)	loss 0.3671 (0.5017)	grad_norm 3.7421 (2.6746)	mem 20675MB
[2025-04-03 03:23:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][154/573]	eta 0:06:12 lr 0.000298	time 0.8784 (0.8885)	loss 0.4830 (0.5020)	grad_norm 2.9949 (2.6759)	mem 20675MB
[2025-04-03 03:23:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][156/573]	eta 0:06:10 lr 0.000297	time 0.8776 (0.8883)	loss 0.4961 (0.5014)	grad_norm 1.7423 (2.6696)	mem 20675MB
[2025-04-03 03:23:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][158/573]	eta 0:06:08 lr 0.000297	time 0.8779 (0.8882)	loss 0.4319 (0.5009)	grad_norm 2.9270 (2.6664)	mem 20675MB
[2025-04-03 03:23:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][160/573]	eta 0:06:06 lr 0.000297	time 0.8779 (0.8881)	loss 0.3670 (0.5004)	grad_norm 2.9424 (2.6663)	mem 20675MB
[2025-04-03 03:23:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][162/573]	eta 0:06:04 lr 0.000297	time 0.8774 (0.8880)	loss 0.3916 (0.4997)	grad_norm 3.9860 (2.6743)	mem 20675MB
[2025-04-03 03:23:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][164/573]	eta 0:06:03 lr 0.000297	time 0.8826 (0.8879)	loss 0.5860 (0.4994)	grad_norm 1.9190 (2.6693)	mem 20675MB
[2025-04-03 03:23:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][166/573]	eta 0:06:01 lr 0.000296	time 0.8772 (0.8878)	loss 0.4689 (0.4991)	grad_norm 2.3612 (2.6725)	mem 20675MB
[2025-04-03 03:23:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][168/573]	eta 0:05:59 lr 0.000296	time 0.8790 (0.8877)	loss 0.4703 (0.4989)	grad_norm 2.5003 (2.6728)	mem 20675MB
[2025-04-03 03:23:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][170/573]	eta 0:05:57 lr 0.000296	time 0.8782 (0.8876)	loss 0.5872 (0.4995)	grad_norm 2.1025 (2.6675)	mem 20675MB
[2025-04-03 03:23:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][172/573]	eta 0:05:55 lr 0.000296	time 0.8779 (0.8875)	loss 0.3701 (0.4994)	grad_norm 3.2826 (2.6696)	mem 20675MB
[2025-04-03 03:23:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][174/573]	eta 0:05:54 lr 0.000296	time 0.8783 (0.8874)	loss 0.5281 (0.4994)	grad_norm 2.3046 (2.6701)	mem 20675MB
[2025-04-03 03:23:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][176/573]	eta 0:05:52 lr 0.000295	time 0.8775 (0.8874)	loss 0.3600 (0.4993)	grad_norm 5.8972 (2.6904)	mem 20675MB
[2025-04-03 03:23:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][178/573]	eta 0:05:50 lr 0.000295	time 0.8811 (0.8873)	loss 0.5744 (0.5000)	grad_norm 2.3292 (2.6890)	mem 20675MB
[2025-04-03 03:23:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][180/573]	eta 0:05:48 lr 0.000295	time 0.8826 (0.8872)	loss 0.5389 (0.5007)	grad_norm 2.3734 (2.6893)	mem 20675MB
[2025-04-03 03:23:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][182/573]	eta 0:05:46 lr 0.000295	time 0.8777 (0.8871)	loss 0.5013 (0.5002)	grad_norm 2.2675 (2.6860)	mem 20675MB
[2025-04-03 03:23:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][184/573]	eta 0:05:45 lr 0.000295	time 0.8775 (0.8870)	loss 0.4082 (0.4999)	grad_norm 2.2352 (2.6788)	mem 20675MB
[2025-04-03 03:23:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][186/573]	eta 0:05:43 lr 0.000294	time 0.8772 (0.8869)	loss 0.4549 (0.4997)	grad_norm 2.1087 (2.6704)	mem 20675MB
[2025-04-03 03:23:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][188/573]	eta 0:05:41 lr 0.000294	time 0.8774 (0.8869)	loss 0.3483 (0.4987)	grad_norm 2.4730 (2.6852)	mem 20675MB
[2025-04-03 03:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][190/573]	eta 0:05:39 lr 0.000294	time 0.8866 (0.8868)	loss 0.3301 (0.4978)	grad_norm 3.5185 (2.6844)	mem 20675MB
[2025-04-03 03:23:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][192/573]	eta 0:05:37 lr 0.000294	time 0.8893 (0.8868)	loss 0.5163 (0.4981)	grad_norm 2.0534 (2.6799)	mem 20675MB
[2025-04-03 03:23:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][194/573]	eta 0:05:36 lr 0.000294	time 0.8777 (0.8867)	loss 0.5307 (0.4974)	grad_norm 2.9358 (2.6849)	mem 20675MB
[2025-04-03 03:23:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][196/573]	eta 0:05:34 lr 0.000294	time 0.8774 (0.8867)	loss 0.5805 (0.4977)	grad_norm 2.5777 (2.6833)	mem 20675MB
[2025-04-03 03:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][198/573]	eta 0:05:32 lr 0.000293	time 0.8790 (0.8866)	loss 0.5543 (0.4983)	grad_norm 2.1330 (2.6800)	mem 20675MB
[2025-04-03 03:23:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][200/573]	eta 0:05:30 lr 0.000293	time 0.8825 (0.8866)	loss 0.6306 (0.4986)	grad_norm 3.8041 (2.6874)	mem 20675MB
[2025-04-03 03:23:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][202/573]	eta 0:05:28 lr 0.000293	time 0.8789 (0.8865)	loss 0.5142 (0.4991)	grad_norm 3.7078 (2.6942)	mem 20675MB
[2025-04-03 03:23:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][204/573]	eta 0:05:27 lr 0.000293	time 0.8784 (0.8865)	loss 0.6382 (0.5002)	grad_norm 3.7320 (2.6997)	mem 20675MB
[2025-04-03 03:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][206/573]	eta 0:05:25 lr 0.000293	time 0.8827 (0.8868)	loss 0.4484 (0.4995)	grad_norm 2.6876 (2.7063)	mem 20675MB
[2025-04-03 03:24:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][208/573]	eta 0:05:23 lr 0.000292	time 0.8780 (0.8867)	loss 0.5065 (0.4989)	grad_norm 2.3616 (2.7053)	mem 20675MB
[2025-04-03 03:24:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][210/573]	eta 0:05:21 lr 0.000292	time 0.8878 (0.8867)	loss 0.5247 (0.4986)	grad_norm 2.2155 (2.7028)	mem 20675MB
[2025-04-03 03:24:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][212/573]	eta 0:05:20 lr 0.000292	time 0.8774 (0.8866)	loss 0.5934 (0.4991)	grad_norm 1.8904 (2.7038)	mem 20675MB
[2025-04-03 03:24:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][214/573]	eta 0:05:18 lr 0.000292	time 0.8784 (0.8865)	loss 0.5826 (0.4993)	grad_norm 2.1748 (2.7091)	mem 20675MB
[2025-04-03 03:24:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][216/573]	eta 0:05:16 lr 0.000292	time 0.8773 (0.8864)	loss 0.4979 (0.4997)	grad_norm 2.8652 (2.7112)	mem 20675MB
[2025-04-03 03:24:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][218/573]	eta 0:05:14 lr 0.000291	time 0.8779 (0.8864)	loss 0.6204 (0.5003)	grad_norm 2.1622 (2.7068)	mem 20675MB
[2025-04-03 03:24:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][220/573]	eta 0:05:12 lr 0.000291	time 0.8779 (0.8863)	loss 0.5107 (0.4999)	grad_norm 1.5777 (2.7042)	mem 20675MB
[2025-04-03 03:24:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][222/573]	eta 0:05:11 lr 0.000291	time 0.8964 (0.8864)	loss 0.3199 (0.4992)	grad_norm 2.8837 (2.7027)	mem 20675MB
[2025-04-03 03:24:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][224/573]	eta 0:05:09 lr 0.000291	time 0.8789 (0.8863)	loss 0.4192 (0.4988)	grad_norm 2.6208 (2.7039)	mem 20675MB
[2025-04-03 03:24:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][226/573]	eta 0:05:07 lr 0.000291	time 0.8791 (0.8862)	loss 0.5532 (0.4987)	grad_norm 2.2083 (2.7075)	mem 20675MB
[2025-04-03 03:24:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][228/573]	eta 0:05:05 lr 0.000290	time 0.8873 (0.8862)	loss 0.5424 (0.4990)	grad_norm 3.0419 (2.7045)	mem 20675MB
[2025-04-03 03:24:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][230/573]	eta 0:05:03 lr 0.000290	time 0.8890 (0.8862)	loss 0.4168 (0.4984)	grad_norm 2.2248 (2.7061)	mem 20675MB
[2025-04-03 03:24:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][232/573]	eta 0:05:02 lr 0.000290	time 0.8797 (0.8861)	loss 0.6424 (0.4990)	grad_norm 3.6937 (2.7107)	mem 20675MB
[2025-04-03 03:24:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][234/573]	eta 0:05:00 lr 0.000290	time 0.8894 (0.8862)	loss 0.4736 (0.4994)	grad_norm 3.0598 (2.7084)	mem 20675MB
[2025-04-03 03:24:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][236/573]	eta 0:04:58 lr 0.000290	time 0.8814 (0.8862)	loss 0.5460 (0.4996)	grad_norm 2.8710 (2.7078)	mem 20675MB
[2025-04-03 03:24:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][238/573]	eta 0:04:56 lr 0.000289	time 0.8781 (0.8862)	loss 0.5167 (0.4992)	grad_norm 2.6517 (2.7064)	mem 20675MB
[2025-04-03 03:24:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][240/573]	eta 0:04:55 lr 0.000289	time 0.8775 (0.8861)	loss 0.5884 (0.4994)	grad_norm 4.0701 (2.7095)	mem 20675MB
[2025-04-03 03:24:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][242/573]	eta 0:04:53 lr 0.000289	time 0.8774 (0.8860)	loss 0.4127 (0.4992)	grad_norm 3.2692 (2.7116)	mem 20675MB
[2025-04-03 03:24:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][244/573]	eta 0:04:51 lr 0.000289	time 0.8774 (0.8860)	loss 0.4042 (0.4987)	grad_norm 3.3617 (2.7114)	mem 20675MB
[2025-04-03 03:24:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][246/573]	eta 0:04:49 lr 0.000289	time 0.8775 (0.8860)	loss 0.5086 (0.4988)	grad_norm 2.6633 (2.7112)	mem 20675MB
[2025-04-03 03:24:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][248/573]	eta 0:04:47 lr 0.000288	time 0.8791 (0.8860)	loss 0.5955 (0.4992)	grad_norm 3.1481 (2.7139)	mem 20675MB
[2025-04-03 03:24:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][250/573]	eta 0:04:46 lr 0.000288	time 0.8778 (0.8859)	loss 0.4842 (0.4993)	grad_norm 1.7777 (2.7096)	mem 20675MB
[2025-04-03 03:24:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][252/573]	eta 0:04:44 lr 0.000288	time 0.8774 (0.8859)	loss 0.3501 (0.4986)	grad_norm 4.3092 (2.7168)	mem 20675MB
[2025-04-03 03:24:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][254/573]	eta 0:04:42 lr 0.000288	time 0.8783 (0.8858)	loss 0.4416 (0.4986)	grad_norm 3.3182 (2.7175)	mem 20675MB
[2025-04-03 03:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][256/573]	eta 0:04:40 lr 0.000288	time 0.8775 (0.8858)	loss 0.5045 (0.4979)	grad_norm 2.1366 (2.7172)	mem 20675MB
[2025-04-03 03:24:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][258/573]	eta 0:04:39 lr 0.000288	time 0.8780 (0.8857)	loss 0.4715 (0.4977)	grad_norm 2.0246 (2.7175)	mem 20675MB
[2025-04-03 03:24:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][260/573]	eta 0:04:37 lr 0.000287	time 0.8776 (0.8857)	loss 0.4875 (0.4972)	grad_norm 2.8726 (2.7188)	mem 20675MB
[2025-04-03 03:24:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][262/573]	eta 0:04:35 lr 0.000287	time 0.8776 (0.8856)	loss 0.6114 (0.4973)	grad_norm 3.1734 (2.7261)	mem 20675MB
[2025-04-03 03:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][264/573]	eta 0:04:33 lr 0.000287	time 0.8776 (0.8856)	loss 0.4510 (0.4968)	grad_norm 3.1605 (2.7266)	mem 20675MB
[2025-04-03 03:24:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][266/573]	eta 0:04:31 lr 0.000287	time 0.8777 (0.8855)	loss 0.4965 (0.4966)	grad_norm 3.9036 (2.7328)	mem 20675MB
[2025-04-03 03:24:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][268/573]	eta 0:04:30 lr 0.000287	time 0.8785 (0.8855)	loss 0.5424 (0.4969)	grad_norm 2.8293 (2.7366)	mem 20675MB
[2025-04-03 03:24:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][270/573]	eta 0:04:28 lr 0.000286	time 0.8772 (0.8855)	loss 0.4616 (0.4969)	grad_norm 3.3598 (2.7368)	mem 20675MB
[2025-04-03 03:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][272/573]	eta 0:04:26 lr 0.000286	time 0.8784 (0.8855)	loss 0.4994 (0.4966)	grad_norm 2.2942 (2.7350)	mem 20675MB
[2025-04-03 03:25:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][274/573]	eta 0:04:24 lr 0.000286	time 0.8776 (0.8854)	loss 0.5001 (0.4965)	grad_norm 2.3080 (2.7323)	mem 20675MB
[2025-04-03 03:25:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][276/573]	eta 0:04:22 lr 0.000286	time 0.8833 (0.8855)	loss 0.5739 (0.4971)	grad_norm 4.0048 (2.7363)	mem 20675MB
[2025-04-03 03:25:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][278/573]	eta 0:04:21 lr 0.000286	time 0.8795 (0.8855)	loss 0.4038 (0.4963)	grad_norm 3.1781 (2.7402)	mem 20675MB
[2025-04-03 03:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][280/573]	eta 0:04:19 lr 0.000285	time 0.8784 (0.8854)	loss 0.4052 (0.4964)	grad_norm 2.5151 (2.7380)	mem 20675MB
[2025-04-03 03:25:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][282/573]	eta 0:04:17 lr 0.000285	time 0.8808 (0.8854)	loss 0.4753 (0.4963)	grad_norm 3.4914 (2.7419)	mem 20675MB
[2025-04-03 03:25:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][284/573]	eta 0:04:15 lr 0.000285	time 0.8775 (0.8853)	loss 0.5755 (0.4966)	grad_norm 2.1817 (2.7396)	mem 20675MB
[2025-04-03 03:25:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][286/573]	eta 0:04:14 lr 0.000285	time 0.8780 (0.8853)	loss 0.3607 (0.4959)	grad_norm 3.3481 (2.7439)	mem 20675MB
[2025-04-03 03:25:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][288/573]	eta 0:04:12 lr 0.000285	time 0.8803 (0.8853)	loss 0.4852 (0.4961)	grad_norm 3.4104 (2.7439)	mem 20675MB
[2025-04-03 03:25:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][290/573]	eta 0:04:10 lr 0.000284	time 0.8789 (0.8852)	loss 0.4920 (0.4956)	grad_norm 1.5690 (2.7437)	mem 20675MB
[2025-04-03 03:25:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][292/573]	eta 0:04:08 lr 0.000284	time 0.8780 (0.8852)	loss 0.4101 (0.4954)	grad_norm 3.7450 (2.7469)	mem 20675MB
[2025-04-03 03:25:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][294/573]	eta 0:04:06 lr 0.000284	time 0.8774 (0.8852)	loss 0.4031 (0.4955)	grad_norm 2.7556 (2.7456)	mem 20675MB
[2025-04-03 03:25:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][296/573]	eta 0:04:05 lr 0.000284	time 0.8777 (0.8851)	loss 0.4313 (0.4956)	grad_norm 2.7932 (2.7450)	mem 20675MB
[2025-04-03 03:25:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][298/573]	eta 0:04:03 lr 0.000284	time 0.8794 (0.8851)	loss 0.5959 (0.4953)	grad_norm 2.5433 (2.7614)	mem 20675MB
[2025-04-03 03:25:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][300/573]	eta 0:04:01 lr 0.000284	time 0.8790 (0.8851)	loss 0.5064 (0.4956)	grad_norm 2.5219 (2.7601)	mem 20675MB
[2025-04-03 03:25:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][302/573]	eta 0:03:59 lr 0.000283	time 0.8774 (0.8850)	loss 0.4733 (0.4958)	grad_norm 2.6839 (2.7585)	mem 20675MB
[2025-04-03 03:25:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][304/573]	eta 0:03:58 lr 0.000283	time 0.8841 (0.8850)	loss 0.5412 (0.4961)	grad_norm 2.4563 (2.7565)	mem 20675MB
[2025-04-03 03:25:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][306/573]	eta 0:03:56 lr 0.000283	time 0.8775 (0.8850)	loss 0.5274 (0.4965)	grad_norm 2.8920 (2.7560)	mem 20675MB
[2025-04-03 03:25:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][308/573]	eta 0:03:54 lr 0.000283	time 0.8773 (0.8850)	loss 0.4091 (0.4960)	grad_norm 4.4547 (2.7610)	mem 20675MB
[2025-04-03 03:25:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][310/573]	eta 0:03:52 lr 0.000283	time 0.8907 (0.8850)	loss 0.4506 (0.4958)	grad_norm 2.3336 (2.7572)	mem 20675MB
[2025-04-03 03:25:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][312/573]	eta 0:03:50 lr 0.000282	time 0.8779 (0.8850)	loss 0.4065 (0.4955)	grad_norm 8.1283 (2.7716)	mem 20675MB
[2025-04-03 03:25:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][314/573]	eta 0:03:49 lr 0.000282	time 0.8796 (0.8850)	loss 0.5843 (0.4959)	grad_norm 1.6381 (2.7668)	mem 20675MB
[2025-04-03 03:25:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][316/573]	eta 0:03:47 lr 0.000282	time 0.8853 (0.8850)	loss 0.4858 (0.4961)	grad_norm 3.2256 (2.7677)	mem 20675MB
[2025-04-03 03:25:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][318/573]	eta 0:03:45 lr 0.000282	time 0.8775 (0.8849)	loss 0.5425 (0.4962)	grad_norm 1.3347 (2.7613)	mem 20675MB
[2025-04-03 03:25:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][320/573]	eta 0:03:43 lr 0.000282	time 0.8778 (0.8849)	loss 0.4864 (0.4962)	grad_norm 2.4181 (2.7602)	mem 20675MB
[2025-04-03 03:25:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][322/573]	eta 0:03:42 lr 0.000281	time 0.8778 (0.8849)	loss 0.5083 (0.4963)	grad_norm 2.5419 (2.7584)	mem 20675MB
[2025-04-03 03:25:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][324/573]	eta 0:03:40 lr 0.000281	time 0.8777 (0.8848)	loss 0.5249 (0.4964)	grad_norm 1.7524 (2.7540)	mem 20675MB
[2025-04-03 03:25:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][326/573]	eta 0:03:38 lr 0.000281	time 0.8778 (0.8848)	loss 0.5410 (0.4965)	grad_norm 2.1108 (2.7507)	mem 20675MB
[2025-04-03 03:25:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][328/573]	eta 0:03:36 lr 0.000281	time 0.8822 (0.8848)	loss 0.4584 (0.4962)	grad_norm 2.3217 (2.7464)	mem 20675MB
[2025-04-03 03:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][330/573]	eta 0:03:35 lr 0.000281	time 0.8778 (0.8848)	loss 0.5529 (0.4961)	grad_norm 1.7615 (2.7439)	mem 20675MB
[2025-04-03 03:25:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][332/573]	eta 0:03:33 lr 0.000280	time 0.8778 (0.8848)	loss 0.4785 (0.4958)	grad_norm 2.9186 (2.7438)	mem 20675MB
[2025-04-03 03:25:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][334/573]	eta 0:03:31 lr 0.000280	time 0.8783 (0.8848)	loss 0.4945 (0.4955)	grad_norm 2.9932 (2.7484)	mem 20675MB
[2025-04-03 03:25:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][336/573]	eta 0:03:29 lr 0.000280	time 0.8783 (0.8847)	loss 0.5751 (0.4957)	grad_norm 3.0034 (2.7502)	mem 20675MB
[2025-04-03 03:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][338/573]	eta 0:03:27 lr 0.000280	time 0.8780 (0.8847)	loss 0.4054 (0.4956)	grad_norm 2.8703 (2.7503)	mem 20675MB
[2025-04-03 03:25:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][340/573]	eta 0:03:26 lr 0.000280	time 0.8784 (0.8847)	loss 0.5433 (0.4957)	grad_norm 3.6766 (2.7505)	mem 20675MB
[2025-04-03 03:26:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][342/573]	eta 0:03:24 lr 0.000279	time 0.8788 (0.8847)	loss 0.5157 (0.4960)	grad_norm 1.8640 (2.7474)	mem 20675MB
[2025-04-03 03:26:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][344/573]	eta 0:03:22 lr 0.000279	time 0.8777 (0.8846)	loss 0.6070 (0.4961)	grad_norm 3.0877 (2.7481)	mem 20675MB
[2025-04-03 03:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][346/573]	eta 0:03:20 lr 0.000279	time 0.8787 (0.8846)	loss 0.4899 (0.4962)	grad_norm 1.9041 (2.7442)	mem 20675MB
[2025-04-03 03:26:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][348/573]	eta 0:03:19 lr 0.000279	time 0.9096 (0.8847)	loss 0.5473 (0.4962)	grad_norm 2.4842 (2.7448)	mem 20675MB
[2025-04-03 03:26:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][350/573]	eta 0:03:17 lr 0.000279	time 0.8841 (0.8847)	loss 0.5663 (0.4966)	grad_norm 1.7320 (2.7399)	mem 20675MB
[2025-04-03 03:26:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][352/573]	eta 0:03:15 lr 0.000279	time 0.8779 (0.8846)	loss 0.5351 (0.4965)	grad_norm 2.1819 (2.7379)	mem 20675MB
[2025-04-03 03:26:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][354/573]	eta 0:03:13 lr 0.000278	time 0.8777 (0.8846)	loss 0.5053 (0.4968)	grad_norm 2.0077 (2.7344)	mem 20675MB
[2025-04-03 03:26:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][356/573]	eta 0:03:11 lr 0.000278	time 0.8776 (0.8846)	loss 0.5473 (0.4970)	grad_norm 2.2549 (2.7322)	mem 20675MB
[2025-04-03 03:26:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][358/573]	eta 0:03:10 lr 0.000278	time 0.8787 (0.8846)	loss 0.5302 (0.4966)	grad_norm 1.6434 (2.7330)	mem 20675MB
[2025-04-03 03:26:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][360/573]	eta 0:03:08 lr 0.000278	time 0.8778 (0.8846)	loss 0.5542 (0.4964)	grad_norm 1.6887 (2.7326)	mem 20675MB
[2025-04-03 03:26:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][362/573]	eta 0:03:06 lr 0.000278	time 0.8832 (0.8845)	loss 0.4499 (0.4966)	grad_norm 2.5456 (2.7333)	mem 20675MB
[2025-04-03 03:26:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][364/573]	eta 0:03:04 lr 0.000277	time 0.8800 (0.8845)	loss 0.3907 (0.4966)	grad_norm 3.4763 (2.7342)	mem 20675MB
[2025-04-03 03:26:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][366/573]	eta 0:03:03 lr 0.000277	time 0.8782 (0.8845)	loss 0.5549 (0.4965)	grad_norm 2.4433 (2.7342)	mem 20675MB
[2025-04-03 03:26:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][368/573]	eta 0:03:01 lr 0.000277	time 0.8780 (0.8845)	loss 0.4487 (0.4964)	grad_norm 3.9959 (2.7372)	mem 20675MB
[2025-04-03 03:26:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][370/573]	eta 0:02:59 lr 0.000277	time 0.8774 (0.8844)	loss 0.5683 (0.4964)	grad_norm 2.0280 (2.7358)	mem 20675MB
[2025-04-03 03:26:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][372/573]	eta 0:02:57 lr 0.000277	time 0.8774 (0.8844)	loss 0.3659 (0.4963)	grad_norm 3.1957 (2.7361)	mem 20675MB
[2025-04-03 03:26:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][374/573]	eta 0:02:55 lr 0.000276	time 0.8776 (0.8844)	loss 0.5435 (0.4964)	grad_norm 2.5211 (2.7362)	mem 20675MB
[2025-04-03 03:26:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][376/573]	eta 0:02:54 lr 0.000276	time 0.8833 (0.8843)	loss 0.4172 (0.4963)	grad_norm 3.1557 (2.7363)	mem 20675MB
[2025-04-03 03:26:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][378/573]	eta 0:02:52 lr 0.000276	time 0.8776 (0.8843)	loss 0.5424 (0.4966)	grad_norm 2.1355 (2.7343)	mem 20675MB
[2025-04-03 03:26:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][380/573]	eta 0:02:50 lr 0.000276	time 0.8795 (0.8843)	loss 0.5060 (0.4967)	grad_norm 2.6080 (2.7357)	mem 20675MB
[2025-04-03 03:26:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][382/573]	eta 0:02:48 lr 0.000276	time 0.8783 (0.8843)	loss 0.5494 (0.4969)	grad_norm 2.0392 (2.7326)	mem 20675MB
[2025-04-03 03:26:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][384/573]	eta 0:02:47 lr 0.000276	time 0.8777 (0.8842)	loss 0.5854 (0.4970)	grad_norm 2.1429 (2.7324)	mem 20675MB
[2025-04-03 03:26:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][386/573]	eta 0:02:45 lr 0.000275	time 0.8780 (0.8842)	loss 0.4778 (0.4970)	grad_norm 3.0560 (2.7307)	mem 20675MB
[2025-04-03 03:26:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][388/573]	eta 0:02:43 lr 0.000275	time 0.8905 (0.8842)	loss 0.5528 (0.4972)	grad_norm 2.6919 (2.7338)	mem 20675MB
[2025-04-03 03:26:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][390/573]	eta 0:02:41 lr 0.000275	time 0.8780 (0.8842)	loss 0.5822 (0.4976)	grad_norm 2.6908 (2.7337)	mem 20675MB
[2025-04-03 03:26:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][392/573]	eta 0:02:40 lr 0.000275	time 0.8783 (0.8842)	loss 0.5543 (0.4979)	grad_norm 2.3975 (2.7306)	mem 20675MB
[2025-04-03 03:26:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][394/573]	eta 0:02:38 lr 0.000275	time 0.8829 (0.8842)	loss 0.5438 (0.4977)	grad_norm 2.1792 (2.7293)	mem 20675MB
[2025-04-03 03:26:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][396/573]	eta 0:02:36 lr 0.000274	time 0.8803 (0.8841)	loss 0.3692 (0.4970)	grad_norm 2.5310 (2.7283)	mem 20675MB
[2025-04-03 03:26:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][398/573]	eta 0:02:34 lr 0.000274	time 0.8845 (0.8841)	loss 0.3528 (0.4966)	grad_norm 3.8791 (2.7317)	mem 20675MB
[2025-04-03 03:26:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][400/573]	eta 0:02:32 lr 0.000274	time 0.8788 (0.8841)	loss 0.5943 (0.4966)	grad_norm 2.1716 (2.7299)	mem 20675MB
[2025-04-03 03:26:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][402/573]	eta 0:02:31 lr 0.000274	time 0.8781 (0.8841)	loss 0.4965 (0.4965)	grad_norm 2.1325 (2.7311)	mem 20675MB
[2025-04-03 03:26:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][404/573]	eta 0:02:29 lr 0.000274	time 0.8912 (0.8841)	loss 0.4890 (0.4964)	grad_norm 1.9248 (2.7298)	mem 20675MB
[2025-04-03 03:26:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][406/573]	eta 0:02:27 lr 0.000273	time 0.8790 (0.8841)	loss 0.4103 (0.4965)	grad_norm 2.7648 (2.7291)	mem 20675MB
[2025-04-03 03:26:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][408/573]	eta 0:02:25 lr 0.000273	time 0.8851 (0.8841)	loss 0.5827 (0.4963)	grad_norm 2.9647 (2.7321)	mem 20675MB
[2025-04-03 03:26:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][410/573]	eta 0:02:24 lr 0.000273	time 0.8793 (0.8841)	loss 0.6003 (0.4968)	grad_norm 2.3055 (2.7312)	mem 20675MB
[2025-04-03 03:27:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][412/573]	eta 0:02:22 lr 0.000273	time 0.8840 (0.8841)	loss 0.4197 (0.4965)	grad_norm 2.0683 (2.7331)	mem 20675MB
[2025-04-03 03:27:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][414/573]	eta 0:02:20 lr 0.000273	time 0.8777 (0.8841)	loss 0.5674 (0.4967)	grad_norm 2.8295 (2.7328)	mem 20675MB
[2025-04-03 03:27:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][416/573]	eta 0:02:18 lr 0.000272	time 0.8776 (0.8841)	loss 0.4200 (0.4966)	grad_norm 2.5848 (2.7300)	mem 20675MB
[2025-04-03 03:27:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][418/573]	eta 0:02:17 lr 0.000272	time 0.8785 (0.8840)	loss 0.5804 (0.4965)	grad_norm 1.9742 (2.7297)	mem 20675MB
[2025-04-03 03:27:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][420/573]	eta 0:02:15 lr 0.000272	time 0.8921 (0.8841)	loss 0.5900 (0.4968)	grad_norm 2.4638 (2.7284)	mem 20675MB
[2025-04-03 03:27:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][422/573]	eta 0:02:13 lr 0.000272	time 0.8835 (0.8841)	loss 0.5449 (0.4972)	grad_norm 1.5831 (2.7263)	mem 20675MB
[2025-04-03 03:27:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][424/573]	eta 0:02:11 lr 0.000272	time 0.8926 (0.8841)	loss 0.3910 (0.4966)	grad_norm 3.3403 (2.7286)	mem 20675MB
[2025-04-03 03:27:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][426/573]	eta 0:02:09 lr 0.000272	time 0.8778 (0.8841)	loss 0.3648 (0.4961)	grad_norm 3.1248 (2.7284)	mem 20675MB
[2025-04-03 03:27:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][428/573]	eta 0:02:08 lr 0.000271	time 0.8833 (0.8841)	loss 0.3684 (0.4957)	grad_norm 3.2791 (2.7299)	mem 20675MB
[2025-04-03 03:27:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][430/573]	eta 0:02:06 lr 0.000271	time 0.8898 (0.8841)	loss 0.4660 (0.4958)	grad_norm 2.4198 (2.7290)	mem 20675MB
[2025-04-03 03:27:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][432/573]	eta 0:02:04 lr 0.000271	time 0.8772 (0.8841)	loss 0.4609 (0.4959)	grad_norm 2.0523 (2.7276)	mem 20675MB
[2025-04-03 03:27:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][434/573]	eta 0:02:02 lr 0.000271	time 0.8783 (0.8841)	loss 0.5334 (0.4962)	grad_norm 2.4355 (2.7265)	mem 20675MB
[2025-04-03 03:27:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][436/573]	eta 0:02:01 lr 0.000271	time 0.8777 (0.8841)	loss 0.5273 (0.4963)	grad_norm 2.1106 (2.7238)	mem 20675MB
[2025-04-03 03:27:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][438/573]	eta 0:01:59 lr 0.000270	time 0.8837 (0.8841)	loss 0.5428 (0.4966)	grad_norm 2.2836 (2.7224)	mem 20675MB
[2025-04-03 03:27:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][440/573]	eta 0:01:57 lr 0.000270	time 0.8782 (0.8841)	loss 0.4782 (0.4968)	grad_norm 2.3379 (2.7227)	mem 20675MB
[2025-04-03 03:27:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][442/573]	eta 0:01:55 lr 0.000270	time 0.8808 (0.8841)	loss 0.5711 (0.4969)	grad_norm 1.8349 (2.7190)	mem 20675MB
[2025-04-03 03:27:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][444/573]	eta 0:01:54 lr 0.000270	time 0.8777 (0.8841)	loss 0.4618 (0.4969)	grad_norm 3.3063 (2.7202)	mem 20675MB
[2025-04-03 03:27:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][446/573]	eta 0:01:52 lr 0.000270	time 0.8879 (0.8841)	loss 0.3455 (0.4965)	grad_norm 2.3926 (2.7192)	mem 20675MB
[2025-04-03 03:27:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][448/573]	eta 0:01:50 lr 0.000269	time 0.8780 (0.8840)	loss 0.5467 (0.4967)	grad_norm 3.3034 (2.7225)	mem 20675MB
[2025-04-03 03:27:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][450/573]	eta 0:01:48 lr 0.000269	time 0.8792 (0.8840)	loss 0.5072 (0.4967)	grad_norm 2.5081 (2.7218)	mem 20675MB
[2025-04-03 03:27:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][452/573]	eta 0:01:46 lr 0.000269	time 0.8785 (0.8840)	loss 0.5176 (0.4964)	grad_norm 2.4051 (2.7213)	mem 20675MB
[2025-04-03 03:27:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][454/573]	eta 0:01:45 lr 0.000269	time 0.8776 (0.8840)	loss 0.4643 (0.4962)	grad_norm 2.9848 (2.7210)	mem 20675MB
[2025-04-03 03:27:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][456/573]	eta 0:01:43 lr 0.000269	time 0.8781 (0.8840)	loss 0.4939 (0.4963)	grad_norm 2.5348 (2.7235)	mem 20675MB
[2025-04-03 03:27:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][458/573]	eta 0:01:41 lr 0.000269	time 0.8787 (0.8839)	loss 0.5785 (0.4965)	grad_norm 3.1139 (2.7248)	mem 20675MB
[2025-04-03 03:27:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][460/573]	eta 0:01:39 lr 0.000268	time 0.8779 (0.8839)	loss 0.4413 (0.4963)	grad_norm 2.7699 (2.7261)	mem 20675MB
[2025-04-03 03:27:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][462/573]	eta 0:01:38 lr 0.000268	time 0.8777 (0.8839)	loss 0.4619 (0.4963)	grad_norm 2.6564 (2.7256)	mem 20675MB
[2025-04-03 03:27:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][464/573]	eta 0:01:36 lr 0.000268	time 0.8909 (0.8839)	loss 0.4763 (0.4963)	grad_norm 3.5762 (2.7285)	mem 20675MB
[2025-04-03 03:27:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][466/573]	eta 0:01:34 lr 0.000268	time 0.8775 (0.8839)	loss 0.5889 (0.4967)	grad_norm 3.8572 (2.7309)	mem 20675MB
[2025-04-03 03:27:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][468/573]	eta 0:01:32 lr 0.000268	time 0.8793 (0.8839)	loss 0.5547 (0.4969)	grad_norm 2.9801 (2.7301)	mem 20675MB
[2025-04-03 03:27:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][470/573]	eta 0:01:31 lr 0.000267	time 0.8896 (0.8839)	loss 0.3778 (0.4964)	grad_norm 4.7135 (2.7414)	mem 20675MB
[2025-04-03 03:27:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][472/573]	eta 0:01:29 lr 0.000267	time 0.8777 (0.8839)	loss 0.4994 (0.4966)	grad_norm 3.0758 (2.7412)	mem 20675MB
[2025-04-03 03:27:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][474/573]	eta 0:01:27 lr 0.000267	time 0.8776 (0.8839)	loss 0.4640 (0.4965)	grad_norm 2.7812 (2.7427)	mem 20675MB
[2025-04-03 03:27:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][476/573]	eta 0:01:25 lr 0.000267	time 0.8772 (0.8839)	loss 0.4056 (0.4961)	grad_norm 3.1608 (2.7421)	mem 20675MB
[2025-04-03 03:27:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][478/573]	eta 0:01:23 lr 0.000267	time 0.8775 (0.8838)	loss 0.5412 (0.4964)	grad_norm 1.8399 (2.7392)	mem 20675MB
[2025-04-03 03:28:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][480/573]	eta 0:01:22 lr 0.000266	time 0.8797 (0.8838)	loss 0.5058 (0.4963)	grad_norm 3.2670 (2.7400)	mem 20675MB
[2025-04-03 03:28:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][482/573]	eta 0:01:20 lr 0.000266	time 0.8781 (0.8838)	loss 0.5739 (0.4966)	grad_norm 3.3890 (2.7409)	mem 20675MB
[2025-04-03 03:28:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][484/573]	eta 0:01:18 lr 0.000266	time 0.8777 (0.8838)	loss 0.3360 (0.4963)	grad_norm 2.8085 (2.7400)	mem 20675MB
[2025-04-03 03:28:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][486/573]	eta 0:01:16 lr 0.000266	time 0.8775 (0.8838)	loss 0.5559 (0.4965)	grad_norm 2.3814 (2.7381)	mem 20675MB
[2025-04-03 03:28:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][488/573]	eta 0:01:15 lr 0.000266	time 0.8777 (0.8838)	loss 0.5881 (0.4968)	grad_norm 2.3810 (2.7362)	mem 20675MB
[2025-04-03 03:28:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][490/573]	eta 0:01:13 lr 0.000266	time 0.8773 (0.8838)	loss 0.4319 (0.4967)	grad_norm 3.7604 (2.7374)	mem 20675MB
[2025-04-03 03:28:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][492/573]	eta 0:01:11 lr 0.000265	time 0.8904 (0.8838)	loss 0.4533 (0.4967)	grad_norm 2.3939 (2.7350)	mem 20675MB
[2025-04-03 03:28:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][494/573]	eta 0:01:09 lr 0.000265	time 0.8868 (0.8838)	loss 0.4981 (0.4967)	grad_norm 2.2614 (2.7336)	mem 20675MB
[2025-04-03 03:28:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][496/573]	eta 0:01:08 lr 0.000265	time 0.8836 (0.8838)	loss 0.3609 (0.4962)	grad_norm 3.3230 (2.7350)	mem 20675MB
[2025-04-03 03:28:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][498/573]	eta 0:01:06 lr 0.000265	time 0.8781 (0.8838)	loss 0.4265 (0.4961)	grad_norm 3.6790 (2.7348)	mem 20675MB
[2025-04-03 03:28:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][500/573]	eta 0:01:04 lr 0.000265	time 0.8776 (0.8837)	loss 0.5123 (0.4964)	grad_norm 2.6051 (2.7347)	mem 20675MB
[2025-04-03 03:28:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][502/573]	eta 0:01:02 lr 0.000264	time 0.9023 (0.8838)	loss 0.3878 (0.4962)	grad_norm 2.8698 (2.7332)	mem 20675MB
[2025-04-03 03:28:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][504/573]	eta 0:01:00 lr 0.000264	time 0.8794 (0.8838)	loss 0.6116 (0.4962)	grad_norm 2.1631 (2.7337)	mem 20675MB
[2025-04-03 03:28:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][506/573]	eta 0:00:59 lr 0.000264	time 0.8774 (0.8838)	loss 0.5613 (0.4963)	grad_norm 2.7651 (2.7329)	mem 20675MB
[2025-04-03 03:28:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][508/573]	eta 0:00:57 lr 0.000264	time 0.8782 (0.8837)	loss 0.5576 (0.4962)	grad_norm 2.2084 (2.7312)	mem 20675MB
[2025-04-03 03:28:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][510/573]	eta 0:00:55 lr 0.000264	time 0.8774 (0.8837)	loss 0.4610 (0.4960)	grad_norm 2.7195 (2.7354)	mem 20675MB
[2025-04-03 03:28:29 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][512/573]	eta 0:00:53 lr 0.000263	time 0.8779 (0.8837)	loss 0.6165 (0.4963)	grad_norm 3.6656 (2.7364)	mem 20675MB
[2025-04-03 03:28:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][514/573]	eta 0:00:52 lr 0.000263	time 0.8779 (0.8837)	loss 0.4951 (0.4966)	grad_norm 1.8208 (2.7363)	mem 20675MB
[2025-04-03 03:28:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][516/573]	eta 0:00:50 lr 0.000263	time 0.8775 (0.8837)	loss 0.3637 (0.4964)	grad_norm 3.1443 (2.7366)	mem 20675MB
[2025-04-03 03:28:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][518/573]	eta 0:00:48 lr 0.000263	time 0.8777 (0.8837)	loss 0.4278 (0.4961)	grad_norm 2.8614 (2.7371)	mem 20675MB
[2025-04-03 03:28:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][520/573]	eta 0:00:46 lr 0.000263	time 0.8788 (0.8837)	loss 0.5149 (0.4963)	grad_norm 2.2491 (2.7359)	mem 20675MB
[2025-04-03 03:28:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][522/573]	eta 0:00:45 lr 0.000263	time 0.8778 (0.8837)	loss 0.4759 (0.4962)	grad_norm 2.1030 (2.7350)	mem 20675MB
[2025-04-03 03:28:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][524/573]	eta 0:00:43 lr 0.000262	time 0.8780 (0.8836)	loss 0.5049 (0.4960)	grad_norm 3.7710 (2.7379)	mem 20675MB
[2025-04-03 03:28:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][526/573]	eta 0:00:41 lr 0.000262	time 0.8776 (0.8836)	loss 0.4717 (0.4957)	grad_norm 2.5810 (2.7395)	mem 20675MB
[2025-04-03 03:28:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][528/573]	eta 0:00:39 lr 0.000262	time 0.8847 (0.8836)	loss 0.5651 (0.4960)	grad_norm 2.6144 (2.7382)	mem 20675MB
[2025-04-03 03:28:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][530/573]	eta 0:00:37 lr 0.000262	time 0.8778 (0.8836)	loss 0.3829 (0.4955)	grad_norm 3.7883 (2.7417)	mem 20675MB
[2025-04-03 03:28:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][532/573]	eta 0:00:36 lr 0.000262	time 0.8776 (0.8836)	loss 0.5807 (0.4956)	grad_norm 2.4175 (2.7438)	mem 20675MB
[2025-04-03 03:28:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][534/573]	eta 0:00:34 lr 0.000261	time 0.8821 (0.8836)	loss 0.5758 (0.4958)	grad_norm 1.9297 (2.7427)	mem 20675MB
[2025-04-03 03:28:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][536/573]	eta 0:00:32 lr 0.000261	time 0.8779 (0.8836)	loss 0.5791 (0.4959)	grad_norm 2.4579 (2.7437)	mem 20675MB
[2025-04-03 03:28:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][538/573]	eta 0:00:30 lr 0.000261	time 0.8777 (0.8836)	loss 0.4018 (0.4958)	grad_norm 3.9434 (2.7449)	mem 20675MB
[2025-04-03 03:28:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][540/573]	eta 0:00:29 lr 0.000261	time 0.8775 (0.8836)	loss 0.5238 (0.4960)	grad_norm 2.0779 (2.7433)	mem 20675MB
[2025-04-03 03:28:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][542/573]	eta 0:00:27 lr 0.000261	time 0.8780 (0.8835)	loss 0.5338 (0.4962)	grad_norm 3.0216 (2.7419)	mem 20675MB
[2025-04-03 03:28:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][544/573]	eta 0:00:25 lr 0.000261	time 0.8781 (0.8835)	loss 0.5654 (0.4964)	grad_norm 2.5496 (2.7411)	mem 20675MB
[2025-04-03 03:28:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][546/573]	eta 0:00:23 lr 0.000260	time 0.8779 (0.8835)	loss 0.3915 (0.4963)	grad_norm 2.8270 (2.7397)	mem 20675MB
[2025-04-03 03:29:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][548/573]	eta 0:00:22 lr 0.000260	time 0.8861 (0.8835)	loss 0.5475 (0.4964)	grad_norm 2.5933 (2.7392)	mem 20675MB
[2025-04-03 03:29:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][550/573]	eta 0:00:20 lr 0.000260	time 0.8788 (0.8835)	loss 0.4756 (0.4963)	grad_norm 3.0295 (2.7401)	mem 20675MB
[2025-04-03 03:29:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][552/573]	eta 0:00:18 lr 0.000260	time 0.8776 (0.8835)	loss 0.4781 (0.4964)	grad_norm 2.5532 (2.7394)	mem 20675MB
[2025-04-03 03:29:06 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][554/573]	eta 0:00:16 lr 0.000260	time 0.8781 (0.8835)	loss 0.5367 (0.4963)	grad_norm 2.1213 (2.7386)	mem 20675MB
[2025-04-03 03:29:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][556/573]	eta 0:00:15 lr 0.000259	time 0.8828 (0.8835)	loss 0.5720 (0.4964)	grad_norm 2.3629 (2.7366)	mem 20675MB
[2025-04-03 03:29:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][558/573]	eta 0:00:13 lr 0.000259	time 0.8777 (0.8835)	loss 0.5833 (0.4963)	grad_norm 3.9866 (2.7415)	mem 20675MB
[2025-04-03 03:29:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][560/573]	eta 0:00:11 lr 0.000259	time 0.8786 (0.8834)	loss 0.3233 (0.4959)	grad_norm 3.1566 (2.7418)	mem 20675MB
[2025-04-03 03:29:13 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][562/573]	eta 0:00:09 lr 0.000259	time 0.8778 (0.8834)	loss 0.6216 (0.4961)	grad_norm 3.8036 (2.7445)	mem 20675MB
[2025-04-03 03:29:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][564/573]	eta 0:00:07 lr 0.000259	time 0.8776 (0.8834)	loss 0.3935 (0.4961)	grad_norm 4.4752 (2.7472)	mem 20675MB
[2025-04-03 03:29:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][566/573]	eta 0:00:06 lr 0.000258	time 0.8775 (0.8834)	loss 0.5700 (0.4964)	grad_norm 2.0016 (2.7453)	mem 20675MB
[2025-04-03 03:29:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][568/573]	eta 0:00:04 lr 0.000258	time 0.8775 (0.8834)	loss 0.5577 (0.4963)	grad_norm 2.5371 (2.7446)	mem 20675MB
[2025-04-03 03:29:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][570/573]	eta 0:00:02 lr 0.000258	time 0.8774 (0.8834)	loss 0.5228 (0.4962)	grad_norm 3.1855 (2.7464)	mem 20675MB
[2025-04-03 03:29:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][572/573]	eta 0:00:00 lr 0.000258	time 0.8775 (0.8834)	loss 0.5513 (0.4963)	grad_norm 3.0701 (2.7456)	mem 20675MB
[2025-04-03 03:29:22 simmim_finetune] (main_finetune.py 260): INFO EPOCH 20 training takes 0:08:26
[2025-04-03 03:29:22 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt20.pth saving......
[2025-04-03 03:29:27 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt20.pth saved !!!
[2025-04-03 03:29:31 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.583 (3.583)	Loss 0.5281 (0.5281)	Acc@1 69.531 (69.531)	Mem 20675MB
[2025-04-03 03:29:31 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.384)	Loss 0.4739 (0.4852)	Acc@1 74.219 (73.438)	Mem 20675MB
[2025-04-03 03:29:32 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.945)	Loss 0.5210 (0.4838)	Acc@1 72.656 (74.375)	Mem 20675MB
[2025-04-03 03:29:32 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.756)	Loss 0.4351 (0.4692)	Acc@1 78.906 (75.670)	Mem 20675MB
[2025-04-03 03:29:33 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.651)	Loss 0.4968 (0.4646)	Acc@1 72.656 (76.302)	Mem 20675MB
[2025-04-03 03:29:33 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.585)	Loss 0.4760 (0.4722)	Acc@1 82.031 (76.562)	Mem 20675MB
[2025-04-03 03:29:34 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.538)	Loss 0.4718 (0.4711)	Acc@1 77.344 (76.923)	Mem 20675MB
[2025-04-03 03:29:35 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.504)	Loss 0.4303 (0.4663)	Acc@1 79.688 (77.292)	Mem 20675MB
[2025-04-03 03:29:35 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.319
[2025-04-03 03:29:35 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.3%
[2025-04-03 03:29:35 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.67%
[2025-04-03 03:29:35 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.1513230953384268e-06, 1.1513230953384268e-06, 1.6644060916478973e-06, 1.6644060916478973e-06, 2.4537645475086216e-06, 2.4537645475086216e-06, 3.6681621719097354e-06, 3.6681621719097354e-06, 5.53646620944991e-06, 5.53646620944991e-06, 8.410780113357872e-06, 8.410780113357872e-06, 1.2832801503985503e-05, 1.2832801503985503e-05, 1.963591133572032e-05, 1.963591133572032e-05, 3.0102234153773882e-05, 3.0102234153773882e-05, 4.620426925847168e-05, 4.620426925847168e-05, 7.097663095800675e-05, 7.097663095800675e-05, 0.00010908795664959916, 0.00010908795664959916, 0.00016772076540589519, 0.00016772076540589519, 0.00025792508656942746, 0.00025792508656942746]
[2025-04-03 03:29:39 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][0/573]	eta 0:35:11 lr 0.000258	time 3.6844 (3.6844)	loss 0.5335 (0.5335)	grad_norm 2.0821 (2.0821)	mem 20675MB
[2025-04-03 03:29:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][2/573]	eta 0:17:16 lr 0.000258	time 0.8780 (1.8154)	loss 0.5910 (0.5144)	grad_norm 2.1421 (2.2295)	mem 20675MB
[2025-04-03 03:29:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][4/573]	eta 0:13:40 lr 0.000257	time 0.8779 (1.4424)	loss 0.5810 (0.5227)	grad_norm 2.1482 (2.2342)	mem 20675MB
[2025-04-03 03:29:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][6/573]	eta 0:12:08 lr 0.000257	time 0.8805 (1.2856)	loss 0.3534 (0.5019)	grad_norm 2.0245 (2.2515)	mem 20675MB
[2025-04-03 03:29:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][8/573]	eta 0:11:16 lr 0.000257	time 0.8902 (1.1971)	loss 0.5455 (0.5066)	grad_norm 2.0656 (2.2583)	mem 20675MB
[2025-04-03 03:29:48 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][10/573]	eta 0:10:41 lr 0.000257	time 0.8777 (1.1392)	loss 0.4814 (0.5101)	grad_norm 4.7872 (2.4497)	mem 20675MB
[2025-04-03 03:29:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][12/573]	eta 0:10:16 lr 0.000257	time 0.8780 (1.0993)	loss 0.5391 (0.5045)	grad_norm 1.9547 (2.4566)	mem 20675MB
[2025-04-03 03:29:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][14/573]	eta 0:09:58 lr 0.000257	time 0.8805 (1.0700)	loss 0.5613 (0.5039)	grad_norm 1.9561 (2.4747)	mem 20675MB
[2025-04-03 03:29:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][16/573]	eta 0:09:43 lr 0.000256	time 0.8788 (1.0476)	loss 0.5238 (0.4984)	grad_norm 3.0485 (2.5730)	mem 20675MB
[2025-04-03 03:29:55 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][18/573]	eta 0:09:31 lr 0.000256	time 0.8856 (1.0302)	loss 0.5659 (0.5051)	grad_norm 2.3236 (2.5422)	mem 20675MB
[2025-04-03 03:29:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][20/573]	eta 0:09:22 lr 0.000256	time 0.8803 (1.0163)	loss 0.4245 (0.5024)	grad_norm 3.0912 (2.5868)	mem 20675MB
[2025-04-03 03:29:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][22/573]	eta 0:09:13 lr 0.000256	time 0.8782 (1.0044)	loss 0.4546 (0.4999)	grad_norm 2.1153 (2.5441)	mem 20675MB
[2025-04-03 03:30:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][24/573]	eta 0:09:05 lr 0.000256	time 0.8784 (0.9944)	loss 0.4856 (0.5025)	grad_norm 2.3488 (2.5183)	mem 20675MB
[2025-04-03 03:30:02 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][26/573]	eta 0:08:59 lr 0.000255	time 0.8775 (0.9858)	loss 0.3341 (0.4971)	grad_norm 4.9440 (2.6132)	mem 20675MB
[2025-04-03 03:30:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][28/573]	eta 0:08:53 lr 0.000255	time 0.8779 (0.9786)	loss 0.5281 (0.5011)	grad_norm 3.2540 (2.6362)	mem 20675MB
[2025-04-03 03:30:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][30/573]	eta 0:08:47 lr 0.000255	time 0.8779 (0.9721)	loss 0.5164 (0.5074)	grad_norm 2.3492 (2.6318)	mem 20675MB
[2025-04-03 03:30:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][32/573]	eta 0:08:42 lr 0.000255	time 0.8777 (0.9667)	loss 0.5932 (0.5051)	grad_norm 2.2837 (2.6404)	mem 20675MB
[2025-04-03 03:30:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][34/573]	eta 0:08:38 lr 0.000255	time 0.8777 (0.9617)	loss 0.5279 (0.5041)	grad_norm 2.2556 (2.6258)	mem 20675MB
[2025-04-03 03:30:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][36/573]	eta 0:08:34 lr 0.000255	time 0.8783 (0.9573)	loss 0.6058 (0.5067)	grad_norm 1.7715 (2.5960)	mem 20675MB
[2025-04-03 03:30:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][38/573]	eta 0:08:30 lr 0.000254	time 0.9020 (0.9544)	loss 0.6297 (0.5121)	grad_norm 2.5029 (2.5804)	mem 20675MB
[2025-04-03 03:30:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][40/573]	eta 0:08:26 lr 0.000254	time 0.8782 (0.9508)	loss 0.4976 (0.5122)	grad_norm 2.5809 (2.5657)	mem 20675MB
[2025-04-03 03:30:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][42/573]	eta 0:08:23 lr 0.000254	time 0.8778 (0.9475)	loss 0.5335 (0.5140)	grad_norm 1.6734 (2.5408)	mem 20675MB
[2025-04-03 03:30:18 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][44/573]	eta 0:08:19 lr 0.000254	time 0.8774 (0.9447)	loss 0.4810 (0.5117)	grad_norm 2.5593 (2.5581)	mem 20675MB
[2025-04-03 03:30:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][46/573]	eta 0:08:16 lr 0.000254	time 0.8783 (0.9419)	loss 0.5116 (0.5090)	grad_norm 2.7974 (2.6030)	mem 20675MB
[2025-04-03 03:30:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][48/573]	eta 0:08:13 lr 0.000253	time 0.8789 (0.9394)	loss 0.4447 (0.5071)	grad_norm 2.1857 (2.5849)	mem 20675MB
[2025-04-03 03:30:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][50/573]	eta 0:08:10 lr 0.000253	time 0.8800 (0.9371)	loss 0.6012 (0.5096)	grad_norm 1.9175 (2.5783)	mem 20675MB
[2025-04-03 03:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][52/573]	eta 0:08:07 lr 0.000253	time 0.8781 (0.9349)	loss 0.4108 (0.5081)	grad_norm 3.5592 (2.5983)	mem 20675MB
[2025-04-03 03:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][54/573]	eta 0:08:04 lr 0.000253	time 0.8782 (0.9330)	loss 0.4793 (0.5067)	grad_norm 3.3014 (2.6103)	mem 20675MB
[2025-04-03 03:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][56/573]	eta 0:08:01 lr 0.000253	time 0.8784 (0.9311)	loss 0.3729 (0.5057)	grad_norm 2.1837 (2.5961)	mem 20675MB
[2025-04-03 03:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][58/573]	eta 0:07:58 lr 0.000252	time 0.8780 (0.9293)	loss 0.4748 (0.5059)	grad_norm 2.5346 (2.5926)	mem 20675MB
[2025-04-03 03:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][60/573]	eta 0:07:55 lr 0.000252	time 0.8807 (0.9278)	loss 0.5021 (0.5049)	grad_norm 2.4327 (2.5853)	mem 20675MB
[2025-04-03 03:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][62/573]	eta 0:07:53 lr 0.000252	time 0.8781 (0.9263)	loss 0.3674 (0.5020)	grad_norm 3.2456 (2.5886)	mem 20675MB
[2025-04-03 03:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][64/573]	eta 0:07:50 lr 0.000252	time 0.8778 (0.9249)	loss 0.4723 (0.4992)	grad_norm 1.9353 (2.6029)	mem 20675MB
[2025-04-03 03:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][66/573]	eta 0:07:48 lr 0.000252	time 0.8798 (0.9235)	loss 0.4509 (0.4970)	grad_norm 2.6385 (2.6035)	mem 20675MB
[2025-04-03 03:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][68/573]	eta 0:07:45 lr 0.000252	time 0.8782 (0.9225)	loss 0.6403 (0.4973)	grad_norm 2.6895 (2.6092)	mem 20675MB
[2025-04-03 03:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][70/573]	eta 0:07:43 lr 0.000251	time 0.8787 (0.9213)	loss 0.5774 (0.4972)	grad_norm 1.8698 (2.6056)	mem 20675MB
[2025-04-03 03:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][72/573]	eta 0:07:40 lr 0.000251	time 0.8781 (0.9201)	loss 0.4553 (0.4953)	grad_norm 2.9333 (2.6004)	mem 20675MB
[2025-04-03 03:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][74/573]	eta 0:07:38 lr 0.000251	time 0.8784 (0.9191)	loss 0.4316 (0.4941)	grad_norm 2.7809 (2.5970)	mem 20675MB
[2025-04-03 03:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][76/573]	eta 0:07:36 lr 0.000251	time 0.8924 (0.9183)	loss 0.4782 (0.4936)	grad_norm 3.0241 (2.5984)	mem 20675MB
[2025-04-03 03:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][78/573]	eta 0:07:34 lr 0.000251	time 0.8784 (0.9174)	loss 0.6331 (0.4953)	grad_norm 2.9689 (2.5966)	mem 20675MB
[2025-04-03 03:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][80/573]	eta 0:07:31 lr 0.000250	time 0.8786 (0.9164)	loss 0.5868 (0.4952)	grad_norm 1.6545 (2.5902)	mem 20675MB
[2025-04-03 03:30:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][82/573]	eta 0:07:29 lr 0.000250	time 0.8781 (0.9156)	loss 0.5287 (0.4967)	grad_norm 2.7884 (2.6025)	mem 20675MB
[2025-04-03 03:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][84/573]	eta 0:07:27 lr 0.000250	time 0.8782 (0.9148)	loss 0.3741 (0.4935)	grad_norm 3.4143 (2.6110)	mem 20675MB
[2025-04-03 03:30:55 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][86/573]	eta 0:07:25 lr 0.000250	time 0.8783 (0.9140)	loss 0.5002 (0.4922)	grad_norm 2.6868 (2.6234)	mem 20675MB
[2025-04-03 03:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][88/573]	eta 0:07:22 lr 0.000250	time 0.8784 (0.9133)	loss 0.5022 (0.4921)	grad_norm 2.5895 (2.6343)	mem 20675MB
[2025-04-03 03:30:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][90/573]	eta 0:07:20 lr 0.000250	time 0.8792 (0.9125)	loss 0.3893 (0.4914)	grad_norm 3.6860 (2.6461)	mem 20675MB
[2025-04-03 03:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][92/573]	eta 0:07:18 lr 0.000249	time 0.8782 (0.9118)	loss 0.4945 (0.4920)	grad_norm 3.1400 (2.6462)	mem 20675MB
[2025-04-03 03:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][94/573]	eta 0:07:16 lr 0.000249	time 0.8781 (0.9114)	loss 0.3023 (0.4887)	grad_norm 3.4357 (2.6525)	mem 20675MB
[2025-04-03 03:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][96/573]	eta 0:07:14 lr 0.000249	time 0.8781 (0.9108)	loss 0.5755 (0.4892)	grad_norm 2.3347 (2.6516)	mem 20675MB
[2025-04-03 03:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][98/573]	eta 0:07:12 lr 0.000249	time 0.8780 (0.9102)	loss 0.5687 (0.4900)	grad_norm 2.7635 (2.6598)	mem 20675MB
[2025-04-03 03:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][100/573]	eta 0:07:10 lr 0.000249	time 0.8777 (0.9096)	loss 0.3706 (0.4899)	grad_norm 4.6846 (2.6790)	mem 20675MB
[2025-04-03 03:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][102/573]	eta 0:07:08 lr 0.000248	time 0.8778 (0.9091)	loss 0.3941 (0.4887)	grad_norm 4.4128 (2.6889)	mem 20675MB
[2025-04-03 03:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][104/573]	eta 0:07:06 lr 0.000248	time 0.8782 (0.9085)	loss 0.4316 (0.4868)	grad_norm 3.8106 (2.7060)	mem 20675MB
[2025-04-03 03:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][106/573]	eta 0:07:04 lr 0.000248	time 0.8785 (0.9080)	loss 0.4754 (0.4863)	grad_norm 2.5459 (2.7170)	mem 20675MB
[2025-04-03 03:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][108/573]	eta 0:07:01 lr 0.000248	time 0.8789 (0.9075)	loss 0.5690 (0.4877)	grad_norm 2.0811 (2.7120)	mem 20675MB
[2025-04-03 03:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][110/573]	eta 0:06:59 lr 0.000248	time 0.8778 (0.9070)	loss 0.4465 (0.4872)	grad_norm 3.2309 (2.7146)	mem 20675MB
[2025-04-03 03:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][112/573]	eta 0:06:57 lr 0.000248	time 0.8775 (0.9065)	loss 0.4271 (0.4865)	grad_norm 3.3292 (2.7336)	mem 20675MB
[2025-04-03 03:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][114/573]	eta 0:06:55 lr 0.000247	time 0.8789 (0.9061)	loss 0.4492 (0.4871)	grad_norm 2.4407 (2.7245)	mem 20675MB
[2025-04-03 03:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][116/573]	eta 0:06:53 lr 0.000247	time 0.8773 (0.9056)	loss 0.5008 (0.4870)	grad_norm 2.3163 (2.7234)	mem 20675MB
[2025-04-03 03:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][118/573]	eta 0:06:51 lr 0.000247	time 0.8776 (0.9051)	loss 0.4534 (0.4872)	grad_norm 2.2130 (2.7216)	mem 20675MB
[2025-04-03 03:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][120/573]	eta 0:06:49 lr 0.000247	time 0.8783 (0.9047)	loss 0.3953 (0.4865)	grad_norm 3.6706 (2.7227)	mem 20675MB
[2025-04-03 03:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][122/573]	eta 0:06:47 lr 0.000247	time 0.8783 (0.9043)	loss 0.3839 (0.4860)	grad_norm 3.5913 (2.7281)	mem 20675MB
[2025-04-03 03:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][124/573]	eta 0:06:45 lr 0.000246	time 0.8773 (0.9040)	loss 0.3678 (0.4853)	grad_norm 4.2688 (2.7408)	mem 20675MB
[2025-04-03 03:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][126/573]	eta 0:06:43 lr 0.000246	time 0.8800 (0.9036)	loss 0.5675 (0.4864)	grad_norm 2.1545 (2.7375)	mem 20675MB
[2025-04-03 03:31:32 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][128/573]	eta 0:06:42 lr 0.000246	time 0.8882 (0.9034)	loss 0.5070 (0.4872)	grad_norm 2.8008 (2.7499)	mem 20675MB
[2025-04-03 03:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][130/573]	eta 0:06:40 lr 0.000246	time 0.8797 (0.9030)	loss 0.5010 (0.4879)	grad_norm 2.9415 (2.7519)	mem 20675MB
[2025-04-03 03:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][132/573]	eta 0:06:38 lr 0.000246	time 0.8826 (0.9027)	loss 0.3337 (0.4874)	grad_norm 5.3712 (2.7685)	mem 20675MB
[2025-04-03 03:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][134/573]	eta 0:06:36 lr 0.000246	time 0.8875 (0.9025)	loss 0.4922 (0.4871)	grad_norm 3.0370 (2.7725)	mem 20675MB
[2025-04-03 03:31:39 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][136/573]	eta 0:06:34 lr 0.000245	time 0.8781 (0.9022)	loss 0.5927 (0.4884)	grad_norm 2.3123 (2.7731)	mem 20675MB
[2025-04-03 03:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][138/573]	eta 0:06:32 lr 0.000245	time 0.8778 (0.9018)	loss 0.3691 (0.4878)	grad_norm 2.8666 (2.7730)	mem 20675MB
[2025-04-03 03:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][140/573]	eta 0:06:30 lr 0.000245	time 0.8783 (0.9015)	loss 0.4116 (0.4877)	grad_norm 2.3073 (2.7665)	mem 20675MB
[2025-04-03 03:31:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][142/573]	eta 0:06:28 lr 0.000245	time 0.8805 (0.9012)	loss 0.4847 (0.4872)	grad_norm 3.0156 (2.7697)	mem 20675MB
[2025-04-03 03:31:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][144/573]	eta 0:06:26 lr 0.000245	time 0.8777 (0.9009)	loss 0.3718 (0.4856)	grad_norm 3.5056 (2.7835)	mem 20675MB
[2025-04-03 03:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][146/573]	eta 0:06:24 lr 0.000244	time 0.8774 (0.9006)	loss 0.5249 (0.4867)	grad_norm 3.0289 (2.7839)	mem 20675MB
[2025-04-03 03:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][148/573]	eta 0:06:22 lr 0.000244	time 0.8781 (0.9004)	loss 0.4588 (0.4863)	grad_norm 4.5483 (2.7937)	mem 20675MB
[2025-04-03 03:31:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][150/573]	eta 0:06:20 lr 0.000244	time 0.8774 (0.9001)	loss 0.5520 (0.4863)	grad_norm 2.8892 (2.7948)	mem 20675MB
[2025-04-03 03:31:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][152/573]	eta 0:06:18 lr 0.000244	time 0.8776 (0.8999)	loss 0.4514 (0.4850)	grad_norm 2.2867 (2.7955)	mem 20675MB
[2025-04-03 03:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][154/573]	eta 0:06:16 lr 0.000244	time 0.8780 (0.8996)	loss 0.4760 (0.4851)	grad_norm 3.6794 (2.8022)	mem 20675MB
[2025-04-03 03:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][156/573]	eta 0:06:15 lr 0.000244	time 0.8791 (0.8993)	loss 0.4583 (0.4845)	grad_norm 1.9749 (2.7959)	mem 20675MB
[2025-04-03 03:31:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][158/573]	eta 0:06:13 lr 0.000243	time 0.8795 (0.8991)	loss 0.5551 (0.4842)	grad_norm 3.1551 (2.8008)	mem 20675MB
[2025-04-03 03:32:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][160/573]	eta 0:06:11 lr 0.000243	time 0.9049 (0.8991)	loss 0.3835 (0.4832)	grad_norm 3.9926 (2.8069)	mem 20675MB
[2025-04-03 03:32:02 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][162/573]	eta 0:06:09 lr 0.000243	time 0.8803 (0.8988)	loss 0.6086 (0.4844)	grad_norm 2.1853 (2.7996)	mem 20675MB
[2025-04-03 03:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][164/573]	eta 0:06:07 lr 0.000243	time 0.8776 (0.8986)	loss 0.4883 (0.4842)	grad_norm 3.6929 (2.8083)	mem 20675MB
[2025-04-03 03:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][166/573]	eta 0:06:05 lr 0.000243	time 0.8776 (0.8983)	loss 0.3633 (0.4829)	grad_norm 4.1735 (2.8146)	mem 20675MB
[2025-04-03 03:32:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][168/573]	eta 0:06:03 lr 0.000242	time 0.8826 (0.8982)	loss 0.5114 (0.4833)	grad_norm 2.3539 (2.8088)	mem 20675MB
[2025-04-03 03:32:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][170/573]	eta 0:06:01 lr 0.000242	time 0.8845 (0.8980)	loss 0.5995 (0.4842)	grad_norm 2.5789 (2.8132)	mem 20675MB
[2025-04-03 03:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][172/573]	eta 0:06:00 lr 0.000242	time 0.8776 (0.8978)	loss 0.4903 (0.4836)	grad_norm 3.7264 (2.8147)	mem 20675MB
[2025-04-03 03:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][174/573]	eta 0:05:58 lr 0.000242	time 0.8777 (0.8976)	loss 0.5857 (0.4840)	grad_norm 2.2026 (2.8109)	mem 20675MB
[2025-04-03 03:32:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][176/573]	eta 0:05:56 lr 0.000242	time 0.8788 (0.8974)	loss 0.5302 (0.4834)	grad_norm 2.0153 (2.8058)	mem 20675MB
[2025-04-03 03:32:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][178/573]	eta 0:05:54 lr 0.000242	time 0.8778 (0.8972)	loss 0.5797 (0.4842)	grad_norm 1.5313 (2.7937)	mem 20675MB
[2025-04-03 03:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][180/573]	eta 0:05:52 lr 0.000241	time 0.8777 (0.8970)	loss 0.4431 (0.4835)	grad_norm 3.3707 (2.8004)	mem 20675MB
[2025-04-03 03:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][182/573]	eta 0:05:50 lr 0.000241	time 0.8776 (0.8968)	loss 0.4322 (0.4836)	grad_norm 3.2407 (2.7998)	mem 20675MB
[2025-04-03 03:32:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][184/573]	eta 0:05:48 lr 0.000241	time 0.8777 (0.8966)	loss 0.5450 (0.4843)	grad_norm 2.0365 (2.7931)	mem 20675MB
[2025-04-03 03:32:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][186/573]	eta 0:05:46 lr 0.000241	time 0.8773 (0.8964)	loss 0.4254 (0.4838)	grad_norm 2.2951 (2.7890)	mem 20675MB
[2025-04-03 03:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][188/573]	eta 0:05:45 lr 0.000241	time 0.8778 (0.8962)	loss 0.5245 (0.4843)	grad_norm 3.0181 (2.7892)	mem 20675MB
[2025-04-03 03:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][190/573]	eta 0:05:43 lr 0.000241	time 0.8779 (0.8960)	loss 0.4923 (0.4843)	grad_norm 2.6522 (2.7893)	mem 20675MB
[2025-04-03 03:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][192/573]	eta 0:05:41 lr 0.000240	time 0.8794 (0.8959)	loss 0.4322 (0.4834)	grad_norm 3.6143 (2.8104)	mem 20675MB
[2025-04-03 03:32:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][194/573]	eta 0:05:39 lr 0.000240	time 0.8783 (0.8957)	loss 0.4524 (0.4834)	grad_norm 3.7587 (2.8131)	mem 20675MB
[2025-04-03 03:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][196/573]	eta 0:05:37 lr 0.000240	time 0.8786 (0.8955)	loss 0.4700 (0.4838)	grad_norm 2.3846 (2.8106)	mem 20675MB
[2025-04-03 03:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][198/573]	eta 0:05:35 lr 0.000240	time 0.8778 (0.8954)	loss 0.5858 (0.4840)	grad_norm 2.3507 (2.8052)	mem 20675MB
[2025-04-03 03:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][200/573]	eta 0:05:33 lr 0.000240	time 0.8975 (0.8953)	loss 0.5379 (0.4843)	grad_norm 4.0956 (2.8079)	mem 20675MB
[2025-04-03 03:32:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][202/573]	eta 0:05:32 lr 0.000239	time 0.8782 (0.8951)	loss 0.4114 (0.4842)	grad_norm 3.9753 (2.8115)	mem 20675MB
[2025-04-03 03:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][204/573]	eta 0:05:30 lr 0.000239	time 0.8782 (0.8950)	loss 0.5946 (0.4850)	grad_norm 3.3191 (2.8106)	mem 20675MB
[2025-04-03 03:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][206/573]	eta 0:05:28 lr 0.000239	time 0.8777 (0.8949)	loss 0.5360 (0.4847)	grad_norm 2.3646 (2.8115)	mem 20675MB
[2025-04-03 03:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][208/573]	eta 0:05:26 lr 0.000239	time 0.8787 (0.8948)	loss 0.5296 (0.4849)	grad_norm 2.2005 (2.8098)	mem 20675MB
[2025-04-03 03:32:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][210/573]	eta 0:05:24 lr 0.000239	time 0.8775 (0.8946)	loss 0.4058 (0.4850)	grad_norm 2.4219 (2.8080)	mem 20675MB
[2025-04-03 03:32:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][212/573]	eta 0:05:22 lr 0.000239	time 0.8779 (0.8945)	loss 0.5321 (0.4848)	grad_norm 2.9936 (2.8116)	mem 20675MB
[2025-04-03 03:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][214/573]	eta 0:05:21 lr 0.000238	time 0.8850 (0.8945)	loss 0.4259 (0.4843)	grad_norm 2.1116 (2.8038)	mem 20675MB
[2025-04-03 03:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][216/573]	eta 0:05:19 lr 0.000238	time 0.8783 (0.8943)	loss 0.5754 (0.4850)	grad_norm 2.2054 (2.7994)	mem 20675MB
[2025-04-03 03:32:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][218/573]	eta 0:05:17 lr 0.000238	time 0.8776 (0.8942)	loss 0.4597 (0.4849)	grad_norm 1.8971 (2.7950)	mem 20675MB
[2025-04-03 03:32:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][220/573]	eta 0:05:15 lr 0.000238	time 0.9029 (0.8941)	loss 0.4760 (0.4850)	grad_norm 2.4045 (2.7888)	mem 20675MB
[2025-04-03 03:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][222/573]	eta 0:05:13 lr 0.000238	time 0.8828 (0.8940)	loss 0.4480 (0.4845)	grad_norm 3.1892 (2.7902)	mem 20675MB
[2025-04-03 03:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][224/573]	eta 0:05:11 lr 0.000237	time 0.8780 (0.8939)	loss 0.6602 (0.4857)	grad_norm 2.6502 (2.7881)	mem 20675MB
[2025-04-03 03:32:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][226/573]	eta 0:05:10 lr 0.000237	time 0.8775 (0.8938)	loss 0.5650 (0.4858)	grad_norm 2.6246 (2.7896)	mem 20675MB
[2025-04-03 03:33:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][228/573]	eta 0:05:08 lr 0.000237	time 0.8773 (0.8937)	loss 0.3711 (0.4855)	grad_norm 4.0067 (2.7905)	mem 20675MB
[2025-04-03 03:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][230/573]	eta 0:05:06 lr 0.000237	time 0.8940 (0.8937)	loss 0.5551 (0.4862)	grad_norm 1.8682 (2.7827)	mem 20675MB
[2025-04-03 03:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][232/573]	eta 0:05:04 lr 0.000237	time 0.8844 (0.8936)	loss 0.5220 (0.4864)	grad_norm 2.7507 (2.7787)	mem 20675MB
[2025-04-03 03:33:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][234/573]	eta 0:05:02 lr 0.000237	time 0.8778 (0.8935)	loss 0.5861 (0.4866)	grad_norm 2.6065 (2.7773)	mem 20675MB
[2025-04-03 03:33:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][236/573]	eta 0:05:01 lr 0.000236	time 0.8779 (0.8933)	loss 0.3480 (0.4859)	grad_norm 3.2128 (2.7762)	mem 20675MB
[2025-04-03 03:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][238/573]	eta 0:04:59 lr 0.000236	time 0.8778 (0.8932)	loss 0.4896 (0.4855)	grad_norm 1.9075 (2.7744)	mem 20675MB
[2025-04-03 03:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][240/573]	eta 0:04:57 lr 0.000236	time 0.8777 (0.8931)	loss 0.5741 (0.4856)	grad_norm 2.7437 (2.7750)	mem 20675MB
[2025-04-03 03:33:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][242/573]	eta 0:04:55 lr 0.000236	time 0.8943 (0.8931)	loss 0.3795 (0.4849)	grad_norm 5.5296 (2.7897)	mem 20675MB
[2025-04-03 03:33:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][244/573]	eta 0:04:53 lr 0.000236	time 0.8800 (0.8930)	loss 0.4134 (0.4849)	grad_norm 2.6717 (2.7863)	mem 20675MB
[2025-04-03 03:33:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][246/573]	eta 0:04:51 lr 0.000235	time 0.8794 (0.8929)	loss 0.6198 (0.4856)	grad_norm 2.6692 (2.7837)	mem 20675MB
[2025-04-03 03:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][248/573]	eta 0:04:50 lr 0.000235	time 0.8777 (0.8928)	loss 0.4874 (0.4863)	grad_norm 2.4306 (2.7846)	mem 20675MB
[2025-04-03 03:33:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][250/573]	eta 0:04:48 lr 0.000235	time 0.8773 (0.8927)	loss 0.5910 (0.4869)	grad_norm 3.0399 (2.7846)	mem 20675MB
[2025-04-03 03:33:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][252/573]	eta 0:04:46 lr 0.000235	time 0.8776 (0.8926)	loss 0.5973 (0.4870)	grad_norm 3.3964 (2.7860)	mem 20675MB
[2025-04-03 03:33:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][254/573]	eta 0:04:44 lr 0.000235	time 0.8884 (0.8925)	loss 0.5231 (0.4874)	grad_norm 2.6940 (2.7821)	mem 20675MB
[2025-04-03 03:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][256/573]	eta 0:04:42 lr 0.000235	time 0.8775 (0.8924)	loss 0.5179 (0.4874)	grad_norm 2.3418 (2.7817)	mem 20675MB
[2025-04-03 03:33:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][258/573]	eta 0:04:41 lr 0.000234	time 0.8793 (0.8924)	loss 0.5547 (0.4880)	grad_norm 2.5534 (2.7788)	mem 20675MB
[2025-04-03 03:33:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][260/573]	eta 0:04:39 lr 0.000234	time 0.8780 (0.8923)	loss 0.5249 (0.4883)	grad_norm 2.5185 (2.7752)	mem 20675MB
[2025-04-03 03:33:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][262/573]	eta 0:04:37 lr 0.000234	time 0.8827 (0.8922)	loss 0.3484 (0.4877)	grad_norm 4.9489 (2.7840)	mem 20675MB
[2025-04-03 03:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][264/573]	eta 0:04:35 lr 0.000234	time 0.8828 (0.8921)	loss 0.4927 (0.4879)	grad_norm 2.5148 (2.7795)	mem 20675MB
[2025-04-03 03:33:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][266/573]	eta 0:04:33 lr 0.000234	time 0.8775 (0.8921)	loss 0.5798 (0.4884)	grad_norm 2.4832 (2.7756)	mem 20675MB
[2025-04-03 03:33:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][268/573]	eta 0:04:32 lr 0.000234	time 0.8780 (0.8920)	loss 0.5808 (0.4888)	grad_norm 2.0731 (2.7734)	mem 20675MB
[2025-04-03 03:33:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][270/573]	eta 0:04:30 lr 0.000233	time 0.8786 (0.8919)	loss 0.5615 (0.4891)	grad_norm 2.9180 (2.7720)	mem 20675MB
[2025-04-03 03:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][272/573]	eta 0:04:28 lr 0.000233	time 0.8775 (0.8918)	loss 0.4168 (0.4885)	grad_norm 3.5077 (2.7707)	mem 20675MB
[2025-04-03 03:33:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][274/573]	eta 0:04:26 lr 0.000233	time 0.8779 (0.8917)	loss 0.3349 (0.4879)	grad_norm 2.8950 (2.7800)	mem 20675MB
[2025-04-03 03:33:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][276/573]	eta 0:04:24 lr 0.000233	time 0.8773 (0.8916)	loss 0.5192 (0.4882)	grad_norm 3.5723 (2.7838)	mem 20675MB
[2025-04-03 03:33:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][278/573]	eta 0:04:23 lr 0.000233	time 0.8774 (0.8916)	loss 0.5693 (0.4886)	grad_norm 5.8571 (2.7938)	mem 20675MB
[2025-04-03 03:33:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][280/573]	eta 0:04:21 lr 0.000232	time 0.8785 (0.8915)	loss 0.5503 (0.4890)	grad_norm 2.3232 (2.7932)	mem 20675MB
[2025-04-03 03:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][282/573]	eta 0:04:19 lr 0.000232	time 0.8822 (0.8914)	loss 0.5914 (0.4892)	grad_norm 2.6961 (2.7951)	mem 20675MB
[2025-04-03 03:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][284/573]	eta 0:04:17 lr 0.000232	time 0.8776 (0.8913)	loss 0.3726 (0.4884)	grad_norm 3.1128 (2.8000)	mem 20675MB
[2025-04-03 03:33:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][286/573]	eta 0:04:15 lr 0.000232	time 0.8823 (0.8913)	loss 0.5681 (0.4886)	grad_norm 7.2354 (2.8154)	mem 20675MB
[2025-04-03 03:33:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][288/573]	eta 0:04:13 lr 0.000232	time 0.8777 (0.8912)	loss 0.4721 (0.4886)	grad_norm 2.4593 (2.8131)	mem 20675MB
[2025-04-03 03:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][290/573]	eta 0:04:12 lr 0.000232	time 0.8777 (0.8911)	loss 0.5757 (0.4889)	grad_norm 2.3467 (2.8096)	mem 20675MB
[2025-04-03 03:33:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][292/573]	eta 0:04:10 lr 0.000231	time 0.8788 (0.8911)	loss 0.5756 (0.4894)	grad_norm 2.9012 (2.8083)	mem 20675MB
[2025-04-03 03:33:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][294/573]	eta 0:04:08 lr 0.000231	time 0.8777 (0.8912)	loss 0.5065 (0.4897)	grad_norm 2.6854 (2.8052)	mem 20675MB
[2025-04-03 03:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][296/573]	eta 0:04:06 lr 0.000231	time 0.8778 (0.8911)	loss 0.5292 (0.4894)	grad_norm 2.5217 (2.8058)	mem 20675MB
[2025-04-03 03:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][298/573]	eta 0:04:05 lr 0.000231	time 0.8775 (0.8910)	loss 0.3307 (0.4883)	grad_norm 3.4483 (2.8085)	mem 20675MB
[2025-04-03 03:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][300/573]	eta 0:04:03 lr 0.000231	time 0.8780 (0.8909)	loss 0.3454 (0.4881)	grad_norm 4.5015 (2.8112)	mem 20675MB
[2025-04-03 03:34:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][302/573]	eta 0:04:01 lr 0.000231	time 0.8794 (0.8909)	loss 0.5051 (0.4882)	grad_norm 2.3346 (2.8065)	mem 20675MB
[2025-04-03 03:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][304/573]	eta 0:03:59 lr 0.000230	time 0.8779 (0.8908)	loss 0.4979 (0.4887)	grad_norm 2.3471 (2.8058)	mem 20675MB
[2025-04-03 03:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][306/573]	eta 0:03:57 lr 0.000230	time 0.8779 (0.8907)	loss 0.5261 (0.4890)	grad_norm 3.5621 (2.8073)	mem 20675MB
[2025-04-03 03:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][308/573]	eta 0:03:56 lr 0.000230	time 0.8792 (0.8907)	loss 0.5574 (0.4894)	grad_norm 2.8773 (2.8058)	mem 20675MB
[2025-04-03 03:34:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][310/573]	eta 0:03:54 lr 0.000230	time 0.8778 (0.8906)	loss 0.5848 (0.4901)	grad_norm 2.5817 (2.8049)	mem 20675MB
[2025-04-03 03:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][312/573]	eta 0:03:52 lr 0.000230	time 0.8773 (0.8906)	loss 0.5168 (0.4900)	grad_norm 2.5860 (2.8042)	mem 20675MB
[2025-04-03 03:34:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][314/573]	eta 0:03:50 lr 0.000229	time 0.8844 (0.8905)	loss 0.5208 (0.4899)	grad_norm 2.8785 (2.8077)	mem 20675MB
[2025-04-03 03:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][316/573]	eta 0:03:48 lr 0.000229	time 0.8794 (0.8904)	loss 0.4933 (0.4897)	grad_norm 2.1515 (2.8053)	mem 20675MB
[2025-04-03 03:34:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][318/573]	eta 0:03:47 lr 0.000229	time 0.8806 (0.8904)	loss 0.4015 (0.4893)	grad_norm 3.5244 (2.8059)	mem 20675MB
[2025-04-03 03:34:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][320/573]	eta 0:03:45 lr 0.000229	time 0.8777 (0.8904)	loss 0.4452 (0.4888)	grad_norm 2.5380 (2.8019)	mem 20675MB
[2025-04-03 03:34:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][322/573]	eta 0:03:43 lr 0.000229	time 0.8776 (0.8903)	loss 0.4994 (0.4890)	grad_norm 2.2622 (2.7974)	mem 20675MB
[2025-04-03 03:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][324/573]	eta 0:03:41 lr 0.000229	time 0.8776 (0.8902)	loss 0.3889 (0.4888)	grad_norm 2.6473 (2.7962)	mem 20675MB
[2025-04-03 03:34:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][326/573]	eta 0:03:39 lr 0.000228	time 0.8773 (0.8902)	loss 0.4739 (0.4888)	grad_norm 3.5671 (2.7989)	mem 20675MB
[2025-04-03 03:34:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][328/573]	eta 0:03:38 lr 0.000228	time 0.8774 (0.8901)	loss 0.4715 (0.4887)	grad_norm 2.8635 (2.7984)	mem 20675MB
[2025-04-03 03:34:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][330/573]	eta 0:03:36 lr 0.000228	time 0.8778 (0.8900)	loss 0.3887 (0.4880)	grad_norm 4.1795 (2.8017)	mem 20675MB
[2025-04-03 03:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][332/573]	eta 0:03:34 lr 0.000228	time 0.8790 (0.8899)	loss 0.4148 (0.4878)	grad_norm 2.7731 (2.8025)	mem 20675MB
[2025-04-03 03:34:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][334/573]	eta 0:03:32 lr 0.000228	time 0.8783 (0.8899)	loss 0.5166 (0.4879)	grad_norm 2.4214 (2.8014)	mem 20675MB
[2025-04-03 03:34:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][336/573]	eta 0:03:30 lr 0.000228	time 0.8799 (0.8898)	loss 0.3588 (0.4872)	grad_norm 4.7985 (2.8123)	mem 20675MB
[2025-04-03 03:34:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][338/573]	eta 0:03:29 lr 0.000227	time 0.8779 (0.8898)	loss 0.4989 (0.4870)	grad_norm 2.6474 (2.8136)	mem 20675MB
[2025-04-03 03:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][340/573]	eta 0:03:27 lr 0.000227	time 0.8779 (0.8897)	loss 0.4784 (0.4867)	grad_norm 2.4491 (2.8125)	mem 20675MB
[2025-04-03 03:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][342/573]	eta 0:03:25 lr 0.000227	time 0.8779 (0.8897)	loss 0.4710 (0.4866)	grad_norm 1.7461 (2.8124)	mem 20675MB
[2025-04-03 03:34:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][344/573]	eta 0:03:23 lr 0.000227	time 0.8777 (0.8896)	loss 0.6017 (0.4871)	grad_norm 4.0663 (2.8182)	mem 20675MB
[2025-04-03 03:34:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][346/573]	eta 0:03:21 lr 0.000227	time 0.8775 (0.8896)	loss 0.4959 (0.4873)	grad_norm 1.5810 (2.8139)	mem 20675MB
[2025-04-03 03:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][348/573]	eta 0:03:20 lr 0.000226	time 0.8775 (0.8895)	loss 0.3298 (0.4871)	grad_norm 3.5471 (2.8139)	mem 20675MB
[2025-04-03 03:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][350/573]	eta 0:03:18 lr 0.000226	time 0.8782 (0.8894)	loss 0.4927 (0.4871)	grad_norm 2.3128 (2.8133)	mem 20675MB
[2025-04-03 03:34:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][352/573]	eta 0:03:16 lr 0.000226	time 0.8851 (0.8894)	loss 0.5763 (0.4871)	grad_norm 3.5610 (2.8144)	mem 20675MB
[2025-04-03 03:34:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][354/573]	eta 0:03:14 lr 0.000226	time 0.8772 (0.8893)	loss 0.5288 (0.4875)	grad_norm 2.2775 (2.8112)	mem 20675MB
[2025-04-03 03:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][356/573]	eta 0:03:12 lr 0.000226	time 0.8773 (0.8893)	loss 0.4453 (0.4872)	grad_norm 3.7012 (2.8137)	mem 20675MB
[2025-04-03 03:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][358/573]	eta 0:03:11 lr 0.000226	time 0.8774 (0.8892)	loss 0.5288 (0.4870)	grad_norm 2.6757 (2.8152)	mem 20675MB
[2025-04-03 03:34:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][360/573]	eta 0:03:09 lr 0.000225	time 0.9427 (0.8894)	loss 0.4571 (0.4867)	grad_norm 3.3391 (2.8168)	mem 20675MB
[2025-04-03 03:34:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][362/573]	eta 0:03:07 lr 0.000225	time 0.8804 (0.8893)	loss 0.4607 (0.4865)	grad_norm 2.5058 (2.8162)	mem 20675MB
[2025-04-03 03:35:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][364/573]	eta 0:03:05 lr 0.000225	time 0.8774 (0.8893)	loss 0.5233 (0.4862)	grad_norm 2.3940 (2.8143)	mem 20675MB
[2025-04-03 03:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][366/573]	eta 0:03:04 lr 0.000225	time 0.8775 (0.8892)	loss 0.3978 (0.4861)	grad_norm 2.6369 (2.8145)	mem 20675MB
[2025-04-03 03:35:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][368/573]	eta 0:03:02 lr 0.000225	time 0.8778 (0.8892)	loss 0.4265 (0.4863)	grad_norm 3.6311 (2.8162)	mem 20675MB
[2025-04-03 03:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][370/573]	eta 0:03:00 lr 0.000225	time 0.8782 (0.8891)	loss 0.4177 (0.4860)	grad_norm 3.5281 (2.8166)	mem 20675MB
[2025-04-03 03:35:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][372/573]	eta 0:02:58 lr 0.000224	time 0.8776 (0.8891)	loss 0.4913 (0.4861)	grad_norm 2.8880 (2.8155)	mem 20675MB
[2025-04-03 03:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][374/573]	eta 0:02:56 lr 0.000224	time 0.8777 (0.8890)	loss 0.5256 (0.4863)	grad_norm 1.8784 (2.8138)	mem 20675MB
[2025-04-03 03:35:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][376/573]	eta 0:02:55 lr 0.000224	time 0.8782 (0.8890)	loss 0.6269 (0.4869)	grad_norm 3.2782 (2.8149)	mem 20675MB
[2025-04-03 03:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][378/573]	eta 0:02:53 lr 0.000224	time 0.8849 (0.8889)	loss 0.3570 (0.4864)	grad_norm 4.5186 (2.8208)	mem 20675MB
[2025-04-03 03:35:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][380/573]	eta 0:02:51 lr 0.000224	time 0.8873 (0.8889)	loss 0.5773 (0.4867)	grad_norm 2.8421 (2.8198)	mem 20675MB
[2025-04-03 03:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][382/573]	eta 0:02:49 lr 0.000223	time 0.8781 (0.8889)	loss 0.3838 (0.4862)	grad_norm 3.3590 (2.8236)	mem 20675MB
[2025-04-03 03:35:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][384/573]	eta 0:02:47 lr 0.000223	time 0.8964 (0.8889)	loss 0.4555 (0.4857)	grad_norm 2.9386 (2.8263)	mem 20675MB
[2025-04-03 03:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][386/573]	eta 0:02:46 lr 0.000223	time 0.8784 (0.8888)	loss 0.5882 (0.4863)	grad_norm 2.2400 (2.8263)	mem 20675MB
[2025-04-03 03:35:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][388/573]	eta 0:02:44 lr 0.000223	time 0.8783 (0.8888)	loss 0.5063 (0.4865)	grad_norm 4.2251 (2.8279)	mem 20675MB
[2025-04-03 03:35:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][390/573]	eta 0:02:42 lr 0.000223	time 0.8782 (0.8888)	loss 0.5826 (0.4864)	grad_norm 3.1175 (2.8307)	mem 20675MB
[2025-04-03 03:35:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][392/573]	eta 0:02:40 lr 0.000223	time 0.8775 (0.8887)	loss 0.4731 (0.4863)	grad_norm 2.7018 (2.8326)	mem 20675MB
[2025-04-03 03:35:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][394/573]	eta 0:02:39 lr 0.000222	time 0.8775 (0.8887)	loss 0.5004 (0.4862)	grad_norm 3.6641 (2.8365)	mem 20675MB
[2025-04-03 03:35:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][396/573]	eta 0:02:37 lr 0.000222	time 0.8845 (0.8886)	loss 0.4017 (0.4860)	grad_norm 2.7398 (2.8337)	mem 20675MB
[2025-04-03 03:35:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][398/573]	eta 0:02:35 lr 0.000222	time 0.8776 (0.8886)	loss 0.3862 (0.4859)	grad_norm 2.0987 (2.8303)	mem 20675MB
[2025-04-03 03:35:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][400/573]	eta 0:02:33 lr 0.000222	time 0.8791 (0.8885)	loss 0.4937 (0.4859)	grad_norm 2.8301 (2.8298)	mem 20675MB
[2025-04-03 03:35:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][402/573]	eta 0:02:31 lr 0.000222	time 0.8774 (0.8885)	loss 0.5247 (0.4857)	grad_norm 3.9926 (2.8343)	mem 20675MB
[2025-04-03 03:35:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][404/573]	eta 0:02:30 lr 0.000222	time 0.8780 (0.8884)	loss 0.4462 (0.4857)	grad_norm 3.1841 (2.8356)	mem 20675MB
[2025-04-03 03:35:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][406/573]	eta 0:02:28 lr 0.000221	time 0.8805 (0.8884)	loss 0.4207 (0.4856)	grad_norm 2.4694 (2.8361)	mem 20675MB
[2025-04-03 03:35:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][408/573]	eta 0:02:26 lr 0.000221	time 0.8884 (0.8884)	loss 0.5047 (0.4856)	grad_norm 2.2966 (2.8325)	mem 20675MB
[2025-04-03 03:35:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][410/573]	eta 0:02:24 lr 0.000221	time 0.8777 (0.8884)	loss 0.4886 (0.4855)	grad_norm 2.7426 (2.8296)	mem 20675MB
[2025-04-03 03:35:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][412/573]	eta 0:02:23 lr 0.000221	time 0.8773 (0.8883)	loss 0.5533 (0.4855)	grad_norm 4.7817 (2.8345)	mem 20675MB
[2025-04-03 03:35:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][414/573]	eta 0:02:21 lr 0.000221	time 0.8775 (0.8883)	loss 0.4673 (0.4855)	grad_norm 2.2885 (2.8332)	mem 20675MB
[2025-04-03 03:35:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][416/573]	eta 0:02:19 lr 0.000220	time 0.8782 (0.8882)	loss 0.3727 (0.4855)	grad_norm 3.7635 (2.8346)	mem 20675MB
[2025-04-03 03:35:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][418/573]	eta 0:02:17 lr 0.000220	time 0.8930 (0.8882)	loss 0.5025 (0.4852)	grad_norm 1.5392 (2.8338)	mem 20675MB
[2025-04-03 03:35:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][420/573]	eta 0:02:15 lr 0.000220	time 0.8770 (0.8882)	loss 0.3291 (0.4849)	grad_norm 4.9956 (2.8387)	mem 20675MB
[2025-04-03 03:35:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][422/573]	eta 0:02:14 lr 0.000220	time 0.8773 (0.8881)	loss 0.5348 (0.4849)	grad_norm 2.6677 (2.8385)	mem 20675MB
[2025-04-03 03:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][424/573]	eta 0:02:12 lr 0.000220	time 0.8783 (0.8881)	loss 0.5883 (0.4851)	grad_norm 3.7210 (2.8461)	mem 20675MB
[2025-04-03 03:35:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][426/573]	eta 0:02:10 lr 0.000220	time 0.8840 (0.8881)	loss 0.4441 (0.4852)	grad_norm 4.4591 (2.8484)	mem 20675MB
[2025-04-03 03:35:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][428/573]	eta 0:02:08 lr 0.000219	time 0.8780 (0.8881)	loss 0.5521 (0.4853)	grad_norm 4.6743 (2.8526)	mem 20675MB
[2025-04-03 03:35:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][430/573]	eta 0:02:06 lr 0.000219	time 0.8782 (0.8880)	loss 0.5261 (0.4853)	grad_norm 2.3844 (2.8513)	mem 20675MB
[2025-04-03 03:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][432/573]	eta 0:02:05 lr 0.000219	time 0.8780 (0.8880)	loss 0.5488 (0.4856)	grad_norm 2.2829 (2.8486)	mem 20675MB
[2025-04-03 03:36:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][434/573]	eta 0:02:03 lr 0.000219	time 0.8779 (0.8879)	loss 0.5241 (0.4857)	grad_norm 3.4768 (2.8525)	mem 20675MB
[2025-04-03 03:36:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][436/573]	eta 0:02:01 lr 0.000219	time 0.8775 (0.8879)	loss 0.5236 (0.4859)	grad_norm 2.5181 (2.8492)	mem 20675MB
[2025-04-03 03:36:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][438/573]	eta 0:01:59 lr 0.000219	time 0.8775 (0.8879)	loss 0.5295 (0.4859)	grad_norm 2.2256 (2.8465)	mem 20675MB
[2025-04-03 03:36:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][440/573]	eta 0:01:58 lr 0.000218	time 0.8773 (0.8878)	loss 0.5382 (0.4861)	grad_norm 2.6240 (2.8450)	mem 20675MB
[2025-04-03 03:36:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][442/573]	eta 0:01:56 lr 0.000218	time 0.8776 (0.8878)	loss 0.5764 (0.4861)	grad_norm 2.9272 (2.8469)	mem 20675MB
[2025-04-03 03:36:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][444/573]	eta 0:01:54 lr 0.000218	time 0.8777 (0.8878)	loss 0.5622 (0.4862)	grad_norm 2.3508 (2.8489)	mem 20675MB
[2025-04-03 03:36:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][446/573]	eta 0:01:52 lr 0.000218	time 0.8771 (0.8877)	loss 0.5141 (0.4863)	grad_norm 1.9812 (2.8465)	mem 20675MB
[2025-04-03 03:36:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][448/573]	eta 0:01:50 lr 0.000218	time 0.8776 (0.8877)	loss 0.4938 (0.4862)	grad_norm 2.8479 (2.8466)	mem 20675MB
[2025-04-03 03:36:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][450/573]	eta 0:01:49 lr 0.000218	time 0.8776 (0.8876)	loss 0.3724 (0.4859)	grad_norm 2.6561 (2.8446)	mem 20675MB
[2025-04-03 03:36:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][452/573]	eta 0:01:47 lr 0.000217	time 0.8819 (0.8876)	loss 0.5752 (0.4858)	grad_norm 2.9607 (2.8496)	mem 20675MB
[2025-04-03 03:36:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][454/573]	eta 0:01:45 lr 0.000217	time 0.8813 (0.8876)	loss 0.3923 (0.4857)	grad_norm 3.3921 (2.8493)	mem 20675MB
[2025-04-03 03:36:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][456/573]	eta 0:01:43 lr 0.000217	time 0.8780 (0.8876)	loss 0.5434 (0.4860)	grad_norm 2.5636 (2.8469)	mem 20675MB
[2025-04-03 03:36:22 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][458/573]	eta 0:01:42 lr 0.000217	time 0.8779 (0.8875)	loss 0.5297 (0.4858)	grad_norm 2.7754 (2.8528)	mem 20675MB
[2025-04-03 03:36:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][460/573]	eta 0:01:40 lr 0.000217	time 0.8781 (0.8875)	loss 0.5783 (0.4860)	grad_norm 2.5673 (2.8498)	mem 20675MB
[2025-04-03 03:36:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][462/573]	eta 0:01:38 lr 0.000217	time 0.8782 (0.8875)	loss 0.6570 (0.4865)	grad_norm 3.1966 (2.8491)	mem 20675MB
[2025-04-03 03:36:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][464/573]	eta 0:01:36 lr 0.000216	time 0.8776 (0.8874)	loss 0.3282 (0.4865)	grad_norm 2.7890 (2.8506)	mem 20675MB
[2025-04-03 03:36:29 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][466/573]	eta 0:01:34 lr 0.000216	time 0.8792 (0.8874)	loss 0.3584 (0.4863)	grad_norm 3.4006 (2.8598)	mem 20675MB
[2025-04-03 03:36:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][468/573]	eta 0:01:33 lr 0.000216	time 0.8773 (0.8874)	loss 0.4332 (0.4863)	grad_norm 2.0082 (2.8557)	mem 20675MB
[2025-04-03 03:36:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][470/573]	eta 0:01:31 lr 0.000216	time 0.8804 (0.8873)	loss 0.5256 (0.4865)	grad_norm 1.8296 (2.8526)	mem 20675MB
[2025-04-03 03:36:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][472/573]	eta 0:01:29 lr 0.000216	time 0.8773 (0.8873)	loss 0.4776 (0.4865)	grad_norm 1.8492 (2.8495)	mem 20675MB
[2025-04-03 03:36:36 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][474/573]	eta 0:01:27 lr 0.000215	time 0.8778 (0.8872)	loss 0.5181 (0.4867)	grad_norm 3.7711 (2.8511)	mem 20675MB
[2025-04-03 03:36:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][476/573]	eta 0:01:26 lr 0.000215	time 0.8785 (0.8872)	loss 0.5413 (0.4865)	grad_norm 1.8319 (2.8525)	mem 20675MB
[2025-04-03 03:36:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][478/573]	eta 0:01:24 lr 0.000215	time 0.8775 (0.8872)	loss 0.5810 (0.4867)	grad_norm 3.6829 (2.8528)	mem 20675MB
[2025-04-03 03:36:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][480/573]	eta 0:01:22 lr 0.000215	time 0.8814 (0.8872)	loss 0.4305 (0.4866)	grad_norm 4.7202 (2.8572)	mem 20675MB
[2025-04-03 03:36:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][482/573]	eta 0:01:20 lr 0.000215	time 0.8819 (0.8872)	loss 0.4860 (0.4866)	grad_norm 2.1122 (2.8542)	mem 20675MB
[2025-04-03 03:36:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][484/573]	eta 0:01:18 lr 0.000215	time 0.8779 (0.8871)	loss 0.5413 (0.4869)	grad_norm 1.9186 (2.8515)	mem 20675MB
[2025-04-03 03:36:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][486/573]	eta 0:01:17 lr 0.000214	time 0.8784 (0.8871)	loss 0.5464 (0.4871)	grad_norm 2.2894 (2.8483)	mem 20675MB
[2025-04-03 03:36:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][488/573]	eta 0:01:15 lr 0.000214	time 0.8777 (0.8871)	loss 0.5168 (0.4872)	grad_norm 2.3869 (2.8467)	mem 20675MB
[2025-04-03 03:36:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][490/573]	eta 0:01:13 lr 0.000214	time 0.8779 (0.8870)	loss 0.5421 (0.4874)	grad_norm 2.3677 (2.8451)	mem 20675MB
[2025-04-03 03:36:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][492/573]	eta 0:01:11 lr 0.000214	time 0.8780 (0.8870)	loss 0.5704 (0.4876)	grad_norm 2.2779 (2.8421)	mem 20675MB
[2025-04-03 03:36:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][494/573]	eta 0:01:10 lr 0.000214	time 0.8777 (0.8870)	loss 0.3172 (0.4873)	grad_norm 3.1393 (2.8409)	mem 20675MB
[2025-04-03 03:36:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][496/573]	eta 0:01:08 lr 0.000214	time 0.8775 (0.8869)	loss 0.5813 (0.4874)	grad_norm 2.3537 (2.8385)	mem 20675MB
[2025-04-03 03:36:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][498/573]	eta 0:01:06 lr 0.000213	time 0.8781 (0.8869)	loss 0.5533 (0.4874)	grad_norm 2.2259 (2.8371)	mem 20675MB
[2025-04-03 03:36:59 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][500/573]	eta 0:01:04 lr 0.000213	time 0.8778 (0.8869)	loss 0.4696 (0.4873)	grad_norm 3.8128 (2.8379)	mem 20675MB
[2025-04-03 03:37:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][502/573]	eta 0:01:02 lr 0.000213	time 0.8777 (0.8868)	loss 0.4335 (0.4873)	grad_norm 4.1041 (2.8404)	mem 20675MB
[2025-04-03 03:37:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][504/573]	eta 0:01:01 lr 0.000213	time 0.8802 (0.8868)	loss 0.4984 (0.4872)	grad_norm 3.5737 (2.8428)	mem 20675MB
[2025-04-03 03:37:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][506/573]	eta 0:00:59 lr 0.000213	time 0.8775 (0.8868)	loss 0.4942 (0.4873)	grad_norm 3.1728 (2.8472)	mem 20675MB
[2025-04-03 03:37:06 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][508/573]	eta 0:00:57 lr 0.000213	time 0.8778 (0.8868)	loss 0.4945 (0.4875)	grad_norm 2.9086 (2.8465)	mem 20675MB
[2025-04-03 03:37:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][510/573]	eta 0:00:55 lr 0.000212	time 0.8856 (0.8867)	loss 0.6007 (0.4878)	grad_norm 2.0104 (2.8452)	mem 20675MB
[2025-04-03 03:37:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][512/573]	eta 0:00:54 lr 0.000212	time 0.8780 (0.8867)	loss 0.4354 (0.4880)	grad_norm 2.4206 (2.8438)	mem 20675MB
[2025-04-03 03:37:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][514/573]	eta 0:00:52 lr 0.000212	time 0.8923 (0.8867)	loss 0.4488 (0.4880)	grad_norm 4.0781 (2.8453)	mem 20675MB
[2025-04-03 03:37:13 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][516/573]	eta 0:00:50 lr 0.000212	time 0.8858 (0.8867)	loss 0.5625 (0.4880)	grad_norm 1.8629 (2.8495)	mem 20675MB
[2025-04-03 03:37:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][518/573]	eta 0:00:48 lr 0.000212	time 0.8779 (0.8867)	loss 0.5253 (0.4883)	grad_norm 1.6049 (2.8452)	mem 20675MB
[2025-04-03 03:37:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][520/573]	eta 0:00:46 lr 0.000212	time 0.8778 (0.8867)	loss 0.3497 (0.4880)	grad_norm 3.0509 (2.8471)	mem 20675MB
[2025-04-03 03:37:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][522/573]	eta 0:00:45 lr 0.000211	time 0.8784 (0.8866)	loss 0.5250 (0.4881)	grad_norm 2.0598 (2.8471)	mem 20675MB
[2025-04-03 03:37:20 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][524/573]	eta 0:00:43 lr 0.000211	time 0.8792 (0.8866)	loss 0.4846 (0.4879)	grad_norm 1.7700 (2.8467)	mem 20675MB
[2025-04-03 03:37:22 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][526/573]	eta 0:00:41 lr 0.000211	time 0.8778 (0.8866)	loss 0.5771 (0.4881)	grad_norm 3.1825 (2.8450)	mem 20675MB
[2025-04-03 03:37:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][528/573]	eta 0:00:39 lr 0.000211	time 0.8771 (0.8866)	loss 0.3344 (0.4878)	grad_norm 9.7662 (2.8648)	mem 20675MB
[2025-04-03 03:37:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][530/573]	eta 0:00:38 lr 0.000211	time 0.8774 (0.8865)	loss 0.4466 (0.4878)	grad_norm 2.0837 (2.8613)	mem 20675MB
[2025-04-03 03:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][532/573]	eta 0:00:36 lr 0.000210	time 0.8775 (0.8865)	loss 0.5662 (0.4880)	grad_norm 3.2261 (2.8633)	mem 20675MB
[2025-04-03 03:37:29 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][534/573]	eta 0:00:34 lr 0.000210	time 0.8776 (0.8865)	loss 0.4255 (0.4877)	grad_norm 2.3407 (2.8621)	mem 20675MB
[2025-04-03 03:37:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][536/573]	eta 0:00:32 lr 0.000210	time 0.8825 (0.8865)	loss 0.3717 (0.4876)	grad_norm 3.1147 (2.8627)	mem 20675MB
[2025-04-03 03:37:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][538/573]	eta 0:00:31 lr 0.000210	time 0.8774 (0.8865)	loss 0.6229 (0.4879)	grad_norm 2.4041 (2.8611)	mem 20675MB
[2025-04-03 03:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][540/573]	eta 0:00:29 lr 0.000210	time 0.8779 (0.8865)	loss 0.6329 (0.4884)	grad_norm 2.7121 (2.8610)	mem 20675MB
[2025-04-03 03:37:36 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][542/573]	eta 0:00:27 lr 0.000210	time 0.8779 (0.8865)	loss 0.5146 (0.4884)	grad_norm 2.1457 (2.8592)	mem 20675MB
[2025-04-03 03:37:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][544/573]	eta 0:00:25 lr 0.000209	time 0.8776 (0.8864)	loss 0.5713 (0.4885)	grad_norm 2.5694 (2.8581)	mem 20675MB
[2025-04-03 03:37:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][546/573]	eta 0:00:23 lr 0.000209	time 0.8780 (0.8864)	loss 0.5617 (0.4887)	grad_norm 3.3129 (2.8577)	mem 20675MB
[2025-04-03 03:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][548/573]	eta 0:00:22 lr 0.000209	time 0.8789 (0.8864)	loss 0.4976 (0.4886)	grad_norm 2.9590 (2.8587)	mem 20675MB
[2025-04-03 03:37:43 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][550/573]	eta 0:00:20 lr 0.000209	time 0.8776 (0.8864)	loss 0.4083 (0.4884)	grad_norm 3.7581 (2.8594)	mem 20675MB
[2025-04-03 03:37:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][552/573]	eta 0:00:18 lr 0.000209	time 0.8774 (0.8863)	loss 0.5369 (0.4885)	grad_norm 1.8096 (2.8561)	mem 20675MB
[2025-04-03 03:37:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][554/573]	eta 0:00:16 lr 0.000209	time 0.8775 (0.8863)	loss 0.5510 (0.4888)	grad_norm 2.0854 (2.8540)	mem 20675MB
[2025-04-03 03:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][556/573]	eta 0:00:15 lr 0.000208	time 0.8779 (0.8863)	loss 0.5957 (0.4890)	grad_norm 2.2075 (2.8572)	mem 20675MB
[2025-04-03 03:37:50 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][558/573]	eta 0:00:13 lr 0.000208	time 0.9343 (0.8864)	loss 0.4089 (0.4889)	grad_norm 2.3217 (2.8554)	mem 20675MB
[2025-04-03 03:37:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][560/573]	eta 0:00:11 lr 0.000208	time 0.8776 (0.8863)	loss 0.4431 (0.4887)	grad_norm 2.3131 (2.8569)	mem 20675MB
[2025-04-03 03:37:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][562/573]	eta 0:00:09 lr 0.000208	time 0.8775 (0.8863)	loss 0.5852 (0.4890)	grad_norm 2.3100 (2.8573)	mem 20675MB
[2025-04-03 03:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][564/573]	eta 0:00:07 lr 0.000208	time 0.8771 (0.8863)	loss 0.4754 (0.4889)	grad_norm 4.2206 (2.8598)	mem 20675MB
[2025-04-03 03:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][566/573]	eta 0:00:06 lr 0.000208	time 0.8774 (0.8863)	loss 0.4015 (0.4887)	grad_norm 3.8471 (2.8600)	mem 20675MB
[2025-04-03 03:37:59 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][568/573]	eta 0:00:04 lr 0.000207	time 0.8787 (0.8862)	loss 0.3419 (0.4886)	grad_norm 2.8038 (2.8592)	mem 20675MB
[2025-04-03 03:38:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][570/573]	eta 0:00:02 lr 0.000207	time 0.8775 (0.8862)	loss 0.6047 (0.4887)	grad_norm 2.1276 (2.8610)	mem 20675MB
[2025-04-03 03:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][572/573]	eta 0:00:00 lr 0.000207	time 0.8775 (0.8862)	loss 0.5501 (0.4888)	grad_norm 2.9786 (2.8608)	mem 20675MB
[2025-04-03 03:38:03 simmim_finetune] (main_finetune.py 260): INFO EPOCH 21 training takes 0:08:27
[2025-04-03 03:38:06 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.460 (3.460)	Loss 0.5147 (0.5147)	Acc@1 71.094 (71.094)	Mem 20675MB
[2025-04-03 03:38:07 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.286 (1.344)	Loss 0.4704 (0.4808)	Acc@1 74.219 (73.958)	Mem 20675MB
[2025-04-03 03:38:08 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.306 (0.926)	Loss 0.5003 (0.4749)	Acc@1 71.094 (74.531)	Mem 20675MB
[2025-04-03 03:38:08 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.288 (0.743)	Loss 0.4328 (0.4628)	Acc@1 80.469 (76.339)	Mem 20675MB
[2025-04-03 03:38:09 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.641)	Loss 0.5078 (0.4606)	Acc@1 75.000 (76.997)	Mem 20675MB
[2025-04-03 03:38:09 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.576)	Loss 0.4721 (0.4665)	Acc@1 81.250 (77.060)	Mem 20675MB
[2025-04-03 03:38:10 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.532)	Loss 0.4671 (0.4665)	Acc@1 78.125 (77.284)	Mem 20675MB
[2025-04-03 03:38:10 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.291 (0.499)	Loss 0.4313 (0.4620)	Acc@1 82.031 (77.917)	Mem 20675MB
[2025-04-03 03:38:11 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.974
[2025-04-03 03:38:11 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.0%
[2025-04-03 03:38:11 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.97%
[2025-04-03 03:38:11 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [9.734958777401468e-07, 9.734958777401468e-07, 1.3853497786202552e-06, 1.3853497786202552e-06, 2.0189711645896527e-06, 2.0189711645896527e-06, 2.9937732968502645e-06, 2.9937732968502645e-06, 4.493468884943513e-06, 4.493468884943513e-06, 6.800692866625434e-06, 6.800692866625434e-06, 1.0350268223059156e-05, 1.0350268223059156e-05, 1.5811153386803342e-05, 1.5811153386803342e-05, 2.4212515177179018e-05, 2.4212515177179018e-05, 3.7137687162372373e-05, 3.7137687162372373e-05, 5.70225671395929e-05, 5.70225671395929e-05, 8.761469018147066e-05, 8.761469018147066e-05, 0.00013467949486128256, 0.00013467949486128256, 0.00020708688667637782, 0.00020708688667637782]
[2025-04-03 03:38:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][0/573]	eta 0:34:29 lr 0.000207	time 3.6114 (3.6114)	loss 0.5281 (0.5281)	grad_norm 2.5590 (2.5590)	mem 20675MB
[2025-04-03 03:38:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][2/573]	eta 0:17:03 lr 0.000207	time 0.8841 (1.7924)	loss 0.5412 (0.4899)	grad_norm 3.1086 (2.6932)	mem 20675MB
[2025-04-03 03:38:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][4/573]	eta 0:13:32 lr 0.000207	time 0.8777 (1.4280)	loss 0.4774 (0.4609)	grad_norm 2.3249 (2.8710)	mem 20675MB
[2025-04-03 03:38:20 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][6/573]	eta 0:12:00 lr 0.000206	time 0.8775 (1.2711)	loss 0.5058 (0.4735)	grad_norm 2.5968 (2.8440)	mem 20675MB
[2025-04-03 03:38:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][8/573]	eta 0:11:09 lr 0.000206	time 0.8786 (1.1843)	loss 0.3595 (0.4640)	grad_norm 2.6927 (2.7631)	mem 20675MB
[2025-04-03 03:38:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][10/573]	eta 0:10:35 lr 0.000206	time 0.8779 (1.1288)	loss 0.5873 (0.4896)	grad_norm 2.5944 (2.8266)	mem 20675MB
[2025-04-03 03:38:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][12/573]	eta 0:10:12 lr 0.000206	time 0.8882 (1.0917)	loss 0.2941 (0.4842)	grad_norm 3.4658 (2.8485)	mem 20675MB
[2025-04-03 03:38:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][14/573]	eta 0:09:54 lr 0.000206	time 0.8835 (1.0639)	loss 0.3876 (0.4801)	grad_norm 3.2954 (2.8411)	mem 20675MB
[2025-04-03 03:38:29 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][16/573]	eta 0:09:41 lr 0.000206	time 0.9027 (1.0436)	loss 0.5390 (0.4836)	grad_norm 3.9918 (2.9486)	mem 20675MB
[2025-04-03 03:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][18/573]	eta 0:09:29 lr 0.000205	time 0.8844 (1.0266)	loss 0.5530 (0.4891)	grad_norm 2.1288 (2.9085)	mem 20675MB
[2025-04-03 03:38:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][20/573]	eta 0:09:19 lr 0.000205	time 0.8787 (1.0126)	loss 0.4423 (0.4911)	grad_norm 2.1018 (2.8343)	mem 20675MB
[2025-04-03 03:38:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][22/573]	eta 0:09:11 lr 0.000205	time 0.8846 (1.0014)	loss 0.5146 (0.4879)	grad_norm 3.2708 (2.8492)	mem 20675MB
[2025-04-03 03:38:36 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][24/573]	eta 0:09:04 lr 0.000205	time 0.8778 (0.9920)	loss 0.5285 (0.4908)	grad_norm 2.4807 (2.8162)	mem 20675MB
[2025-04-03 03:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][26/573]	eta 0:08:58 lr 0.000205	time 0.8778 (0.9836)	loss 0.5387 (0.4926)	grad_norm 2.5547 (2.8128)	mem 20675MB
[2025-04-03 03:38:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][28/573]	eta 0:08:52 lr 0.000205	time 0.8779 (0.9765)	loss 0.4067 (0.4918)	grad_norm 4.1002 (2.8424)	mem 20675MB
[2025-04-03 03:38:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][30/573]	eta 0:08:46 lr 0.000204	time 0.8829 (0.9704)	loss 0.3411 (0.4871)	grad_norm 3.2362 (2.8443)	mem 20675MB
[2025-04-03 03:38:43 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][32/573]	eta 0:08:42 lr 0.000204	time 0.8779 (0.9649)	loss 0.4414 (0.4861)	grad_norm 3.9942 (2.8655)	mem 20675MB
[2025-04-03 03:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][34/573]	eta 0:08:37 lr 0.000204	time 0.8814 (0.9600)	loss 0.5483 (0.4887)	grad_norm 2.2201 (2.8317)	mem 20675MB
[2025-04-03 03:38:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][36/573]	eta 0:08:33 lr 0.000204	time 0.8788 (0.9557)	loss 0.5960 (0.4936)	grad_norm 2.6384 (2.8703)	mem 20675MB
[2025-04-03 03:38:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][38/573]	eta 0:08:29 lr 0.000204	time 0.8850 (0.9519)	loss 0.5047 (0.4943)	grad_norm 2.5727 (2.8500)	mem 20675MB
[2025-04-03 03:38:50 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][40/573]	eta 0:08:25 lr 0.000204	time 0.8780 (0.9484)	loss 0.3831 (0.4888)	grad_norm 3.8022 (2.8820)	mem 20675MB
[2025-04-03 03:38:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][42/573]	eta 0:08:21 lr 0.000203	time 0.8799 (0.9452)	loss 0.4995 (0.4916)	grad_norm 5.2764 (3.0104)	mem 20675MB
[2025-04-03 03:38:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][44/573]	eta 0:08:18 lr 0.000203	time 0.8783 (0.9422)	loss 0.5223 (0.4928)	grad_norm 2.4517 (2.9797)	mem 20675MB
[2025-04-03 03:38:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][46/573]	eta 0:08:15 lr 0.000203	time 0.8777 (0.9395)	loss 0.5691 (0.4966)	grad_norm 2.5275 (2.9611)	mem 20675MB
[2025-04-03 03:38:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][48/573]	eta 0:08:11 lr 0.000203	time 0.8779 (0.9370)	loss 0.3524 (0.4937)	grad_norm 3.7439 (2.9614)	mem 20675MB
[2025-04-03 03:38:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][50/573]	eta 0:08:08 lr 0.000203	time 0.8777 (0.9348)	loss 0.3634 (0.4926)	grad_norm 3.7200 (2.9600)	mem 20675MB
[2025-04-03 03:39:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][52/573]	eta 0:08:05 lr 0.000203	time 0.8780 (0.9328)	loss 0.5152 (0.4928)	grad_norm 7.1746 (3.0219)	mem 20675MB
[2025-04-03 03:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][54/573]	eta 0:08:03 lr 0.000202	time 0.8896 (0.9311)	loss 0.6138 (0.4947)	grad_norm 3.1712 (3.0165)	mem 20675MB
[2025-04-03 03:39:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][56/573]	eta 0:08:00 lr 0.000202	time 0.8781 (0.9293)	loss 0.4958 (0.4947)	grad_norm 2.8045 (3.0033)	mem 20675MB
[2025-04-03 03:39:06 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][58/573]	eta 0:07:57 lr 0.000202	time 0.8780 (0.9276)	loss 0.5392 (0.4933)	grad_norm 1.9591 (2.9878)	mem 20675MB
[2025-04-03 03:39:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][60/573]	eta 0:07:55 lr 0.000202	time 0.8777 (0.9260)	loss 0.5509 (0.4939)	grad_norm 2.3509 (2.9746)	mem 20675MB
[2025-04-03 03:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][62/573]	eta 0:07:52 lr 0.000202	time 0.8832 (0.9246)	loss 0.4973 (0.4961)	grad_norm 2.6921 (2.9666)	mem 20675MB
[2025-04-03 03:39:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][64/573]	eta 0:07:49 lr 0.000202	time 0.8793 (0.9233)	loss 0.5744 (0.4974)	grad_norm 2.4613 (2.9474)	mem 20675MB
[2025-04-03 03:39:13 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][66/573]	eta 0:07:47 lr 0.000201	time 0.8788 (0.9220)	loss 0.6003 (0.4997)	grad_norm 2.1297 (2.9312)	mem 20675MB
[2025-04-03 03:39:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][68/573]	eta 0:07:45 lr 0.000201	time 0.8874 (0.9212)	loss 0.5203 (0.4997)	grad_norm 2.0913 (2.9135)	mem 20675MB
[2025-04-03 03:39:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][70/573]	eta 0:07:42 lr 0.000201	time 0.8778 (0.9200)	loss 0.3586 (0.4986)	grad_norm 4.3451 (2.9203)	mem 20675MB
[2025-04-03 03:39:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][72/573]	eta 0:07:40 lr 0.000201	time 0.8774 (0.9189)	loss 0.5471 (0.4996)	grad_norm 2.0999 (2.8920)	mem 20675MB
[2025-04-03 03:39:20 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][74/573]	eta 0:07:37 lr 0.000201	time 0.8777 (0.9178)	loss 0.4776 (0.5009)	grad_norm 2.7993 (2.9269)	mem 20675MB
[2025-04-03 03:39:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][76/573]	eta 0:07:35 lr 0.000201	time 0.8794 (0.9168)	loss 0.4404 (0.5007)	grad_norm 3.1915 (2.9336)	mem 20675MB
[2025-04-03 03:39:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][78/573]	eta 0:07:33 lr 0.000200	time 0.8851 (0.9160)	loss 0.4064 (0.5000)	grad_norm 5.7105 (2.9564)	mem 20675MB
[2025-04-03 03:39:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][80/573]	eta 0:07:31 lr 0.000200	time 0.8777 (0.9151)	loss 0.5638 (0.5011)	grad_norm 2.1586 (2.9645)	mem 20675MB
[2025-04-03 03:39:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][82/573]	eta 0:07:28 lr 0.000200	time 0.8777 (0.9143)	loss 0.5544 (0.5025)	grad_norm 1.9188 (2.9480)	mem 20675MB
[2025-04-03 03:39:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][84/573]	eta 0:07:26 lr 0.000200	time 0.8777 (0.9135)	loss 0.5548 (0.5027)	grad_norm 2.2027 (2.9262)	mem 20675MB
[2025-04-03 03:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][86/573]	eta 0:07:24 lr 0.000200	time 0.8859 (0.9128)	loss 0.5558 (0.5039)	grad_norm 3.2730 (2.9233)	mem 20675MB
[2025-04-03 03:39:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][88/573]	eta 0:07:22 lr 0.000200	time 0.8778 (0.9120)	loss 0.5430 (0.5043)	grad_norm 2.1032 (2.9121)	mem 20675MB
[2025-04-03 03:39:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][90/573]	eta 0:07:20 lr 0.000199	time 0.8774 (0.9114)	loss 0.3969 (0.5016)	grad_norm 2.6792 (2.9061)	mem 20675MB
[2025-04-03 03:39:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][92/573]	eta 0:07:18 lr 0.000199	time 0.8779 (0.9107)	loss 0.4822 (0.5019)	grad_norm 2.6468 (2.8984)	mem 20675MB
[2025-04-03 03:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][94/573]	eta 0:07:15 lr 0.000199	time 0.8833 (0.9101)	loss 0.5621 (0.5030)	grad_norm 2.4458 (2.8891)	mem 20675MB
[2025-04-03 03:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][96/573]	eta 0:07:13 lr 0.000199	time 0.8778 (0.9095)	loss 0.6045 (0.5049)	grad_norm 3.1282 (2.8918)	mem 20675MB
[2025-04-03 03:39:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][98/573]	eta 0:07:11 lr 0.000199	time 0.8775 (0.9090)	loss 0.4990 (0.5033)	grad_norm 1.8352 (2.8882)	mem 20675MB
[2025-04-03 03:39:43 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][100/573]	eta 0:07:09 lr 0.000199	time 0.8776 (0.9083)	loss 0.3471 (0.5022)	grad_norm 2.7438 (2.8886)	mem 20675MB
[2025-04-03 03:39:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][102/573]	eta 0:07:07 lr 0.000198	time 0.8823 (0.9078)	loss 0.5268 (0.5008)	grad_norm 1.9936 (2.9036)	mem 20675MB
[2025-04-03 03:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][104/573]	eta 0:07:05 lr 0.000198	time 0.8776 (0.9073)	loss 0.5140 (0.5014)	grad_norm 3.2030 (2.9038)	mem 20675MB
[2025-04-03 03:39:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][106/573]	eta 0:07:03 lr 0.000198	time 0.8791 (0.9067)	loss 0.4527 (0.5019)	grad_norm 4.6037 (2.9169)	mem 20675MB
[2025-04-03 03:39:50 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][108/573]	eta 0:07:01 lr 0.000198	time 0.8779 (0.9065)	loss 0.4979 (0.5014)	grad_norm 2.3196 (2.9099)	mem 20675MB
[2025-04-03 03:39:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][110/573]	eta 0:06:59 lr 0.000198	time 0.8802 (0.9060)	loss 0.5805 (0.5027)	grad_norm 3.5354 (2.9103)	mem 20675MB
[2025-04-03 03:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][112/573]	eta 0:06:57 lr 0.000198	time 0.8784 (0.9055)	loss 0.5929 (0.5040)	grad_norm 2.8155 (2.8998)	mem 20675MB
[2025-04-03 03:39:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][114/573]	eta 0:06:55 lr 0.000197	time 0.8787 (0.9051)	loss 0.6176 (0.5045)	grad_norm 2.5536 (2.9068)	mem 20675MB
[2025-04-03 03:39:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][116/573]	eta 0:06:53 lr 0.000197	time 0.8784 (0.9046)	loss 0.5949 (0.5056)	grad_norm 2.0316 (2.8898)	mem 20675MB
[2025-04-03 03:39:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][118/573]	eta 0:06:51 lr 0.000197	time 0.8779 (0.9042)	loss 0.4674 (0.5056)	grad_norm 2.5543 (2.8849)	mem 20675MB
[2025-04-03 03:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][120/573]	eta 0:06:49 lr 0.000197	time 0.8788 (0.9038)	loss 0.5235 (0.5047)	grad_norm 2.3975 (2.8813)	mem 20675MB
[2025-04-03 03:40:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][122/573]	eta 0:06:47 lr 0.000197	time 0.8796 (0.9034)	loss 0.4608 (0.5049)	grad_norm 2.9681 (2.8766)	mem 20675MB
[2025-04-03 03:40:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][124/573]	eta 0:06:45 lr 0.000197	time 0.8776 (0.9030)	loss 0.3954 (0.5045)	grad_norm 3.8904 (2.8788)	mem 20675MB
[2025-04-03 03:40:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][126/573]	eta 0:06:43 lr 0.000196	time 0.8782 (0.9026)	loss 0.5556 (0.5056)	grad_norm 1.9281 (2.8842)	mem 20675MB
[2025-04-03 03:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][128/573]	eta 0:06:41 lr 0.000196	time 0.8776 (0.9023)	loss 0.4380 (0.5047)	grad_norm 3.2152 (2.8824)	mem 20675MB
[2025-04-03 03:40:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][130/573]	eta 0:06:39 lr 0.000196	time 0.8779 (0.9019)	loss 0.3693 (0.5039)	grad_norm 2.6259 (2.8720)	mem 20675MB
[2025-04-03 03:40:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][132/573]	eta 0:06:37 lr 0.000196	time 0.8775 (0.9016)	loss 0.3821 (0.5033)	grad_norm 2.7878 (2.8658)	mem 20675MB
[2025-04-03 03:40:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][134/573]	eta 0:06:35 lr 0.000196	time 0.8911 (0.9014)	loss 0.5907 (0.5040)	grad_norm 2.0448 (2.8556)	mem 20675MB
[2025-04-03 03:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][136/573]	eta 0:06:33 lr 0.000196	time 0.8777 (0.9010)	loss 0.4921 (0.5035)	grad_norm 1.9035 (2.8552)	mem 20675MB
[2025-04-03 03:40:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][138/573]	eta 0:06:31 lr 0.000195	time 0.8773 (0.9007)	loss 0.4740 (0.5023)	grad_norm 3.6098 (2.8539)	mem 20675MB
[2025-04-03 03:40:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][140/573]	eta 0:06:29 lr 0.000195	time 0.8788 (0.9004)	loss 0.5256 (0.5015)	grad_norm 2.3512 (2.8617)	mem 20675MB
[2025-04-03 03:40:20 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][142/573]	eta 0:06:27 lr 0.000195	time 0.8774 (0.9001)	loss 0.4029 (0.5012)	grad_norm 4.9990 (2.8719)	mem 20675MB
[2025-04-03 03:40:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][144/573]	eta 0:06:26 lr 0.000195	time 0.8776 (0.8999)	loss 0.4312 (0.5013)	grad_norm 3.7107 (2.8747)	mem 20675MB
[2025-04-03 03:40:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][146/573]	eta 0:06:24 lr 0.000195	time 0.8776 (0.8996)	loss 0.4597 (0.5009)	grad_norm 2.6572 (2.8905)	mem 20675MB
[2025-04-03 03:40:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][148/573]	eta 0:06:22 lr 0.000195	time 0.8774 (0.8993)	loss 0.5529 (0.5016)	grad_norm 1.8002 (2.8759)	mem 20675MB
[2025-04-03 03:40:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][150/573]	eta 0:06:20 lr 0.000194	time 0.8773 (0.8990)	loss 0.4988 (0.5016)	grad_norm 3.3645 (2.8746)	mem 20675MB
[2025-04-03 03:40:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][152/573]	eta 0:06:18 lr 0.000194	time 0.8779 (0.8988)	loss 0.5806 (0.5016)	grad_norm 2.9844 (2.8818)	mem 20675MB
[2025-04-03 03:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][154/573]	eta 0:06:16 lr 0.000194	time 0.8943 (0.8987)	loss 0.3688 (0.5011)	grad_norm 2.8457 (2.8791)	mem 20675MB
[2025-04-03 03:40:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][156/573]	eta 0:06:14 lr 0.000194	time 0.8777 (0.8984)	loss 0.5027 (0.5012)	grad_norm 2.2764 (2.8686)	mem 20675MB
[2025-04-03 03:40:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][158/573]	eta 0:06:12 lr 0.000194	time 0.8780 (0.8983)	loss 0.4706 (0.5013)	grad_norm 2.6719 (2.8653)	mem 20675MB
[2025-04-03 03:40:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][160/573]	eta 0:06:10 lr 0.000194	time 0.8776 (0.8980)	loss 0.5888 (0.5020)	grad_norm 3.0386 (2.8620)	mem 20675MB
[2025-04-03 03:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][162/573]	eta 0:06:09 lr 0.000193	time 0.8784 (0.8979)	loss 0.5720 (0.5017)	grad_norm 1.9047 (2.8519)	mem 20675MB
[2025-04-03 03:40:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][164/573]	eta 0:06:07 lr 0.000193	time 0.8774 (0.8976)	loss 0.6005 (0.5022)	grad_norm 4.2185 (2.8605)	mem 20675MB
[2025-04-03 03:40:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][166/573]	eta 0:06:05 lr 0.000193	time 0.8772 (0.8974)	loss 0.4226 (0.5020)	grad_norm 3.4127 (2.8562)	mem 20675MB
[2025-04-03 03:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][168/573]	eta 0:06:03 lr 0.000193	time 0.8779 (0.8972)	loss 0.3921 (0.5008)	grad_norm 3.2390 (2.8659)	mem 20675MB
[2025-04-03 03:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][170/573]	eta 0:06:01 lr 0.000193	time 0.8773 (0.8970)	loss 0.5296 (0.5004)	grad_norm 2.7380 (2.8668)	mem 20675MB
[2025-04-03 03:40:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][172/573]	eta 0:05:59 lr 0.000193	time 0.8950 (0.8969)	loss 0.4814 (0.5000)	grad_norm 4.0425 (2.8728)	mem 20675MB
[2025-04-03 03:40:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][174/573]	eta 0:05:57 lr 0.000192	time 0.8785 (0.8967)	loss 0.4822 (0.5001)	grad_norm 4.1156 (2.8761)	mem 20675MB
[2025-04-03 03:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][176/573]	eta 0:05:55 lr 0.000192	time 0.8778 (0.8966)	loss 0.3862 (0.4993)	grad_norm 3.5467 (2.8788)	mem 20675MB
[2025-04-03 03:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][178/573]	eta 0:05:54 lr 0.000192	time 0.8845 (0.8964)	loss 0.5673 (0.4994)	grad_norm 3.1511 (2.8815)	mem 20675MB
[2025-04-03 03:40:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][180/573]	eta 0:05:52 lr 0.000192	time 0.8847 (0.8963)	loss 0.4011 (0.4992)	grad_norm 2.4555 (2.8841)	mem 20675MB
[2025-04-03 03:40:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][182/573]	eta 0:05:50 lr 0.000192	time 0.8785 (0.8961)	loss 0.5447 (0.4985)	grad_norm 2.7488 (2.8843)	mem 20675MB
[2025-04-03 03:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][184/573]	eta 0:05:48 lr 0.000192	time 0.8798 (0.8961)	loss 0.4679 (0.4986)	grad_norm 2.3238 (2.8811)	mem 20675MB
[2025-04-03 03:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][186/573]	eta 0:05:46 lr 0.000191	time 0.9049 (0.8960)	loss 0.4269 (0.4982)	grad_norm 4.6601 (2.8901)	mem 20675MB
[2025-04-03 03:41:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][188/573]	eta 0:05:44 lr 0.000191	time 0.8799 (0.8959)	loss 0.3753 (0.4982)	grad_norm 4.7670 (2.8949)	mem 20675MB
[2025-04-03 03:41:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][190/573]	eta 0:05:43 lr 0.000191	time 0.8784 (0.8957)	loss 0.3598 (0.4978)	grad_norm 3.4463 (2.8953)	mem 20675MB
[2025-04-03 03:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][192/573]	eta 0:05:41 lr 0.000191	time 0.8778 (0.8955)	loss 0.4213 (0.4976)	grad_norm 3.4546 (2.9022)	mem 20675MB
[2025-04-03 03:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][194/573]	eta 0:05:39 lr 0.000191	time 0.8774 (0.8954)	loss 0.5523 (0.4986)	grad_norm 2.8897 (2.9037)	mem 20675MB
[2025-04-03 03:41:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][196/573]	eta 0:05:37 lr 0.000191	time 0.8854 (0.8953)	loss 0.4351 (0.4975)	grad_norm 2.9768 (2.9083)	mem 20675MB
[2025-04-03 03:41:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][198/573]	eta 0:05:35 lr 0.000190	time 0.8789 (0.8951)	loss 0.3868 (0.4969)	grad_norm 3.3853 (2.9067)	mem 20675MB
[2025-04-03 03:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][200/573]	eta 0:05:33 lr 0.000190	time 0.8819 (0.8950)	loss 0.5605 (0.4972)	grad_norm 2.1298 (2.8999)	mem 20675MB
[2025-04-03 03:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][202/573]	eta 0:05:31 lr 0.000190	time 0.8775 (0.8948)	loss 0.5445 (0.4969)	grad_norm 2.1386 (2.9036)	mem 20675MB
[2025-04-03 03:41:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][204/573]	eta 0:05:30 lr 0.000190	time 0.8781 (0.8947)	loss 0.5283 (0.4977)	grad_norm 2.0030 (2.8996)	mem 20675MB
[2025-04-03 03:41:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][206/573]	eta 0:05:28 lr 0.000190	time 0.8786 (0.8945)	loss 0.5234 (0.4980)	grad_norm 2.0575 (2.9022)	mem 20675MB
[2025-04-03 03:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][208/573]	eta 0:05:26 lr 0.000190	time 0.8778 (0.8944)	loss 0.5313 (0.4984)	grad_norm 2.7416 (2.8949)	mem 20675MB
[2025-04-03 03:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][210/573]	eta 0:05:24 lr 0.000189	time 0.8775 (0.8942)	loss 0.5085 (0.4984)	grad_norm 2.2710 (2.8900)	mem 20675MB
[2025-04-03 03:41:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][212/573]	eta 0:05:22 lr 0.000189	time 0.8789 (0.8941)	loss 0.3990 (0.4983)	grad_norm 2.8374 (2.8887)	mem 20675MB
[2025-04-03 03:41:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][214/573]	eta 0:05:20 lr 0.000189	time 0.8833 (0.8940)	loss 0.5544 (0.4979)	grad_norm 2.3530 (2.8915)	mem 20675MB
[2025-04-03 03:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][216/573]	eta 0:05:19 lr 0.000189	time 0.8781 (0.8939)	loss 0.5339 (0.4977)	grad_norm 2.5012 (2.8973)	mem 20675MB
[2025-04-03 03:41:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][218/573]	eta 0:05:17 lr 0.000189	time 0.8778 (0.8937)	loss 0.5844 (0.4983)	grad_norm 2.8289 (2.9003)	mem 20675MB
[2025-04-03 03:41:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][220/573]	eta 0:05:15 lr 0.000189	time 0.8776 (0.8936)	loss 0.5240 (0.4989)	grad_norm 2.1218 (2.8966)	mem 20675MB
[2025-04-03 03:41:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][222/573]	eta 0:05:13 lr 0.000189	time 0.8778 (0.8935)	loss 0.4285 (0.4991)	grad_norm 2.3563 (2.8924)	mem 20675MB
[2025-04-03 03:41:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][224/573]	eta 0:05:11 lr 0.000188	time 0.8778 (0.8933)	loss 0.4541 (0.4989)	grad_norm 2.8418 (2.8942)	mem 20675MB
[2025-04-03 03:41:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][226/573]	eta 0:05:09 lr 0.000188	time 0.8845 (0.8933)	loss 0.5970 (0.4996)	grad_norm 2.4009 (2.8906)	mem 20675MB
[2025-04-03 03:41:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][228/573]	eta 0:05:08 lr 0.000188	time 0.8778 (0.8932)	loss 0.4347 (0.4996)	grad_norm 3.2417 (2.8907)	mem 20675MB
[2025-04-03 03:41:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][230/573]	eta 0:05:06 lr 0.000188	time 0.8780 (0.8930)	loss 0.4981 (0.4994)	grad_norm 2.5232 (2.8895)	mem 20675MB
[2025-04-03 03:41:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][232/573]	eta 0:05:04 lr 0.000188	time 0.8783 (0.8929)	loss 0.5372 (0.4989)	grad_norm 2.0945 (2.8912)	mem 20675MB
[2025-04-03 03:41:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][234/573]	eta 0:05:02 lr 0.000188	time 0.8789 (0.8928)	loss 0.4283 (0.4980)	grad_norm 2.9619 (2.8929)	mem 20675MB
[2025-04-03 03:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][236/573]	eta 0:05:00 lr 0.000187	time 0.8781 (0.8927)	loss 0.4975 (0.4983)	grad_norm 2.5125 (2.8867)	mem 20675MB
[2025-04-03 03:41:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][238/573]	eta 0:04:59 lr 0.000187	time 0.8799 (0.8926)	loss 0.5839 (0.4989)	grad_norm 2.0160 (2.8827)	mem 20675MB
[2025-04-03 03:41:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][240/573]	eta 0:04:57 lr 0.000187	time 0.8808 (0.8925)	loss 0.4721 (0.4991)	grad_norm 2.5535 (2.8828)	mem 20675MB
[2025-04-03 03:41:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][242/573]	eta 0:04:55 lr 0.000187	time 0.8788 (0.8924)	loss 0.5833 (0.4994)	grad_norm 1.9866 (2.8795)	mem 20675MB
[2025-04-03 03:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][244/573]	eta 0:04:53 lr 0.000187	time 0.8779 (0.8923)	loss 0.4543 (0.4990)	grad_norm 2.8273 (2.8795)	mem 20675MB
[2025-04-03 03:41:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][246/573]	eta 0:04:51 lr 0.000187	time 0.8776 (0.8922)	loss 0.5213 (0.4989)	grad_norm 2.1901 (2.8741)	mem 20675MB
[2025-04-03 03:41:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][248/573]	eta 0:04:49 lr 0.000186	time 0.8819 (0.8921)	loss 0.5188 (0.4986)	grad_norm 2.5229 (2.8763)	mem 20675MB
[2025-04-03 03:41:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][250/573]	eta 0:04:48 lr 0.000186	time 0.8779 (0.8920)	loss 0.5417 (0.4981)	grad_norm 2.1959 (2.8722)	mem 20675MB
[2025-04-03 03:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][252/573]	eta 0:04:46 lr 0.000186	time 0.8779 (0.8919)	loss 0.3466 (0.4970)	grad_norm 2.5923 (2.8751)	mem 20675MB
[2025-04-03 03:41:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][254/573]	eta 0:04:44 lr 0.000186	time 0.8777 (0.8919)	loss 0.3923 (0.4968)	grad_norm 3.8528 (2.8749)	mem 20675MB
[2025-04-03 03:42:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][256/573]	eta 0:04:42 lr 0.000186	time 0.8776 (0.8918)	loss 0.4676 (0.4963)	grad_norm 3.7362 (2.8808)	mem 20675MB
[2025-04-03 03:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][258/573]	eta 0:04:40 lr 0.000186	time 0.8786 (0.8917)	loss 0.4896 (0.4967)	grad_norm 2.7248 (2.8787)	mem 20675MB
[2025-04-03 03:42:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][260/573]	eta 0:04:39 lr 0.000185	time 0.8781 (0.8916)	loss 0.5596 (0.4974)	grad_norm 3.4345 (2.8779)	mem 20675MB
[2025-04-03 03:42:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][262/573]	eta 0:04:37 lr 0.000185	time 0.8781 (0.8915)	loss 0.4644 (0.4973)	grad_norm 2.4513 (2.8753)	mem 20675MB
[2025-04-03 03:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][264/573]	eta 0:04:35 lr 0.000185	time 0.8906 (0.8915)	loss 0.5020 (0.4972)	grad_norm 2.1678 (2.8719)	mem 20675MB
[2025-04-03 03:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][266/573]	eta 0:04:33 lr 0.000185	time 0.8842 (0.8914)	loss 0.4652 (0.4969)	grad_norm 2.6134 (2.8680)	mem 20675MB
[2025-04-03 03:42:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][268/573]	eta 0:04:31 lr 0.000185	time 0.8789 (0.8913)	loss 0.4301 (0.4968)	grad_norm 3.2439 (2.8665)	mem 20675MB
[2025-04-03 03:42:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][270/573]	eta 0:04:30 lr 0.000185	time 0.8783 (0.8912)	loss 0.5517 (0.4967)	grad_norm 2.5095 (2.8695)	mem 20675MB
[2025-04-03 03:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][272/573]	eta 0:04:28 lr 0.000184	time 0.8788 (0.8912)	loss 0.3927 (0.4964)	grad_norm 2.5222 (2.8660)	mem 20675MB
[2025-04-03 03:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][274/573]	eta 0:04:26 lr 0.000184	time 0.8803 (0.8911)	loss 0.5365 (0.4967)	grad_norm 3.9121 (2.8677)	mem 20675MB
[2025-04-03 03:42:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][276/573]	eta 0:04:24 lr 0.000184	time 0.8774 (0.8910)	loss 0.5416 (0.4967)	grad_norm 3.3427 (2.8686)	mem 20675MB
[2025-04-03 03:42:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][278/573]	eta 0:04:22 lr 0.000184	time 0.8774 (0.8909)	loss 0.4097 (0.4967)	grad_norm 4.0546 (2.8702)	mem 20675MB
[2025-04-03 03:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][280/573]	eta 0:04:21 lr 0.000184	time 0.8777 (0.8908)	loss 0.4996 (0.4966)	grad_norm 2.1532 (2.8680)	mem 20675MB
[2025-04-03 03:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][282/573]	eta 0:04:19 lr 0.000184	time 0.8784 (0.8908)	loss 0.6072 (0.4965)	grad_norm 2.9616 (2.8690)	mem 20675MB
[2025-04-03 03:42:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][284/573]	eta 0:04:17 lr 0.000183	time 0.8775 (0.8908)	loss 0.5416 (0.4964)	grad_norm 2.3679 (2.8717)	mem 20675MB
[2025-04-03 03:42:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][286/573]	eta 0:04:15 lr 0.000183	time 0.8776 (0.8907)	loss 0.5214 (0.4967)	grad_norm 2.7769 (2.8700)	mem 20675MB
[2025-04-03 03:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][288/573]	eta 0:04:13 lr 0.000183	time 0.8803 (0.8906)	loss 0.3662 (0.4963)	grad_norm 5.0289 (2.8777)	mem 20675MB
[2025-04-03 03:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][290/573]	eta 0:04:12 lr 0.000183	time 0.8781 (0.8905)	loss 0.6201 (0.4968)	grad_norm 2.5285 (2.8751)	mem 20675MB
[2025-04-03 03:42:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][292/573]	eta 0:04:10 lr 0.000183	time 0.8776 (0.8904)	loss 0.5048 (0.4970)	grad_norm 1.6757 (2.8687)	mem 20675MB
[2025-04-03 03:42:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][294/573]	eta 0:04:08 lr 0.000183	time 0.8776 (0.8904)	loss 0.5977 (0.4968)	grad_norm 2.1402 (2.8656)	mem 20675MB
[2025-04-03 03:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][296/573]	eta 0:04:06 lr 0.000183	time 0.8779 (0.8903)	loss 0.6087 (0.4974)	grad_norm 2.8534 (2.8656)	mem 20675MB
[2025-04-03 03:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][298/573]	eta 0:04:04 lr 0.000182	time 0.8779 (0.8902)	loss 0.6103 (0.4974)	grad_norm 2.2216 (2.8615)	mem 20675MB
[2025-04-03 03:42:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][300/573]	eta 0:04:03 lr 0.000182	time 0.8895 (0.8903)	loss 0.5382 (0.4977)	grad_norm 2.7582 (2.8609)	mem 20675MB
[2025-04-03 03:42:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][302/573]	eta 0:04:01 lr 0.000182	time 0.8779 (0.8903)	loss 0.5699 (0.4982)	grad_norm 2.2024 (2.8563)	mem 20675MB
[2025-04-03 03:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][304/573]	eta 0:03:59 lr 0.000182	time 0.8781 (0.8902)	loss 0.4216 (0.4980)	grad_norm 3.7498 (2.8547)	mem 20675MB
[2025-04-03 03:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][306/573]	eta 0:03:57 lr 0.000182	time 0.8793 (0.8901)	loss 0.5827 (0.4981)	grad_norm 1.7478 (2.8495)	mem 20675MB
[2025-04-03 03:42:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][308/573]	eta 0:03:55 lr 0.000182	time 0.8778 (0.8901)	loss 0.4959 (0.4983)	grad_norm 2.3726 (2.8443)	mem 20675MB
[2025-04-03 03:42:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][310/573]	eta 0:03:54 lr 0.000181	time 0.8779 (0.8900)	loss 0.4717 (0.4983)	grad_norm 3.0348 (2.8437)	mem 20675MB
[2025-04-03 03:42:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][312/573]	eta 0:03:52 lr 0.000181	time 0.8779 (0.8899)	loss 0.4973 (0.4984)	grad_norm 2.7995 (2.8436)	mem 20675MB
[2025-04-03 03:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][314/573]	eta 0:03:50 lr 0.000181	time 0.8781 (0.8899)	loss 0.5219 (0.4987)	grad_norm 2.0863 (2.8386)	mem 20675MB
[2025-04-03 03:42:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][316/573]	eta 0:03:48 lr 0.000181	time 0.8785 (0.8898)	loss 0.5169 (0.4988)	grad_norm 2.5013 (2.8368)	mem 20675MB
[2025-04-03 03:42:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][318/573]	eta 0:03:46 lr 0.000181	time 0.8821 (0.8898)	loss 0.4830 (0.4990)	grad_norm 2.9175 (2.8353)	mem 20675MB
[2025-04-03 03:42:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][320/573]	eta 0:03:45 lr 0.000181	time 0.8784 (0.8897)	loss 0.5285 (0.4992)	grad_norm 1.9104 (2.8297)	mem 20675MB
[2025-04-03 03:42:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][322/573]	eta 0:03:43 lr 0.000180	time 0.8819 (0.8896)	loss 0.3832 (0.4985)	grad_norm 2.9990 (2.8282)	mem 20675MB
[2025-04-03 03:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][324/573]	eta 0:03:41 lr 0.000180	time 0.8772 (0.8896)	loss 0.4710 (0.4979)	grad_norm 2.6286 (2.8269)	mem 20675MB
[2025-04-03 03:43:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][326/573]	eta 0:03:39 lr 0.000180	time 0.8776 (0.8895)	loss 0.4606 (0.4976)	grad_norm 2.3485 (2.8253)	mem 20675MB
[2025-04-03 03:43:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][328/573]	eta 0:03:37 lr 0.000180	time 0.8882 (0.8895)	loss 0.5312 (0.4978)	grad_norm 2.2420 (2.8206)	mem 20675MB
[2025-04-03 03:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][330/573]	eta 0:03:36 lr 0.000180	time 0.8780 (0.8894)	loss 0.5211 (0.4980)	grad_norm 1.9000 (2.8182)	mem 20675MB
[2025-04-03 03:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][332/573]	eta 0:03:34 lr 0.000180	time 0.8781 (0.8893)	loss 0.3895 (0.4978)	grad_norm 2.8191 (2.8187)	mem 20675MB
[2025-04-03 03:43:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][334/573]	eta 0:03:32 lr 0.000179	time 0.8774 (0.8893)	loss 0.5493 (0.4976)	grad_norm 2.8169 (2.8195)	mem 20675MB
[2025-04-03 03:43:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][336/573]	eta 0:03:30 lr 0.000179	time 0.8940 (0.8893)	loss 0.5837 (0.4974)	grad_norm 2.3530 (2.8191)	mem 20675MB
[2025-04-03 03:43:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][338/573]	eta 0:03:28 lr 0.000179	time 0.8776 (0.8892)	loss 0.3707 (0.4970)	grad_norm 3.9363 (2.8246)	mem 20675MB
[2025-04-03 03:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][340/573]	eta 0:03:27 lr 0.000179	time 0.8790 (0.8892)	loss 0.5160 (0.4972)	grad_norm 3.0068 (2.8240)	mem 20675MB
[2025-04-03 03:43:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][342/573]	eta 0:03:25 lr 0.000179	time 0.8781 (0.8891)	loss 0.3559 (0.4971)	grad_norm 3.8523 (2.8255)	mem 20675MB
[2025-04-03 03:43:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][344/573]	eta 0:03:23 lr 0.000179	time 0.8780 (0.8891)	loss 0.5362 (0.4972)	grad_norm 2.3700 (2.8248)	mem 20675MB
[2025-04-03 03:43:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][346/573]	eta 0:03:21 lr 0.000178	time 0.8775 (0.8890)	loss 0.5036 (0.4972)	grad_norm 3.0241 (2.8245)	mem 20675MB
[2025-04-03 03:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][348/573]	eta 0:03:20 lr 0.000178	time 0.9112 (0.8890)	loss 0.4362 (0.4973)	grad_norm 3.3718 (2.8252)	mem 20675MB
[2025-04-03 03:43:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][350/573]	eta 0:03:18 lr 0.000178	time 0.8777 (0.8890)	loss 0.4621 (0.4972)	grad_norm 2.5511 (2.8250)	mem 20675MB
[2025-04-03 03:43:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][352/573]	eta 0:03:16 lr 0.000178	time 0.8781 (0.8889)	loss 0.5038 (0.4973)	grad_norm 2.6129 (2.8236)	mem 20675MB
[2025-04-03 03:43:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][354/573]	eta 0:03:14 lr 0.000178	time 0.8795 (0.8889)	loss 0.5304 (0.4974)	grad_norm 2.8383 (2.8216)	mem 20675MB
[2025-04-03 03:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][356/573]	eta 0:03:12 lr 0.000178	time 0.8976 (0.8889)	loss 0.4812 (0.4976)	grad_norm 2.4357 (2.8193)	mem 20675MB
[2025-04-03 03:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][358/573]	eta 0:03:11 lr 0.000178	time 0.8774 (0.8888)	loss 0.4401 (0.4976)	grad_norm 2.6008 (2.8179)	mem 20675MB
[2025-04-03 03:43:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][360/573]	eta 0:03:09 lr 0.000177	time 0.8775 (0.8888)	loss 0.5523 (0.4979)	grad_norm 2.4554 (2.8161)	mem 20675MB
[2025-04-03 03:43:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][362/573]	eta 0:03:07 lr 0.000177	time 0.9029 (0.8888)	loss 0.5341 (0.4982)	grad_norm 3.1794 (2.8161)	mem 20675MB
[2025-04-03 03:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][364/573]	eta 0:03:05 lr 0.000177	time 0.8774 (0.8887)	loss 0.3290 (0.4978)	grad_norm 3.5130 (2.8178)	mem 20675MB
[2025-04-03 03:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][366/573]	eta 0:03:03 lr 0.000177	time 0.8782 (0.8887)	loss 0.5132 (0.4976)	grad_norm 2.3452 (2.8158)	mem 20675MB
[2025-04-03 03:43:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][368/573]	eta 0:03:02 lr 0.000177	time 0.8796 (0.8886)	loss 0.4865 (0.4974)	grad_norm 2.3527 (2.8138)	mem 20675MB
[2025-04-03 03:43:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][370/573]	eta 0:03:00 lr 0.000177	time 0.8775 (0.8886)	loss 0.5707 (0.4977)	grad_norm 2.7779 (2.8118)	mem 20675MB
[2025-04-03 03:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][372/573]	eta 0:02:58 lr 0.000176	time 0.8778 (0.8886)	loss 0.5628 (0.4974)	grad_norm 2.5604 (2.8106)	mem 20675MB
[2025-04-03 03:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][374/573]	eta 0:02:56 lr 0.000176	time 0.8783 (0.8885)	loss 0.3391 (0.4970)	grad_norm 3.5226 (2.8108)	mem 20675MB
[2025-04-03 03:43:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][376/573]	eta 0:02:55 lr 0.000176	time 0.8830 (0.8886)	loss 0.4303 (0.4969)	grad_norm 2.9652 (2.8127)	mem 20675MB
[2025-04-03 03:43:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][378/573]	eta 0:02:53 lr 0.000176	time 0.8778 (0.8885)	loss 0.4839 (0.4969)	grad_norm 3.0405 (2.8109)	mem 20675MB
[2025-04-03 03:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][380/573]	eta 0:02:51 lr 0.000176	time 0.8782 (0.8884)	loss 0.4814 (0.4969)	grad_norm 2.1363 (2.8076)	mem 20675MB
[2025-04-03 03:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][382/573]	eta 0:02:49 lr 0.000176	time 0.8794 (0.8884)	loss 0.3831 (0.4963)	grad_norm 4.8972 (2.8134)	mem 20675MB
[2025-04-03 03:43:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][384/573]	eta 0:02:47 lr 0.000175	time 0.8829 (0.8884)	loss 0.5052 (0.4961)	grad_norm 2.9022 (2.8167)	mem 20675MB
[2025-04-03 03:43:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][386/573]	eta 0:02:46 lr 0.000175	time 0.8776 (0.8883)	loss 0.5525 (0.4961)	grad_norm 2.1839 (2.8166)	mem 20675MB
[2025-04-03 03:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][388/573]	eta 0:02:44 lr 0.000175	time 0.8773 (0.8883)	loss 0.5169 (0.4960)	grad_norm 2.2168 (2.8147)	mem 20675MB
[2025-04-03 03:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][390/573]	eta 0:02:42 lr 0.000175	time 0.8777 (0.8882)	loss 0.3726 (0.4955)	grad_norm 4.1763 (2.8205)	mem 20675MB
[2025-04-03 03:44:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][392/573]	eta 0:02:40 lr 0.000175	time 0.8805 (0.8882)	loss 0.4945 (0.4957)	grad_norm 3.0386 (2.8217)	mem 20675MB
[2025-04-03 03:44:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][394/573]	eta 0:02:38 lr 0.000175	time 0.8778 (0.8882)	loss 0.3320 (0.4956)	grad_norm 4.8581 (2.8268)	mem 20675MB
[2025-04-03 03:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][396/573]	eta 0:02:37 lr 0.000175	time 0.8850 (0.8882)	loss 0.3952 (0.4954)	grad_norm 4.0105 (2.8318)	mem 20675MB
[2025-04-03 03:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][398/573]	eta 0:02:35 lr 0.000174	time 0.8774 (0.8882)	loss 0.5880 (0.4958)	grad_norm 3.0111 (2.8319)	mem 20675MB
[2025-04-03 03:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][400/573]	eta 0:02:33 lr 0.000174	time 0.8904 (0.8882)	loss 0.6040 (0.4963)	grad_norm 3.5308 (2.8329)	mem 20675MB
[2025-04-03 03:44:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][402/573]	eta 0:02:31 lr 0.000174	time 0.8774 (0.8881)	loss 0.5400 (0.4963)	grad_norm 3.7875 (2.8359)	mem 20675MB
[2025-04-03 03:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][404/573]	eta 0:02:30 lr 0.000174	time 0.8774 (0.8881)	loss 0.5356 (0.4965)	grad_norm 1.8022 (2.8315)	mem 20675MB
[2025-04-03 03:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][406/573]	eta 0:02:28 lr 0.000174	time 0.8780 (0.8881)	loss 0.3433 (0.4965)	grad_norm 3.4826 (2.8334)	mem 20675MB
[2025-04-03 03:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][408/573]	eta 0:02:26 lr 0.000174	time 0.8781 (0.8881)	loss 0.5578 (0.4967)	grad_norm 2.1624 (2.8308)	mem 20675MB
[2025-04-03 03:44:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][410/573]	eta 0:02:24 lr 0.000173	time 0.8802 (0.8880)	loss 0.5296 (0.4969)	grad_norm 2.0992 (2.8273)	mem 20675MB
[2025-04-03 03:44:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][412/573]	eta 0:02:22 lr 0.000173	time 0.8777 (0.8880)	loss 0.5006 (0.4970)	grad_norm 2.4123 (2.8268)	mem 20675MB
[2025-04-03 03:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][414/573]	eta 0:02:21 lr 0.000173	time 0.8791 (0.8879)	loss 0.5638 (0.4971)	grad_norm 2.3233 (2.8244)	mem 20675MB
[2025-04-03 03:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][416/573]	eta 0:02:19 lr 0.000173	time 0.8778 (0.8880)	loss 0.5393 (0.4973)	grad_norm 2.3528 (2.8265)	mem 20675MB
[2025-04-03 03:44:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][418/573]	eta 0:02:17 lr 0.000173	time 0.8774 (0.8880)	loss 0.5738 (0.4975)	grad_norm 3.1670 (2.8256)	mem 20675MB
[2025-04-03 03:44:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][420/573]	eta 0:02:15 lr 0.000173	time 0.8782 (0.8879)	loss 0.5055 (0.4975)	grad_norm 1.9155 (2.8222)	mem 20675MB
[2025-04-03 03:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][422/573]	eta 0:02:14 lr 0.000172	time 0.8776 (0.8879)	loss 0.4353 (0.4974)	grad_norm 2.3907 (2.8206)	mem 20675MB
[2025-04-03 03:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][424/573]	eta 0:02:12 lr 0.000172	time 0.8780 (0.8878)	loss 0.3263 (0.4971)	grad_norm 3.1126 (2.8196)	mem 20675MB
[2025-04-03 03:44:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][426/573]	eta 0:02:10 lr 0.000172	time 0.8780 (0.8878)	loss 0.6328 (0.4974)	grad_norm 2.4639 (2.8219)	mem 20675MB
[2025-04-03 03:44:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][428/573]	eta 0:02:08 lr 0.000172	time 0.8788 (0.8878)	loss 0.3929 (0.4973)	grad_norm 3.1235 (2.8214)	mem 20675MB
[2025-04-03 03:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][430/573]	eta 0:02:06 lr 0.000172	time 0.8780 (0.8877)	loss 0.4897 (0.4974)	grad_norm 2.3113 (2.8193)	mem 20675MB
[2025-04-03 03:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][432/573]	eta 0:02:05 lr 0.000172	time 0.8776 (0.8877)	loss 0.5688 (0.4978)	grad_norm 2.7506 (2.8210)	mem 20675MB
[2025-04-03 03:44:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][434/573]	eta 0:02:03 lr 0.000172	time 0.8775 (0.8877)	loss 0.5114 (0.4979)	grad_norm 3.3467 (2.8202)	mem 20675MB
[2025-04-03 03:44:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][436/573]	eta 0:02:01 lr 0.000171	time 0.8778 (0.8876)	loss 0.5938 (0.4978)	grad_norm 2.5600 (2.8193)	mem 20675MB
[2025-04-03 03:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][438/573]	eta 0:01:59 lr 0.000171	time 0.8776 (0.8876)	loss 0.5545 (0.4979)	grad_norm 2.2903 (2.8174)	mem 20675MB
[2025-04-03 03:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][440/573]	eta 0:01:58 lr 0.000171	time 0.8789 (0.8876)	loss 0.5609 (0.4981)	grad_norm 2.2151 (2.8157)	mem 20675MB
[2025-04-03 03:44:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][442/573]	eta 0:01:56 lr 0.000171	time 0.8801 (0.8875)	loss 0.5546 (0.4982)	grad_norm 2.0886 (2.8137)	mem 20675MB
[2025-04-03 03:44:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][444/573]	eta 0:01:54 lr 0.000171	time 0.8777 (0.8875)	loss 0.4841 (0.4983)	grad_norm 2.4416 (2.8110)	mem 20675MB
[2025-04-03 03:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][446/573]	eta 0:01:52 lr 0.000171	time 0.8774 (0.8875)	loss 0.4636 (0.4982)	grad_norm 2.3304 (2.8088)	mem 20675MB
[2025-04-03 03:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][448/573]	eta 0:01:50 lr 0.000170	time 0.8785 (0.8874)	loss 0.5669 (0.4984)	grad_norm 1.9298 (2.8064)	mem 20675MB
[2025-04-03 03:44:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][450/573]	eta 0:01:49 lr 0.000170	time 0.8779 (0.8874)	loss 0.5236 (0.4986)	grad_norm 2.2396 (2.8050)	mem 20675MB
[2025-04-03 03:44:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][452/573]	eta 0:01:47 lr 0.000170	time 0.8967 (0.8874)	loss 0.3950 (0.4982)	grad_norm 2.7143 (2.8059)	mem 20675MB
[2025-04-03 03:44:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][454/573]	eta 0:01:45 lr 0.000170	time 0.8871 (0.8874)	loss 0.3595 (0.4977)	grad_norm 3.3100 (2.8055)	mem 20675MB
[2025-04-03 03:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][456/573]	eta 0:01:43 lr 0.000170	time 0.8842 (0.8874)	loss 0.5764 (0.4978)	grad_norm 1.8557 (2.8025)	mem 20675MB
[2025-04-03 03:44:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][458/573]	eta 0:01:42 lr 0.000170	time 0.8782 (0.8873)	loss 0.3258 (0.4973)	grad_norm 2.7748 (2.8009)	mem 20675MB
[2025-04-03 03:45:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][460/573]	eta 0:01:40 lr 0.000169	time 0.8776 (0.8873)	loss 0.5594 (0.4973)	grad_norm 2.2306 (2.8021)	mem 20675MB
[2025-04-03 03:45:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][462/573]	eta 0:01:38 lr 0.000169	time 0.8777 (0.8873)	loss 0.5027 (0.4974)	grad_norm 2.3630 (2.7992)	mem 20675MB
[2025-04-03 03:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][464/573]	eta 0:01:36 lr 0.000169	time 0.9054 (0.8873)	loss 0.4342 (0.4972)	grad_norm 3.5242 (2.8011)	mem 20675MB
[2025-04-03 03:45:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][466/573]	eta 0:01:34 lr 0.000169	time 0.8790 (0.8873)	loss 0.3323 (0.4969)	grad_norm 2.8990 (2.8033)	mem 20675MB
[2025-04-03 03:45:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][468/573]	eta 0:01:33 lr 0.000169	time 0.8824 (0.8872)	loss 0.5341 (0.4971)	grad_norm 2.2870 (2.8018)	mem 20675MB
[2025-04-03 03:45:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][470/573]	eta 0:01:31 lr 0.000169	time 0.8777 (0.8872)	loss 0.5038 (0.4971)	grad_norm 2.4808 (2.8028)	mem 20675MB
[2025-04-03 03:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][472/573]	eta 0:01:29 lr 0.000169	time 0.8784 (0.8872)	loss 0.3504 (0.4965)	grad_norm 3.2544 (2.8068)	mem 20675MB
[2025-04-03 03:45:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][474/573]	eta 0:01:27 lr 0.000168	time 0.8909 (0.8872)	loss 0.5276 (0.4962)	grad_norm 2.9617 (2.8080)	mem 20675MB
[2025-04-03 03:45:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][476/573]	eta 0:01:26 lr 0.000168	time 0.8779 (0.8871)	loss 0.5478 (0.4965)	grad_norm 2.9014 (2.8068)	mem 20675MB
[2025-04-03 03:45:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][478/573]	eta 0:01:24 lr 0.000168	time 0.8783 (0.8871)	loss 0.3477 (0.4962)	grad_norm 3.7696 (2.8077)	mem 20675MB
[2025-04-03 03:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][480/573]	eta 0:01:22 lr 0.000168	time 0.8780 (0.8871)	loss 0.5384 (0.4964)	grad_norm 3.1634 (2.8069)	mem 20675MB
[2025-04-03 03:45:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][482/573]	eta 0:01:20 lr 0.000168	time 0.8772 (0.8870)	loss 0.5273 (0.4968)	grad_norm 2.2618 (2.8061)	mem 20675MB
[2025-04-03 03:45:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][484/573]	eta 0:01:18 lr 0.000168	time 0.8792 (0.8870)	loss 0.3892 (0.4966)	grad_norm 3.6052 (2.8080)	mem 20675MB
[2025-04-03 03:45:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][486/573]	eta 0:01:17 lr 0.000167	time 0.8835 (0.8870)	loss 0.4874 (0.4968)	grad_norm 2.0432 (2.8058)	mem 20675MB
[2025-04-03 03:45:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][488/573]	eta 0:01:15 lr 0.000167	time 0.8777 (0.8870)	loss 0.4133 (0.4966)	grad_norm 4.5524 (2.8079)	mem 20675MB
[2025-04-03 03:45:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][490/573]	eta 0:01:13 lr 0.000167	time 0.8777 (0.8870)	loss 0.3944 (0.4965)	grad_norm 4.1332 (2.8112)	mem 20675MB
[2025-04-03 03:45:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][492/573]	eta 0:01:11 lr 0.000167	time 0.8794 (0.8869)	loss 0.5202 (0.4966)	grad_norm 2.5909 (2.8096)	mem 20675MB
[2025-04-03 03:45:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][494/573]	eta 0:01:10 lr 0.000167	time 0.8784 (0.8869)	loss 0.5742 (0.4970)	grad_norm 1.4514 (2.8065)	mem 20675MB
[2025-04-03 03:45:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][496/573]	eta 0:01:08 lr 0.000167	time 0.8777 (0.8869)	loss 0.5428 (0.4971)	grad_norm 2.5415 (2.8058)	mem 20675MB
[2025-04-03 03:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][498/573]	eta 0:01:06 lr 0.000167	time 0.8781 (0.8868)	loss 0.6369 (0.4974)	grad_norm 2.3608 (2.8052)	mem 20675MB
[2025-04-03 03:45:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][500/573]	eta 0:01:04 lr 0.000166	time 0.8840 (0.8868)	loss 0.4996 (0.4975)	grad_norm 2.5273 (2.8036)	mem 20675MB
[2025-04-03 03:45:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][502/573]	eta 0:01:02 lr 0.000166	time 0.8834 (0.8868)	loss 0.4485 (0.4974)	grad_norm 3.2962 (2.8036)	mem 20675MB
[2025-04-03 03:45:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][504/573]	eta 0:01:01 lr 0.000166	time 0.9017 (0.8869)	loss 0.4730 (0.4971)	grad_norm 1.5854 (2.8043)	mem 20675MB
[2025-04-03 03:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][506/573]	eta 0:00:59 lr 0.000166	time 0.8776 (0.8869)	loss 0.4079 (0.4971)	grad_norm 2.7189 (2.8042)	mem 20675MB
[2025-04-03 03:45:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][508/573]	eta 0:00:57 lr 0.000166	time 0.8865 (0.8869)	loss 0.4959 (0.4969)	grad_norm 2.8551 (2.8047)	mem 20675MB
[2025-04-03 03:45:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][510/573]	eta 0:00:55 lr 0.000166	time 0.8778 (0.8868)	loss 0.3043 (0.4963)	grad_norm 2.4355 (2.8055)	mem 20675MB
[2025-04-03 03:45:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][512/573]	eta 0:00:54 lr 0.000165	time 0.8799 (0.8868)	loss 0.6180 (0.4965)	grad_norm 2.1732 (2.8048)	mem 20675MB
[2025-04-03 03:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][514/573]	eta 0:00:52 lr 0.000165	time 0.8777 (0.8868)	loss 0.6433 (0.4967)	grad_norm 3.2073 (2.8041)	mem 20675MB
[2025-04-03 03:45:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][516/573]	eta 0:00:50 lr 0.000165	time 0.8779 (0.8868)	loss 0.4594 (0.4969)	grad_norm 3.1757 (2.8040)	mem 20675MB
[2025-04-03 03:45:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][518/573]	eta 0:00:48 lr 0.000165	time 0.8779 (0.8867)	loss 0.6169 (0.4972)	grad_norm 2.6811 (2.8019)	mem 20675MB
[2025-04-03 03:45:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][520/573]	eta 0:00:46 lr 0.000165	time 0.8789 (0.8867)	loss 0.4726 (0.4972)	grad_norm 3.5202 (2.8053)	mem 20675MB
[2025-04-03 03:45:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][522/573]	eta 0:00:45 lr 0.000165	time 0.8775 (0.8867)	loss 0.3935 (0.4970)	grad_norm 3.5005 (2.8052)	mem 20675MB
[2025-04-03 03:45:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][524/573]	eta 0:00:43 lr 0.000165	time 0.8777 (0.8867)	loss 0.4094 (0.4969)	grad_norm 3.9733 (2.8052)	mem 20675MB
[2025-04-03 03:45:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][526/573]	eta 0:00:41 lr 0.000164	time 0.8780 (0.8867)	loss 0.4316 (0.4968)	grad_norm 2.1945 (2.8034)	mem 20675MB
[2025-04-03 03:46:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][528/573]	eta 0:00:39 lr 0.000164	time 0.8782 (0.8866)	loss 0.4964 (0.4968)	grad_norm 2.1949 (2.8009)	mem 20675MB
[2025-04-03 03:46:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][530/573]	eta 0:00:38 lr 0.000164	time 0.8813 (0.8866)	loss 0.5678 (0.4972)	grad_norm 2.0290 (2.7995)	mem 20675MB
[2025-04-03 03:46:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][532/573]	eta 0:00:36 lr 0.000164	time 0.8777 (0.8866)	loss 0.5363 (0.4972)	grad_norm 2.7362 (2.7999)	mem 20675MB
[2025-04-03 03:46:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][534/573]	eta 0:00:34 lr 0.000164	time 0.8784 (0.8866)	loss 0.3362 (0.4969)	grad_norm 3.0105 (2.8002)	mem 20675MB
[2025-04-03 03:46:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][536/573]	eta 0:00:32 lr 0.000164	time 0.8780 (0.8865)	loss 0.5716 (0.4970)	grad_norm 1.9765 (2.7972)	mem 20675MB
[2025-04-03 03:46:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][538/573]	eta 0:00:31 lr 0.000163	time 0.8819 (0.8865)	loss 0.5488 (0.4969)	grad_norm 2.2055 (2.7979)	mem 20675MB
[2025-04-03 03:46:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][540/573]	eta 0:00:29 lr 0.000163	time 0.8780 (0.8865)	loss 0.6126 (0.4973)	grad_norm 3.2686 (2.7986)	mem 20675MB
[2025-04-03 03:46:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][542/573]	eta 0:00:27 lr 0.000163	time 0.8780 (0.8865)	loss 0.5313 (0.4975)	grad_norm 2.6190 (2.7973)	mem 20675MB
[2025-04-03 03:46:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][544/573]	eta 0:00:25 lr 0.000163	time 0.8778 (0.8864)	loss 0.5171 (0.4976)	grad_norm 2.5616 (2.7961)	mem 20675MB
[2025-04-03 03:46:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][546/573]	eta 0:00:23 lr 0.000163	time 0.8817 (0.8864)	loss 0.5393 (0.4977)	grad_norm 2.4838 (2.7943)	mem 20675MB
[2025-04-03 03:46:17 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][548/573]	eta 0:00:22 lr 0.000163	time 0.8780 (0.8864)	loss 0.4919 (0.4978)	grad_norm 3.1248 (2.7941)	mem 20675MB
[2025-04-03 03:46:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][550/573]	eta 0:00:20 lr 0.000163	time 0.8776 (0.8864)	loss 0.3351 (0.4976)	grad_norm 3.4530 (2.7946)	mem 20675MB
[2025-04-03 03:46:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][552/573]	eta 0:00:18 lr 0.000162	time 0.8778 (0.8863)	loss 0.3901 (0.4972)	grad_norm 5.2633 (2.7988)	mem 20675MB
[2025-04-03 03:46:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][554/573]	eta 0:00:16 lr 0.000162	time 0.8775 (0.8863)	loss 0.4622 (0.4971)	grad_norm 2.5411 (2.7993)	mem 20675MB
[2025-04-03 03:46:24 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][556/573]	eta 0:00:15 lr 0.000162	time 0.8780 (0.8863)	loss 0.5435 (0.4969)	grad_norm 3.0870 (2.8004)	mem 20675MB
[2025-04-03 03:46:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][558/573]	eta 0:00:13 lr 0.000162	time 0.8775 (0.8863)	loss 0.5625 (0.4970)	grad_norm 2.2831 (2.7996)	mem 20675MB
[2025-04-03 03:46:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][560/573]	eta 0:00:11 lr 0.000162	time 0.8773 (0.8863)	loss 0.4736 (0.4968)	grad_norm 2.2735 (2.7999)	mem 20675MB
[2025-04-03 03:46:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][562/573]	eta 0:00:09 lr 0.000162	time 0.8837 (0.8862)	loss 0.5416 (0.4970)	grad_norm 2.0986 (2.7980)	mem 20675MB
[2025-04-03 03:46:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][564/573]	eta 0:00:07 lr 0.000161	time 0.8773 (0.8862)	loss 0.4305 (0.4965)	grad_norm 4.2081 (2.8030)	mem 20675MB
[2025-04-03 03:46:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][566/573]	eta 0:00:06 lr 0.000161	time 0.8773 (0.8862)	loss 0.5020 (0.4966)	grad_norm 2.4006 (2.8026)	mem 20675MB
[2025-04-03 03:46:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][568/573]	eta 0:00:04 lr 0.000161	time 0.8775 (0.8862)	loss 0.5763 (0.4966)	grad_norm 2.2144 (2.8016)	mem 20675MB
[2025-04-03 03:46:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][570/573]	eta 0:00:02 lr 0.000161	time 0.8923 (0.8862)	loss 0.4673 (0.4965)	grad_norm 4.7010 (2.8051)	mem 20675MB
[2025-04-03 03:46:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][572/573]	eta 0:00:00 lr 0.000161	time 0.8771 (0.8862)	loss 0.5480 (0.4967)	grad_norm 2.6006 (2.8042)	mem 20675MB
[2025-04-03 03:46:39 simmim_finetune] (main_finetune.py 260): INFO EPOCH 22 training takes 0:08:27
[2025-04-03 03:46:42 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.047 (3.047)	Loss 0.4713 (0.4713)	Acc@1 74.219 (74.219)	Mem 20675MB
[2025-04-03 03:46:42 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.284 (1.207)	Loss 0.4205 (0.4345)	Acc@1 77.344 (77.083)	Mem 20675MB
[2025-04-03 03:46:43 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.289 (0.840)	Loss 0.4533 (0.4306)	Acc@1 76.562 (77.500)	Mem 20675MB
[2025-04-03 03:46:44 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.681)	Loss 0.3832 (0.4176)	Acc@1 85.156 (79.688)	Mem 20675MB
[2025-04-03 03:46:44 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.593)	Loss 0.5644 (0.4332)	Acc@1 71.094 (79.167)	Mem 20675MB
[2025-04-03 03:46:45 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.537)	Loss 0.5283 (0.4546)	Acc@1 77.344 (78.480)	Mem 20675MB
[2025-04-03 03:46:45 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.498)	Loss 0.5132 (0.4629)	Acc@1 77.344 (78.245)	Mem 20675MB
[2025-04-03 03:46:46 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.284 (0.469)	Loss 0.4793 (0.4657)	Acc@1 78.906 (78.177)	Mem 20675MB
[2025-04-03 03:46:46 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.024
[2025-04-03 03:46:46 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.0%
[2025-04-03 03:46:46 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.02%
[2025-04-03 03:46:46 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [8.116894626412138e-07, 8.116894626412138e-07, 1.13143419566528e-06, 1.13143419566528e-06, 1.6233491695484589e-06, 1.6233491695484589e-06, 2.3801414370610424e-06, 2.3801414370610424e-06, 3.5444372332342463e-06, 3.5444372332342463e-06, 5.3356615350391765e-06, 5.3356615350391765e-06, 8.091391230123682e-06, 8.091391230123682e-06, 1.2330975376407538e-05, 1.2330975376407538e-05, 1.8853412524536548e-05, 1.8853412524536548e-05, 2.8887931213965798e-05, 2.8887931213965798e-05, 4.432565227462617e-05, 4.432565227462617e-05, 6.807599236794985e-05, 6.807599236794985e-05, 0.00010461497712690932, 0.00010461497712690932, 0.00016082879983300081, 0.00016082879983300081]
[2025-04-03 03:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][0/573]	eta 0:32:30 lr 0.000161	time 3.4040 (3.4040)	loss 0.5512 (0.5512)	grad_norm 2.0747 (2.0747)	mem 20675MB
[2025-04-03 03:46:52 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][2/573]	eta 0:16:22 lr 0.000161	time 0.8778 (1.7207)	loss 0.5736 (0.5403)	grad_norm 3.1001 (2.5813)	mem 20675MB
[2025-04-03 03:46:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][4/573]	eta 0:13:07 lr 0.000160	time 0.8787 (1.3843)	loss 0.4249 (0.5328)	grad_norm 3.0130 (2.8823)	mem 20675MB
[2025-04-03 03:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][6/573]	eta 0:11:43 lr 0.000160	time 0.8785 (1.2400)	loss 0.3459 (0.5036)	grad_norm 2.6853 (2.7311)	mem 20675MB
[2025-04-03 03:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][8/573]	eta 0:10:58 lr 0.000160	time 0.8792 (1.1647)	loss 0.4892 (0.5062)	grad_norm 2.8948 (2.6643)	mem 20675MB
[2025-04-03 03:46:59 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][10/573]	eta 0:10:26 lr 0.000160	time 0.8788 (1.1129)	loss 0.3248 (0.4850)	grad_norm 3.1393 (2.6660)	mem 20675MB
[2025-04-03 03:47:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][12/573]	eta 0:10:04 lr 0.000160	time 0.8786 (1.0771)	loss 0.5707 (0.4875)	grad_norm 2.4189 (2.6147)	mem 20675MB
[2025-04-03 03:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][14/573]	eta 0:09:47 lr 0.000160	time 0.8780 (1.0508)	loss 0.4467 (0.4810)	grad_norm 2.5231 (2.5967)	mem 20675MB
[2025-04-03 03:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][16/573]	eta 0:09:34 lr 0.000160	time 0.8775 (1.0309)	loss 0.4657 (0.4792)	grad_norm 3.0567 (2.6218)	mem 20675MB
[2025-04-03 03:47:06 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][18/573]	eta 0:09:23 lr 0.000159	time 0.8779 (1.0149)	loss 0.4033 (0.4796)	grad_norm 3.3032 (2.6409)	mem 20675MB
[2025-04-03 03:47:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][20/573]	eta 0:09:14 lr 0.000159	time 0.8784 (1.0033)	loss 0.5182 (0.4838)	grad_norm 2.4624 (2.6266)	mem 20675MB
[2025-04-03 03:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][22/573]	eta 0:09:07 lr 0.000159	time 0.8796 (0.9929)	loss 0.3872 (0.4821)	grad_norm 3.3700 (2.6429)	mem 20675MB
[2025-04-03 03:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][24/573]	eta 0:09:00 lr 0.000159	time 0.8778 (0.9838)	loss 0.3387 (0.4827)	grad_norm 3.2709 (2.6917)	mem 20675MB
[2025-04-03 03:47:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][26/573]	eta 0:08:53 lr 0.000159	time 0.8789 (0.9760)	loss 0.5754 (0.4863)	grad_norm 2.8840 (2.7076)	mem 20675MB
[2025-04-03 03:47:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][28/573]	eta 0:08:48 lr 0.000159	time 0.8774 (0.9693)	loss 0.5362 (0.4918)	grad_norm 2.5966 (2.7014)	mem 20675MB
[2025-04-03 03:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][30/573]	eta 0:08:43 lr 0.000158	time 0.8777 (0.9637)	loss 0.4265 (0.4906)	grad_norm 2.4536 (2.7022)	mem 20675MB
[2025-04-03 03:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][32/573]	eta 0:08:38 lr 0.000158	time 0.8792 (0.9592)	loss 0.5589 (0.4962)	grad_norm 2.0779 (2.6739)	mem 20675MB
[2025-04-03 03:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][34/573]	eta 0:08:34 lr 0.000158	time 0.8777 (0.9547)	loss 0.5651 (0.4994)	grad_norm 1.7880 (2.6257)	mem 20675MB
[2025-04-03 03:47:22 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][36/573]	eta 0:08:30 lr 0.000158	time 0.8787 (0.9506)	loss 0.4336 (0.4988)	grad_norm 4.1259 (2.6500)	mem 20675MB
[2025-04-03 03:47:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][38/573]	eta 0:08:26 lr 0.000158	time 0.8779 (0.9469)	loss 0.5606 (0.5019)	grad_norm 2.5081 (2.6362)	mem 20675MB
[2025-04-03 03:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][40/573]	eta 0:08:22 lr 0.000158	time 0.8815 (0.9437)	loss 0.4544 (0.5019)	grad_norm 1.9276 (2.6244)	mem 20675MB
[2025-04-03 03:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][42/573]	eta 0:08:19 lr 0.000158	time 0.8777 (0.9407)	loss 0.4381 (0.4990)	grad_norm 2.0854 (2.6305)	mem 20675MB
[2025-04-03 03:47:29 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][44/573]	eta 0:08:16 lr 0.000157	time 0.8847 (0.9381)	loss 0.5548 (0.4994)	grad_norm 2.1969 (2.6308)	mem 20675MB
[2025-04-03 03:47:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][46/573]	eta 0:08:13 lr 0.000157	time 0.8774 (0.9356)	loss 0.5500 (0.5012)	grad_norm 2.3727 (2.6090)	mem 20675MB
[2025-04-03 03:47:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][48/573]	eta 0:08:09 lr 0.000157	time 0.8790 (0.9333)	loss 0.5654 (0.5034)	grad_norm 2.3444 (2.5912)	mem 20675MB
[2025-04-03 03:47:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][50/573]	eta 0:08:06 lr 0.000157	time 0.8780 (0.9311)	loss 0.3879 (0.5009)	grad_norm 3.5127 (2.6135)	mem 20675MB
[2025-04-03 03:47:36 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][52/573]	eta 0:08:04 lr 0.000157	time 0.8908 (0.9294)	loss 0.4734 (0.5025)	grad_norm 2.7717 (2.6120)	mem 20675MB
[2025-04-03 03:47:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][54/573]	eta 0:08:01 lr 0.000157	time 0.8777 (0.9276)	loss 0.4561 (0.5015)	grad_norm 3.6341 (2.6243)	mem 20675MB
[2025-04-03 03:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][56/573]	eta 0:07:58 lr 0.000156	time 0.8775 (0.9258)	loss 0.5865 (0.5016)	grad_norm 2.8340 (2.6456)	mem 20675MB
[2025-04-03 03:47:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][58/573]	eta 0:07:55 lr 0.000156	time 0.8782 (0.9242)	loss 0.4449 (0.4993)	grad_norm 2.9628 (2.6635)	mem 20675MB
[2025-04-03 03:47:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][60/573]	eta 0:07:53 lr 0.000156	time 0.8774 (0.9228)	loss 0.5256 (0.4996)	grad_norm 2.1671 (2.6675)	mem 20675MB
[2025-04-03 03:47:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][62/573]	eta 0:07:50 lr 0.000156	time 0.8775 (0.9214)	loss 0.4791 (0.4981)	grad_norm 2.5858 (2.6755)	mem 20675MB
[2025-04-03 03:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][64/573]	eta 0:07:48 lr 0.000156	time 0.8856 (0.9204)	loss 0.3869 (0.4936)	grad_norm 3.4502 (2.6878)	mem 20675MB
[2025-04-03 03:47:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][66/573]	eta 0:07:46 lr 0.000156	time 0.8777 (0.9192)	loss 0.5112 (0.4929)	grad_norm 2.3174 (2.6735)	mem 20675MB
[2025-04-03 03:47:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][68/573]	eta 0:07:43 lr 0.000156	time 0.8781 (0.9180)	loss 0.4709 (0.4903)	grad_norm 2.9105 (2.6645)	mem 20675MB
[2025-04-03 03:47:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][70/573]	eta 0:07:41 lr 0.000155	time 0.8785 (0.9169)	loss 0.5913 (0.4922)	grad_norm 3.0881 (2.6725)	mem 20675MB
[2025-04-03 03:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][72/573]	eta 0:07:38 lr 0.000155	time 0.8845 (0.9160)	loss 0.5881 (0.4929)	grad_norm 2.1636 (2.6698)	mem 20675MB
[2025-04-03 03:47:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][74/573]	eta 0:07:36 lr 0.000155	time 0.8776 (0.9150)	loss 0.5517 (0.4924)	grad_norm 1.9946 (2.6694)	mem 20675MB
[2025-04-03 03:47:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][76/573]	eta 0:07:34 lr 0.000155	time 0.8775 (0.9141)	loss 0.5438 (0.4926)	grad_norm 2.5746 (2.6724)	mem 20675MB
[2025-04-03 03:47:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][78/573]	eta 0:07:32 lr 0.000155	time 0.8774 (0.9132)	loss 0.3682 (0.4909)	grad_norm 2.9973 (2.6709)	mem 20675MB
[2025-04-03 03:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][80/573]	eta 0:07:29 lr 0.000155	time 0.8784 (0.9124)	loss 0.5526 (0.4899)	grad_norm 2.5263 (2.6708)	mem 20675MB
[2025-04-03 03:48:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][82/573]	eta 0:07:27 lr 0.000155	time 0.8778 (0.9117)	loss 0.6111 (0.4910)	grad_norm 2.3985 (2.6777)	mem 20675MB
[2025-04-03 03:48:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][84/573]	eta 0:07:25 lr 0.000154	time 0.8777 (0.9109)	loss 0.5250 (0.4910)	grad_norm 2.4136 (2.6758)	mem 20675MB
[2025-04-03 03:48:06 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][86/573]	eta 0:07:23 lr 0.000154	time 0.8780 (0.9102)	loss 0.4401 (0.4910)	grad_norm 3.5446 (2.6860)	mem 20675MB
[2025-04-03 03:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][88/573]	eta 0:07:21 lr 0.000154	time 0.8970 (0.9097)	loss 0.4019 (0.4901)	grad_norm 4.8347 (2.7322)	mem 20675MB
[2025-04-03 03:48:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][90/573]	eta 0:07:19 lr 0.000154	time 0.8778 (0.9091)	loss 0.5419 (0.4906)	grad_norm 2.7149 (2.7731)	mem 20675MB
[2025-04-03 03:48:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][92/573]	eta 0:07:16 lr 0.000154	time 0.8821 (0.9084)	loss 0.5037 (0.4897)	grad_norm 3.1065 (2.7852)	mem 20675MB
[2025-04-03 03:48:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][94/573]	eta 0:07:14 lr 0.000154	time 0.8776 (0.9078)	loss 0.4268 (0.4890)	grad_norm 2.7245 (2.7953)	mem 20675MB
[2025-04-03 03:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][96/573]	eta 0:07:12 lr 0.000153	time 0.8780 (0.9072)	loss 0.5731 (0.4892)	grad_norm 2.0170 (2.7876)	mem 20675MB
[2025-04-03 03:48:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][98/573]	eta 0:07:10 lr 0.000153	time 0.8857 (0.9067)	loss 0.6212 (0.4906)	grad_norm 2.7564 (2.7861)	mem 20675MB
[2025-04-03 03:48:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][100/573]	eta 0:07:08 lr 0.000153	time 0.8787 (0.9063)	loss 0.4772 (0.4887)	grad_norm 3.6267 (2.7988)	mem 20675MB
[2025-04-03 03:48:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][102/573]	eta 0:07:06 lr 0.000153	time 0.8775 (0.9057)	loss 0.5355 (0.4897)	grad_norm 2.4543 (2.7912)	mem 20675MB
[2025-04-03 03:48:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][104/573]	eta 0:07:04 lr 0.000153	time 0.8779 (0.9052)	loss 0.4108 (0.4902)	grad_norm 3.2839 (2.7940)	mem 20675MB
[2025-04-03 03:48:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][106/573]	eta 0:07:02 lr 0.000153	time 0.8777 (0.9047)	loss 0.5279 (0.4892)	grad_norm 2.7040 (2.8054)	mem 20675MB
[2025-04-03 03:48:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][108/573]	eta 0:07:00 lr 0.000153	time 0.8833 (0.9046)	loss 0.6291 (0.4915)	grad_norm 2.5069 (2.7967)	mem 20675MB
[2025-04-03 03:48:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][110/573]	eta 0:06:58 lr 0.000152	time 0.8968 (0.9043)	loss 0.4264 (0.4921)	grad_norm 3.4284 (2.7987)	mem 20675MB
[2025-04-03 03:48:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][112/573]	eta 0:06:56 lr 0.000152	time 0.8776 (0.9039)	loss 0.4712 (0.4924)	grad_norm 1.8288 (2.7823)	mem 20675MB
[2025-04-03 03:48:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][114/573]	eta 0:06:54 lr 0.000152	time 0.8773 (0.9035)	loss 0.4921 (0.4925)	grad_norm 1.9577 (2.7727)	mem 20675MB
[2025-04-03 03:48:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][116/573]	eta 0:06:52 lr 0.000152	time 0.8796 (0.9031)	loss 0.4891 (0.4913)	grad_norm 2.2083 (2.7729)	mem 20675MB
[2025-04-03 03:48:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][118/573]	eta 0:06:50 lr 0.000152	time 0.8775 (0.9027)	loss 0.4974 (0.4909)	grad_norm 1.8248 (2.7687)	mem 20675MB
[2025-04-03 03:48:36 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][120/573]	eta 0:06:48 lr 0.000152	time 0.8783 (0.9023)	loss 0.3679 (0.4896)	grad_norm 3.8392 (2.7712)	mem 20675MB
[2025-04-03 03:48:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][122/573]	eta 0:06:46 lr 0.000152	time 0.8778 (0.9020)	loss 0.6139 (0.4905)	grad_norm 2.6006 (2.7668)	mem 20675MB
[2025-04-03 03:48:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][124/573]	eta 0:06:44 lr 0.000151	time 0.8778 (0.9016)	loss 0.3894 (0.4887)	grad_norm 2.7614 (2.7736)	mem 20675MB
[2025-04-03 03:48:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][126/573]	eta 0:06:42 lr 0.000151	time 0.8778 (0.9012)	loss 0.5043 (0.4877)	grad_norm 4.2966 (2.7870)	mem 20675MB
[2025-04-03 03:48:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][128/573]	eta 0:06:40 lr 0.000151	time 0.8817 (0.9009)	loss 0.3690 (0.4868)	grad_norm 4.5076 (2.7926)	mem 20675MB
[2025-04-03 03:48:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][130/573]	eta 0:06:38 lr 0.000151	time 0.8774 (0.9006)	loss 0.5329 (0.4861)	grad_norm 2.4075 (2.7984)	mem 20675MB
[2025-04-03 03:48:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][132/573]	eta 0:06:37 lr 0.000151	time 0.8776 (0.9003)	loss 0.6546 (0.4866)	grad_norm 3.8812 (2.8149)	mem 20675MB
[2025-04-03 03:48:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][134/573]	eta 0:06:35 lr 0.000151	time 0.8773 (0.9000)	loss 0.3883 (0.4853)	grad_norm 2.8240 (2.8252)	mem 20675MB
[2025-04-03 03:48:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][136/573]	eta 0:06:33 lr 0.000151	time 0.8774 (0.8997)	loss 0.4363 (0.4857)	grad_norm 2.5059 (2.8172)	mem 20675MB
[2025-04-03 03:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][138/573]	eta 0:06:31 lr 0.000150	time 0.8778 (0.8996)	loss 0.5727 (0.4855)	grad_norm 2.3701 (2.8162)	mem 20675MB
[2025-04-03 03:48:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][140/573]	eta 0:06:29 lr 0.000150	time 0.8857 (0.8995)	loss 0.4274 (0.4844)	grad_norm 2.5667 (2.8193)	mem 20675MB
[2025-04-03 03:48:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][142/573]	eta 0:06:27 lr 0.000150	time 0.8788 (0.8993)	loss 0.5176 (0.4850)	grad_norm 2.4581 (2.8138)	mem 20675MB
[2025-04-03 03:48:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][144/573]	eta 0:06:25 lr 0.000150	time 0.8783 (0.8990)	loss 0.6037 (0.4863)	grad_norm 2.5636 (2.8129)	mem 20675MB
[2025-04-03 03:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][146/573]	eta 0:06:23 lr 0.000150	time 0.8783 (0.8987)	loss 0.4913 (0.4863)	grad_norm 3.2766 (2.8144)	mem 20675MB
[2025-04-03 03:49:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][148/573]	eta 0:06:21 lr 0.000150	time 0.8905 (0.8985)	loss 0.4405 (0.4848)	grad_norm 2.1913 (2.8154)	mem 20675MB
[2025-04-03 03:49:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][150/573]	eta 0:06:19 lr 0.000149	time 0.8793 (0.8983)	loss 0.3912 (0.4843)	grad_norm 2.3776 (2.8083)	mem 20675MB
[2025-04-03 03:49:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][152/573]	eta 0:06:18 lr 0.000149	time 0.8785 (0.8980)	loss 0.4836 (0.4851)	grad_norm 3.1513 (2.8094)	mem 20675MB
[2025-04-03 03:49:06 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][154/573]	eta 0:06:16 lr 0.000149	time 0.8774 (0.8978)	loss 0.3888 (0.4847)	grad_norm 4.0647 (2.8137)	mem 20675MB
[2025-04-03 03:49:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][156/573]	eta 0:06:14 lr 0.000149	time 0.8788 (0.8975)	loss 0.3202 (0.4841)	grad_norm 3.4686 (2.8188)	mem 20675MB
[2025-04-03 03:49:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][158/573]	eta 0:06:12 lr 0.000149	time 0.8782 (0.8973)	loss 0.5551 (0.4840)	grad_norm 2.7200 (2.8269)	mem 20675MB
[2025-04-03 03:49:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][160/573]	eta 0:06:10 lr 0.000149	time 0.8780 (0.8971)	loss 0.5499 (0.4844)	grad_norm 2.1867 (2.8235)	mem 20675MB
[2025-04-03 03:49:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][162/573]	eta 0:06:08 lr 0.000149	time 0.8776 (0.8969)	loss 0.4859 (0.4850)	grad_norm 3.1330 (2.8221)	mem 20675MB
[2025-04-03 03:49:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][164/573]	eta 0:06:06 lr 0.000148	time 0.8811 (0.8967)	loss 0.4240 (0.4851)	grad_norm 3.2821 (2.8192)	mem 20675MB
[2025-04-03 03:49:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][166/573]	eta 0:06:04 lr 0.000148	time 0.8855 (0.8965)	loss 0.5449 (0.4857)	grad_norm 2.1780 (2.8131)	mem 20675MB
[2025-04-03 03:49:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][168/573]	eta 0:06:02 lr 0.000148	time 0.8773 (0.8963)	loss 0.5231 (0.4864)	grad_norm 2.7196 (2.8114)	mem 20675MB
[2025-04-03 03:49:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][170/573]	eta 0:06:01 lr 0.000148	time 0.8776 (0.8961)	loss 0.3796 (0.4860)	grad_norm 2.7310 (2.8099)	mem 20675MB
[2025-04-03 03:49:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][172/573]	eta 0:05:59 lr 0.000148	time 0.8776 (0.8959)	loss 0.5289 (0.4868)	grad_norm 1.8060 (2.8150)	mem 20675MB
[2025-04-03 03:49:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][174/573]	eta 0:05:57 lr 0.000148	time 0.8778 (0.8957)	loss 0.3817 (0.4868)	grad_norm 1.7341 (2.8052)	mem 20675MB
[2025-04-03 03:49:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][176/573]	eta 0:05:55 lr 0.000148	time 0.8782 (0.8956)	loss 0.5167 (0.4872)	grad_norm 2.3854 (2.8005)	mem 20675MB
[2025-04-03 03:49:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][178/573]	eta 0:05:53 lr 0.000147	time 0.8776 (0.8954)	loss 0.5620 (0.4877)	grad_norm 1.5446 (2.8031)	mem 20675MB
[2025-04-03 03:49:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][180/573]	eta 0:05:51 lr 0.000147	time 0.8792 (0.8953)	loss 0.5250 (0.4876)	grad_norm 2.6369 (2.8043)	mem 20675MB
[2025-04-03 03:49:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][182/573]	eta 0:05:49 lr 0.000147	time 0.8774 (0.8951)	loss 0.5191 (0.4871)	grad_norm 1.9940 (2.8053)	mem 20675MB
[2025-04-03 03:49:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][184/573]	eta 0:05:48 lr 0.000147	time 0.8789 (0.8949)	loss 0.3800 (0.4870)	grad_norm 2.9313 (2.8048)	mem 20675MB
[2025-04-03 03:49:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][186/573]	eta 0:05:46 lr 0.000147	time 0.8774 (0.8948)	loss 0.3862 (0.4869)	grad_norm 2.0946 (2.7997)	mem 20675MB
[2025-04-03 03:49:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][188/573]	eta 0:05:44 lr 0.000147	time 0.8829 (0.8947)	loss 0.5757 (0.4879)	grad_norm 3.1786 (2.8005)	mem 20675MB
[2025-04-03 03:49:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][190/573]	eta 0:05:42 lr 0.000147	time 0.8775 (0.8948)	loss 0.5395 (0.4886)	grad_norm 2.4233 (2.7936)	mem 20675MB
[2025-04-03 03:49:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][192/573]	eta 0:05:40 lr 0.000146	time 0.8786 (0.8946)	loss 0.6394 (0.4896)	grad_norm 2.6558 (2.7904)	mem 20675MB
[2025-04-03 03:49:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][194/573]	eta 0:05:39 lr 0.000146	time 0.8789 (0.8945)	loss 0.4970 (0.4900)	grad_norm 2.1036 (2.7825)	mem 20675MB
[2025-04-03 03:49:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][196/573]	eta 0:05:37 lr 0.000146	time 0.8822 (0.8944)	loss 0.4712 (0.4905)	grad_norm 1.8906 (2.7788)	mem 20675MB
[2025-04-03 03:49:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][198/573]	eta 0:05:35 lr 0.000146	time 0.8773 (0.8942)	loss 0.5586 (0.4908)	grad_norm 2.3304 (2.7771)	mem 20675MB
[2025-04-03 03:49:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][200/573]	eta 0:05:33 lr 0.000146	time 0.8774 (0.8941)	loss 0.5594 (0.4917)	grad_norm 2.4852 (2.7751)	mem 20675MB
[2025-04-03 03:49:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][202/573]	eta 0:05:31 lr 0.000146	time 0.8779 (0.8939)	loss 0.4659 (0.4918)	grad_norm 2.6327 (2.7695)	mem 20675MB
[2025-04-03 03:49:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][204/573]	eta 0:05:29 lr 0.000145	time 0.8777 (0.8938)	loss 0.5854 (0.4918)	grad_norm 1.8859 (2.7674)	mem 20675MB
[2025-04-03 03:49:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][206/573]	eta 0:05:27 lr 0.000145	time 0.8779 (0.8936)	loss 0.4876 (0.4914)	grad_norm 2.7758 (2.7676)	mem 20675MB
[2025-04-03 03:49:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][208/573]	eta 0:05:26 lr 0.000145	time 0.8866 (0.8936)	loss 0.4964 (0.4912)	grad_norm 2.9804 (2.7717)	mem 20675MB
[2025-04-03 03:49:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][210/573]	eta 0:05:24 lr 0.000145	time 0.9217 (0.8937)	loss 0.5355 (0.4913)	grad_norm 2.9524 (2.7768)	mem 20675MB
[2025-04-03 03:49:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][212/573]	eta 0:05:22 lr 0.000145	time 0.8779 (0.8935)	loss 0.3686 (0.4905)	grad_norm 2.9742 (2.7769)	mem 20675MB
[2025-04-03 03:49:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][214/573]	eta 0:05:20 lr 0.000145	time 0.8800 (0.8934)	loss 0.3673 (0.4903)	grad_norm 3.6848 (2.7764)	mem 20675MB
[2025-04-03 03:50:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][216/573]	eta 0:05:18 lr 0.000145	time 0.8780 (0.8933)	loss 0.3132 (0.4892)	grad_norm 4.7012 (2.7928)	mem 20675MB
[2025-04-03 03:50:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][218/573]	eta 0:05:17 lr 0.000144	time 0.8824 (0.8932)	loss 0.6172 (0.4900)	grad_norm 3.3662 (2.7944)	mem 20675MB
[2025-04-03 03:50:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][220/573]	eta 0:05:15 lr 0.000144	time 0.8775 (0.8930)	loss 0.3751 (0.4889)	grad_norm 3.5865 (2.7953)	mem 20675MB
[2025-04-03 03:50:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][222/573]	eta 0:05:13 lr 0.000144	time 0.8777 (0.8929)	loss 0.3898 (0.4880)	grad_norm 3.5283 (2.8023)	mem 20675MB
[2025-04-03 03:50:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][224/573]	eta 0:05:11 lr 0.000144	time 0.8797 (0.8928)	loss 0.3300 (0.4867)	grad_norm 3.9671 (2.8085)	mem 20675MB
[2025-04-03 03:50:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][226/573]	eta 0:05:09 lr 0.000144	time 0.9085 (0.8928)	loss 0.4721 (0.4864)	grad_norm 3.1098 (2.8129)	mem 20675MB
[2025-04-03 03:50:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][228/573]	eta 0:05:07 lr 0.000144	time 0.8869 (0.8927)	loss 0.4932 (0.4862)	grad_norm 2.6321 (2.8130)	mem 20675MB
[2025-04-03 03:50:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][230/573]	eta 0:05:06 lr 0.000144	time 0.8779 (0.8926)	loss 0.4265 (0.4861)	grad_norm 5.0248 (2.8229)	mem 20675MB
[2025-04-03 03:50:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][232/573]	eta 0:05:04 lr 0.000143	time 0.8788 (0.8925)	loss 0.5350 (0.4857)	grad_norm 3.1312 (2.8288)	mem 20675MB
[2025-04-03 03:50:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][234/573]	eta 0:05:02 lr 0.000143	time 0.8777 (0.8924)	loss 0.6037 (0.4860)	grad_norm 2.8037 (2.8292)	mem 20675MB
[2025-04-03 03:50:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][236/573]	eta 0:05:00 lr 0.000143	time 0.8781 (0.8923)	loss 0.4905 (0.4866)	grad_norm 3.4696 (2.8311)	mem 20675MB
[2025-04-03 03:50:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][238/573]	eta 0:04:58 lr 0.000143	time 0.8859 (0.8923)	loss 0.4385 (0.4867)	grad_norm 2.4716 (2.8333)	mem 20675MB
[2025-04-03 03:50:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][240/573]	eta 0:04:57 lr 0.000143	time 0.8802 (0.8922)	loss 0.4031 (0.4858)	grad_norm 4.0144 (2.8421)	mem 20675MB
[2025-04-03 03:50:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][242/573]	eta 0:04:55 lr 0.000143	time 0.8780 (0.8921)	loss 0.4368 (0.4857)	grad_norm 2.9703 (2.8401)	mem 20675MB
[2025-04-03 03:50:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][244/573]	eta 0:04:53 lr 0.000143	time 0.8781 (0.8920)	loss 0.4229 (0.4856)	grad_norm 2.9199 (2.8397)	mem 20675MB
[2025-04-03 03:50:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][246/573]	eta 0:04:51 lr 0.000142	time 0.8779 (0.8919)	loss 0.5120 (0.4859)	grad_norm 2.0506 (2.8370)	mem 20675MB
[2025-04-03 03:50:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][248/573]	eta 0:04:49 lr 0.000142	time 0.8779 (0.8918)	loss 0.5356 (0.4865)	grad_norm 2.6776 (2.8363)	mem 20675MB
[2025-04-03 03:50:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][250/573]	eta 0:04:48 lr 0.000142	time 0.8778 (0.8917)	loss 0.5841 (0.4874)	grad_norm 2.8164 (2.8368)	mem 20675MB
[2025-04-03 03:50:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][252/573]	eta 0:04:46 lr 0.000142	time 0.8775 (0.8917)	loss 0.4929 (0.4872)	grad_norm 2.3514 (2.8334)	mem 20675MB
[2025-04-03 03:50:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][254/573]	eta 0:04:44 lr 0.000142	time 0.8771 (0.8916)	loss 0.4622 (0.4876)	grad_norm 3.4073 (2.8368)	mem 20675MB
[2025-04-03 03:50:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][256/573]	eta 0:04:42 lr 0.000142	time 0.8838 (0.8915)	loss 0.4920 (0.4880)	grad_norm 3.1910 (2.8350)	mem 20675MB
[2025-04-03 03:50:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][258/573]	eta 0:04:40 lr 0.000142	time 0.8780 (0.8914)	loss 0.5950 (0.4889)	grad_norm 2.3435 (2.8325)	mem 20675MB
[2025-04-03 03:50:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][260/573]	eta 0:04:38 lr 0.000141	time 0.8822 (0.8913)	loss 0.4384 (0.4892)	grad_norm 4.0682 (2.8326)	mem 20675MB
[2025-04-03 03:50:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][262/573]	eta 0:04:37 lr 0.000141	time 0.8782 (0.8913)	loss 0.5459 (0.4892)	grad_norm 2.0299 (2.8306)	mem 20675MB
[2025-04-03 03:50:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][264/573]	eta 0:04:35 lr 0.000141	time 0.8781 (0.8912)	loss 0.5098 (0.4893)	grad_norm 2.1790 (2.8256)	mem 20675MB
[2025-04-03 03:50:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][266/573]	eta 0:04:33 lr 0.000141	time 0.8775 (0.8911)	loss 0.4967 (0.4892)	grad_norm 2.1190 (2.8206)	mem 20675MB
[2025-04-03 03:50:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][268/573]	eta 0:04:31 lr 0.000141	time 0.8771 (0.8910)	loss 0.3401 (0.4888)	grad_norm 2.2614 (2.8205)	mem 20675MB
[2025-04-03 03:50:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][270/573]	eta 0:04:29 lr 0.000141	time 0.8872 (0.8909)	loss 0.5941 (0.4894)	grad_norm 2.3183 (2.8143)	mem 20675MB
[2025-04-03 03:50:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][272/573]	eta 0:04:28 lr 0.000141	time 0.8782 (0.8909)	loss 0.4958 (0.4892)	grad_norm 2.0869 (2.8116)	mem 20675MB
[2025-04-03 03:50:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][274/573]	eta 0:04:26 lr 0.000140	time 0.8777 (0.8908)	loss 0.4827 (0.4890)	grad_norm 3.0512 (2.8148)	mem 20675MB
[2025-04-03 03:50:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][276/573]	eta 0:04:24 lr 0.000140	time 0.8771 (0.8907)	loss 0.4008 (0.4892)	grad_norm 2.0702 (2.8100)	mem 20675MB
[2025-04-03 03:50:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][278/573]	eta 0:04:22 lr 0.000140	time 0.8819 (0.8906)	loss 0.4398 (0.4889)	grad_norm 2.9709 (2.8067)	mem 20675MB
[2025-04-03 03:50:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][280/573]	eta 0:04:20 lr 0.000140	time 0.8839 (0.8906)	loss 0.5564 (0.4892)	grad_norm 2.2258 (2.8038)	mem 20675MB
[2025-04-03 03:50:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][282/573]	eta 0:04:19 lr 0.000140	time 0.8830 (0.8906)	loss 0.5602 (0.4892)	grad_norm 2.3708 (2.8028)	mem 20675MB
[2025-04-03 03:51:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][284/573]	eta 0:04:17 lr 0.000140	time 0.8773 (0.8905)	loss 0.5607 (0.4896)	grad_norm 2.2369 (2.7984)	mem 20675MB
[2025-04-03 03:51:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][286/573]	eta 0:04:15 lr 0.000140	time 0.8772 (0.8904)	loss 0.5493 (0.4900)	grad_norm 1.7051 (2.7923)	mem 20675MB
[2025-04-03 03:51:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][288/573]	eta 0:04:13 lr 0.000139	time 0.8776 (0.8903)	loss 0.3964 (0.4898)	grad_norm 2.2912 (2.7867)	mem 20675MB
[2025-04-03 03:51:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][290/573]	eta 0:04:11 lr 0.000139	time 0.8776 (0.8903)	loss 0.4241 (0.4897)	grad_norm 4.6176 (2.7929)	mem 20675MB
[2025-04-03 03:51:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][292/573]	eta 0:04:10 lr 0.000139	time 0.8774 (0.8902)	loss 0.4681 (0.4896)	grad_norm 3.3889 (2.7947)	mem 20675MB
[2025-04-03 03:51:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][294/573]	eta 0:04:08 lr 0.000139	time 0.8775 (0.8901)	loss 0.4572 (0.4894)	grad_norm 2.5807 (2.7941)	mem 20675MB
[2025-04-03 03:51:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][296/573]	eta 0:04:06 lr 0.000139	time 0.8776 (0.8900)	loss 0.5732 (0.4898)	grad_norm 1.7294 (2.7874)	mem 20675MB
[2025-04-03 03:51:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][298/573]	eta 0:04:04 lr 0.000139	time 0.8797 (0.8900)	loss 0.3517 (0.4892)	grad_norm 4.4627 (2.7915)	mem 20675MB
[2025-04-03 03:51:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][300/573]	eta 0:04:02 lr 0.000139	time 0.8778 (0.8899)	loss 0.5086 (0.4896)	grad_norm 2.0659 (2.7874)	mem 20675MB
[2025-04-03 03:51:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][302/573]	eta 0:04:01 lr 0.000138	time 0.8771 (0.8898)	loss 0.5027 (0.4894)	grad_norm 3.2114 (2.7868)	mem 20675MB
[2025-04-03 03:51:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][304/573]	eta 0:03:59 lr 0.000138	time 0.8803 (0.8898)	loss 0.3558 (0.4889)	grad_norm 2.3803 (2.7858)	mem 20675MB
[2025-04-03 03:51:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][306/573]	eta 0:03:57 lr 0.000138	time 0.8782 (0.8897)	loss 0.4550 (0.4888)	grad_norm 4.0175 (2.7897)	mem 20675MB
[2025-04-03 03:51:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][308/573]	eta 0:03:55 lr 0.000138	time 0.8773 (0.8897)	loss 0.5246 (0.4888)	grad_norm 2.6940 (2.7881)	mem 20675MB
[2025-04-03 03:51:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][310/573]	eta 0:03:53 lr 0.000138	time 0.8790 (0.8896)	loss 0.5346 (0.4893)	grad_norm 2.4274 (2.7860)	mem 20675MB
[2025-04-03 03:51:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][312/573]	eta 0:03:52 lr 0.000138	time 0.8776 (0.8895)	loss 0.5103 (0.4895)	grad_norm 2.1972 (2.7863)	mem 20675MB
[2025-04-03 03:51:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][314/573]	eta 0:03:50 lr 0.000138	time 0.8776 (0.8895)	loss 0.5194 (0.4901)	grad_norm 2.1891 (2.7845)	mem 20675MB
[2025-04-03 03:51:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][316/573]	eta 0:03:48 lr 0.000137	time 0.8781 (0.8894)	loss 0.4958 (0.4901)	grad_norm 3.9024 (2.7880)	mem 20675MB
[2025-04-03 03:51:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][318/573]	eta 0:03:46 lr 0.000137	time 0.8777 (0.8894)	loss 0.5618 (0.4902)	grad_norm 3.3915 (2.7914)	mem 20675MB
[2025-04-03 03:51:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][320/573]	eta 0:03:45 lr 0.000137	time 0.8786 (0.8894)	loss 0.3607 (0.4895)	grad_norm 5.0968 (2.7964)	mem 20675MB
[2025-04-03 03:51:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][322/573]	eta 0:03:43 lr 0.000137	time 0.8926 (0.8894)	loss 0.3520 (0.4893)	grad_norm 3.3467 (2.7979)	mem 20675MB
[2025-04-03 03:51:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][324/573]	eta 0:03:41 lr 0.000137	time 0.8782 (0.8893)	loss 0.5051 (0.4898)	grad_norm 2.6135 (2.7970)	mem 20675MB
[2025-04-03 03:51:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][326/573]	eta 0:03:39 lr 0.000137	time 0.8781 (0.8893)	loss 0.4290 (0.4900)	grad_norm 2.4089 (2.7938)	mem 20675MB
[2025-04-03 03:51:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][328/573]	eta 0:03:37 lr 0.000137	time 0.8886 (0.8893)	loss 0.4544 (0.4902)	grad_norm 3.4462 (2.7927)	mem 20675MB
[2025-04-03 03:51:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][330/573]	eta 0:03:36 lr 0.000136	time 0.8869 (0.8892)	loss 0.4853 (0.4903)	grad_norm 3.5993 (2.7981)	mem 20675MB
[2025-04-03 03:51:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][332/573]	eta 0:03:34 lr 0.000136	time 0.8790 (0.8892)	loss 0.3069 (0.4900)	grad_norm 2.8628 (2.8009)	mem 20675MB
[2025-04-03 03:51:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][334/573]	eta 0:03:32 lr 0.000136	time 0.8780 (0.8891)	loss 0.6200 (0.4904)	grad_norm 2.7704 (2.8021)	mem 20675MB
[2025-04-03 03:51:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][336/573]	eta 0:03:30 lr 0.000136	time 0.8777 (0.8891)	loss 0.4183 (0.4903)	grad_norm 2.6795 (2.8019)	mem 20675MB
[2025-04-03 03:51:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][338/573]	eta 0:03:28 lr 0.000136	time 0.8777 (0.8890)	loss 0.3975 (0.4900)	grad_norm 3.2269 (2.8006)	mem 20675MB
[2025-04-03 03:51:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][340/573]	eta 0:03:27 lr 0.000136	time 0.8787 (0.8890)	loss 0.4543 (0.4896)	grad_norm 2.2926 (2.7993)	mem 20675MB
[2025-04-03 03:51:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][342/573]	eta 0:03:25 lr 0.000136	time 0.8771 (0.8890)	loss 0.5750 (0.4893)	grad_norm 2.2529 (2.7975)	mem 20675MB
[2025-04-03 03:51:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][344/573]	eta 0:03:23 lr 0.000135	time 0.8774 (0.8890)	loss 0.3410 (0.4891)	grad_norm 2.8787 (2.7966)	mem 20675MB
[2025-04-03 03:51:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][346/573]	eta 0:03:21 lr 0.000135	time 0.8782 (0.8889)	loss 0.3473 (0.4882)	grad_norm 2.4547 (2.7997)	mem 20675MB
[2025-04-03 03:51:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][348/573]	eta 0:03:19 lr 0.000135	time 0.8859 (0.8889)	loss 0.4119 (0.4881)	grad_norm 2.2697 (2.7969)	mem 20675MB
[2025-04-03 03:51:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][350/573]	eta 0:03:18 lr 0.000135	time 0.8773 (0.8888)	loss 0.4905 (0.4884)	grad_norm 2.7519 (2.7960)	mem 20675MB
[2025-04-03 03:52:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][352/573]	eta 0:03:16 lr 0.000135	time 0.8779 (0.8888)	loss 0.6017 (0.4887)	grad_norm 3.0291 (2.7940)	mem 20675MB
[2025-04-03 03:52:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][354/573]	eta 0:03:14 lr 0.000135	time 0.8775 (0.8887)	loss 0.4262 (0.4888)	grad_norm 2.5079 (2.7922)	mem 20675MB
[2025-04-03 03:52:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][356/573]	eta 0:03:12 lr 0.000135	time 0.8776 (0.8887)	loss 0.4114 (0.4887)	grad_norm 2.9832 (2.7903)	mem 20675MB
[2025-04-03 03:52:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][358/573]	eta 0:03:11 lr 0.000134	time 0.8779 (0.8886)	loss 0.5439 (0.4890)	grad_norm 2.6664 (2.7893)	mem 20675MB
[2025-04-03 03:52:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][360/573]	eta 0:03:09 lr 0.000134	time 0.8802 (0.8886)	loss 0.4267 (0.4886)	grad_norm 3.2653 (2.7930)	mem 20675MB
[2025-04-03 03:52:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][362/573]	eta 0:03:07 lr 0.000134	time 0.8774 (0.8885)	loss 0.5104 (0.4887)	grad_norm 3.2260 (2.7935)	mem 20675MB
[2025-04-03 03:52:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][364/573]	eta 0:03:05 lr 0.000134	time 0.8780 (0.8885)	loss 0.5416 (0.4887)	grad_norm 2.3469 (2.7913)	mem 20675MB
[2025-04-03 03:52:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][366/573]	eta 0:03:03 lr 0.000134	time 0.8776 (0.8884)	loss 0.3398 (0.4880)	grad_norm 2.6687 (2.7911)	mem 20675MB
[2025-04-03 03:52:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][368/573]	eta 0:03:02 lr 0.000134	time 0.8786 (0.8883)	loss 0.4545 (0.4882)	grad_norm 2.8483 (2.7918)	mem 20675MB
[2025-04-03 03:52:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][370/573]	eta 0:03:00 lr 0.000134	time 0.8781 (0.8883)	loss 0.5387 (0.4884)	grad_norm 4.6364 (2.7952)	mem 20675MB
[2025-04-03 03:52:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][372/573]	eta 0:02:58 lr 0.000133	time 0.9222 (0.8884)	loss 0.4277 (0.4878)	grad_norm 3.3524 (2.8018)	mem 20675MB
[2025-04-03 03:52:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][374/573]	eta 0:02:56 lr 0.000133	time 0.8795 (0.8883)	loss 0.5818 (0.4883)	grad_norm 2.1232 (2.7984)	mem 20675MB
[2025-04-03 03:52:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][376/573]	eta 0:02:54 lr 0.000133	time 0.8806 (0.8883)	loss 0.5275 (0.4884)	grad_norm 1.9801 (2.8014)	mem 20675MB
[2025-04-03 03:52:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][378/573]	eta 0:02:53 lr 0.000133	time 0.8979 (0.8883)	loss 0.4110 (0.4878)	grad_norm 2.5218 (2.8010)	mem 20675MB
[2025-04-03 03:52:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][380/573]	eta 0:02:51 lr 0.000133	time 0.8779 (0.8883)	loss 0.5750 (0.4880)	grad_norm 1.8824 (2.7984)	mem 20675MB
[2025-04-03 03:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][382/573]	eta 0:02:49 lr 0.000133	time 0.8774 (0.8882)	loss 0.4045 (0.4878)	grad_norm 2.8640 (2.7981)	mem 20675MB
[2025-04-03 03:52:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][384/573]	eta 0:02:47 lr 0.000133	time 0.8845 (0.8882)	loss 0.4876 (0.4878)	grad_norm 3.7358 (2.8015)	mem 20675MB
[2025-04-03 03:52:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][386/573]	eta 0:02:46 lr 0.000132	time 0.8777 (0.8881)	loss 0.3670 (0.4873)	grad_norm 2.3189 (2.8048)	mem 20675MB
[2025-04-03 03:52:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][388/573]	eta 0:02:44 lr 0.000132	time 0.8778 (0.8881)	loss 0.4880 (0.4874)	grad_norm 1.7432 (2.8008)	mem 20675MB
[2025-04-03 03:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][390/573]	eta 0:02:42 lr 0.000132	time 0.8775 (0.8880)	loss 0.5171 (0.4871)	grad_norm 2.2050 (2.8006)	mem 20675MB
[2025-04-03 03:52:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][392/573]	eta 0:02:40 lr 0.000132	time 0.8784 (0.8880)	loss 0.3409 (0.4867)	grad_norm 3.1036 (2.8020)	mem 20675MB
[2025-04-03 03:52:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][394/573]	eta 0:02:38 lr 0.000132	time 0.8787 (0.8880)	loss 0.4063 (0.4867)	grad_norm 3.9040 (2.8041)	mem 20675MB
[2025-04-03 03:52:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][396/573]	eta 0:02:37 lr 0.000132	time 0.8806 (0.8879)	loss 0.6265 (0.4872)	grad_norm 2.8359 (2.8038)	mem 20675MB
[2025-04-03 03:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][398/573]	eta 0:02:35 lr 0.000132	time 0.8968 (0.8879)	loss 0.3454 (0.4871)	grad_norm 2.7990 (2.8025)	mem 20675MB
[2025-04-03 03:52:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][400/573]	eta 0:02:33 lr 0.000131	time 0.8782 (0.8879)	loss 0.4561 (0.4871)	grad_norm 2.2861 (2.8023)	mem 20675MB
[2025-04-03 03:52:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][402/573]	eta 0:02:31 lr 0.000131	time 0.8777 (0.8879)	loss 0.6047 (0.4875)	grad_norm 6.6278 (2.8123)	mem 20675MB
[2025-04-03 03:52:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][404/573]	eta 0:02:30 lr 0.000131	time 0.8774 (0.8878)	loss 0.5311 (0.4877)	grad_norm 2.9711 (2.8141)	mem 20675MB
[2025-04-03 03:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][406/573]	eta 0:02:28 lr 0.000131	time 0.8777 (0.8878)	loss 0.4988 (0.4875)	grad_norm 1.7326 (2.8133)	mem 20675MB
[2025-04-03 03:52:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][408/573]	eta 0:02:26 lr 0.000131	time 0.8774 (0.8877)	loss 0.4286 (0.4874)	grad_norm 2.4737 (2.8102)	mem 20675MB
[2025-04-03 03:52:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][410/573]	eta 0:02:24 lr 0.000131	time 0.8775 (0.8877)	loss 0.6096 (0.4875)	grad_norm 2.4270 (2.8095)	mem 20675MB
[2025-04-03 03:52:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][412/573]	eta 0:02:22 lr 0.000131	time 0.8784 (0.8877)	loss 0.4918 (0.4873)	grad_norm 3.5525 (2.8122)	mem 20675MB
[2025-04-03 03:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][414/573]	eta 0:02:21 lr 0.000130	time 0.8779 (0.8876)	loss 0.4623 (0.4870)	grad_norm 2.8302 (2.8161)	mem 20675MB
[2025-04-03 03:52:56 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][416/573]	eta 0:02:19 lr 0.000130	time 0.8866 (0.8876)	loss 0.4330 (0.4867)	grad_norm 2.1419 (2.8168)	mem 20675MB
[2025-04-03 03:52:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][418/573]	eta 0:02:17 lr 0.000130	time 0.8819 (0.8876)	loss 0.3752 (0.4865)	grad_norm 2.9259 (2.8154)	mem 20675MB
[2025-04-03 03:53:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][420/573]	eta 0:02:15 lr 0.000130	time 0.8777 (0.8875)	loss 0.5626 (0.4867)	grad_norm 3.1157 (2.8148)	mem 20675MB
[2025-04-03 03:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][422/573]	eta 0:02:14 lr 0.000130	time 0.8777 (0.8875)	loss 0.5278 (0.4865)	grad_norm 2.6798 (2.8134)	mem 20675MB
[2025-04-03 03:53:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][424/573]	eta 0:02:12 lr 0.000130	time 0.8782 (0.8875)	loss 0.3770 (0.4864)	grad_norm 3.4481 (2.8127)	mem 20675MB
[2025-04-03 03:53:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][426/573]	eta 0:02:10 lr 0.000130	time 0.8824 (0.8874)	loss 0.5517 (0.4865)	grad_norm 2.0575 (2.8089)	mem 20675MB
[2025-04-03 03:53:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][428/573]	eta 0:02:08 lr 0.000130	time 0.8792 (0.8874)	loss 0.5604 (0.4863)	grad_norm 2.1060 (2.8090)	mem 20675MB
[2025-04-03 03:53:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][430/573]	eta 0:02:06 lr 0.000129	time 0.8788 (0.8874)	loss 0.4013 (0.4862)	grad_norm 3.5034 (2.8108)	mem 20675MB
[2025-04-03 03:53:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][432/573]	eta 0:02:05 lr 0.000129	time 0.8784 (0.8873)	loss 0.5365 (0.4864)	grad_norm 2.3012 (2.8104)	mem 20675MB
[2025-04-03 03:53:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][434/573]	eta 0:02:03 lr 0.000129	time 0.8864 (0.8873)	loss 0.5725 (0.4864)	grad_norm 2.6661 (2.8143)	mem 20675MB
[2025-04-03 03:53:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][436/573]	eta 0:02:01 lr 0.000129	time 0.8782 (0.8873)	loss 0.5932 (0.4868)	grad_norm 1.9993 (2.8114)	mem 20675MB
[2025-04-03 03:53:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][438/573]	eta 0:01:59 lr 0.000129	time 0.8777 (0.8872)	loss 0.5565 (0.4867)	grad_norm 2.4801 (2.8116)	mem 20675MB
[2025-04-03 03:53:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][440/573]	eta 0:01:57 lr 0.000129	time 0.8780 (0.8872)	loss 0.3792 (0.4864)	grad_norm 2.9859 (2.8122)	mem 20675MB
[2025-04-03 03:53:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][442/573]	eta 0:01:56 lr 0.000129	time 0.8890 (0.8872)	loss 0.5057 (0.4865)	grad_norm 2.4556 (2.8098)	mem 20675MB
[2025-04-03 03:53:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][444/573]	eta 0:01:54 lr 0.000128	time 0.8787 (0.8872)	loss 0.5399 (0.4867)	grad_norm 2.4174 (2.8075)	mem 20675MB
[2025-04-03 03:53:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][446/573]	eta 0:01:52 lr 0.000128	time 0.8778 (0.8871)	loss 0.5575 (0.4869)	grad_norm 2.1447 (2.8047)	mem 20675MB
[2025-04-03 03:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][448/573]	eta 0:01:50 lr 0.000128	time 0.8775 (0.8871)	loss 0.4880 (0.4871)	grad_norm 1.6125 (2.8011)	mem 20675MB
[2025-04-03 03:53:26 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][450/573]	eta 0:01:49 lr 0.000128	time 0.8847 (0.8871)	loss 0.4905 (0.4871)	grad_norm 2.4905 (2.7999)	mem 20675MB
[2025-04-03 03:53:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][452/573]	eta 0:01:47 lr 0.000128	time 0.8794 (0.8870)	loss 0.3267 (0.4869)	grad_norm 2.9567 (2.8004)	mem 20675MB
[2025-04-03 03:53:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][454/573]	eta 0:01:45 lr 0.000128	time 0.8780 (0.8870)	loss 0.3573 (0.4868)	grad_norm 3.3286 (2.8052)	mem 20675MB
[2025-04-03 03:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][456/573]	eta 0:01:43 lr 0.000128	time 0.8831 (0.8870)	loss 0.5223 (0.4866)	grad_norm 2.4596 (2.8064)	mem 20675MB
[2025-04-03 03:53:33 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][458/573]	eta 0:01:42 lr 0.000127	time 0.8973 (0.8870)	loss 0.4110 (0.4866)	grad_norm 2.3101 (2.8068)	mem 20675MB
[2025-04-03 03:53:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][460/573]	eta 0:01:40 lr 0.000127	time 0.8845 (0.8870)	loss 0.3896 (0.4861)	grad_norm 3.2172 (2.8077)	mem 20675MB
[2025-04-03 03:53:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][462/573]	eta 0:01:38 lr 0.000127	time 0.8772 (0.8869)	loss 0.4010 (0.4857)	grad_norm 2.9176 (2.8082)	mem 20675MB
[2025-04-03 03:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][464/573]	eta 0:01:36 lr 0.000127	time 0.8785 (0.8869)	loss 0.4159 (0.4855)	grad_norm 2.8168 (2.8084)	mem 20675MB
[2025-04-03 03:53:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][466/573]	eta 0:01:34 lr 0.000127	time 0.8878 (0.8870)	loss 0.4908 (0.4853)	grad_norm 2.2449 (2.8088)	mem 20675MB
[2025-04-03 03:53:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][468/573]	eta 0:01:33 lr 0.000127	time 0.8795 (0.8869)	loss 0.3684 (0.4851)	grad_norm 4.1334 (2.8097)	mem 20675MB
[2025-04-03 03:53:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][470/573]	eta 0:01:31 lr 0.000127	time 0.8839 (0.8869)	loss 0.4314 (0.4849)	grad_norm 3.2598 (2.8178)	mem 20675MB
[2025-04-03 03:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][472/573]	eta 0:01:29 lr 0.000126	time 0.8776 (0.8869)	loss 0.4910 (0.4849)	grad_norm 2.6781 (2.8207)	mem 20675MB
[2025-04-03 03:53:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][474/573]	eta 0:01:27 lr 0.000126	time 0.8871 (0.8869)	loss 0.4248 (0.4847)	grad_norm 3.1598 (2.8216)	mem 20675MB
[2025-04-03 03:53:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][476/573]	eta 0:01:26 lr 0.000126	time 0.8780 (0.8869)	loss 0.4459 (0.4844)	grad_norm 2.7862 (2.8330)	mem 20675MB
[2025-04-03 03:53:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][478/573]	eta 0:01:24 lr 0.000126	time 0.8772 (0.8868)	loss 0.4157 (0.4843)	grad_norm 3.1592 (2.8317)	mem 20675MB
[2025-04-03 03:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][480/573]	eta 0:01:22 lr 0.000126	time 0.8775 (0.8868)	loss 0.5749 (0.4846)	grad_norm 2.8486 (2.8308)	mem 20675MB
[2025-04-03 03:53:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][482/573]	eta 0:01:20 lr 0.000126	time 0.8773 (0.8868)	loss 0.5231 (0.4846)	grad_norm 3.5673 (2.8366)	mem 20675MB
[2025-04-03 03:53:56 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][484/573]	eta 0:01:18 lr 0.000126	time 0.8910 (0.8868)	loss 0.5806 (0.4850)	grad_norm 2.1786 (2.8344)	mem 20675MB
[2025-04-03 03:53:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][486/573]	eta 0:01:17 lr 0.000125	time 0.8781 (0.8868)	loss 0.5306 (0.4848)	grad_norm 3.4489 (2.8340)	mem 20675MB
[2025-04-03 03:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][488/573]	eta 0:01:15 lr 0.000125	time 0.8828 (0.8868)	loss 0.4899 (0.4849)	grad_norm 2.1100 (2.8319)	mem 20675MB
[2025-04-03 03:54:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][490/573]	eta 0:01:13 lr 0.000125	time 0.8776 (0.8868)	loss 0.5446 (0.4851)	grad_norm 2.0473 (2.8295)	mem 20675MB
[2025-04-03 03:54:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][492/573]	eta 0:01:11 lr 0.000125	time 0.8773 (0.8868)	loss 0.4865 (0.4850)	grad_norm 2.1390 (2.8286)	mem 20675MB
[2025-04-03 03:54:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][494/573]	eta 0:01:10 lr 0.000125	time 0.8776 (0.8867)	loss 0.5390 (0.4853)	grad_norm 2.7232 (2.8305)	mem 20675MB
[2025-04-03 03:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][496/573]	eta 0:01:08 lr 0.000125	time 0.8776 (0.8867)	loss 0.5674 (0.4856)	grad_norm 3.3563 (2.8308)	mem 20675MB
[2025-04-03 03:54:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][498/573]	eta 0:01:06 lr 0.000125	time 0.8821 (0.8867)	loss 0.5748 (0.4856)	grad_norm 2.3572 (2.8297)	mem 20675MB
[2025-04-03 03:54:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][500/573]	eta 0:01:04 lr 0.000125	time 0.8781 (0.8867)	loss 0.4583 (0.4858)	grad_norm 2.5971 (2.8275)	mem 20675MB
[2025-04-03 03:54:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][502/573]	eta 0:01:02 lr 0.000124	time 0.8785 (0.8866)	loss 0.5778 (0.4861)	grad_norm 2.2052 (2.8248)	mem 20675MB
[2025-04-03 03:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][504/573]	eta 0:01:01 lr 0.000124	time 0.8782 (0.8866)	loss 0.5343 (0.4862)	grad_norm 2.4688 (2.8219)	mem 20675MB
[2025-04-03 03:54:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][506/573]	eta 0:00:59 lr 0.000124	time 0.8778 (0.8866)	loss 0.4070 (0.4861)	grad_norm 3.7405 (2.8222)	mem 20675MB
[2025-04-03 03:54:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][508/573]	eta 0:00:57 lr 0.000124	time 0.8797 (0.8866)	loss 0.4699 (0.4858)	grad_norm 2.1270 (2.8212)	mem 20675MB
[2025-04-03 03:54:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][510/573]	eta 0:00:55 lr 0.000124	time 0.8777 (0.8866)	loss 0.5416 (0.4857)	grad_norm 2.3710 (2.8222)	mem 20675MB
[2025-04-03 03:54:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][512/573]	eta 0:00:54 lr 0.000124	time 0.8777 (0.8866)	loss 0.5753 (0.4856)	grad_norm 2.9345 (2.8222)	mem 20675MB
[2025-04-03 03:54:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][514/573]	eta 0:00:52 lr 0.000124	time 0.8780 (0.8866)	loss 0.5113 (0.4857)	grad_norm 2.0104 (2.8199)	mem 20675MB
[2025-04-03 03:54:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][516/573]	eta 0:00:50 lr 0.000123	time 0.8779 (0.8865)	loss 0.5768 (0.4856)	grad_norm 2.0217 (2.8174)	mem 20675MB
[2025-04-03 03:54:26 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][518/573]	eta 0:00:48 lr 0.000123	time 0.8779 (0.8865)	loss 0.5147 (0.4857)	grad_norm 3.2970 (2.8174)	mem 20675MB
[2025-04-03 03:54:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][520/573]	eta 0:00:46 lr 0.000123	time 0.8780 (0.8865)	loss 0.4820 (0.4856)	grad_norm 3.0170 (2.8170)	mem 20675MB
[2025-04-03 03:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][522/573]	eta 0:00:45 lr 0.000123	time 0.8777 (0.8865)	loss 0.4808 (0.4854)	grad_norm 2.0186 (2.8153)	mem 20675MB
[2025-04-03 03:54:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][524/573]	eta 0:00:43 lr 0.000123	time 0.8789 (0.8864)	loss 0.5214 (0.4853)	grad_norm 2.1291 (2.8141)	mem 20675MB
[2025-04-03 03:54:33 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][526/573]	eta 0:00:41 lr 0.000123	time 0.8779 (0.8864)	loss 0.4277 (0.4852)	grad_norm 2.8007 (2.8127)	mem 20675MB
[2025-04-03 03:54:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][528/573]	eta 0:00:39 lr 0.000123	time 0.8779 (0.8864)	loss 0.5166 (0.4852)	grad_norm 1.8558 (2.8107)	mem 20675MB
[2025-04-03 03:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][530/573]	eta 0:00:38 lr 0.000122	time 0.8778 (0.8863)	loss 0.5211 (0.4854)	grad_norm 2.2094 (2.8090)	mem 20675MB
[2025-04-03 03:54:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][532/573]	eta 0:00:36 lr 0.000122	time 0.8774 (0.8863)	loss 0.4806 (0.4858)	grad_norm 3.4137 (2.8098)	mem 20675MB
[2025-04-03 03:54:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][534/573]	eta 0:00:34 lr 0.000122	time 0.8776 (0.8863)	loss 0.4935 (0.4858)	grad_norm 3.8153 (2.8098)	mem 20675MB
[2025-04-03 03:54:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][536/573]	eta 0:00:32 lr 0.000122	time 0.8774 (0.8863)	loss 0.4775 (0.4857)	grad_norm 3.8352 (2.8124)	mem 20675MB
[2025-04-03 03:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][538/573]	eta 0:00:31 lr 0.000122	time 0.8820 (0.8863)	loss 0.5103 (0.4859)	grad_norm 2.8359 (2.8117)	mem 20675MB
[2025-04-03 03:54:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][540/573]	eta 0:00:29 lr 0.000122	time 0.8773 (0.8863)	loss 0.4877 (0.4859)	grad_norm 2.4053 (2.8097)	mem 20675MB
[2025-04-03 03:54:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][542/573]	eta 0:00:27 lr 0.000122	time 0.8791 (0.8862)	loss 0.5135 (0.4859)	grad_norm 2.5230 (2.8076)	mem 20675MB
[2025-04-03 03:54:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][544/573]	eta 0:00:25 lr 0.000122	time 0.8771 (0.8862)	loss 0.4440 (0.4860)	grad_norm 2.5561 (2.8063)	mem 20675MB
[2025-04-03 03:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][546/573]	eta 0:00:23 lr 0.000121	time 0.8778 (0.8862)	loss 0.5655 (0.4860)	grad_norm 1.9880 (2.8043)	mem 20675MB
[2025-04-03 03:54:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][548/573]	eta 0:00:22 lr 0.000121	time 0.8772 (0.8862)	loss 0.3453 (0.4858)	grad_norm 3.6143 (2.8058)	mem 20675MB
[2025-04-03 03:54:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][550/573]	eta 0:00:20 lr 0.000121	time 0.8775 (0.8861)	loss 0.3788 (0.4856)	grad_norm 3.9104 (2.8070)	mem 20675MB
[2025-04-03 03:54:56 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][552/573]	eta 0:00:18 lr 0.000121	time 0.8774 (0.8861)	loss 0.4585 (0.4857)	grad_norm 3.0364 (2.8069)	mem 20675MB
[2025-04-03 03:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][554/573]	eta 0:00:16 lr 0.000121	time 0.8772 (0.8861)	loss 0.4228 (0.4857)	grad_norm 3.8097 (2.8093)	mem 20675MB
[2025-04-03 03:55:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][556/573]	eta 0:00:15 lr 0.000121	time 0.8810 (0.8861)	loss 0.4933 (0.4858)	grad_norm 3.3946 (2.8087)	mem 20675MB
[2025-04-03 03:55:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][558/573]	eta 0:00:13 lr 0.000121	time 0.8840 (0.8861)	loss 0.4836 (0.4860)	grad_norm 2.5264 (2.8080)	mem 20675MB
[2025-04-03 03:55:03 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][560/573]	eta 0:00:11 lr 0.000120	time 0.8773 (0.8860)	loss 0.4989 (0.4859)	grad_norm 2.4097 (2.8063)	mem 20675MB
[2025-04-03 03:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][562/573]	eta 0:00:09 lr 0.000120	time 0.8840 (0.8860)	loss 0.5554 (0.4859)	grad_norm 2.0491 (2.8062)	mem 20675MB
[2025-04-03 03:55:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][564/573]	eta 0:00:07 lr 0.000120	time 0.8776 (0.8860)	loss 0.5039 (0.4862)	grad_norm 2.7205 (2.8059)	mem 20675MB
[2025-04-03 03:55:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][566/573]	eta 0:00:06 lr 0.000120	time 0.8776 (0.8860)	loss 0.6207 (0.4863)	grad_norm 4.2046 (2.8085)	mem 20675MB
[2025-04-03 03:55:10 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][568/573]	eta 0:00:04 lr 0.000120	time 0.8773 (0.8859)	loss 0.5134 (0.4863)	grad_norm 2.2626 (2.8090)	mem 20675MB
[2025-04-03 03:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][570/573]	eta 0:00:02 lr 0.000120	time 0.8775 (0.8859)	loss 0.5356 (0.4864)	grad_norm 2.2044 (2.8072)	mem 20675MB
[2025-04-03 03:55:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][572/573]	eta 0:00:00 lr 0.000120	time 0.8776 (0.8859)	loss 0.5220 (0.4865)	grad_norm 2.3988 (2.8059)	mem 20675MB
[2025-04-03 03:55:14 simmim_finetune] (main_finetune.py 260): INFO EPOCH 23 training takes 0:08:27
[2025-04-03 03:55:17 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.040 (3.040)	Loss 0.5397 (0.5397)	Acc@1 69.531 (69.531)	Mem 20675MB
[2025-04-03 03:55:18 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.209)	Loss 0.4983 (0.5058)	Acc@1 75.000 (73.698)	Mem 20675MB
[2025-04-03 03:55:18 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.291 (0.841)	Loss 0.5235 (0.5010)	Acc@1 73.438 (74.219)	Mem 20675MB
[2025-04-03 03:55:19 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.682)	Loss 0.4448 (0.4855)	Acc@1 79.688 (75.670)	Mem 20675MB
[2025-04-03 03:55:20 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.285 (0.594)	Loss 0.4745 (0.4743)	Acc@1 75.781 (76.823)	Mem 20675MB
[2025-04-03 03:55:20 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.538)	Loss 0.4500 (0.4738)	Acc@1 83.594 (77.202)	Mem 20675MB
[2025-04-03 03:55:21 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.499)	Loss 0.4376 (0.4688)	Acc@1 80.469 (77.584)	Mem 20675MB
[2025-04-03 03:55:21 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.471)	Loss 0.4120 (0.4607)	Acc@1 80.469 (78.177)	Mem 20675MB
[2025-04-03 03:55:22 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.226
[2025-04-03 03:55:22 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.2%
[2025-04-03 03:55:22 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
[2025-04-03 03:55:22 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [6.676766349856207e-07, 6.676766349856207e-07, 9.054412950450785e-07, 9.054412950450785e-07, 1.2712330797519366e-06, 1.2712330797519366e-06, 1.8339896716086415e-06, 1.8339896716086415e-06, 2.6997690436958795e-06, 2.6997690436958795e-06, 4.031737308445477e-06, 4.031737308445477e-06, 6.080919254214087e-06, 6.080919254214087e-06, 9.23350686308887e-06, 9.23350686308887e-06, 1.4083641645973154e-05, 1.4083641645973154e-05, 2.1545387465795134e-05, 2.1545387465795134e-05, 3.30249964193674e-05, 3.30249964193674e-05, 5.068593327101705e-05, 5.068593327101705e-05, 7.785660535047805e-05, 7.785660535047805e-05, 0.00011965763931887957, 0.00011965763931887957]
[2025-04-03 03:55:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][0/573]	eta 0:36:11 lr 0.000120	time 3.7892 (3.7892)	loss 0.4350 (0.4350)	grad_norm 2.3819 (2.3819)	mem 20675MB
[2025-04-03 03:55:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][2/573]	eta 0:17:38 lr 0.000119	time 0.8872 (1.8529)	loss 0.4584 (0.4396)	grad_norm 2.5965 (3.0443)	mem 20675MB
[2025-04-03 03:55:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][4/573]	eta 0:13:53 lr 0.000119	time 0.8906 (1.4657)	loss 0.4389 (0.4705)	grad_norm 3.7200 (3.0016)	mem 20675MB
[2025-04-03 03:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][6/573]	eta 0:12:15 lr 0.000119	time 0.8777 (1.2979)	loss 0.5087 (0.4797)	grad_norm 2.2405 (2.7518)	mem 20675MB
[2025-04-03 03:55:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][8/573]	eta 0:11:20 lr 0.000119	time 0.8792 (1.2049)	loss 0.4913 (0.4851)	grad_norm 2.9870 (2.7103)	mem 20675MB
[2025-04-03 03:55:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][10/573]	eta 0:10:44 lr 0.000119	time 0.8777 (1.1456)	loss 0.6070 (0.5065)	grad_norm 2.3288 (2.6365)	mem 20675MB
[2025-04-03 03:55:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][12/573]	eta 0:10:19 lr 0.000119	time 0.8778 (1.1047)	loss 0.6124 (0.5072)	grad_norm 2.7547 (2.6598)	mem 20675MB
[2025-04-03 03:55:38 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][14/573]	eta 0:10:01 lr 0.000119	time 0.8941 (1.0757)	loss 0.3587 (0.4964)	grad_norm 2.9916 (2.6827)	mem 20675MB
[2025-04-03 03:55:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][16/573]	eta 0:09:46 lr 0.000119	time 0.8773 (1.0525)	loss 0.5051 (0.4954)	grad_norm 3.4125 (2.7229)	mem 20675MB
[2025-04-03 03:55:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][18/573]	eta 0:09:33 lr 0.000118	time 0.8782 (1.0342)	loss 0.5895 (0.4990)	grad_norm 2.8390 (2.7738)	mem 20675MB
[2025-04-03 03:55:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][20/573]	eta 0:09:23 lr 0.000118	time 0.8776 (1.0197)	loss 0.5725 (0.5091)	grad_norm 2.1197 (2.7353)	mem 20675MB
[2025-04-03 03:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][22/573]	eta 0:09:15 lr 0.000118	time 0.8786 (1.0075)	loss 0.4679 (0.5001)	grad_norm 2.8992 (2.7668)	mem 20675MB
[2025-04-03 03:55:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][24/573]	eta 0:09:07 lr 0.000118	time 0.8785 (0.9974)	loss 0.5550 (0.5078)	grad_norm 2.5167 (2.7337)	mem 20675MB
[2025-04-03 03:55:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][26/573]	eta 0:09:00 lr 0.000118	time 0.8777 (0.9887)	loss 0.5605 (0.5143)	grad_norm 1.7948 (2.6928)	mem 20675MB
[2025-04-03 03:55:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][28/573]	eta 0:08:54 lr 0.000118	time 0.8777 (0.9811)	loss 0.5639 (0.5177)	grad_norm 2.0074 (2.6537)	mem 20675MB
[2025-04-03 03:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][30/573]	eta 0:08:49 lr 0.000118	time 0.8774 (0.9746)	loss 0.3670 (0.5130)	grad_norm 4.2365 (2.6781)	mem 20675MB
[2025-04-03 03:55:54 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][32/573]	eta 0:08:44 lr 0.000117	time 0.8779 (0.9688)	loss 0.5027 (0.5065)	grad_norm 2.5569 (2.6907)	mem 20675MB
[2025-04-03 03:55:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][34/573]	eta 0:08:39 lr 0.000117	time 0.8774 (0.9636)	loss 0.3504 (0.5029)	grad_norm 2.6053 (2.6577)	mem 20675MB
[2025-04-03 03:55:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][36/573]	eta 0:08:35 lr 0.000117	time 0.8776 (0.9591)	loss 0.5680 (0.5042)	grad_norm 2.9900 (2.6612)	mem 20675MB
[2025-04-03 03:55:59 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][38/573]	eta 0:08:30 lr 0.000117	time 0.8787 (0.9549)	loss 0.4546 (0.5045)	grad_norm 2.7228 (2.6738)	mem 20675MB
[2025-04-03 03:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][40/573]	eta 0:08:27 lr 0.000117	time 0.8835 (0.9514)	loss 0.4821 (0.5041)	grad_norm 2.2494 (2.6566)	mem 20675MB
[2025-04-03 03:56:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][42/573]	eta 0:08:23 lr 0.000117	time 0.8904 (0.9483)	loss 0.4692 (0.5035)	grad_norm 2.6453 (2.6579)	mem 20675MB
[2025-04-03 03:56:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][44/573]	eta 0:08:20 lr 0.000117	time 0.8782 (0.9454)	loss 0.5512 (0.5046)	grad_norm 2.3386 (2.6310)	mem 20675MB
[2025-04-03 03:56:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][46/573]	eta 0:08:16 lr 0.000117	time 0.8795 (0.9426)	loss 0.5392 (0.5034)	grad_norm 2.7114 (2.6254)	mem 20675MB
[2025-04-03 03:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][48/573]	eta 0:08:13 lr 0.000116	time 0.8782 (0.9400)	loss 0.4864 (0.5011)	grad_norm 3.0912 (2.6331)	mem 20675MB
[2025-04-03 03:56:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][50/573]	eta 0:08:10 lr 0.000116	time 0.8777 (0.9376)	loss 0.5177 (0.4994)	grad_norm 1.9763 (2.6147)	mem 20675MB
[2025-04-03 03:56:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][52/573]	eta 0:08:07 lr 0.000116	time 0.8771 (0.9355)	loss 0.4381 (0.4992)	grad_norm 2.3510 (2.5982)	mem 20675MB
[2025-04-03 03:56:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][54/573]	eta 0:08:04 lr 0.000116	time 0.8774 (0.9334)	loss 0.5195 (0.4987)	grad_norm 2.9520 (2.6127)	mem 20675MB
[2025-04-03 03:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][56/573]	eta 0:08:01 lr 0.000116	time 0.8789 (0.9315)	loss 0.3304 (0.4922)	grad_norm 4.2121 (2.6573)	mem 20675MB
[2025-04-03 03:56:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][58/573]	eta 0:07:58 lr 0.000116	time 0.8773 (0.9297)	loss 0.5634 (0.4913)	grad_norm 3.2838 (2.6874)	mem 20675MB
[2025-04-03 03:56:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][60/573]	eta 0:07:56 lr 0.000116	time 0.8798 (0.9281)	loss 0.5999 (0.4932)	grad_norm 2.5492 (2.6883)	mem 20675MB
[2025-04-03 03:56:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][62/573]	eta 0:07:53 lr 0.000115	time 0.8776 (0.9266)	loss 0.5608 (0.4940)	grad_norm 3.3634 (2.7018)	mem 20675MB
[2025-04-03 03:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][64/573]	eta 0:07:50 lr 0.000115	time 0.8776 (0.9251)	loss 0.4668 (0.4920)	grad_norm 3.8988 (2.7279)	mem 20675MB
[2025-04-03 03:56:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][66/573]	eta 0:07:48 lr 0.000115	time 0.8776 (0.9237)	loss 0.5139 (0.4917)	grad_norm 3.9028 (2.7630)	mem 20675MB
[2025-04-03 03:56:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][68/573]	eta 0:07:45 lr 0.000115	time 0.8778 (0.9224)	loss 0.3440 (0.4893)	grad_norm 3.2607 (2.7622)	mem 20675MB
[2025-04-03 03:56:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][70/573]	eta 0:07:43 lr 0.000115	time 0.8772 (0.9214)	loss 0.5237 (0.4914)	grad_norm 3.1232 (2.7651)	mem 20675MB
[2025-04-03 03:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][72/573]	eta 0:07:41 lr 0.000115	time 0.8774 (0.9203)	loss 0.3223 (0.4897)	grad_norm 2.5514 (2.7544)	mem 20675MB
[2025-04-03 03:56:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][74/573]	eta 0:07:38 lr 0.000115	time 0.8774 (0.9192)	loss 0.4960 (0.4901)	grad_norm 3.8000 (2.7656)	mem 20675MB
[2025-04-03 03:56:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][76/573]	eta 0:07:36 lr 0.000115	time 0.8785 (0.9182)	loss 0.4662 (0.4904)	grad_norm 3.4260 (2.7742)	mem 20675MB
[2025-04-03 03:56:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][78/573]	eta 0:07:34 lr 0.000114	time 0.8817 (0.9172)	loss 0.4785 (0.4910)	grad_norm 2.9661 (2.7678)	mem 20675MB
[2025-04-03 03:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][80/573]	eta 0:07:31 lr 0.000114	time 0.8799 (0.9163)	loss 0.4509 (0.4890)	grad_norm 4.7615 (2.8041)	mem 20675MB
[2025-04-03 03:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][82/573]	eta 0:07:29 lr 0.000114	time 0.8875 (0.9155)	loss 0.5528 (0.4881)	grad_norm 1.8056 (2.7914)	mem 20675MB
[2025-04-03 03:56:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][84/573]	eta 0:07:27 lr 0.000114	time 0.8776 (0.9148)	loss 0.5082 (0.4884)	grad_norm 2.0752 (2.7816)	mem 20675MB
[2025-04-03 03:56:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][86/573]	eta 0:07:25 lr 0.000114	time 0.8780 (0.9140)	loss 0.5622 (0.4894)	grad_norm 2.7202 (2.7782)	mem 20675MB
[2025-04-03 03:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][88/573]	eta 0:07:22 lr 0.000114	time 0.8776 (0.9132)	loss 0.5489 (0.4915)	grad_norm 3.1149 (2.7986)	mem 20675MB
[2025-04-03 03:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][90/573]	eta 0:07:20 lr 0.000114	time 0.8784 (0.9125)	loss 0.3558 (0.4905)	grad_norm 4.0658 (2.8063)	mem 20675MB
[2025-04-03 03:56:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][92/573]	eta 0:07:18 lr 0.000113	time 0.8777 (0.9118)	loss 0.4078 (0.4894)	grad_norm 3.5348 (2.8082)	mem 20675MB
[2025-04-03 03:56:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][94/573]	eta 0:07:16 lr 0.000113	time 0.8778 (0.9111)	loss 0.4957 (0.4900)	grad_norm 3.3230 (2.8086)	mem 20675MB
[2025-04-03 03:56:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][96/573]	eta 0:07:14 lr 0.000113	time 0.8785 (0.9105)	loss 0.4870 (0.4894)	grad_norm 2.7214 (2.8049)	mem 20675MB
[2025-04-03 03:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][98/573]	eta 0:07:12 lr 0.000113	time 0.8776 (0.9098)	loss 0.6143 (0.4910)	grad_norm 2.1221 (2.7918)	mem 20675MB
[2025-04-03 03:56:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][100/573]	eta 0:07:10 lr 0.000113	time 0.8778 (0.9092)	loss 0.3838 (0.4898)	grad_norm 2.9632 (2.7838)	mem 20675MB
[2025-04-03 03:56:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][102/573]	eta 0:07:08 lr 0.000113	time 0.8846 (0.9087)	loss 0.5172 (0.4911)	grad_norm 1.9175 (2.7687)	mem 20675MB
[2025-04-03 03:56:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][104/573]	eta 0:07:05 lr 0.000113	time 0.8786 (0.9082)	loss 0.5264 (0.4920)	grad_norm 2.8187 (2.7600)	mem 20675MB
[2025-04-03 03:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][106/573]	eta 0:07:03 lr 0.000113	time 0.8778 (0.9076)	loss 0.4852 (0.4922)	grad_norm 2.5938 (2.7552)	mem 20675MB
[2025-04-03 03:57:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][108/573]	eta 0:07:01 lr 0.000112	time 0.8834 (0.9071)	loss 0.5400 (0.4914)	grad_norm 3.2379 (2.7695)	mem 20675MB
[2025-04-03 03:57:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][110/573]	eta 0:06:59 lr 0.000112	time 0.8773 (0.9067)	loss 0.5241 (0.4915)	grad_norm 2.7235 (2.7702)	mem 20675MB
[2025-04-03 03:57:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][112/573]	eta 0:06:57 lr 0.000112	time 0.8776 (0.9062)	loss 0.4972 (0.4920)	grad_norm 2.6298 (2.7681)	mem 20675MB
[2025-04-03 03:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][114/573]	eta 0:06:55 lr 0.000112	time 0.8775 (0.9057)	loss 0.5058 (0.4911)	grad_norm 2.7273 (2.7766)	mem 20675MB
[2025-04-03 03:57:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][116/573]	eta 0:06:53 lr 0.000112	time 0.8779 (0.9053)	loss 0.5749 (0.4916)	grad_norm 3.0068 (2.7830)	mem 20675MB
[2025-04-03 03:57:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][118/573]	eta 0:06:51 lr 0.000112	time 0.8780 (0.9048)	loss 0.5287 (0.4923)	grad_norm 2.7016 (2.7856)	mem 20675MB
[2025-04-03 03:57:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][120/573]	eta 0:06:49 lr 0.000112	time 0.8812 (0.9044)	loss 0.5929 (0.4937)	grad_norm 2.1077 (2.7749)	mem 20675MB
[2025-04-03 03:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][122/573]	eta 0:06:47 lr 0.000112	time 0.8774 (0.9040)	loss 0.3356 (0.4925)	grad_norm 5.2749 (2.7947)	mem 20675MB
[2025-04-03 03:57:15 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][124/573]	eta 0:06:45 lr 0.000111	time 0.8906 (0.9037)	loss 0.4273 (0.4920)	grad_norm 3.6582 (2.7999)	mem 20675MB
[2025-04-03 03:57:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][126/573]	eta 0:06:43 lr 0.000111	time 0.8777 (0.9033)	loss 0.5242 (0.4921)	grad_norm 2.6993 (2.7987)	mem 20675MB
[2025-04-03 03:57:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][128/573]	eta 0:06:41 lr 0.000111	time 0.8788 (0.9029)	loss 0.3899 (0.4911)	grad_norm 4.4473 (2.8223)	mem 20675MB
[2025-04-03 03:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][130/573]	eta 0:06:39 lr 0.000111	time 0.8779 (0.9026)	loss 0.6375 (0.4924)	grad_norm 2.5986 (2.8363)	mem 20675MB
[2025-04-03 03:57:22 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][132/573]	eta 0:06:37 lr 0.000111	time 0.8776 (0.9022)	loss 0.3546 (0.4915)	grad_norm 4.7226 (2.8498)	mem 20675MB
[2025-04-03 03:57:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][134/573]	eta 0:06:35 lr 0.000111	time 0.8779 (0.9019)	loss 0.3524 (0.4908)	grad_norm 3.1714 (2.8498)	mem 20675MB
[2025-04-03 03:57:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][136/573]	eta 0:06:34 lr 0.000111	time 0.8779 (0.9016)	loss 0.5479 (0.4911)	grad_norm 2.2412 (2.8492)	mem 20675MB
[2025-04-03 03:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][138/573]	eta 0:06:32 lr 0.000110	time 0.8781 (0.9013)	loss 0.5617 (0.4919)	grad_norm 2.2323 (2.8382)	mem 20675MB
[2025-04-03 03:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][140/573]	eta 0:06:30 lr 0.000110	time 0.8837 (0.9010)	loss 0.4807 (0.4926)	grad_norm 2.4807 (2.8295)	mem 20675MB
[2025-04-03 03:57:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][142/573]	eta 0:06:28 lr 0.000110	time 0.8821 (0.9007)	loss 0.3996 (0.4925)	grad_norm 3.3250 (2.8320)	mem 20675MB
[2025-04-03 03:57:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][144/573]	eta 0:06:26 lr 0.000110	time 0.8778 (0.9005)	loss 0.5353 (0.4924)	grad_norm 2.2445 (2.8324)	mem 20675MB
[2025-04-03 03:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][146/573]	eta 0:06:24 lr 0.000110	time 0.8776 (0.9002)	loss 0.5214 (0.4926)	grad_norm 2.0718 (2.8247)	mem 20675MB
[2025-04-03 03:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][148/573]	eta 0:06:22 lr 0.000110	time 0.8777 (0.8999)	loss 0.4785 (0.4920)	grad_norm 2.2807 (2.8298)	mem 20675MB
[2025-04-03 03:57:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][150/573]	eta 0:06:20 lr 0.000110	time 0.8773 (0.8996)	loss 0.3531 (0.4906)	grad_norm 2.6728 (2.8264)	mem 20675MB
[2025-04-03 03:57:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][152/573]	eta 0:06:18 lr 0.000110	time 0.8775 (0.8993)	loss 0.5831 (0.4913)	grad_norm 3.7364 (2.8262)	mem 20675MB
[2025-04-03 03:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][154/573]	eta 0:06:16 lr 0.000109	time 0.8837 (0.8991)	loss 0.5406 (0.4910)	grad_norm 3.2645 (2.8292)	mem 20675MB
[2025-04-03 03:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][156/573]	eta 0:06:14 lr 0.000109	time 0.8777 (0.8989)	loss 0.5310 (0.4915)	grad_norm 1.9551 (2.8218)	mem 20675MB
[2025-04-03 03:57:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][158/573]	eta 0:06:12 lr 0.000109	time 0.8808 (0.8986)	loss 0.6131 (0.4922)	grad_norm 2.4361 (2.8168)	mem 20675MB
[2025-04-03 03:57:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][160/573]	eta 0:06:11 lr 0.000109	time 0.8776 (0.8984)	loss 0.4498 (0.4921)	grad_norm 3.9221 (2.8186)	mem 20675MB
[2025-04-03 03:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][162/573]	eta 0:06:09 lr 0.000109	time 0.8776 (0.8981)	loss 0.5065 (0.4923)	grad_norm 2.0007 (2.8158)	mem 20675MB
[2025-04-03 03:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][164/573]	eta 0:06:07 lr 0.000109	time 0.8773 (0.8979)	loss 0.5199 (0.4916)	grad_norm 2.1650 (2.8123)	mem 20675MB
[2025-04-03 03:57:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][166/573]	eta 0:06:05 lr 0.000109	time 0.8831 (0.8978)	loss 0.3206 (0.4910)	grad_norm 6.4679 (2.8325)	mem 20675MB
[2025-04-03 03:57:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][168/573]	eta 0:06:03 lr 0.000109	time 0.8773 (0.8975)	loss 0.3749 (0.4896)	grad_norm 2.1568 (2.8338)	mem 20675MB
[2025-04-03 03:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][170/573]	eta 0:06:01 lr 0.000108	time 0.8878 (0.8974)	loss 0.5728 (0.4906)	grad_norm 3.2972 (2.8366)	mem 20675MB
[2025-04-03 03:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][172/573]	eta 0:05:59 lr 0.000108	time 0.8796 (0.8972)	loss 0.3626 (0.4897)	grad_norm 7.6768 (2.8675)	mem 20675MB
[2025-04-03 03:57:59 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][174/573]	eta 0:05:57 lr 0.000108	time 0.8802 (0.8970)	loss 0.4623 (0.4898)	grad_norm 2.4114 (2.8633)	mem 20675MB
[2025-04-03 03:58:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][176/573]	eta 0:05:56 lr 0.000108	time 0.8776 (0.8968)	loss 0.4956 (0.4903)	grad_norm 2.5091 (2.8581)	mem 20675MB
[2025-04-03 03:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][178/573]	eta 0:05:54 lr 0.000108	time 0.8780 (0.8966)	loss 0.4168 (0.4901)	grad_norm 3.4439 (2.8598)	mem 20675MB
[2025-04-03 03:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][180/573]	eta 0:05:52 lr 0.000108	time 0.8805 (0.8965)	loss 0.4986 (0.4904)	grad_norm 2.0849 (2.8536)	mem 20675MB
[2025-04-03 03:58:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][182/573]	eta 0:05:50 lr 0.000108	time 0.8785 (0.8964)	loss 0.4869 (0.4896)	grad_norm 2.8169 (2.8507)	mem 20675MB
[2025-04-03 03:58:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][184/573]	eta 0:05:48 lr 0.000108	time 0.8778 (0.8962)	loss 0.5661 (0.4895)	grad_norm 2.7427 (2.8556)	mem 20675MB
[2025-04-03 03:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][186/573]	eta 0:05:46 lr 0.000107	time 0.8774 (0.8961)	loss 0.5384 (0.4901)	grad_norm 1.6560 (2.8445)	mem 20675MB
[2025-04-03 03:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][188/573]	eta 0:05:44 lr 0.000107	time 0.8806 (0.8959)	loss 0.4528 (0.4904)	grad_norm 2.4189 (2.8403)	mem 20675MB
[2025-04-03 03:58:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][190/573]	eta 0:05:43 lr 0.000107	time 0.8910 (0.8958)	loss 0.5222 (0.4903)	grad_norm 2.6872 (2.8352)	mem 20675MB
[2025-04-03 03:58:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][192/573]	eta 0:05:41 lr 0.000107	time 0.8784 (0.8957)	loss 0.5147 (0.4902)	grad_norm 2.6382 (2.8302)	mem 20675MB
[2025-04-03 03:58:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][194/573]	eta 0:05:39 lr 0.000107	time 0.8847 (0.8955)	loss 0.5433 (0.4899)	grad_norm 2.4070 (2.8319)	mem 20675MB
[2025-04-03 03:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][196/573]	eta 0:05:37 lr 0.000107	time 0.8862 (0.8954)	loss 0.4775 (0.4893)	grad_norm 2.1623 (2.8363)	mem 20675MB
[2025-04-03 03:58:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][198/573]	eta 0:05:35 lr 0.000107	time 0.8906 (0.8953)	loss 0.5016 (0.4887)	grad_norm 2.7257 (2.8430)	mem 20675MB
[2025-04-03 03:58:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][200/573]	eta 0:05:33 lr 0.000107	time 0.8772 (0.8952)	loss 0.5535 (0.4892)	grad_norm 2.4553 (2.8395)	mem 20675MB
[2025-04-03 03:58:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][202/573]	eta 0:05:32 lr 0.000106	time 0.8776 (0.8951)	loss 0.5537 (0.4897)	grad_norm 2.6004 (2.8337)	mem 20675MB
[2025-04-03 03:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][204/573]	eta 0:05:30 lr 0.000106	time 0.8779 (0.8949)	loss 0.3287 (0.4889)	grad_norm 2.7345 (2.8501)	mem 20675MB
[2025-04-03 03:58:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][206/573]	eta 0:05:28 lr 0.000106	time 0.8792 (0.8948)	loss 0.4246 (0.4886)	grad_norm 2.5008 (2.8479)	mem 20675MB
[2025-04-03 03:58:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][208/573]	eta 0:05:26 lr 0.000106	time 0.8781 (0.8946)	loss 0.4700 (0.4887)	grad_norm 2.2275 (2.8414)	mem 20675MB
[2025-04-03 03:58:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][210/573]	eta 0:05:24 lr 0.000106	time 0.8827 (0.8945)	loss 0.3292 (0.4875)	grad_norm 2.9606 (2.8449)	mem 20675MB
[2025-04-03 03:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][212/573]	eta 0:05:22 lr 0.000106	time 0.8774 (0.8943)	loss 0.3520 (0.4873)	grad_norm 4.4265 (2.8559)	mem 20675MB
[2025-04-03 03:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][214/573]	eta 0:05:21 lr 0.000106	time 0.8829 (0.8943)	loss 0.5680 (0.4878)	grad_norm 2.4175 (2.8526)	mem 20675MB
[2025-04-03 03:58:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][216/573]	eta 0:05:19 lr 0.000105	time 0.8774 (0.8942)	loss 0.5450 (0.4885)	grad_norm 2.0354 (2.8478)	mem 20675MB
[2025-04-03 03:58:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][218/573]	eta 0:05:17 lr 0.000105	time 0.8826 (0.8941)	loss 0.5968 (0.4889)	grad_norm 2.5162 (2.8477)	mem 20675MB
[2025-04-03 03:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][220/573]	eta 0:05:15 lr 0.000105	time 0.8876 (0.8940)	loss 0.4083 (0.4885)	grad_norm 3.9742 (2.8517)	mem 20675MB
[2025-04-03 03:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][222/573]	eta 0:05:13 lr 0.000105	time 0.8774 (0.8938)	loss 0.5364 (0.4887)	grad_norm 2.0834 (2.8488)	mem 20675MB
[2025-04-03 03:58:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][224/573]	eta 0:05:11 lr 0.000105	time 0.8782 (0.8937)	loss 0.3629 (0.4877)	grad_norm 4.0131 (2.8606)	mem 20675MB
[2025-04-03 03:58:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][226/573]	eta 0:05:10 lr 0.000105	time 0.8780 (0.8936)	loss 0.5453 (0.4883)	grad_norm 2.6540 (2.8575)	mem 20675MB
[2025-04-03 03:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][228/573]	eta 0:05:08 lr 0.000105	time 0.8810 (0.8934)	loss 0.5659 (0.4889)	grad_norm 3.0247 (2.8583)	mem 20675MB
[2025-04-03 03:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][230/573]	eta 0:05:06 lr 0.000105	time 0.8773 (0.8933)	loss 0.5364 (0.4892)	grad_norm 2.9517 (2.8570)	mem 20675MB
[2025-04-03 03:58:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][232/573]	eta 0:05:04 lr 0.000104	time 0.8774 (0.8932)	loss 0.5849 (0.4899)	grad_norm 2.8615 (2.8558)	mem 20675MB
[2025-04-03 03:58:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][234/573]	eta 0:05:02 lr 0.000104	time 0.8777 (0.8931)	loss 0.5005 (0.4897)	grad_norm 2.9000 (2.8585)	mem 20675MB
[2025-04-03 03:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][236/573]	eta 0:05:00 lr 0.000104	time 0.8773 (0.8930)	loss 0.5582 (0.4898)	grad_norm 3.0434 (2.8601)	mem 20675MB
[2025-04-03 03:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][238/573]	eta 0:04:59 lr 0.000104	time 0.8777 (0.8928)	loss 0.5477 (0.4904)	grad_norm 2.6035 (2.8573)	mem 20675MB
[2025-04-03 03:58:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][240/573]	eta 0:04:57 lr 0.000104	time 0.8778 (0.8927)	loss 0.4708 (0.4908)	grad_norm 1.7762 (2.8503)	mem 20675MB
[2025-04-03 03:58:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][242/573]	eta 0:04:55 lr 0.000104	time 0.8776 (0.8926)	loss 0.5425 (0.4912)	grad_norm 2.3423 (2.8468)	mem 20675MB
[2025-04-03 03:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][244/573]	eta 0:04:53 lr 0.000104	time 0.9083 (0.8926)	loss 0.4561 (0.4912)	grad_norm 2.0135 (2.8411)	mem 20675MB
[2025-04-03 03:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][246/573]	eta 0:04:51 lr 0.000104	time 0.8777 (0.8925)	loss 0.4999 (0.4919)	grad_norm 1.8574 (2.8404)	mem 20675MB
[2025-04-03 03:59:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][248/573]	eta 0:04:50 lr 0.000103	time 0.8775 (0.8924)	loss 0.3999 (0.4914)	grad_norm 3.3991 (2.8439)	mem 20675MB
[2025-04-03 03:59:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][250/573]	eta 0:04:48 lr 0.000103	time 0.8779 (0.8923)	loss 0.5561 (0.4921)	grad_norm 1.9885 (2.8390)	mem 20675MB
[2025-04-03 03:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][252/573]	eta 0:04:46 lr 0.000103	time 0.8809 (0.8922)	loss 0.4901 (0.4924)	grad_norm 2.3401 (2.8356)	mem 20675MB
[2025-04-03 03:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][254/573]	eta 0:04:44 lr 0.000103	time 0.8773 (0.8921)	loss 0.5244 (0.4923)	grad_norm 3.4680 (2.8350)	mem 20675MB
[2025-04-03 03:59:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][256/573]	eta 0:04:42 lr 0.000103	time 0.8910 (0.8921)	loss 0.5396 (0.4920)	grad_norm 2.3440 (2.8365)	mem 20675MB
[2025-04-03 03:59:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][258/573]	eta 0:04:40 lr 0.000103	time 0.8779 (0.8920)	loss 0.5694 (0.4925)	grad_norm 1.9878 (2.8322)	mem 20675MB
[2025-04-03 03:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][260/573]	eta 0:04:39 lr 0.000103	time 0.8774 (0.8919)	loss 0.3130 (0.4919)	grad_norm 2.6324 (2.8289)	mem 20675MB
[2025-04-03 03:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][262/573]	eta 0:04:37 lr 0.000103	time 0.8799 (0.8918)	loss 0.5421 (0.4919)	grad_norm 1.5764 (2.8244)	mem 20675MB
[2025-04-03 03:59:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][264/573]	eta 0:04:35 lr 0.000102	time 0.9003 (0.8919)	loss 0.5366 (0.4920)	grad_norm 1.9762 (2.8194)	mem 20675MB
[2025-04-03 03:59:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][266/573]	eta 0:04:33 lr 0.000102	time 0.8803 (0.8918)	loss 0.5050 (0.4920)	grad_norm 3.0351 (2.8194)	mem 20675MB
[2025-04-03 03:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][268/573]	eta 0:04:31 lr 0.000102	time 0.8778 (0.8917)	loss 0.4985 (0.4919)	grad_norm 3.9412 (2.8230)	mem 20675MB
[2025-04-03 03:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][270/573]	eta 0:04:30 lr 0.000102	time 0.8775 (0.8916)	loss 0.5549 (0.4920)	grad_norm 2.2350 (2.8184)	mem 20675MB
[2025-04-03 03:59:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][272/573]	eta 0:04:28 lr 0.000102	time 0.8776 (0.8915)	loss 0.5315 (0.4921)	grad_norm 2.5048 (2.8160)	mem 20675MB
[2025-04-03 03:59:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][274/573]	eta 0:04:26 lr 0.000102	time 0.8779 (0.8915)	loss 0.5381 (0.4921)	grad_norm 3.2850 (2.8145)	mem 20675MB
[2025-04-03 03:59:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][276/573]	eta 0:04:24 lr 0.000102	time 0.8778 (0.8914)	loss 0.4476 (0.4923)	grad_norm 2.4354 (2.8125)	mem 20675MB
[2025-04-03 03:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][278/573]	eta 0:04:22 lr 0.000102	time 0.8775 (0.8913)	loss 0.4948 (0.4925)	grad_norm 2.4720 (2.8100)	mem 20675MB
[2025-04-03 03:59:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][280/573]	eta 0:04:21 lr 0.000101	time 0.8781 (0.8912)	loss 0.4643 (0.4922)	grad_norm 2.7366 (2.8138)	mem 20675MB
[2025-04-03 03:59:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][282/573]	eta 0:04:19 lr 0.000101	time 0.8906 (0.8912)	loss 0.5373 (0.4923)	grad_norm 1.8234 (2.8099)	mem 20675MB
[2025-04-03 03:59:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][284/573]	eta 0:04:17 lr 0.000101	time 0.8777 (0.8911)	loss 0.5967 (0.4933)	grad_norm 2.8769 (2.8128)	mem 20675MB
[2025-04-03 03:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][286/573]	eta 0:04:15 lr 0.000101	time 0.8775 (0.8910)	loss 0.4536 (0.4930)	grad_norm 2.9271 (2.8133)	mem 20675MB
[2025-04-03 03:59:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][288/573]	eta 0:04:13 lr 0.000101	time 0.8778 (0.8910)	loss 0.3982 (0.4927)	grad_norm 3.2747 (2.8139)	mem 20675MB
[2025-04-03 03:59:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][290/573]	eta 0:04:12 lr 0.000101	time 0.8777 (0.8909)	loss 0.5285 (0.4927)	grad_norm 1.6896 (2.8096)	mem 20675MB
[2025-04-03 03:59:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][292/573]	eta 0:04:10 lr 0.000101	time 0.8778 (0.8908)	loss 0.5513 (0.4928)	grad_norm 1.9327 (2.8044)	mem 20675MB
[2025-04-03 03:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][294/573]	eta 0:04:08 lr 0.000101	time 0.8777 (0.8907)	loss 0.5425 (0.4934)	grad_norm 2.7271 (2.8043)	mem 20675MB
[2025-04-03 03:59:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][296/573]	eta 0:04:06 lr 0.000100	time 0.8775 (0.8906)	loss 0.5187 (0.4933)	grad_norm 2.6079 (2.8043)	mem 20675MB
[2025-04-03 03:59:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][298/573]	eta 0:04:04 lr 0.000100	time 0.8819 (0.8906)	loss 0.5073 (0.4931)	grad_norm 2.2750 (2.8051)	mem 20675MB
[2025-04-03 03:59:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][300/573]	eta 0:04:03 lr 0.000100	time 0.8776 (0.8905)	loss 0.4416 (0.4930)	grad_norm 1.6210 (2.8004)	mem 20675MB
[2025-04-03 03:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][302/573]	eta 0:04:01 lr 0.000100	time 0.8779 (0.8905)	loss 0.5725 (0.4929)	grad_norm 2.1020 (2.7966)	mem 20675MB
[2025-04-03 03:59:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][304/573]	eta 0:03:59 lr 0.000100	time 0.8778 (0.8904)	loss 0.4522 (0.4924)	grad_norm 2.8948 (2.7973)	mem 20675MB
[2025-04-03 03:59:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][306/573]	eta 0:03:57 lr 0.000100	time 0.8779 (0.8904)	loss 0.4846 (0.4924)	grad_norm 1.9200 (2.7926)	mem 20675MB
[2025-04-03 03:59:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][308/573]	eta 0:03:55 lr 0.000100	time 0.8774 (0.8904)	loss 0.5930 (0.4927)	grad_norm 1.7993 (2.7934)	mem 20675MB
[2025-04-03 03:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][310/573]	eta 0:03:54 lr 0.000100	time 0.8773 (0.8904)	loss 0.5276 (0.4929)	grad_norm 1.2615 (2.7855)	mem 20675MB
[2025-04-03 04:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][312/573]	eta 0:03:52 lr 0.000099	time 0.8775 (0.8903)	loss 0.4424 (0.4928)	grad_norm 2.6417 (2.7863)	mem 20675MB
[2025-04-03 04:00:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][314/573]	eta 0:03:50 lr 0.000099	time 0.8780 (0.8902)	loss 0.5755 (0.4933)	grad_norm 2.4525 (2.7826)	mem 20675MB
[2025-04-03 04:00:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][316/573]	eta 0:03:48 lr 0.000099	time 0.8778 (0.8902)	loss 0.5956 (0.4937)	grad_norm 2.3296 (2.7813)	mem 20675MB
[2025-04-03 04:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][318/573]	eta 0:03:46 lr 0.000099	time 0.8778 (0.8901)	loss 0.4808 (0.4937)	grad_norm 2.5392 (2.7787)	mem 20675MB
[2025-04-03 04:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][320/573]	eta 0:03:45 lr 0.000099	time 0.8779 (0.8900)	loss 0.4538 (0.4938)	grad_norm 2.6606 (2.7770)	mem 20675MB
[2025-04-03 04:00:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][322/573]	eta 0:03:43 lr 0.000099	time 0.8774 (0.8900)	loss 0.5689 (0.4937)	grad_norm 1.9530 (2.7737)	mem 20675MB
[2025-04-03 04:00:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][324/573]	eta 0:03:41 lr 0.000099	time 0.8779 (0.8899)	loss 0.4895 (0.4939)	grad_norm 1.5425 (2.7688)	mem 20675MB
[2025-04-03 04:00:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][326/573]	eta 0:03:39 lr 0.000099	time 0.8778 (0.8898)	loss 0.4628 (0.4940)	grad_norm 1.9485 (2.7646)	mem 20675MB
[2025-04-03 04:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][328/573]	eta 0:03:37 lr 0.000098	time 0.8776 (0.8898)	loss 0.6408 (0.4944)	grad_norm 2.3802 (2.7625)	mem 20675MB
[2025-04-03 04:00:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][330/573]	eta 0:03:36 lr 0.000098	time 0.8778 (0.8897)	loss 0.5972 (0.4950)	grad_norm 2.1406 (2.7585)	mem 20675MB
[2025-04-03 04:00:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][332/573]	eta 0:03:34 lr 0.000098	time 0.8777 (0.8896)	loss 0.4732 (0.4950)	grad_norm 1.5726 (2.7531)	mem 20675MB
[2025-04-03 04:00:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][334/573]	eta 0:03:32 lr 0.000098	time 0.8779 (0.8896)	loss 0.5221 (0.4954)	grad_norm 1.7363 (2.7477)	mem 20675MB
[2025-04-03 04:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][336/573]	eta 0:03:30 lr 0.000098	time 0.8784 (0.8895)	loss 0.4627 (0.4955)	grad_norm 2.3670 (2.7467)	mem 20675MB
[2025-04-03 04:00:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][338/573]	eta 0:03:29 lr 0.000098	time 0.8785 (0.8895)	loss 0.4462 (0.4956)	grad_norm 1.5996 (2.7430)	mem 20675MB
[2025-04-03 04:00:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][340/573]	eta 0:03:27 lr 0.000098	time 0.8797 (0.8894)	loss 0.4419 (0.4953)	grad_norm 2.1154 (2.7410)	mem 20675MB
[2025-04-03 04:00:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][342/573]	eta 0:03:25 lr 0.000098	time 0.8799 (0.8894)	loss 0.5056 (0.4954)	grad_norm 2.3689 (2.7375)	mem 20675MB
[2025-04-03 04:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][344/573]	eta 0:03:23 lr 0.000098	time 0.8780 (0.8893)	loss 0.5062 (0.4954)	grad_norm 2.1716 (2.7344)	mem 20675MB
[2025-04-03 04:00:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][346/573]	eta 0:03:21 lr 0.000097	time 0.8780 (0.8893)	loss 0.5418 (0.4957)	grad_norm 1.7388 (2.7294)	mem 20675MB
[2025-04-03 04:00:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][348/573]	eta 0:03:20 lr 0.000097	time 0.8804 (0.8892)	loss 0.3475 (0.4951)	grad_norm 2.5737 (2.7307)	mem 20675MB
[2025-04-03 04:00:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][350/573]	eta 0:03:18 lr 0.000097	time 0.8780 (0.8892)	loss 0.4661 (0.4950)	grad_norm 2.4463 (2.7272)	mem 20675MB
[2025-04-03 04:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][352/573]	eta 0:03:16 lr 0.000097	time 0.8779 (0.8891)	loss 0.5325 (0.4952)	grad_norm 3.9268 (2.7293)	mem 20675MB
[2025-04-03 04:00:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][354/573]	eta 0:03:14 lr 0.000097	time 0.8780 (0.8891)	loss 0.4265 (0.4949)	grad_norm 4.9482 (2.7354)	mem 20675MB
[2025-04-03 04:00:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][356/573]	eta 0:03:12 lr 0.000097	time 0.8818 (0.8890)	loss 0.4183 (0.4945)	grad_norm 2.7520 (2.7350)	mem 20675MB
[2025-04-03 04:00:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][358/573]	eta 0:03:11 lr 0.000097	time 0.8779 (0.8890)	loss 0.6012 (0.4947)	grad_norm 2.5389 (2.7396)	mem 20675MB
[2025-04-03 04:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][360/573]	eta 0:03:09 lr 0.000097	time 0.8783 (0.8889)	loss 0.5681 (0.4945)	grad_norm 2.7603 (2.7459)	mem 20675MB
[2025-04-03 04:00:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][362/573]	eta 0:03:07 lr 0.000096	time 0.8785 (0.8889)	loss 0.5698 (0.4948)	grad_norm 2.0007 (2.7415)	mem 20675MB
[2025-04-03 04:00:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][364/573]	eta 0:03:05 lr 0.000096	time 0.8778 (0.8888)	loss 0.4508 (0.4944)	grad_norm 2.9442 (2.7419)	mem 20675MB
[2025-04-03 04:00:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][366/573]	eta 0:03:03 lr 0.000096	time 0.8781 (0.8888)	loss 0.5928 (0.4947)	grad_norm 2.8336 (2.7444)	mem 20675MB
[2025-04-03 04:00:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][368/573]	eta 0:03:02 lr 0.000096	time 0.8784 (0.8888)	loss 0.3913 (0.4944)	grad_norm 3.2405 (2.7464)	mem 20675MB
[2025-04-03 04:00:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][370/573]	eta 0:03:00 lr 0.000096	time 0.8780 (0.8887)	loss 0.5955 (0.4947)	grad_norm 2.0532 (2.7435)	mem 20675MB
[2025-04-03 04:00:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][372/573]	eta 0:02:58 lr 0.000096	time 0.8778 (0.8887)	loss 0.6127 (0.4949)	grad_norm 3.4367 (2.7470)	mem 20675MB
[2025-04-03 04:00:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][374/573]	eta 0:02:56 lr 0.000096	time 0.8783 (0.8886)	loss 0.5521 (0.4951)	grad_norm 2.4493 (2.7485)	mem 20675MB
[2025-04-03 04:00:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][376/573]	eta 0:02:55 lr 0.000096	time 0.8786 (0.8886)	loss 0.5873 (0.4952)	grad_norm 2.1829 (2.7487)	mem 20675MB
[2025-04-03 04:00:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][378/573]	eta 0:02:53 lr 0.000095	time 0.8907 (0.8886)	loss 0.4296 (0.4952)	grad_norm 2.6478 (2.7472)	mem 20675MB
[2025-04-03 04:01:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][380/573]	eta 0:02:51 lr 0.000095	time 0.8786 (0.8885)	loss 0.4440 (0.4951)	grad_norm 3.5148 (2.7518)	mem 20675MB
[2025-04-03 04:01:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][382/573]	eta 0:02:49 lr 0.000095	time 0.8779 (0.8885)	loss 0.5712 (0.4955)	grad_norm 2.0719 (2.7484)	mem 20675MB
[2025-04-03 04:01:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][384/573]	eta 0:02:47 lr 0.000095	time 0.8783 (0.8884)	loss 0.5619 (0.4953)	grad_norm 1.5786 (2.7466)	mem 20675MB
[2025-04-03 04:01:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][386/573]	eta 0:02:46 lr 0.000095	time 0.8830 (0.8884)	loss 0.5317 (0.4954)	grad_norm 1.9867 (2.7438)	mem 20675MB
[2025-04-03 04:01:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][388/573]	eta 0:02:44 lr 0.000095	time 0.8814 (0.8884)	loss 0.4839 (0.4956)	grad_norm 2.6643 (2.7420)	mem 20675MB
[2025-04-03 04:01:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][390/573]	eta 0:02:42 lr 0.000095	time 0.8780 (0.8884)	loss 0.5814 (0.4959)	grad_norm 2.8715 (2.7397)	mem 20675MB
[2025-04-03 04:01:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][392/573]	eta 0:02:40 lr 0.000095	time 0.8793 (0.8883)	loss 0.5238 (0.4962)	grad_norm 3.2281 (2.7399)	mem 20675MB
[2025-04-03 04:01:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][394/573]	eta 0:02:39 lr 0.000094	time 0.8793 (0.8883)	loss 0.5120 (0.4964)	grad_norm 2.0968 (2.7363)	mem 20675MB
[2025-04-03 04:01:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][396/573]	eta 0:02:37 lr 0.000094	time 0.8816 (0.8882)	loss 0.4627 (0.4966)	grad_norm 1.6921 (2.7328)	mem 20675MB
[2025-04-03 04:01:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][398/573]	eta 0:02:35 lr 0.000094	time 0.8786 (0.8882)	loss 0.3176 (0.4962)	grad_norm 2.3764 (2.7290)	mem 20675MB
[2025-04-03 04:01:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][400/573]	eta 0:02:33 lr 0.000094	time 0.8778 (0.8882)	loss 0.5474 (0.4963)	grad_norm 1.6051 (2.7256)	mem 20675MB
[2025-04-03 04:01:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][402/573]	eta 0:02:31 lr 0.000094	time 0.8872 (0.8881)	loss 0.5189 (0.4964)	grad_norm 3.0398 (2.7259)	mem 20675MB
[2025-04-03 04:01:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][404/573]	eta 0:02:30 lr 0.000094	time 0.8776 (0.8881)	loss 0.3804 (0.4962)	grad_norm 3.4931 (2.7266)	mem 20675MB
[2025-04-03 04:01:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][406/573]	eta 0:02:28 lr 0.000094	time 0.8780 (0.8881)	loss 0.4437 (0.4962)	grad_norm 2.7498 (2.7247)	mem 20675MB
[2025-04-03 04:01:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][408/573]	eta 0:02:26 lr 0.000094	time 0.8822 (0.8880)	loss 0.3578 (0.4962)	grad_norm 3.2558 (2.7257)	mem 20675MB
[2025-04-03 04:01:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][410/573]	eta 0:02:24 lr 0.000094	time 0.8784 (0.8880)	loss 0.5747 (0.4965)	grad_norm 2.2804 (2.7220)	mem 20675MB
[2025-04-03 04:01:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][412/573]	eta 0:02:22 lr 0.000093	time 0.8869 (0.8880)	loss 0.4517 (0.4966)	grad_norm 2.6833 (2.7243)	mem 20675MB
[2025-04-03 04:01:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][414/573]	eta 0:02:21 lr 0.000093	time 0.8794 (0.8879)	loss 0.5243 (0.4966)	grad_norm 1.9138 (2.7261)	mem 20675MB
[2025-04-03 04:01:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][416/573]	eta 0:02:19 lr 0.000093	time 0.8894 (0.8880)	loss 0.4448 (0.4963)	grad_norm 2.1694 (2.7252)	mem 20675MB
[2025-04-03 04:01:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][418/573]	eta 0:02:17 lr 0.000093	time 0.8779 (0.8879)	loss 0.3265 (0.4959)	grad_norm 3.6075 (2.7269)	mem 20675MB
[2025-04-03 04:01:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][420/573]	eta 0:02:15 lr 0.000093	time 0.8775 (0.8879)	loss 0.3801 (0.4958)	grad_norm 3.2301 (2.7259)	mem 20675MB
[2025-04-03 04:01:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][422/573]	eta 0:02:14 lr 0.000093	time 0.8814 (0.8879)	loss 0.4545 (0.4957)	grad_norm 3.2223 (2.7244)	mem 20675MB
[2025-04-03 04:01:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][424/573]	eta 0:02:12 lr 0.000093	time 0.8790 (0.8878)	loss 0.5363 (0.4960)	grad_norm 2.3465 (2.7229)	mem 20675MB
[2025-04-03 04:01:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][426/573]	eta 0:02:10 lr 0.000093	time 0.8778 (0.8878)	loss 0.5404 (0.4958)	grad_norm 2.2562 (2.7228)	mem 20675MB
[2025-04-03 04:01:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][428/573]	eta 0:02:08 lr 0.000092	time 0.8779 (0.8878)	loss 0.3392 (0.4957)	grad_norm 3.1929 (2.7224)	mem 20675MB
[2025-04-03 04:01:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][430/573]	eta 0:02:06 lr 0.000092	time 0.8788 (0.8877)	loss 0.4250 (0.4957)	grad_norm 2.2107 (2.7214)	mem 20675MB
[2025-04-03 04:01:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][432/573]	eta 0:02:05 lr 0.000092	time 0.8787 (0.8877)	loss 0.5408 (0.4961)	grad_norm 2.3330 (2.7190)	mem 20675MB
[2025-04-03 04:01:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][434/573]	eta 0:02:03 lr 0.000092	time 0.8778 (0.8877)	loss 0.5155 (0.4962)	grad_norm 2.0272 (2.7148)	mem 20675MB
[2025-04-03 04:01:49 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][436/573]	eta 0:02:01 lr 0.000092	time 0.8789 (0.8876)	loss 0.4829 (0.4963)	grad_norm 2.1245 (2.7128)	mem 20675MB
[2025-04-03 04:01:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][438/573]	eta 0:01:59 lr 0.000092	time 0.8783 (0.8876)	loss 0.5205 (0.4965)	grad_norm 1.7698 (2.7086)	mem 20675MB
[2025-04-03 04:01:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][440/573]	eta 0:01:58 lr 0.000092	time 0.8789 (0.8876)	loss 0.5836 (0.4968)	grad_norm 2.4054 (2.7081)	mem 20675MB
[2025-04-03 04:01:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][442/573]	eta 0:01:56 lr 0.000092	time 0.8772 (0.8875)	loss 0.4834 (0.4970)	grad_norm 3.3544 (2.7090)	mem 20675MB
[2025-04-03 04:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][444/573]	eta 0:01:54 lr 0.000091	time 0.8784 (0.8875)	loss 0.5247 (0.4969)	grad_norm 2.5241 (2.7110)	mem 20675MB
[2025-04-03 04:01:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][446/573]	eta 0:01:52 lr 0.000091	time 0.8776 (0.8875)	loss 0.5738 (0.4970)	grad_norm 2.1798 (2.7098)	mem 20675MB
[2025-04-03 04:02:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][448/573]	eta 0:01:50 lr 0.000091	time 0.8777 (0.8874)	loss 0.4447 (0.4970)	grad_norm 2.7798 (2.7084)	mem 20675MB
[2025-04-03 04:02:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][450/573]	eta 0:01:49 lr 0.000091	time 0.8778 (0.8874)	loss 0.4394 (0.4966)	grad_norm 2.3621 (2.7073)	mem 20675MB
[2025-04-03 04:02:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][452/573]	eta 0:01:47 lr 0.000091	time 0.8776 (0.8873)	loss 0.3773 (0.4961)	grad_norm 3.6447 (2.7096)	mem 20675MB
[2025-04-03 04:02:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][454/573]	eta 0:01:45 lr 0.000091	time 0.8865 (0.8873)	loss 0.3928 (0.4960)	grad_norm 3.0084 (2.7084)	mem 20675MB
[2025-04-03 04:02:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][456/573]	eta 0:01:43 lr 0.000091	time 0.8785 (0.8873)	loss 0.4607 (0.4955)	grad_norm 3.2486 (2.7097)	mem 20675MB
[2025-04-03 04:02:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][458/573]	eta 0:01:42 lr 0.000091	time 0.8775 (0.8872)	loss 0.4689 (0.4952)	grad_norm 1.9860 (2.7102)	mem 20675MB
[2025-04-03 04:02:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][460/573]	eta 0:01:40 lr 0.000091	time 0.8779 (0.8872)	loss 0.5795 (0.4954)	grad_norm 2.4145 (2.7139)	mem 20675MB
[2025-04-03 04:02:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][462/573]	eta 0:01:38 lr 0.000090	time 0.8840 (0.8872)	loss 0.4938 (0.4952)	grad_norm 3.9533 (2.7201)	mem 20675MB
[2025-04-03 04:02:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][464/573]	eta 0:01:36 lr 0.000090	time 0.8803 (0.8872)	loss 0.5078 (0.4952)	grad_norm 3.4865 (2.7208)	mem 20675MB
[2025-04-03 04:02:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][466/573]	eta 0:01:34 lr 0.000090	time 0.8779 (0.8871)	loss 0.5893 (0.4952)	grad_norm 2.5291 (2.7256)	mem 20675MB
[2025-04-03 04:02:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][468/573]	eta 0:01:33 lr 0.000090	time 0.8782 (0.8871)	loss 0.5375 (0.4951)	grad_norm 2.2679 (2.7256)	mem 20675MB
[2025-04-03 04:02:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][470/573]	eta 0:01:31 lr 0.000090	time 0.8845 (0.8871)	loss 0.4750 (0.4952)	grad_norm 4.0577 (2.7270)	mem 20675MB
[2025-04-03 04:02:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][472/573]	eta 0:01:29 lr 0.000090	time 0.8793 (0.8871)	loss 0.5517 (0.4954)	grad_norm 2.3043 (2.7256)	mem 20675MB
[2025-04-03 04:02:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][474/573]	eta 0:01:27 lr 0.000090	time 0.8778 (0.8870)	loss 0.3481 (0.4947)	grad_norm 2.8798 (2.7282)	mem 20675MB
[2025-04-03 04:02:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][476/573]	eta 0:01:26 lr 0.000090	time 0.8777 (0.8870)	loss 0.4815 (0.4946)	grad_norm 1.9279 (2.7262)	mem 20675MB
[2025-04-03 04:02:26 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][478/573]	eta 0:01:24 lr 0.000089	time 0.8800 (0.8870)	loss 0.5622 (0.4945)	grad_norm 3.0333 (2.7337)	mem 20675MB
[2025-04-03 04:02:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][480/573]	eta 0:01:22 lr 0.000089	time 0.8787 (0.8869)	loss 0.4510 (0.4945)	grad_norm 2.0877 (2.7305)	mem 20675MB
[2025-04-03 04:02:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][482/573]	eta 0:01:20 lr 0.000089	time 0.8772 (0.8869)	loss 0.4712 (0.4942)	grad_norm 2.0365 (2.7323)	mem 20675MB
[2025-04-03 04:02:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][484/573]	eta 0:01:18 lr 0.000089	time 0.8781 (0.8869)	loss 0.5268 (0.4945)	grad_norm 2.4991 (2.7329)	mem 20675MB
[2025-04-03 04:02:33 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][486/573]	eta 0:01:17 lr 0.000089	time 0.8864 (0.8869)	loss 0.5356 (0.4947)	grad_norm 2.7811 (2.7326)	mem 20675MB
[2025-04-03 04:02:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][488/573]	eta 0:01:15 lr 0.000089	time 0.8776 (0.8868)	loss 0.5792 (0.4948)	grad_norm 3.8223 (2.7353)	mem 20675MB
[2025-04-03 04:02:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][490/573]	eta 0:01:13 lr 0.000089	time 0.8782 (0.8868)	loss 0.5354 (0.4951)	grad_norm 1.9079 (2.7324)	mem 20675MB
[2025-04-03 04:02:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][492/573]	eta 0:01:11 lr 0.000089	time 0.8774 (0.8868)	loss 0.5086 (0.4950)	grad_norm 2.3565 (2.7324)	mem 20675MB
[2025-04-03 04:02:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][494/573]	eta 0:01:10 lr 0.000089	time 0.8786 (0.8868)	loss 0.4176 (0.4949)	grad_norm 1.7974 (2.7299)	mem 20675MB
[2025-04-03 04:02:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][496/573]	eta 0:01:08 lr 0.000088	time 0.8780 (0.8867)	loss 0.6045 (0.4950)	grad_norm 2.9042 (2.7294)	mem 20675MB
[2025-04-03 04:02:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][498/573]	eta 0:01:06 lr 0.000088	time 0.8776 (0.8867)	loss 0.5194 (0.4950)	grad_norm 2.6638 (2.7298)	mem 20675MB
[2025-04-03 04:02:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][500/573]	eta 0:01:04 lr 0.000088	time 0.8775 (0.8867)	loss 0.4713 (0.4951)	grad_norm 2.0390 (2.7287)	mem 20675MB
[2025-04-03 04:02:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][502/573]	eta 0:01:02 lr 0.000088	time 0.8772 (0.8866)	loss 0.4794 (0.4947)	grad_norm 2.2654 (2.7287)	mem 20675MB
[2025-04-03 04:02:49 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][504/573]	eta 0:01:01 lr 0.000088	time 0.8799 (0.8866)	loss 0.4677 (0.4946)	grad_norm 3.7829 (2.7304)	mem 20675MB
[2025-04-03 04:02:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][506/573]	eta 0:00:59 lr 0.000088	time 0.8966 (0.8866)	loss 0.5430 (0.4947)	grad_norm 2.0991 (2.7293)	mem 20675MB
[2025-04-03 04:02:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][508/573]	eta 0:00:57 lr 0.000088	time 0.8819 (0.8866)	loss 0.6058 (0.4949)	grad_norm 1.7879 (2.7289)	mem 20675MB
[2025-04-03 04:02:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][510/573]	eta 0:00:55 lr 0.000088	time 0.8780 (0.8866)	loss 0.3693 (0.4947)	grad_norm 2.9516 (2.7283)	mem 20675MB
[2025-04-03 04:02:56 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][512/573]	eta 0:00:54 lr 0.000087	time 0.8900 (0.8866)	loss 0.4094 (0.4947)	grad_norm 3.8805 (2.7299)	mem 20675MB
[2025-04-03 04:02:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][514/573]	eta 0:00:52 lr 0.000087	time 0.8806 (0.8865)	loss 0.4947 (0.4944)	grad_norm 3.4346 (2.7318)	mem 20675MB
[2025-04-03 04:03:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][516/573]	eta 0:00:50 lr 0.000087	time 0.8777 (0.8865)	loss 0.4356 (0.4942)	grad_norm 2.7970 (2.7338)	mem 20675MB
[2025-04-03 04:03:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][518/573]	eta 0:00:48 lr 0.000087	time 0.8782 (0.8865)	loss 0.5735 (0.4946)	grad_norm 1.9753 (2.7324)	mem 20675MB
[2025-04-03 04:03:03 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][520/573]	eta 0:00:46 lr 0.000087	time 0.8796 (0.8865)	loss 0.4342 (0.4947)	grad_norm 3.3224 (2.7321)	mem 20675MB
[2025-04-03 04:03:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][522/573]	eta 0:00:45 lr 0.000087	time 0.8777 (0.8864)	loss 0.5322 (0.4949)	grad_norm 1.6675 (2.7296)	mem 20675MB
[2025-04-03 04:03:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][524/573]	eta 0:00:43 lr 0.000087	time 0.8778 (0.8864)	loss 0.5430 (0.4949)	grad_norm 2.5552 (2.7289)	mem 20675MB
[2025-04-03 04:03:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][526/573]	eta 0:00:41 lr 0.000087	time 0.8999 (0.8864)	loss 0.5420 (0.4948)	grad_norm 2.2156 (2.7297)	mem 20675MB
[2025-04-03 04:03:10 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][528/573]	eta 0:00:39 lr 0.000087	time 0.8781 (0.8864)	loss 0.4044 (0.4945)	grad_norm 5.6906 (2.7352)	mem 20675MB
[2025-04-03 04:03:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][530/573]	eta 0:00:38 lr 0.000086	time 0.8789 (0.8864)	loss 0.5616 (0.4948)	grad_norm 1.7951 (2.7333)	mem 20675MB
[2025-04-03 04:03:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][532/573]	eta 0:00:36 lr 0.000086	time 0.8774 (0.8864)	loss 0.4408 (0.4946)	grad_norm 1.7797 (2.7310)	mem 20675MB
[2025-04-03 04:03:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][534/573]	eta 0:00:34 lr 0.000086	time 0.8778 (0.8863)	loss 0.5823 (0.4947)	grad_norm 2.4001 (2.7288)	mem 20675MB
[2025-04-03 04:03:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][536/573]	eta 0:00:32 lr 0.000086	time 0.8782 (0.8863)	loss 0.4243 (0.4947)	grad_norm 2.8371 (2.7280)	mem 20675MB
[2025-04-03 04:03:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][538/573]	eta 0:00:31 lr 0.000086	time 0.8779 (0.8863)	loss 0.5516 (0.4948)	grad_norm 1.7629 (2.7254)	mem 20675MB
[2025-04-03 04:03:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][540/573]	eta 0:00:29 lr 0.000086	time 0.8802 (0.8863)	loss 0.5544 (0.4950)	grad_norm 2.0725 (2.7263)	mem 20675MB
[2025-04-03 04:03:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][542/573]	eta 0:00:27 lr 0.000086	time 0.8862 (0.8862)	loss 0.4016 (0.4946)	grad_norm 2.7091 (2.7271)	mem 20675MB
[2025-04-03 04:03:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][544/573]	eta 0:00:25 lr 0.000086	time 0.8782 (0.8862)	loss 0.5391 (0.4946)	grad_norm 2.0534 (2.7254)	mem 20675MB
[2025-04-03 04:03:26 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][546/573]	eta 0:00:23 lr 0.000086	time 0.8836 (0.8862)	loss 0.4380 (0.4947)	grad_norm 3.1112 (2.7253)	mem 20675MB
[2025-04-03 04:03:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][548/573]	eta 0:00:22 lr 0.000085	time 0.8777 (0.8862)	loss 0.5328 (0.4948)	grad_norm 2.6069 (2.7264)	mem 20675MB
[2025-04-03 04:03:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][550/573]	eta 0:00:20 lr 0.000085	time 0.8775 (0.8862)	loss 0.4604 (0.4947)	grad_norm 2.6967 (2.7277)	mem 20675MB
[2025-04-03 04:03:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][552/573]	eta 0:00:18 lr 0.000085	time 0.8776 (0.8861)	loss 0.4813 (0.4947)	grad_norm 2.9009 (2.7267)	mem 20675MB
[2025-04-03 04:03:33 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][554/573]	eta 0:00:16 lr 0.000085	time 0.8780 (0.8861)	loss 0.4429 (0.4945)	grad_norm 2.2454 (2.7266)	mem 20675MB
[2025-04-03 04:03:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][556/573]	eta 0:00:15 lr 0.000085	time 0.8795 (0.8861)	loss 0.5284 (0.4943)	grad_norm 2.5548 (2.7277)	mem 20675MB
[2025-04-03 04:03:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][558/573]	eta 0:00:13 lr 0.000085	time 0.8776 (0.8861)	loss 0.3743 (0.4941)	grad_norm 3.1798 (2.7284)	mem 20675MB
[2025-04-03 04:03:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][560/573]	eta 0:00:11 lr 0.000085	time 0.8773 (0.8860)	loss 0.5592 (0.4943)	grad_norm 3.3212 (2.7284)	mem 20675MB
[2025-04-03 04:03:40 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][562/573]	eta 0:00:09 lr 0.000085	time 0.8773 (0.8860)	loss 0.3871 (0.4940)	grad_norm 3.4613 (2.7297)	mem 20675MB
[2025-04-03 04:03:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][564/573]	eta 0:00:07 lr 0.000084	time 0.8774 (0.8860)	loss 0.5018 (0.4940)	grad_norm 1.8901 (2.7268)	mem 20675MB
[2025-04-03 04:03:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][566/573]	eta 0:00:06 lr 0.000084	time 0.8775 (0.8860)	loss 0.5275 (0.4941)	grad_norm 2.1749 (2.7251)	mem 20675MB
[2025-04-03 04:03:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][568/573]	eta 0:00:04 lr 0.000084	time 0.8774 (0.8859)	loss 0.5699 (0.4943)	grad_norm 2.6291 (2.7249)	mem 20675MB
[2025-04-03 04:03:47 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][570/573]	eta 0:00:02 lr 0.000084	time 0.8873 (0.8859)	loss 0.4273 (0.4942)	grad_norm 3.0111 (2.7245)	mem 20675MB
[2025-04-03 04:03:49 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][572/573]	eta 0:00:00 lr 0.000084	time 0.8808 (0.8859)	loss 0.4682 (0.4942)	grad_norm 2.3188 (2.7260)	mem 20675MB
[2025-04-03 04:03:50 simmim_finetune] (main_finetune.py 260): INFO EPOCH 24 training takes 0:08:28
[2025-04-03 04:03:53 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.044 (3.044)	Loss 0.5269 (0.5269)	Acc@1 71.094 (71.094)	Mem 20675MB
[2025-04-03 04:03:53 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.204)	Loss 0.4693 (0.4827)	Acc@1 75.781 (74.219)	Mem 20675MB
[2025-04-03 04:03:54 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.284 (0.836)	Loss 0.5021 (0.4778)	Acc@1 71.875 (74.219)	Mem 20675MB
[2025-04-03 04:03:54 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.679)	Loss 0.4227 (0.4622)	Acc@1 81.250 (76.339)	Mem 20675MB
[2025-04-03 04:03:55 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.591)	Loss 0.4942 (0.4574)	Acc@1 74.219 (76.997)	Mem 20675MB
[2025-04-03 04:03:56 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.286 (0.535)	Loss 0.4583 (0.4639)	Acc@1 82.812 (77.273)	Mem 20675MB
[2025-04-03 04:03:56 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.497)	Loss 0.4620 (0.4628)	Acc@1 78.906 (77.644)	Mem 20675MB
[2025-04-03 04:03:57 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.469)	Loss 0.4244 (0.4574)	Acc@1 80.469 (78.125)	Mem 20675MB
[2025-04-03 04:03:57 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.175
[2025-04-03 04:03:57 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.2%
[2025-04-03 04:03:57 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
[2025-04-03 04:03:57 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [5.430352294497848e-07, 5.430352294497848e-07, 7.098471022709014e-07, 7.098471022709014e-07, 9.66480752764927e-07, 9.66480752764927e-07, 1.3613017535249661e-06, 1.3613017535249661e-06, 1.9687186777711805e-06, 1.9687186777711805e-06, 2.903206253534587e-06, 2.903206253534587e-06, 4.34087944701675e-06, 4.34087944701675e-06, 6.552684360066231e-06, 6.552684360066231e-06, 9.955461149373127e-06, 9.955461149373127e-06, 1.519050236369143e-05, 1.519050236369143e-05, 2.324441192418112e-05, 2.324441192418112e-05, 3.5635042017242184e-05, 3.5635042017242184e-05, 5.469754985272075e-05, 5.469754985272075e-05, 8.402448498422623e-05, 8.402448498422623e-05]
[2025-04-03 04:04:01 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][0/573]	eta 0:32:43 lr 0.000084	time 3.4260 (3.4260)	loss 0.4790 (0.4790)	grad_norm 2.6452 (2.6452)	mem 20675MB
[2025-04-03 04:04:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][2/573]	eta 0:16:26 lr 0.000084	time 0.8776 (1.7279)	loss 0.4676 (0.5008)	grad_norm 2.9498 (2.7421)	mem 20675MB
[2025-04-03 04:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][4/573]	eta 0:13:10 lr 0.000084	time 0.8774 (1.3888)	loss 0.5355 (0.5017)	grad_norm 2.3593 (2.5886)	mem 20675MB
[2025-04-03 04:04:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][6/573]	eta 0:11:44 lr 0.000084	time 0.8779 (1.2431)	loss 0.5224 (0.5075)	grad_norm 2.2014 (2.5208)	mem 20675MB
[2025-04-03 04:04:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][8/573]	eta 0:10:56 lr 0.000084	time 0.8783 (1.1622)	loss 0.4889 (0.5066)	grad_norm 5.2506 (2.7672)	mem 20675MB
[2025-04-03 04:04:10 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][10/573]	eta 0:10:26 lr 0.000083	time 0.8909 (1.1126)	loss 0.4670 (0.4983)	grad_norm 2.4959 (2.6892)	mem 20675MB
[2025-04-03 04:04:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][12/573]	eta 0:10:04 lr 0.000083	time 0.8773 (1.0774)	loss 0.5221 (0.4894)	grad_norm 2.7664 (2.7341)	mem 20675MB
[2025-04-03 04:04:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][14/573]	eta 0:09:47 lr 0.000083	time 0.8810 (1.0512)	loss 0.5247 (0.4918)	grad_norm 1.5092 (2.6186)	mem 20675MB
[2025-04-03 04:04:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][16/573]	eta 0:09:34 lr 0.000083	time 0.8779 (1.0308)	loss 0.6013 (0.4993)	grad_norm 2.4163 (2.6266)	mem 20675MB
[2025-04-03 04:04:17 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][18/573]	eta 0:09:23 lr 0.000083	time 0.8781 (1.0149)	loss 0.4472 (0.4964)	grad_norm 3.0870 (2.6586)	mem 20675MB
[2025-04-03 04:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][20/573]	eta 0:09:14 lr 0.000083	time 0.8776 (1.0022)	loss 0.4721 (0.4934)	grad_norm 3.3812 (2.6967)	mem 20675MB
[2025-04-03 04:04:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][22/573]	eta 0:09:06 lr 0.000083	time 0.8777 (0.9918)	loss 0.4407 (0.4899)	grad_norm 2.1897 (2.6963)	mem 20675MB
[2025-04-03 04:04:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][24/573]	eta 0:08:59 lr 0.000083	time 0.8776 (0.9827)	loss 0.5871 (0.4977)	grad_norm 2.2166 (2.6798)	mem 20675MB
[2025-04-03 04:04:24 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][26/573]	eta 0:08:53 lr 0.000082	time 0.8776 (0.9750)	loss 0.5585 (0.5040)	grad_norm 2.6234 (2.6745)	mem 20675MB
[2025-04-03 04:04:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][28/573]	eta 0:08:47 lr 0.000082	time 0.8858 (0.9687)	loss 0.2951 (0.4969)	grad_norm 3.1132 (2.6915)	mem 20675MB
[2025-04-03 04:04:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][30/573]	eta 0:08:43 lr 0.000082	time 0.8856 (0.9635)	loss 0.4999 (0.4996)	grad_norm 1.8084 (2.6547)	mem 20675MB
[2025-04-03 04:04:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][32/573]	eta 0:08:38 lr 0.000082	time 0.8777 (0.9584)	loss 0.3937 (0.4936)	grad_norm 3.7755 (2.7104)	mem 20675MB
[2025-04-03 04:04:31 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][34/573]	eta 0:08:34 lr 0.000082	time 0.8786 (0.9539)	loss 0.4964 (0.4950)	grad_norm 2.3786 (2.6965)	mem 20675MB
[2025-04-03 04:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][36/573]	eta 0:08:30 lr 0.000082	time 0.8782 (0.9500)	loss 0.5445 (0.4967)	grad_norm 1.9892 (2.6559)	mem 20675MB
[2025-04-03 04:04:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][38/573]	eta 0:08:26 lr 0.000082	time 0.8806 (0.9465)	loss 0.4179 (0.4958)	grad_norm 2.4843 (2.6532)	mem 20675MB
[2025-04-03 04:04:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][40/573]	eta 0:08:22 lr 0.000082	time 0.8781 (0.9433)	loss 0.3988 (0.4953)	grad_norm 2.8751 (2.6827)	mem 20675MB
[2025-04-03 04:04:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][42/573]	eta 0:08:19 lr 0.000082	time 0.8801 (0.9404)	loss 0.5690 (0.4977)	grad_norm 1.8698 (2.6470)	mem 20675MB
[2025-04-03 04:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][44/573]	eta 0:08:16 lr 0.000081	time 0.8776 (0.9380)	loss 0.4241 (0.4974)	grad_norm 2.9422 (2.6387)	mem 20675MB
[2025-04-03 04:04:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][46/573]	eta 0:08:13 lr 0.000081	time 0.8858 (0.9356)	loss 0.4520 (0.4977)	grad_norm 2.7095 (2.6309)	mem 20675MB
[2025-04-03 04:04:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][48/573]	eta 0:08:10 lr 0.000081	time 0.8786 (0.9334)	loss 0.5687 (0.4979)	grad_norm 1.9899 (2.6230)	mem 20675MB
[2025-04-03 04:04:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][50/573]	eta 0:08:07 lr 0.000081	time 0.8775 (0.9313)	loss 0.5313 (0.4977)	grad_norm 2.0679 (2.6408)	mem 20675MB
[2025-04-03 04:04:47 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][52/573]	eta 0:08:04 lr 0.000081	time 0.8777 (0.9293)	loss 0.4989 (0.4997)	grad_norm 2.1330 (2.6221)	mem 20675MB
[2025-04-03 04:04:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][54/573]	eta 0:08:01 lr 0.000081	time 0.8867 (0.9277)	loss 0.5435 (0.5007)	grad_norm 2.2390 (2.6141)	mem 20675MB
[2025-04-03 04:04:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][56/573]	eta 0:07:58 lr 0.000081	time 0.8777 (0.9261)	loss 0.4597 (0.4994)	grad_norm 3.0802 (2.6404)	mem 20675MB
[2025-04-03 04:04:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][58/573]	eta 0:07:56 lr 0.000081	time 0.8773 (0.9245)	loss 0.4468 (0.4992)	grad_norm 1.7764 (2.6224)	mem 20675MB
[2025-04-03 04:04:54 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][60/573]	eta 0:07:53 lr 0.000081	time 0.8779 (0.9231)	loss 0.3890 (0.4980)	grad_norm 3.1706 (2.6280)	mem 20675MB
[2025-04-03 04:04:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][62/573]	eta 0:07:50 lr 0.000080	time 0.8775 (0.9217)	loss 0.3649 (0.4966)	grad_norm 3.8930 (2.6549)	mem 20675MB
[2025-04-03 04:04:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][64/573]	eta 0:07:48 lr 0.000080	time 0.8997 (0.9207)	loss 0.5093 (0.4989)	grad_norm 2.5058 (2.6527)	mem 20675MB
[2025-04-03 04:04:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][66/573]	eta 0:07:46 lr 0.000080	time 0.8778 (0.9195)	loss 0.5062 (0.4995)	grad_norm 2.8718 (2.6514)	mem 20675MB
[2025-04-03 04:05:01 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][68/573]	eta 0:07:43 lr 0.000080	time 0.8834 (0.9184)	loss 0.5230 (0.4997)	grad_norm 1.9514 (2.6523)	mem 20675MB
[2025-04-03 04:05:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][70/573]	eta 0:07:41 lr 0.000080	time 0.8777 (0.9174)	loss 0.5394 (0.5003)	grad_norm 2.3595 (2.6374)	mem 20675MB
[2025-04-03 04:05:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][72/573]	eta 0:07:39 lr 0.000080	time 0.8777 (0.9163)	loss 0.5535 (0.5008)	grad_norm 2.5485 (2.6343)	mem 20675MB
[2025-04-03 04:05:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][74/573]	eta 0:07:36 lr 0.000080	time 0.8800 (0.9155)	loss 0.5531 (0.5023)	grad_norm 2.9435 (2.6305)	mem 20675MB
[2025-04-03 04:05:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][76/573]	eta 0:07:34 lr 0.000080	time 0.8816 (0.9146)	loss 0.4731 (0.5023)	grad_norm 3.1154 (2.6399)	mem 20675MB
[2025-04-03 04:05:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][78/573]	eta 0:07:32 lr 0.000080	time 0.8776 (0.9139)	loss 0.3828 (0.5006)	grad_norm 3.5932 (2.6487)	mem 20675MB
[2025-04-03 04:05:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][80/573]	eta 0:07:30 lr 0.000079	time 0.8779 (0.9131)	loss 0.5106 (0.5012)	grad_norm 2.4570 (2.6452)	mem 20675MB
[2025-04-03 04:05:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][82/573]	eta 0:07:27 lr 0.000079	time 0.8855 (0.9123)	loss 0.4895 (0.5015)	grad_norm 2.8206 (2.6484)	mem 20675MB
[2025-04-03 04:05:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][84/573]	eta 0:07:25 lr 0.000079	time 0.8776 (0.9116)	loss 0.5262 (0.5020)	grad_norm 2.5498 (2.6430)	mem 20675MB
[2025-04-03 04:05:17 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][86/573]	eta 0:07:23 lr 0.000079	time 0.8776 (0.9108)	loss 0.5723 (0.5040)	grad_norm 1.9629 (2.6300)	mem 20675MB
[2025-04-03 04:05:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][88/573]	eta 0:07:21 lr 0.000079	time 0.8777 (0.9101)	loss 0.3365 (0.5015)	grad_norm 3.3003 (2.6406)	mem 20675MB
[2025-04-03 04:05:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][90/573]	eta 0:07:19 lr 0.000079	time 0.8776 (0.9094)	loss 0.5391 (0.5023)	grad_norm 2.7001 (2.6527)	mem 20675MB
[2025-04-03 04:05:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][92/573]	eta 0:07:17 lr 0.000079	time 0.8782 (0.9087)	loss 0.3156 (0.4991)	grad_norm 2.9700 (2.6584)	mem 20675MB
[2025-04-03 04:05:24 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][94/573]	eta 0:07:14 lr 0.000079	time 0.8778 (0.9081)	loss 0.5179 (0.5000)	grad_norm 2.2966 (2.6489)	mem 20675MB
[2025-04-03 04:05:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][96/573]	eta 0:07:12 lr 0.000079	time 0.8782 (0.9075)	loss 0.5631 (0.5003)	grad_norm 2.2923 (2.6557)	mem 20675MB
[2025-04-03 04:05:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][98/573]	eta 0:07:10 lr 0.000078	time 0.8775 (0.9070)	loss 0.5134 (0.5012)	grad_norm 2.5455 (2.6523)	mem 20675MB
[2025-04-03 04:05:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][100/573]	eta 0:07:08 lr 0.000078	time 0.8777 (0.9065)	loss 0.5056 (0.5010)	grad_norm 2.4206 (2.6473)	mem 20675MB
[2025-04-03 04:05:31 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][102/573]	eta 0:07:06 lr 0.000078	time 0.8799 (0.9060)	loss 0.5475 (0.5022)	grad_norm 2.3374 (2.6439)	mem 20675MB
[2025-04-03 04:05:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][104/573]	eta 0:07:04 lr 0.000078	time 0.8776 (0.9055)	loss 0.4913 (0.5022)	grad_norm 2.9018 (2.6568)	mem 20675MB
[2025-04-03 04:05:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][106/573]	eta 0:07:02 lr 0.000078	time 0.8787 (0.9050)	loss 0.4225 (0.5024)	grad_norm 3.7312 (2.6671)	mem 20675MB
[2025-04-03 04:05:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][108/573]	eta 0:07:00 lr 0.000078	time 0.8779 (0.9046)	loss 0.6007 (0.5031)	grad_norm 2.0068 (2.6586)	mem 20675MB
[2025-04-03 04:05:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][110/573]	eta 0:06:58 lr 0.000078	time 0.8777 (0.9041)	loss 0.5166 (0.5034)	grad_norm 1.9649 (2.6514)	mem 20675MB
[2025-04-03 04:05:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][112/573]	eta 0:06:56 lr 0.000078	time 0.8784 (0.9037)	loss 0.4938 (0.5030)	grad_norm 2.1232 (2.6553)	mem 20675MB
[2025-04-03 04:05:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][114/573]	eta 0:06:54 lr 0.000078	time 0.8776 (0.9033)	loss 0.4008 (0.5023)	grad_norm 4.1571 (2.6617)	mem 20675MB
[2025-04-03 04:05:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][116/573]	eta 0:06:52 lr 0.000077	time 0.8810 (0.9029)	loss 0.4604 (0.5007)	grad_norm 3.1273 (2.6723)	mem 20675MB
[2025-04-03 04:05:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][118/573]	eta 0:06:50 lr 0.000077	time 0.8782 (0.9025)	loss 0.5659 (0.5006)	grad_norm 2.4693 (2.6773)	mem 20675MB
[2025-04-03 04:05:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][120/573]	eta 0:06:48 lr 0.000077	time 0.8778 (0.9022)	loss 0.5206 (0.5008)	grad_norm 2.4218 (2.6706)	mem 20675MB
[2025-04-03 04:05:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][122/573]	eta 0:06:46 lr 0.000077	time 0.8780 (0.9018)	loss 0.4855 (0.5015)	grad_norm 2.3541 (2.6766)	mem 20675MB
[2025-04-03 04:05:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][124/573]	eta 0:06:44 lr 0.000077	time 0.8782 (0.9014)	loss 0.3572 (0.5008)	grad_norm 2.5233 (2.6718)	mem 20675MB
[2025-04-03 04:05:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][126/573]	eta 0:06:42 lr 0.000077	time 0.8781 (0.9011)	loss 0.5099 (0.5013)	grad_norm 2.8533 (2.6720)	mem 20675MB
[2025-04-03 04:05:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][128/573]	eta 0:06:40 lr 0.000077	time 0.8781 (0.9008)	loss 0.4394 (0.5019)	grad_norm 3.1443 (2.6751)	mem 20675MB
[2025-04-03 04:05:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][130/573]	eta 0:06:38 lr 0.000077	time 0.8812 (0.9005)	loss 0.5707 (0.5028)	grad_norm 2.2028 (2.6796)	mem 20675MB
[2025-04-03 04:05:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][132/573]	eta 0:06:36 lr 0.000077	time 0.8829 (0.9002)	loss 0.5053 (0.5037)	grad_norm 1.7766 (2.6686)	mem 20675MB
[2025-04-03 04:05:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][134/573]	eta 0:06:35 lr 0.000076	time 0.8778 (0.8999)	loss 0.3369 (0.5028)	grad_norm 3.4360 (2.6700)	mem 20675MB
[2025-04-03 04:06:01 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][136/573]	eta 0:06:33 lr 0.000076	time 0.8777 (0.8996)	loss 0.4993 (0.5023)	grad_norm 2.2480 (2.6673)	mem 20675MB
[2025-04-03 04:06:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][138/573]	eta 0:06:31 lr 0.000076	time 0.8788 (0.8993)	loss 0.5102 (0.5026)	grad_norm 3.0257 (2.6661)	mem 20675MB
[2025-04-03 04:06:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][140/573]	eta 0:06:29 lr 0.000076	time 0.8795 (0.8990)	loss 0.5177 (0.5036)	grad_norm 2.7639 (2.6688)	mem 20675MB
[2025-04-03 04:06:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][142/573]	eta 0:06:27 lr 0.000076	time 0.8802 (0.8988)	loss 0.4671 (0.5039)	grad_norm 3.8799 (2.6762)	mem 20675MB
[2025-04-03 04:06:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][144/573]	eta 0:06:25 lr 0.000076	time 0.8786 (0.8985)	loss 0.4965 (0.5043)	grad_norm 2.6810 (2.6740)	mem 20675MB
[2025-04-03 04:06:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][146/573]	eta 0:06:23 lr 0.000076	time 0.8781 (0.8983)	loss 0.5828 (0.5051)	grad_norm 2.0442 (2.6665)	mem 20675MB
[2025-04-03 04:06:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][148/573]	eta 0:06:21 lr 0.000076	time 0.8777 (0.8980)	loss 0.6218 (0.5058)	grad_norm 2.4756 (2.6657)	mem 20675MB
[2025-04-03 04:06:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][150/573]	eta 0:06:19 lr 0.000076	time 0.8774 (0.8978)	loss 0.4089 (0.5053)	grad_norm 3.0141 (2.6686)	mem 20675MB
[2025-04-03 04:06:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][152/573]	eta 0:06:17 lr 0.000075	time 0.8783 (0.8976)	loss 0.4829 (0.5043)	grad_norm 2.5549 (2.6692)	mem 20675MB
[2025-04-03 04:06:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][154/573]	eta 0:06:16 lr 0.000075	time 0.8806 (0.8974)	loss 0.5506 (0.5041)	grad_norm 2.4880 (2.6717)	mem 20675MB
[2025-04-03 04:06:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][156/573]	eta 0:06:14 lr 0.000075	time 0.8777 (0.8972)	loss 0.5199 (0.5036)	grad_norm 3.0942 (2.6726)	mem 20675MB
[2025-04-03 04:06:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][158/573]	eta 0:06:12 lr 0.000075	time 0.8775 (0.8970)	loss 0.5260 (0.5035)	grad_norm 2.4321 (2.6674)	mem 20675MB
[2025-04-03 04:06:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][160/573]	eta 0:06:10 lr 0.000075	time 0.8788 (0.8969)	loss 0.5241 (0.5032)	grad_norm 3.8463 (2.6866)	mem 20675MB
[2025-04-03 04:06:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][162/573]	eta 0:06:08 lr 0.000075	time 0.8787 (0.8967)	loss 0.5616 (0.5041)	grad_norm 3.0494 (2.6941)	mem 20675MB
[2025-04-03 04:06:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][164/573]	eta 0:06:06 lr 0.000075	time 0.8801 (0.8965)	loss 0.5190 (0.5047)	grad_norm 1.7180 (2.6838)	mem 20675MB
[2025-04-03 04:06:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][166/573]	eta 0:06:04 lr 0.000075	time 0.8776 (0.8963)	loss 0.5895 (0.5054)	grad_norm 2.7374 (2.6807)	mem 20675MB
[2025-04-03 04:06:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][168/573]	eta 0:06:02 lr 0.000075	time 0.8822 (0.8962)	loss 0.4712 (0.5043)	grad_norm 2.0579 (2.6756)	mem 20675MB
[2025-04-03 04:06:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][170/573]	eta 0:06:01 lr 0.000075	time 0.8777 (0.8960)	loss 0.4682 (0.5038)	grad_norm 3.5228 (2.6774)	mem 20675MB
[2025-04-03 04:06:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][172/573]	eta 0:05:59 lr 0.000074	time 0.8777 (0.8959)	loss 0.4347 (0.5034)	grad_norm 4.3020 (2.6869)	mem 20675MB
[2025-04-03 04:06:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][174/573]	eta 0:05:57 lr 0.000074	time 0.8772 (0.8957)	loss 0.4965 (0.5032)	grad_norm 2.5119 (2.6857)	mem 20675MB
[2025-04-03 04:06:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][176/573]	eta 0:05:55 lr 0.000074	time 0.8779 (0.8955)	loss 0.5745 (0.5041)	grad_norm 2.4717 (2.6848)	mem 20675MB
[2025-04-03 04:06:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][178/573]	eta 0:05:53 lr 0.000074	time 0.8783 (0.8953)	loss 0.5507 (0.5033)	grad_norm 1.8573 (2.6831)	mem 20675MB
[2025-04-03 04:06:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][180/573]	eta 0:05:51 lr 0.000074	time 0.8779 (0.8951)	loss 0.4469 (0.5030)	grad_norm 2.4197 (2.6877)	mem 20675MB
[2025-04-03 04:06:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][182/573]	eta 0:05:49 lr 0.000074	time 0.8780 (0.8950)	loss 0.3866 (0.5028)	grad_norm 3.6810 (2.6951)	mem 20675MB
[2025-04-03 04:06:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][184/573]	eta 0:05:48 lr 0.000074	time 0.8784 (0.8948)	loss 0.5120 (0.5027)	grad_norm 3.1911 (2.6958)	mem 20675MB
[2025-04-03 04:06:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][186/573]	eta 0:05:46 lr 0.000074	time 0.8850 (0.8947)	loss 0.5053 (0.5032)	grad_norm 1.3692 (2.7070)	mem 20675MB
[2025-04-03 04:06:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][188/573]	eta 0:05:44 lr 0.000074	time 0.8828 (0.8946)	loss 0.3845 (0.5026)	grad_norm 2.4682 (2.7099)	mem 20675MB
[2025-04-03 04:06:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][190/573]	eta 0:05:42 lr 0.000073	time 0.8781 (0.8945)	loss 0.4790 (0.5028)	grad_norm 2.2009 (2.7069)	mem 20675MB
[2025-04-03 04:06:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][192/573]	eta 0:05:40 lr 0.000073	time 0.8775 (0.8944)	loss 0.6107 (0.5032)	grad_norm 4.0116 (2.7172)	mem 20675MB
[2025-04-03 04:06:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][194/573]	eta 0:05:38 lr 0.000073	time 0.8773 (0.8943)	loss 0.5078 (0.5036)	grad_norm 2.3985 (2.7133)	mem 20675MB
[2025-04-03 04:06:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][196/573]	eta 0:05:37 lr 0.000073	time 0.8777 (0.8941)	loss 0.4420 (0.5037)	grad_norm 2.4331 (2.7089)	mem 20675MB
[2025-04-03 04:06:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][198/573]	eta 0:05:35 lr 0.000073	time 0.8777 (0.8940)	loss 0.4916 (0.5032)	grad_norm 2.5595 (2.7086)	mem 20675MB
[2025-04-03 04:06:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][200/573]	eta 0:05:33 lr 0.000073	time 0.8775 (0.8939)	loss 0.5193 (0.5027)	grad_norm 2.1777 (2.7112)	mem 20675MB
[2025-04-03 04:06:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][202/573]	eta 0:05:31 lr 0.000073	time 0.8780 (0.8937)	loss 0.5246 (0.5030)	grad_norm 2.1990 (2.7082)	mem 20675MB
[2025-04-03 04:07:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][204/573]	eta 0:05:29 lr 0.000073	time 0.8776 (0.8936)	loss 0.5623 (0.5023)	grad_norm 1.8472 (2.7025)	mem 20675MB
[2025-04-03 04:07:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][206/573]	eta 0:05:27 lr 0.000073	time 0.8795 (0.8935)	loss 0.4588 (0.5021)	grad_norm 2.0167 (2.6953)	mem 20675MB
[2025-04-03 04:07:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][208/573]	eta 0:05:26 lr 0.000072	time 0.8794 (0.8934)	loss 0.5348 (0.5021)	grad_norm 2.2422 (2.6929)	mem 20675MB
[2025-04-03 04:07:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][210/573]	eta 0:05:24 lr 0.000072	time 0.8786 (0.8933)	loss 0.4414 (0.5021)	grad_norm 2.8946 (2.6949)	mem 20675MB
[2025-04-03 04:07:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][212/573]	eta 0:05:22 lr 0.000072	time 0.8787 (0.8931)	loss 0.4501 (0.5020)	grad_norm 3.0417 (2.6944)	mem 20675MB
[2025-04-03 04:07:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][214/573]	eta 0:05:20 lr 0.000072	time 0.8781 (0.8930)	loss 0.4750 (0.5019)	grad_norm 2.2168 (2.6865)	mem 20675MB
[2025-04-03 04:07:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][216/573]	eta 0:05:18 lr 0.000072	time 0.8845 (0.8929)	loss 0.3082 (0.5008)	grad_norm 2.4326 (2.6838)	mem 20675MB
[2025-04-03 04:07:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][218/573]	eta 0:05:16 lr 0.000072	time 0.8804 (0.8929)	loss 0.5160 (0.5012)	grad_norm 3.1734 (2.6863)	mem 20675MB
[2025-04-03 04:07:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][220/573]	eta 0:05:15 lr 0.000072	time 0.8779 (0.8928)	loss 0.4523 (0.5009)	grad_norm 2.8102 (2.6863)	mem 20675MB
[2025-04-03 04:07:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][222/573]	eta 0:05:13 lr 0.000072	time 0.8777 (0.8926)	loss 0.4753 (0.5001)	grad_norm 2.8672 (2.6975)	mem 20675MB
[2025-04-03 04:07:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][224/573]	eta 0:05:11 lr 0.000072	time 0.8779 (0.8925)	loss 0.5136 (0.5001)	grad_norm 1.8936 (2.6980)	mem 20675MB
[2025-04-03 04:07:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][226/573]	eta 0:05:09 lr 0.000072	time 0.8779 (0.8924)	loss 0.6225 (0.5010)	grad_norm 3.2038 (2.6996)	mem 20675MB
[2025-04-03 04:07:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][228/573]	eta 0:05:07 lr 0.000071	time 0.8781 (0.8923)	loss 0.5591 (0.5012)	grad_norm 4.0124 (2.7013)	mem 20675MB
[2025-04-03 04:07:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][230/573]	eta 0:05:06 lr 0.000071	time 0.8779 (0.8922)	loss 0.5061 (0.5012)	grad_norm 8.6268 (2.7254)	mem 20675MB
[2025-04-03 04:07:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][232/573]	eta 0:05:04 lr 0.000071	time 0.8781 (0.8921)	loss 0.5317 (0.5014)	grad_norm 2.1542 (2.7251)	mem 20675MB
[2025-04-03 04:07:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][234/573]	eta 0:05:02 lr 0.000071	time 0.8839 (0.8920)	loss 0.4868 (0.5010)	grad_norm 4.1077 (2.7311)	mem 20675MB
[2025-04-03 04:07:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][236/573]	eta 0:05:00 lr 0.000071	time 0.8777 (0.8919)	loss 0.4942 (0.5011)	grad_norm 2.2099 (2.7280)	mem 20675MB
[2025-04-03 04:07:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][238/573]	eta 0:04:58 lr 0.000071	time 0.8784 (0.8918)	loss 0.5718 (0.5007)	grad_norm 2.2046 (2.7251)	mem 20675MB
[2025-04-03 04:07:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][240/573]	eta 0:04:56 lr 0.000071	time 0.8771 (0.8917)	loss 0.3569 (0.5001)	grad_norm 5.3382 (2.7318)	mem 20675MB
[2025-04-03 04:07:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][242/573]	eta 0:04:55 lr 0.000071	time 0.8794 (0.8916)	loss 0.6138 (0.5000)	grad_norm 2.5222 (2.7300)	mem 20675MB
[2025-04-03 04:07:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][244/573]	eta 0:04:53 lr 0.000071	time 0.8774 (0.8915)	loss 0.5545 (0.4999)	grad_norm 3.5364 (2.7348)	mem 20675MB
[2025-04-03 04:07:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][246/573]	eta 0:04:51 lr 0.000070	time 0.8779 (0.8913)	loss 0.3973 (0.4995)	grad_norm 4.1315 (2.7393)	mem 20675MB
[2025-04-03 04:07:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][248/573]	eta 0:04:49 lr 0.000070	time 0.8780 (0.8913)	loss 0.4813 (0.4997)	grad_norm 2.6170 (2.7364)	mem 20675MB
[2025-04-03 04:07:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][250/573]	eta 0:04:47 lr 0.000070	time 0.8783 (0.8912)	loss 0.4859 (0.5000)	grad_norm 1.5759 (2.7294)	mem 20675MB
[2025-04-03 04:07:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][252/573]	eta 0:04:46 lr 0.000070	time 0.8775 (0.8912)	loss 0.4308 (0.4990)	grad_norm 2.7876 (2.7334)	mem 20675MB
[2025-04-03 04:07:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][254/573]	eta 0:04:44 lr 0.000070	time 0.8798 (0.8911)	loss 0.5017 (0.4991)	grad_norm 2.3487 (2.7288)	mem 20675MB
[2025-04-03 04:07:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][256/573]	eta 0:04:42 lr 0.000070	time 0.8798 (0.8910)	loss 0.5478 (0.4992)	grad_norm 2.1366 (2.7336)	mem 20675MB
[2025-04-03 04:07:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][258/573]	eta 0:04:40 lr 0.000070	time 0.8780 (0.8909)	loss 0.3857 (0.4989)	grad_norm 6.7267 (2.7456)	mem 20675MB
[2025-04-03 04:07:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][260/573]	eta 0:04:38 lr 0.000070	time 0.8813 (0.8908)	loss 0.5291 (0.4987)	grad_norm 2.7366 (2.7469)	mem 20675MB
[2025-04-03 04:07:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][262/573]	eta 0:04:37 lr 0.000070	time 0.9015 (0.8908)	loss 0.4684 (0.4986)	grad_norm 2.1083 (2.7417)	mem 20675MB
[2025-04-03 04:07:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][264/573]	eta 0:04:35 lr 0.000070	time 0.8822 (0.8908)	loss 0.3571 (0.4986)	grad_norm 4.0404 (2.7461)	mem 20675MB
[2025-04-03 04:07:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][266/573]	eta 0:04:33 lr 0.000069	time 0.8847 (0.8908)	loss 0.5832 (0.4992)	grad_norm 1.9756 (2.7409)	mem 20675MB
[2025-04-03 04:07:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][268/573]	eta 0:04:31 lr 0.000069	time 0.8783 (0.8907)	loss 0.4723 (0.4990)	grad_norm 2.3871 (2.7392)	mem 20675MB
[2025-04-03 04:07:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][270/573]	eta 0:04:29 lr 0.000069	time 0.8776 (0.8906)	loss 0.4995 (0.4993)	grad_norm 2.1197 (2.7342)	mem 20675MB
[2025-04-03 04:08:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][272/573]	eta 0:04:28 lr 0.000069	time 0.8773 (0.8906)	loss 0.4984 (0.4987)	grad_norm 2.3301 (2.7362)	mem 20675MB
[2025-04-03 04:08:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][274/573]	eta 0:04:26 lr 0.000069	time 0.8778 (0.8905)	loss 0.5337 (0.4990)	grad_norm 2.2258 (2.7315)	mem 20675MB
[2025-04-03 04:08:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][276/573]	eta 0:04:24 lr 0.000069	time 0.8845 (0.8904)	loss 0.5744 (0.4990)	grad_norm 2.1064 (2.7337)	mem 20675MB
[2025-04-03 04:08:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][278/573]	eta 0:04:22 lr 0.000069	time 0.8778 (0.8903)	loss 0.5244 (0.4991)	grad_norm 4.0752 (2.7365)	mem 20675MB
[2025-04-03 04:08:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][280/573]	eta 0:04:20 lr 0.000069	time 0.8777 (0.8902)	loss 0.5352 (0.4993)	grad_norm 2.7368 (2.7343)	mem 20675MB
[2025-04-03 04:08:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][282/573]	eta 0:04:19 lr 0.000069	time 0.8912 (0.8902)	loss 0.4909 (0.4993)	grad_norm 1.9202 (2.7313)	mem 20675MB
[2025-04-03 04:08:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][284/573]	eta 0:04:17 lr 0.000068	time 0.8775 (0.8901)	loss 0.5429 (0.4995)	grad_norm 2.7409 (2.7301)	mem 20675MB
[2025-04-03 04:08:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][286/573]	eta 0:04:15 lr 0.000068	time 0.8773 (0.8901)	loss 0.5626 (0.4999)	grad_norm 2.9346 (2.7303)	mem 20675MB
[2025-04-03 04:08:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][288/573]	eta 0:04:13 lr 0.000068	time 0.8782 (0.8900)	loss 0.4092 (0.4998)	grad_norm 2.4118 (2.7275)	mem 20675MB
[2025-04-03 04:08:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][290/573]	eta 0:04:11 lr 0.000068	time 0.8790 (0.8899)	loss 0.5361 (0.4999)	grad_norm 2.4958 (2.7282)	mem 20675MB
[2025-04-03 04:08:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][292/573]	eta 0:04:10 lr 0.000068	time 0.8793 (0.8899)	loss 0.5348 (0.5002)	grad_norm 3.0128 (2.7278)	mem 20675MB
[2025-04-03 04:08:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][294/573]	eta 0:04:08 lr 0.000068	time 0.8778 (0.8898)	loss 0.5396 (0.5004)	grad_norm 1.8828 (2.7254)	mem 20675MB
[2025-04-03 04:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][296/573]	eta 0:04:06 lr 0.000068	time 0.8870 (0.8897)	loss 0.3592 (0.4998)	grad_norm 2.9264 (2.7251)	mem 20675MB
[2025-04-03 04:08:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][298/573]	eta 0:04:04 lr 0.000068	time 0.8867 (0.8897)	loss 0.4720 (0.4995)	grad_norm 2.9752 (2.7270)	mem 20675MB
[2025-04-03 04:08:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][300/573]	eta 0:04:02 lr 0.000068	time 0.8782 (0.8896)	loss 0.6097 (0.5001)	grad_norm 2.6805 (2.7258)	mem 20675MB
[2025-04-03 04:08:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][302/573]	eta 0:04:01 lr 0.000068	time 0.8773 (0.8896)	loss 0.5038 (0.4995)	grad_norm 3.9766 (2.7303)	mem 20675MB
[2025-04-03 04:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][304/573]	eta 0:03:59 lr 0.000067	time 0.8780 (0.8895)	loss 0.5603 (0.4998)	grad_norm 1.9896 (2.7248)	mem 20675MB
[2025-04-03 04:08:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][306/573]	eta 0:03:57 lr 0.000067	time 0.8780 (0.8894)	loss 0.4574 (0.4996)	grad_norm 2.3008 (2.7208)	mem 20675MB
[2025-04-03 04:08:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][308/573]	eta 0:03:55 lr 0.000067	time 0.8803 (0.8894)	loss 0.4872 (0.4991)	grad_norm 2.2912 (2.7192)	mem 20675MB
[2025-04-03 04:08:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][310/573]	eta 0:03:53 lr 0.000067	time 0.8776 (0.8893)	loss 0.5271 (0.4991)	grad_norm 2.7896 (2.7186)	mem 20675MB
[2025-04-03 04:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][312/573]	eta 0:03:52 lr 0.000067	time 0.8821 (0.8893)	loss 0.5237 (0.4987)	grad_norm 2.4456 (2.7197)	mem 20675MB
[2025-04-03 04:08:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][314/573]	eta 0:03:50 lr 0.000067	time 0.8774 (0.8893)	loss 0.5062 (0.4984)	grad_norm 4.1939 (2.7275)	mem 20675MB
[2025-04-03 04:08:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][316/573]	eta 0:03:48 lr 0.000067	time 0.8772 (0.8892)	loss 0.4686 (0.4977)	grad_norm 1.8893 (2.7242)	mem 20675MB
[2025-04-03 04:08:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][318/573]	eta 0:03:46 lr 0.000067	time 0.8773 (0.8891)	loss 0.3699 (0.4975)	grad_norm 4.6903 (2.7289)	mem 20675MB
[2025-04-03 04:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][320/573]	eta 0:03:44 lr 0.000067	time 0.8774 (0.8891)	loss 0.4608 (0.4973)	grad_norm 4.6349 (2.7385)	mem 20675MB
[2025-04-03 04:08:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][322/573]	eta 0:03:43 lr 0.000067	time 0.8791 (0.8890)	loss 0.4520 (0.4967)	grad_norm 3.6622 (2.7470)	mem 20675MB
[2025-04-03 04:08:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][324/573]	eta 0:03:41 lr 0.000066	time 0.8780 (0.8890)	loss 0.5674 (0.4969)	grad_norm 3.5401 (2.7470)	mem 20675MB
[2025-04-03 04:08:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][326/573]	eta 0:03:39 lr 0.000066	time 0.8797 (0.8889)	loss 0.4265 (0.4967)	grad_norm 3.3363 (2.7504)	mem 20675MB
[2025-04-03 04:08:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][328/573]	eta 0:03:37 lr 0.000066	time 0.8779 (0.8889)	loss 0.5566 (0.4971)	grad_norm 2.0116 (2.7471)	mem 20675MB
[2025-04-03 04:08:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][330/573]	eta 0:03:35 lr 0.000066	time 0.8825 (0.8888)	loss 0.4901 (0.4970)	grad_norm 2.0127 (2.7436)	mem 20675MB
[2025-04-03 04:08:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][332/573]	eta 0:03:34 lr 0.000066	time 0.8993 (0.8889)	loss 0.5377 (0.4971)	grad_norm 2.1751 (2.7410)	mem 20675MB
[2025-04-03 04:08:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][334/573]	eta 0:03:32 lr 0.000066	time 0.8778 (0.8888)	loss 0.4587 (0.4971)	grad_norm 2.9557 (2.7410)	mem 20675MB
[2025-04-03 04:08:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][336/573]	eta 0:03:30 lr 0.000066	time 0.8804 (0.8887)	loss 0.3573 (0.4966)	grad_norm 2.1770 (2.7373)	mem 20675MB
[2025-04-03 04:08:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][338/573]	eta 0:03:28 lr 0.000066	time 0.8790 (0.8887)	loss 0.5437 (0.4963)	grad_norm 2.8655 (2.7387)	mem 20675MB
[2025-04-03 04:09:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][340/573]	eta 0:03:27 lr 0.000066	time 0.8775 (0.8886)	loss 0.3818 (0.4961)	grad_norm 4.1139 (2.7470)	mem 20675MB
[2025-04-03 04:09:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][342/573]	eta 0:03:25 lr 0.000066	time 0.8788 (0.8886)	loss 0.4891 (0.4961)	grad_norm 4.1492 (2.7542)	mem 20675MB
[2025-04-03 04:09:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][344/573]	eta 0:03:23 lr 0.000065	time 0.8773 (0.8886)	loss 0.3448 (0.4957)	grad_norm 4.1446 (2.7581)	mem 20675MB
[2025-04-03 04:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][346/573]	eta 0:03:21 lr 0.000065	time 0.8972 (0.8886)	loss 0.5592 (0.4959)	grad_norm 2.9031 (2.7563)	mem 20675MB
[2025-04-03 04:09:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][348/573]	eta 0:03:19 lr 0.000065	time 0.8776 (0.8885)	loss 0.3839 (0.4958)	grad_norm 5.5465 (2.7622)	mem 20675MB
[2025-04-03 04:09:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][350/573]	eta 0:03:18 lr 0.000065	time 0.8773 (0.8885)	loss 0.3614 (0.4955)	grad_norm 3.3226 (2.7632)	mem 20675MB
[2025-04-03 04:09:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][352/573]	eta 0:03:16 lr 0.000065	time 0.8777 (0.8884)	loss 0.4695 (0.4950)	grad_norm 3.8275 (2.7678)	mem 20675MB
[2025-04-03 04:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][354/573]	eta 0:03:14 lr 0.000065	time 0.8798 (0.8884)	loss 0.3673 (0.4946)	grad_norm 3.7592 (2.7692)	mem 20675MB
[2025-04-03 04:09:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][356/573]	eta 0:03:12 lr 0.000065	time 0.8773 (0.8884)	loss 0.5722 (0.4951)	grad_norm 4.1925 (2.7736)	mem 20675MB
[2025-04-03 04:09:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][358/573]	eta 0:03:10 lr 0.000065	time 0.8781 (0.8883)	loss 0.5259 (0.4948)	grad_norm 1.6151 (2.7763)	mem 20675MB
[2025-04-03 04:09:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][360/573]	eta 0:03:09 lr 0.000065	time 0.8784 (0.8883)	loss 0.4859 (0.4949)	grad_norm 2.5452 (2.7751)	mem 20675MB
[2025-04-03 04:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][362/573]	eta 0:03:07 lr 0.000064	time 0.8772 (0.8882)	loss 0.3235 (0.4945)	grad_norm 3.0040 (2.7746)	mem 20675MB
[2025-04-03 04:09:21 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][364/573]	eta 0:03:05 lr 0.000064	time 0.8773 (0.8882)	loss 0.5874 (0.4948)	grad_norm 2.3288 (2.7733)	mem 20675MB
[2025-04-03 04:09:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][366/573]	eta 0:03:03 lr 0.000064	time 0.8777 (0.8881)	loss 0.3776 (0.4943)	grad_norm 4.9086 (2.7794)	mem 20675MB
[2025-04-03 04:09:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][368/573]	eta 0:03:02 lr 0.000064	time 0.8775 (0.8881)	loss 0.2707 (0.4933)	grad_norm 2.2281 (2.7792)	mem 20675MB
[2025-04-03 04:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][370/573]	eta 0:03:00 lr 0.000064	time 0.8778 (0.8880)	loss 0.5037 (0.4930)	grad_norm 3.0187 (2.7822)	mem 20675MB
[2025-04-03 04:09:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][372/573]	eta 0:02:58 lr 0.000064	time 0.8774 (0.8880)	loss 0.3871 (0.4923)	grad_norm 2.7296 (2.7854)	mem 20675MB
[2025-04-03 04:09:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][374/573]	eta 0:02:56 lr 0.000064	time 0.8778 (0.8879)	loss 0.3645 (0.4920)	grad_norm 6.1758 (2.7936)	mem 20675MB
[2025-04-03 04:09:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][376/573]	eta 0:02:54 lr 0.000064	time 0.8777 (0.8879)	loss 0.5933 (0.4925)	grad_norm 2.6619 (2.7909)	mem 20675MB
[2025-04-03 04:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][378/573]	eta 0:02:53 lr 0.000064	time 0.8848 (0.8878)	loss 0.4772 (0.4924)	grad_norm 2.0464 (2.7905)	mem 20675MB
[2025-04-03 04:09:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][380/573]	eta 0:02:51 lr 0.000064	time 0.8777 (0.8878)	loss 0.5252 (0.4926)	grad_norm 3.2694 (2.7897)	mem 20675MB
[2025-04-03 04:09:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][382/573]	eta 0:02:49 lr 0.000063	time 0.8777 (0.8877)	loss 0.5467 (0.4925)	grad_norm 3.7163 (2.7925)	mem 20675MB
[2025-04-03 04:09:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][384/573]	eta 0:02:47 lr 0.000063	time 0.8778 (0.8877)	loss 0.3235 (0.4922)	grad_norm 3.1587 (2.7917)	mem 20675MB
[2025-04-03 04:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][386/573]	eta 0:02:45 lr 0.000063	time 0.8798 (0.8877)	loss 0.4800 (0.4922)	grad_norm 2.4964 (2.7927)	mem 20675MB
[2025-04-03 04:09:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][388/573]	eta 0:02:44 lr 0.000063	time 0.8773 (0.8876)	loss 0.4001 (0.4921)	grad_norm 2.7790 (2.7924)	mem 20675MB
[2025-04-03 04:09:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][390/573]	eta 0:02:42 lr 0.000063	time 0.8779 (0.8876)	loss 0.5356 (0.4923)	grad_norm 2.6438 (2.7911)	mem 20675MB
[2025-04-03 04:09:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][392/573]	eta 0:02:40 lr 0.000063	time 0.8788 (0.8876)	loss 0.5039 (0.4926)	grad_norm 3.2581 (2.7949)	mem 20675MB
[2025-04-03 04:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][394/573]	eta 0:02:38 lr 0.000063	time 0.8856 (0.8876)	loss 0.4116 (0.4920)	grad_norm 4.1046 (2.8021)	mem 20675MB
[2025-04-03 04:09:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][396/573]	eta 0:02:37 lr 0.000063	time 0.8774 (0.8875)	loss 0.3395 (0.4913)	grad_norm 12.8156 (2.8271)	mem 20675MB
[2025-04-03 04:09:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][398/573]	eta 0:02:35 lr 0.000063	time 0.8777 (0.8875)	loss 0.5320 (0.4914)	grad_norm 3.2413 (2.8270)	mem 20675MB
[2025-04-03 04:09:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][400/573]	eta 0:02:33 lr 0.000063	time 0.8773 (0.8875)	loss 0.5312 (0.4917)	grad_norm 3.2545 (2.8263)	mem 20675MB
[2025-04-03 04:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][402/573]	eta 0:02:31 lr 0.000062	time 0.8890 (0.8874)	loss 0.4231 (0.4917)	grad_norm 3.2049 (2.8252)	mem 20675MB
[2025-04-03 04:09:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][404/573]	eta 0:02:29 lr 0.000062	time 0.8825 (0.8874)	loss 0.3918 (0.4914)	grad_norm 2.6706 (2.8225)	mem 20675MB
[2025-04-03 04:09:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][406/573]	eta 0:02:28 lr 0.000062	time 0.8774 (0.8874)	loss 0.5110 (0.4917)	grad_norm 1.4533 (2.8178)	mem 20675MB
[2025-04-03 04:10:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][408/573]	eta 0:02:26 lr 0.000062	time 0.8784 (0.8873)	loss 0.5809 (0.4915)	grad_norm 2.8304 (2.8197)	mem 20675MB
[2025-04-03 04:10:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][410/573]	eta 0:02:24 lr 0.000062	time 0.8776 (0.8873)	loss 0.4497 (0.4914)	grad_norm 3.7179 (2.8207)	mem 20675MB
[2025-04-03 04:10:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][412/573]	eta 0:02:22 lr 0.000062	time 0.8783 (0.8873)	loss 0.6047 (0.4918)	grad_norm 2.3297 (2.8203)	mem 20675MB
[2025-04-03 04:10:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][414/573]	eta 0:02:21 lr 0.000062	time 0.8774 (0.8873)	loss 0.5094 (0.4920)	grad_norm 3.0040 (2.8190)	mem 20675MB
[2025-04-03 04:10:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][416/573]	eta 0:02:19 lr 0.000062	time 0.8788 (0.8873)	loss 0.3812 (0.4916)	grad_norm 2.9829 (2.8180)	mem 20675MB
[2025-04-03 04:10:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][418/573]	eta 0:02:17 lr 0.000062	time 0.8802 (0.8872)	loss 0.4115 (0.4914)	grad_norm 2.7165 (2.8180)	mem 20675MB
[2025-04-03 04:10:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][420/573]	eta 0:02:15 lr 0.000062	time 0.8870 (0.8872)	loss 0.4420 (0.4913)	grad_norm 2.4339 (2.8157)	mem 20675MB
[2025-04-03 04:10:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][422/573]	eta 0:02:13 lr 0.000062	time 0.9149 (0.8873)	loss 0.6171 (0.4915)	grad_norm 2.9010 (2.8168)	mem 20675MB
[2025-04-03 04:10:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][424/573]	eta 0:02:12 lr 0.000061	time 0.8778 (0.8873)	loss 0.4939 (0.4916)	grad_norm 2.1068 (2.8142)	mem 20675MB
[2025-04-03 04:10:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][426/573]	eta 0:02:10 lr 0.000061	time 0.8775 (0.8872)	loss 0.4403 (0.4916)	grad_norm 2.4779 (2.8127)	mem 20675MB
[2025-04-03 04:10:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][428/573]	eta 0:02:08 lr 0.000061	time 0.8787 (0.8872)	loss 0.4116 (0.4910)	grad_norm 2.7275 (2.8109)	mem 20675MB
[2025-04-03 04:10:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][430/573]	eta 0:02:06 lr 0.000061	time 0.8777 (0.8872)	loss 0.5670 (0.4913)	grad_norm 2.4812 (2.8095)	mem 20675MB
[2025-04-03 04:10:21 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][432/573]	eta 0:02:05 lr 0.000061	time 0.8842 (0.8871)	loss 0.5494 (0.4913)	grad_norm 2.9155 (2.8112)	mem 20675MB
[2025-04-03 04:10:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][434/573]	eta 0:02:03 lr 0.000061	time 0.8787 (0.8871)	loss 0.3752 (0.4912)	grad_norm 4.4154 (2.8136)	mem 20675MB
[2025-04-03 04:10:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][436/573]	eta 0:02:01 lr 0.000061	time 0.8785 (0.8871)	loss 0.5257 (0.4910)	grad_norm 2.0475 (2.8119)	mem 20675MB
[2025-04-03 04:10:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][438/573]	eta 0:01:59 lr 0.000061	time 0.8772 (0.8870)	loss 0.3660 (0.4909)	grad_norm 4.0259 (2.8148)	mem 20675MB
[2025-04-03 04:10:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][440/573]	eta 0:01:57 lr 0.000061	time 0.8776 (0.8870)	loss 0.4375 (0.4909)	grad_norm 2.6363 (2.8127)	mem 20675MB
[2025-04-03 04:10:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][442/573]	eta 0:01:56 lr 0.000061	time 0.8811 (0.8870)	loss 0.4733 (0.4907)	grad_norm 2.6374 (2.8131)	mem 20675MB
[2025-04-03 04:10:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][444/573]	eta 0:01:54 lr 0.000060	time 0.8774 (0.8869)	loss 0.4002 (0.4907)	grad_norm 4.3115 (2.8149)	mem 20675MB
[2025-04-03 04:10:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][446/573]	eta 0:01:52 lr 0.000060	time 0.8773 (0.8869)	loss 0.3327 (0.4902)	grad_norm 4.1607 (2.8172)	mem 20675MB
[2025-04-03 04:10:35 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][448/573]	eta 0:01:50 lr 0.000060	time 0.8774 (0.8869)	loss 0.3890 (0.4902)	grad_norm 2.6616 (2.8162)	mem 20675MB
[2025-04-03 04:10:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][450/573]	eta 0:01:49 lr 0.000060	time 0.8775 (0.8869)	loss 0.4042 (0.4901)	grad_norm 4.3086 (2.8201)	mem 20675MB
[2025-04-03 04:10:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][452/573]	eta 0:01:47 lr 0.000060	time 0.8777 (0.8868)	loss 0.4950 (0.4901)	grad_norm 2.7672 (2.8223)	mem 20675MB
[2025-04-03 04:10:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][454/573]	eta 0:01:45 lr 0.000060	time 0.8778 (0.8868)	loss 0.4804 (0.4901)	grad_norm 3.4811 (2.8249)	mem 20675MB
[2025-04-03 04:10:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][456/573]	eta 0:01:43 lr 0.000060	time 0.8867 (0.8868)	loss 0.5448 (0.4899)	grad_norm 2.5302 (2.8298)	mem 20675MB
[2025-04-03 04:10:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][458/573]	eta 0:01:41 lr 0.000060	time 0.8779 (0.8868)	loss 0.4501 (0.4898)	grad_norm 3.1747 (2.8295)	mem 20675MB
[2025-04-03 04:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][460/573]	eta 0:01:40 lr 0.000060	time 0.8774 (0.8867)	loss 0.5046 (0.4898)	grad_norm 2.4533 (2.8282)	mem 20675MB
[2025-04-03 04:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][462/573]	eta 0:01:38 lr 0.000060	time 0.8774 (0.8867)	loss 0.5198 (0.4902)	grad_norm 2.1495 (2.8268)	mem 20675MB
[2025-04-03 04:10:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][464/573]	eta 0:01:36 lr 0.000059	time 0.8795 (0.8867)	loss 0.4729 (0.4903)	grad_norm 2.4482 (2.8240)	mem 20675MB
[2025-04-03 04:10:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][466/573]	eta 0:01:34 lr 0.000059	time 0.8885 (0.8867)	loss 0.5143 (0.4904)	grad_norm 2.3898 (2.8222)	mem 20675MB
[2025-04-03 04:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][468/573]	eta 0:01:33 lr 0.000059	time 0.9058 (0.8867)	loss 0.5776 (0.4906)	grad_norm 2.9954 (2.8212)	mem 20675MB
[2025-04-03 04:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][470/573]	eta 0:01:31 lr 0.000059	time 0.8777 (0.8867)	loss 0.5703 (0.4907)	grad_norm 2.4519 (2.8232)	mem 20675MB
[2025-04-03 04:10:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][472/573]	eta 0:01:29 lr 0.000059	time 0.8778 (0.8866)	loss 0.5157 (0.4908)	grad_norm 2.8179 (2.8228)	mem 20675MB
[2025-04-03 04:10:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][474/573]	eta 0:01:27 lr 0.000059	time 0.9004 (0.8867)	loss 0.4247 (0.4908)	grad_norm 1.9061 (2.8217)	mem 20675MB
[2025-04-03 04:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][476/573]	eta 0:01:26 lr 0.000059	time 0.8776 (0.8867)	loss 0.5009 (0.4908)	grad_norm 2.4816 (2.8181)	mem 20675MB
[2025-04-03 04:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][478/573]	eta 0:01:24 lr 0.000059	time 0.8801 (0.8866)	loss 0.4928 (0.4908)	grad_norm 2.6100 (2.8185)	mem 20675MB
[2025-04-03 04:11:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][480/573]	eta 0:01:22 lr 0.000059	time 0.8818 (0.8866)	loss 0.3851 (0.4906)	grad_norm 2.8246 (2.8180)	mem 20675MB
[2025-04-03 04:11:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][482/573]	eta 0:01:20 lr 0.000059	time 0.8774 (0.8866)	loss 0.5287 (0.4908)	grad_norm 2.6717 (2.8164)	mem 20675MB
[2025-04-03 04:11:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][484/573]	eta 0:01:18 lr 0.000058	time 0.9135 (0.8867)	loss 0.6004 (0.4912)	grad_norm 2.5411 (2.8148)	mem 20675MB
[2025-04-03 04:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][486/573]	eta 0:01:17 lr 0.000058	time 0.8845 (0.8867)	loss 0.3561 (0.4906)	grad_norm 1.8739 (2.8163)	mem 20675MB
[2025-04-03 04:11:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][488/573]	eta 0:01:15 lr 0.000058	time 0.8776 (0.8866)	loss 0.4353 (0.4905)	grad_norm 2.5788 (2.8139)	mem 20675MB
[2025-04-03 04:11:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][490/573]	eta 0:01:13 lr 0.000058	time 0.8776 (0.8866)	loss 0.3853 (0.4903)	grad_norm 2.8167 (2.8138)	mem 20675MB
[2025-04-03 04:11:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][492/573]	eta 0:01:11 lr 0.000058	time 0.8789 (0.8866)	loss 0.3753 (0.4900)	grad_norm 2.2362 (2.8128)	mem 20675MB
[2025-04-03 04:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][494/573]	eta 0:01:10 lr 0.000058	time 0.8778 (0.8866)	loss 0.5543 (0.4901)	grad_norm 2.0976 (2.8107)	mem 20675MB
[2025-04-03 04:11:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][496/573]	eta 0:01:08 lr 0.000058	time 0.8785 (0.8866)	loss 0.3904 (0.4897)	grad_norm 4.8298 (2.8175)	mem 20675MB
[2025-04-03 04:11:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][498/573]	eta 0:01:06 lr 0.000058	time 0.8773 (0.8865)	loss 0.4593 (0.4894)	grad_norm 4.1772 (2.8260)	mem 20675MB
[2025-04-03 04:11:21 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][500/573]	eta 0:01:04 lr 0.000058	time 0.8776 (0.8865)	loss 0.5339 (0.4894)	grad_norm 2.1885 (2.8240)	mem 20675MB
[2025-04-03 04:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][502/573]	eta 0:01:02 lr 0.000058	time 0.8773 (0.8865)	loss 0.4920 (0.4897)	grad_norm 2.5037 (2.8234)	mem 20675MB
[2025-04-03 04:11:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][504/573]	eta 0:01:01 lr 0.000058	time 0.8792 (0.8864)	loss 0.5110 (0.4899)	grad_norm 3.8607 (2.8255)	mem 20675MB
[2025-04-03 04:11:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][506/573]	eta 0:00:59 lr 0.000057	time 0.8791 (0.8864)	loss 0.3816 (0.4896)	grad_norm 2.6246 (2.8261)	mem 20675MB
[2025-04-03 04:11:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][508/573]	eta 0:00:57 lr 0.000057	time 0.8773 (0.8864)	loss 0.4687 (0.4897)	grad_norm 1.8300 (2.8249)	mem 20675MB
[2025-04-03 04:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][510/573]	eta 0:00:55 lr 0.000057	time 0.8987 (0.8864)	loss 0.5383 (0.4900)	grad_norm 2.0601 (2.8223)	mem 20675MB
[2025-04-03 04:11:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][512/573]	eta 0:00:54 lr 0.000057	time 0.8777 (0.8864)	loss 0.4941 (0.4901)	grad_norm 2.4966 (2.8205)	mem 20675MB
[2025-04-03 04:11:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][514/573]	eta 0:00:52 lr 0.000057	time 0.8776 (0.8863)	loss 0.3399 (0.4897)	grad_norm 3.6861 (2.8212)	mem 20675MB
[2025-04-03 04:11:35 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][516/573]	eta 0:00:50 lr 0.000057	time 0.8778 (0.8863)	loss 0.4769 (0.4897)	grad_norm 3.8729 (2.8237)	mem 20675MB
[2025-04-03 04:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][518/573]	eta 0:00:48 lr 0.000057	time 0.8785 (0.8863)	loss 0.5492 (0.4898)	grad_norm 2.8183 (2.8234)	mem 20675MB
[2025-04-03 04:11:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][520/573]	eta 0:00:46 lr 0.000057	time 0.8777 (0.8863)	loss 0.4853 (0.4899)	grad_norm 1.6302 (2.8207)	mem 20675MB
[2025-04-03 04:11:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][522/573]	eta 0:00:45 lr 0.000057	time 0.8774 (0.8862)	loss 0.5547 (0.4901)	grad_norm 3.3721 (2.8228)	mem 20675MB
[2025-04-03 04:11:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][524/573]	eta 0:00:43 lr 0.000057	time 0.8775 (0.8862)	loss 0.4585 (0.4902)	grad_norm 1.7057 (2.8200)	mem 20675MB
[2025-04-03 04:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][526/573]	eta 0:00:41 lr 0.000056	time 0.8788 (0.8862)	loss 0.4347 (0.4898)	grad_norm 4.2241 (2.8255)	mem 20675MB
[2025-04-03 04:11:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][528/573]	eta 0:00:39 lr 0.000056	time 0.8877 (0.8862)	loss 0.4394 (0.4898)	grad_norm 5.1332 (2.8275)	mem 20675MB
[2025-04-03 04:11:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][530/573]	eta 0:00:38 lr 0.000056	time 0.8783 (0.8862)	loss 0.3678 (0.4895)	grad_norm 2.4196 (2.8266)	mem 20675MB
[2025-04-03 04:11:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][532/573]	eta 0:00:36 lr 0.000056	time 0.8774 (0.8862)	loss 0.4392 (0.4894)	grad_norm 2.3065 (2.8245)	mem 20675MB
[2025-04-03 04:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][534/573]	eta 0:00:34 lr 0.000056	time 0.8784 (0.8861)	loss 0.4392 (0.4894)	grad_norm 1.9173 (2.8226)	mem 20675MB
[2025-04-03 04:11:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][536/573]	eta 0:00:32 lr 0.000056	time 0.8802 (0.8861)	loss 0.5206 (0.4895)	grad_norm 2.6386 (2.8215)	mem 20675MB
[2025-04-03 04:11:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][538/573]	eta 0:00:31 lr 0.000056	time 0.8779 (0.8861)	loss 0.3974 (0.4895)	grad_norm 2.7241 (2.8214)	mem 20675MB
[2025-04-03 04:11:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][540/573]	eta 0:00:29 lr 0.000056	time 0.8780 (0.8861)	loss 0.3818 (0.4893)	grad_norm 2.3112 (2.8198)	mem 20675MB
[2025-04-03 04:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][542/573]	eta 0:00:27 lr 0.000056	time 0.8777 (0.8860)	loss 0.6027 (0.4895)	grad_norm 2.5978 (2.8198)	mem 20675MB
[2025-04-03 04:12:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][544/573]	eta 0:00:25 lr 0.000056	time 0.8778 (0.8860)	loss 0.3757 (0.4893)	grad_norm 3.1413 (2.8196)	mem 20675MB
[2025-04-03 04:12:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][546/573]	eta 0:00:23 lr 0.000056	time 0.8776 (0.8860)	loss 0.4298 (0.4892)	grad_norm 2.2137 (2.8207)	mem 20675MB
[2025-04-03 04:12:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][548/573]	eta 0:00:22 lr 0.000055	time 0.8810 (0.8860)	loss 0.5407 (0.4893)	grad_norm 2.2056 (2.8183)	mem 20675MB
[2025-04-03 04:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][550/573]	eta 0:00:20 lr 0.000055	time 0.8847 (0.8860)	loss 0.5803 (0.4895)	grad_norm 2.6214 (2.8162)	mem 20675MB
[2025-04-03 04:12:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][552/573]	eta 0:00:18 lr 0.000055	time 0.8776 (0.8860)	loss 0.4335 (0.4894)	grad_norm 2.3791 (2.8136)	mem 20675MB
[2025-04-03 04:12:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][554/573]	eta 0:00:16 lr 0.000055	time 0.8814 (0.8860)	loss 0.3600 (0.4892)	grad_norm 2.9020 (2.8124)	mem 20675MB
[2025-04-03 04:12:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][556/573]	eta 0:00:15 lr 0.000055	time 0.8776 (0.8859)	loss 0.6175 (0.4892)	grad_norm 2.9217 (2.8135)	mem 20675MB
[2025-04-03 04:12:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][558/573]	eta 0:00:13 lr 0.000055	time 0.8783 (0.8859)	loss 0.5239 (0.4892)	grad_norm 2.6595 (2.8128)	mem 20675MB
[2025-04-03 04:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][560/573]	eta 0:00:11 lr 0.000055	time 0.8780 (0.8859)	loss 0.5182 (0.4892)	grad_norm 2.2246 (2.8112)	mem 20675MB
[2025-04-03 04:12:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][562/573]	eta 0:00:09 lr 0.000055	time 0.8770 (0.8859)	loss 0.4975 (0.4889)	grad_norm 2.9051 (2.8140)	mem 20675MB
[2025-04-03 04:12:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][564/573]	eta 0:00:07 lr 0.000055	time 0.8773 (0.8859)	loss 0.5295 (0.4891)	grad_norm 2.0443 (2.8128)	mem 20675MB
[2025-04-03 04:12:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][566/573]	eta 0:00:06 lr 0.000055	time 0.8775 (0.8859)	loss 0.5680 (0.4889)	grad_norm 2.5076 (2.8140)	mem 20675MB
[2025-04-03 04:12:21 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][568/573]	eta 0:00:04 lr 0.000055	time 0.8778 (0.8858)	loss 0.3648 (0.4889)	grad_norm 3.1943 (2.8161)	mem 20675MB
[2025-04-03 04:12:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][570/573]	eta 0:00:02 lr 0.000054	time 0.8777 (0.8858)	loss 0.3262 (0.4886)	grad_norm 3.3608 (2.8202)	mem 20675MB
[2025-04-03 04:12:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][572/573]	eta 0:00:00 lr 0.000054	time 0.8771 (0.8858)	loss 0.3845 (0.4886)	grad_norm 2.4754 (2.8188)	mem 20675MB
[2025-04-03 04:12:25 simmim_finetune] (main_finetune.py 260): INFO EPOCH 25 training takes 0:08:27
[2025-04-03 04:12:25 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt25.pth saving......
[2025-04-03 04:12:29 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt25.pth saved !!!
[2025-04-03 04:12:33 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.655 (3.655)	Loss 0.5307 (0.5307)	Acc@1 69.531 (69.531)	Mem 20675MB
[2025-04-03 04:12:33 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.408)	Loss 0.4812 (0.4898)	Acc@1 75.000 (73.698)	Mem 20675MB
[2025-04-03 04:12:34 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.958)	Loss 0.5046 (0.4843)	Acc@1 72.656 (74.062)	Mem 20675MB
[2025-04-03 04:12:34 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.766)	Loss 0.4325 (0.4700)	Acc@1 80.469 (76.004)	Mem 20675MB
[2025-04-03 04:12:35 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.289 (0.660)	Loss 0.4803 (0.4619)	Acc@1 75.781 (76.910)	Mem 20675MB
[2025-04-03 04:12:36 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.592)	Loss 0.4543 (0.4663)	Acc@1 82.812 (77.202)	Mem 20675MB
[2025-04-03 04:12:36 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.285 (0.545)	Loss 0.4540 (0.4636)	Acc@1 79.688 (77.584)	Mem 20675MB
[2025-04-03 04:12:37 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.510)	Loss 0.4171 (0.4569)	Acc@1 80.469 (78.125)	Mem 20675MB
[2025-04-03 04:12:37 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.175
[2025-04-03 04:12:37 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.2%
[2025-04-03 04:12:37 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
[2025-04-03 04:12:37 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.3913084335564767e-07, 4.3913084335564767e-07, 5.46794588249499e-07, 5.46794588249499e-07, 7.124311188554243e-07, 7.124311188554243e-07, 9.672565505568475e-07, 9.672565505568475e-07, 1.3592956762513448e-06, 1.3592956762513448e-06, 1.962432792704418e-06, 1.962432792704418e-06, 2.890336048786068e-06, 2.890336048786068e-06, 4.317879519680914e-06, 4.317879519680914e-06, 6.5141002441345246e-06, 6.5141002441345246e-06, 9.89290135867854e-06, 9.89290135867854e-06, 1.5091056919515486e-05, 1.5091056919515486e-05, 2.3088219320803096e-05, 2.3088219320803096e-05, 3.539154609201481e-05, 3.539154609201481e-05, 5.43197411246482e-05, 5.43197411246482e-05]
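The 28 per-group learning rates above reflect the `LAYER_DECAY: 0.65` setting from the config header: with 14 depth groups for ViT-Base (patch embedding, 12 transformer blocks, head), each split into a weight-decay and a no-decay half (hence 28 values, paired), the multiplier for a group at depth `d` is roughly `0.65 ** (13 - d)`. The logged ratios between adjacent groups are ~1.51-1.54 rather than exactly `1/0.65 ≈ 1.538`, most likely because these are the current cosine-scheduled rates and the uniform `MIN_LR: 2.5e-07` floor compresses the smallest groups. A minimal sketch of the scale computation, under those assumptions (function name is illustrative):

```python
def layer_scales(num_blocks: int = 12, decay: float = 0.65) -> list[float]:
    """Layer-wise LR decay multipliers, shallowest group first.

    Groups: patch embedding (depth 0), one per transformer block
    (depths 1..num_blocks), and the head (deepest). The deepest group
    keeps the full base LR (scale 1.0); each shallower group is
    multiplied by a further factor of `decay`.
    """
    depths = num_blocks + 2  # 14 groups for ViT-Base
    return [decay ** (depths - 1 - d) for d in range(depths)]

scales = layer_scales()
# Deepest group (head) has scale 1.0; the block just below it 0.65,
# matching the ~x1.5 steps between adjacent values in the log line above.
```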
[2025-04-03 04:12:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][0/573]	eta 0:37:29 lr 0.000054	time 3.9258 (3.9258)	loss 0.5437 (0.5437)	grad_norm 2.0802 (2.0802)	mem 20675MB
[2025-04-03 04:12:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][2/573]	eta 0:18:06 lr 0.000054	time 0.8777 (1.9019)	loss 0.5569 (0.5593)	grad_norm 2.0073 (2.2580)	mem 20675MB
[2025-04-03 04:12:45 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][4/573]	eta 0:14:10 lr 0.000054	time 0.8776 (1.4946)	loss 0.4569 (0.5113)	grad_norm 3.0661 (2.7566)	mem 20675MB
[2025-04-03 04:12:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][6/573]	eta 0:12:27 lr 0.000054	time 0.8790 (1.3188)	loss 0.5653 (0.5156)	grad_norm 2.3969 (2.6843)	mem 20675MB
[2025-04-03 04:12:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][8/573]	eta 0:11:30 lr 0.000054	time 0.8787 (1.2214)	loss 0.4682 (0.5148)	grad_norm 2.7140 (2.6333)	mem 20675MB
[2025-04-03 04:12:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][10/573]	eta 0:10:53 lr 0.000054	time 0.8978 (1.1609)	loss 0.5321 (0.5186)	grad_norm 2.5523 (2.6011)	mem 20675MB
[2025-04-03 04:12:52 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][12/573]	eta 0:10:26 lr 0.000054	time 0.8782 (1.1176)	loss 0.5541 (0.5161)	grad_norm 1.8321 (2.5750)	mem 20675MB
[2025-04-03 04:12:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][14/573]	eta 0:10:07 lr 0.000054	time 0.8779 (1.0863)	loss 0.5910 (0.5255)	grad_norm 2.5998 (2.5439)	mem 20675MB
[2025-04-03 04:12:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][16/573]	eta 0:09:51 lr 0.000054	time 0.8780 (1.0620)	loss 0.5604 (0.5248)	grad_norm 2.1487 (2.4997)	mem 20675MB
[2025-04-03 04:12:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][18/573]	eta 0:09:38 lr 0.000053	time 0.8775 (1.0427)	loss 0.3641 (0.5147)	grad_norm 3.2777 (2.5523)	mem 20675MB
[2025-04-03 04:12:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][20/573]	eta 0:09:28 lr 0.000053	time 0.8899 (1.0283)	loss 0.5826 (0.5144)	grad_norm 2.2655 (2.5334)	mem 20675MB
[2025-04-03 04:13:01 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][22/573]	eta 0:09:19 lr 0.000053	time 0.8785 (1.0158)	loss 0.5362 (0.5156)	grad_norm 2.1719 (2.5270)	mem 20675MB
[2025-04-03 04:13:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][24/573]	eta 0:09:11 lr 0.000053	time 0.8784 (1.0049)	loss 0.4005 (0.5129)	grad_norm 2.3792 (2.5279)	mem 20675MB
[2025-04-03 04:13:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][26/573]	eta 0:09:04 lr 0.000053	time 0.8777 (0.9956)	loss 0.4911 (0.5132)	grad_norm 2.9348 (2.5536)	mem 20675MB
[2025-04-03 04:13:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][28/573]	eta 0:08:58 lr 0.000053	time 0.8807 (0.9876)	loss 0.5119 (0.5082)	grad_norm 1.7126 (2.5207)	mem 20675MB
[2025-04-03 04:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][30/573]	eta 0:08:52 lr 0.000053	time 0.8778 (0.9806)	loss 0.3547 (0.5019)	grad_norm 3.6798 (2.5632)	mem 20675MB
[2025-04-03 04:13:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][32/573]	eta 0:08:47 lr 0.000053	time 0.8798 (0.9745)	loss 0.3410 (0.4985)	grad_norm 3.4683 (2.5810)	mem 20675MB
[2025-04-03 04:13:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][34/573]	eta 0:08:42 lr 0.000053	time 0.8774 (0.9690)	loss 0.5575 (0.4995)	grad_norm 1.8230 (2.5617)	mem 20675MB
[2025-04-03 04:13:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][36/573]	eta 0:08:37 lr 0.000053	time 0.8801 (0.9642)	loss 0.5417 (0.4954)	grad_norm 2.0448 (2.5266)	mem 20675MB
[2025-04-03 04:13:15 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][38/573]	eta 0:08:33 lr 0.000053	time 0.8778 (0.9600)	loss 0.3942 (0.4944)	grad_norm 2.6663 (2.5251)	mem 20675MB
[2025-04-03 04:13:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][40/573]	eta 0:08:29 lr 0.000052	time 0.8796 (0.9561)	loss 0.5544 (0.4958)	grad_norm 2.1566 (2.5049)	mem 20675MB
[2025-04-03 04:13:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][42/573]	eta 0:08:25 lr 0.000052	time 0.8775 (0.9525)	loss 0.6006 (0.5004)	grad_norm 2.5066 (2.5021)	mem 20675MB
[2025-04-03 04:13:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][44/573]	eta 0:08:22 lr 0.000052	time 0.8774 (0.9493)	loss 0.4471 (0.4960)	grad_norm 3.1745 (2.5288)	mem 20675MB
[2025-04-03 04:13:22 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][46/573]	eta 0:08:18 lr 0.000052	time 0.8921 (0.9466)	loss 0.4410 (0.4953)	grad_norm 3.3591 (2.5541)	mem 20675MB
[2025-04-03 04:13:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][48/573]	eta 0:08:15 lr 0.000052	time 0.8871 (0.9440)	loss 0.5174 (0.4964)	grad_norm 2.6067 (2.5519)	mem 20675MB
[2025-04-03 04:13:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][50/573]	eta 0:08:12 lr 0.000052	time 0.8861 (0.9417)	loss 0.4469 (0.4960)	grad_norm 3.0937 (2.5527)	mem 20675MB
[2025-04-03 04:13:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][52/573]	eta 0:08:09 lr 0.000052	time 0.8783 (0.9397)	loss 0.5378 (0.4975)	grad_norm 2.0633 (2.5287)	mem 20675MB
[2025-04-03 04:13:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][54/573]	eta 0:08:06 lr 0.000052	time 0.8775 (0.9375)	loss 0.3894 (0.4976)	grad_norm 2.9424 (2.5301)	mem 20675MB
[2025-04-03 04:13:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][56/573]	eta 0:08:03 lr 0.000052	time 0.8780 (0.9355)	loss 0.4429 (0.4976)	grad_norm 2.7586 (2.5267)	mem 20675MB
[2025-04-03 04:13:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][58/573]	eta 0:08:00 lr 0.000052	time 0.8779 (0.9335)	loss 0.3316 (0.4963)	grad_norm 2.7027 (2.5273)	mem 20675MB
[2025-04-03 04:13:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][60/573]	eta 0:07:57 lr 0.000052	time 0.8778 (0.9317)	loss 0.5151 (0.4965)	grad_norm 2.1272 (2.5334)	mem 20675MB
[2025-04-03 04:13:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][62/573]	eta 0:07:55 lr 0.000051	time 0.8779 (0.9301)	loss 0.5715 (0.4973)	grad_norm 1.8863 (2.5220)	mem 20675MB
[2025-04-03 04:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][64/573]	eta 0:07:52 lr 0.000051	time 0.8854 (0.9287)	loss 0.5937 (0.4979)	grad_norm 2.1986 (2.5145)	mem 20675MB
[2025-04-03 04:13:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][66/573]	eta 0:07:50 lr 0.000051	time 0.8775 (0.9273)	loss 0.4991 (0.4971)	grad_norm 2.4697 (2.5049)	mem 20675MB
[2025-04-03 04:13:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][68/573]	eta 0:07:47 lr 0.000051	time 0.8779 (0.9258)	loss 0.4424 (0.4957)	grad_norm 2.3939 (2.4967)	mem 20675MB
[2025-04-03 04:13:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][70/573]	eta 0:07:45 lr 0.000051	time 0.8778 (0.9246)	loss 0.4710 (0.4932)	grad_norm 2.8703 (2.5194)	mem 20675MB
[2025-04-03 04:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][72/573]	eta 0:07:42 lr 0.000051	time 0.8794 (0.9234)	loss 0.5510 (0.4946)	grad_norm 2.1254 (2.5606)	mem 20675MB
[2025-04-03 04:13:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][74/573]	eta 0:07:40 lr 0.000051	time 0.8780 (0.9223)	loss 0.4006 (0.4918)	grad_norm 2.5334 (2.5690)	mem 20675MB
[2025-04-03 04:13:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][76/573]	eta 0:07:37 lr 0.000051	time 0.8792 (0.9212)	loss 0.5526 (0.4914)	grad_norm 2.2490 (2.6043)	mem 20675MB
[2025-04-03 04:13:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][78/573]	eta 0:07:35 lr 0.000051	time 0.8780 (0.9201)	loss 0.4599 (0.4928)	grad_norm 1.8290 (2.6032)	mem 20675MB
[2025-04-03 04:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][80/573]	eta 0:07:33 lr 0.000051	time 0.8781 (0.9191)	loss 0.5520 (0.4940)	grad_norm 2.6936 (2.6112)	mem 20675MB
[2025-04-03 04:13:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][82/573]	eta 0:07:30 lr 0.000051	time 0.8777 (0.9181)	loss 0.5151 (0.4925)	grad_norm 2.7572 (2.6124)	mem 20675MB
[2025-04-03 04:13:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][84/573]	eta 0:07:28 lr 0.000050	time 0.8783 (0.9172)	loss 0.4984 (0.4934)	grad_norm 3.5547 (2.6435)	mem 20675MB
[2025-04-03 04:13:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][86/573]	eta 0:07:26 lr 0.000050	time 0.8776 (0.9164)	loss 0.5276 (0.4936)	grad_norm 2.5700 (2.6405)	mem 20675MB
[2025-04-03 04:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][88/573]	eta 0:07:24 lr 0.000050	time 0.8781 (0.9156)	loss 0.3756 (0.4926)	grad_norm 3.9932 (2.6479)	mem 20675MB
[2025-04-03 04:14:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][90/573]	eta 0:07:21 lr 0.000050	time 0.8781 (0.9148)	loss 0.5165 (0.4935)	grad_norm 1.9754 (2.6312)	mem 20675MB
[2025-04-03 04:14:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][92/573]	eta 0:07:19 lr 0.000050	time 0.8838 (0.9143)	loss 0.5615 (0.4924)	grad_norm 2.4989 (2.6365)	mem 20675MB
[2025-04-03 04:14:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][94/573]	eta 0:07:17 lr 0.000050	time 0.8778 (0.9136)	loss 0.5238 (0.4931)	grad_norm 3.2187 (2.6384)	mem 20675MB
[2025-04-03 04:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][96/573]	eta 0:07:15 lr 0.000050	time 0.8786 (0.9129)	loss 0.5131 (0.4943)	grad_norm 1.8969 (2.6400)	mem 20675MB
[2025-04-03 04:14:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][98/573]	eta 0:07:13 lr 0.000050	time 0.8823 (0.9123)	loss 0.5398 (0.4935)	grad_norm 2.2895 (2.6507)	mem 20675MB
[2025-04-03 04:14:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][100/573]	eta 0:07:11 lr 0.000050	time 0.8778 (0.9117)	loss 0.4781 (0.4939)	grad_norm 2.3805 (2.6487)	mem 20675MB
[2025-04-03 04:14:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][102/573]	eta 0:07:09 lr 0.000050	time 0.8781 (0.9110)	loss 0.5532 (0.4954)	grad_norm 2.2214 (2.6417)	mem 20675MB
[2025-04-03 04:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][104/573]	eta 0:07:06 lr 0.000050	time 0.8777 (0.9104)	loss 0.4951 (0.4962)	grad_norm 2.8542 (2.6398)	mem 20675MB
[2025-04-03 04:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][106/573]	eta 0:07:04 lr 0.000049	time 0.8846 (0.9099)	loss 0.6100 (0.4957)	grad_norm 3.0698 (2.6476)	mem 20675MB
[2025-04-03 04:14:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][108/573]	eta 0:07:02 lr 0.000049	time 0.8873 (0.9094)	loss 0.5536 (0.4960)	grad_norm 1.7940 (2.6396)	mem 20675MB
[2025-04-03 04:14:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][110/573]	eta 0:07:00 lr 0.000049	time 0.8778 (0.9090)	loss 0.5237 (0.4955)	grad_norm 2.6996 (2.6411)	mem 20675MB
[2025-04-03 04:14:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][112/573]	eta 0:06:58 lr 0.000049	time 0.8915 (0.9086)	loss 0.4855 (0.4945)	grad_norm 2.3189 (2.6509)	mem 20675MB
[2025-04-03 04:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][114/573]	eta 0:06:56 lr 0.000049	time 0.8776 (0.9080)	loss 0.6084 (0.4957)	grad_norm 2.7388 (2.6449)	mem 20675MB
[2025-04-03 04:14:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][116/573]	eta 0:06:54 lr 0.000049	time 0.8787 (0.9077)	loss 0.3555 (0.4950)	grad_norm 2.7117 (2.6441)	mem 20675MB
[2025-04-03 04:14:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][118/573]	eta 0:06:52 lr 0.000049	time 0.8780 (0.9072)	loss 0.4930 (0.4939)	grad_norm 3.6104 (2.6528)	mem 20675MB
[2025-04-03 04:14:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][120/573]	eta 0:06:50 lr 0.000049	time 0.8774 (0.9068)	loss 0.4085 (0.4925)	grad_norm 2.0481 (2.6500)	mem 20675MB
[2025-04-03 04:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][122/573]	eta 0:06:48 lr 0.000049	time 0.8787 (0.9063)	loss 0.5466 (0.4932)	grad_norm 2.4713 (2.6536)	mem 20675MB
[2025-04-03 04:14:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][124/573]	eta 0:06:46 lr 0.000049	time 0.8798 (0.9060)	loss 0.4750 (0.4934)	grad_norm 3.7241 (2.6632)	mem 20675MB
[2025-04-03 04:14:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][126/573]	eta 0:06:44 lr 0.000049	time 0.8840 (0.9056)	loss 0.4618 (0.4936)	grad_norm 2.4630 (2.6679)	mem 20675MB
[2025-04-03 04:14:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][128/573]	eta 0:06:42 lr 0.000048	time 0.8775 (0.9052)	loss 0.3698 (0.4932)	grad_norm 2.6039 (2.6675)	mem 20675MB
[2025-04-03 04:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][130/573]	eta 0:06:40 lr 0.000048	time 0.8778 (0.9048)	loss 0.3857 (0.4928)	grad_norm 3.7440 (2.6775)	mem 20675MB
[2025-04-03 04:14:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][132/573]	eta 0:06:38 lr 0.000048	time 0.8776 (0.9044)	loss 0.5197 (0.4923)	grad_norm 2.3287 (2.6955)	mem 20675MB
[2025-04-03 04:14:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][134/573]	eta 0:06:36 lr 0.000048	time 0.8818 (0.9041)	loss 0.5094 (0.4930)	grad_norm 2.8244 (2.6965)	mem 20675MB
[2025-04-03 04:14:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][136/573]	eta 0:06:34 lr 0.000048	time 0.8773 (0.9037)	loss 0.5540 (0.4940)	grad_norm 3.4015 (2.7021)	mem 20675MB
[2025-04-03 04:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][138/573]	eta 0:06:32 lr 0.000048	time 0.8787 (0.9034)	loss 0.4390 (0.4938)	grad_norm 3.1538 (2.7016)	mem 20675MB
[2025-04-03 04:14:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][140/573]	eta 0:06:31 lr 0.000048	time 0.8779 (0.9031)	loss 0.3758 (0.4928)	grad_norm 3.4937 (2.7051)	mem 20675MB
[2025-04-03 04:14:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][142/573]	eta 0:06:29 lr 0.000048	time 0.8858 (0.9028)	loss 0.5529 (0.4926)	grad_norm 2.9843 (2.7091)	mem 20675MB
[2025-04-03 04:14:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][144/573]	eta 0:06:27 lr 0.000048	time 0.8773 (0.9026)	loss 0.5600 (0.4937)	grad_norm 2.3271 (2.7032)	mem 20675MB
[2025-04-03 04:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][146/573]	eta 0:06:25 lr 0.000048	time 0.8802 (0.9023)	loss 0.3802 (0.4934)	grad_norm 3.4904 (2.7081)	mem 20675MB
[2025-04-03 04:14:52 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][148/573]	eta 0:06:23 lr 0.000048	time 0.8778 (0.9020)	loss 0.3583 (0.4925)	grad_norm 2.4082 (2.7029)	mem 20675MB
[2025-04-03 04:14:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][150/573]	eta 0:06:21 lr 0.000048	time 0.8779 (0.9017)	loss 0.5106 (0.4928)	grad_norm 2.0617 (2.6963)	mem 20675MB
[2025-04-03 04:14:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][152/573]	eta 0:06:19 lr 0.000047	time 0.8813 (0.9015)	loss 0.3423 (0.4914)	grad_norm 2.4494 (2.7033)	mem 20675MB
[2025-04-03 04:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][154/573]	eta 0:06:17 lr 0.000047	time 0.8802 (0.9012)	loss 0.5749 (0.4918)	grad_norm 2.5582 (2.7051)	mem 20675MB
[2025-04-03 04:14:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][156/573]	eta 0:06:15 lr 0.000047	time 0.8776 (0.9010)	loss 0.5196 (0.4922)	grad_norm 2.6288 (2.7067)	mem 20675MB
[2025-04-03 04:15:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][158/573]	eta 0:06:13 lr 0.000047	time 0.8774 (0.9007)	loss 0.4162 (0.4912)	grad_norm 4.8988 (2.7229)	mem 20675MB
[2025-04-03 04:15:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][160/573]	eta 0:06:11 lr 0.000047	time 0.8842 (0.9006)	loss 0.5957 (0.4917)	grad_norm 2.0287 (2.7154)	mem 20675MB
[2025-04-03 04:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][162/573]	eta 0:06:10 lr 0.000047	time 0.8822 (0.9004)	loss 0.3549 (0.4910)	grad_norm 2.7464 (2.7308)	mem 20675MB
[2025-04-03 04:15:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][164/573]	eta 0:06:08 lr 0.000047	time 0.8806 (0.9001)	loss 0.5145 (0.4902)	grad_norm 2.0740 (2.7248)	mem 20675MB
[2025-04-03 04:15:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][166/573]	eta 0:06:06 lr 0.000047	time 0.8779 (0.8999)	loss 0.3695 (0.4885)	grad_norm 3.5993 (2.7310)	mem 20675MB
[2025-04-03 04:15:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][168/573]	eta 0:06:04 lr 0.000047	time 0.8852 (0.8997)	loss 0.5324 (0.4884)	grad_norm 5.1992 (2.7527)	mem 20675MB
[2025-04-03 04:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][170/573]	eta 0:06:02 lr 0.000047	time 0.8779 (0.8994)	loss 0.5160 (0.4885)	grad_norm 3.4328 (2.7554)	mem 20675MB
[2025-04-03 04:15:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][172/573]	eta 0:06:00 lr 0.000047	time 0.8777 (0.8992)	loss 0.4356 (0.4887)	grad_norm 2.6804 (2.7614)	mem 20675MB
[2025-04-03 04:15:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][174/573]	eta 0:05:58 lr 0.000046	time 0.8779 (0.8989)	loss 0.5317 (0.4884)	grad_norm 5.1427 (2.7864)	mem 20675MB
[2025-04-03 04:15:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][176/573]	eta 0:05:56 lr 0.000046	time 0.8779 (0.8987)	loss 0.6553 (0.4897)	grad_norm 3.4057 (2.7858)	mem 20675MB
[2025-04-03 04:15:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][178/573]	eta 0:05:54 lr 0.000046	time 0.8788 (0.8985)	loss 0.5189 (0.4905)	grad_norm 2.8811 (2.7882)	mem 20675MB
[2025-04-03 04:15:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][180/573]	eta 0:05:53 lr 0.000046	time 0.8792 (0.8983)	loss 0.6120 (0.4910)	grad_norm 2.1968 (2.7834)	mem 20675MB
[2025-04-03 04:15:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][182/573]	eta 0:05:51 lr 0.000046	time 0.8778 (0.8981)	loss 0.5394 (0.4905)	grad_norm 2.7973 (2.7948)	mem 20675MB
[2025-04-03 04:15:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][184/573]	eta 0:05:49 lr 0.000046	time 0.8776 (0.8979)	loss 0.5159 (0.4904)	grad_norm 1.9217 (2.7866)	mem 20675MB
[2025-04-03 04:15:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][186/573]	eta 0:05:47 lr 0.000046	time 0.8779 (0.8977)	loss 0.4986 (0.4907)	grad_norm 3.3844 (2.7867)	mem 20675MB
[2025-04-03 04:15:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][188/573]	eta 0:05:45 lr 0.000046	time 0.8778 (0.8975)	loss 0.3252 (0.4902)	grad_norm 6.1323 (2.8027)	mem 20675MB
[2025-04-03 04:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][190/573]	eta 0:05:43 lr 0.000046	time 0.8781 (0.8973)	loss 0.5696 (0.4909)	grad_norm 1.7966 (2.8002)	mem 20675MB
[2025-04-03 04:15:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][192/573]	eta 0:05:41 lr 0.000046	time 0.8813 (0.8973)	loss 0.5628 (0.4913)	grad_norm 2.1454 (2.7991)	mem 20675MB
[2025-04-03 04:15:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][194/573]	eta 0:05:39 lr 0.000046	time 0.8777 (0.8971)	loss 0.5255 (0.4912)	grad_norm 2.3140 (2.7966)	mem 20675MB
[2025-04-03 04:15:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][196/573]	eta 0:05:38 lr 0.000046	time 0.8782 (0.8969)	loss 0.4609 (0.4913)	grad_norm 2.5333 (2.7903)	mem 20675MB
[2025-04-03 04:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][198/573]	eta 0:05:36 lr 0.000045	time 0.8779 (0.8969)	loss 0.5098 (0.4920)	grad_norm 2.3332 (2.7841)	mem 20675MB
[2025-04-03 04:15:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][200/573]	eta 0:05:34 lr 0.000045	time 0.8822 (0.8968)	loss 0.5249 (0.4916)	grad_norm 1.6749 (2.7844)	mem 20675MB
[2025-04-03 04:15:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][202/573]	eta 0:05:32 lr 0.000045	time 0.8773 (0.8966)	loss 0.5161 (0.4909)	grad_norm 3.4038 (2.7906)	mem 20675MB
[2025-04-03 04:15:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][204/573]	eta 0:05:30 lr 0.000045	time 0.8784 (0.8964)	loss 0.5923 (0.4911)	grad_norm 2.6716 (2.7896)	mem 20675MB
[2025-04-03 04:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][206/573]	eta 0:05:28 lr 0.000045	time 0.8779 (0.8962)	loss 0.3405 (0.4904)	grad_norm 2.4581 (2.7839)	mem 20675MB
[2025-04-03 04:15:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][208/573]	eta 0:05:27 lr 0.000045	time 0.8782 (0.8961)	loss 0.6075 (0.4906)	grad_norm 3.0844 (2.7838)	mem 20675MB
[2025-04-03 04:15:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][210/573]	eta 0:05:25 lr 0.000045	time 0.8799 (0.8959)	loss 0.5254 (0.4912)	grad_norm 3.6898 (2.7881)	mem 20675MB
[2025-04-03 04:15:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][212/573]	eta 0:05:23 lr 0.000045	time 0.8779 (0.8958)	loss 0.5430 (0.4916)	grad_norm 2.7477 (2.7841)	mem 20675MB
[2025-04-03 04:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][214/573]	eta 0:05:21 lr 0.000045	time 0.8776 (0.8956)	loss 0.4721 (0.4918)	grad_norm 3.0391 (2.7846)	mem 20675MB
[2025-04-03 04:15:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][216/573]	eta 0:05:19 lr 0.000045	time 0.8779 (0.8954)	loss 0.5894 (0.4925)	grad_norm 2.1753 (2.7779)	mem 20675MB
[2025-04-03 04:15:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][218/573]	eta 0:05:17 lr 0.000045	time 0.8847 (0.8953)	loss 0.5057 (0.4926)	grad_norm 1.8725 (2.7717)	mem 20675MB
[2025-04-03 04:15:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][220/573]	eta 0:05:16 lr 0.000045	time 0.8783 (0.8952)	loss 0.3993 (0.4921)	grad_norm 3.3896 (2.7749)	mem 20675MB
[2025-04-03 04:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][222/573]	eta 0:05:14 lr 0.000044	time 0.8774 (0.8951)	loss 0.3600 (0.4914)	grad_norm 3.8219 (2.7806)	mem 20675MB
[2025-04-03 04:15:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][224/573]	eta 0:05:12 lr 0.000044	time 0.8835 (0.8949)	loss 0.5225 (0.4918)	grad_norm 2.4954 (2.7754)	mem 20675MB
[2025-04-03 04:16:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][226/573]	eta 0:05:10 lr 0.000044	time 0.8806 (0.8949)	loss 0.4636 (0.4921)	grad_norm 2.8843 (2.7727)	mem 20675MB
[2025-04-03 04:16:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][228/573]	eta 0:05:08 lr 0.000044	time 0.8879 (0.8948)	loss 0.5726 (0.4927)	grad_norm 2.1951 (2.7685)	mem 20675MB
[2025-04-03 04:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][230/573]	eta 0:05:06 lr 0.000044	time 0.8777 (0.8947)	loss 0.4321 (0.4917)	grad_norm 3.5942 (2.7706)	mem 20675MB
[2025-04-03 04:16:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][232/573]	eta 0:05:05 lr 0.000044	time 0.8775 (0.8945)	loss 0.5648 (0.4918)	grad_norm 2.3845 (2.7677)	mem 20675MB
[2025-04-03 04:16:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][234/573]	eta 0:05:03 lr 0.000044	time 0.8844 (0.8944)	loss 0.4659 (0.4910)	grad_norm 2.5957 (2.7670)	mem 20675MB
[2025-04-03 04:16:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][236/573]	eta 0:05:01 lr 0.000044	time 0.8779 (0.8943)	loss 0.5651 (0.4907)	grad_norm 2.7178 (2.7718)	mem 20675MB
[2025-04-03 04:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][238/573]	eta 0:04:59 lr 0.000044	time 0.8778 (0.8942)	loss 0.5526 (0.4910)	grad_norm 2.5148 (2.7665)	mem 20675MB
[2025-04-03 04:16:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][240/573]	eta 0:04:57 lr 0.000044	time 0.8777 (0.8941)	loss 0.5491 (0.4911)	grad_norm 2.6032 (2.7662)	mem 20675MB
[2025-04-03 04:16:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][242/573]	eta 0:04:55 lr 0.000044	time 0.8773 (0.8940)	loss 0.3698 (0.4909)	grad_norm 4.6099 (2.7770)	mem 20675MB
[2025-04-03 04:16:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][244/573]	eta 0:04:54 lr 0.000044	time 0.8772 (0.8938)	loss 0.4735 (0.4912)	grad_norm 3.9503 (2.7816)	mem 20675MB
[2025-04-03 04:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][246/573]	eta 0:04:52 lr 0.000043	time 0.8780 (0.8937)	loss 0.5355 (0.4915)	grad_norm 3.5231 (2.7835)	mem 20675MB
[2025-04-03 04:16:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][248/573]	eta 0:04:50 lr 0.000043	time 0.8788 (0.8936)	loss 0.6051 (0.4923)	grad_norm 3.2736 (2.7830)	mem 20675MB
[2025-04-03 04:16:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][250/573]	eta 0:04:48 lr 0.000043	time 0.8779 (0.8935)	loss 0.4442 (0.4921)	grad_norm 2.1302 (2.7795)	mem 20675MB
[2025-04-03 04:16:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][252/573]	eta 0:04:46 lr 0.000043	time 0.8778 (0.8934)	loss 0.5743 (0.4929)	grad_norm 2.6853 (2.7781)	mem 20675MB
[2025-04-03 04:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][254/573]	eta 0:04:44 lr 0.000043	time 0.8825 (0.8933)	loss 0.5142 (0.4931)	grad_norm 2.7099 (2.7757)	mem 20675MB
[2025-04-03 04:16:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][256/573]	eta 0:04:43 lr 0.000043	time 0.8887 (0.8933)	loss 0.4476 (0.4931)	grad_norm 3.5517 (2.7828)	mem 20675MB
[2025-04-03 04:16:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][258/573]	eta 0:04:41 lr 0.000043	time 0.8862 (0.8932)	loss 0.5202 (0.4928)	grad_norm 2.9685 (2.7840)	mem 20675MB
[2025-04-03 04:16:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][260/573]	eta 0:04:39 lr 0.000043	time 0.8773 (0.8931)	loss 0.4415 (0.4931)	grad_norm 2.8877 (2.7828)	mem 20675MB
[2025-04-03 04:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][262/573]	eta 0:04:37 lr 0.000043	time 0.8776 (0.8930)	loss 0.4815 (0.4923)	grad_norm 3.0621 (2.8009)	mem 20675MB
[2025-04-03 04:16:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][264/573]	eta 0:04:35 lr 0.000043	time 0.8814 (0.8929)	loss 0.4562 (0.4926)	grad_norm 3.2575 (2.8006)	mem 20675MB
[2025-04-03 04:16:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][266/573]	eta 0:04:34 lr 0.000043	time 0.8793 (0.8928)	loss 0.4936 (0.4928)	grad_norm 2.0276 (2.7959)	mem 20675MB
[2025-04-03 04:16:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][268/573]	eta 0:04:32 lr 0.000043	time 0.8778 (0.8927)	loss 0.5231 (0.4931)	grad_norm 3.0397 (2.7968)	mem 20675MB
[2025-04-03 04:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][270/573]	eta 0:04:30 lr 0.000042	time 0.8782 (0.8927)	loss 0.4377 (0.4933)	grad_norm 3.6417 (2.7972)	mem 20675MB
[2025-04-03 04:16:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][272/573]	eta 0:04:28 lr 0.000042	time 0.8807 (0.8926)	loss 0.4418 (0.4927)	grad_norm 2.2386 (2.7953)	mem 20675MB
[2025-04-03 04:16:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][274/573]	eta 0:04:26 lr 0.000042	time 0.8781 (0.8925)	loss 0.5020 (0.4931)	grad_norm 1.9160 (2.7910)	mem 20675MB
[2025-04-03 04:16:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][276/573]	eta 0:04:25 lr 0.000042	time 0.8791 (0.8925)	loss 0.5332 (0.4933)	grad_norm 1.7345 (2.7843)	mem 20675MB
[2025-04-03 04:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][278/573]	eta 0:04:23 lr 0.000042	time 0.8812 (0.8924)	loss 0.5183 (0.4934)	grad_norm 2.5377 (2.7827)	mem 20675MB
[2025-04-03 04:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][280/573]	eta 0:04:21 lr 0.000042	time 0.8786 (0.8923)	loss 0.5962 (0.4937)	grad_norm 2.1509 (2.7788)	mem 20675MB
[2025-04-03 04:16:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][282/573]	eta 0:04:19 lr 0.000042	time 0.8919 (0.8923)	loss 0.5042 (0.4939)	grad_norm 3.1107 (2.7797)	mem 20675MB
[2025-04-03 04:16:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][284/573]	eta 0:04:17 lr 0.000042	time 0.8800 (0.8922)	loss 0.5464 (0.4941)	grad_norm 2.1712 (2.7763)	mem 20675MB
[2025-04-03 04:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][286/573]	eta 0:04:16 lr 0.000042	time 0.8838 (0.8921)	loss 0.3865 (0.4934)	grad_norm 2.2927 (2.7734)	mem 20675MB
[2025-04-03 04:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][288/573]	eta 0:04:14 lr 0.000042	time 0.8782 (0.8920)	loss 0.5734 (0.4940)	grad_norm 2.1950 (2.7708)	mem 20675MB
[2025-04-03 04:16:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][290/573]	eta 0:04:12 lr 0.000042	time 0.8776 (0.8919)	loss 0.5473 (0.4941)	grad_norm 2.2196 (2.7692)	mem 20675MB
[2025-04-03 04:16:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][292/573]	eta 0:04:10 lr 0.000042	time 0.8776 (0.8919)	loss 0.5225 (0.4942)	grad_norm 2.1928 (2.7642)	mem 20675MB
[2025-04-03 04:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][294/573]	eta 0:04:08 lr 0.000041	time 0.8779 (0.8918)	loss 0.3525 (0.4933)	grad_norm 4.0993 (2.7689)	mem 20675MB
[2025-04-03 04:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][296/573]	eta 0:04:07 lr 0.000041	time 0.8858 (0.8917)	loss 0.5028 (0.4935)	grad_norm 2.4904 (2.7671)	mem 20675MB
[2025-04-03 04:17:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][298/573]	eta 0:04:05 lr 0.000041	time 0.8772 (0.8917)	loss 0.4110 (0.4934)	grad_norm 2.6205 (2.7657)	mem 20675MB
[2025-04-03 04:17:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][300/573]	eta 0:04:03 lr 0.000041	time 0.8776 (0.8916)	loss 0.5151 (0.4934)	grad_norm 2.8730 (2.7646)	mem 20675MB
[2025-04-03 04:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][302/573]	eta 0:04:01 lr 0.000041	time 0.8779 (0.8915)	loss 0.3972 (0.4932)	grad_norm 3.8896 (2.7679)	mem 20675MB
[2025-04-03 04:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][304/573]	eta 0:03:59 lr 0.000041	time 0.8774 (0.8914)	loss 0.3555 (0.4925)	grad_norm 2.3989 (2.7715)	mem 20675MB
[2025-04-03 04:17:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][306/573]	eta 0:03:58 lr 0.000041	time 0.8779 (0.8914)	loss 0.5868 (0.4928)	grad_norm 2.6802 (2.7682)	mem 20675MB
[2025-04-03 04:17:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][308/573]	eta 0:03:56 lr 0.000041	time 0.8774 (0.8913)	loss 0.4619 (0.4928)	grad_norm 3.1354 (2.7706)	mem 20675MB
[2025-04-03 04:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][310/573]	eta 0:03:54 lr 0.000041	time 0.8772 (0.8912)	loss 0.4469 (0.4926)	grad_norm 6.4515 (2.7821)	mem 20675MB
[2025-04-03 04:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][312/573]	eta 0:03:52 lr 0.000041	time 0.8882 (0.8912)	loss 0.5169 (0.4927)	grad_norm 2.7826 (2.7801)	mem 20675MB
[2025-04-03 04:17:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][314/573]	eta 0:03:50 lr 0.000041	time 0.8789 (0.8912)	loss 0.5749 (0.4930)	grad_norm 2.7128 (2.7779)	mem 20675MB
[2025-04-03 04:17:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][316/573]	eta 0:03:49 lr 0.000041	time 0.8777 (0.8911)	loss 0.4177 (0.4929)	grad_norm 2.2816 (2.7748)	mem 20675MB
[2025-04-03 04:17:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][318/573]	eta 0:03:47 lr 0.000040	time 0.8773 (0.8911)	loss 0.4224 (0.4923)	grad_norm 5.2711 (2.7849)	mem 20675MB
[2025-04-03 04:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][320/573]	eta 0:03:45 lr 0.000040	time 0.8813 (0.8910)	loss 0.4533 (0.4924)	grad_norm 2.3828 (2.7816)	mem 20675MB
[2025-04-03 04:17:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][322/573]	eta 0:03:43 lr 0.000040	time 0.8795 (0.8909)	loss 0.5238 (0.4925)	grad_norm 3.7923 (2.7834)	mem 20675MB
[2025-04-03 04:17:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][324/573]	eta 0:03:41 lr 0.000040	time 0.8819 (0.8909)	loss 0.4967 (0.4927)	grad_norm 2.5142 (2.7813)	mem 20675MB
[2025-04-03 04:17:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][326/573]	eta 0:03:40 lr 0.000040	time 0.8774 (0.8908)	loss 0.3800 (0.4925)	grad_norm 2.3383 (2.7796)	mem 20675MB
[2025-04-03 04:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][328/573]	eta 0:03:38 lr 0.000040	time 0.8777 (0.8907)	loss 0.6278 (0.4926)	grad_norm 2.7042 (2.7788)	mem 20675MB
[2025-04-03 04:17:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][330/573]	eta 0:03:36 lr 0.000040	time 0.8778 (0.8907)	loss 0.5321 (0.4926)	grad_norm 2.5062 (2.7778)	mem 20675MB
[2025-04-03 04:17:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][332/573]	eta 0:03:34 lr 0.000040	time 0.8775 (0.8906)	loss 0.4378 (0.4923)	grad_norm 2.3237 (2.7770)	mem 20675MB
[2025-04-03 04:17:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][334/573]	eta 0:03:32 lr 0.000040	time 0.8780 (0.8905)	loss 0.6006 (0.4926)	grad_norm 2.1308 (2.7731)	mem 20675MB
[2025-04-03 04:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][336/573]	eta 0:03:31 lr 0.000040	time 0.8778 (0.8905)	loss 0.5577 (0.4926)	grad_norm 3.6470 (2.7762)	mem 20675MB
[2025-04-03 04:17:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][338/573]	eta 0:03:29 lr 0.000040	time 0.8794 (0.8904)	loss 0.5084 (0.4924)	grad_norm 1.9551 (2.7757)	mem 20675MB
[2025-04-03 04:17:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][340/573]	eta 0:03:27 lr 0.000040	time 0.8776 (0.8904)	loss 0.3613 (0.4920)	grad_norm 3.8032 (2.7797)	mem 20675MB
[2025-04-03 04:17:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][342/573]	eta 0:03:25 lr 0.000040	time 0.8775 (0.8903)	loss 0.5348 (0.4923)	grad_norm 2.9369 (2.7781)	mem 20675MB
[2025-04-03 04:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][344/573]	eta 0:03:23 lr 0.000039	time 0.8778 (0.8903)	loss 0.3763 (0.4920)	grad_norm 3.8330 (2.7798)	mem 20675MB
[2025-04-03 04:17:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][346/573]	eta 0:03:22 lr 0.000039	time 0.8810 (0.8902)	loss 0.5826 (0.4923)	grad_norm 2.8449 (2.7828)	mem 20675MB
[2025-04-03 04:17:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][348/573]	eta 0:03:20 lr 0.000039	time 0.8807 (0.8902)	loss 0.5900 (0.4927)	grad_norm 2.0634 (2.7807)	mem 20675MB
[2025-04-03 04:17:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][350/573]	eta 0:03:18 lr 0.000039	time 0.8778 (0.8902)	loss 0.5151 (0.4930)	grad_norm 2.7844 (2.7793)	mem 20675MB
[2025-04-03 04:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][352/573]	eta 0:03:16 lr 0.000039	time 0.8773 (0.8901)	loss 0.4826 (0.4929)	grad_norm 3.2208 (2.7770)	mem 20675MB
[2025-04-03 04:17:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][354/573]	eta 0:03:14 lr 0.000039	time 0.8773 (0.8900)	loss 0.3473 (0.4923)	grad_norm 2.6668 (2.7753)	mem 20675MB
[2025-04-03 04:17:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][356/573]	eta 0:03:13 lr 0.000039	time 0.8773 (0.8900)	loss 0.5267 (0.4921)	grad_norm 1.6732 (2.7747)	mem 20675MB
[2025-04-03 04:17:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][358/573]	eta 0:03:11 lr 0.000039	time 0.8774 (0.8899)	loss 0.4177 (0.4922)	grad_norm 2.1152 (2.7706)	mem 20675MB
[2025-04-03 04:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][360/573]	eta 0:03:09 lr 0.000039	time 0.8774 (0.8899)	loss 0.4066 (0.4920)	grad_norm 3.4531 (2.7706)	mem 20675MB
[2025-04-03 04:18:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][362/573]	eta 0:03:07 lr 0.000039	time 0.8772 (0.8898)	loss 0.4442 (0.4920)	grad_norm 2.6969 (2.7714)	mem 20675MB
[2025-04-03 04:18:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][364/573]	eta 0:03:05 lr 0.000039	time 0.8773 (0.8897)	loss 0.5677 (0.4920)	grad_norm 2.9802 (2.7761)	mem 20675MB
[2025-04-03 04:18:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][366/573]	eta 0:03:04 lr 0.000039	time 0.8773 (0.8897)	loss 0.5373 (0.4922)	grad_norm 2.1360 (2.7738)	mem 20675MB
[2025-04-03 04:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][368/573]	eta 0:03:02 lr 0.000038	time 0.8777 (0.8896)	loss 0.4581 (0.4924)	grad_norm 2.7361 (2.7721)	mem 20675MB
[2025-04-03 04:18:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][370/573]	eta 0:03:00 lr 0.000038	time 0.8776 (0.8896)	loss 0.5312 (0.4927)	grad_norm 2.5060 (2.7684)	mem 20675MB
[2025-04-03 04:18:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][372/573]	eta 0:02:58 lr 0.000038	time 0.8776 (0.8895)	loss 0.3577 (0.4922)	grad_norm 3.3682 (2.7687)	mem 20675MB
[2025-04-03 04:18:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][374/573]	eta 0:02:57 lr 0.000038	time 0.8781 (0.8895)	loss 0.4991 (0.4921)	grad_norm 1.8977 (2.7678)	mem 20675MB
[2025-04-03 04:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][376/573]	eta 0:02:55 lr 0.000038	time 0.8777 (0.8894)	loss 0.4818 (0.4922)	grad_norm 2.1164 (2.7639)	mem 20675MB
[2025-04-03 04:18:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][378/573]	eta 0:02:53 lr 0.000038	time 0.8806 (0.8894)	loss 0.5953 (0.4927)	grad_norm 3.2172 (2.7636)	mem 20675MB
[2025-04-03 04:18:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][380/573]	eta 0:02:51 lr 0.000038	time 0.8830 (0.8894)	loss 0.4108 (0.4927)	grad_norm 2.2133 (2.7638)	mem 20675MB
[2025-04-03 04:18:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][382/573]	eta 0:02:49 lr 0.000038	time 0.8776 (0.8893)	loss 0.4248 (0.4927)	grad_norm 3.2734 (2.7643)	mem 20675MB
[2025-04-03 04:18:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][384/573]	eta 0:02:48 lr 0.000038	time 0.8863 (0.8893)	loss 0.5348 (0.4929)	grad_norm 4.3129 (2.7679)	mem 20675MB
[2025-04-03 04:18:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][386/573]	eta 0:02:46 lr 0.000038	time 0.8917 (0.8893)	loss 0.4549 (0.4924)	grad_norm 3.6041 (2.7720)	mem 20675MB
[2025-04-03 04:18:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][388/573]	eta 0:02:44 lr 0.000038	time 0.8772 (0.8892)	loss 0.4798 (0.4925)	grad_norm 2.1565 (2.7684)	mem 20675MB
[2025-04-03 04:18:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][390/573]	eta 0:02:42 lr 0.000038	time 0.8773 (0.8892)	loss 0.5016 (0.4924)	grad_norm 1.6549 (2.7655)	mem 20675MB
[2025-04-03 04:18:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][392/573]	eta 0:02:40 lr 0.000038	time 0.8854 (0.8892)	loss 0.5027 (0.4924)	grad_norm 2.1981 (2.7629)	mem 20675MB
[2025-04-03 04:18:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][394/573]	eta 0:02:39 lr 0.000037	time 0.8954 (0.8892)	loss 0.5371 (0.4926)	grad_norm 2.4448 (2.7613)	mem 20675MB
[2025-04-03 04:18:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][396/573]	eta 0:02:37 lr 0.000037	time 0.8821 (0.8891)	loss 0.5842 (0.4930)	grad_norm 2.3158 (2.7599)	mem 20675MB
[2025-04-03 04:18:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][398/573]	eta 0:02:35 lr 0.000037	time 0.9095 (0.8891)	loss 0.5568 (0.4933)	grad_norm 3.8114 (2.7624)	mem 20675MB
[2025-04-03 04:18:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][400/573]	eta 0:02:33 lr 0.000037	time 0.8773 (0.8891)	loss 0.4546 (0.4930)	grad_norm 3.7256 (2.7649)	mem 20675MB
[2025-04-03 04:18:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][402/573]	eta 0:02:32 lr 0.000037	time 0.8776 (0.8891)	loss 0.4891 (0.4930)	grad_norm 3.6369 (2.7661)	mem 20675MB
[2025-04-03 04:18:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][404/573]	eta 0:02:30 lr 0.000037	time 0.8773 (0.8890)	loss 0.4295 (0.4927)	grad_norm 3.7483 (2.7694)	mem 20675MB
[2025-04-03 04:18:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][406/573]	eta 0:02:28 lr 0.000037	time 0.8773 (0.8890)	loss 0.5476 (0.4927)	grad_norm 2.1869 (2.7718)	mem 20675MB
[2025-04-03 04:18:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][408/573]	eta 0:02:26 lr 0.000037	time 0.8933 (0.8890)	loss 0.4738 (0.4922)	grad_norm 2.3546 (2.7712)	mem 20675MB
[2025-04-03 04:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][410/573]	eta 0:02:24 lr 0.000037	time 0.8785 (0.8889)	loss 0.3724 (0.4921)	grad_norm 5.8737 (2.7770)	mem 20675MB
[2025-04-03 04:18:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][412/573]	eta 0:02:23 lr 0.000037	time 0.8773 (0.8889)	loss 0.5434 (0.4925)	grad_norm 2.3519 (2.7760)	mem 20675MB
[2025-04-03 04:18:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][414/573]	eta 0:02:21 lr 0.000037	time 0.8776 (0.8888)	loss 0.5667 (0.4928)	grad_norm 2.5050 (2.7757)	mem 20675MB
[2025-04-03 04:18:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][416/573]	eta 0:02:19 lr 0.000037	time 0.8775 (0.8888)	loss 0.2871 (0.4925)	grad_norm 2.3898 (2.7726)	mem 20675MB
[2025-04-03 04:18:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][418/573]	eta 0:02:17 lr 0.000037	time 0.8778 (0.8887)	loss 0.4961 (0.4925)	grad_norm 2.4696 (2.7732)	mem 20675MB
[2025-04-03 04:18:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][420/573]	eta 0:02:15 lr 0.000036	time 0.8819 (0.8887)	loss 0.3129 (0.4921)	grad_norm 3.2373 (2.7739)	mem 20675MB
[2025-04-03 04:18:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][422/573]	eta 0:02:14 lr 0.000036	time 0.8772 (0.8886)	loss 0.5210 (0.4924)	grad_norm 2.4835 (2.7728)	mem 20675MB
[2025-04-03 04:18:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][424/573]	eta 0:02:12 lr 0.000036	time 0.8775 (0.8886)	loss 0.4579 (0.4920)	grad_norm 2.2192 (2.7735)	mem 20675MB
[2025-04-03 04:18:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][426/573]	eta 0:02:10 lr 0.000036	time 0.8850 (0.8886)	loss 0.4300 (0.4918)	grad_norm 4.0982 (2.7754)	mem 20675MB
[2025-04-03 04:18:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][428/573]	eta 0:02:08 lr 0.000036	time 0.8774 (0.8885)	loss 0.3609 (0.4911)	grad_norm 3.4192 (2.7766)	mem 20675MB
[2025-04-03 04:19:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][430/573]	eta 0:02:07 lr 0.000036	time 0.8771 (0.8885)	loss 0.5180 (0.4912)	grad_norm 2.6762 (2.7783)	mem 20675MB
[2025-04-03 04:19:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][432/573]	eta 0:02:05 lr 0.000036	time 0.8775 (0.8884)	loss 0.3834 (0.4910)	grad_norm 3.6896 (2.7811)	mem 20675MB
[2025-04-03 04:19:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][434/573]	eta 0:02:03 lr 0.000036	time 0.8777 (0.8884)	loss 0.4944 (0.4909)	grad_norm 2.5313 (2.7813)	mem 20675MB
[2025-04-03 04:19:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][436/573]	eta 0:02:01 lr 0.000036	time 0.8785 (0.8884)	loss 0.5088 (0.4911)	grad_norm 3.7087 (2.7824)	mem 20675MB
[2025-04-03 04:19:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][438/573]	eta 0:01:59 lr 0.000036	time 0.8779 (0.8884)	loss 0.6099 (0.4913)	grad_norm 2.5865 (2.7814)	mem 20675MB
[2025-04-03 04:19:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][440/573]	eta 0:01:58 lr 0.000036	time 0.8899 (0.8883)	loss 0.4964 (0.4912)	grad_norm 3.0997 (2.7843)	mem 20675MB
[2025-04-03 04:19:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][442/573]	eta 0:01:56 lr 0.000036	time 0.8839 (0.8883)	loss 0.4779 (0.4912)	grad_norm 2.3205 (2.7819)	mem 20675MB
[2025-04-03 04:19:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][444/573]	eta 0:01:54 lr 0.000036	time 0.8854 (0.8884)	loss 0.5609 (0.4912)	grad_norm 2.5847 (2.7844)	mem 20675MB
[2025-04-03 04:19:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][446/573]	eta 0:01:52 lr 0.000035	time 0.8774 (0.8883)	loss 0.4269 (0.4910)	grad_norm 2.9904 (2.7847)	mem 20675MB
[2025-04-03 04:19:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][448/573]	eta 0:01:51 lr 0.000035	time 0.8775 (0.8883)	loss 0.3805 (0.4909)	grad_norm 2.3965 (2.7844)	mem 20675MB
[2025-04-03 04:19:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][450/573]	eta 0:01:49 lr 0.000035	time 0.8780 (0.8883)	loss 0.3945 (0.4909)	grad_norm 3.2980 (2.7862)	mem 20675MB
[2025-04-03 04:19:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][452/573]	eta 0:01:47 lr 0.000035	time 0.8778 (0.8882)	loss 0.5475 (0.4911)	grad_norm 1.6614 (2.7832)	mem 20675MB
[2025-04-03 04:19:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][454/573]	eta 0:01:45 lr 0.000035	time 0.8780 (0.8882)	loss 0.5398 (0.4914)	grad_norm 2.6356 (2.7834)	mem 20675MB
[2025-04-03 04:19:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][456/573]	eta 0:01:43 lr 0.000035	time 0.8784 (0.8881)	loss 0.5249 (0.4915)	grad_norm 2.0274 (2.7831)	mem 20675MB
[2025-04-03 04:19:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][458/573]	eta 0:01:42 lr 0.000035	time 0.8806 (0.8881)	loss 0.5266 (0.4916)	grad_norm 2.0534 (2.7842)	mem 20675MB
[2025-04-03 04:19:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][460/573]	eta 0:01:40 lr 0.000035	time 0.8772 (0.8881)	loss 0.5530 (0.4918)	grad_norm 2.2469 (2.7807)	mem 20675MB
[2025-04-03 04:19:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][462/573]	eta 0:01:38 lr 0.000035	time 0.8771 (0.8880)	loss 0.5849 (0.4918)	grad_norm 2.4700 (2.7808)	mem 20675MB
[2025-04-03 04:19:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][464/573]	eta 0:01:36 lr 0.000035	time 0.8778 (0.8880)	loss 0.4442 (0.4918)	grad_norm 3.1267 (2.7805)	mem 20675MB
[2025-04-03 04:19:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][466/573]	eta 0:01:35 lr 0.000035	time 0.8799 (0.8880)	loss 0.5165 (0.4919)	grad_norm 2.8526 (2.7792)	mem 20675MB
[2025-04-03 04:19:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][468/573]	eta 0:01:33 lr 0.000035	time 0.8893 (0.8880)	loss 0.4955 (0.4917)	grad_norm 2.6850 (2.7791)	mem 20675MB
[2025-04-03 04:19:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][470/573]	eta 0:01:31 lr 0.000035	time 0.8782 (0.8879)	loss 0.5291 (0.4919)	grad_norm 2.2710 (2.7793)	mem 20675MB
[2025-04-03 04:19:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][472/573]	eta 0:01:29 lr 0.000034	time 0.8783 (0.8879)	loss 0.5348 (0.4921)	grad_norm 2.6382 (2.7829)	mem 20675MB
[2025-04-03 04:19:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][474/573]	eta 0:01:27 lr 0.000034	time 0.8777 (0.8879)	loss 0.4874 (0.4923)	grad_norm 3.3767 (2.7838)	mem 20675MB
[2025-04-03 04:19:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][476/573]	eta 0:01:26 lr 0.000034	time 0.9190 (0.8879)	loss 0.5388 (0.4923)	grad_norm 1.7750 (2.7825)	mem 20675MB
[2025-04-03 04:19:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][478/573]	eta 0:01:24 lr 0.000034	time 0.8814 (0.8879)	loss 0.5867 (0.4924)	grad_norm 2.6029 (2.7801)	mem 20675MB
[2025-04-03 04:19:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][480/573]	eta 0:01:22 lr 0.000034	time 0.8776 (0.8879)	loss 0.3477 (0.4923)	grad_norm 5.2145 (2.7842)	mem 20675MB
[2025-04-03 04:19:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][482/573]	eta 0:01:20 lr 0.000034	time 0.8796 (0.8878)	loss 0.5249 (0.4925)	grad_norm 2.3827 (2.7812)	mem 20675MB
[2025-04-03 04:19:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][484/573]	eta 0:01:19 lr 0.000034	time 0.8774 (0.8878)	loss 0.4652 (0.4925)	grad_norm 1.8059 (2.7789)	mem 20675MB
[2025-04-03 04:19:49 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][486/573]	eta 0:01:17 lr 0.000034	time 0.8777 (0.8878)	loss 0.5185 (0.4926)	grad_norm 2.7783 (2.7792)	mem 20675MB
[2025-04-03 04:19:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][488/573]	eta 0:01:15 lr 0.000034	time 0.8772 (0.8877)	loss 0.4484 (0.4927)	grad_norm 2.6959 (2.7790)	mem 20675MB
[2025-04-03 04:19:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][490/573]	eta 0:01:13 lr 0.000034	time 0.8773 (0.8877)	loss 0.3985 (0.4923)	grad_norm 2.1029 (2.7810)	mem 20675MB
[2025-04-03 04:19:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][492/573]	eta 0:01:11 lr 0.000034	time 0.8785 (0.8877)	loss 0.3414 (0.4918)	grad_norm 3.0583 (2.7807)	mem 20675MB
[2025-04-03 04:19:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][494/573]	eta 0:01:10 lr 0.000034	time 0.8779 (0.8877)	loss 0.3878 (0.4914)	grad_norm 3.9735 (2.7825)	mem 20675MB
[2025-04-03 04:19:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][496/573]	eta 0:01:08 lr 0.000034	time 0.8780 (0.8876)	loss 0.4557 (0.4914)	grad_norm 2.9330 (2.7824)	mem 20675MB
[2025-04-03 04:20:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][498/573]	eta 0:01:06 lr 0.000034	time 0.8772 (0.8876)	loss 0.5525 (0.4914)	grad_norm 1.8698 (2.7810)	mem 20675MB
[2025-04-03 04:20:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][500/573]	eta 0:01:04 lr 0.000033	time 0.8778 (0.8875)	loss 0.5611 (0.4915)	grad_norm 2.4750 (2.7796)	mem 20675MB
[2025-04-03 04:20:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][502/573]	eta 0:01:03 lr 0.000033	time 0.8820 (0.8875)	loss 0.4687 (0.4915)	grad_norm 3.8284 (2.7818)	mem 20675MB
[2025-04-03 04:20:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][504/573]	eta 0:01:01 lr 0.000033	time 0.8777 (0.8875)	loss 0.5968 (0.4915)	grad_norm 3.2789 (2.7846)	mem 20675MB
[2025-04-03 04:20:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][506/573]	eta 0:00:59 lr 0.000033	time 0.8866 (0.8875)	loss 0.4134 (0.4915)	grad_norm 2.9642 (2.7836)	mem 20675MB
[2025-04-03 04:20:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][508/573]	eta 0:00:57 lr 0.000033	time 0.8778 (0.8874)	loss 0.5679 (0.4916)	grad_norm 2.1722 (2.7817)	mem 20675MB
[2025-04-03 04:20:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][510/573]	eta 0:00:55 lr 0.000033	time 0.8857 (0.8874)	loss 0.5120 (0.4915)	grad_norm 3.2524 (2.7839)	mem 20675MB
[2025-04-03 04:20:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][512/573]	eta 0:00:54 lr 0.000033	time 0.8781 (0.8874)	loss 0.5000 (0.4915)	grad_norm 2.8886 (2.7850)	mem 20675MB
[2025-04-03 04:20:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][514/573]	eta 0:00:52 lr 0.000033	time 0.8941 (0.8874)	loss 0.5629 (0.4918)	grad_norm 2.2709 (2.7831)	mem 20675MB
[2025-04-03 04:20:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][516/573]	eta 0:00:50 lr 0.000033	time 0.8775 (0.8874)	loss 0.5365 (0.4918)	grad_norm 2.3823 (2.7813)	mem 20675MB
[2025-04-03 04:20:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][518/573]	eta 0:00:48 lr 0.000033	time 0.8774 (0.8874)	loss 0.5559 (0.4918)	grad_norm 2.6544 (2.7828)	mem 20675MB
[2025-04-03 04:20:19 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][520/573]	eta 0:00:47 lr 0.000033	time 0.8774 (0.8873)	loss 0.5621 (0.4918)	grad_norm 1.9723 (2.7816)	mem 20675MB
[2025-04-03 04:20:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][522/573]	eta 0:00:45 lr 0.000033	time 0.8773 (0.8873)	loss 0.5050 (0.4918)	grad_norm 3.4543 (2.7846)	mem 20675MB
[2025-04-03 04:20:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][524/573]	eta 0:00:43 lr 0.000033	time 0.8771 (0.8873)	loss 0.4093 (0.4914)	grad_norm 2.3665 (2.7840)	mem 20675MB
[2025-04-03 04:20:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][526/573]	eta 0:00:41 lr 0.000033	time 0.8778 (0.8872)	loss 0.3905 (0.4912)	grad_norm 2.7448 (2.7821)	mem 20675MB
[2025-04-03 04:20:26 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][528/573]	eta 0:00:39 lr 0.000032	time 0.8780 (0.8872)	loss 0.4308 (0.4908)	grad_norm 3.0730 (2.7834)	mem 20675MB
[2025-04-03 04:20:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][530/573]	eta 0:00:38 lr 0.000032	time 0.8790 (0.8872)	loss 0.5720 (0.4911)	grad_norm 2.0487 (2.7816)	mem 20675MB
[2025-04-03 04:20:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][532/573]	eta 0:00:36 lr 0.000032	time 0.8771 (0.8872)	loss 0.4144 (0.4910)	grad_norm 4.0217 (2.7847)	mem 20675MB
[2025-04-03 04:20:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][534/573]	eta 0:00:34 lr 0.000032	time 0.8828 (0.8871)	loss 0.3573 (0.4909)	grad_norm 2.7954 (2.7835)	mem 20675MB
[2025-04-03 04:20:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][536/573]	eta 0:00:32 lr 0.000032	time 0.8818 (0.8871)	loss 0.3861 (0.4908)	grad_norm 2.1079 (2.7811)	mem 20675MB
[2025-04-03 04:20:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][538/573]	eta 0:00:31 lr 0.000032	time 0.8795 (0.8871)	loss 0.5370 (0.4906)	grad_norm 2.0477 (2.7841)	mem 20675MB
[2025-04-03 04:20:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][540/573]	eta 0:00:29 lr 0.000032	time 0.8774 (0.8871)	loss 0.4748 (0.4907)	grad_norm 2.9217 (2.7827)	mem 20675MB
[2025-04-03 04:20:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][542/573]	eta 0:00:27 lr 0.000032	time 0.8773 (0.8870)	loss 0.4923 (0.4908)	grad_norm 2.1066 (2.7802)	mem 20675MB
[2025-04-03 04:20:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][544/573]	eta 0:00:25 lr 0.000032	time 0.8774 (0.8870)	loss 0.4736 (0.4908)	grad_norm 2.0725 (2.7774)	mem 20675MB
[2025-04-03 04:20:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][546/573]	eta 0:00:23 lr 0.000032	time 0.8833 (0.8870)	loss 0.4560 (0.4909)	grad_norm 2.3975 (2.7764)	mem 20675MB
[2025-04-03 04:20:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][548/573]	eta 0:00:22 lr 0.000032	time 0.8790 (0.8870)	loss 0.5497 (0.4910)	grad_norm 3.6842 (2.7772)	mem 20675MB
[2025-04-03 04:20:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][550/573]	eta 0:00:20 lr 0.000032	time 0.8797 (0.8869)	loss 0.5224 (0.4910)	grad_norm 1.9165 (2.7779)	mem 20675MB
[2025-04-03 04:20:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][552/573]	eta 0:00:18 lr 0.000032	time 0.8778 (0.8869)	loss 0.5113 (0.4907)	grad_norm 2.4581 (2.7781)	mem 20675MB
[2025-04-03 04:20:49 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][554/573]	eta 0:00:16 lr 0.000032	time 0.8791 (0.8869)	loss 0.3702 (0.4906)	grad_norm 2.3134 (2.7778)	mem 20675MB
[2025-04-03 04:20:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][556/573]	eta 0:00:15 lr 0.000031	time 0.8776 (0.8869)	loss 0.5207 (0.4906)	grad_norm 2.5323 (2.7760)	mem 20675MB
[2025-04-03 04:20:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][558/573]	eta 0:00:13 lr 0.000031	time 0.8773 (0.8868)	loss 0.4801 (0.4907)	grad_norm 2.6210 (2.7748)	mem 20675MB
[2025-04-03 04:20:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][560/573]	eta 0:00:11 lr 0.000031	time 0.8771 (0.8868)	loss 0.3491 (0.4902)	grad_norm 2.4299 (2.7733)	mem 20675MB
[2025-04-03 04:20:56 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][562/573]	eta 0:00:09 lr 0.000031	time 0.8772 (0.8868)	loss 0.3477 (0.4902)	grad_norm 3.3518 (2.7762)	mem 20675MB
[2025-04-03 04:20:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][564/573]	eta 0:00:07 lr 0.000031	time 0.8791 (0.8867)	loss 0.4495 (0.4901)	grad_norm 3.5933 (2.7761)	mem 20675MB
[2025-04-03 04:21:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][566/573]	eta 0:00:06 lr 0.000031	time 0.8774 (0.8867)	loss 0.5748 (0.4900)	grad_norm 2.1186 (2.7773)	mem 20675MB
[2025-04-03 04:21:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][568/573]	eta 0:00:04 lr 0.000031	time 0.8772 (0.8867)	loss 0.5061 (0.4899)	grad_norm 2.6344 (2.7768)	mem 20675MB
[2025-04-03 04:21:03 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][570/573]	eta 0:00:02 lr 0.000031	time 0.8804 (0.8867)	loss 0.3783 (0.4895)	grad_norm 3.8049 (2.7779)	mem 20675MB
[2025-04-03 04:21:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][572/573]	eta 0:00:00 lr 0.000031	time 0.8776 (0.8866)	loss 0.3345 (0.4891)	grad_norm 2.3053 (2.7783)	mem 20675MB
[2025-04-03 04:21:05 simmim_finetune] (main_finetune.py 260): INFO EPOCH 26 training takes 0:08:28
[2025-04-03 04:21:09 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.105 (3.105)	Loss 0.5020 (0.5020)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 04:21:09 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.224)	Loss 0.4535 (0.4619)	Acc@1 78.125 (74.740)	Mem 20675MB
[2025-04-03 04:21:10 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.284 (0.850)	Loss 0.4767 (0.4555)	Acc@1 74.219 (75.469)	Mem 20675MB
[2025-04-03 04:21:10 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.688)	Loss 0.4062 (0.4419)	Acc@1 82.812 (77.455)	Mem 20675MB
[2025-04-03 04:21:11 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.598)	Loss 0.5145 (0.4456)	Acc@1 73.438 (77.431)	Mem 20675MB
[2025-04-03 04:21:11 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.541)	Loss 0.4848 (0.4582)	Acc@1 82.031 (77.486)	Mem 20675MB
[2025-04-03 04:21:12 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.285 (0.502)	Loss 0.4865 (0.4613)	Acc@1 78.125 (77.584)	Mem 20675MB
[2025-04-03 04:21:12 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.473)	Loss 0.4444 (0.4590)	Acc@1 78.906 (77.865)	Mem 20675MB
[2025-04-03 04:21:13 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.873
[2025-04-03 04:21:13 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 77.9%
[2025-04-03 04:21:13 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
[2025-04-03 04:21:13 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.571018749006464e-07, 3.571018749006464e-07, 4.1807019044542047e-07, 4.1807019044542047e-07, 5.118675989758422e-07, 5.118675989758422e-07, 6.561713044072602e-07, 6.561713044072602e-07, 8.781770050709801e-07, 8.781770050709801e-07, 1.2197242368613186e-06, 1.2197242368613186e-06, 1.745181516538762e-06, 1.745181516538762e-06, 2.5535773314271367e-06, 2.5535773314271367e-06, 3.7972632004861744e-06, 3.7972632004861744e-06, 5.7106260759616186e-06, 5.7106260759616186e-06, 8.65426126900076e-06, 8.65426126900076e-06, 1.3182930796753288e-05, 1.3182930796753288e-05, 2.015011468560333e-05, 2.015011468560333e-05, 3.086885912998801e-05, 3.086885912998801e-05]
[2025-04-03 04:21:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][0/573]	eta 0:33:22 lr 0.000031	time 3.4940 (3.4940)	loss 0.3596 (0.3596)	grad_norm 2.9227 (2.9227)	mem 20675MB
[2025-04-03 04:21:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][2/573]	eta 0:16:41 lr 0.000031	time 0.8820 (1.7547)	loss 0.5187 (0.4806)	grad_norm 2.8806 (2.7759)	mem 20675MB
[2025-04-03 04:21:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][4/573]	eta 0:13:19 lr 0.000031	time 0.8774 (1.4047)	loss 0.5874 (0.4924)	grad_norm 2.5779 (3.5563)	mem 20675MB
[2025-04-03 04:21:22 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][6/573]	eta 0:11:51 lr 0.000031	time 0.8779 (1.2552)	loss 0.5962 (0.5239)	grad_norm 2.6059 (3.2077)	mem 20675MB
[2025-04-03 04:21:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][8/573]	eta 0:11:02 lr 0.000031	time 0.8776 (1.1724)	loss 0.3940 (0.4934)	grad_norm 2.2767 (3.1942)	mem 20675MB
[2025-04-03 04:21:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][10/573]	eta 0:10:30 lr 0.000030	time 0.8780 (1.1191)	loss 0.5195 (0.4978)	grad_norm 2.5380 (3.0307)	mem 20675MB
[2025-04-03 04:21:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][12/573]	eta 0:10:07 lr 0.000030	time 0.8775 (1.0821)	loss 0.4229 (0.4907)	grad_norm 2.0162 (2.9094)	mem 20675MB
[2025-04-03 04:21:29 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][14/573]	eta 0:09:49 lr 0.000030	time 0.8786 (1.0550)	loss 0.4390 (0.4919)	grad_norm 3.8935 (2.9703)	mem 20675MB
[2025-04-03 04:21:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][16/573]	eta 0:09:36 lr 0.000030	time 0.8777 (1.0344)	loss 0.5168 (0.5008)	grad_norm 2.5635 (2.9499)	mem 20675MB
[2025-04-03 04:21:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][18/573]	eta 0:09:24 lr 0.000030	time 0.8777 (1.0179)	loss 0.5291 (0.5066)	grad_norm 2.6189 (2.9488)	mem 20675MB
[2025-04-03 04:21:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][20/573]	eta 0:09:15 lr 0.000030	time 0.8775 (1.0047)	loss 0.5051 (0.5065)	grad_norm 2.1446 (2.8975)	mem 20675MB
[2025-04-03 04:21:36 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][22/573]	eta 0:09:08 lr 0.000030	time 0.8857 (0.9947)	loss 0.5131 (0.5093)	grad_norm 2.2296 (2.8338)	mem 20675MB
[2025-04-03 04:21:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][24/573]	eta 0:09:00 lr 0.000030	time 0.8775 (0.9854)	loss 0.4534 (0.5085)	grad_norm 3.3954 (2.8198)	mem 20675MB
[2025-04-03 04:21:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][26/573]	eta 0:08:54 lr 0.000030	time 0.8776 (0.9776)	loss 0.4998 (0.5083)	grad_norm 2.2286 (2.7771)	mem 20675MB
[2025-04-03 04:21:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][28/573]	eta 0:08:49 lr 0.000030	time 0.8777 (0.9707)	loss 0.3975 (0.5050)	grad_norm 2.1287 (2.7757)	mem 20675MB
[2025-04-03 04:21:43 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][30/573]	eta 0:08:43 lr 0.000030	time 0.8781 (0.9648)	loss 0.5516 (0.5007)	grad_norm 2.2496 (2.8139)	mem 20675MB
[2025-04-03 04:21:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][32/573]	eta 0:08:39 lr 0.000030	time 0.8807 (0.9597)	loss 0.5863 (0.5040)	grad_norm 3.2758 (2.8098)	mem 20675MB
[2025-04-03 04:21:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][34/573]	eta 0:08:34 lr 0.000030	time 0.8778 (0.9551)	loss 0.4974 (0.5049)	grad_norm 2.4225 (2.7913)	mem 20675MB
[2025-04-03 04:21:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][36/573]	eta 0:08:30 lr 0.000030	time 0.8778 (0.9509)	loss 0.3645 (0.5020)	grad_norm 2.0716 (2.7624)	mem 20675MB
[2025-04-03 04:21:50 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][38/573]	eta 0:08:26 lr 0.000030	time 0.8775 (0.9473)	loss 0.5271 (0.5008)	grad_norm 2.5233 (2.7661)	mem 20675MB
[2025-04-03 04:21:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][40/573]	eta 0:08:23 lr 0.000029	time 0.8783 (0.9439)	loss 0.3541 (0.4976)	grad_norm 3.5465 (2.7659)	mem 20675MB
[2025-04-03 04:21:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][42/573]	eta 0:08:19 lr 0.000029	time 0.8781 (0.9410)	loss 0.5585 (0.4999)	grad_norm 4.1530 (2.7877)	mem 20675MB
[2025-04-03 04:21:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][44/573]	eta 0:08:16 lr 0.000029	time 0.8770 (0.9382)	loss 0.4678 (0.5014)	grad_norm 3.0432 (2.7995)	mem 20675MB
[2025-04-03 04:21:57 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][46/573]	eta 0:08:13 lr 0.000029	time 0.8782 (0.9357)	loss 0.4868 (0.5020)	grad_norm 2.3928 (2.7777)	mem 20675MB
[2025-04-03 04:21:59 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][48/573]	eta 0:08:10 lr 0.000029	time 0.8779 (0.9334)	loss 0.6328 (0.5050)	grad_norm 2.2078 (2.7561)	mem 20675MB
[2025-04-03 04:22:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][50/573]	eta 0:08:07 lr 0.000029	time 0.8775 (0.9315)	loss 0.5437 (0.5057)	grad_norm 2.3813 (2.7362)	mem 20675MB
[2025-04-03 04:22:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][52/573]	eta 0:08:04 lr 0.000029	time 0.8776 (0.9295)	loss 0.3810 (0.5024)	grad_norm 3.2618 (2.7562)	mem 20675MB
[2025-04-03 04:22:04 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][54/573]	eta 0:08:01 lr 0.000029	time 0.8776 (0.9276)	loss 0.6715 (0.5064)	grad_norm 2.6047 (2.7462)	mem 20675MB
[2025-04-03 04:22:06 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][56/573]	eta 0:07:58 lr 0.000029	time 0.8775 (0.9259)	loss 0.4960 (0.5066)	grad_norm 1.8842 (2.7299)	mem 20675MB
[2025-04-03 04:22:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][58/573]	eta 0:07:56 lr 0.000029	time 0.8777 (0.9243)	loss 0.4549 (0.5036)	grad_norm 2.5017 (2.7244)	mem 20675MB
[2025-04-03 04:22:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][60/573]	eta 0:07:53 lr 0.000029	time 0.8779 (0.9228)	loss 0.5366 (0.5011)	grad_norm 1.9591 (2.7186)	mem 20675MB
[2025-04-03 04:22:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][62/573]	eta 0:07:50 lr 0.000029	time 0.8778 (0.9214)	loss 0.5021 (0.4999)	grad_norm 1.8809 (2.6982)	mem 20675MB
[2025-04-03 04:22:13 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][64/573]	eta 0:07:48 lr 0.000029	time 0.8781 (0.9202)	loss 0.5694 (0.5007)	grad_norm 2.5108 (2.6877)	mem 20675MB
[2025-04-03 04:22:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][66/573]	eta 0:07:45 lr 0.000029	time 0.8780 (0.9189)	loss 0.5831 (0.4997)	grad_norm 2.3152 (2.6787)	mem 20675MB
[2025-04-03 04:22:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][68/573]	eta 0:07:43 lr 0.000028	time 0.8775 (0.9179)	loss 0.5457 (0.4984)	grad_norm 2.9170 (2.6856)	mem 20675MB
[2025-04-03 04:22:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][70/573]	eta 0:07:41 lr 0.000028	time 0.8831 (0.9168)	loss 0.5152 (0.4996)	grad_norm 2.2124 (2.6717)	mem 20675MB
[2025-04-03 04:22:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][72/573]	eta 0:07:38 lr 0.000028	time 0.8890 (0.9160)	loss 0.4217 (0.4991)	grad_norm 3.0926 (2.6700)	mem 20675MB
[2025-04-03 04:22:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][74/573]	eta 0:07:36 lr 0.000028	time 0.8776 (0.9150)	loss 0.5300 (0.5001)	grad_norm 1.8302 (2.6465)	mem 20675MB
[2025-04-03 04:22:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][76/573]	eta 0:07:34 lr 0.000028	time 0.8777 (0.9140)	loss 0.4755 (0.4992)	grad_norm 2.6436 (2.6380)	mem 20675MB
[2025-04-03 04:22:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][78/573]	eta 0:07:32 lr 0.000028	time 0.8786 (0.9131)	loss 0.4857 (0.4991)	grad_norm 2.8653 (2.6500)	mem 20675MB
[2025-04-03 04:22:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][80/573]	eta 0:07:29 lr 0.000028	time 0.8781 (0.9124)	loss 0.5423 (0.4985)	grad_norm 1.9594 (2.6382)	mem 20675MB
[2025-04-03 04:22:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][82/573]	eta 0:07:27 lr 0.000028	time 0.8775 (0.9115)	loss 0.5780 (0.4993)	grad_norm 2.3180 (2.6270)	mem 20675MB
[2025-04-03 04:22:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][84/573]	eta 0:07:25 lr 0.000028	time 0.8799 (0.9108)	loss 0.5631 (0.5010)	grad_norm 2.2634 (2.6237)	mem 20675MB
[2025-04-03 04:22:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][86/573]	eta 0:07:23 lr 0.000028	time 0.8804 (0.9101)	loss 0.5328 (0.4992)	grad_norm 2.3871 (2.6188)	mem 20675MB
[2025-04-03 04:22:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][88/573]	eta 0:07:21 lr 0.000028	time 0.8830 (0.9095)	loss 0.5528 (0.4982)	grad_norm 3.0364 (2.6187)	mem 20675MB
[2025-04-03 04:22:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][90/573]	eta 0:07:18 lr 0.000028	time 0.8788 (0.9088)	loss 0.4984 (0.4973)	grad_norm 2.2183 (2.6142)	mem 20675MB
[2025-04-03 04:22:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][92/573]	eta 0:07:16 lr 0.000028	time 0.8776 (0.9082)	loss 0.5120 (0.4976)	grad_norm 2.1120 (2.6053)	mem 20675MB
[2025-04-03 04:22:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][94/573]	eta 0:07:14 lr 0.000028	time 0.8774 (0.9076)	loss 0.5536 (0.4975)	grad_norm 2.2660 (2.6052)	mem 20675MB
[2025-04-03 04:22:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][96/573]	eta 0:07:12 lr 0.000028	time 0.8981 (0.9072)	loss 0.6627 (0.4997)	grad_norm 3.4786 (2.6116)	mem 20675MB
[2025-04-03 04:22:43 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][98/573]	eta 0:07:10 lr 0.000027	time 0.8802 (0.9067)	loss 0.5505 (0.5002)	grad_norm 2.7015 (2.6082)	mem 20675MB
[2025-04-03 04:22:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][100/573]	eta 0:07:08 lr 0.000027	time 0.8774 (0.9061)	loss 0.3785 (0.4994)	grad_norm 3.4925 (2.6135)	mem 20675MB
[2025-04-03 04:22:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][102/573]	eta 0:07:06 lr 0.000027	time 0.8781 (0.9056)	loss 0.4099 (0.4991)	grad_norm 3.7576 (2.6238)	mem 20675MB
[2025-04-03 04:22:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][104/573]	eta 0:07:04 lr 0.000027	time 0.8776 (0.9054)	loss 0.5869 (0.5002)	grad_norm 3.1829 (2.6233)	mem 20675MB
[2025-04-03 04:22:50 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][106/573]	eta 0:07:02 lr 0.000027	time 0.8772 (0.9049)	loss 0.4539 (0.5004)	grad_norm 3.2937 (2.6224)	mem 20675MB
[2025-04-03 04:22:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][108/573]	eta 0:07:00 lr 0.000027	time 0.8779 (0.9045)	loss 0.3866 (0.4997)	grad_norm 3.7871 (2.6339)	mem 20675MB
[2025-04-03 04:22:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][110/573]	eta 0:06:58 lr 0.000027	time 0.8777 (0.9040)	loss 0.5813 (0.4984)	grad_norm 2.0738 (2.6359)	mem 20675MB
[2025-04-03 04:22:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][112/573]	eta 0:06:56 lr 0.000027	time 0.8776 (0.9036)	loss 0.3799 (0.4966)	grad_norm 3.0332 (2.6422)	mem 20675MB
[2025-04-03 04:22:57 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][114/573]	eta 0:06:54 lr 0.000027	time 0.8774 (0.9032)	loss 0.5306 (0.4975)	grad_norm 2.3610 (2.6363)	mem 20675MB
[2025-04-03 04:22:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][116/573]	eta 0:06:52 lr 0.000027	time 0.8778 (0.9028)	loss 0.4222 (0.4969)	grad_norm 2.5815 (2.6308)	mem 20675MB
[2025-04-03 04:23:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][118/573]	eta 0:06:50 lr 0.000027	time 0.8793 (0.9024)	loss 0.5021 (0.4971)	grad_norm 1.9544 (2.6224)	mem 20675MB
[2025-04-03 04:23:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][120/573]	eta 0:06:48 lr 0.000027	time 0.8776 (0.9020)	loss 0.3947 (0.4957)	grad_norm 3.2951 (2.6383)	mem 20675MB
[2025-04-03 04:23:04 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][122/573]	eta 0:06:46 lr 0.000027	time 0.8776 (0.9016)	loss 0.5045 (0.4952)	grad_norm 2.9223 (2.6544)	mem 20675MB
[2025-04-03 04:23:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][124/573]	eta 0:06:44 lr 0.000027	time 0.8776 (0.9013)	loss 0.5666 (0.4951)	grad_norm 2.3531 (2.6856)	mem 20675MB
[2025-04-03 04:23:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][126/573]	eta 0:06:42 lr 0.000027	time 0.8776 (0.9009)	loss 0.6317 (0.4951)	grad_norm 2.3300 (2.6902)	mem 20675MB
[2025-04-03 04:23:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][128/573]	eta 0:06:40 lr 0.000026	time 0.8776 (0.9006)	loss 0.5448 (0.4960)	grad_norm 2.9375 (2.6897)	mem 20675MB
[2025-04-03 04:23:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][130/573]	eta 0:06:38 lr 0.000026	time 0.8778 (0.9002)	loss 0.3887 (0.4950)	grad_norm 3.9187 (2.6968)	mem 20675MB
[2025-04-03 04:23:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][132/573]	eta 0:06:36 lr 0.000026	time 0.8821 (0.9000)	loss 0.5735 (0.4961)	grad_norm 2.4100 (2.6930)	mem 20675MB
[2025-04-03 04:23:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][134/573]	eta 0:06:34 lr 0.000026	time 0.8776 (0.8997)	loss 0.4895 (0.4955)	grad_norm 2.6697 (2.6884)	mem 20675MB
[2025-04-03 04:23:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][136/573]	eta 0:06:33 lr 0.000026	time 0.8776 (0.8995)	loss 0.4997 (0.4963)	grad_norm 2.2296 (2.6847)	mem 20675MB
[2025-04-03 04:23:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][138/573]	eta 0:06:31 lr 0.000026	time 0.8775 (0.8992)	loss 0.5381 (0.4970)	grad_norm 3.1289 (2.6863)	mem 20675MB
[2025-04-03 04:23:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][140/573]	eta 0:06:29 lr 0.000026	time 0.8782 (0.8989)	loss 0.5725 (0.4972)	grad_norm 2.0061 (2.6891)	mem 20675MB
[2025-04-03 04:23:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][142/573]	eta 0:06:27 lr 0.000026	time 0.8776 (0.8986)	loss 0.4405 (0.4970)	grad_norm 2.9732 (2.6885)	mem 20675MB
[2025-04-03 04:23:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][144/573]	eta 0:06:25 lr 0.000026	time 0.8776 (0.8985)	loss 0.6038 (0.4975)	grad_norm 1.7738 (2.6912)	mem 20675MB
[2025-04-03 04:23:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][146/573]	eta 0:06:23 lr 0.000026	time 0.8837 (0.8984)	loss 0.5251 (0.4975)	grad_norm 2.8156 (2.7076)	mem 20675MB
[2025-04-03 04:23:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][148/573]	eta 0:06:21 lr 0.000026	time 0.8782 (0.8983)	loss 0.4356 (0.4973)	grad_norm 5.1788 (2.7195)	mem 20675MB
[2025-04-03 04:23:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][150/573]	eta 0:06:19 lr 0.000026	time 0.8774 (0.8981)	loss 0.5285 (0.4983)	grad_norm 2.2503 (2.7129)	mem 20675MB
[2025-04-03 04:23:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][152/573]	eta 0:06:18 lr 0.000026	time 0.8781 (0.8979)	loss 0.6008 (0.4989)	grad_norm 2.5357 (2.7122)	mem 20675MB
[2025-04-03 04:23:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][154/573]	eta 0:06:16 lr 0.000026	time 0.8801 (0.8977)	loss 0.5742 (0.4997)	grad_norm 1.7710 (2.7053)	mem 20675MB
[2025-04-03 04:23:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][156/573]	eta 0:06:14 lr 0.000026	time 0.8776 (0.8975)	loss 0.4949 (0.4994)	grad_norm 3.1095 (2.7073)	mem 20675MB
[2025-04-03 04:23:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][158/573]	eta 0:06:12 lr 0.000026	time 0.8829 (0.8973)	loss 0.4142 (0.4990)	grad_norm 2.9231 (2.7120)	mem 20675MB
[2025-04-03 04:23:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][160/573]	eta 0:06:10 lr 0.000025	time 0.8877 (0.8972)	loss 0.4201 (0.4982)	grad_norm 4.0608 (2.7159)	mem 20675MB
[2025-04-03 04:23:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][162/573]	eta 0:06:08 lr 0.000025	time 0.8872 (0.8971)	loss 0.3467 (0.4972)	grad_norm 3.1747 (2.7133)	mem 20675MB
[2025-04-03 04:23:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][164/573]	eta 0:06:06 lr 0.000025	time 0.8779 (0.8968)	loss 0.4774 (0.4972)	grad_norm 2.1914 (2.7060)	mem 20675MB
[2025-04-03 04:23:43 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][166/573]	eta 0:06:04 lr 0.000025	time 0.8785 (0.8966)	loss 0.4891 (0.4971)	grad_norm 3.0481 (2.7017)	mem 20675MB
[2025-04-03 04:23:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][168/573]	eta 0:06:03 lr 0.000025	time 0.8772 (0.8964)	loss 0.4209 (0.4967)	grad_norm 3.5737 (2.7036)	mem 20675MB
[2025-04-03 04:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][170/573]	eta 0:06:01 lr 0.000025	time 0.8777 (0.8962)	loss 0.3877 (0.4965)	grad_norm 2.5102 (2.7007)	mem 20675MB
[2025-04-03 04:23:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][172/573]	eta 0:05:59 lr 0.000025	time 0.8787 (0.8960)	loss 0.5121 (0.4961)	grad_norm 1.8108 (2.6992)	mem 20675MB
[2025-04-03 04:23:50 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][174/573]	eta 0:05:57 lr 0.000025	time 0.8778 (0.8958)	loss 0.3784 (0.4953)	grad_norm 3.1101 (2.7034)	mem 20675MB
[2025-04-03 04:23:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][176/573]	eta 0:05:55 lr 0.000025	time 0.8776 (0.8956)	loss 0.3245 (0.4950)	grad_norm 2.4612 (2.7030)	mem 20675MB
[2025-04-03 04:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][178/573]	eta 0:05:53 lr 0.000025	time 0.8775 (0.8955)	loss 0.3875 (0.4946)	grad_norm 3.2951 (2.7030)	mem 20675MB
[2025-04-03 04:23:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][180/573]	eta 0:05:51 lr 0.000025	time 0.8857 (0.8953)	loss 0.5844 (0.4948)	grad_norm 3.4904 (2.7099)	mem 20675MB
[2025-04-03 04:23:57 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][182/573]	eta 0:05:50 lr 0.000025	time 0.8772 (0.8953)	loss 0.5706 (0.4953)	grad_norm 2.3589 (2.7038)	mem 20675MB
[2025-04-03 04:23:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][184/573]	eta 0:05:48 lr 0.000025	time 0.8958 (0.8952)	loss 0.4987 (0.4957)	grad_norm 1.7474 (2.6960)	mem 20675MB
[2025-04-03 04:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][186/573]	eta 0:05:46 lr 0.000025	time 0.8775 (0.8951)	loss 0.6233 (0.4964)	grad_norm 2.1252 (2.6898)	mem 20675MB
[2025-04-03 04:24:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][188/573]	eta 0:05:44 lr 0.000025	time 0.8871 (0.8949)	loss 0.4829 (0.4963)	grad_norm 3.0554 (2.6987)	mem 20675MB
[2025-04-03 04:24:04 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][190/573]	eta 0:05:42 lr 0.000024	time 0.8813 (0.8948)	loss 0.5677 (0.4969)	grad_norm 2.4549 (2.6941)	mem 20675MB
[2025-04-03 04:24:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][192/573]	eta 0:05:40 lr 0.000024	time 0.8782 (0.8946)	loss 0.3405 (0.4953)	grad_norm 2.8269 (2.6985)	mem 20675MB
[2025-04-03 04:24:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][194/573]	eta 0:05:39 lr 0.000024	time 0.8779 (0.8945)	loss 0.5381 (0.4954)	grad_norm 2.3960 (2.6951)	mem 20675MB
[2025-04-03 04:24:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][196/573]	eta 0:05:37 lr 0.000024	time 0.8775 (0.8943)	loss 0.5854 (0.4952)	grad_norm 2.8916 (2.6969)	mem 20675MB
[2025-04-03 04:24:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][198/573]	eta 0:05:35 lr 0.000024	time 0.8778 (0.8942)	loss 0.5086 (0.4954)	grad_norm 2.8607 (2.6970)	mem 20675MB
[2025-04-03 04:24:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][200/573]	eta 0:05:33 lr 0.000024	time 0.8869 (0.8940)	loss 0.5166 (0.4959)	grad_norm 2.7929 (2.6983)	mem 20675MB
[2025-04-03 04:24:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][202/573]	eta 0:05:31 lr 0.000024	time 0.8795 (0.8939)	loss 0.5608 (0.4966)	grad_norm 1.9211 (2.6906)	mem 20675MB
[2025-04-03 04:24:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][204/573]	eta 0:05:29 lr 0.000024	time 0.8780 (0.8938)	loss 0.5407 (0.4970)	grad_norm 3.1795 (2.6925)	mem 20675MB
[2025-04-03 04:24:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][206/573]	eta 0:05:27 lr 0.000024	time 0.8774 (0.8936)	loss 0.3809 (0.4961)	grad_norm 3.4375 (2.6973)	mem 20675MB
[2025-04-03 04:24:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][208/573]	eta 0:05:26 lr 0.000024	time 0.8778 (0.8935)	loss 0.5194 (0.4959)	grad_norm 2.3584 (2.6971)	mem 20675MB
[2025-04-03 04:24:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][210/573]	eta 0:05:24 lr 0.000024	time 0.8790 (0.8934)	loss 0.3516 (0.4950)	grad_norm 2.7918 (2.6981)	mem 20675MB
[2025-04-03 04:24:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][212/573]	eta 0:05:22 lr 0.000024	time 0.8778 (0.8932)	loss 0.4256 (0.4947)	grad_norm 3.6409 (2.7000)	mem 20675MB
[2025-04-03 04:24:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][214/573]	eta 0:05:20 lr 0.000024	time 0.8777 (0.8931)	loss 0.5067 (0.4943)	grad_norm 2.3997 (2.6974)	mem 20675MB
[2025-04-03 04:24:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][216/573]	eta 0:05:18 lr 0.000024	time 0.8774 (0.8930)	loss 0.5141 (0.4946)	grad_norm 3.1714 (2.6989)	mem 20675MB
[2025-04-03 04:24:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][218/573]	eta 0:05:16 lr 0.000024	time 0.8849 (0.8929)	loss 0.5354 (0.4941)	grad_norm 2.6440 (2.7031)	mem 20675MB
[2025-04-03 04:24:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][220/573]	eta 0:05:15 lr 0.000024	time 0.8778 (0.8928)	loss 0.5415 (0.4940)	grad_norm 2.0581 (2.7055)	mem 20675MB
[2025-04-03 04:24:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][222/573]	eta 0:05:13 lr 0.000023	time 0.8774 (0.8926)	loss 0.4549 (0.4945)	grad_norm 1.5110 (2.6996)	mem 20675MB
[2025-04-03 04:24:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][224/573]	eta 0:05:11 lr 0.000023	time 0.8778 (0.8925)	loss 0.5949 (0.4950)	grad_norm 2.1534 (2.6968)	mem 20675MB
[2025-04-03 04:24:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][226/573]	eta 0:05:09 lr 0.000023	time 0.8788 (0.8924)	loss 0.4585 (0.4948)	grad_norm 2.6633 (2.6926)	mem 20675MB
[2025-04-03 04:24:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][228/573]	eta 0:05:07 lr 0.000023	time 0.8791 (0.8923)	loss 0.3825 (0.4937)	grad_norm 3.2568 (2.7037)	mem 20675MB
[2025-04-03 04:24:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][230/573]	eta 0:05:06 lr 0.000023	time 0.8776 (0.8922)	loss 0.6093 (0.4944)	grad_norm 2.7346 (2.7011)	mem 20675MB
[2025-04-03 04:24:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][232/573]	eta 0:05:04 lr 0.000023	time 0.8773 (0.8921)	loss 0.4700 (0.4947)	grad_norm 3.4139 (2.7044)	mem 20675MB
[2025-04-03 04:24:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][234/573]	eta 0:05:02 lr 0.000023	time 0.8773 (0.8920)	loss 0.5417 (0.4946)	grad_norm 1.9690 (2.7017)	mem 20675MB
[2025-04-03 04:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][236/573]	eta 0:05:00 lr 0.000023	time 0.8777 (0.8918)	loss 0.3998 (0.4946)	grad_norm 2.8734 (2.6990)	mem 20675MB
[2025-04-03 04:24:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][238/573]	eta 0:04:58 lr 0.000023	time 0.8779 (0.8917)	loss 0.4970 (0.4947)	grad_norm 1.4124 (2.6963)	mem 20675MB
[2025-04-03 04:24:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][240/573]	eta 0:04:56 lr 0.000023	time 0.8778 (0.8916)	loss 0.5381 (0.4950)	grad_norm 2.3307 (2.6926)	mem 20675MB
[2025-04-03 04:24:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][242/573]	eta 0:04:55 lr 0.000023	time 0.8776 (0.8915)	loss 0.5192 (0.4951)	grad_norm 1.6632 (2.6891)	mem 20675MB
[2025-04-03 04:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][244/573]	eta 0:04:53 lr 0.000023	time 0.8774 (0.8915)	loss 0.5409 (0.4950)	grad_norm 2.2573 (2.6894)	mem 20675MB
[2025-04-03 04:24:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][246/573]	eta 0:04:51 lr 0.000023	time 0.8779 (0.8914)	loss 0.5773 (0.4954)	grad_norm 2.2663 (2.6847)	mem 20675MB
[2025-04-03 04:24:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][248/573]	eta 0:04:49 lr 0.000023	time 0.8854 (0.8913)	loss 0.5570 (0.4953)	grad_norm 1.8912 (2.6837)	mem 20675MB
[2025-04-03 04:24:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][250/573]	eta 0:04:47 lr 0.000023	time 0.8777 (0.8912)	loss 0.3557 (0.4945)	grad_norm 3.8238 (2.6884)	mem 20675MB
[2025-04-03 04:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][252/573]	eta 0:04:46 lr 0.000023	time 0.8776 (0.8911)	loss 0.5404 (0.4947)	grad_norm 2.0656 (2.6890)	mem 20675MB
[2025-04-03 04:25:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][254/573]	eta 0:04:44 lr 0.000023	time 0.8775 (0.8910)	loss 0.4266 (0.4939)	grad_norm 3.8055 (2.6969)	mem 20675MB
[2025-04-03 04:25:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][256/573]	eta 0:04:42 lr 0.000022	time 0.8777 (0.8909)	loss 0.5386 (0.4944)	grad_norm 2.8338 (2.6958)	mem 20675MB
[2025-04-03 04:25:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][258/573]	eta 0:04:40 lr 0.000022	time 0.8773 (0.8909)	loss 0.4825 (0.4944)	grad_norm 2.6125 (2.6996)	mem 20675MB
[2025-04-03 04:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][260/573]	eta 0:04:38 lr 0.000022	time 0.8778 (0.8908)	loss 0.6207 (0.4946)	grad_norm 2.3983 (2.6990)	mem 20675MB
[2025-04-03 04:25:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][262/573]	eta 0:04:37 lr 0.000022	time 0.8772 (0.8907)	loss 0.4804 (0.4948)	grad_norm 2.0832 (2.6948)	mem 20675MB
[2025-04-03 04:25:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][264/573]	eta 0:04:35 lr 0.000022	time 0.8787 (0.8906)	loss 0.5889 (0.4954)	grad_norm 2.8410 (2.6958)	mem 20675MB
[2025-04-03 04:25:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][266/573]	eta 0:04:33 lr 0.000022	time 0.8900 (0.8906)	loss 0.4009 (0.4944)	grad_norm 3.0780 (2.6977)	mem 20675MB
[2025-04-03 04:25:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][268/573]	eta 0:04:31 lr 0.000022	time 0.8773 (0.8905)	loss 0.5577 (0.4946)	grad_norm 1.7345 (2.6931)	mem 20675MB
[2025-04-03 04:25:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][270/573]	eta 0:04:29 lr 0.000022	time 0.8781 (0.8905)	loss 0.5613 (0.4944)	grad_norm 2.1459 (2.6936)	mem 20675MB
[2025-04-03 04:25:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][272/573]	eta 0:04:28 lr 0.000022	time 0.8777 (0.8904)	loss 0.5312 (0.4945)	grad_norm 2.0681 (2.6888)	mem 20675MB
[2025-04-03 04:25:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][274/573]	eta 0:04:26 lr 0.000022	time 0.8777 (0.8904)	loss 0.3635 (0.4943)	grad_norm 3.5783 (2.6902)	mem 20675MB
[2025-04-03 04:25:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][276/573]	eta 0:04:24 lr 0.000022	time 0.8774 (0.8903)	loss 0.4119 (0.4941)	grad_norm 2.3262 (2.6880)	mem 20675MB
[2025-04-03 04:25:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][278/573]	eta 0:04:22 lr 0.000022	time 0.8771 (0.8902)	loss 0.5033 (0.4947)	grad_norm 2.5120 (2.6864)	mem 20675MB
[2025-04-03 04:25:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][280/573]	eta 0:04:20 lr 0.000022	time 0.8797 (0.8901)	loss 0.4039 (0.4944)	grad_norm 2.1856 (2.6847)	mem 20675MB
[2025-04-03 04:25:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][282/573]	eta 0:04:18 lr 0.000022	time 0.8771 (0.8900)	loss 0.3787 (0.4941)	grad_norm 3.9641 (2.6884)	mem 20675MB
[2025-04-03 04:25:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][284/573]	eta 0:04:17 lr 0.000022	time 0.8775 (0.8900)	loss 0.4778 (0.4942)	grad_norm 2.2687 (2.6846)	mem 20675MB
[2025-04-03 04:25:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][286/573]	eta 0:04:15 lr 0.000022	time 0.8970 (0.8900)	loss 0.5109 (0.4943)	grad_norm 2.2659 (2.6857)	mem 20675MB
[2025-04-03 04:25:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][288/573]	eta 0:04:13 lr 0.000021	time 0.8778 (0.8899)	loss 0.5062 (0.4947)	grad_norm 2.6614 (2.6873)	mem 20675MB
[2025-04-03 04:25:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][290/573]	eta 0:04:11 lr 0.000021	time 0.8792 (0.8898)	loss 0.3442 (0.4943)	grad_norm 2.8733 (2.6864)	mem 20675MB
[2025-04-03 04:25:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][292/573]	eta 0:04:10 lr 0.000021	time 0.8771 (0.8897)	loss 0.4224 (0.4939)	grad_norm 2.4926 (2.6841)	mem 20675MB
[2025-04-03 04:25:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][294/573]	eta 0:04:08 lr 0.000021	time 0.8817 (0.8897)	loss 0.3224 (0.4928)	grad_norm 2.1141 (2.6843)	mem 20675MB
[2025-04-03 04:25:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][296/573]	eta 0:04:06 lr 0.000021	time 0.8820 (0.8896)	loss 0.5945 (0.4932)	grad_norm 2.6270 (2.6828)	mem 20675MB
[2025-04-03 04:25:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][298/573]	eta 0:04:04 lr 0.000021	time 0.8781 (0.8896)	loss 0.5771 (0.4936)	grad_norm 2.3674 (2.6807)	mem 20675MB
[2025-04-03 04:25:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][300/573]	eta 0:04:02 lr 0.000021	time 0.8785 (0.8895)	loss 0.3963 (0.4936)	grad_norm 3.4089 (2.6810)	mem 20675MB
[2025-04-03 04:25:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][302/573]	eta 0:04:01 lr 0.000021	time 0.8825 (0.8894)	loss 0.5754 (0.4937)	grad_norm 2.0578 (2.6788)	mem 20675MB
[2025-04-03 04:25:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][304/573]	eta 0:03:59 lr 0.000021	time 0.8776 (0.8895)	loss 0.4002 (0.4938)	grad_norm 2.9506 (2.6794)	mem 20675MB
[2025-04-03 04:25:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][306/573]	eta 0:03:57 lr 0.000021	time 0.8780 (0.8894)	loss 0.5524 (0.4938)	grad_norm 2.6175 (2.6768)	mem 20675MB
[2025-04-03 04:25:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][308/573]	eta 0:03:55 lr 0.000021	time 0.8778 (0.8894)	loss 0.3601 (0.4932)	grad_norm 3.6342 (2.6827)	mem 20675MB
[2025-04-03 04:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][310/573]	eta 0:03:53 lr 0.000021	time 0.8777 (0.8893)	loss 0.3728 (0.4929)	grad_norm 3.0966 (2.6821)	mem 20675MB
[2025-04-03 04:25:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][312/573]	eta 0:03:52 lr 0.000021	time 0.8958 (0.8893)	loss 0.4369 (0.4924)	grad_norm 3.0855 (2.6851)	mem 20675MB
[2025-04-03 04:25:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][314/573]	eta 0:03:50 lr 0.000021	time 0.8791 (0.8893)	loss 0.5794 (0.4928)	grad_norm 2.0083 (2.6818)	mem 20675MB
[2025-04-03 04:25:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][316/573]	eta 0:03:48 lr 0.000021	time 0.8793 (0.8893)	loss 0.3188 (0.4926)	grad_norm 3.3569 (2.6853)	mem 20675MB
[2025-04-03 04:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][318/573]	eta 0:03:46 lr 0.000021	time 0.8822 (0.8893)	loss 0.5748 (0.4932)	grad_norm 1.9592 (2.6824)	mem 20675MB
[2025-04-03 04:25:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][320/573]	eta 0:03:44 lr 0.000021	time 0.8808 (0.8892)	loss 0.5807 (0.4929)	grad_norm 4.8746 (2.6924)	mem 20675MB
[2025-04-03 04:26:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][322/573]	eta 0:03:43 lr 0.000021	time 0.8779 (0.8892)	loss 0.5937 (0.4936)	grad_norm 2.4073 (2.6900)	mem 20675MB
[2025-04-03 04:26:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][324/573]	eta 0:03:41 lr 0.000020	time 0.8773 (0.8891)	loss 0.5355 (0.4934)	grad_norm 3.0895 (2.6959)	mem 20675MB
[2025-04-03 04:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][326/573]	eta 0:03:39 lr 0.000020	time 0.8776 (0.8890)	loss 0.5207 (0.4937)	grad_norm 3.0177 (2.6939)	mem 20675MB
[2025-04-03 04:26:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][328/573]	eta 0:03:37 lr 0.000020	time 0.8785 (0.8890)	loss 0.5339 (0.4937)	grad_norm 2.7528 (2.6963)	mem 20675MB
[2025-04-03 04:26:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][330/573]	eta 0:03:36 lr 0.000020	time 0.8773 (0.8889)	loss 0.5332 (0.4939)	grad_norm 1.6601 (2.6917)	mem 20675MB
[2025-04-03 04:26:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][332/573]	eta 0:03:34 lr 0.000020	time 0.8774 (0.8889)	loss 0.5094 (0.4939)	grad_norm 2.7583 (2.6917)	mem 20675MB
[2025-04-03 04:26:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][334/573]	eta 0:03:32 lr 0.000020	time 0.8775 (0.8888)	loss 0.5241 (0.4935)	grad_norm 1.9440 (2.6921)	mem 20675MB
[2025-04-03 04:26:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][336/573]	eta 0:03:30 lr 0.000020	time 0.8776 (0.8888)	loss 0.5989 (0.4936)	grad_norm 2.7406 (2.6939)	mem 20675MB
[2025-04-03 04:26:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][338/573]	eta 0:03:28 lr 0.000020	time 0.8830 (0.8887)	loss 0.4517 (0.4935)	grad_norm 2.1875 (2.6905)	mem 20675MB
[2025-04-03 04:26:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][340/573]	eta 0:03:27 lr 0.000020	time 0.8776 (0.8887)	loss 0.4157 (0.4932)	grad_norm 3.8030 (2.6954)	mem 20675MB
[2025-04-03 04:26:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][342/573]	eta 0:03:25 lr 0.000020	time 0.8775 (0.8886)	loss 0.5957 (0.4933)	grad_norm 3.1716 (2.6976)	mem 20675MB
[2025-04-03 04:26:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][344/573]	eta 0:03:23 lr 0.000020	time 0.8788 (0.8886)	loss 0.4950 (0.4936)	grad_norm 2.4383 (2.6956)	mem 20675MB
[2025-04-03 04:26:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][346/573]	eta 0:03:21 lr 0.000020	time 0.8782 (0.8886)	loss 0.6011 (0.4940)	grad_norm 2.3886 (2.6944)	mem 20675MB
[2025-04-03 04:26:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][348/573]	eta 0:03:19 lr 0.000020	time 0.8777 (0.8885)	loss 0.4988 (0.4943)	grad_norm 2.4203 (2.6914)	mem 20675MB
[2025-04-03 04:26:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][350/573]	eta 0:03:18 lr 0.000020	time 0.8870 (0.8885)	loss 0.4448 (0.4939)	grad_norm 3.1195 (2.6949)	mem 20675MB
[2025-04-03 04:26:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][352/573]	eta 0:03:16 lr 0.000020	time 0.8778 (0.8884)	loss 0.4535 (0.4941)	grad_norm 2.5411 (2.6938)	mem 20675MB
[2025-04-03 04:26:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][354/573]	eta 0:03:14 lr 0.000020	time 0.8782 (0.8884)	loss 0.5383 (0.4944)	grad_norm 2.6045 (2.6948)	mem 20675MB
[2025-04-03 04:26:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][356/573]	eta 0:03:12 lr 0.000020	time 0.8838 (0.8883)	loss 0.6166 (0.4944)	grad_norm 3.0469 (2.7036)	mem 20675MB
[2025-04-03 04:26:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][358/573]	eta 0:03:10 lr 0.000019	time 0.8778 (0.8883)	loss 0.5620 (0.4944)	grad_norm 2.1135 (2.7020)	mem 20675MB
[2025-04-03 04:26:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][360/573]	eta 0:03:09 lr 0.000019	time 0.8776 (0.8883)	loss 0.5538 (0.4945)	grad_norm 2.5649 (2.7002)	mem 20675MB
[2025-04-03 04:26:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][362/573]	eta 0:03:07 lr 0.000019	time 0.8871 (0.8882)	loss 0.4367 (0.4945)	grad_norm 3.5803 (2.7021)	mem 20675MB
[2025-04-03 04:26:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][364/573]	eta 0:03:05 lr 0.000019	time 0.8934 (0.8884)	loss 0.5957 (0.4946)	grad_norm 2.6263 (2.7017)	mem 20675MB
[2025-04-03 04:26:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][366/573]	eta 0:03:03 lr 0.000019	time 0.8825 (0.8884)	loss 0.5335 (0.4945)	grad_norm 2.7580 (2.7005)	mem 20675MB
[2025-04-03 04:26:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][368/573]	eta 0:03:02 lr 0.000019	time 0.8778 (0.8883)	loss 0.4899 (0.4947)	grad_norm 2.5101 (2.7010)	mem 20675MB
[2025-04-03 04:26:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][370/573]	eta 0:03:00 lr 0.000019	time 0.8791 (0.8883)	loss 0.5356 (0.4948)	grad_norm 1.6654 (2.6963)	mem 20675MB
[2025-04-03 04:26:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][372/573]	eta 0:02:58 lr 0.000019	time 0.8774 (0.8883)	loss 0.5718 (0.4950)	grad_norm 2.6277 (2.6985)	mem 20675MB
[2025-04-03 04:26:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][374/573]	eta 0:02:56 lr 0.000019	time 0.8786 (0.8882)	loss 0.5861 (0.4949)	grad_norm 2.2808 (2.6977)	mem 20675MB
[2025-04-03 04:26:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][376/573]	eta 0:02:54 lr 0.000019	time 0.8808 (0.8882)	loss 0.4163 (0.4947)	grad_norm 2.5280 (2.6958)	mem 20675MB
[2025-04-03 04:26:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][378/573]	eta 0:02:53 lr 0.000019	time 0.8779 (0.8881)	loss 0.3172 (0.4945)	grad_norm 2.7742 (2.6972)	mem 20675MB
[2025-04-03 04:26:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][380/573]	eta 0:02:51 lr 0.000019	time 0.8926 (0.8881)	loss 0.6270 (0.4945)	grad_norm 2.0327 (2.6963)	mem 20675MB
[2025-04-03 04:26:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][382/573]	eta 0:02:49 lr 0.000019	time 0.8802 (0.8881)	loss 0.6433 (0.4946)	grad_norm 2.7241 (2.6971)	mem 20675MB
[2025-04-03 04:26:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][384/573]	eta 0:02:47 lr 0.000019	time 0.8774 (0.8880)	loss 0.5201 (0.4948)	grad_norm 2.3715 (2.6937)	mem 20675MB
[2025-04-03 04:26:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][386/573]	eta 0:02:46 lr 0.000019	time 0.8777 (0.8880)	loss 0.5674 (0.4949)	grad_norm 2.1522 (2.6906)	mem 20675MB
[2025-04-03 04:26:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][388/573]	eta 0:02:44 lr 0.000019	time 0.8778 (0.8879)	loss 0.5231 (0.4950)	grad_norm 1.7998 (2.6857)	mem 20675MB
[2025-04-03 04:27:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][390/573]	eta 0:02:42 lr 0.000019	time 0.8816 (0.8879)	loss 0.5290 (0.4951)	grad_norm 2.6046 (2.6828)	mem 20675MB
[2025-04-03 04:27:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][392/573]	eta 0:02:40 lr 0.000019	time 0.8796 (0.8878)	loss 0.3853 (0.4949)	grad_norm 2.5605 (2.6833)	mem 20675MB
[2025-04-03 04:27:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][394/573]	eta 0:02:38 lr 0.000018	time 0.8776 (0.8878)	loss 0.5515 (0.4950)	grad_norm 2.5538 (2.6823)	mem 20675MB
[2025-04-03 04:27:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][396/573]	eta 0:02:37 lr 0.000018	time 0.8916 (0.8878)	loss 0.5284 (0.4952)	grad_norm 5.5355 (2.6880)	mem 20675MB
[2025-04-03 04:27:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][398/573]	eta 0:02:35 lr 0.000018	time 0.8774 (0.8878)	loss 0.4912 (0.4952)	grad_norm 2.4202 (2.6870)	mem 20675MB
[2025-04-03 04:27:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][400/573]	eta 0:02:33 lr 0.000018	time 0.8817 (0.8878)	loss 0.5079 (0.4952)	grad_norm 2.6299 (2.6865)	mem 20675MB
[2025-04-03 04:27:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][402/573]	eta 0:02:31 lr 0.000018	time 0.8776 (0.8877)	loss 0.6003 (0.4952)	grad_norm 2.7417 (2.6882)	mem 20675MB
[2025-04-03 04:27:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][404/573]	eta 0:02:30 lr 0.000018	time 0.8774 (0.8877)	loss 0.5701 (0.4955)	grad_norm 1.9619 (2.6862)	mem 20675MB
[2025-04-03 04:27:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][406/573]	eta 0:02:28 lr 0.000018	time 0.8775 (0.8876)	loss 0.5341 (0.4957)	grad_norm 3.4121 (2.6865)	mem 20675MB
[2025-04-03 04:27:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][408/573]	eta 0:02:26 lr 0.000018	time 0.8929 (0.8876)	loss 0.5204 (0.4958)	grad_norm 2.5622 (2.6853)	mem 20675MB
[2025-04-03 04:27:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][410/573]	eta 0:02:24 lr 0.000018	time 0.9008 (0.8877)	loss 0.5584 (0.4962)	grad_norm 1.6483 (2.6833)	mem 20675MB
[2025-04-03 04:27:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][412/573]	eta 0:02:22 lr 0.000018	time 0.8779 (0.8876)	loss 0.3715 (0.4960)	grad_norm 2.5514 (2.6842)	mem 20675MB
[2025-04-03 04:27:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][414/573]	eta 0:02:21 lr 0.000018	time 0.8774 (0.8876)	loss 0.5333 (0.4961)	grad_norm 2.5217 (2.6828)	mem 20675MB
[2025-04-03 04:27:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][416/573]	eta 0:02:19 lr 0.000018	time 0.8785 (0.8875)	loss 0.4901 (0.4959)	grad_norm 1.9917 (2.6817)	mem 20675MB
[2025-04-03 04:27:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][418/573]	eta 0:02:17 lr 0.000018	time 0.8776 (0.8875)	loss 0.5053 (0.4960)	grad_norm 1.9482 (2.6793)	mem 20675MB
[2025-04-03 04:27:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][420/573]	eta 0:02:15 lr 0.000018	time 0.8801 (0.8875)	loss 0.5993 (0.4963)	grad_norm 2.0161 (2.6766)	mem 20675MB
[2025-04-03 04:27:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][422/573]	eta 0:02:13 lr 0.000018	time 0.8802 (0.8874)	loss 0.5515 (0.4964)	grad_norm 2.2944 (2.6752)	mem 20675MB
[2025-04-03 04:27:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][424/573]	eta 0:02:12 lr 0.000018	time 0.8777 (0.8874)	loss 0.3604 (0.4958)	grad_norm 3.5905 (2.6793)	mem 20675MB
[2025-04-03 04:27:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][426/573]	eta 0:02:10 lr 0.000018	time 0.8848 (0.8874)	loss 0.5878 (0.4960)	grad_norm 2.5293 (2.6830)	mem 20675MB
[2025-04-03 04:27:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][428/573]	eta 0:02:08 lr 0.000018	time 0.8932 (0.8873)	loss 0.5489 (0.4963)	grad_norm 2.8650 (2.6835)	mem 20675MB
[2025-04-03 04:27:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][430/573]	eta 0:02:06 lr 0.000018	time 0.8802 (0.8873)	loss 0.4492 (0.4960)	grad_norm 2.5405 (2.6842)	mem 20675MB
[2025-04-03 04:27:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][432/573]	eta 0:02:05 lr 0.000017	time 0.8776 (0.8873)	loss 0.5438 (0.4958)	grad_norm 1.9262 (2.6820)	mem 20675MB
[2025-04-03 04:27:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][434/573]	eta 0:02:03 lr 0.000017	time 0.8776 (0.8872)	loss 0.5649 (0.4961)	grad_norm 1.9443 (2.6829)	mem 20675MB
[2025-04-03 04:27:40 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][436/573]	eta 0:02:01 lr 0.000017	time 0.8817 (0.8872)	loss 0.4193 (0.4960)	grad_norm 3.2926 (2.6829)	mem 20675MB
[2025-04-03 04:27:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][438/573]	eta 0:01:59 lr 0.000017	time 0.8772 (0.8872)	loss 0.5510 (0.4960)	grad_norm 2.6939 (2.6835)	mem 20675MB
[2025-04-03 04:27:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][440/573]	eta 0:01:57 lr 0.000017	time 0.8776 (0.8872)	loss 0.3723 (0.4954)	grad_norm 3.5335 (2.6867)	mem 20675MB
[2025-04-03 04:27:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][442/573]	eta 0:01:56 lr 0.000017	time 0.8780 (0.8871)	loss 0.6074 (0.4958)	grad_norm 2.4767 (2.6854)	mem 20675MB
[2025-04-03 04:27:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][444/573]	eta 0:01:54 lr 0.000017	time 0.8778 (0.8871)	loss 0.4008 (0.4953)	grad_norm 4.4981 (2.6911)	mem 20675MB
[2025-04-03 04:27:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][446/573]	eta 0:01:52 lr 0.000017	time 0.8774 (0.8870)	loss 0.4358 (0.4954)	grad_norm 3.3020 (2.6923)	mem 20675MB
[2025-04-03 04:27:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][448/573]	eta 0:01:50 lr 0.000017	time 0.8778 (0.8870)	loss 0.5314 (0.4954)	grad_norm 2.7147 (2.6943)	mem 20675MB
[2025-04-03 04:27:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][450/573]	eta 0:01:49 lr 0.000017	time 0.8773 (0.8870)	loss 0.5346 (0.4951)	grad_norm 2.2954 (2.6961)	mem 20675MB
[2025-04-03 04:27:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][452/573]	eta 0:01:47 lr 0.000017	time 0.8818 (0.8869)	loss 0.4319 (0.4952)	grad_norm 7.0157 (2.7044)	mem 20675MB
[2025-04-03 04:27:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][454/573]	eta 0:01:45 lr 0.000017	time 0.8779 (0.8869)	loss 0.3137 (0.4944)	grad_norm 3.5117 (2.7068)	mem 20675MB
[2025-04-03 04:27:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][456/573]	eta 0:01:43 lr 0.000017	time 0.8774 (0.8869)	loss 0.3497 (0.4939)	grad_norm 3.4941 (2.7082)	mem 20675MB
[2025-04-03 04:28:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][458/573]	eta 0:01:41 lr 0.000017	time 0.8775 (0.8869)	loss 0.5171 (0.4939)	grad_norm 1.9231 (2.7078)	mem 20675MB
[2025-04-03 04:28:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][460/573]	eta 0:01:40 lr 0.000017	time 0.8777 (0.8869)	loss 0.4779 (0.4938)	grad_norm 2.7899 (2.7083)	mem 20675MB
[2025-04-03 04:28:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][462/573]	eta 0:01:38 lr 0.000017	time 0.8789 (0.8868)	loss 0.3403 (0.4933)	grad_norm 2.9341 (2.7098)	mem 20675MB
[2025-04-03 04:28:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][464/573]	eta 0:01:36 lr 0.000017	time 0.8777 (0.8868)	loss 0.5139 (0.4933)	grad_norm 2.8945 (2.7103)	mem 20675MB
[2025-04-03 04:28:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][466/573]	eta 0:01:34 lr 0.000017	time 0.8805 (0.8868)	loss 0.3614 (0.4931)	grad_norm 2.7976 (2.7106)	mem 20675MB
[2025-04-03 04:28:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][468/573]	eta 0:01:33 lr 0.000017	time 0.8772 (0.8868)	loss 0.4943 (0.4931)	grad_norm 2.7442 (2.7087)	mem 20675MB
[2025-04-03 04:28:10 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][470/573]	eta 0:01:31 lr 0.000016	time 0.8855 (0.8868)	loss 0.5050 (0.4933)	grad_norm 2.8469 (2.7086)	mem 20675MB
[2025-04-03 04:28:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][472/573]	eta 0:01:29 lr 0.000016	time 0.8773 (0.8867)	loss 0.4493 (0.4932)	grad_norm 2.4690 (2.7069)	mem 20675MB
[2025-04-03 04:28:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][474/573]	eta 0:01:27 lr 0.000016	time 0.8904 (0.8867)	loss 0.3189 (0.4928)	grad_norm 3.3936 (2.7082)	mem 20675MB
[2025-04-03 04:28:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][476/573]	eta 0:01:26 lr 0.000016	time 0.8774 (0.8867)	loss 0.4677 (0.4927)	grad_norm 1.8743 (2.7064)	mem 20675MB
[2025-04-03 04:28:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][478/573]	eta 0:01:24 lr 0.000016	time 0.8785 (0.8867)	loss 0.4266 (0.4926)	grad_norm 2.5337 (2.7050)	mem 20675MB
[2025-04-03 04:28:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][480/573]	eta 0:01:22 lr 0.000016	time 0.8776 (0.8867)	loss 0.5165 (0.4924)	grad_norm 2.0990 (2.7110)	mem 20675MB
[2025-04-03 04:28:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][482/573]	eta 0:01:20 lr 0.000016	time 0.8783 (0.8866)	loss 0.5745 (0.4925)	grad_norm 3.2356 (2.7129)	mem 20675MB
[2025-04-03 04:28:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][484/573]	eta 0:01:18 lr 0.000016	time 0.8798 (0.8866)	loss 0.5724 (0.4925)	grad_norm 1.9510 (2.7103)	mem 20675MB
[2025-04-03 04:28:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][486/573]	eta 0:01:17 lr 0.000016	time 0.8774 (0.8866)	loss 0.5115 (0.4923)	grad_norm 2.8154 (2.7129)	mem 20675MB
[2025-04-03 04:28:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][488/573]	eta 0:01:15 lr 0.000016	time 0.8774 (0.8865)	loss 0.5432 (0.4921)	grad_norm 2.0965 (2.7123)	mem 20675MB
[2025-04-03 04:28:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][490/573]	eta 0:01:13 lr 0.000016	time 0.8780 (0.8865)	loss 0.5329 (0.4920)	grad_norm 3.2620 (2.7160)	mem 20675MB
[2025-04-03 04:28:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][492/573]	eta 0:01:11 lr 0.000016	time 0.8773 (0.8865)	loss 0.5520 (0.4924)	grad_norm 3.8490 (2.7211)	mem 20675MB
[2025-04-03 04:28:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][494/573]	eta 0:01:10 lr 0.000016	time 0.8777 (0.8864)	loss 0.5295 (0.4926)	grad_norm 3.1255 (2.7211)	mem 20675MB
[2025-04-03 04:28:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][496/573]	eta 0:01:08 lr 0.000016	time 0.8780 (0.8864)	loss 0.4605 (0.4924)	grad_norm 5.9688 (2.7285)	mem 20675MB
[2025-04-03 04:28:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][498/573]	eta 0:01:06 lr 0.000016	time 0.8780 (0.8864)	loss 0.4374 (0.4924)	grad_norm 15.3144 (2.7532)	mem 20675MB
[2025-04-03 04:28:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][500/573]	eta 0:01:04 lr 0.000016	time 0.8801 (0.8864)	loss 0.3634 (0.4922)	grad_norm 4.0063 (2.7543)	mem 20675MB
[2025-04-03 04:28:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][502/573]	eta 0:01:02 lr 0.000016	time 0.8829 (0.8864)	loss 0.5085 (0.4923)	grad_norm 2.9933 (2.7528)	mem 20675MB
[2025-04-03 04:28:40 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][504/573]	eta 0:01:01 lr 0.000016	time 0.8776 (0.8864)	loss 0.5635 (0.4925)	grad_norm 2.3042 (2.7507)	mem 20675MB
[2025-04-03 04:28:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][506/573]	eta 0:00:59 lr 0.000016	time 0.8779 (0.8863)	loss 0.5350 (0.4924)	grad_norm 3.2306 (2.7532)	mem 20675MB
[2025-04-03 04:28:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][508/573]	eta 0:00:57 lr 0.000015	time 0.8839 (0.8863)	loss 0.3629 (0.4919)	grad_norm 3.9059 (2.7568)	mem 20675MB
[2025-04-03 04:28:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][510/573]	eta 0:00:55 lr 0.000015	time 0.8778 (0.8863)	loss 0.3723 (0.4919)	grad_norm 3.3905 (2.7581)	mem 20675MB
[2025-04-03 04:28:47 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][512/573]	eta 0:00:54 lr 0.000015	time 0.8774 (0.8863)	loss 0.4186 (0.4915)	grad_norm 2.5826 (2.7630)	mem 20675MB
[2025-04-03 04:28:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][514/573]	eta 0:00:52 lr 0.000015	time 0.8792 (0.8862)	loss 0.4667 (0.4912)	grad_norm 3.3638 (2.7640)	mem 20675MB
[2025-04-03 04:28:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][516/573]	eta 0:00:50 lr 0.000015	time 0.8775 (0.8862)	loss 0.5208 (0.4910)	grad_norm 1.8981 (2.7687)	mem 20675MB
[2025-04-03 04:28:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][518/573]	eta 0:00:48 lr 0.000015	time 0.8780 (0.8862)	loss 0.4554 (0.4911)	grad_norm 2.9821 (2.7692)	mem 20675MB
[2025-04-03 04:28:54 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][520/573]	eta 0:00:46 lr 0.000015	time 0.8773 (0.8862)	loss 0.4301 (0.4908)	grad_norm 1.9709 (2.7677)	mem 20675MB
[2025-04-03 04:28:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][522/573]	eta 0:00:45 lr 0.000015	time 0.8787 (0.8861)	loss 0.6186 (0.4910)	grad_norm 1.9740 (2.7648)	mem 20675MB
[2025-04-03 04:28:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][524/573]	eta 0:00:43 lr 0.000015	time 0.8781 (0.8861)	loss 0.5272 (0.4912)	grad_norm 1.8771 (2.7616)	mem 20675MB
[2025-04-03 04:29:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][526/573]	eta 0:00:41 lr 0.000015	time 0.8775 (0.8861)	loss 0.3510 (0.4911)	grad_norm 4.9678 (2.7656)	mem 20675MB
[2025-04-03 04:29:01 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][528/573]	eta 0:00:39 lr 0.000015	time 0.8773 (0.8861)	loss 0.4471 (0.4911)	grad_norm 3.8952 (2.7684)	mem 20675MB
[2025-04-03 04:29:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][530/573]	eta 0:00:38 lr 0.000015	time 0.8777 (0.8860)	loss 0.4684 (0.4912)	grad_norm 3.2733 (2.7689)	mem 20675MB
[2025-04-03 04:29:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][532/573]	eta 0:00:36 lr 0.000015	time 0.8781 (0.8860)	loss 0.5989 (0.4915)	grad_norm 2.7340 (2.7690)	mem 20675MB
[2025-04-03 04:29:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][534/573]	eta 0:00:34 lr 0.000015	time 0.8919 (0.8860)	loss 0.4396 (0.4915)	grad_norm 4.2834 (2.7717)	mem 20675MB
[2025-04-03 04:29:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][536/573]	eta 0:00:32 lr 0.000015	time 0.8775 (0.8860)	loss 0.4699 (0.4916)	grad_norm 2.9322 (2.7711)	mem 20675MB
[2025-04-03 04:29:10 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][538/573]	eta 0:00:31 lr 0.000015	time 0.8784 (0.8860)	loss 0.4012 (0.4914)	grad_norm 2.6560 (2.7718)	mem 20675MB
[2025-04-03 04:29:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][540/573]	eta 0:00:29 lr 0.000015	time 0.8779 (0.8860)	loss 0.3405 (0.4909)	grad_norm 3.6440 (2.7757)	mem 20675MB
[2025-04-03 04:29:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][542/573]	eta 0:00:27 lr 0.000015	time 0.8782 (0.8859)	loss 0.4095 (0.4907)	grad_norm 3.9043 (2.7772)	mem 20675MB
[2025-04-03 04:29:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][544/573]	eta 0:00:25 lr 0.000015	time 0.8779 (0.8859)	loss 0.6317 (0.4906)	grad_norm 2.6677 (2.7798)	mem 20675MB
[2025-04-03 04:29:17 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][546/573]	eta 0:00:23 lr 0.000015	time 0.8774 (0.8859)	loss 0.5443 (0.4909)	grad_norm 2.6673 (2.7792)	mem 20675MB
[2025-04-03 04:29:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][548/573]	eta 0:00:22 lr 0.000015	time 0.8779 (0.8859)	loss 0.5438 (0.4910)	grad_norm 2.4525 (2.7789)	mem 20675MB
[2025-04-03 04:29:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][550/573]	eta 0:00:20 lr 0.000014	time 0.8781 (0.8858)	loss 0.6614 (0.4914)	grad_norm 2.9224 (2.7787)	mem 20675MB
[2025-04-03 04:29:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][552/573]	eta 0:00:18 lr 0.000014	time 0.8772 (0.8858)	loss 0.5769 (0.4913)	grad_norm 2.8480 (2.7799)	mem 20675MB
[2025-04-03 04:29:24 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][554/573]	eta 0:00:16 lr 0.000014	time 0.8776 (0.8858)	loss 0.3740 (0.4910)	grad_norm 3.6939 (2.7810)	mem 20675MB
[2025-04-03 04:29:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][556/573]	eta 0:00:15 lr 0.000014	time 0.8819 (0.8858)	loss 0.4212 (0.4911)	grad_norm 2.7544 (2.7799)	mem 20675MB
[2025-04-03 04:29:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][558/573]	eta 0:00:13 lr 0.000014	time 0.8773 (0.8857)	loss 0.6183 (0.4911)	grad_norm 3.3136 (2.7816)	mem 20675MB
[2025-04-03 04:29:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][560/573]	eta 0:00:11 lr 0.000014	time 0.8777 (0.8857)	loss 0.4450 (0.4910)	grad_norm 2.0869 (2.7792)	mem 20675MB
[2025-04-03 04:29:31 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][562/573]	eta 0:00:09 lr 0.000014	time 0.8773 (0.8857)	loss 0.5176 (0.4911)	grad_norm 3.1393 (2.7796)	mem 20675MB
[2025-04-03 04:29:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][564/573]	eta 0:00:07 lr 0.000014	time 0.8772 (0.8857)	loss 0.5231 (0.4911)	grad_norm 2.3857 (2.7785)	mem 20675MB
[2025-04-03 04:29:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][566/573]	eta 0:00:06 lr 0.000014	time 0.8772 (0.8856)	loss 0.3814 (0.4910)	grad_norm 3.7389 (2.7798)	mem 20675MB
[2025-04-03 04:29:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][568/573]	eta 0:00:04 lr 0.000014	time 0.8775 (0.8856)	loss 0.5762 (0.4911)	grad_norm 2.2234 (2.7776)	mem 20675MB
[2025-04-03 04:29:38 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][570/573]	eta 0:00:02 lr 0.000014	time 0.8775 (0.8856)	loss 0.5224 (0.4911)	grad_norm 2.2805 (2.7782)	mem 20675MB
[2025-04-03 04:29:40 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][572/573]	eta 0:00:00 lr 0.000014	time 0.8774 (0.8856)	loss 0.5227 (0.4911)	grad_norm 2.1718 (2.7766)	mem 20675MB
[2025-04-03 04:29:40 simmim_finetune] (main_finetune.py 260): INFO EPOCH 27 training takes 0:08:27
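Each `time 0.8774 (0.8856)`, `loss 0.5227 (0.4911)`, and `grad_norm 2.1718 (2.7766)` pair in the training lines above is an instantaneous value followed by its running average over the epoch. A minimal meter that produces this `val (avg)` format (a sketch in the style of the common AverageMeter pattern, not necessarily the exact class used by `main_finetune.py`) could look like:

```python
class AverageMeter:
    """Tracks the latest value and its running average,
    printed as `val (avg)` in the training log."""

    def __init__(self):
        self.val = 0.0   # most recent value
        self.sum = 0.0   # running total
        self.count = 0   # number of updates

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)


# Example: three loss values produce `loss 0.7000 (0.5000)`
loss_meter = AverageMeter()
for loss in (0.5, 0.3, 0.7):
    loss_meter.update(loss)
print(f"loss {loss_meter.val:.4f} ({loss_meter.avg:.4f})")
```

Resetting such meters at the start of each epoch explains why the running averages (the parenthesized numbers) restart near the first batch's value when epoch 28 begins below.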
[2025-04-03 04:29:43 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 2.950 (2.950)	Loss 0.5158 (0.5158)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 04:29:44 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.283 (1.173)	Loss 0.4756 (0.4812)	Acc@1 74.219 (73.698)	Mem 20675MB
[2025-04-03 04:29:44 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.287 (0.818)	Loss 0.4945 (0.4740)	Acc@1 72.656 (74.531)	Mem 20675MB
[2025-04-03 04:29:45 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.666)	Loss 0.4219 (0.4603)	Acc@1 82.031 (76.451)	Mem 20675MB
[2025-04-03 04:29:46 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.293 (0.582)	Loss 0.4932 (0.4564)	Acc@1 75.781 (77.170)	Mem 20675MB
[2025-04-03 04:29:46 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.284 (0.528)	Loss 0.4641 (0.4629)	Acc@1 83.594 (77.486)	Mem 20675MB
[2025-04-03 04:29:47 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.491)	Loss 0.4611 (0.4622)	Acc@1 78.906 (77.704)	Mem 20675MB
[2025-04-03 04:29:47 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.464)	Loss 0.4217 (0.4567)	Acc@1 79.688 (78.073)	Mem 20675MB
[2025-04-03 04:29:48 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.125
[2025-04-03 04:29:48 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.1%
[2025-04-03 04:29:48 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
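The running Acc@1 in the test lines above is a sample-weighted average over batches: with a batch size of 128 and 1984 test images, the first 15 batches are full and the last holds 64 images, so the final batch contributes proportionally less. A hedged sketch of that bookkeeping (an illustration, not the exact code at `main_finetune.py` line 297):

```python
def running_top1(batch_accs, batch_sizes):
    """Yield the sample-weighted running Acc@1 (in percent) after each
    batch, given per-batch accuracies and batch sizes."""
    correct = 0.0
    total = 0
    for acc, n in zip(batch_accs, batch_sizes):
        correct += acc * n / 100.0   # convert percent back to a count
        total += n
        yield 100.0 * correct / total


# With equal batch sizes the running value is a plain mean of the
# per-batch accuracies seen so far; the smaller final batch is what
# makes the overall figure a weighted, not plain, average.
accs = list(running_top1([70.312, 74.219], [128, 128]))
```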
[2025-04-03 04:29:48 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.978470506288398e-07, 2.978470506288398e-07, 3.2508424029832037e-07, 3.2508424029832037e-07, 3.669876090205982e-07, 3.669876090205982e-07, 4.3145433013179483e-07, 4.3145433013179483e-07, 5.306339010720973e-07, 5.306339010720973e-07, 6.832178563648705e-07, 6.832178563648705e-07, 9.179624029691367e-07, 9.179624029691367e-07, 1.2791078592833922e-06, 1.2791078592833922e-06, 1.8347162536130166e-06, 1.8347162536130166e-06, 2.689498398735516e-06, 2.689498398735516e-06, 4.004547852770129e-06, 4.004547852770129e-06, 6.027700858977227e-06, 6.027700858977227e-06, 9.140243945449684e-06, 9.140243945449684e-06, 1.3928771770791927e-05, 1.3928771770791927e-05]
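The 28 per-group learning rates above come in 14 pairs (each depth has a weight-decay and a no-weight-decay group) that grow roughly geometrically from the patch embedding to the head, reflecting `LAYER_DECAY: 0.65` from the config with 12 ViT blocks plus embedding and head. A sketch of that scale computation (an assumption about the grouping scheme, before the cosine schedule and `MIN_LR` floor are applied, so the raw log values will not match exactly):

```python
# Config values from the header of this log.
num_layers = 12      # MODEL.VIT.DEPTH
layer_decay = 0.65   # TRAIN.LAYER_DECAY
base_lr = 0.00125    # TRAIN.BASE_LR

# Depth 0 is the patch embedding, depths 1..12 are transformer blocks,
# depth 13 is the classification head. Shallower layers are scaled down
# harder, so pretrained low-level features move slowly while the head
# trains at the full base LR.
scales = [layer_decay ** (num_layers + 1 - d) for d in range(num_layers + 2)]
group_lrs = [base_lr * s for s in scales]
# Successive depths differ by a factor of 1 / 0.65 ≈ 1.54.
```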
[2025-04-03 04:29:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][0/573]	eta 0:32:18 lr 0.000014	time 3.3834 (3.3834)	loss 0.4233 (0.4233)	grad_norm 4.6842 (4.6842)	mem 20675MB
[2025-04-03 04:29:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][2/573]	eta 0:16:19 lr 0.000014	time 0.8801 (1.7150)	loss 0.3981 (0.4628)	grad_norm 2.3694 (2.9562)	mem 20675MB
[2025-04-03 04:29:55 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][4/573]	eta 0:13:06 lr 0.000014	time 0.8812 (1.3819)	loss 0.4593 (0.4646)	grad_norm 3.0941 (3.0722)	mem 20675MB
[2025-04-03 04:29:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][6/573]	eta 0:11:42 lr 0.000014	time 0.8806 (1.2387)	loss 0.5624 (0.4966)	grad_norm 2.6985 (2.9819)	mem 20675MB
[2025-04-03 04:29:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][8/573]	eta 0:10:54 lr 0.000014	time 0.8777 (1.1589)	loss 0.4108 (0.4738)	grad_norm 4.0729 (3.1390)	mem 20675MB
[2025-04-03 04:30:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][10/573]	eta 0:10:24 lr 0.000014	time 0.8773 (1.1084)	loss 0.3453 (0.4701)	grad_norm 3.9199 (3.2113)	mem 20675MB
[2025-04-03 04:30:02 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][12/573]	eta 0:10:02 lr 0.000014	time 0.8772 (1.0735)	loss 0.4488 (0.4696)	grad_norm 2.0491 (3.0683)	mem 20675MB
[2025-04-03 04:30:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][14/573]	eta 0:09:45 lr 0.000014	time 0.8832 (1.0482)	loss 0.5157 (0.4622)	grad_norm 2.7170 (3.0176)	mem 20675MB
[2025-04-03 04:30:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][16/573]	eta 0:09:33 lr 0.000014	time 0.8780 (1.0290)	loss 0.5527 (0.4768)	grad_norm 2.0502 (2.9460)	mem 20675MB
[2025-04-03 04:30:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][18/573]	eta 0:09:22 lr 0.000013	time 0.8818 (1.0134)	loss 0.5607 (0.4742)	grad_norm 2.6300 (2.9281)	mem 20675MB
[2025-04-03 04:30:09 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][20/573]	eta 0:09:13 lr 0.000013	time 0.8801 (1.0011)	loss 0.4815 (0.4799)	grad_norm 3.2636 (2.9249)	mem 20675MB
[2025-04-03 04:30:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][22/573]	eta 0:09:05 lr 0.000013	time 0.8783 (0.9905)	loss 0.5728 (0.4849)	grad_norm 2.4695 (2.9018)	mem 20675MB
[2025-04-03 04:30:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][24/573]	eta 0:08:58 lr 0.000013	time 0.8828 (0.9817)	loss 0.3237 (0.4743)	grad_norm 3.2664 (2.9117)	mem 20675MB
[2025-04-03 04:30:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][26/573]	eta 0:08:53 lr 0.000013	time 0.8783 (0.9746)	loss 0.5382 (0.4737)	grad_norm 2.2079 (2.8902)	mem 20675MB
[2025-04-03 04:30:16 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][28/573]	eta 0:08:47 lr 0.000013	time 0.8773 (0.9683)	loss 0.5727 (0.4782)	grad_norm 2.6892 (2.8598)	mem 20675MB
[2025-04-03 04:30:18 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][30/573]	eta 0:08:42 lr 0.000013	time 0.8871 (0.9629)	loss 0.5379 (0.4804)	grad_norm 2.0693 (2.8001)	mem 20675MB
[2025-04-03 04:30:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][32/573]	eta 0:08:38 lr 0.000013	time 0.8773 (0.9578)	loss 0.4623 (0.4804)	grad_norm 2.3119 (2.8079)	mem 20675MB
[2025-04-03 04:30:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][34/573]	eta 0:08:33 lr 0.000013	time 0.8778 (0.9532)	loss 0.5395 (0.4844)	grad_norm 2.5202 (2.7847)	mem 20675MB
[2025-04-03 04:30:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][36/573]	eta 0:08:29 lr 0.000013	time 0.8783 (0.9492)	loss 0.6179 (0.4871)	grad_norm 2.3033 (2.7519)	mem 20675MB
[2025-04-03 04:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][38/573]	eta 0:08:25 lr 0.000013	time 0.8782 (0.9456)	loss 0.6199 (0.4882)	grad_norm 2.2888 (2.7429)	mem 20675MB
[2025-04-03 04:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][40/573]	eta 0:08:22 lr 0.000013	time 0.8801 (0.9425)	loss 0.5130 (0.4905)	grad_norm 2.7269 (2.7402)	mem 20675MB
[2025-04-03 04:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][42/573]	eta 0:08:19 lr 0.000013	time 0.8840 (0.9399)	loss 0.5257 (0.4931)	grad_norm 2.3230 (2.7283)	mem 20675MB
[2025-04-03 04:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][44/573]	eta 0:08:15 lr 0.000013	time 0.8778 (0.9372)	loss 0.4095 (0.4875)	grad_norm 4.8175 (2.7822)	mem 20675MB
[2025-04-03 04:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][46/573]	eta 0:08:12 lr 0.000013	time 0.8781 (0.9350)	loss 0.4748 (0.4865)	grad_norm 1.8207 (2.7463)	mem 20675MB
[2025-04-03 04:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][48/573]	eta 0:08:09 lr 0.000013	time 0.8788 (0.9327)	loss 0.3555 (0.4841)	grad_norm 4.0708 (2.7613)	mem 20675MB
[2025-04-03 04:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][50/573]	eta 0:08:06 lr 0.000013	time 0.8779 (0.9306)	loss 0.5405 (0.4847)	grad_norm 2.1899 (2.7551)	mem 20675MB
[2025-04-03 04:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][52/573]	eta 0:08:03 lr 0.000013	time 0.8806 (0.9287)	loss 0.3114 (0.4825)	grad_norm 3.7351 (2.7640)	mem 20675MB
[2025-04-03 04:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][54/573]	eta 0:08:01 lr 0.000013	time 0.8779 (0.9270)	loss 0.5583 (0.4816)	grad_norm 2.2041 (2.7857)	mem 20675MB
[2025-04-03 04:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][56/573]	eta 0:07:58 lr 0.000013	time 0.8778 (0.9253)	loss 0.3458 (0.4777)	grad_norm 2.8266 (2.8086)	mem 20675MB
[2025-04-03 04:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][58/573]	eta 0:07:55 lr 0.000013	time 0.8871 (0.9239)	loss 0.4908 (0.4783)	grad_norm 3.2941 (2.8240)	mem 20675MB
[2025-04-03 04:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][60/573]	eta 0:07:53 lr 0.000013	time 0.8784 (0.9224)	loss 0.5460 (0.4792)	grad_norm 2.3843 (2.8097)	mem 20675MB
[2025-04-03 04:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][62/573]	eta 0:07:50 lr 0.000012	time 0.8812 (0.9212)	loss 0.4977 (0.4784)	grad_norm 2.6462 (2.8095)	mem 20675MB
[2025-04-03 04:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][64/573]	eta 0:07:48 lr 0.000012	time 0.8782 (0.9199)	loss 0.4982 (0.4776)	grad_norm 2.7308 (2.7976)	mem 20675MB
[2025-04-03 04:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][66/573]	eta 0:07:45 lr 0.000012	time 0.8784 (0.9187)	loss 0.5908 (0.4795)	grad_norm 1.9511 (2.7835)	mem 20675MB
[2025-04-03 04:30:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][68/573]	eta 0:07:43 lr 0.000012	time 0.8785 (0.9176)	loss 0.5033 (0.4792)	grad_norm 3.2499 (2.7847)	mem 20675MB
[2025-04-03 04:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][70/573]	eta 0:07:41 lr 0.000012	time 0.8798 (0.9166)	loss 0.4662 (0.4803)	grad_norm 4.6546 (2.8097)	mem 20675MB
[2025-04-03 04:30:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][72/573]	eta 0:07:38 lr 0.000012	time 0.8781 (0.9155)	loss 0.5283 (0.4798)	grad_norm 1.6949 (2.8225)	mem 20675MB
[2025-04-03 04:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][74/573]	eta 0:07:36 lr 0.000012	time 0.8782 (0.9146)	loss 0.6074 (0.4813)	grad_norm 2.2026 (2.8233)	mem 20675MB
[2025-04-03 04:30:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][76/573]	eta 0:07:34 lr 0.000012	time 0.8854 (0.9137)	loss 0.4270 (0.4823)	grad_norm 2.9664 (2.8197)	mem 20675MB
[2025-04-03 04:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][78/573]	eta 0:07:31 lr 0.000012	time 0.8859 (0.9129)	loss 0.3263 (0.4810)	grad_norm 3.3616 (2.8146)	mem 20675MB
[2025-04-03 04:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][80/573]	eta 0:07:29 lr 0.000012	time 0.8817 (0.9122)	loss 0.5139 (0.4814)	grad_norm 2.5707 (2.8215)	mem 20675MB
[2025-04-03 04:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][82/573]	eta 0:07:27 lr 0.000012	time 0.8782 (0.9115)	loss 0.4630 (0.4803)	grad_norm 3.3841 (2.8239)	mem 20675MB
[2025-04-03 04:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][84/573]	eta 0:07:25 lr 0.000012	time 0.8788 (0.9107)	loss 0.5264 (0.4821)	grad_norm 2.7062 (2.8176)	mem 20675MB
[2025-04-03 04:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][86/573]	eta 0:07:23 lr 0.000012	time 0.8834 (0.9101)	loss 0.4561 (0.4833)	grad_norm 2.8161 (2.8169)	mem 20675MB
[2025-04-03 04:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][88/573]	eta 0:07:21 lr 0.000012	time 0.8783 (0.9095)	loss 0.4368 (0.4819)	grad_norm 3.2392 (2.8200)	mem 20675MB
[2025-04-03 04:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][90/573]	eta 0:07:18 lr 0.000012	time 0.8779 (0.9088)	loss 0.5431 (0.4833)	grad_norm 1.9389 (2.8079)	mem 20675MB
[2025-04-03 04:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][92/573]	eta 0:07:16 lr 0.000012	time 0.8779 (0.9082)	loss 0.5087 (0.4845)	grad_norm 2.2357 (2.7973)	mem 20675MB
[2025-04-03 04:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][94/573]	eta 0:07:14 lr 0.000012	time 0.8844 (0.9077)	loss 0.5355 (0.4856)	grad_norm 1.4746 (2.7759)	mem 20675MB
[2025-04-03 04:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][96/573]	eta 0:07:12 lr 0.000012	time 0.8781 (0.9071)	loss 0.4889 (0.4864)	grad_norm 1.9901 (2.7629)	mem 20675MB
[2025-04-03 04:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][98/573]	eta 0:07:10 lr 0.000012	time 0.8780 (0.9065)	loss 0.4136 (0.4841)	grad_norm 3.8821 (2.7745)	mem 20675MB
[2025-04-03 04:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][100/573]	eta 0:07:08 lr 0.000012	time 0.8778 (0.9060)	loss 0.5854 (0.4861)	grad_norm 3.1409 (2.7752)	mem 20675MB
[2025-04-03 04:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][102/573]	eta 0:07:06 lr 0.000012	time 0.8779 (0.9055)	loss 0.5346 (0.4871)	grad_norm 3.1641 (2.7747)	mem 20675MB
[2025-04-03 04:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][104/573]	eta 0:07:04 lr 0.000012	time 0.8787 (0.9050)	loss 0.6446 (0.4885)	grad_norm 2.9522 (2.7806)	mem 20675MB
[2025-04-03 04:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][106/573]	eta 0:07:02 lr 0.000012	time 0.8840 (0.9047)	loss 0.4483 (0.4882)	grad_norm 2.9250 (2.7858)	mem 20675MB
[2025-04-03 04:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][108/573]	eta 0:07:00 lr 0.000011	time 0.8955 (0.9044)	loss 0.4461 (0.4874)	grad_norm 3.7819 (2.7969)	mem 20675MB
[2025-04-03 04:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][110/573]	eta 0:06:58 lr 0.000011	time 0.8780 (0.9040)	loss 0.5083 (0.4872)	grad_norm 2.6473 (2.7997)	mem 20675MB
[2025-04-03 04:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][112/573]	eta 0:06:56 lr 0.000011	time 0.8772 (0.9036)	loss 0.5432 (0.4884)	grad_norm 3.3395 (2.8003)	mem 20675MB
[2025-04-03 04:31:32 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][114/573]	eta 0:06:54 lr 0.000011	time 0.8784 (0.9032)	loss 0.5136 (0.4892)	grad_norm 2.3005 (2.7959)	mem 20675MB
[2025-04-03 04:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][116/573]	eta 0:06:52 lr 0.000011	time 0.8785 (0.9028)	loss 0.4932 (0.4894)	grad_norm 2.7578 (2.7973)	mem 20675MB
[2025-04-03 04:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][118/573]	eta 0:06:50 lr 0.000011	time 0.8809 (0.9024)	loss 0.4031 (0.4887)	grad_norm 3.2661 (2.7943)	mem 20675MB
[2025-04-03 04:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][120/573]	eta 0:06:48 lr 0.000011	time 0.8793 (0.9021)	loss 0.6107 (0.4898)	grad_norm 2.6941 (2.7885)	mem 20675MB
[2025-04-03 04:31:39 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][122/573]	eta 0:06:46 lr 0.000011	time 0.8779 (0.9017)	loss 0.3341 (0.4893)	grad_norm 3.8034 (2.7992)	mem 20675MB
[2025-04-03 04:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][124/573]	eta 0:06:44 lr 0.000011	time 0.8774 (0.9014)	loss 0.5333 (0.4890)	grad_norm 3.5571 (2.8089)	mem 20675MB
[2025-04-03 04:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][126/573]	eta 0:06:42 lr 0.000011	time 0.8776 (0.9011)	loss 0.4073 (0.4882)	grad_norm 4.9710 (2.8303)	mem 20675MB
[2025-04-03 04:31:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][128/573]	eta 0:06:40 lr 0.000011	time 0.8779 (0.9007)	loss 0.5221 (0.4890)	grad_norm 2.0120 (2.8233)	mem 20675MB
[2025-04-03 04:31:46 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][130/573]	eta 0:06:38 lr 0.000011	time 0.8775 (0.9004)	loss 0.3348 (0.4870)	grad_norm 2.4334 (2.8169)	mem 20675MB
[2025-04-03 04:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][132/573]	eta 0:06:36 lr 0.000011	time 0.8777 (0.9001)	loss 0.4902 (0.4869)	grad_norm 2.6009 (2.8181)	mem 20675MB
[2025-04-03 04:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][134/573]	eta 0:06:34 lr 0.000011	time 0.8782 (0.8997)	loss 0.5724 (0.4868)	grad_norm 2.6528 (2.8194)	mem 20675MB
[2025-04-03 04:31:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][136/573]	eta 0:06:33 lr 0.000011	time 0.8778 (0.8995)	loss 0.3680 (0.4861)	grad_norm 3.1737 (2.8229)	mem 20675MB
[2025-04-03 04:31:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][138/573]	eta 0:06:31 lr 0.000011	time 0.8776 (0.8992)	loss 0.4475 (0.4854)	grad_norm 3.8438 (2.8308)	mem 20675MB
[2025-04-03 04:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][140/573]	eta 0:06:29 lr 0.000011	time 0.8783 (0.8989)	loss 0.3928 (0.4854)	grad_norm 3.1078 (2.8327)	mem 20675MB
[2025-04-03 04:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][142/573]	eta 0:06:27 lr 0.000011	time 0.8777 (0.8986)	loss 0.5483 (0.4856)	grad_norm 3.0940 (2.8409)	mem 20675MB
[2025-04-03 04:31:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][144/573]	eta 0:06:25 lr 0.000011	time 0.8816 (0.8984)	loss 0.4953 (0.4864)	grad_norm 2.7359 (2.8375)	mem 20675MB
[2025-04-03 04:32:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][146/573]	eta 0:06:23 lr 0.000011	time 0.8773 (0.8982)	loss 0.5388 (0.4863)	grad_norm 2.0037 (2.8293)	mem 20675MB
[2025-04-03 04:32:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][148/573]	eta 0:06:21 lr 0.000011	time 0.8813 (0.8979)	loss 0.5782 (0.4872)	grad_norm 2.0981 (2.8243)	mem 20675MB
[2025-04-03 04:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][150/573]	eta 0:06:19 lr 0.000011	time 0.8784 (0.8977)	loss 0.5576 (0.4874)	grad_norm 1.7563 (2.8150)	mem 20675MB
[2025-04-03 04:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][152/573]	eta 0:06:17 lr 0.000011	time 0.8779 (0.8974)	loss 0.5332 (0.4886)	grad_norm 3.1346 (2.8152)	mem 20675MB
[2025-04-03 04:32:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][154/573]	eta 0:06:15 lr 0.000010	time 0.8780 (0.8972)	loss 0.4980 (0.4879)	grad_norm 3.0641 (2.8328)	mem 20675MB
[2025-04-03 04:32:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][156/573]	eta 0:06:14 lr 0.000010	time 0.8774 (0.8970)	loss 0.5749 (0.4892)	grad_norm 2.6076 (2.8318)	mem 20675MB
[2025-04-03 04:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][158/573]	eta 0:06:12 lr 0.000010	time 0.8782 (0.8968)	loss 0.6281 (0.4904)	grad_norm 2.7048 (2.8280)	mem 20675MB
[2025-04-03 04:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][160/573]	eta 0:06:10 lr 0.000010	time 0.8776 (0.8965)	loss 0.5684 (0.4903)	grad_norm 3.3006 (2.8326)	mem 20675MB
[2025-04-03 04:32:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][162/573]	eta 0:06:08 lr 0.000010	time 0.8777 (0.8963)	loss 0.4767 (0.4907)	grad_norm 2.1100 (2.8236)	mem 20675MB
[2025-04-03 04:32:16 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][164/573]	eta 0:06:06 lr 0.000010	time 0.8780 (0.8961)	loss 0.5673 (0.4910)	grad_norm 3.0880 (2.8228)	mem 20675MB
[2025-04-03 04:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][166/573]	eta 0:06:04 lr 0.000010	time 0.8779 (0.8960)	loss 0.5252 (0.4919)	grad_norm 3.1559 (2.8221)	mem 20675MB
[2025-04-03 04:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][168/573]	eta 0:06:02 lr 0.000010	time 0.8777 (0.8958)	loss 0.3889 (0.4908)	grad_norm 2.7748 (2.8292)	mem 20675MB
[2025-04-03 04:32:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][170/573]	eta 0:06:00 lr 0.000010	time 0.8778 (0.8956)	loss 0.3886 (0.4895)	grad_norm 4.5304 (2.8522)	mem 20675MB
[2025-04-03 04:32:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][172/573]	eta 0:05:59 lr 0.000010	time 0.8772 (0.8954)	loss 0.5462 (0.4898)	grad_norm 1.5679 (2.8410)	mem 20675MB
[2025-04-03 04:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][174/573]	eta 0:05:57 lr 0.000010	time 0.8777 (0.8952)	loss 0.4149 (0.4903)	grad_norm 4.0518 (2.8483)	mem 20675MB
[2025-04-03 04:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][176/573]	eta 0:05:55 lr 0.000010	time 0.8801 (0.8950)	loss 0.4310 (0.4903)	grad_norm 3.7445 (2.8502)	mem 20675MB
[2025-04-03 04:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][178/573]	eta 0:05:53 lr 0.000010	time 0.8773 (0.8949)	loss 0.5667 (0.4908)	grad_norm 3.2490 (2.8509)	mem 20675MB
[2025-04-03 04:32:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][180/573]	eta 0:05:51 lr 0.000010	time 0.8812 (0.8948)	loss 0.4682 (0.4905)	grad_norm 2.4786 (2.8470)	mem 20675MB
[2025-04-03 04:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][182/573]	eta 0:05:49 lr 0.000010	time 0.8782 (0.8946)	loss 0.4925 (0.4905)	grad_norm 2.6741 (2.8408)	mem 20675MB
[2025-04-03 04:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][184/573]	eta 0:05:47 lr 0.000010	time 0.8773 (0.8944)	loss 0.4561 (0.4905)	grad_norm 3.7426 (2.8432)	mem 20675MB
[2025-04-03 04:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][186/573]	eta 0:05:46 lr 0.000010	time 0.8775 (0.8943)	loss 0.4876 (0.4908)	grad_norm 2.3284 (2.8378)	mem 20675MB
[2025-04-03 04:32:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][188/573]	eta 0:05:44 lr 0.000010	time 0.8774 (0.8941)	loss 0.5096 (0.4909)	grad_norm 1.6530 (2.8283)	mem 20675MB
[2025-04-03 04:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][190/573]	eta 0:05:42 lr 0.000010	time 0.8772 (0.8940)	loss 0.3174 (0.4892)	grad_norm 3.2304 (2.8307)	mem 20675MB
[2025-04-03 04:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][192/573]	eta 0:05:40 lr 0.000010	time 0.8776 (0.8938)	loss 0.4824 (0.4886)	grad_norm 1.7667 (2.8363)	mem 20675MB
[2025-04-03 04:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][194/573]	eta 0:05:38 lr 0.000010	time 0.8825 (0.8937)	loss 0.4099 (0.4876)	grad_norm 3.0549 (2.8389)	mem 20675MB
[2025-04-03 04:32:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][196/573]	eta 0:05:36 lr 0.000010	time 0.8909 (0.8937)	loss 0.4753 (0.4878)	grad_norm 3.9523 (2.8466)	mem 20675MB
[2025-04-03 04:32:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][198/573]	eta 0:05:35 lr 0.000010	time 0.8778 (0.8935)	loss 0.4454 (0.4879)	grad_norm 3.2330 (2.8461)	mem 20675MB
[2025-04-03 04:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][200/573]	eta 0:05:33 lr 0.000010	time 0.8780 (0.8934)	loss 0.5849 (0.4886)	grad_norm 2.0451 (2.8460)	mem 20675MB
[2025-04-03 04:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][202/573]	eta 0:05:31 lr 0.000010	time 0.8779 (0.8932)	loss 0.4573 (0.4879)	grad_norm 3.0131 (2.8499)	mem 20675MB
[2025-04-03 04:32:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][204/573]	eta 0:05:29 lr 0.000009	time 0.8775 (0.8931)	loss 0.6412 (0.4884)	grad_norm 2.8348 (2.8491)	mem 20675MB
[2025-04-03 04:32:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][206/573]	eta 0:05:27 lr 0.000009	time 0.8778 (0.8930)	loss 0.5285 (0.4884)	grad_norm 2.5587 (2.8492)	mem 20675MB
[2025-04-03 04:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][208/573]	eta 0:05:25 lr 0.000009	time 0.8774 (0.8929)	loss 0.3634 (0.4880)	grad_norm 3.0933 (2.8510)	mem 20675MB
[2025-04-03 04:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][210/573]	eta 0:05:24 lr 0.000009	time 0.8772 (0.8928)	loss 0.4279 (0.4876)	grad_norm 3.9689 (2.8536)	mem 20675MB
[2025-04-03 04:32:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][212/573]	eta 0:05:22 lr 0.000009	time 0.8775 (0.8927)	loss 0.4788 (0.4874)	grad_norm 2.8738 (2.8555)	mem 20675MB
[2025-04-03 04:33:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][214/573]	eta 0:05:20 lr 0.000009	time 0.8813 (0.8926)	loss 0.4583 (0.4876)	grad_norm 2.8378 (2.8506)	mem 20675MB
[2025-04-03 04:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][216/573]	eta 0:05:18 lr 0.000009	time 0.8775 (0.8925)	loss 0.5842 (0.4884)	grad_norm 1.8123 (2.8437)	mem 20675MB
[2025-04-03 04:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][218/573]	eta 0:05:16 lr 0.000009	time 0.8801 (0.8924)	loss 0.5329 (0.4883)	grad_norm 2.2122 (2.8402)	mem 20675MB
[2025-04-03 04:33:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][220/573]	eta 0:05:14 lr 0.000009	time 0.8771 (0.8922)	loss 0.4707 (0.4877)	grad_norm 2.6834 (2.8411)	mem 20675MB
[2025-04-03 04:33:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][222/573]	eta 0:05:13 lr 0.000009	time 0.8774 (0.8922)	loss 0.4777 (0.4875)	grad_norm 2.4285 (2.8415)	mem 20675MB
[2025-04-03 04:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][224/573]	eta 0:05:11 lr 0.000009	time 0.8777 (0.8921)	loss 0.5223 (0.4878)	grad_norm 2.8466 (2.8413)	mem 20675MB
[2025-04-03 04:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][226/573]	eta 0:05:09 lr 0.000009	time 0.8839 (0.8920)	loss 0.4876 (0.4877)	grad_norm 4.1611 (2.8476)	mem 20675MB
[2025-04-03 04:33:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][228/573]	eta 0:05:07 lr 0.000009	time 0.8783 (0.8919)	loss 0.3036 (0.4867)	grad_norm 3.4961 (2.8476)	mem 20675MB
[2025-04-03 04:33:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][230/573]	eta 0:05:05 lr 0.000009	time 0.8793 (0.8918)	loss 0.5140 (0.4862)	grad_norm 2.5772 (2.8523)	mem 20675MB
[2025-04-03 04:33:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][232/573]	eta 0:05:04 lr 0.000009	time 0.8773 (0.8917)	loss 0.4769 (0.4855)	grad_norm 3.6161 (2.8565)	mem 20675MB
[2025-04-03 04:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][234/573]	eta 0:05:02 lr 0.000009	time 0.8958 (0.8917)	loss 0.4359 (0.4855)	grad_norm 2.2474 (2.8521)	mem 20675MB
[2025-04-03 04:33:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][236/573]	eta 0:05:00 lr 0.000009	time 0.8806 (0.8916)	loss 0.5347 (0.4860)	grad_norm 3.5421 (2.8553)	mem 20675MB
[2025-04-03 04:33:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][238/573]	eta 0:04:58 lr 0.000009	time 0.8789 (0.8917)	loss 0.5671 (0.4866)	grad_norm 3.1026 (2.8534)	mem 20675MB
[2025-04-03 04:33:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][240/573]	eta 0:04:56 lr 0.000009	time 0.8972 (0.8917)	loss 0.4470 (0.4869)	grad_norm 2.3374 (2.8509)	mem 20675MB
[2025-04-03 04:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][242/573]	eta 0:04:55 lr 0.000009	time 0.8770 (0.8916)	loss 0.5361 (0.4876)	grad_norm 2.3902 (2.8468)	mem 20675MB
[2025-04-03 04:33:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][244/573]	eta 0:04:53 lr 0.000009	time 0.8774 (0.8915)	loss 0.3945 (0.4870)	grad_norm 3.4578 (2.8502)	mem 20675MB
[2025-04-03 04:33:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][246/573]	eta 0:04:51 lr 0.000009	time 0.8775 (0.8914)	loss 0.4366 (0.4870)	grad_norm 4.6827 (2.8544)	mem 20675MB
[2025-04-03 04:33:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][248/573]	eta 0:04:49 lr 0.000009	time 0.8774 (0.8913)	loss 0.5257 (0.4866)	grad_norm 3.3505 (2.8547)	mem 20675MB
[2025-04-03 04:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][250/573]	eta 0:04:47 lr 0.000009	time 0.8806 (0.8912)	loss 0.5866 (0.4868)	grad_norm 1.9705 (2.8492)	mem 20675MB
[2025-04-03 04:33:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][252/573]	eta 0:04:46 lr 0.000009	time 0.8777 (0.8912)	loss 0.4504 (0.4867)	grad_norm 2.9065 (2.8467)	mem 20675MB
[2025-04-03 04:33:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][254/573]	eta 0:04:44 lr 0.000009	time 0.8779 (0.8911)	loss 0.4104 (0.4862)	grad_norm 2.5423 (2.8443)	mem 20675MB
[2025-04-03 04:33:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][256/573]	eta 0:04:42 lr 0.000008	time 0.8773 (0.8910)	loss 0.5757 (0.4863)	grad_norm 2.5494 (2.8410)	mem 20675MB
[2025-04-03 04:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][258/573]	eta 0:04:40 lr 0.000008	time 0.8773 (0.8909)	loss 0.4654 (0.4856)	grad_norm 3.4344 (2.8433)	mem 20675MB
[2025-04-03 04:33:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][260/573]	eta 0:04:38 lr 0.000008	time 0.8790 (0.8908)	loss 0.4779 (0.4857)	grad_norm 2.2081 (2.8400)	mem 20675MB
[2025-04-03 04:33:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][262/573]	eta 0:04:37 lr 0.000008	time 0.8779 (0.8908)	loss 0.5442 (0.4861)	grad_norm 3.4517 (2.8452)	mem 20675MB
[2025-04-03 04:33:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][264/573]	eta 0:04:35 lr 0.000008	time 0.8805 (0.8907)	loss 0.5004 (0.4863)	grad_norm 3.3719 (2.8440)	mem 20675MB
[2025-04-03 04:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][266/573]	eta 0:04:33 lr 0.000008	time 0.8826 (0.8906)	loss 0.3639 (0.4859)	grad_norm 3.6061 (2.8501)	mem 20675MB
[2025-04-03 04:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][268/573]	eta 0:04:31 lr 0.000008	time 0.8901 (0.8906)	loss 0.5614 (0.4859)	grad_norm 2.2617 (2.8429)	mem 20675MB
[2025-04-03 04:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][270/573]	eta 0:04:29 lr 0.000008	time 0.8777 (0.8905)	loss 0.6049 (0.4866)	grad_norm 3.0350 (2.8422)	mem 20675MB
[2025-04-03 04:33:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][272/573]	eta 0:04:28 lr 0.000008	time 0.8786 (0.8905)	loss 0.3649 (0.4863)	grad_norm 3.3346 (2.8420)	mem 20675MB
[2025-04-03 04:33:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][274/573]	eta 0:04:26 lr 0.000008	time 0.8781 (0.8904)	loss 0.4282 (0.4864)	grad_norm 3.2807 (2.8427)	mem 20675MB
[2025-04-03 04:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][276/573]	eta 0:04:24 lr 0.000008	time 0.8829 (0.8903)	loss 0.6209 (0.4868)	grad_norm 3.0700 (2.8424)	mem 20675MB
[2025-04-03 04:33:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][278/573]	eta 0:04:22 lr 0.000008	time 0.8809 (0.8902)	loss 0.4846 (0.4867)	grad_norm 3.3101 (2.8414)	mem 20675MB
[2025-04-03 04:33:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][280/573]	eta 0:04:20 lr 0.000008	time 0.8776 (0.8902)	loss 0.3817 (0.4863)	grad_norm 2.4086 (2.8395)	mem 20675MB
[2025-04-03 04:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][282/573]	eta 0:04:19 lr 0.000008	time 0.8798 (0.8901)	loss 0.4823 (0.4863)	grad_norm 1.9635 (2.8330)	mem 20675MB
[2025-04-03 04:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][284/573]	eta 0:04:17 lr 0.000008	time 0.8772 (0.8901)	loss 0.5410 (0.4865)	grad_norm 2.5038 (2.8297)	mem 20675MB
[2025-04-03 04:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][286/573]	eta 0:04:15 lr 0.000008	time 0.8776 (0.8900)	loss 0.5442 (0.4865)	grad_norm 1.9530 (2.8275)	mem 20675MB
[2025-04-03 04:34:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][288/573]	eta 0:04:13 lr 0.000008	time 0.8817 (0.8899)	loss 0.5741 (0.4861)	grad_norm 2.0865 (2.8243)	mem 20675MB
[2025-04-03 04:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][290/573]	eta 0:04:11 lr 0.000008	time 0.8835 (0.8899)	loss 0.6186 (0.4863)	grad_norm 2.6238 (2.8291)	mem 20675MB
[2025-04-03 04:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][292/573]	eta 0:04:10 lr 0.000008	time 0.8777 (0.8898)	loss 0.4067 (0.4860)	grad_norm 4.7730 (2.8340)	mem 20675MB
[2025-04-03 04:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][294/573]	eta 0:04:08 lr 0.000008	time 0.8779 (0.8897)	loss 0.4861 (0.4860)	grad_norm 2.5328 (2.8345)	mem 20675MB
[2025-04-03 04:34:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][296/573]	eta 0:04:06 lr 0.000008	time 0.8782 (0.8896)	loss 0.5334 (0.4861)	grad_norm 2.1911 (2.8337)	mem 20675MB
[2025-04-03 04:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][298/573]	eta 0:04:04 lr 0.000008	time 0.8777 (0.8896)	loss 0.4245 (0.4863)	grad_norm 2.4845 (2.8294)	mem 20675MB
[2025-04-03 04:34:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][300/573]	eta 0:04:02 lr 0.000008	time 0.8862 (0.8895)	loss 0.4143 (0.4856)	grad_norm 2.6566 (2.8307)	mem 20675MB
[2025-04-03 04:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][302/573]	eta 0:04:01 lr 0.000008	time 0.8782 (0.8895)	loss 0.5439 (0.4853)	grad_norm 1.8870 (2.8309)	mem 20675MB
[2025-04-03 04:34:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][304/573]	eta 0:03:59 lr 0.000008	time 0.8809 (0.8895)	loss 0.5442 (0.4849)	grad_norm 2.3700 (2.8317)	mem 20675MB
[2025-04-03 04:34:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][306/573]	eta 0:03:57 lr 0.000008	time 0.8776 (0.8895)	loss 0.6086 (0.4852)	grad_norm 1.9806 (2.8305)	mem 20675MB
[2025-04-03 04:34:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][308/573]	eta 0:03:55 lr 0.000008	time 0.8943 (0.8895)	loss 0.4216 (0.4853)	grad_norm 2.8247 (2.8281)	mem 20675MB
[2025-04-03 04:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][310/573]	eta 0:03:53 lr 0.000008	time 0.8784 (0.8894)	loss 0.3663 (0.4845)	grad_norm 3.6802 (2.8290)	mem 20675MB
[2025-04-03 04:34:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][312/573]	eta 0:03:52 lr 0.000007	time 0.8860 (0.8894)	loss 0.5515 (0.4848)	grad_norm 2.7948 (2.8255)	mem 20675MB
[2025-04-03 04:34:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][314/573]	eta 0:03:50 lr 0.000007	time 0.8784 (0.8893)	loss 0.5362 (0.4847)	grad_norm 2.4050 (2.8220)	mem 20675MB
[2025-04-03 04:34:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][316/573]	eta 0:03:48 lr 0.000007	time 0.8775 (0.8892)	loss 0.5408 (0.4851)	grad_norm 2.5085 (2.8197)	mem 20675MB
[2025-04-03 04:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][318/573]	eta 0:03:46 lr 0.000007	time 0.8776 (0.8892)	loss 0.3624 (0.4851)	grad_norm 2.8360 (2.8190)	mem 20675MB
[2025-04-03 04:34:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][320/573]	eta 0:03:44 lr 0.000007	time 0.8771 (0.8892)	loss 0.5025 (0.4849)	grad_norm 2.2061 (2.8176)	mem 20675MB
[2025-04-03 04:34:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][322/573]	eta 0:03:43 lr 0.000007	time 0.8777 (0.8891)	loss 0.5490 (0.4847)	grad_norm 2.2382 (2.8187)	mem 20675MB
[2025-04-03 04:34:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][324/573]	eta 0:03:41 lr 0.000007	time 0.8791 (0.8891)	loss 0.4027 (0.4846)	grad_norm 2.9895 (2.8180)	mem 20675MB
[2025-04-03 04:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][326/573]	eta 0:03:39 lr 0.000007	time 0.8781 (0.8890)	loss 0.3810 (0.4845)	grad_norm 2.3636 (2.8164)	mem 20675MB
[2025-04-03 04:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][328/573]	eta 0:03:37 lr 0.000007	time 0.8776 (0.8890)	loss 0.3685 (0.4843)	grad_norm 3.7971 (2.8190)	mem 20675MB
[2025-04-03 04:34:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][330/573]	eta 0:03:36 lr 0.000007	time 0.8778 (0.8889)	loss 0.5203 (0.4845)	grad_norm 3.3441 (2.8177)	mem 20675MB
[2025-04-03 04:34:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][332/573]	eta 0:03:34 lr 0.000007	time 0.8876 (0.8889)	loss 0.4740 (0.4850)	grad_norm 2.3695 (2.8166)	mem 20675MB
[2025-04-03 04:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][334/573]	eta 0:03:32 lr 0.000007	time 0.8777 (0.8889)	loss 0.4474 (0.4846)	grad_norm 2.0185 (2.8152)	mem 20675MB
[2025-04-03 04:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][336/573]	eta 0:03:30 lr 0.000007	time 0.8777 (0.8888)	loss 0.3892 (0.4847)	grad_norm 4.8876 (2.8212)	mem 20675MB
[2025-04-03 04:34:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][338/573]	eta 0:03:28 lr 0.000007	time 0.8788 (0.8887)	loss 0.4217 (0.4843)	grad_norm 3.6951 (2.8289)	mem 20675MB
[2025-04-03 04:34:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][340/573]	eta 0:03:27 lr 0.000007	time 0.8816 (0.8887)	loss 0.4616 (0.4842)	grad_norm 1.9782 (2.8279)	mem 20675MB
[2025-04-03 04:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][342/573]	eta 0:03:25 lr 0.000007	time 0.8779 (0.8887)	loss 0.4485 (0.4838)	grad_norm 2.3300 (2.8291)	mem 20675MB
[2025-04-03 04:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][344/573]	eta 0:03:23 lr 0.000007	time 0.8803 (0.8886)	loss 0.4706 (0.4840)	grad_norm 2.4719 (2.8255)	mem 20675MB
[2025-04-03 04:34:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][346/573]	eta 0:03:21 lr 0.000007	time 0.8778 (0.8887)	loss 0.6145 (0.4846)	grad_norm 2.5240 (2.8249)	mem 20675MB
[2025-04-03 04:34:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][348/573]	eta 0:03:19 lr 0.000007	time 0.8785 (0.8886)	loss 0.5195 (0.4844)	grad_norm 2.7062 (2.8267)	mem 20675MB
[2025-04-03 04:35:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][350/573]	eta 0:03:18 lr 0.000007	time 0.8795 (0.8886)	loss 0.5719 (0.4850)	grad_norm 2.5544 (2.8250)	mem 20675MB
[2025-04-03 04:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][352/573]	eta 0:03:16 lr 0.000007	time 0.8773 (0.8885)	loss 0.4315 (0.4852)	grad_norm 3.8136 (2.8264)	mem 20675MB
[2025-04-03 04:35:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][354/573]	eta 0:03:14 lr 0.000007	time 0.8776 (0.8885)	loss 0.4356 (0.4851)	grad_norm 3.7313 (2.8273)	mem 20675MB
[2025-04-03 04:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][356/573]	eta 0:03:12 lr 0.000007	time 0.8775 (0.8884)	loss 0.4064 (0.4846)	grad_norm 4.1391 (2.8325)	mem 20675MB
[2025-04-03 04:35:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][358/573]	eta 0:03:10 lr 0.000007	time 0.8780 (0.8884)	loss 0.3990 (0.4844)	grad_norm 2.9812 (2.8305)	mem 20675MB
[2025-04-03 04:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][360/573]	eta 0:03:09 lr 0.000007	time 0.8821 (0.8883)	loss 0.4424 (0.4846)	grad_norm 2.4874 (2.8281)	mem 20675MB
[2025-04-03 04:35:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][362/573]	eta 0:03:07 lr 0.000007	time 0.8774 (0.8883)	loss 0.5135 (0.4848)	grad_norm 3.4412 (2.8291)	mem 20675MB
[2025-04-03 04:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][364/573]	eta 0:03:05 lr 0.000007	time 0.8778 (0.8882)	loss 0.6017 (0.4850)	grad_norm 2.6570 (2.8268)	mem 20675MB
[2025-04-03 04:35:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][366/573]	eta 0:03:03 lr 0.000007	time 0.8774 (0.8882)	loss 0.5320 (0.4851)	grad_norm 1.5901 (2.8252)	mem 20675MB
[2025-04-03 04:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][368/573]	eta 0:03:02 lr 0.000007	time 0.8779 (0.8882)	loss 0.5617 (0.4851)	grad_norm 2.7059 (2.8268)	mem 20675MB
[2025-04-03 04:35:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][370/573]	eta 0:03:00 lr 0.000007	time 0.8771 (0.8881)	loss 0.5028 (0.4853)	grad_norm 1.7636 (2.8238)	mem 20675MB
[2025-04-03 04:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][372/573]	eta 0:02:58 lr 0.000006	time 0.8862 (0.8881)	loss 0.3810 (0.4851)	grad_norm 3.0790 (2.8253)	mem 20675MB
[2025-04-03 04:35:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][374/573]	eta 0:02:56 lr 0.000006	time 0.8775 (0.8880)	loss 0.4353 (0.4847)	grad_norm 2.6594 (2.8276)	mem 20675MB
[2025-04-03 04:35:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][376/573]	eta 0:02:54 lr 0.000006	time 0.8774 (0.8880)	loss 0.4937 (0.4846)	grad_norm 3.6902 (2.8293)	mem 20675MB
[2025-04-03 04:35:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][378/573]	eta 0:02:53 lr 0.000006	time 0.8775 (0.8879)	loss 0.4629 (0.4850)	grad_norm 3.5351 (2.8321)	mem 20675MB
[2025-04-03 04:35:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][380/573]	eta 0:02:51 lr 0.000006	time 0.8813 (0.8879)	loss 0.5259 (0.4851)	grad_norm 2.6008 (2.8299)	mem 20675MB
[2025-04-03 04:35:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][382/573]	eta 0:02:49 lr 0.000006	time 0.8780 (0.8879)	loss 0.3492 (0.4848)	grad_norm 4.5385 (2.8326)	mem 20675MB
[2025-04-03 04:35:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][384/573]	eta 0:02:47 lr 0.000006	time 0.8788 (0.8878)	loss 0.6129 (0.4850)	grad_norm 2.9754 (2.8326)	mem 20675MB
[2025-04-03 04:35:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][386/573]	eta 0:02:46 lr 0.000006	time 0.8780 (0.8878)	loss 0.5611 (0.4852)	grad_norm 1.6087 (2.8274)	mem 20675MB
[2025-04-03 04:35:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][388/573]	eta 0:02:44 lr 0.000006	time 0.8856 (0.8878)	loss 0.3989 (0.4850)	grad_norm 3.9680 (2.8306)	mem 20675MB
[2025-04-03 04:35:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][390/573]	eta 0:02:42 lr 0.000006	time 0.8777 (0.8877)	loss 0.5803 (0.4850)	grad_norm 2.7796 (2.8308)	mem 20675MB
[2025-04-03 04:35:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][392/573]	eta 0:02:40 lr 0.000006	time 0.8863 (0.8877)	loss 0.4700 (0.4850)	grad_norm 3.4058 (2.8317)	mem 20675MB
[2025-04-03 04:35:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][394/573]	eta 0:02:38 lr 0.000006	time 0.8791 (0.8877)	loss 0.5825 (0.4855)	grad_norm 2.3065 (2.8291)	mem 20675MB
[2025-04-03 04:35:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][396/573]	eta 0:02:37 lr 0.000006	time 0.8867 (0.8877)	loss 0.5514 (0.4858)	grad_norm 2.4149 (2.8266)	mem 20675MB
[2025-04-03 04:35:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][398/573]	eta 0:02:35 lr 0.000006	time 0.8777 (0.8876)	loss 0.5123 (0.4859)	grad_norm 2.4812 (2.8251)	mem 20675MB
[2025-04-03 04:35:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][400/573]	eta 0:02:33 lr 0.000006	time 0.8774 (0.8876)	loss 0.5554 (0.4862)	grad_norm 1.9051 (2.8241)	mem 20675MB
[2025-04-03 04:35:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][402/573]	eta 0:02:31 lr 0.000006	time 0.8775 (0.8876)	loss 0.3043 (0.4854)	grad_norm 3.6176 (2.8243)	mem 20675MB
[2025-04-03 04:35:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][404/573]	eta 0:02:29 lr 0.000006	time 0.8778 (0.8875)	loss 0.5327 (0.4855)	grad_norm 2.0851 (2.8213)	mem 20675MB
[2025-04-03 04:35:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][406/573]	eta 0:02:28 lr 0.000006	time 0.8773 (0.8875)	loss 0.5404 (0.4856)	grad_norm 2.5553 (2.8205)	mem 20675MB
[2025-04-03 04:35:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][408/573]	eta 0:02:26 lr 0.000006	time 0.8774 (0.8874)	loss 0.4908 (0.4856)	grad_norm 3.3992 (2.8225)	mem 20675MB
[2025-04-03 04:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][410/573]	eta 0:02:24 lr 0.000006	time 0.8779 (0.8874)	loss 0.4700 (0.4852)	grad_norm 3.3556 (2.8230)	mem 20675MB
[2025-04-03 04:35:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][412/573]	eta 0:02:22 lr 0.000006	time 0.8812 (0.8873)	loss 0.4602 (0.4853)	grad_norm 1.9255 (2.8197)	mem 20675MB
[2025-04-03 04:35:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][414/573]	eta 0:02:21 lr 0.000006	time 0.8874 (0.8874)	loss 0.3511 (0.4854)	grad_norm 2.5861 (2.8183)	mem 20675MB
[2025-04-03 04:35:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][416/573]	eta 0:02:19 lr 0.000006	time 0.8776 (0.8873)	loss 0.5907 (0.4853)	grad_norm 2.6345 (2.8191)	mem 20675MB
[2025-04-03 04:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][418/573]	eta 0:02:17 lr 0.000006	time 0.8776 (0.8873)	loss 0.5636 (0.4851)	grad_norm 2.3381 (2.8213)	mem 20675MB
[2025-04-03 04:36:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][420/573]	eta 0:02:15 lr 0.000006	time 0.8778 (0.8873)	loss 0.4896 (0.4850)	grad_norm 3.0961 (2.8244)	mem 20675MB
[2025-04-03 04:36:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][422/573]	eta 0:02:13 lr 0.000006	time 0.8776 (0.8872)	loss 0.5198 (0.4849)	grad_norm 3.1616 (2.8265)	mem 20675MB
[2025-04-03 04:36:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][424/573]	eta 0:02:12 lr 0.000006	time 0.8779 (0.8872)	loss 0.6072 (0.4853)	grad_norm 3.2840 (2.8263)	mem 20675MB
[2025-04-03 04:36:06 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][426/573]	eta 0:02:10 lr 0.000006	time 0.8822 (0.8872)	loss 0.4692 (0.4854)	grad_norm 2.6248 (2.8251)	mem 20675MB
[2025-04-03 04:36:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][428/573]	eta 0:02:08 lr 0.000006	time 0.8773 (0.8871)	loss 0.5300 (0.4855)	grad_norm 4.0638 (2.8269)	mem 20675MB
[2025-04-03 04:36:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][430/573]	eta 0:02:06 lr 0.000006	time 0.8774 (0.8871)	loss 0.4257 (0.4854)	grad_norm 3.6262 (2.8265)	mem 20675MB
[2025-04-03 04:36:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][432/573]	eta 0:02:05 lr 0.000006	time 0.8774 (0.8871)	loss 0.4757 (0.4853)	grad_norm 2.8244 (2.8271)	mem 20675MB
[2025-04-03 04:36:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][434/573]	eta 0:02:03 lr 0.000006	time 0.8773 (0.8870)	loss 0.4619 (0.4851)	grad_norm 3.2829 (2.8286)	mem 20675MB
[2025-04-03 04:36:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][436/573]	eta 0:02:01 lr 0.000006	time 0.8774 (0.8870)	loss 0.4806 (0.4853)	grad_norm 1.9204 (2.8246)	mem 20675MB
[2025-04-03 04:36:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][438/573]	eta 0:01:59 lr 0.000005	time 0.9022 (0.8870)	loss 0.4479 (0.4851)	grad_norm 1.6879 (2.8203)	mem 20675MB
[2025-04-03 04:36:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][440/573]	eta 0:01:57 lr 0.000005	time 0.8780 (0.8870)	loss 0.5918 (0.4855)	grad_norm 3.4777 (2.8199)	mem 20675MB
[2025-04-03 04:36:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][442/573]	eta 0:01:56 lr 0.000005	time 0.8780 (0.8869)	loss 0.3731 (0.4853)	grad_norm 2.6492 (2.8188)	mem 20675MB
[2025-04-03 04:36:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][444/573]	eta 0:01:54 lr 0.000005	time 0.8776 (0.8869)	loss 0.4247 (0.4852)	grad_norm 3.7919 (2.8220)	mem 20675MB
[2025-04-03 04:36:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][446/573]	eta 0:01:52 lr 0.000005	time 0.8772 (0.8869)	loss 0.5319 (0.4849)	grad_norm 2.1844 (2.8225)	mem 20675MB
[2025-04-03 04:36:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][448/573]	eta 0:01:50 lr 0.000005	time 0.8770 (0.8868)	loss 0.5345 (0.4849)	grad_norm 3.0884 (2.8230)	mem 20675MB
[2025-04-03 04:36:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][450/573]	eta 0:01:49 lr 0.000005	time 0.8772 (0.8868)	loss 0.5515 (0.4853)	grad_norm 2.3854 (2.8209)	mem 20675MB
[2025-04-03 04:36:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][452/573]	eta 0:01:47 lr 0.000005	time 0.8773 (0.8868)	loss 0.5575 (0.4852)	grad_norm 2.5209 (2.8195)	mem 20675MB
[2025-04-03 04:36:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][454/573]	eta 0:01:45 lr 0.000005	time 0.8773 (0.8867)	loss 0.5514 (0.4851)	grad_norm 2.3343 (2.8194)	mem 20675MB
[2025-04-03 04:36:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][456/573]	eta 0:01:43 lr 0.000005	time 0.8830 (0.8867)	loss 0.5743 (0.4856)	grad_norm 3.0228 (2.8209)	mem 20675MB
[2025-04-03 04:36:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][458/573]	eta 0:01:41 lr 0.000005	time 0.8774 (0.8867)	loss 0.5274 (0.4858)	grad_norm 2.2203 (2.8186)	mem 20675MB
[2025-04-03 04:36:36 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][460/573]	eta 0:01:40 lr 0.000005	time 0.8776 (0.8867)	loss 0.5742 (0.4860)	grad_norm 2.6896 (2.8207)	mem 20675MB
[2025-04-03 04:36:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][462/573]	eta 0:01:38 lr 0.000005	time 0.8780 (0.8866)	loss 0.4880 (0.4860)	grad_norm 2.1755 (2.8179)	mem 20675MB
[2025-04-03 04:36:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][464/573]	eta 0:01:36 lr 0.000005	time 0.8773 (0.8866)	loss 0.6069 (0.4863)	grad_norm 3.6563 (2.8191)	mem 20675MB
[2025-04-03 04:36:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][466/573]	eta 0:01:34 lr 0.000005	time 0.8800 (0.8866)	loss 0.4898 (0.4861)	grad_norm 2.1351 (2.8183)	mem 20675MB
[2025-04-03 04:36:43 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][468/573]	eta 0:01:33 lr 0.000005	time 0.8773 (0.8866)	loss 0.5178 (0.4864)	grad_norm 2.7457 (2.8164)	mem 20675MB
[2025-04-03 04:36:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][470/573]	eta 0:01:31 lr 0.000005	time 0.8781 (0.8866)	loss 0.5631 (0.4867)	grad_norm 2.1746 (2.8135)	mem 20675MB
[2025-04-03 04:36:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][472/573]	eta 0:01:29 lr 0.000005	time 0.8779 (0.8865)	loss 0.3213 (0.4864)	grad_norm 3.4867 (2.8142)	mem 20675MB
[2025-04-03 04:36:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][474/573]	eta 0:01:27 lr 0.000005	time 0.8779 (0.8865)	loss 0.5998 (0.4864)	grad_norm 3.0619 (2.8200)	mem 20675MB
[2025-04-03 04:36:50 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][476/573]	eta 0:01:25 lr 0.000005	time 0.8774 (0.8865)	loss 0.5615 (0.4863)	grad_norm 2.0055 (2.8176)	mem 20675MB
[2025-04-03 04:36:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][478/573]	eta 0:01:24 lr 0.000005	time 0.8777 (0.8864)	loss 0.3982 (0.4860)	grad_norm 2.9556 (2.8172)	mem 20675MB
[2025-04-03 04:36:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][480/573]	eta 0:01:22 lr 0.000005	time 0.8788 (0.8864)	loss 0.5458 (0.4863)	grad_norm 3.5883 (2.8178)	mem 20675MB
[2025-04-03 04:36:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][482/573]	eta 0:01:20 lr 0.000005	time 0.8772 (0.8864)	loss 0.4287 (0.4863)	grad_norm 2.7294 (2.8162)	mem 20675MB
[2025-04-03 04:36:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][484/573]	eta 0:01:18 lr 0.000005	time 0.8774 (0.8864)	loss 0.4703 (0.4860)	grad_norm 3.1006 (2.8180)	mem 20675MB
[2025-04-03 04:36:59 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][486/573]	eta 0:01:17 lr 0.000005	time 0.8772 (0.8863)	loss 0.5213 (0.4860)	grad_norm 2.1651 (2.8170)	mem 20675MB
[2025-04-03 04:37:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][488/573]	eta 0:01:15 lr 0.000005	time 0.8775 (0.8863)	loss 0.5101 (0.4861)	grad_norm 2.3252 (2.8176)	mem 20675MB
[2025-04-03 04:37:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][490/573]	eta 0:01:13 lr 0.000005	time 0.8778 (0.8863)	loss 0.4597 (0.4859)	grad_norm 2.4712 (2.8179)	mem 20675MB
[2025-04-03 04:37:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][492/573]	eta 0:01:11 lr 0.000005	time 0.8775 (0.8862)	loss 0.4856 (0.4856)	grad_norm 1.4337 (2.8145)	mem 20675MB
[2025-04-03 04:37:06 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][494/573]	eta 0:01:10 lr 0.000005	time 0.8776 (0.8862)	loss 0.5281 (0.4857)	grad_norm 2.4838 (2.8133)	mem 20675MB
[2025-04-03 04:37:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][496/573]	eta 0:01:08 lr 0.000005	time 0.8814 (0.8862)	loss 0.6030 (0.4858)	grad_norm 2.3218 (2.8134)	mem 20675MB
[2025-04-03 04:37:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][498/573]	eta 0:01:06 lr 0.000005	time 0.8771 (0.8862)	loss 0.3525 (0.4857)	grad_norm 3.6030 (2.8160)	mem 20675MB
[2025-04-03 04:37:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][500/573]	eta 0:01:04 lr 0.000005	time 0.8778 (0.8861)	loss 0.5161 (0.4858)	grad_norm 2.0275 (2.8132)	mem 20675MB
[2025-04-03 04:37:13 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][502/573]	eta 0:01:02 lr 0.000005	time 0.8775 (0.8861)	loss 0.3564 (0.4856)	grad_norm 3.8141 (2.8160)	mem 20675MB
[2025-04-03 04:37:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][504/573]	eta 0:01:01 lr 0.000005	time 0.8775 (0.8861)	loss 0.4768 (0.4857)	grad_norm 2.3220 (2.8160)	mem 20675MB
[2025-04-03 04:37:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][506/573]	eta 0:00:59 lr 0.000005	time 0.8775 (0.8860)	loss 0.5102 (0.4856)	grad_norm 2.2713 (2.8165)	mem 20675MB
[2025-04-03 04:37:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][508/573]	eta 0:00:57 lr 0.000004	time 0.8774 (0.8860)	loss 0.4911 (0.4853)	grad_norm 2.5335 (2.8176)	mem 20675MB
[2025-04-03 04:37:20 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][510/573]	eta 0:00:55 lr 0.000004	time 0.8790 (0.8860)	loss 0.3913 (0.4852)	grad_norm 3.0636 (2.8192)	mem 20675MB
[2025-04-03 04:37:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][512/573]	eta 0:00:54 lr 0.000004	time 0.8776 (0.8860)	loss 0.4697 (0.4849)	grad_norm 3.9112 (2.8251)	mem 20675MB
[2025-04-03 04:37:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][514/573]	eta 0:00:52 lr 0.000004	time 0.8786 (0.8859)	loss 0.3787 (0.4845)	grad_norm 2.2302 (2.8246)	mem 20675MB
[2025-04-03 04:37:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][516/573]	eta 0:00:50 lr 0.000004	time 0.8819 (0.8860)	loss 0.5549 (0.4847)	grad_norm 3.1064 (2.8258)	mem 20675MB
[2025-04-03 04:37:27 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][518/573]	eta 0:00:48 lr 0.000004	time 0.8775 (0.8859)	loss 0.4429 (0.4848)	grad_norm 2.5897 (2.8249)	mem 20675MB
[2025-04-03 04:37:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][520/573]	eta 0:00:46 lr 0.000004	time 0.8773 (0.8859)	loss 0.4607 (0.4849)	grad_norm 3.3396 (2.8256)	mem 20675MB
[2025-04-03 04:37:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][522/573]	eta 0:00:45 lr 0.000004	time 0.8770 (0.8859)	loss 0.5300 (0.4851)	grad_norm 2.4250 (2.8236)	mem 20675MB
[2025-04-03 04:37:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][524/573]	eta 0:00:43 lr 0.000004	time 0.8865 (0.8859)	loss 0.3655 (0.4847)	grad_norm 3.9015 (2.8281)	mem 20675MB
[2025-04-03 04:37:34 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][526/573]	eta 0:00:41 lr 0.000004	time 0.8774 (0.8858)	loss 0.5240 (0.4850)	grad_norm 2.6992 (2.8271)	mem 20675MB
[2025-04-03 04:37:36 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][528/573]	eta 0:00:39 lr 0.000004	time 0.8801 (0.8858)	loss 0.4766 (0.4851)	grad_norm 3.6070 (2.8282)	mem 20675MB
[2025-04-03 04:37:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][530/573]	eta 0:00:38 lr 0.000004	time 0.8778 (0.8858)	loss 0.5275 (0.4851)	grad_norm 3.3861 (2.8317)	mem 20675MB
[2025-04-03 04:37:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][532/573]	eta 0:00:36 lr 0.000004	time 0.8776 (0.8858)	loss 0.5786 (0.4852)	grad_norm 1.9722 (2.8284)	mem 20675MB
[2025-04-03 04:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][534/573]	eta 0:00:34 lr 0.000004	time 0.8786 (0.8858)	loss 0.5545 (0.4853)	grad_norm 2.1795 (2.8265)	mem 20675MB
[2025-04-03 04:37:43 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][536/573]	eta 0:00:32 lr 0.000004	time 0.8773 (0.8857)	loss 0.5216 (0.4853)	grad_norm 2.2930 (2.8239)	mem 20675MB
[2025-04-03 04:37:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][538/573]	eta 0:00:30 lr 0.000004	time 0.8775 (0.8857)	loss 0.5459 (0.4852)	grad_norm 2.2023 (2.8220)	mem 20675MB
[2025-04-03 04:37:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][540/573]	eta 0:00:29 lr 0.000004	time 0.8774 (0.8857)	loss 0.6408 (0.4856)	grad_norm 3.5476 (2.8229)	mem 20675MB
[2025-04-03 04:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][542/573]	eta 0:00:27 lr 0.000004	time 0.8775 (0.8856)	loss 0.5833 (0.4859)	grad_norm 2.4794 (2.8206)	mem 20675MB
[2025-04-03 04:37:50 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][544/573]	eta 0:00:25 lr 0.000004	time 0.8783 (0.8856)	loss 0.3272 (0.4856)	grad_norm 2.8871 (2.8212)	mem 20675MB
[2025-04-03 04:37:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][546/573]	eta 0:00:23 lr 0.000004	time 0.8777 (0.8856)	loss 0.3543 (0.4856)	grad_norm 2.3853 (2.8189)	mem 20675MB
[2025-04-03 04:37:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][548/573]	eta 0:00:22 lr 0.000004	time 0.8787 (0.8856)	loss 0.5596 (0.4858)	grad_norm 1.8891 (2.8162)	mem 20675MB
[2025-04-03 04:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][550/573]	eta 0:00:20 lr 0.000004	time 0.8775 (0.8856)	loss 0.4192 (0.4859)	grad_norm 3.3763 (2.8169)	mem 20675MB
[2025-04-03 04:37:57 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][552/573]	eta 0:00:18 lr 0.000004	time 0.8773 (0.8856)	loss 0.3649 (0.4854)	grad_norm 3.7683 (2.8205)	mem 20675MB
[2025-04-03 04:37:59 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][554/573]	eta 0:00:16 lr 0.000004	time 0.8829 (0.8856)	loss 0.4925 (0.4854)	grad_norm 3.7574 (2.8213)	mem 20675MB
[2025-04-03 04:38:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][556/573]	eta 0:00:15 lr 0.000004	time 0.8782 (0.8855)	loss 0.4373 (0.4855)	grad_norm 3.6430 (2.8226)	mem 20675MB
[2025-04-03 04:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][558/573]	eta 0:00:13 lr 0.000004	time 0.8776 (0.8855)	loss 0.5151 (0.4857)	grad_norm 3.5360 (2.8233)	mem 20675MB
[2025-04-03 04:38:04 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][560/573]	eta 0:00:11 lr 0.000004	time 0.8776 (0.8855)	loss 0.5576 (0.4858)	grad_norm 1.8409 (2.8211)	mem 20675MB
[2025-04-03 04:38:06 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][562/573]	eta 0:00:09 lr 0.000004	time 0.8771 (0.8855)	loss 0.4893 (0.4858)	grad_norm 1.9165 (2.8193)	mem 20675MB
[2025-04-03 04:38:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][564/573]	eta 0:00:07 lr 0.000004	time 0.8772 (0.8854)	loss 0.5074 (0.4858)	grad_norm 3.1531 (2.8189)	mem 20675MB
[2025-04-03 04:38:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][566/573]	eta 0:00:06 lr 0.000004	time 0.8770 (0.8854)	loss 0.4162 (0.4855)	grad_norm 3.3940 (2.8203)	mem 20675MB
[2025-04-03 04:38:11 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][568/573]	eta 0:00:04 lr 0.000004	time 0.8815 (0.8854)	loss 0.5724 (0.4857)	grad_norm 2.3875 (2.8299)	mem 20675MB
[2025-04-03 04:38:13 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][570/573]	eta 0:00:02 lr 0.000004	time 0.8816 (0.8854)	loss 0.5402 (0.4857)	grad_norm 2.7013 (2.8313)	mem 20675MB
[2025-04-03 04:38:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][572/573]	eta 0:00:00 lr 0.000004	time 0.8774 (0.8854)	loss 0.5098 (0.4858)	grad_norm 2.6345 (2.8313)	mem 20675MB
[2025-04-03 04:38:15 simmim_finetune] (main_finetune.py 260): INFO EPOCH 28 training takes 0:08:27
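The `time 0.8774 (0.8854)` and `loss 0.5098 (0.4858)` fields in the train lines above follow the common `current (running average)` convention, and `eta` is the average step time times the steps remaining in the epoch. A minimal sketch of how such lines could be produced (the names `AverageMeter` and `log_line` are illustrative, not necessarily the actual utilities in `main_finetune.py`):

```python
import datetime

class AverageMeter:
    """Tracks the most recent value and the running average."""
    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

def log_line(epoch, epochs, idx, num_steps, lr, batch_time, loss):
    # eta = average step time x steps remaining in this epoch
    eta = datetime.timedelta(seconds=int(batch_time.avg * (num_steps - idx)))
    return (f"Train: [{epoch}/{epochs}][{idx}/{num_steps}]\t"
            f"eta {eta} lr {lr:.6f}\t"
            f"time {batch_time.val:.4f} ({batch_time.avg:.4f})\t"
            f"loss {loss.val:.4f} ({loss.avg:.4f})")
```

With an average step time of ~0.887 s and 143 steps remaining at iteration 430, this yields the `eta 0:02:06` seen above.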
[2025-04-03 04:38:18 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 3.035 (3.035)	Loss 0.5058 (0.5058)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 04:38:19 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.290 (1.203)	Loss 0.4640 (0.4706)	Acc@1 75.781 (74.479)	Mem 20675MB
[2025-04-03 04:38:19 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.836)	Loss 0.4845 (0.4639)	Acc@1 74.219 (75.312)	Mem 20675MB
[2025-04-03 04:38:20 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.283 (0.678)	Loss 0.4128 (0.4501)	Acc@1 82.031 (77.121)	Mem 20675MB
[2025-04-03 04:38:20 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.591)	Loss 0.5056 (0.4503)	Acc@1 74.219 (77.344)	Mem 20675MB
[2025-04-03 04:38:21 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.535)	Loss 0.4733 (0.4599)	Acc@1 83.594 (77.557)	Mem 20675MB
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.496)	Loss 0.4742 (0.4614)	Acc@1 78.906 (77.764)	Mem 20675MB
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.468)	Loss 0.4338 (0.4575)	Acc@1 78.906 (78.073)	Mem 20675MB
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.125
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.1%
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
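The final `Acc@1 78.125` above is the sample-weighted average of the per-batch accuracies over the 16 test batches: 15 full batches of 128 images plus one of 64, for the 1984 test images reported. A hedged sketch of that aggregation (the batch values used in the test are illustrative, not the real ones):

```python
def weighted_accuracy(batch_accs, batch_sizes):
    """Sample-weighted average of per-batch top-1 accuracies."""
    total = sum(batch_sizes)
    return sum(a * n for a, n in zip(batch_accs, batch_sizes)) / total

# The test set in this run: 15 x 128 + 1 x 64 = 1984 images.
sizes = [128] * 15 + [64]
assert sum(sizes) == 1984
```

This is why the running `(…)` accuracy in the `Test:` lines drifts as later batches arrive: each batch contributes in proportion to its size.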
[2025-04-03 04:38:22 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.6201557879481894e-07, 2.6201557879481894e-07, 2.6885551133657125e-07, 2.6885551133657125e-07, 2.793784844777287e-07, 2.793784844777287e-07, 2.9556767392566325e-07, 2.9556767392566325e-07, 3.204741192301779e-07, 3.204741192301779e-07, 3.587917273909696e-07, 3.587917273909696e-07, 4.177418937921877e-07, 4.177418937921877e-07, 5.084344574863693e-07, 5.084344574863693e-07, 6.47961478554341e-07, 6.47961478554341e-07, 8.626184340435283e-07, 8.626184340435283e-07, 1.1928599040268934e-06, 1.1928599040268934e-06, 1.7009237040013012e-06, 1.7009237040013012e-06, 2.4825603193465444e-06, 2.4825603193465444e-06, 3.6850781891084563e-06, 3.6850781891084563e-06]
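The 28 per-group learning rates above come in 14 pairs (each layer is split into weight-decay and no-weight-decay groups), with consecutive pairs differing by a factor of roughly 1/0.65 — the `TRAIN.LAYER_DECAY: 0.65` setting. A minimal sketch of BEiT-style layer-wise LR decay under the assumption of 14 layer ids for this 12-block ViT (the exact grouping in `main_finetune.py` may differ):

```python
def layer_lr_scales(depth=12, decay=0.65):
    """Per-layer LR multipliers: earlier layers get geometrically smaller LRs."""
    num_layers = depth + 2  # assumed: patch embed + 12 blocks + final layer
    return [decay ** (num_layers - i) for i in range(num_layers)]

# Each scale appears twice in the log: once for the weight-decay group and
# once for the no-weight-decay (bias/norm) group of the same layer.
scales = layer_lr_scales()
```

Multiplying these scales into the cosine-scheduled base LR reproduces the geometric spread visible in the group list, from ~2.6e-7 at the embedding up to ~3.7e-6 at the top.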
[2025-04-03 04:38:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][0/573]	eta 0:34:44 lr 0.000004	time 3.6383 (3.6383)	loss 0.5279 (0.5279)	grad_norm 1.6440 (1.6440)	mem 20675MB
[2025-04-03 04:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][2/573]	eta 0:17:07 lr 0.000004	time 0.8776 (1.7992)	loss 0.4337 (0.4407)	grad_norm 3.8653 (2.7203)	mem 20675MB
[2025-04-03 04:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][4/573]	eta 0:13:34 lr 0.000004	time 0.8781 (1.4310)	loss 0.4444 (0.4713)	grad_norm 2.5453 (2.7134)	mem 20675MB
[2025-04-03 04:38:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][6/573]	eta 0:12:01 lr 0.000004	time 0.8770 (1.2731)	loss 0.4432 (0.4570)	grad_norm 1.7730 (2.6859)	mem 20675MB
[2025-04-03 04:38:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][8/573]	eta 0:11:10 lr 0.000004	time 0.8868 (1.1873)	loss 0.5929 (0.4825)	grad_norm 2.9544 (2.6955)	mem 20675MB
[2025-04-03 04:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][10/573]	eta 0:10:37 lr 0.000004	time 0.8816 (1.1315)	loss 0.4871 (0.4841)	grad_norm 2.0926 (2.5733)	mem 20675MB
[2025-04-03 04:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][12/573]	eta 0:10:12 lr 0.000004	time 0.8791 (1.0927)	loss 0.5312 (0.4833)	grad_norm 2.1673 (2.5730)	mem 20675MB
[2025-04-03 04:38:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][14/573]	eta 0:09:54 lr 0.000004	time 0.8785 (1.0642)	loss 0.5898 (0.4919)	grad_norm 2.1267 (2.5143)	mem 20675MB
[2025-04-03 04:38:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][16/573]	eta 0:09:41 lr 0.000003	time 0.8817 (1.0432)	loss 0.4348 (0.4910)	grad_norm 2.1359 (2.4881)	mem 20675MB
[2025-04-03 04:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][18/573]	eta 0:09:29 lr 0.000003	time 0.8775 (1.0259)	loss 0.3040 (0.4814)	grad_norm 3.1200 (2.4911)	mem 20675MB
[2025-04-03 04:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][20/573]	eta 0:09:19 lr 0.000003	time 0.8773 (1.0118)	loss 0.5849 (0.4817)	grad_norm 2.2678 (2.5221)	mem 20675MB
[2025-04-03 04:38:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][22/573]	eta 0:09:11 lr 0.000003	time 0.8774 (1.0002)	loss 0.5331 (0.4863)	grad_norm 1.9746 (2.4721)	mem 20675MB
[2025-04-03 04:38:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][24/573]	eta 0:09:03 lr 0.000003	time 0.8775 (0.9905)	loss 0.3236 (0.4828)	grad_norm 2.9227 (2.4659)	mem 20675MB
[2025-04-03 04:38:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][26/573]	eta 0:08:57 lr 0.000003	time 0.8777 (0.9823)	loss 0.5505 (0.4852)	grad_norm 2.7015 (2.4845)	mem 20675MB
[2025-04-03 04:38:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][28/573]	eta 0:08:51 lr 0.000003	time 0.8774 (0.9751)	loss 0.4985 (0.4803)	grad_norm 2.4553 (2.4947)	mem 20675MB
[2025-04-03 04:38:53 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][30/573]	eta 0:08:46 lr 0.000003	time 0.8771 (0.9689)	loss 0.3419 (0.4774)	grad_norm 4.5424 (2.5574)	mem 20675MB
[2025-04-03 04:38:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][32/573]	eta 0:08:41 lr 0.000003	time 0.8810 (0.9635)	loss 0.5220 (0.4788)	grad_norm 3.0076 (2.5463)	mem 20675MB
[2025-04-03 04:38:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][34/573]	eta 0:08:36 lr 0.000003	time 0.8809 (0.9587)	loss 0.3778 (0.4760)	grad_norm 2.2224 (2.5364)	mem 20675MB
[2025-04-03 04:38:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][36/573]	eta 0:08:32 lr 0.000003	time 0.8825 (0.9546)	loss 0.5399 (0.4758)	grad_norm 2.7100 (2.5838)	mem 20675MB
[2025-04-03 04:39:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][38/573]	eta 0:08:28 lr 0.000003	time 0.8862 (0.9509)	loss 0.5291 (0.4788)	grad_norm 2.1255 (2.5875)	mem 20675MB
[2025-04-03 04:39:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][40/573]	eta 0:08:25 lr 0.000003	time 0.8818 (0.9475)	loss 0.4009 (0.4765)	grad_norm 4.5289 (2.6307)	mem 20675MB
[2025-04-03 04:39:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][42/573]	eta 0:08:21 lr 0.000003	time 0.8776 (0.9447)	loss 0.3443 (0.4746)	grad_norm 2.0627 (2.6331)	mem 20675MB
[2025-04-03 04:39:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][44/573]	eta 0:08:18 lr 0.000003	time 0.8781 (0.9418)	loss 0.5028 (0.4782)	grad_norm 2.4972 (2.6245)	mem 20675MB
[2025-04-03 04:39:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][46/573]	eta 0:08:15 lr 0.000003	time 0.8912 (0.9394)	loss 0.5624 (0.4812)	grad_norm 2.0916 (2.6011)	mem 20675MB
[2025-04-03 04:39:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][48/573]	eta 0:08:11 lr 0.000003	time 0.8824 (0.9370)	loss 0.3910 (0.4763)	grad_norm 2.9908 (2.6138)	mem 20675MB
[2025-04-03 04:39:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][50/573]	eta 0:08:08 lr 0.000003	time 0.8776 (0.9347)	loss 0.5727 (0.4794)	grad_norm 2.3254 (2.6026)	mem 20675MB
[2025-04-03 04:39:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][52/573]	eta 0:08:05 lr 0.000003	time 0.8780 (0.9328)	loss 0.6474 (0.4805)	grad_norm 2.6526 (2.6193)	mem 20675MB
[2025-04-03 04:39:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][54/573]	eta 0:08:03 lr 0.000003	time 0.8778 (0.9310)	loss 0.6152 (0.4829)	grad_norm 2.0753 (2.6172)	mem 20675MB
[2025-04-03 04:39:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][56/573]	eta 0:08:00 lr 0.000003	time 0.8789 (0.9293)	loss 0.3430 (0.4799)	grad_norm 2.3333 (2.6113)	mem 20675MB
[2025-04-03 04:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][58/573]	eta 0:07:57 lr 0.000003	time 0.8785 (0.9276)	loss 0.3610 (0.4787)	grad_norm 3.1901 (2.6032)	mem 20675MB
[2025-04-03 04:39:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][60/573]	eta 0:07:55 lr 0.000003	time 0.8776 (0.9260)	loss 0.5249 (0.4794)	grad_norm 2.4622 (2.6138)	mem 20675MB
[2025-04-03 04:39:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][62/573]	eta 0:07:52 lr 0.000003	time 0.8795 (0.9245)	loss 0.3695 (0.4776)	grad_norm 2.9364 (2.6484)	mem 20675MB
[2025-04-03 04:39:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][64/573]	eta 0:07:49 lr 0.000003	time 0.8779 (0.9233)	loss 0.6340 (0.4805)	grad_norm 2.2989 (2.6420)	mem 20675MB
[2025-04-03 04:39:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][66/573]	eta 0:07:47 lr 0.000003	time 0.8776 (0.9220)	loss 0.3902 (0.4802)	grad_norm 2.6186 (2.6299)	mem 20675MB
[2025-04-03 04:39:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][68/573]	eta 0:07:45 lr 0.000003	time 0.8792 (0.9208)	loss 0.5819 (0.4815)	grad_norm 2.2955 (2.6488)	mem 20675MB
[2025-04-03 04:39:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][70/573]	eta 0:07:42 lr 0.000003	time 0.8789 (0.9198)	loss 0.5118 (0.4818)	grad_norm 2.2171 (2.6639)	mem 20675MB
[2025-04-03 04:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][72/573]	eta 0:07:40 lr 0.000003	time 0.8826 (0.9187)	loss 0.5594 (0.4845)	grad_norm 1.8262 (2.6462)	mem 20675MB
[2025-04-03 04:39:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][74/573]	eta 0:07:37 lr 0.000003	time 0.8809 (0.9178)	loss 0.4905 (0.4827)	grad_norm 2.8890 (2.6628)	mem 20675MB
[2025-04-03 04:39:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][76/573]	eta 0:07:35 lr 0.000003	time 0.9009 (0.9172)	loss 0.4109 (0.4838)	grad_norm 2.0652 (2.6631)	mem 20675MB
[2025-04-03 04:39:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][78/573]	eta 0:07:33 lr 0.000003	time 0.8850 (0.9163)	loss 0.4914 (0.4854)	grad_norm 2.4396 (2.6639)	mem 20675MB
[2025-04-03 04:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][80/573]	eta 0:07:31 lr 0.000003	time 0.8797 (0.9154)	loss 0.5549 (0.4865)	grad_norm 3.0462 (2.6662)	mem 20675MB
[2025-04-03 04:39:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][82/573]	eta 0:07:29 lr 0.000003	time 0.8785 (0.9146)	loss 0.5190 (0.4887)	grad_norm 4.1812 (2.6955)	mem 20675MB
[2025-04-03 04:39:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][84/573]	eta 0:07:26 lr 0.000003	time 0.8781 (0.9137)	loss 0.5630 (0.4903)	grad_norm 2.5626 (2.6942)	mem 20675MB
[2025-04-03 04:39:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][86/573]	eta 0:07:24 lr 0.000003	time 0.8775 (0.9131)	loss 0.4949 (0.4897)	grad_norm 3.4657 (2.7050)	mem 20675MB
[2025-04-03 04:39:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][88/573]	eta 0:07:22 lr 0.000003	time 0.8815 (0.9123)	loss 0.5933 (0.4900)	grad_norm 2.6007 (2.7034)	mem 20675MB
[2025-04-03 04:39:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][90/573]	eta 0:07:20 lr 0.000003	time 0.8783 (0.9116)	loss 0.4920 (0.4912)	grad_norm 2.1150 (2.6901)	mem 20675MB
[2025-04-03 04:39:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][92/573]	eta 0:07:18 lr 0.000003	time 0.8872 (0.9110)	loss 0.5025 (0.4914)	grad_norm 1.7559 (2.6748)	mem 20675MB
[2025-04-03 04:39:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][94/573]	eta 0:07:16 lr 0.000003	time 0.8777 (0.9104)	loss 0.3589 (0.4905)	grad_norm 3.3918 (2.6783)	mem 20675MB
[2025-04-03 04:39:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][96/573]	eta 0:07:13 lr 0.000003	time 0.8774 (0.9098)	loss 0.5585 (0.4922)	grad_norm 1.7583 (2.6605)	mem 20675MB
[2025-04-03 04:39:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][98/573]	eta 0:07:11 lr 0.000003	time 0.8783 (0.9092)	loss 0.4141 (0.4913)	grad_norm 3.5230 (2.6781)	mem 20675MB
[2025-04-03 04:39:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][100/573]	eta 0:07:09 lr 0.000003	time 0.8775 (0.9086)	loss 0.3562 (0.4906)	grad_norm 2.6534 (2.6789)	mem 20675MB
[2025-04-03 04:39:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][102/573]	eta 0:07:07 lr 0.000003	time 0.8890 (0.9081)	loss 0.3946 (0.4882)	grad_norm 3.6274 (2.6912)	mem 20675MB
[2025-04-03 04:39:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][104/573]	eta 0:07:05 lr 0.000003	time 0.8784 (0.9076)	loss 0.4813 (0.4888)	grad_norm 3.4148 (2.6929)	mem 20675MB
[2025-04-03 04:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][106/573]	eta 0:07:03 lr 0.000003	time 0.8773 (0.9070)	loss 0.4993 (0.4884)	grad_norm 3.3923 (2.6943)	mem 20675MB
[2025-04-03 04:40:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][108/573]	eta 0:07:01 lr 0.000003	time 0.8834 (0.9066)	loss 0.3811 (0.4871)	grad_norm 2.7727 (2.6948)	mem 20675MB
[2025-04-03 04:40:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][110/573]	eta 0:06:59 lr 0.000002	time 0.8780 (0.9062)	loss 0.5144 (0.4875)	grad_norm 2.3626 (2.6943)	mem 20675MB
[2025-04-03 04:40:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][112/573]	eta 0:06:57 lr 0.000002	time 0.8776 (0.9057)	loss 0.5028 (0.4863)	grad_norm 1.6350 (2.6900)	mem 20675MB
[2025-04-03 04:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][114/573]	eta 0:06:55 lr 0.000002	time 0.8780 (0.9052)	loss 0.5227 (0.4864)	grad_norm 2.2090 (2.6827)	mem 20675MB
[2025-04-03 04:40:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][116/573]	eta 0:06:53 lr 0.000002	time 0.8784 (0.9048)	loss 0.5435 (0.4864)	grad_norm 2.4107 (2.6786)	mem 20675MB
[2025-04-03 04:40:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][118/573]	eta 0:06:51 lr 0.000002	time 0.8801 (0.9044)	loss 0.4911 (0.4860)	grad_norm 2.4468 (2.6849)	mem 20675MB
[2025-04-03 04:40:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][120/573]	eta 0:06:49 lr 0.000002	time 0.8865 (0.9041)	loss 0.4737 (0.4862)	grad_norm 2.1446 (2.6898)	mem 20675MB
[2025-04-03 04:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][122/573]	eta 0:06:47 lr 0.000002	time 0.8888 (0.9038)	loss 0.5293 (0.4862)	grad_norm 2.7713 (2.6990)	mem 20675MB
[2025-04-03 04:40:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][124/573]	eta 0:06:45 lr 0.000002	time 0.8776 (0.9034)	loss 0.5351 (0.4871)	grad_norm 3.9420 (2.7032)	mem 20675MB
[2025-04-03 04:40:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][126/573]	eta 0:06:43 lr 0.000002	time 0.8774 (0.9030)	loss 0.5748 (0.4888)	grad_norm 1.6441 (2.6888)	mem 20675MB
[2025-04-03 04:40:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][128/573]	eta 0:06:41 lr 0.000002	time 0.8773 (0.9027)	loss 0.4742 (0.4884)	grad_norm 2.5816 (2.6850)	mem 20675MB
[2025-04-03 04:40:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][130/573]	eta 0:06:39 lr 0.000002	time 0.8954 (0.9024)	loss 0.4724 (0.4894)	grad_norm 2.1937 (2.6821)	mem 20675MB
[2025-04-03 04:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][132/573]	eta 0:06:37 lr 0.000002	time 0.8774 (0.9021)	loss 0.3241 (0.4885)	grad_norm 3.2312 (2.6918)	mem 20675MB
[2025-04-03 04:40:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][134/573]	eta 0:06:35 lr 0.000002	time 0.8777 (0.9018)	loss 0.5699 (0.4900)	grad_norm 2.1775 (2.6896)	mem 20675MB
[2025-04-03 04:40:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][136/573]	eta 0:06:33 lr 0.000002	time 0.8847 (0.9015)	loss 0.6142 (0.4917)	grad_norm 2.1884 (2.6861)	mem 20675MB
[2025-04-03 04:40:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][138/573]	eta 0:06:32 lr 0.000002	time 0.8778 (0.9012)	loss 0.4810 (0.4920)	grad_norm 3.3782 (2.6858)	mem 20675MB
[2025-04-03 04:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][140/573]	eta 0:06:30 lr 0.000002	time 0.8773 (0.9009)	loss 0.4745 (0.4921)	grad_norm 2.0222 (2.6847)	mem 20675MB
[2025-04-03 04:40:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][142/573]	eta 0:06:28 lr 0.000002	time 0.8831 (0.9006)	loss 0.4297 (0.4915)	grad_norm 4.5495 (2.6974)	mem 20675MB
[2025-04-03 04:40:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][144/573]	eta 0:06:26 lr 0.000002	time 0.8779 (0.9003)	loss 0.4982 (0.4929)	grad_norm 1.9125 (2.6932)	mem 20675MB
[2025-04-03 04:40:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][146/573]	eta 0:06:24 lr 0.000002	time 0.8822 (0.9001)	loss 0.6272 (0.4935)	grad_norm 2.0938 (2.6897)	mem 20675MB
[2025-04-03 04:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][148/573]	eta 0:06:22 lr 0.000002	time 0.8780 (0.8999)	loss 0.5264 (0.4943)	grad_norm 2.5211 (2.6846)	mem 20675MB
[2025-04-03 04:40:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][150/573]	eta 0:06:20 lr 0.000002	time 0.8778 (0.8996)	loss 0.5165 (0.4941)	grad_norm 2.2934 (2.6846)	mem 20675MB
[2025-04-03 04:40:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][152/573]	eta 0:06:18 lr 0.000002	time 0.8771 (0.8997)	loss 0.5219 (0.4949)	grad_norm 1.8768 (2.6785)	mem 20675MB
[2025-04-03 04:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][154/573]	eta 0:06:16 lr 0.000002	time 0.8937 (0.8995)	loss 0.6080 (0.4956)	grad_norm 2.1392 (2.6789)	mem 20675MB
[2025-04-03 04:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][156/573]	eta 0:06:15 lr 0.000002	time 0.8832 (0.8993)	loss 0.5344 (0.4962)	grad_norm 2.4092 (2.6715)	mem 20675MB
[2025-04-03 04:40:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][158/573]	eta 0:06:13 lr 0.000002	time 0.8769 (0.8991)	loss 0.5626 (0.4970)	grad_norm 2.7245 (2.6691)	mem 20675MB
[2025-04-03 04:40:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][160/573]	eta 0:06:11 lr 0.000002	time 0.8774 (0.8989)	loss 0.4127 (0.4967)	grad_norm 4.0873 (2.6765)	mem 20675MB
[2025-04-03 04:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][162/573]	eta 0:06:09 lr 0.000002	time 0.8773 (0.8986)	loss 0.4437 (0.4963)	grad_norm 3.1638 (2.6802)	mem 20675MB
[2025-04-03 04:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][164/573]	eta 0:06:07 lr 0.000002	time 0.8776 (0.8984)	loss 0.5188 (0.4971)	grad_norm 3.2060 (2.6809)	mem 20675MB
[2025-04-03 04:40:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][166/573]	eta 0:06:05 lr 0.000002	time 0.8785 (0.8981)	loss 0.5713 (0.4967)	grad_norm 3.0589 (2.6848)	mem 20675MB
[2025-04-03 04:40:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][168/573]	eta 0:06:03 lr 0.000002	time 0.8838 (0.8979)	loss 0.4416 (0.4957)	grad_norm 3.4809 (2.6884)	mem 20675MB
[2025-04-03 04:40:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][170/573]	eta 0:06:01 lr 0.000002	time 0.8776 (0.8977)	loss 0.4595 (0.4956)	grad_norm 3.8700 (2.6996)	mem 20675MB
[2025-04-03 04:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][172/573]	eta 0:05:59 lr 0.000002	time 0.8896 (0.8977)	loss 0.4362 (0.4949)	grad_norm 2.9428 (2.7066)	mem 20675MB
[2025-04-03 04:41:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][174/573]	eta 0:05:58 lr 0.000002	time 0.8827 (0.8976)	loss 0.5491 (0.4945)	grad_norm 2.2229 (2.7024)	mem 20675MB
[2025-04-03 04:41:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][176/573]	eta 0:05:56 lr 0.000002	time 0.8777 (0.8974)	loss 0.5297 (0.4940)	grad_norm 2.0609 (2.6990)	mem 20675MB
[2025-04-03 04:41:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][178/573]	eta 0:05:54 lr 0.000002	time 0.8772 (0.8972)	loss 0.3896 (0.4936)	grad_norm 4.5325 (2.7120)	mem 20675MB
[2025-04-03 04:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][180/573]	eta 0:05:52 lr 0.000002	time 0.8847 (0.8970)	loss 0.5169 (0.4935)	grad_norm 2.4263 (2.7076)	mem 20675MB
[2025-04-03 04:41:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][182/573]	eta 0:05:50 lr 0.000002	time 0.8783 (0.8968)	loss 0.4991 (0.4937)	grad_norm 2.6748 (2.7023)	mem 20675MB
[2025-04-03 04:41:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][184/573]	eta 0:05:48 lr 0.000002	time 0.8779 (0.8966)	loss 0.2993 (0.4922)	grad_norm 2.8258 (2.7058)	mem 20675MB
[2025-04-03 04:41:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][186/573]	eta 0:05:46 lr 0.000002	time 0.9020 (0.8966)	loss 0.5500 (0.4930)	grad_norm 2.8196 (2.7025)	mem 20675MB
[2025-04-03 04:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][188/573]	eta 0:05:45 lr 0.000002	time 0.8788 (0.8964)	loss 0.4798 (0.4930)	grad_norm 5.4226 (2.7134)	mem 20675MB
[2025-04-03 04:41:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][190/573]	eta 0:05:43 lr 0.000002	time 0.8774 (0.8962)	loss 0.5633 (0.4939)	grad_norm 2.8689 (2.7137)	mem 20675MB
[2025-04-03 04:41:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][192/573]	eta 0:05:41 lr 0.000002	time 0.8887 (0.8961)	loss 0.3170 (0.4922)	grad_norm 4.0911 (2.7241)	mem 20675MB
[2025-04-03 04:41:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][194/573]	eta 0:05:39 lr 0.000002	time 0.8774 (0.8959)	loss 0.4670 (0.4922)	grad_norm 6.4458 (2.7402)	mem 20675MB
[2025-04-03 04:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][196/573]	eta 0:05:37 lr 0.000002	time 0.8781 (0.8957)	loss 0.5744 (0.4925)	grad_norm 3.5903 (2.7446)	mem 20675MB
[2025-04-03 04:41:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][198/573]	eta 0:05:35 lr 0.000002	time 0.8775 (0.8956)	loss 0.3993 (0.4924)	grad_norm 3.9571 (2.7469)	mem 20675MB
[2025-04-03 04:41:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][200/573]	eta 0:05:33 lr 0.000002	time 0.8776 (0.8954)	loss 0.4859 (0.4915)	grad_norm 2.9536 (2.7587)	mem 20675MB
[2025-04-03 04:41:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][202/573]	eta 0:05:32 lr 0.000002	time 0.8775 (0.8953)	loss 0.5380 (0.4918)	grad_norm 1.7632 (2.7567)	mem 20675MB
[2025-04-03 04:41:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][204/573]	eta 0:05:30 lr 0.000002	time 0.8772 (0.8951)	loss 0.4156 (0.4909)	grad_norm 2.0871 (2.7605)	mem 20675MB
[2025-04-03 04:41:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][206/573]	eta 0:05:28 lr 0.000002	time 0.8780 (0.8950)	loss 0.4619 (0.4910)	grad_norm 2.5358 (2.7558)	mem 20675MB
[2025-04-03 04:41:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][208/573]	eta 0:05:26 lr 0.000002	time 0.8800 (0.8948)	loss 0.5507 (0.4914)	grad_norm 2.0380 (2.7519)	mem 20675MB
[2025-04-03 04:41:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][210/573]	eta 0:05:24 lr 0.000002	time 0.8774 (0.8947)	loss 0.5464 (0.4917)	grad_norm 2.6505 (2.7503)	mem 20675MB
[2025-04-03 04:41:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][212/573]	eta 0:05:22 lr 0.000002	time 0.8774 (0.8946)	loss 0.5126 (0.4916)	grad_norm 2.5102 (2.7467)	mem 20675MB
[2025-04-03 04:41:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][214/573]	eta 0:05:21 lr 0.000002	time 0.8774 (0.8944)	loss 0.4031 (0.4914)	grad_norm 6.1747 (2.7600)	mem 20675MB
[2025-04-03 04:41:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][216/573]	eta 0:05:19 lr 0.000002	time 0.8779 (0.8943)	loss 0.4400 (0.4912)	grad_norm 2.5858 (2.7570)	mem 20675MB
[2025-04-03 04:41:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][218/573]	eta 0:05:17 lr 0.000002	time 0.8829 (0.8942)	loss 0.4864 (0.4917)	grad_norm 2.5737 (2.7566)	mem 20675MB
[2025-04-03 04:41:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][220/573]	eta 0:05:15 lr 0.000002	time 0.8787 (0.8941)	loss 0.4727 (0.4917)	grad_norm 1.8231 (2.7535)	mem 20675MB
[2025-04-03 04:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][222/573]	eta 0:05:13 lr 0.000002	time 0.8804 (0.8940)	loss 0.5496 (0.4922)	grad_norm 2.8002 (2.7519)	mem 20675MB
[2025-04-03 04:41:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][224/573]	eta 0:05:11 lr 0.000002	time 0.8907 (0.8939)	loss 0.4971 (0.4922)	grad_norm 2.0697 (2.7509)	mem 20675MB
[2025-04-03 04:41:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][226/573]	eta 0:05:10 lr 0.000002	time 0.8790 (0.8938)	loss 0.4628 (0.4920)	grad_norm 2.0300 (2.7456)	mem 20675MB
[2025-04-03 04:41:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][228/573]	eta 0:05:08 lr 0.000001	time 0.8774 (0.8937)	loss 0.5241 (0.4919)	grad_norm 3.0288 (2.7443)	mem 20675MB
[2025-04-03 04:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][230/573]	eta 0:05:06 lr 0.000001	time 0.8786 (0.8936)	loss 0.5882 (0.4924)	grad_norm 2.7018 (2.7443)	mem 20675MB
[2025-04-03 04:41:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][232/573]	eta 0:05:04 lr 0.000001	time 0.8780 (0.8936)	loss 0.6183 (0.4934)	grad_norm 3.2057 (2.7444)	mem 20675MB
[2025-04-03 04:41:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][234/573]	eta 0:05:02 lr 0.000001	time 0.8810 (0.8934)	loss 0.4533 (0.4931)	grad_norm 2.9248 (2.7475)	mem 20675MB
[2025-04-03 04:41:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][236/573]	eta 0:05:01 lr 0.000001	time 0.8777 (0.8934)	loss 0.5172 (0.4926)	grad_norm 2.5830 (2.7468)	mem 20675MB
[2025-04-03 04:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][238/573]	eta 0:04:59 lr 0.000001	time 0.8776 (0.8933)	loss 0.5197 (0.4925)	grad_norm 1.7491 (2.7418)	mem 20675MB
[2025-04-03 04:41:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][240/573]	eta 0:04:57 lr 0.000001	time 0.8777 (0.8932)	loss 0.3566 (0.4921)	grad_norm 4.9235 (2.7588)	mem 20675MB
[2025-04-03 04:41:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][242/573]	eta 0:04:55 lr 0.000001	time 0.8776 (0.8931)	loss 0.5357 (0.4914)	grad_norm 2.9161 (2.7627)	mem 20675MB
[2025-04-03 04:42:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][244/573]	eta 0:04:53 lr 0.000001	time 0.8776 (0.8930)	loss 0.3464 (0.4913)	grad_norm 4.2551 (2.7679)	mem 20675MB
[2025-04-03 04:42:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][246/573]	eta 0:04:51 lr 0.000001	time 0.8780 (0.8929)	loss 0.6142 (0.4920)	grad_norm 2.4495 (2.7677)	mem 20675MB
[2025-04-03 04:42:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][248/573]	eta 0:04:50 lr 0.000001	time 0.8779 (0.8927)	loss 0.5759 (0.4924)	grad_norm 1.9206 (2.7650)	mem 20675MB
[2025-04-03 04:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][250/573]	eta 0:04:48 lr 0.000001	time 0.8781 (0.8926)	loss 0.6312 (0.4929)	grad_norm 2.4520 (2.7598)	mem 20675MB
[2025-04-03 04:42:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][252/573]	eta 0:04:46 lr 0.000001	time 0.8777 (0.8925)	loss 0.5671 (0.4930)	grad_norm 2.7667 (2.7600)	mem 20675MB
[2025-04-03 04:42:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][254/573]	eta 0:04:44 lr 0.000001	time 0.8836 (0.8924)	loss 0.5568 (0.4928)	grad_norm 3.3905 (2.7667)	mem 20675MB
[2025-04-03 04:42:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][256/573]	eta 0:04:42 lr 0.000001	time 0.8781 (0.8924)	loss 0.5674 (0.4930)	grad_norm 2.6084 (2.7619)	mem 20675MB
[2025-04-03 04:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][258/573]	eta 0:04:41 lr 0.000001	time 0.8780 (0.8922)	loss 0.4902 (0.4930)	grad_norm 2.8615 (2.7628)	mem 20675MB
[2025-04-03 04:42:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][260/573]	eta 0:04:39 lr 0.000001	time 0.8777 (0.8922)	loss 0.3163 (0.4923)	grad_norm 2.9726 (2.7610)	mem 20675MB
[2025-04-03 04:42:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][262/573]	eta 0:04:37 lr 0.000001	time 0.8942 (0.8921)	loss 0.5316 (0.4928)	grad_norm 2.6478 (2.7601)	mem 20675MB
[2025-04-03 04:42:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][264/573]	eta 0:04:35 lr 0.000001	time 0.8817 (0.8920)	loss 0.4537 (0.4926)	grad_norm 3.6363 (2.7604)	mem 20675MB
[2025-04-03 04:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][266/573]	eta 0:04:33 lr 0.000001	time 0.8777 (0.8920)	loss 0.5908 (0.4931)	grad_norm 2.5516 (2.7581)	mem 20675MB
[2025-04-03 04:42:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][268/573]	eta 0:04:32 lr 0.000001	time 0.8779 (0.8919)	loss 0.4586 (0.4929)	grad_norm 1.9907 (2.7538)	mem 20675MB
[2025-04-03 04:42:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][270/573]	eta 0:04:30 lr 0.000001	time 0.8773 (0.8918)	loss 0.3528 (0.4926)	grad_norm 2.9747 (2.7534)	mem 20675MB
[2025-04-03 04:42:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][272/573]	eta 0:04:28 lr 0.000001	time 0.8924 (0.8918)	loss 0.4754 (0.4923)	grad_norm 2.7273 (2.7573)	mem 20675MB
[2025-04-03 04:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][274/573]	eta 0:04:26 lr 0.000001	time 0.8782 (0.8917)	loss 0.6129 (0.4922)	grad_norm 2.5669 (2.7605)	mem 20675MB
[2025-04-03 04:42:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][276/573]	eta 0:04:24 lr 0.000001	time 0.8777 (0.8917)	loss 0.4481 (0.4923)	grad_norm 2.9542 (2.7603)	mem 20675MB
[2025-04-03 04:42:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][278/573]	eta 0:04:23 lr 0.000001	time 0.8777 (0.8916)	loss 0.4075 (0.4919)	grad_norm 3.7596 (2.7635)	mem 20675MB
[2025-04-03 04:42:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][280/573]	eta 0:04:21 lr 0.000001	time 0.8798 (0.8915)	loss 0.5127 (0.4914)	grad_norm 2.3403 (2.7744)	mem 20675MB
[2025-04-03 04:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][282/573]	eta 0:04:19 lr 0.000001	time 0.8783 (0.8914)	loss 0.3647 (0.4913)	grad_norm 3.5656 (2.7745)	mem 20675MB
[2025-04-03 04:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][284/573]	eta 0:04:17 lr 0.000001	time 0.8778 (0.8914)	loss 0.4480 (0.4909)	grad_norm 2.2940 (2.7740)	mem 20675MB
[2025-04-03 04:42:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][286/573]	eta 0:04:15 lr 0.000001	time 0.8823 (0.8913)	loss 0.5133 (0.4911)	grad_norm 2.2663 (2.7711)	mem 20675MB
[2025-04-03 04:42:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][288/573]	eta 0:04:13 lr 0.000001	time 0.8791 (0.8912)	loss 0.3277 (0.4907)	grad_norm 2.6803 (2.7667)	mem 20675MB
[2025-04-03 04:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][290/573]	eta 0:04:12 lr 0.000001	time 0.8799 (0.8911)	loss 0.5068 (0.4902)	grad_norm 2.0823 (2.7636)	mem 20675MB
[2025-04-03 04:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][292/573]	eta 0:04:10 lr 0.000001	time 0.8777 (0.8911)	loss 0.5061 (0.4898)	grad_norm 3.0292 (2.7675)	mem 20675MB
[2025-04-03 04:42:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][294/573]	eta 0:04:08 lr 0.000001	time 0.8790 (0.8910)	loss 0.3583 (0.4895)	grad_norm 2.7756 (2.7650)	mem 20675MB
[2025-04-03 04:42:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][296/573]	eta 0:04:06 lr 0.000001	time 0.8810 (0.8909)	loss 0.5063 (0.4891)	grad_norm 1.7920 (2.7603)	mem 20675MB
[2025-04-03 04:42:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][298/573]	eta 0:04:04 lr 0.000001	time 0.8860 (0.8909)	loss 0.5629 (0.4894)	grad_norm 2.0922 (2.7550)	mem 20675MB
[2025-04-03 04:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][300/573]	eta 0:04:03 lr 0.000001	time 0.8777 (0.8908)	loss 0.5971 (0.4898)	grad_norm 2.2322 (2.7516)	mem 20675MB
[2025-04-03 04:42:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][302/573]	eta 0:04:01 lr 0.000001	time 0.8915 (0.8908)	loss 0.4519 (0.4899)	grad_norm 2.4882 (2.7495)	mem 20675MB
[2025-04-03 04:42:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][304/573]	eta 0:03:59 lr 0.000001	time 0.8773 (0.8909)	loss 0.4633 (0.4899)	grad_norm 3.7248 (2.7537)	mem 20675MB
[2025-04-03 04:42:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][306/573]	eta 0:03:57 lr 0.000001	time 0.8774 (0.8908)	loss 0.4665 (0.4896)	grad_norm 2.0140 (2.7535)	mem 20675MB
[2025-04-03 04:42:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][308/573]	eta 0:03:56 lr 0.000001	time 0.8778 (0.8908)	loss 0.4655 (0.4891)	grad_norm 4.4296 (2.7636)	mem 20675MB
[2025-04-03 04:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][310/573]	eta 0:03:54 lr 0.000001	time 0.8893 (0.8907)	loss 0.5781 (0.4896)	grad_norm 2.7052 (2.7630)	mem 20675MB
[2025-04-03 04:43:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][312/573]	eta 0:03:52 lr 0.000001	time 0.8785 (0.8907)	loss 0.5209 (0.4895)	grad_norm 3.5057 (2.7651)	mem 20675MB
[2025-04-03 04:43:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][314/573]	eta 0:03:50 lr 0.000001	time 0.8835 (0.8906)	loss 0.6718 (0.4899)	grad_norm 2.8762 (2.7713)	mem 20675MB
[2025-04-03 04:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][316/573]	eta 0:03:48 lr 0.000001	time 0.8908 (0.8906)	loss 0.3604 (0.4891)	grad_norm 3.1403 (2.7749)	mem 20675MB
[2025-04-03 04:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][318/573]	eta 0:03:47 lr 0.000001	time 0.8779 (0.8905)	loss 0.4053 (0.4884)	grad_norm 2.7096 (2.7804)	mem 20675MB
[2025-04-03 04:43:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][320/573]	eta 0:03:45 lr 0.000001	time 0.8793 (0.8904)	loss 0.4430 (0.4883)	grad_norm 4.2423 (2.7833)	mem 20675MB
[2025-04-03 04:43:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][322/573]	eta 0:03:43 lr 0.000001	time 0.9066 (0.8906)	loss 0.4694 (0.4882)	grad_norm 2.1070 (2.7810)	mem 20675MB
[2025-04-03 04:43:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][324/573]	eta 0:03:41 lr 0.000001	time 0.8777 (0.8905)	loss 0.4753 (0.4886)	grad_norm 2.2784 (2.7799)	mem 20675MB
[2025-04-03 04:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][326/573]	eta 0:03:39 lr 0.000001	time 0.8779 (0.8904)	loss 0.5316 (0.4888)	grad_norm 2.9079 (2.7792)	mem 20675MB
[2025-04-03 04:43:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][328/573]	eta 0:03:38 lr 0.000001	time 0.8777 (0.8904)	loss 0.5232 (0.4887)	grad_norm 2.1193 (2.7792)	mem 20675MB
[2025-04-03 04:43:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][330/573]	eta 0:03:36 lr 0.000001	time 0.8780 (0.8903)	loss 0.4784 (0.4891)	grad_norm 2.1063 (2.7765)	mem 20675MB
[2025-04-03 04:43:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][332/573]	eta 0:03:34 lr 0.000001	time 0.8777 (0.8902)	loss 0.5730 (0.4895)	grad_norm 2.6029 (2.7744)	mem 20675MB
[2025-04-03 04:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][334/573]	eta 0:03:32 lr 0.000001	time 0.8780 (0.8902)	loss 0.6075 (0.4902)	grad_norm 2.3955 (2.7718)	mem 20675MB
[2025-04-03 04:43:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][336/573]	eta 0:03:30 lr 0.000001	time 0.8781 (0.8901)	loss 0.3644 (0.4899)	grad_norm 3.8988 (2.7755)	mem 20675MB
[2025-04-03 04:43:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][338/573]	eta 0:03:29 lr 0.000001	time 0.8776 (0.8901)	loss 0.5185 (0.4903)	grad_norm 2.4309 (2.7739)	mem 20675MB
[2025-04-03 04:43:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][340/573]	eta 0:03:27 lr 0.000001	time 0.8840 (0.8900)	loss 0.4684 (0.4900)	grad_norm 1.9322 (2.7705)	mem 20675MB
[2025-04-03 04:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][342/573]	eta 0:03:25 lr 0.000001	time 0.8925 (0.8900)	loss 0.3459 (0.4894)	grad_norm 3.6008 (2.7734)	mem 20675MB
[2025-04-03 04:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][344/573]	eta 0:03:23 lr 0.000001	time 0.8782 (0.8899)	loss 0.5065 (0.4896)	grad_norm 1.7273 (2.7674)	mem 20675MB
[2025-04-03 04:43:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][346/573]	eta 0:03:22 lr 0.000001	time 0.8778 (0.8899)	loss 0.5605 (0.4897)	grad_norm 2.5658 (2.7675)	mem 20675MB
[2025-04-03 04:43:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][348/573]	eta 0:03:20 lr 0.000001	time 0.8774 (0.8898)	loss 0.4016 (0.4896)	grad_norm 3.0595 (2.7661)	mem 20675MB
[2025-04-03 04:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][350/573]	eta 0:03:18 lr 0.000001	time 0.8773 (0.8898)	loss 0.4348 (0.4890)	grad_norm 3.4066 (2.7689)	mem 20675MB
[2025-04-03 04:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][352/573]	eta 0:03:16 lr 0.000001	time 0.8790 (0.8897)	loss 0.5208 (0.4891)	grad_norm 2.6957 (2.7701)	mem 20675MB
[2025-04-03 04:43:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][354/573]	eta 0:03:14 lr 0.000001	time 0.8785 (0.8896)	loss 0.5350 (0.4894)	grad_norm 3.3636 (2.7718)	mem 20675MB
[2025-04-03 04:43:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][356/573]	eta 0:03:13 lr 0.000001	time 0.8782 (0.8896)	loss 0.5001 (0.4892)	grad_norm 2.4517 (2.7695)	mem 20675MB
[2025-04-03 04:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][358/573]	eta 0:03:11 lr 0.000001	time 0.8792 (0.8896)	loss 0.5973 (0.4895)	grad_norm 2.7444 (2.7666)	mem 20675MB
[2025-04-03 04:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][360/573]	eta 0:03:09 lr 0.000001	time 0.8778 (0.8895)	loss 0.5354 (0.4899)	grad_norm 2.1985 (2.7628)	mem 20675MB
[2025-04-03 04:43:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][362/573]	eta 0:03:07 lr 0.000001	time 0.8778 (0.8895)	loss 0.4541 (0.4893)	grad_norm 2.2102 (2.7617)	mem 20675MB
[2025-04-03 04:43:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][364/573]	eta 0:03:05 lr 0.000001	time 0.8788 (0.8894)	loss 0.5297 (0.4892)	grad_norm 2.8093 (2.7627)	mem 20675MB
[2025-04-03 04:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][366/573]	eta 0:03:04 lr 0.000001	time 0.8779 (0.8894)	loss 0.4699 (0.4893)	grad_norm 2.2937 (2.7600)	mem 20675MB
[2025-04-03 04:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][368/573]	eta 0:03:02 lr 0.000001	time 0.8808 (0.8893)	loss 0.4851 (0.4889)	grad_norm 2.1690 (2.7605)	mem 20675MB
[2025-04-03 04:43:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][370/573]	eta 0:03:00 lr 0.000001	time 0.8811 (0.8893)	loss 0.5001 (0.4890)	grad_norm 1.5760 (2.7551)	mem 20675MB
[2025-04-03 04:43:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][372/573]	eta 0:02:58 lr 0.000001	time 0.8775 (0.8893)	loss 0.5931 (0.4891)	grad_norm 2.2909 (2.7553)	mem 20675MB
[2025-04-03 04:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][374/573]	eta 0:02:56 lr 0.000001	time 0.8774 (0.8892)	loss 0.4982 (0.4893)	grad_norm 2.0780 (2.7597)	mem 20675MB
[2025-04-03 04:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][376/573]	eta 0:02:55 lr 0.000001	time 0.8777 (0.8892)	loss 0.4248 (0.4887)	grad_norm 3.1040 (2.7640)	mem 20675MB
[2025-04-03 04:43:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][378/573]	eta 0:02:53 lr 0.000001	time 0.8777 (0.8891)	loss 0.4170 (0.4886)	grad_norm 2.5497 (2.7631)	mem 20675MB
[2025-04-03 04:44:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][380/573]	eta 0:02:51 lr 0.000001	time 0.8773 (0.8891)	loss 0.5056 (0.4889)	grad_norm 2.3601 (2.7602)	mem 20675MB
[2025-04-03 04:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][382/573]	eta 0:02:49 lr 0.000001	time 0.8774 (0.8890)	loss 0.5767 (0.4891)	grad_norm 3.0915 (2.7612)	mem 20675MB
[2025-04-03 04:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][384/573]	eta 0:02:48 lr 0.000001	time 0.8794 (0.8890)	loss 0.5275 (0.4894)	grad_norm 1.8949 (2.7583)	mem 20675MB
[2025-04-03 04:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][386/573]	eta 0:02:46 lr 0.000001	time 0.8835 (0.8890)	loss 0.4701 (0.4890)	grad_norm 2.6767 (2.7611)	mem 20675MB
[2025-04-03 04:44:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][388/573]	eta 0:02:44 lr 0.000001	time 0.8926 (0.8890)	loss 0.3257 (0.4887)	grad_norm 3.7340 (2.7671)	mem 20675MB
[2025-04-03 04:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][390/573]	eta 0:02:42 lr 0.000001	time 0.8777 (0.8890)	loss 0.3922 (0.4885)	grad_norm 3.4518 (2.7698)	mem 20675MB
[2025-04-03 04:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][392/573]	eta 0:02:40 lr 0.000001	time 0.8774 (0.8890)	loss 0.5013 (0.4888)	grad_norm 2.3393 (2.7668)	mem 20675MB
[2025-04-03 04:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][394/573]	eta 0:02:39 lr 0.000001	time 0.8782 (0.8889)	loss 0.4605 (0.4889)	grad_norm 2.3110 (2.7637)	mem 20675MB
[2025-04-03 04:44:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][396/573]	eta 0:02:37 lr 0.000001	time 0.8783 (0.8889)	loss 0.4293 (0.4885)	grad_norm 2.9408 (2.7668)	mem 20675MB
[2025-04-03 04:44:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][398/573]	eta 0:02:35 lr 0.000001	time 0.8775 (0.8888)	loss 0.5841 (0.4885)	grad_norm 2.3099 (2.7666)	mem 20675MB
[2025-04-03 04:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][400/573]	eta 0:02:33 lr 0.000001	time 0.8776 (0.8888)	loss 0.5017 (0.4884)	grad_norm 2.6048 (2.7678)	mem 20675MB
[2025-04-03 04:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][402/573]	eta 0:02:31 lr 0.000001	time 0.8775 (0.8888)	loss 0.4588 (0.4882)	grad_norm 2.8585 (2.7662)	mem 20675MB
[2025-04-03 04:44:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][404/573]	eta 0:02:30 lr 0.000001	time 0.8778 (0.8887)	loss 0.3766 (0.4879)	grad_norm 2.1959 (2.7680)	mem 20675MB
[2025-04-03 04:44:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][406/573]	eta 0:02:28 lr 0.000001	time 0.8778 (0.8887)	loss 0.5325 (0.4876)	grad_norm 2.8512 (2.7698)	mem 20675MB
[2025-04-03 04:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][408/573]	eta 0:02:26 lr 0.000001	time 0.8782 (0.8886)	loss 0.4248 (0.4876)	grad_norm 4.6741 (2.7721)	mem 20675MB
[2025-04-03 04:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][410/573]	eta 0:02:24 lr 0.000001	time 0.8776 (0.8886)	loss 0.5081 (0.4879)	grad_norm 2.5926 (2.7700)	mem 20675MB
[2025-04-03 04:44:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][412/573]	eta 0:02:23 lr 0.000001	time 0.8787 (0.8885)	loss 0.3894 (0.4873)	grad_norm 3.5327 (2.7731)	mem 20675MB
[2025-04-03 04:44:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][414/573]	eta 0:02:21 lr 0.000001	time 0.8801 (0.8885)	loss 0.5439 (0.4875)	grad_norm 2.1426 (2.7716)	mem 20675MB
[2025-04-03 04:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][416/573]	eta 0:02:19 lr 0.000001	time 0.8841 (0.8885)	loss 0.3930 (0.4871)	grad_norm 2.9709 (2.7868)	mem 20675MB
[2025-04-03 04:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][418/573]	eta 0:02:17 lr 0.000001	time 0.8860 (0.8884)	loss 0.4214 (0.4870)	grad_norm 2.8497 (2.7889)	mem 20675MB
[2025-04-03 04:44:36 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][420/573]	eta 0:02:15 lr 0.000000	time 0.8774 (0.8884)	loss 0.3285 (0.4869)	grad_norm 3.5792 (2.7908)	mem 20675MB
[2025-04-03 04:44:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][422/573]	eta 0:02:14 lr 0.000000	time 0.8813 (0.8884)	loss 0.5767 (0.4872)	grad_norm 2.9927 (2.7911)	mem 20675MB
[2025-04-03 04:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][424/573]	eta 0:02:12 lr 0.000000	time 0.8790 (0.8884)	loss 0.4592 (0.4874)	grad_norm 2.6429 (2.7894)	mem 20675MB
[2025-04-03 04:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][426/573]	eta 0:02:10 lr 0.000000	time 0.8775 (0.8883)	loss 0.5328 (0.4875)	grad_norm 2.9475 (2.7885)	mem 20675MB
[2025-04-03 04:44:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][428/573]	eta 0:02:08 lr 0.000000	time 0.8853 (0.8883)	loss 0.5215 (0.4877)	grad_norm 2.2560 (2.7879)	mem 20675MB
[2025-04-03 04:44:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][430/573]	eta 0:02:07 lr 0.000000	time 0.8828 (0.8883)	loss 0.5358 (0.4878)	grad_norm 2.2610 (2.7897)	mem 20675MB
[2025-04-03 04:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][432/573]	eta 0:02:05 lr 0.000000	time 0.8875 (0.8883)	loss 0.4262 (0.4873)	grad_norm 5.4918 (2.7998)	mem 20675MB
[2025-04-03 04:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][434/573]	eta 0:02:03 lr 0.000000	time 0.8774 (0.8883)	loss 0.6386 (0.4875)	grad_norm 2.9298 (2.7992)	mem 20675MB
[2025-04-03 04:44:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][436/573]	eta 0:02:01 lr 0.000000	time 0.8781 (0.8882)	loss 0.3826 (0.4870)	grad_norm 3.8650 (2.8010)	mem 20675MB
[2025-04-03 04:44:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][438/573]	eta 0:01:59 lr 0.000000	time 0.8793 (0.8882)	loss 0.6018 (0.4874)	grad_norm 2.3669 (2.7984)	mem 20675MB
[2025-04-03 04:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][440/573]	eta 0:01:58 lr 0.000000	time 0.8774 (0.8882)	loss 0.4512 (0.4870)	grad_norm 2.6181 (2.8001)	mem 20675MB
[2025-04-03 04:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][442/573]	eta 0:01:56 lr 0.000000	time 0.8779 (0.8881)	loss 0.5671 (0.4873)	grad_norm 2.2646 (2.7997)	mem 20675MB
[2025-04-03 04:44:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][444/573]	eta 0:01:54 lr 0.000000	time 0.8833 (0.8881)	loss 0.6201 (0.4878)	grad_norm 5.8240 (2.8044)	mem 20675MB
[2025-04-03 04:44:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][446/573]	eta 0:01:52 lr 0.000000	time 0.8859 (0.8881)	loss 0.4338 (0.4873)	grad_norm 2.6065 (2.8042)	mem 20675MB
[2025-04-03 04:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][448/573]	eta 0:01:51 lr 0.000000	time 0.8778 (0.8880)	loss 0.5037 (0.4870)	grad_norm 2.7161 (2.8065)	mem 20675MB
[2025-04-03 04:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][450/573]	eta 0:01:49 lr 0.000000	time 0.8774 (0.8880)	loss 0.4784 (0.4869)	grad_norm 1.9401 (2.8038)	mem 20675MB
[2025-04-03 04:45:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][452/573]	eta 0:01:47 lr 0.000000	time 0.8778 (0.8880)	loss 0.3880 (0.4865)	grad_norm 2.9687 (2.8048)	mem 20675MB
[2025-04-03 04:45:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][454/573]	eta 0:01:45 lr 0.000000	time 0.8778 (0.8880)	loss 0.5158 (0.4866)	grad_norm 2.0611 (2.8021)	mem 20675MB
[2025-04-03 04:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][456/573]	eta 0:01:43 lr 0.000000	time 0.8773 (0.8879)	loss 0.5414 (0.4868)	grad_norm 2.5940 (2.8085)	mem 20675MB
[2025-04-03 04:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][458/573]	eta 0:01:42 lr 0.000000	time 0.8780 (0.8879)	loss 0.5055 (0.4869)	grad_norm 2.7688 (2.8079)	mem 20675MB
[2025-04-03 04:45:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][460/573]	eta 0:01:40 lr 0.000000	time 0.8818 (0.8879)	loss 0.4832 (0.4867)	grad_norm 2.5609 (2.8080)	mem 20675MB
[2025-04-03 04:45:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][462/573]	eta 0:01:38 lr 0.000000	time 0.8784 (0.8879)	loss 0.5258 (0.4867)	grad_norm 2.1885 (2.8132)	mem 20675MB
[2025-04-03 04:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][464/573]	eta 0:01:36 lr 0.000000	time 0.8806 (0.8878)	loss 0.5553 (0.4869)	grad_norm 2.3001 (2.8112)	mem 20675MB
[2025-04-03 04:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][466/573]	eta 0:01:34 lr 0.000000	time 0.8779 (0.8878)	loss 0.5331 (0.4871)	grad_norm 3.4189 (2.8115)	mem 20675MB
[2025-04-03 04:45:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][468/573]	eta 0:01:33 lr 0.000000	time 0.8788 (0.8878)	loss 0.5055 (0.4872)	grad_norm 1.7699 (2.8078)	mem 20675MB
[2025-04-03 04:45:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][470/573]	eta 0:01:31 lr 0.000000	time 0.8775 (0.8877)	loss 0.5206 (0.4873)	grad_norm 3.2361 (2.8082)	mem 20675MB
[2025-04-03 04:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][472/573]	eta 0:01:29 lr 0.000000	time 0.8776 (0.8877)	loss 0.3762 (0.4869)	grad_norm 3.7550 (2.8110)	mem 20675MB
[2025-04-03 04:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][474/573]	eta 0:01:27 lr 0.000000	time 0.8779 (0.8877)	loss 0.4801 (0.4867)	grad_norm 2.2582 (2.8126)	mem 20675MB
[2025-04-03 04:45:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][476/573]	eta 0:01:26 lr 0.000000	time 0.8775 (0.8876)	loss 0.5153 (0.4868)	grad_norm 2.1355 (2.8111)	mem 20675MB
[2025-04-03 04:45:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][478/573]	eta 0:01:24 lr 0.000000	time 0.8772 (0.8876)	loss 0.4574 (0.4867)	grad_norm 3.0898 (2.8104)	mem 20675MB
[2025-04-03 04:45:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][480/573]	eta 0:01:22 lr 0.000000	time 0.8779 (0.8876)	loss 0.6113 (0.4871)	grad_norm 2.0891 (2.8073)	mem 20675MB
[2025-04-03 04:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][482/573]	eta 0:01:20 lr 0.000000	time 0.8777 (0.8875)	loss 0.5827 (0.4874)	grad_norm 2.4561 (2.8052)	mem 20675MB
[2025-04-03 04:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][484/573]	eta 0:01:18 lr 0.000000	time 0.8777 (0.8875)	loss 0.5693 (0.4877)	grad_norm 2.5954 (2.8043)	mem 20675MB
[2025-04-03 04:45:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][486/573]	eta 0:01:17 lr 0.000000	time 0.8775 (0.8875)	loss 0.5187 (0.4876)	grad_norm 2.6197 (2.8031)	mem 20675MB
[2025-04-03 04:45:36 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][488/573]	eta 0:01:15 lr 0.000000	time 0.8775 (0.8875)	loss 0.5497 (0.4875)	grad_norm 1.9199 (2.8010)	mem 20675MB
[2025-04-03 04:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][490/573]	eta 0:01:13 lr 0.000000	time 0.8774 (0.8874)	loss 0.4232 (0.4874)	grad_norm 3.1902 (2.8003)	mem 20675MB
[2025-04-03 04:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][492/573]	eta 0:01:11 lr 0.000000	time 0.8843 (0.8874)	loss 0.3994 (0.4873)	grad_norm 2.4198 (2.7981)	mem 20675MB
[2025-04-03 04:45:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][494/573]	eta 0:01:10 lr 0.000000	time 0.8782 (0.8874)	loss 0.4171 (0.4870)	grad_norm 2.4784 (2.7964)	mem 20675MB
[2025-04-03 04:45:43 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][496/573]	eta 0:01:08 lr 0.000000	time 0.8777 (0.8874)	loss 0.4560 (0.4870)	grad_norm 2.9952 (2.7953)	mem 20675MB
[2025-04-03 04:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][498/573]	eta 0:01:06 lr 0.000000	time 0.8802 (0.8873)	loss 0.5940 (0.4873)	grad_norm 2.0834 (2.7929)	mem 20675MB
[2025-04-03 04:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][500/573]	eta 0:01:04 lr 0.000000	time 0.8860 (0.8873)	loss 0.5576 (0.4875)	grad_norm 2.3034 (2.7904)	mem 20675MB
[2025-04-03 04:45:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][502/573]	eta 0:01:02 lr 0.000000	time 0.8781 (0.8873)	loss 0.4718 (0.4876)	grad_norm 3.8030 (2.7924)	mem 20675MB
[2025-04-03 04:45:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][504/573]	eta 0:01:01 lr 0.000000	time 0.8784 (0.8872)	loss 0.4503 (0.4875)	grad_norm 3.3673 (2.7956)	mem 20675MB
[2025-04-03 04:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][506/573]	eta 0:00:59 lr 0.000000	time 0.8820 (0.8872)	loss 0.5020 (0.4876)	grad_norm 2.3020 (2.7941)	mem 20675MB
[2025-04-03 04:45:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][508/573]	eta 0:00:57 lr 0.000000	time 0.8776 (0.8872)	loss 0.3868 (0.4871)	grad_norm 3.2747 (2.7947)	mem 20675MB
[2025-04-03 04:45:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][510/573]	eta 0:00:55 lr 0.000000	time 0.8871 (0.8872)	loss 0.4558 (0.4871)	grad_norm 2.4423 (2.7935)	mem 20675MB
[2025-04-03 04:45:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][512/573]	eta 0:00:54 lr 0.000000	time 0.8773 (0.8871)	loss 0.4061 (0.4867)	grad_norm 3.1267 (2.7946)	mem 20675MB
[2025-04-03 04:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][514/573]	eta 0:00:52 lr 0.000000	time 0.8782 (0.8871)	loss 0.5970 (0.4869)	grad_norm 3.1635 (2.7946)	mem 20675MB
[2025-04-03 04:46:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][516/573]	eta 0:00:50 lr 0.000000	time 0.8781 (0.8871)	loss 0.5671 (0.4871)	grad_norm 2.4627 (2.7928)	mem 20675MB
[2025-04-03 04:46:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][518/573]	eta 0:00:48 lr 0.000000	time 0.8777 (0.8870)	loss 0.3515 (0.4871)	grad_norm 3.3909 (2.7930)	mem 20675MB
[2025-04-03 04:46:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][520/573]	eta 0:00:47 lr 0.000000	time 0.8776 (0.8870)	loss 0.5021 (0.4869)	grad_norm 2.1182 (2.7933)	mem 20675MB
[2025-04-03 04:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][522/573]	eta 0:00:45 lr 0.000000	time 0.8778 (0.8870)	loss 0.4613 (0.4869)	grad_norm 2.7810 (2.7929)	mem 20675MB
[2025-04-03 04:46:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][524/573]	eta 0:00:43 lr 0.000000	time 0.8775 (0.8869)	loss 0.4009 (0.4868)	grad_norm 3.3141 (2.7954)	mem 20675MB
[2025-04-03 04:46:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][526/573]	eta 0:00:41 lr 0.000000	time 0.8777 (0.8869)	loss 0.3161 (0.4866)	grad_norm 2.7149 (2.7947)	mem 20675MB
[2025-04-03 04:46:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][528/573]	eta 0:00:39 lr 0.000000	time 0.8772 (0.8869)	loss 0.4988 (0.4868)	grad_norm 1.9707 (2.7919)	mem 20675MB
[2025-04-03 04:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][530/573]	eta 0:00:38 lr 0.000000	time 0.8773 (0.8869)	loss 0.4680 (0.4868)	grad_norm 2.2945 (2.7917)	mem 20675MB
[2025-04-03 04:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][532/573]	eta 0:00:36 lr 0.000000	time 0.8773 (0.8868)	loss 0.4991 (0.4870)	grad_norm 2.9138 (2.7910)	mem 20675MB
[2025-04-03 04:46:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][534/573]	eta 0:00:34 lr 0.000000	time 0.8781 (0.8868)	loss 0.5171 (0.4868)	grad_norm 2.4143 (2.7922)	mem 20675MB
[2025-04-03 04:46:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][536/573]	eta 0:00:32 lr 0.000000	time 0.8778 (0.8868)	loss 0.4242 (0.4869)	grad_norm 2.3856 (2.7911)	mem 20675MB
[2025-04-03 04:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][538/573]	eta 0:00:31 lr 0.000000	time 0.8773 (0.8867)	loss 0.5693 (0.4871)	grad_norm 2.0009 (2.7887)	mem 20675MB
[2025-04-03 04:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][540/573]	eta 0:00:29 lr 0.000000	time 0.8874 (0.8867)	loss 0.6234 (0.4872)	grad_norm 2.4086 (2.7876)	mem 20675MB
[2025-04-03 04:46:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][542/573]	eta 0:00:27 lr 0.000000	time 0.8775 (0.8867)	loss 0.5633 (0.4875)	grad_norm 2.7783 (2.7866)	mem 20675MB
[2025-04-03 04:46:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][544/573]	eta 0:00:25 lr 0.000000	time 0.8781 (0.8867)	loss 0.5207 (0.4877)	grad_norm 1.8233 (2.7857)	mem 20675MB
[2025-04-03 04:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][546/573]	eta 0:00:23 lr 0.000000	time 0.8778 (0.8867)	loss 0.3957 (0.4874)	grad_norm 3.1438 (2.7872)	mem 20675MB
[2025-04-03 04:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][548/573]	eta 0:00:22 lr 0.000000	time 0.8778 (0.8866)	loss 0.4296 (0.4874)	grad_norm 2.2535 (2.7850)	mem 20675MB
[2025-04-03 04:46:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][550/573]	eta 0:00:20 lr 0.000000	time 0.8773 (0.8867)	loss 0.5784 (0.4876)	grad_norm 2.0386 (2.7821)	mem 20675MB
[2025-04-03 04:46:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][552/573]	eta 0:00:18 lr 0.000000	time 0.8772 (0.8866)	loss 0.5455 (0.4877)	grad_norm 2.2029 (2.7797)	mem 20675MB
[2025-04-03 04:46:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][554/573]	eta 0:00:16 lr 0.000000	time 0.8778 (0.8866)	loss 0.4262 (0.4876)	grad_norm 3.1162 (2.7796)	mem 20675MB
[2025-04-03 04:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][556/573]	eta 0:00:15 lr 0.000000	time 0.8774 (0.8866)	loss 0.5776 (0.4878)	grad_norm 2.3914 (2.7784)	mem 20675MB
[2025-04-03 04:46:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][558/573]	eta 0:00:13 lr 0.000000	time 0.8827 (0.8866)	loss 0.5094 (0.4876)	grad_norm 2.3524 (2.7788)	mem 20675MB
[2025-04-03 04:46:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][560/573]	eta 0:00:11 lr 0.000000	time 0.8776 (0.8866)	loss 0.6027 (0.4877)	grad_norm 3.1626 (2.7803)	mem 20675MB
[2025-04-03 04:46:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][562/573]	eta 0:00:09 lr 0.000000	time 0.8775 (0.8865)	loss 0.4804 (0.4876)	grad_norm 2.1965 (2.7789)	mem 20675MB
[2025-04-03 04:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][564/573]	eta 0:00:07 lr 0.000000	time 0.8774 (0.8865)	loss 0.3766 (0.4871)	grad_norm 4.7355 (2.7823)	mem 20675MB
[2025-04-03 04:46:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][566/573]	eta 0:00:06 lr 0.000000	time 0.8777 (0.8865)	loss 0.5166 (0.4871)	grad_norm 2.3585 (2.7830)	mem 20675MB
[2025-04-03 04:46:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][568/573]	eta 0:00:04 lr 0.000000	time 0.8774 (0.8865)	loss 0.5246 (0.4870)	grad_norm 2.6591 (2.7828)	mem 20675MB
[2025-04-03 04:46:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][570/573]	eta 0:00:02 lr 0.000000	time 0.8771 (0.8864)	loss 0.4833 (0.4867)	grad_norm 1.7539 (2.7836)	mem 20675MB
[2025-04-03 04:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][572/573]	eta 0:00:00 lr 0.000000	time 0.8775 (0.8864)	loss 0.3781 (0.4864)	grad_norm 3.1198 (2.7849)	mem 20675MB
[2025-04-03 04:46:51 simmim_finetune] (main_finetune.py 260): INFO EPOCH 29 training takes 0:08:28
[2025-04-03 04:46:51 simmim_finetune] (utils.py 60): INFO checkpoint/human/ckpt29.pth saving......
[2025-04-03 04:46:55 simmim_finetune] (utils.py 62): INFO checkpoint/human/ckpt29.pth saved !!!
[2025-04-03 04:46:57 simmim_finetune] (main_finetune.py 297): INFO Test: [0/16]	Time 2.923 (2.923)	Loss 0.5121 (0.5121)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 04:46:58 simmim_finetune] (main_finetune.py 297): INFO Test: [2/16]	Time 0.285 (1.166)	Loss 0.4682 (0.4744)	Acc@1 75.000 (73.958)	Mem 20675MB
[2025-04-03 04:46:59 simmim_finetune] (main_finetune.py 297): INFO Test: [4/16]	Time 0.283 (0.814)	Loss 0.4876 (0.4675)	Acc@1 74.219 (75.000)	Mem 20675MB
[2025-04-03 04:46:59 simmim_finetune] (main_finetune.py 297): INFO Test: [6/16]	Time 0.285 (0.663)	Loss 0.4159 (0.4537)	Acc@1 82.031 (76.786)	Mem 20675MB
[2025-04-03 04:47:00 simmim_finetune] (main_finetune.py 297): INFO Test: [8/16]	Time 0.283 (0.579)	Loss 0.4999 (0.4524)	Acc@1 73.438 (77.083)	Mem 20675MB
[2025-04-03 04:47:00 simmim_finetune] (main_finetune.py 297): INFO Test: [10/16]	Time 0.283 (0.525)	Loss 0.4702 (0.4608)	Acc@1 83.594 (77.486)	Mem 20675MB
[2025-04-03 04:47:01 simmim_finetune] (main_finetune.py 297): INFO Test: [12/16]	Time 0.283 (0.489)	Loss 0.4705 (0.4617)	Acc@1 78.906 (77.704)	Mem 20675MB
[2025-04-03 04:47:01 simmim_finetune] (main_finetune.py 297): INFO Test: [14/16]	Time 0.283 (0.461)	Loss 0.4319 (0.4574)	Acc@1 78.906 (78.021)	Mem 20675MB
[2025-04-03 04:47:02 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.075
[2025-04-03 04:47:02 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 1984 test images: 78.1%
[2025-04-03 04:47:02 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.23%
[2025-04-03 04:47:02 simmim_finetune] (main_finetune.py 177): INFO Training time 4:16:43
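Note on reading the metrics above: each `loss`, `time`, and `Acc@1` field prints the latest batch value followed by the running average in parentheses, e.g. `Acc@1 70.312 (70.312)`. A minimal sketch of that pattern (an assumed `AverageMeter`-style helper, not necessarily the repo's exact code) is below; the 1984 test images split into 15 full batches of 128 plus one batch of 64, and the final `* Acc@1 78.075` is the sample-weighted average over those 16 batches.

```python
class AverageMeter:
    """Tracks the latest value and a sample-weighted running average."""

    def __init__(self):
        self.val = 0.0   # most recent batch value
        self.sum = 0.0   # weighted sum of all values seen so far
        self.count = 0   # total weight (e.g. number of samples)

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)


# Example mirroring the Test lines: per-batch Acc@1 weighted by batch size.
acc = AverageMeter()
acc.update(70.312, 128)   # Test [0/16]: Acc@1 70.312 (70.312)
acc.update(77.344, 128)   # hypothetical second batch value
print(f"Acc@1 {acc.val:.3f} ({acc.avg:.3f})")
```

Weighting by `n` is what keeps the final 64-image batch from being over-counted relative to the 128-image batches when the overall accuracy is reported.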
