[2025-04-03 00:28:54 simmim_finetune] (main_finetune.py 375): INFO Full config saved to checkpoint/hand/config.json
[2025-04-03 00:28:54 simmim_finetune] (main_finetune.py 378): INFO AMP_OPT_LEVEL: O0
AUG:
  AUTO_AUGMENT: rand-m9-mstd0.5-inc1
  COLOR_JITTER: 0.4
  CUTMIX: 1.0
  CUTMIX_MINMAX: null
  MIXUP: 0.8
  MIXUP_MODE: batch
  MIXUP_PROB: 1.0
  MIXUP_SWITCH_PROB: 0.5
  RECOUNT: 1
  REMODE: pixel
  REPROB: 0.25
BASE:
- ''
DATA:
  BATCH_SIZE: 128
  DATASET: imagenet
  DATA_PATH: ''
  IMG_SIZE: 224
  INTERPOLATION: bicubic
  MASK_PATCH_SIZE: 32
  MASK_RATIO: 0.6
  NUM_WORKERS: 8
  PIN_MEMORY: true
  TRAIN_PATH: VBench-2.0_human_anomaly/dataset/hand_train.jsonl
  VAL_PATH: VBench-2.0_human_anomaly/dataset/hand_test.jsonl
EVAL_MODE: false
LOCAL_RANK: 0
LOSS:
  FOCAL: false
  FOCAL_ALPHA: 0.25
  FOCAL_GAMMA: 2.0
MODEL:
  DROP_PATH_RATE: 0.1
  DROP_RATE: 0.0
  LABEL_SMOOTHING: 0.1
  NAME: simmim_finetune
  NUM_CLASSES: 2
  RESUME: ''
  SWIN:
    APE: false
    DEPTHS:
    - 2
    - 2
    - 6
    - 2
    EMBED_DIM: 96
    IN_CHANS: 3
    MLP_RATIO: 4.0
    NUM_HEADS:
    - 3
    - 6
    - 12
    - 24
    PATCH_NORM: true
    PATCH_SIZE: 4
    QKV_BIAS: true
    QK_SCALE: null
    WINDOW_SIZE: 7
  TYPE: vit
  VIT:
    DEPTH: 12
    EMBED_DIM: 768
    INIT_VALUES: 0.1
    IN_CHANS: 3
    MLP_RATIO: 4
    NUM_HEADS: 12
    PATCH_SIZE: 16
    QKV_BIAS: true
    USE_APE: false
    USE_MEAN_POOLING: true
    USE_RPB: true
    USE_SHARED_RPB: false
OUTPUT: checkpoint/hand
PRETRAINED: pretrain/simmim_pretrain__vit_base__img224__800ep.pth
PRINT_FREQ: 2
SAVE_FREQ: 5
SEED: 0
TAG: simmim_finetune__vit_base__img224__800ep
TEST:
  CROP: true
THROUGHPUT_MODE: false
TRAIN:
  ACCUMULATION_STEPS: 0
  AUTO_RESUME: true
  BASE_LR: 0.00125
  CLIP_GRAD: 5.0
  EPOCHS: 30
  LAYER_DECAY: 0.65
  LR_SCHEDULER:
    DECAY_EPOCHS: 30
    DECAY_RATE: 0.1
    GAMMA: 0.1
    MULTISTEPS: []
    NAME: cosine
  MIN_LR: 2.5e-07
  OPTIMIZER:
    BETAS:
    - 0.9
    - 0.999
    EPS: 1.0e-08
    MOMENTUM: 0.9
    NAME: adamw
  START_EPOCH: 0
  USE_CHECKPOINT: false
  WARMUP_EPOCHS: 3
  WARMUP_LR: 2.5e-07
  WEIGHT_DECAY: 0.05
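Annotation: the TRAIN block above implies a two-phase schedule: linear warmup for WARMUP_EPOCHS, then cosine decay from BASE_LR down to MIN_LR over the remaining epochs. A minimal sketch of that rule (the standard timm-style cosine schedule is assumed here; this is not the repository's scheduler code):

```python
import math

# Epoch-level lr implied by the TRAIN config above (assumed timm-style rule):
# linear warmup WARMUP_LR -> BASE_LR over WARMUP_EPOCHS, then cosine decay
# from BASE_LR to MIN_LR over the remaining EPOCHS - WARMUP_EPOCHS epochs.
BASE_LR, MIN_LR, WARMUP_LR = 0.00125, 2.5e-07, 2.5e-07
EPOCHS, WARMUP_EPOCHS = 30, 3

def lr_at_epoch(t: float) -> float:
    if t < WARMUP_EPOCHS:
        return WARMUP_LR + (BASE_LR - WARMUP_LR) * t / WARMUP_EPOCHS
    progress = (t - WARMUP_EPOCHS) / (EPOCHS - WARMUP_EPOCHS)
    return MIN_LR + 0.5 * (BASE_LR - MIN_LR) * (1 + math.cos(math.pi * progress))

print(lr_at_epoch(3))   # ~BASE_LR (peak, end of warmup)
print(lr_at_epoch(30))  # ~MIN_LR (floor, end of training)
```

Note that per-layer lr_scale factors (see the param-group dump below in this log) multiply this schedule, so each group follows the same curve at its own scale.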

[2025-04-03 00:28:54 simmim_finetune] (data_finetune.py 88): INFO Fine-tune data transform, is_train=True:
Compose(
    RandomResizedCropAndInterpolation(size=(224, 224), scale=(0.08, 1.0), ratio=(0.75, 1.3333), interpolation=PIL.Image.BICUBIC)
    RandomHorizontalFlip(p=0.5)
    <timm.data.auto_augment.RandAugment object at 0x7f8ee99851e0>
    ToTensor()
    Normalize(mean=tensor([0.4850, 0.4560, 0.4060]), std=tensor([0.2290, 0.2240, 0.2250]))
    <timm.data.random_erasing.RandomErasing object at 0x7f8ee99855d0>
)
[2025-04-03 00:28:54 simmim_finetune] (data_finetune.py 88): INFO Fine-tune data transform, is_train=False:
Compose(
    Resize(size=256, interpolation=bicubic, max_size=None, antialias=True)
    CenterCrop(size=(224, 224))
    ToTensor()
    Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225))
)
[2025-04-03 00:28:54 simmim_finetune] (data_finetune.py 26): INFO Build dataset: train images = 39848, val images = 142
[2025-04-03 00:28:54 simmim_finetune] (main_finetune.py 102): INFO Creating model:vit/simmim_finetune
[2025-04-03 00:29:03 simmim_finetune] (main_finetune.py 105): INFO VisionTransformer(
  (patch_embed): PatchEmbed(
    (proj): Conv2d(3, 768, kernel_size=(16, 16), stride=(16, 16))
  )
  (pos_drop): Dropout(p=0.0, inplace=False)
  (blocks): ModuleList(
    (0): Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=False)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): Identity()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
    (1-11): 11 x Block(
      (norm1): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (attn): Attention(
        (qkv): Linear(in_features=768, out_features=2304, bias=False)
        (attn_drop): Dropout(p=0.0, inplace=False)
        (proj): Linear(in_features=768, out_features=768, bias=True)
        (proj_drop): Dropout(p=0.0, inplace=False)
      )
      (drop_path): DropPath()
      (norm2): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
      (mlp): Mlp(
        (fc1): Linear(in_features=768, out_features=3072, bias=True)
        (act): GELU(approximate='none')
        (fc2): Linear(in_features=3072, out_features=768, bias=True)
        (drop): Dropout(p=0.0, inplace=False)
      )
    )
  )
  (norm): Identity()
  (fc_norm): LayerNorm((768,), eps=1e-06, elementwise_affine=True)
  (head): Linear(in_features=768, out_features=2, bias=True)
)
[2025-04-03 00:29:03 simmim_finetune] (optimizer.py 70): INFO >>>>>>>>>> Build Optimizer for Fine-tuning Stage
[2025-04-03 00:29:03 simmim_finetune] (optimizer.py 87): INFO No weight decay: {'pos_embed', 'cls_token'}
[2025-04-03 00:29:03 simmim_finetune] (optimizer.py 182): INFO Param groups = {
  "layer_0_no_decay": {
    "group_name": "layer_0_no_decay",
    "weight_decay": 0.0,
    "params": [
      "cls_token",
      "patch_embed.proj.bias"
    ],
    "lr": 4.621507363773394e-06,
    "lr_scale": 0.003697205891018715
  },
  "layer_0_decay": {
    "group_name": "layer_0_decay",
    "weight_decay": 0.05,
    "params": [
      "patch_embed.proj.weight"
    ],
    "lr": 4.621507363773394e-06,
    "lr_scale": 0.003697205891018715
  },
  "layer_1_no_decay": {
    "group_name": "layer_1_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.0.gamma_1",
      "blocks.0.gamma_2",
      "blocks.0.norm1.weight",
      "blocks.0.norm1.bias",
      "blocks.0.attn.q_bias",
      "blocks.0.attn.v_bias",
      "blocks.0.attn.proj.bias",
      "blocks.0.norm2.weight",
      "blocks.0.norm2.bias",
      "blocks.0.mlp.fc1.bias",
      "blocks.0.mlp.fc2.bias"
    ],
    "lr": 7.110011328882144e-06,
    "lr_scale": 0.005688009063105715
  },
  "layer_1_decay": {
    "group_name": "layer_1_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.0.attn.relative_position_bias_table",
      "blocks.0.attn.qkv.weight",
      "blocks.0.attn.proj.weight",
      "blocks.0.mlp.fc1.weight",
      "blocks.0.mlp.fc2.weight"
    ],
    "lr": 7.110011328882144e-06,
    "lr_scale": 0.005688009063105715
  },
  "layer_2_no_decay": {
    "group_name": "layer_2_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.1.gamma_1",
      "blocks.1.gamma_2",
      "blocks.1.norm1.weight",
      "blocks.1.norm1.bias",
      "blocks.1.attn.q_bias",
      "blocks.1.attn.v_bias",
      "blocks.1.attn.proj.bias",
      "blocks.1.norm2.weight",
      "blocks.1.norm2.bias",
      "blocks.1.mlp.fc1.bias",
      "blocks.1.mlp.fc2.bias"
    ],
    "lr": 1.093847896751099e-05,
    "lr_scale": 0.008750783174008792
  },
  "layer_2_decay": {
    "group_name": "layer_2_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.1.attn.relative_position_bias_table",
      "blocks.1.attn.qkv.weight",
      "blocks.1.attn.proj.weight",
      "blocks.1.mlp.fc1.weight",
      "blocks.1.mlp.fc2.weight"
    ],
    "lr": 1.093847896751099e-05,
    "lr_scale": 0.008750783174008792
  },
  "layer_3_no_decay": {
    "group_name": "layer_3_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.2.gamma_1",
      "blocks.2.gamma_2",
      "blocks.2.norm1.weight",
      "blocks.2.norm1.bias",
      "blocks.2.attn.q_bias",
      "blocks.2.attn.v_bias",
      "blocks.2.attn.proj.bias",
      "blocks.2.norm2.weight",
      "blocks.2.norm2.bias",
      "blocks.2.mlp.fc1.bias",
      "blocks.2.mlp.fc2.bias"
    ],
    "lr": 1.682842918078614e-05,
    "lr_scale": 0.013462743344628911
  },
  "layer_3_decay": {
    "group_name": "layer_3_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.2.attn.relative_position_bias_table",
      "blocks.2.attn.qkv.weight",
      "blocks.2.attn.proj.weight",
      "blocks.2.mlp.fc1.weight",
      "blocks.2.mlp.fc2.weight"
    ],
    "lr": 1.682842918078614e-05,
    "lr_scale": 0.013462743344628911
  },
  "layer_4_no_decay": {
    "group_name": "layer_4_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.3.gamma_1",
      "blocks.3.gamma_2",
      "blocks.3.norm1.weight",
      "blocks.3.norm1.bias",
      "blocks.3.attn.q_bias",
      "blocks.3.attn.v_bias",
      "blocks.3.attn.proj.bias",
      "blocks.3.norm2.weight",
      "blocks.3.norm2.bias",
      "blocks.3.mlp.fc1.bias",
      "blocks.3.mlp.fc2.bias"
    ],
    "lr": 2.588989104736329e-05,
    "lr_scale": 0.02071191283789063
  },
  "layer_4_decay": {
    "group_name": "layer_4_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.3.attn.relative_position_bias_table",
      "blocks.3.attn.qkv.weight",
      "blocks.3.attn.proj.weight",
      "blocks.3.mlp.fc1.weight",
      "blocks.3.mlp.fc2.weight"
    ],
    "lr": 2.588989104736329e-05,
    "lr_scale": 0.02071191283789063
  },
  "layer_5_no_decay": {
    "group_name": "layer_5_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.4.gamma_1",
      "blocks.4.gamma_2",
      "blocks.4.norm1.weight",
      "blocks.4.norm1.bias",
      "blocks.4.attn.q_bias",
      "blocks.4.attn.v_bias",
      "blocks.4.attn.proj.bias",
      "blocks.4.norm2.weight",
      "blocks.4.norm2.bias",
      "blocks.4.mlp.fc1.bias",
      "blocks.4.mlp.fc2.bias"
    ],
    "lr": 3.983060161132814e-05,
    "lr_scale": 0.03186448128906251
  },
  "layer_5_decay": {
    "group_name": "layer_5_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.4.attn.relative_position_bias_table",
      "blocks.4.attn.qkv.weight",
      "blocks.4.attn.proj.weight",
      "blocks.4.mlp.fc1.weight",
      "blocks.4.mlp.fc2.weight"
    ],
    "lr": 3.983060161132814e-05,
    "lr_scale": 0.03186448128906251
  },
  "layer_6_no_decay": {
    "group_name": "layer_6_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.5.gamma_1",
      "blocks.5.gamma_2",
      "blocks.5.norm1.weight",
      "blocks.5.norm1.bias",
      "blocks.5.attn.q_bias",
      "blocks.5.attn.v_bias",
      "blocks.5.attn.proj.bias",
      "blocks.5.norm2.weight",
      "blocks.5.norm2.bias",
      "blocks.5.mlp.fc1.bias",
      "blocks.5.mlp.fc2.bias"
    ],
    "lr": 6.127784863281252e-05,
    "lr_scale": 0.049022278906250015
  },
  "layer_6_decay": {
    "group_name": "layer_6_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.5.attn.relative_position_bias_table",
      "blocks.5.attn.qkv.weight",
      "blocks.5.attn.proj.weight",
      "blocks.5.mlp.fc1.weight",
      "blocks.5.mlp.fc2.weight"
    ],
    "lr": 6.127784863281252e-05,
    "lr_scale": 0.049022278906250015
  },
  "layer_7_no_decay": {
    "group_name": "layer_7_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.6.gamma_1",
      "blocks.6.gamma_2",
      "blocks.6.norm1.weight",
      "blocks.6.norm1.bias",
      "blocks.6.attn.q_bias",
      "blocks.6.attn.v_bias",
      "blocks.6.attn.proj.bias",
      "blocks.6.norm2.weight",
      "blocks.6.norm2.bias",
      "blocks.6.mlp.fc1.bias",
      "blocks.6.mlp.fc2.bias"
    ],
    "lr": 9.427361328125001e-05,
    "lr_scale": 0.07541889062500001
  },
  "layer_7_decay": {
    "group_name": "layer_7_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.6.attn.relative_position_bias_table",
      "blocks.6.attn.qkv.weight",
      "blocks.6.attn.proj.weight",
      "blocks.6.mlp.fc1.weight",
      "blocks.6.mlp.fc2.weight"
    ],
    "lr": 9.427361328125001e-05,
    "lr_scale": 0.07541889062500001
  },
  "layer_8_no_decay": {
    "group_name": "layer_8_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.7.gamma_1",
      "blocks.7.gamma_2",
      "blocks.7.norm1.weight",
      "blocks.7.norm1.bias",
      "blocks.7.attn.q_bias",
      "blocks.7.attn.v_bias",
      "blocks.7.attn.proj.bias",
      "blocks.7.norm2.weight",
      "blocks.7.norm2.bias",
      "blocks.7.mlp.fc1.bias",
      "blocks.7.mlp.fc2.bias"
    ],
    "lr": 0.00014503632812500002,
    "lr_scale": 0.11602906250000002
  },
  "layer_8_decay": {
    "group_name": "layer_8_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.7.attn.relative_position_bias_table",
      "blocks.7.attn.qkv.weight",
      "blocks.7.attn.proj.weight",
      "blocks.7.mlp.fc1.weight",
      "blocks.7.mlp.fc2.weight"
    ],
    "lr": 0.00014503632812500002,
    "lr_scale": 0.11602906250000002
  },
  "layer_9_no_decay": {
    "group_name": "layer_9_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.8.gamma_1",
      "blocks.8.gamma_2",
      "blocks.8.norm1.weight",
      "blocks.8.norm1.bias",
      "blocks.8.attn.q_bias",
      "blocks.8.attn.v_bias",
      "blocks.8.attn.proj.bias",
      "blocks.8.norm2.weight",
      "blocks.8.norm2.bias",
      "blocks.8.mlp.fc1.bias",
      "blocks.8.mlp.fc2.bias"
    ],
    "lr": 0.00022313281250000005,
    "lr_scale": 0.17850625000000003
  },
  "layer_9_decay": {
    "group_name": "layer_9_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.8.attn.relative_position_bias_table",
      "blocks.8.attn.qkv.weight",
      "blocks.8.attn.proj.weight",
      "blocks.8.mlp.fc1.weight",
      "blocks.8.mlp.fc2.weight"
    ],
    "lr": 0.00022313281250000005,
    "lr_scale": 0.17850625000000003
  },
  "layer_10_no_decay": {
    "group_name": "layer_10_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.9.gamma_1",
      "blocks.9.gamma_2",
      "blocks.9.norm1.weight",
      "blocks.9.norm1.bias",
      "blocks.9.attn.q_bias",
      "blocks.9.attn.v_bias",
      "blocks.9.attn.proj.bias",
      "blocks.9.norm2.weight",
      "blocks.9.norm2.bias",
      "blocks.9.mlp.fc1.bias",
      "blocks.9.mlp.fc2.bias"
    ],
    "lr": 0.00034328125,
    "lr_scale": 0.274625
  },
  "layer_10_decay": {
    "group_name": "layer_10_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.9.attn.relative_position_bias_table",
      "blocks.9.attn.qkv.weight",
      "blocks.9.attn.proj.weight",
      "blocks.9.mlp.fc1.weight",
      "blocks.9.mlp.fc2.weight"
    ],
    "lr": 0.00034328125,
    "lr_scale": 0.274625
  },
  "layer_11_no_decay": {
    "group_name": "layer_11_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.10.gamma_1",
      "blocks.10.gamma_2",
      "blocks.10.norm1.weight",
      "blocks.10.norm1.bias",
      "blocks.10.attn.q_bias",
      "blocks.10.attn.v_bias",
      "blocks.10.attn.proj.bias",
      "blocks.10.norm2.weight",
      "blocks.10.norm2.bias",
      "blocks.10.mlp.fc1.bias",
      "blocks.10.mlp.fc2.bias"
    ],
    "lr": 0.0005281250000000001,
    "lr_scale": 0.42250000000000004
  },
  "layer_11_decay": {
    "group_name": "layer_11_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.10.attn.relative_position_bias_table",
      "blocks.10.attn.qkv.weight",
      "blocks.10.attn.proj.weight",
      "blocks.10.mlp.fc1.weight",
      "blocks.10.mlp.fc2.weight"
    ],
    "lr": 0.0005281250000000001,
    "lr_scale": 0.42250000000000004
  },
  "layer_12_no_decay": {
    "group_name": "layer_12_no_decay",
    "weight_decay": 0.0,
    "params": [
      "blocks.11.gamma_1",
      "blocks.11.gamma_2",
      "blocks.11.norm1.weight",
      "blocks.11.norm1.bias",
      "blocks.11.attn.q_bias",
      "blocks.11.attn.v_bias",
      "blocks.11.attn.proj.bias",
      "blocks.11.norm2.weight",
      "blocks.11.norm2.bias",
      "blocks.11.mlp.fc1.bias",
      "blocks.11.mlp.fc2.bias"
    ],
    "lr": 0.0008125000000000001,
    "lr_scale": 0.65
  },
  "layer_12_decay": {
    "group_name": "layer_12_decay",
    "weight_decay": 0.05,
    "params": [
      "blocks.11.attn.relative_position_bias_table",
      "blocks.11.attn.qkv.weight",
      "blocks.11.attn.proj.weight",
      "blocks.11.mlp.fc1.weight",
      "blocks.11.mlp.fc2.weight"
    ],
    "lr": 0.0008125000000000001,
    "lr_scale": 0.65
  },
  "layer_13_no_decay": {
    "group_name": "layer_13_no_decay",
    "weight_decay": 0.0,
    "params": [
      "fc_norm.weight",
      "fc_norm.bias",
      "head.bias"
    ],
    "lr": 0.00125,
    "lr_scale": 1.0
  },
  "layer_13_decay": {
    "group_name": "layer_13_decay",
    "weight_decay": 0.05,
    "params": [
      "head.weight"
    ],
    "lr": 0.00125,
    "lr_scale": 1.0
  }
}
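Annotation: the lr and lr_scale values in the param-group dump above follow the usual layer-wise lr decay rule: with LAYER_DECAY 0.65 and 14 layer groups (layer 0 = patch embed, layers 1-12 = the transformer blocks, layer 13 = fc_norm/head), each group gets lr_scale = 0.65^(13 - layer) and lr = BASE_LR * lr_scale. A short sketch that reproduces the logged numbers (the rule is inferred from the dump, not copied from the repo):

```python
# Reproduce the lr_scale / lr values printed in the param-group dump above.
# Inferred rule: lr_scale(layer) = LAYER_DECAY ** (NUM_LAYERS + 1 - layer),
# with 12 transformer blocks, layer 0 = patch embed, layer 13 = head.
LAYER_DECAY = 0.65
BASE_LR = 0.00125
NUM_LAYERS = 12  # ViT-B depth

def lr_scale(layer_id: int) -> float:
    return LAYER_DECAY ** (NUM_LAYERS + 1 - layer_id)

for layer_id in range(NUM_LAYERS + 2):  # layers 0..13
    scale = lr_scale(layer_id)
    print(f"layer_{layer_id}: lr_scale={scale:.6g}  lr={BASE_LR * scale:.6g}")
```

For example, layer 0 gives 0.65^13 ≈ 0.0036972 and lr ≈ 4.6215e-06, matching `layer_0_no_decay` above.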
[2025-04-03 00:29:03 simmim_finetune] (optimizer.py 105): INFO AdamW (
Parameter Group 0
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_0_no_decay
    lr: 4.621507363773394e-06
    lr_scale: 0.003697205891018715
    maximize: False
    weight_decay: 0.0

Parameter Group 1
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_0_decay
    lr: 4.621507363773394e-06
    lr_scale: 0.003697205891018715
    maximize: False
    weight_decay: 0.05

Parameter Group 2
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_1_no_decay
    lr: 7.110011328882144e-06
    lr_scale: 0.005688009063105715
    maximize: False
    weight_decay: 0.0

Parameter Group 3
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_1_decay
    lr: 7.110011328882144e-06
    lr_scale: 0.005688009063105715
    maximize: False
    weight_decay: 0.05

Parameter Group 4
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_2_no_decay
    lr: 1.093847896751099e-05
    lr_scale: 0.008750783174008792
    maximize: False
    weight_decay: 0.0

Parameter Group 5
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_2_decay
    lr: 1.093847896751099e-05
    lr_scale: 0.008750783174008792
    maximize: False
    weight_decay: 0.05

Parameter Group 6
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_3_no_decay
    lr: 1.682842918078614e-05
    lr_scale: 0.013462743344628911
    maximize: False
    weight_decay: 0.0

Parameter Group 7
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_3_decay
    lr: 1.682842918078614e-05
    lr_scale: 0.013462743344628911
    maximize: False
    weight_decay: 0.05

Parameter Group 8
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_4_no_decay
    lr: 2.588989104736329e-05
    lr_scale: 0.02071191283789063
    maximize: False
    weight_decay: 0.0

Parameter Group 9
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_4_decay
    lr: 2.588989104736329e-05
    lr_scale: 0.02071191283789063
    maximize: False
    weight_decay: 0.05

Parameter Group 10
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_5_no_decay
    lr: 3.983060161132814e-05
    lr_scale: 0.03186448128906251
    maximize: False
    weight_decay: 0.0

Parameter Group 11
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_5_decay
    lr: 3.983060161132814e-05
    lr_scale: 0.03186448128906251
    maximize: False
    weight_decay: 0.05

Parameter Group 12
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_6_no_decay
    lr: 6.127784863281252e-05
    lr_scale: 0.049022278906250015
    maximize: False
    weight_decay: 0.0

Parameter Group 13
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_6_decay
    lr: 6.127784863281252e-05
    lr_scale: 0.049022278906250015
    maximize: False
    weight_decay: 0.05

Parameter Group 14
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_7_no_decay
    lr: 9.427361328125001e-05
    lr_scale: 0.07541889062500001
    maximize: False
    weight_decay: 0.0

Parameter Group 15
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_7_decay
    lr: 9.427361328125001e-05
    lr_scale: 0.07541889062500001
    maximize: False
    weight_decay: 0.05

Parameter Group 16
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_8_no_decay
    lr: 0.00014503632812500002
    lr_scale: 0.11602906250000002
    maximize: False
    weight_decay: 0.0

Parameter Group 17
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_8_decay
    lr: 0.00014503632812500002
    lr_scale: 0.11602906250000002
    maximize: False
    weight_decay: 0.05

Parameter Group 18
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_9_no_decay
    lr: 0.00022313281250000005
    lr_scale: 0.17850625000000003
    maximize: False
    weight_decay: 0.0

Parameter Group 19
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_9_decay
    lr: 0.00022313281250000005
    lr_scale: 0.17850625000000003
    maximize: False
    weight_decay: 0.05

Parameter Group 20
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_10_no_decay
    lr: 0.00034328125
    lr_scale: 0.274625
    maximize: False
    weight_decay: 0.0

Parameter Group 21
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_10_decay
    lr: 0.00034328125
    lr_scale: 0.274625
    maximize: False
    weight_decay: 0.05

Parameter Group 22
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_11_no_decay
    lr: 0.0005281250000000001
    lr_scale: 0.42250000000000004
    maximize: False
    weight_decay: 0.0

Parameter Group 23
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_11_decay
    lr: 0.0005281250000000001
    lr_scale: 0.42250000000000004
    maximize: False
    weight_decay: 0.05

Parameter Group 24
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_12_no_decay
    lr: 0.0008125000000000001
    lr_scale: 0.65
    maximize: False
    weight_decay: 0.0

Parameter Group 25
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_12_decay
    lr: 0.0008125000000000001
    lr_scale: 0.65
    maximize: False
    weight_decay: 0.05

Parameter Group 26
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_13_no_decay
    lr: 0.00125
    lr_scale: 1.0
    maximize: False
    weight_decay: 0.0

Parameter Group 27
    amsgrad: False
    betas: (0.9, 0.999)
    capturable: False
    differentiable: False
    eps: 1e-08
    foreach: None
    fused: None
    group_name: layer_13_decay
    lr: 0.00125
    lr_scale: 1.0
    maximize: False
    weight_decay: 0.05
)
[2025-04-03 00:29:03 simmim_finetune] (main_finetune.py 116): INFO number of params: 85763522
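Annotation: the logged parameter count can be reconstructed from the module repr and param-group names above: ViT-B/16 (12 blocks, dim 768, MLP 3072), separate q/v biases instead of a qkv bias, layer-scale gammas, a per-block BEiT-style relative position bias table (assumed to have (2*14-1)^2 + 3 entries per head, based on the key names), and a 2-class head. A bookkeeping sketch:

```python
# Verify the logged parameter count (85763522) for this ViT-B/16 setup.
# Assumption: the per-block relative position bias table is BEiT-style with
# (2*14-1)**2 + 3 entries per head (inferred from the param names above).
D, MLP, HEADS, BLOCKS, TOKENS = 768, 3072, 12, 12, 14  # 224/16 = 14 patches/side

patch_embed = 3 * D * 16 * 16 + D          # conv proj weight + bias
cls_token = D
rpb_table = ((2 * TOKENS - 1) ** 2 + 3) * HEADS
per_block = (
    D * 3 * D          # qkv weight (bias=False in the repr)
    + 2 * D            # q_bias, v_bias
    + D * D + D        # attn proj weight + bias
    + 2 * (2 * D)      # norm1, norm2 (weight + bias each)
    + 2 * D            # gamma_1, gamma_2 (layer scale)
    + D * MLP + MLP    # mlp fc1
    + MLP * D + D      # mlp fc2
    + rpb_table
)
fc_norm = 2 * D
head = D * 2 + 2       # 2-class linear head

total = patch_embed + cls_token + BLOCKS * per_block + fc_norm + head
print(total)  # 85763522, matching the log line above
```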
[2025-04-03 00:29:03 simmim_finetune] (utils.py 81): INFO All checkpoints found in checkpoint/hand: []
[2025-04-03 00:29:03 simmim_finetune] (main_finetune.py 146): INFO no checkpoint found in checkpoint/hand, ignoring auto resume
[2025-04-03 00:29:03 simmim_finetune] (utils.py 99): INFO >>>>>>>>>> Fine-tuned from pretrain/simmim_pretrain__vit_base__img224__800ep.pth ..........
[2025-04-03 00:29:04 simmim_finetune] (utils.py 105): INFO Detect pre-trained model, remove [encoder.] prefix.
[2025-04-03 00:29:04 simmim_finetune] (utils.py 113): INFO >>>>>>>>>> Remapping pre-trained keys for VIT ..........
[2025-04-03 00:29:04 simmim_finetune] (utils.py 210): INFO Expand the shared relative position embedding to each transformer block.
[2025-04-03 00:29:04 simmim_finetune] (utils.py 119): INFO _IncompatibleKeys(missing_keys=['blocks.0.attn.relative_position_index', 'blocks.1.attn.relative_position_index', 'blocks.2.attn.relative_position_index', 'blocks.3.attn.relative_position_index', 'blocks.4.attn.relative_position_index', 'blocks.5.attn.relative_position_index', 'blocks.6.attn.relative_position_index', 'blocks.7.attn.relative_position_index', 'blocks.8.attn.relative_position_index', 'blocks.9.attn.relative_position_index', 'blocks.10.attn.relative_position_index', 'blocks.11.attn.relative_position_index', 'fc_norm.weight', 'fc_norm.bias', 'head.weight', 'head.bias'], unexpected_keys=['mask_token', 'norm.weight', 'norm.bias'])
[2025-04-03 00:29:04 simmim_finetune] (utils.py 123): INFO >>>>>>>>>> loaded successfully 'pretrain/simmim_pretrain__vit_base__img224__800ep.pth'
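Annotation: the load sequence above works by stripping the `encoder.` prefix that SimMIM pre-training adds to backbone keys, then loading with `strict=False`, which produces the `_IncompatibleKeys` report (the classifier's `fc_norm`/`head` keys are missing from the checkpoint; the pre-training-only `mask_token` and final `norm` are unexpected). An illustrative sketch of the remapping step, not the repo's exact code:

```python
# Illustrative sketch of the key remapping logged above: drop the "encoder."
# prefix from pre-trained keys so they line up with the fine-tuning backbone.
def strip_encoder_prefix(state_dict: dict) -> dict:
    return {
        (k[len("encoder."):] if k.startswith("encoder.") else k): v
        for k, v in state_dict.items()
    }

ckpt = {"encoder.patch_embed.proj.weight": "w", "mask_token": "m"}
print(strip_encoder_prefix(ckpt))
# {'patch_embed.proj.weight': 'w', 'mask_token': 'm'}
```

The remapped dict would then be passed to `model.load_state_dict(..., strict=False)`, leaving mismatched keys to be reported rather than raised.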
[2025-04-03 00:29:04 simmim_finetune] (main_finetune.py 161): INFO Start training
[2025-04-03 00:29:05 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07, 2.5e-07]
[2025-04-03 00:29:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][0/311]	eta 1:11:45 lr 0.000000	time 13.8433 (13.8433)	loss 0.6931 (0.6931)	grad_norm 0.9874 (0.9874)	mem 19956MB
[2025-04-03 00:29:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][2/311]	eta 0:27:18 lr 0.000003	time 0.8823 (5.3012)	loss 0.6932 (0.6932)	grad_norm 1.1137 (1.4141)	mem 20675MB
[2025-04-03 00:29:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][4/311]	eta 0:18:04 lr 0.000006	time 0.8832 (3.5334)	loss 0.6931 (0.6931)	grad_norm 0.8324 (1.2476)	mem 20675MB
[2025-04-03 00:29:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][6/311]	eta 0:14:06 lr 0.000008	time 0.8817 (2.7759)	loss 0.6928 (0.6931)	grad_norm 2.0029 (1.4625)	mem 20675MB
[2025-04-03 00:29:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][8/311]	eta 0:11:53 lr 0.000011	time 0.8774 (2.3544)	loss 0.6926 (0.6930)	grad_norm 1.5783 (1.6104)	mem 20675MB
[2025-04-03 00:29:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][10/311]	eta 0:10:27 lr 0.000014	time 0.8773 (2.0860)	loss 0.6923 (0.6929)	grad_norm 1.4544 (1.5336)	mem 20675MB
[2025-04-03 00:29:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][12/311]	eta 0:09:28 lr 0.000016	time 0.8775 (1.9005)	loss 0.6901 (0.6928)	grad_norm 3.8417 (1.6873)	mem 20675MB
[2025-04-03 00:29:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][14/311]	eta 0:08:44 lr 0.000019	time 0.8786 (1.7646)	loss 0.6927 (0.6927)	grad_norm 0.5976 (1.6246)	mem 20675MB
[2025-04-03 00:29:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][16/311]	eta 0:08:09 lr 0.000022	time 0.8773 (1.6602)	loss 0.6892 (0.6926)	grad_norm 2.2345 (1.6285)	mem 20675MB
[2025-04-03 00:29:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][18/311]	eta 0:07:42 lr 0.000024	time 0.8781 (1.5780)	loss 0.6865 (0.6923)	grad_norm 3.0743 (1.6625)	mem 20675MB
[2025-04-03 00:29:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][20/311]	eta 0:07:19 lr 0.000027	time 0.8772 (1.5114)	loss 0.6936 (0.6923)	grad_norm 0.8182 (1.5930)	mem 20675MB
[2025-04-03 00:29:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][22/311]	eta 0:07:00 lr 0.000030	time 0.8779 (1.4567)	loss 0.6893 (0.6921)	grad_norm 0.8477 (1.5166)	mem 20675MB
[2025-04-03 00:29:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][24/311]	eta 0:06:44 lr 0.000032	time 0.8776 (1.4107)	loss 0.6936 (0.6921)	grad_norm 0.8015 (1.4569)	mem 20675MB
[2025-04-03 00:29:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][26/311]	eta 0:06:30 lr 0.000035	time 0.8777 (1.3713)	loss 0.6909 (0.6921)	grad_norm 0.4007 (1.3996)	mem 20675MB
[2025-04-03 00:29:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][28/311]	eta 0:06:18 lr 0.000038	time 0.8774 (1.3376)	loss 0.6917 (0.6922)	grad_norm 0.7822 (1.3863)	mem 20675MB
[2025-04-03 00:29:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][30/311]	eta 0:06:07 lr 0.000040	time 0.8801 (1.3080)	loss 0.7021 (0.6924)	grad_norm 2.8478 (1.4161)	mem 20675MB
[2025-04-03 00:29:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][32/311]	eta 0:05:57 lr 0.000043	time 0.8776 (1.2820)	loss 0.6871 (0.6923)	grad_norm 1.2809 (1.3906)	mem 20675MB
[2025-04-03 00:29:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][34/311]	eta 0:05:48 lr 0.000046	time 0.8973 (1.2598)	loss 0.6864 (0.6919)	grad_norm 1.3007 (1.3726)	mem 20675MB
[2025-04-03 00:29:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][36/311]	eta 0:05:40 lr 0.000048	time 0.8767 (1.2394)	loss 0.6867 (0.6917)	grad_norm 1.3002 (1.3486)	mem 20675MB
[2025-04-03 00:29:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][38/311]	eta 0:05:33 lr 0.000051	time 0.8794 (1.2209)	loss 0.6997 (0.6920)	grad_norm 2.7868 (1.3809)	mem 20675MB
[2025-04-03 00:29:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][40/311]	eta 0:05:26 lr 0.000054	time 0.8771 (1.2042)	loss 0.6957 (0.6919)	grad_norm 1.8672 (1.3948)	mem 20675MB
[2025-04-03 00:29:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][42/311]	eta 0:05:19 lr 0.000057	time 0.8819 (1.1891)	loss 0.6876 (0.6917)	grad_norm 2.0385 (1.4040)	mem 20675MB
[2025-04-03 00:29:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][44/311]	eta 0:05:13 lr 0.000059	time 0.8796 (1.1754)	loss 0.6900 (0.6915)	grad_norm 2.1311 (1.4098)	mem 20675MB
[2025-04-03 00:29:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][46/311]	eta 0:05:08 lr 0.000062	time 0.8779 (1.1628)	loss 0.6941 (0.6916)	grad_norm 2.2822 (1.4237)	mem 20675MB
[2025-04-03 00:30:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][48/311]	eta 0:05:02 lr 0.000065	time 0.8815 (1.1513)	loss 0.6865 (0.6913)	grad_norm 2.4880 (1.4483)	mem 20675MB
[2025-04-03 00:30:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][50/311]	eta 0:04:57 lr 0.000067	time 0.8768 (1.1405)	loss 0.6957 (0.6912)	grad_norm 1.7250 (1.4411)	mem 20675MB
[2025-04-03 00:30:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][52/311]	eta 0:04:52 lr 0.000070	time 0.8767 (1.1306)	loss 0.6737 (0.6909)	grad_norm 2.0666 (1.4537)	mem 20675MB
[2025-04-03 00:30:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][54/311]	eta 0:04:48 lr 0.000073	time 0.8767 (1.1214)	loss 0.6902 (0.6910)	grad_norm 1.5280 (1.4648)	mem 20675MB
[2025-04-03 00:30:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][56/311]	eta 0:04:43 lr 0.000075	time 0.8765 (1.1129)	loss 0.6844 (0.6909)	grad_norm 1.5790 (1.4790)	mem 20675MB
[2025-04-03 00:30:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][58/311]	eta 0:04:39 lr 0.000078	time 0.8769 (1.1049)	loss 0.6915 (0.6907)	grad_norm 0.8906 (1.4662)	mem 20675MB
[2025-04-03 00:30:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][60/311]	eta 0:04:35 lr 0.000081	time 0.8764 (1.0974)	loss 0.6898 (0.6906)	grad_norm 1.8675 (1.4791)	mem 20675MB
[2025-04-03 00:30:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][62/311]	eta 0:04:31 lr 0.000083	time 0.8764 (1.0905)	loss 0.6751 (0.6903)	grad_norm 1.6104 (1.4931)	mem 20675MB
[2025-04-03 00:30:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][64/311]	eta 0:04:27 lr 0.000086	time 0.8764 (1.0839)	loss 0.6559 (0.6896)	grad_norm 2.4134 (1.5205)	mem 20675MB
[2025-04-03 00:30:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][66/311]	eta 0:04:24 lr 0.000089	time 0.8764 (1.0777)	loss 0.6650 (0.6889)	grad_norm 1.5387 (1.5257)	mem 20675MB
[2025-04-03 00:30:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][68/311]	eta 0:04:20 lr 0.000091	time 0.8764 (1.0719)	loss 0.6930 (0.6889)	grad_norm 1.2152 (1.5304)	mem 20675MB
[2025-04-03 00:30:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][70/311]	eta 0:04:17 lr 0.000094	time 0.8763 (1.0664)	loss 0.6901 (0.6884)	grad_norm 1.6984 (1.5466)	mem 20675MB
[2025-04-03 00:30:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][72/311]	eta 0:04:13 lr 0.000097	time 0.8765 (1.0613)	loss 0.6747 (0.6882)	grad_norm 2.2913 (1.5608)	mem 20675MB
[2025-04-03 00:30:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][74/311]	eta 0:04:10 lr 0.000099	time 0.8764 (1.0564)	loss 0.6747 (0.6880)	grad_norm 3.0657 (1.6002)	mem 20675MB
[2025-04-03 00:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][76/311]	eta 0:04:07 lr 0.000102	time 0.8761 (1.0517)	loss 0.6789 (0.6877)	grad_norm 2.0107 (1.6188)	mem 20675MB
[2025-04-03 00:30:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][78/311]	eta 0:04:04 lr 0.000105	time 0.8765 (1.0473)	loss 0.6527 (0.6869)	grad_norm 1.4168 (1.6259)	mem 20675MB
[2025-04-03 00:30:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][80/311]	eta 0:04:00 lr 0.000107	time 0.8764 (1.0431)	loss 0.6725 (0.6866)	grad_norm 1.8477 (1.6191)	mem 20675MB
[2025-04-03 00:30:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][82/311]	eta 0:03:57 lr 0.000110	time 0.8765 (1.0391)	loss 0.6584 (0.6863)	grad_norm 2.4601 (1.6326)	mem 20675MB
[2025-04-03 00:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][84/311]	eta 0:03:55 lr 0.000113	time 0.8772 (1.0353)	loss 0.6790 (0.6858)	grad_norm 1.4334 (1.6322)	mem 20675MB
[2025-04-03 00:30:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][86/311]	eta 0:03:52 lr 0.000115	time 0.8768 (1.0317)	loss 0.6686 (0.6850)	grad_norm 2.4281 (1.6493)	mem 20675MB
[2025-04-03 00:30:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][88/311]	eta 0:03:49 lr 0.000118	time 0.8770 (1.0282)	loss 0.6440 (0.6840)	grad_norm 2.0065 (1.6727)	mem 20675MB
[2025-04-03 00:30:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][90/311]	eta 0:03:46 lr 0.000121	time 0.8770 (1.0249)	loss 0.6757 (0.6836)	grad_norm 2.3713 (1.6885)	mem 20675MB
[2025-04-03 00:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][92/311]	eta 0:03:43 lr 0.000123	time 0.8772 (1.0217)	loss 0.6264 (0.6829)	grad_norm 1.4073 (1.7090)	mem 20675MB
[2025-04-03 00:30:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][94/311]	eta 0:03:41 lr 0.000126	time 0.8766 (1.0187)	loss 0.7054 (0.6831)	grad_norm 3.2754 (1.7233)	mem 20675MB
[2025-04-03 00:30:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][96/311]	eta 0:03:38 lr 0.000129	time 0.8764 (1.0158)	loss 0.6971 (0.6830)	grad_norm 2.3435 (1.7502)	mem 20675MB
[2025-04-03 00:30:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][98/311]	eta 0:03:35 lr 0.000132	time 0.8763 (1.0130)	loss 0.6807 (0.6830)	grad_norm 1.1786 (1.7466)	mem 20675MB
[2025-04-03 00:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][100/311]	eta 0:03:33 lr 0.000134	time 0.8765 (1.0103)	loss 0.6782 (0.6829)	grad_norm 1.2355 (1.7372)	mem 20675MB
[2025-04-03 00:30:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][102/311]	eta 0:03:30 lr 0.000137	time 0.8767 (1.0077)	loss 0.6863 (0.6833)	grad_norm 2.0021 (1.7803)	mem 20675MB
[2025-04-03 00:30:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][104/311]	eta 0:03:28 lr 0.000140	time 0.8764 (1.0053)	loss 0.6860 (0.6834)	grad_norm 1.4342 (1.7693)	mem 20675MB
[2025-04-03 00:30:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][106/311]	eta 0:03:25 lr 0.000142	time 0.8771 (1.0029)	loss 0.6647 (0.6832)	grad_norm 3.0465 (1.7769)	mem 20675MB
[2025-04-03 00:30:54 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][108/311]	eta 0:03:23 lr 0.000145	time 0.8770 (1.0006)	loss 0.6684 (0.6829)	grad_norm 0.8217 (1.7709)	mem 20675MB
[2025-04-03 00:30:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][110/311]	eta 0:03:20 lr 0.000148	time 0.8768 (0.9985)	loss 0.6878 (0.6828)	grad_norm 2.2877 (1.7680)	mem 20675MB
[2025-04-03 00:30:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][112/311]	eta 0:03:18 lr 0.000150	time 0.8768 (0.9964)	loss 0.6810 (0.6829)	grad_norm 1.7494 (1.7767)	mem 20675MB
[2025-04-03 00:30:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][114/311]	eta 0:03:15 lr 0.000153	time 0.8768 (0.9944)	loss 0.6798 (0.6827)	grad_norm 2.4096 (1.7755)	mem 20675MB
[2025-04-03 00:31:01 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][116/311]	eta 0:03:13 lr 0.000156	time 0.8770 (0.9924)	loss 0.6692 (0.6826)	grad_norm 1.4767 (1.7751)	mem 20675MB
[2025-04-03 00:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][118/311]	eta 0:03:11 lr 0.000158	time 0.8763 (0.9905)	loss 0.6880 (0.6826)	grad_norm 1.4437 (1.7762)	mem 20675MB
[2025-04-03 00:31:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][120/311]	eta 0:03:08 lr 0.000161	time 0.8768 (0.9887)	loss 0.6842 (0.6825)	grad_norm 1.9078 (1.7811)	mem 20675MB
[2025-04-03 00:31:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][122/311]	eta 0:03:06 lr 0.000164	time 0.8774 (0.9869)	loss 0.6842 (0.6823)	grad_norm 1.6075 (1.7747)	mem 20675MB
[2025-04-03 00:31:08 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][124/311]	eta 0:03:04 lr 0.000166	time 0.8765 (0.9852)	loss 0.6475 (0.6820)	grad_norm 2.1016 (1.7745)	mem 20675MB
[2025-04-03 00:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][126/311]	eta 0:03:01 lr 0.000169	time 0.8767 (0.9835)	loss 0.6727 (0.6819)	grad_norm 1.4634 (1.7794)	mem 20675MB
[2025-04-03 00:31:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][128/311]	eta 0:02:59 lr 0.000172	time 0.8764 (0.9818)	loss 0.6701 (0.6816)	grad_norm 2.3708 (1.7800)	mem 20675MB
[2025-04-03 00:31:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][130/311]	eta 0:02:57 lr 0.000174	time 0.8765 (0.9802)	loss 0.6796 (0.6816)	grad_norm 1.2643 (1.7800)	mem 20675MB
[2025-04-03 00:31:15 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][132/311]	eta 0:02:55 lr 0.000177	time 0.8769 (0.9787)	loss 0.6807 (0.6816)	grad_norm 1.2461 (1.7776)	mem 20675MB
[2025-04-03 00:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][134/311]	eta 0:02:52 lr 0.000180	time 0.8764 (0.9773)	loss 0.6557 (0.6815)	grad_norm 2.1849 (1.7867)	mem 20675MB
[2025-04-03 00:31:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][136/311]	eta 0:02:50 lr 0.000182	time 0.8765 (0.9758)	loss 0.6648 (0.6813)	grad_norm 1.2721 (1.7773)	mem 20675MB
[2025-04-03 00:31:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][138/311]	eta 0:02:48 lr 0.000185	time 0.8770 (0.9744)	loss 0.7082 (0.6814)	grad_norm 3.2550 (1.7837)	mem 20675MB
[2025-04-03 00:31:22 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][140/311]	eta 0:02:46 lr 0.000188	time 0.8766 (0.9730)	loss 0.6621 (0.6812)	grad_norm 1.5846 (1.7859)	mem 20675MB
[2025-04-03 00:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][142/311]	eta 0:02:44 lr 0.000190	time 0.8765 (0.9717)	loss 0.6785 (0.6810)	grad_norm 1.8900 (1.7868)	mem 20675MB
[2025-04-03 00:31:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][144/311]	eta 0:02:42 lr 0.000193	time 0.8767 (0.9704)	loss 0.6377 (0.6806)	grad_norm 2.1756 (1.7880)	mem 20675MB
[2025-04-03 00:31:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][146/311]	eta 0:02:39 lr 0.000196	time 0.8774 (0.9692)	loss 0.6673 (0.6803)	grad_norm 2.0575 (1.7927)	mem 20675MB
[2025-04-03 00:31:29 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][148/311]	eta 0:02:37 lr 0.000198	time 0.8764 (0.9679)	loss 0.6591 (0.6804)	grad_norm 3.4265 (1.8261)	mem 20675MB
[2025-04-03 00:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][150/311]	eta 0:02:35 lr 0.000201	time 0.8756 (0.9667)	loss 0.6524 (0.6803)	grad_norm 3.1475 (1.8348)	mem 20675MB
[2025-04-03 00:31:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][152/311]	eta 0:02:33 lr 0.000204	time 0.8759 (0.9655)	loss 0.6484 (0.6801)	grad_norm 3.4829 (1.8549)	mem 20675MB
[2025-04-03 00:31:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][154/311]	eta 0:02:31 lr 0.000207	time 0.8761 (0.9644)	loss 0.6722 (0.6801)	grad_norm 0.9958 (1.8480)	mem 20675MB
[2025-04-03 00:31:36 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][156/311]	eta 0:02:29 lr 0.000209	time 0.8756 (0.9633)	loss 0.6670 (0.6800)	grad_norm 1.1220 (1.8484)	mem 20675MB
[2025-04-03 00:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][158/311]	eta 0:02:27 lr 0.000212	time 0.8788 (0.9622)	loss 0.7005 (0.6800)	grad_norm 3.9337 (1.8673)	mem 20675MB
[2025-04-03 00:31:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][160/311]	eta 0:02:25 lr 0.000215	time 0.8757 (0.9611)	loss 0.6798 (0.6798)	grad_norm 1.2128 (1.8648)	mem 20675MB
[2025-04-03 00:31:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][162/311]	eta 0:02:23 lr 0.000217	time 0.8761 (0.9601)	loss 0.6456 (0.6796)	grad_norm 1.5285 (1.8659)	mem 20675MB
[2025-04-03 00:31:43 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][164/311]	eta 0:02:20 lr 0.000220	time 0.8758 (0.9591)	loss 0.6637 (0.6795)	grad_norm 2.1782 (1.8639)	mem 20675MB
[2025-04-03 00:31:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][166/311]	eta 0:02:18 lr 0.000223	time 0.8759 (0.9581)	loss 0.6578 (0.6793)	grad_norm 1.4207 (1.8568)	mem 20675MB
[2025-04-03 00:31:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][168/311]	eta 0:02:16 lr 0.000225	time 0.8758 (0.9571)	loss 0.6862 (0.6793)	grad_norm 1.5827 (1.8523)	mem 20675MB
[2025-04-03 00:31:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][170/311]	eta 0:02:14 lr 0.000228	time 0.8757 (0.9562)	loss 0.6606 (0.6790)	grad_norm 1.6301 (1.8515)	mem 20675MB
[2025-04-03 00:31:50 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][172/311]	eta 0:02:12 lr 0.000231	time 0.8759 (0.9553)	loss 0.6663 (0.6790)	grad_norm 1.6349 (1.8627)	mem 20675MB
[2025-04-03 00:31:52 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][174/311]	eta 0:02:10 lr 0.000233	time 0.8759 (0.9544)	loss 0.6656 (0.6788)	grad_norm 2.7031 (1.8776)	mem 20675MB
[2025-04-03 00:31:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][176/311]	eta 0:02:08 lr 0.000236	time 0.8757 (0.9535)	loss 0.6390 (0.6783)	grad_norm 1.8767 (1.8853)	mem 20675MB
[2025-04-03 00:31:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][178/311]	eta 0:02:06 lr 0.000239	time 0.8759 (0.9526)	loss 0.7365 (0.6784)	grad_norm 5.6109 (1.9126)	mem 20675MB
[2025-04-03 00:31:57 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][180/311]	eta 0:02:04 lr 0.000241	time 0.8756 (0.9518)	loss 0.6434 (0.6782)	grad_norm 2.7267 (1.9264)	mem 20675MB
[2025-04-03 00:31:59 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][182/311]	eta 0:02:02 lr 0.000244	time 0.8755 (0.9510)	loss 0.6784 (0.6782)	grad_norm 1.6361 (1.9319)	mem 20675MB
[2025-04-03 00:32:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][184/311]	eta 0:02:00 lr 0.000247	time 0.8756 (0.9502)	loss 0.6691 (0.6780)	grad_norm 1.6237 (1.9388)	mem 20675MB
[2025-04-03 00:32:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][186/311]	eta 0:01:58 lr 0.000249	time 0.8761 (0.9494)	loss 0.6680 (0.6778)	grad_norm 1.6292 (1.9437)	mem 20675MB
[2025-04-03 00:32:04 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][188/311]	eta 0:01:56 lr 0.000252	time 0.8763 (0.9486)	loss 0.6648 (0.6777)	grad_norm 1.5489 (1.9470)	mem 20675MB
[2025-04-03 00:32:06 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][190/311]	eta 0:01:54 lr 0.000255	time 0.8755 (0.9479)	loss 0.6458 (0.6773)	grad_norm 2.0427 (1.9464)	mem 20675MB
[2025-04-03 00:32:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][192/311]	eta 0:01:52 lr 0.000257	time 0.8761 (0.9471)	loss 0.6670 (0.6772)	grad_norm 1.8991 (1.9445)	mem 20675MB
[2025-04-03 00:32:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][194/311]	eta 0:01:50 lr 0.000260	time 0.8755 (0.9464)	loss 0.6404 (0.6768)	grad_norm 3.7357 (1.9610)	mem 20675MB
[2025-04-03 00:32:11 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][196/311]	eta 0:01:48 lr 0.000263	time 0.8760 (0.9457)	loss 0.7126 (0.6770)	grad_norm 5.3322 (1.9870)	mem 20675MB
[2025-04-03 00:32:13 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][198/311]	eta 0:01:46 lr 0.000265	time 0.8755 (0.9450)	loss 0.6700 (0.6769)	grad_norm 2.3237 (1.9934)	mem 20675MB
[2025-04-03 00:32:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][200/311]	eta 0:01:44 lr 0.000268	time 0.8755 (0.9443)	loss 0.6369 (0.6765)	grad_norm 1.8301 (1.9890)	mem 20675MB
[2025-04-03 00:32:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][202/311]	eta 0:01:42 lr 0.000271	time 0.8758 (0.9437)	loss 0.6629 (0.6764)	grad_norm 2.0123 (1.9866)	mem 20675MB
[2025-04-03 00:32:18 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][204/311]	eta 0:01:40 lr 0.000274	time 0.8755 (0.9430)	loss 0.6465 (0.6761)	grad_norm 2.1869 (1.9923)	mem 20675MB
[2025-04-03 00:32:20 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][206/311]	eta 0:01:38 lr 0.000276	time 0.8758 (0.9424)	loss 0.6404 (0.6758)	grad_norm 2.2145 (2.0015)	mem 20675MB
[2025-04-03 00:32:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][208/311]	eta 0:01:36 lr 0.000279	time 0.8758 (0.9417)	loss 0.6841 (0.6758)	grad_norm 4.7125 (2.0121)	mem 20675MB
[2025-04-03 00:32:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][210/311]	eta 0:01:35 lr 0.000282	time 0.8755 (0.9411)	loss 0.6602 (0.6758)	grad_norm 1.2029 (2.0104)	mem 20675MB
[2025-04-03 00:32:25 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][212/311]	eta 0:01:33 lr 0.000284	time 0.8755 (0.9405)	loss 0.6387 (0.6754)	grad_norm 4.9524 (2.0281)	mem 20675MB
[2025-04-03 00:32:27 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][214/311]	eta 0:01:31 lr 0.000287	time 0.8754 (0.9399)	loss 0.6770 (0.6753)	grad_norm 3.7758 (2.0328)	mem 20675MB
[2025-04-03 00:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][216/311]	eta 0:01:29 lr 0.000290	time 0.8759 (0.9393)	loss 0.6828 (0.6752)	grad_norm 3.6450 (2.0460)	mem 20675MB
[2025-04-03 00:32:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][218/311]	eta 0:01:27 lr 0.000292	time 0.8761 (0.9388)	loss 0.6700 (0.6751)	grad_norm 1.3085 (2.0437)	mem 20675MB
[2025-04-03 00:32:32 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][220/311]	eta 0:01:25 lr 0.000295	time 0.8758 (0.9382)	loss 0.6743 (0.6752)	grad_norm 2.3828 (2.0572)	mem 20675MB
[2025-04-03 00:32:34 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][222/311]	eta 0:01:23 lr 0.000298	time 0.8757 (0.9376)	loss 0.6536 (0.6752)	grad_norm 3.7629 (2.0718)	mem 20675MB
[2025-04-03 00:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][224/311]	eta 0:01:21 lr 0.000300	time 0.8758 (0.9371)	loss 0.6611 (0.6750)	grad_norm 2.5393 (2.0705)	mem 20675MB
[2025-04-03 00:32:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][226/311]	eta 0:01:19 lr 0.000303	time 0.8756 (0.9366)	loss 0.6663 (0.6749)	grad_norm 2.9089 (2.0758)	mem 20675MB
[2025-04-03 00:32:39 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][228/311]	eta 0:01:17 lr 0.000306	time 0.8757 (0.9360)	loss 0.6489 (0.6749)	grad_norm 1.5476 (2.0779)	mem 20675MB
[2025-04-03 00:32:41 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][230/311]	eta 0:01:15 lr 0.000308	time 0.8757 (0.9355)	loss 0.6412 (0.6747)	grad_norm 3.0548 (2.0820)	mem 20675MB
[2025-04-03 00:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][232/311]	eta 0:01:13 lr 0.000311	time 0.8754 (0.9350)	loss 0.6594 (0.6747)	grad_norm 1.8714 (2.0913)	mem 20675MB
[2025-04-03 00:32:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][234/311]	eta 0:01:11 lr 0.000314	time 0.8755 (0.9345)	loss 0.6979 (0.6747)	grad_norm 3.3077 (2.1005)	mem 20675MB
[2025-04-03 00:32:46 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][236/311]	eta 0:01:10 lr 0.000316	time 0.8761 (0.9340)	loss 0.6459 (0.6746)	grad_norm 1.7585 (2.1107)	mem 20675MB
[2025-04-03 00:32:48 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][238/311]	eta 0:01:08 lr 0.000319	time 0.8756 (0.9335)	loss 0.6286 (0.6744)	grad_norm 2.5890 (2.1139)	mem 20675MB
[2025-04-03 00:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][240/311]	eta 0:01:06 lr 0.000322	time 0.8758 (0.9331)	loss 0.6778 (0.6743)	grad_norm 3.3050 (2.1200)	mem 20675MB
[2025-04-03 00:32:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][242/311]	eta 0:01:04 lr 0.000324	time 0.8754 (0.9326)	loss 0.6586 (0.6741)	grad_norm 1.2882 (2.1138)	mem 20675MB
[2025-04-03 00:32:53 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][244/311]	eta 0:01:02 lr 0.000327	time 0.8755 (0.9322)	loss 0.6456 (0.6738)	grad_norm 1.6380 (2.1111)	mem 20675MB
[2025-04-03 00:32:55 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][246/311]	eta 0:01:00 lr 0.000330	time 0.8756 (0.9317)	loss 0.6840 (0.6737)	grad_norm 3.3684 (2.1200)	mem 20675MB
[2025-04-03 00:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][248/311]	eta 0:00:58 lr 0.000332	time 0.8758 (0.9313)	loss 0.6918 (0.6738)	grad_norm 4.3731 (2.1357)	mem 20675MB
[2025-04-03 00:32:58 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][250/311]	eta 0:00:56 lr 0.000335	time 0.8757 (0.9308)	loss 0.6722 (0.6735)	grad_norm 2.5943 (2.1391)	mem 20675MB
[2025-04-03 00:33:00 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][252/311]	eta 0:00:54 lr 0.000338	time 0.8755 (0.9304)	loss 0.6628 (0.6735)	grad_norm 3.3460 (2.1466)	mem 20675MB
[2025-04-03 00:33:02 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][254/311]	eta 0:00:53 lr 0.000340	time 0.8758 (0.9300)	loss 0.6483 (0.6734)	grad_norm 1.4851 (2.1528)	mem 20675MB
[2025-04-03 00:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][256/311]	eta 0:00:51 lr 0.000343	time 0.8757 (0.9296)	loss 0.6253 (0.6731)	grad_norm 1.6768 (2.1530)	mem 20675MB
[2025-04-03 00:33:05 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][258/311]	eta 0:00:49 lr 0.000346	time 0.8756 (0.9291)	loss 0.6897 (0.6731)	grad_norm 4.5240 (2.1594)	mem 20675MB
[2025-04-03 00:33:07 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][260/311]	eta 0:00:47 lr 0.000349	time 0.8759 (0.9287)	loss 0.6247 (0.6727)	grad_norm 2.6315 (2.1627)	mem 20675MB
[2025-04-03 00:33:09 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][262/311]	eta 0:00:45 lr 0.000351	time 0.8756 (0.9283)	loss 0.6641 (0.6726)	grad_norm 3.8244 (2.1697)	mem 20675MB
[2025-04-03 00:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][264/311]	eta 0:00:43 lr 0.000354	time 0.8758 (0.9279)	loss 0.6882 (0.6726)	grad_norm 4.8843 (2.1810)	mem 20675MB
[2025-04-03 00:33:12 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][266/311]	eta 0:00:41 lr 0.000357	time 0.8757 (0.9276)	loss 0.6780 (0.6727)	grad_norm 4.1464 (2.1933)	mem 20675MB
[2025-04-03 00:33:14 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][268/311]	eta 0:00:39 lr 0.000359	time 0.8757 (0.9272)	loss 0.6440 (0.6728)	grad_norm 1.8304 (2.2028)	mem 20675MB
[2025-04-03 00:33:16 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][270/311]	eta 0:00:37 lr 0.000362	time 0.8757 (0.9268)	loss 0.6722 (0.6728)	grad_norm 1.8588 (2.1978)	mem 20675MB
[2025-04-03 00:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][272/311]	eta 0:00:36 lr 0.000365	time 0.8757 (0.9264)	loss 0.6853 (0.6729)	grad_norm 0.4348 (2.1930)	mem 20675MB
[2025-04-03 00:33:19 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][274/311]	eta 0:00:34 lr 0.000367	time 0.8755 (0.9261)	loss 0.6668 (0.6729)	grad_norm 1.5960 (2.1872)	mem 20675MB
[2025-04-03 00:33:21 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][276/311]	eta 0:00:32 lr 0.000370	time 0.8755 (0.9257)	loss 0.6750 (0.6730)	grad_norm 0.8296 (2.1831)	mem 20675MB
[2025-04-03 00:33:23 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][278/311]	eta 0:00:30 lr 0.000373	time 0.8755 (0.9254)	loss 0.6699 (0.6729)	grad_norm 2.0578 (2.1790)	mem 20675MB
[2025-04-03 00:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][280/311]	eta 0:00:28 lr 0.000375	time 0.8754 (0.9250)	loss 0.6878 (0.6730)	grad_norm 1.4606 (2.1733)	mem 20675MB
[2025-04-03 00:33:26 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][282/311]	eta 0:00:26 lr 0.000378	time 0.8755 (0.9247)	loss 0.6754 (0.6730)	grad_norm 2.5112 (2.1722)	mem 20675MB
[2025-04-03 00:33:28 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][284/311]	eta 0:00:24 lr 0.000381	time 0.8756 (0.9243)	loss 0.6657 (0.6729)	grad_norm 0.8368 (2.1694)	mem 20675MB
[2025-04-03 00:33:30 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][286/311]	eta 0:00:23 lr 0.000383	time 0.8758 (0.9240)	loss 0.6875 (0.6729)	grad_norm 3.0563 (2.1734)	mem 20675MB
[2025-04-03 00:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][288/311]	eta 0:00:21 lr 0.000386	time 0.8755 (0.9237)	loss 0.6430 (0.6728)	grad_norm 2.4059 (2.1758)	mem 20675MB
[2025-04-03 00:33:33 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][290/311]	eta 0:00:19 lr 0.000389	time 0.8757 (0.9233)	loss 0.6377 (0.6726)	grad_norm 1.6770 (2.1764)	mem 20675MB
[2025-04-03 00:33:35 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][292/311]	eta 0:00:17 lr 0.000391	time 0.8755 (0.9230)	loss 0.6343 (0.6724)	grad_norm 1.5382 (2.1718)	mem 20675MB
[2025-04-03 00:33:37 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][294/311]	eta 0:00:15 lr 0.000394	time 0.8757 (0.9227)	loss 0.6598 (0.6724)	grad_norm 1.6177 (2.1709)	mem 20675MB
[2025-04-03 00:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][296/311]	eta 0:00:13 lr 0.000397	time 0.8752 (0.9224)	loss 0.6280 (0.6721)	grad_norm 1.8663 (2.1683)	mem 20675MB
[2025-04-03 00:33:40 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][298/311]	eta 0:00:11 lr 0.000399	time 0.8754 (0.9221)	loss 0.6920 (0.6721)	grad_norm 2.1609 (2.1667)	mem 20675MB
[2025-04-03 00:33:42 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][300/311]	eta 0:00:10 lr 0.000402	time 0.8753 (0.9218)	loss 0.6658 (0.6720)	grad_norm 2.1958 (2.1695)	mem 20675MB
[2025-04-03 00:33:44 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][302/311]	eta 0:00:08 lr 0.000405	time 0.8755 (0.9215)	loss 0.6490 (0.6719)	grad_norm 2.2932 (2.1679)	mem 20675MB
[2025-04-03 00:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][304/311]	eta 0:00:06 lr 0.000407	time 0.8753 (0.9212)	loss 0.6475 (0.6719)	grad_norm 2.6476 (2.1739)	mem 20675MB
[2025-04-03 00:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][306/311]	eta 0:00:04 lr 0.000410	time 0.8755 (0.9209)	loss 0.6636 (0.6718)	grad_norm 1.4933 (2.1774)	mem 20675MB
[2025-04-03 00:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][308/311]	eta 0:00:02 lr 0.000413	time 0.8753 (0.9206)	loss 0.6555 (0.6717)	grad_norm 4.1606 (2.1895)	mem 20675MB
[2025-04-03 00:33:51 simmim_finetune] (main_finetune.py 252): INFO Train: [0/30][310/311]	eta 0:00:00 lr 0.000415	time 0.8756 (0.9203)	loss 0.6835 (0.6718)	grad_norm 2.1901 (2.1999)	mem 20675MB
[2025-04-03 00:33:51 simmim_finetune] (main_finetune.py 260): INFO EPOCH 0 training takes 0:04:46
[2025-04-03 00:33:51 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt0.pth saving......
[2025-04-03 00:33:54 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt0.pth saved !!!
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.454 (1.454)	Loss 0.6656 (0.6656)	Acc@1 60.938 (60.938)	Mem 20675MB
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 63.380
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 63.4%
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 63.38%
[2025-04-03 00:33:56 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.7024836899997342e-06, 1.7024836899997342e-06, 2.529317804880455e-06, 2.529317804880455e-06, 3.801370289312334e-06, 3.801370289312334e-06, 5.758374111515224e-06, 5.758374111515224e-06, 8.76914922259659e-06, 8.76914922259659e-06, 1.3401110931952544e-05, 1.3401110931952544e-05, 2.0527205869423236e-05, 2.0527205869423236e-05, 3.1490428850147375e-05, 3.1490428850147375e-05, 4.835692574356913e-05, 4.835692574356913e-05, 7.430538250267954e-05, 7.430538250267954e-05, 0.00011422608520900322, 0.00011422608520900322, 0.00017564255091103966, 0.00017564255091103966, 0.0002701294212218649, 0.0002701294212218649, 0.000415493837084673, 0.000415493837084673]
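The 28 values above pair up into 14 distinct rates, consistent with layer-wise learning-rate decay (the config's `TRAIN.LAYER_DECAY: 0.65`) over 14 depth groups for a 12-block ViT; each rate appears twice, presumably because decay and no-decay parameters at the same depth form separate optimizer groups. A minimal sketch of how such scales could be generated — the group count and its breakdown (patch embedding + 12 blocks + head) are assumptions, and the logged values additionally include the epoch-0 warmup interpolation, so they are not exactly `BASE_LR * scale`:

```python
# Sketch of layer-wise LR decay for a 12-block ViT (assumed grouping:
# patch embedding + 12 transformer blocks + classification head = 14 groups).
LAYER_DECAY = 0.65   # TRAIN.LAYER_DECAY from the config above
NUM_GROUPS = 14      # assumed depth-group count

# Earlier groups are scaled down geometrically; the head keeps the full rate.
scales = [LAYER_DECAY ** (NUM_GROUPS - 1 - i) for i in range(NUM_GROUPS)]

BASE_LR = 0.00125    # TRAIN.BASE_LR from the config above
group_lrs = [BASE_LR * s for s in scales]
# scales[0]  == 0.65**13  (deepest decay, patch embedding)
# scales[-1] == 1.0       (head trains at the full base LR)
```

The ratio between successive logged rates approaches 1/0.65 ≈ 1.538 for the later groups, matching this geometric scheme; the early groups deviate slightly, which fits a warmup schedule that interpolates each group from a shared warmup rate rather than scaling it purely multiplicatively.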
[2025-04-03 00:33:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][0/311]	eta 0:10:48 lr 0.000417	time 2.0859 (2.0859)	loss 0.7014 (0.7014)	grad_norm 2.8646 (2.8646)	mem 20675MB
[2025-04-03 00:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][2/311]	eta 0:06:35 lr 0.000420	time 0.8759 (1.2801)	loss 0.6861 (0.6852)	grad_norm 5.8534 (4.4407)	mem 20675MB
[2025-04-03 00:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][4/311]	eta 0:05:43 lr 0.000422	time 0.8761 (1.1188)	loss 0.6790 (0.6758)	grad_norm 3.8511 (3.7123)	mem 20675MB
[2025-04-03 00:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][6/311]	eta 0:05:20 lr 0.000425	time 0.8786 (1.0501)	loss 0.6585 (0.6727)	grad_norm 1.3631 (2.9724)	mem 20675MB
[2025-04-03 00:34:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][8/311]	eta 0:05:06 lr 0.000428	time 0.8760 (1.0116)	loss 0.6457 (0.6676)	grad_norm 1.3695 (2.6446)	mem 20675MB
[2025-04-03 00:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][10/311]	eta 0:04:57 lr 0.000430	time 0.8761 (0.9871)	loss 0.6443 (0.6663)	grad_norm 2.2969 (2.6060)	mem 20675MB
[2025-04-03 00:34:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][12/311]	eta 0:04:50 lr 0.000433	time 0.8762 (0.9701)	loss 0.6300 (0.6632)	grad_norm 2.8245 (2.5593)	mem 20675MB
[2025-04-03 00:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][14/311]	eta 0:04:44 lr 0.000436	time 0.8759 (0.9577)	loss 0.6692 (0.6613)	grad_norm 1.6863 (2.5062)	mem 20675MB
[2025-04-03 00:34:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][16/311]	eta 0:04:39 lr 0.000438	time 0.8761 (0.9482)	loss 0.6683 (0.6566)	grad_norm 2.7674 (2.5167)	mem 20675MB
[2025-04-03 00:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][18/311]	eta 0:04:35 lr 0.000441	time 0.8765 (0.9407)	loss 0.6622 (0.6549)	grad_norm 4.2110 (2.6143)	mem 20675MB
[2025-04-03 00:34:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][20/311]	eta 0:04:31 lr 0.000444	time 0.8759 (0.9347)	loss 0.6465 (0.6538)	grad_norm 2.7249 (2.6528)	mem 20675MB
[2025-04-03 00:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][22/311]	eta 0:04:28 lr 0.000446	time 0.8761 (0.9297)	loss 0.6952 (0.6557)	grad_norm 3.3553 (2.6769)	mem 20675MB
[2025-04-03 00:34:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][24/311]	eta 0:04:25 lr 0.000449	time 0.8758 (0.9254)	loss 0.6284 (0.6552)	grad_norm 1.9429 (2.6243)	mem 20675MB
[2025-04-03 00:34:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][26/311]	eta 0:04:22 lr 0.000452	time 0.8761 (0.9218)	loss 0.6701 (0.6562)	grad_norm 1.6043 (2.5613)	mem 20675MB
[2025-04-03 00:34:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][28/311]	eta 0:04:19 lr 0.000454	time 0.8760 (0.9187)	loss 0.6638 (0.6566)	grad_norm 1.6545 (2.5107)	mem 20675MB
[2025-04-03 00:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][30/311]	eta 0:04:17 lr 0.000457	time 0.8762 (0.9160)	loss 0.7049 (0.6582)	grad_norm 2.6825 (2.5441)	mem 20675MB
[2025-04-03 00:34:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][32/311]	eta 0:04:14 lr 0.000460	time 0.8759 (0.9136)	loss 0.6690 (0.6575)	grad_norm 1.1153 (2.4701)	mem 20675MB
[2025-04-03 00:34:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][34/311]	eta 0:04:12 lr 0.000462	time 0.8760 (0.9115)	loss 0.6991 (0.6578)	grad_norm 4.5327 (2.5360)	mem 20675MB
[2025-04-03 00:34:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][36/311]	eta 0:04:10 lr 0.000465	time 0.8762 (0.9097)	loss 0.6589 (0.6579)	grad_norm 2.8304 (2.5313)	mem 20675MB
[2025-04-03 00:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][38/311]	eta 0:04:07 lr 0.000468	time 0.8758 (0.9080)	loss 0.6320 (0.6564)	grad_norm 2.8734 (2.5236)	mem 20675MB
[2025-04-03 00:34:33 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][40/311]	eta 0:04:05 lr 0.000470	time 0.8762 (0.9065)	loss 0.6451 (0.6561)	grad_norm 2.2162 (2.5172)	mem 20675MB
[2025-04-03 00:34:35 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][42/311]	eta 0:04:03 lr 0.000473	time 0.8759 (0.9051)	loss 0.6491 (0.6552)	grad_norm 1.6435 (2.5076)	mem 20675MB
[2025-04-03 00:34:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][44/311]	eta 0:04:01 lr 0.000476	time 0.8762 (0.9038)	loss 0.6240 (0.6541)	grad_norm 2.1028 (2.5120)	mem 20675MB
[2025-04-03 00:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][46/311]	eta 0:03:59 lr 0.000478	time 0.8760 (0.9027)	loss 0.6831 (0.6546)	grad_norm 3.0410 (2.5073)	mem 20675MB
[2025-04-03 00:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][48/311]	eta 0:03:57 lr 0.000481	time 0.8762 (0.9016)	loss 0.6908 (0.6556)	grad_norm 1.7562 (2.4826)	mem 20675MB
[2025-04-03 00:34:42 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][50/311]	eta 0:03:55 lr 0.000484	time 0.8763 (0.9007)	loss 0.6875 (0.6560)	grad_norm 1.1554 (2.4605)	mem 20675MB
[2025-04-03 00:34:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][52/311]	eta 0:03:53 lr 0.000486	time 0.8762 (0.8998)	loss 0.6610 (0.6562)	grad_norm 1.5297 (2.4430)	mem 20675MB
[2025-04-03 00:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][54/311]	eta 0:03:51 lr 0.000489	time 0.8760 (0.8989)	loss 0.6787 (0.6567)	grad_norm 2.8322 (2.4237)	mem 20675MB
[2025-04-03 00:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][56/311]	eta 0:03:49 lr 0.000492	time 0.8764 (0.8982)	loss 0.6192 (0.6560)	grad_norm 1.6386 (2.3977)	mem 20675MB
[2025-04-03 00:34:49 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][58/311]	eta 0:03:47 lr 0.000495	time 0.8759 (0.8974)	loss 0.6437 (0.6559)	grad_norm 1.7000 (2.3987)	mem 20675MB
[2025-04-03 00:34:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][60/311]	eta 0:03:45 lr 0.000497	time 0.8762 (0.8968)	loss 0.6378 (0.6543)	grad_norm 3.7042 (2.4142)	mem 20675MB
[2025-04-03 00:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][62/311]	eta 0:03:43 lr 0.000500	time 0.8759 (0.8961)	loss 0.6602 (0.6548)	grad_norm 3.9738 (2.4499)	mem 20675MB
[2025-04-03 00:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][64/311]	eta 0:03:41 lr 0.000503	time 0.8760 (0.8955)	loss 0.6106 (0.6542)	grad_norm 2.8977 (2.4806)	mem 20675MB
[2025-04-03 00:34:56 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][66/311]	eta 0:03:39 lr 0.000505	time 0.8764 (0.8950)	loss 0.6581 (0.6543)	grad_norm 1.5371 (2.4921)	mem 20675MB
[2025-04-03 00:34:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][68/311]	eta 0:03:37 lr 0.000508	time 0.8761 (0.8945)	loss 0.6299 (0.6538)	grad_norm 2.2890 (2.4851)	mem 20675MB
[2025-04-03 00:34:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][70/311]	eta 0:03:35 lr 0.000511	time 0.8763 (0.8940)	loss 0.6313 (0.6531)	grad_norm 1.7470 (2.4983)	mem 20675MB
[2025-04-03 00:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][72/311]	eta 0:03:33 lr 0.000513	time 0.8763 (0.8935)	loss 0.6549 (0.6527)	grad_norm 4.9413 (2.5611)	mem 20675MB
[2025-04-03 00:35:03 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][74/311]	eta 0:03:31 lr 0.000516	time 0.8758 (0.8931)	loss 0.7125 (0.6531)	grad_norm 2.7685 (2.5788)	mem 20675MB
[2025-04-03 00:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][76/311]	eta 0:03:29 lr 0.000519	time 0.8761 (0.8927)	loss 0.6457 (0.6534)	grad_norm 2.4245 (2.5897)	mem 20675MB
[2025-04-03 00:35:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][78/311]	eta 0:03:27 lr 0.000521	time 0.8758 (0.8922)	loss 0.6504 (0.6535)	grad_norm 6.1500 (2.6287)	mem 20675MB
[2025-04-03 00:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][80/311]	eta 0:03:26 lr 0.000524	time 0.8762 (0.8919)	loss 0.6688 (0.6540)	grad_norm 2.1222 (2.6090)	mem 20675MB
[2025-04-03 00:35:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][82/311]	eta 0:03:24 lr 0.000527	time 0.8759 (0.8915)	loss 0.6465 (0.6541)	grad_norm 1.3210 (2.5826)	mem 20675MB
[2025-04-03 00:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][84/311]	eta 0:03:22 lr 0.000529	time 0.8761 (0.8912)	loss 0.7118 (0.6546)	grad_norm 4.9256 (2.5903)	mem 20675MB
[2025-04-03 00:35:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][86/311]	eta 0:03:20 lr 0.000532	time 0.8762 (0.8908)	loss 0.6580 (0.6551)	grad_norm 0.9189 (2.5818)	mem 20675MB
[2025-04-03 00:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][88/311]	eta 0:03:18 lr 0.000535	time 0.8759 (0.8905)	loss 0.6566 (0.6556)	grad_norm 3.0190 (2.6049)	mem 20675MB
[2025-04-03 00:35:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][90/311]	eta 0:03:16 lr 0.000537	time 0.8760 (0.8902)	loss 0.7055 (0.6566)	grad_norm 8.3963 (2.7050)	mem 20675MB
[2025-04-03 00:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][92/311]	eta 0:03:14 lr 0.000540	time 0.8761 (0.8899)	loss 0.6503 (0.6564)	grad_norm 1.1695 (2.6739)	mem 20675MB
[2025-04-03 00:35:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][94/311]	eta 0:03:13 lr 0.000543	time 0.8762 (0.8896)	loss 0.6928 (0.6566)	grad_norm 2.9698 (2.6736)	mem 20675MB
[2025-04-03 00:35:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][96/311]	eta 0:03:11 lr 0.000545	time 0.8759 (0.8894)	loss 0.6422 (0.6566)	grad_norm 1.9096 (2.6771)	mem 20675MB
[2025-04-03 00:35:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][98/311]	eta 0:03:09 lr 0.000548	time 0.8761 (0.8891)	loss 0.6708 (0.6563)	grad_norm 1.6640 (2.6574)	mem 20675MB
[2025-04-03 00:35:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][100/311]	eta 0:03:07 lr 0.000551	time 0.8758 (0.8889)	loss 0.6574 (0.6564)	grad_norm 1.2671 (2.6349)	mem 20675MB
[2025-04-03 00:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][102/311]	eta 0:03:05 lr 0.000553	time 0.8758 (0.8886)	loss 0.6487 (0.6565)	grad_norm 2.4631 (2.6496)	mem 20675MB
[2025-04-03 00:35:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][104/311]	eta 0:03:03 lr 0.000556	time 0.8760 (0.8884)	loss 0.6308 (0.6567)	grad_norm 2.7976 (2.6465)	mem 20675MB
[2025-04-03 00:35:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][106/311]	eta 0:03:02 lr 0.000559	time 0.8764 (0.8882)	loss 0.6701 (0.6568)	grad_norm 1.0214 (2.6294)	mem 20675MB
[2025-04-03 00:35:33 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][108/311]	eta 0:03:00 lr 0.000561	time 0.8761 (0.8880)	loss 0.6489 (0.6566)	grad_norm 2.8293 (2.6335)	mem 20675MB
[2025-04-03 00:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][110/311]	eta 0:02:58 lr 0.000564	time 0.8762 (0.8878)	loss 0.6771 (0.6569)	grad_norm 2.5086 (2.6211)	mem 20675MB
[2025-04-03 00:35:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][112/311]	eta 0:02:56 lr 0.000567	time 0.8762 (0.8876)	loss 0.7237 (0.6571)	grad_norm 4.3960 (2.6465)	mem 20675MB
[2025-04-03 00:35:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][114/311]	eta 0:02:54 lr 0.000570	time 0.8758 (0.8874)	loss 0.6534 (0.6575)	grad_norm 1.3203 (2.6599)	mem 20675MB
[2025-04-03 00:35:40 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][116/311]	eta 0:02:53 lr 0.000572	time 0.8761 (0.8872)	loss 0.7250 (0.6584)	grad_norm 4.9564 (2.6753)	mem 20675MB
[2025-04-03 00:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][118/311]	eta 0:02:51 lr 0.000575	time 0.8760 (0.8871)	loss 0.6475 (0.6585)	grad_norm 1.3288 (2.6644)	mem 20675MB
[2025-04-03 00:35:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][120/311]	eta 0:02:49 lr 0.000578	time 0.8761 (0.8869)	loss 0.6761 (0.6588)	grad_norm 2.0900 (2.6443)	mem 20675MB
[2025-04-03 00:35:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][122/311]	eta 0:02:47 lr 0.000580	time 0.8758 (0.8867)	loss 0.6821 (0.6592)	grad_norm 2.2972 (2.6239)	mem 20675MB
[2025-04-03 00:35:47 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][124/311]	eta 0:02:45 lr 0.000583	time 0.8759 (0.8866)	loss 0.6652 (0.6594)	grad_norm 2.0503 (2.6070)	mem 20675MB
[2025-04-03 00:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][126/311]	eta 0:02:43 lr 0.000586	time 0.8759 (0.8864)	loss 0.6702 (0.6596)	grad_norm 1.5842 (2.5841)	mem 20675MB
[2025-04-03 00:35:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][128/311]	eta 0:02:42 lr 0.000588	time 0.8758 (0.8863)	loss 0.6648 (0.6597)	grad_norm 1.2131 (2.5603)	mem 20675MB
[2025-04-03 00:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][130/311]	eta 0:02:40 lr 0.000591	time 0.8760 (0.8861)	loss 0.6540 (0.6599)	grad_norm 2.7701 (2.5544)	mem 20675MB
[2025-04-03 00:35:54 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][132/311]	eta 0:02:38 lr 0.000594	time 0.8759 (0.8860)	loss 0.6439 (0.6594)	grad_norm 1.4887 (2.5400)	mem 20675MB
[2025-04-03 00:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][134/311]	eta 0:02:36 lr 0.000596	time 0.8762 (0.8858)	loss 0.5938 (0.6591)	grad_norm 1.9170 (2.5354)	mem 20675MB
[2025-04-03 00:35:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][136/311]	eta 0:02:35 lr 0.000599	time 0.8758 (0.8857)	loss 0.6547 (0.6592)	grad_norm 2.3579 (2.5459)	mem 20675MB
[2025-04-03 00:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][138/311]	eta 0:02:33 lr 0.000602	time 0.8761 (0.8856)	loss 0.6869 (0.6593)	grad_norm 6.7470 (2.5724)	mem 20675MB
[2025-04-03 00:36:01 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][140/311]	eta 0:02:31 lr 0.000604	time 0.8765 (0.8855)	loss 0.6441 (0.6591)	grad_norm 2.8519 (2.5732)	mem 20675MB
[2025-04-03 00:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][142/311]	eta 0:02:29 lr 0.000607	time 0.8763 (0.8854)	loss 0.7117 (0.6595)	grad_norm 5.2785 (2.6047)	mem 20675MB
[2025-04-03 00:36:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][144/311]	eta 0:02:27 lr 0.000610	time 0.8762 (0.8852)	loss 0.6633 (0.6594)	grad_norm 2.1410 (2.5973)	mem 20675MB
[2025-04-03 00:36:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][146/311]	eta 0:02:26 lr 0.000612	time 0.8762 (0.8851)	loss 0.6318 (0.6589)	grad_norm 4.0976 (2.6078)	mem 20675MB
[2025-04-03 00:36:08 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][148/311]	eta 0:02:24 lr 0.000615	time 0.8766 (0.8850)	loss 0.6514 (0.6588)	grad_norm 3.3431 (2.6143)	mem 20675MB
[2025-04-03 00:36:10 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][150/311]	eta 0:02:22 lr 0.000618	time 0.8761 (0.8849)	loss 0.6365 (0.6587)	grad_norm 3.5348 (2.6448)	mem 20675MB
[2025-04-03 00:36:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][152/311]	eta 0:02:20 lr 0.000620	time 0.8762 (0.8848)	loss 0.6746 (0.6586)	grad_norm 2.5982 (2.6422)	mem 20675MB
[2025-04-03 00:36:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][154/311]	eta 0:02:18 lr 0.000623	time 0.8761 (0.8847)	loss 0.6484 (0.6584)	grad_norm 2.6971 (2.6518)	mem 20675MB
[2025-04-03 00:36:15 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][156/311]	eta 0:02:17 lr 0.000626	time 0.8760 (0.8846)	loss 0.6431 (0.6582)	grad_norm 1.5789 (2.6378)	mem 20675MB
[2025-04-03 00:36:17 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][158/311]	eta 0:02:15 lr 0.000628	time 0.8759 (0.8845)	loss 0.6688 (0.6583)	grad_norm 3.1295 (2.6759)	mem 20675MB
[2025-04-03 00:36:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][160/311]	eta 0:02:13 lr 0.000631	time 0.8761 (0.8844)	loss 0.6393 (0.6584)	grad_norm 3.2670 (2.6815)	mem 20675MB
[2025-04-03 00:36:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][162/311]	eta 0:02:11 lr 0.000634	time 0.8759 (0.8843)	loss 0.6747 (0.6588)	grad_norm 3.6802 (2.7043)	mem 20675MB
[2025-04-03 00:36:22 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][164/311]	eta 0:02:09 lr 0.000637	time 0.8761 (0.8842)	loss 0.6410 (0.6585)	grad_norm 4.0795 (2.7063)	mem 20675MB
[2025-04-03 00:36:24 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][166/311]	eta 0:02:08 lr 0.000639	time 0.8765 (0.8841)	loss 0.6446 (0.6586)	grad_norm 4.6305 (2.7225)	mem 20675MB
[2025-04-03 00:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][168/311]	eta 0:02:06 lr 0.000642	time 0.8758 (0.8841)	loss 0.6300 (0.6583)	grad_norm 1.4411 (2.7098)	mem 20675MB
[2025-04-03 00:36:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][170/311]	eta 0:02:04 lr 0.000645	time 0.8764 (0.8840)	loss 0.6684 (0.6583)	grad_norm 2.4597 (2.7038)	mem 20675MB
[2025-04-03 00:36:29 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][172/311]	eta 0:02:02 lr 0.000647	time 0.8757 (0.8839)	loss 0.6360 (0.6582)	grad_norm 1.6148 (2.6987)	mem 20675MB
[2025-04-03 00:36:31 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][174/311]	eta 0:02:01 lr 0.000650	time 0.8761 (0.8838)	loss 0.6467 (0.6582)	grad_norm 1.6914 (2.6961)	mem 20675MB
[2025-04-03 00:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][176/311]	eta 0:01:59 lr 0.000653	time 0.8759 (0.8837)	loss 0.6939 (0.6584)	grad_norm 5.6291 (2.7051)	mem 20675MB
[2025-04-03 00:36:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][178/311]	eta 0:01:57 lr 0.000655	time 0.8762 (0.8837)	loss 0.6663 (0.6584)	grad_norm 1.4960 (2.7073)	mem 20675MB
[2025-04-03 00:36:36 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][180/311]	eta 0:01:55 lr 0.000658	time 0.8760 (0.8836)	loss 0.6836 (0.6585)	grad_norm 3.2233 (2.7041)	mem 20675MB
[2025-04-03 00:36:38 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][182/311]	eta 0:01:53 lr 0.000661	time 0.8759 (0.8835)	loss 0.6338 (0.6582)	grad_norm 1.4519 (2.6913)	mem 20675MB
[2025-04-03 00:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][184/311]	eta 0:01:52 lr 0.000663	time 0.8762 (0.8834)	loss 0.6173 (0.6577)	grad_norm 2.6001 (2.6893)	mem 20675MB
[2025-04-03 00:36:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][186/311]	eta 0:01:50 lr 0.000666	time 0.8762 (0.8834)	loss 0.6567 (0.6577)	grad_norm 3.4031 (2.6912)	mem 20675MB
[2025-04-03 00:36:43 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][188/311]	eta 0:01:48 lr 0.000669	time 0.8765 (0.8833)	loss 0.6120 (0.6573)	grad_norm 6.1065 (2.7126)	mem 20675MB
[2025-04-03 00:36:45 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][190/311]	eta 0:01:46 lr 0.000671	time 0.8758 (0.8832)	loss 0.6949 (0.6575)	grad_norm 5.3108 (2.7553)	mem 20675MB
[2025-04-03 00:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][192/311]	eta 0:01:45 lr 0.000674	time 0.8758 (0.8832)	loss 0.6739 (0.6573)	grad_norm 2.2603 (2.7584)	mem 20675MB
[2025-04-03 00:36:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][194/311]	eta 0:01:43 lr 0.000677	time 0.8758 (0.8831)	loss 0.6595 (0.6574)	grad_norm 1.7782 (2.7519)	mem 20675MB
[2025-04-03 00:36:50 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][196/311]	eta 0:01:41 lr 0.000679	time 0.8761 (0.8830)	loss 0.6686 (0.6576)	grad_norm 1.3222 (2.7389)	mem 20675MB
[2025-04-03 00:36:52 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][198/311]	eta 0:01:39 lr 0.000682	time 0.8770 (0.8830)	loss 0.6502 (0.6577)	grad_norm 2.8327 (2.7438)	mem 20675MB
[2025-04-03 00:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][200/311]	eta 0:01:38 lr 0.000685	time 0.8761 (0.8829)	loss 0.6909 (0.6579)	grad_norm 1.5806 (2.7399)	mem 20675MB
[2025-04-03 00:36:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][202/311]	eta 0:01:36 lr 0.000687	time 0.8763 (0.8829)	loss 0.6759 (0.6580)	grad_norm 1.8925 (2.7289)	mem 20675MB
[2025-04-03 00:36:57 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][204/311]	eta 0:01:34 lr 0.000690	time 0.8759 (0.8828)	loss 0.6716 (0.6581)	grad_norm 1.4519 (2.7138)	mem 20675MB
[2025-04-03 00:36:59 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][206/311]	eta 0:01:32 lr 0.000693	time 0.8758 (0.8827)	loss 0.6616 (0.6582)	grad_norm 1.1987 (2.7029)	mem 20675MB
[2025-04-03 00:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][208/311]	eta 0:01:30 lr 0.000695	time 0.8760 (0.8827)	loss 0.6457 (0.6580)	grad_norm 1.9788 (2.6962)	mem 20675MB
[2025-04-03 00:37:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][210/311]	eta 0:01:29 lr 0.000698	time 0.8764 (0.8826)	loss 0.6259 (0.6577)	grad_norm 1.9320 (2.6886)	mem 20675MB
[2025-04-03 00:37:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][212/311]	eta 0:01:27 lr 0.000701	time 0.8762 (0.8826)	loss 0.6250 (0.6576)	grad_norm 3.2662 (2.6868)	mem 20675MB
[2025-04-03 00:37:06 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][214/311]	eta 0:01:25 lr 0.000703	time 0.8761 (0.8825)	loss 0.6252 (0.6573)	grad_norm 3.0766 (2.6870)	mem 20675MB
[2025-04-03 00:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][216/311]	eta 0:01:23 lr 0.000706	time 0.8761 (0.8825)	loss 0.6165 (0.6573)	grad_norm 2.4155 (2.6823)	mem 20675MB
[2025-04-03 00:37:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][218/311]	eta 0:01:22 lr 0.000709	time 0.8758 (0.8824)	loss 0.6572 (0.6573)	grad_norm 4.2828 (2.6866)	mem 20675MB
[2025-04-03 00:37:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][220/311]	eta 0:01:20 lr 0.000712	time 0.8762 (0.8824)	loss 0.6310 (0.6572)	grad_norm 1.3628 (2.6746)	mem 20675MB
[2025-04-03 00:37:13 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][222/311]	eta 0:01:18 lr 0.000714	time 0.8763 (0.8823)	loss 0.6345 (0.6573)	grad_norm 1.4951 (2.6727)	mem 20675MB
[2025-04-03 00:37:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][224/311]	eta 0:01:16 lr 0.000717	time 0.8760 (0.8823)	loss 0.6481 (0.6571)	grad_norm 2.8380 (2.6699)	mem 20675MB
[2025-04-03 00:37:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][226/311]	eta 0:01:14 lr 0.000720	time 0.8762 (0.8822)	loss 0.6330 (0.6568)	grad_norm 1.8896 (2.6647)	mem 20675MB
[2025-04-03 00:37:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][228/311]	eta 0:01:13 lr 0.000722	time 0.8761 (0.8822)	loss 0.6494 (0.6568)	grad_norm 2.4230 (2.6599)	mem 20675MB
[2025-04-03 00:37:20 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][230/311]	eta 0:01:11 lr 0.000725	time 0.8765 (0.8821)	loss 0.6407 (0.6566)	grad_norm 2.7563 (2.6579)	mem 20675MB
[2025-04-03 00:37:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][232/311]	eta 0:01:09 lr 0.000728	time 0.8762 (0.8821)	loss 0.6476 (0.6566)	grad_norm 3.4272 (2.6605)	mem 20675MB
[2025-04-03 00:37:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][234/311]	eta 0:01:07 lr 0.000730	time 0.8760 (0.8821)	loss 0.5930 (0.6561)	grad_norm 3.4039 (2.6602)	mem 20675MB
[2025-04-03 00:37:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][236/311]	eta 0:01:06 lr 0.000733	time 0.8761 (0.8820)	loss 0.7163 (0.6564)	grad_norm 6.7789 (2.6846)	mem 20675MB
[2025-04-03 00:37:27 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][238/311]	eta 0:01:04 lr 0.000736	time 0.8758 (0.8820)	loss 0.6031 (0.6562)	grad_norm 2.7473 (2.6797)	mem 20675MB
[2025-04-03 00:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][240/311]	eta 0:01:02 lr 0.000738	time 0.8761 (0.8819)	loss 0.6547 (0.6563)	grad_norm 1.1150 (2.6704)	mem 20675MB
[2025-04-03 00:37:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][242/311]	eta 0:01:00 lr 0.000741	time 0.8760 (0.8819)	loss 0.6707 (0.6564)	grad_norm 3.1355 (2.6720)	mem 20675MB
[2025-04-03 00:37:32 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][244/311]	eta 0:00:59 lr 0.000744	time 0.8763 (0.8818)	loss 0.6904 (0.6565)	grad_norm 5.0426 (2.6777)	mem 20675MB
[2025-04-03 00:37:34 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][246/311]	eta 0:00:57 lr 0.000746	time 0.8759 (0.8818)	loss 0.6279 (0.6562)	grad_norm 4.1738 (2.6808)	mem 20675MB
[2025-04-03 00:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][248/311]	eta 0:00:55 lr 0.000749	time 0.8757 (0.8818)	loss 0.6572 (0.6563)	grad_norm 1.4777 (2.6716)	mem 20675MB
[2025-04-03 00:37:37 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][250/311]	eta 0:00:53 lr 0.000752	time 0.8765 (0.8817)	loss 0.6515 (0.6563)	grad_norm 1.3472 (2.6621)	mem 20675MB
[2025-04-03 00:37:39 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][252/311]	eta 0:00:52 lr 0.000754	time 0.8757 (0.8817)	loss 0.6573 (0.6562)	grad_norm 1.5558 (2.6535)	mem 20675MB
[2025-04-03 00:37:41 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][254/311]	eta 0:00:50 lr 0.000757	time 0.8761 (0.8817)	loss 0.5836 (0.6556)	grad_norm 2.6306 (2.6493)	mem 20675MB
[2025-04-03 00:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][256/311]	eta 0:00:48 lr 0.000760	time 0.8761 (0.8816)	loss 0.6668 (0.6558)	grad_norm 2.3514 (2.6518)	mem 20675MB
[2025-04-03 00:37:44 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][258/311]	eta 0:00:46 lr 0.000762	time 0.8761 (0.8816)	loss 0.6522 (0.6557)	grad_norm 1.9820 (2.6546)	mem 20675MB
[2025-04-03 00:37:46 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][260/311]	eta 0:00:44 lr 0.000765	time 0.8758 (0.8816)	loss 0.6362 (0.6555)	grad_norm 2.6892 (2.6551)	mem 20675MB
[2025-04-03 00:37:48 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][262/311]	eta 0:00:43 lr 0.000768	time 0.8761 (0.8815)	loss 0.6844 (0.6555)	grad_norm 2.7989 (2.6602)	mem 20675MB
[2025-04-03 00:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][264/311]	eta 0:00:41 lr 0.000770	time 0.8762 (0.8815)	loss 0.6081 (0.6555)	grad_norm 1.5624 (2.6567)	mem 20675MB
[2025-04-03 00:37:51 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][266/311]	eta 0:00:39 lr 0.000773	time 0.8760 (0.8815)	loss 0.6642 (0.6555)	grad_norm 2.3211 (2.6499)	mem 20675MB
[2025-04-03 00:37:53 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][268/311]	eta 0:00:37 lr 0.000776	time 0.8761 (0.8814)	loss 0.6161 (0.6554)	grad_norm 2.2506 (2.6463)	mem 20675MB
[2025-04-03 00:37:55 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][270/311]	eta 0:00:36 lr 0.000778	time 0.8761 (0.8814)	loss 0.6721 (0.6554)	grad_norm 4.5567 (2.6575)	mem 20675MB
[2025-04-03 00:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][272/311]	eta 0:00:34 lr 0.000781	time 0.8761 (0.8814)	loss 0.5959 (0.6552)	grad_norm 2.4806 (2.6518)	mem 20675MB
[2025-04-03 00:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][274/311]	eta 0:00:32 lr 0.000784	time 0.8759 (0.8813)	loss 0.6464 (0.6551)	grad_norm 4.0566 (2.6583)	mem 20675MB
[2025-04-03 00:38:00 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][276/311]	eta 0:00:30 lr 0.000787	time 0.8759 (0.8813)	loss 0.6988 (0.6553)	grad_norm 2.2839 (2.6592)	mem 20675MB
[2025-04-03 00:38:02 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][278/311]	eta 0:00:29 lr 0.000789	time 0.8767 (0.8813)	loss 0.6496 (0.6553)	grad_norm 2.5508 (2.6612)	mem 20675MB
[2025-04-03 00:38:04 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][280/311]	eta 0:00:27 lr 0.000792	time 0.8758 (0.8812)	loss 0.6770 (0.6553)	grad_norm 2.9099 (2.6561)	mem 20675MB
[2025-04-03 00:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][282/311]	eta 0:00:25 lr 0.000795	time 0.8762 (0.8812)	loss 0.6854 (0.6554)	grad_norm 2.6973 (2.6534)	mem 20675MB
[2025-04-03 00:38:07 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][284/311]	eta 0:00:23 lr 0.000797	time 0.8761 (0.8812)	loss 0.6593 (0.6554)	grad_norm 4.4809 (2.6553)	mem 20675MB
[2025-04-03 00:38:09 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][286/311]	eta 0:00:22 lr 0.000800	time 0.8759 (0.8811)	loss 0.6378 (0.6553)	grad_norm 3.6118 (2.6527)	mem 20675MB
[2025-04-03 00:38:11 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][288/311]	eta 0:00:20 lr 0.000803	time 0.8759 (0.8811)	loss 0.6791 (0.6553)	grad_norm 2.1146 (2.6473)	mem 20675MB
[2025-04-03 00:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][290/311]	eta 0:00:18 lr 0.000805	time 0.8760 (0.8811)	loss 0.6143 (0.6550)	grad_norm 1.9946 (2.6417)	mem 20675MB
[2025-04-03 00:38:14 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][292/311]	eta 0:00:16 lr 0.000808	time 0.8764 (0.8810)	loss 0.6769 (0.6548)	grad_norm 2.6918 (2.6415)	mem 20675MB
[2025-04-03 00:38:16 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][294/311]	eta 0:00:14 lr 0.000811	time 0.8763 (0.8810)	loss 0.6560 (0.6551)	grad_norm 2.6568 (2.6494)	mem 20675MB
[2025-04-03 00:38:18 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][296/311]	eta 0:00:13 lr 0.000813	time 0.8757 (0.8810)	loss 0.6468 (0.6552)	grad_norm 3.3147 (2.6569)	mem 20675MB
[2025-04-03 00:38:19 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][298/311]	eta 0:00:11 lr 0.000816	time 0.8758 (0.8810)	loss 0.6481 (0.6552)	grad_norm 2.7760 (2.6534)	mem 20675MB
[2025-04-03 00:38:21 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][300/311]	eta 0:00:09 lr 0.000819	time 0.8755 (0.8809)	loss 0.6046 (0.6550)	grad_norm 1.9620 (2.6506)	mem 20675MB
[2025-04-03 00:38:23 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][302/311]	eta 0:00:07 lr 0.000821	time 0.8758 (0.8809)	loss 0.6629 (0.6550)	grad_norm 1.4292 (2.6446)	mem 20675MB
[2025-04-03 00:38:25 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][304/311]	eta 0:00:06 lr 0.000824	time 0.8760 (0.8809)	loss 0.6504 (0.6549)	grad_norm 2.6655 (2.6403)	mem 20675MB
[2025-04-03 00:38:26 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][306/311]	eta 0:00:04 lr 0.000827	time 0.8755 (0.8808)	loss 0.6486 (0.6549)	grad_norm 2.1523 (2.6346)	mem 20675MB
[2025-04-03 00:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][308/311]	eta 0:00:02 lr 0.000829	time 0.8756 (0.8808)	loss 0.6551 (0.6549)	grad_norm 2.5888 (2.6313)	mem 20675MB
[2025-04-03 00:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [1/30][310/311]	eta 0:00:00 lr 0.000832	time 0.8757 (0.8808)	loss 0.6560 (0.6549)	grad_norm 4.1452 (2.6322)	mem 20675MB
[2025-04-03 00:38:30 simmim_finetune] (main_finetune.py 260): INFO EPOCH 1 training takes 0:04:34
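Each `Train:` line above prints metrics as `value (average)`, e.g. `time 0.8760 (0.9115)` or `loss 0.6560 (0.6549)`: the first number is the latest step's value, the parenthesized one the running mean over the epoch. This is the standard meter pattern; a minimal sketch of how such fields are typically produced (class and attribute names are assumptions, not taken from `main_finetune.py`):

```python
class AverageMeter:
    """Tracks the latest value and the running average, printed in the
    log as `value (average)`, e.g. `time 0.8760 (0.9115)`."""

    def __init__(self):
        self.val = 0.0    # most recent value
        self.sum = 0.0    # cumulative sum of all values
        self.count = 0    # number of updates

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        # running mean; guard against division by zero before first update
        return self.sum / max(self.count, 1)


loss_meter = AverageMeter()
for step_loss in [0.70, 0.62, 0.65]:
    loss_meter.update(step_loss)
print(f"loss {loss_meter.val:.4f} ({loss_meter.avg:.4f})")  # loss 0.6500 (0.6567)
```

The same pattern would back the `time`, `loss`, and `grad_norm` columns, with the averages reset at the start of each epoch.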
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.381 (1.381)	Loss 0.6658 (0.6658)	Acc@1 60.156 (60.156)	Mem 20675MB
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 62.676
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 62.7%
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 63.38%
[2025-04-03 00:38:31 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.1596528112575324e-06, 3.1596528112575324e-06, 4.81598824784117e-06, 4.81598824784117e-06, 7.364196611815998e-06, 7.364196611815998e-06, 1.128451717177727e-05, 1.128451717177727e-05, 1.731577957171769e-05, 1.731577957171769e-05, 2.6594644802395258e-05, 2.6594644802395258e-05, 4.086982208036074e-05, 4.086982208036074e-05, 6.283163327723073e-05, 6.283163327723073e-05, 9.661903511856914e-05, 9.661903511856914e-05, 0.00014859965333601289, 0.00014859965333601289, 0.00022856983520900322, 0.00022856983520900322, 0.00035160088424437296, 0.00035160088424437296, 0.000540879421221865, 0.000540879421221865, 0.0008320771704180064, 0.0008320771704180064]
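The 28 learning rates above form 14 pairs (a weight-decay and a no-weight-decay parameter group per depth slot: 12 ViT blocks plus the embedding and head ends), and consecutive pairs differ by a factor of roughly 1/0.65 ≈ 1.538, matching `TRAIN.LAYER_DECAY: 0.65` in the config. A hedged sketch of how such layer-wise scales are commonly built (the function name is an assumption; the exact grouping in `main_finetune.py` may differ):

```python
def layerwise_lr_scales(num_layers=12, layer_decay=0.65):
    """Per-depth LR multipliers for layer-wise decay: the deepest group
    (closest to the head) keeps the full LR, and each step toward the
    patch embedding multiplies the LR by `layer_decay`.

    num_layers transformer blocks plus the embedding and head give
    num_layers + 2 depth slots, i.e. 14 for ViT-Base."""
    return [layer_decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]


scales = layerwise_lr_scales()
# 14 scales; adjacent entries differ by a factor of 1/0.65,
# mirroring the ratios between consecutive LR pairs in the log line
print(len(scales), scales[0], scales[-1])
```

Each scale is then multiplied by the scheduler's current base LR (warming up toward `BASE_LR`), which is why every group's LR moves together while the ratios stay fixed.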
[2025-04-03 00:38:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][0/311]	eta 0:10:23 lr 0.000833	time 2.0058 (2.0058)	loss 0.7108 (0.7108)	grad_norm 2.0179 (2.0179)	mem 20675MB
[2025-04-03 00:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][2/311]	eta 0:06:27 lr 0.000836	time 0.8759 (1.2532)	loss 0.6568 (0.6792)	grad_norm 1.7330 (1.7344)	mem 20675MB
[2025-04-03 00:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][4/311]	eta 0:05:38 lr 0.000839	time 0.8766 (1.1028)	loss 0.6550 (0.6641)	grad_norm 3.4875 (2.0497)	mem 20675MB
[2025-04-03 00:38:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][6/311]	eta 0:05:16 lr 0.000841	time 0.8764 (1.0384)	loss 0.6302 (0.6545)	grad_norm 2.1487 (1.9783)	mem 20675MB
[2025-04-03 00:38:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][8/311]	eta 0:05:03 lr 0.000844	time 0.8763 (1.0027)	loss 0.6704 (0.6586)	grad_norm 4.0616 (2.1731)	mem 20675MB
[2025-04-03 00:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][10/311]	eta 0:04:54 lr 0.000847	time 0.8761 (0.9798)	loss 0.6570 (0.6602)	grad_norm 1.7987 (2.1447)	mem 20675MB
[2025-04-03 00:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][12/311]	eta 0:04:48 lr 0.000849	time 0.8763 (0.9639)	loss 0.5866 (0.6524)	grad_norm 2.8044 (2.1665)	mem 20675MB
[2025-04-03 00:38:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][14/311]	eta 0:04:42 lr 0.000852	time 0.8760 (0.9523)	loss 0.6453 (0.6502)	grad_norm 4.4056 (2.3061)	mem 20675MB
[2025-04-03 00:38:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][16/311]	eta 0:04:38 lr 0.000855	time 0.8762 (0.9434)	loss 0.6311 (0.6494)	grad_norm 1.8328 (2.3184)	mem 20675MB
[2025-04-03 00:38:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][18/311]	eta 0:04:34 lr 0.000858	time 0.8761 (0.9364)	loss 0.6235 (0.6455)	grad_norm 4.8166 (2.5411)	mem 20675MB
[2025-04-03 00:38:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][20/311]	eta 0:04:30 lr 0.000860	time 0.8760 (0.9307)	loss 0.7007 (0.6477)	grad_norm 1.8110 (2.5625)	mem 20675MB
[2025-04-03 00:38:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][22/311]	eta 0:04:27 lr 0.000863	time 0.8762 (0.9261)	loss 0.5694 (0.6444)	grad_norm 2.6151 (2.5904)	mem 20675MB
[2025-04-03 00:38:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][24/311]	eta 0:04:24 lr 0.000866	time 0.8760 (0.9221)	loss 0.6648 (0.6464)	grad_norm 2.7198 (2.6847)	mem 20675MB
[2025-04-03 00:38:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][26/311]	eta 0:04:21 lr 0.000868	time 0.8761 (0.9188)	loss 0.6282 (0.6453)	grad_norm 1.8110 (2.7270)	mem 20675MB
[2025-04-03 00:38:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][28/311]	eta 0:04:19 lr 0.000871	time 0.8761 (0.9159)	loss 0.6628 (0.6460)	grad_norm 2.1120 (2.7244)	mem 20675MB
[2025-04-03 00:39:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][30/311]	eta 0:04:16 lr 0.000874	time 0.8760 (0.9134)	loss 0.7008 (0.6470)	grad_norm 3.7596 (2.7965)	mem 20675MB
[2025-04-03 00:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][32/311]	eta 0:04:14 lr 0.000876	time 0.8760 (0.9112)	loss 0.6122 (0.6457)	grad_norm 2.9665 (2.8094)	mem 20675MB
[2025-04-03 00:39:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][34/311]	eta 0:04:11 lr 0.000879	time 0.8758 (0.9092)	loss 0.5957 (0.6430)	grad_norm 6.0818 (2.9048)	mem 20675MB
[2025-04-03 00:39:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][36/311]	eta 0:04:09 lr 0.000882	time 0.8759 (0.9074)	loss 0.6293 (0.6432)	grad_norm 1.6119 (2.9825)	mem 20675MB
[2025-04-03 00:39:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][38/311]	eta 0:04:07 lr 0.000884	time 0.8760 (0.9059)	loss 0.5769 (0.6413)	grad_norm 3.2718 (2.9897)	mem 20675MB
[2025-04-03 00:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][40/311]	eta 0:04:05 lr 0.000887	time 0.8762 (0.9045)	loss 0.5965 (0.6400)	grad_norm 3.3365 (3.0173)	mem 20675MB
[2025-04-03 00:39:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][42/311]	eta 0:04:02 lr 0.000890	time 0.8761 (0.9032)	loss 0.6210 (0.6404)	grad_norm 4.3858 (3.0109)	mem 20675MB
[2025-04-03 00:39:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][44/311]	eta 0:04:00 lr 0.000892	time 0.8761 (0.9020)	loss 0.6386 (0.6410)	grad_norm 1.9555 (2.9868)	mem 20675MB
[2025-04-03 00:39:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][46/311]	eta 0:03:58 lr 0.000895	time 0.8762 (0.9010)	loss 0.6408 (0.6410)	grad_norm 3.1648 (2.9715)	mem 20675MB
[2025-04-03 00:39:16 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][48/311]	eta 0:03:56 lr 0.000898	time 0.8768 (0.9000)	loss 0.6526 (0.6416)	grad_norm 1.4804 (2.9234)	mem 20675MB
[2025-04-03 00:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][50/311]	eta 0:03:54 lr 0.000900	time 0.8768 (0.8991)	loss 0.6291 (0.6416)	grad_norm 1.7156 (2.9314)	mem 20675MB
[2025-04-03 00:39:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][52/311]	eta 0:03:52 lr 0.000903	time 0.8762 (0.8983)	loss 0.6560 (0.6419)	grad_norm 1.4318 (2.8918)	mem 20675MB
[2025-04-03 00:39:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][54/311]	eta 0:03:50 lr 0.000906	time 0.8768 (0.8976)	loss 0.6385 (0.6418)	grad_norm 2.7022 (2.8701)	mem 20675MB
[2025-04-03 00:39:23 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][56/311]	eta 0:03:48 lr 0.000908	time 0.8765 (0.8968)	loss 0.6752 (0.6419)	grad_norm 3.5336 (2.8888)	mem 20675MB
[2025-04-03 00:39:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][58/311]	eta 0:03:46 lr 0.000911	time 0.8767 (0.8962)	loss 0.6459 (0.6417)	grad_norm 1.7469 (2.8834)	mem 20675MB
[2025-04-03 00:39:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][60/311]	eta 0:03:44 lr 0.000914	time 0.8762 (0.8955)	loss 0.6434 (0.6415)	grad_norm 1.2065 (2.8416)	mem 20675MB
[2025-04-03 00:39:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][62/311]	eta 0:03:42 lr 0.000916	time 0.8759 (0.8950)	loss 0.6610 (0.6418)	grad_norm 2.6755 (2.8168)	mem 20675MB
[2025-04-03 00:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][64/311]	eta 0:03:40 lr 0.000919	time 0.8762 (0.8944)	loss 0.6703 (0.6425)	grad_norm 1.7048 (2.8066)	mem 20675MB
[2025-04-03 00:39:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][66/311]	eta 0:03:38 lr 0.000922	time 0.8758 (0.8939)	loss 0.6086 (0.6417)	grad_norm 2.2137 (2.7805)	mem 20675MB
[2025-04-03 00:39:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][68/311]	eta 0:03:37 lr 0.000925	time 0.8759 (0.8934)	loss 0.6408 (0.6410)	grad_norm 2.0141 (2.7792)	mem 20675MB
[2025-04-03 00:39:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][70/311]	eta 0:03:35 lr 0.000927	time 0.8766 (0.8929)	loss 0.6827 (0.6415)	grad_norm 2.3693 (2.7650)	mem 20675MB
[2025-04-03 00:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][72/311]	eta 0:03:33 lr 0.000930	time 0.8759 (0.8925)	loss 0.6104 (0.6413)	grad_norm 2.0768 (2.7566)	mem 20675MB
[2025-04-03 00:39:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][74/311]	eta 0:03:31 lr 0.000933	time 0.8757 (0.8921)	loss 0.5479 (0.6395)	grad_norm 2.4987 (2.7491)	mem 20675MB
[2025-04-03 00:39:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][76/311]	eta 0:03:29 lr 0.000935	time 0.8758 (0.8917)	loss 0.6141 (0.6392)	grad_norm 2.7641 (2.7397)	mem 20675MB
[2025-04-03 00:39:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][78/311]	eta 0:03:27 lr 0.000938	time 0.8761 (0.8913)	loss 0.7173 (0.6396)	grad_norm 6.3147 (2.7760)	mem 20675MB
[2025-04-03 00:39:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][80/311]	eta 0:03:25 lr 0.000941	time 0.8761 (0.8909)	loss 0.6658 (0.6404)	grad_norm 2.0392 (2.7742)	mem 20675MB
[2025-04-03 00:39:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][82/311]	eta 0:03:23 lr 0.000943	time 0.8761 (0.8906)	loss 0.6493 (0.6404)	grad_norm 2.8893 (2.7745)	mem 20675MB
[2025-04-03 00:39:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][84/311]	eta 0:03:22 lr 0.000946	time 0.8763 (0.8903)	loss 0.5873 (0.6401)	grad_norm 2.2345 (2.7511)	mem 20675MB
[2025-04-03 00:39:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][86/311]	eta 0:03:20 lr 0.000949	time 0.8759 (0.8900)	loss 0.6049 (0.6401)	grad_norm 2.1684 (2.7501)	mem 20675MB
[2025-04-03 00:39:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][88/311]	eta 0:03:18 lr 0.000951	time 0.8761 (0.8897)	loss 0.6081 (0.6396)	grad_norm 2.6690 (2.7330)	mem 20675MB
[2025-04-03 00:39:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][90/311]	eta 0:03:16 lr 0.000954	time 0.8763 (0.8894)	loss 0.6536 (0.6393)	grad_norm 5.1610 (2.7611)	mem 20675MB
[2025-04-03 00:39:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][92/311]	eta 0:03:14 lr 0.000957	time 0.8758 (0.8891)	loss 0.6577 (0.6395)	grad_norm 3.0751 (2.7799)	mem 20675MB
[2025-04-03 00:39:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][94/311]	eta 0:03:12 lr 0.000959	time 0.8758 (0.8889)	loss 0.6175 (0.6391)	grad_norm 3.4263 (2.7862)	mem 20675MB
[2025-04-03 00:39:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][96/311]	eta 0:03:11 lr 0.000962	time 0.8761 (0.8886)	loss 0.6609 (0.6399)	grad_norm 1.4196 (2.7844)	mem 20675MB
[2025-04-03 00:39:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][98/311]	eta 0:03:09 lr 0.000965	time 0.8761 (0.8884)	loss 0.6631 (0.6401)	grad_norm 1.2838 (2.7563)	mem 20675MB
[2025-04-03 00:40:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][100/311]	eta 0:03:07 lr 0.000967	time 0.8760 (0.8882)	loss 0.6598 (0.6403)	grad_norm 1.7459 (2.7380)	mem 20675MB
[2025-04-03 00:40:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][102/311]	eta 0:03:05 lr 0.000970	time 0.8764 (0.8880)	loss 0.6256 (0.6402)	grad_norm 1.5240 (2.7112)	mem 20675MB
[2025-04-03 00:40:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][104/311]	eta 0:03:03 lr 0.000973	time 0.8762 (0.8877)	loss 0.6360 (0.6403)	grad_norm 1.9359 (2.6909)	mem 20675MB
[2025-04-03 00:40:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][106/311]	eta 0:03:01 lr 0.000975	time 0.8761 (0.8875)	loss 0.6401 (0.6403)	grad_norm 2.0320 (2.6777)	mem 20675MB
[2025-04-03 00:40:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][108/311]	eta 0:03:00 lr 0.000978	time 0.8758 (0.8873)	loss 0.6521 (0.6405)	grad_norm 2.4813 (2.6697)	mem 20675MB
[2025-04-03 00:40:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][110/311]	eta 0:02:58 lr 0.000981	time 0.8757 (0.8871)	loss 0.6406 (0.6407)	grad_norm 3.2659 (2.6746)	mem 20675MB
[2025-04-03 00:40:12 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][112/311]	eta 0:02:56 lr 0.000983	time 0.8760 (0.8870)	loss 0.6850 (0.6413)	grad_norm 2.5164 (2.6794)	mem 20675MB
[2025-04-03 00:40:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][114/311]	eta 0:02:54 lr 0.000986	time 0.8760 (0.8868)	loss 0.6851 (0.6424)	grad_norm 0.9620 (2.6692)	mem 20675MB
[2025-04-03 00:40:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][116/311]	eta 0:02:52 lr 0.000989	time 0.8758 (0.8866)	loss 0.6688 (0.6428)	grad_norm 0.8739 (2.6590)	mem 20675MB
[2025-04-03 00:40:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][118/311]	eta 0:02:51 lr 0.000991	time 0.8758 (0.8864)	loss 0.6756 (0.6431)	grad_norm 2.7310 (2.6510)	mem 20675MB
[2025-04-03 00:40:19 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][120/311]	eta 0:02:49 lr 0.000994	time 0.8758 (0.8863)	loss 0.6505 (0.6433)	grad_norm 1.2027 (2.6318)	mem 20675MB
[2025-04-03 00:40:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][122/311]	eta 0:02:47 lr 0.000997	time 0.8757 (0.8861)	loss 0.6680 (0.6433)	grad_norm 3.7221 (2.6321)	mem 20675MB
[2025-04-03 00:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][124/311]	eta 0:02:45 lr 0.001000	time 0.8762 (0.8860)	loss 0.6492 (0.6436)	grad_norm 4.4138 (2.6452)	mem 20675MB
[2025-04-03 00:40:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][126/311]	eta 0:02:43 lr 0.001002	time 0.8763 (0.8858)	loss 0.6200 (0.6431)	grad_norm 1.9288 (2.6346)	mem 20675MB
[2025-04-03 00:40:26 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][128/311]	eta 0:02:42 lr 0.001005	time 0.8759 (0.8857)	loss 0.6686 (0.6431)	grad_norm 3.1778 (2.6377)	mem 20675MB
[2025-04-03 00:40:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][130/311]	eta 0:02:40 lr 0.001008	time 0.8758 (0.8856)	loss 0.6206 (0.6429)	grad_norm 3.2713 (2.6376)	mem 20675MB
[2025-04-03 00:40:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][132/311]	eta 0:02:38 lr 0.001010	time 0.8759 (0.8854)	loss 0.5834 (0.6421)	grad_norm 4.4350 (2.6510)	mem 20675MB
[2025-04-03 00:40:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][134/311]	eta 0:02:36 lr 0.001013	time 0.8764 (0.8853)	loss 0.6372 (0.6422)	grad_norm 2.6959 (2.6538)	mem 20675MB
[2025-04-03 00:40:33 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][136/311]	eta 0:02:34 lr 0.001016	time 0.8758 (0.8852)	loss 0.6968 (0.6428)	grad_norm 6.2969 (2.6830)	mem 20675MB
[2025-04-03 00:40:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][138/311]	eta 0:02:33 lr 0.001018	time 0.8758 (0.8851)	loss 0.6356 (0.6428)	grad_norm 3.8158 (2.6817)	mem 20675MB
[2025-04-03 00:40:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][140/311]	eta 0:02:31 lr 0.001021	time 0.8761 (0.8850)	loss 0.6133 (0.6425)	grad_norm 1.5465 (2.6658)	mem 20675MB
[2025-04-03 00:40:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][142/311]	eta 0:02:29 lr 0.001024	time 0.8760 (0.8848)	loss 0.6606 (0.6429)	grad_norm 1.3587 (2.6861)	mem 20675MB
[2025-04-03 00:40:40 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][144/311]	eta 0:02:27 lr 0.001026	time 0.8766 (0.8847)	loss 0.6183 (0.6427)	grad_norm 2.4820 (2.6849)	mem 20675MB
[2025-04-03 00:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][146/311]	eta 0:02:25 lr 0.001029	time 0.8761 (0.8846)	loss 0.6645 (0.6428)	grad_norm 5.1382 (2.7040)	mem 20675MB
[2025-04-03 00:40:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][148/311]	eta 0:02:24 lr 0.001032	time 0.8760 (0.8845)	loss 0.6793 (0.6432)	grad_norm 2.7449 (2.6997)	mem 20675MB
[2025-04-03 00:40:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][150/311]	eta 0:02:22 lr 0.001034	time 0.8773 (0.8844)	loss 0.6588 (0.6433)	grad_norm 3.1843 (2.7121)	mem 20675MB
[2025-04-03 00:40:47 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][152/311]	eta 0:02:20 lr 0.001037	time 0.8763 (0.8843)	loss 0.6226 (0.6433)	grad_norm 2.2412 (2.7096)	mem 20675MB
[2025-04-03 00:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][154/311]	eta 0:02:18 lr 0.001040	time 0.8757 (0.8842)	loss 0.6103 (0.6432)	grad_norm 3.6581 (2.7255)	mem 20675MB
[2025-04-03 00:40:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][156/311]	eta 0:02:17 lr 0.001042	time 0.8757 (0.8841)	loss 0.6394 (0.6434)	grad_norm 2.2217 (2.7122)	mem 20675MB
[2025-04-03 00:40:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][158/311]	eta 0:02:15 lr 0.001045	time 0.8759 (0.8840)	loss 0.6350 (0.6431)	grad_norm 5.4311 (2.7285)	mem 20675MB
[2025-04-03 00:40:54 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][160/311]	eta 0:02:13 lr 0.001048	time 0.8757 (0.8840)	loss 0.6567 (0.6429)	grad_norm 2.2752 (2.7237)	mem 20675MB
[2025-04-03 00:40:56 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][162/311]	eta 0:02:11 lr 0.001050	time 0.8754 (0.8839)	loss 0.6297 (0.6426)	grad_norm 4.2279 (2.7323)	mem 20675MB
[2025-04-03 00:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][164/311]	eta 0:02:09 lr 0.001053	time 0.8760 (0.8838)	loss 0.6333 (0.6428)	grad_norm 1.3534 (2.7404)	mem 20675MB
[2025-04-03 00:40:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][166/311]	eta 0:02:08 lr 0.001056	time 0.8771 (0.8837)	loss 0.6618 (0.6433)	grad_norm 1.5922 (2.7464)	mem 20675MB
[2025-04-03 00:41:01 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][168/311]	eta 0:02:06 lr 0.001058	time 0.8758 (0.8836)	loss 0.6265 (0.6432)	grad_norm 1.1819 (2.7276)	mem 20675MB
[2025-04-03 00:41:03 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][170/311]	eta 0:02:04 lr 0.001061	time 0.8758 (0.8835)	loss 0.6403 (0.6432)	grad_norm 0.8673 (2.7066)	mem 20675MB
[2025-04-03 00:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][172/311]	eta 0:02:02 lr 0.001064	time 0.8760 (0.8835)	loss 0.6445 (0.6430)	grad_norm 1.6904 (2.6946)	mem 20675MB
[2025-04-03 00:41:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][174/311]	eta 0:02:01 lr 0.001066	time 0.8766 (0.8834)	loss 0.6730 (0.6429)	grad_norm 2.2589 (2.6922)	mem 20675MB
[2025-04-03 00:41:08 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][176/311]	eta 0:01:59 lr 0.001069	time 0.8763 (0.8833)	loss 0.6640 (0.6430)	grad_norm 2.7740 (2.6893)	mem 20675MB
[2025-04-03 00:41:10 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][178/311]	eta 0:01:57 lr 0.001072	time 0.8761 (0.8833)	loss 0.6716 (0.6430)	grad_norm 3.9739 (2.6987)	mem 20675MB
[2025-04-03 00:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][180/311]	eta 0:01:55 lr 0.001075	time 0.8754 (0.8832)	loss 0.6095 (0.6426)	grad_norm 2.7379 (2.6968)	mem 20675MB
[2025-04-03 00:41:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][182/311]	eta 0:01:53 lr 0.001077	time 0.8758 (0.8831)	loss 0.7034 (0.6427)	grad_norm 3.4130 (2.7036)	mem 20675MB
[2025-04-03 00:41:15 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][184/311]	eta 0:01:52 lr 0.001080	time 0.8759 (0.8830)	loss 0.6159 (0.6426)	grad_norm 4.1448 (2.7222)	mem 20675MB
[2025-04-03 00:41:17 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][186/311]	eta 0:01:50 lr 0.001083	time 0.8763 (0.8830)	loss 0.5875 (0.6424)	grad_norm 2.2516 (2.7110)	mem 20675MB
[2025-04-03 00:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][188/311]	eta 0:01:48 lr 0.001085	time 0.8759 (0.8829)	loss 0.5878 (0.6419)	grad_norm 3.0216 (2.7090)	mem 20675MB
[2025-04-03 00:41:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][190/311]	eta 0:01:46 lr 0.001088	time 0.8762 (0.8829)	loss 0.6068 (0.6416)	grad_norm 3.4283 (2.7059)	mem 20675MB
[2025-04-03 00:41:22 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][192/311]	eta 0:01:45 lr 0.001091	time 0.8758 (0.8828)	loss 0.6568 (0.6418)	grad_norm 1.7335 (2.7001)	mem 20675MB
[2025-04-03 00:41:24 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][194/311]	eta 0:01:43 lr 0.001093	time 0.8754 (0.8827)	loss 0.6520 (0.6420)	grad_norm 3.7896 (2.7019)	mem 20675MB
[2025-04-03 00:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][196/311]	eta 0:01:41 lr 0.001096	time 0.8754 (0.8827)	loss 0.6530 (0.6422)	grad_norm 1.3391 (2.7025)	mem 20675MB
[2025-04-03 00:41:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][198/311]	eta 0:01:39 lr 0.001099	time 0.8757 (0.8826)	loss 0.6645 (0.6421)	grad_norm 3.6569 (2.7042)	mem 20675MB
[2025-04-03 00:41:29 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][200/311]	eta 0:01:37 lr 0.001101	time 0.8754 (0.8825)	loss 0.6442 (0.6422)	grad_norm 2.5993 (2.7110)	mem 20675MB
[2025-04-03 00:41:31 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][202/311]	eta 0:01:36 lr 0.001104	time 0.8757 (0.8825)	loss 0.6169 (0.6422)	grad_norm 2.3177 (2.7019)	mem 20675MB
[2025-04-03 00:41:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][204/311]	eta 0:01:34 lr 0.001107	time 0.8760 (0.8824)	loss 0.6114 (0.6418)	grad_norm 2.8881 (2.7039)	mem 20675MB
[2025-04-03 00:41:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][206/311]	eta 0:01:32 lr 0.001109	time 0.8761 (0.8824)	loss 0.6504 (0.6417)	grad_norm 3.4243 (2.7088)	mem 20675MB
[2025-04-03 00:41:36 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][208/311]	eta 0:01:30 lr 0.001112	time 0.8755 (0.8823)	loss 0.6361 (0.6418)	grad_norm 4.4979 (2.7242)	mem 20675MB
[2025-04-03 00:41:38 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][210/311]	eta 0:01:29 lr 0.001115	time 0.8758 (0.8823)	loss 0.6896 (0.6419)	grad_norm 5.7198 (2.7343)	mem 20675MB
[2025-04-03 00:41:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][212/311]	eta 0:01:27 lr 0.001117	time 0.8757 (0.8822)	loss 0.6035 (0.6417)	grad_norm 2.1599 (2.7462)	mem 20675MB
[2025-04-03 00:41:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][214/311]	eta 0:01:25 lr 0.001120	time 0.8756 (0.8821)	loss 0.6359 (0.6418)	grad_norm 3.1637 (2.7412)	mem 20675MB
[2025-04-03 00:41:43 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][216/311]	eta 0:01:23 lr 0.001123	time 0.8757 (0.8821)	loss 0.7127 (0.6423)	grad_norm 5.2793 (2.7545)	mem 20675MB
[2025-04-03 00:41:45 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][218/311]	eta 0:01:22 lr 0.001125	time 0.8759 (0.8820)	loss 0.6785 (0.6423)	grad_norm 2.3714 (2.7465)	mem 20675MB
[2025-04-03 00:41:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][220/311]	eta 0:01:20 lr 0.001128	time 0.8757 (0.8820)	loss 0.6695 (0.6424)	grad_norm 4.9088 (2.7508)	mem 20675MB
[2025-04-03 00:41:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][222/311]	eta 0:01:18 lr 0.001131	time 0.8757 (0.8819)	loss 0.6327 (0.6422)	grad_norm 1.4259 (2.7411)	mem 20675MB
[2025-04-03 00:41:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][224/311]	eta 0:01:16 lr 0.001133	time 0.8754 (0.8819)	loss 0.6660 (0.6422)	grad_norm 2.7296 (2.7372)	mem 20675MB
[2025-04-03 00:41:52 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][226/311]	eta 0:01:14 lr 0.001136	time 0.8756 (0.8818)	loss 0.6446 (0.6419)	grad_norm 3.2653 (2.7470)	mem 20675MB
[2025-04-03 00:41:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][228/311]	eta 0:01:13 lr 0.001139	time 0.8757 (0.8818)	loss 0.6617 (0.6423)	grad_norm 3.0305 (2.7604)	mem 20675MB
[2025-04-03 00:41:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][230/311]	eta 0:01:11 lr 0.001142	time 0.8757 (0.8818)	loss 0.6532 (0.6424)	grad_norm 3.4944 (2.7711)	mem 20675MB
[2025-04-03 00:41:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][232/311]	eta 0:01:09 lr 0.001144	time 0.8757 (0.8817)	loss 0.6386 (0.6425)	grad_norm 1.1159 (2.7632)	mem 20675MB
[2025-04-03 00:41:59 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][234/311]	eta 0:01:07 lr 0.001147	time 0.8756 (0.8817)	loss 0.6583 (0.6428)	grad_norm 2.8737 (2.7625)	mem 20675MB
[2025-04-03 00:42:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][236/311]	eta 0:01:06 lr 0.001150	time 0.8757 (0.8816)	loss 0.6537 (0.6428)	grad_norm 2.5621 (2.7656)	mem 20675MB
[2025-04-03 00:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][238/311]	eta 0:01:04 lr 0.001152	time 0.8755 (0.8816)	loss 0.6497 (0.6428)	grad_norm 1.8543 (2.7583)	mem 20675MB
[2025-04-03 00:42:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][240/311]	eta 0:01:02 lr 0.001155	time 0.8754 (0.8815)	loss 0.5882 (0.6426)	grad_norm 2.1832 (2.7610)	mem 20675MB
[2025-04-03 00:42:06 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][242/311]	eta 0:01:00 lr 0.001158	time 0.8755 (0.8815)	loss 0.5775 (0.6424)	grad_norm 1.7564 (2.7551)	mem 20675MB
[2025-04-03 00:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][244/311]	eta 0:00:59 lr 0.001160	time 0.8779 (0.8815)	loss 0.6677 (0.6429)	grad_norm 7.1608 (2.7977)	mem 20675MB
[2025-04-03 00:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][246/311]	eta 0:00:57 lr 0.001163	time 0.8755 (0.8814)	loss 0.7009 (0.6433)	grad_norm 4.3633 (2.8036)	mem 20675MB
[2025-04-03 00:42:11 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][248/311]	eta 0:00:55 lr 0.001166	time 0.8756 (0.8814)	loss 0.6668 (0.6435)	grad_norm 2.1780 (2.8022)	mem 20675MB
[2025-04-03 00:42:13 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][250/311]	eta 0:00:53 lr 0.001168	time 0.8757 (0.8814)	loss 0.6519 (0.6435)	grad_norm 2.2277 (2.7967)	mem 20675MB
[2025-04-03 00:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][252/311]	eta 0:00:51 lr 0.001171	time 0.8756 (0.8813)	loss 0.6428 (0.6436)	grad_norm 1.4968 (2.7839)	mem 20675MB
[2025-04-03 00:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][254/311]	eta 0:00:50 lr 0.001174	time 0.8756 (0.8813)	loss 0.6244 (0.6434)	grad_norm 4.2084 (2.7869)	mem 20675MB
[2025-04-03 00:42:18 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][256/311]	eta 0:00:48 lr 0.001176	time 0.8759 (0.8812)	loss 0.6492 (0.6434)	grad_norm 1.9871 (2.7785)	mem 20675MB
[2025-04-03 00:42:20 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][258/311]	eta 0:00:46 lr 0.001179	time 0.8758 (0.8812)	loss 0.6371 (0.6432)	grad_norm 2.5311 (2.7803)	mem 20675MB
[2025-04-03 00:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][260/311]	eta 0:00:44 lr 0.001182	time 0.8755 (0.8812)	loss 0.6144 (0.6428)	grad_norm 2.3392 (2.7804)	mem 20675MB
[2025-04-03 00:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][262/311]	eta 0:00:43 lr 0.001184	time 0.8758 (0.8811)	loss 0.6115 (0.6428)	grad_norm 3.3854 (2.7853)	mem 20675MB
[2025-04-03 00:42:25 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][264/311]	eta 0:00:41 lr 0.001187	time 0.8757 (0.8811)	loss 0.6086 (0.6427)	grad_norm 1.7073 (2.7775)	mem 20675MB
[2025-04-03 00:42:27 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][266/311]	eta 0:00:39 lr 0.001190	time 0.8756 (0.8811)	loss 0.6527 (0.6428)	grad_norm 3.4779 (2.7837)	mem 20675MB
[2025-04-03 00:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][268/311]	eta 0:00:37 lr 0.001192	time 0.8757 (0.8810)	loss 0.6191 (0.6427)	grad_norm 1.9293 (2.7845)	mem 20675MB
[2025-04-03 00:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][270/311]	eta 0:00:36 lr 0.001195	time 0.8757 (0.8810)	loss 0.5868 (0.6426)	grad_norm 1.6909 (2.7851)	mem 20675MB
[2025-04-03 00:42:32 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][272/311]	eta 0:00:34 lr 0.001198	time 0.8757 (0.8810)	loss 0.6369 (0.6428)	grad_norm 1.9439 (2.7914)	mem 20675MB
[2025-04-03 00:42:34 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][274/311]	eta 0:00:32 lr 0.001200	time 0.8758 (0.8809)	loss 0.6146 (0.6427)	grad_norm 3.2568 (2.7883)	mem 20675MB
[2025-04-03 00:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][276/311]	eta 0:00:30 lr 0.001203	time 0.8754 (0.8809)	loss 0.5766 (0.6424)	grad_norm 3.2578 (2.7966)	mem 20675MB
[2025-04-03 00:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][278/311]	eta 0:00:29 lr 0.001206	time 0.8755 (0.8809)	loss 0.6655 (0.6428)	grad_norm 3.6863 (2.8075)	mem 20675MB
[2025-04-03 00:42:39 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][280/311]	eta 0:00:27 lr 0.001208	time 0.8755 (0.8808)	loss 0.6710 (0.6430)	grad_norm 1.9329 (2.8064)	mem 20675MB
[2025-04-03 00:42:41 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][282/311]	eta 0:00:25 lr 0.001211	time 0.8757 (0.8808)	loss 0.6592 (0.6430)	grad_norm 3.2838 (2.8045)	mem 20675MB
[2025-04-03 00:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][284/311]	eta 0:00:23 lr 0.001214	time 0.8756 (0.8808)	loss 0.5956 (0.6428)	grad_norm 2.3798 (2.8003)	mem 20675MB
[2025-04-03 00:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][286/311]	eta 0:00:22 lr 0.001217	time 0.8754 (0.8808)	loss 0.6363 (0.6427)	grad_norm 2.1519 (2.7934)	mem 20675MB
[2025-04-03 00:42:46 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][288/311]	eta 0:00:20 lr 0.001219	time 0.8755 (0.8807)	loss 0.5814 (0.6427)	grad_norm 1.9575 (2.7940)	mem 20675MB
[2025-04-03 00:42:48 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][290/311]	eta 0:00:18 lr 0.001222	time 0.8758 (0.8807)	loss 0.6598 (0.6427)	grad_norm 2.7767 (2.7928)	mem 20675MB
[2025-04-03 00:42:50 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][292/311]	eta 0:00:16 lr 0.001225	time 0.8758 (0.8807)	loss 0.6312 (0.6427)	grad_norm 1.8303 (2.7953)	mem 20675MB
[2025-04-03 00:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][294/311]	eta 0:00:14 lr 0.001227	time 0.8759 (0.8806)	loss 0.6102 (0.6426)	grad_norm 2.6455 (2.7940)	mem 20675MB
[2025-04-03 00:42:53 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][296/311]	eta 0:00:13 lr 0.001230	time 0.8755 (0.8806)	loss 0.6865 (0.6427)	grad_norm 4.2598 (2.8057)	mem 20675MB
[2025-04-03 00:42:55 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][298/311]	eta 0:00:11 lr 0.001233	time 0.8756 (0.8806)	loss 0.6678 (0.6428)	grad_norm 2.5111 (2.8012)	mem 20675MB
[2025-04-03 00:42:57 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][300/311]	eta 0:00:09 lr 0.001235	time 0.8752 (0.8806)	loss 0.6463 (0.6429)	grad_norm 2.7363 (2.8006)	mem 20675MB
[2025-04-03 00:42:58 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][302/311]	eta 0:00:07 lr 0.001238	time 0.8758 (0.8805)	loss 0.6395 (0.6430)	grad_norm 2.8494 (2.7957)	mem 20675MB
[2025-04-03 00:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][304/311]	eta 0:00:06 lr 0.001241	time 0.8753 (0.8805)	loss 0.6306 (0.6429)	grad_norm 1.8771 (2.7873)	mem 20675MB
[2025-04-03 00:43:02 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][306/311]	eta 0:00:04 lr 0.001243	time 0.8755 (0.8805)	loss 0.7288 (0.6433)	grad_norm 4.9906 (2.7977)	mem 20675MB
[2025-04-03 00:43:04 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][308/311]	eta 0:00:02 lr 0.001246	time 0.8752 (0.8804)	loss 0.6475 (0.6433)	grad_norm 2.2167 (2.7930)	mem 20675MB
[2025-04-03 00:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [2/30][310/311]	eta 0:00:00 lr 0.001249	time 0.8754 (0.8804)	loss 0.6562 (0.6434)	grad_norm 0.8063 (2.7860)	mem 20675MB
[2025-04-03 00:43:05 simmim_finetune] (main_finetune.py 260): INFO EPOCH 2 training takes 0:04:33
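The eta column in the train lines above can be reproduced from the running-average iteration time (the parenthesized time value) and the remaining iteration count. A minimal sketch of that arithmetic, assuming the logger computes it this way; `format_eta` is a hypothetical helper, not a function from main_finetune.py:

```python
import datetime


def format_eta(avg_iter_time: float, num_iters: int, current_iter: int) -> str:
    """Estimate time remaining in the epoch from the running mean
    iteration time, formatted like the log's H:MM:SS eta column."""
    remaining = num_iters - current_iter
    return str(datetime.timedelta(seconds=int(avg_iter_time * remaining)))


# Iteration [40/311] with mean time 0.9045 s -> eta 0:04:05, as logged above.
print(format_eta(0.9045, 311, 40))
```

With the epoch's final averaged time of 0.8804 s and one iteration left, the same formula gives the logged `eta 0:00:00`.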
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.392 (1.392)	Loss 0.6452 (0.6452)	Acc@1 64.844 (64.844)	Mem 20675MB
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 67.606
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 67.6%
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 67.61%
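The `val (avg)` pairs in the test line above (Acc@1 64.844 on the first 128-image batch versus 67.606 overall on 142 images) are consistent with a count-weighted running mean. Below is a minimal sketch of such a meter, assuming this aggregation scheme; the ~92.86 second-batch accuracy used in the example is not logged here, it is only the value implied by the two printed numbers:

```python
class AverageMeter:
    """Track an instantaneous value and its count-weighted running average,
    matching the 'val (avg)' pairs printed in the log."""

    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val: float, n: int = 1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self) -> float:
        return self.sum / max(self.count, 1)


meter = AverageMeter()
meter.update(64.844, n=128)  # batch 0 of 2, as logged
meter.update(92.8586, n=14)  # remaining 14 images (implied, not logged)
print(round(meter.avg, 3))   # ~67.606, the overall Acc@1 reported above
```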
[2025-04-03 00:43:07 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.6168219325153305e-06, 4.6168219325153305e-06, 7.102658690801885e-06, 7.102658690801885e-06, 1.0927022934319661e-05, 1.0927022934319661e-05, 1.6810660232039318e-05, 1.6810660232039318e-05, 2.5862409920838787e-05, 2.5862409920838787e-05, 3.978817867283797e-05, 3.978817867283797e-05, 6.121243829129824e-05, 6.121243829129824e-05, 9.417283770431406e-05, 9.417283770431406e-05, 0.00014488114449356915, 0.00014488114449356915, 0.00022289392416934622, 0.00022289392416934622, 0.0003429135852090032, 0.0003429135852090032, 0.0005275592175777063, 0.0005275592175777063, 0.000811629421221865, 0.000811629421221865, 0.0012486605037513396, 0.0012486605037513396]
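The 28 learning rates above pair up into 14 distinct values (one weight-decay and one no-decay group per depth level), and consecutive values differ by a factor of 1/0.65 — the configured LAYER_DECAY of 0.65 applied layer-wise, so deeper groups train with larger rates and the head group gets the full current rate. A minimal sketch of that scaling, assuming this scheme; the exact group partitioning in main_finetune.py may differ:

```python
def layer_wise_lr_scales(num_levels: int, decay: float) -> list:
    """Per-level LR multipliers for layer-wise decay: the earliest level
    (patch embedding) is scaled by decay**(num_levels - 1), the last
    level (classifier head) by 1.0."""
    return [decay ** (num_levels - 1 - i) for i in range(num_levels)]


# 14 levels with LAYER_DECAY = 0.65, anchored at the top-group rate from the log.
scales = layer_wise_lr_scales(14, 0.65)
top_lr = 0.0012486605037513396
lrs = [top_lr * s for s in scales]
# lrs[0] is ~4.617e-06, matching the first group's rate printed above.
```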
[2025-04-03 00:43:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][0/311]	eta 0:10:41 lr 0.001219	time 2.0613 (2.0613)	loss 0.6268 (0.6268)	grad_norm 2.6466 (2.6466)	mem 20675MB
[2025-04-03 00:43:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][2/311]	eta 0:06:32 lr 0.001219	time 0.8761 (1.2718)	loss 0.6022 (0.6198)	grad_norm 4.9156 (3.8374)	mem 20675MB
[2025-04-03 00:43:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][4/311]	eta 0:05:41 lr 0.001219	time 0.8759 (1.1138)	loss 0.5925 (0.6233)	grad_norm 1.3917 (2.8668)	mem 20675MB
[2025-04-03 00:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][6/311]	eta 0:05:19 lr 0.001219	time 0.8755 (1.0460)	loss 0.6412 (0.6266)	grad_norm 2.5779 (2.6224)	mem 20675MB
[2025-04-03 00:43:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][8/311]	eta 0:05:05 lr 0.001219	time 0.8755 (1.0083)	loss 0.5602 (0.6225)	grad_norm 4.4577 (3.1764)	mem 20675MB
[2025-04-03 00:43:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][10/311]	eta 0:04:56 lr 0.001219	time 0.8753 (0.9843)	loss 0.7063 (0.6276)	grad_norm 7.0422 (3.4832)	mem 20675MB
[2025-04-03 00:43:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][12/311]	eta 0:04:49 lr 0.001219	time 0.8754 (0.9677)	loss 0.6888 (0.6346)	grad_norm 1.8274 (3.4477)	mem 20675MB
[2025-04-03 00:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][14/311]	eta 0:04:43 lr 0.001218	time 0.8756 (0.9555)	loss 0.6644 (0.6348)	grad_norm 4.9873 (3.4846)	mem 20675MB
[2025-04-03 00:43:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][16/311]	eta 0:04:39 lr 0.001218	time 0.8757 (0.9462)	loss 0.6022 (0.6300)	grad_norm 3.0584 (3.4308)	mem 20675MB
[2025-04-03 00:43:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][18/311]	eta 0:04:35 lr 0.001218	time 0.8756 (0.9389)	loss 0.6378 (0.6289)	grad_norm 1.3540 (3.2405)	mem 20675MB
[2025-04-03 00:43:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][20/311]	eta 0:04:31 lr 0.001218	time 0.8758 (0.9329)	loss 0.5582 (0.6256)	grad_norm 2.6642 (3.1156)	mem 20675MB
[2025-04-03 00:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][22/311]	eta 0:04:28 lr 0.001218	time 0.8759 (0.9280)	loss 0.6307 (0.6277)	grad_norm 2.2384 (3.1008)	mem 20675MB
[2025-04-03 00:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][24/311]	eta 0:04:25 lr 0.001218	time 0.8758 (0.9239)	loss 0.6714 (0.6309)	grad_norm 2.6795 (3.1895)	mem 20675MB
[2025-04-03 00:43:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][26/311]	eta 0:04:22 lr 0.001218	time 0.8758 (0.9204)	loss 0.6838 (0.6306)	grad_norm 3.6213 (3.2049)	mem 20675MB
[2025-04-03 00:43:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][28/311]	eta 0:04:19 lr 0.001218	time 0.8755 (0.9174)	loss 0.6551 (0.6325)	grad_norm 2.4561 (3.2210)	mem 20675MB
[2025-04-03 00:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][30/311]	eta 0:04:17 lr 0.001217	time 0.8759 (0.9147)	loss 0.6173 (0.6327)	grad_norm 2.4147 (3.1572)	mem 20675MB
[2025-04-03 00:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][32/311]	eta 0:04:14 lr 0.001217	time 0.8757 (0.9124)	loss 0.6392 (0.6330)	grad_norm 2.0754 (3.1120)	mem 20675MB
[2025-04-03 00:43:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][34/311]	eta 0:04:12 lr 0.001217	time 0.8754 (0.9103)	loss 0.6332 (0.6338)	grad_norm 1.4111 (3.0805)	mem 20675MB
[2025-04-03 00:43:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][36/311]	eta 0:04:09 lr 0.001217	time 0.8754 (0.9085)	loss 0.6005 (0.6336)	grad_norm 2.4601 (3.0719)	mem 20675MB
[2025-04-03 00:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][38/311]	eta 0:04:07 lr 0.001217	time 0.8763 (0.9069)	loss 0.6843 (0.6351)	grad_norm 2.0114 (3.0373)	mem 20675MB
[2025-04-03 00:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][40/311]	eta 0:04:05 lr 0.001217	time 0.8756 (0.9054)	loss 0.6356 (0.6356)	grad_norm 2.0380 (3.0344)	mem 20675MB
[2025-04-03 00:43:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][42/311]	eta 0:04:03 lr 0.001217	time 0.8757 (0.9041)	loss 0.6586 (0.6365)	grad_norm 1.8779 (2.9866)	mem 20675MB
[2025-04-03 00:43:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][44/311]	eta 0:04:01 lr 0.001216	time 0.8754 (0.9029)	loss 0.6211 (0.6330)	grad_norm 3.6265 (2.9897)	mem 20675MB
[2025-04-03 00:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][46/311]	eta 0:03:58 lr 0.001216	time 0.8758 (0.9018)	loss 0.6678 (0.6345)	grad_norm 1.7500 (2.9521)	mem 20675MB
[2025-04-03 00:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][48/311]	eta 0:03:56 lr 0.001216	time 0.8762 (0.9007)	loss 0.6846 (0.6366)	grad_norm 2.6852 (2.9185)	mem 20675MB
[2025-04-03 00:43:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][50/311]	eta 0:03:54 lr 0.001216	time 0.8755 (0.8998)	loss 0.6372 (0.6351)	grad_norm 1.8232 (2.8980)	mem 20675MB
[2025-04-03 00:43:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][52/311]	eta 0:03:52 lr 0.001216	time 0.8755 (0.8989)	loss 0.5574 (0.6337)	grad_norm 1.9870 (2.8499)	mem 20675MB
[2025-04-03 00:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][54/311]	eta 0:03:50 lr 0.001216	time 0.8755 (0.8981)	loss 0.5947 (0.6338)	grad_norm 3.3425 (2.8355)	mem 20675MB
[2025-04-03 00:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][56/311]	eta 0:03:48 lr 0.001216	time 0.8756 (0.8973)	loss 0.6439 (0.6343)	grad_norm 2.8432 (2.8372)	mem 20675MB
[2025-04-03 00:44:00 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][58/311]	eta 0:03:46 lr 0.001216	time 0.8758 (0.8966)	loss 0.6718 (0.6349)	grad_norm 2.2242 (2.8273)	mem 20675MB
[2025-04-03 00:44:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][60/311]	eta 0:03:44 lr 0.001215	time 0.8758 (0.8960)	loss 0.5686 (0.6337)	grad_norm 1.6731 (2.8096)	mem 20675MB
[2025-04-03 00:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][62/311]	eta 0:03:42 lr 0.001215	time 0.8765 (0.8954)	loss 0.6416 (0.6335)	grad_norm 1.4538 (2.7720)	mem 20675MB
[2025-04-03 00:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][64/311]	eta 0:03:41 lr 0.001215	time 0.8757 (0.8948)	loss 0.6442 (0.6331)	grad_norm 4.1207 (2.7873)	mem 20675MB
[2025-04-03 00:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][66/311]	eta 0:03:39 lr 0.001215	time 0.8769 (0.8943)	loss 0.6317 (0.6333)	grad_norm 1.6202 (2.7693)	mem 20675MB
[2025-04-03 00:44:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][68/311]	eta 0:03:37 lr 0.001215	time 0.8754 (0.8937)	loss 0.6858 (0.6340)	grad_norm 3.5741 (2.7874)	mem 20675MB
[2025-04-03 00:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][70/311]	eta 0:03:35 lr 0.001215	time 0.8759 (0.8933)	loss 0.6269 (0.6342)	grad_norm 2.1558 (2.7663)	mem 20675MB
[2025-04-03 00:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][72/311]	eta 0:03:33 lr 0.001215	time 0.8754 (0.8928)	loss 0.6578 (0.6340)	grad_norm 1.1934 (2.7429)	mem 20675MB
[2025-04-03 00:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][74/311]	eta 0:03:31 lr 0.001214	time 0.8759 (0.8924)	loss 0.6354 (0.6344)	grad_norm 3.4212 (2.7529)	mem 20675MB
[2025-04-03 00:44:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][76/311]	eta 0:03:29 lr 0.001214	time 0.8779 (0.8920)	loss 0.6424 (0.6349)	grad_norm 2.1645 (2.7326)	mem 20675MB
[2025-04-03 00:44:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][78/311]	eta 0:03:27 lr 0.001214	time 0.8759 (0.8916)	loss 0.6656 (0.6353)	grad_norm 3.4781 (2.7236)	mem 20675MB
[2025-04-03 00:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][80/311]	eta 0:03:25 lr 0.001214	time 0.8758 (0.8912)	loss 0.6379 (0.6352)	grad_norm 1.7801 (2.6960)	mem 20675MB
[2025-04-03 00:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][82/311]	eta 0:03:24 lr 0.001214	time 0.8757 (0.8909)	loss 0.6386 (0.6349)	grad_norm 2.2015 (2.6815)	mem 20675MB
[2025-04-03 00:44:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][84/311]	eta 0:03:22 lr 0.001214	time 0.8758 (0.8905)	loss 0.5771 (0.6336)	grad_norm 2.3047 (2.6668)	mem 20675MB
[2025-04-03 00:44:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][86/311]	eta 0:03:20 lr 0.001214	time 0.8757 (0.8902)	loss 0.6446 (0.6336)	grad_norm 2.4423 (2.6637)	mem 20675MB
[2025-04-03 00:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][88/311]	eta 0:03:18 lr 0.001213	time 0.8759 (0.8899)	loss 0.6416 (0.6339)	grad_norm 4.1250 (2.6776)	mem 20675MB
[2025-04-03 00:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][90/311]	eta 0:03:16 lr 0.001213	time 0.8763 (0.8896)	loss 0.5894 (0.6338)	grad_norm 2.8194 (2.6973)	mem 20675MB
[2025-04-03 00:44:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][92/311]	eta 0:03:14 lr 0.001213	time 0.8755 (0.8893)	loss 0.6316 (0.6331)	grad_norm 3.0032 (2.7048)	mem 20675MB
[2025-04-03 00:44:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][94/311]	eta 0:03:12 lr 0.001213	time 0.8758 (0.8891)	loss 0.6169 (0.6331)	grad_norm 2.3986 (2.7305)	mem 20675MB
[2025-04-03 00:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][96/311]	eta 0:03:11 lr 0.001213	time 0.8756 (0.8888)	loss 0.5910 (0.6330)	grad_norm 2.3251 (2.7347)	mem 20675MB
[2025-04-03 00:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][98/311]	eta 0:03:09 lr 0.001213	time 0.8758 (0.8886)	loss 0.6113 (0.6331)	grad_norm 3.7289 (2.7406)	mem 20675MB
[2025-04-03 00:44:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][100/311]	eta 0:03:07 lr 0.001213	time 0.8757 (0.8883)	loss 0.6408 (0.6328)	grad_norm 2.3941 (2.7359)	mem 20675MB
[2025-04-03 00:44:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][102/311]	eta 0:03:05 lr 0.001212	time 0.8756 (0.8881)	loss 0.6748 (0.6332)	grad_norm 3.4780 (2.7374)	mem 20675MB
[2025-04-03 00:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][104/311]	eta 0:03:03 lr 0.001212	time 0.8760 (0.8879)	loss 0.6875 (0.6340)	grad_norm 1.6561 (2.7288)	mem 20675MB
[2025-04-03 00:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][106/311]	eta 0:03:01 lr 0.001212	time 0.8762 (0.8877)	loss 0.6324 (0.6340)	grad_norm 2.5150 (2.7144)	mem 20675MB
[2025-04-03 00:44:44 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][108/311]	eta 0:03:00 lr 0.001212	time 0.8756 (0.8875)	loss 0.6469 (0.6343)	grad_norm 2.0140 (2.7104)	mem 20675MB
[2025-04-03 00:44:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][110/311]	eta 0:02:58 lr 0.001212	time 0.8754 (0.8873)	loss 0.5800 (0.6338)	grad_norm 1.9299 (2.6934)	mem 20675MB
[2025-04-03 00:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][112/311]	eta 0:02:56 lr 0.001212	time 0.8758 (0.8871)	loss 0.6365 (0.6339)	grad_norm 1.0713 (2.6779)	mem 20675MB
[2025-04-03 00:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][114/311]	eta 0:02:54 lr 0.001212	time 0.8754 (0.8869)	loss 0.6547 (0.6340)	grad_norm 3.0467 (2.6752)	mem 20675MB
[2025-04-03 00:44:51 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][116/311]	eta 0:02:52 lr 0.001211	time 0.8755 (0.8867)	loss 0.6174 (0.6340)	grad_norm 1.2416 (2.6554)	mem 20675MB
[2025-04-03 00:44:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][118/311]	eta 0:02:51 lr 0.001211	time 0.8758 (0.8866)	loss 0.6397 (0.6337)	grad_norm 1.4074 (2.6384)	mem 20675MB
[2025-04-03 00:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][120/311]	eta 0:02:49 lr 0.001211	time 0.8756 (0.8864)	loss 0.6627 (0.6340)	grad_norm 2.8295 (2.6380)	mem 20675MB
[2025-04-03 00:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][122/311]	eta 0:02:47 lr 0.001211	time 0.8754 (0.8862)	loss 0.6159 (0.6344)	grad_norm 1.6251 (2.6502)	mem 20675MB
[2025-04-03 00:44:58 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][124/311]	eta 0:02:45 lr 0.001211	time 0.8766 (0.8861)	loss 0.6528 (0.6352)	grad_norm 2.7483 (2.6692)	mem 20675MB
[2025-04-03 00:44:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][126/311]	eta 0:02:43 lr 0.001211	time 0.8758 (0.8859)	loss 0.6456 (0.6354)	grad_norm 2.2784 (2.6591)	mem 20675MB
[2025-04-03 00:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][128/311]	eta 0:02:42 lr 0.001211	time 0.8756 (0.8858)	loss 0.6541 (0.6356)	grad_norm 1.4472 (2.6446)	mem 20675MB
[2025-04-03 00:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][130/311]	eta 0:02:40 lr 0.001210	time 0.8754 (0.8856)	loss 0.6609 (0.6359)	grad_norm 2.3730 (2.6388)	mem 20675MB
[2025-04-03 00:45:05 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][132/311]	eta 0:02:38 lr 0.001210	time 0.8764 (0.8855)	loss 0.6222 (0.6360)	grad_norm 2.5135 (2.6290)	mem 20675MB
[2025-04-03 00:45:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][134/311]	eta 0:02:36 lr 0.001210	time 0.8774 (0.8854)	loss 0.6283 (0.6358)	grad_norm 1.2119 (2.6229)	mem 20675MB
[2025-04-03 00:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][136/311]	eta 0:02:34 lr 0.001210	time 0.8766 (0.8853)	loss 0.5904 (0.6349)	grad_norm 2.0076 (2.6176)	mem 20675MB
[2025-04-03 00:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][138/311]	eta 0:02:33 lr 0.001210	time 0.8753 (0.8851)	loss 0.5804 (0.6349)	grad_norm 2.6052 (2.6291)	mem 20675MB
[2025-04-03 00:45:12 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][140/311]	eta 0:02:31 lr 0.001210	time 0.8759 (0.8850)	loss 0.6163 (0.6346)	grad_norm 1.9605 (2.6184)	mem 20675MB
[2025-04-03 00:45:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][142/311]	eta 0:02:29 lr 0.001210	time 0.8778 (0.8849)	loss 0.6013 (0.6344)	grad_norm 1.8953 (2.6107)	mem 20675MB
[2025-04-03 00:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][144/311]	eta 0:02:27 lr 0.001209	time 0.8774 (0.8848)	loss 0.6507 (0.6345)	grad_norm 1.8577 (2.6003)	mem 20675MB
[2025-04-03 00:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][146/311]	eta 0:02:25 lr 0.001209	time 0.8761 (0.8847)	loss 0.6143 (0.6346)	grad_norm 1.8501 (2.5928)	mem 20675MB
[2025-04-03 00:45:19 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][148/311]	eta 0:02:24 lr 0.001209	time 0.8769 (0.8846)	loss 0.6427 (0.6345)	grad_norm 1.1845 (2.5812)	mem 20675MB
[2025-04-03 00:45:21 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][150/311]	eta 0:02:22 lr 0.001209	time 0.8758 (0.8845)	loss 0.6142 (0.6342)	grad_norm 1.6412 (2.5648)	mem 20675MB
[2025-04-03 00:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][152/311]	eta 0:02:20 lr 0.001209	time 0.8758 (0.8844)	loss 0.6470 (0.6335)	grad_norm 3.0822 (2.5653)	mem 20675MB
[2025-04-03 00:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][154/311]	eta 0:02:18 lr 0.001209	time 0.8764 (0.8843)	loss 0.5341 (0.6326)	grad_norm 2.2635 (2.5641)	mem 20675MB
[2025-04-03 00:45:26 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][156/311]	eta 0:02:17 lr 0.001208	time 0.8763 (0.8842)	loss 0.6922 (0.6332)	grad_norm 2.5439 (2.5623)	mem 20675MB
[2025-04-03 00:45:28 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][158/311]	eta 0:02:15 lr 0.001208	time 0.8757 (0.8841)	loss 0.5950 (0.6332)	grad_norm 3.7195 (2.5822)	mem 20675MB
[2025-04-03 00:45:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][160/311]	eta 0:02:13 lr 0.001208	time 0.8754 (0.8840)	loss 0.5996 (0.6330)	grad_norm 4.5133 (2.5883)	mem 20675MB
[2025-04-03 00:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][162/311]	eta 0:02:11 lr 0.001208	time 0.8761 (0.8839)	loss 0.6435 (0.6329)	grad_norm 3.5269 (2.5985)	mem 20675MB
[2025-04-03 00:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][164/311]	eta 0:02:09 lr 0.001208	time 0.8756 (0.8838)	loss 0.6030 (0.6327)	grad_norm 2.9762 (2.5995)	mem 20675MB
[2025-04-03 00:45:35 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][166/311]	eta 0:02:08 lr 0.001208	time 0.8755 (0.8837)	loss 0.6095 (0.6322)	grad_norm 1.9642 (2.5978)	mem 20675MB
[2025-04-03 00:45:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][168/311]	eta 0:02:06 lr 0.001208	time 0.8755 (0.8837)	loss 0.5776 (0.6320)	grad_norm 2.0538 (2.5878)	mem 20675MB
[2025-04-03 00:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][170/311]	eta 0:02:04 lr 0.001207	time 0.8755 (0.8836)	loss 0.6914 (0.6326)	grad_norm 1.9519 (2.6015)	mem 20675MB
[2025-04-03 00:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][172/311]	eta 0:02:02 lr 0.001207	time 0.8754 (0.8835)	loss 0.6905 (0.6329)	grad_norm 1.7573 (2.6002)	mem 20675MB
[2025-04-03 00:45:42 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][174/311]	eta 0:02:01 lr 0.001207	time 0.8757 (0.8834)	loss 0.6067 (0.6328)	grad_norm 2.3596 (2.5944)	mem 20675MB
[2025-04-03 00:45:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][176/311]	eta 0:01:59 lr 0.001207	time 0.8757 (0.8833)	loss 0.6657 (0.6326)	grad_norm 2.3048 (2.6090)	mem 20675MB
[2025-04-03 00:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][178/311]	eta 0:01:57 lr 0.001207	time 0.8754 (0.8833)	loss 0.6601 (0.6328)	grad_norm 2.2901 (2.6114)	mem 20675MB
[2025-04-03 00:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][180/311]	eta 0:01:55 lr 0.001207	time 0.8755 (0.8832)	loss 0.6301 (0.6329)	grad_norm 2.3438 (2.6141)	mem 20675MB
[2025-04-03 00:45:49 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][182/311]	eta 0:01:53 lr 0.001206	time 0.8753 (0.8831)	loss 0.6190 (0.6329)	grad_norm 1.7369 (2.6075)	mem 20675MB
[2025-04-03 00:45:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][184/311]	eta 0:01:52 lr 0.001206	time 0.8760 (0.8830)	loss 0.6770 (0.6331)	grad_norm 2.0731 (2.6049)	mem 20675MB
[2025-04-03 00:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][186/311]	eta 0:01:50 lr 0.001206	time 0.8756 (0.8830)	loss 0.6048 (0.6332)	grad_norm 1.6624 (2.6105)	mem 20675MB
[2025-04-03 00:45:54 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][188/311]	eta 0:01:48 lr 0.001206	time 0.8757 (0.8829)	loss 0.5991 (0.6329)	grad_norm 1.9249 (2.6066)	mem 20675MB
[2025-04-03 00:45:56 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][190/311]	eta 0:01:46 lr 0.001206	time 0.8753 (0.8828)	loss 0.6313 (0.6329)	grad_norm 2.8328 (2.6024)	mem 20675MB
[2025-04-03 00:45:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][192/311]	eta 0:01:45 lr 0.001206	time 0.8767 (0.8828)	loss 0.6275 (0.6331)	grad_norm 1.3443 (2.6043)	mem 20675MB
[2025-04-03 00:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][194/311]	eta 0:01:43 lr 0.001206	time 0.8754 (0.8827)	loss 0.5665 (0.6329)	grad_norm 2.4013 (2.6017)	mem 20675MB
[2025-04-03 00:46:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][196/311]	eta 0:01:41 lr 0.001205	time 0.8754 (0.8826)	loss 0.6249 (0.6326)	grad_norm 2.4347 (2.6003)	mem 20675MB
[2025-04-03 00:46:03 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][198/311]	eta 0:01:39 lr 0.001205	time 0.8756 (0.8826)	loss 0.5545 (0.6327)	grad_norm 2.6361 (2.6082)	mem 20675MB
[2025-04-03 00:46:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][200/311]	eta 0:01:37 lr 0.001205	time 0.8766 (0.8825)	loss 0.6064 (0.6327)	grad_norm 2.3162 (2.6072)	mem 20675MB
[2025-04-03 00:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][202/311]	eta 0:01:36 lr 0.001205	time 0.8763 (0.8825)	loss 0.6429 (0.6327)	grad_norm 1.8203 (2.6143)	mem 20675MB
[2025-04-03 00:46:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][204/311]	eta 0:01:34 lr 0.001205	time 0.8759 (0.8824)	loss 0.6449 (0.6327)	grad_norm 2.2962 (2.6108)	mem 20675MB
[2025-04-03 00:46:10 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][206/311]	eta 0:01:32 lr 0.001205	time 0.8759 (0.8824)	loss 0.6358 (0.6327)	grad_norm 2.3162 (2.6080)	mem 20675MB
[2025-04-03 00:46:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][208/311]	eta 0:01:30 lr 0.001204	time 0.8758 (0.8823)	loss 0.6540 (0.6325)	grad_norm 1.9098 (2.6056)	mem 20675MB
[2025-04-03 00:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][210/311]	eta 0:01:29 lr 0.001204	time 0.8754 (0.8822)	loss 0.6797 (0.6327)	grad_norm 2.3966 (2.6076)	mem 20675MB
[2025-04-03 00:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][212/311]	eta 0:01:27 lr 0.001204	time 0.8757 (0.8822)	loss 0.5757 (0.6325)	grad_norm 3.4586 (2.6134)	mem 20675MB
[2025-04-03 00:46:17 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][214/311]	eta 0:01:25 lr 0.001204	time 0.8753 (0.8821)	loss 0.6844 (0.6330)	grad_norm 5.2009 (2.6251)	mem 20675MB
[2025-04-03 00:46:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][216/311]	eta 0:01:23 lr 0.001204	time 0.8775 (0.8821)	loss 0.6023 (0.6326)	grad_norm 2.0148 (2.6191)	mem 20675MB
[2025-04-03 00:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][218/311]	eta 0:01:22 lr 0.001204	time 0.8754 (0.8820)	loss 0.6804 (0.6328)	grad_norm 3.4699 (2.6185)	mem 20675MB
[2025-04-03 00:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][220/311]	eta 0:01:20 lr 0.001203	time 0.8767 (0.8820)	loss 0.6079 (0.6328)	grad_norm 2.6227 (2.6201)	mem 20675MB
[2025-04-03 00:46:24 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][222/311]	eta 0:01:18 lr 0.001203	time 0.8766 (0.8819)	loss 0.6701 (0.6330)	grad_norm 1.5778 (2.6101)	mem 20675MB
[2025-04-03 00:46:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][224/311]	eta 0:01:16 lr 0.001203	time 0.8772 (0.8819)	loss 0.6602 (0.6331)	grad_norm 5.2350 (2.6179)	mem 20675MB
[2025-04-03 00:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][226/311]	eta 0:01:14 lr 0.001203	time 0.8764 (0.8819)	loss 0.5650 (0.6330)	grad_norm 1.9414 (2.6100)	mem 20675MB
[2025-04-03 00:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][228/311]	eta 0:01:13 lr 0.001203	time 0.8777 (0.8818)	loss 0.6288 (0.6330)	grad_norm 3.5303 (2.6110)	mem 20675MB
[2025-04-03 00:46:31 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][230/311]	eta 0:01:11 lr 0.001203	time 0.8778 (0.8818)	loss 0.5499 (0.6326)	grad_norm 2.9809 (2.6094)	mem 20675MB
[2025-04-03 00:46:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][232/311]	eta 0:01:09 lr 0.001203	time 0.8761 (0.8818)	loss 0.6698 (0.6327)	grad_norm 2.1450 (2.6125)	mem 20675MB
[2025-04-03 00:46:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][234/311]	eta 0:01:07 lr 0.001202	time 0.8758 (0.8817)	loss 0.6303 (0.6327)	grad_norm 2.6633 (2.6121)	mem 20675MB
[2025-04-03 00:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][236/311]	eta 0:01:06 lr 0.001202	time 0.8755 (0.8817)	loss 0.5889 (0.6324)	grad_norm 1.3705 (2.6047)	mem 20675MB
[2025-04-03 00:46:38 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][238/311]	eta 0:01:04 lr 0.001202	time 0.8756 (0.8817)	loss 0.6190 (0.6323)	grad_norm 3.2354 (2.6089)	mem 20675MB
[2025-04-03 00:46:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][240/311]	eta 0:01:02 lr 0.001202	time 0.8764 (0.8816)	loss 0.6787 (0.6326)	grad_norm 3.5114 (2.6113)	mem 20675MB
[2025-04-03 00:46:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][242/311]	eta 0:01:00 lr 0.001202	time 0.8769 (0.8816)	loss 0.6433 (0.6329)	grad_norm 1.6627 (2.6055)	mem 20675MB
[2025-04-03 00:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][244/311]	eta 0:00:59 lr 0.001202	time 0.8779 (0.8816)	loss 0.6317 (0.6328)	grad_norm 1.7165 (2.5977)	mem 20675MB
[2025-04-03 00:46:45 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][246/311]	eta 0:00:57 lr 0.001201	time 0.8797 (0.8815)	loss 0.6599 (0.6331)	grad_norm 1.4278 (2.5953)	mem 20675MB
[2025-04-03 00:46:46 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][248/311]	eta 0:00:55 lr 0.001201	time 0.8771 (0.8815)	loss 0.6766 (0.6332)	grad_norm 1.5815 (2.5927)	mem 20675MB
[2025-04-03 00:46:48 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][250/311]	eta 0:00:53 lr 0.001201	time 0.8762 (0.8815)	loss 0.6260 (0.6332)	grad_norm 1.7668 (2.5856)	mem 20675MB
[2025-04-03 00:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][252/311]	eta 0:00:52 lr 0.001201	time 0.8778 (0.8816)	loss 0.6174 (0.6332)	grad_norm 1.3999 (2.5800)	mem 20675MB
[2025-04-03 00:46:52 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][254/311]	eta 0:00:50 lr 0.001201	time 0.8762 (0.8816)	loss 0.6468 (0.6333)	grad_norm 3.1737 (2.5805)	mem 20675MB
[2025-04-03 00:46:53 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][256/311]	eta 0:00:48 lr 0.001201	time 0.8765 (0.8815)	loss 0.6550 (0.6334)	grad_norm 1.2890 (2.5758)	mem 20675MB
[2025-04-03 00:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][258/311]	eta 0:00:46 lr 0.001200	time 0.8775 (0.8815)	loss 0.6118 (0.6331)	grad_norm 1.8830 (2.5747)	mem 20675MB
[2025-04-03 00:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][260/311]	eta 0:00:44 lr 0.001200	time 0.8771 (0.8815)	loss 0.6030 (0.6330)	grad_norm 2.8002 (2.5779)	mem 20675MB
[2025-04-03 00:46:59 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][262/311]	eta 0:00:43 lr 0.001200	time 0.8784 (0.8815)	loss 0.6558 (0.6328)	grad_norm 2.6834 (2.5773)	mem 20675MB
[2025-04-03 00:47:01 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][264/311]	eta 0:00:41 lr 0.001200	time 0.8766 (0.8814)	loss 0.6560 (0.6328)	grad_norm 2.8507 (2.5759)	mem 20675MB
[2025-04-03 00:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][266/311]	eta 0:00:39 lr 0.001200	time 0.8767 (0.8814)	loss 0.6496 (0.6330)	grad_norm 2.2814 (2.5723)	mem 20675MB
[2025-04-03 00:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][268/311]	eta 0:00:37 lr 0.001200	time 0.8766 (0.8814)	loss 0.5296 (0.6324)	grad_norm 2.7739 (2.5707)	mem 20675MB
[2025-04-03 00:47:06 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][270/311]	eta 0:00:36 lr 0.001199	time 0.8770 (0.8814)	loss 0.6707 (0.6328)	grad_norm 2.0290 (2.5723)	mem 20675MB
[2025-04-03 00:47:08 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][272/311]	eta 0:00:34 lr 0.001199	time 0.8762 (0.8813)	loss 0.6204 (0.6328)	grad_norm 2.0979 (2.5699)	mem 20675MB
[2025-04-03 00:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][274/311]	eta 0:00:32 lr 0.001199	time 0.8756 (0.8813)	loss 0.6186 (0.6327)	grad_norm 1.6483 (2.5672)	mem 20675MB
[2025-04-03 00:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][276/311]	eta 0:00:30 lr 0.001199	time 0.8765 (0.8813)	loss 0.6063 (0.6325)	grad_norm 1.8017 (2.5691)	mem 20675MB
[2025-04-03 00:47:13 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][278/311]	eta 0:00:29 lr 0.001199	time 0.8773 (0.8812)	loss 0.6316 (0.6324)	grad_norm 1.6616 (2.5695)	mem 20675MB
[2025-04-03 00:47:15 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][280/311]	eta 0:00:27 lr 0.001199	time 0.8771 (0.8812)	loss 0.6275 (0.6324)	grad_norm 2.2472 (2.5659)	mem 20675MB
[2025-04-03 00:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][282/311]	eta 0:00:25 lr 0.001198	time 0.8765 (0.8812)	loss 0.5544 (0.6322)	grad_norm 1.8335 (2.5639)	mem 20675MB
[2025-04-03 00:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][284/311]	eta 0:00:23 lr 0.001198	time 0.8764 (0.8812)	loss 0.6169 (0.6323)	grad_norm 4.3114 (2.5760)	mem 20675MB
[2025-04-03 00:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][286/311]	eta 0:00:22 lr 0.001198	time 0.8762 (0.8811)	loss 0.6359 (0.6320)	grad_norm 3.9795 (2.5798)	mem 20675MB
[2025-04-03 00:47:22 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][288/311]	eta 0:00:20 lr 0.001198	time 0.8756 (0.8811)	loss 0.6372 (0.6319)	grad_norm 1.9938 (2.5799)	mem 20675MB
[2025-04-03 00:47:23 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][290/311]	eta 0:00:18 lr 0.001198	time 0.8756 (0.8811)	loss 0.6524 (0.6320)	grad_norm 1.5981 (2.5741)	mem 20675MB
[2025-04-03 00:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][292/311]	eta 0:00:16 lr 0.001198	time 0.8757 (0.8810)	loss 0.5683 (0.6318)	grad_norm 3.8895 (2.5767)	mem 20675MB
[2025-04-03 00:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][294/311]	eta 0:00:14 lr 0.001197	time 0.8760 (0.8810)	loss 0.5220 (0.6312)	grad_norm 3.0688 (2.5788)	mem 20675MB
[2025-04-03 00:47:29 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][296/311]	eta 0:00:13 lr 0.001197	time 0.8755 (0.8810)	loss 0.6466 (0.6312)	grad_norm 2.3614 (2.5779)	mem 20675MB
[2025-04-03 00:47:30 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][298/311]	eta 0:00:11 lr 0.001197	time 0.8767 (0.8810)	loss 0.5353 (0.6311)	grad_norm 5.4878 (2.5890)	mem 20675MB
[2025-04-03 00:47:32 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][300/311]	eta 0:00:09 lr 0.001197	time 0.8757 (0.8809)	loss 0.5834 (0.6311)	grad_norm 3.9297 (2.5944)	mem 20675MB
[2025-04-03 00:47:34 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][302/311]	eta 0:00:07 lr 0.001197	time 0.8758 (0.8809)	loss 0.5148 (0.6307)	grad_norm 3.2616 (2.5974)	mem 20675MB
[2025-04-03 00:47:36 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][304/311]	eta 0:00:06 lr 0.001197	time 0.8764 (0.8809)	loss 0.6282 (0.6307)	grad_norm 1.9664 (2.5979)	mem 20675MB
[2025-04-03 00:47:37 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][306/311]	eta 0:00:04 lr 0.001196	time 0.8755 (0.8809)	loss 0.6193 (0.6306)	grad_norm 1.7939 (2.5943)	mem 20675MB
[2025-04-03 00:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][308/311]	eta 0:00:02 lr 0.001196	time 0.8756 (0.8808)	loss 0.6871 (0.6310)	grad_norm 3.1066 (2.5966)	mem 20675MB
[2025-04-03 00:47:41 simmim_finetune] (main_finetune.py 252): INFO Train: [3/30][310/311]	eta 0:00:00 lr 0.001196	time 0.8754 (0.8808)	loss 0.6169 (0.6311)	grad_norm 1.6772 (2.5931)	mem 20675MB
[2025-04-03 00:47:41 simmim_finetune] (main_finetune.py 260): INFO EPOCH 3 training takes 0:04:34
[2025-04-03 00:47:42 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.390 (1.390)	Loss 0.6183 (0.6183)	Acc@1 70.312 (70.312)	Mem 20675MB
[2025-04-03 00:47:42 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 68.310
[2025-04-03 00:47:42 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 68.3%
[2025-04-03 00:47:42 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 68.31%
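The Acc@1 lines above are batch-size-weighted running averages: the first test batch of 128 images scores 70.312, and the overall figure over all 142 images is 68.310. A minimal sketch of the usual meter pattern (hypothetical class name; the actual implementation in main_finetune.py is not shown here) — note that the logged numbers are consistent with the remaining 14-image batch scoring 50%:

```python
class AverageMeter:
    """Track a batch-size-weighted running average, as in the Acc@1 (avg) column."""

    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        # val is a per-batch average; weight it by the batch size n
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / self.count


meter = AverageMeter()
meter.update(70.3125, 128)  # first batch, as logged
meter.update(50.0, 14)      # inferred score for the remaining 14 images
# meter.avg comes out to ~68.310, matching the logged " * Acc@1 68.310"
```

This reproduces the logged aggregate exactly (9700 correct-weighted points over 142 images), which is why the headline accuracy differs from the first batch's Acc@1.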
[2025-04-03 00:47:43 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.432838268698631e-06, 4.432838268698631e-06, 6.813941341593908e-06, 6.813941341593908e-06, 1.0477176838355874e-05, 1.0477176838355874e-05, 1.6112923756451206e-05, 1.6112923756451206e-05, 2.4783303630444022e-05, 2.4783303630444022e-05, 3.8122349590432976e-05, 3.8122349590432976e-05, 5.8643958759646735e-05, 5.8643958759646735e-05, 9.021566517382175e-05, 9.021566517382175e-05, 0.00013878752119562949, 0.00013878752119562949, 0.00021351345353687215, 0.00021351345353687215, 0.0003284764263695531, 0.0003284764263695531, 0.0005053425384198316, 0.0005053425384198316, 0.0007774442492664139, 0.0007774442492664139, 0.0011960622659534634, 0.0011960622659534634]
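The parameter-group learning rates printed above follow layer-wise lr decay with the configured LAYER_DECAY of 0.65: consecutive distinct values differ by a factor of 0.65, from the deepest group (the cosine-decayed lr, ~0.001196 at this point in training) down to the patch embedding. A minimal sketch of that scaling, assuming the common group layout for a depth-12 ViT (embedding as the shallowest group, one group per block, head at full lr) — the exact group assignment in main_finetune.py may differ:

```python
def layer_decay_scales(num_layers=12, decay=0.65):
    """Per-group lr multipliers: decay**(distance from the deepest group).

    num_layers + 2 groups: patch embedding, each transformer block, and the head.
    """
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]


def group_lrs(current_lr, num_layers=12, decay=0.65):
    """Scale the scheduler's current lr by each group's decay multiplier."""
    return [current_lr * s for s in layer_decay_scales(num_layers, decay)]


# With the cosine-decayed lr logged at this step (~0.0011960622...), the
# shallowest group lands near 4.4e-06 and the deepest at the full lr,
# approximately matching the printed list (each lr appears twice in the log,
# presumably for the decayed and no-weight-decay halves of each group).
lrs = group_lrs(0.0011960622659534634)
```

Scaling by 0.65 per layer means the pretrained early layers move roughly 270x more slowly than the head, which is the standard recipe for fine-tuning masked-image-modeling backbones.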
[2025-04-03 00:47:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][0/311]	eta 0:10:41 lr 0.001196	time 2.0643 (2.0643)	loss 0.5951 (0.5951)	grad_norm 1.8945 (1.8945)	mem 20675MB
[2025-04-03 00:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][2/311]	eta 0:06:33 lr 0.001196	time 0.8768 (1.2736)	loss 0.6106 (0.6143)	grad_norm 1.3802 (1.6404)	mem 20675MB
[2025-04-03 00:47:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][4/311]	eta 0:05:42 lr 0.001196	time 0.8755 (1.1148)	loss 0.5927 (0.6140)	grad_norm 3.8954 (2.3044)	mem 20675MB
[2025-04-03 00:47:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][6/311]	eta 0:05:19 lr 0.001195	time 0.8764 (1.0468)	loss 0.5937 (0.6065)	grad_norm 1.5631 (2.1869)	mem 20675MB
[2025-04-03 00:47:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][8/311]	eta 0:05:05 lr 0.001195	time 0.8761 (1.0090)	loss 0.6510 (0.6068)	grad_norm 3.5672 (2.3708)	mem 20675MB
[2025-04-03 00:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][10/311]	eta 0:04:56 lr 0.001195	time 0.8758 (0.9849)	loss 0.5695 (0.6068)	grad_norm 3.5474 (2.5496)	mem 20675MB
[2025-04-03 00:47:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][12/311]	eta 0:04:49 lr 0.001195	time 0.8755 (0.9682)	loss 0.5802 (0.6105)	grad_norm 2.2724 (2.5865)	mem 20675MB
[2025-04-03 00:47:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][14/311]	eta 0:04:43 lr 0.001195	time 0.8755 (0.9559)	loss 0.6556 (0.6131)	grad_norm 2.0366 (2.4758)	mem 20675MB
[2025-04-03 00:47:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][16/311]	eta 0:04:39 lr 0.001195	time 0.8753 (0.9465)	loss 0.5800 (0.6114)	grad_norm 2.9267 (2.4940)	mem 20675MB
[2025-04-03 00:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][18/311]	eta 0:04:35 lr 0.001194	time 0.8756 (0.9392)	loss 0.6254 (0.6115)	grad_norm 3.1824 (2.5159)	mem 20675MB
[2025-04-03 00:48:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][20/311]	eta 0:04:31 lr 0.001194	time 0.8763 (0.9332)	loss 0.6681 (0.6157)	grad_norm 1.9364 (2.4446)	mem 20675MB
[2025-04-03 00:48:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][22/311]	eta 0:04:28 lr 0.001194	time 0.8759 (0.9284)	loss 0.6146 (0.6172)	grad_norm 1.5452 (2.3886)	mem 20675MB
[2025-04-03 00:48:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][24/311]	eta 0:04:25 lr 0.001194	time 0.8755 (0.9242)	loss 0.6504 (0.6171)	grad_norm 1.8833 (2.3511)	mem 20675MB
[2025-04-03 00:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][26/311]	eta 0:04:22 lr 0.001194	time 0.8758 (0.9207)	loss 0.6136 (0.6176)	grad_norm 3.0608 (2.3520)	mem 20675MB
[2025-04-03 00:48:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][28/311]	eta 0:04:19 lr 0.001194	time 0.8755 (0.9176)	loss 0.6210 (0.6180)	grad_norm 1.3493 (2.2979)	mem 20675MB
[2025-04-03 00:48:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][30/311]	eta 0:04:17 lr 0.001193	time 0.8758 (0.9150)	loss 0.5957 (0.6189)	grad_norm 1.4263 (2.2616)	mem 20675MB
[2025-04-03 00:48:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][32/311]	eta 0:04:14 lr 0.001193	time 0.8757 (0.9126)	loss 0.5917 (0.6192)	grad_norm 1.3095 (2.2101)	mem 20675MB
[2025-04-03 00:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][34/311]	eta 0:04:12 lr 0.001193	time 0.8761 (0.9106)	loss 0.6075 (0.6201)	grad_norm 3.2622 (2.2645)	mem 20675MB
[2025-04-03 00:48:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][36/311]	eta 0:04:09 lr 0.001193	time 0.8761 (0.9088)	loss 0.6309 (0.6204)	grad_norm 2.8296 (2.2623)	mem 20675MB
[2025-04-03 00:48:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][38/311]	eta 0:04:07 lr 0.001193	time 0.8754 (0.9071)	loss 0.6348 (0.6221)	grad_norm 2.8558 (2.3284)	mem 20675MB
[2025-04-03 00:48:20 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][40/311]	eta 0:04:05 lr 0.001193	time 0.8755 (0.9056)	loss 0.6100 (0.6216)	grad_norm 1.4899 (2.3023)	mem 20675MB
[2025-04-03 00:48:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][42/311]	eta 0:04:03 lr 0.001192	time 0.8756 (0.9042)	loss 0.6355 (0.6231)	grad_norm 3.2591 (2.3157)	mem 20675MB
[2025-04-03 00:48:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][44/311]	eta 0:04:01 lr 0.001192	time 0.8754 (0.9030)	loss 0.6152 (0.6234)	grad_norm 3.7556 (2.3618)	mem 20675MB
[2025-04-03 00:48:25 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][46/311]	eta 0:03:58 lr 0.001192	time 0.8758 (0.9019)	loss 0.6696 (0.6251)	grad_norm 2.8012 (2.3751)	mem 20675MB
[2025-04-03 00:48:27 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][48/311]	eta 0:03:56 lr 0.001192	time 0.8756 (0.9008)	loss 0.6693 (0.6252)	grad_norm 3.3438 (2.4207)	mem 20675MB
[2025-04-03 00:48:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][50/311]	eta 0:03:54 lr 0.001192	time 0.8757 (0.8999)	loss 0.6163 (0.6257)	grad_norm 1.7920 (2.4331)	mem 20675MB
[2025-04-03 00:48:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][52/311]	eta 0:03:52 lr 0.001191	time 0.8760 (0.8990)	loss 0.6034 (0.6256)	grad_norm 2.1459 (2.4121)	mem 20675MB
[2025-04-03 00:48:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][54/311]	eta 0:03:50 lr 0.001191	time 0.8756 (0.8982)	loss 0.6462 (0.6256)	grad_norm 1.9806 (2.4025)	mem 20675MB
[2025-04-03 00:48:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][56/311]	eta 0:03:48 lr 0.001191	time 0.8758 (0.8974)	loss 0.6322 (0.6257)	grad_norm 2.0818 (2.3874)	mem 20675MB
[2025-04-03 00:48:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][58/311]	eta 0:03:46 lr 0.001191	time 0.8755 (0.8967)	loss 0.6380 (0.6249)	grad_norm 1.7892 (2.3615)	mem 20675MB
[2025-04-03 00:48:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][60/311]	eta 0:03:44 lr 0.001191	time 0.8755 (0.8960)	loss 0.5808 (0.6246)	grad_norm 2.5890 (2.3534)	mem 20675MB
[2025-04-03 00:48:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][62/311]	eta 0:03:42 lr 0.001191	time 0.8757 (0.8954)	loss 0.6329 (0.6241)	grad_norm 2.2085 (2.3510)	mem 20675MB
[2025-04-03 00:48:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][64/311]	eta 0:03:41 lr 0.001190	time 0.8756 (0.8948)	loss 0.6224 (0.6243)	grad_norm 2.7952 (2.3656)	mem 20675MB
[2025-04-03 00:48:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][66/311]	eta 0:03:39 lr 0.001190	time 0.8757 (0.8943)	loss 0.5277 (0.6238)	grad_norm 3.5197 (2.3937)	mem 20675MB
[2025-04-03 00:48:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][68/311]	eta 0:03:37 lr 0.001190	time 0.8755 (0.8938)	loss 0.5497 (0.6228)	grad_norm 1.8950 (2.3817)	mem 20675MB
[2025-04-03 00:48:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][70/311]	eta 0:03:35 lr 0.001190	time 0.8758 (0.8933)	loss 0.5573 (0.6220)	grad_norm 2.3352 (2.3794)	mem 20675MB
[2025-04-03 00:48:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][72/311]	eta 0:03:33 lr 0.001190	time 0.8758 (0.8928)	loss 0.6124 (0.6222)	grad_norm 2.1047 (2.3698)	mem 20675MB
[2025-04-03 00:48:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][74/311]	eta 0:03:31 lr 0.001189	time 0.8755 (0.8924)	loss 0.5698 (0.6210)	grad_norm 2.4487 (2.3732)	mem 20675MB
[2025-04-03 00:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][76/311]	eta 0:03:29 lr 0.001189	time 0.8753 (0.8920)	loss 0.6420 (0.6217)	grad_norm 3.5464 (2.3915)	mem 20675MB
[2025-04-03 00:48:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][78/311]	eta 0:03:27 lr 0.001189	time 0.8755 (0.8916)	loss 0.6556 (0.6216)	grad_norm 2.8765 (2.4062)	mem 20675MB
[2025-04-03 00:48:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][80/311]	eta 0:03:25 lr 0.001189	time 0.8758 (0.8912)	loss 0.6473 (0.6216)	grad_norm 1.7966 (2.4123)	mem 20675MB
[2025-04-03 00:48:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][82/311]	eta 0:03:24 lr 0.001189	time 0.8761 (0.8909)	loss 0.5973 (0.6221)	grad_norm 2.1993 (2.4149)	mem 20675MB
[2025-04-03 00:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][84/311]	eta 0:03:22 lr 0.001189	time 0.8755 (0.8905)	loss 0.6215 (0.6225)	grad_norm 1.6129 (2.3980)	mem 20675MB
[2025-04-03 00:49:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][86/311]	eta 0:03:20 lr 0.001188	time 0.8757 (0.8902)	loss 0.6424 (0.6230)	grad_norm 3.0586 (2.3942)	mem 20675MB
[2025-04-03 00:49:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][88/311]	eta 0:03:18 lr 0.001188	time 0.8755 (0.8899)	loss 0.6278 (0.6226)	grad_norm 1.7031 (2.3825)	mem 20675MB
[2025-04-03 00:49:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][90/311]	eta 0:03:16 lr 0.001188	time 0.8757 (0.8896)	loss 0.5551 (0.6223)	grad_norm 3.7216 (2.3993)	mem 20675MB
[2025-04-03 00:49:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][92/311]	eta 0:03:14 lr 0.001188	time 0.8754 (0.8893)	loss 0.6350 (0.6218)	grad_norm 1.8267 (2.3935)	mem 20675MB
[2025-04-03 00:49:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][94/311]	eta 0:03:12 lr 0.001188	time 0.8757 (0.8890)	loss 0.6370 (0.6224)	grad_norm 2.2004 (2.4033)	mem 20675MB
[2025-04-03 00:49:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][96/311]	eta 0:03:11 lr 0.001187	time 0.8777 (0.8888)	loss 0.7150 (0.6240)	grad_norm 3.6362 (2.4245)	mem 20675MB
[2025-04-03 00:49:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][98/311]	eta 0:03:09 lr 0.001187	time 0.8755 (0.8886)	loss 0.5915 (0.6229)	grad_norm 1.6699 (2.4180)	mem 20675MB
[2025-04-03 00:49:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][100/311]	eta 0:03:07 lr 0.001187	time 0.8755 (0.8883)	loss 0.6273 (0.6223)	grad_norm 1.4746 (2.4063)	mem 20675MB
[2025-04-03 00:49:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][102/311]	eta 0:03:05 lr 0.001187	time 0.8755 (0.8881)	loss 0.5804 (0.6220)	grad_norm 3.2599 (2.4091)	mem 20675MB
[2025-04-03 00:49:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][104/311]	eta 0:03:03 lr 0.001187	time 0.8757 (0.8879)	loss 0.6559 (0.6223)	grad_norm 2.0301 (2.3977)	mem 20675MB
[2025-04-03 00:49:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][106/311]	eta 0:03:01 lr 0.001187	time 0.8754 (0.8876)	loss 0.5971 (0.6228)	grad_norm 3.3421 (2.4338)	mem 20675MB
[2025-04-03 00:49:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][108/311]	eta 0:03:00 lr 0.001186	time 0.8764 (0.8874)	loss 0.6420 (0.6231)	grad_norm 3.1616 (2.4501)	mem 20675MB
[2025-04-03 00:49:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][110/311]	eta 0:02:58 lr 0.001186	time 0.8761 (0.8872)	loss 0.6131 (0.6230)	grad_norm 3.1648 (2.4677)	mem 20675MB
[2025-04-03 00:49:23 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][112/311]	eta 0:02:56 lr 0.001186	time 0.8757 (0.8871)	loss 0.6540 (0.6232)	grad_norm 1.7922 (2.4526)	mem 20675MB
[2025-04-03 00:49:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][114/311]	eta 0:02:54 lr 0.001186	time 0.8760 (0.8869)	loss 0.6038 (0.6235)	grad_norm 3.2258 (2.4627)	mem 20675MB
[2025-04-03 00:49:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][116/311]	eta 0:02:52 lr 0.001186	time 0.8763 (0.8867)	loss 0.6274 (0.6240)	grad_norm 1.6008 (2.4618)	mem 20675MB
[2025-04-03 00:49:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][118/311]	eta 0:02:51 lr 0.001185	time 0.8756 (0.8865)	loss 0.6287 (0.6240)	grad_norm 1.3538 (2.4460)	mem 20675MB
[2025-04-03 00:49:30 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][120/311]	eta 0:02:49 lr 0.001185	time 0.8757 (0.8864)	loss 0.6357 (0.6241)	grad_norm 3.2281 (2.4441)	mem 20675MB
[2025-04-03 00:49:32 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][122/311]	eta 0:02:47 lr 0.001185	time 0.8755 (0.8862)	loss 0.6704 (0.6247)	grad_norm 1.7464 (2.4337)	mem 20675MB
[2025-04-03 00:49:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][124/311]	eta 0:02:45 lr 0.001185	time 0.8753 (0.8860)	loss 0.5838 (0.6242)	grad_norm 2.2448 (2.4300)	mem 20675MB
[2025-04-03 00:49:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][126/311]	eta 0:02:43 lr 0.001185	time 0.8755 (0.8859)	loss 0.6561 (0.6243)	grad_norm 3.3842 (2.4365)	mem 20675MB
[2025-04-03 00:49:37 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][128/311]	eta 0:02:42 lr 0.001184	time 0.8761 (0.8857)	loss 0.6250 (0.6246)	grad_norm 2.0348 (2.4302)	mem 20675MB
[2025-04-03 00:49:39 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][130/311]	eta 0:02:40 lr 0.001184	time 0.8762 (0.8856)	loss 0.6454 (0.6248)	grad_norm 2.0282 (2.4284)	mem 20675MB
[2025-04-03 00:49:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][132/311]	eta 0:02:38 lr 0.001184	time 0.8769 (0.8855)	loss 0.5563 (0.6240)	grad_norm 3.3356 (2.4372)	mem 20675MB
[2025-04-03 00:49:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][134/311]	eta 0:02:36 lr 0.001184	time 0.8755 (0.8853)	loss 0.6686 (0.6239)	grad_norm 2.1991 (2.4337)	mem 20675MB
[2025-04-03 00:49:44 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][136/311]	eta 0:02:34 lr 0.001184	time 0.8777 (0.8852)	loss 0.6232 (0.6238)	grad_norm 2.8472 (2.4393)	mem 20675MB
[2025-04-03 00:49:46 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][138/311]	eta 0:02:33 lr 0.001184	time 0.8780 (0.8851)	loss 0.7215 (0.6245)	grad_norm 3.0621 (2.4561)	mem 20675MB
[2025-04-03 00:49:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][140/311]	eta 0:02:31 lr 0.001183	time 0.8764 (0.8850)	loss 0.6101 (0.6240)	grad_norm 1.6394 (2.4471)	mem 20675MB
[2025-04-03 00:49:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][142/311]	eta 0:02:29 lr 0.001183	time 0.8768 (0.8849)	loss 0.6640 (0.6243)	grad_norm 3.9514 (2.4597)	mem 20675MB
[2025-04-03 00:49:51 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][144/311]	eta 0:02:27 lr 0.001183	time 0.8758 (0.8848)	loss 0.6592 (0.6240)	grad_norm 1.6237 (2.4606)	mem 20675MB
[2025-04-03 00:49:53 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][146/311]	eta 0:02:25 lr 0.001183	time 0.8759 (0.8847)	loss 0.6874 (0.6244)	grad_norm 4.1040 (2.4643)	mem 20675MB
[2025-04-03 00:49:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][148/311]	eta 0:02:24 lr 0.001183	time 0.8759 (0.8846)	loss 0.5731 (0.6237)	grad_norm 1.6812 (2.4638)	mem 20675MB
[2025-04-03 00:49:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][150/311]	eta 0:02:22 lr 0.001182	time 0.8757 (0.8845)	loss 0.6289 (0.6238)	grad_norm 1.7357 (2.4517)	mem 20675MB
[2025-04-03 00:49:58 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][152/311]	eta 0:02:20 lr 0.001182	time 0.8755 (0.8844)	loss 0.5734 (0.6230)	grad_norm 1.5272 (2.4509)	mem 20675MB
[2025-04-03 00:50:00 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][154/311]	eta 0:02:18 lr 0.001182	time 0.8757 (0.8843)	loss 0.6344 (0.6235)	grad_norm 4.0585 (2.4604)	mem 20675MB
[2025-04-03 00:50:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][156/311]	eta 0:02:17 lr 0.001182	time 0.8754 (0.8842)	loss 0.6926 (0.6244)	grad_norm 2.1221 (2.4638)	mem 20675MB
[2025-04-03 00:50:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][158/311]	eta 0:02:15 lr 0.001182	time 0.8759 (0.8841)	loss 0.6678 (0.6249)	grad_norm 2.4562 (2.4627)	mem 20675MB
[2025-04-03 00:50:05 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][160/311]	eta 0:02:13 lr 0.001181	time 0.8764 (0.8840)	loss 0.6733 (0.6252)	grad_norm 1.1336 (2.4507)	mem 20675MB
[2025-04-03 00:50:07 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][162/311]	eta 0:02:11 lr 0.001181	time 0.8757 (0.8839)	loss 0.6350 (0.6253)	grad_norm 0.8781 (2.4340)	mem 20675MB
[2025-04-03 00:50:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][164/311]	eta 0:02:09 lr 0.001181	time 0.8757 (0.8838)	loss 0.6377 (0.6253)	grad_norm 1.0963 (2.4204)	mem 20675MB
[2025-04-03 00:50:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][166/311]	eta 0:02:08 lr 0.001181	time 0.8757 (0.8837)	loss 0.6588 (0.6256)	grad_norm 1.3628 (2.4040)	mem 20675MB
[2025-04-03 00:50:12 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][168/311]	eta 0:02:06 lr 0.001181	time 0.8755 (0.8837)	loss 0.6686 (0.6259)	grad_norm 2.3176 (2.4049)	mem 20675MB
[2025-04-03 00:50:14 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][170/311]	eta 0:02:04 lr 0.001181	time 0.8758 (0.8836)	loss 0.6065 (0.6260)	grad_norm 2.5403 (2.3993)	mem 20675MB
[2025-04-03 00:50:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][172/311]	eta 0:02:02 lr 0.001180	time 0.8757 (0.8835)	loss 0.5621 (0.6254)	grad_norm 2.5474 (2.3977)	mem 20675MB
[2025-04-03 00:50:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][174/311]	eta 0:02:01 lr 0.001180	time 0.8759 (0.8834)	loss 0.6574 (0.6255)	grad_norm 2.6725 (2.3951)	mem 20675MB
[2025-04-03 00:50:19 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][176/311]	eta 0:01:59 lr 0.001180	time 0.8754 (0.8833)	loss 0.6345 (0.6256)	grad_norm 2.5889 (2.3952)	mem 20675MB
[2025-04-03 00:50:21 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][178/311]	eta 0:01:57 lr 0.001180	time 0.8758 (0.8833)	loss 0.6951 (0.6261)	grad_norm 3.1630 (2.3980)	mem 20675MB
[2025-04-03 00:50:22 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][180/311]	eta 0:01:55 lr 0.001180	time 0.8764 (0.8832)	loss 0.6259 (0.6259)	grad_norm 2.4041 (2.4022)	mem 20675MB
[2025-04-03 00:50:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][182/311]	eta 0:01:53 lr 0.001179	time 0.8757 (0.8831)	loss 0.5905 (0.6255)	grad_norm 2.5997 (2.4013)	mem 20675MB
[2025-04-03 00:50:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][184/311]	eta 0:01:52 lr 0.001179	time 0.8762 (0.8830)	loss 0.6461 (0.6252)	grad_norm 1.6604 (2.3989)	mem 20675MB
[2025-04-03 00:50:28 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][186/311]	eta 0:01:50 lr 0.001179	time 0.8757 (0.8830)	loss 0.6643 (0.6253)	grad_norm 2.2012 (2.3987)	mem 20675MB
[2025-04-03 00:50:29 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][188/311]	eta 0:01:48 lr 0.001179	time 0.8754 (0.8829)	loss 0.6453 (0.6257)	grad_norm 2.3223 (2.3967)	mem 20675MB
[2025-04-03 00:50:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][190/311]	eta 0:01:46 lr 0.001179	time 0.8756 (0.8828)	loss 0.5915 (0.6250)	grad_norm 1.8334 (2.3947)	mem 20675MB
[2025-04-03 00:50:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][192/311]	eta 0:01:45 lr 0.001178	time 0.8757 (0.8828)	loss 0.5736 (0.6247)	grad_norm 2.0040 (2.3887)	mem 20675MB
[2025-04-03 00:50:35 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][194/311]	eta 0:01:43 lr 0.001178	time 0.8759 (0.8827)	loss 0.5403 (0.6241)	grad_norm 2.5497 (2.3911)	mem 20675MB
[2025-04-03 00:50:36 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][196/311]	eta 0:01:41 lr 0.001178	time 0.8778 (0.8827)	loss 0.5731 (0.6239)	grad_norm 3.2417 (2.3949)	mem 20675MB
[2025-04-03 00:50:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][198/311]	eta 0:01:39 lr 0.001178	time 0.8778 (0.8826)	loss 0.6671 (0.6244)	grad_norm 2.7526 (2.3997)	mem 20675MB
[2025-04-03 00:50:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][200/311]	eta 0:01:37 lr 0.001178	time 0.8772 (0.8826)	loss 0.5733 (0.6245)	grad_norm 3.6131 (2.4075)	mem 20675MB
[2025-04-03 00:50:42 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][202/311]	eta 0:01:36 lr 0.001177	time 0.8762 (0.8825)	loss 0.6609 (0.6247)	grad_norm 2.9177 (2.4063)	mem 20675MB
[2025-04-03 00:50:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][204/311]	eta 0:01:34 lr 0.001177	time 0.8757 (0.8825)	loss 0.5988 (0.6245)	grad_norm 1.7270 (2.4100)	mem 20675MB
[2025-04-03 00:50:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][206/311]	eta 0:01:32 lr 0.001177	time 0.8758 (0.8824)	loss 0.6293 (0.6244)	grad_norm 2.8589 (2.4072)	mem 20675MB
[2025-04-03 00:50:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][208/311]	eta 0:01:30 lr 0.001177	time 0.8757 (0.8823)	loss 0.6279 (0.6247)	grad_norm 3.9926 (2.4185)	mem 20675MB
[2025-04-03 00:50:49 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][210/311]	eta 0:01:29 lr 0.001177	time 0.8757 (0.8823)	loss 0.6546 (0.6250)	grad_norm 2.2266 (2.4171)	mem 20675MB
[2025-04-03 00:50:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][212/311]	eta 0:01:27 lr 0.001176	time 0.8755 (0.8822)	loss 0.6508 (0.6252)	grad_norm 2.3156 (2.4158)	mem 20675MB
[2025-04-03 00:50:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][214/311]	eta 0:01:25 lr 0.001176	time 0.8757 (0.8822)	loss 0.6037 (0.6254)	grad_norm 2.9700 (2.4260)	mem 20675MB
[2025-04-03 00:50:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][216/311]	eta 0:01:23 lr 0.001176	time 0.8755 (0.8821)	loss 0.6052 (0.6254)	grad_norm 1.3093 (2.4170)	mem 20675MB
[2025-04-03 00:50:56 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][218/311]	eta 0:01:22 lr 0.001176	time 0.8758 (0.8821)	loss 0.6057 (0.6255)	grad_norm 2.7019 (2.4208)	mem 20675MB
[2025-04-03 00:50:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][220/311]	eta 0:01:20 lr 0.001176	time 0.8756 (0.8820)	loss 0.5766 (0.6253)	grad_norm 1.5340 (2.4226)	mem 20675MB
[2025-04-03 00:50:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][222/311]	eta 0:01:18 lr 0.001175	time 0.8754 (0.8820)	loss 0.5902 (0.6247)	grad_norm 2.0489 (2.4226)	mem 20675MB
[2025-04-03 00:51:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][224/311]	eta 0:01:16 lr 0.001175	time 0.8756 (0.8819)	loss 0.6584 (0.6250)	grad_norm 1.8419 (2.4231)	mem 20675MB
[2025-04-03 00:51:03 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][226/311]	eta 0:01:14 lr 0.001175	time 0.8757 (0.8819)	loss 0.5457 (0.6248)	grad_norm 3.0850 (2.4242)	mem 20675MB
[2025-04-03 00:51:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][228/311]	eta 0:01:13 lr 0.001175	time 0.8771 (0.8819)	loss 0.6319 (0.6251)	grad_norm 2.5909 (2.4288)	mem 20675MB
[2025-04-03 00:51:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][230/311]	eta 0:01:11 lr 0.001175	time 0.8755 (0.8818)	loss 0.5192 (0.6246)	grad_norm 2.4200 (2.4242)	mem 20675MB
[2025-04-03 00:51:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][232/311]	eta 0:01:09 lr 0.001174	time 0.8760 (0.8818)	loss 0.5638 (0.6243)	grad_norm 2.9585 (2.4241)	mem 20675MB
[2025-04-03 00:51:10 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][234/311]	eta 0:01:07 lr 0.001174	time 0.8774 (0.8817)	loss 0.5686 (0.6241)	grad_norm 2.3615 (2.4211)	mem 20675MB
[2025-04-03 00:51:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][236/311]	eta 0:01:06 lr 0.001174	time 0.8773 (0.8817)	loss 0.6614 (0.6241)	grad_norm 1.9905 (2.4181)	mem 20675MB
[2025-04-03 00:51:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][238/311]	eta 0:01:04 lr 0.001174	time 0.8763 (0.8817)	loss 0.4812 (0.6231)	grad_norm 3.1532 (2.4213)	mem 20675MB
[2025-04-03 00:51:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][240/311]	eta 0:01:02 lr 0.001174	time 0.8760 (0.8816)	loss 0.6455 (0.6231)	grad_norm 2.3117 (2.4239)	mem 20675MB
[2025-04-03 00:51:17 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][242/311]	eta 0:01:00 lr 0.001173	time 0.8759 (0.8816)	loss 0.5669 (0.6228)	grad_norm 4.5052 (2.4310)	mem 20675MB
[2025-04-03 00:51:18 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][244/311]	eta 0:00:59 lr 0.001173	time 0.8758 (0.8816)	loss 0.6168 (0.6226)	grad_norm 2.6278 (2.4341)	mem 20675MB
[2025-04-03 00:51:20 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][246/311]	eta 0:00:57 lr 0.001173	time 0.8757 (0.8815)	loss 0.5954 (0.6220)	grad_norm 3.1862 (2.4394)	mem 20675MB
[2025-04-03 00:51:22 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][248/311]	eta 0:00:55 lr 0.001173	time 0.8764 (0.8815)	loss 0.5857 (0.6220)	grad_norm 2.7787 (2.4418)	mem 20675MB
[2025-04-03 00:51:24 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][250/311]	eta 0:00:53 lr 0.001173	time 0.8774 (0.8815)	loss 0.5847 (0.6218)	grad_norm 2.4657 (2.4439)	mem 20675MB
[2025-04-03 00:51:26 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][252/311]	eta 0:00:52 lr 0.001172	time 0.8760 (0.8814)	loss 0.5929 (0.6218)	grad_norm 2.0147 (2.4428)	mem 20675MB
[2025-04-03 00:51:27 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][254/311]	eta 0:00:50 lr 0.001172	time 0.8757 (0.8814)	loss 0.5902 (0.6218)	grad_norm 1.8464 (2.4440)	mem 20675MB
[2025-04-03 00:51:29 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][256/311]	eta 0:00:48 lr 0.001172	time 0.8768 (0.8814)	loss 0.6008 (0.6217)	grad_norm 1.7933 (2.4389)	mem 20675MB
[2025-04-03 00:51:31 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][258/311]	eta 0:00:46 lr 0.001172	time 0.8755 (0.8813)	loss 0.6168 (0.6217)	grad_norm 1.4723 (2.4360)	mem 20675MB
[2025-04-03 00:51:33 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][260/311]	eta 0:00:44 lr 0.001172	time 0.8783 (0.8813)	loss 0.6523 (0.6219)	grad_norm 1.4932 (2.4280)	mem 20675MB
[2025-04-03 00:51:34 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][262/311]	eta 0:00:43 lr 0.001171	time 0.8765 (0.8813)	loss 0.6564 (0.6219)	grad_norm 1.7112 (2.4221)	mem 20675MB
[2025-04-03 00:51:36 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][264/311]	eta 0:00:41 lr 0.001171	time 0.8768 (0.8812)	loss 0.6043 (0.6220)	grad_norm 1.5691 (2.4165)	mem 20675MB
[2025-04-03 00:51:38 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][266/311]	eta 0:00:39 lr 0.001171	time 0.8782 (0.8812)	loss 0.6399 (0.6219)	grad_norm 1.8910 (2.4149)	mem 20675MB
[2025-04-03 00:51:40 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][268/311]	eta 0:00:37 lr 0.001171	time 0.8777 (0.8812)	loss 0.6162 (0.6221)	grad_norm 1.8591 (2.4119)	mem 20675MB
[2025-04-03 00:51:41 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][270/311]	eta 0:00:36 lr 0.001171	time 0.8764 (0.8812)	loss 0.6588 (0.6220)	grad_norm 1.4808 (2.4078)	mem 20675MB
[2025-04-03 00:51:43 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][272/311]	eta 0:00:34 lr 0.001170	time 0.8760 (0.8812)	loss 0.6095 (0.6219)	grad_norm 1.6299 (2.4038)	mem 20675MB
[2025-04-03 00:51:45 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][274/311]	eta 0:00:32 lr 0.001170	time 0.8762 (0.8811)	loss 0.6453 (0.6222)	grad_norm 2.1461 (2.4037)	mem 20675MB
[2025-04-03 00:51:47 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][276/311]	eta 0:00:30 lr 0.001170	time 0.8794 (0.8811)	loss 0.6460 (0.6222)	grad_norm 1.9294 (2.4000)	mem 20675MB
[2025-04-03 00:51:48 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][278/311]	eta 0:00:29 lr 0.001170	time 0.8759 (0.8811)	loss 0.6325 (0.6223)	grad_norm 1.7116 (2.4025)	mem 20675MB
[2025-04-03 00:51:50 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][280/311]	eta 0:00:27 lr 0.001170	time 0.8771 (0.8811)	loss 0.6454 (0.6224)	grad_norm 2.1352 (2.4034)	mem 20675MB
[2025-04-03 00:51:52 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][282/311]	eta 0:00:25 lr 0.001169	time 0.8759 (0.8810)	loss 0.5853 (0.6222)	grad_norm 2.5845 (2.4014)	mem 20675MB
[2025-04-03 00:51:54 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][284/311]	eta 0:00:23 lr 0.001169	time 0.8756 (0.8810)	loss 0.5994 (0.6220)	grad_norm 2.1070 (2.4014)	mem 20675MB
[2025-04-03 00:51:55 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][286/311]	eta 0:00:22 lr 0.001169	time 0.8760 (0.8810)	loss 0.5498 (0.6216)	grad_norm 4.5132 (2.4080)	mem 20675MB
[2025-04-03 00:51:57 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][288/311]	eta 0:00:20 lr 0.001169	time 0.8758 (0.8809)	loss 0.5728 (0.6215)	grad_norm 3.1106 (2.4119)	mem 20675MB
[2025-04-03 00:51:59 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][290/311]	eta 0:00:18 lr 0.001168	time 0.8773 (0.8809)	loss 0.6211 (0.6215)	grad_norm 2.7456 (2.4203)	mem 20675MB
[2025-04-03 00:52:01 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][292/311]	eta 0:00:16 lr 0.001168	time 0.8759 (0.8809)	loss 0.6096 (0.6215)	grad_norm 2.4300 (2.4214)	mem 20675MB
[2025-04-03 00:52:02 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][294/311]	eta 0:00:14 lr 0.001168	time 0.8755 (0.8809)	loss 0.6259 (0.6214)	grad_norm 4.1040 (2.4260)	mem 20675MB
[2025-04-03 00:52:04 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][296/311]	eta 0:00:13 lr 0.001168	time 0.8757 (0.8808)	loss 0.5391 (0.6214)	grad_norm 3.3021 (2.4371)	mem 20675MB
[2025-04-03 00:52:06 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][298/311]	eta 0:00:11 lr 0.001168	time 0.8756 (0.8808)	loss 0.6359 (0.6215)	grad_norm 3.2239 (2.4387)	mem 20675MB
[2025-04-03 00:52:08 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][300/311]	eta 0:00:09 lr 0.001167	time 0.8775 (0.8808)	loss 0.5999 (0.6214)	grad_norm 2.2044 (2.4454)	mem 20675MB
[2025-04-03 00:52:09 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][302/311]	eta 0:00:07 lr 0.001167	time 0.8758 (0.8808)	loss 0.6597 (0.6214)	grad_norm 1.3217 (2.4417)	mem 20675MB
[2025-04-03 00:52:11 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][304/311]	eta 0:00:06 lr 0.001167	time 0.8762 (0.8808)	loss 0.5849 (0.6213)	grad_norm 3.2186 (2.4417)	mem 20675MB
[2025-04-03 00:52:13 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][306/311]	eta 0:00:04 lr 0.001167	time 0.8759 (0.8807)	loss 0.5793 (0.6212)	grad_norm 1.8098 (2.4425)	mem 20675MB
[2025-04-03 00:52:15 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][308/311]	eta 0:00:02 lr 0.001167	time 0.8755 (0.8807)	loss 0.6198 (0.6209)	grad_norm 1.9554 (2.4402)	mem 20675MB
[2025-04-03 00:52:16 simmim_finetune] (main_finetune.py 252): INFO Train: [4/30][310/311]	eta 0:00:00 lr 0.001166	time 0.8766 (0.8807)	loss 0.6385 (0.6210)	grad_norm 1.6239 (2.4355)	mem 20675MB
[2025-04-03 00:52:17 simmim_finetune] (main_finetune.py 260): INFO EPOCH 4 training takes 0:04:34
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.453 (1.453)	Loss 0.6258 (0.6258)	Acc@1 67.969 (67.969)	Mem 20675MB
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 69.718
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 69.7%
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 69.72%
[2025-04-03 00:52:18 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.32903978240994e-06, 4.32903978240994e-06, 6.651055011409026e-06, 6.651055011409026e-06, 1.0223386132946083e-05, 1.0223386132946083e-05, 1.5719280166080016e-05, 1.5719280166080016e-05, 2.4174501755516834e-05, 2.4174501755516834e-05, 3.718253497003502e-05, 3.718253497003502e-05, 5.7194893761601446e-05, 5.7194893761601446e-05, 8.798313805631902e-05, 8.798313805631902e-05, 0.00013534966774049993, 0.00013534966774049993, 0.00020822125187000903, 0.00020822125187000903, 0.00032033138130002295, 0.00032033138130002295, 0.0004928085035000445, 0.0004928085035000445, 0.0007581579222693083, 0.0007581579222693083, 0.0011663877972989448, 0.0011663877972989448]
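The per-group learning rates logged above follow from layer-wise LR decay (LAYER_DECAY 0.65 in the config): each parameter group one layer deeper than the next is scaled by 0.65, with ViT-Base's 12 blocks plus the embedding and head giving 14 distinct scales (each value appears twice because decay and no-decay parameter groups share a scale). The sketch below is a hypothetical reconstruction, not the script's actual code; `layer_scales` and its argument names are made up for illustration, and the small deviation of the logged ratios from exactly 0.65 comes from the cosine schedule sharing one MIN_LR across all groups.

```python
# Hypothetical sketch of layer-wise LR decay, assuming 12 transformer
# blocks plus embedding and head groups (14 scales total).
BASE_LR = 0.00125     # TRAIN.BASE_LR from the config header
LAYER_DECAY = 0.65    # TRAIN.LAYER_DECAY from the config header
NUM_LAYERS = 12       # ViT-Base depth

def layer_scales(num_layers: int, decay: float) -> list:
    """Scale factors from the embedding group (smallest) to the head (1.0)."""
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layer_scales(NUM_LAYERS, LAYER_DECAY)
# Initial per-group LRs before the cosine schedule is applied; the logged
# values are these after multiplying in the shared cosine decay factor
# (plus the common MIN_LR floor, which slightly perturbs the 0.65 ratios).
init_lrs = [BASE_LR * s for s in scales]
```

With these numbers the deepest (head) group starts at BASE_LR and the embedding group at roughly BASE_LR * 0.65**13, which lines up with the smallest logged values being about three orders of magnitude below the largest.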
[2025-04-03 00:52:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][0/311]	eta 0:11:11 lr 0.001166	time 2.1587 (2.1587)	loss 0.6413 (0.6413)	grad_norm 2.4293 (2.4293)	mem 20675MB
[2025-04-03 00:52:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][2/311]	eta 0:06:43 lr 0.001166	time 0.8761 (1.3044)	loss 0.5494 (0.6117)	grad_norm 2.5648 (2.2181)	mem 20675MB
[2025-04-03 00:52:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][4/311]	eta 0:05:48 lr 0.001166	time 0.8763 (1.1336)	loss 0.5480 (0.6042)	grad_norm 2.1894 (2.0450)	mem 20675MB
[2025-04-03 00:52:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][6/311]	eta 0:05:23 lr 0.001166	time 0.8769 (1.0605)	loss 0.5615 (0.5881)	grad_norm 2.3899 (2.1376)	mem 20675MB
[2025-04-03 00:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][8/311]	eta 0:05:08 lr 0.001165	time 0.8759 (1.0196)	loss 0.5874 (0.5845)	grad_norm 2.9442 (2.3534)	mem 20675MB
[2025-04-03 00:52:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][10/311]	eta 0:04:59 lr 0.001165	time 0.8757 (0.9936)	loss 0.6194 (0.5873)	grad_norm 3.1559 (2.5128)	mem 20675MB
[2025-04-03 00:52:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][12/311]	eta 0:04:51 lr 0.001165	time 0.8758 (0.9757)	loss 0.5217 (0.5847)	grad_norm 3.9971 (2.6434)	mem 20675MB
[2025-04-03 00:52:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][14/311]	eta 0:04:45 lr 0.001165	time 0.8759 (0.9624)	loss 0.5431 (0.5876)	grad_norm 2.5314 (2.6252)	mem 20675MB
[2025-04-03 00:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][16/311]	eta 0:04:40 lr 0.001165	time 0.8759 (0.9524)	loss 0.6297 (0.5899)	grad_norm 2.0348 (2.5927)	mem 20675MB
[2025-04-03 00:52:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][18/311]	eta 0:04:36 lr 0.001164	time 0.8757 (0.9444)	loss 0.6311 (0.5939)	grad_norm 2.2532 (2.5483)	mem 20675MB
[2025-04-03 00:52:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][20/311]	eta 0:04:32 lr 0.001164	time 0.8758 (0.9380)	loss 0.5851 (0.5964)	grad_norm 2.0823 (2.5134)	mem 20675MB
[2025-04-03 00:52:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][22/311]	eta 0:04:29 lr 0.001164	time 0.8772 (0.9328)	loss 0.6763 (0.5980)	grad_norm 2.2390 (2.4973)	mem 20675MB
[2025-04-03 00:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][24/311]	eta 0:04:26 lr 0.001164	time 0.8759 (0.9284)	loss 0.5929 (0.5998)	grad_norm 1.8725 (2.4544)	mem 20675MB
[2025-04-03 00:52:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][26/311]	eta 0:04:23 lr 0.001164	time 0.8759 (0.9246)	loss 0.6054 (0.5990)	grad_norm 2.3185 (2.4601)	mem 20675MB
[2025-04-03 00:52:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][28/311]	eta 0:04:20 lr 0.001163	time 0.8758 (0.9213)	loss 0.6499 (0.6002)	grad_norm 2.3679 (2.4500)	mem 20675MB
[2025-04-03 00:52:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][30/311]	eta 0:04:18 lr 0.001163	time 0.8758 (0.9184)	loss 0.6437 (0.6002)	grad_norm 2.6883 (2.4629)	mem 20675MB
[2025-04-03 00:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][32/311]	eta 0:04:15 lr 0.001163	time 0.8762 (0.9159)	loss 0.5856 (0.5993)	grad_norm 2.0029 (2.4417)	mem 20675MB
[2025-04-03 00:52:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][34/311]	eta 0:04:13 lr 0.001163	time 0.8759 (0.9137)	loss 0.6456 (0.6024)	grad_norm 2.6730 (2.4514)	mem 20675MB
[2025-04-03 00:52:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][36/311]	eta 0:04:10 lr 0.001162	time 0.8785 (0.9117)	loss 0.6159 (0.6027)	grad_norm 1.6454 (2.4228)	mem 20675MB
[2025-04-03 00:52:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][38/311]	eta 0:04:08 lr 0.001162	time 0.8758 (0.9100)	loss 0.6741 (0.6049)	grad_norm 1.8775 (2.4160)	mem 20675MB
[2025-04-03 00:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][40/311]	eta 0:04:06 lr 0.001162	time 0.8759 (0.9083)	loss 0.6698 (0.6075)	grad_norm 2.3102 (2.4070)	mem 20675MB
[2025-04-03 00:52:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][42/311]	eta 0:04:03 lr 0.001162	time 0.8756 (0.9069)	loss 0.6005 (0.6076)	grad_norm 1.6567 (2.3741)	mem 20675MB
[2025-04-03 00:52:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][44/311]	eta 0:04:01 lr 0.001162	time 0.8763 (0.9055)	loss 0.6716 (0.6100)	grad_norm 2.6986 (2.3918)	mem 20675MB
[2025-04-03 00:53:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][46/311]	eta 0:03:59 lr 0.001161	time 0.8759 (0.9043)	loss 0.6166 (0.6110)	grad_norm 1.9532 (2.3760)	mem 20675MB
[2025-04-03 00:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][48/311]	eta 0:03:57 lr 0.001161	time 0.8758 (0.9032)	loss 0.6650 (0.6119)	grad_norm 1.9441 (2.3416)	mem 20675MB
[2025-04-03 00:53:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][50/311]	eta 0:03:55 lr 0.001161	time 0.8755 (0.9021)	loss 0.6223 (0.6125)	grad_norm 1.3515 (2.3043)	mem 20675MB
[2025-04-03 00:53:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][52/311]	eta 0:03:53 lr 0.001161	time 0.8758 (0.9012)	loss 0.5768 (0.6126)	grad_norm 2.3609 (2.3007)	mem 20675MB
[2025-04-03 00:53:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][54/311]	eta 0:03:51 lr 0.001161	time 0.8761 (0.9003)	loss 0.5929 (0.6127)	grad_norm 2.0556 (2.2798)	mem 20675MB
[2025-04-03 00:53:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][56/311]	eta 0:03:49 lr 0.001160	time 0.8759 (0.8995)	loss 0.6062 (0.6128)	grad_norm 1.5678 (2.2613)	mem 20675MB
[2025-04-03 00:53:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][58/311]	eta 0:03:47 lr 0.001160	time 0.8759 (0.8987)	loss 0.5617 (0.6117)	grad_norm 1.4486 (2.2464)	mem 20675MB
[2025-04-03 00:53:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][60/311]	eta 0:03:45 lr 0.001160	time 0.8755 (0.8980)	loss 0.6474 (0.6120)	grad_norm 1.8540 (2.2441)	mem 20675MB
[2025-04-03 00:53:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][62/311]	eta 0:03:43 lr 0.001160	time 0.8758 (0.8973)	loss 0.6284 (0.6117)	grad_norm 1.9194 (2.2288)	mem 20675MB
[2025-04-03 00:53:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][64/311]	eta 0:03:41 lr 0.001159	time 0.8756 (0.8967)	loss 0.6094 (0.6106)	grad_norm 2.8704 (2.2478)	mem 20675MB
[2025-04-03 00:53:18 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][66/311]	eta 0:03:39 lr 0.001159	time 0.8755 (0.8961)	loss 0.5590 (0.6094)	grad_norm 2.2638 (2.2587)	mem 20675MB
[2025-04-03 00:53:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][68/311]	eta 0:03:37 lr 0.001159	time 0.8764 (0.8955)	loss 0.7326 (0.6114)	grad_norm 3.9448 (2.3178)	mem 20675MB
[2025-04-03 00:53:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][70/311]	eta 0:03:35 lr 0.001159	time 0.8758 (0.8950)	loss 0.6356 (0.6103)	grad_norm 4.0305 (2.3716)	mem 20675MB
[2025-04-03 00:53:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][72/311]	eta 0:03:33 lr 0.001159	time 0.8761 (0.8945)	loss 0.5817 (0.6105)	grad_norm 1.9032 (2.3621)	mem 20675MB
[2025-04-03 00:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][74/311]	eta 0:03:31 lr 0.001158	time 0.8759 (0.8940)	loss 0.6488 (0.6112)	grad_norm 2.0992 (2.3494)	mem 20675MB
[2025-04-03 00:53:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][76/311]	eta 0:03:29 lr 0.001158	time 0.8762 (0.8936)	loss 0.6215 (0.6116)	grad_norm 3.2179 (2.3590)	mem 20675MB
[2025-04-03 00:53:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][78/311]	eta 0:03:28 lr 0.001158	time 0.8760 (0.8932)	loss 0.6448 (0.6126)	grad_norm 1.7119 (2.3471)	mem 20675MB
[2025-04-03 00:53:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][80/311]	eta 0:03:26 lr 0.001158	time 0.8761 (0.8928)	loss 0.6435 (0.6137)	grad_norm 1.3668 (2.3289)	mem 20675MB
[2025-04-03 00:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][82/311]	eta 0:03:24 lr 0.001157	time 0.8759 (0.8924)	loss 0.6189 (0.6137)	grad_norm 2.2575 (2.3230)	mem 20675MB
[2025-04-03 00:53:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][84/311]	eta 0:03:22 lr 0.001157	time 0.8756 (0.8920)	loss 0.5633 (0.6125)	grad_norm 2.0644 (2.3137)	mem 20675MB
[2025-04-03 00:53:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][86/311]	eta 0:03:20 lr 0.001157	time 0.8758 (0.8916)	loss 0.5817 (0.6125)	grad_norm 1.8686 (2.3065)	mem 20675MB
[2025-04-03 00:53:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][88/311]	eta 0:03:18 lr 0.001157	time 0.8759 (0.8913)	loss 0.5552 (0.6108)	grad_norm 2.4979 (2.3039)	mem 20675MB
[2025-04-03 00:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][90/311]	eta 0:03:16 lr 0.001157	time 0.8758 (0.8910)	loss 0.5928 (0.6103)	grad_norm 4.9282 (2.3367)	mem 20675MB
[2025-04-03 00:53:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][92/311]	eta 0:03:15 lr 0.001156	time 0.8755 (0.8907)	loss 0.5416 (0.6089)	grad_norm 4.1812 (2.3751)	mem 20675MB
[2025-04-03 00:53:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][94/311]	eta 0:03:13 lr 0.001156	time 0.8755 (0.8904)	loss 0.6594 (0.6097)	grad_norm 2.1372 (2.3806)	mem 20675MB
[2025-04-03 00:53:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][96/311]	eta 0:03:11 lr 0.001156	time 0.8760 (0.8901)	loss 0.5375 (0.6084)	grad_norm 3.3005 (2.3948)	mem 20675MB
[2025-04-03 00:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][98/311]	eta 0:03:09 lr 0.001156	time 0.8759 (0.8898)	loss 0.5754 (0.6077)	grad_norm 2.2665 (2.3935)	mem 20675MB
[2025-04-03 00:53:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][100/311]	eta 0:03:07 lr 0.001155	time 0.8756 (0.8896)	loss 0.5391 (0.6067)	grad_norm 2.8616 (2.4083)	mem 20675MB
[2025-04-03 00:53:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][102/311]	eta 0:03:05 lr 0.001155	time 0.8758 (0.8893)	loss 0.6086 (0.6067)	grad_norm 3.8272 (2.4238)	mem 20675MB
[2025-04-03 00:53:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][104/311]	eta 0:03:04 lr 0.001155	time 0.8755 (0.8891)	loss 0.6263 (0.6072)	grad_norm 1.5263 (2.4166)	mem 20675MB
[2025-04-03 00:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][106/311]	eta 0:03:02 lr 0.001155	time 0.8756 (0.8888)	loss 0.5860 (0.6069)	grad_norm 2.3198 (2.4244)	mem 20675MB
[2025-04-03 00:53:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][108/311]	eta 0:03:00 lr 0.001155	time 0.8757 (0.8886)	loss 0.5526 (0.6070)	grad_norm 4.9811 (2.4477)	mem 20675MB
[2025-04-03 00:53:57 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][110/311]	eta 0:02:58 lr 0.001154	time 0.8758 (0.8884)	loss 0.6188 (0.6075)	grad_norm 1.3946 (2.4362)	mem 20675MB
[2025-04-03 00:53:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][112/311]	eta 0:02:56 lr 0.001154	time 0.8758 (0.8882)	loss 0.5891 (0.6080)	grad_norm 2.0163 (2.4339)	mem 20675MB
[2025-04-03 00:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][114/311]	eta 0:02:54 lr 0.001154	time 0.8758 (0.8880)	loss 0.6179 (0.6085)	grad_norm 1.6968 (2.4220)	mem 20675MB
[2025-04-03 00:54:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][116/311]	eta 0:02:53 lr 0.001154	time 0.8758 (0.8878)	loss 0.5685 (0.6084)	grad_norm 2.3213 (2.4124)	mem 20675MB
[2025-04-03 00:54:04 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][118/311]	eta 0:02:51 lr 0.001153	time 0.8759 (0.8876)	loss 0.6114 (0.6090)	grad_norm 3.1260 (2.4177)	mem 20675MB
[2025-04-03 00:54:06 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][120/311]	eta 0:02:49 lr 0.001153	time 0.8758 (0.8874)	loss 0.6786 (0.6095)	grad_norm 2.4813 (2.4129)	mem 20675MB
[2025-04-03 00:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][122/311]	eta 0:02:47 lr 0.001153	time 0.8756 (0.8873)	loss 0.6552 (0.6094)	grad_norm 1.8721 (2.4082)	mem 20675MB
[2025-04-03 00:54:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][124/311]	eta 0:02:45 lr 0.001153	time 0.8757 (0.8871)	loss 0.5806 (0.6092)	grad_norm 1.6510 (2.4183)	mem 20675MB
[2025-04-03 00:54:11 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][126/311]	eta 0:02:44 lr 0.001153	time 0.8756 (0.8869)	loss 0.6124 (0.6097)	grad_norm 2.4386 (2.4114)	mem 20675MB
[2025-04-03 00:54:13 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][128/311]	eta 0:02:42 lr 0.001152	time 0.8759 (0.8868)	loss 0.6025 (0.6094)	grad_norm 2.9918 (2.4249)	mem 20675MB
[2025-04-03 00:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][130/311]	eta 0:02:40 lr 0.001152	time 0.8757 (0.8866)	loss 0.6585 (0.6095)	grad_norm 3.3090 (2.4369)	mem 20675MB
[2025-04-03 00:54:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][132/311]	eta 0:02:38 lr 0.001152	time 0.8757 (0.8865)	loss 0.6596 (0.6102)	grad_norm 4.2436 (2.4515)	mem 20675MB
[2025-04-03 00:54:18 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][134/311]	eta 0:02:36 lr 0.001152	time 0.8761 (0.8863)	loss 0.5844 (0.6102)	grad_norm 1.6516 (2.4441)	mem 20675MB
[2025-04-03 00:54:20 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][136/311]	eta 0:02:35 lr 0.001151	time 0.8760 (0.8862)	loss 0.6450 (0.6104)	grad_norm 3.1752 (2.4435)	mem 20675MB
[2025-04-03 00:54:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][138/311]	eta 0:02:33 lr 0.001151	time 0.8757 (0.8861)	loss 0.6446 (0.6106)	grad_norm 1.6225 (2.4334)	mem 20675MB
[2025-04-03 00:54:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][140/311]	eta 0:02:31 lr 0.001151	time 0.8758 (0.8859)	loss 0.6520 (0.6111)	grad_norm 2.8564 (2.4303)	mem 20675MB
[2025-04-03 00:54:25 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][142/311]	eta 0:02:29 lr 0.001151	time 0.8759 (0.8858)	loss 0.6639 (0.6112)	grad_norm 2.8978 (2.4356)	mem 20675MB
[2025-04-03 00:54:27 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][144/311]	eta 0:02:27 lr 0.001151	time 0.8757 (0.8857)	loss 0.6514 (0.6116)	grad_norm 1.0157 (2.4240)	mem 20675MB
[2025-04-03 00:54:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][146/311]	eta 0:02:26 lr 0.001150	time 0.8756 (0.8855)	loss 0.6485 (0.6121)	grad_norm 2.7265 (2.4263)	mem 20675MB
[2025-04-03 00:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][148/311]	eta 0:02:24 lr 0.001150	time 0.8758 (0.8854)	loss 0.5957 (0.6116)	grad_norm 1.8615 (2.4172)	mem 20675MB
[2025-04-03 00:54:32 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][150/311]	eta 0:02:22 lr 0.001150	time 0.8759 (0.8853)	loss 0.6366 (0.6123)	grad_norm 2.4033 (2.4298)	mem 20675MB
[2025-04-03 00:54:34 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][152/311]	eta 0:02:20 lr 0.001150	time 0.8760 (0.8852)	loss 0.6203 (0.6120)	grad_norm 2.2262 (2.4273)	mem 20675MB
[2025-04-03 00:54:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][154/311]	eta 0:02:18 lr 0.001149	time 0.8764 (0.8851)	loss 0.5705 (0.6116)	grad_norm 2.2943 (2.4292)	mem 20675MB
[2025-04-03 00:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][156/311]	eta 0:02:17 lr 0.001149	time 0.8754 (0.8850)	loss 0.6702 (0.6119)	grad_norm 2.2037 (2.4266)	mem 20675MB
[2025-04-03 00:54:39 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][158/311]	eta 0:02:15 lr 0.001149	time 0.8757 (0.8849)	loss 0.5685 (0.6117)	grad_norm 2.1339 (2.4217)	mem 20675MB
[2025-04-03 00:54:41 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][160/311]	eta 0:02:13 lr 0.001149	time 0.8757 (0.8848)	loss 0.6024 (0.6111)	grad_norm 1.8155 (2.4192)	mem 20675MB
[2025-04-03 00:54:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][162/311]	eta 0:02:11 lr 0.001148	time 0.8758 (0.8847)	loss 0.5530 (0.6106)	grad_norm 3.6759 (2.4249)	mem 20675MB
[2025-04-03 00:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][164/311]	eta 0:02:10 lr 0.001148	time 0.8754 (0.8846)	loss 0.6007 (0.6107)	grad_norm 2.6276 (2.4302)	mem 20675MB
[2025-04-03 00:54:46 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][166/311]	eta 0:02:08 lr 0.001148	time 0.8755 (0.8845)	loss 0.6028 (0.6103)	grad_norm 2.2646 (2.4430)	mem 20675MB
[2025-04-03 00:54:48 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][168/311]	eta 0:02:06 lr 0.001148	time 0.8755 (0.8844)	loss 0.5738 (0.6105)	grad_norm 2.7169 (2.4474)	mem 20675MB
[2025-04-03 00:54:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][170/311]	eta 0:02:04 lr 0.001148	time 0.8757 (0.8843)	loss 0.5713 (0.6108)	grad_norm 3.4866 (2.4514)	mem 20675MB
[2025-04-03 00:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][172/311]	eta 0:02:02 lr 0.001147	time 0.8781 (0.8842)	loss 0.6735 (0.6110)	grad_norm 1.9923 (2.4461)	mem 20675MB
[2025-04-03 00:54:53 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][174/311]	eta 0:02:01 lr 0.001147	time 0.8757 (0.8841)	loss 0.6532 (0.6114)	grad_norm 2.1867 (2.4411)	mem 20675MB
[2025-04-03 00:54:55 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][176/311]	eta 0:01:59 lr 0.001147	time 0.8758 (0.8840)	loss 0.6640 (0.6119)	grad_norm 2.7071 (2.4395)	mem 20675MB
[2025-04-03 00:54:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][178/311]	eta 0:01:57 lr 0.001147	time 0.8761 (0.8839)	loss 0.5674 (0.6116)	grad_norm 1.8543 (2.4303)	mem 20675MB
[2025-04-03 00:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][180/311]	eta 0:01:55 lr 0.001146	time 0.8757 (0.8839)	loss 0.6363 (0.6116)	grad_norm 2.2438 (2.4271)	mem 20675MB
[2025-04-03 00:55:00 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][182/311]	eta 0:01:54 lr 0.001146	time 0.8784 (0.8838)	loss 0.6449 (0.6120)	grad_norm 2.5034 (2.4257)	mem 20675MB
[2025-04-03 00:55:02 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][184/311]	eta 0:01:52 lr 0.001146	time 0.8756 (0.8837)	loss 0.6390 (0.6120)	grad_norm 2.2033 (2.4247)	mem 20675MB
[2025-04-03 00:55:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][186/311]	eta 0:01:50 lr 0.001146	time 0.8757 (0.8836)	loss 0.5978 (0.6118)	grad_norm 2.4141 (2.4277)	mem 20675MB
[2025-04-03 00:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][188/311]	eta 0:01:48 lr 0.001145	time 0.8762 (0.8836)	loss 0.5376 (0.6117)	grad_norm 4.4774 (2.4397)	mem 20675MB
[2025-04-03 00:55:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][190/311]	eta 0:01:46 lr 0.001145	time 0.8759 (0.8835)	loss 0.5983 (0.6119)	grad_norm 2.8946 (2.4429)	mem 20675MB
[2025-04-03 00:55:09 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][192/311]	eta 0:01:45 lr 0.001145	time 0.8762 (0.8834)	loss 0.6776 (0.6122)	grad_norm 2.5571 (2.4492)	mem 20675MB
[2025-04-03 00:55:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][194/311]	eta 0:01:43 lr 0.001145	time 0.8753 (0.8834)	loss 0.5570 (0.6115)	grad_norm 3.2095 (2.4568)	mem 20675MB
[2025-04-03 00:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][196/311]	eta 0:01:41 lr 0.001144	time 0.8763 (0.8833)	loss 0.6132 (0.6115)	grad_norm 1.9523 (2.4530)	mem 20675MB
[2025-04-03 00:55:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][198/311]	eta 0:01:39 lr 0.001144	time 0.8789 (0.8832)	loss 0.6404 (0.6112)	grad_norm 2.9253 (2.4582)	mem 20675MB
[2025-04-03 00:55:16 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][200/311]	eta 0:01:38 lr 0.001144	time 0.8754 (0.8832)	loss 0.5663 (0.6107)	grad_norm 2.9371 (2.4633)	mem 20675MB
[2025-04-03 00:55:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][202/311]	eta 0:01:36 lr 0.001144	time 0.8758 (0.8831)	loss 0.5734 (0.6105)	grad_norm 2.7119 (2.4588)	mem 20675MB
[2025-04-03 00:55:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][204/311]	eta 0:01:34 lr 0.001144	time 0.8762 (0.8831)	loss 0.6774 (0.6110)	grad_norm 2.0440 (2.4523)	mem 20675MB
[2025-04-03 00:55:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][206/311]	eta 0:01:32 lr 0.001143	time 0.8780 (0.8830)	loss 0.6462 (0.6114)	grad_norm 2.1745 (2.4560)	mem 20675MB
[2025-04-03 00:55:23 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][208/311]	eta 0:01:30 lr 0.001143	time 0.8763 (0.8830)	loss 0.6922 (0.6119)	grad_norm 3.3786 (2.4587)	mem 20675MB
[2025-04-03 00:55:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][210/311]	eta 0:01:29 lr 0.001143	time 0.8772 (0.8829)	loss 0.6260 (0.6117)	grad_norm 1.9944 (2.4547)	mem 20675MB
[2025-04-03 00:55:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][212/311]	eta 0:01:27 lr 0.001143	time 0.8761 (0.8829)	loss 0.6040 (0.6114)	grad_norm 1.6487 (2.4493)	mem 20675MB
[2025-04-03 00:55:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][214/311]	eta 0:01:25 lr 0.001142	time 0.8763 (0.8828)	loss 0.6245 (0.6117)	grad_norm 1.7501 (2.4466)	mem 20675MB
[2025-04-03 00:55:30 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][216/311]	eta 0:01:23 lr 0.001142	time 0.8755 (0.8828)	loss 0.5484 (0.6113)	grad_norm 2.6961 (2.4461)	mem 20675MB
[2025-04-03 00:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][218/311]	eta 0:01:22 lr 0.001142	time 0.8757 (0.8827)	loss 0.6733 (0.6117)	grad_norm 1.9138 (2.4410)	mem 20675MB
[2025-04-03 00:55:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][220/311]	eta 0:01:20 lr 0.001142	time 0.8758 (0.8826)	loss 0.6010 (0.6118)	grad_norm 1.5928 (2.4336)	mem 20675MB
[2025-04-03 00:55:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][222/311]	eta 0:01:18 lr 0.001141	time 0.8762 (0.8826)	loss 0.5057 (0.6116)	grad_norm 2.1860 (2.4368)	mem 20675MB
[2025-04-03 00:55:37 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][224/311]	eta 0:01:16 lr 0.001141	time 0.8755 (0.8825)	loss 0.6373 (0.6117)	grad_norm 1.6518 (2.4405)	mem 20675MB
[2025-04-03 00:55:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][226/311]	eta 0:01:15 lr 0.001141	time 0.8754 (0.8825)	loss 0.6114 (0.6119)	grad_norm 2.0211 (2.4355)	mem 20675MB
[2025-04-03 00:55:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][228/311]	eta 0:01:13 lr 0.001141	time 0.8762 (0.8824)	loss 0.5591 (0.6116)	grad_norm 2.1508 (2.4320)	mem 20675MB
[2025-04-03 00:55:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][230/311]	eta 0:01:11 lr 0.001140	time 0.8761 (0.8824)	loss 0.6333 (0.6116)	grad_norm 1.9144 (2.4285)	mem 20675MB
[2025-04-03 00:55:44 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][232/311]	eta 0:01:09 lr 0.001140	time 0.8755 (0.8823)	loss 0.6432 (0.6117)	grad_norm 1.5139 (2.4221)	mem 20675MB
[2025-04-03 00:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][234/311]	eta 0:01:07 lr 0.001140	time 0.8756 (0.8823)	loss 0.6416 (0.6117)	grad_norm 2.9487 (2.4260)	mem 20675MB
[2025-04-03 00:55:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][236/311]	eta 0:01:06 lr 0.001140	time 0.8754 (0.8822)	loss 0.6289 (0.6119)	grad_norm 1.3751 (2.4237)	mem 20675MB
[2025-04-03 00:55:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][238/311]	eta 0:01:04 lr 0.001140	time 0.8760 (0.8822)	loss 0.6091 (0.6118)	grad_norm 2.5482 (2.4223)	mem 20675MB
[2025-04-03 00:55:51 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][240/311]	eta 0:01:02 lr 0.001139	time 0.8754 (0.8821)	loss 0.6534 (0.6121)	grad_norm 2.2340 (2.4218)	mem 20675MB
[2025-04-03 00:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][242/311]	eta 0:01:00 lr 0.001139	time 0.8761 (0.8821)	loss 0.6455 (0.6125)	grad_norm 2.4190 (2.4262)	mem 20675MB
[2025-04-03 00:55:54 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][244/311]	eta 0:00:59 lr 0.001139	time 0.8758 (0.8820)	loss 0.6533 (0.6122)	grad_norm 1.5434 (2.4217)	mem 20675MB
[2025-04-03 00:55:56 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][246/311]	eta 0:00:57 lr 0.001139	time 0.8755 (0.8820)	loss 0.6108 (0.6121)	grad_norm 2.0710 (2.4194)	mem 20675MB
[2025-04-03 00:55:58 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][248/311]	eta 0:00:55 lr 0.001138	time 0.8754 (0.8819)	loss 0.5730 (0.6118)	grad_norm 2.1266 (2.4190)	mem 20675MB
[2025-04-03 00:55:59 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][250/311]	eta 0:00:53 lr 0.001138	time 0.8756 (0.8819)	loss 0.6175 (0.6114)	grad_norm 1.7780 (2.4163)	mem 20675MB
[2025-04-03 00:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][252/311]	eta 0:00:52 lr 0.001138	time 0.8754 (0.8819)	loss 0.6033 (0.6115)	grad_norm 2.1321 (2.4182)	mem 20675MB
[2025-04-03 00:56:03 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][254/311]	eta 0:00:50 lr 0.001138	time 0.8754 (0.8818)	loss 0.5272 (0.6112)	grad_norm 3.5374 (2.4255)	mem 20675MB
[2025-04-03 00:56:05 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][256/311]	eta 0:00:48 lr 0.001137	time 0.8757 (0.8818)	loss 0.6272 (0.6110)	grad_norm 2.8199 (2.4309)	mem 20675MB
[2025-04-03 00:56:07 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][258/311]	eta 0:00:46 lr 0.001137	time 0.8759 (0.8817)	loss 0.5794 (0.6111)	grad_norm 3.3945 (2.4371)	mem 20675MB
[2025-04-03 00:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][260/311]	eta 0:00:44 lr 0.001137	time 0.8754 (0.8817)	loss 0.5827 (0.6111)	grad_norm 4.8833 (2.4516)	mem 20675MB
[2025-04-03 00:56:10 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][262/311]	eta 0:00:43 lr 0.001137	time 0.8756 (0.8817)	loss 0.6064 (0.6110)	grad_norm 2.9561 (2.4524)	mem 20675MB
[2025-04-03 00:56:12 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][264/311]	eta 0:00:41 lr 0.001136	time 0.8756 (0.8816)	loss 0.6195 (0.6109)	grad_norm 1.4809 (2.4483)	mem 20675MB
[2025-04-03 00:56:14 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][266/311]	eta 0:00:39 lr 0.001136	time 0.8761 (0.8816)	loss 0.5962 (0.6109)	grad_norm 2.3601 (2.4440)	mem 20675MB
[2025-04-03 00:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][268/311]	eta 0:00:37 lr 0.001136	time 0.8757 (0.8815)	loss 0.5848 (0.6110)	grad_norm 1.7025 (2.4407)	mem 20675MB
[2025-04-03 00:56:17 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][270/311]	eta 0:00:36 lr 0.001136	time 0.8757 (0.8815)	loss 0.6591 (0.6112)	grad_norm 1.9706 (2.4365)	mem 20675MB
[2025-04-03 00:56:19 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][272/311]	eta 0:00:34 lr 0.001135	time 0.8755 (0.8815)	loss 0.6290 (0.6112)	grad_norm 2.6519 (2.4361)	mem 20675MB
[2025-04-03 00:56:21 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][274/311]	eta 0:00:32 lr 0.001135	time 0.8757 (0.8814)	loss 0.6576 (0.6112)	grad_norm 2.3573 (2.4343)	mem 20675MB
[2025-04-03 00:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][276/311]	eta 0:00:30 lr 0.001135	time 0.8758 (0.8814)	loss 0.6103 (0.6111)	grad_norm 1.7005 (2.4295)	mem 20675MB
[2025-04-03 00:56:24 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][278/311]	eta 0:00:29 lr 0.001135	time 0.8755 (0.8814)	loss 0.5747 (0.6111)	grad_norm 1.8282 (2.4263)	mem 20675MB
[2025-04-03 00:56:26 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][280/311]	eta 0:00:27 lr 0.001134	time 0.8758 (0.8813)	loss 0.5375 (0.6110)	grad_norm 2.7407 (2.4265)	mem 20675MB
[2025-04-03 00:56:28 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][282/311]	eta 0:00:25 lr 0.001134	time 0.8757 (0.8813)	loss 0.5561 (0.6109)	grad_norm 3.7638 (2.4326)	mem 20675MB
[2025-04-03 00:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][284/311]	eta 0:00:23 lr 0.001134	time 0.8757 (0.8813)	loss 0.6720 (0.6111)	grad_norm 3.1945 (2.4352)	mem 20675MB
[2025-04-03 00:56:31 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][286/311]	eta 0:00:22 lr 0.001134	time 0.8757 (0.8812)	loss 0.5140 (0.6105)	grad_norm 2.9386 (2.4386)	mem 20675MB
[2025-04-03 00:56:33 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][288/311]	eta 0:00:20 lr 0.001133	time 0.8754 (0.8812)	loss 0.6028 (0.6106)	grad_norm 2.0935 (2.4370)	mem 20675MB
[2025-04-03 00:56:35 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][290/311]	eta 0:00:18 lr 0.001133	time 0.8754 (0.8812)	loss 0.5978 (0.6103)	grad_norm 2.1120 (2.4390)	mem 20675MB
[2025-04-03 00:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][292/311]	eta 0:00:16 lr 0.001133	time 0.8754 (0.8811)	loss 0.5405 (0.6100)	grad_norm 2.2691 (2.4429)	mem 20675MB
[2025-04-03 00:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][294/311]	eta 0:00:14 lr 0.001133	time 0.8755 (0.8811)	loss 0.6030 (0.6101)	grad_norm 2.5863 (2.4477)	mem 20675MB
[2025-04-03 00:56:40 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][296/311]	eta 0:00:13 lr 0.001133	time 0.8756 (0.8811)	loss 0.5588 (0.6095)	grad_norm 3.4904 (2.4518)	mem 20675MB
[2025-04-03 00:56:42 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][298/311]	eta 0:00:11 lr 0.001132	time 0.8755 (0.8810)	loss 0.5476 (0.6094)	grad_norm 4.3327 (2.4619)	mem 20675MB
[2025-04-03 00:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][300/311]	eta 0:00:09 lr 0.001132	time 0.8752 (0.8810)	loss 0.6622 (0.6093)	grad_norm 3.2963 (2.4645)	mem 20675MB
[2025-04-03 00:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][302/311]	eta 0:00:07 lr 0.001132	time 0.8754 (0.8810)	loss 0.6421 (0.6094)	grad_norm 2.6549 (2.4672)	mem 20675MB
[2025-04-03 00:56:47 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][304/311]	eta 0:00:06 lr 0.001132	time 0.8752 (0.8809)	loss 0.5841 (0.6093)	grad_norm 2.6293 (2.4683)	mem 20675MB
[2025-04-03 00:56:49 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][306/311]	eta 0:00:04 lr 0.001131	time 0.8757 (0.8809)	loss 0.5657 (0.6090)	grad_norm 2.1804 (2.4654)	mem 20675MB
[2025-04-03 00:56:50 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][308/311]	eta 0:00:02 lr 0.001131	time 0.8752 (0.8809)	loss 0.5584 (0.6087)	grad_norm 2.6111 (2.4646)	mem 20675MB
[2025-04-03 00:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [5/30][310/311]	eta 0:00:00 lr 0.001131	time 0.8756 (0.8808)	loss 0.5860 (0.6088)	grad_norm 2.9020 (2.4647)	mem 20675MB
[2025-04-03 00:56:52 simmim_finetune] (main_finetune.py 260): INFO EPOCH 5 training takes 0:04:34
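The per-iteration lines above report each meter as "instantaneous (running average)", and the eta appears to be the running-average step time multiplied by the steps remaining. A minimal parsing sketch, using one line copied verbatim from this log (the field layout is inferred from the lines above, not from main_finetune.py):

```python
import datetime
import re

# One iteration line copied verbatim from the log above.
line = ("Train: [5/30][58/311]\teta 0:03:47 lr 0.001160\t"
        "time 0.8759 (0.8987)\tloss 0.5617 (0.6117)\t"
        "grad_norm 1.4486 (2.2464)\tmem 20675MB")

# Assumed layout: "name value (running_average)"; eta = avg_time * steps_left.
m = re.search(r"\[\d+/\d+\]\[(\d+)/(\d+)\].*?time [\d.]+ \(([\d.]+)\)", line)
idx, num_steps, avg_time = int(m[1]), int(m[2]), float(m[3])
eta = datetime.timedelta(seconds=int(avg_time * (num_steps - idx)))
print(eta)  # 0:03:47, matching the logged eta
```

The same arithmetic reproduces the epoch total: 311 steps at an average of ~0.8808 s is ~274 s, i.e. the "0:04:34" reported above.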
[2025-04-03 00:56:52 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt5.pth saving......
[2025-04-03 00:56:55 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt5.pth saved !!!
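The ckpt5.pth written above can later be restored via AUTO_RESUME. A minimal sketch of the save/resume round-trip on a toy module; the actual key names are whatever utils.py's save_checkpoint writes (commonly 'model', 'optimizer', 'epoch'), so verify against the repo before relying on them:

```python
import io
import torch
from torch import nn

# Toy stand-ins for the real ViT model and its AdamW optimizer
# (betas/eps taken from the OPTIMIZER section of the config above).
model = nn.Linear(4, 2)
opt = torch.optim.AdamW(model.parameters(), lr=1.25e-3,
                        betas=(0.9, 0.999), eps=1e-8)

# Assumed checkpoint layout; the repo may store extra keys (lr_scheduler, etc.).
buf = io.BytesIO()
torch.save({"model": model.state_dict(),
            "optimizer": opt.state_dict(),
            "epoch": 5}, buf)

buf.seek(0)
ckpt = torch.load(buf, map_location="cpu")
model.load_state_dict(ckpt["model"])    # resume weights
opt.load_state_dict(ckpt["optimizer"])  # resume optimizer moments
```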
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.368 (1.368)	Loss 0.6169 (0.6169)	Acc@1 67.969 (67.969)	Mem 20675MB
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 70.423
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 70.4%
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 70.42%
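The overall Acc@1 of 70.423 is the sample-weighted mean over the two validation batches (BATCH_SIZE=128 plus a remainder batch of 14, for the 142 test images). The logged 67.969 on the first batch equals 87/128, which implies 13/14 correct on the second batch (inferred from the totals, not logged). A quick check:

```python
# Sample-weighted Acc@1 over the two val batches; the 13/14 on the
# remainder batch is inferred, only the 128-image batch was logged.
batch_sizes = [128, 14]
batch_acc = [87 / 128 * 100, 13 / 14 * 100]  # 67.969 logged; 92.857 inferred
overall = sum(a * n for a, n in zip(batch_acc, batch_sizes)) / sum(batch_sizes)
print(round(overall, 3))  # 70.423, matching the logged " * Acc@1 70.423"
```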
[2025-04-03 00:56:57 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.20449805740833e-06, 4.20449805740833e-06, 6.455617242846719e-06, 6.455617242846719e-06, 9.918877528136548e-06, 9.918877528136548e-06, 1.5246970274736286e-05, 1.5246970274736286e-05, 2.3444036038735877e-05, 2.3444036038735877e-05, 3.60549064448891e-05, 3.60549064448891e-05, 5.545624553127867e-05, 5.545624553127867e-05, 8.530445951033954e-05, 8.530445951033954e-05, 0.00013122478870889473, 0.00013122478870889473, 0.00020187144901436427, 0.00020187144901436427, 0.00031055861871508655, 0.00031055861871508655, 0.00047776964902389015, 0.00047776964902389015, 0.0007350173879605112, 0.0007350173879605112, 0.001130783140170697, 0.001130783140170697]
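The 28 values above look like 14 depth groups (each split into decay/no-decay parameter sets) under LAYER_DECAY=0.65: each group's base lr is BASE_LR scaled by 0.65 per layer of depth, then all groups share one cosine factor with floor MIN_LR. A sketch reconstructing the list under those assumptions (14 groups and the timm-style schedule are inferred, not taken from the repo), deriving the shared cosine factor from the logged head-group lr:

```python
# Config values from the header: BASE_LR=0.00125, MIN_LR=2.5e-7, LAYER_DECAY=0.65.
base_lr, min_lr, layer_decay, num_groups = 1.25e-3, 2.5e-7, 0.65, 14

head_lr = 0.001130783140170697              # last value in the logged list
f = (head_lr - min_lr) / (base_lr - min_lr)  # shared cosine factor at this step

# Group i is scaled by layer_decay^(depth below the head); lr floors at min_lr.
lrs = [min_lr + (base_lr * layer_decay ** (num_groups - 1 - i) - min_lr) * f
       for i in range(num_groups)]
print(f"{lrs[0]:.6e}")  # ~4.2045e-06, the deepest (most decayed) group above
```

Under this model the reconstructed values match the logged list to about six significant figures, which is consistent with the 0.65 layer decay in the config.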
[2025-04-03 00:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][0/311]	eta 0:10:15 lr 0.001131	time 1.9781 (1.9781)	loss 0.5558 (0.5558)	grad_norm 3.3423 (3.3423)	mem 20675MB
[2025-04-03 00:57:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][2/311]	eta 0:06:24 lr 0.001130	time 0.8761 (1.2439)	loss 0.5299 (0.5621)	grad_norm 2.5395 (2.4913)	mem 20675MB
[2025-04-03 00:57:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][4/311]	eta 0:05:36 lr 0.001130	time 0.8758 (1.0971)	loss 0.6150 (0.5972)	grad_norm 2.3032 (2.5697)	mem 20675MB
[2025-04-03 00:57:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][6/311]	eta 0:05:15 lr 0.001130	time 0.8761 (1.0342)	loss 0.6445 (0.5991)	grad_norm 2.1358 (2.4407)	mem 20675MB
[2025-04-03 00:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][8/311]	eta 0:05:02 lr 0.001130	time 0.8758 (0.9992)	loss 0.6214 (0.6076)	grad_norm 2.9096 (2.4479)	mem 20675MB
[2025-04-03 00:57:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][10/311]	eta 0:04:54 lr 0.001129	time 0.8763 (0.9770)	loss 0.6324 (0.6164)	grad_norm 2.0709 (2.4071)	mem 20675MB
[2025-04-03 00:57:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][12/311]	eta 0:04:47 lr 0.001129	time 0.8762 (0.9617)	loss 0.6510 (0.6159)	grad_norm 1.8302 (2.3163)	mem 20675MB
[2025-04-03 00:57:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][14/311]	eta 0:04:42 lr 0.001129	time 0.8759 (0.9503)	loss 0.6284 (0.6164)	grad_norm 1.1908 (2.2461)	mem 20675MB
[2025-04-03 00:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][16/311]	eta 0:04:37 lr 0.001129	time 0.8759 (0.9417)	loss 0.5795 (0.6144)	grad_norm 3.2157 (2.2590)	mem 20675MB
[2025-04-03 00:57:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][18/311]	eta 0:04:33 lr 0.001128	time 0.8762 (0.9349)	loss 0.6666 (0.6215)	grad_norm 3.1066 (2.3069)	mem 20675MB
[2025-04-03 00:57:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][20/311]	eta 0:04:30 lr 0.001128	time 0.8758 (0.9293)	loss 0.5956 (0.6211)	grad_norm 1.5080 (2.2685)	mem 20675MB
[2025-04-03 00:57:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][22/311]	eta 0:04:27 lr 0.001128	time 0.8761 (0.9248)	loss 0.5271 (0.6167)	grad_norm 2.2171 (2.2248)	mem 20675MB
[2025-04-03 00:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][24/311]	eta 0:04:24 lr 0.001128	time 0.8757 (0.9209)	loss 0.5313 (0.6125)	grad_norm 2.1098 (2.1871)	mem 20675MB
[2025-04-03 00:57:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][26/311]	eta 0:04:21 lr 0.001127	time 0.8761 (0.9177)	loss 0.6875 (0.6129)	grad_norm 2.2839 (2.1860)	mem 20675MB
[2025-04-03 00:57:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][28/311]	eta 0:04:18 lr 0.001127	time 0.8757 (0.9148)	loss 0.5864 (0.6085)	grad_norm 2.2685 (2.1918)	mem 20675MB
[2025-04-03 00:57:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][30/311]	eta 0:04:16 lr 0.001127	time 0.8761 (0.9124)	loss 0.5730 (0.6078)	grad_norm 3.5835 (2.2426)	mem 20675MB
[2025-04-03 00:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][32/311]	eta 0:04:13 lr 0.001127	time 0.8761 (0.9102)	loss 0.6708 (0.6098)	grad_norm 3.4842 (2.3308)	mem 20675MB
[2025-04-03 00:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][34/311]	eta 0:04:11 lr 0.001126	time 0.8762 (0.9083)	loss 0.6160 (0.6095)	grad_norm 2.4382 (2.3508)	mem 20675MB
[2025-04-03 00:57:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][36/311]	eta 0:04:09 lr 0.001126	time 0.8762 (0.9066)	loss 0.6135 (0.6087)	grad_norm 1.9252 (2.3426)	mem 20675MB
[2025-04-03 00:57:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][38/311]	eta 0:04:07 lr 0.001126	time 0.8759 (0.9051)	loss 0.6306 (0.6097)	grad_norm 1.4204 (2.3189)	mem 20675MB
[2025-04-03 00:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][40/311]	eta 0:04:04 lr 0.001126	time 0.8759 (0.9037)	loss 0.5403 (0.6084)	grad_norm 2.3111 (2.3049)	mem 20675MB
[2025-04-03 00:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][42/311]	eta 0:04:02 lr 0.001125	time 0.8761 (0.9025)	loss 0.6005 (0.6090)	grad_norm 2.1771 (2.2938)	mem 20675MB
[2025-04-03 00:57:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][44/311]	eta 0:04:00 lr 0.001125	time 0.8759 (0.9013)	loss 0.6148 (0.6080)	grad_norm 1.5940 (2.2644)	mem 20675MB
[2025-04-03 00:57:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][46/311]	eta 0:03:58 lr 0.001125	time 0.8761 (0.9003)	loss 0.5447 (0.6063)	grad_norm 1.8493 (2.2563)	mem 20675MB
[2025-04-03 00:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][48/311]	eta 0:03:56 lr 0.001125	time 0.8758 (0.8993)	loss 0.4617 (0.6016)	grad_norm 2.6751 (2.2812)	mem 20675MB
[2025-04-03 00:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][50/311]	eta 0:03:54 lr 0.001124	time 0.8765 (0.8985)	loss 0.5457 (0.6006)	grad_norm 3.5339 (2.3024)	mem 20675MB
[2025-04-03 00:57:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][52/311]	eta 0:03:52 lr 0.001124	time 0.8761 (0.8977)	loss 0.6135 (0.5987)	grad_norm 2.4607 (2.3281)	mem 20675MB
[2025-04-03 00:57:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][54/311]	eta 0:03:50 lr 0.001124	time 0.8761 (0.8969)	loss 0.5568 (0.5989)	grad_norm 3.4432 (2.3724)	mem 20675MB
[2025-04-03 00:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][56/311]	eta 0:03:48 lr 0.001124	time 0.8758 (0.8962)	loss 0.5698 (0.5985)	grad_norm 2.6761 (2.4015)	mem 20675MB
[2025-04-03 00:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][58/311]	eta 0:03:46 lr 0.001123	time 0.8763 (0.8955)	loss 0.6597 (0.5994)	grad_norm 2.3611 (2.4085)	mem 20675MB
[2025-04-03 00:57:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][60/311]	eta 0:03:44 lr 0.001123	time 0.8761 (0.8949)	loss 0.5666 (0.5976)	grad_norm 2.7541 (2.4167)	mem 20675MB
[2025-04-03 00:57:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][62/311]	eta 0:03:42 lr 0.001123	time 0.8764 (0.8944)	loss 0.6581 (0.6003)	grad_norm 1.6360 (2.4284)	mem 20675MB
[2025-04-03 00:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][64/311]	eta 0:03:40 lr 0.001123	time 0.8761 (0.8938)	loss 0.6038 (0.6003)	grad_norm 1.9472 (2.4061)	mem 20675MB
[2025-04-03 00:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][66/311]	eta 0:03:38 lr 0.001122	time 0.8760 (0.8933)	loss 0.5961 (0.5994)	grad_norm 1.6004 (2.4103)	mem 20675MB
[2025-04-03 00:57:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][68/311]	eta 0:03:36 lr 0.001122	time 0.8761 (0.8928)	loss 0.6599 (0.5999)	grad_norm 1.5409 (2.3906)	mem 20675MB
[2025-04-03 00:58:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][70/311]	eta 0:03:35 lr 0.001122	time 0.8761 (0.8924)	loss 0.6231 (0.6000)	grad_norm 3.1863 (2.3926)	mem 20675MB
[2025-04-03 00:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][72/311]	eta 0:03:33 lr 0.001122	time 0.8760 (0.8920)	loss 0.5600 (0.5992)	grad_norm 2.4123 (2.3949)	mem 20675MB
[2025-04-03 00:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][74/311]	eta 0:03:31 lr 0.001121	time 0.8761 (0.8916)	loss 0.6162 (0.5986)	grad_norm 1.5141 (2.3833)	mem 20675MB
[2025-04-03 00:58:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][76/311]	eta 0:03:29 lr 0.001121	time 0.8758 (0.8912)	loss 0.5781 (0.5986)	grad_norm 1.9835 (2.3708)	mem 20675MB
[2025-04-03 00:58:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][78/311]	eta 0:03:27 lr 0.001121	time 0.8762 (0.8908)	loss 0.5925 (0.5985)	grad_norm 2.1384 (2.3710)	mem 20675MB
[2025-04-03 00:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][80/311]	eta 0:03:25 lr 0.001121	time 0.8758 (0.8905)	loss 0.5913 (0.5980)	grad_norm 3.3422 (2.3814)	mem 20675MB
[2025-04-03 00:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][82/311]	eta 0:03:23 lr 0.001120	time 0.8759 (0.8901)	loss 0.5799 (0.5966)	grad_norm 2.4176 (2.3864)	mem 20675MB
[2025-04-03 00:58:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][84/311]	eta 0:03:21 lr 0.001120	time 0.8759 (0.8898)	loss 0.6282 (0.5976)	grad_norm 2.5129 (2.3880)	mem 20675MB
[2025-04-03 00:58:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][86/311]	eta 0:03:20 lr 0.001120	time 0.8760 (0.8895)	loss 0.5545 (0.5970)	grad_norm 3.3628 (2.3972)	mem 20675MB
[2025-04-03 00:58:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][88/311]	eta 0:03:18 lr 0.001120	time 0.8761 (0.8892)	loss 0.6274 (0.5962)	grad_norm 3.3513 (2.4136)	mem 20675MB
[2025-04-03 00:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][90/311]	eta 0:03:16 lr 0.001119	time 0.8782 (0.8890)	loss 0.6115 (0.5971)	grad_norm 2.5445 (2.4186)	mem 20675MB
[2025-04-03 00:58:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][92/311]	eta 0:03:14 lr 0.001119	time 0.8760 (0.8887)	loss 0.5865 (0.5974)	grad_norm 1.9738 (2.4115)	mem 20675MB
[2025-04-03 00:58:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][94/311]	eta 0:03:12 lr 0.001119	time 0.8758 (0.8885)	loss 0.6252 (0.5968)	grad_norm 1.5798 (2.4101)	mem 20675MB
[2025-04-03 00:58:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][96/311]	eta 0:03:10 lr 0.001119	time 0.8758 (0.8882)	loss 0.5456 (0.5968)	grad_norm 2.8979 (2.4130)	mem 20675MB
[2025-04-03 00:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][98/311]	eta 0:03:09 lr 0.001118	time 0.8760 (0.8880)	loss 0.6471 (0.5967)	grad_norm 2.1165 (2.4155)	mem 20675MB
[2025-04-03 00:58:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][100/311]	eta 0:03:07 lr 0.001118	time 0.8759 (0.8878)	loss 0.6031 (0.5960)	grad_norm 1.8332 (2.4140)	mem 20675MB
[2025-04-03 00:58:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][102/311]	eta 0:03:05 lr 0.001118	time 0.8757 (0.8876)	loss 0.5429 (0.5947)	grad_norm 3.0533 (2.4240)	mem 20675MB
[2025-04-03 00:58:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][104/311]	eta 0:03:03 lr 0.001117	time 0.8763 (0.8874)	loss 0.5588 (0.5948)	grad_norm 2.2405 (2.4235)	mem 20675MB
[2025-04-03 00:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][106/311]	eta 0:03:01 lr 0.001117	time 0.8758 (0.8872)	loss 0.6454 (0.5959)	grad_norm 2.6318 (2.4337)	mem 20675MB
[2025-04-03 00:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][108/311]	eta 0:03:00 lr 0.001117	time 0.8757 (0.8870)	loss 0.5859 (0.5961)	grad_norm 2.5715 (2.4476)	mem 20675MB
[2025-04-03 00:58:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][110/311]	eta 0:02:58 lr 0.001117	time 0.8758 (0.8868)	loss 0.6813 (0.5973)	grad_norm 2.0672 (2.4378)	mem 20675MB
[2025-04-03 00:58:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][112/311]	eta 0:02:56 lr 0.001116	time 0.8760 (0.8866)	loss 0.5360 (0.5972)	grad_norm 2.2999 (2.4331)	mem 20675MB
[2025-04-03 00:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][114/311]	eta 0:02:54 lr 0.001116	time 0.8758 (0.8864)	loss 0.4996 (0.5956)	grad_norm 2.2600 (2.4290)	mem 20675MB
[2025-04-03 00:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][116/311]	eta 0:02:52 lr 0.001116	time 0.8760 (0.8863)	loss 0.6484 (0.5954)	grad_norm 1.5911 (2.4222)	mem 20675MB
[2025-04-03 00:58:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][118/311]	eta 0:02:51 lr 0.001116	time 0.8760 (0.8861)	loss 0.6444 (0.5953)	grad_norm 2.6904 (2.4301)	mem 20675MB
[2025-04-03 00:58:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][120/311]	eta 0:02:49 lr 0.001115	time 0.8757 (0.8860)	loss 0.5543 (0.5945)	grad_norm 3.8613 (2.4488)	mem 20675MB
[2025-04-03 00:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][122/311]	eta 0:02:47 lr 0.001115	time 0.8762 (0.8858)	loss 0.5847 (0.5949)	grad_norm 2.6010 (2.4518)	mem 20675MB
[2025-04-03 00:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][124/311]	eta 0:02:45 lr 0.001115	time 0.8764 (0.8857)	loss 0.6033 (0.5953)	grad_norm 3.3870 (2.4618)	mem 20675MB
[2025-04-03 00:58:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][126/311]	eta 0:02:43 lr 0.001115	time 0.8756 (0.8855)	loss 0.5582 (0.5950)	grad_norm 3.1954 (2.4670)	mem 20675MB
[2025-04-03 00:58:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][128/311]	eta 0:02:42 lr 0.001114	time 0.8760 (0.8854)	loss 0.5225 (0.5950)	grad_norm 2.1140 (2.4655)	mem 20675MB
[2025-04-03 00:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][130/311]	eta 0:02:40 lr 0.001114	time 0.8761 (0.8853)	loss 0.6184 (0.5958)	grad_norm 1.9878 (2.4617)	mem 20675MB
[2025-04-03 00:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][132/311]	eta 0:02:38 lr 0.001114	time 0.8757 (0.8851)	loss 0.6329 (0.5956)	grad_norm 1.5805 (2.4551)	mem 20675MB
[2025-04-03 00:58:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][134/311]	eta 0:02:36 lr 0.001114	time 0.8757 (0.8850)	loss 0.6407 (0.5961)	grad_norm 1.3093 (2.4418)	mem 20675MB
[2025-04-03 00:58:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][136/311]	eta 0:02:34 lr 0.001113	time 0.8758 (0.8849)	loss 0.6356 (0.5963)	grad_norm 1.8199 (2.4313)	mem 20675MB
[2025-04-03 00:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][138/311]	eta 0:02:33 lr 0.001113	time 0.8761 (0.8848)	loss 0.6604 (0.5971)	grad_norm 2.5427 (2.4259)	mem 20675MB
[2025-04-03 00:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][140/311]	eta 0:02:31 lr 0.001113	time 0.8759 (0.8847)	loss 0.6449 (0.5978)	grad_norm 1.6912 (2.4174)	mem 20675MB
[2025-04-03 00:59:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][142/311]	eta 0:02:29 lr 0.001113	time 0.8760 (0.8845)	loss 0.5948 (0.5980)	grad_norm 1.7585 (2.4073)	mem 20675MB
[2025-04-03 00:59:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][144/311]	eta 0:02:27 lr 0.001112	time 0.8760 (0.8844)	loss 0.6416 (0.5982)	grad_norm 2.4671 (2.4065)	mem 20675MB
[2025-04-03 00:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][146/311]	eta 0:02:25 lr 0.001112	time 0.8757 (0.8843)	loss 0.5727 (0.5982)	grad_norm 3.1478 (2.4078)	mem 20675MB
[2025-04-03 00:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][148/311]	eta 0:02:24 lr 0.001112	time 0.8760 (0.8842)	loss 0.5635 (0.5981)	grad_norm 2.9518 (2.4073)	mem 20675MB
[2025-04-03 00:59:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][150/311]	eta 0:02:22 lr 0.001111	time 0.8758 (0.8841)	loss 0.6604 (0.5981)	grad_norm 2.1566 (2.4042)	mem 20675MB
[2025-04-03 00:59:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][152/311]	eta 0:02:20 lr 0.001111	time 0.8761 (0.8840)	loss 0.5649 (0.5981)	grad_norm 2.0174 (2.4079)	mem 20675MB
[2025-04-03 00:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][154/311]	eta 0:02:18 lr 0.001111	time 0.8760 (0.8839)	loss 0.6213 (0.5988)	grad_norm 4.8119 (2.4238)	mem 20675MB
[2025-04-03 00:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][156/311]	eta 0:02:16 lr 0.001111	time 0.8773 (0.8839)	loss 0.6557 (0.5995)	grad_norm 1.8369 (2.4209)	mem 20675MB
[2025-04-03 00:59:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][158/311]	eta 0:02:15 lr 0.001110	time 0.8758 (0.8838)	loss 0.6181 (0.5992)	grad_norm 1.3801 (2.4186)	mem 20675MB
[2025-04-03 00:59:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][160/311]	eta 0:02:13 lr 0.001110	time 0.8758 (0.8837)	loss 0.6466 (0.5996)	grad_norm 1.8573 (2.4116)	mem 20675MB
[2025-04-03 00:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][162/311]	eta 0:02:11 lr 0.001110	time 0.8758 (0.8836)	loss 0.6154 (0.5996)	grad_norm 2.3564 (2.4082)	mem 20675MB
[2025-04-03 00:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][164/311]	eta 0:02:09 lr 0.001110	time 0.8758 (0.8835)	loss 0.6281 (0.5999)	grad_norm 2.0533 (2.4085)	mem 20675MB
[2025-04-03 00:59:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][166/311]	eta 0:02:08 lr 0.001109	time 0.8758 (0.8834)	loss 0.6044 (0.6002)	grad_norm 2.1120 (2.4057)	mem 20675MB
[2025-04-03 00:59:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][168/311]	eta 0:02:06 lr 0.001109	time 0.8761 (0.8833)	loss 0.5482 (0.5996)	grad_norm 3.1590 (2.4109)	mem 20675MB
[2025-04-03 00:59:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][170/311]	eta 0:02:04 lr 0.001109	time 0.8761 (0.8833)	loss 0.6516 (0.6002)	grad_norm 2.3230 (2.4084)	mem 20675MB
[2025-04-03 00:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][172/311]	eta 0:02:02 lr 0.001109	time 0.8760 (0.8832)	loss 0.6491 (0.6003)	grad_norm 2.7101 (2.4096)	mem 20675MB
[2025-04-03 00:59:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][174/311]	eta 0:02:00 lr 0.001108	time 0.8758 (0.8831)	loss 0.6283 (0.6007)	grad_norm 2.4724 (2.4067)	mem 20675MB
[2025-04-03 00:59:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][176/311]	eta 0:01:59 lr 0.001108	time 0.8761 (0.8830)	loss 0.5835 (0.6005)	grad_norm 1.5451 (2.3977)	mem 20675MB
[2025-04-03 00:59:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][178/311]	eta 0:01:57 lr 0.001108	time 0.8758 (0.8830)	loss 0.6075 (0.6002)	grad_norm 1.5144 (2.3954)	mem 20675MB
[2025-04-03 00:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][180/311]	eta 0:01:55 lr 0.001107	time 0.8769 (0.8829)	loss 0.6219 (0.6001)	grad_norm 1.8898 (2.3898)	mem 20675MB
[2025-04-03 00:59:38 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][182/311]	eta 0:01:53 lr 0.001107	time 0.8758 (0.8828)	loss 0.5834 (0.6001)	grad_norm 1.6459 (2.3830)	mem 20675MB
[2025-04-03 00:59:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][184/311]	eta 0:01:52 lr 0.001107	time 0.8764 (0.8828)	loss 0.5721 (0.5999)	grad_norm 2.7923 (2.3818)	mem 20675MB
[2025-04-03 00:59:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][186/311]	eta 0:01:50 lr 0.001107	time 0.8761 (0.8827)	loss 0.6442 (0.6005)	grad_norm 2.6317 (2.3897)	mem 20675MB
[2025-04-03 00:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][188/311]	eta 0:01:48 lr 0.001106	time 0.8762 (0.8827)	loss 0.5510 (0.6003)	grad_norm 2.8993 (2.4031)	mem 20675MB
[2025-04-03 00:59:45 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][190/311]	eta 0:01:46 lr 0.001106	time 0.8763 (0.8826)	loss 0.5136 (0.5996)	grad_norm 2.1186 (2.4036)	mem 20675MB
[2025-04-03 00:59:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][192/311]	eta 0:01:45 lr 0.001106	time 0.8761 (0.8825)	loss 0.6613 (0.5995)	grad_norm 2.9128 (2.4070)	mem 20675MB
[2025-04-03 00:59:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][194/311]	eta 0:01:43 lr 0.001106	time 0.8758 (0.8825)	loss 0.6336 (0.5995)	grad_norm 2.4153 (2.4061)	mem 20675MB
[2025-04-03 00:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][196/311]	eta 0:01:41 lr 0.001105	time 0.8760 (0.8824)	loss 0.5910 (0.5991)	grad_norm 1.9736 (2.4057)	mem 20675MB
[2025-04-03 00:59:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][198/311]	eta 0:01:39 lr 0.001105	time 0.8761 (0.8824)	loss 0.6174 (0.5996)	grad_norm 1.8512 (2.4024)	mem 20675MB
[2025-04-03 00:59:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][200/311]	eta 0:01:37 lr 0.001105	time 0.8762 (0.8823)	loss 0.5312 (0.5992)	grad_norm 3.1015 (2.4056)	mem 20675MB
[2025-04-03 00:59:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][202/311]	eta 0:01:36 lr 0.001105	time 0.8761 (0.8823)	loss 0.5917 (0.5993)	grad_norm 2.2643 (2.4036)	mem 20675MB
[2025-04-03 00:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][204/311]	eta 0:01:34 lr 0.001104	time 0.8759 (0.8822)	loss 0.6464 (0.5997)	grad_norm 2.0095 (2.4007)	mem 20675MB
[2025-04-03 01:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][206/311]	eta 0:01:32 lr 0.001104	time 0.8761 (0.8822)	loss 0.6055 (0.6001)	grad_norm 1.8484 (2.3952)	mem 20675MB
[2025-04-03 01:00:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][208/311]	eta 0:01:30 lr 0.001104	time 0.8761 (0.8821)	loss 0.5863 (0.6000)	grad_norm 1.7968 (2.3888)	mem 20675MB
[2025-04-03 01:00:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][210/311]	eta 0:01:29 lr 0.001103	time 0.8760 (0.8821)	loss 0.5899 (0.6001)	grad_norm 2.1722 (2.3846)	mem 20675MB
[2025-04-03 01:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][212/311]	eta 0:01:27 lr 0.001103	time 0.8760 (0.8820)	loss 0.6161 (0.6001)	grad_norm 2.2562 (2.3821)	mem 20675MB
[2025-04-03 01:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][214/311]	eta 0:01:25 lr 0.001103	time 0.8788 (0.8820)	loss 0.5511 (0.5998)	grad_norm 2.8995 (2.3864)	mem 20675MB
[2025-04-03 01:00:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][216/311]	eta 0:01:23 lr 0.001103	time 0.8762 (0.8819)	loss 0.6131 (0.5994)	grad_norm 3.6643 (2.3958)	mem 20675MB
[2025-04-03 01:00:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][218/311]	eta 0:01:22 lr 0.001102	time 0.8762 (0.8819)	loss 0.6733 (0.6000)	grad_norm 3.1717 (2.4025)	mem 20675MB
[2025-04-03 01:00:12 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][220/311]	eta 0:01:20 lr 0.001102	time 0.8758 (0.8818)	loss 0.5967 (0.6000)	grad_norm 2.2269 (2.3988)	mem 20675MB
[2025-04-03 01:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][222/311]	eta 0:01:18 lr 0.001102	time 0.8758 (0.8818)	loss 0.6956 (0.6005)	grad_norm 2.1600 (2.3963)	mem 20675MB
[2025-04-03 01:00:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][224/311]	eta 0:01:16 lr 0.001102	time 0.8759 (0.8817)	loss 0.6458 (0.6007)	grad_norm 1.5326 (2.3892)	mem 20675MB
[2025-04-03 01:00:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][226/311]	eta 0:01:14 lr 0.001101	time 0.8767 (0.8817)	loss 0.6437 (0.6009)	grad_norm 1.8698 (2.3844)	mem 20675MB
[2025-04-03 01:00:19 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][228/311]	eta 0:01:13 lr 0.001101	time 0.8760 (0.8817)	loss 0.6807 (0.6009)	grad_norm 1.5854 (2.3797)	mem 20675MB
[2025-04-03 01:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][230/311]	eta 0:01:11 lr 0.001101	time 0.8761 (0.8816)	loss 0.5746 (0.6009)	grad_norm 1.8068 (2.3723)	mem 20675MB
[2025-04-03 01:00:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][232/311]	eta 0:01:09 lr 0.001100	time 0.8758 (0.8816)	loss 0.6211 (0.6010)	grad_norm 1.1997 (2.3697)	mem 20675MB
[2025-04-03 01:00:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][234/311]	eta 0:01:07 lr 0.001100	time 0.8762 (0.8815)	loss 0.6102 (0.6007)	grad_norm 1.5988 (2.3651)	mem 20675MB
[2025-04-03 01:00:26 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][236/311]	eta 0:01:06 lr 0.001100	time 0.8760 (0.8815)	loss 0.6113 (0.6008)	grad_norm 2.2919 (2.3622)	mem 20675MB
[2025-04-03 01:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][238/311]	eta 0:01:04 lr 0.001100	time 0.8759 (0.8815)	loss 0.6529 (0.6010)	grad_norm 1.9112 (2.3607)	mem 20675MB
[2025-04-03 01:00:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][240/311]	eta 0:01:02 lr 0.001099	time 0.8766 (0.8814)	loss 0.6409 (0.6012)	grad_norm 3.0760 (2.3639)	mem 20675MB
[2025-04-03 01:00:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][242/311]	eta 0:01:00 lr 0.001099	time 0.8762 (0.8814)	loss 0.6363 (0.6011)	grad_norm 1.5329 (2.3664)	mem 20675MB
[2025-04-03 01:00:33 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][244/311]	eta 0:00:59 lr 0.001099	time 0.8763 (0.8814)	loss 0.6404 (0.6011)	grad_norm 2.5118 (2.3695)	mem 20675MB
[2025-04-03 01:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][246/311]	eta 0:00:57 lr 0.001099	time 0.8764 (0.8813)	loss 0.6061 (0.6008)	grad_norm 3.1291 (2.3726)	mem 20675MB
[2025-04-03 01:00:36 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][248/311]	eta 0:00:55 lr 0.001098	time 0.8780 (0.8813)	loss 0.6030 (0.6009)	grad_norm 1.3923 (2.3673)	mem 20675MB
[2025-04-03 01:00:38 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][250/311]	eta 0:00:53 lr 0.001098	time 0.8763 (0.8813)	loss 0.6264 (0.6012)	grad_norm 1.6977 (2.3623)	mem 20675MB
[2025-04-03 01:00:40 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][252/311]	eta 0:00:51 lr 0.001098	time 0.8777 (0.8812)	loss 0.6108 (0.6012)	grad_norm 2.2532 (2.3621)	mem 20675MB
[2025-04-03 01:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][254/311]	eta 0:00:50 lr 0.001097	time 0.8763 (0.8812)	loss 0.6023 (0.6010)	grad_norm 2.2217 (2.3615)	mem 20675MB
[2025-04-03 01:00:43 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][256/311]	eta 0:00:48 lr 0.001097	time 0.8761 (0.8812)	loss 0.5834 (0.6007)	grad_norm 1.9877 (2.3608)	mem 20675MB
[2025-04-03 01:00:45 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][258/311]	eta 0:00:46 lr 0.001097	time 0.8764 (0.8812)	loss 0.6550 (0.6010)	grad_norm 2.6618 (2.3604)	mem 20675MB
[2025-04-03 01:00:47 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][260/311]	eta 0:00:44 lr 0.001097	time 0.8763 (0.8811)	loss 0.5345 (0.6008)	grad_norm 2.6234 (2.3610)	mem 20675MB
[2025-04-03 01:00:49 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][262/311]	eta 0:00:43 lr 0.001096	time 0.8772 (0.8811)	loss 0.5372 (0.6004)	grad_norm 2.4051 (2.3608)	mem 20675MB
[2025-04-03 01:00:50 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][264/311]	eta 0:00:41 lr 0.001096	time 0.8765 (0.8811)	loss 0.6790 (0.6011)	grad_norm 3.4862 (2.3713)	mem 20675MB
[2025-04-03 01:00:52 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][266/311]	eta 0:00:39 lr 0.001096	time 0.8765 (0.8810)	loss 0.6439 (0.6013)	grad_norm 2.6331 (2.3719)	mem 20675MB
[2025-04-03 01:00:54 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][268/311]	eta 0:00:37 lr 0.001096	time 0.8767 (0.8810)	loss 0.6649 (0.6016)	grad_norm 2.2358 (2.3688)	mem 20675MB
[2025-04-03 01:00:56 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][270/311]	eta 0:00:36 lr 0.001095	time 0.8769 (0.8810)	loss 0.6429 (0.6019)	grad_norm 1.8400 (2.3642)	mem 20675MB
[2025-04-03 01:00:57 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][272/311]	eta 0:00:34 lr 0.001095	time 0.8769 (0.8810)	loss 0.5726 (0.6018)	grad_norm 3.2321 (2.3664)	mem 20675MB
[2025-04-03 01:00:59 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][274/311]	eta 0:00:32 lr 0.001095	time 0.8766 (0.8809)	loss 0.6236 (0.6021)	grad_norm 2.3076 (2.3646)	mem 20675MB
[2025-04-03 01:01:01 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][276/311]	eta 0:00:30 lr 0.001094	time 0.8768 (0.8809)	loss 0.5849 (0.6022)	grad_norm 2.8158 (2.3672)	mem 20675MB
[2025-04-03 01:01:03 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][278/311]	eta 0:00:29 lr 0.001094	time 0.8766 (0.8809)	loss 0.6057 (0.6023)	grad_norm 1.5853 (2.3649)	mem 20675MB
[2025-04-03 01:01:04 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][280/311]	eta 0:00:27 lr 0.001094	time 0.8770 (0.8809)	loss 0.6117 (0.6025)	grad_norm 2.0754 (2.3602)	mem 20675MB
[2025-04-03 01:01:06 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][282/311]	eta 0:00:25 lr 0.001094	time 0.8763 (0.8808)	loss 0.5742 (0.6026)	grad_norm 2.4041 (2.3570)	mem 20675MB
[2025-04-03 01:01:08 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][284/311]	eta 0:00:23 lr 0.001093	time 0.8765 (0.8808)	loss 0.6134 (0.6025)	grad_norm 1.8647 (2.3558)	mem 20675MB
[2025-04-03 01:01:10 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][286/311]	eta 0:00:22 lr 0.001093	time 0.8764 (0.8808)	loss 0.5955 (0.6025)	grad_norm 1.9056 (2.3534)	mem 20675MB
[2025-04-03 01:01:11 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][288/311]	eta 0:00:20 lr 0.001093	time 0.8762 (0.8808)	loss 0.6410 (0.6028)	grad_norm 2.6593 (2.3534)	mem 20675MB
[2025-04-03 01:01:13 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][290/311]	eta 0:00:18 lr 0.001092	time 0.8766 (0.8808)	loss 0.6443 (0.6030)	grad_norm 1.7288 (2.3505)	mem 20675MB
[2025-04-03 01:01:15 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][292/311]	eta 0:00:16 lr 0.001092	time 0.8762 (0.8807)	loss 0.5599 (0.6030)	grad_norm 3.2636 (2.3511)	mem 20675MB
[2025-04-03 01:01:17 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][294/311]	eta 0:00:14 lr 0.001092	time 0.8762 (0.8807)	loss 0.6408 (0.6028)	grad_norm 2.2880 (2.3531)	mem 20675MB
[2025-04-03 01:01:18 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][296/311]	eta 0:00:13 lr 0.001092	time 0.8762 (0.8807)	loss 0.6476 (0.6029)	grad_norm 2.4221 (2.3529)	mem 20675MB
[2025-04-03 01:01:20 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][298/311]	eta 0:00:11 lr 0.001091	time 0.8763 (0.8807)	loss 0.6123 (0.6031)	grad_norm 2.5893 (2.3555)	mem 20675MB
[2025-04-03 01:01:22 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][300/311]	eta 0:00:09 lr 0.001091	time 0.8762 (0.8806)	loss 0.6612 (0.6034)	grad_norm 1.8679 (2.3518)	mem 20675MB
[2025-04-03 01:01:24 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][302/311]	eta 0:00:07 lr 0.001091	time 0.8760 (0.8806)	loss 0.5384 (0.6032)	grad_norm 2.2675 (2.3511)	mem 20675MB
[2025-04-03 01:01:25 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][304/311]	eta 0:00:06 lr 0.001090	time 0.8768 (0.8806)	loss 0.6258 (0.6034)	grad_norm 2.0952 (2.3470)	mem 20675MB
[2025-04-03 01:01:27 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][306/311]	eta 0:00:04 lr 0.001090	time 0.8763 (0.8806)	loss 0.5261 (0.6030)	grad_norm 2.6491 (2.3478)	mem 20675MB
[2025-04-03 01:01:29 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][308/311]	eta 0:00:02 lr 0.001090	time 0.8765 (0.8805)	loss 0.5264 (0.6028)	grad_norm 2.6536 (2.3470)	mem 20675MB
[2025-04-03 01:01:31 simmim_finetune] (main_finetune.py 252): INFO Train: [6/30][310/311]	eta 0:00:00 lr 0.001090	time 0.8755 (0.8805)	loss 0.6312 (0.6029)	grad_norm 1.8901 (2.3461)	mem 20675MB
[2025-04-03 01:01:31 simmim_finetune] (main_finetune.py 260): INFO EPOCH 6 training takes 0:04:33
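The epoch duration reported above is consistent with the timing fields in the iteration lines: the value in parentheses after `time` is a running average per iteration, so multiplying it by the number of iterations per epoch should reproduce the reported wall time. A minimal sanity check (plain arithmetic, not code from `main_finetune.py`):

```python
import datetime

# The parenthesized "time" value is the running-average iteration time, so
# (iterations per epoch) x (average time) should match the epoch duration.
iters_per_epoch = 311
avg_iter_time = 0.8805  # averaged time from the last iteration line of epoch 6

elapsed = datetime.timedelta(seconds=int(iters_per_epoch * avg_iter_time))
# str(elapsed) -> "0:04:33", matching "EPOCH 6 training takes 0:04:33"
```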
[2025-04-03 01:01:32 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.371 (1.371)	Loss 0.5761 (0.5761)	Acc@1 72.656 (72.656)	Mem 20675MB
[2025-04-03 01:01:32 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 74.648
[2025-04-03 01:01:32 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 74.6%
[2025-04-03 01:01:32 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 74.65%
[2025-04-03 01:01:32 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.060577598894948e-06, 4.060577598894948e-06, 6.229769292997334e-06, 6.229769292997334e-06, 9.56698728392408e-06, 9.56698728392408e-06, 1.4701168808426768e-05, 1.4701168808426768e-05, 2.2599909615353978e-05, 2.2599909615353978e-05, 3.475181854908815e-05, 3.475181854908815e-05, 5.344706306252533e-05, 5.344706306252533e-05, 8.220897769858252e-05, 8.220897769858252e-05, 0.00012645807713867053, 0.00012645807713867053, 0.00019453361473880594, 0.00019453361473880594, 0.0002992652110467065, 0.0002992652110467065, 0.000460390743828092, 0.000460390743828092, 0.0007082761788763774, 0.0007082761788763774, 0.0010896383866429702, 0.0010896383866429702]
[2025-04-03 01:01:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][0/311]	eta 0:10:56 lr 0.001089	time 2.1097 (2.1097)	loss 0.6158 (0.6158)	grad_norm 2.3869 (2.3869)	mem 20675MB
[2025-04-03 01:01:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][2/311]	eta 0:06:38 lr 0.001089	time 0.8759 (1.2880)	loss 0.6127 (0.6050)	grad_norm 2.1356 (2.3591)	mem 20675MB
[2025-04-03 01:01:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][4/311]	eta 0:05:44 lr 0.001089	time 0.8761 (1.1237)	loss 0.6494 (0.6068)	grad_norm 1.7611 (2.3746)	mem 20675MB
[2025-04-03 01:01:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][6/311]	eta 0:05:21 lr 0.001089	time 0.8768 (1.0533)	loss 0.5111 (0.5847)	grad_norm 3.3168 (2.5014)	mem 20675MB
[2025-04-03 01:01:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][8/311]	eta 0:05:07 lr 0.001088	time 0.8764 (1.0141)	loss 0.6574 (0.5987)	grad_norm 1.9500 (2.3829)	mem 20675MB
[2025-04-03 01:01:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][10/311]	eta 0:04:57 lr 0.001088	time 0.8756 (0.9892)	loss 0.6399 (0.6067)	grad_norm 2.3478 (2.3356)	mem 20675MB
[2025-04-03 01:01:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][12/311]	eta 0:04:50 lr 0.001088	time 0.8761 (0.9719)	loss 0.6476 (0.6142)	grad_norm 1.7903 (2.2567)	mem 20675MB
[2025-04-03 01:01:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][14/311]	eta 0:04:44 lr 0.001088	time 0.8755 (0.9591)	loss 0.5663 (0.6134)	grad_norm 1.7052 (2.2274)	mem 20675MB
[2025-04-03 01:01:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][16/311]	eta 0:04:40 lr 0.001087	time 0.8757 (0.9494)	loss 0.6142 (0.6162)	grad_norm 1.8255 (2.2154)	mem 20675MB
[2025-04-03 01:01:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][18/311]	eta 0:04:35 lr 0.001087	time 0.8758 (0.9418)	loss 0.5485 (0.6095)	grad_norm 2.7173 (2.2211)	mem 20675MB
[2025-04-03 01:01:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][20/311]	eta 0:04:32 lr 0.001087	time 0.8757 (0.9356)	loss 0.6532 (0.6096)	grad_norm 1.8335 (2.1857)	mem 20675MB
[2025-04-03 01:01:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][22/311]	eta 0:04:28 lr 0.001086	time 0.8757 (0.9305)	loss 0.5971 (0.6043)	grad_norm 3.2196 (2.2662)	mem 20675MB
[2025-04-03 01:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][24/311]	eta 0:04:25 lr 0.001086	time 0.8755 (0.9261)	loss 0.6029 (0.6069)	grad_norm 2.3721 (2.2909)	mem 20675MB
[2025-04-03 01:01:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][26/311]	eta 0:04:22 lr 0.001086	time 0.8758 (0.9225)	loss 0.6364 (0.6065)	grad_norm 2.6283 (2.3007)	mem 20675MB
[2025-04-03 01:01:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][28/311]	eta 0:04:20 lr 0.001086	time 0.8755 (0.9193)	loss 0.5541 (0.6030)	grad_norm 2.4418 (2.3356)	mem 20675MB
[2025-04-03 01:02:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][30/311]	eta 0:04:17 lr 0.001085	time 0.8759 (0.9165)	loss 0.5510 (0.6021)	grad_norm 3.1448 (2.3641)	mem 20675MB
[2025-04-03 01:02:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][32/311]	eta 0:04:15 lr 0.001085	time 0.8763 (0.9142)	loss 0.6122 (0.6016)	grad_norm 2.5779 (2.3548)	mem 20675MB
[2025-04-03 01:02:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][34/311]	eta 0:04:12 lr 0.001085	time 0.8763 (0.9121)	loss 0.6650 (0.6026)	grad_norm 2.5780 (2.3847)	mem 20675MB
[2025-04-03 01:02:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][36/311]	eta 0:04:10 lr 0.001084	time 0.8766 (0.9102)	loss 0.6105 (0.6021)	grad_norm 1.5574 (2.3477)	mem 20675MB
[2025-04-03 01:02:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][38/311]	eta 0:04:08 lr 0.001084	time 0.8756 (0.9085)	loss 0.6219 (0.6034)	grad_norm 2.7978 (2.3355)	mem 20675MB
[2025-04-03 01:02:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][40/311]	eta 0:04:05 lr 0.001084	time 0.8767 (0.9070)	loss 0.6145 (0.6042)	grad_norm 2.2747 (2.3122)	mem 20675MB
[2025-04-03 01:02:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][42/311]	eta 0:04:03 lr 0.001084	time 0.8778 (0.9056)	loss 0.6522 (0.6048)	grad_norm 2.7778 (2.3805)	mem 20675MB
[2025-04-03 01:02:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][44/311]	eta 0:04:01 lr 0.001083	time 0.8769 (0.9044)	loss 0.5882 (0.6031)	grad_norm 2.8674 (2.3863)	mem 20675MB
[2025-04-03 01:02:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][46/311]	eta 0:03:59 lr 0.001083	time 0.8758 (0.9033)	loss 0.6256 (0.6024)	grad_norm 3.3897 (2.4031)	mem 20675MB
[2025-04-03 01:02:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][48/311]	eta 0:03:57 lr 0.001083	time 0.8770 (0.9022)	loss 0.6756 (0.6016)	grad_norm 2.9179 (2.4492)	mem 20675MB
[2025-04-03 01:02:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][50/311]	eta 0:03:55 lr 0.001082	time 0.8782 (0.9013)	loss 0.5832 (0.6015)	grad_norm 2.2306 (2.4456)	mem 20675MB
[2025-04-03 01:02:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][52/311]	eta 0:03:53 lr 0.001082	time 0.8775 (0.9005)	loss 0.6205 (0.6018)	grad_norm 1.9203 (2.4983)	mem 20675MB
[2025-04-03 01:02:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][54/311]	eta 0:03:51 lr 0.001082	time 0.8774 (0.8996)	loss 0.4953 (0.5992)	grad_norm 2.9616 (2.5141)	mem 20675MB
[2025-04-03 01:02:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][56/311]	eta 0:03:49 lr 0.001082	time 0.8776 (0.8989)	loss 0.5290 (0.5982)	grad_norm 2.8868 (2.5299)	mem 20675MB
[2025-04-03 01:02:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][58/311]	eta 0:03:47 lr 0.001081	time 0.8785 (0.8983)	loss 0.6674 (0.5998)	grad_norm 1.9389 (2.5271)	mem 20675MB
[2025-04-03 01:02:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][60/311]	eta 0:03:45 lr 0.001081	time 0.8787 (0.8976)	loss 0.5099 (0.5984)	grad_norm 2.2748 (2.5317)	mem 20675MB
[2025-04-03 01:02:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][62/311]	eta 0:03:43 lr 0.001081	time 0.8774 (0.8970)	loss 0.6548 (0.5991)	grad_norm 1.8537 (2.5174)	mem 20675MB
[2025-04-03 01:02:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][64/311]	eta 0:03:41 lr 0.001080	time 0.8766 (0.8964)	loss 0.6385 (0.5999)	grad_norm 2.1370 (2.5149)	mem 20675MB
[2025-04-03 01:02:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][66/311]	eta 0:03:39 lr 0.001080	time 0.8760 (0.8959)	loss 0.5835 (0.5997)	grad_norm 2.1941 (2.5109)	mem 20675MB
[2025-04-03 01:02:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][68/311]	eta 0:03:37 lr 0.001080	time 0.8774 (0.8953)	loss 0.6580 (0.5994)	grad_norm 2.0257 (2.5023)	mem 20675MB
[2025-04-03 01:02:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][70/311]	eta 0:03:35 lr 0.001080	time 0.8767 (0.8948)	loss 0.5042 (0.5982)	grad_norm 3.4444 (2.5082)	mem 20675MB
[2025-04-03 01:02:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][72/311]	eta 0:03:33 lr 0.001079	time 0.8769 (0.8944)	loss 0.5732 (0.5971)	grad_norm 2.4856 (2.5088)	mem 20675MB
[2025-04-03 01:02:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][74/311]	eta 0:03:31 lr 0.001079	time 0.8771 (0.8939)	loss 0.6360 (0.5982)	grad_norm 2.1740 (2.5008)	mem 20675MB
[2025-04-03 01:02:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][76/311]	eta 0:03:29 lr 0.001079	time 0.8764 (0.8935)	loss 0.5961 (0.5979)	grad_norm 2.9637 (2.5128)	mem 20675MB
[2025-04-03 01:02:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][78/311]	eta 0:03:28 lr 0.001078	time 0.8767 (0.8931)	loss 0.6324 (0.5972)	grad_norm 3.2770 (2.5262)	mem 20675MB
[2025-04-03 01:02:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][80/311]	eta 0:03:26 lr 0.001078	time 0.8785 (0.8927)	loss 0.5253 (0.5968)	grad_norm 3.0629 (2.5270)	mem 20675MB
[2025-04-03 01:02:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][82/311]	eta 0:03:24 lr 0.001078	time 0.8766 (0.8924)	loss 0.5606 (0.5957)	grad_norm 2.8259 (2.5321)	mem 20675MB
[2025-04-03 01:02:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][84/311]	eta 0:03:22 lr 0.001077	time 0.8785 (0.8921)	loss 0.5856 (0.5962)	grad_norm 1.9358 (2.5450)	mem 20675MB
[2025-04-03 01:02:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][86/311]	eta 0:03:20 lr 0.001077	time 0.8766 (0.8918)	loss 0.6029 (0.5958)	grad_norm 1.9312 (2.5337)	mem 20675MB
[2025-04-03 01:02:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][88/311]	eta 0:03:18 lr 0.001077	time 0.8756 (0.8914)	loss 0.6515 (0.5971)	grad_norm 2.2238 (2.5313)	mem 20675MB
[2025-04-03 01:02:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][90/311]	eta 0:03:16 lr 0.001077	time 0.8758 (0.8911)	loss 0.6610 (0.5977)	grad_norm 2.7932 (2.5421)	mem 20675MB
[2025-04-03 01:02:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][92/311]	eta 0:03:15 lr 0.001076	time 0.8767 (0.8908)	loss 0.5473 (0.5971)	grad_norm 2.2912 (2.5423)	mem 20675MB
[2025-04-03 01:02:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][94/311]	eta 0:03:13 lr 0.001076	time 0.8763 (0.8905)	loss 0.6641 (0.5975)	grad_norm 2.9552 (2.5411)	mem 20675MB
[2025-04-03 01:02:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][96/311]	eta 0:03:11 lr 0.001076	time 0.8764 (0.8902)	loss 0.6725 (0.5987)	grad_norm 3.0349 (2.5439)	mem 20675MB
[2025-04-03 01:03:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][98/311]	eta 0:03:09 lr 0.001075	time 0.8761 (0.8900)	loss 0.6242 (0.5994)	grad_norm 1.4841 (2.5277)	mem 20675MB
[2025-04-03 01:03:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][100/311]	eta 0:03:07 lr 0.001075	time 0.8755 (0.8897)	loss 0.5903 (0.5992)	grad_norm 1.5867 (2.5073)	mem 20675MB
[2025-04-03 01:03:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][102/311]	eta 0:03:05 lr 0.001075	time 0.8758 (0.8894)	loss 0.6038 (0.5987)	grad_norm 1.7860 (2.5018)	mem 20675MB
[2025-04-03 01:03:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][104/311]	eta 0:03:04 lr 0.001075	time 0.8758 (0.8892)	loss 0.6044 (0.5989)	grad_norm 1.2829 (2.4831)	mem 20675MB
[2025-04-03 01:03:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][106/311]	eta 0:03:02 lr 0.001074	time 0.8756 (0.8890)	loss 0.5550 (0.5981)	grad_norm 2.7001 (2.4865)	mem 20675MB
[2025-04-03 01:03:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][108/311]	eta 0:03:00 lr 0.001074	time 0.8757 (0.8887)	loss 0.6331 (0.5976)	grad_norm 2.1383 (2.4898)	mem 20675MB
[2025-04-03 01:03:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][110/311]	eta 0:02:58 lr 0.001074	time 0.8756 (0.8885)	loss 0.5117 (0.5966)	grad_norm 3.2815 (2.4974)	mem 20675MB
[2025-04-03 01:03:13 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][112/311]	eta 0:02:56 lr 0.001073	time 0.8757 (0.8883)	loss 0.6247 (0.5960)	grad_norm 3.2840 (2.5152)	mem 20675MB
[2025-04-03 01:03:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][114/311]	eta 0:02:54 lr 0.001073	time 0.8758 (0.8881)	loss 0.6953 (0.5978)	grad_norm 3.2597 (2.5282)	mem 20675MB
[2025-04-03 01:03:16 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][116/311]	eta 0:02:53 lr 0.001073	time 0.8755 (0.8879)	loss 0.6152 (0.5978)	grad_norm 2.8824 (2.5581)	mem 20675MB
[2025-04-03 01:03:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][118/311]	eta 0:02:51 lr 0.001073	time 0.8756 (0.8877)	loss 0.6219 (0.5975)	grad_norm 2.1020 (2.5619)	mem 20675MB
[2025-04-03 01:03:20 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][120/311]	eta 0:02:49 lr 0.001072	time 0.8756 (0.8875)	loss 0.5938 (0.5979)	grad_norm 1.7605 (2.5471)	mem 20675MB
[2025-04-03 01:03:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][122/311]	eta 0:02:47 lr 0.001072	time 0.8763 (0.8874)	loss 0.6003 (0.5979)	grad_norm 3.3727 (2.5515)	mem 20675MB
[2025-04-03 01:03:23 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][124/311]	eta 0:02:45 lr 0.001072	time 0.8765 (0.8872)	loss 0.5506 (0.5974)	grad_norm 2.8962 (2.5516)	mem 20675MB
[2025-04-03 01:03:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][126/311]	eta 0:02:44 lr 0.001071	time 0.8755 (0.8870)	loss 0.5954 (0.5978)	grad_norm 2.7735 (2.5494)	mem 20675MB
[2025-04-03 01:03:27 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][128/311]	eta 0:02:42 lr 0.001071	time 0.8754 (0.8869)	loss 0.5869 (0.5973)	grad_norm 1.8775 (2.5470)	mem 20675MB
[2025-04-03 01:03:29 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][130/311]	eta 0:02:40 lr 0.001071	time 0.8755 (0.8867)	loss 0.5863 (0.5970)	grad_norm 1.8251 (2.5394)	mem 20675MB
[2025-04-03 01:03:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][132/311]	eta 0:02:38 lr 0.001070	time 0.8758 (0.8866)	loss 0.4983 (0.5965)	grad_norm 3.0102 (2.5403)	mem 20675MB
[2025-04-03 01:03:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][134/311]	eta 0:02:36 lr 0.001070	time 0.8759 (0.8864)	loss 0.6374 (0.5962)	grad_norm 2.5364 (2.5442)	mem 20675MB
[2025-04-03 01:03:34 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][136/311]	eta 0:02:35 lr 0.001070	time 0.8759 (0.8863)	loss 0.6487 (0.5969)	grad_norm 3.3161 (2.5498)	mem 20675MB
[2025-04-03 01:03:36 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][138/311]	eta 0:02:33 lr 0.001070	time 0.8758 (0.8861)	loss 0.5533 (0.5968)	grad_norm 2.0403 (2.5512)	mem 20675MB
[2025-04-03 01:03:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][140/311]	eta 0:02:31 lr 0.001069	time 0.8764 (0.8860)	loss 0.5804 (0.5969)	grad_norm 2.5423 (2.5531)	mem 20675MB
[2025-04-03 01:03:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][142/311]	eta 0:02:29 lr 0.001069	time 0.8755 (0.8859)	loss 0.6398 (0.5973)	grad_norm 2.1421 (2.5484)	mem 20675MB
[2025-04-03 01:03:41 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][144/311]	eta 0:02:27 lr 0.001069	time 0.8758 (0.8858)	loss 0.6567 (0.5971)	grad_norm 1.4786 (2.5380)	mem 20675MB
[2025-04-03 01:03:43 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][146/311]	eta 0:02:26 lr 0.001068	time 0.8758 (0.8856)	loss 0.6050 (0.5976)	grad_norm 1.5689 (2.5249)	mem 20675MB
[2025-04-03 01:03:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][148/311]	eta 0:02:24 lr 0.001068	time 0.8754 (0.8855)	loss 0.6132 (0.5978)	grad_norm 2.1868 (2.5186)	mem 20675MB
[2025-04-03 01:03:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][150/311]	eta 0:02:22 lr 0.001068	time 0.8762 (0.8854)	loss 0.5750 (0.5979)	grad_norm 2.0515 (2.5109)	mem 20675MB
[2025-04-03 01:03:48 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][152/311]	eta 0:02:20 lr 0.001067	time 0.8756 (0.8853)	loss 0.5939 (0.5977)	grad_norm 2.1327 (2.5046)	mem 20675MB
[2025-04-03 01:03:50 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][154/311]	eta 0:02:18 lr 0.001067	time 0.8757 (0.8852)	loss 0.6397 (0.5982)	grad_norm 2.0286 (2.4970)	mem 20675MB
[2025-04-03 01:03:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][156/311]	eta 0:02:17 lr 0.001067	time 0.8757 (0.8851)	loss 0.6197 (0.5981)	grad_norm 2.2212 (2.4981)	mem 20675MB
[2025-04-03 01:03:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][158/311]	eta 0:02:15 lr 0.001067	time 0.8759 (0.8850)	loss 0.6379 (0.5989)	grad_norm 1.8972 (2.4910)	mem 20675MB
[2025-04-03 01:03:55 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][160/311]	eta 0:02:13 lr 0.001066	time 0.8755 (0.8848)	loss 0.5234 (0.5981)	grad_norm 2.7856 (2.4894)	mem 20675MB
[2025-04-03 01:03:57 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][162/311]	eta 0:02:11 lr 0.001066	time 0.8754 (0.8847)	loss 0.5512 (0.5981)	grad_norm 2.9009 (2.4905)	mem 20675MB
[2025-04-03 01:03:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][164/311]	eta 0:02:10 lr 0.001066	time 0.8755 (0.8846)	loss 0.6460 (0.5987)	grad_norm 2.5773 (2.4872)	mem 20675MB
[2025-04-03 01:04:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][166/311]	eta 0:02:08 lr 0.001065	time 0.8758 (0.8845)	loss 0.6272 (0.5991)	grad_norm 2.7092 (2.4862)	mem 20675MB
[2025-04-03 01:04:02 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][168/311]	eta 0:02:06 lr 0.001065	time 0.8754 (0.8845)	loss 0.6098 (0.5991)	grad_norm 2.1479 (2.4807)	mem 20675MB
[2025-04-03 01:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][170/311]	eta 0:02:04 lr 0.001065	time 0.8760 (0.8844)	loss 0.6190 (0.5997)	grad_norm 1.5834 (2.4723)	mem 20675MB
[2025-04-03 01:04:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][172/311]	eta 0:02:02 lr 0.001065	time 0.8756 (0.8843)	loss 0.6492 (0.5998)	grad_norm 1.6872 (2.4678)	mem 20675MB
[2025-04-03 01:04:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][174/311]	eta 0:02:01 lr 0.001064	time 0.8758 (0.8842)	loss 0.5626 (0.5994)	grad_norm 2.1790 (2.4691)	mem 20675MB
[2025-04-03 01:04:09 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][176/311]	eta 0:01:59 lr 0.001064	time 0.8753 (0.8841)	loss 0.6095 (0.5997)	grad_norm 1.8772 (2.4595)	mem 20675MB
[2025-04-03 01:04:11 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][178/311]	eta 0:01:57 lr 0.001064	time 0.8757 (0.8840)	loss 0.5523 (0.5992)	grad_norm 2.9886 (2.4628)	mem 20675MB
[2025-04-03 01:04:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][180/311]	eta 0:01:55 lr 0.001063	time 0.8758 (0.8840)	loss 0.6649 (0.5997)	grad_norm 3.1648 (2.4657)	mem 20675MB
[2025-04-03 01:04:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][182/311]	eta 0:01:54 lr 0.001063	time 0.8763 (0.8839)	loss 0.5053 (0.5987)	grad_norm 3.0622 (2.4712)	mem 20675MB
[2025-04-03 01:04:16 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][184/311]	eta 0:01:52 lr 0.001063	time 0.8758 (0.8838)	loss 0.5759 (0.5987)	grad_norm 2.0681 (2.4647)	mem 20675MB
[2025-04-03 01:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][186/311]	eta 0:01:50 lr 0.001062	time 0.8758 (0.8837)	loss 0.6676 (0.5991)	grad_norm 2.7716 (2.4701)	mem 20675MB
[2025-04-03 01:04:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][188/311]	eta 0:01:48 lr 0.001062	time 0.8758 (0.8837)	loss 0.6541 (0.5996)	grad_norm 2.1300 (2.4650)	mem 20675MB
[2025-04-03 01:04:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][190/311]	eta 0:01:46 lr 0.001062	time 0.8758 (0.8836)	loss 0.5662 (0.5990)	grad_norm 2.8388 (2.4685)	mem 20675MB
[2025-04-03 01:04:23 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][192/311]	eta 0:01:45 lr 0.001062	time 0.8754 (0.8835)	loss 0.6866 (0.5991)	grad_norm 1.9996 (2.4646)	mem 20675MB
[2025-04-03 01:04:25 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][194/311]	eta 0:01:43 lr 0.001061	time 0.8758 (0.8834)	loss 0.6050 (0.5992)	grad_norm 1.9389 (2.4608)	mem 20675MB
[2025-04-03 01:04:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][196/311]	eta 0:01:41 lr 0.001061	time 0.8758 (0.8834)	loss 0.6076 (0.5996)	grad_norm 1.9000 (2.4542)	mem 20675MB
[2025-04-03 01:04:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][198/311]	eta 0:01:39 lr 0.001061	time 0.8756 (0.8833)	loss 0.6094 (0.5992)	grad_norm 1.4608 (2.4494)	mem 20675MB
[2025-04-03 01:04:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][200/311]	eta 0:01:38 lr 0.001060	time 0.8754 (0.8832)	loss 0.5751 (0.5992)	grad_norm 2.3033 (2.4476)	mem 20675MB
[2025-04-03 01:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][202/311]	eta 0:01:36 lr 0.001060	time 0.8755 (0.8832)	loss 0.6366 (0.5992)	grad_norm 1.7969 (2.4404)	mem 20675MB
[2025-04-03 01:04:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][204/311]	eta 0:01:34 lr 0.001060	time 0.8753 (0.8831)	loss 0.6245 (0.5998)	grad_norm 1.7056 (2.4365)	mem 20675MB
[2025-04-03 01:04:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][206/311]	eta 0:01:32 lr 0.001059	time 0.8753 (0.8830)	loss 0.5223 (0.5993)	grad_norm 2.5590 (2.4350)	mem 20675MB
[2025-04-03 01:04:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][208/311]	eta 0:01:30 lr 0.001059	time 0.8757 (0.8830)	loss 0.6037 (0.5996)	grad_norm 1.6705 (2.4287)	mem 20675MB
[2025-04-03 01:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][210/311]	eta 0:01:29 lr 0.001059	time 0.8758 (0.8829)	loss 0.5213 (0.5991)	grad_norm 2.9840 (2.4313)	mem 20675MB
[2025-04-03 01:04:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][212/311]	eta 0:01:27 lr 0.001058	time 0.8753 (0.8829)	loss 0.5599 (0.5992)	grad_norm 2.1250 (2.4309)	mem 20675MB
[2025-04-03 01:04:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][214/311]	eta 0:01:25 lr 0.001058	time 0.8754 (0.8828)	loss 0.5401 (0.5989)	grad_norm 1.9696 (2.4294)	mem 20675MB
[2025-04-03 01:04:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][216/311]	eta 0:01:23 lr 0.001058	time 0.8757 (0.8827)	loss 0.6325 (0.5992)	grad_norm 2.4377 (2.4266)	mem 20675MB
[2025-04-03 01:04:46 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][218/311]	eta 0:01:22 lr 0.001058	time 0.8754 (0.8827)	loss 0.6299 (0.5992)	grad_norm 1.5347 (2.4217)	mem 20675MB
[2025-04-03 01:04:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][220/311]	eta 0:01:20 lr 0.001057	time 0.8756 (0.8826)	loss 0.5279 (0.5985)	grad_norm 2.6224 (2.4259)	mem 20675MB
[2025-04-03 01:04:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][222/311]	eta 0:01:18 lr 0.001057	time 0.8759 (0.8826)	loss 0.5633 (0.5984)	grad_norm 2.3139 (2.4220)	mem 20675MB
[2025-04-03 01:04:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][224/311]	eta 0:01:16 lr 0.001057	time 0.8755 (0.8825)	loss 0.6646 (0.5987)	grad_norm 3.5189 (2.4272)	mem 20675MB
[2025-04-03 01:04:53 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][226/311]	eta 0:01:15 lr 0.001056	time 0.8760 (0.8825)	loss 0.6246 (0.5991)	grad_norm 2.5290 (2.4308)	mem 20675MB
[2025-04-03 01:04:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][228/311]	eta 0:01:13 lr 0.001056	time 0.8757 (0.8824)	loss 0.6375 (0.5993)	grad_norm 2.5150 (2.4315)	mem 20675MB
[2025-04-03 01:04:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][230/311]	eta 0:01:11 lr 0.001056	time 0.8755 (0.8824)	loss 0.6401 (0.5998)	grad_norm 3.0682 (2.4389)	mem 20675MB
[2025-04-03 01:04:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][232/311]	eta 0:01:09 lr 0.001055	time 0.8754 (0.8823)	loss 0.5738 (0.5995)	grad_norm 2.4289 (2.4369)	mem 20675MB
[2025-04-03 01:05:00 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][234/311]	eta 0:01:07 lr 0.001055	time 0.8756 (0.8823)	loss 0.5450 (0.5989)	grad_norm 2.3233 (2.4397)	mem 20675MB
[2025-04-03 01:05:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][236/311]	eta 0:01:06 lr 0.001055	time 0.8774 (0.8822)	loss 0.5409 (0.5990)	grad_norm 3.1492 (2.4454)	mem 20675MB
[2025-04-03 01:05:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][238/311]	eta 0:01:04 lr 0.001055	time 0.8758 (0.8822)	loss 0.5417 (0.5987)	grad_norm 2.7731 (2.4435)	mem 20675MB
[2025-04-03 01:05:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][240/311]	eta 0:01:02 lr 0.001054	time 0.8755 (0.8821)	loss 0.5597 (0.5980)	grad_norm 2.2557 (2.4434)	mem 20675MB
[2025-04-03 01:05:07 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][242/311]	eta 0:01:00 lr 0.001054	time 0.8758 (0.8821)	loss 0.6562 (0.5981)	grad_norm 3.1681 (2.4446)	mem 20675MB
[2025-04-03 01:05:08 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][244/311]	eta 0:00:59 lr 0.001054	time 0.8757 (0.8820)	loss 0.4865 (0.5977)	grad_norm 3.2235 (2.4504)	mem 20675MB
[2025-04-03 01:05:10 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][246/311]	eta 0:00:57 lr 0.001053	time 0.8757 (0.8820)	loss 0.7045 (0.5983)	grad_norm 3.8441 (2.4583)	mem 20675MB
[2025-04-03 01:05:12 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][248/311]	eta 0:00:55 lr 0.001053	time 0.8756 (0.8819)	loss 0.4914 (0.5979)	grad_norm 4.8227 (2.4699)	mem 20675MB
[2025-04-03 01:05:14 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][250/311]	eta 0:00:53 lr 0.001053	time 0.8757 (0.8819)	loss 0.6961 (0.5984)	grad_norm 2.8398 (2.4719)	mem 20675MB
[2025-04-03 01:05:15 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][252/311]	eta 0:00:52 lr 0.001052	time 0.8754 (0.8818)	loss 0.5533 (0.5981)	grad_norm 1.5903 (2.4669)	mem 20675MB
[2025-04-03 01:05:17 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][254/311]	eta 0:00:50 lr 0.001052	time 0.8756 (0.8818)	loss 0.5725 (0.5983)	grad_norm 1.8566 (2.4621)	mem 20675MB
[2025-04-03 01:05:19 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][256/311]	eta 0:00:48 lr 0.001052	time 0.8755 (0.8818)	loss 0.6178 (0.5985)	grad_norm 1.6893 (2.4563)	mem 20675MB
[2025-04-03 01:05:21 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][258/311]	eta 0:00:46 lr 0.001051	time 0.8755 (0.8817)	loss 0.6153 (0.5985)	grad_norm 1.8983 (2.4516)	mem 20675MB
[2025-04-03 01:05:22 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][260/311]	eta 0:00:44 lr 0.001051	time 0.8755 (0.8817)	loss 0.6485 (0.5985)	grad_norm 2.1266 (2.4501)	mem 20675MB
[2025-04-03 01:05:24 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][262/311]	eta 0:00:43 lr 0.001051	time 0.8758 (0.8817)	loss 0.6365 (0.5987)	grad_norm 2.6781 (2.4467)	mem 20675MB
[2025-04-03 01:05:26 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][264/311]	eta 0:00:41 lr 0.001051	time 0.8754 (0.8816)	loss 0.5834 (0.5987)	grad_norm 2.8447 (2.4451)	mem 20675MB
[2025-04-03 01:05:28 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][266/311]	eta 0:00:39 lr 0.001050	time 0.8755 (0.8816)	loss 0.6069 (0.5987)	grad_norm 2.9935 (2.4466)	mem 20675MB
[2025-04-03 01:05:30 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][268/311]	eta 0:00:37 lr 0.001050	time 0.8758 (0.8815)	loss 0.5713 (0.5987)	grad_norm 2.2289 (2.4440)	mem 20675MB
[2025-04-03 01:05:31 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][270/311]	eta 0:00:36 lr 0.001050	time 0.8757 (0.8815)	loss 0.5972 (0.5985)	grad_norm 1.6704 (2.4459)	mem 20675MB
[2025-04-03 01:05:33 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][272/311]	eta 0:00:34 lr 0.001049	time 0.8757 (0.8815)	loss 0.6980 (0.5990)	grad_norm 2.2877 (2.4475)	mem 20675MB
[2025-04-03 01:05:35 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][274/311]	eta 0:00:32 lr 0.001049	time 0.8756 (0.8814)	loss 0.6847 (0.5993)	grad_norm 2.3306 (2.4488)	mem 20675MB
[2025-04-03 01:05:37 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][276/311]	eta 0:00:30 lr 0.001049	time 0.8757 (0.8814)	loss 0.6146 (0.5995)	grad_norm 1.6044 (2.4467)	mem 20675MB
[2025-04-03 01:05:38 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][278/311]	eta 0:00:29 lr 0.001048	time 0.8760 (0.8814)	loss 0.6185 (0.5997)	grad_norm 1.4004 (2.4397)	mem 20675MB
[2025-04-03 01:05:40 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][280/311]	eta 0:00:27 lr 0.001048	time 0.8754 (0.8813)	loss 0.5392 (0.5993)	grad_norm 2.0558 (2.4372)	mem 20675MB
[2025-04-03 01:05:42 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][282/311]	eta 0:00:25 lr 0.001048	time 0.8755 (0.8813)	loss 0.6234 (0.5991)	grad_norm 2.6792 (2.4388)	mem 20675MB
[2025-04-03 01:05:44 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][284/311]	eta 0:00:23 lr 0.001047	time 0.8760 (0.8813)	loss 0.5116 (0.5990)	grad_norm 2.1455 (2.4364)	mem 20675MB
[2025-04-03 01:05:45 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][286/311]	eta 0:00:22 lr 0.001047	time 0.8756 (0.8812)	loss 0.5437 (0.5987)	grad_norm 2.6412 (2.4392)	mem 20675MB
[2025-04-03 01:05:47 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][288/311]	eta 0:00:20 lr 0.001047	time 0.8761 (0.8812)	loss 0.4806 (0.5985)	grad_norm 4.5526 (2.4473)	mem 20675MB
[2025-04-03 01:05:49 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][290/311]	eta 0:00:18 lr 0.001047	time 0.8755 (0.8812)	loss 0.6193 (0.5989)	grad_norm 2.5339 (2.4523)	mem 20675MB
[2025-04-03 01:05:51 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][292/311]	eta 0:00:16 lr 0.001046	time 0.8757 (0.8813)	loss 0.5904 (0.5990)	grad_norm 1.8595 (2.4487)	mem 20675MB
[2025-04-03 01:05:52 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][294/311]	eta 0:00:14 lr 0.001046	time 0.8754 (0.8812)	loss 0.5284 (0.5987)	grad_norm 1.8049 (2.4450)	mem 20675MB
[2025-04-03 01:05:54 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][296/311]	eta 0:00:13 lr 0.001046	time 0.8753 (0.8812)	loss 0.6078 (0.5987)	grad_norm 2.6170 (2.4473)	mem 20675MB
[2025-04-03 01:05:56 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][298/311]	eta 0:00:11 lr 0.001045	time 0.8753 (0.8812)	loss 0.5989 (0.5988)	grad_norm 2.5975 (2.4482)	mem 20675MB
[2025-04-03 01:05:58 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][300/311]	eta 0:00:09 lr 0.001045	time 0.8751 (0.8811)	loss 0.5509 (0.5987)	grad_norm 2.0104 (2.4445)	mem 20675MB
[2025-04-03 01:05:59 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][302/311]	eta 0:00:07 lr 0.001045	time 0.8753 (0.8811)	loss 0.5661 (0.5988)	grad_norm 2.1372 (2.4414)	mem 20675MB
[2025-04-03 01:06:01 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][304/311]	eta 0:00:06 lr 0.001044	time 0.8755 (0.8811)	loss 0.5811 (0.5987)	grad_norm 1.9781 (2.4374)	mem 20675MB
[2025-04-03 01:06:03 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][306/311]	eta 0:00:04 lr 0.001044	time 0.8755 (0.8810)	loss 0.5790 (0.5989)	grad_norm 1.8070 (2.4336)	mem 20675MB
[2025-04-03 01:06:05 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][308/311]	eta 0:00:02 lr 0.001044	time 0.8770 (0.8810)	loss 0.6199 (0.5990)	grad_norm 2.1402 (2.4339)	mem 20675MB
[2025-04-03 01:06:06 simmim_finetune] (main_finetune.py 252): INFO Train: [7/30][310/311]	eta 0:00:00 lr 0.001043	time 0.8752 (0.8810)	loss 0.5827 (0.5989)	grad_norm 2.2863 (2.4351)	mem 20675MB
[2025-04-03 01:06:06 simmim_finetune] (main_finetune.py 260): INFO EPOCH 7 training takes 0:04:34
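The `eta` and epoch-duration figures in the lines above are consistent with the usual running-average pattern: the parenthesized `time` value is the mean batch time so far, and the ETA is that mean times the number of iterations left. A minimal sketch (the function name `eta_string` is illustrative, not from `main_finetune.py`) that reproduces two figures from this epoch:

```python
from datetime import timedelta

def eta_string(avg_batch_time: float, num_steps: int, idx: int) -> str:
    """Remaining-time estimate: running-average batch time x steps left."""
    remaining = avg_batch_time * (num_steps - idx)
    return str(timedelta(seconds=int(remaining)))

# Reproduce figures from the log above (311 iterations per epoch):
print(eta_string(0.8953, 311, 68))              # "0:03:37", as logged at [7/30][68/311]
print(timedelta(seconds=round(0.8810 * 311)))   # "0:04:34", the logged epoch-7 duration
```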
[2025-04-03 01:06:08 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.468 (1.468)	Loss 0.6002 (0.6002)	Acc@1 71.875 (71.875)	Mem 20675MB
[2025-04-03 01:06:08 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 72.535
[2025-04-03 01:06:08 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 72.5%
[2025-04-03 01:06:08 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 74.65%
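The overall `Acc@1 72.535` above is the sample-weighted mean of per-batch accuracies over the 142 test images (one batch of 128, one of 14). A minimal AverageMeter-style sketch, assuming the common timm-style meter; the second batch's accuracy is not printed in the log, and 11/14 correct is an inferred value chosen because it reproduces the logged total:

```python
class AverageMeter:
    """Running average weighted by sample count (timm-style)."""
    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / self.count

acc1 = AverageMeter()
acc1.update(71.875, 128)        # batch 0, as logged
acc1.update(100 * 11 / 14, 14)  # batch 1: 11/14 correct is an assumption
print(round(acc1.avg, 3))       # 72.535, matching the logged overall Acc@1
```

Note the weighting by `n`: averaging the two batch accuracies without weights would give 75.22, not the logged 72.535.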
[2025-04-03 01:06:08 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.8988552295305604e-06, 3.8988552295305604e-06, 5.975985599261145e-06, 5.975985599261145e-06, 9.171570783462042e-06, 9.171570783462042e-06, 1.4087855682232656e-05, 1.4087855682232656e-05, 2.165137091111052e-05, 2.165137091111052e-05, 3.328754818630724e-05, 3.328754818630724e-05, 5.118935937891757e-05, 5.118935937891757e-05, 7.873060736754883e-05, 7.873060736754883e-05, 0.00012110175811928926, 0.00012110175811928926, 0.00018628814389119763, 0.00018628814389119763, 0.00028657489123259505, 0.00028657489123259505, 0.00044086219483474494, 0.00044086219483474494, 0.0006782272772995911, 0.0006782272772995911, 0.001043404327245508, 0.001043404327245508]
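The 28 learning rates above come in 14 pairs (each depth level contributes two parameter groups, with and without weight decay), and consecutive pairs differ by roughly the configured `LAYER_DECAY` of 0.65: patch embed, the 12 ViT blocks, and the head give 14 depth levels. A sketch of the geometric structure, assuming that grouping; the logged values drift a few percent from an exact 0.65 ratio because the cosine schedule and `MIN_LR` floor interact with the per-group scales, which this sketch ignores:

```python
def layer_decay_scales(num_layers: int = 12, decay: float = 0.65):
    """One LR scale per depth level: patch embed (0), blocks 1..12, head (13).
    Earlier layers get geometrically smaller learning rates."""
    depth = num_layers + 2  # patch embed + transformer blocks + head
    return [decay ** (depth - 1 - i) for i in range(depth)]

scales = layer_decay_scales()
# Each depth level yields two parameter groups (decay / no-decay),
# hence the 28 learning rates printed in pairs in the log above.
lrs = [0.001043404327245508 * s for s in scales for _ in (0, 1)]
```

The head group is unscaled (scale 1.0, matching the last logged value 0.001043404327245508), and the patch-embed group is scaled by 0.65^13, close to the first logged value of about 3.9e-06.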
[2025-04-03 01:06:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][0/311]	eta 0:10:42 lr 0.001043	time 2.0651 (2.0651)	loss 0.5758 (0.5758)	grad_norm 2.0850 (2.0850)	mem 20675MB
[2025-04-03 01:06:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][2/311]	eta 0:06:33 lr 0.001043	time 0.8760 (1.2732)	loss 0.6469 (0.6053)	grad_norm 3.1992 (2.4672)	mem 20675MB
[2025-04-03 01:06:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][4/311]	eta 0:05:42 lr 0.001043	time 0.8758 (1.1146)	loss 0.5598 (0.5957)	grad_norm 3.3619 (2.5784)	mem 20675MB
[2025-04-03 01:06:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][6/311]	eta 0:05:19 lr 0.001042	time 0.8761 (1.0467)	loss 0.6189 (0.5992)	grad_norm 2.3655 (2.5848)	mem 20675MB
[2025-04-03 01:06:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][8/311]	eta 0:05:05 lr 0.001042	time 0.8763 (1.0090)	loss 0.5672 (0.5900)	grad_norm 2.5237 (2.6509)	mem 20675MB
[2025-04-03 01:06:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][10/311]	eta 0:04:56 lr 0.001042	time 0.8758 (0.9849)	loss 0.5993 (0.5848)	grad_norm 1.9112 (2.6136)	mem 20675MB
[2025-04-03 01:06:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][12/311]	eta 0:04:49 lr 0.001041	time 0.8761 (0.9683)	loss 0.6906 (0.5943)	grad_norm 2.5303 (2.5392)	mem 20675MB
[2025-04-03 01:06:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][14/311]	eta 0:04:43 lr 0.001041	time 0.8761 (0.9561)	loss 0.5686 (0.5858)	grad_norm 2.9534 (2.6278)	mem 20675MB
[2025-04-03 01:06:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][16/311]	eta 0:04:39 lr 0.001041	time 0.8760 (0.9468)	loss 0.6286 (0.5878)	grad_norm 2.2397 (2.6152)	mem 20675MB
[2025-04-03 01:06:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][18/311]	eta 0:04:35 lr 0.001040	time 0.8757 (0.9394)	loss 0.5634 (0.5832)	grad_norm 3.2347 (2.6302)	mem 20675MB
[2025-04-03 01:06:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][20/311]	eta 0:04:31 lr 0.001040	time 0.8757 (0.9334)	loss 0.6619 (0.5866)	grad_norm 2.3425 (2.5921)	mem 20675MB
[2025-04-03 01:06:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][22/311]	eta 0:04:28 lr 0.001040	time 0.8760 (0.9285)	loss 0.6736 (0.5898)	grad_norm 2.4224 (2.5657)	mem 20675MB
[2025-04-03 01:06:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][24/311]	eta 0:04:25 lr 0.001039	time 0.8757 (0.9243)	loss 0.5240 (0.5873)	grad_norm 2.5799 (2.5397)	mem 20675MB
[2025-04-03 01:06:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][26/311]	eta 0:04:22 lr 0.001039	time 0.8758 (0.9208)	loss 0.6068 (0.5896)	grad_norm 2.4301 (2.5106)	mem 20675MB
[2025-04-03 01:06:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][28/311]	eta 0:04:19 lr 0.001039	time 0.8769 (0.9178)	loss 0.6069 (0.5895)	grad_norm 1.7954 (2.4643)	mem 20675MB
[2025-04-03 01:06:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][30/311]	eta 0:04:17 lr 0.001039	time 0.8756 (0.9151)	loss 0.5918 (0.5911)	grad_norm 2.1598 (2.4430)	mem 20675MB
[2025-04-03 01:06:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][32/311]	eta 0:04:14 lr 0.001038	time 0.8758 (0.9128)	loss 0.6028 (0.5915)	grad_norm 3.2683 (2.4567)	mem 20675MB
[2025-04-03 01:06:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][34/311]	eta 0:04:12 lr 0.001038	time 0.8761 (0.9107)	loss 0.5448 (0.5909)	grad_norm 4.3614 (2.5027)	mem 20675MB
[2025-04-03 01:06:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][36/311]	eta 0:04:09 lr 0.001038	time 0.8758 (0.9089)	loss 0.6263 (0.5935)	grad_norm 2.2368 (2.4933)	mem 20675MB
[2025-04-03 01:06:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][38/311]	eta 0:04:07 lr 0.001037	time 0.8758 (0.9073)	loss 0.6221 (0.5958)	grad_norm 1.8057 (2.4646)	mem 20675MB
[2025-04-03 01:06:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][40/311]	eta 0:04:05 lr 0.001037	time 0.8759 (0.9058)	loss 0.6116 (0.5971)	grad_norm 2.4350 (2.4605)	mem 20675MB
[2025-04-03 01:06:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][42/311]	eta 0:04:03 lr 0.001037	time 0.8757 (0.9044)	loss 0.6231 (0.5957)	grad_norm 1.9583 (2.4422)	mem 20675MB
[2025-04-03 01:06:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][44/311]	eta 0:04:01 lr 0.001036	time 0.8759 (0.9032)	loss 0.5508 (0.5945)	grad_norm 2.2484 (2.4266)	mem 20675MB
[2025-04-03 01:06:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][46/311]	eta 0:03:59 lr 0.001036	time 0.8762 (0.9020)	loss 0.6684 (0.5972)	grad_norm 1.9698 (2.4060)	mem 20675MB
[2025-04-03 01:06:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][48/311]	eta 0:03:56 lr 0.001036	time 0.8759 (0.9010)	loss 0.6360 (0.5965)	grad_norm 2.0225 (2.3933)	mem 20675MB
[2025-04-03 01:06:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][50/311]	eta 0:03:54 lr 0.001035	time 0.8763 (0.9001)	loss 0.5218 (0.5945)	grad_norm 2.7733 (2.4027)	mem 20675MB
[2025-04-03 01:06:56 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][52/311]	eta 0:03:52 lr 0.001035	time 0.8760 (0.8992)	loss 0.5714 (0.5947)	grad_norm 3.9297 (2.4297)	mem 20675MB
[2025-04-03 01:06:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][54/311]	eta 0:03:50 lr 0.001035	time 0.8765 (0.8984)	loss 0.6653 (0.5941)	grad_norm 2.0022 (2.4207)	mem 20675MB
[2025-04-03 01:06:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][56/311]	eta 0:03:48 lr 0.001034	time 0.8758 (0.8977)	loss 0.5689 (0.5927)	grad_norm 1.6812 (2.4092)	mem 20675MB
[2025-04-03 01:07:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][58/311]	eta 0:03:46 lr 0.001034	time 0.8760 (0.8970)	loss 0.6348 (0.5931)	grad_norm 2.1434 (2.4090)	mem 20675MB
[2025-04-03 01:07:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][60/311]	eta 0:03:44 lr 0.001034	time 0.8758 (0.8963)	loss 0.6377 (0.5950)	grad_norm 2.0193 (2.4092)	mem 20675MB
[2025-04-03 01:07:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][62/311]	eta 0:03:43 lr 0.001033	time 0.8759 (0.8957)	loss 0.6350 (0.5958)	grad_norm 1.9121 (2.3939)	mem 20675MB
[2025-04-03 01:07:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][64/311]	eta 0:03:41 lr 0.001033	time 0.8757 (0.8951)	loss 0.6129 (0.5960)	grad_norm 1.6079 (2.3664)	mem 20675MB
[2025-04-03 01:07:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][66/311]	eta 0:03:39 lr 0.001033	time 0.8757 (0.8945)	loss 0.6421 (0.5963)	grad_norm 1.2965 (2.3504)	mem 20675MB
[2025-04-03 01:07:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][68/311]	eta 0:03:37 lr 0.001033	time 0.8758 (0.8940)	loss 0.6710 (0.5983)	grad_norm 1.5799 (2.3259)	mem 20675MB
[2025-04-03 01:07:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][70/311]	eta 0:03:35 lr 0.001032	time 0.8760 (0.8935)	loss 0.6250 (0.5991)	grad_norm 2.1648 (2.3140)	mem 20675MB
[2025-04-03 01:07:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][72/311]	eta 0:03:33 lr 0.001032	time 0.8760 (0.8931)	loss 0.6351 (0.5999)	grad_norm 1.5254 (2.3010)	mem 20675MB
[2025-04-03 01:07:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][74/311]	eta 0:03:31 lr 0.001032	time 0.8763 (0.8927)	loss 0.6403 (0.6004)	grad_norm 2.3902 (2.2975)	mem 20675MB
[2025-04-03 01:07:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][76/311]	eta 0:03:29 lr 0.001031	time 0.8760 (0.8923)	loss 0.6001 (0.6007)	grad_norm 1.8696 (2.2901)	mem 20675MB
[2025-04-03 01:07:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][78/311]	eta 0:03:27 lr 0.001031	time 0.8756 (0.8919)	loss 0.6279 (0.6013)	grad_norm 2.2183 (2.2903)	mem 20675MB
[2025-04-03 01:07:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][80/311]	eta 0:03:25 lr 0.001031	time 0.8758 (0.8915)	loss 0.6405 (0.6025)	grad_norm 2.7071 (2.2865)	mem 20675MB
[2025-04-03 01:07:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][82/311]	eta 0:03:24 lr 0.001030	time 0.8759 (0.8911)	loss 0.6455 (0.6034)	grad_norm 3.1421 (2.2942)	mem 20675MB
[2025-04-03 01:07:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][84/311]	eta 0:03:22 lr 0.001030	time 0.8758 (0.8908)	loss 0.6144 (0.6028)	grad_norm 2.3215 (2.2924)	mem 20675MB
[2025-04-03 01:07:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][86/311]	eta 0:03:20 lr 0.001030	time 0.8760 (0.8905)	loss 0.5942 (0.6029)	grad_norm 2.7733 (2.2894)	mem 20675MB
[2025-04-03 01:07:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][88/311]	eta 0:03:18 lr 0.001029	time 0.8758 (0.8902)	loss 0.5171 (0.6025)	grad_norm 3.0728 (2.2942)	mem 20675MB
[2025-04-03 01:07:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][90/311]	eta 0:03:16 lr 0.001029	time 0.8759 (0.8899)	loss 0.7031 (0.6038)	grad_norm 2.3332 (2.2892)	mem 20675MB
[2025-04-03 01:07:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][92/311]	eta 0:03:14 lr 0.001029	time 0.8758 (0.8896)	loss 0.6259 (0.6040)	grad_norm 1.8401 (2.2829)	mem 20675MB
[2025-04-03 01:07:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][94/311]	eta 0:03:12 lr 0.001028	time 0.8756 (0.8893)	loss 0.5920 (0.6040)	grad_norm 1.3509 (2.2664)	mem 20675MB
[2025-04-03 01:07:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][96/311]	eta 0:03:11 lr 0.001028	time 0.8759 (0.8890)	loss 0.5504 (0.6029)	grad_norm 2.6288 (2.2690)	mem 20675MB
[2025-04-03 01:07:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][98/311]	eta 0:03:09 lr 0.001028	time 0.8757 (0.8888)	loss 0.6452 (0.6036)	grad_norm 2.4151 (2.2645)	mem 20675MB
[2025-04-03 01:07:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][100/311]	eta 0:03:07 lr 0.001027	time 0.8760 (0.8885)	loss 0.5782 (0.6037)	grad_norm 3.0310 (2.2686)	mem 20675MB
[2025-04-03 01:07:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][102/311]	eta 0:03:05 lr 0.001027	time 0.8760 (0.8883)	loss 0.6315 (0.6032)	grad_norm 2.9253 (2.2727)	mem 20675MB
[2025-04-03 01:07:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][104/311]	eta 0:03:03 lr 0.001027	time 0.8759 (0.8881)	loss 0.6377 (0.6029)	grad_norm 1.7567 (2.2738)	mem 20675MB
[2025-04-03 01:07:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][106/311]	eta 0:03:02 lr 0.001026	time 0.8760 (0.8879)	loss 0.5881 (0.6019)	grad_norm 2.1758 (2.3041)	mem 20675MB
[2025-04-03 01:07:45 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][108/311]	eta 0:03:00 lr 0.001026	time 0.8758 (0.8877)	loss 0.6033 (0.6018)	grad_norm 2.1832 (2.3032)	mem 20675MB
[2025-04-03 01:07:47 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][110/311]	eta 0:02:58 lr 0.001026	time 0.8760 (0.8875)	loss 0.6181 (0.6017)	grad_norm 2.9959 (2.3073)	mem 20675MB
[2025-04-03 01:07:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][112/311]	eta 0:02:56 lr 0.001025	time 0.8761 (0.8873)	loss 0.5741 (0.6023)	grad_norm 2.4420 (2.3169)	mem 20675MB
[2025-04-03 01:07:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][114/311]	eta 0:02:54 lr 0.001025	time 0.8770 (0.8871)	loss 0.5682 (0.6017)	grad_norm 2.8651 (2.3242)	mem 20675MB
[2025-04-03 01:07:52 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][116/311]	eta 0:02:52 lr 0.001025	time 0.8764 (0.8870)	loss 0.6040 (0.6007)	grad_norm 2.8344 (2.3381)	mem 20675MB
[2025-04-03 01:07:54 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][118/311]	eta 0:02:51 lr 0.001024	time 0.8759 (0.8868)	loss 0.6124 (0.6003)	grad_norm 2.1904 (2.3349)	mem 20675MB
[2025-04-03 01:07:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][120/311]	eta 0:02:49 lr 0.001024	time 0.8762 (0.8866)	loss 0.5523 (0.5991)	grad_norm 2.4617 (2.3470)	mem 20675MB
[2025-04-03 01:07:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][122/311]	eta 0:02:47 lr 0.001024	time 0.8760 (0.8865)	loss 0.6029 (0.5992)	grad_norm 3.3583 (2.3739)	mem 20675MB
[2025-04-03 01:07:59 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][124/311]	eta 0:02:45 lr 0.001024	time 0.8759 (0.8863)	loss 0.5330 (0.5992)	grad_norm 3.4286 (2.3819)	mem 20675MB
[2025-04-03 01:08:01 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][126/311]	eta 0:02:43 lr 0.001023	time 0.8760 (0.8862)	loss 0.6623 (0.6003)	grad_norm 2.7453 (2.3869)	mem 20675MB
[2025-04-03 01:08:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][128/311]	eta 0:02:42 lr 0.001023	time 0.8758 (0.8860)	loss 0.6145 (0.5998)	grad_norm 3.2762 (2.3923)	mem 20675MB
[2025-04-03 01:08:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][130/311]	eta 0:02:40 lr 0.001023	time 0.8759 (0.8859)	loss 0.6122 (0.5997)	grad_norm 1.8446 (2.3830)	mem 20675MB
[2025-04-03 01:08:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][132/311]	eta 0:02:38 lr 0.001022	time 0.8759 (0.8857)	loss 0.5683 (0.5998)	grad_norm 2.4545 (2.3860)	mem 20675MB
[2025-04-03 01:08:08 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][134/311]	eta 0:02:36 lr 0.001022	time 0.8760 (0.8856)	loss 0.5813 (0.5995)	grad_norm 1.9963 (2.3823)	mem 20675MB
[2025-04-03 01:08:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][136/311]	eta 0:02:34 lr 0.001022	time 0.8758 (0.8855)	loss 0.5837 (0.5985)	grad_norm 2.0977 (2.3864)	mem 20675MB
[2025-04-03 01:08:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][138/311]	eta 0:02:33 lr 0.001021	time 0.8761 (0.8854)	loss 0.6383 (0.5988)	grad_norm 2.6099 (2.3867)	mem 20675MB
[2025-04-03 01:08:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][140/311]	eta 0:02:31 lr 0.001021	time 0.8761 (0.8852)	loss 0.6010 (0.5988)	grad_norm 3.3039 (2.3903)	mem 20675MB
[2025-04-03 01:08:15 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][142/311]	eta 0:02:29 lr 0.001021	time 0.8759 (0.8851)	loss 0.5920 (0.5983)	grad_norm 2.4717 (2.3984)	mem 20675MB
[2025-04-03 01:08:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][144/311]	eta 0:02:27 lr 0.001020	time 0.8762 (0.8850)	loss 0.6249 (0.5989)	grad_norm 2.6364 (2.4028)	mem 20675MB
[2025-04-03 01:08:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][146/311]	eta 0:02:26 lr 0.001020	time 0.8757 (0.8849)	loss 0.5645 (0.5992)	grad_norm 2.6087 (2.4094)	mem 20675MB
[2025-04-03 01:08:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][148/311]	eta 0:02:24 lr 0.001020	time 0.8770 (0.8848)	loss 0.5827 (0.5995)	grad_norm 2.5163 (2.4089)	mem 20675MB
[2025-04-03 01:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][150/311]	eta 0:02:22 lr 0.001019	time 0.8758 (0.8847)	loss 0.5902 (0.5997)	grad_norm 2.1756 (2.4017)	mem 20675MB
[2025-04-03 01:08:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][152/311]	eta 0:02:20 lr 0.001019	time 0.8760 (0.8846)	loss 0.6496 (0.6000)	grad_norm 2.6182 (2.4026)	mem 20675MB
[2025-04-03 01:08:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][154/311]	eta 0:02:18 lr 0.001019	time 0.8759 (0.8845)	loss 0.5756 (0.6002)	grad_norm 1.6150 (2.3950)	mem 20675MB
[2025-04-03 01:08:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][156/311]	eta 0:02:17 lr 0.001018	time 0.8765 (0.8844)	loss 0.6537 (0.6006)	grad_norm 2.0212 (2.3901)	mem 20675MB
[2025-04-03 01:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][158/311]	eta 0:02:15 lr 0.001018	time 0.8760 (0.8843)	loss 0.6405 (0.6006)	grad_norm 2.4933 (2.3963)	mem 20675MB
[2025-04-03 01:08:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][160/311]	eta 0:02:13 lr 0.001018	time 0.8760 (0.8842)	loss 0.6333 (0.6010)	grad_norm 1.8582 (2.3934)	mem 20675MB
[2025-04-03 01:08:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][162/311]	eta 0:02:11 lr 0.001017	time 0.8758 (0.8841)	loss 0.5564 (0.6002)	grad_norm 2.3808 (2.3949)	mem 20675MB
[2025-04-03 01:08:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][164/311]	eta 0:02:09 lr 0.001017	time 0.8762 (0.8840)	loss 0.4914 (0.5992)	grad_norm 3.1178 (2.4010)	mem 20675MB
[2025-04-03 01:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][166/311]	eta 0:02:08 lr 0.001017	time 0.8759 (0.8839)	loss 0.6776 (0.5994)	grad_norm 2.5097 (2.4001)	mem 20675MB
[2025-04-03 01:08:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][168/311]	eta 0:02:06 lr 0.001016	time 0.8758 (0.8838)	loss 0.6183 (0.6001)	grad_norm 2.3425 (2.4057)	mem 20675MB
[2025-04-03 01:08:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][170/311]	eta 0:02:04 lr 0.001016	time 0.8758 (0.8838)	loss 0.5935 (0.6003)	grad_norm 2.2252 (2.4043)	mem 20675MB
[2025-04-03 01:08:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][172/311]	eta 0:02:02 lr 0.001016	time 0.8757 (0.8837)	loss 0.5778 (0.6005)	grad_norm 2.3231 (2.4114)	mem 20675MB
[2025-04-03 01:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][174/311]	eta 0:02:01 lr 0.001015	time 0.8760 (0.8836)	loss 0.6160 (0.6004)	grad_norm 2.3714 (2.4098)	mem 20675MB
[2025-04-03 01:08:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][176/311]	eta 0:01:59 lr 0.001015	time 0.8757 (0.8835)	loss 0.6208 (0.6005)	grad_norm 3.0647 (2.4093)	mem 20675MB
[2025-04-03 01:08:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][178/311]	eta 0:01:57 lr 0.001015	time 0.8756 (0.8834)	loss 0.6038 (0.6007)	grad_norm 2.5247 (2.4085)	mem 20675MB
[2025-04-03 01:08:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][180/311]	eta 0:01:55 lr 0.001014	time 0.8763 (0.8834)	loss 0.6049 (0.6007)	grad_norm 2.1453 (2.4029)	mem 20675MB
[2025-04-03 01:08:50 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][182/311]	eta 0:01:53 lr 0.001014	time 0.8762 (0.8833)	loss 0.5951 (0.6009)	grad_norm 1.7316 (2.3948)	mem 20675MB
[2025-04-03 01:08:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][184/311]	eta 0:01:52 lr 0.001014	time 0.8757 (0.8832)	loss 0.5927 (0.6005)	grad_norm 1.4166 (2.3879)	mem 20675MB
[2025-04-03 01:08:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][186/311]	eta 0:01:50 lr 0.001013	time 0.8761 (0.8832)	loss 0.5644 (0.6004)	grad_norm 2.7670 (2.3858)	mem 20675MB
[2025-04-03 01:08:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][188/311]	eta 0:01:48 lr 0.001013	time 0.8757 (0.8831)	loss 0.6360 (0.6007)	grad_norm 2.5545 (2.3830)	mem 20675MB
[2025-04-03 01:08:57 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][190/311]	eta 0:01:46 lr 0.001013	time 0.8760 (0.8830)	loss 0.6276 (0.6008)	grad_norm 2.5704 (2.3808)	mem 20675MB
[2025-04-03 01:08:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][192/311]	eta 0:01:45 lr 0.001012	time 0.8760 (0.8830)	loss 0.6372 (0.6013)	grad_norm 3.3789 (2.3852)	mem 20675MB
[2025-04-03 01:09:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][194/311]	eta 0:01:43 lr 0.001012	time 0.8757 (0.8829)	loss 0.6167 (0.6009)	grad_norm 1.9014 (2.3886)	mem 20675MB
[2025-04-03 01:09:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][196/311]	eta 0:01:41 lr 0.001012	time 0.8757 (0.8828)	loss 0.5394 (0.6001)	grad_norm 2.4274 (2.3921)	mem 20675MB
[2025-04-03 01:09:04 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][198/311]	eta 0:01:39 lr 0.001011	time 0.8759 (0.8828)	loss 0.6276 (0.6003)	grad_norm 1.9752 (2.3886)	mem 20675MB
[2025-04-03 01:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][200/311]	eta 0:01:37 lr 0.001011	time 0.8756 (0.8827)	loss 0.5881 (0.6003)	grad_norm 2.4345 (2.3920)	mem 20675MB
[2025-04-03 01:09:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][202/311]	eta 0:01:36 lr 0.001011	time 0.8756 (0.8827)	loss 0.6346 (0.6006)	grad_norm 2.3042 (2.3952)	mem 20675MB
[2025-04-03 01:09:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][204/311]	eta 0:01:34 lr 0.001010	time 0.8759 (0.8826)	loss 0.4670 (0.5997)	grad_norm 2.8270 (2.3957)	mem 20675MB
[2025-04-03 01:09:11 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][206/311]	eta 0:01:32 lr 0.001010	time 0.8758 (0.8825)	loss 0.6682 (0.5998)	grad_norm 3.5773 (2.4013)	mem 20675MB
[2025-04-03 01:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][208/311]	eta 0:01:30 lr 0.001010	time 0.8764 (0.8825)	loss 0.6512 (0.5998)	grad_norm 2.3753 (2.4030)	mem 20675MB
[2025-04-03 01:09:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][210/311]	eta 0:01:29 lr 0.001009	time 0.8765 (0.8824)	loss 0.6534 (0.6000)	grad_norm 2.7466 (2.4050)	mem 20675MB
[2025-04-03 01:09:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][212/311]	eta 0:01:27 lr 0.001009	time 0.8758 (0.8824)	loss 0.5746 (0.6000)	grad_norm 2.0111 (2.4008)	mem 20675MB
[2025-04-03 01:09:18 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][214/311]	eta 0:01:25 lr 0.001009	time 0.8756 (0.8823)	loss 0.5950 (0.5998)	grad_norm 1.9389 (2.3999)	mem 20675MB
[2025-04-03 01:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][216/311]	eta 0:01:23 lr 0.001008	time 0.8756 (0.8823)	loss 0.6229 (0.6003)	grad_norm 1.7491 (2.3967)	mem 20675MB
[2025-04-03 01:09:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][218/311]	eta 0:01:22 lr 0.001008	time 0.8764 (0.8822)	loss 0.6228 (0.6003)	grad_norm 1.5063 (2.3892)	mem 20675MB
[2025-04-03 01:09:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][220/311]	eta 0:01:20 lr 0.001008	time 0.8756 (0.8822)	loss 0.5613 (0.6001)	grad_norm 2.2470 (2.3872)	mem 20675MB
[2025-04-03 01:09:25 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][222/311]	eta 0:01:18 lr 0.001007	time 0.8757 (0.8821)	loss 0.5918 (0.6001)	grad_norm 1.7164 (2.3818)	mem 20675MB
[2025-04-03 01:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][224/311]	eta 0:01:16 lr 0.001007	time 0.8759 (0.8821)	loss 0.5737 (0.6002)	grad_norm 1.6369 (2.3767)	mem 20675MB
[2025-04-03 01:09:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][226/311]	eta 0:01:14 lr 0.001007	time 0.8757 (0.8820)	loss 0.6176 (0.6000)	grad_norm 3.3399 (2.3798)	mem 20675MB
[2025-04-03 01:09:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][228/311]	eta 0:01:13 lr 0.001006	time 0.8759 (0.8820)	loss 0.4837 (0.5996)	grad_norm 2.9311 (2.3835)	mem 20675MB
[2025-04-03 01:09:32 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][230/311]	eta 0:01:11 lr 0.001006	time 0.8756 (0.8820)	loss 0.5765 (0.5995)	grad_norm 1.8104 (2.3804)	mem 20675MB
[2025-04-03 01:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][232/311]	eta 0:01:09 lr 0.001006	time 0.8757 (0.8819)	loss 0.6146 (0.5993)	grad_norm 2.8541 (2.3886)	mem 20675MB
[2025-04-03 01:09:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][234/311]	eta 0:01:07 lr 0.001005	time 0.8758 (0.8819)	loss 0.6182 (0.5992)	grad_norm 2.1741 (2.3910)	mem 20675MB
[2025-04-03 01:09:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][236/311]	eta 0:01:06 lr 0.001005	time 0.8759 (0.8818)	loss 0.6240 (0.5997)	grad_norm 3.3323 (2.3981)	mem 20675MB
[2025-04-03 01:09:39 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][238/311]	eta 0:01:04 lr 0.001005	time 0.8809 (0.8818)	loss 0.6112 (0.5997)	grad_norm 1.4734 (2.3936)	mem 20675MB
[2025-04-03 01:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][240/311]	eta 0:01:02 lr 0.001004	time 0.8760 (0.8818)	loss 0.6314 (0.6000)	grad_norm 1.5243 (2.3861)	mem 20675MB
[2025-04-03 01:09:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][242/311]	eta 0:01:00 lr 0.001004	time 0.8757 (0.8817)	loss 0.5927 (0.5999)	grad_norm 1.9223 (2.3815)	mem 20675MB
[2025-04-03 01:09:44 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][244/311]	eta 0:00:59 lr 0.001004	time 0.8761 (0.8817)	loss 0.6336 (0.5999)	grad_norm 1.7805 (2.3763)	mem 20675MB
[2025-04-03 01:09:46 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][246/311]	eta 0:00:57 lr 0.001003	time 0.8764 (0.8816)	loss 0.5574 (0.5995)	grad_norm 2.1405 (2.3738)	mem 20675MB
[2025-04-03 01:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][248/311]	eta 0:00:55 lr 0.001003	time 0.8759 (0.8816)	loss 0.6630 (0.5995)	grad_norm 1.9553 (2.3742)	mem 20675MB
[2025-04-03 01:09:49 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][250/311]	eta 0:00:53 lr 0.001003	time 0.8773 (0.8816)	loss 0.5453 (0.5989)	grad_norm 1.8875 (2.3739)	mem 20675MB
[2025-04-03 01:09:51 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][252/311]	eta 0:00:52 lr 0.001002	time 0.8758 (0.8815)	loss 0.5244 (0.5983)	grad_norm 3.6919 (2.3805)	mem 20675MB
[2025-04-03 01:09:53 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][254/311]	eta 0:00:50 lr 0.001002	time 0.8782 (0.8815)	loss 0.5665 (0.5983)	grad_norm 2.5470 (2.3815)	mem 20675MB
[2025-04-03 01:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][256/311]	eta 0:00:48 lr 0.001002	time 0.8760 (0.8815)	loss 0.6082 (0.5987)	grad_norm 2.9329 (2.3854)	mem 20675MB
[2025-04-03 01:09:56 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][258/311]	eta 0:00:46 lr 0.001001	time 0.8765 (0.8814)	loss 0.4893 (0.5984)	grad_norm 4.1553 (2.3933)	mem 20675MB
[2025-04-03 01:09:58 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][260/311]	eta 0:00:44 lr 0.001001	time 0.8761 (0.8814)	loss 0.5724 (0.5984)	grad_norm 2.7673 (2.3981)	mem 20675MB
[2025-04-03 01:10:00 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][262/311]	eta 0:00:43 lr 0.001001	time 0.8760 (0.8814)	loss 0.5647 (0.5982)	grad_norm 2.2514 (2.3957)	mem 20675MB
[2025-04-03 01:10:02 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][264/311]	eta 0:00:41 lr 0.001000	time 0.8758 (0.8813)	loss 0.4951 (0.5979)	grad_norm 2.5053 (2.3954)	mem 20675MB
[2025-04-03 01:10:03 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][266/311]	eta 0:00:39 lr 0.001000	time 0.8759 (0.8813)	loss 0.5651 (0.5980)	grad_norm 2.6667 (2.3944)	mem 20675MB
[2025-04-03 01:10:05 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][268/311]	eta 0:00:37 lr 0.001000	time 0.8770 (0.8813)	loss 0.5829 (0.5979)	grad_norm 2.1070 (2.3914)	mem 20675MB
[2025-04-03 01:10:07 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][270/311]	eta 0:00:36 lr 0.000999	time 0.8762 (0.8812)	loss 0.5143 (0.5975)	grad_norm 2.7286 (2.3912)	mem 20675MB
[2025-04-03 01:10:09 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][272/311]	eta 0:00:34 lr 0.000999	time 0.8757 (0.8812)	loss 0.4582 (0.5970)	grad_norm 2.6167 (2.3902)	mem 20675MB
[2025-04-03 01:10:10 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][274/311]	eta 0:00:32 lr 0.000999	time 0.8760 (0.8812)	loss 0.6448 (0.5973)	grad_norm 2.6851 (2.3886)	mem 20675MB
[2025-04-03 01:10:12 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][276/311]	eta 0:00:30 lr 0.000998	time 0.8762 (0.8811)	loss 0.6228 (0.5977)	grad_norm 1.7988 (2.3861)	mem 20675MB
[2025-04-03 01:10:14 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][278/311]	eta 0:00:29 lr 0.000998	time 0.8758 (0.8811)	loss 0.5862 (0.5976)	grad_norm 1.6460 (2.3830)	mem 20675MB
[2025-04-03 01:10:16 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][280/311]	eta 0:00:27 lr 0.000998	time 0.8759 (0.8811)	loss 0.5177 (0.5976)	grad_norm 2.4846 (2.3826)	mem 20675MB
[2025-04-03 01:10:17 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][282/311]	eta 0:00:25 lr 0.000997	time 0.8758 (0.8810)	loss 0.6515 (0.5979)	grad_norm 1.6548 (2.3811)	mem 20675MB
[2025-04-03 01:10:19 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][284/311]	eta 0:00:23 lr 0.000997	time 0.8760 (0.8810)	loss 0.6249 (0.5977)	grad_norm 1.5679 (2.3801)	mem 20675MB
[2025-04-03 01:10:21 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][286/311]	eta 0:00:22 lr 0.000997	time 0.8759 (0.8810)	loss 0.5966 (0.5977)	grad_norm 1.8800 (2.3775)	mem 20675MB
[2025-04-03 01:10:23 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][288/311]	eta 0:00:20 lr 0.000996	time 0.8759 (0.8810)	loss 0.6934 (0.5980)	grad_norm 2.9887 (2.3800)	mem 20675MB
[2025-04-03 01:10:24 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][290/311]	eta 0:00:18 lr 0.000996	time 0.8760 (0.8809)	loss 0.5840 (0.5979)	grad_norm 1.8790 (2.3781)	mem 20675MB
[2025-04-03 01:10:26 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][292/311]	eta 0:00:16 lr 0.000996	time 0.8758 (0.8809)	loss 0.6223 (0.5977)	grad_norm 2.5929 (2.3828)	mem 20675MB
[2025-04-03 01:10:28 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][294/311]	eta 0:00:14 lr 0.000995	time 0.8760 (0.8809)	loss 0.6136 (0.5977)	grad_norm 2.7479 (2.3833)	mem 20675MB
[2025-04-03 01:10:30 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][296/311]	eta 0:00:13 lr 0.000995	time 0.8758 (0.8808)	loss 0.5819 (0.5974)	grad_norm 2.8718 (2.3862)	mem 20675MB
[2025-04-03 01:10:31 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][298/311]	eta 0:00:11 lr 0.000995	time 0.8755 (0.8808)	loss 0.5097 (0.5971)	grad_norm 3.2533 (2.3966)	mem 20675MB
[2025-04-03 01:10:33 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][300/311]	eta 0:00:09 lr 0.000994	time 0.8754 (0.8808)	loss 0.5251 (0.5970)	grad_norm 3.0745 (2.4005)	mem 20675MB
[2025-04-03 01:10:35 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][302/311]	eta 0:00:07 lr 0.000994	time 0.8755 (0.8808)	loss 0.6107 (0.5972)	grad_norm 2.2485 (2.4051)	mem 20675MB
[2025-04-03 01:10:37 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][304/311]	eta 0:00:06 lr 0.000994	time 0.8754 (0.8807)	loss 0.5359 (0.5968)	grad_norm 2.9689 (2.4112)	mem 20675MB
[2025-04-03 01:10:38 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][306/311]	eta 0:00:04 lr 0.000993	time 0.8757 (0.8807)	loss 0.6146 (0.5968)	grad_norm 1.4510 (2.4090)	mem 20675MB
[2025-04-03 01:10:40 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][308/311]	eta 0:00:02 lr 0.000993	time 0.8755 (0.8807)	loss 0.6103 (0.5967)	grad_norm 2.0292 (2.4108)	mem 20675MB
[2025-04-03 01:10:42 simmim_finetune] (main_finetune.py 252): INFO Train: [8/30][310/311]	eta 0:00:00 lr 0.000993	time 0.8766 (0.8807)	loss 0.5373 (0.5966)	grad_norm 3.6424 (2.4164)	mem 20675MB
[2025-04-03 01:10:42 simmim_finetune] (main_finetune.py 260): INFO EPOCH 8 training takes 0:04:34
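The `time 0.8763 (0.8927)` fields in the lines above follow the common pattern of an instantaneous value plus a running average, and the `eta` column is that average times the remaining iterations. A minimal sketch of the bookkeeping; the `AverageMeter` name and exact implementation are assumptions in the style of typical PyTorch training loops, not taken from `main_finetune.py` itself:

```python
import datetime

class AverageMeter:
    """Tracks the latest value and a running average,
    mirroring the `time 0.8763 (0.8927)` style fields in the log."""
    def __init__(self):
        self.val, self.sum, self.count = 0.0, 0.0, 0

    def update(self, val):
        self.val = val
        self.sum += val
        self.count += 1

    @property
    def avg(self):
        return self.sum / max(self.count, 1)

batch_time = AverageMeter()
num_steps = 311                      # iterations per epoch, as logged
for _ in range(num_steps):
    batch_time.update(0.876)         # ~0.876 s per iteration, as logged

# Remaining-time estimate after step `idx` would be avg * (num_steps - idx);
# the whole-epoch estimate is avg * num_steps.
epoch_estimate = datetime.timedelta(seconds=int(batch_time.avg * num_steps))
print(epoch_estimate)  # 0:04:32, close to the logged "training takes 0:04:34"
```

The small gap to the logged 0:04:34 is expected: the first iteration of each epoch is slower (2.2 s in the log) and is included in the real running average.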
[2025-04-03 01:10:43 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.402 (1.402)	Loss 0.5391 (0.5391)	Acc@1 74.219 (74.219)	Mem 20675MB
[2025-04-03 01:10:44 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 75.352
[2025-04-03 01:10:44 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 75.4%
[2025-04-03 01:10:44 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 75.35%
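The final Acc@1 of 75.352 is the sample-weighted mean over the two validation batches (128 + 14 = 142 images). A small sketch of that reduction; the second batch's per-batch accuracy is not logged, so the 85.714% figure (12/14 correct) below is an assumption chosen to be consistent with the reported overall number:

```python
def weighted_acc(batch_accs, batch_sizes):
    """Sample-weighted Acc@1 across validation batches,
    as an accuracy meter typically accumulates it."""
    total = sum(batch_sizes)
    return sum(a * n for a, n in zip(batch_accs, batch_sizes)) / total

# Batch 0 is logged at 74.219% over 128 images; ~85.714% (12/14) on the
# remaining 14 images reproduces the reported overall 75.352%.
overall = weighted_acc([74.219, 85.714], [128, 14])
print(round(overall, 3))  # 75.352
```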
[2025-04-03 01:10:44 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.7211028134365046e-06, 3.7211028134365046e-06, 5.697046668894377e-06, 5.697046668894377e-06, 8.73696029267572e-06, 8.73696029267572e-06, 1.3413750483108555e-05, 1.3413750483108555e-05, 2.0608812314543684e-05, 2.0608812314543684e-05, 3.1678138209059266e-05, 3.1678138209059266e-05, 4.870787035446785e-05, 4.870787035446785e-05, 7.490745827048106e-05, 7.490745827048106e-05, 0.00011521451660280907, 0.00011521451660280907, 0.00017722537557562142, 0.00017722537557562142, 0.00027262669707225575, 0.00027262669707225575, 0.00041939796091323164, 0.00041939796091323164, 0.000645199905283964, 0.000645199905283964, 0.0009925875120081673, 0.0009925875120081673]
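The 28 learning rates above come in pairs (weight-decay / no-weight-decay parameter groups), giving 14 distinct values that grow by roughly a factor of 1/0.65 per group, consistent with `LAYER_DECAY: 0.65` for a 12-block ViT plus patch embedding and head. A sketch of how such a geometric ladder of per-layer scales is typically constructed; the exact grouping and the interaction with the cosine schedule and `MIN_LR` are assumptions, which is why the smallest printed value differs slightly from a pure geometric progression:

```python
def layerwise_lr_scales(num_layers=12, decay=0.65):
    """LR multipliers for the patch embedding (group 0), the
    transformer blocks (groups 1..num_layers), and the head
    (last group, scale 1.0). Earlier layers get smaller scales."""
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layerwise_lr_scales()
head_lr = 0.0009925875  # last value in the log line above (head group)
lrs = [head_lr * s for s in scales]  # approximate per-group LRs
```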
[2025-04-03 01:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][0/311]	eta 0:11:27 lr 0.000992	time 2.2102 (2.2102)	loss 0.5872 (0.5872)	grad_norm 2.2182 (2.2182)	mem 20675MB
[2025-04-03 01:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][2/311]	eta 0:06:48 lr 0.000992	time 0.8759 (1.3213)	loss 0.4665 (0.5588)	grad_norm 3.4277 (2.6538)	mem 20675MB
[2025-04-03 01:10:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][4/311]	eta 0:05:51 lr 0.000992	time 0.8764 (1.1435)	loss 0.5678 (0.5746)	grad_norm 3.1893 (2.7536)	mem 20675MB
[2025-04-03 01:10:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][6/311]	eta 0:05:25 lr 0.000991	time 0.8766 (1.0674)	loss 0.6172 (0.5833)	grad_norm 3.9488 (2.9097)	mem 20675MB
[2025-04-03 01:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][8/311]	eta 0:05:10 lr 0.000991	time 0.8761 (1.0252)	loss 0.6500 (0.5911)	grad_norm 3.1571 (2.9088)	mem 20675MB
[2025-04-03 01:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][10/311]	eta 0:05:00 lr 0.000991	time 0.8766 (0.9983)	loss 0.6627 (0.5978)	grad_norm 2.5144 (2.7778)	mem 20675MB
[2025-04-03 01:10:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][12/311]	eta 0:04:52 lr 0.000990	time 0.8785 (0.9798)	loss 0.4967 (0.5924)	grad_norm 2.6135 (2.7708)	mem 20675MB
[2025-04-03 01:10:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][14/311]	eta 0:04:46 lr 0.000990	time 0.8761 (0.9660)	loss 0.6465 (0.5982)	grad_norm 1.9612 (2.6304)	mem 20675MB
[2025-04-03 01:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][16/311]	eta 0:04:41 lr 0.000990	time 0.8764 (0.9556)	loss 0.6337 (0.5988)	grad_norm 2.6271 (2.6155)	mem 20675MB
[2025-04-03 01:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][18/311]	eta 0:04:37 lr 0.000989	time 0.8758 (0.9473)	loss 0.5404 (0.5968)	grad_norm 2.0864 (2.5339)	mem 20675MB
[2025-04-03 01:11:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][20/311]	eta 0:04:33 lr 0.000989	time 0.8761 (0.9406)	loss 0.6441 (0.5952)	grad_norm 1.7184 (2.4602)	mem 20675MB
[2025-04-03 01:11:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][22/311]	eta 0:04:30 lr 0.000989	time 0.8762 (0.9351)	loss 0.5562 (0.5937)	grad_norm 2.4077 (2.4390)	mem 20675MB
[2025-04-03 01:11:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][24/311]	eta 0:04:27 lr 0.000988	time 0.8760 (0.9304)	loss 0.4879 (0.5899)	grad_norm 2.5805 (2.4377)	mem 20675MB
[2025-04-03 01:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][26/311]	eta 0:04:24 lr 0.000988	time 0.8758 (0.9265)	loss 0.6287 (0.5930)	grad_norm 2.1204 (2.4317)	mem 20675MB
[2025-04-03 01:11:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][28/311]	eta 0:04:21 lr 0.000988	time 0.8761 (0.9230)	loss 0.6272 (0.5918)	grad_norm 3.2427 (2.5027)	mem 20675MB
[2025-04-03 01:11:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][30/311]	eta 0:04:18 lr 0.000987	time 0.8758 (0.9201)	loss 0.5502 (0.5916)	grad_norm 3.0707 (2.5412)	mem 20675MB
[2025-04-03 01:11:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][32/311]	eta 0:04:15 lr 0.000987	time 0.8761 (0.9174)	loss 0.5402 (0.5882)	grad_norm 3.2612 (2.6019)	mem 20675MB
[2025-04-03 01:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][34/311]	eta 0:04:13 lr 0.000987	time 0.8759 (0.9151)	loss 0.6157 (0.5899)	grad_norm 2.6716 (2.6292)	mem 20675MB
[2025-04-03 01:11:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][36/311]	eta 0:04:11 lr 0.000986	time 0.8758 (0.9130)	loss 0.6063 (0.5886)	grad_norm 2.3841 (2.6163)	mem 20675MB
[2025-04-03 01:11:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][38/311]	eta 0:04:08 lr 0.000986	time 0.8757 (0.9112)	loss 0.5310 (0.5877)	grad_norm 2.7290 (2.6256)	mem 20675MB
[2025-04-03 01:11:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][40/311]	eta 0:04:06 lr 0.000986	time 0.8761 (0.9095)	loss 0.6462 (0.5886)	grad_norm 2.5585 (2.6216)	mem 20675MB
[2025-04-03 01:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][42/311]	eta 0:04:04 lr 0.000985	time 0.8759 (0.9080)	loss 0.6082 (0.5873)	grad_norm 2.3004 (2.6211)	mem 20675MB
[2025-04-03 01:11:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][44/311]	eta 0:04:02 lr 0.000985	time 0.8757 (0.9066)	loss 0.5280 (0.5843)	grad_norm 3.4017 (2.6585)	mem 20675MB
[2025-04-03 01:11:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][46/311]	eta 0:03:59 lr 0.000985	time 0.8757 (0.9053)	loss 0.5643 (0.5837)	grad_norm 3.3304 (2.6547)	mem 20675MB
[2025-04-03 01:11:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][48/311]	eta 0:03:57 lr 0.000984	time 0.8759 (0.9041)	loss 0.6101 (0.5845)	grad_norm 3.5570 (2.7558)	mem 20675MB
[2025-04-03 01:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][50/311]	eta 0:03:55 lr 0.000984	time 0.8759 (0.9031)	loss 0.5996 (0.5848)	grad_norm 1.8020 (2.7406)	mem 20675MB
[2025-04-03 01:11:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][52/311]	eta 0:03:53 lr 0.000984	time 0.8758 (0.9021)	loss 0.6393 (0.5852)	grad_norm 3.4812 (2.7829)	mem 20675MB
[2025-04-03 01:11:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][54/311]	eta 0:03:51 lr 0.000983	time 0.8760 (0.9011)	loss 0.5012 (0.5840)	grad_norm 3.8280 (2.8053)	mem 20675MB
[2025-04-03 01:11:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][56/311]	eta 0:03:49 lr 0.000983	time 0.8761 (0.9003)	loss 0.4805 (0.5819)	grad_norm 3.2484 (2.8089)	mem 20675MB
[2025-04-03 01:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][58/311]	eta 0:03:47 lr 0.000982	time 0.8758 (0.8995)	loss 0.4742 (0.5793)	grad_norm 3.3399 (2.8151)	mem 20675MB
[2025-04-03 01:11:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][60/311]	eta 0:03:45 lr 0.000982	time 0.8759 (0.8988)	loss 0.5833 (0.5774)	grad_norm 3.6788 (2.8352)	mem 20675MB
[2025-04-03 01:11:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][62/311]	eta 0:03:43 lr 0.000982	time 0.8758 (0.8981)	loss 0.5910 (0.5765)	grad_norm 4.1311 (2.8629)	mem 20675MB
[2025-04-03 01:11:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][64/311]	eta 0:03:41 lr 0.000981	time 0.8761 (0.8974)	loss 0.6637 (0.5775)	grad_norm 2.9305 (2.8885)	mem 20675MB
[2025-04-03 01:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][66/311]	eta 0:03:39 lr 0.000981	time 0.8757 (0.8968)	loss 0.6169 (0.5789)	grad_norm 2.4603 (2.8848)	mem 20675MB
[2025-04-03 01:11:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][68/311]	eta 0:03:37 lr 0.000981	time 0.8761 (0.8962)	loss 0.4810 (0.5767)	grad_norm 3.2214 (2.9020)	mem 20675MB
[2025-04-03 01:11:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][70/311]	eta 0:03:35 lr 0.000980	time 0.8758 (0.8957)	loss 0.6110 (0.5781)	grad_norm 2.5977 (2.9052)	mem 20675MB
[2025-04-03 01:11:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][72/311]	eta 0:03:33 lr 0.000980	time 0.8758 (0.8951)	loss 0.4980 (0.5771)	grad_norm 3.0487 (2.9181)	mem 20675MB
[2025-04-03 01:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][74/311]	eta 0:03:32 lr 0.000980	time 0.8758 (0.8946)	loss 0.6295 (0.5789)	grad_norm 1.6872 (2.8829)	mem 20675MB
[2025-04-03 01:11:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][76/311]	eta 0:03:30 lr 0.000979	time 0.8757 (0.8942)	loss 0.6157 (0.5796)	grad_norm 1.5937 (2.8814)	mem 20675MB
[2025-04-03 01:11:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][78/311]	eta 0:03:28 lr 0.000979	time 0.8759 (0.8937)	loss 0.5664 (0.5795)	grad_norm 2.8840 (2.8769)	mem 20675MB
[2025-04-03 01:11:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][80/311]	eta 0:03:26 lr 0.000979	time 0.8757 (0.8933)	loss 0.6070 (0.5798)	grad_norm 2.2249 (2.8758)	mem 20675MB
[2025-04-03 01:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][82/311]	eta 0:03:24 lr 0.000978	time 0.8760 (0.8929)	loss 0.6020 (0.5804)	grad_norm 1.9002 (2.8591)	mem 20675MB
[2025-04-03 01:11:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][84/311]	eta 0:03:22 lr 0.000978	time 0.8761 (0.8925)	loss 0.5501 (0.5799)	grad_norm 2.8575 (2.8475)	mem 20675MB
[2025-04-03 01:12:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][86/311]	eta 0:03:20 lr 0.000978	time 0.8759 (0.8922)	loss 0.5324 (0.5785)	grad_norm 2.7708 (2.8391)	mem 20675MB
[2025-04-03 01:12:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][88/311]	eta 0:03:18 lr 0.000977	time 0.8758 (0.8918)	loss 0.4653 (0.5764)	grad_norm 2.3621 (2.8365)	mem 20675MB
[2025-04-03 01:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][90/311]	eta 0:03:17 lr 0.000977	time 0.8763 (0.8915)	loss 0.5965 (0.5765)	grad_norm 2.6051 (2.8296)	mem 20675MB
[2025-04-03 01:12:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][92/311]	eta 0:03:15 lr 0.000977	time 0.8761 (0.8912)	loss 0.6422 (0.5787)	grad_norm 3.0330 (2.8260)	mem 20675MB
[2025-04-03 01:12:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][94/311]	eta 0:03:13 lr 0.000976	time 0.8763 (0.8909)	loss 0.5895 (0.5795)	grad_norm 3.0462 (2.8202)	mem 20675MB
[2025-04-03 01:12:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][96/311]	eta 0:03:11 lr 0.000976	time 0.8758 (0.8906)	loss 0.5505 (0.5799)	grad_norm 2.0583 (2.8171)	mem 20675MB
[2025-04-03 01:12:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][98/311]	eta 0:03:09 lr 0.000976	time 0.8760 (0.8903)	loss 0.5909 (0.5803)	grad_norm 1.7221 (2.7977)	mem 20675MB
[2025-04-03 01:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][100/311]	eta 0:03:07 lr 0.000975	time 0.8760 (0.8900)	loss 0.6119 (0.5812)	grad_norm 1.4851 (2.7699)	mem 20675MB
[2025-04-03 01:12:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][102/311]	eta 0:03:05 lr 0.000975	time 0.8761 (0.8898)	loss 0.5897 (0.5806)	grad_norm 1.9480 (2.7515)	mem 20675MB
[2025-04-03 01:12:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][104/311]	eta 0:03:04 lr 0.000974	time 0.8761 (0.8895)	loss 0.6734 (0.5812)	grad_norm 2.4035 (2.7449)	mem 20675MB
[2025-04-03 01:12:19 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][106/311]	eta 0:03:02 lr 0.000974	time 0.8760 (0.8893)	loss 0.5699 (0.5810)	grad_norm 2.5455 (2.7478)	mem 20675MB
[2025-04-03 01:12:21 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][108/311]	eta 0:03:00 lr 0.000974	time 0.8758 (0.8891)	loss 0.6200 (0.5810)	grad_norm 1.4490 (2.7270)	mem 20675MB
[2025-04-03 01:12:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][110/311]	eta 0:02:58 lr 0.000973	time 0.8760 (0.8889)	loss 0.5638 (0.5802)	grad_norm 3.8719 (2.7449)	mem 20675MB
[2025-04-03 01:12:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][112/311]	eta 0:02:56 lr 0.000973	time 0.8758 (0.8886)	loss 0.6174 (0.5806)	grad_norm 2.3211 (2.7349)	mem 20675MB
[2025-04-03 01:12:26 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][114/311]	eta 0:02:55 lr 0.000973	time 0.8758 (0.8884)	loss 0.6424 (0.5815)	grad_norm 2.7428 (2.7301)	mem 20675MB
[2025-04-03 01:12:28 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][116/311]	eta 0:02:53 lr 0.000972	time 0.8783 (0.8883)	loss 0.6350 (0.5823)	grad_norm 3.1747 (2.7328)	mem 20675MB
[2025-04-03 01:12:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][118/311]	eta 0:02:51 lr 0.000972	time 0.8760 (0.8881)	loss 0.5613 (0.5825)	grad_norm 3.1770 (2.7339)	mem 20675MB
[2025-04-03 01:12:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][120/311]	eta 0:02:49 lr 0.000972	time 0.8757 (0.8879)	loss 0.5030 (0.5815)	grad_norm 3.0218 (2.7386)	mem 20675MB
[2025-04-03 01:12:33 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][122/311]	eta 0:02:47 lr 0.000971	time 0.8757 (0.8877)	loss 0.6220 (0.5817)	grad_norm 1.7760 (2.7296)	mem 20675MB
[2025-04-03 01:12:35 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][124/311]	eta 0:02:45 lr 0.000971	time 0.8758 (0.8875)	loss 0.5011 (0.5812)	grad_norm 2.7404 (2.7254)	mem 20675MB
[2025-04-03 01:12:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][126/311]	eta 0:02:44 lr 0.000971	time 0.8762 (0.8874)	loss 0.6282 (0.5818)	grad_norm 2.4066 (2.7210)	mem 20675MB
[2025-04-03 01:12:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][128/311]	eta 0:02:42 lr 0.000970	time 0.8757 (0.8872)	loss 0.5742 (0.5820)	grad_norm 1.6310 (2.7040)	mem 20675MB
[2025-04-03 01:12:40 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][130/311]	eta 0:02:40 lr 0.000970	time 0.8760 (0.8870)	loss 0.5968 (0.5828)	grad_norm 2.0878 (2.6965)	mem 20675MB
[2025-04-03 01:12:42 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][132/311]	eta 0:02:38 lr 0.000970	time 0.8761 (0.8869)	loss 0.6510 (0.5835)	grad_norm 2.4920 (2.6875)	mem 20675MB
[2025-04-03 01:12:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][134/311]	eta 0:02:36 lr 0.000969	time 0.8762 (0.8867)	loss 0.6510 (0.5841)	grad_norm 1.6357 (2.6796)	mem 20675MB
[2025-04-03 01:12:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][136/311]	eta 0:02:35 lr 0.000969	time 0.8766 (0.8866)	loss 0.6425 (0.5847)	grad_norm 2.1557 (2.6724)	mem 20675MB
[2025-04-03 01:12:47 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][138/311]	eta 0:02:33 lr 0.000969	time 0.8759 (0.8865)	loss 0.5852 (0.5849)	grad_norm 1.5449 (2.6542)	mem 20675MB
[2025-04-03 01:12:49 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][140/311]	eta 0:02:31 lr 0.000968	time 0.8757 (0.8863)	loss 0.5130 (0.5845)	grad_norm 1.9526 (2.6392)	mem 20675MB
[2025-04-03 01:12:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][142/311]	eta 0:02:29 lr 0.000968	time 0.8760 (0.8862)	loss 0.6254 (0.5852)	grad_norm 1.8262 (2.6277)	mem 20675MB
[2025-04-03 01:12:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][144/311]	eta 0:02:27 lr 0.000967	time 0.8758 (0.8861)	loss 0.5227 (0.5842)	grad_norm 2.3876 (2.6241)	mem 20675MB
[2025-04-03 01:12:54 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][146/311]	eta 0:02:26 lr 0.000967	time 0.8766 (0.8860)	loss 0.6184 (0.5848)	grad_norm 2.9706 (2.6270)	mem 20675MB
[2025-04-03 01:12:56 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][148/311]	eta 0:02:24 lr 0.000967	time 0.8759 (0.8858)	loss 0.5856 (0.5841)	grad_norm 1.9925 (2.6216)	mem 20675MB
[2025-04-03 01:12:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][150/311]	eta 0:02:22 lr 0.000966	time 0.8758 (0.8857)	loss 0.5944 (0.5839)	grad_norm 2.2182 (2.6302)	mem 20675MB
[2025-04-03 01:12:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][152/311]	eta 0:02:20 lr 0.000966	time 0.8772 (0.8856)	loss 0.4670 (0.5832)	grad_norm 3.0593 (2.6325)	mem 20675MB
[2025-04-03 01:13:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][154/311]	eta 0:02:19 lr 0.000966	time 0.8765 (0.8855)	loss 0.5054 (0.5830)	grad_norm 3.8499 (2.6401)	mem 20675MB
[2025-04-03 01:13:03 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][156/311]	eta 0:02:17 lr 0.000965	time 0.8760 (0.8854)	loss 0.6316 (0.5835)	grad_norm 3.2645 (2.6419)	mem 20675MB
[2025-04-03 01:13:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][158/311]	eta 0:02:15 lr 0.000965	time 0.8758 (0.8853)	loss 0.6664 (0.5837)	grad_norm 2.0875 (2.6378)	mem 20675MB
[2025-04-03 01:13:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][160/311]	eta 0:02:13 lr 0.000965	time 0.8769 (0.8852)	loss 0.6110 (0.5838)	grad_norm 2.3265 (2.6371)	mem 20675MB
[2025-04-03 01:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][162/311]	eta 0:02:11 lr 0.000964	time 0.8760 (0.8851)	loss 0.5496 (0.5835)	grad_norm 3.1375 (2.6374)	mem 20675MB
[2025-04-03 01:13:10 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][164/311]	eta 0:02:10 lr 0.000964	time 0.8758 (0.8850)	loss 0.5953 (0.5837)	grad_norm 3.3545 (2.6412)	mem 20675MB
[2025-04-03 01:13:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][166/311]	eta 0:02:08 lr 0.000964	time 0.8760 (0.8849)	loss 0.6353 (0.5838)	grad_norm 2.4275 (2.6402)	mem 20675MB
[2025-04-03 01:13:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][168/311]	eta 0:02:06 lr 0.000963	time 0.8761 (0.8848)	loss 0.6366 (0.5842)	grad_norm 2.2791 (2.6323)	mem 20675MB
[2025-04-03 01:13:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][170/311]	eta 0:02:04 lr 0.000963	time 0.8757 (0.8847)	loss 0.5758 (0.5844)	grad_norm 2.3391 (2.6288)	mem 20675MB
[2025-04-03 01:13:17 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][172/311]	eta 0:02:02 lr 0.000963	time 0.8759 (0.8846)	loss 0.6405 (0.5849)	grad_norm 2.5786 (2.6286)	mem 20675MB
[2025-04-03 01:13:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][174/311]	eta 0:02:01 lr 0.000962	time 0.8761 (0.8845)	loss 0.6196 (0.5849)	grad_norm 1.7514 (2.6199)	mem 20675MB
[2025-04-03 01:13:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][176/311]	eta 0:01:59 lr 0.000962	time 0.8758 (0.8844)	loss 0.6323 (0.5855)	grad_norm 1.3410 (2.6052)	mem 20675MB
[2025-04-03 01:13:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][178/311]	eta 0:01:57 lr 0.000961	time 0.8760 (0.8844)	loss 0.6087 (0.5856)	grad_norm 2.6406 (2.5992)	mem 20675MB
[2025-04-03 01:13:24 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][180/311]	eta 0:01:55 lr 0.000961	time 0.8763 (0.8843)	loss 0.5924 (0.5855)	grad_norm 1.9619 (2.6018)	mem 20675MB
[2025-04-03 01:13:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][182/311]	eta 0:01:54 lr 0.000961	time 0.8762 (0.8842)	loss 0.4739 (0.5844)	grad_norm 2.6357 (2.6032)	mem 20675MB
[2025-04-03 01:13:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][184/311]	eta 0:01:52 lr 0.000960	time 0.8770 (0.8841)	loss 0.6277 (0.5843)	grad_norm 1.8603 (2.5992)	mem 20675MB
[2025-04-03 01:13:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][186/311]	eta 0:01:50 lr 0.000960	time 0.8759 (0.8841)	loss 0.6004 (0.5840)	grad_norm 2.2060 (2.5998)	mem 20675MB
[2025-04-03 01:13:31 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][188/311]	eta 0:01:48 lr 0.000960	time 0.8767 (0.8840)	loss 0.6462 (0.5844)	grad_norm 2.8214 (2.6094)	mem 20675MB
[2025-04-03 01:13:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][190/311]	eta 0:01:46 lr 0.000959	time 0.8762 (0.8839)	loss 0.6613 (0.5849)	grad_norm 3.3344 (2.6141)	mem 20675MB
[2025-04-03 01:13:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][192/311]	eta 0:01:45 lr 0.000959	time 0.8759 (0.8838)	loss 0.5868 (0.5850)	grad_norm 3.6270 (2.6211)	mem 20675MB
[2025-04-03 01:13:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][194/311]	eta 0:01:43 lr 0.000959	time 0.8758 (0.8838)	loss 0.6734 (0.5857)	grad_norm 1.2642 (2.6126)	mem 20675MB
[2025-04-03 01:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][196/311]	eta 0:01:41 lr 0.000958	time 0.8761 (0.8837)	loss 0.5756 (0.5854)	grad_norm 1.9784 (2.6116)	mem 20675MB
[2025-04-03 01:13:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][198/311]	eta 0:01:39 lr 0.000958	time 0.8761 (0.8836)	loss 0.5851 (0.5855)	grad_norm 2.2148 (2.6037)	mem 20675MB
[2025-04-03 01:13:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][200/311]	eta 0:01:38 lr 0.000958	time 0.8763 (0.8836)	loss 0.6374 (0.5852)	grad_norm 1.5319 (2.5972)	mem 20675MB
[2025-04-03 01:13:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][202/311]	eta 0:01:36 lr 0.000957	time 0.8762 (0.8835)	loss 0.6438 (0.5853)	grad_norm 1.9800 (2.5925)	mem 20675MB
[2025-04-03 01:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][204/311]	eta 0:01:34 lr 0.000957	time 0.8761 (0.8835)	loss 0.5326 (0.5852)	grad_norm 2.3544 (2.5875)	mem 20675MB
[2025-04-03 01:13:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][206/311]	eta 0:01:32 lr 0.000956	time 0.8759 (0.8834)	loss 0.5901 (0.5855)	grad_norm 2.2811 (2.5879)	mem 20675MB
[2025-04-03 01:13:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][208/311]	eta 0:01:30 lr 0.000956	time 0.8762 (0.8833)	loss 0.5299 (0.5851)	grad_norm 3.1030 (2.5876)	mem 20675MB
[2025-04-03 01:13:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][210/311]	eta 0:01:29 lr 0.000956	time 0.8762 (0.8833)	loss 0.6095 (0.5850)	grad_norm 2.2460 (2.5858)	mem 20675MB
[2025-04-03 01:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][212/311]	eta 0:01:27 lr 0.000955	time 0.8761 (0.8832)	loss 0.5796 (0.5852)	grad_norm 3.0589 (2.5921)	mem 20675MB
[2025-04-03 01:13:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][214/311]	eta 0:01:25 lr 0.000955	time 0.8760 (0.8831)	loss 0.5600 (0.5852)	grad_norm 1.8937 (2.5873)	mem 20675MB
[2025-04-03 01:13:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][216/311]	eta 0:01:23 lr 0.000955	time 0.8763 (0.8831)	loss 0.6605 (0.5855)	grad_norm 2.3366 (2.5924)	mem 20675MB
[2025-04-03 01:13:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][218/311]	eta 0:01:22 lr 0.000954	time 0.8761 (0.8830)	loss 0.5922 (0.5855)	grad_norm 2.4319 (2.5986)	mem 20675MB
[2025-04-03 01:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][220/311]	eta 0:01:20 lr 0.000954	time 0.8758 (0.8830)	loss 0.5435 (0.5852)	grad_norm 2.2582 (2.5951)	mem 20675MB
[2025-04-03 01:14:01 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][222/311]	eta 0:01:18 lr 0.000954	time 0.8762 (0.8829)	loss 0.5152 (0.5850)	grad_norm 3.0229 (2.5957)	mem 20675MB
[2025-04-03 01:14:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][224/311]	eta 0:01:16 lr 0.000953	time 0.8761 (0.8829)	loss 0.6044 (0.5852)	grad_norm 2.1106 (2.5927)	mem 20675MB
[2025-04-03 01:14:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][226/311]	eta 0:01:15 lr 0.000953	time 0.8772 (0.8828)	loss 0.5680 (0.5846)	grad_norm 3.3206 (2.5970)	mem 20675MB
[2025-04-03 01:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][228/311]	eta 0:01:13 lr 0.000953	time 0.8758 (0.8828)	loss 0.5480 (0.5841)	grad_norm 2.3696 (2.5989)	mem 20675MB
[2025-04-03 01:14:08 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][230/311]	eta 0:01:11 lr 0.000952	time 0.8761 (0.8827)	loss 0.5633 (0.5842)	grad_norm 3.5165 (2.6044)	mem 20675MB
[2025-04-03 01:14:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][232/311]	eta 0:01:09 lr 0.000952	time 0.8760 (0.8827)	loss 0.5611 (0.5841)	grad_norm 3.9744 (2.6122)	mem 20675MB
[2025-04-03 01:14:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][234/311]	eta 0:01:07 lr 0.000951	time 0.8760 (0.8826)	loss 0.6631 (0.5847)	grad_norm 2.8488 (2.6143)	mem 20675MB
[2025-04-03 01:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][236/311]	eta 0:01:06 lr 0.000951	time 0.8760 (0.8826)	loss 0.5957 (0.5849)	grad_norm 2.0512 (2.6107)	mem 20675MB
[2025-04-03 01:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][238/311]	eta 0:01:04 lr 0.000951	time 0.8758 (0.8825)	loss 0.6309 (0.5852)	grad_norm 1.7339 (2.6046)	mem 20675MB
[2025-04-03 01:14:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][240/311]	eta 0:01:02 lr 0.000950	time 0.8761 (0.8825)	loss 0.5766 (0.5851)	grad_norm 2.1373 (2.5992)	mem 20675MB
[2025-04-03 01:14:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][242/311]	eta 0:01:00 lr 0.000950	time 0.8761 (0.8824)	loss 0.5386 (0.5845)	grad_norm 2.2299 (2.5974)	mem 20675MB
[2025-04-03 01:14:20 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][244/311]	eta 0:00:59 lr 0.000950	time 0.8760 (0.8824)	loss 0.5938 (0.5849)	grad_norm 1.7675 (2.5930)	mem 20675MB
[2025-04-03 01:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][246/311]	eta 0:00:57 lr 0.000949	time 0.8757 (0.8823)	loss 0.5309 (0.5851)	grad_norm 2.7057 (2.5913)	mem 20675MB
[2025-04-03 01:14:23 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][248/311]	eta 0:00:55 lr 0.000949	time 0.8758 (0.8823)	loss 0.6084 (0.5849)	grad_norm 1.8064 (2.5900)	mem 20675MB
[2025-04-03 01:14:25 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][250/311]	eta 0:00:53 lr 0.000949	time 0.8758 (0.8822)	loss 0.5965 (0.5849)	grad_norm 2.3715 (2.5874)	mem 20675MB
[2025-04-03 01:14:27 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][252/311]	eta 0:00:52 lr 0.000948	time 0.8760 (0.8822)	loss 0.6272 (0.5854)	grad_norm 2.2170 (2.5842)	mem 20675MB
[2025-04-03 01:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][254/311]	eta 0:00:50 lr 0.000948	time 0.8766 (0.8822)	loss 0.6150 (0.5851)	grad_norm 3.0732 (2.5870)	mem 20675MB
[2025-04-03 01:14:30 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][256/311]	eta 0:00:48 lr 0.000948	time 0.8764 (0.8821)	loss 0.5202 (0.5850)	grad_norm 3.0202 (2.5884)	mem 20675MB
[2025-04-03 01:14:32 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][258/311]	eta 0:00:46 lr 0.000947	time 0.8760 (0.8821)	loss 0.6226 (0.5853)	grad_norm 2.7161 (2.5858)	mem 20675MB
[2025-04-03 01:14:34 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][260/311]	eta 0:00:44 lr 0.000947	time 0.8760 (0.8820)	loss 0.5005 (0.5848)	grad_norm 2.8980 (2.5895)	mem 20675MB
[2025-04-03 01:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][262/311]	eta 0:00:43 lr 0.000946	time 0.8758 (0.8820)	loss 0.6064 (0.5849)	grad_norm 3.2846 (2.5922)	mem 20675MB
[2025-04-03 01:14:37 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][264/311]	eta 0:00:41 lr 0.000946	time 0.8761 (0.8820)	loss 0.5127 (0.5844)	grad_norm 3.3678 (2.5968)	mem 20675MB
[2025-04-03 01:14:39 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][266/311]	eta 0:00:39 lr 0.000946	time 0.8762 (0.8819)	loss 0.5484 (0.5843)	grad_norm 2.5070 (2.6012)	mem 20675MB
[2025-04-03 01:14:41 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][268/311]	eta 0:00:37 lr 0.000945	time 0.8759 (0.8819)	loss 0.6161 (0.5842)	grad_norm 2.6769 (2.6030)	mem 20675MB
[2025-04-03 01:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][270/311]	eta 0:00:36 lr 0.000945	time 0.8760 (0.8819)	loss 0.5534 (0.5842)	grad_norm 2.2608 (2.6000)	mem 20675MB
[2025-04-03 01:14:44 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][272/311]	eta 0:00:34 lr 0.000945	time 0.8760 (0.8818)	loss 0.6235 (0.5844)	grad_norm 2.6725 (2.6038)	mem 20675MB
[2025-04-03 01:14:46 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][274/311]	eta 0:00:32 lr 0.000944	time 0.8761 (0.8818)	loss 0.6255 (0.5845)	grad_norm 2.0353 (2.6012)	mem 20675MB
[2025-04-03 01:14:48 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][276/311]	eta 0:00:30 lr 0.000944	time 0.8765 (0.8818)	loss 0.5797 (0.5840)	grad_norm 2.1243 (2.6015)	mem 20675MB
[2025-04-03 01:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][278/311]	eta 0:00:29 lr 0.000944	time 0.8758 (0.8817)	loss 0.5344 (0.5834)	grad_norm 2.1251 (2.6023)	mem 20675MB
[2025-04-03 01:14:51 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][280/311]	eta 0:00:27 lr 0.000943	time 0.8757 (0.8817)	loss 0.5251 (0.5833)	grad_norm 3.4444 (2.6063)	mem 20675MB
[2025-04-03 01:14:53 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][282/311]	eta 0:00:25 lr 0.000943	time 0.8759 (0.8817)	loss 0.6076 (0.5831)	grad_norm 2.3004 (2.6079)	mem 20675MB
[2025-04-03 01:14:55 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][284/311]	eta 0:00:23 lr 0.000942	time 0.8761 (0.8816)	loss 0.6643 (0.5829)	grad_norm 3.3001 (2.6126)	mem 20675MB
[2025-04-03 01:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][286/311]	eta 0:00:22 lr 0.000942	time 0.8760 (0.8816)	loss 0.5622 (0.5824)	grad_norm 3.4377 (2.6201)	mem 20675MB
[2025-04-03 01:14:58 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][288/311]	eta 0:00:20 lr 0.000942	time 0.8759 (0.8816)	loss 0.6105 (0.5824)	grad_norm 2.0511 (2.6171)	mem 20675MB
[2025-04-03 01:15:00 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][290/311]	eta 0:00:18 lr 0.000941	time 0.8778 (0.8815)	loss 0.6387 (0.5823)	grad_norm 3.0042 (2.6194)	mem 20675MB
[2025-04-03 01:15:02 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][292/311]	eta 0:00:16 lr 0.000941	time 0.8760 (0.8815)	loss 0.6399 (0.5827)	grad_norm 1.7832 (2.6177)	mem 20675MB
[2025-04-03 01:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][294/311]	eta 0:00:14 lr 0.000941	time 0.8759 (0.8815)	loss 0.5786 (0.5828)	grad_norm 2.9861 (2.6166)	mem 20675MB
[2025-04-03 01:15:05 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][296/311]	eta 0:00:13 lr 0.000940	time 0.8757 (0.8814)	loss 0.6130 (0.5831)	grad_norm 2.5563 (2.6164)	mem 20675MB
[2025-04-03 01:15:07 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][298/311]	eta 0:00:11 lr 0.000940	time 0.8754 (0.8814)	loss 0.6054 (0.5833)	grad_norm 2.1493 (2.6115)	mem 20675MB
[2025-04-03 01:15:09 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][300/311]	eta 0:00:09 lr 0.000940	time 0.8754 (0.8814)	loss 0.6390 (0.5833)	grad_norm 1.6074 (2.6083)	mem 20675MB
[2025-04-03 01:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][302/311]	eta 0:00:07 lr 0.000939	time 0.8755 (0.8813)	loss 0.4956 (0.5832)	grad_norm 2.6691 (2.6062)	mem 20675MB
[2025-04-03 01:15:12 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][304/311]	eta 0:00:06 lr 0.000939	time 0.8761 (0.8813)	loss 0.5375 (0.5830)	grad_norm 2.6778 (2.6048)	mem 20675MB
[2025-04-03 01:15:14 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][306/311]	eta 0:00:04 lr 0.000938	time 0.8754 (0.8813)	loss 0.6214 (0.5831)	grad_norm 2.5256 (2.6021)	mem 20675MB
[2025-04-03 01:15:16 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][308/311]	eta 0:00:02 lr 0.000938	time 0.8755 (0.8812)	loss 0.5893 (0.5830)	grad_norm 3.2551 (2.6025)	mem 20675MB
[2025-04-03 01:15:18 simmim_finetune] (main_finetune.py 252): INFO Train: [9/30][310/311]	eta 0:00:00 lr 0.000938	time 0.8758 (0.8812)	loss 0.6244 (0.5833)	grad_norm 2.5847 (2.6058)	mem 20675MB
[2025-04-03 01:15:18 simmim_finetune] (main_finetune.py 260): INFO EPOCH 9 training takes 0:04:34
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.371 (1.371)	Loss 0.5285 (0.5285)	Acc@1 76.562 (76.562)	Mem 20675MB
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 77.46%
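The overall Acc@1 of 77.465 is the sample-weighted mean over the two validation batches (128 + 14 = 142 images), not the mean of the per-batch percentages. A minimal sketch of the AverageMeter-style bookkeeping such scripts typically use (class and variable names here are illustrative, not taken from main_finetune.py):

```python
class AverageMeter:
    """Tracks a running, sample-weighted average of a per-batch metric."""
    def __init__(self):
        self.sum = 0.0
        self.count = 0

    def update(self, val, n=1):
        # val is a per-batch average; weight it by the batch size n
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / self.count

# Reproducing the logged numbers: batch 0 has 128 images at 76.562% Acc@1
# (98/128 correct); batch 1 holds the remaining 14 of the 142 test images.
acc = AverageMeter()
acc.update(100.0 * 98 / 128, n=128)   # 76.562%
acc.update(100.0 * 12 / 14, n=14)     # 85.714%
print(round(acc.avg, 3))              # -> 77.465, matching " * Acc@1 77.465"
```

This also explains why the first-batch accuracy (76.562) differs from the final number: the small trailing batch pulls the weighted average up.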
[2025-04-03 01:15:19 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.5292678432805904e-06, 3.5292678432805904e-06, 5.396008615189853e-06, 5.396008615189853e-06, 8.267917495050257e-06, 8.267917495050257e-06, 1.2686238848681647e-05, 1.2686238848681647e-05, 1.948365631580686e-05, 1.948365631580686e-05, 2.9941221649845654e-05, 2.9941221649845654e-05, 4.6029783702213025e-05, 4.6029783702213025e-05, 7.078141762893205e-05, 7.078141762893205e-05, 0.00010886085443926901, 0.00010886085443926901, 0.000167444603378249, 0.000167444603378249, 0.00025757344789975654, 0.00025757344789975654, 0.000396233208702076, 0.000396233208702076, 0.0006095559176287213, 0.0006095559176287213, 0.0009377447005927908, 0.0009377447005927908]
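The 28 learning rates above come in pairs (weight-decay / no-weight-decay parameter groups) and grow by roughly a factor of 1/0.65 per bucket, matching the configured LAYER_DECAY of 0.65: the head gets the full scheduled LR (about 0.000938 at this point in the cosine schedule) and the patch embedding gets 0.65^13 of it. A minimal sketch of that scaling, assuming the common BEiT-style scheme of num_layers + 2 depth buckets for the 12-block ViT (the exact grouping in main_finetune.py may differ slightly, e.g. in how min_lr interacts with the schedule):

```python
def layer_lr_scales(num_layers=12, layer_decay=0.65):
    # BEiT/SimMIM-style layer-wise LR decay: bucket 0 is the patch embedding,
    # buckets 1..num_layers are the transformer blocks, the last is the head.
    num_buckets = num_layers + 2
    return [layer_decay ** (num_buckets - 1 - i) for i in range(num_buckets)]

scales = layer_lr_scales()          # 14 scales -> 28 groups with the WD split
lr = 0.000938                       # scheduled LR logged for epoch 9
group_lrs = [lr * s for s in scales]
# The deepest bucket gets the full LR; the shallowest gets lr * 0.65**13,
# which is the same order of magnitude as the smallest logged group LR (~3.5e-06).
```

Each scale appears twice in the log because every depth bucket is split into a decayed and a non-decayed parameter group sharing the same learning rate.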
[2025-04-03 01:15:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][0/311]	eta 0:11:41 lr 0.000938	time 2.2564 (2.2564)	loss 0.5184 (0.5184)	grad_norm 4.6209 (4.6209)	mem 20675MB
[2025-04-03 01:15:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][2/311]	eta 0:06:53 lr 0.000937	time 0.8758 (1.3366)	loss 0.4734 (0.5374)	grad_norm 2.5294 (3.0711)	mem 20675MB
[2025-04-03 01:15:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][4/311]	eta 0:05:53 lr 0.000937	time 0.8759 (1.1526)	loss 0.5835 (0.5578)	grad_norm 2.5438 (2.7485)	mem 20675MB
[2025-04-03 01:15:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][6/311]	eta 0:05:27 lr 0.000936	time 0.8760 (1.0738)	loss 0.6607 (0.5981)	grad_norm 3.7227 (3.2130)	mem 20675MB
[2025-04-03 01:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][8/311]	eta 0:05:12 lr 0.000936	time 0.8758 (1.0300)	loss 0.5456 (0.5942)	grad_norm 2.7237 (2.9973)	mem 20675MB
[2025-04-03 01:15:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][10/311]	eta 0:05:01 lr 0.000936	time 0.8760 (1.0021)	loss 0.5913 (0.5949)	grad_norm 2.8462 (2.8706)	mem 20675MB
[2025-04-03 01:15:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][12/311]	eta 0:04:53 lr 0.000935	time 0.8760 (0.9830)	loss 0.5946 (0.5920)	grad_norm 1.8607 (2.7662)	mem 20675MB
[2025-04-03 01:15:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][14/311]	eta 0:04:47 lr 0.000935	time 0.8759 (0.9688)	loss 0.6274 (0.5931)	grad_norm 3.3412 (2.7372)	mem 20675MB
[2025-04-03 01:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][16/311]	eta 0:04:42 lr 0.000935	time 0.8758 (0.9580)	loss 0.5602 (0.5865)	grad_norm 1.8465 (2.6659)	mem 20675MB
[2025-04-03 01:15:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][18/311]	eta 0:04:38 lr 0.000934	time 0.8760 (0.9494)	loss 0.5032 (0.5816)	grad_norm 2.9989 (2.7109)	mem 20675MB
[2025-04-03 01:15:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][20/311]	eta 0:04:34 lr 0.000934	time 0.8759 (0.9425)	loss 0.6196 (0.5862)	grad_norm 2.1316 (2.6832)	mem 20675MB
[2025-04-03 01:15:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][22/311]	eta 0:04:30 lr 0.000934	time 0.8759 (0.9368)	loss 0.6542 (0.5902)	grad_norm 2.1319 (2.6375)	mem 20675MB
[2025-04-03 01:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][24/311]	eta 0:04:27 lr 0.000933	time 0.8757 (0.9320)	loss 0.6392 (0.5909)	grad_norm 2.6901 (2.6417)	mem 20675MB
[2025-04-03 01:15:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][26/311]	eta 0:04:24 lr 0.000933	time 0.8762 (0.9279)	loss 0.4912 (0.5886)	grad_norm 2.6803 (2.6550)	mem 20675MB
[2025-04-03 01:15:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][28/311]	eta 0:04:21 lr 0.000932	time 0.8758 (0.9244)	loss 0.5412 (0.5831)	grad_norm 3.7235 (2.7064)	mem 20675MB
[2025-04-03 01:15:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][30/311]	eta 0:04:18 lr 0.000932	time 0.8757 (0.9213)	loss 0.5879 (0.5828)	grad_norm 3.9282 (2.7854)	mem 20675MB
[2025-04-03 01:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][32/311]	eta 0:04:16 lr 0.000932	time 0.8759 (0.9186)	loss 0.6385 (0.5839)	grad_norm 2.4570 (2.7591)	mem 20675MB
[2025-04-03 01:15:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][34/311]	eta 0:04:13 lr 0.000931	time 0.8760 (0.9162)	loss 0.6042 (0.5850)	grad_norm 1.7838 (2.7163)	mem 20675MB
[2025-04-03 01:15:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][36/311]	eta 0:04:11 lr 0.000931	time 0.8762 (0.9141)	loss 0.6268 (0.5841)	grad_norm 1.7920 (2.7058)	mem 20675MB
[2025-04-03 01:15:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][38/311]	eta 0:04:09 lr 0.000931	time 0.8758 (0.9122)	loss 0.6167 (0.5863)	grad_norm 2.2675 (2.6724)	mem 20675MB
[2025-04-03 01:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][40/311]	eta 0:04:06 lr 0.000930	time 0.8761 (0.9105)	loss 0.4672 (0.5820)	grad_norm 2.3183 (2.6612)	mem 20675MB
[2025-04-03 01:15:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][42/311]	eta 0:04:04 lr 0.000930	time 0.8758 (0.9089)	loss 0.5763 (0.5835)	grad_norm 2.3090 (2.6553)	mem 20675MB
[2025-04-03 01:16:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][44/311]	eta 0:04:02 lr 0.000930	time 0.8761 (0.9075)	loss 0.5120 (0.5831)	grad_norm 2.5312 (2.6405)	mem 20675MB
[2025-04-03 01:16:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][46/311]	eta 0:04:00 lr 0.000929	time 0.8762 (0.9062)	loss 0.6320 (0.5851)	grad_norm 2.5119 (2.6350)	mem 20675MB
[2025-04-03 01:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][48/311]	eta 0:03:58 lr 0.000929	time 0.8765 (0.9050)	loss 0.6201 (0.5847)	grad_norm 2.5942 (2.6585)	mem 20675MB
[2025-04-03 01:16:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][50/311]	eta 0:03:55 lr 0.000928	time 0.8761 (0.9039)	loss 0.5156 (0.5831)	grad_norm 2.6942 (2.6598)	mem 20675MB
[2025-04-03 01:16:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][52/311]	eta 0:03:53 lr 0.000928	time 0.8760 (0.9028)	loss 0.6707 (0.5842)	grad_norm 2.5044 (2.6423)	mem 20675MB
[2025-04-03 01:16:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][54/311]	eta 0:03:51 lr 0.000928	time 0.8766 (0.9019)	loss 0.6531 (0.5860)	grad_norm 2.8997 (2.6389)	mem 20675MB
[2025-04-03 01:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][56/311]	eta 0:03:49 lr 0.000927	time 0.8761 (0.9010)	loss 0.5307 (0.5844)	grad_norm 1.9657 (2.6213)	mem 20675MB
[2025-04-03 01:16:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][58/311]	eta 0:03:47 lr 0.000927	time 0.8758 (0.9002)	loss 0.6090 (0.5850)	grad_norm 1.5382 (2.5897)	mem 20675MB
[2025-04-03 01:16:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][60/311]	eta 0:03:45 lr 0.000927	time 0.8757 (0.8994)	loss 0.5920 (0.5846)	grad_norm 2.3867 (2.5774)	mem 20675MB
[2025-04-03 01:16:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][62/311]	eta 0:03:43 lr 0.000926	time 0.8760 (0.8987)	loss 0.6140 (0.5845)	grad_norm 2.1922 (2.6020)	mem 20675MB
[2025-04-03 01:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][64/311]	eta 0:03:41 lr 0.000926	time 0.8765 (0.8980)	loss 0.4961 (0.5829)	grad_norm 3.1999 (2.5992)	mem 20675MB
[2025-04-03 01:16:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][66/311]	eta 0:03:39 lr 0.000925	time 0.8756 (0.8974)	loss 0.6127 (0.5833)	grad_norm 2.2193 (2.5887)	mem 20675MB
[2025-04-03 01:16:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][68/311]	eta 0:03:37 lr 0.000925	time 0.8759 (0.8968)	loss 0.5715 (0.5832)	grad_norm 3.6546 (2.5954)	mem 20675MB
[2025-04-03 01:16:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][70/311]	eta 0:03:35 lr 0.000925	time 0.8764 (0.8962)	loss 0.6113 (0.5841)	grad_norm 2.5648 (2.5986)	mem 20675MB
[2025-04-03 01:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][72/311]	eta 0:03:34 lr 0.000924	time 0.8757 (0.8957)	loss 0.4743 (0.5830)	grad_norm 4.4452 (2.6336)	mem 20675MB
[2025-04-03 01:16:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][74/311]	eta 0:03:32 lr 0.000924	time 0.8763 (0.8952)	loss 0.5018 (0.5813)	grad_norm 3.3539 (2.6652)	mem 20675MB
[2025-04-03 01:16:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][76/311]	eta 0:03:30 lr 0.000924	time 0.8759 (0.8947)	loss 0.4683 (0.5799)	grad_norm 2.9954 (2.6689)	mem 20675MB
[2025-04-03 01:16:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][78/311]	eta 0:03:28 lr 0.000923	time 0.8757 (0.8943)	loss 0.5679 (0.5808)	grad_norm 1.9567 (2.6551)	mem 20675MB
[2025-04-03 01:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][80/311]	eta 0:03:26 lr 0.000923	time 0.8757 (0.8938)	loss 0.6183 (0.5818)	grad_norm 2.2706 (2.6611)	mem 20675MB
[2025-04-03 01:16:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][82/311]	eta 0:03:24 lr 0.000923	time 0.8759 (0.8934)	loss 0.6148 (0.5813)	grad_norm 1.9669 (2.6717)	mem 20675MB
[2025-04-03 01:16:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][84/311]	eta 0:03:22 lr 0.000922	time 0.8782 (0.8931)	loss 0.5687 (0.5810)	grad_norm 2.1255 (2.6554)	mem 20675MB
[2025-04-03 01:16:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][86/311]	eta 0:03:20 lr 0.000922	time 0.8760 (0.8927)	loss 0.5878 (0.5806)	grad_norm 1.9562 (2.6520)	mem 20675MB
[2025-04-03 01:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][88/311]	eta 0:03:18 lr 0.000921	time 0.8757 (0.8923)	loss 0.6044 (0.5807)	grad_norm 1.9600 (2.6377)	mem 20675MB
[2025-04-03 01:16:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][90/311]	eta 0:03:17 lr 0.000921	time 0.8764 (0.8920)	loss 0.5997 (0.5816)	grad_norm 2.4537 (2.6272)	mem 20675MB
[2025-04-03 01:16:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][92/311]	eta 0:03:15 lr 0.000921	time 0.8765 (0.8917)	loss 0.6247 (0.5812)	grad_norm 1.9865 (2.6222)	mem 20675MB
[2025-04-03 01:16:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][94/311]	eta 0:03:13 lr 0.000920	time 0.8758 (0.8914)	loss 0.6375 (0.5821)	grad_norm 1.9471 (2.6078)	mem 20675MB
[2025-04-03 01:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][96/311]	eta 0:03:11 lr 0.000920	time 0.8766 (0.8911)	loss 0.5727 (0.5828)	grad_norm 2.2030 (2.6112)	mem 20675MB
[2025-04-03 01:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][98/311]	eta 0:03:09 lr 0.000920	time 0.8770 (0.8908)	loss 0.5407 (0.5815)	grad_norm 3.5494 (2.6331)	mem 20675MB
[2025-04-03 01:16:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][100/311]	eta 0:03:07 lr 0.000919	time 0.8761 (0.8905)	loss 0.6249 (0.5826)	grad_norm 1.5765 (2.6199)	mem 20675MB
[2025-04-03 01:16:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][102/311]	eta 0:03:06 lr 0.000919	time 0.8762 (0.8903)	loss 0.5851 (0.5830)	grad_norm 1.7249 (2.6104)	mem 20675MB
[2025-04-03 01:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][104/311]	eta 0:03:04 lr 0.000918	time 0.8759 (0.8900)	loss 0.5538 (0.5832)	grad_norm 2.8745 (2.6075)	mem 20675MB
[2025-04-03 01:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][106/311]	eta 0:03:02 lr 0.000918	time 0.8759 (0.8898)	loss 0.5558 (0.5824)	grad_norm 3.0356 (2.6086)	mem 20675MB
[2025-04-03 01:16:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][108/311]	eta 0:03:00 lr 0.000918	time 0.8760 (0.8895)	loss 0.6275 (0.5825)	grad_norm 2.0766 (2.6010)	mem 20675MB
[2025-04-03 01:16:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][110/311]	eta 0:02:58 lr 0.000917	time 0.8760 (0.8893)	loss 0.5781 (0.5829)	grad_norm 2.2128 (2.5958)	mem 20675MB
[2025-04-03 01:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][112/311]	eta 0:02:56 lr 0.000917	time 0.8759 (0.8891)	loss 0.5924 (0.5830)	grad_norm 3.3902 (2.6150)	mem 20675MB
[2025-04-03 01:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][114/311]	eta 0:02:55 lr 0.000917	time 0.8759 (0.8889)	loss 0.5755 (0.5831)	grad_norm 2.6653 (2.6115)	mem 20675MB
[2025-04-03 01:17:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][116/311]	eta 0:02:53 lr 0.000916	time 0.8760 (0.8887)	loss 0.5634 (0.5832)	grad_norm 2.1231 (2.6015)	mem 20675MB
[2025-04-03 01:17:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][118/311]	eta 0:02:51 lr 0.000916	time 0.8763 (0.8885)	loss 0.6451 (0.5832)	grad_norm 2.6351 (2.6052)	mem 20675MB
[2025-04-03 01:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][120/311]	eta 0:02:49 lr 0.000915	time 0.8775 (0.8883)	loss 0.6751 (0.5845)	grad_norm 2.4371 (2.6029)	mem 20675MB
[2025-04-03 01:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][122/311]	eta 0:02:47 lr 0.000915	time 0.8760 (0.8881)	loss 0.6076 (0.5852)	grad_norm 2.0294 (2.5955)	mem 20675MB
[2025-04-03 01:17:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][124/311]	eta 0:02:46 lr 0.000915	time 0.8758 (0.8879)	loss 0.6453 (0.5857)	grad_norm 1.8566 (2.5920)	mem 20675MB
[2025-04-03 01:17:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][126/311]	eta 0:02:44 lr 0.000914	time 0.8760 (0.8878)	loss 0.6087 (0.5862)	grad_norm 2.3269 (2.5843)	mem 20675MB
[2025-04-03 01:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][128/311]	eta 0:02:42 lr 0.000914	time 0.8762 (0.8876)	loss 0.5636 (0.5860)	grad_norm 2.0578 (2.5704)	mem 20675MB
[2025-04-03 01:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][130/311]	eta 0:02:40 lr 0.000914	time 0.8760 (0.8874)	loss 0.6524 (0.5867)	grad_norm 2.4159 (2.5662)	mem 20675MB
[2025-04-03 01:17:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][132/311]	eta 0:02:38 lr 0.000913	time 0.8760 (0.8873)	loss 0.5296 (0.5862)	grad_norm 2.8690 (2.5639)	mem 20675MB
[2025-04-03 01:17:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][134/311]	eta 0:02:37 lr 0.000913	time 0.8756 (0.8871)	loss 0.5681 (0.5864)	grad_norm 2.4958 (2.5665)	mem 20675MB
[2025-04-03 01:17:21 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][136/311]	eta 0:02:35 lr 0.000912	time 0.8773 (0.8870)	loss 0.5661 (0.5856)	grad_norm 2.5390 (2.5703)	mem 20675MB
[2025-04-03 01:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][138/311]	eta 0:02:33 lr 0.000912	time 0.8762 (0.8868)	loss 0.5788 (0.5864)	grad_norm 2.6462 (2.5710)	mem 20675MB
[2025-04-03 01:17:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][140/311]	eta 0:02:31 lr 0.000912	time 0.8760 (0.8867)	loss 0.6905 (0.5871)	grad_norm 2.8405 (2.5754)	mem 20675MB
[2025-04-03 01:17:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][142/311]	eta 0:02:29 lr 0.000911	time 0.8764 (0.8865)	loss 0.5114 (0.5866)	grad_norm 3.3301 (2.5783)	mem 20675MB
[2025-04-03 01:17:28 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][144/311]	eta 0:02:28 lr 0.000911	time 0.8757 (0.8864)	loss 0.5664 (0.5869)	grad_norm 2.5542 (2.5740)	mem 20675MB
[2025-04-03 01:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][146/311]	eta 0:02:26 lr 0.000911	time 0.8761 (0.8863)	loss 0.5313 (0.5866)	grad_norm 2.6071 (2.5727)	mem 20675MB
[2025-04-03 01:17:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][148/311]	eta 0:02:24 lr 0.000910	time 0.8762 (0.8862)	loss 0.6161 (0.5871)	grad_norm 1.8993 (2.5627)	mem 20675MB
[2025-04-03 01:17:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][150/311]	eta 0:02:22 lr 0.000910	time 0.8765 (0.8860)	loss 0.5648 (0.5872)	grad_norm 3.2546 (2.5625)	mem 20675MB
[2025-04-03 01:17:35 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][152/311]	eta 0:02:20 lr 0.000909	time 0.8765 (0.8859)	loss 0.7228 (0.5875)	grad_norm 3.3870 (2.5785)	mem 20675MB
[2025-04-03 01:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][154/311]	eta 0:02:19 lr 0.000909	time 0.8763 (0.8858)	loss 0.5893 (0.5874)	grad_norm 1.7323 (2.5744)	mem 20675MB
[2025-04-03 01:17:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][156/311]	eta 0:02:17 lr 0.000909	time 0.8761 (0.8857)	loss 0.5911 (0.5876)	grad_norm 3.1104 (2.5746)	mem 20675MB
[2025-04-03 01:17:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][158/311]	eta 0:02:15 lr 0.000908	time 0.8761 (0.8856)	loss 0.6398 (0.5874)	grad_norm 2.2147 (2.5716)	mem 20675MB
[2025-04-03 01:17:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][160/311]	eta 0:02:13 lr 0.000908	time 0.8758 (0.8855)	loss 0.5743 (0.5869)	grad_norm 2.8944 (2.5742)	mem 20675MB
[2025-04-03 01:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][162/311]	eta 0:02:11 lr 0.000908	time 0.8757 (0.8854)	loss 0.5881 (0.5867)	grad_norm 1.8885 (2.5726)	mem 20675MB
[2025-04-03 01:17:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][164/311]	eta 0:02:10 lr 0.000907	time 0.8760 (0.8853)	loss 0.5801 (0.5869)	grad_norm 1.8398 (2.5664)	mem 20675MB
[2025-04-03 01:17:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][166/311]	eta 0:02:08 lr 0.000907	time 0.8765 (0.8852)	loss 0.5940 (0.5870)	grad_norm 2.2909 (2.5685)	mem 20675MB
[2025-04-03 01:17:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][168/311]	eta 0:02:06 lr 0.000906	time 0.8757 (0.8851)	loss 0.5900 (0.5868)	grad_norm 2.5113 (2.5693)	mem 20675MB
[2025-04-03 01:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][170/311]	eta 0:02:04 lr 0.000906	time 0.8758 (0.8850)	loss 0.5534 (0.5864)	grad_norm 2.1385 (2.5771)	mem 20675MB
[2025-04-03 01:17:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][172/311]	eta 0:02:02 lr 0.000906	time 0.8761 (0.8849)	loss 0.6647 (0.5864)	grad_norm 3.2784 (2.5826)	mem 20675MB
[2025-04-03 01:17:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][174/311]	eta 0:02:01 lr 0.000905	time 0.8763 (0.8848)	loss 0.4457 (0.5854)	grad_norm 3.4366 (2.5881)	mem 20675MB
[2025-04-03 01:17:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][176/311]	eta 0:01:59 lr 0.000905	time 0.8761 (0.8847)	loss 0.5102 (0.5850)	grad_norm 2.2457 (2.5893)	mem 20675MB
[2025-04-03 01:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][178/311]	eta 0:01:57 lr 0.000905	time 0.8760 (0.8846)	loss 0.5855 (0.5852)	grad_norm 2.8384 (2.5903)	mem 20675MB
[2025-04-03 01:17:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][180/311]	eta 0:01:55 lr 0.000904	time 0.8758 (0.8845)	loss 0.5898 (0.5850)	grad_norm 2.8294 (2.5887)	mem 20675MB
[2025-04-03 01:18:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][182/311]	eta 0:01:54 lr 0.000904	time 0.8759 (0.8844)	loss 0.6079 (0.5851)	grad_norm 2.2017 (2.5906)	mem 20675MB
[2025-04-03 01:18:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][184/311]	eta 0:01:52 lr 0.000903	time 0.8759 (0.8844)	loss 0.6228 (0.5852)	grad_norm 1.9907 (2.5843)	mem 20675MB
[2025-04-03 01:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][186/311]	eta 0:01:50 lr 0.000903	time 0.8760 (0.8843)	loss 0.5577 (0.5855)	grad_norm 2.2031 (2.5844)	mem 20675MB
[2025-04-03 01:18:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][188/311]	eta 0:01:48 lr 0.000903	time 0.8759 (0.8842)	loss 0.4918 (0.5852)	grad_norm 2.5307 (2.5856)	mem 20675MB
[2025-04-03 01:18:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][190/311]	eta 0:01:46 lr 0.000902	time 0.8760 (0.8841)	loss 0.4817 (0.5849)	grad_norm 3.8698 (2.5891)	mem 20675MB
[2025-04-03 01:18:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][192/311]	eta 0:01:45 lr 0.000902	time 0.8760 (0.8840)	loss 0.6123 (0.5851)	grad_norm 1.9426 (2.5867)	mem 20675MB
[2025-04-03 01:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][194/311]	eta 0:01:43 lr 0.000902	time 0.8761 (0.8840)	loss 0.6321 (0.5851)	grad_norm 1.7348 (2.5812)	mem 20675MB
[2025-04-03 01:18:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][196/311]	eta 0:01:41 lr 0.000901	time 0.8759 (0.8839)	loss 0.5730 (0.5854)	grad_norm 1.7324 (2.5727)	mem 20675MB
[2025-04-03 01:18:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][198/311]	eta 0:01:39 lr 0.000901	time 0.8757 (0.8838)	loss 0.5601 (0.5853)	grad_norm 2.5237 (2.5741)	mem 20675MB
[2025-04-03 01:18:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][200/311]	eta 0:01:38 lr 0.000900	time 0.8760 (0.8838)	loss 0.6371 (0.5858)	grad_norm 2.6094 (2.5741)	mem 20675MB
[2025-04-03 01:18:19 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][202/311]	eta 0:01:36 lr 0.000900	time 0.8758 (0.8837)	loss 0.5163 (0.5854)	grad_norm 2.9224 (2.5796)	mem 20675MB
[2025-04-03 01:18:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][204/311]	eta 0:01:34 lr 0.000900	time 0.8759 (0.8836)	loss 0.6523 (0.5861)	grad_norm 3.8297 (2.5895)	mem 20675MB
[2025-04-03 01:18:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][206/311]	eta 0:01:32 lr 0.000899	time 0.8761 (0.8836)	loss 0.5986 (0.5858)	grad_norm 2.0610 (2.5895)	mem 20675MB
[2025-04-03 01:18:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][208/311]	eta 0:01:30 lr 0.000899	time 0.8759 (0.8835)	loss 0.5662 (0.5861)	grad_norm 1.8784 (2.5868)	mem 20675MB
[2025-04-03 01:18:26 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][210/311]	eta 0:01:29 lr 0.000899	time 0.8758 (0.8834)	loss 0.6228 (0.5863)	grad_norm 1.8325 (2.5830)	mem 20675MB
[2025-04-03 01:18:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][212/311]	eta 0:01:27 lr 0.000898	time 0.8760 (0.8834)	loss 0.5121 (0.5857)	grad_norm 3.1618 (2.5842)	mem 20675MB
[2025-04-03 01:18:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][214/311]	eta 0:01:25 lr 0.000898	time 0.8758 (0.8833)	loss 0.6050 (0.5856)	grad_norm 2.2407 (2.5903)	mem 20675MB
[2025-04-03 01:18:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][216/311]	eta 0:01:23 lr 0.000897	time 0.8761 (0.8832)	loss 0.4823 (0.5847)	grad_norm 2.5180 (2.5913)	mem 20675MB
[2025-04-03 01:18:33 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][218/311]	eta 0:01:22 lr 0.000897	time 0.8759 (0.8832)	loss 0.5755 (0.5846)	grad_norm 2.8133 (2.5908)	mem 20675MB
[2025-04-03 01:18:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][220/311]	eta 0:01:20 lr 0.000897	time 0.8779 (0.8831)	loss 0.5717 (0.5845)	grad_norm 3.5514 (2.5938)	mem 20675MB
[2025-04-03 01:18:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][222/311]	eta 0:01:18 lr 0.000896	time 0.8759 (0.8831)	loss 0.5788 (0.5846)	grad_norm 3.2338 (2.5963)	mem 20675MB
[2025-04-03 01:18:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][224/311]	eta 0:01:16 lr 0.000896	time 0.8762 (0.8830)	loss 0.4380 (0.5834)	grad_norm 3.0439 (2.6003)	mem 20675MB
[2025-04-03 01:18:40 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][226/311]	eta 0:01:15 lr 0.000896	time 0.8764 (0.8830)	loss 0.4712 (0.5829)	grad_norm 4.5382 (2.6157)	mem 20675MB
[2025-04-03 01:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][228/311]	eta 0:01:13 lr 0.000895	time 0.8758 (0.8829)	loss 0.5773 (0.5828)	grad_norm 6.0143 (2.6296)	mem 20675MB
[2025-04-03 01:18:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][230/311]	eta 0:01:11 lr 0.000895	time 0.8760 (0.8829)	loss 0.6703 (0.5828)	grad_norm 3.5801 (2.6409)	mem 20675MB
[2025-04-03 01:18:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][232/311]	eta 0:01:09 lr 0.000894	time 0.8759 (0.8828)	loss 0.6136 (0.5831)	grad_norm 2.8316 (2.6489)	mem 20675MB
[2025-04-03 01:18:47 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][234/311]	eta 0:01:07 lr 0.000894	time 0.8762 (0.8828)	loss 0.6016 (0.5832)	grad_norm 2.1078 (2.6491)	mem 20675MB
[2025-04-03 01:18:49 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][236/311]	eta 0:01:06 lr 0.000894	time 0.8761 (0.8827)	loss 0.6416 (0.5832)	grad_norm 2.6141 (2.6522)	mem 20675MB
[2025-04-03 01:18:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][238/311]	eta 0:01:04 lr 0.000893	time 0.8766 (0.8827)	loss 0.6631 (0.5833)	grad_norm 2.1566 (2.6514)	mem 20675MB
[2025-04-03 01:18:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][240/311]	eta 0:01:02 lr 0.000893	time 0.8758 (0.8826)	loss 0.6227 (0.5837)	grad_norm 1.5396 (2.6442)	mem 20675MB
[2025-04-03 01:18:54 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][242/311]	eta 0:01:00 lr 0.000892	time 0.8761 (0.8826)	loss 0.6572 (0.5842)	grad_norm 1.8668 (2.6364)	mem 20675MB
[2025-04-03 01:18:56 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][244/311]	eta 0:00:59 lr 0.000892	time 0.8757 (0.8825)	loss 0.5106 (0.5836)	grad_norm 3.1228 (2.6357)	mem 20675MB
[2025-04-03 01:18:57 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][246/311]	eta 0:00:57 lr 0.000892	time 0.8762 (0.8825)	loss 0.6788 (0.5839)	grad_norm 2.2735 (2.6311)	mem 20675MB
[2025-04-03 01:18:59 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][248/311]	eta 0:00:55 lr 0.000891	time 0.8758 (0.8824)	loss 0.6040 (0.5839)	grad_norm 1.5958 (2.6223)	mem 20675MB
[2025-04-03 01:19:01 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][250/311]	eta 0:00:53 lr 0.000891	time 0.8760 (0.8824)	loss 0.6000 (0.5840)	grad_norm 1.6644 (2.6162)	mem 20675MB
[2025-04-03 01:19:03 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][252/311]	eta 0:00:52 lr 0.000891	time 0.8762 (0.8824)	loss 0.6264 (0.5841)	grad_norm 2.4043 (2.6127)	mem 20675MB
[2025-04-03 01:19:04 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][254/311]	eta 0:00:50 lr 0.000890	time 0.8761 (0.8823)	loss 0.5271 (0.5835)	grad_norm 3.3721 (2.6152)	mem 20675MB
[2025-04-03 01:19:06 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][256/311]	eta 0:00:48 lr 0.000890	time 0.8761 (0.8823)	loss 0.5935 (0.5838)	grad_norm 2.9060 (2.6170)	mem 20675MB
[2025-04-03 01:19:08 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][258/311]	eta 0:00:46 lr 0.000889	time 0.8758 (0.8822)	loss 0.6737 (0.5840)	grad_norm 2.7594 (2.6165)	mem 20675MB
[2025-04-03 01:19:10 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][260/311]	eta 0:00:44 lr 0.000889	time 0.8759 (0.8822)	loss 0.6227 (0.5840)	grad_norm 2.7356 (2.6210)	mem 20675MB
[2025-04-03 01:19:11 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][262/311]	eta 0:00:43 lr 0.000889	time 0.8759 (0.8821)	loss 0.5020 (0.5833)	grad_norm 2.5744 (2.6205)	mem 20675MB
[2025-04-03 01:19:13 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][264/311]	eta 0:00:41 lr 0.000888	time 0.8759 (0.8821)	loss 0.6064 (0.5832)	grad_norm 2.8929 (2.6216)	mem 20675MB
[2025-04-03 01:19:15 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][266/311]	eta 0:00:39 lr 0.000888	time 0.8759 (0.8821)	loss 0.6373 (0.5834)	grad_norm 2.5908 (2.6194)	mem 20675MB
[2025-04-03 01:19:17 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][268/311]	eta 0:00:37 lr 0.000888	time 0.8763 (0.8820)	loss 0.6284 (0.5836)	grad_norm 3.1832 (2.6195)	mem 20675MB
[2025-04-03 01:19:18 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][270/311]	eta 0:00:36 lr 0.000887	time 0.8759 (0.8820)	loss 0.5789 (0.5832)	grad_norm 2.3927 (2.6203)	mem 20675MB
[2025-04-03 01:19:20 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][272/311]	eta 0:00:34 lr 0.000887	time 0.8760 (0.8820)	loss 0.6311 (0.5829)	grad_norm 2.4153 (2.6218)	mem 20675MB
[2025-04-03 01:19:22 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][274/311]	eta 0:00:32 lr 0.000886	time 0.8759 (0.8819)	loss 0.6201 (0.5833)	grad_norm 1.8845 (2.6251)	mem 20675MB
[2025-04-03 01:19:24 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][276/311]	eta 0:00:30 lr 0.000886	time 0.8758 (0.8819)	loss 0.6556 (0.5837)	grad_norm 1.4592 (2.6187)	mem 20675MB
[2025-04-03 01:19:25 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][278/311]	eta 0:00:29 lr 0.000886	time 0.8759 (0.8818)	loss 0.5180 (0.5837)	grad_norm 2.2760 (2.6148)	mem 20675MB
[2025-04-03 01:19:27 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][280/311]	eta 0:00:27 lr 0.000885	time 0.8762 (0.8818)	loss 0.6244 (0.5837)	grad_norm 1.7203 (2.6087)	mem 20675MB
[2025-04-03 01:19:29 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][282/311]	eta 0:00:25 lr 0.000885	time 0.8760 (0.8818)	loss 0.5564 (0.5835)	grad_norm 2.1519 (2.6058)	mem 20675MB
[2025-04-03 01:19:31 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][284/311]	eta 0:00:23 lr 0.000884	time 0.8757 (0.8817)	loss 0.6049 (0.5840)	grad_norm 3.0912 (2.6084)	mem 20675MB
[2025-04-03 01:19:32 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][286/311]	eta 0:00:22 lr 0.000884	time 0.8760 (0.8817)	loss 0.5273 (0.5836)	grad_norm 3.7784 (2.6118)	mem 20675MB
[2025-04-03 01:19:34 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][288/311]	eta 0:00:20 lr 0.000884	time 0.8759 (0.8817)	loss 0.4444 (0.5831)	grad_norm 4.4052 (2.6235)	mem 20675MB
[2025-04-03 01:19:36 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][290/311]	eta 0:00:18 lr 0.000883	time 0.8760 (0.8816)	loss 0.5342 (0.5831)	grad_norm 3.5907 (2.6253)	mem 20675MB
[2025-04-03 01:19:38 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][292/311]	eta 0:00:16 lr 0.000883	time 0.8758 (0.8816)	loss 0.6425 (0.5834)	grad_norm 3.6930 (2.6279)	mem 20675MB
[2025-04-03 01:19:39 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][294/311]	eta 0:00:14 lr 0.000883	time 0.8759 (0.8816)	loss 0.5736 (0.5834)	grad_norm 2.4059 (2.6245)	mem 20675MB
[2025-04-03 01:19:41 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][296/311]	eta 0:00:13 lr 0.000882	time 0.8759 (0.8815)	loss 0.6459 (0.5836)	grad_norm 2.1831 (2.6209)	mem 20675MB
[2025-04-03 01:19:43 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][298/311]	eta 0:00:11 lr 0.000882	time 0.8760 (0.8815)	loss 0.5028 (0.5833)	grad_norm 3.5118 (2.6246)	mem 20675MB
[2025-04-03 01:19:45 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][300/311]	eta 0:00:09 lr 0.000881	time 0.8761 (0.8815)	loss 0.5991 (0.5837)	grad_norm 2.1207 (2.6250)	mem 20675MB
[2025-04-03 01:19:46 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][302/311]	eta 0:00:07 lr 0.000881	time 0.8759 (0.8814)	loss 0.5062 (0.5833)	grad_norm 2.3950 (2.6268)	mem 20675MB
[2025-04-03 01:19:48 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][304/311]	eta 0:00:06 lr 0.000881	time 0.8756 (0.8814)	loss 0.5760 (0.5835)	grad_norm 1.8147 (2.6232)	mem 20675MB
[2025-04-03 01:19:50 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][306/311]	eta 0:00:04 lr 0.000880	time 0.8759 (0.8814)	loss 0.6086 (0.5836)	grad_norm 2.3338 (2.6194)	mem 20675MB
[2025-04-03 01:19:52 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][308/311]	eta 0:00:02 lr 0.000880	time 0.8756 (0.8813)	loss 0.6552 (0.5835)	grad_norm 2.4309 (2.6196)	mem 20675MB
[2025-04-03 01:19:53 simmim_finetune] (main_finetune.py 252): INFO Train: [10/30][310/311]	eta 0:00:00 lr 0.000879	time 0.8756 (0.8813)	loss 0.5224 (0.5835)	grad_norm 2.6425 (2.6191)	mem 20675MB
[2025-04-03 01:19:54 simmim_finetune] (main_finetune.py 260): INFO EPOCH 10 training takes 0:04:34
[2025-04-03 01:19:54 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt10.pth saving......
[2025-04-03 01:19:57 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt10.pth saved !!!
[2025-04-03 01:19:59 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.414 (1.414)	Loss 0.5333 (0.5333)	Acc@1 77.344 (77.344)	Mem 20675MB
[2025-04-03 01:19:59 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.169
[2025-04-03 01:19:59 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 78.2%
[2025-04-03 01:19:59 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.17%
[2025-04-03 01:19:59 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.3254521031398957e-06, 3.3254521031398957e-06, 5.076169674060223e-06, 5.076169674060223e-06, 7.769581321629956e-06, 7.769581321629956e-06, 1.1913291548660315e-05, 1.1913291548660315e-05, 1.828823035947625e-05, 1.828823035947625e-05, 2.8095828529962307e-05, 2.8095828529962307e-05, 4.3184441099940846e-05, 4.3184441099940846e-05, 6.639769120760014e-05, 6.639769120760014e-05, 0.00010211038368092214, 0.00010211038368092214, 0.0001570529874860329, 0.0001570529874860329, 0.00024158007026312637, 0.00024158007026312637, 0.0003716217360740394, 0.0003716217360740394, 0.000571685837321598, 0.000571685837321598, 0.0008794767623178418, 0.0008794767623178418]
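The 28 group learning rates above come in pairs (weight-decay / no-weight-decay parameters per layer) and shrink geometrically toward the earliest layers, consistent with TRAIN.LAYER_DECAY = 0.65 in the config. A minimal sketch of how such layer-wise scales are typically generated for a 12-block ViT (the function name and layer indexing here are illustrative, not the exact main_finetune.py helpers; the logged values also fold in the warmup/cosine schedule, so they will not match these raw scales exactly):

```python
# Illustrative layer-wise LR decay, assuming 12 transformer blocks plus
# patch embedding (lowest scale) and classification head (scale 1.0).
def layer_scales(num_layers: int = 12, decay: float = 0.65):
    # Index 0 = patch embed, indices 1..num_layers = blocks, last = head.
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

base_lr = 1.25e-3  # TRAIN.BASE_LR from the config header
scales = layer_scales()
group_lrs = [base_lr * s for s in scales]
# Deepest layers train fastest; each step toward the input multiplies the
# learning rate by the decay factor (0.65), matching the geometric spacing
# visible in the logged parameter-group rates.
```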
[2025-04-03 01:20:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][0/311]	eta 0:11:28 lr 0.000879	time 2.2131 (2.2131)	loss 0.6136 (0.6136)	grad_norm 1.9631 (1.9631)	mem 20675MB
[2025-04-03 01:20:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][2/311]	eta 0:06:48 lr 0.000879	time 0.8759 (1.3223)	loss 0.5812 (0.6104)	grad_norm 1.7322 (1.7882)	mem 20675MB
[2025-04-03 01:20:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][4/311]	eta 0:05:51 lr 0.000879	time 0.8763 (1.1442)	loss 0.5497 (0.5914)	grad_norm 2.5648 (2.1537)	mem 20675MB
[2025-04-03 01:20:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][6/311]	eta 0:05:25 lr 0.000878	time 0.8762 (1.0679)	loss 0.5032 (0.5728)	grad_norm 2.7488 (2.2533)	mem 20675MB
[2025-04-03 01:20:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][8/311]	eta 0:05:10 lr 0.000878	time 0.8763 (1.0255)	loss 0.6035 (0.5740)	grad_norm 1.5937 (2.1619)	mem 20675MB
[2025-04-03 01:20:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][10/311]	eta 0:05:00 lr 0.000877	time 0.8760 (0.9985)	loss 0.5373 (0.5671)	grad_norm 3.5178 (2.3135)	mem 20675MB
[2025-04-03 01:20:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][12/311]	eta 0:04:52 lr 0.000877	time 0.8760 (0.9798)	loss 0.5512 (0.5707)	grad_norm 2.5646 (2.3362)	mem 20675MB
[2025-04-03 01:20:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][14/311]	eta 0:04:46 lr 0.000877	time 0.8762 (0.9661)	loss 0.5169 (0.5664)	grad_norm 3.6899 (2.4540)	mem 20675MB
[2025-04-03 01:20:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][16/311]	eta 0:04:41 lr 0.000876	time 0.8762 (0.9556)	loss 0.5562 (0.5632)	grad_norm 3.3933 (2.5893)	mem 20675MB
[2025-04-03 01:20:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][18/311]	eta 0:04:37 lr 0.000876	time 0.8759 (0.9473)	loss 0.6037 (0.5696)	grad_norm 2.1482 (2.6483)	mem 20675MB
[2025-04-03 01:20:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][20/311]	eta 0:04:33 lr 0.000875	time 0.8761 (0.9405)	loss 0.6560 (0.5740)	grad_norm 2.6130 (2.6751)	mem 20675MB
[2025-04-03 01:20:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][22/311]	eta 0:04:30 lr 0.000875	time 0.8758 (0.9350)	loss 0.5299 (0.5746)	grad_norm 5.2337 (2.7354)	mem 20675MB
[2025-04-03 01:20:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][24/311]	eta 0:04:27 lr 0.000875	time 0.8759 (0.9303)	loss 0.6075 (0.5779)	grad_norm 2.0784 (2.6833)	mem 20675MB
[2025-04-03 01:20:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][26/311]	eta 0:04:24 lr 0.000874	time 0.8758 (0.9264)	loss 0.6323 (0.5803)	grad_norm 2.0412 (2.6621)	mem 20675MB
[2025-04-03 01:20:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][28/311]	eta 0:04:21 lr 0.000874	time 0.8758 (0.9229)	loss 0.6028 (0.5783)	grad_norm 2.0242 (2.6232)	mem 20675MB
[2025-04-03 01:20:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][30/311]	eta 0:04:18 lr 0.000874	time 0.8758 (0.9200)	loss 0.5798 (0.5769)	grad_norm 1.7213 (2.5951)	mem 20675MB
[2025-04-03 01:20:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][32/311]	eta 0:04:15 lr 0.000873	time 0.8762 (0.9173)	loss 0.4605 (0.5739)	grad_norm 2.4694 (2.5805)	mem 20675MB
[2025-04-03 01:20:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][34/311]	eta 0:04:13 lr 0.000873	time 0.8762 (0.9150)	loss 0.6631 (0.5784)	grad_norm 2.6020 (2.5610)	mem 20675MB
[2025-04-03 01:20:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][36/311]	eta 0:04:11 lr 0.000872	time 0.8761 (0.9130)	loss 0.6252 (0.5824)	grad_norm 3.3642 (2.5763)	mem 20675MB
[2025-04-03 01:20:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][38/311]	eta 0:04:08 lr 0.000872	time 0.8762 (0.9111)	loss 0.5325 (0.5782)	grad_norm 2.4271 (2.5856)	mem 20675MB
[2025-04-03 01:20:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][40/311]	eta 0:04:06 lr 0.000872	time 0.8759 (0.9094)	loss 0.5599 (0.5785)	grad_norm 2.7731 (2.5831)	mem 20675MB
[2025-04-03 01:20:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][42/311]	eta 0:04:04 lr 0.000871	time 0.8758 (0.9079)	loss 0.4925 (0.5781)	grad_norm 2.7338 (2.5713)	mem 20675MB
[2025-04-03 01:20:40 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][44/311]	eta 0:04:02 lr 0.000871	time 0.8758 (0.9065)	loss 0.4938 (0.5742)	grad_norm 3.3117 (2.6165)	mem 20675MB
[2025-04-03 01:20:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][46/311]	eta 0:03:59 lr 0.000870	time 0.8762 (0.9053)	loss 0.4676 (0.5740)	grad_norm 3.2705 (2.6345)	mem 20675MB
[2025-04-03 01:20:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][48/311]	eta 0:03:57 lr 0.000870	time 0.8758 (0.9041)	loss 0.6764 (0.5760)	grad_norm 2.6420 (2.6504)	mem 20675MB
[2025-04-03 01:20:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][50/311]	eta 0:03:55 lr 0.000870	time 0.8763 (0.9031)	loss 0.5277 (0.5745)	grad_norm 2.7071 (2.6652)	mem 20675MB
[2025-04-03 01:20:47 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][52/311]	eta 0:03:53 lr 0.000869	time 0.8764 (0.9021)	loss 0.6148 (0.5754)	grad_norm 2.8982 (2.6646)	mem 20675MB
[2025-04-03 01:20:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][54/311]	eta 0:03:51 lr 0.000869	time 0.8760 (0.9012)	loss 0.5674 (0.5753)	grad_norm 2.3796 (2.6564)	mem 20675MB
[2025-04-03 01:20:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][56/311]	eta 0:03:49 lr 0.000868	time 0.8765 (0.9003)	loss 0.6742 (0.5758)	grad_norm 3.2445 (2.6887)	mem 20675MB
[2025-04-03 01:20:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][58/311]	eta 0:03:47 lr 0.000868	time 0.8764 (0.8996)	loss 0.5602 (0.5759)	grad_norm 2.1969 (2.6895)	mem 20675MB
[2025-04-03 01:20:54 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][60/311]	eta 0:03:45 lr 0.000868	time 0.8761 (0.8988)	loss 0.5919 (0.5770)	grad_norm 2.6474 (2.6858)	mem 20675MB
[2025-04-03 01:20:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][62/311]	eta 0:03:43 lr 0.000867	time 0.8767 (0.8982)	loss 0.5614 (0.5756)	grad_norm 2.4062 (2.6828)	mem 20675MB
[2025-04-03 01:20:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][64/311]	eta 0:03:41 lr 0.000867	time 0.8759 (0.8975)	loss 0.5574 (0.5747)	grad_norm 2.2187 (2.6733)	mem 20675MB
[2025-04-03 01:20:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][66/311]	eta 0:03:39 lr 0.000867	time 0.8761 (0.8969)	loss 0.6662 (0.5773)	grad_norm 2.4204 (2.6632)	mem 20675MB
[2025-04-03 01:21:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][68/311]	eta 0:03:37 lr 0.000866	time 0.8766 (0.8963)	loss 0.5439 (0.5766)	grad_norm 2.2675 (2.6739)	mem 20675MB
[2025-04-03 01:21:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][70/311]	eta 0:03:35 lr 0.000866	time 0.8761 (0.8958)	loss 0.6051 (0.5770)	grad_norm 1.9944 (2.6651)	mem 20675MB
[2025-04-03 01:21:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][72/311]	eta 0:03:33 lr 0.000865	time 0.8761 (0.8953)	loss 0.5691 (0.5780)	grad_norm 2.3257 (2.6540)	mem 20675MB
[2025-04-03 01:21:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][74/311]	eta 0:03:32 lr 0.000865	time 0.8772 (0.8948)	loss 0.6142 (0.5772)	grad_norm 2.6567 (2.6637)	mem 20675MB
[2025-04-03 01:21:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][76/311]	eta 0:03:30 lr 0.000865	time 0.8759 (0.8943)	loss 0.6238 (0.5771)	grad_norm 3.9493 (2.6885)	mem 20675MB
[2025-04-03 01:21:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][78/311]	eta 0:03:28 lr 0.000864	time 0.8768 (0.8939)	loss 0.6129 (0.5762)	grad_norm 1.9243 (2.6959)	mem 20675MB
[2025-04-03 01:21:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][80/311]	eta 0:03:26 lr 0.000864	time 0.8758 (0.8935)	loss 0.6223 (0.5770)	grad_norm 1.9362 (2.6911)	mem 20675MB
[2025-04-03 01:21:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][82/311]	eta 0:03:24 lr 0.000863	time 0.8769 (0.8931)	loss 0.6665 (0.5791)	grad_norm 2.7265 (2.6974)	mem 20675MB
[2025-04-03 01:21:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][84/311]	eta 0:03:22 lr 0.000863	time 0.8762 (0.8927)	loss 0.6622 (0.5801)	grad_norm 3.3922 (2.7055)	mem 20675MB
[2025-04-03 01:21:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][86/311]	eta 0:03:20 lr 0.000863	time 0.8759 (0.8924)	loss 0.5642 (0.5803)	grad_norm 1.7311 (2.6836)	mem 20675MB
[2025-04-03 01:21:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][88/311]	eta 0:03:18 lr 0.000862	time 0.8761 (0.8920)	loss 0.4831 (0.5799)	grad_norm 2.4104 (2.6717)	mem 20675MB
[2025-04-03 01:21:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][90/311]	eta 0:03:17 lr 0.000862	time 0.8763 (0.8917)	loss 0.5792 (0.5795)	grad_norm 1.9473 (2.6642)	mem 20675MB
[2025-04-03 01:21:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][92/311]	eta 0:03:15 lr 0.000861	time 0.8757 (0.8914)	loss 0.5123 (0.5791)	grad_norm 3.0885 (2.6632)	mem 20675MB
[2025-04-03 01:21:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][94/311]	eta 0:03:13 lr 0.000861	time 0.8762 (0.8911)	loss 0.6147 (0.5798)	grad_norm 2.5224 (2.6521)	mem 20675MB
[2025-04-03 01:21:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][96/311]	eta 0:03:11 lr 0.000861	time 0.8763 (0.8908)	loss 0.5391 (0.5785)	grad_norm 2.4644 (2.6537)	mem 20675MB
[2025-04-03 01:21:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][98/311]	eta 0:03:09 lr 0.000860	time 0.8759 (0.8905)	loss 0.5509 (0.5780)	grad_norm 2.3607 (2.6512)	mem 20675MB
[2025-04-03 01:21:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][100/311]	eta 0:03:07 lr 0.000860	time 0.8758 (0.8902)	loss 0.5874 (0.5772)	grad_norm 2.9620 (2.6594)	mem 20675MB
[2025-04-03 01:21:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][102/311]	eta 0:03:06 lr 0.000860	time 0.8761 (0.8900)	loss 0.6728 (0.5783)	grad_norm 3.2507 (2.6618)	mem 20675MB
[2025-04-03 01:21:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][104/311]	eta 0:03:04 lr 0.000859	time 0.8761 (0.8897)	loss 0.4896 (0.5773)	grad_norm 3.1287 (2.6665)	mem 20675MB
[2025-04-03 01:21:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][106/311]	eta 0:03:02 lr 0.000859	time 0.8759 (0.8895)	loss 0.5810 (0.5775)	grad_norm 3.1820 (2.6669)	mem 20675MB
[2025-04-03 01:21:36 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][108/311]	eta 0:03:00 lr 0.000858	time 0.8783 (0.8893)	loss 0.6267 (0.5779)	grad_norm 3.2503 (2.6679)	mem 20675MB
[2025-04-03 01:21:38 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][110/311]	eta 0:02:58 lr 0.000858	time 0.8784 (0.8891)	loss 0.5729 (0.5778)	grad_norm 2.0648 (2.6694)	mem 20675MB
[2025-04-03 01:21:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][112/311]	eta 0:02:56 lr 0.000858	time 0.8763 (0.8889)	loss 0.5587 (0.5783)	grad_norm 2.4012 (2.6682)	mem 20675MB
[2025-04-03 01:21:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][114/311]	eta 0:02:55 lr 0.000857	time 0.8762 (0.8887)	loss 0.5540 (0.5783)	grad_norm 2.8366 (2.6666)	mem 20675MB
[2025-04-03 01:21:43 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][116/311]	eta 0:02:53 lr 0.000857	time 0.8758 (0.8885)	loss 0.5585 (0.5784)	grad_norm 2.1917 (2.6569)	mem 20675MB
[2025-04-03 01:21:45 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][118/311]	eta 0:02:51 lr 0.000856	time 0.8761 (0.8883)	loss 0.5085 (0.5780)	grad_norm 2.8294 (2.6538)	mem 20675MB
[2025-04-03 01:21:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][120/311]	eta 0:02:49 lr 0.000856	time 0.8760 (0.8881)	loss 0.5468 (0.5777)	grad_norm 2.0805 (2.6485)	mem 20675MB
[2025-04-03 01:21:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][122/311]	eta 0:02:47 lr 0.000856	time 0.8761 (0.8879)	loss 0.5922 (0.5769)	grad_norm 3.0313 (2.6529)	mem 20675MB
[2025-04-03 01:21:50 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][124/311]	eta 0:02:46 lr 0.000855	time 0.8763 (0.8877)	loss 0.5616 (0.5770)	grad_norm 2.6030 (2.6518)	mem 20675MB
[2025-04-03 01:21:52 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][126/311]	eta 0:02:44 lr 0.000855	time 0.8763 (0.8876)	loss 0.5369 (0.5757)	grad_norm 2.1931 (2.6511)	mem 20675MB
[2025-04-03 01:21:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][128/311]	eta 0:02:42 lr 0.000854	time 0.8763 (0.8874)	loss 0.6794 (0.5762)	grad_norm 3.8703 (2.6622)	mem 20675MB
[2025-04-03 01:21:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][130/311]	eta 0:02:40 lr 0.000854	time 0.8759 (0.8872)	loss 0.6143 (0.5759)	grad_norm 2.2621 (2.6829)	mem 20675MB
[2025-04-03 01:21:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][132/311]	eta 0:02:38 lr 0.000854	time 0.8760 (0.8871)	loss 0.5173 (0.5756)	grad_norm 4.8725 (2.6992)	mem 20675MB
[2025-04-03 01:21:59 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][134/311]	eta 0:02:36 lr 0.000853	time 0.8760 (0.8869)	loss 0.6151 (0.5762)	grad_norm 2.2460 (2.6953)	mem 20675MB
[2025-04-03 01:22:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][136/311]	eta 0:02:35 lr 0.000853	time 0.8759 (0.8868)	loss 0.6412 (0.5767)	grad_norm 2.6186 (2.6863)	mem 20675MB
[2025-04-03 01:22:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][138/311]	eta 0:02:33 lr 0.000852	time 0.8760 (0.8866)	loss 0.6471 (0.5777)	grad_norm 1.4585 (2.6717)	mem 20675MB
[2025-04-03 01:22:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][140/311]	eta 0:02:31 lr 0.000852	time 0.8761 (0.8865)	loss 0.6029 (0.5776)	grad_norm 2.1639 (2.6660)	mem 20675MB
[2025-04-03 01:22:06 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][142/311]	eta 0:02:29 lr 0.000852	time 0.8762 (0.8864)	loss 0.6391 (0.5783)	grad_norm 1.6314 (2.6524)	mem 20675MB
[2025-04-03 01:22:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][144/311]	eta 0:02:28 lr 0.000851	time 0.8763 (0.8862)	loss 0.5127 (0.5781)	grad_norm 3.3959 (2.6488)	mem 20675MB
[2025-04-03 01:22:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][146/311]	eta 0:02:26 lr 0.000851	time 0.8762 (0.8861)	loss 0.5642 (0.5782)	grad_norm 1.8183 (2.6372)	mem 20675MB
[2025-04-03 01:22:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][148/311]	eta 0:02:24 lr 0.000851	time 0.8761 (0.8860)	loss 0.5239 (0.5781)	grad_norm 3.1883 (2.6357)	mem 20675MB
[2025-04-03 01:22:13 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][150/311]	eta 0:02:22 lr 0.000850	time 0.8759 (0.8859)	loss 0.6215 (0.5782)	grad_norm 3.7661 (2.6457)	mem 20675MB
[2025-04-03 01:22:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][152/311]	eta 0:02:20 lr 0.000850	time 0.8764 (0.8858)	loss 0.5908 (0.5783)	grad_norm 2.3978 (2.6428)	mem 20675MB
[2025-04-03 01:22:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][154/311]	eta 0:02:19 lr 0.000849	time 0.8759 (0.8857)	loss 0.6154 (0.5785)	grad_norm 2.5689 (2.6514)	mem 20675MB
[2025-04-03 01:22:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][156/311]	eta 0:02:17 lr 0.000849	time 0.8760 (0.8855)	loss 0.6672 (0.5794)	grad_norm 2.5327 (2.6466)	mem 20675MB
[2025-04-03 01:22:20 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][158/311]	eta 0:02:15 lr 0.000849	time 0.8762 (0.8854)	loss 0.5646 (0.5797)	grad_norm 2.3240 (2.6434)	mem 20675MB
[2025-04-03 01:22:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][160/311]	eta 0:02:13 lr 0.000848	time 0.8757 (0.8853)	loss 0.4680 (0.5793)	grad_norm 2.2487 (2.6349)	mem 20675MB
[2025-04-03 01:22:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][162/311]	eta 0:02:11 lr 0.000848	time 0.8765 (0.8852)	loss 0.5891 (0.5790)	grad_norm 2.1054 (2.6306)	mem 20675MB
[2025-04-03 01:22:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][164/311]	eta 0:02:10 lr 0.000847	time 0.8759 (0.8851)	loss 0.5962 (0.5794)	grad_norm 1.9224 (2.6239)	mem 20675MB
[2025-04-03 01:22:27 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][166/311]	eta 0:02:08 lr 0.000847	time 0.8766 (0.8850)	loss 0.5806 (0.5798)	grad_norm 2.4674 (2.6220)	mem 20675MB
[2025-04-03 01:22:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][168/311]	eta 0:02:06 lr 0.000847	time 0.8759 (0.8849)	loss 0.5017 (0.5796)	grad_norm 2.6754 (2.6188)	mem 20675MB
[2025-04-03 01:22:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][170/311]	eta 0:02:04 lr 0.000846	time 0.8760 (0.8848)	loss 0.6166 (0.5799)	grad_norm 2.2620 (2.6120)	mem 20675MB
[2025-04-03 01:22:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][172/311]	eta 0:02:02 lr 0.000846	time 0.8763 (0.8848)	loss 0.5498 (0.5792)	grad_norm 1.6615 (2.6045)	mem 20675MB
[2025-04-03 01:22:34 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][174/311]	eta 0:02:01 lr 0.000845	time 0.8760 (0.8847)	loss 0.5170 (0.5790)	grad_norm 2.4486 (2.6006)	mem 20675MB
[2025-04-03 01:22:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][176/311]	eta 0:01:59 lr 0.000845	time 0.8759 (0.8846)	loss 0.5277 (0.5789)	grad_norm 3.1946 (2.6001)	mem 20675MB
[2025-04-03 01:22:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][178/311]	eta 0:01:57 lr 0.000845	time 0.8759 (0.8845)	loss 0.5999 (0.5790)	grad_norm 2.5837 (2.5979)	mem 20675MB
[2025-04-03 01:22:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][180/311]	eta 0:01:55 lr 0.000844	time 0.8760 (0.8844)	loss 0.5530 (0.5783)	grad_norm 2.6046 (2.6034)	mem 20675MB
[2025-04-03 01:22:41 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][182/311]	eta 0:01:54 lr 0.000844	time 0.8760 (0.8843)	loss 0.6412 (0.5778)	grad_norm 1.8090 (2.6006)	mem 20675MB
[2025-04-03 01:22:42 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][184/311]	eta 0:01:52 lr 0.000843	time 0.8761 (0.8843)	loss 0.5873 (0.5781)	grad_norm 2.8660 (2.6017)	mem 20675MB
[2025-04-03 01:22:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][186/311]	eta 0:01:50 lr 0.000843	time 0.8760 (0.8842)	loss 0.5965 (0.5784)	grad_norm 2.5446 (2.6052)	mem 20675MB
[2025-04-03 01:22:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][188/311]	eta 0:01:48 lr 0.000843	time 0.8761 (0.8841)	loss 0.6744 (0.5782)	grad_norm 2.9726 (2.6157)	mem 20675MB
[2025-04-03 01:22:48 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][190/311]	eta 0:01:46 lr 0.000842	time 0.8765 (0.8840)	loss 0.6588 (0.5788)	grad_norm 2.4196 (2.6170)	mem 20675MB
[2025-04-03 01:22:49 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][192/311]	eta 0:01:45 lr 0.000842	time 0.8758 (0.8840)	loss 0.6530 (0.5792)	grad_norm 1.6100 (2.6116)	mem 20675MB
[2025-04-03 01:22:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][194/311]	eta 0:01:43 lr 0.000841	time 0.8759 (0.8839)	loss 0.6453 (0.5796)	grad_norm 1.9066 (2.6046)	mem 20675MB
[2025-04-03 01:22:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][196/311]	eta 0:01:41 lr 0.000841	time 0.8759 (0.8838)	loss 0.6136 (0.5796)	grad_norm 1.7076 (2.5942)	mem 20675MB
[2025-04-03 01:22:55 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][198/311]	eta 0:01:39 lr 0.000841	time 0.8760 (0.8837)	loss 0.5829 (0.5795)	grad_norm 1.6956 (2.5917)	mem 20675MB
[2025-04-03 01:22:57 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][200/311]	eta 0:01:38 lr 0.000840	time 0.8760 (0.8837)	loss 0.5754 (0.5796)	grad_norm 1.7742 (2.5829)	mem 20675MB
[2025-04-03 01:22:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][202/311]	eta 0:01:36 lr 0.000840	time 0.8761 (0.8836)	loss 0.5962 (0.5794)	grad_norm 2.1225 (2.5773)	mem 20675MB
[2025-04-03 01:23:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][204/311]	eta 0:01:34 lr 0.000840	time 0.8760 (0.8835)	loss 0.6556 (0.5801)	grad_norm 2.1264 (2.5733)	mem 20675MB
[2025-04-03 01:23:02 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][206/311]	eta 0:01:32 lr 0.000839	time 0.8760 (0.8835)	loss 0.5926 (0.5801)	grad_norm 1.9727 (2.5708)	mem 20675MB
[2025-04-03 01:23:04 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][208/311]	eta 0:01:30 lr 0.000839	time 0.8758 (0.8834)	loss 0.6350 (0.5806)	grad_norm 2.3442 (2.5672)	mem 20675MB
[2025-04-03 01:23:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][210/311]	eta 0:01:29 lr 0.000838	time 0.8759 (0.8833)	loss 0.6123 (0.5807)	grad_norm 2.4021 (2.5694)	mem 20675MB
[2025-04-03 01:23:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][212/311]	eta 0:01:27 lr 0.000838	time 0.8764 (0.8833)	loss 0.6012 (0.5810)	grad_norm 2.2624 (2.5677)	mem 20675MB
[2025-04-03 01:23:09 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][214/311]	eta 0:01:25 lr 0.000838	time 0.8759 (0.8832)	loss 0.5651 (0.5812)	grad_norm 2.1462 (2.5719)	mem 20675MB
[2025-04-03 01:23:11 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][216/311]	eta 0:01:23 lr 0.000837	time 0.8761 (0.8832)	loss 0.5838 (0.5810)	grad_norm 1.9813 (2.5694)	mem 20675MB
[2025-04-03 01:23:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][218/311]	eta 0:01:22 lr 0.000837	time 0.8762 (0.8831)	loss 0.5262 (0.5805)	grad_norm 2.9901 (2.5738)	mem 20675MB
[2025-04-03 01:23:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][220/311]	eta 0:01:20 lr 0.000836	time 0.8758 (0.8831)	loss 0.6066 (0.5807)	grad_norm 3.9027 (2.5798)	mem 20675MB
[2025-04-03 01:23:16 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][222/311]	eta 0:01:18 lr 0.000836	time 0.8760 (0.8830)	loss 0.5974 (0.5810)	grad_norm 2.8461 (2.5791)	mem 20675MB
[2025-04-03 01:23:18 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][224/311]	eta 0:01:16 lr 0.000836	time 0.8759 (0.8829)	loss 0.5873 (0.5813)	grad_norm 2.1721 (2.5774)	mem 20675MB
[2025-04-03 01:23:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][226/311]	eta 0:01:15 lr 0.000835	time 0.8761 (0.8829)	loss 0.6192 (0.5813)	grad_norm 2.1038 (2.5737)	mem 20675MB
[2025-04-03 01:23:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][228/311]	eta 0:01:13 lr 0.000835	time 0.8759 (0.8828)	loss 0.5754 (0.5810)	grad_norm 2.1810 (2.5740)	mem 20675MB
[2025-04-03 01:23:23 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][230/311]	eta 0:01:11 lr 0.000834	time 0.8761 (0.8828)	loss 0.6194 (0.5812)	grad_norm 2.8474 (2.5731)	mem 20675MB
[2025-04-03 01:23:25 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][232/311]	eta 0:01:09 lr 0.000834	time 0.8760 (0.8827)	loss 0.4623 (0.5808)	grad_norm 3.6465 (2.5738)	mem 20675MB
[2025-04-03 01:23:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][234/311]	eta 0:01:07 lr 0.000834	time 0.8761 (0.8827)	loss 0.5825 (0.5807)	grad_norm 2.6606 (2.5774)	mem 20675MB
[2025-04-03 01:23:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][236/311]	eta 0:01:06 lr 0.000833	time 0.8760 (0.8826)	loss 0.5897 (0.5805)	grad_norm 2.6811 (2.5796)	mem 20675MB
[2025-04-03 01:23:30 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][238/311]	eta 0:01:04 lr 0.000833	time 0.8759 (0.8826)	loss 0.6193 (0.5804)	grad_norm 2.3297 (2.5857)	mem 20675MB
[2025-04-03 01:23:32 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][240/311]	eta 0:01:02 lr 0.000832	time 0.8757 (0.8825)	loss 0.5789 (0.5807)	grad_norm 2.4958 (2.5825)	mem 20675MB
[2025-04-03 01:23:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][242/311]	eta 0:01:00 lr 0.000832	time 0.8760 (0.8825)	loss 0.5399 (0.5806)	grad_norm 2.6924 (2.5831)	mem 20675MB
[2025-04-03 01:23:35 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][244/311]	eta 0:00:59 lr 0.000832	time 0.8760 (0.8825)	loss 0.6310 (0.5808)	grad_norm 2.8036 (2.5820)	mem 20675MB
[2025-04-03 01:23:37 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][246/311]	eta 0:00:57 lr 0.000831	time 0.8759 (0.8824)	loss 0.4385 (0.5803)	grad_norm 3.2883 (2.5809)	mem 20675MB
[2025-04-03 01:23:39 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][248/311]	eta 0:00:55 lr 0.000831	time 0.8761 (0.8824)	loss 0.5746 (0.5804)	grad_norm 2.5645 (2.5821)	mem 20675MB
[2025-04-03 01:23:40 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][250/311]	eta 0:00:53 lr 0.000830	time 0.8757 (0.8823)	loss 0.6126 (0.5805)	grad_norm 1.8998 (2.5793)	mem 20675MB
[2025-04-03 01:23:42 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][252/311]	eta 0:00:52 lr 0.000830	time 0.8757 (0.8823)	loss 0.6240 (0.5807)	grad_norm 1.8618 (2.5769)	mem 20675MB
[2025-04-03 01:23:44 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][254/311]	eta 0:00:50 lr 0.000830	time 0.8760 (0.8822)	loss 0.5388 (0.5806)	grad_norm 2.3441 (2.5760)	mem 20675MB
[2025-04-03 01:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][256/311]	eta 0:00:48 lr 0.000829	time 0.8758 (0.8822)	loss 0.5756 (0.5802)	grad_norm 2.2333 (2.5768)	mem 20675MB
[2025-04-03 01:23:47 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][258/311]	eta 0:00:46 lr 0.000829	time 0.8758 (0.8821)	loss 0.5832 (0.5801)	grad_norm 2.0495 (2.5739)	mem 20675MB
[2025-04-03 01:23:49 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][260/311]	eta 0:00:44 lr 0.000828	time 0.8760 (0.8821)	loss 0.6595 (0.5805)	grad_norm 2.2946 (2.5735)	mem 20675MB
[2025-04-03 01:23:51 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][262/311]	eta 0:00:43 lr 0.000828	time 0.8756 (0.8821)	loss 0.5196 (0.5803)	grad_norm 2.6598 (2.5702)	mem 20675MB
[2025-04-03 01:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][264/311]	eta 0:00:41 lr 0.000828	time 0.8759 (0.8820)	loss 0.5481 (0.5802)	grad_norm 3.9797 (2.5780)	mem 20675MB
[2025-04-03 01:23:54 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][266/311]	eta 0:00:39 lr 0.000827	time 0.8758 (0.8820)	loss 0.6019 (0.5803)	grad_norm 2.2483 (2.5757)	mem 20675MB
[2025-04-03 01:23:56 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][268/311]	eta 0:00:37 lr 0.000827	time 0.8757 (0.8819)	loss 0.6134 (0.5805)	grad_norm 1.7548 (2.5726)	mem 20675MB
[2025-04-03 01:23:58 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][270/311]	eta 0:00:36 lr 0.000826	time 0.8759 (0.8819)	loss 0.6270 (0.5805)	grad_norm 1.8262 (2.5733)	mem 20675MB
[2025-04-03 01:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][272/311]	eta 0:00:34 lr 0.000826	time 0.8759 (0.8819)	loss 0.5595 (0.5803)	grad_norm 2.4213 (2.5741)	mem 20675MB
[2025-04-03 01:24:01 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][274/311]	eta 0:00:32 lr 0.000826	time 0.8761 (0.8818)	loss 0.5019 (0.5804)	grad_norm 2.8013 (2.5768)	mem 20675MB
[2025-04-03 01:24:03 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][276/311]	eta 0:00:30 lr 0.000825	time 0.8758 (0.8818)	loss 0.6223 (0.5802)	grad_norm 2.2347 (2.5790)	mem 20675MB
[2025-04-03 01:24:05 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][278/311]	eta 0:00:29 lr 0.000825	time 0.8759 (0.8817)	loss 0.5667 (0.5802)	grad_norm 2.0479 (2.5766)	mem 20675MB
[2025-04-03 01:24:07 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][280/311]	eta 0:00:27 lr 0.000824	time 0.8760 (0.8817)	loss 0.5665 (0.5802)	grad_norm 1.8648 (2.5733)	mem 20675MB
[2025-04-03 01:24:08 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][282/311]	eta 0:00:25 lr 0.000824	time 0.8757 (0.8817)	loss 0.5962 (0.5803)	grad_norm 2.3201 (2.5718)	mem 20675MB
[2025-04-03 01:24:10 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][284/311]	eta 0:00:23 lr 0.000824	time 0.8759 (0.8816)	loss 0.4543 (0.5796)	grad_norm 2.3661 (2.5735)	mem 20675MB
[2025-04-03 01:24:12 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][286/311]	eta 0:00:22 lr 0.000823	time 0.8764 (0.8816)	loss 0.6362 (0.5797)	grad_norm 2.6587 (2.5730)	mem 20675MB
[2025-04-03 01:24:14 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][288/311]	eta 0:00:20 lr 0.000823	time 0.8759 (0.8816)	loss 0.5876 (0.5799)	grad_norm 2.7201 (2.5761)	mem 20675MB
[2025-04-03 01:24:15 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][290/311]	eta 0:00:18 lr 0.000822	time 0.8757 (0.8815)	loss 0.4548 (0.5793)	grad_norm 4.2221 (2.5831)	mem 20675MB
[2025-04-03 01:24:17 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][292/311]	eta 0:00:16 lr 0.000822	time 0.8759 (0.8815)	loss 0.6017 (0.5797)	grad_norm 2.2776 (2.5864)	mem 20675MB
[2025-04-03 01:24:19 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][294/311]	eta 0:00:14 lr 0.000822	time 0.8758 (0.8815)	loss 0.6034 (0.5798)	grad_norm 1.9147 (2.5848)	mem 20675MB
[2025-04-03 01:24:21 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][296/311]	eta 0:00:13 lr 0.000821	time 0.8760 (0.8814)	loss 0.5827 (0.5801)	grad_norm 2.6800 (2.5830)	mem 20675MB
[2025-04-03 01:24:22 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][298/311]	eta 0:00:11 lr 0.000821	time 0.8757 (0.8814)	loss 0.5170 (0.5800)	grad_norm 3.2491 (2.5860)	mem 20675MB
[2025-04-03 01:24:24 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][300/311]	eta 0:00:09 lr 0.000820	time 0.8757 (0.8814)	loss 0.6223 (0.5800)	grad_norm 1.9279 (2.5823)	mem 20675MB
[2025-04-03 01:24:26 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][302/311]	eta 0:00:07 lr 0.000820	time 0.8757 (0.8813)	loss 0.5259 (0.5799)	grad_norm 2.7381 (2.5813)	mem 20675MB
[2025-04-03 01:24:28 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][304/311]	eta 0:00:06 lr 0.000820	time 0.8755 (0.8813)	loss 0.5621 (0.5799)	grad_norm 2.2578 (2.5786)	mem 20675MB
[2025-04-03 01:24:29 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][306/311]	eta 0:00:04 lr 0.000819	time 0.8758 (0.8813)	loss 0.6225 (0.5801)	grad_norm 2.6854 (2.5767)	mem 20675MB
[2025-04-03 01:24:31 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][308/311]	eta 0:00:02 lr 0.000819	time 0.8757 (0.8813)	loss 0.5945 (0.5802)	grad_norm 2.1097 (2.5741)	mem 20675MB
[2025-04-03 01:24:33 simmim_finetune] (main_finetune.py 252): INFO Train: [11/30][310/311]	eta 0:00:00 lr 0.000818	time 0.8759 (0.8812)	loss 0.6278 (0.5804)	grad_norm 2.6612 (2.5742)	mem 20675MB
[2025-04-03 01:24:33 simmim_finetune] (main_finetune.py 260): INFO EPOCH 11 training takes 0:04:34
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.408 (1.408)	Loss 0.5156 (0.5156)	Acc@1 75.000 (75.000)	Mem 20675MB
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.761
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.8%
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.17%
[2025-04-03 01:24:35 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.1118886409145865e-06, 3.1118886409145865e-06, 4.741034067875102e-06, 4.741034067875102e-06, 7.247411647814356e-06, 7.247411647814356e-06, 1.1103377155413209e-05, 1.1103377155413209e-05, 1.7035631782488364e-05, 1.7035631782488364e-05, 2.6162177362603992e-05, 2.6162177362603992e-05, 4.020301671662803e-05, 4.020301671662803e-05, 6.180430803051115e-05, 6.180430803051115e-05, 9.503706389802366e-05, 9.503706389802366e-05, 0.0001461643806172737, 0.0001461643806172737, 0.0002248217909545814, 0.0002248217909545814, 0.00034583319147351633, 0.00034583319147351633, 0.0005320045768872624, 0.0005320045768872624, 0.0008184220929084102, 0.0008184220929084102]
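The 28 per-group learning rates above come in 14 pairs (one weight-decay and one no-weight-decay parameter group per depth level: patch embed, 12 ViT blocks, head) and reflect the layer-wise decay configured as `LAYER_DECAY: 0.65`. A minimal sketch of how such scales are typically derived is below; `layerwise_lr_scales` is a hypothetical helper name, and the real scheduler also folds in `MIN_LR`, so the logged values deviate slightly from a pure geometric progression:

```python
# Sketch: layer-wise learning-rate decay for a 12-block ViT,
# assuming scale = decay ** (distance from the head), per the
# LAYER_DECAY: 0.65 setting in the config above.

def layerwise_lr_scales(num_layers=12, decay=0.65):
    # i = 0 is the patch embedding, i = num_layers + 1 is the head;
    # deeper groups train with larger learning rates.
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layerwise_lr_scales()          # 14 monotonically increasing scales
current_lr = 0.000818                   # cosine-scheduled LR logged for epoch 12
group_lrs = [current_lr * s for s in scales]  # head group stays at ~0.000818
```

Each scale is then paired with the current cosine-annealed base LR, which is why the whole list shifts downward together as training progresses.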
[2025-04-03 01:24:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][0/311]	eta 0:12:05 lr 0.000818	time 2.3313 (2.3313)	loss 0.6114 (0.6114)	grad_norm 2.7852 (2.7852)	mem 20675MB
[2025-04-03 01:24:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][2/311]	eta 0:07:00 lr 0.000818	time 0.8757 (1.3615)	loss 0.5158 (0.5798)	grad_norm 4.1516 (3.0493)	mem 20675MB
[2025-04-03 01:24:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][4/311]	eta 0:05:58 lr 0.000817	time 0.8761 (1.1676)	loss 0.6352 (0.5811)	grad_norm 2.5734 (2.7934)	mem 20675MB
[2025-04-03 01:24:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][6/311]	eta 0:05:30 lr 0.000817	time 0.8760 (1.0845)	loss 0.6100 (0.5995)	grad_norm 2.7800 (2.7273)	mem 20675MB
[2025-04-03 01:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][8/311]	eta 0:05:14 lr 0.000817	time 0.8757 (1.0383)	loss 0.6662 (0.6097)	grad_norm 2.1120 (2.6055)	mem 20675MB
[2025-04-03 01:24:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][10/311]	eta 0:05:03 lr 0.000816	time 0.8759 (1.0089)	loss 0.5982 (0.6022)	grad_norm 1.4151 (2.4640)	mem 20675MB
[2025-04-03 01:24:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][12/311]	eta 0:04:55 lr 0.000816	time 0.8756 (0.9886)	loss 0.6063 (0.6029)	grad_norm 2.5796 (2.4359)	mem 20675MB
[2025-04-03 01:24:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][14/311]	eta 0:04:49 lr 0.000815	time 0.8756 (0.9736)	loss 0.6237 (0.5963)	grad_norm 1.8676 (2.4514)	mem 20675MB
[2025-04-03 01:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][16/311]	eta 0:04:43 lr 0.000815	time 0.8756 (0.9622)	loss 0.4762 (0.5902)	grad_norm 2.7334 (2.4562)	mem 20675MB
[2025-04-03 01:24:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][18/311]	eta 0:04:39 lr 0.000815	time 0.8754 (0.9531)	loss 0.4924 (0.5838)	grad_norm 2.8296 (2.4600)	mem 20675MB
[2025-04-03 01:24:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][20/311]	eta 0:04:35 lr 0.000814	time 0.8754 (0.9458)	loss 0.6178 (0.5839)	grad_norm 1.9813 (2.4642)	mem 20675MB
[2025-04-03 01:24:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][22/311]	eta 0:04:31 lr 0.000814	time 0.8757 (0.9398)	loss 0.6184 (0.5839)	grad_norm 2.9855 (2.4909)	mem 20675MB
[2025-04-03 01:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][24/311]	eta 0:04:28 lr 0.000813	time 0.8757 (0.9347)	loss 0.5432 (0.5821)	grad_norm 3.1866 (2.5357)	mem 20675MB
[2025-04-03 01:25:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][26/311]	eta 0:04:25 lr 0.000813	time 0.8775 (0.9306)	loss 0.5842 (0.5778)	grad_norm 2.7021 (2.5739)	mem 20675MB
[2025-04-03 01:25:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][28/311]	eta 0:04:22 lr 0.000813	time 0.8756 (0.9268)	loss 0.6246 (0.5806)	grad_norm 3.0074 (2.6302)	mem 20675MB
[2025-04-03 01:25:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][30/311]	eta 0:04:19 lr 0.000812	time 0.8756 (0.9236)	loss 0.6115 (0.5857)	grad_norm 2.1379 (2.6220)	mem 20675MB
[2025-04-03 01:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][32/311]	eta 0:04:16 lr 0.000812	time 0.8758 (0.9207)	loss 0.4751 (0.5822)	grad_norm 2.8390 (2.6281)	mem 20675MB
[2025-04-03 01:25:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][34/311]	eta 0:04:14 lr 0.000811	time 0.8760 (0.9182)	loss 0.5716 (0.5831)	grad_norm 1.8138 (2.5865)	mem 20675MB
[2025-04-03 01:25:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][36/311]	eta 0:04:11 lr 0.000811	time 0.8755 (0.9160)	loss 0.5322 (0.5822)	grad_norm 2.9653 (2.5879)	mem 20675MB
[2025-04-03 01:25:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][38/311]	eta 0:04:09 lr 0.000811	time 0.8759 (0.9140)	loss 0.5890 (0.5846)	grad_norm 1.8246 (2.5633)	mem 20675MB
[2025-04-03 01:25:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][40/311]	eta 0:04:07 lr 0.000810	time 0.8762 (0.9121)	loss 0.4930 (0.5824)	grad_norm 2.9379 (2.5596)	mem 20675MB
[2025-04-03 01:25:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][42/311]	eta 0:04:04 lr 0.000810	time 0.8758 (0.9105)	loss 0.6063 (0.5815)	grad_norm 1.9648 (2.5477)	mem 20675MB
[2025-04-03 01:25:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][44/311]	eta 0:04:02 lr 0.000809	time 0.8756 (0.9090)	loss 0.6072 (0.5822)	grad_norm 2.3404 (2.5284)	mem 20675MB
[2025-04-03 01:25:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][46/311]	eta 0:04:00 lr 0.000809	time 0.8755 (0.9076)	loss 0.5824 (0.5819)	grad_norm 2.3982 (2.5233)	mem 20675MB
[2025-04-03 01:25:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][48/311]	eta 0:03:58 lr 0.000809	time 0.8755 (0.9063)	loss 0.5440 (0.5794)	grad_norm 2.2801 (2.5698)	mem 20675MB
[2025-04-03 01:25:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][50/311]	eta 0:03:56 lr 0.000808	time 0.8756 (0.9052)	loss 0.6261 (0.5801)	grad_norm 4.0953 (2.6143)	mem 20675MB
[2025-04-03 01:25:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][52/311]	eta 0:03:54 lr 0.000808	time 0.8754 (0.9041)	loss 0.5743 (0.5807)	grad_norm 2.8663 (2.6224)	mem 20675MB
[2025-04-03 01:25:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][54/311]	eta 0:03:52 lr 0.000807	time 0.8758 (0.9031)	loss 0.5582 (0.5810)	grad_norm 3.1606 (2.6431)	mem 20675MB
[2025-04-03 01:25:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][56/311]	eta 0:03:50 lr 0.000807	time 0.8758 (0.9021)	loss 0.6057 (0.5824)	grad_norm 2.2438 (2.6307)	mem 20675MB
[2025-04-03 01:25:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][58/311]	eta 0:03:48 lr 0.000807	time 0.8757 (0.9013)	loss 0.4903 (0.5813)	grad_norm 3.3172 (2.6499)	mem 20675MB
[2025-04-03 01:25:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][60/311]	eta 0:03:46 lr 0.000806	time 0.8756 (0.9005)	loss 0.5103 (0.5810)	grad_norm 2.6791 (2.6477)	mem 20675MB
[2025-04-03 01:25:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][62/311]	eta 0:03:44 lr 0.000806	time 0.8757 (0.8997)	loss 0.5152 (0.5797)	grad_norm 2.9089 (2.6481)	mem 20675MB
[2025-04-03 01:25:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][64/311]	eta 0:03:42 lr 0.000805	time 0.8754 (0.8990)	loss 0.5994 (0.5808)	grad_norm 2.2255 (2.6363)	mem 20675MB
[2025-04-03 01:25:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][66/311]	eta 0:03:40 lr 0.000805	time 0.8754 (0.8983)	loss 0.5967 (0.5811)	grad_norm 1.9942 (2.6252)	mem 20675MB
[2025-04-03 01:25:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][68/311]	eta 0:03:38 lr 0.000805	time 0.8754 (0.8977)	loss 0.5719 (0.5807)	grad_norm 2.4547 (2.6213)	mem 20675MB
[2025-04-03 01:25:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][70/311]	eta 0:03:36 lr 0.000804	time 0.8756 (0.8971)	loss 0.6031 (0.5808)	grad_norm 2.0005 (2.6172)	mem 20675MB
[2025-04-03 01:25:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][72/311]	eta 0:03:34 lr 0.000804	time 0.8765 (0.8965)	loss 0.4688 (0.5788)	grad_norm 3.4168 (2.6410)	mem 20675MB
[2025-04-03 01:25:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][74/311]	eta 0:03:32 lr 0.000803	time 0.8762 (0.8960)	loss 0.5470 (0.5791)	grad_norm 2.7413 (2.6350)	mem 20675MB
[2025-04-03 01:25:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][76/311]	eta 0:03:30 lr 0.000803	time 0.8759 (0.8955)	loss 0.5488 (0.5786)	grad_norm 2.6384 (2.6314)	mem 20675MB
[2025-04-03 01:25:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][78/311]	eta 0:03:28 lr 0.000803	time 0.8762 (0.8950)	loss 0.6058 (0.5784)	grad_norm 2.8711 (2.6495)	mem 20675MB
[2025-04-03 01:25:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][80/311]	eta 0:03:26 lr 0.000802	time 0.8755 (0.8946)	loss 0.5290 (0.5774)	grad_norm 4.3675 (2.6768)	mem 20675MB
[2025-04-03 01:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][82/311]	eta 0:03:24 lr 0.000802	time 0.8757 (0.8941)	loss 0.4757 (0.5757)	grad_norm 3.4461 (2.7011)	mem 20675MB
[2025-04-03 01:25:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][84/311]	eta 0:03:22 lr 0.000801	time 0.8759 (0.8937)	loss 0.6323 (0.5767)	grad_norm 2.9465 (2.7050)	mem 20675MB
[2025-04-03 01:25:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][86/311]	eta 0:03:21 lr 0.000801	time 0.8756 (0.8933)	loss 0.6489 (0.5777)	grad_norm 2.2241 (2.6938)	mem 20675MB
[2025-04-03 01:25:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][88/311]	eta 0:03:19 lr 0.000801	time 0.8755 (0.8930)	loss 0.6439 (0.5787)	grad_norm 2.2775 (2.6927)	mem 20675MB
[2025-04-03 01:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][90/311]	eta 0:03:17 lr 0.000800	time 0.8755 (0.8926)	loss 0.5999 (0.5783)	grad_norm 1.7360 (2.6766)	mem 20675MB
[2025-04-03 01:25:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][92/311]	eta 0:03:15 lr 0.000800	time 0.8757 (0.8922)	loss 0.6250 (0.5791)	grad_norm 2.2022 (2.6654)	mem 20675MB
[2025-04-03 01:25:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][94/311]	eta 0:03:13 lr 0.000799	time 0.8757 (0.8919)	loss 0.6079 (0.5801)	grad_norm 1.4162 (2.6473)	mem 20675MB
[2025-04-03 01:26:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][96/311]	eta 0:03:11 lr 0.000799	time 0.8758 (0.8916)	loss 0.5992 (0.5809)	grad_norm 1.7592 (2.6309)	mem 20675MB
[2025-04-03 01:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][98/311]	eta 0:03:09 lr 0.000799	time 0.8756 (0.8913)	loss 0.4902 (0.5803)	grad_norm 2.7318 (2.6216)	mem 20675MB
[2025-04-03 01:26:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][100/311]	eta 0:03:08 lr 0.000798	time 0.8756 (0.8910)	loss 0.6227 (0.5802)	grad_norm 1.8972 (2.6160)	mem 20675MB
[2025-04-03 01:26:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][102/311]	eta 0:03:06 lr 0.000798	time 0.8759 (0.8907)	loss 0.5769 (0.5801)	grad_norm 2.4747 (2.6125)	mem 20675MB
[2025-04-03 01:26:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][104/311]	eta 0:03:04 lr 0.000797	time 0.8756 (0.8905)	loss 0.6176 (0.5800)	grad_norm 2.2812 (2.6207)	mem 20675MB
[2025-04-03 01:26:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][106/311]	eta 0:03:02 lr 0.000797	time 0.8759 (0.8902)	loss 0.5728 (0.5794)	grad_norm 2.2374 (2.6264)	mem 20675MB
[2025-04-03 01:26:12 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][108/311]	eta 0:03:00 lr 0.000796	time 0.8760 (0.8900)	loss 0.5227 (0.5793)	grad_norm 3.4256 (2.6381)	mem 20675MB
[2025-04-03 01:26:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][110/311]	eta 0:02:58 lr 0.000796	time 0.8758 (0.8897)	loss 0.6065 (0.5786)	grad_norm 1.8135 (2.6398)	mem 20675MB
[2025-04-03 01:26:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][112/311]	eta 0:02:57 lr 0.000796	time 0.8759 (0.8895)	loss 0.6814 (0.5788)	grad_norm 2.7731 (2.6369)	mem 20675MB
[2025-04-03 01:26:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][114/311]	eta 0:02:55 lr 0.000795	time 0.8755 (0.8893)	loss 0.6039 (0.5785)	grad_norm 2.5718 (2.6498)	mem 20675MB
[2025-04-03 01:26:19 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][116/311]	eta 0:02:53 lr 0.000795	time 0.8756 (0.8890)	loss 0.5770 (0.5788)	grad_norm 1.7106 (2.6445)	mem 20675MB
[2025-04-03 01:26:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][118/311]	eta 0:02:51 lr 0.000794	time 0.8758 (0.8888)	loss 0.6940 (0.5798)	grad_norm 2.6608 (2.6391)	mem 20675MB
[2025-04-03 01:26:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][120/311]	eta 0:02:49 lr 0.000794	time 0.8755 (0.8886)	loss 0.6053 (0.5804)	grad_norm 1.9267 (2.6302)	mem 20675MB
[2025-04-03 01:26:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][122/311]	eta 0:02:47 lr 0.000794	time 0.8757 (0.8884)	loss 0.4621 (0.5792)	grad_norm 3.5971 (2.6379)	mem 20675MB
[2025-04-03 01:26:26 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][124/311]	eta 0:02:46 lr 0.000793	time 0.8757 (0.8882)	loss 0.4794 (0.5781)	grad_norm 2.7489 (2.6304)	mem 20675MB
[2025-04-03 01:26:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][126/311]	eta 0:02:44 lr 0.000793	time 0.8755 (0.8880)	loss 0.6684 (0.5789)	grad_norm 2.3604 (2.6241)	mem 20675MB
[2025-04-03 01:26:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][128/311]	eta 0:02:42 lr 0.000792	time 0.8758 (0.8879)	loss 0.5113 (0.5780)	grad_norm 2.7461 (2.6305)	mem 20675MB
[2025-04-03 01:26:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][130/311]	eta 0:02:40 lr 0.000792	time 0.8756 (0.8877)	loss 0.5924 (0.5783)	grad_norm 1.4115 (2.6186)	mem 20675MB
[2025-04-03 01:26:33 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][132/311]	eta 0:02:38 lr 0.000792	time 0.8756 (0.8875)	loss 0.5752 (0.5782)	grad_norm 4.1938 (2.6280)	mem 20675MB
[2025-04-03 01:26:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][134/311]	eta 0:02:37 lr 0.000791	time 0.8755 (0.8874)	loss 0.5412 (0.5774)	grad_norm 2.2383 (2.6296)	mem 20675MB
[2025-04-03 01:26:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][136/311]	eta 0:02:35 lr 0.000791	time 0.8754 (0.8872)	loss 0.5565 (0.5770)	grad_norm 1.9189 (2.6191)	mem 20675MB
[2025-04-03 01:26:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][138/311]	eta 0:02:33 lr 0.000790	time 0.8756 (0.8870)	loss 0.6312 (0.5775)	grad_norm 2.1200 (2.6094)	mem 20675MB
[2025-04-03 01:26:40 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][140/311]	eta 0:02:31 lr 0.000790	time 0.8757 (0.8869)	loss 0.5048 (0.5773)	grad_norm 3.4191 (2.6121)	mem 20675MB
[2025-04-03 01:26:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][142/311]	eta 0:02:29 lr 0.000790	time 0.8756 (0.8868)	loss 0.6355 (0.5777)	grad_norm 2.2074 (2.6058)	mem 20675MB
[2025-04-03 01:26:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][144/311]	eta 0:02:28 lr 0.000789	time 0.8759 (0.8866)	loss 0.4630 (0.5768)	grad_norm 3.4204 (2.6117)	mem 20675MB
[2025-04-03 01:26:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][146/311]	eta 0:02:26 lr 0.000789	time 0.8757 (0.8865)	loss 0.5852 (0.5763)	grad_norm 3.1272 (2.6218)	mem 20675MB
[2025-04-03 01:26:47 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][148/311]	eta 0:02:24 lr 0.000788	time 0.8758 (0.8864)	loss 0.7047 (0.5770)	grad_norm 3.6581 (2.6262)	mem 20675MB
[2025-04-03 01:26:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][150/311]	eta 0:02:22 lr 0.000788	time 0.8760 (0.8862)	loss 0.5385 (0.5769)	grad_norm 3.0374 (2.6270)	mem 20675MB
[2025-04-03 01:26:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][152/311]	eta 0:02:20 lr 0.000788	time 0.8760 (0.8861)	loss 0.6128 (0.5776)	grad_norm 2.4341 (2.6247)	mem 20675MB
[2025-04-03 01:26:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][154/311]	eta 0:02:19 lr 0.000787	time 0.8760 (0.8860)	loss 0.5557 (0.5776)	grad_norm 2.1067 (2.6170)	mem 20675MB
[2025-04-03 01:26:54 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][156/311]	eta 0:02:17 lr 0.000787	time 0.8754 (0.8859)	loss 0.5807 (0.5771)	grad_norm 1.7429 (2.6103)	mem 20675MB
[2025-04-03 01:26:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][158/311]	eta 0:02:15 lr 0.000786	time 0.8758 (0.8857)	loss 0.5538 (0.5769)	grad_norm 1.8420 (2.6020)	mem 20675MB
[2025-04-03 01:26:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][160/311]	eta 0:02:13 lr 0.000786	time 0.8758 (0.8856)	loss 0.6312 (0.5765)	grad_norm 2.1102 (2.5986)	mem 20675MB
[2025-04-03 01:26:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][162/311]	eta 0:02:11 lr 0.000786	time 0.8757 (0.8855)	loss 0.5521 (0.5764)	grad_norm 2.6703 (2.5953)	mem 20675MB
[2025-04-03 01:27:01 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][164/311]	eta 0:02:10 lr 0.000785	time 0.8756 (0.8854)	loss 0.6054 (0.5762)	grad_norm 2.3364 (2.5942)	mem 20675MB
[2025-04-03 01:27:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][166/311]	eta 0:02:08 lr 0.000785	time 0.8755 (0.8853)	loss 0.4042 (0.5751)	grad_norm 2.9950 (2.6005)	mem 20675MB
[2025-04-03 01:27:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][168/311]	eta 0:02:06 lr 0.000784	time 0.8758 (0.8852)	loss 0.6017 (0.5743)	grad_norm 2.6179 (2.6081)	mem 20675MB
[2025-04-03 01:27:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][170/311]	eta 0:02:04 lr 0.000784	time 0.8759 (0.8851)	loss 0.6691 (0.5745)	grad_norm 3.2645 (2.6234)	mem 20675MB
[2025-04-03 01:27:08 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][172/311]	eta 0:02:03 lr 0.000783	time 0.8759 (0.8850)	loss 0.5521 (0.5748)	grad_norm 3.3864 (2.6276)	mem 20675MB
[2025-04-03 01:27:10 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][174/311]	eta 0:02:01 lr 0.000783	time 0.8757 (0.8849)	loss 0.5628 (0.5745)	grad_norm 2.6313 (2.6250)	mem 20675MB
[2025-04-03 01:27:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][176/311]	eta 0:01:59 lr 0.000783	time 0.8755 (0.8848)	loss 0.5353 (0.5740)	grad_norm 2.4111 (2.6246)	mem 20675MB
[2025-04-03 01:27:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][178/311]	eta 0:01:57 lr 0.000782	time 0.8757 (0.8847)	loss 0.5874 (0.5743)	grad_norm 1.7649 (2.6152)	mem 20675MB
[2025-04-03 01:27:15 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][180/311]	eta 0:01:55 lr 0.000782	time 0.8757 (0.8846)	loss 0.6236 (0.5741)	grad_norm 2.0515 (2.6116)	mem 20675MB
[2025-04-03 01:27:17 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][182/311]	eta 0:01:54 lr 0.000781	time 0.8758 (0.8845)	loss 0.6394 (0.5742)	grad_norm 2.2159 (2.6043)	mem 20675MB
[2025-04-03 01:27:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][184/311]	eta 0:01:52 lr 0.000781	time 0.8756 (0.8845)	loss 0.6049 (0.5745)	grad_norm 2.0447 (2.5975)	mem 20675MB
[2025-04-03 01:27:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][186/311]	eta 0:01:50 lr 0.000781	time 0.8756 (0.8844)	loss 0.4647 (0.5741)	grad_norm 3.1618 (2.5972)	mem 20675MB
[2025-04-03 01:27:22 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][188/311]	eta 0:01:48 lr 0.000780	time 0.8762 (0.8843)	loss 0.6701 (0.5744)	grad_norm 2.0220 (2.5929)	mem 20675MB
[2025-04-03 01:27:24 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][190/311]	eta 0:01:46 lr 0.000780	time 0.8759 (0.8842)	loss 0.5446 (0.5740)	grad_norm 2.5766 (2.5954)	mem 20675MB
[2025-04-03 01:27:25 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][192/311]	eta 0:01:45 lr 0.000779	time 0.8757 (0.8841)	loss 0.5296 (0.5739)	grad_norm 1.9350 (2.5867)	mem 20675MB
[2025-04-03 01:27:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][194/311]	eta 0:01:43 lr 0.000779	time 0.8759 (0.8841)	loss 0.5670 (0.5734)	grad_norm 1.8378 (2.5860)	mem 20675MB
[2025-04-03 01:27:29 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][196/311]	eta 0:01:41 lr 0.000779	time 0.8759 (0.8840)	loss 0.6394 (0.5737)	grad_norm 2.9695 (2.5873)	mem 20675MB
[2025-04-03 01:27:31 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][198/311]	eta 0:01:39 lr 0.000778	time 0.8753 (0.8839)	loss 0.5499 (0.5729)	grad_norm 3.1621 (2.5917)	mem 20675MB
[2025-04-03 01:27:32 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][200/311]	eta 0:01:38 lr 0.000778	time 0.8759 (0.8838)	loss 0.6443 (0.5732)	grad_norm 2.4959 (2.5932)	mem 20675MB
[2025-04-03 01:27:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][202/311]	eta 0:01:36 lr 0.000777	time 0.8758 (0.8838)	loss 0.5128 (0.5731)	grad_norm 2.9612 (2.5984)	mem 20675MB
[2025-04-03 01:27:36 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][204/311]	eta 0:01:34 lr 0.000777	time 0.8758 (0.8837)	loss 0.4578 (0.5726)	grad_norm 3.8626 (2.6080)	mem 20675MB
[2025-04-03 01:27:38 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][206/311]	eta 0:01:32 lr 0.000777	time 0.8756 (0.8836)	loss 0.5016 (0.5722)	grad_norm 4.2934 (2.6165)	mem 20675MB
[2025-04-03 01:27:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][208/311]	eta 0:01:31 lr 0.000776	time 0.8758 (0.8836)	loss 0.6144 (0.5721)	grad_norm 3.7314 (2.6260)	mem 20675MB
[2025-04-03 01:27:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][210/311]	eta 0:01:29 lr 0.000776	time 0.8756 (0.8835)	loss 0.5832 (0.5722)	grad_norm 2.4241 (2.6268)	mem 20675MB
[2025-04-03 01:27:43 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][212/311]	eta 0:01:27 lr 0.000775	time 0.8757 (0.8834)	loss 0.6134 (0.5726)	grad_norm 2.9571 (2.6280)	mem 20675MB
[2025-04-03 01:27:45 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][214/311]	eta 0:01:25 lr 0.000775	time 0.8759 (0.8834)	loss 0.4799 (0.5722)	grad_norm 3.5954 (2.6297)	mem 20675MB
[2025-04-03 01:27:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][216/311]	eta 0:01:23 lr 0.000775	time 0.8759 (0.8833)	loss 0.4981 (0.5714)	grad_norm 3.6842 (2.6353)	mem 20675MB
[2025-04-03 01:27:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][218/311]	eta 0:01:22 lr 0.000774	time 0.8757 (0.8832)	loss 0.5650 (0.5714)	grad_norm 2.0723 (2.6339)	mem 20675MB
[2025-04-03 01:27:50 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][220/311]	eta 0:01:20 lr 0.000774	time 0.8759 (0.8832)	loss 0.5832 (0.5712)	grad_norm 2.9283 (2.6386)	mem 20675MB
[2025-04-03 01:27:52 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][222/311]	eta 0:01:18 lr 0.000773	time 0.8758 (0.8831)	loss 0.5650 (0.5714)	grad_norm 3.2715 (2.6432)	mem 20675MB
[2025-04-03 01:27:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][224/311]	eta 0:01:16 lr 0.000773	time 0.8757 (0.8831)	loss 0.7410 (0.5722)	grad_norm 3.8839 (2.6482)	mem 20675MB
[2025-04-03 01:27:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][226/311]	eta 0:01:15 lr 0.000772	time 0.8755 (0.8830)	loss 0.5415 (0.5719)	grad_norm 2.8393 (2.6580)	mem 20675MB
[2025-04-03 01:27:57 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][228/311]	eta 0:01:13 lr 0.000772	time 0.8756 (0.8829)	loss 0.5067 (0.5716)	grad_norm 2.8321 (2.6636)	mem 20675MB
[2025-04-03 01:27:59 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][230/311]	eta 0:01:11 lr 0.000772	time 0.8755 (0.8829)	loss 0.6012 (0.5719)	grad_norm 2.2161 (2.6630)	mem 20675MB
[2025-04-03 01:28:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][232/311]	eta 0:01:09 lr 0.000771	time 0.8754 (0.8828)	loss 0.6026 (0.5722)	grad_norm 2.0335 (2.6599)	mem 20675MB
[2025-04-03 01:28:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][234/311]	eta 0:01:07 lr 0.000771	time 0.8759 (0.8828)	loss 0.6120 (0.5726)	grad_norm 2.0279 (2.6536)	mem 20675MB
[2025-04-03 01:28:04 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][236/311]	eta 0:01:06 lr 0.000770	time 0.8758 (0.8827)	loss 0.6424 (0.5730)	grad_norm 1.9176 (2.6471)	mem 20675MB
[2025-04-03 01:28:06 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][238/311]	eta 0:01:04 lr 0.000770	time 0.8759 (0.8827)	loss 0.6482 (0.5732)	grad_norm 1.9087 (2.6410)	mem 20675MB
[2025-04-03 01:28:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][240/311]	eta 0:01:02 lr 0.000770	time 0.8759 (0.8826)	loss 0.6189 (0.5732)	grad_norm 1.6089 (2.6358)	mem 20675MB
[2025-04-03 01:28:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][242/311]	eta 0:01:00 lr 0.000769	time 0.8756 (0.8826)	loss 0.5725 (0.5729)	grad_norm 2.4880 (2.6332)	mem 20675MB
[2025-04-03 01:28:11 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][244/311]	eta 0:00:59 lr 0.000769	time 0.8759 (0.8825)	loss 0.6262 (0.5730)	grad_norm 1.7077 (2.6269)	mem 20675MB
[2025-04-03 01:28:13 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][246/311]	eta 0:00:57 lr 0.000768	time 0.8757 (0.8825)	loss 0.6243 (0.5734)	grad_norm 3.2364 (2.6294)	mem 20675MB
[2025-04-03 01:28:14 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][248/311]	eta 0:00:55 lr 0.000768	time 0.8757 (0.8824)	loss 0.4924 (0.5732)	grad_norm 3.1261 (2.6294)	mem 20675MB
[2025-04-03 01:28:16 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][250/311]	eta 0:00:53 lr 0.000768	time 0.8761 (0.8824)	loss 0.6139 (0.5732)	grad_norm 2.7099 (2.6297)	mem 20675MB
[2025-04-03 01:28:18 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][252/311]	eta 0:00:52 lr 0.000767	time 0.8754 (0.8823)	loss 0.5151 (0.5729)	grad_norm 3.7826 (2.6389)	mem 20675MB
[2025-04-03 01:28:20 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][254/311]	eta 0:00:50 lr 0.000767	time 0.8758 (0.8823)	loss 0.4717 (0.5728)	grad_norm 3.7160 (2.6453)	mem 20675MB
[2025-04-03 01:28:21 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][256/311]	eta 0:00:48 lr 0.000766	time 0.8754 (0.8823)	loss 0.6045 (0.5731)	grad_norm 2.6768 (2.6513)	mem 20675MB
[2025-04-03 01:28:23 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][258/311]	eta 0:00:46 lr 0.000766	time 0.8762 (0.8822)	loss 0.6616 (0.5736)	grad_norm 3.1952 (2.6511)	mem 20675MB
[2025-04-03 01:28:25 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][260/311]	eta 0:00:44 lr 0.000766	time 0.8755 (0.8822)	loss 0.5422 (0.5732)	grad_norm 1.8169 (2.6520)	mem 20675MB
[2025-04-03 01:28:27 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][262/311]	eta 0:00:43 lr 0.000765	time 0.8761 (0.8821)	loss 0.4730 (0.5732)	grad_norm 3.2675 (2.6586)	mem 20675MB
[2025-04-03 01:28:28 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][264/311]	eta 0:00:41 lr 0.000765	time 0.8759 (0.8821)	loss 0.6781 (0.5736)	grad_norm 2.7694 (2.6630)	mem 20675MB
[2025-04-03 01:28:30 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][266/311]	eta 0:00:39 lr 0.000764	time 0.8756 (0.8820)	loss 0.5936 (0.5732)	grad_norm 1.8396 (2.6631)	mem 20675MB
[2025-04-03 01:28:32 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][268/311]	eta 0:00:37 lr 0.000764	time 0.8759 (0.8820)	loss 0.6227 (0.5733)	grad_norm 2.5730 (2.6671)	mem 20675MB
[2025-04-03 01:28:34 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][270/311]	eta 0:00:36 lr 0.000763	time 0.8755 (0.8820)	loss 0.4576 (0.5731)	grad_norm 3.7831 (2.6704)	mem 20675MB
[2025-04-03 01:28:35 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][272/311]	eta 0:00:34 lr 0.000763	time 0.8758 (0.8819)	loss 0.5401 (0.5734)	grad_norm 3.5222 (2.6726)	mem 20675MB
[2025-04-03 01:28:37 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][274/311]	eta 0:00:32 lr 0.000763	time 0.8756 (0.8819)	loss 0.4494 (0.5727)	grad_norm 3.4978 (2.6794)	mem 20675MB
[2025-04-03 01:28:39 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][276/311]	eta 0:00:30 lr 0.000762	time 0.8757 (0.8818)	loss 0.5177 (0.5726)	grad_norm 3.4325 (2.6826)	mem 20675MB
[2025-04-03 01:28:41 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][278/311]	eta 0:00:29 lr 0.000762	time 0.8759 (0.8818)	loss 0.5859 (0.5727)	grad_norm 1.9090 (2.6773)	mem 20675MB
[2025-04-03 01:28:42 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][280/311]	eta 0:00:27 lr 0.000761	time 0.8757 (0.8818)	loss 0.6360 (0.5728)	grad_norm 2.2129 (2.6758)	mem 20675MB
[2025-04-03 01:28:44 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][282/311]	eta 0:00:25 lr 0.000761	time 0.8755 (0.8817)	loss 0.4836 (0.5725)	grad_norm 2.5547 (2.6738)	mem 20675MB
[2025-04-03 01:28:46 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][284/311]	eta 0:00:23 lr 0.000761	time 0.8758 (0.8817)	loss 0.6318 (0.5731)	grad_norm 2.3031 (2.6704)	mem 20675MB
[2025-04-03 01:28:48 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][286/311]	eta 0:00:22 lr 0.000760	time 0.8756 (0.8817)	loss 0.5691 (0.5732)	grad_norm 2.0251 (2.6643)	mem 20675MB
[2025-04-03 01:28:49 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][288/311]	eta 0:00:20 lr 0.000760	time 0.8764 (0.8816)	loss 0.5570 (0.5732)	grad_norm 2.9574 (2.6636)	mem 20675MB
[2025-04-03 01:28:51 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][290/311]	eta 0:00:18 lr 0.000759	time 0.8757 (0.8816)	loss 0.5968 (0.5733)	grad_norm 1.8582 (2.6580)	mem 20675MB
[2025-04-03 01:28:53 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][292/311]	eta 0:00:16 lr 0.000759	time 0.8758 (0.8816)	loss 0.5678 (0.5732)	grad_norm 2.4192 (2.6566)	mem 20675MB
[2025-04-03 01:28:55 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][294/311]	eta 0:00:14 lr 0.000759	time 0.8758 (0.8815)	loss 0.4773 (0.5728)	grad_norm 3.0638 (2.6558)	mem 20675MB
[2025-04-03 01:28:56 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][296/311]	eta 0:00:13 lr 0.000758	time 0.8764 (0.8815)	loss 0.5729 (0.5726)	grad_norm 2.8790 (2.6553)	mem 20675MB
[2025-04-03 01:28:58 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][298/311]	eta 0:00:11 lr 0.000758	time 0.8756 (0.8815)	loss 0.5787 (0.5726)	grad_norm 2.9766 (2.6579)	mem 20675MB
[2025-04-03 01:29:00 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][300/311]	eta 0:00:09 lr 0.000757	time 0.8753 (0.8814)	loss 0.6258 (0.5725)	grad_norm 2.6426 (2.6596)	mem 20675MB
[2025-04-03 01:29:02 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][302/311]	eta 0:00:07 lr 0.000757	time 0.8767 (0.8814)	loss 0.5464 (0.5722)	grad_norm 2.2786 (2.6603)	mem 20675MB
[2025-04-03 01:29:03 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][304/311]	eta 0:00:06 lr 0.000756	time 0.8756 (0.8814)	loss 0.5953 (0.5725)	grad_norm 2.7311 (2.6607)	mem 20675MB
[2025-04-03 01:29:05 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][306/311]	eta 0:00:04 lr 0.000756	time 0.8756 (0.8813)	loss 0.5563 (0.5726)	grad_norm 2.3964 (2.6617)	mem 20675MB
[2025-04-03 01:29:07 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][308/311]	eta 0:00:02 lr 0.000756	time 0.8753 (0.8813)	loss 0.5887 (0.5728)	grad_norm 1.8972 (2.6596)	mem 20675MB
[2025-04-03 01:29:09 simmim_finetune] (main_finetune.py 252): INFO Train: [12/30][310/311]	eta 0:00:00 lr 0.000755	time 0.8762 (0.8813)	loss 0.6382 (0.5729)	grad_norm 2.1377 (2.6568)	mem 20675MB
[2025-04-03 01:29:09 simmim_finetune] (main_finetune.py 260): INFO EPOCH 12 training takes 0:04:34
[2025-04-03 01:29:10 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.458 (1.458)	Loss 0.5561 (0.5561)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 01:29:10 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 01:29:10 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 01:29:10 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.17%
[2025-04-03 01:29:10 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.890917302587831e-06, 2.890917302587831e-06, 4.394273612467489e-06, 4.394273612467489e-06, 6.70712947382081e-06, 6.70712947382081e-06, 1.0265369260518226e-05, 1.0265369260518226e-05, 1.573958431697579e-05, 1.573958431697579e-05, 2.416145363460281e-05, 2.416145363460281e-05, 3.71181756617213e-05, 3.71181756617213e-05, 5.705159416498051e-05, 5.705159416498051e-05, 8.771839186230238e-05, 8.771839186230238e-05, 0.00013489808062741297, 0.00013489808062741297, 0.0002074822171891215, 0.0002074822171891215, 0.00031915011959175006, 0.00031915011959175006, 0.0004909468925188709, 0.0004909468925188709, 0.0007552496200990569, 0.0007552496200990569]
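The per-group learning rates above follow the `LAYER_DECAY: 0.65` setting from the config: consecutive groups differ by roughly a factor of 1/0.65, with the head group receiving the full scheduled rate (~7.55e-4, matching the `lr 0.000755` shown in the iteration lines). A minimal sketch of this layer-wise decay, assuming the common scheme where group `i` is scaled by `decay ** (num_groups - 1 - i)` (function and variable names here are illustrative, not the repository's actual API; the logged values deviate slightly from the pure geometric series, likely due to how `MIN_LR` enters the scheduler):

```python
def layerwise_lr_scales(num_groups: int, decay: float) -> list:
    """One multiplier per parameter group; deepest layers get the smallest lr."""
    return [decay ** (num_groups - 1 - i) for i in range(num_groups)]

# 14 distinct groups appear in the log (each value is listed twice,
# presumably for the decay / no-weight-decay split of each layer).
scales = layerwise_lr_scales(num_groups=14, decay=0.65)

# Scale the currently scheduled head lr (from the log) down through the layers.
scheduled_lr = 0.0007552496200990569
lrs = [scheduled_lr * s for s in scales]
```

With these scales, the second-highest group comes out to about 4.909e-4 and the lowest to about 2.8e-6, in line with the logged list.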
[2025-04-03 01:29:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][0/311]	eta 0:11:53 lr 0.000755	time 2.2954 (2.2954)	loss 0.6210 (0.6210)	grad_norm 3.4883 (3.4883)	mem 20675MB
[2025-04-03 01:29:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][2/311]	eta 0:06:57 lr 0.000755	time 0.8757 (1.3498)	loss 0.6093 (0.6115)	grad_norm 1.8463 (2.5003)	mem 20675MB
[2025-04-03 01:29:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][4/311]	eta 0:05:56 lr 0.000754	time 0.8759 (1.1605)	loss 0.6190 (0.6140)	grad_norm 1.9712 (2.2542)	mem 20675MB
[2025-04-03 01:29:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][6/311]	eta 0:05:29 lr 0.000754	time 0.8757 (1.0794)	loss 0.5828 (0.6010)	grad_norm 1.9274 (2.1725)	mem 20675MB
[2025-04-03 01:29:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][8/311]	eta 0:05:13 lr 0.000753	time 0.8755 (1.0343)	loss 0.5515 (0.5949)	grad_norm 2.1447 (2.1042)	mem 20675MB
[2025-04-03 01:29:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][10/311]	eta 0:05:02 lr 0.000753	time 0.8756 (1.0055)	loss 0.5444 (0.5922)	grad_norm 2.4413 (2.0893)	mem 20675MB
[2025-04-03 01:29:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][12/311]	eta 0:04:54 lr 0.000753	time 0.8771 (0.9859)	loss 0.6960 (0.6013)	grad_norm 2.8081 (2.1294)	mem 20675MB
[2025-04-03 01:29:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][14/311]	eta 0:04:48 lr 0.000752	time 0.8761 (0.9714)	loss 0.4839 (0.5943)	grad_norm 2.8293 (2.1747)	mem 20675MB
[2025-04-03 01:29:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][16/311]	eta 0:04:43 lr 0.000752	time 0.8757 (0.9602)	loss 0.5152 (0.5865)	grad_norm 2.5502 (2.2842)	mem 20675MB
[2025-04-03 01:29:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][18/311]	eta 0:04:38 lr 0.000751	time 0.8761 (0.9514)	loss 0.6407 (0.5913)	grad_norm 2.6587 (2.3441)	mem 20675MB
[2025-04-03 01:29:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][20/311]	eta 0:04:34 lr 0.000751	time 0.8760 (0.9444)	loss 0.4881 (0.5902)	grad_norm 3.6596 (2.4283)	mem 20675MB
[2025-04-03 01:29:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][22/311]	eta 0:04:31 lr 0.000751	time 0.8759 (0.9385)	loss 0.4648 (0.5805)	grad_norm 3.5367 (2.5664)	mem 20675MB
[2025-04-03 01:29:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][24/311]	eta 0:04:27 lr 0.000750	time 0.8756 (0.9335)	loss 0.5851 (0.5771)	grad_norm 2.6628 (2.5887)	mem 20675MB
[2025-04-03 01:29:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][26/311]	eta 0:04:24 lr 0.000750	time 0.8767 (0.9294)	loss 0.5063 (0.5726)	grad_norm 4.1118 (2.6726)	mem 20675MB
[2025-04-03 01:29:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][28/311]	eta 0:04:21 lr 0.000749	time 0.8758 (0.9258)	loss 0.6297 (0.5731)	grad_norm 3.1479 (2.7272)	mem 20675MB
[2025-04-03 01:29:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][30/311]	eta 0:04:19 lr 0.000749	time 0.8755 (0.9226)	loss 0.4998 (0.5730)	grad_norm 3.9048 (2.7700)	mem 20675MB
[2025-04-03 01:29:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][32/311]	eta 0:04:16 lr 0.000748	time 0.8759 (0.9198)	loss 0.4415 (0.5696)	grad_norm 3.8254 (2.7832)	mem 20675MB
[2025-04-03 01:29:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][34/311]	eta 0:04:14 lr 0.000748	time 0.8758 (0.9173)	loss 0.4690 (0.5678)	grad_norm 3.2247 (2.7939)	mem 20675MB
[2025-04-03 01:29:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][36/311]	eta 0:04:11 lr 0.000748	time 0.8758 (0.9151)	loss 0.5359 (0.5651)	grad_norm 3.1649 (2.8180)	mem 20675MB
[2025-04-03 01:29:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][38/311]	eta 0:04:09 lr 0.000747	time 0.8757 (0.9131)	loss 0.5162 (0.5644)	grad_norm 3.5702 (2.8556)	mem 20675MB
[2025-04-03 01:29:48 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][40/311]	eta 0:04:06 lr 0.000747	time 0.8766 (0.9114)	loss 0.4715 (0.5586)	grad_norm 4.5738 (2.9146)	mem 20675MB
[2025-04-03 01:29:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][42/311]	eta 0:04:04 lr 0.000746	time 0.8756 (0.9098)	loss 0.5871 (0.5579)	grad_norm 3.2774 (2.9654)	mem 20675MB
[2025-04-03 01:29:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][44/311]	eta 0:04:02 lr 0.000746	time 0.8758 (0.9083)	loss 0.6431 (0.5611)	grad_norm 3.5789 (3.0027)	mem 20675MB
[2025-04-03 01:29:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][46/311]	eta 0:04:00 lr 0.000746	time 0.8761 (0.9070)	loss 0.4906 (0.5588)	grad_norm 4.1539 (3.0220)	mem 20675MB
[2025-04-03 01:29:55 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][48/311]	eta 0:03:58 lr 0.000745	time 0.8757 (0.9057)	loss 0.6115 (0.5611)	grad_norm 2.5511 (2.9921)	mem 20675MB
[2025-04-03 01:29:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][50/311]	eta 0:03:56 lr 0.000745	time 0.8757 (0.9046)	loss 0.6209 (0.5622)	grad_norm 2.0650 (2.9579)	mem 20675MB
[2025-04-03 01:29:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][52/311]	eta 0:03:54 lr 0.000744	time 0.8756 (0.9036)	loss 0.6096 (0.5628)	grad_norm 2.1752 (2.9296)	mem 20675MB
[2025-04-03 01:30:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][54/311]	eta 0:03:51 lr 0.000744	time 0.8764 (0.9026)	loss 0.5729 (0.5623)	grad_norm 1.9867 (2.9085)	mem 20675MB
[2025-04-03 01:30:02 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][56/311]	eta 0:03:49 lr 0.000743	time 0.8759 (0.9017)	loss 0.5555 (0.5620)	grad_norm 2.1064 (2.8937)	mem 20675MB
[2025-04-03 01:30:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][58/311]	eta 0:03:47 lr 0.000743	time 0.8760 (0.9008)	loss 0.4730 (0.5608)	grad_norm 2.9269 (2.8824)	mem 20675MB
[2025-04-03 01:30:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][60/311]	eta 0:03:45 lr 0.000743	time 0.8762 (0.9000)	loss 0.6120 (0.5605)	grad_norm 2.1292 (2.8709)	mem 20675MB
[2025-04-03 01:30:07 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][62/311]	eta 0:03:43 lr 0.000742	time 0.8756 (0.8993)	loss 0.5337 (0.5587)	grad_norm 2.8127 (2.8831)	mem 20675MB
[2025-04-03 01:30:09 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][64/311]	eta 0:03:41 lr 0.000742	time 0.8755 (0.8986)	loss 0.5668 (0.5598)	grad_norm 2.4226 (2.8621)	mem 20675MB
[2025-04-03 01:30:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][66/311]	eta 0:03:39 lr 0.000741	time 0.8761 (0.8979)	loss 0.6160 (0.5608)	grad_norm 2.6500 (2.8675)	mem 20675MB
[2025-04-03 01:30:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][68/311]	eta 0:03:38 lr 0.000741	time 0.8756 (0.8973)	loss 0.5677 (0.5624)	grad_norm 3.4458 (2.8715)	mem 20675MB
[2025-04-03 01:30:14 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][70/311]	eta 0:03:36 lr 0.000741	time 0.8759 (0.8968)	loss 0.5704 (0.5628)	grad_norm 3.9904 (2.8836)	mem 20675MB
[2025-04-03 01:30:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][72/311]	eta 0:03:34 lr 0.000740	time 0.8759 (0.8962)	loss 0.5186 (0.5623)	grad_norm 3.3622 (2.8834)	mem 20675MB
[2025-04-03 01:30:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][74/311]	eta 0:03:32 lr 0.000740	time 0.8761 (0.8957)	loss 0.6033 (0.5630)	grad_norm 2.9887 (2.8930)	mem 20675MB
[2025-04-03 01:30:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][76/311]	eta 0:03:30 lr 0.000739	time 0.8757 (0.8952)	loss 0.3990 (0.5616)	grad_norm 2.9108 (2.8879)	mem 20675MB
[2025-04-03 01:30:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][78/311]	eta 0:03:28 lr 0.000739	time 0.8763 (0.8948)	loss 0.6645 (0.5624)	grad_norm 3.1418 (2.8911)	mem 20675MB
[2025-04-03 01:30:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][80/311]	eta 0:03:26 lr 0.000739	time 0.8762 (0.8943)	loss 0.5669 (0.5620)	grad_norm 1.6691 (2.8837)	mem 20675MB
[2025-04-03 01:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][82/311]	eta 0:03:24 lr 0.000738	time 0.8767 (0.8939)	loss 0.5599 (0.5621)	grad_norm 2.9073 (2.8854)	mem 20675MB
[2025-04-03 01:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][84/311]	eta 0:03:22 lr 0.000738	time 0.8763 (0.8935)	loss 0.6060 (0.5638)	grad_norm 2.3258 (2.8756)	mem 20675MB
[2025-04-03 01:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][86/311]	eta 0:03:20 lr 0.000737	time 0.8764 (0.8931)	loss 0.5515 (0.5638)	grad_norm 3.0673 (2.8763)	mem 20675MB
[2025-04-03 01:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][88/311]	eta 0:03:19 lr 0.000737	time 0.8771 (0.8928)	loss 0.5597 (0.5635)	grad_norm 2.6372 (2.8711)	mem 20675MB
[2025-04-03 01:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][90/311]	eta 0:03:17 lr 0.000736	time 0.8764 (0.8925)	loss 0.4442 (0.5617)	grad_norm 4.1059 (2.8960)	mem 20675MB
[2025-04-03 01:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][92/311]	eta 0:03:15 lr 0.000736	time 0.8760 (0.8921)	loss 0.5956 (0.5626)	grad_norm 3.3368 (2.9045)	mem 20675MB
[2025-04-03 01:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][94/311]	eta 0:03:13 lr 0.000736	time 0.8781 (0.8918)	loss 0.6255 (0.5632)	grad_norm 2.1036 (2.8942)	mem 20675MB
[2025-04-03 01:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][96/311]	eta 0:03:11 lr 0.000735	time 0.8769 (0.8916)	loss 0.6571 (0.5648)	grad_norm 2.8002 (2.8989)	mem 20675MB
[2025-04-03 01:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][98/311]	eta 0:03:09 lr 0.000735	time 0.8759 (0.8913)	loss 0.5418 (0.5639)	grad_norm 2.4251 (2.9124)	mem 20675MB
[2025-04-03 01:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][100/311]	eta 0:03:07 lr 0.000734	time 0.8763 (0.8910)	loss 0.4868 (0.5634)	grad_norm 3.2171 (2.9110)	mem 20675MB
[2025-04-03 01:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][102/311]	eta 0:03:06 lr 0.000734	time 0.8768 (0.8907)	loss 0.5564 (0.5634)	grad_norm 3.8678 (2.9161)	mem 20675MB
[2025-04-03 01:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][104/311]	eta 0:03:04 lr 0.000734	time 0.8772 (0.8905)	loss 0.5626 (0.5632)	grad_norm 3.0210 (2.9173)	mem 20675MB
[2025-04-03 01:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][106/311]	eta 0:03:02 lr 0.000733	time 0.8762 (0.8902)	loss 0.6479 (0.5643)	grad_norm 2.4409 (2.9087)	mem 20675MB
[2025-04-03 01:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][108/311]	eta 0:03:00 lr 0.000733	time 0.8767 (0.8900)	loss 0.6330 (0.5657)	grad_norm 2.7912 (2.9037)	mem 20675MB
[2025-04-03 01:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][110/311]	eta 0:02:58 lr 0.000732	time 0.8764 (0.8898)	loss 0.5229 (0.5656)	grad_norm 2.6430 (2.8957)	mem 20675MB
[2025-04-03 01:30:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][112/311]	eta 0:02:57 lr 0.000732	time 0.8765 (0.8896)	loss 0.4786 (0.5653)	grad_norm 2.7994 (2.8913)	mem 20675MB
[2025-04-03 01:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][114/311]	eta 0:02:55 lr 0.000731	time 0.8766 (0.8894)	loss 0.4932 (0.5648)	grad_norm 3.1236 (2.8835)	mem 20675MB
[2025-04-03 01:30:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][116/311]	eta 0:02:53 lr 0.000731	time 0.8765 (0.8891)	loss 0.5900 (0.5652)	grad_norm 2.1531 (2.8700)	mem 20675MB
[2025-04-03 01:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][118/311]	eta 0:02:51 lr 0.000731	time 0.8784 (0.8890)	loss 0.5359 (0.5639)	grad_norm 3.9096 (2.8789)	mem 20675MB
[2025-04-03 01:30:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][120/311]	eta 0:02:49 lr 0.000730	time 0.8765 (0.8888)	loss 0.5824 (0.5644)	grad_norm 2.3589 (2.8736)	mem 20675MB
[2025-04-03 01:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][122/311]	eta 0:02:47 lr 0.000730	time 0.8767 (0.8886)	loss 0.6011 (0.5654)	grad_norm 2.6399 (2.8732)	mem 20675MB
[2025-04-03 01:31:02 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][124/311]	eta 0:02:46 lr 0.000729	time 0.8764 (0.8884)	loss 0.4826 (0.5651)	grad_norm 2.5599 (2.8687)	mem 20675MB
[2025-04-03 01:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][126/311]	eta 0:02:44 lr 0.000729	time 0.8764 (0.8882)	loss 0.5446 (0.5650)	grad_norm 2.9807 (2.8726)	mem 20675MB
[2025-04-03 01:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][128/311]	eta 0:02:42 lr 0.000729	time 0.8768 (0.8881)	loss 0.6682 (0.5661)	grad_norm 2.8610 (2.8721)	mem 20675MB
[2025-04-03 01:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][130/311]	eta 0:02:40 lr 0.000728	time 0.8767 (0.8879)	loss 0.6150 (0.5664)	grad_norm 3.6762 (2.8776)	mem 20675MB
[2025-04-03 01:31:09 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][132/311]	eta 0:02:38 lr 0.000728	time 0.8761 (0.8877)	loss 0.5891 (0.5664)	grad_norm 2.1046 (2.8740)	mem 20675MB
[2025-04-03 01:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][134/311]	eta 0:02:37 lr 0.000727	time 0.8762 (0.8876)	loss 0.5871 (0.5673)	grad_norm 1.7870 (2.8682)	mem 20675MB
[2025-04-03 01:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][136/311]	eta 0:02:35 lr 0.000727	time 0.8764 (0.8874)	loss 0.5839 (0.5675)	grad_norm 2.0883 (2.8580)	mem 20675MB
[2025-04-03 01:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][138/311]	eta 0:02:33 lr 0.000727	time 0.8760 (0.8873)	loss 0.6173 (0.5683)	grad_norm 1.9317 (2.8519)	mem 20675MB
[2025-04-03 01:31:16 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][140/311]	eta 0:02:31 lr 0.000726	time 0.8759 (0.8871)	loss 0.5524 (0.5683)	grad_norm 2.0222 (2.8435)	mem 20675MB
[2025-04-03 01:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][142/311]	eta 0:02:29 lr 0.000726	time 0.8767 (0.8870)	loss 0.5187 (0.5685)	grad_norm 3.2037 (2.8413)	mem 20675MB
[2025-04-03 01:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][144/311]	eta 0:02:28 lr 0.000725	time 0.8762 (0.8869)	loss 0.6068 (0.5685)	grad_norm 2.2871 (2.8394)	mem 20675MB
[2025-04-03 01:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][146/311]	eta 0:02:26 lr 0.000725	time 0.8763 (0.8868)	loss 0.5241 (0.5684)	grad_norm 3.3354 (2.8414)	mem 20675MB
[2025-04-03 01:31:23 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][148/311]	eta 0:02:24 lr 0.000724	time 0.8764 (0.8866)	loss 0.4856 (0.5682)	grad_norm 3.9841 (2.8446)	mem 20675MB
[2025-04-03 01:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][150/311]	eta 0:02:22 lr 0.000724	time 0.8760 (0.8865)	loss 0.5096 (0.5672)	grad_norm 2.6250 (2.8467)	mem 20675MB
[2025-04-03 01:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][152/311]	eta 0:02:20 lr 0.000724	time 0.8762 (0.8864)	loss 0.6517 (0.5680)	grad_norm 2.0523 (2.8378)	mem 20675MB
[2025-04-03 01:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][154/311]	eta 0:02:19 lr 0.000723	time 0.8758 (0.8862)	loss 0.3930 (0.5676)	grad_norm 3.0110 (2.8375)	mem 20675MB
[2025-04-03 01:31:30 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][156/311]	eta 0:02:17 lr 0.000723	time 0.8760 (0.8861)	loss 0.5056 (0.5673)	grad_norm 2.8174 (2.8341)	mem 20675MB
[2025-04-03 01:31:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][158/311]	eta 0:02:15 lr 0.000722	time 0.8757 (0.8860)	loss 0.6183 (0.5676)	grad_norm 1.9560 (2.8244)	mem 20675MB
[2025-04-03 01:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][160/311]	eta 0:02:13 lr 0.000722	time 0.8756 (0.8859)	loss 0.4692 (0.5672)	grad_norm 2.5468 (2.8194)	mem 20675MB
[2025-04-03 01:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][162/311]	eta 0:02:11 lr 0.000722	time 0.8754 (0.8858)	loss 0.6298 (0.5681)	grad_norm 2.1254 (2.8138)	mem 20675MB
[2025-04-03 01:31:37 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][164/311]	eta 0:02:10 lr 0.000721	time 0.8761 (0.8857)	loss 0.5179 (0.5678)	grad_norm 2.8638 (2.8132)	mem 20675MB
[2025-04-03 01:31:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][166/311]	eta 0:02:08 lr 0.000721	time 0.8777 (0.8856)	loss 0.6398 (0.5685)	grad_norm 2.6734 (2.8078)	mem 20675MB
[2025-04-03 01:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][168/311]	eta 0:02:06 lr 0.000720	time 0.8758 (0.8855)	loss 0.5397 (0.5686)	grad_norm 2.2422 (2.7980)	mem 20675MB
[2025-04-03 01:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][170/311]	eta 0:02:04 lr 0.000720	time 0.8762 (0.8855)	loss 0.5596 (0.5683)	grad_norm 1.6780 (2.7961)	mem 20675MB
[2025-04-03 01:31:44 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][172/311]	eta 0:02:03 lr 0.000719	time 0.8782 (0.8854)	loss 0.6613 (0.5691)	grad_norm 2.4903 (2.7938)	mem 20675MB
[2025-04-03 01:31:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][174/311]	eta 0:02:01 lr 0.000719	time 0.8763 (0.8854)	loss 0.5490 (0.5694)	grad_norm 3.3489 (2.7964)	mem 20675MB
[2025-04-03 01:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][176/311]	eta 0:01:59 lr 0.000719	time 0.8762 (0.8853)	loss 0.5963 (0.5698)	grad_norm 2.4134 (2.7906)	mem 20675MB
[2025-04-03 01:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][178/311]	eta 0:01:57 lr 0.000718	time 0.8758 (0.8852)	loss 0.4745 (0.5696)	grad_norm 2.5595 (2.7827)	mem 20675MB
[2025-04-03 01:31:51 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][180/311]	eta 0:01:55 lr 0.000718	time 0.8759 (0.8852)	loss 0.4897 (0.5692)	grad_norm 3.6239 (2.7821)	mem 20675MB
[2025-04-03 01:31:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][182/311]	eta 0:01:54 lr 0.000717	time 0.8756 (0.8851)	loss 0.4622 (0.5689)	grad_norm 3.0637 (2.7810)	mem 20675MB
[2025-04-03 01:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][184/311]	eta 0:01:52 lr 0.000717	time 0.8759 (0.8850)	loss 0.4970 (0.5687)	grad_norm 3.7016 (2.7846)	mem 20675MB
[2025-04-03 01:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][186/311]	eta 0:01:50 lr 0.000717	time 0.8759 (0.8849)	loss 0.6386 (0.5684)	grad_norm 2.7027 (2.7871)	mem 20675MB
[2025-04-03 01:31:58 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][188/311]	eta 0:01:48 lr 0.000716	time 0.8759 (0.8848)	loss 0.4796 (0.5676)	grad_norm 4.5086 (2.7944)	mem 20675MB
[2025-04-03 01:31:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][190/311]	eta 0:01:47 lr 0.000716	time 0.8757 (0.8847)	loss 0.6253 (0.5680)	grad_norm 3.2601 (2.7953)	mem 20675MB
[2025-04-03 01:32:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][192/311]	eta 0:01:45 lr 0.000715	time 0.8759 (0.8847)	loss 0.4750 (0.5676)	grad_norm 3.0557 (2.7959)	mem 20675MB
[2025-04-03 01:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][194/311]	eta 0:01:43 lr 0.000715	time 0.8758 (0.8846)	loss 0.5557 (0.5676)	grad_norm 2.9245 (2.7951)	mem 20675MB
[2025-04-03 01:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][196/311]	eta 0:01:41 lr 0.000714	time 0.8758 (0.8845)	loss 0.6029 (0.5681)	grad_norm 2.1859 (2.7899)	mem 20675MB
[2025-04-03 01:32:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][198/311]	eta 0:01:39 lr 0.000714	time 0.8758 (0.8844)	loss 0.6242 (0.5685)	grad_norm 2.2117 (2.7837)	mem 20675MB
[2025-04-03 01:32:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][200/311]	eta 0:01:38 lr 0.000714	time 0.8757 (0.8843)	loss 0.5775 (0.5680)	grad_norm 2.1679 (2.7902)	mem 20675MB
[2025-04-03 01:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][202/311]	eta 0:01:36 lr 0.000713	time 0.8758 (0.8843)	loss 0.5004 (0.5672)	grad_norm 3.7583 (2.8022)	mem 20675MB
[2025-04-03 01:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][204/311]	eta 0:01:34 lr 0.000713	time 0.8756 (0.8842)	loss 0.6033 (0.5679)	grad_norm 2.5540 (2.8016)	mem 20675MB
[2025-04-03 01:32:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][206/311]	eta 0:01:32 lr 0.000712	time 0.8759 (0.8841)	loss 0.6259 (0.5682)	grad_norm 2.4400 (2.7997)	mem 20675MB
[2025-04-03 01:32:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][208/311]	eta 0:01:31 lr 0.000712	time 0.8757 (0.8840)	loss 0.5587 (0.5683)	grad_norm 2.0954 (2.7940)	mem 20675MB
[2025-04-03 01:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][210/311]	eta 0:01:29 lr 0.000712	time 0.8757 (0.8840)	loss 0.6032 (0.5683)	grad_norm 1.8253 (2.7864)	mem 20675MB
[2025-04-03 01:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][212/311]	eta 0:01:27 lr 0.000711	time 0.8759 (0.8839)	loss 0.6613 (0.5686)	grad_norm 2.6936 (2.7869)	mem 20675MB
[2025-04-03 01:32:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][214/311]	eta 0:01:25 lr 0.000711	time 0.8758 (0.8838)	loss 0.5801 (0.5688)	grad_norm 2.0481 (2.7808)	mem 20675MB
[2025-04-03 01:32:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][216/311]	eta 0:01:23 lr 0.000710	time 0.8761 (0.8838)	loss 0.6359 (0.5690)	grad_norm 1.9746 (2.7755)	mem 20675MB
[2025-04-03 01:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][218/311]	eta 0:01:22 lr 0.000710	time 0.8758 (0.8837)	loss 0.5403 (0.5684)	grad_norm 3.1831 (2.7815)	mem 20675MB
[2025-04-03 01:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][220/311]	eta 0:01:20 lr 0.000709	time 0.8759 (0.8836)	loss 0.4507 (0.5679)	grad_norm 2.9899 (2.7811)	mem 20675MB
[2025-04-03 01:32:28 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][222/311]	eta 0:01:18 lr 0.000709	time 0.8763 (0.8836)	loss 0.5361 (0.5680)	grad_norm 2.3210 (2.7771)	mem 20675MB
[2025-04-03 01:32:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][224/311]	eta 0:01:16 lr 0.000709	time 0.8756 (0.8835)	loss 0.5780 (0.5679)	grad_norm 2.8509 (2.7790)	mem 20675MB
[2025-04-03 01:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][226/311]	eta 0:01:15 lr 0.000708	time 0.8758 (0.8835)	loss 0.6655 (0.5679)	grad_norm 2.6978 (2.7827)	mem 20675MB
[2025-04-03 01:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][228/311]	eta 0:01:13 lr 0.000708	time 0.8757 (0.8834)	loss 0.6908 (0.5681)	grad_norm 2.1733 (2.7848)	mem 20675MB
[2025-04-03 01:32:35 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][230/311]	eta 0:01:11 lr 0.000707	time 0.8760 (0.8834)	loss 0.6128 (0.5684)	grad_norm 2.3485 (2.7830)	mem 20675MB
[2025-04-03 01:32:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][232/311]	eta 0:01:09 lr 0.000707	time 0.8756 (0.8833)	loss 0.6225 (0.5686)	grad_norm 2.8389 (2.7810)	mem 20675MB
[2025-04-03 01:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][234/311]	eta 0:01:08 lr 0.000707	time 0.8758 (0.8832)	loss 0.5655 (0.5682)	grad_norm 3.0881 (2.7833)	mem 20675MB
[2025-04-03 01:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][236/311]	eta 0:01:06 lr 0.000706	time 0.8757 (0.8832)	loss 0.6146 (0.5686)	grad_norm 1.7120 (2.7731)	mem 20675MB
[2025-04-03 01:32:42 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][238/311]	eta 0:01:04 lr 0.000706	time 0.8762 (0.8831)	loss 0.6162 (0.5688)	grad_norm 2.1686 (2.7715)	mem 20675MB
[2025-04-03 01:32:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][240/311]	eta 0:01:02 lr 0.000705	time 0.8761 (0.8831)	loss 0.4977 (0.5680)	grad_norm 2.9870 (2.7747)	mem 20675MB
[2025-04-03 01:32:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][242/311]	eta 0:01:00 lr 0.000705	time 0.8758 (0.8830)	loss 0.5630 (0.5684)	grad_norm 2.4286 (2.7724)	mem 20675MB
[2025-04-03 01:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][244/311]	eta 0:00:59 lr 0.000704	time 0.8755 (0.8830)	loss 0.5657 (0.5687)	grad_norm 2.2600 (2.7702)	mem 20675MB
[2025-04-03 01:32:49 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][246/311]	eta 0:00:57 lr 0.000704	time 0.8758 (0.8829)	loss 0.4862 (0.5685)	grad_norm 3.2643 (2.7692)	mem 20675MB
[2025-04-03 01:32:50 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][248/311]	eta 0:00:55 lr 0.000704	time 0.8782 (0.8829)	loss 0.6153 (0.5688)	grad_norm 2.1952 (2.7664)	mem 20675MB
[2025-04-03 01:32:52 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][250/311]	eta 0:00:53 lr 0.000703	time 0.8756 (0.8828)	loss 0.6302 (0.5688)	grad_norm 1.8098 (2.7680)	mem 20675MB
[2025-04-03 01:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][252/311]	eta 0:00:52 lr 0.000703	time 0.8756 (0.8828)	loss 0.5124 (0.5688)	grad_norm 2.5670 (2.7652)	mem 20675MB
[2025-04-03 01:32:56 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][254/311]	eta 0:00:50 lr 0.000702	time 0.8754 (0.8827)	loss 0.5906 (0.5690)	grad_norm 3.1736 (2.7656)	mem 20675MB
[2025-04-03 01:32:57 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][256/311]	eta 0:00:48 lr 0.000702	time 0.8755 (0.8827)	loss 0.6252 (0.5690)	grad_norm 2.6415 (2.7688)	mem 20675MB
[2025-04-03 01:32:59 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][258/311]	eta 0:00:46 lr 0.000702	time 0.8758 (0.8826)	loss 0.5259 (0.5685)	grad_norm 3.1426 (2.7736)	mem 20675MB
[2025-04-03 01:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][260/311]	eta 0:00:45 lr 0.000701	time 0.8755 (0.8826)	loss 0.5614 (0.5685)	grad_norm 2.2811 (2.7733)	mem 20675MB
[2025-04-03 01:33:03 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][262/311]	eta 0:00:43 lr 0.000701	time 0.8760 (0.8825)	loss 0.5631 (0.5683)	grad_norm 3.0688 (2.7721)	mem 20675MB
[2025-04-03 01:33:04 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][264/311]	eta 0:00:41 lr 0.000700	time 0.8756 (0.8825)	loss 0.5742 (0.5682)	grad_norm 2.1445 (2.7672)	mem 20675MB
[2025-04-03 01:33:06 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][266/311]	eta 0:00:39 lr 0.000700	time 0.8776 (0.8825)	loss 0.6034 (0.5685)	grad_norm 3.2082 (2.7674)	mem 20675MB
[2025-04-03 01:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][268/311]	eta 0:00:37 lr 0.000699	time 0.8757 (0.8824)	loss 0.5934 (0.5690)	grad_norm 2.1427 (2.7657)	mem 20675MB
[2025-04-03 01:33:10 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][270/311]	eta 0:00:36 lr 0.000699	time 0.8758 (0.8824)	loss 0.4536 (0.5687)	grad_norm 3.3446 (2.7657)	mem 20675MB
[2025-04-03 01:33:11 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][272/311]	eta 0:00:34 lr 0.000699	time 0.8758 (0.8823)	loss 0.5737 (0.5685)	grad_norm 3.3103 (2.7655)	mem 20675MB
[2025-04-03 01:33:13 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][274/311]	eta 0:00:32 lr 0.000698	time 0.8755 (0.8823)	loss 0.6313 (0.5689)	grad_norm 1.7505 (2.7595)	mem 20675MB
[2025-04-03 01:33:15 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][276/311]	eta 0:00:30 lr 0.000698	time 0.8761 (0.8822)	loss 0.6053 (0.5691)	grad_norm 2.8283 (2.7573)	mem 20675MB
[2025-04-03 01:33:17 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][278/311]	eta 0:00:29 lr 0.000697	time 0.8755 (0.8822)	loss 0.5809 (0.5694)	grad_norm 1.7952 (2.7520)	mem 20675MB
[2025-04-03 01:33:18 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][280/311]	eta 0:00:27 lr 0.000697	time 0.8755 (0.8822)	loss 0.5685 (0.5693)	grad_norm 2.2267 (2.7489)	mem 20675MB
[2025-04-03 01:33:20 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][282/311]	eta 0:00:25 lr 0.000697	time 0.8754 (0.8821)	loss 0.5966 (0.5696)	grad_norm 1.7792 (2.7430)	mem 20675MB
[2025-04-03 01:33:22 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][284/311]	eta 0:00:23 lr 0.000696	time 0.8754 (0.8821)	loss 0.4692 (0.5689)	grad_norm 3.6708 (2.7486)	mem 20675MB
[2025-04-03 01:33:24 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][286/311]	eta 0:00:22 lr 0.000696	time 0.8753 (0.8820)	loss 0.6279 (0.5689)	grad_norm 2.5872 (2.7502)	mem 20675MB
[2025-04-03 01:33:25 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][288/311]	eta 0:00:20 lr 0.000695	time 0.8757 (0.8820)	loss 0.6397 (0.5691)	grad_norm 3.0109 (2.7500)	mem 20675MB
[2025-04-03 01:33:27 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][290/311]	eta 0:00:18 lr 0.000695	time 0.8761 (0.8820)	loss 0.4858 (0.5689)	grad_norm 3.9365 (2.7550)	mem 20675MB
[2025-04-03 01:33:29 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][292/311]	eta 0:00:16 lr 0.000694	time 0.8757 (0.8819)	loss 0.6242 (0.5691)	grad_norm 2.6190 (2.7532)	mem 20675MB
[2025-04-03 01:33:31 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][294/311]	eta 0:00:14 lr 0.000694	time 0.8758 (0.8819)	loss 0.4340 (0.5682)	grad_norm 3.1453 (2.7568)	mem 20675MB
[2025-04-03 01:33:32 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][296/311]	eta 0:00:13 lr 0.000694	time 0.8756 (0.8819)	loss 0.6220 (0.5683)	grad_norm 2.2722 (2.7557)	mem 20675MB
[2025-04-03 01:33:34 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][298/311]	eta 0:00:11 lr 0.000693	time 0.8754 (0.8818)	loss 0.6070 (0.5687)	grad_norm 2.4966 (2.7551)	mem 20675MB
[2025-04-03 01:33:36 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][300/311]	eta 0:00:09 lr 0.000693	time 0.8755 (0.8818)	loss 0.4947 (0.5683)	grad_norm 3.1617 (2.7617)	mem 20675MB
[2025-04-03 01:33:38 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][302/311]	eta 0:00:07 lr 0.000692	time 0.8752 (0.8818)	loss 0.5871 (0.5682)	grad_norm 2.3994 (2.7619)	mem 20675MB
[2025-04-03 01:33:39 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][304/311]	eta 0:00:06 lr 0.000692	time 0.8755 (0.8817)	loss 0.4293 (0.5678)	grad_norm 3.2984 (2.7615)	mem 20675MB
[2025-04-03 01:33:41 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][306/311]	eta 0:00:04 lr 0.000691	time 0.8753 (0.8817)	loss 0.6117 (0.5682)	grad_norm 5.3697 (2.7702)	mem 20675MB
[2025-04-03 01:33:43 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][308/311]	eta 0:00:02 lr 0.000691	time 0.8752 (0.8817)	loss 0.5646 (0.5685)	grad_norm 2.6711 (2.7687)	mem 20675MB
[2025-04-03 01:33:45 simmim_finetune] (main_finetune.py 252): INFO Train: [13/30][310/311]	eta 0:00:00 lr 0.000691	time 0.8754 (0.8816)	loss 0.5804 (0.5688)	grad_norm 2.2896 (2.7712)	mem 20675MB
[2025-04-03 01:33:45 simmim_finetune] (main_finetune.py 260): INFO EPOCH 13 training takes 0:04:34
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.391 (1.391)	Loss 0.5347 (0.5347)	Acc@1 75.000 (75.000)	Mem 20675MB
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.761
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.8%
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.17%
[2025-04-03 01:33:46 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.6649590963835634e-06, 2.6649590963835634e-06, 4.039687487951123e-06, 4.039687487951123e-06, 6.154654244208908e-06, 6.154654244208908e-06, 9.408449253836269e-06, 9.408449253836269e-06, 1.4414287730186055e-05, 1.4414287730186055e-05, 2.211557769380111e-05, 2.211557769380111e-05, 3.396371609936273e-05, 3.396371609936273e-05, 5.219162133868829e-05, 5.219162133868829e-05, 8.023455247611225e-05, 8.023455247611225e-05, 0.00012337752345676451, 0.00012337752345676451, 0.00018975132496546024, 0.00018975132496546024, 0.000291864865748069, 0.000291864865748069, 0.00044896262079823645, 0.00044896262079823645, 0.000690651474721571, 0.000690651474721571]
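The per-group learning rates logged above form a geometric ladder, consistent with the config's layer-wise decay (LAYER_DECAY: 0.65, 12 transformer blocks, each scale appearing twice for the decay / no-decay parameter groups). A minimal illustrative sketch of how such multipliers are typically derived — `get_layer_scales` and its arguments are hypothetical names, not the actual `main_finetune.py` API:

```python
# Hypothetical sketch of layer-wise LR decay as used in SimMIM/BEiT-style
# fine-tuning. Function and variable names here are illustrative only.

def get_layer_scales(num_layers: int, layer_decay: float) -> list:
    """One LR multiplier per depth: patch embedding (depth 0) ... head (last).

    Deeper layers get larger multipliers; the head trains at the full base LR.
    """
    max_depth = num_layers + 1  # +1 so the head sits one step above the last block
    return [layer_decay ** (max_depth - d) for d in range(max_depth + 1)]

# With 12 blocks and decay 0.65 this yields 14 distinct scales; in the log each
# value appears twice because weight-decay and no-weight-decay parameter groups
# at the same depth share a scale.
scales = get_layer_scales(num_layers=12, layer_decay=0.65)
base_lr = 0.00125  # BASE_LR from the config above
group_lrs = [base_lr * s for s in scales]
```

The logged values also reflect the cosine schedule and MIN_LR floor at this point in training, so they are the warm-scaled versions of these base multipliers rather than `base_lr * scale` directly.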
[2025-04-03 01:33:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][0/311]	eta 0:10:57 lr 0.000690	time 2.1155 (2.1155)	loss 0.6150 (0.6150)	grad_norm 2.5838 (2.5838)	mem 20675MB
[2025-04-03 01:33:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][2/311]	eta 0:06:38 lr 0.000690	time 0.8765 (1.2901)	loss 0.6009 (0.5819)	grad_norm 2.2814 (2.6376)	mem 20675MB
[2025-04-03 01:33:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][4/311]	eta 0:05:45 lr 0.000690	time 0.8758 (1.1247)	loss 0.5650 (0.5771)	grad_norm 2.3683 (2.4030)	mem 20675MB
[2025-04-03 01:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][6/311]	eta 0:05:21 lr 0.000689	time 0.8758 (1.0538)	loss 0.4959 (0.5751)	grad_norm 3.7565 (2.4965)	mem 20675MB
[2025-04-03 01:33:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][8/311]	eta 0:05:07 lr 0.000689	time 0.8773 (1.0146)	loss 0.5455 (0.5712)	grad_norm 2.5962 (2.4429)	mem 20675MB
[2025-04-03 01:33:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][10/311]	eta 0:04:57 lr 0.000688	time 0.8756 (0.9895)	loss 0.6058 (0.5656)	grad_norm 1.5583 (2.4760)	mem 20675MB
[2025-04-03 01:33:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][12/311]	eta 0:04:50 lr 0.000688	time 0.8756 (0.9721)	loss 0.5609 (0.5719)	grad_norm 1.7662 (2.3691)	mem 20675MB
[2025-04-03 01:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][14/311]	eta 0:04:45 lr 0.000688	time 0.8765 (0.9621)	loss 0.5828 (0.5719)	grad_norm 2.3539 (2.3897)	mem 20675MB
[2025-04-03 01:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][16/311]	eta 0:04:40 lr 0.000687	time 0.8757 (0.9521)	loss 0.5802 (0.5734)	grad_norm 2.0370 (2.3769)	mem 20675MB
[2025-04-03 01:34:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][18/311]	eta 0:04:36 lr 0.000687	time 0.8755 (0.9441)	loss 0.5771 (0.5752)	grad_norm 2.8784 (2.4043)	mem 20675MB
[2025-04-03 01:34:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][20/311]	eta 0:04:32 lr 0.000686	time 0.8762 (0.9377)	loss 0.6473 (0.5800)	grad_norm 3.1810 (2.4409)	mem 20675MB
[2025-04-03 01:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][22/311]	eta 0:04:29 lr 0.000686	time 0.8763 (0.9325)	loss 0.5105 (0.5779)	grad_norm 4.0486 (2.5267)	mem 20675MB
[2025-04-03 01:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][24/311]	eta 0:04:26 lr 0.000685	time 0.8756 (0.9280)	loss 0.4455 (0.5694)	grad_norm 4.1751 (2.6220)	mem 20675MB
[2025-04-03 01:34:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][26/311]	eta 0:04:23 lr 0.000685	time 0.8764 (0.9242)	loss 0.6343 (0.5730)	grad_norm 2.7962 (2.6049)	mem 20675MB
[2025-04-03 01:34:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][28/311]	eta 0:04:20 lr 0.000685	time 0.8756 (0.9210)	loss 0.6008 (0.5701)	grad_norm 2.1831 (2.6364)	mem 20675MB
[2025-04-03 01:34:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][30/311]	eta 0:04:17 lr 0.000684	time 0.8757 (0.9181)	loss 0.6064 (0.5688)	grad_norm 2.3261 (2.6366)	mem 20675MB
[2025-04-03 01:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][32/311]	eta 0:04:15 lr 0.000684	time 0.8755 (0.9157)	loss 0.5208 (0.5665)	grad_norm 4.4691 (2.7008)	mem 20675MB
[2025-04-03 01:34:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][34/311]	eta 0:04:13 lr 0.000683	time 0.8757 (0.9135)	loss 0.6011 (0.5651)	grad_norm 2.2596 (2.7287)	mem 20675MB
[2025-04-03 01:34:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][36/311]	eta 0:04:10 lr 0.000683	time 0.8757 (0.9115)	loss 0.5988 (0.5677)	grad_norm 2.5315 (2.7582)	mem 20675MB
[2025-04-03 01:34:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][38/311]	eta 0:04:08 lr 0.000682	time 0.8758 (0.9097)	loss 0.5835 (0.5691)	grad_norm 2.8362 (2.7498)	mem 20675MB
[2025-04-03 01:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][40/311]	eta 0:04:06 lr 0.000682	time 0.8758 (0.9081)	loss 0.6392 (0.5692)	grad_norm 2.7101 (2.7604)	mem 20675MB
[2025-04-03 01:34:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][42/311]	eta 0:04:03 lr 0.000682	time 0.8761 (0.9066)	loss 0.5675 (0.5698)	grad_norm 3.7846 (2.8015)	mem 20675MB
[2025-04-03 01:34:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][44/311]	eta 0:04:01 lr 0.000681	time 0.8755 (0.9053)	loss 0.6632 (0.5729)	grad_norm 2.5451 (2.7917)	mem 20675MB
[2025-04-03 01:34:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][46/311]	eta 0:03:59 lr 0.000681	time 0.8755 (0.9041)	loss 0.6170 (0.5752)	grad_norm 1.7055 (2.7567)	mem 20675MB
[2025-04-03 01:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][48/311]	eta 0:03:57 lr 0.000680	time 0.8757 (0.9029)	loss 0.5503 (0.5751)	grad_norm 1.9012 (2.7177)	mem 20675MB
[2025-04-03 01:34:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][50/311]	eta 0:03:55 lr 0.000680	time 0.8756 (0.9019)	loss 0.5792 (0.5761)	grad_norm 1.7527 (2.6788)	mem 20675MB
[2025-04-03 01:34:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][52/311]	eta 0:03:53 lr 0.000680	time 0.8756 (0.9010)	loss 0.5829 (0.5752)	grad_norm 1.5207 (2.6508)	mem 20675MB
[2025-04-03 01:34:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][54/311]	eta 0:03:51 lr 0.000679	time 0.8755 (0.9001)	loss 0.5946 (0.5749)	grad_norm 1.6201 (2.6200)	mem 20675MB
[2025-04-03 01:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][56/311]	eta 0:03:49 lr 0.000679	time 0.8756 (0.8992)	loss 0.5942 (0.5743)	grad_norm 1.6332 (2.6082)	mem 20675MB
[2025-04-03 01:34:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][58/311]	eta 0:03:47 lr 0.000678	time 0.8754 (0.8985)	loss 0.6108 (0.5738)	grad_norm 1.6588 (2.6033)	mem 20675MB
[2025-04-03 01:34:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][60/311]	eta 0:03:45 lr 0.000678	time 0.8758 (0.8978)	loss 0.4641 (0.5719)	grad_norm 3.3321 (2.6258)	mem 20675MB
[2025-04-03 01:34:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][62/311]	eta 0:03:43 lr 0.000677	time 0.8761 (0.8971)	loss 0.6571 (0.5728)	grad_norm 2.9607 (2.6423)	mem 20675MB
[2025-04-03 01:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][64/311]	eta 0:03:41 lr 0.000677	time 0.8753 (0.8965)	loss 0.5924 (0.5709)	grad_norm 3.0163 (2.6715)	mem 20675MB
[2025-04-03 01:34:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][66/311]	eta 0:03:39 lr 0.000677	time 0.8757 (0.8959)	loss 0.5985 (0.5720)	grad_norm 3.5034 (2.6820)	mem 20675MB
[2025-04-03 01:34:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][68/311]	eta 0:03:37 lr 0.000676	time 0.8755 (0.8953)	loss 0.6284 (0.5733)	grad_norm 2.0495 (2.6745)	mem 20675MB
[2025-04-03 01:34:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][70/311]	eta 0:03:35 lr 0.000676	time 0.8759 (0.8948)	loss 0.5889 (0.5726)	grad_norm 2.9520 (2.6929)	mem 20675MB
[2025-04-03 01:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][72/311]	eta 0:03:33 lr 0.000675	time 0.8757 (0.8943)	loss 0.4969 (0.5702)	grad_norm 3.5350 (2.7127)	mem 20675MB
[2025-04-03 01:34:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][74/311]	eta 0:03:31 lr 0.000675	time 0.8758 (0.8938)	loss 0.6044 (0.5685)	grad_norm 2.3432 (2.7184)	mem 20675MB
[2025-04-03 01:34:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][76/311]	eta 0:03:29 lr 0.000675	time 0.8757 (0.8934)	loss 0.6441 (0.5687)	grad_norm 2.6741 (2.7223)	mem 20675MB
[2025-04-03 01:34:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][78/311]	eta 0:03:28 lr 0.000674	time 0.8755 (0.8929)	loss 0.6366 (0.5696)	grad_norm 2.0849 (2.7128)	mem 20675MB
[2025-04-03 01:34:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][80/311]	eta 0:03:26 lr 0.000674	time 0.8757 (0.8925)	loss 0.4047 (0.5682)	grad_norm 3.0681 (2.7126)	mem 20675MB
[2025-04-03 01:35:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][82/311]	eta 0:03:24 lr 0.000673	time 0.8760 (0.8921)	loss 0.6525 (0.5685)	grad_norm 2.9519 (2.7302)	mem 20675MB
[2025-04-03 01:35:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][84/311]	eta 0:03:22 lr 0.000673	time 0.8757 (0.8918)	loss 0.5983 (0.5694)	grad_norm 2.8478 (2.7277)	mem 20675MB
[2025-04-03 01:35:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][86/311]	eta 0:03:20 lr 0.000672	time 0.8760 (0.8914)	loss 0.6512 (0.5702)	grad_norm 2.0611 (2.7091)	mem 20675MB
[2025-04-03 01:35:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][88/311]	eta 0:03:18 lr 0.000672	time 0.8755 (0.8911)	loss 0.5799 (0.5692)	grad_norm 2.0427 (2.7099)	mem 20675MB
[2025-04-03 01:35:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][90/311]	eta 0:03:16 lr 0.000672	time 0.8757 (0.8908)	loss 0.6199 (0.5703)	grad_norm 2.0167 (2.6980)	mem 20675MB
[2025-04-03 01:35:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][92/311]	eta 0:03:15 lr 0.000671	time 0.8755 (0.8905)	loss 0.5250 (0.5705)	grad_norm 3.4710 (2.7015)	mem 20675MB
[2025-04-03 01:35:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][94/311]	eta 0:03:13 lr 0.000671	time 0.8755 (0.8902)	loss 0.6265 (0.5718)	grad_norm 1.7360 (2.6842)	mem 20675MB
[2025-04-03 01:35:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][96/311]	eta 0:03:11 lr 0.000670	time 0.8769 (0.8899)	loss 0.4948 (0.5714)	grad_norm 3.4778 (2.6855)	mem 20675MB
[2025-04-03 01:35:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][98/311]	eta 0:03:09 lr 0.000670	time 0.8758 (0.8896)	loss 0.6213 (0.5711)	grad_norm 2.0016 (2.6843)	mem 20675MB
[2025-04-03 01:35:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][100/311]	eta 0:03:07 lr 0.000669	time 0.8757 (0.8894)	loss 0.5650 (0.5702)	grad_norm 2.0389 (2.6802)	mem 20675MB
[2025-04-03 01:35:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][102/311]	eta 0:03:05 lr 0.000669	time 0.8757 (0.8891)	loss 0.4717 (0.5686)	grad_norm 2.5982 (2.6874)	mem 20675MB
[2025-04-03 01:35:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][104/311]	eta 0:03:03 lr 0.000669	time 0.8759 (0.8889)	loss 0.5368 (0.5688)	grad_norm 2.8516 (2.6892)	mem 20675MB
[2025-04-03 01:35:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][106/311]	eta 0:03:02 lr 0.000668	time 0.8755 (0.8887)	loss 0.4808 (0.5682)	grad_norm 4.5338 (2.7034)	mem 20675MB
[2025-04-03 01:35:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][108/311]	eta 0:03:00 lr 0.000668	time 0.8756 (0.8884)	loss 0.5776 (0.5684)	grad_norm 2.3499 (2.6956)	mem 20675MB
[2025-04-03 01:35:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][110/311]	eta 0:02:58 lr 0.000667	time 0.8761 (0.8882)	loss 0.5426 (0.5687)	grad_norm 1.9214 (2.6891)	mem 20675MB
[2025-04-03 01:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][112/311]	eta 0:02:56 lr 0.000667	time 0.8757 (0.8880)	loss 0.5844 (0.5691)	grad_norm 2.5003 (2.6789)	mem 20675MB
[2025-04-03 01:35:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][114/311]	eta 0:02:54 lr 0.000667	time 0.8761 (0.8878)	loss 0.5206 (0.5692)	grad_norm 3.5403 (2.6859)	mem 20675MB
[2025-04-03 01:35:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][116/311]	eta 0:02:53 lr 0.000666	time 0.8755 (0.8876)	loss 0.5863 (0.5696)	grad_norm 1.7517 (2.6728)	mem 20675MB
[2025-04-03 01:35:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][118/311]	eta 0:02:51 lr 0.000666	time 0.8757 (0.8874)	loss 0.6460 (0.5701)	grad_norm 3.0170 (2.6737)	mem 20675MB
[2025-04-03 01:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][120/311]	eta 0:02:49 lr 0.000665	time 0.8756 (0.8873)	loss 0.4670 (0.5698)	grad_norm 3.3344 (2.6769)	mem 20675MB
[2025-04-03 01:35:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][122/311]	eta 0:02:47 lr 0.000665	time 0.8771 (0.8871)	loss 0.5683 (0.5691)	grad_norm 2.2893 (2.6765)	mem 20675MB
[2025-04-03 01:35:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][124/311]	eta 0:02:45 lr 0.000664	time 0.8769 (0.8870)	loss 0.5716 (0.5690)	grad_norm 3.1649 (2.6775)	mem 20675MB
[2025-04-03 01:35:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][126/311]	eta 0:02:44 lr 0.000664	time 0.8757 (0.8868)	loss 0.5819 (0.5696)	grad_norm 2.1078 (2.6762)	mem 20675MB
[2025-04-03 01:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][128/311]	eta 0:02:42 lr 0.000664	time 0.8771 (0.8867)	loss 0.6094 (0.5698)	grad_norm 2.2341 (2.6687)	mem 20675MB
[2025-04-03 01:35:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][130/311]	eta 0:02:40 lr 0.000663	time 0.8764 (0.8865)	loss 0.5999 (0.5705)	grad_norm 3.0431 (2.6726)	mem 20675MB
[2025-04-03 01:35:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][132/311]	eta 0:02:38 lr 0.000663	time 0.8760 (0.8864)	loss 0.4523 (0.5693)	grad_norm 3.0491 (2.6802)	mem 20675MB
[2025-04-03 01:35:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][134/311]	eta 0:02:36 lr 0.000662	time 0.8762 (0.8862)	loss 0.5049 (0.5683)	grad_norm 4.1046 (2.6926)	mem 20675MB
[2025-04-03 01:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][136/311]	eta 0:02:35 lr 0.000662	time 0.8757 (0.8861)	loss 0.4765 (0.5669)	grad_norm 2.9356 (2.7022)	mem 20675MB
[2025-04-03 01:35:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][138/311]	eta 0:02:33 lr 0.000662	time 0.8768 (0.8860)	loss 0.5775 (0.5671)	grad_norm 2.6624 (2.6993)	mem 20675MB
[2025-04-03 01:35:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][140/311]	eta 0:02:31 lr 0.000661	time 0.8759 (0.8859)	loss 0.6022 (0.5678)	grad_norm 2.7497 (2.6978)	mem 20675MB
[2025-04-03 01:35:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][142/311]	eta 0:02:29 lr 0.000661	time 0.8758 (0.8857)	loss 0.5582 (0.5680)	grad_norm 3.2898 (2.7020)	mem 20675MB
[2025-04-03 01:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][144/311]	eta 0:02:27 lr 0.000660	time 0.8757 (0.8856)	loss 0.4962 (0.5683)	grad_norm 3.4887 (2.7129)	mem 20675MB
[2025-04-03 01:35:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][146/311]	eta 0:02:26 lr 0.000660	time 0.8770 (0.8855)	loss 0.5852 (0.5676)	grad_norm 3.0064 (2.7388)	mem 20675MB
[2025-04-03 01:35:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][148/311]	eta 0:02:24 lr 0.000659	time 0.8757 (0.8854)	loss 0.6146 (0.5677)	grad_norm 2.6712 (2.7487)	mem 20675MB
[2025-04-03 01:36:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][150/311]	eta 0:02:22 lr 0.000659	time 0.8759 (0.8853)	loss 0.6113 (0.5676)	grad_norm 2.0606 (2.7512)	mem 20675MB
[2025-04-03 01:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][152/311]	eta 0:02:20 lr 0.000659	time 0.8756 (0.8852)	loss 0.6088 (0.5684)	grad_norm 1.9638 (2.7419)	mem 20675MB
[2025-04-03 01:36:04 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][154/311]	eta 0:02:18 lr 0.000658	time 0.8755 (0.8850)	loss 0.6461 (0.5693)	grad_norm 2.4564 (2.7394)	mem 20675MB
[2025-04-03 01:36:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][156/311]	eta 0:02:17 lr 0.000658	time 0.8756 (0.8849)	loss 0.6029 (0.5699)	grad_norm 2.0550 (2.7310)	mem 20675MB
[2025-04-03 01:36:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][158/311]	eta 0:02:15 lr 0.000657	time 0.8753 (0.8848)	loss 0.6406 (0.5698)	grad_norm 1.8110 (2.7292)	mem 20675MB
[2025-04-03 01:36:09 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][160/311]	eta 0:02:13 lr 0.000657	time 0.8752 (0.8847)	loss 0.4413 (0.5687)	grad_norm 3.0422 (2.7310)	mem 20675MB
[2025-04-03 01:36:11 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][162/311]	eta 0:02:11 lr 0.000656	time 0.8753 (0.8846)	loss 0.5770 (0.5686)	grad_norm 2.3941 (2.7267)	mem 20675MB
[2025-04-03 01:36:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][164/311]	eta 0:02:10 lr 0.000656	time 0.8759 (0.8845)	loss 0.6788 (0.5696)	grad_norm 2.2442 (2.7215)	mem 20675MB
[2025-04-03 01:36:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][166/311]	eta 0:02:08 lr 0.000656	time 0.8757 (0.8844)	loss 0.6419 (0.5696)	grad_norm 2.4761 (2.7235)	mem 20675MB
[2025-04-03 01:36:16 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][168/311]	eta 0:02:06 lr 0.000655	time 0.8756 (0.8843)	loss 0.5514 (0.5694)	grad_norm 1.9880 (2.7176)	mem 20675MB
[2025-04-03 01:36:18 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][170/311]	eta 0:02:04 lr 0.000655	time 0.8754 (0.8842)	loss 0.5426 (0.5696)	grad_norm 5.5224 (2.7300)	mem 20675MB
[2025-04-03 01:36:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][172/311]	eta 0:02:02 lr 0.000654	time 0.8754 (0.8842)	loss 0.6527 (0.5703)	grad_norm 2.1414 (2.7261)	mem 20675MB
[2025-04-03 01:36:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][174/311]	eta 0:02:01 lr 0.000654	time 0.8756 (0.8841)	loss 0.5799 (0.5705)	grad_norm 2.0047 (2.7228)	mem 20675MB
[2025-04-03 01:36:23 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][176/311]	eta 0:01:59 lr 0.000654	time 0.8758 (0.8840)	loss 0.4904 (0.5704)	grad_norm 2.5556 (2.7207)	mem 20675MB
[2025-04-03 01:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][178/311]	eta 0:01:57 lr 0.000653	time 0.8753 (0.8839)	loss 0.5403 (0.5704)	grad_norm 2.6605 (2.7164)	mem 20675MB
[2025-04-03 01:36:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][180/311]	eta 0:01:55 lr 0.000653	time 0.8756 (0.8838)	loss 0.6468 (0.5708)	grad_norm 2.2300 (2.7093)	mem 20675MB
[2025-04-03 01:36:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][182/311]	eta 0:01:54 lr 0.000652	time 0.8757 (0.8837)	loss 0.5254 (0.5701)	grad_norm 2.6199 (2.7082)	mem 20675MB
[2025-04-03 01:36:30 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][184/311]	eta 0:01:52 lr 0.000652	time 0.8757 (0.8837)	loss 0.4651 (0.5697)	grad_norm 2.6889 (2.7064)	mem 20675MB
[2025-04-03 01:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][186/311]	eta 0:01:50 lr 0.000651	time 0.8754 (0.8836)	loss 0.6484 (0.5702)	grad_norm 1.6517 (2.6960)	mem 20675MB
[2025-04-03 01:36:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][188/311]	eta 0:01:48 lr 0.000651	time 0.8756 (0.8835)	loss 0.5890 (0.5702)	grad_norm 2.6590 (2.6938)	mem 20675MB
[2025-04-03 01:36:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][190/311]	eta 0:01:46 lr 0.000651	time 0.8757 (0.8834)	loss 0.5185 (0.5699)	grad_norm 5.1634 (2.7017)	mem 20675MB
[2025-04-03 01:36:37 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][192/311]	eta 0:01:45 lr 0.000650	time 0.8760 (0.8834)	loss 0.6544 (0.5704)	grad_norm 2.7905 (2.6986)	mem 20675MB
[2025-04-03 01:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][194/311]	eta 0:01:43 lr 0.000650	time 0.8762 (0.8833)	loss 0.6027 (0.5702)	grad_norm 1.8226 (2.6954)	mem 20675MB
[2025-04-03 01:36:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][196/311]	eta 0:01:41 lr 0.000649	time 0.8756 (0.8832)	loss 0.6207 (0.5704)	grad_norm 2.2570 (2.6903)	mem 20675MB
[2025-04-03 01:36:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][198/311]	eta 0:01:39 lr 0.000649	time 0.8755 (0.8832)	loss 0.6164 (0.5702)	grad_norm 2.2443 (2.6983)	mem 20675MB
[2025-04-03 01:36:44 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][200/311]	eta 0:01:38 lr 0.000648	time 0.8770 (0.8831)	loss 0.6130 (0.5705)	grad_norm 2.2027 (2.6979)	mem 20675MB
[2025-04-03 01:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][202/311]	eta 0:01:36 lr 0.000648	time 0.8758 (0.8831)	loss 0.5985 (0.5707)	grad_norm 2.7382 (2.6955)	mem 20675MB
[2025-04-03 01:36:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][204/311]	eta 0:01:34 lr 0.000648	time 0.8769 (0.8830)	loss 0.5622 (0.5709)	grad_norm 2.3313 (2.6917)	mem 20675MB
[2025-04-03 01:36:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][206/311]	eta 0:01:32 lr 0.000647	time 0.8759 (0.8829)	loss 0.5529 (0.5702)	grad_norm 1.9146 (2.6894)	mem 20675MB
[2025-04-03 01:36:51 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][208/311]	eta 0:01:30 lr 0.000647	time 0.8762 (0.8829)	loss 0.6209 (0.5707)	grad_norm 2.2026 (2.6851)	mem 20675MB
[2025-04-03 01:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][210/311]	eta 0:01:29 lr 0.000646	time 0.8780 (0.8828)	loss 0.5136 (0.5711)	grad_norm 2.3192 (2.6813)	mem 20675MB
[2025-04-03 01:36:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][212/311]	eta 0:01:27 lr 0.000646	time 0.8754 (0.8828)	loss 0.4335 (0.5705)	grad_norm 3.1888 (2.6830)	mem 20675MB
[2025-04-03 01:36:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][214/311]	eta 0:01:25 lr 0.000646	time 0.8763 (0.8827)	loss 0.5871 (0.5710)	grad_norm 1.4020 (2.6721)	mem 20675MB
[2025-04-03 01:36:58 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][216/311]	eta 0:01:23 lr 0.000645	time 0.8763 (0.8827)	loss 0.4617 (0.5705)	grad_norm 3.2401 (2.6708)	mem 20675MB
[2025-04-03 01:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][218/311]	eta 0:01:22 lr 0.000645	time 0.8762 (0.8826)	loss 0.5408 (0.5704)	grad_norm 2.7599 (2.6693)	mem 20675MB
[2025-04-03 01:37:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][220/311]	eta 0:01:20 lr 0.000644	time 0.8756 (0.8826)	loss 0.5245 (0.5701)	grad_norm 3.0677 (2.6705)	mem 20675MB
[2025-04-03 01:37:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][222/311]	eta 0:01:18 lr 0.000644	time 0.8768 (0.8825)	loss 0.6274 (0.5707)	grad_norm 2.7132 (2.6700)	mem 20675MB
[2025-04-03 01:37:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][224/311]	eta 0:01:16 lr 0.000643	time 0.8765 (0.8825)	loss 0.6114 (0.5708)	grad_norm 3.1245 (2.6729)	mem 20675MB
[2025-04-03 01:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][226/311]	eta 0:01:15 lr 0.000643	time 0.8761 (0.8824)	loss 0.5537 (0.5710)	grad_norm 1.9282 (2.6685)	mem 20675MB
[2025-04-03 01:37:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][228/311]	eta 0:01:13 lr 0.000643	time 0.8762 (0.8824)	loss 0.5529 (0.5713)	grad_norm 2.4929 (2.6705)	mem 20675MB
[2025-04-03 01:37:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][230/311]	eta 0:01:11 lr 0.000642	time 0.8758 (0.8824)	loss 0.5980 (0.5712)	grad_norm 2.2326 (2.6745)	mem 20675MB
[2025-04-03 01:37:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][232/311]	eta 0:01:09 lr 0.000642	time 0.8758 (0.8823)	loss 0.6738 (0.5720)	grad_norm 2.3998 (2.6728)	mem 20675MB
[2025-04-03 01:37:14 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][234/311]	eta 0:01:07 lr 0.000641	time 0.8762 (0.8823)	loss 0.5458 (0.5714)	grad_norm 2.2743 (2.6760)	mem 20675MB
[2025-04-03 01:37:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][236/311]	eta 0:01:06 lr 0.000641	time 0.8758 (0.8822)	loss 0.6493 (0.5719)	grad_norm 2.2603 (2.6720)	mem 20675MB
[2025-04-03 01:37:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][238/311]	eta 0:01:04 lr 0.000640	time 0.8758 (0.8822)	loss 0.4986 (0.5720)	grad_norm 3.1031 (2.6709)	mem 20675MB
[2025-04-03 01:37:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][240/311]	eta 0:01:02 lr 0.000640	time 0.8764 (0.8821)	loss 0.4995 (0.5718)	grad_norm 3.0526 (2.6708)	mem 20675MB
[2025-04-03 01:37:21 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][242/311]	eta 0:01:00 lr 0.000640	time 0.8766 (0.8821)	loss 0.6010 (0.5718)	grad_norm 2.1242 (2.6671)	mem 20675MB
[2025-04-03 01:37:22 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][244/311]	eta 0:00:59 lr 0.000639	time 0.8761 (0.8821)	loss 0.5899 (0.5720)	grad_norm 2.4443 (2.6634)	mem 20675MB
[2025-04-03 01:37:24 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][246/311]	eta 0:00:57 lr 0.000639	time 0.8756 (0.8820)	loss 0.5342 (0.5717)	grad_norm 2.7516 (2.6657)	mem 20675MB
[2025-04-03 01:37:26 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][248/311]	eta 0:00:55 lr 0.000638	time 0.8761 (0.8820)	loss 0.6072 (0.5719)	grad_norm 2.1330 (2.6617)	mem 20675MB
[2025-04-03 01:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][250/311]	eta 0:00:53 lr 0.000638	time 0.8755 (0.8819)	loss 0.5431 (0.5718)	grad_norm 2.5585 (2.6585)	mem 20675MB
[2025-04-03 01:37:29 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][252/311]	eta 0:00:52 lr 0.000638	time 0.8757 (0.8819)	loss 0.4724 (0.5711)	grad_norm 3.6715 (2.6688)	mem 20675MB
[2025-04-03 01:37:31 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][254/311]	eta 0:00:50 lr 0.000637	time 0.8763 (0.8819)	loss 0.4913 (0.5703)	grad_norm 3.0974 (2.6788)	mem 20675MB
[2025-04-03 01:37:33 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][256/311]	eta 0:00:48 lr 0.000637	time 0.8758 (0.8818)	loss 0.5701 (0.5705)	grad_norm 3.6434 (2.6845)	mem 20675MB
[2025-04-03 01:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][258/311]	eta 0:00:46 lr 0.000636	time 0.8754 (0.8818)	loss 0.6330 (0.5704)	grad_norm 3.8562 (2.6958)	mem 20675MB
[2025-04-03 01:37:36 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][260/311]	eta 0:00:44 lr 0.000636	time 0.8764 (0.8817)	loss 0.6134 (0.5710)	grad_norm 2.6009 (2.6974)	mem 20675MB
[2025-04-03 01:37:38 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][262/311]	eta 0:00:43 lr 0.000635	time 0.8759 (0.8817)	loss 0.6262 (0.5706)	grad_norm 2.9222 (2.7022)	mem 20675MB
[2025-04-03 01:37:40 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][264/311]	eta 0:00:41 lr 0.000635	time 0.8758 (0.8817)	loss 0.5306 (0.5708)	grad_norm 2.8976 (2.7087)	mem 20675MB
[2025-04-03 01:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][266/311]	eta 0:00:39 lr 0.000635	time 0.8754 (0.8816)	loss 0.4654 (0.5706)	grad_norm 3.8007 (2.7152)	mem 20675MB
[2025-04-03 01:37:43 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][268/311]	eta 0:00:37 lr 0.000634	time 0.8771 (0.8816)	loss 0.6517 (0.5709)	grad_norm 2.5412 (2.7147)	mem 20675MB
[2025-04-03 01:37:45 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][270/311]	eta 0:00:36 lr 0.000634	time 0.8758 (0.8816)	loss 0.4564 (0.5706)	grad_norm 4.0207 (2.7183)	mem 20675MB
[2025-04-03 01:37:47 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][272/311]	eta 0:00:34 lr 0.000633	time 0.8756 (0.8815)	loss 0.6121 (0.5709)	grad_norm 1.6674 (2.7127)	mem 20675MB
[2025-04-03 01:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][274/311]	eta 0:00:32 lr 0.000633	time 0.8759 (0.8815)	loss 0.6257 (0.5711)	grad_norm 1.8437 (2.7103)	mem 20675MB
[2025-04-03 01:37:50 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][276/311]	eta 0:00:30 lr 0.000632	time 0.8761 (0.8815)	loss 0.4809 (0.5708)	grad_norm 2.6808 (2.7059)	mem 20675MB
[2025-04-03 01:37:52 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][278/311]	eta 0:00:29 lr 0.000632	time 0.8775 (0.8814)	loss 0.5724 (0.5709)	grad_norm 2.4982 (2.7018)	mem 20675MB
[2025-04-03 01:37:54 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][280/311]	eta 0:00:27 lr 0.000632	time 0.8758 (0.8814)	loss 0.5599 (0.5711)	grad_norm 2.1114 (2.6985)	mem 20675MB
[2025-04-03 01:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][282/311]	eta 0:00:25 lr 0.000631	time 0.8758 (0.8814)	loss 0.5927 (0.5712)	grad_norm 2.0959 (2.6931)	mem 20675MB
[2025-04-03 01:37:57 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][284/311]	eta 0:00:23 lr 0.000631	time 0.8756 (0.8813)	loss 0.5564 (0.5712)	grad_norm 2.6497 (2.6912)	mem 20675MB
[2025-04-03 01:37:59 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][286/311]	eta 0:00:22 lr 0.000630	time 0.8758 (0.8813)	loss 0.5888 (0.5710)	grad_norm 2.3390 (2.6883)	mem 20675MB
[2025-04-03 01:38:01 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][288/311]	eta 0:00:20 lr 0.000630	time 0.8760 (0.8813)	loss 0.5766 (0.5713)	grad_norm 2.9213 (2.6893)	mem 20675MB
[2025-04-03 01:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][290/311]	eta 0:00:18 lr 0.000630	time 0.8760 (0.8812)	loss 0.4851 (0.5710)	grad_norm 4.3884 (2.6957)	mem 20675MB
[2025-04-03 01:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][292/311]	eta 0:00:16 lr 0.000629	time 0.8755 (0.8812)	loss 0.4478 (0.5706)	grad_norm 3.7982 (2.7004)	mem 20675MB
[2025-04-03 01:38:06 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][294/311]	eta 0:00:14 lr 0.000629	time 0.8758 (0.8812)	loss 0.5203 (0.5705)	grad_norm 3.2102 (2.7019)	mem 20675MB
[2025-04-03 01:38:08 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][296/311]	eta 0:00:13 lr 0.000628	time 0.8756 (0.8811)	loss 0.5858 (0.5707)	grad_norm 2.6604 (2.7030)	mem 20675MB
[2025-04-03 01:38:10 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][298/311]	eta 0:00:11 lr 0.000628	time 0.8754 (0.8811)	loss 0.5290 (0.5707)	grad_norm 2.9868 (2.7042)	mem 20675MB
[2025-04-03 01:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][300/311]	eta 0:00:09 lr 0.000627	time 0.8764 (0.8811)	loss 0.5848 (0.5711)	grad_norm 3.0054 (2.7100)	mem 20675MB
[2025-04-03 01:38:13 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][302/311]	eta 0:00:07 lr 0.000627	time 0.8754 (0.8810)	loss 0.6109 (0.5714)	grad_norm 2.0244 (2.7069)	mem 20675MB
[2025-04-03 01:38:15 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][304/311]	eta 0:00:06 lr 0.000627	time 0.8754 (0.8810)	loss 0.5887 (0.5715)	grad_norm 1.9143 (2.7019)	mem 20675MB
[2025-04-03 01:38:17 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][306/311]	eta 0:00:04 lr 0.000626	time 0.8753 (0.8810)	loss 0.5848 (0.5717)	grad_norm 1.8765 (2.7007)	mem 20675MB
[2025-04-03 01:38:19 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][308/311]	eta 0:00:02 lr 0.000626	time 0.8756 (0.8810)	loss 0.5350 (0.5718)	grad_norm 2.0386 (2.6963)	mem 20675MB
[2025-04-03 01:38:20 simmim_finetune] (main_finetune.py 252): INFO Train: [14/30][310/311]	eta 0:00:00 lr 0.000625	time 0.8754 (0.8809)	loss 0.5753 (0.5715)	grad_norm 2.4913 (2.6957)	mem 20675MB
[2025-04-03 01:38:20 simmim_finetune] (main_finetune.py 260): INFO EPOCH 14 training takes 0:04:34
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.342 (1.342)	Loss 0.5139 (0.5139)	Acc@1 77.344 (77.344)	Mem 20675MB
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.873
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 78.9%
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
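The overall Acc@1 above is the sample-weighted mean over the two test batches (128 + 14 = 142 images): batch 0's accuracy is logged, and the second batch's can be back-solved. A minimal sketch of that arithmetic (batch sizes inferred from BATCH_SIZE 128 and the 142-image test set; the second batch's accuracy is derived here, not logged):

```python
# Back-solve the unlogged second-batch accuracy from the weighted average.
batch_sizes = [128, 14]   # 142 test images split at BATCH_SIZE 128
acc_batch0 = 77.344       # logged Acc@1 for batch [0/2]
overall = 78.873          # logged overall Acc@1

correct_total = overall / 100 * sum(batch_sizes)   # ~112 images correct
correct_b0 = acc_batch0 / 100 * batch_sizes[0]     # ~99 images correct
acc_batch1 = (correct_total - correct_b0) / batch_sizes[1] * 100
# acc_batch1 is close to 13/14 ≈ 92.857%
```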
[2025-04-03 01:38:22 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.4364896676937507e-06, 2.4364896676937507e-06, 3.6811606141081196e-06, 3.6811606141081196e-06, 5.5960389932071484e-06, 5.5960389932071484e-06, 8.542005730282577e-06, 8.542005730282577e-06, 1.307426224886016e-05, 1.307426224886016e-05, 2.0046964585133366e-05, 2.0046964585133366e-05, 3.07741989486306e-05, 3.07741989486306e-05, 4.727763643093403e-05, 4.727763643093403e-05, 7.26675402498624e-05, 7.26675402498624e-05, 0.00011172893074052144, 0.00011172893074052144, 0.00017182337764922763, 0.00017182337764922763, 0.00026427637289339094, 0.00026427637289339094, 0.00040651175019210383, 0.00040651175019210383, 0.000625335407574739, 0.000625335407574739]
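The 28 learning rates above are 14 distinct values, each listed twice (presumably the weight-decay / no-weight-decay parameter split per depth group). They are consistent with layer-wise LR decay (TRAIN.LAYER_DECAY 0.65 from the config header) composed with the cosine schedule: each group's peak LR is BASE_LR scaled by 0.65 per step of depth below the head, then annealed toward MIN_LR. A minimal sketch reproducing the logged values at the start of epoch 15 of 30, assuming no warmup offset; names here are illustrative, not the repo's own:

```python
import math

# Reproduce the per-group LRs in the log line above.
# Constants come from the config header; structure assumed: 14 depth
# groups = patch embedding + 12 ViT blocks + head.
BASE_LR = 0.00125       # TRAIN.BASE_LR
MIN_LR = 2.5e-07        # TRAIN.MIN_LR
LAYER_DECAY = 0.65      # TRAIN.LAYER_DECAY
NUM_GROUPS = 14

def group_lrs(progress):
    """Per-group LR at `progress` in [0, 1] through the cosine decay."""
    cos_factor = 0.5 * (1.0 + math.cos(math.pi * progress))
    scales = (LAYER_DECAY ** (NUM_GROUPS - 1 - i) for i in range(NUM_GROUPS))
    return [MIN_LR + (BASE_LR * s - MIN_LR) * cos_factor for s in scales]

lrs = group_lrs(15 / 30)  # mid-schedule: epoch 15 of EPOCHS 30
# lrs[0] (shallowest group) ~ 2.44e-06, lrs[-1] (head) ~ 6.25e-04,
# matching the logged list to within ~0.1%.
```

Consecutive groups differ by a factor of 1/0.65 ≈ 1.538 (up to the small MIN_LR offset), which is visible directly in the logged list.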
[2025-04-03 01:38:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][0/311]	eta 0:11:16 lr 0.000625	time 2.1739 (2.1739)	loss 0.5861 (0.5861)	grad_norm 1.9790 (1.9790)	mem 20675MB
[2025-04-03 01:38:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][2/311]	eta 0:06:44 lr 0.000625	time 0.8769 (1.3103)	loss 0.5317 (0.5868)	grad_norm 2.6240 (2.6523)	mem 20675MB
[2025-04-03 01:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][4/311]	eta 0:05:49 lr 0.000624	time 0.8761 (1.1371)	loss 0.6892 (0.6027)	grad_norm 2.8153 (2.7052)	mem 20675MB
[2025-04-03 01:38:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][6/311]	eta 0:05:24 lr 0.000624	time 0.8759 (1.0627)	loss 0.5902 (0.5952)	grad_norm 2.3582 (2.5349)	mem 20675MB
[2025-04-03 01:38:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][8/311]	eta 0:05:09 lr 0.000623	time 0.8759 (1.0214)	loss 0.6300 (0.5960)	grad_norm 3.0864 (2.7040)	mem 20675MB
[2025-04-03 01:38:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][10/311]	eta 0:04:59 lr 0.000623	time 0.8759 (0.9951)	loss 0.5236 (0.5902)	grad_norm 3.8189 (2.7527)	mem 20675MB
[2025-04-03 01:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][12/311]	eta 0:04:52 lr 0.000623	time 0.8757 (0.9769)	loss 0.4307 (0.5800)	grad_norm 3.5356 (2.8019)	mem 20675MB
[2025-04-03 01:38:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][14/311]	eta 0:04:46 lr 0.000622	time 0.8761 (0.9638)	loss 0.6208 (0.5864)	grad_norm 2.0598 (2.7309)	mem 20675MB
[2025-04-03 01:38:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][16/311]	eta 0:04:41 lr 0.000622	time 0.8781 (0.9537)	loss 0.5267 (0.5819)	grad_norm 2.6865 (2.6767)	mem 20675MB
[2025-04-03 01:38:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][18/311]	eta 0:04:37 lr 0.000621	time 0.8761 (0.9456)	loss 0.6677 (0.5858)	grad_norm 3.2732 (2.6876)	mem 20675MB
[2025-04-03 01:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][20/311]	eta 0:04:33 lr 0.000621	time 0.8755 (0.9390)	loss 0.6068 (0.5840)	grad_norm 2.2352 (2.6940)	mem 20675MB
[2025-04-03 01:38:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][22/311]	eta 0:04:29 lr 0.000620	time 0.8756 (0.9335)	loss 0.5841 (0.5843)	grad_norm 2.3915 (2.6490)	mem 20675MB
[2025-04-03 01:38:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][24/311]	eta 0:04:26 lr 0.000620	time 0.8761 (0.9290)	loss 0.6063 (0.5836)	grad_norm 2.5021 (2.6693)	mem 20675MB
[2025-04-03 01:38:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][26/311]	eta 0:04:23 lr 0.000620	time 0.8757 (0.9251)	loss 0.6556 (0.5890)	grad_norm 2.9433 (2.6567)	mem 20675MB
[2025-04-03 01:38:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][28/311]	eta 0:04:20 lr 0.000619	time 0.8754 (0.9218)	loss 0.5547 (0.5890)	grad_norm 3.4536 (2.6716)	mem 20675MB
[2025-04-03 01:38:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][30/311]	eta 0:04:18 lr 0.000619	time 0.8761 (0.9189)	loss 0.5851 (0.5871)	grad_norm 1.9069 (2.6522)	mem 20675MB
[2025-04-03 01:38:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][32/311]	eta 0:04:15 lr 0.000618	time 0.8775 (0.9164)	loss 0.6467 (0.5904)	grad_norm 2.1719 (2.6201)	mem 20675MB
[2025-04-03 01:38:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][34/311]	eta 0:04:13 lr 0.000618	time 0.8759 (0.9141)	loss 0.5789 (0.5917)	grad_norm 2.0660 (2.5911)	mem 20675MB
[2025-04-03 01:38:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][36/311]	eta 0:04:10 lr 0.000618	time 0.8757 (0.9121)	loss 0.6288 (0.5940)	grad_norm 1.9810 (2.5560)	mem 20675MB
[2025-04-03 01:38:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][38/311]	eta 0:04:08 lr 0.000617	time 0.8759 (0.9103)	loss 0.6089 (0.5923)	grad_norm 2.0930 (2.5322)	mem 20675MB
[2025-04-03 01:38:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][40/311]	eta 0:04:06 lr 0.000617	time 0.8758 (0.9086)	loss 0.5629 (0.5906)	grad_norm 1.8814 (2.5141)	mem 20675MB
[2025-04-03 01:39:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][42/311]	eta 0:04:04 lr 0.000616	time 0.8756 (0.9071)	loss 0.5402 (0.5900)	grad_norm 2.4733 (2.5034)	mem 20675MB
[2025-04-03 01:39:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][44/311]	eta 0:04:01 lr 0.000616	time 0.8758 (0.9058)	loss 0.4534 (0.5849)	grad_norm 2.5541 (2.5032)	mem 20675MB
[2025-04-03 01:39:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][46/311]	eta 0:03:59 lr 0.000615	time 0.8754 (0.9045)	loss 0.4740 (0.5809)	grad_norm 3.6172 (2.5186)	mem 20675MB
[2025-04-03 01:39:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][48/311]	eta 0:03:57 lr 0.000615	time 0.8757 (0.9034)	loss 0.6490 (0.5823)	grad_norm 2.7685 (2.5208)	mem 20675MB
[2025-04-03 01:39:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][50/311]	eta 0:03:55 lr 0.000615	time 0.8756 (0.9023)	loss 0.6752 (0.5843)	grad_norm 3.0395 (2.5289)	mem 20675MB
[2025-04-03 01:39:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][52/311]	eta 0:03:53 lr 0.000614	time 0.8755 (0.9013)	loss 0.6334 (0.5852)	grad_norm 3.8097 (2.5597)	mem 20675MB
[2025-04-03 01:39:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][54/311]	eta 0:03:51 lr 0.000614	time 0.8756 (0.9004)	loss 0.6231 (0.5833)	grad_norm 2.8938 (2.6145)	mem 20675MB
[2025-04-03 01:39:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][56/311]	eta 0:03:49 lr 0.000613	time 0.8786 (0.8996)	loss 0.6521 (0.5851)	grad_norm 3.1572 (2.6258)	mem 20675MB
[2025-04-03 01:39:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][58/311]	eta 0:03:47 lr 0.000613	time 0.8757 (0.8989)	loss 0.5253 (0.5845)	grad_norm 3.5987 (2.6515)	mem 20675MB
[2025-04-03 01:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][60/311]	eta 0:03:45 lr 0.000613	time 0.8758 (0.8981)	loss 0.6087 (0.5842)	grad_norm 2.8510 (2.6603)	mem 20675MB
[2025-04-03 01:39:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][62/311]	eta 0:03:43 lr 0.000612	time 0.8760 (0.8974)	loss 0.5113 (0.5842)	grad_norm 3.5019 (2.6597)	mem 20675MB
[2025-04-03 01:39:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][64/311]	eta 0:03:41 lr 0.000612	time 0.8755 (0.8968)	loss 0.5335 (0.5815)	grad_norm 2.1297 (2.6653)	mem 20675MB
[2025-04-03 01:39:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][66/311]	eta 0:03:39 lr 0.000611	time 0.8757 (0.8962)	loss 0.5948 (0.5797)	grad_norm 2.4363 (2.6635)	mem 20675MB
[2025-04-03 01:39:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][68/311]	eta 0:03:37 lr 0.000611	time 0.8754 (0.8956)	loss 0.5360 (0.5792)	grad_norm 2.4743 (2.6488)	mem 20675MB
[2025-04-03 01:39:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][70/311]	eta 0:03:35 lr 0.000610	time 0.8754 (0.8951)	loss 0.6539 (0.5807)	grad_norm 2.2319 (2.6357)	mem 20675MB
[2025-04-03 01:39:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][72/311]	eta 0:03:33 lr 0.000610	time 0.8754 (0.8946)	loss 0.5914 (0.5812)	grad_norm 1.7803 (2.6135)	mem 20675MB
[2025-04-03 01:39:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][74/311]	eta 0:03:31 lr 0.000610	time 0.8754 (0.8941)	loss 0.5772 (0.5796)	grad_norm 2.6337 (2.6278)	mem 20675MB
[2025-04-03 01:39:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][76/311]	eta 0:03:30 lr 0.000609	time 0.8754 (0.8936)	loss 0.6164 (0.5800)	grad_norm 1.8446 (2.6183)	mem 20675MB
[2025-04-03 01:39:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][78/311]	eta 0:03:28 lr 0.000609	time 0.8765 (0.8932)	loss 0.5101 (0.5797)	grad_norm 4.2837 (2.6386)	mem 20675MB
[2025-04-03 01:39:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][80/311]	eta 0:03:26 lr 0.000608	time 0.8756 (0.8928)	loss 0.6302 (0.5792)	grad_norm 2.9032 (2.6588)	mem 20675MB
[2025-04-03 01:39:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][82/311]	eta 0:03:24 lr 0.000608	time 0.8754 (0.8924)	loss 0.5355 (0.5787)	grad_norm 3.0679 (2.6589)	mem 20675MB
[2025-04-03 01:39:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][84/311]	eta 0:03:22 lr 0.000607	time 0.8754 (0.8920)	loss 0.6355 (0.5797)	grad_norm 1.8720 (2.6494)	mem 20675MB
[2025-04-03 01:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][86/311]	eta 0:03:20 lr 0.000607	time 0.8756 (0.8917)	loss 0.4779 (0.5792)	grad_norm 3.9472 (2.6697)	mem 20675MB
[2025-04-03 01:39:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][88/311]	eta 0:03:18 lr 0.000607	time 0.8755 (0.8913)	loss 0.5889 (0.5791)	grad_norm 1.9541 (2.6627)	mem 20675MB
[2025-04-03 01:39:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][90/311]	eta 0:03:16 lr 0.000606	time 0.8757 (0.8910)	loss 0.4157 (0.5771)	grad_norm 2.7233 (2.6495)	mem 20675MB
[2025-04-03 01:39:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][92/311]	eta 0:03:15 lr 0.000606	time 0.8754 (0.8907)	loss 0.6482 (0.5781)	grad_norm 2.1911 (2.6389)	mem 20675MB
[2025-04-03 01:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][94/311]	eta 0:03:13 lr 0.000605	time 0.8753 (0.8904)	loss 0.6459 (0.5776)	grad_norm 4.4197 (2.6682)	mem 20675MB
[2025-04-03 01:39:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][96/311]	eta 0:03:11 lr 0.000605	time 0.8757 (0.8901)	loss 0.6283 (0.5778)	grad_norm 2.5444 (2.6659)	mem 20675MB
[2025-04-03 01:39:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][98/311]	eta 0:03:09 lr 0.000605	time 0.8756 (0.8898)	loss 0.6146 (0.5783)	grad_norm 2.3164 (2.6625)	mem 20675MB
[2025-04-03 01:39:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][100/311]	eta 0:03:07 lr 0.000604	time 0.8756 (0.8896)	loss 0.5959 (0.5779)	grad_norm 2.0935 (2.6565)	mem 20675MB
[2025-04-03 01:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][102/311]	eta 0:03:05 lr 0.000604	time 0.8757 (0.8893)	loss 0.5450 (0.5764)	grad_norm 2.8903 (2.6628)	mem 20675MB
[2025-04-03 01:39:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][104/311]	eta 0:03:04 lr 0.000603	time 0.8756 (0.8891)	loss 0.4801 (0.5755)	grad_norm 2.4743 (2.6555)	mem 20675MB
[2025-04-03 01:39:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][106/311]	eta 0:03:02 lr 0.000603	time 0.8753 (0.8888)	loss 0.5871 (0.5754)	grad_norm 1.7416 (2.6452)	mem 20675MB
[2025-04-03 01:39:59 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][108/311]	eta 0:03:00 lr 0.000602	time 0.8753 (0.8886)	loss 0.5482 (0.5737)	grad_norm 2.7624 (2.6499)	mem 20675MB
[2025-04-03 01:40:01 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][110/311]	eta 0:02:58 lr 0.000602	time 0.8758 (0.8884)	loss 0.6376 (0.5737)	grad_norm 2.6064 (2.6653)	mem 20675MB
[2025-04-03 01:40:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][112/311]	eta 0:02:56 lr 0.000602	time 0.8756 (0.8882)	loss 0.5789 (0.5742)	grad_norm 3.8853 (2.6780)	mem 20675MB
[2025-04-03 01:40:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][114/311]	eta 0:02:54 lr 0.000601	time 0.8753 (0.8880)	loss 0.5550 (0.5738)	grad_norm 2.4631 (2.6770)	mem 20675MB
[2025-04-03 01:40:06 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][116/311]	eta 0:02:53 lr 0.000601	time 0.8753 (0.8878)	loss 0.5805 (0.5745)	grad_norm 2.5795 (2.6748)	mem 20675MB
[2025-04-03 01:40:08 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][118/311]	eta 0:02:51 lr 0.000600	time 0.8754 (0.8876)	loss 0.5823 (0.5744)	grad_norm 2.3736 (2.6830)	mem 20675MB
[2025-04-03 01:40:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][120/311]	eta 0:02:49 lr 0.000600	time 0.8755 (0.8874)	loss 0.5805 (0.5737)	grad_norm 1.8594 (2.6853)	mem 20675MB
[2025-04-03 01:40:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][122/311]	eta 0:02:47 lr 0.000599	time 0.8755 (0.8872)	loss 0.5574 (0.5739)	grad_norm 2.4626 (2.6773)	mem 20675MB
[2025-04-03 01:40:13 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][124/311]	eta 0:02:45 lr 0.000599	time 0.8761 (0.8871)	loss 0.5967 (0.5739)	grad_norm 1.7524 (2.6628)	mem 20675MB
[2025-04-03 01:40:15 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][126/311]	eta 0:02:44 lr 0.000599	time 0.8759 (0.8869)	loss 0.4265 (0.5726)	grad_norm 2.7973 (2.6614)	mem 20675MB
[2025-04-03 01:40:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][128/311]	eta 0:02:42 lr 0.000598	time 0.8756 (0.8867)	loss 0.5968 (0.5732)	grad_norm 2.1730 (2.6536)	mem 20675MB
[2025-04-03 01:40:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][130/311]	eta 0:02:40 lr 0.000598	time 0.8757 (0.8866)	loss 0.4496 (0.5723)	grad_norm 3.3549 (2.6514)	mem 20675MB
[2025-04-03 01:40:20 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][132/311]	eta 0:02:38 lr 0.000597	time 0.8757 (0.8864)	loss 0.6127 (0.5724)	grad_norm 2.9222 (2.6505)	mem 20675MB
[2025-04-03 01:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][134/311]	eta 0:02:36 lr 0.000597	time 0.8757 (0.8863)	loss 0.5040 (0.5721)	grad_norm 2.7906 (2.6548)	mem 20675MB
[2025-04-03 01:40:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][136/311]	eta 0:02:35 lr 0.000597	time 0.8753 (0.8861)	loss 0.5576 (0.5726)	grad_norm 2.1513 (2.6543)	mem 20675MB
[2025-04-03 01:40:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][138/311]	eta 0:02:33 lr 0.000596	time 0.8756 (0.8860)	loss 0.6365 (0.5729)	grad_norm 2.4346 (2.6515)	mem 20675MB
[2025-04-03 01:40:27 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][140/311]	eta 0:02:31 lr 0.000596	time 0.8758 (0.8859)	loss 0.5473 (0.5727)	grad_norm 2.8610 (2.6564)	mem 20675MB
[2025-04-03 01:40:29 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][142/311]	eta 0:02:29 lr 0.000595	time 0.8757 (0.8857)	loss 0.6122 (0.5723)	grad_norm 2.3961 (2.6591)	mem 20675MB
[2025-04-03 01:40:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][144/311]	eta 0:02:27 lr 0.000595	time 0.8757 (0.8856)	loss 0.5217 (0.5720)	grad_norm 2.9423 (2.6631)	mem 20675MB
[2025-04-03 01:40:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][146/311]	eta 0:02:26 lr 0.000594	time 0.8758 (0.8855)	loss 0.6193 (0.5719)	grad_norm 2.7283 (2.6645)	mem 20675MB
[2025-04-03 01:40:34 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][148/311]	eta 0:02:24 lr 0.000594	time 0.8755 (0.8854)	loss 0.5684 (0.5722)	grad_norm 2.4529 (2.6579)	mem 20675MB
[2025-04-03 01:40:36 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][150/311]	eta 0:02:22 lr 0.000594	time 0.8755 (0.8853)	loss 0.6265 (0.5730)	grad_norm 2.2483 (2.6540)	mem 20675MB
[2025-04-03 01:40:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][152/311]	eta 0:02:20 lr 0.000593	time 0.8756 (0.8851)	loss 0.6049 (0.5734)	grad_norm 1.7000 (2.6461)	mem 20675MB
[2025-04-03 01:40:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][154/311]	eta 0:02:18 lr 0.000593	time 0.8757 (0.8850)	loss 0.5550 (0.5728)	grad_norm 3.0898 (2.6596)	mem 20675MB
[2025-04-03 01:40:41 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][156/311]	eta 0:02:17 lr 0.000592	time 0.8754 (0.8849)	loss 0.4800 (0.5714)	grad_norm 4.0960 (2.6741)	mem 20675MB
[2025-04-03 01:40:43 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][158/311]	eta 0:02:15 lr 0.000592	time 0.8760 (0.8848)	loss 0.5031 (0.5707)	grad_norm 3.5487 (2.6808)	mem 20675MB
[2025-04-03 01:40:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][160/311]	eta 0:02:13 lr 0.000591	time 0.8755 (0.8847)	loss 0.6612 (0.5713)	grad_norm 3.5078 (2.6879)	mem 20675MB
[2025-04-03 01:40:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][162/311]	eta 0:02:11 lr 0.000591	time 0.8757 (0.8846)	loss 0.5524 (0.5706)	grad_norm 3.2550 (2.6922)	mem 20675MB
[2025-04-03 01:40:48 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][164/311]	eta 0:02:10 lr 0.000591	time 0.8753 (0.8845)	loss 0.6061 (0.5712)	grad_norm 3.3345 (2.6959)	mem 20675MB
[2025-04-03 01:40:50 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][166/311]	eta 0:02:08 lr 0.000590	time 0.8756 (0.8844)	loss 0.5968 (0.5714)	grad_norm 2.6386 (2.6961)	mem 20675MB
[2025-04-03 01:40:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][168/311]	eta 0:02:06 lr 0.000590	time 0.8763 (0.8843)	loss 0.5780 (0.5716)	grad_norm 2.3859 (2.6907)	mem 20675MB
[2025-04-03 01:40:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][170/311]	eta 0:02:04 lr 0.000589	time 0.8760 (0.8842)	loss 0.5683 (0.5715)	grad_norm 2.1112 (2.6839)	mem 20675MB
[2025-04-03 01:40:55 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][172/311]	eta 0:02:02 lr 0.000589	time 0.8758 (0.8842)	loss 0.5884 (0.5721)	grad_norm 2.1310 (2.6813)	mem 20675MB
[2025-04-03 01:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][174/311]	eta 0:02:01 lr 0.000589	time 0.8754 (0.8841)	loss 0.4756 (0.5712)	grad_norm 3.2065 (2.6908)	mem 20675MB
[2025-04-03 01:40:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][176/311]	eta 0:01:59 lr 0.000588	time 0.8757 (0.8840)	loss 0.6010 (0.5715)	grad_norm 1.9829 (2.6841)	mem 20675MB
[2025-04-03 01:41:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][178/311]	eta 0:01:57 lr 0.000588	time 0.8757 (0.8839)	loss 0.6406 (0.5722)	grad_norm 2.4697 (2.6813)	mem 20675MB
[2025-04-03 01:41:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][180/311]	eta 0:01:55 lr 0.000587	time 0.8758 (0.8838)	loss 0.4925 (0.5719)	grad_norm 3.7400 (2.6872)	mem 20675MB
[2025-04-03 01:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][182/311]	eta 0:01:54 lr 0.000587	time 0.8757 (0.8837)	loss 0.4703 (0.5714)	grad_norm 4.5363 (2.6919)	mem 20675MB
[2025-04-03 01:41:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][184/311]	eta 0:01:52 lr 0.000586	time 0.8755 (0.8837)	loss 0.6024 (0.5713)	grad_norm 2.2873 (2.6930)	mem 20675MB
[2025-04-03 01:41:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][186/311]	eta 0:01:50 lr 0.000586	time 0.8762 (0.8836)	loss 0.5224 (0.5712)	grad_norm 3.7751 (2.6944)	mem 20675MB
[2025-04-03 01:41:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][188/311]	eta 0:01:48 lr 0.000586	time 0.8756 (0.8835)	loss 0.6927 (0.5718)	grad_norm 2.4080 (2.6891)	mem 20675MB
[2025-04-03 01:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][190/311]	eta 0:01:46 lr 0.000585	time 0.8757 (0.8834)	loss 0.4885 (0.5714)	grad_norm 3.1664 (2.6918)	mem 20675MB
[2025-04-03 01:41:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][192/311]	eta 0:01:45 lr 0.000585	time 0.8757 (0.8834)	loss 0.5866 (0.5715)	grad_norm 2.7750 (2.6947)	mem 20675MB
[2025-04-03 01:41:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][194/311]	eta 0:01:43 lr 0.000584	time 0.8757 (0.8833)	loss 0.4919 (0.5707)	grad_norm 3.8161 (2.7030)	mem 20675MB
[2025-04-03 01:41:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][196/311]	eta 0:01:41 lr 0.000584	time 0.8756 (0.8832)	loss 0.4920 (0.5701)	grad_norm 5.2187 (2.7141)	mem 20675MB
[2025-04-03 01:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][198/311]	eta 0:01:39 lr 0.000583	time 0.8756 (0.8832)	loss 0.4862 (0.5695)	grad_norm 2.8733 (2.7160)	mem 20675MB
[2025-04-03 01:41:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][200/311]	eta 0:01:38 lr 0.000583	time 0.8755 (0.8831)	loss 0.5322 (0.5695)	grad_norm 3.5873 (2.7172)	mem 20675MB
[2025-04-03 01:41:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][202/311]	eta 0:01:36 lr 0.000583	time 0.8757 (0.8830)	loss 0.5503 (0.5689)	grad_norm 3.2281 (2.7232)	mem 20675MB
[2025-04-03 01:41:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][204/311]	eta 0:01:34 lr 0.000582	time 0.8756 (0.8830)	loss 0.5689 (0.5690)	grad_norm 2.5855 (2.7216)	mem 20675MB
[2025-04-03 01:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][206/311]	eta 0:01:32 lr 0.000582	time 0.8762 (0.8829)	loss 0.4105 (0.5677)	grad_norm 2.9024 (2.7270)	mem 20675MB
[2025-04-03 01:41:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][208/311]	eta 0:01:30 lr 0.000581	time 0.8760 (0.8828)	loss 0.6090 (0.5679)	grad_norm 2.6262 (2.7268)	mem 20675MB
[2025-04-03 01:41:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][210/311]	eta 0:01:29 lr 0.000581	time 0.8755 (0.8828)	loss 0.5349 (0.5677)	grad_norm 2.7679 (2.7300)	mem 20675MB
[2025-04-03 01:41:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][212/311]	eta 0:01:27 lr 0.000581	time 0.8754 (0.8827)	loss 0.4960 (0.5674)	grad_norm 4.2621 (2.7389)	mem 20675MB
[2025-04-03 01:41:32 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][214/311]	eta 0:01:25 lr 0.000580	time 0.8753 (0.8827)	loss 0.5094 (0.5675)	grad_norm 3.9847 (2.7449)	mem 20675MB
[2025-04-03 01:41:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][216/311]	eta 0:01:23 lr 0.000580	time 0.8754 (0.8826)	loss 0.6314 (0.5682)	grad_norm 2.9335 (2.7464)	mem 20675MB
[2025-04-03 01:41:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][218/311]	eta 0:01:22 lr 0.000579	time 0.8761 (0.8825)	loss 0.4373 (0.5678)	grad_norm 4.5351 (2.7517)	mem 20675MB
[2025-04-03 01:41:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][220/311]	eta 0:01:20 lr 0.000579	time 0.8758 (0.8825)	loss 0.5971 (0.5677)	grad_norm 4.5114 (2.7645)	mem 20675MB
[2025-04-03 01:41:39 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][222/311]	eta 0:01:18 lr 0.000578	time 0.8757 (0.8824)	loss 0.5197 (0.5679)	grad_norm 3.8272 (2.7678)	mem 20675MB
[2025-04-03 01:41:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][224/311]	eta 0:01:16 lr 0.000578	time 0.8765 (0.8824)	loss 0.6161 (0.5681)	grad_norm 2.4535 (2.7653)	mem 20675MB
[2025-04-03 01:41:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][226/311]	eta 0:01:15 lr 0.000578	time 0.8759 (0.8824)	loss 0.6144 (0.5687)	grad_norm 2.2662 (2.7602)	mem 20675MB
[2025-04-03 01:41:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][228/311]	eta 0:01:13 lr 0.000577	time 0.8757 (0.8823)	loss 0.5491 (0.5686)	grad_norm 2.4440 (2.7545)	mem 20675MB
[2025-04-03 01:41:46 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][230/311]	eta 0:01:11 lr 0.000577	time 0.8761 (0.8823)	loss 0.5061 (0.5683)	grad_norm 2.4621 (2.7550)	mem 20675MB
[2025-04-03 01:41:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][232/311]	eta 0:01:09 lr 0.000576	time 0.8757 (0.8822)	loss 0.5355 (0.5679)	grad_norm 3.0257 (2.7547)	mem 20675MB
[2025-04-03 01:41:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][234/311]	eta 0:01:07 lr 0.000576	time 0.8759 (0.8822)	loss 0.6179 (0.5682)	grad_norm 1.8790 (2.7507)	mem 20675MB
[2025-04-03 01:41:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][236/311]	eta 0:01:06 lr 0.000576	time 0.8770 (0.8822)	loss 0.5927 (0.5679)	grad_norm 3.1052 (2.7507)	mem 20675MB
[2025-04-03 01:41:53 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][238/311]	eta 0:01:04 lr 0.000575	time 0.8758 (0.8822)	loss 0.4945 (0.5678)	grad_norm 2.9165 (2.7480)	mem 20675MB
[2025-04-03 01:41:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][240/311]	eta 0:01:02 lr 0.000575	time 0.8757 (0.8821)	loss 0.5646 (0.5679)	grad_norm 2.9182 (2.7492)	mem 20675MB
[2025-04-03 01:41:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][242/311]	eta 0:01:00 lr 0.000574	time 0.8756 (0.8821)	loss 0.6081 (0.5686)	grad_norm 2.4082 (2.7524)	mem 20675MB
[2025-04-03 01:41:58 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][244/311]	eta 0:00:59 lr 0.000574	time 0.8758 (0.8820)	loss 0.5525 (0.5681)	grad_norm 2.6464 (2.7530)	mem 20675MB
[2025-04-03 01:42:00 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][246/311]	eta 0:00:57 lr 0.000573	time 0.8756 (0.8820)	loss 0.5587 (0.5676)	grad_norm 2.3542 (2.7560)	mem 20675MB
[2025-04-03 01:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][248/311]	eta 0:00:55 lr 0.000573	time 0.8762 (0.8819)	loss 0.5245 (0.5677)	grad_norm 3.3575 (2.7576)	mem 20675MB
[2025-04-03 01:42:03 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][250/311]	eta 0:00:53 lr 0.000573	time 0.8755 (0.8819)	loss 0.5936 (0.5681)	grad_norm 2.2178 (2.7537)	mem 20675MB
[2025-04-03 01:42:05 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][252/311]	eta 0:00:52 lr 0.000572	time 0.8758 (0.8819)	loss 0.5536 (0.5681)	grad_norm 2.4807 (2.7519)	mem 20675MB
[2025-04-03 01:42:07 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][254/311]	eta 0:00:50 lr 0.000572	time 0.8767 (0.8818)	loss 0.5532 (0.5683)	grad_norm 2.2165 (2.7480)	mem 20675MB
[2025-04-03 01:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][256/311]	eta 0:00:48 lr 0.000571	time 0.8767 (0.8818)	loss 0.5619 (0.5682)	grad_norm 2.3502 (2.7466)	mem 20675MB
[2025-04-03 01:42:10 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][258/311]	eta 0:00:46 lr 0.000571	time 0.8767 (0.8818)	loss 0.4939 (0.5677)	grad_norm 2.4676 (2.7438)	mem 20675MB
[2025-04-03 01:42:12 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][260/311]	eta 0:00:44 lr 0.000570	time 0.8759 (0.8817)	loss 0.5177 (0.5675)	grad_norm 2.5700 (2.7411)	mem 20675MB
[2025-04-03 01:42:14 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][262/311]	eta 0:00:43 lr 0.000570	time 0.8757 (0.8817)	loss 0.4678 (0.5672)	grad_norm 5.2252 (2.7513)	mem 20675MB
[2025-04-03 01:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][264/311]	eta 0:00:41 lr 0.000570	time 0.8759 (0.8817)	loss 0.6583 (0.5677)	grad_norm 2.6327 (2.7519)	mem 20675MB
[2025-04-03 01:42:17 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][266/311]	eta 0:00:39 lr 0.000569	time 0.8758 (0.8816)	loss 0.5688 (0.5678)	grad_norm 3.0983 (2.7515)	mem 20675MB
[2025-04-03 01:42:19 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][268/311]	eta 0:00:37 lr 0.000569	time 0.8758 (0.8816)	loss 0.6187 (0.5682)	grad_norm 2.4353 (2.7514)	mem 20675MB
[2025-04-03 01:42:21 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][270/311]	eta 0:00:36 lr 0.000568	time 0.8759 (0.8816)	loss 0.6183 (0.5685)	grad_norm 2.5212 (2.7483)	mem 20675MB
[2025-04-03 01:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][272/311]	eta 0:00:34 lr 0.000568	time 0.8758 (0.8815)	loss 0.5807 (0.5685)	grad_norm 2.7432 (2.7515)	mem 20675MB
[2025-04-03 01:42:24 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][274/311]	eta 0:00:32 lr 0.000568	time 0.8758 (0.8815)	loss 0.4935 (0.5685)	grad_norm 4.0756 (2.7566)	mem 20675MB
[2025-04-03 01:42:26 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][276/311]	eta 0:00:30 lr 0.000567	time 0.8755 (0.8814)	loss 0.5306 (0.5685)	grad_norm 2.1617 (2.7540)	mem 20675MB
[2025-04-03 01:42:28 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][278/311]	eta 0:00:29 lr 0.000567	time 0.8758 (0.8814)	loss 0.5334 (0.5683)	grad_norm 2.7712 (2.7510)	mem 20675MB
[2025-04-03 01:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][280/311]	eta 0:00:27 lr 0.000566	time 0.8762 (0.8814)	loss 0.6506 (0.5688)	grad_norm 2.6131 (2.7480)	mem 20675MB
[2025-04-03 01:42:31 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][282/311]	eta 0:00:25 lr 0.000566	time 0.8761 (0.8813)	loss 0.6671 (0.5688)	grad_norm 4.1815 (2.7582)	mem 20675MB
[2025-04-03 01:42:33 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][284/311]	eta 0:00:23 lr 0.000565	time 0.8758 (0.8813)	loss 0.5874 (0.5684)	grad_norm 1.5766 (2.7556)	mem 20675MB
[2025-04-03 01:42:35 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][286/311]	eta 0:00:22 lr 0.000565	time 0.8758 (0.8813)	loss 0.5896 (0.5684)	grad_norm 2.3195 (2.7556)	mem 20675MB
[2025-04-03 01:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][288/311]	eta 0:00:20 lr 0.000565	time 0.8760 (0.8813)	loss 0.5855 (0.5688)	grad_norm 2.6360 (2.7536)	mem 20675MB
[2025-04-03 01:42:38 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][290/311]	eta 0:00:18 lr 0.000564	time 0.8757 (0.8812)	loss 0.5796 (0.5685)	grad_norm 2.4351 (2.7579)	mem 20675MB
[2025-04-03 01:42:40 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][292/311]	eta 0:00:16 lr 0.000564	time 0.8758 (0.8812)	loss 0.6206 (0.5687)	grad_norm 2.0055 (2.7591)	mem 20675MB
[2025-04-03 01:42:42 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][294/311]	eta 0:00:14 lr 0.000563	time 0.8755 (0.8812)	loss 0.5652 (0.5689)	grad_norm 2.8160 (2.7601)	mem 20675MB
[2025-04-03 01:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][296/311]	eta 0:00:13 lr 0.000563	time 0.8756 (0.8811)	loss 0.4369 (0.5683)	grad_norm 2.9635 (2.7621)	mem 20675MB
[2025-04-03 01:42:45 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][298/311]	eta 0:00:11 lr 0.000563	time 0.8758 (0.8811)	loss 0.5958 (0.5682)	grad_norm 2.1640 (2.7647)	mem 20675MB
[2025-04-03 01:42:47 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][300/311]	eta 0:00:09 lr 0.000562	time 0.8753 (0.8811)	loss 0.5885 (0.5684)	grad_norm 2.7794 (2.7653)	mem 20675MB
[2025-04-03 01:42:49 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][302/311]	eta 0:00:07 lr 0.000562	time 0.8757 (0.8810)	loss 0.4877 (0.5677)	grad_norm 3.0651 (2.7675)	mem 20675MB
[2025-04-03 01:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][304/311]	eta 0:00:06 lr 0.000561	time 0.8758 (0.8810)	loss 0.4866 (0.5674)	grad_norm 3.0452 (2.7664)	mem 20675MB
[2025-04-03 01:42:52 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][306/311]	eta 0:00:04 lr 0.000561	time 0.8757 (0.8810)	loss 0.6897 (0.5679)	grad_norm 2.7357 (2.7665)	mem 20675MB
[2025-04-03 01:42:54 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][308/311]	eta 0:00:02 lr 0.000560	time 0.8754 (0.8810)	loss 0.5785 (0.5683)	grad_norm 3.4389 (2.7682)	mem 20675MB
[2025-04-03 01:42:56 simmim_finetune] (main_finetune.py 252): INFO Train: [15/30][310/311]	eta 0:00:00 lr 0.000560	time 0.8754 (0.8809)	loss 0.6297 (0.5685)	grad_norm 3.0608 (2.7729)	mem 20675MB
[2025-04-03 01:42:56 simmim_finetune] (main_finetune.py 260): INFO EPOCH 15 training takes 0:04:34
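Each metric in the training lines above is printed as `latest (running average)`, e.g. `time 0.8754 (0.8809)` or `loss 0.6297 (0.5685)`. A minimal sketch of the meter behind that format (an illustrative class, not necessarily the repo's exact `utils` implementation):

```python
class AverageMeter:
    """Tracks the latest value and the running mean of a metric."""

    def __init__(self):
        self.val = 0.0    # most recent batch value
        self.sum = 0.0    # weighted sum of all values seen
        self.count = 0    # number of samples contributing to the sum

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)


meter = AverageMeter()
for batch_time in [0.88, 0.87, 0.89]:
    meter.update(batch_time)
print(f"time {meter.val:.4f} ({meter.avg:.4f})")
```

The loss and grad_norm columns follow the same pattern, which is why the parenthesized averages move slowly while the per-batch values jump around.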
[2025-04-03 01:42:56 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt15.pth saving......
[2025-04-03 01:43:00 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt15.pth saved !!!
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.448 (1.448)	Loss 0.5181 (0.5181)	Acc@1 76.562 (76.562)	Mem 20675MB
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
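The three accuracy figures above fit together by size-weighted averaging over the two test batches: 76.562% on the first 128-sample batch, plus a small final batch, gives 77.465% over all 142 test images. A plain-Python sketch of that arithmetic (the batch counts are inferred from the log, and `top1_accuracy` is an illustrative name, not the script's own helper):

```python
def top1_accuracy(correct: int, total: int) -> float:
    """Top-1 accuracy as a percentage."""
    return 100.0 * correct / total


# (correct, batch_size) per test batch; 128 + 14 = 142 images total.
# 98/128 reproduces the 76.562 printed for batch [0/2] above.
batches = [(98, 128), (12, 14)]

per_batch = [top1_accuracy(c, n) for c, n in batches]
overall = top1_accuracy(sum(c for c, _ in batches),
                        sum(n for _, n in batches))
print(f"{per_batch[0]:.3f} {overall:.3f}")  # 76.562 77.465
```

Averaging the two batch percentages directly (without size weighting) would give a different number, which is why the meter weights each batch by its sample count.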
[2025-04-03 01:43:01 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.208012175389422e-06, 2.208012175389422e-06, 3.322621086394876e-06, 3.322621086394876e-06, 5.0374040264032676e-06, 5.0374040264032676e-06, 7.675531626416177e-06, 7.675531626416177e-06, 1.1734189472589882e-05, 1.1734189472589882e-05, 1.797827846670328e-05, 1.797827846670328e-05, 2.7584569226877726e-05, 2.7584569226877726e-05, 4.236347808868457e-05, 4.236347808868457e-05, 6.51002609530028e-05, 6.51002609530028e-05, 0.00010007992689810778, 0.00010007992689810778, 0.00015389479758288463, 0.00015389479758288463, 0.00023668690632869518, 0.00023668690632869518, 0.00036405938132224984, 0.00036405938132224984, 0.0005600170351584879, 0.0005600170351584879]
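The parameter-group learning rates printed above follow the config's layer-wise LR decay (`LAYER_DECAY: 0.65`): a cosine schedule sets the top-level rate (~0.000560 at the start of epoch 16, matching the `lr 0.000560` in the training lines), and each earlier layer's group is scaled down by a further factor of ~0.65, which is why consecutive pairs in the list differ by roughly that ratio. A hedged sketch of both pieces, ignoring any warmup phase the actual script may apply (illustrative functions, not the repo's scheduler code):

```python
import math

# Values from the config header: TRAIN.BASE_LR, TRAIN.MIN_LR, TRAIN.EPOCHS,
# TRAIN.LAYER_DECAY.
BASE_LR, MIN_LR, EPOCHS, LAYER_DECAY = 0.00125, 2.5e-7, 30, 0.65


def cosine_lr(epoch: float) -> float:
    """Cosine-annealed top-level learning rate at a (fractional) epoch."""
    return MIN_LR + 0.5 * (BASE_LR - MIN_LR) * (1 + math.cos(math.pi * epoch / EPOCHS))


def group_lrs(lr: float, num_layers: int = 12) -> list:
    """Per-layer rates: layer 0 = patch embed, num_layers = classifier head."""
    return [lr * LAYER_DECAY ** (num_layers - i) for i in range(num_layers + 1)]


lr = cosine_lr(16)                 # ~0.000560 for the ViT-Base 30-epoch run
print(f"{group_lrs(lr)[-2]:.6f}")  # group just below the head, ~0.000364
```

The small deviations between this sketch and the exact printed values (e.g. 0.000364059 vs. 0.000364) come from the scheduler also folding the minimum LR into each group.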
[2025-04-03 01:43:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][0/311]	eta 0:11:43 lr 0.000560	time 2.2635 (2.2635)	loss 0.6410 (0.6410)	grad_norm 2.4294 (2.4294)	mem 20675MB
[2025-04-03 01:43:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][2/311]	eta 0:06:53 lr 0.000559	time 0.8759 (1.3393)	loss 0.6076 (0.6105)	grad_norm 2.1682 (2.1768)	mem 20675MB
[2025-04-03 01:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][4/311]	eta 0:05:54 lr 0.000559	time 0.8759 (1.1543)	loss 0.6016 (0.6029)	grad_norm 1.8512 (2.1595)	mem 20675MB
[2025-04-03 01:43:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][6/311]	eta 0:05:27 lr 0.000559	time 0.8762 (1.0750)	loss 0.6147 (0.6082)	grad_norm 1.7857 (2.1445)	mem 20675MB
[2025-04-03 01:43:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][8/311]	eta 0:05:12 lr 0.000558	time 0.8759 (1.0310)	loss 0.6379 (0.6071)	grad_norm 2.0430 (2.0688)	mem 20675MB
[2025-04-03 01:43:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][10/311]	eta 0:05:01 lr 0.000558	time 0.8764 (1.0030)	loss 0.6122 (0.5990)	grad_norm 1.6856 (2.0267)	mem 20675MB
[2025-04-03 01:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][12/311]	eta 0:04:54 lr 0.000557	time 0.8763 (0.9837)	loss 0.4934 (0.5883)	grad_norm 2.6035 (2.1086)	mem 20675MB
[2025-04-03 01:43:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][14/311]	eta 0:04:47 lr 0.000557	time 0.8789 (0.9696)	loss 0.5504 (0.5826)	grad_norm 2.1081 (2.1015)	mem 20675MB
[2025-04-03 01:43:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][16/311]	eta 0:04:42 lr 0.000556	time 0.8760 (0.9587)	loss 0.5496 (0.5773)	grad_norm 1.9970 (2.0794)	mem 20675MB
[2025-04-03 01:43:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][18/311]	eta 0:04:38 lr 0.000556	time 0.8762 (0.9501)	loss 0.5862 (0.5726)	grad_norm 2.1751 (2.1474)	mem 20675MB
[2025-04-03 01:43:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][20/311]	eta 0:04:34 lr 0.000556	time 0.8759 (0.9432)	loss 0.5761 (0.5769)	grad_norm 2.5475 (2.2208)	mem 20675MB
[2025-04-03 01:43:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][22/311]	eta 0:04:30 lr 0.000555	time 0.8759 (0.9374)	loss 0.6019 (0.5814)	grad_norm 2.1403 (2.2120)	mem 20675MB
[2025-04-03 01:43:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][24/311]	eta 0:04:27 lr 0.000555	time 0.8774 (0.9326)	loss 0.5587 (0.5833)	grad_norm 2.7395 (2.2182)	mem 20675MB
[2025-04-03 01:43:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][26/311]	eta 0:04:24 lr 0.000554	time 0.8760 (0.9285)	loss 0.4536 (0.5783)	grad_norm 4.0401 (2.3148)	mem 20675MB
[2025-04-03 01:43:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][28/311]	eta 0:04:21 lr 0.000554	time 0.8763 (0.9249)	loss 0.6020 (0.5813)	grad_norm 1.8124 (2.3103)	mem 20675MB
[2025-04-03 01:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][30/311]	eta 0:04:19 lr 0.000554	time 0.8760 (0.9219)	loss 0.5226 (0.5761)	grad_norm 3.6307 (2.3566)	mem 20675MB
[2025-04-03 01:43:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][32/311]	eta 0:04:16 lr 0.000553	time 0.8759 (0.9192)	loss 0.5938 (0.5738)	grad_norm 2.1108 (2.3677)	mem 20675MB
[2025-04-03 01:43:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][34/311]	eta 0:04:13 lr 0.000553	time 0.8758 (0.9167)	loss 0.5879 (0.5759)	grad_norm 2.7325 (2.3702)	mem 20675MB
[2025-04-03 01:43:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][36/311]	eta 0:04:11 lr 0.000552	time 0.8772 (0.9146)	loss 0.6664 (0.5776)	grad_norm 2.9234 (2.3874)	mem 20675MB
[2025-04-03 01:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][38/311]	eta 0:04:09 lr 0.000552	time 0.8761 (0.9127)	loss 0.6435 (0.5771)	grad_norm 3.0922 (2.4273)	mem 20675MB
[2025-04-03 01:43:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][40/311]	eta 0:04:06 lr 0.000551	time 0.8758 (0.9109)	loss 0.6464 (0.5797)	grad_norm 1.8361 (2.4013)	mem 20675MB
[2025-04-03 01:43:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][42/311]	eta 0:04:04 lr 0.000551	time 0.8760 (0.9093)	loss 0.6368 (0.5809)	grad_norm 2.3380 (2.4203)	mem 20675MB
[2025-04-03 01:43:42 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][44/311]	eta 0:04:02 lr 0.000551	time 0.8760 (0.9079)	loss 0.4845 (0.5759)	grad_norm 2.8626 (2.4449)	mem 20675MB
[2025-04-03 01:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][46/311]	eta 0:04:00 lr 0.000550	time 0.8756 (0.9066)	loss 0.5179 (0.5724)	grad_norm 2.8606 (2.4625)	mem 20675MB
[2025-04-03 01:43:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][48/311]	eta 0:03:58 lr 0.000550	time 0.8762 (0.9053)	loss 0.5955 (0.5733)	grad_norm 1.9267 (2.4387)	mem 20675MB
[2025-04-03 01:43:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][50/311]	eta 0:03:56 lr 0.000549	time 0.8760 (0.9043)	loss 0.5816 (0.5745)	grad_norm 2.2923 (2.4305)	mem 20675MB
[2025-04-03 01:43:49 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][52/311]	eta 0:03:53 lr 0.000549	time 0.8761 (0.9032)	loss 0.4542 (0.5696)	grad_norm 2.9266 (2.4569)	mem 20675MB
[2025-04-03 01:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][54/311]	eta 0:03:51 lr 0.000549	time 0.8760 (0.9023)	loss 0.5174 (0.5700)	grad_norm 3.3446 (2.4800)	mem 20675MB
[2025-04-03 01:43:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][56/311]	eta 0:03:49 lr 0.000548	time 0.8760 (0.9014)	loss 0.4916 (0.5692)	grad_norm 3.8535 (2.5103)	mem 20675MB
[2025-04-03 01:43:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][58/311]	eta 0:03:47 lr 0.000548	time 0.8757 (0.9006)	loss 0.4319 (0.5673)	grad_norm 4.5425 (2.5475)	mem 20675MB
[2025-04-03 01:43:56 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][60/311]	eta 0:03:45 lr 0.000547	time 0.8757 (0.8998)	loss 0.5822 (0.5682)	grad_norm 2.7901 (2.5618)	mem 20675MB
[2025-04-03 01:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][62/311]	eta 0:03:43 lr 0.000547	time 0.8760 (0.8991)	loss 0.5871 (0.5682)	grad_norm 2.7454 (2.5714)	mem 20675MB
[2025-04-03 01:44:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][64/311]	eta 0:03:41 lr 0.000546	time 0.8761 (0.8984)	loss 0.5664 (0.5684)	grad_norm 3.4125 (2.5791)	mem 20675MB
[2025-04-03 01:44:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][66/311]	eta 0:03:39 lr 0.000546	time 0.8757 (0.8977)	loss 0.4910 (0.5674)	grad_norm 2.7248 (2.5823)	mem 20675MB
[2025-04-03 01:44:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][68/311]	eta 0:03:38 lr 0.000546	time 0.8758 (0.8971)	loss 0.5614 (0.5680)	grad_norm 2.7463 (2.5778)	mem 20675MB
[2025-04-03 01:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][70/311]	eta 0:03:36 lr 0.000545	time 0.8760 (0.8966)	loss 0.4834 (0.5666)	grad_norm 2.8817 (2.5883)	mem 20675MB
[2025-04-03 01:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][72/311]	eta 0:03:34 lr 0.000545	time 0.8760 (0.8960)	loss 0.5903 (0.5671)	grad_norm 2.1551 (2.5768)	mem 20675MB
[2025-04-03 01:44:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][74/311]	eta 0:03:32 lr 0.000544	time 0.8761 (0.8955)	loss 0.4476 (0.5670)	grad_norm 3.0221 (2.5816)	mem 20675MB
[2025-04-03 01:44:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][76/311]	eta 0:03:30 lr 0.000544	time 0.8761 (0.8950)	loss 0.5994 (0.5661)	grad_norm 2.6184 (2.5908)	mem 20675MB
[2025-04-03 01:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][78/311]	eta 0:03:28 lr 0.000544	time 0.8760 (0.8946)	loss 0.6304 (0.5678)	grad_norm 2.5287 (2.5989)	mem 20675MB
[2025-04-03 01:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][80/311]	eta 0:03:26 lr 0.000543	time 0.8761 (0.8941)	loss 0.6244 (0.5694)	grad_norm 2.3287 (2.5988)	mem 20675MB
[2025-04-03 01:44:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][82/311]	eta 0:03:24 lr 0.000543	time 0.8759 (0.8937)	loss 0.5799 (0.5703)	grad_norm 2.3891 (2.5854)	mem 20675MB
[2025-04-03 01:44:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][84/311]	eta 0:03:22 lr 0.000542	time 0.8759 (0.8933)	loss 0.6203 (0.5712)	grad_norm 1.8798 (2.5749)	mem 20675MB
[2025-04-03 01:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][86/311]	eta 0:03:20 lr 0.000542	time 0.8761 (0.8929)	loss 0.5107 (0.5692)	grad_norm 3.2293 (2.5796)	mem 20675MB
[2025-04-03 01:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][88/311]	eta 0:03:19 lr 0.000541	time 0.8761 (0.8926)	loss 0.5472 (0.5693)	grad_norm 2.8326 (2.5725)	mem 20675MB
[2025-04-03 01:44:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][90/311]	eta 0:03:17 lr 0.000541	time 0.8762 (0.8922)	loss 0.6306 (0.5697)	grad_norm 2.4895 (2.5692)	mem 20675MB
[2025-04-03 01:44:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][92/311]	eta 0:03:15 lr 0.000541	time 0.8758 (0.8919)	loss 0.5866 (0.5692)	grad_norm 2.1346 (2.5662)	mem 20675MB
[2025-04-03 01:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][94/311]	eta 0:03:13 lr 0.000540	time 0.8761 (0.8916)	loss 0.4937 (0.5669)	grad_norm 3.5705 (2.5842)	mem 20675MB
[2025-04-03 01:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][96/311]	eta 0:03:11 lr 0.000540	time 0.8758 (0.8913)	loss 0.4618 (0.5666)	grad_norm 4.4989 (2.6036)	mem 20675MB
[2025-04-03 01:44:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][98/311]	eta 0:03:09 lr 0.000539	time 0.8758 (0.8910)	loss 0.6021 (0.5675)	grad_norm 2.7346 (2.6061)	mem 20675MB
[2025-04-03 01:44:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][100/311]	eta 0:03:07 lr 0.000539	time 0.8762 (0.8907)	loss 0.5016 (0.5672)	grad_norm 4.9270 (2.6321)	mem 20675MB
[2025-04-03 01:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][102/311]	eta 0:03:06 lr 0.000539	time 0.8764 (0.8904)	loss 0.4776 (0.5666)	grad_norm 3.7744 (2.6377)	mem 20675MB
[2025-04-03 01:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][104/311]	eta 0:03:04 lr 0.000538	time 0.8759 (0.8902)	loss 0.6128 (0.5674)	grad_norm 1.4691 (2.6245)	mem 20675MB
[2025-04-03 01:44:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][106/311]	eta 0:03:02 lr 0.000538	time 0.8760 (0.8899)	loss 0.6245 (0.5678)	grad_norm 2.7755 (2.6186)	mem 20675MB
[2025-04-03 01:44:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][108/311]	eta 0:03:00 lr 0.000537	time 0.8763 (0.8897)	loss 0.4800 (0.5663)	grad_norm 3.1249 (2.6290)	mem 20675MB
[2025-04-03 01:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][110/311]	eta 0:02:58 lr 0.000537	time 0.8762 (0.8895)	loss 0.6296 (0.5666)	grad_norm 2.1943 (2.6241)	mem 20675MB
[2025-04-03 01:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][112/311]	eta 0:02:56 lr 0.000536	time 0.8759 (0.8892)	loss 0.5484 (0.5666)	grad_norm 2.8861 (2.6321)	mem 20675MB
[2025-04-03 01:44:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][114/311]	eta 0:02:55 lr 0.000536	time 0.8758 (0.8890)	loss 0.5888 (0.5661)	grad_norm 2.0723 (2.6269)	mem 20675MB
[2025-04-03 01:44:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][116/311]	eta 0:02:53 lr 0.000536	time 0.8758 (0.8888)	loss 0.4625 (0.5642)	grad_norm 3.7137 (2.6414)	mem 20675MB
[2025-04-03 01:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][118/311]	eta 0:02:51 lr 0.000535	time 0.8758 (0.8886)	loss 0.5815 (0.5632)	grad_norm 2.4978 (2.6388)	mem 20675MB
[2025-04-03 01:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][120/311]	eta 0:02:49 lr 0.000535	time 0.8761 (0.8884)	loss 0.6526 (0.5644)	grad_norm 2.4366 (2.6394)	mem 20675MB
[2025-04-03 01:44:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][122/311]	eta 0:02:47 lr 0.000534	time 0.8768 (0.8882)	loss 0.4362 (0.5627)	grad_norm 4.1425 (2.6592)	mem 20675MB
[2025-04-03 01:44:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][124/311]	eta 0:02:46 lr 0.000534	time 0.8759 (0.8880)	loss 0.5621 (0.5626)	grad_norm 3.5253 (2.6605)	mem 20675MB
[2025-04-03 01:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][126/311]	eta 0:02:44 lr 0.000534	time 0.8759 (0.8879)	loss 0.6139 (0.5638)	grad_norm 2.2741 (2.6594)	mem 20675MB
[2025-04-03 01:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][128/311]	eta 0:02:42 lr 0.000533	time 0.8761 (0.8877)	loss 0.6052 (0.5645)	grad_norm 2.5152 (2.6586)	mem 20675MB
[2025-04-03 01:44:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][130/311]	eta 0:02:40 lr 0.000533	time 0.8763 (0.8875)	loss 0.5502 (0.5636)	grad_norm 2.8980 (2.6657)	mem 20675MB
[2025-04-03 01:44:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][132/311]	eta 0:02:38 lr 0.000532	time 0.8758 (0.8874)	loss 0.5565 (0.5640)	grad_norm 3.5087 (2.6701)	mem 20675MB
[2025-04-03 01:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][134/311]	eta 0:02:37 lr 0.000532	time 0.8760 (0.8872)	loss 0.4538 (0.5629)	grad_norm 3.5761 (2.6756)	mem 20675MB
[2025-04-03 01:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][136/311]	eta 0:02:35 lr 0.000531	time 0.8758 (0.8871)	loss 0.6565 (0.5638)	grad_norm 2.5322 (2.6722)	mem 20675MB
[2025-04-03 01:45:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][138/311]	eta 0:02:33 lr 0.000531	time 0.8758 (0.8869)	loss 0.5656 (0.5644)	grad_norm 1.6411 (2.6622)	mem 20675MB
[2025-04-03 01:45:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][140/311]	eta 0:02:31 lr 0.000531	time 0.8760 (0.8868)	loss 0.4888 (0.5632)	grad_norm 2.3159 (2.6585)	mem 20675MB
[2025-04-03 01:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][142/311]	eta 0:02:29 lr 0.000530	time 0.8758 (0.8866)	loss 0.5287 (0.5627)	grad_norm 2.6194 (2.6597)	mem 20675MB
[2025-04-03 01:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][144/311]	eta 0:02:28 lr 0.000530	time 0.8763 (0.8865)	loss 0.5755 (0.5632)	grad_norm 2.8049 (2.6589)	mem 20675MB
[2025-04-03 01:45:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][146/311]	eta 0:02:26 lr 0.000529	time 0.8758 (0.8864)	loss 0.5179 (0.5630)	grad_norm 2.6389 (2.6562)	mem 20675MB
[2025-04-03 01:45:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][148/311]	eta 0:02:24 lr 0.000529	time 0.8760 (0.8862)	loss 0.6193 (0.5632)	grad_norm 2.0426 (2.6582)	mem 20675MB
[2025-04-03 01:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][150/311]	eta 0:02:22 lr 0.000529	time 0.8760 (0.8861)	loss 0.5711 (0.5634)	grad_norm 2.8248 (2.6599)	mem 20675MB
[2025-04-03 01:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][152/311]	eta 0:02:20 lr 0.000528	time 0.8760 (0.8860)	loss 0.5878 (0.5631)	grad_norm 2.2656 (2.6567)	mem 20675MB
[2025-04-03 01:45:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][154/311]	eta 0:02:19 lr 0.000528	time 0.8757 (0.8859)	loss 0.6341 (0.5637)	grad_norm 1.8869 (2.6520)	mem 20675MB
[2025-04-03 01:45:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][156/311]	eta 0:02:17 lr 0.000527	time 0.8761 (0.8858)	loss 0.5960 (0.5635)	grad_norm 2.3579 (2.6499)	mem 20675MB
[2025-04-03 01:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][158/311]	eta 0:02:15 lr 0.000527	time 0.8759 (0.8856)	loss 0.5889 (0.5635)	grad_norm 2.1535 (2.6493)	mem 20675MB
[2025-04-03 01:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][160/311]	eta 0:02:13 lr 0.000526	time 0.8761 (0.8855)	loss 0.5445 (0.5633)	grad_norm 2.3182 (2.6512)	mem 20675MB
[2025-04-03 01:45:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][162/311]	eta 0:02:11 lr 0.000526	time 0.8759 (0.8854)	loss 0.5818 (0.5637)	grad_norm 2.6363 (2.6476)	mem 20675MB
[2025-04-03 01:45:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][164/311]	eta 0:02:10 lr 0.000526	time 0.8757 (0.8853)	loss 0.5872 (0.5646)	grad_norm 2.1577 (2.6463)	mem 20675MB
[2025-04-03 01:45:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][166/311]	eta 0:02:08 lr 0.000525	time 0.8760 (0.8852)	loss 0.6088 (0.5650)	grad_norm 2.4679 (2.6442)	mem 20675MB
[2025-04-03 01:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][168/311]	eta 0:02:06 lr 0.000525	time 0.8759 (0.8851)	loss 0.5814 (0.5647)	grad_norm 2.7142 (2.6455)	mem 20675MB
[2025-04-03 01:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][170/311]	eta 0:02:04 lr 0.000524	time 0.8761 (0.8850)	loss 0.4970 (0.5639)	grad_norm 3.5934 (2.6546)	mem 20675MB
[2025-04-03 01:45:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][172/311]	eta 0:02:03 lr 0.000524	time 0.8759 (0.8849)	loss 0.6728 (0.5638)	grad_norm 2.6719 (2.6549)	mem 20675MB
[2025-04-03 01:45:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][174/311]	eta 0:02:01 lr 0.000524	time 0.8758 (0.8848)	loss 0.6018 (0.5641)	grad_norm 2.5656 (2.6531)	mem 20675MB
[2025-04-03 01:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][176/311]	eta 0:01:59 lr 0.000523	time 0.8759 (0.8847)	loss 0.4947 (0.5637)	grad_norm 2.8121 (2.6545)	mem 20675MB
[2025-04-03 01:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][178/311]	eta 0:01:57 lr 0.000523	time 0.8758 (0.8847)	loss 0.5310 (0.5638)	grad_norm 2.6048 (2.6556)	mem 20675MB
[2025-04-03 01:45:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][180/311]	eta 0:01:55 lr 0.000522	time 0.8757 (0.8846)	loss 0.5833 (0.5638)	grad_norm 2.8908 (2.6545)	mem 20675MB
[2025-04-03 01:45:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][182/311]	eta 0:01:54 lr 0.000522	time 0.8760 (0.8845)	loss 0.5169 (0.5637)	grad_norm 2.6486 (2.6553)	mem 20675MB
[2025-04-03 01:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][184/311]	eta 0:01:52 lr 0.000521	time 0.8756 (0.8844)	loss 0.5747 (0.5638)	grad_norm 2.8058 (2.6561)	mem 20675MB
[2025-04-03 01:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][186/311]	eta 0:01:50 lr 0.000521	time 0.8763 (0.8843)	loss 0.6168 (0.5644)	grad_norm 2.8038 (2.6539)	mem 20675MB
[2025-04-03 01:45:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][188/311]	eta 0:01:48 lr 0.000521	time 0.8761 (0.8842)	loss 0.5469 (0.5647)	grad_norm 2.3712 (2.6564)	mem 20675MB
[2025-04-03 01:45:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][190/311]	eta 0:01:46 lr 0.000520	time 0.8758 (0.8842)	loss 0.5924 (0.5653)	grad_norm 2.9465 (2.6572)	mem 20675MB
[2025-04-03 01:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][192/311]	eta 0:01:45 lr 0.000520	time 0.8758 (0.8841)	loss 0.5811 (0.5651)	grad_norm 3.1136 (2.6747)	mem 20675MB
[2025-04-03 01:45:54 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][194/311]	eta 0:01:43 lr 0.000519	time 0.8762 (0.8840)	loss 0.4885 (0.5642)	grad_norm 3.3357 (2.6817)	mem 20675MB
[2025-04-03 01:45:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][196/311]	eta 0:01:41 lr 0.000519	time 0.8759 (0.8839)	loss 0.6372 (0.5647)	grad_norm 2.3361 (2.6792)	mem 20675MB
[2025-04-03 01:45:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][198/311]	eta 0:01:39 lr 0.000519	time 0.8759 (0.8839)	loss 0.5538 (0.5648)	grad_norm 2.4683 (2.6762)	mem 20675MB
[2025-04-03 01:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][200/311]	eta 0:01:38 lr 0.000518	time 0.8757 (0.8838)	loss 0.3819 (0.5643)	grad_norm 4.0745 (2.6901)	mem 20675MB
[2025-04-03 01:46:01 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][202/311]	eta 0:01:36 lr 0.000518	time 0.8759 (0.8837)	loss 0.6272 (0.5646)	grad_norm 1.9323 (2.6902)	mem 20675MB
[2025-04-03 01:46:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][204/311]	eta 0:01:34 lr 0.000517	time 0.8759 (0.8837)	loss 0.6419 (0.5652)	grad_norm 2.2577 (2.6861)	mem 20675MB
[2025-04-03 01:46:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][206/311]	eta 0:01:32 lr 0.000517	time 0.8758 (0.8836)	loss 0.6040 (0.5650)	grad_norm 2.2140 (2.6830)	mem 20675MB
[2025-04-03 01:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][208/311]	eta 0:01:31 lr 0.000516	time 0.8759 (0.8835)	loss 0.5897 (0.5649)	grad_norm 2.6041 (2.6858)	mem 20675MB
[2025-04-03 01:46:08 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][210/311]	eta 0:01:29 lr 0.000516	time 0.8756 (0.8835)	loss 0.5102 (0.5644)	grad_norm 2.4635 (2.6848)	mem 20675MB
[2025-04-03 01:46:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][212/311]	eta 0:01:27 lr 0.000516	time 0.8756 (0.8834)	loss 0.5896 (0.5641)	grad_norm 2.2694 (2.6864)	mem 20675MB
[2025-04-03 01:46:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][214/311]	eta 0:01:25 lr 0.000515	time 0.8756 (0.8833)	loss 0.6087 (0.5638)	grad_norm 2.8675 (2.6948)	mem 20675MB
[2025-04-03 01:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][216/311]	eta 0:01:23 lr 0.000515	time 0.8759 (0.8833)	loss 0.5029 (0.5637)	grad_norm 3.5363 (2.6966)	mem 20675MB
[2025-04-03 01:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][218/311]	eta 0:01:22 lr 0.000514	time 0.8760 (0.8832)	loss 0.4858 (0.5635)	grad_norm 2.9621 (2.6973)	mem 20675MB
[2025-04-03 01:46:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][220/311]	eta 0:01:20 lr 0.000514	time 0.8759 (0.8832)	loss 0.5994 (0.5639)	grad_norm 2.0140 (2.6934)	mem 20675MB
[2025-04-03 01:46:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][222/311]	eta 0:01:18 lr 0.000514	time 0.8759 (0.8831)	loss 0.4572 (0.5636)	grad_norm 3.1814 (2.6933)	mem 20675MB
[2025-04-03 01:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][224/311]	eta 0:01:16 lr 0.000513	time 0.8759 (0.8830)	loss 0.6655 (0.5640)	grad_norm 3.2565 (2.6971)	mem 20675MB
[2025-04-03 01:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][226/311]	eta 0:01:15 lr 0.000513	time 0.8760 (0.8830)	loss 0.6150 (0.5645)	grad_norm 2.0203 (2.6927)	mem 20675MB
[2025-04-03 01:46:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][228/311]	eta 0:01:13 lr 0.000512	time 0.8757 (0.8829)	loss 0.5730 (0.5640)	grad_norm 2.2877 (2.6954)	mem 20675MB
[2025-04-03 01:46:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][230/311]	eta 0:01:11 lr 0.000512	time 0.8762 (0.8829)	loss 0.6331 (0.5640)	grad_norm 2.6295 (2.6969)	mem 20675MB
[2025-04-03 01:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][232/311]	eta 0:01:09 lr 0.000512	time 0.8757 (0.8828)	loss 0.5578 (0.5643)	grad_norm 2.6029 (2.6987)	mem 20675MB
[2025-04-03 01:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][234/311]	eta 0:01:07 lr 0.000511	time 0.8758 (0.8828)	loss 0.5889 (0.5648)	grad_norm 2.0877 (2.6940)	mem 20675MB
[2025-04-03 01:46:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][236/311]	eta 0:01:06 lr 0.000511	time 0.8758 (0.8827)	loss 0.5716 (0.5642)	grad_norm 3.2734 (2.6981)	mem 20675MB
[2025-04-03 01:46:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][238/311]	eta 0:01:04 lr 0.000510	time 0.8760 (0.8827)	loss 0.6505 (0.5649)	grad_norm 2.8611 (2.6983)	mem 20675MB
[2025-04-03 01:46:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][240/311]	eta 0:01:02 lr 0.000510	time 0.8759 (0.8826)	loss 0.5499 (0.5651)	grad_norm 2.1940 (2.6949)	mem 20675MB
[2025-04-03 01:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][242/311]	eta 0:01:00 lr 0.000509	time 0.8762 (0.8826)	loss 0.5763 (0.5648)	grad_norm 1.8085 (2.6952)	mem 20675MB
[2025-04-03 01:46:37 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][244/311]	eta 0:00:59 lr 0.000509	time 0.8761 (0.8825)	loss 0.5872 (0.5651)	grad_norm 2.8475 (2.6982)	mem 20675MB
[2025-04-03 01:46:39 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][246/311]	eta 0:00:57 lr 0.000509	time 0.8758 (0.8825)	loss 0.6302 (0.5655)	grad_norm 3.1262 (2.7013)	mem 20675MB
[2025-04-03 01:46:41 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][248/311]	eta 0:00:55 lr 0.000508	time 0.8760 (0.8824)	loss 0.5298 (0.5653)	grad_norm 3.1473 (2.6997)	mem 20675MB
[2025-04-03 01:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][250/311]	eta 0:00:53 lr 0.000508	time 0.8756 (0.8824)	loss 0.4686 (0.5645)	grad_norm 4.4111 (2.7077)	mem 20675MB
[2025-04-03 01:46:44 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][252/311]	eta 0:00:52 lr 0.000507	time 0.8757 (0.8823)	loss 0.4664 (0.5642)	grad_norm 3.3397 (2.7080)	mem 20675MB
[2025-04-03 01:46:46 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][254/311]	eta 0:00:50 lr 0.000507	time 0.8757 (0.8823)	loss 0.5917 (0.5644)	grad_norm 2.0818 (2.7060)	mem 20675MB
[2025-04-03 01:46:48 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][256/311]	eta 0:00:48 lr 0.000507	time 0.8757 (0.8823)	loss 0.6073 (0.5649)	grad_norm 2.8238 (2.7075)	mem 20675MB
[2025-04-03 01:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][258/311]	eta 0:00:46 lr 0.000506	time 0.8766 (0.8822)	loss 0.5831 (0.5646)	grad_norm 2.2324 (2.7049)	mem 20675MB
[2025-04-03 01:46:51 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][260/311]	eta 0:00:44 lr 0.000506	time 0.8761 (0.8822)	loss 0.6469 (0.5650)	grad_norm 3.2105 (2.7057)	mem 20675MB
[2025-04-03 01:46:53 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][262/311]	eta 0:00:43 lr 0.000505	time 0.8759 (0.8821)	loss 0.6153 (0.5653)	grad_norm 2.9588 (2.7055)	mem 20675MB
[2025-04-03 01:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][264/311]	eta 0:00:41 lr 0.000505	time 0.8757 (0.8821)	loss 0.6062 (0.5658)	grad_norm 2.3287 (2.7045)	mem 20675MB
[2025-04-03 01:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][266/311]	eta 0:00:39 lr 0.000504	time 0.8760 (0.8821)	loss 0.5221 (0.5656)	grad_norm 2.9184 (2.7086)	mem 20675MB
[2025-04-03 01:46:58 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][268/311]	eta 0:00:37 lr 0.000504	time 0.8760 (0.8820)	loss 0.5852 (0.5659)	grad_norm 2.7185 (2.7078)	mem 20675MB
[2025-04-03 01:47:00 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][270/311]	eta 0:00:36 lr 0.000504	time 0.8761 (0.8820)	loss 0.5535 (0.5658)	grad_norm 3.0302 (2.7074)	mem 20675MB
[2025-04-03 01:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][272/311]	eta 0:00:34 lr 0.000503	time 0.8757 (0.8819)	loss 0.4895 (0.5653)	grad_norm 5.2599 (2.7197)	mem 20675MB
[2025-04-03 01:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][274/311]	eta 0:00:32 lr 0.000503	time 0.8761 (0.8819)	loss 0.6007 (0.5657)	grad_norm 2.1214 (2.7145)	mem 20675MB
[2025-04-03 01:47:05 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][276/311]	eta 0:00:30 lr 0.000502	time 0.8776 (0.8819)	loss 0.5337 (0.5657)	grad_norm 4.2568 (2.7181)	mem 20675MB
[2025-04-03 01:47:07 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][278/311]	eta 0:00:29 lr 0.000502	time 0.8760 (0.8818)	loss 0.5556 (0.5658)	grad_norm 1.7310 (2.7144)	mem 20675MB
[2025-04-03 01:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][280/311]	eta 0:00:27 lr 0.000502	time 0.8757 (0.8818)	loss 0.5688 (0.5654)	grad_norm 2.3956 (2.7135)	mem 20675MB
[2025-04-03 01:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][282/311]	eta 0:00:25 lr 0.000501	time 0.8760 (0.8818)	loss 0.5152 (0.5654)	grad_norm 2.2383 (2.7101)	mem 20675MB
[2025-04-03 01:47:12 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][284/311]	eta 0:00:23 lr 0.000501	time 0.8762 (0.8817)	loss 0.6517 (0.5658)	grad_norm 2.9235 (2.7148)	mem 20675MB
[2025-04-03 01:47:14 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][286/311]	eta 0:00:22 lr 0.000500	time 0.8758 (0.8817)	loss 0.5332 (0.5658)	grad_norm 3.7720 (2.7182)	mem 20675MB
[2025-04-03 01:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][288/311]	eta 0:00:20 lr 0.000500	time 0.8762 (0.8817)	loss 0.6444 (0.5658)	grad_norm 2.3617 (2.7202)	mem 20675MB
[2025-04-03 01:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][290/311]	eta 0:00:18 lr 0.000500	time 0.8757 (0.8816)	loss 0.6263 (0.5661)	grad_norm 2.0692 (2.7198)	mem 20675MB
[2025-04-03 01:47:19 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][292/311]	eta 0:00:16 lr 0.000499	time 0.8765 (0.8816)	loss 0.5343 (0.5657)	grad_norm 2.3560 (2.7250)	mem 20675MB
[2025-04-03 01:47:21 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][294/311]	eta 0:00:14 lr 0.000499	time 0.8760 (0.8816)	loss 0.5697 (0.5660)	grad_norm 2.5294 (2.7231)	mem 20675MB
[2025-04-03 01:47:23 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][296/311]	eta 0:00:13 lr 0.000498	time 0.8771 (0.8815)	loss 0.5559 (0.5660)	grad_norm 3.2392 (2.7240)	mem 20675MB
[2025-04-03 01:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][298/311]	eta 0:00:11 lr 0.000498	time 0.8762 (0.8815)	loss 0.5406 (0.5657)	grad_norm 3.1105 (2.7278)	mem 20675MB
[2025-04-03 01:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][300/311]	eta 0:00:09 lr 0.000497	time 0.8757 (0.8815)	loss 0.4379 (0.5655)	grad_norm 4.4821 (2.7322)	mem 20675MB
[2025-04-03 01:47:28 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][302/311]	eta 0:00:07 lr 0.000497	time 0.8757 (0.8815)	loss 0.5652 (0.5656)	grad_norm 2.4956 (2.7310)	mem 20675MB
[2025-04-03 01:47:30 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][304/311]	eta 0:00:06 lr 0.000497	time 0.8756 (0.8814)	loss 0.6312 (0.5658)	grad_norm 2.8332 (2.7299)	mem 20675MB
[2025-04-03 01:47:32 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][306/311]	eta 0:00:04 lr 0.000496	time 0.8774 (0.8814)	loss 0.5599 (0.5654)	grad_norm 2.7615 (2.7341)	mem 20675MB
[2025-04-03 01:47:34 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][308/311]	eta 0:00:02 lr 0.000496	time 0.8759 (0.8814)	loss 0.6232 (0.5658)	grad_norm 2.5024 (2.7323)	mem 20675MB
[2025-04-03 01:47:35 simmim_finetune] (main_finetune.py 252): INFO Train: [16/30][310/311]	eta 0:00:00 lr 0.000495	time 0.8758 (0.8813)	loss 0.6364 (0.5664)	grad_norm 2.2883 (2.7321)	mem 20675MB
[2025-04-03 01:47:35 simmim_finetune] (main_finetune.py 260): INFO EPOCH 16 training takes 0:04:34
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.435 (1.435)	Loss 0.5123 (0.5123)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.761
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.8%
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 01:47:37 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.982029866688251e-06, 1.982029866688251e-06, 2.9679971389062374e-06, 2.9679971389062374e-06, 4.484869865395447e-06, 4.484869865395447e-06, 6.818520213840385e-06, 6.818520213840385e-06, 1.0408751519140288e-05, 1.0408751519140288e-05, 1.5932184296524757e-05, 1.5932184296524757e-05, 2.442977318480855e-05, 2.442977318480855e-05, 3.7502986859091307e-05, 3.7502986859091307e-05, 5.7615623281064784e-05, 5.7615623281064784e-05, 8.855814085333169e-05, 8.855814085333169e-05, 0.00013616201404143458, 0.00013616201404143458, 0.00020939874202313135, 0.00020939874202313135, 0.00032207063122574175, 0.00032207063122574175, 0.0004954119992297578, 0.0004954119992297578]
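The 28 group learning rates above come in 14 duplicated pairs (plausibly one weight-decay and one no-weight-decay group per depth) and form a geometric ladder consistent with the config's `LAYER_DECAY: 0.65`: each deeper group's LR is the shallower one's divided by roughly 0.65, topping out near the current scheduler LR of ~0.000495. A hypothetical sketch (not the repo's actual code) of how such layer-wise scales are typically generated:

```python
# Hypothetical illustration of layer-wise LR decay, assuming the common scheme
# used by SimMIM/BEiT-style finetuning: 12 transformer blocks plus an embedding
# group below them and a head group above them -> 14 depth levels, each scaled
# by decay ** (distance from the top).

def layer_scales(num_layers=12, decay=0.65):
    # index 0 = patch/pos embeddings, 1..num_layers = blocks,
    # num_layers + 1 = classification head (full learning rate)
    return [decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layer_scales()
# 14 scales; adjacent depths differ by a factor of `decay`,
# and the top (head) group runs at the unscaled base LR.
```

The duplication in the log would then come from splitting each depth level into two optimizer param groups that share the same LR but differ in weight decay.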
[2025-04-03 01:47:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][0/311]	eta 0:11:39 lr 0.000495	time 2.2491 (2.2491)	loss 0.6021 (0.6021)	grad_norm 2.8447 (2.8447)	mem 20675MB
[2025-04-03 01:47:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][2/311]	eta 0:06:52 lr 0.000495	time 0.8758 (1.3343)	loss 0.5460 (0.5355)	grad_norm 2.8535 (2.9624)	mem 20675MB
[2025-04-03 01:47:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][4/311]	eta 0:05:53 lr 0.000494	time 0.8758 (1.1512)	loss 0.4511 (0.5336)	grad_norm 4.1641 (3.1665)	mem 20675MB
[2025-04-03 01:47:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][6/311]	eta 0:05:27 lr 0.000494	time 0.8764 (1.0729)	loss 0.6168 (0.5549)	grad_norm 2.1807 (2.9087)	mem 20675MB
[2025-04-03 01:47:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][8/311]	eta 0:05:11 lr 0.000494	time 0.8761 (1.0295)	loss 0.6007 (0.5498)	grad_norm 2.2201 (2.9303)	mem 20675MB
[2025-04-03 01:47:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][10/311]	eta 0:05:01 lr 0.000493	time 0.8760 (1.0017)	loss 0.5768 (0.5599)	grad_norm 2.1197 (2.8098)	mem 20675MB
[2025-04-03 01:47:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][12/311]	eta 0:04:53 lr 0.000493	time 0.8759 (0.9825)	loss 0.5671 (0.5586)	grad_norm 2.6357 (2.7509)	mem 20675MB
[2025-04-03 01:47:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][14/311]	eta 0:04:47 lr 0.000492	time 0.8764 (0.9684)	loss 0.5572 (0.5644)	grad_norm 1.9450 (2.6742)	mem 20675MB
[2025-04-03 01:47:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][16/311]	eta 0:04:42 lr 0.000492	time 0.8759 (0.9577)	loss 0.6045 (0.5691)	grad_norm 2.8039 (2.6342)	mem 20675MB
[2025-04-03 01:47:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][18/311]	eta 0:04:38 lr 0.000492	time 0.8760 (0.9492)	loss 0.4343 (0.5594)	grad_norm 3.9087 (2.7375)	mem 20675MB
[2025-04-03 01:47:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][20/311]	eta 0:04:34 lr 0.000491	time 0.8758 (0.9423)	loss 0.6371 (0.5691)	grad_norm 2.4966 (2.7122)	mem 20675MB
[2025-04-03 01:47:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][22/311]	eta 0:04:30 lr 0.000491	time 0.8767 (0.9366)	loss 0.6034 (0.5706)	grad_norm 2.2402 (2.7066)	mem 20675MB
[2025-04-03 01:48:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][24/311]	eta 0:04:27 lr 0.000490	time 0.8762 (0.9320)	loss 0.4176 (0.5649)	grad_norm 3.9543 (2.7615)	mem 20675MB
[2025-04-03 01:48:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][26/311]	eta 0:04:24 lr 0.000490	time 0.8758 (0.9279)	loss 0.4860 (0.5606)	grad_norm 3.9119 (2.8216)	mem 20675MB
[2025-04-03 01:48:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][28/311]	eta 0:04:21 lr 0.000489	time 0.8762 (0.9244)	loss 0.5519 (0.5626)	grad_norm 2.3736 (2.8106)	mem 20675MB
[2025-04-03 01:48:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][30/311]	eta 0:04:18 lr 0.000489	time 0.8762 (0.9213)	loss 0.5791 (0.5613)	grad_norm 1.7839 (2.7896)	mem 20675MB
[2025-04-03 01:48:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][32/311]	eta 0:04:16 lr 0.000489	time 0.8758 (0.9186)	loss 0.4083 (0.5576)	grad_norm 3.4349 (2.8032)	mem 20675MB
[2025-04-03 01:48:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][34/311]	eta 0:04:13 lr 0.000488	time 0.8763 (0.9162)	loss 0.6646 (0.5587)	grad_norm 2.5339 (2.8058)	mem 20675MB
[2025-04-03 01:48:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][36/311]	eta 0:04:11 lr 0.000488	time 0.8761 (0.9141)	loss 0.5343 (0.5577)	grad_norm 3.9674 (2.8239)	mem 20675MB
[2025-04-03 01:48:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][38/311]	eta 0:04:09 lr 0.000487	time 0.8759 (0.9122)	loss 0.5048 (0.5565)	grad_norm 4.0323 (2.8442)	mem 20675MB
[2025-04-03 01:48:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][40/311]	eta 0:04:06 lr 0.000487	time 0.8763 (0.9105)	loss 0.5965 (0.5589)	grad_norm 3.5351 (2.8608)	mem 20675MB
[2025-04-03 01:48:16 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][42/311]	eta 0:04:04 lr 0.000487	time 0.8760 (0.9090)	loss 0.6161 (0.5578)	grad_norm 4.1489 (2.9149)	mem 20675MB
[2025-04-03 01:48:18 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][44/311]	eta 0:04:02 lr 0.000486	time 0.8761 (0.9075)	loss 0.5109 (0.5565)	grad_norm 3.6011 (2.9258)	mem 20675MB
[2025-04-03 01:48:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][46/311]	eta 0:04:00 lr 0.000486	time 0.8762 (0.9062)	loss 0.4732 (0.5563)	grad_norm 4.1417 (2.9418)	mem 20675MB
[2025-04-03 01:48:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][48/311]	eta 0:03:58 lr 0.000485	time 0.8764 (0.9051)	loss 0.6668 (0.5566)	grad_norm 3.1591 (2.9623)	mem 20675MB
[2025-04-03 01:48:23 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][50/311]	eta 0:03:55 lr 0.000485	time 0.8759 (0.9039)	loss 0.6316 (0.5593)	grad_norm 2.4265 (2.9380)	mem 20675MB
[2025-04-03 01:48:25 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][52/311]	eta 0:03:53 lr 0.000485	time 0.8763 (0.9029)	loss 0.5932 (0.5608)	grad_norm 2.6142 (2.9126)	mem 20675MB
[2025-04-03 01:48:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][54/311]	eta 0:03:51 lr 0.000484	time 0.8762 (0.9020)	loss 0.4894 (0.5600)	grad_norm 3.2346 (2.9027)	mem 20675MB
[2025-04-03 01:48:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][56/311]	eta 0:03:49 lr 0.000484	time 0.8788 (0.9012)	loss 0.5876 (0.5583)	grad_norm 2.3934 (2.8996)	mem 20675MB
[2025-04-03 01:48:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][58/311]	eta 0:03:47 lr 0.000483	time 0.8763 (0.9003)	loss 0.6499 (0.5606)	grad_norm 2.5226 (2.8815)	mem 20675MB
[2025-04-03 01:48:32 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][60/311]	eta 0:03:45 lr 0.000483	time 0.8760 (0.8996)	loss 0.6104 (0.5623)	grad_norm 1.7328 (2.8688)	mem 20675MB
[2025-04-03 01:48:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][62/311]	eta 0:03:43 lr 0.000482	time 0.8758 (0.8989)	loss 0.5960 (0.5642)	grad_norm 2.2316 (2.8444)	mem 20675MB
[2025-04-03 01:48:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][64/311]	eta 0:03:41 lr 0.000482	time 0.8761 (0.8982)	loss 0.4605 (0.5630)	grad_norm 3.0929 (2.8413)	mem 20675MB
[2025-04-03 01:48:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][66/311]	eta 0:03:39 lr 0.000482	time 0.8759 (0.8976)	loss 0.4513 (0.5617)	grad_norm 3.9175 (2.8452)	mem 20675MB
[2025-04-03 01:48:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][68/311]	eta 0:03:37 lr 0.000481	time 0.8762 (0.8970)	loss 0.5347 (0.5611)	grad_norm 2.9905 (2.8337)	mem 20675MB
[2025-04-03 01:48:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][70/311]	eta 0:03:36 lr 0.000481	time 0.8761 (0.8964)	loss 0.4697 (0.5605)	grad_norm 3.0468 (2.8304)	mem 20675MB
[2025-04-03 01:48:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][72/311]	eta 0:03:34 lr 0.000480	time 0.8761 (0.8959)	loss 0.6435 (0.5615)	grad_norm 2.6793 (2.8252)	mem 20675MB
[2025-04-03 01:48:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][74/311]	eta 0:03:32 lr 0.000480	time 0.8762 (0.8954)	loss 0.6344 (0.5619)	grad_norm 2.1702 (2.8271)	mem 20675MB
[2025-04-03 01:48:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][76/311]	eta 0:03:30 lr 0.000480	time 0.8758 (0.8949)	loss 0.5141 (0.5612)	grad_norm 2.9826 (2.8324)	mem 20675MB
[2025-04-03 01:48:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][78/311]	eta 0:03:28 lr 0.000479	time 0.8759 (0.8944)	loss 0.5721 (0.5611)	grad_norm 2.6567 (2.8318)	mem 20675MB
[2025-04-03 01:48:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][80/311]	eta 0:03:26 lr 0.000479	time 0.8765 (0.8940)	loss 0.5011 (0.5590)	grad_norm 3.1539 (2.8512)	mem 20675MB
[2025-04-03 01:48:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][82/311]	eta 0:03:24 lr 0.000478	time 0.8759 (0.8936)	loss 0.4294 (0.5576)	grad_norm 3.0523 (2.8446)	mem 20675MB
[2025-04-03 01:48:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][84/311]	eta 0:03:22 lr 0.000478	time 0.8761 (0.8932)	loss 0.6074 (0.5592)	grad_norm 2.4841 (2.8323)	mem 20675MB
[2025-04-03 01:48:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][86/311]	eta 0:03:20 lr 0.000478	time 0.8761 (0.8928)	loss 0.4875 (0.5588)	grad_norm 2.9549 (2.8305)	mem 20675MB
[2025-04-03 01:48:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][88/311]	eta 0:03:19 lr 0.000477	time 0.8764 (0.8925)	loss 0.5987 (0.5593)	grad_norm 2.6111 (2.8234)	mem 20675MB
[2025-04-03 01:48:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][90/311]	eta 0:03:17 lr 0.000477	time 0.8761 (0.8921)	loss 0.5685 (0.5600)	grad_norm 2.6226 (2.8210)	mem 20675MB
[2025-04-03 01:49:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][92/311]	eta 0:03:15 lr 0.000476	time 0.8758 (0.8918)	loss 0.4700 (0.5600)	grad_norm 3.5076 (2.8406)	mem 20675MB
[2025-04-03 01:49:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][94/311]	eta 0:03:13 lr 0.000476	time 0.8760 (0.8915)	loss 0.4305 (0.5584)	grad_norm 3.9498 (2.8619)	mem 20675MB
[2025-04-03 01:49:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][96/311]	eta 0:03:11 lr 0.000476	time 0.8761 (0.8912)	loss 0.6453 (0.5591)	grad_norm 2.5906 (2.8620)	mem 20675MB
[2025-04-03 01:49:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][98/311]	eta 0:03:09 lr 0.000475	time 0.8758 (0.8909)	loss 0.5745 (0.5593)	grad_norm 2.5206 (2.8502)	mem 20675MB
[2025-04-03 01:49:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][100/311]	eta 0:03:07 lr 0.000475	time 0.8758 (0.8906)	loss 0.5626 (0.5587)	grad_norm 2.1234 (2.8572)	mem 20675MB
[2025-04-03 01:49:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][102/311]	eta 0:03:06 lr 0.000474	time 0.8760 (0.8904)	loss 0.5552 (0.5592)	grad_norm 3.0275 (2.8664)	mem 20675MB
[2025-04-03 01:49:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][104/311]	eta 0:03:04 lr 0.000474	time 0.8765 (0.8901)	loss 0.6121 (0.5594)	grad_norm 2.3797 (2.8710)	mem 20675MB
[2025-04-03 01:49:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][106/311]	eta 0:03:02 lr 0.000473	time 0.8761 (0.8899)	loss 0.6724 (0.5591)	grad_norm 3.2968 (2.8812)	mem 20675MB
[2025-04-03 01:49:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][108/311]	eta 0:03:00 lr 0.000473	time 0.8761 (0.8896)	loss 0.6568 (0.5603)	grad_norm 4.3294 (2.8874)	mem 20675MB
[2025-04-03 01:49:16 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][110/311]	eta 0:02:58 lr 0.000473	time 0.8759 (0.8894)	loss 0.6654 (0.5616)	grad_norm 2.6471 (2.8850)	mem 20675MB
[2025-04-03 01:49:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][112/311]	eta 0:02:56 lr 0.000472	time 0.8757 (0.8892)	loss 0.5523 (0.5617)	grad_norm 2.3064 (2.8743)	mem 20675MB
[2025-04-03 01:49:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][114/311]	eta 0:02:55 lr 0.000472	time 0.8766 (0.8890)	loss 0.5940 (0.5622)	grad_norm 2.4306 (2.8648)	mem 20675MB
[2025-04-03 01:49:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][116/311]	eta 0:02:53 lr 0.000471	time 0.8760 (0.8887)	loss 0.5970 (0.5613)	grad_norm 2.8064 (2.8720)	mem 20675MB
[2025-04-03 01:49:23 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][118/311]	eta 0:02:51 lr 0.000471	time 0.8763 (0.8886)	loss 0.5665 (0.5617)	grad_norm 2.3710 (2.8615)	mem 20675MB
[2025-04-03 01:49:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][120/311]	eta 0:02:49 lr 0.000471	time 0.8758 (0.8884)	loss 0.5791 (0.5615)	grad_norm 2.1715 (2.8565)	mem 20675MB
[2025-04-03 01:49:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][122/311]	eta 0:02:47 lr 0.000470	time 0.8772 (0.8882)	loss 0.5930 (0.5613)	grad_norm 1.9880 (2.8453)	mem 20675MB
[2025-04-03 01:49:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][124/311]	eta 0:02:46 lr 0.000470	time 0.8761 (0.8880)	loss 0.6270 (0.5616)	grad_norm 2.7369 (2.8415)	mem 20675MB
[2025-04-03 01:49:30 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][126/311]	eta 0:02:44 lr 0.000469	time 0.8766 (0.8878)	loss 0.4912 (0.5611)	grad_norm 3.1309 (2.8392)	mem 20675MB
[2025-04-03 01:49:32 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][128/311]	eta 0:02:42 lr 0.000469	time 0.8760 (0.8877)	loss 0.4257 (0.5603)	grad_norm 5.3239 (2.8574)	mem 20675MB
[2025-04-03 01:49:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][130/311]	eta 0:02:40 lr 0.000469	time 0.8758 (0.8875)	loss 0.6414 (0.5615)	grad_norm 2.5504 (2.8509)	mem 20675MB
[2025-04-03 01:49:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][132/311]	eta 0:02:38 lr 0.000468	time 0.8759 (0.8873)	loss 0.5403 (0.5619)	grad_norm 2.4998 (2.8485)	mem 20675MB
[2025-04-03 01:49:37 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][134/311]	eta 0:02:37 lr 0.000468	time 0.8763 (0.8872)	loss 0.4407 (0.5604)	grad_norm 3.3996 (2.8661)	mem 20675MB
[2025-04-03 01:49:39 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][136/311]	eta 0:02:35 lr 0.000467	time 0.8757 (0.8870)	loss 0.5551 (0.5594)	grad_norm 2.7468 (2.8842)	mem 20675MB
[2025-04-03 01:49:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][138/311]	eta 0:02:33 lr 0.000467	time 0.8760 (0.8869)	loss 0.4132 (0.5580)	grad_norm 3.7759 (2.8904)	mem 20675MB
[2025-04-03 01:49:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][140/311]	eta 0:02:31 lr 0.000467	time 0.8760 (0.8868)	loss 0.5925 (0.5587)	grad_norm 2.8016 (2.8891)	mem 20675MB
[2025-04-03 01:49:44 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][142/311]	eta 0:02:29 lr 0.000466	time 0.8759 (0.8866)	loss 0.5714 (0.5582)	grad_norm 2.5249 (2.8929)	mem 20675MB
[2025-04-03 01:49:46 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][144/311]	eta 0:02:28 lr 0.000466	time 0.8758 (0.8865)	loss 0.5580 (0.5589)	grad_norm 2.5407 (2.8936)	mem 20675MB
[2025-04-03 01:49:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][146/311]	eta 0:02:26 lr 0.000465	time 0.8761 (0.8863)	loss 0.5549 (0.5589)	grad_norm 3.8893 (2.9040)	mem 20675MB
[2025-04-03 01:49:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][148/311]	eta 0:02:24 lr 0.000465	time 0.8759 (0.8862)	loss 0.5218 (0.5590)	grad_norm 2.2654 (2.8946)	mem 20675MB
[2025-04-03 01:49:51 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][150/311]	eta 0:02:22 lr 0.000465	time 0.8757 (0.8861)	loss 0.6069 (0.5592)	grad_norm 2.0522 (2.8839)	mem 20675MB
[2025-04-03 01:49:53 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][152/311]	eta 0:02:20 lr 0.000464	time 0.8758 (0.8860)	loss 0.5330 (0.5585)	grad_norm 2.4521 (2.8906)	mem 20675MB
[2025-04-03 01:49:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][154/311]	eta 0:02:19 lr 0.000464	time 0.8761 (0.8859)	loss 0.6142 (0.5587)	grad_norm 2.2447 (2.8831)	mem 20675MB
[2025-04-03 01:49:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][156/311]	eta 0:02:17 lr 0.000463	time 0.8758 (0.8857)	loss 0.6226 (0.5589)	grad_norm 2.6962 (2.8861)	mem 20675MB
[2025-04-03 01:49:58 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][158/311]	eta 0:02:15 lr 0.000463	time 0.8758 (0.8856)	loss 0.6062 (0.5594)	grad_norm 1.6918 (2.8727)	mem 20675MB
[2025-04-03 01:50:00 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][160/311]	eta 0:02:13 lr 0.000462	time 0.8763 (0.8855)	loss 0.6535 (0.5595)	grad_norm 2.7770 (2.8814)	mem 20675MB
[2025-04-03 01:50:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][162/311]	eta 0:02:11 lr 0.000462	time 0.8759 (0.8854)	loss 0.5946 (0.5599)	grad_norm 2.4423 (2.8776)	mem 20675MB
[2025-04-03 01:50:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][164/311]	eta 0:02:10 lr 0.000462	time 0.8761 (0.8853)	loss 0.6249 (0.5600)	grad_norm 2.1930 (2.8704)	mem 20675MB
[2025-04-03 01:50:05 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][166/311]	eta 0:02:08 lr 0.000461	time 0.8760 (0.8852)	loss 0.5311 (0.5599)	grad_norm 3.0198 (2.8684)	mem 20675MB
[2025-04-03 01:50:07 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][168/311]	eta 0:02:06 lr 0.000461	time 0.8757 (0.8851)	loss 0.6142 (0.5599)	grad_norm 2.7722 (2.8663)	mem 20675MB
[2025-04-03 01:50:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][170/311]	eta 0:02:04 lr 0.000460	time 0.8761 (0.8850)	loss 0.6901 (0.5603)	grad_norm 2.5690 (2.8649)	mem 20675MB
[2025-04-03 01:50:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][172/311]	eta 0:02:03 lr 0.000460	time 0.8760 (0.8849)	loss 0.5952 (0.5602)	grad_norm 2.4384 (2.8709)	mem 20675MB
[2025-04-03 01:50:12 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][174/311]	eta 0:02:01 lr 0.000460	time 0.8769 (0.8848)	loss 0.5537 (0.5603)	grad_norm 2.5042 (2.8663)	mem 20675MB
[2025-04-03 01:50:14 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][176/311]	eta 0:01:59 lr 0.000459	time 0.8788 (0.8848)	loss 0.6279 (0.5606)	grad_norm 1.9959 (2.8598)	mem 20675MB
[2025-04-03 01:50:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][178/311]	eta 0:01:57 lr 0.000459	time 0.8760 (0.8847)	loss 0.6434 (0.5612)	grad_norm 2.5362 (2.8568)	mem 20675MB
[2025-04-03 01:50:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][180/311]	eta 0:01:55 lr 0.000458	time 0.8758 (0.8846)	loss 0.4940 (0.5610)	grad_norm 2.0276 (2.8479)	mem 20675MB
[2025-04-03 01:50:19 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][182/311]	eta 0:01:54 lr 0.000458	time 0.8764 (0.8845)	loss 0.5929 (0.5614)	grad_norm 1.6068 (2.8369)	mem 20675MB
[2025-04-03 01:50:21 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][184/311]	eta 0:01:52 lr 0.000458	time 0.8760 (0.8844)	loss 0.4725 (0.5607)	grad_norm 3.4012 (2.8401)	mem 20675MB
[2025-04-03 01:50:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][186/311]	eta 0:01:50 lr 0.000457	time 0.8758 (0.8843)	loss 0.5794 (0.5608)	grad_norm 2.5704 (2.8411)	mem 20675MB
[2025-04-03 01:50:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][188/311]	eta 0:01:48 lr 0.000457	time 0.8759 (0.8842)	loss 0.5603 (0.5610)	grad_norm 1.8842 (2.8369)	mem 20675MB
[2025-04-03 01:50:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][190/311]	eta 0:01:46 lr 0.000456	time 0.8763 (0.8842)	loss 0.4975 (0.5605)	grad_norm 2.6110 (2.8352)	mem 20675MB
[2025-04-03 01:50:28 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][192/311]	eta 0:01:45 lr 0.000456	time 0.8757 (0.8841)	loss 0.5275 (0.5602)	grad_norm 3.0758 (2.8375)	mem 20675MB
[2025-04-03 01:50:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][194/311]	eta 0:01:43 lr 0.000456	time 0.8758 (0.8840)	loss 0.5061 (0.5602)	grad_norm 4.9890 (2.8583)	mem 20675MB
[2025-04-03 01:50:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][196/311]	eta 0:01:41 lr 0.000455	time 0.8757 (0.8839)	loss 0.4067 (0.5593)	grad_norm 4.2671 (2.8645)	mem 20675MB
[2025-04-03 01:50:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][198/311]	eta 0:01:39 lr 0.000455	time 0.8760 (0.8839)	loss 0.6380 (0.5589)	grad_norm 2.5598 (2.8677)	mem 20675MB
[2025-04-03 01:50:35 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][200/311]	eta 0:01:38 lr 0.000454	time 0.8764 (0.8838)	loss 0.5501 (0.5589)	grad_norm 2.3981 (2.8657)	mem 20675MB
[2025-04-03 01:50:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][202/311]	eta 0:01:36 lr 0.000454	time 0.8759 (0.8837)	loss 0.5476 (0.5590)	grad_norm 3.1259 (2.8726)	mem 20675MB
[2025-04-03 01:50:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][204/311]	eta 0:01:34 lr 0.000454	time 0.8760 (0.8837)	loss 0.4410 (0.5584)	grad_norm 4.5000 (2.8794)	mem 20675MB
[2025-04-03 01:50:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][206/311]	eta 0:01:32 lr 0.000453	time 0.8759 (0.8836)	loss 0.6104 (0.5588)	grad_norm 2.5321 (2.8743)	mem 20675MB
[2025-04-03 01:50:42 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][208/311]	eta 0:01:31 lr 0.000453	time 0.8759 (0.8835)	loss 0.6051 (0.5591)	grad_norm 2.4471 (2.8681)	mem 20675MB
[2025-04-03 01:50:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][210/311]	eta 0:01:29 lr 0.000452	time 0.8760 (0.8835)	loss 0.4730 (0.5583)	grad_norm 3.2511 (2.8706)	mem 20675MB
[2025-04-03 01:50:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][212/311]	eta 0:01:27 lr 0.000452	time 0.8756 (0.8834)	loss 0.4887 (0.5582)	grad_norm 2.6567 (2.8695)	mem 20675MB
[2025-04-03 01:50:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][214/311]	eta 0:01:25 lr 0.000452	time 0.8760 (0.8834)	loss 0.6126 (0.5582)	grad_norm 2.7355 (2.8677)	mem 20675MB
[2025-04-03 01:50:49 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][216/311]	eta 0:01:23 lr 0.000451	time 0.8761 (0.8833)	loss 0.5294 (0.5584)	grad_norm 2.9114 (2.8753)	mem 20675MB
[2025-04-03 01:50:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][218/311]	eta 0:01:22 lr 0.000451	time 0.8759 (0.8832)	loss 0.6627 (0.5592)	grad_norm 3.0149 (2.8741)	mem 20675MB
[2025-04-03 01:50:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][220/311]	eta 0:01:20 lr 0.000450	time 0.8760 (0.8832)	loss 0.5786 (0.5586)	grad_norm 2.0351 (2.8753)	mem 20675MB
[2025-04-03 01:50:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][222/311]	eta 0:01:18 lr 0.000450	time 0.8757 (0.8831)	loss 0.4994 (0.5585)	grad_norm 2.5774 (2.8716)	mem 20675MB
[2025-04-03 01:50:56 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][224/311]	eta 0:01:16 lr 0.000450	time 0.8776 (0.8831)	loss 0.5818 (0.5588)	grad_norm 1.8982 (2.8650)	mem 20675MB
[2025-04-03 01:50:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][226/311]	eta 0:01:15 lr 0.000449	time 0.8761 (0.8830)	loss 0.4573 (0.5579)	grad_norm 2.9963 (2.8653)	mem 20675MB
[2025-04-03 01:50:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][228/311]	eta 0:01:13 lr 0.000449	time 0.8761 (0.8830)	loss 0.5597 (0.5578)	grad_norm 3.1213 (2.8650)	mem 20675MB
[2025-04-03 01:51:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][230/311]	eta 0:01:11 lr 0.000448	time 0.8758 (0.8829)	loss 0.5887 (0.5580)	grad_norm 2.8578 (2.8614)	mem 20675MB
[2025-04-03 01:51:03 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][232/311]	eta 0:01:09 lr 0.000448	time 0.8757 (0.8829)	loss 0.5668 (0.5583)	grad_norm 2.3732 (2.8559)	mem 20675MB
[2025-04-03 01:51:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][234/311]	eta 0:01:07 lr 0.000447	time 0.8759 (0.8828)	loss 0.4979 (0.5582)	grad_norm 3.4001 (2.8561)	mem 20675MB
[2025-04-03 01:51:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][236/311]	eta 0:01:06 lr 0.000447	time 0.8764 (0.8828)	loss 0.5013 (0.5585)	grad_norm 2.7578 (2.8569)	mem 20675MB
[2025-04-03 01:51:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][238/311]	eta 0:01:04 lr 0.000447	time 0.8757 (0.8827)	loss 0.4134 (0.5583)	grad_norm 3.5046 (2.8614)	mem 20675MB
[2025-04-03 01:51:10 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][240/311]	eta 0:01:02 lr 0.000446	time 0.8758 (0.8827)	loss 0.4565 (0.5579)	grad_norm 3.0758 (2.8631)	mem 20675MB
[2025-04-03 01:51:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][242/311]	eta 0:01:00 lr 0.000446	time 0.8759 (0.8826)	loss 0.5890 (0.5580)	grad_norm 3.5566 (2.8649)	mem 20675MB
[2025-04-03 01:51:13 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][244/311]	eta 0:00:59 lr 0.000445	time 0.8762 (0.8826)	loss 0.4063 (0.5575)	grad_norm 4.1869 (2.8699)	mem 20675MB
[2025-04-03 01:51:15 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][246/311]	eta 0:00:57 lr 0.000445	time 0.8760 (0.8825)	loss 0.4398 (0.5572)	grad_norm 3.4737 (2.8720)	mem 20675MB
[2025-04-03 01:51:17 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][248/311]	eta 0:00:55 lr 0.000445	time 0.8759 (0.8825)	loss 0.6332 (0.5572)	grad_norm 3.6079 (2.8795)	mem 20675MB
[2025-04-03 01:51:18 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][250/311]	eta 0:00:53 lr 0.000444	time 0.8758 (0.8824)	loss 0.5284 (0.5572)	grad_norm 2.1998 (2.8752)	mem 20675MB
[2025-04-03 01:51:20 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][252/311]	eta 0:00:52 lr 0.000444	time 0.8759 (0.8824)	loss 0.5547 (0.5570)	grad_norm 3.9205 (2.8802)	mem 20675MB
[2025-04-03 01:51:22 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][254/311]	eta 0:00:50 lr 0.000443	time 0.8756 (0.8823)	loss 0.5725 (0.5571)	grad_norm 2.2350 (2.8737)	mem 20675MB
[2025-04-03 01:51:24 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][256/311]	eta 0:00:48 lr 0.000443	time 0.8758 (0.8823)	loss 0.5409 (0.5567)	grad_norm 2.5076 (2.8779)	mem 20675MB
[2025-04-03 01:51:26 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][258/311]	eta 0:00:46 lr 0.000443	time 0.8760 (0.8823)	loss 0.4563 (0.5559)	grad_norm 5.0882 (2.8930)	mem 20675MB
[2025-04-03 01:51:27 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][260/311]	eta 0:00:44 lr 0.000442	time 0.8758 (0.8822)	loss 0.5947 (0.5563)	grad_norm 2.4945 (2.8887)	mem 20675MB
[2025-04-03 01:51:29 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][262/311]	eta 0:00:43 lr 0.000442	time 0.8757 (0.8822)	loss 0.6094 (0.5560)	grad_norm 2.5085 (2.8931)	mem 20675MB
[2025-04-03 01:51:31 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][264/311]	eta 0:00:41 lr 0.000441	time 0.8760 (0.8821)	loss 0.5676 (0.5562)	grad_norm 2.6828 (2.8945)	mem 20675MB
[2025-04-03 01:51:33 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][266/311]	eta 0:00:39 lr 0.000441	time 0.8764 (0.8821)	loss 0.4474 (0.5561)	grad_norm 3.2925 (2.8952)	mem 20675MB
[2025-04-03 01:51:34 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][268/311]	eta 0:00:37 lr 0.000441	time 0.8759 (0.8821)	loss 0.5231 (0.5561)	grad_norm 2.2311 (2.8927)	mem 20675MB
[2025-04-03 01:51:36 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][270/311]	eta 0:00:36 lr 0.000440	time 0.8780 (0.8820)	loss 0.5551 (0.5561)	grad_norm 3.4438 (2.8914)	mem 20675MB
[2025-04-03 01:51:38 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][272/311]	eta 0:00:34 lr 0.000440	time 0.8757 (0.8820)	loss 0.4104 (0.5559)	grad_norm 4.6961 (2.8993)	mem 20675MB
[2025-04-03 01:51:40 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][274/311]	eta 0:00:32 lr 0.000439	time 0.8761 (0.8820)	loss 0.6369 (0.5559)	grad_norm 3.7805 (2.9063)	mem 20675MB
[2025-04-03 01:51:41 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][276/311]	eta 0:00:30 lr 0.000439	time 0.8772 (0.8819)	loss 0.6445 (0.5564)	grad_norm 3.4327 (2.9061)	mem 20675MB
[2025-04-03 01:51:43 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][278/311]	eta 0:00:29 lr 0.000439	time 0.8759 (0.8819)	loss 0.5887 (0.5562)	grad_norm 2.8328 (2.9063)	mem 20675MB
[2025-04-03 01:51:45 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][280/311]	eta 0:00:27 lr 0.000438	time 0.8758 (0.8818)	loss 0.4935 (0.5562)	grad_norm 2.2586 (2.9036)	mem 20675MB
[2025-04-03 01:51:47 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][282/311]	eta 0:00:25 lr 0.000438	time 0.8762 (0.8818)	loss 0.6151 (0.5565)	grad_norm 2.0819 (2.8992)	mem 20675MB
[2025-04-03 01:51:48 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][284/311]	eta 0:00:23 lr 0.000437	time 0.8760 (0.8818)	loss 0.4094 (0.5557)	grad_norm 2.6107 (2.8984)	mem 20675MB
[2025-04-03 01:51:50 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][286/311]	eta 0:00:22 lr 0.000437	time 0.8771 (0.8818)	loss 0.5761 (0.5557)	grad_norm 2.0825 (2.8995)	mem 20675MB
[2025-04-03 01:51:52 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][288/311]	eta 0:00:20 lr 0.000437	time 0.8764 (0.8817)	loss 0.5424 (0.5553)	grad_norm 2.1483 (2.8967)	mem 20675MB
[2025-04-03 01:51:54 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][290/311]	eta 0:00:18 lr 0.000436	time 0.8758 (0.8817)	loss 0.6235 (0.5557)	grad_norm 2.1071 (2.8948)	mem 20675MB
[2025-04-03 01:51:55 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][292/311]	eta 0:00:16 lr 0.000436	time 0.8760 (0.8817)	loss 0.4053 (0.5555)	grad_norm 3.2345 (2.8945)	mem 20675MB
[2025-04-03 01:51:57 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][294/311]	eta 0:00:14 lr 0.000435	time 0.8760 (0.8816)	loss 0.4708 (0.5553)	grad_norm 2.9915 (2.8935)	mem 20675MB
[2025-04-03 01:51:59 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][296/311]	eta 0:00:13 lr 0.000435	time 0.8763 (0.8816)	loss 0.5347 (0.5552)	grad_norm 3.1391 (2.8941)	mem 20675MB
[2025-04-03 01:52:01 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][298/311]	eta 0:00:11 lr 0.000435	time 0.8760 (0.8816)	loss 0.6280 (0.5554)	grad_norm 2.8787 (2.8922)	mem 20675MB
[2025-04-03 01:52:02 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][300/311]	eta 0:00:09 lr 0.000434	time 0.8759 (0.8815)	loss 0.6203 (0.5558)	grad_norm 2.8069 (2.8906)	mem 20675MB
[2025-04-03 01:52:04 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][302/311]	eta 0:00:07 lr 0.000434	time 0.8759 (0.8815)	loss 0.5883 (0.5560)	grad_norm 3.2621 (2.8903)	mem 20675MB
[2025-04-03 01:52:06 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][304/311]	eta 0:00:06 lr 0.000433	time 0.8759 (0.8815)	loss 0.4810 (0.5560)	grad_norm 3.2379 (2.8902)	mem 20675MB
[2025-04-03 01:52:08 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][306/311]	eta 0:00:04 lr 0.000433	time 0.8760 (0.8815)	loss 0.6139 (0.5559)	grad_norm 2.8285 (2.8898)	mem 20675MB
[2025-04-03 01:52:09 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][308/311]	eta 0:00:02 lr 0.000433	time 0.8757 (0.8814)	loss 0.5092 (0.5561)	grad_norm 3.2485 (2.8921)	mem 20675MB
[2025-04-03 01:52:11 simmim_finetune] (main_finetune.py 252): INFO Train: [17/30][310/311]	eta 0:00:00 lr 0.000432	time 0.8763 (0.8814)	loss 0.6417 (0.5565)	grad_norm 1.9991 (2.8866)	mem 20675MB
[2025-04-03 01:52:11 simmim_finetune] (main_finetune.py 260): INFO EPOCH 17 training takes 0:04:34
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.445 (1.445)	Loss 0.5271 (0.5271)	Acc@1 78.125 (78.125)	Mem 20675MB
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.873
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 78.9%
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 01:52:13 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.7610186510542067e-06, 1.7610186510542067e-06, 2.621174105820721e-06, 2.621174105820721e-06, 3.9444901900768965e-06, 3.9444901900768965e-06, 5.980361088932552e-06, 5.980361088932552e-06, 9.112470164095096e-06, 9.112470164095096e-06, 1.3931099510499014e-05, 1.3931099510499014e-05, 2.13443754280435e-05, 2.13443754280435e-05, 3.274941530118885e-05, 3.274941530118885e-05, 5.0295630490643254e-05, 5.0295630490643254e-05, 7.728980770518851e-05, 7.728980770518851e-05, 0.00011881931111218116, 0.00011881931111218116, 0.00018271085481524682, 0.00018271085481524682, 0.00028100553743534777, 0.00028100553743534777, 0.00043222812608165694, 0.00043222812608165694]
[2025-04-03 01:52:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][0/311]	eta 0:10:47 lr 0.000432	time 2.0827 (2.0827)	loss 0.6181 (0.6181)	grad_norm 2.5504 (2.5504)	mem 20675MB
[2025-04-03 01:52:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][2/311]	eta 0:06:35 lr 0.000432	time 0.8756 (1.2786)	loss 0.4743 (0.5593)	grad_norm 3.1553 (2.7532)	mem 20675MB
[2025-04-03 01:52:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][4/311]	eta 0:05:43 lr 0.000431	time 0.8754 (1.1177)	loss 0.4252 (0.5464)	grad_norm 3.7557 (2.7054)	mem 20675MB
[2025-04-03 01:52:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][6/311]	eta 0:05:19 lr 0.000431	time 0.8765 (1.0489)	loss 0.5114 (0.5306)	grad_norm 2.5363 (2.8455)	mem 20675MB
[2025-04-03 01:52:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][8/311]	eta 0:05:06 lr 0.000430	time 0.8755 (1.0106)	loss 0.6080 (0.5492)	grad_norm 2.2609 (2.7306)	mem 20675MB
[2025-04-03 01:52:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][10/311]	eta 0:04:56 lr 0.000430	time 0.8754 (0.9862)	loss 0.5120 (0.5533)	grad_norm 1.7190 (2.6607)	mem 20675MB
[2025-04-03 01:52:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][12/311]	eta 0:04:49 lr 0.000430	time 0.8754 (0.9693)	loss 0.5129 (0.5564)	grad_norm 3.6929 (2.7310)	mem 20675MB
[2025-04-03 01:52:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][14/311]	eta 0:04:44 lr 0.000429	time 0.8754 (0.9569)	loss 0.6000 (0.5564)	grad_norm 2.9979 (2.7412)	mem 20675MB
[2025-04-03 01:52:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][16/311]	eta 0:04:39 lr 0.000429	time 0.8758 (0.9475)	loss 0.5753 (0.5520)	grad_norm 2.1140 (2.7412)	mem 20675MB
[2025-04-03 01:52:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][18/311]	eta 0:04:35 lr 0.000428	time 0.8754 (0.9400)	loss 0.5254 (0.5529)	grad_norm 3.8133 (2.8165)	mem 20675MB
[2025-04-03 01:52:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][20/311]	eta 0:04:31 lr 0.000428	time 0.8758 (0.9339)	loss 0.5862 (0.5564)	grad_norm 2.5403 (2.7656)	mem 20675MB
[2025-04-03 01:52:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][22/311]	eta 0:04:28 lr 0.000428	time 0.8762 (0.9290)	loss 0.5332 (0.5578)	grad_norm 3.0275 (2.7796)	mem 20675MB
[2025-04-03 01:52:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][24/311]	eta 0:04:25 lr 0.000427	time 0.8754 (0.9248)	loss 0.5337 (0.5592)	grad_norm 3.0898 (2.7565)	mem 20675MB
[2025-04-03 01:52:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][26/311]	eta 0:04:22 lr 0.000427	time 0.8758 (0.9212)	loss 0.5965 (0.5581)	grad_norm 2.2898 (2.7426)	mem 20675MB
[2025-04-03 01:52:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][28/311]	eta 0:04:19 lr 0.000426	time 0.8758 (0.9182)	loss 0.6184 (0.5603)	grad_norm 2.4915 (2.7166)	mem 20675MB
[2025-04-03 01:52:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][30/311]	eta 0:04:17 lr 0.000426	time 0.8761 (0.9155)	loss 0.6077 (0.5643)	grad_norm 3.0666 (2.7200)	mem 20675MB
[2025-04-03 01:52:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][32/311]	eta 0:04:14 lr 0.000426	time 0.8758 (0.9132)	loss 0.5750 (0.5632)	grad_norm 2.3596 (2.7324)	mem 20675MB
[2025-04-03 01:52:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][34/311]	eta 0:04:12 lr 0.000425	time 0.8758 (0.9111)	loss 0.6489 (0.5658)	grad_norm 2.5962 (2.7052)	mem 20675MB
[2025-04-03 01:52:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][36/311]	eta 0:04:10 lr 0.000425	time 0.8756 (0.9092)	loss 0.5756 (0.5670)	grad_norm 2.4233 (2.7098)	mem 20675MB
[2025-04-03 01:52:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][38/311]	eta 0:04:07 lr 0.000424	time 0.8758 (0.9075)	loss 0.5151 (0.5669)	grad_norm 3.0668 (2.6987)	mem 20675MB
[2025-04-03 01:52:50 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][40/311]	eta 0:04:05 lr 0.000424	time 0.8764 (0.9061)	loss 0.5858 (0.5654)	grad_norm 2.0728 (2.7252)	mem 20675MB
[2025-04-03 01:52:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][42/311]	eta 0:04:03 lr 0.000424	time 0.8758 (0.9047)	loss 0.6341 (0.5692)	grad_norm 2.2959 (2.7183)	mem 20675MB
[2025-04-03 01:52:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][44/311]	eta 0:04:01 lr 0.000423	time 0.8758 (0.9035)	loss 0.5925 (0.5693)	grad_norm 2.2834 (2.7048)	mem 20675MB
[2025-04-03 01:52:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][46/311]	eta 0:03:59 lr 0.000423	time 0.8758 (0.9023)	loss 0.4934 (0.5648)	grad_norm 3.6325 (2.7276)	mem 20675MB
[2025-04-03 01:52:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][48/311]	eta 0:03:57 lr 0.000422	time 0.8760 (0.9012)	loss 0.4963 (0.5637)	grad_norm 3.2942 (2.7259)	mem 20675MB
[2025-04-03 01:52:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][50/311]	eta 0:03:54 lr 0.000422	time 0.8755 (0.9003)	loss 0.4720 (0.5597)	grad_norm 4.4778 (2.7658)	mem 20675MB
[2025-04-03 01:53:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][52/311]	eta 0:03:52 lr 0.000422	time 0.8758 (0.8994)	loss 0.6597 (0.5618)	grad_norm 2.8554 (2.7490)	mem 20675MB
[2025-04-03 01:53:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][54/311]	eta 0:03:50 lr 0.000421	time 0.8756 (0.8986)	loss 0.4825 (0.5613)	grad_norm 3.5483 (2.7518)	mem 20675MB
[2025-04-03 01:53:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][56/311]	eta 0:03:48 lr 0.000421	time 0.8761 (0.8978)	loss 0.5843 (0.5627)	grad_norm 2.7636 (2.7348)	mem 20675MB
[2025-04-03 01:53:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][58/311]	eta 0:03:46 lr 0.000420	time 0.8755 (0.8971)	loss 0.4000 (0.5601)	grad_norm 3.6231 (2.7472)	mem 20675MB
[2025-04-03 01:53:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][60/311]	eta 0:03:44 lr 0.000420	time 0.8758 (0.8964)	loss 0.6523 (0.5622)	grad_norm 2.5374 (2.7403)	mem 20675MB
[2025-04-03 01:53:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][62/311]	eta 0:03:43 lr 0.000420	time 0.8756 (0.8958)	loss 0.4899 (0.5624)	grad_norm 2.8934 (2.7400)	mem 20675MB
[2025-04-03 01:53:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][64/311]	eta 0:03:41 lr 0.000419	time 0.8759 (0.8952)	loss 0.4865 (0.5609)	grad_norm 2.8606 (2.7419)	mem 20675MB
[2025-04-03 01:53:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][66/311]	eta 0:03:39 lr 0.000419	time 0.8756 (0.8946)	loss 0.4865 (0.5588)	grad_norm 2.0159 (2.7344)	mem 20675MB
[2025-04-03 01:53:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][68/311]	eta 0:03:37 lr 0.000418	time 0.8758 (0.8941)	loss 0.5499 (0.5595)	grad_norm 2.8352 (2.7278)	mem 20675MB
[2025-04-03 01:53:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][70/311]	eta 0:03:35 lr 0.000418	time 0.8758 (0.8936)	loss 0.4458 (0.5568)	grad_norm 3.9445 (2.7650)	mem 20675MB
[2025-04-03 01:53:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][72/311]	eta 0:03:33 lr 0.000418	time 0.8755 (0.8931)	loss 0.6048 (0.5582)	grad_norm 2.7188 (2.7587)	mem 20675MB
[2025-04-03 01:53:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][74/311]	eta 0:03:31 lr 0.000417	time 0.8758 (0.8927)	loss 0.6205 (0.5582)	grad_norm 2.6964 (2.7709)	mem 20675MB
[2025-04-03 01:53:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][76/311]	eta 0:03:29 lr 0.000417	time 0.8759 (0.8923)	loss 0.6379 (0.5592)	grad_norm 2.8592 (2.7790)	mem 20675MB
[2025-04-03 01:53:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][78/311]	eta 0:03:27 lr 0.000416	time 0.8755 (0.8919)	loss 0.5695 (0.5595)	grad_norm 3.3533 (2.8099)	mem 20675MB
[2025-04-03 01:53:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][80/311]	eta 0:03:25 lr 0.000416	time 0.8754 (0.8915)	loss 0.5177 (0.5588)	grad_norm 2.8361 (2.8034)	mem 20675MB
[2025-04-03 01:53:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][82/311]	eta 0:03:24 lr 0.000416	time 0.8756 (0.8911)	loss 0.5777 (0.5576)	grad_norm 2.1182 (2.7994)	mem 20675MB
[2025-04-03 01:53:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][84/311]	eta 0:03:22 lr 0.000415	time 0.8757 (0.8908)	loss 0.6544 (0.5589)	grad_norm 2.5493 (2.7915)	mem 20675MB
[2025-04-03 01:53:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][86/311]	eta 0:03:20 lr 0.000415	time 0.8755 (0.8905)	loss 0.5115 (0.5590)	grad_norm 3.2094 (2.7950)	mem 20675MB
[2025-04-03 01:53:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][88/311]	eta 0:03:18 lr 0.000415	time 0.8754 (0.8901)	loss 0.5729 (0.5596)	grad_norm 2.4022 (2.8084)	mem 20675MB
[2025-04-03 01:53:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][90/311]	eta 0:03:16 lr 0.000414	time 0.8761 (0.8899)	loss 0.5236 (0.5599)	grad_norm 3.6755 (2.8100)	mem 20675MB
[2025-04-03 01:53:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][92/311]	eta 0:03:14 lr 0.000414	time 0.8757 (0.8896)	loss 0.5880 (0.5589)	grad_norm 1.9214 (2.7993)	mem 20675MB
[2025-04-03 01:53:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][94/311]	eta 0:03:12 lr 0.000413	time 0.8757 (0.8893)	loss 0.5924 (0.5591)	grad_norm 2.1348 (2.7866)	mem 20675MB
[2025-04-03 01:53:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][96/311]	eta 0:03:11 lr 0.000413	time 0.8759 (0.8890)	loss 0.4457 (0.5585)	grad_norm 4.5499 (2.7967)	mem 20675MB
[2025-04-03 01:53:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][98/311]	eta 0:03:09 lr 0.000413	time 0.8758 (0.8888)	loss 0.6129 (0.5591)	grad_norm 2.8827 (2.7985)	mem 20675MB
[2025-04-03 01:53:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][100/311]	eta 0:03:07 lr 0.000412	time 0.8755 (0.8885)	loss 0.5415 (0.5592)	grad_norm 1.8136 (2.7848)	mem 20675MB
[2025-04-03 01:53:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][102/311]	eta 0:03:05 lr 0.000412	time 0.8758 (0.8883)	loss 0.6529 (0.5591)	grad_norm 2.9296 (2.7950)	mem 20675MB
[2025-04-03 01:53:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][104/311]	eta 0:03:03 lr 0.000411	time 0.8756 (0.8881)	loss 0.5905 (0.5600)	grad_norm 2.2398 (2.7834)	mem 20675MB
[2025-04-03 01:53:48 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][106/311]	eta 0:03:02 lr 0.000411	time 0.8761 (0.8879)	loss 0.5056 (0.5591)	grad_norm 2.4886 (2.7844)	mem 20675MB
[2025-04-03 01:53:50 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][108/311]	eta 0:03:00 lr 0.000411	time 0.8757 (0.8877)	loss 0.4795 (0.5585)	grad_norm 3.7970 (2.7882)	mem 20675MB
[2025-04-03 01:53:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][110/311]	eta 0:02:58 lr 0.000410	time 0.8755 (0.8875)	loss 0.3901 (0.5573)	grad_norm 4.1808 (2.7969)	mem 20675MB
[2025-04-03 01:53:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][112/311]	eta 0:02:56 lr 0.000410	time 0.8758 (0.8873)	loss 0.5072 (0.5561)	grad_norm 3.7589 (2.8133)	mem 20675MB
[2025-04-03 01:53:55 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][114/311]	eta 0:02:54 lr 0.000409	time 0.8757 (0.8871)	loss 0.6352 (0.5565)	grad_norm 2.6332 (2.8087)	mem 20675MB
[2025-04-03 01:53:57 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][116/311]	eta 0:02:52 lr 0.000409	time 0.8759 (0.8869)	loss 0.6082 (0.5560)	grad_norm 2.2545 (2.8075)	mem 20675MB
[2025-04-03 01:53:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][118/311]	eta 0:02:51 lr 0.000409	time 0.8758 (0.8868)	loss 0.4227 (0.5552)	grad_norm 2.9401 (2.8087)	mem 20675MB
[2025-04-03 01:54:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][120/311]	eta 0:02:49 lr 0.000408	time 0.8755 (0.8866)	loss 0.6532 (0.5551)	grad_norm 3.8416 (2.8214)	mem 20675MB
[2025-04-03 01:54:02 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][122/311]	eta 0:02:47 lr 0.000408	time 0.8768 (0.8864)	loss 0.5786 (0.5555)	grad_norm 2.0433 (2.8117)	mem 20675MB
[2025-04-03 01:54:04 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][124/311]	eta 0:02:45 lr 0.000407	time 0.8758 (0.8863)	loss 0.5363 (0.5554)	grad_norm 2.8670 (2.8165)	mem 20675MB
[2025-04-03 01:54:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][126/311]	eta 0:02:43 lr 0.000407	time 0.8783 (0.8862)	loss 0.5424 (0.5550)	grad_norm 4.4820 (2.8344)	mem 20675MB
[2025-04-03 01:54:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][128/311]	eta 0:02:42 lr 0.000407	time 0.8755 (0.8860)	loss 0.6024 (0.5556)	grad_norm 2.1452 (2.8272)	mem 20675MB
[2025-04-03 01:54:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][130/311]	eta 0:02:40 lr 0.000406	time 0.8755 (0.8859)	loss 0.5648 (0.5553)	grad_norm 2.7096 (2.8240)	mem 20675MB
[2025-04-03 01:54:11 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][132/311]	eta 0:02:38 lr 0.000406	time 0.8762 (0.8858)	loss 0.4535 (0.5541)	grad_norm 3.7061 (2.8483)	mem 20675MB
[2025-04-03 01:54:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][134/311]	eta 0:02:36 lr 0.000405	time 0.8758 (0.8856)	loss 0.4580 (0.5531)	grad_norm 2.9773 (2.8462)	mem 20675MB
[2025-04-03 01:54:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][136/311]	eta 0:02:34 lr 0.000405	time 0.8771 (0.8855)	loss 0.6101 (0.5541)	grad_norm 2.4126 (2.8396)	mem 20675MB
[2025-04-03 01:54:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][138/311]	eta 0:02:33 lr 0.000405	time 0.8754 (0.8854)	loss 0.4740 (0.5539)	grad_norm 3.8088 (2.8461)	mem 20675MB
[2025-04-03 01:54:18 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][140/311]	eta 0:02:31 lr 0.000404	time 0.8760 (0.8852)	loss 0.6379 (0.5550)	grad_norm 2.2909 (2.8362)	mem 20675MB
[2025-04-03 01:54:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][142/311]	eta 0:02:29 lr 0.000404	time 0.8757 (0.8851)	loss 0.6033 (0.5551)	grad_norm 3.0581 (2.8449)	mem 20675MB
[2025-04-03 01:54:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][144/311]	eta 0:02:27 lr 0.000403	time 0.8757 (0.8850)	loss 0.5021 (0.5548)	grad_norm 3.8248 (2.8522)	mem 20675MB
[2025-04-03 01:54:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][146/311]	eta 0:02:26 lr 0.000403	time 0.8757 (0.8849)	loss 0.5149 (0.5539)	grad_norm 2.8375 (2.8575)	mem 20675MB
[2025-04-03 01:54:25 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][148/311]	eta 0:02:24 lr 0.000403	time 0.8756 (0.8848)	loss 0.5801 (0.5542)	grad_norm 2.9118 (2.8530)	mem 20675MB
[2025-04-03 01:54:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][150/311]	eta 0:02:22 lr 0.000402	time 0.8756 (0.8847)	loss 0.6297 (0.5551)	grad_norm 3.4480 (2.8544)	mem 20675MB
[2025-04-03 01:54:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][152/311]	eta 0:02:20 lr 0.000402	time 0.8755 (0.8846)	loss 0.5837 (0.5551)	grad_norm 2.2871 (2.8628)	mem 20675MB
[2025-04-03 01:54:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][154/311]	eta 0:02:18 lr 0.000401	time 0.8754 (0.8845)	loss 0.5066 (0.5553)	grad_norm 2.7447 (2.8591)	mem 20675MB
[2025-04-03 01:54:32 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][156/311]	eta 0:02:17 lr 0.000401	time 0.8754 (0.8844)	loss 0.5335 (0.5556)	grad_norm 2.5402 (2.8541)	mem 20675MB
[2025-04-03 01:54:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][158/311]	eta 0:02:15 lr 0.000401	time 0.8757 (0.8843)	loss 0.6115 (0.5559)	grad_norm 1.9581 (2.8447)	mem 20675MB
[2025-04-03 01:54:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][160/311]	eta 0:02:13 lr 0.000400	time 0.8756 (0.8842)	loss 0.6271 (0.5561)	grad_norm 2.2393 (2.8420)	mem 20675MB
[2025-04-03 01:54:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][162/311]	eta 0:02:11 lr 0.000400	time 0.8753 (0.8841)	loss 0.5564 (0.5566)	grad_norm 2.1175 (2.8321)	mem 20675MB
[2025-04-03 01:54:39 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][164/311]	eta 0:02:09 lr 0.000400	time 0.8755 (0.8840)	loss 0.5854 (0.5568)	grad_norm 1.9451 (2.8218)	mem 20675MB
[2025-04-03 01:54:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][166/311]	eta 0:02:08 lr 0.000399	time 0.8759 (0.8839)	loss 0.5183 (0.5568)	grad_norm 3.4447 (2.8213)	mem 20675MB
[2025-04-03 01:54:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][168/311]	eta 0:02:06 lr 0.000399	time 0.8773 (0.8838)	loss 0.5575 (0.5569)	grad_norm 2.6275 (2.8198)	mem 20675MB
[2025-04-03 01:54:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][170/311]	eta 0:02:04 lr 0.000398	time 0.8757 (0.8837)	loss 0.5948 (0.5564)	grad_norm 2.4677 (2.8174)	mem 20675MB
[2025-04-03 01:54:46 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][172/311]	eta 0:02:02 lr 0.000398	time 0.8757 (0.8836)	loss 0.4633 (0.5559)	grad_norm 2.6933 (2.8204)	mem 20675MB
[2025-04-03 01:54:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][174/311]	eta 0:02:01 lr 0.000398	time 0.8757 (0.8836)	loss 0.4714 (0.5559)	grad_norm 3.9210 (2.8232)	mem 20675MB
[2025-04-03 01:54:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][176/311]	eta 0:01:59 lr 0.000397	time 0.8753 (0.8835)	loss 0.4549 (0.5557)	grad_norm 4.6725 (2.8358)	mem 20675MB
[2025-04-03 01:54:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][178/311]	eta 0:01:57 lr 0.000397	time 0.8755 (0.8834)	loss 0.5771 (0.5560)	grad_norm 2.8038 (2.8322)	mem 20675MB
[2025-04-03 01:54:53 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][180/311]	eta 0:01:55 lr 0.000396	time 0.8758 (0.8833)	loss 0.5015 (0.5553)	grad_norm 5.2073 (2.8492)	mem 20675MB
[2025-04-03 01:54:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][182/311]	eta 0:01:53 lr 0.000396	time 0.8757 (0.8832)	loss 0.5969 (0.5557)	grad_norm 3.3011 (2.8529)	mem 20675MB
[2025-04-03 01:54:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][184/311]	eta 0:01:52 lr 0.000396	time 0.8757 (0.8832)	loss 0.4463 (0.5552)	grad_norm 3.4481 (2.8538)	mem 20675MB
[2025-04-03 01:54:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][186/311]	eta 0:01:50 lr 0.000395	time 0.8757 (0.8831)	loss 0.5960 (0.5559)	grad_norm 2.1277 (2.8501)	mem 20675MB
[2025-04-03 01:55:00 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][188/311]	eta 0:01:48 lr 0.000395	time 0.8764 (0.8830)	loss 0.4063 (0.5552)	grad_norm 3.6589 (2.8532)	mem 20675MB
[2025-04-03 01:55:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][190/311]	eta 0:01:46 lr 0.000394	time 0.8755 (0.8830)	loss 0.6150 (0.5550)	grad_norm 2.5166 (2.8525)	mem 20675MB
[2025-04-03 01:55:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][192/311]	eta 0:01:45 lr 0.000394	time 0.8757 (0.8829)	loss 0.5510 (0.5552)	grad_norm 2.9049 (2.8513)	mem 20675MB
[2025-04-03 01:55:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][194/311]	eta 0:01:43 lr 0.000394	time 0.8757 (0.8828)	loss 0.5529 (0.5546)	grad_norm 3.5884 (2.8571)	mem 20675MB
[2025-04-03 01:55:07 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][196/311]	eta 0:01:41 lr 0.000393	time 0.8757 (0.8828)	loss 0.6733 (0.5552)	grad_norm 3.2039 (2.8617)	mem 20675MB
[2025-04-03 01:55:09 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][198/311]	eta 0:01:39 lr 0.000393	time 0.8758 (0.8827)	loss 0.5319 (0.5550)	grad_norm 2.7062 (2.8599)	mem 20675MB
[2025-04-03 01:55:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][200/311]	eta 0:01:37 lr 0.000392	time 0.8758 (0.8826)	loss 0.5975 (0.5552)	grad_norm 2.5696 (2.8580)	mem 20675MB
[2025-04-03 01:55:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][202/311]	eta 0:01:36 lr 0.000392	time 0.8758 (0.8826)	loss 0.5905 (0.5555)	grad_norm 2.3127 (2.8563)	mem 20675MB
[2025-04-03 01:55:14 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][204/311]	eta 0:01:34 lr 0.000392	time 0.8755 (0.8825)	loss 0.5366 (0.5556)	grad_norm 5.7740 (2.8732)	mem 20675MB
[2025-04-03 01:55:16 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][206/311]	eta 0:01:32 lr 0.000391	time 0.8759 (0.8825)	loss 0.6332 (0.5557)	grad_norm 3.6027 (2.8825)	mem 20675MB
[2025-04-03 01:55:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][208/311]	eta 0:01:30 lr 0.000391	time 0.8757 (0.8824)	loss 0.6216 (0.5560)	grad_norm 2.5993 (2.8777)	mem 20675MB
[2025-04-03 01:55:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][210/311]	eta 0:01:29 lr 0.000391	time 0.8756 (0.8823)	loss 0.6070 (0.5565)	grad_norm 1.8461 (2.8695)	mem 20675MB
[2025-04-03 01:55:21 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][212/311]	eta 0:01:27 lr 0.000390	time 0.8756 (0.8823)	loss 0.5750 (0.5563)	grad_norm 2.6516 (2.8683)	mem 20675MB
[2025-04-03 01:55:23 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][214/311]	eta 0:01:25 lr 0.000390	time 0.8755 (0.8822)	loss 0.4652 (0.5555)	grad_norm 4.1158 (2.8762)	mem 20675MB
[2025-04-03 01:55:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][216/311]	eta 0:01:23 lr 0.000389	time 0.8759 (0.8822)	loss 0.5270 (0.5552)	grad_norm 2.7599 (2.8733)	mem 20675MB
[2025-04-03 01:55:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][218/311]	eta 0:01:22 lr 0.000389	time 0.8755 (0.8821)	loss 0.5421 (0.5547)	grad_norm 3.1400 (2.8859)	mem 20675MB
[2025-04-03 01:55:28 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][220/311]	eta 0:01:20 lr 0.000389	time 0.8758 (0.8821)	loss 0.6109 (0.5552)	grad_norm 2.7323 (2.8826)	mem 20675MB
[2025-04-03 01:55:30 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][222/311]	eta 0:01:18 lr 0.000388	time 0.8759 (0.8820)	loss 0.6513 (0.5560)	grad_norm 2.4973 (2.8823)	mem 20675MB
[2025-04-03 01:55:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][224/311]	eta 0:01:16 lr 0.000388	time 0.8769 (0.8820)	loss 0.5989 (0.5561)	grad_norm 2.6018 (2.8787)	mem 20675MB
[2025-04-03 01:55:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][226/311]	eta 0:01:14 lr 0.000387	time 0.8765 (0.8819)	loss 0.5115 (0.5566)	grad_norm 3.2637 (2.8820)	mem 20675MB
[2025-04-03 01:55:35 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][228/311]	eta 0:01:13 lr 0.000387	time 0.8758 (0.8819)	loss 0.5720 (0.5569)	grad_norm 1.6287 (2.8745)	mem 20675MB
[2025-04-03 01:55:37 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][230/311]	eta 0:01:11 lr 0.000387	time 0.8763 (0.8819)	loss 0.6158 (0.5577)	grad_norm 1.7509 (2.8707)	mem 20675MB
[2025-04-03 01:55:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][232/311]	eta 0:01:09 lr 0.000386	time 0.8755 (0.8818)	loss 0.5397 (0.5579)	grad_norm 2.8412 (2.8687)	mem 20675MB
[2025-04-03 01:55:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][234/311]	eta 0:01:07 lr 0.000386	time 0.8756 (0.8818)	loss 0.5739 (0.5577)	grad_norm 2.2509 (2.8650)	mem 20675MB
[2025-04-03 01:55:42 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][236/311]	eta 0:01:06 lr 0.000385	time 0.8757 (0.8817)	loss 0.6189 (0.5578)	grad_norm 2.0310 (2.8631)	mem 20675MB
[2025-04-03 01:55:44 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][238/311]	eta 0:01:04 lr 0.000385	time 0.8759 (0.8817)	loss 0.6057 (0.5584)	grad_norm 1.9608 (2.8548)	mem 20675MB
[2025-04-03 01:55:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][240/311]	eta 0:01:02 lr 0.000385	time 0.8756 (0.8817)	loss 0.5473 (0.5580)	grad_norm 2.7410 (2.8544)	mem 20675MB
[2025-04-03 01:55:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][242/311]	eta 0:01:00 lr 0.000384	time 0.8755 (0.8816)	loss 0.5199 (0.5578)	grad_norm 2.8394 (2.8512)	mem 20675MB
[2025-04-03 01:55:49 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][244/311]	eta 0:00:59 lr 0.000384	time 0.8772 (0.8816)	loss 0.5895 (0.5579)	grad_norm 2.2472 (2.8495)	mem 20675MB
[2025-04-03 01:55:51 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][246/311]	eta 0:00:57 lr 0.000384	time 0.8761 (0.8815)	loss 0.5751 (0.5578)	grad_norm 2.7197 (2.8476)	mem 20675MB
[2025-04-03 01:55:52 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][248/311]	eta 0:00:55 lr 0.000383	time 0.8755 (0.8815)	loss 0.5809 (0.5580)	grad_norm 2.8817 (2.8467)	mem 20675MB
[2025-04-03 01:55:54 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][250/311]	eta 0:00:53 lr 0.000383	time 0.8767 (0.8815)	loss 0.5294 (0.5580)	grad_norm 3.0497 (2.8468)	mem 20675MB
[2025-04-03 01:55:56 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][252/311]	eta 0:00:52 lr 0.000382	time 0.8760 (0.8814)	loss 0.5271 (0.5575)	grad_norm 4.1352 (2.8575)	mem 20675MB
[2025-04-03 01:55:58 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][254/311]	eta 0:00:50 lr 0.000382	time 0.8761 (0.8814)	loss 0.4981 (0.5574)	grad_norm 6.8899 (2.8710)	mem 20675MB
[2025-04-03 01:55:59 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][256/311]	eta 0:00:48 lr 0.000382	time 0.8754 (0.8814)	loss 0.5924 (0.5576)	grad_norm 3.2333 (2.8717)	mem 20675MB
[2025-04-03 01:56:01 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][258/311]	eta 0:00:46 lr 0.000381	time 0.8756 (0.8813)	loss 0.5472 (0.5579)	grad_norm 3.8560 (2.8776)	mem 20675MB
[2025-04-03 01:56:03 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][260/311]	eta 0:00:44 lr 0.000381	time 0.8758 (0.8813)	loss 0.6044 (0.5582)	grad_norm 3.0309 (2.8763)	mem 20675MB
[2025-04-03 01:56:05 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][262/311]	eta 0:00:43 lr 0.000380	time 0.8756 (0.8813)	loss 0.6529 (0.5587)	grad_norm 2.0429 (2.8717)	mem 20675MB
[2025-04-03 01:56:06 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][264/311]	eta 0:00:41 lr 0.000380	time 0.8755 (0.8812)	loss 0.5754 (0.5588)	grad_norm 3.3257 (2.8712)	mem 20675MB
[2025-04-03 01:56:08 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][266/311]	eta 0:00:39 lr 0.000380	time 0.8759 (0.8812)	loss 0.6640 (0.5595)	grad_norm 2.3879 (2.8664)	mem 20675MB
[2025-04-03 01:56:10 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][268/311]	eta 0:00:37 lr 0.000379	time 0.8758 (0.8811)	loss 0.6410 (0.5599)	grad_norm 2.8741 (2.8638)	mem 20675MB
[2025-04-03 01:56:12 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][270/311]	eta 0:00:36 lr 0.000379	time 0.8758 (0.8811)	loss 0.6622 (0.5605)	grad_norm 2.9531 (2.8623)	mem 20675MB
[2025-04-03 01:56:13 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][272/311]	eta 0:00:34 lr 0.000378	time 0.8760 (0.8811)	loss 0.6275 (0.5609)	grad_norm 1.7663 (2.8542)	mem 20675MB
[2025-04-03 01:56:15 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][274/311]	eta 0:00:32 lr 0.000378	time 0.8758 (0.8810)	loss 0.5733 (0.5610)	grad_norm 2.0773 (2.8477)	mem 20675MB
[2025-04-03 01:56:17 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][276/311]	eta 0:00:30 lr 0.000378	time 0.8763 (0.8810)	loss 0.5797 (0.5613)	grad_norm 1.9964 (2.8439)	mem 20675MB
[2025-04-03 01:56:19 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][278/311]	eta 0:00:29 lr 0.000377	time 0.8758 (0.8810)	loss 0.6133 (0.5616)	grad_norm 1.8125 (2.8362)	mem 20675MB
[2025-04-03 01:56:20 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][280/311]	eta 0:00:27 lr 0.000377	time 0.8759 (0.8810)	loss 0.4824 (0.5612)	grad_norm 2.3979 (2.8327)	mem 20675MB
[2025-04-03 01:56:22 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][282/311]	eta 0:00:25 lr 0.000377	time 0.8755 (0.8809)	loss 0.5575 (0.5612)	grad_norm 1.8932 (2.8274)	mem 20675MB
[2025-04-03 01:56:24 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][284/311]	eta 0:00:23 lr 0.000376	time 0.8756 (0.8809)	loss 0.4936 (0.5611)	grad_norm 3.3945 (2.8287)	mem 20675MB
[2025-04-03 01:56:26 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][286/311]	eta 0:00:22 lr 0.000376	time 0.8766 (0.8809)	loss 0.6351 (0.5616)	grad_norm 2.2361 (2.8244)	mem 20675MB
[2025-04-03 01:56:27 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][288/311]	eta 0:00:20 lr 0.000375	time 0.8778 (0.8809)	loss 0.6203 (0.5619)	grad_norm 2.3918 (2.8209)	mem 20675MB
[2025-04-03 01:56:29 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][290/311]	eta 0:00:18 lr 0.000375	time 0.8759 (0.8808)	loss 0.5184 (0.5618)	grad_norm 4.2857 (2.8230)	mem 20675MB
[2025-04-03 01:56:31 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][292/311]	eta 0:00:16 lr 0.000375	time 0.8758 (0.8808)	loss 0.5795 (0.5618)	grad_norm 2.1479 (2.8224)	mem 20675MB
[2025-04-03 01:56:33 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][294/311]	eta 0:00:14 lr 0.000374	time 0.8763 (0.8808)	loss 0.6740 (0.5617)	grad_norm 3.0094 (2.8276)	mem 20675MB
[2025-04-03 01:56:34 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][296/311]	eta 0:00:13 lr 0.000374	time 0.8755 (0.8808)	loss 0.5034 (0.5611)	grad_norm 2.5850 (2.8289)	mem 20675MB
[2025-04-03 01:56:36 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][298/311]	eta 0:00:11 lr 0.000373	time 0.8754 (0.8807)	loss 0.4497 (0.5604)	grad_norm 3.3696 (2.8316)	mem 20675MB
[2025-04-03 01:56:38 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][300/311]	eta 0:00:09 lr 0.000373	time 0.8759 (0.8807)	loss 0.4851 (0.5598)	grad_norm 3.4164 (2.8389)	mem 20675MB
[2025-04-03 01:56:40 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][302/311]	eta 0:00:07 lr 0.000373	time 0.8755 (0.8807)	loss 0.6220 (0.5596)	grad_norm 2.6648 (2.8392)	mem 20675MB
[2025-04-03 01:56:41 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][304/311]	eta 0:00:06 lr 0.000372	time 0.8755 (0.8806)	loss 0.6157 (0.5598)	grad_norm 2.7670 (2.8360)	mem 20675MB
[2025-04-03 01:56:43 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][306/311]	eta 0:00:04 lr 0.000372	time 0.8759 (0.8806)	loss 0.5964 (0.5600)	grad_norm 3.6721 (2.8410)	mem 20675MB
[2025-04-03 01:56:45 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][308/311]	eta 0:00:02 lr 0.000372	time 0.8757 (0.8806)	loss 0.6104 (0.5600)	grad_norm 2.7723 (2.8423)	mem 20675MB
[2025-04-03 01:56:47 simmim_finetune] (main_finetune.py 252): INFO Train: [18/30][310/311]	eta 0:00:00 lr 0.000371	time 0.8759 (0.8806)	loss 0.6216 (0.5601)	grad_norm 2.5900 (2.8460)	mem 20675MB
[2025-04-03 01:56:47 simmim_finetune] (main_finetune.py 260): INFO EPOCH 18 training takes 0:04:33
[2025-04-03 01:56:48 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.416 (1.416)	Loss 0.5701 (0.5701)	Acc@1 73.438 (73.438)	Mem 20675MB
[2025-04-03 01:56:48 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 75.352
[2025-04-03 01:56:48 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 75.4%
[2025-04-03 01:56:48 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 01:56:48 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.5473999736153461e-06, 1.5473999736153461e-06, 2.285951852866198e-06, 2.285951852866198e-06, 3.422185513252124e-06, 3.422185513252124e-06, 5.170237298461241e-06, 5.170237298461241e-06, 7.859547737244496e-06, 7.859547737244496e-06, 1.199694841229566e-05, 1.199694841229566e-05, 1.836218022006668e-05, 1.836218022006668e-05, 2.8154844539714395e-05, 2.8154844539714395e-05, 4.322048195455704e-05, 4.322048195455704e-05, 6.63983856696996e-05, 6.63983856696996e-05, 0.00010205669907761119, 0.00010205669907761119, 0.00015691564278209056, 0.00015691564278209056, 0.00024131401771205883, 0.00024131401771205883, 0.0003711576714504715, 0.0003711576714504715]
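The 28 learning rates above come in 14 pairs (one weight-decay and one no-decay parameter group per layer), and adjacent pairs differ by a factor that converges to 0.65, matching `LAYER_DECAY: 0.65` in the config. A minimal sketch of that layer-wise scaling, assuming the common SimMIM/BEiT scheme for a 12-block ViT (patch embed, blocks 1..12, head = 14 "layers"); the helper name is my own:

```python
# Layer-wise LR decay sketch (assumption: SimMIM/BEiT-style assignment where
# layer i of num_layers gets multiplier decay ** (num_layers - 1 - i),
# so the head keeps the full scheduled LR and earlier layers shrink by 0.65x each).
def layer_scales(num_layers: int = 14, decay: float = 0.65) -> list[float]:
    """Per-layer LR multipliers, shallowest layer first."""
    return [decay ** (num_layers - 1 - i) for i in range(num_layers)]

scales = layer_scales()
base = 0.0003711576714504715  # top group's current LR, taken from the log line above
lrs = [base * s for s in scales]
# Each multiplier is exactly 0.65x its deeper neighbor; the logged values
# deviate slightly at the shallow end because MIN_LR enters the cosine schedule.
```

With 14 layers the shallowest multiplier is 0.65**13 ≈ 0.0037, which is why the first logged group sits near 1.5e-06 while the head group is at 3.7e-04.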
[2025-04-03 01:56:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][0/311]	eta 0:10:59 lr 0.000371	time 2.1193 (2.1193)	loss 0.6121 (0.6121)	grad_norm 3.0310 (3.0310)	mem 20675MB
[2025-04-03 01:56:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][2/311]	eta 0:06:39 lr 0.000371	time 0.8763 (1.2914)	loss 0.5061 (0.5602)	grad_norm 2.5852 (2.7887)	mem 20675MB
[2025-04-03 01:56:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][4/311]	eta 0:05:45 lr 0.000370	time 0.8755 (1.1255)	loss 0.6240 (0.5622)	grad_norm 2.3293 (2.8211)	mem 20675MB
[2025-04-03 01:56:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][6/311]	eta 0:05:21 lr 0.000370	time 0.8754 (1.0543)	loss 0.5485 (0.5516)	grad_norm 2.8312 (2.9056)	mem 20675MB
[2025-04-03 01:56:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][8/311]	eta 0:05:07 lr 0.000369	time 0.8758 (1.0148)	loss 0.5873 (0.5408)	grad_norm 2.2762 (2.8741)	mem 20675MB
[2025-04-03 01:56:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][10/311]	eta 0:04:57 lr 0.000369	time 0.8757 (0.9896)	loss 0.6303 (0.5562)	grad_norm 2.5587 (2.7967)	mem 20675MB
[2025-04-03 01:57:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][12/311]	eta 0:04:50 lr 0.000369	time 0.8754 (0.9722)	loss 0.6422 (0.5647)	grad_norm 1.8380 (2.6923)	mem 20675MB
[2025-04-03 01:57:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][14/311]	eta 0:04:44 lr 0.000368	time 0.8756 (0.9594)	loss 0.4216 (0.5520)	grad_norm 4.1980 (2.8358)	mem 20675MB
[2025-04-03 01:57:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][16/311]	eta 0:04:40 lr 0.000368	time 0.8765 (0.9497)	loss 0.6103 (0.5588)	grad_norm 2.8026 (2.8078)	mem 20675MB
[2025-04-03 01:57:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][18/311]	eta 0:04:36 lr 0.000368	time 0.8756 (0.9420)	loss 0.5933 (0.5537)	grad_norm 1.8474 (2.8364)	mem 20675MB
[2025-04-03 01:57:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][20/311]	eta 0:04:32 lr 0.000367	time 0.8758 (0.9358)	loss 0.5427 (0.5566)	grad_norm 2.4628 (2.7853)	mem 20675MB
[2025-04-03 01:57:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][22/311]	eta 0:04:28 lr 0.000367	time 0.8757 (0.9308)	loss 0.6230 (0.5609)	grad_norm 2.6128 (2.7751)	mem 20675MB
[2025-04-03 01:57:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][24/311]	eta 0:04:25 lr 0.000366	time 0.8759 (0.9265)	loss 0.4287 (0.5584)	grad_norm 4.1910 (2.8080)	mem 20675MB
[2025-04-03 01:57:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][26/311]	eta 0:04:22 lr 0.000366	time 0.8757 (0.9228)	loss 0.6315 (0.5608)	grad_norm 2.5897 (2.7792)	mem 20675MB
[2025-04-03 01:57:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][28/311]	eta 0:04:20 lr 0.000366	time 0.8755 (0.9196)	loss 0.5753 (0.5607)	grad_norm 2.5555 (2.8009)	mem 20675MB
[2025-04-03 01:57:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][30/311]	eta 0:04:17 lr 0.000365	time 0.8755 (0.9168)	loss 0.5297 (0.5607)	grad_norm 2.7889 (2.7771)	mem 20675MB
[2025-04-03 01:57:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][32/311]	eta 0:04:15 lr 0.000365	time 0.8757 (0.9144)	loss 0.5658 (0.5632)	grad_norm 2.9052 (2.7857)	mem 20675MB
[2025-04-03 01:57:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][34/311]	eta 0:04:12 lr 0.000364	time 0.8755 (0.9122)	loss 0.5879 (0.5642)	grad_norm 2.0855 (2.7647)	mem 20675MB
[2025-04-03 01:57:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][36/311]	eta 0:04:10 lr 0.000364	time 0.8758 (0.9103)	loss 0.4699 (0.5590)	grad_norm 4.9885 (2.8596)	mem 20675MB
[2025-04-03 01:57:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][38/311]	eta 0:04:08 lr 0.000364	time 0.8762 (0.9085)	loss 0.5643 (0.5615)	grad_norm 3.3774 (2.8791)	mem 20675MB
[2025-04-03 01:57:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][40/311]	eta 0:04:05 lr 0.000363	time 0.8754 (0.9070)	loss 0.5381 (0.5610)	grad_norm 2.1249 (2.8404)	mem 20675MB
[2025-04-03 01:57:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][42/311]	eta 0:04:03 lr 0.000363	time 0.8754 (0.9055)	loss 0.3986 (0.5573)	grad_norm 4.2060 (2.8682)	mem 20675MB
[2025-04-03 01:57:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][44/311]	eta 0:04:01 lr 0.000363	time 0.8755 (0.9042)	loss 0.5781 (0.5558)	grad_norm 1.9385 (2.8745)	mem 20675MB
[2025-04-03 01:57:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][46/311]	eta 0:03:59 lr 0.000362	time 0.8754 (0.9031)	loss 0.5669 (0.5544)	grad_norm 2.9067 (2.9347)	mem 20675MB
[2025-04-03 01:57:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][48/311]	eta 0:03:57 lr 0.000362	time 0.8760 (0.9020)	loss 0.4146 (0.5519)	grad_norm 4.0493 (2.9587)	mem 20675MB
[2025-04-03 01:57:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][50/311]	eta 0:03:55 lr 0.000361	time 0.8755 (0.9010)	loss 0.5877 (0.5506)	grad_norm 3.1072 (2.9978)	mem 20675MB
[2025-04-03 01:57:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][52/311]	eta 0:03:53 lr 0.000361	time 0.8755 (0.9001)	loss 0.3848 (0.5466)	grad_norm 3.7238 (3.0376)	mem 20675MB
[2025-04-03 01:57:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][54/311]	eta 0:03:51 lr 0.000361	time 0.8755 (0.8992)	loss 0.6655 (0.5495)	grad_norm 3.6715 (3.0425)	mem 20675MB
[2025-04-03 01:57:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][56/311]	eta 0:03:49 lr 0.000360	time 0.8756 (0.8984)	loss 0.6535 (0.5527)	grad_norm 4.0179 (3.0601)	mem 20675MB
[2025-04-03 01:57:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][58/311]	eta 0:03:47 lr 0.000360	time 0.8757 (0.8977)	loss 0.6243 (0.5551)	grad_norm 3.4111 (3.0665)	mem 20675MB
[2025-04-03 01:57:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][60/311]	eta 0:03:45 lr 0.000359	time 0.8753 (0.8970)	loss 0.4373 (0.5544)	grad_norm 4.5868 (3.0856)	mem 20675MB
[2025-04-03 01:57:45 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][62/311]	eta 0:03:43 lr 0.000359	time 0.8763 (0.8963)	loss 0.6392 (0.5547)	grad_norm 2.0784 (3.0667)	mem 20675MB
[2025-04-03 01:57:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][64/311]	eta 0:03:41 lr 0.000359	time 0.8754 (0.8957)	loss 0.6025 (0.5565)	grad_norm 1.6249 (3.0397)	mem 20675MB
[2025-04-03 01:57:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][66/311]	eta 0:03:39 lr 0.000358	time 0.8756 (0.8951)	loss 0.5682 (0.5568)	grad_norm 2.4591 (3.0235)	mem 20675MB
[2025-04-03 01:57:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][68/311]	eta 0:03:37 lr 0.000358	time 0.8754 (0.8946)	loss 0.4759 (0.5563)	grad_norm 2.8258 (3.0056)	mem 20675MB
[2025-04-03 01:57:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][70/311]	eta 0:03:35 lr 0.000358	time 0.8788 (0.8941)	loss 0.6055 (0.5578)	grad_norm 2.3435 (2.9722)	mem 20675MB
[2025-04-03 01:57:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][72/311]	eta 0:03:33 lr 0.000357	time 0.8756 (0.8937)	loss 0.5908 (0.5591)	grad_norm 1.9700 (2.9530)	mem 20675MB
[2025-04-03 01:57:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][74/311]	eta 0:03:31 lr 0.000357	time 0.8755 (0.8932)	loss 0.5386 (0.5575)	grad_norm 2.4059 (2.9463)	mem 20675MB
[2025-04-03 01:57:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][76/311]	eta 0:03:29 lr 0.000356	time 0.8756 (0.8928)	loss 0.5555 (0.5574)	grad_norm 2.5578 (2.9389)	mem 20675MB
[2025-04-03 01:57:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][78/311]	eta 0:03:27 lr 0.000356	time 0.8765 (0.8924)	loss 0.5142 (0.5572)	grad_norm 2.5390 (2.9233)	mem 20675MB
[2025-04-03 01:58:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][80/311]	eta 0:03:26 lr 0.000356	time 0.8758 (0.8920)	loss 0.6605 (0.5595)	grad_norm 3.1345 (2.9200)	mem 20675MB
[2025-04-03 01:58:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][82/311]	eta 0:03:24 lr 0.000355	time 0.8755 (0.8916)	loss 0.6069 (0.5592)	grad_norm 2.5755 (2.9124)	mem 20675MB
[2025-04-03 01:58:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][84/311]	eta 0:03:22 lr 0.000355	time 0.8755 (0.8913)	loss 0.4882 (0.5572)	grad_norm 4.4967 (2.9319)	mem 20675MB
[2025-04-03 01:58:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][86/311]	eta 0:03:20 lr 0.000355	time 0.8757 (0.8909)	loss 0.5460 (0.5574)	grad_norm 2.0040 (2.9184)	mem 20675MB
[2025-04-03 01:58:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][88/311]	eta 0:03:18 lr 0.000354	time 0.8755 (0.8906)	loss 0.6158 (0.5586)	grad_norm 2.4165 (2.9070)	mem 20675MB
[2025-04-03 01:58:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][90/311]	eta 0:03:16 lr 0.000354	time 0.8755 (0.8903)	loss 0.6159 (0.5585)	grad_norm 2.2043 (2.9250)	mem 20675MB
[2025-04-03 01:58:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][92/311]	eta 0:03:14 lr 0.000353	time 0.8757 (0.8900)	loss 0.5261 (0.5586)	grad_norm 2.5921 (2.9054)	mem 20675MB
[2025-04-03 01:58:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][94/311]	eta 0:03:13 lr 0.000353	time 0.8758 (0.8897)	loss 0.4898 (0.5574)	grad_norm 2.6212 (2.9045)	mem 20675MB
[2025-04-03 01:58:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][96/311]	eta 0:03:11 lr 0.000353	time 0.8755 (0.8894)	loss 0.4947 (0.5578)	grad_norm 3.4625 (2.9128)	mem 20675MB
[2025-04-03 01:58:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][98/311]	eta 0:03:09 lr 0.000352	time 0.8761 (0.8892)	loss 0.5903 (0.5585)	grad_norm 2.2477 (2.9039)	mem 20675MB
[2025-04-03 01:58:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][100/311]	eta 0:03:07 lr 0.000352	time 0.8754 (0.8889)	loss 0.5525 (0.5592)	grad_norm 2.1849 (2.8926)	mem 20675MB
[2025-04-03 01:58:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][102/311]	eta 0:03:05 lr 0.000352	time 0.8756 (0.8887)	loss 0.6874 (0.5601)	grad_norm 3.4747 (2.9025)	mem 20675MB
[2025-04-03 01:58:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][104/311]	eta 0:03:03 lr 0.000351	time 0.8762 (0.8885)	loss 0.5794 (0.5606)	grad_norm 2.1482 (2.9035)	mem 20675MB
[2025-04-03 01:58:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][106/311]	eta 0:03:02 lr 0.000351	time 0.8757 (0.8883)	loss 0.5285 (0.5606)	grad_norm 4.8881 (2.9175)	mem 20675MB
[2025-04-03 01:58:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][108/311]	eta 0:03:00 lr 0.000350	time 0.8756 (0.8880)	loss 0.5661 (0.5607)	grad_norm 2.5041 (2.9125)	mem 20675MB
[2025-04-03 01:58:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][110/311]	eta 0:02:58 lr 0.000350	time 0.8754 (0.8878)	loss 0.4515 (0.5593)	grad_norm 3.9543 (2.9267)	mem 20675MB
[2025-04-03 01:58:29 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][112/311]	eta 0:02:56 lr 0.000350	time 0.8757 (0.8876)	loss 0.5316 (0.5594)	grad_norm 2.5436 (2.9188)	mem 20675MB
[2025-04-03 01:58:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][114/311]	eta 0:02:54 lr 0.000349	time 0.8759 (0.8874)	loss 0.6018 (0.5599)	grad_norm 1.9800 (2.9039)	mem 20675MB
[2025-04-03 01:58:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][116/311]	eta 0:02:53 lr 0.000349	time 0.8759 (0.8872)	loss 0.6001 (0.5599)	grad_norm 1.8828 (2.8921)	mem 20675MB
[2025-04-03 01:58:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][118/311]	eta 0:02:51 lr 0.000348	time 0.8757 (0.8871)	loss 0.6139 (0.5591)	grad_norm 2.2115 (2.8909)	mem 20675MB
[2025-04-03 01:58:36 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][120/311]	eta 0:02:49 lr 0.000348	time 0.8758 (0.8869)	loss 0.5664 (0.5587)	grad_norm 2.5111 (2.8922)	mem 20675MB
[2025-04-03 01:58:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][122/311]	eta 0:02:47 lr 0.000348	time 0.8759 (0.8867)	loss 0.4142 (0.5563)	grad_norm 3.9153 (2.9060)	mem 20675MB
[2025-04-03 01:58:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][124/311]	eta 0:02:45 lr 0.000347	time 0.8755 (0.8866)	loss 0.5634 (0.5555)	grad_norm 2.1677 (2.9144)	mem 20675MB
[2025-04-03 01:58:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][126/311]	eta 0:02:43 lr 0.000347	time 0.8756 (0.8864)	loss 0.5724 (0.5559)	grad_norm 3.0122 (2.9135)	mem 20675MB
[2025-04-03 01:58:43 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][128/311]	eta 0:02:42 lr 0.000347	time 0.8758 (0.8863)	loss 0.6438 (0.5560)	grad_norm 3.5319 (2.9216)	mem 20675MB
[2025-04-03 01:58:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][130/311]	eta 0:02:40 lr 0.000346	time 0.8756 (0.8861)	loss 0.5346 (0.5558)	grad_norm 2.1223 (2.9145)	mem 20675MB
[2025-04-03 01:58:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][132/311]	eta 0:02:38 lr 0.000346	time 0.8755 (0.8860)	loss 0.4962 (0.5547)	grad_norm 2.6889 (2.9162)	mem 20675MB
[2025-04-03 01:58:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][134/311]	eta 0:02:36 lr 0.000345	time 0.8754 (0.8858)	loss 0.5134 (0.5546)	grad_norm 3.1596 (2.9237)	mem 20675MB
[2025-04-03 01:58:50 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][136/311]	eta 0:02:34 lr 0.000345	time 0.8759 (0.8857)	loss 0.6053 (0.5555)	grad_norm 3.3166 (2.9278)	mem 20675MB
[2025-04-03 01:58:52 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][138/311]	eta 0:02:33 lr 0.000345	time 0.8758 (0.8856)	loss 0.5506 (0.5552)	grad_norm 2.4401 (2.9339)	mem 20675MB
[2025-04-03 01:58:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][140/311]	eta 0:02:31 lr 0.000344	time 0.8760 (0.8854)	loss 0.6104 (0.5547)	grad_norm 3.1156 (2.9550)	mem 20675MB
[2025-04-03 01:58:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][142/311]	eta 0:02:29 lr 0.000344	time 0.8758 (0.8853)	loss 0.5903 (0.5550)	grad_norm 2.6542 (2.9547)	mem 20675MB
[2025-04-03 01:58:57 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][144/311]	eta 0:02:27 lr 0.000344	time 0.8759 (0.8852)	loss 0.5697 (0.5554)	grad_norm 3.4486 (2.9563)	mem 20675MB
[2025-04-03 01:58:59 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][146/311]	eta 0:02:26 lr 0.000343	time 0.8762 (0.8851)	loss 0.6183 (0.5558)	grad_norm 2.1096 (2.9510)	mem 20675MB
[2025-04-03 01:59:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][148/311]	eta 0:02:24 lr 0.000343	time 0.8755 (0.8850)	loss 0.4712 (0.5551)	grad_norm 4.4374 (2.9688)	mem 20675MB
[2025-04-03 01:59:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][150/311]	eta 0:02:22 lr 0.000342	time 0.8757 (0.8849)	loss 0.6228 (0.5554)	grad_norm 2.2667 (2.9672)	mem 20675MB
[2025-04-03 01:59:04 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][152/311]	eta 0:02:20 lr 0.000342	time 0.8759 (0.8847)	loss 0.5681 (0.5557)	grad_norm 2.1327 (2.9580)	mem 20675MB
[2025-04-03 01:59:06 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][154/311]	eta 0:02:18 lr 0.000342	time 0.8768 (0.8846)	loss 0.6329 (0.5563)	grad_norm 2.9351 (2.9538)	mem 20675MB
[2025-04-03 01:59:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][156/311]	eta 0:02:17 lr 0.000341	time 0.8759 (0.8846)	loss 0.5888 (0.5566)	grad_norm 1.9048 (2.9427)	mem 20675MB
[2025-04-03 01:59:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][158/311]	eta 0:02:15 lr 0.000341	time 0.8758 (0.8845)	loss 0.5711 (0.5571)	grad_norm 2.2159 (2.9373)	mem 20675MB
[2025-04-03 01:59:11 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][160/311]	eta 0:02:13 lr 0.000341	time 0.8759 (0.8844)	loss 0.5309 (0.5567)	grad_norm 2.6681 (2.9285)	mem 20675MB
[2025-04-03 01:59:13 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][162/311]	eta 0:02:11 lr 0.000340	time 0.8758 (0.8843)	loss 0.5109 (0.5565)	grad_norm 2.3446 (2.9278)	mem 20675MB
[2025-04-03 01:59:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][164/311]	eta 0:02:09 lr 0.000340	time 0.8761 (0.8842)	loss 0.4667 (0.5563)	grad_norm 3.1788 (2.9266)	mem 20675MB
[2025-04-03 01:59:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][166/311]	eta 0:02:08 lr 0.000339	time 0.8758 (0.8841)	loss 0.6479 (0.5572)	grad_norm 3.3616 (2.9248)	mem 20675MB
[2025-04-03 01:59:18 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][168/311]	eta 0:02:06 lr 0.000339	time 0.8756 (0.8840)	loss 0.5002 (0.5573)	grad_norm 2.6276 (2.9176)	mem 20675MB
[2025-04-03 01:59:20 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][170/311]	eta 0:02:04 lr 0.000339	time 0.8759 (0.8839)	loss 0.4280 (0.5570)	grad_norm 2.9594 (2.9182)	mem 20675MB
[2025-04-03 01:59:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][172/311]	eta 0:02:02 lr 0.000338	time 0.8763 (0.8838)	loss 0.5241 (0.5569)	grad_norm 2.5451 (2.9119)	mem 20675MB
[2025-04-03 01:59:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][174/311]	eta 0:02:01 lr 0.000338	time 0.8755 (0.8838)	loss 0.5545 (0.5574)	grad_norm 3.0107 (2.9068)	mem 20675MB
[2025-04-03 01:59:25 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][176/311]	eta 0:01:59 lr 0.000338	time 0.8755 (0.8837)	loss 0.6117 (0.5575)	grad_norm 2.4037 (2.9041)	mem 20675MB
[2025-04-03 01:59:27 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][178/311]	eta 0:01:57 lr 0.000337	time 0.8759 (0.8836)	loss 0.6593 (0.5585)	grad_norm 3.0902 (2.9141)	mem 20675MB
[2025-04-03 01:59:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][180/311]	eta 0:01:55 lr 0.000337	time 0.8761 (0.8835)	loss 0.6070 (0.5584)	grad_norm 2.5836 (2.9166)	mem 20675MB
[2025-04-03 01:59:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][182/311]	eta 0:01:53 lr 0.000336	time 0.8756 (0.8834)	loss 0.5501 (0.5590)	grad_norm 3.0938 (2.9139)	mem 20675MB
[2025-04-03 01:59:32 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][184/311]	eta 0:01:52 lr 0.000336	time 0.8758 (0.8834)	loss 0.4459 (0.5587)	grad_norm 4.1431 (2.9169)	mem 20675MB
[2025-04-03 01:59:34 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][186/311]	eta 0:01:50 lr 0.000336	time 0.8756 (0.8833)	loss 0.5372 (0.5584)	grad_norm 2.1262 (2.9145)	mem 20675MB
[2025-04-03 01:59:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][188/311]	eta 0:01:48 lr 0.000335	time 0.8769 (0.8832)	loss 0.6421 (0.5588)	grad_norm 1.7931 (2.9082)	mem 20675MB
[2025-04-03 01:59:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][190/311]	eta 0:01:46 lr 0.000335	time 0.8758 (0.8832)	loss 0.5579 (0.5586)	grad_norm 4.0193 (2.9252)	mem 20675MB
[2025-04-03 01:59:39 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][192/311]	eta 0:01:45 lr 0.000335	time 0.8758 (0.8831)	loss 0.4732 (0.5583)	grad_norm 2.5253 (2.9188)	mem 20675MB
[2025-04-03 01:59:41 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][194/311]	eta 0:01:43 lr 0.000334	time 0.8758 (0.8830)	loss 0.6235 (0.5588)	grad_norm 2.0216 (2.9099)	mem 20675MB
[2025-04-03 01:59:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][196/311]	eta 0:01:41 lr 0.000334	time 0.8754 (0.8830)	loss 0.4476 (0.5580)	grad_norm 3.3024 (2.9120)	mem 20675MB
[2025-04-03 01:59:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][198/311]	eta 0:01:39 lr 0.000333	time 0.8757 (0.8829)	loss 0.6020 (0.5577)	grad_norm 2.3451 (2.9121)	mem 20675MB
[2025-04-03 01:59:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][200/311]	eta 0:01:37 lr 0.000333	time 0.8757 (0.8828)	loss 0.5208 (0.5570)	grad_norm 3.6911 (2.9200)	mem 20675MB
[2025-04-03 01:59:48 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][202/311]	eta 0:01:36 lr 0.000333	time 0.8755 (0.8828)	loss 0.5223 (0.5569)	grad_norm 3.3146 (2.9210)	mem 20675MB
[2025-04-03 01:59:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][204/311]	eta 0:01:34 lr 0.000332	time 0.8755 (0.8827)	loss 0.5511 (0.5561)	grad_norm 3.0295 (2.9300)	mem 20675MB
[2025-04-03 01:59:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][206/311]	eta 0:01:32 lr 0.000332	time 0.8755 (0.8827)	loss 0.5653 (0.5566)	grad_norm 2.8076 (2.9293)	mem 20675MB
[2025-04-03 01:59:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][208/311]	eta 0:01:30 lr 0.000332	time 0.8759 (0.8826)	loss 0.5695 (0.5569)	grad_norm 2.1052 (2.9212)	mem 20675MB
[2025-04-03 01:59:55 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][210/311]	eta 0:01:29 lr 0.000331	time 0.8758 (0.8825)	loss 0.5350 (0.5569)	grad_norm 3.6598 (2.9216)	mem 20675MB
[2025-04-03 01:59:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][212/311]	eta 0:01:27 lr 0.000331	time 0.8767 (0.8825)	loss 0.4803 (0.5570)	grad_norm 3.1608 (2.9190)	mem 20675MB
[2025-04-03 01:59:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][214/311]	eta 0:01:25 lr 0.000331	time 0.8756 (0.8824)	loss 0.5937 (0.5569)	grad_norm 2.2810 (2.9220)	mem 20675MB
[2025-04-03 02:00:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][216/311]	eta 0:01:23 lr 0.000330	time 0.8757 (0.8824)	loss 0.5100 (0.5565)	grad_norm 3.0785 (2.9272)	mem 20675MB
[2025-04-03 02:00:02 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][218/311]	eta 0:01:22 lr 0.000330	time 0.8754 (0.8823)	loss 0.5305 (0.5564)	grad_norm 3.3212 (2.9258)	mem 20675MB
[2025-04-03 02:00:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][220/311]	eta 0:01:20 lr 0.000329	time 0.8755 (0.8823)	loss 0.4958 (0.5564)	grad_norm 3.4490 (2.9277)	mem 20675MB
[2025-04-03 02:00:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][222/311]	eta 0:01:18 lr 0.000329	time 0.8756 (0.8822)	loss 0.6014 (0.5570)	grad_norm 2.8038 (2.9281)	mem 20675MB
[2025-04-03 02:00:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][224/311]	eta 0:01:16 lr 0.000329	time 0.8764 (0.8822)	loss 0.4833 (0.5568)	grad_norm 2.9299 (2.9264)	mem 20675MB
[2025-04-03 02:00:09 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][226/311]	eta 0:01:14 lr 0.000328	time 0.8768 (0.8821)	loss 0.4437 (0.5560)	grad_norm 3.4445 (2.9393)	mem 20675MB
[2025-04-03 02:00:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][228/311]	eta 0:01:13 lr 0.000328	time 0.8759 (0.8821)	loss 0.5434 (0.5555)	grad_norm 2.7699 (2.9604)	mem 20675MB
[2025-04-03 02:00:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][230/311]	eta 0:01:11 lr 0.000328	time 0.8764 (0.8820)	loss 0.5664 (0.5552)	grad_norm 2.0974 (2.9579)	mem 20675MB
[2025-04-03 02:00:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][232/311]	eta 0:01:09 lr 0.000327	time 0.8755 (0.8820)	loss 0.4606 (0.5548)	grad_norm 4.0399 (2.9661)	mem 20675MB
[2025-04-03 02:00:16 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][234/311]	eta 0:01:07 lr 0.000327	time 0.8759 (0.8820)	loss 0.5792 (0.5547)	grad_norm 2.6878 (2.9679)	mem 20675MB
[2025-04-03 02:00:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][236/311]	eta 0:01:06 lr 0.000326	time 0.8772 (0.8819)	loss 0.5959 (0.5548)	grad_norm 2.9105 (2.9674)	mem 20675MB
[2025-04-03 02:00:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][238/311]	eta 0:01:04 lr 0.000326	time 0.8756 (0.8819)	loss 0.5087 (0.5543)	grad_norm 2.6994 (2.9689)	mem 20675MB
[2025-04-03 02:00:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][240/311]	eta 0:01:02 lr 0.000326	time 0.8757 (0.8818)	loss 0.4619 (0.5540)	grad_norm 5.3758 (2.9788)	mem 20675MB
[2025-04-03 02:00:23 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][242/311]	eta 0:01:00 lr 0.000325	time 0.8757 (0.8818)	loss 0.5365 (0.5543)	grad_norm 3.3592 (2.9817)	mem 20675MB
[2025-04-03 02:00:24 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][244/311]	eta 0:00:59 lr 0.000325	time 0.8762 (0.8818)	loss 0.5569 (0.5547)	grad_norm 3.4362 (2.9825)	mem 20675MB
[2025-04-03 02:00:26 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][246/311]	eta 0:00:57 lr 0.000325	time 0.8782 (0.8817)	loss 0.6224 (0.5553)	grad_norm 2.5558 (2.9769)	mem 20675MB
[2025-04-03 02:00:28 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][248/311]	eta 0:00:55 lr 0.000324	time 0.8756 (0.8817)	loss 0.5772 (0.5551)	grad_norm 2.9208 (2.9811)	mem 20675MB
[2025-04-03 02:00:30 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][250/311]	eta 0:00:53 lr 0.000324	time 0.8759 (0.8816)	loss 0.5103 (0.5546)	grad_norm 2.8746 (2.9841)	mem 20675MB
[2025-04-03 02:00:31 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][252/311]	eta 0:00:52 lr 0.000323	time 0.8759 (0.8816)	loss 0.5483 (0.5548)	grad_norm 2.3439 (2.9802)	mem 20675MB
[2025-04-03 02:00:33 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][254/311]	eta 0:00:50 lr 0.000323	time 0.8764 (0.8816)	loss 0.5595 (0.5550)	grad_norm 2.1423 (2.9743)	mem 20675MB
[2025-04-03 02:00:35 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][256/311]	eta 0:00:48 lr 0.000323	time 0.8786 (0.8816)	loss 0.6602 (0.5556)	grad_norm 2.2129 (2.9678)	mem 20675MB
[2025-04-03 02:00:37 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][258/311]	eta 0:00:46 lr 0.000322	time 0.8760 (0.8815)	loss 0.5518 (0.5555)	grad_norm 2.3399 (2.9635)	mem 20675MB
[2025-04-03 02:00:38 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][260/311]	eta 0:00:44 lr 0.000322	time 0.8760 (0.8815)	loss 0.5237 (0.5552)	grad_norm 2.5484 (2.9604)	mem 20675MB
[2025-04-03 02:00:40 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][262/311]	eta 0:00:43 lr 0.000322	time 0.8768 (0.8814)	loss 0.6008 (0.5558)	grad_norm 1.6348 (2.9535)	mem 20675MB
[2025-04-03 02:00:42 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][264/311]	eta 0:00:41 lr 0.000321	time 0.8762 (0.8814)	loss 0.6107 (0.5561)	grad_norm 2.5274 (2.9492)	mem 20675MB
[2025-04-03 02:00:44 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][266/311]	eta 0:00:39 lr 0.000321	time 0.8761 (0.8814)	loss 0.4993 (0.5562)	grad_norm 2.7471 (2.9450)	mem 20675MB
[2025-04-03 02:00:46 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][268/311]	eta 0:00:37 lr 0.000321	time 0.8787 (0.8814)	loss 0.5499 (0.5563)	grad_norm 2.4385 (2.9403)	mem 20675MB
[2025-04-03 02:00:47 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][270/311]	eta 0:00:36 lr 0.000320	time 0.8768 (0.8814)	loss 0.6580 (0.5566)	grad_norm 2.0617 (2.9376)	mem 20675MB
[2025-04-03 02:00:49 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][272/311]	eta 0:00:34 lr 0.000320	time 0.8762 (0.8814)	loss 0.5367 (0.5567)	grad_norm 3.0691 (2.9360)	mem 20675MB
[2025-04-03 02:00:51 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][274/311]	eta 0:00:32 lr 0.000319	time 0.8763 (0.8813)	loss 0.4097 (0.5558)	grad_norm 3.6280 (2.9395)	mem 20675MB
[2025-04-03 02:00:53 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][276/311]	eta 0:00:30 lr 0.000319	time 0.8766 (0.8813)	loss 0.5203 (0.5559)	grad_norm 2.1950 (2.9349)	mem 20675MB
[2025-04-03 02:00:54 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][278/311]	eta 0:00:29 lr 0.000319	time 0.8762 (0.8813)	loss 0.5796 (0.5561)	grad_norm 2.7562 (2.9386)	mem 20675MB
[2025-04-03 02:00:56 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][280/311]	eta 0:00:27 lr 0.000318	time 0.8762 (0.8812)	loss 0.5660 (0.5563)	grad_norm 2.4925 (2.9399)	mem 20675MB
[2025-04-03 02:00:58 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][282/311]	eta 0:00:25 lr 0.000318	time 0.8762 (0.8812)	loss 0.5353 (0.5563)	grad_norm 2.4779 (2.9369)	mem 20675MB
[2025-04-03 02:01:00 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][284/311]	eta 0:00:23 lr 0.000318	time 0.8787 (0.8812)	loss 0.5990 (0.5560)	grad_norm 2.9569 (2.9437)	mem 20675MB
[2025-04-03 02:01:01 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][286/311]	eta 0:00:22 lr 0.000317	time 0.8767 (0.8812)	loss 0.6027 (0.5559)	grad_norm 2.1676 (2.9442)	mem 20675MB
[2025-04-03 02:01:03 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][288/311]	eta 0:00:20 lr 0.000317	time 0.8762 (0.8811)	loss 0.6049 (0.5562)	grad_norm 2.5756 (2.9424)	mem 20675MB
[2025-04-03 02:01:05 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][290/311]	eta 0:00:18 lr 0.000317	time 0.8794 (0.8811)	loss 0.5406 (0.5562)	grad_norm 4.1710 (2.9451)	mem 20675MB
[2025-04-03 02:01:07 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][292/311]	eta 0:00:16 lr 0.000316	time 0.8764 (0.8811)	loss 0.6584 (0.5565)	grad_norm 3.3922 (2.9462)	mem 20675MB
[2025-04-03 02:01:08 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][294/311]	eta 0:00:14 lr 0.000316	time 0.8760 (0.8811)	loss 0.5074 (0.5565)	grad_norm 4.9159 (2.9512)	mem 20675MB
[2025-04-03 02:01:10 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][296/311]	eta 0:00:13 lr 0.000315	time 0.8758 (0.8811)	loss 0.5399 (0.5562)	grad_norm 3.2251 (2.9564)	mem 20675MB
[2025-04-03 02:01:12 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][298/311]	eta 0:00:11 lr 0.000315	time 0.8760 (0.8811)	loss 0.5326 (0.5561)	grad_norm 2.4658 (2.9569)	mem 20675MB
[2025-04-03 02:01:14 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][300/311]	eta 0:00:09 lr 0.000315	time 0.8760 (0.8810)	loss 0.4887 (0.5556)	grad_norm 4.6041 (2.9654)	mem 20675MB
[2025-04-03 02:01:15 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][302/311]	eta 0:00:07 lr 0.000314	time 0.8759 (0.8810)	loss 0.4272 (0.5552)	grad_norm 3.1187 (2.9638)	mem 20675MB
[2025-04-03 02:01:17 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][304/311]	eta 0:00:06 lr 0.000314	time 0.8760 (0.8810)	loss 0.6426 (0.5553)	grad_norm 3.0464 (2.9678)	mem 20675MB
[2025-04-03 02:01:19 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][306/311]	eta 0:00:04 lr 0.000314	time 0.8761 (0.8810)	loss 0.5493 (0.5549)	grad_norm 2.6505 (2.9696)	mem 20675MB
[2025-04-03 02:01:21 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][308/311]	eta 0:00:02 lr 0.000313	time 0.8759 (0.8809)	loss 0.6185 (0.5551)	grad_norm 3.0103 (2.9721)	mem 20675MB
[2025-04-03 02:01:22 simmim_finetune] (main_finetune.py 252): INFO Train: [19/30][310/311]	eta 0:00:00 lr 0.000313	time 0.8768 (0.8809)	loss 0.6402 (0.5554)	grad_norm 2.8092 (2.9708)	mem 20675MB
[2025-04-03 02:01:23 simmim_finetune] (main_finetune.py 260): INFO EPOCH 19 training takes 0:04:34
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.531 (1.531)	Loss 0.5523 (0.5523)	Acc@1 75.000 (75.000)	Mem 20675MB
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.761
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.8%
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:01:24 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.34351428530427e-06, 1.34351428530427e-06, 1.9660031451958024e-06, 1.9660031451958024e-06, 2.923678314259698e-06, 2.923678314259698e-06, 4.397024728204154e-06, 4.397024728204154e-06, 6.66371151888793e-06, 6.66371151888793e-06, 1.015092196609374e-05, 1.015092196609374e-05, 1.5515861115641137e-05, 1.5515861115641137e-05, 2.3769613653406365e-05, 2.3769613653406365e-05, 3.6467694480737486e-05, 3.6467694480737486e-05, 5.6003203445862295e-05, 5.6003203445862295e-05, 8.605783262297738e-05, 8.605783262297738e-05, 0.0001322957236646929, 0.0001322957236646929, 0.00020343094065194752, 0.00020343094065194752, 0.0003128697360169546, 0.0003128697360169546]
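The 28 learning rates printed above come in 14 pairs (one weight-decay group and one no-weight-decay group per layer id), which matches the config's `TRAIN.LAYER_DECAY: 0.65` applied across a 12-block ViT. The following is a minimal sketch of that scaling rule — not the repo's exact code, and the exact group layout (patch embed, 12 blocks, head) is an assumption inferred from the 14 distinct values in the log:

```python
# Hedged sketch: how layer-wise LR decay (LAYER_DECAY = 0.65) produces
# monotonically increasing LRs across the 14 "layer ids" of a 12-block
# ViT -- patch embedding (id 0), blocks 1..12, classification head (id 13).
# Deeper layers get a larger scale, so the head trains fastest.

def layerwise_lr_scales(num_blocks: int, decay: float) -> list[float]:
    num_ids = num_blocks + 2  # patch embed + transformer blocks + head
    # scale for id i is decay^(distance from the head)
    return [decay ** (num_ids - 1 - i) for i in range(num_ids)]

scales = layerwise_lr_scales(12, 0.65)
# Each scale appears twice in the printed list: once for the parameters
# with weight decay and once for the no-decay group (biases, norms).
```

Consistent with this, the two largest distinct LRs in the log differ by roughly 1/0.65: 0.0003128697 / 0.00020343 ≈ 1.538. The ratios between the smallest groups are slightly lower, plausibly because the cosine schedule's `MIN_LR` floor adds a shared offset — that explanation is an inference, not confirmed by the log.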
[2025-04-03 02:01:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][0/311]	eta 0:11:33 lr 0.000313	time 2.2298 (2.2298)	loss 0.5519 (0.5519)	grad_norm 3.1437 (3.1437)	mem 20675MB
[2025-04-03 02:01:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][2/311]	eta 0:06:50 lr 0.000312	time 0.8760 (1.3282)	loss 0.6144 (0.5735)	grad_norm 2.2882 (2.6519)	mem 20675MB
[2025-04-03 02:01:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][4/311]	eta 0:05:52 lr 0.000312	time 0.8761 (1.1477)	loss 0.5392 (0.5670)	grad_norm 2.8118 (2.6827)	mem 20675MB
[2025-04-03 02:01:32 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][6/311]	eta 0:05:26 lr 0.000312	time 0.8761 (1.0704)	loss 0.6408 (0.5594)	grad_norm 2.1201 (2.7900)	mem 20675MB
[2025-04-03 02:01:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][8/311]	eta 0:05:11 lr 0.000311	time 0.8760 (1.0274)	loss 0.5545 (0.5471)	grad_norm 3.7287 (3.0655)	mem 20675MB
[2025-04-03 02:01:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][10/311]	eta 0:05:01 lr 0.000311	time 0.8766 (1.0001)	loss 0.5145 (0.5513)	grad_norm 2.9918 (3.0172)	mem 20675MB
[2025-04-03 02:01:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][12/311]	eta 0:04:53 lr 0.000311	time 0.8761 (0.9812)	loss 0.5689 (0.5484)	grad_norm 2.4849 (2.9482)	mem 20675MB
[2025-04-03 02:01:39 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][14/311]	eta 0:04:47 lr 0.000310	time 0.8760 (0.9675)	loss 0.5922 (0.5532)	grad_norm 2.7480 (2.8693)	mem 20675MB
[2025-04-03 02:01:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][16/311]	eta 0:04:42 lr 0.000310	time 0.8761 (0.9568)	loss 0.5884 (0.5573)	grad_norm 2.3776 (2.8084)	mem 20675MB
[2025-04-03 02:01:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][18/311]	eta 0:04:37 lr 0.000309	time 0.8761 (0.9484)	loss 0.4963 (0.5566)	grad_norm 2.0196 (2.7506)	mem 20675MB
[2025-04-03 02:01:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][20/311]	eta 0:04:34 lr 0.000309	time 0.8756 (0.9416)	loss 0.5832 (0.5577)	grad_norm 2.1422 (2.7129)	mem 20675MB
[2025-04-03 02:01:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][22/311]	eta 0:04:30 lr 0.000309	time 0.8759 (0.9360)	loss 0.6224 (0.5615)	grad_norm 2.6794 (2.7014)	mem 20675MB
[2025-04-03 02:01:48 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][24/311]	eta 0:04:27 lr 0.000308	time 0.8759 (0.9313)	loss 0.4144 (0.5551)	grad_norm 4.0316 (2.7396)	mem 20675MB
[2025-04-03 02:01:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][26/311]	eta 0:04:24 lr 0.000308	time 0.8758 (0.9272)	loss 0.5904 (0.5596)	grad_norm 2.4015 (2.7129)	mem 20675MB
[2025-04-03 02:01:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][28/311]	eta 0:04:21 lr 0.000308	time 0.8757 (0.9237)	loss 0.5778 (0.5629)	grad_norm 2.2756 (2.6928)	mem 20675MB
[2025-04-03 02:01:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][30/311]	eta 0:04:18 lr 0.000307	time 0.8759 (0.9207)	loss 0.5926 (0.5636)	grad_norm 2.3848 (2.6694)	mem 20675MB
[2025-04-03 02:01:55 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][32/311]	eta 0:04:16 lr 0.000307	time 0.8758 (0.9180)	loss 0.6061 (0.5665)	grad_norm 2.8971 (2.6712)	mem 20675MB
[2025-04-03 02:01:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][34/311]	eta 0:04:13 lr 0.000307	time 0.8756 (0.9157)	loss 0.4589 (0.5611)	grad_norm 3.1205 (2.6811)	mem 20675MB
[2025-04-03 02:01:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][36/311]	eta 0:04:11 lr 0.000306	time 0.8757 (0.9136)	loss 0.5603 (0.5608)	grad_norm 3.9556 (2.7130)	mem 20675MB
[2025-04-03 02:02:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][38/311]	eta 0:04:08 lr 0.000306	time 0.8756 (0.9117)	loss 0.4850 (0.5586)	grad_norm 2.3188 (2.7167)	mem 20675MB
[2025-04-03 02:02:02 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][40/311]	eta 0:04:06 lr 0.000305	time 0.8765 (0.9100)	loss 0.5881 (0.5582)	grad_norm 2.7996 (2.7298)	mem 20675MB
[2025-04-03 02:02:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][42/311]	eta 0:04:04 lr 0.000305	time 0.8756 (0.9084)	loss 0.6364 (0.5620)	grad_norm 2.5536 (2.7093)	mem 20675MB
[2025-04-03 02:02:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][44/311]	eta 0:04:02 lr 0.000305	time 0.8755 (0.9070)	loss 0.6612 (0.5658)	grad_norm 2.6909 (2.7008)	mem 20675MB
[2025-04-03 02:02:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][46/311]	eta 0:04:00 lr 0.000304	time 0.8755 (0.9057)	loss 0.5718 (0.5680)	grad_norm 2.9603 (2.7348)	mem 20675MB
[2025-04-03 02:02:09 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][48/311]	eta 0:03:57 lr 0.000304	time 0.8760 (0.9045)	loss 0.4981 (0.5663)	grad_norm 2.7147 (2.7223)	mem 20675MB
[2025-04-03 02:02:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][50/311]	eta 0:03:55 lr 0.000304	time 0.8759 (0.9035)	loss 0.5620 (0.5668)	grad_norm 2.5039 (2.7261)	mem 20675MB
[2025-04-03 02:02:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][52/311]	eta 0:03:53 lr 0.000303	time 0.8765 (0.9025)	loss 0.5805 (0.5663)	grad_norm 1.9134 (2.7534)	mem 20675MB
[2025-04-03 02:02:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][54/311]	eta 0:03:51 lr 0.000303	time 0.8783 (0.9016)	loss 0.5385 (0.5661)	grad_norm 3.9868 (2.7701)	mem 20675MB
[2025-04-03 02:02:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][56/311]	eta 0:03:49 lr 0.000303	time 0.8760 (0.9007)	loss 0.5628 (0.5649)	grad_norm 2.8611 (2.7581)	mem 20675MB
[2025-04-03 02:02:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][58/311]	eta 0:03:47 lr 0.000302	time 0.8759 (0.8999)	loss 0.3770 (0.5614)	grad_norm 3.1069 (2.7496)	mem 20675MB
[2025-04-03 02:02:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][60/311]	eta 0:03:45 lr 0.000302	time 0.8764 (0.8992)	loss 0.6315 (0.5626)	grad_norm 2.1933 (2.7374)	mem 20675MB
[2025-04-03 02:02:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][62/311]	eta 0:03:43 lr 0.000301	time 0.8759 (0.8985)	loss 0.5942 (0.5608)	grad_norm 2.3017 (2.7834)	mem 20675MB
[2025-04-03 02:02:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][64/311]	eta 0:03:41 lr 0.000301	time 0.8758 (0.8978)	loss 0.5688 (0.5614)	grad_norm 2.5316 (2.7747)	mem 20675MB
[2025-04-03 02:02:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][66/311]	eta 0:03:39 lr 0.000301	time 0.8762 (0.8972)	loss 0.6152 (0.5615)	grad_norm 2.7746 (2.7717)	mem 20675MB
[2025-04-03 02:02:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][68/311]	eta 0:03:37 lr 0.000300	time 0.8755 (0.8966)	loss 0.5386 (0.5614)	grad_norm 2.4736 (2.7709)	mem 20675MB
[2025-04-03 02:02:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][70/311]	eta 0:03:35 lr 0.000300	time 0.8759 (0.8960)	loss 0.5817 (0.5622)	grad_norm 2.6440 (2.7665)	mem 20675MB
[2025-04-03 02:02:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][72/311]	eta 0:03:34 lr 0.000300	time 0.8757 (0.8955)	loss 0.6828 (0.5636)	grad_norm 2.4977 (2.7632)	mem 20675MB
[2025-04-03 02:02:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][74/311]	eta 0:03:32 lr 0.000299	time 0.8757 (0.8950)	loss 0.5914 (0.5640)	grad_norm 2.5395 (2.7537)	mem 20675MB
[2025-04-03 02:02:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][76/311]	eta 0:03:30 lr 0.000299	time 0.8753 (0.8945)	loss 0.5243 (0.5634)	grad_norm 3.1048 (2.7568)	mem 20675MB
[2025-04-03 02:02:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][78/311]	eta 0:03:28 lr 0.000299	time 0.8763 (0.8941)	loss 0.6156 (0.5637)	grad_norm 2.0448 (2.7453)	mem 20675MB
[2025-04-03 02:02:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][80/311]	eta 0:03:26 lr 0.000298	time 0.8755 (0.8936)	loss 0.6012 (0.5633)	grad_norm 2.4812 (2.7437)	mem 20675MB
[2025-04-03 02:02:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][82/311]	eta 0:03:24 lr 0.000298	time 0.8764 (0.8932)	loss 0.6233 (0.5639)	grad_norm 2.6044 (2.7452)	mem 20675MB
[2025-04-03 02:02:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][84/311]	eta 0:03:22 lr 0.000298	time 0.8756 (0.8928)	loss 0.6383 (0.5658)	grad_norm 2.3111 (2.7445)	mem 20675MB
[2025-04-03 02:02:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][86/311]	eta 0:03:20 lr 0.000297	time 0.8757 (0.8925)	loss 0.6258 (0.5668)	grad_norm 2.8630 (2.7420)	mem 20675MB
[2025-04-03 02:02:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][88/311]	eta 0:03:18 lr 0.000297	time 0.8756 (0.8921)	loss 0.6318 (0.5678)	grad_norm 2.1952 (2.7323)	mem 20675MB
[2025-04-03 02:02:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][90/311]	eta 0:03:17 lr 0.000296	time 0.8756 (0.8918)	loss 0.6450 (0.5682)	grad_norm 3.1026 (2.7470)	mem 20675MB
[2025-04-03 02:02:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][92/311]	eta 0:03:15 lr 0.000296	time 0.8754 (0.8914)	loss 0.5318 (0.5669)	grad_norm 2.5961 (2.7432)	mem 20675MB
[2025-04-03 02:02:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][94/311]	eta 0:03:13 lr 0.000296	time 0.8758 (0.8911)	loss 0.4682 (0.5663)	grad_norm 3.1242 (2.7443)	mem 20675MB
[2025-04-03 02:02:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][96/311]	eta 0:03:11 lr 0.000295	time 0.8756 (0.8908)	loss 0.6212 (0.5668)	grad_norm 2.3229 (2.7337)	mem 20675MB
[2025-04-03 02:02:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][98/311]	eta 0:03:09 lr 0.000295	time 0.8755 (0.8905)	loss 0.5872 (0.5672)	grad_norm 2.5927 (2.7224)	mem 20675MB
[2025-04-03 02:02:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][100/311]	eta 0:03:07 lr 0.000295	time 0.8758 (0.8903)	loss 0.4998 (0.5653)	grad_norm 3.1839 (2.7215)	mem 20675MB
[2025-04-03 02:02:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][102/311]	eta 0:03:06 lr 0.000294	time 0.8755 (0.8900)	loss 0.6113 (0.5664)	grad_norm 2.8780 (2.7333)	mem 20675MB
[2025-04-03 02:02:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][104/311]	eta 0:03:04 lr 0.000294	time 0.8758 (0.8897)	loss 0.6031 (0.5674)	grad_norm 2.2664 (2.7207)	mem 20675MB
[2025-04-03 02:02:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][106/311]	eta 0:03:02 lr 0.000294	time 0.8757 (0.8895)	loss 0.5640 (0.5664)	grad_norm 2.5289 (2.7208)	mem 20675MB
[2025-04-03 02:03:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][108/311]	eta 0:03:00 lr 0.000293	time 0.8758 (0.8893)	loss 0.4967 (0.5651)	grad_norm 3.3721 (2.7342)	mem 20675MB
[2025-04-03 02:03:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][110/311]	eta 0:02:58 lr 0.000293	time 0.8754 (0.8890)	loss 0.6199 (0.5640)	grad_norm 2.7699 (2.7366)	mem 20675MB
[2025-04-03 02:03:05 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][112/311]	eta 0:02:56 lr 0.000293	time 0.8754 (0.8888)	loss 0.5780 (0.5640)	grad_norm 2.9638 (2.7468)	mem 20675MB
[2025-04-03 02:03:06 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][114/311]	eta 0:02:55 lr 0.000292	time 0.8755 (0.8886)	loss 0.5890 (0.5645)	grad_norm 2.4205 (2.7451)	mem 20675MB
[2025-04-03 02:03:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][116/311]	eta 0:02:53 lr 0.000292	time 0.8757 (0.8884)	loss 0.5895 (0.5652)	grad_norm 3.1278 (2.7422)	mem 20675MB
[2025-04-03 02:03:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][118/311]	eta 0:02:51 lr 0.000291	time 0.8785 (0.8882)	loss 0.6068 (0.5658)	grad_norm 2.6764 (2.7466)	mem 20675MB
[2025-04-03 02:03:12 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][120/311]	eta 0:02:49 lr 0.000291	time 0.8758 (0.8880)	loss 0.4624 (0.5645)	grad_norm 3.6338 (2.7526)	mem 20675MB
[2025-04-03 02:03:13 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][122/311]	eta 0:02:47 lr 0.000291	time 0.8756 (0.8878)	loss 0.6018 (0.5651)	grad_norm 3.1268 (2.7503)	mem 20675MB
[2025-04-03 02:03:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][124/311]	eta 0:02:45 lr 0.000290	time 0.8757 (0.8877)	loss 0.6091 (0.5648)	grad_norm 2.4115 (2.7577)	mem 20675MB
[2025-04-03 02:03:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][126/311]	eta 0:02:44 lr 0.000290	time 0.8755 (0.8875)	loss 0.5486 (0.5651)	grad_norm 2.7953 (2.7541)	mem 20675MB
[2025-04-03 02:03:19 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][128/311]	eta 0:02:42 lr 0.000290	time 0.8756 (0.8873)	loss 0.5947 (0.5654)	grad_norm 2.3531 (2.7541)	mem 20675MB
[2025-04-03 02:03:20 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][130/311]	eta 0:02:40 lr 0.000289	time 0.8759 (0.8871)	loss 0.4327 (0.5633)	grad_norm 3.0367 (2.7642)	mem 20675MB
[2025-04-03 02:03:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][132/311]	eta 0:02:38 lr 0.000289	time 0.8756 (0.8870)	loss 0.4699 (0.5619)	grad_norm 2.8903 (2.7760)	mem 20675MB
[2025-04-03 02:03:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][134/311]	eta 0:02:36 lr 0.000289	time 0.8754 (0.8868)	loss 0.6733 (0.5636)	grad_norm 2.4870 (2.7748)	mem 20675MB
[2025-04-03 02:03:26 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][136/311]	eta 0:02:35 lr 0.000288	time 0.8755 (0.8867)	loss 0.5349 (0.5632)	grad_norm 2.5201 (2.7824)	mem 20675MB
[2025-04-03 02:03:27 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][138/311]	eta 0:02:33 lr 0.000288	time 0.8754 (0.8865)	loss 0.5895 (0.5626)	grad_norm 2.3617 (2.7890)	mem 20675MB
[2025-04-03 02:03:29 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][140/311]	eta 0:02:31 lr 0.000288	time 0.8757 (0.8864)	loss 0.6431 (0.5638)	grad_norm 3.2827 (2.7939)	mem 20675MB
[2025-04-03 02:03:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][142/311]	eta 0:02:29 lr 0.000287	time 0.8755 (0.8863)	loss 0.6098 (0.5644)	grad_norm 2.7574 (2.7913)	mem 20675MB
[2025-04-03 02:03:33 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][144/311]	eta 0:02:27 lr 0.000287	time 0.8759 (0.8861)	loss 0.5679 (0.5639)	grad_norm 3.1691 (2.8018)	mem 20675MB
[2025-04-03 02:03:34 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][146/311]	eta 0:02:26 lr 0.000286	time 0.8754 (0.8860)	loss 0.4802 (0.5639)	grad_norm 3.3994 (2.8083)	mem 20675MB
[2025-04-03 02:03:36 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][148/311]	eta 0:02:24 lr 0.000286	time 0.8754 (0.8859)	loss 0.4984 (0.5637)	grad_norm 2.4630 (2.8035)	mem 20675MB
[2025-04-03 02:03:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][150/311]	eta 0:02:22 lr 0.000286	time 0.8754 (0.8857)	loss 0.5673 (0.5637)	grad_norm 2.7094 (2.7984)	mem 20675MB
[2025-04-03 02:03:40 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][152/311]	eta 0:02:20 lr 0.000285	time 0.8754 (0.8856)	loss 0.4909 (0.5633)	grad_norm 3.7133 (2.8018)	mem 20675MB
[2025-04-03 02:03:41 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][154/311]	eta 0:02:19 lr 0.000285	time 0.8754 (0.8855)	loss 0.6078 (0.5640)	grad_norm 2.2802 (2.7987)	mem 20675MB
[2025-04-03 02:03:43 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][156/311]	eta 0:02:17 lr 0.000285	time 0.8757 (0.8854)	loss 0.5323 (0.5641)	grad_norm 2.2650 (2.7903)	mem 20675MB
[2025-04-03 02:03:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][158/311]	eta 0:02:15 lr 0.000284	time 0.8756 (0.8853)	loss 0.5920 (0.5641)	grad_norm 2.1164 (2.7842)	mem 20675MB
[2025-04-03 02:03:47 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][160/311]	eta 0:02:13 lr 0.000284	time 0.8755 (0.8852)	loss 0.5677 (0.5638)	grad_norm 2.9233 (2.7851)	mem 20675MB
[2025-04-03 02:03:48 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][162/311]	eta 0:02:11 lr 0.000284	time 0.8760 (0.8851)	loss 0.5227 (0.5633)	grad_norm 3.1057 (2.7881)	mem 20675MB
[2025-04-03 02:03:50 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][164/311]	eta 0:02:10 lr 0.000283	time 0.8757 (0.8850)	loss 0.5062 (0.5634)	grad_norm 2.9064 (2.7864)	mem 20675MB
[2025-04-03 02:03:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][166/311]	eta 0:02:08 lr 0.000283	time 0.8754 (0.8848)	loss 0.4972 (0.5633)	grad_norm 2.8662 (2.7920)	mem 20675MB
[2025-04-03 02:03:54 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][168/311]	eta 0:02:06 lr 0.000283	time 0.8753 (0.8847)	loss 0.4883 (0.5621)	grad_norm 3.0727 (2.8030)	mem 20675MB
[2025-04-03 02:03:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][170/311]	eta 0:02:04 lr 0.000282	time 0.8758 (0.8847)	loss 0.5733 (0.5627)	grad_norm 2.0695 (2.7966)	mem 20675MB
[2025-04-03 02:03:57 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][172/311]	eta 0:02:02 lr 0.000282	time 0.8755 (0.8846)	loss 0.5039 (0.5628)	grad_norm 2.5463 (2.7985)	mem 20675MB
[2025-04-03 02:03:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][174/311]	eta 0:02:01 lr 0.000282	time 0.8754 (0.8845)	loss 0.5404 (0.5625)	grad_norm 2.5520 (2.7932)	mem 20675MB
[2025-04-03 02:04:01 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][176/311]	eta 0:01:59 lr 0.000281	time 0.8757 (0.8844)	loss 0.6042 (0.5625)	grad_norm 2.6605 (2.7985)	mem 20675MB
[2025-04-03 02:04:03 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][178/311]	eta 0:01:57 lr 0.000281	time 0.8758 (0.8843)	loss 0.5747 (0.5623)	grad_norm 3.0691 (2.8030)	mem 20675MB
[2025-04-03 02:04:04 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][180/311]	eta 0:01:55 lr 0.000280	time 0.8758 (0.8842)	loss 0.4522 (0.5618)	grad_norm 5.7708 (2.8194)	mem 20675MB
[2025-04-03 02:04:06 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][182/311]	eta 0:01:54 lr 0.000280	time 0.8758 (0.8841)	loss 0.5043 (0.5617)	grad_norm 2.5952 (2.8181)	mem 20675MB
[2025-04-03 02:04:08 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][184/311]	eta 0:01:52 lr 0.000280	time 0.8757 (0.8840)	loss 0.5475 (0.5621)	grad_norm 2.2874 (2.8135)	mem 20675MB
[2025-04-03 02:04:10 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][186/311]	eta 0:01:50 lr 0.000279	time 0.8778 (0.8840)	loss 0.4176 (0.5619)	grad_norm 4.3880 (2.8202)	mem 20675MB
[2025-04-03 02:04:11 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][188/311]	eta 0:01:48 lr 0.000279	time 0.8758 (0.8839)	loss 0.5507 (0.5618)	grad_norm 3.2932 (2.8197)	mem 20675MB
[2025-04-03 02:04:13 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][190/311]	eta 0:01:46 lr 0.000279	time 0.8755 (0.8838)	loss 0.5253 (0.5613)	grad_norm 3.3731 (2.8257)	mem 20675MB
[2025-04-03 02:04:15 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][192/311]	eta 0:01:45 lr 0.000278	time 0.8756 (0.8837)	loss 0.5840 (0.5618)	grad_norm 2.2359 (2.8211)	mem 20675MB
[2025-04-03 02:04:17 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][194/311]	eta 0:01:43 lr 0.000278	time 0.8756 (0.8837)	loss 0.5820 (0.5621)	grad_norm 1.9758 (2.8183)	mem 20675MB
[2025-04-03 02:04:18 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][196/311]	eta 0:01:41 lr 0.000278	time 0.8757 (0.8836)	loss 0.5269 (0.5625)	grad_norm 2.9180 (2.8142)	mem 20675MB
[2025-04-03 02:04:20 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][198/311]	eta 0:01:39 lr 0.000277	time 0.8754 (0.8835)	loss 0.6496 (0.5632)	grad_norm 1.9411 (2.8063)	mem 20675MB
[2025-04-03 02:04:22 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][200/311]	eta 0:01:38 lr 0.000277	time 0.8758 (0.8835)	loss 0.6365 (0.5638)	grad_norm 2.0699 (2.7993)	mem 20675MB
[2025-04-03 02:04:24 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][202/311]	eta 0:01:36 lr 0.000277	time 0.8754 (0.8834)	loss 0.5671 (0.5635)	grad_norm 2.3018 (2.8009)	mem 20675MB
[2025-04-03 02:04:25 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][204/311]	eta 0:01:34 lr 0.000276	time 0.8755 (0.8833)	loss 0.5232 (0.5631)	grad_norm 2.5454 (2.8021)	mem 20675MB
[2025-04-03 02:04:27 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][206/311]	eta 0:01:32 lr 0.000276	time 0.8757 (0.8833)	loss 0.6411 (0.5635)	grad_norm 1.9629 (2.7951)	mem 20675MB
[2025-04-03 02:04:29 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][208/311]	eta 0:01:30 lr 0.000276	time 0.8757 (0.8832)	loss 0.4326 (0.5625)	grad_norm 3.1924 (2.7928)	mem 20675MB
[2025-04-03 02:04:31 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][210/311]	eta 0:01:29 lr 0.000275	time 0.8754 (0.8831)	loss 0.5511 (0.5622)	grad_norm 2.6634 (2.7893)	mem 20675MB
[2025-04-03 02:04:32 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][212/311]	eta 0:01:27 lr 0.000275	time 0.8780 (0.8831)	loss 0.4443 (0.5613)	grad_norm 2.8705 (2.7913)	mem 20675MB
[2025-04-03 02:04:34 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][214/311]	eta 0:01:25 lr 0.000275	time 0.8756 (0.8830)	loss 0.4468 (0.5609)	grad_norm 3.1065 (2.7981)	mem 20675MB
[2025-04-03 02:04:36 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][216/311]	eta 0:01:23 lr 0.000274	time 0.8755 (0.8830)	loss 0.5421 (0.5608)	grad_norm 2.0438 (2.7918)	mem 20675MB
[2025-04-03 02:04:38 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][218/311]	eta 0:01:22 lr 0.000274	time 0.8755 (0.8829)	loss 0.5639 (0.5609)	grad_norm 2.6596 (2.7940)	mem 20675MB
[2025-04-03 02:04:39 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][220/311]	eta 0:01:20 lr 0.000273	time 0.8756 (0.8828)	loss 0.5972 (0.5613)	grad_norm 3.2996 (2.8001)	mem 20675MB
[2025-04-03 02:04:41 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][222/311]	eta 0:01:18 lr 0.000273	time 0.8765 (0.8828)	loss 0.4716 (0.5610)	grad_norm 3.0191 (2.8007)	mem 20675MB
[2025-04-03 02:04:43 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][224/311]	eta 0:01:16 lr 0.000273	time 0.8755 (0.8827)	loss 0.5128 (0.5604)	grad_norm 2.5761 (2.8076)	mem 20675MB
[2025-04-03 02:04:45 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][226/311]	eta 0:01:15 lr 0.000272	time 0.8756 (0.8827)	loss 0.5240 (0.5599)	grad_norm 4.3773 (2.8195)	mem 20675MB
[2025-04-03 02:04:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][228/311]	eta 0:01:13 lr 0.000272	time 0.8755 (0.8826)	loss 0.5707 (0.5602)	grad_norm 2.4394 (2.8153)	mem 20675MB
[2025-04-03 02:04:48 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][230/311]	eta 0:01:11 lr 0.000272	time 0.8763 (0.8826)	loss 0.6450 (0.5600)	grad_norm 3.3114 (2.8200)	mem 20675MB
[2025-04-03 02:04:50 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][232/311]	eta 0:01:09 lr 0.000271	time 0.8759 (0.8825)	loss 0.6588 (0.5605)	grad_norm 3.6162 (2.8228)	mem 20675MB
[2025-04-03 02:04:52 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][234/311]	eta 0:01:07 lr 0.000271	time 0.8758 (0.8825)	loss 0.4107 (0.5600)	grad_norm 3.8632 (2.8253)	mem 20675MB
[2025-04-03 02:04:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][236/311]	eta 0:01:06 lr 0.000271	time 0.8757 (0.8824)	loss 0.6120 (0.5596)	grad_norm 2.4609 (2.8279)	mem 20675MB
[2025-04-03 02:04:55 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][238/311]	eta 0:01:04 lr 0.000270	time 0.8758 (0.8824)	loss 0.5078 (0.5591)	grad_norm 2.8396 (2.8280)	mem 20675MB
[2025-04-03 02:04:57 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][240/311]	eta 0:01:02 lr 0.000270	time 0.8761 (0.8823)	loss 0.4767 (0.5587)	grad_norm 3.0109 (2.8283)	mem 20675MB
[2025-04-03 02:04:59 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][242/311]	eta 0:01:00 lr 0.000270	time 0.8757 (0.8823)	loss 0.6017 (0.5591)	grad_norm 2.4508 (2.8237)	mem 20675MB
[2025-04-03 02:05:00 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][244/311]	eta 0:00:59 lr 0.000269	time 0.8755 (0.8822)	loss 0.5788 (0.5588)	grad_norm 2.4045 (2.8248)	mem 20675MB
[2025-04-03 02:05:02 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][246/311]	eta 0:00:57 lr 0.000269	time 0.8756 (0.8822)	loss 0.5165 (0.5581)	grad_norm 3.9610 (2.8348)	mem 20675MB
[2025-04-03 02:05:04 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][248/311]	eta 0:00:55 lr 0.000269	time 0.8757 (0.8821)	loss 0.5721 (0.5583)	grad_norm 2.4487 (2.8322)	mem 20675MB
[2025-04-03 02:05:06 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][250/311]	eta 0:00:53 lr 0.000268	time 0.8758 (0.8821)	loss 0.4954 (0.5578)	grad_norm 3.1731 (2.8353)	mem 20675MB
[2025-04-03 02:05:07 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][252/311]	eta 0:00:52 lr 0.000268	time 0.8759 (0.8821)	loss 0.5576 (0.5573)	grad_norm 3.3164 (2.8466)	mem 20675MB
[2025-04-03 02:05:09 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][254/311]	eta 0:00:50 lr 0.000268	time 0.8755 (0.8820)	loss 0.5952 (0.5571)	grad_norm 2.1182 (2.8439)	mem 20675MB
[2025-04-03 02:05:11 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][256/311]	eta 0:00:48 lr 0.000267	time 0.8758 (0.8820)	loss 0.4068 (0.5566)	grad_norm 4.5299 (2.8493)	mem 20675MB
[2025-04-03 02:05:13 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][258/311]	eta 0:00:46 lr 0.000267	time 0.8755 (0.8819)	loss 0.4978 (0.5567)	grad_norm 4.1825 (2.8533)	mem 20675MB
[2025-04-03 02:05:14 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][260/311]	eta 0:00:44 lr 0.000267	time 0.8757 (0.8819)	loss 0.5455 (0.5564)	grad_norm 3.3726 (2.8554)	mem 20675MB
[2025-04-03 02:05:16 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][262/311]	eta 0:00:43 lr 0.000266	time 0.8758 (0.8819)	loss 0.5356 (0.5559)	grad_norm 2.0007 (2.8545)	mem 20675MB
[2025-04-03 02:05:18 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][264/311]	eta 0:00:41 lr 0.000266	time 0.8757 (0.8818)	loss 0.4911 (0.5557)	grad_norm 2.3542 (2.8518)	mem 20675MB
[2025-04-03 02:05:20 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][266/311]	eta 0:00:39 lr 0.000266	time 0.8755 (0.8818)	loss 0.6048 (0.5562)	grad_norm 3.2386 (2.8532)	mem 20675MB
[2025-04-03 02:05:21 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][268/311]	eta 0:00:37 lr 0.000265	time 0.8757 (0.8817)	loss 0.4651 (0.5561)	grad_norm 3.4511 (2.8558)	mem 20675MB
[2025-04-03 02:05:23 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][270/311]	eta 0:00:36 lr 0.000265	time 0.8758 (0.8817)	loss 0.6443 (0.5560)	grad_norm 2.3549 (2.8574)	mem 20675MB
[2025-04-03 02:05:25 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][272/311]	eta 0:00:34 lr 0.000265	time 0.8760 (0.8817)	loss 0.4655 (0.5557)	grad_norm 3.1077 (2.8595)	mem 20675MB
[2025-04-03 02:05:27 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][274/311]	eta 0:00:32 lr 0.000264	time 0.8759 (0.8816)	loss 0.5906 (0.5561)	grad_norm 2.9358 (2.8590)	mem 20675MB
[2025-04-03 02:05:28 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][276/311]	eta 0:00:30 lr 0.000264	time 0.8755 (0.8816)	loss 0.4587 (0.5559)	grad_norm 5.2788 (2.8646)	mem 20675MB
[2025-04-03 02:05:30 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][278/311]	eta 0:00:29 lr 0.000263	time 0.8757 (0.8816)	loss 0.5089 (0.5554)	grad_norm 2.8880 (2.8683)	mem 20675MB
[2025-04-03 02:05:32 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][280/311]	eta 0:00:27 lr 0.000263	time 0.8756 (0.8815)	loss 0.5653 (0.5549)	grad_norm 2.0136 (2.8709)	mem 20675MB
[2025-04-03 02:05:34 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][282/311]	eta 0:00:25 lr 0.000263	time 0.8759 (0.8815)	loss 0.5932 (0.5554)	grad_norm 3.0662 (2.8669)	mem 20675MB
[2025-04-03 02:05:35 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][284/311]	eta 0:00:23 lr 0.000262	time 0.8757 (0.8815)	loss 0.5927 (0.5557)	grad_norm 2.6930 (2.8645)	mem 20675MB
[2025-04-03 02:05:37 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][286/311]	eta 0:00:22 lr 0.000262	time 0.8776 (0.8814)	loss 0.5642 (0.5558)	grad_norm 2.0116 (2.8596)	mem 20675MB
[2025-04-03 02:05:39 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][288/311]	eta 0:00:20 lr 0.000262	time 0.8757 (0.8814)	loss 0.5787 (0.5558)	grad_norm 2.7681 (2.8594)	mem 20675MB
[2025-04-03 02:05:41 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][290/311]	eta 0:00:18 lr 0.000261	time 0.8759 (0.8814)	loss 0.5963 (0.5562)	grad_norm 2.0651 (2.8562)	mem 20675MB
[2025-04-03 02:05:42 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][292/311]	eta 0:00:16 lr 0.000261	time 0.8757 (0.8813)	loss 0.6719 (0.5561)	grad_norm 4.9502 (2.8675)	mem 20675MB
[2025-04-03 02:05:44 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][294/311]	eta 0:00:14 lr 0.000261	time 0.8767 (0.8813)	loss 0.5945 (0.5562)	grad_norm 2.6062 (2.8654)	mem 20675MB
[2025-04-03 02:05:46 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][296/311]	eta 0:00:13 lr 0.000260	time 0.8759 (0.8813)	loss 0.5347 (0.5560)	grad_norm 2.0813 (2.8629)	mem 20675MB
[2025-04-03 02:05:48 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][298/311]	eta 0:00:11 lr 0.000260	time 0.8761 (0.8813)	loss 0.5762 (0.5558)	grad_norm 2.9069 (2.8673)	mem 20675MB
[2025-04-03 02:05:49 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][300/311]	eta 0:00:09 lr 0.000260	time 0.8756 (0.8812)	loss 0.4372 (0.5554)	grad_norm 3.0250 (2.8663)	mem 20675MB
[2025-04-03 02:05:51 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][302/311]	eta 0:00:07 lr 0.000259	time 0.8757 (0.8812)	loss 0.6215 (0.5558)	grad_norm 2.1924 (2.8640)	mem 20675MB
[2025-04-03 02:05:53 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][304/311]	eta 0:00:06 lr 0.000259	time 0.8756 (0.8812)	loss 0.5262 (0.5554)	grad_norm 3.4511 (2.8647)	mem 20675MB
[2025-04-03 02:05:55 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][306/311]	eta 0:00:04 lr 0.000259	time 0.8757 (0.8811)	loss 0.4592 (0.5552)	grad_norm 3.1476 (2.8626)	mem 20675MB
[2025-04-03 02:05:56 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][308/311]	eta 0:00:02 lr 0.000258	time 0.8753 (0.8811)	loss 0.5486 (0.5554)	grad_norm 3.1770 (2.8626)	mem 20675MB
[2025-04-03 02:05:58 simmim_finetune] (main_finetune.py 252): INFO Train: [20/30][310/311]	eta 0:00:00 lr 0.000258	time 0.8753 (0.8811)	loss 0.4811 (0.5554)	grad_norm 3.7931 (2.8656)	mem 20675MB
[2025-04-03 02:05:58 simmim_finetune] (main_finetune.py 260): INFO EPOCH 20 training takes 0:04:34
[2025-04-03 02:05:58 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt20.pth saving......
[2025-04-03 02:06:02 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt20.pth saved !!!
[2025-04-03 02:06:03 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.463 (1.463)	Loss 0.5185 (0.5185)	Acc@1 76.562 (76.562)	Mem 20675MB
[2025-04-03 02:06:03 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 02:06:03 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 02:06:03 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:06:03 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [1.1515954003879404e-06, 1.1515954003879404e-06, 1.664833407804346e-06, 1.664833407804346e-06, 2.4544303422911243e-06, 2.4544303422911243e-06, 3.6691948568861675e-06, 3.6691948568861675e-06, 5.538063340878541e-06, 5.538063340878541e-06, 8.413245623943731e-06, 8.413245623943731e-06, 1.2836602982505561e-05, 1.2836602982505561e-05, 1.9641768149523756e-05, 1.9641768149523756e-05, 3.011125302185945e-05, 3.011125302185945e-05, 4.621815282545283e-05, 4.621815282545283e-05, 7.099799867713494e-05, 7.099799867713494e-05, 0.00010912083844895358, 0.00010912083844895358, 0.0001677713611748284, 0.0001677713611748284, 0.00025800293459925114, 0.00025800293459925114]
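The 28 per-group learning rates above come in 14 pairs (one weight-decay and one no-decay group per layer), each pair scaled down by roughly TRAIN.LAYER_DECAY = 0.65 relative to the next. A minimal sketch of how such layer-wise scales are typically derived (BEiT/SimMIM-style scheme; the helper name and exact grouping are assumptions, not the repo's actual code):

```python
# Sketch of layer-wise learning-rate decay for a ViT-B backbone.
# Assumption: standard BEiT/SimMIM-style scheme where layer i gets
# scale layer_decay ** (num_layers + 1 - i); names are hypothetical.
def layer_lr_scales(num_layers: int, layer_decay: float) -> list:
    """One scale per group: patch embed (i=0), each of the num_layers
    transformer blocks, and the head (i = num_layers + 1)."""
    return [layer_decay ** (num_layers + 1 - i) for i in range(num_layers + 2)]

scales = layer_lr_scales(12, 0.65)   # ViT-B/16: 12 blocks -> 14 groups
base_lr = 0.00125                    # TRAIN.BASE_LR from the config
group_lrs = [base_lr * s for s in scales]  # earliest layers get the smallest lr
```

With the cosine schedule and MIN_LR offset applied on top, the scheduled values in the log keep this monotonically increasing, roughly geometric spacing across groups.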
[2025-04-03 02:06:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][0/311]	eta 0:11:15 lr 0.000258	time 2.1705 (2.1705)	loss 0.4721 (0.4721)	grad_norm 4.6296 (4.6296)	mem 20675MB
[2025-04-03 02:06:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][2/311]	eta 0:06:44 lr 0.000257	time 0.8761 (1.3086)	loss 0.5993 (0.5541)	grad_norm 2.4789 (3.0922)	mem 20675MB
[2025-04-03 02:06:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][4/311]	eta 0:05:48 lr 0.000257	time 0.8761 (1.1359)	loss 0.5403 (0.5586)	grad_norm 3.0575 (3.0413)	mem 20675MB
[2025-04-03 02:06:11 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][6/311]	eta 0:05:23 lr 0.000257	time 0.8770 (1.0620)	loss 0.5514 (0.5638)	grad_norm 4.1187 (3.2473)	mem 20675MB
[2025-04-03 02:06:13 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][8/311]	eta 0:05:09 lr 0.000256	time 0.8762 (1.0210)	loss 0.6332 (0.5740)	grad_norm 2.4167 (3.1482)	mem 20675MB
[2025-04-03 02:06:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][10/311]	eta 0:04:59 lr 0.000256	time 0.8761 (0.9948)	loss 0.5315 (0.5774)	grad_norm 2.3850 (3.0528)	mem 20675MB
[2025-04-03 02:06:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][12/311]	eta 0:04:52 lr 0.000256	time 0.8763 (0.9767)	loss 0.5920 (0.5803)	grad_norm 3.2155 (2.9998)	mem 20675MB
[2025-04-03 02:06:18 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][14/311]	eta 0:04:46 lr 0.000255	time 0.8759 (0.9635)	loss 0.5193 (0.5686)	grad_norm 3.6493 (3.0364)	mem 20675MB
[2025-04-03 02:06:20 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][16/311]	eta 0:04:41 lr 0.000255	time 0.8762 (0.9533)	loss 0.5704 (0.5651)	grad_norm 2.3621 (2.9916)	mem 20675MB
[2025-04-03 02:06:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][18/311]	eta 0:04:36 lr 0.000255	time 0.8762 (0.9453)	loss 0.5938 (0.5646)	grad_norm 2.6546 (2.9486)	mem 20675MB
[2025-04-03 02:06:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][20/311]	eta 0:04:33 lr 0.000254	time 0.8763 (0.9388)	loss 0.5605 (0.5659)	grad_norm 2.3677 (2.8761)	mem 20675MB
[2025-04-03 02:06:25 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][22/311]	eta 0:04:29 lr 0.000254	time 0.8780 (0.9335)	loss 0.5982 (0.5642)	grad_norm 2.7590 (2.8755)	mem 20675MB
[2025-04-03 02:06:27 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][24/311]	eta 0:04:26 lr 0.000254	time 0.8760 (0.9290)	loss 0.5304 (0.5593)	grad_norm 3.5643 (2.9221)	mem 20675MB
[2025-04-03 02:06:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][26/311]	eta 0:04:23 lr 0.000253	time 0.8761 (0.9252)	loss 0.5509 (0.5577)	grad_norm 3.7889 (2.9290)	mem 20675MB
[2025-04-03 02:06:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][28/311]	eta 0:04:20 lr 0.000253	time 0.8762 (0.9218)	loss 0.4332 (0.5517)	grad_norm 3.9620 (2.9549)	mem 20675MB
[2025-04-03 02:06:32 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][30/311]	eta 0:04:18 lr 0.000253	time 0.8761 (0.9190)	loss 0.6000 (0.5556)	grad_norm 2.5335 (2.9099)	mem 20675MB
[2025-04-03 02:06:34 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][32/311]	eta 0:04:15 lr 0.000252	time 0.8761 (0.9164)	loss 0.5289 (0.5515)	grad_norm 3.1454 (2.9152)	mem 20675MB
[2025-04-03 02:06:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][34/311]	eta 0:04:13 lr 0.000252	time 0.8781 (0.9142)	loss 0.5441 (0.5541)	grad_norm 2.4657 (2.9059)	mem 20675MB
[2025-04-03 02:06:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][36/311]	eta 0:04:10 lr 0.000252	time 0.8761 (0.9122)	loss 0.6382 (0.5591)	grad_norm 3.8039 (2.9416)	mem 20675MB
[2025-04-03 02:06:39 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][38/311]	eta 0:04:08 lr 0.000251	time 0.8761 (0.9104)	loss 0.5035 (0.5579)	grad_norm 4.2120 (2.9761)	mem 20675MB
[2025-04-03 02:06:41 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][40/311]	eta 0:04:06 lr 0.000251	time 0.8762 (0.9088)	loss 0.5947 (0.5594)	grad_norm 3.4684 (2.9666)	mem 20675MB
[2025-04-03 02:06:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][42/311]	eta 0:04:04 lr 0.000251	time 0.8766 (0.9073)	loss 0.6253 (0.5610)	grad_norm 2.6567 (2.9510)	mem 20675MB
[2025-04-03 02:06:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][44/311]	eta 0:04:01 lr 0.000250	time 0.8761 (0.9060)	loss 0.4410 (0.5547)	grad_norm 4.0294 (2.9962)	mem 20675MB
[2025-04-03 02:06:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][46/311]	eta 0:03:59 lr 0.000250	time 0.8758 (0.9047)	loss 0.4995 (0.5542)	grad_norm 3.2839 (3.0047)	mem 20675MB
[2025-04-03 02:06:48 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][48/311]	eta 0:03:57 lr 0.000250	time 0.8763 (0.9036)	loss 0.4757 (0.5530)	grad_norm 3.4717 (2.9939)	mem 20675MB
[2025-04-03 02:06:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][50/311]	eta 0:03:55 lr 0.000249	time 0.8759 (0.9026)	loss 0.6576 (0.5529)	grad_norm 2.5635 (3.0069)	mem 20675MB
[2025-04-03 02:06:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][52/311]	eta 0:03:53 lr 0.000249	time 0.8761 (0.9016)	loss 0.5606 (0.5540)	grad_norm 2.3496 (2.9982)	mem 20675MB
[2025-04-03 02:06:53 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][54/311]	eta 0:03:51 lr 0.000249	time 0.8764 (0.9007)	loss 0.6081 (0.5570)	grad_norm 2.3996 (2.9815)	mem 20675MB
[2025-04-03 02:06:55 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][56/311]	eta 0:03:49 lr 0.000248	time 0.8762 (0.8999)	loss 0.6294 (0.5577)	grad_norm 2.1617 (2.9619)	mem 20675MB
[2025-04-03 02:06:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][58/311]	eta 0:03:47 lr 0.000248	time 0.8760 (0.8991)	loss 0.4934 (0.5573)	grad_norm 2.7805 (2.9613)	mem 20675MB
[2025-04-03 02:06:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][60/311]	eta 0:03:45 lr 0.000248	time 0.8759 (0.8984)	loss 0.5296 (0.5571)	grad_norm 3.9083 (2.9723)	mem 20675MB
[2025-04-03 02:07:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][62/311]	eta 0:03:43 lr 0.000247	time 0.8761 (0.8977)	loss 0.5888 (0.5578)	grad_norm 2.4993 (2.9610)	mem 20675MB
[2025-04-03 02:07:02 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][64/311]	eta 0:03:41 lr 0.000247	time 0.8761 (0.8971)	loss 0.5674 (0.5588)	grad_norm 2.6976 (2.9472)	mem 20675MB
[2025-04-03 02:07:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][66/311]	eta 0:03:39 lr 0.000247	time 0.8758 (0.8965)	loss 0.5935 (0.5581)	grad_norm 1.8679 (2.9441)	mem 20675MB
[2025-04-03 02:07:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][68/311]	eta 0:03:37 lr 0.000246	time 0.8761 (0.8959)	loss 0.6003 (0.5582)	grad_norm 2.3528 (2.9249)	mem 20675MB
[2025-04-03 02:07:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][70/311]	eta 0:03:35 lr 0.000246	time 0.8765 (0.8954)	loss 0.6088 (0.5574)	grad_norm 2.4921 (2.9319)	mem 20675MB
[2025-04-03 02:07:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][72/311]	eta 0:03:33 lr 0.000246	time 0.8762 (0.8949)	loss 0.5115 (0.5554)	grad_norm 3.2971 (2.9478)	mem 20675MB
[2025-04-03 02:07:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][74/311]	eta 0:03:31 lr 0.000245	time 0.8760 (0.8944)	loss 0.3981 (0.5532)	grad_norm 3.9185 (2.9751)	mem 20675MB
[2025-04-03 02:07:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][76/311]	eta 0:03:30 lr 0.000245	time 0.8761 (0.8940)	loss 0.5045 (0.5530)	grad_norm 3.3022 (2.9846)	mem 20675MB
[2025-04-03 02:07:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][78/311]	eta 0:03:28 lr 0.000245	time 0.8769 (0.8935)	loss 0.5730 (0.5515)	grad_norm 3.1602 (2.9969)	mem 20675MB
[2025-04-03 02:07:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][80/311]	eta 0:03:26 lr 0.000244	time 0.8757 (0.8931)	loss 0.5709 (0.5517)	grad_norm 2.5244 (2.9829)	mem 20675MB
[2025-04-03 02:07:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][82/311]	eta 0:03:24 lr 0.000244	time 0.8759 (0.8927)	loss 0.5691 (0.5519)	grad_norm 2.8817 (2.9846)	mem 20675MB
[2025-04-03 02:07:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][84/311]	eta 0:03:22 lr 0.000244	time 0.8760 (0.8923)	loss 0.6011 (0.5520)	grad_norm 3.6330 (3.0020)	mem 20675MB
[2025-04-03 02:07:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][86/311]	eta 0:03:20 lr 0.000243	time 0.8757 (0.8920)	loss 0.5586 (0.5530)	grad_norm 2.7773 (2.9999)	mem 20675MB
[2025-04-03 02:07:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][88/311]	eta 0:03:18 lr 0.000243	time 0.8757 (0.8916)	loss 0.6102 (0.5530)	grad_norm 2.8081 (3.0023)	mem 20675MB
[2025-04-03 02:07:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][90/311]	eta 0:03:16 lr 0.000243	time 0.8761 (0.8913)	loss 0.6018 (0.5545)	grad_norm 2.2205 (3.0078)	mem 20675MB
[2025-04-03 02:07:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][92/311]	eta 0:03:15 lr 0.000242	time 0.8762 (0.8910)	loss 0.4704 (0.5545)	grad_norm 4.2216 (3.0321)	mem 20675MB
[2025-04-03 02:07:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][94/311]	eta 0:03:13 lr 0.000242	time 0.8757 (0.8907)	loss 0.5974 (0.5557)	grad_norm 2.2830 (3.0263)	mem 20675MB
[2025-04-03 02:07:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][96/311]	eta 0:03:11 lr 0.000242	time 0.8761 (0.8904)	loss 0.5770 (0.5561)	grad_norm 2.1102 (3.0120)	mem 20675MB
[2025-04-03 02:07:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][98/311]	eta 0:03:09 lr 0.000241	time 0.8762 (0.8902)	loss 0.4274 (0.5534)	grad_norm 4.5456 (3.0357)	mem 20675MB
[2025-04-03 02:07:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][100/311]	eta 0:03:07 lr 0.000241	time 0.8760 (0.8899)	loss 0.6432 (0.5530)	grad_norm 2.3526 (3.0336)	mem 20675MB
[2025-04-03 02:07:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][102/311]	eta 0:03:05 lr 0.000241	time 0.8759 (0.8896)	loss 0.5925 (0.5529)	grad_norm 2.2117 (3.0255)	mem 20675MB
[2025-04-03 02:07:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][104/311]	eta 0:03:04 lr 0.000240	time 0.8761 (0.8894)	loss 0.6186 (0.5541)	grad_norm 2.6217 (3.0120)	mem 20675MB
[2025-04-03 02:07:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][106/311]	eta 0:03:02 lr 0.000240	time 0.8759 (0.8892)	loss 0.5360 (0.5531)	grad_norm 2.1804 (3.0078)	mem 20675MB
[2025-04-03 02:07:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][108/311]	eta 0:03:00 lr 0.000240	time 0.8760 (0.8889)	loss 0.6258 (0.5540)	grad_norm 2.8207 (2.9979)	mem 20675MB
[2025-04-03 02:07:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][110/311]	eta 0:02:58 lr 0.000239	time 0.8762 (0.8887)	loss 0.5194 (0.5528)	grad_norm 3.1596 (3.0022)	mem 20675MB
[2025-04-03 02:07:44 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][112/311]	eta 0:02:56 lr 0.000239	time 0.8762 (0.8885)	loss 0.6582 (0.5543)	grad_norm 2.5489 (2.9939)	mem 20675MB
[2025-04-03 02:07:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][114/311]	eta 0:02:54 lr 0.000239	time 0.8758 (0.8883)	loss 0.5625 (0.5548)	grad_norm 2.5087 (2.9830)	mem 20675MB
[2025-04-03 02:07:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][116/311]	eta 0:02:53 lr 0.000238	time 0.8759 (0.8881)	loss 0.5533 (0.5547)	grad_norm 3.2922 (2.9877)	mem 20675MB
[2025-04-03 02:07:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][118/311]	eta 0:02:51 lr 0.000238	time 0.8758 (0.8879)	loss 0.5409 (0.5541)	grad_norm 2.0896 (2.9804)	mem 20675MB
[2025-04-03 02:07:51 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][120/311]	eta 0:02:49 lr 0.000238	time 0.8761 (0.8878)	loss 0.6233 (0.5540)	grad_norm 3.1957 (2.9848)	mem 20675MB
[2025-04-03 02:07:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][122/311]	eta 0:02:47 lr 0.000237	time 0.8760 (0.8876)	loss 0.5396 (0.5537)	grad_norm 2.1360 (2.9852)	mem 20675MB
[2025-04-03 02:07:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][124/311]	eta 0:02:45 lr 0.000237	time 0.8762 (0.8874)	loss 0.5574 (0.5530)	grad_norm 2.4665 (2.9851)	mem 20675MB
[2025-04-03 02:07:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][126/311]	eta 0:02:44 lr 0.000237	time 0.8760 (0.8872)	loss 0.6619 (0.5541)	grad_norm 1.7603 (2.9672)	mem 20675MB
[2025-04-03 02:07:58 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][128/311]	eta 0:02:42 lr 0.000236	time 0.8762 (0.8871)	loss 0.5106 (0.5541)	grad_norm 2.8123 (2.9601)	mem 20675MB
[2025-04-03 02:08:00 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][130/311]	eta 0:02:40 lr 0.000236	time 0.8763 (0.8869)	loss 0.5713 (0.5543)	grad_norm 2.4670 (2.9517)	mem 20675MB
[2025-04-03 02:08:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][132/311]	eta 0:02:38 lr 0.000236	time 0.8758 (0.8868)	loss 0.4250 (0.5525)	grad_norm 3.9929 (2.9711)	mem 20675MB
[2025-04-03 02:08:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][134/311]	eta 0:02:36 lr 0.000235	time 0.8760 (0.8866)	loss 0.6124 (0.5528)	grad_norm 2.8409 (2.9656)	mem 20675MB
[2025-04-03 02:08:05 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][136/311]	eta 0:02:35 lr 0.000235	time 0.8760 (0.8865)	loss 0.6150 (0.5532)	grad_norm 3.0807 (2.9703)	mem 20675MB
[2025-04-03 02:08:07 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][138/311]	eta 0:02:33 lr 0.000235	time 0.8766 (0.8864)	loss 0.5619 (0.5523)	grad_norm 1.8744 (2.9623)	mem 20675MB
[2025-04-03 02:08:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][140/311]	eta 0:02:31 lr 0.000234	time 0.8763 (0.8862)	loss 0.4910 (0.5510)	grad_norm 3.1723 (2.9684)	mem 20675MB
[2025-04-03 02:08:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][142/311]	eta 0:02:29 lr 0.000234	time 0.8760 (0.8861)	loss 0.5977 (0.5515)	grad_norm 2.4064 (2.9633)	mem 20675MB
[2025-04-03 02:08:12 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][144/311]	eta 0:02:27 lr 0.000234	time 0.8764 (0.8860)	loss 0.5637 (0.5506)	grad_norm 2.4844 (2.9680)	mem 20675MB
[2025-04-03 02:08:14 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][146/311]	eta 0:02:26 lr 0.000233	time 0.8759 (0.8859)	loss 0.4246 (0.5499)	grad_norm 3.3738 (2.9688)	mem 20675MB
[2025-04-03 02:08:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][148/311]	eta 0:02:24 lr 0.000233	time 0.8761 (0.8858)	loss 0.4537 (0.5488)	grad_norm 3.9622 (2.9769)	mem 20675MB
[2025-04-03 02:08:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][150/311]	eta 0:02:22 lr 0.000233	time 0.8759 (0.8856)	loss 0.6123 (0.5496)	grad_norm 2.5973 (2.9758)	mem 20675MB
[2025-04-03 02:08:19 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][152/311]	eta 0:02:20 lr 0.000232	time 0.8763 (0.8855)	loss 0.4582 (0.5494)	grad_norm 2.9674 (2.9737)	mem 20675MB
[2025-04-03 02:08:21 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][154/311]	eta 0:02:19 lr 0.000232	time 0.8759 (0.8854)	loss 0.5784 (0.5495)	grad_norm 4.0540 (2.9875)	mem 20675MB
[2025-04-03 02:08:22 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][156/311]	eta 0:02:17 lr 0.000232	time 0.8762 (0.8853)	loss 0.5961 (0.5501)	grad_norm 2.7180 (2.9844)	mem 20675MB
[2025-04-03 02:08:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][158/311]	eta 0:02:15 lr 0.000231	time 0.8763 (0.8852)	loss 0.4362 (0.5486)	grad_norm 3.6053 (2.9906)	mem 20675MB
[2025-04-03 02:08:26 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][160/311]	eta 0:02:13 lr 0.000231	time 0.8759 (0.8851)	loss 0.6000 (0.5481)	grad_norm 2.6182 (2.9910)	mem 20675MB
[2025-04-03 02:08:28 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][162/311]	eta 0:02:11 lr 0.000231	time 0.8758 (0.8850)	loss 0.4999 (0.5479)	grad_norm 4.3031 (2.9994)	mem 20675MB
[2025-04-03 02:08:29 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][164/311]	eta 0:02:10 lr 0.000230	time 0.8764 (0.8849)	loss 0.6318 (0.5486)	grad_norm 2.3287 (2.9929)	mem 20675MB
[2025-04-03 02:08:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][166/311]	eta 0:02:08 lr 0.000230	time 0.8758 (0.8848)	loss 0.6557 (0.5495)	grad_norm 2.8197 (2.9944)	mem 20675MB
[2025-04-03 02:08:33 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][168/311]	eta 0:02:06 lr 0.000230	time 0.8758 (0.8847)	loss 0.5971 (0.5501)	grad_norm 2.2943 (2.9884)	mem 20675MB
[2025-04-03 02:08:35 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][170/311]	eta 0:02:04 lr 0.000230	time 0.8759 (0.8846)	loss 0.6190 (0.5500)	grad_norm 2.1358 (2.9859)	mem 20675MB
[2025-04-03 02:08:36 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][172/311]	eta 0:02:02 lr 0.000229	time 0.8760 (0.8845)	loss 0.5535 (0.5504)	grad_norm 2.5275 (2.9793)	mem 20675MB
[2025-04-03 02:08:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][174/311]	eta 0:02:01 lr 0.000229	time 0.8761 (0.8844)	loss 0.5968 (0.5509)	grad_norm 2.0907 (2.9713)	mem 20675MB
[2025-04-03 02:08:40 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][176/311]	eta 0:01:59 lr 0.000229	time 0.8759 (0.8844)	loss 0.5739 (0.5514)	grad_norm 2.1984 (2.9601)	mem 20675MB
[2025-04-03 02:08:42 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][178/311]	eta 0:01:57 lr 0.000228	time 0.8759 (0.8843)	loss 0.5791 (0.5513)	grad_norm 2.3753 (2.9588)	mem 20675MB
[2025-04-03 02:08:43 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][180/311]	eta 0:01:55 lr 0.000228	time 0.8761 (0.8842)	loss 0.4692 (0.5508)	grad_norm 3.4862 (2.9605)	mem 20675MB
[2025-04-03 02:08:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][182/311]	eta 0:01:54 lr 0.000228	time 0.8758 (0.8841)	loss 0.5525 (0.5510)	grad_norm 3.0351 (2.9544)	mem 20675MB
[2025-04-03 02:08:47 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][184/311]	eta 0:01:52 lr 0.000227	time 0.8765 (0.8840)	loss 0.5957 (0.5515)	grad_norm 1.8570 (2.9487)	mem 20675MB
[2025-04-03 02:08:49 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][186/311]	eta 0:01:50 lr 0.000227	time 0.8758 (0.8840)	loss 0.5363 (0.5518)	grad_norm 2.3965 (2.9419)	mem 20675MB
[2025-04-03 02:08:50 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][188/311]	eta 0:01:48 lr 0.000227	time 0.8758 (0.8839)	loss 0.5784 (0.5525)	grad_norm 1.8639 (2.9376)	mem 20675MB
[2025-04-03 02:08:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][190/311]	eta 0:01:46 lr 0.000226	time 0.8758 (0.8838)	loss 0.5977 (0.5527)	grad_norm 2.1404 (2.9311)	mem 20675MB
[2025-04-03 02:08:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][192/311]	eta 0:01:45 lr 0.000226	time 0.8761 (0.8837)	loss 0.5115 (0.5524)	grad_norm 1.7098 (2.9191)	mem 20675MB
[2025-04-03 02:08:56 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][194/311]	eta 0:01:43 lr 0.000226	time 0.8758 (0.8837)	loss 0.5253 (0.5526)	grad_norm 4.1518 (2.9235)	mem 20675MB
[2025-04-03 02:08:57 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][196/311]	eta 0:01:41 lr 0.000225	time 0.8766 (0.8836)	loss 0.6023 (0.5527)	grad_norm 2.1907 (2.9208)	mem 20675MB
[2025-04-03 02:08:59 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][198/311]	eta 0:01:39 lr 0.000225	time 0.8758 (0.8835)	loss 0.5636 (0.5530)	grad_norm 2.8390 (2.9181)	mem 20675MB
[2025-04-03 02:09:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][200/311]	eta 0:01:38 lr 0.000225	time 0.8764 (0.8835)	loss 0.5811 (0.5535)	grad_norm 2.5819 (2.9138)	mem 20675MB
[2025-04-03 02:09:03 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][202/311]	eta 0:01:36 lr 0.000224	time 0.8758 (0.8834)	loss 0.5564 (0.5532)	grad_norm 2.3945 (2.9121)	mem 20675MB
[2025-04-03 02:09:04 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][204/311]	eta 0:01:34 lr 0.000224	time 0.8760 (0.8834)	loss 0.4923 (0.5532)	grad_norm 3.8470 (2.9174)	mem 20675MB
[2025-04-03 02:09:06 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][206/311]	eta 0:01:32 lr 0.000224	time 0.8761 (0.8833)	loss 0.5355 (0.5535)	grad_norm 3.8347 (2.9179)	mem 20675MB
[2025-04-03 02:09:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][208/311]	eta 0:01:30 lr 0.000223	time 0.8765 (0.8832)	loss 0.5099 (0.5530)	grad_norm 3.5074 (2.9179)	mem 20675MB
[2025-04-03 02:09:10 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][210/311]	eta 0:01:29 lr 0.000223	time 0.8762 (0.8832)	loss 0.4739 (0.5528)	grad_norm 3.4828 (2.9182)	mem 20675MB
[2025-04-03 02:09:11 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][212/311]	eta 0:01:27 lr 0.000223	time 0.8758 (0.8831)	loss 0.4761 (0.5520)	grad_norm 4.4964 (2.9308)	mem 20675MB
[2025-04-03 02:09:13 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][214/311]	eta 0:01:25 lr 0.000222	time 0.8761 (0.8831)	loss 0.4580 (0.5516)	grad_norm 3.7651 (2.9370)	mem 20675MB
[2025-04-03 02:09:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][216/311]	eta 0:01:23 lr 0.000222	time 0.8757 (0.8830)	loss 0.4037 (0.5504)	grad_norm 4.6975 (2.9467)	mem 20675MB
[2025-04-03 02:09:17 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][218/311]	eta 0:01:22 lr 0.000222	time 0.8761 (0.8829)	loss 0.6217 (0.5502)	grad_norm 2.3143 (2.9494)	mem 20675MB
[2025-04-03 02:09:18 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][220/311]	eta 0:01:20 lr 0.000221	time 0.8759 (0.8829)	loss 0.5733 (0.5506)	grad_norm 2.9483 (2.9478)	mem 20675MB
[2025-04-03 02:09:20 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][222/311]	eta 0:01:18 lr 0.000221	time 0.8758 (0.8828)	loss 0.5566 (0.5509)	grad_norm 2.8021 (2.9473)	mem 20675MB
[2025-04-03 02:09:22 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][224/311]	eta 0:01:16 lr 0.000221	time 0.8761 (0.8828)	loss 0.5979 (0.5512)	grad_norm 2.4478 (2.9503)	mem 20675MB
[2025-04-03 02:09:24 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][226/311]	eta 0:01:15 lr 0.000220	time 0.8761 (0.8827)	loss 0.4680 (0.5506)	grad_norm 4.2909 (2.9581)	mem 20675MB
[2025-04-03 02:09:25 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][228/311]	eta 0:01:13 lr 0.000220	time 0.8758 (0.8827)	loss 0.5390 (0.5503)	grad_norm 1.9524 (2.9553)	mem 20675MB
[2025-04-03 02:09:27 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][230/311]	eta 0:01:11 lr 0.000220	time 0.8760 (0.8826)	loss 0.5426 (0.5502)	grad_norm 3.1739 (2.9567)	mem 20675MB
[2025-04-03 02:09:29 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][232/311]	eta 0:01:09 lr 0.000220	time 0.8758 (0.8826)	loss 0.4509 (0.5501)	grad_norm 3.4352 (2.9630)	mem 20675MB
[2025-04-03 02:09:31 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][234/311]	eta 0:01:07 lr 0.000219	time 0.8762 (0.8825)	loss 0.5129 (0.5498)	grad_norm 2.4872 (2.9691)	mem 20675MB
[2025-04-03 02:09:32 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][236/311]	eta 0:01:06 lr 0.000219	time 0.8763 (0.8825)	loss 0.4593 (0.5495)	grad_norm 4.5823 (2.9727)	mem 20675MB
[2025-04-03 02:09:34 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][238/311]	eta 0:01:04 lr 0.000219	time 0.8760 (0.8824)	loss 0.5485 (0.5490)	grad_norm 2.0237 (2.9716)	mem 20675MB
[2025-04-03 02:09:36 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][240/311]	eta 0:01:02 lr 0.000218	time 0.8762 (0.8824)	loss 0.5810 (0.5488)	grad_norm 2.6277 (2.9755)	mem 20675MB
[2025-04-03 02:09:38 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][242/311]	eta 0:01:00 lr 0.000218	time 0.8761 (0.8824)	loss 0.6370 (0.5488)	grad_norm 3.7017 (2.9756)	mem 20675MB
[2025-04-03 02:09:39 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][244/311]	eta 0:00:59 lr 0.000218	time 0.8759 (0.8823)	loss 0.5848 (0.5493)	grad_norm 2.3494 (2.9722)	mem 20675MB
[2025-04-03 02:09:41 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][246/311]	eta 0:00:57 lr 0.000217	time 0.8759 (0.8823)	loss 0.4911 (0.5494)	grad_norm 3.1097 (2.9743)	mem 20675MB
[2025-04-03 02:09:43 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][248/311]	eta 0:00:55 lr 0.000217	time 0.8758 (0.8822)	loss 0.6119 (0.5495)	grad_norm 2.0337 (2.9766)	mem 20675MB
[2025-04-03 02:09:45 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][250/311]	eta 0:00:53 lr 0.000217	time 0.8759 (0.8822)	loss 0.5716 (0.5496)	grad_norm 2.8673 (2.9760)	mem 20675MB
[2025-04-03 02:09:46 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][252/311]	eta 0:00:52 lr 0.000216	time 0.8761 (0.8821)	loss 0.6100 (0.5492)	grad_norm 3.4956 (2.9783)	mem 20675MB
[2025-04-03 02:09:48 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][254/311]	eta 0:00:50 lr 0.000216	time 0.8759 (0.8821)	loss 0.5109 (0.5493)	grad_norm 2.4600 (2.9742)	mem 20675MB
[2025-04-03 02:09:50 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][256/311]	eta 0:00:48 lr 0.000216	time 0.8758 (0.8821)	loss 0.5486 (0.5494)	grad_norm 2.7210 (2.9757)	mem 20675MB
[2025-04-03 02:09:52 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][258/311]	eta 0:00:46 lr 0.000215	time 0.8769 (0.8820)	loss 0.5496 (0.5497)	grad_norm 2.7176 (2.9735)	mem 20675MB
[2025-04-03 02:09:54 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][260/311]	eta 0:00:44 lr 0.000215	time 0.8761 (0.8820)	loss 0.6280 (0.5502)	grad_norm 2.6520 (2.9707)	mem 20675MB
[2025-04-03 02:09:55 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][262/311]	eta 0:00:43 lr 0.000215	time 0.8758 (0.8819)	loss 0.6603 (0.5507)	grad_norm 3.0217 (2.9704)	mem 20675MB
[2025-04-03 02:09:57 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][264/311]	eta 0:00:41 lr 0.000214	time 0.8758 (0.8819)	loss 0.6549 (0.5509)	grad_norm 2.9255 (2.9704)	mem 20675MB
[2025-04-03 02:09:59 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][266/311]	eta 0:00:39 lr 0.000214	time 0.8761 (0.8819)	loss 0.4164 (0.5507)	grad_norm 3.6692 (2.9697)	mem 20675MB
[2025-04-03 02:10:01 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][268/311]	eta 0:00:37 lr 0.000214	time 0.8760 (0.8818)	loss 0.5188 (0.5502)	grad_norm 3.2784 (2.9723)	mem 20675MB
[2025-04-03 02:10:02 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][270/311]	eta 0:00:36 lr 0.000213	time 0.8758 (0.8818)	loss 0.5790 (0.5505)	grad_norm 2.1886 (2.9682)	mem 20675MB
[2025-04-03 02:10:04 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][272/311]	eta 0:00:34 lr 0.000213	time 0.8757 (0.8818)	loss 0.4324 (0.5501)	grad_norm 4.7010 (2.9727)	mem 20675MB
[2025-04-03 02:10:06 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][274/311]	eta 0:00:32 lr 0.000213	time 0.8761 (0.8817)	loss 0.4655 (0.5497)	grad_norm 4.6005 (2.9774)	mem 20675MB
[2025-04-03 02:10:08 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][276/311]	eta 0:00:30 lr 0.000213	time 0.8762 (0.8817)	loss 0.6271 (0.5499)	grad_norm 2.5798 (2.9741)	mem 20675MB
[2025-04-03 02:10:09 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][278/311]	eta 0:00:29 lr 0.000212	time 0.8757 (0.8816)	loss 0.5421 (0.5501)	grad_norm 2.2784 (2.9675)	mem 20675MB
[2025-04-03 02:10:11 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][280/311]	eta 0:00:27 lr 0.000212	time 0.8760 (0.8816)	loss 0.6420 (0.5509)	grad_norm 3.0221 (2.9710)	mem 20675MB
[2025-04-03 02:10:13 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][282/311]	eta 0:00:25 lr 0.000212	time 0.8764 (0.8816)	loss 0.6433 (0.5508)	grad_norm 2.3275 (2.9713)	mem 20675MB
[2025-04-03 02:10:15 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][284/311]	eta 0:00:23 lr 0.000211	time 0.8761 (0.8816)	loss 0.5188 (0.5504)	grad_norm 2.6049 (2.9732)	mem 20675MB
[2025-04-03 02:10:16 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][286/311]	eta 0:00:22 lr 0.000211	time 0.8761 (0.8815)	loss 0.6760 (0.5512)	grad_norm 3.1105 (2.9719)	mem 20675MB
[2025-04-03 02:10:18 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][288/311]	eta 0:00:20 lr 0.000211	time 0.8766 (0.8815)	loss 0.6728 (0.5511)	grad_norm 3.0906 (2.9740)	mem 20675MB
[2025-04-03 02:10:20 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][290/311]	eta 0:00:18 lr 0.000210	time 0.8761 (0.8815)	loss 0.5958 (0.5512)	grad_norm 2.8495 (2.9728)	mem 20675MB
[2025-04-03 02:10:22 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][292/311]	eta 0:00:16 lr 0.000210	time 0.8761 (0.8814)	loss 0.5448 (0.5509)	grad_norm 2.6545 (2.9710)	mem 20675MB
[2025-04-03 02:10:23 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][294/311]	eta 0:00:14 lr 0.000210	time 0.8757 (0.8814)	loss 0.5643 (0.5509)	grad_norm 2.9821 (2.9688)	mem 20675MB
[2025-04-03 02:10:25 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][296/311]	eta 0:00:13 lr 0.000209	time 0.8758 (0.8814)	loss 0.5432 (0.5509)	grad_norm 2.6077 (2.9659)	mem 20675MB
[2025-04-03 02:10:27 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][298/311]	eta 0:00:11 lr 0.000209	time 0.8755 (0.8813)	loss 0.4917 (0.5509)	grad_norm 2.3819 (2.9603)	mem 20675MB
[2025-04-03 02:10:29 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][300/311]	eta 0:00:09 lr 0.000209	time 0.8755 (0.8813)	loss 0.4841 (0.5508)	grad_norm 3.0825 (2.9617)	mem 20675MB
[2025-04-03 02:10:30 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][302/311]	eta 0:00:07 lr 0.000208	time 0.8757 (0.8813)	loss 0.6149 (0.5506)	grad_norm 2.5141 (2.9606)	mem 20675MB
[2025-04-03 02:10:32 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][304/311]	eta 0:00:06 lr 0.000208	time 0.8760 (0.8812)	loss 0.4754 (0.5506)	grad_norm 4.1947 (2.9610)	mem 20675MB
[2025-04-03 02:10:34 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][306/311]	eta 0:00:04 lr 0.000208	time 0.8759 (0.8812)	loss 0.6165 (0.5503)	grad_norm 2.1199 (2.9588)	mem 20675MB
[2025-04-03 02:10:36 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][308/311]	eta 0:00:02 lr 0.000207	time 0.8756 (0.8812)	loss 0.5755 (0.5505)	grad_norm 2.6788 (2.9567)	mem 20675MB
[2025-04-03 02:10:37 simmim_finetune] (main_finetune.py 252): INFO Train: [21/30][310/311]	eta 0:00:00 lr 0.000207	time 0.8759 (0.8812)	loss 0.6536 (0.5507)	grad_norm 2.1074 (2.9549)	mem 20675MB
[2025-04-03 02:10:37 simmim_finetune] (main_finetune.py 260): INFO EPOCH 21 training takes 0:04:34
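As a sanity check on the epoch timing above, the logged averages are self-consistent: with the configured `BATCH_SIZE: 128` and the ~0.8812 s/iter running average from the train lines, 311 iterations come out to roughly 274 s, i.e. the reported 0:04:34. A minimal sketch (the helper name is mine, not from the training script):

```python
# Rough single-process throughput estimate from the logged averages.
# Values are read off the log: BATCH_SIZE=128, avg time 0.8812 s/iter,
# 311 iterations per epoch.
def throughput(batch_size: int, avg_iter_seconds: float) -> float:
    """Images processed per second, from the running per-iteration time."""
    return batch_size / avg_iter_seconds

imgs_per_sec = throughput(128, 0.8812)        # ~145 images/s
epoch_seconds = 311 * 0.8812                  # ~274 s, matching "0:04:34"
print(f"{imgs_per_sec:.1f} images/s, epoch ~{epoch_seconds:.0f} s")
```

This is only a back-of-the-envelope figure for one GPU/process; it ignores the validation pass and any data-loading stalls hidden inside the averaged iteration time.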
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.409 (1.409)	Loss 0.5796 (0.5796)	Acc@1 73.438 (73.438)	Mem 20675MB
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 74.648
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 74.6%
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:10:39 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [9.73746022331107e-07, 9.73746022331107e-07, 1.385742319358506e-06, 1.385742319358506e-06, 2.019582776323736e-06, 2.019582776323736e-06, 2.994721940885628e-06, 2.994721940885628e-06, 4.4949360402116145e-06, 4.4949360402116145e-06, 6.802957731482364e-06, 6.802957731482364e-06, 1.0353760333437362e-05, 1.0353760333437362e-05, 1.581653356721428e-05, 1.581653356721428e-05, 2.4220800080717232e-05, 2.4220800080717232e-05, 3.715044087072178e-05, 3.715044087072178e-05, 5.704219593226723e-05, 5.704219593226723e-05, 8.764489602695255e-05, 8.764489602695255e-05, 0.0001347259730956992, 0.0001347259730956992, 0.00020715839935530936, 0.00020715839935530936]
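The 28 per-group learning rates printed above come in 14 identical pairs (most likely the weight-decay / no-decay parameter split at each depth), and the 14 distinct values are consistent with the config: `LAYER_DECAY: 0.65` applied across the 12 ViT blocks plus the patch embedding and the head, with each group following the same cosine schedule between `BASE_LR * scale` and `MIN_LR`. A hedged reconstruction under exactly those assumptions (the scale exponents and schedule form are inferred from the numbers, not read from the code):

```python
# Sketch: reconstruct the logged per-group LRs from the config values
# (BASE_LR=1.25e-3, MIN_LR=2.5e-7, LAYER_DECAY=0.65, DEPTH=12).
# Assumption: group i (i=0 embedding, 1..12 blocks, 13 head) is scaled by
# 0.65**(13-i), and every group shares one cosine progress factor.
BASE_LR, MIN_LR, DECAY, DEPTH = 1.25e-3, 2.5e-7, 0.65, 12

scales = [DECAY ** (DEPTH + 1 - i) for i in range(DEPTH + 2)]  # 0.65**13 .. 1.0

# The head group (scale 1.0) is the "lr 0.000207" shown in the train lines;
# use it to back out where the cosine schedule currently sits.
head_lr = 2.0715839935530936e-04  # last value in the logged list
progress = (head_lr - MIN_LR) / (BASE_LR - MIN_LR)

group_lrs = [MIN_LR + progress * (BASE_LR * s - MIN_LR) for s in scales]
# group_lrs[0] ≈ 9.737e-07 and group_lrs[1] ≈ 1.386e-06, matching the first
# two distinct values in the logged list.
```

The reconstruction matches the logged values to several significant digits, which is why the ratio between adjacent groups creeps from ~1.42 up toward 1/0.65 ≈ 1.54 as the groups get larger: the shared `MIN_LR` floor matters less for the deeper (higher-LR) groups.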
[2025-04-03 02:10:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][0/311]	eta 0:10:52 lr 0.000207	time 2.0995 (2.0995)	loss 0.5798 (0.5798)	grad_norm 2.4022 (2.4022)	mem 20675MB
[2025-04-03 02:10:43 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][2/311]	eta 0:06:36 lr 0.000207	time 0.8759 (1.2845)	loss 0.4578 (0.5543)	grad_norm 3.9064 (3.1921)	mem 20675MB
[2025-04-03 02:10:45 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][4/311]	eta 0:05:44 lr 0.000206	time 0.8756 (1.1212)	loss 0.4612 (0.5441)	grad_norm 4.8814 (3.4814)	mem 20675MB
[2025-04-03 02:10:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][6/311]	eta 0:05:20 lr 0.000206	time 0.8760 (1.0515)	loss 0.4810 (0.5412)	grad_norm 2.3307 (3.1331)	mem 20675MB
[2025-04-03 02:10:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][8/311]	eta 0:05:06 lr 0.000206	time 0.8757 (1.0126)	loss 0.4061 (0.5140)	grad_norm 3.3620 (3.1514)	mem 20675MB
[2025-04-03 02:10:50 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][10/311]	eta 0:04:57 lr 0.000205	time 0.8757 (0.9879)	loss 0.4821 (0.5104)	grad_norm 2.9737 (3.1479)	mem 20675MB
[2025-04-03 02:10:52 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][12/311]	eta 0:04:50 lr 0.000205	time 0.8758 (0.9708)	loss 0.4542 (0.5149)	grad_norm 3.7400 (3.2409)	mem 20675MB
[2025-04-03 02:10:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][14/311]	eta 0:04:44 lr 0.000205	time 0.8757 (0.9582)	loss 0.3980 (0.5079)	grad_norm 4.3278 (3.3414)	mem 20675MB
[2025-04-03 02:10:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][16/311]	eta 0:04:39 lr 0.000205	time 0.8757 (0.9486)	loss 0.6588 (0.5147)	grad_norm 3.2609 (3.3431)	mem 20675MB
[2025-04-03 02:10:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][18/311]	eta 0:04:35 lr 0.000204	time 0.8760 (0.9410)	loss 0.6585 (0.5279)	grad_norm 3.6357 (3.3348)	mem 20675MB
[2025-04-03 02:10:59 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][20/311]	eta 0:04:32 lr 0.000204	time 0.8755 (0.9349)	loss 0.5939 (0.5285)	grad_norm 2.4636 (3.3142)	mem 20675MB
[2025-04-03 02:11:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][22/311]	eta 0:04:28 lr 0.000204	time 0.8757 (0.9299)	loss 0.5396 (0.5269)	grad_norm 2.8689 (3.2930)	mem 20675MB
[2025-04-03 02:11:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][24/311]	eta 0:04:25 lr 0.000203	time 0.8757 (0.9256)	loss 0.6036 (0.5283)	grad_norm 3.1103 (3.2690)	mem 20675MB
[2025-04-03 02:11:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][26/311]	eta 0:04:22 lr 0.000203	time 0.8755 (0.9219)	loss 0.5872 (0.5279)	grad_norm 2.6513 (3.2670)	mem 20675MB
[2025-04-03 02:11:06 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][28/311]	eta 0:04:20 lr 0.000203	time 0.8758 (0.9188)	loss 0.4875 (0.5261)	grad_norm 3.6927 (3.2594)	mem 20675MB
[2025-04-03 02:11:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][30/311]	eta 0:04:17 lr 0.000202	time 0.8758 (0.9161)	loss 0.6602 (0.5352)	grad_norm 2.9550 (3.2572)	mem 20675MB
[2025-04-03 02:11:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][32/311]	eta 0:04:14 lr 0.000202	time 0.8762 (0.9137)	loss 0.5425 (0.5395)	grad_norm 2.5090 (3.2259)	mem 20675MB
[2025-04-03 02:11:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][34/311]	eta 0:04:12 lr 0.000202	time 0.8761 (0.9116)	loss 0.3924 (0.5337)	grad_norm 3.0627 (3.2308)	mem 20675MB
[2025-04-03 02:11:13 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][36/311]	eta 0:04:10 lr 0.000201	time 0.8755 (0.9097)	loss 0.4821 (0.5337)	grad_norm 3.1779 (3.1975)	mem 20675MB
[2025-04-03 02:11:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][38/311]	eta 0:04:07 lr 0.000201	time 0.8756 (0.9080)	loss 0.5625 (0.5355)	grad_norm 1.9571 (3.1603)	mem 20675MB
[2025-04-03 02:11:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][40/311]	eta 0:04:05 lr 0.000201	time 0.8759 (0.9065)	loss 0.5468 (0.5378)	grad_norm 2.7042 (3.1313)	mem 20675MB
[2025-04-03 02:11:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][42/311]	eta 0:04:03 lr 0.000200	time 0.8758 (0.9051)	loss 0.4512 (0.5368)	grad_norm 3.3642 (3.1332)	mem 20675MB
[2025-04-03 02:11:20 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][44/311]	eta 0:04:01 lr 0.000200	time 0.8761 (0.9039)	loss 0.4944 (0.5385)	grad_norm 2.6358 (3.1026)	mem 20675MB
[2025-04-03 02:11:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][46/311]	eta 0:03:59 lr 0.000200	time 0.8755 (0.9027)	loss 0.5764 (0.5360)	grad_norm 3.0384 (3.1388)	mem 20675MB
[2025-04-03 02:11:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][48/311]	eta 0:03:57 lr 0.000200	time 0.8759 (0.9017)	loss 0.6012 (0.5387)	grad_norm 2.8921 (3.1136)	mem 20675MB
[2025-04-03 02:11:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][50/311]	eta 0:03:55 lr 0.000199	time 0.8755 (0.9007)	loss 0.6245 (0.5403)	grad_norm 2.7656 (3.0957)	mem 20675MB
[2025-04-03 02:11:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][52/311]	eta 0:03:53 lr 0.000199	time 0.8758 (0.8998)	loss 0.6083 (0.5391)	grad_norm 2.8622 (3.0983)	mem 20675MB
[2025-04-03 02:11:29 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][54/311]	eta 0:03:51 lr 0.000199	time 0.8759 (0.8989)	loss 0.4663 (0.5400)	grad_norm 2.6335 (3.0730)	mem 20675MB
[2025-04-03 02:11:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][56/311]	eta 0:03:49 lr 0.000198	time 0.8756 (0.8982)	loss 0.5868 (0.5412)	grad_norm 2.3201 (3.0573)	mem 20675MB
[2025-04-03 02:11:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][58/311]	eta 0:03:47 lr 0.000198	time 0.8757 (0.8974)	loss 0.3982 (0.5383)	grad_norm 3.5190 (3.0626)	mem 20675MB
[2025-04-03 02:11:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][60/311]	eta 0:03:45 lr 0.000198	time 0.8757 (0.8967)	loss 0.5647 (0.5394)	grad_norm 2.6343 (3.0442)	mem 20675MB
[2025-04-03 02:11:36 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][62/311]	eta 0:03:43 lr 0.000197	time 0.8756 (0.8961)	loss 0.4256 (0.5365)	grad_norm 4.6911 (3.0842)	mem 20675MB
[2025-04-03 02:11:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][64/311]	eta 0:03:41 lr 0.000197	time 0.8757 (0.8955)	loss 0.6236 (0.5361)	grad_norm 2.8885 (3.1000)	mem 20675MB
[2025-04-03 02:11:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][66/311]	eta 0:03:39 lr 0.000197	time 0.8754 (0.8949)	loss 0.4823 (0.5367)	grad_norm 2.8537 (3.0890)	mem 20675MB
[2025-04-03 02:11:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][68/311]	eta 0:03:37 lr 0.000196	time 0.8759 (0.8944)	loss 0.4245 (0.5341)	grad_norm 3.7948 (3.1070)	mem 20675MB
[2025-04-03 02:11:43 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][70/311]	eta 0:03:35 lr 0.000196	time 0.8757 (0.8939)	loss 0.6034 (0.5335)	grad_norm 2.6147 (3.1106)	mem 20675MB
[2025-04-03 02:11:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][72/311]	eta 0:03:33 lr 0.000196	time 0.8758 (0.8935)	loss 0.6342 (0.5359)	grad_norm 2.8842 (3.1168)	mem 20675MB
[2025-04-03 02:11:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][74/311]	eta 0:03:31 lr 0.000196	time 0.8755 (0.8930)	loss 0.5158 (0.5348)	grad_norm 4.2108 (3.1506)	mem 20675MB
[2025-04-03 02:11:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][76/311]	eta 0:03:29 lr 0.000195	time 0.8757 (0.8926)	loss 0.5609 (0.5354)	grad_norm 2.6294 (3.1414)	mem 20675MB
[2025-04-03 02:11:50 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][78/311]	eta 0:03:27 lr 0.000195	time 0.8756 (0.8922)	loss 0.5731 (0.5342)	grad_norm 3.0069 (3.1755)	mem 20675MB
[2025-04-03 02:11:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][80/311]	eta 0:03:26 lr 0.000195	time 0.8758 (0.8918)	loss 0.4278 (0.5343)	grad_norm 3.0551 (3.1921)	mem 20675MB
[2025-04-03 02:11:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][82/311]	eta 0:03:24 lr 0.000194	time 0.8758 (0.8914)	loss 0.4511 (0.5348)	grad_norm 3.5709 (3.1953)	mem 20675MB
[2025-04-03 02:11:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][84/311]	eta 0:03:22 lr 0.000194	time 0.8759 (0.8911)	loss 0.6750 (0.5355)	grad_norm 3.1411 (3.1989)	mem 20675MB
[2025-04-03 02:11:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][86/311]	eta 0:03:20 lr 0.000194	time 0.8758 (0.8907)	loss 0.5993 (0.5366)	grad_norm 3.3593 (3.2039)	mem 20675MB
[2025-04-03 02:11:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][88/311]	eta 0:03:18 lr 0.000193	time 0.8760 (0.8904)	loss 0.5306 (0.5373)	grad_norm 4.9171 (3.2160)	mem 20675MB
[2025-04-03 02:12:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][90/311]	eta 0:03:16 lr 0.000193	time 0.8757 (0.8901)	loss 0.4884 (0.5380)	grad_norm 3.2703 (3.2132)	mem 20675MB
[2025-04-03 02:12:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][92/311]	eta 0:03:14 lr 0.000193	time 0.8755 (0.8898)	loss 0.5369 (0.5379)	grad_norm 2.3974 (3.2061)	mem 20675MB
[2025-04-03 02:12:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][94/311]	eta 0:03:13 lr 0.000193	time 0.8771 (0.8896)	loss 0.5099 (0.5385)	grad_norm 4.1338 (3.2077)	mem 20675MB
[2025-04-03 02:12:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][96/311]	eta 0:03:11 lr 0.000192	time 0.8756 (0.8893)	loss 0.5724 (0.5391)	grad_norm 2.1960 (3.1916)	mem 20675MB
[2025-04-03 02:12:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][98/311]	eta 0:03:09 lr 0.000192	time 0.8755 (0.8891)	loss 0.5422 (0.5392)	grad_norm 3.4554 (3.1849)	mem 20675MB
[2025-04-03 02:12:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][100/311]	eta 0:03:07 lr 0.000192	time 0.8756 (0.8888)	loss 0.5668 (0.5384)	grad_norm 2.7036 (3.1841)	mem 20675MB
[2025-04-03 02:12:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][102/311]	eta 0:03:05 lr 0.000191	time 0.8761 (0.8886)	loss 0.4145 (0.5379)	grad_norm 4.8133 (3.1928)	mem 20675MB
[2025-04-03 02:12:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][104/311]	eta 0:03:03 lr 0.000191	time 0.8755 (0.8884)	loss 0.6209 (0.5392)	grad_norm 2.0895 (3.1715)	mem 20675MB
[2025-04-03 02:12:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][106/311]	eta 0:03:02 lr 0.000191	time 0.8757 (0.8881)	loss 0.6314 (0.5398)	grad_norm 2.1264 (3.1635)	mem 20675MB
[2025-04-03 02:12:16 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][108/311]	eta 0:03:00 lr 0.000190	time 0.8757 (0.8879)	loss 0.6179 (0.5394)	grad_norm 1.9011 (3.1642)	mem 20675MB
[2025-04-03 02:12:18 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][110/311]	eta 0:02:58 lr 0.000190	time 0.8757 (0.8877)	loss 0.4191 (0.5388)	grad_norm 3.1660 (3.1555)	mem 20675MB
[2025-04-03 02:12:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][112/311]	eta 0:02:56 lr 0.000190	time 0.8757 (0.8875)	loss 0.4675 (0.5372)	grad_norm 4.6083 (3.1787)	mem 20675MB
[2025-04-03 02:12:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][114/311]	eta 0:02:54 lr 0.000189	time 0.8758 (0.8873)	loss 0.5243 (0.5374)	grad_norm 3.1556 (3.1739)	mem 20675MB
[2025-04-03 02:12:23 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][116/311]	eta 0:02:52 lr 0.000189	time 0.8760 (0.8872)	loss 0.6723 (0.5379)	grad_norm 2.6338 (3.1678)	mem 20675MB
[2025-04-03 02:12:25 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][118/311]	eta 0:02:51 lr 0.000189	time 0.8755 (0.8870)	loss 0.4814 (0.5377)	grad_norm 2.7948 (3.1579)	mem 20675MB
[2025-04-03 02:12:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][120/311]	eta 0:02:49 lr 0.000189	time 0.8754 (0.8868)	loss 0.5960 (0.5385)	grad_norm 2.7104 (3.1483)	mem 20675MB
[2025-04-03 02:12:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][122/311]	eta 0:02:47 lr 0.000188	time 0.8756 (0.8866)	loss 0.5990 (0.5397)	grad_norm 3.1422 (3.1425)	mem 20675MB
[2025-04-03 02:12:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][124/311]	eta 0:02:45 lr 0.000188	time 0.8756 (0.8865)	loss 0.4808 (0.5389)	grad_norm 3.7986 (3.1498)	mem 20675MB
[2025-04-03 02:12:32 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][126/311]	eta 0:02:43 lr 0.000188	time 0.8756 (0.8863)	loss 0.5745 (0.5397)	grad_norm 3.5216 (3.1508)	mem 20675MB
[2025-04-03 02:12:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][128/311]	eta 0:02:42 lr 0.000187	time 0.8769 (0.8862)	loss 0.5875 (0.5398)	grad_norm 2.7811 (3.1517)	mem 20675MB
[2025-04-03 02:12:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][130/311]	eta 0:02:40 lr 0.000187	time 0.8758 (0.8860)	loss 0.5970 (0.5404)	grad_norm 2.4990 (3.1410)	mem 20675MB
[2025-04-03 02:12:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][132/311]	eta 0:02:38 lr 0.000187	time 0.8756 (0.8859)	loss 0.4155 (0.5395)	grad_norm 3.7916 (3.1414)	mem 20675MB
[2025-04-03 02:12:39 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][134/311]	eta 0:02:36 lr 0.000186	time 0.8758 (0.8858)	loss 0.6033 (0.5405)	grad_norm 2.4110 (3.1281)	mem 20675MB
[2025-04-03 02:12:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][136/311]	eta 0:02:34 lr 0.000186	time 0.8758 (0.8856)	loss 0.5936 (0.5412)	grad_norm 2.9462 (3.1281)	mem 20675MB
[2025-04-03 02:12:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][138/311]	eta 0:02:33 lr 0.000186	time 0.8758 (0.8855)	loss 0.5192 (0.5409)	grad_norm 3.4338 (3.1262)	mem 20675MB
[2025-04-03 02:12:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][140/311]	eta 0:02:31 lr 0.000186	time 0.8757 (0.8854)	loss 0.5561 (0.5416)	grad_norm 3.1066 (3.1294)	mem 20675MB
[2025-04-03 02:12:46 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][142/311]	eta 0:02:29 lr 0.000185	time 0.8758 (0.8853)	loss 0.6458 (0.5412)	grad_norm 3.2762 (3.1412)	mem 20675MB
[2025-04-03 02:12:47 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][144/311]	eta 0:02:27 lr 0.000185	time 0.8756 (0.8851)	loss 0.4900 (0.5407)	grad_norm 4.0744 (3.1502)	mem 20675MB
[2025-04-03 02:12:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][146/311]	eta 0:02:26 lr 0.000185	time 0.8757 (0.8850)	loss 0.5701 (0.5410)	grad_norm 2.7388 (3.1485)	mem 20675MB
[2025-04-03 02:12:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][148/311]	eta 0:02:24 lr 0.000184	time 0.8760 (0.8849)	loss 0.5665 (0.5402)	grad_norm 2.8150 (3.1489)	mem 20675MB
[2025-04-03 02:12:53 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][150/311]	eta 0:02:22 lr 0.000184	time 0.8758 (0.8848)	loss 0.6006 (0.5409)	grad_norm 2.2048 (3.1402)	mem 20675MB
[2025-04-03 02:12:54 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][152/311]	eta 0:02:20 lr 0.000184	time 0.8759 (0.8847)	loss 0.4694 (0.5403)	grad_norm 2.7942 (3.1398)	mem 20675MB
[2025-04-03 02:12:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][154/311]	eta 0:02:18 lr 0.000183	time 0.8753 (0.8846)	loss 0.5438 (0.5399)	grad_norm 2.5382 (3.1347)	mem 20675MB
[2025-04-03 02:12:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][156/311]	eta 0:02:17 lr 0.000183	time 0.8759 (0.8845)	loss 0.4577 (0.5394)	grad_norm 3.8191 (3.1382)	mem 20675MB
[2025-04-03 02:13:00 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][158/311]	eta 0:02:15 lr 0.000183	time 0.8761 (0.8844)	loss 0.5065 (0.5388)	grad_norm 4.0738 (3.1458)	mem 20675MB
[2025-04-03 02:13:01 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][160/311]	eta 0:02:13 lr 0.000183	time 0.8758 (0.8843)	loss 0.5399 (0.5378)	grad_norm 2.3958 (3.1442)	mem 20675MB
[2025-04-03 02:13:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][162/311]	eta 0:02:11 lr 0.000182	time 0.8762 (0.8842)	loss 0.5073 (0.5370)	grad_norm 3.5423 (3.1514)	mem 20675MB
[2025-04-03 02:13:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][164/311]	eta 0:02:09 lr 0.000182	time 0.8775 (0.8842)	loss 0.4934 (0.5363)	grad_norm 3.5885 (3.1634)	mem 20675MB
[2025-04-03 02:13:07 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][166/311]	eta 0:02:08 lr 0.000182	time 0.8758 (0.8841)	loss 0.5569 (0.5368)	grad_norm 2.7536 (3.1594)	mem 20675MB
[2025-04-03 02:13:08 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][168/311]	eta 0:02:06 lr 0.000181	time 0.8755 (0.8840)	loss 0.6482 (0.5374)	grad_norm 2.0841 (3.1570)	mem 20675MB
[2025-04-03 02:13:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][170/311]	eta 0:02:04 lr 0.000181	time 0.8756 (0.8839)	loss 0.5083 (0.5374)	grad_norm 4.4318 (3.1680)	mem 20675MB
[2025-04-03 02:13:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][172/311]	eta 0:02:02 lr 0.000181	time 0.8755 (0.8838)	loss 0.6036 (0.5377)	grad_norm 2.5464 (3.1640)	mem 20675MB
[2025-04-03 02:13:14 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][174/311]	eta 0:02:01 lr 0.000181	time 0.8756 (0.8837)	loss 0.5827 (0.5384)	grad_norm 2.8344 (3.1621)	mem 20675MB
[2025-04-03 02:13:15 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][176/311]	eta 0:01:59 lr 0.000180	time 0.8756 (0.8836)	loss 0.3899 (0.5376)	grad_norm 5.7485 (3.1752)	mem 20675MB
[2025-04-03 02:13:17 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][178/311]	eta 0:01:57 lr 0.000180	time 0.8766 (0.8836)	loss 0.5204 (0.5381)	grad_norm 3.0998 (3.1737)	mem 20675MB
[2025-04-03 02:13:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][180/311]	eta 0:01:55 lr 0.000180	time 0.8761 (0.8835)	loss 0.5365 (0.5384)	grad_norm 2.7482 (3.1658)	mem 20675MB
[2025-04-03 02:13:21 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][182/311]	eta 0:01:53 lr 0.000179	time 0.8757 (0.8834)	loss 0.4526 (0.5385)	grad_norm 4.9920 (3.1793)	mem 20675MB
[2025-04-03 02:13:22 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][184/311]	eta 0:01:52 lr 0.000179	time 0.8757 (0.8834)	loss 0.5501 (0.5389)	grad_norm 3.2525 (3.1755)	mem 20675MB
[2025-04-03 02:13:24 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][186/311]	eta 0:01:50 lr 0.000179	time 0.8756 (0.8833)	loss 0.5052 (0.5392)	grad_norm 4.2663 (3.1788)	mem 20675MB
[2025-04-03 02:13:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][188/311]	eta 0:01:48 lr 0.000178	time 0.8759 (0.8832)	loss 0.5883 (0.5399)	grad_norm 2.1278 (3.1702)	mem 20675MB
[2025-04-03 02:13:28 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][190/311]	eta 0:01:46 lr 0.000178	time 0.8756 (0.8831)	loss 0.5357 (0.5396)	grad_norm 3.1423 (3.1710)	mem 20675MB
[2025-04-03 02:13:30 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][192/311]	eta 0:01:45 lr 0.000178	time 0.8759 (0.8831)	loss 0.5750 (0.5399)	grad_norm 2.1422 (3.1644)	mem 20675MB
[2025-04-03 02:13:31 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][194/311]	eta 0:01:43 lr 0.000178	time 0.8762 (0.8830)	loss 0.4768 (0.5397)	grad_norm 3.2413 (3.1610)	mem 20675MB
[2025-04-03 02:13:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][196/311]	eta 0:01:41 lr 0.000177	time 0.8757 (0.8829)	loss 0.5887 (0.5405)	grad_norm 2.8744 (3.1586)	mem 20675MB
[2025-04-03 02:13:35 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][198/311]	eta 0:01:39 lr 0.000177	time 0.8757 (0.8829)	loss 0.5056 (0.5395)	grad_norm 2.9473 (3.1586)	mem 20675MB
[2025-04-03 02:13:37 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][200/311]	eta 0:01:37 lr 0.000177	time 0.8758 (0.8828)	loss 0.3399 (0.5379)	grad_norm 3.3416 (3.1630)	mem 20675MB
[2025-04-03 02:13:38 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][202/311]	eta 0:01:36 lr 0.000176	time 0.8759 (0.8828)	loss 0.6190 (0.5385)	grad_norm 2.8425 (3.1596)	mem 20675MB
[2025-04-03 02:13:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][204/311]	eta 0:01:34 lr 0.000176	time 0.8757 (0.8827)	loss 0.5599 (0.5387)	grad_norm 2.0572 (3.1490)	mem 20675MB
[2025-04-03 02:13:42 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][206/311]	eta 0:01:32 lr 0.000176	time 0.8757 (0.8826)	loss 0.6010 (0.5392)	grad_norm 2.8165 (3.1467)	mem 20675MB
[2025-04-03 02:13:44 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][208/311]	eta 0:01:30 lr 0.000176	time 0.8756 (0.8826)	loss 0.3750 (0.5385)	grad_norm 5.4913 (3.1548)	mem 20675MB
[2025-04-03 02:13:45 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][210/311]	eta 0:01:29 lr 0.000175	time 0.8755 (0.8825)	loss 0.4505 (0.5379)	grad_norm 3.0252 (3.1568)	mem 20675MB
[2025-04-03 02:13:47 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][212/311]	eta 0:01:27 lr 0.000175	time 0.8757 (0.8825)	loss 0.4487 (0.5375)	grad_norm 3.5001 (3.1537)	mem 20675MB
[2025-04-03 02:13:49 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][214/311]	eta 0:01:25 lr 0.000175	time 0.8758 (0.8824)	loss 0.5902 (0.5380)	grad_norm 3.1870 (3.1517)	mem 20675MB
[2025-04-03 02:13:51 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][216/311]	eta 0:01:23 lr 0.000174	time 0.8756 (0.8824)	loss 0.6135 (0.5386)	grad_norm 2.9316 (3.1514)	mem 20675MB
[2025-04-03 02:13:52 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][218/311]	eta 0:01:22 lr 0.000174	time 0.8756 (0.8823)	loss 0.5717 (0.5390)	grad_norm 2.8552 (3.1518)	mem 20675MB
[2025-04-03 02:13:54 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][220/311]	eta 0:01:20 lr 0.000174	time 0.8756 (0.8823)	loss 0.5926 (0.5394)	grad_norm 2.7759 (3.1491)	mem 20675MB
[2025-04-03 02:13:56 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][222/311]	eta 0:01:18 lr 0.000173	time 0.8762 (0.8822)	loss 0.5298 (0.5394)	grad_norm 2.6858 (3.1474)	mem 20675MB
[2025-04-03 02:13:58 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][224/311]	eta 0:01:16 lr 0.000173	time 0.8758 (0.8822)	loss 0.6010 (0.5397)	grad_norm 3.1231 (3.1461)	mem 20675MB
[2025-04-03 02:13:59 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][226/311]	eta 0:01:14 lr 0.000173	time 0.8757 (0.8821)	loss 0.6016 (0.5400)	grad_norm 2.6606 (3.1392)	mem 20675MB
[2025-04-03 02:14:01 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][228/311]	eta 0:01:13 lr 0.000173	time 0.8760 (0.8821)	loss 0.5889 (0.5399)	grad_norm 3.4063 (3.1416)	mem 20675MB
[2025-04-03 02:14:03 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][230/311]	eta 0:01:11 lr 0.000172	time 0.8757 (0.8820)	loss 0.5281 (0.5397)	grad_norm 2.3793 (3.1433)	mem 20675MB
[2025-04-03 02:14:05 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][232/311]	eta 0:01:09 lr 0.000172	time 0.8756 (0.8820)	loss 0.4299 (0.5390)	grad_norm 2.7688 (3.1459)	mem 20675MB
[2025-04-03 02:14:06 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][234/311]	eta 0:01:07 lr 0.000172	time 0.8759 (0.8819)	loss 0.4221 (0.5382)	grad_norm 3.3205 (3.1465)	mem 20675MB
[2025-04-03 02:14:08 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][236/311]	eta 0:01:06 lr 0.000171	time 0.8759 (0.8819)	loss 0.4615 (0.5382)	grad_norm 3.7557 (3.1482)	mem 20675MB
[2025-04-03 02:14:10 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][238/311]	eta 0:01:04 lr 0.000171	time 0.8754 (0.8818)	loss 0.5096 (0.5382)	grad_norm 3.1766 (3.1460)	mem 20675MB
[2025-04-03 02:14:12 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][240/311]	eta 0:01:02 lr 0.000171	time 0.8758 (0.8818)	loss 0.4693 (0.5382)	grad_norm 5.1852 (3.1567)	mem 20675MB
[2025-04-03 02:14:13 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][242/311]	eta 0:01:00 lr 0.000171	time 0.8756 (0.8818)	loss 0.5496 (0.5383)	grad_norm 2.0539 (3.1480)	mem 20675MB
[2025-04-03 02:14:15 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][244/311]	eta 0:00:59 lr 0.000170	time 0.8757 (0.8817)	loss 0.5706 (0.5381)	grad_norm 2.6526 (3.1494)	mem 20675MB
[2025-04-03 02:14:17 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][246/311]	eta 0:00:57 lr 0.000170	time 0.8758 (0.8817)	loss 0.6718 (0.5391)	grad_norm 3.2808 (3.1515)	mem 20675MB
[2025-04-03 02:14:19 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][248/311]	eta 0:00:55 lr 0.000170	time 0.8755 (0.8816)	loss 0.4810 (0.5390)	grad_norm 3.4696 (3.1526)	mem 20675MB
[2025-04-03 02:14:20 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][250/311]	eta 0:00:53 lr 0.000169	time 0.8758 (0.8816)	loss 0.6560 (0.5396)	grad_norm 2.7736 (3.1498)	mem 20675MB
[2025-04-03 02:14:22 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][252/311]	eta 0:00:52 lr 0.000169	time 0.8759 (0.8816)	loss 0.4688 (0.5393)	grad_norm 4.3809 (3.1553)	mem 20675MB
[2025-04-03 02:14:24 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][254/311]	eta 0:00:50 lr 0.000169	time 0.8757 (0.8815)	loss 0.6426 (0.5396)	grad_norm 2.0785 (3.1510)	mem 20675MB
[2025-04-03 02:14:26 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][256/311]	eta 0:00:48 lr 0.000169	time 0.8757 (0.8815)	loss 0.5397 (0.5399)	grad_norm 2.4821 (3.1496)	mem 20675MB
[2025-04-03 02:14:27 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][258/311]	eta 0:00:46 lr 0.000168	time 0.8756 (0.8815)	loss 0.5257 (0.5401)	grad_norm 2.2582 (3.1424)	mem 20675MB
[2025-04-03 02:14:29 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][260/311]	eta 0:00:44 lr 0.000168	time 0.8758 (0.8814)	loss 0.4995 (0.5399)	grad_norm 3.3418 (3.1428)	mem 20675MB
[2025-04-03 02:14:31 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][262/311]	eta 0:00:43 lr 0.000168	time 0.8755 (0.8814)	loss 0.5485 (0.5396)	grad_norm 3.5722 (3.1445)	mem 20675MB
[2025-04-03 02:14:33 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][264/311]	eta 0:00:41 lr 0.000167	time 0.8758 (0.8813)	loss 0.5369 (0.5395)	grad_norm 3.9452 (3.1479)	mem 20675MB
[2025-04-03 02:14:34 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][266/311]	eta 0:00:39 lr 0.000167	time 0.8758 (0.8813)	loss 0.4984 (0.5396)	grad_norm 3.2517 (3.1441)	mem 20675MB
[2025-04-03 02:14:36 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][268/311]	eta 0:00:37 lr 0.000167	time 0.8754 (0.8813)	loss 0.4584 (0.5390)	grad_norm 4.5588 (3.1494)	mem 20675MB
[2025-04-03 02:14:38 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][270/311]	eta 0:00:36 lr 0.000167	time 0.8759 (0.8812)	loss 0.4394 (0.5388)	grad_norm 3.1241 (3.1443)	mem 20675MB
[2025-04-03 02:14:40 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][272/311]	eta 0:00:34 lr 0.000166	time 0.8754 (0.8812)	loss 0.5030 (0.5389)	grad_norm 2.5105 (3.1422)	mem 20675MB
[2025-04-03 02:14:41 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][274/311]	eta 0:00:32 lr 0.000166	time 0.8759 (0.8812)	loss 0.5818 (0.5391)	grad_norm 3.4562 (3.1423)	mem 20675MB
[2025-04-03 02:14:43 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][276/311]	eta 0:00:30 lr 0.000166	time 0.8758 (0.8811)	loss 0.6046 (0.5396)	grad_norm 2.5914 (3.1412)	mem 20675MB
[2025-04-03 02:14:45 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][278/311]	eta 0:00:29 lr 0.000165	time 0.8755 (0.8811)	loss 0.5408 (0.5392)	grad_norm 4.8509 (3.1491)	mem 20675MB
[2025-04-03 02:14:47 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][280/311]	eta 0:00:27 lr 0.000165	time 0.8758 (0.8811)	loss 0.4410 (0.5388)	grad_norm 3.2909 (3.1548)	mem 20675MB
[2025-04-03 02:14:48 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][282/311]	eta 0:00:25 lr 0.000165	time 0.8755 (0.8810)	loss 0.5777 (0.5391)	grad_norm 2.6821 (3.1509)	mem 20675MB
[2025-04-03 02:14:50 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][284/311]	eta 0:00:23 lr 0.000165	time 0.8754 (0.8810)	loss 0.5819 (0.5395)	grad_norm 2.5827 (3.1487)	mem 20675MB
[2025-04-03 02:14:52 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][286/311]	eta 0:00:22 lr 0.000164	time 0.8754 (0.8810)	loss 0.4894 (0.5391)	grad_norm 3.0502 (3.1495)	mem 20675MB
[2025-04-03 02:14:54 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][288/311]	eta 0:00:20 lr 0.000164	time 0.8757 (0.8809)	loss 0.6175 (0.5397)	grad_norm 2.7888 (3.1456)	mem 20675MB
[2025-04-03 02:14:55 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][290/311]	eta 0:00:18 lr 0.000164	time 0.8754 (0.8809)	loss 0.5972 (0.5399)	grad_norm 2.3210 (3.1411)	mem 20675MB
[2025-04-03 02:14:57 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][292/311]	eta 0:00:16 lr 0.000163	time 0.8753 (0.8809)	loss 0.3966 (0.5397)	grad_norm 4.3300 (3.1420)	mem 20675MB
[2025-04-03 02:14:59 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][294/311]	eta 0:00:14 lr 0.000163	time 0.8762 (0.8809)	loss 0.6215 (0.5400)	grad_norm 3.8481 (3.1416)	mem 20675MB
[2025-04-03 02:15:01 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][296/311]	eta 0:00:13 lr 0.000163	time 0.8752 (0.8808)	loss 0.6112 (0.5399)	grad_norm 2.2766 (3.1429)	mem 20675MB
[2025-04-03 02:15:02 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][298/311]	eta 0:00:11 lr 0.000163	time 0.8753 (0.8808)	loss 0.4542 (0.5396)	grad_norm 2.6440 (3.1386)	mem 20675MB
[2025-04-03 02:15:04 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][300/311]	eta 0:00:09 lr 0.000162	time 0.8754 (0.8808)	loss 0.5674 (0.5398)	grad_norm 3.5060 (3.1398)	mem 20675MB
[2025-04-03 02:15:06 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][302/311]	eta 0:00:07 lr 0.000162	time 0.8754 (0.8807)	loss 0.5496 (0.5398)	grad_norm 1.9490 (3.1357)	mem 20675MB
[2025-04-03 02:15:08 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][304/311]	eta 0:00:06 lr 0.000162	time 0.8754 (0.8807)	loss 0.5470 (0.5403)	grad_norm 1.9431 (3.1298)	mem 20675MB
[2025-04-03 02:15:09 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][306/311]	eta 0:00:04 lr 0.000161	time 0.8755 (0.8807)	loss 0.6734 (0.5409)	grad_norm 2.6815 (3.1264)	mem 20675MB
[2025-04-03 02:15:11 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][308/311]	eta 0:00:02 lr 0.000161	time 0.8752 (0.8807)	loss 0.4757 (0.5408)	grad_norm 3.5992 (3.1258)	mem 20675MB
[2025-04-03 02:15:13 simmim_finetune] (main_finetune.py 252): INFO Train: [22/30][310/311]	eta 0:00:00 lr 0.000161	time 0.8754 (0.8806)	loss 0.6254 (0.5414)	grad_norm 2.9642 (3.1245)	mem 20675MB
[2025-04-03 02:15:13 simmim_finetune] (main_finetune.py 260): INFO EPOCH 22 training takes 0:04:34
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.449 (1.449)	Loss 0.5580 (0.5580)	Acc@1 77.344 (77.344)	Mem 20675MB
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 78.169
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 78.2%
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:15:15 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [8.11914706137137e-07, 8.11914706137137e-07, 1.1317876602268624e-06, 1.1317876602268624e-06, 1.6238998972879785e-06, 1.6238998972879785e-06, 2.3809956466127727e-06, 2.3809956466127727e-06, 3.5457583378816863e-06, 3.5457583378816863e-06, 5.3377009398338614e-06, 5.3377009398338614e-06, 8.094535712067975e-06, 8.094535712067975e-06, 1.2335819977043535e-05, 1.2335819977043535e-05, 1.886087269239055e-05, 1.886087269239055e-05, 2.8899415331385965e-05, 2.8899415331385965e-05, 4.434332708368659e-05, 4.434332708368659e-05, 6.810319131799525e-05, 6.810319131799525e-05, 0.00010465682860154704, 0.00010465682860154704, 0.00016089319365316518, 0.00016089319365316518]
[2025-04-03 02:15:17 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][0/311]	eta 0:11:51 lr 0.000161	time 2.2867 (2.2867)	loss 0.6634 (0.6634)	grad_norm 3.2136 (3.2136)	mem 20675MB
[2025-04-03 02:15:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][2/311]	eta 0:06:56 lr 0.000160	time 0.8760 (1.3468)	loss 0.6108 (0.6240)	grad_norm 2.0109 (2.4417)	mem 20675MB
[2025-04-03 02:15:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][4/311]	eta 0:05:55 lr 0.000160	time 0.8765 (1.1589)	loss 0.6168 (0.6117)	grad_norm 2.4025 (2.4953)	mem 20675MB
[2025-04-03 02:15:22 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][6/311]	eta 0:05:28 lr 0.000160	time 0.8757 (1.0786)	loss 0.5343 (0.6022)	grad_norm 2.1179 (2.4857)	mem 20675MB
[2025-04-03 02:15:24 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][8/311]	eta 0:05:13 lr 0.000160	time 0.8755 (1.0336)	loss 0.5468 (0.5957)	grad_norm 2.1588 (2.5066)	mem 20675MB
[2025-04-03 02:15:26 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][10/311]	eta 0:05:02 lr 0.000159	time 0.8754 (1.0050)	loss 0.5453 (0.5862)	grad_norm 2.4562 (2.5276)	mem 20675MB
[2025-04-03 02:15:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][12/311]	eta 0:04:54 lr 0.000159	time 0.8754 (0.9852)	loss 0.5050 (0.5778)	grad_norm 3.5294 (2.6234)	mem 20675MB
[2025-04-03 02:15:29 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][14/311]	eta 0:04:48 lr 0.000159	time 0.8754 (0.9707)	loss 0.5998 (0.5804)	grad_norm 2.6579 (2.5872)	mem 20675MB
[2025-04-03 02:15:31 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][16/311]	eta 0:04:43 lr 0.000159	time 0.8755 (0.9596)	loss 0.5789 (0.5826)	grad_norm 1.7923 (2.5429)	mem 20675MB
[2025-04-03 02:15:33 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][18/311]	eta 0:04:38 lr 0.000158	time 0.8758 (0.9509)	loss 0.4216 (0.5733)	grad_norm 3.7654 (2.6218)	mem 20675MB
[2025-04-03 02:15:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][20/311]	eta 0:04:34 lr 0.000158	time 0.8757 (0.9438)	loss 0.6188 (0.5689)	grad_norm 2.8439 (2.7204)	mem 20675MB
[2025-04-03 02:15:36 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][22/311]	eta 0:04:31 lr 0.000158	time 0.8758 (0.9380)	loss 0.6274 (0.5745)	grad_norm 2.4335 (2.6916)	mem 20675MB
[2025-04-03 02:15:38 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][24/311]	eta 0:04:27 lr 0.000157	time 0.8760 (0.9331)	loss 0.4231 (0.5686)	grad_norm 3.3499 (2.7225)	mem 20675MB
[2025-04-03 02:15:40 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][26/311]	eta 0:04:24 lr 0.000157	time 0.8758 (0.9289)	loss 0.4292 (0.5594)	grad_norm 4.1551 (2.8223)	mem 20675MB
[2025-04-03 02:15:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][28/311]	eta 0:04:21 lr 0.000157	time 0.8758 (0.9253)	loss 0.5870 (0.5620)	grad_norm 2.4542 (2.8063)	mem 20675MB
[2025-04-03 02:15:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][30/311]	eta 0:04:19 lr 0.000157	time 0.8757 (0.9223)	loss 0.5323 (0.5615)	grad_norm 2.5520 (2.8086)	mem 20675MB
[2025-04-03 02:15:45 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][32/311]	eta 0:04:16 lr 0.000156	time 0.8756 (0.9195)	loss 0.4841 (0.5554)	grad_norm 3.4971 (2.9208)	mem 20675MB
[2025-04-03 02:15:47 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][34/311]	eta 0:04:14 lr 0.000156	time 0.8755 (0.9170)	loss 0.4905 (0.5496)	grad_norm 3.0669 (2.9522)	mem 20675MB
[2025-04-03 02:15:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][36/311]	eta 0:04:11 lr 0.000156	time 0.8761 (0.9148)	loss 0.5942 (0.5515)	grad_norm 2.9243 (2.9239)	mem 20675MB
[2025-04-03 02:15:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][38/311]	eta 0:04:09 lr 0.000155	time 0.8758 (0.9129)	loss 0.6056 (0.5507)	grad_norm 2.2043 (2.9094)	mem 20675MB
[2025-04-03 02:15:52 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][40/311]	eta 0:04:06 lr 0.000155	time 0.8755 (0.9111)	loss 0.5584 (0.5468)	grad_norm 3.0073 (2.9306)	mem 20675MB
[2025-04-03 02:15:54 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][42/311]	eta 0:04:04 lr 0.000155	time 0.8759 (0.9095)	loss 0.6515 (0.5514)	grad_norm 2.9374 (2.9374)	mem 20675MB
[2025-04-03 02:15:56 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][44/311]	eta 0:04:02 lr 0.000155	time 0.8759 (0.9081)	loss 0.4725 (0.5515)	grad_norm 5.3167 (2.9828)	mem 20675MB
[2025-04-03 02:15:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][46/311]	eta 0:04:00 lr 0.000154	time 0.8754 (0.9067)	loss 0.5274 (0.5508)	grad_norm 2.6586 (2.9632)	mem 20675MB
[2025-04-03 02:15:59 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][48/311]	eta 0:03:58 lr 0.000154	time 0.8755 (0.9055)	loss 0.5917 (0.5513)	grad_norm 3.1946 (2.9504)	mem 20675MB
[2025-04-03 02:16:01 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][50/311]	eta 0:03:56 lr 0.000154	time 0.8756 (0.9044)	loss 0.5271 (0.5495)	grad_norm 3.0727 (2.9542)	mem 20675MB
[2025-04-03 02:16:03 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][52/311]	eta 0:03:53 lr 0.000154	time 0.8761 (0.9033)	loss 0.4583 (0.5482)	grad_norm 3.9001 (2.9672)	mem 20675MB
[2025-04-03 02:16:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][54/311]	eta 0:03:51 lr 0.000153	time 0.8754 (0.9023)	loss 0.5254 (0.5457)	grad_norm 3.5344 (2.9792)	mem 20675MB
[2025-04-03 02:16:06 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][56/311]	eta 0:03:49 lr 0.000153	time 0.8758 (0.9014)	loss 0.4581 (0.5437)	grad_norm 3.2081 (2.9892)	mem 20675MB
[2025-04-03 02:16:08 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][58/311]	eta 0:03:47 lr 0.000153	time 0.8756 (0.9006)	loss 0.5993 (0.5440)	grad_norm 3.1336 (3.0101)	mem 20675MB
[2025-04-03 02:16:10 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][60/311]	eta 0:03:45 lr 0.000152	time 0.8759 (0.8998)	loss 0.5214 (0.5427)	grad_norm 3.2078 (3.0118)	mem 20675MB
[2025-04-03 02:16:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][62/311]	eta 0:03:43 lr 0.000152	time 0.8757 (0.8991)	loss 0.5965 (0.5455)	grad_norm 3.0933 (3.0053)	mem 20675MB
[2025-04-03 02:16:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][64/311]	eta 0:03:41 lr 0.000152	time 0.8757 (0.8984)	loss 0.5044 (0.5461)	grad_norm 5.1214 (3.0563)	mem 20675MB
[2025-04-03 02:16:15 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][66/311]	eta 0:03:39 lr 0.000152	time 0.8756 (0.8977)	loss 0.6385 (0.5487)	grad_norm 3.1098 (3.0560)	mem 20675MB
[2025-04-03 02:16:17 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][68/311]	eta 0:03:38 lr 0.000151	time 0.8756 (0.8971)	loss 0.4997 (0.5498)	grad_norm 2.7308 (3.0472)	mem 20675MB
[2025-04-03 02:16:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][70/311]	eta 0:03:36 lr 0.000151	time 0.8757 (0.8965)	loss 0.5929 (0.5515)	grad_norm 2.6635 (3.0366)	mem 20675MB
[2025-04-03 02:16:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][72/311]	eta 0:03:34 lr 0.000151	time 0.8760 (0.8960)	loss 0.5388 (0.5504)	grad_norm 2.5490 (3.0379)	mem 20675MB
[2025-04-03 02:16:22 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][74/311]	eta 0:03:32 lr 0.000150	time 0.8758 (0.8955)	loss 0.5224 (0.5492)	grad_norm 2.6931 (3.0234)	mem 20675MB
[2025-04-03 02:16:24 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][76/311]	eta 0:03:30 lr 0.000150	time 0.8753 (0.8950)	loss 0.5406 (0.5471)	grad_norm 2.5523 (3.0304)	mem 20675MB
[2025-04-03 02:16:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][78/311]	eta 0:03:28 lr 0.000150	time 0.8756 (0.8945)	loss 0.5727 (0.5481)	grad_norm 2.6365 (3.0179)	mem 20675MB
[2025-04-03 02:16:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][80/311]	eta 0:03:26 lr 0.000150	time 0.8760 (0.8941)	loss 0.5932 (0.5487)	grad_norm 2.8724 (3.0253)	mem 20675MB
[2025-04-03 02:16:29 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][82/311]	eta 0:03:24 lr 0.000149	time 0.8758 (0.8937)	loss 0.5083 (0.5497)	grad_norm 2.8926 (3.0210)	mem 20675MB
[2025-04-03 02:16:31 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][84/311]	eta 0:03:22 lr 0.000149	time 0.8756 (0.8933)	loss 0.5840 (0.5510)	grad_norm 2.3975 (3.0082)	mem 20675MB
[2025-04-03 02:16:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][86/311]	eta 0:03:20 lr 0.000149	time 0.8760 (0.8929)	loss 0.4804 (0.5502)	grad_norm 3.1482 (3.0169)	mem 20675MB
[2025-04-03 02:16:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][88/311]	eta 0:03:19 lr 0.000149	time 0.8758 (0.8925)	loss 0.5917 (0.5511)	grad_norm 2.4903 (3.0054)	mem 20675MB
[2025-04-03 02:16:36 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][90/311]	eta 0:03:17 lr 0.000148	time 0.8755 (0.8922)	loss 0.5502 (0.5514)	grad_norm 2.0456 (2.9858)	mem 20675MB
[2025-04-03 02:16:38 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][92/311]	eta 0:03:15 lr 0.000148	time 0.8757 (0.8918)	loss 0.4781 (0.5515)	grad_norm 3.2343 (2.9817)	mem 20675MB
[2025-04-03 02:16:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][94/311]	eta 0:03:13 lr 0.000148	time 0.8758 (0.8915)	loss 0.4348 (0.5494)	grad_norm 3.0974 (3.0028)	mem 20675MB
[2025-04-03 02:16:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][96/311]	eta 0:03:11 lr 0.000147	time 0.8757 (0.8912)	loss 0.5833 (0.5503)	grad_norm 2.5927 (2.9937)	mem 20675MB
[2025-04-03 02:16:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][98/311]	eta 0:03:09 lr 0.000147	time 0.8761 (0.8909)	loss 0.5483 (0.5504)	grad_norm 2.3001 (2.9803)	mem 20675MB
[2025-04-03 02:16:45 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][100/311]	eta 0:03:07 lr 0.000147	time 0.8755 (0.8906)	loss 0.5570 (0.5497)	grad_norm 2.3124 (2.9823)	mem 20675MB
[2025-04-03 02:16:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][102/311]	eta 0:03:06 lr 0.000147	time 0.8754 (0.8904)	loss 0.5034 (0.5488)	grad_norm 3.1815 (2.9912)	mem 20675MB
[2025-04-03 02:16:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][104/311]	eta 0:03:04 lr 0.000146	time 0.8754 (0.8901)	loss 0.5775 (0.5494)	grad_norm 2.5958 (2.9831)	mem 20675MB
[2025-04-03 02:16:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][106/311]	eta 0:03:02 lr 0.000146	time 0.8758 (0.8898)	loss 0.6338 (0.5495)	grad_norm 3.5946 (2.9912)	mem 20675MB
[2025-04-03 02:16:52 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][108/311]	eta 0:03:00 lr 0.000146	time 0.8758 (0.8896)	loss 0.5976 (0.5507)	grad_norm 2.8644 (2.9857)	mem 20675MB
[2025-04-03 02:16:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][110/311]	eta 0:02:58 lr 0.000146	time 0.8757 (0.8894)	loss 0.6435 (0.5520)	grad_norm 2.7374 (2.9826)	mem 20675MB
[2025-04-03 02:16:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][112/311]	eta 0:02:56 lr 0.000145	time 0.8758 (0.8891)	loss 0.6109 (0.5524)	grad_norm 2.5507 (2.9763)	mem 20675MB
[2025-04-03 02:16:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][114/311]	eta 0:02:55 lr 0.000145	time 0.8754 (0.8889)	loss 0.5802 (0.5529)	grad_norm 2.4214 (2.9640)	mem 20675MB
[2025-04-03 02:16:59 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][116/311]	eta 0:02:53 lr 0.000145	time 0.8756 (0.8887)	loss 0.4749 (0.5515)	grad_norm 4.5586 (2.9770)	mem 20675MB
[2025-04-03 02:17:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][118/311]	eta 0:02:51 lr 0.000145	time 0.8757 (0.8885)	loss 0.5952 (0.5520)	grad_norm 2.6745 (2.9696)	mem 20675MB
[2025-04-03 02:17:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][120/311]	eta 0:02:49 lr 0.000144	time 0.8759 (0.8883)	loss 0.4929 (0.5520)	grad_norm 2.8339 (2.9630)	mem 20675MB
[2025-04-03 02:17:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][122/311]	eta 0:02:47 lr 0.000144	time 0.8755 (0.8881)	loss 0.4256 (0.5515)	grad_norm 4.4444 (2.9747)	mem 20675MB
[2025-04-03 02:17:06 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][124/311]	eta 0:02:46 lr 0.000144	time 0.8776 (0.8880)	loss 0.4644 (0.5508)	grad_norm 3.4079 (2.9783)	mem 20675MB
[2025-04-03 02:17:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][126/311]	eta 0:02:44 lr 0.000143	time 0.8755 (0.8878)	loss 0.5979 (0.5504)	grad_norm 2.9893 (2.9904)	mem 20675MB
[2025-04-03 02:17:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][128/311]	eta 0:02:42 lr 0.000143	time 0.8759 (0.8876)	loss 0.4906 (0.5489)	grad_norm 4.0511 (3.0144)	mem 20675MB
[2025-04-03 02:17:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][130/311]	eta 0:02:40 lr 0.000143	time 0.8756 (0.8875)	loss 0.5206 (0.5488)	grad_norm 5.2652 (3.0468)	mem 20675MB
[2025-04-03 02:17:13 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][132/311]	eta 0:02:38 lr 0.000143	time 0.8754 (0.8873)	loss 0.5726 (0.5492)	grad_norm 2.5745 (3.0373)	mem 20675MB
[2025-04-03 02:17:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][134/311]	eta 0:02:37 lr 0.000142	time 0.8761 (0.8871)	loss 0.5896 (0.5493)	grad_norm 2.8384 (3.0309)	mem 20675MB
[2025-04-03 02:17:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][136/311]	eta 0:02:35 lr 0.000142	time 0.8756 (0.8870)	loss 0.5433 (0.5492)	grad_norm 2.3760 (3.0259)	mem 20675MB
[2025-04-03 02:17:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][138/311]	eta 0:02:33 lr 0.000142	time 0.8757 (0.8868)	loss 0.4953 (0.5496)	grad_norm 2.3053 (3.0148)	mem 20675MB
[2025-04-03 02:17:20 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][140/311]	eta 0:02:31 lr 0.000142	time 0.8758 (0.8867)	loss 0.5881 (0.5496)	grad_norm 2.2599 (3.0067)	mem 20675MB
[2025-04-03 02:17:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][142/311]	eta 0:02:29 lr 0.000141	time 0.8759 (0.8865)	loss 0.5472 (0.5484)	grad_norm 3.0348 (3.0166)	mem 20675MB
[2025-04-03 02:17:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][144/311]	eta 0:02:28 lr 0.000141	time 0.8758 (0.8864)	loss 0.5428 (0.5487)	grad_norm 2.7898 (3.0131)	mem 20675MB
[2025-04-03 02:17:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][146/311]	eta 0:02:26 lr 0.000141	time 0.8759 (0.8863)	loss 0.5244 (0.5480)	grad_norm 3.6249 (3.0143)	mem 20675MB
[2025-04-03 02:17:27 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][148/311]	eta 0:02:24 lr 0.000141	time 0.8758 (0.8862)	loss 0.5889 (0.5489)	grad_norm 2.5978 (3.0162)	mem 20675MB
[2025-04-03 02:17:29 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][150/311]	eta 0:02:22 lr 0.000140	time 0.8755 (0.8860)	loss 0.6308 (0.5492)	grad_norm 2.7874 (3.0129)	mem 20675MB
[2025-04-03 02:17:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][152/311]	eta 0:02:20 lr 0.000140	time 0.8755 (0.8859)	loss 0.4512 (0.5488)	grad_norm 3.6619 (3.0145)	mem 20675MB
[2025-04-03 02:17:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][154/311]	eta 0:02:19 lr 0.000140	time 0.8758 (0.8858)	loss 0.6033 (0.5490)	grad_norm 2.2103 (3.0155)	mem 20675MB
[2025-04-03 02:17:34 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][156/311]	eta 0:02:17 lr 0.000139	time 0.8755 (0.8857)	loss 0.6033 (0.5497)	grad_norm 2.5949 (3.0096)	mem 20675MB
[2025-04-03 02:17:36 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][158/311]	eta 0:02:15 lr 0.000139	time 0.8773 (0.8856)	loss 0.4820 (0.5492)	grad_norm 3.4198 (3.0086)	mem 20675MB
[2025-04-03 02:17:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][160/311]	eta 0:02:13 lr 0.000139	time 0.8757 (0.8855)	loss 0.6352 (0.5494)	grad_norm 3.4598 (3.0168)	mem 20675MB
[2025-04-03 02:17:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][162/311]	eta 0:02:11 lr 0.000139	time 0.8775 (0.8854)	loss 0.4292 (0.5489)	grad_norm 3.3465 (3.0164)	mem 20675MB
[2025-04-03 02:17:41 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][164/311]	eta 0:02:10 lr 0.000138	time 0.8758 (0.8853)	loss 0.5151 (0.5482)	grad_norm 3.6827 (3.0195)	mem 20675MB
[2025-04-03 02:17:43 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][166/311]	eta 0:02:08 lr 0.000138	time 0.8755 (0.8852)	loss 0.4359 (0.5478)	grad_norm 3.3158 (3.0199)	mem 20675MB
[2025-04-03 02:17:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][168/311]	eta 0:02:06 lr 0.000138	time 0.8758 (0.8851)	loss 0.5414 (0.5471)	grad_norm 3.1718 (3.0201)	mem 20675MB
[2025-04-03 02:17:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][170/311]	eta 0:02:04 lr 0.000138	time 0.8755 (0.8850)	loss 0.5919 (0.5477)	grad_norm 2.4285 (3.0123)	mem 20675MB
[2025-04-03 02:17:48 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][172/311]	eta 0:02:02 lr 0.000137	time 0.8757 (0.8849)	loss 0.4007 (0.5474)	grad_norm 5.2868 (3.0335)	mem 20675MB
[2025-04-03 02:17:50 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][174/311]	eta 0:02:01 lr 0.000137	time 0.8759 (0.8848)	loss 0.4448 (0.5467)	grad_norm 4.0224 (3.0405)	mem 20675MB
[2025-04-03 02:17:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][176/311]	eta 0:01:59 lr 0.000137	time 0.8759 (0.8847)	loss 0.4819 (0.5464)	grad_norm 3.3845 (3.0373)	mem 20675MB
[2025-04-03 02:17:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][178/311]	eta 0:01:57 lr 0.000137	time 0.8757 (0.8846)	loss 0.6455 (0.5468)	grad_norm 2.7404 (3.0359)	mem 20675MB
[2025-04-03 02:17:55 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][180/311]	eta 0:01:55 lr 0.000136	time 0.8758 (0.8845)	loss 0.4806 (0.5463)	grad_norm 5.3531 (3.0485)	mem 20675MB
[2025-04-03 02:17:57 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][182/311]	eta 0:01:54 lr 0.000136	time 0.8754 (0.8844)	loss 0.5793 (0.5464)	grad_norm 2.6989 (3.0487)	mem 20675MB
[2025-04-03 02:17:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][184/311]	eta 0:01:52 lr 0.000136	time 0.8763 (0.8843)	loss 0.5482 (0.5464)	grad_norm 2.6497 (3.0425)	mem 20675MB
[2025-04-03 02:18:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][186/311]	eta 0:01:50 lr 0.000135	time 0.8760 (0.8843)	loss 0.5789 (0.5464)	grad_norm 2.6428 (3.0403)	mem 20675MB
[2025-04-03 02:18:02 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][188/311]	eta 0:01:48 lr 0.000135	time 0.8758 (0.8842)	loss 0.6395 (0.5468)	grad_norm 3.0136 (3.0395)	mem 20675MB
[2025-04-03 02:18:04 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][190/311]	eta 0:01:46 lr 0.000135	time 0.8759 (0.8841)	loss 0.5807 (0.5473)	grad_norm 3.6721 (3.0450)	mem 20675MB
[2025-04-03 02:18:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][192/311]	eta 0:01:45 lr 0.000135	time 0.8756 (0.8840)	loss 0.5678 (0.5474)	grad_norm 2.2120 (3.0398)	mem 20675MB
[2025-04-03 02:18:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][194/311]	eta 0:01:43 lr 0.000134	time 0.8755 (0.8839)	loss 0.4666 (0.5474)	grad_norm 3.4127 (3.0399)	mem 20675MB
[2025-04-03 02:18:09 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][196/311]	eta 0:01:41 lr 0.000134	time 0.8757 (0.8839)	loss 0.5124 (0.5469)	grad_norm 3.3735 (3.0456)	mem 20675MB
[2025-04-03 02:18:11 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][198/311]	eta 0:01:39 lr 0.000134	time 0.8758 (0.8838)	loss 0.5050 (0.5471)	grad_norm 3.4930 (3.0455)	mem 20675MB
[2025-04-03 02:18:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][200/311]	eta 0:01:38 lr 0.000134	time 0.8757 (0.8837)	loss 0.5365 (0.5470)	grad_norm 3.1187 (3.0448)	mem 20675MB
[2025-04-03 02:18:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][202/311]	eta 0:01:36 lr 0.000133	time 0.8757 (0.8837)	loss 0.5784 (0.5465)	grad_norm 2.7656 (3.0472)	mem 20675MB
[2025-04-03 02:18:16 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][204/311]	eta 0:01:34 lr 0.000133	time 0.8755 (0.8836)	loss 0.4093 (0.5464)	grad_norm 4.5349 (3.0518)	mem 20675MB
[2025-04-03 02:18:18 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][206/311]	eta 0:01:32 lr 0.000133	time 0.8755 (0.8835)	loss 0.5791 (0.5469)	grad_norm 2.3400 (3.0467)	mem 20675MB
[2025-04-03 02:18:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][208/311]	eta 0:01:30 lr 0.000133	time 0.8757 (0.8834)	loss 0.6094 (0.5477)	grad_norm 2.7485 (3.0470)	mem 20675MB
[2025-04-03 02:18:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][210/311]	eta 0:01:29 lr 0.000132	time 0.8758 (0.8834)	loss 0.4458 (0.5468)	grad_norm 3.6412 (3.0503)	mem 20675MB
[2025-04-03 02:18:23 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][212/311]	eta 0:01:27 lr 0.000132	time 0.8757 (0.8833)	loss 0.5032 (0.5470)	grad_norm 3.0176 (3.0491)	mem 20675MB
[2025-04-03 02:18:25 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][214/311]	eta 0:01:25 lr 0.000132	time 0.8756 (0.8833)	loss 0.5343 (0.5467)	grad_norm 3.4087 (3.0541)	mem 20675MB
[2025-04-03 02:18:26 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][216/311]	eta 0:01:23 lr 0.000132	time 0.8755 (0.8832)	loss 0.5592 (0.5467)	grad_norm 2.9441 (3.0590)	mem 20675MB
[2025-04-03 02:18:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][218/311]	eta 0:01:22 lr 0.000131	time 0.8758 (0.8831)	loss 0.4192 (0.5463)	grad_norm 3.0161 (3.0583)	mem 20675MB
[2025-04-03 02:18:30 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][220/311]	eta 0:01:20 lr 0.000131	time 0.8757 (0.8831)	loss 0.6463 (0.5464)	grad_norm 2.6527 (3.0560)	mem 20675MB
[2025-04-03 02:18:32 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][222/311]	eta 0:01:18 lr 0.000131	time 0.8761 (0.8830)	loss 0.6553 (0.5466)	grad_norm 2.8393 (3.0522)	mem 20675MB
[2025-04-03 02:18:33 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][224/311]	eta 0:01:16 lr 0.000131	time 0.8757 (0.8830)	loss 0.6223 (0.5472)	grad_norm 4.1763 (3.0537)	mem 20675MB
[2025-04-03 02:18:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][226/311]	eta 0:01:15 lr 0.000130	time 0.8756 (0.8829)	loss 0.5856 (0.5469)	grad_norm 2.4790 (3.0625)	mem 20675MB
[2025-04-03 02:18:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][228/311]	eta 0:01:13 lr 0.000130	time 0.8757 (0.8829)	loss 0.5879 (0.5471)	grad_norm 2.2559 (3.0564)	mem 20675MB
[2025-04-03 02:18:39 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][230/311]	eta 0:01:11 lr 0.000130	time 0.8757 (0.8828)	loss 0.4796 (0.5470)	grad_norm 3.7797 (3.0630)	mem 20675MB
[2025-04-03 02:18:40 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][232/311]	eta 0:01:09 lr 0.000130	time 0.8777 (0.8828)	loss 0.5737 (0.5472)	grad_norm 2.7588 (3.0578)	mem 20675MB
[2025-04-03 02:18:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][234/311]	eta 0:01:07 lr 0.000129	time 0.8757 (0.8827)	loss 0.5604 (0.5474)	grad_norm 2.3147 (3.0541)	mem 20675MB
[2025-04-03 02:18:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][236/311]	eta 0:01:06 lr 0.000129	time 0.8766 (0.8827)	loss 0.4998 (0.5472)	grad_norm 3.8256 (3.0534)	mem 20675MB
[2025-04-03 02:18:46 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][238/311]	eta 0:01:04 lr 0.000129	time 0.8756 (0.8826)	loss 0.6397 (0.5477)	grad_norm 2.5163 (3.0482)	mem 20675MB
[2025-04-03 02:18:47 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][240/311]	eta 0:01:02 lr 0.000129	time 0.8757 (0.8826)	loss 0.5349 (0.5476)	grad_norm 3.2991 (3.0469)	mem 20675MB
[2025-04-03 02:18:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][242/311]	eta 0:01:00 lr 0.000128	time 0.8759 (0.8825)	loss 0.4007 (0.5468)	grad_norm 3.1906 (3.0484)	mem 20675MB
[2025-04-03 02:18:51 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][244/311]	eta 0:00:59 lr 0.000128	time 0.8761 (0.8825)	loss 0.4064 (0.5463)	grad_norm 4.4216 (3.0528)	mem 20675MB
[2025-04-03 02:18:53 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][246/311]	eta 0:00:57 lr 0.000128	time 0.8755 (0.8824)	loss 0.5110 (0.5456)	grad_norm 3.5816 (3.0551)	mem 20675MB
[2025-04-03 02:18:54 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][248/311]	eta 0:00:55 lr 0.000127	time 0.8757 (0.8824)	loss 0.4930 (0.5457)	grad_norm 3.2487 (3.0536)	mem 20675MB
[2025-04-03 02:18:56 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][250/311]	eta 0:00:53 lr 0.000127	time 0.8756 (0.8823)	loss 0.6263 (0.5457)	grad_norm 2.7206 (3.0582)	mem 20675MB
[2025-04-03 02:18:58 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][252/311]	eta 0:00:52 lr 0.000127	time 0.8762 (0.8823)	loss 0.5475 (0.5456)	grad_norm 4.3139 (3.0589)	mem 20675MB
[2025-04-03 02:19:00 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][254/311]	eta 0:00:50 lr 0.000127	time 0.8764 (0.8822)	loss 0.5661 (0.5462)	grad_norm 2.8321 (3.0549)	mem 20675MB
[2025-04-03 02:19:01 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][256/311]	eta 0:00:48 lr 0.000126	time 0.8776 (0.8822)	loss 0.6014 (0.5461)	grad_norm 2.5094 (3.0585)	mem 20675MB
[2025-04-03 02:19:03 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][258/311]	eta 0:00:46 lr 0.000126	time 0.8756 (0.8822)	loss 0.5953 (0.5463)	grad_norm 2.1244 (3.0533)	mem 20675MB
[2025-04-03 02:19:05 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][260/311]	eta 0:00:44 lr 0.000126	time 0.8755 (0.8821)	loss 0.4629 (0.5460)	grad_norm 3.5484 (3.0537)	mem 20675MB
[2025-04-03 02:19:07 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][262/311]	eta 0:00:43 lr 0.000126	time 0.8754 (0.8821)	loss 0.5151 (0.5455)	grad_norm 1.9709 (3.0510)	mem 20675MB
[2025-04-03 02:19:08 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][264/311]	eta 0:00:41 lr 0.000125	time 0.8758 (0.8820)	loss 0.5610 (0.5457)	grad_norm 2.8335 (3.0503)	mem 20675MB
[2025-04-03 02:19:10 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][266/311]	eta 0:00:39 lr 0.000125	time 0.8759 (0.8820)	loss 0.5987 (0.5458)	grad_norm 2.6097 (3.0484)	mem 20675MB
[2025-04-03 02:19:12 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][268/311]	eta 0:00:37 lr 0.000125	time 0.8755 (0.8820)	loss 0.5814 (0.5458)	grad_norm 3.1992 (3.0464)	mem 20675MB
[2025-04-03 02:19:14 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][270/311]	eta 0:00:36 lr 0.000125	time 0.8756 (0.8819)	loss 0.5655 (0.5457)	grad_norm 3.4265 (3.0472)	mem 20675MB
[2025-04-03 02:19:15 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][272/311]	eta 0:00:34 lr 0.000124	time 0.8754 (0.8819)	loss 0.5854 (0.5463)	grad_norm 2.6356 (3.0443)	mem 20675MB
[2025-04-03 02:19:17 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][274/311]	eta 0:00:32 lr 0.000124	time 0.8755 (0.8818)	loss 0.5930 (0.5465)	grad_norm 3.3096 (3.0492)	mem 20675MB
[2025-04-03 02:19:19 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][276/311]	eta 0:00:30 lr 0.000124	time 0.8754 (0.8818)	loss 0.3948 (0.5456)	grad_norm 3.6375 (3.0558)	mem 20675MB
[2025-04-03 02:19:21 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][278/311]	eta 0:00:29 lr 0.000124	time 0.8757 (0.8818)	loss 0.5990 (0.5460)	grad_norm 2.4874 (3.0510)	mem 20675MB
[2025-04-03 02:19:22 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][280/311]	eta 0:00:27 lr 0.000123	time 0.8758 (0.8817)	loss 0.5125 (0.5459)	grad_norm 3.6014 (3.0514)	mem 20675MB
[2025-04-03 02:19:24 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][282/311]	eta 0:00:25 lr 0.000123	time 0.8760 (0.8817)	loss 0.4357 (0.5454)	grad_norm 2.5742 (3.0477)	mem 20675MB
[2025-04-03 02:19:26 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][284/311]	eta 0:00:23 lr 0.000123	time 0.8760 (0.8817)	loss 0.5088 (0.5450)	grad_norm 2.6019 (3.0491)	mem 20675MB
[2025-04-03 02:19:28 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][286/311]	eta 0:00:22 lr 0.000123	time 0.8758 (0.8816)	loss 0.6344 (0.5454)	grad_norm 3.1902 (3.0487)	mem 20675MB
[2025-04-03 02:19:29 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][288/311]	eta 0:00:20 lr 0.000122	time 0.8758 (0.8816)	loss 0.4710 (0.5448)	grad_norm 2.9554 (3.0485)	mem 20675MB
[2025-04-03 02:19:31 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][290/311]	eta 0:00:18 lr 0.000122	time 0.8754 (0.8816)	loss 0.6259 (0.5446)	grad_norm 4.0347 (3.0589)	mem 20675MB
[2025-04-03 02:19:33 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][292/311]	eta 0:00:16 lr 0.000122	time 0.8757 (0.8815)	loss 0.5449 (0.5441)	grad_norm 2.9807 (3.0613)	mem 20675MB
[2025-04-03 02:19:35 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][294/311]	eta 0:00:14 lr 0.000122	time 0.8755 (0.8815)	loss 0.5713 (0.5444)	grad_norm 2.4245 (3.0567)	mem 20675MB
[2025-04-03 02:19:37 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][296/311]	eta 0:00:13 lr 0.000121	time 0.8752 (0.8814)	loss 0.5174 (0.5440)	grad_norm 3.8212 (3.0639)	mem 20675MB
[2025-04-03 02:19:38 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][298/311]	eta 0:00:11 lr 0.000121	time 0.8757 (0.8814)	loss 0.5073 (0.5434)	grad_norm 3.4049 (3.0680)	mem 20675MB
[2025-04-03 02:19:40 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][300/311]	eta 0:00:09 lr 0.000121	time 0.8755 (0.8814)	loss 0.5806 (0.5439)	grad_norm 2.4946 (3.0640)	mem 20675MB
[2025-04-03 02:19:42 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][302/311]	eta 0:00:07 lr 0.000121	time 0.8754 (0.8814)	loss 0.5926 (0.5444)	grad_norm 2.7486 (3.0628)	mem 20675MB
[2025-04-03 02:19:44 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][304/311]	eta 0:00:06 lr 0.000120	time 0.8760 (0.8813)	loss 0.4632 (0.5437)	grad_norm 3.5026 (3.0685)	mem 20675MB
[2025-04-03 02:19:45 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][306/311]	eta 0:00:04 lr 0.000120	time 0.8752 (0.8813)	loss 0.4602 (0.5435)	grad_norm 4.8282 (3.0712)	mem 20675MB
[2025-04-03 02:19:47 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][308/311]	eta 0:00:02 lr 0.000120	time 0.8757 (0.8813)	loss 0.6280 (0.5438)	grad_norm 3.5960 (3.0729)	mem 20675MB
[2025-04-03 02:19:49 simmim_finetune] (main_finetune.py 252): INFO Train: [23/30][310/311]	eta 0:00:00 lr 0.000120	time 0.8755 (0.8812)	loss 0.6639 (0.5444)	grad_norm 3.0192 (3.0723)	mem 20675MB
[2025-04-03 02:19:49 simmim_finetune] (main_finetune.py 260): INFO EPOCH 23 training takes 0:04:34
[2025-04-03 02:19:50 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.405 (1.405)	Loss 0.5464 (0.5464)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 02:19:50 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 02:19:50 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 02:19:50 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:19:50 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [6.678745095716295e-07, 6.678745095716295e-07, 9.057518107982868e-07, 9.057518107982868e-07, 1.271716889608529e-06, 1.271716889608529e-06, 1.8347400877781324e-06, 1.8347400877781324e-06, 2.700929623423676e-06, 2.700929623423676e-06, 4.033528909032204e-06, 4.033528909032204e-06, 6.0836816561222474e-06, 6.0836816561222474e-06, 9.237762805491542e-06, 9.237762805491542e-06, 1.4090195342982769e-05, 1.4090195342982769e-05, 2.155547616989235e-05, 2.155547616989235e-05, 3.3040523595907085e-05, 3.3040523595907085e-05, 5.070982732823745e-05, 5.070982732823745e-05, 7.789337153182264e-05, 7.789337153182264e-05, 0.00011971420876810751, 0.00011971420876810751]
[2025-04-03 02:19:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][0/311]	eta 0:11:02 lr 0.000120	time 2.1315 (2.1315)	loss 0.5750 (0.5750)	grad_norm 2.9683 (2.9683)	mem 20675MB
[2025-04-03 02:19:54 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][2/311]	eta 0:06:40 lr 0.000119	time 0.8762 (1.2953)	loss 0.5939 (0.5646)	grad_norm 2.6810 (3.3092)	mem 20675MB
[2025-04-03 02:19:56 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][4/311]	eta 0:05:46 lr 0.000119	time 0.8760 (1.1279)	loss 0.6147 (0.5711)	grad_norm 2.9690 (3.0976)	mem 20675MB
[2025-04-03 02:19:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][6/311]	eta 0:05:22 lr 0.000119	time 0.8759 (1.0562)	loss 0.4337 (0.5384)	grad_norm 3.1204 (3.2215)	mem 20675MB
[2025-04-03 02:20:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][8/311]	eta 0:05:07 lr 0.000119	time 0.8761 (1.0163)	loss 0.5051 (0.5429)	grad_norm 2.8018 (3.1522)	mem 20675MB
[2025-04-03 02:20:01 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][10/311]	eta 0:04:58 lr 0.000118	time 0.8755 (0.9908)	loss 0.5490 (0.5538)	grad_norm 2.6172 (3.0716)	mem 20675MB
[2025-04-03 02:20:03 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][12/311]	eta 0:04:51 lr 0.000118	time 0.8758 (0.9733)	loss 0.5728 (0.5498)	grad_norm 2.6116 (3.0399)	mem 20675MB
[2025-04-03 02:20:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][14/311]	eta 0:04:45 lr 0.000118	time 0.8779 (0.9606)	loss 0.5557 (0.5458)	grad_norm 2.7367 (3.0360)	mem 20675MB
[2025-04-03 02:20:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][16/311]	eta 0:04:40 lr 0.000118	time 0.8759 (0.9507)	loss 0.6196 (0.5511)	grad_norm 2.8557 (2.9556)	mem 20675MB
[2025-04-03 02:20:08 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][18/311]	eta 0:04:36 lr 0.000117	time 0.8754 (0.9429)	loss 0.4875 (0.5483)	grad_norm 2.5805 (2.9428)	mem 20675MB
[2025-04-03 02:20:10 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][20/311]	eta 0:04:32 lr 0.000117	time 0.8759 (0.9366)	loss 0.5755 (0.5480)	grad_norm 2.4276 (2.9127)	mem 20675MB
[2025-04-03 02:20:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][22/311]	eta 0:04:29 lr 0.000117	time 0.8759 (0.9314)	loss 0.4967 (0.5434)	grad_norm 4.2687 (2.9896)	mem 20675MB
[2025-04-03 02:20:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][24/311]	eta 0:04:26 lr 0.000117	time 0.8756 (0.9270)	loss 0.5768 (0.5390)	grad_norm 1.8077 (2.9627)	mem 20675MB
[2025-04-03 02:20:15 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][26/311]	eta 0:04:23 lr 0.000116	time 0.8756 (0.9233)	loss 0.5838 (0.5389)	grad_norm 3.2688 (3.0076)	mem 20675MB
[2025-04-03 02:20:17 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][28/311]	eta 0:04:20 lr 0.000116	time 0.8757 (0.9201)	loss 0.5332 (0.5378)	grad_norm 5.3750 (3.1111)	mem 20675MB
[2025-04-03 02:20:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][30/311]	eta 0:04:17 lr 0.000116	time 0.8758 (0.9173)	loss 0.6253 (0.5416)	grad_norm 2.9857 (3.0871)	mem 20675MB
[2025-04-03 02:20:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][32/311]	eta 0:04:15 lr 0.000116	time 0.8756 (0.9148)	loss 0.5515 (0.5400)	grad_norm 1.8703 (3.0995)	mem 20675MB
[2025-04-03 02:20:22 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][34/311]	eta 0:04:12 lr 0.000115	time 0.8762 (0.9127)	loss 0.5887 (0.5431)	grad_norm 2.8698 (3.0896)	mem 20675MB
[2025-04-03 02:20:24 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][36/311]	eta 0:04:10 lr 0.000115	time 0.8758 (0.9107)	loss 0.5066 (0.5377)	grad_norm 3.7455 (3.1412)	mem 20675MB
[2025-04-03 02:20:26 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][38/311]	eta 0:04:08 lr 0.000115	time 0.8756 (0.9090)	loss 0.5794 (0.5397)	grad_norm 4.1090 (3.1370)	mem 20675MB
[2025-04-03 02:20:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][40/311]	eta 0:04:05 lr 0.000115	time 0.8758 (0.9074)	loss 0.5707 (0.5384)	grad_norm 2.5320 (3.1713)	mem 20675MB
[2025-04-03 02:20:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][42/311]	eta 0:04:03 lr 0.000114	time 0.8758 (0.9060)	loss 0.4920 (0.5369)	grad_norm 3.5539 (3.2148)	mem 20675MB
[2025-04-03 02:20:31 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][44/311]	eta 0:04:01 lr 0.000114	time 0.8757 (0.9047)	loss 0.5089 (0.5347)	grad_norm 3.3886 (3.2197)	mem 20675MB
[2025-04-03 02:20:33 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][46/311]	eta 0:03:59 lr 0.000114	time 0.8755 (0.9035)	loss 0.5918 (0.5350)	grad_norm 2.7813 (3.2320)	mem 20675MB
[2025-04-03 02:20:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][48/311]	eta 0:03:57 lr 0.000114	time 0.8758 (0.9024)	loss 0.5188 (0.5350)	grad_norm 3.1619 (3.2169)	mem 20675MB
[2025-04-03 02:20:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][50/311]	eta 0:03:55 lr 0.000113	time 0.8755 (0.9014)	loss 0.5019 (0.5358)	grad_norm 3.7289 (3.2404)	mem 20675MB
[2025-04-03 02:20:38 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][52/311]	eta 0:03:53 lr 0.000113	time 0.8755 (0.9004)	loss 0.5212 (0.5352)	grad_norm 3.4252 (3.2465)	mem 20675MB
[2025-04-03 02:20:40 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][54/311]	eta 0:03:51 lr 0.000113	time 0.8762 (0.8996)	loss 0.5883 (0.5373)	grad_norm 2.4324 (3.2254)	mem 20675MB
[2025-04-03 02:20:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][56/311]	eta 0:03:49 lr 0.000113	time 0.8755 (0.8988)	loss 0.4251 (0.5349)	grad_norm 2.9714 (3.2191)	mem 20675MB
[2025-04-03 02:20:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][58/311]	eta 0:03:47 lr 0.000113	time 0.8757 (0.8980)	loss 0.5855 (0.5337)	grad_norm 3.1822 (3.2223)	mem 20675MB
[2025-04-03 02:20:45 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][60/311]	eta 0:03:45 lr 0.000112	time 0.8756 (0.8973)	loss 0.6237 (0.5328)	grad_norm 2.9212 (3.2262)	mem 20675MB
[2025-04-03 02:20:47 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][62/311]	eta 0:03:43 lr 0.000112	time 0.8768 (0.8967)	loss 0.5403 (0.5349)	grad_norm 4.3414 (3.2476)	mem 20675MB
[2025-04-03 02:20:49 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][64/311]	eta 0:03:41 lr 0.000112	time 0.8756 (0.8961)	loss 0.6081 (0.5362)	grad_norm 2.2140 (3.2166)	mem 20675MB
[2025-04-03 02:20:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][66/311]	eta 0:03:39 lr 0.000112	time 0.8757 (0.8955)	loss 0.5918 (0.5387)	grad_norm 2.4805 (3.2087)	mem 20675MB
[2025-04-03 02:20:52 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][68/311]	eta 0:03:37 lr 0.000111	time 0.8755 (0.8949)	loss 0.5478 (0.5378)	grad_norm 4.5260 (3.2397)	mem 20675MB
[2025-04-03 02:20:54 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][70/311]	eta 0:03:35 lr 0.000111	time 0.8785 (0.8945)	loss 0.5861 (0.5375)	grad_norm 3.2675 (3.2545)	mem 20675MB
[2025-04-03 02:20:56 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][72/311]	eta 0:03:33 lr 0.000111	time 0.8756 (0.8940)	loss 0.5469 (0.5364)	grad_norm 4.1426 (3.2818)	mem 20675MB
[2025-04-03 02:20:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][74/311]	eta 0:03:31 lr 0.000111	time 0.8755 (0.8935)	loss 0.5543 (0.5375)	grad_norm 2.8196 (3.2744)	mem 20675MB
[2025-04-03 02:20:59 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][76/311]	eta 0:03:29 lr 0.000110	time 0.8758 (0.8931)	loss 0.6292 (0.5390)	grad_norm 2.8113 (3.2614)	mem 20675MB
[2025-04-03 02:21:01 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][78/311]	eta 0:03:27 lr 0.000110	time 0.8759 (0.8927)	loss 0.6142 (0.5413)	grad_norm 2.4069 (3.2550)	mem 20675MB
[2025-04-03 02:21:03 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][80/311]	eta 0:03:26 lr 0.000110	time 0.8757 (0.8923)	loss 0.5609 (0.5409)	grad_norm 1.9230 (3.2366)	mem 20675MB
[2025-04-03 02:21:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][82/311]	eta 0:03:24 lr 0.000110	time 0.8758 (0.8919)	loss 0.5481 (0.5402)	grad_norm 2.7582 (3.2410)	mem 20675MB
[2025-04-03 02:21:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][84/311]	eta 0:03:22 lr 0.000109	time 0.8761 (0.8916)	loss 0.6432 (0.5417)	grad_norm 2.8197 (3.2277)	mem 20675MB
[2025-04-03 02:21:08 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][86/311]	eta 0:03:20 lr 0.000109	time 0.8754 (0.8912)	loss 0.6434 (0.5418)	grad_norm 2.8782 (3.2207)	mem 20675MB
[2025-04-03 02:21:10 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][88/311]	eta 0:03:18 lr 0.000109	time 0.8757 (0.8909)	loss 0.5467 (0.5417)	grad_norm 2.0384 (3.2015)	mem 20675MB
[2025-04-03 02:21:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][90/311]	eta 0:03:16 lr 0.000109	time 0.8760 (0.8906)	loss 0.4186 (0.5389)	grad_norm 3.8543 (3.2220)	mem 20675MB
[2025-04-03 02:21:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][92/311]	eta 0:03:14 lr 0.000108	time 0.8758 (0.8903)	loss 0.4052 (0.5378)	grad_norm 3.4155 (3.2266)	mem 20675MB
[2025-04-03 02:21:15 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][94/311]	eta 0:03:13 lr 0.000108	time 0.8785 (0.8900)	loss 0.6016 (0.5385)	grad_norm 2.1846 (3.2087)	mem 20675MB
[2025-04-03 02:21:17 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][96/311]	eta 0:03:11 lr 0.000108	time 0.8760 (0.8898)	loss 0.5016 (0.5365)	grad_norm 4.1776 (3.2141)	mem 20675MB
[2025-04-03 02:21:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][98/311]	eta 0:03:09 lr 0.000108	time 0.8760 (0.8895)	loss 0.4657 (0.5349)	grad_norm 3.6067 (3.2343)	mem 20675MB
[2025-04-03 02:21:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][100/311]	eta 0:03:07 lr 0.000108	time 0.8755 (0.8892)	loss 0.5021 (0.5334)	grad_norm 3.1028 (3.2325)	mem 20675MB
[2025-04-03 02:21:22 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][102/311]	eta 0:03:05 lr 0.000107	time 0.8755 (0.8890)	loss 0.5208 (0.5327)	grad_norm 3.5719 (3.2346)	mem 20675MB
[2025-04-03 02:21:24 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][104/311]	eta 0:03:03 lr 0.000107	time 0.8760 (0.8888)	loss 0.6170 (0.5342)	grad_norm 2.6048 (3.2262)	mem 20675MB
[2025-04-03 02:21:26 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][106/311]	eta 0:03:02 lr 0.000107	time 0.8758 (0.8885)	loss 0.4209 (0.5331)	grad_norm 3.7403 (3.2302)	mem 20675MB
[2025-04-03 02:21:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][108/311]	eta 0:03:00 lr 0.000107	time 0.8755 (0.8883)	loss 0.5288 (0.5333)	grad_norm 3.2359 (3.2255)	mem 20675MB
[2025-04-03 02:21:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][110/311]	eta 0:02:58 lr 0.000106	time 0.8754 (0.8881)	loss 0.6023 (0.5338)	grad_norm 2.9176 (3.2203)	mem 20675MB
[2025-04-03 02:21:31 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][112/311]	eta 0:02:56 lr 0.000106	time 0.8763 (0.8879)	loss 0.5307 (0.5347)	grad_norm 2.6837 (3.2091)	mem 20675MB
[2025-04-03 02:21:33 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][114/311]	eta 0:02:54 lr 0.000106	time 0.8755 (0.8877)	loss 0.6363 (0.5364)	grad_norm 3.3913 (3.2064)	mem 20675MB
[2025-04-03 02:21:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][116/311]	eta 0:02:53 lr 0.000106	time 0.8759 (0.8875)	loss 0.5098 (0.5362)	grad_norm 3.5189 (3.2045)	mem 20675MB
[2025-04-03 02:21:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][118/311]	eta 0:02:51 lr 0.000105	time 0.8755 (0.8873)	loss 0.6310 (0.5373)	grad_norm 2.2998 (3.1977)	mem 20675MB
[2025-04-03 02:21:38 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][120/311]	eta 0:02:49 lr 0.000105	time 0.8757 (0.8872)	loss 0.5030 (0.5366)	grad_norm 1.9808 (3.1853)	mem 20675MB
[2025-04-03 02:21:40 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][122/311]	eta 0:02:47 lr 0.000105	time 0.8755 (0.8870)	loss 0.5755 (0.5379)	grad_norm 2.2830 (3.1749)	mem 20675MB
[2025-04-03 02:21:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][124/311]	eta 0:02:45 lr 0.000105	time 0.8764 (0.8868)	loss 0.4559 (0.5382)	grad_norm 4.8631 (3.1875)	mem 20675MB
[2025-04-03 02:21:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][126/311]	eta 0:02:44 lr 0.000104	time 0.8753 (0.8867)	loss 0.5334 (0.5379)	grad_norm 2.6655 (3.1862)	mem 20675MB
[2025-04-03 02:21:45 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][128/311]	eta 0:02:42 lr 0.000104	time 0.8757 (0.8865)	loss 0.4607 (0.5368)	grad_norm 3.0783 (3.1903)	mem 20675MB
[2025-04-03 02:21:47 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][130/311]	eta 0:02:40 lr 0.000104	time 0.8754 (0.8863)	loss 0.5913 (0.5371)	grad_norm 2.6903 (3.1815)	mem 20675MB
[2025-04-03 02:21:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][132/311]	eta 0:02:38 lr 0.000104	time 0.8756 (0.8862)	loss 0.6501 (0.5373)	grad_norm 4.0657 (3.1960)	mem 20675MB
[2025-04-03 02:21:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][134/311]	eta 0:02:36 lr 0.000104	time 0.8760 (0.8861)	loss 0.5341 (0.5366)	grad_norm 2.2586 (3.1925)	mem 20675MB
[2025-04-03 02:21:52 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][136/311]	eta 0:02:35 lr 0.000103	time 0.8756 (0.8859)	loss 0.5516 (0.5375)	grad_norm 2.6606 (3.1907)	mem 20675MB
[2025-04-03 02:21:54 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][138/311]	eta 0:02:33 lr 0.000103	time 0.8758 (0.8858)	loss 0.4380 (0.5366)	grad_norm 3.6984 (3.2036)	mem 20675MB
[2025-04-03 02:21:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][140/311]	eta 0:02:31 lr 0.000103	time 0.8757 (0.8857)	loss 0.6264 (0.5376)	grad_norm 2.6639 (3.1924)	mem 20675MB
[2025-04-03 02:21:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][142/311]	eta 0:02:29 lr 0.000103	time 0.8755 (0.8855)	loss 0.6132 (0.5385)	grad_norm 2.6199 (3.1824)	mem 20675MB
[2025-04-03 02:21:59 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][144/311]	eta 0:02:27 lr 0.000102	time 0.8756 (0.8854)	loss 0.5417 (0.5382)	grad_norm 3.4904 (3.1840)	mem 20675MB
[2025-04-03 02:22:01 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][146/311]	eta 0:02:26 lr 0.000102	time 0.8782 (0.8853)	loss 0.5827 (0.5386)	grad_norm 2.3488 (3.1723)	mem 20675MB
[2025-04-03 02:22:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][148/311]	eta 0:02:24 lr 0.000102	time 0.8756 (0.8852)	loss 0.4758 (0.5370)	grad_norm 3.6562 (3.1830)	mem 20675MB
[2025-04-03 02:22:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][150/311]	eta 0:02:22 lr 0.000102	time 0.8755 (0.8851)	loss 0.3977 (0.5365)	grad_norm 5.4495 (3.1972)	mem 20675MB
[2025-04-03 02:22:06 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][152/311]	eta 0:02:20 lr 0.000101	time 0.8758 (0.8850)	loss 0.6340 (0.5365)	grad_norm 3.0032 (3.2066)	mem 20675MB
[2025-04-03 02:22:08 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][154/311]	eta 0:02:18 lr 0.000101	time 0.8757 (0.8848)	loss 0.5458 (0.5370)	grad_norm 2.4997 (3.1967)	mem 20675MB
[2025-04-03 02:22:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][156/311]	eta 0:02:17 lr 0.000101	time 0.8754 (0.8847)	loss 0.5813 (0.5366)	grad_norm 2.1934 (3.1932)	mem 20675MB
[2025-04-03 02:22:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][158/311]	eta 0:02:15 lr 0.000101	time 0.8760 (0.8846)	loss 0.5497 (0.5369)	grad_norm 2.0693 (3.1911)	mem 20675MB
[2025-04-03 02:22:13 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][160/311]	eta 0:02:13 lr 0.000101	time 0.8759 (0.8845)	loss 0.5870 (0.5374)	grad_norm 2.4448 (3.1873)	mem 20675MB
[2025-04-03 02:22:15 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][162/311]	eta 0:02:11 lr 0.000100	time 0.8755 (0.8844)	loss 0.6061 (0.5371)	grad_norm 2.8652 (3.1900)	mem 20675MB
[2025-04-03 02:22:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][164/311]	eta 0:02:09 lr 0.000100	time 0.8755 (0.8843)	loss 0.6099 (0.5382)	grad_norm 3.1868 (3.1877)	mem 20675MB
[2025-04-03 02:22:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][166/311]	eta 0:02:08 lr 0.000100	time 0.8760 (0.8842)	loss 0.5214 (0.5383)	grad_norm 3.1247 (3.1852)	mem 20675MB
[2025-04-03 02:22:20 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][168/311]	eta 0:02:06 lr 0.000100	time 0.8756 (0.8842)	loss 0.6020 (0.5393)	grad_norm 2.2415 (3.1756)	mem 20675MB
[2025-04-03 02:22:22 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][170/311]	eta 0:02:04 lr 0.000099	time 0.8758 (0.8841)	loss 0.5252 (0.5398)	grad_norm 2.8086 (3.1718)	mem 20675MB
[2025-04-03 02:22:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][172/311]	eta 0:02:02 lr 0.000099	time 0.8758 (0.8840)	loss 0.4782 (0.5390)	grad_norm 3.8223 (3.1828)	mem 20675MB
[2025-04-03 02:22:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][174/311]	eta 0:02:01 lr 0.000099	time 0.8754 (0.8839)	loss 0.5606 (0.5386)	grad_norm 2.6694 (3.1851)	mem 20675MB
[2025-04-03 02:22:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][176/311]	eta 0:01:59 lr 0.000099	time 0.8753 (0.8838)	loss 0.5374 (0.5388)	grad_norm 3.9731 (3.1851)	mem 20675MB
[2025-04-03 02:22:29 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][178/311]	eta 0:01:57 lr 0.000098	time 0.8758 (0.8837)	loss 0.4829 (0.5388)	grad_norm 2.5884 (3.1770)	mem 20675MB
[2025-04-03 02:22:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][180/311]	eta 0:01:55 lr 0.000098	time 0.8757 (0.8837)	loss 0.5705 (0.5387)	grad_norm 3.0047 (3.1776)	mem 20675MB
[2025-04-03 02:22:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][182/311]	eta 0:01:53 lr 0.000098	time 0.8757 (0.8836)	loss 0.6237 (0.5393)	grad_norm 2.5499 (3.1732)	mem 20675MB
[2025-04-03 02:22:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][184/311]	eta 0:01:52 lr 0.000098	time 0.8755 (0.8835)	loss 0.5662 (0.5396)	grad_norm 2.9804 (3.1701)	mem 20675MB
[2025-04-03 02:22:36 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][186/311]	eta 0:01:50 lr 0.000098	time 0.8757 (0.8834)	loss 0.4936 (0.5398)	grad_norm 3.2214 (3.1663)	mem 20675MB
[2025-04-03 02:22:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][188/311]	eta 0:01:48 lr 0.000097	time 0.8755 (0.8833)	loss 0.4985 (0.5396)	grad_norm 3.9893 (3.1697)	mem 20675MB
[2025-04-03 02:22:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][190/311]	eta 0:01:46 lr 0.000097	time 0.8756 (0.8833)	loss 0.5542 (0.5390)	grad_norm 2.8617 (3.1737)	mem 20675MB
[2025-04-03 02:22:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][192/311]	eta 0:01:45 lr 0.000097	time 0.8762 (0.8832)	loss 0.5891 (0.5395)	grad_norm 2.3742 (3.1677)	mem 20675MB
[2025-04-03 02:22:43 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][194/311]	eta 0:01:43 lr 0.000097	time 0.8756 (0.8831)	loss 0.5388 (0.5393)	grad_norm 2.3556 (3.1611)	mem 20675MB
[2025-04-03 02:22:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][196/311]	eta 0:01:41 lr 0.000096	time 0.8754 (0.8831)	loss 0.4547 (0.5393)	grad_norm 2.8534 (3.1588)	mem 20675MB
[2025-04-03 02:22:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][198/311]	eta 0:01:39 lr 0.000096	time 0.8757 (0.8830)	loss 0.6456 (0.5402)	grad_norm 2.5928 (3.1554)	mem 20675MB
[2025-04-03 02:22:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][200/311]	eta 0:01:38 lr 0.000096	time 0.8756 (0.8830)	loss 0.5736 (0.5404)	grad_norm 2.5071 (3.1484)	mem 20675MB
[2025-04-03 02:22:50 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][202/311]	eta 0:01:36 lr 0.000096	time 0.8755 (0.8829)	loss 0.5051 (0.5403)	grad_norm 2.3084 (3.1398)	mem 20675MB
[2025-04-03 02:22:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][204/311]	eta 0:01:34 lr 0.000096	time 0.8754 (0.8828)	loss 0.5853 (0.5405)	grad_norm 2.0852 (3.1391)	mem 20675MB
[2025-04-03 02:22:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][206/311]	eta 0:01:32 lr 0.000095	time 0.8756 (0.8828)	loss 0.4008 (0.5400)	grad_norm 4.0124 (3.1426)	mem 20675MB
[2025-04-03 02:22:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][208/311]	eta 0:01:30 lr 0.000095	time 0.8757 (0.8827)	loss 0.5089 (0.5395)	grad_norm 3.9598 (3.1456)	mem 20675MB
[2025-04-03 02:22:57 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][210/311]	eta 0:01:29 lr 0.000095	time 0.8754 (0.8826)	loss 0.5543 (0.5400)	grad_norm 2.6751 (3.1413)	mem 20675MB
[2025-04-03 02:22:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][212/311]	eta 0:01:27 lr 0.000095	time 0.8755 (0.8826)	loss 0.5346 (0.5394)	grad_norm 2.5300 (3.1423)	mem 20675MB
[2025-04-03 02:23:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][214/311]	eta 0:01:25 lr 0.000094	time 0.8755 (0.8825)	loss 0.5903 (0.5391)	grad_norm 1.9795 (3.1407)	mem 20675MB
[2025-04-03 02:23:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][216/311]	eta 0:01:23 lr 0.000094	time 0.8755 (0.8825)	loss 0.5500 (0.5389)	grad_norm 3.2371 (3.1418)	mem 20675MB
[2025-04-03 02:23:04 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][218/311]	eta 0:01:22 lr 0.000094	time 0.8756 (0.8824)	loss 0.4730 (0.5387)	grad_norm 3.1048 (3.1389)	mem 20675MB
[2025-04-03 02:23:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][220/311]	eta 0:01:20 lr 0.000094	time 0.8755 (0.8824)	loss 0.5802 (0.5390)	grad_norm 3.6879 (3.1382)	mem 20675MB
[2025-04-03 02:23:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][222/311]	eta 0:01:18 lr 0.000094	time 0.8754 (0.8823)	loss 0.5201 (0.5393)	grad_norm 3.3327 (3.1371)	mem 20675MB
[2025-04-03 02:23:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][224/311]	eta 0:01:16 lr 0.000093	time 0.8757 (0.8823)	loss 0.6400 (0.5395)	grad_norm 3.1976 (3.1398)	mem 20675MB
[2025-04-03 02:23:11 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][226/311]	eta 0:01:14 lr 0.000093	time 0.8754 (0.8822)	loss 0.6289 (0.5401)	grad_norm 3.1299 (3.1399)	mem 20675MB
[2025-04-03 02:23:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][228/311]	eta 0:01:13 lr 0.000093	time 0.8769 (0.8822)	loss 0.5343 (0.5402)	grad_norm 4.6857 (3.1483)	mem 20675MB
[2025-04-03 02:23:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][230/311]	eta 0:01:11 lr 0.000093	time 0.8756 (0.8821)	loss 0.5847 (0.5406)	grad_norm 2.3453 (3.1416)	mem 20675MB
[2025-04-03 02:23:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][232/311]	eta 0:01:09 lr 0.000092	time 0.8756 (0.8821)	loss 0.5324 (0.5409)	grad_norm 2.9599 (3.1379)	mem 20675MB
[2025-04-03 02:23:18 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][234/311]	eta 0:01:07 lr 0.000092	time 0.8757 (0.8820)	loss 0.5104 (0.5411)	grad_norm 2.5977 (3.1353)	mem 20675MB
[2025-04-03 02:23:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][236/311]	eta 0:01:06 lr 0.000092	time 0.8755 (0.8820)	loss 0.5504 (0.5413)	grad_norm 1.9576 (3.1273)	mem 20675MB
[2025-04-03 02:23:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][238/311]	eta 0:01:04 lr 0.000092	time 0.8757 (0.8819)	loss 0.5822 (0.5417)	grad_norm 2.0470 (3.1216)	mem 20675MB
[2025-04-03 02:23:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][240/311]	eta 0:01:02 lr 0.000092	time 0.8755 (0.8819)	loss 0.5732 (0.5415)	grad_norm 1.7355 (3.1179)	mem 20675MB
[2025-04-03 02:23:25 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][242/311]	eta 0:01:00 lr 0.000091	time 0.8754 (0.8818)	loss 0.5611 (0.5417)	grad_norm 3.6565 (3.1171)	mem 20675MB
[2025-04-03 02:23:27 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][244/311]	eta 0:00:59 lr 0.000091	time 0.8754 (0.8818)	loss 0.5757 (0.5421)	grad_norm 2.9939 (3.1151)	mem 20675MB
[2025-04-03 02:23:28 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][246/311]	eta 0:00:57 lr 0.000091	time 0.8755 (0.8817)	loss 0.5353 (0.5422)	grad_norm 2.9500 (3.1102)	mem 20675MB
[2025-04-03 02:23:30 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][248/311]	eta 0:00:55 lr 0.000091	time 0.8754 (0.8817)	loss 0.4304 (0.5411)	grad_norm 4.1726 (3.1190)	mem 20675MB
[2025-04-03 02:23:32 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][250/311]	eta 0:00:53 lr 0.000090	time 0.8756 (0.8817)	loss 0.5680 (0.5410)	grad_norm 3.6653 (3.1228)	mem 20675MB
[2025-04-03 02:23:34 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][252/311]	eta 0:00:52 lr 0.000090	time 0.8756 (0.8816)	loss 0.6062 (0.5412)	grad_norm 2.8905 (3.1273)	mem 20675MB
[2025-04-03 02:23:35 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][254/311]	eta 0:00:50 lr 0.000090	time 0.8754 (0.8816)	loss 0.4645 (0.5406)	grad_norm 5.2068 (3.1358)	mem 20675MB
[2025-04-03 02:23:37 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][256/311]	eta 0:00:48 lr 0.000090	time 0.8753 (0.8815)	loss 0.4212 (0.5396)	grad_norm 5.6716 (3.1517)	mem 20675MB
[2025-04-03 02:23:39 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][258/311]	eta 0:00:46 lr 0.000090	time 0.8757 (0.8815)	loss 0.5451 (0.5395)	grad_norm 3.2783 (3.1574)	mem 20675MB
[2025-04-03 02:23:41 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][260/311]	eta 0:00:44 lr 0.000089	time 0.8755 (0.8815)	loss 0.5377 (0.5393)	grad_norm 3.2564 (3.1621)	mem 20675MB
[2025-04-03 02:23:42 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][262/311]	eta 0:00:43 lr 0.000089	time 0.8755 (0.8814)	loss 0.5933 (0.5397)	grad_norm 2.9450 (3.1590)	mem 20675MB
[2025-04-03 02:23:44 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][264/311]	eta 0:00:41 lr 0.000089	time 0.8755 (0.8814)	loss 0.5434 (0.5400)	grad_norm 2.5041 (3.1562)	mem 20675MB
[2025-04-03 02:23:46 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][266/311]	eta 0:00:39 lr 0.000089	time 0.8754 (0.8813)	loss 0.5645 (0.5406)	grad_norm 3.4309 (3.1570)	mem 20675MB
[2025-04-03 02:23:48 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][268/311]	eta 0:00:37 lr 0.000089	time 0.8761 (0.8813)	loss 0.3967 (0.5401)	grad_norm 3.0650 (3.1580)	mem 20675MB
[2025-04-03 02:23:49 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][270/311]	eta 0:00:36 lr 0.000088	time 0.8757 (0.8813)	loss 0.5741 (0.5398)	grad_norm 2.9056 (3.1561)	mem 20675MB
[2025-04-03 02:23:51 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][272/311]	eta 0:00:34 lr 0.000088	time 0.8758 (0.8812)	loss 0.5749 (0.5398)	grad_norm 2.9906 (3.1521)	mem 20675MB
[2025-04-03 02:23:53 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][274/311]	eta 0:00:32 lr 0.000088	time 0.8757 (0.8812)	loss 0.6110 (0.5399)	grad_norm 2.7412 (3.1498)	mem 20675MB
[2025-04-03 02:23:55 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][276/311]	eta 0:00:30 lr 0.000088	time 0.8754 (0.8812)	loss 0.4960 (0.5393)	grad_norm 3.2174 (3.1558)	mem 20675MB
[2025-04-03 02:23:56 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][278/311]	eta 0:00:29 lr 0.000087	time 0.8754 (0.8811)	loss 0.5865 (0.5395)	grad_norm 3.8974 (3.1593)	mem 20675MB
[2025-04-03 02:23:58 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][280/311]	eta 0:00:27 lr 0.000087	time 0.8758 (0.8811)	loss 0.4758 (0.5391)	grad_norm 2.8692 (3.1654)	mem 20675MB
[2025-04-03 02:24:00 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][282/311]	eta 0:00:25 lr 0.000087	time 0.8756 (0.8811)	loss 0.6244 (0.5396)	grad_norm 2.8966 (3.1636)	mem 20675MB
[2025-04-03 02:24:02 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][284/311]	eta 0:00:23 lr 0.000087	time 0.8754 (0.8810)	loss 0.5421 (0.5395)	grad_norm 2.8718 (3.1645)	mem 20675MB
[2025-04-03 02:24:03 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][286/311]	eta 0:00:22 lr 0.000087	time 0.8754 (0.8810)	loss 0.4685 (0.5390)	grad_norm 5.2800 (3.1712)	mem 20675MB
[2025-04-03 02:24:05 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][288/311]	eta 0:00:20 lr 0.000086	time 0.8758 (0.8810)	loss 0.5316 (0.5393)	grad_norm 3.1066 (3.1703)	mem 20675MB
[2025-04-03 02:24:07 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][290/311]	eta 0:00:18 lr 0.000086	time 0.8754 (0.8809)	loss 0.4975 (0.5393)	grad_norm 4.7900 (3.1742)	mem 20675MB
[2025-04-03 02:24:09 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][292/311]	eta 0:00:16 lr 0.000086	time 0.8754 (0.8809)	loss 0.5867 (0.5392)	grad_norm 3.5262 (3.1749)	mem 20675MB
[2025-04-03 02:24:10 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][294/311]	eta 0:00:14 lr 0.000086	time 0.8757 (0.8809)	loss 0.5473 (0.5392)	grad_norm 3.2289 (3.1758)	mem 20675MB
[2025-04-03 02:24:12 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][296/311]	eta 0:00:13 lr 0.000086	time 0.8754 (0.8809)	loss 0.5196 (0.5393)	grad_norm 3.4818 (3.1780)	mem 20675MB
[2025-04-03 02:24:14 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][298/311]	eta 0:00:11 lr 0.000085	time 0.8754 (0.8808)	loss 0.4089 (0.5389)	grad_norm 4.6892 (3.1826)	mem 20675MB
[2025-04-03 02:24:16 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][300/311]	eta 0:00:09 lr 0.000085	time 0.8757 (0.8808)	loss 0.4827 (0.5388)	grad_norm 3.4258 (3.1810)	mem 20675MB
[2025-04-03 02:24:17 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][302/311]	eta 0:00:07 lr 0.000085	time 0.8753 (0.8808)	loss 0.6289 (0.5393)	grad_norm 2.8696 (3.1758)	mem 20675MB
[2025-04-03 02:24:19 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][304/311]	eta 0:00:06 lr 0.000085	time 0.8772 (0.8807)	loss 0.5131 (0.5392)	grad_norm 4.2895 (3.1814)	mem 20675MB
[2025-04-03 02:24:21 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][306/311]	eta 0:00:04 lr 0.000084	time 0.8755 (0.8807)	loss 0.5741 (0.5390)	grad_norm 3.0213 (3.1860)	mem 20675MB
[2025-04-03 02:24:23 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][308/311]	eta 0:00:02 lr 0.000084	time 0.8760 (0.8807)	loss 0.3557 (0.5386)	grad_norm 4.6398 (3.1900)	mem 20675MB
[2025-04-03 02:24:24 simmim_finetune] (main_finetune.py 252): INFO Train: [24/30][310/311]	eta 0:00:00 lr 0.000084	time 0.8755 (0.8807)	loss 0.6180 (0.5389)	grad_norm 2.7216 (3.1926)	mem 20675MB
[2025-04-03 02:24:24 simmim_finetune] (main_finetune.py 260): INFO EPOCH 24 training takes 0:04:34
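A quick consistency check on the epoch summary above: each train line reports `time <last-step> (<running-average>)`, and the final running average over all 311 iterations accounts for the reported 0:04:34 wall clock. A minimal sketch, using only values read off the log (no new assumptions):

```python
# The [310/311] line above reports a running average of 0.8807 s/step.
iters = 311
avg_step = 0.8807
total = iters * avg_step     # ≈ 273.9 s for the epoch
mins, secs = divmod(round(total), 60)
print(f"{mins}:{secs:02d}")  # → 4:34, matching the logged epoch time
```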
[2025-04-03 02:24:26 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.375 (1.375)	Loss 0.5635 (0.5635)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 02:24:26 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 02:24:26 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 02:24:26 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
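Note that the final Acc@1 of 77.465 above is not the plain mean of the per-batch accuracies: with 142 test images and BATCH_SIZE 128, batch 1 holds only 14 images, so the epoch accuracy is a sample-weighted average (the usual `AverageMeter` pattern in Swin/SimMIM-style code). A minimal sketch; the 13/14 figure for the second batch is inferred from the logged totals, not logged directly:

```python
class AverageMeter:
    """Running sample-weighted average, as Swin/SimMIM-style utilities keep it."""
    def __init__(self):
        self.sum = 0.0   # weighted sum of values
        self.count = 0   # total samples seen

    def update(self, val, n=1):
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / self.count

acc = AverageMeter()
acc.update(75.781, n=128)          # batch 0: 128 images (97 correct)
acc.update(100.0 * 13 / 14, n=14)  # batch 1: 14 images (13 correct, inferred)
print(round(acc.avg, 3))           # → 77.465, matching the logged * Acc@1
```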
[2025-04-03 02:24:26 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [5.432035671705076e-07, 5.432035671705076e-07, 7.101112671401652e-07, 7.101112671401652e-07, 9.668923440165614e-07, 9.668923440165614e-07, 1.3619401545956325e-06, 1.3619401545956325e-06, 1.9697060170249727e-06, 1.9697060170249727e-06, 2.904730420762419e-06, 2.904730420762419e-06, 4.343229503435413e-06, 4.343229503435413e-06, 6.556305015240018e-06, 6.556305015240018e-06, 9.961036571862489e-06, 9.961036571862489e-06, 1.5199085120512445e-05, 1.5199085120512445e-05, 2.325762134920468e-05, 2.325762134920468e-05, 3.565536939334659e-05, 3.565536939334659e-05, 5.472882792279567e-05, 5.472882792279567e-05, 8.407261027579427e-05, 8.407261027579427e-05]
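The 28 per-group rates above encode the layer-wise LR decay from the config (LAYER_DECAY 0.65): ViT-Base has 12 blocks, which with the patch embedding below them and the head above gives 14 depth levels, and each scale appears twice because decay and no-decay parameters form separate groups sharing one rate. A hedged sketch of how these numbers could arise — not the repository's exact scheduler, which steps per iteration rather than per epoch (hence a small offset from the logged values) — assuming group i (0 = deepest) gets scale `0.65**(13 - i)` on BASE_LR 0.00125, cosine-annealed toward MIN_LR 2.5e-7 over 30 epochs:

```python
import math

def group_lrs(epoch, base_lr=0.00125, min_lr=2.5e-7,
              layer_decay=0.65, num_groups=14, total_epochs=30):
    """Cosine-annealed, layer-decayed LR per depth group (epoch granularity)."""
    cos = 0.5 * (1.0 + math.cos(math.pi * epoch / total_epochs))
    return [min_lr + (base_lr * layer_decay ** (num_groups - 1 - i) - min_lr) * cos
            for i in range(num_groups)]

lrs = group_lrs(epoch=25)  # rates in force at the start of epoch 25
# Deepest and head groups land within ~0.2% of the logged 5.432e-07 and 8.407e-05.
print(f"{lrs[0]:.4e}  {lrs[-1]:.4e}")
```

Adjacent groups differ by a factor of 1/0.65 ≈ 1.54 after subtracting MIN_LR, which matches the ratios in the logged list.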
[2025-04-03 02:24:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][0/311]	eta 0:11:47 lr 0.000084	time 2.2743 (2.2743)	loss 0.6304 (0.6304)	grad_norm 2.7779 (2.7779)	mem 20675MB
[2025-04-03 02:24:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][2/311]	eta 0:06:54 lr 0.000084	time 0.8756 (1.3426)	loss 0.4706 (0.5773)	grad_norm 4.5873 (3.5256)	mem 20675MB
[2025-04-03 02:24:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][4/311]	eta 0:05:54 lr 0.000084	time 0.8761 (1.1563)	loss 0.5176 (0.5448)	grad_norm 3.3576 (3.5231)	mem 20675MB
[2025-04-03 02:24:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][6/311]	eta 0:05:28 lr 0.000083	time 0.8766 (1.0767)	loss 0.6294 (0.5590)	grad_norm 3.2567 (3.4863)	mem 20675MB
[2025-04-03 02:24:35 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][8/311]	eta 0:05:12 lr 0.000083	time 0.8762 (1.0323)	loss 0.5744 (0.5524)	grad_norm 3.0819 (3.4977)	mem 20675MB
[2025-04-03 02:24:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][10/311]	eta 0:05:02 lr 0.000083	time 0.8760 (1.0040)	loss 0.4740 (0.5499)	grad_norm 3.4976 (3.4631)	mem 20675MB
[2025-04-03 02:24:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][12/311]	eta 0:04:54 lr 0.000083	time 0.8755 (0.9844)	loss 0.5223 (0.5520)	grad_norm 3.6710 (3.4323)	mem 20675MB
[2025-04-03 02:24:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][14/311]	eta 0:04:48 lr 0.000083	time 0.8756 (0.9700)	loss 0.6059 (0.5600)	grad_norm 3.3681 (3.4044)	mem 20675MB
[2025-04-03 02:24:42 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][16/311]	eta 0:04:42 lr 0.000082	time 0.8757 (0.9590)	loss 0.5367 (0.5514)	grad_norm 2.3252 (3.3419)	mem 20675MB
[2025-04-03 02:24:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][18/311]	eta 0:04:38 lr 0.000082	time 0.8759 (0.9503)	loss 0.6256 (0.5524)	grad_norm 2.2280 (3.3214)	mem 20675MB
[2025-04-03 02:24:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][20/311]	eta 0:04:34 lr 0.000082	time 0.8758 (0.9434)	loss 0.5327 (0.5518)	grad_norm 3.1931 (3.2637)	mem 20675MB
[2025-04-03 02:24:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][22/311]	eta 0:04:30 lr 0.000082	time 0.8758 (0.9375)	loss 0.5274 (0.5531)	grad_norm 3.3323 (3.2458)	mem 20675MB
[2025-04-03 02:24:49 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][24/311]	eta 0:04:27 lr 0.000081	time 0.8756 (0.9327)	loss 0.3848 (0.5420)	grad_norm 4.4257 (3.3814)	mem 20675MB
[2025-04-03 02:24:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][26/311]	eta 0:04:24 lr 0.000081	time 0.8758 (0.9285)	loss 0.5437 (0.5432)	grad_norm 3.3323 (3.3438)	mem 20675MB
[2025-04-03 02:24:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][28/311]	eta 0:04:21 lr 0.000081	time 0.8757 (0.9249)	loss 0.6365 (0.5456)	grad_norm 2.5243 (3.3167)	mem 20675MB
[2025-04-03 02:24:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][30/311]	eta 0:04:19 lr 0.000081	time 0.8754 (0.9218)	loss 0.5341 (0.5489)	grad_norm 2.8230 (3.2784)	mem 20675MB
[2025-04-03 02:24:56 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][32/311]	eta 0:04:16 lr 0.000081	time 0.8756 (0.9191)	loss 0.6048 (0.5510)	grad_norm 2.4451 (3.2342)	mem 20675MB
[2025-04-03 02:24:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][34/311]	eta 0:04:13 lr 0.000080	time 0.8755 (0.9166)	loss 0.5160 (0.5459)	grad_norm 3.4344 (3.2649)	mem 20675MB
[2025-04-03 02:25:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][36/311]	eta 0:04:11 lr 0.000080	time 0.8759 (0.9145)	loss 0.5188 (0.5461)	grad_norm 2.4183 (3.2200)	mem 20675MB
[2025-04-03 02:25:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][38/311]	eta 0:04:09 lr 0.000080	time 0.8758 (0.9125)	loss 0.4764 (0.5447)	grad_norm 4.0137 (3.2215)	mem 20675MB
[2025-04-03 02:25:03 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][40/311]	eta 0:04:06 lr 0.000080	time 0.8758 (0.9108)	loss 0.6327 (0.5486)	grad_norm 3.7693 (3.2305)	mem 20675MB
[2025-04-03 02:25:05 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][42/311]	eta 0:04:04 lr 0.000080	time 0.8759 (0.9092)	loss 0.5616 (0.5494)	grad_norm 3.0954 (3.2074)	mem 20675MB
[2025-04-03 02:25:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][44/311]	eta 0:04:02 lr 0.000079	time 0.8760 (0.9078)	loss 0.5316 (0.5498)	grad_norm 3.1527 (3.1986)	mem 20675MB
[2025-04-03 02:25:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][46/311]	eta 0:04:00 lr 0.000079	time 0.8759 (0.9065)	loss 0.4688 (0.5488)	grad_norm 4.7639 (3.2130)	mem 20675MB
[2025-04-03 02:25:10 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][48/311]	eta 0:03:58 lr 0.000079	time 0.8756 (0.9053)	loss 0.5680 (0.5487)	grad_norm 1.5867 (3.1858)	mem 20675MB
[2025-04-03 02:25:12 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][50/311]	eta 0:03:55 lr 0.000079	time 0.8755 (0.9042)	loss 0.5730 (0.5469)	grad_norm 1.7374 (3.1732)	mem 20675MB
[2025-04-03 02:25:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][52/311]	eta 0:03:53 lr 0.000079	time 0.8760 (0.9031)	loss 0.5330 (0.5452)	grad_norm 2.0994 (3.1522)	mem 20675MB
[2025-04-03 02:25:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][54/311]	eta 0:03:51 lr 0.000078	time 0.8758 (0.9022)	loss 0.5904 (0.5479)	grad_norm 2.4591 (3.1270)	mem 20675MB
[2025-04-03 02:25:17 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][56/311]	eta 0:03:49 lr 0.000078	time 0.8771 (0.9013)	loss 0.3948 (0.5436)	grad_norm 3.5027 (3.1428)	mem 20675MB
[2025-04-03 02:25:19 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][58/311]	eta 0:03:47 lr 0.000078	time 0.8755 (0.9005)	loss 0.5880 (0.5438)	grad_norm 3.5664 (3.1581)	mem 20675MB
[2025-04-03 02:25:21 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][60/311]	eta 0:03:45 lr 0.000078	time 0.8757 (0.8997)	loss 0.6082 (0.5460)	grad_norm 2.8374 (3.1460)	mem 20675MB
[2025-04-03 02:25:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][62/311]	eta 0:03:43 lr 0.000078	time 0.8758 (0.8989)	loss 0.4708 (0.5460)	grad_norm 4.4219 (3.1598)	mem 20675MB
[2025-04-03 02:25:24 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][64/311]	eta 0:03:41 lr 0.000077	time 0.8758 (0.8982)	loss 0.4289 (0.5427)	grad_norm 6.2631 (3.2194)	mem 20675MB
[2025-04-03 02:25:26 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][66/311]	eta 0:03:39 lr 0.000077	time 0.8762 (0.8976)	loss 0.6460 (0.5451)	grad_norm 2.5501 (3.2010)	mem 20675MB
[2025-04-03 02:25:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][68/311]	eta 0:03:37 lr 0.000077	time 0.8781 (0.8970)	loss 0.6250 (0.5473)	grad_norm 2.5188 (3.1828)	mem 20675MB
[2025-04-03 02:25:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][70/311]	eta 0:03:36 lr 0.000077	time 0.8759 (0.8965)	loss 0.4036 (0.5459)	grad_norm 3.4600 (3.1756)	mem 20675MB
[2025-04-03 02:25:31 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][72/311]	eta 0:03:34 lr 0.000077	time 0.8755 (0.8959)	loss 0.5497 (0.5463)	grad_norm 2.4807 (3.1537)	mem 20675MB
[2025-04-03 02:25:33 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][74/311]	eta 0:03:32 lr 0.000076	time 0.8774 (0.8954)	loss 0.5533 (0.5468)	grad_norm 2.5049 (3.1354)	mem 20675MB
[2025-04-03 02:25:35 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][76/311]	eta 0:03:30 lr 0.000076	time 0.8754 (0.8950)	loss 0.5024 (0.5451)	grad_norm 4.0118 (3.1680)	mem 20675MB
[2025-04-03 02:25:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][78/311]	eta 0:03:28 lr 0.000076	time 0.8757 (0.8945)	loss 0.6069 (0.5460)	grad_norm 3.1315 (3.1732)	mem 20675MB
[2025-04-03 02:25:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][80/311]	eta 0:03:26 lr 0.000076	time 0.8754 (0.8940)	loss 0.4931 (0.5434)	grad_norm 4.5582 (3.2037)	mem 20675MB
[2025-04-03 02:25:40 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][82/311]	eta 0:03:24 lr 0.000076	time 0.8762 (0.8936)	loss 0.5508 (0.5439)	grad_norm 3.4227 (3.2016)	mem 20675MB
[2025-04-03 02:25:42 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][84/311]	eta 0:03:22 lr 0.000075	time 0.8758 (0.8932)	loss 0.5823 (0.5446)	grad_norm 2.7046 (3.1945)	mem 20675MB
[2025-04-03 02:25:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][86/311]	eta 0:03:20 lr 0.000075	time 0.8773 (0.8929)	loss 0.5665 (0.5442)	grad_norm 2.9186 (3.1878)	mem 20675MB
[2025-04-03 02:25:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][88/311]	eta 0:03:19 lr 0.000075	time 0.8757 (0.8925)	loss 0.5323 (0.5440)	grad_norm 2.8921 (3.1857)	mem 20675MB
[2025-04-03 02:25:47 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][90/311]	eta 0:03:17 lr 0.000075	time 0.8757 (0.8922)	loss 0.5945 (0.5453)	grad_norm 2.6335 (3.1768)	mem 20675MB
[2025-04-03 02:25:49 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][92/311]	eta 0:03:15 lr 0.000075	time 0.8753 (0.8918)	loss 0.5568 (0.5456)	grad_norm 2.3201 (3.1596)	mem 20675MB
[2025-04-03 02:25:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][94/311]	eta 0:03:13 lr 0.000074	time 0.8755 (0.8915)	loss 0.5033 (0.5464)	grad_norm 3.6329 (3.1665)	mem 20675MB
[2025-04-03 02:25:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][96/311]	eta 0:03:11 lr 0.000074	time 0.8758 (0.8912)	loss 0.6900 (0.5484)	grad_norm 3.2755 (3.1648)	mem 20675MB
[2025-04-03 02:25:54 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][98/311]	eta 0:03:09 lr 0.000074	time 0.8757 (0.8909)	loss 0.5878 (0.5481)	grad_norm 2.4834 (3.1679)	mem 20675MB
[2025-04-03 02:25:56 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][100/311]	eta 0:03:07 lr 0.000074	time 0.8758 (0.8906)	loss 0.6127 (0.5481)	grad_norm 2.8223 (3.1580)	mem 20675MB
[2025-04-03 02:25:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][102/311]	eta 0:03:06 lr 0.000074	time 0.8757 (0.8903)	loss 0.4584 (0.5474)	grad_norm 3.8073 (3.1593)	mem 20675MB
[2025-04-03 02:25:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][104/311]	eta 0:03:04 lr 0.000073	time 0.8755 (0.8901)	loss 0.5788 (0.5472)	grad_norm 2.3601 (3.1512)	mem 20675MB
[2025-04-03 02:26:01 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][106/311]	eta 0:03:02 lr 0.000073	time 0.8755 (0.8898)	loss 0.6046 (0.5477)	grad_norm 2.8309 (3.1433)	mem 20675MB
[2025-04-03 02:26:03 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][108/311]	eta 0:03:00 lr 0.000073	time 0.8757 (0.8896)	loss 0.5461 (0.5477)	grad_norm 3.4811 (3.1425)	mem 20675MB
[2025-04-03 02:26:05 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][110/311]	eta 0:02:58 lr 0.000073	time 0.8756 (0.8893)	loss 0.4993 (0.5461)	grad_norm 3.3856 (3.1598)	mem 20675MB
[2025-04-03 02:26:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][112/311]	eta 0:02:56 lr 0.000073	time 0.8755 (0.8891)	loss 0.5128 (0.5460)	grad_norm 3.6907 (3.1556)	mem 20675MB
[2025-04-03 02:26:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][114/311]	eta 0:02:55 lr 0.000072	time 0.8756 (0.8889)	loss 0.6244 (0.5469)	grad_norm 2.2087 (3.1432)	mem 20675MB
[2025-04-03 02:26:10 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][116/311]	eta 0:02:53 lr 0.000072	time 0.8757 (0.8887)	loss 0.5238 (0.5470)	grad_norm 2.7496 (3.1367)	mem 20675MB
[2025-04-03 02:26:12 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][118/311]	eta 0:02:51 lr 0.000072	time 0.8763 (0.8885)	loss 0.6194 (0.5475)	grad_norm 3.2693 (3.1339)	mem 20675MB
[2025-04-03 02:26:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][120/311]	eta 0:02:49 lr 0.000072	time 0.8756 (0.8883)	loss 0.5552 (0.5479)	grad_norm 1.8553 (3.1135)	mem 20675MB
[2025-04-03 02:26:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][122/311]	eta 0:02:47 lr 0.000072	time 0.8756 (0.8881)	loss 0.5460 (0.5474)	grad_norm 2.7178 (3.1090)	mem 20675MB
[2025-04-03 02:26:17 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][124/311]	eta 0:02:46 lr 0.000071	time 0.8758 (0.8879)	loss 0.4491 (0.5471)	grad_norm 3.5336 (3.1105)	mem 20675MB
[2025-04-03 02:26:19 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][126/311]	eta 0:02:44 lr 0.000071	time 0.8759 (0.8877)	loss 0.6544 (0.5482)	grad_norm 3.5031 (3.1120)	mem 20675MB
[2025-04-03 02:26:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][128/311]	eta 0:02:42 lr 0.000071	time 0.8755 (0.8876)	loss 0.5519 (0.5481)	grad_norm 2.9944 (3.1069)	mem 20675MB
[2025-04-03 02:26:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][130/311]	eta 0:02:40 lr 0.000071	time 0.8756 (0.8874)	loss 0.5865 (0.5483)	grad_norm 2.4894 (3.0931)	mem 20675MB
[2025-04-03 02:26:24 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][132/311]	eta 0:02:38 lr 0.000071	time 0.8758 (0.8872)	loss 0.5042 (0.5483)	grad_norm 3.0345 (3.0875)	mem 20675MB
[2025-04-03 02:26:26 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][134/311]	eta 0:02:37 lr 0.000070	time 0.8759 (0.8871)	loss 0.6742 (0.5486)	grad_norm 2.7034 (3.0882)	mem 20675MB
[2025-04-03 02:26:28 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][136/311]	eta 0:02:35 lr 0.000070	time 0.8755 (0.8869)	loss 0.4921 (0.5480)	grad_norm 3.7743 (3.0947)	mem 20675MB
[2025-04-03 02:26:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][138/311]	eta 0:02:33 lr 0.000070	time 0.8756 (0.8868)	loss 0.5220 (0.5472)	grad_norm 2.4696 (3.0907)	mem 20675MB
[2025-04-03 02:26:31 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][140/311]	eta 0:02:31 lr 0.000070	time 0.8755 (0.8866)	loss 0.5357 (0.5472)	grad_norm 3.3785 (3.0906)	mem 20675MB
[2025-04-03 02:26:33 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][142/311]	eta 0:02:29 lr 0.000070	time 0.8759 (0.8865)	loss 0.4943 (0.5463)	grad_norm 4.1253 (3.0996)	mem 20675MB
[2025-04-03 02:26:35 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][144/311]	eta 0:02:28 lr 0.000069	time 0.8756 (0.8864)	loss 0.5216 (0.5463)	grad_norm 3.1462 (3.1004)	mem 20675MB
[2025-04-03 02:26:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][146/311]	eta 0:02:26 lr 0.000069	time 0.8758 (0.8862)	loss 0.6461 (0.5466)	grad_norm 2.8176 (3.0992)	mem 20675MB
[2025-04-03 02:26:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][148/311]	eta 0:02:24 lr 0.000069	time 0.8756 (0.8861)	loss 0.5751 (0.5459)	grad_norm 3.0119 (3.1106)	mem 20675MB
[2025-04-03 02:26:40 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][150/311]	eta 0:02:22 lr 0.000069	time 0.8755 (0.8860)	loss 0.5783 (0.5461)	grad_norm 2.7475 (3.1103)	mem 20675MB
[2025-04-03 02:26:42 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][152/311]	eta 0:02:20 lr 0.000069	time 0.8758 (0.8858)	loss 0.4619 (0.5461)	grad_norm 4.7718 (3.1186)	mem 20675MB
[2025-04-03 02:26:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][154/311]	eta 0:02:19 lr 0.000069	time 0.8756 (0.8857)	loss 0.4048 (0.5458)	grad_norm 3.3680 (3.1171)	mem 20675MB
[2025-04-03 02:26:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][156/311]	eta 0:02:17 lr 0.000068	time 0.8755 (0.8856)	loss 0.5847 (0.5459)	grad_norm 2.4940 (3.1103)	mem 20675MB
[2025-04-03 02:26:47 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][158/311]	eta 0:02:15 lr 0.000068	time 0.8758 (0.8855)	loss 0.5683 (0.5465)	grad_norm 3.1246 (3.1119)	mem 20675MB
[2025-04-03 02:26:49 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][160/311]	eta 0:02:13 lr 0.000068	time 0.8766 (0.8854)	loss 0.4748 (0.5456)	grad_norm 3.8175 (3.1391)	mem 20675MB
[2025-04-03 02:26:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][162/311]	eta 0:02:11 lr 0.000068	time 0.8756 (0.8853)	loss 0.5594 (0.5463)	grad_norm 2.1416 (3.1320)	mem 20675MB
[2025-04-03 02:26:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][164/311]	eta 0:02:10 lr 0.000068	time 0.8757 (0.8852)	loss 0.5492 (0.5465)	grad_norm 3.4132 (3.1366)	mem 20675MB
[2025-04-03 02:26:54 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][166/311]	eta 0:02:08 lr 0.000067	time 0.8758 (0.8851)	loss 0.4705 (0.5465)	grad_norm 3.7992 (3.1393)	mem 20675MB
[2025-04-03 02:26:56 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][168/311]	eta 0:02:06 lr 0.000067	time 0.8757 (0.8850)	loss 0.5722 (0.5471)	grad_norm 4.0064 (3.1494)	mem 20675MB
[2025-04-03 02:26:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][170/311]	eta 0:02:04 lr 0.000067	time 0.8756 (0.8849)	loss 0.5952 (0.5477)	grad_norm 2.8247 (3.1467)	mem 20675MB
[2025-04-03 02:26:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][172/311]	eta 0:02:02 lr 0.000067	time 0.8757 (0.8848)	loss 0.5019 (0.5470)	grad_norm 3.0098 (3.1418)	mem 20675MB
[2025-04-03 02:27:01 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][174/311]	eta 0:02:01 lr 0.000067	time 0.8757 (0.8847)	loss 0.5811 (0.5475)	grad_norm 2.5887 (3.1319)	mem 20675MB
[2025-04-03 02:27:03 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][176/311]	eta 0:01:59 lr 0.000066	time 0.8754 (0.8846)	loss 0.3713 (0.5462)	grad_norm 4.5837 (3.1464)	mem 20675MB
[2025-04-03 02:27:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][178/311]	eta 0:01:57 lr 0.000066	time 0.8756 (0.8845)	loss 0.5252 (0.5457)	grad_norm 3.8199 (3.1571)	mem 20675MB
[2025-04-03 02:27:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][180/311]	eta 0:01:55 lr 0.000066	time 0.8753 (0.8844)	loss 0.5687 (0.5456)	grad_norm 2.8617 (3.1570)	mem 20675MB
[2025-04-03 02:27:08 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][182/311]	eta 0:01:54 lr 0.000066	time 0.8755 (0.8843)	loss 0.6131 (0.5461)	grad_norm 2.5608 (3.1499)	mem 20675MB
[2025-04-03 02:27:10 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][184/311]	eta 0:01:52 lr 0.000066	time 0.8755 (0.8842)	loss 0.5258 (0.5461)	grad_norm 3.1668 (3.1516)	mem 20675MB
[2025-04-03 02:27:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][186/311]	eta 0:01:50 lr 0.000065	time 0.8757 (0.8841)	loss 0.4524 (0.5459)	grad_norm 3.2188 (3.1498)	mem 20675MB
[2025-04-03 02:27:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][188/311]	eta 0:01:48 lr 0.000065	time 0.8758 (0.8841)	loss 0.4063 (0.5456)	grad_norm 3.7476 (3.1479)	mem 20675MB
[2025-04-03 02:27:15 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][190/311]	eta 0:01:46 lr 0.000065	time 0.8760 (0.8842)	loss 0.4524 (0.5452)	grad_norm 4.3869 (3.1524)	mem 20675MB
[2025-04-03 02:27:17 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][192/311]	eta 0:01:45 lr 0.000065	time 0.8755 (0.8841)	loss 0.5326 (0.5446)	grad_norm 3.1070 (3.1577)	mem 20675MB
[2025-04-03 02:27:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][194/311]	eta 0:01:43 lr 0.000065	time 0.8756 (0.8840)	loss 0.5180 (0.5446)	grad_norm 4.0599 (3.1612)	mem 20675MB
[2025-04-03 02:27:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][196/311]	eta 0:01:41 lr 0.000065	time 0.8759 (0.8840)	loss 0.5045 (0.5438)	grad_norm 4.1516 (3.1712)	mem 20675MB
[2025-04-03 02:27:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][198/311]	eta 0:01:39 lr 0.000064	time 0.8758 (0.8839)	loss 0.4091 (0.5432)	grad_norm 3.6541 (3.1698)	mem 20675MB
[2025-04-03 02:27:24 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][200/311]	eta 0:01:38 lr 0.000064	time 0.8758 (0.8838)	loss 0.6394 (0.5442)	grad_norm 3.0585 (3.1691)	mem 20675MB
[2025-04-03 02:27:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][202/311]	eta 0:01:36 lr 0.000064	time 0.8758 (0.8838)	loss 0.5173 (0.5442)	grad_norm 2.7331 (3.1689)	mem 20675MB
[2025-04-03 02:27:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][204/311]	eta 0:01:34 lr 0.000064	time 0.8757 (0.8837)	loss 0.4410 (0.5433)	grad_norm 2.5453 (3.1679)	mem 20675MB
[2025-04-03 02:27:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][206/311]	eta 0:01:32 lr 0.000064	time 0.8755 (0.8836)	loss 0.5301 (0.5434)	grad_norm 3.6929 (3.1680)	mem 20675MB
[2025-04-03 02:27:31 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][208/311]	eta 0:01:31 lr 0.000063	time 0.8756 (0.8836)	loss 0.5691 (0.5432)	grad_norm 2.1907 (3.1661)	mem 20675MB
[2025-04-03 02:27:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][210/311]	eta 0:01:29 lr 0.000063	time 0.8781 (0.8835)	loss 0.4855 (0.5428)	grad_norm 4.6788 (3.1732)	mem 20675MB
[2025-04-03 02:27:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][212/311]	eta 0:01:27 lr 0.000063	time 0.8755 (0.8834)	loss 0.6254 (0.5434)	grad_norm 2.3087 (3.1641)	mem 20675MB
[2025-04-03 02:27:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][214/311]	eta 0:01:25 lr 0.000063	time 0.8755 (0.8834)	loss 0.5217 (0.5434)	grad_norm 3.1979 (3.1613)	mem 20675MB
[2025-04-03 02:27:38 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][216/311]	eta 0:01:23 lr 0.000063	time 0.8758 (0.8833)	loss 0.5021 (0.5425)	grad_norm 4.0180 (3.1696)	mem 20675MB
[2025-04-03 02:27:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][218/311]	eta 0:01:22 lr 0.000063	time 0.8757 (0.8832)	loss 0.6070 (0.5425)	grad_norm 3.5710 (3.1728)	mem 20675MB
[2025-04-03 02:27:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][220/311]	eta 0:01:20 lr 0.000062	time 0.8755 (0.8832)	loss 0.5922 (0.5430)	grad_norm 2.4009 (3.1657)	mem 20675MB
[2025-04-03 02:27:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][222/311]	eta 0:01:18 lr 0.000062	time 0.8756 (0.8831)	loss 0.3706 (0.5420)	grad_norm 3.1629 (3.1770)	mem 20675MB
[2025-04-03 02:27:45 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][224/311]	eta 0:01:16 lr 0.000062	time 0.8758 (0.8831)	loss 0.6152 (0.5418)	grad_norm 2.7710 (3.1785)	mem 20675MB
[2025-04-03 02:27:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][226/311]	eta 0:01:15 lr 0.000062	time 0.8758 (0.8830)	loss 0.6326 (0.5425)	grad_norm 2.7151 (3.1788)	mem 20675MB
[2025-04-03 02:27:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][228/311]	eta 0:01:13 lr 0.000062	time 0.8755 (0.8829)	loss 0.4971 (0.5424)	grad_norm 2.8386 (3.1770)	mem 20675MB
[2025-04-03 02:27:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][230/311]	eta 0:01:11 lr 0.000061	time 0.8758 (0.8829)	loss 0.5281 (0.5426)	grad_norm 2.8676 (3.1744)	mem 20675MB
[2025-04-03 02:27:52 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][232/311]	eta 0:01:09 lr 0.000061	time 0.8754 (0.8828)	loss 0.4240 (0.5422)	grad_norm 4.2027 (3.1768)	mem 20675MB
[2025-04-03 02:27:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][234/311]	eta 0:01:07 lr 0.000061	time 0.8758 (0.8828)	loss 0.5256 (0.5418)	grad_norm 3.3259 (3.1799)	mem 20675MB
[2025-04-03 02:27:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][236/311]	eta 0:01:06 lr 0.000061	time 0.8755 (0.8827)	loss 0.5690 (0.5419)	grad_norm 2.2575 (3.1718)	mem 20675MB
[2025-04-03 02:27:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][238/311]	eta 0:01:04 lr 0.000061	time 0.8756 (0.8827)	loss 0.5522 (0.5418)	grad_norm 3.3070 (3.1716)	mem 20675MB
[2025-04-03 02:27:59 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][240/311]	eta 0:01:02 lr 0.000061	time 0.8757 (0.8826)	loss 0.6309 (0.5423)	grad_norm 2.8711 (3.1654)	mem 20675MB
[2025-04-03 02:28:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][242/311]	eta 0:01:00 lr 0.000060	time 0.8757 (0.8826)	loss 0.6322 (0.5426)	grad_norm 2.6472 (3.1688)	mem 20675MB
[2025-04-03 02:28:02 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][244/311]	eta 0:00:59 lr 0.000060	time 0.8753 (0.8825)	loss 0.5346 (0.5426)	grad_norm 3.3453 (3.1668)	mem 20675MB
[2025-04-03 02:28:04 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][246/311]	eta 0:00:57 lr 0.000060	time 0.8758 (0.8825)	loss 0.4062 (0.5417)	grad_norm 3.8246 (3.1691)	mem 20675MB
[2025-04-03 02:28:06 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][248/311]	eta 0:00:55 lr 0.000060	time 0.8757 (0.8824)	loss 0.5389 (0.5413)	grad_norm 3.1120 (3.1670)	mem 20675MB
[2025-04-03 02:28:07 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][250/311]	eta 0:00:53 lr 0.000060	time 0.8757 (0.8824)	loss 0.4266 (0.5408)	grad_norm 3.2786 (3.1637)	mem 20675MB
[2025-04-03 02:28:09 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][252/311]	eta 0:00:52 lr 0.000059	time 0.8757 (0.8823)	loss 0.6140 (0.5410)	grad_norm 2.5254 (3.1612)	mem 20675MB
[2025-04-03 02:28:11 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][254/311]	eta 0:00:50 lr 0.000059	time 0.8757 (0.8823)	loss 0.4922 (0.5410)	grad_norm 3.2458 (3.1589)	mem 20675MB
[2025-04-03 02:28:13 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][256/311]	eta 0:00:48 lr 0.000059	time 0.8758 (0.8822)	loss 0.6122 (0.5409)	grad_norm 3.4549 (3.1645)	mem 20675MB
[2025-04-03 02:28:14 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][258/311]	eta 0:00:46 lr 0.000059	time 0.8756 (0.8822)	loss 0.5633 (0.5410)	grad_norm 2.3483 (3.1607)	mem 20675MB
[2025-04-03 02:28:16 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][260/311]	eta 0:00:44 lr 0.000059	time 0.8763 (0.8822)	loss 0.5406 (0.5411)	grad_norm 3.0386 (3.1581)	mem 20675MB
[2025-04-03 02:28:18 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][262/311]	eta 0:00:43 lr 0.000059	time 0.8755 (0.8821)	loss 0.4168 (0.5409)	grad_norm 3.7509 (3.1571)	mem 20675MB
[2025-04-03 02:28:20 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][264/311]	eta 0:00:41 lr 0.000058	time 0.8758 (0.8821)	loss 0.4364 (0.5407)	grad_norm 3.9633 (3.1584)	mem 20675MB
[2025-04-03 02:28:22 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][266/311]	eta 0:00:39 lr 0.000058	time 0.8756 (0.8820)	loss 0.6261 (0.5409)	grad_norm 3.5316 (3.1597)	mem 20675MB
[2025-04-03 02:28:23 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][268/311]	eta 0:00:37 lr 0.000058	time 0.8754 (0.8820)	loss 0.5590 (0.5405)	grad_norm 2.2907 (3.1572)	mem 20675MB
[2025-04-03 02:28:25 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][270/311]	eta 0:00:36 lr 0.000058	time 0.8757 (0.8820)	loss 0.4599 (0.5404)	grad_norm 3.6337 (3.1576)	mem 20675MB
[2025-04-03 02:28:27 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][272/311]	eta 0:00:34 lr 0.000058	time 0.8756 (0.8819)	loss 0.4184 (0.5401)	grad_norm 4.7660 (3.1611)	mem 20675MB
[2025-04-03 02:28:29 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][274/311]	eta 0:00:32 lr 0.000057	time 0.8767 (0.8819)	loss 0.4452 (0.5395)	grad_norm 4.1611 (3.1736)	mem 20675MB
[2025-04-03 02:28:30 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][276/311]	eta 0:00:30 lr 0.000057	time 0.8758 (0.8818)	loss 0.5262 (0.5394)	grad_norm 2.6683 (3.1719)	mem 20675MB
[2025-04-03 02:28:32 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][278/311]	eta 0:00:29 lr 0.000057	time 0.8755 (0.8818)	loss 0.5433 (0.5397)	grad_norm 1.9598 (3.1661)	mem 20675MB
[2025-04-03 02:28:34 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][280/311]	eta 0:00:27 lr 0.000057	time 0.8757 (0.8818)	loss 0.6546 (0.5397)	grad_norm 3.1062 (3.1696)	mem 20675MB
[2025-04-03 02:28:36 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][282/311]	eta 0:00:25 lr 0.000057	time 0.8756 (0.8817)	loss 0.6312 (0.5396)	grad_norm 3.0635 (3.1729)	mem 20675MB
[2025-04-03 02:28:37 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][284/311]	eta 0:00:23 lr 0.000057	time 0.8772 (0.8817)	loss 0.3981 (0.5393)	grad_norm 4.5235 (3.1769)	mem 20675MB
[2025-04-03 02:28:39 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][286/311]	eta 0:00:22 lr 0.000056	time 0.8754 (0.8817)	loss 0.6120 (0.5397)	grad_norm 2.6074 (3.1733)	mem 20675MB
[2025-04-03 02:28:41 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][288/311]	eta 0:00:20 lr 0.000056	time 0.8756 (0.8816)	loss 0.5078 (0.5398)	grad_norm 3.1415 (3.1709)	mem 20675MB
[2025-04-03 02:28:43 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][290/311]	eta 0:00:18 lr 0.000056	time 0.8757 (0.8816)	loss 0.4421 (0.5396)	grad_norm 3.4214 (3.1707)	mem 20675MB
[2025-04-03 02:28:44 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][292/311]	eta 0:00:16 lr 0.000056	time 0.8761 (0.8816)	loss 0.6239 (0.5401)	grad_norm 3.7332 (3.1699)	mem 20675MB
[2025-04-03 02:28:46 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][294/311]	eta 0:00:14 lr 0.000056	time 0.8763 (0.8815)	loss 0.6343 (0.5406)	grad_norm 2.1488 (3.1653)	mem 20675MB
[2025-04-03 02:28:48 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][296/311]	eta 0:00:13 lr 0.000056	time 0.8754 (0.8815)	loss 0.4534 (0.5406)	grad_norm 4.2779 (3.1678)	mem 20675MB
[2025-04-03 02:28:50 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][298/311]	eta 0:00:11 lr 0.000055	time 0.8753 (0.8815)	loss 0.5739 (0.5411)	grad_norm 3.0125 (3.1654)	mem 20675MB
[2025-04-03 02:28:51 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][300/311]	eta 0:00:09 lr 0.000055	time 0.8759 (0.8814)	loss 0.6060 (0.5409)	grad_norm 3.7696 (3.1695)	mem 20675MB
[2025-04-03 02:28:53 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][302/311]	eta 0:00:07 lr 0.000055	time 0.8754 (0.8814)	loss 0.6127 (0.5416)	grad_norm 2.8765 (3.1684)	mem 20675MB
[2025-04-03 02:28:55 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][304/311]	eta 0:00:06 lr 0.000055	time 0.8772 (0.8814)	loss 0.5098 (0.5410)	grad_norm 2.3807 (3.1666)	mem 20675MB
[2025-04-03 02:28:57 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][306/311]	eta 0:00:04 lr 0.000055	time 0.8757 (0.8813)	loss 0.5531 (0.5411)	grad_norm 1.9085 (3.1605)	mem 20675MB
[2025-04-03 02:28:58 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][308/311]	eta 0:00:02 lr 0.000055	time 0.8753 (0.8813)	loss 0.5781 (0.5410)	grad_norm 2.4993 (3.1634)	mem 20675MB
[2025-04-03 02:29:00 simmim_finetune] (main_finetune.py 252): INFO Train: [25/30][310/311]	eta 0:00:00 lr 0.000054	time 0.8753 (0.8813)	loss 0.5762 (0.5411)	grad_norm 2.7654 (3.1610)	mem 20675MB
[2025-04-03 02:29:00 simmim_finetune] (main_finetune.py 260): INFO EPOCH 25 training takes 0:04:34
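Each metric in the training lines above is printed as `current (running average)` — e.g. `loss 0.5762 (0.5411)` on the last step. A minimal sketch of the meter that produces this format (the `AverageMeter` name mirrors the common Swin/timm utility; this is an illustrative reimplementation, not the exact code in `main_finetune.py`):

```python
class AverageMeter:
    """Tracks the most recent value and the running average of a metric."""

    def __init__(self):
        self.val = 0.0   # most recent value
        self.sum = 0.0   # weighted sum of all values seen
        self.count = 0   # number of samples accumulated

    def update(self, val, n=1):
        self.val = val
        self.sum += val * n
        self.count += n

    @property
    def avg(self):
        return self.sum / max(self.count, 1)


# Feeding in the first three logged losses of a window:
loss_meter = AverageMeter()
for v in (0.5357, 0.4943, 0.5216):
    loss_meter.update(v)

# A log line would then read: "loss 0.5216 (0.5172)"
print(f"loss {loss_meter.val:.4f} ({loss_meter.avg:.4f})")
```

The same pattern explains the paired `time`, `grad_norm`, and `Acc@1` columns: the bare number is the latest batch, the parenthesized number is the average since the meter was reset at the start of the epoch (or of the evaluation pass).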
[2025-04-03 02:29:00 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt25.pth saving......
[2025-04-03 02:29:03 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt25.pth saved !!!
[2025-04-03 02:29:05 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.495 (1.495)	Loss 0.5678 (0.5678)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 02:29:05 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 02:29:05 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
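The overall `Acc@1 77.465` above is the sample-weighted mean over the two test batches (with `BATCH_SIZE: 128`, the 142 test images split into batches of 128 and 14). Only the first batch's accuracy (75.781%) is logged; the second batch's figure below, 13/14 ≈ 92.857%, is inferred arithmetically from the overall number and is an assumption, not a logged value:

```python
def weighted_top1(batch_accs, batch_sizes):
    """Overall top-1 accuracy as the sample-weighted mean of
    per-batch accuracies (each batch contributes in proportion
    to the number of images it contains)."""
    total = sum(batch_sizes)
    return sum(a * n for a, n in zip(batch_accs, batch_sizes)) / total


# 142 test images in batches of 128 and 14; second-batch accuracy
# (92.857% = 13/14) is reconstructed, not taken from the log:
acc = weighted_top1([75.781, 92.857], [128, 14])
print(f"Acc@1 {acc:.3f}")  # consistent with the logged 77.465
```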
[2025-04-03 02:29:05 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:29:05 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [4.3926779986778896e-07, 4.3926779986778896e-07, 5.470095079892298e-07, 5.470095079892298e-07, 7.127659820222157e-07, 7.127659820222157e-07, 9.677759420729633e-07, 9.677759420729633e-07, 1.3600989575356516e-06, 1.3600989575356516e-06, 1.9636728274782494e-06, 1.9636728274782494e-06, 2.8922480120053225e-06, 2.8922480120053225e-06, 4.32082521897005e-06, 4.32082521897005e-06, 6.5186363066080925e-06, 6.5186363066080925e-06, 9.899884133743544e-06, 9.899884133743544e-06, 1.5101803867798082e-05, 1.5101803867798082e-05, 2.3104757304805065e-05, 2.3104757304805065e-05, 3.541699336173888e-05, 3.541699336173888e-05, 5.435889498779091e-05, 5.435889498779091e-05]
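The ascending per-group learning rates above reflect the layer-wise LR decay configured in the head (`LAYER_DECAY: 0.65`, `BASE_LR: 0.00125`): earlier transformer blocks get geometrically smaller rates than later ones, so the pretrained low-level features are perturbed least. A hedged sketch of how such scales are typically generated (function and variable names are illustrative, not the exact code in `main_finetune.py`; the logged values additionally include the cosine schedule's warping at epoch 25, so they will not match these raw scales exactly):

```python
def layer_decay_scales(num_layers: int, decay: float = 0.65):
    """Per-group LR multipliers: the embedding/first group gets
    decay**num_layers, the head (last group) gets decay**0 == 1.0,
    so each group's rate is (1/decay)x the one before it."""
    return [decay ** (num_layers - i) for i in range(num_layers + 1)]


# For the 12-block ViT-Base in the config (DEPTH: 12), plus one
# group for the patch embedding:
scales = layer_decay_scales(12)
base_lr = 0.00125
lrs = [base_lr * s for s in scales]
# Consecutive groups differ by a factor of 1/0.65 ~= 1.54.
```

The paired duplicates in the logged list (each rate appearing twice) are consistent with the usual split of every layer group into a weight-decay and a no-weight-decay parameter group sharing one LR.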
[2025-04-03 02:29:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][0/311]	eta 0:12:06 lr 0.000054	time 2.3356 (2.3356)	loss 0.4871 (0.4871)	grad_norm 2.4001 (2.4001)	mem 20675MB
[2025-04-03 02:29:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][2/311]	eta 0:07:01 lr 0.000054	time 0.8765 (1.3637)	loss 0.6314 (0.5386)	grad_norm 3.1113 (3.3032)	mem 20675MB
[2025-04-03 02:29:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][4/311]	eta 0:05:58 lr 0.000054	time 0.8761 (1.1690)	loss 0.6362 (0.5643)	grad_norm 3.3838 (3.3601)	mem 20675MB
[2025-04-03 02:29:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][6/311]	eta 0:05:31 lr 0.000054	time 0.8761 (1.0856)	loss 0.5343 (0.5683)	grad_norm 2.3833 (3.1286)	mem 20675MB
[2025-04-03 02:29:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][8/311]	eta 0:05:14 lr 0.000054	time 0.8766 (1.0393)	loss 0.4473 (0.5632)	grad_norm 2.6080 (2.9915)	mem 20675MB
[2025-04-03 02:29:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][10/311]	eta 0:05:03 lr 0.000053	time 0.8762 (1.0098)	loss 0.6348 (0.5765)	grad_norm 2.8891 (2.9339)	mem 20675MB
[2025-04-03 02:29:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][12/311]	eta 0:04:55 lr 0.000053	time 0.8757 (0.9894)	loss 0.5362 (0.5662)	grad_norm 3.4294 (3.0446)	mem 20675MB
[2025-04-03 02:29:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][14/311]	eta 0:04:49 lr 0.000053	time 0.8757 (0.9743)	loss 0.6028 (0.5583)	grad_norm 2.8771 (3.1770)	mem 20675MB
[2025-04-03 02:29:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][16/311]	eta 0:04:44 lr 0.000053	time 0.8763 (0.9629)	loss 0.4132 (0.5441)	grad_norm 4.3344 (3.2517)	mem 20675MB
[2025-04-03 02:29:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][18/311]	eta 0:04:39 lr 0.000053	time 0.8763 (0.9539)	loss 0.5207 (0.5456)	grad_norm 2.8018 (3.1812)	mem 20675MB
[2025-04-03 02:29:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][20/311]	eta 0:04:35 lr 0.000053	time 0.8758 (0.9466)	loss 0.6039 (0.5469)	grad_norm 2.7323 (3.1659)	mem 20675MB
[2025-04-03 02:29:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][22/311]	eta 0:04:31 lr 0.000052	time 0.8758 (0.9405)	loss 0.5770 (0.5442)	grad_norm 2.1521 (3.1721)	mem 20675MB
[2025-04-03 02:29:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][24/311]	eta 0:04:28 lr 0.000052	time 0.8764 (0.9354)	loss 0.4996 (0.5418)	grad_norm 3.5693 (3.1550)	mem 20675MB
[2025-04-03 02:29:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][26/311]	eta 0:04:25 lr 0.000052	time 0.8759 (0.9311)	loss 0.6196 (0.5451)	grad_norm 2.7774 (3.1344)	mem 20675MB
[2025-04-03 02:29:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][28/311]	eta 0:04:22 lr 0.000052	time 0.8761 (0.9273)	loss 0.5058 (0.5423)	grad_norm 3.1077 (3.1563)	mem 20675MB
[2025-04-03 02:29:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][30/311]	eta 0:04:19 lr 0.000052	time 0.8759 (0.9241)	loss 0.4671 (0.5381)	grad_norm 6.3363 (3.2934)	mem 20675MB
[2025-04-03 02:29:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][32/311]	eta 0:04:17 lr 0.000052	time 0.8765 (0.9213)	loss 0.6638 (0.5419)	grad_norm 2.9340 (3.2849)	mem 20675MB
[2025-04-03 02:29:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][34/311]	eta 0:04:14 lr 0.000051	time 0.8760 (0.9187)	loss 0.6107 (0.5451)	grad_norm 2.2882 (3.2309)	mem 20675MB
[2025-04-03 02:29:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][36/311]	eta 0:04:12 lr 0.000051	time 0.8770 (0.9165)	loss 0.5745 (0.5458)	grad_norm 2.4968 (3.2053)	mem 20675MB
[2025-04-03 02:29:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][38/311]	eta 0:04:09 lr 0.000051	time 0.8759 (0.9145)	loss 0.4876 (0.5436)	grad_norm 4.4366 (3.2326)	mem 20675MB
[2025-04-03 02:29:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][40/311]	eta 0:04:07 lr 0.000051	time 0.8761 (0.9126)	loss 0.6224 (0.5472)	grad_norm 3.4052 (3.2067)	mem 20675MB
[2025-04-03 02:29:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][42/311]	eta 0:04:05 lr 0.000051	time 0.8759 (0.9110)	loss 0.5781 (0.5476)	grad_norm 2.3531 (3.1828)	mem 20675MB
[2025-04-03 02:29:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][44/311]	eta 0:04:02 lr 0.000051	time 0.8758 (0.9095)	loss 0.4934 (0.5483)	grad_norm 3.2372 (3.1780)	mem 20675MB
[2025-04-03 02:29:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][46/311]	eta 0:04:00 lr 0.000050	time 0.8760 (0.9081)	loss 0.4763 (0.5455)	grad_norm 4.5960 (3.2214)	mem 20675MB
[2025-04-03 02:29:49 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][48/311]	eta 0:03:58 lr 0.000050	time 0.8766 (0.9068)	loss 0.3771 (0.5421)	grad_norm 3.9917 (3.2303)	mem 20675MB
[2025-04-03 02:29:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][50/311]	eta 0:03:56 lr 0.000050	time 0.8762 (0.9056)	loss 0.5740 (0.5411)	grad_norm 2.7292 (3.2262)	mem 20675MB
[2025-04-03 02:29:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][52/311]	eta 0:03:54 lr 0.000050	time 0.8758 (0.9046)	loss 0.5522 (0.5422)	grad_norm 2.2039 (3.2025)	mem 20675MB
[2025-04-03 02:29:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][54/311]	eta 0:03:52 lr 0.000050	time 0.8757 (0.9035)	loss 0.5075 (0.5438)	grad_norm 3.4642 (3.1953)	mem 20675MB
[2025-04-03 02:29:56 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][56/311]	eta 0:03:50 lr 0.000050	time 0.8761 (0.9026)	loss 0.5021 (0.5439)	grad_norm 2.6782 (3.1816)	mem 20675MB
[2025-04-03 02:29:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][58/311]	eta 0:03:48 lr 0.000049	time 0.8759 (0.9017)	loss 0.5841 (0.5438)	grad_norm 3.7435 (3.2002)	mem 20675MB
[2025-04-03 02:30:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][60/311]	eta 0:03:46 lr 0.000049	time 0.8757 (0.9009)	loss 0.4828 (0.5427)	grad_norm 3.5849 (3.1900)	mem 20675MB
[2025-04-03 02:30:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][62/311]	eta 0:03:44 lr 0.000049	time 0.8761 (0.9001)	loss 0.5752 (0.5424)	grad_norm 2.9454 (3.1815)	mem 20675MB
[2025-04-03 02:30:03 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][64/311]	eta 0:03:42 lr 0.000049	time 0.8762 (0.8994)	loss 0.5574 (0.5435)	grad_norm 2.1424 (3.1595)	mem 20675MB
[2025-04-03 02:30:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][66/311]	eta 0:03:40 lr 0.000049	time 0.8759 (0.8987)	loss 0.5016 (0.5427)	grad_norm 6.0980 (3.2053)	mem 20675MB
[2025-04-03 02:30:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][68/311]	eta 0:03:38 lr 0.000049	time 0.8759 (0.8981)	loss 0.6519 (0.5424)	grad_norm 2.5790 (3.1946)	mem 20675MB
[2025-04-03 02:30:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][70/311]	eta 0:03:36 lr 0.000048	time 0.8759 (0.8975)	loss 0.5656 (0.5439)	grad_norm 2.7346 (3.1827)	mem 20675MB
[2025-04-03 02:30:10 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][72/311]	eta 0:03:34 lr 0.000048	time 0.8762 (0.8969)	loss 0.5120 (0.5442)	grad_norm 4.0030 (3.1815)	mem 20675MB
[2025-04-03 02:30:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][74/311]	eta 0:03:32 lr 0.000048	time 0.8758 (0.8964)	loss 0.4831 (0.5418)	grad_norm 2.9187 (3.1861)	mem 20675MB
[2025-04-03 02:30:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][76/311]	eta 0:03:30 lr 0.000048	time 0.8761 (0.8959)	loss 0.4135 (0.5405)	grad_norm 5.9171 (3.2173)	mem 20675MB
[2025-04-03 02:30:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][78/311]	eta 0:03:28 lr 0.000048	time 0.8759 (0.8954)	loss 0.5874 (0.5413)	grad_norm 3.2720 (3.2234)	mem 20675MB
[2025-04-03 02:30:17 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][80/311]	eta 0:03:26 lr 0.000048	time 0.8766 (0.8950)	loss 0.5715 (0.5415)	grad_norm 2.4853 (3.2172)	mem 20675MB
[2025-04-03 02:30:19 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][82/311]	eta 0:03:24 lr 0.000047	time 0.8758 (0.8945)	loss 0.5908 (0.5429)	grad_norm 2.6873 (3.2155)	mem 20675MB
[2025-04-03 02:30:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][84/311]	eta 0:03:22 lr 0.000047	time 0.8762 (0.8941)	loss 0.6323 (0.5436)	grad_norm 2.0306 (3.1981)	mem 20675MB
[2025-04-03 02:30:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][86/311]	eta 0:03:21 lr 0.000047	time 0.8763 (0.8937)	loss 0.3886 (0.5428)	grad_norm 3.6028 (3.1981)	mem 20675MB
[2025-04-03 02:30:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][88/311]	eta 0:03:19 lr 0.000047	time 0.8761 (0.8934)	loss 0.5632 (0.5430)	grad_norm 2.6602 (3.1960)	mem 20675MB
[2025-04-03 02:30:26 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][90/311]	eta 0:03:17 lr 0.000047	time 0.8760 (0.8930)	loss 0.5953 (0.5435)	grad_norm 2.5041 (3.1848)	mem 20675MB
[2025-04-03 02:30:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][92/311]	eta 0:03:15 lr 0.000047	time 0.8760 (0.8927)	loss 0.6033 (0.5447)	grad_norm 3.1602 (3.1770)	mem 20675MB
[2025-04-03 02:30:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][94/311]	eta 0:03:13 lr 0.000047	time 0.8763 (0.8923)	loss 0.5783 (0.5445)	grad_norm 2.5207 (3.1682)	mem 20675MB
[2025-04-03 02:30:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][96/311]	eta 0:03:11 lr 0.000046	time 0.8758 (0.8920)	loss 0.4064 (0.5436)	grad_norm 4.1790 (3.1710)	mem 20675MB
[2025-04-03 02:30:33 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][98/311]	eta 0:03:09 lr 0.000046	time 0.8763 (0.8917)	loss 0.6107 (0.5453)	grad_norm 2.1894 (3.1515)	mem 20675MB
[2025-04-03 02:30:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][100/311]	eta 0:03:08 lr 0.000046	time 0.8771 (0.8914)	loss 0.6078 (0.5464)	grad_norm 2.5884 (3.1412)	mem 20675MB
[2025-04-03 02:30:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][102/311]	eta 0:03:06 lr 0.000046	time 0.8765 (0.8912)	loss 0.5677 (0.5453)	grad_norm 2.3689 (3.1500)	mem 20675MB
[2025-04-03 02:30:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][104/311]	eta 0:03:04 lr 0.000046	time 0.8761 (0.8909)	loss 0.5589 (0.5439)	grad_norm 2.1968 (3.1491)	mem 20675MB
[2025-04-03 02:30:40 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][106/311]	eta 0:03:02 lr 0.000046	time 0.8762 (0.8906)	loss 0.5810 (0.5430)	grad_norm 2.4971 (3.1449)	mem 20675MB
[2025-04-03 02:30:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][108/311]	eta 0:03:00 lr 0.000045	time 0.8766 (0.8904)	loss 0.5624 (0.5429)	grad_norm 2.5484 (3.1383)	mem 20675MB
[2025-04-03 02:30:44 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][110/311]	eta 0:02:58 lr 0.000045	time 0.8762 (0.8902)	loss 0.5349 (0.5426)	grad_norm 2.0488 (3.1224)	mem 20675MB
[2025-04-03 02:30:46 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][112/311]	eta 0:02:57 lr 0.000045	time 0.8763 (0.8899)	loss 0.4967 (0.5417)	grad_norm 3.7726 (3.1334)	mem 20675MB
[2025-04-03 02:30:47 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][114/311]	eta 0:02:55 lr 0.000045	time 0.8764 (0.8897)	loss 0.6063 (0.5430)	grad_norm 3.0476 (3.1219)	mem 20675MB
[2025-04-03 02:30:49 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][116/311]	eta 0:02:53 lr 0.000045	time 0.8763 (0.8895)	loss 0.5658 (0.5433)	grad_norm 2.0451 (3.1085)	mem 20675MB
[2025-04-03 02:30:51 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][118/311]	eta 0:02:51 lr 0.000045	time 0.8765 (0.8893)	loss 0.5510 (0.5438)	grad_norm 2.9072 (3.0983)	mem 20675MB
[2025-04-03 02:30:53 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][120/311]	eta 0:02:49 lr 0.000044	time 0.8766 (0.8891)	loss 0.6486 (0.5438)	grad_norm 3.4728 (3.0989)	mem 20675MB
[2025-04-03 02:30:54 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][122/311]	eta 0:02:48 lr 0.000044	time 0.8765 (0.8889)	loss 0.5303 (0.5436)	grad_norm 2.3741 (3.1023)	mem 20675MB
[2025-04-03 02:30:56 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][124/311]	eta 0:02:46 lr 0.000044	time 0.8766 (0.8887)	loss 0.5692 (0.5431)	grad_norm 2.4316 (3.1099)	mem 20675MB
[2025-04-03 02:30:58 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][126/311]	eta 0:02:44 lr 0.000044	time 0.8766 (0.8886)	loss 0.4396 (0.5423)	grad_norm 3.9455 (3.1151)	mem 20675MB
[2025-04-03 02:31:00 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][128/311]	eta 0:02:42 lr 0.000044	time 0.8767 (0.8884)	loss 0.5833 (0.5423)	grad_norm 2.5441 (3.1137)	mem 20675MB
[2025-04-03 02:31:01 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][130/311]	eta 0:02:40 lr 0.000044	time 0.8765 (0.8882)	loss 0.5963 (0.5429)	grad_norm 2.8537 (3.1056)	mem 20675MB
[2025-04-03 02:31:03 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][132/311]	eta 0:02:38 lr 0.000044	time 0.8765 (0.8881)	loss 0.4982 (0.5420)	grad_norm 3.9472 (3.1146)	mem 20675MB
[2025-04-03 02:31:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][134/311]	eta 0:02:37 lr 0.000043	time 0.8762 (0.8879)	loss 0.6293 (0.5421)	grad_norm 2.9879 (3.1172)	mem 20675MB
[2025-04-03 02:31:07 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][136/311]	eta 0:02:35 lr 0.000043	time 0.8764 (0.8877)	loss 0.5027 (0.5422)	grad_norm 2.9569 (3.1119)	mem 20675MB
[2025-04-03 02:31:08 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][138/311]	eta 0:02:33 lr 0.000043	time 0.8762 (0.8876)	loss 0.4022 (0.5418)	grad_norm 4.1849 (3.1204)	mem 20675MB
[2025-04-03 02:31:10 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][140/311]	eta 0:02:31 lr 0.000043	time 0.8762 (0.8875)	loss 0.4317 (0.5403)	grad_norm 4.9878 (3.1435)	mem 20675MB
[2025-04-03 02:31:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][142/311]	eta 0:02:29 lr 0.000043	time 0.8763 (0.8873)	loss 0.5826 (0.5409)	grad_norm 2.8733 (3.1371)	mem 20675MB
[2025-04-03 02:31:14 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][144/311]	eta 0:02:28 lr 0.000043	time 0.8772 (0.8872)	loss 0.4628 (0.5405)	grad_norm 3.4272 (3.1330)	mem 20675MB
[2025-04-03 02:31:15 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][146/311]	eta 0:02:26 lr 0.000042	time 0.8765 (0.8870)	loss 0.4150 (0.5389)	grad_norm 3.7724 (3.1535)	mem 20675MB
[2025-04-03 02:31:17 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][148/311]	eta 0:02:24 lr 0.000042	time 0.8764 (0.8869)	loss 0.5487 (0.5390)	grad_norm 2.0346 (3.1466)	mem 20675MB
[2025-04-03 02:31:19 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][150/311]	eta 0:02:22 lr 0.000042	time 0.8761 (0.8868)	loss 0.4950 (0.5385)	grad_norm 3.4895 (3.1602)	mem 20675MB
[2025-04-03 02:31:21 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][152/311]	eta 0:02:20 lr 0.000042	time 0.8764 (0.8867)	loss 0.6592 (0.5387)	grad_norm 3.0893 (3.1589)	mem 20675MB
[2025-04-03 02:31:22 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][154/311]	eta 0:02:19 lr 0.000042	time 0.8767 (0.8865)	loss 0.5772 (0.5389)	grad_norm 2.7363 (3.1555)	mem 20675MB
[2025-04-03 02:31:24 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][156/311]	eta 0:02:17 lr 0.000042	time 0.8761 (0.8864)	loss 0.3808 (0.5383)	grad_norm 4.1421 (3.1547)	mem 20675MB
[2025-04-03 02:31:26 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][158/311]	eta 0:02:15 lr 0.000042	time 0.8764 (0.8863)	loss 0.5724 (0.5388)	grad_norm 3.0754 (3.1518)	mem 20675MB
[2025-04-03 02:31:28 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][160/311]	eta 0:02:13 lr 0.000041	time 0.8765 (0.8862)	loss 0.5663 (0.5396)	grad_norm 2.2918 (3.1484)	mem 20675MB
[2025-04-03 02:31:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][162/311]	eta 0:02:12 lr 0.000041	time 0.8764 (0.8861)	loss 0.5107 (0.5386)	grad_norm 2.7605 (3.1574)	mem 20675MB
[2025-04-03 02:31:31 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][164/311]	eta 0:02:10 lr 0.000041	time 0.8762 (0.8860)	loss 0.5719 (0.5388)	grad_norm 2.9459 (3.1543)	mem 20675MB
[2025-04-03 02:31:33 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][166/311]	eta 0:02:08 lr 0.000041	time 0.8763 (0.8859)	loss 0.6057 (0.5397)	grad_norm 2.5978 (3.1472)	mem 20675MB
[2025-04-03 02:31:35 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][168/311]	eta 0:02:06 lr 0.000041	time 0.8758 (0.8858)	loss 0.6142 (0.5398)	grad_norm 2.1373 (3.1474)	mem 20675MB
[2025-04-03 02:31:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][170/311]	eta 0:02:04 lr 0.000041	time 0.8763 (0.8857)	loss 0.5858 (0.5406)	grad_norm 2.6503 (3.1391)	mem 20675MB
[2025-04-03 02:31:38 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][172/311]	eta 0:02:03 lr 0.000041	time 0.8757 (0.8856)	loss 0.5216 (0.5406)	grad_norm 3.2777 (3.1333)	mem 20675MB
[2025-04-03 02:31:40 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][174/311]	eta 0:02:01 lr 0.000040	time 0.8757 (0.8855)	loss 0.4836 (0.5407)	grad_norm 2.7051 (3.1265)	mem 20675MB
[2025-04-03 02:31:42 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][176/311]	eta 0:01:59 lr 0.000040	time 0.8773 (0.8854)	loss 0.5625 (0.5408)	grad_norm 2.6708 (3.1218)	mem 20675MB
[2025-04-03 02:31:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][178/311]	eta 0:01:57 lr 0.000040	time 0.8759 (0.8853)	loss 0.5933 (0.5415)	grad_norm 3.6329 (3.1284)	mem 20675MB
[2025-04-03 02:31:45 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][180/311]	eta 0:01:55 lr 0.000040	time 0.8760 (0.8852)	loss 0.6142 (0.5422)	grad_norm 2.4063 (3.1225)	mem 20675MB
[2025-04-03 02:31:47 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][182/311]	eta 0:01:54 lr 0.000040	time 0.8758 (0.8851)	loss 0.5592 (0.5422)	grad_norm 2.4090 (3.1229)	mem 20675MB
[2025-04-03 02:31:49 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][184/311]	eta 0:01:52 lr 0.000040	time 0.8765 (0.8850)	loss 0.6131 (0.5427)	grad_norm 2.0979 (3.1135)	mem 20675MB
[2025-04-03 02:31:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][186/311]	eta 0:01:50 lr 0.000039	time 0.8759 (0.8849)	loss 0.5851 (0.5430)	grad_norm 2.8602 (3.1095)	mem 20675MB
[2025-04-03 02:31:52 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][188/311]	eta 0:01:48 lr 0.000039	time 0.8757 (0.8848)	loss 0.5977 (0.5427)	grad_norm 2.1802 (3.1057)	mem 20675MB
[2025-04-03 02:31:54 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][190/311]	eta 0:01:47 lr 0.000039	time 0.8760 (0.8847)	loss 0.6213 (0.5432)	grad_norm 3.1657 (3.1076)	mem 20675MB
[2025-04-03 02:31:56 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][192/311]	eta 0:01:45 lr 0.000039	time 0.8759 (0.8847)	loss 0.5208 (0.5428)	grad_norm 2.8719 (3.1086)	mem 20675MB
[2025-04-03 02:31:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][194/311]	eta 0:01:43 lr 0.000039	time 0.8758 (0.8846)	loss 0.5075 (0.5422)	grad_norm 2.5188 (3.1125)	mem 20675MB
[2025-04-03 02:31:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][196/311]	eta 0:01:41 lr 0.000039	time 0.8757 (0.8845)	loss 0.6428 (0.5434)	grad_norm 2.6225 (3.1104)	mem 20675MB
[2025-04-03 02:32:01 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][198/311]	eta 0:01:39 lr 0.000039	time 0.8762 (0.8844)	loss 0.7118 (0.5444)	grad_norm 3.4583 (3.1136)	mem 20675MB
[2025-04-03 02:32:03 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][200/311]	eta 0:01:38 lr 0.000038	time 0.8760 (0.8843)	loss 0.5335 (0.5441)	grad_norm 3.4925 (3.1138)	mem 20675MB
[2025-04-03 02:32:05 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][202/311]	eta 0:01:36 lr 0.000038	time 0.8758 (0.8843)	loss 0.5805 (0.5441)	grad_norm 2.5186 (3.1139)	mem 20675MB
[2025-04-03 02:32:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][204/311]	eta 0:01:34 lr 0.000038	time 0.8758 (0.8842)	loss 0.6031 (0.5446)	grad_norm 3.0650 (3.1110)	mem 20675MB
[2025-04-03 02:32:08 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][206/311]	eta 0:01:32 lr 0.000038	time 0.8759 (0.8841)	loss 0.5744 (0.5449)	grad_norm 2.5536 (3.1067)	mem 20675MB
[2025-04-03 02:32:10 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][208/311]	eta 0:01:31 lr 0.000038	time 0.8757 (0.8841)	loss 0.5002 (0.5443)	grad_norm 2.9860 (3.1071)	mem 20675MB
[2025-04-03 02:32:12 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][210/311]	eta 0:01:29 lr 0.000038	time 0.8757 (0.8840)	loss 0.5564 (0.5443)	grad_norm 2.8869 (3.1061)	mem 20675MB
[2025-04-03 02:32:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][212/311]	eta 0:01:27 lr 0.000038	time 0.8760 (0.8839)	loss 0.4523 (0.5434)	grad_norm 3.0005 (3.1137)	mem 20675MB
[2025-04-03 02:32:15 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][214/311]	eta 0:01:25 lr 0.000037	time 0.8760 (0.8839)	loss 0.5979 (0.5439)	grad_norm 2.5722 (3.1092)	mem 20675MB
[2025-04-03 02:32:17 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][216/311]	eta 0:01:23 lr 0.000037	time 0.8755 (0.8838)	loss 0.4493 (0.5434)	grad_norm 3.8088 (3.1138)	mem 20675MB
[2025-04-03 02:32:19 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][218/311]	eta 0:01:22 lr 0.000037	time 0.8756 (0.8837)	loss 0.5386 (0.5435)	grad_norm 3.1774 (3.1112)	mem 20675MB
[2025-04-03 02:32:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][220/311]	eta 0:01:20 lr 0.000037	time 0.8758 (0.8837)	loss 0.4512 (0.5425)	grad_norm 5.1490 (3.1304)	mem 20675MB
[2025-04-03 02:32:22 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][222/311]	eta 0:01:18 lr 0.000037	time 0.8759 (0.8836)	loss 0.5427 (0.5428)	grad_norm 4.2353 (3.1307)	mem 20675MB
[2025-04-03 02:32:24 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][224/311]	eta 0:01:16 lr 0.000037	time 0.8758 (0.8835)	loss 0.5992 (0.5429)	grad_norm 2.7064 (3.1299)	mem 20675MB
[2025-04-03 02:32:26 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][226/311]	eta 0:01:15 lr 0.000037	time 0.8757 (0.8835)	loss 0.5701 (0.5433)	grad_norm 3.2380 (3.1271)	mem 20675MB
[2025-04-03 02:32:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][228/311]	eta 0:01:13 lr 0.000036	time 0.8758 (0.8834)	loss 0.4346 (0.5426)	grad_norm 3.3448 (3.1288)	mem 20675MB
[2025-04-03 02:32:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][230/311]	eta 0:01:11 lr 0.000036	time 0.8757 (0.8834)	loss 0.5902 (0.5424)	grad_norm 2.0581 (3.1289)	mem 20675MB
[2025-04-03 02:32:31 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][232/311]	eta 0:01:09 lr 0.000036	time 0.8756 (0.8833)	loss 0.5232 (0.5424)	grad_norm 3.0861 (3.1283)	mem 20675MB
[2025-04-03 02:32:33 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][234/311]	eta 0:01:08 lr 0.000036	time 0.8757 (0.8832)	loss 0.5620 (0.5428)	grad_norm 2.4132 (3.1230)	mem 20675MB
[2025-04-03 02:32:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][236/311]	eta 0:01:06 lr 0.000036	time 0.8760 (0.8832)	loss 0.5890 (0.5429)	grad_norm 3.0294 (3.1220)	mem 20675MB
[2025-04-03 02:32:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][238/311]	eta 0:01:04 lr 0.000036	time 0.8760 (0.8831)	loss 0.3967 (0.5423)	grad_norm 4.3579 (3.1256)	mem 20675MB
[2025-04-03 02:32:38 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][240/311]	eta 0:01:02 lr 0.000036	time 0.8759 (0.8831)	loss 0.4498 (0.5422)	grad_norm 4.6992 (3.1280)	mem 20675MB
[2025-04-03 02:32:40 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][242/311]	eta 0:01:00 lr 0.000035	time 0.8758 (0.8830)	loss 0.4867 (0.5421)	grad_norm 3.2506 (3.1249)	mem 20675MB
[2025-04-03 02:32:41 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][244/311]	eta 0:00:59 lr 0.000035	time 0.8761 (0.8830)	loss 0.5056 (0.5420)	grad_norm 4.8188 (3.1287)	mem 20675MB
[2025-04-03 02:32:43 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][246/311]	eta 0:00:57 lr 0.000035	time 0.8761 (0.8829)	loss 0.5302 (0.5421)	grad_norm 2.8123 (3.1237)	mem 20675MB
[2025-04-03 02:32:45 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][248/311]	eta 0:00:55 lr 0.000035	time 0.8760 (0.8829)	loss 0.5772 (0.5425)	grad_norm 3.2287 (3.1209)	mem 20675MB
[2025-04-03 02:32:47 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][250/311]	eta 0:00:53 lr 0.000035	time 0.8760 (0.8828)	loss 0.5221 (0.5424)	grad_norm 3.3115 (3.1203)	mem 20675MB
[2025-04-03 02:32:48 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][252/311]	eta 0:00:52 lr 0.000035	time 0.8761 (0.8828)	loss 0.6437 (0.5424)	grad_norm 3.0495 (3.1253)	mem 20675MB
[2025-04-03 02:32:50 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][254/311]	eta 0:00:50 lr 0.000035	time 0.8760 (0.8827)	loss 0.5034 (0.5423)	grad_norm 3.4692 (3.1304)	mem 20675MB
[2025-04-03 02:32:52 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][256/311]	eta 0:00:48 lr 0.000035	time 0.8760 (0.8827)	loss 0.5490 (0.5423)	grad_norm 2.5550 (3.1243)	mem 20675MB
[2025-04-03 02:32:54 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][258/311]	eta 0:00:46 lr 0.000034	time 0.8758 (0.8827)	loss 0.5001 (0.5420)	grad_norm 4.4136 (3.1353)	mem 20675MB
[2025-04-03 02:32:55 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][260/311]	eta 0:00:45 lr 0.000034	time 0.8772 (0.8826)	loss 0.5928 (0.5418)	grad_norm 3.1692 (3.1408)	mem 20675MB
[2025-04-03 02:32:57 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][262/311]	eta 0:00:43 lr 0.000034	time 0.8758 (0.8826)	loss 0.5547 (0.5416)	grad_norm 2.3383 (3.1393)	mem 20675MB
[2025-04-03 02:32:59 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][264/311]	eta 0:00:41 lr 0.000034	time 0.8759 (0.8825)	loss 0.5604 (0.5420)	grad_norm 3.4419 (3.1377)	mem 20675MB
[2025-04-03 02:33:01 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][266/311]	eta 0:00:39 lr 0.000034	time 0.8758 (0.8825)	loss 0.5731 (0.5421)	grad_norm 2.8255 (3.1336)	mem 20675MB
[2025-04-03 02:33:02 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][268/311]	eta 0:00:37 lr 0.000034	time 0.8766 (0.8825)	loss 0.4629 (0.5420)	grad_norm 3.2952 (3.1332)	mem 20675MB
[2025-04-03 02:33:04 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][270/311]	eta 0:00:36 lr 0.000034	time 0.8761 (0.8824)	loss 0.6132 (0.5422)	grad_norm 2.3942 (3.1355)	mem 20675MB
[2025-04-03 02:33:06 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][272/311]	eta 0:00:34 lr 0.000033	time 0.8760 (0.8824)	loss 0.5163 (0.5424)	grad_norm 4.6144 (3.1380)	mem 20675MB
[2025-04-03 02:33:08 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][274/311]	eta 0:00:32 lr 0.000033	time 0.8760 (0.8823)	loss 0.6417 (0.5427)	grad_norm 2.6780 (3.1338)	mem 20675MB
[2025-04-03 02:33:09 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][276/311]	eta 0:00:30 lr 0.000033	time 0.8761 (0.8823)	loss 0.6022 (0.5431)	grad_norm 3.4519 (3.1351)	mem 20675MB
[2025-04-03 02:33:11 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][278/311]	eta 0:00:29 lr 0.000033	time 0.8760 (0.8823)	loss 0.4180 (0.5424)	grad_norm 3.1825 (3.1371)	mem 20675MB
[2025-04-03 02:33:13 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][280/311]	eta 0:00:27 lr 0.000033	time 0.8765 (0.8822)	loss 0.4662 (0.5422)	grad_norm 4.8130 (3.1406)	mem 20675MB
[2025-04-03 02:33:15 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][282/311]	eta 0:00:25 lr 0.000033	time 0.8761 (0.8822)	loss 0.3956 (0.5420)	grad_norm 3.6606 (3.1409)	mem 20675MB
[2025-04-03 02:33:16 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][284/311]	eta 0:00:23 lr 0.000033	time 0.8761 (0.8821)	loss 0.5095 (0.5422)	grad_norm 2.5494 (3.1383)	mem 20675MB
[2025-04-03 02:33:18 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][286/311]	eta 0:00:22 lr 0.000032	time 0.8764 (0.8821)	loss 0.6388 (0.5426)	grad_norm 2.6849 (3.1350)	mem 20675MB
[2025-04-03 02:33:20 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][288/311]	eta 0:00:20 lr 0.000032	time 0.8760 (0.8821)	loss 0.5533 (0.5426)	grad_norm 2.5067 (3.1293)	mem 20675MB
[2025-04-03 02:33:22 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][290/311]	eta 0:00:18 lr 0.000032	time 0.8757 (0.8820)	loss 0.5557 (0.5424)	grad_norm 3.9111 (3.1300)	mem 20675MB
[2025-04-03 02:33:23 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][292/311]	eta 0:00:16 lr 0.000032	time 0.8760 (0.8820)	loss 0.5171 (0.5425)	grad_norm 2.9850 (3.1266)	mem 20675MB
[2025-04-03 02:33:25 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][294/311]	eta 0:00:14 lr 0.000032	time 0.8760 (0.8820)	loss 0.5859 (0.5426)	grad_norm 4.0953 (3.1283)	mem 20675MB
[2025-04-03 02:33:27 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][296/311]	eta 0:00:13 lr 0.000032	time 0.8756 (0.8819)	loss 0.5347 (0.5429)	grad_norm 3.2567 (3.1280)	mem 20675MB
[2025-04-03 02:33:29 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][298/311]	eta 0:00:11 lr 0.000032	time 0.8756 (0.8819)	loss 0.4556 (0.5430)	grad_norm 3.1773 (3.1279)	mem 20675MB
[2025-04-03 02:33:30 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][300/311]	eta 0:00:09 lr 0.000032	time 0.8757 (0.8819)	loss 0.5838 (0.5433)	grad_norm 2.1571 (3.1245)	mem 20675MB
[2025-04-03 02:33:32 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][302/311]	eta 0:00:07 lr 0.000031	time 0.8759 (0.8818)	loss 0.6487 (0.5438)	grad_norm 3.2806 (3.1223)	mem 20675MB
[2025-04-03 02:33:34 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][304/311]	eta 0:00:06 lr 0.000031	time 0.8758 (0.8818)	loss 0.5262 (0.5437)	grad_norm 2.7983 (3.1199)	mem 20675MB
[2025-04-03 02:33:36 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][306/311]	eta 0:00:04 lr 0.000031	time 0.8758 (0.8818)	loss 0.5326 (0.5436)	grad_norm 3.1771 (3.1205)	mem 20675MB
[2025-04-03 02:33:37 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][308/311]	eta 0:00:02 lr 0.000031	time 0.8758 (0.8817)	loss 0.5511 (0.5437)	grad_norm 2.5791 (3.1187)	mem 20675MB
[2025-04-03 02:33:39 simmim_finetune] (main_finetune.py 252): INFO Train: [26/30][310/311]	eta 0:00:00 lr 0.000031	time 0.8759 (0.8817)	loss 0.5237 (0.5441)	grad_norm 3.4577 (3.1174)	mem 20675MB
[2025-04-03 02:33:39 simmim_finetune] (main_finetune.py 260): INFO EPOCH 26 training takes 0:04:34
[2025-04-03 02:33:41 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.434 (1.434)	Loss 0.5820 (0.5820)	Acc@1 75.781 (75.781)	Mem 20675MB
[2025-04-03 02:33:41 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 77.465
[2025-04-03 02:33:41 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 77.5%
[2025-04-03 02:33:41 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:33:41 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [3.572059496799993e-07, 3.572059496799993e-07, 4.1823351034997525e-07, 4.1823351034997525e-07, 5.121220652268613e-07, 5.121220652268613e-07, 6.565659958066862e-07, 6.565659958066862e-07, 8.78787427467955e-07, 8.78787427467955e-07, 1.2206665531006764e-06, 1.2206665531006764e-06, 1.7466344386894782e-06, 1.7466344386894782e-06, 2.5558158011337886e-06, 2.5558158011337886e-06, 3.8007102048942665e-06, 3.8007102048942665e-06, 5.715932364525771e-06, 5.715932364525771e-06, 8.662427994728084e-06, 8.662427994728084e-06, 1.3195498195039337e-05, 1.3195498195039337e-05, 2.016945234936434e-05, 2.016945234936434e-05, 3.089861258678742e-05, 3.089861258678742e-05]
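# Editor's note — the ladder of per-group learning rates printed above comes from
# layer-wise LR decay (TRAIN.LAYER_DECAY: 0.65 in the header config): lower
# transformer blocks get geometrically smaller learning rates than the head.
# A minimal illustrative sketch of how such scales are typically computed; the
# function name and indexing convention are assumptions, not this repository's code.

```python
def layer_lr_scales(num_layers: int, decay: float) -> list[float]:
    """Per-layer LR scale factors, embedding/lowest block first, head last.

    Layer i is scaled by decay**(num_layers - i), so the head (i == num_layers)
    trains at the full base LR and each earlier layer is damped by `decay`.
    """
    return [decay ** (num_layers - i) for i in range(num_layers + 1)]

# ViT-Base as configured above: 12 blocks, decay 0.65.
scales = layer_lr_scales(num_layers=12, decay=0.65)
```

# Each per-group LR in the log is then (roughly) base_lr_at_this_step * scale,
# which is why consecutive groups differ by a near-constant factor.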
[2025-04-03 02:33:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][0/311]	eta 0:15:21 lr 0.000031	time 2.9639 (2.9639)	loss 0.6102 (0.6102)	grad_norm 2.5874 (2.5874)	mem 20675MB
[2025-04-03 02:33:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][2/311]	eta 0:08:05 lr 0.000031	time 0.8763 (1.5727)	loss 0.5968 (0.5498)	grad_norm 2.9411 (3.1400)	mem 20675MB
[2025-04-03 02:33:47 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][4/311]	eta 0:06:37 lr 0.000031	time 0.8761 (1.2947)	loss 0.4314 (0.5174)	grad_norm 3.1449 (2.9640)	mem 20675MB
[2025-04-03 02:33:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][6/311]	eta 0:05:58 lr 0.000030	time 0.8756 (1.1753)	loss 0.6014 (0.5460)	grad_norm 2.5848 (3.0492)	mem 20675MB
[2025-04-03 02:33:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][8/311]	eta 0:05:36 lr 0.000030	time 0.8758 (1.1089)	loss 0.5098 (0.5555)	grad_norm 2.5318 (2.9144)	mem 20675MB
[2025-04-03 02:33:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][10/311]	eta 0:05:21 lr 0.000030	time 0.8756 (1.0667)	loss 0.6222 (0.5599)	grad_norm 2.2596 (2.9766)	mem 20675MB
[2025-04-03 02:33:54 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][12/311]	eta 0:05:10 lr 0.000030	time 0.8756 (1.0374)	loss 0.6095 (0.5667)	grad_norm 2.6587 (2.9496)	mem 20675MB
[2025-04-03 02:33:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][14/311]	eta 0:05:01 lr 0.000030	time 0.8755 (1.0160)	loss 0.4523 (0.5570)	grad_norm 3.2939 (2.9833)	mem 20675MB
[2025-04-03 02:33:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][16/311]	eta 0:04:54 lr 0.000030	time 0.8755 (0.9996)	loss 0.3730 (0.5419)	grad_norm 3.4774 (3.0927)	mem 20675MB
[2025-04-03 02:34:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][18/311]	eta 0:04:49 lr 0.000030	time 0.8762 (0.9867)	loss 0.4118 (0.5329)	grad_norm 3.1380 (3.0953)	mem 20675MB
[2025-04-03 02:34:01 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][20/311]	eta 0:04:44 lr 0.000030	time 0.8758 (0.9762)	loss 0.4916 (0.5324)	grad_norm 2.9254 (3.1191)	mem 20675MB
[2025-04-03 02:34:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][22/311]	eta 0:04:39 lr 0.000029	time 0.8754 (0.9675)	loss 0.4348 (0.5317)	grad_norm 7.4378 (3.2852)	mem 20675MB
[2025-04-03 02:34:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][24/311]	eta 0:04:35 lr 0.000029	time 0.8759 (0.9603)	loss 0.5954 (0.5366)	grad_norm 2.4086 (3.2339)	mem 20675MB
[2025-04-03 02:34:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][26/311]	eta 0:04:31 lr 0.000029	time 0.8755 (0.9541)	loss 0.5938 (0.5412)	grad_norm 1.7530 (3.2153)	mem 20675MB
[2025-04-03 02:34:08 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][28/311]	eta 0:04:28 lr 0.000029	time 0.8760 (0.9487)	loss 0.4310 (0.5363)	grad_norm 5.2185 (3.2664)	mem 20675MB
[2025-04-03 02:34:10 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][30/311]	eta 0:04:25 lr 0.000029	time 0.8758 (0.9441)	loss 0.4442 (0.5352)	grad_norm 3.3059 (3.2468)	mem 20675MB
[2025-04-03 02:34:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][32/311]	eta 0:04:22 lr 0.000029	time 0.8755 (0.9400)	loss 0.6358 (0.5408)	grad_norm 2.2188 (3.2141)	mem 20675MB
[2025-04-03 02:34:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][34/311]	eta 0:04:19 lr 0.000029	time 0.8756 (0.9364)	loss 0.5246 (0.5400)	grad_norm 2.3221 (3.1595)	mem 20675MB
[2025-04-03 02:34:15 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][36/311]	eta 0:04:16 lr 0.000029	time 0.8756 (0.9331)	loss 0.4399 (0.5372)	grad_norm 3.7713 (3.1731)	mem 20675MB
[2025-04-03 02:34:17 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][38/311]	eta 0:04:13 lr 0.000028	time 0.8756 (0.9302)	loss 0.6102 (0.5372)	grad_norm 2.1043 (3.1497)	mem 20675MB
[2025-04-03 02:34:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][40/311]	eta 0:04:11 lr 0.000028	time 0.8758 (0.9276)	loss 0.5456 (0.5401)	grad_norm 3.6738 (3.1561)	mem 20675MB
[2025-04-03 02:34:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][42/311]	eta 0:04:08 lr 0.000028	time 0.8757 (0.9253)	loss 0.5890 (0.5402)	grad_norm 1.9764 (3.1434)	mem 20675MB
[2025-04-03 02:34:22 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][44/311]	eta 0:04:06 lr 0.000028	time 0.8759 (0.9231)	loss 0.5980 (0.5429)	grad_norm 2.0192 (3.0963)	mem 20675MB
[2025-04-03 02:34:24 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][46/311]	eta 0:04:04 lr 0.000028	time 0.8765 (0.9212)	loss 0.5908 (0.5437)	grad_norm 2.3605 (3.0643)	mem 20675MB
[2025-04-03 02:34:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][48/311]	eta 0:04:01 lr 0.000028	time 0.8757 (0.9194)	loss 0.6425 (0.5461)	grad_norm 2.4288 (3.0392)	mem 20675MB
[2025-04-03 02:34:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][50/311]	eta 0:03:59 lr 0.000028	time 0.8758 (0.9177)	loss 0.5321 (0.5446)	grad_norm 3.4637 (3.0481)	mem 20675MB
[2025-04-03 02:34:29 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][52/311]	eta 0:03:57 lr 0.000028	time 0.8756 (0.9162)	loss 0.6270 (0.5471)	grad_norm 2.3968 (3.0294)	mem 20675MB
[2025-04-03 02:34:31 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][54/311]	eta 0:03:55 lr 0.000027	time 0.8758 (0.9147)	loss 0.5760 (0.5489)	grad_norm 1.8035 (3.0112)	mem 20675MB
[2025-04-03 02:34:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][56/311]	eta 0:03:52 lr 0.000027	time 0.8756 (0.9134)	loss 0.5601 (0.5488)	grad_norm 3.4866 (3.0180)	mem 20675MB
[2025-04-03 02:34:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][58/311]	eta 0:03:50 lr 0.000027	time 0.8757 (0.9121)	loss 0.6132 (0.5489)	grad_norm 2.0226 (2.9966)	mem 20675MB
[2025-04-03 02:34:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][60/311]	eta 0:03:48 lr 0.000027	time 0.8757 (0.9110)	loss 0.5174 (0.5489)	grad_norm 2.8962 (2.9889)	mem 20675MB
[2025-04-03 02:34:38 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][62/311]	eta 0:03:46 lr 0.000027	time 0.8755 (0.9099)	loss 0.5446 (0.5490)	grad_norm 2.9690 (2.9822)	mem 20675MB
[2025-04-03 02:34:40 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][64/311]	eta 0:03:44 lr 0.000027	time 0.8756 (0.9088)	loss 0.5988 (0.5503)	grad_norm 2.8808 (2.9875)	mem 20675MB
[2025-04-03 02:34:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][66/311]	eta 0:03:42 lr 0.000027	time 0.8762 (0.9079)	loss 0.5809 (0.5488)	grad_norm 2.4184 (2.9993)	mem 20675MB
[2025-04-03 02:34:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][68/311]	eta 0:03:40 lr 0.000027	time 0.8755 (0.9070)	loss 0.5511 (0.5473)	grad_norm 1.5934 (2.9809)	mem 20675MB
[2025-04-03 02:34:45 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][70/311]	eta 0:03:38 lr 0.000026	time 0.8754 (0.9061)	loss 0.4161 (0.5455)	grad_norm 4.0991 (2.9978)	mem 20675MB
[2025-04-03 02:34:47 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][72/311]	eta 0:03:36 lr 0.000026	time 0.8758 (0.9053)	loss 0.4864 (0.5458)	grad_norm 3.7171 (3.0024)	mem 20675MB
[2025-04-03 02:34:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][74/311]	eta 0:03:34 lr 0.000026	time 0.8757 (0.9045)	loss 0.4752 (0.5436)	grad_norm 4.8602 (3.0322)	mem 20675MB
[2025-04-03 02:34:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][76/311]	eta 0:03:32 lr 0.000026	time 0.8754 (0.9038)	loss 0.4025 (0.5425)	grad_norm 5.4060 (3.0580)	mem 20675MB
[2025-04-03 02:34:52 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][78/311]	eta 0:03:30 lr 0.000026	time 0.8756 (0.9032)	loss 0.6140 (0.5444)	grad_norm 3.4815 (3.0697)	mem 20675MB
[2025-04-03 02:34:54 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][80/311]	eta 0:03:28 lr 0.000026	time 0.8758 (0.9025)	loss 0.5134 (0.5432)	grad_norm 3.9476 (3.0909)	mem 20675MB
[2025-04-03 02:34:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][82/311]	eta 0:03:26 lr 0.000026	time 0.8755 (0.9019)	loss 0.5002 (0.5428)	grad_norm 2.8104 (3.0910)	mem 20675MB
[2025-04-03 02:34:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][84/311]	eta 0:03:24 lr 0.000026	time 0.8757 (0.9013)	loss 0.5106 (0.5423)	grad_norm 4.9574 (3.1006)	mem 20675MB
[2025-04-03 02:34:59 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][86/311]	eta 0:03:22 lr 0.000025	time 0.8759 (0.9007)	loss 0.5283 (0.5414)	grad_norm 3.4100 (3.1062)	mem 20675MB
[2025-04-03 02:35:01 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][88/311]	eta 0:03:20 lr 0.000025	time 0.8755 (0.9002)	loss 0.4505 (0.5401)	grad_norm 4.4991 (3.1376)	mem 20675MB
[2025-04-03 02:35:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][90/311]	eta 0:03:18 lr 0.000025	time 0.8756 (0.8996)	loss 0.6282 (0.5418)	grad_norm 3.0272 (3.1276)	mem 20675MB
[2025-04-03 02:35:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][92/311]	eta 0:03:16 lr 0.000025	time 0.8756 (0.8991)	loss 0.5782 (0.5430)	grad_norm 2.7798 (3.1201)	mem 20675MB
[2025-04-03 02:35:06 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][94/311]	eta 0:03:15 lr 0.000025	time 0.8758 (0.8987)	loss 0.6084 (0.5430)	grad_norm 2.8966 (3.1209)	mem 20675MB
[2025-04-03 02:35:08 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][96/311]	eta 0:03:13 lr 0.000025	time 0.8756 (0.8982)	loss 0.5146 (0.5433)	grad_norm 2.4005 (3.1067)	mem 20675MB
[2025-04-03 02:35:10 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][98/311]	eta 0:03:11 lr 0.000025	time 0.8754 (0.8978)	loss 0.5694 (0.5442)	grad_norm 2.5856 (3.0934)	mem 20675MB
[2025-04-03 02:35:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][100/311]	eta 0:03:09 lr 0.000025	time 0.8759 (0.8973)	loss 0.5007 (0.5444)	grad_norm 3.4952 (3.0951)	mem 20675MB
[2025-04-03 02:35:13 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][102/311]	eta 0:03:07 lr 0.000025	time 0.8758 (0.8970)	loss 0.6140 (0.5436)	grad_norm 2.3732 (3.0914)	mem 20675MB
[2025-04-03 02:35:15 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][104/311]	eta 0:03:05 lr 0.000024	time 0.8755 (0.8966)	loss 0.5518 (0.5443)	grad_norm 2.6107 (3.0794)	mem 20675MB
[2025-04-03 02:35:17 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][106/311]	eta 0:03:03 lr 0.000024	time 0.8758 (0.8962)	loss 0.5556 (0.5445)	grad_norm 2.6410 (3.0752)	mem 20675MB
[2025-04-03 02:35:19 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][108/311]	eta 0:03:01 lr 0.000024	time 0.8761 (0.8958)	loss 0.5210 (0.5446)	grad_norm 2.4579 (3.0591)	mem 20675MB
[2025-04-03 02:35:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][110/311]	eta 0:02:59 lr 0.000024	time 0.8755 (0.8955)	loss 0.5607 (0.5455)	grad_norm 2.7809 (3.0557)	mem 20675MB
[2025-04-03 02:35:22 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][112/311]	eta 0:02:58 lr 0.000024	time 0.8756 (0.8952)	loss 0.5574 (0.5469)	grad_norm 2.1694 (3.0477)	mem 20675MB
[2025-04-03 02:35:24 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][114/311]	eta 0:02:56 lr 0.000024	time 0.8758 (0.8948)	loss 0.5622 (0.5469)	grad_norm 3.1614 (3.0439)	mem 20675MB
[2025-04-03 02:35:26 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][116/311]	eta 0:02:54 lr 0.000024	time 0.8763 (0.8945)	loss 0.6053 (0.5468)	grad_norm 2.5970 (3.0493)	mem 20675MB
[2025-04-03 02:35:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][118/311]	eta 0:02:52 lr 0.000024	time 0.8758 (0.8942)	loss 0.5780 (0.5463)	grad_norm 3.3082 (3.0475)	mem 20675MB
[2025-04-03 02:35:29 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][120/311]	eta 0:02:50 lr 0.000024	time 0.8756 (0.8939)	loss 0.3550 (0.5435)	grad_norm 3.2239 (3.0613)	mem 20675MB
[2025-04-03 02:35:31 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][122/311]	eta 0:02:48 lr 0.000023	time 0.8756 (0.8936)	loss 0.5535 (0.5441)	grad_norm 3.9273 (3.0658)	mem 20675MB
[2025-04-03 02:35:33 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][124/311]	eta 0:02:47 lr 0.000023	time 0.8755 (0.8934)	loss 0.5663 (0.5444)	grad_norm 3.1267 (3.0693)	mem 20675MB
[2025-04-03 02:35:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][126/311]	eta 0:02:45 lr 0.000023	time 0.8758 (0.8931)	loss 0.6134 (0.5449)	grad_norm 3.1774 (3.0662)	mem 20675MB
[2025-04-03 02:35:36 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][128/311]	eta 0:02:43 lr 0.000023	time 0.8756 (0.8929)	loss 0.5783 (0.5455)	grad_norm 2.6073 (3.0543)	mem 20675MB
[2025-04-03 02:35:38 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][130/311]	eta 0:02:41 lr 0.000023	time 0.8754 (0.8926)	loss 0.5919 (0.5451)	grad_norm 2.7097 (3.0583)	mem 20675MB
[2025-04-03 02:35:40 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][132/311]	eta 0:02:39 lr 0.000023	time 0.8758 (0.8924)	loss 0.6255 (0.5461)	grad_norm 2.9371 (3.0561)	mem 20675MB
[2025-04-03 02:35:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][134/311]	eta 0:02:37 lr 0.000023	time 0.8761 (0.8921)	loss 0.5872 (0.5468)	grad_norm 2.5507 (3.0534)	mem 20675MB
[2025-04-03 02:35:43 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][136/311]	eta 0:02:36 lr 0.000023	time 0.8759 (0.8919)	loss 0.5832 (0.5473)	grad_norm 3.3391 (3.0556)	mem 20675MB
[2025-04-03 02:35:45 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][138/311]	eta 0:02:34 lr 0.000023	time 0.8754 (0.8917)	loss 0.6050 (0.5474)	grad_norm 2.2705 (3.0508)	mem 20675MB
[2025-04-03 02:35:47 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][140/311]	eta 0:02:32 lr 0.000022	time 0.8756 (0.8915)	loss 0.5888 (0.5475)	grad_norm 2.2400 (3.0422)	mem 20675MB
[2025-04-03 02:35:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][142/311]	eta 0:02:30 lr 0.000022	time 0.8759 (0.8913)	loss 0.6752 (0.5488)	grad_norm 2.4608 (3.0345)	mem 20675MB
[2025-04-03 02:35:50 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][144/311]	eta 0:02:28 lr 0.000022	time 0.8758 (0.8911)	loss 0.6456 (0.5494)	grad_norm 2.3375 (3.0277)	mem 20675MB
[2025-04-03 02:35:52 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][146/311]	eta 0:02:26 lr 0.000022	time 0.8758 (0.8909)	loss 0.5708 (0.5491)	grad_norm 2.7654 (3.0291)	mem 20675MB
[2025-04-03 02:35:54 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][148/311]	eta 0:02:25 lr 0.000022	time 0.8758 (0.8907)	loss 0.5947 (0.5489)	grad_norm 4.4582 (3.0410)	mem 20675MB
[2025-04-03 02:35:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][150/311]	eta 0:02:23 lr 0.000022	time 0.8756 (0.8905)	loss 0.5465 (0.5484)	grad_norm 2.9113 (3.0454)	mem 20675MB
[2025-04-03 02:35:57 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][152/311]	eta 0:02:21 lr 0.000022	time 0.8755 (0.8903)	loss 0.5438 (0.5490)	grad_norm 3.0535 (3.0402)	mem 20675MB
[2025-04-03 02:35:59 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][154/311]	eta 0:02:19 lr 0.000022	time 0.8761 (0.8901)	loss 0.5936 (0.5495)	grad_norm 2.3246 (3.0319)	mem 20675MB
[2025-04-03 02:36:01 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][156/311]	eta 0:02:17 lr 0.000022	time 0.8757 (0.8900)	loss 0.5793 (0.5498)	grad_norm 2.0014 (3.0251)	mem 20675MB
[2025-04-03 02:36:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][158/311]	eta 0:02:16 lr 0.000021	time 0.8757 (0.8898)	loss 0.5572 (0.5493)	grad_norm 2.5716 (3.0242)	mem 20675MB
[2025-04-03 02:36:04 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][160/311]	eta 0:02:14 lr 0.000021	time 0.8754 (0.8896)	loss 0.5501 (0.5495)	grad_norm 2.8266 (3.0237)	mem 20675MB
[2025-04-03 02:36:06 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][162/311]	eta 0:02:12 lr 0.000021	time 0.8759 (0.8895)	loss 0.5594 (0.5487)	grad_norm 2.8986 (3.0304)	mem 20675MB
[2025-04-03 02:36:08 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][164/311]	eta 0:02:10 lr 0.000021	time 0.8757 (0.8893)	loss 0.4356 (0.5483)	grad_norm 2.6678 (3.0273)	mem 20675MB
[2025-04-03 02:36:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][166/311]	eta 0:02:08 lr 0.000021	time 0.8763 (0.8892)	loss 0.6245 (0.5491)	grad_norm 2.6583 (3.0191)	mem 20675MB
[2025-04-03 02:36:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][168/311]	eta 0:02:07 lr 0.000021	time 0.8756 (0.8890)	loss 0.5539 (0.5495)	grad_norm 2.8766 (3.0157)	mem 20675MB
[2025-04-03 02:36:13 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][170/311]	eta 0:02:05 lr 0.000021	time 0.8761 (0.8889)	loss 0.6020 (0.5500)	grad_norm 2.4488 (3.0105)	mem 20675MB
[2025-04-03 02:36:15 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][172/311]	eta 0:02:03 lr 0.000021	time 0.8759 (0.8887)	loss 0.4239 (0.5486)	grad_norm 3.0795 (3.0145)	mem 20675MB
[2025-04-03 02:36:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][174/311]	eta 0:02:01 lr 0.000021	time 0.8758 (0.8886)	loss 0.5761 (0.5490)	grad_norm 2.8641 (3.0075)	mem 20675MB
[2025-04-03 02:36:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][176/311]	eta 0:01:59 lr 0.000020	time 0.8756 (0.8885)	loss 0.5700 (0.5494)	grad_norm 3.7219 (3.0072)	mem 20675MB
[2025-04-03 02:36:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][178/311]	eta 0:01:58 lr 0.000020	time 0.8762 (0.8883)	loss 0.4750 (0.5495)	grad_norm 4.2211 (3.0138)	mem 20675MB
[2025-04-03 02:36:22 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][180/311]	eta 0:01:56 lr 0.000020	time 0.8756 (0.8882)	loss 0.4730 (0.5491)	grad_norm 3.3058 (3.0170)	mem 20675MB
[2025-04-03 02:36:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][182/311]	eta 0:01:54 lr 0.000020	time 0.8766 (0.8881)	loss 0.5942 (0.5494)	grad_norm 2.1351 (3.0141)	mem 20675MB
[2025-04-03 02:36:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][184/311]	eta 0:01:52 lr 0.000020	time 0.8760 (0.8880)	loss 0.6017 (0.5498)	grad_norm 2.3630 (3.0072)	mem 20675MB
[2025-04-03 02:36:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][186/311]	eta 0:01:50 lr 0.000020	time 0.8755 (0.8878)	loss 0.5712 (0.5495)	grad_norm 3.1448 (3.0198)	mem 20675MB
[2025-04-03 02:36:29 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][188/311]	eta 0:01:49 lr 0.000020	time 0.8756 (0.8877)	loss 0.5014 (0.5483)	grad_norm 3.1634 (3.0243)	mem 20675MB
[2025-04-03 02:36:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][190/311]	eta 0:01:47 lr 0.000020	time 0.8759 (0.8876)	loss 0.5960 (0.5489)	grad_norm 2.3835 (3.0190)	mem 20675MB
[2025-04-03 02:36:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][192/311]	eta 0:01:45 lr 0.000020	time 0.8756 (0.8875)	loss 0.4686 (0.5490)	grad_norm 3.8866 (3.0227)	mem 20675MB
[2025-04-03 02:36:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][194/311]	eta 0:01:43 lr 0.000019	time 0.8758 (0.8874)	loss 0.5887 (0.5498)	grad_norm 2.5226 (3.0190)	mem 20675MB
[2025-04-03 02:36:36 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][196/311]	eta 0:01:42 lr 0.000019	time 0.8756 (0.8873)	loss 0.5277 (0.5500)	grad_norm 2.7822 (3.0156)	mem 20675MB
[2025-04-03 02:36:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][198/311]	eta 0:01:40 lr 0.000019	time 0.8761 (0.8872)	loss 0.6221 (0.5504)	grad_norm 2.6142 (3.0113)	mem 20675MB
[2025-04-03 02:36:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][200/311]	eta 0:01:38 lr 0.000019	time 0.8754 (0.8871)	loss 0.4993 (0.5498)	grad_norm 3.3433 (3.0119)	mem 20675MB
[2025-04-03 02:36:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][202/311]	eta 0:01:36 lr 0.000019	time 0.8780 (0.8870)	loss 0.4943 (0.5499)	grad_norm 4.1569 (3.0157)	mem 20675MB
[2025-04-03 02:36:43 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][204/311]	eta 0:01:34 lr 0.000019	time 0.8758 (0.8869)	loss 0.5800 (0.5503)	grad_norm 3.1342 (3.0151)	mem 20675MB
[2025-04-03 02:36:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][206/311]	eta 0:01:33 lr 0.000019	time 0.8757 (0.8868)	loss 0.5866 (0.5505)	grad_norm 2.7561 (3.0116)	mem 20675MB
[2025-04-03 02:36:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][208/311]	eta 0:01:31 lr 0.000019	time 0.8757 (0.8867)	loss 0.5037 (0.5502)	grad_norm 3.6118 (3.0110)	mem 20675MB
[2025-04-03 02:36:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][210/311]	eta 0:01:29 lr 0.000019	time 0.8758 (0.8866)	loss 0.6257 (0.5507)	grad_norm 2.5659 (3.0077)	mem 20675MB
[2025-04-03 02:36:50 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][212/311]	eta 0:01:27 lr 0.000019	time 0.8761 (0.8865)	loss 0.6111 (0.5506)	grad_norm 2.3504 (3.0098)	mem 20675MB
[2025-04-03 02:36:52 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][214/311]	eta 0:01:25 lr 0.000018	time 0.8755 (0.8864)	loss 0.5402 (0.5506)	grad_norm 2.2417 (3.0066)	mem 20675MB
[2025-04-03 02:36:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][216/311]	eta 0:01:24 lr 0.000018	time 0.8755 (0.8863)	loss 0.5043 (0.5499)	grad_norm 3.6605 (3.0100)	mem 20675MB
[2025-04-03 02:36:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][218/311]	eta 0:01:22 lr 0.000018	time 0.8754 (0.8862)	loss 0.5539 (0.5499)	grad_norm 3.0687 (3.0121)	mem 20675MB
[2025-04-03 02:36:57 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][220/311]	eta 0:01:20 lr 0.000018	time 0.8757 (0.8861)	loss 0.6326 (0.5503)	grad_norm 2.3582 (3.0057)	mem 20675MB
[2025-04-03 02:36:59 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][222/311]	eta 0:01:18 lr 0.000018	time 0.8758 (0.8860)	loss 0.5668 (0.5505)	grad_norm 3.1869 (3.0050)	mem 20675MB
[2025-04-03 02:37:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][224/311]	eta 0:01:17 lr 0.000018	time 0.8757 (0.8860)	loss 0.5804 (0.5510)	grad_norm 2.7845 (3.0013)	mem 20675MB
[2025-04-03 02:37:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][226/311]	eta 0:01:15 lr 0.000018	time 0.8759 (0.8859)	loss 0.4682 (0.5507)	grad_norm 2.6112 (2.9954)	mem 20675MB
[2025-04-03 02:37:04 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][228/311]	eta 0:01:13 lr 0.000018	time 0.8756 (0.8858)	loss 0.5804 (0.5505)	grad_norm 2.3638 (2.9963)	mem 20675MB
[2025-04-03 02:37:06 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][230/311]	eta 0:01:11 lr 0.000018	time 0.8758 (0.8857)	loss 0.5782 (0.5504)	grad_norm 2.1799 (3.0019)	mem 20675MB
[2025-04-03 02:37:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][232/311]	eta 0:01:09 lr 0.000018	time 0.8758 (0.8856)	loss 0.5104 (0.5505)	grad_norm 2.8769 (2.9988)	mem 20675MB
[2025-04-03 02:37:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][234/311]	eta 0:01:08 lr 0.000017	time 0.8756 (0.8856)	loss 0.6029 (0.5508)	grad_norm 2.6192 (2.9935)	mem 20675MB
[2025-04-03 02:37:11 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][236/311]	eta 0:01:06 lr 0.000017	time 0.8759 (0.8855)	loss 0.5348 (0.5502)	grad_norm 2.4497 (2.9961)	mem 20675MB
[2025-04-03 02:37:13 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][238/311]	eta 0:01:04 lr 0.000017	time 0.8758 (0.8854)	loss 0.5335 (0.5496)	grad_norm 3.0840 (2.9990)	mem 20675MB
[2025-04-03 02:37:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][240/311]	eta 0:01:02 lr 0.000017	time 0.8757 (0.8853)	loss 0.6480 (0.5496)	grad_norm 3.0603 (3.0013)	mem 20675MB
[2025-04-03 02:37:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][242/311]	eta 0:01:01 lr 0.000017	time 0.8759 (0.8853)	loss 0.5916 (0.5497)	grad_norm 2.8601 (3.0012)	mem 20675MB
[2025-04-03 02:37:18 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][244/311]	eta 0:00:59 lr 0.000017	time 0.8757 (0.8852)	loss 0.5905 (0.5502)	grad_norm 2.5527 (2.9967)	mem 20675MB
[2025-04-03 02:37:20 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][246/311]	eta 0:00:57 lr 0.000017	time 0.8754 (0.8851)	loss 0.5497 (0.5503)	grad_norm 2.0402 (2.9900)	mem 20675MB
[2025-04-03 02:37:21 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][248/311]	eta 0:00:55 lr 0.000017	time 0.8757 (0.8851)	loss 0.5708 (0.5504)	grad_norm 2.5703 (2.9843)	mem 20675MB
[2025-04-03 02:37:23 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][250/311]	eta 0:00:53 lr 0.000017	time 0.8755 (0.8850)	loss 0.5258 (0.5497)	grad_norm 2.8292 (2.9883)	mem 20675MB
[2025-04-03 02:37:25 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][252/311]	eta 0:00:52 lr 0.000017	time 0.8756 (0.8849)	loss 0.5022 (0.5490)	grad_norm 3.6650 (3.0023)	mem 20675MB
[2025-04-03 02:37:27 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][254/311]	eta 0:00:50 lr 0.000017	time 0.8757 (0.8849)	loss 0.4853 (0.5491)	grad_norm 4.6291 (3.0072)	mem 20675MB
[2025-04-03 02:37:28 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][256/311]	eta 0:00:48 lr 0.000016	time 0.8757 (0.8848)	loss 0.6128 (0.5495)	grad_norm 3.1971 (3.0052)	mem 20675MB
[2025-04-03 02:37:30 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][258/311]	eta 0:00:46 lr 0.000016	time 0.8754 (0.8847)	loss 0.4125 (0.5491)	grad_norm 4.2252 (3.0095)	mem 20675MB
[2025-04-03 02:37:32 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][260/311]	eta 0:00:45 lr 0.000016	time 0.8759 (0.8847)	loss 0.4461 (0.5486)	grad_norm 3.2591 (3.0125)	mem 20675MB
[2025-04-03 02:37:34 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][262/311]	eta 0:00:43 lr 0.000016	time 0.8757 (0.8846)	loss 0.5544 (0.5483)	grad_norm 2.5998 (3.0128)	mem 20675MB
[2025-04-03 02:37:35 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][264/311]	eta 0:00:41 lr 0.000016	time 0.8761 (0.8845)	loss 0.5984 (0.5486)	grad_norm 2.2850 (3.0082)	mem 20675MB
[2025-04-03 02:37:37 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][266/311]	eta 0:00:39 lr 0.000016	time 0.8754 (0.8845)	loss 0.5404 (0.5487)	grad_norm 2.5139 (3.0069)	mem 20675MB
[2025-04-03 02:37:39 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][268/311]	eta 0:00:38 lr 0.000016	time 0.8758 (0.8844)	loss 0.6511 (0.5487)	grad_norm 2.4020 (3.0072)	mem 20675MB
[2025-04-03 02:37:41 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][270/311]	eta 0:00:36 lr 0.000016	time 0.8755 (0.8844)	loss 0.4381 (0.5487)	grad_norm 3.6767 (3.0088)	mem 20675MB
[2025-04-03 02:37:42 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][272/311]	eta 0:00:34 lr 0.000016	time 0.8754 (0.8843)	loss 0.4654 (0.5485)	grad_norm 5.3391 (3.0165)	mem 20675MB
[2025-04-03 02:37:44 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][274/311]	eta 0:00:32 lr 0.000016	time 0.8754 (0.8842)	loss 0.6299 (0.5488)	grad_norm 3.8564 (3.0228)	mem 20675MB
[2025-04-03 02:37:46 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][276/311]	eta 0:00:30 lr 0.000015	time 0.8755 (0.8842)	loss 0.5646 (0.5488)	grad_norm 2.2385 (3.0177)	mem 20675MB
[2025-04-03 02:37:48 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][278/311]	eta 0:00:29 lr 0.000015	time 0.8755 (0.8841)	loss 0.6217 (0.5493)	grad_norm 2.1683 (3.0137)	mem 20675MB
[2025-04-03 02:37:49 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][280/311]	eta 0:00:27 lr 0.000015	time 0.8756 (0.8841)	loss 0.3905 (0.5484)	grad_norm 3.9175 (3.0263)	mem 20675MB
[2025-04-03 02:37:51 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][282/311]	eta 0:00:25 lr 0.000015	time 0.8756 (0.8840)	loss 0.3541 (0.5479)	grad_norm 3.5790 (3.0254)	mem 20675MB
[2025-04-03 02:37:53 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][284/311]	eta 0:00:23 lr 0.000015	time 0.8760 (0.8840)	loss 0.5655 (0.5483)	grad_norm 3.5377 (3.0264)	mem 20675MB
[2025-04-03 02:37:55 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][286/311]	eta 0:00:22 lr 0.000015	time 0.8773 (0.8839)	loss 0.4504 (0.5481)	grad_norm 5.2721 (3.0338)	mem 20675MB
[2025-04-03 02:37:56 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][288/311]	eta 0:00:20 lr 0.000015	time 0.8757 (0.8839)	loss 0.6203 (0.5481)	grad_norm 3.2153 (3.0359)	mem 20675MB
[2025-04-03 02:37:58 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][290/311]	eta 0:00:18 lr 0.000015	time 0.8797 (0.8839)	loss 0.4354 (0.5479)	grad_norm 3.2228 (3.0362)	mem 20675MB
[2025-04-03 02:38:00 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][292/311]	eta 0:00:16 lr 0.000015	time 0.8761 (0.8838)	loss 0.5968 (0.5481)	grad_norm 2.5387 (3.0357)	mem 20675MB
[2025-04-03 02:38:02 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][294/311]	eta 0:00:15 lr 0.000015	time 0.8754 (0.8838)	loss 0.3936 (0.5474)	grad_norm 3.5519 (3.0366)	mem 20675MB
[2025-04-03 02:38:03 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][296/311]	eta 0:00:13 lr 0.000015	time 0.8754 (0.8837)	loss 0.6536 (0.5478)	grad_norm 4.4268 (3.0415)	mem 20675MB
[2025-04-03 02:38:05 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][298/311]	eta 0:00:11 lr 0.000014	time 0.8753 (0.8837)	loss 0.5667 (0.5475)	grad_norm 2.7054 (3.0415)	mem 20675MB
[2025-04-03 02:38:07 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][300/311]	eta 0:00:09 lr 0.000014	time 0.8757 (0.8836)	loss 0.5718 (0.5475)	grad_norm 2.0875 (3.0441)	mem 20675MB
[2025-04-03 02:38:09 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][302/311]	eta 0:00:07 lr 0.000014	time 0.8751 (0.8836)	loss 0.5566 (0.5473)	grad_norm 2.0270 (3.0419)	mem 20675MB
[2025-04-03 02:38:10 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][304/311]	eta 0:00:06 lr 0.000014	time 0.8753 (0.8835)	loss 0.5482 (0.5478)	grad_norm 2.3826 (3.0399)	mem 20675MB
[2025-04-03 02:38:12 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][306/311]	eta 0:00:04 lr 0.000014	time 0.8760 (0.8835)	loss 0.6071 (0.5481)	grad_norm 2.3887 (3.0357)	mem 20675MB
[2025-04-03 02:38:14 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][308/311]	eta 0:00:02 lr 0.000014	time 0.8752 (0.8834)	loss 0.4853 (0.5480)	grad_norm 4.0668 (3.0365)	mem 20675MB
[2025-04-03 02:38:16 simmim_finetune] (main_finetune.py 252): INFO Train: [27/30][310/311]	eta 0:00:00 lr 0.000014	time 0.8752 (0.8834)	loss 0.4473 (0.5475)	grad_norm 2.8499 (3.0352)	mem 20675MB
[2025-04-03 02:38:16 simmim_finetune] (main_finetune.py 260): INFO EPOCH 27 training takes 0:04:34
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.392 (1.392)	Loss 0.5599 (0.5599)	Acc@1 75.000 (75.000)	Mem 20675MB
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.056
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.1%
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:38:17 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.979171034103424e-07, 2.979171034103424e-07, 3.2519417100064746e-07, 3.2519417100064746e-07, 3.671588903703475e-07, 3.671588903703475e-07, 4.3171999709296297e-07, 4.3171999709296297e-07, 5.310447766662175e-07, 5.310447766662175e-07, 6.8385212985584e-07, 6.8385212985584e-07, 9.189403655321821e-07, 9.189403655321821e-07, 1.280614574265016e-06, 1.280614574265016e-06, 1.8370364338539916e-06, 1.8370364338539916e-06, 2.6930700639908774e-06, 2.6930700639908774e-06, 4.010044879586085e-06, 4.010044879586085e-06, 6.036159980501791e-06, 6.036159980501791e-06, 9.15326013575672e-06, 9.15326013575672e-06, 1.3948798836148922e-05, 1.3948798836148922e-05]
[2025-04-03 02:38:20 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][0/311]	eta 0:11:34 lr 0.000014	time 2.2333 (2.2333)	loss 0.5736 (0.5736)	grad_norm 2.6219 (2.6219)	mem 20675MB
[2025-04-03 02:38:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][2/311]	eta 0:06:50 lr 0.000014	time 0.8762 (1.3292)	loss 0.5806 (0.5877)	grad_norm 2.2722 (2.3467)	mem 20675MB
[2025-04-03 02:38:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][4/311]	eta 0:05:52 lr 0.000014	time 0.8761 (1.1484)	loss 0.4793 (0.5556)	grad_norm 2.5887 (2.7448)	mem 20675MB
[2025-04-03 02:38:25 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][6/311]	eta 0:05:26 lr 0.000014	time 0.8760 (1.0708)	loss 0.5513 (0.5583)	grad_norm 2.7817 (2.7771)	mem 20675MB
[2025-04-03 02:38:27 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][8/311]	eta 0:05:11 lr 0.000014	time 0.8757 (1.0276)	loss 0.5274 (0.5575)	grad_norm 2.4085 (2.7396)	mem 20675MB
[2025-04-03 02:38:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][10/311]	eta 0:05:01 lr 0.000013	time 0.8758 (1.0002)	loss 0.6055 (0.5643)	grad_norm 2.7163 (2.6982)	mem 20675MB
[2025-04-03 02:38:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][12/311]	eta 0:04:53 lr 0.000013	time 0.8758 (0.9812)	loss 0.5669 (0.5588)	grad_norm 2.2324 (2.6392)	mem 20675MB
[2025-04-03 02:38:32 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][14/311]	eta 0:04:47 lr 0.000013	time 0.8762 (0.9673)	loss 0.4740 (0.5551)	grad_norm 3.0233 (2.6683)	mem 20675MB
[2025-04-03 02:38:34 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][16/311]	eta 0:04:42 lr 0.000013	time 0.8759 (0.9566)	loss 0.5815 (0.5545)	grad_norm 3.3011 (2.7514)	mem 20675MB
[2025-04-03 02:38:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][18/311]	eta 0:04:37 lr 0.000013	time 0.8758 (0.9482)	loss 0.4576 (0.5494)	grad_norm 4.9401 (2.8810)	mem 20675MB
[2025-04-03 02:38:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][20/311]	eta 0:04:33 lr 0.000013	time 0.8760 (0.9414)	loss 0.3856 (0.5454)	grad_norm 4.5823 (2.9677)	mem 20675MB
[2025-04-03 02:38:39 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][22/311]	eta 0:04:30 lr 0.000013	time 0.8761 (0.9358)	loss 0.5284 (0.5479)	grad_norm 2.1424 (2.9439)	mem 20675MB
[2025-04-03 02:38:41 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][24/311]	eta 0:04:27 lr 0.000013	time 0.8760 (0.9311)	loss 0.5494 (0.5496)	grad_norm 3.5842 (2.9442)	mem 20675MB
[2025-04-03 02:38:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][26/311]	eta 0:04:24 lr 0.000013	time 0.8757 (0.9271)	loss 0.5036 (0.5529)	grad_norm 3.2353 (2.9636)	mem 20675MB
[2025-04-03 02:38:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][28/311]	eta 0:04:21 lr 0.000013	time 0.8760 (0.9236)	loss 0.6065 (0.5545)	grad_norm 3.5517 (2.9687)	mem 20675MB
[2025-04-03 02:38:46 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][30/311]	eta 0:04:18 lr 0.000013	time 0.8758 (0.9206)	loss 0.5645 (0.5554)	grad_norm 2.5583 (2.9648)	mem 20675MB
[2025-04-03 02:38:48 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][32/311]	eta 0:04:16 lr 0.000013	time 0.8759 (0.9179)	loss 0.5013 (0.5555)	grad_norm 2.2501 (2.9150)	mem 20675MB
[2025-04-03 02:38:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][34/311]	eta 0:04:13 lr 0.000012	time 0.8759 (0.9156)	loss 0.5316 (0.5549)	grad_norm 2.9262 (2.8881)	mem 20675MB
[2025-04-03 02:38:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][36/311]	eta 0:04:11 lr 0.000012	time 0.8764 (0.9135)	loss 0.5587 (0.5528)	grad_norm 2.7471 (2.8898)	mem 20675MB
[2025-04-03 02:38:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][38/311]	eta 0:04:08 lr 0.000012	time 0.8757 (0.9116)	loss 0.4942 (0.5504)	grad_norm 2.1323 (2.8847)	mem 20675MB
[2025-04-03 02:38:55 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][40/311]	eta 0:04:06 lr 0.000012	time 0.8760 (0.9099)	loss 0.5311 (0.5507)	grad_norm 3.6344 (2.8965)	mem 20675MB
[2025-04-03 02:38:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][42/311]	eta 0:04:04 lr 0.000012	time 0.8756 (0.9084)	loss 0.5150 (0.5500)	grad_norm 2.7685 (2.8768)	mem 20675MB
[2025-04-03 02:38:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][44/311]	eta 0:04:02 lr 0.000012	time 0.8762 (0.9070)	loss 0.5246 (0.5509)	grad_norm 2.5580 (2.8486)	mem 20675MB
[2025-04-03 02:39:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][46/311]	eta 0:04:00 lr 0.000012	time 0.8758 (0.9057)	loss 0.5762 (0.5515)	grad_norm 3.3001 (2.8628)	mem 20675MB
[2025-04-03 02:39:02 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][48/311]	eta 0:03:57 lr 0.000012	time 0.8764 (0.9045)	loss 0.6150 (0.5540)	grad_norm 3.2183 (2.8659)	mem 20675MB
[2025-04-03 02:39:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][50/311]	eta 0:03:55 lr 0.000012	time 0.8760 (0.9034)	loss 0.6192 (0.5560)	grad_norm 2.3983 (2.8400)	mem 20675MB
[2025-04-03 02:39:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][52/311]	eta 0:03:53 lr 0.000012	time 0.8758 (0.9024)	loss 0.5459 (0.5554)	grad_norm 2.8699 (2.8385)	mem 20675MB
[2025-04-03 02:39:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][54/311]	eta 0:03:51 lr 0.000012	time 0.8759 (0.9015)	loss 0.5802 (0.5562)	grad_norm 3.4786 (2.8400)	mem 20675MB
[2025-04-03 02:39:09 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][56/311]	eta 0:03:49 lr 0.000012	time 0.8760 (0.9006)	loss 0.4041 (0.5534)	grad_norm 3.8315 (2.8579)	mem 20675MB
[2025-04-03 02:39:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][58/311]	eta 0:03:47 lr 0.000011	time 0.8762 (0.8998)	loss 0.6457 (0.5546)	grad_norm 3.0257 (2.8552)	mem 20675MB
[2025-04-03 02:39:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][60/311]	eta 0:03:45 lr 0.000011	time 0.8758 (0.8991)	loss 0.4681 (0.5539)	grad_norm 4.6805 (2.8758)	mem 20675MB
[2025-04-03 02:39:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][62/311]	eta 0:03:43 lr 0.000011	time 0.8760 (0.8984)	loss 0.5189 (0.5522)	grad_norm 2.3265 (2.8809)	mem 20675MB
[2025-04-03 02:39:16 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][64/311]	eta 0:03:41 lr 0.000011	time 0.8760 (0.8977)	loss 0.6193 (0.5543)	grad_norm 3.5297 (2.8871)	mem 20675MB
[2025-04-03 02:39:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][66/311]	eta 0:03:39 lr 0.000011	time 0.8759 (0.8971)	loss 0.6123 (0.5556)	grad_norm 2.3817 (2.8783)	mem 20675MB
[2025-04-03 02:39:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][68/311]	eta 0:03:37 lr 0.000011	time 0.8760 (0.8965)	loss 0.4887 (0.5539)	grad_norm 4.0925 (2.8977)	mem 20675MB
[2025-04-03 02:39:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][70/311]	eta 0:03:35 lr 0.000011	time 0.8760 (0.8959)	loss 0.5925 (0.5548)	grad_norm 2.5568 (2.8852)	mem 20675MB
[2025-04-03 02:39:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][72/311]	eta 0:03:33 lr 0.000011	time 0.8758 (0.8954)	loss 0.5680 (0.5549)	grad_norm 2.9101 (2.9019)	mem 20675MB
[2025-04-03 02:39:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][74/311]	eta 0:03:32 lr 0.000011	time 0.8759 (0.8949)	loss 0.6433 (0.5569)	grad_norm 3.2634 (2.9011)	mem 20675MB
[2025-04-03 02:39:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][76/311]	eta 0:03:30 lr 0.000011	time 0.8757 (0.8945)	loss 0.4775 (0.5553)	grad_norm 4.9263 (2.9503)	mem 20675MB
[2025-04-03 02:39:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][78/311]	eta 0:03:28 lr 0.000011	time 0.8765 (0.8940)	loss 0.5897 (0.5561)	grad_norm 2.4209 (2.9421)	mem 20675MB
[2025-04-03 02:39:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][80/311]	eta 0:03:26 lr 0.000011	time 0.8770 (0.8936)	loss 0.6436 (0.5576)	grad_norm 2.4135 (2.9333)	mem 20675MB
[2025-04-03 02:39:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][82/311]	eta 0:03:24 lr 0.000011	time 0.8759 (0.8932)	loss 0.5850 (0.5578)	grad_norm 2.1781 (2.9191)	mem 20675MB
[2025-04-03 02:39:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][84/311]	eta 0:03:22 lr 0.000010	time 0.8761 (0.8928)	loss 0.6479 (0.5586)	grad_norm 2.9785 (2.9167)	mem 20675MB
[2025-04-03 02:39:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][86/311]	eta 0:03:20 lr 0.000010	time 0.8760 (0.8925)	loss 0.5727 (0.5575)	grad_norm 2.0522 (2.9194)	mem 20675MB
[2025-04-03 02:39:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][88/311]	eta 0:03:18 lr 0.000010	time 0.8760 (0.8921)	loss 0.4619 (0.5573)	grad_norm 3.3736 (2.9194)	mem 20675MB
[2025-04-03 02:39:39 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][90/311]	eta 0:03:17 lr 0.000010	time 0.8761 (0.8918)	loss 0.4313 (0.5555)	grad_norm 4.6518 (2.9424)	mem 20675MB
[2025-04-03 02:39:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][92/311]	eta 0:03:15 lr 0.000010	time 0.8761 (0.8915)	loss 0.5281 (0.5555)	grad_norm 2.6125 (2.9385)	mem 20675MB
[2025-04-03 02:39:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][94/311]	eta 0:03:13 lr 0.000010	time 0.8761 (0.8912)	loss 0.5192 (0.5540)	grad_norm 3.3441 (2.9672)	mem 20675MB
[2025-04-03 02:39:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][96/311]	eta 0:03:11 lr 0.000010	time 0.8758 (0.8909)	loss 0.5397 (0.5541)	grad_norm 2.5275 (2.9597)	mem 20675MB
[2025-04-03 02:39:46 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][98/311]	eta 0:03:09 lr 0.000010	time 0.8758 (0.8906)	loss 0.5031 (0.5530)	grad_norm 2.8022 (2.9590)	mem 20675MB
[2025-04-03 02:39:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][100/311]	eta 0:03:07 lr 0.000010	time 0.8761 (0.8903)	loss 0.6198 (0.5533)	grad_norm 2.4107 (2.9516)	mem 20675MB
[2025-04-03 02:39:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][102/311]	eta 0:03:06 lr 0.000010	time 0.8761 (0.8901)	loss 0.5374 (0.5523)	grad_norm 3.9877 (2.9709)	mem 20675MB
[2025-04-03 02:39:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][104/311]	eta 0:03:04 lr 0.000010	time 0.8756 (0.8898)	loss 0.5949 (0.5513)	grad_norm 2.3936 (2.9776)	mem 20675MB
[2025-04-03 02:39:53 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][106/311]	eta 0:03:02 lr 0.000010	time 0.8759 (0.8896)	loss 0.5571 (0.5510)	grad_norm 2.3095 (2.9722)	mem 20675MB
[2025-04-03 02:39:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][108/311]	eta 0:03:00 lr 0.000010	time 0.8757 (0.8893)	loss 0.5637 (0.5515)	grad_norm 2.6398 (2.9691)	mem 20675MB
[2025-04-03 02:39:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][110/311]	eta 0:02:58 lr 0.000010	time 0.8759 (0.8891)	loss 0.5680 (0.5515)	grad_norm 2.8040 (2.9675)	mem 20675MB
[2025-04-03 02:39:58 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][112/311]	eta 0:02:56 lr 0.000009	time 0.8758 (0.8889)	loss 0.4728 (0.5501)	grad_norm 2.6758 (2.9664)	mem 20675MB
[2025-04-03 02:40:00 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][114/311]	eta 0:02:55 lr 0.000009	time 0.8758 (0.8887)	loss 0.5157 (0.5500)	grad_norm 3.7451 (2.9708)	mem 20675MB
[2025-04-03 02:40:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][116/311]	eta 0:02:53 lr 0.000009	time 0.8781 (0.8885)	loss 0.6219 (0.5513)	grad_norm 2.6205 (2.9643)	mem 20675MB
[2025-04-03 02:40:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][118/311]	eta 0:02:51 lr 0.000009	time 0.8761 (0.8883)	loss 0.5634 (0.5513)	grad_norm 2.9011 (2.9592)	mem 20675MB
[2025-04-03 02:40:05 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][120/311]	eta 0:02:49 lr 0.000009	time 0.8759 (0.8881)	loss 0.4873 (0.5511)	grad_norm 2.8281 (2.9514)	mem 20675MB
[2025-04-03 02:40:07 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][122/311]	eta 0:02:47 lr 0.000009	time 0.8760 (0.8879)	loss 0.5260 (0.5507)	grad_norm 2.6615 (2.9471)	mem 20675MB
[2025-04-03 02:40:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][124/311]	eta 0:02:46 lr 0.000009	time 0.8759 (0.8878)	loss 0.5374 (0.5508)	grad_norm 2.6843 (2.9437)	mem 20675MB
[2025-04-03 02:40:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][126/311]	eta 0:02:44 lr 0.000009	time 0.8759 (0.8876)	loss 0.5645 (0.5513)	grad_norm 2.6349 (2.9384)	mem 20675MB
[2025-04-03 02:40:12 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][128/311]	eta 0:02:42 lr 0.000009	time 0.8762 (0.8874)	loss 0.5928 (0.5510)	grad_norm 2.3465 (2.9326)	mem 20675MB
[2025-04-03 02:40:14 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][130/311]	eta 0:02:40 lr 0.000009	time 0.8758 (0.8873)	loss 0.5343 (0.5508)	grad_norm 2.7810 (2.9292)	mem 20675MB
[2025-04-03 02:40:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][132/311]	eta 0:02:38 lr 0.000009	time 0.8757 (0.8871)	loss 0.5549 (0.5511)	grad_norm 1.7803 (2.9195)	mem 20675MB
[2025-04-03 02:40:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][134/311]	eta 0:02:36 lr 0.000009	time 0.8762 (0.8870)	loss 0.5383 (0.5507)	grad_norm 3.2149 (2.9195)	mem 20675MB
[2025-04-03 02:40:19 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][136/311]	eta 0:02:35 lr 0.000009	time 0.8757 (0.8868)	loss 0.6021 (0.5510)	grad_norm 2.5892 (2.9158)	mem 20675MB
[2025-04-03 02:40:21 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][138/311]	eta 0:02:33 lr 0.000009	time 0.8757 (0.8867)	loss 0.6218 (0.5520)	grad_norm 3.0768 (2.9155)	mem 20675MB
[2025-04-03 02:40:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][140/311]	eta 0:02:31 lr 0.000008	time 0.8760 (0.8865)	loss 0.5392 (0.5514)	grad_norm 3.7652 (2.9350)	mem 20675MB
[2025-04-03 02:40:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][142/311]	eta 0:02:29 lr 0.000008	time 0.8759 (0.8864)	loss 0.5605 (0.5515)	grad_norm 2.4851 (2.9348)	mem 20675MB
[2025-04-03 02:40:26 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][144/311]	eta 0:02:28 lr 0.000008	time 0.8761 (0.8863)	loss 0.5336 (0.5515)	grad_norm 3.2127 (2.9336)	mem 20675MB
[2025-04-03 02:40:28 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][146/311]	eta 0:02:26 lr 0.000008	time 0.8774 (0.8861)	loss 0.5799 (0.5519)	grad_norm 2.8080 (2.9308)	mem 20675MB
[2025-04-03 02:40:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][148/311]	eta 0:02:24 lr 0.000008	time 0.8758 (0.8860)	loss 0.5250 (0.5514)	grad_norm 2.5750 (2.9317)	mem 20675MB
[2025-04-03 02:40:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][150/311]	eta 0:02:22 lr 0.000008	time 0.8759 (0.8859)	loss 0.6509 (0.5513)	grad_norm 2.5866 (2.9260)	mem 20675MB
[2025-04-03 02:40:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][152/311]	eta 0:02:20 lr 0.000008	time 0.8758 (0.8858)	loss 0.5869 (0.5522)	grad_norm 2.3014 (2.9188)	mem 20675MB
[2025-04-03 02:40:35 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][154/311]	eta 0:02:19 lr 0.000008	time 0.8760 (0.8857)	loss 0.6048 (0.5520)	grad_norm 2.9582 (2.9191)	mem 20675MB
[2025-04-03 02:40:36 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][156/311]	eta 0:02:17 lr 0.000008	time 0.8761 (0.8856)	loss 0.4348 (0.5508)	grad_norm 3.6395 (2.9222)	mem 20675MB
[2025-04-03 02:40:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][158/311]	eta 0:02:15 lr 0.000008	time 0.8761 (0.8854)	loss 0.3769 (0.5498)	grad_norm 3.4196 (2.9226)	mem 20675MB
[2025-04-03 02:40:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][160/311]	eta 0:02:13 lr 0.000008	time 0.8761 (0.8853)	loss 0.5908 (0.5500)	grad_norm 2.9926 (2.9231)	mem 20675MB
[2025-04-03 02:40:42 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][162/311]	eta 0:02:11 lr 0.000008	time 0.8758 (0.8852)	loss 0.5698 (0.5503)	grad_norm 2.2026 (2.9165)	mem 20675MB
[2025-04-03 02:40:43 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][164/311]	eta 0:02:10 lr 0.000008	time 0.8760 (0.8851)	loss 0.4727 (0.5487)	grad_norm 3.5622 (2.9270)	mem 20675MB
[2025-04-03 02:40:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][166/311]	eta 0:02:08 lr 0.000008	time 0.8761 (0.8850)	loss 0.6132 (0.5492)	grad_norm 3.2375 (2.9274)	mem 20675MB
[2025-04-03 02:40:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][168/311]	eta 0:02:06 lr 0.000008	time 0.8760 (0.8849)	loss 0.3873 (0.5486)	grad_norm 4.4560 (2.9347)	mem 20675MB
[2025-04-03 02:40:49 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][170/311]	eta 0:02:04 lr 0.000007	time 0.8757 (0.8848)	loss 0.5426 (0.5488)	grad_norm 2.3633 (2.9398)	mem 20675MB
[2025-04-03 02:40:50 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][172/311]	eta 0:02:02 lr 0.000007	time 0.8760 (0.8847)	loss 0.4232 (0.5483)	grad_norm 3.4552 (2.9397)	mem 20675MB
[2025-04-03 02:40:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][174/311]	eta 0:02:01 lr 0.000007	time 0.8760 (0.8847)	loss 0.5213 (0.5486)	grad_norm 3.1108 (2.9408)	mem 20675MB
[2025-04-03 02:40:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][176/311]	eta 0:01:59 lr 0.000007	time 0.8758 (0.8846)	loss 0.6201 (0.5497)	grad_norm 2.3907 (2.9404)	mem 20675MB
[2025-04-03 02:40:56 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][178/311]	eta 0:01:57 lr 0.000007	time 0.8757 (0.8845)	loss 0.4753 (0.5488)	grad_norm 3.6364 (2.9478)	mem 20675MB
[2025-04-03 02:40:57 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][180/311]	eta 0:01:55 lr 0.000007	time 0.8760 (0.8844)	loss 0.4043 (0.5481)	grad_norm 3.8705 (2.9500)	mem 20675MB
[2025-04-03 02:40:59 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][182/311]	eta 0:01:54 lr 0.000007	time 0.8763 (0.8843)	loss 0.4835 (0.5477)	grad_norm 4.8927 (2.9609)	mem 20675MB
[2025-04-03 02:41:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][184/311]	eta 0:01:52 lr 0.000007	time 0.8761 (0.8842)	loss 0.4500 (0.5477)	grad_norm 4.0436 (2.9652)	mem 20675MB
[2025-04-03 02:41:03 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][186/311]	eta 0:01:50 lr 0.000007	time 0.8757 (0.8841)	loss 0.5046 (0.5477)	grad_norm 3.3202 (2.9628)	mem 20675MB
[2025-04-03 02:41:04 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][188/311]	eta 0:01:48 lr 0.000007	time 0.8764 (0.8841)	loss 0.5212 (0.5477)	grad_norm 2.5525 (2.9584)	mem 20675MB
[2025-04-03 02:41:06 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][190/311]	eta 0:01:46 lr 0.000007	time 0.8762 (0.8840)	loss 0.3994 (0.5473)	grad_norm 5.4201 (2.9681)	mem 20675MB
[2025-04-03 02:41:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][192/311]	eta 0:01:45 lr 0.000007	time 0.8758 (0.8839)	loss 0.5951 (0.5470)	grad_norm 3.0180 (2.9684)	mem 20675MB
[2025-04-03 02:41:10 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][194/311]	eta 0:01:43 lr 0.000007	time 0.8758 (0.8839)	loss 0.3887 (0.5462)	grad_norm 4.3694 (2.9741)	mem 20675MB
[2025-04-03 02:41:11 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][196/311]	eta 0:01:41 lr 0.000007	time 0.8761 (0.8838)	loss 0.5467 (0.5459)	grad_norm 2.6850 (2.9804)	mem 20675MB
[2025-04-03 02:41:13 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][198/311]	eta 0:01:39 lr 0.000007	time 0.8761 (0.8837)	loss 0.5088 (0.5462)	grad_norm 3.7453 (2.9857)	mem 20675MB
[2025-04-03 02:41:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][200/311]	eta 0:01:38 lr 0.000007	time 0.8761 (0.8836)	loss 0.6098 (0.5468)	grad_norm 2.4970 (2.9819)	mem 20675MB
[2025-04-03 02:41:17 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][202/311]	eta 0:01:36 lr 0.000006	time 0.8760 (0.8836)	loss 0.5579 (0.5471)	grad_norm 2.8810 (2.9797)	mem 20675MB
[2025-04-03 02:41:18 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][204/311]	eta 0:01:34 lr 0.000006	time 0.8762 (0.8835)	loss 0.5532 (0.5470)	grad_norm 2.1050 (2.9837)	mem 20675MB
[2025-04-03 02:41:20 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][206/311]	eta 0:01:32 lr 0.000006	time 0.8766 (0.8835)	loss 0.5494 (0.5475)	grad_norm 2.9052 (2.9824)	mem 20675MB
[2025-04-03 02:41:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][208/311]	eta 0:01:30 lr 0.000006	time 0.8759 (0.8834)	loss 0.4353 (0.5469)	grad_norm 4.4984 (2.9874)	mem 20675MB
[2025-04-03 02:41:24 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][210/311]	eta 0:01:29 lr 0.000006	time 0.8761 (0.8833)	loss 0.5034 (0.5469)	grad_norm 3.4546 (2.9889)	mem 20675MB
[2025-04-03 02:41:25 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][212/311]	eta 0:01:27 lr 0.000006	time 0.8759 (0.8833)	loss 0.5625 (0.5470)	grad_norm 2.6718 (2.9896)	mem 20675MB
[2025-04-03 02:41:27 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][214/311]	eta 0:01:25 lr 0.000006	time 0.8760 (0.8832)	loss 0.5152 (0.5471)	grad_norm 2.9225 (2.9872)	mem 20675MB
[2025-04-03 02:41:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][216/311]	eta 0:01:23 lr 0.000006	time 0.8757 (0.8832)	loss 0.5080 (0.5466)	grad_norm 3.0155 (2.9879)	mem 20675MB
[2025-04-03 02:41:31 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][218/311]	eta 0:01:22 lr 0.000006	time 0.8760 (0.8831)	loss 0.5691 (0.5470)	grad_norm 3.0848 (2.9879)	mem 20675MB
[2025-04-03 02:41:33 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][220/311]	eta 0:01:20 lr 0.000006	time 0.8759 (0.8830)	loss 0.6396 (0.5474)	grad_norm 2.0894 (2.9847)	mem 20675MB
[2025-04-03 02:41:34 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][222/311]	eta 0:01:18 lr 0.000006	time 0.8765 (0.8830)	loss 0.6249 (0.5480)	grad_norm 2.5249 (2.9824)	mem 20675MB
[2025-04-03 02:41:36 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][224/311]	eta 0:01:16 lr 0.000006	time 0.8760 (0.8829)	loss 0.5255 (0.5477)	grad_norm 2.6236 (2.9819)	mem 20675MB
[2025-04-03 02:41:38 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][226/311]	eta 0:01:15 lr 0.000006	time 0.8759 (0.8829)	loss 0.4916 (0.5479)	grad_norm 3.4106 (2.9831)	mem 20675MB
[2025-04-03 02:41:40 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][228/311]	eta 0:01:13 lr 0.000006	time 0.8760 (0.8828)	loss 0.5773 (0.5482)	grad_norm 2.5293 (2.9776)	mem 20675MB
[2025-04-03 02:41:41 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][230/311]	eta 0:01:11 lr 0.000006	time 0.8762 (0.8828)	loss 0.5750 (0.5483)	grad_norm 2.5547 (2.9717)	mem 20675MB
[2025-04-03 02:41:43 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][232/311]	eta 0:01:09 lr 0.000006	time 0.8759 (0.8827)	loss 0.5071 (0.5476)	grad_norm 4.5509 (2.9788)	mem 20675MB
[2025-04-03 02:41:45 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][234/311]	eta 0:01:07 lr 0.000006	time 0.8760 (0.8827)	loss 0.4945 (0.5472)	grad_norm 3.3964 (2.9820)	mem 20675MB
[2025-04-03 02:41:47 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][236/311]	eta 0:01:06 lr 0.000006	time 0.8760 (0.8826)	loss 0.4556 (0.5468)	grad_norm 4.2607 (2.9866)	mem 20675MB
[2025-04-03 02:41:48 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][238/311]	eta 0:01:04 lr 0.000005	time 0.8758 (0.8826)	loss 0.4942 (0.5468)	grad_norm 2.6424 (2.9831)	mem 20675MB
[2025-04-03 02:41:50 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][240/311]	eta 0:01:02 lr 0.000005	time 0.8756 (0.8825)	loss 0.5689 (0.5470)	grad_norm 4.0405 (2.9857)	mem 20675MB
[2025-04-03 02:41:52 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][242/311]	eta 0:01:00 lr 0.000005	time 0.8761 (0.8825)	loss 0.6457 (0.5473)	grad_norm 2.6872 (2.9877)	mem 20675MB
[2025-04-03 02:41:54 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][244/311]	eta 0:00:59 lr 0.000005	time 0.8762 (0.8824)	loss 0.4374 (0.5466)	grad_norm 3.4132 (2.9976)	mem 20675MB
[2025-04-03 02:41:55 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][246/311]	eta 0:00:57 lr 0.000005	time 0.8762 (0.8824)	loss 0.6030 (0.5466)	grad_norm 2.2231 (2.9945)	mem 20675MB
[2025-04-03 02:41:57 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][248/311]	eta 0:00:55 lr 0.000005	time 0.8757 (0.8824)	loss 0.5744 (0.5464)	grad_norm 3.4737 (2.9981)	mem 20675MB
[2025-04-03 02:41:59 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][250/311]	eta 0:00:53 lr 0.000005	time 0.8760 (0.8823)	loss 0.6561 (0.5469)	grad_norm 3.2169 (2.9970)	mem 20675MB
[2025-04-03 02:42:01 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][252/311]	eta 0:00:52 lr 0.000005	time 0.8761 (0.8823)	loss 0.5709 (0.5470)	grad_norm 2.0952 (2.9943)	mem 20675MB
[2025-04-03 02:42:02 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][254/311]	eta 0:00:50 lr 0.000005	time 0.8759 (0.8822)	loss 0.6070 (0.5476)	grad_norm 2.5802 (2.9897)	mem 20675MB
[2025-04-03 02:42:04 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][256/311]	eta 0:00:48 lr 0.000005	time 0.8757 (0.8822)	loss 0.6540 (0.5483)	grad_norm 3.4949 (2.9880)	mem 20675MB
[2025-04-03 02:42:06 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][258/311]	eta 0:00:46 lr 0.000005	time 0.8761 (0.8821)	loss 0.5738 (0.5487)	grad_norm 1.6434 (2.9832)	mem 20675MB
[2025-04-03 02:42:08 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][260/311]	eta 0:00:44 lr 0.000005	time 0.8758 (0.8821)	loss 0.6137 (0.5491)	grad_norm 1.9530 (2.9783)	mem 20675MB
[2025-04-03 02:42:09 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][262/311]	eta 0:00:43 lr 0.000005	time 0.8758 (0.8821)	loss 0.6433 (0.5494)	grad_norm 2.0359 (2.9742)	mem 20675MB
[2025-04-03 02:42:11 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][264/311]	eta 0:00:41 lr 0.000005	time 0.8758 (0.8820)	loss 0.6153 (0.5496)	grad_norm 2.4230 (2.9701)	mem 20675MB
[2025-04-03 02:42:13 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][266/311]	eta 0:00:39 lr 0.000005	time 0.8757 (0.8820)	loss 0.4537 (0.5490)	grad_norm 3.3971 (2.9779)	mem 20675MB
[2025-04-03 02:42:15 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][268/311]	eta 0:00:37 lr 0.000005	time 0.8758 (0.8819)	loss 0.4087 (0.5483)	grad_norm 3.3890 (2.9846)	mem 20675MB
[2025-04-03 02:42:16 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][270/311]	eta 0:00:36 lr 0.000005	time 0.8756 (0.8819)	loss 0.4060 (0.5479)	grad_norm 4.2967 (2.9872)	mem 20675MB
[2025-04-03 02:42:18 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][272/311]	eta 0:00:34 lr 0.000005	time 0.8759 (0.8819)	loss 0.6025 (0.5478)	grad_norm 2.2823 (2.9888)	mem 20675MB
[2025-04-03 02:42:20 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][274/311]	eta 0:00:32 lr 0.000005	time 0.8758 (0.8818)	loss 0.6054 (0.5482)	grad_norm 3.6359 (2.9895)	mem 20675MB
[2025-04-03 02:42:22 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][276/311]	eta 0:00:30 lr 0.000004	time 0.8760 (0.8818)	loss 0.4785 (0.5479)	grad_norm 3.6985 (2.9910)	mem 20675MB
[2025-04-03 02:42:23 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][278/311]	eta 0:00:29 lr 0.000004	time 0.8760 (0.8817)	loss 0.6962 (0.5483)	grad_norm 3.9734 (2.9910)	mem 20675MB
[2025-04-03 02:42:25 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][280/311]	eta 0:00:27 lr 0.000004	time 0.8759 (0.8817)	loss 0.6055 (0.5487)	grad_norm 2.3761 (2.9880)	mem 20675MB
[2025-04-03 02:42:27 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][282/311]	eta 0:00:25 lr 0.000004	time 0.8758 (0.8817)	loss 0.5781 (0.5490)	grad_norm 2.3189 (2.9843)	mem 20675MB
[2025-04-03 02:42:29 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][284/311]	eta 0:00:23 lr 0.000004	time 0.8758 (0.8816)	loss 0.4685 (0.5485)	grad_norm 4.6991 (2.9964)	mem 20675MB
[2025-04-03 02:42:30 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][286/311]	eta 0:00:22 lr 0.000004	time 0.8761 (0.8816)	loss 0.3924 (0.5477)	grad_norm 3.9016 (3.0012)	mem 20675MB
[2025-04-03 02:42:32 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][288/311]	eta 0:00:20 lr 0.000004	time 0.8760 (0.8816)	loss 0.5223 (0.5474)	grad_norm 3.4619 (3.0020)	mem 20675MB
[2025-04-03 02:42:34 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][290/311]	eta 0:00:18 lr 0.000004	time 0.8760 (0.8815)	loss 0.4995 (0.5468)	grad_norm 3.1482 (3.0074)	mem 20675MB
[2025-04-03 02:42:36 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][292/311]	eta 0:00:16 lr 0.000004	time 0.8760 (0.8815)	loss 0.3834 (0.5462)	grad_norm 5.2820 (3.0144)	mem 20675MB
[2025-04-03 02:42:37 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][294/311]	eta 0:00:14 lr 0.000004	time 0.8760 (0.8815)	loss 0.3996 (0.5458)	grad_norm 5.4776 (3.0235)	mem 20675MB
[2025-04-03 02:42:39 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][296/311]	eta 0:00:13 lr 0.000004	time 0.8756 (0.8814)	loss 0.5669 (0.5455)	grad_norm 3.4018 (3.0267)	mem 20675MB
[2025-04-03 02:42:41 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][298/311]	eta 0:00:11 lr 0.000004	time 0.8758 (0.8814)	loss 0.5443 (0.5454)	grad_norm 1.9974 (3.0256)	mem 20675MB
[2025-04-03 02:42:43 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][300/311]	eta 0:00:09 lr 0.000004	time 0.8755 (0.8814)	loss 0.4846 (0.5449)	grad_norm 4.3348 (3.0329)	mem 20675MB
[2025-04-03 02:42:44 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][302/311]	eta 0:00:07 lr 0.000004	time 0.8757 (0.8814)	loss 0.4983 (0.5448)	grad_norm 3.5248 (3.0322)	mem 20675MB
[2025-04-03 02:42:46 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][304/311]	eta 0:00:06 lr 0.000004	time 0.8762 (0.8813)	loss 0.4355 (0.5444)	grad_norm 4.7561 (3.0351)	mem 20675MB
[2025-04-03 02:42:48 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][306/311]	eta 0:00:04 lr 0.000004	time 0.8756 (0.8813)	loss 0.6437 (0.5449)	grad_norm 2.1322 (3.0316)	mem 20675MB
[2025-04-03 02:42:50 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][308/311]	eta 0:00:02 lr 0.000004	time 0.8757 (0.8813)	loss 0.6207 (0.5452)	grad_norm 3.0719 (3.0295)	mem 20675MB
[2025-04-03 02:42:51 simmim_finetune] (main_finetune.py 252): INFO Train: [28/30][310/311]	eta 0:00:00 lr 0.000004	time 0.8757 (0.8812)	loss 0.5321 (0.5453)	grad_norm 3.1772 (3.0303)	mem 20675MB
[2025-04-03 02:42:52 simmim_finetune] (main_finetune.py 260): INFO EPOCH 28 training takes 0:04:34
[2025-04-03 02:42:53 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.377 (1.377)	Loss 0.5724 (0.5724)	Acc@1 75.000 (75.000)	Mem 20675MB
[2025-04-03 02:42:53 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.056
[2025-04-03 02:42:53 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.1%
[2025-04-03 02:42:53 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:42:53 simmim_finetune] (main_finetune.py 184): INFO Current learning rate for different parameter groups: [2.6205084206553783e-07, 2.6205084206553783e-07, 2.689108484128918e-07, 2.689108484128918e-07, 2.7946470433189796e-07, 2.7946470433189796e-07, 2.957014057457536e-07, 2.957014057457536e-07, 3.2068094638245457e-07, 3.2068094638245457e-07, 3.59111008900456e-07, 3.59111008900456e-07, 4.182341820050736e-07, 4.182341820050736e-07, 5.091929098583315e-07, 5.091929098583315e-07, 6.49129414247959e-07, 6.49129414247959e-07, 8.644163440781552e-07, 8.644163440781552e-07, 1.19562700535538e-06, 1.19562700535538e-06, 1.7051818688588028e-06, 1.7051818688588028e-06, 2.4891124280948383e-06, 2.4891124280948383e-06, 3.6951594423041228e-06, 3.6951594423041228e-06]
[2025-04-03 02:42:55 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][0/311]	eta 0:10:52 lr 0.000004	time 2.0976 (2.0976)	loss 0.5727 (0.5727)	grad_norm 2.1630 (2.1630)	mem 20675MB
[2025-04-03 02:42:57 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][2/311]	eta 0:06:36 lr 0.000004	time 0.8759 (1.2841)	loss 0.4794 (0.5499)	grad_norm 3.2230 (2.7609)	mem 20675MB
[2025-04-03 02:42:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][4/311]	eta 0:05:44 lr 0.000004	time 0.8755 (1.1211)	loss 0.4223 (0.5127)	grad_norm 6.0031 (3.4592)	mem 20675MB
[2025-04-03 02:43:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][6/311]	eta 0:05:20 lr 0.000004	time 0.8755 (1.0512)	loss 0.5556 (0.5306)	grad_norm 2.4163 (3.2310)	mem 20675MB
[2025-04-03 02:43:02 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][8/311]	eta 0:05:06 lr 0.000003	time 0.8757 (1.0123)	loss 0.6186 (0.5504)	grad_norm 2.7080 (3.0446)	mem 20675MB
[2025-04-03 02:43:04 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][10/311]	eta 0:04:57 lr 0.000003	time 0.8757 (0.9876)	loss 0.5133 (0.5411)	grad_norm 3.1930 (3.0726)	mem 20675MB
[2025-04-03 02:43:06 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][12/311]	eta 0:04:50 lr 0.000003	time 0.8758 (0.9705)	loss 0.6028 (0.5478)	grad_norm 3.0152 (3.0195)	mem 20675MB
[2025-04-03 02:43:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][14/311]	eta 0:04:44 lr 0.000003	time 0.8757 (0.9580)	loss 0.5242 (0.5399)	grad_norm 3.5282 (3.1516)	mem 20675MB
[2025-04-03 02:43:09 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][16/311]	eta 0:04:39 lr 0.000003	time 0.8760 (0.9484)	loss 0.6099 (0.5475)	grad_norm 2.9620 (3.1020)	mem 20675MB
[2025-04-03 02:43:11 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][18/311]	eta 0:04:35 lr 0.000003	time 0.8758 (0.9409)	loss 0.5100 (0.5432)	grad_norm 4.3329 (3.1599)	mem 20675MB
[2025-04-03 02:43:13 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][20/311]	eta 0:04:32 lr 0.000003	time 0.8756 (0.9348)	loss 0.4590 (0.5422)	grad_norm 4.5823 (3.2397)	mem 20675MB
[2025-04-03 02:43:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][22/311]	eta 0:04:28 lr 0.000003	time 0.8769 (0.9298)	loss 0.4305 (0.5401)	grad_norm 3.8963 (3.2428)	mem 20675MB
[2025-04-03 02:43:16 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][24/311]	eta 0:04:25 lr 0.000003	time 0.8773 (0.9256)	loss 0.6623 (0.5436)	grad_norm 3.7448 (3.2778)	mem 20675MB
[2025-04-03 02:43:18 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][26/311]	eta 0:04:22 lr 0.000003	time 0.8758 (0.9220)	loss 0.4549 (0.5347)	grad_norm 4.6823 (3.3800)	mem 20675MB
[2025-04-03 02:43:20 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][28/311]	eta 0:04:20 lr 0.000003	time 0.8763 (0.9189)	loss 0.4168 (0.5332)	grad_norm 4.3190 (3.3769)	mem 20675MB
[2025-04-03 02:43:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][30/311]	eta 0:04:17 lr 0.000003	time 0.8760 (0.9161)	loss 0.5433 (0.5304)	grad_norm 3.2114 (3.3774)	mem 20675MB
[2025-04-03 02:43:23 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][32/311]	eta 0:04:14 lr 0.000003	time 0.8763 (0.9138)	loss 0.6330 (0.5287)	grad_norm 2.8203 (3.3929)	mem 20675MB
[2025-04-03 02:43:25 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][34/311]	eta 0:04:12 lr 0.000003	time 0.8768 (0.9118)	loss 0.4607 (0.5275)	grad_norm 3.5785 (3.3879)	mem 20675MB
[2025-04-03 02:43:27 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][36/311]	eta 0:04:10 lr 0.000003	time 0.8787 (0.9100)	loss 0.5659 (0.5301)	grad_norm 2.8030 (3.3294)	mem 20675MB
[2025-04-03 02:43:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][38/311]	eta 0:04:07 lr 0.000003	time 0.8778 (0.9084)	loss 0.5049 (0.5291)	grad_norm 3.0297 (3.3098)	mem 20675MB
[2025-04-03 02:43:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][40/311]	eta 0:04:05 lr 0.000003	time 0.8761 (0.9069)	loss 0.4091 (0.5244)	grad_norm 4.7049 (3.3698)	mem 20675MB
[2025-04-03 02:43:32 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][42/311]	eta 0:04:03 lr 0.000003	time 0.8757 (0.9055)	loss 0.5759 (0.5276)	grad_norm 2.4492 (3.3336)	mem 20675MB
[2025-04-03 02:43:34 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][44/311]	eta 0:04:01 lr 0.000003	time 0.8759 (0.9042)	loss 0.4392 (0.5272)	grad_norm 3.7655 (3.3287)	mem 20675MB
[2025-04-03 02:43:36 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][46/311]	eta 0:03:59 lr 0.000003	time 0.8762 (0.9030)	loss 0.5448 (0.5294)	grad_norm 3.6871 (3.3180)	mem 20675MB
[2025-04-03 02:43:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][48/311]	eta 0:03:57 lr 0.000003	time 0.8757 (0.9019)	loss 0.6108 (0.5325)	grad_norm 2.7757 (3.2956)	mem 20675MB
[2025-04-03 02:43:39 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][50/311]	eta 0:03:55 lr 0.000003	time 0.8756 (0.9009)	loss 0.5835 (0.5323)	grad_norm 2.1767 (3.2982)	mem 20675MB
[2025-04-03 02:43:41 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][52/311]	eta 0:03:53 lr 0.000003	time 0.8757 (0.9000)	loss 0.5960 (0.5326)	grad_norm 2.6967 (3.3129)	mem 20675MB
[2025-04-03 02:43:43 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][54/311]	eta 0:03:51 lr 0.000003	time 0.8758 (0.8992)	loss 0.5997 (0.5339)	grad_norm 2.9178 (3.3035)	mem 20675MB
[2025-04-03 02:43:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][56/311]	eta 0:03:49 lr 0.000003	time 0.8759 (0.8984)	loss 0.6017 (0.5352)	grad_norm 2.5630 (3.2923)	mem 20675MB
[2025-04-03 02:43:46 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][58/311]	eta 0:03:47 lr 0.000003	time 0.8760 (0.8977)	loss 0.4919 (0.5364)	grad_norm 3.5220 (3.2767)	mem 20675MB
[2025-04-03 02:43:48 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][60/311]	eta 0:03:45 lr 0.000002	time 0.8758 (0.8970)	loss 0.5849 (0.5356)	grad_norm 3.5457 (3.3012)	mem 20675MB
[2025-04-03 02:43:50 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][62/311]	eta 0:03:43 lr 0.000002	time 0.8755 (0.8963)	loss 0.5314 (0.5368)	grad_norm 2.9367 (3.2884)	mem 20675MB
[2025-04-03 02:43:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][64/311]	eta 0:03:41 lr 0.000002	time 0.8757 (0.8957)	loss 0.5988 (0.5379)	grad_norm 2.9654 (3.2800)	mem 20675MB
[2025-04-03 02:43:53 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][66/311]	eta 0:03:39 lr 0.000002	time 0.8756 (0.8952)	loss 0.5362 (0.5374)	grad_norm 2.2303 (3.2691)	mem 20675MB
[2025-04-03 02:43:55 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][68/311]	eta 0:03:37 lr 0.000002	time 0.8758 (0.8946)	loss 0.4427 (0.5370)	grad_norm 4.0464 (3.2694)	mem 20675MB
[2025-04-03 02:43:57 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][70/311]	eta 0:03:35 lr 0.000002	time 0.8758 (0.8941)	loss 0.5517 (0.5376)	grad_norm 2.3809 (3.2455)	mem 20675MB
[2025-04-03 02:43:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][72/311]	eta 0:03:33 lr 0.000002	time 0.8761 (0.8936)	loss 0.5492 (0.5388)	grad_norm 2.6706 (3.2301)	mem 20675MB
[2025-04-03 02:44:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][74/311]	eta 0:03:31 lr 0.000002	time 0.8756 (0.8932)	loss 0.5527 (0.5382)	grad_norm 2.3122 (3.2042)	mem 20675MB
[2025-04-03 02:44:02 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][76/311]	eta 0:03:29 lr 0.000002	time 0.8760 (0.8928)	loss 0.3946 (0.5359)	grad_norm 3.9280 (3.2191)	mem 20675MB
[2025-04-03 02:44:04 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][78/311]	eta 0:03:27 lr 0.000002	time 0.8755 (0.8923)	loss 0.5206 (0.5350)	grad_norm 3.1983 (3.2176)	mem 20675MB
[2025-04-03 02:44:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][80/311]	eta 0:03:26 lr 0.000002	time 0.8757 (0.8920)	loss 0.4874 (0.5351)	grad_norm 3.2886 (3.2102)	mem 20675MB
[2025-04-03 02:44:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][82/311]	eta 0:03:24 lr 0.000002	time 0.8757 (0.8916)	loss 0.5901 (0.5360)	grad_norm 3.4952 (3.2054)	mem 20675MB
[2025-04-03 02:44:09 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][84/311]	eta 0:03:22 lr 0.000002	time 0.8757 (0.8912)	loss 0.4343 (0.5347)	grad_norm 2.2251 (3.1909)	mem 20675MB
[2025-04-03 02:44:11 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][86/311]	eta 0:03:20 lr 0.000002	time 0.8758 (0.8909)	loss 0.5185 (0.5351)	grad_norm 2.4086 (3.1826)	mem 20675MB
[2025-04-03 02:44:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][88/311]	eta 0:03:18 lr 0.000002	time 0.8759 (0.8906)	loss 0.6284 (0.5375)	grad_norm 2.8169 (3.1778)	mem 20675MB
[2025-04-03 02:44:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][90/311]	eta 0:03:16 lr 0.000002	time 0.8756 (0.8903)	loss 0.6003 (0.5389)	grad_norm 3.1503 (3.1676)	mem 20675MB
[2025-04-03 02:44:16 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][92/311]	eta 0:03:14 lr 0.000002	time 0.8760 (0.8900)	loss 0.3974 (0.5377)	grad_norm 5.4905 (3.1858)	mem 20675MB
[2025-04-03 02:44:18 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][94/311]	eta 0:03:13 lr 0.000002	time 0.8756 (0.8897)	loss 0.5011 (0.5376)	grad_norm 2.1838 (3.1702)	mem 20675MB
[2025-04-03 02:44:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][96/311]	eta 0:03:11 lr 0.000002	time 0.8757 (0.8894)	loss 0.4386 (0.5352)	grad_norm 4.0810 (3.1896)	mem 20675MB
[2025-04-03 02:44:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][98/311]	eta 0:03:09 lr 0.000002	time 0.8755 (0.8892)	loss 0.4232 (0.5348)	grad_norm 4.4451 (3.1956)	mem 20675MB
[2025-04-03 02:44:23 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][100/311]	eta 0:03:07 lr 0.000002	time 0.8764 (0.8889)	loss 0.6745 (0.5359)	grad_norm 3.2934 (3.2066)	mem 20675MB
[2025-04-03 02:44:25 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][102/311]	eta 0:03:05 lr 0.000002	time 0.8755 (0.8887)	loss 0.5610 (0.5353)	grad_norm 2.7060 (3.2066)	mem 20675MB
[2025-04-03 02:44:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][104/311]	eta 0:03:03 lr 0.000002	time 0.8754 (0.8884)	loss 0.4929 (0.5347)	grad_norm 4.5120 (3.2250)	mem 20675MB
[2025-04-03 02:44:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][106/311]	eta 0:03:02 lr 0.000002	time 0.8754 (0.8882)	loss 0.5336 (0.5348)	grad_norm 3.7485 (3.2274)	mem 20675MB
[2025-04-03 02:44:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][108/311]	eta 0:03:00 lr 0.000002	time 0.8755 (0.8880)	loss 0.5851 (0.5353)	grad_norm 2.6660 (3.2113)	mem 20675MB
[2025-04-03 02:44:32 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][110/311]	eta 0:02:58 lr 0.000002	time 0.8756 (0.8878)	loss 0.6242 (0.5356)	grad_norm 2.5256 (3.2092)	mem 20675MB
[2025-04-03 02:44:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][112/311]	eta 0:02:56 lr 0.000002	time 0.8755 (0.8876)	loss 0.6906 (0.5373)	grad_norm 2.7990 (3.2053)	mem 20675MB
[2025-04-03 02:44:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][114/311]	eta 0:02:54 lr 0.000002	time 0.8755 (0.8874)	loss 0.4859 (0.5366)	grad_norm 2.8585 (3.2005)	mem 20675MB
[2025-04-03 02:44:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][116/311]	eta 0:02:53 lr 0.000002	time 0.8758 (0.8872)	loss 0.5830 (0.5371)	grad_norm 2.2219 (3.1927)	mem 20675MB
[2025-04-03 02:44:39 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][118/311]	eta 0:02:51 lr 0.000002	time 0.8756 (0.8870)	loss 0.4425 (0.5351)	grad_norm 4.7267 (3.2074)	mem 20675MB
[2025-04-03 02:44:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][120/311]	eta 0:02:49 lr 0.000002	time 0.8759 (0.8869)	loss 0.6019 (0.5360)	grad_norm 2.9044 (3.1998)	mem 20675MB
[2025-04-03 02:44:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][122/311]	eta 0:02:47 lr 0.000002	time 0.8755 (0.8867)	loss 0.5868 (0.5367)	grad_norm 3.2602 (3.1958)	mem 20675MB
[2025-04-03 02:44:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][124/311]	eta 0:02:45 lr 0.000001	time 0.8758 (0.8865)	loss 0.4591 (0.5365)	grad_norm 2.9281 (3.1903)	mem 20675MB
[2025-04-03 02:44:46 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][126/311]	eta 0:02:43 lr 0.000001	time 0.8757 (0.8864)	loss 0.5400 (0.5358)	grad_norm 2.3140 (3.1848)	mem 20675MB
[2025-04-03 02:44:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][128/311]	eta 0:02:42 lr 0.000001	time 0.8757 (0.8862)	loss 0.4901 (0.5361)	grad_norm 2.3869 (3.1747)	mem 20675MB
[2025-04-03 02:44:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][130/311]	eta 0:02:40 lr 0.000001	time 0.8756 (0.8861)	loss 0.5057 (0.5351)	grad_norm 3.0384 (3.1836)	mem 20675MB
[2025-04-03 02:44:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][132/311]	eta 0:02:38 lr 0.000001	time 0.8755 (0.8859)	loss 0.4978 (0.5348)	grad_norm 3.5349 (3.1841)	mem 20675MB
[2025-04-03 02:44:53 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][134/311]	eta 0:02:36 lr 0.000001	time 0.8756 (0.8858)	loss 0.5130 (0.5350)	grad_norm 3.8493 (3.1861)	mem 20675MB
[2025-04-03 02:44:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][136/311]	eta 0:02:34 lr 0.000001	time 0.8758 (0.8857)	loss 0.6140 (0.5350)	grad_norm 2.7820 (3.1865)	mem 20675MB
[2025-04-03 02:44:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][138/311]	eta 0:02:33 lr 0.000001	time 0.8758 (0.8855)	loss 0.4860 (0.5347)	grad_norm 2.6614 (3.1765)	mem 20675MB
[2025-04-03 02:44:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][140/311]	eta 0:02:31 lr 0.000001	time 0.8758 (0.8854)	loss 0.6181 (0.5343)	grad_norm 2.5936 (3.1751)	mem 20675MB
[2025-04-03 02:45:00 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][142/311]	eta 0:02:29 lr 0.000001	time 0.8756 (0.8853)	loss 0.5837 (0.5340)	grad_norm 2.8064 (3.1689)	mem 20675MB
[2025-04-03 02:45:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][144/311]	eta 0:02:27 lr 0.000001	time 0.8761 (0.8852)	loss 0.5722 (0.5348)	grad_norm 3.0298 (3.1682)	mem 20675MB
[2025-04-03 02:45:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][146/311]	eta 0:02:26 lr 0.000001	time 0.8759 (0.8851)	loss 0.6064 (0.5353)	grad_norm 2.0797 (3.1612)	mem 20675MB
[2025-04-03 02:45:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][148/311]	eta 0:02:24 lr 0.000001	time 0.8756 (0.8849)	loss 0.6069 (0.5353)	grad_norm 2.4714 (3.1548)	mem 20675MB
[2025-04-03 02:45:07 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][150/311]	eta 0:02:22 lr 0.000001	time 0.8761 (0.8848)	loss 0.5734 (0.5349)	grad_norm 2.2854 (3.1720)	mem 20675MB
[2025-04-03 02:45:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][152/311]	eta 0:02:20 lr 0.000001	time 0.8758 (0.8847)	loss 0.4015 (0.5343)	grad_norm 3.5699 (3.1675)	mem 20675MB
[2025-04-03 02:45:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][154/311]	eta 0:02:18 lr 0.000001	time 0.8755 (0.8846)	loss 0.5161 (0.5341)	grad_norm 2.2333 (3.1618)	mem 20675MB
[2025-04-03 02:45:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][156/311]	eta 0:02:17 lr 0.000001	time 0.8755 (0.8845)	loss 0.4174 (0.5327)	grad_norm 4.8312 (3.1762)	mem 20675MB
[2025-04-03 02:45:14 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][158/311]	eta 0:02:15 lr 0.000001	time 0.8758 (0.8844)	loss 0.5889 (0.5333)	grad_norm 3.6586 (3.1763)	mem 20675MB
[2025-04-03 02:45:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][160/311]	eta 0:02:13 lr 0.000001	time 0.8755 (0.8843)	loss 0.5371 (0.5331)	grad_norm 2.2067 (3.1712)	mem 20675MB
[2025-04-03 02:45:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][162/311]	eta 0:02:11 lr 0.000001	time 0.8759 (0.8842)	loss 0.5891 (0.5328)	grad_norm 2.8656 (3.1918)	mem 20675MB
[2025-04-03 02:45:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][164/311]	eta 0:02:09 lr 0.000001	time 0.8756 (0.8841)	loss 0.6303 (0.5326)	grad_norm 2.5679 (3.1917)	mem 20675MB
[2025-04-03 02:45:21 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][166/311]	eta 0:02:08 lr 0.000001	time 0.8757 (0.8840)	loss 0.5384 (0.5322)	grad_norm 4.1922 (3.1973)	mem 20675MB
[2025-04-03 02:45:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][168/311]	eta 0:02:06 lr 0.000001	time 0.8755 (0.8840)	loss 0.5587 (0.5324)	grad_norm 2.5698 (3.1917)	mem 20675MB
[2025-04-03 02:45:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][170/311]	eta 0:02:04 lr 0.000001	time 0.8757 (0.8839)	loss 0.6039 (0.5332)	grad_norm 2.9185 (3.1844)	mem 20675MB
[2025-04-03 02:45:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][172/311]	eta 0:02:02 lr 0.000001	time 0.8754 (0.8838)	loss 0.4923 (0.5324)	grad_norm 3.2864 (3.1885)	mem 20675MB
[2025-04-03 02:45:28 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][174/311]	eta 0:02:01 lr 0.000001	time 0.8761 (0.8837)	loss 0.4787 (0.5316)	grad_norm 2.9152 (3.1860)	mem 20675MB
[2025-04-03 02:45:30 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][176/311]	eta 0:01:59 lr 0.000001	time 0.8757 (0.8836)	loss 0.6457 (0.5328)	grad_norm 3.0201 (3.1794)	mem 20675MB
[2025-04-03 02:45:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][178/311]	eta 0:01:57 lr 0.000001	time 0.8758 (0.8835)	loss 0.6235 (0.5335)	grad_norm 2.8774 (3.1764)	mem 20675MB
[2025-04-03 02:45:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][180/311]	eta 0:01:55 lr 0.000001	time 0.8756 (0.8835)	loss 0.5516 (0.5334)	grad_norm 1.7437 (3.1690)	mem 20675MB
[2025-04-03 02:45:35 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][182/311]	eta 0:01:53 lr 0.000001	time 0.8761 (0.8834)	loss 0.4092 (0.5326)	grad_norm 5.5201 (3.1817)	mem 20675MB
[2025-04-03 02:45:37 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][184/311]	eta 0:01:52 lr 0.000001	time 0.8755 (0.8833)	loss 0.6515 (0.5335)	grad_norm 2.9429 (3.1760)	mem 20675MB
[2025-04-03 02:45:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][186/311]	eta 0:01:50 lr 0.000001	time 0.8755 (0.8832)	loss 0.4959 (0.5337)	grad_norm 4.0107 (3.1759)	mem 20675MB
[2025-04-03 02:45:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][188/311]	eta 0:01:48 lr 0.000001	time 0.8757 (0.8832)	loss 0.6040 (0.5338)	grad_norm 2.4835 (3.1761)	mem 20675MB
[2025-04-03 02:45:42 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][190/311]	eta 0:01:46 lr 0.000001	time 0.8760 (0.8831)	loss 0.6250 (0.5347)	grad_norm 3.7743 (3.1784)	mem 20675MB
[2025-04-03 02:45:44 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][192/311]	eta 0:01:45 lr 0.000001	time 0.8755 (0.8830)	loss 0.4566 (0.5343)	grad_norm 5.5475 (3.1849)	mem 20675MB
[2025-04-03 02:45:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][194/311]	eta 0:01:43 lr 0.000001	time 0.8758 (0.8830)	loss 0.4198 (0.5342)	grad_norm 3.1531 (3.1808)	mem 20675MB
[2025-04-03 02:45:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][196/311]	eta 0:01:41 lr 0.000001	time 0.8760 (0.8829)	loss 0.6674 (0.5355)	grad_norm 3.1930 (3.1801)	mem 20675MB
[2025-04-03 02:45:49 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][198/311]	eta 0:01:39 lr 0.000001	time 0.8754 (0.8828)	loss 0.5578 (0.5359)	grad_norm 2.4485 (3.1741)	mem 20675MB
[2025-04-03 02:45:51 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][200/311]	eta 0:01:37 lr 0.000001	time 0.8757 (0.8828)	loss 0.5568 (0.5365)	grad_norm 2.1920 (3.1690)	mem 20675MB
[2025-04-03 02:45:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][202/311]	eta 0:01:36 lr 0.000001	time 0.8759 (0.8827)	loss 0.5440 (0.5363)	grad_norm 3.1181 (3.1679)	mem 20675MB
[2025-04-03 02:45:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][204/311]	eta 0:01:34 lr 0.000001	time 0.8758 (0.8827)	loss 0.4003 (0.5357)	grad_norm 3.0124 (3.1636)	mem 20675MB
[2025-04-03 02:45:56 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][206/311]	eta 0:01:32 lr 0.000001	time 0.8757 (0.8826)	loss 0.4313 (0.5353)	grad_norm 3.6576 (3.1634)	mem 20675MB
[2025-04-03 02:45:58 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][208/311]	eta 0:01:30 lr 0.000001	time 0.8760 (0.8825)	loss 0.5903 (0.5354)	grad_norm 2.2443 (3.1554)	mem 20675MB
[2025-04-03 02:45:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][210/311]	eta 0:01:29 lr 0.000001	time 0.8757 (0.8825)	loss 0.3838 (0.5345)	grad_norm 3.8936 (3.1573)	mem 20675MB
[2025-04-03 02:46:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][212/311]	eta 0:01:27 lr 0.000001	time 0.8754 (0.8824)	loss 0.5477 (0.5349)	grad_norm 3.2159 (3.1571)	mem 20675MB
[2025-04-03 02:46:03 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][214/311]	eta 0:01:25 lr 0.000001	time 0.8753 (0.8824)	loss 0.4810 (0.5344)	grad_norm 3.5526 (3.1607)	mem 20675MB
[2025-04-03 02:46:05 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][216/311]	eta 0:01:23 lr 0.000001	time 0.8757 (0.8823)	loss 0.5378 (0.5346)	grad_norm 4.5146 (3.1639)	mem 20675MB
[2025-04-03 02:46:06 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][218/311]	eta 0:01:22 lr 0.000001	time 0.8756 (0.8823)	loss 0.5877 (0.5347)	grad_norm 2.4917 (3.1618)	mem 20675MB
[2025-04-03 02:46:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][220/311]	eta 0:01:20 lr 0.000001	time 0.8755 (0.8822)	loss 0.4758 (0.5348)	grad_norm 2.9704 (3.1578)	mem 20675MB
[2025-04-03 02:46:10 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][222/311]	eta 0:01:18 lr 0.000001	time 0.8757 (0.8822)	loss 0.5571 (0.5348)	grad_norm 3.1917 (3.1548)	mem 20675MB
[2025-04-03 02:46:12 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][224/311]	eta 0:01:16 lr 0.000001	time 0.8758 (0.8821)	loss 0.4314 (0.5343)	grad_norm 5.4785 (3.1713)	mem 20675MB
[2025-04-03 02:46:13 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][226/311]	eta 0:01:14 lr 0.000001	time 0.8755 (0.8821)	loss 0.5744 (0.5346)	grad_norm 2.0707 (3.1620)	mem 20675MB
[2025-04-03 02:46:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][228/311]	eta 0:01:13 lr 0.000000	time 0.8754 (0.8820)	loss 0.4967 (0.5339)	grad_norm 4.0287 (3.1732)	mem 20675MB
[2025-04-03 02:46:17 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][230/311]	eta 0:01:11 lr 0.000000	time 0.8757 (0.8820)	loss 0.5907 (0.5345)	grad_norm 2.4359 (3.1667)	mem 20675MB
[2025-04-03 02:46:19 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][232/311]	eta 0:01:09 lr 0.000000	time 0.8754 (0.8819)	loss 0.6028 (0.5352)	grad_norm 2.8853 (3.1637)	mem 20675MB
[2025-04-03 02:46:20 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][234/311]	eta 0:01:07 lr 0.000000	time 0.8754 (0.8819)	loss 0.4828 (0.5353)	grad_norm 3.0321 (3.1624)	mem 20675MB
[2025-04-03 02:46:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][236/311]	eta 0:01:06 lr 0.000000	time 0.8758 (0.8818)	loss 0.4197 (0.5352)	grad_norm 3.8688 (3.1648)	mem 20675MB
[2025-04-03 02:46:24 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][238/311]	eta 0:01:04 lr 0.000000	time 0.8768 (0.8818)	loss 0.4977 (0.5352)	grad_norm 2.4070 (3.1607)	mem 20675MB
[2025-04-03 02:46:26 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][240/311]	eta 0:01:02 lr 0.000000	time 0.8755 (0.8817)	loss 0.5871 (0.5356)	grad_norm 2.3505 (3.1568)	mem 20675MB
[2025-04-03 02:46:27 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][242/311]	eta 0:01:00 lr 0.000000	time 0.8760 (0.8817)	loss 0.5756 (0.5359)	grad_norm 2.5571 (3.1509)	mem 20675MB
[2025-04-03 02:46:29 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][244/311]	eta 0:00:59 lr 0.000000	time 0.8757 (0.8817)	loss 0.6406 (0.5361)	grad_norm 2.5307 (3.1466)	mem 20675MB
[2025-04-03 02:46:31 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][246/311]	eta 0:00:57 lr 0.000000	time 0.8757 (0.8816)	loss 0.4882 (0.5361)	grad_norm 2.7212 (3.1439)	mem 20675MB
[2025-04-03 02:46:33 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][248/311]	eta 0:00:55 lr 0.000000	time 0.8756 (0.8816)	loss 0.6117 (0.5368)	grad_norm 2.6880 (3.1381)	mem 20675MB
[2025-04-03 02:46:34 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][250/311]	eta 0:00:53 lr 0.000000	time 0.8764 (0.8815)	loss 0.6367 (0.5370)	grad_norm 2.4290 (3.1372)	mem 20675MB
[2025-04-03 02:46:36 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][252/311]	eta 0:00:52 lr 0.000000	time 0.8759 (0.8815)	loss 0.5941 (0.5367)	grad_norm 2.1584 (3.1397)	mem 20675MB
[2025-04-03 02:46:38 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][254/311]	eta 0:00:50 lr 0.000000	time 0.8757 (0.8815)	loss 0.5055 (0.5367)	grad_norm 2.9123 (3.1354)	mem 20675MB
[2025-04-03 02:46:40 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][256/311]	eta 0:00:48 lr 0.000000	time 0.8760 (0.8814)	loss 0.6122 (0.5369)	grad_norm 2.8625 (3.1319)	mem 20675MB
[2025-04-03 02:46:41 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][258/311]	eta 0:00:46 lr 0.000000	time 0.8761 (0.8814)	loss 0.6328 (0.5369)	grad_norm 2.2641 (3.1345)	mem 20675MB
[2025-04-03 02:46:43 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][260/311]	eta 0:00:44 lr 0.000000	time 0.8755 (0.8814)	loss 0.5373 (0.5373)	grad_norm 3.5684 (3.1391)	mem 20675MB
[2025-04-03 02:46:45 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][262/311]	eta 0:00:43 lr 0.000000	time 0.8759 (0.8813)	loss 0.4947 (0.5372)	grad_norm 4.6684 (3.1410)	mem 20675MB
[2025-04-03 02:46:47 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][264/311]	eta 0:00:41 lr 0.000000	time 0.8754 (0.8813)	loss 0.5341 (0.5370)	grad_norm 2.4633 (3.1392)	mem 20675MB
[2025-04-03 02:46:48 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][266/311]	eta 0:00:39 lr 0.000000	time 0.8757 (0.8813)	loss 0.7019 (0.5373)	grad_norm 3.2145 (3.1470)	mem 20675MB
[2025-04-03 02:46:50 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][268/311]	eta 0:00:37 lr 0.000000	time 0.8758 (0.8812)	loss 0.4970 (0.5373)	grad_norm 3.9708 (3.1479)	mem 20675MB
[2025-04-03 02:46:52 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][270/311]	eta 0:00:36 lr 0.000000	time 0.8757 (0.8812)	loss 0.5297 (0.5374)	grad_norm 2.5841 (3.1442)	mem 20675MB
[2025-04-03 02:46:54 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][272/311]	eta 0:00:34 lr 0.000000	time 0.8756 (0.8812)	loss 0.5481 (0.5369)	grad_norm 2.5002 (3.1426)	mem 20675MB
[2025-04-03 02:46:55 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][274/311]	eta 0:00:32 lr 0.000000	time 0.8769 (0.8811)	loss 0.6027 (0.5369)	grad_norm 3.3831 (3.1465)	mem 20675MB
[2025-04-03 02:46:57 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][276/311]	eta 0:00:30 lr 0.000000	time 0.8760 (0.8811)	loss 0.5005 (0.5370)	grad_norm 3.7198 (3.1482)	mem 20675MB
[2025-04-03 02:46:59 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][278/311]	eta 0:00:29 lr 0.000000	time 0.8759 (0.8811)	loss 0.4850 (0.5365)	grad_norm 3.5851 (3.1540)	mem 20675MB
[2025-04-03 02:47:01 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][280/311]	eta 0:00:27 lr 0.000000	time 0.8756 (0.8810)	loss 0.5227 (0.5365)	grad_norm 2.6305 (3.1511)	mem 20675MB
[2025-04-03 02:47:02 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][282/311]	eta 0:00:25 lr 0.000000	time 0.8758 (0.8810)	loss 0.3914 (0.5361)	grad_norm 4.7683 (3.1557)	mem 20675MB
[2025-04-03 02:47:04 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][284/311]	eta 0:00:23 lr 0.000000	time 0.8757 (0.8810)	loss 0.5820 (0.5367)	grad_norm 2.9823 (3.1546)	mem 20675MB
[2025-04-03 02:47:06 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][286/311]	eta 0:00:22 lr 0.000000	time 0.8761 (0.8809)	loss 0.6166 (0.5369)	grad_norm 3.2136 (3.1563)	mem 20675MB
[2025-04-03 02:47:08 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][288/311]	eta 0:00:20 lr 0.000000	time 0.8759 (0.8809)	loss 0.5586 (0.5373)	grad_norm 2.9917 (3.1537)	mem 20675MB
[2025-04-03 02:47:09 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][290/311]	eta 0:00:18 lr 0.000000	time 0.8757 (0.8809)	loss 0.4190 (0.5372)	grad_norm 4.6995 (3.1575)	mem 20675MB
[2025-04-03 02:47:11 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][292/311]	eta 0:00:16 lr 0.000000	time 0.8758 (0.8809)	loss 0.4609 (0.5372)	grad_norm 4.8418 (3.1612)	mem 20675MB
[2025-04-03 02:47:13 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][294/311]	eta 0:00:14 lr 0.000000	time 0.8757 (0.8808)	loss 0.4472 (0.5368)	grad_norm 3.0843 (3.1651)	mem 20675MB
[2025-04-03 02:47:15 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][296/311]	eta 0:00:13 lr 0.000000	time 0.8752 (0.8808)	loss 0.4443 (0.5366)	grad_norm 3.2975 (3.1635)	mem 20675MB
[2025-04-03 02:47:16 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][298/311]	eta 0:00:11 lr 0.000000	time 0.8753 (0.8808)	loss 0.5562 (0.5367)	grad_norm 2.6737 (3.1596)	mem 20675MB
[2025-04-03 02:47:18 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][300/311]	eta 0:00:09 lr 0.000000	time 0.8756 (0.8807)	loss 0.5124 (0.5370)	grad_norm 3.1058 (3.1585)	mem 20675MB
[2025-04-03 02:47:20 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][302/311]	eta 0:00:07 lr 0.000000	time 0.8756 (0.8807)	loss 0.5655 (0.5373)	grad_norm 2.5307 (3.1538)	mem 20675MB
[2025-04-03 02:47:22 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][304/311]	eta 0:00:06 lr 0.000000	time 0.8759 (0.8807)	loss 0.4256 (0.5368)	grad_norm 3.7077 (3.1589)	mem 20675MB
[2025-04-03 02:47:23 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][306/311]	eta 0:00:04 lr 0.000000	time 0.8752 (0.8807)	loss 0.5095 (0.5371)	grad_norm 3.0713 (3.1560)	mem 20675MB
[2025-04-03 02:47:25 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][308/311]	eta 0:00:02 lr 0.000000	time 0.8752 (0.8806)	loss 0.4491 (0.5364)	grad_norm 3.8656 (3.1630)	mem 20675MB
[2025-04-03 02:47:27 simmim_finetune] (main_finetune.py 252): INFO Train: [29/30][310/311]	eta 0:00:00 lr 0.000000	time 0.8755 (0.8806)	loss 0.5422 (0.5366)	grad_norm 3.7084 (3.1646)	mem 20675MB
[2025-04-03 02:47:27 simmim_finetune] (main_finetune.py 260): INFO EPOCH 29 training takes 0:04:34
[2025-04-03 02:47:27 simmim_finetune] (utils.py 60): INFO checkpoint/hand/ckpt29.pth saving......
[2025-04-03 02:47:30 simmim_finetune] (utils.py 62): INFO checkpoint/hand/ckpt29.pth saved !!!
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 297): INFO Test: [0/2]	Time 1.481 (1.481)	Loss 0.5753 (0.5753)	Acc@1 74.219 (74.219)	Mem 20675MB
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 304): INFO  * Acc@1 76.056
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 171): INFO Accuracy of the network on the 142 test images: 76.1%
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 173): INFO Max accuracy: 78.87%
[2025-04-03 02:47:32 simmim_finetune] (main_finetune.py 177): INFO Training time 2:18:27
