---
dataset_info:
  features:
    - name: id
      dtype: string
    - name: image
      dtype: image
    - name: height
      dtype: int16
    - name: weight
      dtype: int16
    - name: body_type
      dtype: string
    - name: shirt_size
      dtype: string
    - name: pant_size
      dtype: string
    - name: body_pose_params
      list: float64
    - name: pred_global_rots
      list:
        list:
          list: float64
    - name: focal_length
      dtype: float64
    - name: pred_joint_coords
      list:
        list: float64
    - name: global_rot
      list: float64
    - name: shape_params
      list: float64
    - name: pred_cam_t
      list: float64
    - name: pred_keypoints_2d
      list:
        list: float64
    - name: pred_keypoints_3d
      list:
        list: float64
    - name: bbox
      list: float64
    - name: scale_params
      list: float64
    - name: bmi
      dtype: float64
    - name: height_cls
      dtype:
        class_label:
          names:
            '0': H_132_155
            '1': H_157_157
            '2': H_160_160
            '3': H_163_163
            '4': H_165_165
            '5': H_168_168
            '6': H_170_170
            '7': H_173_173
            '8': H_175_175
            '9': H_178_178
            '10': H_180_191
    - name: weight_cls
      dtype:
        class_label:
          names:
            '0': W_38_53
            '1': W_54_58
            '2': W_59_62
            '3': W_63_66
            '4': W_67_70
            '5': W_71_75
            '6': W_76_79
            '7': W_80_86
            '8': W_87_95
            '9': W_96_109
            '10': W_110_186
    - name: bmi_cls
      dtype:
        class_label:
          names:
            '0': BMI_12.5_20
            '1': BMI_20.5_21.5
            '2': BMI_22_23
            '3': BMI_23.5_24.5
            '4': BMI_25_26
            '5': BMI_26.5_28
            '6': BMI_28.5_30
            '7': BMI_30.5_33.5
            '8': BMI_34_38
            '9': BMI_38.5_72.5
    - name: pants_cls
      dtype:
        class_label:
          names:
            '0': PANTS_28_32
            '1': PANTS_34_34
            '2': PANTS_36_36
            '3': PANTS_38_38
            '4': PANTS_40_40
            '5': PANTS_42_42
            '6': PANTS_44_46
            '7': PANTS_48_76
    - name: shirt_cls
      dtype:
        class_label:
          names:
            '0': SHIRT_32_36
            '1': SHIRT_40_40
            '2': SHIRT_44_44
            '3': SHIRT_46_46
            '4': SHIRT_48_48
            '5': SHIRT_50_66
    - name: labels
      list: string
  splits:
    - name: train
      num_bytes: 1754024153
      num_examples: 26817
    - name: test
      num_bytes: 193214537
      num_examples: 2977
  download_size: 1711984856
  dataset_size: 1947238690
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: test
        path: data/test-*
tags:
  - weight-estimation
  - height-estimation
size_categories:
  - 10K<n<100K
---

mbg-2511-pose-estimation-no_shapes-labeled

Summary

TRAIN: 26,817 samples

  • height_cls: 11 classes, balance 4.0%-13.4%
  • weight_cls: 11 classes, balance 7.0%-10.0%
  • bmi_cls: 10 classes, balance 8.4%-11.3%
  • pants_cls: 8 classes, balance 7.8%-18.1%
  • shirt_cls: 6 classes, balance 4.3%-30.3%

TEST: 2,977 samples

  • height_cls: 11 classes, balance 3.7%-12.8%
  • weight_cls: 11 classes, balance 6.8%-10.2%
  • bmi_cls: 10 classes, balance 9.0%-11.7%
  • pants_cls: 8 classes, balance 7.5%-17.4%
  • shirt_cls: 6 classes, balance 4.9%-31.1%

(Figure: raw target distributions)

Train & Test Split

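The splits can be loaded directly with the `datasets` library. A minimal sketch; the repo id below is an assumption based on the dataset name, so adjust it to the actual Hub path if it differs:

```python
from datasets import load_dataset

# Repo id assumed from the dataset name; adjust if the actual Hub path differs.
ds = load_dataset("alecccdd/mbg-2511-pose-estimation-no_shapes-labeled")

print(ds["train"].num_rows)  # 26817
print(ds["test"].num_rows)   # 2977
```

Inspecting the resulting `DatasetDict`:
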
```python
>>> ds
DatasetDict({
    train: Dataset({
        features: ['id', 'image', 'height', 'weight', 'body_type', 'shirt_size', 'pant_size', 'body_pose_params', 'pred_global_rots', 'focal_length', 'pred_joint_coords', 'global_rot', 'shape_params', 'pred_cam_t', 'pred_keypoints_2d', 'pred_keypoints_3d', 'bbox', 'scale_params', 'bmi', 'height_cls', 'weight_cls', 'bmi_cls', 'pants_cls', 'shirt_cls', 'labels'],
        num_rows: 26817
    })
    test: Dataset({
        features: ['id', 'image', 'height', 'weight', 'body_type', 'shirt_size', 'pant_size', 'body_pose_params', 'pred_global_rots', 'focal_length', 'pred_joint_coords', 'global_rot', 'shape_params', 'pred_cam_t', 'pred_keypoints_2d', 'pred_keypoints_3d', 'bbox', 'scale_params', 'bmi', 'height_cls', 'weight_cls', 'bmi_cls', 'pants_cls', 'shirt_cls', 'labels'],
        num_rows: 2977
    })
})
```

Features

(Figures: feature importance, feature correlations)

>>> ds["train"].features
{
    'id': Value('string'), 
    'image': Image(mode=None, decode=True), 
    'height': Value('int16'), 
    'weight': Value('int16'), 
    'body_type': Value('string'), 
    'shirt_size': Value('string'), 
    'pant_size': Value('string'), 
    'body_pose_params': List(Value('float64')), 
    'pred_global_rots': List(List(List(Value('float64')))), 
    'focal_length': Value('float64'), 
    'pred_joint_coords': List(List(Value('float64'))), 
    'global_rot': List(Value('float64')), 
    'shape_params': List(Value('float64')), 
    'pred_cam_t': List(Value('float64')), 
    'pred_keypoints_2d': List(List(Value('float64'))), 
    'pred_keypoints_3d': List(List(Value('float64'))), 
    'bbox': List(Value('float64')), 
    'scale_params': List(Value('float64')), 
    'bmi': Value('float64'), 
    'height_cls': ClassLabel(names=['H_132_155', 'H_157_157', 'H_160_160', 
                                    'H_163_163', 'H_165_165', 'H_168_168', 
                                    'H_170_170', 'H_173_173', 'H_175_175', 
                                    'H_178_178', 'H_180_191']), 
    'weight_cls': ClassLabel(names=['W_38_53', 'W_54_58', 'W_59_62', 'W_63_66', 
                                    'W_67_70', 'W_71_75', 'W_76_79', 'W_80_86', 
                                    'W_87_95', 'W_96_109', 'W_110_186']), 
    'bmi_cls': ClassLabel(names=['BMI_12.5_20', 'BMI_20.5_21.5', 'BMI_22_23', 
                                 'BMI_23.5_24.5', 'BMI_25_26', 'BMI_26.5_28', 
                                 'BMI_28.5_30', 'BMI_30.5_33.5', 'BMI_34_38', 
                                 'BMI_38.5_72.5']), 
    'pants_cls': ClassLabel(names=['PANTS_28_32', 'PANTS_34_34', 'PANTS_36_36', 
                                   'PANTS_38_38', 'PANTS_40_40', 'PANTS_42_42', 
                                   'PANTS_44_46', 'PANTS_48_76']), 
    'shirt_cls': ClassLabel(names=['SHIRT_32_36', 'SHIRT_40_40', 'SHIRT_44_44', 
                                   'SHIRT_46_46', 'SHIRT_48_48', 'SHIRT_50_66']), 
    'labels': List(Value('string'))
}
```
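
The `*_cls` targets are `ClassLabel` features, so the stored integer indices map back and forth to the bin names via the feature metadata. A minimal sketch, assuming `ds` was loaded as above:

```python
height_feat = ds["train"].features["height_cls"]

idx = ds["train"][0]["height_cls"]        # 2 for the sample below
print(height_feat.int2str(idx))           # 'H_160_160'
print(height_feat.str2int("H_160_160"))   # 2
print(height_feat.num_classes)            # 11
```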

Sample

```
>>> ds['train'][0]
id: 382327
image: <PIL.Image.Image image mode=RGB size=500x1037 at 0x2B95CD6D82C0>
height: 160
weight: 50
body_type: rectangle
shirt_size: 36
pant_size: 28
body_pose_params: [[ 0.03886835 -0.00208342 -0.02778961]...[0. 0.]] shape=(133,), dtype=float64
pred_global_rots: [[[[ 1.          0.          0.        ]
  [ 0.          1.          0.        ]
  [ 0.          0.          1.        ]]

 [[ 0.99640441  0.05728238 -0.06242636]
  [-0.06423091  0.99120814 -0.11567552]
  [ 0.05525135  0.1192693   0.99132341]]

 [[-0.0984109  -0.05719048  0.99350119]
  [-0.99246413  0.07888412 -0.09376723]
  [-0.07300889 -0.99524194 -0.06452253]]]...[[[ 0.98913765  0.07291167  0.12763445]
  [-0.02712706  0.94393992 -0.32900089]
  [-0.14446725  0.3219648   0.93566442]]

 [[ 0.07291168  0.03137279  0.99684489]
  [ 0.94393992 -0.32483506 -0.05881885]
  [ 0.32196486  0.94525027 -0.05329826]]]] shape=(127, 3, 3), dtype=float64
focal_length: 1014.9290161132812
pred_joint_coords: [[[ 0.         -0.         -0.        ]
 [ 0.         -0.92398697 -0.        ]
 [ 0.08633898 -0.8986935  -0.00758996]]...[[ 0.0923527  -1.56066072 -0.27626431]
 [ 0.0634833  -1.68751919 -0.22902386]]] shape=(127, 3), dtype=float64
global_rot: [0.11973767727613449, -0.05527949333190918, -0.06437363475561142]
shape_params: [[-0.76408136  0.79042131 -1.15044034]...[ 0.12328251 -0.02765466]] shape=(45,), dtype=float64
pred_cam_t: [-0.006475623697042465, 1.0043303966522217, 1.6861722469329834]
pred_keypoints_2d: [[[286.81719971 143.6751709 ]
 [311.28182983 119.13873291]
 [263.91549683 114.87661743]]...[[165.12356567 245.27871704]
 [272.43045044 236.9727478 ]]] shape=(70, 2), dtype=float64
pred_keypoints_3d: [[[ 0.05692464 -1.51793671 -0.29545864]
 [ 0.09171973 -1.55984902 -0.27438799]
 [ 0.02577711 -1.56417561 -0.27841634]]...[[-0.12643629 -1.43218017 -0.09684809]
 [ 0.04102116 -1.43791533 -0.12306281]]] shape=(70, 3), dtype=float64
bbox: [54.824241638183594, 207.05712890625, 494.9328308105469, 1034.299072265625]
scale_params: [[ 0.00039525 -0.0016843  -0.00043011]...[ 0.39154088 -0.13409461]] shape=(28,), dtype=float64
bmi: 19.5
height_cls: 2
weight_cls: 0
bmi_cls: 0
pants_cls: 0
shirt_cls: 0
labels: ['H_160_160', 'W_38_53', 'BMI_12.5_20', 'PANTS_28_32', 'SHIRT_32_36']
```
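
The raw `height` (cm) and `weight` (kg) fields are consistent with the stored `bmi`: for this sample, weight / (height in m)² = 50 / 1.60² ≈ 19.53, and the `bmi` field holds 19.5 (apparently rounded). A quick check, assuming `ds` as above:

```python
sample = ds["train"][0]

# BMI from the raw height (cm) and weight (kg) fields.
bmi = sample["weight"] / (sample["height"] / 100) ** 2
print(round(bmi, 2))   # 19.53 for this sample
print(sample["bmi"])   # 19.5 (the stored value appears to be rounded)
```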

Bins

(Figure: class distributions)

Bins for height

  • H_132_155: 2,088 samples (7.8%)
  • H_157_157: 2,042 samples (7.6%)
  • H_160_160: 2,846 samples (10.6%)
  • H_163_163: 3,602 samples (13.4%)
  • H_165_165: 3,272 samples (12.2%)
  • H_168_168: 3,466 samples (12.9%)
  • H_170_170: 3,175 samples (11.8%)
  • H_173_173: 2,380 samples (8.9%)
  • H_175_175: 1,725 samples (6.4%)
  • H_178_178: 1,154 samples (4.3%)
  • H_180_191: 1,067 samples (4.0%)

Bins for weight

  • W_38_53: 2,529 samples (9.4%)
  • W_54_58: 2,496 samples (9.3%)
  • W_59_62: 2,392 samples (8.9%)
  • W_63_66: 2,562 samples (9.6%)
  • W_67_70: 2,601 samples (9.7%)
  • W_71_75: 2,681 samples (10.0%)
  • W_76_79: 1,869 samples (7.0%)
  • W_80_86: 2,645 samples (9.9%)
  • W_87_95: 2,335 samples (8.7%)
  • W_96_109: 2,389 samples (8.9%)
  • W_110_186: 2,318 samples (8.6%)

Bins for bmi

  • BMI_12.5_20: 3,043 samples (11.3%)
  • BMI_20.5_21.5: 2,567 samples (9.6%)
  • BMI_22_23: 2,743 samples (10.2%)
  • BMI_23.5_24.5: 2,809 samples (10.5%)
  • BMI_25_26: 2,666 samples (9.9%)
  • BMI_26.5_28: 2,716 samples (10.1%)
  • BMI_28.5_30: 2,243 samples (8.4%)
  • BMI_30.5_33.5: 2,904 samples (10.8%)
  • BMI_34_38: 2,476 samples (9.2%)
  • BMI_38.5_72.5: 2,650 samples (9.9%)

Bins for pant_size

  • PANTS_28_32: 4,858 samples (18.1%)
  • PANTS_34_34: 3,020 samples (11.3%)
  • PANTS_36_36: 3,467 samples (12.9%)
  • PANTS_38_38: 3,431 samples (12.8%)
  • PANTS_40_40: 3,331 samples (12.4%)
  • PANTS_42_42: 2,715 samples (10.1%)
  • PANTS_44_46: 3,890 samples (14.5%)
  • PANTS_48_76: 2,105 samples (7.8%)

Bins for shirt_size

  • SHIRT_32_36: 7,034 samples (26.2%)
  • SHIRT_40_40: 8,134 samples (30.3%)
  • SHIRT_44_44: 5,542 samples (20.7%)
  • SHIRT_46_46: 3,171 samples (11.8%)
  • SHIRT_48_48: 1,158 samples (4.3%)
  • SHIRT_50_66: 1,778 samples (6.6%)
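
Given the imbalance above (e.g. SHIRT_40_40 at 30.3% of the train split vs SHIRT_48_48 at 4.3%), loss weighting or resampling is worth considering; the Class Weights entry in the analysis below covers this. A minimal sketch of inverse-frequency ("balanced") weights for one target, computed directly from the split (illustrative only, not the weights shipped with the analysis):

```python
import numpy as np

shirt_feat = ds["train"].features["shirt_cls"]
labels = np.asarray(ds["train"]["shirt_cls"])
counts = np.bincount(labels, minlength=shirt_feat.num_classes)

# sklearn-style "balanced" weights: n_samples / (n_classes * count_per_class)
weights = counts.sum() / (len(counts) * counts)
for name, w in zip(shirt_feat.names, weights):
    print(f"{name}: {w:.2f}")
```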

Analysis

Analyses Included

| Analysis | Purpose | Architecture/Training Impact |
| --- | --- | --- |
| Feature Structure | Exact shapes of each feature (e.g., keypoints Nx3) | Determines if you need attention/conv vs MLP |
| Feature Statistics | Mean, std, skewness, outliers, constant dims | Normalization strategy, dead feature pruning |
| Feature Correlations | Inter-group correlations | Feature fusion strategy, redundancy removal |
| PCA Analysis | Intrinsic dimensionality | Bottleneck layer sizing |
| Feature Redundancy | Effective rank per feature group | Per-group compression potential |
| Target Correlations | Spearman + mutual info between targets | Shared representation depth, MTL benefits |
| Class Distributions | Imbalance ratios, entropy | Class weights, sampling strategy |
| Joint Distribution | Multi-label co-occurrence | Combined loss behavior |
| Feature Importance | RF importance + linear probes | Feature weighting, gating |
| Feature Ablation | Leave-one-out accuracy drop | Critical features identification |
| Per-Feature Accuracy | Individual group predictive power | Which features to prioritize |
| Class Separability | LDA + t-SNE | Expected ceiling, nonlinearity needs |
| Sample Difficulty | k-NN consistency | Curriculum learning, hard mining |
| Train/Test Comparison | Distribution shift | Generalization expectations |
| Ordinal Structure | Monotonicity with raw values | Ordinal regression vs classification |
| Keypoint Correlations | Which keypoint dims predict what | Body-aware feature selection |
| Normalization Strategy | Outliers, bounds checking | Per-feature normalization choice |
| Class Weights | Balanced + effective weights | Loss weighting values |
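
To illustrate the PCA Analysis row above (intrinsic dimensionality for bottleneck sizing), a minimal sketch of that kind of check on the flattened `body_pose_params` vectors of a train subset; this is an illustration, not a reproduction of the shipped analysis:

```python
import numpy as np
from sklearn.decomposition import PCA

# body_pose_params is a flat float64 list per the schema above (133-dim in the sample).
pose = ds["train"].select(range(2000))["body_pose_params"]
X = np.stack([np.asarray(p, dtype=np.float64).ravel() for p in pose])

pca = PCA().fit(X)
cum = np.cumsum(pca.explained_variance_ratio_)
print(int(np.searchsorted(cum, 0.95)) + 1)   # components for 95% of the variance
print(int(np.searchsorted(cum, 0.99)) + 1)   # components for 99% of the variance
```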

Key Questions Answered

  1. How many pose feature dimensions do you actually have? (for MLP sizing)
  2. What's the effective dimensionality after PCA? (bottleneck size)
  3. Which targets are correlated? (shared representation design)
  4. How hard is each classification task? (loss weighting)
  5. Which features matter most? (attention/gating design)
  6. Should you use ordinal regression? (loss function choice; see the sketch after this list)
  7. How much class imbalance? (sampling strategy)
  8. Are there redundant dimensions to prune? (efficiency)
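
On question 6: every `*_cls` target bins an ordered quantity, so a common alternative to plain softmax is to train against cumulative binary targets (CORAL-style ordinal regression). A minimal sketch of that encoding, offered as an illustration rather than anything prescribed by this dataset:

```python
import numpy as np

def ordinal_targets(label: int, num_classes: int) -> np.ndarray:
    """Encode an ordered class index as num_classes-1 cumulative binary
    targets, where target t answers: is the label greater than t?"""
    return (label > np.arange(num_classes - 1)).astype(np.float32)

num_height = ds["train"].features["height_cls"].num_classes   # 11
print(ordinal_targets(2, num_height))   # [1. 1. 0. 0. 0. 0. 0. 0. 0. 0.]
```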

Full Analysis

The full analysis results can be found in `./analysis/analysis_results.json`.

Addendum

Per Feature Accuracy

(Figure: per-feature accuracy)

PCA Analysis

(Figure: PCA analysis)

Target Correlations

(Figure: target correlations)

TSNE Visualization

(Figure: t-SNE visualization)