
Orion-BiX: Bi-Axial Meta-Learning Model for Tabular In-Context Learning

Orion-BiX is an advanced tabular foundation model that combines Bi-Axial Attention with Meta-Learning capabilities for few-shot tabular classification. The model extends the TabICL architecture with alternating attention patterns and episode-based training.

The model is part of Orion, a family of tabular foundation models with various enhancements.

Key Innovations

  1. Bi-Axial Attention: Alternating attention patterns (Standard → Grouped → Hierarchical → Relational) that capture multi-scale feature interactions within tabular data
  2. Meta-Learning with k-NN Support Selection: Episode-based training with intelligent support set selection using similarity metrics
  3. Three-Component Architecture: Column embedding (Set Transformer), Bi-Axial row interaction, and In-Context Learning prediction
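To make the second point concrete, here is a minimal sketch of similarity-based support set selection: for each query row, the k nearest candidates (by Euclidean distance) form its support set for an episode. This is an illustration of the general idea only; `select_support` is a hypothetical helper, and the actual Orion-BiX episode construction may use different similarity metrics and sampling rules.

```python
import numpy as np

def select_support(query, candidates, labels, k=3):
    """Toy k-NN support selection: return the k candidates closest
    to the query (Euclidean distance) together with their labels."""
    dists = np.linalg.norm(candidates - query, axis=1)
    idx = np.argsort(dists)[:k]
    return candidates[idx], labels[idx]

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))            # 20 candidate rows, 4 features
y = rng.integers(0, 2, size=20)         # binary labels
support_X, support_y = select_support(X[0], X[1:], y[1:], k=5)
print(support_X.shape)  # (5, 4)
```

In episode-based training, each (support set, query) pair forms one few-shot task, so the quality of this selection directly shapes what the in-context learner sees.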

Architecture Overview

Input → tf_col (Set Transformer) → Bi-Axial Attention → tf_icl (ICL) → Output

Component Details:

  • tf_col (Column Embedder): Set Transformer for statistical distribution learning across features
  • Bi-Axial Attention: Replaces standard RowInteraction with alternating attention patterns:
    • Standard Cross-Feature Attention
    • Grouped Feature Attention
    • Hierarchical Feature Attention
    • Relational Feature Attention
    • CLS Token Aggregation
  • tf_icl (ICL Predictor): In-context learning module for few-shot prediction
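The bi-axial idea of the middle stage can be sketched in a few lines of NumPy: attention is applied alternately along the feature axis (within each row) and along the row axis (within each feature). This is a toy single-head illustration of alternating-axis attention only, not the Orion-BiX implementation, which uses the four specialized patterns listed above plus CLS token aggregation and learned projections.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Single-head scaled dot-product self-attention over axis -2.
    x: (..., n, d) -> (..., n, d)."""
    d = x.shape[-1]
    scores = x @ x.swapaxes(-1, -2) / np.sqrt(d)
    return softmax(scores, axis=-1) @ x

def bi_axial_block(h):
    """h: (rows, features, d). One alternating attention block:
    first attend across features, then across rows."""
    h = h + self_attention(h)                  # feature-axis attention
    h_t = h.swapaxes(0, 1)                     # (features, rows, d)
    h_t = h_t + self_attention(h_t)            # row-axis attention
    return h_t.swapaxes(0, 1)                  # back to (rows, features, d)

rng = np.random.default_rng(0)
h = rng.normal(size=(8, 5, 16))  # 8 rows, 5 features, 16-dim embeddings
out = bi_axial_block(h)
print(out.shape)  # (8, 5, 16)
```

Alternating the axis of attention is what lets a single stack mix information both across the features of one sample and across the samples sharing a feature.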

Usage

from orion_bix.sklearn import OrionBiXClassifier

# Initialize and use
clf = OrionBiXClassifier()
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)

This code will automatically download the pre-trained model from Hugging Face and use a GPU if available.

Installation

From source

Option 1: From the local clone

cd orion-bix
pip install -e .

Option 2: From the Git remote

pip install git+https://github.com/Lexsi-Labs/Orion-BiX.git

Citation

If you use Orion-BiX in your research, please cite:

@misc{bouadi25orionbix,
  title={Orion-BiX: Bi-Axial Meta-Learning for Tabular In-Context Learning},
  author={Mohamed Bouadi and Pratinav Seth and Aditya Tanna and Vinay Kumar Sankarapu},
  year={2025},
}