---
pipeline_tag: text-generation
base_model: Qwen/Qwen1.5-7B
library_name: peft
tags:
- LoRA
- TLE
- space-domain-awareness
- trajectory-prediction
- orbital-mechanics
license: other
---
# tle-orbit-explainer
A LoRA adapter for **Qwen-1.5-7B** that translates raw Two-Line Elements (TLEs) into natural-language orbit explanations, decay risk scores, and anomaly flags for general space awareness workflows.
---
## Model Details
### Model Description
| | |
| ------------------ | ----------------------------------------------------------- |
| **Developed by** | Jack Al-Kahwati / Stardrive |
| **Funded by**      | Self-funded                                                  |
| **Shared by** | jackal79 (Hugging Face) |
| **Model type** | LoRA adapter (`peft==0.10.0`) |
| **Languages** | English |
| **License** | TLE-Orbit-NonCommercial v1.0 ([custom terms](./LICENSE.txt)) |
| **Finetuned from** | [`Qwen/Qwen1.5-7B`](https://huggingface.co/Qwen/Qwen1.5-7B) |
### Model Sources
| | |
| ---------------- | ---------------------------------------------------------------------------------------------------------- |
| **Repository** | [https://huggingface.co/jackal79/tle-orbit-explainer](https://huggingface.co/jackal79/tle-orbit-explainer) |
| **Paper / Blog** | [Enhancing Space Awareness with Fine-Tuned Transformer Models: Introducing tle-orbit-explainer](https://medium.com/@jack_16944/enhancing-space-awareness-with-fine-tuned-transformer-models-introducing-tle-orbit-explainer-67ae40653ed5) |
---
## Uses
### Direct Use
* Quick summarization of satellite orbital states for analysts
* Plain-language TLE explanations for educational purposes
* Offline dataset labeling (orbital classifications); see the sketch after this list
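For the labeling use case, a minimal batch sketch. It assumes the `pipe` object built in the How to Get Started section below; the TLE list and output parsing are illustrative, not part of the adapter's API.

```python
# Label a small batch of TLEs with the adapter's generated reasoning text.
tles = [
    ("1 25544U 98067A   24079.07757601  .00016717  00000+0  10270-3 0  9994",
     "2 25544  51.6400 337.6640 0007776  35.5310 330.5120 15.50377579499263"),
    # ... additional (line1, line2) pairs
]

labels = []
for l1, l2 in tles:
    prompt = f"### Prompt:\n{l1}\n{l2}\n### Reasoning:\n"
    out = pipe(prompt, max_new_tokens=120)[0]["generated_text"]
    labels.append(out.split("### Reasoning:")[-1].strip())  # keep only the generated explanation
```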
### Downstream Use
* Combine with SGP4 for enhanced position forecasting; see the sketch after this list
* Integration into satellite autonomy stacks (CubeSats, small-scale hardware)
* Pre-prompted agent support in secure orbital management workflows
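The SGP4 pairing above can be sketched with the `sgp4` Python package (an assumed dependency, `pip install sgp4`): propagate the TLE numerically, then cross-check the state vector against the adapter's plain-language summary of the same element set.

```python
# Minimal sketch: numeric propagation with SGP4 alongside the adapter's text output.
from sgp4.api import Satrec, jday

line1 = "1 25544U 98067A   24079.07757601  .00016717  00000+0  10270-3 0  9994"
line2 = "2 25544  51.6400 337.6640 0007776  35.5310 330.5120 15.50377579499263"

sat = Satrec.twoline2rv(line1, line2)   # parse the TLE
jd, fr = jday(2024, 3, 19, 12, 0, 0)    # illustrative epoch (UTC)
err, r, v = sat.sgp4(jd, fr)            # TEME position (km) and velocity (km/s)

if err == 0:
    print("Position (km):", r)
    print("Velocity (km/s):", v)
# The numeric state can then be compared with the adapter's explanation of the
# same TLE for a combined physics + language view.
```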
### Out-of-Scope Use
* High-precision orbit propagation without additional physics modeling
* Applications related to targeting, weapons systems, or lethal autonomous decisions
* Use in jurisdictions that restrict ML model or data export (verify ITAR/EAR compliance)
---
## Bias, Risks, & Limitations
| Category | Note |
| ------------------- | ------------------------------------------------------------------------------------------------------------- |
| **Data bias** | Trained primarily on decayed objects (`DECAY = 1`), possibly underestimating longevity for active satellites. |
| **Temporal limits** | Operates on snapshot data; does not handle continuous high-frequency time-series. |
| **Language** | Supports explanations in English only. |
| **Accuracy** | Potential inaccuracies in decay date predictions; verify independently. |
### Recommendations
Incorporate independent physics-based validation before operational use and maintain a human-in-the-loop for any critical or high-risk decisions.
---
## How to Get Started
```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
from peft import PeftModel
base = "Qwen/Qwen1.5-7B"
lora = "jackal79/tle-orbit-explainer"
tok = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = PeftModel.from_pretrained(model, lora)  # attaches the LoRA adapter (weights are not merged)
pipe = pipeline("text-generation", model=model, tokenizer=tok)  # device placement already handled by device_map
prompt = """### Prompt:
1 25544U 98067A   24079.07757601  .00016717  00000+0  10270-3 0  9994
2 25544  51.6400 337.6640 0007776  35.5310 330.5120 15.50377579499263
### Reasoning:
"""
print(pipe(prompt, max_new_tokens=120)[0]["generated_text"])
```
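For repeated inference, the LoRA adapter can optionally be merged into the base weights with `merge_and_unload()`; a short sketch (the output directory name is illustrative):

```python
# Optional: bake the adapter into the base model for slightly faster inference.
merged = model.merge_and_unload()                     # returns a plain transformers model
merged.save_pretrained("tle-orbit-explainer-merged")  # illustrative output path
tok.save_pretrained("tle-orbit-explainer-merged")
```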
---
## License
This model is released under the **TLE-Orbit-NonCommercial License v1.0**.
- ✅ Free for non-commercial use, research, and internal evaluation
- 🚫 Commercial, operational, or for-profit use requires a separate license
To request a commercial license, contact: [email protected]