Lapisbird, nielsr (HF Staff) committed
Commit 263dfc0 · verified · 1 Parent(s): d68a9b4

Improve model card: Add pipeline tag, paper and code links, and usage example (#1)

- Improve model card: Add pipeline tag, paper and code links, and usage example (35d6d7a3e59101f6f56386c8f44e0c7f35766d42)


Co-authored-by: Niels Rogge <[email protected]>

Files changed (1):
  README.md +24 -3
@@ -1,10 +1,31 @@
 ---
-library_name: transformers
-license: llama3.2
 base_model: meta-llama/Llama-3.2-1B-Instruct
 datasets:
 - whynlp/gsm8k-aug
+library_name: transformers
+license: llama3.2
 tags: []
+pipeline_tag: text-generation
 ---
 
-Built with Llama
+# Learning When to Stop: Adaptive Latent Reasoning via Reinforcement Learning
+
+This repository contains the model presented in the paper [Learning When to Stop: Adaptive Latent Reasoning via Reinforcement Learning](https://huggingface.co/papers/2511.21581).
+
+Latent reasoning is a new approach in Transformer language models that aims to compress reasoning lengths. This model uses post-SFT reinforcement learning to optimize the latent reasoning length, minimizing it while maintaining accuracy.
+
+Code: [https://github.com/apning/adaptive-latent-reasoning](https://github.com/apning/adaptive-latent-reasoning)
+
+## Sample Usage
+
+You can load these models using the `automodelforcausallm_from_pretrained_latent` function from `src.model_creation`:
+
+```python
+from transformers import AutoTokenizer
+from src.model_creation import automodelforcausallm_from_pretrained_latent
+
+repo_id = "Lapisbird/Llama-adaLR-model-latent-6"
+
+model = automodelforcausallm_from_pretrained_latent(repo_id)
+tokenizer = AutoTokenizer.from_pretrained(repo_id)
+```