dranreb1660 committed
Commit 7b8b18b · 0 Parent(s)

Initial commit with model files

Browse files
- .DS_Store +0 -0
- .gitattributes +11 -0
- llama_finetuned/.DS_Store +0 -0
- llama_finetuned/README.md +202 -0
- llama_finetuned/gptqmodel_4bit/quant_log.csv +225 -0

.DS_Store ADDED
Binary file (6.15 kB)

.gitattributes ADDED
@@ -0,0 +1,11 @@

*.bin filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.tar.gz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.7z filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
*.json filter=lfs diff=lfs merge=lfs -text
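
These rules route large artifacts (weights, archives, and JSON files) through Git LFS rather than regular Git. As a rough illustration only, the sketch below uses `fnmatch` as a stand-in for gitattributes pattern matching (adequate for these simple extension patterns) to show which paths a pattern set would capture; the file list is hypothetical:

```python
# Illustrative sketch: approximate which paths the LFS patterns above would capture.
from fnmatch import fnmatch
from pathlib import PurePosixPath

lfs_patterns = ["*.bin", "*.pkl", "*.h5", "*.tflite", "*.pb", "*.onnx",
                "*.tar.gz", "*.zip", "*.7z", "*.safetensors", "*.json"]

# Hypothetical paths, just to show the effect of the rules.
paths = ["llama_finetuned/README.md",
         "llama_finetuned/gptqmodel_4bit/quant_log.csv",
         "llama_finetuned/adapter_model.safetensors",
         "llama_finetuned/adapter_config.json"]

for p in paths:
    name = PurePosixPath(p).name
    tracked = any(fnmatch(name, pat) for pat in lfs_patterns)
    print(f"{p} -> {'Git LFS' if tracked else 'regular Git'}")
```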

llama_finetuned/.DS_Store ADDED
Binary file (6.15 kB)

llama_finetuned/README.md ADDED
@@ -0,0 +1,202 @@

---
base_model: meta-llama/Llama-3.1-8B-Instruct
library_name: peft
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]

### Model Sources [optional]

<!-- Provide the basic links for the model. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

### Direct Use

<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

[More Information Needed]

### Downstream Use [optional]

<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.

## How to Get Started with the Model

Use the code below to get started with the model.

[More Information Needed]
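
The quick-start code is still a template placeholder. A minimal loading sketch, assuming the PEFT adapter files live under `llama_finetuned/` in this repository and target the `base_model` declared in the front matter; the paths, dtype, and generation settings below are illustrative, not taken from the card:

```python
# Illustrative sketch, not from the model card: load the declared base model
# and attach the PEFT adapter added in this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.1-8B-Instruct"  # `base_model` from the front matter
adapter_path = "llama_finetuned"              # assumed location of the adapter files

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto", torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_path)  # PEFT 0.14.0 per the card

inputs = tokenizer("Hello!", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```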

## Training Details

### Training Data

<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

[More Information Needed]

### Training Procedure

<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

#### Preprocessing [optional]

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

#### Speeds, Sizes, Times [optional]

<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->

[More Information Needed]

## Evaluation

<!-- This section describes the evaluation protocols and provides the results. -->

### Testing Data, Factors & Metrics

#### Testing Data

<!-- This should link to a Dataset Card if possible. -->

[More Information Needed]

#### Factors

<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

[More Information Needed]

#### Metrics

<!-- These are the evaluation metrics being used, ideally with a description of why. -->

[More Information Needed]

### Results

[More Information Needed]

#### Summary

## Model Examination [optional]

<!-- Relevant interpretability work for the model goes here -->

[More Information Needed]

## Environmental Impact

<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->

Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]

## Technical Specifications [optional]

### Model Architecture and Objective

[More Information Needed]

### Compute Infrastructure

[More Information Needed]

#### Hardware

[More Information Needed]

#### Software

[More Information Needed]

## Citation [optional]

<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Model Card Authors [optional]

[More Information Needed]

## Model Card Contact

[More Information Needed]

### Framework versions

- PEFT 0.14.0

llama_finetuned/gptqmodel_4bit/quant_log.csv ADDED
@@ -0,0 +1,225 @@

layer,module,loss,damp,time
0,self_attn.k_proj,0.20497,0.01000,2.207
0,self_attn.v_proj,0.00201,0.01000,1.465
0,self_attn.q_proj,0.30193,0.01000,1.488
0,self_attn.o_proj,0.00019,0.01000,1.464
0,mlp.up_proj,0.14901,0.01000,1.538
0,mlp.gate_proj,0.17876,0.01000,1.495
0,mlp.down_proj,0.00063,0.01000,5.856
1,self_attn.k_proj,0.16532,0.01000,1.578
1,self_attn.v_proj,0.00722,0.01000,1.452
1,self_attn.q_proj,0.29823,0.01000,1.474
1,self_attn.o_proj,0.00045,0.01000,1.478
1,mlp.up_proj,0.28177,0.01000,1.518
1,mlp.gate_proj,0.33167,0.01000,1.473
1,mlp.down_proj,0.15022,0.01000,5.751
2,self_attn.k_proj,0.74963,0.01000,1.564
2,self_attn.v_proj,0.03474,0.01000,1.430
2,self_attn.q_proj,1.15687,0.01000,1.666
2,self_attn.o_proj,0.00045,0.01000,1.496
2,mlp.up_proj,0.48363,0.01000,1.516
2,mlp.gate_proj,0.60252,0.01000,1.495
2,mlp.down_proj,0.00285,0.01000,5.790
3,self_attn.k_proj,0.63965,0.01000,1.489
3,self_attn.v_proj,0.06011,0.01000,1.460
3,self_attn.q_proj,1.14490,0.01000,1.496
3,self_attn.o_proj,0.00123,0.01000,1.484
3,mlp.up_proj,0.63800,0.01000,1.504
3,mlp.gate_proj,0.91154,0.01000,1.496
3,mlp.down_proj,0.00571,0.01000,5.737
4,self_attn.k_proj,0.55720,0.01000,1.480
4,self_attn.v_proj,0.05674,0.01000,1.434
4,self_attn.q_proj,0.97971,0.01000,1.475
4,self_attn.o_proj,0.00259,0.01000,1.512
4,mlp.up_proj,0.80503,0.01000,1.489
4,mlp.gate_proj,1.30177,0.01000,1.478
4,mlp.down_proj,0.00961,0.01000,5.808
5,self_attn.k_proj,0.87404,0.01000,1.499
5,self_attn.v_proj,0.05538,0.01000,1.452
5,self_attn.q_proj,1.40345,0.01000,1.458
5,self_attn.o_proj,0.00271,0.01000,1.511
5,mlp.up_proj,0.95746,0.01000,1.506
5,mlp.gate_proj,1.53153,0.01000,1.486
5,mlp.down_proj,0.01401,0.01000,5.878
6,self_attn.k_proj,0.72873,0.01000,1.496
6,self_attn.v_proj,0.06350,0.01000,1.504
6,self_attn.q_proj,1.31834,0.01000,1.515
6,self_attn.o_proj,0.00532,0.01000,1.470
6,mlp.up_proj,1.06246,0.01000,1.510
6,mlp.gate_proj,1.72084,0.01000,1.486
6,mlp.down_proj,0.01771,0.01000,5.809
7,self_attn.k_proj,0.77504,0.01000,1.491
7,self_attn.v_proj,0.06837,0.01000,1.457
7,self_attn.q_proj,1.30903,0.01000,1.466
7,self_attn.o_proj,0.00768,0.01000,1.511
7,mlp.up_proj,1.18322,0.01000,1.507
7,mlp.gate_proj,1.79183,0.01000,1.485
7,mlp.down_proj,0.02134,0.01000,5.715
8,self_attn.k_proj,1.00739,0.01000,1.496
8,self_attn.v_proj,0.09484,0.01000,1.494
8,self_attn.q_proj,1.66843,0.01000,1.532
8,self_attn.o_proj,0.01138,0.01000,1.469
8,mlp.up_proj,1.25729,0.01000,1.497
8,mlp.gate_proj,1.93310,0.01000,1.482
8,mlp.down_proj,0.02314,0.01000,5.739
9,self_attn.k_proj,0.97315,0.01000,1.477
9,self_attn.v_proj,0.13074,0.01000,1.420
9,self_attn.q_proj,1.65996,0.01000,1.490
9,self_attn.o_proj,0.01420,0.01000,1.485
9,mlp.up_proj,1.31191,0.01000,1.509
9,mlp.gate_proj,2.03104,0.01000,1.490
9,mlp.down_proj,0.02473,0.01000,5.790
10,self_attn.k_proj,1.16183,0.01000,1.472
10,self_attn.v_proj,0.10278,0.01000,1.447
10,self_attn.q_proj,1.93474,0.01000,1.588
10,self_attn.o_proj,0.01309,0.01000,1.457
10,mlp.up_proj,1.37429,0.01000,1.542
10,mlp.gate_proj,1.98475,0.01000,1.489
10,mlp.down_proj,0.02690,0.01000,5.896
11,self_attn.k_proj,1.11032,0.01000,1.531
11,self_attn.v_proj,0.10660,0.01000,1.440
11,self_attn.q_proj,1.76108,0.01000,1.533
11,self_attn.o_proj,0.01489,0.01000,1.556
11,mlp.up_proj,1.47593,0.01000,1.523
11,mlp.gate_proj,2.06277,0.01000,1.498
11,mlp.down_proj,0.02968,0.01000,5.827
12,self_attn.k_proj,0.88908,0.01000,1.492
12,self_attn.v_proj,0.12811,0.01000,1.450
12,self_attn.q_proj,1.57054,0.01000,1.478
12,self_attn.o_proj,0.01993,0.01000,1.467
12,mlp.up_proj,1.54407,0.01000,1.516
12,mlp.gate_proj,2.04233,0.01000,1.456
12,mlp.down_proj,0.03332,0.01000,5.828
13,self_attn.k_proj,1.29054,0.01000,1.499
13,self_attn.v_proj,0.14054,0.01000,1.453
13,self_attn.q_proj,2.03878,0.01000,1.468
13,self_attn.o_proj,0.01995,0.01000,1.466
13,mlp.up_proj,1.60221,0.01000,1.540
13,mlp.gate_proj,2.11504,0.01000,1.534
13,mlp.down_proj,0.03465,0.01000,5.802
14,self_attn.k_proj,1.29833,0.01000,1.494
14,self_attn.v_proj,0.13893,0.01000,1.462
14,self_attn.q_proj,1.95875,0.01000,1.540
14,self_attn.o_proj,0.02435,0.01000,1.510
14,mlp.up_proj,1.72852,0.01000,1.499
14,mlp.gate_proj,2.40024,0.01000,1.493
14,mlp.down_proj,0.04297,0.01000,5.825
15,self_attn.k_proj,1.19167,0.01000,1.490
15,self_attn.v_proj,0.16916,0.01000,1.440
15,self_attn.q_proj,2.28649,0.01000,1.483
15,self_attn.o_proj,0.02458,0.01000,1.498
15,mlp.up_proj,1.79800,0.01000,1.510
15,mlp.gate_proj,2.63445,0.01000,1.500
15,mlp.down_proj,0.04993,0.01000,5.849
16,self_attn.k_proj,1.17782,0.01000,1.688
16,self_attn.v_proj,0.14366,0.01000,1.456
16,self_attn.q_proj,2.00419,0.01000,1.478
16,self_attn.o_proj,0.02002,0.01000,1.486
16,mlp.up_proj,1.82658,0.01000,1.500
16,mlp.gate_proj,2.83391,0.01000,1.502
16,mlp.down_proj,0.05562,0.01000,5.837
17,self_attn.k_proj,1.23447,0.01000,1.571
17,self_attn.v_proj,0.15860,0.01000,1.471
17,self_attn.q_proj,2.06489,0.01000,1.513
17,self_attn.o_proj,0.01764,0.01000,1.509
17,mlp.up_proj,1.86635,0.01000,1.516
17,mlp.gate_proj,2.96908,0.01000,1.496
17,mlp.down_proj,0.06157,0.01000,5.869
18,self_attn.k_proj,1.33415,0.01000,1.533
18,self_attn.v_proj,0.15418,0.01000,1.471
18,self_attn.q_proj,2.05141,0.01000,1.473
18,self_attn.o_proj,0.01136,0.01000,1.478
18,mlp.up_proj,1.92367,0.01000,1.539
18,mlp.gate_proj,3.09713,0.01000,1.498
18,mlp.down_proj,0.06182,0.01000,5.831
19,self_attn.k_proj,1.20882,0.01000,1.522
19,self_attn.v_proj,0.16848,0.01000,1.480
19,self_attn.q_proj,2.08260,0.01000,1.484
19,self_attn.o_proj,0.01204,0.01000,1.489
19,mlp.up_proj,1.99953,0.01000,1.524
19,mlp.gate_proj,3.27914,0.01000,1.527
19,mlp.down_proj,0.06632,0.01000,5.739
20,self_attn.k_proj,1.30188,0.01000,1.517
20,self_attn.v_proj,0.18483,0.01000,1.527
20,self_attn.q_proj,2.13806,0.01000,1.568
20,self_attn.o_proj,0.01415,0.01000,1.506
20,mlp.up_proj,2.15121,0.01000,1.524
20,mlp.gate_proj,3.49964,0.01000,1.575
20,mlp.down_proj,0.07272,0.01000,5.907
21,self_attn.k_proj,1.28493,0.01000,1.603
21,self_attn.v_proj,0.19875,0.01000,1.440
21,self_attn.q_proj,2.06501,0.01000,1.502
21,self_attn.o_proj,0.01691,0.01000,1.484
21,mlp.up_proj,2.29450,0.01000,1.507
21,mlp.gate_proj,3.76310,0.01000,1.498
21,mlp.down_proj,0.08457,0.01000,5.753
22,self_attn.k_proj,1.33741,0.01000,1.512
22,self_attn.v_proj,0.23293,0.01000,1.470
22,self_attn.q_proj,2.07345,0.01000,1.524
22,self_attn.o_proj,0.01621,0.01000,1.548
22,mlp.up_proj,2.41273,0.01000,1.522
22,mlp.gate_proj,3.91765,0.01000,1.511
22,mlp.down_proj,0.08732,0.01000,5.917
23,self_attn.k_proj,1.31320,0.01000,1.518
23,self_attn.v_proj,0.25737,0.01000,1.470
23,self_attn.q_proj,2.16603,0.01000,1.499
23,self_attn.o_proj,0.01396,0.01000,1.509
23,mlp.up_proj,2.55272,0.01000,1.516
23,mlp.gate_proj,4.12648,0.01000,1.501
23,mlp.down_proj,0.09488,0.01000,6.082
24,self_attn.k_proj,1.30349,0.01000,1.568
24,self_attn.v_proj,0.32369,0.01000,1.514
24,self_attn.q_proj,2.18847,0.01000,1.530
24,self_attn.o_proj,0.01625,0.01000,1.529
24,mlp.up_proj,2.76134,0.01000,1.553
24,mlp.gate_proj,4.46655,0.01000,1.531
24,mlp.down_proj,0.10605,0.01000,5.900
25,self_attn.k_proj,1.27495,0.01000,1.538
25,self_attn.v_proj,0.34449,0.01000,1.527
25,self_attn.q_proj,2.23903,0.01000,1.490
25,self_attn.o_proj,0.01893,0.01000,1.501
25,mlp.up_proj,3.02008,0.01000,1.535
25,mlp.gate_proj,4.88290,0.01000,1.522
25,mlp.down_proj,0.12015,0.01000,5.817
26,self_attn.k_proj,1.34739,0.01000,1.598
26,self_attn.v_proj,0.33584,0.01000,1.492
26,self_attn.q_proj,2.18536,0.01000,1.644
26,self_attn.o_proj,0.02690,0.01000,1.515
26,mlp.up_proj,3.25457,0.01000,1.549
26,mlp.gate_proj,5.27174,0.01000,1.523
26,mlp.down_proj,0.13506,0.01000,5.841
27,self_attn.k_proj,1.48341,0.01000,1.591
27,self_attn.v_proj,0.46845,0.01000,1.485
27,self_attn.q_proj,2.31207,0.01000,1.513
27,self_attn.o_proj,0.03408,0.01000,1.561
27,mlp.up_proj,3.67523,0.01000,1.555
27,mlp.gate_proj,5.90887,0.01000,1.516
27,mlp.down_proj,0.16378,0.01000,5.812
28,self_attn.k_proj,1.23230,0.01000,1.511
28,self_attn.v_proj,0.44825,0.01000,1.453
28,self_attn.q_proj,2.21801,0.01000,1.510
28,self_attn.o_proj,0.06715,0.01000,1.488
28,mlp.up_proj,4.15845,0.01000,1.574
28,mlp.gate_proj,6.39001,0.01000,1.504
28,mlp.down_proj,0.20997,0.01000,5.943
29,self_attn.k_proj,1.36047,0.01000,1.505
29,self_attn.v_proj,0.55055,0.01000,1.459
29,self_attn.q_proj,2.44167,0.01000,1.477
29,self_attn.o_proj,0.05338,0.01000,1.506
29,mlp.up_proj,4.49159,0.01000,1.515
29,mlp.gate_proj,6.60862,0.01000,1.496
29,mlp.down_proj,0.27847,0.01000,5.858
30,self_attn.k_proj,1.27395,0.01000,1.612
30,self_attn.v_proj,0.73981,0.01000,1.495
30,self_attn.q_proj,2.19805,0.01000,1.564
30,self_attn.o_proj,0.12117,0.01000,1.559
30,mlp.up_proj,4.80780,0.01000,1.551
30,mlp.gate_proj,7.24052,0.01000,1.533
30,mlp.down_proj,0.43298,0.01000,5.796
31,self_attn.k_proj,0.97036,0.01000,1.487
31,self_attn.v_proj,0.44060,0.01000,1.448
31,self_attn.q_proj,1.90233,0.01000,1.585
31,self_attn.o_proj,0.17504,0.01000,1.501
31,mlp.up_proj,4.47512,0.01000,1.515
31,mlp.gate_proj,6.52746,0.01000,1.487
31,mlp.down_proj,1.10257,0.01000,5.953
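
Each row of the log covers one quantized sub-module: the transformer layer index, the module name, the quantization loss reported by GPTQModel, the damping value, and the per-module processing time (seconds, by the look of the values). A small sketch for summarizing the log, for example mean loss per module type and total quantization time; the pandas usage is illustrative and not part of the repository:

```python
# Illustrative only: summarize llama_finetuned/gptqmodel_4bit/quant_log.csv.
import pandas as pd

log = pd.read_csv("llama_finetuned/gptqmodel_4bit/quant_log.csv")

# Mean quantization loss per module type, worst first.
print(log.groupby("module")["loss"].mean().sort_values(ascending=False))

# Number of layers covered and total wall-clock time recorded in the log.
print(f"layers: {log['layer'].nunique()}, total time: {log['time'].sum():.1f}")
```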