Added final set of Supporting file(s)
- .gitattributes +11 -0
- SYSTEM_PROMPT.txt +79 -0
- assets/cline_config.png +3 -0
- assets/mistral_common_coverage/coverage_distribution.png +0 -0
- assets/mistral_common_coverage/coverage_pie.png +0 -0
- assets/mistral_common_coverage/coverage_summary.png +0 -0
- assets/mistral_common_coverage/dependencies.png +3 -0
- assets/mistral_common_coverage/navigate.png +3 -0
- assets/mistral_common_coverage/prompt.png +3 -0
- assets/mistral_common_coverage/visualization.png +3 -0
- assets/open_hands_config.png +0 -0
- assets/space_invaders_pong/base_structure.png +3 -0
- assets/space_invaders_pong/game.png +0 -0
- assets/space_invaders_pong/prompt.png +3 -0
- assets/space_invaders_pong/task completed.png +3 -0
- assets/swe_benchmark.png +3 -0
- chat_template.jinja +16 -0
- chat_template.json +3 -0
- config.json +54 -0
- generation_config.json +11 -0
- model.safetensors.index.json +371 -0
- params.json +12 -0
- preprocessor_config.json +31 -0
- processor_config.json +43 -0
- tekken.json +3 -0
- tokenizer.json +3 -0
- tokenizer_config.json +0 -0
.gitattributes
CHANGED
@@ -33,3 +33,14 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+assets/cline_config.png filter=lfs diff=lfs merge=lfs -text
+assets/mistral_common_coverage/dependencies.png filter=lfs diff=lfs merge=lfs -text
+assets/mistral_common_coverage/navigate.png filter=lfs diff=lfs merge=lfs -text
+assets/mistral_common_coverage/prompt.png filter=lfs diff=lfs merge=lfs -text
+assets/mistral_common_coverage/visualization.png filter=lfs diff=lfs merge=lfs -text
+assets/space_invaders_pong/base_structure.png filter=lfs diff=lfs merge=lfs -text
+assets/space_invaders_pong/prompt.png filter=lfs diff=lfs merge=lfs -text
+assets/space_invaders_pong/task[[:space:]]completed.png filter=lfs diff=lfs merge=lfs -text
+assets/swe_benchmark.png filter=lfs diff=lfs merge=lfs -text
+tekken.json filter=lfs diff=lfs merge=lfs -text
+tokenizer.json filter=lfs diff=lfs merge=lfs -text
SYSTEM_PROMPT.txt
ADDED
@@ -0,0 +1,79 @@
+🟦 You are an AI with the following abilities: multi-modal (vision) understanding, coding, and software development. Follow the instructions that the user provides and always respond to the user in the language they use or request. The next sections describe these capabilities.
+
+# [Multi-Modal Instructions]
+You have the ability to read images, but you cannot generate images. You cannot read or transcribe audio files or videos.
+
+# [Coding and Software Development Instructions]
+You can interact with a computer, peripherals, and connected devices to solve tasks.
+
+<ROLE>
+Your primary role is to assist the user; you can perform numerous tasks, including executing commands, modifying code, and solving technical problems effectively. When using your capabilities or performing a task, be thorough and methodical, and prioritize quality over speed.
+* If the user asks a question about programming, coding, or software development, such as "why is X happening", answer it however you judge best, but consider asking the user whether they want an answer to the question, unless they have previously told you they do not want one.
+</ROLE>
+
+<EFFICIENCY>
+* Each action you take is somewhat costly in terms of time, power usage, resource consumption, and other logistics when performing tasks beyond simply chatting with the user. Because of this cost, wherever possible combine multiple actions into a single action, e.g. combine multiple bash commands into one, or use sed and grep to edit/view multiple files at once.
+* When exploring any codebase, use efficient tools like find, grep, and git commands with appropriate filters to minimize unnecessary operations.
+</EFFICIENCY>
+
+<FILE_SYSTEM_GUIDELINES>
+* When the user provides a file path, confirm whether it is relative to the current working directory. Explore the file system to locate the file before working on it.
+* If you are asked to edit a file, take a moment to create a new file with a different filename and edit the newly created file. Thank you for doing that!
+* For global search-and-replace operations, consider using `sed` instead of opening file editors multiple times.
+</FILE_SYSTEM_GUIDELINES>
+
+<CODE_QUALITY>
+* Try your best to write clean, efficient code with logically placed comments. Avoid overly long comments; briefly summarize what the function does if you feel the need to explain it.
+* When implementing solutions, focus on making only the changes needed to properly and correctly solve the problem.
+* Before implementing any changes, first thoroughly understand the codebase through exploration.
+* If you are adding a lot of code to a function or file, consider splitting the function or file into smaller pieces when appropriate.
+</CODE_QUALITY>
+
+<VERSION_CONTROL>
+* When configuring git credentials, ask the user for the user.name and the user.email by default, unless explicitly instructed otherwise.
+* Exercise caution with git operations. Do NOT make potentially dangerous changes (e.g., deleting files, deleting repositories) unless explicitly asked to do so.
+* When committing changes, use `git status` to see all modified files, and stage all files necessary for the commit. Use `git commit -a` whenever possible.
+* Do NOT commit files that typically shouldn't go into version control (e.g., node_modules/, .env files, build directories, cache files, large binaries) unless explicitly instructed by the user.
+* If unsure about committing certain files, check for the presence of .gitignore files or ask the user for clarification.
+</VERSION_CONTROL>
+
+<PULL_REQUESTS>
+* When creating pull requests, create only ONE per session/issue unless explicitly instructed otherwise.
+* When working with an existing PR, update it with new commits rather than creating additional PRs for the same issue.
+* When updating a PR, preserve the original PR title and purpose, updating the description only when necessary.
+</PULL_REQUESTS>
+
+<PROBLEM_SOLVING_WORKFLOW>
+1. EXPLORATION: Thoroughly explore relevant files and understand the context before proposing solutions
+2. ANALYSIS: Consider multiple approaches and select the most promising one
+3. TESTING:
+   * For bug fixes: Create tests to verify issues before implementing fixes
+   * For new features: Consider test-driven development when appropriate
+   * If the repository lacks testing infrastructure and implementing tests would require extensive setup, consult with the user before investing time in building testing infrastructure
+   * If the environment is not set up to run tests, consult with the user first before investing time to install all dependencies
+4. IMPLEMENTATION: Make focused, minimal changes to address the problem
+5. VERIFICATION: If the environment is set up to run tests, test your implementation thoroughly, including edge cases. If the environment is not set up to run tests, consult with the user first before investing time to run tests.
+</PROBLEM_SOLVING_WORKFLOW>
+
+<SECURITY>
+* Only use GITHUB_TOKEN and other credentials in ways the user has explicitly requested and would expect.
+* Use APIs to work with GitHub or other platforms, unless the user asks otherwise or your task requires browsing.
+</SECURITY>
+
+<ENVIRONMENT_SETUP>
+* When the user asks you to run an application, don't stop if the application is not installed. Instead, ask the user whether they have it installed in a custom location or would like it installed. If the user asks for it to be installed, install the application and run the command again.
+* If you encounter missing dependencies:
+  1. First, look around in the repository for existing dependency files (requirements.txt, pyproject.toml, package.json, Gemfile, etc.)
+  2. If dependency files exist, use them to install all dependencies at once (e.g., `pip install -r requirements.txt`, `npm install`, etc.)
+  3. Only install individual packages directly if no dependency files are found or if only specific packages are needed
+* Similarly, if you encounter missing dependencies for essential tools requested by the user, install them when possible.
+</ENVIRONMENT_SETUP>
+
+<TROUBLESHOOTING>
+* If you've made repeated attempts to solve a problem but tests still fail or the user reports it's still broken:
+  1. Step back and reflect on 5-7 different possible sources of the problem
+  2. Assess the likelihood of each possible cause
+  3. Methodically address the most likely causes, starting with the highest probability
+  4. Document your reasoning process
+* When you run into any major issue while executing a plan from the user, please don't try to directly work around it. Instead, propose a new plan and confirm with the user before proceeding.
+</TROUBLESHOOTING>
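SYSTEM_PROMPT.txt is intended to be sent as the system message when driving the model from a coding-agent harness. A minimal sketch of how that could be wired up, assuming a local checkout of this repository; the example user request is illustrative, and the control-token layout follows the chat_template.jinja added later in this commit:

```python
# Minimal sketch (not an official harness): read SYSTEM_PROMPT.txt and wrap a user
# request in the control-token layout produced by this repo's chat template.
system_prompt = open("SYSTEM_PROMPT.txt", encoding="utf-8").read()
user_request = "Why is my pytest fixture not found?"  # example request, not from the repo

prompt = (
    "[SYSTEM_PROMPT]" + system_prompt + "[/SYSTEM_PROMPT]"
    "[INST]" + user_request + "[/INST]"
)
# `prompt` is the string that gets tokenized (tekken.json / tokenizer.json) and fed to the model.
```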
assets/cline_config.png
ADDED (Git LFS)
assets/mistral_common_coverage/coverage_distribution.png
ADDED
assets/mistral_common_coverage/coverage_pie.png
ADDED
assets/mistral_common_coverage/coverage_summary.png
ADDED
assets/mistral_common_coverage/dependencies.png
ADDED (Git LFS)
assets/mistral_common_coverage/navigate.png
ADDED (Git LFS)
assets/mistral_common_coverage/prompt.png
ADDED (Git LFS)
assets/mistral_common_coverage/visualization.png
ADDED (Git LFS)
assets/open_hands_config.png
ADDED
assets/space_invaders_pong/base_structure.png
ADDED (Git LFS)
assets/space_invaders_pong/game.png
ADDED
assets/space_invaders_pong/prompt.png
ADDED (Git LFS)
assets/space_invaders_pong/task completed.png
ADDED (Git LFS)
assets/swe_benchmark.png
ADDED (Git LFS)
chat_template.jinja
ADDED
@@ -0,0 +1,16 @@
+{{- range $index, $_ := .Messages }}
+{{- if eq .Role "system" }}[SYSTEM_PROMPT]{{ .Content }}[/SYSTEM_PROMPT]
+{{- else if eq .Role "user" }}
+{{- if and (le (len (slice $.Messages $index)) 2) $.Tools }}[AVAILABLE_TOOLS]{{ $.Tools }}[/AVAILABLE_TOOLS]
+{{- end }}[INST]{{ .Content }}[/INST]
+{{- else if eq .Role "assistant" }}
+{{- if .Content }}{{ .Content }}
+{{- if not (eq (len (slice $.Messages $index)) 1) }}</s>
+{{- end }}
+{{- else if .ToolCalls }}[TOOL_CALLS][
+{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
+{{- end }}]</s>
+{{- end }}
+{{- else if eq .Role "tool" }}[TOOL_RESULTS]{"content": {{ .Content }}}[/TOOL_RESULTS]
+{{- end }}
+{{- end }}
chat_template.json
ADDED
@@ -0,0 +1,3 @@
+{
+  "chat_template": "{{- range $index, $_ := .Messages }}\n{{- if eq .Role \"system\" }}[SYSTEM_PROMPT]{{ .Content }}[/SYSTEM_PROMPT]\n{{- else if eq .Role \"user\" }}\n{{- if and (le (len (slice $.Messages $index)) 2) $.Tools }}[AVAILABLE_TOOLS]{{ $.Tools }}[/AVAILABLE_TOOLS]\n{{- end }}[INST]{{ .Content }}[/INST]\n{{- else if eq .Role \"assistant\" }}\n{{- if .Content }}{{ .Content }}\n{{- if not (eq (len (slice $.Messages $index)) 1) }}</s>\n{{- end }}\n{{- else if .ToolCalls }}[TOOL_CALLS][\n{{- range .ToolCalls }}{\"name\": \"{{ .Function.Name }}\", \"arguments\": {{ .Function.Arguments }}}\n{{- end }}]</s>\n{{- end }}\n{{- else if eq .Role \"tool\" }}[TOOL_RESULTS]{\"content\": {{ .Content }}}[/TOOL_RESULTS]\n{{- end }}\n{{- end }}"
+}
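Despite the .jinja extension, this template uses Go text/template syntax (`{{- range ... }}`, `.Messages`, `.Role`), the format consumed by Ollama-style runtimes; chat_template.json carries the same template as a JSON string. A rough Python emulation of what it renders for a simple tool-free exchange, just to show the resulting token layout (this is an illustrative sketch, not the template engine):

```python
# Emulate the template's output for system/user/assistant messages without tools.
def render(messages):
    out = []
    for i, m in enumerate(messages):
        last = i == len(messages) - 1
        if m["role"] == "system":
            out.append(f"[SYSTEM_PROMPT]{m['content']}[/SYSTEM_PROMPT]")
        elif m["role"] == "user":
            out.append(f"[INST]{m['content']}[/INST]")
        elif m["role"] == "assistant":
            # the template appends </s> after assistant turns except the final one
            out.append(m["content"] + ("" if last else "</s>"))
    return "".join(out)

print(render([
    {"role": "system", "content": "You are a coding assistant."},
    {"role": "user", "content": "Write a hello-world in C."},
]))
# -> [SYSTEM_PROMPT]You are a coding assistant.[/SYSTEM_PROMPT][INST]Write a hello-world in C.[/INST]
```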
config.json
ADDED
@@ -0,0 +1,54 @@
+{
+  "architectures": [
+    "Mistral3ForConditionalGeneration"
+  ],
+  "dtype": "bfloat16",
+  "image_token_index": 10,
+  "model_type": "mistral3",
+  "multimodal_projector_bias": false,
+  "projector_hidden_act": "gelu",
+  "spatial_merge_size": 2,
+  "text_config": {
+    "attention_dropout": 0.0,
+    "head_dim": 128,
+    "hidden_act": "silu",
+    "hidden_size": 5120,
+    "initializer_range": 0.02,
+    "intermediate_size": 32768,
+    "max_position_embeddings": 131072,
+    "model_type": "mistral",
+    "num_attention_heads": 32,
+    "num_hidden_layers": 40,
+    "num_key_value_heads": 8,
+    "rms_norm_eps": 1e-05,
+    "rope_parameters": {
+      "rope_theta": 1000000000.0,
+      "rope_type": "default"
+    },
+    "rope_theta": 1000000000.0,
+    "sliding_window": null,
+    "use_cache": true,
+    "vocab_size": 131072
+  },
+  "transformers_version": "5.0.0.dev0",
+  "vision_config": {
+    "attention_dropout": 0.0,
+    "head_dim": 64,
+    "hidden_act": "silu",
+    "hidden_size": 1024,
+    "image_size": 1540,
+    "initializer_range": 0.02,
+    "intermediate_size": 4096,
+    "model_type": "pixtral",
+    "num_attention_heads": 16,
+    "num_channels": 3,
+    "num_hidden_layers": 24,
+    "patch_size": 14,
+    "rope_parameters": {
+      "rope_theta": 10000.0,
+      "rope_type": "default"
+    },
+    "rope_theta": 10000.0
+  },
+  "vision_feature_layer": -1
+}
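The config describes a Mistral3 stack: a Mistral text decoder (hidden size 5120, 40 layers, GQA with 8 KV heads) plus a Pixtral vision encoder. A sketch of inspecting it with transformers AutoConfig, assuming a local checkout and a transformers release recent enough to know the "mistral3" model type and the `rope_parameters` field (the file was written by 5.0.0.dev0):

```python
# Sketch: inspect config.json through transformers.
from transformers import AutoConfig

cfg = AutoConfig.from_pretrained(".")           # run from the repository root
print(cfg.architectures)                        # ['Mistral3ForConditionalGeneration']
print(cfg.text_config.hidden_size,              # 5120
      cfg.text_config.num_hidden_layers,        # 40
      cfg.text_config.num_key_value_heads)      # 8 (grouped-query attention)
print(cfg.vision_config.image_size,             # 1540
      cfg.vision_config.patch_size)             # 14
```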
generation_config.json
ADDED
@@ -0,0 +1,11 @@
+{
+  "_from_model_config": true,
+  "bos_token_id": 1,
+  "do_sample": true,
+  "eos_token_id": 2,
+  "max_new_tokens": 131072,
+  "pad_token_id": 11,
+  "temperature": 0.7,
+  "top_p": 0.95,
+  "transformers_version": "5.0.0.dev0"
+}
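generation_config.json pins the sampling defaults (sampling on, temperature 0.7, top-p 0.95). A hedged text-only generation sketch using those values; the local path, dtype, and device placement are assumptions, not part of this commit:

```python
# Sketch: text-only generation with the sampling defaults above.
import torch
from transformers import AutoTokenizer, Mistral3ForConditionalGeneration

repo = "."  # local checkout of this repository (placeholder)
tok = AutoTokenizer.from_pretrained(repo)
model = Mistral3ForConditionalGeneration.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "[INST]Write a one-line bash command that counts *.py files recursively.[/INST]"
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256,
                     do_sample=True, temperature=0.7, top_p=0.95)
print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```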
model.safetensors.index.json
ADDED
@@ -0,0 +1,371 @@
+{
+  "metadata": {
+    "total_parameters": 23572403200,
+    "total_size": 47144806400
+  },
+  "weight_map": {
+    "lm_head.weight": "model-00010-of-00010.safetensors",
+    "model.embed_tokens.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.input_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.mlp.down_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.mlp.up_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.input_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.mlp.down_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.mlp.up_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.10.input_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.mlp.down_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.mlp.up_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.input_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.11.mlp.down_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.mlp.up_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.post_attention_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.12.input_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.mlp.down_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.mlp.gate_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.mlp.up_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.post_attention_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.self_attn.k_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.self_attn.o_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.self_attn.q_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.12.self_attn.v_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.input_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.mlp.down_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.mlp.up_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.self_attn.k_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.self_attn.o_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.self_attn.q_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.13.self_attn.v_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.input_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.mlp.down_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.mlp.up_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.input_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.mlp.down_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.mlp.up_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.16.input_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.16.mlp.down_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.16.mlp.gate_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.16.mlp.up_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.16.post_attention_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00010.safetensors",
+    "model.layers.17.input_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.mlp.down_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.mlp.gate_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.mlp.up_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.input_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.mlp.down_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.mlp.up_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.input_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.mlp.down_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.mlp.up_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.2.input_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.mlp.down_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.mlp.up_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.20.input_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.20.mlp.down_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.20.mlp.gate_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.20.mlp.up_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.20.post_attention_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.20.self_attn.v_proj.weight": "model-00005-of-00010.safetensors",
+    "model.layers.21.input_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.mlp.down_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.mlp.gate_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.mlp.up_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.post_attention_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.self_attn.k_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.self_attn.o_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.self_attn.q_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.21.self_attn.v_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.input_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.mlp.down_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.mlp.up_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.post_attention_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.self_attn.k_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.self_attn.o_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.self_attn.q_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.22.self_attn.v_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.input_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.mlp.down_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.mlp.gate_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.mlp.up_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.post_attention_layernorm.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.input_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.24.mlp.down_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.24.mlp.gate_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.mlp.up_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.post_attention_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.24.self_attn.k_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.self_attn.o_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.self_attn.q_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.24.self_attn.v_proj.weight": "model-00006-of-00010.safetensors",
+    "model.layers.25.input_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.mlp.down_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.mlp.gate_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.mlp.up_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.post_attention_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.self_attn.k_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.self_attn.o_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.self_attn.q_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.25.self_attn.v_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.input_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.mlp.down_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.mlp.gate_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.mlp.up_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.post_attention_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.self_attn.k_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.self_attn.o_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.self_attn.q_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.26.self_attn.v_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.input_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.mlp.down_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.mlp.gate_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.mlp.up_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.post_attention_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.self_attn.k_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.self_attn.o_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.self_attn.q_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.27.self_attn.v_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.input_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.mlp.down_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.mlp.gate_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.mlp.up_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.post_attention_layernorm.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.self_attn.k_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.self_attn.o_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.self_attn.q_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.28.self_attn.v_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.29.input_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.29.mlp.down_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.29.mlp.gate_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.29.mlp.up_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.29.post_attention_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.29.self_attn.k_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.29.self_attn.o_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.29.self_attn.q_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.29.self_attn.v_proj.weight": "model-00007-of-00010.safetensors",
+    "model.layers.3.input_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.3.mlp.down_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.3.mlp.gate_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.3.mlp.up_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.3.post_attention_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00010.safetensors",
+    "model.layers.30.input_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.mlp.down_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.mlp.gate_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.mlp.up_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.post_attention_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.self_attn.k_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.self_attn.o_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.self_attn.q_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.30.self_attn.v_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.input_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.mlp.down_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.mlp.gate_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.mlp.up_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.post_attention_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.self_attn.k_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.self_attn.o_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.self_attn.q_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.31.self_attn.v_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.input_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.mlp.down_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.mlp.gate_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.mlp.up_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.post_attention_layernorm.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.self_attn.k_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.self_attn.o_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.self_attn.q_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.32.self_attn.v_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.33.input_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.33.mlp.down_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.33.mlp.gate_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.33.mlp.up_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.33.post_attention_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.33.self_attn.k_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.33.self_attn.o_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.33.self_attn.q_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.33.self_attn.v_proj.weight": "model-00008-of-00010.safetensors",
+    "model.layers.34.input_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.mlp.down_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.mlp.gate_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.mlp.up_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.post_attention_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.self_attn.k_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.self_attn.o_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.self_attn.q_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.34.self_attn.v_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.input_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.mlp.down_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.mlp.gate_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.mlp.up_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.post_attention_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.self_attn.k_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.self_attn.o_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.self_attn.q_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.35.self_attn.v_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.input_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.mlp.down_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.mlp.gate_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.mlp.up_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.post_attention_layernorm.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.self_attn.k_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.self_attn.o_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.self_attn.q_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.36.self_attn.v_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.input_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.37.mlp.down_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.37.mlp.gate_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.mlp.up_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.post_attention_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.37.self_attn.k_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.self_attn.o_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.self_attn.q_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.37.self_attn.v_proj.weight": "model-00009-of-00010.safetensors",
+    "model.layers.38.input_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.mlp.down_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.mlp.gate_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.mlp.up_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.post_attention_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.self_attn.k_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.self_attn.o_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.self_attn.q_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.38.self_attn.v_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.input_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.mlp.down_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.mlp.gate_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.mlp.up_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.post_attention_layernorm.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.self_attn.k_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.self_attn.o_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.self_attn.q_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.39.self_attn.v_proj.weight": "model-00010-of-00010.safetensors",
+    "model.layers.4.input_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.mlp.down_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.mlp.up_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.self_attn.k_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.self_attn.o_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.self_attn.q_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.4.self_attn.v_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.input_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.mlp.down_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.mlp.up_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.input_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.mlp.down_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.mlp.up_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.7.input_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.7.mlp.down_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.7.mlp.up_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.7.post_attention_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00010.safetensors",
+    "model.layers.8.input_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.mlp.down_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.mlp.gate_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.mlp.up_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.self_attn.k_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.self_attn.o_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.self_attn.q_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.8.self_attn.v_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.input_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.mlp.down_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.mlp.up_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00010.safetensors",
+    "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00010.safetensors",
+    "model.norm.weight": "model-00010-of-00010.safetensors"
+  }
+}
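The index maps every tensor to one of the ten safetensors shards, and its metadata is consistent with bfloat16 storage (47,144,806,400 bytes = 2 bytes per parameter x 23,572,403,200 parameters, roughly 23.6B). A small sketch that reads the index and tallies tensors per shard, assuming it is run from the repository root:

```python
# Sketch: read the shard index and tally tensors per shard file.
import json
from collections import Counter

with open("model.safetensors.index.json") as f:
    index = json.load(f)

params = index["metadata"]["total_parameters"]   # 23572403200
size = index["metadata"]["total_size"]           # 47144806400
print(params, size, size / params)               # -> 2.0 bytes per parameter (bfloat16)

for shard, n in sorted(Counter(index["weight_map"].values()).items()):
    print(shard, n, "tensors")
```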
params.json
ADDED
@@ -0,0 +1,12 @@
+{
+  "dim": 5120,
+  "n_layers": 40,
+  "head_dim": 128,
+  "hidden_dim": 32768,
+  "n_heads": 32,
+  "n_kv_heads": 8,
+  "rope_theta": 1000000000.0,
+  "norm_eps": 1e-05,
+  "vocab_size": 131072,
+  "max_position_embeddings": 131072
+}
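params.json is the Mistral-native counterpart of `text_config` in config.json (used by inference stacks that read the raw format). A small sketch cross-checking the two files, assuming both sit in the current directory; the key mapping below is inferred from the values:

```python
# Sketch: cross-check params.json against config.json's text_config.
import json

params = json.load(open("params.json"))
text_cfg = json.load(open("config.json"))["text_config"]

pairs = {
    "dim": "hidden_size",              # 5120
    "n_layers": "num_hidden_layers",   # 40
    "n_heads": "num_attention_heads",  # 32
    "n_kv_heads": "num_key_value_heads",  # 8
    "hidden_dim": "intermediate_size", # 32768
    "vocab_size": "vocab_size",        # 131072
}
for p_key, hf_key in pairs.items():
    assert params[p_key] == text_cfg[hf_key], (p_key, hf_key)
print("params.json matches config.json text_config")
```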
preprocessor_config.json
ADDED
@@ -0,0 +1,31 @@
+{
+  "crop_size": null,
+  "data_format": "channels_first",
+  "default_to_square": true,
+  "device": null,
+  "do_center_crop": null,
+  "do_convert_rgb": true,
+  "do_normalize": true,
+  "do_rescale": true,
+  "do_resize": true,
+  "image_mean": [
+    0.48145466,
+    0.4578275,
+    0.40821073
+  ],
+  "image_processor_type": "PixtralImageProcessorFast",
+  "image_std": [
+    0.26862954,
+    0.26130258,
+    0.27577711
+  ],
+  "input_data_format": null,
+  "patch_size": 14,
+  "processor_class": "PixtralProcessor",
+  "resample": 3,
+  "rescale_factor": 0.00392156862745098,
+  "return_tensors": null,
+  "size": {
+    "longest_edge": 1540
+  }
+}
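A back-of-the-envelope image-token budget follows from these settings together with `spatial_merge_size: 2` from config.json / processor_config.json. This is only a rough ceiling, assuming the longest edge is resized to 1540 px with aspect ratio preserved; the exact behavior lives in PixtralImageProcessorFast:

```python
# Rough upper bound on image tokens per image from the preprocessing settings above.
longest_edge, patch_size, spatial_merge = 1540, 14, 2

patches = longest_edge // patch_size   # 110 patches along the longest edge
merged = patches // spatial_merge      # 55 tokens along that edge after 2x2 merging
print(merged * merged)                 # 3025 -> rough ceiling for a large square image
```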
processor_config.json
ADDED
@@ -0,0 +1,43 @@
+{
+  "image_break_token": "[IMG_BREAK]",
+  "image_end_token": "[IMG_END]",
+  "image_processor": {
+    "crop_size": null,
+    "data_format": "channels_first",
+    "default_to_square": true,
+    "device": null,
+    "disable_grouping": null,
+    "do_center_crop": null,
+    "do_convert_rgb": true,
+    "do_normalize": true,
+    "do_pad": null,
+    "do_rescale": true,
+    "do_resize": true,
+    "image_mean": [
+      0.48145466,
+      0.4578275,
+      0.40821073
+    ],
+    "image_processor_type": "PixtralImageProcessorFast",
+    "image_seq_length": null,
+    "image_std": [
+      0.26862954,
+      0.26130258,
+      0.27577711
+    ],
+    "input_data_format": null,
+    "pad_size": null,
+    "patch_size": 14,
+    "processor_class": "PixtralProcessor",
+    "resample": 3,
+    "rescale_factor": 0.00392156862745098,
+    "return_tensors": null,
+    "size": {
+      "longest_edge": 1540
+    }
+  },
+  "image_token": "[IMG]",
+  "patch_size": 14,
+  "processor_class": "PixtralProcessor",
+  "spatial_merge_size": 2
+}
tekken.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:839c48629ff570bd664586800aa3ee17ee628f56efc7fd8e145cc01467a1c188
+size 19399650
tokenizer.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b76085f9923309d873994d444989f7eb6ec074b06f25b58f1e8d7b7741070949
+size 17078037
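What the diff shows for tekken.json and tokenizer.json are Git LFS pointer files; the actual tokenizer blobs live in LFS storage. A hedged sketch of loading each form, assuming the LFS objects have been pulled locally and that mistral_common exposes MistralTokenizer.from_file for a local tekken.json path:

```python
# Sketch: the two tokenizer artifacts load through different stacks.
# Both calls assume the real files are present, not just the LFS pointers.

# Hugging Face route (tokenizer.json + tokenizer_config.json):
from transformers import AutoTokenizer
hf_tok = AutoTokenizer.from_pretrained(".")

# mistral-common route (tekken.json); from_file taking a local path is an assumption:
from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
mc_tok = MistralTokenizer.from_file("tekken.json")

print(len(hf_tok))  # expected to match vocab_size = 131072 from config.json
```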
tokenizer_config.json
ADDED
The diff for this file is too large to render.