aiqwen lovis93 committed on
Commit 72ffdcb · verified · 0 Parent(s)

Duplicate from lovis93/next-scene-qwen-image-lora-2509

Co-authored-by: ldf <[email protected]>

.gitattributes ADDED
*.7z filter=lfs diff=lfs merge=lfs -text
*.arrow filter=lfs diff=lfs merge=lfs -text
*.bin filter=lfs diff=lfs merge=lfs -text
*.bz2 filter=lfs diff=lfs merge=lfs -text
*.ckpt filter=lfs diff=lfs merge=lfs -text
*.ftz filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.h5 filter=lfs diff=lfs merge=lfs -text
*.joblib filter=lfs diff=lfs merge=lfs -text
*.lfs.* filter=lfs diff=lfs merge=lfs -text
*.mlmodel filter=lfs diff=lfs merge=lfs -text
*.model filter=lfs diff=lfs merge=lfs -text
*.msgpack filter=lfs diff=lfs merge=lfs -text
*.npy filter=lfs diff=lfs merge=lfs -text
*.npz filter=lfs diff=lfs merge=lfs -text
*.onnx filter=lfs diff=lfs merge=lfs -text
*.ot filter=lfs diff=lfs merge=lfs -text
*.parquet filter=lfs diff=lfs merge=lfs -text
*.pb filter=lfs diff=lfs merge=lfs -text
*.pickle filter=lfs diff=lfs merge=lfs -text
*.pkl filter=lfs diff=lfs merge=lfs -text
*.pt filter=lfs diff=lfs merge=lfs -text
*.pth filter=lfs diff=lfs merge=lfs -text
*.rar filter=lfs diff=lfs merge=lfs -text
*.safetensors filter=lfs diff=lfs merge=lfs -text
saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.tar.* filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.tflite filter=lfs diff=lfs merge=lfs -text
*.tgz filter=lfs diff=lfs merge=lfs -text
*.wasm filter=lfs diff=lfs merge=lfs -text
*.xz filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
01.gif filter=lfs diff=lfs merge=lfs -text
02.gif filter=lfs diff=lfs merge=lfs -text
03.gif filter=lfs diff=lfs merge=lfs -text
*.gif filter=lfs diff=lfs merge=lfs -text
01-update.gif ADDED

Git LFS Details

  • SHA256: d2233ffc5bfbd311900123484035b0de45627e16e7fc6e932c9463e465dd0f41
  • Pointer size: 132 Bytes
  • Size of remote file: 2.95 MB
01.gif ADDED

Git LFS Details

  • SHA256: a74a354c482156e1a9d5fbaf4f6f673aa8eabd3096dc14a06fc2b78f7bb6f6b4
  • Pointer size: 132 Bytes
  • Size of remote file: 4.3 MB
02-update.gif ADDED

Git LFS Details

  • SHA256: 72b70e54e31c3b44a5292792d0fd8eac487e8579fd843420b49ee0e0911d2a3f
  • Pointer size: 132 Bytes
  • Size of remote file: 2.71 MB
02.gif ADDED

Git LFS Details

  • SHA256: 7dd18e10e0fdfbf7e58450949c014a84a21a08ac6ebfbea1c4c4cfe03ad721a0
  • Pointer size: 132 Bytes
  • Size of remote file: 5.02 MB
03-update.gif ADDED

Git LFS Details

  • SHA256: b17e8c967ca32c6e61542eb180e9ff7e2b91546ca2902f19bbe7f11186ed45cc
  • Pointer size: 132 Bytes
  • Size of remote file: 2.82 MB
03.gif ADDED

Git LFS Details

  • SHA256: 8a7d8c414b50559097f80af7420ba5eb71c41da0c1458de7ac8b174756de245d
  • Pointer size: 132 Bytes
  • Size of remote file: 7.76 MB
README.md ADDED
---
license: mit
language: en
base_model:
- Qwen/Qwen-Image-Edit-2509
pipeline_tag: image-to-image
tags:
- lora
- cinematic
- comfyui
- qwen
- image-editing
- next-scene
- ai-video
- diffusers
---

# 🎥 next-scene-qwen-image-lora-2509

---

## 🎉 ✨ **UPDATE - Version 2 Now Available! (21 Oct 2025)** ✨ 🎉

🚀 **New Model:** `next-scene_lora-v2-3000.safetensors`

**What's New in V2:**
- 🎯 **Trained on higher-quality data** for significantly improved results
- 💪 **Better prompt adherence** - the model follows your instructions more accurately
- 🖼️ **Fixed black-bar artifacts** - no more unwanted black borders on generated images
- ⚡ **Enhanced overall performance** - smoother transitions and better cinematic flow

**Recommended:** Use V2 for all new projects.

**📥 ComfyUI Workflow:** [workflow-comfyui-basic-next-scene-v2.json](workflow-comfyui-basic-next-scene-v2.json)

### V2 Demo Examples:

![Demo 1 V2](01-update.gif) ![Demo 2 V2](02-update.gif) ![Demo 3 V2](03-update.gif)

---

**next-scene-qwen-image-lora-2509** is a **LoRA adapter fine-tuned on Qwen-Image-Edit (build 2509)**, purpose-built to generate cinematic image sequences with natural visual progression from frame to frame.

This model teaches Qwen Image Edit to think like a film director: it understands camera dynamics, visual composition, and narrative continuity, and creates shots that flow seamlessly into one another.

---

## 📦 Version 1 (Legacy)

**Model File:** `next-scene_lora_v1-3000.safetensors`
**ComfyUI Workflow:** [workflow-comfyui-basic-next-scene.json](workflow-comfyui-basic-next-scene.json)

### V1 Demo Examples:

![Demo 1 V1](01.gif) ![Demo 2 V1](02.gif) ![Demo 3 V1](03.gif)

---

## 🧠 What This Model Does

This LoRA brings **cinematic storytelling continuity** into AI image generation workflows.

Each output frame functions as the *"Next Scene"* in an evolving visual narrative, maintaining compositional coherence while introducing organic transitions such as:

- **Camera movement:** Dolly shots, push-ins, pull-backs, and tracking moves
- **Framing evolution:** Wide-to-close-up transitions, angle shifts, reframing
- **Environmental reveals:** New characters entering the frame, expanded scenery, spatial progression
- **Atmospheric shifts:** Lighting changes, weather evolution, time-of-day transitions

### Examples of Cinematic Logic:

- *"Next Scene: The camera pulls back from a tight close-up on the airship to a sweeping aerial view, revealing an entire fleet of vessels soaring through a fantasy landscape."*

- *"Next Scene: The camera tracks forward and tilts down, bringing the sun and helicopters closer into frame as a strong lens flare intensifies."*

- *"Next Scene: The camera pans right, removing the dragon and rider from view while revealing more of the floating mountain range in the distance."*

---

## ⚙️ Usage Instructions

### Basic Setup:

1. Load **Qwen-Image-Edit 2509** as your base model
2. Add a **LoRA Loader** node and select:
   - **V2 (Recommended):** `next-scene_lora-v2-3000.safetensors`
   - **V1 (Legacy):** `next-scene_lora_v1-3000.safetensors`
3. Set the LoRA strength to **0.7–0.8** (recommended)
4. Prefix your prompts with **"Next Scene:"** for optimal results
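
For scripted pipelines, the prompt shape recommended in step 4 ("Next Scene:" prefix, camera direction first, then the scene, then a style tag) can be captured in a tiny helper. This is an illustrative sketch only; the helper and its fields are hypothetical, not part of the model's API:

```python
def next_scene_prompt(camera: str, scene: str,
                      style: str = "realistic cinematic style") -> str:
    """Build a prompt in the recommended shape:
    'Next Scene:' prefix, camera direction first, then the scene, then style."""
    return f"Next Scene: {camera}, {scene}. {style.capitalize()}."

# Example: a single storyboard step.
prompt = next_scene_prompt(
    "The camera pushes in slowly",
    "as fog rolls over the harbor at dawn",
)
```

Keeping the camera direction at the front of the string mirrors the continuity tip in Pro Tips.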

### Example Prompt:

```
Next Scene: The camera moves slightly forward as sunlight breaks through the clouds, casting a soft glow around the character's silhouette in the mist. Realistic cinematic style, atmospheric depth.
```

### Pro Tips:

- Begin prompts with camera direction for stronger continuity
- Specify lighting and atmospheric changes for mood consistency
- Chain multiple generations to create sequential storyboards
- Works particularly well with landscape and establishing shots
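
The storyboard-chaining tip amounts to a loop that feeds each generated frame back in as the source image for the next prompt. A minimal sketch, where `generate` is a stand-in for whatever ComfyUI or diffusers call actually produces the edit (the stub below only records lineage for illustration):

```python
from typing import Callable, List

def generate_storyboard(first_frame, prompts: List[str],
                        generate: Callable) -> list:
    """Chain generations: each output frame becomes the input for the next prompt."""
    frames = [first_frame]
    for prompt in prompts:
        frames.append(generate(frames[-1], prompt))
    return frames

# Stub generator standing in for the real model call:
def stub(image, prompt):
    return f"{image} -> [{prompt}]"

board = generate_storyboard(
    "frame0",
    ["Next Scene: the camera pulls back", "Next Scene: the camera pans right"],
    stub,
)
```

Swapping the stub for a real generation call yields a sequential storyboard in which every frame inherits the composition of the previous one.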

---

## 🎬 Design Philosophy

Trained on an extensive, curated cinematic dataset (proprietary), this model has learned to *think directionally* rather than just visually.

It doesn't simply modify an image: it **advances the story**, preserving spatial relationships, lighting consistency, and emotional resonance across sequential frames.

### Ideal Applications:

- **Storyboard generation** for film and animation pre-production
- **Cinematic AI video pipelines** requiring frame-to-frame coherence
- **Sequential narrative workflows** in ComfyUI and similar tools
- **Concept art evolution** showing scene progression
- **Visual storytelling** for creative projects and presentations

---

## ⚠️ Important Limitations

- **Not optimized for:** Static portraits, single-image illustration tasks, or non-sequential edits
- **Best suited for:** Multi-frame workflows with narrative progression
- **Design priority:** Storytelling flow and continuity over isolated image perfection
- **Recommended use case:** Scene-to-scene transitions rather than detailed object manipulation

---

## 🧱 Technical Specifications

- **Base Model:** Qwen-Image-Edit (build 2509)
- **Architecture:** Low-Rank Adaptation (LoRA)
- **Training Objective:** Scene continuity and cinematic shot coherence
- **Dataset:** Large-scale proprietary cinematic imagery
- **Recommended Strength:** 0.7–0.8
- **Compatible Platforms:** ComfyUI, Automatic1111 (with Qwen support), custom pipelines
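
For intuition about the strength setting: applying a LoRA adds a scaled low-rank update to each adapted base weight matrix, roughly W' = W + s · (B @ A); real implementations typically also fold in an alpha/rank scaling factor. A minimal numpy sketch with illustrative shapes (not this adapter's real dimensions):

```python
import numpy as np

def apply_lora(W: np.ndarray, A: np.ndarray, B: np.ndarray,
               strength: float = 0.75) -> np.ndarray:
    """Merge a LoRA update into a base weight: W' = W + strength * (B @ A)."""
    return W + strength * (B @ A)

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64))   # base weight matrix
A = rng.standard_normal((8, 64))    # LoRA down-projection (rank 8)
B = rng.standard_normal((64, 8))    # LoRA up-projection
W_merged = apply_lora(W, A, B, strength=0.75)
```

At strength 0 the base model is unchanged; 0.7–0.8 blends in most of the learned "next scene" behavior while leaving the base model's general editing ability intact.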

---

## 📄 License

**MIT License**: free for research, educational, and creative use.

Commercial applications require independent testing and proper attribution. See the LICENSE file for full terms.

---

## 🌐 Creator

Developed by **[@lovis93](https://huggingface.co/lovis93)**

Pushing the boundaries of AI-directed visual storytelling and cinematic image generation.

---

## 🐦 Share This Model

🎥 Introducing **next-scene-qwen-image-lora-2509**

A LoRA fine-tuned for **Qwen-Image-Edit 2509** that thinks like a film director.

It evolves each frame naturally: new angles, new lighting, same coherent world.

Perfect for cinematic storyboards, sequential edits, and "Next Scene" workflows.

👉 https://huggingface.co/lovis93/next-scene-qwen-image-lora-2509

#AIart #ComfyUI #Qwen #LoRA #GenerativeAI #AIcinema #ImageEditing
next-scene_lora-v2-3000.safetensors ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:8abf2306649a62282f89968515097650acaf7b040687c9b6c56e37558e6df9e1
size 295146176
next-scene_lora_v1-3000.safetensors ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:c31f5e95e1934a34079f6c4c22842299ff28bcc5d2798639df52f5c03de724dd
size 295146184
workflow-comfyui-basic-next-scene-v2.json ADDED
@@ -0,0 +1,1762 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "id": "908d0bfb-e192-4627-9b57-147496e6e2dd",
3
+ "revision": 0,
4
+ "last_node_id": 82,
5
+ "last_link_id": 103,
6
+ "nodes": [
7
+ {
8
+ "id": 40,
9
+ "type": "DualCLIPLoader",
10
+ "pos": [
11
+ -1054.8211950403152,
12
+ 313.2223343219331
13
+ ],
14
+ "size": [
15
+ 270,
16
+ 130
17
+ ],
18
+ "flags": {},
19
+ "order": 0,
20
+ "mode": 0,
21
+ "inputs": [],
22
+ "outputs": [
23
+ {
24
+ "name": "CLIP",
25
+ "type": "CLIP",
26
+ "links": [
27
+ 64
28
+ ]
29
+ }
30
+ ],
31
+ "properties": {
32
+ "cnr_id": "comfy-core",
33
+ "ver": "0.3.40",
34
+ "Node name for S&R": "DualCLIPLoader",
35
+ "models": [
36
+ {
37
+ "name": "clip_l.safetensors",
38
+ "url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/clip_l.safetensors",
39
+ "directory": "text_encoders"
40
+ },
41
+ {
42
+ "name": "t5xxl_fp16.safetensors",
43
+ "url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp16.safetensors",
44
+ "directory": "text_encoders"
45
+ }
46
+ ]
47
+ },
48
+ "widgets_values": [
49
+ "clip_l.safetensors",
50
+ "t5xxl_fp16.safetensors",
51
+ "flux",
52
+ "default"
53
+ ]
54
+ },
55
+ {
56
+ "id": 39,
57
+ "type": "VAELoader",
58
+ "pos": [
59
+ -1054.8211950403152,
60
+ 493.2223343219331
61
+ ],
62
+ "size": [
63
+ 270,
64
+ 58
65
+ ],
66
+ "flags": {},
67
+ "order": 1,
68
+ "mode": 0,
69
+ "inputs": [],
70
+ "outputs": [
71
+ {
72
+ "name": "VAE",
73
+ "type": "VAE",
74
+ "links": [
75
+ 58
76
+ ]
77
+ }
78
+ ],
79
+ "properties": {
80
+ "cnr_id": "comfy-core",
81
+ "ver": "0.3.40",
82
+ "Node name for S&R": "VAELoader",
83
+ "models": [
84
+ {
85
+ "name": "ae.safetensors",
86
+ "url": "https://huggingface.co/Comfy-Org/Lumina_Image_2.0_Repackaged/resolve/main/split_files/vae/ae.safetensors",
87
+ "directory": "vae"
88
+ }
89
+ ]
90
+ },
91
+ "widgets_values": [
92
+ "ae.safetensors"
93
+ ]
94
+ },
95
+ {
96
+ "id": 52,
97
+ "type": "CFGNorm",
98
+ "pos": [
99
+ 2230,
100
+ 230
101
+ ],
102
+ "size": [
103
+ 290,
104
+ 60
105
+ ],
106
+ "flags": {},
107
+ "order": 21,
108
+ "mode": 0,
109
+ "inputs": [
110
+ {
111
+ "name": "model",
112
+ "type": "MODEL",
113
+ "link": 71
114
+ }
115
+ ],
116
+ "outputs": [
117
+ {
118
+ "name": "patched_model",
119
+ "type": "MODEL",
120
+ "links": [
121
+ 73
122
+ ]
123
+ }
124
+ ],
125
+ "properties": {
126
+ "cnr_id": "comfy-core",
127
+ "ver": "0.3.50",
128
+ "Node name for S&R": "CFGNorm",
129
+ "enableTabs": false,
130
+ "tabWidth": 65,
131
+ "tabXOffset": 10,
132
+ "hasSecondTab": false,
133
+ "secondTabText": "Send Back",
134
+ "secondTabOffset": 80,
135
+ "secondTabWidth": 65,
136
+ "ue_properties": {
137
+ "widget_ue_connectable": {
138
+ "strength": true
139
+ }
140
+ }
141
+ },
142
+ "widgets_values": [
143
+ 1
144
+ ]
145
+ },
146
+ {
147
+ "id": 55,
148
+ "type": "ModelSamplingAuraFlow",
149
+ "pos": [
150
+ 2230,
151
+ 120
152
+ ],
153
+ "size": [
154
+ 290,
155
+ 60
156
+ ],
157
+ "flags": {},
158
+ "order": 19,
159
+ "mode": 0,
160
+ "inputs": [
161
+ {
162
+ "name": "model",
163
+ "type": "MODEL",
164
+ "link": 95
165
+ }
166
+ ],
167
+ "outputs": [
168
+ {
169
+ "name": "MODEL",
170
+ "type": "MODEL",
171
+ "links": [
172
+ 71
173
+ ]
174
+ }
175
+ ],
176
+ "properties": {
177
+ "cnr_id": "comfy-core",
178
+ "ver": "0.3.48",
179
+ "Node name for S&R": "ModelSamplingAuraFlow",
180
+ "enableTabs": false,
181
+ "tabWidth": 65,
182
+ "tabXOffset": 10,
183
+ "hasSecondTab": false,
184
+ "secondTabText": "Send Back",
185
+ "secondTabOffset": 80,
186
+ "secondTabWidth": 65,
187
+ "widget_ue_connectable": {}
188
+ },
189
+ "widgets_values": [
190
+ 3
191
+ ]
192
+ },
193
+ {
194
+ "id": 59,
195
+ "type": "EmptySD3LatentImage",
196
+ "pos": [
197
+ 2240,
198
+ 1110
199
+ ],
200
+ "size": [
201
+ 270,
202
+ 106
203
+ ],
204
+ "flags": {},
205
+ "order": 2,
206
+ "mode": 0,
207
+ "inputs": [],
208
+ "outputs": [
209
+ {
210
+ "name": "LATENT",
211
+ "type": "LATENT",
212
+ "links": []
213
+ }
214
+ ],
215
+ "properties": {
216
+ "cnr_id": "comfy-core",
217
+ "ver": "0.3.59",
218
+ "Node name for S&R": "EmptySD3LatentImage"
219
+ },
220
+ "widgets_values": [
221
+ 1024,
222
+ 1024,
223
+ 1
224
+ ]
225
+ },
226
+ {
227
+ "id": 60,
228
+ "type": "VAEEncode",
229
+ "pos": [
230
+ 1980,
231
+ 1150
232
+ ],
233
+ "size": [
234
+ 140,
235
+ 46
236
+ ],
237
+ "flags": {},
238
+ "order": 24,
239
+ "mode": 0,
240
+ "inputs": [
241
+ {
242
+ "name": "pixels",
243
+ "type": "IMAGE",
244
+ "link": 78
245
+ },
246
+ {
247
+ "name": "vae",
248
+ "type": "VAE",
249
+ "link": 79
250
+ }
251
+ ],
252
+ "outputs": [
253
+ {
254
+ "name": "LATENT",
255
+ "type": "LATENT",
256
+ "links": [
257
+ 76
258
+ ]
259
+ }
260
+ ],
261
+ "properties": {
262
+ "cnr_id": "comfy-core",
263
+ "ver": "0.3.50",
264
+ "Node name for S&R": "VAEEncode",
265
+ "enableTabs": false,
266
+ "tabWidth": 65,
267
+ "tabXOffset": 10,
268
+ "hasSecondTab": false,
269
+ "secondTabText": "Send Back",
270
+ "secondTabOffset": 80,
271
+ "secondTabWidth": 65,
272
+ "ue_properties": {
273
+ "widget_ue_connectable": {}
274
+ }
275
+ },
276
+ "widgets_values": []
277
+ },
278
+ {
279
+ "id": 61,
280
+ "type": "TextEncodeQwenImageEditPlus",
281
+ "pos": [
282
+ 1710,
283
+ 430
284
+ ],
285
+ "size": [
286
+ 400,
287
+ 200
288
+ ],
289
+ "flags": {},
290
+ "order": 25,
291
+ "mode": 0,
292
+ "inputs": [
293
+ {
294
+ "name": "clip",
295
+ "type": "CLIP",
296
+ "link": 80
297
+ },
298
+ {
299
+ "name": "vae",
300
+ "shape": 7,
301
+ "type": "VAE",
302
+ "link": 81
303
+ },
304
+ {
305
+ "name": "image1",
306
+ "shape": 7,
307
+ "type": "IMAGE",
308
+ "link": 82
309
+ },
310
+ {
311
+ "name": "image2",
312
+ "shape": 7,
313
+ "type": "IMAGE",
314
+ "link": null
315
+ },
316
+ {
317
+ "name": "image3",
318
+ "shape": 7,
319
+ "type": "IMAGE",
320
+ "link": null
321
+ }
322
+ ],
323
+ "outputs": [
324
+ {
325
+ "name": "CONDITIONING",
326
+ "type": "CONDITIONING",
327
+ "links": [
328
+ 75
329
+ ]
330
+ }
331
+ ],
332
+ "properties": {
333
+ "cnr_id": "comfy-core",
334
+ "ver": "0.3.59",
335
+ "Node name for S&R": "TextEncodeQwenImageEditPlus"
336
+ },
337
+ "widgets_values": [
338
+ ""
339
+ ],
340
+ "color": "#223",
341
+ "bgcolor": "#335"
342
+ },
343
+ {
344
+ "id": 69,
345
+ "type": "VAEDecode",
346
+ "pos": [
347
+ 2570,
348
+ 120
349
+ ],
350
+ "size": [
351
+ 210,
352
+ 46
353
+ ],
354
+ "flags": {
355
+ "collapsed": false
356
+ },
357
+ "order": 28,
358
+ "mode": 0,
359
+ "inputs": [
360
+ {
361
+ "name": "samples",
362
+ "type": "LATENT",
363
+ "link": 91
364
+ },
365
+ {
366
+ "name": "vae",
367
+ "type": "VAE",
368
+ "link": 92
369
+ }
370
+ ],
371
+ "outputs": [
372
+ {
373
+ "name": "IMAGE",
374
+ "type": "IMAGE",
375
+ "slot_index": 0,
376
+ "links": [
377
+ 77
378
+ ]
379
+ }
380
+ ],
381
+ "properties": {
382
+ "cnr_id": "comfy-core",
383
+ "ver": "0.3.48",
384
+ "Node name for S&R": "VAEDecode",
385
+ "enableTabs": false,
386
+ "tabWidth": 65,
387
+ "tabXOffset": 10,
388
+ "hasSecondTab": false,
389
+ "secondTabText": "Send Back",
390
+ "secondTabOffset": 80,
391
+ "secondTabWidth": 65,
392
+ "widget_ue_connectable": {}
393
+ },
394
+ "widgets_values": []
395
+ },
396
+ {
397
+ "id": 38,
398
+ "type": "UNETLoader",
399
+ "pos": [
400
+ -1054.8211950403152,
401
+ 173.22233432193306
402
+ ],
403
+ "size": [
404
+ 270,
405
+ 82
406
+ ],
407
+ "flags": {},
408
+ "order": 3,
409
+ "mode": 0,
410
+ "inputs": [],
411
+ "outputs": [
412
+ {
413
+ "name": "MODEL",
414
+ "type": "MODEL",
415
+ "links": [
416
+ 61
417
+ ]
418
+ }
419
+ ],
420
+ "properties": {
421
+ "cnr_id": "comfy-core",
422
+ "ver": "0.3.40",
423
+ "Node name for S&R": "UNETLoader",
424
+ "models": [
425
+ {
426
+ "name": "flux1-krea-dev_fp8_scaled.safetensors",
427
+ "url": "https://huggingface.co/Comfy-Org/FLUX.1-Krea-dev_ComfyUI/resolve/main/split_files/diffusion_models/flux1-krea-dev_fp8_scaled.safetensors",
428
+ "directory": "diffusion_models"
429
+ }
430
+ ]
431
+ },
432
+ "widgets_values": [
433
+ "flux1-krea-dev_fp8_scaled.safetensors",
434
+ "default"
435
+ ]
436
+ },
437
+ {
438
+ "id": 53,
439
+ "type": "VAELoader",
440
+ "pos": [
441
+ 1251.3350830078125,
442
+ 769.044189453125
443
+ ],
444
+ "size": [
445
+ 330,
446
+ 60
447
+ ],
448
+ "flags": {},
449
+ "order": 4,
450
+ "mode": 0,
451
+ "inputs": [],
452
+ "outputs": [
453
+ {
454
+ "name": "VAE",
455
+ "type": "VAE",
456
+ "slot_index": 0,
457
+ "links": [
458
+ 79,
459
+ 81,
460
+ 87,
461
+ 92
462
+ ]
463
+ }
464
+ ],
465
+ "properties": {
466
+ "cnr_id": "comfy-core",
467
+ "ver": "0.3.48",
468
+ "Node name for S&R": "VAELoader",
469
+ "models": [
470
+ {
471
+ "name": "qwen_image_vae.safetensors",
472
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image_ComfyUI/resolve/main/split_files/vae/qwen_image_vae.safetensors",
473
+ "directory": "vae"
474
+ }
475
+ ],
476
+ "enableTabs": false,
477
+ "tabWidth": 65,
478
+ "tabXOffset": 10,
479
+ "hasSecondTab": false,
480
+ "secondTabText": "Send Back",
481
+ "secondTabOffset": 80,
482
+ "secondTabWidth": 65,
483
+ "widget_ue_connectable": {}
484
+ },
485
+ "widgets_values": [
486
+ "qwen_image_vae.safetensors"
487
+ ]
488
+ },
489
+ {
490
+ "id": 54,
491
+ "type": "CLIPLoader",
492
+ "pos": [
493
+ 1246.7269287109375,
494
+ 558.9865112304688
495
+ ],
496
+ "size": [
497
+ 330,
498
+ 110
499
+ ],
500
+ "flags": {},
501
+ "order": 5,
502
+ "mode": 0,
503
+ "inputs": [],
504
+ "outputs": [
505
+ {
506
+ "name": "CLIP",
507
+ "type": "CLIP",
508
+ "slot_index": 0,
509
+ "links": [
510
+ 80,
511
+ 86
512
+ ]
513
+ }
514
+ ],
515
+ "properties": {
516
+ "cnr_id": "comfy-core",
517
+ "ver": "0.3.48",
518
+ "Node name for S&R": "CLIPLoader",
519
+ "models": [
520
+ {
521
+ "name": "qwen_2.5_vl_7b_fp8_scaled.safetensors",
522
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image_ComfyUI/resolve/main/split_files/text_encoders/qwen_2.5_vl_7b_fp8_scaled.safetensors",
523
+ "directory": "text_encoders"
524
+ }
525
+ ],
526
+ "enableTabs": false,
527
+ "tabWidth": 65,
528
+ "tabXOffset": 10,
529
+ "hasSecondTab": false,
530
+ "secondTabText": "Send Back",
531
+ "secondTabOffset": 80,
532
+ "secondTabWidth": 65,
533
+ "widget_ue_connectable": {}
534
+ },
535
+ "widgets_values": [
536
+ "qwen_2.5_vl_7b_fp8_scaled.safetensors",
537
+ "qwen_image",
538
+ "default"
539
+ ]
540
+ },
541
+ {
542
+ "id": 68,
543
+ "type": "TextEncodeQwenImageEditPlus",
544
+ "pos": [
545
+ 1703.297607421875,
546
+ 179.4862518310547
547
+ ],
548
+ "size": [
549
+ 400,
550
+ 200
551
+ ],
552
+ "flags": {},
553
+ "order": 26,
554
+ "mode": 0,
555
+ "inputs": [
556
+ {
557
+ "name": "clip",
558
+ "type": "CLIP",
559
+ "link": 86
560
+ },
561
+ {
562
+ "name": "vae",
563
+ "shape": 7,
564
+ "type": "VAE",
565
+ "link": 87
566
+ },
567
+ {
568
+ "name": "image1",
569
+ "shape": 7,
570
+ "type": "IMAGE",
571
+ "link": 98
572
+ },
573
+ {
574
+ "name": "image2",
575
+ "shape": 7,
576
+ "type": "IMAGE",
577
+ "link": null
578
+ },
579
+ {
580
+ "name": "image3",
581
+ "shape": 7,
582
+ "type": "IMAGE",
583
+ "link": null
584
+ },
585
+ {
586
+ "name": "prompt",
587
+ "type": "STRING",
588
+ "widget": {
589
+ "name": "prompt"
590
+ },
591
+ "link": 102
592
+ }
593
+ ],
594
+ "outputs": [
595
+ {
596
+ "name": "CONDITIONING",
597
+ "type": "CONDITIONING",
598
+ "links": [
599
+ 74
600
+ ]
601
+ }
602
+ ],
603
+ "properties": {
604
+ "cnr_id": "comfy-core",
605
+ "ver": "0.3.59",
606
+ "Node name for S&R": "TextEncodeQwenImageEditPlus"
607
+ },
608
+ "widgets_values": [
609
+ "Next Scene: The camera pushes in from behind the keeper, showing him gripping the rail as the storm rages and lightning illuminates his weathered face. realistic cinematic style"
610
+ ],
611
+ "color": "#232",
612
+ "bgcolor": "#353"
613
+ },
614
+ {
615
+ "id": 27,
616
+ "type": "EmptySD3LatentImage",
617
+ "pos": [
618
+ -1078.043561000177,
619
+ 654.8812058190549
620
+ ],
621
+ "size": [
622
+ 270,
623
+ 120
624
+ ],
625
+ "flags": {},
626
+ "order": 6,
627
+ "mode": 0,
628
+ "inputs": [],
629
+ "outputs": [
630
+ {
631
+ "name": "LATENT",
632
+ "type": "LATENT",
633
+ "slot_index": 0,
634
+ "links": [
635
+ 51
636
+ ]
637
+ }
638
+ ],
639
+ "properties": {
640
+ "cnr_id": "comfy-core",
641
+ "ver": "0.3.40",
642
+ "Node name for S&R": "EmptySD3LatentImage"
643
+ },
644
+ "widgets_values": [
645
+ 1280,
646
+ 720,
647
+ 1
648
+ ]
649
+ },
650
+ {
651
+ "id": 71,
652
+ "type": "ImageScaleToTotalPixels",
653
+ "pos": [
654
+ 1408.750732421875,
655
+ 1016.1314697265625
656
+ ],
657
+ "size": [
658
+ 270,
659
+ 82
660
+ ],
661
+ "flags": {},
662
+ "order": 23,
663
+ "mode": 0,
664
+ "inputs": [
665
+ {
666
+ "name": "image",
667
+ "type": "IMAGE",
668
+ "link": 97
669
+ }
670
+ ],
671
+ "outputs": [
672
+ {
673
+ "name": "IMAGE",
674
+ "type": "IMAGE",
675
+ "links": [
676
+ 78,
677
+ 82,
678
+ 98
679
+ ]
680
+ }
681
+ ],
682
+ "properties": {
683
+ "cnr_id": "comfy-core",
684
+ "ver": "0.3.50",
685
+ "Node name for S&R": "ImageScaleToTotalPixels",
686
+ "enableTabs": false,
687
+ "tabWidth": 65,
688
+ "tabXOffset": 10,
689
+ "hasSecondTab": false,
690
+ "secondTabText": "Send Back",
691
+ "secondTabOffset": 80,
692
+ "secondTabWidth": 65,
693
+ "ue_properties": {
694
+ "widget_ue_connectable": {
695
+ "upscale_method": true,
696
+ "megapixels": true
697
+ }
698
+ }
699
+ },
700
+ "widgets_values": [
701
+ "lanczos",
702
+ 1
703
+ ]
704
+ },
705
+ {
706
+ "id": 43,
707
+ "type": "MarkdownNote",
708
+ "pos": [
709
+ -1671.9521416746438,
710
+ 144.27145904592498
711
+ ],
712
+ "size": [
713
+ 520,
714
+ 390
715
+ ],
716
+ "flags": {},
717
+ "order": 7,
718
+ "mode": 0,
719
+ "inputs": [],
720
+ "outputs": [],
721
+ "title": "Model links",
722
+ "properties": {},
723
+ "widgets_values": [
724
+ "## Model links\n\n**Diffusion Model**\n\n- [flux1-krea-dev_fp8_scaled.safetensors](https://huggingface.co/Comfy-Org/FLUX.1-Krea-dev_ComfyUI/resolve/main/split_files/diffusion_models/flux1-krea-dev_fp8_scaled.safetensors)\n\nIf you need the original weights, head to [black-forest-labs/FLUX.1-Krea-dev](https://huggingface.co/black-forest-labs/FLUX.1-Krea-dev/), accept the agreement in the repo, then click the link below to download the models:\n\n- [flux1-krea-dev.safetensors](https://huggingface.co/black-forest-labs/FLUX.1-Krea-dev/resolve/main/flux1-krea-dev.safetensors)\n\n**Text Encoder**\n\n- [clip_l.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/clip_l.safetensors)\n\n- [t5xxl_fp16.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp16.safetensors) or [t5xxl_fp8_e4m3fn_scaled.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp8_e4m3fn_scaled.safetensors)\n\n**VAE**\n\n- [ae.safetensors](https://huggingface.co/Comfy-Org/Lumina_Image_2.0_Repackaged/resolve/main/split_files/vae/ae.safetensors)\n\n\n```\nComfyUI/\n├── models/\n│ ├── diffusion_models/\n│ │ └─── flux1-krea-dev_fp8_scaled.safetensors\n│ ├── text_encoders/\n│ │ ├── clip_l.safetensors\n│ │ └─── t5xxl_fp16.safetensors # or t5xxl_fp8_e4m3fn_scaled.safetensors\n│ └── vae/\n│ └── ae.safetensors\n```\n"
725
+ ],
726
+ "color": "#432",
727
+ "bgcolor": "#653"
728
+ },
729
+ {
730
+ "id": 8,
731
+ "type": "VAEDecode",
732
+ "pos": [
733
+ 91.50024745911058,
734
+ 208.9554308869794
735
+ ],
736
+ "size": [
737
+ 210,
738
+ 46
739
+ ],
740
+ "flags": {
741
+ "collapsed": false
742
+ },
743
+ "order": 20,
744
+ "mode": 0,
745
+ "inputs": [
746
+ {
747
+ "name": "samples",
748
+ "type": "LATENT",
749
+ "link": 52
750
+ },
751
+ {
752
+ "name": "vae",
753
+ "type": "VAE",
754
+ "link": 58
755
+ }
756
+ ],
757
+ "outputs": [
758
+ {
759
+ "name": "IMAGE",
760
+ "type": "IMAGE",
761
+ "slot_index": 0,
762
+ "links": [
763
+ 9,
764
+ 97
765
+ ]
766
+ }
767
+ ],
768
+ "properties": {
769
+ "cnr_id": "comfy-core",
770
+ "ver": "0.3.40",
771
+ "Node name for S&R": "VAEDecode"
772
+ },
773
+ "widgets_values": []
774
+ },
775
+ {
776
+ "id": 42,
777
+ "type": "ConditioningZeroOut",
778
+ "pos": [
779
+ 53.35103539963648,
780
+ 141.6342719840394
781
+ ],
782
+ "size": [
783
+ 200,
784
+ 30
785
+ ],
786
+ "flags": {
787
+ "collapsed": false
788
+ },
789
+ "order": 15,
790
+ "mode": 0,
791
+ "inputs": [
792
+ {
793
+ "name": "conditioning",
794
+ "type": "CONDITIONING",
795
+ "link": 66
796
+ }
797
+ ],
798
+ "outputs": [
799
+ {
800
+ "name": "CONDITIONING",
801
+ "type": "CONDITIONING",
802
+ "links": [
803
+ 63
804
+ ]
805
+ }
806
+ ],
807
+ "properties": {
808
+ "cnr_id": "comfy-core",
809
+ "ver": "0.3.40",
810
+ "Node name for S&R": "ConditioningZeroOut"
811
+ },
812
+ "widgets_values": []
813
+ },
814
+ {
815
+ "id": 74,
816
+ "type": "Text Load Line From File",
817
+ "pos": [
818
+ 1652.8071816703605,
819
+ -52.12227061246014
820
+ ],
821
+ "size": [
822
+ 270,
823
+ 174
824
+ ],
825
+ "flags": {},
826
+ "order": 14,
827
+ "mode": 0,
828
+ "inputs": [
829
+ {
830
+ "name": "multiline_text",
831
+ "shape": 7,
832
+ "type": "STRING",
833
+ "link": 103
834
+ },
835
+ {
836
+ "name": "index",
837
+ "type": "INT",
838
+ "widget": {
839
+ "name": "index"
840
+ },
841
+ "link": 100
842
+ }
843
+ ],
844
+ "outputs": [
845
+ {
846
+ "name": "line_text",
847
+ "type": "STRING",
848
+ "links": [
849
+ 101,
850
+ 102
851
+ ]
852
+ },
853
+ {
854
+ "name": "dictionary",
855
+ "type": "DICT",
856
+ "links": null
857
+ }
858
+ ],
859
+ "properties": {
860
+ "cnr_id": "was-node-suite-comfyui",
861
+ "ver": "ea935d1044ae5a26efa54ebeb18fe9020af49a45",
862
+ "Node name for S&R": "Text Load Line From File"
863
+ },
864
+ "widgets_values": [
865
+ "",
866
+ "[filename]",
867
+ "TextBatch",
868
+ "index",
869
+ 0
870
+ ]
871
+ },
872
+ {
873
+ "id": 82,
874
+ "type": "Note",
875
+ "pos": [
876
+ 438.7688374380804,
877
+ -145.55126237291242
878
+ ],
879
+ "size": [
880
+ 399.0292082823976,
881
+ 353.23574924486223
882
+ ],
883
+ "flags": {},
884
+ "order": 8,
885
+ "mode": 0,
886
+ "inputs": [],
887
+ "outputs": [],
888
+ "properties": {},
889
+ "widgets_values": [
890
+ "You can put a list of prompts in the Text Multiline node; the incrementer will select one line per run.\nRestart the seed at 0 when there are no more lines."
891
+ ],
892
+ "color": "#432",
893
+ "bgcolor": "#653"
894
+ },
895
+ {
896
+ "id": 77,
897
+ "type": "PreviewAny",
898
+ "pos": [
899
+ -340,
900
+ 1420
901
+ ],
902
+ "size": [
903
+ 631.8195224440549,
904
+ 130.5215140178923
905
+ ],
906
+ "flags": {},
907
+ "order": 17,
908
+ "mode": 0,
909
+ "inputs": [
910
+ {
911
+ "name": "source",
912
+ "type": "*",
913
+ "link": 101
914
+ }
915
+ ],
916
+ "outputs": [],
917
+ "title": "NEXT SCENE PROMPT",
918
+ "properties": {
919
+ "cnr_id": "comfy-core",
920
+ "ver": "0.3.64",
921
+ "Node name for S&R": "PreviewAny"
922
+ },
923
+ "widgets_values": []
924
+ },
925
+ {
926
+ "id": 9,
927
+ "type": "SaveImage",
928
+ "pos": [
929
+ -350,
930
+ 950
931
+ ],
932
+ "size": [
933
+ 640.8512471405897,
934
+ 415.6048561340251
935
+ ],
936
+ "flags": {},
937
+ "order": 22,
938
+ "mode": 0,
939
+ "inputs": [
940
+ {
941
+ "name": "images",
942
+ "type": "IMAGE",
943
+ "link": 9
944
+ }
945
+ ],
946
+ "outputs": [],
947
+ "properties": {
948
+ "cnr_id": "comfy-core",
949
+ "ver": "0.3.40",
950
+ "Node name for S&R": "SaveImage"
951
+ },
952
+ "widgets_values": [
953
+ "flux"
954
+ ]
955
+ },
956
+ {
957
+ "id": 58,
958
+ "type": "SaveImage",
959
+ "pos": [
960
+ 310,
961
+ 950
962
+ ],
963
+ "size": [
964
+ 755.8363004499893,
965
+ 591.7956866416107
966
+ ],
967
+ "flags": {},
968
+ "order": 29,
969
+ "mode": 0,
970
+ "inputs": [
971
+ {
972
+ "name": "images",
973
+ "type": "IMAGE",
974
+ "link": 77
975
+ }
976
+ ],
977
+ "outputs": [],
978
+ "properties": {
979
+ "cnr_id": "comfy-core",
980
+ "ver": "0.3.48",
981
+ "Node name for S&R": "SaveImage",
982
+ "enableTabs": false,
983
+ "tabWidth": 65,
984
+ "tabXOffset": 10,
985
+ "hasSecondTab": false,
986
+ "secondTabText": "Send Back",
987
+ "secondTabOffset": 80,
988
+ "secondTabWidth": 65,
989
+ "widget_ue_connectable": {}
990
+ },
991
+ "widgets_values": [
992
+ "nextscene"
993
+ ]
994
+ },
995
+ {
996
+ "id": 65,
997
+ "type": "LoraLoaderModelOnly",
998
+ "pos": [
999
+ 1234.2114882922415,
1000
+ 275.26656409464516
1001
+ ],
1002
+ "size": [
1003
+ 310,
1004
+ 82
1005
+ ],
1006
+ "flags": {},
1007
+ "order": 13,
1008
+ "mode": 0,
1009
+ "inputs": [
1010
+ {
1011
+ "name": "model",
1012
+ "type": "MODEL",
1013
+ "link": 85
1014
+ }
1015
+ ],
1016
+ "outputs": [
1017
+ {
1018
+ "name": "MODEL",
1019
+ "type": "MODEL",
1020
+ "links": [
1021
+ 94
1022
+ ]
1023
+ }
1024
+ ],
1025
+ "properties": {
1026
+ "cnr_id": "comfy-core",
1027
+ "ver": "0.3.50",
1028
+ "Node name for S&R": "LoraLoaderModelOnly",
1029
+ "models": [
1030
+ {
1031
+ "name": "Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
1032
+ "url": "https://huggingface.co/lightx2v/Qwen-Image-Lightning/resolve/main/Qwen-Image-Edit-2509/Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
1033
+ "directory": "loras"
1034
+ }
1035
+ ],
1036
+ "enableTabs": false,
1037
+ "tabWidth": 65,
1038
+ "tabXOffset": 10,
1039
+ "hasSecondTab": false,
1040
+ "secondTabText": "Send Back",
1041
+ "secondTabOffset": 80,
1042
+ "secondTabWidth": 65,
1043
+ "ue_properties": {
1044
+ "widget_ue_connectable": {
1045
+ "lora_name": true,
1046
+ "strength_model": true
1047
+ }
1048
+ }
1049
+ },
1050
+ "widgets_values": [
1051
+ "Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
1052
+ 1
1053
+ ]
1054
+ },
1055
+ {
1056
+ "id": 57,
1057
+ "type": "UNETLoader",
1058
+ "pos": [
1059
+ 1232.2099994831387,
1060
+ 129.63747452349656
1061
+ ],
1062
+ "size": [
1063
+ 330,
1064
+ 90
1065
+ ],
1066
+ "flags": {},
1067
+ "order": 9,
1068
+ "mode": 0,
1069
+ "inputs": [],
1070
+ "outputs": [
1071
+ {
1072
+ "name": "MODEL",
1073
+ "type": "MODEL",
1074
+ "slot_index": 0,
1075
+ "links": [
1076
+ 85
1077
+ ]
1078
+ }
1079
+ ],
1080
+ "properties": {
1081
+ "cnr_id": "comfy-core",
1082
+ "ver": "0.3.48",
1083
+ "Node name for S&R": "UNETLoader",
1084
+ "models": [
1085
+ {
1086
+ "name": "qwen_image_edit_2509_fp8_e4m3fn.safetensors",
1087
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image-Edit_ComfyUI/resolve/main/split_files/diffusion_models/qwen_image_edit_2509_fp8_e4m3fn.safetensors",
1088
+ "directory": "diffusion_models"
1089
+ }
1090
+ ],
1091
+ "enableTabs": false,
1092
+ "tabWidth": 65,
1093
+ "tabXOffset": 10,
1094
+ "hasSecondTab": false,
1095
+ "secondTabText": "Send Back",
1096
+ "secondTabOffset": 80,
1097
+ "secondTabWidth": 65,
1098
+ "widget_ue_connectable": {}
1099
+ },
1100
+ "widgets_values": [
1101
+ "qwen_image_edit_2509_fp8_e4m3fn.safetensors",
1102
+ "default"
1103
+ ]
1104
+ },
1105
+ {
1106
+ "id": 73,
1107
+ "type": "LoraLoaderModelOnly",
1108
+ "pos": [
1109
+ 1224.4778739012954,
1110
+ 415.52477852979905
1111
+ ],
1112
+ "size": [
1113
+ 270,
1114
+ 82
1115
+ ],
1116
+ "flags": {},
1117
+ "order": 16,
1118
+ "mode": 0,
1119
+ "inputs": [
1120
+ {
1121
+ "name": "model",
1122
+ "type": "MODEL",
1123
+ "link": 94
1124
+ }
1125
+ ],
1126
+ "outputs": [
1127
+ {
1128
+ "name": "MODEL",
1129
+ "type": "MODEL",
1130
+ "links": [
1131
+ 95
1132
+ ]
1133
+ }
1134
+ ],
1135
+ "properties": {
1136
+ "cnr_id": "comfy-core",
1137
+ "ver": "0.3.64",
1138
+ "Node name for S&R": "LoraLoaderModelOnly"
1139
+ },
1140
+ "widgets_values": [
1141
+ "next-scene_lora-v2-3000.safetensors",
1142
+ 1
1143
+ ]
1144
+ },
1145
+ {
1146
+ "id": 45,
1147
+ "type": "CLIPTextEncode",
1148
+ "pos": [
1149
+ -670.7454588082238,
1150
+ 177.3197207116143
1151
+ ],
1152
+ "size": [
1153
+ 460.6943359375,
1154
+ 187.2991485595703
1155
+ ],
1156
+ "flags": {},
1157
+ "order": 12,
1158
+ "mode": 0,
1159
+ "inputs": [
1160
+ {
1161
+ "name": "clip",
1162
+ "type": "CLIP",
1163
+ "link": 64
1164
+ }
1165
+ ],
1166
+ "outputs": [
1167
+ {
1168
+ "name": "CONDITIONING",
1169
+ "type": "CONDITIONING",
1170
+ "links": [
1171
+ 65,
1172
+ 66
1173
+ ]
1174
+ }
1175
+ ],
1176
+ "title": "PROMPT",
1177
+ "properties": {
1178
+ "cnr_id": "comfy-core",
1179
+ "ver": "0.3.47",
1180
+ "Node name for S&R": "CLIPTextEncode"
1181
+ },
1182
+ "widgets_values": [
1183
+ "A cinematic interior of a modern apartment at night. a man and a woman stand in the living room, mid-argument. the warm light from a floor lamp contrasts with the cold blue glow from the city outside the window. tension fills the space, their faces flushed with emotion, half-lit by shadows. cinematic realism, shallow depth of field."
1184
+ ],
1185
+ "color": "#232",
1186
+ "bgcolor": "#353"
1187
+ },
1188
+ {
1189
+ "id": 81,
1190
+ "type": "Text Multiline",
1191
+ "pos": [
1192
+ 856.0914498023825,
1193
+ -165.02829581710353
1194
+ ],
1195
+ "size": [
1196
+ 400,
1197
+ 200
1198
+ ],
1199
+ "flags": {},
1200
+ "order": 10,
1201
+ "mode": 0,
1202
+ "inputs": [],
1203
+ "outputs": [
1204
+ {
1205
+ "name": "STRING",
1206
+ "type": "STRING",
1207
+ "links": [
1208
+ 103
1209
+ ]
1210
+ }
1211
+ ],
1212
+ "properties": {
1213
+ "cnr_id": "was-node-suite-comfyui",
1214
+ "ver": "ea935d1044ae5a26efa54ebeb18fe9020af49a45",
1215
+ "Node name for S&R": "Text Multiline"
1216
+ },
1217
+ "widgets_values": [
1218
+ "Next Scene: The camera moves closer to the woman, showing tears forming as she tries to speak but hesitates, her reflection faintly visible in the window behind her. realistic cinematic style\nNext Scene: The camera cuts to the man pacing back and forth, his hand in his hair, the city lights flickering behind him through the curtains. realistic cinematic style\nNext Scene: The camera moves to a medium shot between them, framing their silhouettes across the table, the soft lamp glow splitting them in two. realistic cinematic style\nNext Scene: The camera tilts down to the woman’s hand gripping her glass, trembling slightly before setting it down. realistic cinematic style\nNext Scene: The camera pans slowly as he stops and looks toward her, silence stretching between them, the hum of the city barely audible. realistic cinematic style\nNext Scene: The camera pulls back to a wide shot from the hallway, both figures motionless in the warm, fragile light. realistic cinematic style"
1219
+ ]
1220
+ },
1221
+ {
1222
+ "id": 56,
1223
+ "type": "KSampler",
1224
+ "pos": [
1225
+ 2207.7734375,
1226
+ 510.546875
1227
+ ],
1228
+ "size": [
1229
+ 300,
1230
+ 474
1231
+ ],
1232
+ "flags": {},
1233
+ "order": 27,
1234
+ "mode": 0,
1235
+ "inputs": [
1236
+ {
1237
+ "name": "model",
1238
+ "type": "MODEL",
1239
+ "link": 73
1240
+ },
1241
+ {
1242
+ "name": "positive",
1243
+ "type": "CONDITIONING",
1244
+ "link": 74
1245
+ },
1246
+ {
1247
+ "name": "negative",
1248
+ "type": "CONDITIONING",
1249
+ "link": 75
1250
+ },
1251
+ {
1252
+ "name": "latent_image",
1253
+ "type": "LATENT",
1254
+ "link": 76
1255
+ }
1256
+ ],
1257
+ "outputs": [
1258
+ {
1259
+ "name": "LATENT",
1260
+ "type": "LATENT",
1261
+ "slot_index": 0,
1262
+ "links": [
1263
+ 91
1264
+ ]
1265
+ }
1266
+ ],
1267
+ "properties": {
1268
+ "cnr_id": "comfy-core",
1269
+ "ver": "0.3.48",
1270
+ "Node name for S&R": "KSampler",
1271
+ "enableTabs": false,
1272
+ "tabWidth": 65,
1273
+ "tabXOffset": 10,
1274
+ "hasSecondTab": false,
1275
+ "secondTabText": "Send Back",
1276
+ "secondTabOffset": 80,
1277
+ "secondTabWidth": 65,
1278
+ "widget_ue_connectable": {}
1279
+ },
1280
+ "widgets_values": [
1281
+ 551042521370195,
1282
+ "fixed",
1283
+ 8,
1284
+ 1,
1285
+ "euler",
1286
+ "simple",
1287
+ 1
1288
+ ]
1289
+ },
1290
+ {
1291
+ "id": 31,
1292
+ "type": "KSampler",
1293
+ "pos": [
1294
+ 44.924703211736876,
1295
+ 371.35104434029375
1296
+ ],
1297
+ "size": [
1298
+ 315,
1299
+ 262
1300
+ ],
1301
+ "flags": {},
1302
+ "order": 18,
1303
+ "mode": 0,
1304
+ "inputs": [
1305
+ {
1306
+ "name": "model",
1307
+ "type": "MODEL",
1308
+ "link": 61
1309
+ },
1310
+ {
1311
+ "name": "positive",
1312
+ "type": "CONDITIONING",
1313
+ "link": 65
1314
+ },
1315
+ {
1316
+ "name": "negative",
1317
+ "type": "CONDITIONING",
1318
+ "link": 63
1319
+ },
1320
+ {
1321
+ "name": "latent_image",
1322
+ "type": "LATENT",
1323
+ "link": 51
1324
+ }
1325
+ ],
1326
+ "outputs": [
1327
+ {
1328
+ "name": "LATENT",
1329
+ "type": "LATENT",
1330
+ "slot_index": 0,
1331
+ "links": [
1332
+ 52
1333
+ ]
1334
+ }
1335
+ ],
1336
+ "properties": {
1337
+ "cnr_id": "comfy-core",
1338
+ "ver": "0.3.40",
1339
+ "Node name for S&R": "KSampler"
1340
+ },
1341
+ "widgets_values": [
1342
+ 943556351579141,
1343
+ "fixed",
1344
+ 30,
1345
+ 1,
1346
+ "euler",
1347
+ "simple",
1348
+ 1
1349
+ ]
1350
+ },
1351
+ {
1352
+ "id": 76,
1353
+ "type": "MasqueradeIncrementer",
1354
+ "pos": [
1355
+ 869.1791853417404,
1356
+ 106.91182894128491
1357
+ ],
1358
+ "size": [
1359
+ 270,
1360
+ 106
1361
+ ],
1362
+ "flags": {},
1363
+ "order": 11,
1364
+ "mode": 0,
1365
+ "inputs": [],
1366
+ "outputs": [
1367
+ {
1368
+ "name": "INT",
1369
+ "type": "INT",
1370
+ "links": [
1371
+ 100
1372
+ ]
1373
+ }
1374
+ ],
1375
+ "properties": {
1376
+ "cnr_id": "masquerade-nodes-comfyui",
1377
+ "ver": "432cb4d146a391b387a0cd25ace824328b5b61cf",
1378
+ "Node name for S&R": "MasqueradeIncrementer"
1379
+ },
1380
+ "widgets_values": [
1381
+ 6,
1382
+ "increment",
1383
+ 10
1384
+ ]
1385
+ }
1386
+ ],
1387
+ "links": [
1388
+ [
1389
+ 9,
1390
+ 8,
1391
+ 0,
1392
+ 9,
1393
+ 0,
1394
+ "IMAGE"
1395
+ ],
1396
+ [
1397
+ 51,
1398
+ 27,
1399
+ 0,
1400
+ 31,
1401
+ 3,
1402
+ "LATENT"
1403
+ ],
1404
+ [
1405
+ 52,
1406
+ 31,
1407
+ 0,
1408
+ 8,
1409
+ 0,
1410
+ "LATENT"
1411
+ ],
1412
+ [
1413
+ 58,
1414
+ 39,
1415
+ 0,
1416
+ 8,
1417
+ 1,
1418
+ "VAE"
1419
+ ],
1420
+ [
1421
+ 61,
1422
+ 38,
1423
+ 0,
1424
+ 31,
1425
+ 0,
1426
+ "MODEL"
1427
+ ],
1428
+ [
1429
+ 63,
1430
+ 42,
1431
+ 0,
1432
+ 31,
1433
+ 2,
1434
+ "CONDITIONING"
1435
+ ],
1436
+ [
1437
+ 64,
1438
+ 40,
1439
+ 0,
1440
+ 45,
1441
+ 0,
1442
+ "CLIP"
1443
+ ],
1444
+ [
1445
+ 65,
1446
+ 45,
1447
+ 0,
1448
+ 31,
1449
+ 1,
1450
+ "CONDITIONING"
1451
+ ],
1452
+ [
1453
+ 66,
1454
+ 45,
1455
+ 0,
1456
+ 42,
1457
+ 0,
1458
+ "CONDITIONING"
1459
+ ],
1460
+ [
1461
+ 71,
1462
+ 55,
1463
+ 0,
1464
+ 52,
1465
+ 0,
1466
+ "MODEL"
1467
+ ],
1468
+ [
1469
+ 73,
1470
+ 52,
1471
+ 0,
1472
+ 56,
1473
+ 0,
1474
+ "MODEL"
1475
+ ],
1476
+ [
1477
+ 74,
1478
+ 68,
1479
+ 0,
1480
+ 56,
1481
+ 1,
1482
+ "CONDITIONING"
1483
+ ],
1484
+ [
1485
+ 75,
1486
+ 61,
1487
+ 0,
1488
+ 56,
1489
+ 2,
1490
+ "CONDITIONING"
1491
+ ],
1492
+ [
1493
+ 76,
1494
+ 60,
1495
+ 0,
1496
+ 56,
1497
+ 3,
1498
+ "LATENT"
1499
+ ],
1500
+ [
1501
+ 77,
1502
+ 69,
1503
+ 0,
1504
+ 58,
1505
+ 0,
1506
+ "IMAGE"
1507
+ ],
1508
+ [
1509
+ 78,
1510
+ 71,
1511
+ 0,
1512
+ 60,
1513
+ 0,
1514
+ "IMAGE"
1515
+ ],
1516
+ [
1517
+ 79,
1518
+ 53,
1519
+ 0,
1520
+ 60,
1521
+ 1,
1522
+ "VAE"
1523
+ ],
1524
+ [
1525
+ 80,
1526
+ 54,
1527
+ 0,
1528
+ 61,
1529
+ 0,
1530
+ "CLIP"
1531
+ ],
1532
+ [
1533
+ 81,
1534
+ 53,
1535
+ 0,
1536
+ 61,
1537
+ 1,
1538
+ "VAE"
1539
+ ],
1540
+ [
1541
+ 82,
1542
+ 71,
1543
+ 0,
1544
+ 61,
1545
+ 2,
1546
+ "IMAGE"
1547
+ ],
1548
+ [
1549
+ 85,
1550
+ 57,
1551
+ 0,
1552
+ 65,
1553
+ 0,
1554
+ "MODEL"
1555
+ ],
1556
+ [
1557
+ 86,
1558
+ 54,
1559
+ 0,
1560
+ 68,
1561
+ 0,
1562
+ "CLIP"
1563
+ ],
1564
+ [
1565
+ 87,
1566
+ 53,
1567
+ 0,
1568
+ 68,
1569
+ 1,
1570
+ "VAE"
1571
+ ],
1572
+ [
1573
+ 91,
1574
+ 56,
1575
+ 0,
1576
+ 69,
1577
+ 0,
1578
+ "LATENT"
1579
+ ],
1580
+ [
1581
+ 92,
1582
+ 53,
1583
+ 0,
1584
+ 69,
1585
+ 1,
1586
+ "VAE"
1587
+ ],
1588
+ [
1589
+ 94,
1590
+ 65,
1591
+ 0,
1592
+ 73,
1593
+ 0,
1594
+ "MODEL"
1595
+ ],
1596
+ [
1597
+ 95,
1598
+ 73,
1599
+ 0,
1600
+ 55,
1601
+ 0,
1602
+ "MODEL"
1603
+ ],
1604
+ [
1605
+ 97,
1606
+ 8,
1607
+ 0,
1608
+ 71,
1609
+ 0,
1610
+ "IMAGE"
1611
+ ],
1612
+ [
1613
+ 98,
1614
+ 71,
1615
+ 0,
1616
+ 68,
1617
+ 2,
1618
+ "IMAGE"
1619
+ ],
1620
+ [
1621
+ 100,
1622
+ 76,
1623
+ 0,
1624
+ 74,
1625
+ 1,
1626
+ "INT"
1627
+ ],
1628
+ [
1629
+ 101,
1630
+ 74,
1631
+ 0,
1632
+ 77,
1633
+ 0,
1634
+ "*"
1635
+ ],
1636
+ [
1637
+ 102,
1638
+ 74,
1639
+ 0,
1640
+ 68,
1641
+ 5,
1642
+ "STRING"
1643
+ ],
1644
+ [
1645
+ 103,
1646
+ 81,
1647
+ 0,
1648
+ 74,
1649
+ 0,
1650
+ "STRING"
1651
+ ]
1652
+ ],
1653
+ "groups": [
1654
+ {
1655
+ "id": 1,
1656
+ "title": "Step 1 - Load Models Here",
1657
+ "bounding": [
1658
+ -1064.8211950403152,
1659
+ 103.22233432193308,
1660
+ 300,
1661
+ 460
1662
+ ],
1663
+ "color": "#3f789e",
1664
+ "font_size": 24,
1665
+ "flags": {}
1666
+ },
1667
+ {
1668
+ "id": 2,
1669
+ "title": "Step 2 - Image Size",
1670
+ "bounding": [
1671
+ -1088.043561000177,
1672
+ 584.8812058190549,
1673
+ 300,
1674
+ 200
1675
+ ],
1676
+ "color": "#3f789e",
1677
+ "font_size": 24,
1678
+ "flags": {}
1679
+ },
1680
+ {
1681
+ "id": 3,
1682
+ "title": "Step 3 - Prompt",
1683
+ "bounding": [
1684
+ -738.4809656995598,
1685
+ 102.6588158981322,
1686
+ 611.5129083311813,
1687
+ 680.2799152758953
1688
+ ],
1689
+ "color": "#3f789e",
1690
+ "font_size": 24,
1691
+ "flags": {}
1692
+ },
1693
+ {
1694
+ "id": 4,
1695
+ "title": "Step1 - Load models",
1696
+ "bounding": [
1697
+ 1220,
1698
+ 80,
1699
+ 370,
1700
+ 570
1701
+ ],
1702
+ "color": "#3f789e",
1703
+ "font_size": 24,
1704
+ "flags": {}
1705
+ },
1706
+ {
1707
+ "id": 5,
1708
+ "title": "Step 2 - Upload image for editing",
1709
+ "bounding": [
1710
+ 1220,
1711
+ 680,
1712
+ 970,
1713
+ 550
1714
+ ],
1715
+ "color": "#3f789e",
1716
+ "font_size": 24,
1717
+ "flags": {}
1718
+ },
1719
+ {
1720
+ "id": 6,
1721
+ "title": "Step 4 - Prompt",
1722
+ "bounding": [
1723
+ 1620,
1724
+ 80,
1725
+ 570,
1726
+ 570
1727
+ ],
1728
+ "color": "#3f789e",
1729
+ "font_size": 24,
1730
+ "flags": {}
1731
+ },
1732
+ {
1733
+ "id": 7,
1734
+ "title": "Step3 - Image Size",
1735
+ "bounding": [
1736
+ 2220,
1737
+ 1030,
1738
+ 310,
1739
+ 200
1740
+ ],
1741
+ "color": "#3f789e",
1742
+ "font_size": 24,
1743
+ "flags": {}
1744
+ }
1745
+ ],
1746
+ "config": {},
1747
+ "extra": {
1748
+ "ds": {
1749
+ "scale": 0.8544135795549737,
1750
+ "offset": [
1751
+ 848.3933584607535,
1752
+ -834.9028248115114
1753
+ ]
1754
+ },
1755
+ "frontendVersion": "1.28.7",
1756
+ "VHS_latentpreview": false,
1757
+ "VHS_latentpreviewrate": 0,
1758
+ "VHS_MetadataImage": true,
1759
+ "VHS_KeepIntermediate": true
1760
+ },
1761
+ "version": 0.4
1762
+ }
workflow-comfyui-basic-next-scene.json ADDED
@@ -0,0 +1,1611 @@
1
+ {
2
+ "id": "908d0bfb-e192-4627-9b57-147496e6e2dd",
3
+ "revision": 0,
4
+ "last_node_id": 73,
5
+ "last_link_id": 98,
6
+ "nodes": [
7
+ {
8
+ "id": 40,
9
+ "type": "DualCLIPLoader",
10
+ "pos": [
11
+ -320,
12
+ 290
13
+ ],
14
+ "size": [
15
+ 270,
16
+ 130
17
+ ],
18
+ "flags": {},
19
+ "order": 0,
20
+ "mode": 0,
21
+ "inputs": [],
22
+ "outputs": [
23
+ {
24
+ "name": "CLIP",
25
+ "type": "CLIP",
26
+ "links": [
27
+ 64
28
+ ]
29
+ }
30
+ ],
31
+ "properties": {
32
+ "cnr_id": "comfy-core",
33
+ "ver": "0.3.40",
34
+ "Node name for S&R": "DualCLIPLoader",
35
+ "models": [
36
+ {
37
+ "name": "clip_l.safetensors",
38
+ "url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/clip_l.safetensors",
39
+ "directory": "text_encoders"
40
+ },
41
+ {
42
+ "name": "t5xxl_fp16.safetensors",
43
+ "url": "https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp16.safetensors",
44
+ "directory": "text_encoders"
45
+ }
46
+ ]
47
+ },
48
+ "widgets_values": [
49
+ "clip_l.safetensors",
50
+ "t5xxl_fp16.safetensors",
51
+ "flux",
52
+ "default"
53
+ ]
54
+ },
55
+ {
56
+ "id": 39,
57
+ "type": "VAELoader",
58
+ "pos": [
59
+ -320,
60
+ 470
61
+ ],
62
+ "size": [
63
+ 270,
64
+ 58
65
+ ],
66
+ "flags": {},
67
+ "order": 1,
68
+ "mode": 0,
69
+ "inputs": [],
70
+ "outputs": [
71
+ {
72
+ "name": "VAE",
73
+ "type": "VAE",
74
+ "links": [
75
+ 58
76
+ ]
77
+ }
78
+ ],
79
+ "properties": {
80
+ "cnr_id": "comfy-core",
81
+ "ver": "0.3.40",
82
+ "Node name for S&R": "VAELoader",
83
+ "models": [
84
+ {
85
+ "name": "ae.safetensors",
86
+ "url": "https://huggingface.co/Comfy-Org/Lumina_Image_2.0_Repackaged/resolve/main/split_files/vae/ae.safetensors",
87
+ "directory": "vae"
88
+ }
89
+ ]
90
+ },
91
+ "widgets_values": [
92
+ "ae.safetensors"
93
+ ]
94
+ },
95
+ {
96
+ "id": 42,
97
+ "type": "ConditioningZeroOut",
98
+ "pos": [
99
+ -10,
100
+ 460
101
+ ],
102
+ "size": [
103
+ 200,
104
+ 30
105
+ ],
106
+ "flags": {
107
+ "collapsed": false
108
+ },
109
+ "order": 14,
110
+ "mode": 0,
111
+ "inputs": [
112
+ {
113
+ "name": "conditioning",
114
+ "type": "CONDITIONING",
115
+ "link": 66
116
+ }
117
+ ],
118
+ "outputs": [
119
+ {
120
+ "name": "CONDITIONING",
121
+ "type": "CONDITIONING",
122
+ "links": [
123
+ 63
124
+ ]
125
+ }
126
+ ],
127
+ "properties": {
128
+ "cnr_id": "comfy-core",
129
+ "ver": "0.3.40",
130
+ "Node name for S&R": "ConditioningZeroOut"
131
+ },
132
+ "widgets_values": []
133
+ },
134
+ {
135
+ "id": 31,
136
+ "type": "KSampler",
137
+ "pos": [
138
+ 10,
139
+ 550
140
+ ],
141
+ "size": [
142
+ 315,
143
+ 262
144
+ ],
145
+ "flags": {},
146
+ "order": 16,
147
+ "mode": 0,
148
+ "inputs": [
149
+ {
150
+ "name": "model",
151
+ "type": "MODEL",
152
+ "link": 61
153
+ },
154
+ {
155
+ "name": "positive",
156
+ "type": "CONDITIONING",
157
+ "link": 65
158
+ },
159
+ {
160
+ "name": "negative",
161
+ "type": "CONDITIONING",
162
+ "link": 63
163
+ },
164
+ {
165
+ "name": "latent_image",
166
+ "type": "LATENT",
167
+ "link": 51
168
+ }
169
+ ],
170
+ "outputs": [
171
+ {
172
+ "name": "LATENT",
173
+ "type": "LATENT",
174
+ "slot_index": 0,
175
+ "links": [
176
+ 52
177
+ ]
178
+ }
179
+ ],
180
+ "properties": {
181
+ "cnr_id": "comfy-core",
182
+ "ver": "0.3.40",
183
+ "Node name for S&R": "KSampler"
184
+ },
185
+ "widgets_values": [
186
+ 516682275155174,
187
+ "randomize",
188
+ 20,
189
+ 1,
190
+ "euler",
191
+ "simple",
192
+ 1
193
+ ]
194
+ },
195
+ {
196
+ "id": 43,
197
+ "type": "MarkdownNote",
198
+ "pos": [
199
+ -870,
200
+ 110
201
+ ],
202
+ "size": [
203
+ 520,
204
+ 390
205
+ ],
206
+ "flags": {},
207
+ "order": 2,
208
+ "mode": 0,
209
+ "inputs": [],
210
+ "outputs": [],
211
+ "title": "Model links",
212
+ "properties": {},
213
+ "widgets_values": [
214
+ "## Model links\n\n**Diffusion Model**\n\n- [flux1-krea-dev_fp8_scaled.safetensors](https://huggingface.co/Comfy-Org/FLUX.1-Krea-dev_ComfyUI/resolve/main/split_files/diffusion_models/flux1-krea-dev_fp8_scaled.safetensors)\n\nIf you need the original weights, head to [black-forest-labs/FLUX.1-Krea-dev](https://huggingface.co/black-forest-labs/FLUX.1-Krea-dev/), accept the agreement in the repo, then click the link below to download the models:\n\n- [flux1-krea-dev.safetensors](https://huggingface.co/black-forest-labs/FLUX.1-Krea-dev/resolve/main/flux1-krea-dev.safetensors)\n\n**Text Encoder**\n\n- [clip_l.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/blob/main/clip_l.safetensors)\n\n- [t5xxl_fp16.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp16.safetensors) or [t5xxl_fp8_e4m3fn_scaled.safetensors](https://huggingface.co/comfyanonymous/flux_text_encoders/resolve/main/t5xxl_fp8_e4m3fn_scaled.safetensors)\n\n**VAE**\n\n- [ae.safetensors](https://huggingface.co/Comfy-Org/Lumina_Image_2.0_Repackaged/resolve/main/split_files/vae/ae.safetensors)\n\n\n```\nComfyUI/\n├── models/\n│ ├── diffusion_models/\n│ │ └─── flux1-krea-dev_fp8_scaled.safetensors\n│ ├── text_encoders/\n│ │ ├── clip_l.safetensors\n│ │ └─── t5xxl_fp16.safetensors # or t5xxl_fp8_e4m3fn_scaled.safetensors\n│ └── vae/\n│ └── ae.safetensors\n```\n"
215
+ ],
216
+ "color": "#432",
217
+ "bgcolor": "#653"
218
+ },
219
+ {
220
+ "id": 52,
221
+ "type": "CFGNorm",
222
+ "pos": [
223
+ 2230,
224
+ 230
225
+ ],
226
+ "size": [
227
+ 290,
228
+ 60
229
+ ],
230
+ "flags": {},
231
+ "order": 19,
232
+ "mode": 0,
233
+ "inputs": [
234
+ {
235
+ "name": "model",
236
+ "type": "MODEL",
237
+ "link": 71
238
+ }
239
+ ],
240
+ "outputs": [
241
+ {
242
+ "name": "patched_model",
243
+ "type": "MODEL",
244
+ "links": [
245
+ 73
246
+ ]
247
+ }
248
+ ],
249
+ "properties": {
250
+ "cnr_id": "comfy-core",
251
+ "ver": "0.3.50",
252
+ "Node name for S&R": "CFGNorm",
253
+ "enableTabs": false,
254
+ "tabWidth": 65,
255
+ "tabXOffset": 10,
256
+ "hasSecondTab": false,
257
+ "secondTabText": "Send Back",
258
+ "secondTabOffset": 80,
259
+ "secondTabWidth": 65,
260
+ "ue_properties": {
261
+ "widget_ue_connectable": {
262
+ "strength": true
263
+ }
264
+ }
265
+ },
266
+ "widgets_values": [
267
+ 1
268
+ ]
269
+ },
270
+ {
271
+ "id": 55,
272
+ "type": "ModelSamplingAuraFlow",
273
+ "pos": [
274
+ 2230,
275
+ 120
276
+ ],
277
+ "size": [
278
+ 290,
279
+ 60
280
+ ],
281
+ "flags": {},
282
+ "order": 17,
283
+ "mode": 0,
284
+ "inputs": [
285
+ {
286
+ "name": "model",
287
+ "type": "MODEL",
288
+ "link": 95
289
+ }
290
+ ],
291
+ "outputs": [
292
+ {
293
+ "name": "MODEL",
294
+ "type": "MODEL",
295
+ "links": [
296
+ 71
297
+ ]
298
+ }
299
+ ],
300
+ "properties": {
301
+ "cnr_id": "comfy-core",
302
+ "ver": "0.3.48",
303
+ "Node name for S&R": "ModelSamplingAuraFlow",
304
+ "enableTabs": false,
305
+ "tabWidth": 65,
306
+ "tabXOffset": 10,
307
+ "hasSecondTab": false,
308
+ "secondTabText": "Send Back",
309
+ "secondTabOffset": 80,
310
+ "secondTabWidth": 65,
311
+ "widget_ue_connectable": {}
312
+ },
313
+ "widgets_values": [
314
+ 3
315
+ ]
316
+ },
317
+ {
318
+ "id": 56,
319
+ "type": "KSampler",
320
+ "pos": [
321
+ 2230,
322
+ 340
323
+ ],
324
+ "size": [
325
+ 300,
326
+ 474
327
+ ],
328
+ "flags": {},
329
+ "order": 25,
330
+ "mode": 0,
331
+ "inputs": [
332
+ {
333
+ "name": "model",
334
+ "type": "MODEL",
335
+ "link": 73
336
+ },
337
+ {
338
+ "name": "positive",
339
+ "type": "CONDITIONING",
340
+ "link": 74
341
+ },
342
+ {
343
+ "name": "negative",
344
+ "type": "CONDITIONING",
345
+ "link": 75
346
+ },
347
+ {
348
+ "name": "latent_image",
349
+ "type": "LATENT",
350
+ "link": 76
351
+ }
352
+ ],
353
+ "outputs": [
354
+ {
355
+ "name": "LATENT",
356
+ "type": "LATENT",
357
+ "slot_index": 0,
358
+ "links": [
359
+ 91
360
+ ]
361
+ }
362
+ ],
363
+ "properties": {
364
+ "cnr_id": "comfy-core",
365
+ "ver": "0.3.48",
366
+ "Node name for S&R": "KSampler",
367
+ "enableTabs": false,
368
+ "tabWidth": 65,
369
+ "tabXOffset": 10,
370
+ "hasSecondTab": false,
371
+ "secondTabText": "Send Back",
372
+ "secondTabOffset": 80,
373
+ "secondTabWidth": 65,
374
+ "widget_ue_connectable": {}
375
+ },
376
+ "widgets_values": [
377
+ 84333832884411,
378
+ "randomize",
379
+ 4,
380
+ 1,
381
+ "euler",
382
+ "simple",
383
+ 1
384
+ ]
385
+ },
386
+ {
387
+ "id": 59,
388
+ "type": "EmptySD3LatentImage",
389
+ "pos": [
390
+ 2240,
391
+ 1110
392
+ ],
393
+ "size": [
394
+ 270,
395
+ 106
396
+ ],
397
+ "flags": {},
398
+ "order": 3,
399
+ "mode": 0,
400
+ "inputs": [],
401
+ "outputs": [
402
+ {
403
+ "name": "LATENT",
404
+ "type": "LATENT",
405
+ "links": []
406
+ }
407
+ ],
408
+ "properties": {
409
+ "cnr_id": "comfy-core",
410
+ "ver": "0.3.59",
411
+ "Node name for S&R": "EmptySD3LatentImage"
412
+ },
413
+ "widgets_values": [
414
+ 1024,
415
+ 1024,
416
+ 1
417
+ ]
418
+ },
419
+ {
420
+ "id": 60,
421
+ "type": "VAEEncode",
422
+ "pos": [
423
+ 1980,
424
+ 1150
425
+ ],
426
+ "size": [
427
+ 140,
428
+ 46
429
+ ],
430
+ "flags": {},
431
+ "order": 22,
432
+ "mode": 0,
433
+ "inputs": [
434
+ {
435
+ "name": "pixels",
436
+ "type": "IMAGE",
437
+ "link": 78
438
+ },
439
+ {
440
+ "name": "vae",
441
+ "type": "VAE",
442
+ "link": 79
443
+ }
444
+ ],
445
+ "outputs": [
446
+ {
447
+ "name": "LATENT",
448
+ "type": "LATENT",
449
+ "links": [
450
+ 76
451
+ ]
452
+ }
453
+ ],
454
+ "properties": {
455
+ "cnr_id": "comfy-core",
456
+ "ver": "0.3.50",
457
+ "Node name for S&R": "VAEEncode",
458
+ "enableTabs": false,
459
+ "tabWidth": 65,
460
+ "tabXOffset": 10,
461
+ "hasSecondTab": false,
462
+ "secondTabText": "Send Back",
463
+ "secondTabOffset": 80,
464
+ "secondTabWidth": 65,
465
+ "ue_properties": {
466
+ "widget_ue_connectable": {}
467
+ }
468
+ },
469
+ "widgets_values": []
470
+ },
471
+ {
472
+ "id": 61,
473
+ "type": "TextEncodeQwenImageEditPlus",
474
+ "pos": [
475
+ 1710,
476
+ 430
477
+ ],
478
+ "size": [
479
+ 400,
480
+ 200
481
+ ],
482
+ "flags": {},
483
+ "order": 23,
484
+ "mode": 0,
485
+ "inputs": [
486
+ {
487
+ "name": "clip",
488
+ "type": "CLIP",
489
+ "link": 80
490
+ },
491
+ {
492
+ "name": "vae",
493
+ "shape": 7,
494
+ "type": "VAE",
495
+ "link": 81
496
+ },
497
+ {
498
+ "name": "image1",
499
+ "shape": 7,
500
+ "type": "IMAGE",
501
+ "link": 82
502
+ },
503
+ {
504
+ "name": "image2",
505
+ "shape": 7,
506
+ "type": "IMAGE",
507
+ "link": null
508
+ },
509
+ {
510
+ "name": "image3",
511
+ "shape": 7,
512
+ "type": "IMAGE",
513
+ "link": null
514
+ }
515
+ ],
516
+ "outputs": [
517
+ {
518
+ "name": "CONDITIONING",
519
+ "type": "CONDITIONING",
520
+ "links": [
521
+ 75
522
+ ]
523
+ }
524
+ ],
525
+ "properties": {
526
+ "cnr_id": "comfy-core",
527
+ "ver": "0.3.59",
528
+ "Node name for S&R": "TextEncodeQwenImageEditPlus"
529
+ },
530
+ "widgets_values": [
531
+ ""
532
+ ],
533
+ "color": "#223",
534
+ "bgcolor": "#335"
535
+ },
536
+ {
537
+ "id": 62,
538
+ "type": "MarkdownNote",
539
+ "pos": [
540
+ 2220,
541
+ 1280
542
+ ],
543
+ "size": [
544
+ 330,
545
+ 90
546
+ ],
547
+ "flags": {},
548
+ "order": 4,
549
+ "mode": 0,
550
+ "inputs": [],
551
+ "outputs": [],
552
+ "title": "Note: About image size",
553
+ "properties": {},
554
+ "widgets_values": [
555
+ "You can replace **VAE Encode** with the latent from **EmptySD3LatentImage** to customize the image size."
556
+ ],
557
+ "color": "#432",
558
+ "bgcolor": "#653"
559
+ },
560
+ {
561
+ "id": 63,
562
+ "type": "MarkdownNote",
563
+ "pos": [
564
+ 2230,
565
+ 860
566
+ ],
567
+ "size": [
568
+ 300,
569
+ 160
570
+ ],
571
+ "flags": {},
572
+ "order": 5,
573
+ "mode": 0,
574
+ "inputs": [],
575
+ "outputs": [],
576
+ "title": "Note: KSampler settings",
577
+ "properties": {},
578
+ "widgets_values": [
579
+ "You can test and find the best settings yourself. The following table is for reference.\n\n| Model | Steps | CFG |\n|---------------------|---------------|---------------|\n| Official | 50 | 4.0 |\n| fp8_e4m3fn | 20 | 2.5 |\n| fp8_e4m3fn + 4steps LoRA | 4 | 1.0 |\n"
+ ],
+ "color": "#432",
+ "bgcolor": "#653"
+ },
+ {
+ "id": 69,
+ "type": "VAEDecode",
+ "pos": [
+ 2570,
+ 120
+ ],
+ "size": [
+ 210,
+ 46
+ ],
+ "flags": {
+ "collapsed": false
+ },
+ "order": 26,
+ "mode": 0,
+ "inputs": [
+ {
+ "name": "samples",
+ "type": "LATENT",
+ "link": 91
+ },
+ {
+ "name": "vae",
+ "type": "VAE",
+ "link": 92
+ }
+ ],
+ "outputs": [
+ {
+ "name": "IMAGE",
+ "type": "IMAGE",
+ "slot_index": 0,
+ "links": [
+ 77
+ ]
+ }
+ ],
+ "properties": {
+ "cnr_id": "comfy-core",
+ "ver": "0.3.48",
+ "Node name for S&R": "VAEDecode",
+ "enableTabs": false,
+ "tabWidth": 65,
+ "tabXOffset": 10,
+ "hasSecondTab": false,
+ "secondTabText": "Send Back",
+ "secondTabOffset": 80,
+ "secondTabWidth": 65,
+ "widget_ue_connectable": {}
+ },
+ "widgets_values": []
+ },
+ {
+ "id": 72,
+ "type": "MarkdownNote",
+ "pos": [
+ 1250,
+ 1280
+ ],
+ "size": [
+ 290,
+ 140
+ ],
+ "flags": {},
+ "order": 6,
+ "mode": 0,
+ "inputs": [],
+ "outputs": [],
+ "properties": {},
+ "widgets_values": [
+ "This node prevents poor output caused by excessively large input images. When a single image is provided, its size determines the output size.\n\n**TextEncodeQwenImageEditPlus** scales your input to about 1024×1024 pixels, and the size of your first input image is used. This node avoids passing an input image that is too large (such as 3000×3000 pixels), which could give bad results."
656
+ ],
657
+ "color": "#432",
658
+ "bgcolor": "#653"
659
+ },
660
+ {
661
+ "id": 38,
662
+ "type": "UNETLoader",
663
+ "pos": [
664
+ -320,
665
+ 150
666
+ ],
667
+ "size": [
668
+ 270,
669
+ 82
670
+ ],
671
+ "flags": {},
672
+ "order": 7,
673
+ "mode": 0,
674
+ "inputs": [],
675
+ "outputs": [
676
+ {
677
+ "name": "MODEL",
678
+ "type": "MODEL",
679
+ "links": [
680
+ 61
681
+ ]
682
+ }
683
+ ],
684
+ "properties": {
685
+ "cnr_id": "comfy-core",
686
+ "ver": "0.3.40",
687
+ "Node name for S&R": "UNETLoader",
688
+ "models": [
689
+ {
690
+ "name": "flux1-krea-dev_fp8_scaled.safetensors",
691
+ "url": "https://huggingface.co/Comfy-Org/FLUX.1-Krea-dev_ComfyUI/resolve/main/split_files/diffusion_models/flux1-krea-dev_fp8_scaled.safetensors",
692
+ "directory": "diffusion_models"
693
+ }
694
+ ]
695
+ },
696
+ "widgets_values": [
697
+ "flux1-krea-dev_fp8_scaled.safetensors",
698
+ "default"
699
+ ]
700
+ },
701
+ {
702
+ "id": 53,
703
+ "type": "VAELoader",
704
+ "pos": [
705
+ 1251.3350830078125,
706
+ 769.044189453125
707
+ ],
708
+ "size": [
709
+ 330,
710
+ 60
711
+ ],
712
+ "flags": {},
713
+ "order": 8,
714
+ "mode": 0,
715
+ "inputs": [],
716
+ "outputs": [
717
+ {
718
+ "name": "VAE",
719
+ "type": "VAE",
720
+ "slot_index": 0,
721
+ "links": [
722
+ 79,
723
+ 81,
724
+ 87,
725
+ 92
726
+ ]
727
+ }
728
+ ],
729
+ "properties": {
730
+ "cnr_id": "comfy-core",
731
+ "ver": "0.3.48",
732
+ "Node name for S&R": "VAELoader",
733
+ "models": [
734
+ {
735
+ "name": "qwen_image_vae.safetensors",
736
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image_ComfyUI/resolve/main/split_files/vae/qwen_image_vae.safetensors",
737
+ "directory": "vae"
738
+ }
739
+ ],
740
+ "enableTabs": false,
741
+ "tabWidth": 65,
742
+ "tabXOffset": 10,
743
+ "hasSecondTab": false,
744
+ "secondTabText": "Send Back",
745
+ "secondTabOffset": 80,
746
+ "secondTabWidth": 65,
747
+ "widget_ue_connectable": {}
748
+ },
749
+ "widgets_values": [
750
+ "qwen_image_vae.safetensors"
751
+ ]
752
+ },
753
+ {
754
+ "id": 54,
755
+ "type": "CLIPLoader",
756
+ "pos": [
757
+ 1246.7269287109375,
758
+ 558.9865112304688
759
+ ],
760
+ "size": [
761
+ 330,
762
+ 110
763
+ ],
764
+ "flags": {},
765
+ "order": 9,
766
+ "mode": 0,
767
+ "inputs": [],
768
+ "outputs": [
769
+ {
770
+ "name": "CLIP",
771
+ "type": "CLIP",
772
+ "slot_index": 0,
773
+ "links": [
774
+ 80,
775
+ 86
776
+ ]
777
+ }
778
+ ],
779
+ "properties": {
780
+ "cnr_id": "comfy-core",
781
+ "ver": "0.3.48",
782
+ "Node name for S&R": "CLIPLoader",
783
+ "models": [
784
+ {
785
+ "name": "qwen_2.5_vl_7b_fp8_scaled.safetensors",
786
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image_ComfyUI/resolve/main/split_files/text_encoders/qwen_2.5_vl_7b_fp8_scaled.safetensors",
787
+ "directory": "text_encoders"
788
+ }
789
+ ],
790
+ "enableTabs": false,
791
+ "tabWidth": 65,
792
+ "tabXOffset": 10,
793
+ "hasSecondTab": false,
794
+ "secondTabText": "Send Back",
795
+ "secondTabOffset": 80,
796
+ "secondTabWidth": 65,
797
+ "widget_ue_connectable": {}
798
+ },
799
+ "widgets_values": [
800
+ "qwen_2.5_vl_7b_fp8_scaled.safetensors",
801
+ "qwen_image",
802
+ "default"
803
+ ]
804
+ },
805
+ {
806
+ "id": 73,
807
+ "type": "LoraLoaderModelOnly",
808
+ "pos": [
809
+ 1227.6541748046875,
810
+ 406.69415283203125
811
+ ],
812
+ "size": [
813
+ 270,
814
+ 82
815
+ ],
816
+ "flags": {},
817
+ "order": 15,
818
+ "mode": 0,
819
+ "inputs": [
820
+ {
821
+ "name": "model",
822
+ "type": "MODEL",
823
+ "link": 94
824
+ }
825
+ ],
826
+ "outputs": [
827
+ {
828
+ "name": "MODEL",
829
+ "type": "MODEL",
830
+ "links": [
831
+ 95
832
+ ]
833
+ }
834
+ ],
835
+ "properties": {
836
+ "cnr_id": "comfy-core",
837
+ "ver": "0.3.64",
838
+ "Node name for S&R": "LoraLoaderModelOnly"
839
+ },
840
+ "widgets_values": [
841
+ "next-scene_lora_v1-3000.safetensors",
842
+ 1
843
+ ]
844
+ },
845
+ {
846
+ "id": 65,
847
+ "type": "LoraLoaderModelOnly",
848
+ "pos": [
849
+ 1222.294677734375,
850
+ 231.0044403076172
851
+ ],
852
+ "size": [
853
+ 310,
854
+ 82
855
+ ],
856
+ "flags": {},
857
+ "order": 13,
858
+ "mode": 0,
859
+ "inputs": [
860
+ {
861
+ "name": "model",
862
+ "type": "MODEL",
863
+ "link": 85
864
+ }
865
+ ],
866
+ "outputs": [
867
+ {
868
+ "name": "MODEL",
869
+ "type": "MODEL",
870
+ "links": [
871
+ 94
872
+ ]
873
+ }
874
+ ],
875
+ "properties": {
876
+ "cnr_id": "comfy-core",
877
+ "ver": "0.3.50",
878
+ "Node name for S&R": "LoraLoaderModelOnly",
879
+ "models": [
880
+ {
881
+ "name": "Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
882
+ "url": "https://huggingface.co/lightx2v/Qwen-Image-Lightning/resolve/main/Qwen-Image-Edit-2509/Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
883
+ "directory": "loras"
884
+ }
885
+ ],
886
+ "enableTabs": false,
887
+ "tabWidth": 65,
888
+ "tabXOffset": 10,
889
+ "hasSecondTab": false,
890
+ "secondTabText": "Send Back",
891
+ "secondTabOffset": 80,
892
+ "secondTabWidth": 65,
893
+ "ue_properties": {
894
+ "widget_ue_connectable": {
895
+ "lora_name": true,
896
+ "strength_model": true
897
+ }
898
+ }
899
+ },
900
+ "widgets_values": [
901
+ "Qwen-Image-Edit-2509-Lightning-4steps-V1.0-bf16.safetensors",
902
+ 1
903
+ ]
904
+ },
905
+ {
906
+ "id": 57,
907
+ "type": "UNETLoader",
908
+ "pos": [
909
+ 1262.0018310546875,
910
+ 82.82173919677734
911
+ ],
912
+ "size": [
913
+ 330,
914
+ 90
915
+ ],
916
+ "flags": {},
917
+ "order": 10,
918
+ "mode": 0,
919
+ "inputs": [],
920
+ "outputs": [
921
+ {
922
+ "name": "MODEL",
923
+ "type": "MODEL",
924
+ "slot_index": 0,
925
+ "links": [
926
+ 85
927
+ ]
928
+ }
929
+ ],
930
+ "properties": {
931
+ "cnr_id": "comfy-core",
932
+ "ver": "0.3.48",
933
+ "Node name for S&R": "UNETLoader",
934
+ "models": [
935
+ {
936
+ "name": "qwen_image_edit_2509_fp8_e4m3fn.safetensors",
937
+ "url": "https://huggingface.co/Comfy-Org/Qwen-Image-Edit_ComfyUI/resolve/main/split_files/diffusion_models/qwen_image_edit_2509_fp8_e4m3fn.safetensors",
938
+ "directory": "diffusion_models"
939
+ }
940
+ ],
941
+ "enableTabs": false,
942
+ "tabWidth": 65,
943
+ "tabXOffset": 10,
944
+ "hasSecondTab": false,
945
+ "secondTabText": "Send Back",
946
+ "secondTabOffset": 80,
947
+ "secondTabWidth": 65,
948
+ "widget_ue_connectable": {}
949
+ },
950
+ "widgets_values": [
951
+ "qwen_image_edit_2509_fp8_e4m3fn.safetensors",
952
+ "default"
953
+ ]
954
+ },
955
+ {
956
+ "id": 9,
957
+ "type": "SaveImage",
958
+ "pos": [
959
+ -68.49736022949219,
960
+ 949.423583984375
961
+ ],
962
+ "size": [
963
+ 640,
964
+ 660
965
+ ],
966
+ "flags": {},
967
+ "order": 20,
968
+ "mode": 0,
969
+ "inputs": [
970
+ {
971
+ "name": "images",
972
+ "type": "IMAGE",
973
+ "link": 9
974
+ }
975
+ ],
976
+ "outputs": [],
977
+ "properties": {
978
+ "cnr_id": "comfy-core",
979
+ "ver": "0.3.40",
980
+ "Node name for S&R": "SaveImage"
981
+ },
982
+ "widgets_values": [
983
+ "flux_krea/flux_krea"
984
+ ]
985
+ },
986
+ {
987
+ "id": 58,
988
+ "type": "SaveImage",
989
+ "pos": [
990
+ 592.7343139648438,
991
+ 952.6732788085938
992
+ ],
993
+ "size": [
994
+ 507.185302734375,
995
+ 476.2288513183594
996
+ ],
997
+ "flags": {},
998
+ "order": 27,
999
+ "mode": 0,
1000
+ "inputs": [
1001
+ {
1002
+ "name": "images",
1003
+ "type": "IMAGE",
1004
+ "link": 77
1005
+ }
1006
+ ],
1007
+ "outputs": [],
1008
+ "properties": {
1009
+ "cnr_id": "comfy-core",
1010
+ "ver": "0.3.48",
1011
+ "Node name for S&R": "SaveImage",
1012
+ "enableTabs": false,
1013
+ "tabWidth": 65,
1014
+ "tabXOffset": 10,
1015
+ "hasSecondTab": false,
1016
+ "secondTabText": "Send Back",
1017
+ "secondTabOffset": 80,
1018
+ "secondTabWidth": 65,
1019
+ "widget_ue_connectable": {}
1020
+ },
1021
+ "widgets_values": [
1022
+ "ComfyUI"
1023
+ ]
1024
+ },
1025
+ {
1026
+ "id": 45,
1027
+ "type": "CLIPTextEncode",
1028
+ "pos": [
1029
+ -426.0304260253906,
1030
+ 959.8533325195312
1031
+ ],
1032
+ "size": [
1033
+ 330,
1034
+ 210
1035
+ ],
1036
+ "flags": {},
1037
+ "order": 12,
1038
+ "mode": 0,
1039
+ "inputs": [
1040
+ {
1041
+ "name": "clip",
1042
+ "type": "CLIP",
1043
+ "link": 64
1044
+ }
1045
+ ],
1046
+ "outputs": [
1047
+ {
1048
+ "name": "CONDITIONING",
1049
+ "type": "CONDITIONING",
1050
+ "links": [
1051
+ 65,
1052
+ 66
1053
+ ]
1054
+ }
1055
+ ],
1056
+ "properties": {
1057
+ "cnr_id": "comfy-core",
1058
+ "ver": "0.3.47",
1059
+ "Node name for S&R": "CLIPTextEncode"
1060
+ },
1061
+ "widgets_values": [
1062
+ "A realistic cinematic shot of an old lighthouse keeper standing at the edge of a cliff during a violent storm, his long coat whipping in the wind, waves crashing below, lightning flashing across the sky. dramatic lighting, cinematic realism, moody atmosphere."
1063
+ ]
1064
+ },
1065
+ {
1066
+ "id": 68,
1067
+ "type": "TextEncodeQwenImageEditPlus",
1068
+ "pos": [
1069
+ 1703.297607421875,
1070
+ 179.4862518310547
1071
+ ],
1072
+ "size": [
1073
+ 400,
1074
+ 200
1075
+ ],
1076
+ "flags": {},
1077
+ "order": 24,
1078
+ "mode": 0,
1079
+ "inputs": [
1080
+ {
1081
+ "name": "clip",
1082
+ "type": "CLIP",
1083
+ "link": 86
1084
+ },
1085
+ {
1086
+ "name": "vae",
1087
+ "shape": 7,
1088
+ "type": "VAE",
1089
+ "link": 87
1090
+ },
1091
+ {
1092
+ "name": "image1",
1093
+ "shape": 7,
1094
+ "type": "IMAGE",
1095
+ "link": 98
1096
+ },
1097
+ {
1098
+ "name": "image2",
1099
+ "shape": 7,
1100
+ "type": "IMAGE",
1101
+ "link": null
1102
+ },
1103
+ {
1104
+ "name": "image3",
1105
+ "shape": 7,
1106
+ "type": "IMAGE",
1107
+ "link": null
1108
+ }
1109
+ ],
1110
+ "outputs": [
1111
+ {
1112
+ "name": "CONDITIONING",
1113
+ "type": "CONDITIONING",
1114
+ "links": [
1115
+ 74
1116
+ ]
1117
+ }
1118
+ ],
1119
+ "properties": {
1120
+ "cnr_id": "comfy-core",
1121
+ "ver": "0.3.59",
1122
+ "Node name for S&R": "TextEncodeQwenImageEditPlus"
1123
+ },
1124
+ "widgets_values": [
1125
+ "Next Scene: The camera pushes in from behind the keeper, showing him gripping the rail as the storm rages and lightning illuminates his weathered face. realistic cinematic style"
1126
+ ],
1127
+ "color": "#232",
1128
+ "bgcolor": "#353"
1129
+ },
1130
+ {
1131
+ "id": 27,
1132
+ "type": "EmptySD3LatentImage",
1133
+ "pos": [
1134
+ -320,
1135
+ 630
1136
+ ],
1137
+ "size": [
1138
+ 270,
1139
+ 120
1140
+ ],
1141
+ "flags": {},
1142
+ "order": 11,
1143
+ "mode": 0,
1144
+ "inputs": [],
1145
+ "outputs": [
1146
+ {
1147
+ "name": "LATENT",
1148
+ "type": "LATENT",
1149
+ "slot_index": 0,
1150
+ "links": [
1151
+ 51
1152
+ ]
1153
+ }
1154
+ ],
1155
+ "properties": {
1156
+ "cnr_id": "comfy-core",
1157
+ "ver": "0.3.40",
1158
+ "Node name for S&R": "EmptySD3LatentImage"
1159
+ },
1160
+ "widgets_values": [
1161
+ 1280,
1162
+ 720,
1163
+ 1
1164
+ ]
1165
+ },
1166
+ {
1167
+ "id": 8,
1168
+ "type": "VAEDecode",
1169
+ "pos": [
1170
+ 230,
1171
+ 470
1172
+ ],
1173
+ "size": [
1174
+ 210,
1175
+ 46
1176
+ ],
1177
+ "flags": {
1178
+ "collapsed": false
1179
+ },
1180
+ "order": 18,
1181
+ "mode": 0,
1182
+ "inputs": [
1183
+ {
1184
+ "name": "samples",
1185
+ "type": "LATENT",
1186
+ "link": 52
1187
+ },
1188
+ {
1189
+ "name": "vae",
1190
+ "type": "VAE",
1191
+ "link": 58
1192
+ }
1193
+ ],
1194
+ "outputs": [
1195
+ {
1196
+ "name": "IMAGE",
1197
+ "type": "IMAGE",
1198
+ "slot_index": 0,
1199
+ "links": [
1200
+ 9,
1201
+ 97
1202
+ ]
1203
+ }
1204
+ ],
1205
+ "properties": {
1206
+ "cnr_id": "comfy-core",
1207
+ "ver": "0.3.40",
1208
+ "Node name for S&R": "VAEDecode"
1209
+ },
1210
+ "widgets_values": []
1211
+ },
1212
+ {
1213
+ "id": 71,
1214
+ "type": "ImageScaleToTotalPixels",
1215
+ "pos": [
1216
+ 1408.750732421875,
1217
+ 1016.1314697265625
1218
+ ],
1219
+ "size": [
1220
+ 270,
1221
+ 82
1222
+ ],
1223
+ "flags": {},
1224
+ "order": 21,
1225
+ "mode": 0,
1226
+ "inputs": [
1227
+ {
1228
+ "name": "image",
1229
+ "type": "IMAGE",
1230
+ "link": 97
1231
+ }
1232
+ ],
1233
+ "outputs": [
1234
+ {
1235
+ "name": "IMAGE",
1236
+ "type": "IMAGE",
1237
+ "links": [
1238
+ 78,
1239
+ 82,
1240
+ 98
1241
+ ]
1242
+ }
1243
+ ],
1244
+ "properties": {
1245
+ "cnr_id": "comfy-core",
1246
+ "ver": "0.3.50",
1247
+ "Node name for S&R": "ImageScaleToTotalPixels",
1248
+ "enableTabs": false,
1249
+ "tabWidth": 65,
1250
+ "tabXOffset": 10,
1251
+ "hasSecondTab": false,
1252
+ "secondTabText": "Send Back",
1253
+ "secondTabOffset": 80,
1254
+ "secondTabWidth": 65,
1255
+ "ue_properties": {
1256
+ "widget_ue_connectable": {
1257
+ "upscale_method": true,
1258
+ "megapixels": true
1259
+ }
1260
+ }
1261
+ },
1262
+ "widgets_values": [
1263
+ "lanczos",
1264
+ 1
1265
+ ]
1266
+ }
1267
+ ],
1268
+ "links": [
1269
+ [
1270
+ 9,
1271
+ 8,
1272
+ 0,
1273
+ 9,
1274
+ 0,
1275
+ "IMAGE"
1276
+ ],
1277
+ [
1278
+ 51,
1279
+ 27,
1280
+ 0,
1281
+ 31,
1282
+ 3,
1283
+ "LATENT"
1284
+ ],
1285
+ [
1286
+ 52,
1287
+ 31,
1288
+ 0,
1289
+ 8,
1290
+ 0,
1291
+ "LATENT"
1292
+ ],
1293
+ [
1294
+ 58,
1295
+ 39,
1296
+ 0,
1297
+ 8,
1298
+ 1,
1299
+ "VAE"
1300
+ ],
1301
+ [
1302
+ 61,
1303
+ 38,
1304
+ 0,
1305
+ 31,
1306
+ 0,
1307
+ "MODEL"
1308
+ ],
1309
+ [
1310
+ 63,
1311
+ 42,
1312
+ 0,
1313
+ 31,
1314
+ 2,
1315
+ "CONDITIONING"
1316
+ ],
1317
+ [
1318
+ 64,
1319
+ 40,
1320
+ 0,
1321
+ 45,
1322
+ 0,
1323
+ "CLIP"
1324
+ ],
1325
+ [
1326
+ 65,
1327
+ 45,
1328
+ 0,
1329
+ 31,
1330
+ 1,
1331
+ "CONDITIONING"
1332
+ ],
1333
+ [
1334
+ 66,
1335
+ 45,
1336
+ 0,
1337
+ 42,
1338
+ 0,
1339
+ "CONDITIONING"
1340
+ ],
1341
+ [
1342
+ 71,
1343
+ 55,
1344
+ 0,
1345
+ 52,
1346
+ 0,
1347
+ "MODEL"
1348
+ ],
1349
+ [
1350
+ 73,
1351
+ 52,
1352
+ 0,
1353
+ 56,
1354
+ 0,
1355
+ "MODEL"
1356
+ ],
1357
+ [
1358
+ 74,
1359
+ 68,
1360
+ 0,
1361
+ 56,
1362
+ 1,
1363
+ "CONDITIONING"
1364
+ ],
1365
+ [
1366
+ 75,
1367
+ 61,
1368
+ 0,
1369
+ 56,
1370
+ 2,
1371
+ "CONDITIONING"
1372
+ ],
1373
+ [
1374
+ 76,
1375
+ 60,
1376
+ 0,
1377
+ 56,
1378
+ 3,
1379
+ "LATENT"
1380
+ ],
1381
+ [
1382
+ 77,
1383
+ 69,
1384
+ 0,
1385
+ 58,
1386
+ 0,
1387
+ "IMAGE"
1388
+ ],
1389
+ [
1390
+ 78,
1391
+ 71,
1392
+ 0,
1393
+ 60,
1394
+ 0,
1395
+ "IMAGE"
1396
+ ],
1397
+ [
1398
+ 79,
1399
+ 53,
1400
+ 0,
1401
+ 60,
1402
+ 1,
1403
+ "VAE"
1404
+ ],
1405
+ [
1406
+ 80,
1407
+ 54,
1408
+ 0,
1409
+ 61,
1410
+ 0,
1411
+ "CLIP"
1412
+ ],
1413
+ [
1414
+ 81,
1415
+ 53,
1416
+ 0,
1417
+ 61,
1418
+ 1,
1419
+ "VAE"
1420
+ ],
1421
+ [
1422
+ 82,
1423
+ 71,
1424
+ 0,
1425
+ 61,
1426
+ 2,
1427
+ "IMAGE"
1428
+ ],
1429
+ [
1430
+ 85,
1431
+ 57,
1432
+ 0,
1433
+ 65,
1434
+ 0,
1435
+ "MODEL"
1436
+ ],
1437
+ [
1438
+ 86,
1439
+ 54,
1440
+ 0,
1441
+ 68,
1442
+ 0,
1443
+ "CLIP"
1444
+ ],
1445
+ [
1446
+ 87,
1447
+ 53,
1448
+ 0,
1449
+ 68,
1450
+ 1,
1451
+ "VAE"
1452
+ ],
1453
+ [
1454
+ 91,
1455
+ 56,
1456
+ 0,
1457
+ 69,
1458
+ 0,
1459
+ "LATENT"
1460
+ ],
1461
+ [
1462
+ 92,
1463
+ 53,
1464
+ 0,
1465
+ 69,
1466
+ 1,
1467
+ "VAE"
1468
+ ],
1469
+ [
1470
+ 94,
1471
+ 65,
1472
+ 0,
1473
+ 73,
1474
+ 0,
1475
+ "MODEL"
1476
+ ],
1477
+ [
1478
+ 95,
1479
+ 73,
1480
+ 0,
1481
+ 55,
1482
+ 0,
1483
+ "MODEL"
1484
+ ],
1485
+ [
1486
+ 97,
1487
+ 8,
1488
+ 0,
1489
+ 71,
1490
+ 0,
1491
+ "IMAGE"
1492
+ ],
1493
+ [
1494
+ 98,
1495
+ 71,
1496
+ 0,
1497
+ 68,
1498
+ 2,
1499
+ "IMAGE"
1500
+ ]
1501
+ ],
1502
+ "groups": [
1503
+ {
1504
+ "id": 1,
1505
+ "title": "Step 1 - Load Models Here",
1506
+ "bounding": [
1507
+ -330,
1508
+ 80,
1509
+ 300,
1510
+ 460
1511
+ ],
1512
+ "color": "#3f789e",
1513
+ "font_size": 24,
1514
+ "flags": {}
1515
+ },
1516
+ {
1517
+ "id": 2,
1518
+ "title": "Step 2 - Image Size",
1519
+ "bounding": [
1520
+ -330,
1521
+ 560,
1522
+ 300,
1523
+ 200
1524
+ ],
1525
+ "color": "#3f789e",
1526
+ "font_size": 24,
1527
+ "flags": {}
1528
+ },
1529
+ {
1530
+ "id": 3,
1531
+ "title": "Step 3 - Prompt",
1532
+ "bounding": [
1533
+ -10,
1534
+ 80,
1535
+ 360,
1536
+ 333.6000061035156
1537
+ ],
1538
+ "color": "#3f789e",
1539
+ "font_size": 24,
1540
+ "flags": {}
1541
+ },
1542
+ {
1543
+ "id": 4,
1544
+ "title": "Step 1 - Load models",
+ "bounding": [
+ 1220,
+ 80,
+ 370,
+ 570
+ ],
+ "color": "#3f789e",
+ "font_size": 24,
+ "flags": {}
+ },
+ {
+ "id": 5,
+ "title": "Step 2 - Upload image for editing",
+ "bounding": [
+ 1220,
+ 680,
+ 970,
+ 550
+ ],
+ "color": "#3f789e",
+ "font_size": 24,
+ "flags": {}
+ },
+ {
+ "id": 6,
+ "title": "Step 4 - Prompt",
+ "bounding": [
+ 1620,
+ 80,
+ 570,
+ 570
+ ],
+ "color": "#3f789e",
+ "font_size": 24,
+ "flags": {}
+ },
+ {
+ "id": 7,
+ "title": "Step 3 - Image Size",
+ "bounding": [
+ 2220,
+ 1030,
+ 310,
+ 200
+ ],
+ "color": "#3f789e",
+ "font_size": 24,
+ "flags": {}
+ }
+ ],
+ "config": {},
+ "extra": {
+ "ds": {
+ "scale": 0.7187434554973852,
+ "offset": [
+ 1205.8930184858773,
+ -570.9619526574623
+ ]
+ },
+ "frontendVersion": "1.27.10",
+ "VHS_latentpreview": false,
+ "VHS_latentpreviewrate": 0,
+ "VHS_MetadataImage": true,
+ "VHS_KeepIntermediate": true
+ },
+ "version": 0.4
+ }