What is Detailer?

Stable Diffusion 1.5 works best at resolutions of roughly 512-768px. When you draw a full body at that size, a single face gets only about 30-50px of the image.
Resolution is part of the problem, but the real issue is that, seen against the whole frame, the face is simply too small: drawing it becomes overly fine-grained work for the model, and like humans, AI is not good at such fine-grained work.
So the idea is: why not cut out just the face, redraw it, and paste it back into the original image?
This sequence of steps, Crop → Resize/Upscale → Inpaint → Paste back into the original, is called a Detailer.
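Conceptually, those four steps fit in a few lines. The sketch below uses Pillow and takes the inpainting step as a plain callable so it stays backend-agnostic; it is a minimal illustration of the idea, not how the ComfyUI nodes are actually implemented.

```python
from typing import Callable
from PIL import Image

def detailer(image: Image.Image,
             box: tuple[int, int, int, int],
             inpaint: Callable[[Image.Image], Image.Image],
             work_size: int = 512) -> Image.Image:
    """Crop -> resize up -> inpaint -> paste back: the four Detailer steps."""
    crop = image.crop(box)                                     # 1. crop the region around the face
    original_size = crop.size
    crop = crop.resize((work_size, work_size), Image.LANCZOS)  # 2. enlarge to a comfortable working canvas
    crop = inpaint(crop)                                       # 3. redraw the crop with any inpainting backend
    crop = crop.resize(original_size, Image.LANCZOS)           # 4. shrink back down and paste into place
    out = image.copy()
    out.paste(crop, (box[0], box[1]))
    return out
```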
Why is Detailer necessary?
You might ask, "Why not just upscale the whole image and inpaint it?"
However, the computational cost of inpainting is determined by the size of the whole image, not by the size of the mask. With a 4K image, even if you only want to repaint the face, the sampler still has to process the entire 4K frame, which is very inefficient.
A Detailer is efficient because it crops out only the area around the part you want to inpaint.
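As a rough back-of-the-envelope check (assuming sampling cost scales roughly with pixel count):

```python
# Assumption: inpainting cost grows roughly linearly with the number of pixels processed.
full_frame = 3840 * 2160       # repainting the face directly on the 4K image
face_crop = 512 * 512          # repainting only a cropped 512x512 face canvas
print(full_frame / face_crop)  # ~31.6: the crop is about 30x cheaper
```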
Custom Node
Generally, the Detailer nodes included in the Impact Pack are used. They are more full-featured, automating everything up to object detection, but they also have many pack-specific parameters and a steeper learning curve, so we will cover them separately.
Mask and Crop Region
Let's look at the difference between the mask and the crop region.

- Mask: the part you actually want to repaint (the face itself, etc.)
- Crop region: the "working canvas", expanded slightly outward from the mask or its bounding box (BBOX)
A Detailer is simply inpainting performed only within this crop region.
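As a rough sketch of how a crop region can be derived from a mask, the snippet below takes the mask's bounding box and grows it by an extend factor. The naming mirrors the node's parameters, but this is an illustration of the idea, not the node's actual code.

```python
import numpy as np

def crop_region_from_mask(mask: np.ndarray, extend_factor: float = 1.2):
    """Bounding box of the masked pixels, grown by extend_factor and clamped to the image."""
    ys, xs = np.nonzero(mask)              # coordinates of masked pixels
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2  # grow the box around the mask centre
    half_w = (x1 - x0) * extend_factor / 2
    half_h = (y1 - y0) * extend_factor / 2
    x0 = int(max(cx - half_w, 0)); x1 = int(min(cx + half_w, mask.shape[1] - 1))
    y0 = int(max(cy - half_h, 0)); y1 = int(min(cy + half_h, mask.shape[0] - 1))
    return x0, y0, x1, y1
```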
✂️ Inpaint Crop

{
"id": "3b990c76-d92c-49ec-8f4c-6f3d1cbcc594",
"revision": 0,
"last_node_id": 13,
"last_link_id": 20,
"nodes": [
{
"id": 3,
"type": "LoadImage",
"pos": [
158.8744138939285,
364.03294906356285
],
"size": [
371.9572329709805,
583.1414645379411
],
"flags": {},
"order": 0,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
1
]
},
{
"name": "MASK",
"type": "MASK",
"links": [
7
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "LoadImage",
"image": "clipspace/clipspace-painted-masked-1765174382492.png [input]"
},
"widgets_values": [
"clipspace/clipspace-painted-masked-1765174382492.png [input]",
"image"
]
},
{
"id": 4,
"type": "PreviewImage",
"pos": [
921.2483639847512,
364.03294906356285
],
"size": [
301.62904350112717,
344.36448181818173
],
"flags": {},
"order": 2,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 2
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 12,
"type": "MaskPreview",
"pos": [
921.2483639847512,
763.842428144825
],
"size": [
307.83944479576166,
349.81618240301236
],
"flags": {},
"order": 3,
"mode": 0,
"inputs": [
{
"name": "mask",
"type": "MASK",
"link": 18
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "MaskPreview"
},
"widgets_values": []
},
{
"id": 1,
"type": "InpaintCropImproved",
"pos": [
572.237278935808,
364.03294906356285
],
"size": [
307.6054529780441,
626
],
"flags": {},
"order": 1,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 1
},
{
"name": "mask",
"shape": 7,
"type": "MASK",
"link": 7
},
{
"name": "optional_context_mask",
"shape": 7,
"type": "MASK",
"link": null
}
],
"outputs": [
{
"name": "stitcher",
"type": "STITCHER",
"links": []
},
{
"name": "cropped_image",
"type": "IMAGE",
"links": [
2
]
},
{
"name": "cropped_mask",
"type": "MASK",
"links": [
18
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintCropImproved"
},
"widgets_values": [
"bilinear",
"bicubic",
false,
"ensure minimum resolution",
1024,
1024,
16384,
16384,
true,
0,
false,
32,
0.1,
false,
1,
1,
1,
1,
1.2,
true,
512,
512,
"32"
],
"color": "#232",
"bgcolor": "#353"
}
],
"links": [
[
1,
3,
0,
1,
0,
"IMAGE"
],
[
2,
1,
1,
4,
0,
"IMAGE"
],
[
7,
3,
1,
1,
1,
"MASK"
],
[
18,
1,
2,
12,
0,
"MASK"
]
],
"groups": [],
"config": {},
"extra": {
"ds": {
"scale": 0.9229599817706445,
"offset": [
160.30085263487416,
-190.57364276738986
]
},
"frontendVersion": "1.34.6",
"VHS_latentpreview": false,
"VHS_latentpreviewrate": 0,
"VHS_MetadataImage": true,
"VHS_KeepIntermediate": true
},
"version": 0.4
}
As the workflow shows, if you pass in the original image plus a mask, the node automatically builds a crop region with a little margin around the mask and resizes only that part to the specified size.
There are quite a few parameters, but in practice you only need to pay attention to the ones below.
| Parameter Name | Role/Meaning |
|---|---|
| mask_fill_holes | Automatically fills small holes (missed spots) in the mask |
| mask_expand_pixels | Expands the mask boundary outward by the specified number of pixels |
| mask_invert | Inverts the mask |
| mask_blend_pixels | Blurs the mask boundary |
| 🔥context_from_mask_extend_factor | Scale factor that sets the amount of margin when building the crop region from the mask |
| 🔥output_target_width | Output width after cropping, in pixels (the horizontal size of the face canvas, etc.) |
| 🔥output_target_height | Output height after cropping, in pixels (the vertical size of the face canvas, etc.) |
| output_padding | Pads the output so its resolution becomes a multiple of this value if necessary |
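The mask-shaping parameters are easiest to picture as two image operations: a dilation for mask_expand_pixels and a blur for mask_blend_pixels. The sketch below is a rough equivalent under that assumption; the node's actual implementation may differ.

```python
import numpy as np
from scipy.ndimage import binary_dilation, gaussian_filter

def shape_mask(mask: np.ndarray, expand_pixels: int = 0, blend_pixels: int = 32) -> np.ndarray:
    """Rough equivalent of mask_expand_pixels / mask_blend_pixels."""
    m = mask > 0.5
    if expand_pixels > 0:
        m = binary_dilation(m, iterations=expand_pixels)  # push the boundary outward
    m = m.astype(np.float32)
    if blend_pixels > 0:
        m = gaussian_filter(m, sigma=blend_pixels / 3.0)  # soften the edge so the seam blends
    return np.clip(m, 0.0, 1.0)
```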
✂️ Inpaint Stitch (Improved)

{
"id": "3b990c76-d92c-49ec-8f4c-6f3d1cbcc594",
"revision": 0,
"last_node_id": 14,
"last_link_id": 25,
"nodes": [
{
"id": 10,
"type": "PreviewImage",
"pos": [
1367.7437333774412,
513.7276442801832
],
"size": [
223.71000000000026,
258
],
"flags": {},
"order": 5,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 16
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 9,
"type": "EmptyImage",
"pos": [
890.6357108398461,
513.7276442801832
],
"size": [
210,
130
],
"flags": {},
"order": 0,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
14
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "EmptyImage"
},
"widgets_values": [
512,
512,
1,
255
]
},
{
"id": 8,
"type": "ImageBlend",
"pos": [
1129.1897221086429,
446.67980912237664
],
"size": [
210,
102
],
"flags": {},
"order": 3,
"mode": 0,
"inputs": [
{
"name": "image1",
"type": "IMAGE",
"link": 21
},
{
"name": "image2",
"type": "IMAGE",
"link": 14
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
15,
16
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "ImageBlend"
},
"widgets_values": [
0.6,
"normal"
]
},
{
"id": 7,
"type": "PreviewImage",
"pos": [
1641.6985006857185,
364.03294906356285
],
"size": [
414.22380399999975,
591.965072
],
"flags": {},
"order": 6,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 11
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 3,
"type": "LoadImage",
"pos": [
158.8744138939285,
364.03294906356285
],
"size": [
371.9572329709805,
583.1414645379411
],
"flags": {},
"order": 1,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
24
]
},
{
"name": "MASK",
"type": "MASK",
"links": [
25
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "LoadImage",
"image": "clipspace/clipspace-painted-masked-1765174382492.png [input]"
},
"widgets_values": [
"clipspace/clipspace-painted-masked-1765174382492.png [input]",
"image"
]
},
{
"id": 14,
"type": "InpaintCropImproved",
"pos": [
554.895920077805,
364.03294906356285
],
"size": [
307.6054529780441,
626
],
"flags": {},
"order": 2,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 24
},
{
"name": "mask",
"shape": 7,
"type": "MASK",
"link": 25
},
{
"name": "optional_context_mask",
"shape": 7,
"type": "MASK",
"link": null
}
],
"outputs": [
{
"name": "stitcher",
"type": "STITCHER",
"links": [
20
]
},
{
"name": "cropped_image",
"type": "IMAGE",
"links": [
21
]
},
{
"name": "cropped_mask",
"type": "MASK",
"links": []
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintCropImproved"
},
"widgets_values": [
"bilinear",
"bicubic",
false,
"ensure minimum resolution",
1024,
1024,
16384,
16384,
true,
0,
false,
32,
0.1,
false,
1,
1,
1,
1,
1.2,
true,
512,
512,
"32"
],
"color": "#232",
"bgcolor": "#353"
},
{
"id": 6,
"type": "InpaintStitchImproved",
"pos": [
1372.1887487452448,
364.03294906356285
],
"size": [
241.0976899545639,
46
],
"flags": {},
"order": 4,
"mode": 0,
"inputs": [
{
"name": "stitcher",
"type": "STITCHER",
"link": 20
},
{
"name": "inpainted_image",
"type": "IMAGE",
"link": 15
}
],
"outputs": [
{
"name": "image",
"type": "IMAGE",
"links": [
11
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintStitchImproved"
},
"widgets_values": [],
"color": "#432",
"bgcolor": "#653"
}
],
"links": [
[
11,
6,
0,
7,
0,
"IMAGE"
],
[
14,
9,
0,
8,
1,
"IMAGE"
],
[
15,
8,
0,
6,
1,
"IMAGE"
],
[
16,
8,
0,
10,
0,
"IMAGE"
],
[
20,
14,
0,
6,
0,
"STITCHER"
],
[
21,
14,
1,
8,
0,
"IMAGE"
],
[
24,
3,
0,
14,
0,
"IMAGE"
],
[
25,
3,
1,
14,
1,
"MASK"
]
],
"groups": [],
"config": {},
"extra": {
"ds": {
"scale": 0.8390545288824038,
"offset": [
-58.874413893928505,
-262.84113140979014
]
},
"frontendVersion": "1.34.6",
"VHS_latentpreview": false,
"VHS_latentpreviewrate": 0,
"VHS_MetadataImage": true,
"VHS_KeepIntermediate": true
},
"version": 0.4
}
The ✂️ Inpaint Stitch (Improved) node puts the modified crop back in its original position.
Only the masked part is written back onto the original image.
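Conceptually, the stitch is an alpha blend driven by the (blurred) crop mask, so only masked pixels change and the soft edge hides the seam. A minimal sketch, assuming the inpainted crop has already been resized back to the crop box's size:

```python
import numpy as np

def stitch_back(original: np.ndarray, inpainted_crop: np.ndarray,
                crop_mask: np.ndarray, box: tuple[int, int, int, int]) -> np.ndarray:
    """Blend the repainted crop into the original image, but only where the mask allows."""
    x0, y0, x1, y1 = box
    out = original.copy()
    region = out[y0:y1, x0:x1].astype(np.float32)
    alpha = crop_mask[..., None].astype(np.float32)  # soft mask -> soft seam
    blended = alpha * inpainted_crop.astype(np.float32) + (1.0 - alpha) * region
    out[y0:y1, x0:x1] = blended.astype(original.dtype)
    return out
```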
Manual Detailer with Inpaint Crop and Stitch
Now let's try a Detailer right away. It is really just the Inpaint Crop and Stitch pair dropped into an ordinary inpainting workflow.

{
"id": "8b9f7796-0873-4025-be3c-0f997f67f866",
"revision": 0,
"last_node_id": 20,
"last_link_id": 26,
"nodes": [
{
"id": 7,
"type": "CLIPTextEncode",
"pos": [
416.1970166015625,
392.37848510742185
],
"size": [
410.75801513671877,
158.82607910156253
],
"flags": {},
"order": 6,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 5
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
6
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CLIPTextEncode"
},
"widgets_values": [
"text, watermark"
]
},
{
"id": 13,
"type": "LoadImage",
"pos": [
-294.2855123966943,
742.4170000000003
],
"size": [
371.9572329709805,
583.1414645379411
],
"flags": {},
"order": 0,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
12
]
},
{
"name": "MASK",
"type": "MASK",
"links": [
13
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "LoadImage",
"image": "clipspace/clipspace-painted-masked-1765174382492.png [input]"
},
"widgets_values": [
"clipspace/clipspace-painted-masked-1765174382492.png [input]",
"image"
]
},
{
"id": 10,
"type": "VAELoader",
"pos": [
132.4262925619836,
611.9892535544651
],
"size": [
281.0743801652891,
58
],
"flags": {},
"order": 1,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "VAE",
"type": "VAE",
"links": [
10,
16
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "VAELoader"
},
"widgets_values": [
"vae-ft-mse-840000-ema-pruned.safetensors"
]
},
{
"id": 8,
"type": "VAEDecode",
"pos": [
1209,
188
],
"size": [
210,
46
],
"flags": {},
"order": 11,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 7
},
{
"name": "vae",
"type": "VAE",
"link": 10
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
18,
19
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "VAEDecode"
},
"widgets_values": []
},
{
"id": 11,
"type": "InpaintStitchImproved",
"pos": [
1449.2712348513148,
738.4170000000003
],
"size": [
241.0976899545639,
46
],
"flags": {},
"order": 12,
"mode": 0,
"inputs": [
{
"name": "stitcher",
"type": "STITCHER",
"link": 11
},
{
"name": "inpainted_image",
"type": "IMAGE",
"link": 18
}
],
"outputs": [
{
"name": "image",
"type": "IMAGE",
"links": [
17
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintStitchImproved"
},
"widgets_values": [],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 6,
"type": "CLIPTextEncode",
"pos": [
415,
186
],
"size": [
411.95503173828126,
151.0030493164063
],
"flags": {},
"order": 5,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 3
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
4
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CLIPTextEncode"
},
"widgets_values": [
"illustration of a portrait, beautifull girl,Open your mouth and close your eyes."
]
},
{
"id": 16,
"type": "PreviewImage",
"pos": [
1477.694978315702,
193.37892043958945
],
"size": [
350.79999999999995,
384.4
],
"flags": {},
"order": 13,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 19
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 9,
"type": "SaveImage",
"pos": [
1724.105085812445,
738.4170000000003
],
"size": [
423.52342840047254,
574.4169798178841
],
"flags": {},
"order": 14,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 17
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33"
},
"widgets_values": [
"ComfyUI"
]
},
{
"id": 3,
"type": "KSampler",
"pos": [
863,
186
],
"size": [
315,
262
],
"flags": {},
"order": 10,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 26
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 4
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 6
},
{
"name": "latent_image",
"type": "LATENT",
"link": 23
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"slot_index": 0,
"links": [
7
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "KSampler"
},
"widgets_values": [
12345,
"fixed",
20,
8,
"euler",
"normal",
0.55
]
},
{
"id": 18,
"type": "SetLatentNoiseMask",
"pos": [
634.8898651420305,
682.6952818257673
],
"size": [
180.74765625,
46
],
"flags": {},
"order": 9,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 21
},
{
"name": "mask",
"type": "MASK",
"link": 22
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
23
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "SetLatentNoiseMask"
},
"widgets_values": []
},
{
"id": 15,
"type": "VAEEncode",
"pos": [
454.1952689346516,
682.6952818257673
],
"size": [
140,
46
],
"flags": {},
"order": 7,
"mode": 0,
"inputs": [
{
"name": "pixels",
"type": "IMAGE",
"link": 14
},
{
"name": "vae",
"type": "VAE",
"link": 16
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
21
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "VAEEncode"
},
"widgets_values": []
},
{
"id": 12,
"type": "InpaintCropImproved",
"pos": [
101.73599378718188,
742.4170000000003
],
"size": [
307.6054529780441,
626
],
"flags": {},
"order": 3,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 12
},
{
"name": "mask",
"shape": 7,
"type": "MASK",
"link": 13
},
{
"name": "optional_context_mask",
"shape": 7,
"type": "MASK",
"link": null
}
],
"outputs": [
{
"name": "stitcher",
"type": "STITCHER",
"links": [
11
]
},
{
"name": "cropped_image",
"type": "IMAGE",
"links": [
14,
24
]
},
{
"name": "cropped_mask",
"type": "MASK",
"links": [
22
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintCropImproved"
},
"widgets_values": [
"bilinear",
"bicubic",
false,
"ensure minimum resolution",
1024,
1024,
16384,
16384,
true,
0,
false,
32,
0.1,
false,
1,
1,
1,
1,
1.2,
true,
512,
512,
"32"
],
"color": "#232",
"bgcolor": "#353"
},
{
"id": 19,
"type": "PreviewImage",
"pos": [
453.71448760330577,
865
],
"size": [
254.39999999999986,
304.29999999999995
],
"flags": {},
"order": 8,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 24
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 4,
"type": "CheckpointLoaderSimple",
"pos": [
38.10000000000001,
363.8900000000004
],
"size": [
315,
98
],
"flags": {},
"order": 2,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
25
]
},
{
"name": "CLIP",
"type": "CLIP",
"slot_index": 1,
"links": [
3,
5
]
},
{
"name": "VAE",
"type": "VAE",
"slot_index": 2,
"links": []
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CheckpointLoaderSimple"
},
"widgets_values": [
"😎-v1.x\\AuroraONE_F16.safetensors"
]
},
{
"id": 20,
"type": "DifferentialDiffusion",
"pos": [
523.8254088354622,
71.70004836964688
],
"size": [
210,
58
],
"flags": {},
"order": 4,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 25
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"links": [
26
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "DifferentialDiffusion"
},
"widgets_values": [
1
],
"color": "#322",
"bgcolor": "#533"
}
],
"links": [
[
3,
4,
1,
6,
0,
"CLIP"
],
[
4,
6,
0,
3,
1,
"CONDITIONING"
],
[
5,
4,
1,
7,
0,
"CLIP"
],
[
6,
7,
0,
3,
2,
"CONDITIONING"
],
[
7,
3,
0,
8,
0,
"LATENT"
],
[
10,
10,
0,
8,
1,
"VAE"
],
[
11,
12,
0,
11,
0,
"STITCHER"
],
[
12,
13,
0,
12,
0,
"IMAGE"
],
[
13,
13,
1,
12,
1,
"MASK"
],
[
14,
12,
1,
15,
0,
"IMAGE"
],
[
16,
10,
0,
15,
1,
"VAE"
],
[
17,
11,
0,
9,
0,
"IMAGE"
],
[
18,
8,
0,
11,
1,
"IMAGE"
],
[
19,
8,
0,
16,
0,
"IMAGE"
],
[
21,
15,
0,
18,
0,
"LATENT"
],
[
22,
12,
2,
18,
1,
"MASK"
],
[
23,
18,
0,
3,
3,
"LATENT"
],
[
24,
12,
1,
19,
0,
"IMAGE"
],
[
25,
4,
0,
20,
0,
"MODEL"
],
[
26,
20,
0,
3,
0,
"MODEL"
]
],
"groups": [],
"config": {},
"extra": {
"ds": {
"scale": 0.6830134553650707,
"offset": [
394.2855123966943,
28.29995163035312
]
},
"frontendVersion": "1.34.6",
"VHS_latentpreview": false,
"VHS_latentpreviewrate": 0,
"VHS_MetadataImage": true,
"VHS_KeepIntermediate": true
},
"version": 0.4
}
- 🟩 Adjust output_target_width/height depending on the base model. Since this example uses SD 1.5, they are set to 512px.
- 🟥 Although not directly related to the Detailer itself, Inpaint Crop outputs a mask with blurred boundaries, so it pairs very well with Differential Diffusion, which can take advantage of that softness.
Combining with Object Detection
Let's automate things a bit by generating the face mask automatically.

{
"id": "8b9f7796-0873-4025-be3c-0f997f67f866",
"revision": 0,
"last_node_id": 21,
"last_link_id": 28,
"nodes": [
{
"id": 7,
"type": "CLIPTextEncode",
"pos": [
416.1970166015625,
392.37848510742185
],
"size": [
410.75801513671877,
158.82607910156253
],
"flags": {},
"order": 6,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 5
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
6
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CLIPTextEncode"
},
"widgets_values": [
"text, watermark"
]
},
{
"id": 10,
"type": "VAELoader",
"pos": [
132.4262925619836,
611.9892535544651
],
"size": [
281.0743801652891,
58
],
"flags": {},
"order": 0,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "VAE",
"type": "VAE",
"links": [
10,
16
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "VAELoader"
},
"widgets_values": [
"vae-ft-mse-840000-ema-pruned.safetensors"
]
},
{
"id": 8,
"type": "VAEDecode",
"pos": [
1209,
188
],
"size": [
210,
46
],
"flags": {},
"order": 12,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 7
},
{
"name": "vae",
"type": "VAE",
"link": 10
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"slot_index": 0,
"links": [
18,
19
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "VAEDecode"
},
"widgets_values": []
},
{
"id": 11,
"type": "InpaintStitchImproved",
"pos": [
1449.2712348513148,
738.4170000000003
],
"size": [
241.0976899545639,
46
],
"flags": {},
"order": 13,
"mode": 0,
"inputs": [
{
"name": "stitcher",
"type": "STITCHER",
"link": 11
},
{
"name": "inpainted_image",
"type": "IMAGE",
"link": 18
}
],
"outputs": [
{
"name": "image",
"type": "IMAGE",
"links": [
17
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintStitchImproved"
},
"widgets_values": [],
"color": "#432",
"bgcolor": "#653"
},
{
"id": 6,
"type": "CLIPTextEncode",
"pos": [
415,
186
],
"size": [
411.95503173828126,
151.0030493164063
],
"flags": {},
"order": 5,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 3
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"slot_index": 0,
"links": [
4
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CLIPTextEncode"
},
"widgets_values": [
"illustration of a portrait, beautifull girl,Open your mouth and close your eyes."
]
},
{
"id": 16,
"type": "PreviewImage",
"pos": [
1477.694978315702,
193.37892043958945
],
"size": [
350.79999999999995,
384.4
],
"flags": {},
"order": 14,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 19
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 9,
"type": "SaveImage",
"pos": [
1724.105085812445,
738.4170000000003
],
"size": [
423.52342840047254,
574.4169798178841
],
"flags": {},
"order": 15,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 17
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33"
},
"widgets_values": [
"ComfyUI"
]
},
{
"id": 3,
"type": "KSampler",
"pos": [
863,
186
],
"size": [
315,
262
],
"flags": {},
"order": 11,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 28
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 4
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 6
},
{
"name": "latent_image",
"type": "LATENT",
"link": 23
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"slot_index": 0,
"links": [
7
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "KSampler"
},
"widgets_values": [
12345,
"fixed",
20,
8,
"euler",
"normal",
0.55
]
},
{
"id": 18,
"type": "SetLatentNoiseMask",
"pos": [
634.8898651420305,
682.6952818257673
],
"size": [
180.74765625,
46
],
"flags": {},
"order": 10,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 21
},
{
"name": "mask",
"type": "MASK",
"link": 22
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
23
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "SetLatentNoiseMask"
},
"widgets_values": []
},
{
"id": 15,
"type": "VAEEncode",
"pos": [
454.1952689346516,
682.6952818257673
],
"size": [
140,
46
],
"flags": {},
"order": 8,
"mode": 0,
"inputs": [
{
"name": "pixels",
"type": "IMAGE",
"link": 14
},
{
"name": "vae",
"type": "VAE",
"link": 16
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
21
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "VAEEncode"
},
"widgets_values": []
},
{
"id": 12,
"type": "InpaintCropImproved",
"pos": [
101.73599378718188,
742.4170000000003
],
"size": [
307.6054529780441,
626
],
"flags": {},
"order": 7,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 12
},
{
"name": "mask",
"shape": 7,
"type": "MASK",
"link": 26
},
{
"name": "optional_context_mask",
"shape": 7,
"type": "MASK",
"link": null
}
],
"outputs": [
{
"name": "stitcher",
"type": "STITCHER",
"links": [
11
]
},
{
"name": "cropped_image",
"type": "IMAGE",
"links": [
14,
24
]
},
{
"name": "cropped_mask",
"type": "MASK",
"links": [
22
]
}
],
"properties": {
"cnr_id": "comfyui-inpaint-cropandstitch",
"ver": "a5fbe06f766ec237f304fb65bb94c0f255060109",
"Node name for S&R": "InpaintCropImproved"
},
"widgets_values": [
"bilinear",
"bicubic",
false,
"ensure minimum resolution",
1024,
1024,
16384,
16384,
true,
0,
false,
32,
0.1,
false,
1,
1,
1,
1,
1.2,
true,
512,
512,
"32"
],
"color": "#232",
"bgcolor": "#353"
},
{
"id": 19,
"type": "PreviewImage",
"pos": [
453.71448760330577,
865
],
"size": [
254.39999999999986,
304.29999999999995
],
"flags": {},
"order": 9,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 24
}
],
"outputs": [],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 13,
"type": "LoadImage",
"pos": [
-604.2329203606309,
742.4170000000003
],
"size": [
371.9572329709805,
583.1414645379411
],
"flags": {},
"order": 1,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
12,
25
]
},
{
"name": "MASK",
"type": "MASK",
"links": []
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "LoadImage",
"image": "clipspace/clipspace-painted-masked-1765182779878.png [input]"
},
"widgets_values": [
"clipspace/clipspace-painted-masked-1765182779878.png [input]",
"image"
]
},
{
"id": 20,
"type": "SAM3Segment",
"pos": [
-214.09484680123433,
828.8728691744416
],
"size": [
297.6500000000001,
332
],
"flags": {},
"order": 3,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 25
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": []
},
{
"name": "MASK",
"type": "MASK",
"links": [
26
]
},
{
"name": "MASK_IMAGE",
"type": "IMAGE",
"links": null
}
],
"properties": {
"cnr_id": "comfyui-rmbg",
"ver": "2.9.4",
"Node name for S&R": "SAM3Segment"
},
"widgets_values": [
"face",
"sam3",
"Auto",
0.5,
0,
0,
false,
"Alpha",
"#222222"
],
"color": "#223",
"bgcolor": "#335"
},
{
"id": 4,
"type": "CheckpointLoaderSimple",
"pos": [
38.10000000000001,
363.8900000000004
],
"size": [
315,
98
],
"flags": {},
"order": 2,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"slot_index": 0,
"links": [
27
]
},
{
"name": "CLIP",
"type": "CLIP",
"slot_index": 1,
"links": [
3,
5
]
},
{
"name": "VAE",
"type": "VAE",
"slot_index": 2,
"links": []
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.33",
"Node name for S&R": "CheckpointLoaderSimple"
},
"widgets_values": [
"😎-v1.x\\AuroraONE_F16.safetensors"
]
},
{
"id": 21,
"type": "DifferentialDiffusion",
"pos": [
534.4404368428528,
62.781508537241564
],
"size": [
210,
58
],
"flags": {},
"order": 4,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 27
}
],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"links": [
28
]
}
],
"properties": {
"cnr_id": "comfy-core",
"ver": "0.3.76",
"Node name for S&R": "DifferentialDiffusion"
},
"widgets_values": [
1
],
"color": "#323",
"bgcolor": "#535"
}
],
"links": [
[
3,
4,
1,
6,
0,
"CLIP"
],
[
4,
6,
0,
3,
1,
"CONDITIONING"
],
[
5,
4,
1,
7,
0,
"CLIP"
],
[
6,
7,
0,
3,
2,
"CONDITIONING"
],
[
7,
3,
0,
8,
0,
"LATENT"
],
[
10,
10,
0,
8,
1,
"VAE"
],
[
11,
12,
0,
11,
0,
"STITCHER"
],
[
12,
13,
0,
12,
0,
"IMAGE"
],
[
14,
12,
1,
15,
0,
"IMAGE"
],
[
16,
10,
0,
15,
1,
"VAE"
],
[
17,
11,
0,
9,
0,
"IMAGE"
],
[
18,
8,
0,
11,
1,
"IMAGE"
],
[
19,
8,
0,
16,
0,
"IMAGE"
],
[
21,
15,
0,
18,
0,
"LATENT"
],
[
22,
12,
2,
18,
1,
"MASK"
],
[
23,
18,
0,
3,
3,
"LATENT"
],
[
24,
12,
1,
19,
0,
"IMAGE"
],
[
25,
13,
0,
20,
0,
"IMAGE"
],
[
26,
20,
1,
12,
1,
"MASK"
],
[
27,
4,
0,
21,
0,
"MODEL"
],
[
28,
21,
0,
3,
0,
"MODEL"
]
],
"groups": [],
"config": {},
"extra": {
"ds": {
"scale": 0.6830134553650707,
"offset": [
704.2329203606309,
37.218491462758436
]
},
"frontendVersion": "1.34.6",
"VHS_latentpreview": false,
"VHS_latentpreviewrate": 0,
"VHS_MetadataImage": true,
"VHS_KeepIntermediate": true
},
"version": 0.4
}
- 🟦 Create a face mask using SAM 3.
These are the basics of a Detailer, and they should take you quite far on their own. However, if you want to detect multiple people in one image and process them all in a single pass... you will need the Impact Pack.
Sample Image
