Open
Labels
Potential Bug: User is reporting a bug. This should be tested.
Description
Custom Node Testing
- I have tried disabling custom nodes and the issue persists (see how to disable custom nodes if you need help)
Expected Behavior
The NF4 version of Flux should load through the CheckpointLoaderNF4 node and generate an image.
Actual Behavior
I get this error:
Got [16, 56, 56] but expected positional dim 262144
It happens when I try to use the NF4 version of Flux inside ComfyUI, after installing ComfyUI-Manager and the ComfyUI_bitsandbytes_NF4 extension from GitHub. If anyone has an idea how to solve this, please let me know.
Steps to Reproduce
Install ComfyUI-Manager and the ComfyUI_bitsandbytes_NF4 extension, load the NF4 Flux checkpoint with the attached workflow, and queue a prompt; the error is raised as soon as the CheckpointLoaderNF4 node (node 40) runs.
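For context, the ValueError comes from a shape check in comfy/ldm/flux/model.py (see the stack trace in the logs below): the sum of axes_dim must equal the per-head positional-embedding dimension, hidden_size // num_heads. The sketch below is a minimal, self-contained approximation of that check; the function name check_positional_dims and the specific trigger values are illustrative, not taken from the actual checkpoint.

```python
# Minimal sketch (assumption: simplified from the check in comfy/ldm/flux/model.py)
# of the shape validation that raises the reported ValueError.
def check_positional_dims(hidden_size: int, num_heads: int, axes_dim: list[int]) -> None:
    pe_dim = hidden_size // num_heads  # per-head positional embedding dimension
    if sum(axes_dim) != pe_dim:
        raise ValueError(f"Got {axes_dim} but expected positional dim {pe_dim}")

# The usual Flux-dev config passes: 3072 // 24 == 128 == 16 + 56 + 56.
check_positional_dims(3072, 24, [16, 56, 56])

# The report shows pe_dim detected as 262144 instead of 128, so the same check
# fails with exactly the reported message (the values here are illustrative).
try:
    check_positional_dims(262144, 1, [16, 56, 56])
except ValueError as err:
    print(err)  # Got [16, 56, 56] but expected positional dim 262144
```

Since the hard-coded Flux axes_dim of [16, 56, 56] is correct, a computed positional dim of 262144 suggests the loader mis-detected the model configuration from this NF4-packed state dict rather than a problem with the workflow itself.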
Debug Logs
# ComfyUI Error Report
## Error Details
- **Node ID:** 40
- **Node Type:** CheckpointLoaderNF4
- **Exception Type:** ValueError
- **Exception Message:** Got [16, 56, 56] but expected positional dim 262144
## Stack Trace
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 515, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 329, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 303, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 291, in process_inputs
result = f(**inputs)
File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4\__init__.py", line 178, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"), model_options={"custom_operations": OPS})
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1304, in load_checkpoint_guess_config
out = load_state_dict_guess_config(sd, output_vae, output_clip, output_clipvision, embedding_directory, output_model, model_options, te_model_options=te_model_options, metadata=metadata)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1357, in load_state_dict_guess_config
model = model_config.get_model(sd, diffusion_model_prefix, device=inital_load_device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\supported_models.py", line 714, in get_model
out = model_base.Flux(self, device=device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 838, in __init__
super().__init__(model_config, model_type, device=device, unet_model=unet_model)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 141, in __init__
self.diffusion_model = unet_model(**unet_config, device=device, operations=operations)
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 67, in __init__
raise ValueError(f"Got {params.axes_dim} but expected positional dim {pe_dim}")
## System Information
- **ComfyUI Version:** 0.4.0
- **Arguments:** ComfyUI\main.py --windows-standalone-build
- **OS:** win32
- **Python Version:** 3.13.9 (tags/v3.13.9:8183fa5, Oct 14 2025, 14:09:13) [MSC v.1944 64 bit (AMD64)]
- **Embedded Python:** true
- **PyTorch Version:** 2.9.1+cu130
## Devices
- **Name:** cuda:0 NVIDIA GeForce GTX 1060 6GB : cudaMallocAsync
- **Type:** cuda
- **VRAM Total:** 6442188800
- **VRAM Free:** 5449449472
- **Torch VRAM Total:** 0
- **Torch VRAM Free:** 0
## Logs
2025-12-19T12:30:17.195583 - [START] Security scan
2025-12-19T12:30:18.433438 - [DONE] Security scan
2025-12-19T12:30:18.572470 - ## ComfyUI-Manager: installing dependencies done.
2025-12-19T12:30:18.572687 - ** ComfyUI startup time: 2025-12-19 12:30:18.572
2025-12-19T12:30:18.573101 - ** Platform: Windows
2025-12-19T12:30:18.573447 - ** Python version: 3.13.9 (tags/v3.13.9:8183fa5, Oct 14 2025, 14:09:13) [MSC v.1944 64 bit (AMD64)]
2025-12-19T12:30:18.573786 - ** Python executable: F:\ComfyUI_windows_portable\python_embeded\python.exe
2025-12-19T12:30:18.574043 - ** ComfyUI Path: F:\ComfyUI_windows_portable\ComfyUI
2025-12-19T12:30:18.574289 - ** ComfyUI Base Folder Path: F:\ComfyUI_windows_portable\ComfyUI
2025-12-19T12:30:18.574541 - ** User directory: F:\ComfyUI_windows_portable\ComfyUI\user
2025-12-19T12:30:18.574796 - ** ComfyUI-Manager config path: F:\ComfyUI_windows_portable\ComfyUI\user\__manager\config.ini
2025-12-19T12:30:18.577797 - ** Log path: F:\ComfyUI_windows_portable\ComfyUI\user\comfyui.log
2025-12-19T12:30:19.657009 -
Prestartup times for custom nodes:
2025-12-19T12:30:19.657151 - 3.0 seconds: F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-12-19T12:30:19.657248 -
2025-12-19T12:30:21.563003 - Checkpoint files will always be loaded safely.
2025-12-19T12:30:21.589417 - F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\cuda\__init__.py:283: UserWarning:
Found GPU0 NVIDIA GeForce GTX 1060 6GB which is of cuda capability 6.1.
Minimum and Maximum cuda capability supported by this version of PyTorch is
(7.5) - (12.0)
warnings.warn(
2025-12-19T12:30:21.589901 - F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\cuda\__init__.py:304: UserWarning:
Please install PyTorch with a following CUDA
configurations: 12.6 following instructions at
https://pytorch.org/get-started/locally/
warnings.warn(matched_cuda_warn.format(matched_arches))
2025-12-19T12:30:21.590623 - F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\torch\cuda\__init__.py:326: UserWarning:
NVIDIA GeForce GTX 1060 6GB with CUDA capability sm_61 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_75 sm_80 sm_86 sm_90 sm_100 sm_120.
If you want to use the NVIDIA GeForce GTX 1060 6GB GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
warnings.warn(
2025-12-19T12:30:21.644057 - Total VRAM 6144 MB, total RAM 32701 MB
2025-12-19T12:30:21.644220 - pytorch version: 2.9.1+cu130
2025-12-19T12:30:21.644710 - Set vram state to: NORMAL_VRAM
2025-12-19T12:30:21.644997 - Device: cuda:0 NVIDIA GeForce GTX 1060 6GB : cudaMallocAsync
2025-12-19T12:30:21.659181 - Using async weight offloading with 2 streams
2025-12-19T12:30:21.661586 - Enabled pinned memory 14715.0
2025-12-19T12:30:21.682551 - working around nvidia conv3d memory bug.
2025-12-19T12:30:22.851444 - Using pytorch attention
2025-12-19T12:30:25.113607 - Python version: 3.13.9 (tags/v3.13.9:8183fa5, Oct 14 2025, 14:09:13) [MSC v.1944 64 bit (AMD64)]
2025-12-19T12:30:25.113751 - ComfyUI version: 0.4.0
2025-12-19T12:30:25.163579 - ComfyUI frontend version: 1.33.13
2025-12-19T12:30:25.164953 - [Prompt Server] web root: F:\ComfyUI_windows_portable\python_embeded\Lib\site-packages\comfyui_frontend_package\static
2025-12-19T12:30:25.682702 - Total VRAM 6144 MB, total RAM 32701 MB
2025-12-19T12:30:25.682932 - pytorch version: 2.9.1+cu130
2025-12-19T12:30:25.683638 - Set vram state to: NORMAL_VRAM
2025-12-19T12:30:25.683858 - Device: cuda:0 NVIDIA GeForce GTX 1060 6GB : cudaMallocAsync
2025-12-19T12:30:25.698274 - Using async weight offloading with 2 streams
2025-12-19T12:30:25.701879 - Enabled pinned memory 14715.0
2025-12-19T12:30:26.467700 - ### Loading: ComfyUI-Manager (V3.38.3)
2025-12-19T12:30:26.468771 - [ComfyUI-Manager] network_mode: public
2025-12-19T12:30:26.579667 - ### ComfyUI Revision: 150 [fc657f47] *DETACHED | Released on '2025-12-09'
2025-12-19T12:30:27.039988 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-12-19T12:30:27.084661 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-12-19T12:30:27.379413 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-12-19T12:30:27.629295 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-12-19T12:30:27.853112 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-12-19T12:30:28.735018 -
Import times for custom nodes:
2025-12-19T12:30:28.735200 - 0.0 seconds: F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\websocket_image_save.py
2025-12-19T12:30:28.735305 - 0.1 seconds: F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\comfyui-manager
2025-12-19T12:30:28.735403 - 2.1 seconds: F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4
2025-12-19T12:30:28.735532 -
2025-12-19T12:30:28.765765 - [ComfyUI-Manager] An error occurred while fetching 'https://api.comfy.org/nodes?page=1&limit=30&comfyui_version=v0.4.0&form_factor=git-windows': Expecting value: line 2 column 1 (char 1)
2025-12-19T12:30:28.769885 - Cannot connect to comfyregistry.
2025-12-19T12:30:28.773484 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-12-19T12:30:29.272534 - Context impl SQLiteImpl.
2025-12-19T12:30:29.272678 - Will assume non-transactional DDL.
2025-12-19T12:30:29.273635 - No target revision found.
2025-12-19T12:30:29.330856 - Starting server
2025-12-19T12:30:29.331363 - To see the GUI go to: http://127.0.0.1:8188
2025-12-19T12:30:31.855975 - [DONE]
2025-12-19T12:30:32.007865 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-19T12:30:32.048963 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/groupNode.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-19T12:30:32.173937 - [ComfyUI-Manager] All startup tasks have been completed.
2025-12-19T12:30:32.866955 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/buttonGroup.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-19T12:30:32.871194 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/button.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-19T12:30:51.005565 - got prompt
2025-12-19T12:30:51.030262 - Using pytorch attention in VAE
2025-12-19T12:30:51.034263 - Using pytorch attention in VAE
2025-12-19T12:30:51.175400 - VAE load device: cuda:0, offload device: cpu, dtype: torch.float32
2025-12-19T12:30:51.362079 - !!! Exception during processing !!! Got [16, 56, 56] but expected positional dim 262144
2025-12-19T12:30:51.366896 - Traceback (most recent call last):
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 515, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 329, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 303, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 291, in process_inputs
result = f(**inputs)
File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4\__init__.py", line 178, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"), model_options={"custom_operations": OPS})
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1304, in load_checkpoint_guess_config
out = load_state_dict_guess_config(sd, output_vae, output_clip, output_clipvision, embedding_directory, output_model, model_options, te_model_options=te_model_options, metadata=metadata)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1357, in load_state_dict_guess_config
model = model_config.get_model(sd, diffusion_model_prefix, device=inital_load_device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\supported_models.py", line 714, in get_model
out = model_base.Flux(self, device=device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 838, in __init__
super().__init__(model_config, model_type, device=device, unet_model=unet_model)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 141, in __init__
self.diffusion_model = unet_model(**unet_config, device=device, operations=operations)
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 67, in __init__
raise ValueError(f"Got {params.axes_dim} but expected positional dim {pe_dim}")
ValueError: Got [16, 56, 56] but expected positional dim 262144
2025-12-19T12:30:51.371497 - Prompt executed in 0.36 seconds
2025-12-19T12:44:09.790394 - got prompt
2025-12-19T12:44:09.891227 - !!! Exception during processing !!! Got [16, 56, 56] but expected positional dim 262144
2025-12-19T12:44:09.893019 - Traceback (most recent call last):
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 515, in execute
output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 329, in get_output_data
return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 303, in _async_map_node_over_list
await process_inputs(input_dict, i)
File "F:\ComfyUI_windows_portable\ComfyUI\execution.py", line 291, in process_inputs
result = f(**inputs)
File "F:\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_bitsandbytes_NF4\__init__.py", line 178, in load_checkpoint
out = comfy.sd.load_checkpoint_guess_config(ckpt_path, output_vae=True, output_clip=True, embedding_directory=folder_paths.get_folder_paths("embeddings"), model_options={"custom_operations": OPS})
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1304, in load_checkpoint_guess_config
out = load_state_dict_guess_config(sd, output_vae, output_clip, output_clipvision, embedding_directory, output_model, model_options, te_model_options=te_model_options, metadata=metadata)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\sd.py", line 1357, in load_state_dict_guess_config
model = model_config.get_model(sd, diffusion_model_prefix, device=inital_load_device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\supported_models.py", line 714, in get_model
out = model_base.Flux(self, device=device)
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 838, in __init__
super().__init__(model_config, model_type, device=device, unet_model=unet_model)
~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\model_base.py", line 141, in __init__
self.diffusion_model = unet_model(**unet_config, device=device, operations=operations)
~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "F:\ComfyUI_windows_portable\ComfyUI\comfy\ldm\flux\model.py", line 67, in __init__
raise ValueError(f"Got {params.axes_dim} but expected positional dim {pe_dim}")
ValueError: Got [16, 56, 56] but expected positional dim 262144
2025-12-19T12:44:09.895377 - Prompt executed in 0.10 seconds
## Attached Workflow
Please make sure that workflow does not contain any sensitive information such as API keys or passwords.
{"id":"249a7d02-0192-4d6a-a14c-82dfa30e3db1","revision":0,"last_node_id":40,"last_link_id":125,"nodes":[{"id":17,"type":"BasicScheduler","pos":[578.3393350227425,1212.9314173994253],"size":[378,180],"flags":{},"order":12,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":55},{"localized_name":"scheduler","name":"scheduler","type":"COMBO","widget":{"name":"scheduler"},"link":null},{"localized_name":"steps","name":"steps","type":"INT","widget":{"name":"steps"},"link":null},{"localized_name":"denoise","name":"denoise","type":"FLOAT","widget":{"name":"denoise"},"link":null}],"outputs":[{"localized_name":"SIGMAS","name":"SIGMAS","type":"SIGMAS","links":[20]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"BasicScheduler"},"widgets_values":["simple",20,1]},{"id":16,"type":"KSamplerSelect","pos":[578.3393350227425,1097.743075114269],"size":[378,112],"flags":{},"order":0,"mode":0,"inputs":[{"localized_name":"sampler_name","name":"sampler_name","type":"COMBO","widget":{"name":"sampler_name"},"link":null}],"outputs":[{"localized_name":"SAMPLER","name":"SAMPLER","type":"SAMPLER","links":[19]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"KSamplerSelect"},"widgets_values":["euler"]},{"id":26,"type":"FluxGuidance","pos":[578.3393350227425,176.1263804731558],"size":[380.8833312988281,112],"flags":{},"order":13,"mode":0,"inputs":[{"localized_name":"conditioning","name":"conditioning","type":"CONDITIONING","link":41},{"localized_name":"guidance","name":"guidance","type":"FLOAT","widget":{"name":"guidance"},"link":null}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","slot_index":0,"links":[42]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"FluxGuidance"},"widgets_values":[3.5],"color":"#233","bgcolor":"#355"},{"id":13,"type":"SamplerCustomAdvanced","pos":[1039.1293283088753,233.74804490186673],"size":[326.8333435058594,421.20001220703125],"flags":{},"order":15,"mode":0,"inputs":[{"localized_name":"noise","name":"noise","type":"NOISE","link":37},{"localized_name":"guider","name":"guider","type":"GUIDER","link":30},{"localized_name":"sampler","name":"sampler","type":"SAMPLER","link":19},{"localized_name":"sigmas","name":"sigmas","type":"SIGMAS","link":20},{"localized_name":"latent_image","name":"latent_image","type":"LATENT","link":116}],"outputs":[{"localized_name":"output","name":"output","type":"LATENT","slot_index":0,"links":[24]},{"localized_name":"denoised_output","name":"denoised_output","type":"LATENT","links":null}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"SamplerCustomAdvanced"},"widgets_values":[]},{"id":25,"type":"RandomNoise","pos":[578.3393350227425,924.9330348310659],"size":[378,128.39999389648438],"flags":{},"order":1,"mode":0,"inputs":[{"localized_name":"noise_seed","name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":null}],"outputs":[{"localized_name":"NOISE","name":"NOISE","type":"NOISE","links":[37]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for 
S&R":"RandomNoise"},"widgets_values":[573810501730631,"randomize"],"color":"#2a363b","bgcolor":"#3f5159"},{"id":30,"type":"ModelSamplingFlux","pos":[578.3393350227425,1385.7413905439566],"size":[378,214],"flags":{},"order":10,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":121},{"localized_name":"max_shift","name":"max_shift","type":"FLOAT","widget":{"name":"max_shift"},"link":null},{"localized_name":"base_shift","name":"base_shift","type":"FLOAT","widget":{"name":"base_shift"},"link":null},{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":115},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":114}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","slot_index":0,"links":[55,122]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"ModelSamplingFlux"},"widgets_values":[1.15,0.5,1344,768]},{"id":37,"type":"Note","pos":[578.3393350227425,1616.1363368330192],"size":[378,171.5833282470703],"flags":{},"order":2,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["The reference sampling implementation auto adjusts the shift value based on the resolution, if you don't want this you can just bypass (CTRL-B) this ModelSamplingFlux node.\n"],"color":"#432","bgcolor":"#653"},{"id":22,"type":"BasicGuider","pos":[693.5276773078987,60.938054972667516],"size":[266.8166809082031,102],"flags":{},"order":14,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":122},{"localized_name":"conditioning","name":"conditioning","type":"CONDITIONING","link":42}],"outputs":[{"localized_name":"GUIDER","name":"GUIDER","type":"GUIDER","slot_index":0,"links":[30]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"BasicGuider"},"widgets_values":[]},{"id":10,"type":"VAELoader","pos":[57.947673950965054,529.739705973644],"size":[374.1833190917969,112],"flags":{},"order":3,"mode":0,"inputs":[{"localized_name":"vae_name","name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":null}],"outputs":[{"localized_name":"VAE","name":"VAE","type":"VAE","slot_index":0,"links":[12]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"VAELoader"},"widgets_values":["ae.safetensors"]},{"id":8,"type":"VAEDecode","pos":[1075.1359994514535,702.9347194013784],"size":[252,102],"flags":{},"order":16,"mode":0,"inputs":[{"localized_name":"samples","name":"samples","type":"LATENT","link":24},{"localized_name":"vae","name":"vae","type":"VAE","link":12}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","slot_index":0,"links":[9]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"VAEDecode"},"widgets_values":[]},{"id":11,"type":"DualCLIPLoader","pos":[-227.15399383467945,277.436403971691],"size":[378,214],"flags":{},"order":4,"mode":0,"inputs":[{"localized_name":"clip_name1","name":"clip_name1","type":"COMBO","widget":{"name":"clip_name1"},"link":null},{"localized_name":"clip_name2","name":"clip_name2","type":"COMBO","widget":{"name":"clip_name2"},"link":null},{"localized_name":"type","name":"type","type":"COMBO","widget":{"name":"type"},"link":null},{"localized_name":"device","name":"device","shape":7,"type":"COMBO","widget":{"name":"device"},"link":null}],"outputs":[{"localized_name":"CLIP","name":"CLIP","type":"CLIP","slot_index":0,"links":[]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for 
S&R":"DualCLIPLoader"},"widgets_values":["t5xxl_fp8_e4m3fn.safetensors","clip_l.safetensors","flux","default"]},{"id":34,"type":"PrimitiveNode","pos":[520.7359994514534,579.3314173994253],"size":[252,128.39999389648438],"flags":{},"order":5,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","widget":{"name":"width"},"slot_index":0,"links":[112,115]}],"title":"width","properties":{"Run widget replace on values":false},"widgets_values":[1344,"fixed"],"color":"#323","bgcolor":"#535"},{"id":35,"type":"PrimitiveNode","pos":[808.7343148811408,579.3314173994253],"size":[252,128.39999389648438],"flags":{},"order":6,"mode":0,"inputs":[],"outputs":[{"name":"INT","type":"INT","widget":{"name":"height"},"slot_index":0,"links":[113,114]}],"title":"height","properties":{"Run widget replace on values":false},"widgets_values":[768,"fixed"],"color":"#323","bgcolor":"#535"},{"id":40,"type":"CheckpointLoaderNF4","pos":[4.946006165320512,69.62804825880033],"size":[315,160],"flags":{},"order":7,"mode":0,"inputs":[{"localized_name":"ckpt_name","name":"ckpt_name","type":"COMBO","widget":{"name":"ckpt_name"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","slot_index":0,"links":[121]},{"localized_name":"CLIP","name":"CLIP","type":"CLIP","slot_index":1,"links":[125]},{"localized_name":"VAE","name":"VAE","type":"VAE","links":null}],"properties":{"aux_id":"comfyanonymous/ComfyUI_bitsandbytes_NF4","ver":"6c65152bc48b28fc44cec3aa44035a8eba400eb9","Node name for S&R":"CheckpointLoaderNF4"},"widgets_values":["flux1DevHyperNF4Flux1DevBNB_flux1DevBNBNF4V2.safetensors"]},{"id":27,"type":"EmptySD3LatentImage","pos":[578.3393350227425,752.1413905439566],"size":[378,180],"flags":{},"order":9,"mode":0,"inputs":[{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":112},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":113},{"localized_name":"batch_size","name":"batch_size","type":"INT","widget":{"name":"batch_size"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","slot_index":0,"links":[116]}],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"EmptySD3LatentImage"},"widgets_values":[1344,768,1]},{"id":9,"type":"SaveImage","pos":[1393.1460397346564,237.32304490186672],"size":[936,690],"flags":{},"order":17,"mode":0,"inputs":[{"localized_name":"images","name":"images","type":"IMAGE","link":9},{"localized_name":"filename_prefix","name":"filename_prefix","type":"STRING","widget":{"name":"filename_prefix"},"link":null}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"SaveImage"},"widgets_values":["ComfyUI"]},{"id":6,"type":"CLIPTextEncode","pos":[463.1326638801643,291.333034831066],"size":[507.4166564941406,227.18333435058594],"flags":{},"order":11,"mode":0,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":125},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":null}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","slot_index":0,"links":[41]}],"title":"CLIP Text Encode (Positive Prompt)","properties":{"cnr_id":"comfy-core","ver":"0.4.0","Node name for S&R":"CLIPTextEncode"},"widgets_values":["cyberpunk photo of face closeup, the text \"NF4\" is tattooed on the womans cheek. 
She has blue hair and sunglasses with a pink and yellow tint."],"color":"#232","bgcolor":"#353"},{"id":28,"type":"Note","pos":[59.94600616532054,694.5196925459097],"size":[403.20001220703125,375.6000061035156],"flags":{},"order":8,"mode":0,"inputs":[],"outputs":[],"properties":{"text":""},"widgets_values":["If you get an error in any of the nodes above make sure the files are in the correct directories.\n\nSee the top of the examples page for the links : https://comfyanonymous.github.io/ComfyUI_examples/flux/\n\nflux1-dev-bnb-nf4.safetensors goes in: ComfyUI/models/checkpoints/\n\nt5xxl_fp16.safetensors and clip_l.safetensors go in: ComfyUI/models/clip/\n\nae.safetensors goes in: ComfyUI/models/vae/"],"color":"#432","bgcolor":"#653"}],"links":[[9,8,0,9,0,"IMAGE"],[12,10,0,8,1,"VAE"],[19,16,0,13,2,"SAMPLER"],[20,17,0,13,3,"SIGMAS"],[24,13,0,8,0,"LATENT"],[30,22,0,13,1,"GUIDER"],[37,25,0,13,0,"NOISE"],[41,6,0,26,0,"CONDITIONING"],[42,26,0,22,1,"CONDITIONING"],[55,30,0,17,0,"MODEL"],[112,34,0,27,0,"INT"],[113,35,0,27,1,"INT"],[114,35,0,30,4,"INT"],[115,34,0,30,3,"INT"],[116,27,0,13,4,"LATENT"],[121,40,0,30,0,"MODEL"],[122,30,0,22,0,"MODEL"],[125,40,1,6,0,"CLIP"]],"groups":[],"config":{},"extra":{"ds":{"scale":0.8264462809917354,"offset":[-128.23767059403127,106.28695174119966]},"groupNodes":{},"workflowRendererVersion":"Vue"},"version":0.4}
## Additional Context
(Please add any additional context or steps to reproduce the error here)
Other
No response