comfyui.prev.log
## ComfyUI-Manager: installing dependencies done.
[2024-06-29 21:44] ** ComfyUI startup time: 2024-06-29 21:44:53.204924
[2024-06-29 21:44] ** Platform: Linux
[2024-06-29 21:44] ** Python version: 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]
[2024-06-29 21:44] ** Python executable: /home/justin/Workspace/ComfyUI/venv/bin/python3
[2024-06-29 21:44] ** Log path: /home/justin/Workspace/ComfyUI/comfyui.log
[2024-06-29 21:44]
Prestartup times for custom nodes:
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Stable-Video-Diffusion
[2024-06-29 21:44] 0.3 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Manager
[2024-06-29 21:44]
Total VRAM 15836 MB, total RAM 64144 MB
[2024-06-29 21:44] pytorch version: 2.1.2+cu121
[2024-06-29 21:44] xformers version: 0.0.23.post1
[2024-06-29 21:44] Set vram state to: NORMAL_VRAM
[2024-06-29 21:44] Device: cuda:0 NVIDIA GeForce RTX 4080 : cudaMallocAsync
[2024-06-29 21:44] VAE dtype: torch.bfloat16
[2024-06-29 21:44] Using xformers cross attention
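The device banner above determines how models will be run for this session: a 16 GB RTX 4080 in NORMAL_VRAM mode, bfloat16 for the VAE, and xformers for cross attention. A minimal environment check that reproduces roughly the same facts (a sketch only; ComfyUI's own dtype and VRAM-state selection logic is more involved):

```python
# Quick CUDA environment check matching the startup banner above; a sketch,
# not ComfyUI's actual model_management logic.
import torch

props = torch.cuda.get_device_properties(0)
print(torch.cuda.get_device_name(0))                      # e.g. NVIDIA GeForce RTX 4080
print(f"Total VRAM {props.total_memory // (1024 * 1024)} MB")
print("torch:", torch.__version__)                        # 2.1.2+cu121 in this log
print("bf16 supported:", torch.cuda.is_bf16_supported())  # True on Ada -> VAE dtype torch.bfloat16
```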
[2024-06-29 21:44] ### Loading: ComfyUI-Manager (V2.38.2)
[2024-06-29 21:44] ### ComfyUI Revision: 2257 [7d225c40] | Released on '2024-06-12'
[2024-06-29 21:44] [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
[2024-06-29 21:44] [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
[2024-06-29 21:44] [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
[2024-06-29 21:44] [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
[2024-06-29 21:44] [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
[2024-06-29 21:44] WARNING: Could not find OPENAI_API_KEY in .env, disabling gpt prompt generation.
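This warning means one of the installed custom nodes looked for OPENAI_API_KEY in a .env file and, not finding it, switched off its optional GPT prompt-generation feature; the rest of the startup is unaffected. A minimal sketch of that kind of gate, assuming the key is read from .env with python-dotenv (the real node's check may differ):

```python
# Sketch, assuming python-dotenv is used to read .env; hypothetical, not the
# node's actual code.
import os
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # loads .env from the working directory if present
gpt_enabled = os.environ.get("OPENAI_API_KEY") is not None
if not gpt_enabled:
    print("WARNING: Could not find OPENAI_API_KEY in .env, disabling gpt prompt generation.")
```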
[2024-06-29 21:44] ### Loading: ComfyUI-Inspire-Pack (V0.76)
[2024-06-29 21:44] ComfyUI_stable_fast: tensorrt_node import failed.
[2024-06-29 21:44] Traceback (most recent call last):
[2024-06-29 21:44] File "/home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast/__init__.py", line 26, in <module>
from .tensorrt_node import (
[2024-06-29 21:44] File "/home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast/tensorrt_node.py", line 13, in <module>
from .module.controlnet_tensorrt import (
[2024-06-29 21:44] File "/home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast/module/controlnet_tensorrt.py", line 1, in <module>
from .tensorrt_wrapper import CallableTensorRTEngineWrapper
[2024-06-29 21:44] File "/home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast/module/tensorrt_wrapper.py", line 19, in <module>
from .tensorrt_utilities import Engine
[2024-06-29 21:44] File "/home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast/module/tensorrt_utilities.py", line 24, in <module>
import onnx
[2024-06-29 21:44] ModuleNotFoundError: No module named 'onnx'
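The traceback above is why ComfyUI_stable_fast's TensorRT node is unavailable for this session: `onnx` is not installed in the venv shown in the startup banner (/home/justin/Workspace/ComfyUI/venv). Installing it there, along with that node's other TensorRT requirements, should let the import succeed. A small preflight check, as a sketch:

```python
# Sketch of a preflight check for the missing dependency; the pip command shown
# is the generic fix, not something ComfyUI_stable_fast prescribes.
import importlib.util
import sys

if importlib.util.find_spec("onnx") is None:
    print("onnx is missing from", sys.executable)
    print("install it into the same venv, e.g.:")
    print(f"  {sys.executable} -m pip install onnx")
```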
[2024-06-29 21:44] ### Loading: ComfyUI-Impact-Pack (V5.14)
[2024-06-29 21:44] ### Loading: ComfyUI-Impact-Pack (Subpack: V0.6)
Import times for custom nodes:
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/websocket_image_save.py
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/sdxl-recommended-res-calc
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Logic
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/extended-saveimage-comfyui
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Stable-Video-Diffusion
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/Comfy_Felsirnodes
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyMath
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_UltimateSDUpscale
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Frame-Interpolation
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_ExtraModels
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_stable_fast
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI_IPAdapter_plus
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Manager
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/comfyui_segment_anything
[2024-06-29 21:44] 0.0 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Inspire-Pack
[2024-06-29 21:44] 0.1 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Impact-Pack
[2024-06-29 21:44] 0.3 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-SUPIR
[2024-06-29 21:44] 0.7 seconds: /home/justin/Workspace/ComfyUI/custom_nodes/eden_comfy_pipelines
[2024-06-29 21:44]
[2024-06-29 21:44] [Impact Pack] Wildcards loading done.
[2024-06-29 21:44] Starting server
[2024-06-29 21:44] To see the GUI go to: http://127.0.0.1:8188
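With the server listening on 127.0.0.1:8188, workflows can be queued either from the browser GUI or over the HTTP API, which is what produces the "got prompt" lines below. A minimal sketch of queuing an API-format workflow programmatically; the workflow_api.json filename is an example (whatever the GUI exports via "Save (API Format)"), not something taken from this log:

```python
# Sketch: queue an API-format workflow against the local ComfyUI server via
# its /prompt endpoint.
import json
import urllib.request

with open("workflow_api.json") as f:   # hypothetical exported workflow
    workflow = json.load(f)

req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # includes the queued prompt_id
```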
[2024-06-29 21:48] FETCH DATA from: /home/justin/Workspace/ComfyUI/custom_nodes/ComfyUI-Manager/extension-node-map.json [DONE]
[2024-06-29 21:51] got prompt
[2024-06-29 21:51] Failed to validate prompt for output 3:
[2024-06-29 21:51] * SUPIR_Upscale 1:
[2024-06-29 21:51] - Value not in list: sdxl_model: 'BENCHMARK_BASE_MODEL' not in (list of length 62)
[2024-06-29 21:51] Output will be ignored
[2024-06-29 21:51] Failed to validate prompt for output 4:
[2024-06-29 21:51] Output will be ignored
[2024-06-29 21:51] invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}
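The two rejected outputs come from a SUPIR_Upscale node whose sdxl_model widget is set to the placeholder 'BENCHMARK_BASE_MODEL', which is not one of the 62 checkpoint filenames the server knows about, so the prompt is refused before execution. The valid values can be read from the server's /object_info endpoint before submitting; a sketch:

```python
# Sketch: list the checkpoint names the server accepts for SUPIR_Upscale's
# sdxl_model input. The nested layout follows ComfyUI's /object_info response,
# where a combo widget's first element is its list of allowed values (assumed).
import json
import urllib.request

with urllib.request.urlopen("http://127.0.0.1:8188/object_info/SUPIR_Upscale") as resp:
    info = json.load(resp)

sdxl_choices = info["SUPIR_Upscale"]["input"]["required"]["sdxl_model"][0]
print(len(sdxl_choices), "valid sdxl_model values, e.g.", sdxl_choices[:3])
```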
[2024-06-29 21:51] got prompt
[2024-06-29 21:51] returning response now before image is done
[2024-06-29 21:51] Diffusion using fp16
[2024-06-29 21:51] Diffusion using bf16
[2024-06-29 21:51] Encoder using bf16
[2024-06-29 21:51] Using non-tiled sampling
[2024-06-29 21:51] making attention of type 'vanilla-xformers' with 512 in_channels
[2024-06-29 21:51] building MemoryEfficientAttnBlock with 512 in_channels...
[2024-06-29 21:51] Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
[2024-06-29 21:51] making attention of type 'vanilla-xformers' with 512 in_channels
[2024-06-29 21:51] building MemoryEfficientAttnBlock with 512 in_channels...
[2024-06-29 21:51] Attempting to load SUPIR model: [/home/justin/Workspace/ComfyUI/models/checkpoints/SUPIR-v0Q.ckpt]
[2024-06-29 21:51] Loaded state_dict from [/home/justin/Workspace/ComfyUI/models/checkpoints/SUPIR-v0Q.ckpt]
[2024-06-29 21:51] Attempting to load SDXL model: [/home/justin/Workspace/ComfyUI/models/checkpoints/sdxl/sdXL_v10VAEFix.safetensors]
[2024-06-29 21:51] Loaded state_dict from [/home/justin/Workspace/ComfyUI/models/checkpoints/sdxl/sdXL_v10VAEFix.safetensors]
[2024-06-29 21:51] Loading first clip model from SDXL checkpoint
[2024-06-29 21:51] Loading second clip model from SDXL checkpoint
[2024-06-29 21:51] captions: ['']
[2024-06-29 21:51] Sampler: .sgm.modules.diffusionmodules.sampling.RestoreEDMSampler
[2024-06-29 21:51] sampler_config: {'num_steps': 45, 'restore_cfg': -1.0, 's_churn': 5, 's_noise': 1.0030000000000001, 'discretization_config': {'target': '.sgm.modules.diffusionmodules.discretizer.LegacyDDPMDiscretization'}, 'guider_config': {'target': '.sgm.modules.diffusionmodules.guiders.LinearCFG', 'params': {'scale': 4.0, 'scale_min': 4.0}}}
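The sampler_config shows a RestoreEDMSampler running 45 steps with a LinearCFG guider whose scale and scale_min are both 4.0, so whatever interpolation LinearCFG applies between those two endpoints, the effective guidance in this run is a constant 4.0. A generic sketch of linearly scheduled classifier-free guidance under that assumption (the exact schedule lives in the bundled .sgm package and may be parameterized differently):

```python
# Generic sketch of CFG with a linearly interpolated scale. Assumption: the
# scale moves linearly from scale_min to scale over normalized progress t in
# [0, 1]; SUPIR's actual LinearCFG schedule may differ.
def linear_cfg_scale(t: float, scale: float = 4.0, scale_min: float = 4.0) -> float:
    return scale_min + (scale - scale_min) * t   # constant 4.0 when scale == scale_min

def apply_cfg(pred_uncond, pred_cond, scale):
    # standard classifier-free guidance combination of the two model predictions
    return pred_uncond + scale * (pred_cond - pred_uncond)
```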
[2024-06-29 21:51] /home/justin/Workspace/ComfyUI/venv/lib/python3.10/site-packages/lightning_fabric/utilities/seed.py:51: 529891951515064 is not in bounds, numpy accepts from 0 to 4294967295
[2024-06-29 21:51] Seed set to 0
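The lightning warning above means the requested seed 529891951515064 exceeds numpy's 32-bit limit of 4294967295, so the library falls back and the run is seeded with 0 instead of the intended value. Folding the seed into range first keeps the run tied to the original number; a sketch (the modulo mapping is a common convention, not something the SUPIR node specifies):

```python
# Sketch: map an oversized seed into numpy's accepted 0..2**32-1 range before
# seeding, instead of silently falling back to 0.
requested_seed = 529891951515064
safe_seed = requested_seed % (2 ** 32)
print(safe_seed)  # 361371064, within 0..4294967295
```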
[2024-06-29 21:51] [Tiled VAE]: input_size: torch.Size([1, 3, 1664, 3200]), tile_size: 512, padding: 32
[2024-06-29 21:51] [Tiled VAE]: split to 4x7 = 28 tiles. Optimal tile size 448x416, original tile size 512x512
[2024-06-29 21:52] [Tiled VAE]: Executing Encoder Task Queue: 100%|████████████████████████████████████████████████| 2548/2548 [00:07<00:00, 318.97it/s]
[2024-06-29 21:52] [Tiled VAE]: Done in 8.227s, max VRAM alloc 585.941 MB
[2024-06-29 21:52] [Tiled VAE]: input_size: torch.Size([1, 4, 208, 400]), tile_size: 96, padding: 11
[2024-06-29 21:52] [Tiled VAE]: split to 2x4 = 8 tiles. Optimal tile size 96x96, original tile size 96x96
[2024-06-29 21:52] [Tiled VAE]: Executing Decoder Task Queue: 100%|███████████████████████████████████████████████████| 984/984 [00:17<00:00, 55.13it/s]
[2024-06-29 21:52] [Tiled VAE]: Done in 18.096s, max VRAM alloc 2137.969 MB
[2024-06-29 21:52] [Tiled VAE]: input_size: torch.Size([1, 3, 1664, 3200]), tile_size: 512, padding: 32
[2024-06-29 21:52] [Tiled VAE]: split to 4x7 = 28 tiles. Optimal tile size 448x416, original tile size 512x512
[2024-06-29 21:52] [Tiled VAE]: Executing Encoder Task Queue: 100%|████████████████████████████████████████████████| 2548/2548 [00:07<00:00, 354.89it/s]
[2024-06-29 21:52] [Tiled VAE]: Done in 7.410s, max VRAM alloc 647.514 MB
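The Tiled VAE passes above keep peak VRAM around 0.6 to 2.2 GB by encoding and decoding in an overlapping grid of tiles: the 1664x3200 pixel input splits into 4x7 tiles of at most 512 px, and the 208x400 latent decodes in a 2x4 grid of 96-unit tiles. A sketch of the grid arithmetic, assuming the count is ceil((dim - 2*padding) / tile_size), which reproduces both splits in this log; the "Optimal tile size" rounding is the extension's own and is not reproduced here:

```python
# Sketch of the tile-grid arithmetic behind the "[Tiled VAE]: split to ..." lines.
# Assumption: rows/cols = ceil((dim - 2*padding) / tile_size); this matches the
# 4x7 and 2x4 splits logged above but is inferred, not taken from the source.
import math

def tile_grid(height, width, tile_size, padding):
    rows = math.ceil((height - 2 * padding) / tile_size)
    cols = math.ceil((width - 2 * padding) / tile_size)
    return rows, cols, rows * cols

print(tile_grid(1664, 3200, tile_size=512, padding=32))  # (4, 7, 28) encoder passes
print(tile_grid(208, 400, tile_size=96, padding=11))     # (2, 4, 8)  decoder passes
```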
[2024-06-29 21:52] Using local prompt:
[2024-06-29 21:52] ['high quality, detailed']
[2024-06-29 21:54] [Tiled VAE]: input_size: torch.Size([1, 4, 208, 400]), tile_size: 96, padding: 11
[2024-06-29 21:54] [Tiled VAE]: split to 2x4 = 8 tiles. Optimal tile size 96x96, original tile size 96x96
[2024-06-29 21:54] [Tiled VAE]: Executing Decoder Task Queue: 100%|███████████████████████████████████████████████████| 984/984 [00:14<00:00, 70.25it/s]
[2024-06-29 21:54] [Tiled VAE]: Done in 14.462s, max VRAM alloc 2235.975 MB
[2024-06-29 21:54] Sampled 1 out of 1
[2024-06-29 21:54] Prompt executed in 161.65 seconds
[2024-06-29 21:56]
Stopped server