
Using certain naming conventions for the directory path breaks loading previous saves for LoRA Training #200

Open
cswattts opened this issue Oct 22, 2024 · 0 comments

cswattts commented Oct 22, 2024

Describe the bug

paths = []
for pattern in patterns:
    paths.extend(glob.glob(os.path.join(self.save_root, pattern)))

# Filter out non-existent paths and sort by creation time
if paths:
    paths = [p for p in paths if os.path.exists(p)]
    # remove false positives
    if '_LoRA' not in name:
        paths = [p for p in paths if '_LoRA' not in p]
    if '_refiner' not in name:
        paths = [p for p in paths if '_refiner' not in p]
    if '_t2i' not in name:
        paths = [p for p in paths if '_t2i' not in p]
    if '_cn' not in name:
        paths = [p for p in paths if '_cn' not in p]

    if len(paths) > 0:
        latest_path = max(paths, key=os.path.getctime)

This section of code in BaseSDTrainProcess.py (L621:639) removes any path that contains one of those suffixes anywhere in it, which prevents previous saves from being loaded when the suffix appears in a directory name rather than the filename.

Recommended fix: change the check to look at the basename (the filename) rather than the entire path. In the shared Colab the config value is set to content/output, but if you modify it as I did, for example to "my_LoRA/output", the code will not locate any previous saves, because every path under that directory contains "_LoRA" and gets filtered out.
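
For illustration, a minimal sketch of how the basename check could look, reusing the names from the quoted snippet (patterns, name, self.save_root); this is an untested sketch of the suggested change, not a verified patch:

import glob
import os

paths = []
for pattern in patterns:
    paths.extend(glob.glob(os.path.join(self.save_root, pattern)))

# Filter out non-existent paths and sort by creation time
if paths:
    paths = [p for p in paths if os.path.exists(p)]
    # remove false positives by checking only the filename, so a save_root
    # like "my_LoRA/output" no longer filters out valid saves
    for suffix in ('_LoRA', '_refiner', '_t2i', '_cn'):
        if suffix not in name:
            paths = [p for p in paths if suffix not in os.path.basename(p)]

    if len(paths) > 0:
        latest_path = max(paths, key=os.path.getctime)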
