Dataset: traffic Total train points: 12086964 Total val points: 981818
Dataset: kdd_cup_2018_without_missing Total train points: 2925624 Total val points: 307530
Dataset: saugeenday Total train points: 23697 Total val points: 1139
Dataset: sunspot_without_missing Total train points: 73880 Total val points: 1139
Traceback (most recent call last):
File "/content/lag-llama/run.py", line 843, in
train(args)
File "/content/lag-llama/run.py", line 334, in train
) = create_train_and_val_datasets_with_dates(
File "/content/lag-llama/data/data_utils.py", line 156, in create_train_and_val_datasets_with_dates
with open(path, "r") as f: data = json.load(f)
FileNotFoundError: [Errno 2] No such file or directory: 'datasets/huawei/cpu_limit_minute.json'
Hi! You might not have updated your repo. Please pull the latest changes: the dataset path handling was updated, and it now uses the path you specify in run.py.
Also, please check that you have followed the instructions at the top of the script before executing it.
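If pulling the latest repo doesn't resolve it, one way to confirm the path is wired up correctly is to check that the expected JSON file exists and parses before launching training. This is a minimal sketch, not part of run.py: the `datasets/huawei/cpu_limit_minute.json` layout is taken from the traceback, and the root directory is an assumption.

```python
import json
from pathlib import Path

# Assumed root; mirrors the relative path in the traceback
# ('datasets/huawei/cpu_limit_minute.json').
dataset_root = Path("datasets")
expected = dataset_root / "huawei" / "cpu_limit_minute.json"

if not expected.exists():
    raise SystemExit(
        f"Missing {expected}. Pull the latest repo and make sure the "
        "dataset path you pass to run.py points at the directory that "
        "actually contains the Huawei JSON files."
    )

# Sanity-check that the file parses as JSON before training starts.
with open(expected, "r") as f:
    data = json.load(f)
print(f"Loaded {expected}")
```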