About the validation #4
Comments
The eval command is almost the same as the inference one. I have tried it and got a return result like:
Yes, your example is correct. Is this working?
I am still in the training process; I just wanted to know the eval command in advance. OK, thanks very much anyway @antoine77340.
OK, thanks, I will also try it based on the inference command. @wincle
@antoine77340 Hi, here is another question: if I use all the train and validation data to train these models with no eval process, are the test results reasonable? Have you tried that in your experiments?
Hi, when I ran your inference code on the test data, I got the following error: "tensorflow.python.framework.errors_impl.FailedPreconditionError: Attempting to use uninitialized value train_input/input_producer/limit_epochs/epochs". I did not modify the code anywhere. @antoine77340 @wincle
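This is only a guess, not a confirmed diagnosis of the repo's code: in TF 1.x, tf.train.limit_epochs (created by tf.train.string_input_producer when num_epochs is set) keeps its epoch counter in a local variable, so a session that only runs the global initializer raises exactly this "uninitialized value ... limit_epochs/epochs" error. A minimal standalone sketch with a hypothetical input path:

```python
# Minimal sketch, not the repo's code: shows the fix for the
# "uninitialized value ... limit_epochs/epochs" error in TF 1.x.
import tensorflow as tf

# Hypothetical pattern; limit_epochs is created because num_epochs is set.
files = tf.train.match_filenames_once("/path/to/test*.tfrecord")
filename_queue = tf.train.string_input_producer(files, num_epochs=1)

with tf.Session() as sess:
    # limit_epochs stores its counter in a *local* variable, so the local
    # initializer is required in addition to the global one.
    sess.run([tf.global_variables_initializer(),
              tf.local_variables_initializer()])
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    # ... read batches from filename_queue here ...
    coord.request_stop()
    coord.join(threads)
```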
Hi @junfengluo, if you use the train and validation data to train the models without an eval process,
Hmm, that is strange; I do not understand this error (I am still not very good at reading TensorFlow error messages, ahah). I tried to re-run this inference command with the latest TF version and it seems to work on my side. Are you sure you correctly trained the model and that at least one model is correctly exported?
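One quick way to check the export question, as a sketch only (the train_dir below is copied from the command in this thread; the rest is generic TF 1.x):

```python
# Minimal sketch to verify that a checkpoint was exported to the train_dir
# before running inference; prints None if no checkpoint file exists yet.
import tensorflow as tf

train_dir = "gatedlightvladLF-256k-1024-80-0002-300iter-norelu-basic-gatedmoe"
latest = tf.train.latest_checkpoint(train_dir)
print("latest checkpoint:", latest)  # e.g. ".../model.ckpt-300000" if training reached step 300000
```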
Yeah, I also trained the models with TF 1.3.0, and I am sure the GRU model was trained correctly for 300000 steps. Two other models are still training on two separate GPUs, and I don't know how to solve this problem. Do these 7 models affect each other when executing the inference command?
I haven't met that problem; inference and evaluation both work fine for me.
Hello, can you tell me how to transform a video id such as "-1VnJGJ6c2U" into the integer that is shown in the result *.csv file?
I found that the code which transforms the video id into an integer mainly comes down to two lines in inference.py:
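The two lines referred to above are not preserved in this thread. Purely as a hypothetical illustration (not the actual inference.py code), one simple way to map a string video id to an integer is to assign each id the next free index the first time it is seen:

```python
# Hypothetical illustration only; this is NOT the code from inference.py.
video_id_to_int = {}

def to_int(video_id):
    """Return a stable integer index for a video id string."""
    if video_id not in video_id_to_int:
        video_id_to_int[video_id] = len(video_id_to_int)
    return video_id_to_int[video_id]

print(to_int("-1VnJGJ6c2U"))  # 0 on the first call, same value on every later call
```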
Hi, can you tell me about the validation part? There is no description of validation in your code. Can I just use eval.py to evaluate the trained model on the validation data, analogous to your inference.py command, as follows? Or can you give me an example?
python eval.py --eval_data_pattern="$path_to_features/validatea*.tfrecord" --model=NetVLADModelLF --train_dir=gatedlightvladLF-256k-1024-80-0002-300iter-norelu-basic-gatedmoe --frame_features=True --feature_names="rgb,audio" --feature_sizes="1024,128" --batch_size=1024 --base_learning_rate=0.0002 --netvlad_cluster_size=256 --netvlad_hidden_size=1024 --moe_l2=1e-6 --iterations=300 --learning_rate_decay=0.8 --netvlad_relu=False --gating=True --moe_prob_gating=True --lightvlad=True --run_once=True --top_k=50