Error when loading large file to local server #7
Can you provide the line number and file the error is from? |
As I recall, it said parse:1 |
So it's coming from the backend and likely means Pyulog is failing to read the file somehow. I'm not sure I can debug this one without the log file. Could you share it? |
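As a side note, a minimal way to test whether pyulog itself can read the file (the file name below is a placeholder):

```python
# Minimal sketch: try parsing the log directly with pyulog to see whether the
# failure is in the parser or in the upload. The file name is a placeholder.
from pyulog import ULog

try:
    ulog = ULog("flight_log.ulg")
    print(f"parsed OK: {len(ulog.data_list)} message series")
except Exception as exc:
    print(f"pyulog failed to parse the file: {exc}")
```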
Unfortunately I cannot share the particular log file. However, I might be able to find another one and test with that as well. I will also look into the parsing with python. |
I understand, no worries. If you can run the ulog2csv command, it is most likely a network issue and not an issue with the log. |
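For reference, a minimal sketch of running that check from Python; the ulog2csv CLI ships with pyulog, and the file and output paths here are placeholders:

```python
# Minimal sketch: invoke the ulog2csv CLI (installed alongside pyulog) to
# verify the log parses locally. Paths are placeholders; -o is the output dir.
import subprocess

result = subprocess.run(
    ["ulog2csv", "flight_log.ulg", "-o", "csv_out"],
    capture_output=True, text=True,
)
print(result.stdout or result.stderr)
print("exit code:", result.returncode)
```

If this completes cleanly while the upload still fails intermittently, that points at the network path rather than the parser.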
It is definitely not an issue with the log. It doesn’t fail every time. If I try multiple times, it will succeed at some point. |
I see, definitely a network issue then. I am working on a JavaScript parser which would solve these issues, but if it looks like that will take a lot of time, I will invest more time into better netcode instead. |
The log is fine; if I try multiple times with the same log, eventually it will succeed. |
With the larger logs I have noticed it can sometimes fail to send all the data to the Python server correctly. With a JavaScript parser that step is no longer necessary, and there shouldn't be network failures since everything would happen on the client side. |
In the meantime, I could set up Data Comets to accept pre-processed logs. That way, for very large logs, you could parse them separately with a quick Python command and then load the result into Data Comets. This would also be faster for very large logs since you don't have to deal with the network. Less convenient, but a stopgap for now; what do you think? |
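A hedged sketch of what that pre-processing step could look like with pyulog; the JSON layout below is illustrative only, since the exact schema Data Comets expects is not documented here:

```python
# Illustrative sketch: convert a ULog file to JSON on the local machine so the
# front end can load it directly. The output schema is an assumption and would
# need to match whatever Data Comets' own parse step produces.
import json
from pyulog import ULog

ulog = ULog("flight_log.ulg")
out = {}
for series in ulog.data_list:
    # series.data maps field names to numpy arrays; convert to plain lists
    key = f"{series.name}_{series.multi_id}"
    out[key] = {field: values.tolist() for field, values in series.data.items()}

with open("flight_log.json", "w") as f:
    json.dump(out, f)
```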
Sounds great. Please let me know if I can be of any assistance. |
@dsaffo I missed the comment about preprocessed logs. How would I preprocess the logs then? It doesn't seem to be the "parsing" that fails, but the uploading, which would still have to take place since the server does the plotting? Please correct me if I am wrong. |
Keep in mind that when I say "local server", it is a server I started with gunicorn, following your suggestion. |
You would run the parse code beforehand on your local machine and load the .json file instead. The JSON is just loaded on the front end, so there shouldn't be a problem... maybe. |
Actually, after doing some more research, I think I will try to implement this plugin instead: https://github.com/blueimp/jQuery-File-Upload/blob/master/README.md#features. It features file chunking and progress bars, so we might get a two-for-one solution here. |
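For the server side of that plugin, a hedged sketch of how a Flask endpoint could reassemble the chunks; jQuery-File-Upload marks each chunk with a Content-Range header, but the endpoint name and paths here are hypothetical:

```python
# Hypothetical sketch: accept chunked uploads by appending each chunk to the
# target file. Assumes chunks arrive in order, as jQuery-File-Upload sends
# them, and that each request carries a Content-Range header such as
# "bytes 0-999999/72000000". Endpoint name and directory are placeholders.
import os
from flask import Flask, request

app = Flask(__name__)
UPLOAD_DIR = "/tmp/uploads"
os.makedirs(UPLOAD_DIR, exist_ok=True)

@app.route("/upload", methods=["POST"])
def upload_chunk():
    chunk = request.files["file"]
    path = os.path.join(UPLOAD_DIR, chunk.filename)
    content_range = request.headers.get("Content-Range", "")
    # Truncate on the first chunk (or a whole-file upload), append otherwise.
    first = content_range.startswith("bytes 0-") or not content_range
    with open(path, "wb" if first else "ab") as out:
        out.write(chunk.read())
    return {"name": chunk.filename}
```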
@dsaffo That would definitely be the preferred solution. |
File size is 72 MB and I get the following error in the browser's JavaScript console. However, the loading indicator is still "running".
"Failed to load resource: net::ERR_CONTENT_LENGTH_MISMATCH"