
Sending large job data #33

Open
kdar opened this issue Mar 9, 2014 · 7 comments

Comments

@kdar

kdar commented Mar 9, 2014

I noticed with my application, if I send a large amount of data, then it takes a long time to process while gearman is spewing out "Not enough data" errors back at me. There are multiple solutions to this problem, but one I tested can be found here:

https://github.com/kdar/gearman-go/compare/big-data

It basically reads the entire packet upfront, before it ever reaches decodeInPack(), so decodeInPack() won't throw an error. Another solution is to have the caller of decodeInPack() notice when there is not enough data and wait until a sufficient amount has arrived before continuing. You would also need to increase bufferSize, since a size of 1024 is extremely small and would still make processing take a long time.

Let me know what you think.

@mikespook
Owner

Your solution looks right, but I need a few days to think about this issue.
If I can make sure there are no other problems, would you please open a pull request?

@kdar
Author

kdar commented Mar 9, 2014

Yup. No problem.

@justinruggles

Any update on this?

@mikespook
Owner

I've made a pull request and merged it. Could you please test it?

@justinruggles

Unfortunately this is now causing more problems than it fixed. It worked fine for a few jobs when I tested it, but in production, at higher volume, it eventually hangs on reading from the connection and blocks all incoming jobs. For my use case, I will likely just revert this commit and increase the buffer size to fit my needs.

@mikespook
Owner

I've no idea why it would hang on reading.

Could you tell me which line is blocked, L171 or L181?

@ndhfs

ndhfs commented Nov 13, 2016

I reverted de91c99, then fixed the read func like so:

```go
func (a *agent) read(length int) (data []byte, err error) {
	var headerBuf []byte

	// Peek at the packet header without consuming it from the buffered reader.
	if headerBuf, err = a.rw.Peek(minPacketLength); err != nil {
		return
	}

	// Total packet length = header size plus the payload size encoded
	// big-endian in header bytes 8..12.
	dl := int(binary.BigEndian.Uint32(headerBuf[8:12])) + minPacketLength

	data = make([]byte, dl)

	// Read until the whole packet has arrived. Reading into data[n:] keeps
	// each Read from consuming bytes that belong to the next packet.
	for n := 0; n < dl; {
		var m int
		if m, err = a.rw.Read(data[n:]); err != nil {
			return
		}
		n += m
	}
	return
}
```

And it seems to work stably.

Repository owner deleted a comment from guijunchen Feb 23, 2024