
Upload progress #127 (Draft)

kevinresol wants to merge 2 commits into master.
Conversation

kevinresol (Member, Author)

No description provided.

kevinresol changed the title from "Getting started with JS" to "Upload progress" on Dec 3, 2020
back2dos (Member) commented on Dec 4, 2020:

Hmm. Off the top of my head, I'd say this can be implemented once in Client rather than in each ClientObject. Not sure that's better, but it's certainly less work ^^

kevinresol (Member, Author) commented:

How does that work?

back2dos (Member) commented on Dec 4, 2020:

Well. For starters we need something to monitor the progress of a source. This looks like it might do the trick:

// Assumed imports, following the usual tink module layout:
import tink.Chunk;
import tink.io.Source;
import tink.streams.Stream;
using tink.CoreApi;

// Wraps a source and reports the running byte count to the progressed
// callback as chunks are consumed.
class WithProgress<E> extends StreamBase<Chunk, E> {
  var data:SourceObject<E>;  // the wrapped source
  var progressed:Int->Void;  // called with the total number of bytes seen so far
  var written:Int;           // bytes counted so far

  override function get_depleted():Bool
    return data.depleted;

  public function new(data, progressed, written = 0) {
    this.data = data;
    this.progressed = progressed;
    this.written = written;
  }

  override function next():Future<Step<Chunk, E>> {
    return data.next().map(function (step) return switch step {
      case Link(value, next):
        // count the chunk and hand the updated total on to the wrapped tail
        var written = this.written + value.length;
        progressed(written);
        Link(value, new WithProgress(next, progressed, written));
      default: step;
    });
  }

  override function forEach<Safety>(handler:Handler<Chunk, Safety>):Future<Conclusion<Chunk, Safety, E>> {
    var written = this.written;
    return data.forEach(function (chunk) {
      return handler.apply(chunk).map(function (h) {
        // only count the chunk once the handler has actually accepted it
        if (h.match(Finish | Resume))
          progressed(written += chunk.length);
        return h;
      });
    }).map(function (conclusion):Conclusion<Chunk, Safety, E> return switch conclusion {
      // rewrap whatever remains of the stream so progress keeps accumulating
      case Halted(rest): Halted(new WithProgress(rest, progressed, written));
      case Clogged(error, at): Clogged(error, new WithProgress(at, progressed, written));
      case Failed(error): Failed(error);
      case Depleted: Depleted;
    });
  }
}
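
For context, a quick usage sketch (the trace callback is just illustrative, and body stands for whatever source carries the upload):

// wrap the body once and hand tracked to whatever consumes the bytes;
// each accepted chunk reports the running total to the callback
var tracked = new WithProgress(body, function (sent) trace('uploaded $sent bytes'));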

It's not 100% ideal. For example, calling Stream::next will count as progress whether or not the bytes are actually processed. But if the consumer then starts a forEach on the original stream (rather than on the tail of the Link), it starts counting again at 0, so the number should still be correct. Or so I hope ^^

With that in place, whenever a progress handler is present, you can extract the total size from the corresponding message head (if set) and wrap the message body in a WithProgress so that it calls the handler appropriately.
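
For illustration, a minimal sketch of that wrapping step. The withUploadProgress helper and the onProgress callback are hypothetical rather than part of the tink_http API, and it assumes an OutgoingRequest can be rebuilt from its header and a wrapped body:

import tink.http.Request;

// Hypothetical helper (WithProgress from above assumed in scope): wrap the
// outgoing body so that every chunk the underlying client consumes reports
// the running byte count.
function withUploadProgress(req:OutgoingRequest, onProgress:Int->Void):OutgoingRequest {
  return new OutgoingRequest(req.header, new WithProgress(req.body, onProgress));
}

A progress-aware Client could apply this once before handing the request to the underlying ClientObject, which is what would make the implement-it-once-in-Client idea above pay off.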

back2dos (Member) commented on Dec 4, 2020:

But of course this presumes that the underlying client can actually gradually read the data from the Source, which some platforms might not allow.
