
Cell's call: shouldn't the input's first dimension be of batch size? #22

Open
guillaume-chevalier opened this issue Jan 23, 2017 · 0 comments


guillaume-chevalier commented Jan 23, 2017

In the call function, the docstring describes the input tensor as: inputs: input Tensor, 2D, 1 x input_size.

Shouldn't it rather be inputs: input Tensor, 2D, batch x input_size?

The call returns 2D tensors whose first dimension is the batch size. Training is so fast on my laptop compared to a normal LSTM that I am starting to doubt whether it actually processes the full batch I feed to the cell. I assume it accepts an input of shape batch x input_size, since the output of the call contains 2D tensors of batch size.

def __call__(self, input_, state=None, scope=None):
    """Run one step of NTM.
    Args:
        inputs: input Tensor, 2D, 1 x input_size.
        state: state Dictionary which contains M, read_w, write_w, read,
            output, hidden.
        scope: VariableScope for the created subgraph; defaults to class name.
    Returns:
        A tuple containing:
        - A 2D, batch x output_dim, Tensor representing the output of the LSTM
            after reading "input_" when previous state was "state".
            Here output_dim is:
                 num_proj if num_proj was set,
                 num_units otherwise.
        - A 2D, batch x state_size, Tensor representing the new state of LSTM
            after reading "input_" when previous state was "state".
    """

Found in:
https://github.com/carpedm20/NTM-tensorflow/blob/master/ntm_cell.py
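
For comparison, here is a minimal sketch of the batch-first convention the question refers to. It uses a plain TF 1.x BasicLSTMCell purely for illustration (it is not the NTM cell from this repo, and nothing about the NTM cell's constructor is assumed): a standard RNN cell's one-step call takes inputs of shape batch x input_size and returns an output of shape batch x num_units.

import tensorflow as tf

batch_size, input_size, num_units = 32, 10, 20

# One time step of input: batch x input_size, not 1 x input_size.
inputs = tf.placeholder(tf.float32, shape=[batch_size, input_size])

# A plain LSTM cell, used only to illustrate the convention; not the NTM cell.
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units)
state = cell.zero_state(batch_size, tf.float32)

# One step of the cell: the output comes back as batch x num_units.
output, new_state = cell(inputs, state)
print(output.get_shape())  # (32, 20), i.e. batch x num_units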
