Hi there!

I couldn't understand why a `break` (rather than a `continue`) is used when building `batch_inputs` and `batch_outputs` for the case where a context has no candidates:

lamorel/lamorel/src/lamorel/server/llms/hf_llm.py, lines 255 to 256 in b012e2e

This skips all the remaining contexts, so they never make it into `batch_inputs` and `batch_outputs`. Is this the desired behavior? If so, could you please explain why? Furthermore, it seems we would then hit a `KeyError` when building `batch_input_representations`:

lamorel/lamorel/src/lamorel/server/llms/hf_llm.py, lines 282 to 286 in b012e2e

since we iterate over all `contexts`, including the ones skipped earlier, but there is no entry for them in `_ids_tables`.

@ClementRomac
Thanks for spotting this. This looks like legacy code that shouldn't be there anymore.

I think, at least for decoder-only models, we should create a single output containing the last token of the input so we can still get activations when `pre_encode_inputs=True`. I guess the easiest fix would be to replace the `break` statement with `_candidates = [""]`.

I need to test this properly to verify it is actually necessary.
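To make the discussion concrete, here is a minimal, hypothetical sketch of the batching loop. It is not the actual code from `hf_llm.py`; only the names `contexts`, `batch_inputs`, `batch_outputs`, `_candidates`, and `_ids_tables` come from the issue, and the loop structure is an assumption for illustration:

```python
# Hypothetical simplification of the batching loop discussed in the issue.
# `fix=False` reproduces the current `break` behavior; `fix=True` applies
# the proposed `_candidates = [""]` replacement.
def build_batches(contexts, candidates_per_context, fix=False):
    batch_inputs, batch_outputs = [], []
    _ids_tables = {}  # maps context index -> indices of its rows in the batch
    for idx, context in enumerate(contexts):
        _candidates = candidates_per_context[idx]
        if len(_candidates) == 0:
            if not fix:
                break  # current behavior: also drops every *following* context
            _candidates = [""]  # proposed fix: keep one empty candidate
        start = len(batch_inputs)
        _ids_tables[idx] = list(range(start, start + len(_candidates)))
        for cand in _candidates:
            batch_inputs.append(context)
            batch_outputs.append(cand)
    return batch_inputs, batch_outputs, _ids_tables


# With `break`, the empty context "b" *and* the later context "c" are dropped,
# so a later loop over all contexts doing `_ids_tables[idx]` raises KeyError:
_, _, tables = build_batches(["a", "b", "c"], [["x"], [], ["y"]])
assert 2 not in tables  # context "c" was silently skipped

# With the proposed fix, every context keeps at least one (empty) candidate:
_, _, tables = build_batches(["a", "b", "c"], [["x"], [], ["y"]], fix=True)
assert set(tables) == {0, 1, 2}
```

A plain `continue` would avoid skipping later contexts but would still leave a hole in `_ids_tables`; the `_candidates = [""]` variant keeps the table dense, which also matches the goal of still getting activations for the input when `pre_encode_inputs=True`.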