hey stagehand team, I'm seeing a few methods return an `any` type -- this isn't good practice, especially in the LLM client, where message responses come back as `any` instead of just a string.
Specifically, in the lines I posted upthread, `response.choices[0].message.tool_calls` is read from the response generated by `llmClient.createChatCompletion`, not from OpenAI specifically, which can cause downstream errors when another provider's response shape differs.
My suggestion would be to have `createChatCompletion` return a `Promise<string>` containing the chat output (if you want pricing or usage information, that can be captured in logging -- separate conversation). You could also add a `callTool` method that likewise returns a string but handles tool calling per provider.
That way, if a client downstream wants to use Ollama, Perplexity, or any other API provider, they can plug and play easily by implementing their own `llmClient`.
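To make the suggestion concrete, here's a minimal sketch of what such an interface could look like. All names here (`ChatMessage`, `LLMClient`, `MockOllamaClient`) are illustrative assumptions, not Stagehand's actual API:

```typescript
// Illustrative sketch only -- these type and class names are hypothetical,
// not taken from Stagehand's codebase.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMClient {
  // Returns only the message text; provider-specific response shapes
  // (e.g. OpenAI's choices[0].message) stay internal to the client.
  createChatCompletion(messages: ChatMessage[]): Promise<string>;
  // Handles provider-specific tool-call plumbing, still returning a plain string.
  callTool(toolName: string, args: Record<string, unknown>): Promise<string>;
}

// A mock showing how a third-party provider (e.g. Ollama) could plug in
// by implementing the same narrow interface.
class MockOllamaClient implements LLMClient {
  async createChatCompletion(messages: ChatMessage[]): Promise<string> {
    const last = messages[messages.length - 1];
    return `echo: ${last.content}`;
  }

  async callTool(toolName: string, args: Record<string, unknown>): Promise<string> {
    return JSON.stringify({ tool: toolName, args });
  }
}
```

Because consumers only ever see `Promise<string>`, swapping providers never leaks a provider-specific response object into downstream code.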