Azure Identity AzCli Token Credential Causes Performance Issues #38912
Comments
Thank you for the feedback @yogilad. We will investigate and get back to you asap.
Hey, thanks for the feedback. From what I recall, the Azure CLI maintains its own cache (also using MSAL), so the decision was made to leave the cache handling to them (similar to other dev credentials). I will need to check with the wider Identity SDK team for additional context here. I suppose the call to …
Thanks for the response. Technically it is possible, but since MSAL and Azure Identity already implement some form of in-memory cache in most flows, it sounds redundant to add this only to cover the AzCli case. It would be better for the token credential itself to have a built-in caching mechanism, in parity with the other implementations.
I believe one of the prevailing reasons why we didn't implement an extra layer of caching inside the developer credentials is to maintain consistency with the current user session of the external process (e.g. …).
As you mentioned, the CLI does already cache the credential, which is why you don't get prompted for interactive login each time. However, to access that cache the process still needs to be started and the cache needs to be deserialized from disk (since there is no persistent process that could quickly fetch it from memory). This is much slower than accessing an in-memory cache. Lastly, if the credential is being used with one of our SDKs, the client should be caching the token at the bearer token policy layer as long as a singleton instance of the client is used.
Right: the credential always authenticates the user currently logged in to the CLI, to avoid confusing and difficult-to-debug situations in which the authenticated user depends on the timing of token requests. This does mean the credential must run the CLI for every token request, which is slow, but we accept that compromise because the credential is intended for use in local development, not high-performance scenarios, and keeping the credential in sync with the CLI makes it easier to understand.
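The bearer-token-policy caching mentioned above can be illustrated with a minimal, library-free sketch. All names here are hypothetical stand-ins (`SlowCliCredential` simulates a credential that shells out to `az`, and `AccessToken` stands in for `azure.core.credentials.AccessToken`); the real cache lives inside azure-core's bearer token policy:

```python
import time
from collections import namedtuple

# Hypothetical stand-in for azure.core.credentials.AccessToken.
AccessToken = namedtuple("AccessToken", ["token", "expires_on"])

class SlowCliCredential:
    """Simulates a credential that launches the Azure CLI on every call."""
    def __init__(self):
        self.calls = 0

    def get_token(self, *scopes):
        self.calls += 1  # each call would cost hundreds of ms in practice
        return AccessToken("tok-%d" % self.calls, time.time() + 3600)

class BearerTokenPolicy:
    """Caches the token, refreshing only when it nears expiry."""
    def __init__(self, credential, scope, refresh_skew=300):
        self._credential = credential
        self._scope = scope
        self._skew = refresh_skew
        self._token = None

    def authorization_header(self):
        if self._token is None or self._token.expires_on - time.time() < self._skew:
            self._token = self._credential.get_token(self._scope)
        return "Bearer " + self._token.token

# A singleton client holds one policy, so repeated requests reuse the token.
cred = SlowCliCredential()
policy = BearerTokenPolicy(cred, "https://kusto.example/.default")
headers = [policy.authorization_header() for _ in range(5)]
print(cred.calls)  # prints 1: four of the five requests hit the cache
```

This is why reusing a single client instance matters: a client created per request also creates a fresh policy per request, and every call pays the full CLI launch cost.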
Hi @yogilad. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario, please respond to the questions asked above. This will help us more accurately address your issue.
Is your feature request related to a problem? Please describe.
We ran into a performance issue with users leveraging the Az CLI token credential with the Kusto SDK.
The root cause seems to be that the token credential never caches the token. As a result, each request spends hundreds of milliseconds fetching a token, which slows down the application.
Looking at the documentation of the token cache confirms this.
Is there a timeline for adopting an in-memory cache for all token credentials, or was there an explicit choice not to cache tokens for Az and a few other providers?
(I'm asking since I looked at the C# SDK as well, and both seem to be at parity with respect to cache support.)
Describe the solution you'd like
Implement a built-in in-memory cache in the token credentials in all SDKs (not just Python).
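One possible shape for the requested behavior is a wrapper credential that caches tokens per scope and refreshes them shortly before expiry. This is a sketch under assumptions: the class name, the 300-second refresh skew, and `FakeCliCredential` (a test double for `AzureCliCredential`) are all hypothetical, and `AccessToken` here is a stub for `azure.core.credentials.AccessToken`:

```python
import time
import threading
from collections import namedtuple

# Hypothetical stand-in for azure.core.credentials.AccessToken.
AccessToken = namedtuple("AccessToken", ["token", "expires_on"])

class InMemoryCachingCredential:
    """Wraps any credential and caches tokens per scope until near expiry.

    Hypothetical sketch; azure-identity does not ship this class.
    """
    def __init__(self, inner, refresh_skew=300):
        self._inner = inner
        self._skew = refresh_skew
        self._cache = {}          # scope tuple -> AccessToken
        self._lock = threading.Lock()

    def get_token(self, *scopes):
        key = tuple(scopes)
        with self._lock:
            cached = self._cache.get(key)
            if cached and cached.expires_on - time.time() > self._skew:
                return cached     # still valid: no process launch needed
            token = self._inner.get_token(*scopes)
            self._cache[key] = token
            return token

class FakeCliCredential:
    """Test double standing in for AzureCliCredential."""
    def __init__(self):
        self.calls = 0

    def get_token(self, *scopes):
        self.calls += 1  # in reality this launches the `az` process
        return AccessToken("tok-%d" % self.calls, time.time() + 3600)

inner = FakeCliCredential()
cred = InMemoryCachingCredential(inner)
t1 = cred.get_token("https://kusto.example/.default")
t2 = cred.get_token("https://kusto.example/.default")
print(inner.calls)  # prints 1: the second request was served from memory
```

Note the trade-off the maintainers describe above: a cache like this can return tokens for a user who has since logged out of, or switched accounts in, the CLI, until the cached token expires.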