actions / runner-images

GitHub Actions runner images

[BUG] For the past two days our pipeline cannot get secrets from the Key Vault because the configured TenantId is not used

MarcoK80 opened this issue · comments

Description

For the past two days our pipeline has been unable to get secrets from the Key Vault because a tenant other than the configured TenantId is used. Before this, it worked for a year without problems.

The Azure DevOps pipeline runs with the correct service principal to connect, but the following error occurred:

`Azure.RequestFailedException: Service request failed.
Status: 401 (Unauthorized)

Content:
{"error":{"code":"Unauthorized","message":"AKV10032: Invalid issuer. Expected one of https://sts.windows.net/cfd26b50-fb8f-44cf-87b2-d5df3d15d884/, https://sts.windows.net/f8cdef31-a31e-4b4a-93e4-5f571e91255a/, https://sts.windows.net/e2d54eb5-3869-4f70-8578-dee5fc7331f4/, found https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/."}}`

The service principal used is in the right tenant:

##[debug]b270df00-2259-4d37-8fd1-45b09abb963d auth param tenantid = cfd26b50-fb8f-44cf-87b2-d5df3d15d884

It works locally with my own Azure account.

Could the hosted agent be the source of the issue? I found some updates from two days ago:

https://github.com/actions/virtual-environments/blob/main/images/linux/Ubuntu2004-Readme.md

eb3d502

Platforms affected

  • Azure DevOps
  • GitHub Actions

Virtual environments affected

  • Ubuntu 18.04
  • Ubuntu 20.04
  • Ubuntu 22.04
  • macOS 10.15
  • macOS 11
  • macOS 12
  • Windows Server 2019
  • Windows Server 2022

Image version and build link

Environment
ubuntu-latest

Is it regression?

no

Expected behavior

The configured DefaultTenantId should be used.

Actual behavior

The tenant https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/ is used in the Azure pipeline instead.

Repro steps

Azure.Extensions.AspNetCore.Configuration.Secrets 1.2.2

`internal static class RealTestConfiguration
{
    private static string DefaultTenantId => "cfd26b50-fb8f-44cf-87b2-d5df3d15d884";

    public static IConfiguration GetDefaultConfiguration()
    {
        var azureCredentialOptions = new DefaultAzureCredentialOptions
        {
            SharedTokenCacheTenantId = DefaultTenantId,
            VisualStudioTenantId = DefaultTenantId,
            VisualStudioCodeTenantId = DefaultTenantId,
            InteractiveBrowserTenantId = DefaultTenantId
        };

        return new ConfigurationBuilder()
            .AddInMemoryCollection(TestConfiguration.DefaultOptions)
            .AddAzureKeyVault(new Uri("https://xyz.vault.azure.net/"), new DefaultAzureCredential(azureCredentialOptions))
            .Build();
    }
}`

Run in pipeline

  - task: AzureCLI@1
    displayName: iTest
    inputs:
      failOnStandardError: true
      azureSubscription: add subscription here
      scriptLocation: inlineScript
      inlineScript: |
        dotnet test ./test/iTests/iTests.csproj --configuration $(BuildConfiguration)

@MarcoK80 the issue may be related to the new Azure CLI version. Could you try downgrading it at runtime with the following snippet and see if it helps?

sudo apt-get remove azure-cli
curl -sL https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/microsoft.gpg > /dev/null
echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/azure-cli.list
sudo apt-get update
sudo apt-get install azure-cli=2.37.0-1~$(lsb_release -cs)

If the issue is resolved by downgrading, please file a bug in the https://github.com/Azure/azure-cli/issues repository.

@miketimofeev This was not successful. I still have the issue:

(screenshots of the failing runs attached)

@MarcoK80, could you, for testing purposes, replace the inline script with the following?

inlineScript: |
    az account show

@al-cheb Hello this is the output of the command:

Setting active cloud to: AzureCloud
/usr/bin/az cloud set -n AzureCloud
/usr/bin/az login --service-principal -u *** --password=*** --tenant cfd26b50-fb8f-44cf-87b2-d5df3d15d884 --allow-no-subscriptions
[
  {
    "cloudName": "AzureCloud",
    "homeTenantId": "cfd26b50-fb8f-44cf-87b2-d5df3d15d884",
    "id": "581f94ee-8dff-4292-b6b9-55936b77a189",
    "isDefault": true,
    "managedByTenants": [],
    "name": "Connectivity Dev",
    "state": "Enabled",
    "tenantId": "cfd26b50-fb8f-44cf-87b2-d5df3d15d884",
    "user": {
      "name": "",
      "type": "servicePrincipal"
    }
  }
]
/usr/bin/az account set --subscription 581f94ee-8dff-4292-b6b9-55936b77a189
/usr/bin/pwsh -NoLogo -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command . '/home/vsts/work/_temp/azureclitaskscript1657867562794.ps1'
{
  "environmentName": "AzureCloud",
  "homeTenantId": "cfd26b50-fb8f-44cf-87b2-d5df3d15d884",
  "id": "581f94ee-8dff-4292-b6b9-55936b77a189",
  "isDefault": true,
  "managedByTenants": [],
  "name": "Connectivity Dev",
  "state": "Enabled",
  "tenantId": "cfd26b50-fb8f-44cf-87b2-d5df3d15d884",
  "user": {
    "name": "",
    "type": "servicePrincipal"
  }
}

@MarcoK80, thank you. One more test, please: `az keyvault secret list --vault-name xyz`

I have found an open bug report - Azure/azure-cli#11871 - and one of the recommendations is to set the VisualStudioTenantId value or to configure the AZURE_TENANT_ID env var - Azure/azure-cli#11871 (comment):

  env:
    AZURE_TENANT_ID: cfd26b50-fb8f-44cf-87b2-d5df3d15d884

az keyvault secret list --vault-name xyz

Yes, this lists the Key Vault secret without setting any environment variable:

/usr/bin/az account set --subscription 581f94ee-8dff-4292-b6b9-55936b77a189
/usr/bin/pwsh -NoLogo -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command . '/home/vsts/work/_temp/azureclitaskscript1657871418719.ps1'
[
  {
    "attributes": {
      "created": "2021-10-18T06:53:42+00:00",
      "enabled": true,
      "expires": null,
      "notBefore": null,
      "recoveryLevel": "Recoverable+Purgeable",
      "updated": "2021-10-18T06:53:42+00:00"
    },
    "contentType": null,
    "id": "https://conctmsgweuitest1.vault.azure.net/secrets/RedisConnectionString",
    "managed": null,
    "name": "RedisConnectionString",
    "tags": {}
  }
]
/usr/bin/az account clear
Finishing: iTest

I believe we are facing the same issue, but with connections to Azure SQL DB and Cosmos DB from our Azure Pipelines. As all of our pipelines are currently completely blocked, a lot of effort is going into investigating this issue.
Here is one of our recent findings: when we dump the token used to authenticate against those resources, this is what we see in the claims:

{
  "aud": "https://database.windows.net/",
  "iss": "https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/",
  "iat": 1657869274,
  "nbf": 1657869274,
  "exp": 1657955974,
  "aio": "...",
  "appid": "...",
  "appidacr": "2",
  "idp": "https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/",
  "idtyp": "app",
  "oid": "...",
  "rh": "...",
  "sub": "...",
  "tid": "72f988bf-86f1-41af-91ab-2d7cd011db47",
  "uti": "...",
  "ver": "1.0"
}

We think tid is the claim of most interest here. Our guess is that some underlying library uses the wrong tenant (not ours, but some other one). We googled the tenant ID, and it appears to be Microsoft's own tenant.

Will try to provide our real tenantId as env variable too (as it was proposed a couple of messages above).
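For anyone who wants to inspect the `tid` claim the same way, the payload of a JWT is just base64url-encoded JSON and can be decoded with standard shell tools. A minimal sketch; the token below is a dummy three-part value built only to illustrate the decoding (in a pipeline you would capture the real token from your app's logs or from `az account get-access-token`):

```shell
# Dummy payload and token for illustration; substitute a real access token.
PAYLOAD='{"tid":"72f988bf-86f1-41af-91ab-2d7cd011db47","ver":"1.0"}'
TOKEN="header.$(printf '%s' "$PAYLOAD" | base64 | tr -d '=\n' | tr '/+' '_-').signature"

# JWT segments are base64url-encoded without padding: take the middle
# segment, map the URL-safe alphabet back, restore the '=' padding, decode.
SEG=$(printf '%s' "$TOKEN" | cut -d. -f2 | tr '_-' '/+')
while [ $(( ${#SEG} % 4 )) -ne 0 ]; do SEG="$SEG="; done
printf '%s' "$SEG" | base64 -d
```

The decoded JSON contains the `tid` field, so you can immediately see which tenant issued the token.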

@MarcoK80, Could we check with AZURE_TENANT_ID env var?

 env:
    AZURE_TENANT_ID: cfd26b50-fb8f-44cf-87b2-d5df3d15d884

@al-cheb, we have the same problem, but we are running inside an Azure PowerShell task. Setting the following in the first line of the script helps to prevent the problem:

$env:AZURE_TENANT_ID = "<our tenant id>";

Disabling the managed identity credential also allows the pipeline to authenticate successfully again:

new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
    ExcludeManagedIdentityCredential = true
});

Could it be that a managed identity is now somehow available in the pipeline?

@MarcoK80, Could we check with AZURE_TENANT_ID env var?

 env:
    AZURE_TENANT_ID: cfd26b50-fb8f-44cf-87b2-d5df3d15d884

Hi, `env` cannot be used with AzureCLI tasks, but setting the environment variable solved the issue:

  - task: AzureCLI@2
    displayName: iTest
    inputs:
      failOnStandardError: true
      azureSubscription: 'TeamPrivate-ConnectivityDev'
      scriptType: 'pscore'
      scriptLocation: inlineScript
      inlineScript: |
        Write-Host "##vso[task.setvariable variable=AZURE_TENANT_ID]cfd26b50-fb8f-44cf-87b2-d5df3d15d884"
        dotnet test

So everything you set in code is ignored. This bug should be fixed.

I tried to set the env variable in PowerShell like this, and then I request a new token and inspect it to check the issuer. The env variable is not honored; I can still see that the issuer is [72f988bf-86f1-41af-91ab-2d7cd011db47](https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/):

$env:AZURE_TENANT_ID = "582259a1-dcaa-4cca-b1cf-e60d3f045ecd"
# then we run our dotnet app ...

var credential = new DefaultAzureCredential(new DefaultAzureCredentialOptions());
var token = await credential.GetTokenAsync(new Azure.Core.TokenRequestContext(
        new[] { "https://database.windows.net/.default" }), stoppingToken);
_logger.LogInformation("Access Token: {Token}", token.Token);
var jwtToken = new JwtSecurityToken(token.Token);
_logger.LogInformation("Issuer      : {Iss}", jwtToken.Issuer);

// logged output:
Issuer      : https://sts.windows.net/72f988bf-86f1-41af-91ab-2d7cd011db47/

But we also spotted that ~30% of pipelines started to be green again. Is Microsoft reverting something on their end?

I've been looking into this issue as we were suffering from it too. Interestingly, we have both Microsoft-hosted agents and self-hosted pools, and we can pivot builds from one to the other. All builds running on our own self-hosted pools had no issue authenticating when calling Azure services (in our case Key Vault/App Config), but the MS-hosted ones failed even when using the same service connection. Both were running the same agent version (2.204.0), so we concluded it must be something with the underlying VM or the pool.

We tried multiple things, but the only one that worked for us was similar to @dldldlepl's: explicitly exclude the managed identity credential, along with `az account set -s ...`, which we always did:

new DefaultAzureCredential(new DefaultAzureCredentialOptions
{
    ExcludeManagedIdentityCredential = true
});

Could it be that a managed identity is now available on those VMs? We did some verbose logging of the outgoing requests and saw a call to a metadata endpoint. Maybe this is new (we don't have verbose logs from before this started breaking, so we're not sure whether this is intended or unrelated):

....
azure:core-rest-pipeline retryPolicy:info Retry 0: Received a response from request bc1d8ff8-b70c-4ad5-82ef-38ba2260cdb9
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing 2 retry strategies.
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing retry strategy throttlingRetryStrategy.
azure:core-rest-pipeline retryPolicy:info Retry 0: Skipped.
azure:core-rest-pipeline retryPolicy:info Retry 0: Processing retry strategy exponentialRetryStrategy.
azure:core-rest-pipeline retryPolicy:info Retry 0: Skipped.
azure:core-rest-pipeline retryPolicy:info None of the retry strategies could work with the received response. Returning it.
azure:identity:info ManagedIdentityCredential - IMDS => ManagedIdentityCredential - IMDS: Using expires_on: 1657871766000 (original value: 1657871766)
azure:identity:info IdentityClient: [http://169.254.169.254/metadata/identity/oauth2/token?resource=https%3A%2F%2Fxyz.azconfig.io&api-version=2018-02-01] token acquired, expires on 1657871766000
azure:identity:info ManagedIdentityCredential => getToken() => SUCCESS. Scopes: https://xyz.azconfig.io/.default.
azure:identity:info ChainedTokenCredential => getToken() => Result for DefaultManagedIdentityCredential: SUCCESS. Scopes: https://xyz.azconfig.io/.default.
azure:core-http:info Request: {
  "streamResponseStatusCodes": {},
  "url": "REDACTED",
  "method": "GET",
  "headers": {
    "_headersMap": {....}
  },
  "withCredentials": false,
  "timeout": 0,
  "requestId": "adc3a1c0-56c3-4373-b9ca-bacceb19a807"
}
azure:core-http:info Request: {
  "streamResponseStatusCodes": {},
  "url": "REDACTED",
  "method": "GET",
  "headers": {
    "_headersMap": {
      "accept": "application/vnd.microsoft.appconfig.kv+json, application/json, application/problem+json",
      "user-agent": "azsdk-js-app-configuration/1.3.1 core-http/2.2.5 Node/v16.15.1 OS/(x64-Linux-5.13.0-1031-azure)",
      "x-ms-client-request-id": "561256fc-1b12-4be5-beae-dc10bc40fe84",
      "authorization": "REDACTED"
    }
  },
  "withCredentials": false,
  "timeout": 0,
  "requestId": "561256fc-1b12-4be5-beae-dc10bc40fe84"
}
azure:core-http:info Response status code: 401
azure:core-http:info Headers: {
  "_headersMap": {
    "access-control-allow-credentials": "true",
    "access-control-allow-origin": "*",
    "access-control-expose-headers": "DNT, X-CustomHeader, Keep-Alive, User-Agent, X-Requested-With, If-Modified-Since, Cache-Control, Content-Type, Authorization, x-ms-client-request-id, x-ms-useragent, x-ms-content-sha256, x-ms-date, host, Accept, Accept-Datetime, Date, If-Match, If-None-Match, Sync-Token, x-ms-return-client-request-id, ETag, Last-Modified, Link, Memento-Datetime, retry-after-ms, x-ms-request-id, x-ms-client-session-id, x-ms-effective-locale, WWW-Authenticate, traceparent, tracestate",
    "connection": "close",
    "content-length": "0",
    "date": "Thu, 14 Jul 2022 07:56:06 GMT",
    "server": "openresty/1.17.8.2",
    "strict-transport-security": "REDACTED",
    "www-authenticate": "HMAC-SHA256, Bearer error=\"invalid_token\", error_description=\"The access token is from the wrong issuer. It must match the AD tenant associated with the subscription, to which the configuration store belongs. If you just transferred your subscription and see this error message, please try back later.\"",
    "x-ms-correlation-request-id": "c8b9d568-01bc-4fee-ad95-0e64b3d68fc0",
    "x-ms-request-id": "c8b9d568-01bc-4fee-ad95-0e64b3d68fc0"
  }
}
...

this bit being the key log line

azure:identity:info ManagedIdentityCredential - IMDS => ManagedIdentityCredential - IMDS: Using expires_on: 1657871766000 (original value: 1657871766)
azure:identity:info IdentityClient: [http://169.254.169.254/metadata/identity/oauth2/token?resource=https%3A%2F%2Fxyz.azconfig.io&api-version=2018-02-01] token acquired, expires on 1657871766000

@egorshulga could you provide links to the successful and unsuccessful builds? We will try to compare the difference in environments that run those builds.

> But we also spotted that ~30% of pipelines started to be green again. Is Microsoft reverting something on their end?

Yes, I agree the environment variable may not be the solution. We now have pipelines with it, and it still fails for some runs.

The PR run was green, then the main build went red, then the next main build was green again.

(screenshot of the build results attached)

> @egorshulga could you provide links to the successful and unsuccessful builds? We will try to compare the difference in environments that run those builds.

I can:

https://dev.azure.com/shcgravity/Receiver/_build/results?buildId=659818&view=results
https://dev.azure.com/shcgravity/Receiver/_build/results?buildId=659974&view=results

So, I believe we managed to find a workaround: providing not only the tenantId but also the clientId and clientSecret (which come from our service connection) as environment variables explicitly, which means that we are basically feeding the first credential provider in the chain.

> So, I believe we managed to find a workaround: providing not only the tenantId but also the clientId and clientSecret (which come from our service connection) as environment variables explicitly, which means that we are basically feeding the first credential provider in the chain.

This is something we want to avoid, because we already have the registered service principal managed and available.
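One way to supply those three variables without hard-coding the secret is the AzureCLI task's `addSpnToEnvironment` input, which exposes the service connection's credentials to the script. A sketch, assuming a bash inline script and assuming the task exposes the variables under the names shown (check the task documentation for your version):

```shell
# Assumption: inside an AzureCLI task with `addSpnToEnvironment: true`, the
# service connection's credentials are exposed as $servicePrincipalId,
# $servicePrincipalKey and $tenantId. Re-exporting them under the names that
# EnvironmentCredential reads makes DefaultAzureCredential pick the service
# principal before ManagedIdentityCredential is ever tried.
export AZURE_CLIENT_ID="$servicePrincipalId"
export AZURE_CLIENT_SECRET="$servicePrincipalKey"
export AZURE_TENANT_ID="$tenantId"
# dotnet test ./test/iTests/iTests.csproj   # then run the tests as before
```

Because EnvironmentCredential is first in the default chain, this sidesteps the managed identity entirely without duplicating the secret in pipeline variables.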

@jsquire could you please assist? We haven't found anything suspicious in the VM image so far

@schaabs: Would you please be so kind as to offer your thoughts?

I obtained some logs from Azure-Identity library:

[Informational] Azure-Identity: DefaultAzureCredential.GetToken invoked. Scopes: [ https://database.windows.net//.default ] ParentRequestId: 
[Informational] Azure-Identity: EnvironmentCredential.GetToken invoked. Scopes: [ https://database.windows.net//.default ] ParentRequestId: 
[Informational] Azure-Identity: EnvironmentCredential.GetToken was unable to retrieve an access token. Scopes: [ https://database.windows.net//.default ] ParentRequestId:  Exception: Azure.Identity.CredentialUnavailableException (0x80131500): EnvironmentCredential authentication unavailable. Environment variables are not fully configured. See the troubleshooting guide for more information. https://aka.ms/azsdk/net/identity/environmentcredential/troubleshoot
[Informational] Azure-Identity: ManagedIdentityCredential.GetToken invoked. Scopes: [ https://database.windows.net//.default ] ParentRequestId: 
[Informational] Azure-Core: Request [fc1d566e-e5c7-493a-9089-db768081bdfe] GET http://169.254.169.254/metadata/identity/oauth2/token?api-version=REDACTED&resource=REDACTED
Metadata:REDACTED
x-ms-client-request-id:fc1d566e-e5c7-493a-9089-db768081bdfe
x-ms-return-client-request-id:true
User-Agent:azsdk-net-Identity/1.5.0,(.NET 6.0.7; Linux 5.13.0-1031-azure #37~20.04.1-Ubuntu SMP Mon Jun 13 22:51:01 UTC 2022)
client assembly: Azure.Identity
[Informational] Azure-Core: Response [fc1d566e-e5c7-493a-9089-db768081bdfe] 200 OK (00.4s)
Server:IMDS/150.870.65.684
Date:Fri, 15 Jul 2022 14:52:03 GMT
Content-Type:application/json; charset=utf-8
Content-Length:1492

[Informational] Azure-Identity: ManagedIdentityCredential.GetToken succeeded. Scopes: [ https://database.windows.net//.default ] ParentRequestId:  ExpiresOn: 2022-07-16T14:52:04.0000000+00:00
[Informational] Azure-Identity: DefaultAzureCredential credential selected: Azure.Identity.ManagedIdentityCredential
[Informational] Azure-Identity: DefaultAzureCredential.GetToken succeeded. Scopes: [ https://database.windows.net//.default ] ParentRequestId:  ExpiresOn: 2022-07-16T14:52:04.0000000+00:00
fail: .......SqlSeeder[0]
      Microsoft.Data.SqlClient.SqlException (0x80131904): Login failed for user '<token-identified principal>'. The server is not currently configured to accept this token.
         at Microsoft.Data.ProviderBase.DbConnectionPool.CheckPoolBlockingPeriod(Exception e)
         at Microsoft.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
         at Microsoft.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject, DbConnectionOptions userOptions, DbConnectionInternal oldConnection)
         at Microsoft.Data.ProviderBase.DbConnectionPool.TryGetConnection(DbConnection owningObject, UInt32 waitForMultipleObjectsTimeout, Boolean allowCreate, Boolean onlyOneCheckConnection, DbConnectionOptions userOptions, DbConnectionInternal& connection)
         at Microsoft.Data.ProviderBase.DbConnectionPool.WaitForPendingOpen()
      --- End of stack trace from previous location ---
......

Would anyone know how to further debug the API endpoint that returns the token (http://169.254.169.254/metadata/identity/oauth2/token)?

Update: this trace doesn't show it, but my previous comment does - the token is still issued by tenant 72f988bf-86f1-41af-91ab-2d7cd011db47.
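That metadata endpoint can be probed directly from the agent VM with curl; this is the documented Azure IMDS managed identity endpoint that appears in the traces above. A sketch (the `resource` value is an example, not from the original logs):

```shell
# Probe the instance metadata service that ManagedIdentityCredential calls.
# On a VM with a managed identity this returns JSON containing an
# access_token; a timeout or connection error suggests no IMDS is reachable,
# i.e. no managed identity is available on this machine.
IMDS='http://169.254.169.254/metadata/identity/oauth2/token'
QS='api-version=2018-02-01&resource=https%3A%2F%2Fvault.azure.net'
curl -s --max-time 5 -H 'Metadata: true' "${IMDS}?${QS}" \
  || echo 'IMDS not reachable (no managed identity on this machine)'
```

Decoding the `access_token` field of that response (as in the JWT-decoding snippet earlier in the thread) shows which tenant issued it.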

We experience the exact same behavior using the following configuration:

Platforms affected

  • Azure DevOps
  • GitHub Actions

Virtual environments affected

  • Ubuntu 18.04
  • Ubuntu 20.04
  • Ubuntu 22.04
  • macOS 10.15
  • macOS 11
  • macOS 12
  • Windows Server 2019
  • Windows Server 2022

Image version

windows-2022
https://github.com/Energinet-DataHub/.github/blob/a3126700b6c73ab4b8b361c7a717271d5aeaa01f/.github/workflows/dotnet-solution-ci.yml#L28

Example of the failure we experience:
https://github.com/Energinet-DataHub/geh-charges/runs/7316637361?check_suite_focus=true

The bug also exists in the JavaScript SDK when using DefaultAzureCredential. I replaced it with ClientSecretCredential as a workaround.

Let me cautiously say that the issue probably healed itself over the weekend. We see no more issues like this in our pipelines. We will continue to monitor it, though.

@MarcoK80, Does the issue still persist?

> @MarcoK80, Does the issue still persist?

Hi, I haven't seen any issues in the last few days, but the temporary workaround is still in place. Should I remove it and check again?

> @MarcoK80, Does the issue still persist?
>
> Hi, I havent seen any issue in the last days. But the temporary workaround is still implemented. Should I remove and check again?

Let's check.

> @MarcoK80, Does the issue still persist?
>
> Hi, I havent seen any issue in the last days. But the temporary workaround is still implemented. Should I remove and check again?
>
> Let's check.

Looks good without workaround

Thank you. I am planning to close the thread. We will reopen it if the issue returns.

> Thank you. I am planning to close the thread. We will reopen it if the issue returns.

Please go ahead

> I've been looking into this issue as we were suffering with it too. [...]
>
> Must be that managed identity management is available on those VM's now? [...]

Your answer solved a lot of our problems; it was indeed a problem with the underlying VM. We develop in VS 2022 on an Azure-hosted VM.
With `ExcludeManagedIdentityCredential = false`, the code uses the managed identity from the VM, and the vault service rejects the request.