
Terraform Databricks cannot configure default credentials

We are running Terraform through an Azure pipeline to create a Databricks workspace and related resources. However, when the Terraform apply stage reaches the point where it looks up the latest Spark version, the process throws an error.

The error is:

│ Error: default auth: cannot configure default credentials. Config: profile=DEFAULT, azure_client_secret=***, azure_client_id=***, azure_tenant_id=*****-*****. Env: ARM_CLIENT_SECRET, ARM_CLIENT_ID, ARM_TENANT_ID
│ 
│   with data.databricks_spark_version.latest_lts,
│   on databricks.tf line 33, in data "databricks_spark_version" "latest_lts":
│   33: data "databricks_spark_version" "latest_lts" {
│ 

We are using a service principal that was created in Azure AD and given the account admin role in our Databricks account.

We've declared the databricks_connection_profile in a variables file:

databricks_connection_profile = "DEFAULT"
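
(For context, profile = "DEFAULT" tells the Databricks provider to look up a [DEFAULT] section in a local ~/.databrickscfg file, along the lines of the hypothetical sketch below; such a file usually doesn't exist on a pipeline agent.)

# ~/.databrickscfg -- hypothetical illustration only; values are placeholders
[DEFAULT]
host  = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi0000000000000000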

The part that appears to be at fault is the databricks_spark_version data source towards the bottom of this:

resource "azurerm_databricks_workspace" "dbw-uks" {
  name                = "dbw-uks"
  resource_group_name = azurerm_resource_group.rg-dataanalytics-uks-0002.name
  location            = azurerm_resource_group.rg-dataanalytics-uks-0002.location
  sku                 = "standard"

  depends_on = [
    azuread_service_principal.Databricks
  ]

  tags                = merge(local.common_tags, local.extra_tags)
}


output "databricks_host" {
  value = "https://${azurerm_databricks_workspace.dbw-uks.workspace_url}/"  
}

#--------------- dbr-dataanalytics-uks-0002 Cluster ---------------#

data "databricks_node_type" "smallest" {
  local_disk = true

  depends_on = [
    azurerm_databricks_workspace.dbw-uks
  ]
}

data "databricks_spark_version" "latest_lts" {
   long_term_support = true

   depends_on = [
    azurerm_databricks_workspace.dbw-uks
  ]
}

We've worked through various tutorials from both Microsoft and HashiCorp, but with no positive results so far.

asked Sep 11 '25 by Simon

1 Answer

If you're creating the Databricks workspace using a service principal, you can continue to use it to access and create Databricks resources and data sources. You don't need to specify databricks_connection_profile; you just need to configure provider authentication correctly by providing the host and the other attributes necessary for service principal authentication. The Databricks Terraform provider uses the same environment variables (ARM_CLIENT_ID, ARM_CLIENT_SECRET, ARM_TENANT_ID) as the azurerm provider.
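
A minimal sketch of what that provider configuration could look like, assuming the resource names from the question and that the pipeline already exports the ARM_* variables:

provider "databricks" {
  # Point the provider at the workspace created above. The service principal
  # credentials are picked up automatically from ARM_CLIENT_ID,
  # ARM_CLIENT_SECRET and ARM_TENANT_ID -- the same environment variables
  # the azurerm provider uses.
  host = azurerm_databricks_workspace.dbw-uks.workspace_url
}

With this in place, the databricks_node_type and databricks_spark_version data sources authenticate against the newly created workspace instead of falling back to a local DEFAULT profile.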

answered Sep 13 '25 by Alex Ott