Manage Secret Scopes in Databricks using GUI

Databricks is a unified big data processing and analytics cloud platform used to transform and process huge volumes of data. At its core is Apache Spark, an in-memory analytics engine for big data and machine learning. Databricks can connect to various sources for data ingestion. This article describes how to manage secret scopes in Databricks using the GUI.

Prerequisites:
To follow along, you will need:
1. A Databricks service in Azure, GCP, or AWS.
2. A Databricks cluster.
3. An Azure subscription with an Azure Key Vault created.

What are secret scopes in Databricks?

Databricks connects to multiple applications, and it needs credentials, or secrets, to do so. These secrets can be stored securely either in Databricks itself or in Azure Key Vault. Secret scopes are used to manage the secrets stored in either location.

Databricks supports two types of secret scopes:
1. Azure Key Vault-backed scopes: manage secrets stored in Azure Key Vault.
2. Databricks-backed scopes: manage secrets stored in Databricks.
In this article, we will focus on managing Azure Key Vault-backed secret scopes.

Create an Azure Key Vault-backed secret scope

Follow the below steps to create an Azure Key Vault-backed secret scope.

1. Open the URL https://<databricks-instance>#secrets/createScope in your browser.

2. Provide the following details:

Scope Name: <Name of the scope>

Manage Principal: This option specifies which users can manage the secret scope. You can select either "Creator" or "All Users".

DNS Name and Resource ID: Both values can be found on the Properties page of the Azure Key Vault service (the DNS Name corresponds to the Vault URI, e.g. https://<vault-name>.vault.azure.net/).

3. Click Create. This creates the secret scope.

Access a secret from the Azure Key Vault

We can access a secret in a Databricks notebook using the command below.

password = dbutils.secrets.get(scope="<name_of_scope>", key="<name_of_secret>")
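To illustrate the call pattern, here is a minimal sketch. Note that `dbutils` is only available inside a Databricks notebook, so the snippet below uses a small stand-in stub so it can run anywhere; the scope name, key name, and connection details are all hypothetical.

```python
# `dbutils` exists only inside a Databricks notebook. This stub mimics its
# call shape for illustration; the scope/key names are hypothetical.
class _SecretsStub:
    def get(self, scope, key):
        # In a real notebook, this returns the secret value from the
        # scope's backing store (here, Azure Key Vault).
        return {("my-kv-scope", "sql-password"): "s3cr3t!"}.get((scope, key), "")

class _DbutilsStub:
    secrets = _SecretsStub()

dbutils = _DbutilsStub()

# Retrieve the secret and use it, e.g. in a JDBC connection string.
password = dbutils.secrets.get(scope="my-kv-scope", key="sql-password")
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    f"database=mydb;user=admin;password={password}"
)
```

As a side note, Databricks redacts secret values fetched this way if you try to print them in notebook output.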

Delete a secret scope

Unfortunately, it is not possible to delete a secret scope using the GUI. Instead, you can use either the Databricks CLI or the Databricks REST API. You will see more details about this in my next blog.
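As a preview of the REST option, the Secrets API exposes a POST /api/2.0/secrets/scopes/delete endpoint. The sketch below builds (but does not send) such a request using only the Python standard library; the workspace URL, token, and scope name are placeholders.

```python
import json
import urllib.request


def build_delete_scope_request(host: str, token: str, scope_name: str) -> urllib.request.Request:
    """Build (but do not send) the REST call that deletes a secret scope
    via the Databricks Secrets API (POST /api/2.0/secrets/scopes/delete)."""
    payload = json.dumps({"scope": scope_name}).encode("utf-8")
    return urllib.request.Request(
        f"{host}/api/2.0/secrets/scopes/delete",
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical workspace URL and token; to actually send the request you
# would pass `req` to urllib.request.urlopen.
req = build_delete_scope_request(
    "https://adb-1234567890123456.7.azuredatabricks.net",
    "<personal-access-token>",
    "my-kv-scope",
)
```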

Pro tips:
1. Databricks provides a free Community Edition where you can learn and explore Databricks. You can sign up here.

Pavan Bangad

9+ years of experience building data warehouse and big data applications.
Helping customers with their digital transformation journey in the cloud.
Passionate about data engineering.