I am trying to create an Azure SQL Database dataset using Terraform for my Azure Data Factory. The code below works fine to define a linked service:

resource "azurerm_data_factory_linked_service_azure_sql_database" "example" {
  name                = "example"
  resource_group_name = azurerm_resource_group.example.name
  data_factory_name   = azurerm_data_factory.example.name
  connection_string   = "data source=serverhostname;initial catalog=master;user id=testUser;Password=test;integrated security=False;encrypt=True;connection timeout=30"
}

But I can't find a way to create the dataset: the only resource for SQL datasets is azurerm_data_factory_dataset_sql_server, which does not work with azurerm_data_factory_linked_service_azure_sql_database because it is meant to be used with azurerm_data_factory_linked_service_sql_server.

2 Answers

It's been a while, but maybe someone can still use this solution. You can use azurerm_data_factory_custom_dataset:

resource "azurerm_data_factory_custom_dataset" "DatasetSource" {
  name            = "YourDatasetName"               # the dataset name, not the factory name
  data_factory_id = azurerm_data_factory.example.id # your factory's id
  type            = "AzureSqlTable"

  linked_service {
    name = azurerm_data_factory_linked_service_azure_sql_database.LinkedServicesDBSource.name
  }

  type_properties_json = <<JSON
{
}
JSON

}
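The empty type_properties_json above is accepted, but in practice the target table is usually specified there. A minimal sketch, assuming the linked service from the question and a hypothetical table dbo.MyTable (older factories may use the legacy "tableName" property instead of "schema"/"table"):

```hcl
# Sketch only: "ExampleSqlTable" and dbo.MyTable are assumed names.
resource "azurerm_data_factory_custom_dataset" "example_table" {
  name            = "ExampleSqlTable"
  data_factory_id = azurerm_data_factory.example.id
  type            = "AzureSqlTable"

  linked_service {
    name = azurerm_data_factory_linked_service_azure_sql_database.example.name
  }

  # typeProperties for an AzureSqlTable dataset: schema and table
  type_properties_json = <<JSON
{
  "schema": "dbo",
  "table": "MyTable"
}
JSON
}
```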
It looks like this isn't supported in Terraform yet (as I'm discovering a lot of things aren't).

You could raise it here https://github.com/terraform-providers/terraform-provider-azurerm as an enhancement - there are a couple of kind folk actively adding tf resources in this area.

We made a decision to implement our linked services in tf (because they contain/use secrets we can inject in the pipeline), but are deploying datasets as JSON from the repo. Any reason why you want to deploy them with tf specifically? Did we miss something?

4 Comments

I think it's a lot easier to manage everything in Terraform than changing JSON files. If you want to deploy the same data factory to other environments, such as DEV and QA, it's just a matter of changing some variables. With JSON you would have to parse the files to update them with the current environment's configuration.
How do you deploy only the datasets? What do you do with the pipeline itself? Maybe if there is a way to deploy only the pipeline and the datasets, leaving just the linked services to be created by Terraform, it could work. But in that case you cannot change the linked service names, or you would have to parse the JSON to point the datasets at the right linked services.
We deploy linked services in Terraform using identical names in all environments, so the datasets and pipelines don't need to change: they just deploy as JSON from the repo. The linked service targets are parameterised in the Terraform, and we try to keep them as generic/parameterised as possible, so we rarely have to deploy new ones across environments.
It seems like a good solution, I will do the same. Thank you.
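The pattern described in the comments above can be sketched roughly as follows. All names and variables here are assumptions for illustration: the linked service keeps an identical name in every environment, while the target server and credentials come from per-environment variables, so the JSON datasets and pipelines in the repo never change.

```hcl
# Per-environment inputs (e.g. supplied via a DEV or QA .tfvars file).
variable "sql_server_fqdn" {
  type = string
}

variable "sql_admin_password" {
  type      = string
  sensitive = true
}

# The name "SourceSqlDatabase" is kept identical in every environment,
# so datasets/pipelines deployed as JSON can reference it unchanged.
resource "azurerm_data_factory_linked_service_azure_sql_database" "source_db" {
  name                = "SourceSqlDatabase"
  resource_group_name = azurerm_resource_group.example.name
  data_factory_name   = azurerm_data_factory.example.name
  connection_string   = "data source=${var.sql_server_fqdn};initial catalog=master;user id=testUser;Password=${var.sql_admin_password};encrypt=True;connection timeout=30"
}
```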
