
I'm trying to migrate data from SQL DB to Cosmos DB using the Cosmos DB Data Migration Tool. The migration succeeds, but every value in the resulting documents is a string.

Is there a way to convert those JSON strings to objects during the migration process?

Here's my sample query:

select 
       json_value(Data, '$.timestamp') as timestamp,
       json_query(Data, '$.Product.detail') as [Product.detail],
       json_value(Data, '$.Product.price') as [Product.price]

from myTable

Nesting separator: .
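To make the symptom concrete, here is a small Python sketch (hypothetical sample values) of what the migrated documents contain versus what is wanted: `json_value`/`json_query` return nvarchar, so the migration tool writes every nested value as a string.

```python
import json

# Hypothetical sample: what the migration tool currently writes.
migrated = {
    "timestamp": "1600329425",
    "Product": {
        "detail": '["eee","fff"]',  # array serialized as one string
        "price": "300.56",          # number serialized as a string
    },
}

# What the documents should contain: real JSON types.
desired = {
    "timestamp": int(migrated["timestamp"]),
    "Product": {
        "detail": json.loads(migrated["Product"]["detail"]),
        "price": float(migrated["Product"]["price"]),
    },
}
print(desired["Product"]["detail"])  # ['eee', 'fff']
```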

  • Can you show us your sample data? And what do you expect? Commented Sep 17, 2020 at 9:00
  • @SteveZhao Sorry for the late reply. The output is a JSON document in a Cosmos DB container; arrays and objects come through as double-quoted strings. Commented Sep 21, 2020 at 0:54
  • Would you like to use Azure Data Factory? That can achieve this without exporting data from Cosmos DB. Commented Sep 21, 2020 at 1:00
  • @SteveZhao I'll try. But my concern is that the data stored in SQL DB is not normalized: it's an nvarchar column that stores all the JSON data. Is migration with Azure Data Factory still possible? Commented Sep 21, 2020 at 1:24
  • I have tried this. The data in my SQL DB looks like this, and I can get this result through Azure Data Factory. I'll post it if you need. Commented Sep 21, 2020 at 1:38

2 Answers


1. Create a data flow and use SQL DB as the source.

2. In the source options, choose Query:

SQL:

select 
       json_value(Data, '$.timestamp') as timestamp,
       json_query(Data, '$.Product.detail') as [Product.detail],
       json_value(Data, '$.Product.price') as [Product.price]

from test3


3. Create a DerivedColumn transformation and change the column types. Expression for Product:

@(detail=split(replace(replace(replace(byName('Product.detail'),'[',''),']',''),'"',''),','),
        price=toDouble(byName('Product.price')))


4. Choose Cosmos DB as the sink and map like this:


5. Create a pipeline, add the data flow you created, then click the Debug button or add a trigger to execute it.

6. Result:

{
     "Product": {
        "price": 300.56,
        "detail": [
            "eee",
            "fff"
        ]
    },
    "id": "d9c66062-63ce-4b64-8bbe-95dcbdcad16d",
    "timestamp": 1600329425
}

Update:

You can enable the Data flow debug button and inspect the result of the expression in Data preview.



6 Comments

Is step 3 something like concat in JS?
Not exactly. Step 3 changes the String into an Array. You can refer to this documentation.
Sorry, but I don't see any new document in Cosmos DB after publishing the data flow without any error. Did I miss something?
Maybe you didn't execute it. I've updated my answer; see step 5.
Thanks, it's working! By the way, is there a playground website or a tool that lets us test the expressions from step 3?

One option is to export your SQL data to a plain CSV file, do any reformatting with your favorite tool, and import the cleaned CSV or JSON file using the Cosmos migration tool.

With PowerShell, for example, the process could be:

  1. Export SQL data to CSV
  2. Use PowerShell Import-CSV to read the data as an array of custom objects
  3. Use PowerShell to modify the custom objects in memory to convert types, reformat, validate, etc.
  4. Export the cleaned data back to CSV or JSON using Export-CSV or ConvertTo-Json
  5. Import the cleaned file using Cosmos Data Migration Tool
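The steps above are written for PowerShell, but the clean-up itself is tool-agnostic. As an illustrative sketch of the same pipeline in Python (hypothetical column names matching the question's query; in practice the CSV would come from the SQL export):

```python
import csv
import io
import json

# Hypothetical CSV exported from SQL with the columns the question's
# query produces (simulated in memory here instead of a file).
exported = io.StringIO(
    'timestamp,Product.detail,Product.price\n'
    '1600329425,"[""eee"",""fff""]",300.56\n'
)

def clean(row):
    # Convert the string values to proper JSON types before import.
    return {
        "timestamp": int(row["timestamp"]),
        "Product": {
            "detail": json.loads(row["Product.detail"]),
            "price": float(row["Product.price"]),
        },
    }

cleaned = [clean(row) for row in csv.DictReader(exported)]
with open("cleaned.json", "w") as f:
    json.dump(cleaned, f, indent=2)  # feed this file to the migration tool
```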

2 Comments

Does the extra export incur fees?
Importing to Cosmos will consume your provisioned throughput. Not sure about the SQL export.
