
I am providing a 5 MB JSON file as input to Azure Stream Analytics and I am getting the error below.

"selected file 'filename' must be between 1 and 2097152 bytes"

The number of streaming units selected is 3. How can I make Stream Analytics read a large JSON file as input?


2 Answers


I think it's a limitation of the ASA job test experience: in the Query Editor you can only upload a JSON file smaller than 2 MB.


This is based on the statements in the official documentation:

For scenarios with large quantities of unstructured data to store in the cloud, Azure Blob storage offers a cost-effective and scalable solution. Data in Blob storage is usually considered data at rest; however, blob data can be processed as a data stream by Stream Analytics.

I suggest storing the large JSON file in Blob storage, where it will be processed as a data stream.
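As a rough sketch of that workaround (not an official ASA workflow): pre-check the file against the 2,097,152-byte limit quoted in the error, and if it's too large, push it to Blob storage with the `azure-storage-blob` Python SDK instead. The connection string, container, and blob names below are placeholders.

```python
ASA_TEST_LIMIT_BYTES = 2_097_152  # 2 MB limit quoted in the error message

def fits_asa_test_limit(size_bytes: int) -> bool:
    """True if a sample file is small enough for the Query Editor test upload."""
    return 1 <= size_bytes <= ASA_TEST_LIMIT_BYTES

def upload_to_blob(conn_str: str, container: str, blob_name: str, path: str) -> None:
    """Upload a local JSON file so ASA can read it as a Blob storage input."""
    from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob
    service = BlobServiceClient.from_connection_string(conn_str)
    blob = service.get_blob_client(container=container, blob=blob_name)
    with open(path, "rb") as f:
        blob.upload_blob(f, overwrite=True)

# A 5 MB file is rejected by the test upload, but works fine as a Blob input:
print(fits_asa_test_limit(5 * 1024 * 1024))  # False
print(fits_asa_test_limit(1_500_000))        # True
```

Once the blob is uploaded, add it as a Blob storage input to the Stream Analytics job in the portal and it will be consumed as a stream rather than a one-off test file.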


3 Comments

Thanks @Jay Gong. I will try adding the JSON file to Blob storage.
@Antony Good, if it works you could mark the answer for others' reference. Thanks.
I confirm that it's a limitation of the testing experience. I'll ask the team to update the error message to make this clearer. Thanks for the feedback.

Storing the large JSON file in Blob storage and feeding it as input to Stream Analytics works. Thanks @Jay Gong

Comments
