
I'm trying to assign values to a dictionary that is contained within a Python list. My logic seems to work fine, except that the first iteration of the loop ends up with 'None' values; the rest of the iterations assign the values as expected.

The goal is to loop through this JSON and create hundreds or thousands of objects for the "trend_values" key. This JSON is assigned to a variable called 'data' using json.load(filename):

{
  "machine_id": null,
  "trend_values": [
    {
      "name": null,
      "tag": null,
      "units": null,
      "value": null
    }
  ],
  "schedule_date": null,
  "message_ids": [
    null
  ]
}
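
For context, the loading step looks roughly like this (the file name template.json is just an illustration):

    import json

    # json.load() expects an open file object rather than a path string
    with open('template.json') as f:
        data = json.load(f)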

I am trying to accomplish this with the code below.

    data['machine_id'] = rand_machine_id
    for item in data:
        rand_tag = get_machine_tag()
        rand_unit = get_machine_unit()
        rand_int = random.randint(1, 1000000)
        data['trend_values'].append({
            'name': device_name,
            'tag': rand_tag,
            'units': rand_unit,
            'value': rand_int})
    data['schedule_date'] = date_time
    data['message_ids'] = message_id

I know the above logic isn't correct, because the first dictionary still has 'None' values, as shown below.

{"machine_id": "qleryoajmm", 
    "trend_values": 
     [
      {"name": null, "tag": null, "units": null, "value": null}, 
      {"name": "compressor", "tag": "hwc", "units": "mrqa", "value": 129859}, 
      {"name": "compressor", "tag": "clb", "units": "fmwn", "value": 725227}, 
      {"name": "compressor", "tag": "rfs", "units": "imjl", "value": 730777}, 
      {"name": "compressor", "tag": "ohy", "units": "tnbo", "value": 642758}
     ], 
    "schedule_date": "2019-09-13 T 10:21:51", 
    "message_ids": "20190913102151.138@391000176"}

I would expect the first dictionary to have values assigned, just like the subsequent dictionaries in the 'trend_values' list.

  • Why is it surprising? You load the empty JSON file, so you end up with empty values. Then you keep appending data to trend_values, so you end up with an empty first element followed by the entries you have appended. Commented Sep 13, 2019 at 15:39
  • So then my question is: how can I avoid having an empty first element? I understand that I load the JSON with null values; is there a way to reassign those values to avoid the 'None' value? Commented Sep 13, 2019 at 15:41

3 Answers


Your logic is correct; the problem is that the list already contains initial placeholder data, so you append the new values after that first dict. You can try this template instead:

{
  "machine_id": null,
  "trend_values": [],
  "schedule_date": null,
  "message_ids": [
    null
  ]
}  

Or change your script to:

data['machine_id'] = rand_machine_id
data['trend_values'] = []
for item in data:
    rand_tag = get_machine_tag()
    rand_unit = get_machine_unit()
    rand_int = random.randint(1, 1000000)
    data['trend_values'].append({
        'name': device_name,
        'tag': rand_tag,
        'units': rand_unit,
        'value': rand_int})
data['schedule_date'] = date_time
data['message_ids'] = message_id
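
Since the stated goal is hundreds or thousands of trend_values entries, a loop over a plain count may also be clearer than iterating over the dict's keys. A rough sketch (num_entries is a made-up name, and device_name, get_machine_tag and get_machine_unit are the question's own helpers):

    import random

    data['trend_values'] = []          # drop the template's placeholder entry
    num_entries = 1000                 # hypothetical count; use however many you need
    for _ in range(num_entries):
        data['trend_values'].append({
            'name': device_name,
            'tag': get_machine_tag(),
            'units': get_machine_unit(),
            'value': random.randint(1, 1000000)})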

1 Comment

Thanks a lot for the help! I understand the issue now. The code is working as expected! Woo hoo!

You have the option of replacing the list that has the single dict with a new empty list first, like this:

data['machine_id'] = rand_machine_id
data['trend_values'] = []
for item in data:
    # code as above
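
An equivalent option (not from the original answer) is to empty the placeholder list in place rather than rebinding it:

    # removes the single null-filled dict but keeps the same list object
    data['trend_values'].clear()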



The first sub-dict with null values in the sub-list of trend_values is loaded from your existing JSON file with json.load(filename). You can modify that file so that the value of trend_values (and presumably message_ids) is simply an empty sub-list:

{
  "machine_id": null,
  "trend_values": [],
  "schedule_date": null,
  "message_ids": []
}
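
With that template the loop appends straight into an empty list. If the populated dict then needs to be written back out as JSON, something like this would do it (the output file name is only an example):

    import json

    # pretty-print the populated dict back to disk
    with open('output.json', 'w') as f:
        json.dump(data, f, indent=2)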

