I am planning to insert a bunch of sObjects into Salesforce from a JSON string (various object types are involved, and I need to query existing information, so Data Loader won't work). How can I use a batch job to split the work into smaller chunks, say 10 records per run?

I have a feeling I should be using Iterable here, but I haven't found any related sample code.

  • What is the data volume and what limit are you seeking to manage? DML rows? CPU time? Heap size? Commented Sep 23, 2019 at 0:58
  • @DavidReed The volume is only about 300 records. I am trying to manage DML rows and CPU time, since there are a lot of triggers running underneath and we don't have the option to turn them off. Commented Sep 23, 2019 at 1:05

1 Answer

There's no rule that says you can't use a built-in List object, if your JSON is formatted that way:

public class BatchInsert implements Database.Batchable<Object> {
  public String jsonString;

  public Object[] start(Database.BatchableContext context) {
    // deserializeUntyped turns a JSON array into a List<Object>;
    // each element is a Map<String, Object> for a JSON object
    return (Object[])JSON.deserializeUntyped(jsonString);
  }
  public void execute(Database.BatchableContext context, Object[] scope) {
    // Do stuff here
  }
  public void finish(Database.BatchableContext context) {
  }
}
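
For instance, a sketch of what execute might look like, assuming the JSON objects carry an Account Name field (both the sObject type and the key name are assumptions, not part of the question):

public void execute(Database.BatchableContext context, Object[] scope) {
  // Sketch only: 'Account' and the 'Name' key are assumed for illustration
  List<Account> accounts = new List<Account>();
  for (Object item : scope) {
    // Each untyped element is a Map of field name to value
    Map<String, Object> fields = (Map<String, Object>)item;
    accounts.add(new Account(Name = (String)fields.get('Name')));
  }
  insert accounts; // DML runs once per chunk, keeping rows per transaction low
}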

Or, if you have a wrapper class somewhere:

public class BatchInsert implements Database.Batchable<Wrapper> {
  public String jsonString;

  public Wrapper[] start(Database.BatchableContext context) {
    return (Wrapper[])JSON.deserialize(jsonString, List<Wrapper>.class);
  }
  public void execute(Database.BatchableContext context, Wrapper[] scope) {
    // Do stuff here
  }
  public void finish(Database.BatchableContext context) {
  }
}
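
Here, Wrapper is whatever class mirrors your JSON shape. A minimal hypothetical sketch (the field names are assumptions):

public class Wrapper {
  // Placeholder fields; match these to the keys in your JSON
  public String name;
  public String externalId;
}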

You could also use a custom iterator. However, this would involve using JSONParser, which isn't particularly friendly even when you know what you're doing. I can't think of any good reason to use JSONParser, except maybe if you happen to have found some existing Java source that would be compatible. JSON.deserializeUntyped is just as efficient and involves a lot less work.
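
As for the chunk-size part of the question: a minimal sketch of launching either batch in chunks of 10 (rawJson is an assumed variable holding your payload):

BatchInsert batch = new BatchInsert();
batch.jsonString = rawJson; // assumed source of the JSON string
// The optional second argument is the scope (chunk) size; it defaults to 200
Database.executeBatch(batch, 10);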

  • Is it okay for me to specify the chunk size in the above example? Commented Sep 23, 2019 at 1:36
  • @LanceShi Yes, when calling Database.executeBatch, you can specify the batch size, just like you would with a query or custom iterator. Commented Sep 23, 2019 at 1:38
