
I have classes ProcessMessage and ProcessMessageDetail with a one-to-many relationship: one ProcessMessage can have many ProcessMessageDetails. I have written the code below using Entity Framework 6, but it runs extremely slowly.

Any tips for optimizing it? As you can see, I call ToList() at step 3; steps 1 and 2 work with an IQueryable.

// 1 - build the base query as an IQueryable
var query =  UnitOfWorkAsync.Repository<ProcessMessage>()
                            .Queryable()
                            .Include(x => x.ProcessMessages)                     
                            .Include(x => x.TestDetail)
                            .AsNoTracking()
                            .AsExpandable()
                            .Where(Query(loggedProcess, status))
                            .Take(1000)
                            .AsQueryable();   

// 2 - apply dynamic string ordering
query = query.OrderBy(sortBy + (reverse ? " descending" : "")).AsQueryable();


// 3 - materialize and page the results
return query
    .ToList()
    .Skip((page - 1) * pageSize)
    .Take(pageSize)
    .ToList();


public class ProcessMessage
{
    public ProcessMessage()
    {
        ProcessMessages = new List<ProcessMessageDetail>();
    }
    public int ProcessMessageId { get; set; }
    public int? LoggedProcessId { get; set; }
    public int? ProcessMessageTypeId { get; set; }  
    public virtual LoggedProcess LoggedProcess { get; set; }     
    public virtual ICollection<ProcessMessageDetail> ProcessMessages { get; set; }
}
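
For reference, the ProcessMessageDetail class is not shown in the question. A minimal sketch of what the detail side of the one-to-many relationship presumably looks like (the property names here are assumptions, not the asker's actual class):

public class ProcessMessageDetail
{
    public int ProcessMessageDetailId { get; set; }

    // Foreign key back to the parent ProcessMessage (assumed naming).
    public int ProcessMessageId { get; set; }
    public virtual ProcessMessage ProcessMessage { get; set; }
}
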
  • The ToList() before Skip() and Take() really jumps out at me, since ToList() will return a list with all the data at that point. The paging will happen in memory rather than in SQL. Commented Jun 16, 2017 at 15:16
  • I'm also not sure you need all those AsQueryable calls, but I don't think they will affect performance. Commented Jun 16, 2017 at 15:19
  • Also, you are doing your OrderBy wrong for descending items. To get a descending sort, use OrderByDescending instead (see the strongly-typed sketch after these comments). Commented Jun 16, 2017 at 15:21
  • ProcessMessageDetail has lots of records in it (10k+). One ProcessMessage can have 10 ProcessMessageDetails. I have added the appropriate index. Commented Jun 16, 2017 at 15:27
  • An index should have already been created for that automatically if the Foreign Key was set up correctly. Commented Jun 16, 2017 at 15:28
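
As the OrderBy comment above notes, if the string-based sort is not backed by a dynamic-LINQ library, a strongly-typed branch keeps the ordering translatable to SQL. A minimal sketch, using ProcessMessageId as a stand-in for the real sortBy key (a full version would map sortBy to a key selector):

// Strongly-typed ordering that EF can translate into an ORDER BY clause.
// Using ProcessMessageId as the key is an assumption for illustration only.
query = reverse
    ? query.OrderByDescending(x => x.ProcessMessageId)
    : query.OrderBy(x => x.ProcessMessageId);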

1 Answer


The most likely culprit is the first ToList in step three.

// 3  
return query
    //.ToList() //This will cause performance issues
    .Skip((page - 1) * pageSize)
    .Take(pageSize)
    .ToList();

Check here for an explanation: Does calling ToList multiple times effect performance?
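
Put together, the idea is to keep everything as an IQueryable until the final ToList(), so the ordering, Skip and Take are all translated into the SQL statement and only one page of rows is materialized. A rough sketch of the whole pipeline under that change, reusing the repository, LinqKit AsExpandable and dynamic string ordering from the question (the Take(1000) cap is dropped here, since paging now happens in the database):

// 1 - build the base query; nothing executes yet
var query = UnitOfWorkAsync.Repository<ProcessMessage>()
                           .Queryable()
                           .Include(x => x.ProcessMessages)
                           .Include(x => x.TestDetail)
                           .AsNoTracking()
                           .AsExpandable()
                           .Where(Query(loggedProcess, status));

// 2 - ordering must come before Skip/Take so EF can translate the paging
query = query.OrderBy(sortBy + (reverse ? " descending" : ""));

// 3 - only the requested page is pulled from the database
return query.Skip((page - 1) * pageSize)
            .Take(pageSize)
            .ToList();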

There might also be issues with Skip and Take if you're paging far past the first pages; check this: Entity Framework Skip/Take is very slow when number to skip is big
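
If deep pages are still slow, the linked question covers how OFFSET-style paging degrades as the skip count grows. One common alternative, sketched here rather than taken from that post, is keyset (seek) paging: filter past the last key seen on the previous page instead of skipping rows. lastSeenId below is a hypothetical parameter, and ProcessMessageId is assumed to be the sort key:

// Keyset paging: the database seeks past the last id instead of counting and
// discarding (page - 1) * pageSize rows, which stays fast for deep pages.
return query.Where(x => x.ProcessMessageId > lastSeenId)
            .OrderBy(x => x.ProcessMessageId)
            .Take(pageSize)
            .ToList();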


1 Comment

This is definitely the answer: the original code pulls all of the results from the data store into memory and only then filters the collection, instead of offloading the filtering and paging to the data store and materializing just the requested page as a List<T>.
