I'm seeing a weird behavior in my Entity Framework model. I've got a query that looks like this:
var rows = ( from alarm in context.Alarms
             join temp in context.ListDetails on alarm.ListDetailId equals temp.ListDetailId into entries
             from entry in entries.DefaultIfEmpty()
             join read in context.Reads on alarm.ReadId equals read.ReadId
             join plate in context.Images on alarm.ReadId equals plate.ReadId
             where alarm.IActive == 1 && !alarm.TransmittedAlarm
             where read.IActive == 1
             where plate.IActive == 1 && plate.ImageTypeId == 2
             select new { alarm, entry, read, plate } ).ToArray();
The SQL that Entity Framework generates returns all of the columns in alphabetical order by column name, and the first of those columns turns out to be NULL for a few rows in the result set. When I expand the rows variable in the debugger, I see that those entire rows are null!
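A quick way to see the same thing outside the debugger is just to count the nulls in the materialized array, something like:

// Count how many of the materialized anonymous-type rows came back as null.
int nullRows = rows.Count(r => r == null);
Console.WriteLine("{0} of {1} rows are null", nullRows, rows.Length);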
EDIT: Some clarification.
By "first column", I mean the first column of the first row, i.e., in "SELECT A, B, C FROM ...", I mean A. It just happens that the query that Entity Framework builds returns all of the columns in the joined result set in alphabetical order, and the first one alphabetically is nullable and is null for some rows.
The column in question is not a primary key; if it were a primary key, it couldn't be null.
When Entity Framework materializes the rows of the returned data into objects, it seems to look at the value of the first column in each row. If that column is null, it returns null for the entire row, instead of an object whose corresponding property is set to null.
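One possible workaround, sketched below using only the columns already referenced in the query (I haven't tried it), would be to select explicit scalar columns instead of whole entities, since an anonymous type built from scalars can't itself come back null:

// Sketch: project scalar columns instead of whole entity objects.
// Assuming ListDetailId is an int, the cast to int? lets the left join's NULLs materialize cleanly.
var flat = ( from alarm in context.Alarms
             join temp in context.ListDetails on alarm.ListDetailId equals temp.ListDetailId into entries
             from entry in entries.DefaultIfEmpty()
             join read in context.Reads on alarm.ReadId equals read.ReadId
             join plate in context.Images on alarm.ReadId equals plate.ReadId
             where alarm.IActive == 1 && !alarm.TransmittedAlarm
             where read.IActive == 1
             where plate.IActive == 1 && plate.ImageTypeId == 2
             select new
             {
                 alarm.ReadId,
                 alarm.TransmittedAlarm,
                 ListDetailId = (int?)entry.ListDetailId,
                 plate.ImageTypeId
             } ).ToArray();

But that loses the entity objects themselves, so it's more of a dodge than a fix.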
I don't believe this has anything to do specifically with a left outer join; it just happens that my query uses one. I haven't done any testing to verify this, however, so it's just a supposition.
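For reference, the test would just be the same query with the ListDetails left outer join dropped, to see whether whole-null rows still show up without DefaultIfEmpty():

// Same query minus the left outer join; if rows still come back null here,
// then DefaultIfEmpty() isn't the culprit.
var testRows = ( from alarm in context.Alarms
                 join read in context.Reads on alarm.ReadId equals read.ReadId
                 join plate in context.Images on alarm.ReadId equals plate.ReadId
                 where alarm.IActive == 1 && !alarm.TransmittedAlarm
                 where read.IActive == 1
                 where plate.IActive == 1 && plate.ImageTypeId == 2
                 select new { alarm, read, plate } ).ToArray();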
Has anybody seen this before? Does anyone have a fix for this?
Tony