Using the Infragistics jQuery (12.1) grid and MVC 3, what's the best way to handle large datasets (100k+)?
We're using LINQ to Entity Framework and I'm selecting everything into a view model:
IQueryable<ViewModel> vm = from products in context.vw_Products select products;
and passing it into the view from our Index Action.
I have remote paging enabled so it doesn't load the entire dataset all at once, and a PagingGetData action which returns the same dataset as before.
But this seems to be significantly slower than all the other grids, taking anywhere from 5 to 12 seconds just to finish rendering the grid.
Is there a way to set the grid model's total record count and the number of pages, so I can utilize the int page and int pageSize I pass into the PagingGetData action? Or what is the best way to do this?
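Roughly, what I'd like to end up with for PagingGetData is something like this (just a sketch of the idea; ordering by Name and the shape of the JSON are guesses on my part, since I don't know what the grid actually expects back):

public JsonResult PagingGetData(int page, int pageSize)
{
    IQueryable<ViewModel> query = from products in context.vw_Products
                                  select products;

    // Count() translates to SELECT COUNT(*) on the SQL side, so the total is cheap.
    int totalRecords = query.Count();

    // Skip/Take compose onto the IQueryable, so EF only materializes one page of rows.
    var pageOfData = query.OrderBy(p => p.Name)
                          .Skip((page - 1) * pageSize)
                          .Take(pageSize)
                          .ToList();

    return Json(new { Records = pageOfData, TotalRecordsCount = totalRecords },
        JsonRequestBehavior.AllowGet);
}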
Hi, slinex,
The best way is to enable continuous virtualization for the grid.
You can read more information about it here.
Regards,
Stanimir Todorov
It doesn't seem to make sense to have one large continuous grid when paging is an option.
I'm rendering 50 rows per page and doing the paging remotely, so the JSON that gets returned is small, but the query the LINQ translates to pulls all the rows and takes 3-4 seconds every time we change the page. We're trying to cut down on how long it takes to render a page.
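(The way I'm seeing what the LINQ translates to: with an ObjectContext-based model you can cast the query to ObjectQuery and call ToTraceString(), or with a DbContext just call ToString() on the query. The SQL that comes out has no TOP/ROW_NUMBER clause, so the whole table comes back before the page is cut out.)

// Assumes an ObjectContext-style EF model; with DbContext, query.ToString() gives the SQL instead.
var query = from products in context.vw_Products select products;
string sql = ((System.Data.Objects.ObjectQuery)query).ToTraceString();
System.Diagnostics.Debug.WriteLine(sql);  // no TOP / ROW_NUMBER here => every row is pulled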
You can specify the total records count:
<%= Html.Infragistics().Grid(Model).ID("grid1").PrimaryKey("ProductID").Height("400px").Columns(column =>
{
column.For(x => x.ProductID).HeaderText(this.GetGlobalResourceObject("Grid", "PRODUCT_ID").ToString());
column.For(x => x.Name).HeaderText(this.GetGlobalResourceObject("Grid", "PRODUCT_NAME").ToString());
column.For(x => x.ProductNumber).HeaderText(this.GetGlobalResourceObject("Grid", "PRODUCT_NUMBER").ToString());
column.For(x => x.StandardCost).HeaderText(this.GetGlobalResourceObject("Grid", "STANDARD_COST").ToString());
}).Features(features => {
features.Paging().PageSize(10).TotalRecordsCount(100);
}).Virtualization(false)
.GenerateCompactJSONResponse(false)
.DataSourceUrl(Url.Action("PagingGetData"))
.DataBind()
.Render()%>
You can read more information about paging here.
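The 100 above is just a hard-coded example; normally you would compute the count in the controller with a plain Count() (which translates to SELECT COUNT(*)) and pass it to the view, for example through ViewBag (the property name here is only for illustration):

// In the Index action:
ViewBag.TotalCount = context.vw_Products.Count();

// In the view, instead of the hard-coded value:
features.Paging().PageSize(10).TotalRecordsCount((int)ViewBag.TotalCount);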
I tried adding that to the grid but it doesn't change the total record count.
@(
Html.Infragistics().Grid<ViewModel>(Model).ID("grid")
.AutoGenerateColumns(false)
.AutoGenerateLayouts(false)
.Width("100%")
.Columns(column =>
{
column.For(x => x.DateActive).HeaderText("Date Active");
column.For(x => x.LastStatusChange).HeaderText("Last Status Change");
column.For(x => x.Status).HeaderText("Status");
column.For(x => x.Name).HeaderText("Name");
}).Features(features =>
{
features.Filtering().Mode(FilterMode.Advanced);
features.Paging().Type(OpType.Remote).PageSize(50)
.ShowPageSizeDropDown(true).PageSizeList(new List<int> { 10, 15, 20, 50 })
.PrevPageLabelText(Resources.DMS.GridButtonPrevious).NextPageLabelText(Resources.DMS.GridButtonNext).TotalRecordsCount(1000);
features.Sorting().Mode(SortingMode.Single);
features.Selection().MouseDragSelect(true).MultipleSelection(false);
features.GroupBy();
features.Resizing();
features.Hiding();
features.Tooltips().Visibility(TooltipsVisibility.Always);
}).Virtualization(false).DataSourceUrl(Url.Action("PagingGetData", "Products", new { area = "Products" }))
.DataBind().Render()
)
It would be great if it did work.
I managed to change the total record count by setting it in JavaScript.
// _recCount is an internal field on the data source; overriding it in the
// pager rendering event forces the pager to show the desired total.
grid.live("iggridpagingpagerrendering", function (evt, ui) {
    ui.dataSource._recCount = 1000;
});
But of course now when I page, I'm passing in the data source for the next page, but I think the igGrid is skipping the data from the data source, since the next page that renders comes up blank.
Is there a way around this?
Edit: nevermind, doing it this way would mess with the sorting and filtering.
So is there an example of how to page properly with MVC?
Loading all the data into the model's data source every time the page is changed doesn't seem like a good way to do it. With only 200k records so far, it's already taking 10-20 seconds to load a page that displays 50 records, and another 10-20 seconds every time the page is changed.
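From the samples I've been able to find, it looks like the intended pattern is to decorate the data action with the GridDataSourceAction attribute from Infragistics.Web.Mvc and just return the unmaterialized IQueryable, so the wrapper applies the paging/sorting/filtering parameters to the query itself. I'm not sure I have that right, which is part of what I'm asking:

using Infragistics.Web.Mvc;

[GridDataSourceAction]
public ActionResult PagingGetData()
{
    // Return the IQueryable unmaterialized; the attribute is supposed to translate
    // the grid's page/sort/filter request parameters into the LINQ query.
    return View(context.vw_Products.AsQueryable());
}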
I looked into the source code and got around the issue of the double querying for now.
But it looks like sorting and filtering are doing some strange things as well; see the attached zip file for a screenshot of the requests that occur when sorting with a filter already in place. We filtered 300k records down to 50 (that took 5 seconds), then did a sort, which took 10 seconds to return the data. The last column in the screenshot is the duration.
Thanks.
Edit: After a day of trying to optimize the query in various ways, everything seems to be going smoothly even with the double querying.
Another quick update: AutoGenerateLayouts is slated to be set to FALSE by default in the next Service Release, so that, along with the applied fix from the RnD team, only one data retrieval query will be executed. The fix (and the change) will be included in the next Service Release, which should be available somewhere around the middle of next month if all goes according to schedule. Thank you for helping us drill down to the bottom of the problem and for your patience - hopefully the next SR will give you exactly the performance you're looking for.
Cheers,
Borislav
Hi Borislav,
We already have that set to false, and as I just checked in the sample you have on SkyDrive, it is set to false there as well.
Hi guys,
Just a quick update from the RnD team: the fact that the AutoGenerateLayouts option is set to true (which is the default) is causing the excessive data-retrieval query. So, as a workaround, you can set that option to false. The RnD team have implemented a fix (which will appear in the next Service Release) that triggers an additional TOP 1 (Take(1)) query instead of the current full data-retrieving one.
Cheers,
Borislav
Hello slinex,
As this matter seems to be related to an existing development issue, I have linked this case to the existing work item - 114943. The next step would be for our development team to provide more information regarding this scenario.
Your support ticket regarding this matter is CAS-94207-YF6KG3. Please feel free to continue sending us any updates through this case.
Do not hesitate to contact me if you need more information.