Using jQuery 12.1 and MVC 3, what's the best way to handle large datasets (100k+)?
We're using LINQ to Entities (Entity Framework), and I'm selecting everything into a view model:
IQueryable<ViewModel> vm = from products in context.vw_Products select products;
and passing it into the view from our Index Action.
I have remote paging enabled so the grid doesn't load the entire dataset all at once, and a PagingGetData action that returns the same dataset as before.
But this seems to be significantly slower than all the other grids, taking anywhere from 5 to 12 seconds just to finish rendering the grid.
Is there a way to set the grid model's total record count and the number of pages, so I can make use of the int page and int pageSize parameters I pass into the PagingGetData action? Or what is the best way to do this?
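Something along these lines is what I have in mind: return only the requested page plus the total count, so EF only materializes one page. This is just a rough sketch with placeholder names (the 'context' field and the JSON shape are made up, not what I actually have working):

// Rough sketch only - 'context' and the anonymous JSON shape are placeholders.
public ActionResult PagingGetData(int page, int pageSize)
{
    IQueryable<ViewModel> query = from products in context.vw_Products
                                  orderby products.Name
                                  select products;

    int totalCount = query.Count();              // translates to a single COUNT query
    var pageOfData = query.Skip(page * pageSize) // assumes a zero-based page index
                          .Take(pageSize)
                          .ToList();             // only pageSize rows are materialized

    return Json(new { TotalRecordsCount = totalCount, Records = pageOfData },
                JsonRequestBehavior.AllowGet);
}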
Hi, slinex,
The best way is to enable Continuous virtualization for the grid.
You can read more about it here.
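As a rough illustration only (the exact option names can differ between versions, so please treat this as a sketch rather than exact API), virtualization is switched on through the same fluent helper you are already using:

@(Html.Infragistics().Grid<ViewModel>(Model).ID("grid")
    .Width("100%")
    .Height("500px")       // virtualization generally requires a fixed grid height
    .Virtualization(true)  // render only the rows currently in view
    // the continuous-mode setting is version dependent - see the documentation above
    .DataSourceUrl(Url.Action("PagingGetData", "Products", new { area = "Products" }))
    .DataBind().Render()
)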
Regards,
Stanimir Todorov
It doesn't quite make sense to use a large continuous grid when paging is an option, though.
I tried adding that to the grid, but it doesn't change the total record count.
@(
Html.Infragistics().Grid<ViewModel>(Model).ID("grid")
.AutoGenerateColumns(false)
.AutoGenerateLayouts(false)
.Width("100%")
.Columns(column =>
{
column.For(x => x.DateActive).HeaderText("Date Active");
column.For(x => x.LastStatusChange).HeaderText("Last Status Change");
column.For(x => x.Status).HeaderText("Status");
column.For(x => x.Name).HeaderText("Name");
}).Features(features =>
{
features.Filtering().Mode(FilterMode.Advanced);
features.Paging().Type(OpType.Remote).PageSize(50)
.ShowPageSizeDropDown(true).PageSizeList(new List<int> { 10, 15, 20, 50 })
.PrevPageLabelText(Resources.DMS.GridButtonPrevious).NextPageLabelText(Resources.DMS.GridButtonNext).TotalRecordsCount(1000);
features.Sorting().Mode(SortingMode.Single);
features.Selection().MouseDragSelect(true).MultipleSelection(false);
features.GroupBy();
features.Resizing();
features.Hiding();
features.Tooltips().Visibility(TooltipsVisibility.Always);
}).Virtualization(false).DataSourceUrl(Url.Action("PagingGetData", "Products", new { area = "Products" }))
.DataBind().Render()
)
It would be great if it did work.
I managed to change the total record count by setting it in javascript.
grid.live("iggridpagingpagerrendering", function (evt, ui) {
ui.dataSource._recCount = 1000;
});
But of course now when I page, I'm passing in the data source for the next page, yet I think the igGrid is skipping that data, since the next page renders blank.
Is there a way around this?
Edit: nevermind, doing it this way would mess with the sorting and filtering.
So is there an example of how to page properly with MVC?
Loading all the data into the model's data source every time the page changes doesn't seem like a good way to do it. With only 200k records so far, it already takes 10-20 seconds to load a page that displays 50 records, and another 10-20 seconds every time the page changes.
Of course :)
You can take a look at the following samples:
http://samples.infragistics.com/jquery/grid/paging
http://samples.infragistics.com/jquery/grid/paging-api
Notice the Code View section of the page just below the grid - in that section you can find the controller and view code that's used for the sample. Since both samples use remote paging, the controller action with the [GridDataSourceAction] attribute is the main point of interest here.
Hope this helps.
Cheers,
Borislav
Yes, I've looked at those before. Is there any more explanation of what is being returned in the paging API action PagingGetData? Is it just taking the whole table?
[GridDataSourceAction]
[ActionName("PagingGetData")]
public ActionResult PagingGetData()
{
    var ds = this.DataRepository.GetDataContext().MyProducts;
    return View("PagingAPI", ds);
}
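My guess is that the point is to hand the attribute a deferred IQueryable so it can apply the paging, sorting and filtering before anything executes against the database; roughly like this (the context and property names are just my guesses, not the actual sample code):

[GridDataSourceAction]
[ActionName("PagingGetData")]
public ActionResult PagingGetData()
{
    // Return the IQueryable without calling ToList(), so the action filter can
    // append Skip/Take, OrderBy and Where before the query is sent to SQL.
    IQueryable<ViewModel> ds = from products in context.vw_Products select products;
    return View("PagingAPI", ds);
}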
I looked into the source code and got around the issue of the double querying for now.
But it looks like sorting and filtering are doing some strange things as well; see the attached zip file for a screenshot of the requests that occur when sorting with a filter already in place. We filtered 300k records down to 50 (which took 5 seconds) and then did a sort, which took 10 seconds to return the data. The last column in the screenshot is the duration.
Thanks.
Edit: After a day of trying to optimize the query in various ways, everything seems to be going smoothly even with the double querying.
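For anyone else chasing this, the biggest help was checking the SQL that Entity Framework actually generates for the grid's query. Assuming an ObjectContext-based model, something along these lines works (the cast and names here are illustrative, not our exact code):

// Dump the SQL that EF will send for the grid's query to the debug output.
IQueryable<ViewModel> query = from products in context.vw_Products select products;
var objectQuery = query as System.Data.Objects.ObjectQuery<ViewModel>;
if (objectQuery != null)
{
    System.Diagnostics.Debug.WriteLine(objectQuery.ToTraceString());
}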
Another quick update:
AutoGenerateLayouts is slated to be set to FALSE by default in the next Service Release, so that along with the applied fix from the RnD team only one data retrieval query will be executed. The fix (and the change) will be included in the next Service Release, which should be available somewhere around the middle of next month if all goes according to schedule.
Thank you for helping us drill down to the bottom of the problem and for your patience - hopefully the next SR will give you exactly the performance you're looking for.
Cheers,
Borislav
Hi Borislav,
We already have that set to false, and as I just checked in the sample you have on SkyDrive, it is also set to false there.
Hi guys,
Just a quick update from the RnD team: the fact that the AutoGenerateLayouts option is set to true (which is the default) is causing the excessive data-retrieval query. So, as a workaround, you can set that option to false. The RnD team have implemented a fix (which will appear in the next Service Release) that triggers an additional TOP 1 (Take(1)) query instead of the current full data-retrieving one.
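Conceptually (this is only an illustration, not the actual grid source code), the difference is between materializing the whole result set just to inspect its shape and probing it with a single record:

// Illustration only - not the grid's real implementation.
void IllustrateLayoutProbe(IQueryable<ViewModel> query)
{
    // With AutoGenerateLayouts = true, the layout detection currently ends up
    // executing the full query:
    var allRows = query.ToList();        // SELECT ... (every row)

    // The fix probes the shape from a single record instead:
    var probe = query.Take(1).ToList();  // SELECT TOP (1) ...
}

Cheers,
Borislav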
Hello slinex,
As this matter seems to be related to an existing development issue, I have linked this case to the existing work item - 114943. The next step would be for our development team to provide more information regarding this scenario.
Your support ticket regarding this matter is CAS-94207-YF6KG3. Please feel free to continue sending us any updates through this case.
Do not hesitate to contact me if you need more information.