Hello,

I'm using xamDataChart to display the signals of a data acquisition system, and I have to display up to 12 series with 10,000 points each (10 seconds of data). I read the following whitepaper: http://download.infragistics.com/marketing/infragistics-xamDataChart-performance-whitepaper.pdf

On page 15 there is the following sentence: "During our performance testing we found that our Silverlight/WPF xamDataChart even with 1 million data points supports refresh rate from just several milliseconds (3-10 ms). With tested competitors was impossible to achieve these results."

I used your XamDataChartPerformance sample and replaced your simulated data with measured data from my system. I'm using 7 series with 70,000 points altogether, and the CPU load is at 50% on an Intel i5 CPU at 3 GHz. Please help me to improve the performance; with this CPU load your library is not applicable for my project.

I attached my modified sample application so you can reproduce my problem. Please copy the following DLLs to the \\Dll folder; I had to delete them because of the 200 kB attachment limit:

InfragisticsWPF4.Controls.Charts.XamDataChart.v11.1.dll
InfragisticsWPF4.DataVisualization.v11.1.dll
InfragisticsWPF4.v11.1.dll
Hi,
Does the sample data you have there represent real world data? It looks like you are generating a completely random Y value for each point.
While this type of test data is easy to generate, it usually doesn't correlate with actual real-world input, and it represents a worst-case scenario both for the geometry virtualization in the chart and for the WPF rasterizer itself.
If you use this code to generate the sample data instead (a more realistic setup, where each value has some relation to the previous one), you will see that the performance of the chart is much, much better.
var r = new Random();
var x = 0;              // starting x offset
double curr = 10.0;     // current value of the random walk

for (int i = 0; i < 30000; i++)
{
    // Step the value up or down by a small random amount so that each
    // sample is correlated with the one before it, clamping at zero.
    if (r.NextDouble() < .5)
    {
        curr += r.NextDouble() * 4.0;
    }
    else
    {
        curr -= r.NextDouble() * 4.0;
        if (curr < 0)
        {
            curr = 0;
            curr += r.NextDouble() * 4.0;
        }
    }

    collection.Add(new NumericSampleObj { TimeOfSample = x + i, Value = curr });
}
Also, scatter-type series represent a worst-case scenario in terms of the assumptions we can make about the shape of the line to improve performance, as there are no constraints on the data.
You would be much better off using a series with the CategoryXAxis or the CategoryDateTimeXAxis, if possible.
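In case it helps, here is a minimal sketch of what that change could look like when building the chart from code rather than XAML. It assumes the NumericSampleObj class and the collection populated by the snippet above; the class and property names follow the 11.x API, but please verify them against the exact version you are running.

using System.Collections.ObjectModel;
using Infragistics.Controls.Charts;

public static class ChartSetup
{
    // Builds a XamDataChart that uses a LineSeries on a CategoryXAxis instead of a
    // ScatterLineSeries, so the chart can assume sorted, evenly spaced x values.
    public static XamDataChart BuildChart(ObservableCollection<NumericSampleObj> collection)
    {
        var chart = new XamDataChart();

        var xAxis = new CategoryXAxis
        {
            ItemsSource = collection,
            Label = "{TimeOfSample}"
        };
        var yAxis = new NumericYAxis();

        chart.Axes.Add(xAxis);
        chart.Axes.Add(yAxis);

        var series = new LineSeries
        {
            ItemsSource = collection,
            ValueMemberPath = "Value",
            XAxis = xAxis,
            YAxis = yAxis
        };
        chart.Series.Add(series);

        return chart;
    }
}

The same structure applies with a CategoryDateTimeXAxis if your samples carry real timestamps; in that case set its DateTimeMemberPath to the timestamp property.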
Here is the sample I tested with. With 30,000 samples, it's unusable when trying to interact with it.
I guess a better question would be: is there a sample anywhere that demonstrates the 'million data points' claim above with millisecond refreshes?
It seems like Infragistics should have a sample that proves that claim.
I am currently using 11.2, the latest service release. I will clean up my sample and post it here.
Mike,
Do you have a sample application that replicates the performance issue? If you can provide one I'll see what recommendations I can come up with. Also which exact version of the chart are you running?
-Graham
Is somebody looking into this or not? I'm still having the same problem: horrible performance with a ScatterLineSeries with 30,000 points.