5M records - "The server sent an unrecognizable response"

Dec 2, 2015 at 3:37 PM
Thanks for developing this very useful tool.

I'm trying to return a recordset that has 5M records. I'm not doing anything to aggregate the data, just trying to return the table with "Evaluate myTable". I get the following error: "The server sent an unrecognizable response." If I try to return subsets of the data (3.3M records and fewer), I don't get the error. Any ideas?
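For illustration, the full query and the kind of subset pull I mean look roughly like this (TOPN is just one way to take a subset, and myTable stands in for the real table name):

    -- full table: fails with "The server sent an unrecognizable response"
    EVALUATE myTable

    -- a subset of ~3.3M rows: comes back fine
    EVALUATE TOPN ( 3300000, myTable )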

I'm using DAX Studio 2.3.4 (freshly installed yesterday) with Excel 2013 64-bit on Windows 7. Thanks for your help.
Dec 2, 2015 at 8:22 PM
Are you running this query against a Power Pivot model? And would the average row size of the table be 350 bytes or more? If so, you may have hit the limit of what's currently possible (at roughly 350-400 bytes per row, 5M rows works out to close to 2 GB). Microsoft does not support anything other than Excel connecting to Power Pivot, so we are using unsupported APIs and a custom data transfer mechanism. There is an intrinsic 2 GB data payload limit, and you could have hit that. Unfortunately, if this is the case it's not easy to fix. (This limit does not exist for the other data sources like Power BI Desktop and SSAS servers.)
Dec 2, 2015 at 10:28 PM
Yes, I'm running against an Excel Power Pivot model. I'm not sure how to calculate the row size, but it's only 7 columns. They are all text with a max of 15 characters, mostly around 8 characters. The Excel file is 44.1 MB.

FWIW, I also receive an error when trying to use VBA to write the recordset to a CSV (based on the code here: http://www.powerpivotblog.nl/export-a-table-or-dax-query-from-power-pivot-to-csv-using-vba/). The error ("Method Open of object Recordset failed") appears when I try to run the whole recordset. Smaller subsets work fine.
Dec 2, 2015 at 11:37 PM
So even if we guess an average length of 12 characters, that still only comes to 84 characters per row. Strings are stored as Unicode, so that would be 168 bytes at most (around 840 MB for 5M rows). There will be some extra encoding and protocol overhead, but I'm still not sure if you would hit the 2 GB limit I was thinking of. You can't really compare the Excel file size to the size of the expanded dataset, as Power Pivot can get some massive compression figures (10x or more), whereas the EVALUATE has to expand everything back out again.
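If you want to sanity-check the actual payload, a query along these lines (run from DAX Studio) would total up the string lengths; the column names here are just placeholders for your seven columns:

    EVALUATE
    ROW (
        "Approx uncompressed bytes",
        SUMX (
            myTable,
            ( LEN ( myTable[Column1] )
            + LEN ( myTable[Column2] )
            + LEN ( myTable[Column3] ) ) * 2  -- 2 bytes per character for Unicode; add the remaining columns
        )
    )

That returns a single row, so it won't run into the same payload problem as pulling back the whole table.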

However, we are running code similar to the VBA in the link you posted (but in .NET), and if the core Microsoft library is throwing that error, it could be very difficult to work around.

You could try turning on the application logging (see http://daxstudio.codeplex.com/wikipage?title=logging%20settings) and posting the logs here (you will probably have to create an issue to attach the log files, as I don't think discussions allow attaching files). But if 3.3M rows works and 5M fails, I think we may have hit some internal limit in the APIs.

You are probably better off just pulling the data from the original data source.
Dec 3, 2015 at 3:50 PM
I think you're right; this is probably a very difficult error to work around. The data source is 100 CSV files (similar but not exactly the same, of course), transformed and unioned in Power Query. I'll try to come up with an alternative approach - maybe Power Update, or executing an INSERT command from Power Query to SQL Server (http://blog.crossjoin.co.uk/2013/12/09/updating-data-in-sql-server-with-power-query/). Thanks for your help.