Good afternoon,
Last week we migrated from Pentaho CE 6 to Pentaho CE 8, along with our app.
Everything is working like a charm except for exporting data tables with a large number of rows.
The issue is that in Pentaho 8 the Excel export takes 5 to 15 minutes for 15k to 60k rows. As an alternative, the CSV export takes seconds to do the same, but Excel doesn't understand the default encoding of these files.
In Pentaho CE 6.1, exporting the same data to CSV produced files that were encoded correctly and opened fine in Excel.
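In case it helps to pin down what "default encoding" means here: my assumption is that the Pentaho 8 CSV is plain UTF-8 without a BOM, which would explain why Excel falls back to the ANSI code page when the file is double-clicked. A quick way to check (my own Node.js snippet, nothing from Pentaho):
Code:
// Does the exported CSV start with the UTF-8 BOM (EF BB BF)?
var fs = require('fs');
var head = fs.readFileSync('export.csv').slice(0, 3);
console.log(head.equals(Buffer.from([0xEF, 0xBB, 0xBF])) ? 'BOM present' : 'no BOM');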
These are the tests I've performed in Pentaho 8:
- Export Component, with the behavior explained previously.
- Export Popup Component, with the behavior explained previously.
- Button Component calling the exportData function, with the same results.
Code:
function exportData() {
  render_relatorio.queryState.exportData('csv', null, {filename: 'custom_name.csv'});
}
- Modifying "..\pentaho-server\pentaho-solutions\system\pentaho-cdf\js\queries\CdaQuery.js".
I've tried to modify the AJAX call with the desired contentType, but for some reason changes to this JavaScript file are not reflected in "..\pentaho-server\pentaho-solutions\system\pentaho-cdf\js\cdf-bootstrap-script-includes.js" (see the note after this list).
Code:
$.ajax({
  type: 'POST',
  dataType: 'text',
  async: true,
  data: queryDefinition,
  //contentType: "charset=ISO-8859-15",
  contentType: "charset=windows-1252",
  url: this.getOption('url'),
  xhrFields: {
    withCredentials: true
  }
}).done(function(uuid) {
  // The server answers with a uuid; the file itself is then downloaded through a hidden iframe.
  var _exportIframe = $('<iframe style="display:none">');
  _exportIframe.detach();
  _exportIframe[0].src = CdaQueryExt.getUnwrapQuery({"path": queryDefinition.path, "uuid": uuid});
  _exportIframe.appendTo($('body'));
}).fail(function(jqXHR, textStatus, errorThrown) {
  Logger.log("Request failed: " + jqXHR.responseText + " :: " + textStatus + " ::: " + errorThrown);
});
},
- I've also tested this on Pentaho CE 7, with the same behavior as in 8, so it feels like intended behavior.
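A note on the contentType change above: as far as I understand (this is an assumption on my part), contentType only describes the encoding of the POST body sent to the server, so even if my edit were picked up it might not change how the exported CSV itself is encoded. Also, a charset is normally a parameter of a full media type, so the complete form of the header would look more like this (sketch only):
Code:
// Full form of the header (jQuery's default is 'application/x-www-form-urlencoded; charset=UTF-8')
contentType: 'application/x-www-form-urlencoded; charset=windows-1252',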
My question: is there any way to correct this behavior so that my client can open the CSV directly in Excel?
I know it is possible to "import" the CSV file into Excel and select the UTF-8 encoding so that the file loads correctly, but my client considers this a suboptimal solution.
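The only alternative I can think of (and would rather not have to maintain) is post-processing the download on the client side and prepending a UTF-8 BOM myself, since in my experience that is what makes Excel pick up UTF-8 on double-click. A rough sketch, assuming a browser with fetch and Blob support; downloadCsvWithBom and exportUrl are my own hypothetical names, not part of CDF:
Code:
// Hypothetical workaround (not part of CDF): fetch the exported CSV, prepend a UTF-8 BOM
// and hand it back to the browser as a download, so Excel detects UTF-8 on double-click.
function downloadCsvWithBom(exportUrl, filename) {
  fetch(exportUrl, { credentials: 'include' })
    .then(function(response) { return response.blob(); })
    .then(function(csvBlob) {
      // '\uFEFF' is the BOM character; the Blob constructor writes it out as EF BB BF.
      var bomBlob = new Blob(['\uFEFF', csvBlob], { type: 'text/csv;charset=utf-8' });
      var link = document.createElement('a');
      link.href = URL.createObjectURL(bomBlob);
      link.download = filename;
      document.body.appendChild(link);
      link.click();
      document.body.removeChild(link);
    });
}
Obviously I'd prefer a setting on the Pentaho side over shipping extra JavaScript like this.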
Kind regards.