Read this blog post: MS Office 2007 versus OpenOffice 2.2 shootout. The test was flawed from the start for the following reasons:
- The author did not upload the original file, which is probably an Excel 97 .xls file, as a reference.
- Instead, two different formats were used: one from the OpenOffice 2.0 beta (a compressed .sxc file) and the other an XML file from Microsoft Office 2003.
- Since two different formats were used and the original file is unavailable, the comparison is meaningless.
- I renamed the XML file to .xls as the author did and checked whether OpenOffice 2.2 can read it. It cannot, because the data is interpreted as XML and exceeds the row limit of the Calc application. Excel 2003 renders it as a spreadsheet thanks to its proprietary XML parser. However, Excel 2003 cannot read .sxc or OpenDocument Spreadsheet files unless Sun's OpenDocument Format plugin is installed. (See the sketch after this list for why renaming the extension changes nothing.)
- The test is unrealistic for real-world business use because of the size (273 MiB of spreadsheet data). According to the author, the spreadsheet is actually a log file. One has to wonder why anyone would use a spreadsheet application to archive logs.
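The point about renaming .xml to .xls is that the extension does not change what is in the file: the application still has to parse whatever bytes are actually there. The sketch below is a hypothetical illustration of that, not anything from the original test; the function name and the classification strings are my own, and the magic-byte checks only cover the common cases discussed above.

```python
import zipfile

# Hypothetical helper: guess an "office" file's real container format from its
# leading bytes, regardless of what extension it has been renamed to.
def sniff_office_format(path):
    with open(path, "rb") as f:
        head = f.read(8)

    if head.startswith(b"PK\x03\x04"):
        # ZIP container: OpenOffice.org .sxc/.sxw, ODF, or Office 2007 OOXML.
        if zipfile.is_zipfile(path):
            with zipfile.ZipFile(path) as z:
                names = z.namelist()
            if "mimetype" in names:
                return "OpenOffice.org / OpenDocument (zip container)"
            if any(n.startswith("xl/") for n in names):
                return "Office Open XML spreadsheet (zip container)"
        return "ZIP container (unrecognized contents)"
    if head.startswith(b"\xd0\xcf\x11\xe0"):
        # OLE2 compound document: legacy binary .xls / .doc.
        return "Legacy binary Office file (OLE2)"
    if head.lstrip().startswith(b"<?xml"):
        # Flat XML, e.g. the Office 2003 SpreadsheetML export used in the test.
        return "Flat XML (e.g. Office 2003 SpreadsheetML)"
    return "Unknown format"


if __name__ == "__main__":
    import sys
    print(sniff_office_format(sys.argv[1]))
```

Running this on the renamed file would still report flat XML, which is exactly why Calc chokes on it while Excel 2003's XML parser happens to cope.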
Comments
Remember to look at the problem from a Windows user's perspective. You have a large amount of data you want tabulated, and you have a very limited set of tools to do it with. You can import the file into Word, but it won't be able to correlate all the events together. So you import it into Excel and have a column for time, one for the type of event, and one for the data. Ta-da, now you can use a tool you are familiar with to manipulate the log data (and you can use things like normalization, etc., if you need statistics). Is it optimal? No, but it is a matter of what tools you know. [I mean, how many perl/awk/python programs do exactly the same thing because each author only knew one tool to write it with?]
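For what it's worth, here is a rough sketch of that workflow in Python, not anything from the original post. The log format (tab-separated timestamp, event type, message) and the file names are assumptions; the idea is just that converting the log to CSV lets Excel or Calc open it with one field per column instead of choking on a 273 MiB blob.

```python
import csv

# Minimal sketch, assuming a hypothetical log format of
# "timestamp<TAB>event_type<TAB>message" per line.
def log_to_csv(log_path, csv_path):
    with open(log_path, encoding="utf-8") as log, \
         open(csv_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["time", "event_type", "data"])  # header row
        for line in log:
            fields = line.rstrip("\n").split("\t", 2)
            if len(fields) == 3:          # skip malformed lines
                writer.writerow(fields)


if __name__ == "__main__":
    log_to_csv("events.log", "events.csv")
```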