Why not run the CSV file through a custom-made app that "cleans" each line into a format Excel can easily import, taking Excel's quirks into account? The app then spits out another CSV that Excel is able to properly import.
I find this solution a good compromise, although it may not be possible to do it.
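As a rough illustration of what such a "cleaning" pass might look like, here's a minimal Python sketch. It assumes the Excel quirks you care about are stripped leading zeros and long digit runs getting rounded; it wraps those values in an `="..."` formula so Excel imports them as text. The function name and the specific quirks handled are my own assumptions, not a complete catalogue.

```python
import csv
import io

def clean_for_excel(src_text):
    """Hypothetical sketch: re-emit CSV so Excel won't mangle certain values.

    Values with leading zeros (zip codes, account numbers) or more than 15
    digits (Excel rounds these) are wrapped in an ="..." formula so Excel
    treats them as text on import.
    """
    out = io.StringIO()
    reader = csv.reader(io.StringIO(src_text))
    writer = csv.writer(out)
    for row in reader:
        cleaned = []
        for value in row:
            if value.isdigit() and (value.startswith("0") or len(value) > 15):
                cleaned.append('="{}"'.format(value))
            else:
                cleaned.append(value)
        writer.writerow(cleaned)
    return out.getvalue()
```

This only scratches the surface; date-like strings ("MAR1", gene names, etc.) would need their own rules, which is where "taking into account Excel's quirks" gets hard.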
Is this what you do? Because if someone sends me a CSV file, under your workflow I'd have to open it in a text editor (or a different spreadsheet program) to figure out the data type of each column and run it through a Perl script (or awk?). For a few columns, sure. For 200 columns, not ideal. Just to get around using Excel's import directly. Plus, "taking into account Excel's quirks" is easier said than done.
That said, if you have a script you use that does this, you'd make a lot of friends if you posted a link.
I only do this for automated processes where I know the exact requirements of the CSV and the data is large. I definitely wouldn't do it for ad-hoc scenarios.
But for some things it's just easier to put out an Excel file. In one scenario I have a set of complex spreadsheets that are updated nightly. I use EPPlus (http://epplus.codeplex.com/), a C# library, which updates the appropriate data in the spreadsheet.
In another scenario, I am taking in transactions (accounts payable/receivable, general ledger, etc.) that are in CSV format, applying some business rules, and inserting them into a spreadsheet. This spreadsheet is then used by the accountants to do postbacks to the actual ERP system. I looked at doing this through the ERP's own batch interfaces and couldn't justify the time and expense, as Excel was the best way to get the data in.
The above library doesn't require Excel on the machine. With Microsoft moving to an XML format for the file, it's made it much much easier to do these things. This particular library works well as long as you are doing simple data updates. There is no ambiguity in terms of the type of the data as you are able to explicitly state what is stored in the cell.
I would love to know if other libraries like this exist for Ruby or even Python.
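For Python, openpyxl seems to occupy a similar niche: it writes .xlsx files without needing Excel installed, and the cell type follows from the value you assign, so there's no import-time guessing. A minimal sketch (file name and values are just for illustration):

```python
from openpyxl import Workbook

# openpyxl stores the type you give it: strings stay strings, numbers
# stay numbers, so a leading zero survives the round trip.
wb = Workbook()
ws = wb.active
ws["A1"] = "0123"   # stored as text; leading zero preserved
ws["B1"] = 123.45   # stored as a number
wb.save("report.xlsx")
```

I haven't compared it feature-for-feature with EPPlus, but for simple data updates it looks like the same idea.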
If I'm going to write a custom app in some external programming language to consume and do type-aware processing on the CSV file, I might as well do whatever analysis I need to do in that app, rather than just using it to reprocess CSV into CSV and then go back to working in Excel.
It's not like any of the languages I would use to do this (Python, Ruby, etc.) lack easy-to-use libraries to treat the CSV as a data structure and perform analytic operations, once I've spent the (minimal) effort to load it and do any basic transformations based on the intended semantics of the data in each column.
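In Python, for instance, the standard library alone gets you there: load the CSV, apply your per-column typing once, and analyse in place. The column names and figures below are made up for illustration:

```python
import csv
import io
from statistics import mean

# Hypothetical AP data; in practice this would come from the file you were sent.
raw = "account,amount\nAP-001,125.50\nAP-002,74.25\n"

rows = list(csv.DictReader(io.StringIO(raw)))
amounts = [float(r["amount"]) for r in rows]  # explicit per-column typing
print(mean(amounts))  # 99.875
```

No round trip through Excel, and no ambiguity about what each column means.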
we have to do cleanup of CSV imports in some areas of LedgerSMB and it isn't pretty.
Additionally the fact that it is even required makes you wonder what sort of monsters are lurking in the shadowy corners only to leap out later when you least expect it.