Unreadable output from SuperCSV?

I have a utility class that my Spring controllers invoke to generate a CSV from a collection of beans, using the SuperCSV library (http://supercsv.sourceforge.net/).

The utility class is pretty basic:

public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    try{
        response.setContentType("text/csv;charset=utf-8"); 
        response.setHeader("Content-Disposition","attachment; filename="+filePrefix+"_Data.csv");

        OutputStream fout= response.getOutputStream();  
        OutputStream bos = new BufferedOutputStream(fout);   
        OutputStreamWriter outputwriter = new OutputStreamWriter(bos); 

        ICsvBeanWriter writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);

        for(Object anObj : dataObjs){
            writer.write(anObj, header);                                
        }
    }catch (Exception e){
        e.printStackTrace();
    }
};

The catch is, I'm getting different behaviors out of this operation and I don't know why. When I invoke it from one controller (we'll call it 'A'), I get the expected output of data.

When I invoke it from the other controller ('B'), I get a tiny blurb of unrecognizable binary data that cannot be opened by OO Calc. Opening it in Notepad++ yields an unreadable line of gibberish that I can only assume is an attempt by the reader to show me a binary stream.

Here's the Controller 'A' invocation (the one that works):

@RequestMapping(value="/getFullReqData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request, HttpServletResponse response) throws IOException{
    logger.info("INFO:  ******************************Received request for full Req data dump");
    String projName=  (String)session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<Requirement> allRecords = reqService.getFullDataSet(projectID);

    final String[] header = new String[] { 
            "ColumnA", 
            "ColumnB",
            "ColumnC",
            "ColumnD",
            "ColumnE"
    };

    CSVExporter.export2CSV(response, header, projName+"_reqs_", allRecords);
}

...and here's the Controller 'B' invocation (the one that fails):

@RequestMapping(value="/getFullTCData.html", method = RequestMethod.GET)
public void getFullData(HttpSession session, HttpServletRequest request, HttpServletResponse response) throws IOException{
    logger.info("INFO:  Received request for full TCD data dump");
    String projName=  (String)session.getAttribute("currentProject");
    int projectID = ProjectService.getProjectID(projName);
    List<TestCase> allRecords = testService.getFullTestCaseList(projectID);

    final String[] header = new String[] { 
            "ColumnW", 
            "ColumnX",
            "ColumnY",                
            "ColumnZ"
         };

    CSVExporter.export2CSV(response, header, projName+"_tcs_", allRecords);
}

Observations:

  • Which controller I invoke first is irrelevant. 'A' always works and 'B' always produces gibberish
  • Both calls pass a header array whose columns are a subset of the properties defined on the bean handed to CsvBeanWriter
  • The printStackTrace in the catch block does fire when a header column has no matching getter on the bean (i.e., the value can't be fetched reflectively), which suggests all of the column/property matchups here are succeeding
  • In the debugger, I've verified the writer.write(Object, header) call is being hit the expected number of times based on the number of objects being passed and that these objects have the expected data

Any suggestions or insights would be greatly appreciated. I'm really stumped how to better isolate the issue...

Answers


You aren't closing the writer. Also, CsvBeanWriter will wrap the writer in a BufferedWriter, so you can probably simplify your outputwriter as well.

public static void export2CSV(HttpServletResponse response,
        String[] header, String filePrefix, List<? extends Object> dataObjs) {
    ICsvBeanWriter writer = null;
    try{
        response.setContentType("text/csv;charset=utf-8"); 
        response.setHeader("Content-Disposition",
            "attachment; filename="+filePrefix+"_Data.csv");

        OutputStreamWriter outputwriter = 
            new OutputStreamWriter(response.getOutputStream()); 

        writer = new CsvBeanWriter(outputwriter, CsvPreference.EXCEL_PREFERENCE);

        // the actual writing
        writer.writeHeader(header);

        for(Object anObj : dataObjs){
            writer.write(anObj, header);
        }
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (writer != null) {
            try {
                writer.close(); // closes the writer and the underlying stream
            } catch (IOException e) {
                // nothing useful to do if close fails
            }
        }
    }
}

Super CSV 2.0.0-beta-1 is out now! As well as adding numerous other features (including Maven support and a new Dozer extension), CSV writers now expose a flush() method as well.
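To see why the missing close() matters: the writer buffers its output, and anything smaller than the buffer never reaches the servlet response unless the writer is flushed or closed. The same failure mode can be reproduced with plain java.io (this is an illustrative sketch of the buffering behavior, not of SuperCSV itself):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;

public class FlushDemo {

    // Write a short string through a BufferedWriter and never close it:
    // the data sits in the 8K buffer and the target stays empty.
    static String writeWithoutClose(String s) throws IOException {
        StringWriter target = new StringWriter();
        BufferedWriter bw = new BufferedWriter(target);
        bw.write(s);
        // no close()/flush() -- nothing has reached target yet
        return target.toString();
    }

    // Same write, but close() flushes the buffer into the target.
    static String writeWithClose(String s) throws IOException {
        StringWriter target = new StringWriter();
        BufferedWriter bw = new BufferedWriter(target);
        bw.write(s);
        bw.close(); // flushes buffered data, then closes
        return target.toString();
    }

    public static void main(String[] args) throws IOException {
        System.out.println("without close: \"" + writeWithoutClose("a,b,c\n") + "\"");
        System.out.println("with close:    \"" + writeWithClose("a,b,c\n") + "\"");
    }
}
```

If one controller happened to write enough rows to overflow the buffer, most of its output would be flushed incidentally, which could explain why 'A' appears to work while 'B' does not.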

