Forum Discussion

Uzumaki
Occasional Contributor
13 years ago

CSV file ReadLine

Hi,



When reading line by line from a text (CSV) file, rows are cut in half and sometimes skipped altogether. See the code below. I have also attached a sample .csv file.



What am I missing?




function ReadLineSucks(path)
{
  // Open the CSV file for reading as ANSI text
  var CSVFile = aqFile.OpenTextFile(path, aqFile.faRead, aqFile.ctANSI);
  var row;

  while (!CSVFile.IsEndOfFile())
  {
    row = CSVFile.ReadLine();

    // Log the current line number, then the line just read
    Log.Message(CSVFile.Line);
    Log.Message(row);
  }

  CSVFile.Close();
}


/Joakim

5 Replies

  • tristaanogre
    Esteemed Contributor
    I see that... the problem is, for some reason, with CaseKey 2652/109 when it follows CaseKey 2652/107.  When I move the 109 CaseKey to the end of the file, everything reads just fine.



    For whatever reason, ReadLine is not finding an end-of-line character at the end of 107, so it continues on into the next line. Really strange, but it seems to be related to something in the data of the test.csv file itself. I haven't done much analysis on it, but there's certainly something odd going on with that data.
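
    A quick, untested way to check that theory is to read the raw text and count the line terminators; a sketch (path being the same CSV path as in your function):

    // Read the whole file and compare Windows (CRLF) endings to bare LFs
    var text = aqFile.ReadWholeTextFile(path, aqFile.ctANSI);
    var crlf = text.split("\r\n").length - 1;
    var lf = text.split("\n").length - 1;
    Log.Message("CRLF endings: " + crlf + ", bare LF endings: " + (lf - crlf));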



    Question: Is there a reason you're going through your CSV file in this fashion? Do you need the whole line as a line, or are you trying to get at particular columns within each row? If it's column data you're after, you could look into the DDT.CSVDriver (http://smartbear.com/support/viewarticle/13850/). You might need to create a Schema.ini entry for your CSV to designate the pipe character as the delimiter, but it should be easily doable.
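
    As a rough, untested sketch (the path and file name here are just examples, and it assumes a Schema.ini next to the file declares the pipe delimiter), walking the records could look like:

    // Create a CSV driver and iterate its records
    var driver = DDT.CSVDriver("C:\\CSVFileFolder\\test.csv");
    while (!driver.EOF())
    {
      // Value() takes a column index or a header name
      Log.Message(driver.Value(0));
      driver.Next();
    }
    DDT.CloseDriver(driver.Name);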
  • Uzumaki
    Occasional Contributor
    I'm adding all the data in this file to a two-dimensional array to compare it with a DBTable object.

    There might be a smarter way?



    I will look into your suggestion using the DDT.CSVDriver



    Thank you!



    /Joakim

  • tristaanogre
    Esteemed Contributor
    Yeah, if you're reading the file into an array, I'd use the DDT.CSVDriver. That way you can reference columns within rows directly rather than having to parse the columns out of the result of a ReadLine.
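
    An untested sketch of that (same example path as above, header row assumed), building the two-dimensional array row by row:

    // One inner array per CSV record, one element per column
    var driver = DDT.CSVDriver("C:\\CSVFileFolder\\test.csv");
    var table = [];
    while (!driver.EOF())
    {
      var record = [];
      for (var i = 0; i < driver.ColumnCount; i++)
        record.push(driver.Value(i));
      table.push(record);
      driver.Next();
    }
    DDT.CloseDriver(driver.Name);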



    For that matter, you COULD use the DDT.CSVDriver to compare directly against the DBTable object without first putting everything into an array. I don't have the code right at my fingertips, but it should be pretty straightforward.
  • Uzumaki
    Occasional Contributor
    Many thanks for your help, Robert. I really appreciate it!



    Best regards

    /Joakim

  • HKosova
    SmartBear Alumni (Retired)
    Hi Joakim,



    When reading line by line from a text (CSV) file, rows are cut in half and sometimes skipped altogether.


    I've notified our developers about the file reading issue, and they will look into it. Thanks for letting us know!





    I'm adding all the data in this file to a two-dimensional array to compare it with a DBTable object.

    There might be a smarter way?


    You can create a database checkpoint for your CSV file, or you can bind an existing DBTable object to the CSV file at run time. Then, you'll be able to perform the comparison using just a couple of lines of code:



    // Bind the existing DBTable to the CSV file's folder via the Jet text driver
    // (only needed if you rebind at run time rather than create a new checkpoint)
    DBTables.table_name.ConnectionString = 'Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\\CSVFileFolder\\;Extended Properties="text;HDR=Yes;FMT=Delimited"';

    // Compare the DBTable's stored values against the CSV contents
    DBTables.table_name.Check();



    Notes:

    * Check out http://connectionstrings.com/textfile for connection strings for CSV files.

    * "|" is a non-standard separator, so you'll need to create the Schema.ini file for your CSV file with the following contents:



    [test.csv]

    Format=Delimited(|)



    * If you choose to rebind an existing DBTable object to your CSV file at run time, the file must have the same name as the DBTable's original table. For example, if the DBTable object was originally created for the Orders table, the file must be named Orders.csv.