Completed

data transform and validation

Paul Cannell 11 years ago, updated by Gandalf 9 years ago
I have a file from a payment processor with my debit responses.
There are 13k lines in this file (a 6 MB text file).
From time to time I get files that contain bogus lines: not just the blank lines you already cater for, but lines that are badly formatted.
Example (much simplified):
The file structure is as follows:
first 13 chars: ID No
next 10 chars: Surname
last 4 chars: response code

So an example GOOD file would look as follows:

9999999999999MYSURNAME 0010
8888888888888BlaName   0010
7777777777777  Name    0010
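
For reference, slicing one of these lines apart by offset is simple; here is a minimal Python sketch (the field names are mine, not the tool's):

def parse_line(line):
    # widths from above: 13-char ID No, 10-char Surname, 4-char response code
    return {
        "id_no": line[0:13],
        "surname": line[13:23].strip(),
        "response_code": line[23:27],
    }

parse_line("9999999999999MYSURNAME 0010")
# {'id_no': '9999999999999', 'surname': 'MYSURNAME', 'response_code': '0010'}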

Using the transform I can get this into a recordset really easily. I can then foreach over it or whatever. Happy days.
However, then you get this:
9999999999999MYSURNAME 0010
Crap Data
8888888888888BlaName   0010
7777777777777  Name    0010
 
Now the transform tool does its best, but when it gets to the 2nd row it sticks "Crap Data" into rs().IDNo, and then rs().Surname is "888888888B".
I want to tell the transform to make its best effort up until it finds (in my case) a line feed. Perhaps some companies would use a different char to denote end of line.
If I can do this, I can take my 13k entries, filter out the BAD ones, and do something with, say, the 3 lines that are invalid.
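
The check I have in mind is easy to sketch outside the tool in Python (the filename is made up; widths as above). The trick is to split on line feeds BEFORE parsing fields, so a bogus line can never bleed into the next record:

good, bad = [], []
with open("debit_responses.txt") as f:
    for line_no, raw in enumerate(f, start=1):
        line = raw.rstrip("\r\n")
        if not line:
            continue  # blank lines, which the tool already handles
        # a well-formed record is exactly 27 chars:
        # 13-digit ID No, 10-char Surname, 4-digit response code
        if len(line) == 27 and line[:13].isdigit() and line[23:].isdigit():
            good.append(line)
        else:
            bad.append((line_no, line))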

PS I'd really like to email the bad ones to somebody :)
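
Something like this, say, continuing the sketch above (Python's stdlib; host and addresses are placeholders):

import smtplib
from email.message import EmailMessage

bad = [(2, "Crap Data")]  # collected as in the sketch above

msg = EmailMessage()
msg["Subject"] = "%d invalid lines in debit response file" % len(bad)
msg["From"] = "etl@example.com"      # placeholder
msg["To"] = "somebody@example.com"   # placeholder
msg.set_content("\n".join("line %d: %s" % (n, t) for n, t in bad))

with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder host
    smtp.send_message(msg)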

Is there any way to solve my current use case? Fixed-width files are common when dealing with financial institutions, and though BULK INSERT in SQL may be faster, I'd rather take a bit longer and have the ability to handle error cases (try finding a problem in a 13k-line file when using BULK INSERT :)

- P

Under review
Thanks Paul.
We are looking at putting in a Regex for you to validate the field. What do you think?
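
For the layout in this thread, the pattern might look something like this (just a sketch; the surname character class is a guess):

import re

record = re.compile(r"^\d{13}[A-Za-z ]{10}\d{4}$")

record.match("9999999999999MYSURNAME 0010")  # matches
record.match("Crap Data")                    # None, so route it to the bad pile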