How to stop automatic string-to-date conversion in a source CSV file
I have a problem importing a payables transactions CSV file into GP. The problem happens on one column in the source file, which is imported into "Document Number" in GP. The column name in the source CSV is "Source". I DO set this column to the "Char" type (the source file does include column headers). In the source CSV, the "Source" column can contain date values such as 03/14/2015 as well as other values such as 12345 or "ABC test". The problem is that whenever the first data row has a date value such as 03/15/2015 in the "Source" column, the whole column is treated as a date type instead of Char, which skips every value in the column that is not in date format. For example, the values "123456" and "ABCcde" are not picked up when running the map. Is there any way to stop this behavior? As I said, I already set this column to the "Char" type, but it has no effect when I run the map, which is really annoying. In Windows, "Region and Language" is set to US English, and I don't want to change any settings there.
Could anyone give me some suggestions?
Thanks in advance!
In the Create Schema File window, I set my test column to 'char' and the others appropriately (although I typically set all of them to char, even integers, etc.).
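For reference, the Microsoft text driver reads column types from a schema.ini file that sits in the same folder as the CSV. An entry along these lines declares the column as text (the file name payables.csv is a placeholder; the Width 255 value is illustrative):

```ini
[payables.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=Source Text Width 255
```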
I wonder if it is looking at your schema.ini or resetting it?
In my testing on this version, I actually couldn't break it: I would hand-edit the schema.ini to 'date', and then every time I ran the map, something 'fixed' the schema back to char.
But that certainly implies that on your install it may keep setting it back to Date.
I tested this a bit more and I think I have a workaround – it worked for me anyway.
After setting it to what I wanted (date this time), I made my schema.ini read-only.
Now when I preview, I see my first line as the date and lines 2 and 3 as empty, so I have now repro'd what you are seeing.
The other thing you could try is using an ODBC data source and then 'querying' the table. That should also work, even if you have to cast the column to char in the query.
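If you go the ODBC route, a query along these lines should force the column through as text (the table name payables#csv is a placeholder for your file; CStr is the string-cast function available in Jet/ACE SQL):

```sql
SELECT CStr([Source]) AS SourceText
FROM [payables#csv]
```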
Thanks a lot for your input!
Unfortunately, the problem still exists. In fact, I have since found that the column type is decided automatically by the majority of value types in the column, not by the first value. For example, in my test file, if the "Source" column has 6 rows with "m/d/yyyy" values out of 10 rows ("m/d/yyyy" is the short date format on my PC), the column is automatically set to a date type even though I clearly created the schema with this column as Char, width 255. I can see this clearly in the schema.ini file. When I preview the file in the SmartConnect map, the "Source" column only picks up the 6 date-type values and shows the other four non-date values as empty. It really is weird. It seems to me that the SmartConnect map schema does not work at all, and the column types in the source CSV are decided automatically by the OS. I suspect this is either a bug in SmartConnect or related to SmartConnect configuration. For now, I have to write a program that converts the whole column to char by wrapping every value in double quotes.
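A pre-processing step like the one described can be sketched in a few lines of Python with the standard csv module (the function name and file paths are placeholders, not part of any SmartConnect tooling):

```python
import csv

def quote_all_fields(src_path, dst_path):
    """Rewrite a CSV so every field is wrapped in double quotes.

    Quoting each value encourages the text driver to treat the column as
    character data instead of guessing a date type from its contents.
    """
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
        for row in csv.reader(src):
            writer.writerow(row)
```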