Really don't understand this and have run out of hair to pull out.
I have a DTS package which has a Bulk Insert Task. The task is populated with its source filename dynamically, and it has a format file which is stored locally on the box it's running on.
The output file comes from a Unix box and is pipe-delimited. Fine and dandy. All works a treat as long as I treat everything as either an INT or a CHAR. But I have a couple of columns that are DATES in the format 13/02/2004.
I want these inserted into smalldatetime columns. Will settle for datetime columns though.
I get an error on every row: "Bulk Insert data conversion error (type mismatch) for Row X, Column 11 (PostingDate)", which happens to be the first date column.
Tried loading it as CHAR, SQLDATETIME and SQLDATETIM4 in the format file, all to no avail. Tried it with the columns in the destination table as DATETIME and SMALLDATETIME.
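For reference, a minimal sketch of the kind of non-XML format file being described here; the column names, widths and collation below are invented for illustration, not the real file:

8.0
3
1   SQLCHAR   0   10   "|"      1   AccountNo     SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   10   "|"      2   PostingDate   SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   50   "\r\n"   3   Description   SQL_Latin1_General_CP1_CI_AS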
Anyone have any ideas?|||What happens when you use the import wizard, make all of the columns varchar, and create a stage table?
Then do audits against the data, like:
SELECT * FROM Stage WHERE ISDATE(Col1) = 0
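A minimal sketch of that staging idea, with hypothetical table and column names (the real table isn't shown in the thread):

CREATE TABLE Stage (
    AccountNo   varchar(10),
    PostingDate varchar(10),  -- land the dates as plain text first
    Description varchar(50)
)
-- once the wizard has loaded it, audit the suspect column
SELECT * FROM Stage WHERE ISDATE(PostingDate) = 0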
...|||Originally posted by Brett Kaiser
What happens when you use the import wizard, make all of the columns varchar, and create a stage table?
Then do audits against the data, like:
SELECT * FROM Stage WHERE ISDATE(Col1) = 0
...
Done that. All the data in the column concerned is valid. They are all dates formatted as DD/MM/YYYY.|||You sure it's considered a valid date?
DECLARE @X varchar(10)
SELECT @X = '23/12/2003'
SELECT ISDATE(@X)
SELECT CONVERT(datetime, @X, 120)
Doesn't seem to be...is it a collation thing?|||Originally posted by Brett Kaiser
You sure it's considered a valid date?
DECLARE @X varchar(10)
SELECT @X = '23/12/2003'
SELECT ISDATE(@X)
SELECT CONVERT(datetime, @X, 120)
Doesn't seem to be...is it a collation thing?
Don't think so. I always use US Latin1 General, code page 437, case sensitive for everything.
Will check.
Must be something to do with the DD/MM rather than MM/DD ordering, as the column inserts fine as a CHAR.
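That theory is easy to test: ISDATE and unstyled CONVERT both follow the session's SET DATEFORMAT, which defaults to MDY under us_english, so a day-first string gets bounced. A quick sketch:

SET DATEFORMAT mdy
SELECT ISDATE('23/12/2003')   -- returns 0: 23 can't be a month
SET DATEFORMAT dmy
SELECT ISDATE('23/12/2003')   -- returns 1
-- an explicit style sidesteps DATEFORMAT altogether: 103 = dd/mm/yyyy
SELECT CONVERT(datetime, '23/12/2003', 103)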
Stupid #@*&%$! thing|||I think (ok, well SOMETIMES...not always) that's right...it's not recognizing it as a date...
How many records are we talking about?|||Originally posted by steve@powell.ne
Really don't understand this and have run out of hair to pull out.
Anyone have any ideas? I resemble that hair remark!
You are using BCP's -R option, right?
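For anyone who hasn't met it: -R tells bcp to use the client machine's regional settings when converting date, time and currency strings. A hypothetical invocation (server and file names made up):

bcp MyDB.dbo.Postings in postings.psv -f postings.fmt -R -S myserver -T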
-PatP|||Originally posted by Brett Kaiser
I think (ok, well SOMETIMES...not always) that's right...it's not recognizing it as a date...
How many records are we talking about?
It's booting every single one. Even when I format the data so the dates are something inoffensive like 01/01/2004.
Was using the Bulk Insert task in a DTS package, not the BCP command line at all.
Have abandoned Bulk Insert in favour of a scripted transform instead.
Works a treat. The files arrive daily with between 2,000 and 30,000 lines, so the scripted transform is fine. Even at 30,000 it's not exactly taking very long.
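The transform script itself isn't posted, but the equivalent set-based step, assuming the hypothetical Stage table sketched earlier, would look something like:

INSERT INTO Postings (AccountNo, PostingDate, Description)
SELECT AccountNo,
       CONVERT(smalldatetime, PostingDate, 103),  -- 103 = dd/mm/yyyy
       Description
FROM Stage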
Ho hum.|||Did you try bcp with a format file?
Or how about BULK INSERT?
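For what it's worth, a hypothetical BULK INSERT call (paths and names made up; note the file has to be visible from the server, not the client):

BULK INSERT MyDB.dbo.Stage
FROM 'C:\feeds\postings.psv'
WITH (FORMATFILE = 'C:\feeds\postings.fmt')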
I rarely use DTS...|||Originally posted by Brett Kaiser
Did you try bcp with a format file?
Or how about BULK INSERT?
I rarely use DTS...
Didn't try the BCP or Bulk Insert route. Not much interested in command-line solutions, though I guess I could cron them.
We integrate with legacy systems that spit PSV files out after each night's processing. We import the results each morning before we arrive, to give us offline access to the data.
I usually use scripted routines as they let me check for nulls and such, and do some clever tricks to reduce the number of replicated columns that come out.
Just thought I'd play with something that I don't normally use.
I've got XML to bring in next.