I am trying to write my first DTS package (unfortunately) and am having some problems.
I need to be able to import multiple flatfiles (all in the same format, just with different schemas), each one going into a different table. I have written an application to call my DTS package, sending it variables for the table name and the file name. This works fine when I test it on a single flatfile.
My problem is that the Transformation object does not reset after each DTS call, so I get "Column does not exist" errors after the first successful import. I can go into the DTS Designer and reset the transformation options, but that would defeat the purpose of automation. Is there any way to reset the Transformation object (or another technique I could use) so that it will continuously work on files that use different schemas?
I am very new at DTS, so please consider me "ignorant" when replying.
Thanks in advance.
- Jordan

Expansion:
I basically need a way to "remove all transformations and remap columns" automatically after, or before, each new flatfile is imported. I can't find an option to do this.
Any suggestions?
Or am I going about this task all wrong?|||I think you want to create a separate file connection and a separate data source connection for each transformation. If file A has to successfully import before file B, C, or D can be imported, then link your steps together with workflow "on success" links like:
File A--transform--table A--On Success--File B--transform--table B--On Success--File C--transform--table C--On Success--File D--transform--table D
It is possible to modify your transformations through scripts, but that would be much more difficult. Just keep your file names the same and the four-step package can be re-used the next time you need to do this import.|||Thank you for the reply,
My problem is that I have a variable number of files. Maybe if I better explain what I'm trying to do, it would help.
Each month, we get updates for a database in the form of flatfiles. The flatfiles come with various names and are stored in various directories. Each flatfile's name contains the name of the table it belongs to.
I have written a small application that recursively loops through all the directories and grabs the files. For each file, it calls my DTS package, passing two variables: fileName and tableName. Using these variables, I would like the DTS package to import the file into the table with simple column-to-column mapping.
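To illustrate, a stripped-down sketch of what the loop does might look something like this (the real application differs, and the server, folder, and package names below are made up):

' Stripped-down VBScript sketch. Assumes the package is saved on the server and
' exposes two global variables named fileName and tableName; server, folder,
' and package names are placeholders.
Option Explicit
Dim fso
Set fso = CreateObject("Scripting.FileSystemObject")

RunFolder fso.GetFolder("C:\MonthlyUpdates")

Sub RunFolder(folder)
    Dim f, subFolder
    For Each f In folder.Files
        ' The file name carries the target table name, e.g. Customers.txt -> Customers
        RunPackage f.Path, fso.GetBaseName(f.Path)
    Next
    For Each subFolder In folder.SubFolders
        RunFolder subFolder    ' recurse into child directories
    Next
End Sub

Sub RunPackage(fileName, tableName)
    Dim pkg
    Set pkg = CreateObject("DTS.Package")
    ' 256 = DTSSQLStgFlag_UseTrustedConnection
    pkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "MonthlyImport"
    pkg.GlobalVariables("fileName").Value = fileName
    pkg.GlobalVariables("tableName").Value = tableName
    pkg.Execute
    pkg.UnInitialize
    Set pkg = Nothing
End Sub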
This process works for the first file, but the package retains the column mapping from the previous run, so the next file (which has a different number of columns and goes into a different table) stops the DTS with "column does not exist" errors.
In a nutshell: I'd like to use a single DTS which will import one file into one table. But each call would use a different file that goes into a different table.
Thanks,|||I've decided to use BULK INSERT instead. My DTS still accepts a fileName and tableName, but now it just does a bulk insert from the file. This method seems to be working fine for me.
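In case it helps someone else, the core of the idea is roughly the following (sketched here as an ActiveX Script task; the connection string and the field/row terminators are placeholders that would have to match the real files):

' ActiveX Script task sketch: build and run a BULK INSERT from the package's
' fileName and tableName global variables. Connection string and terminators
' are placeholders.
Function Main()
    Dim conn, sql
    sql = "BULK INSERT [" & DTSGlobalVariables("tableName").Value & "]" & _
          " FROM '" & DTSGlobalVariables("fileName").Value & "'" & _
          " WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
              "Initial Catalog=MyDatabase;Integrated Security=SSPI;"
    conn.Execute sql
    conn.Close

    Main = DTSTaskExecResult_Success
End Function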
Thanks for the help,