Sunday, March 25, 2012
DTS execution error from ASP page
I have to execute a DTS package from an ASP file, but the following line throws an error:
Set oPkg = Server.CreateObject("DTS.Package")
Error type:
Server object, ASP 0177 (0x800401F3)
Invalid ProgID.
Does anybody know the solution?
Thanks,
Mamatha
--
Message posted via http://www.sqlmonster.com

Hi,
At a guess you need the DTS redistributables on the web server. This method
seems to be a better one:
http://www.sqldts.com/default.aspx?207,2
Alternatively, you may want to run it through a stored procedure.
John
"Mamatha MD via SQLMonster.com" <forum@.SQLMonster.com> wrote in message
news:7aefeb75208d4814b21443be6e91a5dd@.SQLMonster.com...
> Hi,
> I have to execute the DTS package from ASP file.
> But the following line throws error
> Set oPkg = Server.CreateObject("DTS.Package")
> Error type:
> Server object, ASP 0177 (0x800401F3)
> Invalid ProgID.
> Does anybody know the solution?
> Thanks,
> Mamatha
> --
> Message posted via http://www.sqlmonster.com
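The stored-procedure alternative John mentions typically shells out to dtsrun, the DTS command-line runner that ships with the SQL Server 2000 client tools. A minimal sketch, assuming a server named MyServer and a package named MyPackage (both names are hypothetical), using Windows authentication:

```sql
-- Hypothetical server and package names.
-- /S = server, /E = trusted connection, /N = package name.
DECLARE @cmd varchar(255)
SET @cmd = 'dtsrun /S MyServer /E /N "MyPackage"'
EXEC master..xp_cmdshell @cmd
```

Run this way, the package executes on the database server, so the web server never needs the DTS COM objects (the "DTS.Package" ProgID) registered at all.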
DTS error on GUID column
I am doing a DTS from table "source" to table "destination". However, "destination" has a primary key of data type uniqueidentifier and the source doesn't have a corresponding map field. So I thought setting the "destination" primary key to have a default value of newid() would automatically insert a GUID for every record inserted (just like a regular identity key), but unfortunately this doesn't work. I get a message that says "cannot insert null into the field fieldName", which makes sense because it is the PK, but why doesn't newid() generate an automatic GUID? Or is it even possible? Or could I change the transformation script to make it work, and if so, how? Thanks for any help.

On the guid primary key column I undid the primary key constraint, set allow null, and set the default value to newid(); DTS worked fine and new guids got inserted. After the DTS I set all constraints back as original. So everything's good.
"Naveen" wrote:
> I am doing a DTS from table "source" to table "destination". However
> "destination" has a primary key of data type uniqueidentifier and the source
> doesn't have a corresponding map field. So I thought setting the
> "destination" primary key to have a default value of newId() will
> automatically insert a guid for every record inserted (just like a regular
> identity key) but unfortunately this doesn't work. I get a message that says
> "cannot insert null into the field fieldName" which makes sense because it is
> the PK but why doesn't the newId() generate an automatic GUID? Or is it even
> possible? or could I change the transformation script to make it work and if
> so, how? Thanks for any help.
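The workaround described above can be sketched roughly as follows (table, column, and constraint names are hypothetical). Note that a column default only fires when the column is omitted from the INSERT; if the DTS transformation maps an explicit NULL into the column, the NEWID() default is never applied, which is consistent with the original error.

```sql
-- Hypothetical names throughout. Drop the PK and allow NULLs so the load
-- can proceed, with NEWID() supplying a value for the unmapped column.
ALTER TABLE destination DROP CONSTRAINT PK_destination
ALTER TABLE destination ALTER COLUMN id uniqueidentifier NULL
ALTER TABLE destination ADD CONSTRAINT DF_destination_id DEFAULT NEWID() FOR id

-- ... run the DTS transfer here, leaving the id column unmapped ...

-- Restore the original constraints afterwards.
ALTER TABLE destination ALTER COLUMN id uniqueidentifier NOT NULL
ALTER TABLE destination ADD CONSTRAINT PK_destination PRIMARY KEY (id)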
Friday, March 9, 2012
DTS ActiveX BCP file manipulation
A file comes into a directory each day with a date-based name, e.g. filename042707.
I use the fileexist stored procedure to check for its existence.
I use the xp_cmdshell stored procedure to rename the file so that it just has "filename" instead of the date, but I can not use variables within xp_cmdshell to replace the date.
Every day the filename changes: filename042807, filename042907, etc.
Basically I know how to copy the file to another directory if it exists; then I want to strip the right characters off and rename it to just that, but with the rename I have to know what the file will be named for that day.
Excuse the grammar, just somewhat tired...
Any suggestions please... thanks, time for sleep.

"DTS ActiveX BCP file manipulation"
What on earth are you doing with ActiveX in SQL Server?
|||
Not sure where you are facing the problem. You write:
"I can not use variables within the xp_cmdshell to replace the date..."
You can use variables within xp_cmdshell like below:
declare @cmd varchar(100)
set @cmd = 'ren d:\file042807.xls file' + replace(convert(varchar, getdate(), 1), '/', '') + '.xls'
exec master..xp_cmdshell @cmd
|||
"What on earth are you doing with ActiveX in SQL Server?"
Talking about the ActiveX of DTS...
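Going the other direction, the original poster wants to strip today's date from the incoming file rather than add it. A sketch under the same conventions (the directory, base name, and extension are assumptions); CONVERT style 1 yields mm/dd/yy, which becomes the six-digit MMDDYY suffix once the slashes are removed:

```sql
-- Hypothetical path. Builds today's expected filename (MMDDYY suffix,
-- as in filename042807) and renames it to the bare name.
DECLARE @cmd varchar(200), @suffix varchar(6)
SET @suffix = REPLACE(CONVERT(varchar(8), GETDATE(), 1), '/', '')
SET @cmd = 'ren d:\incoming\filename' + @suffix + '.xls filename.xls'
EXEC master..xp_cmdshell @cmd
```

Because the suffix is computed from GETDATE(), the same statement works every day without knowing the filename in advance.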
Tuesday, February 14, 2012
DT_NTEXT pass through columns in fuzzy lookup transformation
The documentation on the fuzzy lookup transform mentions that only columns of type DT_WSTR and DT_STR can be used in fuzzy matching. I interpreted this as meaning that you could not create a mapping between an input column of type DT_NTEXT and a column from the reference table. I assumed that you could still have a DT_NTEXT column as part of the input and mark this as a pass-through column so that its value could be inserted in the destination, together with the result of the lookup operation. Apparently this is not the case. Validation fails with the following message: 'The data type of column 'fieldname' is not supported.' First, I'd like to confirm that this is really the case and that I have not misinterpreted this limitation.
Finally, given the following situation
- A data source with input columns
Field_A DT_STR
Field_B DT_NTEXT
- A fuzzy lookup is used to match Field_A to a row in the reference table and obtain Field_C.
- Finally, Field_B and Field_C must be inserted into the destination.
Can anyone suggest how this could be achieved?
Fernando Tubio
One possible workaround is using a multicast transform to route the input columns around the lookup transform. A merge join transform can then be used to join the outputs from the multicast and the fuzzy lookup to include the DT_NTEXT field back into the data flow.
I've tried this solution and it works but I wonder if it is really necessary to resort to all these contortions.
|||
It looks like your workaround is the best approach. The Fuzzy lookup does not support DT_NTEXT, DT_TEXT, or DT_IMAGE columns as copy columns OR pass-through columns. I am not sure why that is, but I will try to find out.
Mark
|||
I guess one of the reasons was performance, and that these columns require special handling. You might want to put in a request for this feature for a future release.
Thanks
Mark
Thank you Mark.
Considering my limited knowledge about the inner workings of the data flow pipeline I am very likely wrong, but I would have guessed that a pass-through operation merely involved copying some pointers around. In any case, the package creator can control which columns to pass-through and if he is concerned with performance, then he is in a better position to decide whether to include these columns in the output. So I guess it would be nice to have this choice in a future release.
Fernando Tubio
DT_DBTIME to DateTime conversion
I have a column of DT_DBTIME data type in my Flat File source. This column is mapped to DateTime column in the SQL server destination table. A problem is that time values '12:06:39' and '13:37:45' are converted to the dates '12/30/1899 12:06:39 PM' and '12/30/1899 1:37:45 PM' correspondingly.
When I try a similar conversion from inside Management Studio:
SELECT CAST('12:06:39' as DateTime), CAST('13:37:45' as DateTime)
I get the following results:
'1900-01-01 12:06:39.000' and '1900-01-01 13:37:45.000'
As you can see, the date portion is different, and I expect that January 1st, 1900 is correct for this conversion. So, my questions are:
1. Why do I get the "previous" day in the Integration Services transformation?
2. How do I fix this problem?
Thank you,
Vlad
Don't know why it is happening, but it seems like you could use DateAdd in a derived column pretty easily to add 2 days.
|||
Microsoft SQL Server
Values with the datetime data type are stored internally by the Microsoft SQL Server 2005 Database Engine as two 4-byte integers. The first 4 bytes store the number of days before or after the base date: January 1, 1900.
SSIS
The DT_DATE data type is implemented using an 8-byte floating-point number. Days are represented by whole number increments, starting with 30 December 1899, and midnight as time zero.
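The two base dates above are exactly two days apart, which matches the offset observed in the question. A quick check runnable in Management Studio (the corresponding fix inside the data flow would be a derived column applying the same two-day DATEADD):

```sql
-- Gap between the SSIS DT_DATE zero point (30 December 1899) and the
-- SQL Server datetime base date (1 January 1900):
SELECT DATEDIFF(day, '18991230', '19000101') AS day_offset   -- 2

-- Shifting a value that arrived with the 1899-12-30 date portion:
SELECT DATEADD(day, 2, CAST('1899-12-30 12:06:39' AS datetime))
-- 1900-01-01 12:06:39.000
```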