Joined: 24 Nov 2008
Posted: Mon Nov 24, 2008 4:43 pm Post subject: Execute exported .sql scripts generated by SQL Data Compare?
I'm trying to merge two databases using the scripts generated by SQL Data Compare, but I've run up against a wall and I'm not sure how to get around it. The two databases are part of a replication system, and a series of unique ID columns causes the synchronization wizard to error out.
Exporting the results to a .sql script works great. The problem is that the script ends up being over 500 MB in size. I then take this script and manually modify the IDs that are causing me issues, using search and replace.
The problem I'm having is executing the script after the changes are made. SSMS can't handle anything that large, and SQLCMD gives me "Sqlcmd: Error: Scripting error." whenever I try to use the .sql script as input. At first I thought it was related to my search and replace on the IDs, but if I take the file exported by SQL Data Compare, unmodified, and use it as the input file, I get the same error, so the problem is obviously somewhere else. Is there an easier/better way to execute these scripts?
Joined: 24 Oct 2006
Location: Red Gate Software Ltd.
Posted: Mon Dec 01, 2008 3:17 pm Post subject:
Unfortunately SQL Server Management Studio is unable to run such large scripts. The way around this is to switch off transactions in SQL Data Compare and obtain the script in the usual way. You then have to parse the script in an application, written in, say, Visual Basic or Perl, that partitions the SQL script into smaller chunks and sends the inserts to SQL Server in batches. As you will not have any transactional integrity, you should take a backup before you attempt to run the application.
Unfortunately there isn't a facility in SQL Data Compare to do this at the moment, but I believe it has been logged as a design request (SDC-799).
Other workarounds are to use a WHERE clause to restrict the amount of data being synchronized, or to select fewer objects and run several smaller syncs.
A VBScript might look like this; it injects a GO statement every 100 lines:
Const ForReading = 1, ForWriting = 2
Dim fso, fin, fout, i, lineinterval, stuff
lineinterval = 100
i = 0
Set fso = CreateObject("Scripting.FileSystemObject")
Set fin = fso.OpenTextFile("input.sql", ForReading, False)
Set fout = fso.CreateTextFile("output.sql", True) ' Create the output file
fout.WriteLine "--Start of script"
Do While Not fin.AtEndOfStream
    stuff = fin.ReadLine()
    fout.WriteLine stuff
    i = i + 1
    If i = lineinterval Then
        fout.WriteLine "GO" ' End the current batch
        i = 0
    End If
Loop
fout.WriteLine "--End of script"
fin.Close
fout.Close
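For anyone without Windows Script Host to hand, the same batching idea can be sketched in Python. This is a minimal sketch, not an official utility: the file names and the 100-line interval are placeholders to adjust, and it splits on line boundaries only, so it assumes no single statement spans a batch boundary (SQL Data Compare scripts generally put one statement per line).

```python
# Sketch: inject a "GO" batch separator every `interval` lines of a large
# .sql script, so each batch stays small enough for sqlcmd to handle.
# File names below are placeholders, not part of any tool's output.

def inject_go(in_path, out_path, interval=100):
    with open(in_path, "r", encoding="utf-8") as fin, \
         open(out_path, "w", encoding="utf-8") as fout:
        fout.write("--Start of script\n")
        for count, line in enumerate(fin, start=1):
            fout.write(line)
            if count % interval == 0:
                fout.write("GO\n")  # end the current batch
        fout.write("--End of script\n")
```

The resulting file can then be fed to sqlcmd as usual; because GO ends each batch, the client never has to hold the whole 500 MB script as a single batch.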