Red Gate forums :: View topic - Command-line ignoring include/exclude options
SQL Packager 6
SQL Packager 6 forum

Command-line ignoring include/exclude options

daveburns



Joined: 26 Oct 2010
Posts: 3

PostPosted: Tue Oct 26, 2010 6:44 pm    Post subject: Command-line ignoring include/exclude options Reply with quote

I'm running 6.3.1.96 on Windows XP against SQL Server 2008. (I'm using the latest patch because things failed with the latest supported release, though I can't remember the details right now.)

I'm trying to run Packager from the command line so I can automate nightly backups, and it fails because one of my tables does not have a primary key. I'm not sure why this matters, but fine, I'll exclude the table for now to get things working. I add /excludedata:table:foo to the command line (no other include or exclude options), and yet I still get the error:

Error: Cannot uniquely identify row in table 'foo' because there is no primary key or unique index defined on it. Output script generation failed.

For the purposes of just getting my nightly automation working, I've also tried including only a single table with /includeschema:Table:blah /includedata:Table:blah. With /verbose, I can see that it only scripts that table's schema, but then it gets to "Comparing databases" and takes forever. It turns out it's not frozen; it's comparing the entire database, which ends in the same error as above.

Is this known, is there a workaround, and is there anything I can do to help diagnose further?

db
Chris Auckland



Joined: 24 Oct 2006
Posts: 755
Location: Red Gate Software Ltd.

PostPosted: Thu Oct 28, 2010 7:03 pm    Post subject: Reply with quote

Thanks for your post and sorry you're having this issue with SQL Packager.

I'm not sure SQL Packager is the ideal choice for an automated backup utility. I would recommend using SQL Server's native backup, or a third-party backup tool, for this job. If you're using SQL Server Express and can't schedule a backup with SQL Server Agent, you could use the command-line interface to SQL Server (SQLCmd.exe) and schedule a backup from a batch file triggered by a Windows scheduled task.
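To sketch that last suggestion (the server, database, and path names here are placeholders, so adjust them to your environment), the batch file could be as simple as:

```shell
@echo off
REM nightly_backup.bat - back up a database using SQLCmd.exe.
REM The server, database, and file path below are placeholders.
sqlcmd -S .\SQLEXPRESS -E -Q "BACKUP DATABASE [MyDatabase] TO DISK = N'C:\Backups\MyDatabase.bak' WITH INIT"
```

You would then point a Windows scheduled task at the batch file (via Control Panel > Scheduled Tasks, or schtasks.exe) to run it nightly.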

Anyway, with regards to:

Quote:
Error: Cannot uniquely identify row in table 'foo' because there is no primary key or unique index defined on it. Output script generation failed


I've seen this kind of error before if a table contains rows that only differ by the data in a BLOB column. Could you send me the table structure for table 'foo'?
_________________
Chris
daveburns



Joined: 26 Oct 2010
Posts: 3

PostPosted: Thu Oct 28, 2010 10:43 pm    Post subject: Reply with quote

Chris -

I agree SQL Packager is not ideal, but I'm curious what you would recommend for my client's situation: they're using SQL Server in a shared hosting environment where they cannot run a native backup. The hosting provider offers backups at $5 a pop; if you want daily backups, that gets expensive quickly, so you need to find another way.

My initial idea was to replicate the database to one on my machine using SQL Compare and SQL Data Compare and then back up that copy with a native backup. I ran into a hitch when I realized my client did not buy the Pro versions of those, so I couldn't automate that approach. SQL Packager will run from the command line, though, and the backups *seem* OK - nicely compressed and mostly portable as a handful of files.

Would you recommend a different approach given the constraints?

As for the table structure, it's pasted from SSMS below ("Checked" means the column allows NULLs). There were no indexes on this table; I've since worked around the problem by placing a unique key on UNIQUE_ID.

Column       Data type        Allow Nulls
UNIQUE_ID    nvarchar(12)     Checked
CTIME        smalldatetime    Checked
ETIME        smalldatetime    Checked
MTIME        smalldatetime    Checked
[USER]       nvarchar(128)    Checked
TYPEID       float            Checked
TYPE         nvarchar(50)     Checked
REGARDING    ntext            Checked
USER_TIME    smalldatetime    Checked
ATTACHMENT   nvarchar(255)    Checked
CONTACTID    nvarchar(12)     Checked
GROUPID      nvarchar(12)     Checked
RECORDMGR    nvarchar(50)     Checked
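For anyone else who hits this, the workaround was along these lines (the table and constraint names here are placeholders for my real ones):

```sql
-- Add a unique key on UNIQUE_ID so rows can be uniquely identified.
-- Table and constraint names are placeholders.
ALTER TABLE dbo.foo
    ADD CONSTRAINT UQ_foo_UNIQUE_ID UNIQUE (UNIQUE_ID);
```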

Thanks,
db
Chris Auckland



Joined: 24 Oct 2006
Posts: 755
Location: Red Gate Software Ltd.

PostPosted: Tue Nov 02, 2010 7:58 pm    Post subject: Reply with quote

I suspect the problem is that some rows are identical apart from a possible difference in the ntext column.

The comparison engine cannot use a BLOB column as part of the comparison key, so when all the other columns are identical it can't uniquely identify a row, because it can't take into account what's in the BLOB column. It wouldn't matter if there were fully duplicate rows; it's the uncertainty about what's in the BLOB (ntext) column that stops things.

Putting a unique index on the table solves the problem, as the rows can then be uniquely identified.

Can you check if you have any duplicate rows in the table?
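Something along these lines would show them - it groups on every column except the ntext one, since that's the column the comparison engine can't use (the table name is a placeholder):

```sql
-- Find rows that are identical in every column except REGARDING (ntext).
SELECT UNIQUE_ID, CTIME, ETIME, MTIME, [USER], TYPEID, [TYPE],
       USER_TIME, ATTACHMENT, CONTACTID, GROUPID, RECORDMGR,
       COUNT(*) AS duplicate_count
FROM dbo.foo
GROUP BY UNIQUE_ID, CTIME, ETIME, MTIME, [USER], TYPEID, [TYPE],
         USER_TIME, ATTACHMENT, CONTACTID, GROUPID, RECORDMGR
HAVING COUNT(*) > 1;
```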
_________________
Chris
daveburns



Joined: 26 Oct 2010
Posts: 3

PostPosted: Tue Nov 02, 2010 8:31 pm    Post subject: Reply with quote

Chris -

Thanks for your reply but I've had to move on since I found the workaround. I'd very much like to hear your thoughts on the overall backup constraints I described above though.

Thanks,
db
Chris Auckland



Joined: 24 Oct 2006
Posts: 755
Location: Red Gate Software Ltd.

PostPosted: Tue Nov 02, 2010 8:50 pm    Post subject: Reply with quote

I think $5 for a backup is scandalous, but they really have you by the soft and danglies if that's their policy.

If they deny you any backup privileges, then backing up the database with scripts is probably your only other option.

You could have a database on your own server and use a combination of SQL Compare and SQL Data Compare to keep the two databases in sync on a schedule. If you run the sync fairly often, the changes will be small and there is less chance of things going wrong. You can then take as many backups of your local copy as you like.
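As a rough sketch of how that could be scheduled with the Professional editions' command line (the switch names here are from memory, so please check them against the command-line documentation; server and database names are placeholders):

```shell
REM Sync schema, then data, from the hosted database to a local copy,
REM which you can then back up natively. Switch names are from memory.
sqlcompare /server1:hostedserver /database1:ClientDb /server2:(local) /database2:ClientDbCopy /synchronize
sqldatacompare /server1:hostedserver /database1:ClientDb /server2:(local) /database2:ClientDbCopy /synchronize
```

I appreciate the command line needs the Professional editions, which you mentioned your client doesn't have, so this is only worthwhile if upgrading works out cheaper than the per-backup charges.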

I hope this helps.
_________________
Chris