In a previous article, The Science of Compliance, I gave a general summary of the compliance issues that concern DBAs, to give a picture of the range of actions required of them. In this follow-on article, I’ll home in on just one of the new requirements, the independent cross-check by internal control, and offer a solution. The term ‘internal control’ refers to a system of checks and balances that provides independent cross-checks to prevent fraud, misreporting and tampering.
When you are tracking fraud or errors in a financial database system, you need to cross-check between at least two independent sources. The two sources will reconcile if there has been no tampering unless, of course, both sources have been altered. The task for any system administrator is to make sure this can’t happen, and to be able to provide convincing documentation that it can’t happen.
If a financial database system is properly designed, all the records in the base tables will be time-stamped, and there will be a log table entry corresponding to each financial transaction. This gives you a record of who did what, when, how, where and why, and a fairly independent cross-check, so that any illicit alteration of one of the sources will immediately raise an alert. You may be satisfied with this, but how can you be certain, and how can you assure a nervous auditor that it is impossible to introduce inaccuracies?
You can’t. Can you? What if an intruder, knowing the potential profit from ‘white-collar’ fraud, works out all your internal cross-checks in order to cover up the tracks of a crime?
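By ‘internal cross-checks’ I mean something like the following. This is a minimal, purely hypothetical sketch of the sort of design described above; the table and column names are my own invention, not a prescription.

    -- Hypothetical schema: every ledger row is time-stamped and attributed.
    CREATE TABLE dbo.LedgerEntry (
        LedgerEntryID int IDENTITY(1,1) PRIMARY KEY,
        AccountID     int      NOT NULL,
        Amount        money    NOT NULL,
        EntryDate     datetime NOT NULL DEFAULT CURRENT_TIMESTAMP,  -- when
        EnteredBy     sysname  NOT NULL DEFAULT SUSER_SNAME()       -- who
    );

    -- Every financial transaction also writes a row to a separate log table,
    -- recording who did what, when, where and why.
    CREATE TABLE dbo.TransactionLog (
        TransactionLogID int IDENTITY(1,1) PRIMARY KEY,
        LedgerEntryID    int          NOT NULL REFERENCES dbo.LedgerEntry,
        Action           varchar(10)  NOT NULL,                            -- what
        LoggedAt         datetime     NOT NULL DEFAULT CURRENT_TIMESTAMP,  -- when
        LoggedBy         sysname      NOT NULL DEFAULT SUSER_SNAME(),      -- who
        HostName         sysname      NOT NULL DEFAULT HOST_NAME(),        -- where
        Reason           varchar(200) NULL                                 -- why
    );

    -- The cross-check itself: any ledger entry without a matching log row
    -- (or vice versa) is immediately suspicious.
    SELECT l.LedgerEntryID
    FROM   dbo.LedgerEntry l
           LEFT JOIN dbo.TransactionLog t ON t.LedgerEntryID = l.LedgerEntryID
    WHERE  t.TransactionLogID IS NULL;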
The problem with this cross-check is that the data is held within the application, and any breach of the application’s security will invalidate it. This is why, in pursuit of the goal of conforming as closely as possible to SOX (the Sarbanes-Oxley Act), you are obliged to put an ‘audit’ process in place that is independent of the application itself. This process has to ensure that, when an audit drills down to a transaction, the auditor, or investigator, can check that the transaction happened at the time recorded and that none of the details have been retrospectively changed. The auditor might also wish to check that, on a particular date, there were no other transactions that were subsequently erased.
This process doesn’t have to be expensive or time-consuming.
The first stage is to encrypt your backups to ensure that they cannot be altered. Then you need to retain them over the period for which an audit may be required, which means high compression if the storage requirement is not to become unmanageable. You also need to be doubly certain that backups are verified and can be restored if required. I use SQL Backup for this, as it is all done automatically, is easily monitored, and the high-compression mode saves a lot of storage cost for a corporate database. Most third-party backup systems on the market nowadays will allow you to do this part of the process.
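As an illustration, an encrypted, compressed and verified backup with SQL Backup is a single command. The ‘-SQL’ command string below is the syntax as I remember it, with placeholder names, paths and passwords, so check the documentation for your version rather than taking my word for it.

    -- A sketch only: database name, path, password and option values are
    -- placeholders. COMPRESSION = 3 asks for high compression, PASSWORD and
    -- KEYSIZE encrypt the backup, and VERIFY checks it can actually be restored.
    EXECUTE master..sqlbackup
        '-SQL "BACKUP DATABASE [Ledger]
               TO DISK = ''E:\SQBBackups\<AUTO>.sqb''
               WITH COMPRESSION = 3, PASSWORD = ''StrongBackupPassword'',
                    KEYSIZE = 256, VERIFY"';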
Imagine that a questionable series of transactions is being investigated. Luckily, these transactions have not been archived off and are in the current database. The auditor asks for confirmation that the records that make up the transactions have not been altered since they were inserted all those weeks ago.
There is still a potential snag. Restoring a production database just to investigate a table, or part of a table, is a time-consuming and expensive business. It needs hardware, and the restored database may need all manner of ancillary applications in order to function properly. If the transactions took place over several weeks, you may be faced with the task of restoring several copies of the database, all to confirm that the transactions as originally recorded match what is currently in the database and have not been altered since; in other words, that they have value as evidence.
The solution, with SQL Data Compare 6.1, is remarkably simple. The DBA need only go to the full backup closest to, and after, the insertion date of the transaction and compare the records in the current database with those in the backup, without any need to restore the database. If there has been a regular full backup, then the problem is solved. Quite often, an investigation will be messier than this; there will be several backups to check. Here the command-line version makes the operation very quick, and easily scripted. Once you have identified the first full backup after the transactions were made, you may still need to look at the relevant transaction log to identify the actual transactions.
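For the record, the comparison against a backup from the command line looks something like the line below. I should stress that the switch names, paths and table names here are illustrative rather than gospel; check the help output of SQLDataCompare.exe, or the documentation, for your build before copying anything.

    "C:\Program Files\Red Gate\SQL Data Compare 6\SQLDataCompare.exe" ^
        /backup1:"E:\SQBBackups\Ledger_FULL_20070901.sqb" ^
        /backuppasswords1:StrongBackupPassword ^
        /server2:PRODSQL01 /database2:Ledger ^
        /include:Table:\[LedgerEntry\] /include:Table:\[TransactionLog\] ^
        /out:E:\Audit\LedgerCheck.txt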
In this way, one can quickly provide evidence of the original transactions that are under investigation. A properly encrypted backup must be one of the hardest pieces of evidence to tamper with, so it makes an excellent reference copy. Because it is compressed, it is far easier to store for the retention period dictated by the relevant regulations. And because the data in these backups can be read without significant cost, it provides an excellent cross-check against your application’s own auditing system.
Put yourself in the shoes of a fraudster. He, or she, will have, for example, siphoned off money from a client account into another account by a series of transactions, and the balance of that account would then be transferred to a private bank account. The only task, in a poorly audited system, is to gain administrative access to the database (only too easy, I fear, but I won’t tell you how) and alter the records in the ledger tables. It is so easy that one yearns for the days of permanent ink in the numbered leaves of leather-bound ledgers.
The problem for the fraudster only comes when there is a well-encrypted and compressed backup. This is the nearest one can get to read-only media, especially if the data is held in XML, which Microsoft stores in a semi-compiled form for easy indexing. I couldn’t even begin to think how to alter the backed-up tables to reflect the tampered tables exactly.
When the alarm goes up, the DBA would merely have to check the tables against the backups. I used to use a number of command line utilities with the Microsoft ‘Native’ backup, for the encryption/decryption and compression/decompression. The problem with this was that not only did one need to restore the database, along with ‘all the trimmings’, but one invariably ran into space problems as the process of ‘hydrating’ the backup used masses of temporary disk space.
Now that, as a ‘Friend of Red-Gate’, I’ve been given a shiny new copy of SQL Data Compare 6.1 (6.0 would compare to a backup file if one was prepared to wait a wee while, but 6.1 goes like a rocket), it has set me thinking. What about a scheduled process that runs the application in command-line mode and checks all time-stamped financial records against the backup? That would trigger an alarm whenever tampering occurred. Hmmm. Fine in a bought-in package that one couldn’t alter, but in a properly constructed database you’ll have already set a number of traps to alert you to any tampering with historical data. You haven’t? Dear me, maybe that is the subject for another Simple-Talk article.
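In the meantime, here is the sort of scheduled check I have in mind, sketched as a batch file around the same comparison as before. Everything in it is an assumption on my part: the paths, the switch names, the need to restrict the comparison to rows time-stamped before the backup, and the convention that a non-zero exit code means differences were found or something went wrong. Treat it as an outline, not a recipe.

    @echo off
    rem A rough sketch of a scheduled tamper check - an outline, not production code.
    rem The paths, switch names and exit-code convention are my assumptions; check
    rem the SQL Data Compare 6.1 documentation before relying on any of them.
    rem In practice you would also restrict the comparison to rows time-stamped
    rem before the backup (via a saved project, say), otherwise legitimately new
    rem rows would set off the alarm every night.

    "C:\Program Files\Red Gate\SQL Data Compare 6\SQLDataCompare.exe" ^
        /backup1:"E:\SQBBackups\Ledger_FULL_latest.sqb" ^
        /backuppasswords1:StrongBackupPassword ^
        /server2:PRODSQL01 /database2:Ledger ^
        /include:Table:\[LedgerEntry\] ^
        /out:E:\Audit\NightlyLedgerCheck.txt /force

    rem Raise the alarm however your shop normally does; a log entry will do here.
    if errorlevel 1 echo %date% %time% Ledger check reported differences or failed >> E:\Audit\Alerts.log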
For me, the important point about comparing directly with encrypted backups is that it is good evidence that would convince an auditor and, I believe, impress a court of law.