SQLite
Although I dearly love using SQL Server, I wouldn’t use it in every circumstance; there are times, for example, when it just isn’t necessary to use a server-based RDBMS for a data-driven application. The open-source SQLite is arguably the most popular and well-tested database ever. It is probably on your phone and used by your browser; iTunes uses it too. Most single-user applications that need to handle data use SQLite, because it is so reliable and so easy to install.
It is specifically designed as a zero-configuration, embedded, relational database with full ACID compliance and a good, simple dialect of SQL92. SQLite accesses its storage files directly: a single library, written in C, contains the entire database system. Creating a SQLite database instance is as easy as creating a simple cross-platform file that holds the entire database. It requires no administration.
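To give a flavour of how little ceremony is involved, here is a minimal sketch that opens a throwaway in-memory database from PowerShell via the System.Data.SQLite ADO.NET provider; the DLL path is an assumption, so point it at wherever you installed the provider.

# a minimal sketch only: an in-memory SQLite database, created, used and discarded
# the path to System.Data.SQLite.dll is an assumption - change it to your own install
Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll"
$con = New-Object System.Data.SQLite.SQLiteConnection
$con.ConnectionString = "Data Source=:memory:" # no file, no server, no configuration
$con.Open()
$cmd = $con.CreateCommand()
$cmd.CommandText = "CREATE TABLE Scratch (id INTEGER PRIMARY KEY, name TEXT)"
$null = $cmd.ExecuteNonQuery()
$cmd.CommandText = "INSERT INTO Scratch (name) VALUES ('first'),('second')"
$null = $cmd.ExecuteNonQuery()
$cmd.CommandText = "SELECT count(*) FROM Scratch"
$cmd.ExecuteScalar()  # returns 2; the database disappears when the connection closes
$con.Close()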
A huge range of utility applications are available to make the creation and administration of databases, and the maintenance of the data, easy. In scripts, I can pass tables and their data easily between SQL Server and SQLite, using ODBC, a .NET library, or a nice, simple command-line tool that is reminiscent of SQLCMD. Of course, there is some compromise, since there aren’t nearly as many data types in SQLite and, as in many NoSQL databases, values are stored with loose, dynamic typing rather than rigid column types. There are plenty of tools around for working on SQLite databases if you’re not a programmer by inclination.
There are reasons for preferring SQL Server once your application grows, because this performance and simplicity come at a price. Although SQLite will happily handle databases of over a gigabyte, each database is limited to a single file. Where you need concurrency, ACID compliance isn’t really enough, because it relies on file locks, so write transactions are effectively serialized. There is no access control beyond that provided by the operating system, no practical way of using SQLite in a client-server environment, and no built-in way of replicating the data or providing redundancy.
It is the obvious way of handling data in an application, and of transferring data between applications. It is very portable, easy to use, compact, efficient, and reliable. Because a SQLite database is a single file, it can be used much as you would use an XML file: you can send it as an attachment, query it, and export and import data, all using the SQL you already know. It is therefore a great tool for the DBA. I use it for storing all the test data for Continuous Integration processes, but it has value wherever you would otherwise have to use XML or JSON. If you are within the SQL and relational mindset, there is no good technical reason to change, since SQLite takes care of all the small stuff and the new breed of NoSQL databases takes care of the jelly-masses of unstructured data.
Most applications use the SQLite C API to interface with the SQLite database, and many scripting and programming languages have libraries that wrap it. Python has its sqlite3 module; so do Ruby and Haskell. There is the Perl module DBD::SQLite, and PHP’s data objects (PDO) with the SQLite driver. For .NET programmers, SQLite presents a number of different ways in. You can perform just about every operation using the command-line interface. You can opt for the ADO.NET data provider System.Data.SQLite, which works like other data providers and provides a great deal of functionality. PowerShell has a SQLite provider, and there are ODBC and JDBC drivers. There is even a JavaScript port (https://github.com/kripken/sql.js) and a Java port (https://sqljet.com/). You name it, it can use SQLite.
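To illustrate the command-line route, here is a hedged sketch that drives the sqlite3 shell from PowerShell to export a query as CSV; the locations of sqlite3.exe and of the database file are assumptions for the example.

# a sketch of driving the sqlite3 command-line shell from PowerShell
# the paths to sqlite3.exe and to the database file are assumptions - change them to suit
@'
.headers on
.mode csv
.output authors.csv
SELECT au_fname, au_lname, city, state FROM authors;
.output stdout
.quit
'@ | & 'C:\sqlite\sqlite3.exe' 'C:\data\Pubs.db'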
There is plenty of information on how to use SQLite. Books abound, such as O’Reilly’s ‘Using SQLite’, Apress’s ‘The Definitive Guide to SQLite’, Sams’s ‘SQLite’, and a host of others.
SQLite and SQL Server
If, when you are working with SQL Server or any other mainstream RDBMS, you are using CSV or XML for storing data sets, it is worth considering SQLite instead. SQLite databases are just files, and so are easy to copy and transmit. They can be linked to SQL Server via ODBC, but there are plenty of utilities that allow you to transfer data between SQLite and SQL Server.
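If you do want to link a SQLite file to SQL Server, something along the lines of the following sketch should do it, though the details are assumptions: the ODBC driver name (‘SQLite3 ODBC Driver’), the file path and the instance name all depend on your own setup.

# a sketch only: register a SQLite file as a linked server over ODBC and query it
# the driver name, file path and instance name are assumptions - adjust them to your setup
$connStr = 'Data Source=MyInstance;Initial Catalog=master;Integrated Security=true'
$conn = New-Object System.Data.SqlClient.SqlConnection $connStr
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = @"
EXEC master.dbo.sp_addlinkedserver
     @server = N'SQLITE_PUBS', @srvproduct = N'', @provider = N'MSDASQL',
     @provstr = N'DRIVER=SQLite3 ODBC Driver;Database=C:\data\Pubs.db;';
"@
$null = $cmd.ExecuteNonQuery()
# now let SQLite run the query and hand the rows back through the linked server
$cmd.CommandText = "SELECT * FROM OPENQUERY(SQLITE_PUBS, 'SELECT au_lname, city FROM authors')"
$adapter = New-Object System.Data.SqlClient.SqlDataAdapter $cmd
$table = New-Object System.Data.DataTable
$null = $adapter.Fill($table)
$table | Format-Table -AutoSize
$conn.Close()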
To play around with SQLite, I’ve included with this article a SQLite version of the old Pubs database. This is the very fictional ‘publishers, publications and authors’ database that we used to cut our teeth on before AdventureWorks and NorthWind. I’ve added a lot more data to it using SQL Data Generator, and transferred it to SQLite. With the SQLite ODBC driver, you can attach to it from MS Access or LibreOffice Base, query it from Excel, or link to it from SQL Server. You won’t need me to tell you how to do all that. We’re going to query it using PowerShell.
A simple query with results
The most useful way of accessing SQLite from scripts is to use the ADO.NET provider.
#I've installed the 64 bit System.Data.SQLite ADO.NET data provider in this directory
Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll"
#I just create a connection to my existing database
$con = New-Object -TypeName System.Data.SQLite.SQLiteConnection
# I then give it a simple connection string
$con.ConnectionString = "Data Source=MyDatabasePath" # CHANGE THIS
#and open the connection
$con.Open()
#We'll just start by creating a SQL statement in a command
$sql = $con.CreateCommand()
$sql.CommandText = "
  SELECT au_fname ||' '|| au_lname as Author, title as Title, notes as Notes
    FROM authors as a
      inner join TitleAuthor as ta on ta.au_id=a.au_id
      inner join Titles t on ta.title_id=t.title_id;
  "
# we now execute the SQL and return the results in a dataset
$adapter = New-Object -TypeName System.Data.SQLite.SQLiteDataAdapter $sql
#we create the dataset
$data = New-Object System.Data.DataSet
#and then fill the dataset
[void]$adapter.Fill($data)
#we can, of course, then display the first one hundred rows in a grid
$data.Tables[0].Rows | Select-Object -First 100 | Out-GridView -Title 'Authors and books'
If you want to try out a few queries, here is the diagram of the database:
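While you have it open, here is one more query to try, re-using the $con connection from the snippet above; I’m assuming the SQLite copy keeps the original Pubs table and column names.

# another query against the same connection: titles and sales totals per publisher
$sql = $con.CreateCommand()
$sql.CommandText = "
  SELECT p.pub_name AS Publisher,
         count(DISTINCT t.title_id) AS Titles,
         sum(s.qty) AS QtySold
    FROM publishers p
      INNER JOIN titles t ON t.pub_id = p.pub_id
      LEFT JOIN sales s ON s.title_id = t.title_id
    GROUP BY p.pub_name
    ORDER BY QtySold DESC;"
$adapter = New-Object -TypeName System.Data.SQLite.SQLiteDataAdapter $sql
$publisherSales = New-Object System.Data.DataSet
[void]$adapter.Fill($publisherSales)
$publisherSales.Tables[0] | Out-GridView -Title 'Sales by publisher'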
Sending a SQLite query result to SQL Server and saving it in a table
We can, of course, send a result to SQL Server (it isn’t too easy to infer the right SQL Server data type for each column in a SQLite result, because SQLite’s typing is so loose).
<#
    .DESCRIPTION
    A script to execute a select statement on a SQLite database and copy the result to a SQL Server table
#>
$SourceDatabasePath = "MyDatabasePath" # CHANGE THIS - the path to the SQLite database file
$title = 'Authors and books'
$sqlString = "
  SELECT au_fname ||' '|| au_lname as Author, title as Title, notes as Notes
    FROM authors as a
      inner join TitleAuthor as ta on ta.au_id=a.au_id
      inner join Titles t on ta.title_id=t.title_id;
  "
$DeleteContentsOfTableBeforeCopy = $false #if the table exists, do we clear out existing data?
$DestinationTable = 'AuthorsAndBooks' #the name of the table to put it in
$Destinationinstance = 'MyInstance' #the name of the server or instance
$Destinationdatabase = 'MyDatabase' #the name of the database where you want to put the data
$DestinationWindowsSecurity = $true #or $False if you aren't using Windows security
$DestinationUserID = '' #the name of the SQL Server user if not integrated security
$PrecisionForNumericData = 1
if (!(Test-Path $SourceDatabasePath))
{
  Write-Error "Can't find '$SourceDatabasePath'. Sorry, can't proceed because of this"
  exit
}
#I've installed the 64 bit System.Data.SQLite ADO.NET data provider in this directory
Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll"
#I just create a connection to my existing database
$con = New-Object -TypeName System.Data.SQLite.SQLiteConnection
# I then give it a simple connection string
$con.ConnectionString = "Data Source=$SourceDatabasePath"
#and open the connection
$con.Open()
#We'll just start by creating a SQL statement in a command
$SourceSQLCommand = $con.CreateCommand()
$SourceSQLCommand.CommandText = $SQLString
# we now execute the SQL and return the results in a dataset
$adapter = New-Object -TypeName System.Data.SQLite.SQLiteDataAdapter $SourceSQLCommand
#we create the dataset
$data = New-Object System.Data.DataSet
#and then fill the dataset
[void]$adapter.Fill($data)
#We can now easily send this to SQL Server
#first, we'll create a table to put this query in...
if ($DeleteContentsOfTableBeforeCopy)
{ $deletionScript = "ELSE DELETE from $DestinationTable " }
else
{ $deletionScript = '' }
$CreateScript = @"
IF NOT EXISTS (select TABLE_NAME from information_schema.tables
                 where TABLE_NAME like '$DestinationTable')
  CREATE TABLE $DestinationTable (
"@
$CreateScript += $data.tables[0].Columns |
  select columnName, Datatype, MaxLength |
  foreach-object {
    $datatype = "$($_.DataType)";
    "`n`t[$($_.columnName.Trim())] $(
      switch ($dataType)
      {
        'double' { "NUMERIC(18,$PrecisionForNumericData)" }
        'long'   { 'INT' }
        'byte[]' { 'VARBINARY(MAX)' }
        'string' { 'NVARCHAR(MAX)' }
        default  { "VARBINARY(MAX) /* unrecognised datatype $datatype*/" }
      })," }
#we trim off the trailing comma from the list
$CreateScript = $CreateScript.Substring(0, $CreateScript.Length - 1) + "`n`t)`n $deletionScript"
$DestinationConnectionString = "Data Source=$Destinationinstance;Initial Catalog=$Destinationdatabase;$(
  if ($DestinationWindowsSecurity) { 'integrated security=true' }
  else
  {
    "User id=$DestinationUserID; Password=$(((Get-Credential $DestinationUserID).GetNetworkCredential()).Password); integrated security=false"
  })"
try
{
  #test to see if the table is there. If it isn't, then create it.
  #If it is, then delete the contents
  $DestinationSQLCommand = new-object ('Data.SqlClient.SqlCommand') $CreateScript, $DestinationConnectionString;
  $DestinationSQLCommand.Connection.Open();
  $handler = [System.Data.SqlClient.SqlInfoMessageEventHandler] {
    param ($sender, $event) Write-Host "Message: $($event.Message)"
  };
  $DestinationSQLCommand.Connection.add_InfoMessage($handler);
  $success = $DestinationSQLCommand.ExecuteNonQuery();
  #now squirt the data in using the bulk copy library.
  $bulkCopy = new-object ("Data.SqlClient.SqlBulkCopy") $DestinationConnectionString
  $bulkCopy.DestinationTableName = $DestinationTable
  $bulkCopy.BatchSize = 5000 #The number of rows in each batch sent to the server
  $bulkcopy.NotifyAfter = 200 #The number of rows to copy before firing a notification
  $bulkCopy.BulkCopyTimeout = 0 #the number of seconds before a time-out
  $objectEvent = Register-ObjectEvent $bulkcopy SqlRowsCopied -Action { write-host "Copied $($eventArgs.RowsCopied) rows " }
  $bulkCopy.WriteToServer($data.Tables[0]) #copy all rows to the server
}
catch
{
  $ex = $_.Exception
  Write-Error "Whilst doing the bulk copy to '$DestinationTable': $($ex.Message) Sorry, but we can't proceed because of this!"
}
You get the point, I’m sure. It is easy to script.
Saving information from the file system into a SQLite database
For our next demonstration of using SQLite, we’ll build the basis of a simple source-code archive. This routine creates a SQLite database that contains a single table, which will hold a list of all the scriptable objects in a database along with their CREATE scripts. We’ll take these from a directory containing all the scripts. We could do it from SMO, but that would be a distraction at this stage. In this demo, I’m unapologetically using the scripts that have been created by SQL Compare.
The name of the directory is the same as the name of the database, and all the subdirectories are named after the type of object scripted (e.g. Tables, Views, Functions and so on). This makes it easy to gather the information that we need. I’ve chosen to lay this out as a script, just as with the first code sample, rather than as a function, because I wanted to show something that you can pick apart in the ISE, examining variables and trying parts out.
A couple of things to note: if you pass SQLite the path of a file that doesn’t exist, it creates the SQLite database file there, and you can then copy a SQLite database around like any other file. When you insert more than a couple of rows, it is quicker (and more sociable) to wrap them in a transaction. As with any inserts using a data provider, it is much better and more convenient to add parameters via the method provided by the command object; this deals with messy things such as embedded quotes and escaping illegal characters.
#make sure no errors slip through
set-psdebug -strict; $ErrorActionPreference = "stop"
<# we provide the path to the database file. If you're just starting with a
   new one, it doesn't have to already exist #>
$DatabasePath = "MySQLiteDatabasePath" # CHANGE THIS
<# now we give the path to the source code directory, which must exist. In our
   case, it is a source code directory produced by SQL Compare, but you can create
   something similar in SMO and I've already shown how to do it in another article #>
$SourceCodePath = "ThePathToMySourceCode" # CHANGE THIS
<# we need the System.Data.SQLite provider. Make sure it is the right one for
   your machine and .NET version #>
try
{ Add-Type -Path "C:\Program Files\System.Data.SQLite\2010\bin\System.Data.SQLite.dll" }
Catch
{ write-warning 'You''ll need the System.Data.SQLite provider to get any further'; exit; }
<# I start to create a connection to my database. First I instantiate a
   connection object #>
$con = New-Object -TypeName System.Data.SQLite.SQLiteConnection
# I then give it a simple connection string (you can put in complicated options) ...
$con.ConnectionString = "Data Source=$DatabasePath"
# ...and open the connection.
$con.Open()
#We start by creating a SQL statement in a command, using our connection
$SourceSQLCommand = $con.CreateCommand()
$SourceSQLCommand.CommandText = '
  create table if not exists DBSourceCode
    (Database text, ObjectName text, Type text, Sourcecode text, Date text)'
#we execute it.
try { $success = $SourceSQLCommand.ExecuteNonQuery() }
catch { write-warning 'Hmm. Could not create the table we need.'; exit; }
#inserts are much quicker when wrapped in a transaction
$SourceSQLCommand.CommandText = 'begin transaction'
$success = $SourceSQLCommand.ExecuteNonQuery()
<# The way I do it, the base directory has the name of the database. If you
   don't do that, you'll have to find a different way! #>
$DatabaseName = $SourceCodePath.Split('\') | select -Last 1
<# Now here is the start of the pipeline that writes every object-level script
   into a separate row of the SQLite database #>
Get-ChildItem $SourceCodePath -Recurse -Filter "*.SQL" | # get each .SQL file
  Where-Object { $_.Attributes -ne "Directory" } | #excluding directories
  <# We just keep the data we want, such as the name of the object, the path to
     the file, the date the file was written, and the type of object #>
  Select @{ Name = "Name"; Expression = { "$($_.Name -replace '.sql$')" } },
         LastWriteTimeUtc,
         @{ Name = "path"; Expression = { "$($_.DirectoryName)\$($_.Name)" } },
         @{ Name = "Type"; Expression = { ($_.Directory.Name -replace 'ies$', 'ys') -replace 's$' } } |
  foreach {
    # and we just write each object to the table, using parameters.
    $sqlInsert = $con.CreateCommand()
    $sqlInsert.CommandText = '
      insert into DBSourceCode (Database, ObjectName, Type, Sourcecode, Date)
        VALUES (@Database,@ObjectName,@Type,@Sourcecode,@Date)'
    $null = $sqlInsert.Parameters.AddWithValue("@Database", $DatabaseName);
    $null = $sqlInsert.Parameters.AddWithValue("@ObjectName", $_.Name);
    $null = $sqlInsert.Parameters.AddWithValue("@Type", $_.Type);
    $null = $sqlInsert.Parameters.AddWithValue("@Sourcecode", [IO.File]::ReadAllText($_.Path));
    $null = $sqlInsert.Parameters.AddWithValue("@Date", "$($_.LastWriteTimeUtc)");
    Try { $insertSuccess = $sqlInsert.ExecuteNonQuery() }
    catch
    {
      write-warning 'An insert failed!';
      $SourceSQLCommand.CommandText = 'rollback transaction'
      $success = $SourceSQLCommand.ExecuteNonQuery()
      exit;
    }
  }
$SourceSQLCommand.CommandText = 'commit transaction'
#we execute it.
$success = $SourceSQLCommand.ExecuteNonQuery()
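Once the scripts are safely stored, getting one back out is just another parameterised query. Here is a sketch that re-uses the $con connection from the script above; the object name ‘authors’ is only an illustration, so use the name of a table that is actually in your source directory.

# a sketch: fetch the stored CREATE script for a single object back out of the archive
# 'authors' is just an illustrative object name - substitute one of your own
$fetch = $con.CreateCommand()
$fetch.CommandText = 'SELECT Sourcecode, Date FROM DBSourceCode
                        WHERE ObjectName = @ObjectName AND Type = @Type'
$null = $fetch.Parameters.AddWithValue('@ObjectName', 'authors')
$null = $fetch.Parameters.AddWithValue('@Type', 'Table')
$reader = $fetch.ExecuteReader()
while ($reader.Read()) { "/* saved $($reader['Date']) */`n$($reader['Sourcecode'])" }
$reader.Close()
$con.Close()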
Conclusions
Grey-muzzle database developers like me tend to think more readily in terms of relational database structures than data documents such as XML when we’re writing applications or scripting. Relational database servers are fine when we’re dealing with gazillions of rows of rapidly changing data, but what if we are dealing with an application’s requirement for just a few megabytes of local data? There are quite a few small-scale relational databases, such as HSQLDB, but SQLite is the obvious choice.