For many businesses, using a Managed Services Provider (MSP) makes sense, particularly when it comes to database management, monitoring and security. They can control costs while still having access to expert resources, and dial up or down the service as required.
It’s perhaps no surprise, then, that in the Channel Futures MSP 501 Report, 55% of respondents identified professional services as a growth area, 38% identified enhanced network monitoring, and 73% identified security services.
For many MSPs, this presents a challenge because, according to Datto’s 2019 State of the MSP Report, 37% of MSPs say it will be harder to recruit new talent this year, a figure that rises to 45% in Europe, where hiring good people is the #1 pain point for MSPs.
That said, there are ways database MSPs can save time, minimize effort and work more efficiently, while at the same time providing more value to their customers.
Manage growing server estates more effectively
Remote monitoring is a common task for MSPs and typically involves keeping an eye on resource usage like CPU, disk space, memory, and I/O capacity to spot trends and understand when more capacity will be needed. The same information also enables baselines to be established so that, for example, it is immediately apparent whether high resource utilization is an abnormal spike, a worrying recent trend, or just normal behavior for the period in question.
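As a minimal sketch of how a baseline can be built up over time (the audit table name and schedule here are illustrative assumptions, not a prescribed approach), a scheduled script could snapshot database file sizes so that later readings can be compared against known-normal values:

```sql
-- Illustrative baseline snapshot: capture data and log file sizes per database.
-- Run on a schedule (e.g. via SQL Server Agent) and insert into a history table
-- so that growth trends and abnormal spikes can be compared against a baseline.
SELECT  DB_NAME(mf.database_id)   AS database_name,
        mf.type_desc              AS file_type,   -- ROWS (data) or LOG
        SUM(mf.size) * 8 / 1024   AS size_mb,     -- size is reported in 8 KB pages
        SYSUTCDATETIME()          AS captured_at_utc
FROM    sys.master_files AS mf
GROUP BY mf.database_id, mf.type_desc;
```

Comparing each snapshot against the same hour or day in previous weeks is what turns raw numbers into a usable baseline.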
Alongside scheduled monitoring, reactive monitoring also has its place in responding to alerts about a drop in performance or an increase in deadlocks, and drilling down to the cause of the problem before it becomes an issue. Here, the history and timelines which monitoring provides will help to identify if a stress phenomenon coincides with a particular type of processing, such as a weekly aggregation or a scheduled data import.
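For deadlocks specifically, SQL Server’s built-in `system_health` Extended Events session already captures deadlock graphs, and a query along the following lines (a sketch, assuming a default `system_health` configuration) can pull them out when drilling into an alert:

```sql
-- Retrieve recent deadlock graphs from the system_health ring buffer target.
SELECT xed.value('@timestamp', 'datetime2') AS occurred_at,
       xed.query('.')                       AS deadlock_graph
FROM (
    SELECT CAST(t.target_data AS xml) AS target_data
    FROM   sys.dm_xe_sessions AS s
    JOIN   sys.dm_xe_session_targets AS t
           ON s.address = t.event_session_address
    WHERE  s.name = 'system_health'
      AND  t.target_name = 'ring_buffer'
) AS src
CROSS APPLY src.target_data.nodes(
    'RingBufferTarget/event[@name="xml_deadlock_report"]') AS x(xed);
```

The `deadlock_graph` XML can then be matched against the monitoring timeline to see whether the deadlocks coincide with a particular scheduled job or batch process.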
Many MSPs use their own scripts for monitoring and rely on the built-in resources in platforms like SQL Server and Oracle to help them. PerfMon, Dynamic Management Views (DMVs) and Extended Events in SQL Server, for example, alongside the Activity Monitor in SQL Server Management Studio, can often provide the data required for effective monitoring.
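As one example of the kind of DMV query such scripts are built on (the filter here is deliberately crude; production scripts typically exclude a longer list of benign waits), the cumulative wait statistics often give the quickest picture of where an instance is spending its time:

```sql
-- Top waits since the instance last started, as a rough first look at
-- where SQL Server is spending its time.
SELECT TOP (10)
       wait_type,
       wait_time_ms / 1000.0 AS wait_time_s,
       waiting_tasks_count
FROM   sys.dm_os_wait_stats
WHERE  wait_type NOT LIKE '%SLEEP%'   -- crude filter for idle/benign waits
ORDER BY wait_time_ms DESC;
```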
The cracks begin to appear when MSP server estates start to grow significantly in terms of size and complexity. The typical mixture of on-premises, on-client, and public cloud hosted servers adds another level of difficulty in terms of estate management. This is when a third-party monitoring tool like SQL Monitor comes into play.
It doesn’t replace the use of DMVs, Extended Events, and the other built-in tools, which will still be required. Instead, it removes the heavy lifting of data collection and management, analyzes the data, and provides an easy-to-digest picture of activity and any issues and alerts across the monitored servers, on one screen.
For larger estates where there are likely to be different versions and editions of SQL Server, as well as instances in the cloud, this can be particularly useful because SQL Monitor shows the status and key metrics for every server.
Importantly for MSPs, it also allows databases to be grouped in a number of different ways – an MSP might group databases by customer, whereas when used in a single business, they might be grouped into test, staging and production environments, etc.
However they are grouped, it allows all SQL Server instances, availability groups, clusters, and virtual machines to be viewed on one central web-based interface, and has customizable alerts that can be configured to suit SQL Server estates of any size and complexity.
This removes manual, repetitive daily tasks and enables MSPs to keep pace with expanding estates, discover issues before they have an impact, and diagnose problems to find the root cause in minutes, not hours.
Monitor for security as well as performance
Alongside monitoring for performance, there is an additional and now pressing need to monitor for security. New data protection regulations, for example, require organizations to monitor and manage access, ensure data is available and identifiable, and report when any breaches occur.
They also need to know factors like what servers are online, whether unscheduled changes are occurring inside databases, if tables or columns are being added or dropped, and if permissions are being altered.
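One lightweight way to keep a record of such changes (a sketch only; the audit table `dbo.DdlAudit` and the trigger name are assumptions, and a monitoring tool or SQL Server Audit would usually be preferred at scale) is a server-level DDL trigger:

```sql
-- Hypothetical server-level DDL trigger that records table, login, and
-- permission (GRANT/DENY/REVOKE) changes into an assumed audit table.
-- dbo.DdlAudit (event_time datetime2, login_name sysname, event_data xml)
-- is assumed to already exist in master.
CREATE TRIGGER trg_AuditDdl
ON ALL SERVER
FOR DDL_TABLE_EVENTS, DDL_LOGIN_EVENTS, DDL_GDR_SERVER_EVENTS
AS
BEGIN
    INSERT INTO master.dbo.DdlAudit (event_time, login_name, event_data)
    VALUES (SYSUTCDATETIME(), ORIGINAL_LOGIN(), EVENTDATA());
END;
```

The `EVENTDATA()` XML includes the command text, object name and login, which is exactly the information needed when investigating an unscheduled change.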
So beyond traditional expectations, businesses need to know and have a record of which servers and what data are being managed, and be able to discover the reason for any performance issues quickly and accurately.
Should a data breach occur, it becomes even more crucial because organizations are obliged to describe the nature of the breach, the categories and number of individuals concerned, the likely consequences, and the measures to address it.
While this adds another element to the workload of MSPs, they can be prepared for it with an advanced monitoring solution like SQL Monitor, which can monitor the availability of servers and databases containing personal data and provide alerts for issues that could lead to a data breach.
In his Monitoring SQL Server Security article on the Redgate Hub, Phil Factor looks at what’s required and details how SQL Monitor can be used to do everything from detect SQL injection attacks to identify changes in permissions, users, roles and logins.
Secure data everywhere
Another important requirement of the new data protection legislation being introduced is to protect personal data all the way through the development process. This is a major concern for MSPs that act as remote DBAs because many developers like to use a copy of production databases to test changes against – the very databases which contain the personal data that needs to be protected.
One solution is to have a version of the production database with a limited dataset of anonymous data that is always used to develop and test against. This does, though, mean testing changes against a database that is neither realistic, nor of a size where the impact on performance can be assessed.
Another solution is to take a copy of the production database and mask the data manually by replacing columns with similar but generic data. This copy can then be used in development and testing but will age very quickly as ongoing changes are deployed to the production database.
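A minimal sketch of that manual approach, with illustrative table and column names (none of these are from a real schema), might replace personal data in the restored copy while preserving row counts and key relationships:

```sql
-- Illustrative manual masking: overwrite personal data in a restored copy
-- with generic but unique values, keeping keys intact so joins still work.
-- Table and column names are hypothetical.
UPDATE dbo.Customers
SET    FirstName = 'First' + CAST(CustomerID AS varchar(10)),
       LastName  = 'Last'  + CAST(CustomerID AS varchar(10)),
       Email     = 'user'  + CAST(CustomerID AS varchar(10)) + '@example.com',
       Phone     = '555-0100';   -- fictional number range
```

Scripts like this have to be maintained column by column as the schema evolves, which is one reason the copy ages so quickly and why dedicated masking tools are attractive.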
This is where data masking tools, which pseudonymize and anonymize data, are now being adopted to provide database copies that are truly representative of the original and retain the referential integrity and distribution characteristics.
One such tool is SQL Provision, which combines the data masking capability of Redgate’s Data Masker tool for SQL Server with Microsoft’s proven virtualization technology to create database copies which are a fraction of the size of the original. This enables the copies, or clones, to be created in seconds, with the data in those copies automatically masked.
Redgate has also taken the same data masking approach with Data Masker for Oracle, which has been specifically written for the target database architecture and replaces sensitive data with realistic, anonymized, test data.
These are challenging times for MSPs. Clients naturally expect a premium service, as required, often at short notice. MSPs need to deliver that service when talent and time are often in short supply. Fortunately, the introduction and ongoing development of tools which can streamline and improve the delivery of services is now at a point where many of the laborious tasks involved can be made a lot easier.