Recently we were working with Azure Logic Apps to invoke Azure Functions. By default, a Logic App runs loop iterations in parallel, and we had left the concurrency settings at their default values.
So the Logic App spawned several concurrent threads, each of which invoked an Azure Function. The problem was that those functions made database calls, and the calls deadlocked. In an ideal world, the database should be able to handle numerous concurrent functions without deadlocks. But our process touches a high percentage of shared data, and to ensure consistency we use explicit transactions in our stored procedure calls. That is the root cause of the problem, and we didn't want to remove the explicit transactions.
The solution we implemented to alleviate the problem was to run the process sequentially instead of in parallel threads.
Logic App Concurrency Control Behavior
For each loops execute in parallel by default. Customize the degree of parallelism, or set it to 1 to execute in sequence.
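In the workflow's code view, this can be pinned down explicitly. A minimal sketch of a for-each action with concurrency forced to sequential — the action name and the `Get_rows` source are placeholders:

```json
"For_each": {
    "type": "Foreach",
    "foreach": "@body('Get_rows')",
    "actions": {},
    "runtimeConfiguration": {
        "concurrency": {
            "repetitions": 1
        }
    }
}
```

Setting `repetitions` to 1 runs the loop strictly in sequence; higher values (up to 50) control the degree of parallelism.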
Transaction (Process ID 166) was deadlocked on lock | communication buffer resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
So we identified, through Application Insights, that a deadlock had happened in the database. The next logical question is: what caused this deadlock?
Azure SQL Server Deadlock Count
These queries identify the deadlock event time as well as the deadlock event details.
SELECT * FROM sys.event_log
WHERE event_type = 'deadlock';
WITH CTE AS (
    SELECT CAST(event_data AS XML) AS [target_data_XML]
    FROM sys.fn_xe_telemetry_blob_target_read_file('dl', null, null, null)
)
SELECT target_data_XML.value('(/event/@timestamp)[1]', 'DateTime2') AS [Timestamp],
    target_data_XML.query('/event/data[@name=''xml_report'']/value/deadlock') AS deadlock_xml,
    target_data_XML.query('/event/data[@name=''database_name'']/value').value('(/value)[1]', 'nvarchar(100)') AS db_name
FROM CTE
You can save the deadlock XML with an .xdl extension to view the deadlock graph in SQL Server Management Studio. This provides all the information we need to identify the root cause of the deadlock and take the necessary steps to resolve the issue.
Install and Run SQL Server Docker Container on Mac
Like most people, I use macOS, Windows, as well as Linux for development and testing purposes. Primarily I use a Mac for development, and I have a few projects that use SQL Server as the data storage layer. Setting up the Docker container on the Mac and opening up the ports was pretty easy and didn't take more than 10 minutes.
Steps followed:
Pull SQL Server Docker Image
Run SQL Server Docker Image
Install mssql Client
Open the Ports to connect to SQL Server from the network
Setup port forwarding to enable access outside the network
Open Up the Firewall to connect to SQL Server from outside the Host
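The first few steps above can be sketched as follows. The image tag matches the container shown later in this post; the SA password is a placeholder — pick your own that satisfies SQL Server's complexity policy:

```shell
# Pull and run the SQL Server image, mapping port 1433 to the host
docker pull microsoft/mssql-server-linux:2017-latest
docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' \
    -p 1433:1433 --name macsqlserver \
    -d microsoft/mssql-server-linux:2017-latest

# Install a command-line client (sql-cli from npm) and connect
npm install -g sql-cli
mssql -s localhost -u sa -p '<YourStrong!Passw0rd>'
```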
Ensure your firewall is configured to allow connections to SQL Server. I turned off “Block all incoming connections” and enabled “Automatically allow downloaded signed software to receive incoming connections”. Without the proper firewall configuration, you won't be able to connect to SQL Server from outside the host.
Connecting from the Internet (Port Forwarding Setup)
Let's say you want to connect to the SQL Server you set up from outside the network, or from anywhere via the internet; for that you can set up port forwarding.
Get your public-facing IP and set up port forwarding for port 1433 (the SQL Server port you mapped for your Docker container). If it's set up correctly, you should be able to telnet into that port to verify connectivity.
telnet 220.127.116.11 1433
Unless you absolutely require it, it's a very bad idea to expose SQL Server to the internet. It should stay behind the network; only your web server should be accessible via the internet.
While launching the Docker container, if you get an error saying there isn't enough memory to launch the SQL Server container, go ahead and modify the memory allocation for the Docker VM. SQL Server needs at least 3.25 GB of RAM, so make sure to assign enough memory to the Docker VM if you're running Docker for Mac or Windows. I have set mine up this way. If you don't provision enough memory, you will see an error like this.
Look into the Docker logs
The following commands (docker ps -a and docker logs macsqlserver) show the list of running containers and the container's logs.
$ docker ps -a
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
9ea3a24563f9 microsoft/mssql-server-linux:2017-latest "/bin/sh -c /opt/m..." About an hour ago Up About an hour 0.0.0.0:1433->1433/tcp macsqlserver
$ docker logs macsqlserver
2017-10-08 23:06:52.29 Server Setup step is copying system data file
'C:\templatedata\master.mdf' to '/var/opt/mssql/data/master.mdf'.
2017-10-08 23:06:52.36 Server Setup step is copying system data file
'C:\templatedata\mastlog.ldf' to '/var/opt/mssql/data/mastlog.ldf'.
2017-10-08 23:06:52.36 Server Setup step is copying system data file
'C:\templatedata\model.mdf' to '/var/opt/mssql/data/model.mdf'.
2017-10-08 23:06:52.38 Server Setup step is copying system data file
'C:\templatedata\modellog.ldf' to '/var/opt/mssql/data/modellog.ldf'.
I highly recommend creating least-privileged accounts and disabling the sa login. If you expose your SQL Server to the internet, there are tons of hacking and pentest tools that use the sa login for brute-force attacks.
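A minimal sketch of such a setup — the login, user, and database names here are hypothetical, so adjust them to your environment:

```sql
-- Hypothetical names; replace with your own login, database, and password
CREATE LOGIN app_login WITH PASSWORD = 'UseAStr0ng!Passw0rdHere';
USE YourAppDb;
CREATE USER app_user FOR LOGIN app_login;
-- Grant only the data access the application actually needs
ALTER ROLE db_datareader ADD MEMBER app_user;
ALTER ROLE db_datawriter ADD MEMBER app_user;
-- Disable the built-in sa login once your own admin login is in place
ALTER LOGIN sa DISABLE;
```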
Windows containers do not ship with Active Directory support and, due to their nature, can't (yet) act as full-fledged domain-joined objects, but a certain level of Active Directory functionality can be supported through the use of group Managed Service Accounts (gMSA).
While Windows containers cannot be domain-joined, they can still take advantage of Active Directory domain identities, similar to when a device is realm-joined. With Windows Server 2012 R2 domain controllers, Microsoft introduced a new domain account type called a group Managed Service Account (gMSA), which was designed to be shared by services.
If everything is working as expected, you then need to create a credential spec file, which must be passed to Docker during container creation to utilize this service account. Run the commands below to download the module that creates this file from Microsoft's GitHub account; it will generate a JSON file containing the required data.
Invoke-WebRequest <raw URL of CredentialSpec.psm1> -UseBasicParsing -OutFile $env:TEMP\cred.psm1
Import-Module $env:TEMP\cred.psm1
New-CredentialSpec -Name Gmsa -AccountName container_gmsa
# This will return the location and name of the JSON file
Get-CredentialSpec
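The generated credential spec can then be handed to Docker when the container is created. A sketch, assuming the Gmsa.json file created above and a placeholder image name; the hostname should match the gMSA account name so Windows authentication resolves correctly:

```powershell
docker run --security-opt "credentialspec=file://Gmsa.json" `
    --hostname container_gmsa -d <your-sql-server-windows-image>
```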
Step 5: SQL Server Configuration to allow GMSA
On the SQL Server, create a login for the gMSA account and add it to the "sysadmin" role. Based on your on-premises DB access requirements, you can assign more suitable roles.
CREATE LOGIN [cloudiq\container_gmsa$] FROM WINDOWS
EXEC sp_addsrvrolemember 'cloudiq\container_gmsa$', 'sysadmin'
We are evaluating the pros and cons of different hosting solutions for SQL Server to find the one that best suits our business needs.
Our business needs
Our demand is predictably seasonal. We are very small and can't afford a dedicated team for managing database infrastructure (no DBA team). Customers have sky-high expectations of availability and reliability for about 2 months of the year; a few minutes of downtime during the peak period can wreak havoc on our business. We have a fixed budget with very little wiggle room. Our plan is to evaluate AWS SQL Server RDS, Azure SQL Database, and managed solutions from hosting providers, and to judge each option in these categories:
Performance and Reliability
Ability to scale up during peak loads
Cost ( Based on Network , Storage, Memory and CPU )
Infrastructure Requirements:
SQL Server Enterprise Edition, since we use enterprise features
AlwaysOn Availability Groups for high availability
Geo-replication or multi-availability-zone deployment for cloud-based databases
Ability to route read/write workloads
128 GB RAM minimum
1–2 TB storage, with 500 GB of SSD for the TempDB database and high-volume tables
Memory-Optimized OLTP support, which requires SQL Server 2016
Ability to handle ~30K IOPS during peak load
AWS SQL Server RDS Configuration
On-Demand for SQL Server (License Included)
Multi-AZ Deployment
Region: US East (N. Virginia)
Memory Optimized Instances – Current Generation, priced per hour
RAM: 244 GB
32 vCPU (Intel Xeon E5-2670 v2, Ivy Bridge)
10 Gigabit network
20,000 Provisioned IOPS
Storage: 2 x 320 GB SSD
Azure Pricing Calculator
Azure performance is measured in DTUs. We have been collecting our performance metrics during load tests, and the following link provides a lightweight utility to convert perfmon counters to Azure DTUs.
Perfmon Counters to Azure DTU Conversion Utility Link:
The Database Transaction Unit (DTU) is the unit of measure in SQL Database that represents the relative power of databases, based on a real-world measure: the database transaction. Microsoft took a set of operations typical of an online transaction processing (OLTP) request, and then measured how many transactions could be completed per second under full load.