Category: SQL

How to identify the CPU utilization in SQL Server

Using ring buffers

-- CPU utilization history for roughly the last 256 minutes (the scheduler monitor writes one record per minute)
DECLARE @ts_now BIGINT = (SELECT cpu_ticks / (cpu_ticks / ms_ticks)
                          FROM   sys.dm_os_sys_info WITH (NOLOCK));

SELECT TOP(256) SQLProcessUtilization                                AS [SQL Server Process CPU Utilization],
                SystemIdle                                           AS [System Idle Process],
                100 - SystemIdle - SQLProcessUtilization             AS [Other Process CPU Utilization],
                DATEADD(ms, -1 * (@ts_now - [timestamp]), GETDATE()) AS [Event Time]
FROM   (SELECT record.value('(./Record/@id)[1]', 'int')                                                   AS record_id,
               record.value('(./Record/SchedulerMonitorEvent/SystemHealth/SystemIdle)[1]', 'int')         AS [SystemIdle],
               record.value('(./Record/SchedulerMonitorEvent/SystemHealth/ProcessUtilization)[1]', 'int') AS [SQLProcessUtilization],
               [timestamp]
        FROM   (SELECT [timestamp],
                       CONVERT(XML, record) AS [record]
                FROM   sys.dm_os_ring_buffers WITH (NOLOCK)
                WHERE  ring_buffer_type = N'RING_BUFFER_SCHEDULER_MONITOR'
                       AND record LIKE N'%<SystemHealth>%') AS x
       ) AS y
ORDER  BY record_id DESC
OPTION (RECOMPILE);

Ring buffers are a good way to get the CPU utilization, as shown above. Please note that this script is used heavily by many DBAs; I do not know who the original author is, but I would like to give full credit to whoever drafted it.

Using perfmon

Performance Monitor (perfmon) is a built-in tool in Windows Server for tracking system performance and other data points. We can configure perfmon to run on a schedule and collect the information we need. Most production servers have perfmon enabled to track performance, and it has little or no impact on the server.

Once the data is collected, you can also view it through Grafana or Kibana, which provide a good visual representation of the data.

Using open source monitoring tools

There are many open source monitoring tools on the market that can be used to collect this information from the server. A few of these tools are covered at https://geekflare.com/best-open-source-monitoring-software/

sp_databases in SQL Server

Today’s post will introduce the system procedure sp_databases, which helps if you want to know the databases and their sizes on your database server. It’s a very handy procedure at times.
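
A minimal usage sketch is below; the procedure takes no parameters and returns DATABASE_NAME, DATABASE_SIZE (in KB), and REMARKS.

-- List every database on the instance with its total size (reported in KB)
EXEC sp_databases;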

Caveat:

  1. If you want to know the sizes of the data and log files separately, this procedure cannot help. You may refer to “https://sqlzealots.com/2015/01/29/find-the-database-size-logrowtotal-in-sql-server-using-t-sql/” for that (a rough alternative is also sketched after this list).
  2. If any database is larger than about 2.15 TB, sp_databases may not return the database_size value, because that column is defined with the INT data type, which can hold values only up to 2,147,483,647 (KB).
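
For completeness, here is a rough sketch of pulling data and log sizes separately from sys.master_files (only an illustration, not the approach from the linked post); sys.master_files reports size in 8 KB pages, hence the conversion to MB.

SELECT DB_NAME(database_id)                                                 AS database_name,
       SUM(CASE WHEN type_desc = 'ROWS' THEN size * 8.0 / 1024 ELSE 0 END) AS data_size_mb,
       SUM(CASE WHEN type_desc = 'LOG'  THEN size * 8.0 / 1024 ELSE 0 END) AS log_size_mb
FROM   sys.master_files
GROUP  BY database_id
ORDER  BY database_name;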

If you enjoyed this blog post, please feel free to share it with your friends!

Error Message: “The database could not be exclusively locked to perform the operation.” in SQL Server

Recently we encountered the error message below while renaming a database. So, let us look at the steps that we can use to overcome the issue in this post.

The database could not be exclusively locked to perform the operation.

We were trying to rename a database in one of our lower environments as below and ended up with the error message.

ALTER DATABASE dbname MODIFY NAME = dbname_new

Steps to resolve

  1. Take the database to single_user mode
  2. Rename the database
  3. Return the renamed database to multi_user mode

Script

ALTER DATABASE dbname SET SINGLE_USER WITH ROLLBACK IMMEDIATE
GO
ALTER DATABASE dbname MODIFY NAME = dbname_new
GO
ALTER DATABASE dbname_new SET MULTI_USER WITH ROLLBACK IMMEDIATE

If you enjoyed this blog post, please share it with your friends!

Warning Message in SQL Server: “Database name ‘tempdb’ ignored, referencing object in tempdb.”

Today, let us quickly try to understand the message below.

“Database name ‘tempdb’ ignored, referencing object in tempdb.”

Many of us would have seen this message (I prefer to call it a warning message, not an error message); however, we might not have noticed it, as it does not cause any fatal results. With this post, I would like to share my thoughts, and I would like to hear your views on this as well.

Let us first see when you would get this message:

USE tempdb
GO
CREATE TABLE #Temp_Table (Col1 INT)
GO
SELECT * FROM tempdb..#Temp_Table

Now, the interesting thing: when the query is changed a bit as below, the message vanishes.
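
The modified query referred to here is not included in the text; presumably it simply drops the database-name prefix, along the lines of this sketch.

-- Referencing the temp table without the tempdb.. prefix does not raise the warning
SELECT * FROM #Temp_Table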

Somehow, I was not convinced by the above method as a solution, because it went against my understanding of the four-part naming convention (servername.databasename.schemaname.objectname). So, we tested a few more combinations to understand how it works internally.

A simple query, shown below, confirmed that #Temp_Table is created under the dbo schema.
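
The query referred to here is not included in the text; one along these lines against the tempdb catalog views (an assumed reconstruction, not necessarily the original query) gives the same information, and re-running it after the next step shows that #Temp_Table10 also lands under dbo.

-- Check which schema the temp table(s) were created under
SELECT t.name AS table_name,
       s.name AS schema_name
FROM   tempdb.sys.tables AS t
       JOIN tempdb.sys.schemas AS s
         ON s.schema_id = t.schema_id
WHERE  t.name LIKE '#Temp_Table%';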

As a next step, we created a schema and then a new object under it. If you look at the schema of the object, you can clearly see that it is associated with the dbo schema, not the new schema. That means temp tables created in the tempdb database are always created under dbo, never under any other schema; the schema part is effectively ignored for temp tables.

CREATE SCHEMA testschema
GO
CREATE TABLE testschema.#Temp_Table10 (Col1 INT)

Points to ponder

There is NO need to specify a schema for temp tables created in the tempdb database.

Even if we specify a schema name, SQL Server simply ignores the schema part.

If you enjoyed this blog post, please share it with your friends!

Dark Theme in Azure Data Studio (ADS)

I prefer a dark theme in almost all cases. This post simply explains how to switch ADS to dark theme mode.

Method 1: From the “File” menu, set the color theme.

Method 2: From the “Settings” icon, set the color theme.

Step 1: Go to the “File” menu, click on “Preferences”, and select “Color Theme”.

Once “Color Theme” is selected, we will be able to choose the theme “Dark Azure Data Studio”.

See Also:

Dark Theme in SSMS

I’d like to grow my readership. If you enjoyed this blog post, please share it with your friends!