
Windows PowerShell Script to Run SQL Files in a Folder Against a SQL Server Database

Problem Statement:-

At times, we may need to run all the SQL files in a folder against a SQL Server database. Since there is no built-in feature for this as of now, here is our attempt with a PowerShell script.


  1. Save the batch script (.bat) and the PowerShell script (.ps1) in the same folder in which all the SQL files are stored.
  2. The machine should have a SQL Server instance and the PowerShell tool installed.
  3. Ensure the parameter values are verified in the (.bat) batch file.

Scenarios Covered:-

i) All key values are handled through parameters
ii) If the database does not exist, the script throws an error and exits PowerShell
iii) If any script has issues or fails, the script throws an error and exits PowerShell
iv) To run the SQL scripts in subfolders as well, set the "Includesubfolders" parameter to 1
v) The batch file automatically retrieves the root path

PS Script:-

$Scriptpath = $args[0]
$Server     = $args[1]
$database   = $args[2]
$user       = $args[3]
$pwd        = $args[4]
$Includesubfolders = $args[5]

Function IsDBInstalled([string]$Server, [string]$database)
{
    $t = Invoke-Sqlcmd -ServerInstance $Server -Username $user -Password $pwd -Database "master" -Query "select 1 from sys.databases where name='$database'" -OutputSqlErrors $true
    if (!$t) {
        Write-Host "Failed to connect to [$database] database on [$Server]" -BackgroundColor DarkRed
        Write-Error "Failed to connect to [$database] database on [$Server]" -ErrorAction Stop
    } else {
        Write-Host "[$database] Database exists in SQL Server [$Server]" -BackgroundColor Blue -ForegroundColor Black
    }
}

IsDBInstalled $Server $database

# Include subfolders only when the Includesubfolders parameter is set to 1
if ($Includesubfolders -eq 1) {
    $scripts = Get-ChildItem $Scriptpath -Recurse | Where-Object { $_.Extension -eq ".sql" }
} else {
    $scripts = Get-ChildItem $Scriptpath | Where-Object { $_.Extension -eq ".sql" }
}

foreach ($s in $scripts) {
    Write-Host "Running Script : " $s.Name -BackgroundColor Green -ForegroundColor DarkRed
    # -QueryTimeout accepts values up to 65535; [int]::MaxValue is out of range
    $tables = Invoke-Sqlcmd -ServerInstance $Server -Username $user -Password $pwd -Database $database -InputFile $s.FullName -ErrorAction Stop -QueryTimeout 65535
    Write-Host ($tables | Format-List | Out-String)
}

Batch Script:-

SET root=%cd%
SET PSScript=%root%\RunSQLFiles.ps1
SET PowerShellDir=C:\Windows\System32\WindowsPowerShell\v1.0
CD /D "%PowerShellDir%"

REM Avoid naming this variable "path": that would overwrite the PATH environment variable
SET "sqlpath=%root%"
SET "machine=sqlserversample45"
SET "db=sample"
SET "user=username"
SET "pwd=password"
SET "Includesubfolders=0"

Powershell -ExecutionPolicy Bypass -Command "& '%PSScript%' '%sqlpath%' '%machine%' '%db%' '%user%' '%pwd%' '%Includesubfolders%'"



If the database does not exist:

If the database exists:

If any script has an error:

If the "Includesubfolders" parameter is set to "1":

I’d like to grow my readership. If you enjoyed this blog post, please share it with your friends!

Amazon RDS for SQL Server versus SQL Server on Amazon EC2

It's an era of cloud computing, and anywhere you go, people talk about cloud computing, its usage, migration paths, and benefits. So, I took the initiative to understand how SQL Server fits with the cloud. I will admit that it felt like a "walk away" for me initially, but it turned out to be quite interesting, and I was fascinated to learn more. Not just because of the success stories, but also because I enjoyed the challenge of assessing the different scenarios in which SQL Server works with cloud computing.

This blog post concentrates on AWS and SQL Server.

Let us first quickly understand: what is AWS?

AWS is a cloud computing platform provided by Amazon. It includes Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). It's a "pay-as-you-go" computing model, which means you can scale your resources depending on your needs and pay only for what you use. Again, I do not want to go into detail about AWS, since plenty of resources are available on the internet and it's just a Google search away.

AWS offers great flexibility to run SQL Server, with two major options:

1. Amazon Elastic Compute Cloud (EC2)
EC2 is Amazon's virtual machine offering in its cloud platform. It gives us complete control over settings, configuration, etc.
2. Amazon Relational Database Service (RDS)
RDS is a fully managed service without much control given to the user; the service takes care of the maintenance and manageability of the database instances. Amazon RDS supports various relational database engines such as MySQL, PostgreSQL, MariaDB, Microsoft SQL Server, Oracle, and Amazon Aurora.
Amazon RDS provides many more managed services compared to EC2. The business driver here is that customers can spend more effort on their data rather than on the maintenance and manageability of their database or instance.

Features supported by RDS & EC2

Now, it is very important for SQL Server professionals to understand the features supported by Amazon RDS and Amazon EC2. A quick reference table is provided below (as of 25th Apr 2020). Keep in mind that this reference may need changes as Amazon expands its capabilities, so compare it with the latest information at the time.

SQL Server Feature      | Amazon RDS for SQL Server       | SQL Server on Amazon EC2
----------------------- | ------------------------------- | ------------------------
Versions Supported      | SQL 2012/2014/2016/2017         | All
Editions Supported      | Express/Web/Standard/Enterprise | All
High Availability       | AWS managed                     | Self-managed
Encryption              | TDE, encrypted storage          | Self-managed
Authentication          | Windows & SQL                   | Windows & SQL
Backups                 | Managed                         | Customized
Maintenance             | Automatic                       | Self-managed
Monitoring & Management | Amazon CloudWatch               | Self-managed
CDC                     | Available                       | Available

The above is a high-level comparison chart; for more detailed information, please refer to the See Also section.

Assessment & Planning

Since we have seen some of the differences between the Amazon RDS and Amazon EC2 services, it is time to evaluate which one is best suited for our application if a migration plan exists. We should all be aware that this is definitely NOT a lift-and-shift exercise; it needs a lot of understanding of your system and careful evaluation. This is the most time-consuming phase. I would like to list a few parameters to help with this decision making, as below. Please note, this is a very wide topic, and you should do a careful evaluation depending on your application/project workloads and other characteristics. I would also suggest having an evaluation done by an Amazon professional for a smooth transition.

1. Conduct an inventory of your SQL Server instances
2. Conduct an inventory of your SQL Server databases
3. Know your current Licensing option
One of the main reasons for cloud migration is cost saving, whether in terms of infrastructure cost, human resource cost, or maintenance cost. License cost can also be reviewed here, as AWS comes with a licensing option by default; it is important to verify the current licensing option with your organization and Microsoft so it can be evaluated further. "Amazon RDS for SQL Server supports the 'License Included' licensing model. You do not need separately purchased Microsoft SQL Server licenses. 'License Included' pricing is inclusive of software, underlying hardware resources, and Amazon RDS management capabilities." Please refer to the AWS documentation for more details.
4. Understand your HA/DR solutions
5. Performance benchmarks if any /Capacity Planning
(Resource utilization like CPU/Memory/IOPS etc)

6. Data Retention Policy if any
This assessment helps us reduce disk cost and utilization. Say we have a database with 10 years of data, and as per business rules we need only the last 5 years; then it is good to purge or archive the older data before we migrate to AWS. This reduces space usage and enables an easier migration from on-premises to the cloud. This is one area where Microsoft Azure has flexibility with its Stretch Database feature; Amazon RDS does not support Stretch Databases yet (as of the published date of this blog post).
7. Various Migration options
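
As a rough sketch of the retention cleanup mentioned in step 6, older rows could be purged in batches before migration. This is only an illustration: the server, database, table name (dbo.SalesHistory), date column (CreatedDate), and retention window below are all assumptions to be replaced with your own.

```powershell
# Hedged sketch: batch-delete rows older than 5 years before migration.
# Server, database, table, and column names are assumptions.
$server   = "MyServer"
$database = "MyDatabase"
$purge = @"
DELETE TOP (10000) FROM dbo.SalesHistory
WHERE CreatedDate < DATEADD(YEAR, -5, GETDATE());
SELECT @@ROWCOUNT AS Deleted;
"@
do {
    # Repeat until a batch deletes zero rows
    $result = Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $purge
} while ($result.Deleted -gt 0)
```

Deleting in small batches keeps the transaction log manageable compared to a single large DELETE.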

We will cover a lot more interesting things in upcoming posts; until then, stay tuned, stay safe!

See Also:


On-Demand Performance Test Rig with JMeter

These days it has become necessary to execute performance tests at low cost. This blog post details how to set up an "on-demand and low-cost" performance test rig on Azure and execute performance tests with it.

Primarily, to set up the on-demand performance test rig, below are the prerequisites:

  1. JMeter scripts need to be checked in to a Git repo and kept updated
  2. An Azure subscription, with a resource group created in which the on-demand test rig will be spun up
  3. Azure Container Registry (ACR) – the JMeter Docker image will be stored here
  4. JMeter plugin reference (if your scripts use plugins)
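
For step 3, the JMeter Docker image pushed to ACR might be built from a Dockerfile along these lines. The base image, JMeter version, and paths here are assumptions, not a prescribed setup:

```dockerfile
# Hedged sketch of a JMeter image for ACR.
# Base image and JMeter version are assumptions - adjust to your rig.
FROM eclipse-temurin:17-jre
ARG JMETER_VERSION=5.6.3
RUN apt-get update && apt-get install -y curl \
 && curl -fsSL "https://archive.apache.org/dist/jmeter/binaries/apache-jmeter-${JMETER_VERSION}.tgz" \
    | tar -xz -C /opt \
 && ln -s /opt/apache-jmeter-${JMETER_VERSION}/bin/jmeter /usr/local/bin/jmeter
# Run JMeter in non-GUI mode by passing -n -t <plan.jmx> at container start
ENTRYPOINT ["jmeter"]
```

The image would then be tagged and pushed with `docker push <your-acr>.azurecr.io/jmeter:latest`.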

The Azure CLI command must include the following:

  1. Azure Container Registry path for the JMeter image
  2. Region in which the container needs to be spun up
  3. CPU needed for the load test execution
  4. Memory in GB needed for the test execution
  5. Git repo mount path
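
Putting the five items above together, the Azure CLI call might look like the following sketch. Every resource name here (resource group, registry, image, repo URL, region) is a placeholder assumption:

```shell
# Hedged sketch: spin up an on-demand JMeter container instance.
# All resource names below are placeholders.
az container create \
    --resource-group perf-rig-rg \
    --name jmeter-rig \
    --image myacr.azurecr.io/jmeter:latest \
    --registry-login-server myacr.azurecr.io \
    --registry-username myacr \
    --registry-password <acr-password> \
    --location eastus \
    --cpu 4 \
    --memory 8 \
    --gitrepo-url https://github.com/<org>/<jmeter-scripts>.git \
    --gitrepo-mount-path /jmeter/scripts \
    --restart-policy Never
```

With `--restart-policy Never`, the container runs the test once and stops, so you only pay for the duration of the test run.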


How does it work?

There will be more posts to come on this topic; until then, stay tuned and stay safe!

Windows PowerShell Script to Find the Full File Path Length for All Files in a Directory

In some cases, we may need to identify the maximum length of the full file paths in a directory so that we can shorten file names to avoid path-length or security-policy issues.

PS Script:-

$pathToScan = "C:\temp\File_Length"  
$outputFilePath = "C:\temp\File_Length\output.txt" 
$writeOnConsole = $true   

$outputDir = Split-Path $outputFilePath -Parent
if (!(Test-Path $outputDir)) { New-Item $outputDir -ItemType Directory }

if ($writeOnConsole) {Write-Host "*************************************"}
if ($writeOnConsole) {Write-Host "  List of files with file Length :-  "}
if ($writeOnConsole) {Write-Host "*************************************"}
$stream = New-Object System.IO.StreamWriter($outputFilePath, $false)
Get-ChildItem -Path $pathToScan -Recurse -Force | Sort-Object { $_.FullName.Length } -Descending | ForEach-Object {
    $Path = $_.FullName
    $len  = $Path.Length
    $strg = "$len : $Path"
    $stream.WriteLine($strg)    # write every entry to the output file
    if ($writeOnConsole) { Write-Host $strg }
}
$stream.Close()


Hope this is helpful; thanks for reading!

For more PowerShell-related blogs, refer here.


Windows PowerShell Script to Purge/Clean Up Backup & Transaction Log Files

At times, we need to purge backup files (*.bak, *.trn) that are older than some x days from the server on a regular basis, so that disk space and SQL data backups are maintained consistently on the server.

Power Shell Script:-

The below PowerShell script purges backup files older than 5 days from the server; it identifies the older files based on their last-modified date and time.

Based on your requirement, you can change the date range and schedule this script for Server Maintenance.

Get-ChildItem -Path "C:\Backups" -Recurse -ErrorAction SilentlyContinue -Include *.bak, *.trn |
Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-5) -and $_.PSIsContainer -eq $False } |
Remove-Item -Force