JSON stands for JavaScript Object Notation. It has become a standard format for exchanging and storing data since the Big Data revolution began, and it is now used across many languages. In PowerShell, the ConvertTo-Json cmdlet converts a .NET object into JSON-formatted output.
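As a quick illustration before the full demonstration, here is ConvertTo-Json applied to an ordinary object (the object and its values are made up for the example):

```powershell
# Convert a simple .NET object to JSON (sample data only)
$employee = [PSCustomObject]@{ EmpID = 1; Name = 'Som Tripathi' }
$employee | ConvertTo-Json
# Produces a JSON object with "EmpID" and "Name" properties;
# the exact whitespace of the output varies by PowerShell version.
```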
Below is the complete demonstration:
Creating a table for the example:
Before we export any data, let's create a table with some data in it. The example below creates a table named Employee and inserts just two rows.
create table Employee (
    EmpID int,
    Name varchar(200)
)
go

insert into Employee values (1, 'Som Tripathi')
insert into Employee values (2, 'Ostwald')
go

select * from Employee
go
Output:
Now we can continue with the script that exports the table's data in JSON format.
Reading from SQL Server instance and convert to JSON:
To read data from a SQL Server table, you need the table name and a running SQL Server instance. The script below uses a database named dbadb on an instance named SQLTEST1\ENG_TST1; substitute your own names.
Script:
#-- Exporting a SQL Server table to JSON --#
cls

#-- Establishing the connection to SQL Server --#
$InstanceName = "SQLTEST1\ENG_TST1"
$connectionString = "Server=$InstanceName;Database=dbadb;Integrated Security=True;"

#-- Main query --#
$query = "SELECT * FROM Employee"

$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$connection.Open()

$command = $connection.CreateCommand()
$command.CommandText = $query
$result = $command.ExecuteReader()

$table = New-Object "System.Data.DataTable"
$table.Load($result)

#-- Exporting the data to the screen --#
$table | Select-Object $table.Columns.ColumnName | ConvertTo-Json

$connection.Close()
Output:
The example is intentionally simple; you can enhance it or wrap it in a reusable function.
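One straightforward enhancement is to write the JSON to a file instead of the screen. A minimal sketch, where $table comes from the script above and the output path is only an example:

```powershell
# Redirect the final pipeline to a file instead of the console.
# $table is the DataTable loaded earlier; the path is illustrative.
$table |
    Select-Object $table.Columns.ColumnName |
    ConvertTo-Json |
    Out-File -FilePath "C:\temp\Employee.json" -Encoding utf8
```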
When you need to export a very large table (more than 1 GB), perform the task in batches to reduce memory consumption. You can create a view or modify the query to do this, and if the table has a unique column, the task becomes even simpler.
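To sketch the batching idea, assuming EmpID is a unique column you can order by, the query can page through the table with OFFSET/FETCH. The batch size, file names, and reuse of $connection from the script above are all illustrative:

```powershell
# Sketch: export a large table in batches using OFFSET/FETCH.
# Assumes EmpID is unique and orderable; $connection is already open.
$batchSize = 50000
$offset    = 0

do {
    $query = "SELECT * FROM Employee ORDER BY EmpID " +
             "OFFSET $offset ROWS FETCH NEXT $batchSize ROWS ONLY"

    $command = $connection.CreateCommand()
    $command.CommandText = $query
    $result = $command.ExecuteReader()

    $table = New-Object "System.Data.DataTable"
    $table.Load($result)
    $result.Close()   # release the reader before the next iteration

    if ($table.Rows.Count -gt 0) {
        # One JSON file per batch; the path pattern is illustrative
        $table | Select-Object $table.Columns.ColumnName |
            ConvertTo-Json |
            Out-File -FilePath "C:\temp\Employee_$offset.json" -Encoding utf8
    }

    $offset += $batchSize
} while ($table.Rows.Count -eq $batchSize)
```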
Thanks for reading this article. Please add a comment if you have any difficulty understanding the script.