Wednesday, November 29, 2017

What are Commit, Rollback and SavePoint in SQL Server

How to use Commit, Rollback and SavePoint commands in SQL Server
Commit, Rollback and SavePoint are the Transaction Control Language (TCL) commands in SQL Server. They are used to manage Transactions in the Database. In general, these commands are used to manage the changes made by DML commands like Insert, Update and Delete.

Commit : The Commit command is used to save a Transaction permanently into the database. Once you Commit a Transaction, it cannot be Rolled back.

A Transaction is nothing but an operation (or set of operations) that we perform on a Table. An explicit Transaction begins with the keyword "Begin Tran" (or "Begin Transaction") and ends with either Commit or Rollback.

Eg :
Begin Tran T1
Update Tbl_Emp Set Job_Title='Market Analyst' Where Emp_Id=123
Commit Tran T1

Rollback : The Rollback command is used to undo the changes made by any operation, but only before a Commit is done. We can't Rollback data which has been Committed to the database with the Commit keyword.
Eg :
Suppose we have a Table "Tbl_Emp" with records as follows..
Emp_Id  Job_Title
    143     Engineer
    123     Market Analyst
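For reproducibility, this example assumes a simple two-column table; a possible setup script (the post does not show the DDL, so the data types here are assumptions):

```sql
-- Hypothetical setup for the Tbl_Emp examples (table shape inferred from the post)
Create Table Tbl_Emp
(
    Emp_Id    Int,
    Job_Title Varchar(50)
);

Insert Into Tbl_Emp Values (143, 'Engineer');
Insert Into Tbl_Emp Values (123, 'Market Analyst');
```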

Now let's make an update to the Table and Rollback to the original state
Begin Tran T1
Update Tbl_Emp Set Job_Title='Market Research Analyst' Where Emp_Id=123
Rollback Tran T1

Output : 
Emp_Id  Job_Title
    143     Engineer
    123     Market Analyst

Savepoint : 
The Savepoint command marks a point within a Transaction so that you can Rollback to that point whenever necessary. You can use multiple SavePoints in a Transaction.

Syntax : Save Transaction SavePoint_Name
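A minimal sketch of the pattern, assuming some table Tbl_Emp; only the work done after the SavePoint is undone by the Rollback:

```sql
Begin Tran T1
  Update Tbl_Emp Set Job_Title='Analyst' Where Emp_Id=123    -- kept
  Save Tran SP1;                                             -- mark a SavePoint
  Update Tbl_Emp Set Job_Title='Engineer' Where Emp_Id=143   -- undone below
  Rollback Tran SP1;  -- undoes only the work after SP1; T1 is still open
Commit Tran T1        -- makes the surviving changes permanent
```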

Example-I :
In the following Transaction, I am Inserting 3 records into a Table, then Saving the Transaction (SavePoint 1), then Updating those 3 records and again Saving the Transaction (SavePoint 2).
Next I am Truncating the Table, Rolling back to the SavePoint, then Deleting a Record and finally Committing the Transaction.
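The examples below assume a small table Emp_Test; a possible setup (assumed, since the post does not include the DDL):

```sql
-- Hypothetical DDL for the Emp_Test table used in Examples I-III
Create Table [dbo].[Emp_Test]
(
    Eid   Int,
    Ename Varchar(50)
);
```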

Begin Tran T1
Truncate Table [Emp_Test]

Select * From  [dbo].[Emp_Test]
  Insert Into [Emp_Test] Values ( 113,'Raju')
  Insert Into [Emp_Test] Values ( 114,'Ravi')
  Insert Into [Emp_Test] Values ( 115,'Rani')

 Save Tran SP1;
  Update [Emp_Test]  Set Ename='Sai' Where Eid=113
  Update [Emp_Test]  Set Ename='Shiva' Where Eid=114
  Update [Emp_Test]  Set Ename='Sloka' Where Eid=115

 Save Tran SP2;
  Truncate Table [Emp_Test]
 Rollback Tran SP1;
  Delete From [Emp_Test] Where EId =113
   Commit Tran T1 ;
Select * From  [dbo].[Emp_Test]

GO

Output : 
EId Ename
114 Ravi
115 Rani


Notes : 
Here, if you observe, we inserted 3 records before SavePoint 1 (Save Tran SP1).
When we Rollback to that SavePoint, the Updates and the Truncate done after SP1 are undone, so the 3 inserted records are back. We then Deleted the record with EId 113 and Committed the Transaction, which is why only 2 records remain. Once you Commit a Transaction, none of its operations can be rolled back.

Example-II :
Here we Rollback to SavePoint 2 ( Save Tran SP2)

Begin Tran T1
Truncate Table [Emp_Test]

Select * From  [dbo].[Emp_Test]
  Insert Into [Emp_Test] Values ( 113,'Raju')
  Insert Into [Emp_Test] Values ( 114,'Ravi')
  Insert Into [Emp_Test] Values ( 115,'Rani')

 Save Tran SP1;
  Update [Emp_Test]  Set Ename='Sai' Where Eid=113
  Update [Emp_Test]  Set Ename='Shiva' Where Eid=114
  Update [Emp_Test]  Set Ename='Sloka' Where Eid=115

 Save Tran SP2;
  Truncate Table [Emp_Test]
 Rollback Tran SP2;
  Delete From [Emp_Test] Where EId =113
   Commit Tran T1 ;
Select * From  [dbo].[Emp_Test]

GO

Output : 
EId Ename
114 Shiva
115 Sloka


Notes :
Here, if you observe, we updated 3 records before SavePoint 2 (Save Tran SP2).
When we Rollback to that SavePoint, only the Truncate done after SP2 is undone, so the updated records are back. We then Deleted the record with EId 113 ('Sai') and Committed the Transaction, which is why only 2 updated records remain.

Example-III :
Here, if we Rollback to SavePoint 1 and then try to Rollback to SavePoint 2, we will receive an error saying "Cannot roll back SP2. No transaction or savepoint of that name was found", because SavePoint 2 is erased when you Rollback to a SavePoint prior to it (SavePoint 1).

Begin Tran T1
Truncate Table [Emp_Test]

Select * From  [dbo].[Emp_Test]
  Insert Into [Emp_Test] Values ( 113,'Raju')
  Insert Into [Emp_Test] Values ( 114,'Ravi')
  Insert Into [Emp_Test] Values ( 115,'Rani')

 Save Tran SP1;
  Update [Emp_Test]  Set Ename='Sai' Where Eid=113
  Update [Emp_Test]  Set Ename='Shiva' Where Eid=114
  Update [Emp_Test]  Set Ename='Sloka' Where Eid=115

 Save Tran SP2;
  Truncate Table [Emp_Test]
 Rollback Tran SP1;
  Delete From [Emp_Test] Where EId =113

Rollback Tran SP2;
   Commit Tran T1 ;
Select * From  [dbo].[Emp_Test]

GO

Output : 
EId Ename
114 Ravi
115 Rani


Msg 6401, Level 16, State 1, Line 26
Cannot roll back SP2. No transaction or savepoint of that name was found.

--------------------------------------------------------------------------------------------------------
Thanks, TAMATAM ; Business Intelligence & Analytics Professional
--------------------------------------------------------------------------------------------------------

Sunday, November 26, 2017

What is Full Cache, Partial Cache and No Cache of the Lookup Transformation in an SSIS Package

How to Handle the "Row yielded no match during lookup" and Row disposition Errors in the SSIS Lookup Transformation
Scenario :
Suppose we have the Source data of Sales Orders details as follows..



Out of these Sales Orders, a few got Cancelled, and those Orders' details are available in a Lookup Table which is in the SalesDB as follows...

 SELECT [Order_Id]
    ,[Region_Code]
    ,[Order_Status]
   FROM [SalesDB].[dbo].[LKP_Orders_Status]
   WHERE [Order_Status]='Cancelled' 



Now we need to Lookup the Source data against this Lookup Table, then send the Output to the Destination Tables which exist in the MyAnalyticsDB as follows..

The Cancelled Orders data should load to Tbl_Orders_Cancelled, and the other Orders data should load to Tbl_Orders.
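A possible shape for the two destination tables in MyAnalyticsDB (assumed for illustration; the actual column list comes from the source file plus the Lookup columns):

```sql
-- Hypothetical DDL; the matching output also carries the Lookup columns
Create Table [dbo].[Tbl_Orders_Cancelled]
(
    Order_Id     Int,
    Region_Code  Varchar(10),
    Order_Status Varchar(20)
);

Create Table [dbo].[Tbl_Orders]
(
    Order_Id Int
    -- plus the remaining source columns, without the Lookup columns
);
```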


Now I will design the Package to Fulfill the above scenario

--------------------------------------------------------------------------------------------------------------
Phase-1 : Add the Data Flow to Control Flow and then Design the Flow
--------------------------------------------------------------------------------------------------------------

A) Add the Data Flow to Control Flow:

B) Add the Flat File Source in the Data Flow and Connect to Source:




--------------------------------------------------------------------------------------------------------------
Phase-2 : Add the Lookup Transformation to Data Flow and Connect to the Source and then Set the Properties
--------------------------------------------------------------------------------------------------------------
A) Add the Lookup Transformation Inside Data Flow and Connect to Source:



B) Next go to Lookup Transformation Editor > Connection Tab; Make the connection to the Lookup Table and Select the required Columns.

The Lookup transformation uses either an OLE DB connection manager or a Cache connection manager to connect to the reference dataset.
The reference dataset (Lookup Table) can be a cache file, an existing table or view, a new table, or the result of an SQL query.





C) Next go to Lookup Transformation Editor > General Tab; Select the Cache Mode

Full Cache :
When we select the Cache Mode as "Full Cache", the Lookup Transformation reads the entire reference (Lookup) dataset into memory once, before the data flow starts. It then looks up each record from the Source against that in-memory Cache; matching records load to the Lookup Match Output and non-matching records load to the Lookup No Match Output.

Partial Cache :
When we select the Cache Mode as "Partial Cache", the Cache starts empty. For each record from the Source, the Lookup Transformation first checks the Cache; if the record is not there, it queries the Lookup Table and stores the result in the Cache for further use (next time it will look in the Cache first). Matching records load to the Lookup Match Output and non-matching records load to the Lookup No Match Output.

In the Partial Cache case, we need to allocate the Cache memory size for the Lookup Matching and No Matching results. The memory is allocated to the Cache from your system memory.

Note :
The rows with matching entries in the reference dataset and, optionally, the rows without matching entries in the dataset are stored in Cache. When the memory size of the cache is exceeded, the Lookup transformation automatically removes the least frequently used rows from the Cache.

You can allocate the Cache from Lookup Transformation Editor > Advanced Tab



No Cache :
When we select the Cache Mode as "No Cache", the Lookup Transformation does not use the Cache. It reads each record directly from the Source data, looks that record up against the Lookup Table, and loads it to the Lookup Match Output; non-matching records load to the Lookup No Match Output.
Notes :
Whether you select partial cache or no cache, you use an OLE DB connection manager to connect to the reference dataset. The reference dataset is generated by using a table, view, or SQL query during the execution of the Lookup transformation.
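Conceptually, regardless of cache mode, the Lookup's match/no-match split behaves like a left join against the reference table. A rough T-SQL sketch of the two outputs (the source table name Src_Orders is an assumption for illustration):

```sql
-- Lookup Match Output: source rows that have a cancelled-order entry
Select s.*, l.[Region_Code], l.[Order_Status]
From Src_Orders s
Inner Join [SalesDB].[dbo].[LKP_Orders_Status] l
    On s.[Order_Id] = l.[Order_Id]
Where l.[Order_Status] = 'Cancelled';

-- Lookup No Match Output: source rows with no matching entry
Select s.*
From Src_Orders s
Where Not Exists ( Select 1
                   From [SalesDB].[dbo].[LKP_Orders_Status] l
                   Where l.[Order_Id] = s.[Order_Id]
                     And l.[Order_Status] = 'Cancelled' );
```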

D) Next go to Lookup Transformation Editor > Columns Tab ; Map the Source key column with Lookup Column in Lookup table
Here we will lookup Order_Id column from source against the Lookup Table column "Order_Id"

Next you can select any other columns from Lookup table if you want to read into Output.
Here I am reading the Region_Code and Order_Status from Lookup table.


E) Next go to Lookup Transformation Editor > General Tab; Specify how to handle rows with No Matching entries..
Here we need to specify how to handle the rows with no Match output.
If you choose the option "Fail Component" and then run the Package, it will throw the following errors when it finds No Matching records from the Source.


Error : Row yielded no match during lookup.
[Lookup Cancelled Orders [45]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERRORThe "Lookup Cancelled Orders" failed because error code 0xC020901E occurred, and the error row disposition on "Lookup Cancelled Orders.Outputs[Lookup Match Output]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

To avoid these Errors we need to specify the option "Redirect rows to no match output" as follows..


Next Say OK
--------------------------------------------------------------------------------------------------------------
Phase-3 : Next Connect the OLE DB Destinations to the Lookup Transformation to Load Matching and No Matching Outputs
--------------------------------------------------------------------------------------------------------------
A) Connect the OLEDB Destination to Lookup Transformation, for loading Matching Output


B) Next go to OLEDB Destination Editor > Connection Manager Tab, then set the Connection and select the Destination Table


C) Next go to OLEDB Destination Editor > Mapping Tab, then do the Source and Destination Columns mapping
Next Say OK
D) Connect the OLEDB Destination to Lookup Transformation, for loading No Matching Output

E) Next go to OLEDB Destination Editor > Connection Manager Tab, then set the Connection and select the Destination Table

F) Next go to OLEDB Destination Editor > Mapping Tab, then do the Source and Destination Columns mapping
Here Region_Code and Order_Status from lookup table will not flow into No Matching Output.


Next Say OK
G) Finally Run the Package

--------------------------------------------------------------------------------------------------------------
Output :
Matching Output Destination Table
Select * From [dbo].[Tbl_Orders_Cancelled]

No Matching Output Destination Table
Select * From [dbo].[Tbl_Orders]


--------------------------------------------------------------------------------------------------------
Thanks, TAMATAM ; Business Intelligence & Analytics Professional
--------------------------------------------------------------------------------------------------------

Saturday, November 25, 2017

SSIS ForEach Item Enumerator to Create Folders for each Month by Quarter for Current Year

How to create Monthly Archive folders by Quarter for Current Year and Copy the Template to Each Month Folder using SSIS File System Task with ForEach Item Enumerator
The ForEach Item Enumerator is one of the Enumerator options available in the Foreach Loop Container. It is used to enumerate through a list of static items declared at the ForEach Container level.
Scenario :
I want to create Monthly Archive folders by Quarter for Current Year and Copy a report Template to Each Month Folder as shown below...

Now lets design the Package to fulfill the above scenario.
-------------------------------------------------------------------------------------------------------------------
Phase-1 : Create the required Variables for the Package
-------------------------------------------------------------------------------------------------------------------

Here, we create below variables..
Template : It is a Static variable where we declare the Report Template which we want to copy to the destination folders.
T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Source\Template\ReportTemplate.xlsx
OutPutFolder : It is a Static variable holding the path under which we create the Monthly archive Folders and copy the Report Template.
T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Desti\
MonthName : It is a Dynamic variable used in ForEach loop to store the Month Name during the run time. (Eg : Jan)
QuarterName: It is a Dynamic variable used in ForEach loop to store the Quarter Name during the run time. (Eg : Q1)
Year : It is a Dynamic variable used to store the Current Year calculated using the below expression.(Eg : 2017)
(DT_WSTR, 4) YEAR(Getdate()) 


NewFolderName : It is a Dynamic variable used to build the New destination Folder path, which is calculated using the below expression.

@[User::OutPutFolder]+(DT_STR,4,1252) @[User::Year] +"\\"+ @[User::QuarterName] +"\\"+@[User::MonthName]

Eg: T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Desti\2017\Q1\Jan
-------------------------------------------------------------------------------------------------------------------
Phase-2 : Add the ForEach Loop to Control Flow and Set the Properties
-------------------------------------------------------------------------------------------------------------------

A) Add the ForEach Loop to Control Flow


B) Next select the Enumerator from ForEach Loop Editor > Collections Tab
Here we select Enumerator as "ForEach Item Enumerator"

C) Next Click on Columns to add two Columns, one for Month and another for Quarter


D) Next declare the Static values for Month (Column 0) and Quarter (Column 1)

E) Next do the variable mappings in ForEach Loop Editor > Variables Tab
Here we map the MonthName and QuarterName variables to store the Month Name and Quarter Name during run time.

Next Say OK.
-------------------------------------------------------------------------------------------------------------------
Phase-3 : Connect the File System Task to ForEach Loop and Set the Properties
-------------------------------------------------------------------------------------------------------------------

A) Add the File System Task inside the ForEach Loop


B) Next set the properties in the File System Task Editor

Operation = Create Directory ; it will create the New Folder for each Month Name during run time
Source Variable = User::NewFolderName ; The Name of the Folder which will be created in the OutPutFolder location.

Next Say OK.
------------------------------------------------------------------------------------------------------------------
Phase-4 : Connect another File System Task to the existing one and set the Properties
-------------------------------------------------------------------------------------------------------------------

A) Add another File System Task and Connect it to the Existing one.
The new File System Task will be used to Copy the Report Template to the Month Folders created by the first File System Task.

B) Set the Properties to the new File System Task
Operation = Copy File; it will Copy the Template from Source to Destination.
Source Variable = User::Template
Destination Variable = User::NewFolderName


Next Say OK.
C) Finally run the Package



--------------------------------------------------------------------------------------------------------
Output :
Now the Folders for each Month name have been created under the respective Quarter Folder of the Current Year Folder.

Next the ReportTemplate has been Copied to each Month Folder.
Note :
If Else Expression to Calculate the Quarter Names from Month Names

( @[User::MonthName]=="Jan"||@[User::MonthName]=="Feb"||@[User::MonthName]=="Mar" ? "Q1":
@[User::MonthName]=="Apr"||@[User::MonthName]=="May"||@[User::MonthName]=="Jun" ? "Q2":
@[User::MonthName]=="Jul"||@[User::MonthName]=="Aug"||@[User::MonthName]=="Sep" ? "Q3":"Q4")


--------------------------------------------------------------------------------------------------------
Thanks, TAMATAM ; Business Intelligence & Analytics Professional
--------------------------------------------------------------------------------------------------------

Friday, November 24, 2017

How to Import the Files from a SQL Server Table to the File System using SSIS Package

SSIS Package to Import the Files from the SQL Server Table to a Folder using Export Column Transformation
The Export Column Transformation in SSIS is used to export binary files, images, media or any kind of documents (DT_TEXT, DT_NTEXT, DT_IMAGE) from a SQL Server Table to the File System/Folder. The Export Column Transformation reads the File Path from one Column and Exports to that Path the Files stored in Binary Format in another Column.
The Export Column Transformation works exactly opposite to the Import Column Transformation.
Scenario :
Suppose we have a Source Table "Tbl_FilesStorage" having the File Path and File details as follows..


Here, the FileName column holds the Fully Qualified File Name;
the FileObject Column holds the File contents in Binary Format.
Now if we use the Export Column Transformation, it will Export the Files to the below Folder Path :
T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Source\MyFiles\
Now lets design the Package to fulfill the above scenario.
--------------------------------------------------------------------------------------------------------
Phase I : Add the Data Flow Task to Control Flow and Design the Flow
--------------------------------------------------------------------------------------------------------
A) Add the Data Flow to the Control Flow
B) Add the OLE DB Source Inside the DataFlow

 C) Next make the Database Connection and Select the Source Files Table

 D) Next select the Source Columns FileName (having File Path) and FileObject (having File details )
Next Say OK.
--------------------------------------------------------------------------------------------------------
Phase II : Next Connect the Export Column Transformation to the OLE DB Source and do the Column Mappings
--------------------------------------------------------------------------------------------------------
A) Add and Connect the Export Column Transformation to OLE DB Source
B) Next map the Source Columns at Export Column Transformation Editor 

Here..
Extract Column : For this column we will map the text, image or binary data column.
In this example, we selected FileObject because we are exporting the Files from this Column of SQL Server to the File System.
File Path Column : For this column we will map the source Column which has the File Path, where the files will be saved.
Allow Append : This property/option specifies whether you want the Export Column Transformation to add data to existing files or not. The default is false; you can change it if required.
Force Truncate : This property/option specifies whether you want the Export Column Transformation to delete any of the existing files before writing the new data. The default is false; you can change it if required by checking the option.

Please note that we can use only one of the options "Allow Append" or "Force Truncate".
Write BOM: This property/option specifies whether you want the Export Column Transformation to write a byte-order mark (BOM) to the file. A BOM is only written if the data has the DT_NTEXT or DT_WSTR data type.
Next Say OK.
C) Finally Run the Package

--------------------------------------------------------------------------------------------------------
Output :
Now the Files have been Exported from the SQL Table to the Folder as follows..


--------------------------------------------------------------------------------------------------------
Thanks, TAMATAM ; Business Intelligence & Analytics Professional
--------------------------------------------------------------------------------------------------------

How to Export the Files from a Folder to the SQL Server Table in Binary Format using SSIS Package

SSIS Package to Export the Files from a Folder to the SQL Server Table in Binary Format using Import Column Transformation
The Import Column Transformation in SSIS is used to import binary files, images, media or any kind of document into the data flow, optionally do some manipulations, and then load the data to the destination. The data type of the output column must be DT_TEXT, DT_NTEXT or DT_IMAGE.
Scenario :
Suppose we have different kinds (.txt, .xlsx, .jpg, .mp3) of Files in a Folder as follows..
Now we want to Import these Files along with their Fully Qualified Names into the below SQL Server Destination table..


Here, in the FileName Column we store the File's Full Path and in the FileObject Column we store the File in Binary Format.
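A possible definition for the destination table (assumed; note the Identity column and the varbinary column for the file contents, matching how the table is used later in this post):

```sql
-- Hypothetical DDL for the destination table
Create Table [dbo].[Tbl_FilesStorage]
(
    FileID     Int Identity(1,1),
    [FileName] Varchar(500),
    FileObject Varbinary(Max)   -- holds the DT_IMAGE output of Import Column
);
```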
Now lets Design the Package to Fulfill the above Scenario..
--------------------------------------------------------------------------------------------------------------
Phase - I : Create the required Variables
--------------------------------------------------------------------------------------------------------------
Here we create the 3 variables as follows...
FolderPath : It is a Static variable in which we store the Folder Path from which we read the Files using the ForEach Loop.
FileName : It is a Dynamic variable which we will use in the ForEach Loop to store the File Names dynamically during run time.
Temporarily you can give it any name like "TempFileName", which will be replaced at Runtime.
varSQLQuery : This is a Dynamic Variable where we store an SQL Select Statement as a String, which Selects the Fully Qualified File Name and the File Name using the below Expression :
"Select '"+@[User::FolderPath]+ @[User::FileName] +"' as FullyQualfiedFileName,'"+@[User::FolderPath]+ @[User::FileName]+"' as FileName"

Expression result : 
Select 'T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Source\MyFiles\TempFileName' as FullyQualfiedFileName,'T:\My_Tech_Lab\SSIS_Lab\SSIS_Proj1_Sol1\Source\MyFiles\TempFileName' as FileName

Here, if you observe this Select Statement, both the Columns FullyQualfiedFileName and FileName read only the File's Full Path.
But, while loading these to the Destination table, the FileName value is stored as-is, whereas the FullyQualfiedFileName value is used to read the File, which is then stored in Binary Format.
--------------------------------------------------------------------------------------------------------------
Phase - 2 : Add the For Each Loop Container to Control Flow and Set the Properties
--------------------------------------------------------------------------------------------------------------
A) Add the ForEach Loop Container to the Control Flow

B) Next in ForEach Loop Editor > Collection Tab, Map the Variables and Set the Properties..
Enumerator = Foreach File Enumerator
Expressions > Directory = @[User::FolderPath] , the Folder Path from which the Files will be Collected.
Retrieve File Name = Fully Qualified

C) Next in ForEach Loop Editor > Variable Tab, Map the File Name Variable
Variable = User::FileName
Index=0

Next Say OK.
--------------------------------------------------------------------------------------------------------------
Phase - 3 : Add the Data Flow inside the For Each Loop Container and Design the Flow
--------------------------------------------------------------------------------------------------------------
A) Add the DataFlow inside ForEach Loop Container

B) Add the  OLE DB Source inside DataFlow and Set the Properties...
Next go to OLEDB Source Editor > Connection Managers ;
Set the Connection and Select the Data Access Mode as "SQL Command from Variable"
Next Select the Variable Name.

Next select the Columns from the OLEDB Source Editor > Columns Tab

Next say OK.
C) Next set the Properties in the OLE DB Source Advanced Editor > Input and Output Properties.
We have to set up these Properties to avoid a Truncation Error when reading the Data in the Data Flow.
Here we just need to set the Length for the Output Columns, which should be Greater than that of the External Columns of the same section, as shown below.


Next say OK.
--------------------------------------------------------------------------------------------------------------
Phase - 4 : Connect the Import Column Transformation to the OLE DB Source and Set the Properties.
--------------------------------------------------------------------------------------------------------------
A) Connect the Import Column Transformation to OLE DB Source
B) Next from Import Column Transformation > Advanced Editor > Input Columns, Select the required Input Column which we need to Process as a File in Binary Format.
Here I selected only the column "FullyQualifiedFileName", which I want to Process/Convert as a File.
The other Column "FileName" is not selected, as I want it to load to the Destination as-is, without any Processing.
C) Next from Advanced Editor > Input and Output Properties , Add a New Column under Import Column Output > Output Columns
Here I added a New Column "FileObj" , whose LineageID is 57 and DataType is DT_IMAGE.
Next Copy the LineageID of the Newly added Column, and assign that Value to Import Column Input > Input Columns >
"FullyQualifiedFileName" > FileDataColumnID
Now this Column will be Converted from a String (File Path) to the Binary File Column "FileObj"

Next Say OK.
--------------------------------------------------------------------------------------------------------------
Phase - 5 : Connect the Import Column Transformation to the OLE DB Destination and Set the Properties.
--------------------------------------------------------------------------------------------------------------
A) Connect the Import Column Transformation to OLE DB Destination


B) Next in OLE DB Destination Editor > Connection Managers, Connect to the DB and Select the destination Table


Next do the Columns Mapping..

Here I have ignored the "FullyQualifiedFileName" Column, as it is Processed and Transformed into the "FileObj" Field.

Also, I have ignored the FileId, which is an Identity Column that is generated automatically in the destination Table "Tbl_FilesStorage"..

Next Say OK and Run the Package..
------------------------------------------------------------------------------------------------------
Output :
Now the File Names and Files are stored in the Destination Table as follows..

SELECT [FileID]
  ,[FileName]
  ,[FileObject]
  FROM [dbo].[Tbl_FilesStorage]
GO


Note :
If you Export the Files from the FileObject Column back to a Folder, using the Export Column Transformation, then you will see the Files as we saw in the Source above.

--------------------------------------------------------------------------------------------------------
Thanks, TAMATAM ; Business Intelligence & Analytics Professional
--------------------------------------------------------------------------------------------------------
